Clausius? More like "Cool theorem; thanks for teaching us!" 🙏
Really appreciate this content. Love the details
Thanks, I'm glad to hear it
I love the content, and the *heat* content too 😆
Isn't the entropy being talked about here the intensive entropy? In the video on the probabilistic definition of entropy, we found that it is S̄ (S with a bar on top), not S itself, that is defined as S̄ = -k Σᵢ pᵢ ln pᵢ.
Yes, that's true.
You can also tell this from the units. Boltzmann's constant k has units of J K⁻¹; its molar counterpart, the gas constant R = N_A·k, has units of J K⁻¹ mol⁻¹, the per-mole units you'd expect for a molar (intensive) entropy.
The Clausius Theorem (dS = đq_rev / T) is valid as long as both dS and đq are intensive, or as long as they are both extensive.
Converting from intensive to extensive properties can be done easily by using Avogadro's number, so we're probably not always as careful as we should be about labeling them as intensive or extensive.
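If it helps to see that bookkeeping concretely, here's a minimal Python sketch (the two-level probabilities and the 2 mol sample are just assumed example numbers, not taken from the video) that goes from the per-particle entropy to the molar (intensive) value and then to an extensive total:

```python
import math

# Minimal sketch: per-particle -> molar (intensive) -> extensive entropy.
# The two-level probabilities and the 2 mol sample size are assumed examples.
k_B = 1.380649e-23    # Boltzmann constant, J K^-1 (per particle)
N_A = 6.02214076e23   # Avogadro's number, mol^-1

p = [0.5, 0.5]        # assumed probabilities of two equally likely states
s_per_particle = -k_B * sum(pi * math.log(pi) for pi in p)  # J K^-1 per particle
S_molar = N_A * s_per_particle   # J K^-1 mol^-1 (intensive); equals R*ln(2) here
S_total = 2.0 * S_molar          # J K^-1 (extensive), for an assumed 2 mol sample

print(f"per particle: {s_per_particle:.3e} J/K")
print(f"molar:        {S_molar:.3f} J/(K mol)")
print(f"2 mol sample: {S_total:.3f} J/K")
```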
Great work, I wish I had known it sooner.
I wonder if 10:08, where đq_rev is less than đq_irrev, is a chemistry approach to avoid discussions of heat engines?
It is a chemistry approach -- chemistry uses the opposite sign convention from engineering.
It's not so much to avoid discussion of heat engines. We do talk about heat engines (ruclips.net/video/hvfN_UGjCn0/видео.html), but not in nearly as much detail as engineers like to. Most physical chemists stop with Carnot, for example, and don't learn about the Otto or Rankine cycles, etc.
Heat engines are, in fact, the root of the philosophical disagreement over the sign convention. If your main concern is getting a heat engine to do work, then of course the most natural definition of work is the amount of work done *by* the system. So PV work is positive when a piston expands.
But if your main concern is a chemical system and its thermodynamic changes, then the most natural definition of work is the work done *on* the system. So PV work is positive when we compress a gas (thus raising its energy).
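For what it's worth, here's a minimal Python sketch of that sign difference (the 1 mol, 298 K, tenfold isothermal expansion is just an assumed example): the same physical expansion shows up with opposite signs depending on whether you track the work done by or on the gas.

```python
import math

# Minimal sketch: one physical process, two sign conventions for PV work.
# Assumed example: 1 mol of ideal gas expanding isothermally at 298 K from 1 L to 10 L.
n, R, T = 1.0, 8.314, 298.0
V1, V2 = 1.0, 10.0                          # litres (only the ratio matters here)

w_by_gas = n * R * T * math.log(V2 / V1)    # engineering convention: work done BY the gas
w_on_gas = -w_by_gas                        # chemistry convention: work done ON the gas

print(f"work done by the gas: {w_by_gas:+.0f} J")   # positive for an expansion
print(f"work done on the gas: {w_on_gas:+.0f} J")   # negative for an expansion
```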
At 10:08, shouldn't đq_rev be less than đq_irrev? The efficiency of a reversible process is greater than that of an irreversible one, due to a smaller net đq, which leads to a higher đw.
The sign of the inequality is correct. See a full derivation here: ruclips.net/video/X0E4gPv8vTQ/видео.html
These inequalities can be especially confusing, for two reasons: (1) when the quantities are negative, the quantity which is *more* negative is "smaller" than the one that is less negative, in a mathematical sense, even though it has a larger magnitude. And (2) scientists and engineers tend to use an opposite sign convention for PV work (although that doesn't apply to heat, which you are asking about).
To make sense out of the inequality, consider letting a gas expand isothermally. Since it is isothermal, it must absorb heat from the environment (q is positive). This heat is what pays for the PV work of expansion. In a reversible expansion, we extract the largest possible work from the system -- and thus absorb the most heat from the environment -- so q_rev is larger than any q_irrev.
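To put rough numbers on that, here's a minimal Python sketch (assuming 1 mol of ideal gas at 298 K expanding isothermally from 10 L to 20 L, with the irreversible path taken as a single step against a constant external pressure equal to the final pressure):

```python
import math

# Minimal sketch: heat absorbed in a reversible vs. an irreversible isothermal expansion.
# Assumed example: 1 mol ideal gas, 298 K, 10 L -> 20 L. Since dU = 0 isothermally, q = -w.
n, R, T = 1.0, 8.314, 298.0
V1, V2 = 10e-3, 20e-3                   # m^3
P_ext = n * R * T / V2                  # Pa, constant external pressure = final pressure

q_rev = n * R * T * math.log(V2 / V1)   # J, reversible path: q = nRT ln(V2/V1)
q_irrev = P_ext * (V2 - V1)             # J, single-step irreversible path: q = P_ext * dV

print(f"q_rev   = {q_rev:.0f} J")       # about 1717 J
print(f"q_irrev = {q_irrev:.0f} J")     # about 1239 J, so q_rev > q_irrev
```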
Beautiful explanation
Thanks
How Clausius came up with the idea of entropy always confuses me 😪
I agree!
Clausius was ahead of his time, in being able to formulate the idea of entropy purely from thinking about heat and work. I think it even confused him, as he spent about a decade gradually refining his explanation.
In my mind, it must have gotten much easier to think about entropy after Boltzmann came along a few years later.
@PhysicalChemistry Thanks a lot
Thanks!!!
No problem!
The audio is too low.
Yes, I agree the audio can be a little inconsistent. I know it's annoying to have to reach for the volume control.