This is a much better explanation of entropy, thank you! When I first heard the 2nd law of thermodynamics as "a system's entropy (disorder) will always increase over time", it sounded like these people had no idea what they were talking about. Big bang -> atoms -> molecules -> stars -> galaxies -> solar systems -> planets -> life -> humans -> intelligence -> consciousness. It's pretty obvious to anyone that the universe has only become more orderly over time. Describing entropy as the number of possible random states that a system can have (or the amount of information needed to describe a system) makes a lot more sense. I once heard someone use Sudoku as an analogy for entropy. When you're trying to solve a square in an empty row, it can be anything from 1 to 9 (maximum entropy), but as other squares are resolved, the set of possible values shrinks to, say, 1, 3, or 5 (lower entropy). So ironically, it seems we humans enjoy puzzles where we reduce the entropy in a system. As if it's a fundamental part of our nature.
Note that entropy can decrease from time to time, just shortly, but it happens. From Quora (Jack Wimberley, Ph.D.): The second law of thermodynamics does not absolutely state that the entropy of a finite system cannot decrease - only that the probability of it doing so vanishes in the limit that its size becomes infinite.
Don't worry, we can fight entropy by granting the wishes of little girls (turning them into "Magical Girls"), then harvesting the energy that is created when their wish implodes and turns them into a "Witch".
"no mother, i shant clean my room, as the effort i spend cleaning will decrease the amount of energy available for our civilization at the end of ends!"
zombie blood Which maths? For a board of n cells, where each cell is either occupied or unoccupied, the total number of states is 2^n - the same as a sequence of n coin flips. If half are occupied and half are unoccupied, the associated number of states is C(n, n/2), i.e. the appropriate binomial coefficient. The rest is number-crunching. But it shows why, say, in a sequence of coin flips, the most probable state is one where the numbers of heads and tails are equal or close to equal.
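If anyone wants to see that number-crunching, here's a quick Python sketch (the 64-cell board is an arbitrary toy size, not from the video): it counts microstates for a few "k cells occupied" macrostates and shows the half-occupied one dominating.

```python
from math import comb

n = 64                     # toy board: 64 cells, each occupied or not
total = 2 ** n             # 2^n equally likely microstates overall
for k in (0, 8, 16, 32):   # a few "k cells occupied" macrostates
    states = comb(n, k)    # binomial coefficient C(n, k)
    print(f"k={k:2d}: {states} microstates, probability {states / total:.3e}")
```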
Thermodynamics utilizes a lot of calculus, nothing too advanced, specifically differentials (at the senior undergraduate level). For instance, entropy is described more succinctly as dS = δQ_rev/T. One thing I'm glad they stated is that thermodynamics describes large scales; statistical mechanics describes small scales. The mathematics of statistical mechanics is a lot more involved, especially as the systems grow in scale. Statistical mechanics does use a lot of probability machinery built around what are termed partition functions. Where statistical mechanics really grows in complexity, mathematically, is in non-equilibrium problems.
ErwinSchrodinger64 I'm in high school. I love learning above my level but what you said was pretty confusing thanks for trying though. I'll have to come back at a later date after learning more.
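To make the partition-function idea two comments up concrete, here's a minimal Python sketch - the two-level system and its energy gap are made-up numbers, chosen so the gap is comparable to thermal energy at 300 K:

```python
from math import exp

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_probabilities(energies, T):
    """Occupation probabilities of discrete energy levels at temperature T."""
    weights = [exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)                  # the partition function
    return [w / Z for w in weights]

# hypothetical two-level system: ground state and one excited state (joules)
print(boltzmann_probabilities([0.0, 4.0e-21], 300))  # roughly [0.72, 0.28]
```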
AT LAST! The one thing I've always tried to explain to smirking science nerds who believe they know an enormous amount but in reality are rather ignorant: order is not the same as low entropy! Thank you.
What a beautiful sentence, I'm definitely using it: be careful to keep the number of accessible microstates low, avoid thermal equilibrium and keep being that brilliant macrostate that is you until I see you next week
You're a super instructor. I love how simple and matter-of-fact you are. You make the second law sound like stacking blocks or something. You're so cool. I read a book while in a physics class in college: Entropy, by Jeremy Rifkin, I think that was his name. It scared me to death. It didn't help being in physics at the time. It's like a nightmare. A coma. You'll never wake up, you'll only eventually die. You're being pulled through the hallway of life into the gaping mouth of a giant freezer - DEAD end. That's what reading that damn book felt like. It was scary. You don't really know what's truly 'real', and all of this endless mathematics and theory theory theory. I'm glad I'm outta there, that's all I can say. Thinking's cool, but sometimes I feel like the Bros. Grimm make more sense. Really.
It's not just energy coming in from outside that can reduce entropy; it can also be energy lost to the outside. If you had some kind of magic heat sink that was infinitely cold forever, you could extract a ton of useful work from a universe at maximal entropy... because of the energy being lost from the rest of the universe into that "cold hole". No outside energy input required.
Pfhorrest wouldn’t that situation be equivalent to air being in only half the room? i.e. as the cold hole fills up entropy is increasing not decreasing.
Virgil yes, but useful work is done by increasing entropy. You can think of "negative entropy" as "fuel", and then when you spend that fuel to do work, entropy is the "waste product". The existence of a "cold hole", an infinite heat sink, creates an energy gradient as big as the total energy content of the universe, and thus extremely low entropy. As all of the energy in the universe got sucked into it, entropy would go up, but you could do a ton of useful work using that "destruction of energy". For a less fantastic example, imagine a closed system of a balloon that is somehow perfectly insulated from the outside world. The air in the balloon is at maximum entropy; for a tiny being living inside that balloon, there is no possibility to do useful work (so I guess they wouldn't really be living to begin with, natch). But if you poke a hole in the balloon, and let's just assume it's the infinite vacuum of space outside the balloon, suddenly the air in the balloon goes rushing out, and the total energy content of the balloon goes down, but beings living inside the balloon could use that rush of energy out of the balloon to do useful work, say by putting a wind turbine over the hole to generate electricity. The point is just that you don't have to put stuff IN from the outside to create an energy gradient capable of doing useful work, you can also let stuff OUT to the outside to do so.
Your magic heat sink sounds like absolute zero. If the cold reservoir of a Carnot engine could be at zero kelvin, then the efficiency 1 - T(cold)/T(hot) would be exactly 1, i.e. a perfect heat-to-work converter. That's why the third law of thermodynamics forbids such an infinitely cold heat sink.
The heat sink wouldn't have to be at absolute zero, just anywhere below the average temperature of the universe; nor would being at absolute zero be enough to make it function as described. What matters more is its total heat capacity. If you had a penny that was at absolute zero and tossed it into a universe at maximal entropy, you'd get as much work out as it would take to raise the penny to the temperature of the universe, but that's all. But if you had a penny made of some magic material with infinite specific heat (i.e. it takes an infinite amount of energy to raise its temperature), that was only slightly colder than the rest of the universe, then it would suck and suck energy from the rest of the universe until it had lowered the temperature of the whole rest of the universe to its own slightly lower temperature. For an easier-to-illustrate example, imagine you have a pebble with infinite specific heat that's currently at -1C, which is a lot warmer than most of the universe, but say you dropped it onto planet Earth, where it would probably land in an ocean. All of the world's oceans would freeze. The atmosphere everywhere would be brought down to -1C. All of the rock in the planet would be brought down to -1C. The sun would keep shining on the world trying to warm it up, but all of that heat would be soaked up by the pebble, which is still at -1C. Eventually the sun would expand to a red giant and swallow up the Earth, but the part of the sun that came into contact with the Earth would be cooled down to -1C as all of its heat was absorbed by the pebble, though of course the rest of the sun would convect more heat to the part around the Earth, but that would be soaked up too until the entire sun had been cooled enough that it shrank back away from contact with the Earth. But that would all take a while. And meanwhile here on Earth, the temperature gradient between the frozen pebble and the environment surrounding it would be a source for extracting useful work. It'd basically be a source of nigh-limitless free energy, as all of the unusable heat energy of the world, and all of the heat energy being pumped into the entire planet by the sun, would become a comparatively high-temperature energy source that could be pumped into the cold little rock to power a heat engine.
I studied economics, and one of the most amazing things is that entropy can be applied to economic systems (like ATM networks) or sociological systems in the same way as in thermodynamics (excuse my English, I'm Italian and we never learn :) )
The approach you mention rests simply on mathematical analogies between systems - personally I'm in favor of a deeper identification, like that proposed by Georgescu-Roegen (bar his spurious extra laws of thermodynamics)
I'm always pleased to find areas of overlap in the mathematics of different fields. My favourite is that spherical harmonics are used to compress data for light transport in 3D graphics, but also describe the electron orbitals. Such "coincidences" are actually pretty common, and that fascinates me in a philosophical kind of way.
I see the messy room as a valid analogy for explaining entropy. There are all kinds of messy but only one kind of clean. If I use items in my room and leave them at random spots without returning them to where they belong, the room gets messier. "Is my mom happy with the way my room looks" would be the macrostate. There aren't many microstates to make her happy but a lot of microstates to make her unhappy. Therefore there is a natural tendency for the room to get into a configuration that makes mom unhappy unless I invest energy to clean it up.
Would it be possible to describe the entropy of a system in terms of the minimum number of bits needed to accurately describe a region of space of a certain volume?
Question: Way in the far future of the universe, when all matter and energy have reached equilibrium (maximum entropy), how can the second law of thermodynamics still be followed (i.e., if entropy is at maximum, it can't go further up)?
Balrighty 35 It won't. At that stage entropy will decrease in local areas of space. Given an unimaginably long period of time, structures will form purely randomly. Look up Boltzmann brains for more information on this.
@@ebenolivier2762 ... given an unimaginably long period of time, the inexorable expansion of space between the remaining maximally dispersed particles will dwarf their differences in velocity. Now, they can never meet. Structures random or otherwise cannot form. Plus, there really isn't such a thing as 'random' in thermodynamics.
Seth of Lagos This will only happen in the "big rip" end of the universe. We don't really have sufficient models to predict whether this will happen or not. We also cannot say for certain whether what I said would happen. It's all just speculation if one thinks that far into the future. The idea of the Poincaré recurrence time and Boltzmann brains is quite fascinating though. 😊
I have learned more from this video than from other videos and articles about entropy. This video already corrected a few misconceptions I had about entropy. I am getting there. I have trouble understanding it, but that only means I've got to keep trying. I'll understand entropy if it kills me.
Thought experiment: If the arrow of time is the process of Entropy, then when the Universe ultimately reaches its highest state of Entropy, and only randomized subatomic particles exist, does the arrow of time still exist? We know that the current state of the Universe can produce high levels of emergent complexity, far from equilibrium. At the Universe's state of highest Entropy, we would have only subatomic particles, distributed randomly throughout the Universe, and no measurable arrow of time. Would the probability of widespread, spontaneous emergence of complexity, increase "exponentially" (from our present point of view), because arrow of time no longer exists? In this high-entropy macrostate, the huge number of interactions, required to finally produce cosmic-scale emergent complexity, would be occurring "instantaneously", by particles moving at light speed, "outside of time". So, high-entropy conditions, at the end of the Universe, are somehow equal to the low-entropy Singularity of the Big Bang?
Excellent episode! In my 35 years of engineering design, an understanding of thermodynamics has been the defining factor in the ability of a designer, engineer or scientist! Without this understanding I wouldn't even call someone a team member. PS. The room with all air molecules on one side could happen; it's just that the universe is not nearly old enough, and might not survive long enough, for this to come true. But wouldn't it be cool to see, assuming you could build a room full of air that you could maintain and continuously monitor forever!
I think it would be possible to use an advanced quantum computer to scan through possibility space, focus on the set of possible microstates that is consistent with the current macrostate of your room, and then somehow guide your path through that phase space to the particular set of microstates where all the air is on one side of your room for some duration t. If there was a device that could do that, then essentially you would just need to tell it the desired conditions you're looking for, it checks to see if it's possible to 'get there from here', and if it is, then the device would steer you through the path of evolving microstates that corresponds with the desired evolution of your macrostate. So, conceivably, you could just ask the device to 'make it so all the air goes to one side of the room for the next 30 seconds', and it'll just steer you/your macrostate through the path of that phase space in which the desired macrostate actually occurs. You could do anything, really. You could have your device steer you through the evolution of microstates which corresponds to your room full of air safely flying to Mars with you in it. Such an evolution is incredibly statistically unlikely, but since it is still possible with non-zero probability, a sufficiently good device would be able to find that desired path of evolving microstates and steer you through it.
My teacher linked this in the PowerPoint presentation of our last topic in my second year physics class about thermodynamics lmao I'm blown away a channel I watch for fun is featured as a part of my assignment lmao
Entropy would exist even if specific particle interactions that aren't time symmetric didn't exist. Hypothetically even if you have a system with all its particles behaving symmetrically with respect to time you would still have increasing entropy in one temporal direction. So you will always have entropy acting as an arrow of time even in systems which have no other such compass.
I guess it would be local maxima. Like billiard balls: they could all be in the pockets in any arrangement, any of which is more likely than them being arranged still on the table.
I think what you are asking is whether there can be situations where two distinct macrostates of a physical system corresponds to the maximum number of microstates that the system can exist in. Well, there's always a macrostate with more microstates associated with it. A Universe in which there exist isolated physical systems has lower entropy compared to a Universe filled with only photons. Thermal equilibrium is merely when all the fast processes have already occurred, while the really slow processes have yet to take place.
If the mentioned macrostates are distinct, they would be separated by certain intermediate macrostates which are less likely. This means that if the system reached one of the most common macrostates, it would not spontaneously switch to the other such macrostate, unless pushed from outside. There are many such bistable systems. Your computer's memory and disks are built of trillions of them, which allows storing information. Each bistable system can store 1 bit of information.
Conservation of charge. This can be electric charge, weak hypercharge, or sometimes a number conservation like baryon number or lepton number, depending on the context. These are all U(1) phase symmetries.
Michael Roberts so local phase invariance gives electric charge conservation and global phase invariance gives the conservation of other types of charge or quantum number?
Nothing - conservation only appears with local symmetries. The fact you have a global symmetry merely flags the possibility that 'localizing' (AKA gauging) that symmetry may lead to the discovery or reinterpretation of some physical phenomenon
They are all examples of local symmetries, but for different Lagrangians. The QED Lagrangian has a phase symmetry that corresponds to conservation of electric charge. The electroweak Lagrangian has a phase symmetry for conservation of weak hypercharge that's broken by the Higgs field. But Iago is right that these all have to be local symmetries for the conservation laws.
This video was brought to you by Kyubey. Get your Mahou Shoujo cosplay costumes today! You can do your part to lower entropy and save the rest of the universe. The rest of the universe needs you!
Entropy can be thought of as a measure of a system's organization or disorganization. Generally: high entropy = low organization (disorder, randomness); low entropy = high organization (order, structure). For example, a gas in a sealed container at thermodynamic equilibrium has high entropy, because the molecules move randomly without a structured arrangement. A crystal, on the other hand, has low entropy, because its molecules are arranged in an ordered and structured manner. However, as you mentioned earlier, entropy can also be understood as a process of reorganization through randomness, leading to the renewal of systems and greater complexity and order over time.
1. High entropy and randomness: in systems like a gas at thermodynamic equilibrium, high entropy reflects disorganization and uniform energy distribution. Although seemingly chaotic, this randomness creates conditions that allow reorganization when external forces (e.g., gravity, temperature) act on the system.
2. Low entropy and order: a crystal represents a state of low entropy due to its highly organized structure, achieved under specific conditions that channel energy and molecules into a stable pattern.
3. Entropy as a driver of renewal: the randomness of high entropy creates space for unpredictability, where natural forces can reorganize elements into more complex structures. Example: after a supernova explosion, the dispersed gas and dust (high entropy) provide the material for the formation of new stars and planets (emergent order).
4. In the long term: entropy can be viewed as a tool of the universe that transforms local disorder into greater complexity and order, perpetuating a constant cycle of creation and renewal.
Conclusion: this perspective positions entropy as a constructive and transformative force rather than a destructive one. It is not the "end", but rather the natural pathway for reorganization and evolution, enabling the emergence of more complex structures in harmony with the universe's dynamic conditions.
I added my "3rd law of thermodynamics" -- that the Universe does everything possible automatically to increase the speed of entropy. It wants to die ASAP. This is the ONLY purpose of "life", and therefore all life is equal. Evolution works because higher intelligence always leads to higher entropy. The Universe is full of organic molecules that self-assembled from the elements created beforehand. Given sufficient time these organic molecules then self-assemble into "life", because life is better at increasing the speed of entropy than non-living inorganic molecules (i.e. rocks). The end result is that we will find life everywhere. Much of it will not resemble our carbon-based life. The Universe does not care about us or about any life form -- other than the fact it is speeding up entropy. We are totally on our own. This provides us wonderful opportunities. We should focus on life extension technology and "stasis" so we can journey to the stars without FTL space drives.
You can apply the concept of entropy to a lot of stuff, in a lot of different fields. For example, in information theory, you can calculate the entropy of a text. The text "abababab" has a higher entropy than "aaaaaaaa" for an alphabet comprised of just a and b. But for an alphabet from a to z, it still has a very low entropy. You have a lot of definitions on wikipedia: en.wikipedia.org/wiki/Entropy_(disambiguation) Each one means a slightly different thing.
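If you want to check those numbers yourself, here's a minimal Python sketch of Shannon entropy over a string's character frequencies (note it measures the symbol distribution only, not sequential patterns, which is why "abababab" scores a full bit per symbol):

```python
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Shannon entropy in bits per symbol of a string's character distribution."""
    n = len(text)
    return sum(c / n * log2(n / c) for c in Counter(text).values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 - one symbol, zero surprise
print(shannon_entropy("abababab"))  # 1.0 - two equally frequent symbols
# max possible for an a-z alphabet would be log2(26) ~ 4.7 bits/symbol,
# so 1.0 bit/symbol is still very low entropy there, as the comment says
```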
I think if you defined the macrostates under consideration, you could just use/define the log of the number of microstates belonging to each macrostate to be the entropy of that macrostate. However, the fact that Conway's Game of Life is not reversible might make some stuff different. Many states lead to the empty grid state, but if "an empty grid" is a macrostate, that might suggest "entropy" decreasing? So that particular definition might not work out great. Maybe if you defined all still lifes (where nothing changes from step to step) as belonging to the same macrostate, and all states with period 2 as having another macrostate, and so on, then that could fit, because that way those macrostates have high entropy? (Or maybe vary this in some way like taking into account the total number of live cells, or the furthest cell from the origin, when defining the macrostates). I'd guess that there is some interesting way of picking the macrostates such that the entropy never decreases. Would be especially nice if it could be done in a way such that you could add it between different parts of the board, but that sounds hard? I think people who study GoL have defined a measure they call temperature. Maybe they defined that based on entropy?
Not all, but it's REALLY inapplicable to Conway's game of life, because there's no conservation of mass (or # of live cells) in that, structures can increase in mass without limit. That was after all the thing that the first glider gun won the prize for doing. And if you can already violate the 1st law of thermodynamics, who CARES about the 2nd one, the 2nd one is irrelevant if you can violate the first.
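Whether or not thermodynamics proper applies, the "many states lead to the empty grid" observation above is easy to check by brute force on a toy board. A sketch, assuming a tiny 3x3 toroidal grid (512 states total), counting how many states die out in one step - the many-to-one mapping is exactly the irreversibility being discussed:

```python
from itertools import product

N = 3  # 3x3 toroidal board, small enough to enumerate all 2^9 = 512 states

def step(grid):
    """One Game of Life update; grid is a tuple of 9 cells (0 or 1)."""
    out = []
    for r in range(N):
        for c in range(N):
            live = sum(grid[((r + dr) % N) * N + (c + dc) % N]
                       for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                       if (dr, dc) != (0, 0))
            out.append(1 if live == 3 or (grid[r * N + c] and live == 2) else 0)
    return tuple(out)

empty = (0,) * 9
predecessors = sum(1 for g in product((0, 1), repeat=9) if step(g) == empty)
print(f"{predecessors} of 512 states map to the empty grid in one step")
```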
You almost mentioned my favorite physicist: Josiah Gibbs. And your outro is tantamount to saying: Please, "don't die" until I see you next time on Spacetime.
"My room is dirty because entropy" "Yes son but if we're trying to obey entropy, your room must as efficient as possible for you to successfully bring about a fast, universal, entropic equilibrium. Ain't no entropy-efficient babies being made in this room. Clean it or get a real job."
I am an aerospace engineer and I have been mocked by these "physicists" for being an end-user of their theories and equations. Guess what?! Sadi Carnot, the father of Thermodynamics, was actually a mechanical engineer. On behalf of all mechanical and aerospace engineers in the world: IN YOUR FACE Sheldon Cooper!
- "Entropy has been credited with predicting the ultimate heat death of the universe." - "Hold my beer and just give me a few disposable high-school girls" /人◕ ‿‿ ◕人\
I understood a little something of entropy from his closing words: "Be careful to keep your number of accessible microstates low (the chemical processes inside of our bodies should be kept on their natural course/function, unaffected as much as possible by external matter), avoid thermal equilibrium (discomfort or death) and keep being that brilliant macro state that is you (the culmination of most likely microstates)..."
I love it when creationists use the argument that complex life couldn't have arisen on Earth because entropy always increases. They always forget the Earth isn't a closed system /:
No, it's much worse than that, because you see, the rise of complexity is EXACTLY what the 2nd law of thermodynamics predicts. After all, entropy also goes by another name, which has positive connotations, and that name is "information". Higher entropy MEANS higher complexity, that's what entropy IS. So of COURSE it's not a contradiction for complex life to be a higher entropy state.
Hmm, but doesn't that raise the question of how the universe got to a low entropy state in the first place? This is like the problem of infinite regression of time.
Penrose makes a big deal of this and proposes his hypothesis of conformal cyclic cosmology as an answer, which predicts a high-entropy heat-death universe can give rise to a low-entropy big bang. Very speculative but the notion gives the prediction we should see correlations in the CMB radiation of a given angular size; don't think he (or his collaborators) have found anything yet though.
+Al Quinn Oh wow. That sounds really interesting. I wonder if it eventually turns out to be true. Hmm but how would a high-entropy universe spontaneously give rise to a low-entropy big bang?
Penrose argues the solution to the black hole information paradox is that the information is destroyed, which resets the entropy to a much lower level once all BHs have evaporated. I'm just a RUclips troll though so I don't pretend to understand this at great depth. Long talk he gives here: ruclips.net/video/4YYWUIxGdl4/видео.html The part where he discusses how to get around the 2nd Law is around 1h 19m
My cat goes out into the cold, then comes in to warm himself in my bed. He is a sort of engine for moving heat out of the apartment. I don't know if he does any useful work, though.
Not so dumb, and close to a theory!
what an underrated comment
Your cat eats mousies. Digesting them creates heat, to warm you. They are well insulated, so the cold outer fur is less of a heat sink than your cat is a heat source. You must thank and cuddle your hot kitty bottle!
Always remember: dogs have a master, cats have servants.
*dlevi67* Word.
Now that I'm 70, I have a clear understanding of entropy when I look in the mirror .
That's rough.
You mean the second law of thermodynamics.
Either .
@@StanislavMudrets well think you could say
🌚
"keep beeing that brilliant macrostate that is you" ... noone has ever given me a better compliment ☺️
Holy crap, this was my Astronomy professor last semester. Thanks for the lessons 👌🏻
Lucky.....
""Welcome to Entropy Burgers - may I take your order?"
"I put in disorder a long time ago. The service here is getting worse all the time."
"My experience Gibbs me reason to believe you."
"I know the waitress who asked that, too. Her name's Ellen Omega. She really made me thermally dynamic. So, I asked her out. I tell you, when she don't like you, she really Boltz, man. Women like that are never distributed normally among the population."
"What kind of Poisson would say something like this?""
That really hurt, ouch
i read it till the end
Fantastic explanation of entropy, thanks.
It is slightly worse than you suggest. The idea that every micro-state is equally likely is a mathematical simplification, but to end in a particular state you have to get there.
Take the state of a go board: half the black stones on one side, representing all "hot" atoms in one place, versus a similar state with one black stone out of place. Getting the out-of-place atom to the right point on the board is even less likely, as it would have to move against the temperature/pressure gradient.
Also, you are really comparing a high entropy state, with all atoms having an around-average kinetic energy (speed), to one where half the atoms have all the energy. Collisions won't do that. The statistical model, with its simplifications, becomes inaccurate at the extremes.
There is no such thing as disorder. If one drops a deck of cards off the top of a building, just what makes the result on the ground define "disorder"? The fact that some cards are up and others are down is no good, since blackjack and 7-card stud are forms of order and each of those games has some cards up and some down. The scattered cards do not represent disorder, since the winner of a card game makes a similar so-called mess out of the formerly neatly stacked chips as he or she gathers them toward him or herself. Therefore, entropy only describes how the number of possible orderings ends up larger than the number we started with, along with a reduction in temperature over time.
@@drakeequation521 The fact that disorder isn't objective doesn't reduce its subjective reality. If you are interested only in whether all your cards are in a nice pile, then the line between order and disorder is drawn there - if you sorted the cards in some specific way, any other sorting would be disorder for your purpose.
If you use the Bayesian concept of probability theory to derive statistical mechanics, its subjective nature is utterly obvious.
I prefer the sand castle analogy.
There's trillions of arrangements of sand particles that define the macro state "lump of sand"
But there's only millions that define the macro state "pretty sand castle"
That's why the wind turns sand castles into lumps of sand, not lumps of sand into sand castles.
Straight probability.
And it shows you the "arrow of time"
What I've always wondered is, does increasing entropy create the arrow of time?
Or does the arrow of time increase entropy?
Does time even exist?
Have you read "The Order of Time" by C. Rovelli?
Nope
Hmmm. From my perspective: the arrangement of sand after the wind blew down your sand castle was highly meaningful (one of only a handful of meaningful arrangements of the sand). I found your "sand castle" to just be a random lump of sand. So from my perspective, the wind decreased the entropy. Is entropy relative? Is it wrong of me to have found that arrangement meaningful? Is time running backwards for me?
Duane Eitzen - I'm gonna have to go with B: "Is it wrong of me to have found that arrangement meaningful?"
Or I guess, maybe not wrong so much as missing the point. It isn't about being "meaningful" to some intelligent mammal. The pile of sand is more statistically likely (assuming no outside energy, like manufactured buckets and expenditure of calories) than the sand castle. Thus, over time, sand castles tend to turn into piles of sand, and piles of sand tend to turn into other piles of sand.
You are free to find the most statistically common arrangement of particles more meaningful if you desire, of course. Whatever "meaningful" means in this case.
Though, I suppose there _is_ always the possibility that time is actually running backwards for you somehow, whatever that would actually entail. I can't imagine that the rest of us would be able to measure whether or not that was happening to you.
There's a lot more than millions, and a lot more than trillions. It's more like 10^(10^12) vs 10^(10^6), not 10^12 vs 10^6. And THEN you see why the wind doesn't turn lumps of sand into sand castles.... because if there was a 1 in a million chance, there'd actually be a LOT of sand castles spontaneously generated. But no, it's not 10^6/10^12 = 10^-6; the odds are more like 10^(10^6)/10^(10^12) = 10^(-0.999999*10^12), which is about as infinitesimal as 10^(-10^12) itself is.
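Those double exponents overflow any float, but the comparison works fine if you stick to the base-10 exponents - a quick sketch using the comment's own illustrative numbers:

```python
# 10^(10^6) castle-like microstates vs 10^(10^12) lump-like microstates:
# the numbers themselves overflow any float, so compare their log10 values.
log10_castles = 1e6
log10_lumps = 1e12
log10_odds = log10_castles - log10_lumps   # log10 of P(castle)/P(lump)
print(f"P(castle)/P(lump) ~ 10^({log10_odds:.6g})")  # ~10^(-999999000000)
```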
IN THIS HOUSE WE OBEY THE LAWS OF THERMODYNAMICS
But I dont wanna
*Most of the time
My father didn't find that funny at all when I explained it to him as "this is what happened on The Simpsons", but when he later watched it, he practically shit his pants.
Your kids will be pretty pissed for not being allowed to use this fancy new DeLorean called a time machine.
This perpetual motion machine she made today is a joke! It just keeps getting faster.
Really well explained. The go board analogy was a brilliant touch.
It blows my mind how good Matt is at presenting! Explaining the most complex physics and topics science has. Every time I watch, I feel like I understand something, for an attosecond, then he says something else and it's gone.
You're the best, Matt, keep it up. I might be able to hold on to some of your conversation soon enough.
I can relate, but you need to watch his videos two or three times to fill the gaps left after the first time. The investment in time is worth it.
I spent a semester in college writing a research paper on entropy. That's what it took to get me to wrap my head around the subject...
This guy looks like he's got two personalities: a calm, academic one for the cameras and wilder one for the pub with his mates lol
I think you're right. His jokes are too funny and too cool😂
I think he actually has a rock band or plays in one or something like that
He possesses different microstates? So, how would his entropy go? Let's see in a year or... two, three... Five?
There is just one thing I would like to add. In every episode you state things and draw conclusions in a very elegant and simplified way. It would be a nice touch to emphasize, every time, what the fundamental assumption is on which the whole model stands, and what the consequences would be if that axiom were not true. In this case the fundamental assumption is that ALL possible physical states are EQUALLY probable. It is possible to change the outcome of a phase propagation by messing up the probabilities. Also, even when you keep the "equally probable" axiom, the macro probabilities are highly dependent on the definition of the micro states. That is also a good filter for messed-up theories. Not to mention that the infinite improbability drive is the MOST powerful machine that anyone can construct, as it practically means omnipotence.
I've been trying to understand Entropy for a couple of years by now, I watched the video... I still don't get it xd
Maybe try this other video, I found it much better! (it was made longer ago than the above video, yet very similar, I wonder if PBS copied it...)
ruclips.net/video/w2iTCm0xpDc/видео.html
@@bpansky Thanks, it was a good video, I really wanna get that Stirling Engine now it's only $35.
One way to think about entropy I heard in chemistry is roughly as follows: a given sample of matter can only be arranged in a very large but still finite number of ways, and some arrangements can be achieved in more ways than others. As such, the arrangements with the largest number of possible ways to form will generally be the ones matter tends to take when left alone with no input for long enough. Entropy can be considered a measure of how close a given sample is to the most probable state.
Entropy is the degree of randomness in a system. And I think maximum entropy means all possibilities have been exhausted.
😆😆😆😆😆
i think this is my favorite space time episode. revealing how a law arises out of possibilities is just mind blowing.
I'm obsessed with the idea of entropy. Thanks for this!
Regarding all air atoms being in one half of the room, does that mean if a tree falls in the other half, there is a chance it does not make a sound?
SAHM it would most certainly not make a sound
At least we can now conclusively say "Maybe" to the original question of "if a tree falls in the woods, and no one is around to hear it, does it make a sound?"
It would still make a sound, but there would be no medium to conduct the sound waves away... other than the floor and the tree itself
Atlas WalkedAway Yeah I wanted to conclude that every scientific fact/conclusion that we consider can simply become a highly probable guess.
Tony Bastian so what you’re basically saying is it wouldn’t make a sound? Lol
If my room is the entire universe, heat death has already happened
Until you fart that is =)
Bring back the Chaos!
Is your room a uniform space of gas at an even temperature? If no, then not yet.
well no... because you are in your room, and you were still warm enough to write this
😲I can see that it has reached maximum entropy... 😭
thanos:"i am inevitable
2nd law of thermodynamics:"hold my beer"
Although I rarely comment on RUclips content, this made me do so. Just wow! Finally, the best explanation of entropy. Thank you for this!
I'm not sure I could ever be smart enough to fully comprehend this channel but the host still makes it interesting.
To me, Entropy is, first and foremost, about constraints. - Entropy is basically counting how many states are currently *possible.* And that's why ordered states are *usually* (but not *necessarily)* in a lower entropy state: To have a decent chance to actually run into such an ordered state, you have to constrain the system to make that state relatively likely.
For instance, in a typical ordered vs. chaotic room analogy, if you don't put any extra constraints on the state of your room, any *particular* tidy room is just as likely as any *particular* messy one. So in principle, both are consistent with high entropy.
But there are so, *so* many more ways for your room to be messy than for it to be tidy; if you actually find it in a tidy state, it's *very* likely that you put it under strong constraints, essentially removing the possibility of a messy room from the phase space.
Take a single pile of 100 sheets on your desk, for instance.
If we're looking at a neat stack, meticulously arranged so that no piece of paper stands out, that's a very strong constraint on the sheets.
An even stronger constraint would be if, on top of that, they also were alphabetized.
More or less the only (macroscopic) degrees of freedom, at that point, would be how the stack as a whole is oriented and where on the desk its center of mass is. It's effectively a static rigid body at that point. Being tightly stacked and sorted has removed all other uncertainties about it.
But maintaining that constraint is pretty resource intensive. One gust of wind could destroy it all, and you must either protect the pile from that (by building a well-enclosed, wind-tight room) or put extra work into restoring the order to the constraints' specifications after the fact.
Meanwhile, if you couldn't care less, and the pieces of paper can just be wherever, some even crumpled to balls and dropped on the floor, or, I don't know, about to be used as fuel for starting a cozy fire in the extravagant gold-plated fireplace you definitely have? There is hardly any constraint on those sheets then. You couldn't say much about any particular sheet at all, then. Unless you explicitly grab one and check, it's virtually indistinguishable from the others. (And grabbing a single piece of paper to check it out might be feasible, but try that for a single molecule in the air, say)
Every time you halve the volume in which a piece of paper can be (a location constraint), you decrease entropy by one bit. Halving possible states is like answering a single balanced yes/no question: "Is this particular sheet of paper in this half of the room?" is worth one bit. "Is it in this quarter?" would require two bits, "Is it in this eighth?" three, and so on. - And the same is true for other parts of the state space: "Is this the side that's facing upward?", "Is the sheet's current momentum relative to the room in this interval?" *
All that being said, even a very tidy room is going to have rather high entropy: You won't be able to sort the air in the room into neat piles. You won't be able to cool down the pile of paper to such low temperatures that the sheets' individual atoms' momenta drop to essentially 0.
Strictly enforcing a neat and tidy room *will* reduce entropy, since literally fewer micro states are possible then. But compared to the sheer number of possible micro states, the set you actually plausibly have macroscopic control over, even though vast in absolute terms, is virtually undetectable and negligible.
* One caveat: it's only exactly a bit of information if, given your previous knowledge, chances for the answer to be "yes" are just as high as for it to be "no". In general the number of bits gained is as large as how surprised you are by an answer. If you are *very* sure, that gravity never ceases, and it indeed doesn't cease, you effectively gained almost no information, and entropy hasn't further been constrained. - If you suddenly find yourself, one day, completely unaffected by gravity, that'd be a massive surprise, equivalent to a TON of bits.
Enforcing constraints is just a way to reshape the world in a manner that you can then be sure about.
So for instance, if you divide a room in two volumes of equal size and ask about which half one particular particle is in, if it's a vertical split, that will indeed be one bit of information gained. If you force it to remain in that half, you reduced the entropy by one bit. You have pinned down its location twice as well as you did before.
But if you split it horizontally, because there is going to be a (very minor) pressure gradient from top to bottom due to gravity, the answer will give you (just the tiniest sliver) more or less than one bit, depending on where you end up finding it. (If it's in the top half, that's gonna be slightly more, if it's in the bottom half, slightly less)
For your sheets of paper, meanwhile, it's gonna be vastly more likely to find them essentially static (macroscopic momentum relative to the room ~0), so it's virtually guaranteed for them to land in whatever interval contains that 0. If you want to be *super* strict about enforcing your constraints of perfectly defined stacks in a perfectly defined location, momentum would even have to be *exactly* zero... which, of course, is disallowed by Heisenberg. In reality the sheets will always just jitter a macroscopically irrelevant tiny bit, both due to thermal motion of individual atoms and due to constantly being bombarded by air molecules. (And also by the equally present thermal jitter of the desk itself, say)
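That surprise-equals-bits idea from the footnote fits in one line of code. A minimal Python sketch, with made-up probabilities for illustration:

```python
from math import log2

def surprisal_bits(p):
    """Information gained by an answer whose prior probability was p."""
    return -log2(p)

print(surprisal_bits(0.5))    # 1.0 bit: a balanced yes/no question
print(surprisal_bits(0.25))   # 2.0 bits: pinning a sheet to one quarter of the room
print(surprisal_bits(0.999))  # ~0.0014 bits: confirming something almost certain
```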
I think the thermal jitter of your desk was increased through your keyboard overheating.
Nah, my computer's processor and stuff produces plenty more heat than my typing ever could.
:-D
I think you need to factor Tracey Emin into your analogy. She purposely creates a disordered state, in a room, for art.
So, to extend it to its absolute conclusion, the constraints that can be imposed on the universe, by intelligent intervention, could prevent the heat death of the universe? Say by creating a perfect vacuum flask. Warm outside. But hot (or cold) inside.
Are you aware of Maxwell's demon? A theoretical construct that could sort a chamber to have all of the air on one side, by opening a small hatch to allow air atoms to pass to one side, but shutting it to stop them from coming back.
In practice, a being like that would most likely consume a lot more energy than any useful energy that could be gained from flooding the resulting vacuum, because it needs to track atoms and open/close a hatch fast enough to catch them.
I'm going to reverse entropy, hold my beer... yportne.
Done.
Now please consider the energy required for you just to type this... Sorry, bro. You added external energy to the system. Doesn't count. But nice try. ;-)
Slow down Maxwell’s Demon.
I want to argue but, Bravo sir, bravo.
!!! esrevinu eht fo htaed taeh
This seems to check out. We might be on to something here.
I think life is the best effort the universe has made against chaos .
Never understand anything this man explains. Maybe I'm not smart enough, but all other YT channels explain everything so much more clearly that, there, I do learn something. Slow it down, my man, or dumb it down.
Am I really a macro state? That is the nicest thing anyone has said to me in a long time
Thank you, thank you, thank you for addressing entropy in this and upcoming videos! It's why I became a patron of Space Time. I have a bunch of questions about your excellent introduction to entropy, which covered a lot of conceptual ground in a short amount of time. I'll break up my questions into separate comments.
2:06 Slight correction, but even a perfect Carnot cycle engine won't turn all the heat going into it into work; some of the heat will go to the lower reservoir. If this weren't the case you could combine a Carnot engine with a Carnot refrigerator to get a perpetuum mobile. Carnot proved that the Carnot cycle is the most efficient thermodynamic cycle possible, but even that cycle is not 100% efficient.
It is if the cold reservoir is at zero Kelvin
@@dcamron46 true but in a similar way you can prove that to reach 0 Kelvin you need infinite energy (IIRC)
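For reference, a quick sketch of the standard Carnot bound this thread is invoking, eta = 1 - T_cold/T_hot (the temperatures below are illustrative):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of input heat convertible to work between
    reservoirs at t_hot and t_cold (both in kelvin)."""
    return 1 - t_cold / t_hot

print(carnot_efficiency(500, 300))  # 0.4 -> 60% of the heat is still rejected
print(carnot_efficiency(500, 0))    # 1.0 -> only the unreachable 0 K limit gives 100%
```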
Avoid thermal equilibrium? Don't tell me what to do!
*equilibralizes himself thermally*
did you just... commit suicide?
RIP
Jack Vernian hell of a euphemism, isn't it? But hey, the universe ain't got shit on me, I did heat death before it was cool (pun intended)
ded
Lakshay Modi at least i feel very equally distributed now.
Great final comments. Learning involves doing, not just listening and memorizing. It’s always worth the work.
Entropy... it just ain't what it used to be. ;) :)
No, it's all changed for this dumbass who wanks off to his god of nothing that did it all naturally.
omg hhaarrgghh brilliant
🏆 Joke of the Universe!
Like all other physics, I barely get the math, but I get the concepts, I finally understand entropy. So thanks!
Hi, I'm sorry, this is an old comment, but it struck me and I had to ask. Is this true? Do physicists struggle with the math too? I've always felt drawn to Physics, but I am so bad with numbers 😅😅 Is there hope for me in the field? Lol
You had me at "42" and GO Board!
Fredrick Nietzsche I saw the Go board and had to watch. I love Go.
Hey what's 42 about
@@andrewwright4304 H2G2
@@TheZentegi yep
This a much better explanation of entropy, thank you! When I first heard the 2nd law of thermodynamics as "a system's entropy (disorder) will always increase over time", it sounded like these people had no idea what they were talking about. Big bang -> atoms -> molecules -> stars -> galaxies -> solar systems -> planets -> life -> humans -> intelligence -> consciousness. It's pretty obvious to anyone that the universe has only become more orderly over time.
Describing entropy as the number of possible random states that a system can have (or the amount of information needed to describe a system) makes a lot more sense.
I once heard someone use Sudoku as an analogy for entropy. When you're trying to solve a square in an empty row, it can be anything from 1 to 9 (maximum entropy), but as other squares are resolved, the number of possible states changes to let's say 1,3,5 (lower entropy). So ironically, it seems us humans enjoy puzzles where we reduce entropy in a system. As if it's a fundamental part of our nature.
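The Sudoku analogy can even be made literal, treating each remaining candidate digit as equally likely (a simplification, and my own toy example):

```python
import math

def cell_entropy(candidates):
    """Entropy in bits of one Sudoku cell, assuming all remaining
    candidate digits are equally likely."""
    return math.log2(len(candidates))

print(cell_entropy(range(1, 10)))  # empty row: log2(9) ~ 3.17 bits
print(cell_entropy([1, 3, 5]))     # partially constrained: log2(3) ~ 1.58 bits
print(cell_entropy([7]))           # solved cell: 0 bits
```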
Can't wait for the explanation of the temporal pincer movement
Note that entropy can decrease from time to time, just shortly, but it happens. From Quora (Jack Wimberley, Ph.D.): The second law of thermodynamics does not absolutely state that the entropy of a finite system cannot decrease - only that the probability of it doing so vanishes in the limit that its size becomes infinite.
So the cosmos is a finite system, right? 😐 And if you say infinite, then how did the universe form in the first place? 🤔
Don't worry, we can fight entropy by granting the wishes of little girls (turning them into "Magical Girls"), then harvesting the energy that is created when their wish implodes and turns them into a "Witch".
/人◕ ‿‿ ◕人\
YESS A FELLOW FAN!
Until Homura messes everything up ;)
@@crimsonstrykr you mean fix :)
Kind of you to say "Don't die". Uhh ... likewise!
I guess that's actually a pretty good life lesson. Just think of all the Darwin Award winners. If they had just known ...
or "don't become stupid"
This is the best channel on YouTube - hands down.
This is an awesome lecture for introductory statistical mechanics.
"no mother, i shant clean my room, as the effort i spend cleaning will decrease the amount of energy available for our civilization at the end of ends!"
"avoid thermal equilibrium"
hes literallly telling us to stay alive right?
Either that or stay out of uncomfortable 98.6 degree F weather.
Yea. A body at equilibrium with the environment is a dead body
Hahaha
Avoid being room ( morgue) temperature
I like the numbers and probabilities you guys give, but if it's not too much trouble could you walk us through the math?
zombie blood which maths of all?
A board of n cells, where each cell is either occupied or unoccupied, has 2^n possible states. That's the same as a sequence of n coin flips. If it's the case that half are occupied and half are unoccupied, then the associated number of states is the binomial coefficient C(n, n/2). The rest is number-crunching.
But it shows why, say, in a sequence of coin flips, the most probable state is one where the number of heads and tails are equal or close to equal.
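A short sketch of that number-crunching (n = 20 is an arbitrary choice): the binomial coefficient peaks at half heads, which is exactly why the balanced macrostate dominates.

```python
from math import comb

n = 20              # coin flips, or cells on a tiny board
total = 2 ** n      # total number of microstates
for heads in (0, 5, 10, 15, 20):
    print(f"{heads:2d} heads: {comb(n, heads):6d} microstates, "
          f"p = {comb(n, heads) / total:.4f}")  # peak at heads = n/2
```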
Thermodynamics utilizes a lot of calculus, nothing too advanced, specifically differentials (at the senior undergraduate level). For instance, an entropy change is described more succinctly as dS = dQ/T. One thing I'm glad they stated is that thermodynamics describes large scales. Statistical mechanics describes small scales. The mathematics of statistical mechanics is a lot more involved, especially as the systems grow in scale. Statistical mechanics does use a lot of probability expressions that are termed partition functions. Where statistical mechanics grows in complexity in describing systems mathematically is in non-equilibrium problems.
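For anyone curious what a partition function looks like in the simplest case, here's a sketch for a hypothetical two-level system (my example, not from the video); at high temperature the entropy saturates at ln 2 per k_B, i.e. one bit.

```python
import math

def two_level_stats(delta_e, k_T):
    """Occupation probabilities and entropy (in units of k_B) for a
    two-level system with energy gap delta_e at thermal energy k_T,
    using the partition function Z = 1 + exp(-delta_e / k_T)."""
    z = 1 + math.exp(-delta_e / k_T)
    p0 = 1 / z
    p1 = math.exp(-delta_e / k_T) / z
    entropy = -(p0 * math.log(p0) + p1 * math.log(p1))  # Gibbs entropy / k_B
    return p0, p1, entropy

for k_T in (0.1, 1.0, 10.0):  # thermal energy in units of the gap
    p0, p1, s = two_level_stats(1.0, k_T)
    print(f"kT = {k_T:4.1f}: p(ground) = {p0:.3f}, p(excited) = {p1:.3f}, S/k_B = {s:.3f}")
```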
ErwinSchrodinger64 I'm in high school. I love learning above my level but what you said was pretty confusing thanks for trying though. I'll have to come back at a later date after learning more.
zombie blood Hahahahahaha...
Public funding...
Badlaama Urukehu Willing to learn isn't good enough for you, champ?
AT LAST! The one thing I've always tried to explain to smirking science nerds who believe they know an enormous amount, but in reality are rather ignorant: that order is not the same as low entropy! Thank you.
What a beautiful sentence, I'm definitely using it:
be careful to keep the number of accessible microstates low, avoid thermal equilibrium and keep being that brilliant macro state that is you until I see you next week
You're a super instructor. I love how simple-like and matter-of-fact you are. You make the second law sound like stacking blocks or something. You're so cool.
I read a book while in a physics class in college: Entropy, by Jeremy Rifkin, I think that was his name. It scared me to death. It didn't help being in physics at the time. It's like a nightmare. A coma. You'll never wake up, you'll only eventually die. You're being pulled through the hallway of life into the gaping mouth of a giant freezer - DEAD end. That's what reading that damn book felt like. It was scary. You don't really know what's truly 'real' and all of this endless mathematics and theory theory theory. I'm glad I'm outta there, that's all I can say.
Thinking's cool, but sometimes I feel like the Bros. Grimm make more sense. Really.
It's not just energy coming in from outside that can reduce entropy; it can also be energy lost to the outside. If you had some kind of magic heat sink that was infinitely cold forever, you could extract a ton of useful work from a universe at maximal entropy... because of the energy being lost from the rest of the universe into that "cold hole". No outside energy input required.
Pfhorrest wouldn’t that situation be equivalent to air being in only half the room? i.e. as the cold hole fills up entropy is increasing not decreasing.
Doesn't the temperature of the universe decrease, thus not affecting entropy?
Virgil yes, but useful work is done by increasing entropy. You can think of "negative entropy" as "fuel", and then when you spend that fuel to do work, entropy is the "waste product". The existence of a "cold hole", an infinite heat sink, creates an energy gradient as big as the total energy content of the universe, and thus extremely low entropy. As all of the energy in the universe got sucked into it, entropy would go up, but you could do a ton of useful work using that "destruction of energy".
For a less fantastic example, imagine a closed system of a balloon that is somehow perfectly insulated from the outside world. The air in the balloon is at maximum entropy; for a tiny being living inside that balloon, there is no possibility to do useful work (so I guess they wouldn't really be living to begin with, natch). But if you poke a hole in the balloon, and let's just assume it's the infinite vacuum of space outside the balloon, suddenly the air in the balloon goes rushing out, and the total energy content of the balloon goes down, but beings living inside the balloon could use that rush of energy out of the balloon to do useful work, say by putting a wind turbine over the hole to generate electricity.
The point is just that you don't have to put stuff IN from the outside to create an energy gradient capable of doing useful work, you can also let stuff OUT to the outside to do so.
Your magic heat sink sounds like absolute zero. If the cold reservoir of a Carnot engine could be at zero kelvin, the efficiency of the heat engine, 1 - T(cold)/T(hot), would reach 100%, turning heat entirely into work. That's why the third law of thermodynamics forbids such an infinitely cold heat sink.
The heat sink wouldn't have to be at absolute zero, just anywhere below the average temperature of the universe; nor would being at absolute zero be enough to make it function as described. What's more important it just its total heat capacity. If you had a penny that was at absolute zero and tossed it into a universe at maximal entropy, you'd get as much work out as it would take to raise the penny to the temperature of the universe, but that's all.
But if you had a penny made of some magic material with infinite specific heat (i.e. it takes an infinite amount of energy to raise its temperature), that was only slightly colder than the rest of the universe, then it would suck and suck energy from the rest of the universe until it had lowered the temperature of the whole rest of the universe to its own slightly lower temperature.
For an easier to illustrate example, imagine you have a pebble with infinite specific heat that's currently at -1C, which is a lot warmer than most of the universe, but say you dropped it onto planet Earth, where it would probably land in an ocean. All of the world's oceans would freeze. The atmosphere everywhere would be brought down to -1C. All of the rock in the planet would be brought down to -1C. The sun would keep shining on the world trying to warm it up, but all of that heat would be soaked up by the pebble, which is still at -1C.
Eventually the sun would expand to a red giant and swallow up the Earth, but the part of the sun that came into contact with the Earth would be cooled down to -1C as all of its heat was absorbed by the pebble, though of course the rest of the sun would convect more heat to the part around the Earth but that would be soaked up too until the entire sun had been cooled enough that it shrank back away from contact with the Earth.
But, that would all take a while. And meanwhile here on Earth, the temperature gradient between the frozen pebble and the environment surrounding it would be a source for extracting useful work. It'd basically be a source of nigh-limitless free energy, as all of the unusable heat energy of the world, and all of the heat energy being pumped into the entire planet by the sun, would become a comparatively high-temperature energy source that could be pumped into the cold little rock to power a heat engine.
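The quantitative version of that point, as a hedged sketch with invented numbers: any temperature gap at all lets an ideal engine extract some work, per W = Q * (1 - T_cold/T_hot).

```python
def extractable_work(q_hot, t_hot, t_cold):
    """Work an ideal (Carnot) engine could extract from heat q_hot flowing
    from an environment at t_hot into a cold sink at t_cold (kelvin)."""
    return q_hot * (1 - t_cold / t_hot)

# Hypothetical numbers: a 288 K environment dumping heat into a 272 K pebble
q = 1.0e6  # joules of otherwise-useless environmental heat
print(f"{extractable_work(q, 288.0, 272.0):.3e} J of work")  # small fraction, but nonzero
```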
I studied economics, and one of the most amazing things is that entropy can be applied to economic systems (like ATM networks) or sociological systems in the same way as in thermodynamics (excuse my English, I'm Italian and we never learn :) )
matteo conz interesting
The approach you mention rests simply on mathematical system analogies - personally I'm in favor of a deeper identification, like that proposed by Georgescu-Roegen (bar his spurious laws of thermodynamics)
Wasn't it also discovered independently in economics as well?
As an Italian, I appreciate that you follow the channel
I'm always pleased to find areas of overlap in the mathematics of different fields. My favourite is that spherical harmonics are used to compress data for light transport in 3D graphics, but also describe the electron orbitals. Such "coincidences" are actually pretty common, and that fascinates me in a philosophical kind of way.
dS = dQ/T only for a reversible process
dS > dQ/T for an irreversible process
I see the messy room as a valid analogy for explaining entropy. There are all kinds of messy but only one kind of clean. If I use items in my room and leave them at random spots without returning them to where they belong, the room gets messier.
"Is my mom happy with the way my room looks" would be the macrostate. There aren't many microstates to make her happy but a lot of microstates to make her unhappy.
Therefore there is a natural tendency for the room to get into a configuration that makes mom unhappy unless I invest energy to clean it up.
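A toy version of that mom-macrostate counting in Python (the item and spot counts are made-up): one tidy arrangement versus astronomically many messy ones.

```python
import math

items, spots = 20, 51               # 20 items, each with 51 possible positions,
                                    # exactly 1 of which counts as "where it belongs"
microstates_total = spots ** items  # every arrangement mom might walk in on
p_tidy = 1 / microstates_total      # chance a random arrangement is the clean one

print(f"total arrangements: {microstates_total:.3e}")
print(f"P(tidy by chance)  = {p_tidy:.3e}")
print(f"entropy gap: {items * math.log2(spots):.1f} bits of tidying")
```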
Would it be possible to describe the entropy of a system in terms of the minimum number of bits needed to accurately describe a region of space of a certain volume?
Question: Way in the far future of the universe, when all matter and energy have reached equilibrium (maximum entropy), how can the second law of thermodynamics still be followed (i.e., if entropy is at maximum, it can't go further up)?
Balrighty 35 It won't. At that stage entropy will decrease in local areas of space. Given an unimaginably long period of time, structures will form purely randomly. Look up Boltzmann brains for more information on this.
@@ebenolivier2762 ... given an unimaginably long period of time, the inexorable expansion of space between the remaining maximally dispersed particles will dwarf their differences in velocity. Now, they can never meet. Structures random or otherwise cannot form. Plus, there really isn't such a thing as 'random' in thermodynamics.
Seth of Lagos This will only happen in the "big rip" end of the universe. We don't really have sufficient models to predict whether this will happen or not. We also cannot say for certain whether what I said would happen. It's all just speculation if one thinks that far into the future. The idea of the Poincare Recurrence Time and Boltzmann brains is quite fascinating though. 😊
@@ebenolivier2762 No big rip or other speculation required. Just simple isentropic gas expansion into an ever-expanding void.
The equation for an entropy change is heat flow divided by temperature (dS = dQ/T). As temperature approaches 0, the entropy change for a given heat flow approaches infinity. There is no maximum.
as I watch several of these in a row, I can assume you love "Hitchhiker's Guide to the Galaxy" 😆😆
Always loved stat. mech. My first encounter with the concept of "phase space" was mind blowing. Looking forward to these eps. Well done, as usual.
I have learned more from this video than from other videos and articles about entropy. This video already corrected a few misconceptions I had about entropy. I am getting there. I have trouble understanding it, but that only means I've got to keep trying.
I'll understand entropy if it kills me.
Thought experiment:
If the arrow of time is the process of increasing entropy, then when the Universe ultimately reaches its highest state of entropy, and only randomized subatomic particles exist, does the arrow of time still exist?
We know that the current state of the Universe can produce high levels of emergent complexity, far from equilibrium. At the Universe's state of highest Entropy, we would have only subatomic particles, distributed randomly throughout the Universe, and no measurable arrow of time.
Would the probability of widespread, spontaneous emergence of complexity, increase "exponentially" (from our present point of view), because arrow of time no longer exists?
In this high-entropy macrostate, the huge number of interactions, required to finally produce cosmic-scale emergent complexity, would be occurring "instantaneously", by particles moving at light speed, "outside of time".
So, high-entropy conditions, at the end of the Universe, are somehow equal to the low-entropy Singularity of the Big Bang?
I don't know if you guys can handle this, but could you do a video about how cells "use" neg-entropy?
Great explanation, although I’m still struggling to grasp all of it, I understand a few fundamental ideas of entropy
Excellent episode! In my 35 years of engineering design, the understanding of thermodynamics has been the defining factor in the ability of a designer, engineer, or scientist! Without this understanding I wouldn't even call someone a team member.
PS. The room with all air molecules on one side could happen, it is just that the universe is not nearly old enough and might never survive long enough for this to come true. But wouldn't it be cool to see, assuming you could build a room full of air that you could maintain and continuously monitor forever!
I think it would be possible to use an advanced quantum computer to scan through possibility space, focus on the set of possible microstates that is consistent with the current macrostate of your room, and then somehow guide your path through that phase space to the particular set of microstates where all the air is on one side of your room for some duration t.
If there was a device that could do that, then essentially you would just need to tell it the desired conditions you're looking for, it checks to see if it's possible to 'get there from here', and if it is, then the device would steer you through the path of evolving microstates that corresponds with the desired evolution of your macrostate. So, conceivably, you could just ask the device to 'make it so all the air goes to one side of the room for the next 30 seconds', and it'll just steer you/your macrostate through the path of that phase space in which the desired macrostate actually occurs.
You could do anything, really. You could have your device steer you through the evolution of microstates which corresponds to your room full of air safely flying to Mars with you in it. Such an evolution is incredibly statistically unlikely, but since it is still possible with non-zero probability, a sufficiently good device would be able to find that desired path of evolving microstates and steer you through it.
My teacher linked this in the PowerPoint presentation of our last topic in my second year physics class about thermodynamics lmao I'm blown away a channel I watch for fun is featured as a part of my assignment lmao
27 seconds in and get called out for my messy room
And at 8:38 we have a picture of that room I believe. 😉
The arrow of time part confuses me. Aren't there some particle interactions that violate time symmetry, which could thus also give the arrow of time?
I THINK there was something, but don't catch me on that.
If I understand correctly, all interactions are theoretically reversible in both the macro and micro scales.
Brent Lewis CPT symmetry has not been shown to be broken, but time symmetry alone does not always hold.
Pretty sure this was covered here. Changing left handed to right handed chirality in electrons was violating time symmetry, iirc
Entropy would exist even if specific particle interactions that aren't time symmetric didn't exist. Hypothetically even if you have a system with all its particles behaving symmetrically with respect to time you would still have increasing entropy in one temporal direction.
So you will always have entropy acting as an arrow of time even in systems which have no other such compass.
"The second lore of thermodynamics" :)
The graphics behind Sadi and Rudolf at 3:30 are AWESOME
Oh jeez In comes Ludwig
Among the many brilliant Space Time episodes this one was uniquely so. Thank you.
Can there be situations where there are two distinct macrostates that are roughly tied for most common state? What would that look like?
I guess it would be local maximums. Like billiard balls could all be in the pockets in any arrangement, any of which are more likely than them being arranged still on the table.
I think that would look like Schrodinger's cat where one state is chosen upon observation
I think what you are asking is whether there can be situations where two distinct macrostates of a physical system corresponds to the maximum number of microstates that the system can exist in.
Well, there's always a macrostate with more microstates associated with it. A Universe in which there exist isolated physical systems has lower entropy compared to a Universe filled with only photons. Thermal equilibrium is merely when all the fast processes have already occurred, while the really slow processes have yet to take place.
If the mentioned macrostates are distinct, they would be separated by certain intermediate macrostates which are less likely. This means that if the system reached one of the most common macrostates, it would not spontaneously switch to the other such macrostate, unless pushed from outside. There are many such bistable systems. Your computer's memory and disks are built of trillions of them, which allows storing information. Each bistable system can store 1 bit of information.
ruclips.net/video/8tArShb1fhw/видео.html
This is about the previous video but I didn't have the chance to ask it so: what conserved quantity arises from global (not local) phase invariance?
Conservation of charge. This can be electric charge, weak hypercharge, or sometimes number conservation like baryon number or lepton number., depending on the context. These are all a U(1) phase symmetry.
Michael Roberts so local phase invariance gives electric charge conservation and global phase invariance gives the conservation of other type of charge or quantum number?
Nothing - conservation only appears with local symmetries. The fact you have a global symmetry merely flags the possibility that 'localizing' (AKA gauging) that symmetry may lead to the discovery or reinterpretation of some physical phenomenon
They are all examples of local symmetries, but for different Lagrangians. The QED Lagrangian has a phase symmetry that corresponds to conservation of electric charge. The electroweak Lagrangian has a phase symmetry for conservation of weak hypercharge that's broken by the Higgs field. But Iago is right that these all have to be local symmetries for the conservation laws.
This video was brought to you by Kyubey. Get your Mahou Shoujo cosplay costumes today! You can do your part to lower entropy and save the rest of the universe. The rest of the universe needs you!
Meguca
Entropy can be thought of as a measure of a system's organization or disorganization. Generally, we can describe it as:
High entropy = Low organization (disorder, randomness)
Low entropy = High organization (order, structure)
For example:
A system in thermodynamic equilibrium, like a gas in a sealed container, has high entropy because the molecules move randomly without a structured arrangement.
A crystal, on the other hand, has low entropy because its molecules are arranged in an ordered and structured manner.
However, as you mentioned earlier, entropy can also be understood as a process of reorganization through randomness, leading to the renewal of systems and greater complexity and order over time.
---
Explanation with Examples
1. High Entropy and Randomness
In systems like a gas in thermodynamic equilibrium, high entropy reflects disorganization and uniform energy distribution.
Although seemingly chaotic, this randomness creates conditions that allow reorganization when external forces (e.g., gravity, temperature) act on the system.
2. Low Entropy and Order
A crystal represents a state of low entropy due to its highly organized structure, achieved under specific conditions that channel energy and molecules into a stable pattern.
3. Entropy as a Driver of Renewal
The randomness of high entropy creates space for unpredictability, where natural forces can reorganize elements into more complex structures.
Example: After a supernova explosion, the dispersed gas and dust (high entropy) provide the material for the formation of new stars and planets (emergent order).
4. In the Long Term
Entropy can be viewed as a tool of the universe that transforms local disorder into greater complexity and order, perpetuating a constant cycle of creation and renewal.
---
Conclusion
This perspective positions entropy as a constructive and transformative force rather than a destructive one. It is not the "end," but rather the natural pathway for reorganization and evolution, enabling the emergence of more complex structures in harmony with the universe's dynamic conditions.
This is one of the most relevant videos on the internet.
No one called me "brilliant microstate" before *_*
Meowwwww
*mAcrostate
You're a unique and special sno.....microstate
Thumbs up for the Go-board illustration :)
OMG I smoked so much weed and this is like blowing my mind right now.
Nice explanation separating the idea of order from entropy.
I added my "3rd law of thermodynamics" -- that the Universe does everything possible automatically to increase the speed of entropy. It wants to die ASAP. This is the ONLY purpose of "life" and therefore all life is equal. Evolution works because higher intelligence always leads to higher entropy.
The Universe is full of organic molecules that self assembled from the elements created beforehand. Given sufficient time these organic molecules then self assemble into "life" because life is better at increasing the speed of entropy than non-living inorganic molecules (i.e. rocks). The end result is that we will find life everywhere. Much of it will not resemble our carbon based life.
The Universe does not care about us or about any life form -- other than the fact it is speeding entropy. We are totally on our own. This provides us wonderful opportunities. We should focus on life extension technology and "stasis" so we can journey to the stars without FTL space drives.
The end is nigh. There is no hope. There is only entropy.
Ketsueki Kumori hail entropy!
"Nigh", in this case, is about 100 trillion years.
S kildert we must speed it up. Quick gas the Enthalpy’s
For extremely large values of "nigh".
Don't Panic
Does the concept of entropy also apply to other systems than our universe, for example to conways game of life?
🤔🤔🤔
Game of life is not really random, so I don't think so.
You can apply the concept of entropy to a lot of stuff, in a lot of different fields.
For example, in information theory, you can calculate the entropy of a text.
The text "abababab" as a higher entropy than "aaaaaaaa", for an alphabet comprised of ab.
But if this is for an alphabet from a to z, it still has a very low entropy.
You have a lot of definitions on Wikipedia: en.wikipedia.org/wiki/Entropy_(disambiguation)
Each one meaning a slightly different thing.
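Here's a minimal sketch of that text-entropy calculation (per-character frequencies only, so it deliberately ignores ordering and any larger alphabet):

```python
from collections import Counter
from math import log2

def text_entropy(text):
    """Shannon entropy in bits per character, from empirical character frequencies."""
    counts = Counter(text)
    n = len(text)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

print(text_entropy("aaaaaaaa"))  # 0.0 bits/char: completely predictable
print(text_entropy("abababab"))  # 1.0 bits/char: two symbols, equally frequent
```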
I think if you defined the macrostates under consideration, you could just use/define the log of the number of microstates belonging to each macrostate to be the entropy of that macrostate.
However, the fact that Conway's Game of Life is not reversible might make some stuff different. Many states lead to the empty grid state, but if "an empty grid" is a macrostate, that might suggest "entropy" decreasing? So that particular definition might not work out great.
Maybe if you defined all still lifes (where nothing changes from step to step) as belonging to the same macrostate, and all states with period 2 as having another macrostate, and so on,
then that could fit, because that way those macrostates have high entropy?
(Or maybe vary this in some way like taking into account the total number of live cells, or the furthest cell from the origin, when defining the macrostates).
I'd guess that there is some interesting way of picking the macrostates such that the entropy never decreases. Would be especially nice if it could be done in a way such that you could add it between different parts of the board, but that sounds hard?
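One cheap way to play with "entropy = log of the number of microstates per macrostate" on a grid, as a sketch (choosing the live-cell count as the macrostate is my own arbitrary choice, and it ignores the Game of Life dynamics entirely):

```python
from itertools import product
from math import log2

counts = {}
for cells in product((0, 1), repeat=9):  # all 512 microstates of a 3x3 grid
    macro = sum(cells)                   # macrostate: number of live cells
    counts[macro] = counts.get(macro, 0) + 1

for macro, multiplicity in sorted(counts.items()):
    print(f"{macro} live cells: {multiplicity:3d} microstates, "
          f"S = {log2(multiplicity):.2f} bits")
```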
I think people who study GoL have defined a measure they call temperature. Maybe they defined that based on entropy?
Not all, but it's REALLY inapplicable to Conway's game of life, because there's no conservation of mass (or # of live cells) in that, structures can increase in mass without limit. That was after all the thing that the first glider gun won the prize for doing. And if you can already violate the 1st law of thermodynamics, who CARES about the 2nd one, the 2nd one is irrelevant if you can violate the first.
Damn! Gotta love this channel! ❤
Thank you! Now I've got an analogue so to speak to reword something of mine or compare it to-entropy. 👍
You almost mentioned my favorite physicist: Josiah Gibbs.
And your outro is tantamount to saying: Please, "don't die" until I see you next time on Spacetime.
"Keep being that brilliant macro-state that is you"
Are you coming on to me?
"My room is dirty because entropy"
"Yes son but if we're trying to obey entropy, your room must as efficient as possible for you to successfully bring about a fast, universal, entropic equilibrium. Ain't no entropy-efficient babies being made in this room. Clean it or get a real job."
Where did you get that T Shirt?
Puny Gods Look in the description. Second link
Simon Bouchard Thanks
They sold them on dftba shop. Key word being 'sold' :(
they're still for sale, just sold out of large and extra large. luckily i can still wear a medium
Great video, I recommend the song Entropy by Bad Religion to all the people watching this
The Go board analogy was exquisite!
I see 42 - I must click
I am an aerospace engineer and I have been mocked by these "physicists" for being an end-user of their theories and equations. Guess what?!
Sadi Carnot, the father of Thermodynamics, was actually a mechanical engineer. On behalf of all mechanical and aerospace engineers in the world:
IN YOUR FACE Sheldon Cooper!
The Earth in the PBS intro rotates the wrong way. :D
Ciekawostki o poranku I thought that I was the only person to notice that.
Or rotates the right way from a certain perspective?
How can you change the direction of spin using perspective?
Then i guess Earth would be made out of antimatter due to CPT-symmetry :D
It doesn’t matter which frame of reference you’re in. It should still spin counter-clockwise.
Thank you and at last a video mentioning Sadi Carnot. I definitely vote !
"Also, excusing the messiness of your room" This was brilliant.
So somewhere in the universe it's possible a beach of sand blew all its grains into an image of my face? Cool 😎
If not, does that mean the universe hates my face?
May the Science be with You Everyone hates your face.
Look in the mirror what do you see😎
- "Entropy has been credited with predicting the ultimate heat death of the universe."
- "Hold my beer and just give me a few disposable high-school girls" /人◕ ‿‿ ◕人\
Drew Durant uh... what is strictly smaller than 3?
That Go board analogy was quite neat!
Enter the Braggn' I will check it out this evening!
I see the video is saying that the purpose of life is to encourage entropy, which creates space time.
The background animation of the Sadi and Rudolf photos is so cool.
I understood a little something of entropy from his closing words: "Be careful to keep your number of accessible microstates low (the chemical processes inside our bodies should be kept on their natural course, unaffected as much as possible by external matter), avoid thermal equilibrium (discomfort or death) and keep being that brilliant macro state that is you (the culmination of most likely microstates)..."
*sets hair on fire*
Suck it, entropy!
Halberdier Woosh
I love it when creationists use the argument of how complex life could arise on Earth if entropy always increases. They always forget the Earth isn't a closed system /:
It's like they forget that the sun is burning.
Halberdier That is true, It's unfortunate ppl see increase in entropy as increase in disorder, in a literal sense
No, it's much worse than that, because you see, the rise of complexity is EXACTLY what the 2nd law of thermodynamics predicts. After all, entropy also goes by another name, which has positive connotations, and that name is "information". Higher entropy MEANS higher complexity, that's what entropy IS. So of COURSE it's not a contradiction for complex life to be a higher entropy state.
raees khan
It's not that they forget, they never knew, because none of them understands even that much (or rather less) physics.
you"re quite right, it is not a closed system....it has an external creator. your turn.
Hmm, but doesn't that raise the question of how the universe got to a low-entropy state in the first place? This is like the problem of infinite regression of time.
Penrose makes a big deal of this and proposes his hypothesis of conformal cyclic cosmology as an answer, which predicts a high-entropy heat-death universe can give rise to a low-entropy big bang. Very speculative but the notion gives the prediction we should see correlations in the CMB radiation of a given angular size; don't think he (or his collaborators) have found anything yet though.
+Al Quinn Oh wow. That sounds really interesting. I wonder if it eventually turns out to be true. Hmm but how would a high-entropy universe spontaneously give rise to a low-entropy big bang?
Penrose argues the solution to the black hole information paradox is that the information is destroyed, which resets the entropy to a much lower level once all BHs have evaporated. I'm just a YouTube troll though, so I don't pretend to understand this at great depth. Long talk he gives here:
ruclips.net/video/4YYWUIxGdl4/видео.html
The part where he discusses how to get around the 2nd Law is around 1h 19m
+Al Quinn Lol you're the best troll I've ever met, mate. Also, the trolls never bothered me anyway :D Thanks for the awesome replies.
It's basically because of motion... more motion means more energy, and more energy means more dimensions and variations of energy dispersion and density.
Excellent video with a SUPERB speaker. Thank you!
So far so good. The description of thermodynamic entropy is accurate here. I just hope they don’t equate it with information entropy.
Clean up your room!