I have witnessed water running upwards from a ditch onto a road, then becoming airborne and creating a cloud. But it was because of a tornado passing near my house, and I hope never to see anything like that again.
Aging biologist here! I’d like to add an uplifting comment to this uplifting video. :) I’m always a bit sad to see entropy given as the reason we get old. As Sabine discusses later in the video, open systems with access to a source of low entropy can use that to decrease their own entropy, so given that we can take in low-entropy food, there’s no in-principle reason we couldn’t use this to keep the entropy of our bodies roughly constant with time. So not aging is totally allowed by the laws of physics! It’s even well within the laws of biology-there are plenty of animals that don’t age, from tiny hydra to giant tortoises, and even one of nature’s most beautiful animals, the naked mole-rat. Their risk of death and disease doesn’t change with time, which basically means they keep their entropy constant throughout their adult lives. Now all we need to do is crack this biological entropy preservation using science…but that’s another story!
Unfortunately, we already have an overpopulated earth, so preventing people from aging is going to exacerbate the issue. I would not be against preventing aging though, even if my life were no longer than normal; a life without aging is far more enjoyable than living with a constant gradual decline.
Wow, I can't believe that I ever espoused that exact view. When you put it like that, entropy as a reason for aging makes no sense. Thank you for ridding me of that misunderstanding.
THANK YOU for pushing back against entropy being described as order vs. disorder! Through years of schooling entropy was this poorly defined, almost spooky concept of order. Then I was finally introduced to entropy as probabilities of microstates (with gas in a box), and it was completely logical and clear.
Entropy is to heat transfer what friction is to an object in motion. Entropy reduces the available energy in a system just like friction. Order vs disorder isn't useful at all IMO and only serves to cloud what's happening.
I think they're synonymous ways of talking about it; it's just that order as a concept has more to unpack to get to the point. Order means that there are rules that limit the number of microstates. A bookshelf ordered alphabetically has very few allowed microstates (defining the microstates as the books' arrangement), while a disordered bookshelf would have as many microstates as there are ways to arrange the books.
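A rough numerical sketch of that bookshelf picture (toy numbers, and it simply takes W = the number of allowed arrangements and plugs it into Boltzmann's S = k ln W):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B * ln(W), where W counts the microstates compatible with the macrostate."""
    return K_B * math.log(num_microstates)

n_books = 20

# Macrostate "alphabetically ordered": the rule allows exactly one arrangement.
ordered_W = 1
# Macrostate "any arrangement goes": all 20! permutations are allowed.
disordered_W = math.factorial(n_books)

print(f"ordered:    W = {ordered_W}, S = {boltzmann_entropy(ordered_W):.3e} J/K")
print(f"disordered: W = {disordered_W:.3e}, S = {boltzmann_entropy(disordered_W):.3e} J/K")
```

The ordered shelf comes out with S = 0 and the unconstrained one with S > 0, which is all the "order" language is really encoding here.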
I agree! I've always found the order-based and statistical descriptions of entropy to be fundamentally wrong. And if you look, you can see that this approach exists in many other fields of science (of SCIENCE! Which is supposed to be OBJECTIVE :)) The trouble is scientists are not always objective - actually everyone is at least a little subjective at times, due to our limited resources (at least attention and time). But here is how science generally works - you observe a phenomenon, you describe it somehow, you test whether your description allows you to predict the same phenomenon in the near future; if your prediction fails (or is imprecise) you improve your description until you get accurate enough predictions ... and at this point you "KNOW" that your best description is the CAUSE of your prediction ... so it's easy to miss the fact that the Universe doesn't give a rat's ass about your descriptions :) It's just one of our many biases - one that many scientists also fall prey to. Ultimately our descriptions might become so perfect that they fit Reality 100%, and then it would be only a philosophy game to distinguish between the two ... but even with our most precise sciences we're still not there. But people (and esp. scientists) love to believe that the theories they learn (and especially the ones they make) are close to perfect, and thus that bias becomes really, really strong! Have you seen a scientist trying to explain some phenomenon by stating some equation and ending with something like "and as the equation tells us, the object has to do this and that"? Well sure, if your theory and equations are perfect you'll get it right ... except for the understanding part! Very little of such a description actually explains anything, and it's exactly because it jumps over the actual forces and treats the description as the cause. In that case, if your description is for example statistical (which many sciences use today ... and particularly QM), then it's really easy to believe that probabilities and statistics are the CAUSE of things like entropy! ... but this is still fundamentally wrong of course! :)
I'm a mechanical engineer who focused on thermodynamics in college. Despite years of study and using entropy in formulas, this is by far the best explanation of entropy I've ever heard.
There's a reason my father, who trained as a mechanical engineer but became a computer programmer/systems analyst, always referred to thermodynamics as "thermogoddamics".
My education was the same. I was waiting for her to say entropy tends to increase in a CLOSED SYSTEM, but she never did. She must assume the universe as we understand it is a closed system, but that's very debatable, given the required low entropy at the Big Bang. Also, questioning the semantics of the 2nd Law (the meaning of "order") seems trivial to me. After all, I recall the word "disorganized" being used in the 2nd Law, which I think is better.
I completely agree with you. When I was young, I used to tell my mom that my room was never in disorder because disorder itself doesn't truly exist. Instead, what exists are infinite states of possible order, and none of them holds more inherent sense than another.
If a workspace becomes disorganised, the ability to get work done rapidly approaches zero. So I would argue that certain states of possible order do make more inherent sense than others.
An extremely organized workspace looks amazing, but it can intimidate some people and drain creativity. On the other hand, organizing is really good work hygiene after a project is done.
Sabine, I'm a chemist, and in conversations in the past, I attempted to make the arguments about entropy you've made so eloquently in this video (especially the associations/ideas about order). From now on, rather than arguing, I'll recommend watching your video. Thank you so much for this great video!
As a chemist you must realize the issue with naturalism is that time is the enemy, not the friend. You need biological chemistry to happen faster than the reductive, decaying chemistry. The problem is that doesn't happen in this universe... amino acids and proteins only have hours before becoming useless...
@@WaterspoutsOfTheDeep The subtext of what this video is saying is that the 'starting' state of the big bang was a macrostate that we no longer have information/knowledge of. Therefore, at this alleged 'heat death', it will yet again be a big bang of a different kind. All this suggesting that the universe might very well be an eternal macrostate that *can't* 'discontinue'.
Too bad this "eloquent" argument falls apart if reductive materialism isn't assumed. I don't think many people will agree with her that _none_ of the macro-states objectively exist. Clearly, humans, who are macro-states, objectively exist since, well, cogito ergo sum. Her argument is pretty nonsensical.
@@maxkho00 It's premature to label concepts on the vanguard of existence in that way. Entire areas of study could very well reside within something woefully 'nonsensical'. Since we are talking about expanses of time in the trillions of years, I'm not really prepared to reject anything, regardless of how silly it might first seem.
9:55 Finally someone said it! I always considered a homogeneous mixture as "ordered", contrary to how lots of entropy explainers describe it as "disordered". This led me to lots of confusion over the years, until my recent physics classes cleared things up.
@@cameronbartlett6593 I am a plasma engineer and I would beat you in anything physical, best of 3. Pick your best sport; it won't matter even if I have never done it, you will still lose, because I will also pick my best sport and a third party picks the third sport, meaning you have low chances of winning. Go say your BS to yourself in the mirror. Not every nerd-minded dude is physically inept.
This video reminds me how it always frustrated me in school, as well as later at uni, that things were described or defined in simplified ways that made them wrong, harder to actually grasp, or both. Sometimes the actual truth of the matter would slowly reveal itself, often leading to an „aha“ moment years later and a feeling of „I knew it“. Also funny how oftentimes simply explaining the meaning of a Latin or Greek term would have almost explained the whole concept behind it, yet somehow no professor ever did that.
Exactly! Like for example how percentage literally means "per hundred". So 10% is 10/100. Which means you can easily convert between percentages and fractions.
The entropy of YouTube must have decreased, since Veritasium and Sabine both posted videos on entropy within two weeks of each other :) They're both great, but Sabine definitely lives up to her motto of no gobbledygook... thanks Sabine!
I agree, they are both great. Sabine's was posted on June 18, and Veritasium's on July 2nd. So we know Sabine wasn't responding to his (since she was first), and since Veritasium takes longer than two weeks to make one of his elaborate videos, we know he started his before he knew Sabine was doing one. That's perfect. Also cool how they both disliked "disorder" as a description of entropy: Veritasium likes entropy as "energy spread out", and Sabine presents her own argument.
Thank you for delivering such great content 👏 always interesting, humorous, and informative. My field is engineering, but sometimes, I regret not pursuing physics. Watching your videos gives me a chance to see what I missed.
@@SabineHossenfelder Time does not pass; only you are moving through time. Time is a coordinate of the state of the moving entropy. From the past state to the present state you can aim at the future state.
Engineering is good experience for a prospective physicist. Nikola Tesla, perhaps the greatest engineer, was by extension, also one of the greatest physicists who ever lived, especially in the field of electromagnetism.
The idea that there are always macrostates capable of turning a high entropy system into a low entropy system is fascinating. Entropy being constant, but perceived by us macrostates as variable, and thus subjective is powerful. I love how more and more physicists are adopting such a perspective! 😁✨
It may be powerful, but that doesn't mean it is particularly useful. Saying that life may be able to emerge post-heat death of the universe just because it perceives reality differently may be a great Douglas Adams book, but belongs right alongside the parallel dimensions theories in plausibility.
@@florianp4627 It depends on what you mean by "idea" and what you mean by "similar". But I personally find Penrose's hypothesis intellectually coherent, interesting and plausibly congruent with reality, while Sabine here is engaging in the exact same cognitive hocus-pocus that she makes fun of certain other physicists so much for.
Videos that provide a "simple understanding" of historically complex concepts (like the laws of thermodynamics) are crucial for our kids to hear and learn. I am especially interested when the video explains what we "don't know" or "don't yet understand" to pique their interest in what to solve next!
This is the coolest clearest and most concise explanation of entropy ever. I wish my physics professors had taught it this way. It would have saved me so much sleep.
True say. While almost anyone can understand a concept/subject, it really takes a special mind to be able to explain it in a way that can be understood by someone else. Sabine's way of conveying information is so refreshing.
@@aggies11 I am a firm believer in the Feynman methodology: start as simple as you can to build up knowledge. If you can explain a complex idea in such a way that a child (or a non-specialist) can understand it while still being faithful to the core concepts then you are good!
@@AurelienCarnoy Yes, I get fed up with it being called a law ... to call it an assumption, or - as Sabine does - a matter of probability, would be much more accurate and satisfying.
For a mortal being with a life expectancy of ~80yrs I confess I spend an irrationally large portion of my life brooding on the eventual heat death of the universe. Glad to find videos like this on occasion which at least try and come up with a philosophical explanation for why we shouldn't feel depressed over the stars eventually going out. Kudos Sabine!
This reminds me of the book "A Choice of Catastrophes" by Isaac Asimov. He talks about how to survive heat death by exploiting low entropy fluctuations. The book was written in 1979.
@@vikiai4241 that's interesting because I am a native English speaker and frequently come onto this but never was sure about my way of thinking about it. Thanks for the validation lol.
Perhaps entropy is just the natural part or inverse phase, moment, flip, reversal or change that occurs with all energy and matter interactions and formations in the cosmos! Which creates, evolves, then reverses, changes, deconstructs or destroys things.... to reuse and recycle into new positive/negative formations and events ...and so on!
If you don't confidently, arrogantly even, question physics you're doing it wrong... Entropy should be redefined as Simplicity or Uniformity... Complexity can be static and structured, or dynamic (and structured)... Closed systems simplify over time. Energy can increase complexity. This redefinition solves many issues.
@@PrivateSi You haven't been humbled yet, clearly. Entropy has already been defined precisely in physics, since the mid 19th century. The loose definitions you get from science communicators don't reflect the reality of the situation, hence you must study physics. (Even this video enunciated this, so I truly don't know where your head is.)
@@moneteezee .. You can either make a new term or change the Law of Entropy, because that law is wrong when over-applied to the entire universe. It's fine for gases in a closed system, but fails as a universal law, unlike my redefinition.. You could call it the Law of Simplicity if you prefer, as long as you stop declaring the Law of Entropy to hold at all times.
@@moneteezee.. If fields are real they have to be made of discrete elements that have to be as equidistant as possible throughout the entire field for the field to be empty. This is a perfectly ordered state that's as simple as possible. Hence, The Law of Entropy should not be a law or Entropy should be redefined. It's a simple, logical argument given the evidence for QUANTISED FIELD(S).
@@moneteezee Entropy is likely the unwinding of quantum entanglement, from the source of singular entanglement of all possible quantum expressions into less entangled, limited expressions ... entropy is the cost or the balancing side of increased complexity, which rebirths cycles of quantum tangling and untangling that at some phase of transition lead to us. 😊 Entropy must be considered within the scope of its relationship to quantum complexity. In short, small scale fluctuations must run dry for the whole quantum baseline to reset into the potential of the singular's expression. 😊
I love the depth and clarity you have; this kind of content is gold. I studied physics and philosophy, and a conversation with you would have saved me years of doubts and confusion. Thank you for this.
The true and big picture answer is that it will never end! The universe as a whole is evolving on a very large time scale, which is macroscopically time-periodic in a statistical sense! The initial state is not relevant, like for a non-linear oscillator that becomes chaotic, but its chaotic state is periodic, perhaps because of the reorganizing, entropy-reducing nature of gravity, whose long term impact has still not been understood or quantified! 😇
Sabine your work has changed the very way I view life and physics in a way that's nearly spiritual. I feel so lucky to be able to learn about the beauty of the universe and physics from you
She is wonderful at explaining very difficult or (almost) impossible ideas in an understandable way. Alas, even Sabine leaves my poor brain often mystified, but always stretched, curious, and entertained. 🙂
When it’s a topic like this I need to watch a few times over a few days and dig more into some of the ideas with other videos. I like that she doesn’t completely dumb it down for people like us but opens the door for us to learn more (such as that entropy logarithm formula). But yeah at the end of this video I STILL don’t QUITE understand a simple way to explain entropy to someone. Yet. But I will after a few views 😂
The statistical description of entropy has never resonated with me, but your talk about other life with access to different macrostates is very interesting, and warrants further thought.
It's vague and insubstantial. She needs to produce some scenario that could meaningfully demonstrate how one side-steps Boltzmann's formulation of the second law. Not the order/disorder pop science stuff, the physics. Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently. I love her, but this argument isn't serious.
I didn't get what she means by complex systems with different microstates which we can't use, and further that the entropy is small for them... Can you explain?
Could Entropy just be a matter of perspective? If a horizon from one perspective can appear to be a singularity from another perspective... A horizon takes an eternity, and a singularity happens in an instant.... And, a horizon is a state of maximum entropy, and a singularity is a state of minimum entropy... Entropy should be dependent on your reference frame. If your horizon is growing, then your entropy should be growing. I've been thinking about it like this after listening to Leonard Susskind. I don't know if I'm getting it quite right, but it's really interesting to think about in this way.
@@kristianshreiner6893 _"Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently."_ Well, if you understand "magic" as "fairy tales", like many do, then it's a straw man fallacy, because Sabine didn't think of it in terms of "something that doesn't exist nor can exist". And she didn't use the order/disorder stuff, except to criticize it. Boltzmann's formulation, as Sabine hinted in this video, is only useful for things that we "know", not stuff we clearly don't know, such as information not accessible to us. Stop lying pls.
One of the best commentaries on Entropy I've encountered thanks SB, especially as "The # of microstates per macrostate" - Given that a macrostate is simply an arbitrary human classification/aggregation, does this mean that entropy is an arbitrary physical aggregate outcome? I think I'm missing something, so I'll listen to your commentary until I can discern this thing. 🥝🐐
Every fundamental unit you may imagine in physics is relative in the end. References are set arbitrarily by us on the basis of what we deem stable enough to compare to. I guess entropy is also relative, as we still don't know much about what is going on at the subatomic level, but for the time being it seems to be the most pure and universally applicable quantity we can think about. (I am not a physicist but an engineer, although I think obtained degrees are meaningless when it comes to pursuing actual knowledge and are just used as a social reference so that people are more inclined to believe you when you speak, as a sort of warranty. Some people make use of this extensively to bolster their ego, even though in the end they may not know much.)
I've been waiting a long time for this video. Sabine, you are an amazing communicator and I think the world needs to hear you. This channel and other sources like it are THE REASON I use the internet. Thank you! ❤
SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics! Making predictions to track targets, goals & objective is a syntropic process -- teleological. Teleological physics (syntropy) is dual to non teleological physics (entropy). Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant. Mathematicians create new concepts from their perceptions (geometry) all the time! Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant. Mathematics (concepts) is dual to physics (measurements, perceptions). Deductive inference is dual to inductive inference -- Immanuel Kant. Inference is dual. The rule of two -- Darth Bane, Sith lord. "Always two there are" -- Yoda. Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence. Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.
@@hyperduality2838 Sounds like you're grasping at straws, likely for some sort of affirmation of a biased narrative. Entropy is all around you and can be observed; positing a balancing force is wishful thinking.
@@MR-ub6sq Concepts are dual to percepts -- the mind duality of Immanuel Kant, the critique of pure reason. Mathematicians create or synthesize concepts all the time from their perceptions. Your mind is dual. Mind (the internal soul, syntropy) is dual to matter (the external soul, entropy) -- Descartes or Plato's divided line. Matter is also dual -- wave/particle duality! "Always two there are" -- Yoda. Everything in physics is made from energy and energy is duality, duality is energy. Potential energy is dual to kinetic energy. Electro is dual to magnetic -- electro-magnetic energy is dual, photons.
I guess Sabine is the best person that has ever explained this relationship between order, life, entropy and those things without either getting carried away in theory or relying on very superficial metaphors. Because maybe physics is both: intuition and somewhat philosophical insights + rigorous mathematical models.
@@No-cg9kj I agree with you actually. Philosophical thinking doesn't mean something detached from reality. There's a branch of philosophy about science, modeling and those things. I mean that it's useful to take some time to pose some "meta" questions about what we're trying to look for in nature, what is really fundamental and what stems from our specific human view of nature. In this video Sabine took some time to consider that maybe our definition of entropy is a reflection of what information is accessible to US and what energy WE can use. That is pretty philosophical, it seems to me. It's funny how your user name is NO btw xd, thanks for your comment.
What is "life" (body, soul & spirit)but the interiorizing, complexification of matter, a 'gift' of anti-entropy...😊 👉Creation Continues...,& that mass beyond our bounds..also, accelerates & increases... And the gravity of it all makes for a stable & expanding space-tiime in our neighborhood.
"If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations-then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation-well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." - Arthur Eddington, New Pathways in Science
Great video Sabine! Even though I have an MSc in chemistry and a PhD in pharmacy, one of the hardest things to understand is thermodynamics. You really have to eat and breathe thermodynamics to really understand it, and it doesn't come with intuition, and one of those things is entropy. So thank you Sabine for making entropy more intuitive.
Entropy is a tendency, not a law. For instance, galaxies could not assemble if entropy was the rule. The solar system could not exist if entropy was the rule. Fusion could not occur if entropy was the rule. It would appear then that gravity is the opposing force to entropy.
Also, doesn't that mean that gravity provides an objective standard for what counts as a macrostate? 🤔Gravity puts loose constraints on how particles can move (e.g. directing particles so that they would make a galaxy shape). Particles can move out of those confines but are less likely to do so. It seems this is just what a macrostate is. And it's not dependent in any way on human interests.
The analogy of "disorder" can work in the right context. You could say: there's all kinds of messy but only one kind of clean. For that reason your room tends towards getting messy if you make any change not specifically targetting cleanliness. What I like about the example is that it uses the cleanliness as a macroscopic state defined by a human and talks about the associated microstate count.
i see you've never been to a certain college dorm. as determined with mathematical certainty, this was a space of superlative messiness: the Platonic form of disorder, chaos so intense both the gravitational and nuclear forces had all but given up - trash and antitrash not belonging to our dimensions materialized, ionized, annihilated, sublimated, vaporized, crystallized, and floated about the space with such frequency that any external modification to the region did nudge the system towards cleanliness regardless of the actor's intention. the situation passed when a window broke itself in its frustration. the resulting decompression vented the contents of the room (including furniture, wallpaper, rodentia) into the environment.
@@petevenuti7355 This is correct and similar to what I was going to say about her comment that "entropy is why things break down." Actually, things break apart because usually a "thing" is composed of many things forced together, and they eventually unravel because they don't share the same essence. A house will break down because it's a collection of wood, glue, sand, gypsum, wire, metal, etc. But those things on their own do not break down.
@@LMarti13 If you assume any margin where two adjacent states would be considered "the same", there's a finite number of states - both "clean" and "dirty". E.g. if you assume that a pair of scissors being moved by less than 1 mm constitutes the same state (that is, you quantize the position), the number of states becomes finite. Importantly, increasing the "precision" doesn't make the number of states infinite - and it doesn't change the proportion of "clean" and "dirty" states significantly. So unless quantum effects start taking over, no matter how small your measurement error is, the number of "clean" states stays smaller than the number of "dirty" states, at any scale.
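A one-dimensional toy version of that counting (all the numbers here are made up: a 100 cm desk, a 5 cm "tray" that counts as clean, and one object whose position we quantize at ever finer resolution):

```python
def state_counts(desk_cm: float = 100.0, tray_cm: float = 5.0, cell_cm: float = 1.0):
    """Count quantized positions (cells) for a single object on a 1-D desk.
    'Clean' = the object sits inside the tray; every other cell is 'dirty'."""
    total_cells = int(desk_cm / cell_cm)
    clean_cells = int(tray_cm / cell_cm)
    return clean_cells, total_cells - clean_cells

for cell in (1.0, 0.1, 0.01):  # ever finer measurement precision
    clean, dirty = state_counts(cell_cm=cell)
    print(f"cell = {cell:4.2f} cm   clean = {clean:5d}   dirty = {dirty:7d}"
          f"   clean fraction = {clean / (clean + dirty):.3f}")
```

The absolute counts grow as the grid gets finer, but the clean fraction stays at 5%, which is the point above: the ratio of "clean" to "dirty" states doesn't depend much on how precisely you look.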
Sabine has just addressed a question that has been tickling me since my teenage years: this notion of pockets of emergent entropy decrease, such as, say, our brains. I'm so grateful to hear her address this topic.
This reminds me of Roger Penrose’s ideas. I think you shared a stage with him at one point. If we change our scale of looking at the universe, in both space *and* time, the almost uniform distribution after 10^100 years (or whatever the number is) becomes recognisably clumpy again. The almost imperceptible effects of gravity speed up and become perceptible again.
I don't think this is possible. If the universe at that point is nothing but a diluted, even gas, some random "clumping" may happen, but there's just not enough matter nearby for anything more than a few particles hooking up.
@@B33t_R007 Cold things contract. The universe is expanding because it's still heating, as evidenced by the high number of stars currently burning. Turn the heat off and the dark matter contracts, as does the distance between all particles and celestial bodies. The crunch will happen, as it obviously happened before. Unless you believe the universe poofed into existence out of nothing, which defies all known physics and logic. I think Penrose is brilliant, and wrong about his cyclical universe approach. But I also think Hawking was wrong about BH radiation for a number of reasons.
@@B33t_R007 The point is that there is eventually no matter, just stray photons. But photons move at the speed of light, and from their point of view everything happens instantly. Penrose claims that this has peculiar consequences. I can't tell you more than that, or vouch for Penrose, but you don't seem to be anticipating the situation fully.
Once every remaining particle is beyond the event horizon of every other particle, there is no way they can interact again. Expansion of the universe would end up in a zillion one-particle universes.
It's not that entropy cannot decrease; it's that entropy in a closed system cannot decrease. If your box has a window with the sun shining in, it's no longer a closed system. And taking measurements also means it is not a closed system.
I have had major depressive episodes because of entropy and the second law. I am deadly serious. Since learning about it in high school physics class, I have always considered it to be my greatest fear. Thank you for easing my fear a bit. 😊
I like to think life only came about because of entropy: a human that builds a car will generate a lot more entropy than a rock that gets chipped to pieces over time.
If you like SciFi I can suggest a book to you that has this as part of the story.. it's called 'Voyage from Yesteryear' by James P Hogan. Should give you a positive story to carry around wherever you go.
so true 🤣 it's so subtle as it blends in with her education. imagine her being the teacher, she would grab your attention because you'd be like wtf? then listen more. she's a great teacher. 'if you tell this to a random physicist on the street to see if they agree, you should let them go as they've got better things to do' 🤣
I think one of the most amazing examples of low entropy, or localized reversal of entropy, is photosynthesis at the molecular level. That, with amino acids and DNA/RNA, is the basis of life on Earth. Except for life around ocean-floor volcanic vents.
Entropy is simply a lack of data - in a "real" sense, though. Information or data is a better word than order, because things being "ordered" is subjective. So entropy happens when you begin to lose information about a system. If something is breaking down, its order or information is also dispersing; therefore its entropy is increasing. In a way, entropy could be seen as time, because time moves forward as things spread out or break down. Even the way we quantify the "second" is by using the measure of an atom breaking down. These two things are the same, though. We measure time by our perception of all the systems around us progressing towards entropy. So time moves based on the frame rate at which we perceive it, which is again measured or quantized by entropy.
Entropy is simply the rise of equality, which is a result of the calculations nature does to produce the next moment of time. So entropy is simply the result of calculation. It's real, but it's wrongly defined. Entropy is not a measure of disorder; it's a measure of equality. Because once full equality is reached, the universe stops. The formulas still work, but their inputs and outputs become the same, hence there's no change, hence the passage of time makes no difference. That's maximum entropy. Or equality. So, equality is bad 😊
@@cinegraphics No, their inputs and outputs don't become the same; that only happens when you're working with an information-discarding model of the universe (a macrostate). There is still exactly one microstate leading to exactly one next microstate, and thus each microstate has exactly one preceding microstate, so the universe in terms of microstates cannot advance to a point where the inputs and outputs become the same. If it did, that final microstate would have more than one way to be reached: first from the last state of the universe in which inputs and outputs were not the same, and second from the final state itself, where inputs and outputs are the same. But this contradicts the assumption that exactly one microstate leads to exactly one next microstate.
@@snaawflake you're forgetting the rounding errors. At one moment the error level drops below the computation precision and the output becomes same as input. And that's the end of computation. Death of the universe. Stable state has been reached.
I agree that we can measure location and velocity at the same time in our macrostate. The reason we can not do the same for a microstate is that we do not have the sterile tools and technology for the microstate. So we use the next best technique, statistics which reveals the wave function. That is just a tool to interface with the microstate without much interference, which inherently causes ambiguity. We are just not built to interface with the microstate without causing a disturbance. We need to devise a fractal technique to zoom down into the microstate to accomplish what we want to do without disturbing the microstate. Without that, we are just guessing at the characterization of the microstate. And we go down the rabbit hole. Thank you for your great insight on entropy. This allowed me to grasp the limitations of measurement theory in our macrostate, which is a fractal state of the cosmos.
Sabine is not only intelligent enough to master a complex subject, but also to think in new ways about how the data should be interpreted. It absolutely blew my mind when you explained how macro systems are a relative concept, and not an absolute value that happens to match our anthropocentric view of the universe. Thank you.
I often think about how the anthropocentric view tends to divide ‘reality’ into separate things where there are none .. ie. a kind of fiction that perceives order v chaos, living v non-living, observer v observed etc.. ‘we’ seem to separate ourselves out along with everything else. My hunch is that the energetic solidity we feel in the body gives rise to the appearance of everything else as distinct and knowable.. a kind of ‘specialness’ that has something to do with the apparent body’s drive to conserve low entropy... what we call ‘me’ is only a kind of interpretation and nothing at all in reality.
You should read Asimov's story called "The Last Question". It covers the topic of entropy, and this video explains perfectly the science behind it. The story also follows Sabine's idea about future life forms in a higher-entropy universe. It is one of my favorite Asimov pieces.
As a young boy I became a science fiction devotee, and Asimov was one of my favorites, along with Heinlein and Clarke. I presume you are from an older generation like me 😊 to have these greats on your reading list. It was a seminal experience in my life.
Great video, as usual. Sabine really puts things in perspective with her very humble scientific admission of gaps. Around the 7th minute we can visualize clearly the greatest bias (blindness?) that will make our grandkids cringe in a century: "how could you not understand this emergent order?" To me it feels like it's way past time we start conceptualizing things like "syntropy" and "eutropy", and take Metaphysics seriously again. Restore it to its natural place since Aristotle: the real "theory of everything". Intuition (idealism) is the way we understand things so that science can make strides again. Empiricism depends on it, and is basically a secondary development.
I read a while ago that they made a tiny membrane out of graphene that could harvest minute amounts of electricity from brownian motion by vibrating the membrane. That's pretty cool because as I understand it, it would mean that if you're in a sealed sphere that doesn't lose energy, you could in theory produce energy out of seemingly nothing by cooling the air to convert the heat to electricity, which in turn would do useful things before inevitably heating the air again.
We actually don’t need a closed system to make accurate predictions with thermo though. You can derive how a system in contact with a thermal reservoir will act by treating the system and reservoir as a closed system. You’ll still see that perpetual motion machines like Brownian ratchets are impossible with heat exchange
I just checked a presentation from Paul Thibado | Charging Capacitors Using Graphene Fluctuations, where he explains in a nutshell that energy in his experiment is gathered by the diodes, not the graphene. The graphene is acting like a variable capacitor that doesn't need energy input, but is at equilibrium. From his own claims, it doesn't break 2nd law of Th. even though I don't fully get the explanation.
That was one of the best lectures although it is hard to compare lectures. It's maybe the first time entropy makes sense even though I have been reading about it all my life. I also like your dry physics humor. One needs humor in life. So thank you very much.
At 14:25, "I believe as the Universe gets older ..." and we observe "local entropy" ... "... new systems will emerge ..." - absolutely, in a never-ending dance! And we are starting to have clues, such as the pseudo "dark energy" and galaxies that look too old to belong to our own Big Bang! I dig into these questions a c-l-i-c-k away!
I think many of my (undergraduate) students would get a laugh (or a cry) out of the phrase, '...You don't need to know that much maths, just differential equations and probabilities...'
There's not a single thing about Special Relativity that Einstein couldn't have explained to Archimedes. Einstein would probably take a detour into explaining the modern notation of algebra. But this wouldn't really _be_ algebra, for the same reason that many undergraduates in computer science can barely distinguish a formula from an equation. Archimedes: This algebra thing is cool! I wonder what else you can do with it. Einstein: Well, I did once take a year-long detour into the deep technical weeds of Ricci tensors. Archimedes: Excellent! Please explain. [Archimedes smooths out some complex geometric diagram in the sand.] Einstein: Uh, okay, but we're going to need a _much_ bigger beach.
My college students would laugh at "half the box being filled with particles" being called a low entropy state, as they would understand that this is just one state that's counted in the entropy of the system. As someone who implies they have physics students, do you not see anything wrong with her understanding of thermodynamics?
@@johnzander9990 But it clearly has lower entropy relative to being fully spread out in the box. Restricting the particles to half the volume halves each particle's accessible positions, so for N particles the number of microstates drops by a factor of 2^N. Further, it is clearly not an equilibrium state and not a state that can last long.
@@johnzander9990 Sorry, I should add that your understanding of entropy is a bit concerning. The state of the box half filled is the macrostate. It is not one of the contributing microstates to some other macrostate, as your statement implies. E.g. you could instead take the macrostate to be that all molecules are somewhere in the box, and *then* you could consider "all on one side" as a contributing microstate of that macrostate.
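A back-of-the-envelope sketch of that counting (positions only, velocities ignored, so this is just the spatial part of the entropy):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def confinement_entropy_drop(n_particles: float) -> float:
    """Entropy difference between an ideal gas spread over the whole box and the
    same gas confined to half the volume: each particle has half as many
    accessible positions, so the microstate count drops by 2**N and
    Delta S = N * k_B * ln(2)."""
    return n_particles * K_B * math.log(2)

for n in (1, 10, 100, 6.022e23):  # the last one is roughly a mole of gas
    log10_prob_all_left = -n * math.log10(2)  # log10 of the chance of all N being in one half
    print(f"N = {n:g}: P(all on one side) = 10^({log10_prob_all_left:.3g}), "
          f"Delta S = {confinement_entropy_drop(n):.3g} J/K")
```

So "half the box filled" is both a macrostate in its own right and, as the thread says, one of the microstates counted in the "gas is somewhere in the box" macrostate; it just occupies a vanishingly small fraction of them once N is large.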
Perfect timing. I am reading The Order of Time by Carlo Rovelli. And the low entropy Google algorithm sends me right here. Always enjoy your explanations. Well worth my subscription.
My assessment of the reason behind the confusion that seems to permeate discussions about entropy is that in the act of representing entropy as a thing, our focus is drawn to it as such. Yet it is more like the negative space *around* objects, rather than a thing itself. Entropy is an effect of a process. The most succinct description I have heard (for those of us who may not readily be able to compute Boltzmann's constant times the log of a microstate count) is that "Entropy is the tendency of energy to go from concentrated to diffuse." Every single event since the beginning of the universe (assuming it had one) can be described as follows: "Energy changing from one concentrated form to a different, more diffuse form." Everything from our planet's geology, magnetosphere, biosphere, including our bodies & their metabolic processes, brain activity, almost any form of technology we invent and use... all of it merely capitalizes on complications of this eons-long cosmic slide down the entropic decline. We are all living in the aftermath of an explosion; embers in the dark.
If we crack fusion and are able to create low entropy objects in perpetuity (and maybe even black hole generators), how does that affect the progression towards heatdeath? Schrodinger wrote a great book near the end of his life titled "What is Life?" that touched on entropy and order, but it always confused me. This video helped to clear a lot up! Thanks Sabine :)
Hi Sabine, I think this is one of your best physics presentations yet, explaining entropy and how it applies to pretty much all aspects of nature. Thanks.
You know, it's not as if she made a conscious choice to be conceived, and then to pop out of her mom's uterus for the sake of her future YouTube audience... 😂 You could thank her for making a YouTube channel, but the caveat is that she isn't a believer in Free Will (and in Agency by extension, at least not in the classical sense) so...🤔
As a total layman, I love the way you manage to dumb down any subject to my level without actually dumbing it down. I wish you'd been my physics teacher when I was at school! However, you did give me one microsecond of pure existential dread when (I thought) you said that the heat death of the universe would occur in ten to one hundred years. Aaaaaaaaghhh! Oh, no, that's ten to THE one hundred years. Panic over. Now I need to find some clean trousers.....
Great video! But tiny correction: high entropy means high information, not low information. The more random something is, the less succinctly it can be described. At least for Shannon entropy. For instance, imagine water in its solid form with molecules neatly aligned: it can be described succinctly as a repeating pattern, whereas in its (high entropy) gaseous form each particle can be anywhere in the volume that contains it, and describing the micro-state would require describing the position of each molecule (high information).
That's a nice description - just watching the vid, but based on what you say I wonder if this video can do better than that concise description. Maybe the simpler the explanation, the more encompassing it is? Entropy feels like a description tending towards that... hence its ubiquity/applicability.
Ya... I think one has to be careful when analyzing a physical example using Shannon. In physics it's about information known (or information that can potentially be known). It's not conceivable that we could know very much about the micro-states of gas molecules, yet we have a lot of information about the micro-states of ice. Thus, in the context of the ice and gas example you gave, it's kind of the opposite of what you said. There is a subtle connection between the two concepts of entropy. A good paper is by Jaynes, if you want to read that. Sean Carroll also writes about this too.
But given that entropy tends to increase, does that mean the amount of information tends to increase or decrease? Consider the case of a gas confined to one half of a container by a barrier. Then the barrier is opened and the gas escapes to fill both halves of the container. Do we have more or less information about the positions of the gas particles than we did before?
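One toy way to put numbers on that question (the particle count and the coarse-graining into a million cells are arbitrary choices here): count the bits needed to pin down every particle's position before and after the barrier is opened. The extra bits are information about positions that we no longer have, which is the Shannon-side version of the entropy increase ΔS = N·k·ln 2.

```python
import math

def bits_to_locate(num_cells: int) -> float:
    """Bits needed to specify which of `num_cells` coarse-grained cells a single
    particle occupies, assuming it is equally likely to be in any of them."""
    return math.log2(num_cells)

n_particles = 1_000
cells_full_box = 1_000_000            # arbitrary coarse-graining of the container
cells_half_box = cells_full_box // 2  # barrier closed: only half the cells reachable

bits_before = n_particles * bits_to_locate(cells_half_box)  # barrier closed
bits_after = n_particles * bits_to_locate(cells_full_box)   # barrier open

print(f"bits to pin down all positions, barrier closed: {bits_before:.1f}")
print(f"bits to pin down all positions, barrier open:   {bits_after:.1f}")
print(f"extra missing information after opening:        {bits_after - bits_before:.1f} bits")
```

So after the barrier opens we have *less* information about the positions (one bit less per particle), while the information needed to specify a microstate, and with it the thermodynamic entropy, has gone up by exactly that amount.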
How satisfying to hear that the reference to order in explanations of entropy is nonsense. The connection to information and macro/micro states is another source of confusion, I'd say, as it can introduce the role of observers and their subjective view into the definition.
This is a great explanation of entropy. I loved when Sabine said that entropy has nothing to do with order. Also, that entropy depends on the definition of macrostates, which are defined by the observers, us.
But in the bigger picture we're talking about a battle between a system's tendency to disorder vs order. The 2nd Law states all physical systems tend to a disordered equilibrium. This in itself is problematic, as a perfect equilibrium would be well ordered. A gas at PERFECT equilibrium in a container would be perfectly evenly spread with its molecules at rest. Quantum fuzz may prevent this perfectly ordered system, and scientists consider a gas where all the molecules have the same velocity to be in equilibrium too, even if they are moving about 'randomly'. -- The question is, has the universe become more or less ordered over time? MUCH MORE is the answer, not less. That the basics of the 2nd Law of Thermodynamics hold for MOST closed systems is the best we can honestly say, if you ask me, which you didn't! To say it holds for the whole universe, be it finite or infinite, would be a leap of absurd faith.
This has become my favorite stream of yours. I never thought entropy could be explained in such simple terms. Also, the past hypothesis may be falsifiable with how you framed it. Human life expectancy has been increasing over time. Inspirational.
Fascinating video and very well explained; entropy has always been one of these quantities I've found difficult to define or understand. It has mostly been explained to me in the concept of order (or just plain equations), and those mostly just made me go ‘oh that makes sense... wait.’ Minor note, Boltzmann has two n's. 😅
I absolutely love this discussion! Everything is entropy. However, I always wonder about entropy and infinity and the implications. If a system and its microstate become open at some point, will this not allow a low entropy influx just as the divider would? It would introduce new information for sure. Our perception seems to be a factor in this riddle, and perhaps the microstate (the system) is actually consciousness and will remain a stable charge intertwining between infinite dimensional states. Perhaps "thought" is the only real thing.
As a former physicist now making my way into philosophy of science, what you said in the "How will the universe end" part is the most satisfying take on entropy I've ever heard from a physicist. I've been wondering about precisely the fact that a "macrostate" is subjectively defined, and so entropy increase must not be that fundamental, but I never heard any physicist go down that rabbit hole. Thank you for that! You must have at least some interest in philosophy of physics to be that good on these questions :)
If you view the universe as being in a single quantum state, then a macrostate is defined as the possible result of the expectation of this state on some Hamiltonian. The space of Hamiltonians defines the space of macrostates. I do not believe this to be subjectively defined.
@@chippyprice1993 That is interesting, so in this regard only the entropy of the whole universe increases through time because other macrostates are ill-defined, right ?
@@graine7929 I'm not sure I understand what you mean by other macrostates are ill defined. You mean partial macrostates? Like ones that apply to a part of the universe?
Thank you for so clearly stating what I've always fundamentally thought about entropy. Of course I will mention Dr. Asimov's famous story "The Last Question" which so firmly planted this idea in my mind. Life is truly the universe's riposte to the dismal 2nd law of thermodynamics. "Let there be light..."
I love that you started this video talking about biology. It’s not often addressed when it comes to entropy, but life is the exact reason I’ve been skeptical about the universe heat death prediction. Life has a way of swimming upstream, cleaning up messes, and finding energy.
Entropy is also addressed and very real in biology. Cells locally decrease entropy (by organizing themselves into complex structures etc.), but they can only do so by increasing entropy around them (absorbing nutrients and releasing wastes, breaking down molecules etc.). When you study lifeforms, you realize there is no "trick" or escape from entropy there. All in all, the reality is that without free available energy (work), every complex life form ultimately dies. Once the free energy in a closed system is spent, it's over, period. And one could argue that the universe is such a system.
Well, the argument goes like this: the entropy of the whole system increases over time (or at least doesn't decrease). So looking at biological organisms by themselves is a mistake. If you take into account everything they (we) do, the entropy actually goes up. As the video says: we increase the entropy of the sun, fossil fuels etc. to decrease the entropy of other things. However, if you calculate the entropy of, say, the solar system as a whole, it increased as a result. There's even an interesting philosophical point here: you could say that the low local entropy that biological systems represent is a good way to maximize the increase(!) of overall entropy - because through our actions we alter the environment in more sophisticated ways, increasing the overall entropy of everything around us through the search for usable energy (which, as Sabine stated, means local low-entropy reservoirs). I'm not sure I agree with this line of reasoning, I'm just saying that there's more to biology vs entropy than meets the eye.
But that only happens at the expense of increasing entropy elsewhere. Mostly in the Sun, but some of it, like the life around undersea 'black smokers,' at the expense of nuclear decay of materials in Earth's crust (and *perhaps* also tidal flexing in some planetary moons, keeping interior water liquid). However, all stars have finite resources for fusion (the iron-peak nuclides like iron-56 and nickel-56 are the end of the line, if not sooner), nuclear decay eventually reaches its lowest state, and tidal flexing goes away as those orbits tend to circularize and the 'flexing' stops (and over *seriously* long times, orbits decay via gravitational radiation). It's true that life is a constant *struggle against* entropy, though.
Life (once it exists, and especially in prokaryotic form) is pretty good at avoiding becoming extinct, but it's pretty bad at breaking natural laws. So, sorry pal, heat death is the absolute end (and very likely the last living organism will have died a lot earlier than that).
@@TheBouli Once heat death has been reached, there is no way of extracting energy from any source in it to fuel life (and all living beings are parasites that thrive on the increase of entropy, that is, on the availability of an external energy source). Trying to extract energy from a closed, maximum-entropy system is like trying to extract zero-point energy from an object. It won't work. It's just physically impossible.
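For anyone who wants the textbook quantity behind the "free available energy" mentioned in this exchange, here is the standard definition (my addition, not something from the thread): F is the Helmholtz free energy, U the internal energy, T the temperature, S the entropy, and the bound on extractable work holds for processes at constant temperature.

```latex
F = U - TS, \qquad W_{\text{extracted}} \;\le\; -\,\Delta F \quad (\text{constant } T)
% once F has reached its minimum (equilibrium), no further work can be extracted from the system
```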
Chaos theory always seemed to go against entropy, but I never really got far enough into either of them. Chaos theory was the study of order arising from a chaotic environment. The example I remember is cars and red lights: even if the lights, roads and cars are random, the time to drive from point to point tends to stay close to an average that emerges naturally. Order emerging from a high-entropy state. I guess it could tie in with the information thing: when we don't know all the microstates, order can emerge from chaotic interactions.
Yeah, but as far as I can tell, in most chaotic systems the general trend is to create and exploit order to produce more chaos. Black holes and humans, for example, tend to turn all kinds of stuff into plain heat (useless energy), and we are highly ordered ourselves - a product of a chaotic nature and universe. I don't know if we can reach the sci-fi level of breaking matter down into useless energy better than a black hole does. So in the big picture, chaos theory doesn't seem to go against entropy. In fact, we can say life exists to "accelerate" entropy.
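A tiny simulation of the "an average travel time emerges even from random lights" idea in the comment above (entirely my own toy model; all numbers are made up): the total trip time is a sum of many independent random delays, so it concentrates around its mean.

```python
import random
import statistics

def trip_time(n_lights):
    """Total delay across n_lights independent random red lights (0-60 s each)."""
    return sum(random.uniform(0, 60) for _ in range(n_lights))

trips = [trip_time(30) for _ in range(10_000)]
print(statistics.mean(trips))    # ~900 s (30 lights * 30 s average delay)
print(statistics.stdev(trips))   # ~95 s: the relative spread shrinks as more lights are added
```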
I liked this video because it resonated with my views on entropy that I gained after reading the inspiring article by Myron Tribus and Edward C. McIrvine, "Energy and Information", Scientific American, Vol. 225, No. 3 (September 1971), pp. 179-190. I strongly recommend that paper, because it provides an insightful complement and an additional explanation and formalization of the points raised by Sabine.

Following Shannon's original proposal, lucidly explained by Tribus and McIrvine, entropy is defined as the degree of uncertainty, X, that one has about a certain question Q. A question can be "In which part of the box is the particle, left or right?" Complete uncertainty about that question means that there is equal probability, p_1 = p_2 = 1/2, for the particle being in the left or the right part of the box. The entropy, defined as S(Q|X) = -\sum p_i ln p_i, is then S(Q|X) = -ln(1/2) = ln 2. On the contrary, if one knows the answer, namely that the particle is in the left part (denoted 1) and not in the right part (denoted 2), then p_1 = 1 and p_2 = 0. The entropy is then S(Q|X) = 0, which means that there is no uncertainty about the question Q. This is in contrast with S(Q|X) = ln 2, which is the maximal ignorance about the question Q. The authors then remark: "The only state of greater ignorance is not to know Q." And not knowing Q, or even worse, not even being aware of the fact that one must bring Q into the definition of entropy, is a source of all the confusion and all the difficulties people have in understanding the concept of entropy.

It was great that Sabine revealed, in her own wonderful and easily graspable way, this important point to the general public. In the article cited above, the simple model with two possible answers that I gave to illustrate the idea was generalized to generic questions with any number of possible answers. Information was defined as the difference between two uncertainties, I = S(Q|X) - S(Q|X'), and the relation to the thermodynamic entropy was laid out. Again, a fabulous, inspiring, highly cited paper that removes the longstanding "mystery" and confusion.
@kaboomboom5967 You would certainly understand the Scientific American paper "Energy and Information" which is very clear and readable, written for general audience.
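To make the S(Q|X) = -\sum p_i ln p_i example in the comment above concrete, here is a minimal sketch (my own illustration, not code from the Tribus-McIrvine paper):

```python
import math

def shannon_entropy(probs):
    """Uncertainty S(Q|X) = -sum p_i ln p_i (in nats), skipping zero-probability answers."""
    return sum(-p * math.log(p) for p in probs if p > 0)

# Question Q: "In which half of the box is the particle, left or right?"
print(shannon_entropy([0.5, 0.5]))  # complete uncertainty: ln 2 ≈ 0.693
print(shannon_entropy([1.0, 0.0]))  # answer known: 0.0, no uncertainty about Q
```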
Can you do a vid discussing your take on the recently proposed “Second Law of Quantum Complexity?” I am both fascinated by it and would love to hear where you stand on its proposal and maths. Thanks for all you do Sabine!
The interplay between life (makes order from disorder) and the 2nd Law of Thermodynamics (expects disorder from order) is very interesting. The way life fights the losing battle of ageing is by being multi-generational, a reset to a low entropy state. It then follows that life is fundamentally about paying it forward with the older generation being devoted to the success of the younger one.
@@jzemens4646 Furthermore, the constant battle against the 2nd Law of Thermodynamics is why societies evolve to more enlightened states, not uniformly or monotonically, but inevitably. This is because every generation has to create sufficient new information to overcome the entropy increase. Those societies that are unable to do so eventually perish. Very few societies can maintain an exact balance, neither advancing nor going backwards. The rest, utilising their culture and the fruits of science, build up more information than the entropy increase destroys and arrive at a better place from one generation to the next.
But it isn't order from disorder. The sun literally provides more energy to the surface of the planet than is required by the life on it. The 2nd law only applies to an isolated system, and the earth is by no means an isolated system.
All life radiates heat and therefore functions as an entropy machine. The "order" of life is emergent from the 2nd law of thermodynamics because it functions to increase entropy (i.e., it spreads out energy). One could cite climate change as proof of this!
Entropy is much more concrete in the case of genetics than heat. Think of the landscape model that the virus physicist Peter Schuster describes in explaining the natural algorithm that allowed proteins to evolve into the efficient, optimal folding machines that they are. A similar landscape model can be used to describe the universe of all possible gene configurations and organize them by the genetic differences (as distances) between them. For organisms to evolve, their gene structures must migrate across generations through this landscape. The law of entropy applies to this landscape because fewer gene arrangements produce successful complex life forms than produce successful simple life forms. And, likewise, gene arrangements that produce successful simple life forms are outnumbered by those that produce unsuccessful life. We know this based on entropy. Within this landscape there may be more than 10⁶⁰⁰ possible arrangements. However many there are, it is a number so large that not even a smidgen of them could have occurred across 4 billion years, even if every organism that ever existed in that time frame had been genetically unique. Yet, in defiance of the static entropy characteristics of the genetics landscape mapping all life, gene configurations managed to randomly migrate to ones that produce ever more complex life forms. All I can say is bull dung; it didn't happen. Do the math. A natural-selection genetic algorithm is not sufficient to solve such complex problems, approximately optimizing dozens and dozens of interrelated phenotype systems in single organisms. Bull dung. No algorithm could solve such a search problem.
11:36 Sabine you are wrong. High entropy indeed corresponds to high information content, as per Claude Shannon's definition in information theory. The reasoning behind this is that entropy essentially quantifies the level of uncertainty or randomness within a set of data. The more uncertain or unpredictable the data, the more information it contains. For reference: "en.wikipedia.org/wiki/Entropy_(information_theory)"
If a highly likely (High entropy) event occurs, the message carries very little information. On the other hand, if a highly unlikely (Low entropy) event occurs, the message is much more informative. This is exactly what Sabine is saying.
@@Darenimo You twisted some definitions in your comment. Entropy IS average information. If a high-entropy message occurs, then the message carries lots of information. High Entropy is NOT the same as "highly likely". This holds for information theory AND statistical mechanics!
@@TheCreativeKruemel It's not that anyone here is wrong, it's just that the definitions of stat mech are all screwed up, and always have been. Physicists have been arguing about thermodynamics for goddamn years because of the way it was formulated. First, you have to remember what information really is, and which observer the "message" is being viewed from. If you read the message 000110010101, this tells you some kind of information, but it turns out that this message was just the macro-state description of a more underlying message: 00110000 00110000 00110000 00110001 00110001 00110000 00110000 00110001 00110000 00110001 00110000 00110001. When you compare these two, their information content is the same (the second is the binary expansion of a binary set of numbers, where 00110000 is 0 and 00110001 is 1). The second clearly has more bits, because it's very obviously longer and can describe a lot more states (higher Shannon entropy). But also, under Shannon, the surprise of the messages is equivalent, in that these messages have the same amount of surprise and therefore the same "Shannon" information. Additionally, and by contrast, randomness is not well defined. Consider the string of 0's and 1's again: 000110010101. It contains regularity, and in fact, for any arbitrarily large string you will never find a truly random sequence; there will always be patterns and repetition in that sequence: 000, 11, 00, 0101, 1010. These are patterns embedded in the seemingly "random" string, and I'd challenge anyone to try to create a truly random string of 0's and 1's. The only time one can is when each bit is unique, at which point the idea of "surprise" gets thrown out the window, because every state would be a surprise if all of its bits are unique (and this is the true extension of what Sabine said, just from an information-theoretic viewpoint). Cheers,
Say you have an airtight box on your desk and you request information about the contents of the box. You could get a high-entropy response, letting you know all the possible configurations of atoms and molecules that could exist in the box. Or you could get a low-entropy message, telling you it contains mustard gas, and opt not to open it. And you'd probably find you gained more information from the latter message than the former, even if the former had a higher information content.
@@Darenimo Hey, good point :)! The low-entropy response, saying "it contains mustard gas", is less detailed but more straightforward for us to understand. However, 'information' here doesn't just mean what WE can understand easily (very important!). It includes ALL possible details, even very complex ones (one key idea in Shannon's original paper). In principle it would also be possible to use data compression and character encoding on the high-entropy message to reduce it to a message we humans can read :). The important thing to understand is that the high-entropy message (about all configurations in the box) contains ALL the information of the low-entropy message!
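A quick way to see the distinction this sub-thread is arguing about (my own sketch, in bits): the surprisal of a single outcome is -log2 p, while the entropy of a source is the average surprisal over its whole distribution. A rare outcome is individually more informative, but the average information per message is highest when all outcomes are equally likely.

```python
import math

def surprisal_bits(p):
    """Information content of one outcome with probability p, in bits."""
    return -math.log2(p)

def entropy_bits(probs):
    """Average information per message (Shannon entropy) of a source, in bits."""
    return sum(p * surprisal_bits(p) for p in probs if p > 0)

print(surprisal_bits(0.99))        # likely outcome: ~0.014 bits, barely informative
print(surprisal_bits(0.01))        # unlikely outcome: ~6.6 bits, very informative
print(entropy_bits([0.99, 0.01]))  # low-entropy source: ~0.08 bits per message on average
print(entropy_bits([0.5, 0.5]))    # high-entropy source: 1 bit per message
```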
In my book "Evolution: "Von Mikroorganismen zu multinationalen Konzernen - 3. Auflage" (only Amazon) I came to somewhat different conclusions. This book is also based on the idea that the increase of entropy means a loss of information (see Murray Gell-Mann). However, when discussing life, physicists practically always talk about thermodynamic entropy (see Schrödinger). In my book I claimed instead that living beings have competences (knowledge etc.) towards their habitat in order to gain resources (especially energy) from it. However, they are primarily concerned with not losing information (which has to be stored physically somewhere). In other words: living beings behave in a loss-of-competence averse way. And this is then also the basic assumption of the competence-based theory of evolution, which is substantiated in my book. Unlike Schrödinger's approach, it can also be used to justify reproduction. Schrödinger's approach can only explain self-preservation of living beings, not reproduction. The competence-based theory of evolution is a generalization of the original Darwinian theory of evolution (which unfortunately has no physical foundation). However, it contradicts the theory of selfish genes and the total fitness theory. I deal in my book of course also with the question whether there will be no more life in the universe sometime. In my opinion this will be so. Because life needs competences. And these competences (knowledge) must be physically held somewhere (DNA, brain, hard drives, etc.) as extremely improbable states. And they must be reproduced permanently, because the entropy law applies. So high quality energy is constantly needed to reproduce the competencies (so that high quality energy can still be obtained in the future). And I am afraid that at some point this high quality energy will no longer exist in sufficient quantity. My book is in German and quite difficult. It also covers the evolution of man and his societies, which Darwin cannot even begin to do. Biologists are often of the opinion that man is just another animal. My book, however, very clearly concludes that man has emerged from the animal kingdom. Basically, human beings have acquired more competencies than the rest of the animal kingdom put together. Man is capable of division of competencies, other animals need division of species for that. I hope to translate the book into English via ChatGPT and/or DeepL by the end of the year, until then only reading in German remains.
Information cannot get lost (the law of the conservation of quantum information). An increase in entropy does not mean a decrease in information, but an increase. That's the basis of information theory. If your book is based upon the idea that an increase of entropy means a loss of information, then it's built upon a flawed premise.
@@noneofyourbusiness-qd7xi Unfortunately this is wrong. I recommend a Google search for "information loss entropy". Murray Gell-Mann explains the connection very well in his book "The Quark and the Jaguar: Adventures in the Simple and the Complex". His explanation goes, roughly, like this: if all particles in a gas are in a tiny quadrant, then we know not only the macrostate of the gas but also its microstate quite precisely. In this state, the entropy of the gas is very low. However, when the particles then disperse again, we lose much of the knowledge about the microstate of the gas. Thus, with the increase in entropy, we experience a loss of information (or of knowledge about the microstate). The situation is similar for living beings in their environment: the better adapted they are to the environment, the greater their knowledge about it (Popper). One can also say: the greater their competences towards the environment. If they lose a part of their adaptation, they lose a part of their knowledge about the environment. Schrödinger looked at the events from the thermodynamics of a living being. I think this is problematic, especially since evolution is about information processing (even genes are subjected to information processing). It is therefore more advantageous to look at the events from an information-theoretical point of view and to understand an increase in entropy as a loss of competence. And by the way: buildings decay with time. Living things decay in no time at all when they lose their life. But we also forget things, for example names. The older you get, the easier it is to forget. You lose information. Did it really never occur to you that this is the same mechanism?
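To put a rough number on the Gell-Mann quadrant picture above (my own back-of-the-envelope, counting only the positional part of an ideal gas's entropy): confining N particles to a fraction f of the volume lowers that entropy by N k_B ln(1/f), and letting them disperse again raises it by the same amount, which is exactly the lost knowledge of which region each particle is in.

```latex
\Delta S = N k_B \ln\frac{V}{fV} = N k_B \ln\frac{1}{f}
% e.g. f = 1/4 (one quadrant): free dispersal raises S by N k_B ln 4,
% i.e. about 2 bits of microstate information lost per particle (which quadrant it was in)
```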
From what I learned about the laws of thermodynamics, great stress was placed on the concepts of OPEN vs CLOSED vs ISOLATED systems. We were taught that the potential function which determines the overall direction of change in an ISOLATED system is entropy increase. There can only be ONE isolated system, the universe. So my understanding was that as the universe evolves its total entropy increases. You seem to be saying that this is incorrect. That the total entropy of the universe can stay the same. Also my partial understanding of what you are saying, (as a non-expert), is really to do with open and closed systems within the isolated system of the universe. So that even when the total entropy of the universe as a whole has reached a maximum (and where does it stop?), you can have closed systems in which the entropy is locally decreased at the expense of the surroundings.
A vacuum Dewar is a pretty good short-term approximation of an isolated system (at least on chemical reaction time scales). And if the accelerating expansion of the "universe" that our astronomical observations indicate is true, then we are all trapped in segments of the "universe" by an effective event horizon defining our observable universe. A sort of relativistic Dewar. Yours will be only slightly different from mine, but researchers 100 billion light years from us are effectively in a completely different isolated system.
This is my understanding too. There's a theory that dark energy is proportional to the entropy of the universe as a system - specifically, that the area of the boundary expands with the total entropy within the universe - and it's speculated that this also applies to black hole event horizons.
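For reference, the black-hole end of that speculation is the Bekenstein-Hawking formula (a standard result, added here by me; the dark-energy link itself is the commenter's speculation), in which the entropy scales with the horizon area A rather than with the enclosed volume:

```latex
S_{\text{BH}} = \frac{k_B c^3 A}{4 G \hbar}
```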
You're right. Sabine's statement that the entropy of the universe can remain the same is utterly erroneous. It ALWAYS increases, regardless of which macrostates you might be looking at. That it must increase with every physicochemical process is a corollary of the 2nd law of thermodynamics. And this has not changed (overall) since Boltzmann.
For an isolated system at equilibrium, there's the approximation/heuristic that all microstates are equally likely. So, defining macrostates in terms of subsets of the total available microstates in the system then allows calculating probabilities for the macrostates. And defining entropy as the (log of) the number of microstates comprising the macrostates makes sense. But, for systems that are not isolated, or that are not at equilibrium, not all microstates are equally likely. So then I'm less certain that entropy can be defined as a simple count of microstates.
@@WestleySherman True, but in that case entropy is the sum over microstates of p_i ln p_i (with a minus sign in front), S = -k_B \sum_i p_i ln p_i, and the p_i can be different for each microstate that is part of the macrostate.
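A tiny numerical version of this point (my own sketch, with k_B set to 1): when all Ω microstates of a macrostate are equally likely, the Gibbs sum -Σ p_i ln p_i reduces to Boltzmann's ln Ω; with unequal probabilities (non-isolated or non-equilibrium systems) you just use the sum directly.

```python
import math

def gibbs_entropy(probs, k_B=1.0):
    """S = -k_B * sum p_i ln p_i over the microstates making up a macrostate."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 4
uniform = [1 / omega] * omega
print(gibbs_entropy(uniform))   # ln(4) ≈ 1.386, same as the Boltzmann count S = ln(omega)
print(math.log(omega))          # 1.386...

skewed = [0.7, 0.1, 0.1, 0.1]   # microstates not equally likely
print(gibbs_entropy(skewed))    # ≈ 0.940, smaller than ln(4)
```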
Veritasium's video on entropy is really interesting and maybe aligns with the "uplifting" takeaway here. The universe doesn't just tend towards higher entropy, it also tends towards faster entropy gain. Life as we know it is locally the fastest entropy-increasing mechanism, and as entropy further increases, other life-like mechanisms (that may not be recognizable to us) should become likely, because the universe tends towards accelerating entropy growth.
As Abba said: "Entropy killed the radio star..."
Come on Sabine ! Do another techno pop song ! Please.
Abba? I think you mean The Buggles.
That was Jefferson No Ship...😮
Don’t get me wrong I love the physics. But “Catching Light” was the bomb !
Abba surely not
I have witnessed water running upwards from a ditch onto a road then becoming airborne and creating a cloud. But, it was because of a tornado passing near to my house and I hope never to see anything like that again.
😮😮😮
Fascinating! Scary, but fascinating.
💀💀
I thought you were going to say mushrooms.
When water evaporates it goes into the sky to form clouds.
Aging biologist here! I’d like to add an uplifting comment to this uplifting video. :)
I’m always a bit sad to see entropy given as the reason we get old. As Sabine discusses later in the video, open systems with access to a source of low entropy can use that to decrease their own entropy, so given that we can take in low-entropy food, there’s no in-principle reason we couldn’t use this to keep the entropy of our bodies roughly constant with time. So not aging is totally allowed by the laws of physics!
It’s even well within the laws of biology-there are plenty of animals that don’t age, from tiny hydra to giant tortoises, and even one of nature’s most beautiful animals, the naked mole-rat. Their risk of death and disease doesn’t change with time, which basically means they keep their entropy constant throughout their adult lives.
Now all we need to do is crack this biological entropy preservation using science…but that’s another story!
Unfortunately, we already have an overpopulated earth so preventing people from aging is gonna exasperate the issue. I would not be against preventing ageing though, even if my life was no longer than normal, a life without aging is far more enjoyable than living with a constant gradual decline.
@@elio7610 If you’re worried about overpopulation, I made a video about exactly that which you might enjoy :)
Planaria?
Wow, I can't believe that I ever espoused that exact view. When you put it like that, entropy as a reason for aging makes no sense. Thank you for ridding me of that misunderstanding.
@@edcunion I’m not sure if we’ve got any lifespan data on them (happy to be corrected!), but given their regenerative powers I’d not be surprised!
THANK YOU for pushing back against entropy being described as order vs. disorder! Through years of schooling entropy was this poorly defined, almost spooky concept of order. Then I was finally introduced to entropy as probabilities of microstates (with gas in a box), and it was completely logical and clear.
Entropy is to heat transfer what friction is to an object in motion. Entropy reduces the available energy in a system just like friction. Order vs disorder isn't useful at all IMO and only serves to cloud what's happening.
Agree. Order is a matter of human opinion. Don’t think nature cares.
I think they're synonymous ways of talking about it, it's just that order as a concept has more to unpack to get to the point. Order means that there are rules that limit the number of microstates. A bookshelf ordered alphabetically has very few allowed microstates (defining the microstates as the books' arrangement), a disordered bookshelf on the other hand would have as many microstates as there are ways to arrange the books.
I agree! I've always found order & statistical description of Entropy to be fundamentally wrong.
And if you look you can see that this approach exists in many other fields of science (of SCIENCE! Which is supposed to be OBJECTIVE :))
The trouble is scientists are not always objective - actually everyone is at least a little subjective at times, that due to our limited resources (at least attention and Time).
But how science generally work - you observe a phenomenon, you describe is somehow, you test if your description allows you to predict same phenomenon in the close future; if your prediction fails (or is imprecise) you improve your description until you get accurate enough predictions ... and at this point you KNOW that you best description is the CAUSE for your prediction ... so it's easy to miss the fact that the Universe doesn't give a rat's ass about your descriptions :)
It's just one of our many biases - one that many scientists also fail to.
Ultimately our descriptions would be so perfect that they'll fit to Reality 100%, and then it would be only a philosophy game to distinguish between the two ... but even with our most precise sciences we're still not there.
But people (and esp. scientists) love to believe that the theories they learn (and especially the ones they make) are close to perfect and thus that bias becomes really, really strong!
Have you seen a scientist trying to explain some phenomenon by stating some equation and ending with something like "and as the equation tells us, the object has to do this and that".
Well sure if your theory and equations are perfect you'll get it right ... except for the understanding part!
Very little of such descriptions actually explain anything, and it's exactly because they jump over the actual forces and threat the description as the cause.
In that case if your description is for example statistical (which many sciences use today ... and particularly QM) then it's really easy to believe that probabilities and statistics are the CAUSE for things like Entropy!
... but this is still fundamentally wrong of course! :)
Kinda like dark matter
Some of your vids I come back to months later, again and again, because they are lovely and feel new every time. This is one of them.
“My videos can only go downhill from here…” The perfect ending to a video about entropy! 😂
Yep, that joke is brilliant and works on so many levels.
The entropy of explanations of entropy. So meta! The explanations can at best stay this good, and will likely get worse from now on.
She is probably right, not that they were anything special from the start.
I'm a mechanical engineer who focused on thermodynamics in college. Despite years of study and using entropy in formulas, this is by far the best explanation of entropy I've ever heard.
Would you agree that entropy is fishy?
There's a reason my father, who trained as a mechanical engineer but became a computer programmer/systems analyst, always referred to thermodynamics as "thermogoddamics".
Classic response on a video that does not explain Entropy.
It's far easier to believe we understand than believe we've been fooled into understanding.
My education was the same. I was waiting for her to say entropy tends to increase in a CLOSED SYSTEM, but she never did. She must assume the universe as we understand it is a closed system, but that's very debatable, given the required low entropy at the Big Bang. Also, questioning the semantics of the 2nd Law (the meaning of "order") seems trivial to me. After all, I recall the word "disorganized" being used in the 2nd Law, which I think is better.
She actually makes the mistake of calling heat a form of energy, when in fact heat is a form of energy transfer.
If I don’t shower, all the good air in a room definitely moves into the corner.
Chocolate RAIINNNN!
Some stay dry
@tayzonday we definitely have the same algorithm. You keep behaving like a tyrant in the comment sections 😂
When I don't shower, everybody suffocates and my flowers die
@tayzonday And if you move into *that* corner, it will move into the opposite corner.
This has brought me closer to reconciling questions I've had about entropy than I've ever been before. Thank you for that.
I've made some visualisation that might further enhance your understanding: ruclips.net/video/lW7HT6IxI_o/видео.html
I completely agree with you. When I was young, I used to tell my mom that my room was never in disorder because disorder itself doesn't truly exist. Instead, what exists are infinite states of possible order, and none of them holds more inherent sense than another.
If a workspace becomes disorganised, the ability to get work done rapidly approaches zero. So I would argue that certain states of possible order do make more inherent sense than others.
If I understood Ramsey theory I could say something intelligent here.
So your mother was the intervening force. If you keep your room tidy today as an adult, it's probably her victory, or your partner's :)
An extremely organized workspace looks amazing, but it can intimidate some people and drain creativity. On the other hand, organizing is really good work hygiene after a project is done.
"It can only go downhill from here"
Sabine's humor, much like our universe, is chaotic.
It was a brilliant joke!
Well ordered!
I think she meant, "It can only go high entropy from here".
11:45 "If you stop a random physicist on the street and ask them if they agree..."
I wonder if she was aware how funny it sounds to anglophone ears to have someone with a German accent expound the desirability of order.
Sabine, I'm a chemist, and in conversations in the past, I attempted to make the arguments about entropy you've made so eloquently in this video (especially the associations/ideas about order). From now on, rather than arguing, I'll recommend watching your video. Thank you so much for this great video!
Chemistry is really hard for me. I can never become a chemist.
As a chemist you must realize the issue with naturalism is that time is the enemy not the friend. You need biological chemistry to happen faster than the reductive decaying chemistry. The problem is that doesn't happen in this universe... amino acids and proteins only have hours before becoming useless...
@@WaterspoutsOfTheDeep The subtext of what this video is saying is that the 'starting' state of the big bang was a macrostate that we no longer have information/knowledge of. Therefore, at this alleged 'heat death', it will yet again be a big bang of a different kind. All this suggesting that the universe might very well be an eternal macrostate that *can't* 'discontinue'.
Too bad this "eloquent" argument falls apart if reductive materialism isn't assumed. I don't think many people will agree with her that _none_ of the macro-states don't objectively exist. Clearly, humans ─ who are macro-states ─ objectively exist since, well, cogito ergo sum. Her argument is pretty nonsensical.
@@maxkho00 It's premature to label concepts at the vanguard of existence in that way. Entire areas of study could very well reside within something woefully 'nonsensical'. Since we are talking about expanses of time in the trillions of years, I'm not really prepared to reject anything, regardless of how silly it might first seem.
9:55 Finally someone said it! I always considered a homogeneous mixture as "ordered", contrary to how lots of entropy explainers describe it as "disordered". This led me to lots of confusion over the years, until my recent physics classes cleared things up.
now run outside and play
@cameronbartlett6593 no u
EZ water / exclusion zone worth a read
@@cameronbartlett6593 I am a plasma engineer and I would beat you at anything physical, best of 3. Pick your best sport; it won't matter even if I have never done it, you will still lose, because I will also pick my best sport and a third party picks the third sport, meaning you have low chances of winning. Go say your BS to yourself in the mirror. Not every nerd-minded dude is physically unable.
@yazmeliayzol624 or maybe they're spammers? I don't see them either
This video reminds me how it always frustrated me in school as well as later at Uni that things were described or defined in simplified ways that made them wrong, harder to actually grasp or both. Sometimes the actual truth of the matter would slowly reveal itself often leading to an „aha“ moment years later and a feeling of „I knew it“.
Also funny how oftentimes simply explaining the meaning of a latin or greek term would have almost explained the whole concept behind it, yet somehow no professor ever did that.
So true!
Exactly! Like for example how percentage literally means "per hundred". So 10% is 10/100. Which means you can easily convert between percentages and fractions.
This is a great point. Most teachers are just bad.
The entropy of RUclips must have decreased, since Veritasium and Sabine both posted videos on entropy within two weeks of each other :) They're both great, but Sabine definitely lives up to her motto of no gobbledygook... thanks Sabine!
Veritasium explains it better for the layperson, IMO.
This video about life and entropy ruclips.net/video/k-vm3ZWnMWk/видео.html is a really good match for the two videos you mentioned.
I like my gobbledygook with a good glass of razzmatazz
I agree, they are both great. Sabine's was posted on Jun 18 and Veritasium's on Jun 2nd. So we know Sabine wasn't responding to his (since she was first), and since Veritasium takes longer than two weeks to make one of his elaborate videos, we know he started his before he knew Sabine was doing one. That's perfect. Also cool how they both didn't like the word "disorder" for entropy: Veritasium likes entropy as "energy spread out", and Sabine presents her own argument.
@@geoffwales8646 not nit-picking, correcting. 👍🏻
I enjoy your sense of humor keeping things light and entertaining while simultaneously tackling deep concepts. Great blend.
Sabine’s Carnot efficiency at explaining this stuff is 100%.
So there is an infinite temperature gradient across her body?
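For anyone who missed the reference, the Carnot efficiency of a heat engine running between a hot and a cold reservoir is (standard formula, added here by me):

```latex
\eta_{\text{Carnot}} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}
% eta reaches 100% only in the limit T_cold -> 0 or T_hot -> infinity,
% hence the "infinite temperature gradient" quip in the reply above
```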
Not possible 😂
Are you claiming to have Carnot knowledge of Sabine?
There are always losses. Superconductors radiate when the current flow changes.
@@nancymatro8029I appreciate this joke.
Thank you for delivering such great content 👏 always interesting, humorous, and informative. My field is engineering, but sometimes, I regret not pursuing physics. Watching your videos gives me a chance to see what I missed.
Many thanks from the entire team!
@@SabineHossenfelder Time does not pass; only you are moving through time. Time is a coordinate of the state of the moving entropy. From the past state through the present state you can aim at the future state.
you still have time!
Engineering is good experience for a prospective physicist. Nikola Tesla, perhaps the greatest engineer, was, by extension, also one of the greatest physicists who ever lived, especially in the field of electromagnetism.
The idea that there are always macrostates capable of turning a high-entropy system into a low-entropy system is fascinating. Entropy being constant, but perceived as variable by us macrostates, and thus subjective, is a powerful idea.
I love how more and more physicists are adopting such a perspective! 😁✨
time crystals are a great example. Check that out!
It may be powerful, but that doesn't mean it is particularly useful.
Saying that life may be able to emerge post-heat death of the universe just because it perceives reality differently may be a great Douglas Adams book, but belongs right alongside the parallel dimensions theories in plausibility.
@@thearpox7873isn't the idea similar to Penrose's cyclical cosmos hypothesis? And he has proposed some actual ways to experimentally verify that
@@florianp4627 It depends on what you mean by "idea" and what you mean by "similar".
But I personally find Penrose's hypothesis intellectually coherent, interesting and plausibly congruent with reality, while Sabine here is engaging in the exact same cognitive hocus-pocus that she makes fun of certain other physicists so much for.
Sounds like Maxwell's demon
She’s an excellent teacher and her dry humor is hilarious! Love her!
"She’s an excellent -teacher- _propagandist_ and her dry humor is hilarious! Love her!"
FYP!
Videos that provide a "simple understanding" of historically complex concepts (like the laws of thermodynamics) are crucial for our kids to hear and learn. I am especially interested when the video explains what we "don't know" or "don't yet understand" to pique their interest in what to solve next!
This is the coolest clearest and most concise explanation of entropy ever. I wish my physics professors had taught it this way. It would have saved me so much sleep.
True say. While almost anyone can understand a concept/subject, it really takes a special mind to be able to explain it in a way that can be understood by someone else. Sabine's way of conveying information is so refreshing.
Yes. Finally someone who admits it is a human bias and not a law. 😅
To compare is not fair. Everything is perfect compared to itself
@@aggies11 I am a firm believer in the Feynman methodology: start as simple as you can to build up knowledge. If you can explain a complex idea in such a way that a child (or a non-specialist) can understand it while still being faithful to the core concepts then you are good!
@@AurelienCarnoy Yes, I get fed up with it being called a law ... to call it an assumption, or - as Sabine does - a matter of probability, would be much more accurate and satisfying.
Maybe, but she misses the point, and hence so did your professor.
For a mortal being with a life expectancy of ~80yrs I confess I spend an irrationally large portion of my life brooding on the eventual heat death of the universe. Glad to find videos like this on occasion which at least try and come up with a philosophical explanation for why we shouldn't feel depressed over the stars eventually going out. Kudos Sabine!
It almost makes one consider praying to God.
@@edh.9584 which one.. There's so many to choose from...
@siddified Definitely the Pasta One
Although I personally like the machine god that came from the future
@@HardHardMaster Well, choose one.
@@edh.9584 The "satanic temple" has some surprisingly good doctrine
I’m not old, I’m just high entropy.
Intuitive explanation of entropy. Thanks a lot
This remembers me of the book "A Choice of Catastrophes" by Isaac Asimov. He talks about how to survive heat death by exploiting low entropy fluctuations. The book was written in 1979.
Ah good remembers
We won't make it anywhere near that long
True vacuum coming soon
@@A_Stereotypical_Heretic How many languages do you speak, clown?
Love Asimov, but haven't read A Choice of Catastrophes yet. Thanks for mentioning it!
I've always read the term heat death as the death of heat, which made a lot of sense to me.
Or a death caused by heat. It could mean both things, and language is just inaccurate in many ways.
I thought so too lol
Yes, the word order in English defaults to "Death by heat" , but "Death of heat" is also a valid interpretation of the words, though not the default.
@@vikiai4241 that's interesting because I am a native English speaker and frequently come onto this but never was sure about my way of thinking about it. Thanks for the validation lol.
@@vikiai4241 Why though? What rule is there?
Perhaps entropy is just the natural part or inverse phase, moment, flip, reversal or change that occurs with all energy and matter interactions and formations in the cosmos!
Which creates, evolves, then reverses, changes, deconstructs or destroys things.... to reuse and recycle into new positive/negative formations and events ...and so on!
LOL "It'd be inconvenient if you entered a room and all the air went into a corner" @sabine you totally cracked me up with this joke. Thank you!
A friend of mine once said ”if studying physics doesn't humble you, you're doing it wrong."
If you don't confidently, arrogantly even, question physics you're doing it wrong... Entropy should be redefined as Simplicity or Uniformity... Complexity can be static and structured, or dynamic (and structured)... Closed systems simplify over time. Energy can increase complexity. This redefinition solves many issues.
@@PrivateSi You haven't been humbled yet clearly. Entropy has already been defined precisely in physics, from the mid 19th century. What loose definitions you get from science communicators isn't reflecting the reality of this situation, hence you must study physics. (Even this video enunciated this so I truly don't know where your head is)
@@moneteezee .. You can either make a new term or change the Law of Entropy, because that law is wrong when over-applied to the entire universe. It's fine for gases in a closed system, but fails as a universal law, unlike my redefinition.. You could call it the Law of Simplicity if you prefer, as long as you stop declaring the Law of Entropy to hold at all times.
@@moneteezee.. If fields are real they have to be made of discrete elements that have to be as equidistant as possible throughout the entire field for the field to be empty. This is a perfectly ordered state that's as simple as possible. Hence, The Law of Entropy should not be a law or Entropy should be redefined. It's a simple, logical argument given the evidence for QUANTISED FIELD(S).
@@moneteezee Entropy is likely the unwinding of quantum entanglement, from the source of singular entanglement of all possible quantum expressions into less-entangled, limited expressions. Entropy is the cost, or the balancing side, of the increased complexity that rebirths cycles of quantum tangling and de-tangling, which at some phase of transition leads to us. 😊
Entropy must be considered within the scope of its relationship to quantum complexity.
In short, small-scale fluctuations must run dry for the whole quantum baseline to reset into the potential of a singular expression.
😊
I love the depth and clarity you have; this kind of content is gold. I studied Physics and Philosophy, and a conversation with you would have saved me years of doubts and confusion. Thank you for this.
The true, big-picture answer is that it will never end! The universe as a whole is evolving on a very large time scale, which is macroscopically time-periodic in a statistical sense! The initial state is not relevant, as for a non-linear oscillator that becomes chaotic; but its chaotic state is periodic, perhaps because of the reorganizing, entropy-reducing nature of gravity, whose long-term impact has still not been understood or quantified! 😇
Sabine your work has changed the very way I view life and physics in a way that's nearly spiritual. I feel so lucky to be able to learn about the beauty of the universe and physics from you
She is wonderful at explaining very difficult or (almost) impossible ideas in an understandable way.
Alas, even Sabine leaves my poor brain often mystified, but always stretched, curious, and entertained. 🙂
Same same
When it’s a topic like this I need to watch a few times over a few days and dig more into some of the ideas with other videos. I like that she doesn’t completely dumb it down for people like us but opens the door for us to learn more (such as that entropy logarithm formula). But yeah at the end of this video I STILL don’t QUITE understand a simple way to explain entropy to someone. Yet. But I will after a few views 😂
She is wonderful
Without the gobblety gook
@@aaronreyes7645 In my opinion, it was all gobblety gook. She talked in circles and never really said anything at all
I know, right? I'm totally in love with her. With her MIND, but I suspect it's a package deal. I'm still game.
The statistical description of entropy has never resonated with me, but your talk about other life with access to different macrostates is very interesting, and warrants further thought.
It's vague and insubstantial. She needs to produce some scenario that could meaningfully demonstrate how one sidesteps Boltzmann's formulation of the second law.
Not the order/ disorder pop science stuff, the physics.
Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently. I love her, but this argument isn't serious.
I didn't get what she means by complex systems with different microstates which we can't use, and further that the entropy is small for them... Can you explain?
Could Entropy just be a matter of perspective?
If a horizon from one perspective can appear to be a singularity from another perspective...
A horizon takes an eternity, and a singularity happens in an instant....
And, a horizon is a state of maximum entropy, and a singularity is a state of minimum entropy...
Entropy should be dependent on your reference frame. If your horizon is growing, then your entropy should be growing.
I've been thinking about it like this after listening to Leonard Susskind. I don't know if I'm getting it quite right, but it's really interesting to think about in this way.
@@kristianshreiner6893 _"Proposing some race of hypothetical beings that rely on different Marco states is about as meaningful as proposing that magical beings will eventually come into existence who live differently."_
Well, if you understand "magic" as "fairy tales", like many do, then it's a straw man fallacy, because Sabine didn't think of it in terms of "something that doesn't exist nor can exist".
And she didn't use the order/disorder stuff except to criticize it. Boltzmann's formulation, as Sabine hinted in this video, is only useful for things that we "know", not stuff we clearly don't know, such as information that isn't accessible to us. Stop lying pls.
@@ywtcc All statistics is clearly a "matter of perspective".
One of the best commentaries on Entropy I've encountered thanks SB, especially as "The # of microstates per macrostate" - Given that a macrostate is simply an arbitrary human classification/aggregation, does this mean that entropy is an arbitrary physical aggregate outcome? I think I'm missing something, so I'll listen to your commentary until I can discern this thing. 🥝🐐
Every fundamental unit you can imagine in physics is relative in the end. References are set arbitrarily by us on the basis of what we deem stable enough to compare to. I guess entropy is also relative, as we still don't know much about what is going on at the subatomic level, but for the time being it seems to be the most pure and universally applicable quantity we can think of. (I am not a physicist but an engineer, although I think degrees are meaningless when it comes to pursuing actual knowledge and are just used as a social reference so that people are more inclined to believe you when you speak, as a sort of warranty. Some people make extensive use of this to bolster their ego, even though in the end they may not know much.)
I've been waiting a long time for this video. Sabine, you are an amazing communicator and I think the world needs to hear you. This channel and other sources like it are THE REASON I use the internet.
Thank you! ❤
SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics!
Making predictions to track targets, goals & objective is a syntropic process -- teleological.
Teleological physics (syntropy) is dual to non teleological physics (entropy).
Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant.
Mathematicians create new concepts from their perceptions (geometry) all the time!
Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant.
Mathematics (concepts) is dual to physics (measurements, perceptions).
Deductive inference is dual to inductive inference -- Immanuel Kant.
Inference is dual.
The rule of two -- Darth Bane, Sith lord.
"Always two there are" -- Yoda.
Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence.
Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.
@@hyperduality2838 Sounds like you're grasping at straws, likely for some sort of affirmation of a biased narrative. Entropy is all around you and can be observed; posing a balance to the force is wishful thinking.
@@gregwarrener4848 Your concept of reality is a prediction or model -- syntropic!
@@hyperduality2838 Really?
@@MR-ub6sq Concepts are dual to percepts -- the mind duality of Immanuel Kant, the critique of pure reason.
Mathematicians create or synthesize concepts all the time from their perceptions.
Your mind is dual.
Mind (the internal soul, syntropy) is dual to matter (the external soul, entropy) -- Descartes or Plato's divided line.
Matter is also dual -- wave/particle duality!
"Always two there are" -- Yoda.
Everything in physics is made from energy and energy is duality, duality is energy.
Potential energy is dual to kinetic energy.
Electro is dual to magnetic -- electro-magnetic energy is dual, photons.
I guess Sabine is the best person that has ever explained this relationship between order, life, entropy and those things without either getting carried away in theory or relying on very superficial metaphors. Because maybe physics is both! Intuition and somewhat philosophical insights + rigorous mathematical models.
Physics isn't philosophy, it's what we know to be reality.
@@No-cg9kj What you think you know (episteme) is philosophy.
@@No-cg9kj Physics is not that despite its name.
@@No-cg9kj I agree with you actually. Philosophical thinking doesn't mean something detached from reality. There's a section of philosophy about science and modelling and those things. I mean that it's useful to take some time to pose some "meta" questions about what we're trying to look for in nature, what is really fundamental and what stems from our specific human view of nature.
In this video Sabine took some time to consider that maybe our definition of entropy is a reflection of what information is accessible to US and what energy WE can use. It is pretty philosophical, it seems to me.
It's funny how your user name is NO btw xd, thnx for ur comment.
Science is built up on a bedrock of epistemology.
Very interesting point about 'macrostates' that I've never heard or thought of before. Definitely food for thought. Thanks!
What is "life" (body, soul & spirit)but the interiorizing, complexification of matter, a 'gift' of anti-entropy...😊
👉Creation Continues...,& that mass beyond our bounds..also, accelerates & increases... And the gravity of it all makes for a stable & expanding space-tiime in our neighborhood.
"If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations-then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation-well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." - Arthur Eddington, New Pathways in Science
Great video Sabine! Even though I have an MSc in chemistry and a PhD in pharmacy, one of the hardest things to understand is thermodynamics. You really have to eat and breathe thermodynamics to really understand it, it doesn't come with intuition, and one of those things is entropy. So thank you Sabine for making entropy more intuitive.
Farmacy?
Do you mean agriculture or is this alternative medicine?
Sorry... couldn't help myself.😁
@@mixerD1-or maybe an Fhd.
@@mixerD1-ha ha, too much swedish 😊
Your physical chemistry profs will be extremely disappointed if they read your comment (and likely be sorry that they let you pass)
Entropy is a tendency, not a law. For instance, galaxies could not assemble if entropy was the rule. The solar system could not exist if entropy was the rule. Fusion could not occur if entropy was the rule. It would appear then that gravity is the opposing force to entropy.
Also, doesn't that mean that gravity provides an objective standard for what counts as a macrostate? 🤔Gravity puts loose constraints on how particles can move (e.g. directing particles so that they would make a galaxy shape). Particles can move out of those confines but are less likely to do so. It seems this is just what a macrostate is. And it's not dependent in any way on human interests.
“and luckily so because it would be inconvenient if you entered a room and all the air went to a corner” just made my day
That is so annoying when it happens.
The analogy of "disorder" can work in the right context. You could say: there's all kinds of messy but only one kind of clean. For that reason your room tends towards getting messy if you make any change not specifically targetting cleanliness.
What I like about the example is that it uses the cleanliness as a macroscopic state defined by a human and talks about the associated microstate count.
i see you've never been to a certain college dorm. as determined with mathematical certainty, this was a space of superlative messiness: the Platonic form of disorder, chaos so intense both the gravitational and nuclear forces had all but given up - trash and antitrash not belonging to our dimensions materialized, ionized, annihilated, sublimated, vaporized, crystallized, and floated about the space with such frequency that any external modification to the region did nudge the system towards cleanliness regardless of the actor's intention. the situation passed when a window broke itself in its frustration. the resulting decompression vented the contents of the room (including furniture, wallpaper, rodentia) into the environment.
A single homogeneous pile of ash should be the cleanest state, if it's totally homogeneous then there truly can be only one state..
@@petevenuti7355 This is correct and similar to what I was going to say about her comment that "entropy is why things break down".
Actually, things break apart because usually a "thing" is composed of many things forced together, and they eventually unravel because they don't share the same essence. A house will break down because it's a collection of wood, glue, sand, gypsum, wire, metal etc. But those things on their own do not break down.
except that there isn't one kind of clean, there are literally an infinite number of them
@@LMarti13 If you assume any margin within which two adjacent states would be considered "the same", there's a finite number of states - both "clean" and "dirty". E.g. if you assume that a pair of scissors being moved by less than 1mm constitutes the same state (that is: you quantize the position), the number of states becomes finite.
Importantly, increasing the "precision" doesn't make the number of states infinite - and it doesn't significantly change the proportion of "clean" to "dirty" states. So no matter how small your measurement error is, and unless quantum effects start taking over, the number of "clean" states stays smaller than the number of "dirty" states at any scale.
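To make the counting argument above concrete, here is a minimal Python sketch with made-up numbers (a hypothetical 1 m shelf, a 1 cm "clean" tolerance, five objects). It only illustrates the claim that refining the position grid changes the raw state counts but not the fraction of "clean" arrangements, and hence not the entropy gap in units of k_B.

from math import log

SHELF_LENGTH = 1.0   # metres of shelf (hypothetical)
TOLERANCE = 0.01     # metres an object may sit from its spot and still count as "clean" (hypothetical)
N_OBJECTS = 5        # independent objects on the shelf

for grid in (0.01, 0.001, 0.0001):                 # position quantization, coarse to fine
    slots_total = round(SHELF_LENGTH / grid)        # allowed positions per object
    slots_clean = max(1, round(TOLERANCE / grid))   # positions that still count as "clean"
    clean_fraction = (slots_clean / slots_total) ** N_OBJECTS
    delta_S = N_OBJECTS * (log(slots_total) - log(slots_clean))  # entropy gap in units of k_B
    print(f"grid={grid:g} m  clean fraction={clean_fraction:.2e}  dS/k_B={delta_S:.2f}")

The clean fraction and the entropy gap come out the same for every grid size, which is the commenter's point: the precision of the coarse-graining doesn't rescue the messy room.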
Sabine has just addressed a question that has been tickling me since my teenage years: this notion of pockets of emergent entropy decrease, such as, say, our brains. I'm so grateful to hear her address this topic.
...what
SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics!
Making predictions to track targets, goals & objective is a syntropic process -- teleological.
Teleological physics (syntropy) is dual to non teleological physics (entropy).
Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant.
Mathematicians create new concepts from their perceptions (geometry) all the time!
Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant.
Mathematics (concepts) is dual to physics (measurements, perceptions).
Deductive inference is dual to inductive inference -- Immanuel Kant.
Inference is dual.
The rule of two -- Darth Bane, Sith lord.
"Always two there are" -- Yoda.
Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence.
Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.
Locally you can have decreasing entropy, but the overall entire system remains highly entropic.
@@hyperduality2838 I'm here to remind you to take your pills.
@@Lightning_Lance red pill or blue?
This reminds me of Roger Penrose’s ideas. I think you shared a stage with him at one point. If we change our scale of looking at the universe, in both space *and* time, the almost uniform distribution after 10^100 years (or whatever the number is) becomes recognisably clumpy again. The almost imperceptible effects of gravity speed up and become perceptible again.
I don't think this is possible. If the universe at that point is nothing but a diluted, even gas, there may happen some random "clumping", but there's just not enough matter nearby for anything more than a few particles hooking up.
The most probable state for the universe is a gigantic black hole, which has the largest entropy given by the Bekenstein formula.
@@B33t_R007 cold things contract. The universe is expanding because it's still heating, as evidenced by the high number of stars currently burning. Turn the heat off and the dark matter contracts, as does the distance between all particles and celestial bodies. The crunch will happen as it obviously happened before. Unless you believe the universe poofed into existence out of nothing, which defies all known physics and logic. I think Penrose is brilliant, and wrong about his cyclical universe approach. But I also think Hawking was wrong about BH radiation for a number of reasons.
@@B33t_R007 the point is that there is eventually no matter, just stray photons. but photons move at the speed of light and from their point of view everything happens instantly. penrose claims that this has peculiar consequences. i can't tell you more than that, or vouch for penrose, but you don't seem to be anticipating the situation fully
Once every remaining particle is beyond the event horizon of every other particle, there is no way they can interact again. Expansion of the universe would end up in a zillion one-particle universes.
It's not that entropy cannot decrease; it's that entropy in a closed system cannot decrease. If your box has a window with the sun shining in, it's no longer a closed system. And taking measurements also means it is not a closed system.
I like how complexity emerges somewhere between minimum and maximum entropy :)
Indeed, it is within the inflection point of rate change that new minima can be added.
I have had major depressive episodes because of entropy and the second law. I am deadly serious. Since learning about it in high school physics class, I have always considered it to be my greatest fear. Thank you for easing my fear a bit. 😊
I totally know what you mean.
I like to think life only came about because of entropy because a human that builds a car will generate a lot more entropy than a rock that gets chipped to pieces over time.
If you like SciFi I can suggest a book to you that has this as part of the story.. it's called 'Voyage from Yesteryear' by James P Hogan.
Should give you a positive story to carry around wherever you go.
@@joansparky4439 What about the concept of entropy daunts you so much?
@@PaulElmont-fd1xc also an uplifting story on entropy is "The Last Question" by Isaac Asimov
Sabine's humour and wit are so empowering. Such a wonderful person. Thank you for existing Sabine 😂
She doesn't exist. Our reality is an illusion.
She couldn't help it. Her existence is a high entropy state.
I agree, but you summed it up with far fewer words. Cheers from Canada.
so true 🤣 it's so subtle as it blends in with her education. imagine her being the teacher, she would grab your attention because you'd be like wtf? then listen more. she's a great teacher.
'if you tell this to a random physicist on the street to see if they agree, you should let them go as they've got better things to do' 🤣
I think one of the most amazing examples of low entropy, or localized reversal of entropy, is photosynthesis at the molecular level. That, with amino acids and DNA/RNA, is the basis of life on Earth. Except for life around ocean-floor volcanic vents.
Thank you for making clear to me what entropy is, and as a bonus giving me a solid idea of what Necessity could mean as well.
I have been trying to understand entropy for decades, now I am a little closer to it.
Entropy is simply a lack of data, in a "real" sense though. Information or data is better to say than order, because things being "ordered" is subjective. So entropy happens when you begin to lose information about a system. If something is breaking down, its order or information is also dispersing; therefore its entropy is increasing. In a way entropy could be seen as time, because time moves forward as things spread out or break down. Even the second is defined by counting the oscillations of an atomic transition. These two things are the same though: we measure time by our perception of all systems around us progressing towards higher entropy. So time moves at the frame rate we perceive it, which is again measured, or quantized, by entropy.
I wasn't far enough along in the video; she pretty much says the same thing.
Entropy is simply the rise of equality, which is a result of the calculations nature does to produce the next moment of time. So entropy is simply the result of calculation. It's real, but it's wrongly defined. Entropy is not a measure of disorder; it's a measure of equality. Because once full equality is reached, the universe stops. The formulas still work, but their inputs and outputs become the same, hence there's no change, hence the passage of time makes no difference. That's maximum entropy. Or equality. So, equality is bad 😊
@@cinegraphics No their inputs and outputs don't become the same, only when you're working with an information discarding model of the universe (macrostate).
There is still exactly one microstate leading to exactly one next microstate, and thus each microstate has exactly one preceding microstate, so the universe in terms of microstates cannot advance to a point where the inputs and outputs become the same; as then that final microstate that would be reached would have more than one way for it to be reached: first the last state of the universe in which inputs and outputs were not the same, and second the final state itself where inputs and outputs are the same. But this is in contradiction with the assumption that there is only exactly one microstate leading to exactly one next microstate.
@@snaawflake you're forgetting the rounding errors. At one moment the error level drops below the computation precision and the output becomes same as input. And that's the end of computation. Death of the universe. Stable state has been reached.
"It can only go downhill from here" made me laugh out loud. Another excellent video that teaches us about science. Lovely.
I agree that we can measure location and velocity at the same time in our macrostate.
The reason we can not do the same for a microstate is that we do not have the sterile tools and technology for the microstate.
So we use the next best technique, statistics which reveals the wave function. That is just a tool to interface with the microstate without much interference, which inherently causes ambiguity.
We are just not built to interface with the microstate without causing a disturbance.
We need to devise a fractal technique to zoom down into the microstate to accomplish what we want to do without disturbing the microstate.
Without that, we are just guessing at the characterization of the microstate. And we go down the rabbit hole.
Thank you for your great insight on entropy. This allowed me to grasp the limitations of measurement theory in our macrostate, which is a fractal state of the cosmos.
This is a great philosophical point. I love your subtlety, Sabine.
Sabine is not only intelligent enough to master a complex subject, but also to think in new ways about how the data should be interpreted. It absolutely blew my mind when you explained how macrostates are a relative concept, and not an absolute value that happens to match our anthropocentric view of the universe. Thank you.
I often think about how the anthropocentric view tends to divide ‘reality’ into separate things where there are none .. ie. a kind of fiction that perceives order v chaos, living v non-living, observer v observed etc.. ‘we’ seem to separate ourselves out along with everything else. My hunch is that the energetic solidity we feel in the body gives rise to the appearance of everything else as distinct and knowable.. a kind of ‘specialness’ that has something to do with the apparent body’s drive to conserve low entropy... what we call ‘me’ is only a kind of interpretation and nothing at all in reality.
You should read Asimov's story called "The Last Question". It covers the topic of entropy, and this video explains perfectly the science behind it. The story also follows Sabine's idea about future life forms in a higher-entropy universe. It is one of my favorite Asimov pieces.
Asimov.
@@RobertR3750 oops, you're right. English is not native for me and we spell it with a "z".
I've read that. Very clever story ! I love Asimov.
@@tezzerii me, too! I think I read almost everything he wrote..
As a young boy I became a science fiction devotee, and Asimov was one of my favorites, along with Heinlein and Clarke. I presume you are 😊 from an older generation like me to have these greats on your reading list. It was a seminal experience in my life.
Great video, as usual. Sabine really puts things in perspective with her very humble scientific admission of gaps. Around the 7th minute we can visualize clearly the greatest bias (blindness?) that will make our grandkids cringe in a century: "how could you not understand this emergent order?" To me it feels like it's way past time we start conceptualizing things like "syntropy" and "eutropy", and take Metaphysics seriously again. Restore it to its natural place since Aristotle: the real "theory of everything". Intuition (idealism) is the way we understand things so that science can make strides again. Empiricism depends on it, and is basically a secondary development.
If anyone here has NOT read Isaac Asimov's short-story "The Last Question", now would be a good time.
Also, „Exhalation“ by Ted Chiang.
I read a while ago that they made a tiny membrane out of graphene that could harvest minute amounts of electricity from Brownian motion by vibrating the membrane. That's pretty cool because, as I understand it, it would mean that if you're in a sealed sphere that doesn't lose energy, you could in theory produce energy out of seemingly nothing by cooling the air to convert the heat to electricity, which in turn would do useful things before inevitably heating the air again.
Sounds a lot like the molecular ratchet or rectifying thermal noise sort of ideas. So far none have worked.
We actually don’t need a closed system to make accurate predictions with thermo though. You can derive how a system in contact with a thermal reservoir will act by treating the system and reservoir as a closed system. You’ll still see that perpetual motion machines like Brownian ratchets are impossible with heat exchange
You're not going to power much from Brownian motion.
Duh... You're vibrating a membrane and expecting electricity. Where do you think the energy to vibrate the membrane came from - the vacuum?
I just checked a presentation from Paul Thibado, "Charging Capacitors Using Graphene Fluctuations", where he explains in a nutshell that the energy in his experiment is gathered by the diodes, not the graphene. The graphene acts like a variable capacitor that doesn't need energy input but is at equilibrium. From his own claims it doesn't break the 2nd law of thermodynamics, even though I don't fully get the explanation.
That was one of the best lectures although it is hard to compare lectures. It's maybe the first time entropy makes sense even though I have been reading about it all my life. I also like your dry physics humor. One needs humor in life. So thank you very much.
At 14:25, "I believe as the Universe gets older ... new systems will emerge" while we observe "local entropy": absolutely, in a never-ending dance! And we are starting to have clues, such as the pseudo "dark energy" and galaxies that look too old to belong to our own Big Bang! I dig into these questions a c-l-i-c-k away!
I think many of my (undergraduate) students would get a laugh (or a cry) out of the phrase, '...You don't need to know that much maths, just differential equations and probabilities...'
linear algebra and complex number theory be damned!
There's not a single thing about Special Relativity that Einstein couldn't have explained to Archimedes. Einstein would probably take a detour into explaining the modern notation of algebra. But this wouldn't really _be_ algebra, for the same reason that many undergraduates in computer science can barely distinguish a formula from an equation.
Archimedes: This algebra thing is cool! I wonder what else you can do with it.
Einstein: Well, I did once take a year-long detour into the deep technical weeds of Ricci tensors.
Archimedes: Excellent! Please explain.
[Archimedes smooths out some complex geometric diagram in the sand.]
Einstein: Uh, okay, but we're going to need a _much_ bigger beach.
My college students would laugh at "half the box being filled with particles" as being a low entropy state as they would understand that this is just one state that's counted in the entropy of the system.
Since that implies you have physics students, do you not see anything wrong with her understanding of thermodynamics?
@@johnzander9990 But it is clearly a lower-entropy state relative to being fully spread out in the box. Just by halving the accessible volume you reduce the number of positional microstates by a factor of 2^N for N particles. Further, it is clearly not an equilibrium state and not a state that can last long.
@@johnzander9990 Sorry, I should add that your understanding of entropy is a bit concerning. The state of the box half filled is the macrostate. It is not one of the contributing microstates to some other macrostate, as your statement implies. E.g. you could instead have the macrostate be that all molecules are somewhere in the box and *then* consider all on one side as a contributing microstate to that macrostate.
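As a quick numerical check of that factor (a sketch with hypothetical numbers, not anything from the video): for N non-interacting particles, restricting each one to half the volume divides the number of positional microstates by 2^N, so the entropy drop is N·k_B·ln 2 rather than a factor of two.

from math import log

N = 10                 # number of particles (hypothetical small example)
CELLS_FULL = 100       # coarse-grained position cells in the full box (hypothetical)
CELLS_HALF = CELLS_FULL // 2

omega_full = CELLS_FULL ** N      # positional microstates with the whole box available
omega_half = CELLS_HALF ** N      # positional microstates with the gas confined to one half
ratio = omega_full // omega_half  # = 2**N, not 2
delta_S = N * log(2)              # entropy difference in units of k_B

print(f"microstate ratio = {ratio} (2^N = {2**N}),  dS/k_B = {delta_S:.2f}")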
Perfect timing. I am reading The Order of Time by Carlo Rovelli. And the low entropy Google algorithm sends me right here. Always enjoy your explanations. Well worth my subscription.
I used to hate science in school but you make everything easy to understand, fun and interesting! I love watching your channel!
My assessment of the reason behind the confusion that seems to permeate discussions about entropy is that in the act of representing entropy as a thing, our focus is drawn to it as such. Yet it is more like the negative space *around* objects, rather than a thing itself. Entropy is an effect of a process. The most succinct description I have heard (for those of us who may not readily be able to work with logs and Boltzmann's constant) is that "Entropy is the tendency of energy to go from concentrated to diffuse." Every single event since the beginning of the universe (assuming it had one) can be described as follows: "Energy changing from one concentrated form to a different, more diffuse form." Everything from our planet's geology, magnetosphere, biosphere, including our bodies & their metabolic processes, brain activity, almost any form of technology we invent and use... all of it merely capitalizes on complications of this eons-long cosmic slide down the entropic decline. We are all living in the aftermath of an explosion; embers in the dark.
And what to do with it?
Accelerationism?
If we crack fusion and are able to create low-entropy objects in perpetuity (and maybe even black hole generators), how does that affect the progression towards heat death?
Schrodinger wrote a great book near the end of his life titled "What is Life?" that touched on entropy and order, but it always confused me. This video helped to clear a lot up! Thanks Sabine :)
black holes are FICTION
nor do we live in a gravity based universe
Hi Sabine, I think this is imo one of your best physics presentations yet. Explaining entropy and how it applies to pretty much all aspects of nature. Thanks.
You know, it's not as if she made a conscious choice to be conceived, and then to pop out of her mom's uterus for the sake of her future YouTube audience... 😂 You could thank her for making a YouTube channel, but the caveat is that she isn't a believer in Free Will (and in Agency by extension, at least not in the classical sense) so...🤔
@@hyperduality2838 Really?
I remember the days when nothing Sabine is saying would make sense. Now it all makes sense. I've been watching her videos for so long.
I feel exactly the same! What an amazing resource for learning this platform is, i feel so lucky to be able to learn from geniuses like Sabine
Her videos start off simple and gradually get more complex. The half way point is enough for me.
Agreed. She's personable, but not very good at explaining things.
As a total layman, I love the way you manage to dumb down any subject to my level without actually dumbing it down. I wish you'd been my physics teacher when I was at school!
However, you did give me one microsecond of pure existential dread when (I thought) you said that the heat death of the universe would occur in ten to one hundred years. Aaaaaaaaghhh! Oh, no, that's ten to THE one hundred years. Panic over. Now I need to find some clean trousers.....
🤣🤣🤣🤣
Great video! But a tiny correction: high entropy means high information, not low information. The more random something is, the less succinctly it can be described, at least for Shannon entropy. For instance, imagine water in its solid form with molecules neatly aligned: it can be described succinctly as a repeating pattern, whereas in its (high-entropy) gaseous form each particle can be anywhere in the volume that contains it; describing the microstate would require describing the position of each molecule (high information).
That's a nice description - just watching the vid, but based on what you say I wonder if this video can do better than that concise description. Maybe the simpler the explanation, the more encompassing it is also? Entropy feels like a description tending towards that... hence its ubiquity/applicability.
Ya... I think one has to be careful when analyzing a physical example using Shannon. In physics it's about information known (or information that can potentially be known). It's not conceivable that we could know very much about the micro-states of gas molecules, yet we have a lot of information about the micro-states of ice. Thus, in the context of the ice and gas example you gave, it's kind of the opposite of what you said. There is a subtle connection between the two concepts of entropy. A good paper is by Jaynes, if you want to read that. Sean Carroll also writes about this too.
But given that entropy tends to increase, does that mean the amount of information tends to increase or decrease?
Consider the case of a gas confined to one half of a container by a barrier. Then the barrier is opened and the gas escapes to fill both halves of the container. Do we have more or less information about the positions of the gas particles than we did before?
This bothered me too. Maybe Sabine meant information in the colloquial sense, not how it's defined in Information Theory. Either way it's confusing.
You are perfectly right and she got it wrong. Information and entropy are perfectly positively correlated, not negatively.
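Since the thread keeps circling this, here is a tiny sketch (hypothetical distributions, Shannon's H = -Σ p·log2 p) of what the parent comment means: a spread-out, "gas-like" distribution over positions has higher entropy, i.e. on average more bits are needed to pin down where the particle actually is, than with a sharply peaked, "ice-like" one.

from math import log2

def shannon_entropy(probs):
    # H = -sum p * log2(p), skipping zero-probability outcomes
    return -sum(p * log2(p) for p in probs if p > 0)

n_cells = 16
gas_like = [1 / n_cells] * n_cells                           # particle equally likely to be in any cell
ice_like = [0.97] + [0.03 / (n_cells - 1)] * (n_cells - 1)   # particle almost certainly in cell 0

print(f"spread-out distribution: H = {shannon_entropy(gas_like):.2f} bits")   # 4.00 bits
print(f"peaked distribution:     H = {shannon_entropy(ice_like):.2f} bits")   # roughly 0.3 bits

Whether that counts as "more information" or "less information" is exactly the terminology dispute above: the high-entropy case needs more bits to specify a microstate, while the low-entropy case is the one about which we already know more.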
How did Sabine know I was about to ask, “But what about Quantum Mechanics?”
Spooky action at a distance at work?
How satisfying to hear that the reference to order in explanations of entropy is nonsense.
The connection to information and macro/micro states is another source of confusion I'd say, as it can introduce the role of observers and their subjective view into the definition.
This a great explanation of entropy. I loved when Sabine said that entropy has nothing to do with order.
Also, that entropy depends on the definition of macro states, which are defined by the observers, us.
Sorry sorry! I was gone for so many hours. Of course we can talk. On what topic?
But in the bigger picture we're talking about a battle between a system's tendency to disorder vs order. The 2nd Law states all physical systems tend to a disordered equilibrium. This in itself is problematic, as a perfect equilibrium would be well ordered. A gas at PERFECT equilibrium in a container would be perfectly evenly spread with its molecules at rest. Quantum fuzz may prevent this perfectly ordered system, and scientists consider a gas where all the molecules have the same velocity to be in equilibrium too, even if they are moving about 'randomly'.
--
The question is, has the universe become more or less ordered over time? MUCH MORE is the answer, not less. That the basics of the 2nd Law of Thermodynamics hold for MOST closed systems is the best we can honestly say, if you ask me, which you didn't! To say it holds for the whole universe, be it finite or infinite, would be a leap of absurd faith.
This has become my favorite stream of yours. Never thought entropy could be explained in such simple terms. Also, the past hypothesis may be falsifiable with how you designed it. Human life expectancy has been increasing in time. Inspirational.
Fascinating video and very well explained; entropy has always been one of these quantities I've found difficult to define or understand. It has mostly been explained to me in the concept of order (or just plain equations), and those mostly just made me go ‘oh that makes sense... wait.’
Minor note, Boltzmann has two n's. 😅
I have to say: her jokes are rather as you would expect them to be from a German
I disagree, as she is making jokes, and they appear to be at random as opposed to at scheduled times.
I absolutely love this discussion! Everything is entropy. However, I always wonder about entropy and infinity and the implications. If a system and its microstates become open at some point, will this not allow a low-entropy influx just as the divider would? It would introduce new information for sure. Our perception seems to be a factor in this riddle, and perhaps the microstate (the system) is actually consciousness and will remain a stable charge intertwining between infinite dimensional states. Perhaps "thought" is the only real thing.
As a former physicist now making my way into philosophy of science, what you said in the "How will the universe end" part is the most satisfying take on entropy I've ever heard from a physicist. I've been wondering about precisely the fact that "macrostate" is subjectively defined, and so entropy increase must not be that fundamental, but I never heard of any physicist going down that rabbit hole. Thank you for that! You must have at least some interest in philosophy of physics to be that great on these questions :)
If you view the universe as being in a single quantum state, then a macrostate is defined as the possible result of the expectation of this state on some Hamiltonian. The space of Hamiltonians defines the space of macrostates. I do not believe this to be subjectively defined.
@@chippyprice1993 That is interesting, so in this regard only the entropy of the whole universe increases through time because other macrostates are ill-defined, right ?
@@graine7929 I'm not sure I understand what you mean by other macrostates are ill defined. You mean partial macrostates? Like ones that apply to a part of the universe?
A video on the limitations of scientism ruclips.net/video/zFqj_NEbfgY/видео.html
What created the low entropy at the beginning of the universe?
So much food for thought. One of your best yet Sabine.
Thank you for so clearly stating what I've always fundamentally thought about entropy. Of course I will mention Dr. Asimov's famous story "The Last Question" which so firmly planted this idea in my mind. Life is truly the universe's riposte to the dismal 2nd law of thermodynamics. "Let there be light..."
According to entropy at some point we're going to come back to this video and say "well this didn't age well" 😉
That'd be Poincare's recurrence theorem which only applies to classical closed systems.
@@elinope4745 Not quite.
I think this video is already the response to the sentiment of 30 years ago
I love that you started this video talking about biology. It’s not often addressed when it comes to entropy, but life is the exact reason I’ve been skeptical about the universe heat death prediction. Life has a way of swimming upstream, cleaning up messes, and finding energy.
Entropy is also addressed and very real in biology. Cells locally decrease entropy (by organizing themselves into complex structures etc.) but they can only do so by increasing entropy around them (absorbing nutrients and releasing wastes, breaking down molecules etc.). When you study lifeforms, you realize there is no "trick" and no escape from entropy there.
All in all the reality is that without free available energy (work), every complex life form ultimately dies. Once free energy in a closed system is spent it's over, period. And one could argue that the universe is such a system.
well, the argument goes like this: entropy of the whole system increases over time (/ doesn't decrease). So looking at the biological organisms by themselves is a mistake. If you take into account everything they (we) do, the entropy actually goes up. As the video says: we increase the entropy of the sun, fossil fuels etc. to decrease the entropy of others. However if you calculate entropy for, say, the solar system as a whole, it increased as a result.
There's even an interesting philosophical point here: you could say that the locally low entropy that biological systems represent is a good way to maximize the increase(!) of overall entropy - because through our actions we alter the environment in more sophisticated ways, increasing the overall entropy of everything around us through the search for usable energy (which, as Sabine stated, means local low-entropy reservoirs).
I'm not sure I agree with this line of reasoning, I'm just saying that there's more to biology vs entropy than meets the eye
But that only happens at the expense of increasing entropy elsewhere: mostly in the Sun, but some, like the life around undersea 'black smokers,' at the expense of nuclear decay of materials in Earth's crust (and *perhaps* also tidal flexing in some planetary moons, keeping interior water liquid).
However all stars have finite resources for fusion (iron-56 and nickel-56 are the end of the line, if not sooner), nuclear decay eventually reaches its lowest state, and tidal flexing goes away as those orbits tend to circularize and 'flexing' stops (and over *seriously* long times, orbits decrease via gravitational radiation).
It's true that life is a constant *struggle against* entropy, though.
Life (once it exists, and especially in prokaryotic form) is pretty good at avoiding extinction, but it's pretty bad at breaking natural laws.
So, sorry pal, heat death is the absolute end (and very likely the last living organism will have died a lot earlier than that).
@@TheBouli
Once heat death has been reached there is no way of extracting energy from any source in it to fuel life (and all living beings are parasites that thrive on the increase of entropy, that is the availability of an external energy source). Trying to extract energy from a closed max entropy system is like trying to extract zero point energy from an object. It won't work. It's just physically impossible.
Chaos Theory always seemed to go against entropy, but I never really got far enough into either. Chaos Theory was the study of order arising from a chaotic environment. The example I remember is cars and red lights: even if the lights, roads and cars are random, the time to drive from point to point tends to settle close to an average that emerges naturally. Order emerging from a high-entropy state. I guess it could go with the information thing: when we don't know all the microstates, order can emerge from chaotic interactions.
Yep. All the work done in non-equilibrium thermodynamics is ignored by physics.
Yeah, but as far as I can tell, in most chaotic systems the general trend is to create and utilize order to produce more chaos. For example, black holes, or humans, tend to turn all kinds of stuff into plain heat (useless energy), and we are highly ordered ourselves - a product of a chaotic nature and universe. I don't know if we can reach the sci-fi level where we can break material into useless energy better than a black hole itself. So in the big picture, chaos theory doesn't seem to go against entropy.
In fact, we can say life exists to "accelerate" entropy.
entropy = -emergence
I liked this video because it resonated with my views on entropy that I gained after reading the inspiring article by Myron Tribus and Edward C. McIrvine, "Energy and Information", Scientific American, Vol. 225, No. 3 (September 1971), pp. 179-190. I strongly recommend that paper, because it provides an insightful complement and an additional explanation and formalization of the points raised by Sabine.
Following Shannon's original proposal, lucidly explained by Tribus and McIrvine, entropy is defined as the degree of uncertainty, X, that one has about a certain question Q. A question can be "In which part of the box is the particle, left or right?". Complete uncertainty about that question means that there is equal probability, p_1 = p_2 = 1/2, for the particle being in the left or the right part of the box. The entropy, defined as S(Q|X) = -\sum p_i ln p_i, is then S(Q|X) = -ln(1/2) = ln 2. On the contrary, if one knows the answer, namely that the particle is in the left part (denoted 1) and not in the right part (denoted 2), then p_1 = 1 and p_2 = 0. The entropy is then S(Q|X) = 0, which means that there is no uncertainty about the question Q. This is in contrast with S(Q|X) = ln 2, which is the maximal ignorance about the question Q. The authors then remark: "The only state of greater ignorance is not to know Q." And not knowing Q, or even worse, not even being aware of the fact that one must bring Q into the definition of entropy, is a source of all the confusion and all the difficulties people have in understanding the concept of entropy. It was great that Sabine revealed, in her own wonderful and easily graspable way, this important point to the general public.
In the article cited above, the simple model with two possible answers that I gave to illustrate the idea was generalized to the cases of generic questions and any number of possible answers. Information was defined as the difference between two uncertainties, I = S(Q|X) - S(Q|X'). The relation to the thermodynamic entropy was laid out. Again, a fabulous, inspiring, highly cited paper that removes the longstanding "mystery" and confusion.
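For anyone who wants to check the two-answer example above numerically, here is a small sketch of S(Q|X) = -Σ p_i ln p_i (natural log, as in the comment):

from math import log

def S(probs):
    # S(Q|X) = -sum p_i * ln(p_i), skipping answers with p_i = 0
    return -sum(p * log(p) for p in probs if p > 0)

print(S([0.5, 0.5]))   # complete uncertainty about "left or right?" -> ln 2, about 0.693
print(S([1.0, 0.0]))   # answer known: particle is in the left half  -> 0.0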
@kaboomboom5967 You would certainly understand the Scientific American paper "Energy and Information" which is very clear and readable, written for general audience.
Can you do a vid discussing your take on the recently proposed “Second Law of Quantum Complexity?” I am both fascinated by it and would love to hear where you stand on its proposal and maths. Thanks for all you do Sabine!
The interplay between life (makes order from disorder) and the 2nd Law of Thermodynamics (expects disorder from order) is very interesting. The way life fights the losing battle of ageing is by being multi-generational, a reset to a low entropy state. It then follows that life is fundamentally about paying it forward with the older generation being devoted to the success of the younger one.
A very enlightening observation to make. Touché.
@@jzemens4646 Furthermore, the constant battle against the 2nd Law of Thermodynamics is why societies evolve to more enlightened states, not uniformly or monotonically, but inevitably. This is because every generation has to create sufficient new information to overcome the entropy increase. Those societies that are unable to do so eventually perish. Very few societies can maintain an exact balance, by neither advancing nor going backwards. The rest, utilising the culture and the fruits of science build more information than entropy increase destroys and arrive at a better place from one generation to the next.
but it isn't order from disorder. The sun literally provides more energy to the surface of the planet than is required by the life on it. The 2nd law only applies to a closed system, and the earth is by no means a closed system.
All life radiates heat therefore functions as an entropy machine. The "order" of life is emergent from the 2nd law of themodynamics because it functions to increase entropy (ie spreads out energy.) One could cite climate change as proof of this!
Entropy is much more concrete in the case of genetics than heat. Think of the landscape model that virus physicist Peter Schuster describes in explaining the natural algorithm that allowed proteins to evolve into the efficient, optimal folding machines that they are. A similar landscape model can be used to describe the universe of all possible gene configurations and organize them by the genetic differences (as distances) between them. For organisms to evolve, their gene structures must migrate across generations through this landscape. The law of entropy applies to this landscape because fewer gene arrangements produce successful complex life forms than produce successful simple life forms. And, likewise, gene arrangements that produce successful simple life forms are outnumbered by those that produce unsuccessful life. We know this based on entropy. Within this landscape may be more than 10⁶⁰⁰ possible arrangements. However many, it is a number so big that not even a smidgen of them could have occurred across 4 billion years, even if every organism that ever existed in that time frame was genetically unique. Yet, in defiance of the static entropy characteristics of the genetics landscape mapping all life, gene configurations managed to randomly migrate to ones that produce ever increasingly complex life forms. All I can say is bull dung; it didn't happen. Do the math. A natural selection genetic algorithm is not sufficient to solve such complex problems, approximately optimizing dozens and dozens of interrelated phenotype systems in single organisms. Bull dung. No algorithm could solve such a search problem.
I love you Sabine. You explain physics so well. I never understood entropy.
I didn't understand entropy until I had a two-year-old . . .
I always think of it as the reason my soup cools down enough to eat.
Then lower entropy (regarding RHEOSTASIS) of a battery with a strong magnet, for example 👍:)
I'm just an amateur physics enjoyer, but this is a lot more satisfying as an explanation.
"This was the most optimistic and uplifting video I have ever done and can ever do. It can only go downhill from here."
Großartig!
You have many great videos ahead of you
11:36 Sabine you are wrong. High entropy indeed corresponds to high information content, as per Claude Shannon's definition in information theory. The reasoning behind this is that entropy essentially quantifies the level of uncertainty or randomness within a set of data. The more uncertain or unpredictable the data, the more information it contains. For reference: "en.wikipedia.org/wiki/Entropy_(information_theory)"
If a highly likely (High entropy) event occurs, the message carries very little information. On the other hand, if a highly unlikely (Low entropy) event occurs, the message is much more informative.
This is exactly what Sabine is saying.
@@Darenimo You twisted some definitions in your comment. Entropy IS average information. If a high-entropy message occurs, then the message carries lots of information. High Entropy is NOT the same as "highly likely". This holds for information theory AND statistical mechanics!
@@TheCreativeKruemel It's not that anyone here is wrong; it's just that the definitions in stat mech are all screwed up, and always have been. Physicists have been arguing about thermodynamics for goddamn years because of the way it was formulated.
First, you have to remember what information really is, and what observer the "message" is being viewed from.
If you read the message
000110010101
This tells you some kind of information, but it turns out that this message was just the macro-state description of a more detailed underlying message
00110000 00110000 00110000 00110001 00110001 00110000 00110000 00110001 00110000 00110001 00110000 00110001
When you compare these two, their information content is the same (binary expansion of a binary set of numbers, where 00110000 is 0 and 00110001 is 1). The second clearly has more bits, because it's obviously longer and can describe a lot more states (higher Shannon entropy). But also under Shannon, the surprise of the two messages is equivalent... these messages have the same amount of surprise, and therefore the same "Shannon" information.
Additionally, and by contrast, randomness is not well defined. Consider the seemingly random string of 0's and 1's again: 000110010101... it contains regularity, and in fact, for any arbitrarily large string you will never find a truly random sequence; there will always be patterns and repetition in it... 000, 11, 00, 0101, 1010... these are patterns embedded in the seemingly "random" string, and I'd challenge anyone to try to create a truly random string of 0's and 1's. The only time one can is when each bit is unique, at which point the idea of "surprise" gets thrown out the window, because every state would be a surprise if all of its bits are unique (and this is the true extension of what Sabine said, just from an information-theoretic viewpoint).
Cheers,
For example, if you have an airtight box on your desk and you request information about the contents of the box, you could get a high-entropy response, letting you know all the possible configurations of atoms and molecules that could exist in the box.
Or you could get a low entropy message, telling you it contains mustard gas and opt to not open it. And you'd probably find you gained more information from the latter message than the former, even if the former had a higher information content.
@@Darenimo Hey good point :)!
The low entropy response, saying "it contains mustard gas", is less detailed but more straightforward for us to understand.
However, 'information' here doesn't just mean what WE can understand easily (very important!). It includes ALL possible details, even very complex ones (a key idea in Shannon's original paper).
In principle it could also be possible to use data compression and character encoding on the high-entropy message to reduce it to a message we humans can read :).
The important thing to understand is that the high entropy message (about all configurations in the box) contains ALL the information of the low entropy message!
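One way to keep the thread's terms straight, as a rough sketch: the surprisal of a single event is -log2 p(event), while (Shannon) entropy is the average surprisal of the source. Rare events carry more surprisal, but the source with evenly spread probabilities has the higher entropy.

from math import log2

def surprisal(p):
    # information content of one event, in bits
    return -log2(p)

def entropy(probs):
    # average surprisal of the source, in bits
    return sum(p * surprisal(p) for p in probs if p > 0)

# a single rare event is more informative than a likely one
print(surprisal(0.99), surprisal(0.01))             # about 0.014 bits vs about 6.64 bits

# but the source with evenly spread probabilities has the higher entropy
print(entropy([0.99, 0.01]), entropy([0.5, 0.5]))   # about 0.08 bits vs 1.0 bit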
Probably one of the best explanations of entropy. Thanks for the video.
In my book "Evolution: "Von Mikroorganismen zu multinationalen Konzernen - 3. Auflage" (only Amazon) I came to somewhat different conclusions. This book is also based on the idea that the increase of entropy means a loss of information (see Murray Gell-Mann). However, when discussing life, physicists practically always talk about thermodynamic entropy (see Schrödinger). In my book I claimed instead that living beings have competences (knowledge etc.) towards their habitat in order to gain resources (especially energy) from it. However, they are primarily concerned with not losing information (which has to be stored physically somewhere). In other words: living beings behave in a loss-of-competence averse way. And this is then also the basic assumption of the competence-based theory of evolution, which is substantiated in my book. Unlike Schrödinger's approach, it can also be used to justify reproduction. Schrödinger's approach can only explain self-preservation of living beings, not reproduction.
The competence-based theory of evolution is a generalization of the original Darwinian theory of evolution (which unfortunately has no physical foundation). However, it contradicts the theory of selfish genes and the total fitness theory.
In my book I of course also deal with the question of whether at some point there will be no more life in the universe. In my opinion there won't be, because life needs competences, and these competences (knowledge) must be physically held somewhere (DNA, brains, hard drives, etc.) as extremely improbable states. And they must be reproduced permanently, because the entropy law applies. So high-quality energy is constantly needed to reproduce the competencies (so that high-quality energy can still be obtained in the future). And I am afraid that at some point this high-quality energy will no longer exist in sufficient quantity.
My book is in German and quite difficult. It also covers the evolution of man and his societies, which Darwin cannot even begin to do. Biologists are often of the opinion that man is just another animal. My book, however, very clearly concludes that man has emerged from the animal kingdom. Basically, human beings have acquired more competencies than the rest of the animal kingdom put together. Man is capable of division of competencies, other animals need division of species for that.
I hope to translate the book into English via ChatGPT and/or DeepL by the end of the year, until then only reading in German remains.
I'd appreciate your book in English; it looks very interesting to study and to get to know your view.
Information cannot get lost (law of the conservation of quantum information). An increase in entropy does not mean a decrease in information, but an increase. That's the basis of information theory. If your book is based upon the idea that an increase of entropy means a loss of information, then it's built upon a flawed premise.
@@noneofyourbusiness-qd7xi Unfortunately this is wrong. I recommend a Google search for "information loss entropy". Murray Gell-Mann explains the connection very well in his book: "The Quark and the Jaguar: Adventures in the Simple and the Complex".
His explanation goes, roughly, like this: If all particles in a gas are in a tiny quadrant, then we know not only the macrostate of the gas, but also its microstate quite precisely. In this state, the entropy of the gas is very low. However, when the particles then disperse again, we lose much of the knowledge about the microstate of the gas. Thus, with the increase in entropy, we experience a loss of information (or knowledge about the microstate).
The situation is similar for living beings in their environment. The better adapted they are to the environment, the greater their knowledge about the environment (Popper). One can also say: the greater their competences towards the environment. If they lose a part of their adaptation, they lose a part of their knowledge about the environment. Schrödinger looked at the events from the thermodynamics of a living being. I think this is problematic, especially since evolution is about information processing (even genes are subject to information processing). It is therefore more advantageous to look at the events from an information-theoretical point of view and to understand an increase in entropy as a loss of competence.
And by the way: buildings decay with time. Living things decay in no time at all when they lose their life. But we also forget things, for example names. The older you get, the easier it is to forget. You lose information. Did it really never occur to you that this is the same mechanism?
From what I learned about the laws of thermodynamics, great stress was placed on the concepts of OPEN vs CLOSED vs ISOLATED systems. We were taught that the potential function which determines the overall direction of change in an ISOLATED system is entropy increase. There can only be ONE isolated system, the universe. So my understanding was that as the universe evolves its total entropy increases. You seem to be saying that this is incorrect. That the total entropy of the universe can stay the same. Also my partial understanding of what you are saying, (as a non-expert), is really to do with open and closed systems within the isolated system of the universe. So that even when the total entropy of the universe as a whole has reached a maximum (and where does it stop?), you can have closed systems in which the entropy is locally decreased at the expense of the surroundings.
A vacuum Dewar is a pretty good short-term approximation of an isolated system (at least on chemical reaction time scales).
And if the accelerating expansion of the "universe" that our astronomical observations indicate is true, then we are all trapped in segments of the "universe" by an effective event horizon defining our observable universe.
A sort of relativistic Dewar. Yours will be only slightly different from mine, but researchers 100 billion light years from us are effectively in a completely different isolated system.
This is my understanding too. There's a theory that dark energy is proportional to the entropy in the universe as a system; specifically, the area of the boundary expands with the total entropy within the universe. It's speculated this also applies to black hole event horizons.
You're right. Sabine's statement that the entropy of the universe can remain the same is utterly erroneous. It ALWAYS increases, regardless of what number of microstates you might be looking at. That it must increase with every physicochemical process is a corollary of the 2nd law of thermodynamics. And this has not changed (overall) since Boltzmann.
For an isolated system at equilibrium, there's the approximation/heuristic that all microstates are equally likely. So, defining macrostates in terms of subsets of the total available microstates in the system then allows calculating probabilities for the macrostates. And defining entropy as the (log of) the number of microstates comprising the macrostates makes sense. But, for systems that are not isolated, or that are not at equilibrium, not all microstates are equally likely. So then I'm less certain that entropy can be defined as a simple count of microstates.
@@WestleySherman True, but entropy is (minus) the sum over microstates of the individual probability P_i times log P_i. P_i can be different for each microstate that is part of a macrostate.
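To spell out that last point, here is a minimal sketch of the general (Gibbs/Shannon) form S = -k_B Σ p_i ln p_i, which reduces to S = k_B ln Ω when every microstate in the macrostate is equally likely, and comes out lower when the probabilities are unequal.

from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs, k=K_B):
    # S = -k * sum p_i * ln(p_i), valid whether or not the p_i are equal
    return -k * sum(p * log(p) for p in probs if p > 0)

omega = 4
uniform = [1 / omega] * omega    # all four microstates equally likely
skewed  = [0.7, 0.1, 0.1, 0.1]   # same microstates, unequal probabilities

print(gibbs_entropy(uniform) / K_B)   # = ln(4), about 1.386, matching S = k_B ln(Omega)
print(gibbs_entropy(skewed) / K_B)    # about 0.94, lower than ln(4)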
This is truly your best video IMO. Thank you Sabine! Finally a happy ending, from which, as you said, everything will have to go downhill 😂
Veritasium's video on entropy is really interesting and maybe aligns with the "uplifting" takeaway here. The universe doesn't just tend towards higher entropy; it also tends towards faster entropy gain. Life as we know it is locally the fastest entropy-increasing mechanism, and as entropy further increases, other life-like mechanisms (that may not be recognizable to us) should be likely, because the universe tends towards accelerating entropy growth.