I don't believe the 2nd law of thermodynamics. (The most uplifting video I'll ever make.)

  • Published: Nov 24, 2024

Comments • 6K

  • @MrEiht
    @MrEiht 1 year ago +1838

    As Abba said: "Entropy killed the radio star..."

    • @chris.hinsley
      @chris.hinsley 1 year ago +99

      Come on Sabine ! Do another techno pop song ! Please.

    • @nagualdesign
      @nagualdesign 1 year ago +331

      Abba? I think you mean The Buggles.

    • @MagnumInnominandum
      @MagnumInnominandum 1 year ago +26

      That was Jefferson No Ship...😮

    • @chris.hinsley
      @chris.hinsley 1 year ago +14

      Don’t get me wrong I love the physics. But “Catching Light” was the bomb !

    • @jorriffhdhtrsegg
      @jorriffhdhtrsegg 1 year ago +41

      Abba surely not

  • @mikebarushok5361
    @mikebarushok5361 1 year ago +2612

    I have witnessed water running upwards from a ditch onto a road then becoming airborne and creating a cloud. But, it was because of a tornado passing near to my house and I hope never to see anything like that again.

  • @DrAndrewSteele
    @DrAndrewSteele 1 year ago +397

    Aging biologist here! I’d like to add an uplifting comment to this uplifting video. :)
    I’m always a bit sad to see entropy given as the reason we get old. As Sabine discusses later in the video, open systems with access to a source of low entropy can use that to decrease their own entropy, so given that we can take in low-entropy food, there’s no in-principle reason we couldn’t use this to keep the entropy of our bodies roughly constant with time. So not aging is totally allowed by the laws of physics!
    It’s even well within the laws of biology; there are plenty of animals that don’t age, from tiny hydra to giant tortoises, and even one of nature’s most beautiful animals, the naked mole-rat. Their risk of death and disease doesn’t change with time, which basically means they keep their entropy constant throughout their adult lives.
    Now all we need to do is crack this biological entropy preservation using science…but that’s another story!

    • @elio7610
      @elio7610 1 year ago +23

      Unfortunately, we already have an overpopulated Earth, so preventing people from aging is going to exacerbate the issue. I would not be against preventing aging, though; even if my life were no longer than normal, a life without aging is far more enjoyable than living with a constant gradual decline.

    • @DrAndrewSteele
      @DrAndrewSteele 1 year ago +46

      @@elio7610 If you’re worried about overpopulation, I made a video about exactly that which you might enjoy :)

    • @edcunion
      @edcunion 1 year ago +3

      Planaria?

    • @TitanOfClash
      @TitanOfClash 1 year ago +28

      Wow, I can't believe that I ever espoused that exact view. When you put it like that, entropy as a reason for aging makes no sense. Thank you for ridding me of that misunderstanding.

    • @DrAndrewSteele
      @DrAndrewSteele 1 year ago +2

      @@edcunion I’m not sure if we’ve got any lifespan data on them (happy to be corrected!), but given their regenerative powers I’d not be surprised!

  • @ToddPangburn
    @ToddPangburn 1 year ago +407

    THANK YOU for pushing back against entropy being described as order vs. disorder! Through years of schooling entropy was this poorly defined, almost spooky concept of order. Then I was finally introduced to entropy as probabilities of microstates (with gas in a box), and it was completely logical and clear.

    • @billschlafly4107
      @billschlafly4107 1 year ago +22

      Entropy is to heat transfer what friction is to an object in motion. Entropy reduces the available energy in a system just like friction. Order vs disorder isn't useful at all IMO and only serves to cloud what's happening.

    • @sjoerdthabozz
      @sjoerdthabozz 1 year ago +28

      Agree. Order is a matter of human opinion. Don’t think nature cares.

    • @xxportalxx.
      @xxportalxx. 1 year ago +32

      I think they're synonymous ways of talking about it, it's just that order as a concept has more to unpack to get to the point. Order means that there are rules that limit the number of microstates. A bookshelf ordered alphabetically has very few allowed microstates (defining the microstates as the books' arrangement), a disordered bookshelf on the other hand would have as many microstates as there are ways to arrange the books.

    • @blueckaym
      @blueckaym 1 year ago +6

      I agree! I've always found the order-based and statistical descriptions of entropy to be fundamentally wrong.
      And if you look, you can see this approach in many other fields of science (of SCIENCE! Which is supposed to be OBJECTIVE :))
      The trouble is scientists are not always objective - actually everyone is at least a little subjective at times, due to our limited resources (attention and time, at the least).
      But this is how science generally works: you observe a phenomenon, you describe it somehow, you test whether your description lets you predict the same phenomenon in the near future; if your prediction fails (or is imprecise) you improve your description until you get accurate enough predictions ... and at that point you KNOW that your best description is the CAUSE of your prediction ... so it's easy to miss the fact that the Universe doesn't give a rat's ass about your descriptions :)
      It's just one of our many biases - one that many scientists also fall for.
      Ultimately our descriptions might become so perfect that they fit reality 100%, and then it would be only a philosophical game to distinguish between the two ... but even with our most precise sciences we're not there yet.
      But people (and esp. scientists) love to believe that the theories they learn (and especially the ones they make) are close to perfect, and so that bias becomes really, really strong!
      Have you seen a scientist trying to explain some phenomenon by stating an equation and ending with something like "and as the equation tells us, the object has to do this and that"?
      Well, sure, if your theory and equations are perfect you'll get it right ... except for the understanding part!
      Very few such descriptions actually explain anything, and it's exactly because they skip over the actual forces and treat the description as the cause.
      In that case, if your description is, for example, statistical (which many sciences use today ... particularly QM), then it's really easy to believe that probabilities and statistics are the CAUSE of things like entropy!
      ... but this is still fundamentally wrong of course! :)

    • @Bryan-Hensley
      @Bryan-Hensley 1 year ago

      Kinda like dark matter
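
The microstate picture discussed in this thread (books on a shelf, gas in a box) can be made concrete in a few lines. A minimal sketch of Boltzmann-style counting; the function names are mine, for illustration only:

```python
import math

def bookshelf_microstates(n_books: int, alphabetical: bool) -> int:
    # The "alphabetically ordered" macrostate is compatible with exactly
    # one arrangement; the "any order" macrostate is compatible with all
    # n! permutations of the books.
    return 1 if alphabetical else math.factorial(n_books)

def prob_all_in_one_half(n_molecules: int) -> float:
    # Gas in a box: each molecule sits in the left or right half
    # independently, so "all on one side" has probability (1/2)**n.
    return 0.5 ** n_molecules

print(bookshelf_microstates(10, alphabetical=False))  # 3628800
print(prob_all_in_one_half(100))                      # ~7.9e-31
```

With just 10 books the "disordered" macrostate is 3,628,800 times likelier than the alphabetical one; with the ~10^25 molecules in a real room, the all-in-one-corner macrostate is so improbable that the second law looks exact in practice.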

  • @Thomas-gk42
    @Thomas-gk42 9 months ago +46

    To some of your vids, I come back months later, again and again, because they are lovely anew every time. This is one of them.

  • @truejim
    @truejim 1 year ago +388

    “My videos can only go downhill from here…” The perfect ending to a video about entropy! 😂

    • @aggies11
      @aggies11 1 year ago +19

      Yep, that joke is brilliant and works on so many levels.

    • @yeroca
      @yeroca 1 year ago +2

      The entropy of explanations of entropy. So meta! The explanations can only be as good as this one, and will likely be worse from now on.

    • @GoDodgers1
      @GoDodgers1 1 year ago +1

      She is probably right, not that they were anything special from the start.

    • @hyperduality2838
      @hyperduality2838 1 year ago +1

      SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics!
      Making predictions to track targets, goals & objective is a syntropic process -- teleological.
      Teleological physics (syntropy) is dual to non teleological physics (entropy).
      Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant.
      Mathematicians create new concepts from their perceptions (geometry) all the time!
      Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant.
      Mathematics (concepts) is dual to physics (measurements, perceptions).
      Deductive inference is dual to inductive inference -- Immanuel Kant.
      Inference is dual.
      The rule of two -- Darth Bane, Sith lord.
      "Always two there are" -- Yoda.
      Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence.
      Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

    • @MR-ub6sq
      @MR-ub6sq 1 year ago +6

      @@hyperduality2838 Yeah

  • @maninspired
    @maninspired 1 year ago +240

    I'm a mechanical engineer who focused on thermodynamics in college. Despite years of study and using entropy in formulas, this is by far the best explanation of entropy I've ever heard.

    • @deltalima6703
      @deltalima6703 1 year ago +4

      Would you agree that entropy is fishy?

    • @johndemeritt3460
      @johndemeritt3460 1 year ago +12

      There's a reason my father, who trained as a mechanical engineer but became a computer programmer/systems analyst, always referred to thermodynamics as "thermogoddamics".

    • @oosmanbeekawoo
      @oosmanbeekawoo 1 year ago +13

      Classic response on a video that does not explain entropy.
      It's far easier to believe we understand than to believe we've been fooled into understanding.

    • @chrisj5443
      @chrisj5443 1 year ago +14

      My education was the same. I was waiting for her to say entropy tends to increase in a CLOSED SYSTEM, but she never did. She must assume the universe as we understand it is a closed system, but that's very debatable, given the required low entropy at the Big Bang. Also, questioning the semantics of the 2nd Law (the meaning of "order") seems trivial to me. After all, I recall the word "disorganized" being used in the 2nd Law, which I think is better.

    • @karol_p
      @karol_p 1 year ago +4

      She actually makes the mistake of calling heat a form of energy, when in fact heat is a form of energy transfer.

  • @tayzonday
    @tayzonday 1 year ago +2586

    If I don’t shower, all the good air in a room definitely moves into the corner.

    • @jdtv50
      @jdtv50 1 year ago +119

      Chocolate RAIINNNN!

    • @AttackOfTheTube
      @AttackOfTheTube 1 year ago +41

      Some stay dry

    • @stevenlayne9227
      @stevenlayne9227 1 year ago +83

      @tayzonday we definitely have the same algorithm. You keep behaving like a tyrant in the comment sections 😂

    • @bossoholic
      @bossoholic 1 year ago +22

      When I don't shower, everybody suffocates and my flowers die

    • @alexandershendi7428
      @alexandershendi7428 1 year ago +26

      @tayzonday And if you move into *that* corner, it will move into the opposite corner.

  • @AlexanderTome
    @AlexanderTome 1 year ago +74

    This has brought me closer to reconciling questions I've had about entropy than I've ever been before. Thank you for that.

    • @whyofpsi
      @whyofpsi 1 year ago +3

      I've made a visualisation that might further enhance your understanding: ruclips.net/video/lW7HT6IxI_o/видео.html

  • @randomwalk5095
    @randomwalk5095 1 year ago +93

    I completely agree with you. When I was young, I used to tell my mom that my room was never in disorder because disorder itself doesn't truly exist. Instead, what exists are infinite states of possible order, and none of them holds more inherent sense than another.

    • @methylene5
      @methylene5 1 year ago +23

      If a workspace becomes disorganised, the ability to get work done rapidly approaches zero. So I would argue that certain states of possible order do make more inherent sense than others.

    • @lemurpotatoes7988
      @lemurpotatoes7988 1 year ago +5

      If I understood Ramsey theory I could say something intelligent here.

    • @petermaunsell4575
      @petermaunsell4575 1 year ago +4

      So your mother was the intervening force. If you keep your room tidy today as an adult, it’s probably her victory, or your partner’s :)

    • @jumpycat
      @jumpycat 1 year ago +5

      An extremely organized workspace looks amazing, but it can intimidate some people and drain creativity. On the other hand, organizing is really good work hygiene after a project is done.

  • @theprinceofinadequatelighting
    @theprinceofinadequatelighting 1 year ago +696

    "It can only go downhill from here"
    Sabine's humor, much like our universe, is chaotic.

    • @truejim
      @truejim 1 year ago +19

      It was a brilliant joke!

    • @davido.newell4566
      @davido.newell4566 1 year ago +14

      Well ordered!

    • @jonathanbourret2968
      @jonathanbourret2968 1 year ago +8

      I think she meant, "It can only go high entropy from here".

    • @cdorman11
      @cdorman11 1 year ago +8

      11:45 "If you stop a random physicist on the street and ask them if they agree..."

    • @keithwollenberg5237
      @keithwollenberg5237 1 year ago +3

      I wonder if she was aware how funny it sounds to anglophone ears to have someone with a German accent expound the desirability of order.

  • @deebarker1969
    @deebarker1969 7 months ago +107

    Sabine, I'm a chemist, and in conversations in the past, I attempted to make the arguments about entropy you've made so eloquently in this video (especially the associations/ideas about order). From now on, rather than arguing, I'll recommend watching your video. Thank you so much for this great video!

    • @doopov
      @doopov 5 months ago

      Chemistry is really hard for me. I can never become a chemist.

    • @WaterspoutsOfTheDeep
      @WaterspoutsOfTheDeep 4 months ago

      As a chemist you must realize the issue with naturalism is that time is the enemy, not the friend. You need biological chemistry to happen faster than the reductive, decaying chemistry. The problem is that doesn't happen in this universe... amino acids and proteins only have hours before becoming useless...

    • @NobleUnclean
      @NobleUnclean 4 months ago +2

      @@WaterspoutsOfTheDeep The subtext of what this video is saying is that the 'starting' state of the big bang was a macrostate that we no longer have information/knowledge of. Therefore, at this alleged 'heat death', it will yet again be a big bang of a different kind. All this suggesting that the universe might very well be an eternal macrostate that *can't* 'discontinue'.

    • @maxkho00
      @maxkho00 4 months ago

      Too bad this "eloquent" argument falls apart if reductive materialism isn't assumed. I don't think many people will agree with her that _none_ of the macro-states objectively exist. Clearly, humans, who are macro-states, objectively exist since, well, cogito ergo sum. Her argument is pretty nonsensical.

    • @NobleUnclean
      @NobleUnclean 4 months ago

      @@maxkho00 It's premature to label concepts on the vanguard of existence in that way. Entire areas of study could very well reside within something woefully 'nonsensical'. Since we are talking about expanses of time in the trillions of years, I'm not really prepared to reject anything, regardless of how silly it might first seem.

  • @suomeaboo
    @suomeaboo 1 year ago +196

    9:55 Finally someone said it! I always considered a homogeneous mixture as "ordered", contrary to how lots of entropy explainers describe it as "disordered". This led me to lots of confusion over the years, until my recent physics classes cleared things up.

    • @cameronbartlett6593
      @cameronbartlett6593 1 year ago +6

      now run outside and play

    • @b1ff
      @b1ff 1 year ago

      @cameronbartlett6593 no u

    • @limitlessenergy369
      @limitlessenergy369 1 year ago

      EZ water / exclusion zone worth a read

    • @limitlessenergy369
      @limitlessenergy369 1 year ago

      @@cameronbartlett6593 I am a plasma engineer, and I would beat you at anything physical, best of 3. Pick your best sport; it won't matter even if I have never done it, you will still lose, because I will also pick my best sport and a third party picks the third sport, meaning you have low chances of winning. Go say your BS to yourself in the mirror. Not every nerd-minded dude is physically incapable.

    • @revimfadli4666
      @revimfadli4666 1 year ago +10

      ​@yazmeliayzol624or maybe they're spammers? I don't see them either

  • @markusk2289
    @markusk2289 1 year ago +55

    This video reminds me how it always frustrated me, in school as well as later at uni, that things were described or defined in simplified ways that made them wrong, harder to actually grasp, or both. Sometimes the actual truth of the matter would slowly reveal itself, often leading to an „aha“ moment years later and a feeling of „I knew it“.
    Also funny how often simply explaining the meaning of a Latin or Greek term would have almost explained the whole concept behind it, yet somehow no professor ever did that.

    • @arsenypogosov7206
      @arsenypogosov7206 5 months ago +1

      So true!

    • @Lightbeerer
      @Lightbeerer 2 months ago +4

      Exactly! Like for example how percentage literally means "per hundred". So 10% is 10/100. Which means you can easily convert between percentages and fractions.

    • @noahyes
      @noahyes 28 days ago

      This is a great point. Most teachers are just bad.
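
The "per hundred" etymology in the thread above really is the whole mechanism; a trivial sketch (the helper names are mine, for illustration):

```python
def percent_to_fraction(p: float) -> float:
    # "per centum" = per hundred, so 10% = 10/100 = 0.1
    return p / 100.0

def fraction_to_percent(f: float) -> float:
    # The inverse: a fraction times 100 gives the percentage.
    return f * 100.0

print(percent_to_fraction(10))    # 0.1
print(fraction_to_percent(0.25))  # 25.0
```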

  • @jeffb3357
    @jeffb3357 1 year ago +310

    The entropy of RUclips must have decreased, since Veritasium and Sabine both posted videos on entropy within two weeks of each other :) They're both great, but Sabine definitely lives up to her motto of no gobbledygook... thanks Sabine!

    • @geoffwales8646
      @geoffwales8646 1 year ago +24

      Veritasium explains it better for the layperson, IMO.

    • @michalgric
      @michalgric 1 year ago +1

      This video about life and entropy, ruclips.net/video/k-vm3ZWnMWk/видео.html, is a really good match for the two videos you mentioned.

    • @HardHardMaster
      @HardHardMaster 1 year ago +9

      I like my gobbledygook with a good glass of razzmatazz

    • @AnthonyCassidy50
      @AnthonyCassidy50 1 year ago +17

      I agree, they are both great. Sabine's was posted on Jun 18th, and Veritasium's on Jul 2nd. So we know Sabine wasn't responding to his (since she was first), and since Veritasium takes longer than two weeks to make one of his elaborate videos, we know he started his before he knew Sabine was doing one. That's perfect. Also cool how they both disliked the word "disorder" for entropy: Veritasium prefers entropy as "energy spread out", and Sabine presents her own argument.

    • @andrewmycock2203
      @andrewmycock2203 1 year ago

      @@geoffwales8646 Not nitpicking, correcting. 👍🏻

  • @Cre8tvMG
    @Cre8tvMG 7 months ago +8

    I enjoy your sense of humor keeping things light and entertaining while simultaneously tackling deep concepts. Great blend.

  • @SeattleShelby
    @SeattleShelby 1 year ago +131

    Sabine’s Carnot efficiency at explaining this stuff is 100%.

    • @redandblue1013
      @redandblue1013 1 year ago +11

      So there is an infinite temperature gradient across her body?

    • @saraawwad9163
      @saraawwad9163 1 year ago +2

      Not possible 😂

    • @nancymatro8029
      @nancymatro8029 1 year ago +6

      Are you claiming to have Carnot knowledge of Sabine?

    • @msimon6808
      @msimon6808 1 year ago

      There are always losses. Superconductors radiate when the current flow changes.

    • @IsoMorphix
      @IsoMorphix 1 year ago +1

      @@nancymatro8029 I appreciate this joke.
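
The jokes above trade on the Carnot bound: a heat engine between reservoirs at T_hot and T_cold can at best reach efficiency 1 - T_cold/T_hot, so 100% would require a cold reservoir at absolute zero (or an infinitely hot one). A minimal sketch, with made-up example temperatures:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible efficiency of a heat engine running between two
    reservoirs, temperatures in kelvin: eta = 1 - T_cold / T_hot."""
    if not (t_hot_k > t_cold_k > 0):
        raise ValueError("need T_hot > T_cold > 0 (in kelvin)")
    return 1.0 - t_cold_k / t_hot_k

# A turbine running between 800 K steam and a 300 K environment can
# never beat 62.5%, no matter how cleverly it is engineered.
print(carnot_efficiency(800.0, 300.0))  # 0.625
```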

  • @live_free_or_perish
    @live_free_or_perish 1 year ago +138

    Thank you for delivering such great content 👏 always interesting, humorous, and informative. My field is engineering, but sometimes, I regret not pursuing physics. Watching your videos gives me a chance to see what I missed.

    • @SabineHossenfelder
      @SabineHossenfelder  1 year ago +32

      Many thanks from the entire team!

    • @barnsisback8524
      @barnsisback8524 1 year ago +2

      @@SabineHossenfelder Time does not pass; only you are moving through time. Time is a coordinate of the state of the moving entropy. From the past state to the present state, you can aim at the future state.

    • @enk335
      @enk335 1 year ago +4

      you still have time!

    • @effectingcause5484
      @effectingcause5484 1 year ago +1

      Engineering is good experience for a prospective physicist. Nikola Tesla, perhaps the greatest engineer, was by extension, also one of the greatest physicists who ever lived, especially in the field of electromagnetism.

  • @tyfooods
    @tyfooods 1 year ago +115

    The idea that there are always macrostates capable of turning a high-entropy system into a low-entropy system is fascinating. Entropy being constant, but perceived by us macrostates as variable, and thus subjective, is a powerful idea.
    I love how more and more physicists are adopting such a perspective! 😁✨

    • @adamt5
      @adamt5 1 year ago +1

      time crystals are a great example. Check that out!

    • @thearpox7873
      @thearpox7873 1 year ago +23

      It may be powerful, but that doesn't mean it is particularly useful.
      Saying that life may be able to emerge after the heat death of the universe just because it perceives reality differently might make a great Douglas Adams book, but it belongs right alongside the parallel-dimension theories in plausibility.

    • @florianp4627
      @florianp4627 1 year ago +9

      @@thearpox7873 Isn't the idea similar to Penrose's cyclical-cosmos hypothesis? And he has proposed some actual ways to verify it experimentally.

    • @thearpox7873
      @thearpox7873 1 year ago +16

      @@florianp4627 It depends on what you mean by "idea" and what you mean by "similar".
      But I personally find Penrose's hypothesis intellectually coherent, interesting and plausibly congruent with reality, while Sabine here is engaging in the exact same cognitive hocus-pocus that she makes fun of certain other physicists so much for.

    • @dzcav3
      @dzcav3 1 year ago

      Sounds like Maxwell's demon

  • @charlesthomas8450
    @charlesthomas8450 7 months ago +8

    She’s an excellent teacher and her dry humor is hilarious! Love her!

    • @undercoveragent9889
      @undercoveragent9889 7 months ago

      "She’s an excellent -teacher- _propagandist_ and her dry humor is hilarious! Love her!"
      FYP!

  • @adanbrown
    @adanbrown 1 year ago +34

    Videos that provide a "simple understanding" of historically complex concepts (like the laws of thermodynamics) are crucial for our kids to hear and learn. I am especially interested when the video explains what we "don't know" or "don't yet understand" to pique their interest in what to solve next!

  • @marknovak6498
    @marknovak6498 1 year ago +118

    This is the coolest clearest and most concise explanation of entropy ever. I wish my physics professors had taught it this way. It would have saved me so much sleep.

    • @aggies11
      @aggies11 1 year ago +7

      True say. While almost anyone can understand a concept/subject, it really takes a special mind to be able to explain it in a way that can be understood by someone else. Sabine's way of conveying information is so refreshing.

    • @AurelienCarnoy
      @AurelienCarnoy 1 year ago +5

      Yes. Finally someone who admits it is a human bias and not a law. 😅
      To compare is not fair. Everything is perfect compared to itself.

    • @FGBFGB-vt7tc
      @FGBFGB-vt7tc 1 year ago +4

      @@aggies11 I am a firm believer in the Feynman methodology: start as simple as you can to build up knowledge. If you can explain a complex idea in such a way that a child (or a non-specialist) can understand it while still being faithful to the core concepts then you are good!

    • @ricktownend9144
      @ricktownend9144 1 year ago

      @@AurelienCarnoy Yes, I get fed up with it being called a law ... to call it an assumption, or - as Sabine does - a matter of probability, would be much more accurate and satisfying.

    • @monnoo8221
      @monnoo8221 1 year ago +2

      Maybe, but she misses the point, and hence your professor did too.

  • @homeworld22
    @homeworld22 1 year ago +186

    For a mortal being with a life expectancy of ~80yrs I confess I spend an irrationally large portion of my life brooding on the eventual heat death of the universe. Glad to find videos like this on occasion which at least try and come up with a philosophical explanation for why we shouldn't feel depressed over the stars eventually going out. Kudos Sabine!

    • @edh.9584
      @edh.9584 1 year ago +5

      It almost makes one consider praying to God.

    • @HardHardMaster
      @HardHardMaster 1 year ago +20

      @@edh.9584 Which one? There are so many to choose from...

    • @jongyon7192p
      @jongyon7192p 1 year ago +12

      @siddified Definitely the Pasta One
      Although I personally like the machine god that came from the future

    • @edh.9584
      @edh.9584 1 year ago +2

      @@HardHardMaster Well, choose one.

    • @jongyon7192p
      @jongyon7192p 1 year ago +9

      @@edh.9584 The "satanic temple" has some surprisingly good doctrine

  • @mikezooper
    @mikezooper 10 months ago +12

    I’m not old, I’m just high entropy.

  • @exciton007
    @exciton007 1 year ago +25

    Intuitive explanation of entropy. Thanks a lot

  • @alexandretorres5087
    @alexandretorres5087 1 year ago +73

    This reminds me of the book "A Choice of Catastrophes" by Isaac Asimov. He talks about how to survive the heat death by exploiting low-entropy fluctuations. The book was written in 1979.

  • @luipaardprint
    @luipaardprint 1 year ago +103

    I've always read the term heat death as the death of heat, which made a lot of sense to me.

    • @LeanAndMean44
      @LeanAndMean44 1 year ago +7

      Or a death caused by heat. It could mean both things, and language is just inaccurate in many ways.

    • @AdaptiveApeHybrid
      @AdaptiveApeHybrid 1 year ago +4

      I thought so too lol

    • @vikiai4241
      @vikiai4241 1 year ago +7

      Yes, the word order in English defaults to "Death by heat" , but "Death of heat" is also a valid interpretation of the words, though not the default.

    • @gnomiefirst9201
      @gnomiefirst9201 1 year ago +2

      @@vikiai4241 That's interesting, because I'm a native English speaker and frequently come across this, but I was never sure about my way of thinking about it. Thanks for the validation lol.

    • @FredMaverik
      @FredMaverik 1 year ago +2

      @@vikiai4241 Why though? What rule is there

  • @humanitech
    @humanitech 1 month ago +1

    Perhaps entropy is just the natural inverse phase, flip, reversal or change that occurs with all energy and matter interactions and formations in the cosmos,
    which creates and evolves things, then reverses, changes, deconstructs or destroys them ... to reuse and recycle into new positive/negative formations and events ... and so on!

  • @geoicons1943
    @geoicons1943 1 year ago +64

    LOL. “It’d be inconvenient if you entered a room and all the air went into a corner.” @sabine, you totally cracked me up with this joke. Thank you!

  • @shivasive
    @shivasive 1 year ago +236

    A friend of mine once said ”if studying physics doesn't humble you, you're doing it wrong."

    • @PrivateSi
      @PrivateSi 1 year ago +5

      If you don't confidently, arrogantly even, question physics you're doing it wrong... Entropy should be redefined as Simplicity or Uniformity... Complexity can be static and structured, or dynamic (and structured)... Closed systems simplify over time. Energy can increase complexity. This redefinition solves many issues.

    • @moneteezee
      @moneteezee 8 months ago +4

      @@PrivateSi You clearly haven't been humbled yet. Entropy has been defined precisely in physics since the mid-19th century. The loose definitions you get from science communicators don't reflect the reality of this situation, hence you must study physics. (Even this video enunciated this, so I truly don't know where your head is.)

    • @PrivateSi
      @PrivateSi 8 months ago

      @@moneteezee .. You can either make a new term or change the Law of Entropy, because that law is wrong when over-applied to the entire universe. It's fine for gases in a closed system, but fails as a universal law, unlike my redefinition.. You could call it the Law of Simplicity if you prefer, as long as you stop declaring the Law of Entropy to hold at all times.

    • @PrivateSi
      @PrivateSi 8 months ago

      @@moneteezee.. If fields are real they have to be made of discrete elements that have to be as equidistant as possible throughout the entire field for the field to be empty. This is a perfectly ordered state that's as simple as possible. Hence, The Law of Entropy should not be a law or Entropy should be redefined. It's a simple, logical argument given the evidence for QUANTISED FIELD(S).

    • @M-dv1yj
      @M-dv1yj 7 months ago

      @@moneteezee Entropy is likely the unwinding of quantum entanglement, from the source of singular entanglement of all possible quantum expressions into less-entangled, limited expressions ... entropy is the cost, the balancing side, of the increased complexity that rebirths cycles of quantum tangling and detangling, which at some phase of transition leads to us. 😊
      Entropy must be considered within the scope of its relationship to quantum complexity.
      In short, small-scale fluctuations must run dry for the whole quantum baseline to reset into the potential of the singular's expression.
      😊

  • @juancarlos-cl7cs
    @juancarlos-cl7cs Год назад +20

I love the depth and clarity you have; this kind of content is gold. I studied physics and philosophy, and a conversation with you would have saved me years of doubts and confusion. Thank you for this.

  • @anandawijesinghe6298
    @anandawijesinghe6298 8 месяцев назад +1

    The true and big picture answer is that it will never end ! The universe as a whole is evolving on a very large time scale, which is macroscopically time-periodic in a statistical sense ! The initial state not relevant, like for a non-linear oscillator that becomes chaotic, but its chaotic state is periodic, perhaps because of the reorganizing entropy-reducing nature of gravity, whose long term impact has still not been understood or quantified ! 😇!

  • @Anaesify
    @Anaesify Год назад +16

    Sabine your work has changed the very way I view life and physics in a way that's nearly spiritual. I feel so lucky to be able to learn about the beauty of the universe and physics from you

  • @willdon.1279
    @willdon.1279 Год назад +120

    She is wonderful at explaining very difficult or (almost) impossible ideas in an understandable way.
    Alas, even Sabine leaves my poor brain often mystified, but always stretched, curious, and entertained. 🙂

    • @hannahbaxter8825
      @hannahbaxter8825 Год назад +4

      Same same

    • @warrenjoseph76
      @warrenjoseph76 Год назад +6

      When it’s a topic like this I need to watch a few times over a few days and dig more into some of the ideas with other videos. I like that she doesn’t completely dumb it down for people like us but opens the door for us to learn more (such as that entropy logarithm formula). But yeah at the end of this video I STILL don’t QUITE understand a simple way to explain entropy to someone. Yet. But I will after a few views 😂
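The entropy logarithm formula mentioned above is Boltzmann's S = k_B ln W; here is a minimal sketch of it, assuming a hypothetical four-coin toy system (the coins and numbers are illustrative, not from the video):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(num_microstates):
    """Boltzmann's formula S = k_B * ln(W) for a macrostate with W microstates."""
    return K_B * math.log(num_microstates)

# Toy system: 4 coins. The macrostate "exactly 2 heads" contains C(4, 2) = 6
# microstates (HHTT, HTHT, HTTH, THHT, THTH, TTHH).
w = math.comb(4, 2)
print(w)                     # 6
print(boltzmann_entropy(w))  # ~2.47e-23 J/K
```

The macrostate with the most microstates ("half heads") has the highest entropy, which is the sense in which high-entropy macrostates are the most likely ones.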

    • @aaronreyes7645
      @aaronreyes7645 Год назад +1

She is wonderful
Without the gobbledygook

    • @jackcarswell4515
      @jackcarswell4515 Год назад +2

@@aaronreyes7645 In my opinion, it was all gobbledygook. She talked in circles and never really said anything at all.

    • @TheBsavage
      @TheBsavage Год назад +2

      I know, right? I'm totally in love with her. With her MIND, but I suspect it's a package deal. I'm still game.

  • @matthewparker9276
    @matthewparker9276 Год назад +18

    The statistical description of entropy has never resonated with me, but your talk about other life with access to different macrostates is very interesting, and warrants further thought.

    • @kristianshreiner6893
      @kristianshreiner6893 Год назад +8

It's vague and insubstantial. She needs to produce some scenario that could meaningfully demonstrate how one sidesteps Boltzmann's formulation of the second law.
Not the order/disorder pop-science stuff, the physics.
Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently. I love her, but this argument isn't serious.

    • @demonicakane2083
      @demonicakane2083 Год назад

I didn't get what she means by complex systems with different microstates which we can't use, and why the entropy is small for them... Can you explain?

    • @ywtcc
      @ywtcc Год назад +1

      Could Entropy just be a matter of perspective?
      If a horizon from one perspective can appear to be a singularity from another perspective...
      A horizon takes an eternity, and a singularity happens in an instant....
      And, a horizon is a state of maximum entropy, and a singularity is a state of minimum entropy...
      Entropy should be dependent on your reference frame. If your horizon is growing, then your entropy should be growing.
      I've been thinking about it like this after listening to Leonard Susskind. I don't know if I'm getting it quite right, but it's really interesting to think about in this way.

    • @ThePowerLover
      @ThePowerLover Год назад

      ​@@kristianshreiner6893 _"Proposing some race of hypothetical beings that rely on different Marco states is about as meaningful as proposing that magical beings will eventually come into existence who live differently."_
Well, if you understand "magic" as "fairy tales", like many do, then it's a straw man fallacy, because Sabine didn't think of it in terms of "something that doesn't exist nor can exist".
And she didn't use the order/disorder stuff except to criticize it. Boltzmann's formulation, as Sabine hinted in this video, is only useful for things that we "know", not stuff we clearly don't know, such as information not accessible to us. Stop lying pls.

    • @ThePowerLover
      @ThePowerLover Год назад

@@ywtcc All statistics is clearly a "matter of perspective".

  • @anthonybelz7398
    @anthonybelz7398 9 месяцев назад +5

    One of the best commentaries on Entropy I've encountered thanks SB, especially as "The # of microstates per macrostate" - Given that a macrostate is simply an arbitrary human classification/aggregation, does this mean that entropy is an arbitrary physical aggregate outcome? I think I'm missing something, so I'll listen to your commentary until I can discern this thing. 🥝🐐

    • @LlamameX
      @LlamameX 6 месяцев назад

Every fundamental unit you can imagine in physics is relative in the end. References are set arbitrarily by us on the basis of what we deem stable enough to compare to. I guess entropy is also relative, as we still don't know much about what is going on at the subatomic level, but for now it seems to be the most pure and universally applicable quantity we can think of. (I am not a physicist but an engineer, although I think degrees are meaningless when it comes to pursuing actual knowledge; they are just used as a social reference so that people are more inclined to believe you when you speak, as a sort of warranty. Some people use this extensively to bolster their ego even though in the end they may not know much.)

  • @KauTi0N
    @KauTi0N Год назад +20

    I've been waiting a long time for this video. Sabine, you are an amazing communicator and I think the world needs to hear you. This channel and other sources like it are THE REASON I use the internet.
    Thank you! ❤

    • @hyperduality2838
      @hyperduality2838 Год назад

      SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics!
      Making predictions to track targets, goals & objective is a syntropic process -- teleological.
      Teleological physics (syntropy) is dual to non teleological physics (entropy).
      Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant.
      Mathematicians create new concepts from their perceptions (geometry) all the time!
      Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant.
      Mathematics (concepts) is dual to physics (measurements, perceptions).
      Deductive inference is dual to inductive inference -- Immanuel Kant.
      Inference is dual.
      The rule of two -- Darth Bane, Sith lord.
      "Always two there are" -- Yoda.
      Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence.
      Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

    • @gregwarrener4848
      @gregwarrener4848 Год назад

@@hyperduality2838 Sounds like you're grasping at straws, likely for some sort of affirmation of a biased narrative. Entropy is all around you and can be observed; positing a balance to the force is wishful thinking.

    • @hyperduality2838
      @hyperduality2838 Год назад

      @@gregwarrener4848 Your concept of reality is a prediction or model -- syntropic!

    • @MR-ub6sq
      @MR-ub6sq Год назад

      @@hyperduality2838 Really?

    • @hyperduality2838
      @hyperduality2838 Год назад

      @@MR-ub6sq Concepts are dual to percepts -- the mind duality of Immanuel Kant, the critique of pure reason.
      Mathematicians create or synthesize concepts all the time from their perceptions.
      Your mind is dual.
      Mind (the internal soul, syntropy) is dual to matter (the external soul, entropy) -- Descartes or Plato's divided line.
      Matter is also dual -- wave/particle duality!
      "Always two there are" -- Yoda.
      Everything in physics is made from energy and energy is duality, duality is energy.
      Potential energy is dual to kinetic energy.
      Electro is dual to magnetic -- electro-magnetic energy is dual, photons.

  • @ahmed51988
    @ahmed51988 Год назад +45

I guess Sabine is the best person that has ever explained this relationship between order, life, entropy and those things without either getting carried away in theory or relying on very superficial metaphors. Because maybe physics is both! Intuition and somewhat philosophical insights plus rigorous mathematical models.

    • @No-cg9kj
      @No-cg9kj Год назад

      Physics isn't philosophy, it's what we know to be reality.

    • @off6848
      @off6848 Год назад +2

      @@No-cg9kj What you think you know (episteme) is philosophy.

    • @ThePowerLover
      @ThePowerLover Год назад +1

      @@No-cg9kj Physics is not that despite its name.

    • @ahmed51988
      @ahmed51988 Год назад +1

@@No-cg9kj I agree with you actually. Philosophical thinking doesn't mean something detached from reality. There's a branch of philosophy about science and modelization and those things. I mean that it's useful to take some time to pose some "meta" questions about what we're trying to look for in nature, what is really fundamental, and what stems from our specific human view of nature.
In this video Sabine took some time to consider that maybe our definition of entropy is a reflection of what information is accessible to US and what energy WE can use. That's pretty philosophical, it seems to me.
It's funny how your user name is No btw xd, thanks for your comment.

    • @shrub4248
      @shrub4248 Год назад +4

      Science is built up on a bedrock of epistemology.

  • @wyrmh0le
    @wyrmh0le Год назад +16

    Very interesting point about 'macrostates' that I've never heard or thought of before. Definitely food for thought. Thanks!

  • @495whiterobe
    @495whiterobe 9 месяцев назад +1

    What is "life" (body, soul & spirit)but the interiorizing, complexification of matter, a 'gift' of anti-entropy...😊
    👉Creation Continues...,& that mass beyond our bounds..also, accelerates & increases... And the gravity of it all makes for a stable & expanding space-tiime in our neighborhood.

  • @serpentphoenix
    @serpentphoenix Год назад +6

    "If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations-then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation-well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." - Arthur Eddington, New Pathways in Science

  • @PATRIK67KALLBACK
    @PATRIK67KALLBACK Год назад +23

Great video Sabine! Even though I have an MSc in chemistry and a PhD in pharmacy, one of the hardest things to understand is thermodynamics. You really have to eat and breathe thermodynamics to really understand it; it doesn't come with intuition, and one of those things is entropy. So thank you Sabine for making entropy more intuitive.

    • @mixerD1-
      @mixerD1- Год назад

      Farmacy?
      Do you mean agriculture or is this alternative medicine?
      Sorry... couldn't help myself.😁

    • @andrewmycock2203
      @andrewmycock2203 Год назад

      @@mixerD1-or maybe an Fhd.

    • @PATRIK67KALLBACK
      @PATRIK67KALLBACK Год назад

      ​@@mixerD1-ha ha, too much swedish 😊

    • @noneofyourbusiness-qd7xi
      @noneofyourbusiness-qd7xi Год назад

      Your physical chemistry profs will be extremely disappointed if they read your comment (and likely be sorry that they let you pass)

  • @johncampbell9216
    @johncampbell9216 6 месяцев назад +4

Entropy is a tendency, not a law. For instance, galaxies could not assemble if entropy were the rule. The solar system could not exist if entropy were the rule. Fusion could not occur if entropy were the rule. It would appear, then, that gravity is the opposing force to entropy.

    • @indrekl89
      @indrekl89 7 дней назад +1

      Also, doesn't that mean that gravity provides an objective standard for what counts as a macrostate? 🤔Gravity puts loose constraints on how particles can move (e.g. directing particles so that they would make a galaxy shape). Particles can move out of those confines but are less likely to do so. It seems this is just what a macrostate is. And it's not dependent in any way on human interests.

  • @olive_oil87
    @olive_oil87 9 месяцев назад +69

    “and luckily so because it would be inconvenient if you entered a room and all the air went to a corner” just made my day

    • @colorado841
      @colorado841 5 месяцев назад +1

      That is so annoying when it happens.

  • @tnb178
    @tnb178 Год назад +27

    The analogy of "disorder" can work in the right context. You could say: there's all kinds of messy but only one kind of clean. For that reason your room tends towards getting messy if you make any change not specifically targetting cleanliness.
    What I like about the example is that it uses the cleanliness as a macroscopic state defined by a human and talks about the associated microstate count.
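The "all kinds of messy, one kind of clean" counting can be sketched numerically; this uses a hypothetical 10-item room, not anything from the comment:

```python
from math import comb

n_items = 10  # hypothetical room: each item is either "in place" or "out of place"

# Macrostate: the number of items out of place.
# Microstates per macrostate: C(n, k) ways to choose which k items are out of place.
counts = {k: comb(n_items, k) for k in range(n_items + 1)}

print(counts[0])             # 1: exactly one perfectly clean arrangement
print(counts[5])             # 252: many half-messy arrangements
print(sum(counts.values()))  # 1024 = 2**10 microstates in total
```

A random change is therefore overwhelmingly likely to land in a messier macrostate, simply because messier macrostates contain more microstates.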

    • @boldCactuslad
      @boldCactuslad Год назад +2

      i see you've never been to a certain college dorm. as determined with mathematical certainty, this was a space of superlative messiness: the Platonic form of disorder, chaos so intense both the gravitational and nuclear forces had all but given up - trash and antitrash not belonging to our dimensions materialized, ionized, annihilated, sublimated, vaporized, crystallized, and floated about the space with such frequency that any external modification to the region did nudge the system towards cleanliness regardless of the actor's intention. the situation passed when a window broke itself in its frustration. the resulting decompression vented the contents of the room (including furniture, wallpaper, rodentia) into the environment.

    • @petevenuti7355
      @petevenuti7355 Год назад +1

A single homogeneous pile of ash should be the cleanest state; if it's totally homogeneous then there truly can be only one state.

    • @off6848
      @off6848 Год назад +3

@@petevenuti7355 This is correct and similar to what I was going to say about her comment that "entropy is why things break down".
Actually things break apart because usually a "thing" is composed of many things forced together, which eventually unravel because they don't share the same essence. A house will break down because it's a collection of wood, glue, sand, gypsum, wire, metal etc. But those things on their own do not break down.

    • @LMarti13
      @LMarti13 Год назад +2

      except that there isn't one kind of clean, there are literally an infinite number of them

    • @wojtek4p4
      @wojtek4p4 Год назад +2

      ​@@LMarti13 If you assume any margin where two adjacent states would be considered "the same" there's a finite amount of states - both "clean" and "dirty". E.g. if you assume that a pair of scissors being moved by less than 1mm constitutes the same state (that is: you quantize the position) the amount of states becomes finite.
      Importantly increasing the "precision" doesn't make the amount of states infinite - and it doesn't change the proportion of "clean" and "dirty" states significantly. So even at a limit of however small your measurement error is, the amount of "clean" states is smaller than the amount of "dirty" status. Unless quantum effects start taking over, the amount of "clean" states is lower than the amount of dirty "states", no matter the scale.

  • @vazap8662
    @vazap8662 Год назад +10

Sabine has just addressed a question that has been tickling me since my teenage years: this notion of pockets of emergent entropy decrease, such as, say, our brains. I'm so grateful to hear her address this topic.

    • @FredMaverik
      @FredMaverik Год назад +1

      ...what

    • @henrythegreatamerican8136
      @henrythegreatamerican8136 Год назад +1

Locally you can have decreasing entropy, but the system as a whole remains highly entropic.

    • @Lightning_Lance
      @Lightning_Lance Год назад +4

      @@hyperduality2838 I'm here to remind you to take your pills.

    • @SineN0mine3
      @SineN0mine3 Месяц назад

      ​@@Lightning_Lancered pill or blue?

  • @gaemlinsidoharthi
    @gaemlinsidoharthi Год назад +52

    This reminds me of Roger Penrose’s ideas. I think you shared a stage with him at one point. If we change our scale of looking at the universe, in both space *and* time, the almost uniform distribution after 10^100 years (or whatever the number is) becomes recognisably clumpy again. The almost imperceptible effects of gravity speed up and become perceptible again.

    • @B33t_R007
      @B33t_R007 Год назад +5

i don't think this is possible. if the universe at that point is nothing but a diluted, even gas, there may be some random "clumping", but there's just not enough matter nearby for anything more than a few particles hooking up.

    • @akidnag
      @akidnag Год назад

      The most probable state for the universe is a gigantic black hole, which has the largest entropy given by the Bekenstein formula.

    • @IamPreacherMan
      @IamPreacherMan Год назад

@@B33t_R007 cold things contract. The universe is expanding because it's still heating, as evidenced by the high number of stars currently burning. Turn the heat off and the dark matter contracts, as does the distance between all particles and celestial bodies. The crunch will happen, as it obviously happened before. Unless you believe the universe poofed into existence out of nothing, which defies all known physics and logic. I think Penrose is brilliant, and wrong about his cyclical universe approach. But I also think Hawking was wrong about BH radiation for a number of reasons.

    • @5naxalotl
      @5naxalotl Год назад +11

      @@B33t_R007 the point is that there is eventually no matter, just stray photons. but photons move at the speed of light and from their point of view everything happens instantly. penrose claims that this has peculiar consequences. i can't tell you more than that, or vouch for penrose, but you don't seem to be anticipating the situation fully

    • @wafikiri_
      @wafikiri_ Год назад +12

      Once every remaining particle is beyond the event horizon of every other particle, there is no way they can interact again. Expansion of the universe would end up in a zillion one-particle universes.

  • @paulx2777
    @paulx2777 9 месяцев назад

    It's not that entropy cannot decrease; it's that entropy in a closed system cannot decrease. If your box has a window with the sun shining in, it's no longer a closed system. And taking measurements also means it is not a closed system.

  • @whyofpsi
    @whyofpsi Год назад +16

    I like how complexity emerges somewhere between minimum and maximum entropy :)

    • @anywallsocket
      @anywallsocket Год назад

      Indeed, it is within the inflection point of rate change that new minima can be added.

  • @PaulElmont-fd1xc
    @PaulElmont-fd1xc Год назад +108

    I have had major depressive episodes because of entropy and the second law. I am deadly serious. Since learning about it in high school physics class, I have always considered it to be my greatest fear. Thank you for easing my fear a bit. 😊

    • @SabineHossenfelder
      @SabineHossenfelder  Год назад +51

      I totally know what you mean.

    • @MommysGoodPuppy
      @MommysGoodPuppy Год назад +8

I like to think life only came about because of entropy: a human that builds a car will generate a lot more entropy than a rock that gets chipped to pieces over time.

    • @joansparky4439
      @joansparky4439 Год назад +4

      If you like SciFi I can suggest a book to you that has this as part of the story.. it's called 'Voyage from Yesteryear' by James P Hogan.
      Should give you a positive story to carry around wherever you go.

    • @jackkrell4238
      @jackkrell4238 Год назад

      @@joansparky4439 What about the concept of entropy daunts you so much?

    • @MommysGoodPuppy
      @MommysGoodPuppy Год назад +8

      @@PaulElmont-fd1xc also an uplifting story on entropy is "The Last Question" by Isaac Asimov

  • @cedriclorand1634
    @cedriclorand1634 Год назад +101

Sabine's humour and wit are so empowering. Such a wonderful person. Thank you for existing Sabine 😂

    • @leonardgibney2997
      @leonardgibney2997 Год назад +2

      She doesn't exist. Our reality is an illusion.

    • @peter9477
      @peter9477 Год назад +2

      She couldn't help it. Her existence is a high entropy state.

    • @johnkufeldt3564
      @johnkufeldt3564 Год назад +1

I agree, but you summed it up with far fewer words. Cheers from Canada.

    • @neanda
      @neanda Год назад +3

      so true 🤣 it's so subtle as it blends in with her education. imagine her being the teacher, she would grab your attention because you'd be like wtf? then listen more. she's a great teacher.
      'if you tell this to a random physicist on the street to see if they agree, you should let them go as they've got better things to do' 🤣

  • @comet2062
    @comet2062 Месяц назад +1

I think one of the most amazing examples of low entropy, or localized reversal of entropy, is photosynthesis at the molecular level. That, with amino acids and DNA/RNA, is the basis of life on Earth. Except for life around ocean-floor volcanic vents.

  • @un4given868
    @un4given868 Год назад +5

    Thank you for making clear to me what entropy is and as a bonus giving me a solid idea of what Necessity could mean as well .

  • @Dron008
    @Dron008 Год назад +43

    I have been trying to understand entropy for decades, now I am a little closer to it.

    • @treadwell1917
      @treadwell1917 Год назад +3

Entropy is simply a lack of data, in a "real" sense though. Information or data is better to say than order, because things being "ordered" is subjective. So entropy happens when you begin to lose information about a system. If something is breaking down, its order or information is also separating; therefore its entropy is increasing. In a way entropy could be seen as time, because time moves forward as things spread out or break down. It's even how we quantify the "second", using the oscillations of an atom. These two things are the same though. We measure time by our perception of all the systems around us progressing towards entropy. So time moves based on the frame rate we perceive it at, which is again measured or quantized by entropy.
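The "entropy as missing information" reading has a standard quantitative form, Shannon entropy; a small sketch with made-up probability distributions:

```python
import math

def shannon_entropy(probs):
    """H = sum(-p * log2(p)): the average missing information, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# If you know the microstate exactly, nothing is missing: zero entropy.
print(shannon_entropy([1.0]))       # 0.0
# Four equally likely microstates: two bits of information are missing.
print(shannon_entropy([0.25] * 4))  # 2.0
```

Boltzmann's formula is the special case of equally likely microstates, up to the choice of units (k_B ln W versus log2 W bits).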

    • @treadwell1917
      @treadwell1917 Год назад +1

I wasn't far enough along in the video; she pretty much says the same thing.

    • @cinegraphics
      @cinegraphics Год назад +3

Entropy is simply the rise of equality, which is the result of the calculations nature does to produce the next moment of time. So entropy is simply the result of calculation. It's real, but it's wrongly defined: entropy is not a measure of disorder, it's a measure of equality. Because once full equality is reached, the universe stops. The formulas still work, but their inputs and outputs become the same, hence there's no change, hence the passage of time makes no difference. That's maximum entropy. Or equality. So, equality is bad 😊

    • @snaawflake
      @snaawflake Год назад +2

      ​@@cinegraphics No their inputs and outputs don't become the same, only when you're working with an information discarding model of the universe (macrostate).
There is still exactly one microstate leading to exactly one next microstate, and thus each microstate has exactly one preceding microstate. So the universe, in terms of microstates, cannot advance to a point where inputs and outputs become the same: that final microstate would then be reachable in more than one way, first from the last state in which inputs and outputs were not the same, and second from the final state itself, where inputs and outputs are the same. But this contradicts the assumption that exactly one microstate leads to exactly one next microstate.
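The one-to-one microstate evolution described above can be illustrated with a toy permutation; the 8-state system and its update table are hypothetical, just to show the counting argument:

```python
# Toy reversible dynamics: 8 microstates, and an update rule that is a bijection
# (a permutation of the states).
step = {0: 3, 1: 6, 2: 0, 3: 5, 4: 1, 5: 7, 6: 2, 7: 4}

# Because the rule is a bijection, every microstate has exactly one predecessor,
# so two different states can never evolve into the same state.
predecessors = {}
for state, nxt in step.items():
    predecessors.setdefault(nxt, []).append(state)

assert all(len(p) == 1 for p in predecessors.values())
print(len(predecessors))  # 8: every state is reached exactly once
```

If two states mapped to the same successor, information would be destroyed and the dynamics would not be reversible, which is exactly the contradiction the comment points out.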

    • @cinegraphics
      @cinegraphics Год назад

@@snaawflake you're forgetting the rounding errors. At some moment the error level drops below the computation precision and the output becomes the same as the input. And that's the end of computation: the death of the universe. A stable state has been reached.

  • @scifieric
    @scifieric Год назад +14

    "It can only go downhill from here" made me laugh out loud. Another excellent video that teaches us about science. Lovely.

  • @adon2424
    @adon2424 Месяц назад +1

    I agree that we can measure location and velocity at the same time in our macrostate.
    The reason we can not do the same for a microstate is that we do not have the sterile tools and technology for the microstate.
    So we use the next best technique, statistics which reveals the wave function. That is just a tool to interface with the microstate without much interference, which inherently causes ambiguity.
    We are just not built to interface with the microstate without causing a disturbance.
    We need to devise a fractal technique to zoom down into the microstate to accomplish what we want to do without disturbing the microstate.
    Without that, we are just guessing at the characterization of the microstate. And we go down the rabbit hole.
    Thank you for your great insight on entropy. This allowed me to grasp the limitations of measurement theory in our macrostate, which is a fractal state of the cosmos.

  • @atmosphericSkull
    @atmosphericSkull Год назад +5

    This is a great philosophical point. I love your subtlety, Sabine.

  • @minifix
    @minifix Год назад +9

Sabine is not only intelligent enough to master a complex subject, but also able to think in new ways about how the data should be interpreted. It absolutely blew my mind when you explained how macro systems are a relative concept, and not an absolute value that happens to match our anthropocentric view of the universe. Thank you.

    • @katebuckley4523
      @katebuckley4523 9 месяцев назад

      I often think about how the anthropocentric view tends to divide ‘reality’ into separate things where there are none .. ie. a kind of fiction that perceives order v chaos, living v non-living, observer v observed etc.. ‘we’ seem to separate ourselves out along with everything else. My hunch is that the energetic solidity we feel in the body gives rise to the appearance of everything else as distinct and knowable.. a kind of ‘specialness’ that has something to do with the apparent body’s drive to conserve low entropy... what we call ‘me’ is only a kind of interpretation and nothing at all in reality.

  • @mishelleilieva9657
    @mishelleilieva9657 Год назад +24

You should read Asimov's story called "The Last Question". It covers the topic of entropy, and this video explains perfectly the science behind it. The story also follows Sabine's idea about future life forms in a higher-entropy universe. It is one of my favorite Asimov pieces.

    • @RobertR3750
      @RobertR3750 Год назад +1

      Asimov.

    • @mishelleilieva9657
      @mishelleilieva9657 Год назад +1

      @@RobertR3750 oops, you're right. English is not native for me and we spell it with a "z".

    • @tezzerii
      @tezzerii 7 месяцев назад +1

      I've read that. Very clever story ! I love Asimov.

    • @mishelleilieva9657
      @mishelleilieva9657 7 месяцев назад

      @@tezzerii me, too! I think I read almost everything he wrote..

    • @audistik1199
      @audistik1199 7 месяцев назад +2

As a young boy I became a science fiction devotee, and Asimov was one of my favorites, along with Heinlein and Clarke. I presume you are from an older generation like me, to have these greats on your reading list. 😊 It was a seminal experience in my life.

  • @vieiradelimafilho
    @vieiradelimafilho 8 месяцев назад +1

Great video, as usual. Sabine really puts things in perspective with her very humble scientific admission of gaps. Around the 7th minute we can visualize clearly the greatest bias (blindness?) that will make our grandkids cringe in a century: "How could you not understand this emergent order?" To me it feels like it's way past time we start conceptualizing things like "syntropy" and "eutropy", and take metaphysics seriously again. Restore it to its natural place since Aristotle: the real "theory of everything". Intuition (idealism) is the way we understand things so that science can make strides again. Empiricism depends on it, and is basically a secondary development.

  • @ColonelFredPuntridge
    @ColonelFredPuntridge Год назад +7

    If anyone here has NOT read Isaac Asimov's short-story "The Last Question", now would be a good time.

    • @Rotaermel
      @Rotaermel 8 месяцев назад

      Also, „Exhalation“ by Ted Chiang.

  • @NumericFork
    @NumericFork Год назад +16

    I read a while ago that they made a tiny membrane out of graphene that could harvest minute amounts of electricity from Brownian motion by vibrating the membrane. That's pretty cool because, as I understand it, it would mean that if you're in a sealed sphere that doesn't lose energy, you could in theory produce energy out of seemingly nothing by cooling the air to convert the heat to electricity, which in turn would do useful things before inevitably heating the air again.

    • @TheBackyardChemist
      @TheBackyardChemist Год назад +7

      Sounds a lot like the molecular ratchet or rectifying thermal noise sort of ideas. So far none have worked.

    • @zsmith200
      @zsmith200 Год назад

      We actually don’t need a closed system to make accurate predictions with thermo though. You can derive how a system in contact with a thermal reservoir will act by treating the system and reservoir as a closed system. You’ll still see that perpetual motion machines like Brownian ratchets are impossible with heat exchange

    • @dragons_red
      @dragons_red Год назад +1

      You're not going to power much from Brownian motion.

    • @IVANHOECHAPUT
      @IVANHOECHAPUT Год назад

      Duh... You're vibrating a membrane and expecting electricity. Where do you think the energy to vibrate the membrane came from - the vacuum?

    • @G1vr1x
      @G1vr1x Год назад +1

      I just checked a presentation from Paul Thibado, "Charging Capacitors Using Graphene Fluctuations", where he explains in a nutshell that the energy in his experiment is gathered by the diodes, not the graphene. The graphene is acting like a variable capacitor that doesn't need energy input but is at equilibrium. By his own claims it doesn't break the 2nd law of thermodynamics, even though I don't fully get the explanation.
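The objection running through this thread, that a single reservoir at one temperature cannot be rectified into useful work, can be sketched with the Carnot bound: the maximum fraction of heat convertible to work between two reservoirs is 1 - T_cold/T_hot, which vanishes when everything sits at one temperature. A toy illustration (the temperature values are arbitrary, not taken from the video):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat convertible to work between two reservoirs (kelvin)."""
    if not (0 < t_cold_k <= t_hot_k):
        raise ValueError("need 0 < t_cold <= t_hot (kelvin)")
    return 1.0 - t_cold_k / t_hot_k

print(carnot_efficiency(600.0, 300.0))  # 0.5: a real temperature difference can do work
print(carnot_efficiency(300.0, 300.0))  # 0.0: a sealed box at equilibrium cannot
```

With no second, colder reservoir, the sealed sphere imagined above has zero maximum efficiency, which is one way of stating the 2nd law.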

  • @Viewpoint314
    @Viewpoint314 Год назад +5

    That was one of the best lectures although it is hard to compare lectures. It's maybe the first time entropy makes sense even though I have been reading about it all my life. I also like your dry physics humor. One needs humor in life. So thank you very much.

  • @coolParadigms
    @coolParadigms 5 месяцев назад +2

    At 14:25, "I believe as the Universe gets older ..." and we observe "local entropy" ... "... new systems will emerge ..."! Absolutely, in a never-ending dance! And we start to have clues, such as the pseudo "dark energy" and galaxies really too old to belong to our own Big Bang! I dig into these questions a c-l-i-c-k away!

  • @dr.gordontaub1702
    @dr.gordontaub1702 Год назад +91

    I think many of my (undergraduate) students would get a laugh (or a cry) out of the phrase, '...You don't need to know that much maths, just differential equations and probabilities...'

    • @bryck7853
      @bryck7853 Год назад +6

      linear algebra and complex number theory be damned!

    • @afterthesmash
      @afterthesmash Год назад +10

      There's not a single thing about Special Relativity that Einstein couldn't have explained to Archimedes. Einstein would probably take a detour into explaining the modern notation of algebra. But this wouldn't really _be_ algebra, for the same reason that many undergraduates in computer science can barely distinguish a formula from an equation.
      Archimedes: This algebra thing is cool! I wonder what else you can do with it.
      Einstein: Well, I did once take a year-long detour into the deep technical weeds of Ricci tensors.
      Archimedes: Excellent! Please explain.
      [Archimedes smooths out some complex geometric diagram in the sand.]
      Einstein: Uh, okay, but we're going to need a _much_ bigger beach.

    • @johnzander9990
      @johnzander9990 Год назад

      My college students would laugh at "half the box being filled with particles" being called a low entropy state, as they would understand that this is just one state that's counted in the entropy of the system.
      Since your handle implies you have physics students: do you not see anything wrong with her understanding of thermodynamics?

    • @brooklyna007
      @brooklyna007 Год назад +10

      @@johnzander9990 But it is clearly a lower-entropy state relative to being fully spread out in the box. Just by halving the space you're halving the number of microstates available to each particle. Further, it is clearly not an equilibrium state and not a state that can last long.

    • @brooklyna007
      @brooklyna007 Год назад

      @@johnzander9990 Sorry, I should add that your understanding of entropy is a bit concerning. The state of the box half filled is the macrostate. It is not one of the contributing microstates to some other macrostate, as your statement implies. E.g. you could instead have the macrostate be that all molecules are in the box, and *then* you could consider "all on one side" as a contributing microstate to that macrostate.
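The halving argument in this thread can be checked with a toy count: if each of N distinguishable particles can independently sit in the left or right half, the "spread out" macrostate has 2^N microstates while "all on one side" has just 1, so Boltzmann's S = k_B ln W drops by N k_B ln 2. A minimal sketch (N = 100 is an arbitrary choice for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B * ln(W) for a macrostate with W microstates."""
    return k_B * math.log(num_microstates)

N = 100                 # toy number of distinguishable particles
spread_out = 2 ** N     # each particle can be in either half of the box
one_side = 1            # every particle confined to the left half

delta_S = boltzmann_entropy(spread_out) - boltzmann_entropy(one_side)
assert math.isclose(delta_S, N * k_B * math.log(2))
print(delta_S)  # ~9.57e-22 J/K of entropy gained by spreading out
```

The gap grows linearly in N in the exponent, which is why macroscopic particle numbers make the confined macrostate astronomically unlikely.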

  • @cribbsprojects
    @cribbsprojects Год назад +7

    Perfect timing. I am reading The Order of Time by Carlo Rovelli. And the low entropy Google algorithm sends me right here. Always enjoy your explanations. Well worth my subscription.

    • @hyperduality2838
      @hyperduality2838 Год назад

      SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics!
      Making predictions to track targets, goals & objective is a syntropic process -- teleological.
      Teleological physics (syntropy) is dual to non teleological physics (entropy).
      Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant.
      Mathematicians create new concepts from their perceptions (geometry) all the time!
      Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant.
      Mathematics (concepts) is dual to physics (measurements, perceptions).
      Deductive inference is dual to inductive inference -- Immanuel Kant.
      Inference is dual.
      The rule of two -- Darth Bane, Sith lord.
      "Always two there are" -- Yoda.
      Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence.
      Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

  • @frankakruger8432
    @frankakruger8432 Год назад +9

    I used to hate science in school but you make everything easy to understand, fun and interesting! I love watching your channel!

  • @Pencil0fDoom
    @Pencil0fDoom 4 месяца назад +2

    My assessment of the reason behind the confusion that seems to permeate discussions about entropy is that in the act of representing entropy as a thing, our focus is drawn to it as such. Yet it is more like the negative space *around* objects, rather than a thing itself. Entropy is an effect of a process. The most succinct description I have heard (for those of us who may not readily be able to calculate Boltzmann's S = k ln W) is that "Entropy is the tendency of energy to go from concentrated to diffuse." Every single event since the beginning of the universe (assuming it had one) can be described as follows… "Energy changing from one concentrated form to a different, more diffuse form." Everything from our planet's geology, magnetosphere, biosphere, including our bodies & their metabolic processes, brain activity, almost any form of technology we invent and use… all of it merely capitalizes on complications of this eons-long cosmic slide down the entropic decline. We are all living in the aftermath of an explosion; embers in the dark.
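The "concentrated to diffuse" description can be illustrated with a toy random walk: start all walkers in one bin and watch the Shannon entropy of the coarse-grained occupation numbers climb toward its maximum, ln(number of bins). A rough sketch (the bin count, walker count, and step count are all arbitrary choices):

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible
NUM_BINS, NUM_WALKERS = 8, 1000
positions = [0] * NUM_WALKERS  # everyone starts concentrated in bin 0

def coarse_entropy(positions):
    """Shannon entropy (in nats) of the occupation distribution over bins."""
    counts = [positions.count(b) for b in range(NUM_BINS)]
    probs = [c / len(positions) for c in counts if c > 0]
    return sum(-p * math.log(p) for p in probs)

entropies = [coarse_entropy(positions)]
for _ in range(200):
    # each walker hops one bin left or right, reflecting at the walls
    positions = [min(NUM_BINS - 1, max(0, x + random.choice((-1, 1))))
                 for x in positions]
    entropies.append(coarse_entropy(positions))

print(entropies[0])   # 0.0: fully concentrated
print(entropies[-1])  # close to math.log(NUM_BINS) ≈ 2.08: diffuse
```

Nothing in the hopping rule prefers spreading out; the entropy rises simply because there are vastly more diffuse arrangements than concentrated ones.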

    • @LeZylox
      @LeZylox 3 месяца назад +1

      And what to do with it?

    • @LeZylox
      @LeZylox 3 месяца назад +2

      Accelerationism?

  • @Lemerlex
    @Lemerlex Год назад +4

    If we crack fusion and are able to create low-entropy objects in perpetuity (and maybe even black hole generators), how does that affect the progression towards heat death?
    Schrödinger wrote a great book later in his life titled "What is Life?" that touched on entropy and order, but it always confused me. This video helped to clear a lot up! Thanks Sabine :)

    • @prioris55555
      @prioris55555 Год назад

      black holes are FICTION
      nor do we live in a gravity based universe

  • @Seofthwa
    @Seofthwa Год назад +14

    Hi Sabine, I think this is imo one of your best physics presentations yet. Explaining entropy and how it applies to pretty much all aspects of nature. Thanks.

    • @telumatramenti7250
      @telumatramenti7250 Год назад

      You know, it's not as if she made a conscious choice to be conceived, and then to pop out of her mom's uterus for the sake of her future RUclips audience... 😂 You could thank her for making a RUclips channel, but the caveat is that she isn't a believer in Free Will (and in Agency by extension, at least not in the classical sense) so...🤔

    • @MR-ub6sq
      @MR-ub6sq Год назад

      @@hyperduality2838 Really?

  • @dannyb3663
    @dannyb3663 Год назад +14

    I remember the days when nothing Sabine is saying would make sense. Now it all makes sense. I've been watching her videos for so long.

    • @Anaesify
      @Anaesify Год назад +1

      I feel exactly the same! What an amazing resource for learning this platform is, i feel so lucky to be able to learn from geniuses like Sabine

  • @fwwryh7862
    @fwwryh7862 2 месяца назад +1

    Her videos start off simple and gradually get more complex. The half way point is enough for me.

    • @keppela1
      @keppela1 Месяц назад

      Agreed. She's personable, but not very good at explaining things.

  • @lexluthier8290
    @lexluthier8290 Год назад +15

    As a total layman, I love the way you manage to dumb down any subject to my level without actually dumbing it down. I wish you'd been my physics teacher when I was at school!
    However, you did give me one microsecond of pure existential dread when (I thought) you said that the heat death of the universe would occur in ten to one hundred years. Aaaaaaaaghhh! Oh, no, that's ten to THE one hundred years. Panic over. Now I need to find some clean trousers.....

  • @abelriboulot7166
    @abelriboulot7166 Год назад +16

    Great video! But a tiny correction: high entropy means high information, not low information. The more random something is, the less succinctly it can be explained. At least for Shannon entropy. For instance, imagine water in its solid form with molecules neatly aligned: it can be described succinctly as a repeating pattern, whereas in its (high-entropy) gaseous form each particle can be anywhere in the volume that contains it: describing the microstate would require describing the position of each molecule (high information).
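The description-length point in this comment can be made concrete with base-2 Shannon entropy: a particle pinned to a known lattice site costs zero bits to describe, while a gas particle equally likely to be in any of M cells costs log2(M) bits. A toy sketch (M = 1024 cells is an arbitrary illustration):

```python
import math

def shannon_entropy_bits(probs):
    """H(p) = -sum p*log2(p): average bits needed to specify an outcome."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

M = 1024  # number of cells a single gas particle could occupy

ice_particle = [1.0] + [0.0] * (M - 1)   # position known: one certain cell
gas_particle = [1.0 / M] * M             # position uniform over all cells

print(shannon_entropy_bits(ice_particle))  # 0.0 bits: nothing left to say
print(shannon_entropy_bits(gas_particle))  # 10.0 bits (= log2(1024)) per particle
```

In this sense the high-entropy state is the one that takes the most information to pin down, which is the correction the comment is making.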

    • @commentarytalk1446
      @commentarytalk1446 Год назад

      That's a nice description. Just watching the vid, but based on what you say I wonder if this video can do better than that concise description. Maybe the simpler the explanation, the more encompassing it is? Entropy feels like a description tending towards that... hence its ubiquity/applicability.

    • @willthecat3861
      @willthecat3861 Год назад +3

      Ya... I think one has to be careful when analyzing a physical example using Shannon. In physics it's about information known (or information that can potentially be known). It's not conceivable that we could know very much about the micro-states of gas molecules, yet we have a lot of information about the micro-states of ice. Thus, in the context of the ice and gas example you gave, it's kind of the opposite of what you said. There is a subtle connection between the two concepts of entropy. A good paper is by Jaynes, if you want to read that. Sean Carroll also writes about this too.

    • @lawrencedoliveiro9104
      @lawrencedoliveiro9104 Год назад

      But given that entropy tends to increase, does that mean the amount of information tends to increase or decrease?
      Consider the case of a gas confined to one half of a container by a barrier. Then the barrier is opened and the gas escapes to fill both halves of the container. Do we have more or less information about the positions of the gas particles than we did before?

    • @jarno.rajala
      @jarno.rajala Год назад +4

      This bothered me too. Maybe Sabine meant information in the colloquial sense, not how it's defined in Information Theory. Either way it's confusing.

    • @noneofyourbusiness-qd7xi
      @noneofyourbusiness-qd7xi Год назад +3

      You are perfectly right and she got it wrong. Information and entropy are perfectly positively correlated, not negatively.

  • @kencusick6311
    @kencusick6311 Год назад +6

    How did Sabine know I was about to ask, “But what about Quantum Mechanics?”
    Spooky action at a distance at work?

  • @markcello9879
    @markcello9879 3 месяца назад +1

    How satisfying to hear that the reference to order in explanations of entropy is nonsense.
    The connection to information and macro/micro states is another source of confusion, I'd say, as it can introduce the role of observers and their subjective view into the definition.

  • @pablostraub
    @pablostraub Год назад +8

    This is a great explanation of entropy. I loved when Sabine said that entropy has nothing to do with order.
    Also, that entropy depends on the definition of macrostates, which are defined by the observers: us.

    • @MR-ub6sq
      @MR-ub6sq Год назад

      Sorry sorry! I was gone for so many hours. Of course we can talk. On what topic?

    • @PrivateSi
      @PrivateSi Год назад

      But in the bigger picture we're talking about a battle between a system's tendency to disorder vs order. The 2nd law states all physical systems tend to a disordered equilibrium. This in itself is problematic, as a perfect equilibrium would be well ordered. A gas at PERFECT equilibrium in a container would be perfectly evenly spread with its molecules at rest. Quantum fuzz may prevent this perfectly ordered system, and scientists consider a gas where all the molecules have the same velocity to be in equilibrium too, even if they are moving about 'randomly'..
      --
      The question is, has the universe become more or less ordered over time? MUCH MORE is the answer, not less. That the basics of the 2nd law of thermodynamics hold for MOST closed systems is the best we can honestly say, if you ask me, which you didn't! To say it holds for the whole universe, be it finite or infinite, would be a leap of absurd faith.

  • @yangpachankis
    @yangpachankis Год назад +5

    This has become my favorite stream of yours. I never thought entropy could be explained in such simple terms. Also, the past hypothesis may be falsifiable with how you designed it. Human life expectancy has been increasing over time. Inspirational.

  • @inciaradible7144
    @inciaradible7144 Год назад +6

    Fascinating video and very well explained; entropy has always been one of those quantities I've found difficult to define or understand. It has mostly been explained to me through the concept of order (or just plain equations), and those mostly just made me go 'oh, that makes sense... wait.'
    Minor note, Boltzmann has two n's. 😅

  • @robert48719
    @robert48719 11 месяцев назад +7

    I have to say: her jokes are rather as you would expect them to be from a German

    • @glendaly3344
      @glendaly3344 6 месяцев назад +1

      I disagree, as she is making jokes, and they appear to be at random as opposed to at scheduled times.

  • @levarenewagemusic
    @levarenewagemusic Год назад +4

    I absolutely love this discussion! Everything is entropy. However, I always wonder about entropy and infinity and the implications. If a system and its microstate become open at some point, will this not allow a low-entropy influx just as the divider would? It would introduce new information for sure. Our perception seems to be a factor in this riddle, and perhaps the microstate (the system) is actually consciousness and will remain a stable charge intertwining between infinite dimensional states. Perhaps "thought" is the only real thing.

  • @graine7929
    @graine7929 Год назад +11

    As a former physicist now making my way into philosophy of science, what you said in the "How will the universe end" part is the most satisfying take on entropy I've ever heard from a physicist. I've been wondering about precisely the fact that a "macrostate" is subjectively defined, so entropy increase must not be that fundamental, but I never heard of any physicist going down that rabbit hole. Thank you for that! You must have at least some interest in philosophy of physics to be that great on these questions :)

    • @chippyprice1993
      @chippyprice1993 Год назад

      If you view the universe as being in a single quantum state, then a macrostate is defined as the possible result of the expectation of this state on some Hamiltonian. The space of Hamiltonians defines the space of macrostates. I do not believe this to be subjectively defined.

    • @graine7929
      @graine7929 Год назад

      @@chippyprice1993 That is interesting, so in this regard only the entropy of the whole universe increases through time because other macrostates are ill-defined, right ?

    • @chippyprice1993
      @chippyprice1993 Год назад

      @@graine7929 I'm not sure I understand what you mean by other macrostates are ill defined. You mean partial macrostates? Like ones that apply to a part of the universe?

    • @thethinredline4714
      @thethinredline4714 Год назад

      A video on the limitations of scientism ruclips.net/video/zFqj_NEbfgY/видео.html

    • @thethinredline4714
      @thethinredline4714 Год назад +2

      what created the low entropy at the beginning of the universe

  • @zerocero5850
    @zerocero5850 Год назад +4

    So much food for thought. One of your best yet Sabine.

  • @jaysmith8957
    @jaysmith8957 8 месяцев назад +1

    Thank you for so clearly stating what I've always fundamentally thought about entropy. Of course I will mention Dr. Asimov's famous story "The Last Question" which so firmly planted this idea in my mind. Life is truly the universe's riposte to the dismal 2nd law of thermodynamics. "Let there be light..."

  • @Nefville
    @Nefville Год назад +32

    According to entropy at some point we're going to come back to this video and say "well this didn't age well" 😉

    • @anywallsocket
      @anywallsocket Год назад +4

      That'd be Poincaré's recurrence theorem, which only applies to classical closed systems.

    • @ThePowerLover
      @ThePowerLover Год назад

      @@elinope4745 Not quite.

    • @bavingeter423
      @bavingeter423 6 месяцев назад +1

      I think this video is already the response to the sentiment of 30 years ago

  • @blackmber
    @blackmber Год назад +16

    I love that you started this video talking about biology. It’s not often addressed when it comes to entropy, but life is the exact reason I’ve been skeptical about the universe heat death prediction. Life has a way of swimming upstream, cleaning up messes, and finding energy.

    • @drersatz9822
      @drersatz9822 Год назад +22

      Entropy is also addressed and very real in biology. Cells locally decrease entropy (by organizing themselves into complex structures etc.), but they can only do so by increasing entropy around them (absorbing nutrients and releasing wastes, breaking down molecules etc.). When you study lifeforms, you realize there is no "trick" and no escape from entropy there.
      All in all, the reality is that without freely available energy (work), every complex life form ultimately dies. Once the free energy in a closed system is spent, it's over, period. And one could argue that the universe is such a system.

    • @TheBouli
      @TheBouli Год назад +12

      Well, the argument goes like this: entropy of the whole system increases over time (/ doesn't decrease). So looking at biological organisms by themselves is a mistake. If you take into account everything they (we) do, the entropy actually goes up. As the video says: we increase the entropy of the sun, fossil fuels etc. to decrease the entropy of others. However, if you calculate entropy for, say, the solar system as a whole, it increased as a result.
      There's even an interesting philosophical point here: you could say that the low local entropy that biological systems represent is a good way to maximize the increase(!) of overall entropy, because through our actions we alter the environment in more sophisticated ways, increasing the overall entropy of everything around us through the search for usable energy (which, as Sabine stated, means local low-entropy reservoirs).
      I'm not sure I agree with this line of reasoning; I'm just saying that there's more to biology vs entropy than meets the eye.

    • @stardolphin2
      @stardolphin2 Год назад

      But that only happens at the expense of increasing entropy elsewhere. Mostly in the Sun, but some, like the life around undersea 'black smokers,' at the expense of nuclear decay of materials in Earth's crust (and *perhaps* also tidal flexing in some planetary moons, keeping interior water liquid).
      However, all stars have finite fuel for fusion (iron-56 and nickel-56 are the end of the line, if not sooner), nuclear decay eventually reaches its lowest state, and tidal flexing goes away as those orbits tend to circularize and 'flexing' stops (and over *seriously* long times, orbits decrease via gravitational radiation).
      It's true that life is a constant *struggle against* entropy, though.

    • @noneofyourbusiness-qd7xi
      @noneofyourbusiness-qd7xi Год назад +4

      Life (once it exists, and especially in prokaryotic form) is pretty good at avoiding extinction, but it's pretty bad at breaking natural laws.
      So, sorry pal, heat death is the absolute end (and very likely the last living organism will have died a lot earlier than that).

    • @noneofyourbusiness-qd7xi
      @noneofyourbusiness-qd7xi Год назад

      @@TheBouli
      Once heat death has been reached there is no way of extracting energy from any source to fuel life (and all living beings are parasites that thrive on the increase of entropy, that is, on the availability of an external energy source). Trying to extract energy from a closed max-entropy system is like trying to extract zero-point energy from an object. It won't work. It's just physically impossible.

  • @squarewheels2491
    @squarewheels2491 Год назад +5

    Chaos theory always seemed to go against entropy, but I never really got far enough into either. Chaos theory was the study of order arising from a chaotic environment. The example I remember is cars and red lights: even if the lights, roads and cars are random, the time to drive from point to point tends to be close to an average that emerges naturally. Order emerging from a high-entropy state. I guess it could go with the information thing: when we don't know all the microstates, order can emerge from chaotic interactions.

    • @gillesdeleuze6083
      @gillesdeleuze6083 Год назад

      Yep. All the work done in non-equilibrium thermodynamics is ignored by physics.

    • @antm9771
      @antm9771 Год назад +2

      Yeah, but as far as I can tell, in most chaotic systems the general trend is to create and utilize order to produce more chaos. For example, black holes or humans tend to turn all kinds of stuff into plain heat (useless energy), and we are highly ordered ourselves, a product of chaotic nature and the universe. I don't know if we can reach the sci-fi level where we can break matter down into useless energy better than a black hole itself. So in the big picture, chaos theory does not seem to go against entropy.
      In fact we can say life exists to "accelerate" entropy.

    • @havenbastion
      @havenbastion Год назад

      entropy = -emergence

  • @MatejPavšič-i3k
    @MatejPavšič-i3k 8 месяцев назад

    I liked this video because it resonated with my views on entropy that I gained after reading the inspiring article by Myron Tribus and Edward C. McIrvine, "Energy and Information", Scientific American , Vol. 225, No. 3 (September 1971), pp. 179-190. I strongly recommend that paper, because it provides an insightful complement and an additional explanation and formalization of the points raised by Sabine.
    Following Shannon's original proposal, lucidly explained by Tribus and McIrvine, entropy is defined as the degree of uncertainty, X, that one has about a certain question Q. A question can be "In which part of the box is the particle, left or right?". Complete uncertainty about that question means that there is equal probability, p_1 = p_2 = 1/2, for the particle being in the left or the right part of the box. The entropy, defined as S(Q|X) = -\sum p_i ln p_i, is then S(Q|X) = -ln(1/2) = ln 2. On the contrary, if one knows the answer, namely that the particle is in the left part (denoted 1) and not in the right part (denoted 2), then p_1 = 1 and p_2 = 0. The entropy is then S(Q|X) = 0, which means that there is no uncertainty about the question Q. This is in contrast with S(Q|X) = ln 2, which is maximal ignorance about the question Q. The authors then remark: "The only state of greater ignorance is not to know Q." And not knowing Q, or even worse, not even being aware of the fact that one must bring Q into the definition of entropy, is a source of all the confusion and all the difficulties people have in understanding the concept of entropy. It was great that Sabine revealed, in her own wonderful and easily graspable way, this important point to the general public.
    In the article cited above, the simple model with two possible answers that I gave to illustrate the idea was generalized to generic questions and any number of possible answers. Information was defined as the difference between two uncertainties, I = S(Q|X) - S(Q|X'). The relation to the thermodynamic entropy was laid out. Again, a fabulous, inspiring, highly cited paper that removes the longstanding "mystery" and confusion.
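The two-answer example in the comment above can be reproduced directly from S(Q|X) = -Σ p_i ln p_i:

```python
import math

def uncertainty(probs):
    """S(Q|X) = -sum p_i ln p_i over the possible answers to question Q."""
    return sum(-p * math.log(p) for p in probs if p > 0)

# Complete uncertainty: the particle is equally likely in either half.
print(uncertainty([0.5, 0.5]))  # ln 2 ≈ 0.6931, maximal ignorance about Q

# Complete knowledge: the particle is certainly in the left half.
print(uncertainty([1.0, 0.0]))  # 0.0, no uncertainty about Q at all
```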

    • @MatejPavšič-i3k
      @MatejPavšič-i3k 8 месяцев назад

      @kaboomboom5967 You would certainly understand the Scientific American paper "Energy and Information" which is very clear and readable, written for general audience.

  • @ArchaicSEAL.ST3
    @ArchaicSEAL.ST3 Год назад +8

    Can you do a vid discussing your take on the recently proposed “Second Law of Quantum Complexity?” I am both fascinated by it and would love to hear where you stand on its proposal and maths. Thanks for all you do Sabine!

  • @ibperth
    @ibperth Год назад +13

    The interplay between life (makes order from disorder) and the 2nd Law of Thermodynamics (expects disorder from order) is very interesting. The way life fights the losing battle of ageing is by being multi-generational, a reset to a low entropy state. It then follows that life is fundamentally about paying it forward with the older generation being devoted to the success of the younger one.

    • @jzemens4646
      @jzemens4646 Год назад +2

      A very enlightening observation to make. Touche'

    • @ibperth
      @ibperth Год назад

      @@jzemens4646 Furthermore, the constant battle against the 2nd Law of Thermodynamics is why societies evolve to more enlightened states, not uniformly or monotonically, but inevitably. This is because every generation has to create sufficient new information to overcome the entropy increase. Those societies that are unable to do so eventually perish. Very few societies can maintain an exact balance, by neither advancing nor going backwards. The rest, utilising the culture and the fruits of science build more information than entropy increase destroys and arrive at a better place from one generation to the next.

    • @timq6224
      @timq6224 Год назад +3

      but it isn't order from disorder. The sun literally provides more energy to the surface of the planet than is required by the life on it. The 2nd law only applies to a closed system, and the earth is by no means a closed system.

    • @subplantant
      @subplantant Год назад +1

      All life radiates heat and therefore functions as an entropy machine. The "order" of life is emergent from the 2nd law of thermodynamics because it functions to increase entropy (i.e. it spreads out energy). One could cite climate change as proof of this!

    • @whimpypatrol5503
      @whimpypatrol5503 Год назад +1

      Entropy is much more concrete in the case of genetics than heat. Think of the landscape model that virus physicist Peter Schuster describes in explaining the natural algorithm that allowed proteins to evolve into the efficient, optimal folding machines that they are. A similar landscape model can be used to describe the universe of all possible gene configurations and organize them by the genetic differences (as distances) between them. For organisms to evolve, their gene structures must migrate across generations through this landscape. The law of entropy applies to this landscape because fewer gene arrangements produce successful complex life forms than produce successful simple life forms. And, likewise, gene arrangements that produce successful simple life forms are outnumbered by those that produce unsuccessful life. We know this based on entropy. Within this landscape may be more than 10⁶⁰⁰ possible arrangements. However many, it is a number so big that not even a smidgen of them could have occurred across 4 billion years, even if every organism that ever existed in that time frame was genetically unique. Yet, in defiance of the static entropy characteristics of the genetics landscape mapping all life, gene configurations managed to randomly migrate to ones that produce ever increasingly complex life forms. All I can say is bull dung; it didn't happen. Do the math. A natural-selection genetic algorithm is not sufficient to solve such complex problems, approximately optimizing dozens and dozens of interrelated phenotype systems in single organisms. Bull dung. No algorithm could solve such a search problem.

  • @Brian-rh3qh
    @Brian-rh3qh Год назад +5

    I love you Sabine. You explain physics so well. I never understood entropy.

    • @sanchellewellyn3478
      @sanchellewellyn3478 Год назад

      I didn't understand entropy until I had a two-year-old . . .

    • @andyreznick
      @andyreznick Год назад

      I always think of it as the reason my soup cools down enough to eat.


  • @purrrradise
    @purrrradise 11 месяцев назад +1

    Then lower entropy (regarding RHEOSTASIS) of a battery with a strong magnet, for example 👍:)

  • @merthsoft
    @merthsoft Год назад +8

    I'm just an amateur physics enjoyer, but this is a lot more satisfying as an explanation.

  • @Krawolga
    @Krawolga Год назад +11

    "This was the most optimistic and uplifting video I have ever done and can ever do. It can only go downhill from here."
    Magnificent!

    • @retrieval1
      @retrieval1 Год назад

      You have many great videos ahead of you

  • @TheCreativeKruemel
    @TheCreativeKruemel Год назад +6

    11:36 Sabine you are wrong. High entropy indeed corresponds to high information content, as per Claude Shannon's definition in information theory. The reasoning behind this is that entropy essentially quantifies the level of uncertainty or randomness within a set of data. The more uncertain or unpredictable the data, the more information it contains. For reference: "en.wikipedia.org/wiki/Entropy_(information_theory)"

    • @Darenimo
      @Darenimo Год назад

      If a highly likely (High entropy) event occurs, the message carries very little information. On the other hand, if a highly unlikely (Low entropy) event occurs, the message is much more informative.
      This is exactly what Sabine is saying.

    • @TheCreativeKruemel
      @TheCreativeKruemel Год назад +1

      ​@@Darenimo You twisted some definitions in your comment. Entropy IS average information. If a high-entropy message occurs, then the message carries lots of information. High Entropy is NOT the same as "highly likely". This holds for information theory AND statistical mechanics!

    • @NightmareCourtPictures
      @NightmareCourtPictures Год назад

      ​@@TheCreativeKruemel it's not that anyone here is wrong, it's just the definitions of stat mech is all screwed up, and always has been. Physicists have been arguing about thermodynamics for god damn years cause of the way it was formulated.
      First, you have to remember what information really is, and what observer the "message" is being viewed from.
      If you read the message
      000110010101
      This tells you some kind of information, but it turns out that this message was just the macro-state description of a more underlying message
      00110000 00110000 00110000 00110001 00110001 00110000 00110000 00110001 00110000 00110001 00110000 00110001
      When you compare these two, their information content is the same (a binary expansion of a binary set of numbers, where 00110000 is 0 and 00110001 is 1). The 2nd clearly has more bits of information, because it's very obviously longer and can describe a lot more states (higher Shannon entropy). But also under Shannon, the surprise of the two messages is equivalent... in that these messages have the same amount of surprise, and therefore have the same "Shannon" information.
      Additionally, and by contrast, randomness is not well defined. Consider again the "random" string of 0's and 1's: 000110010101... it contains regularity, and in fact, for any arbitrarily large string, you will never find a truly random sequence; there will always be patterns and repetition in that sequence... 000, 11, 00, 0101, 1010... these are patterns embedded in the seemingly "random" string, and I'd challenge anyone to try to create a truly random string of 0's and 1's. The only time one can is when each bit is unique, at which point the idea of "surprise" gets thrown out the window, because every state would be a surprise if all of its bits are unique (and this is the true extension of what Sabine said, just from an information-theoretic viewpoint)
      Cheers,

    • @Darenimo
      @Darenimo Год назад

      Such as, if you have an airtight box on your desk, and you request information about the contents of the box, you could get a high entropy response, letting you know all the possible configurations of atoms and molecules that could exist in the box.
      Or you could get a low entropy message, telling you it contains mustard gas and opt to not open it. And you'd probably find you gained more information from the latter message than the former, even if the former had a higher information content.

    • @TheCreativeKruemel
      @TheCreativeKruemel Год назад

      @@Darenimo Hey good point :)!
      The low entropy response, saying "it contains mustard gas", is less detailed but more straightforward for us to understand.
      However, 'information' here doesn't just mean what WE can understand easily (very important!). It includes ALL possible details, even very complex ones (one key idea in Shannon's original paper).
      In principle it could also be possible to use data compression and Character encoding on the high entropy message to reduce the high entropy message to a message we humans can read :).
      The important thing to understand is that the high entropy message (about all configurations in the box) contains ALL the information of the low entropy message!
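The Shannon definition this thread keeps circling can be written down in a few lines. A minimal sketch in Python (function name is illustrative), showing that a maximally uncertain source carries more average information per symbol than a predictable one:

```python
import math

def shannon_entropy(probs):
    """Average information in bits of a source with the given
    symbol probabilities: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit per symbol.
fair = shannon_entropy([0.5, 0.5])

# A heavily biased coin is predictable, so each outcome carries
# much less information on average.
biased = shannon_entropy([0.99, 0.01])

print(fair)    # 1.0
print(biased)  # about 0.08
```

High entropy here means high average information, which is the point under dispute above.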

  • @EminezArtus
    @EminezArtus 8 месяцев назад +1

    Probably one of the best explanation of Entropy. Thanks for the video.

  • @petermersch9059
    @petermersch9059 Год назад +5

    In my book "Evolution: Von Mikroorganismen zu multinationalen Konzernen - 3. Auflage" (only on Amazon) I came to somewhat different conclusions. This book is also based on the idea that the increase of entropy means a loss of information (see Murray Gell-Mann). However, when discussing life, physicists practically always talk about thermodynamic entropy (see Schrödinger). In my book I claimed instead that living beings have competences (knowledge etc.) towards their habitat in order to gain resources (especially energy) from it. However, they are primarily concerned with not losing information (which has to be stored physically somewhere). In other words: living beings behave in a loss-of-competence-averse way. And this is then also the basic assumption of the competence-based theory of evolution, which is substantiated in my book. Unlike Schrödinger's approach, it can also be used to justify reproduction. Schrödinger's approach can only explain self-preservation of living beings, not reproduction.
    The competence-based theory of evolution is a generalization of the original Darwinian theory of evolution (which unfortunately has no physical foundation). However, it contradicts the theory of selfish genes and the total fitness theory.
    I deal in my book of course also with the question whether there will be no more life in the universe sometime. In my opinion this will be so. Because life needs competences. And these competences (knowledge) must be physically held somewhere (DNA, brain, hard drives, etc.) as extremely improbable states. And they must be reproduced permanently, because the entropy law applies. So high quality energy is constantly needed to reproduce the competencies (so that high quality energy can still be obtained in the future). And I am afraid that at some point this high quality energy will no longer exist in sufficient quantity.
    My book is in German and quite difficult. It also covers the evolution of man and his societies, which Darwin cannot even begin to do. Biologists are often of the opinion that man is just another animal. My book, however, very clearly concludes that man has emerged from the animal kingdom. Basically, human beings have acquired more competencies than the rest of the animal kingdom put together. Man is capable of division of competencies, other animals need division of species for that.
    I hope to translate the book into English via ChatGPT and/or DeepL by the end of the year, until then only reading in German remains.

    • @arrakis8320
      @arrakis8320 Год назад +1

      I appreciate your book in english, look very interesting to study and know your view.

    • @noneofyourbusiness-qd7xi
      @noneofyourbusiness-qd7xi Год назад

      Information can not get lost (law of the conservation of quantum information). An increase in entropy does not mean a decrease in information, but an increase. That's the basis of information theory. If your book is based upon the idea that an increase of entropy means a loss of information, then it's built upon a flawed premise.

    • @petermersch9059
      @petermersch9059 Год назад

      @@noneofyourbusiness-qd7xi Unfortunately this is wrong. I recommend a Google search for "information loss entropy". Murray Gell-Mann explains the connection very well in his book: "The Quark and the Jaguar: Adventures in the Simple and the Complex".
      His explanation goes, roughly, like this: If all particles in a gas are in a tiny quadrant, then we know not only the macrostate of the gas, but also its microstate quite precisely. In this state, the entropy of the gas is very low. However, when the particles then disperse again, we lose much of the knowledge about the microstate of the gas. Thus, with the increase in entropy, we experience a loss of information (or knowledge about the microstate).
      The situation is similar for living beings in their environment. The better adapted it is to the environment, the greater its knowledge about its environment (Popper). One can also say: The greater are their competences towards the environment. If they lose a part of their adaptation, they lose a part of their knowledge about the environment. Schrödinger looked at the events from the thermodynamics of a living being. I think this is problematic, especially since evolution is about information processing (even genes are subjected to information processing). It is therefore more advantageous to look at the events from an information-theoretical point of view and to understand an increase in entropy as a loss of competence.
      And by the way: buildings decay with time. Living things decay in no time at all when they lose their life. But we also forget things, for example names. The older you get, the easier it is to forget. You lose information. Did it really never occur to you that this is the same mechanism?

  • @bhangrafan4480
    @bhangrafan4480 Год назад +5

    From what I learned about the laws of thermodynamics, great stress was placed on the concepts of OPEN vs CLOSED vs ISOLATED systems. We were taught that the potential function which determines the overall direction of change in an ISOLATED system is entropy increase. There can only be ONE isolated system, the universe. So my understanding was that as the universe evolves its total entropy increases. You seem to be saying that this is incorrect. That the total entropy of the universe can stay the same. Also my partial understanding of what you are saying, (as a non-expert), is really to do with open and closed systems within the isolated system of the universe. So that even when the total entropy of the universe as a whole has reached a maximum (and where does it stop?), you can have closed systems in which the entropy is locally decreased at the expense of the surroundings.

    • @NullHand
      @NullHand Год назад

      A vacuum Dewar is a pretty good short term approximation for an isolated system ( at least for chemical rxn time scales).
      And if the accelerating expansion of the "universe" that our astronomical observations indicate is true, then we are all trapped in segments of the "universe" by an effective event horizon defining our observable universe.
      A sort of relativistic Dewar. Yours will be only slightly different from mine, but researchers 100 billion light years from us are effectively in a completely different isolated system.

    • @TheUAoB
      @TheUAoB Год назад

      This is my understanding too. There's a theory that dark energy is proportional to the entropy of the universe system; specifically, the area of the boundary expands with the total entropy within the universe. It's speculated this also applies to black hole event horizons.

    • @raminagrobis6112
      @raminagrobis6112 Год назад +2

      You're right. Sabine's statement that the entropy of the universe can remain the same is utterly erroneous. It ALWAYS increases, regardless of any number of microstates you might be looking at. That it must increase with every physicochemical process is a corollary of the 2nd law of thermodynamics. And this has not changed (overall) since Boltzmann.

    • @WestleySherman
      @WestleySherman Год назад

      For an isolated system at equilibrium, there's the approximation/heuristic that all microstates are equally likely. So, defining macrostates in terms of subsets of the total available microstates in the system then allows calculating probabilities for the macrostates. And defining entropy as the (log of) the number of microstates comprising the macrostates makes sense. But, for systems that are not isolated, or that are not at equilibrium, not all microstates are equally likely. So then I'm less certain that entropy can be defined as a simple count of microstates.

    • @raminagrobis6112
      @raminagrobis6112 Год назад

      @@WestleySherman True, but entropy is the (negative) sum over each microstate of its probability Pi times log Pi. Pi can be different for each microstate that is part of a macrostate.
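That sum can be checked directly: the Gibbs form S = -Σ Pi log Pi reduces to Boltzmann's log W when all W microstates are equally likely, and any non-uniform set of Pi gives a smaller value. A small Python sketch (names are illustrative):

```python
import math

def gibbs_entropy(probs):
    """Gibbs entropy S = -sum(p_i * ln(p_i)) in units of Boltzmann's k.
    For W equally likely microstates this reduces to ln(W)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 8
uniform = gibbs_entropy([1 / W] * W)  # equals ln(8), the Boltzmann value
skewed = gibbs_entropy([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])

# Non-uniform microstate probabilities give strictly lower entropy.
print(uniform > skewed)  # True
```

This is why the simple count-of-microstates definition only applies to isolated systems at equilibrium, as the comment above notes.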

  • @nissieln
    @nissieln Год назад +6

    This is truly your best video IMO. Thank you Sabine! Finally a happy ending, from which, as you said, everything will have to go downhill 😂

  • @whong09
    @whong09 11 месяцев назад +1

    Veritasium's video on entropy is really interesting and maybe aligns with the "uplifting" takeaway here. The universe doesn't just tend towards higher entropy, it also tends towards faster entropy gain. Life as we know it is locally the fastest entropy-increasing mechanism; as entropy further increases, other life-like mechanisms (that may not be recognizable to us) should become likely, because the universe tends towards accelerating entropy growth.