The Misunderstood Nature of Entropy

  • Published: 14 Oct 2024
  • Viewers like you help make PBS (Thank you 😃). Support your local PBS Member Station here: to.pbs.org/Don...
    Entropy and the second law of thermodynamics has been credited with defining the arrow of time.
    You can further support us on Patreon at / pbsspacetime
    Get your own Space Time t­-shirt at bit.ly/1QlzoBi
    Tweet at us! @pbsspacetime
    Facebook: pbsspacetime
    Email us! pbsspacetime [at] gmail [dot] com
    Comment on Reddit: / pbsspacetime
    Help translate our videos!
    Previous Episode:
    Quantum Invariance & The Origin of The Standard Model
    • Quantum Invariance & T...
    Entropy is surely one of the most intriguing and misunderstood concepts in all of physics. The entropy of the universe must always increase - so says the second law of thermodynamics. It’s a law that seems emergent from deeper laws - it’s statistical in nature - and yet may ultimately be more fundamental and unavoidable than any other law of physics.
    Hosted by Matt O'Dowd
    Written by Matt O'Dowd
    Graphics by Grayson Blackmon
    Assistant Editing and Sound Design by Mike Petrow
    Made by Kornhaber Brown (www.kornhaberbrown.com)
    Special thanks to our Patreon Big Bang, Quasar and Hypernova Supporters:
    Big Bang
    CoolAsCats
    David Nicklas
    Anton Lifshits
    Joey Redner
    Fabrice Eap
    Quasar
    Tambe Barsbay
    Mayank M. Mehrota
    Mars Yentur
    Mark Rosenthal
    Dean Fuqua
    Roman Pinchuk
    ColeslawPurdie
    Vinnie Falco
    Hypernova
    Donal Botkin
    Edmund Fokschaner
    Matthew O’Connor
    Eugene Lawson
    Barry Hatfield
    Martha Hunt
    Joseph Salomone
    Chuck Zegar
    Craig Peterson
    Jordan Young
    Ratfeast
    John Hofmann
    Thanks to our Patreon Gamma Ray Burst Supporters:
    James Hughes
    Fabian Olesen
    Kris Fernet
    Jane Meyers
    James Flowers
    Greg Allen
    Denys Ivanov
    Nick Virtue
    Alexey Eromenko
    Nicholas Rose
    Scott Gossett
    Mark Vasile
    Patrick Murray
    سلطان الخليفي
    Alex Seto
    Michal-Peanut Karmi
    Erik Stein
    Kevin Warne
    JJ Bagnell
    Avi Goldfinger
    John Pettit
    Florian Stinglmayr
    Benoit Pagé-Guitard
    Nathan Leniz
    Brandon Labonte
    David Crane
    Greg Weiss
    Shannan Catalano
    Brandon Cook
    Malte Ubl

Comments • 2.2K

  • @Valdagast
    @Valdagast 6 years ago +1701

    My cat goes out into the cold, then comes in to warm himself in my bed. He is a sort of engine for moving heat out of the apartment. I don't know if he does any useful work, though.

    • @apekillssnake
      @apekillssnake 6 years ago +25

      Not so dumb, and close to a theory!

    • @circadianarchist
      @circadianarchist 6 years ago +64

      what an underrated comment

    • @Temp0raryName
      @Temp0raryName 6 years ago +79

      Your cat eats mousies. Digesting them creates heat, to warm you. They are well insulated so the cold outer fur is less of a heat sink, than your cat is a heat source. You must thank and cuddle your hot kitty bottle!

    • @dlevi67
      @dlevi67 6 years ago +48

      Always remember: dogs have a master, cats have servants.

    • @Temp0raryName
      @Temp0raryName 6 years ago +4

      *dlevi67* Word.

  • @thomascorbett2936
    @thomascorbett2936 3 years ago +500

    Now that I'm 70, I have a clear understanding of entropy when I look in the mirror .

  • @yuccapalme4577
    @yuccapalme4577 3 years ago +35

    "keep beeing that brilliant macrostate that is you" ... noone has ever given me a better compliment ☺️

  • @323observing
    @323observing 6 years ago +129

    Holy crap, this was my Astronomy professor last semester. Thanks for the lessons 👌🏻

  • @UnidentifiedAerialPhenomena0
    @UnidentifiedAerialPhenomena0 6 years ago +202

    ""Welcome to Entropy Burgers - may I take your order?"
    "I put in disorder a long time ago. The service here is getting worse all the time."
    "My experience Gibbs me reason to believe you."
    "I know the waitress who asked that, too. Her name's Ellen Omega. She really made me thermally dynamic. So, I asked her out. I tell you, when she don't like you, she really Boltz, man. Women like that are never distributed normally among the population."
    "What kind of Poisson would say something like this?""

    • @karolakkolo123
      @karolakkolo123 5 years ago +9

      That really hurt, ouch

    • @stephentrueman4843
      @stephentrueman4843 5 years ago +11

      i read it till the end

    • @jamesmaybury7452
      @jamesmaybury7452 5 years ago +1

      Fantastic explanation of entropy, thanks.
      It is slightly worse than you suggest. The idea that every micro-state is equally likely is a mathematical simplification, but to end in a particular state you have to get there.
      Take the state of a Go board: half black on one side, representing all "hot" atoms in one place, and a similar state with one black out of place. Getting the out-of-place atom to the right point on the board is even less likely, as it would have to move against the temperature/pressure gradient.
      Also, you are really comparing a high-entropy state where all atoms have around-average kinetic energy (speed) to one where half the atoms have all the energy. Collisions won't do that. The statistical model, with its simplifications, becomes inaccurate at the extremes.

    • @drakeequation521
      @drakeequation521 5 years ago +2

      There is no such thing as a disorder. If one drops a deck of cards off the top of a building, just what makes the result on the ground define "disorder"? The fact that some cards are up and others are down is no good, since blackjack and 7-card stud are forms of order and each of those games has some cards up and some down. The scattered cards do not represent a disorder, since the winner of a card game makes a similar so-called mess out of the formerly neatly stacked chips as he or she gathers them in. Therefore, entropy only describes how the number of possible orderings grows larger than the number we started with, along with a reduction in temperature over time.

    • @sh4dow666
      @sh4dow666 4 years ago

      @@drakeequation521 The fact that disorder isn't objective doesn't reduce its subjective reality. If you are interested only in whether all your cards are in a nice pile, then the line between order and disorder is drawn there - if you sorted the cards in some specific way, any other sorting would be disorder for your purpose.
      If you use the Bayesian concept of probability theory to derive statistical mechanics, its subjective nature is utterly obvious.

  • @MostlyPennyCat
    @MostlyPennyCat 6 years ago +205

    I prefer the sand castle analogy.
    There's trillions of arrangements of sand particles that define the macro state "lump of sand"
    But there's only millions that define the macro state "pretty sand castle"
    That's why the wind turns sand castles into lumps of sand, not lumps of sand into sand castles.
    Straight probability.
    And it shows you the "arrow of time"
    What I've always wondered is, does increasing entropy create the arrow of time?
    Or does the arrow of time increase entropy?
    Does time even exist?
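The "straight probability" point in the comment above can be made concrete with a toy model. This is an illustrative sketch, not the commenter's numbers: each sand grain is assumed to sit in one of just two regions, and the grain count is tiny compared to any real pile.

```python
from math import comb

n = 50  # toy number of sand grains (a real pile has vastly more)

# Model: each of n distinguishable grains lands in one of two regions,
# so there are 2**n equally likely microstates.
castle_microstates = 1             # "castle": one very specific arrangement class
lump_microstates = comb(n, n // 2) # "even lump": choose which grains go in each half

total = 2 ** n
print(f"P(castle)    ~ {castle_microstates / total:.1e}")
print(f"P(even lump) ~ {lump_microstates / total:.2f}")
```

Even in this tiny model the "lump" macrostate is about 14 orders of magnitude more likely than the "castle", which is the whole argument for why wind flattens castles and never builds them.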

    • @dlevi67
      @dlevi67 6 years ago +6

      Have you read "The Order of Time" by C. Rovelli?

    • @MostlyPennyCat
      @MostlyPennyCat 6 years ago +3

      Nope

    • @duaneeitzen1025
      @duaneeitzen1025 6 years ago +10

      Hmmm. From my perspective: the arrangement of sand after the wind blew down your sand castle was highly meaningful (one of only a handful of meaningful arrangements of the sand). I found your "sand castle" to just be a random lump of sand. So from my perspective, the wind decreased the entropy. Is entropy relative? Is it wrong of me to have found that arrangement meaningful? Is time running backwards for me?

    • @Draconaes
      @Draconaes 6 years ago +29

      Duane Eitzen - I'm gonna have to go with B: "Is it wrong of me to have found that arrangement meaningful?"
      Or I guess, maybe not wrong so much as missing the point. It isn't about being "meaningful" to some intelligent mammal. The pile of sand is more statistically likely (assuming no outside energy, like manufactured buckets and expenditure of calories) than the sand castle. Thus, over time, sand castles tend to turn into piles of sand, and piles of sand tend to turn into other piles of sand.
      You are free to find the most statistically common arrangement of particles more meaningful if you desire, of course. Whatever "meaningful" means in this case.
      Though, I suppose there _is_ always the possibility that time is actually running backwards for you somehow, whatever that would actually entail. I can't imagine that the rest of us would be able to measure whether or not that was happening to you.

    • @medexamtoolscom
      @medexamtoolscom 6 years ago +19

      There's a lot more than millions, and a lot more than trillions. It's more like 10^(10^12) vs 10^(10^6), not 10^12 vs 10^6. And THEN you see why the wind doesn't turn lumps of sand into sand castles... because if there were a 1 in a million chance, there'd actually be a LOT of sand castles spontaneously generated. But no, it's not 10^6/10^12=10^-6; the odds are more like 10^(10^6)/10^(10^12)=10^(-.999999*10^12), which is about as infinitesimal as 10^(-10^12) itself is.

  • @Minecraftster148790
    @Minecraftster148790 6 years ago +435

    IN THIS HOUSE WE OBEY THE LAWS OF THERMODYNAMICS

    • @florianmarcher
      @florianmarcher 6 years ago +24

      But I don't wanna

    • @BenjaminCronce
      @BenjaminCronce 6 years ago +11

      *Most of the time

    • @medexamtoolscom
      @medexamtoolscom 6 years ago +9

      My father didn't find that funny at all when I explained it to him on "this is what happened on the simpsons", but when he later watched it, he practically shit his pants.

    • @frankschneider6156
      @frankschneider6156 6 years ago +1

      Your kids will be pretty pissed for not being allowed to use this fancy new DeLorean car called a time machine.

    • @1035pm
      @1035pm 6 years ago +17

      This perpetual motion machine she made today is a joke! It just keeps getting faster.

  • @KafshakTashtak
    @KafshakTashtak 6 years ago +282

    Regarding all air atoms being in one half of the room, does that mean if a tree falls in the other half, there is a chance it does not make a sound?

    • @ryanschuch1832
      @ryanschuch1832 6 years ago +22

      SAHM it would most certainly not make a sound

    • @AtlasReburdened
      @AtlasReburdened 6 years ago +50

      At least we can now conclusively say "Maybe" to the original question of "if a tree falls in the woods, and no one is around to hear it, does it make a sound?"

    • @ajbastian
      @ajbastian 6 years ago +51

      It would still make a sound but there would be no medium to conduct the sound waves away... Other than the floor and the tree itself

    • @KafshakTashtak
      @KafshakTashtak 6 years ago +7

      Atlas WalkedAway Yeah I wanted to conclude that every scientific fact/conclusion that we consider can simply become a highly probable guess.

    • @ryanschuch1832
      @ryanschuch1832 6 years ago +8

      Tony Bastian so what you’re basically saying is it wouldn’t make a sound? Lol

  • @DutchDeLorean
    @DutchDeLorean 6 years ago +24

    I spent a semester in college writing a research paper on entropy. That's what it took to get me to wrap my head around the subject...

  • @davnape14
    @davnape14 4 years ago +84

    This guy looks like he's got two personalities: a calm, academic one for the cameras and a wilder one for the pub with his mates lol

    • @MK-tq3tz
      @MK-tq3tz 3 years ago +3

      I think you're right. His jokes are too funny and too cool😂

    • @LuisSierra42
      @LuisSierra42 3 years ago +3

      I think he actually has a rock band or plays in one or something like that

    • @theobolt250
      @theobolt250 3 years ago +3

      He possesses different microstates? So, how would his entropy go? Let's see in a year or... two, three... five?

  • @Kram1032
    @Kram1032 6 years ago +123

    To me, Entropy is, first and foremost, about constraints. - Entropy is basically counting how many states are currently *possible.* And that's why ordered states are *usually* (but not *necessarily)* in a lower entropy state: To have a decent chance to actually run into such an ordered state, you have to constrain the system to make that state relatively likely.
    For instance, in a typical ordered vs. chaotic room analogy, if you don't put any extra constraints on the state of your room, any *particular* tidy room is just as likely as any *particular* messy one. So in principle, both are consistent with high entropy.
    But there are so, *so* many more ways for your room to be messy than for it to be tidy; if you actually find it in a tidy state, it's *very* likely that you put it under strong constraints, essentially removing the possibility of a messy room from the phase space.
    Take a single pile of 100 sheets on your desk, for instance.
    If we're looking at a neat stack, meticulously arranged so that no piece of paper stands out, that's a very strong constraint on the sheets.
    An even stronger constraint would be if, on top of that, they also were alphabetized.
    More or less the only (macroscopic) degrees of freedom, at that point, would be how the stack as a whole is oriented and where on the desk its center of mass is. It's effectively a static rigid body at that point. Being tightly stacked and sorted has removed all other uncertainties about it.
    But maintaining that constraint is pretty resource-intensive. One gust of wind could destroy it all, and you must either protect the pile from that (by building a well-enclosed, wind-tight room) or put in extra work to restore the order to the constraints' specifications after the fact.
    Meanwhile, if you couldn't care less, and the pieces of paper can just be wherever, some even crumpled to balls and dropped on the floor, or, I don't know, about to be used as fuel for starting a cozy fire in the extravagant gold-plated fireplace you definitely have? There is hardly any constraint on those sheets then. You couldn't say much about any particular sheet at all, then. Unless you explicitly grab one and check, it's virtually indistinguishable from the others. (And grabbing a single piece of paper to check it out might be feasible, but try that for a single molecule in the air, say)
    Every time you halve the volume in which a piece of paper can be (a location constraint), you decrease entropy by one bit. Halving possible states is like answering a single balanced yes/no question: "Is this particular sheet of paper in this half of the room?" is worth one bit. "Is it in this quarter?" would require two bits, "Is it in this eighth?" three, and so on. - And the same is true for other parts of the state space: "Is this the side that's facing upward?", "Is the sheet's current momentum relative to the room in this interval?" *
    All that being said, even a very tidy room is going to have rather high entropy: You won't be able to sort the air in the room into neat piles. You won't be able to cool down the pile of paper to such low temperatures, that the sheets' individual atoms' momenta drops to essentially 0.
    Strictly enforcing a neat and tidy room *will* reduce entropy, since literally fewer micro states are possible then. But compared to the sheer number of possible micro states, the ones you actually plausibly have macroscopic control over is, even as it is vast in absolute terms, virtually undetectable and negligible.
    * One caveat: it's only exactly a bit of information if, given your previous knowledge, chances for the answer to be "yes" are just as high as for it to be "no". In general the number of bits gained is as large as how surprised you are by an answer. If you are *very* sure, that gravity never ceases, and it indeed doesn't cease, you effectively gained almost no information, and entropy hasn't further been constrained. - If you suddenly find yourself, one day, completely unaffected by gravity, that'd be a massive surprise, equivalent to a TON of bits.
    Enforcing constraints is just a way to reshape the world in a manner that you can then be sure about.
    So for instance, if you divide a room in two volumes of equal size and ask about which half one particular particle is in, if it's a vertical split, that will indeed be one bit of information gained. If you force it to remain in that half, you reduced the entropy by one bit. You have pinned down its location twice as well as you did before.
    But if you split it horizontally, because there is going to be a (very minor) pressure gradient from top to bottom due to gravity, the answer will give you (just the tiniest sliver) more or less than one bit, depending on where you end up finding it. (If it's in the top half, that's gonna be slightly more, if it's in the bottom half, slightly less)
    For your sheets of paper, meanwhile, it's gonna be vastly more likely to find them essentially static (macroscopic momentum relative to the room ~0), so it's virtually guaranteed for them to land in what ever interval contains that 0. If you want to be *super* strict about enforcing your constraints of perfectly defined stacks in a perfectly defined location, momentum would even have to be *exactly* zero... Which, of course, is disallowed by Heisenberg. In reality the sheets will always just jitter a macroscopically irrelevantly tiny bit, both due to thermal motion of individual atoms, and due to constantly being bombarded by air molecules. (And also by the equally present thermal jitter of the desk itself, say)
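The bit-counting in the comment above follows directly from the standard Shannon definition of surprisal, -log2(p). A minimal sketch (the 0.999 "gravity" probability is an illustrative number of mine):

```python
from math import log2

def bits_gained(p_of_answer):
    """Information (surprisal) carried by an answer whose prior probability was p."""
    return -log2(p_of_answer)

# Halving a sheet's allowed volume answers one balanced yes/no question:
print(bits_gained(1 / 2))   # "is it in this half?"    -> 1 bit
print(bits_gained(1 / 4))   # "is it in this quarter?" -> 2 bits
print(bits_gained(1 / 8))   # "is it in this eighth?"  -> 3 bits

# The caveat at the end of the comment: a near-certain answer is worth almost nothing.
print(bits_gained(0.999))   # "gravity didn't cease"   -> ~0.0014 bits
```

The unbalanced horizontal split described above is exactly the case where p deviates slightly from 1/2, so the answer is worth slightly more or less than one bit.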

    • @Temp0raryName
      @Temp0raryName 6 years ago +22

      I think the thermal jitter of your desk was increased through your keyboard overheating.

    • @Kram1032
      @Kram1032 6 years ago +10

      Nah, my computer's processor and stuff produces plenty more heat than my typing ever could.

    • @Temp0raryName
      @Temp0raryName 6 years ago

      :-D
      I think you need to factor Tracey Emin into your analogy. She purposely creates a disordered state, in a room, for art.

    • @Temp0raryName
      @Temp0raryName 6 years ago +2

      So, to extend it to its absolute conclusion, the constraints that can be imposed on the universe, by intelligent intervention, could prevent the heat death of the universe? Say by creating a perfect vacuum flask. Warm outside. But hot (or cold) inside.

    • @Mernom
      @Mernom 6 years ago +1

      Are you aware of Maxwell's demon? A theoretical construct that could sort a chamber to have all of the air on one side, by opening a small hatch to let air atoms pass to one side but shutting it to stop them from coming back.
      In practice, a being like that would most likely consume far more energy than any useful energy that could be gained from flooding the resulting vacuum, because it needs to track atoms and open/close a hatch fast enough to catch them.
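The energy accounting this comment points at has a standard form: Landauer's principle, which puts a floor of k_B·T·ln 2 of dissipated heat on erasing each bit of the demon's memory, and is the usual resolution of the paradox. A quick back-of-envelope sketch (room temperature is my assumed value):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)
T = 300.0           # assumed room temperature, K

# Minimum heat the demon must dissipate per bit of memory it erases:
landauer_limit = k_B * T * log(2)
print(f"{landauer_limit:.3e} J/bit")  # ~2.9e-21 J/bit
```

Tiny per bit, but the demon must track and erase a record for every molecule it sorts, which is what eats the free energy it tries to create.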

  • @Jacktrack7
    @Jacktrack7 5 years ago +331

    I've been trying to understand Entropy for a couple of years by now, I watched the video... I still don't get it xd

    • @bpansky
      @bpansky 5 years ago +18

      Maybe try this other video, I found it much better! (it was made longer ago than the above video, yet very similar, I wonder if PBS copied it...)
      ruclips.net/video/w2iTCm0xpDc/видео.html

    • @Jacktrack7
      @Jacktrack7 5 years ago +8

      @@bpansky Thanks, it was a good video, I really wanna get that Stirling Engine now it's only $35.

    • @garethbaus5471
      @garethbaus5471 4 years ago +18

      One way to think about entropy I heard in chemistry is roughly as follows: a given sample of matter can only be arranged a very large but still finite number of ways, and some arrangements can be achieved in more ways than others. As such, the arrangements that have the largest number of ways to form will generally be the ones matter tends toward when left alone with no input for long enough. Entropy can be considered a way to measure how close a given sample is to the most probable state.

    • @VirtualWithholdingNetwork
      @VirtualWithholdingNetwork 4 years ago +4

      Entropy is the degree of randomness in a system. And I think maximum entropy means all possibilities have been exhausted.

    • @zolnsalt
      @zolnsalt 4 years ago

      😆😆😆😆😆

  • @FrntRow
    @FrntRow 6 years ago +11

    It blows my mind how good Matt is at presenting! He explains the most complex physics and topics science has. Every time I watch, I feel like I understand something for an attosecond, then he says something else and it's gone.
    You're the best, Matt, keep it up. I might be able to hold on to some of your conversation soon enough.

    • @isaackitone
      @isaackitone 1 year ago +1

      I can relate, but you need to watch his videos two or three times to fill the gaps left after the first time. The investment in time is worth it.

  • @meleardil
    @meleardil 6 years ago +9

    There is just one thing I would like to add. In every episode you state things and draw conclusions in a very elegant and simplified way. It would be a nice touch to emphasize every time what the fundamental assumption is on which the whole model stands, and what the consequences would be if that axiom were not true. In this case the fundamental assumption is that ALL possible physical states are EQUALLY probable. It is possible to change the outcome of a phase propagation by messing up the probabilities. Also, even when you keep the "equally probable" axiom, the macro probabilities are highly dependent on the definition of the micro states. That is also a good filter for messed-up theories. Not to mention that the infinite improbability drive is the MOST powerful machine that anyone can construct, as it practically means omnipotence.

  • @rebelyell1983x
    @rebelyell1983x 6 years ago +90

    Entropy... it just ain't what it used to be. ;) :)

    • @2fast2block
      @2fast2block 4 years ago

      No, it's all changed for this dumbass who wanks off to his god of nothing that did it all naturally.

    • @junkjunk2493
      @junkjunk2493 4 years ago +1

      omg hhaarrgghh brilliant

    • @ЮрійГірич-т1с
      @ЮрійГірич-т1с 3 years ago +4

      🏆 Joke of the Universe!

  • @rDnhey
    @rDnhey 6 years ago +330

    I'm going to reverse entropy, hold my beer... yportne.
    Done.

    • @chrish.7563
      @chrish.7563 6 years ago +40

      Now please consider the energy required for you just to type this... Sorry, bro. You added external energy to the system. Doesn't count. But nice try. ;-)

    • @zsmith200
      @zsmith200 6 years ago +17

      Slow down Maxwell’s Demon.

    • @The.Incredible.Mister.E
      @The.Incredible.Mister.E 6 years ago +5

      I want to argue but, Bravo sir, bravo.

    • @georgemargaris
      @georgemargaris 6 years ago +4

      !!! esrevinu eht fo htaed taeh

    • @Οδοιπόρος
      @Οδοιπόρος 6 years ago +3

      This seems to check out. We might be on to something here.

  • @thenotflatearth2714
    @thenotflatearth2714 6 years ago +839

    If my room is the entire universe, heat death has already happened

    • @Attlanttizz
      @Attlanttizz 6 years ago +38

      Until you fart that is =)

    • @ThreesixnineGF
      @ThreesixnineGF 6 years ago +12

      Bring back the Chaos!

    • @Mernom
      @Mernom 6 years ago +20

      Is your room a uniform space of gas at an even temperature? If not, then not yet.

    • @StaryWymiatacz
      @StaryWymiatacz 6 years ago +7

      well no... because you are in your room, and you are still warm, because you wrote this

    • @ZomBeeNature
      @ZomBeeNature 6 years ago

      😲I can see that it has reached maximum entropy... 😭

  • @suneperera8644
    @suneperera8644 5 years ago +41

    thanos:"i am inevitable
    2nd law of thermodynamics:"hold my beer"

  • @BrendanSteffens
    @BrendanSteffens 6 years ago +10

    Really well explained. The Go board analogy was a brilliant touch.

  • @judgeomega
    @judgeomega 3 years ago +4

    i think this is my favorite space time episode. revealing how a law arises out of possibilities is just mind blowing.

  • @Temp0raryName
    @Temp0raryName 6 years ago +44

    Kind of you to say "Don't die". Uhh ... likewise!

    • @frankschneider6156
      @frankschneider6156 6 years ago +1

      I guess that's actually a pretty good life lesson. Just think of all the Darwin Award winners. If they had just known...

    • @auail5594
      @auail5594 6 years ago

      or "don't become stupid"

  • @ismireghal68
    @ismireghal68 6 years ago +165

    Avoid thermal equilibrium? Don't tell me what to do!
    *equilibrializes himself thermally*

    • @jackvernian7779
      @jackvernian7779 6 years ago +56

      did you just... commit suicide?

    •  6 years ago +27

      RIP

    • @ismireghal68
      @ismireghal68 6 years ago +37

      Jack Vernian hell of a euphemism, isn't it? But hey, the universe ain't got shit on me, I did heat death before it was cool (pun intended)

    • @lakshaymd
      @lakshaymd 6 years ago +7

      ded

    • @ismireghal68
      @ismireghal68 6 years ago +9

      Lakshay Modi at least i feel very equally distributed now.

  • @zombieblood1675
    @zombieblood1675 6 years ago +220

    I like the numbers and probabilities you guys give, but if it's not too much trouble, could you walk us through the math?

    • @Hecatonicosachoron
      @Hecatonicosachoron 6 years ago +15

      zombie blood which maths of all?
      The probabilities for a board of n cells where each cell is either occupied or unoccupied has a number of states of 2^n. that's the same as a sequence of n coin flips. If it's the case that half are occupied and half are unoccupied then the associated number of states is nCn/2, I.e. it's the appropriate binomial coefficient. The rest is number-crunching.
      But it shows why, say, in a sequence of coin flips, the most probable state is one where the number of heads and tails are equal or close to equal.
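The number-crunching the reply above mentions is short enough to show. A sketch of the coin-flip model it describes, with a small board size of my choosing:

```python
from math import comb

n = 20           # cells on a toy board, or coin flips
total = 2 ** n   # every microstate (exact sequence of flips) is equally likely

# Probability of the macrostate "k heads": comb(n, k) microstates out of 2**n.
for k in (0, 5, 10, 15, 20):
    print(f"{k:2d} heads: {comb(n, k) / total:.4f}")
```

The distribution peaks sharply at the even split (k = n/2), and the peak gets relatively sharper as n grows, which is the statistical content of the second law.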

    • @ErwinSchrodinger64
      @ErwinSchrodinger64 6 years ago +19

      Thermodynamics utilizes a lot of calculus, nothing too advanced, specifically differentials (at the senior undergraduate level). For instance, entropy is more succinctly described as dS = δQ_rev/T. One thing I'm glad they stated is that thermodynamics describes large scales; statistical mechanics describes small scales. The mathematics of statistical mechanics is a lot more involved, especially as systems grow in scale. Statistical mechanics does use a lot of probability expressions called partition functions. Where statistical mechanics grows in complexity, mathematically, is in non-equilibrium problems.

    • @zombieblood1675
      @zombieblood1675 6 years ago +22

      ErwinSchrodinger64 I'm in high school. I love learning above my level but what you said was pretty confusing thanks for trying though. I'll have to come back at a later date after learning more.

    • @badlaamaurukehu
      @badlaamaurukehu 6 years ago +2

      zombie blood Hahahahahaha...
      Public funding...

    • @Ni999
      @Ni999 6 years ago +40

      Badlaama Urukehu Willing to learn isn't good enough for you, champ?

  • @dxamphetamin
    @dxamphetamin 6 years ago +15

    I don't know if you guys can handle this, but could you do a video about how cells "use" negentropy?

  • @WildsDreams45
    @WildsDreams45 4 years ago +3

    I'm obsessed with the idea of entropy. Thanks for this!

  • @NickGall
    @NickGall 6 years ago +7

    Thank you, thank you, thank you for addressing entropy in this and upcoming videos! It's why I became a patron of Space Time. I have a bunch of questions about your excellent introduction to entropy, which covered a lot of conceptual ground in a short amount of time. I'll break up my questions into separate comments.

  • @CylonDorado
    @CylonDorado 6 years ago +57

    Don't worry, we can fight entropy by granting the wishes of little girls (turning them into "Magical Girls"), then harvesting the energy that is created when their wish implodes and turns them into a "Witch".

  • @BotchFrivarg
    @BotchFrivarg 6 years ago +8

    2:06 Slight correction, but even a perfect Carnot-cycle engine won't turn all the heat going into it into work; some of the heat will go to the lower reservoir. If this weren't the case, you could combine a Carnot engine with a Carnot refrigerator to get a perpetuum mobile. Carnot proved that the Carnot cycle is the most efficient thermodynamic cycle possible, but even that cycle is not 100% efficient.
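The correction above matches the standard Carnot bound, η = 1 − T_cold/T_hot, which only reaches 100% with an unreachable 0 K cold reservoir. A minimal sketch (the example temperatures are mine):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of input heat a heat engine can turn into work.

    Temperatures in kelvin; the remaining fraction of the heat must be
    rejected to the cold reservoir.
    """
    return 1.0 - t_cold_k / t_hot_k

print(carnot_efficiency(600.0, 300.0))  # 0.5 -- half the heat is rejected
print(carnot_efficiency(600.0, 0.0))    # 1.0 only with a 0 K reservoir
```

This also makes the reply thread below concrete: the bound hits 1.0 exactly when the cold reservoir is at absolute zero, which the third law forbids reaching.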

    • @dcamron46
      @dcamron46 2 years ago

      It is if the cold reservoir is at zero Kelvin

    • @BotchFrivarg
      @BotchFrivarg 2 years ago

      @@dcamron46 true, but in a similar way you can prove that to reach 0 kelvin you need infinite energy (IIRC)

  • @LiborTinka
    @LiborTinka 6 years ago +3

    Note that entropy can decrease from time to time, just briefly, but it happens. From Quora (Jack Wimberley, Ph.D.): The second law of thermodynamics does not absolutely state that the entropy of a finite system cannot decrease - only that the probability of it doing so vanishes in the limit that its size becomes infinite.
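The size dependence in that quote is easy to make quantitative. For an ideal gas, the chance that all n molecules spontaneously crowd into one half of the box is (1/2)^n: a genuine but fleeting entropy decrease. A sketch with illustrative values of n (my choices, including a roughly mole-sized one):

```python
from math import log10

# P(all n molecules found in the left half of the box) = (1/2)**n
for n in (10, 100, 6 * 10**23):  # last entry: roughly a mole of gas
    exponent = n * log10(0.5)    # log10 of the probability
    print(f"n = {n:.0e}: P ~ 10^({exponent:.3g})")
```

For ten molecules the fluctuation is merely rare; for a macroscopic sample the exponent is around -10^23, which is why the second law is safe in practice even though it is only statistical.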

    • @jettmthebluedragon
      @jettmthebluedragon 2 years ago +1

      So the cosmos is a finite system right ?😐and if you say infinite then how did the universe form in the first place ?🤔

  • @doncarlodivargas5497
    @doncarlodivargas5497 4 years ago +6

    Am I really a macrostate? That is the nicest thing anyone has said in a long time

  • @fredricknietzsche7316
    @fredricknietzsche7316 6 years ago +47

    You had me at "42" and GO Board!

  • @HarbingerHarley
    @HarbingerHarley 3 years ago +7

    Great explanation, although I’m still struggling to grasp all of it, I understand a few fundamental ideas of entropy

  • @epif1
    @epif1 5 years ago +10

    as I watch several of these in a row, I can assume you love "Hitchhiker's Guide to the Galaxy" 😆😆

  • @rceby2024
    @rceby2024 1 year ago +1

    Although I rarely comment on RUclips content, this made me do so. Just wow! Finally, the best explanation of entropy. Thank you for this!

  • @JL897139
    @JL897139 6 years ago +3

    Like all other physics, I barely get the math, but I get the concepts, I finally understand entropy. So thanks!

    • @pebbletrees
      @pebbletrees 1 year ago +1

      Hi, I'm sorry, this is an old comment, but it struck me and I had to ask. Is this true? Do physicists struggle with the math too? I've always felt drawn to Physics, but I am so bad with numbers 😅😅 Is there hope for me in the field? Lol

  • @poseidone5
    @poseidone5 6 years ago +18

    I studied economics, and one of the most amazing things is that entropy can be applied to economic systems (such as ATM networks) or sociological systems in the same way as in thermodynamics (excuse my English, but I'm Italian and we never learn :) )

    • @ismireghal68
      @ismireghal68 6 years ago

      matteo conz interesting

    • @thstroyur
      @thstroyur 6 years ago

      The approach you mention rests simply on mathematical system analogies - personally I'm in favor of a deeper identification, like that proposed by Georgescu-Whatever (bar spurious laws of thermodynamics)

    • @MostlyPennyCat
      @MostlyPennyCat 6 years ago

      Wasn't it also discovered independently in economics as well?

    • @rDnhey
      @rDnhey 6 years ago

      As an Italian, I appreciate that you follow the channel

    • @coder0xff
      @coder0xff 6 years ago +4

      I'm always pleased to find areas of overlap in the mathematics of different fields. My favourite is that spherical harmonics are used to compress data for light transport in 3D graphics, but also describe the electron orbitals. Such "coincidences" are actually pretty common, and that fascinates me in a philosophical kind of way.

  • @Pfhorrest
    @Pfhorrest 6 years ago +7

    it's not just energy coming in from outside that can reduce entropy, it can also be energy lost to the outside. If you had some kind of magic heat sink that was infinitely cold forever, you could extract a ton of useful work from a universe at maximal entropy... because of the energy being lost from the rest of the universe into that "cold hole". No outside energy input required.

    • @Aquillyne
      @Aquillyne 6 years ago

      Pfhorrest wouldn’t that situation be equivalent to air being in only half the room? i.e. as the cold hole fills up entropy is increasing not decreasing.

    • @coder0xff
      @coder0xff 6 years ago

      Doesn't the temperature of the universe decrease, thus not affecting entropy?

    • @Pfhorrest
      @Pfhorrest 6 years ago +2

      Virgil yes, but useful work is done by increasing entropy. You can think of "negative entropy" as "fuel", and then when you spend that fuel to do work, entropy is the "waste product". The existence of a "cold hole", an infinite heat sink, creates an energy gradient as big as the total energy content of the universe, and thus extremely low entropy. As all of the energy in the universe got sucked into it, entropy would go up, but you could do a ton of useful work using that "destruction of energy".
      For a less fantastic example, imagine a closed system of a balloon that is somehow perfectly insulated from the outside world. The air in the balloon is at maximum entropy; for a tiny being living inside that balloon, there is no possibility to do useful work (so I guess they wouldn't really be living to begin with, natch). But if you poke a hole in the balloon, and let's just assume it's the infinite vacuum of space outside the balloon, suddenly the air in the balloon goes rushing out, and the total energy content of the balloon goes down, but beings living inside the balloon could use that rush of energy out of the balloon to do useful work, say by putting a wind turbine over the hole to generate electricity.
      The point is just that you don't have to put stuff IN from the outside to create an energy gradient capable of doing useful work, you can also let stuff OUT to the outside to do so.

    • @NickGall
      @NickGall 6 years ago

      Your magic heat sink sounds like absolute zero. If the cold reservoir of a Carnot engine could be at zero kelvin, the efficiency 1 - T(cold)/T(hot) would reach 100% - a perfect heat engine. That's why the third law of thermodynamics forbids such an infinitely cold heat sink.

    • @Pfhorrest
      @Pfhorrest 6 years ago +1

      The heat sink wouldn't have to be at absolute zero, just anywhere below the average temperature of the universe; nor would being at absolute zero be enough to make it function as described. What's more important it just its total heat capacity. If you had a penny that was at absolute zero and tossed it into a universe at maximal entropy, you'd get as much work out as it would take to raise the penny to the temperature of the universe, but that's all.
      But if you had a penny made of some magic material with infinite specific heat (i.e. it takes an infinite amount of energy to raise its temperature), that was only slightly colder than the rest of the universe, then it would suck and suck energy from the rest of the universe until it had lowered the temperature of the whole rest of the universe to its own slightly lower temperature.
      For an easier to illustrate example, imagine you have a pebble with infinite specific heat that's currently at -1C, which is a lot warmer than most of the universe, but say you dropped it onto planet Earth, where it would probably land in an ocean. All of the world's oceans would freeze. The atmosphere everywhere would be brought down to -1C. All of the rock in the planet would be brought down to -1C. The sun would keep shining on the world trying to warm it up, but all of that heat would be soaked up by the pebble, which is still at -1C.
      Eventually the sun would expand to a red giant and swallow up the Earth, but the part of the sun that came into contact with the Earth would be cooled down to -1C as all of its heat was absorbed by the pebble, though of course the rest of the sun would convect more heat to the part around the Earth but that would be soaked up too until the entire sun had been cooled enough that it shrank back away from contact with the Earth.
      But, that would all take a while. And meanwhile here on Earth, the temperature gradient between the frozen pebble and the environment surrounding it would be a source for extracting useful work. It'd basically be a source of nigh-limitless free energy, as all of the unusable heat energy of the world, and all of the heat energy being pumped into the entire planet by the sun, would become a comparatively high-temperature energy source that could be pumped into the cold little rock to power a heat engine.
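
The Carnot point raised in this thread can be made numeric: the maximum efficiency between two reservoirs is 1 - T_cold/T_hot, so it's the coldness of the sink (not energy coming in from outside) that buys you work, and only an unattainable T_cold = 0 would give 100%. A minimal sketch; the temperatures are arbitrary illustrative values, not from the thread:

```python
def carnot_efficiency(t_hot_kelvin: float, t_cold_kelvin: float) -> float:
    """Maximum fraction of heat convertible to work between two reservoirs."""
    if t_cold_kelvin > t_hot_kelvin:
        raise ValueError("cold reservoir must not be hotter than the hot one")
    return 1.0 - t_cold_kelvin / t_hot_kelvin

print(carnot_efficiency(300.0, 150.0))  # 0.5
print(carnot_efficiency(300.0, 3.0))    # ~0.99: a near-CMB-cold sink is almost ideal
print(carnot_efficiency(300.0, 0.0))    # 1.0 - the case the third law forbids
```

The formula also shows why the "infinite heat capacity" pebble works: what matters is keeping T_cold low while heat flows, not where the energy ultimately goes.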

  • @someone2973
    @someone2973 6 years ago +7

    Would it be possible to describe the entropy of a system in terms of the minimum number of bits needed to accurately describe a region of space of a certain volume?
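
The bits framing in the question above is essentially Boltzmann's S = k ln W with the logarithm taken base 2: the entropy of a macrostate is log2 of its number of microstates, i.e. the minimum number of bits needed to single out one microstate. A minimal sketch; the two-state-particle toy system is my own illustrative choice:

```python
import math

def entropy_bits(num_microstates: int) -> float:
    """Entropy in bits: log2 of the number of equally likely microstates,
    i.e. the minimum bits needed to pin down one of them exactly."""
    return math.log2(num_microstates)

# A toy "gas" of 100 two-state particles has 2**100 microstates,
# so identifying one microstate exactly takes 100 bits.
print(entropy_bits(2 ** 100))  # 100.0
```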

  • @belmiris1371
    @belmiris1371 3 years ago

    I'm not sure I could ever be smart enough to fully comprehend this channel but the host still makes it interesting.

  • @silverth1002
    @silverth1002 4 years ago +44

    "avoid thermal equilibrium"
    he's literally telling us to stay alive, right?

    • @ocek2744
      @ocek2744 4 years ago +3

      Either that or stay out of uncomfortable 98.6 degree F weather.

    • @OmnipotentO
      @OmnipotentO 4 years ago +8

      Yea. A body at equilibrium with the environment is a dead body

    • @itisfinished4646
      @itisfinished4646 3 years ago

      Hahaha

    • @joehinojosa24
      @joehinojosa24 3 years ago +1

      Avoid being room ( morgue) temperature

  • @balrighty3523
    @balrighty3523 5 years ago +12

    Question: Way in the far future of the universe, when all matter and energy have reached equilibrium (maximum entropy), how can the second law of thermodynamics still be followed (i.e., if entropy is at maximum, it can't go further up)?

    • @ebenolivier2762
      @ebenolivier2762 4 years ago

      Balrighty 35 It won't. At that stage entropy will decrease in local areas of space. Given an unimaginably long period of time, structures will form purely randomly. Look up Boltzmann brains for more information on this.

    • @sethoflagos2880
      @sethoflagos2880 4 years ago +1

      @@ebenolivier2762 ... given an unimaginably long period of time, the inexorable expansion of space between the remaining maximally dispersed particles will dwarf their differences in velocity. Now, they can never meet. Structures random or otherwise cannot form. Plus, there really isn't such a thing as 'random' in thermodynamics.

    • @ebenolivier2762
      @ebenolivier2762 4 years ago

      Seth of Lagos This will only happen in the "big rip" end of the universe. We don't really have sufficient models to predict whether this will happen or not. We also cannot say for certain whether what I said would happen. It's all just speculation if one thinks that far into the future. The idea of the Poincare Recurrence Time and Boltzmann brains is quite fascinating though. 😊

    • @sethoflagos2880
      @sethoflagos2880 4 years ago

      @@ebenolivier2762 No big rip or other speculation required. Just simple isentropic gas expansion into an ever-expanding void.

    • @AKAKiddo
      @AKAKiddo 2 years ago +2

      The equation for entropy is flow of heat / temperature. As temperature approaches 0, entropy approaches infinity. There is no maximum.

  • @thomascorbett2936
    @thomascorbett2936 3 years ago +4

    I think life is the best effort the universe has made against chaos .

  • @JoshuaHillerup
    @JoshuaHillerup 6 years ago +21

    The arrow of time part confuses me. Aren't there some particle interactions that violate time symmetry, which could thus also give the arrow of time?

    • @Mernom
      @Mernom 6 years ago

      I THINK there was something, but don't catch me on that.

    • @coder0xff
      @coder0xff 6 years ago +3

      If I understand correctly, all interactions are theoretically reversible in both the macro and micro scales.

    • @JoshuaHillerup
      @JoshuaHillerup 6 years ago +7

      Brent Lewis CPT symmetry has not shown to be broken, but time symmetry alone does not always hold.

    • @ingoseiler
      @ingoseiler 6 years ago +1

      Pretty sure this was covered here. Changing left handed to right handed chirality in electrons was violating time symmetry, iirc

    • @Bodyknock
      @Bodyknock 6 years ago +7

      Entropy would exist even if specific particle interactions that aren't time symmetric didn't exist. Hypothetically even if you have a system with all its particles behaving symmetrically with respect to time you would still have increasing entropy in one temporal direction.
      So you will always have entropy acting as an arrow of time even in systems which have no other such compass.

  • @tnb178
    @tnb178 6 years ago +1

    I see the messy room as a valid analogy for explaining entropy. There are all kinds of messy but only one kind of clean. If I use items in my room and leave them at random spots without returning them to where they belong, the room gets messier.
    "Is my mom happy with the way my room looks" would be the macrostate. There aren't many microstates to make her happy but a lot of microstates to make her unhappy.
    Therefore there is a natural tendency for the room to get into a configuration that makes mom unhappy unless I invest energy to clean it up.

  • @AbandonedMines11
    @AbandonedMines11 2 years ago

    Trying to understand entropy here and have a question which I think is related to it. I had an argument with a buddy of mine about lottery numbers. Let’s say for a given lottery game you are to choose six numbers from 1 to 30. When the lottery game is played, six numbers will be drawn at random and you will win if those six numbers match the six numbers you have preselected. The argument I had with my buddy was that I said me choosing the numbers 1, 2, 3, 4, 5, and 6 had just as much of an equal chance of being drawn as the numbers that he chose which were 7, 18, 29, 4, 14, and 10. He argued and said that his six numbers had a greater chance of being chosen than my six numbers. Regarding entropy, wouldn’t both choices of numbers have an equal chance of being drawn? We attach significance to my numbers because they appear in numerical order and seem “special.“ That’s just an inherent bias that we have as human beings. When we are looking at the numbers objectively, each number has an equal chance of being drawn no matter what the order is, if any. Can you see where I’m coming from with this? So which selection of numbers has the greater chance of being drawn in the lottery drawing? Mine or his? Or do both have an equal chance of being drawn? I am torn between thinking both have an equal chance of being drawn and his having a greater chance of being drawn because his numbers aren’t in numerical order like mine are. I know the concept of entropy fits in here somewhere but I’m just not sure. And I don’t even want to begin to think about the laws of probability! This is really confusing! Can anybody help?

    • @Hank254
      @Hank254 1 year ago

      You are right, they have an equal chance of being drawn. More related to probability than entropy.

    • @vovan101
      @vovan101 1 year ago

      You are absolutely right: when they draw numbers, every number has equal rights (just like the US Constitution :) ). However, according to the asymptotic equipartition theorem, the winning sequence should belong to the typical set, which in this case should have a sum around the sum from 1 to 30 divided by two. This does not reduce your chances. Read about it, it is interesting.
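
The lottery question above can be settled by counting: with 6 numbers chosen from 30, every specific combination is one outcome among C(30, 6) equally likely ones, so the "ordered" and "scattered" tickets have identical odds. A quick sanity check in Python:

```python
from math import comb

# Number of ways to draw 6 distinct numbers from 1..30 (order ignored)
total = comb(30, 6)

p_consecutive = 1 / total  # ticket {1, 2, 3, 4, 5, 6}
p_scattered = 1 / total    # ticket {4, 7, 10, 14, 18, 29}

print(total)                         # 593775
print(p_consecutive == p_scattered)  # True
```

What feels special about {1..6} is a macrostate judgment ("looks ordered"): far fewer tickets look ordered than look scattered, but each individual ticket - each microstate - is equally likely.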

  • @5kollar7of26
    @5kollar7of26 4 years ago +8

    You're a super instructor. I love how simple and matter-of-fact you are. You make the second law sound like stacking blocks or something. You're so cool.
    I read a book while in a physics class in college - Entropy, by Jeremy Rifkin, I think that was his name. It scared me to death. It didn't help being in physics at the time. It's like a nightmare. A coma. You'll never wake up, you'll only eventually die. You're being pulled through the hallway of life into the gaping mouth of a giant freezer - DEAD end. That's what reading that damn book felt like. It was scary. You don't really know what's truly 'real', and all of this endless mathematics and theory theory theory. I'm glad I'm outta there, that's all I can say.
    Thinking's cool, but sometimes I feel like the Bros. Grimm make more sense. Really.

  • @oisnowy5368
    @oisnowy5368 6 years ago +14

    This video was brought to you by Kyubey. Get your Mahou Shoujo cosplay costumes today! You can do your part to lower entropy and save the rest of the universe. The rest of the universe needs you!

  • @palhardy4292
    @palhardy4292 6 years ago +8

    "The second lore of thermodynamics" :)

  • @shmookins
    @shmookins 6 years ago

    I have learned more from this video than from other videos and articles about entropy. It already corrected a few misconceptions I had. I am getting there. I have trouble understanding it, but that only means I've got to keep trying.
    I'll understand entropy if it kills me.

  • @JoshuaHillerup
    @JoshuaHillerup 6 years ago +12

    Can there be situations where there are two distinct macrostates that are roughly tied for most common state? What would that look like?

    • @coder0xff
      @coder0xff 6 years ago

      I guess it would be local maximums. Like billiard balls could all be in the pockets in any arrangement, any of which are more likely than them being arranged still on the table.

    • @ajbastian
      @ajbastian 6 years ago

      I think that would look like Schrodinger's cat where one state is chosen upon observation

    • @MrAlRats
      @MrAlRats 6 years ago

      I think what you are asking is whether there can be situations where two distinct macrostates of a physical system correspond to the maximum number of microstates that the system can exist in.
      Well, there's always a macrostate with more microstates associated with it. A Universe in which there exist isolated physical systems has lower entropy compared to a Universe filled with only photons. Thermal equilibrium is merely when all the fast processes have already occurred, while the really slow processes have yet to take place.

    • @bormisha
      @bormisha 6 years ago +1

      If the mentioned macrostates are distinct, they would be separated by certain intermediate macrostates which are less likely. This means that if the system reached one of the most common macrostates, it would not spontaneously switch to the other such macrostate, unless pushed from outside. There are many such bistable systems. Your computer's memory and disks are built of trillions of them, which allows storing information. Each bistable system can store 1 bit of information.

    • @Open_Source_Society
      @Open_Source_Society 6 years ago

      ruclips.net/video/8tArShb1fhw/видео.html

  • @AlejandroBravo0
    @AlejandroBravo0 6 years ago +9

    This is about the previous video but I didn't have the chance to ask it so: what conserved quantity arises from global (not local) phase invariance?

    • @drcosmos137
      @drcosmos137 6 years ago +4

      Conservation of charge. This can be electric charge, weak hypercharge, or sometimes number conservation like baryon number or lepton number., depending on the context. These are all a U(1) phase symmetry.

    • @AlejandroBravo0
      @AlejandroBravo0 6 years ago +3

      Michael Roberts so local phase invariance gives electric charge conservation and global phase invariance gives the conservation of other type of charge or quantum number?

    • @thstroyur
      @thstroyur 6 years ago +3

      Nothing - conservation only appears with local symmetries. The fact you have a global symmetry merely flags the possibility that 'localizing' (AKA gauging) that symmetry may lead to the discovery or reinterpretation of some physical phenomenon

    • @drcosmos137
      @drcosmos137 6 years ago +3

      They are all examples of local symmetries, but for different Lagrangians. The QED Lagrangian has a phase symmetry that corresponds to conservation of electric charge. The electroweak Lagrangian has a phase symmetry for conservation of weak hypercharge that's broken by the Higgs field. But Iago is right that these all have to be local symmetries for the conservation laws.

  • @hulkjumptowork3643
    @hulkjumptowork3643 5 years ago +8

    27 seconds in and get called out for my messy room

    • @valiroime
      @valiroime 3 years ago

      And at 8:38 we have a picture of that room I believe. 😉

  • @MoraisSanFoo
    @MoraisSanFoo 1 year ago

    "Also, excusing the messiness of your room" This was brilliant.

  • @chrisjones9525
    @chrisjones9525 4 years ago

    Great final comments. Learning involves doing, not just listening and memorizing. It’s always worth the work.

  • @grimmcreole44
    @grimmcreole44 2 years ago +4

    "no mother, i shant clean my room, as the effort i spend cleaning will decrease the amount of energy available for our civilization at the end of ends!"

  • @ex5tube
    @ex5tube 6 years ago +3

    Thought experiment:
    If the arrow of time is the process of Entropy, then when the Universe ultimately reaches its highest state of Entropy, and only randomized subatomic particles exist, does the arrow of time still exist?
    We know that the current state of the Universe can produce high levels of emergent complexity, far from equilibrium. At the Universe's state of highest Entropy, we would have only subatomic particles, distributed randomly throughout the Universe, and no measurable arrow of time.
    Would the probability of widespread, spontaneous emergence of complexity, increase "exponentially" (from our present point of view), because arrow of time no longer exists?
    In this high-entropy macrostate, the huge number of interactions, required to finally produce cosmic-scale emergent complexity, would be occurring "instantaneously", by particles moving at light speed, "outside of time".
    So, high-entropy conditions, at the end of the Universe, are somehow equal to the low-entropy Singularity of the Big Bang?

  • @PaulPaulPaulson
    @PaulPaulPaulson 6 years ago +45

    Does the concept of entropy also apply to other systems than our universe, for example to conways game of life?

    • @ThreesixnineGF
      @ThreesixnineGF 6 years ago +5

      🤔🤔🤔

    • @Mernom
      @Mernom 6 years ago +6

      Game of life is not really random, so I don't think so.

    • @LePoudingue
      @LePoudingue 6 years ago +20

      You can apply the concept of entropy to a lot of stuff, in a lot of different fields.
      For example, in information theory, you can calculate the entropy of a text.
      The text "abababab" has a higher entropy than "aaaaaaaa", for an alphabet consisting of a and b.
      But for an alphabet from a to z, it still has a very low entropy.
      You have a lot of definitions on Wikipedia: en.wikipedia.org/wiki/Entropy_(disambiguation)
      Each one means a slightly different thing.

    • @drdca8263
      @drdca8263 6 years ago +4

      I think if you defined the macrostates under consideration, you could just use/define the log of the number of microstates belonging to each macrostate to be the entropy of that macrostate.
      However, the fact that Conway's Game of Life is not reversible might make some stuff different. Many states lead to the empty grid state, but if "an empty grid" is a macrostate, that might suggest "entropy" decreasing? So that particular definition might not work out great.
      Maybe if you defined all still lifes (where nothing changes from step to step) as belonging to the same macrostate, and all states with period 2 as having another macrostate, and so on,
      then that could fit, because that way those macrostates have high entropy?
      (Or maybe vary this in some way like taking into account the total number of live cells, or the furthest cell from the origin, when defining the macrostates).
      I'd guess that there is some interesting way of picking the macrostates such that the entropy never decreases. Would be especially nice if it could be done in a way such that you could add it between different parts of the board, but that sounds hard?
      I think people who study GoL have defined a measure they call temperature. Maybe they defined that based on entropy?

    • @medexamtoolscom
      @medexamtoolscom 6 years ago +8

      Not all, but it's REALLY inapplicable to Conway's game of life, because there's no conservation of mass (or # of live cells) in that, structures can increase in mass without limit. That was after all the thing that the first glider gun won the prize for doing. And if you can already violate the 1st law of thermodynamics, who CARES about the 2nd one, the 2nd one is irrelevant if you can violate the first.
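
The information-theory reply earlier in this thread can be made concrete: Shannon entropy measures bits per symbol from observed symbol frequencies, so a one-letter text scores 0 while a balanced two-letter text scores 1. A minimal sketch (my own illustration, not from the video):

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per character, estimated from
    the character frequencies of the text itself."""
    counts = Counter(text)
    n = len(text)
    # log2(n/c) == -log2(c/n); written this way each term is non-negative
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 (one symbol, no surprise)
print(shannon_entropy("abababab"))  # 1.0 (two equiprobable symbols)
```

Note this estimate only sees single-character frequencies: "abababab" is perfectly predictable from context, which a richer model (or a compressor) would exploit to push the effective entropy toward zero.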

  • @jacksongoerges9422
    @jacksongoerges9422 6 years ago

    The graphics behind Sadi and Rudolf at 3:30 are AWESOME

  • @UteChewb
    @UteChewb 6 years ago +1

    Always loved stat. mech. My first encounter with the concept of "phase space" was mind blowing. Looking forward to these eps. Well done, as usual.

  • @gianpa
    @gianpa 6 years ago +39

    No one called me "brilliant microstate" before *_*

    • @abhirishi6200
      @abhirishi6200 6 years ago

      Meowwwww

    • @hrgwea
      @hrgwea 5 years ago +3

      *mAcrostate

    • @fireclown68
      @fireclown68 4 years ago

      You're a unique and special sno.....microstate

  • @Doping1234
    @Doping1234 6 years ago +4

    Thumbs up for the Go-board illustration :)

  • @MushookieMan
    @MushookieMan 6 years ago +6

    "Keep being that brilliant macro-state that is you"
    Are you coming on to me?

  • @T75-n1m
    @T75-n1m 6 years ago +2

    This is an awesome lecture for introductory statistical mechanics.

  • @Leipage
    @Leipage 4 years ago

    This is a much better explanation of entropy, thank you! When I first heard the 2nd law of thermodynamics stated as "a system's entropy (disorder) will always increase over time", it sounded like these people had no idea what they were talking about. Big bang -> atoms -> molecules -> stars -> galaxies -> solar systems -> planets -> life -> humans -> intelligence -> consciousness. It's pretty obvious to anyone that the universe has only become more orderly over time.
    Describing entropy as the number of possible random states that a system can have (or the amount of information needed to describe a system) makes a lot more sense.
    I once heard someone use Sudoku as an analogy for entropy. When you're trying to solve a square in an empty row, it can be anything from 1 to 9 (maximum entropy), but as other squares are resolved, the number of possible states changes to let's say 1,3,5 (lower entropy). So ironically, it seems us humans enjoy puzzles where we reduce entropy in a system. As if it's a fundamental part of our nature.

  • @davey3765
    @davey3765 6 years ago +7

    OMG I smoked so much weed and this is like blowing my mind right now.

  • @ketsuekikumori9145
    @ketsuekikumori9145 6 years ago +96

    The end is nigh. There is no hope. There is only entropy.

    • @シロダサンダー
      @シロダサンダー 6 years ago +4

      Ketsueki Kumori hail entropy!

    • @Sam_on_YouTube
      @Sam_on_YouTube 6 years ago +20

      "Nigh", in this case, is about 100 trillion years.

    • @kevin42
      @kevin42 6 years ago

      S kildert we must speed it up. Quick gas the Enthalpy’s

    • @Pfhorrest
      @Pfhorrest 6 years ago +6

      For extremely large values of "nigh".

    • @alcosmic
      @alcosmic 6 years ago +1

      Don't Panic

  • @sergiogarza2519
    @sergiogarza2519 6 years ago +7

    "My room is dirty because entropy"
    "Yes son but if we're trying to obey entropy, your room must as efficient as possible for you to successfully bring about a fast, universal, entropic equilibrium. Ain't no entropy-efficient babies being made in this room. Clean it or get a real job."

  • @Stroheim333
    @Stroheim333 5 years ago +1

    AT LAST! The one thing I've always tried to explain to smirking science nerds who believe they know enormously much but in reality are rather ignorant: order is not the same as low entropy! Thank you.

  • @thylatrash7668
    @thylatrash7668 6 years ago +1

    Hey, I really like your videos, but could you maybe add some short overviews at the end, like the "today you learned..." thing Crash Course does?
    It's really helpful for processing what you've just learned and making sure you don't just forget about it an hour later.

  • @amiralozse1781
    @amiralozse1781 5 years ago +16

    I see 42 - I must click

  • @FantasticMrFrog
    @FantasticMrFrog 6 years ago +11

    - "Entropy has been credited with predicting the ultimate heat death of the universe."
    - "Hold my beer and just give me a few disposable high-school girls" /人◕ ‿‿ ◕人\

  • @MusiCaninesTheMusicalDogs
    @MusiCaninesTheMusicalDogs 6 years ago +5

    Damn! Gotta love this channel! ❤

  • @eqisoftcom
    @eqisoftcom 5 years ago +1

    At last! I found a video that explains entropy! Thank you. ... Are you a Wellingtonian?

  • @tartanhandbag
    @tartanhandbag 4 years ago

    is there a vid in this series on entropic gravity?
    also ...big up PBS Space-Time!

  • @markdelej
    @markdelej 6 years ago +6

    So somewhere in the universe it's possible a beach of sand blew all its grains into an image of my face? Cool 😎

  • @SaeedAcronia
    @SaeedAcronia 4 years ago +11

    I am an aerospace engineer and I have been mocked by these "physicists" for being an end-user of their theories and equations. Guess what?!
    Sadi Carnot, the father of Thermodynamics, was actually a mechanical engineer. On behalf of all mechanical and aerospace engineers in the world:
    IN YOUR FACE Sheldon Cooper!

  • @Ciekawostkioporanku
    @Ciekawostkioporanku 6 years ago +4

    The Earth in PBS intro rotates the wrong way. :D

    • @canyadigit6274
      @canyadigit6274 6 years ago +1

      Ciekawostki o poranku I thought that I was the only person to notice that.

    • @Cabolt44
      @Cabolt44 6 years ago

      Or rotates the right way from a certain perspective?

    • @Ciekawostkioporanku
      @Ciekawostkioporanku 6 years ago

      How can you change the direction of spin using perspective?

    • @Ciekawostkioporanku
      @Ciekawostkioporanku 6 years ago

      Then I guess Earth would be made out of antimatter due to CPT-symmetry :D

    • @canyadigit6274
      @canyadigit6274 6 years ago +2

      It doesn’t matter which frame of reference you’re in. It should still spin counter-clockwise.

  • @1a9Cj5
    @1a9Cj5 6 years ago

    Background animation of the Sadi and Rudolf photos are so cool.

  • @jakkakasunset5485
    @jakkakasunset5485 4 months ago +1

    Can't wait for the explanation of the temporal pincer movement

  • @koenvandamme6901
    @koenvandamme6901 6 years ago +5

    *sets hair on fire*
    Suck it, entropy!

  • @kadourimdou43
    @kadourimdou43 6 years ago +10

    Where did you get that T Shirt?

    • @twistedsim
      @twistedsim 6 years ago +4

      Puny Gods Look in the description. Second link

    • @kadourimdou43
      @kadourimdou43 6 years ago +1

      Simon Bouchard Thanks

    • @LA-MJ
      @LA-MJ 6 years ago

      They sold them on dftba shop. Key word being 'sold' :(

    • @daemonhat
      @daemonhat 6 years ago

      they're still for sale, just sold out of large and extra large. Luckily I can still wear a medium

  • @drewdurant3835
    @drewdurant3835 6 years ago +30

    • @JorgetePanete
      @JorgetePanete 6 years ago +1

      Drew Durant uh... what is strictly smaller than 3?

    • @abhirishi6200
      @abhirishi6200 6 years ago

      That Go board analogy was quite neat!

    • @drewdurant3835
      @drewdurant3835 6 years ago

      Enter the Braggn' I will check it out this evening!

    • @upgrade1583
      @upgrade1583 6 years ago +1

      I see the video is saying that the purpose of life is to encourage entropy, which creates spacetime.

  • @robertragsdale2447
    @robertragsdale2447 6 years ago +1

    An episode of Informational Entropy sounds highly interesting...

  • @clieding
    @clieding 3 years ago

    I am not motivated to straighten up my apartment until it reaches a state of maximum entropy. I can determine this whenever I happen to move something from here to there and the rooms seem equally messy. [Having shifted the configuration to one of the many possible micro-states that give the same macro-state: ‘Maximum Messy”.]. It is then that I know it is time to expend some high-grade energy (in the form of pizza 🍕, cola 🥤etc.) and do some useful work. This is my personal „Carnot Cycle“ of laziness punctuated with bursts of entropy lowering effort.

  • @feynstein1004
    @feynstein1004 6 years ago +3

    Hmm but doesn't that raise the question how the universe got to a low entropy state in the first place? This is like the problem of infinite regression of time.

    • @alquinn8576
      @alquinn8576 6 years ago +1

      Penrose makes a big deal of this and proposes his hypothesis of conformal cyclic cosmology as an answer, which predicts a high-entropy heat-death universe can give rise to a low-entropy big bang. Very speculative but the notion gives the prediction we should see correlations in the CMB radiation of a given angular size; don't think he (or his collaborators) have found anything yet though.

    • @feynstein1004
      @feynstein1004 6 years ago

      +Al Quinn Oh wow. That sounds really interesting. I wonder if it eventually turns out to be true. Hmm but how would a high-entropy universe spontaneously give rise to a low-entropy big bang?

    • @alquinn8576
      @alquinn8576 6 years ago

      Penrose argues the solution to the black hole information paradox is that the information is destroyed, which resets the entropy to a much lower level once all BHs have evaporated. I'm just a YouTube troll though, so I don't pretend to understand this at great depth. Long talk he gives here:
      ruclips.net/video/4YYWUIxGdl4/видео.html
      The part where he discusses how to get around the 2nd Law is around 1h 19m

    • @feynstein1004
      @feynstein1004 6 years ago

      +Al Quinn Lol you're the best troll I've ever met, mate. Also, the trolls never bothered me anyway :D Thanks for the awesome replies.

    • @MrTripcore
      @MrTripcore 6 years ago

      It's basically because of motion.. more motion means more energy, more energy means more dimensions and variations of energy dispersion and density.

  • @dyer308
    @dyer308 6 years ago +10

    I love it when creationists use the argument that how could complex life arise on earth if entropy always increases, They always forget the earth isn't a closed system /:

    • @coder0xff
      @coder0xff 6 years ago +2

      It's like they forget that the sun is burning.

    • @dyer308
      @dyer308 6 years ago +1

      Halberdier That is true, It's unfortunate ppl see increase in entropy as increase in disorder, in a literal sense

    • @medexamtoolscom
      @medexamtoolscom 6 years ago +1

      No, it's much worse than that, because you see, the rise of complexity is EXACTLY what the 2nd law of thermodynamics predicts. After all, entropy also goes by another name, one with positive connotations: "information". Higher entropy MEANS higher complexity; that's what entropy IS. So of COURSE it's not a contradiction for complex life to be a higher-entropy state.
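The entropy-information link invoked in the comment above can be sketched with a toy Shannon-entropy calculation (an illustrative example, not from the video; the function name is made up). A uniform distribution over W equally likely microstates takes log2(W) bits to pin down a single microstate, so more accessible microstates means more information:

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform distribution over W microstates carries log2(W) bits, so
# doubling the number of accessible microstates adds exactly one bit.
for w in (2, 4, 8):
    uniform = [1 / w] * w
    print(w, shannon_entropy_bits(uniform))  # 2 -> 1.0, 4 -> 2.0, 8 -> 3.0
```

Whether thermodynamic and information entropy should be equated is exactly the debate elsewhere in this thread; the sketch only shows that the microstate-counting picture and the bit-counting picture use the same logarithm.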

    • @frankschneider6156
      @frankschneider6156 6 years ago +1

      raees khan
      It's not that they forget; they never knew, because none of them understands even that much (or rather, even less) physics.

    • @Aj32678
      @Aj32678 6 years ago +1

      You're quite right, it is not a closed system... it has an external creator. Your turn.

  • @Limpn00dle84
    @Limpn00dle84 6 years ago +5

    Clean up your room!

  • @markharder3676
    @markharder3676 4 years ago +1

    A couple of points I didn't hear in this lecture: thank you, thank you, thank you for pointing out that disorder and entropy are not the same thing. As a biochemist, I've heard this misunderstanding used to explain protein folding and the hydrophobic effect, and it's a really hand-wavy explanation. Arieh Ben-Naim has written more than one book explaining this. One objection to the entropy-is-disorder equivalence is that entropy is an extensive variable. But is disorder? If I have 2 identical samples of some thermodynamic system, I also have twice the entropy of each one. But do I have twice the disorder? Is the set of both samples twice as disordered as one alone? I'm afraid that doesn't seem correct to me. Disorder seems more like an intensive variable, the intensity of ... what? Perhaps it's the equivalent of temperature, not entropy, temperature being an intensive variable: the change in thermal energy per unit of entropy change.
    The other point is that probability is only a model of stochastic systems like rolling dice or flipping coins. Systems like those are really chaotic systems, deterministic chaotic systems. It's our inability to exactly specify the initial state of such a system, and the exact conditions in which it evolves, that gives rise to the resemblance to nondeterministic, stochastic properties. Perhaps only quantum systems are inherently stochastic, which is why the uncertainty principles hold true, and why it's inherently impossible to specify the initial state (position and momentum, energy and time, ...) of a deterministically chaotic system.
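The extensivity argument in the comment above can be made concrete with Boltzmann's S = k_B ln Ω (a toy count; the microstate number is made up for illustration): two independent, identical samples multiply their microstate counts, and the logarithm turns that product into a sum, so the entropy exactly doubles even though nothing intuitively looks "twice as disordered":

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    """S = k_B * ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(omega)

omega_one = 10**6  # microstates of a single sample (made-up number)
s_one = boltzmann_entropy(omega_one)

# Two independent, identical samples: the microstate counts multiply...
s_two = boltzmann_entropy(omega_one * omega_one)

# ...so the entropies add: S(two copies) = 2 * S(one copy).
print(math.isclose(s_two, 2 * s_one))  # True
```

The doubling falls straight out of ln(Ω²) = 2 ln(Ω), which is what makes entropy extensive by construction.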

    • @omkarchavan5940
      @omkarchavan5940 4 years ago

      How do we know that quantum systems are indeed stochastic?

  • @holdenrobbins852
    @holdenrobbins852 6 years ago +1

    It seems like physics might be overlooking something like giant balls of entropy reducers that continuously order matter into lower-entropy states. Kind of like how some cellular automata create chaos, others are very ordered, and a few oscillate between low entropy and high entropy. He references life itself, which seems to have the sole objective of reducing entropy in the universe by ordering more and more matter into a specific set of microstates that naturally reproduce themselves.

  • @zacmilne9423
    @zacmilne9423 6 years ago +1

    So far so good. The description of thermodynamic entropy is accurate here. I just hope they don’t equate it with information entropy.

  • @xgozulx
    @xgozulx 6 years ago

    What a beautiful sentence, I'm definitely using it:
    be careful to keep the number of accessible microstates low, avoid thermal equilibrium and keep being that brilliant macro state that is you until I see you next week

  • @TheSharmanova
    @TheSharmanova 6 years ago

    Among the many brilliant Space Time episodes this one was uniquely so. Thank you.

  • @jacobyoung2045
    @jacobyoung2045 A month ago

    I watched this video a long time ago, and I barely understood a minute of the video. I watched it again today and understand half of the video. I am hoping to watch it in the coming years to understand the full video.

  • @merkov8715
    @merkov8715 4 years ago

    I understood a little something of entropy by his closing words; "Be careful to keep your number of accessible microstates low
    (the chemical processes inside of our bodies should be kept on its natural course/function, unaffected as much as possible by external matter), avoid thermal equilibrium (discomfort or death) and keep being that brilliant macro state that is you (the culmination of most likely microstates)..."

  • @StevenErnest
    @StevenErnest 6 years ago +2

    Excellent video. How does entropy work with the expansion of the universe actually increasing in speed due to dark energy and dark matter?

  • @pretzelogic2689
    @pretzelogic2689 3 years ago

    Nice explanation separating the idea of order from entropy.

  • @GigaChadJecht
    @GigaChadJecht 2 years ago

    My teacher linked this in the PowerPoint presentation for the last topic of my second-year physics class, on thermodynamics, lmao. I'm blown away that a channel I watch for fun is featured as part of my assignment, lmao

  • @karilentz5000
    @karilentz5000 5 years ago +1

    When heat is added to make ice melt or water boil, the temperature stays exactly the same the entire time while entropy accumulates. When joules of heat travel somewhere, it is entropy, not temperature, that is guaranteed to change.
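The point in the comment above can be checked with numbers (a back-of-the-envelope estimate using the standard latent heat of fusion for water, not a figure from the video): melting ice absorbs heat at a constant 273.15 K, and the entropy gained during the phase change is ΔS = Q/T:

```python
# Entropy gained when 1 kg of ice melts at 0 degrees C: the temperature is
# constant throughout the phase change, but entropy still rises by Q / T.
LATENT_HEAT_FUSION = 334_000.0  # J/kg, latent heat of fusion of water
T_MELT = 273.15                 # K, melting point at atmospheric pressure

mass = 1.0                     # kg of ice
q = mass * LATENT_HEAT_FUSION  # heat absorbed during melting, in joules
delta_s = q / T_MELT           # entropy change of the water, in J/K
print(round(delta_s, 1))       # about 1222.8 J/K
```

Because T is fixed during the transition, the whole Q goes into entropy rather than temperature change, which is exactly the comment's point.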

  • @NickSibicky
    @NickSibicky 6 years ago

    Hooray for using Go as the analogy!

  • @HKHasty
    @HKHasty 3 years ago +1

    Great video!
    I wish to understand entropy in the context of star formation. It seems contradictory