The Biggest Ideas in the Universe | 20. Entropy and Information

  • Published: Jan 11, 2025

Comments • 295

  • @SuperFredAZ
    @SuperFredAZ 4 года назад +32

    I am a graduate electrical engineer, and I retired a few years ago. I wish my instructors were half as good as you. It is nice to come back to physics after so many years.

  • @K1lostream
    @K1lostream 4 года назад +14

    Soooo, entropy is a measure of our ignorance, and entropy always increases.... that explains a few things!

  • @salvatronprime9882
    @salvatronprime9882 4 года назад +28

    Thank you Sean for these videos, they are the absolute best and most comprehensible series of science lectures I have ever seen. All for free! You are a genius and a gentleman, kind sir.

  • @BMerker
    @BMerker 4 года назад +15

    Incredibly helpful to have the full four "versions" of entropy outlined like this. Brilliant pedagogy, much appreciated!

  • @MoshkitaTheCat
    @MoshkitaTheCat 5 месяцев назад +3

    You, Dr. Carroll, are my favorite teacher of all time. Thank you.

  • @rickharold7884
    @rickharold7884 4 года назад +14

    Awesome as always!
    I finally made it through all 40 (I include the Q&As, as they are honestly like their own lessons). I love it.
    With my math and CS background I understood 70% of the videos! Unfortunately, when later trying to explain them to my teen kids, I remember 20%. But... I love 100% of the videos. With infinite time I'd rewatch them all. Thanks for the video series.

  • @halfalligator6518
    @halfalligator6518 4 года назад +4

    I can't tell you enough how much I appreciate this stuff. Not only is it interesting but it distracts me from lockdown blues over here in Melbourne. You're a great teacher with a knack for reducing things down for us plebs. This series and your podcast (which I listen to on my walks) inspire and motivate me to get on with my own projects. While many of your audience will never be scientists, what you're doing ensures our children will grow up in scientifically literate households - that's a very valuable contribution to the world if you ask me.

  • @joxterjones2563
    @joxterjones2563 4 года назад +6

    I'd just like to add my personal thanks for your kindness and generosity in putting together this lecture series for all of us to learn and enjoy. This particular lecture is a masterpiece and weaves together many profound ideas in a clear and comprehensible way - the mark of a great teacher.

  • @rhondagoodloe3275
    @rhondagoodloe3275 4 года назад +11

    Sean, thanks for making this quality of information available to anyone (no prerequisites or tuition required).

  • @kevinegan1400
    @kevinegan1400 4 года назад +3

    There is just nothing anywhere near as good on the Internet for advanced physics concepts explained in understandable vocabulary, without obscuring them with impenetrable mathematics. Fantastic work, Professor Carroll.

  • @p_square
    @p_square 4 года назад +24

    Your explanations are on the next level!!!

  • @SkorjOlafsen
    @SkorjOlafsen 4 года назад +12

    Congrats on passing 100k subscribers. Good reach for a physics lecture series!

  • @sandrasandra7593
    @sandrasandra7593 4 года назад +9

    A great new video, thank you, Dr. Sean! You are building democracy and a better world, because culture and science are the ground on which the world becomes a fairer, better, safer place. Many skilled people can't afford an expensive education. You really help by sharing your top-level knowledge for free, in a systematic way. You are a great scientist and also a great man!

  • @Jaggerbush
    @Jaggerbush 3 года назад +2

    This is like a tutoring session with Sean - and for free. It doesn't get better than this. And I'm a musician who loves this stuff as a hobby.

  • @PhilipSportel
    @PhilipSportel 4 года назад +19

    Sean, you are my new physics hero.

  • @v0lrath1985
    @v0lrath1985 4 года назад +11

    What a great way to start the day!

  • @JustOneAsbesto
    @JustOneAsbesto 4 года назад +39

    Well, we always get to entropy eventually.

    • @EmanueleLecchi
      @EmanueleLecchi 4 года назад +6

      More like entropy gets to us, sooner or later...

  • @NuclearCraftMod
    @NuclearCraftMod 4 года назад +13

    I have two questions:
    1. As the Gibbs entropy is constant for a closed system (22:43) and the universe is a closed system, why should the entropy of the universe increase at all? I understood the argument that the past hypothesis allows us to assume a low-entropy past, and the entropy should tend to increase with the time evolution from this low-entropy state, but this seems to simply be at odds with the statement that dS/dt = 0. I can only presume that the logical step of the universe being closed is flawed somehow, but I don't see why.
    2. At 52:05 you begin to argue for why the anthropic principle can not refute the recurrence objection, primarily by using the idea of Boltzmann brains. I didn't really understand how this idea makes the anthropic principle fail in the first place, but nevertheless, can't the past hypothesis give us a way to avoid the problem of an eternal past of Boltzmann brains forming at all? Surely the entire history of the universe is one in which the universe's entropy has been increasing from a low value? Surely in this context, i.e. the observable one in which we even state the second law in the first place, there is no problem?
    Thanks for the great video as always!

    • @TheOne10525
      @TheOne10525 4 года назад +1

      How do you have gas pressure without the necessary antecedent of a container, if entropy is still a thing?!
      Space is fake!!!!

    • @JohnDlugosz
      @JohnDlugosz 4 года назад +1

      @@TheOne10525 Might be funny if you were trolling... this is a long way from your flat-earth, conspiracy theory, biblical literalist content that you normally consume. What are you doing here? Seriously: if you came to learn, don't heckle with your FE nonsense. If you came to ask serious questions from a different crowd than you normally hang with, then ask on a post for that purpose rather than in non sequitur replies to other questions, and be receptive to actually getting (and understanding) the answer rather than just babbling the same stupid thing over and over no matter how many times it's been explained.
      Go pick on Thunderf00t: you'll either learn or be destroyed. Either outcome is fine as far as the rest of us are concerned.

    • @barefootalien
      @barefootalien 4 года назад +1

      1. Yeah... the universe does that a lot. ;)
      As in most cases when questions start with "But the universe does X, doesn't that violate Y?" the answer is likely "Because the universe is expanding, so NOT Y". For example, as light redshifts as it travels across vast distances, it loses energy. But that energy doesn't _go_ anywhere, so doesn't that violate Conservation of Energy? Well, the universe is expanding, and that means it is a non-inertial reference frame, so energy is not conserved. In this case, I suspect that something similar could be said, along the lines of that because the universe is expanding, it _acts as if_ it were not a closed system, _even though_ it may not have anything outside of itself with which to interact and thus is definitionally a closed system. Even worse, the _observable_ universe's expansion is accelerating, which means that _it_ is _definitely not_ a closed system; the cosmic horizon generated by that expansion has stuff flowing across it, "exiting" the observable universe all the time. Whole galaxies disappear beyond it, in fact, so the observable universe is _certainly_ not a closed system.
      2. I think this is a result of Dr. Carroll taking some shortcuts since he didn't want to go into too much detail. I suspect it isn't _just_ the anthropic principle that refutes the recurrence objection, but rather the combination of the anthropic principle and the Copernican principle, which together say that we must necessarily be not only in a phase of the universe in which we can exist, but with overwhelming probability we must also be in a _typical example_ of such a phase. Since the overwhelmingly most probable version of such a phase is that you are a Boltzmann Brain that just popped into existence as you are reading this, with the memories of what came before already in place, and will perish immediately after reading this, and that is clearly not the case, the anthropic principle alone doesn't completely defeat the recurrence objection.
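
A toy numerical sketch related to question 1 in the thread above (only an illustration; the discrete "phase space", the permutation, and the numbers are invented, not anything from the lecture): under reversible, deterministic evolution the fine-grained Gibbs/Shannon entropy of a probability distribution is exactly conserved, while the coarse-grained entropy that the Second Law is about can still grow, because the distribution spreads across macro-cells. Assuming Python with numpy:

    import numpy as np

    # Toy "phase space": 100 discrete cells. The exact (fine-grained) distribution
    # evolves by a permutation of cells, a stand-in for reversible Liouville evolution.
    N = 100
    p = np.zeros(N)
    p[:10] = 0.1                       # start concentrated in the first macro-cell (low entropy)

    def shannon_bits(q):
        q = q[q > 0]
        return float(-(q * np.log2(q)).sum())

    def coarse_grain(q, cells_per_bin=10):
        # Replace q within each macro-cell by its average; this is the coarse-graining step.
        bin_means = q.reshape(-1, cells_per_bin).mean(axis=1, keepdims=True)
        return np.repeat(bin_means, cells_per_bin, axis=1).ravel()

    perm = (37 * np.arange(N)) % N     # a bijection on the cells (gcd(37, 100) = 1)
    for t in range(4):
        print(f"t={t}: fine-grained S = {shannon_bits(p):.2f} bits, "
              f"coarse-grained S = {shannon_bits(coarse_grain(p)):.2f} bits")
        p = p[perm]                    # reversible "time step": a permutation never changes S

The fine-grained entropy stays at log2(10) ≈ 3.32 bits forever (dS/dt = 0 in that sense), while the coarse-grained entropy jumps once the occupied cells scatter across many macro-cells; running the permutation backwards would undo it, which is exactly why a past hypothesis is needed.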

  • @dalefvictor123
    @dalefvictor123 4 года назад +4

    Thank you for this and other lectures. I have enjoyed every one and have renewed my interest in Physics and Mathematics. I am also the person who has been purchasing your books.

  • @nicodemosvarnava2520
    @nicodemosvarnava2520 4 года назад +4

    The episode we've all been waiting for

  • @astral6060
    @astral6060 3 года назад

    This is the best physics series for non-physics people: it connects various topics together and provides clear big pictures. Thank you, sir!

  • @d95mback
    @d95mback 4 года назад +1

    I knew basically all of this already, but now it feels that I know it to a level I did not before. You are fantastic, Sean.

  • @aclearlight
    @aclearlight 2 года назад

    This is truly masterful pedagogy and a huge, generous gift to the public mind. Having studied and taught in the more chemistry-related aspects of this area, I am SO grateful to have this clear explanation across disciplines so that I can start making better sense of quantum information theory. You, sir, are a national treasure. Your efforts toward the betterment of the human condition put you in a rare cohort of relatable, eloquent geniuses who have the care and the capacity to reach back and, to some considerable extent, bring your fellows along with you on your expansive voyage. These works of REMARKABLE public outreach will have huge ripple effects and will change lives. 👍🏼

  • @TehPwnerer
    @TehPwnerer 4 года назад +3

    Sean Carroll thank you for everything you have done.

  • @fasligand7034
    @fasligand7034 4 года назад +1

    As a layman, many purely physical considerations go right over my head. But I'm so glad you connected this to Shannon's information theory. It blew my mind (even though I kind of heard about it before! From GEB)

  • @martinds4895
    @martinds4895 4 года назад +1

    My favourite episode so far in this series, but I like them all. Thanks, Professor. Hope we get more on this topic, maybe in more depth.

  • @drneuroentropy
    @drneuroentropy 2 года назад

    If there were a Nobel Prize for a YouTube video, you'd have won it with this one, Sir. That said, the things you pointed out here do highlight the very reasons why we have cancer in the first place, so from now on my colleagues and I will be using all of this to try to crack it. Thank you.

  • @kevinmccarthy2793
    @kevinmccarthy2793 4 года назад +5

    The fact that no one understands entropy (or information, for that matter) has allowed creationists to get away with a lot of shenanigans (Shannonigans?) about information and entropy with biologists (who don't generally study either).
    In fact, that's why I watched this video: to better understand both in order to counter creationist claims.
    Thanks!

  • @expchrist
    @expchrist 4 года назад +1

    I love that you love this topic. It makes me want to learn more about entropy and information theory.

  • @silent_traveller7
    @silent_traveller7 4 года назад +1

    I am enjoying this series sooo much. This series will be watched and have an impact long after the lockdown ends!

  • @markzambelli
    @markzambelli 4 года назад

    Prof Carroll... thank you, thank you, thank you. This is an amazing series that I (among others) am cherishing during these lockdown days. Thanks again, Mark.

  • @musicmaker33428
    @musicmaker33428 3 года назад

    Thank you Dr. Carroll. This was incredibly helpful and such a digestible approach and explanation for a complicated subject.

  • @sadsalidhalskdjhsald
    @sadsalidhalskdjhsald 4 года назад +2

    Absolutely love these vids. Hope they never stop! 😆

  • @kc-cn8zy
    @kc-cn8zy 4 года назад

    Wonderful summary of "entropy"! TY. There's an excellent discussion about "order" and "power" (more than just different words) in the later part of "The Bottomless Well: The Twilight of Fuel, the Virtue of Waste, and Why We Will Never Run Out of Energy", Mark P. Mills and Peter W. Huber. Maxwell demon's "waste heat" also makes an appearance. (Very interesting subtopic of book, 'waste is virtue'.) Again, TY very much for these lectures.

  • @Cooldrums777
    @Cooldrums777 4 года назад

    I was on board with the first three definitions of entropy, especially Shannon which was covered in depth for me in graduate school. Then you got to quantum entropy and it flew over my head. Haaaaaaaa. Some of what you said made sense. The rest I think fluctuated out of existence for me. Great lecture as usual Prof.

  • @DrDress
    @DrDress 4 года назад

    Finally, some of the ideas come from Sean himself. I suppose this was the subtle point of ALL these videos: to legitimately, yet indirectly, call one's own idea one of the Biggest Ideas in the Universe. *tongue in cheek*

  • @mihaiserbu8447
    @mihaiserbu8447 Год назад

    Wow. Amazing. Thank you!!!
    Almost an apostolic job!!
    Such a great spirit!

  • @websurfer352
    @websurfer352 4 года назад +1

    Thank you!! Thank you!! Please do more of these??

  • @RafaelQuirinoVex
    @RafaelQuirinoVex 3 года назад

    Why can't every professor and/or book teach things as simply and clearly as Sean Carroll does here? Brilliant exposition in this video, thank you so much, Professor!

  • @sunny-sq6ci
    @sunny-sq6ci 4 года назад

    Hey Dr. Carroll, your lecture on time reminded me of the fourth-wall scene in the movie Spaceballs, where the characters were genuinely making a pretty thought-provoking point about time.

  • @starp8949
    @starp8949 4 года назад

    I'm so thrilled to be listening to you talk about physics! I wish I had had the opportunity to listen to you 20 years ago. I might've been a physicist. Oh well, maybe in another parallel universe, I am!

  • @tovarischkrasnyjeshi
    @tovarischkrasnyjeshi 4 года назад +1

    This is kind of more speculative than BIitU, but, if I'm understanding things:
    Starting around 38:00 (to 43:00ish), regarding the lines about the early and late universe looking like homogeneous black bodies, and cosmologists 'not knowing' they're different entropically, was that a criticism of cyclic/bubble universe models (or at least ones where heat death looks like a big bang)?
    Does that extend to models like Penrose and others' CCC model? Or does that model's rescaling affect how the entropy is quantified as well, wiggling out of the question by changing the amount of information it's working with?
    Thinking on it, I guess the CCC model would just move the question back or not deal with it, if I understand. The universe n aeons ago still would have had an extreme entropy differential to evolve through, just at unimaginably different scales, if the 2nd law is true across aeons. So the question of why that universe had such a lower entropy than even ours is pushed back to that universe, back to whatever quantum fluctuation is at the root of the family of aeons that ours is part of.

  • @James_Stewart
    @James_Stewart 4 года назад +1

    I adored information theory and still have the heavily annotated Baierlein text to prove it!

  • @protoword10
    @protoword10 4 года назад

    In my native language, entropy is a feminine word. I remember that during my student years at college, my old thermodynamics professor told us: "Guys, don't get philosophical, get real, we have to solve some problems here" (s/i diagrams of entropy/enthalpy for some fluids and gases). "If you try," he said, "I promise, she'll get you!" LOL
    By the way, Professor Carroll, you give a great explanation of entropy!
    A very methodical approach; even the Boltzmann from our textbooks would tell you thank you for the explanation of his take on entropy...

  • @LearnedSome
    @LearnedSome 4 года назад

    I think this is my favorite episode yet.

  • @dmfoneill
    @dmfoneill 4 года назад +1

    I have enjoyed being challenged by these sessions. My Chem degree ('75, back when there were fewer elements!) has helped... some.
    I am surprised that you haven't covered the biggest of ideas - model building and validation i.e. "science" itself!

  • @theodorebouchez7381
    @theodorebouchez7381 2 года назад

    1:14:10 Why did Landauer and Bennett not consider the entropy and free-energy cost associated with the process of storing information, and only the cost of erasing information?
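
For what it's worth, the Landauer bound the question refers to is easy to put numbers on: erasing one bit requires dissipating at least k_B·T·ln 2 of heat. A minimal sketch of that arithmetic (the room-temperature value is just an assumed example; Python):

    import math

    k_B = 1.380649e-23                         # Boltzmann constant, J/K
    T = 300.0                                  # assumed room temperature, K

    e_bit = k_B * T * math.log(2)              # Landauer limit: minimum heat to erase one bit
    print(f"Erasing 1 bit at {T:.0f} K costs at least {e_bit:.2e} J")     # ~2.9e-21 J
    print(f"Erasing 1 GB (8e9 bits) costs at least {e_bit * 8e9:.2e} J")  # ~2.3e-11 J

Why only erasure, and not writing, carries this cost is exactly the question above; the sketch just fixes the size of the cost being discussed.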

  • @CosmoNaco
    @CosmoNaco 4 года назад

    Thank you Sean, great! This entry is one of the most interesting videos one can ever see on the subject, since it covers so many concepts and leaves one thinking about them.
    Can you expand a little more (I don't know if you already made the Q&A video) on the idea you mention at 56:25, of entropy having a minimum implying that the arrow of time flows the opposite way (and so validates the "past hypothesis") for observers far in the past? I think it resembles the way Hawking discusses in A Brief History of Time that we would perceive time flowing in reverse on the way to a possible big crunch, but it also brings to mind the more recent proposal by Neil Turok and others about the CPT-symmetric Universe... could it also be related to your 2004 proposal?
    Also, I remember that in the mid-1990s Bekenstein and Mukhanov had the idea that entropy could be quantized (for black holes, but given Bekenstein's generalized second law I suppose this also generalizes). If this is the case, would this imply a discretization of time?

  • @Im-just-Stardust
    @Im-just-Stardust 4 года назад +10

    Who said Tuesday was boring. Not ANYMORE with Sean my friends.

  • @dyna88cui
    @dyna88cui Год назад

    Re 1:28: if communication conveys more information when we know very little, and vice versa, then the communicator's and the physicist's views on entropy are the same, aren't they? Just that one focuses on the flow and the other on the stock? ... Or maybe I'm just confusing myself. Thank you for the video.

  • @rosedragon108
    @rosedragon108 4 года назад +2

    smart of you to do youtube vids - ty so much ... recommend your books etc all the time.

  • @CalendulaF
    @CalendulaF 4 года назад

    Just a tiny quibble: 31:00 the guy's name is Josef Loschmidt, not Lohschmidt. He was a giant in chemistry.

  • @robinbrowne5419
    @robinbrowne5419 2 года назад

    Thank you. This is perhaps the most fascinating topic of all :-)

  • @w6wdh
    @w6wdh 4 года назад

    1:27:00 From communications theory, you can use Huffman variable-bit-rate encoding to transform a low-information alphabet into a high-information alphabet. That is, you encode high-probability (expected) symbols with fewer bits and low-probability (surprising) symbols with more bits, so the resulting bitstream can be seen as a high-information alphabet. The information content of each chunk of bits (resulting symbols, e.g. 32-bit words) is more uniform.
    You can also use things like LZW compression (or other compression methods) to perform a similar task. For example, given an English text, LZW compression builds up a dictionary of symbols (words, phonemes, or patterns seen) and then uses Huffman coding to store the actual sequence of symbols. I think.
    What I'm trying to say is that it seems you can transform an alphabet, changing its Shannon entropy, to better use the available bandwidth and signal-to-noise ratio of a communications channel (see the Shannon-Hartley law: channel capacity = bandwidth × log2(1 + signal-to-noise ratio)).

    • @w6wdh
      @w6wdh 4 года назад

      A modern example is old analog TV vs. digital TV. An analog TV signal is low entropy. It has huge content at the horizontal (~15750 Hz) and vertical (~60 Hz) scanning frequencies, both from the sync pulses and the correlations in the images from line-to-line and frame-to-frame. (I’m ignoring the color information for sake of simplicity. If you want to, add lots of content around 3.579545 MHz and slightly adjust the scan frequencies.)
      A digital TV signal looks like white noise, with nearly equal content at all frequencies within the allowed bandwidth. The sequence of video images has been digitized and then highly compressed (MPEG encoding). The resulting bitstream looks like a random sequence of bits. Far more information is transmitted in a DTV channel than was transmitted in an old analog video channel. Good thing, too: we get an HDTV channel and some extra TV channels in one old analog channel.

    • @w6wdh
      @w6wdh 4 года назад

      Which leads into the fun physics question, Could alien civilizations detect our TV transmissions?
      The answer for analog TV transmissions is m-m-maybe, and for DTV transmission, no, because those signals look like white noise.
      For analog TV, something like 100 kW of 15750 Hz and 60 Hz is radiated on a carrier frequency between about 50 and 500 MHz, with antenna patterns that are mostly in a horizontal plane, on a spinning Earth. So you can calculate peak watts per steradian of transmitted power. At great distances (other stars), our TV signal might be swamped by the blackbody radiation of the Sun and the cosmic microwave background radiation. So detectability depends on how large a dish antenna an alien civilization could build: larger diameter = narrower detection angle = higher antenna gain.
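
A minimal sketch of the Shannon-entropy / Huffman-coding point made at the top of this thread (the symbol probabilities are invented; Python, standard library only): the entropy of the source sets the floor, and a Huffman code's average length approaches it by giving probable symbols short codewords and surprising symbols long ones.

    import heapq
    from math import log2

    probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}    # hypothetical source alphabet

    # Shannon entropy of the source, in bits per symbol
    H = -sum(p * log2(p) for p in probs.values())

    # Build a Huffman code: repeatedly merge the two least probable subtrees.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (p1 + p2, tiebreak, merged))
        tiebreak += 1
    codebook = heap[0][2]

    avg_len = sum(probs[s] * len(code) for s, code in codebook.items())
    print("codebook:", codebook)
    print(f"entropy = {H:.3f} bits/symbol, Huffman average length = {avg_len:.3f} bits/symbol")

For this (deliberately dyadic) distribution the two numbers coincide at 1.75 bits/symbol; for general distributions the Huffman average sits within one bit of the entropy.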

  • @Shonucic
    @Shonucic Год назад

    Love this series, thanks a ton for making them!

  • @NeedsEvidence
    @NeedsEvidence 4 года назад

    The lecture is gold. Thank you, professor.

  • @Well_Earned_Siesta
    @Well_Earned_Siesta 4 года назад

    This is easily my favorite episode!

  • @srenherstrm2173
    @srenherstrm2173 4 года назад +3

    Thank you very much for these great videos!
    A question regarding entropy: the arrow of time is due to increasing entropy. However, entropy can decrease locally; how does this relate to the arrow of time?

    • @tomcraddock9002
      @tomcraddock9002 2 года назад

      Good question, I would also like to know the answer.

  • @expchrist
    @expchrist 4 года назад +1

    Questions:
    1. Do quantum computers violate Shannon entropy by running Shor's algorithm?
    2. Do quantum computers "erase" bits of information?
    3. Can quantum systems convey "negative information" and does this in any way affect our calculations when it comes to our probability expectations associated with information communicated between two systems?
    phys.org/news/2005-08-quantum-negative.html
    4. Why does Maxwell's demon not create entropy by writing information? Doesn't writing bits of data to his memory require the demon to do work? Why is work only done when erasure occurs?
    5. Von Neumann's remark that "nobody knows what entropy means" sounds similar to
    "I think I can safely say that nobody really understands quantum mechanics," attributed to Richard Feynman.
    Are the interpretations of entropy as varied and diverse as the interpretations of quantum mechanics? What is the fundamental point of contention that people still debate when it comes to entropy?

  • @Johncornwell103
    @Johncornwell103 4 года назад +1

    I know that this question would be best suited under Special Relativity or Quantum physics.
    But from my current understanding of Special Relativity and General Relativity, has any physicist proposed that it is just the Uncertainty principle at the macroscale?
    I mean, depending on what your velocity and distance are compared to another observer, that determines your position or momentum through spacetime for them.

  • @rv706
    @rv706 4 года назад +2

    Are the equal-entropy subsets of phase space like chunks of the same dimension of the state space, or are they more like a foliation into lower-dimensional submanifolds? (in the second case how is the volume of each submanifold computed? maybe using the volume form induced by the restriction of the symplectic form, assuming e.g. that the restriction is still non-degenerate?)

    • @Majoen1998
      @Majoen1998 4 года назад

      In the microcanonical ensemble, a macrostate is defined by a value for the total energy, so the subset is defined by the equation H (p, q) = E, for some fixed energy E.

    • @rv706
      @rv706 4 года назад

      @@Majoen1998: Okay, so it's more like a foliation. So the "discontinuity" of S that Prof. Sean Carroll hinted at at some point in the video probably isn't a thing in this case.

  • @shera4211
    @shera4211 4 года назад

    First of all, thank you so much for this great lecture and for the series in general! Two things that I learned from this episode are:
    1. The 2nd law of thermodynamics (related to entropy) gives time its direction.
    2. Quantum mechanics forces entropy on subsystems of a larger entangled system.
    Now can one infer from these two points that quantum mechanics is what (indirectly) gives time its direction?
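
A minimal sketch of point 2 above (an invented two-qubit example, not taken from the lecture): a Bell pair is globally pure, with zero total von Neumann entropy, yet tracing out either qubit leaves the other in a maximally mixed state carrying one full bit of entanglement entropy. Assuming Python with numpy:

    import numpy as np

    # Bell state |Phi+> = (|00> + |11>)/sqrt(2), in the basis {|00>, |01>, |10>, |11>}
    psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho = np.outer(psi, psi)                   # density matrix of the full two-qubit system

    def von_neumann_bits(r):
        evals = np.linalg.eigvalsh(r)
        evals = evals[evals > 1e-12]
        return float(-(evals * np.log2(evals)).sum())

    # Reduced state of qubit A: partial trace over qubit B
    rho_A = np.einsum("abcb->ac", rho.reshape(2, 2, 2, 2))

    print(f"S(whole pair) = {von_neumann_bits(rho):.3f} bits")    # 0.000: the global state is pure
    print(f"S(one qubit)  = {von_neumann_bits(rho_A):.3f} bits")  # 1.000: the subsystem is mixed

So entanglement forces entropy onto the parts even when the whole has none; whether that is enough to (indirectly) hand time its direction is the question being asked.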

  • @lovefeelsbest
    @lovefeelsbest 4 года назад +2

    Is that a bong hit in the background?

  • @rbettsx
    @rbettsx 4 года назад

    1:00:28 .. the passage concerning 'memories', or 'records' acquiring their meaning for us, conditional on our hidden assumption of a low-entropy past. What's the maths? What is the relative probability of 1. The low-entropy past giving rise to a consciousness making the interpretations of its surrounding physical world as described, and 2. Any trajectory which gives rise to a transient consciousness with an *illusion* of the past, and / or its physical surroundings? I know this seems a pretty far-out question, but I start to feel that when arguments are made in this style, either the math should answer it, or a justification, perhaps a meta-physical one, is needed for its exclusion. Is that question really covered by Boltzmann brains?

  • @dmofOfficial
    @dmofOfficial 4 года назад +1

    Steve: So let's go down that rabbit hole a little bit further.
    Me: Yes please!!

  • @JohnDlugosz
    @JohnDlugosz 4 года назад

    1:20:00 that idea is seen _directly_ these days in predictive input on your phone or messaging app. There's even an XKCD cartoon www.explainxkcd.com/wiki/index.php/1068:_Swiftkey and later www.explainxkcd.com/wiki/index.php/2169:_Predictive_Models
    The more you have to correct the predictive text input, the more interesting is the message.

  • @sherlockholmeslives.1605
    @sherlockholmeslives.1605 4 года назад +2

    Mike's meal equation.
    Fish + Chips + Salt = A nice meal for Mike.

  • @michellehu562
    @michellehu562 3 года назад

    My entropy certainly increased, since I've got more unknowns after listening to this.

  • @EnginAtik
    @EnginAtik 4 года назад

    It is possible to decrease entropy in a bounded region. It is like cleaning up your living space. If we have an energy source and an entropy dumpster like a black hole that we have access to, we can create a living space. There could have been a major entropy cleanup effort in our region of the universe at some time in the past.

  • @klausgartenstiel4586
    @klausgartenstiel4586 3 года назад

    feature film length, and me on the edge of my seat the whole time.

  • @cdgt1
    @cdgt1 4 года назад +1

    If the early universe had low entropy it would only be able to produce a "little bang". Creation would take place via the Casimir effect, where at first only minute particles would be created. The minute states of matter would combine to create the macro states. In this case a volume or area is required to exist prior to any production of matter, and in turn matter is required to produce waves or electromagnetic radiation. This, as you commented at 49:00, leads to the realization that our universe is not bounded. Our universe must be a sub-component of a larger system. The big bang occurs at the end of the universe and is produced by its collapse.

  • @mathadventuress
    @mathadventuress 4 года назад

    New fan here, and I am still watching the first part of the series :)

  • @Jaggerbush
    @Jaggerbush 3 года назад

    Why do I enjoy entropy so much? I've exhausted everything on YouTube as it relates to entropy. Even those goofy '70s films (that I love). If someone knows of a deep-cut entropy upload, please share.

  • @ColbyNye
    @ColbyNye 4 года назад

    Another great episode! Thank you!

  • @timbabb2508
    @timbabb2508 4 года назад +1

    I was recently very surprised and delighted to learn that temperature can be measured in units of gigabytes per nanojoule.
    It absolutely blows my mind that aspects of the physical universe itself can be measured as quantities of discrete data, distinct from any attempt to _represent_ those aspects; that a human and an alien might be able to agree, in principle, on something like "how many gigabytes a box of gas weighs". Can you talk a little about how this is possible?
    What other physical aspects of the universe can be said to have units involving bits? In a world (and as I understand it, this is a topic close to your heart 🙂) where the structure of spacetime arises from entanglement, in a Von Neumann-entropy sort of way, might it be possible to measure things like distance or time in units of bits?

    • @JohnDlugosz
      @JohnDlugosz 4 года назад

      Well, thermodynamic beta has units of reciprocal energy, which, if you look back at Sean's earlier lecture explaining natural units, is the same as space and time.
      I'm not sure how they equate "bytes per joule" with just "per joule" in two different measurement systems. But having done so, you can solve for bytes and make a similar correspondence for space and time.
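
A minimal back-of-the-envelope sketch of the "gigabytes per nanojoule" claim at the top of this thread (room temperature is just an assumed example value): adding heat dQ at temperature T raises entropy by dQ/T, which in information units is dQ/(k_B·T·ln 2) bits, so 1/T can indeed be read in bits per joule.

    import math

    k_B = 1.380649e-23                                 # Boltzmann constant, J/K
    T = 300.0                                          # assumed room temperature, K

    bits_per_joule = 1.0 / (k_B * T * math.log(2))     # entropy gained per joule of heat, in bits
    gb_per_nanojoule = bits_per_joule * 1e-9 / 8e9     # convert: 1 GB = 8e9 bits

    print(f"{bits_per_joule:.2e} bits per joule at {T:.0f} K")   # ~3.5e20 bits/J
    print(f"= {gb_per_nanojoule:.0f} GB per nanojoule")          # ~40 GB/nJ

The human and the alien can agree on this number because k_B·ln 2 is just a unit-conversion factor between thermodynamic and information units.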

  • @michaeldamolsen
    @michaeldamolsen 4 года назад +1

    Preprint of Dr. Carroll's paper on Boltzmann Brains, for those interested in more detail: arxiv.org/pdf/1702.00850.pdf

  • @timbabb2508
    @timbabb2508 4 года назад +1

    With regards to the globally low entropy at the beginning of the universe, how is this related to the size of the cosmic horizon?
    If we play our expanding cosmic horizon in reverse, going backward in time, it seems the visible universe should be getting smaller and smaller. Is there a point in the early universe where the visible universe, being so tiny, would contain only a few qubits of quantum information? If so, would that configuration be considered to have very low entropy compared to today, or any other moment thereafter? (And would that be enough to "explain" the initially low entropy, or is there, say, a circular assumption in there?)
    And in the other direction, would it be right to say the expanding cosmic horizon is responsible for globally increasing entropy? Either in the sense that the volume of the visible universe is increasing (larger phase space → more entropy), or that a bunch of thermal photons are raining down on us from the horizon ("outside the system"), bumping into things and screwing up our ability to confine the evolution of local phase space?

  • @shera4211
    @shera4211 4 года назад

    Another question: I understand the AdS - CFT correspondence as stating that gravity can be seen as a dimension (or an additional axis of the phase space maybe?). If so, can one interpret the event horizon of a black hole as the threshold for the gravity (density)? I mean, the gravity density within a black hole is so much larger than that of the surrounding, that in coarse-grained fashion one could say that gravity is relevant inside - i.e. gravity turned on, i.e. 4+1-dim AdS space, at event horizon. At the surface gravity is turned off, i.e. 3+1-dim boundary described by CFT.
    Is the surface of black hole the coarse graining of the volume it encloses s.t. one could do the same one does in thermodynamics with atoms and gas?

  • @MrPythonnn
    @MrPythonnn 4 года назад

    Thanks, Sir. Very loud and clear.

  • @alwaysdisputin9930
    @alwaysdisputin9930 3 года назад

    1:21:00 So Shannon says that if we get told "the Sun rises in the East," our surprisal = 0 and we gain no new info, but "the Sun rises in the West" gives us information, e.g. maybe we're not on Earth. It's like how, when Fermilab found one of the electron's cousins to be more wobbly than the Standard Model predicts, physicists got excited.
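
A minimal sketch of the surprisal being described above (the probabilities are invented for illustration): Shannon's surprisal of an event with probability p is -log2(p), so a near-certain report carries essentially no information and a wildly unlikely one carries a lot.

    from math import log2

    def surprisal_bits(p):
        return -log2(p)

    print(surprisal_bits(0.999999))   # "Sun rises in the East": ~1.4e-6 bits, essentially no news
    print(surprisal_bits(1e-9))       # "Sun rises in the West": ~29.9 bits of surprise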

  • @BobBogaert
    @BobBogaert 3 года назад

    Some of that rare YouTube time that isn't wasted.

  • @PavlosPapageorgiou
    @PavlosPapageorgiou 4 года назад

    All right, finally I get where the low-entropy mystery comes from. If the reversibility objection is valid you need the hypothesis that we're on a path with a low-entropy past, and you could speculate it is The past or A possible past among many. To me the objection seems thoroughly unconvincing because it's classical and global. I'd expect you to say there's a local asymmetry driving the 2nd law, either from coarse graining or from some combinatorial aspect of quantum mechanics. I need to read up on this. My hunch is the 2nd law can be reformulated as conservation of data, where the information that makes the macrostate distinct among others is preserved but the macrostate gets larger. Then the past hypothesis is that our universe has a relatively low bound on information content that's distinct from its evolution. Thanks.

  • @olivierloose9905
    @olivierloose9905 4 года назад

    A question: I don't know how to fit together the notion of unitarity in quantum mechanics (information is conserved) with the notion of an increasing entropy in the Universe. That is, we know that the 2nd law of thermodynamics holds in the Universe because of the Past Hypothesis (entropy was lower in the past), but we also know that entropy increases as a result of breaking the time reversibility symmetry (e.g., entropy increases when erasing information). Given that quantum mechanics (a theory that describes the Universe) dictates that information is conserved we could infer that entropy is overall stable. How is this possible?

  • @shohamsen8986
    @shohamsen8986 2 года назад

    This is really good.

  • @SkorjOlafsen
    @SkorjOlafsen 4 года назад +3

    Does black hole "decay" due to Hawking radiation increase entropy? I find that hard to believe, as the entropy of event horizons is so high. Also, you mentioned the max entropy as 10^123, is that based on the cosmic event horizon? If we include that, it dominates everything else, right? (Which makes sense, as it sort of represents the entropy of the universe outside the observable). Aren't there cosmological models where the cosmic event horizon actually shrinks as part of a Big Rip, and thus the universe's entropy falls quite fast at the end?

    • @JohnDlugosz
      @JohnDlugosz 4 года назад

      Yes, that's what he said. When the universe is cooler and more rarefied, being spread out increases entropy again. Hard to believe? Well, consider that big black holes won't decay until the universe is much older, as their temperature is lower than the CMB's. They are actually growing by absorbing CMB photons. It's not _until_ the outside is at a lower temperature that the BH starts to evaporate.
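
A small numerical check of the reply above, using the standard Hawking-temperature formula T = ħc³ / (8πGMk_B) (the masses are just example values): stellar-mass black holes today are far colder than the 2.73 K microwave background, so they absorb more than they radiate and cannot begin net evaporation until the CMB has cooled below their temperature.

    import math

    hbar, c, G, k_B = 1.054571817e-34, 2.99792458e8, 6.67430e-11, 1.380649e-23
    M_sun = 1.989e30   # kg

    def hawking_temperature(M):
        # Temperature of a Schwarzschild black hole of mass M, in kelvin
        return hbar * c**3 / (8 * math.pi * G * M * k_B)

    print(f"1 solar mass:    {hawking_temperature(M_sun):.1e} K")       # ~6e-8 K
    print(f"10 solar masses: {hawking_temperature(10 * M_sun):.1e} K")  # ~6e-9 K
    print("CMB today: 2.73 K, so both are (for now) net absorbers")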

  • @kagannasuhbeyoglu
    @kagannasuhbeyoglu 4 года назад

    Excellent series carry on...👍

  • @eugeniusbear2297
    @eugeniusbear2297 4 года назад

    The vacuum pressure of the universe doesn’t like being disturbed by mass and so it acts to push mass back together to minimize the overall field disturbance created by mass. This follows naturally from consideration of the square-cubed relationship between the field disturbances (surface areas) and mass (volumes).
    Entropy is the universe’s reaction to pack mass/energy back into a single point or multiple single points (i.e. black holes).

  • @phillipsmith4979
    @phillipsmith4979 4 года назад

    An excellent presentation. I have a question: what is the amount of entropy a microstate has to absorb to move to the next-higher-entropy macrostate called? For example, how much disorder is required to move from a tidy room to an untidy room? While that distinction is arbitrary, some tidy rooms are further away from being untidy than others, just as some microstates are further away from the macrostate boundary than others. As I understand it, this is called negative entropy, however I'm not sure. If so, does this have any relationship to negative probability?

  • @vinm300
    @vinm300 3 года назад

    Nobody handles big ideas better than Sean Carroll.
    Robert (Closer to Truth) asked him, "Is information the fundamental underlying reality?"
    Carroll said, "No".
    Most of Robert's interlocutors talk in circles, give pedantic metaphors, then don't answer.

  • @shera4211
    @shera4211 4 года назад

    Does the decoherence explanation for Hyperion also work for the randomness of a fair coin-flip experiment? I.e., is the Bernoulli distribution a coarse-grained model for all the interactions between the coin and the surrounding air molecules and radiation for the duration of the flip?

  • @Reppucci24
    @Reppucci24 4 года назад +1

    Consciousness as an emergent property of the brain... could it be a "dark emergent" property that influences the collapse of the wave function and stacks our time slices (like a messy file cabinet :)? Similar to emergent properties in other subatomic realms.

    • @DApple-sq1om
      @DApple-sq1om 3 года назад

      A rock can collapse the wave function.

  • @TheMemesofDestruction
    @TheMemesofDestruction 2 года назад

    30:08 - How did he know?

  • @LiamHaleMcCarty
    @LiamHaleMcCarty 4 года назад

    When you talk about probability in the context of entropy (e.g. that a system is extremely likely to evolve to a higher entropy state), what’s the best philosophical grounding for that? The frequentist view seems natural... but also artificial. Trying to connect this to your last biggest idea video

  • @infomage
    @infomage 4 года назад

    So if we artificially create a REALLY high-entropy state in a closed system, can we make time flow backwards in that system? I don't understand. Suppose at T=0 we create (simulate?) a really low-entropy system. Then in which direction does the entropy begin to decrease? If you say there is a preferred direction, then you've already built in a direction. I'm not trying to be contrary, just trying to understand.

  • @orsozapata
    @orsozapata 4 года назад

    @53:18 A single brain that lives long enough to look around and go "Hah, thermal equilibrium," and then it dies.

  • @BC-hz4ut
    @BC-hz4ut 4 года назад

    Hi Sean,
    Thank you for the great work you're doing, your illuminations are shining into all corners of the known universe.
    1) Philosophically speaking, does the quantum decoherence that creates the "Many Worlds" proposition violate the second law of thermodynamics? Or is each world truly a closed system that "forgets" the "other" worlds in the wave function?
    2) Can Maxwell's demon be used to explain how "Many Worlds" doesn't violate the 2nd law, because each "world", just like Maxwell's demon, needs to "forget" the other worlds in the wave function in order to keep "separating"?
    3) Can Sir Roger Penrose's "Conformal Cyclic Cosmology" be used to explain the low-entropy initial state of this "Aeon"? Looking at how the past "Aeon" could have converged into a universe-sized black hole (singularity) that, through Hawking radiation, gives us the initial state variables that gave us the cosmological constant and the quantum gravity that creates the initial low-entropy state for this universe?

  • @adamharris8666
    @adamharris8666 3 года назад

    Thanks for the info 🙏🏻

  • @cleon_teunissen
    @cleon_teunissen 4 года назад +1

    Back when I was 15 or so, in physics class in school, our teacher treated us to a vivid tabletop demonstration of the physical significance of entropy:
    The demonstration involved two beakers, stacked with the openings facing each other; initially a sheet of thin cardboard separated the two. In the bottom beaker a quantity of Nitrogen dioxide gas had been added. The brown color of the gas was clearly visible. The top beaker was filled with plain air. Nitrogen dioxide is denser than air.
    When the separator was removed we saw the brown color of the Nitrogen dioxide rise to the top. In less than half a minute the combined space was an even brown color.
    And then the teacher explained the significance: in the process of filling the entire space the heavier Nitrogen dioxide molecules had displaced lighter molecules. That is: a significant part of the population of Nitrogen dioxide had moved _against_ the pull of gravity. This move against gravity is probability driven.
    Much later I learned about statistical mechanics. Statistical mechanics provides the means to treat this process quantitatively. You quantify by counting numbers of states. Let's say that at the start there are 4 heavy molecules in the lower half and 4 light molecules in the top half. With a set of 4 elements you count 24 different states (4*3*2*1). So before removing the separator: top half, 24 states; bottom half, 24 states. Remove the separator and you count 8*7*6*5*4*3*2*1 states. Of course that's not how you would count the states of an actual gas; this is just to give somewhat of an idea of how this kind of probability can be expressed in quantitative form.
    Returning to the demonstration with the Nitrogen dioxide. The heavy Nitrogen dioxide molecules were (on average) climbing up. This was the only way forward. The end state (mixed) is more probable than the starting state, so that is what that system progresses to.

    • @barefootalien
      @barefootalien 4 года назад

      That is an amazing and powerful demonstration! I'd love to congratulate that teacher on a job well done!

    • @JohnDlugosz
      @JohnDlugosz 4 года назад

      Brazil nuts rise to the top when you vibrate a can of mixed nuts. Again, the denser stuff floats rather than sinks, increasing potential energy.
      I like to think the same principle is at work in any endothermic reaction: entropy trumps energy. The flow of energy/temperature is just a manifestation of the larger rule of increasing entropy, and if other forms of entropy are present it can dominate.

    • @barefootalien
      @barefootalien 4 года назад

      @@JohnDlugosz Um... sorry, but... what??
      First of all, brazil nuts are not a particularly dense nut. They're less dense than almonds and cashews, more dense than peanuts and pecans. Their bulk density (including the spaces between them in a typical random spacial orientation) is also middle-of-the-road, significantly lower than peanuts, a little higher than almonds, and quite a bit higher than walnuts or pecans.
      Also, I'm pretty sure that after they have risen to the top, the total entropy of the container has _decreased_ (there are many more ways for them to be randomly distributed than to be all on the top, just like the coffee and cream example).
      Rather, the reason they rise to the top is simply due to their size and shape; they are larger, so they would need to displace and move aside many more smaller nuts in order to descend, where if one of them moves upward, many more smaller nuts can move downward to compensate. Though this does decrease entropy, it is allowed because by vibrating the container, you are inputting energy (and thus it is very much _not_ a closed system). Whatever means you use to vibrate them, whether a machine or your own body, you will be increasing the entropy of the universe more than the small decrease of energy from the more orderly resulting orientations of the nuts. The potential energy of the container should remain approximately the same, since they are of average density.
      Effectively, this is the opposite situation compared to what happened in the demonstration described. In the demonstration, an initially ordered, low-entropy configuration moved, as a closed system, toward a higher-entropy state, in spite of the fact that potential energy increased in order to do so. In the nuts example, an initially disordered, high-entropy configuration moved, as an _open_ system, toward a lower-entropy state with the nuts in a more ordered configuration, with negligible change in potential energy.

    • @cleon_teunissen
      @cleon_teunissen 4 года назад

      @@JohnDlugosz
      The unmixing of the mixed nuts, that does illustrate something, but I believe it's not applicable for illustrating entropy effect.
      I think the mixed nuts case cannot be approximated in some form of idealized case.
      To develop a visualization we must simplify. Among the most powerful simplifications is to treat the case as frictionless. For comparison: brownian motion of particles with a higher specific density than water. Brownian motion: the particles remain in suspension when the randomness introduced by being randomly buffeted is larger than the gravity bias. This system proceeds to an equilibrium state. We know it's an equilibrium state because the equilibrium can be shifted: increase the gravity bias with centrifugation. If you spin fast enough the G-load wins the day: particles go out of suspension.
      Now the mixed nuts. Let the large nuts have a slightly higher specific density than the overall nut mix. Then the vibration will still make the bigger nuts migrate _upwards_. (Both with 1 G and with higher G-load) That is the opposite of the result in the brownian motion case. This shows that in the case of the mixed nuts the simplification of ignoring friction is _not valid_.
      The mixed nuts unmixing does illustrate something, but not entropy effect.
      Endothermic process
      Yeah, the mixing of air and Nitrogen dioxide is endothermic. In the final state the gravitational potential energy is higher than in the starting state. Accordingly from start to end the kinetic energy has decreased.
      As we know, a general class of endothermic processes is a salt getting dissolved in a polar solvent (dissolving in water the most common example, of course)
      About endothermic chemical reaction.
      Well, with chemical reaction it's more complicated. Let's say you have molecule A and molecule B, and they can combine to form molecule AB, endothermically. As we know, chemical reaction is a two-way street. That means that in the absence of any other process the concentration of AB will remain very low; any AB that is formed has a high probability of falling apart again. One way to shift that equilibrium is to have a large supply of a molecule C that binds to AB, forming a very stable ABC. If the concentration of AB can be pulled very low then the trickle of the A + B => AB reaction is kept alive.
      This scenario is complicated; chemical intervention is used to prevent equilibrium
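
A minimal sketch of the state counting in the top comment of this thread (same toy numbers; as that comment says, this is not how you would count states of a real gas): with 4 heavy and 4 light molecules, removing the separator multiplies the number of orderings, and a Boltzmann-style entropy S = ln W (in units of k_B) goes up accordingly.

    import math

    W_separated = math.factorial(4) * math.factorial(4)   # orderings with the separator in place
    W_mixed = math.factorial(8)                            # orderings once everything can mix

    S_separated = math.log(W_separated)                    # S = ln W, in units of k_B
    S_mixed = math.log(W_mixed)

    print(f"W: {W_separated} -> {W_mixed}")                   # 576 -> 40320
    print(f"S: {S_separated:.2f} k_B -> {S_mixed:.2f} k_B")   # 6.36 -> 10.60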

  • @bjpafa1322
    @bjpafa1322 4 года назад

    Congratulations, once more.

    • @bjpafa1322
      @bjpafa1322 4 года назад

      fantastic. It seems so simple...

  • @deletefacebook8419
    @deletefacebook8419 2 года назад

    Could we not monitor the power consumption of the United States to understand efficiency, and thus predict the future with a probability that is at least greater than 51%, assuming that there are some visible patterns in human behavior and enough data collected?