Fascinating dialogue. Her questions were clever and his answers revealing.
I am sincerely amazed to have easy access to such information.
First class podcast, Sean. A genuine conversation instead of a Q&A interview, which most other podcasts can't seem to avoid. Thanks.
Kate does a nice job of interviewing Sean.
Yes he was a great guest, very articulate. Should think of setting up his own podcast!
Mindscape is a nice podcast for undergraduate students like me to listen to and become familiar with science. Thanks for making this.
I think the great oxygenation catastrophe is in the short term rather portentous, but in the long term a good sign. That is, the overall complexity of life on Earth increased in that event. The downside is that in the (geologically) "short" timescale of millions of years, there was the devastating extinction event. Human civilization probably represents one such sudden jump in complexity - we will do great harm to the biosphere before a new "balance" is reached between our technological species and the broader environment. So perhaps we can extrapolate that things are going to get a lot worse before they get better... but at least they will eventually get better. Hopefully the human species can survive that period, as the first photosynthesizing life did before us.
I wish this could have gone on for hours and hours more. So thought provoking, and I know they both had so much more to discuss.
Great. One question of hers I would like to echo. She asked about the subjectivity of the macrostate, and Sean said it was a long-standing problem with entropy, though I've listened to a lot of what Sean and others have said about it and this is the first acknowledgement of that problem I've ever heard. I understand Sean's answer, but the subjectivity is still there, and it seems like we haven't really unraveled the issue yet.
This problem is because physicists were taught a really backwards idea of what entropy is (the real content of this law of nature). It simply tracks the number of free-flowing elements in a system. Like books on a shelf: it doesn't matter how you order those books; as long as the quantity of books is the same, the entropy remains the same. The concept of "disorder" is meaningless. Each ordering is equal. But add another book to the shelf, and the entropy increases. That's it.
This means that the universe keeps adding more things as time progresses, and mainstream physics is still kinda stuck on the steady-state theory, at least when it comes to the "smallest parts of the universe". But what is probably the case is that the universe is fractal, and instead of "adding" atoms, or particles, or strings, or whatever the smallest bit we're thinking about these days is, the bits are being divided. So we have the same amount of matter and energy, but we add detail, like zooming in on the Mandelbrot fractal. This adds complexity as entropy increases, but we never have an end, and the only "heat death" is local, as beloved patterns end and become new patterns.
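For what it's worth, here is a minimal sketch of the counting picture this comment describes, assuming the "macrostate" is just the number of distinguishable books on the shelf (a Boltzmann-style log-count, not the full thermodynamic story; the function name is only illustrative):

```python
# Minimal sketch of the book-shelf counting idea: if all we track is the
# number of distinguishable books N, the number of distinct orderings is N!,
# so a log-count in the spirit of S = k ln W depends only on N, not on which
# particular ordering is on the shelf, and it jumps whenever a book is added.

import math

def log_count_of_orderings(n_books: int) -> float:
    """Natural log of the number of ways to order n distinguishable books."""
    return math.lgamma(n_books + 1)  # ln(n!)

for n in (5, 10, 11, 100):
    print(f"{n:>3} books: ln(number of orderings) = {log_count_of_orderings(n):8.2f}")
```

Whether that count is the right notion of entropy is exactly what the thread is arguing about; the snippet only shows that the count is ordering-independent and grows with the number of elements.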
Another fantastic conversation to listen to. I liked that while the guest was clearly very accomplished and intelligent in neuroscience, she was asking physics questions I had too. Well done!
Kate Jeffery's concerns about the subjectivity involved in the statistical definition of entropy are very much to the point. There is no objective way to coarse-grain your way to the second law of thermodynamics, as Loschmidt already pointed out to Boltzmann. Entropy is the amount of information that we miss in order to accurately predict a system's future. The relevant information that we have consists not only of the macroscopic pattern of milk and coffee in the cup; one also has to take into account the crucial bit of information that two minutes ago there was no milk in the cup. For each microstate consistent with a pattern of milk strings in the coffee, there is an equivalent microstate (same positions, opposite velocities) that would result in a process that would look like time going backwards. It is not that those states are very small in number; in fact, the numbers are exactly the same! What sets them apart is the one crucial bit of information that two minutes ago there was no milk in the coffee. History does matter when defining entropy. The history of a DNA molecule is crucial information. We have much more information about the future of a healthy, young DNA molecule than about the future of an old, worn and torn DNA molecule, hence the latter has much higher entropy than the former. Entropy is a measure of the amount of information missing to accurately predict the future. That is all there is to it.
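As a toy illustration of the information-theoretic reading in the comment above (my own sketch, not the commenter's derivation): the Gibbs/Shannon entropy of an assumed probability distribution over candidate microstates measures the missing information, and adding a piece of knowledge such as "two minutes ago there was no milk" narrows the distribution and lowers it.

```python
# Toy example: Gibbs/Shannon entropy S = -sum_i p_i ln p_i over candidate
# microstates, read as "information still missing before the exact microstate
# (and hence its future under deterministic dynamics) is pinned down".
# Extra knowledge conditions the distribution, rules out candidates, lowers S.

import math

def gibbs_entropy(probs):
    """Entropy in nats of a discrete distribution over microstates."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

uninformed = [0.25, 0.25, 0.25, 0.25]  # four equally likely candidates
informed   = [0.5, 0.5, 0.0, 0.0]      # history rules two of them out

print(gibbs_entropy(uninformed))  # ln(4) ~ 1.39 nats
print(gibbs_entropy(informed))    # ln(2) ~ 0.69 nats
```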
would love to get Douglas Hofstadter on here!!! please!
Yeah would be great... I've sent a few messages/comments wanting him on Sam Harris's podcast too.
From _The Key to Everything_ (May 2018) by Freeman Dyson at nybooks.com.
"The astronomer Fang Lizhi published with his wife, Li Shuxian, a popular book, _Creation of the Universe_ (1989), which includes the best explanation that I have seen of the paradox of order and disorder. The explanation lies in the peculiar behavior of gravity in the physical world. ... Any object whose motions are dominated by gravity will have energy decreasing as temperature increases and energy increasing as temperature decreases. ... In every situation where gravity is dominant, the second law causes local contrasts to increase together with entropy. ... The diversity of astronomical and terrestrial objects, including living creatures, tends to increase with time, in spite of the second law."
This is an offhand paragraph in a slightly cranky review of Geoffrey West's _Scale: The Universal Laws of Growth_ (2017).
Before you say (as I am also tempted to do) "Damn, Freeman, give us the whole book!" you have to take into account that Freeman was born in 1923 and wrote that review in his mid-nineties.
I don't know if this conversation is worth a hill of beans without taking Dyson's point into account. Carroll would know, if he reads this comment. People in math these days joke that the last recourse of a failed research project is to interest Terence Tao over the weekend. You might have the rough outlines of a viable attack on your problem on your desk by Monday morning. This joke used to be made about Freeman Dyson concerning his own lightning mathematical gifts.
I also like to quote this passage by Freeman Dyson from _Of Historical Note: Richard Feynman_ (2011):
"That afternoon, Feynman produced more brilliant ideas per square minute than I have ever seen anywhere before or since. In the evening I mentioned that there were just two problems for which the finiteness of the theory remained to be established; both problems are well-known and feared by physicists. ... When I mentioned this fact, Feynman said, "We'll see about this," and proceeded to sit down and in two hours, before our eyes, obtain finite and sensible answers to both problems. It was the most amazing piece of lightning calculation I have ever witnessed, and the results prove, apart from some unforeseen complication, the consistency of the whole theory. The two problems were: the scattering of light by an electric field, and the scattering of light by light."
See also _A Flash of Illumination on the Greyhound Bus: Physicist Freeman Dyson on Creative Breakthrough and the Unconscious Mind_ (June 2018) by Maria Popova:
"On the third day of the journey a remarkable thing happened; going into a sort of semistupor as one does after forty-eight hours of bus riding, I began to think very hard about physics, and particularly about the rival radiation theories of Schwinger and Feynman. Gradually my thoughts grew more coherent, and before I knew where I was, I had solved the problem that had been in the back of my mind all this year, which was to prove the equivalence of the two theories. ... This piece of work is neither difficult nor particularly clever, but it is undeniably important if nobody else has done it in the meantime. I became quite excited over it when I reached Chicago and sent off a letter to Bethe announcing the triumph. I have not had time yet to write it down properly, but I am intending as soon as possible to write a formal paper and get it published. ... I was for five days stuck in my rooms, writing and thinking with a concentration which nearly killed me. On the seventh day the paper was complete, and with immense satisfaction I wrote the number 52 at the bottom of the last page. ..."
"Less than a month after he submitted the paper to the prestigious science journal _Physical Review,_ Dyson received a letter that it had been accepted in its entirety - a highly unusual decision in its rapidity. His paper, one of the longest the journal has ever published, had gone through the peer review process in record time. The first paper to make use of Feynman’s diagrams, it championed their power not merely as a computational tool but as a physical theory, inviting other scientists to appreciate their brilliance and splendor. It was a turning point for the acceptance of Feynman’s unorthodox ideas in the scientific community."
Must be hard for the reviewers to quibble in good conscience, when they have each already passed along samizdat copies of the draft manuscript to ten of their closest friends. That was quite the authorial fugue. Many have tried to write a Ph.D. dissertation by locking themselves into their dorm room for five days. Few have succeeded. The funny part is, Dyson never obtained a Ph.D. and spent much of his life railing against the institution:
"Oh, yes. I’m very proud of not having a Ph.D. I think the Ph.D. system is an abomination. It was invented as a system for educating German professors in the 19th century, and it works well under those conditions. It’s good for a very small number of people who are going to spend their lives being professors. But it has become now a kind of union card that you have to have in order to have a job, whether it’s being a professor or other things, and it’s quite inappropriate for that. It forces people to waste years and years of their lives sort of pretending to do research for which they’re not at all well-suited. In the end, they have this piece of paper which says they’re qualified, but it really doesn’t mean anything. The Ph.D. takes far too long and discourages women from becoming scientists, which I consider a great tragedy. So I have opposed it all my life without any success at all."
So bear in mind when Dyson says that gravity matters to the entropy story, he's really just some geriatric, contrarian schlub with barely any academic credentials at all.
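For readers wondering why gravity flips the usual intuition in the Dyson quote above, here is the standard virial-theorem sketch (textbook material, not taken from Dyson's review); the bound-equilibrium and ideal-gas temperature assumptions are the usual idealizations.

```latex
% For a bound, self-gravitating system in steady state, the virial theorem gives
\[
  2\langle K\rangle + \langle U\rangle = 0
  \quad\Rightarrow\quad
  E = \langle K\rangle + \langle U\rangle = -\langle K\rangle .
\]
% Identifying temperature through \(\langle K\rangle = \tfrac{3}{2} N k_B T\),
\[
  E = -\tfrac{3}{2} N k_B T
  \quad\Rightarrow\quad
  C = \frac{dE}{dT} = -\tfrac{3}{2} N k_B < 0 ,
\]
% i.e. the heat capacity is negative: a gravity-bound system heats up and
% contracts as it loses energy, which is why gravity-dominated systems can grow
% local contrasts even while total entropy increases.
```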
14:00 Every change to the gene will cause a change to the geometry, but not necessarily the function. Imagine a hammer: changes to the length of the handle will have little impact on function, unless you make it too long or too skinny near the head. Structure can be varied generation to generation while maintaining function.
20:30 Some nature program I watched described life in Southern Africa as 'chasing the rain'. It was a little while after I had been watching hydrology lectures. It made me think about the earliest life on the planet and such life being caught in ocean and wind currents, and how from there life has been in an evolving race to catch the rain.
45:00 While being unable to communicate in a two-way fashion across black-hole boundaries, would a message from ancestors, or 'information', survive the transition of a new black-hole universe being created?
Fascinating discussion!
One of your best episodes so far.
I love this talk. Not for its content. For the moment, let's (try to) disregard the obvious and barely hidden subjective hard-relativist, philosophy-centric motives and tendencies that Kate Jeffery's ideas suggest. Beyond all that subjective noise (who knows what emotional desperation is causing Kate to need such a heavy philosophical overlay), what I am interested in is a larger question of the shifting reactive relationship between knowledge and prediction. Let's try to get beyond the obvious problem of Sean's (too lost in the forest to see the trees) generalisms, and Kate's (too lost in the tree to see the forest) specificities. The content seems confused at best. I love it for the larger questions that the confusion suggests.

Carroll's description of entropy, the standard stochastic coarse-graining, the accounting of the macro-states that the micro-states can be in, is indeed blind to the criteria of future utility, what in evolution is referred to as "fitness". As fitness is a function of the ability to exploit environmental resources (better than the entities you share that environment with), the macro-states that matter, that are worth measuring, in any tomorrow, are inscrutable in any present, except as predictions. This problem highlights the importance of prediction, the centrality of prediction in any formalization of evolution as a process and, importantly, in any description of evolution. Carroll seems either to be avoiding the topic altogether or unable to risk the academic blowback of considering evolution as a universal phenomenon ACTIVELY going about maximizing total entropy (dissipation of energy/structure density). It seems so obvious to me that fitness is the capacity of a system to find the shortest path from all here-and-nows to that one then-and-there eventuality which is asymptotic dissipation. This competition towards "shortest path" causality towards maximum entropy necessarily demands ever broader and deeper prediction, and such prediction demands ever more concise and processable understanding of causality, of reality.

In the context of this podcast, of the discussion that Kate is imploring Sean to engage, the observational criteria that any macro-state measurement entails are not static; they must take into account the ever-improving mapping of causal reality that is the result of evolution. Understanding change, predicting the future of the general and specific environment(s), must be (is) folded back into our criteria for "macro-state" as a measure of entropy. If all reactions are measurements, then nature certainly exists as a dynamic measure of macro-states (not a static measure of past macro-states). Prediction (in its most simple form) is happening at every granularity (it certainly doesn't require a sophisticated nervous system). A box of triangle-shaped pieces of cardboard predicts (causes) different macro-configurations than does a box of rectangle-shaped pieces of cardboard. Any difference predicts different futures. What evolution selects for at the macro scale is more and more robust and comprehensive prediction processes, simply because the least-energy path towards maximum entropy is always a better map, a more accurate and processable map of all that is. Kate, as a researcher more familiar with the rapid, wild end of evolution, of learning, is rightly demanding that Sean acknowledge the directional dynamics at play in all systems but more easily seen in biology than in cosmology.
PS:
1. Trying to unpack Kate's obsession with what she is calling "space". Went and read some of her articles, and it seems that she is confusing the concept of "locality" with "space". Locality is the likelihood that two systems will interact and affect each other when they are closer to each other. Space is the dimensions in which locality varies. Again, Kate seems wholly ignorant of the most basic understanding of the physics of the dynamics of anything of which brains are an example, namely graph, network, communication and computation theory.
2. What is Kate's obsession with "space" as a concept? She really never actually reveals her subjective motive here. Why? What is the purpose of her obfuscation? I am guessing that she feels a sense of personal power and control when hiding her motives and intent. I am guessing that she has found herself studying neuroscience for a very particular personal reason… childhood trauma? Regardless, I am constantly amazed at the ignorance of neuroscientists of the basic physical dynamics that govern brains of any kind, namely the network and graph dynamics that result in the computational and communication laws that govern all knowledge and processing. How is it that neuroscientists are forgiven for such obvious ignorance? Why the kid gloves, Sean?
One fascinating conclusion that was not addressed was the finite life span of any biological organism (or simply the inevitability of death) as another consequence of the second law of thermodynamics.
Gravity is still important in the oil/water example, even if there may be another chemical reason why they would separate and increase kinetic energy in its absence. You can find other examples that don't have the water/oil chemistry, such as heavy and light gases separating in an atmosphere. When the oil rises to the top, potential energy drops and kinetic energy rises (causing increased temperature), which is then radiated out through more photons, increasing the degrees of freedom of the description of the system. The same is true in an atmosphere of, say, helium and hydrogen; ignoring nuclear decay etc., helium would be more towards the center of a large ball of gas floating out in space and hydrogen more towards the surface, even in the steady state, for the same reasons of potential/kinetic energy. Hydrogen would also get knocked off more easily and escape faster than the helium. Entropy doesn't say everything has to end up well mixed, just that the degrees of freedom have to increase. Heavier things will tend to cluster, because high potential energy states tend to have fewer degrees of freedom, and clustered heavy things minimize potential energy and maximize kinetic energy. In a state where light things are as evenly distributed as heavy, dense things, you have less entropy than if you had the heavy things clustered and the lighter things moving rapidly with the increased kinetic energy.
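A rough numerical sketch of the gravitational-separation point, under the simplifying assumption of an isothermal column with Earth-like surface gravity (the numbers are only illustrative, and real atmospheres also mix, convect and escape):

```python
# In an isothermal column each species follows n(h) ~ exp(-m*g*h / (k*T)),
# so its scale height H = k*T / (m*g). The heavier gas (helium) has the
# smaller H and piles up lower; the lighter gas (hydrogen) extends higher.

K_B = 1.380649e-23   # Boltzmann constant, J/K
G   = 9.81           # surface gravity, m/s^2 (Earth-like, just for scale)
T   = 300.0          # temperature, K

MASS = {              # molecular masses in kg
    "H2": 2 * 1.6735575e-27,
    "He": 6.6464731e-27,
}

for gas, m in MASS.items():
    scale_height_km = K_B * T / (m * G) / 1e3
    print(f"{gas}: scale height ~ {scale_height_km:.0f} km")
# H2 comes out roughly twice the He value, so helium concentrates toward the
# bottom and hydrogen toward the top: separation, not uniform mixing, is the
# equilibrium once gravity is in the picture.
```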
Okay. One has a universe that is expanding, and can expand faster than light. How, then, with matter somewhere in the universe moving away from matter somewhere else faster than light, can the matter in the two places be considered to be in the same (that is, in a maximum) entropy state? Looking at this another way, how can it be declared that the big bang isn't something more like a big jet - a continual outpouring of matter - in a part of the universe that is now so far away from us as to be causally disconnected? Provide your answer below. Thanks.
Please get David Deutsch on the podcast! Thanks for the great content!
YES! David is great.
@@Kritikk Incorrect. He's the BEST.
Fascinating discussion. However, I think the incorporation of the many-worlds interpretation would allow for a more complete understanding of entropy in both simple and complex systems. It also addresses her "what drives the evolution of entropy" question, as many worlds explains how a zero-entropy singularity is divided into an infinite number of realities that each eventually end in quantum equilibrium, one of which we find ourselves in. There are plenty of realities in which life cannot develop, others where life went extinct early in its development, but no realities where life exists forever within its own universe.
The "why did life in our universe develop the way it has" question is answered in that it was a possible emergent phenomenon from one possible solution of the singularity expanding into equilibrium (aka the driving force of entropy)... and all possible solutions (universes) exist.
These topics are the best way to start the week
I thoroughly enjoyed this podcast. I have been fascinated with evolution and complexity ever since I read Ilya Prigogine's books 'Order out of Chaos' and 'The End of Certainty,' which also sparked an interest in the concept of time. However, there are no references to Prigogine's work here, and that surprises me.
(Ps. I am not a physicist but a curious spectator).
Love your videos, but it seems you are sneaking in more and more ads. Personally I'd prefer them to be lumped together; I enjoy the continuity associated with commercials at the beginning or end.
I'm keen to hear the many-worlds quantum approach to life insurance - steal money from your counterpart in another universe who got super-rich.
Yeah this is jarring
Physicists are just really, really confused about entropy, and probably need to let go of all of the old explanations. The simple way to describe it is just the number of free elements in a system. For example, the number of books on a shelf. You can arrange them in whatever order you want - alphabetical by author's last name, alphabetical by title, in order of publication date, by size, by color, by Dewey decimal number, by how much you like them, or by how close they were to the shelf when you decided to put them on the shelf - and no order would have more or less entropy than any other order, even if you ordered them "randomly" by using a lottery. It's just the number of individual things that have the freedom to move around the system. There is no sense of the term "disorder" in entropy. Only many versus few possible ways to arrange the things.
This means that an adult human has more entropy than a fetal human, simply because it has more atoms in it.
This also means that our universe is increasing in the number of particles over time. Which might seem "wrong", since we presumably can't increase the matter-energy overall. But what can happen is fractal expansion, as the smallest building blocks of reality keep dividing, and thus increasing the number of possible ways that reality can be rearranged.
So, there is no "heat death" but only more and more possible states of the universe, infinitely. Like Pascal's triangle describes. And living things are always at the peak of entropy, because we continually add more stuff to ourselves as our family tree of genes and memes expands outward in space and time.
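A small sketch of the combinatorics this comment gestures at, assuming a toy model of N two-state elements (my illustration only, not a claim about cosmology or heat death):

```python
# With N elements that can each be in one of two states, row N of Pascal's
# triangle counts the configurations with k elements in the "on" state, and
# the row sums to 2**N, so every extra element doubles the number of possible
# arrangements.

from math import comb

def pascal_row(n: int) -> list[int]:
    """Row n of Pascal's triangle: C(n, 0) ... C(n, n)."""
    return [comb(n, k) for k in range(n + 1)]

for n in (4, 5, 10):
    row = pascal_row(n)
    print(f"N={n:>2}: row={row}, total states={sum(row)} (= 2**{n})")
```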
Now THIS is a podcast!
Much appreciated. Thank you.
thank you so much for doing these podcasts
""One should be drunken with wine, poetry or anything else to opose the oppressiveness of time.""
Charles Boudelaire once said. Well, I used to be like that
.. and got nuts altogether.
Some artists even lost their live over their fixations and drug related to issues. My advice is to get your fixation up to the point of selfelimination and get rid of it just for a while.
Get down to serious business or get gumbling or make another beloved partner your spouse for life ha ha ha
No what I am saying is that your program is useful for people like me: incurable humanists idealists with sientific drive.
Well I prefer my passionate fixation blessesd with romantic hues and angelic voices anyway.
Thank you Sir for the instructive and invigorating discussion
Aloha !
I would have loved to know you were coming to Costa Rica. I feel I missed a once-in-a-lifetime chance of hearing one of your talks live : /
Just getting through Something Deeply Hidden.....sold me on the any world theory😂
Thanks for the podcasts✌️
You surely mean the Many Worlds theory. Amazing book nonetheless.
One of the greatest scientists and speakers nowadays.
Hmm, yes... the way in which the brain is highly connected spatially with itself... I think spatial density is a part of the equation. Technology has also increased information transmission beyond what biology can do.
Sir, bring your broadcast into video slides. Respect from Pakistan.
I"d like to hear you chat with Rudy Rucker.
I don't see how anything would be possible in only two dimensions. Light could not exist. Nothing could exist, as far as I can imagine.
This is great. What stood out to me was the part where she mentions "borrowing entropy from another universe". I actually believe this is possible and I'm studying this, but I wouldn't call it borrowing. It's more of a mixing, but relativity plays a role in this as well.
RJ
I can't believe that the only planet that has animals on it just happens to have evolved us humans to ask these questions! There have got to be other planets with animals on them without any human-like creatures.
This is awesome.
Wow - evolution can produce complexity because entropy is on a curve from orderly back to orderly. Although the timescales seem like they don't match.
Kate sounds like a Doctor Ian Malcolm type
I was just getting into it when entropy suddenly became obtaining life insurance. Killed it.
Agreed.
Here before the inevitable Evolution denial crap.
The great extinctions do not necessarily mean that the complexity of life on Earth at these times was diminishing -- life looks like an irreversible and directional process of ever-growing complexity in the biosphere as a global living system.
thanks for your interesting conversation)
bro
For real!? But I don’t get out of work for another 6 hours!
Come on my brotha time for video
20:30 .... HALLOWEEN IS ON THE WAY
First
great mansplaining!