You know what's really cool? The second chromosome in our body has been genetically spliced, which cannot happen naturally, so it is fair to conclude that either we have lost technology that allowed genetic manipulation or we were made by some other form of life, AKA aliens... one of these is the truth, and it is a logical 50/50 split with a 100% chance that one of these is the truth...
How is everything not equally complex? Everything in existence came from the Big Bang, which is a singularity state. Anything that exists is equally complex as anything else... all existence is God here; this is a god-eat-God world... face the facts...
What people fail to realize is that everything came from the Big Bang singularity... if anything exists anywhere in this universe, it exists everywhere, for everything is from the singularity.
Wolfram has an entire book about the science of complexity. As a PhD in computer science, I think this has been my favorite video of yours in a long while.
@@TheFutureIsEloi It's not yet scientific. It doesn't make any testable predictions, nor has he worked out how it explains the existing measurements. It's a very long rant about complexity that's unassociated with physics, and his physics thing is (so far) unassociated with science, until he manages to work it out far enough to make predictions.
@@darrennew8211 The "science of complexity" has the same implied "structure" as the "science of evolution" has. By definition, the "science of evolution" does not make predictions, because prediction would imply prior knowledge of changes in the environment in which the evolution/complexity emerges. The "science of evolution" has mostly explanatory properties rather than predictive properties. The same should be true for any "science of complexity".
@@reasonerenlightened2456 Evolution certainly makes predictions. For example, if you count the chromosomes in an ape vs a human, humans have one fewer pair. The science of evolution predicts that you'd find telomeres in the middle of one human chromosome, which you do. Evolution predicts that sea creatures that swim quickly will tend to be streamlined. Evolution predicts there are *not* giant jumps in the fossil record, that we don't find rabbit bones in the bellies of dinosaur fossils, etc. Hell, evolution predicted the existence of DNA well before DNA was discovered: Darwin figured evolution wouldn't work if traits could be diluted, so inheritance had to be some sort of digital/discrete mechanism, which Mendel later showed was the case. That said, yes, evolution isn't going to predict the way in which creatures evolve in the future, because that is too complex, as you say. It's definitely less predictive than fundamental physics stuff, for sure. I'm not sure what Wolfram's "science of complexity" was trying to predict. I read the thing, but I don't remember it making too many predictions about the structure of measurements.
Two quotes to mull over, re complexity: 1) On his death bed, Heisenberg is reported to have said, "When I meet God, I am going to ask him two questions: Why relativity? And why turbulence? I really believe he will have an answer for the first." 2) "If the universe were simple enough for us to understand, we would be too simple-minded to understand it." [source unknown]
I think Kolmogorov complexity didn't get a fair shake. Yes, the standard model describes rocks, clocks and babies equally, but the *length* of this description is not the same, because there are more particles arranged spatially and temporally in a greater number of ways, thus requiring a longer program and leading to higher Kolmogorov complexity.
Agreed! Moreover, the 'basic' description may well not be the _shortest_: if you consider a system consisting of an iron sphere floating through an infinite vacuum, we can model all state variables using a much simpler model (treating the sphere as a point object with some fixed velocity) than the fundamental physics of all constituent particles. That complexity would also be more or less constant across a large range of sizes of the sphere, at least until internal repulsive forces begin to fail, even though the base system would get more and more difficult to compute using the naive method described in the video. Both of those facts (that there exists a simpler model than raw particle physics, and that the model is to an extent size-invariant) are also true of a clock, but the simplest model for a clock would still have to be more complex than the simplest model for an iron sphere.
If we accept Kolmogorov complexity, we will need to take seriously Dembski’s schema regarding Complex Specified Information, and the “conservation of information”. Also, Stephen Wolfram’s “A New Kind of Science” is relevant.
But isn't Kolmogorov complexity really a measure of "disorder" rather than complexity? A "completely random" sequence of elements (generated by white noise) would take a longer program to recreate than a "complex" sequence (which necessarily has some kind of correlations that allow us to compress it).
@@adolfobahamondealvarez5739 It certainly is a measure of disorder. A uniform sequence of a repeating constant value would have the lowest Kolmogorov complexity and a totally random sequence would have the highest, with Shakespeare somewhere in between. You're totally right that merely higher Kolmogorov complexity does not mean higher actual complexity. However, it is still, in my opinion, extremely useful because it mathematically bounds actual complexity. We still don't know what it is, but we can be sure it must be more than x and less than y. Putting bounds is useful.
@@yakovdan Quite possibly I don't properly understand Kolmogorov complexity, but producing random noise with an algorithm doesn't seem long or hard if you are satisfied with a pseudorandom solution, and seems impossible if you want a truly random one. So I'm still not convinced that it isn't a good measure of complexity in most practical cases.
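To make the thread above concrete: a minimal sketch using compressed size as a computable stand-in for Kolmogorov complexity, which is itself uncomputable. The compressor, sample strings, and lengths are arbitrary choices for illustration.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    # Length of the compressed stream: a crude, computable upper bound
    # on the shortest description of the data.
    return len(zlib.compress(data, 9))

n = 10_000
uniform = b"a" * n                                          # constant sequence
noise = bytes(random.getrandbits(8) for _ in range(n))      # white noise
text = (b"To be, or not to be, that is the question. " * 250)[:n]

for name, data in [("uniform", uniform), ("text", text), ("noise", noise)]:
    print(f"{name:8s} {compressed_size(data):6d} bytes")
# Typical ordering: uniform << text << noise, matching the claim above
# that compressibility tracks disorder rather than "interestingness".
```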
During my time as a PhD student, I also worked intensively with complex systems. The most important insight I gained was that it is practically irrelevant how many individual components a system consists of; the interactions and dependencies between them are what is decisive for its complexity. It is also very important how many scales (temporal and spatial) the system comprises: if the micro scale has a strong influence on the macro scale, this also makes the system significantly more complex. Now, as a software engineer, I have realized that these fundamental properties (many dependencies and multiscality) also make software and code (= the system) complex and therefore difficult to understand and develop.
Indeed, for the purposes of control and prediction, how interconnected (interdependent) the individual components of a system are matters more than the number of components. As far as the role of the software engineer is concerned, it is long overdue for code to be written that treats the human as a consultant to the code that writes code, instead of the human being the code writer. Even better, the human should become like a customer to the code that writes the code the human needs: you just tell it what you need, as you would another human, and it gives you the code solution that will work for you. Do not confuse what I am saying with the stupidity of the ChatGPT technology that has recently become popular. ChatGPT and its likes (for images, video, sound, code, control, etc.) merely have the veneer of intelligence.
As a product designer who designs systems to control mass production lines, I like the C4 model of software architecture. I use it to map and understand user requirements and flows.
@@reasonerenlightened2456 Well, it's still garbage in, garbage out. You want a human to tell the code what it wants, and the code will take that as input and convert it into code that does what the human wants. However, any kind of function implies both implicit and explicit properties/aspects. It's only when beginning to use and test the first code iterations that it becomes clear what was implicit that needs to be made explicit. We call those implicit things that haven't yet been handled 'bugs', because they create the possibility of an unwanted output without the means of handling a correct output having been made explicit.
As an academic scholar in complexity, I really enjoyed the video. Defining complexity is always hard, and whenever we go to a wide enough conference it is clear that everyone intends the word in a different way. More than once I tried to write projects with other researchers, only to find out later that, although we both refer to our field as "complexity", what we do is completely unrelated.

However, to put in my 2 cents, I think that trying to define the complexity of "things" is a mistake. We can define the complexity of their aggregate "processes", "dynamics", "behaviors" - and for these, Kolmogorov complexity works quite well. This is not strange for a physicist, by the way. When we define the entropy of a mixed deck of cards, we are not looking at the physical structure of the cards: we are looking only at their mixing properties, abstracting completely from their physical properties. We say that the card set has 52! possible states, ignoring the spins of their atoms and the molecular configuration of the materials composing them. A rock is simple if we consider its autonomous motion - it just stays there - while it could be complex when looking at its internal function; the latter is irrelevant if I am looking at its autonomous motion, just as the cards' material is irrelevant when looking at the probability of getting a flush. Computing the complexity of "a rock" is as useless as computing the entropy of a deck of cards considering the materials of the cards.

Notice that "assembly complexity" is, again, measuring the complexity of the process of building the item; it does not have any relation - in principle - to the object itself. While it is true that objects we identify as "complex" tend to have complex processes to build them, there is no reason to believe this is a general characteristic.
As a software engineer, complexity is the bane of my existence. I was looking for a way to measure software complexity; now I have some threads to follow thanks to your video. Edit: tbh I hadn't really looked, but it seems there are some methods. Edit: Computational complexity isn't what I'm talking about. Computational complexity is about the amount of resources needed to run a computation; that's not at all what I was talking about, other than containing the same word, "complexity". Don't be half-knowledge, ego-stroking assholes.
@@threeMetreJim To be honest, I haven't looked really deeply into it, but it looks like there are some methods already. Since there's no perfect way to measure complexity in physics, there's no perfect way to measure the complexity of software, because everything emerges from physics.
How about something along the lines of a product-sum ratio of the number of functions defined and the number of calls to those functions, multiplied by the number of variables parameterized by those functions? Though I suspect the ratio between the number of value and reference parameters will need to be considered as well. User code only, as I assume you don't care how complex the generated machine code is, just the user code base you need to maintain. The first bit is an attempt to measure the size of the code base against how much it interacts with itself; the second bit is an attempt to measure how much data is being moved around. More considerations might be needed to compare dissimilar code bases, perhaps incorporating the average number of lines of code per function in an attempt to get a ballpark measure of the code's relationship to spaghetti... /two cents Edit: If this ends up helping, I'd be happy with 10% of the prize money and an acknowledgement in your Nobel prize winning paper. XD
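A toy sketch of the kind of metric proposed above, using Python's ast module. The specific combination (calls times parameters over functions) is an invented reading of the product-sum idea, not a standard measure.

```python
import ast

def complexity_score(source: str) -> float:
    """Toy code-complexity score: (calls * parameters) / functions.

    The weighting here is invented for illustration; it is one possible
    reading of the product-sum idea above, not an established metric.
    """
    tree = ast.parse(source)
    n_funcs = sum(isinstance(n, ast.FunctionDef) for n in ast.walk(tree))
    n_calls = sum(isinstance(n, ast.Call) for n in ast.walk(tree))
    n_params = sum(len(n.args.args) for n in ast.walk(tree)
                   if isinstance(n, ast.FunctionDef))
    if n_funcs == 0:
        return 0.0
    return n_calls * max(n_params, 1) / n_funcs

# One function, two calls, two parameters -> score 4.0
print(complexity_score("def f(a, b):\n    return g(a) + g(b)\n"))
```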
You are performing a valuable service with your work here. For me, for others who see and think about your presentations, and for those we impact in some way as a result of what we’ve learned. Thank you.
My favorite book on complexity is the novel by Robert Pirsig, "Lila". He outlines a hierarchy of complexity based on emergent behavioral rules in a single taxonomy, starting with physics at the bottom and aggregating up through chemistry, biology, ... sociology. It is a brilliant work by a brilliant man and accessible to all.
Pirsig is a wonder. He's probably the guy who came the closest to realising enlightenment, the closest to walking through the door, without actually walking through it. Even though logically he saw that Quality was the Tao, the source of all individual things, he still could not 'feel' it, feel himself as just a manifestation of Quality. So instead of feeling happy and free, realising there was no individual named Robert Pirsig, nothing for him to do, nowhere he was supposed to go, no burden of the future or even the present moment on his shoulders, and going about his day like many mystics do, he only realised it logically, and reasoned that there was no reason to get up, so he urinated on the floor; no reason to stop the cigarettes burning his fingers; no reason to do anything. The logical part of his brain realised the truth, but could not let go and let the emotional part of his brain feel it in the experiential dimension. This 'letting go' is crucial for the process. Without knowing, without understanding, you let go and trust the Tao, the universe, whatever you want to call it, to do you, rather than you doing it. Pirsig seems to have never done this. Chris' death made it worse. "Where, after all, did he go?", he asked over and over. Lila is an agonising attempt to answer these questions and analyse Quality without ever having actually felt himself to be a manifestation of it, without ever having felt the 💙love💙 that is another name some people give to Quality. He never let go, and Lila is strained at best, torturous at worst.
Consciousness is also more prominent at the edge of uncertainty. Predictable or repeated events lower brain activity, as we have measured using fMRI (a phenomenon called "repetition suppression"). We can also deduce this subjectively: e.g., we become sleepy (and bored) when watching very predictable events that have no surprises.
Predictability can be a useful measure of complexity for dynamic systems with multiple interdependencies and variables. Global and local weather trends, changes in wind, temperature and relative humidity are of significant concern and yet their predictability decreases rapidly over relatively little time. Modeling possibilities is one thing we are good at, but knowing which will be realised is still beyond our abilities.
As a budding biologist with a great interest in systems biology, I really appreciate this video. Since childhood I've always loved tinkering with objects to find out how things work, and that's how I ended up figuring out how the most complex things in the universe work.
Really, you figured out how the most complex things in the universe work? How much did it cost? Tell me how to build a gene drive. I'm passively putting together the math model. Tell me what you know. Bruh.
Do you use NetLogo to model your systems biology? You should. It is easy: you define your environment, you define your agents, you define how the agents interact with each other and the environment, and how those interactions change the agents and the environment. Then you sit back and observe emerging behaviours. The only limit is the accuracy of the assumptions your model makes.
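For readers without NetLogo, the same define-agents, define-interactions, watch-what-emerges loop can be sketched in a few lines of Python. This majority-rule toy model is invented for illustration, not a NetLogo translation; the point is that a purely local rule produces large-scale clusters with no global coordination.

```python
import random

SIZE, STEPS = 40, 20_000
# Environment: a grid of cells, each holding an agent with a binary state.
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

def neighbors(x, y):
    # Interaction rule: each agent sees its 8 surrounding cells (torus wrap).
    return [grid[(y + dy) % SIZE][(x + dx) % SIZE]
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

for _ in range(STEPS):
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    ns = neighbors(x, y)
    # Agents adopt the local majority state; ties leave the state unchanged.
    if sum(ns) > 4:
        grid[y][x] = 1
    elif sum(ns) < 4:
        grid[y][x] = 0

print(sum(map(sum, grid)), "cells in state 1 after", STEPS, "updates")
```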
Complexity is dual to simplicity. Randomness (entropy) is dual to order (syntropy). Points are dual to lines -- the principle of duality in geometry. Points (nodes, vertices) are dual to lines (links, edges) -- graphs and networks are dual. Convergence (syntropy) is dual to divergence (entropy). Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics! Syntax is dual to semantics -- languages. If mathematics is a language then mathematics is dual. Real is dual to imaginary -- complex numbers are dual (photons. light or pure energy). The integers are self dual as they are their own conjugates. Subgroups are dual to subfields -- the Galois correspondence. Sine is dual to cosine or dual sine -- the word co means mutual and implies duality. "Always two there are" -- Yoda.
Complexity is the minimal number of laws (mathematical functions) describing all the possible connections between the elements of a system within a specified space-time dimension. It gets its real meaning as a comparison, as a relative value between two or more systems, although it is not impossible to quantify it as an absolute value.

Let's take the rock-baby example. The baby is more complex if we compare it to a similar-sized rock over a small timespan, say a day, and investigate both from the molecular scale up to the whole object. The baby is more complex because the number of laws describing the connections of its elements at the same macroscopic scale is much higher: not only do its organs and molecules at a similar microscopic level need more laws to describe them, but in 24 hours a rock doesn't really change, so no extra laws are needed to describe its changes; the time-generated connections (evolution) and the laws describing them are missing for the rock. However, if we take millions of years of the evolution of that rock and compare them to the same millions of years, in which the baby state of a human is negligible (99.9999...% of the time it doesn't even exist), the rock may be more complex. So geologically it is complex, because over the longer timespan an "evolution" of that rock must be taken into consideration too. The same is true if we change the spatial dimensions in which we compare the systems; no surprise that the geology of a whole planet is pretty complex. Describing all the possible connections of the elements of a DNA molecule of a baby also needs many more mathematical functions than a silicate molecule (just to stay with the baby-rock example; at the molecular level it is not important where those example molecules come from).

We can also define a span of space in which to compare objects, or to measure complexity, and a timespan too; so we can compare two objects from molecular size up to average human size over 100 years. It is also okay to compare different space-time spans of objects; in both cases we can say which one is more complex. To use the coffee example: the two separate liquids half mixed are much more complex, because you need to describe the process of mixing as time goes on, which is probably the biggest part of the mathematics to do, much more than describing the separate or the completely mixed liquids. This is the reason why the states in between, and systems which rapidly change with time, are so complex.

Now if we take the whole universe, from the smallest particles and over its complete "lifespan", there is nothing as complex as it, because all other systems are only a subset of its elements in space and/or time, and those can't be more complex. (Sometimes I used objects and systems as synonyms, but these thoughts were just born as I wrote them, inspired by Sabine's explanations. You probably got the point anyway if you took the time to read this.) Thank you Sabine for releasing the philosopher, the thinking person and the wannabe scientist in me.
Douglas Hofstadter in Gödel Escher Bach talks about “chunking” - if you can treat a collection of things (lines of code, molecules in a protein, etc.) as one thing, structurally or functionally, it’s chunkable. If you can’t, it’s complex. See how I just “chunked” the science of complexity ? Now to watch Sabine and see if I’m right…
Definitely our largest gap. Thanks for elevating this! I've been in the complex-systems space (mostly practice instead of research, but with lots of formal training) for ten years or so, and appreciate your robust summary!
I really liked this video; it gave me quite a lot to think about. It reminds me of a notion in topological dynamics known as a "dynamically maximal map" which, roughly speaking, is the most chaotic dynamical system (with respect to some family of dynamical systems). Often, dynamical systems which are not dynamically maximal can exhibit hybrid behaviors, which are complex but not "completely" chaotic.
This was a great talk about a very interesting topic which is far above my pay grade....I have a small library of books from 10yrs ago when this was a hot topic.....your comments hint at how the concepts have grown......my favorite issue is "self organization of complex/chaotic systems" which seems to me to be the edge of answering the question of how RNA/DNA got started......but now I must "just shut up"
When I "discovered" the Duffing equation and the van der Pol equation in my Upper Division Classical Mechanics course, it put the "WOW!" back into physics for me. Thanks, Professor Scott! The dynamical chasm between these two equations and the SHO is still astounding.
Once again, you succeeded at making a difficult question understandable without dumbing it down. This is why I always come back to this channel. Keep up the great work!
I'm writing a book concerning logic and math. I just want to thank you, Sabine, for your effort. I'm in the same boat as you: wanting the truth and striving to separate unfalsifiable stuff from falsifiable stuff. The problem with modern science can be reduced to a problem of differing languages. Insert: I won't drop technical details here. But here's my take: Sabine's on the right track. The replication crisis from soft to hard science exposed and falsified many theories, including their fields. This is actually beneficial. Why? Science is on its way to becoming a unitary epistemological framework. Someday it'll be a fair and perfect system like pure math. Right now, scientists are tribal. Fields are divided. Truth isn't polarized. It's not a bilateral thing. Truth is unitary.
RM3, non-binary logic, categorical logic --- validity is better than truth? Because mere truth is binary and not a real thing in the real world, there is always a third possibility that is Valid
I'm not in the same boat, but I'm looking across and trying to distinguish the possibilities of methodology and intriguing concepts that invariably make us ask more questions 😊
The truth is out there. We need to go into space. That must be the biggest goal of humanity. As in the era of the sailing ships exploring the world, we need to embark on the same journey in the ocean of space. We have hit a theoretical limit, we must now push the pragmatic possibilities. It's cyclical and we need to embrace the half of the cycle we are in now.
IN THE INTEREST OF FINDING THE THEORY OF EVERYTHING: SOME THINGS MODERN SCIENCE DOES NOT APPARENTLY KNOW. Consider the following:
a. Numbers: Modern science does not even know how numbers and certain mathematical constants exist for math to do what math does. (And nobody as of yet has been able to show me how numbers and certain mathematical constants can come from the Standard Model of particle physics.)
b. Space: Modern science does not even know what 'space' actually is, nor how it could actually warp and expand.
c. Time: Modern science does not even know what 'time' actually is, nor how it could actually warp and vary.
d. Gravity: Modern science does not even know what 'gravity' actually is, nor how gravity actually does what it appears to do. And for those who claim that 'gravity' is matter warping the fabric of spacetime, see 'b' and 'c' above.
e. Speed of light: 'Speed' is distance divided by time, distance being two points in space with space between them. But here again, modern science does not even know what space and time actually are that make up 'speed', and it also claims that space can warp and expand and time can warp and vary. So how could it truly know what the speed of light actually is, which it uses in many formulas? The speed of light should also warp, expand and vary depending upon what space and time it is in. And if the speed of light can warp, expand and vary in space and time, how then do far-away astronomical observations actually work, based as they are on light and the speed of light that could warp, expand and vary in actual reality?
f. Photons: A photon swirls with the 'e' and 'm' energy fields 90 degrees to each other. A photon is also considered massless. What keeps the 'e' and 'm' energy fields together across the vast universe? And why doesn't the momentum of the 'e' and 'm' energy fields, as they swirl about, fling them away from the central area of the photon? Electricity is electricity and magnetism is magnetism, varying possibly only in energy modality, energy density and energy frequency; why doesn't the 'e' and 'm' of other photons and of matter basically tear apart a photon going across the vast universe? Also, 'if' a photon actually redshifts, where does the redshifted energy go, and why does the photon redshift? (And for those who claim space expanding causes a photon to redshift, see 'b' above.) Why does radio 'em' (large 'em' waves) have low energy and gamma 'em' (small 'em' waves) have high energy? And for those who say E = hf, see also 'b' and 'c' above. (f = frequency, cycles per second. But modern science claims space can warp and expand and time can warp and vary. If 'space' warps and expands and/or 'time' warps and varies, what does that do to 'E'? And why doesn't 'E' keep space from expanding and time from varying?)
g. Energy: Modern science claims that energy cannot be created nor destroyed; it's one of the foundations of physics. Hence, energy is either truly a finite amount and eternally existent, or modern science is wrong. First law of thermodynamics: "Energy can neither be created nor destroyed." How exactly is 'energy' eternally existent?
h. Existence and non-existence, side by side, throughout all of eternity. How?
@joeboxter3635 'You are the field, and the knower of the field' - old Eastern proverb. Thing is, we already know, intuitively; however, our unique perspectives from our individual observer positions cause the 'truth' to fluctuate slightly between actors.
This brings me back a few years, to when, as a Ph.D. student, I investigated some laws of nature regarding the interaction between complex systems. It is true: we are still at the beginning of our understanding, but some progress is being made in this field... Still thankful to my professor for his passion in developing the theory of temporal complexity and complexity in general.
Thank you Sabine. Wonderful, as always! I am reminded of a graduate student who confronted J. Watson at Cold Spring Harbor once, in a seminar series. Watson insisted that everything will be explainable and understood in terms of atoms. The student raised his hand and asked: "Professor Watson, if I tell you that a house is made of bricks and mortar, can you tell me its architecture?" Watson was dumbfounded. "Who said this?" he inquired, semi-angrily. "Good old Aristotle," said the unfazed student. And that is the truth!
I really like Stephen Wolfram and Jonathan Gorard's insights on complexity arising from computational algorithms, and how finding pockets of computational reducibility within a system which is otherwise computationally irreducible gives rise to emergent properties.
A slab of marble and Michelangelo's sculpture, the "Pietà", are both made of the exact same material, but it would be a *gross mischaracterization of reality* to argue there is no difference in their overall complexity.
I was planning on doing my thesis in philosophy about complexity. But then mental illness struck and life took on a different path... Still, thanks for this video, even though it hurts to watch it, as it reminds me of all I've lost recently.
You didn’t lose anything, "potential" is a mental construct, and to believe that you lost something that was never yours is even more detrimental for your mental health. Once we understand that nothing is expected from us, life becomes much simpler and lighter... and it helps, or at least it's helping me... and it doesn't take away anything from your "chances".
I love your explanations! They are always food for thought and help me push my brain to keep in shape. As for measuring complexity, how about: "The harder it is to predict an outcome from a given input, the more complex the system is." The verge of chaos would still be maintained, as you could not predict anything in pure chaos; at least if it was really chaotic and not just a limit in our understanding.
Hard to predict for whom, though? The same problem can be solved in a different number of steps by different functions. If you have the perfect model, maybe predicting different things is equally difficult.
Unless we assume complexity is equal to the probability of an outcome after some amount of time, for which we maybe need to quantize time to get a finite number of steps. If you can predict the state of a system in 100 steps with 100% certainty, you get a complexity factor of 1? Sounds arbitrary, but hey.
This was an excellent episode. A related classic paper to read: P.W. Anderson's "More is Different." A single atom cannot have conductivity, but a collection of them can.
A relevant question is whether the property (conductivity) of the collection of atoms could in principle be predicted given complete knowledge of the physics of a single atom.
@@brothermine2292 You can't, because arrangement matters. Graphite and diamond have different conductivities despite being made from the same atoms. Defects matter as well. And temperature. Not to mention impurities. While you can in principle calculate for a given arrangement and conditions, this becomes impractical quickly as you scale up. Exploiting symmetry helps, but defects break symmetry.
@@michaelinners5421 : Yes, arrangement matters. But why couldn't one in principle predict the emergent property (conductivity) for any given assumption about the arrangement (positions, momenta, etc) of the atoms?
@@brothermine2292 You will be able to predict IF you have a theory and understanding of the electronic structure of multiple atoms. This means valence bond theory or molecular orbital theory. So, just the "complete knowledge of the physics of a single atom" will not be enough. You need to understand the physics of what states the collection of atoms can have. Does this make sense?
For Kolmogorov complexity you have to include your initial conditions in the length of the program. Otherwise, everything is just the complexity of a Turing machine.
Still, her point is coherent in this context. Anything with the same number of particles has the same number of quantum states, and that also implies that it has the same order of magnitude of entropy insofar as an algorithm is concerned. A baby is just atoms, exactly the same as a rock. Simulating a same-weight rock and a same-weight baby is exactly as computationally complex, yet this doesn't fit our idea of complexity, so it can't be an accurate measure of complexity from our perspective.
@@Zeuskabob1 I completely disagree. It ignores the effects of the processes going on inside the brain as well as all those going on inside the body. Her statement that "If you went by Kolmogorov complexity everything would be equally complex" is just completely wrong. She just said in effect that all programs are the same length. I cannot imagine how Sabine made such a huge error. That's just factually wrong: very much a rarity in her videos, but this was one. (She did immediately follow by saying that "That makes no sense.", but then why did she choose to make the statement?)
Ya, it didn't really make sense to me either, because Kolmogorov complexity requires a Turing machine or something like it. This means you would somehow have to reduce the dynamics of the baby, rock, etc. to some type of Turing machine and prove lower bounds on program size. Yes, the dynamics are all describable in QFT, but the dynamics would be (and should be) extremely different even in the QFT formalism.
This was a super interesting programme. We definitely seem to require new maths to progress in certain areas of research, or a different approach. If nothing else, very fun to think about.
Simple and simplistic are two very different things. So are complex and complicated. As Einstein observed, if we cannot explain it to a six-year-old child, then we don't understand it ourselves. But that is the beauty of science: it is complicated until it gets complex or simple, which is the same. Thought-provoking video. Thanks a lot.
I really think the connected dots are a more complex system. As a software developer, having my functions used and calling each other is way more complex than having them as pointers that never get called. We were discussing with my colleagues how complexity grows, and we concluded it has to do with combinatorics and combinatorial explosion. But there are cases where you can have linear growth, and exponential growth as well, depending on how the connections are made. Also, we came to the conclusion that intelligence can be described as an aligned complexity.
The mere existence of connections adds a certain complexity, but the dots remain identical, which reduces complexity from the state where only some dots are connected only to some other dots. Maximum complexity lies somewhere between the unconnected and fully-connected states, as in neural networks (and our brains.)
@@jpdemer5 I think of aligned complexity as what you described as the in-between of the unconnected and fully connected states, which I think describes intelligence. Looking at it this way, you can consider systems where the path of least resistance is chosen as intelligent (like light, the movement of charged particles, water flow, etc.), but in general they are complex systems, all connected and offering other paths that are not taken. I think of them as functions that can be called if a certain if-else statement is fulfilled.
If we are only talking about the complexity of description, then no connections vs the full graph is exactly the same: any two functions are either always connected or always disconnected. You know this one bit of information and you can reconstruct the graph just knowing its size. Now if you want to consider the complexity of operation, a dynamic one, you are actually talking about possible traces, paths in this graph, which are a different thing altogether.
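A minimal sketch of that description-length point, under the assumption that a graph's complexity is the number of bits needed to pick out its edge set: the empty and complete graphs are equally trivial, and the cost peaks near the half-connected state, as the earlier comment about neural networks suggested.

```python
from math import comb, log2

def edge_set_bits(n: int, m: int) -> float:
    # Bits needed to specify which m of the n*(n-1)/2 possible edges
    # are present, assuming the receiver already knows n and m.
    return log2(comb(n * (n - 1) // 2, m))

n = 10
M = n * (n - 1) // 2  # 45 possible edges
for m in (0, 1, M // 2, M - 1, M):
    print(f"{m:2d} edges -> {edge_set_bits(n, m):6.1f} bits")
# Empty and complete graphs both cost 0 bits; the description length
# peaks near the half-connected state.
```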
14:42 Consider an alternative point of view: a pencil would not exist unless manufactured; however, a pencil isn't manufactured for its own sake. It exists and is bound to the creation of self-expression or data retention, and who is to say that this isn't an integral part of the object? On the other hand, we humans are made just for the sake of making us, if not by an overly enthusiastic but frugal father or a mother with dubious math skills. The vid was interesting and thought-provoking, thank you!
Man, I am so glad to see this video from Dr. Sabine. I have been obsessed with Dr. Wolfram's work on computational thermodynamics; ever since ChatGPT came out, he has been on a roll.
I've had trouble understanding graph theory, but as soon as I thought of it in terms of lines and dots it became so much clearer. Score another win for Sabine reducing gobbledegook! 🤪
Personally, I always thought of complexity as the number of interactions between different systems (which themselves could consist of different interacting systems) that will have a fixed result. The total of all those interacting systems could make something more complex than others.
Exactly my thought. For me the complexity of a system depends on the number of significant interactions (at that emergence level) you need to account for in order to predict the exact outcome.
So how does your framing differentiate between two colliding galaxies and two people talking with each other, for example? And how is a "significant interaction" defined? Seems to me like it's just pushing the definition of "complex" onto the vague term "significant"...
I agree, but I think there's a psychological component. How your brain categorizes things contributes to whether or not you will view things as different systems. You can view many body systems as the same system, and it's largely arbitrary that we separate them. How do you have an immune system without a digestive system, without a skeletal system, without a cardiovascular system, etc., etc.?
@@VelereonicsImagination can definitely be emergent even when bound by karma/conditioning... is there emergence via an omniscient consciousness, or is the universe a block theory coupled with determinism and randomness. We know there is supposedly no free will, so consciousness is just an emergent property derived from determinism and randomness? This is what the mathematics shows, but no one really knows where the energy/information is coming from. Observation is a physical act and can alter non-local reality, yet has no unit of measurement within the quantum wave function. So it's a simulation created by something, but not GOD? GOD doesn't have to be a humanoid, but when people talk about simulation theory and an architect while refuting God, it really doesn't make sense. Whatever the architect is or happens to be can be defined as God even if it's something we can't fully comprehend. God= the architect of our simulation that we don't fully understand. That's pretty much the same definition from thousands of years ago. If we aren't in a simulation, then where is the energy/information coming from? It just existed forever... how can we ever fully define consciousness if we don't know its source?
@@Velereonics That can be applied to the totality of the universe. It's all one thing. There is no separation, and time is just an illusion: measurement from A to B.
Complexity breeds higher complexity, with others and with the environment, inevitably creating challenges and conflicts, ensuring a motive to change, adapt/evolve, i.e. fight. That is the beginning of the move toward intelligence in life, as easier, 'simpler' developments have been exhausted.
That quiz was so cool! I wish more creators put quizzes after their videos, even for videos that aren't strictly educational. It gives a content creator a good measure of how well they are expressing their given points in each video, and it gives the viewer a measure of how well they are retaining it. So many times I've seen people talking about topics I'm interested in and saying things completely contradicting what I thought about the topic- even though we likely consumed a lot of the same content! It always made me wonder how one could know that they were actually learning when browsing content. I hope that quizzes like this catch on.
It all depends on how simplicity and complexity are defined in any given system, and the more configurations that a system has the more ways there are to describe it.
@@bodeeangus9957 *"It all depends on how simplicity and complexity are defined in any given system, and the more configurations that a system has the more ways there are to describe it."* ... The number of particles that make up the living, breathing, self-aware lifeform called "you" can be counted. Another structure can have the exact same number of particles, but be completely void of life and self-awareness. *Conclusion:* An equal mathematical assessment of two distinct objects does not dictate that they are equal in complexity. Mathematics, material substrates, and particle configurations are not the arbiters of "complexity."
@@brothermine2292 *"You neglect that since entropy grows, complexity will eventually become impossible."* ... 13.8 billion years ago there was nothing but a totally benign point of singularity. That is as _least-complex_ as one can get! Today we have eight billion living, breathing, self-aware lifeforms running around exchanging information in a RUclips comment thread. If you charted the evolution of complexity like we do with the stock market from its beginning to now, the "complexity line" would be consistently moving upward. ... That is the way "reality" is presented to us.
@@0-by-1_Publishing_LLC Then what exactly is the parameter that we can use to measure complexity? Atoms make up everything, but some things that are made of atoms can move and others cannot, but that still doesn't answer the question of how complexity is defined. The question of what complexity actually means is likely to be sensitive to the context of whatever system is being studied, because while we can call a life form complex, we can say the same thing about a star for instance. Is the sun more complex than a human? How do you quantify something like that?
I've been interested in this topic for years. The best I've come up with is that "complexity" seems to be the opposite of "predictability", and in that sense the Kolmogorov complexity I feel is headed in the right direction. If a system can be modelled by only a few variables it is less complex. A pencil can be described by the materials that make it, its dimensions, and maybe some attributes like colour (i.e. its absorption spectrum). A person requires far more variables to adequately predict its behaviour, and is therefore more complex. The formalisation of this I think already exists to some extent in the realm of statistics. While a model can always match the data best with more variables, the "best" model - the one that generalises to predict future observations - has an optimal number of variables. Perhaps this number, or something related to it, can describe its complexity.
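A minimal sketch of that model-selection idea on made-up data: fit polynomials of increasing degree and let the Akaike information criterion pick the degree expected to generalize best. The data, noise level, and choice of AIC over other criteria are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
# True model is quadratic; the noise hides it among richer candidates.
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.1, x.size)

def aic(degree: int) -> float:
    # Goodness of fit plus a penalty of 2 per extra parameter.
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    k = degree + 1
    return x.size * np.log(np.mean(resid**2)) + 2 * k

best = min(range(1, 10), key=aic)
print("degree chosen by AIC:", best)  # typically 2, matching the true model
```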
As Assembly Theory (AT) is an important part of this video (and perceived as "novel" and promising): it has been formally demonstrated, in published papers and blog posts, that AT is equivalent to existing work (Shannon entropy and LZ compression grammar) without proper citation, and is a weaker version of these established concepts. This raises significant concerns about AT's originality and scientific merit, and also highlights the importance of not overselling scientific work, especially if it is not original and does not explain what its authors claim it explains. Publications include a paper in npj Systems Biology and Applications, "On the salient limitations of the methods of assembly theory and their classification of molecular biosignatures"; "Assembly Theory is a weak version of algorithmic complexity based on LZ compression that does not explain or quantify selection or evolution", published on the arXiv; and two Medium posts by Dr. Hector Zenil, broadly explaining why Assembly Theory and its marketing campaign are seriously damaging the image of science as a whole.
Huh? That odd feeling when I can suddenly follow 100% of the topic... without ever expecting to! My backgrounds are in social silences, statistics and 3D modeling, and somehow this just connected all the topics I fret about to physics! :D I have to say I am happy to hear that people are working on the math for this. We need those tools, badly; so many good ideas are currently not going anywhere because the math to test them just isn't there! This video gives me hope :) Thank you Sabine! :)
❤ I like your background in social silence 😂. It made me think. People in general make too much noise. We all should listen more to each other, especially listen to Sabine 🧐
One way to define complexity algorithmically is maybe how big a program would have to be to generate a full description (i.e. arrangement of particles) of an object that would be considered a stone, or a clockwork/human, etc. For a stone it would just need to generate a regular arrangement of molecules with some random irregularities, in any shape, and the output would still be a stone. For a human it would need to be much more specific. (To get a full description of one specific stone, it would of course be just as complex as one specific human.) This would be similar to procedurally generated worlds in video games, or maybe to (lossy) data compression algorithms.
Yeah, this is a mistake in how Kolmogorov complexity is explained in this video. You cannot separate out some input (i.e. arrangement of particles) from the program. A program cannot take input when you want to describe its Kolmogorov complexity, so the input needs to be included/encoded in the program itself, affecting the program size and hence the Kolmogorov complexity.
There is also no distinction made as to whether "complexity of a stone" means: - the complexity of one specific stone that you take as an example, or - the complexity of the characteristics that an object needs to have so that it falls under the category "stone". In the latter case, the Kolmogorov complexity would depend on how broadly you define the type of object you want. In the former case, it would be about the same for any real object, maybe depending on its size or number of particles. The algorithmic complexity in the video seems to refer to how complex it is to simulate the physical evolution of the object, but not how the object is made up in the first place.
@@tmlen845 I think that a major issue in the notion of complexity @Sabine is speaking about is that it is unclear whether one needs to distinguish the complexity of a system (that is, any mathematical or physical object) from the complexity of its properties, one such property being its predictability. And I think that we can roughly agree that a system is "predictable" if you can "infer" its future states faster than the speed of time (that is, you want to know today what happens tomorrow). The thing is that the complexity of a dynamical system is not really related to the complexity of any static state of that system. Take the Syracuse (Collatz) sequence, "u(n+1) = u(n)/2 if u(n) is even, else 3u(n)+1": quite a simple system, yet its dynamics are quite complex. Another example is the "full network" disagreement; there is actually no disagreement. I think everyone can agree that this network is "simple" in the sense that it is easy to describe. The fact that it is more "complex" than the empty network is not a matter of the complexity of the network (seen as a graph) but rather a matter of the complexity of the underlying system, or worse, of the complexity of the dynamics of the underlying system.
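The Syracuse/Collatz example is easy to make concrete: a few lines show a trivially simple rule whose trajectory lengths jump around erratically.

```python
def collatz_steps(n: int) -> int:
    # One simple rule, wildly irregular behavior.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print([collatz_steps(n) for n in range(1, 16)])
# [0, 1, 7, 2, 5, 8, 16, 3, 19, 6, 14, 9, 9, 17, 17]
# Neighbouring starting points take very different numbers of steps.
```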
Yet I also agree that we do not have an airtight definition of complexity, good thing for me considering I'm finishing my PhD on a closely related topic :-)
I think the number of emergent properties is probably the best measure of complexity; it seems to be the lowest common denominator of all the other measures stated in the video. For example, systems with more emergent properties have a higher likelihood of those properties interacting with one another, making them harder to predict.
The way forward, then, is to realize that we don't need to agree on what is being measured! What we need is multiple measurements of utmost precision and accuracy. Then we can produce theories of thermodynamic complexity, chemical complexity, biological complexity, social complexity, computational complexity, theoretical complexity, and any kind of complexity produced by a system of well defined measurements and/or theories.
@@ywtcc Yeah, we don't need to, but when we don't share, or share too much, people get confused and turn to magic and religion. It's present in the patterns of connections: like how matter is fueled by energy, and that energy allows a flux or change, so the big can change to small and the small to big, and we exist in between. Multiple variables of different shapes and sizes come together, quantum particles and macro things like planets or giant bodies of water or something in space, waiting to be connected to more, so that the differences of 6 plus 7 can blow up and create everything in between; and when the combos in between start gaining layers and more variables, they can see what is not. Look at how we see chemicals and those variables, and how the structures and bonds share energy through those connections. Plants, cells, chemicals, made into all this. The Big Bang set in motion the very variables, and now we reflect and connect. Our brain grows different cells in different patterns and edits them through the senses and diet and such, so that we can survive and grow. We take in patterns connected to a whole, and we don't see everything. That's apparent when dealing with radiation and X-rays and such: we lack the biological tools to sense them, but we created tools and connected the inputs to a different form called language, and understood them through the details of connections and recognizing the reflections. Sounds like magic, but it's like connect-the-dots on steroids.
@@ywtcc I mean, think about it: we became complex over time, and what is time but a measurement of change in relation? The sun and earth and moon are just the same as the people and the cycles we shift through. We are the same. As above, so below; it's more than one saying because it's everywhere. A paradox is like both sides of the same coin, each making the other, a connected and intertwined relation in which they share the energy. We try to leave something behind that can contain the complexity of the pattern so that we can persist and grow; we draw and do art because it is another pattern we reflect, because we noticed a deeper complexity just a little to the side of others. Conscious growth and connection and thought is what came forth, but think about this: how many people are conscious when they just live their lives? They bounce around the environment because they live in it and get changed by it, but how many stop and think of a different pattern from somewhere else that makes them see differently? How many perform an action that isn't what the environment grew into them? Do people ever sit back and wonder why they see through their eyes? I think we do, but we don't use our conscious will to do what we want; instead we've been trained by the media and such to only believe this and not trust each other. I mean, our society somehow hates each other when we should be listening and trying to think of things other than what reality grew into us. Be conscious of thought and grow a new thought beyond what you currently are. Idk how else to put it. We give up our why because we don't see it, but our why is everywhere and we just haven't learned enough to see it. There is no why; there is only to live.
This boils down to the first-term psychology question about what intelligence is. The correct answer is: it's the thing you measure with an intelligence test. That's why definitions are so hard when you enter the terrain of complexity: complexity is too much for our understanding, so we have to break it down into models (thus reducing complexity) to get a grasp on it. And this very step is where different people (and scientific disciplines) extract different metrics/variables to suit their need for a model that can make predictions. So maybe we need different approaches to measuring complexity, depending on which property of complexity in a given system we are interested in.
In your most optimistic video on Entropy, you brilliantly pointed out that so much is not understood about Entropy and what it means for the future of life, and even what it means period. Lee Cronin and Sara Imari Walker have voiced your concerns for how Entropy is understood in their Lex Fridman and other interviews. They are on an exciting track with Assembly Theory. They welcome criticism and feedback to improve Assembly Theory and as they take criticisms to heart and flesh out assembly theory, it shows interesting possibilities for new insights into physics, biology, chemistry, selection, etc. They are even exploring applying Assembly Theory to Math!
Complexity is dual to simplicity. Randomness (entropy) is dual to order (syntropy). Points are dual to lines -- the principle of duality in geometry. Points (nodes, vertices) are dual to lines (links, edges) -- graphs and networks are dual. Convergence (syntropy) is dual to divergence (entropy). Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics! Syntax is dual to semantics -- languages. If mathematics is a language then mathematics is dual. Real is dual to imaginary -- complex numbers are dual (photons: light or pure energy). The integers are self dual as they are their own conjugates. Subgroups are dual to subfields -- the Galois correspondence. Sine is dual to cosine or dual sine -- the word co means mutual and implies duality. "Always two there are" -- Yoda.
Gosh! This is fascinating! I recognise this as one of those abstract concepts that will fascinate me for the rest of my life! Please revisit this whenever something significant arises.
Hard to describe, but I gave this some thought. I imagine a maze with a single straight path from A to B as being of minimal complexity. Each time you have to stop (a node) and take a different direction seems to be an increase in complexity. The more choices (nodes) to get between A and B, the greater the complexity; the 'path length' also increases, but at a lower rate. Sounds similar to why quantum physics is complex: many combinations to get from A to B (in my limited understanding anyway).
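The branching intuition above can be made concrete with a toy path count. A minimal sketch in Python (the two graphs are invented for the example): the number of distinct simple routes from A to B grows with the number of choice points.

```python
# Count distinct simple paths from A to B; the graphs are invented toy examples.
def count_paths(graph, node, goal, visited=frozenset()):
    if node == goal:
        return 1
    visited = visited | {node}
    return sum(count_paths(graph, nxt, goal, visited)
               for nxt in graph[node] if nxt not in visited)

corridor = {"A": ["B"], "B": []}  # one straight path: minimal complexity
branchy = {"A": ["C", "D"], "C": ["B", "D"], "D": ["B", "C"], "B": []}
print(count_paths(corridor, "A", "B"))  # 1
print(count_paths(branchy, "A", "B"))   # 4: more choice nodes, more paths
```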
Algorithmic complexity still seems to me like one of the best measures that could be extended to accommodate different types of complexity. For example, if you write a module that describes the particle physics completely and allows you to operate on the level of molecules, the rest of the algorithm will now be much more complex for the baby than it is for the rock. You sort of define the cut-off point for complexity. By defining micro-physics complexity as a module, you decouple it from whatever other types of complexity you want to describe. By defining boundaries of the code modules you would define boundaries of complexity types. They could also overlap and include each other. So overall, algorithmic complexity seems like a very universal way to quantify complexity.
The only algorithm you need to make a baby is the laws of nature. There is no cut-off point. The baby is a result of the same laws as the rock. Also check out Stephen Wolfram's work on how equally simple algorithms may give rise to vastly different outcomes: some outcomes are periodic, some are chaotic.
Except that it does fall short in some cases. The whole concept of complexity is higher-dimensional; it's likely that in a system with infinite configurations there can also be an infinite number of ways to describe them.
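The Wolfram point above is easy to demonstrate. A minimal sketch, standard library only: two elementary cellular automata with equally simple update rules, one of which settles into an orderly self-similar pattern while the other looks chaotic.

```python
# Minimal elementary cellular automaton: equally simple rules (rule 90 vs. rule 30)
# produce wildly different behaviour from the same single-cell start.
def step(cells, rule):
    n = len(cells)
    # Each new cell depends on its three-cell neighbourhood; the 8 possible
    # neighbourhoods index into the bits of the rule number.
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

def run(rule, width=63, steps=30):
    cells = [0] * width
    cells[width // 2] = 1  # single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

run(90)  # orderly, self-similar (Sierpinski-like) triangle
run(30)  # chaotic-looking mess from the same kind of one-line rule
```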
I am wondering if Shannon information could be used to describe complexity? Higher missing information needed to describe a system would imply higher complexity. Larger space of configurations, higher missing information, higher complexity.
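That idea is straightforward to sketch. A minimal Shannon-entropy estimate over a toy sample (the strings are made up for illustration): the more missing information needed to pin down a configuration, the higher the bits-per-symbol count.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Missing information, in bits per symbol, to identify a configuration."""
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: one configuration, nothing missing
print(shannon_entropy("aabbccdd"))  # 2.0 bits: four equally likely configurations
```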
Complexity is the only thing that humbles over-confident scientists who think we basically know everything and just need to figure out the details. The elephant in the room surrounding this topic couldn't be more massive
When I saw the title of this video, I exploded with delight. I have been pondering complexity for at least around 20 years. I had originally been thinking of how before computers were built, no one would have predicted what changes their development would bring. Even if someone had theorized 'oh maybe some day we could make a machine that does math at fantastic speeds', it is very unlikely they would have realized that would change the whole of the world (although Ada Lovelace seems to have understood far more of that aspect far earlier than anyone else, judging from her notes in Babbage's journals). And I wondered what might be the next thing... I already knew chaos theory places tremendous restrictions on what mathematics is even capable of, and some things about fundamental limits to computation. Last night, I was watching some film that involved a character being given the ability to have any wish they made granted... and without a doubt the wish I would make in that circumstance would be "I want to understand the fundamental nature of complexity to an extent that I would be able to engineer it through creation of micro-scale interacting components that give rise to a desired macro-scale emergent phenomenon." I want to know exactly, precisely, how many atoms must be collected before the concept of "temperature" has meaning and gives rise to the phase diagram and different states of matter. I want to know what aspects of the constituents determine the behavior of the aggregate. It is crystal clear that mathematics cannot handle this. We are dealing with non-linear dynamics here, and math simply can't do that. They brag about finding stable orbits for the Three Body Problem, while I'm only interested in the 6.022*10^23 Body Problem. Right now I think the strongest possibility of where a fundamental understanding of complexity might come from is AI. Not that I think an AI will figure out the problem, I don't believe that at all. But the specific way that modern AI is built, with networks of trillions of nodes, is an actual example of a system which can be pulled apart, manipulated freely, and experimented with that is a true complex system, where the "More is Different" adage holds true. We can build systems at a large enough scale now that I imagine people are doing experiments trying to figure out either WHY certain large-scale properties emerge, or else how to modify the emergent behavior through manipulation of the architecture of the nodes or their 'interaction'. We might not develop a formalism to make this engineering easy, but I imagine we will definitely make some progress towards recognizing it, setting bounds on how much influence, and of what kind, can be had through interventions of different kinds, etc. They are nothing but big matrices of floating point numbers that use matrix multiplication, and yet we can build them and have them understand language (for certain definitions of 'understand'), generate images, extract data from noisy inputs, etc. I have pinned down what I think is a pretty robust definition of consciousness, and it is coherent with everything I've read about both human and animal consciousness. Consciousness is an emergent property of systems with a 'large enough' number of adaptive components embedded within a feedback loop inside a highly-enough correlated environment. Every part of that definition is important. If you break the feedback loop, say through introducing total sensory deprivation, you cannot maintain consciousness.
I also believe, similar to the emergent property of temperature, that this produces a phase diagram where there are different 'states' of consciousness similar to states of matter. I am uncertain yet whether the number of adaptive components is actually the feature that controls the different phases. With the simplest brains, flatworms and things like that, behaviors that appear complex are possible, but it's a very bad idea to rely on intuitive perception like this. Those behaviors can often be mapped pretty directly to correspond to features in the immediate environment. The larger the number of neurons (roughly), the further away from the environment and present-time the consciousness can get, eventually accounting for some intuitive prediction. That gives rise to 'emotion', which is in-brain anticipation of interoceptive changes. Beyond that, development of language is a phase shift and enables an entirely different state of consciousness, enabling perception, and thus thought, of things which are not immediately present. This also enables abstraction, and is the first place at which "understanding" really exists. If we get to a point where we can engineer complex systems... every other technological advancement of the human species will look like idiocy. If we had a true deep understanding and the ability to engineer complexity, it would probably be harder to describe something that would not be possible than something that would. We'd be able to 'build' things that build themselves and that would look like literal magic to anyone alive today. Too much CO2 in the atmosphere? Whip up a custom bacterium that uses solar power to pop it apart and build the liberated carbon atoms into perfect sheets of graphene. Maybe then do another that'll slice up the graphene sheets into strips and weave them into cables.
The video fails to relate complexity with the degrees of freedom of individual parts along time, not only spatial networks. That's why a baby is much more complex. Cells are born and die at a rate of millions per minute to maintain a sustainable body. Cells themselves have specific roles and options, all related to options given present input data AND memory data. See the Kurzgesagt videos "Consciousness – How Unaware Things Became Aware" and "What Are You?". Memory is key for algorithm evolution.
Thank you Sabine, never stop making them videos! 😂 Can't wait for a longer attention span in my 5 and 6 year olds to binge watch this and more.. :) Stay safe, and have a good happy holiday time! And, ummm, as this video is almost screaming for it, can we have an update/info video about AI, its current state, all the different modalities being utilized and orchestrated (AI agents?) and how it seems to exponentially evolve? Skip a step in complexity, connecting artificial neural dots😮😅 and go straight for explaining what superintelligence means, even in the context of this video? (A neural net saving 800 years of material research in one very specific area of science (GNoME), etc.) Thank you so much; whether I see this particular video or not, it's always nice to leave listening and watching someone conveying an interesting topic, with a little more knowledge, and maybe even provoked thoughts ❤🤙👌👍
Fascinating indeed. As a computer scientist, my perspective might be biased... But, for me, we just need to define the variables and rules to add them (since they're different things)... Which doesn't make it much easier... Does it? 😬 Anyway, thanks, Sabine! 😊 Stay safe there with your family! 🖖😊 And happy holidays!
Computability is a challenge to tackling complexity for sure, or as Sabine said, we would just use the fundamental equations in physics and be done with it, (*sigh*) but for combinatorial explosion, eh? Where we have success in high-complexity domains like biology is where we have good heuristics that sufficiently model the fundamental behaviors we are interested in predicting. The number of NP problems nature tackles every day is quite impressive until you recognize that it's using all of the atoms in the entire universe to do so... an option that is mostly closed to us unless we redefine 'predict' to mean 'observe'. To me, one of the essential features a theory of complexity needs to account for is the dynamics of a system. If you take a human body and freeze it in time it's not really a complex system anymore, it's just a particle arrangement. A rock that is moving through time has dynamics, but they are relatively static (non-complex) on a human time scale. Geology has complexity, but only on geologic time scales, so clearly the complexity of a system somehow includes a scale factor based on the dynamics of the system, and so comparing system complexity must include that variable. I mean, on a geological time scale the inanimate and decidedly non-complex inorganic chemistry of the earth somehow evolved into organic chemistry and then life, which makes the complexity of rocks pretty interesting. Linguistic challenges are clearly at play here as we continue to overload mathematical concepts with human linguistic baggage.
@John-zz6fz Yeah, it's a huge task and the number of variables is just unreal... Unless we really are in a simulation and can, somehow, access the source code. 😬
@@John-zz6fz How would functions (where certain subsystems emerge which serve a superior "super-system" - known as functional differentiation, as seen in the brain or society) and evolution (a REAL evolution - like biological life, society, the cosmos as a whole) add to the complexity of a system? Compared to just complex chaotic phenomena like the weather.
@bjornrie The increasing complexity would be due to emergent phenomena. In this case the term complexity has dual use. In the computer science frame we mean computational complexity, and if the system is calculable in a Turing sense then no new complexity emerges. From the information science frame, complexity isn't the same as computational complexity. It's not totally clear what the term actually means in information science, and that point is made by Sabine as she's saying we need to make progress on a formal mathematical definition of complexity in the frame of complexity theory.
Thank you so much, Sabine. This is one of the best science videos I've ever watched on YouTube. I admire how you synthesized key aspects and insights regarding the "multiverse of complexities" in just 18 min. I am part of the Engineering Systems community, and understanding how to assess [socio-technical] complexity is a major research theme for us. I plan to watch this video over and over for inspiration.
That's so cool. I recently fell in love with the idea of assembly theory after listening to Lee Cronin, and I'm starting to see assembly theory everywhere now. So happy that you commented on it as well. I would love to hear more from you about it, especially on the point that assembly theory requires time to be essential for rising complexity, and the point that Cronin made that the past is deterministic but the future isn't, because emerging complexity is so huge that it's unpredictable by just looking at the laws of physics! Thanks for your insights.
In my opinion it's more of a value judgement on what is more difficult for us to describe. Even though there is so much stuff and energy interacting in the early universe, it is actually fairly easy to describe mathematically, whilst a baby is incredibly difficult.
It would appear in terms of complexity that the early universe is in essence the same as the universe today; there must be at least equal unseen complexity in the early universe, since it describes and pre-forms the future it will become. Unlike DNA, which iterates within an outside medium which provides external elements to make a baby, the early universe is self-contained and must have the same complexity at all points in time. The early universe must have unseen internal complexity to produce a baby in 13 billion years, as well as civilization.
@@myroomtv2014 The important thing is realizing that this inherent complexity can also be a characteristic of a system through time. We need to consider time and space as equivalent even though we are stuck in one and free in the other ;) something that starts extremely simple can become complex, and in a sense then it "already is complex".
@@myroomtv2014 I don't agree that the initial conditions of a system are necessarily complex for the result of the evolution of that system to be complex. The rules themselves breed complexity by their very nature. It seems enough to provide a very low entropy seed and let the rules create the complexity.
Thank you for this very insightful video, Dr. Hossenfelder. The problem with describing complexity in detail is that it is saddled with the mathematical theorem of incompleteness. I wouldn't want to rain on this wonderful parade - but as a philosopher, I'm firmly convinced of it. Merry Christmas!!! Anthony
Pleased to see that Sabine has returned to her core capabilities, resulting in thought-provoking questions backed up by various facts based on science.
While studying Graph Theory for my MA, I came across Ramsey Theory and was immediately intrigued. Ramsey Theory deals with complex systems (usually finite, discrete networks), and how large the networks can be before some certain substructure must manifest within the network. The "simplest" example is how big a simple complete binary (or 2-color) graph K_n can be such that K_n avoids a monochromatic triangle (3-cycle C_3). The answer is 5. K_5 is the largest simple complete graph that can be drawn in 2 colors without a triangle (3-cycle) of the same color edges. Sometimes I wish I'd gone into Ramsey Theory because it's still in its infancy and most of even the simplest questions are still open and an active area of research.
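For anyone who wants to see the K_5 fact with their own eyes, here is a hedged brute-force sketch in Python (vertex labels and encoding are my own choices): it checks the classic pentagon/pentagram coloring of K_5, then verifies that every 2-coloring of K_6 contains a monochromatic triangle, i.e. R(3,3) = 6.

```python
from itertools import combinations, product

def mono_triangle(coloring, n):
    """True if some triangle in K_n has all three edges the same colour."""
    return any(
        coloring[frozenset((a, b))] == coloring[frozenset((b, c))] == coloring[frozenset((a, c))]
        for a, b, c in combinations(range(n), 3)
    )

# K_5: colour the pentagon edges 0 and the pentagram diagonals 1.
pentagon = {frozenset((i, (i + 1) % 5)) for i in range(5)}
col5 = {frozenset(e): (0 if frozenset(e) in pentagon else 1)
        for e in combinations(range(5), 2)}
print(mono_triangle(col5, 5))  # False: K_5 can avoid a monochromatic triangle

# K_6: brute-force all 2^15 edge 2-colourings; every one has a mono triangle.
edges6 = [frozenset(e) for e in combinations(range(6), 2)]
print(all(mono_triangle(dict(zip(edges6, bits)), 6)
          for bits in product((0, 1), repeat=len(edges6))))  # True
```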
This is a mind-boggling topic, as is the topic of consciousness. We are obviously missing a huge chunk of understanding of the world, which is right under our nose, but even asking questions and trying to give the simplest definitions is like carrying big stones.
I don't think I agree that the algorithmic complexity of all objects is necessarily the same. Just because everything can be described by the same general algorithm doesn't mean there's no shorter program specific to each object. File data is vaguely analogous - for two files of the same size, there's one algorithm that can describe both (simply enumerate the bits) but with a compression algorithm, a highly ordered file will compress to a much smaller blob than a highly irregular file.
Perhaps what we think of as "irregular" is an illusion, in that we use it in order to describe something that does not have any patterns that are recognizable to us currently. It could just be that we don't yet understand the patterns, and that even highly chaotic systems are ordered in strange ways.
You're right, it's not just the algorithm but the inputs, which in turn may be expressed more compactly; compression is a good intuition for this. But the video seems to assume that these programs can have free variables / inputs, and they shouldn't.
@@bodeeangus9957 Kolmogorov complexity goes even deeper than that; look into Chaitin's constant, for example. Although it can be compared in some sense for actual programs (you can objectively say which is shorter, and if I'm not mistaken you can even do that without loss of generality in assuming a particular language), you can't do it in general and you can't compute the Kolmogorov complexity of a given string. Concrete examples exist in cryptography: a pseudorandom generator is an object which, without knowledge of the seed, appears to have unbounded complexity, but in a very tangible sense only has a constant amount of it.
I kind of agree - the only issue is quantifying "amounts" of description. I can describe a pencil with the word "pencil" and a human with "human", so using "number of words" isn't quite enough. Possibly "number of variables" is more in the right direction.
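A crude way to quantify the compression analogy from this thread is to treat zlib's output size as a rough upper bound on description length (only a proxy; true Kolmogorov complexity is uncomputable). The sketch below also illustrates the pseudorandomness caveat raised above: the PRNG stream looks incompressible even though "seed plus generator" describes it exactly.

```python
import random
import zlib

def description_bound(data: bytes) -> int:
    # zlib output length is a crude *upper bound* on description length;
    # true Kolmogorov complexity is uncomputable in general.
    return len(zlib.compress(data, 9))

ordered = b"AB" * 5000                   # highly regular data
rng = random.Random(42)                  # true description: algorithm + seed
pseudo = bytes(rng.randrange(256) for _ in range(10000))

print(description_bound(ordered))  # tiny: the regularity is exploitable
print(description_bound(pseudo))   # ~10000: looks incompressible to zlib, yet
                                   # "this generator with seed 42" pins it down exactly
```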
5:30 "Lines connected by dots". Hmm. Quite an interesting reverse point of view. I usually think of graphs as a collection of points connected by lines... 12:00 "It’s not all that hard to tell a brain from a piece of wood (in most cases)". Thank you, you made my day! 😀 Blooming complexity is our ideal. From this point of view, we can define *_good_* and *_evil_* : what increases complexity is *_good_* , what reduces/destroys complexity we perceive as *_evil_* .
True, the kolmogorov complexity of the math governing the evolution of all systems is the same, but that's not what people mean by the complexity of a system. That complexity also has to include a description of the system's current state. In that way, the kolmogorov complexity of the baby far exceeds that of the rock. The two really difficult problems with kolmogorov complexity are: 1. A general mechanism for computing its value is impossible. Optimally minifying an algorithm is known to be reducible to the Halting Problem, which is the classic incomputable problem. 2. Like many measures of complexity, it depends on the granularity at which you want to describe the system. For example If you want to describe the macrostate (temperature, pressure, phase) of a system, the complexity will work out to be much lower than if you want to describe the microstate (position and momentum of every particle).
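Point 2 can be put in numbers with a toy spin system (sizes invented for the example): the microstate description costs one bit per spin, the macrostate costs only the logarithm of the number of possible counts, and the difference is the information hidden by coarse-graining.

```python
import math

# Toy spin system: N two-state spins, sizes invented for the example.
N = 1000
micro_bits = N                    # microstate: one bit per spin
macro_bits = math.log2(N + 1)     # macrostate: just the up-spin count, 0..N
# Information hidden by coarse-graining a typical half-up macrostate:
hidden = math.log2(math.comb(N, N // 2))
print(micro_bits, round(macro_bits, 1), round(hidden, 1))  # 1000 10.0 994.7
```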
15:01 I don’t think that’s what they mean by “difficult to assemble”. Obviously it’s not hard for two very in love oppositely sexed humans to make another human, but imagine trying to “assemble” a human, with all their DNA and cells and neurological complexity in place. A good way to think about it is through statistics of natural evolution. In the Earth’s early years, the overwhelming majority of structures which developed were extremely low in complexity, and on most planets, that pattern continues. The Earth, as it cooled down and collected useful elements from nearby supernovae, was able to become a highly rare habitat for the development of organic molecules capable of self replication. The majority of systems in the universe are not suitable habitats for this assembly, and if they are, those microscopic living molecules will find it difficult to evolve to anything larger and more intelligent unless they are within a perfect “Goldilocks” zone like Earth. Assembly theory might not be able to account for complexity seen in how structures change over time in certain ways, but keep in mind that from a certain perspective, evolution and natural change are part of this “assembly” notion. When we want to “create” our own complex biological structures, we always have to have some sort of sample cells, DNA, some kind of naturally occurring set of molecules containing information such that it can build itself; the system is too complex to produce in a factory the way you’d produce computer hardware.
Can you make a video about discoveries that the human body uses quantum physics in our DNA, sense of smell, and brains, and how this can possibly make us non-deterministic beings, because quantum physics is probabilistic, not deterministic? This is the argument of some mathematical physicists like John H. Conway, Simon B. Kochen, and Stephen Hawking himself.
QM implies it's all random anyway; there's also the point that most of the high-entropy situations are near indistinguishable from each other and can usefully be approximated as random for that reason.
Adjectives, including "high," are ambiguous shorthands that actually refer to a relative comparison to an unstated alternative: "X is high" actually means X is higher than some unstated Y. There are typically many possible alternatives for comparison, which is why failure to explicitly state the unstated alternative creates ambiguity and is often misleading. What your comment is doing is proposing a definition... a particular threshold that distinguishes "high entropy" from "non-high entropy." That's fine, but different thresholds could be proposed. For example: Entropy is "high" when entropy has grown to the point when a complex system can never afterward exist. I think this definition of high entropy would be more consistent with Sabine's point that complex systems are neither low entropy nor high entropy.
@@festusmaximus4111 Isn't QM all about finding patterns in all this "randomness", that could also be close to the highest possible complexity when you assume particles interact with every other particle in the universe?
I've been fascinated by complexity for years (probably decades). The best (intuitive) explanation I've come across is "everything humans MAKE is complicated, everything humans MANAGE is complex". It doesn't help scientifically, but it helps make sense of "things".
Good video. The other problem with complexity getting in the way of progress is the lack of agreement on how to define it, which leads to a lot of conflation.
@@SabineHossenfelder Actually, now I think about it: when we were studying solvation chemistry, vapour points, supercritical fluids, and substances which sublime, a few of my lecturers said that you reach critical points where the properties change, whether from temperature, pressure or concentration, and there is a lot to understand within these processes.
Hmm... Is this not a dead end? Complexity is a measure of a description, not a thing. We can describe one thing many ways. We can apply different complexity measures to one description. There is no single measure. You propose some combination of emergent behavior, edge of chaos, and self-organisation. But the simplest system has emergent behavior. And the self-organisation of a system can lie in a simple coupling of organising and organised components, independently of how complex the coupled components are.
Hmm. It seems that by that value system everything we have now is a dead end. Which, in a sense - unless we suggest we have the answer to life, the universe and everything sitting under our noses but all missed it - it is. But as a tool to advance our understanding further before later throwing it away, complexity theory as a vehicle for other techniques to model complex questions does seem very powerful. It is an opportunity for thought exercise which, provided it does not become self-serving and sufficient as an end in itself, has contributed to breakthroughs in the past.
Actually life drives entropy at a much faster rate. That your extreme locality might seem to exhibit lower entropy is a far cry from what happens in the wider environment. Look at climate change as a simple example.
@@SebWilkes How is change in climate an example of entropy? Life takes the energy that would otherwise be entropically wasted and concentrates some of it and builds greater complexity and organization. It is the only thing we know of that goes against the prevailing entropy gradient.
@@obsidianjane4413 CO2 in the atmosphere is a higher entropic state than oxidisable carbon in the ground. The amount of additional entropy you generate in creating a building, say, greatly outsizes the entropy you reduce by having a few crushed rocks suspended above you.
@@obsidianjane4413 I don't think it is limited to just humans, but without a better argument I'll concede on that. Nonetheless, it is worth pointing out that since life, or at the very least intelligent life, is so good at increasing entropy, it has been argued by proponents of biological determinism that a system that seeks to maximise entropy (and its rate thereof) would be likely to bring about life.
It's nice to have a full-length video again; very interesting topic. The reason I subscribed was for excellent videos like this. Not a fan of 1 - 2 minute long videos and "short form content" that can't go into detail. Thank you!
I have been thinking about complexity in another context: in our everyday lives! Our society, especially in the age of social media, has demanded an absurd amount of knowledge on a very wide range of topics to navigate relationships with other people. Under penalty of social execration! The tension this has generated is absurd and the stress is inevitable. Our mental capacity to deal with this is too limited to reasonably demand understanding and acting in accordance with so many "isms" that postmodernity has created.
Sabine, the easiest way to explain complexity is to think about if a "something" is hard or not hard. If you look at it that way, you will realise that a rock is a lot harder than a baby. 😂
This is why AI is getting so much attention. All complex systems are essentially networks. AI is a network. It takes a network to decode the properties of a network. AI (and indeed intelligence itself) understands complex systems to whatever degree it is able to build a sympathetic network that responds favourably to that complex system. We will probably never have equations to solve complex problems because those equations themselves would likely be complex systems. We will solve those problems by using progressively trained networks (AIs) to decode complex networks that we have no other tools to predict. Our tools, though functional, will be just as incomprehensible as the systems whose problems they solve. We understand how an AI works. We understand the output of an AI from complex input. But we can no more understand the knowledge (training) inside the AI than the problem the AI solves, because that knowledge is itself a complex system.
Love the part: complexity is between order and entropy... I'm not smart by a long shot, but I do love thinking of the universe and the state of its being! I believe that almost all of us make things way too complicated by trying to make them uncomplicated... that's where the magic happens. And it looks like fusion at one time or another, and looks like maybe absolute zero in one way or another... either way it's all very, very amazing, even without thinking and just soaking up the sights... and as that information gathers and filters your experience through your brain, to me it seems like instinct/self-preservation and many more such things are like the clumping of cosmic material in a specific area until a galaxy or a new planetary system has amassed enough energy to spark fusion reactions. To me our personality traits and thought processes largely depend on pattern recognition, which simply put keeps us alive and, what's more, thriving with creation and emotion. Which is to say both creativity & emotion seem to be the excess energy of "experience impact events". I'm gonna stop now because I will literally go on ten thousand more words
This video comes with a quiz which you can take here: quizwithit.com/start_thequiz/1702514595195x839772357362217700
I like you very much!
I address everything from the root therefore I am never wrong... anyone doing anything different is going to fail in comparison to me...
"A New Kind of Science" by Stephen Wolfram. A great nerdy read.
Yeah, looks like Ms Hossenfelder isn't aware of Wolfram's NKS otherwise she would have mentioned it even if she didn't like it.
I think kolmogorov complexity didnt get a fair shake. Yes, the standard model describes rocks, clocks and babies equally, but the *length* of this description is not the same, because there are more particles arranged spatially and temporally in a greater amount of ways, thus requiring a longer program leading to higher kolmogorov complexity
Agreed!
Moreover, the 'basic' description may well not be the _shortest_: if you consider a system consisting of an iron sphere floating through an infinite vacuum, then we can model all state variables using a much simpler model (treating the sphere as a point object with some fixed velocity) than the fundamental physics of all constituent particles -- moreover that complexity would be more or less constant across a large range of sizes of the rock, at least until internal repulsive forces begin to fail, even though the base system would get more and more difficult to compute using the naive method described in the video. Both of those facts (there existing a simpler model than raw particle physics, and that that model is to an extent size-invariant) are also true of a clock, but the simplest model for such would still have to be more complex than the simplest model for an iron sphere.
If we accept Kolmogorov complexity, we will need to take seriously Dembski’s schema regarding Complex Specified Information, and the “conservation of information”. Also, Stephen Wolfram’s “A New Kind of Science” is relevant.
But isn't Kolmogorov complexity really a measure of "disorder" instead of complexity? A "completely random" sequence of elements (generated by white noise) would take a longer program to recreate, compared with a "complex" sequence (which necessarily would have some kind of correlations that allow us to compress the sequence).
@@adolfobahamondealvarez5739 It certainly is a measure of disorder. A uniform sequence of a repeating constant value would have the lowest Kolmogorov complexity and a totally random sequence would have the "highest", with Shakespeare somewhere in between. You're totally right that merely higher Kolmogorov complexity does not mean higher actual complexity. However, it is still, in my opinion, extremely useful because it mathematically bounds actual complexity. We still don't know what it is, but we can be sure it must be more than x and less than y. Putting bounds is useful.
@@yakovdan Quite possibly I don't properly understand Kolmogorov complexity, but producing random noise with an algorithm doesn't seem long or hard if you are satisfied with a pseudorandom solution, and seems impossible if you want a truly random one; thus I am not convinced that it isn't really a good measure of complexity in most practical cases.
During my time as a PhD student, I also worked intensively with complex systems. The most important insight I gained was that it is practically irrelevant how many individual components a system consists of; rather, the interactions and dependencies between them are decisive for its complexity. It is also very important how many scales (temporal and spatial) the system comprises. If the micro scale has a strong influence on the macro scale, this also makes the system significantly more complex. Now, as a software engineer, I have realized that these fundamental properties (many dependencies and multiscality) also make software and code (= the system) complex and therefore difficult to understand and develop.
Indeed, for the purposes of control and prediction it matters more how interconnected (interdependent) the individual components of a system are than how many components there are.
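A back-of-the-envelope sketch of why the couplings dominate: components grow linearly while possible pairwise interactions grow quadratically.

```python
# Components grow linearly; potential pairwise couplings grow quadratically.
for n in (10, 100, 1000):
    print(f"{n:>5} components -> {n * (n - 1) // 2:>7} possible pairwise interactions")
# 10 -> 45, 100 -> 4950, 1000 -> 499500: the couplings, not the parts, blow up.
```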
As far as the role of the software engineer is concerned, it is long overdue for code to be written which treats the human as a consultant to the code that writes code, instead of the human being the code writer. Even better, the human must become like a customer to the code that writes the code the human needs: you just tell it what you need, as you would do to another human, and the code gives you the code solution that will work for you. Do not confuse what I am saying with the stupidity of the ChatGPT technology that has currently become popular. The ChatGPT technology and the likes (for images, video, sound, code, control, etc.) just have the veneer of intelligence.
read NASA guidelines
As a product designer who designs systems to control mass production lines, I like the C4 model of software architecture. I like to use it to map and understand the user requirements and flows.
@@reasonerenlightened2456 Well, it's still garbage in, garbage out.
You want a human to tell the code what it wants and the code will take that as an input and convert that into code that does what the human wants. However, any kind of function suggests both implicit and explicit properties/aspects. It's only when beginning to use and test the first code iterations that it becomes clear what was implicit that needs to be made explicit. We call those implicit things that haven't yet been handled 'bugs', because they create the possibility for an unwanted output without the means for handling a correct output having been made explicit.
@@512Squared Machine learning still does not understand "iterations".
As an academic scholar in complexity, I really enjoyed the video. Defining complexity is always hard, and whenever we go to a wide enough conference it is clear that everyone means something different by the word. More than once I tried to write projects with other researchers, only to find out later that, although we both refer to our field as "complexity", what we do is completely unrelated.
However, to put in my 2 cents, I think that trying to define the complexity of "things" is a mistake. We can define the complexity of their aggregate "processes", "dynamics", "behaviors" - and for these, Kolmogorov complexity works quite well. This is not strange for a physicist, by the way. When we define the entropy of a mixed deck of cards, we are not looking at the physical structure of the cards: we are looking only at their mixing properties, abstracting completely from their physical properties. We say that the card set has 52! possible states, ignoring the spins of their atoms and the molecular configuration of the materials composing them. A rock is simple if we consider its autonomous motion - it just stays there - while it could be complex when looking at its internal function: that would however be irrelevant if I am looking at its autonomous motion, just as the cards' material is irrelevant when looking at the probability of getting a flush.
To compute the complexity of "a rock" is as useless as computing the entropy of a deck of cards considering the materials of the cards. Notice that "assembly complexity" is, again, measuring the complexity of the process of building the item; it does not have any relation - in principle - with the object itself. While it is true that objects we identify as "complex" tend to have complex processes to build them, there is no reason to believe this is a general characteristic.
Can you point to resources for understanding how information content and complexity might be different?
@@pbajnow wikipedia -))))?
@@dancroitoru364 I get that. But order, information content, entropy, and complexity all seem related but maybe different in common usage.
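The deck-of-cards abstraction above is small enough to compute directly; a minimal sketch, standard library only:

```python
import math

# 52! equally likely orderings; the cards' physical makeup never enters.
bits = math.log2(math.factorial(52))
print(f"{bits:.1f} bits")  # ~225.6 bits of missing information per shuffle
```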
As a software engineer, complexity is the bane of my existence. I was looking for a way to measure software complexity, and now I have some threads to follow thanks to your video.
Edit: tbh I hadn't really looked, but it seems there are some methods
Edit: Computational complexity isn't what I'm talking about. Computational complexity is about the amount of resources needed to run a computation; that's not at all what I talked about, other than containing the same word "complexity". Don't be half-knowledge, ego-stroking assholes.
Don't forget to collect your Nobel prize when you figure it out.
Overall function vs size? I don't like recursive code, even if it makes a very small program. Can be a right pain to debug.
@@threeMetreJim to be honest, I haven't really looked really deep into it. But it looks like there are some methods already. Since there's no perfect way to measure complexity in physics, there's no perfect way to measure complexity of software. Because everything emerges from physics.
Software engineers being software engineers, there have been software tools written to measure software complexity. Like Testwell.
How about something along the lines of product-sum ratio of the number of functions defined and the number of calls to those functions multiplied by the number of variables parameterized by those functions, though I suspect that the ratio between number of value and reference parameters will need to be considered as well.
User-code only, as I would assume you don't care how complex the generated machine code is, just the user-code-base you need to maintain.
First bit is an attempt to measure the size of the code-base against how much it interacts with itself; second bit is an attempt to measure how much data is being moved around. More considerations might be needed to compare dissimilar code bases, perhaps incorporating the average number of lines of code per function in an attempt to get a ball-park measure of the code's relationship to spaghetti...
/two cents
Edit: If this ends up helping, I'd be happy with 10% of the prize money and an acknowledgement in your Nobel prize winning paper. XD
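For what it's worth, one long-established metric in the spirit of this thread is McCabe's cyclomatic complexity, which counts independent paths through the control flow. Below is a rough, hedged sketch using Python's ast module; which AST nodes count as decision points is my own choice here, and real tools differ in detail.

```python
import ast

# My own (debatable) choice of decision-point nodes; real tools differ in detail.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.With, ast.Assert, ast.BoolOp, ast.IfExp)

def cyclomatic(source: str) -> int:
    """Rough McCabe-style score: 1 + number of decision points."""
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(ast.parse(source)))

simple = "def f(x):\n    return x + 1\n"
branchy = (
    "def g(xs):\n"
    "    total = 0\n"
    "    for x in xs:\n"
    "        if x > 0 and x % 2 == 0:\n"
    "            total += x\n"
    "    return total\n"
)
print(cyclomatic(simple))   # 1
print(cyclomatic(branchy))  # 4: for + if + one boolean operator
```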
You are performing a valuable service with your work here. For me, for others who see and think about your presentations, and for those we impact in some way as a result of what we’ve learned. Thank you.
My favorite book on complexity is the novel by Robert Pirsig, "Lila". He outlines a hierarchy of complexity based on the emergent behavioral rules in a single taxonomy, starting with physics at the bottom and aggregating up through chemistry, biology, ... sociology. It is a brilliant work by a brilliant man and accessible to all.
Interesting.
a very mentally ill man with severe schizophrenia
This was pointed out by Auguste Comte two centuries ago. Modern scientists think they have invented everything.
Pirsig is a wonder. He's probably the guy that came the closest to realising enlightenment, the closest to walking through the door, without actually walking through it. Even though logically he saw that Quality was the Tao, the source of all individual things, he still could not 'feel' it, feel himself as just a manifestation of Quality. So instead of feeling happy and free - realising there was no individual named Robert Pirsig, nothing for him to do, nowhere he was supposed to go, no burden of the future or even the present moment on his shoulders - and going about his day like many mystics do, he only realised that logically, and reasoned there was no reason to get up, so urinated on the floor; no reason to stop the cigarettes from burning his fingers; no reason to do anything. The logical part of his brain realised the truth, but could not let go and let the emotional part of his brain feel it in the experiential dimension. This 'letting go' is crucial for the process. Without knowing, without understanding, you let go and trust the Tao, the universe, whatever you want to call it, to do you, rather than you doing it. Pirsig seems to have never done this. Chris' death made it worse. "Where, after all, did he go?", he asked over and over.
Lila is an agonising attempt to answer these questions and analyse Quality without ever having actually felt himself to be a manifestation of it, without ever having felt the 💙love💙 that is another name some people give to Quality.
He never let go and Lila is strained at best, torturous at worst.
@@philharmer198 I expanded your answer. :-)
Consciousness is also more prominent at the edge of uncertainty. Predictable or repeated events lower brain activity, as we have measured using fMRI (a phenomenon called "repetition suppression"). We can also deduce this subjectively, e.g. we become sleepy (& bored) when watching very predictable events that have no surprises.
Predictability can be a useful measure of complexity for dynamic systems with multiple interdependencies and variables. Global and local weather trends, changes in wind, temperature and relative humidity are of significant concern and yet their predictability decreases rapidly over relatively little time. Modeling possibilities is one thing we are good at, but knowing which will be realised is still beyond our abilities.
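The rapid loss of predictability can be shown with the smallest chaotic toy model, the logistic map (parameter and initial offset chosen arbitrarily for the demo): two nearly identical states diverge until forecasts are worthless.

```python
# Two almost identical initial states under the logistic map x -> r*x*(1-x).
r = 3.9                      # arbitrary value in the chaotic regime
x, y = 0.5, 0.5000001        # initial states differing by 1e-7
for step in range(1, 41):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step % 10 == 0:
        print(step, abs(x - y))  # the gap grows by orders of magnitude
```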
This is the most interesting video on YouTube this month. Thanks Sabine for consistently delivering such high quality.
Agreed.
Congrats on the PhD, and it's been really cool seeing your channel grow so much. Cheers!
As a budding Biologist with a great interest in systems biology, I really appreciate this video.
Since childhood I've always loved tinkering with objects to find out how things work, and that's how I ended up figuring out how the most complex things in the universe work.
Even your own brain?
Really, you figured out how the most complex things in the universe work? How much did it cost? Tell me how to build a gene drive. I'm passively putting together the math model. Tell me what you know. Bruh.
@@ADUAquascaping That becomes a culinary question.
Do you use NetLogo to model your systems biology? You should. It is easy: you define your environment, you define your agents, you define how the agents interact with each other and the environment... and how those interactions change the agents and the environment. Then you sit back and observe emerging behaviours. The only limit is the accuracy of the assumptions your model makes.
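NetLogo is its own language, but the same define-environment / define-agents / observe-emergence loop can be sketched in a few lines of Python. This is a stripped-down Schelling-style model with invented parameters, purely to illustrate the workflow, not a substitute for NetLogo:

```python
import random

random.seed(1)
SIZE, EMPTY, SIMILAR_WANTED = 20, 0.1, 0.5  # invented parameters

# Environment: a wrap-around grid. Agents: type "A" or "B"; None marks an empty cell.
grid = [[None if random.random() < EMPTY else random.choice("AB")
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(i, j):
    """Interaction rule: an agent wants at least SIMILAR_WANTED like neighbours."""
    me = grid[i][j]
    neighbours = [grid[(i + di) % SIZE][(j + dj) % SIZE]
                  for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    occupied = [n for n in neighbours if n is not None]
    return bool(occupied) and sum(n == me for n in occupied) / len(occupied) < SIMILAR_WANTED

# Run the system and watch clustering emerge from a purely local rule.
for _ in range(20000):
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    if grid[i][j] is not None and unhappy(i, j):
        ei, ej = random.randrange(SIZE), random.randrange(SIZE)
        if grid[ei][ej] is None:
            grid[ei][ej], grid[i][j] = grid[i][j], None

print("\n".join("".join(cell or "." for cell in row) for row in grid))
```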
Complexity is the minimal number of laws (mathematical functions) of all the possible connections between the elements of the system described within a specified space-time dimension.
It gets its real meaning as a comparison, as a relative value between two or more systems, although it is not impossible to quantify it as an absolute value.
Let's take the rock-baby example.
The baby is more complex if we compare it to a similar-sized rock, taking a small timespan, let's say a day, and investigating both from their molecular scale up to the whole object. The baby is more complex because the count of all the laws describing the connections of its elements up to the same macroscopic size is much higher for a baby - not only because its organs or molecules at a similar microscopic level need more laws to describe them, but because in 24 hours a rock doesn't really change, so there are no extra laws needed to describe the changes; the time-generated connections (evolution) and the laws describing these are missing for the rock.
However, if we take millions of years of evolution of that rock and compare it to the millions of years in which the baby state of a human is negligible (99.9999...% of the time it doesn't even exist), the rock may be more complex - so geologically it is complex, because over the higher timespan an "evolution" of that rock must be taken into consideration too.
The same is true if we change the spatial dimensions in which we compare the systems. No surprise that the geology of a whole planet is pretty complex.
To describe all the possible connections of the elements of a DNA molecule of a baby also needs many more mathematical functions than a silicate molecule (just to stay with the baby-rock example; at the molecular level it is not important where those example molecules come from).
We can also define a span of space in which to compare objects, or to measure their complexity, and a timespan too; so we can compare two objects from molecular size up to an average human size over 100 years, then compare those. It is also okay to compare different space-time spans of objects. In both cases we can say which one is more complex.
Just to use it for the coffee example: the two separate liquids half mixed are much more complex, because you need to describe the process of mixing as time goes on, which is probably the biggest part of the mathematics to do - much more than describing the separate or the completely mixed liquids.
This is the reason why the states in between, and the systems which rapidly change by time are so complex.
Now if we compare the whole universe from the smallest particles and in its complete "lifespan" there is nothing as complex as it, because all other systems are only a subset of elements in space and/or time, and those can't be more complex.
(Sometimes I used objects and systems as synonyms, but these thoughts were just born as I wrote them, inspired by Sabine's explanations. You probably got the point anyway if you took the time to read this.)
Thank you Sabine for reviving the philosopher, the thinking person, and the wannabe scientist in me.
Douglas Hofstadter in Gödel Escher Bach talks about “chunking” - if you can treat a collection of things (lines of code, molecules in a protein, etc.) as one thing, structurally or functionally, it’s chunkable. If you can’t, it’s complex.
See how I just “chunked” the science of complexity ? Now to watch Sabine and see if I’m right…
Best ever! You are not going to do this every day.
Thank you for bringing complexity to the light during a very narrowing trend in social reasoning.
This comment emerges from quantum fluctuations shortly after the Big Bang.
There is no free will but don't worry 😊
and the big bang emerged from....
GOD?
lol we've come so far since the medieval age
and this one doesn't
Definitely our largest gap. Thanks for elevating this! I've been in the complex-systems space (mostly practice instead of research, but with lots of formal training) for ten years or so, and appreciate your robust summary!
Thank you from the entire team!
I really liked this video; it gave me quite a lot to think about. It reminds me of a notion in topological dynamics known as a "dynamically maximal map" which, roughly speaking, is the most chaotic dynamical system (with respect to some family of dynamical systems). Often, dynamical systems which are not dynamically maximal can exhibit hybrid behaviors, which are complex but not "completely" chaotic.
This was a great talk about a very interesting topic which is far above my pay grade....I have a small library of books from 10yrs ago when this was a hot topic.....your comments hint at how the concepts have grown......my favorite issue is "self organization of complex/chaotic systems" which seems to me to be the edge of answering the question of how RNA/DNA got started......but now I must "just shut up"
@@docjohnson2874 Don't give up, Doc! 🙂
Sabine you're touching my heart you're sounding oh so much like a philosopher good for you!
😊
When I "discovered" the Duffing Equation and van der Pol Equation in my Upper Division Classical Mechanics course, it put the "WOW!" back into Physics for me. Thanks, Professor, Scott!
The dynamical chasm between these two equations and the SHO (simple harmonic oscillator) is still astounding.
Once again, you succeeded at making a difficult question understandable without dumbing it down. This is why I always come back to this channel. Keep up the great work!
I'm writing a book concerning logic and math. I just want to thank you, Sabine, for your effort. I'm in the same boat as you: wanting the truth and striving to separate unfalsifiable stuff from falsifiable stuff. The problem with modern science can be reduced to a problem of differing languages. (I won't drop technical details here.) But here's my take: Sabine's on the right track. The replication crisis from soft to hard science exposed and falsified many theories, including their fields. This is actually beneficial. Why? Science is on its way to becoming a unitary epistemological framework. Someday, it'll be a fair and perfect system like pure math. Right now, scientists are tribal. Fields are divided. Truth isn't polarized. It's not a bilateral thing. Truth is unitary.
RM3, non-binary logic, categorical logic --- validity is better than truth? Because mere truth is binary and not a real thing in the real world, there is always a third possibility that is Valid
I’m not on the same boat, but I’m looking across and trying to distinguish the possibility of methodology and intriguing concepts that invariably make us ask more questions 😊
The truth is out there. We need to go into space. That must be the biggest goal of humanity. As in the era of the sailing ships exploring the world, we need to embark on the same journey in the ocean of space. We have hit a theoretical limit, we must now push the pragmatic possibilities. It's cyclical and we need to embrace the half of the cycle we are in now.
IN THE INTEREST OF FINDING THE THEORY OF EVERYTHING:
SOME THINGS MODERN SCIENCE DOES NOT APPARENTLY KNOW:
Consider the following:
a. Numbers: Modern science does not even know how numbers and certain mathematical constants exist for math to do what math does. (And nobody as of yet has been able to show me how numbers and certain mathematical constants can come from the Standard Model Of Particle Physics).
b. Space: Modern science does not even know what 'space' actually is nor how it could actually warp and expand.
c. Time: Modern science does not even know what 'time' actually is nor how it could actually warp and vary.
d. Gravity: Modern science does not even know what 'gravity' actually is nor how gravity actually does what it appears to do. And for those who claim that 'gravity' is matter warping the fabric of spacetime, see 'b' and 'c' above.
e. Speed of Light: 'Speed', distance divided by time, distance being two points in space with space between those two points. But yet, here again, modern science does not even know what space and time actually are that makes up 'speed' and they also claim that space can warp and expand and time can warp and vary, so how could they truly know even what the speed of light actually is that they utilize in many of the formulas? Speed of light should also warp, expand and vary depending upon what space and time it was in. And if the speed of light can warp, expand and vary in space and time, how then do far away astronomical observations actually work that are based upon light and the speed of light that could warp, expand and vary in actual reality?
f. Photons: A photon swirls with the 'e' and 'm' energy fields 90 degrees to each other. A photon is also considered massless. What keeps the 'e' and 'm' energy fields together across the vast universe? And why doesn't the momentum of the 'e' and 'm' energy fields, as they swirl about, fling them away from the central area of the photon?
And electricity is electricity and magnetism is magnetism varying possibly only in energy modality, energy density and energy frequency. Why doesn't the 'e' and 'm' of other photons and of matter basically tear apart a photon going across the vast universe?
Also, 'if' a photon actually red shifts, where does the red shifted energy go and why does the photon red shift? And for those who claim space expanding causes a photon to red shift, see 'b' above.
Why does radio 'em' (large 'em' waves) have low energy and gamma 'em' (small 'em' waves) have high energy? And for those who say E = hf; see also 'b' and 'c' above. (f = frequency, cycles per second. But modern science claims space can warp and expand and time can warp and vary. If 'space' warps and expands and/or 'time' warps and varies, what does that do to 'E'? And why doesn't 'E' keep space from expanding and time from varying?).
g. Energy: Modern science claims that energy cannot be created nor destroyed, it's one of the foundations of physics. Hence, energy is either truly a finite amount and eternally existent, or modern science is wrong. First Law Of Thermodynamics: "Energy can neither be created nor destroyed." How exactly is 'energy' eternally existent?
h. Existence and Non-Existence side by side throughout all of eternity. How?
@joeboxter3635 'you are the field, and the knower of the field' - old Eastern proverb.
Thing is, we already know, intuitively; however, our unique perspectives from our individual observer positions cause the 'truth' to slightly fluctuate between actors.
This brings me back a few years, to when, as a Ph.D. student, I investigated some laws of nature regarding the interaction between complex systems. It is true: we are still at the beginning of our understanding, but some progress is being made in this field. Still thankful to my professor for his passion in developing the theory of temporal complexity and complexity in general.
Thank you Sabine. Wonderful, as always! I am reminded of a graduate student who confronted J. Watson at Cold Spring Harbor once, in a seminar series. Watson insisted that everything will be explainable and understood in terms of atoms. The student raised his hand and asked: Professor Watson, if I tell you that a house is made of bricks and mortar, can you tell me its architecture? Watson was dumbfounded. Who said this? he inquired, semi-angrily. Good old Aristotle, said the unfazed student. And that is the truth!
I really like Stephen Wolfram and Jonathan Gorard's insights on complexity arising from computational algorithms, and how finding pockets of computational reducibility within a system which is otherwise computationally irreducible gives rise to emergent properties.
A slab of marble and Michelangelo's sculpture, "The Pietà," are both made of the exact same material, but it would be a *gross mischaracterization of reality* to argue there is no difference in their overall complexity.
I was planning on doing my thesis in philosophy about complexity. But then mental illness struck and life took on a different path... Still, thanks for this video, even though it hurts to watch it, as it reminds me of all I've lost recently.
You didn’t lose anything, "potential" is a mental construct, and to believe that you lost something that was never yours is even more detrimental for your mental health.
Once we understand that nothing is expected from us, life becomes much simpler and lighter... and it helps, or at least it's helping me... and it doesn't take away anything from your "chances".
One of the best videos you have done. I really enjoy videos like this that reveal what is missing in the world of science and technology.
I love your explanations! They are always food for thought and help me push my brain to keep in shape.
As to measure complexity: How about "The harder it is to predict an outcome from a given input, the more complex the system is."
The verge of chaos would still be maintained, as you could not predict anything in pure chaos. At least if it was really chaotic and not just a limit in our understanding.
Hard to predict for whom, though? The same problem can be solved in a different number of steps by different functions. If you have the perfect model, maybe predicting different things is equally difficult.
Unless we assume complexity is equal to the probability of an outcome after some amount of time, for which we maybe need to quantize time to get a finite number of steps. If you can predict the state of a system in 100 steps with 100% certainty, you get a complexity factor of 1? Sounds arbitrary, but hey.
A frozen baby could become not very complex, though, as the system gets predictable, by that definition :p
This was an excellent episode. A related classic paper to read: P.W. Anderson's "More is Different." A single atom cannot have conductivity, but a collection of them can.
A relevant question is whether the property (conductivity) of the collection of atoms could in principle be predicted given complete knowledge of the physics of a single atom.
@@brothermine2292 The answer is an emphatic no.
@@brothermine2292 You can't, because arrangement matters. Graphite and diamond have different conductivities despite being made from the same atoms. Defects matter as well. And temperature. Not to mention impurities. While you can in principle calculate for a given arrangement and conditions, this becomes impractical quickly as you scale up. Exploiting symmetry helps, but defects break symmetry.
@@michaelinners5421 : Yes, arrangement matters. But why couldn't one in principle predict the emergent property (conductivity) for any given assumption about the arrangement (positions, momenta, etc) of the atoms?
@@brothermine2292 You will be able to predict IF you have a theory and understanding of the electronic structure of multiple atoms. This means valence bond theory or molecular orbital theory. So, just the "complete knowledge of the physics of a single atom" will not be enough. You need to understand the physics of what states the collection of atoms can have. Does this make sense?
For Kolmogorov complexity you have to include your initial conditions in the length of the program. Otherwise, everything is just the complexity of a Turing machine.
Plus the length of the program, a point Sabine seemed not to grasp: as if she confused the programming language with the program itself.
Still her point is coherent in this context. Anything with the same number of particles has the same number of quantum states, and that also implies that it has the same order of magnitude of entropy insofar as an algorithm is concerned.
A baby is just atoms, exactly the same as a rock. Simulating a same-weight rock and a same-weight baby is exactly as computationally complex, yet this doesn't fit our idea of complexity, so it can't be an accurate measure of complexity from our perspective.
@@Zeuskabob1 I completely disagree. It ignores the effects of the processes going on inside the brain as well as all those going on inside the body. Her statement that "If you went by Kolmogorov complexity everything would be equally complex" is just completely wrong. She just said in effect that all programs are the same length. I cannot imagine how Sabine made such a huge error. That's just factually wrong: very much a rarity in her videos, but this was one. (She did immediately follow by saying that "That makes no sense.", but then why did she choose to make the statement?)
Ya, it didn't really make sense to me either, because Kolmogorov complexity requires a Turing machine or something like it. Which means you would somehow have to reduce the dynamics of the baby, rock, etc. to some type of Turing machine and prove lower bounds on program size. Yes, the dynamics are all describable in QFT, but the dynamics would be (and should be) extremely different even in the QFT formalism.
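To make the program-plus-input point concrete, here is a minimal sketch (Python) that uses compressed size as a crude, computable upper bound on description length; Kolmogorov complexity itself is uncomputable, and the "rock-like" and "baby-like" byte strings below are purely illustrative stand-ins, not anything from the video:

```python
# Compressed size upper-bounds the length of the shortest program that
# reproduces the data (up to an additive constant), so it serves as a rough,
# computable proxy for Kolmogorov complexity, which itself is uncomputable.
import random
import zlib

ordered = b"SiO2" * 10_000    # "rock-like": a regular arrangement, 40,000 bytes
random.seed(0)
irregular = bytes(random.randrange(256) for _ in range(40_000))  # "baby-like": irregular bytes

print(len(zlib.compress(ordered)))    # small: regular data compresses very well
print(len(zlib.compress(irregular))) # ~40,000: irregular data barely compresses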
The more I listen to you, the more my brain is drawn to the complexity of your sense of humour. I'm an old bugger, so just ignore me 😄👍
A whole lot of handwaving and guesswork going on here.
This was a super interesting programme. We definitely seem to require new maths to progress in certain areas of research, or a different approach. If nothing else, very fun to think about.
Simple and simplistic are two very different things. So are complex and complicated. As Einstein observed, if we cannot explain it to a six-year-old child, then we don't understand it ourselves. But that is the beauty of science: it is complicated until it becomes complex or simple, which is the same. Thought-provoking video. Thanks a lot.
Complexity is dual to simplicity.
Randomness (entropy) is dual to order (syntropy).
Points are dual to lines -- the principle of duality in geometry.
Points (nodes, vertices) are dual to lines (links, edges) -- graphs and networks are dual.
Convergence (syntropy) is dual to divergence (entropy).
Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
Syntax is dual to semantics -- languages.
If mathematics is a language then mathematics is dual.
Real is dual to imaginary -- complex numbers are dual (photons, light or pure energy).
The integers are self dual as they are their own conjugates.
Subgroups are dual to subfields -- the Galois correspondence.
Sine is dual to cosine or dual sine -- the word co means mutual and implies duality.
"Always two there are" -- Yoda.
Would be great to see Sabine make a video on complexity in the context of Constructor Theory
Construction is dual to destruction, creation is dual to annihilation.
I really think the connected dots are a more complex system. As a software developer, having my functions used and calling each other is way more complex than having them as pointers that never get called. We were discussing with my colleagues how complexity grows, and we concluded it has to do with combinatorics and combinatorial explosion. But there are cases where you can have linear growth or exponential growth as well, depending on how the connections are made. Also, we came to the conclusion that intelligence can be described as an aligned complexity.
The mere existence of connections adds a certain complexity, but the dots remain identical, which reduces complexity compared to the state where only some dots are connected to some other dots. Maximum complexity lies somewhere between the unconnected and fully-connected states, as in neural networks (and our brains).
@@jpdemer5 I think of aligned complexity as what you described: the in-between of the unconnected and fully connected states, which I think describes intelligence. Looking at it this way, you can consider systems where the path of least resistance is chosen as intelligent (like light, the movement of charged particles, water flow, etc.), but in general they are complex systems, all connected and offering other paths that are not taken. I think of them as functions that can be called if a certain if-else statement is fulfilled.
Wdym by aligned complexity
@@l1mbo69 Having a way of finding the solution to a problem faster than brute force.
If we are only talking about the complexity of description, then no connections vs. the full graph is exactly the same: any two functions are either always connected or always disconnected. You know this one bit of information and you can reconstruct the graph just from knowing its size. If you instead want to consider the complexity of operation, a dynamic one, you are actually talking about possible traces, paths in this graph, which are a different thing altogether.
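To make that "one bit of information" point concrete, here is a small sketch (Python; the bit-accounting encoding is an illustrative assumption, not a canonical scheme):

```python
# Rough bit-accounting for describing an n-node undirected graph: the empty
# and the complete graph need only the node count plus a one-bit flag, while
# an arbitrary graph needs one bit per potential edge.
from math import comb

def description_bits(n: int, trivial: bool) -> int:
    size_bits = n.bit_length()           # bits to state the number of nodes
    if trivial:                          # empty or fully connected graph
        return size_bits + 1             # ...plus a single "which one" flag
    return size_bits + comb(n, 2)        # ...plus one bit per possible edge

print(description_bits(100, trivial=True))   # 8 bits: trivial to describe
print(description_bits(100, trivial=False))  # 4957 bits: a generic graph
```

On this accounting, the empty and full graphs are equally simple to describe, while the interesting "partially connected" graphs in between carry almost all of the descriptive cost.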
That's most informative; I never thought that complexity arises from being in the middle of entropy. Thanks Sabine!
This is an absolutely incredible video. So late in 2023, this is right up there as one of the best videos this year.
And one week later the gravity video; can't decide which one is the best. She's an extraordinary teacher and human.
14:42 Consider an alternative point of view: a pencil would not exist unless manufactured; however, a pencil isn't manufactured for its own sake. It exists bound to the creation of self-expression or data retention, and who is to say that this isn't an integral part of the object?
On the other hand, us humans are made just for the sake of making us, if not by an overly enthusiastic but frugal father or a mother with dubious math skills.
The vid was interesting and thought provoking, thank you!
Man, I am so glad to see this video from Dr. Sabine. I have been obsessed with Dr. Wolfram's work on computational thermodynamics; ever since ChatGPT came out he has been on a roll.
I've had trouble understanding graph theory, but as soon as I thought of it in terms of lines and dots it became so much clearer. Score another win for Sabine reducing gobbledegook! 🤪
I'm confused. In what terms were you thinking before?
Personally, I always thought of complexity as the number of interactions between different systems (which themselves could consist of different interacting systems) that will have a fixed result. The total of all those interacting systems could make something more complex than others.
Exactly my thought. For me the complexity of a system depends on the number of significant interactions (at that emergence level) you need to account for in order to predict the exact outcome.
so how does your framing differentiate between two colliding galaxies and two people talking with each other, for example?
And how is a "significant interaction" defined ?
Seems to me like it's just pushing the definition of "complex" to be based on the vague term "significant"...
I agree, but I think there's a psychological component. How your brain categorizes things contributes to whether or not you will view things as different systems. You can view many body systems as the same system, and it's largely arbitrary that we separate them. How do you have an immune system without a digestive system, without a skeletal system, without a cardiovascular system, etc. etc.?
@@Velereonics Imagination can definitely be emergent even when bound by karma/conditioning... is there emergence via an omniscient consciousness, or is the universe a block theory coupled with determinism and randomness? We know there is supposedly no free will, so consciousness is just an emergent property derived from determinism and randomness? This is what the mathematics shows, but no one really knows where the energy/information is coming from. Observation is a physical act and can alter non-local reality, yet has no unit of measurement within the quantum wave function. So it's a simulation created by something, but not GOD? GOD doesn't have to be a humanoid, but when people talk about simulation theory and an architect while refuting God, it really doesn't make sense. Whatever the architect is or happens to be can be defined as God, even if it's something we can't fully comprehend. God = the architect of our simulation that we don't fully understand. That's pretty much the same definition from thousands of years ago. If we aren't in a simulation, then where is the energy/information coming from? It just existed forever... how can we ever fully define consciousness if we don't know its source?
@@Velereonics That can be applied to the totality of the universe. It's all one thing. There is no separation, and time is just an illusion. Measurement from A to B.
Complexity breeds higher complexity, with others and with the environment, inevitably creating challenges and conflicts, ensuring a motive to change, adapt / evolve, i.e. fight. The beginnings of the move toward intelligence in life, as easier, "simpler" developments have been exhausted.
Step by step, inch by inch: change. Maybe not for a reason but simply a benefit. That's all it takes to get to the "chicken".
You follow my egg?
That quiz was so cool! I wish more creators put quizzes after their videos, even for videos that aren't strictly educational. It gives a content creator a good measure of how well they are expressing their given points in each video, and it gives the viewer a measure of how well they are retaining it. So many times I've seen people talking about topics I'm interested in and saying things completely contradicting what I thought about the topic- even though we likely consumed a lot of the same content! It always made me wonder how one could know that they were actually learning when browsing content. I hope that quizzes like this catch on.
🥳 Yay, a twentyish minute video!
And one of her best!
Understanding complexity is the key to understanding "Existence." ... Existence always evolves from *simplicity to complexity.*
It all depends on how simplicity and complexity are defined in any given system, and the more configurations that a system has the more ways there are to describe it.
You neglect that since entropy grows, complexity will eventually become impossible.
@@bodeeangus9957 *"It all depends on how simplicity and complexity are defined in any given system, and the more configurations that a system has the more ways there are to describe it."*
... The number of particles that make up the living, breathing, self-aware lifeform called "you" can be counted. Another structure can have the exact same number of particles, but be completely void of life and self-awareness. *Conclusion:* An equal mathematical assessment of two distinct objects does not dictate that they are equal in complexity.
Mathematics, material substrates, and particle configurations are not the arbiters of "complexity."
@@brothermine2292 *"You neglect that since entropy grows, complexity will eventually become impossible."*
... 13.8 billion years ago there was nothing but a totally benign point of singularity. That is as _least-complex_ as one can get! Today we have eight billion living, breathing, self-aware lifeforms running around exchanging information in a RUclips comment thread.
If you charted the evolution of complexity like we do with the stock market from its beginning to now, the "complexity line" would be consistently moving upward. ... That is the way "reality" is presented to us.
@@0-by-1_Publishing_LLC Then what exactly is the parameter that we can use to measure complexity? Atoms make up everything, but some things that are made of atoms can move and others cannot, but that still doesn't answer the question of how complexity is defined. The question of what complexity actually means is likely to be sensitive to the context of whatever system is being studied, because while we can call a life form complex, we can say the same thing about a star for instance. Is the sun more complex than a human? How do you quantify something like that?
"Lines connected by dots" I see what you did there 😂
What
hmm? 🤔
The complexity theory argues for the incompleteness of knowledge and not just information. Thanks for such an insightful video.
I've been interested in this topic for years. The best I've come up with is that "complexity" seems to be the opposite of "predictability", and in that sense the Kolmogorov complexity I feel is headed in the right direction.
If a system can be modelled by only a few variables it is less complex. A pencil can be described by the materials that make it, its dimensions, and maybe some attributes like colour (i.e. its absorption spectrum). A person requires far more variables to adequately predict its behaviour, and is therefore more complex.
The formalisation of this I think already exists to some extent in the realm of statistics. While a model can always match the data best with more variables, the "best" model - the one that generalises to predict future observations - has an optimal number of variables. Perhaps this number, or something related to it, can describe its complexity.
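One way to make that statistical point concrete: a sketch, assuming a synthetic dataset, a polynomial model family, and the Akaike information criterion (AIC) as the generalization proxy; all three are illustrative choices, not anything from the video.

```python
# Model-selection sketch: the "optimal number of variables" idea above,
# implemented by picking the polynomial degree that minimizes AIC.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 200)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.1, x.size)  # true model: degree 2

def aic(degree: int) -> float:
    coeffs = np.polyfit(x, y, degree)       # least-squares fit of this degree
    resid = y - np.polyval(coeffs, x)
    rss = float(resid @ resid)              # residual sum of squares
    k = degree + 1                          # number of fitted parameters
    n = x.size
    return n * np.log(rss / n) + 2 * k      # AIC: fit quality plus a size penalty

best = min(range(1, 10), key=aic)
print("degree chosen by AIC:", best)        # typically 2, matching the true model
```

The penalty term is what stops the "more variables always fit better" effect, so the selected model size could serve as the kind of complexity number suggested above.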
As Assembly Theory (AT) is an important part of this video (and perceived as "novel" and promising): it has been formally demonstrated, through published papers and blog posts, that AT is formally equivalent to existing work (Shannon entropy and LZ compression grammar) without proper citation, and is a weaker version of these established concepts. This raises significant concerns about AT's originality and scientific merit, and also highlights the importance of not overselling scientific work, especially if it is not original and does not explain what its authors claim it explains.
Publications include a paper in npj Systems Biology and Applications, "On the salient limitations of the methods of assembly theory and their classification of molecular biosignatures"; the arXiv preprint "Assembly Theory is a weak version of algorithmic complexity based on LZ compression that does not explain or quantify selection or evolution"; and two Medium posts by Dr. Hector Zenil, broadly explaining why Assembly Theory and its marketing campaign are seriously damaging the image of science as a whole.
That's great, but it doesn't actually explain any physics of any importance.
Huh? That odd feeling when I can suddenly follow 100% of the topic... without ever expecting to! My backgrounds are in social silences, statistics and 3D modeling, and somehow this just connected all the topics I fret about to physics! :D I have to say I am happy to hear that people are working on math for this - we need those tools, badly - so many good ideas are currently not going anywhere because the math to test them is just not there! This video gives me hope :) Thank you Sabine! :)
❤ I like your background in social silence 😂. It made me think. People in general make too much noise. We all should listen more to each other, especially listen to Sabine 🧐
One way to define complexity algorithmically is maybe how big a program would have to be to generate a full description (i.e. arrangement of particles) of an object that would be considered to be a stone, or a clockwork/human, etc. For a stone it would just need to generate a regular arrangement of molecules with some random irregularities, in any shape, and the output would still be a stone. For a human it would need to be much more specific. (To get a full description of one specific stone, it would of course be just as complex as one specific human.) This would be similar to procedurally generated worlds in video games, or maybe to (lossy) data compression algorithms.
You should look at Kolmogorov's complexity en.wikipedia.org/wiki/Kolmogorov_complexity
Yeah, this is a mistake in how Kolmogorov complexity is explained in this video. You cannot separate out some input (i.e. arrangement of particles) from the program. A program cannot take input when you want to describe its Kolmogorov complexity, thus the input needs to be included/encoded in the program itself, thus affecting the program size and the Kolmogorov complexity.
There is also no distinction made whether "complexity of a stone" means:
- The complexity of one specific stone that you take as an example, or
- The complexity of the characteristics that an object needs to have so that it falls under the category "stone"
In the latter case, the Kolmogorov complexity would depend on how broadly you define the type of object you want. In the former case, it would be about the same for any real object, maybe depend on its size or number of particles.
The algorithmic complexity in the video seems to refer to how complex it is to simulate the physical evolution of the object, but not how the object is made up in the first place.
@@tmlen845 I think that a major issue in the notion of complexity @Sabine is speaking about is that one needs to distinguish the complexity of a system (that is, any mathematical or physical object) from the complexity of its properties.
One such property being its predictability.
And I think that we can roughly agree that a system is "predictable" if you can "infer" its future states faster than the speed of time.
(That is, you want to know what happens tomorrow today)
The thing is that the complexity of a dynamic system is not really related to the complexity of any static state of that system.
Take the Syracuse (Collatz) sequence, "u(n+1) = u(n)/2 if u(n) is even, else 3·u(n)+1": quite a simple system, yet its dynamics are quite complex.
Another example is the "full network" disagreement: there is actually no disagreement; I think everyone can agree that this network is "simple" in the sense that it is easy to describe.
The fact that it is more "complex" than the empty network is not a matter of the complexity of the network (seen as a graph) but rather a matter of the complexity of the underlying system, or, worse, of the complexity of the dynamics of the underlying system.
Yet I also agree that we do not have an airtight definition of complexity, good thing for me considering I'm finishing my PhD on a closely related topic :-)
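As a concrete aside, the Syracuse/Collatz map mentioned above fits in a few lines of Python, yet its trajectory lengths already jump around erratically (the range of starting values is an arbitrary choice):

```python
# The Collatz rule is two lines, but the number of steps to reach 1
# varies erratically with the starting value: simple rule, complex dynamics.
def collatz_steps(n: int) -> int:
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print([collatz_steps(n) for n in range(1, 16)])
# [0, 1, 7, 2, 5, 8, 16, 3, 19, 6, 14, 9, 9, 17, 17]
```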
I think the number of emergent properties is probably the best measure of complexity; it seems to be the lowest common denominator of all the other measures stated in the video. For example, systems with more emergent properties have a higher likelihood of those properties interacting with one another, making them harder to predict.
Measuring complexity is hard because nobody can agree what is being measured.
Exactly. First useful commentary on that matter/video!
The way forward, then, is to realize that we don't need to agree on what is being measured!
What we need is multiple measurements of utmost precision and accuracy.
Then we can produce theories of thermodynamic complexity, chemical complexity, biological complexity, social complexity, computational complexity, theoretical complexity, and any kind of complexity produced by a system of well defined measurements and/or theories.
@@ywtcc Yeah, we don't need to, but when we don't share, or share too much, people get confused and turn to magic and religion. It's present in the patterns of connections. Like how matter is fueled by energy, and that energy allows a flux or change, and now all the big can change to small and small to big, and we exist between. Multiple variables come together, of different shapes and sizes, such as quantum particles, and macro things like planets or giant bodies of water or something in space, waiting to be connected to more, so that the differences of 6 plus 7 can blow up and create everything in between; and then, when the combos in between start gaining layers and more variables, they can see what is not. Look at how we see chemicals and those variables, and how the structures and bonds share energy through those connections. Plants, cells, chemicals, made into all this. The Big Bang set in motion the very variables, and now we reflect and connect. Our brain grows different cells in different patterns and edits them through the senses and diet and such, so that we can survive and grow. We take in patterns connected to a whole, and we don't see everything. That's apparent when dealing with radiation and X-rays and stuff. We lack the biological tools to sense them, but we created them and connected the inputs to a different form called language, and understood them through the details of connections and recognizing the reflections. Sounds like magic, but it's like connect-the-dots on steroids.
@@ywtcc I mean, think about it: we became complex over time, and what is time but a measurement of change in relation? The sun and earth and moon are just the same as the people and the cycles we shift through. We are the same. As above, so below. It's more than one saying because it's everywhere. A paradox is like both sides of the same coin, so which makes the other? It's a connected and intertwined relation where they make each other, sharing the energy. We try to leave something behind that can contain the complexity of the pattern, so that we can persist and grow; we draw and do art because it is another pattern we reflect, because we noticed a deeper complexity just a little to the side of others. Conscious growth and connection and thought is what came forth, but think about this: how many people are conscious when they just live their lives? They bounce around the environment, because they live in it and get changed by it, but how many stop and think of a different pattern from somewhere else that makes them see differently? How many perform an action that isn't what the environment grew into them? Do people ever sit back and wonder why they see through their eyes? I think we do, but we don't use our conscious will to do what we want; instead we've been trained by the media and such to only believe this and not trust each other. I mean, our society hates each other somehow, when we should be listening and trying to think of things other than what reality grew into us. Be conscious of thought, and grow a new thought beyond what you currently are. Idk how else to put it. We give up our why because we don't see it, but our why is everywhere; we just haven't learned enough to see it. There is no why, there is only to live.
This boils down to the first term psychology question about what intelligence is. The correct answer is: It's the thing you measure with an intelligence test.
That's why definitions are so hard when you enter the terrain of complexity: Complexity is too much for our understanding, so we have to break it down into models (thus reducing complexity) to have a grasp on it.
And this very step is where different people (and scientific disciplines) extract different metrics/variables to suit their needs of the model being able to make predictions.
So maybe we need different approaches to measure complexity depending on which property of complexity in a given system we are interested in.
9:40 - best stock video
Yes, a true gem lol
In your most optimistic video on Entropy, you brilliantly pointed out that so much is not understood about Entropy and what it means for the future of life, and even what it means period. Lee Cronin and Sara Imari Walker have voiced your concerns for how Entropy is understood in their Lex Fridman and other interviews. They are on an exciting track with Assembly Theory. They welcome criticism and feedback to improve Assembly Theory and as they take criticisms to heart and flesh out assembly theory, it shows interesting possibilities for new insights into physics, biology, chemistry, selection, etc. They are even exploring applying Assembly Theory to Math!
Gosh! This is fascinating! I recognise this as one of those abstract concepts that will fascinate me for the rest of my life!
Please revisit this whenever something significant arises.
Hard to describe, but I gave this some thought. I imagine a maze with a single straight path from A to B as being of minimal complexity. Each time you have to stop (a node) and take a different direction seems to be an increase in complexity. The more choices (nodes) there are to get between A and B, the greater the complexity; the 'path length' increases too, but at a lower rate. Sounds similar to why quantum physics is complex: many combinations to get from A to B (in my limited understanding, anyway).
Algorithmic complexity still seems to me like one of the best measures, and one that could be extended to accommodate more different types of complexity. For example, if you write a module that describes the particle physics completely and allows you to operate on the level of molecules, the rest of the algorithm will now be much more complex for the baby than it is for the rock.
You sort of define the cut-off point for complexity. By defining micro-physics complexity as a module, you decouple it from whatever other types of complexity you want to describe. By defining the boundaries of the code modules you would define the boundaries of the complexity types. They could also overlap and include each other.
So overall, algorithmic complexity seems like a very universal way to quantify complexity.
The only algorithm you need to make a baby is the laws of nature. There is no cut-off point. The baby is a result of the same laws as the rock. Also check out Stephen Wolfram's work on how equally simple algorithms may give rise to vastly different outcomes; some outcomes are periodic, some are chaotic.
Except it does fall short in some cases. The whole concept of complexity is higher-dimensional; it's likely that in a system with infinite configurations there can also be an infinite number of ways to describe them.
I am wondering if Shannon information could be used to describe complexity? Higher missing information needed to describe a system would imply higher complexity. Larger space of configurations, higher missing information, higher complexity.
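That suggestion is easy to make concrete; a minimal sketch (Python; the toy probability distributions over system configurations are invented for illustration):

```python
# Shannon entropy as "missing information": the more (equally likely)
# configurations a system can be in, the more bits you need on average
# to pin down which one it actually occupies.
from math import log2

def shannon_entropy(probs: list[float]) -> float:
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))        # 0.0 bits: state fully determined
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.25] * 4))   # 2.0 bits: larger configuration space,
                                     # more missing information
```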
Complexity is the only thing that humbles over-confident scientists who think we basically know everything and just need to figure out the details. The elephant in the room surrounding this topic couldn't be more massive
When I saw the title of this video, I exploded with delight. I have been pondering complexity for at least around 20 years. I had originally been thinking of how before computers were built, no one would have predicted what changes their development would bring. Even if someone had theorized 'oh maybe some day we could make a machine that does math at fantastic speeds', it is very unlikely they would have realized that would change the whole of the world (although Ada Lovelace seems to have understood far more of that aspect far earlier than anyone else, judging from her notes in Babbage's journals). And I wondered what might be the next thing... I already knew chaos theory places tremendous restrictions on what mathematics is even capable of, and some things about fundamental limits to computation. Last night, I was watching some film that involved a character being given the ability to have any wish they made granted... and without a doubt the wish I would make in that circumstance would be "I want to understand the fundamental nature of complexity to an extent that I would be able to engineer it through creation of micro-scale interacting components that give rise to a desired macro-scale emergent phenomenon."
I want to know exactly, precisely, how many atoms must be collected before the concept of "temperature" has meaning and gives rise to the phase diagram and different states of matter. I want to know what aspects of the constituents determines the behavior of the aggregate. It is crystal clear that mathematics can not handle this. We are dealing with non-linear dynamics here, and math just simply can't do that. They brag about finding stable orbits for the Three Body Problem, while I'm only interested in the 6.022*10^23 Body Problem.
Right now I think the strongest possibility of where a fundamental understanding of complexity might come from is AI. Not that I think an AI will figure out the problem, I don't believe that at all. But, the specific way that modern AI is built, with networks of trillions of nodes, is an actual example of a system which can be pulled apart, manipulated freely, and experimented with that is a true complex system, where the "More is Different" adage holds true. We can build systems at a large enough scale now that I imagine people are doing experiments trying to figure out either WHY certain large-scale properties emerge, or else how to modify the emergent behavior through manipulation of the architecture of the nodes or their 'interaction'. We might not develop a formalism to make this engineering easy, but I imagine we will definitely make some progress towards recognizing it, setting bounds of how much influence and of what kind can be had through interventions of different kinds, etc. They are nothing but big matrices of floating point numbers that use matrix multiplication, and yet we can build them and have them understand language (for certain definitions of 'understand'), generate images, extract data from noisy inputs, etc.
I have pinned down what I think is a pretty robust definition of consciousness, and it is coherent with everything I've read about both human and animal consciousness. Consciousness is an emergent property of systems with a 'large enough' number of adaptive components embedded within a feedback loop inside a highly-enough correlated environment. Every part of that definition is important. If you break the feedback loop, say through introducing total sensory deprivation, you can not maintain consciousness. I also believe, similar to the emergent property of temperature, that this produces a phase diagram where there are different 'states' of consciousness similar to states of matter. I am uncertain yet whether the number of adaptive components is actually the feature that controls the different phases. With the simplest brains, flatworms and things like that, behaviors that appear complex are possible but it's a very bad idea to rely on intuitive perception like this. Those behaviors can often be mapped pretty directly to correspond to features in the immediate environment. The larger the number of neurons (roughly), the further away from the environment and present-time the consciousness can get, eventually accounting for some intuitive prediction. That gives rise to 'emotion' which is in-brain anticipation of interoceptive changes. Beyond that, development of language is a phase shift and enables an entirely different state of consciousness, enabling perception, and thus thought, of things which are not immediately present. This also enables abstraction, and is the first place at which "understanding" really exists.
If we get to a point where we can engineer complex systems... every other technological advancement of the human species will look like idiocy. It would probably be harder to describe something that would not be possible if we had a true deep understanding and ability to engineer complexity than anything. We'd be able to 'build' things that build themselves and that would look like literal magic to anyone alive today. Too much CO2 in the atmosphere? Whip up a custom bacterium that uses solar power to pop it apart and build the carbon atoms liberated into perfect sheets of graphene. Maybe then do another that'll slice up the graphene sheets into strips and weave them into cables.
The video fails to relate complexity to the degrees of freedom of individual parts over time, not only in spatial networks. That's why the baby is much more complex: cells are born and die at a rate of millions per minute to maintain a sustainable body. Cells themselves have specific roles and options, all related to options given present input data AND memory data. See the Kurzgesagt videos "Consciousness - How Unaware Things Become Aware" and "What Are You?". Memory is key for algorithm evolution.
I recently learned Leonard Susskind's #1 issue is now complexity, this is all new to me and very interesting, thank you
Very good video, Ms. Sabine! And I agree about the importance of the study of complexity! Thanks!
Thank you Sabine, never stop making them videos! 😂
Can't wait for a longer attention span in my 5- and 6-year-olds to binge watch this and more... :) Stay safe, and have a good, happy holiday time! And, umm, as this video is almost screaming for it, can we have an update/info video about AI, its current state, all the different modalities being utilized and orchestrated (AI agents?), and how it seems to exponentially evolve? Skip a step in complexity, connecting artificial neural dots 😮😅, and go straight for explaining what superintelligence means, even in the context of this video? (A neural net saving 800 years of materials research in one very specific area of science (GNoME), etc.)
Thank you so much. Whether I see this particular video or not, it's always nice to leave listening and watching someone conveying an interesting topic with a little more knowledge, and maybe even some provoked thoughts ❤🤙👌👍
Fascinating indeed. As a computer scientist, my perspective might be biased... But, for me, we just need to define the variables and rules to add them (since they're different things)...
Which doesn't make it much easier... Does it? 😬
Anyway, thanks, Sabine! 😊
Stay safe there with your family! 🖖😊
And happy holidays!
Computability is a challenge to tackling complexity for sure, or, as Sabine said, we would just use the fundamental equations in physics and be done with it (*sigh*), but for the combinatorial explosion, eh? Where we have success in high-complexity domains like biology is where we have good heuristics that sufficiently model the fundamental behaviors we are interested in predicting. The number of NP problems nature tackles every day is quite impressive, until you recognize that it's using all of the atoms in the entire universe to do so... an option that is mostly closed to us, unless we redefine 'predict' to mean 'observe'. To me, one of the essential features a theory of complexity needs to account for is the dynamics of a system. If you take a human body and freeze it in time, it's not really a complex system anymore, it's just a particle arrangement. A rock that is moving through time has dynamics, but they are relatively static (non-complex) on a human time scale. Geology has complexity, but only on geologic time scales, so clearly the complexity of a system somehow includes a scale factor based on the dynamics of the system, and comparing system complexity must include that variable. I mean, on a geological time scale the inanimate and decidedly non-complex inorganic chemistry of the earth somehow evolved into organic chemistry and then life, which makes the complexity of rocks pretty interesting. Linguistic challenges are clearly at play here, as we continue to overload mathematical concepts with human linguistic baggage.
@John-zz6fz Yeah, it's a huge task and the number of variables is just unreal... Unless we really are in a simulation and can, somehow, access the source code. 😬
@@John-zz6fz How would functions (where certain subsystems emerge which serve a superior "super-system" - known as functional differentiation, as seen in the brain or society) and evolution (a REAL evolution - like biological life, society, the cosmos as a whole) add to the complexity of a system? Compared to just complex chaotic phenomena like the weather.
@bjornrie The increase in complexity would be due to emergent phenomena. In this case the term complexity has dual use. In the computer-science frame we mean computational complexity, and if the system is calculable in a Turing sense then no new complexity emerges. From the information-science frame, complexity isn't the same as computational complexity. It's not totally clear what the term actually means in information science, and that point is made by Sabine as she's saying we need to make progress on a formal mathematical definition of complexity in the frame of complexity theory.
Thank you so much, Sabine. This is one of the best science videos I've ever watched on RUclips. I admire how you synthesized key aspects and insights regarding the "multiverse of complexities" in just 18 min. I am part of the Engineering Systems community, and understanding how to assess [socio-technical] complexity is a major research theme for us. I plan to watch this video over and over for inspiration.
That's so cool. I recently fell in love with the idea of assembly theory after listening to Lee Cronin, and I am starting to see assembly theory everywhere now. So happy that you commented on it as well. I would love to hear more from you about it, especially on the point that assembly theory requires time to be essential for rising complexity, and the point that Cronin made that the past is deterministic but the future isn't, because emerging complexity is so huge that it's unpredictable by just looking at the laws of physics! Thanks for your insights.
In my opinion it's more of a value judgement on what is more difficult for us to describe. The early universe, even though there is so much stuff and energy interacting, is actually fairly easy to describe mathematically, whilst a baby is incredibly difficult.
It would appear, in terms of complexity, that the early universe is in essence the same as the universe today; there must be at least equal unseen complexity in the early universe, since it describes and performs the future it will become. Unlike DNA, which iterates within an outside medium that provides external elements to make a baby, the early universe is self-contained and must have the same complexity at all points in time. The early universe must have unseen internal complexity to produce a baby in 13 billion years, as well as civilization.
The important thing is realizing that this inherent complexity can also be a characteristic of a system through time. We need to consider time and space as equivalent, even though we are stuck in one and free in the other ;) Something that starts extremely simple can become complex, and in a sense then it "already is complex". @@myroomtv2014
Bad humor below:
A rock is harder than a baby.
Usually.
@@myroomtv2014 I don't agree that the initial conditions of a system are necessarily complex for the result of the evolution of that system to be complex. The rules themselves breed complexity by their very nature. It seems enough to provide a very low entropy seed and let the rules create the complexity.
Thank you for sharing your thoughts
This might be my favorite of your videos. This is a ton of 'complex' ideas presented beautifully.
We know beauty when we see it. Sabine ❤.
You are nice to see every day but even better to hear.
Thanks.
Thank you for this very insightful video, Dr. Hossenfelder.
The problem with describing complexity in detail, is that it is saddled by the mathematical theorem of incompleteness.
I wouldn't want to rain on this wonderful parade - but as a philosopher, I'm firmly convinced of it.
Merry Christmas!!!
Anthony
Pleased to see that Sabine has returned to her core capabilities, resulting in thought-provoking questions backed up by various facts based on science.
While studying Graph Theory for my MA, I came across Ramsey Theory and was immediately intrigued.
Ramsey Theory deals with complex systems (usually finite, discrete networks), and how large the networks can be before some certain substructure must manifest within the network.
The "simplest" example is how big can a simple complete binary (or 2-color) graph K_n be such that K_n avoids a monochromatic triangle (or 3-cycle C³). The answer is 5. K_5 is the largest simple complete graph that can be drawn in 2 colors without a triangle (3-cycle) of the same color edges.
Sometimes I wish I'd gone into Ramsey Theory because it's still in its infancy and most of even the simplest questions are still open and an active area of research.
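For the curious, here is a quick brute-force check of that K_5 fact (Python; the pentagon/pentagram coloring below is one standard triangle-free 2-coloring, and since R(3,3) = 6, no such coloring exists for K_6):

```python
# Color K_5's edges by the pentagon/pentagram rule: edges between neighbors
# on the 5-cycle get color 0, the diagonals get color 1. Then verify that
# none of the 10 triangles is monochromatic.
from itertools import combinations

def color(i: int, j: int) -> int:
    # vertices 0..4 on a cycle; cycle edges have index difference 1 or 4 (mod 5)
    return 0 if (j - i) % 5 in (1, 4) else 1

mono = any(color(a, b) == color(b, c) == color(a, c)
           for a, b, c in combinations(range(5), 3))
print("monochromatic triangle found:", mono)  # False
```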
One of your best videos in a while! More discussion of this, perhaps looking at specific applications of different complexity measures would be great.
This is a mind-boggling topic, as is the topic of consciousness. We are obviously missing a huge chunk of understanding of the world, which is right under our nose, but even asking questions and trying to give the simplest definitions is like carrying big stones.
For the last few days I have been studying complexity, and now your video... ❤
I really liked "scienc". It had a big gap!
I don't think I agree that the algorithmic complexity of all objects is necessarily the same. Just because everything can be described by the same general algorithm doesn't mean there's no shorter program specific to each object. File data is vaguely analogous - for two files of the same size, there's one algorithm that can describe both (simply enumerate the bits) but with a compression algorithm, a highly ordered file will compress to a much smaller blob than a highly irregular file.
Perhaps what we think of as "irregular" is an illusion, in that we use it in order to describe something that does not have any patterns that are recognizable to us currently. It could just be that we don't yet understand the patterns, and that even highly chaotic systems are ordered in strange ways.
@@bodeeangus9957 That is certainly a possibility, though in principle we could find a dataset that cannot be produced by any program of length shorter than the dataset itself (a simple counting argument shows such incompressible strings must exist).
You're right, it's not just the algorithm but the inputs, which in turn may be expressed more compactly. Compression is a good intuition for this, but the video seems to assume that these programs can have free variables/inputs, and they shouldn't.
@@bodeeangus9957 Kolmogorov complexity goes even deeper than that; look into Chaitin's constant, for example. It can be compared in some sense for actual programs (you can objectively say which of two is shorter, and if I'm not mistaken you can even assume a particular language without loss of generality), but you can't do it in general, and you can't compute the Kolmogorov complexity of a given string, because doing so would solve the Halting Problem.
Concrete examples exist in cryptography. A pseudorandom generator, for example, is an object which, without knowledge of the seed, appears to have unbounded complexity, but which in a very tangible sense has only a constant amount of it.
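A minimal Python sketch of that point (using the non-cryptographic random module purely for illustration): the stream below looks incompressible to a general-purpose compressor, yet its full description is just the generator plus one seed.

```python
import random
import zlib

random.seed(42)  # the entire "hidden" description: one seed plus the algorithm
stream = bytes(random.getrandbits(8) for _ in range(100_000))

print(len(zlib.compress(stream)))  # ~100,000 bytes: zlib finds no structure
# Yet anyone holding the seed can regenerate every byte exactly,
# so the complexity given the seed is tiny.
```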
To me, complexity is about the amount of description needed for the behavior of an item and all the parts within it.
Nope.
I kind of agree - the only issue is quantifying "amounts" of description. I can describe a pencil with the word "pencil" and a human with "human", so using "number of words" isn't quite enough. Possibly "number of variables" is more in the right direction.
5:30 "Lines connected by dots". Hmm. Quite an interesting reverse point of view. I usually think of graphs as a collection of points connected by lines...
12:00 "It’s not all that hard to tell a brain from a piece of wood (in most cases)". Thank you, you made my day! 😀
Blooming complexity is our ideal. From this point of view, we can define *_good_* and *_evil_* : what increases complexity is *_good_* , what reduces/destroys complexity we perceive as *_evil_* .
True, the Kolmogorov complexity of the math governing the evolution of all systems is the same, but that's not what people mean by the complexity of a system. That complexity also has to include a description of the system's current state. In that way, the Kolmogorov complexity of the baby far exceeds that of the rock. The two really difficult problems with Kolmogorov complexity are: 1. A general mechanism for computing its value is impossible; optimally minifying an algorithm is as hard as the Halting Problem, the classic incomputable problem. 2. Like many measures of complexity, it depends on the granularity at which you want to describe the system. For example, if you want to describe the macrostate (temperature, pressure, phase) of a system, the complexity will work out to be much lower than if you want to describe the microstate (position and momentum of every particle).
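The granularity point lends itself to a toy sketch (illustrative only, in Python): the microstate description grows with the number of particles, while a macrostate summary stays a handful of numbers.

```python
import random
import statistics

N = 10_000
# Microstate: one velocity per particle; description grows linearly with N.
velocities = [random.gauss(0.0, 1.0) for _ in range(N)]

# Macrostate: a couple of summary statistics; description is constant in N.
macrostate = {
    "mean": statistics.fmean(velocities),
    "stdev": statistics.stdev(velocities),  # a rough temperature proxy
}

print(len(velocities))  # 10000 numbers to write down
print(len(macrostate))  # 2 numbers, no matter how large N gets
```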
15:01 I don’t think that’s what they mean by “difficult to assemble”. Obviously it’s not hard for two very in love oppositely sexed humans to make another human, but imagine trying to “assemble” a human, with all their DNA and cells and neurological complexity in place.
A good way to think about it is through statistics of natural evolution. In the Earth’s early years, the overwhelming majority of structures which developed were extremely low in complexity, and on most planets, that pattern continues. The Earth, as it cooled down and collected useful elements from nearby supernovae, was able to become a highly rare habitat for the development of organic molecules capable of self replication. The majority of systems in the universe are not suitable habitats for this assembly, and if they are, those microscopic living molecules will find it difficult to evolve to anything larger and more intelligent unless they are within a perfect “Goldilocks” zone like Earth.
Assembly theory might not be able to account for complexity seen in how structures change over time in certain ways, but keep in mind that from a certain perspective, evolution and natural change are part of this “assembly” notion. When we want to “create” our own complex biological structures, we always have to have some sort of sample cells, DNA, some kind of naturally occurring set of molecules containing information such that it can build itself; the system is too complex to produce in a factory the way you’d produce computer hardware.
Can you make a video about the discoveries that the human body uses quantum physics in our DNA, sense of smell, and brains, and about how this could possibly make us non-deterministic beings, since quantum physics is probabilistic, not deterministic? This is the argument of some mathematical physicists like John H. Conway, Simon B. Kochen, and Stephen Hawking himself.
Isn't high entropy just so much complexity that we give up trying to understand the detail and call it random?
Every physicist ever: dy/dx = x'' + x' + y + H.O.T., but we can ignore the Higher Order Terms
QM implies it's all random anyway. There's also the point that most high-entropy situations are nearly indistinguishable from each other and can usefully be approximated as random for that reason.
Adjectives, including "high," are ambiguous shorthands that actually refer to a relative comparison to an unstated alternative: "X is high" actually means X is higher than some unstated Y. There are typically many possible alternatives for comparison, which is why failure to explicitly state the unstated alternative creates ambiguity and is often misleading.
What your comment is doing is proposing a definition... a particular threshold that distinguishes "high entropy" from "non-high entropy." That's fine, but different thresholds could be proposed. For example: Entropy is "high" when entropy has grown to the point when a complex system can never afterward exist. I think this definition of high entropy would be more consistent with Sabine's point that complex systems are neither low entropy nor high entropy.
@@festusmaximus4111 Isn't QM all about finding patterns in all this "randomness", which could also be close to the highest possible complexity when you assume particles interact with every other particle in the universe?
So complexity seems to depend on perspective. What you see, how you look at it, what it does for you, or why it functions that way.
I've been fascinated by complexity for years (probably decades).
The best (intuitive) explanation I've come across is "everything humans MAKE is complicated, everything humans MANAGE is complex".
It doesn't help scientifically, but it helps make sense of "things".
Good video. The other problem with complexity getting in the way of progress is the lack of agreement on how to define it, which leads to a lot of conflation.
Exactly!
IN THE INTEREST OF FINDING THE THEORY OF EVERYTHING:
SOME THINGS MODERN SCIENCE APPARENTLY DOES NOT KNOW:
Consider the following:
a. Numbers: Modern science does not even know how numbers and certain mathematical constants exist for math to do what math does. (And nobody as of yet has been able to show me how numbers and certain mathematical constants can come from the Standard Model Of Particle Physics).
b. Space: Modern science does not even know what 'space' actually is nor how it could actually warp and expand.
c. Time: Modern science does not even know what 'time' actually is nor how it could actually warp and vary.
d. Gravity: Modern science does not even know what 'gravity' actually is nor how gravity actually does what it appears to do. And for those who claim that 'gravity' is matter warping the fabric of spacetime, see 'b' and 'c' above.
e. Speed of Light: 'Speed' is distance divided by time, distance being two points in space with space between them. But yet, here again, modern science does not even know what the space and time that make up 'speed' actually are, while also claiming that space can warp and expand and time can warp and vary. So how could they truly know what the speed of light, which they utilize in many of their formulas, actually is? The speed of light should also warp, expand, and vary depending upon what space and time it is in. And if the speed of light can warp, expand, and vary in space and time, how then do far-away astronomical observations actually work, when they are based upon light and a speed of light that could warp, expand, and vary in actual reality?
f. Photons: A photon swirls with the 'e' and 'm' energy fields 90 degrees to each other. A photon is also considered massless. What keeps the 'e' and 'm' energy fields together across the vast universe? And why doesn't the momentum of the 'e' and 'm' energy fields, as they swirl about, fling them away from the central area of the photon?
And electricity is electricity and magnetism is magnetism varying possibly only in energy modality, energy density and energy frequency. Why doesn't the 'e' and 'm' of other photons and of matter basically tear apart a photon going across the vast universe?
Also, 'if' a photon actually red shifts, where does the red shifted energy go and why does the photon red shift? And for those who claim space expanding causes a photon to red shift, see 'b' above.
Why does radio 'em' (large 'em' waves) have low energy and gamma 'em' (small 'em' waves) have high energy? And for those who say E = hf; see also 'b' and 'c' above. (f = frequency, cycles per second. But modern science claims space can warp and expand and time can warp and vary. If 'space' warps and expands and/or 'time' warps and varies, what does that do to 'E'? And why doesn't 'E' keep space from expanding and time from varying?).
g. Energy: Modern science claims that energy cannot be created nor destroyed, it's one of the foundations of physics. Hence, energy is either truly a finite amount and eternally existent, or modern science is wrong. First Law Of Thermodynamics: "Energy can neither be created nor destroyed." How exactly is 'energy' eternally existent?
h. Existence and Non-Existence side by side throughout all of eternity. How?
The video fails to relate complexity to the degrees of freedom of the individual parts through time, not only across spatial networks. That's why a baby is much more complex: cells are born and die at a rate of millions per minute to maintain a sustainable body, and cells themselves have specific roles and options, all related to options given present input data AND memory data. See the Kurzgesagt videos "Consciousness: How Unaware Things Became Aware" and "What Are You?". Memory is the key to the evolution of the algorithm.
@@SabineHossenfelder Actually, now I think about it: a few of my lecturers said, when we were studying solvation chemistry, vapour points, supercritical fluids, and substances which sublime, that you reach critical points where the properties change, whether from temperature, pressure, or concentration, and that there is a lot to understand within these processes.
Hmm... Is this not a dead end? Complexity is a measure of a description, not of a thing. We can describe one thing many ways. We can apply different complexity measures to one description. There is no single measure. You propose some combination of emergent behavior, edge of chaos, and self-organisation. But even the simplest system has emergent behavior. And the self-organisation of a system can lie in a simple coupling of organising and organised components, independently of how complex the coupled components are.
Hmm. It seems that by that value system, everything we have now is a dead end. Which, in a sense, it is, unless we suggest the answer to life, the universe, and everything has been sitting under our noses and we all missed it. But as a tool to advance our understanding before later being thrown away, complexity theory as a vehicle for other techniques to model complex questions does seem very powerful.
It is an opportunity for a thought exercise, which, provided it does not become self-serving and an end in itself, has contributed to breakthroughs in the past.
@@Aoi-mirror "Complexity theory" is a phrase that covers a jumble of ideas. How do you measure it?
Life, the interiorizing complexification of matter... a gift of anti-entropy 👈🙏
Actually, life drives entropy at a much faster rate. That your extremely local patch might seem to exhibit lower entropy says little about the wider environment. Look at climate change as a simple example.
@@SebWilkes How is change in climate an example of entropy? Life takes the energy that would otherwise be entropically wasted and concentrates some of it and builds greater complexity and organization. It is the only thing we know of that goes against the prevailing entropy gradient.
@@obsidianjane4413 CO2 in the atmosphere is a higher entropic state than oxidisable carbon in the ground. The amount of additional entropy you generate in creating a building, say, greatly outsizes the entropy you reduce by having a few crushed rocks suspended above you.
@@SebWilkes That isn't "life" that is just humans mucking about. Very tiny percentage compared to the total on that particular thing.
@@obsidianjane4413 I don't think it is limited to just humans, but without a better argument I'll concede on that. Nonetheless, it is worth pointing out that since life, or at the very least intelligent life, is so good at increasing entropy, proponents of biological determinism have argued this is a reason to believe that a system which seeks to maximise entropy (and its rate of increase) would be likely to bring about life.
"Forget theories of everything. The next big thing in science is theories of complexity."
Loved it.
It's nice to have a full-length video again; very interesting topic. The reason I subscribed was for excellent videos like this. Not a fan of 1-2 minute videos and "short form content" that can't go into detail. Thank you!
I have been thinking about complexity in another context: in our everyday lives! Our society, especially in the age of social media, has demanded an absurd amount of knowledge on a very wide range of topics to navigate relationships with other people. Under penalty of social execration! The tension this has generated is absurd and the stress is inevitable. Our mental capacity to deal with this is too limited to reasonably demand understanding and acting in accordance with so many "isms" that postmodernity has created.
We will let our robot overlords shoulder that burden.
Like my physics teacher said, "Complexity is simplicity twice."
Does that, “simplicity twice” mean “self and all the rest” I think is all there is?
That makes no sense.
@@wizzyno1566 Structure and arrangement are all simple outcomes from thoughts.
Complexity is dual to simplicity.
Randomness (entropy) is dual to order (syntropy).
Points are dual to lines -- the principle of duality in geometry.
Points (nodes, vertices) are dual to lines (links, edges) -- graphs and networks are dual.
Convergence (syntropy) is dual to divergence (entropy).
Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
Syntax is dual to semantics -- languages.
If mathematics is a language then mathematics is dual.
Real is dual to imaginary -- complex numbers are dual (photons. light or pure energy).
The integers are self dual as they are their own conjugates.
Subgroups are dual to subfields -- the Galois correspondence.
Sine is dual to cosine or dual sine -- the word co means mutual and implies duality.
"Always two there are" -- Yoda.
@@hyperduality2838 Well said, Yoda!! Self and the non-selves!!
Sabine, the easiest way to explain complexity is to think about if a "something" is hard or not hard.
If you look at it that way, you will realise that a rock is a lot harder than a baby.
😂
This is why AI is getting so much attention. All complex systems are essentially networks. AI is a network. It takes a network to decode the properties of a network. AI (and indeed intelligence itself) understands complex systems to whatever degree it is able to build a sympathetic network that responds favourably to that complex system. We will probably never have equations to solve complex problems, because those equations would likely be complex systems themselves. We will solve those questions by using progressively trained networks (AIs) to decode complex networks that we have no other tools to predict. Our tools, though functional, will be just as incomprehensible as the systems whose problems they solve. We understand how AI works. We understand the output of an AI from complex input. But we can no more understand the knowledge (training) inside the AI than the problem the AI solves, because that trained knowledge is itself a complex system.
Love the part: complexity is between order and entropy...
I'm not smart by a long shot, but I do love thinking about the universe and the state of its being!
I believe that almost all of us make things way too complicated; trying to make them uncomplicated... that's where the magic happens. It looks like fusion at one time or another, and maybe like absolute zero in one way or another... either way it's all very, very amazing, even without thinking, just soaking up the sights... and as that information gathers and filters your experience through your brain, it seems to me like instinct/self-preservation and many more such things are like the clumping of cosmic material in a specific area until a galaxy or a new planetary system has amassed enough energy to spark fusion reactions.
To me, our personality traits and thought processes largely depend on pattern recognition, which, simply put, keeps us alive and, what's more, thriving with creation and emotion. Which is to say, both creativity and emotion seem to be the excess energy of "experience impact events".
I'm gonna stop now because I will literally go on ten thousand more words.
Great and smart work, helpful to think some deeper thoughts at the end of the year, thanks so much🌻
So you wait till the end of a year to do a little thinking. Then only to listen to someone else do the thinking for you.
@@PlanetEarth3141 Hi, are you here to communicate? That's indeed what sometimes makes thinking easier.