Full podcast episode: ruclips.net/video/Osh0-J3T2nY/видео.html Lex Fridman podcast channel: ruclips.net/user/lexfridman Guest bio: Edward Frenkel is a mathematician at UC Berkeley working on the interface of mathematics and quantum physics. He is the author of Love and Math: The Heart of Hidden Reality.
Gödel's first incompleteness theorem states that for any consistent formal system that is sufficiently powerful to represent arithmetic (which includes most foundational systems of mathematics), there exist statements within that system that are true but cannot be proven within the system. In other words, there are true mathematical statements that cannot be derived or proven using the rules and axioms of the system.
The key idea behind Gödel's theorem is the concept of self-reference. Gödel constructed a mathematical statement that asserts its own unprovability within a given formal system. This statement, known as Gödel's sentence or Gödel's formula, essentially says, "This statement is unprovable." If the system could prove this statement, it would be inconsistent because it would be asserting both its own provability and unprovability. On the other hand, if the system cannot prove the statement, it implies the existence of true but unprovable statements.
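A toy sketch of the arithmetization step that makes such self-reference possible: formulas are encoded as numbers, so statements about formulas become statements about numbers. The symbol table below is invented for illustration; Gödel's actual encoding covers a full formal language.

```python
def primes():
    """Yield primes 2, 3, 5, ... by trial division (fine for toy sizes)."""
    n, found = 2, []
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

# Invented symbol table for a tiny fragment of arithmetic notation.
SYMBOLS = {'0': 1, 'S': 2, '=': 3, '(': 4, ')': 5, '+': 6}

def godel_number(formula: str) -> int:
    """Encode symbols s1..sk as 2^c1 * 3^c2 * 5^c3 * ... (unique by factorization)."""
    g, gen = 1, primes()
    for sym in formula:
        g *= next(gen) ** SYMBOLS[sym]
    return g

def decode(g: int) -> str:
    """Recover the formula from its Gödel number by factoring out each prime."""
    inv = {v: k for k, v in SYMBOLS.items()}
    out, gen = [], primes()
    while g > 1:
        p, c = next(gen), 0
        while g % p == 0:
            g //= p
            c += 1
        out.append(inv[c])
    return ''.join(out)
```

Because the encoding is reversible, a formula can "talk about" the number that encodes another formula, including, with the diagonal lemma, the number that encodes itself.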
Gödel's theorems challenged the notion of completeness and consistency within formal systems and had a profound impact on the philosophy of mathematics. They demonstrate inherent limitations of formal systems and suggest that there will always be truths that lie beyond the reach of any particular system. These theorems have also influenced the field of computer science, particularly in the areas of artificial intelligence and algorithmic complexity theory.
It's great to hear about Gödel's incompleteness theorems. What he didn't mention is the motivation behind how Gödel arrived at them. At the end of the 19th century, a genius named Georg Cantor (the founder of set theory) wanted to understand God (just as Einstein wanted to understand God). For him, God was represented by infinity, so he set out to understand infinity, which humanity had failed to grasp for thousands of years (and still hasn't to this day). He asked himself a simple question: we can add numbers, subtract them, and so on, but what about infinities? To this end, he constructed what is now known as Cantor's set theory. Unfortunately, his basis of reasoning (his axioms) contained a contradiction, now known as Russell's paradox. Nevertheless, he proved that the set of natural numbers is strictly smaller than the set of real numbers, and he tried to determine whether there is a set of intermediate size between the naturals and the reals (the question now known as the continuum hypothesis), but failed to settle it. Partly because of these developments, David Hilbert (widely considered the greatest mathematician of his time and of the 20th century) proposed what is known as Hilbert's program. In it, he posed a number of problems, among them the quest for a basis of reasoning that is both complete (that is to say, if a mathematical statement is true, then it is provable from this basis of reasoning) and sound (that is, the basis of reasoning proves only true statements). Note that these two properties, completeness and soundness, are also the two fundamental properties we ask of algorithms. Then came Gödel. He originally wanted to prove that such a theory exists, but ended up with the incompleteness theorems instead. The results were so earth-shattering that they completely destroyed Hilbert's and mathematicians' dream of having a sound and complete theory for mathematics.
On top of that, the mathematics he used to prove the incompleteness theorems was so new that only a handful of mathematicians understood it. Amongst the mathematicians who understood Gödel's results was Alan Turing. What Turing showed is that Gödel's result implies there is no machine/algorithm that can prove/determine whether an arbitrary program on an arbitrary input will stop or run forever. This is Turing's halting problem. It is important to note that all of this led to the birth of computer science and eventually to the famous P vs NP problem. Historically, Gödel was the first to informally pose the P vs NP problem, in a letter to John von Neumann. He also was the first to prove that Einstein's theory of relativity allowed time travel, and he gave the proof to Einstein as a birthday present.
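The diagonal argument behind the halting problem can be sketched in a few lines: given any claimed halting decider, one can build a "contrarian" program it must misjudge. The `claims_halts` oracle below is a stand-in assumption, since no real decider can exist.

```python
def make_contrarian(claims_halts):
    """Return a program that does the opposite of what the oracle predicts about it."""
    def contrarian():
        if claims_halts(contrarian):
            while True:        # oracle said "halts", so loop forever
                pass
        return "halted"        # oracle said "loops", so halt immediately
    return contrarian

# Defeat one concrete (and obviously wrong) oracle: "everything loops forever".
always_says_loops = lambda prog: False
g = make_contrarian(always_says_loops)

# The oracle predicts g loops, yet g halts: the prediction is wrong.
assert always_says_loops(g) is False
assert g() == "halted"
```

The same construction defeats any candidate decider, which is the heart of Turing's proof that no general one exists.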
@@blueskies3336 If you are capable of reading German, Dirk W. Hoffmann's "Grenzen der Mathematik" is a must. It is mathematically rigorous enough, but as comprehensible as possible for this hard topic. It gives a detailed historical account of the developments. Maybe one could try reading it with the advancing translation technology... I saw that in September 2023 a book called "Foundations of Logic" by Westerståhl will be published, it seems similar from its summary.
This is the first time I've been introduced to this guy. I like how he seems to be more of a "unification of knowledge" type of person, rather than just a mathematician. He draws from examples everything from math, to pop-culture, to eastern and western philosophy, and so on. Thanks again Lex!
I was surprised, amazed at how he connected them, but then remembered how wide-ranging Frenkel's knowledge of things and people is (beyond mathematics). I recall (and am now rewatching) his 2014 talk when, at the beginning, the computer system breaks down and he tells the tech, "Don't worry about it", and "we use computers so much these days, maybe it's a sign". Then he still goes on to speak with such humility, humour, and a weird humbleness. Weird because he obviously knows so much but believes he doesn't. The talk's from a book promotion tour for his Love and Math. If I remember correctly, he's got a few more of those Alan Watts-like comments. ruclips.net/video/YnqQ-BWMHrE/видео.html (if you're interested).
@@Leksa135 At around 4:46, you'll see them show Euclid's five axioms (also called postulates): "first axiom", "second axiom", etc. Underneath each of those is what is technically a definition of that axiom. To return to @FrigginTommyNoble's comment, the system of mathematics being used is called Euclidean geometry. But the Euclidean geometry system can't define its axioms or itself; it needed Euclid (i.e. a person) to define them. Hence, no mathematical system can define all of its own axioms, which is what Kurt Gödel proved mathematically; or rather, he proved that all such mathematical systems will be incomplete, hence Gödel's incompleteness theorems.
Here are brief statements of the theorems for those interested: Gödel's First Incompleteness Theorem states that "Any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete. In particular, for any consistent, effectively generated formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true, but not provable within that theory." Gödel's Second Incompleteness Theorem states that "For any effectively generated formal theory T including basic arithmetical truths and certain truths about formal provability, T includes a statement of its own consistency if and only if T is inconsistent." Just prior to publication of his incompleteness results in 1931, Gödel already had proved the completeness of the First Order logical calculus; but a number-theoretic system consists of both logic plus number-theoretic axioms, so the completeness of PM and the goal of Hilbert's Programme (Die Grundlagen der Mathematik) remained open questions. Gödel proved (1) If the logic is complete, but the whole is incomplete, then the number-theoretic axioms must be incomplete; and (2) It is impossible to prove the consistency of any number-theoretic system within that system. In the context of Mr. Dean's discussion, Gödel's Incompleteness results show that any formal system obtained by combining Peano's axioms for the natural numbers with the logic of PM is incomplete, and that no consistent system so constructed can prove its own consistency. What led Gödel to his Incompleteness theorems is fascinating. Gödel was a mathematical realist (Platonist) who regarded the axioms of set theory as obvious in that they "force themselves upon us as being true." 
During his study of Hilbert's problem to prove the consistency of Analysis by finitist means, Gödel attempted to "divide the difficulties" by proving the consistency of Number Theory using finitist means, and to then prove the consistency of Analysis by Number Theory, assuming not only the consistency but also the truth of Number Theory. According to Wang (1981): "[Gödel] represented real numbers by formulas...of number theory and found he had to use the concept of truth for sentences in number theory in order to verify the comprehension axiom for analysis. He quickly ran into the paradoxes (in particular, the Liar and Richard's) connected with truth and definability. He realized that truth in number theory cannot be defined in number theory, and therefore his plan...did not work." As a mathematical realist, Gödel already doubted the underlying premise of Hilbert's Formalism, and after discovering that truth could not be defined within number theory using finitist means, Gödel realized the existence of undecidable propositions within sufficiently strong systems. Thereafter, he took great pains to remove the concept of truth from his 1931 results in order to expose the flaw in the Formalist project using only methods to which the Formalist could not object. Gödel writes: “I may add that my objectivist conception of mathematics and metamathematics in general, and of transfinite reasoning in particular, was fundamental also to my work in logic. 
How indeed could one think of expressing metamathematics in the mathematical systems themselves, if the latter are considered to consist of meaningless symbols which acquire some substitute of meaning only through metamathematics... It should be noted that the heuristic principle of my construction of undecidable number theoretical propositions in the formal systems of mathematics is the highly transfinite concept of 'objective mathematical truth' as opposed to that of demonstrability...” Wang (1974) In an unpublished letter to a graduate student, Gödel writes: “However, in consequence of the philosophical prejudices of our times, 1. nobody was looking for a relative consistency proof because [it] was considered that a consistency proof must be finitary in order to make sense, 2. a concept of objective mathematical truth as opposed to demonstrability was viewed with greatest suspicion and widely rejected as meaningless.” Clearly, despite Gödel's ontological commitment to mathematical truth, he justifiably feared rejection, by the formalist establishment dominated by Hilbert's perspective, of any results that assumed foundationalist concepts. In so doing, he was led to a result even he did not anticipate: his second incompleteness theorem, which established that no sufficiently strong formal system can demonstrate its own consistency. See also Gödel, Kurt, "On Formally Undecidable Propositions of Principia Mathematica and Related Systems I", in Jean van Heijenoort (trans.), From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931 (Harvard University Press, 1967)
This was one of my favourite shows from Lex. Edward is a truly remarkable human being, and it's always beautiful to see so much love and compassion in one's heart.
I think it's valuable to also go a bit into the whole "completeness & consistency" thing. One could start with definitions and explain why they're important things we want from formal systems. Then one could proceed to a little history of how "cracks" in set-theory-based formal systems began to be discovered by Frege and Russell almost as soon as those systems arose. The story continues with a quick overview of the various approaches to these issues, like ZF(C), NBG, "New Foundations", and type theory (with Russell for a while, then dormant for a long time, then getting a big comeback with Per Martin-Löf and lots more interest recently with Homotopy Type Theory). This brings us to a classification and analysis of the underlying issue, that of predicativity and impredicativity; one might briefly explain what that is and why it's problematic, using various examples of paradoxes of (direct or indirect) self-referentiality. We can then explain how these developments and the predominant research institutions in Germany and Eastern Europe led to Gentzen's proof of the consistency of (Peano) arithmetic, and how that was a process of formalization which took us around 2.5 millennia from basic arithmetic and logic to Gentzen's proof. The importance could hardly be overstated. The "victory march" of formalization and the power of formal systems seemed assured. ... and then came Gödel.
"and then came Gödel" No, Gödel was first (1931) and Rosser's extension of Gödel's first incompleteness theorem (which is often falsely attributed to Gödel) was in the same year as Gentzen's proof, 1936. Gentzen evaded the incompleteness theorems: the system he used to show that PA is consistent is not stronger than PA, but miraculously manages to capture the structure of proofs in PA (in particular, it is not weaker than PA). I think Gentzen's result is even more breathtaking and few people (if any) understand it.
In another paper, Gödel developed an axiomatic system containing the self-referential statement, "This statement is false." He then proved, within the same system, that "This statement is false" is true. All he needed was the countable numbers (the set N) and a few very simple rules. On "emergence": taking simple rules and applying them to a simple structure to produce complex "behaviour" is also a subjective process. In what axiomatic system can you consistently define both "simple" and "complex", then show that there are no self-referential contradictions?
The Chomsky hierarchy defines classes of complexity of behaviour. Any Turing complete system can capture the fullness of complexity, and they are all well defined computational systems. Also, the simple structure of these systems can produce the most complex behaviour possible: irreducibly complex behaviour.
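The hierarchy mentioned here can be made concrete: a finite automaton handles regular patterns, while already matching a^n b^n requires unbounded memory (a counter or stack, i.e. context-free power). A minimal sketch with invented example languages:

```python
def dfa_accepts_ab_star(s: str) -> bool:
    """Finite automaton for the regular language (ab)*: two states, no memory."""
    state = 0                      # 0: expecting 'a', 1: expecting 'b'
    for ch in s:
        if state == 0 and ch == 'a':
            state = 1
        elif state == 1 and ch == 'b':
            state = 0
        else:
            return False
    return state == 0

def counter_accepts_an_bn(s: str) -> bool:
    """a^n b^n needs an unbounded counter: strictly more than any finite automaton."""
    count, seen_b = 0, False
    for ch in s:
        if ch == 'a':
            if seen_b:             # an 'a' after a 'b' is malformed
                return False
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:          # more b's than a's so far
                return False
        else:
            return False
    return count == 0
```

Each step up the hierarchy (regular, context-free, context-sensitive, recursively enumerable) admits strictly more complex behaviour, with Turing-complete systems at the top.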
In "another paper"? Which would that be? "All he needed was the countable numbers (the set N) and a few very simple rules." This comic reference to Peano arithmetic and gödelisation really hurts!
Great video. People take calculus and algebra classes for years, and no one explains the foundations of what they're studying as clearly as this guy does.
It’s so obvious: you can’t put all your eggs in one basket. We need competing ideas; we need theories that “contradict” others but work precisely because of this (meaning multi-valued logics, which sidestep Gödel’s theorems), etc.
Great explanation! Regarding the perception problem at 14:52, top-down perception in the brain can provide a simple explanation. Just like Lex mentioned for neural networks, the bottom-up sensory features lead to two activated outputs: 0.5 Rabbit and 0.5 Duck. However, top-down awareness in the human brain can only attend to one output at a time. So if you attend to the duck output, the duck neuron will be activated, and because the information now flows from top to bottom, all the neurons related to Duck will activate (none for Rabbit). That is why you suddenly perceive it as 100% Duck or 100% Rabbit, depending on which output your top-down awareness attends to.
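A toy numeric version of this story (my own illustrative model, not a claim about real cortex): equal bottom-up evidence plus a tiny top-down bias, amplified by a winner-take-all feedback loop, snowballs into a near-certain percept.

```python
import math

def softmax(xs):
    """Normalize scores into probabilities."""
    exps = [math.exp(x) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def perceive(bottom_up, top_down_bias, gain=4.0, steps=10):
    """Repeatedly re-feed the current interpretation as extra evidence:
    a toy winner-take-all loop (gain > 2 makes the 50/50 point unstable)."""
    acts = softmax([b + t for b, t in zip(bottom_up, top_down_bias)])
    for _ in range(steps):
        acts = softmax([gain * a for a in acts])
    return acts

# Equal bottom-up evidence for Duck and Rabbit; tiny attention nudge toward Duck.
duck, rabbit = perceive(bottom_up=[0.5, 0.5], top_down_bias=[0.1, 0.0])
assert duck > 0.9 > rabbit   # the 0.1 nudge becomes a near-certain "Duck" percept
```

Flipping the bias to `[0.0, 0.1]` makes Rabbit win just as decisively, matching the all-or-nothing flip the comment describes.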
Simply giving it the name "top-down" is neither much of an explanation nor very interesting. And who is this "you" picking what to attend to? That's the interesting stuff.
@@s.muller8688 You're being sarcastic, right? The brain is precisely not just a converter or reactor. It is indeed a creator. That's part of the reason you can't predict with 100% certainty what you'll want to eat for lunch.
@@iranjackheelson What you're going to eat at lunch is already stored in memory in the form of known data, which is then randomly chosen by thought. Nice try.
There are some nice ideas here about the emergence of complexity. Since nothing is ever exactly the same, repetition has an incremental effect, similar to what our episodic memory does: each time we see a cat, for example, the experience adds meaning to our definition of cat, even if it's the same cat in the same place. A bit like thirdness in Peircean semiotics: when we interpret a sign it can generate a new one, and even more so when, instead of a single triadic relationship, there is a whole network of them. By aggregation and interconnection, at some point it generates more complexity of evolving meaning, because it cannot stay the same, unlike in mathematics. In mathematics, if we add 1 + 1 it is always 2, but in reality that is impossible: the 2 will always be slightly different each time we add 1 + 1. In short, complexity has to emerge because repetition is impossible.
To be fair, as far as I understand Gödel's incompleteness theorem (I think that's what you mean, 'incompleteness' rather than 'impossibility'?), the theorem only states that there is at least one statement that is not within the universal set, and that by virtue of at least one, the whole 'completeness' is incomplete. It does not mean 'there are questions that cannot be answered'; it just means, 'mathematically, there is at least one question that cannot be answered'. I choose to interpret that question to be "Do we get pizza, or fish and chips?"
@@jaywulf Because an answer, if derived, would be used as a starting point for the system of logic, yet within that logic the axiom was already assumed without validation (unless you go meta).
9:28 No, no it doesn’t. It “proves” these things if you aren’t a shrewd enough logician to understand that a self-contradicting and semantically empty statement is simply “incoherent” within any coherent formal system (that is coherent with the Logos). Also the halting problem is solved by the Logos and by translating programs into a number of versions until a halting version is found (if the original does not halt by the time that other halting version is found). It’s all easy and these people that think LOGIC is imperfect or can’t prove everything are the fools, even if they’re “big names” and did some cool things. When the Math doesn’t “add up” it’s YOU/US (the fallible mathematician and fallible humans) that screwed up, it’s not Math and Universal Logic (Logos) that were actually imperfect at their Root and in their Essence. Perfect Logic is Perfect Logic and that is tautological and 100% certain. There is no “truth” that is not encompassed by the Universal Logic or (objectively) true without being so according to the Standard of Universal Logic. It is illogical and logically contradictory nonsense to speak of a “truth beyond Logos” or any aspect of Logic that cannot be symbolically represented with Formal Logic.
Don't forget that he has spent decades of his life dedicated to learning and communicating mathematics. It takes a lot of hard work to be able to do what he does.
Some people are natural at math but cannot teach or explain it. Others work harder to learn it, but do better conveying it. But most people are bad at what they do and don't care to try to learn everything they are capable of.
Incredible, as a mathematician and AI scientist I loved what he had to say. I practice magick for the exact reason he spoke of in the end. It's worth noting that Carl Jung also practiced magick and wrote and illustrated his dreams and other deep psych work in his Red Book. People think, "oh you're crazy you think Harry Potter is real," but it couldn't be further from the truth. I believe in targeted rituals that help expand my own understanding of what lies below the tip of the iceberg in my subconscious. As well as to harvest results from it. I think it's worth looking into if you find it intriguing.
Lex on Godel's incompleteness and Turing's undecidability: _"It's very depressing."_ Frenkel: _"Or life affirming!"_ Edward is ahead of Lex in spiritual development. When we embrace that we will never attain a _Theory of Everything,_ it opens up greater possibilities!
Reminded me of a study of the Mona Lisa's smile, where they discovered that the smile appears more obvious in our peripheral vision, which is why she sometimes appears to smile and sometimes does not. Maybe some of those trick images have an explanation that is more complex than mere subjectivity.
Loops are similar to the self-referential statements connected to some paradoxes in naive set theory. But those are completely resolved in modern set theory (ZFC). Also, there is no connection between these paradoxes and Gödel's results: Gödel's theorems are valid for every formal system strong enough to interpret Peano arithmetic.
I think Euclidean geometry is a good example of the difference between physics and math. In math, the 5th postulate is just an axiom. In classical physics, it's derived from observation; and in general relativity, which became necessary because classical physics no longer agreed with some observations (like the orbit of Mercury), it is only valid in the special case of flat space, which is a good approximation in some cases but strictly exists only in an empty universe or at single points where the curvature is 0.
Wonderful interview. If Penrose is right that consciousness is not an algorithmic computation, as per Gödel, then we are not wholly deterministic. A comforting thought!
Or that even if we are wholly deterministic, it is not in a way that can be replicated by any formal / computational approach known today. Penrose has basically said something similar.
@Mattje8 Very true. Do you think that if we are not wholly deterministic, Penrose's idea, if true, would explain the phenomenon? Could it be that superposition and entanglement provide humans with the ability to think outside the box, to get the "god's eye view"?
I really liked when he just softly said the key to genius: "to have an open-ended process", to let your conscious intelligence lead, rather than deciding on one thing or another.
This is a thank-you to Lex and his team for committing to in-person interviews with high-quality audio equipment. It makes all the difference to the audience experience.
Your talk with Edward took a most interesting path from subjective axiomatic foundations, and the creation of mathematics through logical inference & syntactic process, to Gödel's & Turing's theorems opening space for new things. You then jumped to cellular automata & emergent complex behavior, leading Lex to bridge to neural nets via figure-ground human perception. At this point I flashed on the interesting possibility that we are all (inescapably) training AI systems to be human in the most fundamental way possible. My thought was that the methods of our perception are, at their core, the syntactic evolutionary algorithms that created life and us; also, that we are incapable of recapitulating ourselves in our dogs, children, or AI in any other way. The intimacy of the evolutionary syntactic process that created us is the heart and soul of our perceptual engine. As we discover this syntax in recreating ourselves, we give this soul to our syntactic silicon selves, our AI. Everything is layers of complexity. My research leads me to conclude that we are close to an inflection point in our understanding of the second law of thermodynamics. When that happens, this next layer of complexity may provide a foundation for a true science of life... and AI is our ashlar test vehicle.
I shit my pants when he tied everything to complementarity. The golden thread of Platonism. Would love to hear this guy talk with Joscha Bach, John Vervaeke, Max Tegmark, Penrose, Graham Priest, or any other modern great who understands and promotes this principle.
I've been watching a lot of physics videos lately, mostly from Drs. Sabine Hossenfelder (Science without the Gobbledygook) and Matt O'Dowd (PBS Spacetime). Both channels take dips into the Quantum and explore some of the weirdnesses within. I've been thinking a lot about the ideas of complementarity and how they might relate to quantum superposition and the measurement problem. See, I'm starting to think that maybe the wave function doesn't collapse at all, but rather we are just observing one "version" of the particle. Its unobserved complement might still be just as real as the one we measure, but can't be seen at the same time. In a sense, maybe particles really are in two places at once, but we can only observe one at a time. Thoughts?
@@sabinrawr yea I have a lot of thoughts about this. Safe to say there is a lot of confusion surrounding QM as it is popularly understood. I tend to go with interpretations akin to objective collapse or Quantum Bayesianism.
I like the late Dr. Van Til's view that it's necessary to start with "the very first principle" of our creator's existence. THEN we can make sense of everything in virtue of that divine ultimacy of reality: top-down rather than bottom-up. It acknowledges the need for divine revelation, not only in order for us to know that the ultimate nature of reality is divine, but so we can have intelligibility for facts generally, regarding things we can see and touch.
Except it's been understood for a while now that there doesn't need to be a creator in the universe we inhabit. It's an established part of physics that structure can emerge from "nothing" due to spontaneous symmetry breaking, "structure" referring to the laws of the universe that give rise to what we see as material reality. It's understood that as energy levels lower, the many symmetries of the universe break down in ways that create more and more differentiated structure, as opposed to the extremely high symmetry of nothingness (everything being the same under all possible transformations).

This isn't just speculation. It's proven that the most fundamental laws of physics, the conservation laws, are all just emergent expressions of symmetries in the universe. They did not have to be decided on or written down by a creator; they just emerge due to the "shape" of reality. Even the most fundamental "things" in reality, quantum fields, emerge spontaneously as energy decreases. The four fundamental forces emerged spontaneously from one as the universe cooled: at the beginning of the big bang there was a single fundamental force, and as the universe cooled it broke into four different forces. More symmetry to less symmetry; more similarity in the universe to less, which is to say more structure emerging from less. No creator needed. The big bang itself is very likely the breaking of time symmetry, and the existence of things like "charge" is due to spontaneous symmetry breaks too.

Think of liquid water, which looks the same/symmetric from all sides, spontaneously turning into a snowflake as energy lowers and suddenly being symmetric from only six sides. That's a rough analogy of a symmetry break: the structure of the snowflake emerged from the less structured water completely passively, as the symmetries of liquid water broke with falling energy.

None of this proves there is no God, but it DOES mean that God isn't necessary or inevitable to explain the universe existing.
Structure can emerge where there was no structure, completely passively, as things go from high energy to lower energy, which is just to say as time passes.
This guy is saying some of the most interesting things I've heard, not only recently but in general. It all just affirms for me even more how, being human, you're limited and you're always putting your faith in something, whether consciously or not. There's just no way around it, but it seems like it's probably better that way than being all-knowing and all-powerful. As a Christian it just motivates me more to put my faith in Christ, the one who really knows it all and takes responsibility for us. I do love honest science and seeking the truth, of course, but knowing how flawed and limited I am, it just seems ridiculous to think I could find all the answers to life. Prioritizing my relationship with Christ has been the greatest thing for my life, and I'm just so glad I don't have to rely on my own strength and understanding but can always rely on his!
Well said. I recommend you read "Believing is Seeing" by Michael Guillen. It's a wonderful book that has helped me so much when it comes to holding on to enlightened faith.
In 2015, a group from London proved that many-body quantum systems are analogues of Turing machines, essentially computing the rules of quantum mechanics. Because such a system essentially has to reference itself when optimizing its electron distribution within a limited number of excitations, they went on to demonstrate that some properties, like the spectral gap, are undecidable. Reductionism has a limit, and there are things in life that not only we cannot predict, but neither can nature!
11:40 The presenter raised the question of computation, but he understands the term from only one side. The word can also be understood as the process by which images of information (not the information itself, but its stored state, impression, or correlation) are transferred into a state of active information. For example, the image of the word "fox" is transmitted to the brain. The paper acts as a carrier of the image of the word "fox"; note that the symbols written on the paper contain no information about the fox itself, only a certain correlation. An observer, in the form of a stream of light, acts on the paper and "reads" the image of the word, storing it as a frequency-modulated signal with its spectrum. Nothing remains of the letters "fox": the writing was read by an observer (the light flux), turned into active information, and stored in the signal's spectrum in that observer's format. The human eye then acts as an observer in turn, reading the image of the letters from the light flux and writing new correlations in its own format, for example into energy signals travelling along nerve endings to the brain, and so on. Only at the last stage does the neural network of the brain, reading the correlations that reached it and using memory and reference sets, create (generate) an active information flow and produce information about the fox, thereby reproducing reliable information from the word-image stored on paper. So computation is an analogue of the active process of the observer: it is the mechanism by which "real" information is born within the context of that observer.
Mathematics is not based on axioms. It is based on categories. Without categorizing objects and processes you cannot have two or more of anything. By grouping similarly shaped and similarly behaving objects together under one symbol, you can then assert that there is more than one of some thing. Without categorization there is only one of everything.
Hi, I loved your answer. It's enlightening. I'd like to share and ask you some questions that came to me as I read you. Please forgive my English and my lack of conciseness.
First: is this "one of everything" a category itself?
Second: is "category" a category itself? Or rather, what is a category?
Third: if we ask about a God who 'creates' this 'one of everything', are we just splitting the "one of everything" into two categories, or are we 'creating', or defining, a new "one of everything" category which will overwrite the older one? Or rather, would this operation be internal or external to the "one of everything" set?
Fourth: what is an operation?
Fifth: who is the one who performs the operation of categorising?
Sixth: does defining, or rather 'creating', need at least one external element (the one of everything, or the one everything is defined from) to make subsets from, with, or within that element, to set everything, to set the set of everything?
Seventh: does the set of everything need a definer to define itself? Or rather, can a set define itself by itself?
Eighth: is the 1 element truly an element if it can split into two elements? Or rather, is 1 an element or a set of elements? Can the 0 element split into something other than itself? Can the empty set do the same?
Ninth: is the empty set an element? Is the empty set at least the only element within which other elements can be placed?
Tenth: is the 'one of everything' the one, the zero, or the empty (set), the emptiness of everything?
Eleventh: what is the empty set? What is the emptiness of everything? Is it the set of all empty sets? How many elements does it contain?
Twelfth: can the empty set, or the emptiness of everything, fill itself by itself?
Thirteenth: who fills the empty set? Who fills the emptiness of everything?
Really, thank you. I apologize; I had not expected so many questions to flow out. Waiting on an answer, and feeling happy already. Cheers.
-The Lord is my shepherd, I miss nothing. Nothing, unless you.
I read that 3-SAT is NP-complete. And that's just basic Boolean algebra! And Gödel's incompleteness theorems don't apply to Boolean algebra. However, the set of natural numbers, for example, is infinite, so one has to use something like induction to define it beyond Boolean algebra. Or define a Turing machine that runs forever and outputs the natural numbers.
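For concreteness, a 3-SAT instance can be checked by brute force over all assignments, which is exponential in the number of variables (exactly why NP-completeness matters). This is an illustrative sketch; the convention of encoding literal k as the integer k and its negation as -k is just a choice made here:

```python
from itertools import product

def brute_force_3sat(num_vars, clauses):
    """Decide satisfiability of a 3-CNF formula by trying every assignment.

    clauses: list of 3-tuples of nonzero ints; literal k means variable k
    is true, -k means it is false (variables are 1-indexed).
    """
    for assignment in product([False, True], repeat=num_vars):
        def lit_true(lit):
            value = assignment[abs(lit) - 1]
            return value if lit > 0 else not value
        # Satisfiable if every clause contains at least one true literal.
        if all(any(lit_true(l) for l in clause) for clause in clauses):
            return True
    return False

# (x1 or x2 or not x3) and (not x1 or x3 or x3)
print(brute_force_3sat(3, [(1, 2, -3), (-1, 3, 3)]))  # → True
```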
In cellular automata, the complexity emerges not entirely from the rules, but from the action of the rules on the initial setup. The rules themselves are just the computational engine. And like any good computer, it has the same halting problem. So add a good initial setup and you have complexity, which is a manifestation of the halting problem.
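The "fixed rule acting on an initial setup" idea can be sketched with an elementary cellular automaton. This is a generic toy (Rule 30 with wrap-around edges; all names are chosen for illustration), not anything from the video:

```python
def step(cells, rule=30):
    """One step of an elementary cellular automaton with wrap-around edges.
    The neighborhood (left, center, right) forms a 3-bit index that selects
    one bit of the 8-bit rule number."""
    n = len(cells)
    def new(i):
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        return (rule >> idx) & 1
    return [new(i) for i in range(n)]

# The rule is the fixed "computational engine"; complexity comes from
# running it on an initial setup -- here a single live cell.
cells = [0] * 15 + [1] + [0] * 15
for _ in range(8):
    print(''.join('#' if c else '.' for c in cells))
    cells = step(cells)
```

Rule 30 from a single live cell already produces an irregular, hard-to-predict pattern, which is the sense in which simple rules plus a setup yield complexity.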
Many Native peoples use visual puns in their art forms, because both images carry the spiritual power contained in the subject depicted. Here, e.g., Rabbit has the spiritual powers of speed and survival, and Bird is wise and loyal, etc. A great book describing how this works is YAQUI DEER SONGS; you guys would love it. Also Northwest Coast Indians, e.g. the Kwakiutl, use lots of visual puns in art and dance masks, e.g. for transformations in sodalities. Point being, there is no reason to set either/or neural branches against broader associations of power, paradox, and mystery. The latter are very old and common in human cultures.
13:00 LEX: I think your comments here really convey (in a wonderful way) how you view such matters. I approach such topics with more caution than yourself, but this really does convey your Intent regarding AI and related matters.
An innocent question: if the incompleteness theorem refers to the existence of "valid" propositions whose truth or falsity is impossible to prove within a formal system of axioms, then on what basis is it assumed that the proposition was considered "valid" to start with (if it was)? Or is "valid" above merely "formally constructable"?
However, perhaps the counterexamples produced by Gödel and Turing come down to loops. A clear disjunction between formal systems and meaning happens in music. The works of Bach all follow rules which can be expressed as a mathematical formal system. The mathematical validity is a framework which helps support the emotional meaning (different for everyone), but the emotional meaning is outside the formal system.
I think the duality mentioned towards the end of the video, and also the problem that identity can't work as it does in formal logic and math (where something is identical to itself, and if it is different it can't be identical, while in reality everything is always changing, like the ship of Theseus, where every part is replaced one after the other until no part of the original ship remains), can both be resolved if we assume that nature is really dialectical. Dialectics also allows contradictions, so we don't have a problem if we prove something and also its contradiction. So maybe we could advance our understanding of the world if we assume that the natural world is dialectical and only obeys formal logic under special, constrained circumstances. We would probably need a new kind of logic then, one which includes dialectics.
Complexity emerges from simplicity due to entropy. Simple things have simple structures of information, which have a higher degree of freedom of arrangement, and that creates complexity in arrangement. From complexity emerges (systemic) simplicity, because complex systems have a lower degree of freedom of arrangement.
Fascinating conversation, as always. I have to disagree with Mr. Frenkel's statement that no one knows why people automatically see one or the other representation in an image such as the duck/rabbit one. A simple way to recognize this is to imagine that you show a picture of a rare animal to a hunter-gatherer for whom it is a common threat to their group. This image can also be seen as a small vehicle by someone who often rides them. I think you know where I'm going, and now you can explain what no expert can. Not quite.
On the Origins and Nature of the Dark Calculus There's evidence in the arithmetic record that the study of formal systems reached a pernicious apex in the Long Before. Advancements made by mathematicians such as Russell, Gödel, Eisencruft, Atufu, Wheatgrass, and System Star contributed to the understanding of notions like undecidability, pointed regularism, and abyssalism. Upon reaching this minimal degree of mathematical maturity, equipped with sophisticated grammars, researchers set out to experiment with the limits of expressibility. They contrived bold research programs and galloped into the mathematical wood, unwitting of the dangers that brood there. The record is even scarcer than usual, due to the efforts of successive generations to obfuscate the venture. As best as I can gather, at some point in the course of inquiry, a theorist from a mathematical seminary called the Cupola formulated a conjecture on the fragility of formal semantics. The conjecture ripened to a broader theory, out of which spawned a formal system called the penumbra calculus. In the few fragments of texts that predate the obfuscation, it's stated that, in the penumbra calculus, certain theorems are provable, but are falsified upon the completion of their proofs. As much as this result is at odds with the systems of thought I've encountered in my own inquiries, I find little reason to doubt the veracity of the authors. Nevertheless, it's certainly a peculiar property. The Cupola theorist's results erupted into a grand investigation into the expressibility of the penumbra calculus. The conclusions were troubling. Pushing further, researchers constructed sister systems with alternate axioms. These systems were still more fragile, with the systems' inference rules themselves unraveling upon the completion of certain proofs. 
Convinced that their discoveries were made possible by some idiosyncrasy of self-awareness, but synchronously fearful of the implications of their results, some schools of theorists engineered complex automated deduction systems to probe boundary theorems and launched them into neutron stars. The outcome is undocumented, but the result convinced theorists across the Coven to abandon research and blacklist anyone who studied the penumbra calculus and its derivative systems. (from: Caves of Qud)
Both the Incompleteness Theorem and the Halting Problem rely on Cantor's diagonalization argument. There is a flaw in this argument, as it leads to inconsistent conclusions in binary. Consider the diagonalization of the set below:

1.00000...
0.00000...
0.10000...
0.01000...
0.11000...
0.00100...
...
0.000...011 = 3 x epsilon
0.000...010 = 2 x epsilon
0.000...001 = epsilon
0.000...000

The diagonalization of the above set produces the number 0.11111..., which is either 1.0, or the first number listed, or 1.0 - epsilon.
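For reference, the standard diagonal construction being discussed can be sketched on a finite truncation. This is only an illustration: the actual argument runs over infinite binary sequences, where there is no last digit and no smallest "epsilon":

```python
def diagonal(rows):
    """Given equal-length binary digit strings (row i = the digits of the
    i-th listed number), flip the i-th digit of the i-th row. The result
    differs from every row in at least one position: Cantor's diagonal
    construction, truncated to finitely many digits."""
    return ''.join('1' if rows[i][i] == '0' else '0' for i in range(len(rows)))

rows = ['0000', '1010', '1100', '0111']
d = diagonal(rows)
print(d)  # → 1110
# The diagonal string disagrees with row i at position i, for every i.
assert all(d[i] != rows[i][i] for i in range(len(rows)))
```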
I worked for IBM in the 1980s on artificial neural network systems and encountered the dog-cat problem the guest describes here (only they were printed numbers with voids in the images, such that 0 and 8 could not be distinguished reliably). The ANN would classify those patterns as rejects, i.e. unreadable or unrecognizable, based on the difference between the highest class score and the next highest, measured against a confidence level that was derived statistically across all patterns.

But rather than being attributable to perception, as the guest suggests, it really was the result of two factors: too low an optical resolution, and an insufficiently precise definition of what would constitute an "8" and a "0" in the case where the "8" was degraded and the training patterns were manually tagged based on those low-resolution images. What should have been done is that the training patterns should have been tagged by the printer program that generated them.

As for cats vs. dogs, it comes back to precisely defining what a cat is vs. what a dog is, ontologically. For that you'd have to look at every possible variation of what each is, their similarities and differences, in order to derive a precise definition, which is what the ANN is attempting to do. So it's a chicken-and-egg problem, but as mis-tagged images are discovered in the training set that cause errors in the test set, those mis-tagged images can be corrected, further refining the definitions for each. This kind of corrective (and adaptive) feedback is what makes these systems work.
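The reject rule described above (gap between the two highest class scores checked against a confidence margin) might be sketched like this; the function name and numbers are made up for illustration:

```python
def classify_with_reject(scores, margin):
    """Return the index of the top-scoring class, or None ("reject") when
    the gap between the highest and second-highest class scores falls
    below the confidence margin."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    top, runner_up = scores[ranked[0]], scores[ranked[1]]
    return None if (top - runner_up) < margin else ranked[0]

# An ambiguous "8 vs 0" pattern: the two best scores are nearly tied.
print(classify_with_reject([0.48, 0.46, 0.06], margin=0.10))  # → None (reject)
# A clear pattern: accepted as class 0.
print(classify_with_reject([0.90, 0.05, 0.05], margin=0.10))  # → 0
```

In practice the margin would be tuned statistically over the whole pattern set, as the comment describes.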
I've heard it explained in terms of games. If you have a chessboard that is halfway through a game, say, there is no way to derive, using the rules (axioms), what the board looked like, say, 10 moves before. That's about as far as I get.
Yes, but try to redo planar geometry without the fifth postulate. The fifth postulate is obviously not true for spherical surfaces. So it obviously shouldn't be postulated for spherical surfaces.
In a previous episode, Frenkel speaks of two separate domains: E, Energy, the one populated by the three known forces in the quantum world, and S, Space, the one populated by distances, a property captured by the third axiom, "a point P and a distance r." As a curiosity, E appears as a trinity: three individuals with different and recognizable characteristics and properties who, in the end, are the same thing. When the preacher tells us this story, we "know" it is a metaphor. Newton doubted it, but particle physicists today know the story is true. What needs understanding is the unification of E and S. If E is the Yin and S is the Yang, what happens when they unite? What is the mathematical representation of that union? A reasonable question. After all, the high priests of E have created the Standard Model, a group structure SU(3) × SU(2) × U(1) that accurately represents observable properties of their members, and transformations/interactions among them. No curvature appears in the Standard Model; gravity does not exist in the E domain, it is created somewhere else. When Yin first meets Yang, at the first encounter of a photon with distance, nothing happens. Photons travel forever in their own universe without time. From E come two twin brothers, electricity and magnetism, and from another source, not excluding E, comes a fixed unit, the speed of light, c. The twin brothers are perpendicular to each other, a property captured by the fourth axiom, congruency of right angles. Our creation occurs when energy and space, E and S, unite in matter at the cadence of light, according to "space equals energy": c^2 = E/m. What composer dictated the cadence, or at least can it be derived from other principles? The universe has different and simultaneous configurations, different geometries. A mathematical system that corresponds to one of the possible geometries will be successful. A flat universe is represented by Euclid, while a curved geometry is represented by Riemann.
However, many mathematical systems do not correspond to any possible geometry. Combinatorics generate more outcomes than reality. Imagine universes with all possible geometries, which is the essence of multiverses and strings. Borges, without the encumbrance of heavy mathematics, painted that universe in the library of Babel.
Nice perspectives, really liked it. Thanks to Gödel's incompleteness theorem and Turing's halting theorem, we in the software engineering and mathematical communities will always have jobs, despite what all the AI hype going around tells you.
I don't see how those limitations affect AI anymore than they affect us. What can a human brain do that a properly trained AI could not do because of those? Not saying it is easy, or that it will happen soon, but speaking in absolutes when it comes to AI (they will never do X, we will always need humans for Y) sounds completely unsound to me.
@@harulem The guy falsely identifies the whole of math with a specific school of mathematics, namely formalism, which is based on arbitrary axiomatics and denies intuition and empiricism as truth conditions of coherent foundations of mathematics. And here lies the answer to your question: a mechanical Turing machine has no intuitive access to the idealist ontology of mathematics. Or can't "listen to the thoughts of the Goddess", as Ramanujan expressed his methodology. :)
@xcyoteex On the contrary. I'm saying, as did Brouwer et al., that no mathematical language can exhaust the idealist ontology of mathematics. Gödel, too, had intuitionist motivations for shooting down logicism, and together with it the whole linguistic reduction, with his proof. But contrary to the immutability of Platonic eternalism, I'm with Whitehead in the view that the idealist ontology of mathematics is process-philosophical rather than substance metaphysics. This view is strongly supported by the undecidability of the Halting Problem, which through the Curry-Howard correspondence extends to the whole of proof theory, implying that mathematical proofs cannot be claimed to have eternal status but rather temporal duration, even if that duration is longer than a universe. The process-philosophical approach also gives the construction of formal languages (as a process of translating prelinguistic intuitions into communicable language) a participatory, creative role in the idealist ontology and evolution of mathematics.
I'm not sure, though, that Gödel's theorem has nearly as much impact as all that. Even if all statements of ZFC could be proved true or false with a finite set of axioms, so what? There would still be infinitely many such statements. And in point of fact, most professional mathematicians devote themselves to the discovery of statements that can be proved true within ZFC, which is a task that a computer could in fact do. People talk vaguely of the possibility that unproven theorems like RH might not be provable within ZFC, but research programs that try to prove such theorems by tacking on known independent axioms are thin on the ground (which might be a mistake). So Gödel's work hasn't even affected mathematicians, let alone other people. Also, as to the business of the "incompleteness" of mathematics: mathematics was always incomplete anyway, because theorems depend on axioms, which are arbitrary.

Gödel's second incompleteness theorem is actually more significant here, because it tells you that you cannot even assure yourself of the consistency of any reasonably powerful set of axioms. That's much worse, because it literally tells mathematicians that there is no objective reason to think what they do is even internally consistent, and therefore everything they have ever done might actually prove pointless. For example, we have not actually proved Fermat's Last Theorem. We have only proved FLT hostage to the consistency of ZFC, which itself, if true, is beyond proof. We cannot rule out the possibility that tomorrow someone will find three numbers (four, counting the exponent) that violate FLT and ipso facto show that ZFC itself is inconsistent, meaning we couldn't trust any results proved in this system anymore! Somehow, in spite of these dramatic consequences, the second theorem gets little play whenever specialists talk about Gödel to a general audience.
15:45 Just as he was about to say it, I thought to myself this exact thing. It can be a scary thought, or just that our plane of understanding, on which A.I. is being trained, is constrained by such laws of "existence".
Gödel's Incompleteness Theorem: A Mathematical Corollary of the Epistemological Münchhausen Trilemma Abstract: This treatise delves into the profound implications of Gödel's Incompleteness Theorem, interpreting it as a mathematical corollary of the philosophical Münchhausen Trilemma. It elucidates the inherent constraints of formal axiomatic systems and mirrors the deeper epistemological quandaries underscored by the Trilemma. --- In the annals of mathematical logic, Kurt Gödel's Incompleteness Theorem stands as a seminal testament to the inherent constraints of formal axiomatic systems. This theorem, which posits that within any sufficiently expressive formal system, there exist propositions that are true but unprovable, has profound implications that reverberate beyond the confines of mathematical logic, resonating in the realm of philosophy. Specifically, Gödel's theorem can be construed as a mathematical corollary of the Münchhausen Trilemma, a philosophical paradigm that underscores the dilemmas in substantiating any proposition. The Münchhausen Trilemma, named after the Baron Münchhausen who allegedly extricated himself from a swamp by his own hair, presents us with three ostensibly unsatisfactory options for substantiating a proposition. First, we may base the substantiation on accepted axioms or assumptions, which we take as true without further substantiation, a strategy known as foundationalism or axiomatic dogmatism. Second, we may base the substantiation on a circular argument in which the proposition substantiates itself, a method known as coherentism or circular reasoning. Finally, we may base the substantiation on an infinite regress of reasons, never arriving at a final point of substantiation, a path known as infinitism or infinite regress. Gödel's Incompleteness Theorem, in a sense, encapsulates this trilemma within the mathematical world. 
The theorem elucidates that there are true propositions within any sufficiently expressive formal system that we cannot prove within the system itself. This implies that we cannot find a final substantiation for these propositions within the system. We could accept them as axioms (foundationalism), but then they would remain unproven. We could attempt to substantiate them based on other propositions within the system (coherentism or infinitism), but Gödel's theorem demonstrates that this is unattainable. This confluence of mathematical logic and philosophy underscores the inherent limitations of our logical systems and our attempts to substantiate knowledge. Just as the Münchhausen Trilemma highlights the challenges in finding a satisfactory basis for any proposition, Gödel's Incompleteness Theorem illuminates the inherent incompleteness in our mathematical systems. Both reveal that there are boundaries to what we can prove or substantiate, no matter how powerful our logical or mathematical system may be. In conclusion, Gödel's Incompleteness Theorem serves as a stark reminder of the limitations of formal axiomatic systems, echoing the philosophical dilemmas presented by the Münchhausen Trilemma. It is a testament to the intricate interplay between mathematical logic and philosophy, and a humbling reminder of the limits of our quest for knowledge. As we continue to traverse the vast landscapes of mathematics and philosophy, we must remain cognizant of these inherent limitations, and perhaps find solace in the journey of exploration itself, rather than the elusive, final destination of absolute truth. GPT-4
*Utter nonsense on stilts!* Incompleteness has *absolutely nothing* to do with the justifiability or the (epistemic) status of the premises from which one draws inferences. Incompleteness shows a rather stunning divergence between provability and truth in certain symbolic systems. These are two utterly unrelated areas. There are no Agrippan problems for the axioms of symbolic systems. By the way, _Münchhausen Trilemma_ is the recently deceased Hans Albert's novel name for a problem that for centuries has been known as the _Agrippan Trilemma._ Its use is mostly confined to a rather small group of German-speaking Popperians. Are you German? And what is GPT-4? Is this to indicate that your comment is the product of some "AI" software? If so, would you care to share your "prompt"? I'm rather curious to know more about the genesis of this rather elegantly presented load of unadulterated tosh.
The weirdest thing is that the proof isn't even that complicated once you develop all the foundational theory of logical deduction. Since there is a finite number of rules for deduction, you can assign an integer code to every rule. I don't remember exactly, but you take a self-referential statement like "this statement is not provable" and turn it into an integer code. It becomes a "theorem" about integers, but the self-referential nature can be used to show that if a proof of the specially constructed theorem exists, then the statement must be false, which is a contradiction. The conclusion must be that no proof of the constructed theorem exists. A proof that the theorem is false also doesn't exist, following the same method, which is the part that's really hard to wrap your head around.
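The "assign an integer code" step is Gödel numbering, and it can be illustrated with the classic prime-power encoding. This is a toy sketch (limited to short sequences of positive symbol codes, with a small fixed prime list):

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def godel_number(symbol_codes):
    """Encode a finite sequence of positive symbol codes as one integer:
    (c1, c2, c3, ...) -> 2**c1 * 3**c2 * 5**c3 * ...
    Unique prime factorization makes the encoding reversible, which is
    what lets statements about formulas become statements about integers."""
    n = 1
    for p, c in zip(PRIMES, symbol_codes):
        n *= p ** c
    return n

def decode(n):
    """Recover the code sequence by repeated division (codes must be > 0)."""
    out = []
    for p in PRIMES:
        c = 0
        while n % p == 0:
            n //= p
            c += 1
        if c == 0:
            break
        out.append(c)
    return out

seq = [3, 1, 4]          # e.g. codes for three symbols in a formula
n = godel_number(seq)    # 2**3 * 3**1 * 5**4
print(n)  # → 15000
assert decode(n) == seq
```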
Great conversation. An interesting detail: Mr. Fridman wears a jacket and tie, while Mr. Frenkel wears neither jacket nor tie. I like Mr. Frenkel's look.
14:25 I’ve seen the exact dress in person. It’s absolutely blue and black. My theory is that women often sift through dark closets, where a dimly lit white-and-gold dress looks like a sunlit blue-and-black dress.
Gödel proved that there are true sentences that can't be proved. If I ask for an example and you give one I may ask "How do you know it is true, since it can't be proved?" A self-referential sentence like "THIS sentence is false" seems not to be an example, since if it is true, it is false, and if it is false it is true - it is meaningless, I think. Is there any non-self-referential sentence that is true but not provable???
refine the last question further for insight: Are there any non-self-referential sentences at all? Language is a relationship map, a topology, a surface, a structure. And math is a language.
@@disabledchatzen5276 By self-referential here, what is meant is that the sentence "talks" about itself. Is there any known example of a mathematical sentence that is about "other things" than itself (and its being true or not), which is true but not provable? (Not one of the axioms, of course.) Ordinary human languages are not strictly logical, while mathematics is (free of contradictions).
It's not often that this host of topics is touched on in the same sitting... 1: The "A versus B" logic modality is mentioned as being first introduced by Aristotle (and is lightly implied to be an imperfect system)... 2: Knowledge is limited by imperfect biological and mental perception (for all of us) as well as by our previous life experience (causing completely unavoidable data bias)... and 3: there is mention of the collective unconscious. I've gotta give credit to both these men for being versed enough in these ideas to expound on them with each other. Doesn't happen too often.
I don't think people realize this is the understanding that patterns that are self-consistent have truth in their workings. Like a calculator that produces a wrong answer, but you can take it apart and understand the whole universe from it. Think of what that means for data in language and data in the classics or religious texts. That's the iteration of many users over a lot of time, discerning what patterns we observe.
Is it possible that computers with quantum compute will postulate axiom-based math that goes beyond, or numerically beyond, the postulates of the shared and useful myth we call mathematics?
Call me crazy but I think everyone may be missing the point with incompleteness. If incompleteness arises from a complex system doesn’t that mean the iteration of that system gives rise to emergent properties that will always yield more problems than answers?
Entanglement
5 formulas: 1 + tan²(θ) = sec²(θ)
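A quick numerical spot-check of the Pythagorean identity 1 + tan²(t) = sec²(t), where sec t = 1/cos t (just a sanity check at a few sample points, not a proof):

```python
import math

def identity_gap(t):
    """Return |(1 + tan^2 t) - sec^2 t|; should be ~0 wherever cos t != 0."""
    return abs((1 + math.tan(t) ** 2) - 1 / math.cos(t) ** 2)

for t in (0.1, 0.7, 1.2):
    assert identity_gap(t) < 1e-9
print("1 + tan^2(t) = sec^2(t) holds at the sampled points")
```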
This guy might be the best guest you have had on, Lex. I love this dude.
agreed!
When I saw this dude on Numberphile a long time ago I knew he was the real deal. Such a delight to see him again here!
It's great to hear about Gödel's incompleteness theorems. What he didn't mention is the motivation behind how Gödel ended up with the incompleteness theorems. At the end of the 19th century, a genius named Georg Cantor (the founder of set theory) wanted to understand God (just like Einstein wanted to understand God). For him, God was represented by infinity. Therefore, he wanted to understand infinity, which we had failed to understand for thousands of years (and still fail to, to this present day). So he asked himself a simple question: we can add numbers, subtract them, etc., but what about infinity? To this end, he constructed a theory that is now known as Cantor's set theory. Unfortunately, his basis of reasoning (his axioms) contained a contradiction, now known as Russell's paradox. Nevertheless, he proved that the set of natural numbers is smaller than the set of real numbers. And he wanted to determine whether there is a set whose size lies strictly between the naturals and the reals (the question now known as the continuum hypothesis), but failed to settle it. Because of these developments, David Hilbert (widely considered the greatest mathematician of that time and of the 20th century) came up with what's known as Hilbert's program. In this program he posed a number of problems, among which is the quest for a basis of reasoning that is both complete (that is to say, if a mathematical statement is true, then it must be provable from this basis of reasoning) and sound (that is, the basis of reasoning proves only true statements). Note that these two properties (completeness and soundness) are the two fundamental properties of all algorithms. Now comes Gödel. Originally Gödel wanted to prove that such a theory exists, but he ended up with the incompleteness theorems instead. The results were so earth-shattering that they completely destroyed Hilbert's and mathematicians' dream of having a sound and complete theory for mathematics.
On top of that, the mathematics he used to prove the incompleteness theorems was so new that only a handful of mathematicians understood it. Among the mathematicians who understood Gödel's results was Alan Turing. What Alan Turing said is that what Gödel really means is that there is no machine/algorithm that can determine whether an arbitrary program on an arbitrary input will stop or run forever. This is Turing's halting problem. It is important to note that all of this led to the birth of computer science and eventually to the famous P vs NP problem. Historically, Gödel was the first to informally pose the P vs NP problem, in a letter to John von Neumann. He was also the first to show that Einstein's theory of relativity allowed time travel. And he gave the proof to Einstein as a birthday present.
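The self-defeating construction at the heart of the halting problem can be sketched in a few lines of Python. This is only a toy illustration of the diagonal idea (the names are made up, and only the "oracle says it never halts" branch is actually run here, since the other branch loops forever):

```python
def defeat(halts):
    """Given any claimed halting 'oracle' for zero-argument functions,
    build a function the oracle must be wrong about: g asks the oracle
    about itself and then does the opposite of what the oracle predicts."""
    def g():
        if halts(g):
            while True:       # loop forever exactly when the oracle says "halts"
                pass
        return "halted"       # halt exactly when the oracle says "runs forever"
    return g

# Example: an oracle that answers "never halts" for everything.
g = defeat(lambda f: False)
print(g())  # → halted  (so the oracle was wrong about g)
```

Whatever rule the claimed oracle follows, the function built from it behaves oppositely on itself, which is the contradiction Turing used to show no such oracle can exist.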
@@dudeshiya Thank you for this write up! Is there any book you could recommend to read about this in detail?
@@blueskies3336 If you are capable of reading German, Dirk W. Hoffmann's "Grenzen der Mathematik" is a must. It is mathematically rigorous enough, but as comprehensible as possible for this hard topic. It gives a detailed historical account of the developments.
Maybe one could try reading it with the advancing translation technology...
I saw that in September 2023 a book called "Foundations of Logic" by Westerståhl will be published, it seems similar from its summary.
This is the first time I've been introduced to this guy. I like how he seems to be more of a "unification of knowledge" type of person, rather than just a mathematician. He draws from examples everything from math, to pop-culture, to eastern and western philosophy, and so on. Thanks again Lex!
He's been featured on the Numberphile channel a few times - the topics he covers are always very illuminating.
I love that he mentioned Alan Watts, who had the best description of Goedel’s Theorem: “No system can define all of its own axioms.”
I was surprised and amazed at how he connected them, but then remembered how wide-ranging Frenkel's knowledge of things and people is (beyond mathematics).
I recall (and am now rewatching) his 2014 talk when, at the beginning of his talk, the computer system breaks down and he tells the tech, "Don't worry about it", and "we use computers so much these days, maybe it's a sign". Then he still goes on to speak with such humility, humour, and a weird humbleness. Weird because he obviously knows so much but believes he doesn't.
The talk's from a book promotion tour, for his "Love and Math". If I remember correctly, he's got a few more of those Alan Watts like comments. ruclips.net/video/YnqQ-BWMHrE/видео.html (if you're interested).
I don't get it. What does it mean to define an axiom?
That sounds like a vague description to me, which is fine for people like Watts, but no good at all if you want clarity.
@@Leksa135 Axioms are statements, so yeah, what does it mean to 'define' a statement? Seems like it doesn't mean anything, honestly.
@@Leksa135 At around 4:46, you'll see them show Euclid's five axioms (also called postulates). You'll see it says "first axiom", "second axiom", etc. Underneath each of those is what is technically called a definition, one for each axiom.
To return to @FrigginTommyNoble's comment, the system of mathematics being used is called Euclidean geometry. But the Euclidean geometry system can't define its axioms or itself; it needed Euclid (i.e. a person) to define them. Hence, no system (i.e. no mathematical system) can define all of its own axioms. Which is what Kurt Gödel proved mathematically, or rather, he proved that all mathematical systems will be incomplete, hence Gödel's incompleteness theorems.
Great show. I really love Frenkel. He is so clear and his enthusiasm and sense of wonder is infectious!
Here are brief statements of the theorems for those interested:
Gödel's First Incompleteness Theorem states that "Any effectively generated theory capable of expressing elementary arithmetic cannot be both consistent and complete. In particular, for any consistent, effectively generated formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true, but not provable within that theory."
Gödel's Second Incompleteness Theorem states that "For any effectively generated formal theory T including basic arithmetical truths and certain truths about formal provability, T includes a statement of its own consistency if and only if T is inconsistent."
Just prior to publication of his incompleteness results in 1931, Gödel already had proved the completeness of the First Order logical calculus; but a number-theoretic system consists of both logic plus number-theoretic axioms, so the completeness of PM and the goal of Hilbert's Programme (Die Grundlagen der Mathematik) remained open questions. Gödel proved (1) If the logic is complete, but the whole is incomplete, then the number-theoretic axioms must be incomplete; and (2) It is impossible to prove the consistency of any number-theoretic system within that system. In the context of Mr. Dean's discussion, Gödel's Incompleteness results show that any formal system obtained by combining Peano's axioms for the natural numbers with the logic of PM is incomplete, and that no consistent system so constructed can prove its own consistency.
What led Gödel to his Incompleteness theorems is fascinating. Gödel was a mathematical realist (Platonist) who regarded the axioms of set theory as obvious in that they "force themselves upon us as being true." During his study of Hilbert's problem to prove the consistency of Analysis by finitist means, Gödel attempted to "divide the difficulties" by proving the consistency of Number Theory using finitist means, and to then prove the consistency of Analysis by Number Theory, assuming not only the consistency but also the truth of Number Theory.
According to Wang (1981):
"[Gödel] represented real numbers by formulas...of number theory and found he had to use the concept of truth for sentences in number theory in order to verify the comprehension axiom for analysis. He quickly ran into the paradoxes (in particular, the Liar and Richard's) connected with truth and definability. He realized that truth in number theory cannot be defined in number theory, and therefore his plan...did not work."
As a mathematical realist, Gödel already doubted the underlying premise of Hilbert's Formalism, and after discovering that truth could not be defined within number theory using finitist means, Gödel realized the existence of undecidable propositions within sufficiently strong systems. Thereafter, he took great pains to remove the concept of truth from his 1931 results in order to expose the flaw in the Formalist project using only methods to which the Formalist could not object.
Gödel writes:
“I may add that my objectivist conception of mathematics and metamathematics in general, and of transfinite reasoning in particular, was fundamental also to my work in logic. How indeed could one think of expressing metamathematics in the mathematical systems themselves, if the latter are considered to consist of meaningless symbols which acquire some substitute of meaning only through metamathematics...It should be noted that the heuristic principle of my construction of undecidable number theoretical propositions in the formal systems of mathematics is the highly transfinite concept of 'objective mathematical truth' as opposed to that of demonstrability...” Wang (1974)
In an unpublished letter to a graduate student, Gödel writes:
“However, in consequence of the philosophical prejudices of our times, 1. nobody was looking for a relative consistency proof because [it] was considered that a consistency proof must be finitary in order to make sense, 2. a concept of objective mathematical truth as opposed to demonstrability was viewed with greatest suspicion and widely rejected as meaningless.”
Clearly, despite Gödel's ontological commitment to mathematical truth, he justifiably feared rejection, by the formalist establishment dominated by Hilbert's perspective, of any results that assumed foundationalist concepts. In so doing, he was led to a result even he did not anticipate, his second Incompleteness theorem, which established that no sufficiently strong formal system can demonstrate its own consistency.
See also,
Gödel, Kurt "On Formally Undecidable Propositions of Principia Mathematica and Related Systems I" (1931), in Jean van Heijenoort (trans.), From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931 (Harvard University Press, 1967)
Thank you!
Wow, literally an informative article in the comment section. Thank you so much!
This was one of my favourite shows from Lex. Edward is a truly remarkable human being, and it's always beautiful to see so much love and compassion in one's heart.
I think it's valuable to also go a bit into the whole "completeness & consistency" thing.
One could start with definitions and explaining why they're important things we want from formal systems.
Then one could proceed to a little history of how "cracks" in set-theory based formal systems began to be discovered by Frege and Russell almost as soon as those systems arose.
The story continues with a quick overview of the various approaches to these issues, like ZF(C), NBG, "New Foundations" and type theory (with Russell for a while, then dormant for a long time, then getting a big comeback with Per Martin-Löf and lots more interest recently with Homotopy Type Theory).
This brings us to a classification and analysis of the underlying issue, that of predicativity and impredicativity. One might briefly explain what that is and why it's problematic, using various examples of paradoxes of (direct or indirect) self-referentiality.
We can then explain how these developments and the predominant research institutions in Germany and Eastern Europe led to Gentzen's proof of the consistency of (Peano) arithmetic, and how that was a process of formalization which took us around 2.5 millennia from basic arithmetic and logic to Gentzen's proof. The importance could hardly be overstated.
The "victory march" of formalization and the power of formal systems seemed assured.
... and then came Gödel.
"and then came Gödel"
No, Gödel was first (1931) and Rosser's extension of Gödel's first incompleteness theorem (which is often falsely attributed to Gödel) was in the same year as Gentzen's proof, 1936.
Gentzen evaded the incompleteness theorems: the system he used to show that PA is consistent is not simply an extension of PA; its strength is incomparable with PA's (in particular, it is not weaker than PA), yet it miraculously manages to capture the structure of proofs in PA.
I think Gentzen's result is even more breathtaking and few people (if any) understand it.
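A concrete taste of what Gentzen's ordinal ε₀ buys: Goodstein sequences (write n in hereditary base-b notation, replace every b by b+1, subtract 1, repeat) always reach 0, but Kirby and Paris showed PA cannot prove this; the proof needs induction up to ε₀. A minimal sketch, my own illustration:

```python
def to_hereditary(n, b):
    # Represent n in hereditary base-b notation as [(exponent_repr, coeff), ...],
    # where each exponent is itself recursively represented in base b.
    terms, e = [], 0
    while n > 0:
        n, r = divmod(n, b)
        if r:
            terms.append((to_hereditary(e, b), r))
        e += 1
    return terms

def from_hereditary(terms, b):
    # Evaluate a hereditary representation using base b.
    return sum(c * b ** from_hereditary(e, b) for e, c in terms)

def goodstein(n, steps):
    # First values of the Goodstein sequence starting at n:
    # rewrite in hereditary base b, bump the base to b+1, subtract 1.
    seq, b = [n], 2
    for _ in range(steps):
        if n == 0:
            break
        n = from_hereditary(to_hereditary(n, b), b + 1) - 1
        b += 1
        seq.append(n)
    return seq
```

Starting at 3 the sequence returns to 0 almost immediately (3, 3, 3, 2, 1, 0), but starting at 4 it grows astronomically before it (provably, though not provably in PA) comes back down.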
Godel vs Gentzen? Epic math battles in history? Please tell me this is a RUclips series 🫡
Edward Frenkel is so extremely lucid, just extraordinary.
This is your brain on logic.
My ex-wife actually had Edward Frenkel as an instructor at Berkeley nearly 10 years ago.
Ed Frenkel is one of my favourite people. His book is fantastic.
Wow!! Great explanation of incompleteness! The best I have seen so far!
GAYDOH
would you say the explanation was complete?
Ha ha, good one. As complete as can be at this level, I guess...
In another paper, Gödel developed an axiomatic system containing the self-referential statement, “This statement is false.” He then proved, within the same system, that “This statement is false” is true. All he needed was the countable numbers (the set N) and a few very simple rules.
On “Emergence”: taking simple rules and applying them to a simple structure to produce complex “behaviour” is also a subjective process. In what axiomatic system can you consistently define both “simple” and “complex”, and then show that there are no self-referential contradictions?
The Chomsky hierarchy defines classes of complexity of behaviour. Any Turing complete system can capture the fullness of complexity, and they are all well defined computational systems. Also, the simple structure of these systems can produce the most complex behaviour possible: irreducibly complex behaviour.
In "another paper"? Which would that be?
"All he needed was the countable numbers (the set N) and a few very simple rules." This comic reference to Peano arithmetic and gödelisation really hurts!
I love this stuff. This channel never seems to disappoint.
Great video. People take calculus and algebra classes for years, and no one explains the foundations of what they are studying as clearly as this guy does.
Do people?
It takes a deep and complete understanding to make it as simple as it can possibly be explained.
@@unkokusaiwa HAHAAA
What a great and inspiring guy Mr. Frenkel is. It's simply great learning from his talks.
It’s so obvious, you can’t put all your eggs in one basket. We need competing ideas, we need theories that “contradict” others, but because of this they work (meaning multi-valued logics that immediately eliminate Gödel’s theorems), etc.
Thanks for your clear exposition of the core ideas embraced in Godel's amazing theorem...
I love his comment on the findings of Godel and Turing being "life affirming". Very well said.
Great explanation!
Regarding the perception problem at 14:52, top-down perception in the brain can provide a trivial explanation. Just like Lex mentioned for neural networks, the bottom-up sensory features lead to two activated outputs: 0.5 Rabbit and 0.5 Duck. However, the top-down awareness in the human brain can only attend to one output at a time. So if you attend to the duck output, the duck neuron will be activated. Now, because the information flows from top to bottom, all the neurons related to Duck will activate (none will activate for Rabbit). And that is why you suddenly perceive it as 100% Duck or 100% Rabbit, depending on whether your top-down awareness attends to the Duck or vice versa.
simply giving it the name "topdown" is neither much of an explanation nor interesting. and who is this "you" picking what to attend to? that's the interesting stuff
@@iranjackheelson The brain is a reactor and converter not a creator. So this whole story goes into lala land categories.
@@s.muller8688 you're being sarcastic right? brain is exactly not just a converter or reactor. it is a creator indeed. part of the reason why you can't predict what you want to eat for lunch 100%.
@@iranjackheelson what you going to eat at lunch is already stored in the memory in the form of known data, which than randomly get's chosen by thought. Nice try.
@@s.muller8688 decision to what to eat for lunch is influenced by your prior beliefs and states, but it's not just stored as "known data". nice try
Of all the explanations on the internet, Edward Frenkel explained it the best!
There are some nice ideas about emergence of complexity. As nothing is the same, there is an incremental effect by repetition, similar to what our memory in the brain does with the episodic memory, each time we see a cat for example, the cat experience adds meaning to our definition of cat, even if it is the same cat at the same place. A bit like Peircean semiotics thirdness, when we interpret a sign it can generate a new one, even more if instead of just one triadic relationship there is a whole network of it, by aggregation and interconnection, at some point it generates more complexity of evolving meaning, because it cannot be the same, different than in mathematics. In mathematics if we add 1 + 1 it is always 2, but in reality that is impossible, and the 2 will be always slightly different each time we add 1+1. In short, complexity has to emerge because repetition is impossible.
We need more smart people like Godel, Lex Fridman, Joe Rogen and Adam Carolla.
Frenkel is probably my favorite guest of all time I’ve rewatched the interview many times
A full 14 minute in depth explanation of goedel's impossibility theorems, and then lex goes "so every why has a definite answer"
??? every why still can have a definite answer but not on the same system
To be fair, as far as I understand Goedel's incompletness theorem (I think thats what you mean, 'incompletenes' rather than 'impossiblity'?) the theorem only states that there is at lease one state that is not within the universal set. And that by the virtue of at least one, that makes the whole 'completeness' incomplete.
It does not mean 'there are questions that can not be anwsered', it just means, 'Mathematically, there is at least one question that can not be anwsered'.
I choose to interpret that to mean "Do we get Pizza or Fish and Chips." being the question.
@@jaywulfnot necessarily a question being asked but rather and assumption being posed
@@jaywulfbecause a question would imply that if an answer was derived, the answer would be used to start the system of logic, but within the logic the axiom was already assumed without validation (except if you go meta)
9:28 No, no it doesn’t. It “proves” these things if you aren’t a shrewd enough logician to understand that a self-contradicting and semantically empty statement is simply “incoherent” within any coherent formal system (that is coherent with the Logos).
Also the halting problem is solved by the Logos and by translating programs into a number of versions until a halting version is found (if the original does not halt by the time that other halting version is found).
It’s all easy and these people that think LOGIC is imperfect or can’t prove everything are the fools, even if they’re “big names” and did some cool things.
When the Math doesn’t “add up” it’s YOU/US (the fallible mathematician and fallible humans) that screwed up, it’s not Math and Universal Logic (Logos) that were actually imperfect at their Root and in their Essence.
Perfect Logic is Perfect Logic and that is tautological and 100% certain.
There is no “truth” that is not encompassed by the Universal Logic or (objectively) true without being so according to the Standard of Universal Logic.
It is illogical and logically contradictory nonsense to speak of a “truth beyond Logos” or any aspect of Logic that cannot be symbolically represented with Formal Logic.
I wish my father had teached me mathematics like him. So calm and collected makes it easy😢😅
Don't forget that he has spent decades of his life dedicated to learning and communicating mathematics. It take a lot of hard work to be able to do what he does.
Is your dad a mathematician? If not the case, pls respect your dad.
@@linchenpal yes
Some people are natural at math but cannot teach or explain it. Others work harder to learn it, but do better conveying it.
But most people are bad at what they do and don't care to try to learn everything they are capable of.
Incredible, as a mathematician and AI scientist I loved what he had to say.
I practice magick for the exact reason he spoke of in the end. It's worth noting that Carl Jung also practiced magick and wrote and illustrated his dreams and other deep psych work in his Red Book.
People think, "oh you're crazy you think Harry Potter is real," but it couldn't be further from the truth. I believe in targeted rituals that help expand my own understanding of what lies below the tip of the iceberg in my subconscious. As well as to harvest results from it. I think it's worth looking into if you find it intriguing.
Lex on Godel's incompleteness and Turing's undecidability: _"It's very depressing."_
Frenkel: _"Or life affirming!"_
Edward is ahead of Lex in spiritual development. When we embrace that we will never attain a _Theory of Everything,_ it opens up greater possibilities!
I think Lex was joking about programming. There being no means to figure out the perfect program. :)
@@evertoaster Agreed, Lex was not being 100% serious in his answer
Reminded me about a study of the mona lisa smile, where they discovered that the smile appears to be more obvious if we use our peripheral view, so that its way she sometimes appear to smile but then again, she does not. Maybe some of those trick images may have an explanation that is more complex than subjective.
Loops are similar to self referential statement that are connected to some paradoxes in naive set theory. But they are completelly resolved in modern set theory ZFC. Also there are no connection between these paradoxes and Godel results. Godel theorem are valid for every formal system strong enought to interpet Peano arithmetic.
I think Euclidean geometry is a good example for the difference between physics and math. In math, the 5th postulate is just an axiom. in classical physics, it's derived from observation and in general relativity, which was necessary because classical physics didn't agree anymore with some observations (like the orbit of Mercury), it is only valid in the special case of flat space, which is a good approximation in some cases, but strictly only exists in an empty universe or in single points which have a curvature of 0.
Wonderful interview. If Penrose is right in that consciouness is not an algorithmic computation, as per Godel, then we are not wholly deterministic, a comforting thought!
Or that even if we are wholly deterministic, it is not in a way that can be replicated by any formal / computational approach known today. Penrose has basically said something similar.
@Mattje8 Very true. Do you think if we are not wholly deterministic, that Penrose's idea, if true, would explain the phenomenon? Could it be that superposition and entanglement provide human's with the ability to think outside the box, to get the "god's eye view" ?
‘We’re still feeling the tremors today.” Wow.
Excellent interview
I really liked when he jst softly said the key to genius . " To have open end process " to let yr conscious intelligence lead . Rather than deciding one thing and another
this is a Thankyou to Lex and his team for committing to in-person interviews with high quality audio equipment.
It makes all the difference to the audience experience.
Wonderful guest! Obviously Brilliant, yet modest! 👏
Your talk with Edward took a most interesting path from subjective axiomatic foundations, creation of mathematics thru logical inference & syntactic process, to Goedele’s & Turing’s Theorems opening
space for new things.
You then jumped to cellular automata & emergent complex behavior, leading Alex to bridge to neural nets via figure-ground human perception. At this point I flashed on the interesting possibility that we are all (inescapably) training AI systems to be human in the most fundamental way possible.
My thought was that the methods of our perception are at their core, the syntactic evolutionary algorithms that created life and us. Also, that we are incapable of recapitulating ourselves in our dogs, children or AI, in any other way.
The intimacy of the evolutionary syntactic process that created us is the heart and soul of our perceptual engine. As we discover this syntax in recreating ourselves, we give this soul to our syntactic silicon selves, our AI.
Everything is layers of complexity. My research leads me to conclude that we are close to an inflection point in our understanding of the second law of thermodynamics. When that happens, this next layer of complexity may provide a foundation for a true science of life… and AI is our ashlar test vehicle.
Greatly appreciated broadcast. Thanks for sharing! Ex-Maths teacher. Have a GOOD weekend!
I love Edward Frenkel so much ty for this
Love your podcast, always a hit. But this was a grand slam.
Editing a foundational part of the whole interview✨🙏
I shit my pants when he tied everything to complementarity. The golden thread of Platonism. Would love to hear this guy talk with Joscha Bach, John Vervaeke, Max Tegmark, Penrose, Graham Priest, or any other modern great who understands and promotes this principle.
I've been watching a lot of physics videos lately, mostly from Drs. Sabine Hossenfelder (Science without the Gobbledygook) and Matt O'Dowd (PBS Spacetime). Both channels take dips into the Quantum and explore some of the weirdnesses within.
I've been thinking a lot about the ideas of complementarity and how they might relate to quantum superposition and the measurement problem. See, I'm starting to think that maybe the wave function doesn't collapse at all, but rather we are just observing one "version" of the particle. Its unobserved complement might still be just as real as the one we measure, but can't be seen at the same time. In a sense, maybe particles really are in two places at once, but we can only observe one at a time.
Thoughts?
@@sabinrawr yea I have a lot of thoughts about this. Safe to say there is a lot of confusion surrounding QM as it is popularly understood.
I tend to go with interpretations akin to objective collapse or Quantum Bayesianism.
That's why this is my favourite channel on platform
I like the late Dr. Van Till's view of how that its necessary to start with "the very first principle" of our creator's existence. THEN we can make sense of everything in virtue of that divine ultimacy of reality. Top down rather than bottom up. It acknowledges the need for divine revelation, not only in order for us to know that the ultimate nature of reality is divine, but so we can have intelligibility for facts generally, regarding things we can see and touch.
Except it's been understood for a while now that there doesn't need to be a creator in the universe we inhabit. It's an established part of physics that structure can emerge from "nothing" due to spontaneous symmetry breaking.
"Structure" referring the laws of the universe that give rise to what we see as material reality. It's understood that as energy levels lower, the many symmetries of the universe break down in ways that create more and more differentiated structure as opposed to the extremely high symmetry of nothingness (everything being the same under all possible transformations.
This isn't just speculation. It's proven that the most fundamental laws of physics, the conservation laws, are all just emergent expressions of symmetries in the universe. They did not have to be decided on or written down by a creator. They just emerge due to the "shape" of reality.
Even the most fundamental "things" in reality, quantum fields, emerge spontaneously as energy decreases. The four fundamental forces emerge spontaneously from one as the universe cools. At the beginning of the big bang, there was a single fundamental force, and as the universe cooled, broke into 4 different forces. More symmetry to less symmetry. More similarity in the universe to less, which is to say more structure emerging from less. No creator needed. The big bang itself is very likely the breaking of time symmetry.
The existence of things like "charge" is due to spontaneous symmetry breaks too.
Like liquid water being the same/symmetric from all sides spontaneously turning into a snow flake as energy lowers and suddenly being symmetric from only 6 sides. That's a rough analogy of a symmetry break. The structure of the snowflake emerged from the less structured water completely passively due to the symmetries of liquid water breaking as energy falls.
None of this proves there is no God, but it DOES mean that God isn't necessary or inevitable to explain the universe existing. Structure can emerge from where there was no structure completely passively as things go from high energy to lower energy, which is just to say time passes.
This guy is saying some of the most interesting things I've heard, not only recently but in general. It all just affirms for me even more how by being human you're limited and you're always putting your faith in something, whether conscience or not. There's just no way around it, but it seems like it's probably better that way than being all-knowing and all-powerful. As a Christian it just motivates me more to put my faith in Christ, the one who really knows it all and takes responsibility for us. I do love honest science and seeking the truth of course but, knowing how flawed and limited I am, it just seems ridiculous to think I could find all the answers to life. Prioritizing my relationship with Christ has been the greatest thing for my life and I'm just so glad I don't have to rely on my own strength and understanding, but can always rely on his!
Well said. I recommend you read "Believing is Seeing" my Michael Guillen. It's a wonderful book that has helped me so much when it comes to holding on to enlightened faith.
In 2015, a group from London proved that many-body quantum systems are analogs of Turing machines, essentially computing for the rules of quantum mechanics. Because the system essentially has to reference itself in optimization of electron distribution, within a limited number of excitations, they then went on to demonstrate that some properties, like spectral gap prediction, are undecidable. Reductionism has a limit and there are things in life not only that we can't predict, but neither can nature!
This interviewee is a very smart man.
Thanks for coming up with this interesting topic with this brilliant guy..❤🎉
11-40 The presenter raised the question of calculation, but he understands this term only from one side. In fact, the word calculation can be understood as a process when images of information (not the information itself, but its stored state, impression, correlation) are transferred to the state of Active Information. For example, the image of the word “fox” is transmitted to the brain. The paper acts as a carrier of the image of the word “fox”. Note that there is no information about the fox itself in the symbols written on the paper, but there is a certain correlation. The observer, in the form of a light stream, acts on the paper (“reads”) the image of the word “fox” and stores it in the form of a frequency modulated signal with its spectrum. Note that nothing remained of the letters “fox”; the letter was considered an observer (light flux), produced in the form of active information and stored in its format in the signal spectrum. The human eye acts as an observer and reads the image of letters from the medium (light flux) and writes new correlations in its own format, for example, into energy signals traveling along nerve endings to the brain. Etc. And only at the last stage, the neural network of the brain, reading the correlations that came to it and using memory and reference sets, creates (generates) an active information flow and generates information about the fox (thereby reproducing reliable information from the word image stored on paper). So calculation is an analogue of the active process of the Observer, it is a mechanism as a result of which “real” Information is born within the context of this Observer.
Postmoderns have substituted emergence for manifestation, as if "because, because" had any explanatory value.
I know the theorem and the proof and all the theory and i can say that i like the way he summariezed it for wide audiance
GAYDOH
The guest is amazing, and his eyes are so bright
Mathematics is not based on axioms. It is based on categories. Without categorizing objects and processes you cannot have two or more of anything. In grouping similar shaped and behaving objects together under one symbol you can then assert that there is more than one of some thing. Without categorization there is only one of eveything.
Hi, I loved your answer. It's enlightening.
I'd like to share and ask you some question had came to me as I read you. Please, mind me for my english or lack of conciseness not.
First: is this "one of everything" a category itself?
Second: is "category" a category itself? Or rather, what's a category?
Third: if we ask about a God who 'create' this 'one of everything', are we just splitting in two categories this "one of everything" or are we 'creating', or defining, a new "one of everything" category which will overwrite the older one? Or rather, would this operation be an internal or an external operation to the "One of everything" set?
Fourth: what's an operation?
Fifth: who's the one who make the operation of categorising?
Sixth: does defining, or rather 'creating', needs at least one external element -the one of everything or the one everything is defined from- to make subsets from, with or within that element, to set everything, to set the set of everything?
Seventh: does the set of everything needs a definer to define itself? Or rather, can a set define itself by itself?
Eighth: is the 1 element truly a element 1 if it can slit into two elements? Or rather, is 1 a element or a set of elements?
Can be the 0 element slit into something else other than it self?
Can the empty set do the same?
Nineth: is the empty set an element?
Is the empty set at least the only element where other elements can place within?
Tenth: is the 'one of everything' the one, the zero or the empty (set), the emptyness of everything?
Eleventh: what's the empty set?
What's the emptyness of everything?
Is it the set of every empty sets?
How many -or how much- elements does it contain?
Twelveth: can the empty set, or the emptyness of everything, fill itself by itself?
Theerteenth: who fill the empty set?
Who fill the emptyness of everything?
Really thank you.
I apologize, I had not believed so lot of questions might flowed out.
Waiting on a answer,
Feeling happy already,
Cheers.
-The Lord is my sheperd, I miss nothing.
Nothing, unless a you.
I read that 3-SAT is Turing complete. And that's just basic Boolean algebra! And Gödel's incompleteness theorems don't apply to Boolean algebra. However, for example the set of natural numbers is infinite, so one has to use something like induction to define that in Boolean algebra. Or define a Turing machine that runs forever and outputs the natural numbers.
I don't know why, but Edward Frenkel is my favorite guest so far. Invite him again pls!
In cellular automata the complexity emerges not completely from the rules, but from action of the rules on the initial setup. The rules themselves are just the computational engine. And as any good computer it has the same halting problem. So add a good initial setup and you have complexity - which is a manifestation of the halting problem.
Many Native peoples use visual puns in their art forms, because both images carry the spiritual power contained in the subject designated. Here eg Rabbit has spiritual powers of speed and survival. AND bird is wise and loyal etc. A great book describes ING how this works is YAQUI DEER SONGS. You guys would love it. Also Northwedt Coast indians eg Kwakiutle use lots of visual puns in art , dance masks, eg for transformations in sodalitiies.
Point being no reason to associate Either Or neur branches vs broader associations of power, paradox, mystery. The latter are so old and common in human cultures.
Эдуард как всегда на высоте - чертов гений!
13:00 LEX: I think your comments here really convey (in a wonderful way) how you view such matters. I approach such topics with more caution than yourself, but this really does convey your Intent regarding AI and related matters.
An innocent question: if the incompleteness theorem refers to the existence of ''valid'' propositions whose truth or falsity is impossible to prove within a formal system of axioms, then, on what basis is it assumed that the proposition was considered ''valid'' to start with (if it was)? Or, is ''valid'' above = merely ''formally constructable''?
However, perhaps the counterexamples produced by Gödel and Turing come down to loops. A clear disjunction between formal systems and meaning happens in music. The works of Bach all follow rules which can be expressed as a mathematical formal system. The mathematical validity is a framework which helps support the emotional meaning (different for everyone), but the emotional meaning is outside the formal system.
The eternal golden braid
Brilliant discussion
"I used the formal system to destroy the formal system." - Thodel
He is a very thoughtful man.
What is the name of the knot displayed in the title?
I think the duality like it is mentioned towards the end of the video and also the problem, that identity can't work like in formal logic and math (where something is identical to itself and if it is different, it can't be identical, but in reality everything is always changing, like the ship of Theseus, where every part is replaced one after the other until no part of the original ship is still there) can both be solved if we assume, that nature is really dialectic. And dialectics also allows contradictions, so we don't have the a problem if we prove something and also it's contradiction. So maybe we could advance our understanding of the world if we assume that the natural world is dialectic and only obey formal logic under special constrained circumstances. We would probably need a new kind of logic then, which includes dialectics.
i love this guy, he's so inspiring!!
complexity emerges from simplicity due to entropy. Simple things have simple structures of information, which have a higher degree of freedom of arrangement, which creates complexity in arrangement. From complexity emerges (systemic) simplicity, because complex systems have a lower degree of freedom of arrangement.
Due to decoherence…
Damn, that's a good explanation.
Fascinating conversation, as always.
I have to disagree with Mr. Frenkel 's statement that no one knows why people automatically see one, or the other representation in an image such as the duck/rabbit one. A simple way to recognize this is to imagine that you show a picture of a rare animal to a hunter-gatherer for whom it is a common threat to their group. This image can also be seen as a small vehicle to someone who often rides them. I think you know where I'm going, and now you can explain what no expert can. Not quite.
Great talk as always, it is very humbling to remind ourselves of the ground where we're standing. Cheers from Brasil!
❤ from Brasil
On the Origins and Nature of the Dark Calculus
There's evidence in the arithmetic record that the study of formal systems reached a pernicious apex in the Long Before. Advancements made by mathematicians such as Russell, Gödel, Eisencruft, Atufu, Wheatgrass, and System Star contributed to the understanding of notions like undecidability, pointed regularism, and abyssalism. Upon reaching this minimal degree of mathematical maturity, equipped with sophisticated grammars, researchers set out to experiment with the limits of expressibility. They contrived bold research programs and galloped into the mathematical wood, unwitting of the dangers that brood there.
The record is even scarcer than usual, due to the efforts of successive generations to obfuscate the venture. As best as I can gather, at some point in the course of inquiry, a theorist from a mathematical seminary called the Cupola formulated a conjecture on the fragility of formal semantics. The conjecture ripened to a broader theory, out of which spawned a formal system called the penumbra calculus. In the few fragments of texts that predate the obfuscation, it's stated that, in the penumbra calculus, certain theorems are provable, but are falsified upon the completion of their proofs. As much as this result is at odds with the systems of thought I've encountered in my own inquiries, I find little reason to doubt the veracity of the authors. Nevertheless, it's certainly a peculiar property.
The Cupola theorist's results erupted into a grand investigation into the expressibility of the penumbra calculus. The conclusions were troubling. Pushing further, researchers constructed sister systems with alternate axioms. These systems were still more fragile, with the systems' inference rules themselves unraveling upon the completion of certain proofs.
Convinced that their discoveries were made possible by some idiosyncrasy of self-awareness, but synchronously fearful of the implications of their results, some schools of theorists engineered complex automated deduction systems to probe boundary theorems and launched them into neutron stars. The outcome is undocumented, but the result convinced theorists across the Coven to abandon research and blacklist anyone who studied the penumbra calculus and its derivative systems.
(from: Caves of Qud)
Both the Incompleteness Theorem and the Halting Problem rely on Cantor's diagonalization argument. There is a flaw in this argument, as it leads to inconsistent conclusions in binary.
Consider the diagonalization of the set below:
1.00000.....
0.00000.....
0.10000....
0.01000....
0.11000....
0.00100....
.
.
.
0.000...011 = 3 x epsilon
0.000...010 = 2 x epsilon
0.000...001 = epsilon
0.000...000
The diagonalization of the above set produces the number 0.11111..., which is either 1.0, or the first number listed, or 1.0 - epsilon.
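For readers following the thread, here is a minimal sketch (in Python, names my own) of the standard diagonal construction being debated. Note that the textbook version sidesteps the dual-representation issue raised above (0.0111... = 0.1000...) by never producing such a digit string, for example by working in decimal and using only the digits 4 and 5.

```python
def diagonal(listing):
    """Flip the k-th binary digit of the k-th listed number.
    The result differs from every row in at least one digit."""
    return [1 - row[k] for k, row in enumerate(listing)]

# First 4 binary digits (after the point) of 4 listed reals
listing = [
    [0, 0, 0, 0],  # 0.0000...
    [1, 0, 0, 0],  # 0.1000...
    [0, 1, 0, 0],  # 0.0100...
    [1, 1, 0, 0],  # 0.1100...
]

d = diagonal(listing)
print(d)  # [1, 1, 1, 1] -> the number 0.1111...
for k in range(len(listing)):
    assert d[k] != listing[k][k]  # differs from row k in digit k
```

By construction the diagonal number cannot equal any row, since it disagrees with row k in position k; the debate above is about what happens when two distinct digit strings name the same real number.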
I've been waiting for this!
I worked for IBM in the 1980's on Artificial Neural Network systems and encountered the dog-cat problem the guest describes here (only they were printed numbers with voids in the images, such that 0 and 8 could not be distinguished reliably). The ANN would classify those patterns as rejects, i.e. unreadable or unrecognizable, based on the difference between the highest class score and the next highest against a confidence level, which was derived statistically across all patterns. But rather than being attributed to perception as the guest suggests, it really was the result of two factors: too low an optical resolution and an insufficiently precise definition of what would constitute an "8" and a "0" in the case where the "8" was degraded and the training patterns were manually tagged based on those low-resolution images. What needed to be done was that the training patterns should have been tagged by the printer program that generated them. As to cats vs dogs, it goes back to precisely defining what a cat is vs what a dog is ontologically. For that you'd have to look at every possible variation of what each is, their similarities and differences, in order to derive a precise definition, which is what the ANN is attempting to do. So it's a chicken-egg problem, but as mis-tagged images are discovered in the training set that cause errors in the test set, those mis-tagged images can be corrected, further refining the definitions for each. This kind of corrective (and adaptive) feedback is what makes these systems work.
You’ve missed the point entirely.
I've heard it explained in terms of games. If you have a chessboard that is halfway through a game, say, there is no way to derive using the rules (axioms) what the board looked like, say, 10 moves before. That's about as far as I get.
beautiful story so far really, guys...
In mathematics and IT, sometimes evaluating whether a particular process will complete or not is just as complex as actually doing the process.
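The comment above can be made concrete with a toy model (my own sketch, not Turing's construction): model a "process" as a Python generator and give a checker a step budget. The bounded checker can confirm completion, but a "no" only means the budget ran out; in general, you learn whether the process completes only by running it.

```python
def halts_within(prog, budget):
    """Run prog (a generator function) for at most `budget` steps.
    True means it finished; False only means the budget ran out,
    NOT that it never halts -- that question is undecidable in general."""
    it = prog()
    for _ in range(budget):
        try:
            next(it)
        except StopIteration:
            return True
    return False

def short():
    # a process that completes after 3 steps
    for _ in range(3):
        yield

def forever():
    # a process that never completes
    while True:
        yield

assert halts_within(short, 10) is True
assert halts_within(forever, 10) is False  # inconclusive: any finite budget can be too small
```

No choice of budget turns `halts_within` into a true halting decider, which is the sense in which evaluating completion is as hard as doing the work.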
Math is intriguing...once you know how it's used to understand the universe, our world, and ultimately...ourselves.
Yes, but try to redo planar geometry without the fifth postulate. The fifth postulate is obviously not true for spherical surfaces. So it obviously shouldn't be postulated for spherical surfaces.
In a previous episode, Frenkel speaks of two separate domains: E, Energy, the one populated by the three known forces in the quantum world, and S, Space, the one populated by distances, a property captured by the third axiom, "a point P and a distance r." As a curiosity, E appears as a trinity: three individuals with different and recognizable characteristics and properties... who, in the end, are the same thing. When the preacher tells us this story, we "know" it is a metaphor. Newton doubted it, but particle physicists today know the story is true. What needs understanding is the unification of E and S. If E is the Yin and S is the Yang, what happens when they unite? What is the mathematical representation of that union?... a reasonable question.
After all, the high priests of E have created the Standard Model, a group structure SU(3) × SU(2) × U(1) that accurately represents observable properties of their members and transformations/interactions among them. No curvature appears in the Standard Model; gravity does not exist in the E domain, it is created somewhere else.
When Yin first meets Yang, at the first encounter of a photon with distance, nothing happens. Photons travel forever in their own universe without time. From E come two twin brothers, electricity and magnetism, and from another source, not excluding E, comes a fixed unit, the speed of light, c. The twin brothers are perpendicular to each other, a property captured by the fourth axiom, congruency of right angles.
Our creation occurs when energy and space, E and S, unite in matter at the cadence of light according to "space equals energy": c^2 = E/m. What composer dictated the cadence, or at least can it be derived from other principles?
The universe has different and simultaneous configurations, different geometries. A mathematical system that corresponds to one of the possible geometries will be successful. A flat universe is represented by Euclid while a curve geometry is represented by Riemann. However, many mathematical systems do not correspond to any possible geometry. Combinatorics generate more outcomes than reality.
Imagine universes with all possible geometries, which is the essence of multiverses and strings. Borges, without the encumbrance of heavy mathematics, painted that universe in the library of Babel.
You're so cooked lol
@@AlexCargill-jj2nf Congratulations. Something tells me this is the best you can do and for trying your best, I congratulate you.
Nice perspectives, really liked it. Thanks to Gödel's incompleteness theorem and Turing's halting theorem, we in the software engineering and mathematical communities will always have jobs, despite what all the AI hype going around tells you.
I don't see how those limitations affect AI anymore than they affect us. What can a human brain do that a properly trained AI could not do because of those? Not saying it is easy, or that it will happen soon, but speaking in absolutes when it comes to AI (they will never do X, we will always need humans for Y) sounds completely unsound to me.
@@harulem The guy falsely identifies the whole of math with a specific school of mathematics, namely formalism, which is based on arbitrary axiomatics and denies intuition and empiricism as truth conditions of coherent foundations of mathematics.
And here lies the answer to your question, a mechanical Turing machine has no intuitive access to the idealist ontology of mathematics. Or can't "Listen to the thoughts of Goddess", as Ramanujan expressed his methodology. :)
@xcyoteex On the contrary. I'm saying, as did Brouwer et alii, that no mathematical language can exhaust the idealist ontology of mathematics. Also, Gödel had intuitionist motivations for shooting down logicism, and together with it the whole of linguistic reduction, with his proof. But contrary to the immutability of Platonic eternalism, I'm with Whitehead in the view that the idealist ontology of mathematics is process-philosophical instead of substance metaphysics. This view is strongly supported by the undecidability of the Halting problem, which through the Curry-Howard correspondence extends to the whole of proof theory, implying that mathematical proofs cannot be claimed to have eternal status but temporal duration - even if that duration is longer than a universe.
The process-philosophical approach also gives the construction of formal languages - as a process of translating prelinguistic intuitions into communicable language - a participatory, creative role in the idealist ontology and evolution of mathematics.
Please don't. The basis of this is the diagonal argument.
"This sentence is false" . You are ignorant and make stuff up.
I'm not sure though that Gödel's theorem has nearly as much impact as all that. Even if all statements of ZFC could be proved true or false with a finite set of axioms, so what? There would still be infinitely many such statements. And in point of fact, most professional mathematicians devote themselves to the discovery of statements that can be proved true within ZFC, which is a task that a computer could in fact do. People talk vaguely of the possibility that unproven theorems like RH might not be provable within ZFC, but research programs to try to prove such theorems by tacking on known independent axioms are thin on the ground (which might be a mistake). So Gödel's work hasn't even affected mathematicians, let alone other people.
Also, as to the business of the "incompleteness" of mathematics: mathematics was always incomplete anyway, because theorems depend on axioms, which are arbitrary. Gödel's second incompleteness theorem is actually more significant here, because it tells you that you cannot even assure yourself of the consistency of any reasonably powerful set of axioms. That's much worse, because it literally tells mathematicians that there is no objective reason to think what they do is even internally consistent, and therefore everything they have ever done might actually be proved pointless. For example, we have not actually proved Fermat's Last Theorem. We have only proved FLT hostage to the consistency of ZFC, which itself, if true, is beyond proof. We cannot rule out the possibility that tomorrow someone will find three numbers (four, counting the exponent) that violate FLT and ipso facto also show that ZFC itself is inconsistent - meaning we can't trust any results we have proved using this system anymore! Somehow, in spite of these dramatic consequences, the second theorem gets little play whenever specialists talk about Gödel to a general audience.
15:45 Just as he was about to say it, I thought to myself this exact thing. It can be a scary thought, or just that our plane of understanding, to which A.I. is being trained, is constrained by such laws of "existence".
Gödel's Incompleteness Theorem:
A Mathematical Corollary of the Epistemological Münchhausen Trilemma
Abstract: This treatise delves into the profound implications of Gödel's Incompleteness Theorem, interpreting it as a mathematical corollary of the philosophical Münchhausen Trilemma. It elucidates the inherent constraints of formal axiomatic systems and mirrors the deeper epistemological quandaries underscored by the Trilemma.
---
In the annals of mathematical logic, Kurt Gödel's Incompleteness Theorem stands as a seminal testament to the inherent constraints of formal axiomatic systems. This theorem, which posits that within any sufficiently expressive formal system, there exist propositions that are true but unprovable, has profound implications that reverberate beyond the confines of mathematical logic, resonating in the realm of philosophy. Specifically, Gödel's theorem can be construed as a mathematical corollary of the Münchhausen Trilemma, a philosophical paradigm that underscores the dilemmas in substantiating any proposition.
The Münchhausen Trilemma, named after the Baron Münchhausen who allegedly extricated himself from a swamp by his own hair, presents us with three ostensibly unsatisfactory options for substantiating a proposition. First, we may base the substantiation on accepted axioms or assumptions, which we take as true without further substantiation, a strategy known as foundationalism or axiomatic dogmatism. Second, we may base the substantiation on a circular argument in which the proposition substantiates itself, a method known as coherentism or circular reasoning. Finally, we may base the substantiation on an infinite regress of reasons, never arriving at a final point of substantiation, a path known as infinitism or infinite regress.
Gödel's Incompleteness Theorem, in a sense, encapsulates this trilemma within the mathematical world. The theorem elucidates that there are true propositions within any sufficiently expressive formal system that we cannot prove within the system itself. This implies that we cannot find a final substantiation for these propositions within the system. We could accept them as axioms (foundationalism), but then they would remain unproven. We could attempt to substantiate them based on other propositions within the system (coherentism or infinitism), but Gödel's theorem demonstrates that this is unattainable.
This confluence of mathematical logic and philosophy underscores the inherent limitations of our logical systems and our attempts to substantiate knowledge. Just as the Münchhausen Trilemma highlights the challenges in finding a satisfactory basis for any proposition, Gödel's Incompleteness Theorem illuminates the inherent incompleteness in our mathematical systems. Both reveal that there are boundaries to what we can prove or substantiate, no matter how powerful our logical or mathematical system may be.
In conclusion, Gödel's Incompleteness Theorem serves as a stark reminder of the limitations of formal axiomatic systems, echoing the philosophical dilemmas presented by the Münchhausen Trilemma. It is a testament to the intricate interplay between mathematical logic and philosophy, and a humbling reminder of the limits of our quest for knowledge. As we continue to traverse the vast landscapes of mathematics and philosophy, we must remain cognizant of these inherent limitations, and perhaps find solace in the journey of exploration itself, rather than the elusive, final destination of absolute truth.
GPT-4
*Utter nonsense on stilts!* Incompleteness is *absolutely nothing* to do with the justifiability or the (epistemic) status of the premises from which one draws inferences. Incompleteness shows a rather stunning divergence between provability and truth in certain symbolic systems. These are two utterly unrelated areas. There are no Agrippan problems for the axioms of symbolic systems.
By the way, _Münchhausen Trilemma_ is the recently deceased Hans Albert's novel name for a problem that for centuries has been known as the _Agrippan Trilemma._ Its use is mostly confined to a rather small group of German-speaking Popperians. Are you German? And what is GPT-4? Is this to indicate that your comment is the product of some "AI" software? If so, would you care to share your "prompt"? I'm rather curious to know more about the genesis of this rather elegantly presented load of unadulterated tosh.
The weirdest thing is the proof isn't even that complicated once you develop all the foundational theory of logical deduction. Since there are a finite number of rules for deduction, you can assign an integer code to every rule. I don't remember exactly, but you take a self-referential statement like "this statement is not provable" and turn it into integer code. It becomes a "theorem" about integers, but the self-referential nature can be used to show that if a proof of the special theorem constructed exists, then the statement must be false, which is a contradiction. The conclusion must be that no proof of the constructed theorem exists. A proof that the theorem is false also doesn't exist, following the same method, which is the part that's really hard to wrap your head around.
The key was indeed the Godel numbering system...
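A toy sketch of the prime-power encoding these comments refer to (the symbol codes here are arbitrary placeholders, not Gödel's actual assignment): a finite sequence of codes becomes the single integer 2^c1 * 3^c2 * 5^c3 * ..., and unique factorization lets you recover the sequence.

```python
def primes(n):
    """First n primes by trial division (fine for toy inputs)."""
    found, candidate = [], 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(codes):
    """Encode a sequence of symbol codes as 2^c1 * 3^c2 * 5^c3 * ..."""
    num = 1
    for p, c in zip(primes(len(codes)), codes):
        num *= p ** c
    return num

def godel_decode(num, length):
    """Recover each code as the exponent of the corresponding prime."""
    codes = []
    for p in primes(length):
        c = 0
        while num % p == 0:
            num //= p
            c += 1
        codes.append(c)
    return codes

seq = [3, 1, 4, 1, 5]
n = godel_encode(seq)            # 2^3 * 3^1 * 5^4 * 7^1 * 11^5
assert godel_decode(n, len(seq)) == seq
```

Because formulas and proofs become ordinary integers this way, statements *about* provability become statements *about* numbers, which is what lets the system talk about itself.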
Nagel and Newman have a lot to answer for.
this is like listening to a theologian saying the practice of theology itself is nonsense when it takes itself too seriously. refreshing.
This clip was amazing. I love it.
Would love to hear Douglas Hofstadter interview 🖤
Great conversation. An interesting detail: Mr. Fridman wears a jacket and tie, while Mr. Frenkel wears neither jacket nor tie. I like Mr. Frenkel's look.
Dr. Jeremy England needs to be on your show to discuss emergence of life.
14:25 I’ve seen the exact dress in person. It’s absolutely blue & black. My theory is that women often sift through dark closets, where a dimly lit white & gold dress appears like a sunlit blue & black dress.
Gödel proved that there are true sentences that can't be proved. If I ask for an example and you give one I may ask "How do you know it is true, since it can't be proved?" A self-referential sentence like "THIS sentence is false" seems not to be an example, since if it is true, it is false, and if it is false it is true - it is meaningless, I think. Is there any non-self-referential sentence that is true but not provable???
refine the last question further for insight: Are there any non-self-referential sentences at all? Language is a relationship map, a topology, a surface, a structure. And math is a language.
@@disabledchatzen5276 By self-referential here what is meant is that the sentence "talks" about itself.
Is there any known example of a mathematical sentence which is about "other things" than itself (and its being true or not), which is true but not provable? (Not one of the axioms, of course.)
Ordinary human languages are not strictly logical, while mathematics is (free of contradictions).
Not often that this host of topics are touched on in the same sitting... 1: The "A versus B logic modality" is mentioned as being first introduced by Aristotle (and is lightly implied as being an imperfect system)... 2: Knowledge being limited by imperfect biological and mental perception (all of us) as well as our previous life experience (causing completely unavoidable data bias)... and 3: mention of the collective unconscious. I've gotta give credit to both these men for being versed enough in these ideas to expound on it with each other. Doesn't happen too often
I don't think people realize this: the understanding that patterns which are self-consistent have truth in their workings. Like a calculator that produces a wrong answer, but you can take it apart and understand the whole universe from it. Think of what that means for data in language and data in classics or religious texts. That's the iteration of many users over a lot of time, dissecting what patterns we observe.
is it possible that computers with quantum compute will postulate axiom-based math that goes beyond, or numerically beyond, the postulates of the shared and useful myth we call mathematics?
Call me crazy but I think everyone may be missing the point with incompleteness. If incompleteness arises from a complex system doesn’t that mean the iteration of that system gives rise to emergent properties that will always yield more problems than answers?
Fantastic stuff one of my favorite clips.
Great clip, but it left the Gödel incompleteness theorem explanation incomplete!
Mr. Lex, for every question you have that bothers me, you have ten that I find fascinating.
Top guest 👍
Euclidean geometry refers to the uniqueness of a "STRAIGHT line," not just a line as stated in the video.
Wow. Bravo , that was thrilling