Timestamps, Links, & Sponsors:
- Brilliant: brilliant.org/TOE for 20% off
- Henson Shaving: hensonshaving.com/everything and enter code EVERYTHING at checkout to get 100 free blades with your purchase.

LINKS MENTIONED:
- Tim Maudlin's podcast on TOE: ruclips.net/video/fU1bs5o3nss/видео.html
- Primacy of Doubt (Tim Palmer's book): amzn.to/3Oo55k7
- Metaphysics Within Physics (Tim Maudlin's book): amzn.to/3pXHNcn
- Quantum Non-Locality and Relativity (Tim Maudlin's book): amzn.to/44QoS2F
- GoFundMe for "The John Bell Institute" initiative: www.gofundme.com/f/a-permanent-home-for-the-john-bell-institute
- Speakable and Unspeakable in Quantum Mechanics (John Bell's book): amzn.to/43RyQ2t

TIMESTAMPS:
00:00:00 Introduction
00:02:04 Explaining Superdeterminism & Fractal Cosmology
00:05:06 What is Tim Palmer working on
00:09:24 What is Tim Maudlin working on
00:11:47 Assumptions of Bell's inequality / theorem
00:22:40 Locality and Superdeterminism
00:28:33 Summary of disagreement + why do we care what Bell said?
00:32:54 Counterfactuals & Counterfactual definiteness
00:59:38 Chaos theory, attractors, and fractals
01:26:32 Free variables and ensembles
01:36:30 Invariant set theory
01:48:21 Relevant links and teaser for Part 2
"Correlation is not causation," he says. I guarantee one of the most fundamental concepts here is: in reality, if you are born, then guess what, you're going to die!
Conservation of Spatial Curvature (both Matter and Energy described as "Quanta" of Spatial Curvature) Is there an alternative interpretation of "Asymptotic Freedom"? What if Quarks are actually made up of twisted tubes which become physically entangled with two other twisted tubes to produce a proton? Instead of the Strong Force being mediated by the constant exchange of gluons, it would be mediated by the physical entanglement of these twisted tubes. When only two twisted tubules are entangled, a meson is produced which is unstable and rapidly unwinds (decays) into something else. A proton would be analogous to three twisted rubber bands becoming entangled and the "Quarks" would be the places where the tubes are tangled together. The behavior would be the same as rubber balls (representing the Quarks) connected with twisted rubber bands being separated from each other or placed closer together producing the exact same phenomenon as "Asymptotic Freedom" in protons and neutrons. The force would become greater as the balls are separated, but the force would become less if the balls were placed closer together. Therefore, the gluon is a synthetic particle (zero mass, zero charge) invented to explain the Strong Force. An artificial Christmas tree can hold the ornaments in place, but it is not a real tree. String Theory was not a waste of time, because Geometry is the key to Math and Physics. However, can we describe Standard Model interactions using only one extra spatial dimension? What did some of the old clockmakers use to store the energy to power the clock? Was it a string or was it a spring? What if we describe subatomic particles as spatial curvature, instead of trying to describe General Relativity as being mediated by particles? Fixing the Standard Model with more particles is like trying to mend a torn fishing net with small rubber balls, instead of a piece of twisted twine. Quantum Entangled Twisted Tubules: “We are all agreed that your theory is crazy. 
The question which divides us is whether it is crazy enough to have a chance of being correct.” Niels Bohr (lecture on a theory of elementary particles given by Wolfgang Pauli in New York, c. 1957-8, in Scientific American vol. 199, no. 3, 1958) The following is meant to be a generalized framework for an extension of Kaluza-Klein Theory. Does it agree with the “Twistor Theory” of Roger Penrose, and the work of Eric Weinstein on “Geometric Unity”? During the early history of mankind, the twisting of fibers was used to produce thread, and this thread was used to produce fabrics. The twist of the thread is locked up within these fabrics. Is matter made up of twisted 3D-4D structures which store spatial curvature that we describe as “particles”? Are the twist cycles the "quanta" of Quantum Mechanics? When we draw a sine wave on a blackboard, we are representing spatial curvature. Does a photon transfer spatial curvature from one location to another? Wrap a piece of wire around a pencil and it can produce a 3D coil of wire, much like a spring. When viewed from the side it can look like a two-dimensional sine wave. You could coil the wire with either a right-hand twist, or with a left-hand twist. Could Planck's Constant be proportional to the twist cycles? A photon with a higher frequency has more energy. (E=hf; more spatial curvature as the frequency increases = more Energy.) What if gluons are actually made up of these twisted tubes which become entangled with other tubes to produce quarks? (In the same way twisted electrical extension cords can become entangled.) Therefore, the gluons are a part of the quarks. Quarks cannot exist without gluons, and vice-versa. Mesons are made up of two entangled tubes (Quarks/Gluons), while protons and neutrons would be made up of three entangled tubes (Quarks/Gluons). The "Color Charge" would be related to the XYZ coordinates (orientation) of entanglement. "Asymptotic Freedom" and "flux tubes" are logically based on this concept.
The Dirac “belt trick” also reveals the concept of twist in the ½ spin of subatomic particles. If each twist cycle is proportional to h, we have identified the source of Quantum Mechanics as a consequence of twist cycle geometry. Modern physicists say the Strong Force is mediated by a constant exchange of Mesons. The diagrams produced by some modern physicists actually represent the Strong Force like a spring connecting the two quarks. Asymptotic Freedom acts like real springs. Their drawing is actually more correct than their theory and matches perfectly with what I am saying in this model. You cannot separate the Gluons from the Quarks because they are a part of the same thing. The Quarks are the places where the Gluons are entangled with each other. Neutrinos would be made up of a twisted torus (like a twisted donut) within this model. The twist in the torus can either be Right-Hand or Left-Hand. Some twisted donuts can be larger than others, which can produce three different types of neutrinos. If a twisted tube winds up on one end and unwinds on the other end as it moves through space, this would help explain the “spin” of normal particles, and perhaps also the “Higgs Field”. However, if the end of the twisted tube joins to the other end of the twisted tube, forming a twisted torus (neutrino), would this help explain “Parity Symmetry” violation in Beta Decay? Could the conversion of twist cycles to writhe cycles through the process of supercoiling help explain “neutrino oscillations”? Spatial curvature (mass) would be conserved, but the structure could change. ===================== Gravity is a result of a very small curvature imbalance within atoms. (This is why the force of gravity is so small.) Instead of attempting to explain matter as "particles", this concept attempts to explain matter more in the manner of our current understanding of the space-time curvature of gravity. If an electron has qualities of both a particle and a wave, it cannot be either one.
It must be something else. Therefore, a "particle" is actually a structure which stores spatial curvature. Can an electron-positron pair (which are made up of opposite directions of twist) annihilate each other by unwinding into each other, producing Gamma Ray photons? Does an electron travel through space like a threaded nut traveling down a threaded rod, with each twist cycle proportional to Planck’s Constant? Does it wind up on one end, while unwinding on the other end? Is this related to the Higgs field? Does this help explain the strange ½ spin of many subatomic particles? Does the 720 degree rotation of a ½ spin particle require at least one extra dimension? Alpha decay occurs when the two protons and two neutrons (which are bound together by entangled tubes) become un-entangled from the rest of the nucleons. Beta decay occurs when the tube of a down quark/gluon in a neutron becomes overtwisted and breaks, producing a twisted torus (neutrino), an up quark, and the ejected electron. The production of the torus may help explain the “Symmetry Violation” in Beta Decay, because one end of the broken tube section is connected to the other end of the tube produced, like a snake eating its tail. The phenomenon of Supercoiling, involving twist and writhe cycles, may reveal how overtwisted quarks can produce these new particles. The conversion of twists into writhes, and vice-versa, is an interesting process, which is also found in DNA molecules. Could the production of multiple writhe cycles help explain the three generations of quarks and neutrinos? If the twist cycles increase, the writhe cycles would also have a tendency to increase. Gamma photons are produced when a tube unwinds, producing electromagnetic waves. (Mass = 1/Length) The “Electric Charge” of electrons or positrons would be the result of one twist cycle being displayed at the 3D-4D surface interface of the particle.
The physical entanglement of twisted tubes in quarks within protons, neutrons, and mesons displays an overall external surface charge of an integer number. Because the neutrinos do not have open tube ends (they are a twisted torus), they have no overall electric charge. Within this model a black hole could represent a quantum of gravity, because it is one cycle of spatial gravitational curvature. Therefore, instead of a graviton being a subatomic particle, it could be considered to be a black hole. The overall gravitational attraction would be caused by a very tiny curvature imbalance within atoms. We know there is an unequal distribution of electrical charge within each atom, because the positive charge is concentrated within the nucleus, even though the overall electrical charge of the atom is balanced by equal positive and negative charge. In this model Alpha equals the compactification ratio within the twistor cone, which is approximately 1/137. 1 = Hypertubule diameter at the 4D interface; 137 = the cone’s larger end diameter at the 3D interface where the photons are absorbed or emitted. The 4D twisted Hypertubule gets longer or shorter as twisting or untwisting occurs. (720 degrees per twist cycle.) How many neutrinos are left over from the Big Bang? They have a small mass, but they could be very large in number. Could this help explain Dark Matter? Why did Paul Dirac use the twist in a belt to help explain particle spin? Is Dirac’s belt trick related to this model? Is the “Quantum” unit based on twist cycles? I started out imagining a subatomic Einstein-Rosen Bridge whose internal surface is twisted with either a Right-Hand twist or a Left-Hand twist, producing a twisted 3D/4D membrane. This topological Soliton model grew out of that simple idea. I was also trying to imagine a way to stuff the curvature of a 3D sine wave into subatomic particles.
I think Mr Palmer is mixing up physics with philosophy, which is what all scientists tend to do. Physics involves cognition and measurement, whereas philosophy takes place in abstraction. The relationship of physics to mathematics, logic, and philosophy is that physics is only valid within the boundaries of its mathematics, which is itself derivative of logic and philosophy. A derivative has nothing to do with precision or measurement; it stands true at the limits regardless of precision. Logic is the basis of all branches of mathematics, which is a tool for physics. You can't disprove logic using physics. This is why conclusions from observations turn out to be wrong once a new parameter is discovered, and yet the logic remains true. This idea of hidden variables will only be valid once the variables are no longer hidden. Relativity is non-local and Einstein knew that, but only after the presentation of his theory.
Got a crate of my favorite whisky arriving on my doorstep tomorrow for this convo. It's good to have some me-time together with my imaginary best friends Tim & Tim for three hours. Sit back, relax, shout at the screen and sometimes type some angry youtube comments. Good to have some male nerd self-care sometimes ❤
1:24:30 Dr Maudlin states that the value of the Nth digit of pi doesn't give any information about what's been going on in the Universe before running the test setup. This is true for TOEs (practically all so far) that take as given:
- the algebra to be applied
- the number system (reals vs rationals, for instance)
- the metric
- the constants of nature

The value of pi as a math concept is based on an infinite series of terms. I'm working on an alternative model where the value of pi is refined at each iteration in the evolution of a universe, based on an ever-growing number of logarithmic poles. Hence Dr Maudlin's argument remains valid only when we specify that such mathematical objects exist as physically meaningful values in a potentially finite universe. Btw, in my model the logarithmic potential itself surrounding these poles also evolves (by addition of terms in a series) along with the increasing number of poles in the system. Funnily enough, these effects seem to counterbalance each other to keep Euler's identity true whatever the number of poles, from 2 upwards. Otherwise, Dr Maudlin is tilting at windmills, and I highly appreciate his efforts, especially the ideas about how to modify the number system at the base of physics from real numbers to rational ones.
I, a lowly wind turbine technician, struggle to understand a majority of what was discussed here but I do take pleasure in listening to highly educated individuals disagree in such a constructive way. You, Curt, are an invaluable host in this podcasting space 🙏
Wow! That got the brain working overtime. At a personal level, it's prompted me to clarify obscurities in my own understanding. Always a good thing! Thanks to all three of you (and the behind-the-scenes support!).
Great stuff Curt. It's super refreshing to hear my favorite thinkers in physics being interviewed by someone competent in the subject who asks the questions I would have wanted to ask.
Prof Palmer, how honored I am to meet you for the first time. I have been following Tim M for years now and saw this talk with him. What an honorable man you are to show respect for Tim M's work. Maybe with more people like both of you educating our planet, our earth might not be doomed to destruction in the next few hundred years. I have also been following polar bears and just educated myself about global warming. Thank you both for your time!!!
A beautiful discussion. I truly enjoyed watching these two make such efforts to understand one another. It gives one a good sense for how much room there is for discussion and further thought in the sciences and philosophy. Thanks for making it happen gentlemen, especially you, Curt.
Having worked with more than 60 graduate students in 3 different research groups, it’s important to point out that there are academically talented students who ALSO have limited independent problem-solving skills, lack practical real-world experience, and lack the ability to construct sophisticated conceptual models built upon foundational ones. Clear thinking takes time to evolve. If they do not improve under mentoring, they will not become good experimentalists, but will instead do academic, mathematical, analytical work based on existing literature. They can be really intelligent but do not necessarily become creative problem solvers outside of known, well-established works in their chosen field. This comment is not strictly about experimentalists versus non-experimentalists, but about the ability to think independently. As the person responsible for creating an apparatus for an experiment, on more than one occasion I had to block the involvement of some graduate students in experimental work because they were ill-suited for the task. My own experience has been that about one out of 12 students is the exception regarding their creative problem-solving skills. I am seeing evidence of this trained academic versus thoughtful creative thinking in this discussion.
Palmer is autistic with a special interest in fractals, so he's stuck on synthesizing his fractal interest into this topic. I know virtually nothing about Bell's work, but here's what I take from the discussion of it. He means to solve the problem of being unable to actually carry out the same exact experiment more than once. The universe is expanding, there's some humidity in our tools, a deer farted in the forest, etc., so we need a way to cope with that reality. If we're just talking about binary outcomes, a function in the programming sense is a perfect analogy. In code, say you have a function that takes an integer as a parameter and returns a boolean. The function logic could be anything. We can land anywhere at all in pi and take any number of digits, and pass that in as an integer to our function, and we'll get back a boolean when we execute the code. We could run the program again using the same value for the parameter and get the same result. The experiment has been superficially duplicated, but in the real world, different electrons ran through the circuits of our computer, and the computer temperature was not exactly the same. Every time we run the function, we're using different initial conditions on some level. That's one aspect. Another is that the function code could include a random number generator, so we really don't know what to expect, even if we provide the same input multiple times. Applying this analogy to the discussion in the video, Palmer wants to presume that the input parameters are more important than they really are because he fails to imagine that the function could contain a randomizing element to its logic. He thinks the randomness of the input parameters is all that's going on. He sees nothing wrong with his suggestion that every measurement can be done with a different lambda because he confuses the slight temperature change in the circuit board running our function with running a completely different function. 
We could use a different lambda for each measurement if we knew that each one was deterministic, but only if we used the same value for the parameter on each measurement, meaning that we'd have inverted everything, measuring the lambdas against an input instead of measuring inputs against a given lambda. If I had to hazard a guess at what the "hidden variables" question is about, I'd say they are analogous to global program state that may or may not be referenced within our function. Could be way off on all of that, but if you've made it this far, thanks for indulging my brain dump after watching this video.
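The function analogy in the comment above can be sketched in a few lines of code. This is a toy sketch only; the function bodies are made up purely for illustration and have nothing to do with Bell's actual formalism. The point is the commenter's distinction: a deterministic function reproduces its output for the same input, while one with a hidden randomizing element does not.

```python
import random

def deterministic_f(n: int) -> bool:
    # All of the logic depends only on the input parameter.
    return n % 7 < 3

def randomized_f(n: int) -> bool:
    # Same signature, but with a hidden randomizing element:
    # the same input can yield different outputs across runs.
    return (n + random.randrange(7)) % 7 < 3

digits = 14159  # e.g. some digits pulled from pi, used as the input

# "Superficially duplicating the experiment": the deterministic
# function gives the same boolean every time for the same input.
first = deterministic_f(digits)
assert all(deterministic_f(digits) == first for _ in range(100))

# The randomized function also returns a boolean, but repeated
# calls with the same input need not agree with one another.
results = {randomized_f(digits) for _ in range(1000)}
```

With a randomizer inside, `results` will almost certainly contain both True and False, which is the point being made: identical inputs do not guarantee identical outcomes if the function itself is not deterministic.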
@@shanerigsby9030 Lambda is, by definition, the parameter that defines the result of the experiment. It's not just one of a bunch of properties that do, like in your example. If you define lambda this way, what Palmer says is completely true. Unfortunately he isn't successful in getting his point across, because he never specifies that the derivation of Bell's inequality rests on your ability to perform an integral over multiple lambdas (more technically rhos, which are functions of lambda). That's where the ensemble part the other Tim talks about comes into the picture, through an integral. But if the integral is not mathematically legitimate, then all bets are off.
"Clear thinking...good experimentalists" versus "mathematical...work based on existing literature"? Your experiments are based on existing lit. I'm glad I'm not one of your students. I don't think you should be employed to make any decisions about any student's potential. You're a sciolist, i.e. you claim knowledge which you do not possess.
I believe that a debate between a scientist and a philosopher is not fair. They both talk in different languages. And I always trust a scientist over a philosopher. Both can be wrong or right. But I trust the scientific method over imaginative deductions.
I usually would agree with you, but in this case neither of the gentlemen understand physics even at the undergrad level. They are both very good at pretending, though. ;-)
Agree. I wish Maudlin would go along with Palmer's point so we could see where it takes us. It seems arbitrary to pick on that assumption when there are so many interpretations. There are so many sillier things in QM than the idea that a difference in the millionth digit of pi (or whatever) could have an impact on a single electron.
Only have one question: do you or do you not believe in, or have evidence for, a fourth spatial dimension‽ If yes, then an infinite 3D universe potentiality is possible.... You're trying to exclude yourself from infinity!!!!! This is my first comment. Will leave more as the video progresses under a separate thread.
@@AquarianSoulTimeTraveler I'm not sure if I "believe" in a fourth spatial dimension, because I do not have enough evidence for it, but still I would assume that it might be possible. The world of quantum physics allows for weird things to happen. Including Tim talking to another separate version of himself.
@@AquarianSoulTimeTraveler I'm not sure why a fourth spatial dimension would necessarily cause the universe to be infinite. It may be infinite without it. I'm not trying to start shit... just trying to understand your thought process. I may be missing something.
I have a PhD in math and I fail to see why physicists call Bell's argument (and the resulting inequality) a _theorem_ when they even disagree on the hypotheses! But it's even worse than that: Bell's argument doesn't even define all of its terms as unambiguous mathematical objects. So Bell's argument is "just" a very important rational argument that comes with a computation of probabilities associated to mathematically ambiguous entities (such as "a theory"). It's *not* a mathematical theorem.
@@JohnnyComelately-eb5zv Neither. There is a step in the derivation of any inequality of Bell's type which amounts to saying that measuring a pair of entangled particles at angles A and B, and another DIFFERENT pair at angles B and C, is the same as measuring a single particle pair at both angle pairs AB and BC, which is obviously an impossible measurement. It's all about a wrong application of counterfactual definiteness; without this wrong step the absolute bound is 4, which is obviously more than the 2√2 of quantum mechanics, i.e. no violation.
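For reference, the three numbers being compared in this thread (the local bound 2, the quantum value 2√2, and the algebraic maximum 4) can be checked numerically. A minimal sketch assuming the textbook CHSH setup, where the singlet-state correlation at analyzer angles a and b is E(a, b) = -cos(a - b); the angle choices below are the standard maximally violating settings:

```python
import math

def E(a: float, b: float) -> float:
    # Quantum-mechanical correlation for the singlet state
    # at analyzer angles a and b (in radians).
    return -math.cos(a - b)

# Standard maximally violating CHSH settings.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination of the four correlations.
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))

print(S)  # 2*sqrt(2) ≈ 2.828: above the local bound 2, below the algebraic max 4
```

Local hidden-variable models (with the statistical-independence assumption) bound S by 2; quantum mechanics reaches 2√2; only dropping the inequality's assumptions altogether leaves the trivial algebraic bound of 4.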
The Bell experiment result table should be sent back to the particle entangler to tell it to entangle a pair or not in various ways, and see if anything unexpected happens on subsequent output, perhaps looping back the result many times, up to billions of times, in clever ways both logically and epicyclically. Feedback, or closing the circle, might reveal something just as profound as interference did in the double-slit experiment.
I absolutely love how far the discussion of superdeterminism has come over the past few years. It used to be that to even talk about superdeterminism you'd get aspersions thrown your way that you either believe in a universal conspiracy or that without statistical independence you couldn't do science at all! Both so called objections are just attempts at dismissing superdeterminism without engaging with the subject. I'm glad to see that it's getting a wider acceptance.
There is no need to talk about superdeterminism. It doesn't buy you anything in terms of physical insight. I can tell you why the universe looks superdeterministic, but that's merely a fata morgana. Operationally looking at it doesn't lead to any usable insight.
@@lepidoptera9337 We don't know where superdeterministic theories will lead us as there aren't any out there yet. They could lead to physics beyond the standard model, who knows. It's worth pursuing.
@@trucid2 The universe looks superdeterministic even classically. If you make a measurement in the future, you are picking up energy that currently resides in spacelike separated regions of spacetime. How much energy is there is completely unknowable, i.e. the future is fundamentally unpredictable. You can play this argument at all scales and it stays valid. In particular you can ask where the energy comes from at those spacelike regions. Well, it comes from even further spacelike regions. No matter how far (back) you go, it always comes from a place that has never been visible to us before. It will, at the end of the day, lead us all the way back to the beginning of this era of the universe. That's totally useless knowledge. Unknown is unknown is unknown.
Ahhh, time to spark a joint, crack a brew-ha-ha and grab my note pad for yet another Dr. Maudlin master class. I’ve listened to, read, and kept notes on everything Dr. Maudlin has done, and I'm thankful to Curt for keeping the knowledge coming!
1:31:20 Got hard to watch here, seeing such a great mind as Prof. Palmer has, but Prof. Maudlin is just on mf’in fire. He is just seeing the code in real time. And yes, I do look at this as my type of ‘nerd wrestling’ 😂 because I see this as being in an Ancient Greek university where all these big ideas and theories are bouncing off each other and clashing between the greatest minds! Prof. Maudlin could not have been more clear; he kept asking Prof. Palmer to clarify his position, fully acknowledged what exactly the two were disagreeing about, asked how, and also tried to resolve it.
Alternatively to my previous post, you could use the results of one Bell experiment to drive a subsequent or parallel one, the first acting as the second’s pseudorandom generator. Something tells me if you let this thing feed back on itself you might see something interesting. Or you might just descend into randomness. Perhaps carefully crafted experiments might amplify a subtle effect when such feedback is applied.
A pseudorandom generator is defined as not having any cyclic attractors until its cycle repeats, and with a million-digit seed that cycle would outlive the universe. The selection of instrument orientation driven by a random coin toss of that depth would not alter the balance of the distributions in any meaningful way. To suggest so is a fallacy of composition, requiring a superdeterministic universe where even God's choices are predetermined, and free will and economic progress are predetermined as well.
It's hard to even figure out what the chaos guy is even trying to say. What does the millionth digit of the value of a variable in a dynamical system have to do with the value of the millionth observation in an experiment? It seems that he isn't talking about anything even vaguely similar to what Bell and Maudlin are talking about.
Palmer is saying that there is no such thing as probability, because if it happened, it had a probability of 1, and if it didn't, then 0. He is arguing for extreme determinism, and his rationale is that the seed values of a fractal change its path, but that's only true because a fractal is mathematically deterministic. What he fails to acknowledge is that math can only be a model of the universe. It's not the universe itself. Within a fractal, the whole universe is the mathematical model, but that's not what is at stake here. We're talking about the actual universe, and the theorem is a model to understand the real universe, not to understand a deterministic mathematical model. Determinism is absolutely at the heart of their disagreement.
A realized outcome doesn't have a probability. Technically nothing (as in no thing/system) has a probability. Physical systems can produce outcome frequencies, but only the theory produces probabilities, which are ultimately just estimators of actual frequencies. I don't know why so few people understand this. It's not rocket science.
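The distinction this comment draws (physical systems produce outcome frequencies; only the theory assigns probabilities, which estimate those frequencies) can be illustrated with a toy simulation. A minimal sketch with arbitrary numbers, not anything from the video:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

p = 0.5        # the theory's probability assignment for "heads"
n = 100_000    # number of realized trials

heads = sum(random.random() < p for _ in range(n))
freq = heads / n  # the system's realized outcome frequency

# Each individual outcome is just a fact: it happened or it didn't.
# Only the ensemble of outcomes has a frequency, and only the theory
# assigns the probability p that this frequency estimates.
print(freq)
```

No single trial "has" the probability 0.5; the number 0.5 lives in the model, and `freq` is the empirical quantity it predicts.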
I wondered if this discussion could have been summarised without all the intricacies of the Tims' respective positions. I gather that Maudlin believes a fully predetermined universe precludes scientists from doing effective science, as their attempts to capture statistically independent sets are undermined. He uses appeals to Bell as reinforcement. This position is similar to a common position on the subject of Free Will: that predetermination prevents a person from making fully rational decisions, as the decision was a foregone conclusion. It seems "compatibilism" (borrowed from the Free Will topic) is relevant here. I think it remains to be proven that, in a predetermined universe, the experimenter would have no ability to effectively partition statistical sets in order to follow general scientific methodology, EXCEPT in the case of studying entanglement; predeterminism would then become the "hidden variable" that undermines the Bell Inequality and preserves locality.
@@ThePurza Maudlin is a philosopher. He has probably never been in a lab where they are doing quantum measurements. Absolutely nothing is predetermined in the lab. The entire theory is based on a lack of determinism.
As a science enthusiast, I think the difference between Tim Maudlin's and Palmer's interpretations of quantum mechanics is that Palmer was talking about the quantum state of the atom when the atom is in a quiescent state; when the atom is undisturbed, no serious statistics is involved in understanding its quantum state, even if the atom is presumed to be in a chaotic state. However, based on the inferences, whenever the atom is in the act of being observed, statistics is involved, because in a chaotic state there are some reactive physical entities involved in the process, which I call a composite wave, that could pop in and out of existence. By virtue of this, a composite wave cannot develop any non-zero net effective force affecting the angular momentum of the observed electron in question until the atom is in the act of being observed, which in turn forces the composite wave to collapse, altering the quantum state of the electron in question. This, I believe, is what Palmer had overlooked in his analysis.
Really great theolocution, Curt; well done! Even though progress wasn’t definitively made between the two, it’s amazing to listen to/watch the views clash. I was super impressed by Curt’s interjections. He interrupted at the perfect times, when things were going nowhere, and asked a very good and perfectly pointed question to Palmer (something like “can you explain your position with ensembles?”).
Tim Palmer’s closing remark on “holistic fractal structures” being more fruitful (than Planck scale structures) for the formulation of quantum gravity is prescient.
I agree with Palmer on this (though perhaps not for the same reasons). It seems to me that the only possible way, in the near future, that we will learn about or resolve what goes on at the scales of quantum gravity is by observing the larger Universe and making inferences down at the quantum level.
Palmer understands the invariance of complex numbers or says he does, but doesn't say why a complex number has anything to do with invariance. I would like to hear an explanation of how this is so.
Curt, you sir, realize it or not, are at the same time: a young, good-looking, successful man who has a degree in theoretical physics/mathematics; a natural with anything that has to do with audio and recording equipment; and someone with great taste, from what I can gather, in every field of interest that you admit to. Which leads me to how brave you are to step on stage, if you will, with the modern-day equivalent of giants among men (in so far as science and physics and philosophy are concerned). And last but not least, you are humble, respectful, curious, studious, and hip to the most relevant literature, and you display a damn near perfect balance of letting guests speak their piece the way they wish it to be consumed, while managing to push, pull, steer, and rein in the flow and direction of the episodes, moderating between guests for debates while prodding them along. You do all this while also personally consuming, retaining, and summarizing, and politely yet academically and rigorously holding your guests to the highest standard, without surrendering any important ground, bowing down, or giving any guest any opportunity to mislead, to lazily communicate to us, the audience, or to use your platform to weaponise and/or propagandise any tool or idea or subject using their professional or academic status, reputation, or position. You are a true gem, man.
I surely enjoyed this. Much appreciated. However, it was disappointing to see Dr. Maudlin never really understand what Dr. Palmer was saying. He didn't seem to try very hard. Instead, he continually defended his perspective and attempted to undermine Palmer's, which he admittedly didn't understand. I am also a physicist… and my thesis focused on chaotic systems… and I also discovered the remarkable ignorance that most physicists maintain around chaos theory. Why is that, I often wondered… and came to the conclusion that it's too dangerous to most views on physics. For example, it is proven mathematically that a chaotic set is indistinguishable from randomness… if viewed with sufficient coarse graining. Thus, the stochasticity of quantum properties could, in fact, be deterministic chaos… and yet, inherently indistinguishable from pure randomness… so long as the resolution required to observe the determinism were below the Planck scale. Of that one fact, most physicists are strangely ignorant. Such is the general case for physicists… they are mostly ignorant of chaos theory except at a superficial level. That's a VERY big deal… because what chaos theory shows us is that the Universe IS a chaotic system. Must be. Anyway… I digress. My point is that… to anyone sufficiently knowledgeable in the domains of both quantum physics AND chaos theory… it is clear that Maudlin simply did not understand Palmer… whereas Palmer certainly understood Maudlin. That's a bit sad, because Palmer obviously hoped to share something valuable with Maudlin… namely, the perspective of a physicist who is adequately knowledgeable regarding chaos theory. (Note: When Maudlin supposes himself to be summarizing chaos theory, Palmer shakes his head "no"… clearly indicating that Maudlin didn't understand. Anyone well versed in chaos theory shook theirs at the same time.) I highly respect and appreciate Maudlin. I hope he turns some of his big brain toward chaos theory. It would help him for sure.
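That coarse-graining point can be illustrated with a ten-line toy (my own sketch, not from the episode): the logistic map at r = 4 is fully deterministic, yet the one-bit coarse-graining of its orbit is statistically indistinguishable from fair coin flips.

```python
# Deterministic chaos that looks random under coarse graining: iterate the
# fully chaotic logistic map x -> 4x(1-x) and record one bit per step.
def logistic_bits(x0, n):
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(0 if x < 0.5 else 1)
    return bits

bits = logistic_bits(0.123456789, 100_000)
mean = sum(bits) / len(bits)
# Fraction of consecutive equal bits; ~0.5 means no lag-1 memory survives.
same = sum(b1 == b2 for b1, b2 in zip(bits, bits[1:])) / (len(bits) - 1)
print(round(mean, 2), round(same, 2))  # both near 0.50, like a fair coin
```

Of course, the determinism here is recoverable at full floating-point resolution; the analogy to sub-Planck determinism is only that the coarse-grained observer never sees it.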
Thanks for this awesome video and all the others you offer. Be Blessed, TOA
1:01:54: What Dr. Palmer is implying is that for *some reason* the universe might only *allow* initial conditions "on the attractor." My position is that such an assertion must have some "reason." That's a very special case - it needs to be justified before being taken for granted.
Skill issue. On your part and his. He's not a great communicator but if you don't understand what he's saying, you're being intentionally obtuse or have some other issue.
What’s the point of having a model if the vast majority of greenhouse gas contributions come from China and India, and neither of them is going to change anytime soon? You can model all day long, but if those two don’t modify their greenhouse gas production, your model is pointless. I find it maddening that we waste a lot of time, as in the UK, where they contribute 1% and are trying to knock it down. Figure out a way to help India.
Bell's theorem is a mind-boggling concept about reality; I have tried many times to understand it but failed. Any debate between well-educated scientists like this video brings us a bit closer to understanding it. Well done, I enjoyed it.
As an analogy, Palmer is asking the equivalent of what would happen to the universe if we changed the gravitational constant to a number other than what we measure it to be. Maudlin is saying "you can't do that". I agree with Maudlin. But you can also screw around with ideas to find where it goes, like finding how stable or unstable something in the universe is.
You seem to have several interesting points and questions. Have you considered consolidating those somewhere? I'm not super-involved in the "TOE Community", so I don't know if there's something like a Discord where people discuss such things. But I wonder if there's some way that your observations/questions could reach Curt?
@@fnamelname9077 He has a Twitter, but somehow all the discussion happens here on RUclips, where comments are not easy to read and the replies are hard to trace. We need to move to Twitter or somewhere else.
That is not how I interpreted Palmer. He never mentioned tweaking constants of nature; in fact his whole concern seems to be thinking more carefully about which variables you are free to tweak in a specific theory. Maudlin seems to me to argue that in the context of Bell's inequalities you are ONLY allowed to consider large enough ensembles, and they say nothing about single-measurement scenarios.
@@bazsawow Regarding your comment on Maudlin: larger ensembles and sub-ensembles give you more statistical confidence, so yes, they are good. It would be a fool's game to study a large ensemble in an attempt to say what happens in any single run of the experiment. All you can get is a probability of what the result of a single run will be.
@@timjohnson3913 exactly, that's how I understand what Maudlin said in this debate. I have a follow up question though on this point. Does this then mean that locality is also only violated in a statistical sense and not necessarily in individual cases? What does it mean for locality to be violated statistically?
In my opinion, Tim Palmer's interpretation isn't incorrect per se, it is irrelevant. For one thing, the experiment Bell describes to test his theorem is set up in such a way that the uncertainty associated with chaotic systems can be avoided. In this way, the fact that chaotic systems are sensitive to changes in initial conditions will not affect the experimental results. Furthermore, the idea that a hurricane could have been caused by something as insignificant as a butterfly flapping its wings only arises because the butterfly is immersed in an atmosphere. Thus, the two seemingly separate events are connected across space and time by a causal chain of trillions of oscillating oxygen atoms. The counterexample given by Professor Palmer refers to an analogous chaotic system; however, the changes made to any free variables are still communicated to the system via a physical interaction. But this is exactly what Bell was attempting to rule out, which he did by providing a rigorous experimental component to test his theorem. He specified that during this crucial phase of the experiment, measurements were to be made with sufficient distance between subjects to ensure they were not causally connected. In other words, the hurricane would have already finished forming before the atoms pushed by the butterfly's wing could reach it. In much the same way, any observed correlations between measurements made by Alice and Bob could not be explained in similar fashion, i.e. a local interaction in Bob's lab radiating out to Alice's lab and affecting her results. And of course vice versa.
I agree Palmer misses the point completely. I don't agree with Bell's theorem either. For one, it's not a mathematical theorem as Maudlin asserts: it's a statement about physical reality, guided by mathematical AND physical assumptions. If it truly were a theorem in the mathematical sense we would not have loopholes. The second issue (and the most important one) is that any inequality of Bell's type requires a specific assumption about counterfactual definiteness: that the results of measurements that were not performed are still well defined. This is the original definition of realism given by Einstein in the EPR paper, and it is not some handwavy concept as Maudlin would have you believe; it's a mathematically well-defined concept. We can have two types of counterfactuals, relating respectively to a weakly objective or a strongly objective interpretation of results. The crucial point is that if you define your counterfactuals in the weak sense (as Bell did), meaning they must refer to different pairs of particles for each measurement (as is physically the case), you can't derive the bounds in contradiction with QM. Instead of 2 for the CHSH correlator, you get 4. To get 2 you need to reduce the independence of the various experimental runs, as if the properties you measure were actually those of a single pair, and this is equivalent to a strongly objective interpretation. So Bell mixed the two up, which is not logically reasonable.
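The 2-versus-4 contrast in the comment above can be checked with a few lines of brute force (my own sketch of the standard CHSH algebra; the angle choices are the usual textbook ones, not from the comment): one shared set of outcomes gives the bound 2, independent outcomes per run give the trivial bound 4, and the quantum singlet correlation lands in between at 2√2.

```python
import itertools
import math

# CHSH correlator S = E(a,b) - E(a,b') + E(a',b) + E(a',b').

# Strongly objective reading: one set of outcomes A, A', B, B' in {-1,+1}
# enters all four terms, so |S| can reach at most 2.
lhv_max = max(
    a * b - a * bp + ap * b + ap * bp
    for a, ap, b, bp in itertools.product([-1, 1], repeat=4)
)

# Weakly objective reading: each term comes from a different particle pair,
# so the eight outcomes are independent and the algebraic bound is 4.
per_run_max = max(
    a1 * b1 - a2 * b2 + a3 * b3 + a4 * b4
    for a1, b1, a2, b2, a3, b3, a4, b4 in itertools.product([-1, 1], repeat=8)
)

# Quantum singlet prediction E(a,b) = -cos(a-b), evaluated at the standard
# CHSH angles, exceeds 2 but stays below 4 (Tsirelson bound 2*sqrt(2)).
def E(a, b):
    return -math.cos(a - b)

a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4
S_qm = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(lhv_max, per_run_max, round(abs(S_qm), 3))  # 2, 4, ~2.828
```

Whether the weak or the strong reading is the physically appropriate one is exactly what the comment disputes; the code only shows the arithmetic of each reading.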
Expansive thoughts! Exactly what was needed following the prior episode with NDT -: Contemplative thoughts considering new frontiers. No porcupines up 🌳 that "Aren't really there 'cause beliefs about porcupines based on feelings absent facts." - (N*T) (You did a great job in that interview, btw. N*T is fine for pop-science promotional stuff, not a knock on him. Just some people are better suited for one thing than another. His lane is not my lane, no biggie.)
Can we get Dr. Palmer and Prof. Tim Maudlin back? This is one of the best debates of 2023-2024; I've watched hundreds and I keep coming back. Maybe some other topics outside of chaos?
This discussion leaves me with the following questions: 1) Maudlin seemed to have a strong point when he said that Bell's statistical independence assumption is a _mathematical_ assumption, which puts the burden on Palmer to justify his metaphysical reading of that assumption that apparently goes beyond the mathematical expression. Can Palmer justify his reading of the statistical independence assumption? If he cannot do so, then this undermines the rhetorical force of his non-conspiratorial model of the violation of statistical independence. In service of Palmer's position, one might try to draw a parallel with Maudlin's comments regarding quantum mechanics. For Maudlin, quantum mechanics is not a physical theory; rather, it is a mere mathematical formalism from which predictions can be derived. Theories must go beyond this and must tell us what the world is like. Perhaps one could argue that the type of move Palmer is making is of the same kind, of "going beyond" the mathematical formalism. 2) Palmer seemed to have a strong point when he pointed to textual evidence of Bell's writing, of variables being "free" in the sense of "could have been otherwise". Of course, this goes back to the point above, where Maudlin points out that Bell's main claim is expressed in unambiguously mathematical form. Nevertheless, Palmer did seem to have some significant evidence to support his position. Could it be argued that Palmer's reading of Bell is correct? If this argument is strong, I think this might prompt Maudlin to pause, and to take more seriously the possibility that had Bell been alive today, he might have been persuaded by Palmer's model of non-conspiratorial violations of statistical independence.
It all boils down to this crazy superdeterminism idea: that the random number generators must be intentionally non-random to produce the result impossible for any hidden-variable system, as Bell's theorem shows in the theory of QM itself and experiments support. All the things Palmer says are just word salad without any meaning, physical, mathematical, or logical; no sense at all. He's just a fraud.
I didn’t take it as Tim Maudlin didn’t have an argument against the counterfactual determinism or whatever Palmer wanted to push. I took it as Maudlin just not wanting to go down that route of argument because it didn’t apply. And Palmer or someone needs to explain to us all where counterfactuals fits into Bell’s mathematical statement of statistical independence. Or why and where Bell went wrong with the math.
@@timjohnson3913 I think I agree with you -- Maudlin's rejection of Palmer's hypothetical model was not due to it being internally inconsistent or illogical, but rather because it was not applicable, and so did not threaten his interpretation of Bell's theorem. My questions (1) and (2) explore two different angles of approaching the question whether or not Palmer's line of reasoning was applicable. Question (1) explores whether there is a justification for Palmer reading counterfactuality into the mathematical expression of statistical independence. Question (2) explores whether there is compelling textual evidence in Bell's _Free Variables and Local Causality_ that support Palmer's view of what Bell meant when he referred to variables being "free".
Just used Chat to garner an explanation of Bell's idea... it sure did clear up the complexity of the discussion. Thank you for the presentation, I learned a bit more.
From a heuristic standpoint you can model the universe deterministically or not. There are no laws preventing either choice. But which model tells us more, and does so in a clearer way? With respect to clarity, a statistical model hasn't got a chance. (No pun intended.)
Well, I’m really enjoying this: two heavy intellectuals slugging it out over statistical independence and Bell's theorem. I’m probably understanding only about 50%, and in my book Tim M is ahead. If I understand the other Tim correctly, he’s saying you can’t vary only one variable in a particular environment; everything is interconnected. But as Tim M is saying, that’s the whole basis of RCTs: you randomise apart from the thing you’re testing for. Anyway, thanks to both, and to Curt for helping me understand the theorem a bit better and see the way these guys think.
There is nothing to slug out. Statistical independence can be tested with correlation functions. Some quantum systems produce independent outcomes and others do not. Those systems that do not can not be treated with the simple formalism of quantum mechanics that we are teaching in beginner's classes and that relies on a straight forward ensemble approximation. A theorist who tells you that it's either/or simply doesn't understand physics. It's both and one has to adapt the theory to the requirements.
Exact initial conditions aren't "free variables"; they are irreproducible. "Floating variables" seems more precise. It sounds like Bell didn't make that explicit, but that is what he seems to be saying.
You could get emergent statistical independence if the function determining collapse outcomes is balancing between complexity of spatial form and complexity of change over time. In finite measurements, we would only see it balancing complexity over time, which would be randomness. Balancing complexity of form across space would only be evident from a correlation between local and global complexity. I’ve got a video about equations that describe that correlation. To use the “digits of Pi” example, the digits are not just a series of collapses or just states of space. They’re both. The digits of Pi are a series of dance moves between universal state and individual collapse.
It seems to me that a counterfactual in quantum mechanics must always return a wave function, rather than a definite value, and therefore include any and all possible superpositions.
Pretty hyped for this, but shouldn't it be called "Superdeterminism vs Nonlocality" or so? After all, superdeterminism is implicit as a possible solution in Bell's Theorem…
I think "statistical independence" is one of the assumptions of Bell's Theorem, and that is what superdeterminists reject. So in that sense, the name makes sense. Although, I think you make a fair point, as superdeterminists don't necessarily reject Bell's Theorem in its entirety; they only reject statistical independence.
I think it was more about Bohmian dynamic holism vs. statistical mechanics. Palmer gave mathematically sound arguments in favour of "God doesn't throw dice"; Maudlin kept confusing a mathematical theorem with randomized testing. It takes only a single counterexample to disprove a conjecture, and Palmer offered one against the conjecture of "statistical independence". Nothing wrong with rejecting both statistical mechanics and localism as universals. :)
@@maxwelldillon4805 People can be right on one issue and wrong on another. I think non-localism has been unnecessarily mystified. Local can be simply defined as consecutive and non-local as parallel. Already the "tape" of the mathematical definition of a Turing machine is a parallel BOTH-left-AND-right continuum. Dyck language is another very familiar example. The fatal flaw of Einstein's relativity is the attempt to deny the possibility of standing waves of light. The Thomas precession attempt to "fix" the problem basically leads to fully scalable quantum cosmology.
@30:50 maybe important, maybe not, but I'd suggest we should disagree with TM here. Indeterminism does not mean, "things can be exactly the same up to a certain point and then diverge...". Quantum indeterminacy is only ever about the measurement outcome, not what may or may not happen in-between in the spacetime cobordism. To me this is an important subtle disagreement because I have a different view of QM and GR. I do appreciate the orthodoxy thinks like TM, they should check the strict logic of time evolution though. They're assuming a Hamiltonian and no advanced causation. In fact they're not just assuming a Hamiltonian, they're assuming their Hamiltonian is somehow perfectly describing how time evolution develops. I'd suggest Feynman showed us that's _not necessarily_ the case. It's subtle though. When the Hamiltonian is only offering a statistical description then you cannot really prove that it is "wrong", and indeed it is not really wrong, the point is the Hamiltonian time evolution is an incomplete picture of reality. (As are scattering amplitudes which are probabilistic.) Is a Bell experiment just scattering processes? Yes it is. You should think hard about that. How can this be non-local? How can anything be non-local without mysticism of some variety? (MW and multiway hypergraphs and whatnot, all of which are dumb mysticism when mysticism is not needed). (I am not against mysticism in its right place, but QM and GR is not the place.)
I have some (as yet) incomplete thoughts. First, from Wiki the definition of 'locality'. "In physics, the principle of locality states that an object is influenced directly only by its immediate surroundings. A theory that includes the principle of locality is said to be a "local theory". This is an alternative to the concept of instantaneous, or "non-local" action at a distance. " So, let's take a simple quantum experiment. A television manufacturer tests his "electron gun" for the TV set by firing a beam of electrons at the target. If the electrons leave the gun when he turns the switch and they make a permanent dot on the screen by chemical reaction, then the universe is 'local'. In other words, the electrons are demonstrated to have left the gun when he turned the switch. He 'caused' the switch to close, which connected the circuit, which allowed electricity to heat the filament which boiled off the electrons. Now, this manufacturer had been 'dreaming' or 'thinking' about turning the switch for the last twenty-four hours (because it was Sunday and the lab was not open on that day). Further, he can tell when he walks in the lab on Monday, the screen was blank of electron spots. His thinking, dreaming, or other entanglement with the system did not cause the gun to fire on Sunday. Only on Monday when he became part of the critical causal link to close the switch did the gun fire. The gun does not fire at other times. Other T.V. manufacturers thinking of the same test don't cause 'his' electron gun to fire. Only when he flips the switch. This is how we understand the world. This is how we fire electrons at other atoms for investigative purposes. This is a quantum experiment. It demonstrates 'locality' and 'only locality'. There is no spooky action at a distance in this well-defined experiment which can be repeated all around the world any time one choses. 
The only place where I see probability involved at all in the experiment is here: the filament of the gun is made of "atoms too numerous to count". The too-numerous atoms have many more "too numerous to count" electrons. One might suggest that precisely which electron(s) boils off is a matter of probability. Really, we know the filament is heating, which is causing the atoms to vibrate more rapidly, and several or many of the atoms vibrate enough to kick out an electron. So, while we can say which electron is fired is a matter of probability, it really seems that if we were small enough to actually track which atoms vibrate either most violently or at the right frequency, we could tell which would then be most likely to eject the electron. There is nothing spooky about this. We do this every day. By contrast, here are some processes that I don't understand: 1) using a pseudo-random number generator like the odd-even state of the digits after the millionth digit of pi. The last I heard, although the digits of pi form an unending sequence, every digit will 'always' be the same from one repetition to the next. Otherwise, there is no meaning to pi. And, therefore, reference to this pseudo-random process delivers precisely nothing useful. Using the time of a 'radioactive decay' might work better. 2) Creating entangled particles. Why do I have to 'create' them? Aren't they just entangled? Can't you find entangled particles in completely natural processes? If not, does entanglement 'actually' exist or is it a construct? I do not know enough about photon 'splitters' and 'down-converters' to know exactly what processes are going on here. And many of these arguments do a lot of 'hand-waving' at the creation of these pairs. And, if they are created, just how exactly are they transported to a locale where they can't have been affected by slower-than-light processes? In short, I want to know 'details', as I suspect the devil is in the details.
I'm so PUMPED for this. Superdeterminism is the juiciest, most philosophically interesting approach to quantum mechanics... at least that's my opinion, as someone with zero knowledge of math or physics.
It would be if it actually had any experimental teeth to it, but it doesn't. People can't just wishcast hidden variables that they can't define and can't falsify into reality in order to hold onto their materialistic prejudices. I'm sorry but that's a lot of things, but it isn't science.
Let me take a crack at saying what the confusion was about regarding which variables Tim P. and Tim M. were talking about when discussing free variables. Tim P. was referring to (freely) changing a variable that is part of, and thus very relevant to, the experiment, e.g., changing the millionth digit of x in the x, y, z whose trajectory the very experiment is tracking. In other words, Tim P. is talking precisely about variables that are generally and obviously considered relevant to the experiment. Whereas Tim M. was talking precisely about variables from whose effect the experiment is free, e.g., the positions of planets around a star a billion light years away, which we intuitively agree are not relevant to an experiment on electron singlets being performed locally. What Tim M. is saying is that you can freely set any values for the positions of planets around stars a billion light years away, and it will not and should not affect your local experiment on electron singlets. That is why they are free variables: the experiment on electron singlets is free from the values of those (distant/irrelevant) variables. And because they were referring to different meanings of the word "free", the confusion ensued. Tim P.: variables whose values are not constrained in any way, so you can choose any value for them, but which are part of the experiment. Tim M.: variables whose freely chosen values the experiment is free from and unaffected by. What do you think?
I don’t agree with this comment as Tim Palmer is the one who brought up the passage of Bell, and it’s Bell who gave the example of the moons of Jupiter and people walking. The passage Palmer was bringing up is also where Bell says “the millionth digit is not useful for any other purpose”. But along with Maudlin, I don’t understand what exactly Palmer was getting at. Apparently you need to understand his theory of the universe to make sense of his interpretation of Bell’s statistical independence assumption.
A few conversations I would love to see and that could be quite fascinating: Bernardo Kastrup with Joscha Bach, Sam Harris, and Sean Carroll. Not necessarily all at the same time though 😂 I wanna see Bernardo's idealist thought really clash against other thinkers with an entirely different metaphysics, since that feels somewhat rare. His first conversation with John Vervaeke was intense at times, but he didn't really struggle too much. In his conversations with Graham Oppy (on another channel) and Susan Blackmore, he wasn't challenged nearly enough. I want to see his views really tested. P.S. Perhaps Joscha Bach and Sam Harris wouldn't clash that much with his metaphysical views, not sure.
52:48: The problem I have here with Dr. Palmer's position is that nothing in our theories gives us any reason to believe that certain other instrument settings would have been disallowed. In his three-variable example, what would have been the *mechanism* that restricts the ability to change variable A without also changing B and C? It seems that superdeterminism skirts the Bell argument by contending that certain instrument settings simply could not have happened. I think we need a WHY for that before we can take it seriously.
The focus seems to land too frequently on Bell rather than the actual experiments. If we assume quantum measurement is simply probabilistic around a real underlying quantity, then for entangled particles you can't explain them always giving orthogonal results when measured the same way; and if you instead believe them to possess predetermined but correlated results for all measurement orientations, you can't explain the actual experimental results, which are consistent with either experiment's result having been "true" and the other then probabilistically determinable from that result. This gets called a collapse, as if the entangled particles were one particle and the one measurement "decides" the probability of what the measurement of the other would be.
Quantum measurements are NOT probabilistic. They are independent. That's a huge difference that most people simply don't understand. There are also very important cases of quantum mechanical systems which do NOT satisfy independence in the first place and for which the standard prescription of the theory is not sufficient. That doesn't mean we can't understand these systems, but it means that we have to understand how the standard theory has to be modified to be applicable. A well known example is the Mott problem (although it might fit into a density matrix scheme, but I would have to think about that for a while).
Tim Maudlin (TM) needs to read Tim Palmer's (TP's) paper on this idea, because TM was missing TP's point. Let me explain. Look at Alice and Bob making two measurements each of some Bell state (the example used in the video). Statistical independence (SI) says the four measurement sub-ensembles (Alice 1, Bob 1)(Alice 1, Bob 2)(Alice 2, Bob 1)(Alice 2, Bob 2) are equally populated by the hidden variables responsible for the outcomes. According to local causality and SI, you can ask what would have happened if a given trial in (Alice 1, Bob 1), say, had been (Alice 1, Bob 2) instead. The answer is "Whatever the locally causal mechanism says." That trial would have simply been a trial in (Alice 1, Bob 2) instead of (Alice 1, Bob 1). So what? That's what TM said: counterfactuals just don't matter. What TP is proposing is a way to violate SI such that counterfactuals do matter. It's possible in TP's proposed theory that when you pick a trial in (Alice 1, Bob 1) and ask "What would have happened in that trial had Bob chosen measurement 2 instead?" the answer is "The theory says that counterfactual is impossible in that trial." That's how TP's proposed theory violates SI and attempts to account for the violation of Bell inequalities. As a side note, no one pointed out that you can explain the violations of Bell inequalities without violating locality or SI. In fact, I think TM said you have to violate one or the other at minimum to account for the violation of Bell's inequality. The axiomatic reconstruction of quantum mechanics via information-theoretic principles renders quantum mechanics a principle theory based on the observer-independence of Planck's constant h between inertial reference frames related by spatial rotations and translations. And that can be justified by the relativity principle, just like the observer-independence of the speed of light c between inertial reference frames related by boosts (which gives you the principle theory of special relativity).
Accordingly, quantum mechanics is as complete as possible and does not violate locality, statistical independence, intersubjective agreement, or the uniqueness of experimental outcomes. You can read all about that in "Einstein's Entanglement: Bell Inequalities, Relativity, and the Qubit" forthcoming with Oxford UP.
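To make the SI assumption in the comment above concrete, here is a minimal Monte Carlo toy of my own (not from the forthcoming book): SI just says the hidden variable λ is distributed identically across the four setting sub-ensembles, and a superdeterministic model breaks that by correlating the settings with λ.

```python
import random

random.seed(0)
N = 100_000

# Each trial: a hidden variable lam (uniform on [0,1]) plus Alice's and
# Bob's setting choices (1 or 2), chosen independently of lam.
trials = [(random.random(), random.choice([1, 2]), random.choice([1, 2]))
          for _ in range(N)]

def mean_lam(a_set, b_set):
    vals = [lam for lam, a, b in trials if a == a_set and b == b_set]
    return sum(vals) / len(vals)

# Under SI, all four sub-ensembles (Alice i, Bob j) sample the same lam
# distribution, so their means all sit near 0.5.
si_means = [mean_lam(a, b) for a in (1, 2) for b in (1, 2)]

# SI-violating toy: Alice's setting is determined by lam itself, so the
# sub-ensembles sample different parts of the hidden-variable distribution.
trials_sd = [(lam, 1 if lam < 0.5 else 2, random.choice([1, 2]))
             for lam in (random.random() for _ in range(N))]
lam_a1 = [lam for lam, a, _ in trials_sd if a == 1]
lam_a2 = [lam for lam, a, _ in trials_sd if a == 2]
mean_a1 = sum(lam_a1) / len(lam_a1)
mean_a2 = sum(lam_a2) / len(lam_a2)
print([round(m, 2) for m in si_means], round(mean_a1, 2), round(mean_a2, 2))
```

The second model is deliberately conspiratorial (the setting reads λ directly); TP's claim, as I read it, is that SI can fail without any such mechanism, which this toy does not capture.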
@1:15:00 this was a really interesting point in the discussion. TM seems right that they were slightly at cross purposes. However, TP was right (I think) that Bell's theorem is not really statistical. It is about *_entanglement,_* which is *_not_* a statistical property. TM seems to not grok this. The way you _test_ for entanglement is the Bell experiment, and you use the Bell theorem for this! That's the whole point. A favourable outcome violating *_either_* locality or Kolmogorov statistics with independence due to (non-)causality between A and B (violating Bell's inequality) is a success for QM, but the experiment (and Bell's math theorem) tell you *_nothing_* about which assumption of classical mechanics was the one violated. You need a model for that, and a different experiment to test your model against other models that have the other assumption false.
It seems to me that the discussion is made more difficult without a rigorous statement about causality. If statistical inference requires a surrender to probability, then it seems reasonable that "threads of causality" are required to link the seemingly disparate incentives (NOTE: I DID NOT State that Entropy doesn't exist, nor do I unnecessarily anthropomorphize the conditions with the term "incentives"; merely that it can be resisted by an action - an exchange of energy for a difference of result which APPEARS to be a modification of Entropy), maybe this is akin to Imaginary Numbers vs. Real Numbers.
I just don't get it. Does this superdeterminism really assume that decaying nuclei in 2 different random number generators intentionally decay exactly the way to simulate the non-locality of the Bob and Alice experiment? Is it really that simple and stupid?
I understand that Maudlin is talking about a mathematical theorem, and Palmer on whether the assumptions of that mathematical theorem apply to any physical model and his conclusion is that it does not, and therefore although the mathematical theorem is correct, we cannot use it to rule out hidden variables as there could be hidden infinite correlations that can be, in practice, considered as hidden variables. Makes sense?
@1:20:00 I have to agree with Palmer here. TM is talking about the experiment, which tests Bell's theorem. Statistics has nothing to do with the underlying cause of the entanglement that the theorem was created to test for! The Bell experiment does not test or inspect entanglement; it only detects a result of entanglement, and to do so uses an ensemble of similarly prepared particles (Bell pairs). It's frustrating for TP because TM basically admits this; he points out it's like a randomized control trial. However, Bell's theorem suggests that as a way to test the inequality, not to derive the inequality. One derives the inequality from QM, not from statistics. It holds for a single Bell pair. It tells you the probabilities. The probabilities (in QM) are not the statistics. The statistics can only come from a whole lot of identically prepared systems, which then test the computed probabilities. Assuming statistical independence is the experimental condition, not the condition of Bell's theorem. Bell's theorem assumes locality and causality (some also say "objective reality" or objective properties, "not magical dice throwing by the Devil").
In reality one can't assume statistical independence because real physical systems are not required to produce statistically independent outcomes. Statistical independence is just an assumption of the ensemble theory. Is it a necessary assumption? No. If we give up on it we simply don't end up with standard quantum mechanics. So what? So nothing.
0 chaotic, 1 structure, 3 function of the median. 4 black holes and information in a potentiality. Black holes dilate, sometimes information in, sometimes information out. In a galaxy it's information out with multiple portholes in. The question is the source of this function. We have Pi, fractals and Phi.
@32:00 this one I am less sure about, but while ρ is a statistical factor, it's not just about frequencies, because how QM gives you ρ matters: it comes from non-Kolmogorov probability amplitudes (q.v. Rafael Sorkin et al.). In QM you should never lapse into talking or thinking about classical Kolmogorov probabilities. And *_one way_* of not so lapsing (not the only way, I suppose) is to talk about counterfactuals. Another is to go in the Huw Price direction and talk about advanced causation, which also means you cannot have Kolmogorov probabilities (although Price often seems confusing, I think that's what he'd say).
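One concrete way to see the "non-Kolmogorov" point (a toy two-path amplitude calculation; the equal weights and the 1-radian relative phase are arbitrary choices of mine): quantum probabilities come from squaring summed amplitudes, so the additivity P(path 1 or path 2) = P(1) + P(2) that classical probability takes for granted fails.

```python
import cmath

# Two paths with equal weight and a relative phase of 1 radian (arbitrary)
psi1 = 1 / 2
psi2 = cmath.exp(1j * 1.0) / 2

p1, p2 = abs(psi1) ** 2, abs(psi2) ** 2   # each path alone: 0.25
p_classical = p1 + p2                     # Kolmogorov additivity: 0.5
p_quantum = abs(psi1 + psi2) ** 2         # add amplitudes first, then square

# The gap is the interference term, cos(1)/2 here
print(p_quantum - p_classical)  # ~0.270
```

The extra term is exactly the interference contribution that a classical (Kolmogorov) probability model has no slot for.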
Quantum mechanics is simply another solution of Kolmogorov. Not sure where the problem is supposed to be here. Counterfactuals are intellectual nonsense. They don't even satisfy the definition of science.
Interesting topic. Before I listen to the presentation I want to mention what Stephen Hawking wrote: that quantum mechanics may be only waves without uncertainty. I think that's correct! Edit: Maybe I should provide the source: "But maybe that is our mistake: maybe there are no particle positions and velocities, but only waves. It is just that we try to fit the waves to our preconceived ideas of positions and velocities. The resulting mismatch is the cause of the apparent unpredictability." - Stephen Hawking, A Brief History of Time, chapter 12
Then it all circles back to Schrödinger's cat: at some point, you will need to collapse the wave function; cats are dead or alive. QM is incomplete, GR is wrong: that's the current state of physics. It's still in PTSD after Bell exposed the stupidity and laziness of all the QM creators.
@@syzygyman7367 I heard that the pilot-wave interpretation of QM doesn't need a collapse of the wave function. It could have been something like that that Hawking was writing about, but I'm a complete amateur when it comes to this. The idea of an attractor, as was mentioned in the video, made me think that maybe QM has a "complex attractor" instead of the strange attractors in chaos theory, meaning the QM attractor could be extraordinarily complex and guide the whole universal wavefunction.
@@Anders01 To be honest, I have absolutely no understanding of why he talks about chaos theory at all. He sounds like some activist saying words he doesn't understand himself. I checked his bio to make sure he has an education in physics at all. I have this education; even though I'm not working as a scientist now, I'm basically like Curt. My current hypothesis is that these two worlds, our big one (even though QM effects work even on an astronomical scale) and the QM one, might really be two independent layers of reality, and might not be the only ones existing, and a unifying theory might be fundamentally impossible; the very question might be incorrect. If you don't have a professional understanding of physics but want to understand it anyway, read Penrose. In his more or less popular books he explains a lot, not only about the problem of consciousness but about physics itself. You can start with The Emperor's New Mind, if you haven't read it yet lol
@@syzygyman7367 I think the problem with QM is mainly Einstein's relativity. That QM is essentially correct and that SR and GR are fundamentally wrong. Because the Lorentz transformation is derived through a mathematical division-by-zero fallacy it seems. I heard about a solution using hyperbolic cosine but that sounds like more of a cop out trick to me than actually solving the problem.
@@Anders01 QM has much bigger problems (or miracles): backwards-in-time causality (shown in the delayed-choice experiment) and the possibility of receiving information from possible futures (the bomb-testing experiment). It's against all the logic we had, let alone physics. Special relativity is kind of in Maxwell's equations already: if there's no ether, there must be some time-space transformation. The mathematician Gauss had the idea that real space can be curved back in the 1800s; he even measured the angles of big triangles between hills. He didn't find violations, but thought they could indeed be real. The relativities were logical and comprehensible. QM is so unnatural to our understanding that it's ugly and scary for many people. Scientists were and are afraid of it. It took 30 years for Bell's theorem to be discovered, 50 years for Penrose to start the investigation of consciousness from a quantum perspective. We are probably only at the beginning of discovering the nature of real reality.
Unfortunate tangent there, but at the end they began to touch on what I think is both the importance and the confusion of Bell's theorem: we don't know what's going on at the small scale, as in we don't know the mathematical or physical mechanism, but we know that the macro scale has strange/unexpected results that we can only infer from statistical measurements, i.e. we can only see evidence on a large macro scale in statistics. But that doesn't mean that any two entangled particles are doing anything non-real or non-local (or non-deterministic), but that the mechanism prevents us from seeing it by only looking at the macro scale. I interpret the Bell test much more simply: what is it that two particles can do to produce this result that cannot be achieved by three or more particles?
Palmer talks at 49:00 about how carry rules work (or rather don't work) in decimal arithmetic of so-called "real numbers". In binary, there's no telling whether the "real numbers" 0.000... and 0.000... add up to 1, 0, or something else, as we don't have any computable algorithm to tell us whether the strings contain any ones, and if so, how many.
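The carry problem described here can be made concrete (a toy sketch using interval bounds; the particular digit streams are illustrative choices of mine): given two reals as digit streams, deciding the integer part of their sum can require inspecting unboundedly many digits, and for 0.5000... + 0.4999... no finite prefix ever settles it.

```python
from fractions import Fraction

def integer_part_of_sum(a, b, max_digits=60):
    """a, b: functions mapping n >= 1 to the nth decimal digit of a
    number in [0, 1). Try to decide whether a + b >= 1 by scanning
    digits; the unseen tails contribute at most 2 * 10**-n in total."""
    lo = Fraction(0)
    for n in range(1, max_digits + 1):
        lo += Fraction(a(n) + b(n), 10 ** n)
        hi = lo + Fraction(2, 10 ** n)  # upper bound on the true sum
        if lo >= 1:
            return 1       # carry definitely happens
        if hi < 1:
            return 0       # carry definitely cannot happen
    return None            # undecided after max_digits digits

# 0.333... + 0.333... : decided after one digit
print(integer_part_of_sum(lambda n: 3, lambda n: 3))          # 0
# 0.5000... + 0.4999... : the sum is exactly 1; never decided
print(integer_part_of_sum(lambda n: 5 if n == 1 else 0,
                          lambda n: 4 if n == 1 else 9))      # None
```

No scanning cutoff helps in the second case: every finite prefix leaves the sum straddling 1.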
Can you run an experiment where the wave function of the elements being tested is independent from the wave function of the environment the testing is being performed in?
I think what is missing in Palmer’s argument is not the fact that we can get statistical independence from chaotic systems but that statistical independence is not enough to explain the correlations observed in the EPR experiments. We observe clear cut strong correlations violating Bell’s inequalities. Maudlin should have focused on asking Palmer how to recover those correlations using a chaotic system (I guess he can’t 😁)
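This challenge can be made concrete with a toy local hidden-variable simulation (the deterministic sign-of-cosine response function and uniform hidden variable are arbitrary choices of mine, not Palmer's actual model): as long as each outcome depends only on the local setting and the shared variable, the sampled CHSH value stays around the classical bound of 2, short of the quantum 2√2.

```python
import math, random

random.seed(1)

def outcome(angle, lam):
    """Deterministic local response: +/-1 from the local setting and
    the shared hidden variable lam only (a toy choice)."""
    return 1 if math.cos(angle - lam) >= 0 else -1

def correlation(alpha, beta, trials=20000):
    total = 0
    for _ in range(trials):
        lam = random.uniform(0, 2 * math.pi)   # shared hidden variable
        total += outcome(alpha, lam) * outcome(beta, lam)
    return total / trials

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = correlation(a, b) - correlation(a, b2) \
    + correlation(a2, b) + correlation(a2, b2)
print(abs(S))  # stays near 2 (up to sampling noise); QM predicts ~2.83
```

Swapping in any other distribution for `lam`, chaotic or fractal, leaves the |S| ≤ 2 bound intact, which is exactly the gap the EPR correlations sit in.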
The misunderstanding, I think, occurs because quantum mechanics is traditionally formulated using a linear mathematical framework, while general relativity uses a non-linear mathematical framework. You have some people trying to generate GR from QM. The basis of non-linearity is feedback. What is feedback? Consider a black-box system: a box containing a hidden internal mechanism. Inputs into this system trigger changes inside the black box, causing an output effect. The system can be simple (few components) or complicated (many components), but the component count isn't necessarily an indication of whether a system behaves linearly or non-linearly. I say that because I wish to draw an early distinction between complicated (many components) and complex (many interactions). In a linear system, the outputs do not feed back into the system as inputs. In a non-linear system, the outputs interact with the system to either restrain it (negative feedback) or amplify it (positive feedback). In complex systems, interactions between components lead to higher-order structures. Consider DNA: deoxyribonucleic acid. Its basic components are a phosphate and sugar backbone with nitrogenous bases connected off the sugar molecules. With two strands of phosphate/sugar lined up, the bases interact, essentially creating a flexible ladder structure. Information is encoded in the sequence of bases as you read down the ladder. Each three bases combine to be read as a codon. Specific sequences of codons translate into a string of amino acids unique to a particular protein. Now, each chromosome contains a massive amount of DNA. How do you fit all that information as DNA into a relatively small volume? The answer is that you need a way to physically compress DNA.
In that respect, DNA is like the wires in a hi-fi system: you want devices to organise the wires so they are discretely packaged and don't look unsightly, or worse, trip you up. DNA gets packaged similarly to how thread is wound onto spools in a sewing box. Proteins called histones are the cell's spools. You can wrap the thread of DNA around a histone protein, which reduces the space taken up. You bring another histone 'spool' and wrap the DNA around that too. Continuing in that vein, you end up with a string made of a sequence of histone spools wrapped with DNA. If you wish to compress the information further, you coil a string of histones by twisting. A series of coilings enables higher compression. The geometry generated by this system means that DNA encoding protein A can be a long distance away from DNA encoding protein Z, yet due to the compression into a histone rope structure, the two sections of DNA can be physically very close. What controls the wrapping or unwinding of DNA around histones turns out to be chemical modification of the amino-acid components of histones. Enzyme machinery can crawl along the histone rope, modifying histones to expose DNA so sections of it can be photocopied and exported as blueprints used to produce proteins at factories embedded in the outer wall of the nucleus. This is a cellular example of how simple components combine to create complicated structures that utilise complex interactions to compress information. The geometry of the chromosomes is highly convoluted, such that connections can occur between distant sections of DNA. You could call these non-local sections brought into close proximity by the geometry of the storage medium. Similarly, fractal geometry compresses information. Non-linear systems have different sorts of attractors. An attractor is a region of state space that influences (attracts) elements in a dynamic system.
Some attractors pull objects in state space in like magnetic attraction, leaving them unable to escape. Other attractors cause objects to orbit in a cyclical fashion. Chaotic attractors are special because the object tends to move along a non-repeating path, never exactly retracing its previous trajectory. Yet the broad patterns of movement are predictable even though exact trajectories aren't calculable! The geometry needed to achieve chaotic attractors is not Euclidean. In general, non-linear systems are associated with non-Euclidean geometry, and linear systems can be more easily described using linear geometry. If the reality of activity at the quantum scale were to be complex, or were to result in complexity, then bridging the gap between the quantum and non-quantum realms is likely to require a non-linear formulation of QM. Regardless of whether the scientist/mathematician is an experimentalist or a mostly theoretical researcher, developing a non-linear formulation of QM that can link to GR will require people who are intuitively able to understand the non-Euclidean geometry of the quantum realm, most likely arising out of expert mathematical understanding of geometry and topology. This effort will require a huge research programme, probably spanning decades, utilising quantum computing, GAI, and experts in the mathematics of non-linear systems theory. I think issues of perceptual disagreement are bound to occur between experts whose backgrounds are underpinned mathematically by linear vs. non-linear systems thinking. Yet linear thinking is so much easier, both conceptually and mathematically, that it will take the theoretical physics community quite a while to transfer as a group to studying non-linear physics.
Complexity is harder to experimentally model, harder to mathematically describe, and harder to generate useful long-term predictions from, because the number of degrees of freedom you need to model, and their even greater interactivity, will severely strain the best computers and minds. That said, thinking which aligns with 1970s quantum tradition has so far failed to evolve to explain GR. At some point, continued failure will force greater interest in nonlinearity within mainstream physics.
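The linear-vs-feedback distinction in the comment above can be shown with a minimal numerical sketch (toy Euler integration; the growth rate, step size, and initial value are arbitrary choices of mine): without feedback the output grows without restraint, while the same growth rate with a negative-feedback term saturates.

```python
def euler(deriv, x0, dt=0.01, steps=1000):
    """Crude forward-Euler integration of dx/dt = deriv(x)."""
    x = x0
    for _ in range(steps):
        x += dt * deriv(x)
    return x

r = 1.0
linear    = euler(lambda x: r * x,           x0=0.01)  # no feedback: exponential growth
nonlinear = euler(lambda x: r * x * (1 - x), x0=0.01)  # negative feedback: saturates near 1

print(linear)     # ~209 (roughly 0.01 * e^10)
print(nonlinear)  # ~0.995
```

The `(1 - x)` factor is the output feeding back to restrain the system, which is the whole difference between the two runs.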
@@notanemoprog QM states that the unity of a system described with one wave function trumps time and space; Bell's theorem shows a way to check it, and it's been checked: reality indeed works on different principles of causality than regular human logic. The future can interfere with the past; the present can get answers from all the possible futures. Read about the delayed-choice experiment and the quantum bomb-testing experiment. Most scientists just can't comprehend all of that; they want to save their comfy worldview. It becomes personal, some kind of uber-materialistic religion.
@1:25:00 - @1:28:00 here is where I'd tentatively critique TP. To me it's about entanglement, not measurement instrument setting independence. Entanglement is outside the light cone. The Alice and Bob measurements are outside each other's light cone, but that's OK. TP does not need to resort to violation of statistical independence of the instrument stuff for this. The clear point is (irony drips here) that both Minkowski causality *_and_* independence can survive. How so? If the entanglement is in fact locally mediated by, say, an ER=EPR wormhole, then it only appears non-local to someone assuming Minkowski spacetime. People never mention this. The "fourth" assumption of Bell's famous result is a Minkowski spacetime. Why do we have to assume SR holds down to the Planck scale? To my mind _this_ is the false assumption, and relaxing it (as any old quantum gravity or foamy spacetime model would) gives you a proper ontology behind Bell's theorem. Experimental statistical independence, normal causality, and locality are all still true. Minkowski spacetime is not true (at the Planck scale). At this date you can even choose your ontology; it's wide open for speculation. Just do not choose Minkowski spacetime at the Planck scale, but you'd better recover causality and locality akin to Minkowski at the macroscopic scale or your model ontology is wrong. Get Lenny Susskind on, Curt!!! He'll tell you, like I am, Bell's theorem is just general relativity.
Palmer was insisting that if Alice changes a detector's angle by a millionth of an arcsecond, the result of the experiment may be different. Palmer was insisting that changing one gene in a person may end up changing his blood type. Yes, that's true, but the chances of that are minuscule. And Maudlin emphasized that no meaningful predictive theory could come out of such chaotic dependence on minuscule changes, and that in the end it is irrelevant to Bell's theorem results.
Right, if you take Palmer's assertion to its logical conclusion, how on earth could anyone ever demonstrate an interference pattern with a laser in a double slit experiment, for example? Does he really think that there is some way to make a minor adjustment to the equipment so that the outcome of the particle distribution is not what it always is?
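The tension in this exchange can be illustrated with the fully chaotic logistic map (a toy sketch; the map, the perturbation size, and the orbit length are choices of mine): a perturbation in the 12th decimal place makes individual trajectories diverge macroscopically, yet the long-run statistics of the orbit barely change, which is why interference patterns and other ensemble results are robust even when single runs are not.

```python
def logistic_orbit(x0, r=4.0, steps=500):
    """Orbit of the chaotic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2)
b = logistic_orbit(0.2 + 1e-12)   # perturb the 12th decimal place

# Individual trajectories: tiny perturbation, macroscopic divergence
print(abs(a[5] - b[5]))                        # still tiny (< 1e-6)
print(max(abs(x - y) for x, y in zip(a, b)))   # order 1 after ~40 steps

# Time-averaged statistics: essentially unchanged (both means near 0.5)
print(sum(a) / len(a), sum(b) / len(b))
```

So extreme sensitivity of trajectories and stable statistics coexist; the dispute is over which of the two Bell's argument actually needs.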
I am not educated in math or physics, just intensely curious, so please nobody make fun of me for not being smart enough for this podcast, but are they arguing about whether quantum physics always has to have uncertainty? The millionth digit can be both odd and even, so if they don't take this position about Bell, it essentially says all of quantum physics is wrong. Personally I think quantum physics is very wrong about something, or is not incorporating some truth about the universe we just don't know or understand. Peace from CT
QM is incomplete, GR is wrong: that's the current state of physics. Bell's theorem shows faster-than-light communication inside a QM system, and that it is possible to experimentally check it. The whole buzz is about all the great physicists not noticing it for 30 years. It's a shame and embarrassment that academia still dwells in.
@@syzygyman7367physics advances funeral by funeral …. I do wonder the ratio of younger physicists who understand bell vs older physicists who understand bell. There’s some sort of strongly held magical QM belief that some in the older generation can’t break free of
@@timjohnson3913 QM is real and it's crazy. This faster-than-light communication thing that is the subject of Bell's theorem is the least horrifying miracle of it. The delayed-choice experiment shows back-in-time causation, and bomb-testing technology shows communication from a possible future: that is the crazy reality of this world. You can't hide from it. Most of the physicists hid from it in the 20th century, and there are basically no real physicists now.
Not sure I understand Palmer's point. But if I do, he is saying that in some model of the universe, Bob and Alice will fail to make random measurements. However hard they try, their pseudo-random generators will be correlated: if they use random generators based on the arrival time of cosmic rays from distant galaxies, those will be correlated in a way that forces A and B to make precisely the measurements that lead to observing a violation. So distant galaxies conspire to trick A's and B's minds into believing that they are doing the experiment randomly, in such a way that the analysis of the results always shows the same violation that looks like non-locality. At least it's nothing as crazy as non-locality! Please explain to me where I am wrong.
I just don't see any sense in this superdeterminism thing. I just see an emperor without clothes. Does it really say that the whole universe is programmed for any random number generator to be correlated with any other?
Palmer's main point is that he is challenging counterfactual definiteness, i.e. the idea of 'alternative timelines which could have been'. So he is arguing for determinism, fundamentally. Bell test results never "look like nonlocality"; rather, they show that at least one of the assumptions must be wrong. There are three assumptions in Bell's theorem; locality is only one of them (although in this video they said that there are only two assumptions, which confuses me). Some people are convinced that locality is false, but that could be wrong. There is no contradiction in the idea of certain people being determined to be wrong. Lastly, the conspiracy objection is not compelling. Life is a conspiracy to start with.
@@maxwelldillon4805 You're so wrong. Bell's theorem shows that non-locality, faster-than-light communication: 1. Exists inside the theory itself, and hadn't been noticed by its very authors for 30 years, till an outsider, Bell, came and took a look at their creature. 2. Can be tested experimentally, which has been done, showing that both special and general relativity are simply wrong. This shame of stupidity of Einstein, Bohr, Heisenberg etc. caused a harsh PTSD the current academics still live in.
@@syzygyman7367 QM is nonlocal because of the instantaneous wave collapse, not because of Bell's theorem. This is considered a problem by many. Nonlocality hasn't been experimentally proven.
Curt, the issue is fundamentally very simple. Dr. Bell used scalar algebra. Scalar algebra isn’t closed over 3D rotation. Algebras that aren’t closed have singularities. Non-closed algebras having singularities are isomorphic to partial functions. Partial functions yield logical inconsistency via the Curry-Howard Isomorphism. So you cannot use a non-closed algebra in a proof, which Dr. Bell unfortunately did. … This is a sufficient disproof of Bell’s theorem.
@@lepidoptera9337 still refusing to elaborate I see. Since nonsense=illogical, tell me what you find illogical about my statement, and I'll promptly explain to you. And don't tell me "everything", because that would just mean you haven't understood it at all.
37:30 - I'm really eager to hear these details, but I'll tell you that at the moment my sense is that if we don't have statistical independence we don't have science. It just seems like if we endow the universe with this much ability to "sneak around" on us it's hard to see how we can really trust any conclusion we come to.
If I understand it correctly, Maudlin is arguing that Bell's inequalities only apply to large enough ensembles and break down for single-measurement scenarios (i.e. they say nothing about them). But then does that mean that locality is only violated statistically as well? What does that mean?
Statistical independence is an assumption of Bell's theorem, which is what 90% of the video is about. If the assumption holds true, which Maudlin argues for, then Bell's theorem shows that the universe is nonlocal.
@@timjohnson3913 Not the whole universe, though. For example, the forces in the Standard Model will still act locally. I assume you mean that the universe is shown to be nonlocal at the most fundamental level, which you take to be some indeterministic quantum theory. But Maudlin also argues that Bell's theorem only applies to large ensembles and cannot be interpreted in individual experiments, right? My point is that then you also can't use it to argue that locality is violated in individual cases; it is only violated statistically, whatever that means.
@@bazsawow The universe has a property of nonlocality; this is what Bell proved. It does not require that every interaction be nonlocal for the Universe to have the property of nonlocality. Back to the main point, if you can show statistically that an ensemble of measurements require nonlocality, then individual cases must be nonlocal. You can’t get ensemble nonlocality out of individual local interactions. So while you can never point to 1 specific run of an experiment and declare that it is nonlocal, you can look at multiple runs of the experiment and statistically prove there must be nonlocality. This is what Bell did.
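The ensemble point here can be sketched numerically (a toy simulation that samples paired outcomes with the quantum singlet probabilities; the trial counts and angles are arbitrary choices of mine): no single run shows anything, but the accumulated statistics converge on a CHSH value near 2√2, above the local bound of 2.

```python
import math, random

random.seed(7)

def sample_correlation(alpha, beta, trials=20000):
    """Sample paired outcomes with the singlet probabilities:
    P(outcomes differ) = cos^2((alpha - beta)/2), so E -> -cos(alpha - beta).
    Note: this sampler uses both settings at once, so it is itself
    nonlocal; it merely reproduces the QM statistics for the demo."""
    total = 0
    for _ in range(trials):
        a_out = random.choice([1, -1])
        if random.random() < math.cos((alpha - beta) / 2) ** 2:
            b_out = -a_out   # anti-correlated branch
        else:
            b_out = a_out
        total += a_out * b_out
    return total / trials

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = sample_correlation(a, b) - sample_correlation(a, b2) \
    + sample_correlation(a2, b) + sample_correlation(a2, b2)
print(abs(S))  # converges toward 2*sqrt(2) ~ 2.83, beyond the local bound 2
```

Any individual `a_out, b_out` pair looks unremarkable; only the accumulated correlations cross the bound.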
@@timjohnson3913 There's a technique for proving Bell's theorem in only one experiment with only one pair of entangled objects, in Penrose's Shadows of the Mind of 1994. It's sad that people arguing about the subject don't know the works of the greatest living contributor to it.
@@timjohnson3913 I agree with everything you've said. I have a question about the definition of "locality", though: Is it necessarily the case that Bell required "nonlocality" instead of "faster than light causal influence", or are they two ways of saying the same thing? Intuitively, it seems to me that a physical process that appeared to be instantaneous could turn out to be merely much faster than we were capable of measuring (e.g., C to the power of C). In that case, we would not abandon locality altogether, but just redefine it. Perhaps the notion that _any_ faster-than-light causal influence constitutes a violation of locality is due to a definition of locality tied to the understanding of the causal structure of spacetime found in relativity.
Brilliant discussion, but I can say with confidence that the QE effect is physically non-local for light, yet 100% local and causal, superluminally connected by an unknown fifth-extra-dimension superluminal force not yet discovered by humans.
Nothing is really a wave of probability; everything is determined, because we live in a uni-verse and so we cannot see the multi-verse that is that wave of probability. At the same time, we cannot make a perfect prediction of what is going on in the quantum world because of the uncertainty principle. Even if we want to make predictions of the future with great accuracy, it is impossible to "see" it even with greatly intelligent AI, because the nature of things is unpredictable at the core.
Not really. That's standard quantum mechanics. Super-determinism tells you that the universe is lying to you about your choices because it has already decided for you. ;-)
@@johnbihlnielsen3578 Nobody who is serious about quantum mechanics considers nonsense like superdeterminism. We know why the universe LOOKS superdeterministic: it follows from special relativity. The energy that we are measuring in a quantum experiment comes at us at the speed of light (consider the case of a gamma ray detected by a Geiger-Mueller counter). Since nothing is faster than light, that detection event is unpredictable (we can't get advance warning, because that would require a superluminal signal, which doesn't exist). So then we play this game backwards to the source of the gamma ray and ask what "triggered" the release of that ray to begin with. That gives us another physical event that also happened at the speed of light, and so on. If we play this again and again, the only identifiable "source" for this causality chain is the beginning of the universe. In other words, an attempt to classify quantum mechanics by demanding causal chains makes it look as if everything, including the decision to perform a certain measurement, had to be initiated by the initial conditions of the universe. That's a completely useless piece of physics intuition. It explains no more than the usual religious "god did it" claim, except that the superdeterminist replaces "god" with "initial conditions of the universe". So every click in our Geiger-Mueller counter was caused by the "initial conditions of the universe". Great. What are we supposed to learn from that?
factor's of 1 not self similar when the principal 1 is rendered in 3 parts within the whole 1. Time and space dilate the number 1 plus or minus at a singularity.
Imagine if the AI was cellular automata, a self-organising principle within an opposite, 0 and 1 oscillating around in a field of potentiality: suddenly repulsion, suddenly space, suddenly distance, suddenly time. Suddenly a geometry separating the two warring parties on a field of singularity, singularity being where both combine and repulse in a mass of energy. Each fighting to maintain its reality: 1 conserving its energy, the other using that same energy to escape.
All we see is a product of the evolution of our reality within our discrete package in a continuous chain. Counterfactualisms are the opposite end of a scale where both factual and counterfactual are the same in a singularity of potential information from an origin outside of the singularity.
You have to give Maudlin credit for carefully listening to what Palmer was saying. He focused enough to catch Palmer's errors live ("I don't understand what this means", "I don't know what the probability of ρ means", etc.), and he was able to come up with live counter-arguments to what Palmer said.
Buddha says, "only the one-pointed succeed." Tim Maudlin needs to concede that e2 is an explorer and should investigate the possible contribution of counterfactuals. The whole thing is ultimately going to be very close to chaos, in the sense that it emerges via a fluke. That is, from our point of view. If we could get down there, where everything comes from, it would be so natural and there would be no other way.
THAT WAS GREAT!!!!!! More !!!! More !!!
“We are all agreed that your theory is crazy. The question which divides us is whether it is crazy enough to have a chance of being correct.” Niels Bohr
(lecture on a theory of elementary particles given by Wolfgang Pauli in New York, c. 1957-8, in Scientific American vol. 199, no. 3, 1958)
The following is meant to be a generalized framework for an extension of Kaluza-Klein theory. Does it agree with the "Twistor Theory" of Roger Penrose, and the work of Eric Weinstein on "Geometric Unity"? During the early history of mankind, the twisting of fibers was used to produce thread, and this thread was used to produce fabrics. The twist of the thread is locked up within these fabrics. Is matter made up of twisted 3D-4D structures which store spatial curvature that we describe as "particles"? Are the twist cycles the "quanta" of Quantum Mechanics?
When we draw a sine wave on a blackboard, we are representing spatial curvature. Does a photon transfer spatial curvature from one location to another? Wrap a piece of wire around a pencil and it can produce a 3D coil of wire, much like a spring. When viewed from the side it can look like a two-dimensional sine wave. You could coil the wire with either a right-hand twist or with a left-hand twist. Could Planck's constant be proportional to the twist cycles? A photon with a higher frequency has more energy. (E=hf; more spatial curvature as the frequency increases = more energy.) What if gluons are actually made up of these twisted tubes which become entangled with other tubes to produce quarks? (In the same way twisted electrical extension cords can become entangled.) Therefore, the gluons are a part of the quarks. Quarks cannot exist without gluons, and vice versa. Mesons are made up of two entangled tubes (quarks/gluons), while protons and neutrons would be made up of three entangled tubes (quarks/gluons). The "color charge" would be related to the XYZ coordinates (orientation) of entanglement. "Asymptotic Freedom" and "flux tubes" are logically based on this concept. The Dirac "belt trick" also reveals the concept of twist in the ½ spin of subatomic particles. If each twist cycle is proportional to h, we have identified the source of Quantum Mechanics as a consequence of twist-cycle geometry.
Modern physicists say the Strong Force is mediated by a constant exchange of Mesons. The diagrams produced by some modern physicists actually represent the Strong Force like a spring connecting the two quarks. Asymptotic Freedom acts like real springs. Their drawing is actually more correct than their theory and matches perfectly to what I am saying in this model. You cannot separate the Gluons from the Quarks because they are a part of the same thing. The Quarks are the places where the Gluons are entangled with each other.
Neutrinos would be made up of a twisted torus (like a twisted donut) within this model. The twist in the torus can either be Right-Hand or Left-Hand. Some twisted donuts can be larger than others, which can produce three different types of neutrinos. If a twisted tube winds up on one end and unwinds on the other end as it moves through space, this would help explain the “spin” of normal particles, and perhaps also the “Higgs Field”. However, if the end of the twisted tube joins to the other end of the twisted tube forming a twisted torus (neutrino), would this help explain “Parity Symmetry” violation in Beta Decay? Could the conversion of twist cycles to writhe cycles through the process of supercoiling help explain “neutrino oscillations”? Spatial curvature (mass) would be conserved, but the structure could change.
=====================
Gravity is a result of a very small curvature imbalance within atoms. (This is why the force of gravity is so small.) Instead of attempting to explain matter as "particles", this concept attempts to explain matter more in the manner of our current understanding of the space-time curvature of gravity. If an electron has qualities of both a particle and a wave, it cannot be either one. It must be something else. Therefore, a "particle" is actually a structure which stores spatial curvature. Can an electron-positron pair (which are made up of opposite directions of twist) annihilate each other by unwinding into each other producing Gamma Ray photons?
Does an electron travel through space like a threaded nut traveling down a threaded rod, with each twist cycle proportional to Planck’s Constant? Does it wind up on one end, while unwinding on the other end? Is this related to the Higgs field? Does this help explain the strange ½ spin of many subatomic particles? Does the 720 degree rotation of a 1/2 spin particle require at least one extra dimension?
Alpha decay occurs when the two protons and two neutrons (which are bound together by entangled tubes) become un-entangled from the rest of the nucleons. Beta decay occurs when the tube of a down quark/gluon in a neutron becomes overtwisted and breaks, producing a twisted torus (neutrino), an up quark, and the ejected electron. The production of the torus may help explain the "Symmetry Violation" in Beta Decay, because one end of the broken tube section is connected to the other end of the tube produced, like a snake eating its tail. The phenomenon of supercoiling, involving twist and writhe cycles, may reveal how overtwisted quarks can produce these new particles. The conversion of twists into writhes, and vice-versa, is an interesting process, which is also found in DNA molecules. Could the production of multiple writhe cycles help explain the three generations of quarks and neutrinos? If the twist cycles increase, the writhe cycles would also have a tendency to increase.
Gamma photons are produced when a tube unwinds producing electromagnetic waves. ( Mass=1/Length )
The “Electric Charge” of electrons or positrons would be the result of one twist cycle being displayed at the 3D-4D surface interface of the particle. The physical entanglement of twisted tubes in quarks within protons and neutrons and mesons displays an overall external surface charge of an integer number. Because the neutrinos do not have open tube ends, (They are a twisted torus.) they have no overall electric charge.
Within this model a black hole could represent a quantum of gravity, because it is one cycle of spatial gravitational curvature. Therefore, instead of a graviton being a subatomic particle it could be considered to be a black hole. The overall gravitational attraction would be caused by a very tiny curvature imbalance within atoms. We know there is an unequal distribution of electrical charge within each atom because the positive charge is concentrated within the nucleus, even though the overall electrical charge of the atom is balanced by equal positive and negative charge.
In this model Alpha equals the compactification ratio within the twistor cone, which is approximately 1/137.
1= Hypertubule diameter at 4D interface
137= Cone’s larger end diameter at 3D interface where the photons are absorbed or emitted.
The 4D twisted Hypertubule gets longer or shorter as twisting or untwisting occurs. (720 degrees per twist cycle.)
How many neutrinos are left over from the Big Bang? They have a small mass, but they could be very large in number. Could this help explain Dark Matter?
Why did Paul Dirac use the twist in a belt to help explain particle spin? Is Dirac’s belt trick related to this model? Is the “Quantum” unit based on twist cycles?
I started out imagining a subatomic Einstein-Rosen Bridge whose internal surface is twisted with either a Right-Hand twist, or a Left-Hand twist producing a twisted 3D/4D membrane. This topological Soliton model grew out of that simple idea.
I was also trying to imagine a way to stuff the curvature of a 3D sine wave into subatomic particles.
I think Mr Palmer is mixing up physics with philosophy, which is what all scientists tend to do. Physics involves cognition and measurement, whereas philosophy takes place in abstraction. The relationship of physics to mathematics, logic, and philosophy is that physics is only valid within the boundaries of its mathematics, which is itself derivative of logic and philosophy. A derivative has nothing to do with precision or measurement. It stands true at the limits regardless of precision.
Logic is the basis of all branches of mathematics, which is a tool for physics. You can't disprove logic using physics. This is why conclusions drawn from observations turn out to be wrong once a new parameter is discovered, and yet the logic remains true.
This idea of hidden variable will only be valid once the variable is no longer hidden.
Relativity is nonlocal, and Einstein knew that, but only after the presentation of his theory.
@@ExiledGypsy Mathematics wasn't derived from philosophy. It was derived from early empirical observations.
Curt, you've really honed your craft. Providing a glossary of terms before the discussion is proof you consider the viewer experience highly!
Got a crate of my favorite whisky being shipped to my doorstep tomorrow for this convo. It's good to have some me-time together with my imaginary best friends Tim & Tim for three hours. Sit back, relax, shout at the screen and sometimes type some angry youtube comments. Good to have some male nerd self-care sometimes ❤
Enjoy!
Can relate to that better than I‘d like to 😂 (although I’m female and I don’t like whiskey)
You'd need a whole crate of whisky to find this episode relaxing!
1:24:30 Dr Maudlin states that the value of the Nth digit of pi doesn't give any information about what's been going on in the Universe before running the test setup.
This is true for TOEs - practically all so far - that take for granted/given:
- algebra to be applied
- number system (reals vs rationals, for instance)
- metric
- constants of nature
The value of pi as a math concept is based on an infinite series of terms. I'm working on an alternative model where the value of pi is refined at each iteration in the evolution of a universe, based on an ever-growing number of logarithmic poles. Hence Dr Maudlin's argument remains valid only when we specify such mathematical objects exist as physically meaningful values in a potentially finite universe.
Btw, in my model also the logarithmic potential itself surrounding these poles evolves (by addition of terms in a series) along with the increasing number of poles in the system.
Funnily enough, these effects seem to counterbalance each other to keep Euler's identity true whatever the number of poles, from 2 upwards.
Otherwise, Dr Maudlin is tilting at windmills, and I highly appreciate his efforts, especially the ideas about how to modify the number system at the base of physics from real numbers to rational ones.
I, a lowly wind turbine technician, struggle to understand a majority of what was discussed here but I do take pleasure in listening to highly educated individuals disagree in such a constructive way. You, Curt, are an invaluable host in this podcasting space 🙏
💯 (well, a number as close to 💯 as we can get in the real world)
Wow! that got the Brain working overtime. At a personal level, it's prompted me to clarify obscurities in my own understanding. Always a good thing! Thanks to all three of you (and behind the scenes support!).
Great stuff Curt. It's super refreshing to hear my favorite thinkers in physics being interviewed by someone competent in the subject who asks the questions I would have wanted to ask.
Prof Palmer, how honored I am to meet you for the first time. I have been following Tim Maudlin for years now and saw this talk with him. What an honorable man you are to show respect for Tim Maudlin's work. With more people like both of you educating our planet, our earth might not be doomed to destruction in the next few hundred years. I have also been following polar bears and just educated myself about global warming. Thank you both for your time!!!
A beautiful discussion. I truly enjoyed watching these two make such efforts to understand one another. It gives one a good sense for how much room there is for discussion and further thought in the sciences and philosophy. Thanks for making it happen gentlemen, especially you, Curt.
Having worked with more than 60 graduate students in 3 different research groups, it's important to point out that there are academically talented students who ALSO have limited independent problem-solving skills, lack practical real-world experience, and lack the ability to construct sophisticated conceptual models built upon foundational ones. Clear thinking takes time to evolve. If they do not improve under mentoring, they will not become good experimentalists, but will instead do academic, mathematical, analytical work based on existing literature. They can be really intelligent but do not necessarily become creative problem solvers outside of the known, well-established works in their chosen field. This comment is not strictly about experimentalists versus non-experimentalists, but about the ability to think independently.
As the person responsible for creating an apparatus for an experiment, on more than one occasion I had to block the involvement of some graduate students in experimental work because they were ill-suited for the task. My own experience has been that about one out of 12 students are the exception regarding their creative problem-solving skills. I am seeing evidence of this trained academic versus thoughtful creative thinking in this discussion.
Palmer is autistic with a special interest in fractals, so he's stuck on synthesizing his fractal interest into this topic. I know virtually nothing about Bell's work, but here's what I take from the discussion of it. He means to solve the problem of being unable to actually carry out the same exact experiment more than once. The universe is expanding, there's some humidity in our tools, a deer farted in the forest, etc., so we need a way to cope with that reality. If we're just talking about binary outcomes, a function in the programming sense is a perfect analogy. In code, say you have a function that takes an integer as a parameter and returns a boolean. The function logic could be anything. We can land anywhere at all in pi and take any number of digits, and pass that in as an integer to our function, and we'll get back a boolean when we execute the code. We could run the program again using the same value for the parameter and get the same result. The experiment has been superficially duplicated, but in the real world, different electrons ran through the circuits of our computer, and the computer temperature was not exactly the same. Every time we run the function, we're using different initial conditions on some level. That's one aspect. Another is that the function code could include a random number generator, so we really don't know what to expect, even if we provide the same input multiple times. Applying this analogy to the discussion in the video, Palmer wants to presume that the input parameters are more important than they really are because he fails to imagine that the function could contain a randomizing element to its logic. He thinks the randomness of the input parameters is all that's going on. He sees nothing wrong with his suggestion that every measurement can be done with a different lambda because he confuses the slight temperature change in the circuit board running our function with running a completely different function. 
We could use a different lambda for each measurement if we knew that each one was deterministic, but only if we used the same value for the parameter on each measurement, meaning that we'd have inverted everything, measuring the lambdas against an input instead of measuring inputs against a given lambda. If I had to hazard a guess at what the "hidden variables" question is about, I'd say they are analogous to global program state that may or may not be referenced within our function. Could be way off on all of that, but if you've made it this far, thanks for indulging my brain dump after watching this video.
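The function analogy in the comment above can be sketched in a few lines of Python. This is a toy illustration only; the function names and their logic are invented here, not Bell's actual formalism:

```python
import random

def deterministic_detector(setting: int) -> bool:
    # Pure function: the same input always reproduces the same outcome,
    # so the "experiment" is repeatable at this level of description.
    return (setting * 2654435761) % 7 < 3

def stochastic_detector(setting: int) -> bool:
    # A randomizing element hidden inside the function logic: repeating
    # the run with an identical input need not reproduce the result.
    return random.random() < 0.5

# The deterministic run is reproducible for a fixed input...
assert deterministic_detector(42) == deterministic_detector(42)
# ...while the stochastic one can differ between runs with the same input,
# which is the possibility the commenter says Palmer's picture leaves out.
print(deterministic_detector(42), stochastic_detector(42))
```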
@@shanerigsby9030 Lambda is, by definition, the parameter that determines the result of the experiment. It's not just one of a bunch of properties that do, as in your example. If you define lambda this way, what Palmer says is completely true. Unfortunately he isn't successful in getting his point across, because he never specifies that the derivation of Bell's inequality rests on your ability to perform an integral over multiple lambdas (more technically rhos, which are functions of lambda). That's where the ensemble part that the other Tim talks about comes into the picture, through an integral. But if the integral is not mathematically legitimate, then all bets are off.
"Clear thinking...good experimentalists" versus "mathematical...work based on existing literature"? Your experiments are based on existing lit. I'm glad I'm not one of your students. I don't think you should be employed to make any decisions about any student's potential. You're a sciolist, i.e., you claim knowledge which you do not possess.
I believe that a debate between a scientist and a philosopher is not fair. They both talk in different languages. And I always trust a scientist over a philosopher. Both can be wrong or right. But I trust the scientific method over imaginative deductions.
I usually would agree with you, but in this case neither of the gentlemen understand physics even at the undergrad level. They are both very good at pretending, though. ;-)
1:34:25 is the most important part of this discussion, I think.
Agree. I wish Maudlin would go along with Palmer's point so we could see where it takes us. It seems arbitrary to pick on that assumption when there are so many interpretations. There are so many sillier things in QM than the idea that a difference in the millionth digit of pi (or whatever) could have an impact on a single electron.
Exactly
Awesome ... clash of Tims
I thought you misspelled "titans" for a moment.
I'm an idiot....😂
Only have one question: do you or do you not believe in, or have evidence for, a fourth spatial dimension? If yes, then an infinite 3D universe potentiality is possible.... You're trying to exclude yourself from infinity!!!!! This is my first comment. Will leave more as the video progresses under separate threads.
@@AquarianSoulTimeTraveler
I'm not sure if I "believe" in a fourth spatial dimension because I do not have enough evidence for it, but I would still assume it might be possible. The world of quantum physics allows for weird things to happen, including Tim talking to another separate version of himself.
@@AquarianSoulTimeTraveler I'm not sure why a fourth spatial dimension would necessarily cause the universe to be infinite. It may be infinite without it. I'm not trying to start shit... just trying to understand your thought process. I may be missing something.
The Tims don't just clash, they prove that no one understands QM.
Thank you Curt. It's good to be with you and your guests 😊
Ooohhhh, Tim Palmer. Did you get a chance to talk with him about climate? His talk with Sabine was quite entertaining.
Well done Professor Palmer you made your point so far as I am concerned.
As far as I'm concerned, he's grasping at straws, and Maudlin makes perfect sense.
I have a PhD in math and I fail to see why physicists call Bell's argument (and the resulting inequality) a _theorem_ when they don't even agree on the hypotheses!
But it's even worse than that: Bell's argument doesn't even define all of its terms as unambiguous mathematical objects. So Bell's argument is "just" a very important rational argument that comes with a computation of probabilities associated to mathematically ambiguous entities (such as "a theory"). It's *not* a mathematical theorem.
Ok. I'm learning. Who was correct, in your opinion, Maudlin or the other guy?
@@JohnnyComelately-eb5zv Neither. There is a step in the derivation of any inequality of Bell's type which amounts to saying that measuring a pair of entangled particles at angles A and B, and another DIFFERENT pair at angles B and C, is the same as measuring a single particle pair at both angle pairs AB and BC, which is obviously an impossible measurement. It's all about a wrong application of counterfactual definiteness; without this wrong step the absolute bound is 4, which is obviously more than the 2√2 of quantum mechanics, that is, no violation.
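For readers who want to see the bound being argued about, here is a toy CHSH simulation. This is a sketch under simplified assumptions; the particular hidden-variable model is invented for illustration and is not what either commenter derives. A local deterministic model, where each outcome depends only on the local angle and a shared λ, stays at or below the classical bound of 2, while quantum mechanics predicts up to 2√2 ≈ 2.83 at these settings:

```python
import math
import random

def outcome(angle: float, lam: float) -> int:
    # Local deterministic model: the result depends only on the local
    # detector angle and the shared hidden variable lam.
    return 1 if math.cos(angle - lam) >= 0 else -1

def correlation(a: float, b: float, n: int = 20000) -> float:
    # Estimate E(a, b) by averaging over many shared hidden variables.
    total = 0
    for _ in range(n):
        lam = random.uniform(0.0, 2.0 * math.pi)
        total += outcome(a, lam) * outcome(b, lam)
    return total / n

a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = correlation(a, b) - correlation(a, b2) + correlation(a2, b) + correlation(a2, b2)
print(S)  # close to 2: the classical CHSH bound, below quantum's 2*sqrt(2)
```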
The bell experiment result table should be sent back to the particle entangler to tell it to entangle a pair or not in various ways, and see if anything unexpected happens on subsequent output, perhaps looping back the result many times up to billions of times in clever ways both logically and epicyclically. Feedback or closing the circle might reveal something just as profound as interference did in the double-slit experiment.
I absolutely love how far the discussion of superdeterminism has come over the past few years. It used to be that to even talk about superdeterminism you'd get aspersions thrown your way that you either believe in a universal conspiracy or that without statistical independence you couldn't do science at all! Both so called objections are just attempts at dismissing superdeterminism without engaging with the subject. I'm glad to see that it's getting a wider acceptance.
There is no need to talk about superdeterminism. It doesn't buy you anything in terms of physical insight. I can tell you why the universe looks superdeterministic, but that's merely a fata morgana. Operationally looking at it doesn't lead to any usable insight.
@@lepidoptera9337 We don't know where superdeterministic theories will lead us as there aren't any out there yet. They could lead to physics beyond the standard model, who knows. It's worth pursuing.
@@trucid2 The universe looks superdeterministic even classically. If you make a measurement in the future, you are picking up energy that currently resides in spacelike separated regions of spacetime. How much energy is there is completely unknowable, i.e. the future is fundamentally unpredictable. You can play this argument at all scales and it stays valid. In particular you can ask where the energy comes from at those spacelike regions. Well, it comes from even further spacelike regions. No matter how far (back) you go, it always comes from a place that has never been visible to us before. It will, at the end of the day, lead us all the way back to the beginning of this era of the universe. That's totally useless knowledge. Unknown is unknown is unknown.
Ahhh, time to spark a joint, crack a brew-ha-ha, and grab my note pad for yet another Dr. Maudlin master class. I've listened to, read, and kept notes on everything Dr. Maudlin has done, and I'm thankful to Curt for keeping the knowledge coming!
1:31:20 Got hard to watch here, seeing such a great mind as Prof. Palmer has, but Prof. Maudlin is just on mf'in fire. He is just seeing the code in real time.
And yes, I do look at this as my type of 'nerd wrestling' 😂 because I see this as being in an Ancient Greek university where all these big ideas and theories are bouncing off each other, clashing between the greatest minds!
Prof. Maudlin could not have been clearer, and he kept asking Prof. Palmer to clarify his position, fully acknowledged what exactly the two were disagreeing about, and tried to resolve it.
Alternatively, to my previous post, you could use the results of one bell experiment, to drive a subsequent or parallel one, the first acting as the second’s pseudorandom generator. Something tells me if you let this thing feedback on itself you might see something interesting. Or you might just descend into randomness. Perhaps carefully crafted experiments might amplify a subtle effect when such feedback is applied.
The pseudorandom generator is defined as not having any cyclic attractors until its cycle repeats, and a million-digit seed would outlive the universe. The selection of instrument orientation, driven by a random coin toss of that depth, would not alter the balance of the distributions in any meaningful way. To suggest otherwise is a fallacy of composition, requiring a superdeterministic universe where even God's choices are predetermined, and free will and economic progress are predetermined as well.
It's hard to even figure out what the chaos guy is even trying to say. What does the millionth digit of the value of a variable in a dynamical system have to do with the value of the millionth observation in an experiment? It seems that he isn't talking about anything even vaguely similar to what Bell and Maudlin are talking about.
Palmer is saying that there is no such thing as probability, because if it happened, it had a probability of 1, and if it didn't, then 0. He is arguing for extreme determinism, and his rationale is that the seed values of a fractal change its path, but that's only true because a fractal is mathematically deterministic. What he fails to acknowledge is that math can only be a model of the universe. It's not the universe itself. Within a fractal, the whole universe is the mathematical model, but that's not what is at stake here. We're talking about the actual universe, and the theorem is a model to understand the real universe, not to understand a deterministic mathematical model. Determinism is absolutely at the heart of their disagreement.
A realized outcome doesn't have a probability. Technically nothing (as in no thing/system) has a probability. Physical systems can produce outcome frequencies, but only the theory produces probabilities, which are ultimately just estimators of actual frequencies. I don't know why so few people understand this. It's not rocket science.
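The frequency-versus-probability point above is easy to demonstrate (a minimal sketch; the probability value is arbitrary): the theory assigns a probability p, and the only thing repeated runs of the system ever hand back is a frequency that p estimates.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

p_theory = 0.3      # the probability the "theory" assigns to an outcome
n_runs = 100_000    # repeated runs of the "physical system"

hits = sum(1 for _ in range(n_runs) if random.random() < p_theory)
frequency = hits / n_runs

# The system only ever produces this frequency; the theory's p is useful
# exactly insofar as it estimates it.
print(frequency)  # close to 0.3
```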
I wondered if this discussion could have been summarised without all the intricacies of the Tims respective positions.
I gather that Maudlin believes a fully predetermined universe precludes scientists from doing effective science, as their attempts to capture statistically independent sets is undermined. He uses appeals to Bell as reinforcement.
This position is similar to a common position on the subject of Free Will: that predetermination prevents a person from making fully rational decisions, as the decision was a foregone conclusion.
It seems "compatibilism" (borrowed from the Free Will topic) is relevant here. I think it remains to be proven that, in a predetermined universe, the experimenter would have no ability to effectively partition statistical sets in order to follow general scientific methodology EXCEPT in the case of studying entanglement; predeterminism would then become the "hidden variable" that undermines the Bell Inequality, and preserves locality.
@@ThePurza Maudlin is a philosopher. He has probably never been in a lab where they are doing quantum measurements. Absolutely nothing is predetermined in the lab. The entire theory is based on a lack of determinism.
As a science enthusiast, I think the difference between Tim Maudlin's and Palmer's interpretations of quantum mechanics is that Palmer was talking about the quantum state of the atom when the atom is in a quiescent state; when the atom is undisturbed, no serious statistics is involved in understanding its quantum state, even if the atom is presumed to be in a chaotic state. However, based on the inferences, whenever the atom is in the act of being observed, statistics is involved, because in the chaotic state there are some reactive physical entities involved in the process, which I call a composite wave, that could pop in and out of existence. By virtue of this, a composite wave cannot develop any non-zero net effective force affecting the angular momentum of the observed electron until the atom is in the act of being observed, which in turn forces the composite wave to collapse, altering the quantum state of the electron in question, something I believe Palmer overlooked in his analysis.
Penrose describes one-pair testing technology for Bell's theorem / non-locality testing in his Shadows of the Mind, 1994.
Really great theolocution, Curt; well done! Even though progress wasn't definitively made between the two, it's amazing to listen to and watch the views clash. I was super impressed by Curt's interjections. He interrupted at the perfect times, when things were going nowhere, and asked a very good and perfectly pointed question of Palmer (something like "can you explain your position with ensembles?").
Tim Palmer’s closing remark on “holistic fractal structures” being more fruitful (than Planck scale structures) for the formulation of quantum gravity is prescient.
I agree with Palmer on this (though perhaps not for the same reasons). It seems to me that the only possible way in the near future we will be learning about / resolving what goes on at the scales of quantum gravity… the only way this will be achieved is by observing the larger Universe and making inferences at the quantum level.
What do you mean by "prescient"?
@@G_Doggy_Jr Y. How can commenter know if it's prescient? lol
@@CurtOntheRadio retrocausality 😂😂
Palmer understands the invariance of complex numbers or says he does, but doesn't say why a complex number has anything to do with invariance. I would like to hear an explanation of how this is so.
Curt, you sir, realize it or not, are at the same time a young, good-looking, successful man who has a degree in theoretical physics/mathematics, a natural affinity for anything that has to do with audio and recording equipment, and, from what I can gather, great taste in every field of interest you admit to. Which leads me to how brave you are to step on stage, if you will, with the modern-day equivalents of giants among men (so far as science, physics, and philosophy are concerned). And last but not least: you are humble, respectful, curious, studious, and hip to the most relevant literature, and you display a damn near perfect balance of letting guests speak their piece the way they wish it to be consumed while managing to push, pull, steer, and rein in the flow and direction of the episodes. You moderate debates between guests while prodding them along, and you do all this while also personally consuming, retaining, and summarizing, politely yet academically and rigorously holding your guests to the highest standard without surrendering any important ground, bowing down, or giving any guest an opportunity to mislead, lazily communicate to us, the audience, or use your platform to weaponize and/or propagandize any tool, idea, or subject using their professional or academic status, reputation, or position. You are a true gem, man.
Dude, you are dripping and drooling. It's embarrassing. Go get some personal relief. There are apps for that.
Just read this. Thank you for such a kind analysis Davy. That means plenty to me and my wife. - Curt
Everything that Maudlin said was crystal clear, and made perfect sense. Palmer kept circling around a number of fairly irrelevant things...
@@exp9r interesting! I see it completely oppositely.
I surely enjoyed this. Much appreciated. However, it was disappointing to see Dr. Maudlin never really understand what Dr. Palmer was saying. He didn't seem to try very hard. Instead, he continually defended his own perspective and attempted to undermine Palmer's, which he admittedly didn't understand.
I am also a physicist… and my thesis focused on chaotic systems… and I also discovered the remarkable ignorance that most physicists maintain around chaos theory. Why is that, I often wondered… and came to the conclusion that it’s too dangerous to most views on physics.
For example, it is proven mathematically that a chaotic set is indistinguishable from randomness… if viewed with sufficient coarse graining. Thus, the stochasticity of quantum properties could, in fact, be deterministic chaos… and yet, inherently indistinguishable from pure randomness… so long as the required resolution to observe the determinism were below the planck scale. Of that one fact, most physicists are strangely ignorant.
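The coarse-graining claim above can be illustrated with a standard textbook example (a toy sketch, not the commenter's proof): the logistic map at r = 4 is fully deterministic, yet its coarse-grained binary symbol sequence behaves statistically like fair coin flips.

```python
def logistic_bits(x0: float, n: int) -> list:
    # Iterate the deterministic logistic map x -> 4x(1-x) and record only
    # the coarse-grained symbol: 1 if x >= 0.5, else 0.
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

bits = logistic_bits(0.1234567, 50000)
freq = sum(bits) / len(bits)
print(freq)  # near 0.5, like a fair coin, despite zero randomness
# Determinism is still there underneath: the same seed replays exactly.
assert bits == logistic_bits(0.1234567, 50000)
```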
Such is the general case for physicists… they are mostly ignorant of chaos theory except at a superficial level.
That’s a VERY big deal… because, what chaos theory shows us is that the Universe IS a chaotic system. Must be.
Anyway… I digress. My point is that… to anyone sufficiently knowledgeable in the domains of both quantum physics AND chaos theory… it is clear that Maudlin simply did not understand Palmer… whereas Palmer certainly understood Maudlin. That's a bit sad, because Palmer obviously hoped to share something valuable with Maudlin…. namely, the perspective of a physicist who is adequately knowledgeable regarding chaos theory.
(Note: When Maudlin supposes himself to be summarizing chaos theory, Palmer shakes his head "no"… clearly indicating that Maudlin didn't understand. Anyone well versed in chaos theory shook their head at the same time.)
I highly respect and appreciate Maudlin. I hope he turns some of his big brain toward chaos theory. It would help him for sure.
Thanks for this awesome video and all the others you offer.
Be Blessed, TOA
Yes, you don't understand physics. ;-)
1:01:54: What Dr. Palmer is implying is that for *some reason* the universe might only *allow* initial conditions "on the attractor." My position is that such an assertion must have some "reason." That's a very special case - it needs to be justified before being taken for granted.
Looking forward to this convo 🎉
I could follow Maudlin but had no idea what Palmer was talking about.
Skill issue. On your part and his. He's not a great communicator but if you don't understand what he's saying, you're being intentionally obtuse or have some other issue.
What’s the point of having a model if the vast majority of the contributors to greenhouse gas are China and India and neither one of them are going to change anytime soon. You can model all day long but if those two don’t modify their production of greenhouse gas, your model is pointless. I find it maddening that we waste a lot of time like in the UK where they contribute 1% and they’re trying to knock it down. Figure out a way to help India.
It's all a scam. The CO2 concentration never correlated with the temperature in the geological history. Ice ages happened even with 3000 ppm.
Bell's theorem is a mind-boggling concept about reality; I have tried many times to understand it but failed. Any debate between well-educated scientists like this video brings us a bit closer to understanding it. Well done, I enjoyed it.
You don't need to understand it because it's the wrong approach to the problem.
Do the blackboard in-person round 2 🎉🎉🎉
As an analogy, Palmer is asking the equivalent of what would happen to the universe if we changed the gravitational constant to a number other than what we measure it to be. Maudlin is saying "you can't do that". I agree with Maudlin. But you can also screw around with ideas to find where they go. Like finding how stable something in the universe is, or how unstable it is.
You seem to have several interesting points and questions. Have you considered consolidating those somewhere?
I'm not super-involved in the "TOE Community", so I don't know if there's something like a discord where people discuss such things.
But I wonder if there's some way that your observations/questions could reach Curt?
@@fnamelname9077 He has a Twitter, but somehow all the discussion happens here on RUclips, where comments are not easy to read and it's hard to trace the replies. We need to move to Twitter or somewhere else.
That is not how I interpreted Palmer. He never mentioned tweaking constants of nature; in fact his whole concern seems to be thinking more carefully about which variables you are free to tweak in a specific theory. Maudlin seems to me to argue that in the context of the Bell inequalities you are ONLY allowed to consider large enough ensembles, and that they say nothing about single-measurement scenarios.
@@bazsawow regarding your comment on Maudlin: larger ensembles and sub-ensembles give you more statistical confidence, so yes, they are good. It would be a fool's game to study a large ensemble in an attempt to say what happens in any single run of the experiment. All you can get is a probability of what the result of a single run will be.
@@timjohnson3913 exactly, that's how I understand what Maudlin said in this debate.
I have a follow up question though on this point. Does this then mean that locality is also only violated in a statistical sense and not necessarily in individual cases? What does it mean for locality to be violated statistically?
In my opinion, Tim Palmer's interpretation isn't incorrect per se; it is irrelevant. For one thing, the experiment Bell describes to test his theorem is set up in such a way that the uncertainty associated with chaotic systems can be avoided. In this way, the fact that chaotic systems are sensitive to changes in initial conditions will not affect the experimental results. Furthermore, the idea that a hurricane could have been caused by something as insignificant as a butterfly flapping its wings only arises because the butterfly is immersed in an atmosphere. Thus, the two seemingly separate events are connected across space and time by a causal chain of trillions of oscillating oxygen atoms.
The counterexample given by Professor Palmer refers to an analogous chaotic system; however, the changes made to any free variables are still communicated to the system via a physical interaction. But this is exactly what Bell was attempting to rule out, which he did by providing a rigorous experimental component to test his theorem. He specified that during this crucial phase of the experiment, measurements were to be made with sufficient distance between subjects to ensure they were not causally connected. In other words, the hurricane would have already finished forming before the atoms pushed by the butterfly's wing could reach it. In much the same way, any observed correlations between measurements made by Alice and Bob could not be explained in similar fashion, i.e. a local interaction in Bob's lab radiating out to Alice's lab and affecting her results, and of course vice versa.
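As an aside, the sensitivity to initial conditions being discussed in these comments is easy to demonstrate numerically. The sketch below is purely illustrative (a simple forward-Euler integration of the Lorenz system with hand-picked step size and horizon, not anything from the video): two trajectories that start one part in a billion apart end up macroscopically separated.

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz system (classic chaotic parameters).
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-9, 1.0, 1.0)  # same state, one coordinate nudged by 1e-9
for _ in range(8000):       # ~40 time units: plenty for the nudge to grow
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
# separation ends up of order the attractor's size, despite the tiny
# initial difference: the butterfly effect in a dozen lines
```

Both trajectories stay bounded on the attractor; only their mutual distance blows up, which is the distinction between chaos and mere instability.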
I agree Palmer misses the point completely. I don't agree with Bell's theorem either. For one, it's not a mathematical theorem as Maudlin asserts: it's a statement about physical reality, guided by mathematical AND physical assumptions. If it truly were a theorem in the mathematical sense, we would not have loopholes.
The second issue (and the most important one) is that any inequality of Bell's type requires a specific assumption about counterfactual definiteness: that the results of measurements that were not performed are still well defined. This is the original definition of realism given by Einstein in the EPR paper, and it is not some handwavy concept as Maudlin would have you believe; it's a mathematically well-defined concept. We can have two types of counterfactuals, relating respectively to a weakly objective or a strongly objective interpretation of results. The crucial point is that if you define your counterfactuals in the weak sense (as Bell did), meaning they must refer to different pairs of particles for each measurement (as is physically the case), you can't derive the bounds in contradiction with QM. Instead of 2 for the CHSH correlator, you get 4. To get 2 you need to reduce the independence of the various experimental runs as if the properties you measure were actually those of a single pair, and this is equivalent to a strongly objective interpretation. So Bell mixed the two up, which is not logically reasonable.
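The numbers quoted in this comment (2 versus 4) can at least be checked by brute force. This is only an illustrative sketch of that arithmetic, not the commenter's own derivation: with "strong" counterfactuals, each run carries four predetermined outcomes A1, A2, B1, B2 in {-1, +1}, and no assignment pushes the CHSH combination past 2; if instead the four terms come from independent runs, the algebraic maximum is trivially 4.

```python
from itertools import product

# Strong counterfactuals: one hidden variable fixes all four outcomes at once.
# CHSH combination: S = A1*B1 + A1*B2 + A2*B1 - A2*B2, outcomes in {-1, +1}.
chsh_max = max(a1 * b1 + a1 * b2 + a2 * b1 - a2 * b2
               for a1, a2, b1, b2 in product((-1, 1), repeat=4))

# Weak counterfactuals: each of the four terms belongs to a different particle
# pair, so the four terms can be chosen independently of one another.
independent_max = max(t1 + t2 + t3 - t4
                      for t1, t2, t3, t4 in product((-1, 1), repeat=4))

# chsh_max == 2 (the classical Bell/CHSH bound); independent_max == 4
```

The enumeration makes the logical structure explicit: the bound of 2 comes entirely from requiring the same A1 to appear in two terms (and likewise A2, B1, B2), exactly the joint-assignment assumption the comment is disputing.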
Expansive thoughts! Exactly what was needed following the prior episode with NDT -: Contemplative thoughts considering new frontiers. No porcupines up 🌳 that "Aren't really there 'cause beliefs about porcupines based on feelings absent facts." - (N*T)
(You did a great job in that interview, btw. N*T is fine for pop-science promotional stuff, not a knock on him. Just some people are better suited for one thing than another. His lane is not my lane, no biggie.)
Can we get Dr. Palmer and Prof. Tim Maudlin back? This is one of the best debates of 2023-2024. I've watched hundreds and I keep coming back.
Maybe some other topics outside of chaos?
Great discussion as always!
This discussion leaves me with the following questions:
1) Maudlin seemed to have a strong point when he said that Bell's statistical independence assumption is a _mathematical_ assumption, which puts the burden on Palmer to justify his metaphysical reading of that assumption that apparently goes beyond the mathematical expression. Can Palmer justify his reading of the statistical independence assumption? If he cannot do so, then this undermines the rhetorical force of his non-conspiratorial model of the violation of statistical independence. In service of Palmer's position, one might try to draw a parallel with Maudlin's comments regarding quantum mechanics. For Maudlin, quantum mechanics is not a physical theory; rather, it is a mere mathematical formalism from which predictions can be derived. Theories must go beyond this and must tell us what the world is like. Perhaps one could argue that the type of move Palmer is making is of the same kind, of "going beyond" the mathematical formalism.
2) Palmer seemed to have a strong point when he pointed to textual evidence of Bell's writing, of variables being "free" in the sense of "could have been otherwise". Of course, this goes back to the point above, where Maudlin points out that Bell's main claim is expressed in unambiguously mathematical form. Nevertheless, Palmer did seem to have some significant evidence to support his position. Could it be argued that Palmer's reading of Bell is correct? If this argument is strong, I think this might prompt Maudlin to pause, and to take more seriously the possibility that had Bell been alive today, he might have been persuaded by Palmer's model of non-conspiratorial violations of statistical independence.
It all boils down to this crazy superdeterminism idea: that the random number generators must be intentionally non-random to produce a result impossible for any hidden-variable system, as Bell's theorem shows within the theory of QM itself and experiments support. All the things Palmer says are just word salad without any meaning (physical, mathematical, logical), no sense at all. He's just a fraud.
I didn’t take it as Tim Maudlin not having an argument against the counterfactual determinism, or whatever Palmer wanted to push. I took it as Maudlin just not wanting to go down that route of argument because it didn’t apply. And Palmer or someone needs to explain to us all where counterfactuals fit into Bell’s mathematical statement of statistical independence. Or why and where Bell went wrong with the math.
@@timjohnson3913 I think I agree with you -- Maudlin's rejection of Palmer's hypothetical model was not due to it being internally inconsistent or illogical, but rather because it was not applicable, and so did not threaten his interpretation of Bell's theorem.
My questions (1) and (2) explore two different angles of approaching the question whether or not Palmer's line of reasoning was applicable. Question (1) explores whether there is a justification for Palmer reading counterfactuality into the mathematical expression of statistical independence. Question (2) explores whether there is compelling textual evidence in Bell's _Free Variables and Local Causality_ that support Palmer's view of what Bell meant when he referred to variables being "free".
Very surprised the definition 'free variable' is up for discussion.
What’s your definition of free variable and how does that impact the statistical independence assumption of Bell’s Theorem?
Just used ChatGPT to garner an explanation of Bell's idea... it sure did clear up the complexity of the discussion. Thank you for the presentation, learned a bit more.
Fascinating talk. I bet Curt enjoyed it. I did, as much as I could understand. 😎
How much wood would a woodchuck chuck if a woodchuck could chuck wood?
Pizza was just now delivered....lets gooo
From a heuristic state you can model the universe deterministically or not. There are no laws preventing either choice. But which model tells us more and does so in a clear way. With respect to clarity a statistical model hasn't got a chance. (no pun intended)
Well, I’m really enjoying this: two heavy intellectuals slugging it out over statistical independence and Bell's theorem. I’m probably understanding only about 50%, and in my book Tim M is ahead. If I understand the other Tim correctly, he’s saying you can’t vary only one variable in a particular environment; everything is interconnected. But as Tim M is saying, that’s the whole basis of RCTs: you randomize apart from the thing you’re testing for. Anyway, thanks to both, and to Curt for helping me understand the theorem a bit better and see the way these guys think.
There is nothing to slug out. Statistical independence can be tested with correlation functions. Some quantum systems produce independent outcomes and others do not. Those systems that do not can not be treated with the simple formalism of quantum mechanics that we are teaching in beginner's classes and that relies on a straight forward ensemble approximation. A theorist who tells you that it's either/or simply doesn't understand physics. It's both and one has to adapt the theory to the requirements.
Exact initial conditions aren't "free variables"; they are irreproducible. "Floating variables" seems more precise. It sounds like Bell didn't make that explicit, but that is what he seems to be saying.
You could get emergent statistical independence if the function determining collapse outcomes is balancing between complexity of spatial form and complexity of change over time.
In finite measurements, we would only see it balancing complexity over time, which would be randomness.
Balancing complexity of form across space would only be evident from a correlation between local and global complexity. I’ve got a video about equations that describe that correlation.
To use the “digits of Pi” example, the digits are not just a series of collapses or just states of space. They’re both. The digits of Pi are a series of dance moves between universal state and individual collapse.
It seems to me that a counterfactual in quantum mechanics must always return a wave function, rather than a definite value, and therefore include any and all possible superpositions.
Then we're back to Schrodinger's cat - is it in the superposition of the alive state and the dead state?
Pretty hyped for this, but shouldn't it be called "Superdeterminism vs Nonlocality" or so? After all, superdeterminism is implicit as a possible solution in Bell's Theorem…
I think "statistical independence" is one of the assumptions of Bell's Theorem, and that is what superdeterminists reject. So in that sense, the name makes sense. Although, I think you make a fair point, as superdeterminists don't necessarily reject Bell's Theorem in its entirety; they only reject statistical independence.
indeed you are right.
I think it was more about Bohmian dynamic holism vs. statistical mechanics. Palmer gave mathematically sound arguments in favour of "God doesn't throw dice"; Maudlin kept confusing a mathematical theorem with randomized testing. It takes only a single counterexample to disprove a conjecture, and Palmer offered that against the conjecture of "statistical independence".
Nothing wrong with rejecting both statistical mechanics and localism as universals. :)
@@santerisatama5409 palmer doesn't argue for nonlocality. in his paper, he defines a superdeterministic theory as being local
@@maxwelldillon4805 People can be right on one issue and wrong on other issue. I think non-localism has been unnecessarily mystified. Local can be simply defined as consecutive and non-local as parallel. Already the "tape" of the mathematical definition of a Turing machine is a parallel BOTH left AND right continuum. Dyck language is another very familiar example.
The fatal flaw of Einstein's relativity is the attempt to deny the possibility of standing waves of light. The Thomas precession attempt to "fix" the problem basically leads to fully scalable quantum cosmology.
@30:50 maybe important, maybe not, but I'd suggest we should disagree with TM here. Indeterminism does not mean, "things can be exactly the same up to a certain point and then diverge...". Quantum indeterminacy is only ever about the measurement outcome, not what may or may not happen in-between in the spacetime cobordism. To me this is an important subtle disagreement because I have a different view of QM and GR. I do appreciate the orthodoxy thinks like TM, they should check the strict logic of time evolution though. They're assuming a Hamiltonian and no advanced causation. In fact they're not just assuming a Hamiltonian, they're assuming their Hamiltonian is somehow perfectly describing how time evolution develops. I'd suggest Feynman showed us that's _not necessarily_ the case.
It's subtle though. When the Hamiltonian is only offering a statistical description then you cannot really prove that it is "wrong", and indeed it is not really wrong, the point is the Hamiltonian time evolution is an incomplete picture of reality. (As are scattering amplitudes which are probabilistic.) Is a Bell experiment just scattering processes? Yes it is. You should think hard about that. How can this be non-local? How can anything be non-local without mysticism of some variety? (MW and multiway hypergraphs and whatnot, all of which are dumb mysticism when mysticism is not needed).
(I am not against mysticism in its right place, but QM and GR is not the place.)
I have some (as yet) incomplete thoughts. First, from Wiki the definition of 'locality'. "In physics, the principle of locality states that an object is influenced directly only by its immediate surroundings. A theory that includes the principle of locality is said to be a "local theory". This is an alternative to the concept of instantaneous, or "non-local" action at a distance. "
So, let's take a simple quantum experiment. A television manufacturer tests his "electron gun" for the TV set by firing a beam of electrons at the target. If the electrons leave the gun when he turns the switch and they make a permanent dot on the screen by chemical reaction, then the universe is 'local'. In other words, the electrons are demonstrated to have left the gun when he turned the switch. He 'caused' the switch to close, which connected the circuit, which allowed electricity to heat the filament which boiled off the electrons. Now, this manufacturer had been 'dreaming' or 'thinking' about turning the switch for the last twenty-four hours (because it was Sunday and the lab was not open on that day). Further, he can tell when he walks in the lab on Monday, the screen was blank of electron spots. His thinking, dreaming, or other entanglement with the system did not cause the gun to fire on Sunday. Only on Monday when he became part of the critical causal link to close the switch did the gun fire.
The gun does not fire at other times. Other T.V. manufacturers thinking of the same test don't cause 'his' electron gun to fire. Only when he flips the switch. This is how we understand the world. This is how we fire electrons at other atoms for investigative purposes. This is a quantum experiment. It demonstrates 'locality' and 'only locality'. There is no spooky action at a distance in this well-defined experiment, which can be repeated all around the world any time one chooses.
The only place where I see probability involved at all in the experiment is here: the filament of the gun is made of "atoms too numerous to count". The too numerous atoms have many more "too numerous to count" electrons. One might suggest that precisely which electron(s) boils off is a matter of probability. Really, we know the filament is heating, which is causing the atoms to vibrate more rapidly, and several or many of the atoms vibrate enough to kick-out an electron. So, while we can say which electron is fired is a matter of probability, it really seems if we were small enough to actually track which atoms vibrate either most violently or at the right frequency, we could tell which would then be most likely to eject the electron. There is nothing spooky about this. We do this every day.
By contrast, here are some processes that I don't understand: 1) using a pseudo-random number generator like the odd-even state of the digits after the millionth digit of pi. The last I heard, although the digits of pi form an unending sequence, every digit will 'always' be the same from one repetition to the next. Otherwise, there is no meaning to pi. And, therefore, reference to this pseudo-random process delivers precisely nothing useful. Using the time of a 'radioactive decay' might work better. 2) Creating entangled particles. Why do I have to 'create' them? Aren't they just entangled? Can't you find entangled particles in completely natural processes? If not, does entanglement 'actually' exist or is it a construct? I do not know enough about photon 'splitters' and 'down-converters' to know exactly what processes are going on here. And many of these arguments do a lot of 'hand-waving' at the creation of these pairs. And, if they are created, just how exactly are they transported to a locale where they can't have been affected by slower-than-light processes?
In short, I want to know 'details', as I suspect the devil is in the details.
Awesome, just the two people I have wanted to see debate ❤
48:42: Changing the millionth digit would be one part in 10^million; changing the sixth digit would be one part in a million.
Great episode Curt.
I'm so PUMPED for this. Superdeterminism is the juiciest, most philosophically interesting approach to quantum mechanics... at least that's my opinion, as someone with zero knowledge of math or physics.
It would be if it actually had any experimental teeth to it, but it doesn't. People can't just wishcast hidden variables that they can't define and can't falsify into reality in order to hold onto their materialistic prejudices.
I'm sorry but that's a lot of things, but it isn't science.
Let me take a crack at explaining the confusion about which variables Tim P. and Tim M. were each talking about when discussing free variables. Tim P. was referring to (freely) changing a variable that is part of, and thus very relevant to, the experiment, e.g., changing the millionth digit of x in the x, y, z whose trajectory the very experiment is tracking. In other words, Tim P. is talking precisely about variables that are generally and obviously considered relevant to the experiment. Whereas Tim M. was talking precisely about variables whose effect the experiment is free of, e.g., the position of planets around a star a billion light years away, which we intuitively agree is not relevant to an experiment on electron singlets being performed locally. What Tim M. is saying is that you can let the planets around stars a billion light years away take any positions, set any values for them freely, and they will not and should not affect your local experiment on electron singlets. That is why they are free variables: the experiment on the electron singlet is free from the values of those (distant/irrelevant) variables. And because they were referring to different meanings of the word "free", the confusion ensued. Tim P.: variables with values that are not constrained in any way, so you can choose any value for them, but which are part of the experiment. Tim M.: variables whose freely chosen values the experiment is free of and unaffected by.
What do you think?
I don’t agree with this comment as Tim Palmer is the one who brought up the passage of Bell, and it’s Bell who gave the example of the moons of Jupiter and people walking. The passage Palmer was bringing up is also where Bell says “the millionth digit is not useful for any other purpose”. But along with Maudlin, I don’t understand what exactly Palmer was getting at. Apparently you need to understand his theory of the universe to make sense of his interpretation of Bell’s statistical independence assumption.
@@timjohnson3913 I am generally on Tim M.'s side as well for the reasons you mention. I was just trying to clarify why there was confusion.
A few conversations I would love to see and that could be quite fascinating : Bernardo Kastrup with Joscha Bach, Sam Harris and Sean Carroll. Not necessarily all at the same time though 😂
I wanna see Bernardo's idealist thought really clash against other thinkers with an entirely different metaphysics, since that feels somewhat rare. His first conversation with John Vervaeke was intense at times, but he didn't really struggle too much. In his conversations with Graham Oppy (on another channel) and Susan Blackmore, he wasn't challenged nearly enough. I want to see his views really tested.
p.s. perhaps Joscha Bach and Sam Harris wouldn't clash that much with his metaphysical views, not sure.
52:48: The problem I have here with Dr. Palmer's position is that nothing in our theories gives us any reason to believe that certain other instrument settings would have been disallowed. In his three-variable example, what would have been the *mechanism* that restricts the ability to change variable A without also changing B and C? It seems that superdeterminism skirts the Bell argument by contending that certain instrument settings simply could not have happened. I think we need a WHY for that before we can take it seriously.
The focus seems to land too frequently on Bell rather than the actual experiments. If we assume quantum measurement is simply probabilistic around a real underlying quantity, then for entangled particles you can't explain them always giving orthogonal results when measured the same way; and if you then believe them to possess predetermined but correlated results for all measurement orientations, you can't explain the actual experimental results, which are consistent with either experiment's result having been "true" and the other then probabilistically determinable from that result. This gets called a collapse, as if the entangled particles were one particle and the one measurement "decides" the probability of what the measurement of the other would be.
Quantum measurements are NOT probabilistic. They are independent. That's a huge difference that most people simply don't understand. There are also very important cases of quantum mechanical systems which do NOT satisfy independence in the first place and for which the standard prescription of the theory is not sufficient. That doesn't mean we can't understand these systems, but it means that we have to understand how the standard theory has to be modified to be applicable. A well known example is the Mott problem (although it might fit into a density matrix scheme, but I would have to think about that for a while).
Tim Maudlin (TM) needs to read Tim Palmer's (TP's) paper on this idea, because TM was missing TP's point. Let me explain. Look at Alice and Bob making two measurements each of some Bell state (the example used in the video). Statistical independence (SI) says the four measurement sub-ensembles (Alice 1, Bob 1)(Alice 1, Bob 2)(Alice 2, Bob 1)(Alice 2, Bob 2) are equally populated by the hidden variables responsible for the outcomes. According to local causality and SI, you can ask what would have happened if a given trial in (Alice 1, Bob 1), say, had been (Alice 1, Bob 2) instead. The answer is "Whatever the locally causal mechanism says." That trial would have simply been a trial in (Alice 1, Bob 2) instead of (Alice 1, Bob 1). So what? That's what TM said, counterfactuals just don't matter. What TP is proposing is a way to violate SI such that counterfactuals do matter. It's possible in TP's proposed theory that when you pick a trial in (Alice 1, Bob 1) and ask "What would have happened in that trial had Bob chosen measurement 2 instead?" the answer is "The theory says that counterfactual is impossible in that trial." That's how TP's proposed theory violates SI and attempts to account for the violation of Bell inequalities.
As a side note, no one pointed out that you can explain the violations of Bell inequalities without violating locality or SI. In fact, I think TM said you have to violate one or the other at minimum to account for the violation of Bell's inequality. The axiomatic reconstruction of quantum mechanics via information-theoretic principles renders quantum mechanics a principle theory based on the observer-independence of Planck's constant h between inertial reference frames related by spatial rotations and translations. And that can be justified by the relativity principle just like the the observer-independence of the speed of light c between inertial reference frames related by boosts (which gives you the principle theory of special relativity). Accordingly, quantum mechanics is as complete as possible and does not violate locality, statistical independence, intersubjective agreement, or the uniqueness of experimental outcomes. You can read all about that in "Einstein's Entanglement: Bell Inequalities, Relativity, and the Qubit" forthcoming with Oxford UP.
Isn't it possible that a type of lensing could be useful to demonstrate quantum mechanics?
That lottery ball example is perfect.
@1:15:00 this was a really interesting point in the discussion. TM seems right that they were slightly at cross purposes. However, TP was right (I think) that Bell's theorem is not really statistical; it is about *_entanglement,_* which is *_not_* a statistical property. TM seems not to grok this. The way you _test_ for entanglement is the Bell experiment, and you use Bell's theorem for this! That's the whole point.
A favourable outcome violating *_either_* locality or Kolmogorov statistics with independence due to (non-)causality between A and B (violating Bell's inequality) is a success for QM, but the experiment (and Bell's mathematical theorem) tells you *_nothing_* about which assumption of classical mechanics was the one violated. You need a model for that, and a different experiment to test your model against other models that have the other assumption false.
It seems to me that the discussion is made more difficult without a rigorous statement about causality. If statistical inference requires a surrender to probability, then it seems reasonable that "threads of causality" are required to link the seemingly disparate incentives (NOTE: I DID NOT State that Entropy doesn't exist, nor do I unnecessarily anthropomorphize the conditions with the term "incentives"; merely that it can be resisted by an action - an exchange of energy for a difference of result which APPEARS to be a modification of Entropy), maybe this is akin to Imaginary Numbers vs. Real Numbers.
I just don't get it. Does this superdeterminism really assume that decaying nuclei in 2 different random number generators intentionally decay exactly the way to simulate the non-locality of the Bob and Alice experiment? Is it really that simple and stupid?
I understand that Maudlin is talking about a mathematical theorem, and Palmer about whether the assumptions of that mathematical theorem apply to any physical model; his conclusion is that they do not, and therefore, although the mathematical theorem is correct, we cannot use it to rule out hidden variables, as there could be hidden infinite correlations that can, in practice, be considered hidden variables. Makes sense?
Great stuff! OT how about Carlo Rovelli and Lee Smolin on "time" for more fireworks?
Just an amazing talk. I think Tim Palmer has found a possible solution to a very complex problem. All the best.
@1:20:00 I have to agree with Palmer here. TM is talking about the experiment, which tests Bell's theorem. Statistics has nothing to do with the underlying cause of the entanglement the theorem was created to test for! The Bell experiment does not test or inspect entanglement; it only detects a result of entanglement, and to do so uses an ensemble of similarly prepared particles (Bell pairs). It's frustrating for TP because TM basically admits this; he points out it's like a randomized controlled trial. However, Bell's theorem suggests that as a way to test the inequality, not to derive the inequality.
One derives the inequality from QM, not from statistics. It holds for a single Bell pair. It tells you the probabilities. The probabilities (in QM) are not the statistics. The statistics can only come from a whole lot of identically prepared systems, which then test the computed probabilities. Assuming statistical independence is the experimental condition, not a condition of Bell's theorem. Bell's theorem assumes locality and causality (some also say "objective reality" or objective properties, "not magical dice throwing by the Devil").
In reality one can't assume statistical independence because real physical systems are not required to produce statistically independent outcomes. Statistical independence is just an assumption of the ensemble theory. Is it a necessary assumption? No. If we give up on it we simply don't end up with standard quantum mechanics. So what? So nothing.
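The "probabilities for a single Bell pair" this thread keeps referring to can be written down directly. A sketch (the standard textbook correlation function for a maximally entangled pair, with a sign convention where identical settings give +1, and the usual CHSH angle choices; none of this is from the video itself): quantum mechanics predicts |S| = 2√2 ≈ 2.83, above the classical bound of 2.

```python
import math

def corr(a, b):
    # QM correlation for a maximally entangled pair measured at angles a and b
    # (sign convention: identical settings give correlation +1).
    return math.cos(a - b)

# Standard CHSH measurement settings, in radians
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, -math.pi / 4

S = corr(a1, b1) + corr(a1, b2) + corr(a2, b1) - corr(a2, b2)
# S equals 2*sqrt(2), the Tsirelson bound, exceeding the classical limit of 2
```

Note the point made above: these are per-pair probabilities derived from the theory; an ensemble of runs only enters when you go to *check* them against measured frequencies.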
0 chaotic, 1 structure, 3 function of the median. 4 black holes and information in a potentiality. Black holes dilate, sometimes information in, sometimes information out. In a galaxy it's information out with multiple portholes in. The question is the source of this function. We have Pi, fractals and Phi.
@32:00 this one I am less sure about, but while ρ is a statistical factor, it's not just about frequencies, because how QM gives you ρ matters: it comes from non-Kolmogorov probability amplitudes (q.v. Rafael Sorkin et al.). In QM you should never lapse into talking or thinking about classical Kolmogorov probabilities. And *_one way_* of not so lapsing (not the only way, I suppose) is to talk about counterfactuals. Another is to go in the Huw Price direction and talk about advanced causation, which also means you cannot have Kolmogorov probabilities (although Price often seems confusing, I think that's what he'd say).
Quantum mechanics is simply another solution of Kolmogorov. Not sure where the problem is supposed to be here. Counterfactuals are intellectual nonsense. They don't even satisfy the definition of science.
Interesting topic. Before I listen to the presentation I want to mention what Stephen Hawking wrote: that quantum mechanics may be only waves without uncertainty. I think that's correct! Edit: Maybe I should provide the source: "But maybe that is our mistake: maybe there are no particle positions and velocities, but only waves. It is just that we try to fit the waves to our preconceived ideas of positions and velocities. The resulting mismatch is the cause of the apparent unpredictability." - Stephen Hawking, A Brief History of Time, chapter 12
Then it all circles back to Schrödinger's cat - at some point, you will need to collapse the wave function; cats are dead or alive. QM is incomplete, GR is wrong - that's the current state of physics. It's still suffering PTSD after Bell exposed the stupidity and laziness of all the QM creators.
@@syzygyman7367 I heard that the pilot wave interpretation of QM doesn't need a collapse of the wave function. Could have been something like that Hawking was writing about, but I'm a complete amateur when it comes to this. The idea of an attractor as was mentioned in the video made me think that maybe QM has a "complex attractor" instead of the strange attractors in chaos theory, meaning the QM attractor could be extraordinary complex and guide the whole universal wavefunction.
@@Anders01 To be honest, I have absolutely no understanding of why he talks about chaos theory at all. He sounds like some activist saying words he doesn't understand himself. I checked his bio to make sure that he has an education in physics at all. I have this education, even though I'm not working as a scientist now; I'm basically like Curt. My current hypothesis is that these 2 worlds, our big one (even though QM effects work even on an astronomical scale) and the QM one, might really be 2 independent layers of reality, and might not be the only ones existing, and a unifying theory might be fundamentally impossible; the very question might be incorrect. If you don't have a professional understanding of physics but want to understand it anyway, read Penrose. In his more or less popular books, he explains a lot not only about the problem of consciousness but about physics itself; you can start with Emperor's New Mind if you haven't read it yet lol)
@@syzygyman7367 I think the problem with QM is mainly Einstein's relativity. That QM is essentially correct and that SR and GR are fundamentally wrong. Because the Lorentz transformation is derived through a mathematical division-by-zero fallacy it seems. I heard about a solution using hyperbolic cosine but that sounds like more of a cop out trick to me than actually solving the problem.
@@Anders01 QM has much bigger problems (or miracles) - backward-in-time causality (shown in the delayed-choice experiment) and the possibility of receiving information from possible futures (the bomb-testing experiment). It's against all the logic we had, let alone physics. Special relativity is kinda in Maxwell's equations already: if there's no ether, there must be some time-space transformation. The mathematician Gauss had the idea that real space can be curved back in the 1800s; he even measured the angles of big triangles between hills - didn't find violations but thought that they could indeed be real. The relativities were logical and comprehensible. QM is so unnatural for our understanding that it's ugly and scary for many people. Scientists were and are afraid of it. It took 30 years for Bell's theorem to be discovered, 50 years for Penrose to start the investigation of consciousness from a quantum perspective. We are probably only at the beginning of discovering the nature of real reality.
unfortunate tangent there, but at the end they began to touch on what I think is both the importance and the confusion of Bell's Theorem: We don't know what's going on at the small scale, as in we don't know the mathematical or physical mechanism, but we know that the macro scale has strange/unexpected results that we can only infer from statistical measurements, i.e. we can only see evidence on a large macro scale in statistics. But that doesn't mean that any two entangled particles are doing anything non-real or non-local (or non-deterministic), but that the mechanism prevents us from seeing it by only looking at a macro scale. I interpret the Bell test much more simply: what is it that 2 particles can do to produce this result, that cannot be achieved by 3 or more particles?
You can achieve this with as many entangled sets of particles as you like (see the GHZ experiments).
Palmer talks at 49:00 about how carry rules work (or rather don't work) in decimal arithmetic of so-called "real numbers".
In binary, there's no telling whether "real numbers" 0.000... and 0.000... add up to 1, 0, or something else, as we don't have any computable algorithm to tell us whether the strings contain any ones, and if so, how many.
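The carry problem can be made concrete with finite prefixes: from any finite prefix of two digit streams you can only bound their sum, and when that interval straddles a carry boundary the leading digit is undecided. A toy sketch in decimal (the function and its name are illustrative, not from the discussion):

```python
def sum_bounds(p, q, ndigits):
    """Interval guaranteed to contain x + y, where x and y are reals whose
    decimal expansions after '0.' start with the digit strings p and q."""
    lo = (int(p) + int(q)) / 10 ** ndigits
    hi = lo + 2 / 10 ** ndigits   # each unseen tail contributes at most one unit in the last place
    return lo, hi

# Prefixes of 0.4999... and 0.5000...: is the sum's integer part 0 or 1?
lo, hi = sum_bounds("4999", "5000", 4)
print(lo, hi)   # interval straddles 1.0, so the leading digit is still undecided
```

No matter how many digits you read, prefixes like these keep the interval straddling 1.0, which is the point about carries and non-computability.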
Can you run an experiment where the wave function of the elements being tested are independent from the wave function of the environment the testing is being performed in?
You can't run any experiments with wave functions. Wave functions are mathematical abstracts. They don't exist in nature.
I think what is missing in Palmer’s argument is not the fact that we can get statistical independence from chaotic systems but that statistical independence is not enough to explain the correlations observed in the EPR experiments. We observe clear cut strong correlations violating Bell’s inequalities. Maudlin should have focused on asking Palmer how to recover those correlations using a chaotic system (I guess he can’t 😁)
It seems the discussion is also between a Frequentist Statistics interpretation and a Bayesian Statistics interpretation
The misunderstanding, I think, occurs because quantum mechanics is traditionally formulated using a linear mathematical framework. General relativity uses a non-linear mathematical framework. You have some people trying to generate GR from QM.
The basis of non-linearity is feedback. What is feedback?
We look at a black-box system, a box containing a hidden internal mechanical system. There are inputs into this system which trigger changes inside the black box, causing an output effect. The inputs affect the system, leading to perturbation, leading to output effects. The system can be simple (few components) or complicated (many components), but the component number isn't necessarily an indication of whether a system behaves linearly or non-linearly. I say that because I wish to draw an early distinction between complicated (many components) and complex (many interactions).
In a linear system, the outputs do not relate back into the system as inputs. In a non-linear system, the outputs of the system interact with the system to either restrain it (negative feedback) or amplify it (positive feedback).
In complex systems, interactions between components in a system lead to higher-order structures. Consider DNA: Deoxyribo-Nucleic Acid. Its basic components are a phosphate and sugar backbone with nitrogenous bases connected off the sugar molecules. With two strands of phosphate/sugar lined up, the bases interact, essentially creating a flexible ladder structure. Information is encoded in the sequence of bases as you read down the ladder. Each 3 bases combine to be read as a codon. Specific sequences of codons translate into a string of amino acids unique to a particular protein. Now, each chromosome contains a massive amount of DNA. How do you fit all that information as DNA into a relatively small volume? The answer is that you need a way to physically compress DNA. In that respect, DNA is like wires in a hifi system: you want to utilise devices to organise the wires so they are discretely packaged and don't appear unsightly or, worse, trip you up. The DNA gets packaged similarly to how thread is wound onto spools in a sewing box. Proteins called histones are the cell's spools. You can wrap the thread of DNA around a histone protein. This reduces the amount of space taken up. You bring another histone 'spool' and wrap the DNA around that too. Continuing in that vein, you end up with a string made of a sequence of histone spools wrapped with DNA. If you wish to compress the information further, you coil a string of histones by twisting. A series of coilings enables higher compression. The geometry generated by this system means that DNA encoding protein A can be a long distance away from DNA encoding protein Z, yet due to the compression into a histone rope structure, the two sections of DNA can be physically very close. What controls the wrapping or unwinding of DNA around histones turns out to be chemical modification of the amino-acid components of histones.
Enzyme machinery can crawl along the histone rope, modifying histones to expose DNA so sections of it can be photocopied and exported as blueprints used to produce proteins at factories embedded in the outer wall of the nucleus.
This is a cellular example of how simple components combine to create complicated structures that utilise complex interactions to compress information. The geometry of the chromosomes is highly convoluted such that connections can occur between distant sections of DNA. You could call these non-local sections brought into close proximity by the geometry of the storage medium.
Similarly, fractal geometry compresses information.
Non-linear systems have different sorts of attractors. An attractor is a region of state space that influences (attracts) elements in a dynamic system. Some attractors pull objects in state space in like magnetic attraction, leaving them unable to escape. Other attractors cause objects to orbit in a cyclical fashion. Chaotic attractors are special because the object tends to move in a non-similar path, never exactly repeating its previous path. Yet the broad patterns of movement are predictable even though exact trajectories aren't calculable! The geometry needed to achieve chaotic attractors is not euclidean.
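These attractor types can all be seen in the logistic map x → r·x·(1-x), which is a standard toy example rather than anything specific from the video: low r gives a point attractor, intermediate r a cycle, and r = 4 a chaotic orbit that stays bounded but never repeats.

```python
def orbit(r, x0=0.2, warmup=1000, n=8):
    """Iterate the logistic map x -> r*x*(1-x) past transients,
    then return n points lying on (or near) the attractor."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    points = []
    for _ in range(n):
        x = r * x * (1 - x)
        points.append(round(x, 6))
    return points

print(orbit(2.8))   # point attractor: every entry ≈ 0.642857
print(orbit(3.2))   # cyclic attractor: alternates ≈ 0.513045 / 0.799455
print(orbit(4.0))   # chaotic attractor: bounded but never repeating
```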
In general, Non-linear systems are associated with non-euclidean geometry and linear systems can be more easily described using linear geometry.
If the reality of activity at the quantum scale were to be complex, or were to result in complexity, then bridging the gap between the quantum and non-quantum realms is likely to require a non-linear formulation of QM.
Regardless of whether the scientist/mathematician is an experimentalist or a mostly theoretical researcher, developing a non-linear formulation of QM that can link to GR will require some people who are intuitively able to understand the non-euclidean geometry of the quantum realm, most likely arising out of expert mathematical understanding of geometry and topology.
This effort will require a huge research effort probably spanning decades, utilising quantum computing, GAI and experts in the mathematics of non-linear systems theory.
I think that where issues of perceptual disagreement are bound to occur is between experts whose backgrounds are underpinned mathematically by linear vs non-linear systems thinking. Yet linear thinking is so much easier, both conceptually and mathematically, that it will take the theoretical physics community quite a while to transfer as a group to studying non-linear physics. Complexity is harder to experimentally model, harder to mathematically describe, and harder to generate useful long-term predictions from, because the number of degrees of freedom you need to model, and their even greater interactivity, will severely strain the best computers and minds.
That said, thinking which aligns with the 1970s quantum tradition has so far failed to evolve to explain GR. At some point, continued failure will force greater interest in nonlinearity within mainstream physics.
My suggestion: make a million (or four) instances of this discussion so MAYBE I will understand the dispute :)
The words Tim Palmer says in this discussion have absolutely zero sense and relevance.
@@syzygyman7367 Please explain it to me like I'm a 26.7 billion year old
@@notanemoprog QM states that the unity of a system described with 1 wave function trumps time and space, Bell's theorem shows the way to check it, and it's been checked - reality indeed works on different principles of causality than regular human logic. The future can interfere with the past, the present can get answers from all the possible futures. Read about the delayed-choice experiment and the quantum bomb-testing experiment. Most scientists just can't comprehend all of that; they want to save their comfy worldview, it becomes personal, some kind of uber-materialistic religion.
@1:25:00 - @1:28:00 here is where I'd tentatively critique TP. To me it's about entanglement, not measurement instrument setting independence. Entanglement is outside the light cone. The Alice and Bob measurements are outside each other's light cone, but that's ok. TP does not need to resort to violation of statistical independence of the instrument stuff for this. The clear way out (irony drips here) is that both Minkowski causality *_and_* non-locality can be avoided. How so? If the entanglement is in fact locally mediated by, say, an ER=EPR wormhole, then it only appears non-local to someone assuming Minkowski spacetime. People never mention this. The "fourth" assumption of Bell's famous result is a Minkowski spacetime. Why do we have to assume SR holds down to the Planck scale? To my mind _this_ is the false assumption, and relaxing it (as any old quantum gravity or foamy spacetime model would) gives you a proper ontology behind Bell's Theorem. Experimental statistical independence, normal causality and locality are all still true. Minkowski spacetime is not true (at the Planck scale).
At this date you can even choose your ontology, it's wide open for speculation. Just do not choose Minkowski spacetime at the Planck scale, but you'd better recover causality and locality akin to Minkowski at the macroscopic scale or your model ontology is wrong.
Get Lenny Susskind on Curt!!! He'll tell you, like I am, Bell's theorem is just general relativity.
Palmer was insisting that if Alice changes a detector's angle by a millionth of an arcsecond, the result of the experiment may be different. Palmer was insisting that changing one gene in a person may result in changing his blood type. Yes, that's true, but the chances of that are minuscule. And Maudlin emphasized that no meaningful predictive theory could come out of such chaotic dependence on minuscule changes, and in the end it is irrelevant to Bell's theorem results.
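The sensitivity both sides are trading on is easy to exhibit with a standard chaotic toy system (the logistic map here is just an illustrative stand-in, not a model of the Bell experiment): two orbits started a mere 1e-12 apart diverge to order one within a few dozen iterations.

```python
def divergence_time(x0, delta=1e-12, r=4.0, tol=0.1):
    """First iteration at which two chaotic logistic-map orbits,
    started only delta apart, come to differ by more than tol."""
    x, y = x0, x0 + delta
    for step in range(1, 10_000):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        if abs(x - y) > tol:
            return step
    return None

print(divergence_time(0.3))   # order-one divergence after only a few dozen steps
```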
Right, if you take Palmer's assertion to its logical conclusion, how on earth could anyone ever demonstrate an interference pattern with a laser in a double slit experiment, for example? Does he really think that there is some way to make a minor adjustment to the equipment so that the outcome of the particle distribution is not what it always is?
What do counterfactuals have to do with researching a warp theory?
I am not educated in math or physics, just intensely curious, so please nobody make fun of me for not being smart enough for this podcast, but are they arguing about whether quantum physics always has to have uncertainty? The millionth digit can be both odd and even, so if they don't take this position about Bell, it essentially says all of quantum physics is wrong. Personally, I think quantum physics is very wrong about something or is not incorporating some truth about the universe we just don't know or understand. Peace from CT
QM is incomplete, GR is wrong - that's the current state of physics. Bell's theorem shows faster-than-light communication inside of a QM system, and that it is possible to experimentally check it. The whole buzz is about all the great physicists not noticing it for 30 years. It's a shame and embarrassment academia still dwells in.
@@syzygyman7367physics advances funeral by funeral …. I do wonder the ratio of younger physicists who understand bell vs older physicists who understand bell. There’s some sort of strongly held magical QM belief that some in the older generation can’t break free of
@@timjohnson3913 QM is real and it's crazy. This faster-than-light communication thing that is the subject of Bell's theorem is the least horrifying miracle of it. The delayed-choice experiment shows back-in-time causation, and the bomb-testing technology shows communication from a possible future - that is the crazy reality of this world. You can't hide from it. Most of the physicists hid from it in the 20th century, and there are basically no real physicists now.
Not sure I understand Palmer's point. But if I do, he is saying that in some model of the universe, Bob and Alice will fail to make random measurements. However hard they try, their pseudo-random generators will be correlated: if they use random generators based on the arrival time of cosmic rays from distant galaxies, they will be correlated in a way that forces A and B to do precisely the measurements that will lead to observing a violation. So, distant galaxies conspire to trick A's and B's minds into believing that they are doing the experiment randomly, in such a way that the analysis of the results always shows the same violation that looks like non-locality. At least, nothing as crazy as non-locality! Please explain to me where I am wrong.
I just don't see any sense in this superdeterminism thing. I just see an emperor without clothes. Does it really say that the whole universe is programmed for any random number generator to be correlated with any other?
palmer's main point is that he is challenging counterfactual definiteness, i.e. the idea of 'alternative timelines which could have been'. so he is arguing for determinism, fundamentally.
bell test results never "look like nonlocality"; rather, they show that at least one of the assumptions must be wrong. there are three assumptions in bell's theorem, and locality is only one of them (although in this video, they said that there are only 2 assumptions, which confuses me). some people are convinced that locality is false, but that could be wrong. there is no contradiction in the idea of certain people being determined to be wrong.
lastly, the conspiracy objection is not compelling. life is a conspiracy to start with.
@@maxwelldillon4805 How is it different from my caricature?
@@maxwelldillon4805 You're so wrong. The Bell's shows that non-locality, faster than light communications - 1. Exists inside the theory itself, and hadn't been noticed by its very authors for 30 years, till an outsider Bell came and took a look at their creature. 2. Can be tested experimentally, which has been done, showing that both special and general relativity are simply wrong, This shame of stupidity of Einstein, Bohr, Heisenberg etc. caused a harsh PTSD the current academics still live in.
@@syzygyman7367 qm is nonlocal because of the instantaneous wave collapse, not because of bell's theorem. this is considered a problem by many.
nonlocality hasn't been experimentally proven.
What does “free” mean?
Curt, the issue is fundamentally very simple. Dr. Bell used scalar algebra. Scalar algebra isn’t closed over 3D rotation. Algebras that aren’t closed have singularities. Non-closed algebras having singularities are isomorphic to partial functions. Partial functions yield logical inconsistency via the Curry-Howard Isomorphism. So you cannot use a non-closed algebra in a proof, which Dr. Bell unfortunately did. … This is a sufficient disproof of Bell’s theorem.
That was funny. Total nonsense, too. ;-)
@@lepidoptera9337 just because you don't understand it doesn't mean it's nonsense :)
@@FunkyDexter I do understand it. That's how I know that it's total nonsense.
@@lepidoptera9337 still refusing to elaborate I see. Since nonsense=illogical, tell me what you find illogical about my statement, and I'll promptly explain to you. And don't tell me "everything", because that would just mean you haven't understood it at all.
@@FunkyDexter Why don't you explain it to me in great length and detail now? ;-)
37:30 - I'm really eager to hear these details, but I'll tell you that at the moment my sense is that if we don't have statistical independence we don't have science. It just seems like if we endow the universe with this much ability to "sneak around" on us it's hard to see how we can really trust any conclusion we come to.
If I understand it correctly, Maudlin is arguing that Bell's inequalities only apply for large enough ensembles and break down for single-measurement scenarios (i.e. they say nothing about them). But then does that mean that locality is only violated statistically as well? What does that mean?
Statistical independence is an assumption of bell’s theorem, which is what 90% of the video is about. If the assumption holds true, which Maudlin argues for, then bell’s theorem shows that the universe is non local
@@timjohnson3913 Not the universe though. For example, the forces in the standard model will still act locally. I assume you mean that the universe is shown to be non-local on the most fundamental level, which you take to be some indeterministic quantum theory. But Maudlin also argues that Bell's theorem only applies to large ensembles and cannot be interpreted in individual experiments, right? My point is that then you also can't use it to argue that locality is violated in individual cases; it is only violated statistically, whatever that means.
@@bazsawow The universe has a property of nonlocality; this is what Bell proved. It does not require that every interaction be nonlocal for the Universe to have the property of nonlocality. Back to the main point, if you can show statistically that an ensemble of measurements require nonlocality, then individual cases must be nonlocal. You can’t get ensemble nonlocality out of individual local interactions. So while you can never point to 1 specific run of an experiment and declare that it is nonlocal, you can look at multiple runs of the experiment and statistically prove there must be nonlocality. This is what Bell did.
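The statistical shape of that argument can be sketched in simulation. Below is a toy local deterministic model (the sign-of-cosine response function is purely an illustrative choice): each outcome depends only on its local setting and a shared variable λ drawn independently of the settings. Such models keep the CHSH value at or below 2, while the quantum singlet prediction is 2√2.

```python
import math
import random

def outcome(setting, lam):
    """Deterministic local response: ±1 from the local setting and shared variable only."""
    return 1 if math.cos(setting - lam) >= 0 else -1

def E(a, b, trials=200_000, seed=0):
    """Correlation for the toy local model, lambda drawn independently of the settings."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        lam = rng.uniform(0, 2 * math.pi)   # shared hidden variable
        total += outcome(a, lam) * outcome(b, lam)
    return total / trials

a1, a2, b1, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # ≈ 2 up to sampling noise; quantum mechanics predicts 2*sqrt(2) ≈ 2.83
```

The gap between the simulated ≈2 and the observed ≈2.83 only shows up in the ensemble averages, which is exactly the "statistical" character being debated here.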
@@timjohnson3913 There's a technique for proving Bell's theorem in only 1 experiment with only 1 pair of entangled objects - in Penrose's Shadows of the Mind from 1994. It's sad that people arguing about the subject don't know the works of the greatest living contributor to it.
@@timjohnson3913 I agree with everything you've said. I have a question about the definition of "locality", though:
Is it necessarily the case that Bell required "nonlocality" instead of "faster than light causal influence", or are they two ways of saying the same thing?
Intuitively, it seems to me that a physical process that appeared to be instantaneous could turn out to be merely much faster than we were capable of measuring (e.g., C to the power of C). In that case, we would not abandon locality altogether, but just redefine it. Perhaps the notion that _any_ faster-than-light causal influence constitutes a violation of locality is due to a definition of locality tied to the understanding of the causal structure of spacetime found in relativity.
Brilliant discussion, but I can say with confidence that the QE effect is physically non-local for light but 100% local and causally superluminally connected by an unknown 5th extra-dimensional superluminal force not yet discovered by humans.
Nothing is really a wave of probability; everything is determined, because we live in a Uni-verse and so we cannot see the Multi-verse that is that wave of probability. At the same time, we cannot make a perfect prediction of what is going on in the quantum world because of the Uncertainty Principle. Even if we want to make predictions of the future with great accuracy, it is impossible to "see" it even with greatly intelligent AI, because the nature of things is unpredictable at the core
You can measure what you like and in whatever way you want, but your choices will then affect the result of the measurement. That is super-determinism.
Not really. That's standard quantum mechanics. Super-determinism tells you that the universe is lying to you about your choices because it has already decided for you. ;-)
@@lepidoptera9337 I would like you to argue that to qm theorists
@@johnbihlnielsen3578 Nobody who is serious about quantum mechanics considers nonsense like superdeterminism. We know why the universe LOOKS superdeterministic. It follows from special relativity. The energy that we are measuring in a quantum experiment comes at us at the speed of light (consider the case of a gamma ray detected by a Geiger-Mueller counter). Since nothing is faster than light, that detection event is unpredictable (we can't get advance warning, because that would require a superluminal signal, which doesn't exist). So then we play this game backwards to the source of the gamma ray and ask what "triggered" the release of that ray to begin with. That gives us another physical event that also happened at the speed of light, and so on. If we play this again and again and again, the only identifiable "source" for this causality chain is the beginning of the universe. In other words, an attempt to classify quantum mechanics by demanding causal chains makes it look as if everything, including the decision to perform a certain measurement, had to be initiated by the initial conditions of the universe.
That's a completely useless piece of physics intuition. It explains no more than the usual religious "god did it" claim, except that the superdeterminist replaces "god" with "initial conditions of the universe". So every click in our Geiger-Mueller counter was caused by the "initial conditions of the universe". Great. What are we supposed to learn from that?
factors of 1 are not self-similar when the principal 1 is rendered in 3 parts within the whole 1. Time and space dilate the number 1 plus or minus at a singularity.
Imagine if the AI was cellular automata, a self-organising principle within an opposite, 0 and 1 oscillating around in a field of potentiality; suddenly repulsion, suddenly space, suddenly distance, suddenly time. Suddenly a geometry separating the 2 warring parties on a field of singularity. Singularity being where both combine and repulse in a mass of energy. Each fighting to maintain its reality. 1 conserving its energy, the other using that same energy to escape.
Thanks!
Thank you so much!
All we see is a product of the evolution of our reality within our discrete package in a continuous chain. Counterfactualisms are the opposite end of a scale where both factual and counterfactual are the same in a singularity of potential information from an origin outside of the singularity.
Quantum phenomena tame chaotic non-linear dynamics even at the semi-classical scale. True even for Bohm theory, where initial conditions do matter.
Can't wait to hear Tim Maudlin being certain about everything he says about his God, Bell.
You have to give Maudlin credit for carefully listening to what Palmer was saying. He focused enough to catch Palmer's errors live ("I don't understand what this means", "I don't know what the probability of ρ means", etc.), and he was able to come up with live counter-arguments to what Palmer said.
Buddha says, "only the one-pointed succeed." Tim Maudlin needs to concede that e2 is an explorer and should investigate the possible contribution of counterfactuals. The whole thing is ultimately going to be very close to chaos, in the sense that it emerges via a fluke. That is, from our point of view. If we could get down there, where everything comes from, it would be so natural and there would be no other way.