I like to read the blog "Not Even Wrong" by Peter Woit. The day after I gave this lecture, Woit posted the following -- www.math.columbia.edu/~woit/wordpress/?p=10522 -- about a recent book called "Beyond Weird" by Philip Ball; Woit called it the "best popular survey I've seen of the contemporary state of discussions about the 'interpretation' of quantum mechanics". I think both Ball and Woit are skeptical of the Many Worlds Interpretation -- which is fine :) It is a minority viewpoint, after all. Anyway, I personally am interested in checking this book out. I looked at its foreword and it already has some interesting stuff. For example, the very start is the Feynman quotation I mentioned on Tuesday: "I think I can safely say that nobody understands quantum mechanics." But then, just as I did, Ball goes on to emphasize that Feynman was referring here to the "meaning" of quantum mechanics for "reality"; and that by way of contrast, the mathematics of how to *do* quantum mechanics (and thus quantum computing) is perfectly clear, fine, and uncontroversial. If any of you do read the book, let me know your thoughts!
Ball is a popular writer, and he's not familiar with the range of quantum interpretations or theories (e.g. the Växjö school). Yale's Peter W. Morgan wrote a great paper titled "The Straw Man of Quantum Physics", which serves as a reminder to computer scientists of field theory, with which many of them have little acquaintance. For instance, supercorrelation gives a completely non-weird explanation for Bell violations in terms of a background field. You can get Bell violations with classical light, water waves, Brownian motion, etc.

Furthermore, QM gives the expectations of observables, and this doesn't strictly imply any QC speedups. Besides Gil Kalai's objections based on harmonic analysis of Boolean functions, people are neglecting the INTERNAL decoherence of quantum evolutions. The speed of a real-world reversible computer scales linearly with applied force and entropy. The Heisenberg energy-time bound shows that the energy released per time step is greater than Planck's constant divided by the step time, and to avoid decoherence the total entropy released over a complete computation must be O(Boltzmann's constant). By the Boltzmann law, if the entropy release per step is much greater than Boltzmann's constant, the quantum computer decoheres and noise will be read out. So the runtime of a general quantum computer seems to be lower bounded by (h·S²)/(k·T), where S is the number of steps, h is Planck's constant, k is Boltzmann's constant, and T is the ambient temperature. The runtime bounds on Grover's and Shor's algorithms don't look too impressive under this basic analysis. It seems that the Mandelstam–Tamm and Margolus–Levitin bounds dramatically overestimate the speed of quantum evolution.

John D. Norton has had to embarrassingly explain to CS people why Landauer's principle neglects quantum fluctuations. Furthermore, there is no "global phase", and spin is continuous, not discrete.
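Taking the commenter's claimed bound (h·S²)/(k·T) entirely at face value, here is a quick sketch of what it would imply numerically. The formula is the commenter's claim, not established physics, and the Grover step count below is an illustrative assumption:

```python
# Sketch: evaluate the comment's claimed runtime lower bound (h * S^2) / (k * T).
# The constants are standard CODATA values; the bound itself is the
# commenter's claim, reproduced here only to see its numerical consequences.

h = 6.62607015e-34   # Planck's constant, J*s
k = 1.380649e-23     # Boltzmann's constant, J/K

def claimed_runtime_bound(steps, temp_kelvin):
    """Claimed lower bound on total runtime in seconds: (h * S^2) / (k * T)."""
    return (h * steps**2) / (k * temp_kelvin)

# Illustrative assumption: Grover search over a 64-bit space takes on the
# order of sqrt(2**64) = 2**32 query steps.
grover_steps = 2**32
print(claimed_runtime_bound(grover_steps, 300))  # on the order of 10**6 s (weeks)
```

If the bound held, even a modest 2³² steps at room temperature would take weeks, which is presumably what the comment means by the Grover/Shor runtimes not looking "too impressive" under this analysis.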
A 2019 experiment published in Nature confirmed the continuous, deterministic trajectory theory of subatomic "particles".
Thanks so much! I'll watch through to the end. Could we have some textbook names (just to have more material or perspectives to think about)? I'm not sure I caught one at the start of the lecture.
It's funny just how little is actually said during such college lectures, but props for not going insane when basically providing entertainment for about 60 minutes and giving about 5 minutes of actual practical usage information.
Thank you for this, but it's a bit weird to me that computer science students needed such an introduction. A lot of time could have been saved without this long intro.
This lecture was sick! Definitely going to watch the rest! Great job Ryan!
Too good
yooo man, wsp
fr
Not sure how I got here from studying dynamic programming and algorithms, but I'm not disappointed lol
Oh nang thanks
Thanks. ( :
I think the video quality should be a little better
aawesomeeeeeeeeeeeeeeeeeeeeeeeeeeee