When I was a PhD student in quantum computing we all found it depressing to read articles about how quantum computers will revolutionise business. “Quantum winter” felt inevitably around the corner
Was it worth it doing the PhD in QC? During my Bsc and Msc I have focused on researching quantum algorithms, but now I have to choose between continuing with QC and shifting towards ML/DL for my PhD. Both fields are over-hyped, but ML has seen some really promising developments lately.
Certainly true of Liz Truss' government! 'The Market' looked ... the highly entangled (as in kittens and balls of string) government promptly collapsed.
Sabine is awesome. Love her one on Nuclear Confusion. And in this one: "Some people will lose a lot of money, but that just means they had too much of it to begin with, so I can't say it bothers me too much" 🤣😂
Being part of a quantum research group myself, I really appreciate Sabine's well informed take on this subject. The fact that its seasoned with her dry deadpan german humour, is a bonus.
I do not really have a clue beyond pop science about quantum anything. I do know EECS though and after seeing claims of the capabilities of quantum computers have been trying to tell people for a long time they won't make their GPUs faster. At best, for everyday use, they'll be a coprocessor for rare, very specialised uses. Now I have a video to send people instead of getting dragged into go nowhere argument.
Sabine's dead-pan delivery and her sarcastic and sometimes cynical beat-downs of overblown ideas with her sharp wit never fail to amuse me. In a climate where so many people dare not call out BS for what it is, I adore Sabine for her realness.
As I came to the end of the video, I started to come to a confirmation of the same feeling - "Science without the gobbledygook; also without the BS". Thanks Sabine, for your honesty and clarity.
She's got a decidedly unscientific certainty to the statements she makes. Her opinions are just that and delivering her opinions as obvious facts is deceptive. Quantum computing has enormous potential. It may be expensive and may always be expensive and it may be noisy and this may never change but even with those problems it has enormous potential. If it's at all useful it can do things that seem like total magic right now. This power is known and this knowledge is what causes the seekers to pour so much time, effort, and money in. And Sabine is doing something very sneaky here. She makes a video that contains inflammatory rhetoric about a topic she knows little about and finishes it off by turning into a saleswomen hawking a predatory product that promises it can teach you how to think. And she gets paid for it. Unknown to most, the reason this video exists is to get Sabine paid. The only reason. This is why it is inflammatory. So you'll watch it. It's not inflammatory so it's closer to the truth. The topic isn't chosen because Sabine has strong feelings about it. It's pandering. It's clickbait. Advertising and even payment consistently and inevitably causes the reduction of speech into drama and propaganda. Each RUclipsr that engages in sponsorships falls into this trap and all their channels become hollow shells of their former glory. Honesty gets turned into drama. The depths are made shallow. The complex is made simple. The heartfelt is made insincere. The RUclipsr clearly only shows up because it's their job. The passion fades and becomes jadedness. Lies creep in. Exaggeration bulldozes over subtlety. Truth dwindles. And the fanboys keep cheering. Like lemmings. Their brains rotted. Their faculties stunted. Off the cliff they go. I dare call out the BS for what it is. I despise Sabine for her fakeness. Another video only made so that there's a place to tack an ad onto.
@@xbzq Clueless, Sabine was a leading theoretical physicist and calls it like she sees it. the world is filled with bubbles and this is certainly one. She never said they won't happen, just bagging on the hype.
@@dft1 Sabine IS the hype. But you could never see that. Also, she has very strong ideas about a lot of scientific ideas as if they were settled science. They aren't. She talks a if she has answers that she doesn't have. I don't care about her credentials. She's just a person. And she's often wrong. Like all of us. She's no god. She doesn't understand programming, or quantum algorithms. She doesn't seem to understand quantum computers won't be used for payroll. Neither does any of you comments seem to understand Sabine is just a person. Everyone worships her as if she's done kinda of god. Everyone on earth it seems theses days subscribes to some hero's ideas and worships them. When they inevitably fall, these people fall right along with them because they've tied their identity in with this person.
@@xbzq I share some of your concerns about her aggressive way of stating her opinions (she does seem to present herself as the arbiter of what science is useful or worth spending time and money on), but RUclipsrs need money to survive the same as the rest of us.
Not really. She very clearly has her biases. Especially on anything quantum related or related to ideas like free will or the multiverse that involve both science and philosophy. She is great but she's not the "living embodiment of keeping it real".
If by, keeping it real, you mean leaning further and further toward cynical skepticism, and philosophical contrarianism. I still watch and enjoy her content , but am often left disheartened and hopeless.
I love the balanced and skeptical position you take on all of these issues. It really feels a lot more informative than a video that's clearly trying to push a certain agenda or use clickbait claims to draw in gullible viewers.
While I really enjoyed the video, saying that she is balanced and skeptical is quite funny. It is a massively oriented video. Even more, since it is so obviously oriented toward one conclusion, I don't mind it : it's made very clear that she will present points that go in her direction, and you know that straight up.
I worked at Texas Instruments (TI) during the first Artificial Intelligence (AI) bubble, circa 1985-90 (my estimated dates). The issues described in Sabine's vid, were nearly identical to the marketplace hype and expectation of AI. TI developed and produced computer hardware that directly supported the Lisp programming language. To my knowledge, TI sold very few of these systems. Attractive commercial AI outcomes (algorithms, etc), took decades to develop, and used existing hardware.
Indeed, there were a couple of waves of AI, which mostly consisted in making a new lisp variant and some hardware to make the lisp test programs work faster. Then there was the "expert systems" that were supposed to revolutionize the world and "knowledge experts" were to be trained everywhere.to solve all problems. The fact is that real revolutions are unpredictable by nature.
@@niceguy100000 Yes, also Santa Cruz Operation (SCO), offering Xenix and other Intel/Microsoft/HP based UNIX solutions, all of which Linux killed off. Heady times, fabulous trade show parties, zero profit.
@@wyattonline Yeah, good times. SunOS was next level and properly working unlike some other caveman software. Also the first VR hype (of course totally profitless) was cute.
"Some people will lose a lot of money but that just means they had too much of it to begin with, so I can’t say it bothers me all that much." Nailed it.
Not quite. Some people have more inside information and make money on both the growing and the bursting of the bubble. Other people buy in near the peak and then can't get out fast enough and get burned. The stock market creates nothing, it is a zero-sum game.
There's a suing attributed to P.T. Barnum that I think applies to investors who spray money into stuff without understanding the subject in even the most cursory way: "There's a sucker born every minute."
"Some people will lose a lot of money, but that just means they had too much of it to begin with" 😂 That made me spit my coffee. Now I have to clean it up. Thanks so much, Sabine 😑
Sabine is top of the list in making difficult subjects comprehensible for normal people without dumbing it down too much. Wonderful absence of irritating melodramatic soundscape Excellent work.
@@RandomForestGump I know what you mean... Yet sometimes people need a slap of reality. I am seeing overblown expectations that even my layman mind knows is wrong, because I think some want funding. So the researchers have to play a game of manipulation in order to get it. Sort of like how channels use click bait to get attention, because it works! Especially versus being honest. I'm sure there are a few downright fraudulent companies running amok. Raising expectations so high always has a backlash. I expect it both for AI and Quantum computers. Yet! I hope I am dead wrong. There are a lot of promising theories and technologies in their infancy. I just think timeliness are off. Photonic computing, and eventually quantum photonic computing looks great if we can get it to work as some way it will.
Indeed! Sabine knows how to cut right through the fog! But to be frank, I have an MSc in Physics anyway so I know how to navigate this landscape myself. This is why the general public REALLY needs to get up to speed with scientific literacy! Plenty of crazy crackpot ideas running around and marketing hype that says so much but delivers so little! So as Public Enemy said so forcefully: *DON'T Believe the Hype!!!*
As a computing specialist myself I couldn't disagree with her more. An operation that's used heavily in AI, matrix multiplication, can be done on quantum computers. If a quantum computer is sufficiently scaled up, and these operations were run to train an AI, the amount of training on that AI you could do within a short period of time is ridiculous. This becomes increasingly important as data sets are getting bigger and bigger. I've sat around for hours waiting for a smaller model to train on traditional top end silicon hardware, imagine the bigger models. Quantum computers will revolutionise AI, and that's just one example. Quantum computers aren't going away, they're going to get better and better, and a bitter old lady who's mad that their obscure research isn't getting as much money as quantum computing won't stop it. There's hype but there's also substance there. Of course, it won't ever live up to the hype, no new tech does, but eventually it will exceed it. It's really unscientific to discourage quantum computers by saying it's a dead end.
On that point about QM computers revolutionising AI, I should also point out this: the better models are trending towards consuming more and more data so as time goes on the computing requirements are going to also climb up. QM computers will solve this. Tesla, to train their self driving car models, have created basically a whole factory of servers with specially made chips just for crunching these numbers, now imagine you could do this on one machine that's significantly smaller. And a sufficiently powerful AI can solve all human problems, including our deepest physics problems. Seems like a pretty fucking big deal to me.
@ 2:16 "If you look at them, do they collapse?" Sabine, your comedy if flawless. I watch a lot of science videos on YT. You are the only one capable making esoteric abstract concepts clear, but also funny in the cleverest ways. You _are_ the smartest social media explainer of scientist on the internet. Thank you _so much_ for what you do.
Absofreakinglutely agreed on that one! The next best is Anton Petrov, but he's not a PhD, just (I say "just" but it's definitely not nothing!) a teacher. Still, he has the same flavor of comedy, and he covers lots of different kinds of papers that have been recently published, from astronomy to microbiology and everything in between. But the top winner has, yes, _got_ to be Sabine!
@@UnderscoreZeroLP Because IQ is the _only true_ measure of mentorship, compassion and humanity? I grew up in a library. This is your problem of you. May harmony find you.
@@kaourintintamine1383 And now we have jaw dropping AI applications like DallE, ChatGPT and more. Not sure what we will get from quantum computers. The NSA decrypting all our porn videos?
I am so glad to hear someone else, someone with more credibility than I have, speak out about this. I've been dubious about quantum computing since the mid-90s. I'm not a physicist, but I've been doing software for many decades, and I've also worked with stochastic and probabilistic systems, and most of the time my understanding of the projections has veered off course as soon as the quantum jargon began to dominate the explanation. It always felt like hand-waving. I gave a talk at Interval where someone brought it up, and my reply (at the time) was "Yes, quantum computing will give you the right answer, and it will deliver it faster than conventional computing, but it will also deliver a multitude of wrong answers, so you'll have to go through them all to determine which one was correct." [edit] Even fuzzy computing does better than that. I just went back and noticed the cat on the cover of the Hoofnagle and Garfinkle book, but I couldn't tell if it was alive or dead.
You are becoming part of the few physicists that keep emotional entanglement out of science. It's nice to be optimistic, but much better to be realistic. I appreciate your approach, now more than ever.
@@bentonjackson8698 she'd pronounce "hat" closer to "het" (not nailing either the American, British, or Australian vowel sound langing somewhere in between everything and succeeding - German engineering for you), and also she has quite a cruel euro humour going so pretty sure it was heads
Every new technology goes through this “overhype” phase, like internet itself, or blockchain as the most recent example. This is not a reason to stop doing it. We pretty much HAVE TO go through this phase, to learn their real usefulness.
I don't actually remember overhype of the internet (and I do predate it). I suspect it has exceeded almost all predictions made for it. Do you have examples to support your claim? (Simple, honest question... not trying to start an internet war. :-) ) (And I do agree with your general premise.)
@@peter9477 good question on that internet hype thing I think it fulfilled everything and more I also predate the internet myself. I think the second statement is the biggest truth to be said and is pretty significant. There is a lot of growing pains in all fields of development and some are more embarrassing than others but it's the fact they even tried it is what matters, as far as quantum goes though it has indeed gone too far. It's better to try and fail than to not try and miss your chance.
The problem is that quantum computers will almost certainly be useless for decades just due to physics. Until someone can discover a room temperature superconductor and a way to stop particles in superposition from decohering (I don't believe this even possible), most people won't even be able to have a supercomputer, let alone use it, and even if a way is found, we have maybe a few algorithms that are actually useful, and creating more would require even more work and even more research.
This reminds me of cloud computing. I was a recent hire at a software company and inherited a project a recently leaving partner signed on. It seems he sold it to the clients by peppering the document with the word cloud with little to no actual meaning reflectig what the technology did. Instead, it appeared to be the cure all for every problem they had or could have. In the 5 page document specifying what our company would do, he had signed us up for a job that would have taken a team of experts years to do and me, a lowly junior, ended up as the lead developer with only three months left in the contract. Suffice to say, it did not turn out well and the term cloud computing always leaves a bad taste in my mouth. Seems that snake oil salesmen are making the same use out of Quantum technologies.
@@MyCatJeff not exactly. When cloud computing first came put, it became widely known as a term, but few people actually had any notion what it meant. This sort of dynamic breeds a certain group of people to use such a term as a cure-all of any problem. The snake oil salesmen basically. Back then, if you had a problem, just had to litter the term cloud into the conversation a few times and mamy would automatically think they new some great fix for almost any problem. It was the same hype crypto had and similar junk related to it. Cloud computing, when actually defined, I agree, wasn't anything great and just a play on how computers manage themselves. Still, back then it was more mystical that real.
Or, you know, a device that can revolutionise AI. Quantum computers will be useful, so long as the progress in scaling up the number of qubits and stability continues, as it has been continuing strongly.
It sounds like the discussion I have with co-workers about their kids. They say to me, "I want my kids to have what I never did, don't you?" My answer is, "No, I don't care about that." It really rocks them when I tell them, "I'm not putting myself in charge of my kids' happiness, that's something they need to go out and find."
I so very much enjoy your presentations! They are informative, they flow very nicely, and are easily followed. But most of all, I truly enjoy your calm and WICKEDLY sharp sense of humor! It is always an education to watch your presentations, and always VERY entertaining. You're like a breath of fresh air in my day. Thank you so very much.
One of the wonderful things about watching you Sabine , and its the reason i was drawn to all your earlier work in other mediums , is the sense your fed up with the BS in modern physics. Love it, more please ! If you havnt already , please please cover the 'big rip' :)
@@flagmichael hehhehehe funny nonsense. In fact it is precisely that: funny nonsense, because you mix different concepts with different transcendental conditions, hence creating a pseudo-paradox... which can be perceived as funny nonsense,. I prefer to drop the "funny". Another ingredient is the claim that quantum theory explains "everything", which , not accessible for quantum theorists, includes complexity, language, and culture.
Unbelievable how people nowadays have so deep of an understanding of Quantum physics and technology on a microscopic level while just 250 Years ago people had to bite on a stick because someone had to cut their arm off when it was infected.
She’s so flipping funny. My family keep asking me what I’m watching on my phone (headphones on) as I keep laughing out loud. Sabine has Fantastic, sharp-as-a-knife humour. More please!
Humor aside she is basically the only one who is exposing rotten, stinking mold that have been decomposing physics and stopping advancement in the last century.
Right! And her delivery is so deadpan. hehe. She goes from the highly technical to humor and back, and if you had the sound turned off, you wouldn't detect it.
fact. the fact of this kind of humour existing within this subject’s framework is already kinda absurdly ironic, but the fact that it’s been so tastefully executed in these videos just cracks me, i almost feel rick-rolled each time. a brilliant thing of its own kind
I have worked on systems since 1974 and I have serious doubts about the usefulness of quantum computers and I have had them since I first heard about it and I haven't seen anything that has given me more confidence in them. Hype is all it is.
The girl at the right in the photo that was shown in 9:39 got me entangled away from Sabine´s explanation. I had to go on five times over the interval 9:39-9:57 to just barely understand what Sabine was saying (something about "crosstalk".) Closing my eyes was of little use, the image kept entangled in my mind.
I’ve just found your channel and what a delight it is to hear things explained in such a humble manner. Thank you for adding to the collective intelligence of our society
@@SpeedOfThought1111 don’t be so soft , her sarcasm is part of the humor which allows others to be more receptive and keep gen Z non existing attention spans on her. If you believe she’s actually insulting them purposefully, I’m sorry your sense of humor is so dry.
I am getting rather obsessed with your channel and your hilarious humour, I must tell you. This should be a million+ subscriber channel. You'll get there, Sabine.
She has a distinct bite of linguistic sharpness that combines the candid command of language as exemplified in the writing of the late Scalia, the tenacity exemplified by the writing O'Connor, and the linguistic influences of the late Ginsberg.
I've been loving how much shade this physicist throws in her videos at pop culture stuff, keeps it grounded. Just cam across her videos and they've been great!
You nailed it! I worked in the supercomputer field, and spent time working with algorithms to solve quantum problems. I cannot see how the current quantum computer architectures can support any algorithms that compute, ab-initio, molecular structure.
In a quantum winter you can know the air temperature or the wind speed to arbitrary precision at some point in space-time but never both, but either way you freeze.
Enjoyed your book a lot. Refreshing to read something the cuts through the hype. However it does speak volumes about the positive state of physics that you can be this critical of paradigms without having your career destroyed! Wish the same could be said for other fields...
Hossenfelder is getting better and better. At her very best she's producing biting sarcasm and it stands very well to her voice and personality. This is how science news necessarily must be to attract intelligent people. I'm not afraid to admit it, I love you Hossenfelder.
We need more skeptical physicists like you. So many physicists are so wrapped up in their theories, they just can't see the reasons some say it won't work the way they say it will.
A lot of us are sceptical and aware of how much hype there is, but, the entire grant system is based on promising future results and dreaming up how they create profit. If you are honest there will be no grant money for your research. It's a similar issue like impact factor that is more a measure of a field's popularity than actual quality of published articles. Science has been corrupted by capitalism and society at large has no tolerance for complex problems and explanations.
dont get me wrong, i like her stuff usually, but this video is more like a rant about that others get more money for there research then she gets... ofc she got some points in the video but at the end it sounds like a rant about why she dont get the money the others get
She's a physicist, clearly not a computer scientist to see the benefits of quantum computing. Matrix multiplication is used heavily in AI, and guess what can do that at ludicrous speeds? This physicist should stick to making science videos, not videos on computing she knows absolutely nothing about. Being an expert in one field clearly doesn't translate to anything. Of course, there is an initial hype phase of every new tech, but that eventually passes and then after that trough comes revolutionary tech. To say it's a bubble that's going to burst and nothing revolutionary will come out of it eventually, is complete bullshit.
I spend the majority of my time teaching composition and music theory. And although I love my work, I often have this sense that I am using only one small portion of my brain. When I watch your videos Sabine, I always feel like these other dormant parts of my mind are being awakened and asked to stretch their muscles. I feel like I am being encouraged to venture beyond what I know and feel comfortable with, into this marvelous world that is so foreign to me in many ways, making it all the more wondrous and awe inspiring. I feel like an adventurer discovering new lands, and this is such a wonderful feeling! Thank you Sabine, for being my guide into this extraordinary uncharted territory.
It's because she is challenging a lot of the conventional wisdom. She is asking you to think for yourself, rather than regurgitate what you are being taught. Perhaps you can find ways to do that with your music. I suspect that unlike Sabine with all the BS surrounding physics, you will find a deeper understanding of the intricacies and realities of your field and grasp the wisdom of your predecessors, but finding out that the giants of old were right and understanding why in new ways is its own fun and equally challenging. But then again, maybe music theory has gone down the crapper, too, and needs the same kind of sanity to be brought back down to Earth.
Just as an FYI, the whole using a certain percentage of our brain is a myth. It helps to think of our neurons as a binary system like computers; neuron not firing as 0 and firing as 1. The 0 is doing just as much as the 1. It's obviously much more complex, but it helps illustrate the issue. If we used all neurons simultaneously, we would have a stroke BTW ;). I'm not saying you made this claim, I just got triggered into a soap box moment
I've been watching Sabine's channel for a while now, and her quirky, almost incongruous humour has really grown on me, not to mention the down-to-earth analysis of a lot of the hype being thrown around in the physics arena at the moment.
Sabine, you are my favorite Physicist AND my favorite German. You speak in real and fairly understandable terms. Please continue to bring us these mini-summations of Physics - you might save a lot of middle class people from investments in companies that have little chance of ever making a profit. This is a valuable thing you do.
When I graduated from graduate school in 2014 with PhD in physics and went out looking for jobs, I found that almost all of them were somehow associated with the military. That is to say, an experimental physicist has a relatively easy time finding jobs building RADAR, missile guidance systems, etc. I was depressed. The military industrial complex that Eisenhower warned us about in 1961 is real and I felt it at that time in my life. I wound up getting a research position at Google, working in quantum computing. I am and will forever be grateful that while so many scientists -- who got into science to understand Nature and improve our lives -- are resigned to building weapons while being paid by taxpayers, I have the chance to work on something that might be useful for medicine and other beneficial fields and is paid for by internet ads.
honestly sabine should know better than to say some of this rubbish.... i have no doubt they will be useful, probably far sooner than that higgs bosom .... but hey no-one knows might be a few hundred years before the machines themselves are useful.... what you learn along the way will be useful and it all contributes this is how science works respect to this field as when i learnt about some of the achievements i was like hog diggity, big factorisations proven!
Yea I feel that, I’m currently getting my PhD in physics and I’m dreading heading into the workforce. I don’t want to work on military technology bc I hate the way the US uses its military. I’m thinking I might try to stay in academia and if it turns out I’m not that great of a researcher I’ll just go into being a high school teacher or community college professor. At least that way I can get some fulfillment from teaching. I don’t think I could be happy producing weapons or even going into the private sector in general since I’ll basically be working to make some fat capitalist richer.
It just sucks man I went into physics bc I loved the formalism and learning how the universe works on a fundamental level. I motivated myself through undergrad by watching lectures on cosmology, QFT, and GR and now I’m facing the grim reality that that’s probably not what I’m gonna get to work on after grad school and it’s really depressing. I’m glad to hear you found a job in physics you enjoy and I hope I can do the same
@@jongya Private sector can be very rewarding. Consider working in fusion power, wind/hydro power, new battery technology, biotech (think artificial replacement limbs with a brain-machine control interface, or whatever other cool technology catches your interest.
A bit withering but I love Sabine’s scepticism and passion for genuine scientific thought. It’s refreshing in its intellectual honesty. And the background research is so impressive.
I remember years ago when it was announced that scientists had built a device that could factor 15. Today I learned that they can now factor 21. I expect they will be able to factor 91 by the year 2030! Being able to cool things down to the millikelvin level with devices bought on the surplus market should open up some cool research opportunities.
An impressive achievement: 21=7X3! That's phenomenal! I'm an old software engineer, thus I've already seen something better 😀 Congratulations for the excellent video!
Excellent summary. I wrote a paper, to spite Scott Aaronson, called "On the (Im)possibility of Scalable Quantum Computing." Would very much like your thoughts. Regarding algorithms, you mentioned the very short list, but the key one so often mentioned is Shor's Algorithm. Many have mistakenly claimed that this has already been implemented on a quantum computer to factorize the number 21, but this is patently false, which you discuss. Thank you so much for clarifying this, and for your great video.
Thanks, Sabine, for this video. I am a computer scientist who has been skeptical for ages about quantum computing. It is a sad state of affairs that it drains money away from much more promising research.
@@danielsank2286 Like, in 2007 they could factorize 15=3x5 using Shor's algorithm, while now they can factorize 21=3x7? Note that the quantum computer they use for such a calculation is application-specific and input-specific. In other words, the quantum computer used to factorize 21 cannot be used to factorize 15. To make things worse, hardware complexity grows exponentially with respect to the number of bits used to represent the input.
@@SurfinScientist When I started in 2007, a good lifetime for a superconducting qubit was 600 ns. Now we see 100 times larger as a matter of routine. Factoring small numbers in algorithms tailored for those specific cases was a nice demonstration in the 2000's when the qubits couldn't live long enough to compute anything actually interesting, but these days we're aiming for full error corrected computation. I'm not sure what you mean about the hardware complexity growing exponentially with number of qubits, because actually in my career I'd say the hardware has overall gotten simpler, and the electronics, packaging, etc. remains relatively the same level of complexity as the number of bits increases.
First of all, thank you for presenting this in such a clear and non-BS way! I’m currently working “around” post-quantum cryptography and quantum networking (from a technology strategy and standards point of view, not as a researcher) a, and also worked in artificial intelligence (as a researcher) in the 1980s and early 1990s. I see a strong parallel between the current quantum bubble and the AI bubble of 1985-1990 or so. AI and Machine Learning recovered after a dry spell of a few years, but this was mostly because of Moore’s Law and the explosion of the internet making things like big data and ultra-fast tiny devices practical. As you point out, these solutions don’t apply to quantum technology, despite the fact that people are expecting them to. I’m near the end of my career, and it’s kind of sad that we seem to have fallen down this rathole - I would have preferred to have finished on a high note.
The cooling for the ion trap quantum computers is a technical issue that might be solved. The main reason it is used is to decrease the pressure below 10^-12 Torr so that your ions don't get knocked around by background gas. Anyway - thanks for the great video!
This video concerns me, as a physics major who was just starting to be interested in quantum effects in condensed materials, especially because of their application to research in quantum computing ( I didn't believe the articles about how revolutionary computers are because they're clearly bogus, but it worries me that there might be no funding to research this interesting topic )
There will be funding in the future. Money pours in from all over. Government agencies, the EU, big corporations throw out billions. It will need many years to dry out.
There's basically three types of computers. _Consumer_ stuff like desktops and smartphones. _Enterprise_ stuff that runs webservers and datacenters and businesses. _Supercomputers_ that do research or are designed to solve specific complex problems. Quantum computing has no real application for consumer or enterprise machines. Too much added cost and complexity for too little added performance. So it's going to be the realm of expensive private supercomputer platforms - it's not going to magically revolutionize the world. At least not for everyone and not for free.
the funding will scale down to a more reasonable level in line with more sensible expectations and potential specialized applications. This doesn't mean that studying quantum mechanics or solid state physics is a dead end by any means.
A follow up on this topic would be great. Because at this point, 5 Months after this video we have 433 Qbits Quantum Computer, Qiskit that can be used in combination with python and upcoming, the first prototype of a quantum software application. I don't understand the possible obstacles or limits of this technologie, but I can see progress. And we also have to remember that every major breakthrough was ridiculed at the beginning and many times called out to be impossible. I also would like to know what your approach would be? Wait until we could build a Q Computer right away with millions of QBits, Q-OS and Q-Software? Isn't research and experiments the best way to achieve ones goal and make progress?
Sabine is correct about the current technical shortcomings of quantum computer DESIGNS, but when it comes the emerging quantum computing INDUSTRY, she is just flat out wrong, and continues to present data and info that is about 2-3 years outdated. The time to build important manufacturing relationships and infrastructure, as well as relationships with other companies, along with governments (and their militaries), is something that starts during the end phase of coming to a viable design. That end phase is just beginning. A more mature mass-adoption phase is admittedly is more than a few years off, but it's not that far off. And the quantum wave that people speak of is not necessarily the actual tech, but what that tech represents to investors. For instance, you don't wait for computers to become massively popular in the '90s before buying shares. You speculate and buy tons of shares in the mid '80s when they are dirt cheap. If you wait until 2030 or so to start investing in quantum technology, you will be about 7 years too late. It's not a matter of if quantum computing will dominate the world, but when. Don't fall for pessimists who say it isn't coming, or that it's not going to be viable until 2045 or something. Technology moves much faster now than it did even 20 years ago. Also, the market is already in a "winter," or bear market, due to Covid, the war in Ukraine and mega-inflation ... so if you think an investing opportunity is going to get any better, you are sorely mistaken. Shares are already at a 50-70% discount.
Good Lord, do I love the snark from Sabine!!! This kind of reminds me of the hype behind the revival of 3-D and VR to be honest, and I suspect it'll pan out the same way: outside of specific applications its impractical at current tech levels. Unlike VR and 3-D (where the primary application is entertainment which chases the lowest common denominator), I think there are enough diverse applications for Quantum Computing for the tech to continue developing when the bubble pops.
Also, from an algorithmic standpoint it doesn't matter if an individual algorithm is of use or not. All algorithms fall into certain categories of complexity and approaches to solving them. There are only a handful of distinct approaches that are modified and repurposed for many use cases. If you, for instance, could find a way to solve sudoku puzzles in polynomial time, it would only take a matter of a few years to have several approaches for solving protein folding in polynomial time published.
The issue is that quantum computers do not allow you to solve NP complete problems in polynomial time. The algorithms we know of so far only apply to a few specialty problems.
@@peterisawesomeplease Yes, but since all NP complete problems are solvable if one is in polynomial time, he is saying that once one is solved in polynomial time, the whole class of problems will be solvable in polynomial time.
@@Eric-jh5mp Yes that would be true if there was an algorithm to solve a NP complete problem in polynomial time on a quantum computer. But there isn't one that we know of.
@@sccur Maybe. My point is that even if solving one problem often leads to many being solved in computer science it doesn't mean that any of the quantum algorithms are particularly useful even when considering this point.
Makes me think of the transition in classical computers from vacuum tube to the transistor and then integrated circuit. It was hardware refined for binary software. It will be interesting to see how hardware design changes to the specific nature of quantum software.
@@shrimpflea I don't think I quite see that comparison. I'm talking about the relationship between hardware development to efficiently use binary software code. Vacuum tubes were used as on/off switches (binary), which were shrunk into a solid state transistor switches. It did away with unnecessary components to make the switching occur (like heating elements, glass chambers, and a vacuum). I don't know that the hardware containing a fission process has changed much at all since it's inception (and ironically, it's still basically just a complex steam turbine engine), and fusion has never actually been used to power anything, (despite promises to the contrary for 30-40 years).
I don't believe there will be any change in hardware specifically determined by quantum computing. Quantum computing, provided it will indeed take off, will be very niche, and as such hardware will be developed very specifically for the things it's good at, leaving the rest of the hardware untouched.
Cloud computing because the idea of everyone having to supercool their personal computers or having a mini vacuum chamber installed is ridiculous. We're going back to the pre-PC days with quantum computing, but the benefits will be made available to the masses via the internet and cloud computing, and the IC technology which allows PC's the be small isn't going away.
@@shrimpflea Fission and fusion is not an apples and oranges comparison? If you want an apples to apples comparison, you'd want to compare between two different fission nuclear power plants. Fusion technology has not matured to where there is an industry standard, and an "apples to apples" comparison can be made between different approaches. I also don't like the term "apples and oranges", because it begs the question: when is it acceptable to compare two different fruit and why? Can you compare an apple to a pear or an orange to a lemon? What about a a red apple to a green apple? Do the apples being compared need to come from the same tree? What exactly are we comparing between the fruits? If we are discussing the evolutionary tree of apples and oranges and their common ancestor, THEN can we compare them?
I almost never hear the problem mentioned that most quantum processes are -- essentially -- random in nature. For calculations (e.g. the theoretical paths of photons) you have to add huge numbers of "possible" pathways and find the average. So ... quantum computers will have a VERY high ERROR RATE! Thus, binary supercomputers will be much more reliable when definitive "answers" are needed!
Sometimes I feel like that the likelihood of quantum computers being used by everyday people is like making human teleportation happen from the current "quantum teleportation".
Quantum computers were never supposed to be used by everyday people. Their use is limited to very specific areas of research and development. Even if we had a perfectly functioning and scalable quantum computer the size of a normal laptop, normal people would still not use them. A normal computer is much more versatile and useful for anything an average consumer would use a computer for. That is not to say a quantum computer is useless, but it's uses are limited, and mostly interesting for research in specific areas.
@@danimyte3021 Agreed. This is why the scenario of wide scale business usage is a fiction. Quantum computing startups might work for specific areas, but ordinary business people will never need nor will use them, at least in the near future. It will stay the tool of specific research areas and fields both needing high computation capacity and having the ability to compensate its higher costs.
Do quantum energy producers, recently discovered by humans, wish to be domesticated, corralled in a computer, forced to run in circles so that their energy can be redirected and harvested to improve human memory and computational faculties?
I'm finding the prospect of the unexpected, magnetism-based computing more interesting with more potential of being here sooner, more widespread, with wider applications frankly. Edit: Also teleportation is death. Star Trek had many great ideas, but once you look into what teleportation actually is.. you die and get replaced by a replica. No thanks I'd rather take space shuttles to planets
@@danimyte3021 That is oddly similar to what in the early 1940s, IBM's president, Thomas J Watson, reputedly said: "I think there is a world market for about five computers." 🤔 Einstein: predictions are always difficult. Especially if they concern the future. Might still go the same path as nuclear fusion, with success always 10 years off.
I'm no expert, but my partner is. The scariest headline to me is the factoring of numbers, breaking rsa in subexponential time. It'll probably take a while to do it, but is that really infeasible? I think nist is still looking for something quantum safe to replace rsa.
Fascinating. Especially the factorisation of 21, and they had to make it simpler! Well I do think its impressive quantum computers can be built at all, but you really helped that bubble burst before it got any more big and dangerous. It seems to me that optical computers might be way better for most tasks. An optical transistor combined with the ability to compute multiply and add ops for neural networks in.a massively parallel way would be way more useful
yes even lcd, fluidics, 2d non dispersive non dissipative circuits, mabye graphene layers, or surface based , 2d logic gates are possible in 2 with accurate timers.. solitons, in 1d, advanced crystal nano fluids and such material sciences are more promising and show results. cheap and dont heat up nor need cooling. even a 8 cm wire has a massive voltage drop, these have no resistance so if they are bigger but flatter, and doent make 400 watts of heat, its great.
13:30 There was a very prominent astronomer who once said "We will never be able to know the chemical compositions of stars or interstellar clouds. His thinking was, the only way to know the chemical composition of something would be to analyze a sample of it. And he was probably correct in thinking we will never be able to take a sample of star stuff. But even when this scientist said this, spectroscopy had already been invented, though it was such a shot time after this that he likely had not heard of it. My point is, when a scientist says something will never be possible, they are very likely wrong.
I'm at the end of my studies to become an engineer and a researcher in quantum physics, and I hope I will benefit from the end of this bubble, and hopefully some nice results will appear from all this research that will have real applications that benefit society and knowledge about the universe.
You sound like me when I was doing my dissertation in autonomous vehicles with deep reinforcement learning, imitation learning and transfer learning. My advice is to get some hard skills from your research and don't hope for magical sci-fi fantasy application of it
Another bubble that will burst in the next few years will be FSD - Full Self Driving. The programmers working hard on vector spaces and all modes of computer learning are quickly learning that the human mind is much more diversified and complex than what they originally thought, and cannot be emulated with the current hardware and learning theories...
Don't listen to her, quantum computing will be one of the most important inventions this century. I hope you know how much it will revolutionise AI eventually: for example quantum systems can be used for matrix multiplication, an operation that's repeated millions/billions+ times to train an AI model. Imagine the complexity of an AI trained making use of this operation, it'd run rings around traditional hardware.
@@benj6964 no, and it will likely not have a real world use for matrix multiplication for a long, long, long time. Matrix multiplication is highly parallelizable, and GPUs are incredibly good at their computation. Even with a hypothetically faster quantum algorithm, the pure power we can commit to matrix multiplication will likely dwarf any computational efficiencies.
I love your content, especially the honesty. It's easy for a lay person like me to not understand when people are just trying to Hype up something that may not be as it seems. Your video has helped me know what the actual cases
There is commercial hype and then there is scientific hype. Don't confuse the two of them. Commercial hype about possible technological breakthroughs tend to inflate and collapse very easily. In 1822 Charles Babbage, the father of the computer, came up with a design for a mechanical computer that he called the difference engine. The English government funded him to make his design a reality but the metalworking technology at the time was too primitive. After 20 years the government gave up and considered the investment a complete failure. It wasn't until the invention of electronic circuits in the 20th century that computers exploded and became a part of everyday life. The theoretical possibilities of computers in 1822 looked just as promising as the theoretical promises of quantum computers look today. We may currently not have the technological sophistication to make quantum computers commercially profitable, but this should not deter us from pursuing further research and development into this fascinating technology.
I suspect all those physicists who leave the quantum computing field when it collapses will turn up working at banks on Wall Street as “quants”. They will help to bring about the next financial collapse by developing yet another exciting new “vehicle” that no one understands, but will be vastly over-hyped. It’s amazing how transferable these skills are.
1) why have a position in the market, when you can have a superposition. 2) Hedge funds? No. Entangled asset distributions and phase factors on stock prices.
haha... I'm pretty sure that it's the one profession where they think they're at the top of food chain and will be able to do anything successfully whether it's science, engineering, youtube, maybe even modeling and porn.
I love your unintended double-entendres (sic), at 4:50 you talk about the quantum winter and then proceed with "let me summarise", which could be heard as "let me summer-ise"! Get it? I am sure you do.
I work in quantum technologies, and I follow closely the development of those technologies for the next 10 years. Will there be a quantum winter? Yes, but only because the investors hype will drop, not because the technology is not promising. And this winter will probably only impact quantum computing, not quantum sensing, metrology, cryptography or communications. I have a technical background, and work with a lot of researchers. I realized that technical people lack this "vision" of what's to come, they only see the things as they are right now. If you look at the first electronic computers, there were a lot of haters because it only did the same calculations as an abacus, but with a lot of costly and extra steps. We now know it is way more powerful than an abacus because we see all the useful applications. It's the same for a quantum computer - because you don't see applications right now doesn't mean there are none, it simply means you cannot see them.
This “vision” is a lot of hype, and comparison with electronic computers is tiresome and even the first old radio-tubes based computers actually performed useful work, could do a lot more than “abacus”.
I really wish that those who are throwing tax payers money at quantum computing watched Sabine's channel, but I am afraid they are rushing into quantum winter not wanting to listen to the nay sayers.
Could you do a piece of what Nu Quantum, from Cambridge, is doing with photonics for QC and QN? They claim to be working on technology that uses photonics to efficiently scale quantum computers.
I’m a computer engineering student, and I understand that maybe IBM won’t have a use for their cryogenic chamber if the bubble does indeed burst, but isn’t it better to leave no stone unturned in the search for our evolution as a species? I say this cause if something is important enough, we should do it even if the chances are slim that we will accomplish our goal that being said you’re really funny 😄
well when the bubble pops, there will be plenty of second hand dilution refrigerators to go around, which i am the most excited about, since the stuff is expensive as hell but very usefull in superfluid research
I remember when the advent of quantum computing had everyone in a panic about how all data that was encrypted would suddenly be able to be instantly cracked From what you've described here, I feel like the reality of something like that occurring is getting slimmer and slimmer
And when it does occur they'll make sure not to inform you until they have all your datas. Maybe they all just have cracked it. You wouldn't need to be informed. They can do it without your help.
That would require quantum computers to show some success in number-theoretic problems (e.g. integer factorization). So far, the progress on that front has been precisely zero.
I want that to happen mostly because it would be a consequence of a quantum computer solving either the Riemann zeta hypothesis or whether P equals or not NP. But yeah it won't happen overnight.
While I agree that it is overhyped, I think that there is still value in the research. Same with FTL -- I am quite sure it will never be achieved, but chasing it has uncovered some really cool and potential viable (and valuable) side concepts :)
I remember giving a university seminar on quantum computing more than 20 years ago. In the absence of any implementations, I lost interest and only regained it (temporarily) when IBM made its announcement. I was sure that if IBM was announcing product, there must be something there. But I was wrong, although actual functioning hardware existed, little or nothing conceptual had changed. When I was in university, I worked with a couple of hybrid machines - IBM 1620/1710 and 360/44 - which had analog computers as co-processors. The idea was that the analog processors could perform simulations of dynamical systems faster than their digital counterparts. Of course faster digital machines have long since obsoleted these systems, but present quantum computing organized in the same way as they were and may have an analogous role to fill (intentional). Of note, Dr. Roger Penrose has proposed that the human brain may include quantum elements which allow non-computable results. However, I came to the conclusion that IBM's objective was primarily to regain its role as a technology leader and to have something new to talk to its customers about.
Molecule biology works in scale of hundreds atoms (protein) so maybe some of protein function comes from quantum relations? Or maybe it is still limited only to electron interaction level. We still don't have good models in electron cloud interactions, so who knows. Waiting for IBM Block-chain quantum AI solution. 😏
@@peterbonucci9661 That's the 1710 part. en.wikipedia.org/wiki/IBM_1710 which provided the A/D interface. The analog computer itself was built and maintained by faculty. As a undergrad, I never used it. It was physically separate in the next room with a window.
I always feel like the pursuit of quantum computing will teach us about edge cases related to quantum mechanics, some of which may provide fruit. After hearing your video, I feel like we should focus more on length of time staying coherent so we can at least play with the system. It's probably 20 years out along with fusion reactors and artificial general intelligence.
Fusion has been "20 yrs out" for most of my entire life (now pushing 50). Artificial General Intelligence much less so and, to be fair, is probably much more achievable than fusion given that there's a couple hundred billion working examples of AGI roaming the earth already.
With shor's algorithm being a thing I don't think gov funding is going to stop until someone can prove quantum computers are unachievable. The fear of another country getting ahead and being able to break encryption is too strong a motivator
It is public (Asymmetric) key encription that can be broken. Private (Symmetric) key encryption, once you have set it up, perhaps by passing pieces of paper around, cannot be broken.
@@alanjenkins1508 also not all asymmetric encryption schemes can be broken by Shor's algorithm, only the ones which rely on the infeasibility of factoring large primes, such as RSA.
Most encryption has moved to algorithms that aren't susceptible to Shor's algorithm. And there are no quantum computers that can run it. If you have heard hype about such and such a large number being factored, they didn't use Shor, they effectively told the computer the answer in advance.
@@fluffysheap this is not true. The most common (asymmetrical) encryption nowdays is based on elliptic curves, which is susceptible to Shor's algorithm.
I was involved in quantum computer research since the beginning. The difficulty of the task seemed apparent. Yet great optimism grew from the advent of quantum error correction. The problem I saw is that the quantum error correction overhead has atrocious scaling. The question I'm still asking is whether the qubit count represents physical or logical qubits? For example, if the physical qubit count reaches 1 million, there may only be thousands of logical qubits due to error correction overhead...
Hello. I think it is good that you explain so many different fields and branches of physics and I am really feeling like I am learning a lot. However, I sometimes get demotivated when you show all the downsides of different fields and how they do not seem useful for society. My request is, could you make some hopeful content or recommendations especially for people considering a career in physics for interesting and meaningful fields. It really brings me down to see how much "useless" research is being done in fields that sound so interesting and I do not want to fall for them. I hope you can recommend some alternatives some time. Thank you.
In a way, she is giving motivation. Just like how dark humor can be therapeutic for a individual; being able to be self aware enough to realize when certain things get off track and it can motivate you/anyone to attempt to look at these things and try and find a new perspective/approach to solve a problem. The best inventors, philosophers, artists seem to be able to be comfortable with facing things and learning to harness the ability of how you observe things as a tool and through perspective experimenting, you can sometimes stumble upon new methods/manner's on how to solve problems in ways that others didn't notice/see/or understand. (Just like how Einstein had the ability to get lost in hypothetical thought and imagine different perspectives and then it stumbled upon his path towards learning about light/energy/physics etc.) That came from Einstein taking a category that previously was in a state of limbo, but he was willing to look at it and see if he could find anything else out about it. So if you get demotivated, or down, try and remember that even things we think we fully understand; have the total possibility that you can look at it/observe it in a new light that others haven't and totally find a new layer of complexity or depth to a topic. So even if we are faced with a lot of difficult situations, roadblock's, the best thing is the universe is full of amazing intracity so the chance of further understanding the world around us is profoundly possible and to me that's very motivational and Nature itself some how seems to have a way of always keeping that inner child like curiosity alive with-in my Life.
There are many research fields in physics which do have modern day applications, but they usually require you to move slightly away from particle and theoretical physics. Condensed matter physics for example is a very hot research field with many applications ( for example, material science, semi-conductors, super conductors). There are many other fields as well such as for example space physics, medicinal and biological physics and solar physics.
You're looking at it the wrong way. There is nothing wrong with "useless" research, knowledge for knowledge sake is still valuable to society just not on a day to day basis necessarily. The downsides come from the fact you need funding to do your research, so you need to play the hype game in order to get money for your research. But that shouldn't stop you from pursuing a field you genuinely feel passionate about, just make sure you know what you're getting yourself into.
Try getting into materials science. Lots of potential there, a good deal of it on solid foundations. Condensed matter physics is the field you're looking for.
There's also been a lot of hype about analog computing in mixed-domain applications, and some of that looks promising. I wonder if there may be a future for digital/analog/quantum hybrid computers.
Qbits are analogous memory. This analogousness is also the problem. They will never make it 100% accurate. When many qbits are used, error will be exponential to qbits. They will never make working large QC.
In the early part of the 20th century, analog computers had the edge over digital ones in physical-simulation problems in being a lot faster, albeit with limited precision. I see history repeating itself, with “quantum” computers being the new-style analog computers. You never see them demonstrate any success with number-theoretic problems, for example, as would be needed for cracking encryption (what happened with Shor’s algorithm?). Just about all the success stories I hear about have to do with physical-simulation problems.
This was super illuminating. I have watched about 10 to 15 youtube explanations of quantum computing, trying to find one that would give me a real world example of how it would do things better. Of course they mentioned pharmaceutical applications and other high-level things like that. But never a down to earth example that I could understand being a non-scientist. So Sabrina explain things perfectly, they have not figured out anything to do with quantum computing yet other than then to say things can be computed faster and then again only on a targeted limited set of problems with a super-small set of algorithims? Amazing, thx so muc Sabrina for cutting theough the hype!
I am for and against quantum computing at the same time.
😀
Won't be anywhere near a thing in a decade. In a century? Maybe!
I observed what you did. So you don't!
I'm clapping with one hand! Bravo!
Ah, Schrodinger's comment.
When I was a PhD student in quantum computing we all found it depressing to read articles about how quantum computers will revolutionise business. “Quantum winter” felt inevitably around the corner
It's like scientists tend to Elonize (*Elon Musk) things for sensationalistic purposes.
It will make the landscape even more polarized. Nothing new
Was it worth it doing the PhD in QC? During my Bsc and Msc I have focused on researching quantum algorithms, but now I have to choose between continuing with QC and shifting towards ML/DL for my PhD. Both fields are over-hyped, but ML has seen some really promising developments lately.
@@stefanbalauca7481 ML is the programming language that replaced LISP in education sometime around 1990. Machine Learning needs a different acronym.
@@johndododoe1411 I'm afraid it's going to be the other way round.
Haha „if you look at them, will they collapse?“ loved it
Best joke I’ve heard all week
" "*
I was glad I wasn’t right in the middle of a sip when I heard that.
I was sipping beer from a can when I heard that and it almost caused an accident. 😆
Certainly true of Liz Truss' government! 'The Market' looked ... the highly entangled (as in kittens and balls of string)
government promptly collapsed.
Sabine is awesome. Love her one on Nuclear Confusion.
And in this one: "Some people will lose a lot of money, but that just means they had too much of it to begin with, so I can't say it bothers me too much" 🤣😂
Just started watching her videos and I adore her dry sense of humour, such a great bonus on top of the education she provides 👍
LOL, exactly the comment I was going to make.
Just so long as it’s not your pension fund that -invested- lost that money ...
Being part of a quantum research group myself, I really appreciate Sabine's well informed take on this subject. The fact that its seasoned with her dry deadpan german humour, is a bonus.
Can one look/interpret Qubits as a wave? if the question even makes sense... :D thank you
I look forward to having a conversation about qbits while you drive me around town one day.
I do not really have a clue beyond pop science about quantum anything. I do know EECS though and after seeing claims of the capabilities of quantum computers have been trying to tell people for a long time they won't make their GPUs faster. At best, for everyday use, they'll be a coprocessor for rare, very specialised uses.
Now I have a video to send people instead of getting dragged into go nowhere argument.
@@GermanGameAdviser Yes, until entanglement leaves the building-thx for asking.
You know alot about germans, very suspicious...
Sabine's dead-pan delivery and her sarcastic and sometimes cynical beat-downs of overblown ideas with her sharp wit never fail to amuse me. In a climate where so many people dare not call out BS for what it is, I adore Sabine for her realness.
As I came to the end of the video, I started to come to a confirmation of the same feeling - "Science without the gobbledygook; also without the BS". Thanks Sabine, for your honesty and clarity.
She's got a decidedly unscientific certainty to the statements she makes. Her opinions are just that and delivering her opinions as obvious facts is deceptive. Quantum computing has enormous potential. It may be expensive and may always be expensive and it may be noisy and this may never change but even with those problems it has enormous potential. If it's at all useful it can do things that seem like total magic right now. This power is known and this knowledge is what causes the seekers to pour so much time, effort, and money in. And Sabine is doing something very sneaky here. She makes a video that contains inflammatory rhetoric about a topic she knows little about and finishes it off by turning into a saleswomen hawking a predatory product that promises it can teach you how to think. And she gets paid for it. Unknown to most, the reason this video exists is to get Sabine paid. The only reason. This is why it is inflammatory. So you'll watch it. It's not inflammatory so it's closer to the truth. The topic isn't chosen because Sabine has strong feelings about it. It's pandering. It's clickbait. Advertising and even payment consistently and inevitably causes the reduction of speech into drama and propaganda. Each RUclipsr that engages in sponsorships falls into this trap and all their channels become hollow shells of their former glory. Honesty gets turned into drama. The depths are made shallow. The complex is made simple. The heartfelt is made insincere. The RUclipsr clearly only shows up because it's their job. The passion fades and becomes jadedness. Lies creep in. Exaggeration bulldozes over subtlety. Truth dwindles. And the fanboys keep cheering. Like lemmings. Their brains rotted. Their faculties stunted. Off the cliff they go. I dare call out the BS for what it is. I despise Sabine for her fakeness. Another video only made so that there's a place to tack an ad onto.
@@xbzq Clueless, Sabine was a leading theoretical physicist and calls it like she sees it. the world is filled with bubbles and this is certainly one. She never said they won't happen, just bagging on the hype.
@@dft1 Sabine IS the hype. But you could never see that. Also, she has very strong ideas about a lot of scientific ideas as if they were settled science. They aren't. She talks a if she has answers that she doesn't have. I don't care about her credentials. She's just a person. And she's often wrong. Like all of us. She's no god. She doesn't understand programming, or quantum algorithms. She doesn't seem to understand quantum computers won't be used for payroll. Neither does any of you comments seem to understand Sabine is just a person. Everyone worships her as if she's done kinda of god. Everyone on earth it seems theses days subscribes to some hero's ideas and worships them. When they inevitably fall, these people fall right along with them because they've tied their identity in with this person.
@@xbzq I share some of your concerns about her aggressive way of stating her opinions (she does seem to present herself as the arbiter of what science is useful or worth spending time and money on), but RUclipsrs need money to survive the same as the rest of us.
Sabine is the living embodiment of "keeping it real", and I'm so grateful for that.
Not really. She very clearly has her biases. Especially on anything quantum related or related to ideas like free will or the multiverse that involve both science and philosophy. She is great but she's not the "living embodiment of keeping it real".
If by, keeping it real, you mean leaning further and further toward cynical skepticism, and philosophical contrarianism. I still watch and enjoy her content , but am often left disheartened and hopeless.
With great humor!
@@anakinthemannequin69 Free will is just an illusion , consciousness is just an absurd joke....
@@anakinthemannequin69 because she is fed up by all the media hypes and BS by journalists and marketing types who have no clue whatsoever.
I love the balanced and skeptical position you take on all of these issues. It really feels a lot more informative than a video that's clearly trying to push a certain agenda or use clickbait claims to draw in gullible viewers.
While I really enjoyed the video, saying that she is balanced and skeptical is quite funny. It is a massively oriented video. Even more, since it is so obviously oriented toward one conclusion, I don't mind it : it's made very clear that she will present points that go in her direction, and you know that straight up.
I always leave the video depressed.
I worked at Texas Instruments (TI) during the first Artificial Intelligence (AI) bubble, circa 1985-90 (my estimated dates). The issues described in Sabine's vid, were nearly identical to the marketplace hype and expectation of AI. TI developed and produced computer hardware that directly supported the Lisp programming language. To my knowledge, TI sold very few of these systems. Attractive commercial AI outcomes (algorithms, etc), took decades to develop, and used existing hardware.
There was also a nice little neural network hype (not called deep learning then) in about the same timeframe.
Indeed, there were a couple of waves of AI, which mostly consisted in making a new lisp variant and some hardware to make the lisp test programs work faster.
Then there was the "expert systems" that were supposed to revolutionize the world and "knowledge experts" were to be trained everywhere.to solve all problems.
The fact is that real revolutions are unpredictable by nature.
@@niceguy100000 Yes, also Santa Cruz Operation (SCO), offering Xenix and other Intel/Microsoft/HP based UNIX solutions, all of which Linux killed off. Heady times, fabulous trade show parties, zero profit.
@@wyattonline Yeah, good times. SunOS was next level and properly working unlike some other caveman software. Also the first VR hype (of course totally profitless) was cute.
But AI is arriving at last and is a true world changing revolution
"Some people will lose a lot of money but that just means they had too much of it to begin with, so I can’t say it bothers me all that much."
Nailed it.
Not quite. Some people have more inside information and make money on both the growing and the bursting of the bubble. Other people buy in near the peak and then can't get out fast enough and get burned. The stock market creates nothing, it is a zero-sum game.
Investing in Quantum Computing is God's way of saying you have too damn much money. All credit to Robin Williams
There's a suing attributed to P.T. Barnum that I think applies to investors who spray money into stuff without understanding the subject in even the most cursory way: "There's a sucker born every minute."
Unfortunately, a lot of those people losing money will be taxpayers via their governments.
@@fewwiggle Then people need to stop electing paint chip gobbling Mow Ron's.
"Some people will lose a lot of money, but that just means they had too much of it to begin with" 😂 That made me spit my coffee. Now I have to clean it up. Thanks so much, Sabine 😑
Absolutely savage
Sabine is top of the list in making difficult subjects comprehensible for normal people without dumbing it down too much. Wonderful absence of irritating melodramatic soundscape
Excellent work.
She’s just pessimistic and gloomy… it’s easier to destroy arguments than build them.
@@RandomForestGump I know what you mean... Yet sometimes people need a slap of reality.
I am seeing overblown expectations that even my layman mind knows is wrong, because I think some want funding.
So the researchers have to play a game of manipulation in order to get it.
Sort of like how channels use click bait to get attention, because it works! Especially versus being honest.
I'm sure there are a few downright fraudulent companies running amok.
Raising expectations so high always has a backlash.
I expect it both for AI and Quantum computers.
Yet! I hope I am dead wrong. There are a lot of promising theories and technologies in their infancy. I just think timeliness are off.
Photonic computing, and eventually quantum photonic computing looks great if we can get it to work as some way it will.
I’m really grateful for this channel. You really tackle these pop-science issues very well. I say this as a fellow physics PhD.
PSST, STREAMMIE, BEING ASKED :
" JUST HWHAT DO YA SERVE HERE ? "
RESPOND :
" ETERNITY ..."
Indeed! Sabine knows how to cut right through the fog! But to be frank, I have an MSc in Physics anyway so I know how to navigate this landscape myself.
This is why the general public REALLY needs to get up to speed with scientific literacy! Plenty of crazy crackpot ideas running around and marketing hype that says so much but delivers so little!
So as Public Enemy said so forcefully:
*DON'T Believe the Hype!!!*
As a computing specialist myself I couldn't disagree with her more. An operation that's used heavily in AI, matrix multiplication, can be done on quantum computers. If a quantum computer is sufficiently scaled up, and these operations were run to train an AI, the amount of training on that AI you could do within a short period of time is ridiculous. This becomes increasingly important as data sets are getting bigger and bigger. I've sat around for hours waiting for a smaller model to train on traditional top end silicon hardware, imagine the bigger models. Quantum computers will revolutionise AI, and that's just one example. Quantum computers aren't going away, they're going to get better and better, and a bitter old lady who's mad that their obscure research isn't getting as much money as quantum computing won't stop it. There's hype but there's also substance there. Of course, it won't ever live up to the hype, no new tech does, but eventually it will exceed it. It's really unscientific to discourage quantum computers by saying it's a dead end.
On that point about QM computers revolutionising AI, I should also point out this: the better models are trending towards consuming more and more data so as time goes on the computing requirements are going to also climb up. QM computers will solve this. Tesla, to train their self driving car models, have created basically a whole factory of servers with specially made chips just for crunching these numbers, now imagine you could do this on one machine that's significantly smaller. And a sufficiently powerful AI can solve all human problems, including our deepest physics problems. Seems like a pretty fucking big deal to me.
@@StephenWylie1522 WYLIE, SAY, I AM GOOGY, THE GOBBLEDYGOOK.... YEAAAAAAH.... LOVE
@ 2:16 "If you look at them, do they collapse?" Sabine, your comedy if flawless. I watch a lot of science videos on YT. You are the only one capable making esoteric abstract concepts clear, but also funny in the cleverest ways. You _are_ the smartest social media explainer of scientist on the internet. Thank you _so much_ for what you do.
Thanks for watching 👆text my trader Jeremy if you are interested in investing in crypto to making large profits tell him I linked you..
Absofreakinglutely agreed on that one! The next best is Anton Petrov, but he's not a PhD, just (I say "just" but it's definitely not nothing!) a teacher. Still, he has the same flavor of comedy, and he covers lots of different kinds of papers that have been recently published, from astronomy to microbiology and everything in between.
But the top winner has, yes, _got_ to be Sabine!
and what USE is the higgs boson or looking at the "big bang theory" with a telecope?
“esoteric abstract concepts” that phrase alone screams “i have no personality other than my 115 IQ”
@@UnderscoreZeroLP Because IQ is the _only true_ measure of mentorship, compassion and humanity? I grew up in a library. This is your problem of you. May harmony find you.
"But if you do it with a quantum computer, you can publish it in Nature." Just phenomenal.
"But if you do it with a quantum computer you can publish it in Nature" - and in Science at the same time 😜
😂😂
Yeah.... cause its fucking cool as hell.
Was the same thing 5 years ago with a deep learning model instead of other types of maths
@@kaourintintamine1383 And now we have jaw dropping AI applications like DallE, ChatGPT and more. Not sure what we will get from quantum computers. The NSA decrypting all our porn videos?
I am so glad to hear someone else, someone with more credibility than I have, speak out about this. I've been dubious about quantum computing since the mid-90s. I'm not a physicist, but I've been doing software for many decades, and I've also worked with stochastic and probabilistic systems, and most of the time my understanding of the projections has veered off course as soon as the quantum jargon began to dominate the explanation. It always felt like hand-waving. I gave a talk at Interval where someone brought it up, and my reply (at the time) was "Yes, quantum computing will give you the right answer, and it will deliver it faster than conventional computing, but it will also deliver a multitude of wrong answers, so you'll have to go through them all to determine which one was correct."
[edit] Even fuzzy computing does better than that.
I just went back and noticed the cat on the cover of the Hoofnagle and Garfinkle book, but I couldn't tell if it was alive or dead.
Necromancy will fix your cat problem. Just summon up the spirit of Schrodinger. Actually, it sounds as if you've tried that already.
You are becoming part of the few physicists that keep emotional entanglement out of science. It's nice to be optimistic, but much better to be realistic. I appreciate your approach, now more than ever.
hehe entanglement
yes, I feel like I can rely on her for an honest opinion.
"I guess if they don't work out, they can rent them out to have their heads frozen." - best quote of the week.
I thought she said hats....
But you probably do not want your head frozen so cold, that parts of it become superconducting. I prefer I fire funeral.
@@manoo422 I wasn't sure if she said "head" or "hats", but I didn't rewind because the joke works either way.
@@bentonjackson8698 she'd pronounce "hat" closer to "het" (not nailing either the American, British, or Australian vowel sound langing somewhere in between everything and succeeding - German engineering for you), and also she has quite a cruel euro humour going so pretty sure it was heads
But if you look at it, does it collapse?
Every new technology goes through this “overhype” phase, like internet itself, or blockchain as the most recent example.
This is not a reason to stop doing it. We pretty much HAVE TO go through this phase, to learn their real usefulness.
I don't actually remember overhype of the internet (and I do predate it). I suspect it has exceeded almost all predictions made for it. Do you have examples to support your claim? (Simple, honest question... not trying to start an internet war. :-) ) (And I do agree with your general premise.)
@@peter9477 good question on that internet hype thing I think it fulfilled everything and more I also predate the internet myself. I think the second statement is the biggest truth to be said and is pretty significant. There is a lot of growing pains in all fields of development and some are more embarrassing than others but it's the fact they even tried it is what matters, as far as quantum goes though it has indeed gone too far.
It's better to try and fail than to not try and miss your chance.
The problem is that quantum computers will almost certainly be useless for decades just due to physics. Until someone can discover a room temperature superconductor and a way to stop particles in superposition from decohering (I don't believe this even possible), most people won't even be able to have a supercomputer, let alone use it, and even if a way is found, we have maybe a few algorithms that are actually useful, and creating more would require even more work and even more research.
@@peter9477 omg, you didn’t hear about the dotcom crash, which caused a smaller recession in the whole economy?
en.wikipedia.org/wiki/Dot-com_bubble
@@peter9477 the burst of the DotCom bubble in 2000 was the internet hype winter. Now we are definitely through.
This reminds me of cloud computing. I was a recent hire at a software company and inherited a project a recently leaving partner signed on. It seems he sold it to the clients by peppering the document with the word cloud with little to no actual meaning reflectig what the technology did. Instead, it appeared to be the cure all for every problem they had or could have. In the 5 page document specifying what our company would do, he had signed us up for a job that would have taken a team of experts years to do and me, a lowly junior, ended up as the lead developer with only three months left in the contract.
Suffice to say, it did not turn out well and the term cloud computing always leaves a bad taste in my mouth.
Seems that snake oil salesmen are making the same use out of Quantum technologies.
'Cloud computing' was always about controlling the product, and streaming revenue. You will own nothing and like it.
@@MyCatJeff not exactly. When cloud computing first came put, it became widely known as a term, but few people actually had any notion what it meant.
This sort of dynamic breeds a certain group of people to use such a term as a cure-all of any problem. The snake oil salesmen basically. Back then, if you had a problem, just had to litter the term cloud into the conversation a few times and mamy would automatically think they new some great fix for almost any problem. It was the same hype crypto had and similar junk related to it.
Cloud computing, when actually defined, I agree, wasn't anything great and just a play on how computers manage themselves. Still, back then it was more mystical that real.
“Superconducting and ionic qubits are equally bad” is the best compliment I have ever heard from you about QC. Thank you.
I think the most interesting thing about quantum computers is what we can learn from attempting to make them.
Learn about science, or human nature? XD
@@Mkoivuka Both, I would have thought!
Sounds a bit like "the friends we made along the way"?
Or, you know, a device that can revolutionise AI. Quantum computers will be useful, so long as the progress in scaling up the number of qubits and stability continues, as it has been continuing strongly.
You mean like Teflon was the only useful product from putting a man on the moon?
Hossenfelder is like the doctor who told me "it is not my job to give you hope", but with a different sense of humor.
haha, doctor goes brrrr
It sounds like the discussion I have with co-workers about their kids. They say to me, "I want my kids to have what I never did, don't you?" My answer is, "No, I don't care about that." It really rocks them when I tell them, "I'm not putting myself in charge of my kids' happiness, that's something they need to go out and find."
@@Skank_and_Gutterboy but wouldn’t you prefer to be a part of it rather then a distant observer? They are YOUR kids after all
German humour is not a "widerspruch in sich", it's loudly sarcastic but somehow subtle, like Nietzsche
You talked to your doctor? Guessing this must've been pre-pandemic
I so very much enjoy your presentations! They are informative, they flow very nicely, and are easily followed. But most of all, I truly enjoy your calm and WICKEDLY sharp sense of humor! It is always an education to watch your presentations, and always VERY entertaining. You're like a breath of fresh air in my day. Thank you so very much.
One of the wonderful things about watching you Sabine , and its the reason i was drawn to all your earlier work in other mediums , is the sense your fed up with the BS in modern physics. Love it, more please ! If you havnt already , please please cover the 'big rip' :)
I really appreciate your uncanny ability to blend knowledge with humor. Your videos are always excellent
I work in quantum sensing and metrology, and we've started making the headlines now as well. Guess I should be worried! 👀
You have only an indefinite probability of needing to be worried.
@@flagmichael hehhehehe funny nonsense. In fact it is precisely that: funny nonsense, because you mix different concepts with different transcendental conditions, hence creating a pseudo-paradox... which can be perceived as funny nonsense,. I prefer to drop the "funny".
Another ingredient is the claim that quantum theory explains "everything", which , not accessible for quantum theorists, includes complexity, language, and culture.
Unbelievable how people nowadays have so deep of an understanding of Quantum physics and technology on a microscopic level while just 250 Years ago people had to bite on a stick because someone had to cut their arm off when it was infected.
These days you get genitally mutilated for no reason instead.
Truly progressive times we live in right now...
She’s so flipping funny. My family keep asking me what I’m watching on my phone (headphones on) as I keep laughing out loud. Sabine has Fantastic, sharp-as-a-knife humour. More please!
Humor aside she is basically the only one who is exposing rotten, stinking mold that have been decomposing physics and stopping advancement in the last century.
delivered with a german style. i love her videos.
Right! And her delivery is so deadpan. hehe. She goes from the highly technical to humor and back, and if you had the sound turned off, you wouldn't detect it.
@smcg2490 my neighbors can hear me bust out laughing my @$$ off 🤣, their kinda scared of me now
The subtle humor/sarcasm your put out there and your delivery kills me every time 😂
fact. the fact of this kind of humour existing within this subject’s framework is already kinda absurdly ironic, but the fact that it’s been so tastefully executed in these videos just cracks me, i almost feel rick-rolled each time. a brilliant thing of its own kind
Yes, it's great to hear the science WITHOUT the hype!
I have worked on systems since 1974 and I have serious doubts about the usefulness of quantum computers and I have had them since I first heard about it and I haven't seen anything that has given me more confidence in them. Hype is all it is.
The girl at the right in the photo that was shown in 9:39 got me entangled away from Sabine´s explanation. I had to go on five times over the interval 9:39-9:57 to just barely understand what Sabine was saying (something about "crosstalk".) Closing my eyes was of little use, the image kept entangled in my mind.
I’ve just found your channel and what a delight it is to hear things explained in such a humble manner. Thank you for adding to the collective intelligence of our society
humble??? she's casually insulting extremely intelligent people and their projects and acting like a total know-it-all
@@SpeedOfThought1111 don’t be so soft , her sarcasm is part of the humor which allows others to be more receptive and keep gen Z non existing attention spans on her. If you believe she’s actually insulting them purposefully, I’m sorry your sense of humor is so dry.
@@SpeedOfThought1111 she is insulting them rightfully
Never again the title "science without the gobbledygook" seemed more relevant. Brutal, but honest and funny. Thank you Sabine.
I am getting rather obsessed with your channel and your hilarious humour, I must tell you. This should be a million+ subscriber channel. You'll get there, Sabine.
She has a distinct bite of linguistic sharpness that combines the candid command of language as exemplified in the writing of the late Scalia, the tenacity exemplified by the writing O'Connor, and the linguistic influences of the late Ginsberg.
I've been loving how much shade this physicist throws in her videos at pop culture stuff, keeps it grounded. Just cam across her videos and they've been great!
This woman is a gift to intelligent people with a sense of humor everywhere.
You nailed it! I worked in the supercomputer field, and spent time working with algorithms to solve quantum problems. I cannot see how the current quantum computer architectures can support any algorithms that compute, ab-initio, molecular structure.
In a quantum winter you can know the air temperature or the wind speed to arbitrary precision at some point in space-time but never both, but either way you freeze.
Enjoyed your book a lot. Refreshing to read something the cuts through the hype. However it does speak volumes about the positive state of physics that you can be this critical of paradigms without having your career destroyed! Wish the same could be said for other fields...
Hossenfelder is getting better and better. At her very best she's producing biting sarcasm and it stands very well to her voice and personality. This is how science news necessarily must be to attract intelligent people. I'm not afraid to admit it, I love you Hossenfelder.
We need more skeptical physicists like you. So many physicists are so wrapped up in their theories, they just can't see the reasons some say it won't work the way they say it will.
Isn't a "skeptical physicist" a pleonasm. At least it should be.
A lot of us are sceptical and aware of how much hype there is, but, the entire grant system is based on promising future results and dreaming up how they create profit. If you are honest there will be no grant money for your research. It's a similar issue like impact factor that is more a measure of a field's popularity than actual quality of published articles. Science has been corrupted by capitalism and society at large has no tolerance for complex problems and explanations.
dont get me wrong, i like her stuff usually, but this video is more like a rant about that others get more money for there research then she gets...
ofc she got some points in the video but at the end it sounds like a rant about why she dont get the money the others get
She's a physicist, clearly not a computer scientist to see the benefits of quantum computing. Matrix multiplication is used heavily in AI, and guess what can do that at ludicrous speeds? This physicist should stick to making science videos, not videos on computing she knows absolutely nothing about. Being an expert in one field clearly doesn't translate to anything. Of course, there is an initial hype phase of every new tech, but that eventually passes and then after that trough comes revolutionary tech. To say it's a bubble that's going to burst and nothing revolutionary will come out of it eventually, is complete bullshit.
@@mattl1250 Nice that you mention the other big bubble. There is exactly zero intelligence in AI so fits well here :D
I spend the majority of my time teaching composition and music theory. And although I love my work, I often have this sense that I am using only one small portion of my brain. When I watch your videos Sabine, I always feel like these other dormant parts of my mind are being awakened and asked to stretch their muscles. I feel like I am being encouraged to venture beyond what I know and feel comfortable with, into this marvelous world that is so foreign to me in many ways, making it all the more wondrous and awe inspiring. I feel like an adventurer discovering new lands, and this is such a wonderful feeling! Thank you Sabine, for being my guide into this extraordinary uncharted territory.
It's because she is challenging a lot of the conventional wisdom. She is asking you to think for yourself, rather than regurgitate what you are being taught. Perhaps you can find ways to do that with your music. I suspect that unlike Sabine with all the BS surrounding physics, you will find a deeper understanding of the intricacies and realities of your field and grasp the wisdom of your predecessors, but finding out that the giants of old were right and understanding why in new ways is its own fun and equally challenging. But then again, maybe music theory has gone down the crapper, too, and needs the same kind of sanity to be brought back down to Earth.
I am so amazed that you can now write for a 100-instrument orchestra.
Just as an FYI, the whole using a certain percentage of our brain is a myth.
It helps to think of our neurons as a binary system like computers; neuron not firing as 0 and firing as 1. The 0 is doing just as much as the 1.
It's obviously much more complex, but it helps illustrate the issue.
If we used all neurons simultaneously, we would have a stroke BTW ;).
I'm not saying you made this claim, I just got triggered into a soap box moment
@@coronnation8854 actually, a seizure
a stroke happens when the blood vessels supplying the brain are either ruptured or clotted
You're actually using all of your brain, all of the time.
I've been watching Sabine's channel for a while now, and her quirky, almost incongruous humour has really grown on me, not to mention the down-to-earth analysis of a lot of the hype being thrown around in the physics arena at the moment.
Sabine, you are my favorite Physicist AND my favorite German. You speak in real and fairly understandable terms. Please continue to bring us these mini-summations of Physics - you might save a lot of middle class people from investments in companies that have little chance of ever making a profit. This is a valuable thing you do.
She’s Swiss.
When I graduated from graduate school in 2014 with PhD in physics and went out looking for jobs, I found that almost all of them were somehow associated with the military. That is to say, an experimental physicist has a relatively easy time finding jobs building RADAR, missile guidance systems, etc. I was depressed. The military industrial complex that Eisenhower warned us about in 1961 is real and I felt it at that time in my life. I wound up getting a research position at Google, working in quantum computing. I am and will forever be grateful that while so many scientists -- who got into science to understand Nature and improve our lives -- are resigned to building weapons while being paid by taxpayers, I have the chance to work on something that might be useful for medicine and other beneficial fields and is paid for by internet ads.
honestly sabine should know better than to say some of this rubbish.... i have no doubt they will be useful, probably far sooner than that higgs bosom .... but hey no-one knows might be a few hundred years before the machines themselves are useful.... what you learn along the way will be useful and it all contributes this is how science works
respect to this field as when i learnt about some of the achievements i was like hog diggity, big factorisations proven!
@@DarkShroom Can you nestle yourself in a quantum computer? No! Higgs bosom on the other hand? You can never be sure what will be useful.
Yea I feel that, I’m currently getting my PhD in physics and I’m dreading heading into the workforce. I don’t want to work on military technology bc I hate the way the US uses its military. I’m thinking I might try to stay in academia and if it turns out I’m not that great of a researcher I’ll just go into being a high school teacher or community college professor. At least that way I can get some fulfillment from teaching. I don’t think I could be happy producing weapons or even going into the private sector in general since I’ll basically be working to make some fat capitalist richer.
It just sucks man I went into physics bc I loved the formalism and learning how the universe works on a fundamental level. I motivated myself through undergrad by watching lectures on cosmology, QFT, and GR and now I’m facing the grim reality that that’s probably not what I’m gonna get to work on after grad school and it’s really depressing. I’m glad to hear you found a job in physics you enjoy and I hope I can do the same
@@jongya Private sector can be very rewarding. Consider working in fusion power, wind/hydro power, new battery technology, biotech (think artificial replacement limbs with a brain-machine control interface, or whatever other cool technology catches your interest.
I appreciate your channel and and appreciate your sensible approach to problems
I love your videos Sabine! Very informative, easy to understand, and presented with a great sense of humour!
A bit withering but I love Sabine’s scepticism and passion for genuine scientific thought. It’s refreshing in its intellectual honesty. And the background research is so impressive.
I remember years ago when it was announced that scientists had built a device that could factor 15. Today I learned that they can now factor 21.
I expect they will be able to factor 91 by the year 2030!
Being able to cool things down to the millikelvin level with devices bought on the surplus market should open up some cool research opportunities.
AH a positive view on it I like it
this comment about nuclear fusion didn’t age well
Sabine keeping us all grounded whilst simultaneously blowing our minds, as usual.
An impressive achievement: 21=7X3! That's phenomenal! I'm an old software engineer, thus I've already seen something better 😀
Congratulations for the excellent video!
Excellent summary. I wrote a paper, to spite Scott Aaronson, called "On the (Im)possibility of Scalable Quantum Computing." Would very much like your thoughts. Regarding algorithms, you mentioned the very short list, but the key one so often mentioned is Shor's Algorithm. Many have mistakenly claimed that this has already been implemented on a quantum computer to factorize the number 21, but this is patently false, which you discuss. Thank you so much for clarifying this, and for your great video.
Thanks, Sabine, for this video. I am a computer scientist who has been skeptical for ages about quantum computing. It is a sad state of affairs that it drains money away from much more promising research.
Research like optical computing?
It's unfortunate, so much money; what happened to deliverables?
I've been in quantum computing since 2007 and I've seen steady progress the whole way.
@@danielsank2286 Like, in 2007 they could factorize 15=3x5 using Shor's algorithm, while now they can factorize 21=3x7? Note that the quantum computer they use for such a calculation is application-specific and input-specific. In other words, the quantum computer used to factorize 21 cannot be used to factorize 15. To make things worse, hardware complexity grows exponentially with respect to the number of bits used to represent the input.
@@SurfinScientist When I started in 2007, a good lifetime for a superconducting qubit was 600 ns. Now we see 100 times larger as a matter of routine. Factoring small numbers in algorithms tailored for those specific cases was a nice demonstration in the 2000's when the qubits couldn't live long enough to compute anything actually interesting, but these days we're aiming for full error corrected computation. I'm not sure what you mean about the hardware complexity growing exponentially with number of qubits, because actually in my career I'd say the hardware has overall gotten simpler, and the electronics, packaging, etc. remains relatively the same level of complexity as the number of bits increases.
Always love your videos Sabine, thank you so much for the thorough explanation!
First of all, thank you for presenting this in such a clear and non-BS way! I’m currently working “around” post-quantum cryptography and quantum networking (from a technology strategy and standards point of view, not as a researcher) a, and also worked in artificial intelligence (as a researcher) in the 1980s and early 1990s. I see a strong parallel between the current quantum bubble and the AI bubble of 1985-1990 or so. AI and Machine Learning recovered after a dry spell of a few years, but this was mostly because of Moore’s Law and the explosion of the internet making things like big data and ultra-fast tiny devices practical. As you point out, these solutions don’t apply to quantum technology, despite the fact that people are expecting them to. I’m near the end of my career, and it’s kind of sad that we seem to have fallen down this rathole - I would have preferred to have finished on a high note.
The cooling for the ion trap quantum computers is a technical issue that might be solved. The main reason it is used is to decrease the pressure below 10^-12 Torr so that your ions don't get knocked around by background gas.
Anyway - thanks for the great video!
This video concerns me, as a physics major who was just starting to be interested in quantum effects in condensed materials, especially because of their application to research in quantum computing ( I didn't believe the articles about how revolutionary computers are because they're clearly bogus, but it worries me that there might be no funding to research this interesting topic )
There will be funding in the future. Money pours in from all over. Government agencies, the EU, big corporations throw out billions. It will need many years to dry out.
There's basically three types of computers.
_Consumer_ stuff like desktops and smartphones. _Enterprise_ stuff that runs webservers and datacenters and businesses. _Supercomputers_ that do research or are designed to solve specific complex problems.
Quantum computing has no real application for consumer or enterprise machines. Too much added cost and complexity for too little added performance. So it's going to be the realm of expensive private supercomputer platforms - it's not going to magically revolutionize the world. At least not for everyone and not for free.
the funding will scale down to a more reasonable level in line with more sensible expectations and potential specialized applications. This doesn't mean that studying quantum mechanics or solid state physics is a dead end by any means.
A follow up on this topic would be great. Because at this point, 5 Months after this video we have 433 Qbits Quantum Computer, Qiskit that can be used in combination with python and upcoming, the first prototype of a quantum software application. I don't understand the possible obstacles or limits of this technologie, but I can see progress. And we also have to remember that every major breakthrough was ridiculed at the beginning and many times called out to be impossible. I also would like to know what your approach would be? Wait until we could build a Q Computer right away with millions of QBits, Q-OS and Q-Software? Isn't research and experiments the best way to achieve ones goal and make progress?
Sabine is correct about the current technical shortcomings of quantum computer DESIGNS, but when it comes the emerging quantum computing INDUSTRY, she is just flat out wrong, and continues to present data and info that is about 2-3 years outdated. The time to build important manufacturing relationships and infrastructure, as well as relationships with other companies, along with governments (and their militaries), is something that starts during the end phase of coming to a viable design. That end phase is just beginning. A more mature mass-adoption phase is admittedly is more than a few years off, but it's not that far off. And the quantum wave that people speak of is not necessarily the actual tech, but what that tech represents to investors. For instance, you don't wait for computers to become massively popular in the '90s before buying shares. You speculate and buy tons of shares in the mid '80s when they are dirt cheap. If you wait until 2030 or so to start investing in quantum technology, you will be about 7 years too late. It's not a matter of if quantum computing will dominate the world, but when. Don't fall for pessimists who say it isn't coming, or that it's not going to be viable until 2045 or something. Technology moves much faster now than it did even 20 years ago. Also, the market is already in a "winter," or bear market, due to Covid, the war in Ukraine and mega-inflation ... so if you think an investing opportunity is going to get any better, you are sorely mistaken. Shares are already at a 50-70% discount.
Good Lord, do I love the snark from Sabine!!! This kind of reminds me of the hype behind the revival of 3-D and VR to be honest, and I suspect it'll pan out the same way: outside of specific applications its impractical at current tech levels. Unlike VR and 3-D (where the primary application is entertainment which chases the lowest common denominator), I think there are enough diverse applications for Quantum Computing for the tech to continue developing when the bubble pops.
Also, from an algorithmic standpoint it doesn't matter if an individual algorithm is of use or not. All algorithms fall into certain categories of complexity and approaches to solving them. There are only a handful of distinct approaches that are modified and repurposed for many use cases. If you, for instance, could find a way to solve sudoku puzzles in polynomial time, it would only take a matter of a few years to have several approaches for solving protein folding in polynomial time published.
The issue is that quantum computers do not allow you to solve NP complete problems in polynomial time. The algorithms we know of so far only apply to a few specialty problems.
@@peterisawesomeplease you kind of missed the point I was trying to make there.
@@peterisawesomeplease Yes, but since all NP complete problems are solvable if one is in polynomial time, he is saying that once one is solved in polynomial time, the whole class of problems will be solvable in polynomial time.
@@Eric-jh5mp Yes that would be true if there was an algorithm to solve a NP complete problem in polynomial time on a quantum computer. But there isn't one that we know of.
@@sccur Maybe. My point is that even if solving one problem often leads to many being solved in computer science it doesn't mean that any of the quantum algorithms are particularly useful even when considering this point.
Makes me think of the transition in classical computers from vacuum tube to the transistor and then integrated circuit. It was hardware refined for binary software. It will be interesting to see how hardware design changes to the specific nature of quantum software.
That is a tempting comparsion but it's apples and oranges. I think the the fission to fusion energy is closer.
@@shrimpflea I don't think I quite see that comparison. I'm talking about the relationship between hardware development to efficiently use binary software code. Vacuum tubes were used as on/off switches (binary), which were shrunk into a solid state transistor switches. It did away with unnecessary components to make the switching occur (like heating elements, glass chambers, and a vacuum). I don't know that the hardware containing a fission process has changed much at all since it's inception (and ironically, it's still basically just a complex steam turbine engine), and fusion has never actually been used to power anything, (despite promises to the contrary for 30-40 years).
I don't believe there will be any change in hardware specifically determined by quantum computing. Quantum computing, provided it will indeed take off, will be very niche, and as such hardware will be developed very specifically for the things it's good at, leaving the rest of the hardware untouched.
Cloud computing because the idea of everyone having to supercool their personal computers or having a mini vacuum chamber installed is ridiculous. We're going back to the pre-PC days with quantum computing, but the benefits will be made available to the masses via the internet and cloud computing, and the IC technology which allows PC's the be small isn't going away.
@@shrimpflea Fission and fusion is not an apples and oranges comparison? If you want an apples to apples comparison, you'd want to compare between two different fission nuclear power plants. Fusion technology has not matured to where there is an industry standard, and an "apples to apples" comparison can be made between different approaches.
I also don't like the term "apples and oranges", because it begs the question: when is it acceptable to compare two different fruit and why? Can you compare an apple to a pear or an orange to a lemon? What about a a red apple to a green apple? Do the apples being compared need to come from the same tree? What exactly are we comparing between the fruits? If we are discussing the evolutionary tree of apples and oranges and their common ancestor, THEN can we compare them?
I almost never hear the problem mentioned that most quantum processes are -- essentially -- random in nature.
For calculations (e.g. the theoretical paths of photons) you have to add huge numbers of "possible" pathways and find the average.
So ... quantum computers will have a VERY high ERROR RATE!
Thus, binary supercomputers will be much more reliable when definitive "answers" are needed!
Sometimes I feel like that the likelihood of quantum computers being used by everyday people is like making human teleportation happen from the current "quantum teleportation".
Quantum computers were never supposed to be used by everyday people. Their use is limited to very specific areas of research and development. Even if we had a perfectly functioning and scalable quantum computer the size of a normal laptop, normal people would still not use them. A normal computer is much more versatile and useful for anything an average consumer would use a computer for.
That is not to say a quantum computer is useless, but it's uses are limited, and mostly interesting for research in specific areas.
@@danimyte3021 Agreed. This is why the scenario of wide scale business usage is a fiction. Quantum computing startups might work for specific areas, but ordinary business people will never need nor will use them, at least in the near future. It will stay the tool of specific research areas and fields both needing high computation capacity and having the ability to compensate its higher costs.
Do quantum energy producers, recently discovered by humans, wish to be domesticated, corralled in a computer, forced to run in circles so that their energy can be redirected and harvested to improve human memory and computational faculties?
I'm finding the prospect of the unexpected, magnetism-based computing more interesting with more potential of being here sooner, more widespread, with wider applications frankly. Edit: Also teleportation is death. Star Trek had many great ideas, but once you look into what teleportation actually is.. you die and get replaced by a replica. No thanks I'd rather take space shuttles to planets
@@danimyte3021 That is oddly similar to what in the early 1940s, IBM's president, Thomas J Watson, reputedly said: "I think there is a world market for about five computers." 🤔
Einstein: predictions are always difficult. Especially if they concern the future. Might still go the same path as nuclear fusion, with success always 10 years off.
I'm no expert, but my partner is. The scariest headline to me is the factoring of numbers, breaking rsa in subexponential time. It'll probably take a while to do it, but is that really infeasible? I think nist is still looking for something quantum safe to replace rsa.
Fascinating. Especially the factorisation of 21, and they had to make it simpler! Well I do think its impressive quantum computers can be built at all, but you really helped that bubble burst before it got any more big and dangerous. It seems to me that optical computers might be way better for most tasks. An optical transistor combined with the ability to compute multiply and add ops for neural networks in.a massively parallel way would be way more useful
yes even lcd, fluidics, 2d non dispersive non dissipative circuits, mabye graphene layers, or surface based , 2d logic gates are possible in 2 with accurate timers.. solitons, in 1d, advanced crystal nano fluids and such material sciences are more promising and show results. cheap and dont heat up nor need cooling. even a 8 cm wire has a massive voltage drop, these have no resistance so if they are bigger but flatter, and doent make 400 watts of heat, its great.
13:30 There was a very prominent astronomer who once said "We will never be able to know the chemical compositions of stars or interstellar clouds.
His thinking was, the only way to know the chemical composition of something would be to analyze a sample of it. And he was probably correct in thinking we will never be able to take a sample of star stuff. But even when this scientist said this, spectroscopy had already been invented, though it was such a shot time after this that he likely had not heard of it.
My point is, when a scientist says something will never be possible, they are very likely wrong.
I'm at the end of my studies to become an engineer and a researcher in quantum physics, and I hope I will benefit from the end of this bubble, and hopefully some nice results will appear from all this research that will have real applications that benefit society and knowledge about the universe.
You sound like me when I was doing my dissertation in autonomous vehicles with deep reinforcement learning, imitation learning and transfer learning. My advice is to get some hard skills from your research and don't hope for magical sci-fi fantasy application of it
Another bubble that will burst in the next few years will be FSD - Full Self Driving.
The programmers working hard on vector spaces and all modes of computer learning are quickly learning that the human mind is much more diversified and complex than what they originally thought, and cannot be emulated with the current hardware and learning theories...
Don't listen to her, quantum computing will be one of the most important inventions this century. I hope you know how much it will revolutionise AI eventually: for example quantum systems can be used for matrix multiplication, an operation that's repeated millions/billions+ times to train an AI model. Imagine the complexity of an AI trained making use of this operation, it'd run rings around traditional hardware.
@@mattl1250 Has quantum superiority been proven for matrix multiplication ?
@@benj6964 no, and it will likely not have a real world use for matrix multiplication for a long, long, long time. Matrix multiplication is highly parallelizable, and GPUs are incredibly good at their computation. Even with a hypothetically faster quantum algorithm, the pure power we can commit to matrix multiplication will likely dwarf any computational efficiencies.
I love your content, especially the honesty. It's easy for a lay person like me to not understand when people are just trying to Hype up something that may not be as it seems. Your video has helped me know what the actual cases
There is commercial hype and then there is scientific hype. Don't confuse the two of them. Commercial hype about possible technological breakthroughs tend to inflate and collapse very easily. In 1822 Charles Babbage, the father of the computer, came up with a design for a mechanical computer that he called the difference engine. The English government funded him to make his design a reality but the metalworking technology at the time was too primitive. After 20 years the government gave up and considered the investment a complete failure. It wasn't until the invention of electronic circuits in the 20th century that computers exploded and became a part of everyday life. The theoretical possibilities of computers in 1822 looked just as promising as the theoretical promises of quantum computers look today. We may currently not have the technological sophistication to make quantum computers commercially profitable, but this should not deter us from pursuing further research and development into this fascinating technology.
I suspect all those physicists who leave the quantum computing field when it collapses will turn up working at banks on Wall Street as “quants”. They will help to bring about the next financial collapse by developing yet another exciting new “vehicle” that no one understands, but will be vastly over-hyped. It’s amazing how transferable these skills are.
1) why have a position in the market, when you can have a superposition.
2) Hedge funds? No. Entangled asset distributions and phase factors on stock prices.
haha... I'm pretty sure that it's the one profession where they think they're at the top of food chain and will be able to do anything successfully whether it's science, engineering, youtube, maybe even modeling and porn.
I love your unintended double-entendres (sic), at 4:50 you talk about the quantum winter and then proceed with "let me summarise", which could be heard as "let me summer-ise"! Get it? I am sure you do.
I work in quantum technologies, and I follow closely the development of those technologies for the next 10 years. Will there be a quantum winter? Yes, but only because the investors hype will drop, not because the technology is not promising. And this winter will probably only impact quantum computing, not quantum sensing, metrology, cryptography or communications.
I have a technical background, and work with a lot of researchers. I realized that technical people lack this "vision" of what's to come, they only see the things as they are right now. If you look at the first electronic computers, there were a lot of haters because it only did the same calculations as an abacus, but with a lot of costly and extra steps. We now know it is way more powerful than an abacus because we see all the useful applications.
It's the same for a quantum computer - because you don't see applications right now doesn't mean there are none, it simply means you cannot see them.
This “vision” is a lot of hype, and comparison with electronic computers is tiresome and even the first old radio-tubes based computers actually performed useful work, could do a lot more than “abacus”.
then show us those applications or go fish
I see Sabine conveniently left out Xanadu's photonic computer with 216 qubits completely at room temperature... They even provide public access to it.
top content, thank you so much for keeping expectations grounded Sabine! any chance we can get some podcasts please ? :D
I really wish that those who are throwing tax payers money at quantum computing watched Sabine's channel, but I am afraid they are rushing into quantum winter not wanting to listen to the nay sayers.
@@adambarker3130 We have this money, we have to spend it! And while none of us understand this, it sounds exciting!
I'd love to hear Sabine off script
Could you do a piece of what Nu Quantum, from Cambridge, is doing with photonics for QC and QN? They claim to be working on technology that uses photonics to efficiently scale quantum computers.
I love your objectivity ( especially when it's pessimistic) as well as your humor! :)
I’m a computer engineering student, and I understand that maybe IBM won’t have a use for their cryogenic chamber if the bubble does indeed burst, but isn’t it better to leave no stone unturned in the search for our evolution as a species? I say this cause if something is important enough, we should do it even if the chances are slim that we will accomplish our goal that being said you’re really funny 😄
Its about Ressource allication of smart people and Research funding
What Yolo Swaggins said!
Misallocation of resources can be extremely detrimental.
well when the bubble pops, there will be plenty of second hand dilution refrigerators to go around, which i am the most excited about, since the stuff is expensive as hell but very usefull in superfluid research
If we had infinite resources, then sure. But if we had infinite resources, we’d have much bigger things to worry about.
My super-position on the subject is this: I favour a bit of quantum computing, but I try not to get too entangled with it.
Thank you for the clarification without the hype. I really appreciate it.
Always a joy to listen to Sabine put into words what I’ve long believed.
Really? True or not, the pessimism is acidic to taste.
I remember when the advent of quantum computing had everyone in a panic about how all data that was encrypted would suddenly be able to be instantly cracked
From what you've described here, I feel like the reality of something like that occurring is getting slimmer and slimmer
And when it does occur they'll make sure not to inform you until they have all your datas. Maybe they all just have cracked it. You wouldn't need to be informed. They can do it without your help.
That would require quantum computers to show some success in number-theoretic problems (e.g. integer factorization). So far, the progress on that front has been precisely zero.
I want that to happen mostly because it would be a consequence of a quantum computer solving either the Riemann zeta hypothesis or whether P equals or not NP. But yeah it won't happen overnight.
Thank you, Sabine, for injecting a dose of realism into a world of ridiculous hype.
You became my favorite channel. RUclips needs more content like yours, real and updated. Every one of the words you say is interesting. Thank you.
Thank you for a sincere honest commentary on this popular subject.
Her subtle humor is next level 🤣
I have the same type of humor but at a kindergarten education level. To hear it with her genius is fantastic.
2:20
You are always a breath of fresh air in the suffocating fog of hypes that is the internet.
I like Your humor woman. I like the links to free books too. I enjoy learning from You 🤠 Thanks!
While I agree that it is overhyped, I think that there is still value in the research. Same with FTL -- I am quite sure it will never be achieved, but chasing it has uncovered some really cool and potential viable (and valuable) side concepts :)
I remember giving a university seminar on quantum computing more than 20 years ago. In the absence of any implementations, I lost interest and only regained it (temporarily) when IBM made its announcement. I was sure that if IBM was announcing product, there must be something there. But I was wrong, although actual functioning hardware existed, little or nothing conceptual had changed. When I was in university, I worked with a couple of hybrid machines - IBM 1620/1710 and 360/44 - which had analog computers as co-processors. The idea was that the analog processors could perform simulations of dynamical systems faster than their digital counterparts. Of course faster digital machines have long since obsoleted these systems, but present quantum computing organized in the same way as they were and may have an analogous role to fill (intentional). Of note, Dr. Roger Penrose has proposed that the human brain may include quantum elements which allow non-computable results. However, I came to the conclusion that IBM's objective was primarily to regain its role as a technology leader and to have something new to talk to its customers about.
Molecule biology works in scale of hundreds atoms (protein) so maybe some of protein function comes from quantum relations?
Or maybe it is still limited only to electron interaction level. We still don't have good models in electron cloud interactions, so who knows.
Waiting for IBM Block-chain quantum AI solution. 😏
exactly. Regarding Penrose: he is right and wrong, not understanding the actual ontological source of the "quantum" as anyone else.
I used a 1620, but there wasn't an analog coprocessor. How did it hook up to the digital? I don't remember anything in the documentation.
@@peterbonucci9661 That's the 1710 part.
en.wikipedia.org/wiki/IBM_1710
which provided the A/D interface. The analog computer itself was built and maintained by faculty. As a undergrad, I never used it. It was physically separate in the next room with a window.
I always feel like the pursuit of quantum computing will teach us about edge cases related to quantum mechanics, some of which may provide fruit. After hearing your video, I feel like we should focus more on length of time staying coherent so we can at least play with the system. It's probably 20 years out along with fusion reactors and artificial general intelligence.
Fusion has been "20 yrs out" for most of my entire life (now pushing 50). Artificial General Intelligence much less so and, to be fair, is probably much more achievable than fusion given that there's a couple hundred billion working examples of AGI roaming the earth already.
With shor's algorithm being a thing I don't think gov funding is going to stop until someone can prove quantum computers are unachievable. The fear of another country getting ahead and being able to break encryption is too strong a motivator
It is public (Asymmetric) key encription that can be broken. Private (Symmetric) key encryption, once you have set it up, perhaps by passing pieces of paper around, cannot be broken.
@@alanjenkins1508 also not all asymmetric encryption schemes can be broken by Shor's algorithm, only the ones which rely on the infeasibility of factoring large primes, such as RSA.
Most encryption has moved to algorithms that aren't susceptible to Shor's algorithm. And there are no quantum computers that can run it. If you have heard hype about such and such a large number being factored, they didn't use Shor, they effectively told the computer the answer in advance.
@@fluffysheap this is not true. The most common (asymmetrical) encryption nowdays is based on elliptic curves, which is susceptible to Shor's algorithm.
Like doctor Strangelove
I was involved in quantum computer research since the beginning. The difficulty of the task seemed apparent. Yet great optimism grew from the advent of quantum error correction. The problem I saw is that the quantum error correction overhead has atrocious scaling. The question I'm still asking is whether the qubit count represents physical or logical qubits? For example, if the physical qubit count reaches 1 million, there may only be thousands of logical qubits due to error correction overhead...
Error correction helps but does not address issues.
Hello. I think it is good that you explain so many different fields and branches of physics and I am really feeling like I am learning a lot. However, I sometimes get demotivated when you show all the downsides of different fields and how they do not seem useful for society. My request is, could you make some hopeful content or recommendations especially for people considering a career in physics for interesting and meaningful fields. It really brings me down to see how much "useless" research is being done in fields that sound so interesting and I do not want to fall for them. I hope you can recommend some alternatives some time. Thank you.
Study biomolecular systems.
In a way, she is giving motivation. Just like how dark humor can be therapeutic for a individual; being able to be self aware enough to realize when certain things get off track and it can motivate you/anyone to attempt to look at these things and try and find a new perspective/approach to solve a problem. The best inventors, philosophers, artists seem to be able to be comfortable with facing things and learning to harness the ability of how you observe things as a tool and through perspective experimenting, you can sometimes stumble upon new methods/manner's on how to solve problems in ways that others didn't notice/see/or understand. (Just like how Einstein had the ability to get lost in hypothetical thought and imagine different perspectives and then it stumbled upon his path towards learning about light/energy/physics etc.)
That came from Einstein taking a category that previously was in a state of limbo, but he was willing to look at it and see if he could find anything else out about it.
So if you get demotivated, or down, try and remember that even things we think we fully understand; have the total possibility that you can look at it/observe it in a new light that others haven't and totally find a new layer of complexity or depth to a topic. So even if we are faced with a lot of difficult situations, roadblock's, the best thing is the universe is full of amazing intracity so the chance of further understanding the world around us is profoundly possible and to me that's very motivational and Nature itself some how seems to have a way of always keeping that inner child like curiosity alive with-in my Life.
There are many research fields in physics which do have modern day applications, but they usually require you to move slightly away from particle and theoretical physics. Condensed matter physics for example is a very hot research field with many applications ( for example, material science, semi-conductors, super conductors). There are many other fields as well such as for example space physics, medicinal and biological physics and solar physics.
You're looking at it the wrong way. There is nothing wrong with "useless" research, knowledge for knowledge sake is still valuable to society just not on a day to day basis necessarily. The downsides come from the fact you need funding to do your research, so you need to play the hype game in order to get money for your research. But that shouldn't stop you from pursuing a field you genuinely feel passionate about, just make sure you know what you're getting yourself into.
Try getting into materials science. Lots of potential there, a good deal of it on solid foundations. Condensed matter physics is the field you're looking for.
Wow. Very engaging. Glad to have found Sabine.
I might be wrong. That is the most scientific statement that has ever been made. Until we recognize that we don't know, we cannot learn.
Somebody had to say it! Knowledge both aids in the search for more and prevents us from recognizing what we thought could never happen.
May I propose a new word for quantum hype - qubris?
There's also been a lot of hype about analog computing in mixed-domain applications, and some of that looks promising. I wonder if there may be a future for digital/analog/quantum hybrid computers.
Qbits are analogous memory. This analogousness is also the problem. They will never make it 100% accurate. When many qbits are used, error will be exponential to qbits. They will never make working large QC.
In the early part of the 20th century, analog computers had the edge over digital ones in physical-simulation problems in being a lot faster, albeit with limited precision.
I see history repeating itself, with “quantum” computers being the new-style analog computers. You never see them demonstrate any success with number-theoretic problems, for example, as would be needed for cracking encryption (what happened with Shor’s algorithm?). Just about all the success stories I hear about have to do with physical-simulation problems.
This was super illuminating. I have watched about 10 to 15 youtube explanations of quantum computing, trying to find one that would give me a real world example of how it would do things better. Of course they mentioned pharmaceutical applications and other high-level things like that. But never a down to earth example that I could understand being a non-scientist. So Sabrina explain things perfectly, they have not figured out anything to do with quantum computing yet other than then to say things can be computed faster and then again only on a targeted limited set of problems with a super-small set of algorithims? Amazing, thx so muc Sabrina for cutting theough the hype!