Amazing video! I have a question about Professor Moriarty's explanation of how the semiconductor industry is able to create such precise patterns on transistors. When the two offset masks are placed over the silicon wafer and light is shone, how is it that the light is able to deterministically etch a pattern? Why would it not behave like a wave/particle in the double slit experiment and diffract into a probabilistic wave pattern on the wafer?
With feature size shrink, we're already running into really tough problems. Notice how there have not been a lot of processor design improvements (pipelining is one example of such an improvement), and how designers have resorted instead to parallelization (multiple cores on a single die) and increasing clock frequency. Plus we're starting to see cases where feature sizes are so small that operating one component starts to affect the operation of adjacent components (e.g. rowhammer in memory chips). 14nm may be the practical limit for feature size.
Funny thing is, average clock frequency is actually decreasing, mainly because of the popularity of notebooks. Power consumption and heat output rise faster than clock frequency, so it's more power-friendly to have slower parallel circuits instead of one fast circuit. There are very few algorithms which can't be parallelized; circuit development simply mirrors the demand. Even with that in mind, microchips are still the fastest computing machines we can build today. For example, a single consumer-grade CPU is more powerful than all quantum computers built to date combined, and those quantum computers are all built to perform specific tasks. Biocomputers have only recently been built as a proof of concept.
Intel started 10nm transistor production a couple of weeks ago; their 10nm CPUs are coming out next year. IBM said they managed to make a 7nm transistor, so we will for sure go at least to 7nm. Below that we'll have to see, since quantum effects start to dominate there.
Nadir Jofas , I'm referring to the pursuit of more processing per unit of time. At first, chip designers thought of things like pipelining, branch prediction, and so forth, and physical/fabrication improvements to allow clocking the parts faster. For a relatively long time now, there hasn't been any improvement in that sort of design, so now parallelization seems the only practical avenue left. But as others point out, for some problems that doesn't help because they're not intrinsically able to be parallelized. Still, for other problems/workloads, we're trying die shrink to get more cores in a single package.
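The power-vs-clock point a few comments up can be sketched with the rough dynamic-power relation P = C * V^2 * f, assuming (as a simplification, not a claim from the video) that supply voltage has to scale roughly with clock frequency:

```python
# Rough sketch: why two slow cores can beat one fast core on power.
# Dynamic power is roughly P = C * V^2 * f, and supply voltage V must
# rise roughly in proportion to clock frequency f, so P grows like f^3.

def dynamic_power(f_ghz, c=1.0, v_per_ghz=1.0):
    """Relative dynamic power for a core clocked at f_ghz (arbitrary units)."""
    v = v_per_ghz * f_ghz          # simplifying assumption: V scales with f
    return c * v * v * f_ghz       # P = C * V^2 * f

one_fast = dynamic_power(2.0)      # one core at 2 GHz
two_slow = 2 * dynamic_power(1.0)  # two cores at 1 GHz: same total throughput,
                                   # if the workload parallelizes
print(one_fast / two_slow)         # the single fast core burns ~4x the power
```

Under these assumptions the single 2 GHz core draws four times the power of two 1 GHz cores with the same nominal throughput, which is the economics behind the shift to multicore in laptops.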
There are metal layers on top of the transistors. E.g. Intel's 45nm process has 9 metal layers. You have always needed room for these, ever since the invention of the microchip. So chips were never flat.
Yeah, I know that. That's not what I meant. More along the lines of what memory chips have been going through lately, with 32- or 48-layer stacking, except for CPUs. TSV, etc.
This was a really good video, lots of detail. Could you do a video about error checking/handling in chips (CPUs)? I read a few years ago that now that CPUs are becoming smaller, there are more errors in the calculations. I would love to know how manufacturers get around the errors.
"You don't understand it, you just get used to it"
Probably one of the best physics quotes ever
Yeah that is actually very profound , it keeps you going.
It sounds like the food that's served in US prisons and jails!
John von Neumann was the first one to say it. Give him the credit.
"You don't understand it, you get used to it" sooo relatable
Yep
females xD
Isn't it like that for everything we "know"? Classical physics is just as strange, we are just used to it.
I don't see why it's so common to have a problem understanding quantum physics.
Absolutely. I think our limited perception and understanding is a huge barrier to our learning process. We need to start disregarding these impedements, such as that our ubiquitous reality doesn't match some of our scientific observations or otherwise we won't advance. If it was obvious, it wouldn't be exciting. "If it's easy, it's not worth it".
My favorite connection from physics to electronics, is the fact that quantum tunnelling effects are at the heart of how flash memory and EPROMS work.
Phil, aren't you too large to call yourself a nanoscientist?
Dad, I told you not to make these jokes in RUclips comments!
it's okay, he identifies as a nanoscientist!
cissized pig
He was referring to little phil... downstairs.
If he's a nanoscientist, I hope I never meet even a microscientist, let alone a full scientist!
I don't always watch computerphile (over my head) but you could put Phil Moriarty in a video about paint drying and I would watch it........oh wait.
More of this guy please.
I want more of this guy. He could start his own channel and just talk and I think he'd have thousands of subscribers;)
He has his own channel. Called Phillip Moriarty.
He does have his own channel! Moriarty2112, or you could follow Sixty Symbols where he has many videos about physics!
You should watch Sixty Symbols, he features in literally several dozen videos there.
+Phi6er Aww really? I liked this guy. :/
seconded
This was a really informative video. Some illustrations/animations to visualize what he said would have made the video even better, although I understand that they take quite some time to make.
Check out en.wikipedia.org/wiki/Photolithography
i agree. His experience as a professor really shines through.
I'm not sure if there's a definite correlation, but all the physics professors I had in college were the best teachers I ever had. Along with their ability to explain things they were passionate/excited about their field
This is one of my all time favorite videos on RUclips! I have watched this video about 7 times now and I just absolutely love how well Mr. Moriarty explains semiconductor transistor manufacturing.
Being a chemist, having just listened to a physicist talk about quantum mechanics for the purpose of computing, I realized that the electron couldn't care less about how it's manipulated and by whom.
It seemed like the video ended while he was still explaining something.
Prof. Moriarty actually never stops talking. The best you can do is turn off the camera just as he changes topics.
He was about to unify classical and quantum physics
He was about to show his pee pee
Why don't they just download more RAM into the electron beam to make it go faster?
I'll create a GUI interface using Visual Basic to see if I can track down an IP address for the download.
CSI?
Usual Hollywood hacker nonsense, that particular excerpt is from CSI:NY.
Justin Bell I thought so
"we need to hack faster !!!"... 3 people typing at once... on the same keyboard
That was a brilliantly clear and energetic overview of modern chipmaking. Professor Moriarty explained how a silicon transistor works, but didn't label it as such. The silicon substrate is formed into transistors by adding impurities (doping).
Prof Moriarty is one of my favorite guests on any of the “phile” videos. Awesome guy and very good at breaking things down to a level I can understand.
I love how excited he gets to answer each question and you can tell it’s genuine too
The energy of his explanations... HE should be duplicated for kindergarten, school, high school... NOBODY can be as energized and curious as he is in the way he talks and explains things! His energy with language would make me a damn poet!!
I really admire Phil Moriarty's ability to talk through concepts like quantum tunneling and keep his explanation on track even with tangential questions.
Very good speaker, very well explained, and engaging. This really helped me understand exactly what the subject was about. I'd love to have this guy as a teacher.
hi there camera man!
Hello!
Was the camera man sitting on a basketball?
not enough giggles for that to be the case.
And this is why movies tend not to use real mirrors!
movies use real mirrors, they just don't have the camera face on with the mirror
I usually understand just 25% of these talks but I love this channel and will keep coming back to it again and again. Thank you for this!
I don't know why but he reminds me of Roy from The IT Crowd
Just of course more intelligent than, "Hello IT, have you tried turning it off and on again?"
Similar accent
He's also got the Roy-esque quality of talking about intricate stuff in a non-jargon way
Was about to comment this, brilliant.
...and gestures.
I was wondering who he reminded me of!
2019 update: 5nm in the works, 7nm in production (AMD Ryzen 3000 series, e.g.)
So if 14nm = 50 atoms, 7nm = 25 atoms, 5nm = 17 atoms. Getting there
3nm in lab
yet they are too inefficient
That's not the case. "5nm" isn't exactly 5nm; it doesn't represent any geometric feature of the transistor. It's just the technology's name. You can call it "marketing". The real limit for transistor feature geometry is around 7nm, nothing more than that.
For silicon ^^
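The atoms-across arithmetic in this thread checks out if you assume an atom-to-atom spacing of about 0.28 nm (a round number; silicon's actual Si-Si bond length is roughly 0.235 nm):

```python
# Quick check of the atoms-per-feature estimates above, assuming an
# atom-to-atom spacing of ~0.28 nm. This is a deliberately round number
# in the right ballpark for silicon, not an exact lattice figure.
ATOM_SPACING_NM = 0.28

for node_nm in (14, 7, 5):
    atoms = node_nm / ATOM_SPACING_NM
    print(f"{node_nm} nm is about {atoms:.0f} atoms across")
# 14 nm -> ~50 atoms, 7 nm -> ~25 atoms, 5 nm -> ~18 atoms
```

With that spacing the 14nm and 7nm figures match the comment exactly, and 5nm comes out at 17-18 atoms, so the estimate of 17 is in the right range.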
Moriarty is one of my favorite science communicators
I once brought up to a chemistry instructor the fact that chemistry and physics, and ultimately everything that follows, are a seamless whole just divided into digestible parts, and he almost flipped his lid. It was almost the same reaction from the physics department. Yet they worked together constantly on things, though the chemists tended to be more reserved and the physicists tended to let their reservations go a bit, especially on things that went "boom". Nice to hear the Professor say nearly the same thing about the relationship.
God, I love how professor Moriarty explains stuff !
You said the wavelength is a limit, but the Nobel Prize last year was for the invention of microscopes which overcome this wavelength barrier. I think they used the light emitted by proteins and blocked the light emitted by neighbouring proteins so that the resolution was down to one protein. There might be a way to use that for making smaller chips.
this was incredible to watch. the passion and conviction he showed was amazing
Watching this in 2019, they are now manufacturing 7nm microprocessors, how things move on.
Intel's 14nm process has 8nm-wide fins in its FinFET transistors; I think they are making features a bit smaller than 7nm in the absolute sense.
And with what he calls extreme UV (EUV)
@SuperTanner But how can they go smaller than atoms?
And tomorrow morning a 5nm machine’s getting delivered to my home.
Let’s come back in a year, see where we’re at
we are in 5nm stage now
I can never get enough of Professor Moriarty. Such a fantastic and interesting person!
You should have seen him in Sherlock Holmes
Wow, that was barely within what I could follow all the way through. The few times I started to get lost he stopped and explained it a bit more. Very well done, and I even picked up a few new things.
To elaborate on the "Layers" question, yes, it's very much done in layers. In fact, even the first layer wasn't fully described here. After the exposed (or unexposed, depending on the process) areas of the polymer are washed away, another layer of some material is applied. Then the places where the polymer remained are washed away, leaving the new material only in the gaps. The material can be dopants for the underlying silicon, metal layers to connect components, insulating layers to separate things, etc. It can even be exposed to etchants, rather than a new material, to remove whatever layer is showing.
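The lift-off sequence described above can be sketched as a toy model on a 1-D strip of wafer (hypothetical names, not a real process simulator):

```python
# Toy sketch of the lift-off patterning described above, on a 1-D strip
# of wafer. True in `resist_pattern` means the photoresist polymer
# survived development at that position.
def lift_off(resist_pattern, material):
    """Deposit `material` everywhere, then wash the resist away: the
    material remains only where there was NO resist (the gaps)."""
    return [material if not resist else None for resist in resist_pattern]

resist_pattern = [True, True, False, False, True, False, True]
layer = lift_off(resist_pattern, "metal")
print(layer)  # [None, None, 'metal', 'metal', None, 'metal', None]
```

The `material` string stands in for whatever that step applies: dopants, metal interconnect, insulator, or (with the logic inverted) an etchant removing the exposed layer.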
One of my favorite Computerphile videos. Great explanation of concepts I've always wanted to understand. Thank you!
I love how you can see how passionate this man is about what he does.
Three years ago they were talking about 14 nm, today we're talking about 5 nm; so this video becomes history in less time than it would take to study Electronics.
Unfortunately "5nm" and "7nm" are really brand names rather than actual feature sizes; they are usually much-refined 10nm-class processes that achieve higher densities by eliminating issues with the previous 10nm and 14nm processes.
Essentially we are right up at the limits of manipulable sizes when it comes to computing. Most improvement now comes from extremely complex, well-designed architecture. Cache, for instance, is what AMD rides on for its superior performance of late with the 5000 and coming 6000 series chips.
@@aravindpallippara1577 You need to search before replying, and address what was written.
Plants are being constructed in Arizona, Tainan, etc. with 2 nm coming in 2025 - call it creative naming or fudging on the numbers - each new plant builds a smaller process.
People don't invest, and companies don't spend over a hundred billion dollars, just to convince skeptics.
Instead the money is spent to place billions of transistors in the space previously occupied by one transistor, decades ago.
They really are getting smaller.
This is probably my favorite video on the channel.
This guy is magic, and clearly loves his stuff. Excellent.
Please, more on the physics of computer hardware! There have been so many amazing inventions and discoveries over the years in the semiconductor industry so we can use computers as we know them...
This is a great effing interview.. awesome enthusiasm and passion.
My "like" happened at 7:05. The "quantum computing" question wasn't very well informed, though it may have highlighted a common misconception. The answer, spurred from the question, about how "classical" computing must necessarily exploit the quantum nature of matter if it intends on reaching ~1nm scale features is totally spot on. Looking forward to the next ~5min of video!
Videos like this immunize me from the Kruger-Dunning Effect.
same here my friend. I had to look up the effect, so I'm even more ignorant ;-)
putting you in your place is a simpler way of saying it
If you think it did that, you should be worried.
sugarfrosted Okay... it was just a booster.
Or even the Dunning-Kruger Effect.
Great video; thanks Computerphile and Dr. Moriarty :)
This material is way over my head, but this video was fascinating. Thanks for putting it together.
This interview elevated my understanding of how we're able to manipulate atoms. Thank you.
"Quantum Mindset" #bandname
+MaxPower ^ rofl
But damn if you try to find the location and time of any given concert in particular.
+scabbynacker "Are you thinking with quanta yet?" #tagline
It's very interesting to dive into the physics and chemistry of electronic computing, it's not a subject I've explored much as a computer scientist.
these videos are like the best thing in my life sometimes. thanks for continuing to make them. :)
Physics AND computer science, all with Professor Moriarty. Great combo, I really enjoyed this episode. Thanks guys!
I see Phil Moriarty. I watch. I upvote
Indeed, his videos are better, because he actually is more specific than others...
face01face f
And that's how you identify a redditor.
I see Phil Moriarty. I upvote. I watch
Just a silly question: doesn't diffraction screw up with the lithographic process considering that light has to go through such tiny apertures?
Yes, which is why you use light of very small wavelengths.
The wavelengths they're using in mass production today are not that small. I believe 193nm is still standard, even though they're making features as little as ~14nm in size (actually many parts of a "14nm" process are not 14nm, but it's all pretty much to scale compared to older process nodes like 22nm).
The next step is (supposed to be) EUV, where they do drop to very short wavelengths and high energies, as discussed in the video. They are having a lot of issues getting that to work for mass production though.
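The numbers in this thread line up with the standard lithography resolution formula, CD = k1 * wavelength / NA. A sketch, where the NA and k1 values are assumed typical-of-the-era figures, not numbers from the video:

```python
# Back-of-the-envelope for the 193nm-vs-14nm gap discussed above,
# using the standard resolution formula CD = k1 * wavelength / NA.
# NA=1.35 (immersion optics) and k1=0.30 are assumed typical values.
def min_feature_nm(wavelength_nm, numerical_aperture, k1):
    return k1 * wavelength_nm / numerical_aperture

cd = min_feature_nm(193, numerical_aperture=1.35, k1=0.30)
print(f"~{cd:.0f} nm single-exposure feature")  # ~43 nm
```

Getting from roughly 43 nm single-exposure resolution down to "14nm"-class features is exactly why tricks like multiple patterning (splitting one dense pattern across several exposures) became necessary before EUV.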
The diffraction of light really is a limiting factor when reducing the size of features. One of the ways semiconductor manufacturers get around this is phase shift masking, which Prof. Moriarty explained as two masks just slightly offset from each other.
I didn't get how shifting the two templates helps.
+sewer renegade There are actually several kinds of phase shift masks. I think Professor Moriarty is portraying the general concept: the edges of the photomask phase-shift the light passing by, so that when the light reaches the photoresist on the wafer, the edges of the patterns are enhanced by constructive and destructive interference of the light waves, making features smaller than the wavelength of light possible.
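The constructive/destructive interference idea can be illustrated with a minimal two-wave superposition (a toy model, not a lithography simulation):

```python
import math

# Minimal sketch of the phase-shift idea above: superpose two coherent
# unit-amplitude waves, one of which passed through a phase-shifting
# mask edge. In phase they reinforce (bright); shifted by pi they
# cancel (dark), which is what sharpens the edge between two features.
def intensity(phase_a, phase_b):
    """Intensity of the superposition of two unit-amplitude waves."""
    re = math.cos(phase_a) + math.cos(phase_b)
    im = math.sin(phase_a) + math.sin(phase_b)
    return re * re + im * im

print(intensity(0.0, 0.0))       # in phase: intensity 4 (bright)
print(intensity(0.0, math.pi))   # pi phase shift: intensity ~0 (dark edge)
```

The dark null between two bright regions can be much narrower than the wavelength itself, which is why phase-shift masks beat the naive diffraction limit.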
I'm just glad there's someone who can see that physics, chemistry and computer science are integral to each other, rather than the usual 'brinksmanship' you see between these fields.
This guy won't shut up. I love it!!!!
I swear every time a computerphile video ends, when I hear those beeps, I start singing "Askepios" by the Mars Volta.
I love it.
At the end of each video I literally start singing "I'll be there waiting..." and start asking myself "damn what song is that?"
I'd love to hear Prof Moriarty talk about spintronics and photonics if you have him on again.
"If you find that confusing.. Good" - soo funny and soo true
Welcome back professor. I had wondered about your absence.
this is great info, since I come from a physical science background this shows the application of what I learned, wish they had a class on the physics of computers
I thought that the wave interference of the photons (as in, the interference you see with the double slit experiment) would become a problem. But instead you can create a mask that actually utilizes this phenomenon to create interference patterns that match the target pattern on your silicon wafer.
Really cool to see semiconductor fab processing explained here!
This is brilliant! I would be really glad if you made more videos on this topic
I really liked this sort of unplanned interview
silicon is reflective? MIND BLOWN.
Watched extra bits, still want more.
Big man working on small stuff, respect
Nice to see you again Dr Phil!
My fiancé worked for Intel; he was sent around to various clean rooms and such to work on the computers running the scanning electron microscopes they used for debugging chips.
The whole process is pretty cool to me, as I just stopped learning the abstractions of the CPU at the logic gates, and VHDL design.
Just curious: what kind of feature size would a hobbyist be able to achieve? I mean, there's that guy who built a macro-computer using full chips for his transistors, and I know most of us are better off using FPGAs anyway. But say someone wanted to get into etching their own silicon; what do you think is the range of quality they could get to?
This man is brilliant and captivating. Would definitely enjoy more videos featuring him.
I love how excited he got for the silicon question.
Complex concepts beautifully explained. This guy has a gift for simplifying things to layman language
5:40 "as a physicist, it's not that you understand it. you just get used to it" Wise words. I tried understanding the wave nature of electrons and lost half of my hair just getting my head around it, and I am not even a physicist. Physics can be addictive. Also, it can be intuitive and unforgivingly confusing at the same time. I know a person who'd agree with that last statement: Mr. Erwin Schrödinger.
This is my favorite video in a while.
Hearing about 14 nanometers being the smallest you can go, while my CPU is made with a 7 nanometer manufacturing process, just shows how fast technology moves on. This was 4 to 5 years ago, and at that time their ultimate goal was 13.5 or something like that! Science moves fast!
it may be off topic:
ion computer
00 North up spin CW (volt amp)
01 North up spin CCW (volt -amp)
10 South up spin CW (-volt amp)
11 South up spin CCW (-volt - amp)
-------------------------------
photon computer-
00 photon (0,90) no mirror no lens
01 photon (0,270) no mirror lens
10 photon (180, 90) mirror no lens
11 photon (180,270) mirror lens
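For what it's worth, the two speculative encodings above can be written down as plain lookup tables (states copied from the comment itself; nothing here corresponds to a real device):

```python
# The two hypothetical 2-bit encodings from the comment above, as
# lookup tables. All state names are taken from the comment verbatim.
ION_ENCODING = {
    "00": ("North up", "CW",  "+volt", "+amp"),
    "01": ("North up", "CCW", "+volt", "-amp"),
    "10": ("South up", "CW",  "-volt", "+amp"),
    "11": ("South up", "CCW", "-volt", "-amp"),
}

PHOTON_ENCODING = {
    "00": ((0, 90),    "no mirror", "no lens"),
    "01": ((0, 270),   "no mirror", "lens"),
    "10": ((180, 90),  "mirror",    "no lens"),
    "11": ((180, 270), "mirror",    "lens"),
}

print(ION_ENCODING["10"])  # ('South up', 'CW', '-volt', '+amp')
```

Either way, each physical configuration carries exactly two bits, so four distinguishable states per cell is the whole trick.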
I believe there's both positive and negative photo-resist. Love Moriarty! He's the man.
It is so amazing to see Prof Moriarty talk about computer science. *beaming* :D
This guy is awesome! It's very obvious he's gushing with knowledge and enthusiasm of his subject.
As many have said: more of this guy, please!
Really like how you explain things, hope you do more videos in the immediate future!
I could listen to this guy for hours
great video, we need more low level hardware videos like this
Speaking of the 'integrated whole'... Years ago I read Richard Feynman's 'Lectures on Computation' (certainly less popular than his lectures on physics, but Feynman did a bit of computer science as well) and he described a model of computation that theoretically required zero energy. I don't know enough quantum physics to understand the exact mechanism (it had to do with particles moving from an excited state to one of multiple rest states, I believe) but I've never heard about it anywhere else. I would love to hear some academics talk about this idea. Or about amorphous computing, although that is still a fairly specialist field...

Oh, but memristors! I'm sure you could easily do a video about them! Please don't get distracted by the whole "neuromorphic processors" stuff, that's all anyone ever talks about. Talk instead about the impact they could have on computer architecture: everything (CPU, registers, L1 cache, L2 cache, L3 cache, RAM, mass storage) could be done with a single large array of memristors. And portions of the array could dynamically change from providing memory functionality to providing computational capacity faster than a RAM read. That could change everything!
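The memory-that-also-computes idea in the comment above can be sketched in a few lines. This is a toy Python model (all conductance and voltage values are made up for illustration) of the usual memristor crossbar trick: cell conductances store a matrix, and applying input voltages reads out a matrix-vector product as row currents via Ohm's and Kirchhoff's laws.

```python
# Toy memristor crossbar: the same array is memory AND compute.
# Each cell's conductance G[i][j] stores a value; driving the columns
# with voltages V[j] yields row currents I[i] = sum_j G[i][j] * V[j].
# All numbers below are hypothetical, for illustration only.

def crossbar_read(G, V):
    """Analog matrix-vector product: row currents from conductances."""
    return [sum(g * v for g, v in zip(row, V)) for row in G]

def crossbar_write(G, i, j, conductance):
    """'Storing' data is just updating one cell's conductance."""
    G[i][j] = conductance

G = [[0.1, 0.2],
     [0.3, 0.4]]   # siemens -- the stored "memory" contents
V = [1.0, 0.5]     # volts   -- the computation's input vector
print(crossbar_read(G, V))
```

The point of the sketch: no data moves between a separate memory and ALU, which is why a memristor array could blur the line between storage and computation.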
I believe what you are essentially talking about is quantum computing. It uses photons as information. By emitting light onto surfaces at the atomic scale you 'bounce' the photonic energy inside an atom, causing the electrons 'orbiting' the nucleus to jump up energy levels. When atoms become more complicated, these energy levels gain sublevels. The energy a photon carries, which equals the Planck constant times the frequency of the light, is the lowest obtainable quantum of energy. This energy can be transported just like electrons (i.e. information) and can be stored in the energy levels (i.e. memory). If you can read/write (manipulate) photons, you can make a computer using the lowest resolution our universe has to offer.
I learned SO much from this. Thank you.
I also love his passion. It excited me to learn this.
I can listen to this man for hours ...
Phil Moriarty is badass
Cracking piece of work. I want to see a paper published titled "Quantum Effects on Classical Computing" by Professor Phil.
Amazing video! I have a question about Professor Moriarty's explanation of how the semiconductor industry is able to create such precise patterns on transistors. When the two offset masks are placed over the silicon wafer and light is shone through, how is it that the light is able to deterministically etch a pattern? Why would it not behave like a wave/particle in the double-slit experiment and diffract into a probabilistic wave pattern on the wafer?
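The diffraction asked about here is real; lithography fights it with short wavelengths, big lenses, and tricks like the double patterning in the video. A rough Python estimate of the diffraction limit using the Rayleigh criterion (CD ≈ k1·λ/NA); the specific k1 and NA values are illustrative assumptions, not figures from the video:

```python
# Rough diffraction-limit estimate for optical lithography.
# Minimum printable half-pitch CD ~ k1 * wavelength / NA
# (Rayleigh criterion). Numbers below are illustrative only.

def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.25):
    """Smallest half-pitch an optical system can resolve, in nm."""
    return k1 * wavelength_nm / numerical_aperture

# 193 nm ArF light with water-immersion optics (NA ~ 1.35):
single_exposure = min_feature_nm(193, 1.35)   # roughly mid-30s of nm
# Double patterning splits one dense layer across two offset
# exposures, roughly halving the effective pitch:
double_patterned = single_exposure / 2
print(round(single_exposure, 1), round(double_patterned, 1))
```

So the light does diffract; the features are simply engineered to stay at or above what diffraction allows, and multiple patterning squeezes below a single exposure's limit.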
Great video... More of Professor Moriarty!
Congratulations! A very successful comeback.
8:43 ish - How do they make the masks with features that small?
really interesting
more please
6:20 The sound of going from one very difficult topic to another very difficult topic fast.
With feature-size shrink, we're already running into really tough problems. Notice how there haven't been many processor design improvements lately (pipelining being one example of a past improvement); designers have resorted instead to parallelization (multiple cores on a single die) and increasing clock frequency. Plus we're starting to see feature sizes so small that operating one component starts to affect the operation of adjacent components (e.g. rowhammer in memory chips). 14nm may be the practical limit for feature size.
Funny thing is, average clock frequency is actually decreasing, mainly because of the popularity of notebooks. Power consumption and heat output rise faster than clock frequency, so it's more power-friendly to have slower parallel circuits instead of one fast circuit. There are very few algorithms which can't be parallelized. Circuit development simply mirrors the demand.
Even with that in mind, microchips are still the fastest computing machines we can build today. For example, a single consumer-grade CPU is more powerful than all the quantum computers built to date combined. And those quantum computers are all built to perform specific tasks. Biocomputers have only recently been built as a proof of concept.
Intel started 10nm transistor production a couple of weeks ago; their 10nm CPUs are coming out next year. IBM said they managed to make a 7nm transistor, so we will for sure go at least to 7nm. Below that we'll have to see, since quantum effects start to dominate there.
I am pretty sure the average clock speed has been stable for years.
Nadir Jofas, I'm referring to the pursuit of more processing per unit of time. At first, chip designers thought of things like pipelining, branch prediction, and so forth, plus physical/fabrication improvements to allow clocking the parts faster. For a relatively long time now, there hasn't been any improvement in that sort of design, so parallelization seems the only practical avenue left. But as others point out, for some problems that doesn't help because they're not intrinsically parallelizable. Still, for other problems/workloads, we're trying die shrink to get more cores in a single package.
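The "parallelization doesn't help every problem" point in this thread is usually quantified by Amdahl's law: if a fraction s of a workload is inherently serial, N cores give a speedup of at most 1/(s + (1-s)/N). A quick Python sketch:

```python
# Amdahl's law: the speedup from n cores when a fraction
# `serial_fraction` of the work cannot be parallelized.

def amdahl_speedup(serial_fraction, n_cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# With just 10% serial work, even absurdly many cores cap out
# below a 10x speedup:
for n in (2, 4, 8, 1024):
    print(n, round(amdahl_speedup(0.1, n), 2))
```

This is why die-shrink-for-more-cores only pays off for workloads whose serial fraction is small.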
rchandraonline Ah dammit youtube. I was referring to KohuGaly
It would be cool if Computerphile did an episode on 3D chips. With the physics wall coming soon, the only way to go will be up.
There are metal layers on top of the transistors. E.g. Intel's 45nm process has 9 metal layers. You've always needed room for these, ever since the invention of the microchip, so chips were never flat.
Yeah, I know that. That's not what I meant. More along the lines of what memory chips have been going through lately, with 32- or 48-layer stacking, except for CPUs. TSV, etc.
Double patterning is so arcane, especially when looking at super-small designs like SRAMs.
"You can't beat physics". Phil, challenge accepted, my fellow.
I'll follow you anywhere Phil
Best video I've seen on computer chips.
That polished silicon would make a super nice (and extremely expensive) mirror
I love this guy, he explains things so well. Get him on the show more often plz.
How do the individual atoms feel about being manipulated?
#MeToo
electronized
Uncertain. Of course.
The electrons are TRIGGERED
Try to learn about transistors...
This was a really good video. Lots of detail.
Could you do a video about error checking/handling in chips (CPUs)? I read a few years ago that now that CPUs are becoming smaller, there are more errors in the calculations. I would love to know how manufacturers get around the errors.
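One common answer to this question is error-correcting codes: server memory and on-chip caches store extra parity bits so that a single flipped bit can be located and fixed on read. A minimal Hamming(7,4) sketch in Python; real CPUs and DRAM use wider SECDED variants of the same idea, so this is illustrative only:

```python
# Minimal Hamming(7,4) code: 4 data bits + 3 parity bits; any single
# flipped bit can be located and corrected. Illustration only --
# real hardware uses wider SECDED codes built on the same principle.

def encode(d):                       # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7

def correct(c):
    """Return (data_bits, flipped_position or 0 if no error)."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3       # syndrome = index of bad bit
    if pos:
        c = c[:]
        c[pos - 1] ^= 1              # flip the bad bit back
    return [c[2], c[4], c[5], c[6]], pos

word = encode([1, 0, 1, 1])
word[4] ^= 1                         # simulate a stray bit flip
data, where = correct(word)
print(data, where)                   # recovers [1, 0, 1, 1], flip at 5
```

The syndrome trick is the key: each parity bit checks an overlapping subset of positions, so the pattern of failed checks spells out exactly which bit flipped.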