I looked on YouTube and found out that the 5nm and 2nm chips are actually bigger than that, but because of their design they are equal in power to a theoretical 2nm chip. The transistor is then marketed as a 2nm transistor even though it isn't actually that small. Link to the video that explains it: ruclips.net/video/GdLRSF5wZpE/видео.html
(On the funny side) Are you a robot in human form, to have so much information sequentially bombarded at us that it short-circuits our brains to pieces? (On a serious note) This is just mind-blowing information with a never-seen presentation style that keeps going on and on, with no chance of skipping any part of the video. Hats off to you for all the hard work.
That's true. Too bad people don't know that. He made tons of mistakes in the video because of it. "7nm" is actually something between 20-50nm, as far as I remember.
Where did they come up with 10nm then? If the transistors are (twice?) that size, then what exactly IS 10nm? Is it just a made-up number? A goal? If 10nm isn't 10nm, then what's the point of even using a measurement? 😟 Please tell me that something in that chip is 10nm.
@@Inertia888 For example, Intel's 10nm process is equal in size to AMD's 7nm. The naming is a lie, but the truth is that each process is getting better, and I think that matters the most.
Solid video. All the while I was building an argument about Moore's law as we know it, in the narrow definition, coming to an end, but the overall trend of increasing computing power holding true for likely much longer, for the reasons you said, essentially. Starting from transistors, which were fairly big, the low-hanging fruit was downsizing. Once we had integrated chips, the low-hanging fruit was still downsizing, because they were still huge and because they remained, in effect, a "first draft": the concept worked but was far from efficient and optimized space-wise. That means we really haven't touched, by comparison, things like 3D stacking of chips, instruction compilation optimizations, and so on. Hopefully that buys us a few more years or decades of Moore's law until we can figure out some sort of commercially usable quantum computer. Then we can leap again. Even if such a computer never gets to the smartphone level of tech and remains confined to specialized, centralized data centers, it would still provide huge computing power that can be distributed through networks.
And all for what purpose - so we can watch more movies on our talking fridges! Will RUclips videos play any faster? All this technology for the sake of more technology is useless if it does not make any real advances in the quality of life as we know it. Does the world really need an All-seeing, All-knowing Cloud? Do we really need better video War games? Will real wars and conflicts over limited resources end? What happens to the Military Industrial Complex that runs the world - do those $Trillions get returned to the common-good? Will the AI tell us who really did 911 and how?
@@JohnnyBGoode-tt7yv Yeah, but in a sense you are talking about the by-products. Parallel to those by-products, our ability to simulate a lot more things did lead to advances in medicine and other fields. More will follow. It does improve life as well, I would argue.
Moore’s law isn’t a law. It was an interesting idea. The fact that it has had so much influence on this technology is completely artificial. It’s impossible to say where the industry might be had Gordon Moore not made what is essentially an economic guess.
But the price to research and figure out new tech does not go down. Figuring out how to make things smaller and smaller while keeping the same performance is not suddenly cheap.
In terms of design the low-hanging fruit is gone; die shrinks represent more and more man-hours, new tools, and new techniques. The material costs go down, but the R&D budgets swell more and more.
You can't say the production costs go down. I work at Infineon, one of the biggest semiconductor companies. In our production halls there are machines which cost over 1 million USD/EUR each, and you have to sell them and buy new ones every 24 months. Just look at the money that Siemens earns and then at the money Infineon earns. There's a huge difference, because Infineon has a lot of production costs tied up in those machines.
This isn't just happening with computers; look at tons of other things. Compare a motocross bike from 1986 to one from 1996: huge difference. Now compare a 2006 to a 2016: not nearly as big a difference. Same thing with cars, etc. Have we hit a wall, in an innovation sense?
I feel the next big innovation will be based in understanding how to bridge the gap of electrical signals between machine and human. If we could understand better the electrical signals our brain receives regarding simple things like physical sensation you could theoretically mimic real life sensations while in VR.
What if the hardware limit is at 99% but software optimization is at a stone-age 0.000000001%? We don't need hardware shrinkage so much as software magic...
We are already in the singularity, and this is exactly the way I expect it to be the entire way - no one even notices. Anyone born today will never get to drive a car. They will never need a job. They will never die against their will. There will be no doctors, lawyers, experts, nothing, no careers. The only question is, who owns the TS? The answer determines if the future is heaven or hell.
Simply amazing info, video and narrating. No, don't slow down your speech. It's perfect ! Short, concise and straight to the point, as it should be. Other channels should learn from you.
Does quantum tunnelling stop happening when a particle is being observed? Is it the same as the wave function that collapses when observed in the double-slit experiment? So, for example, if you want to make faster CPUs below 2nm lithography, would one solution be to observe when an electron passes through a logic gate?
Here’s a quick thought. Just know I haven’t thought any of this through, and as I write this I’m thinking “actually this is a shit idea because of x, y, and z”, but what if we were to switch to analog in terms of input? Here’s what I mean: a neuron works via little gates in the cell membrane for sodium, and they open as a more positive charge comes through, like a chain reaction; once one opens, it triggers the opening of the sodium channel next to it. How it triggers the opening is simple: an impulse raises the charge of the local area past a threshold, which induces the gate to open and let sodium in (sodium raises the charge). In computers this could mean that electrons could leak, and as long as there is only a small current, the output would read 0 until the gate is opened; then even more electrons would pass through, the current would increase, and it would be detected as a 1. I’m sorry if this is poorly explained, and after reading this I’m thinking “well, how would it know?”, but I’m assuming this train of thought is where sci-fi (but plausible) biomechanical cyborg brains are made.
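As a purely illustrative sketch of the threshold idea above (all numbers and the function name are made up for illustration; real transistors and neurons don't work this simply):

```python
# Illustrative sketch of the threshold idea above: a "gate" that reads 0
# while leakage current stays below a threshold, and 1 once the incoming
# charge crosses it (loosely analogous to a neuron's sodium channels
# opening past a threshold). All values are hypothetical.

THRESHOLD = 1.0  # arbitrary units of accumulated charge

def gate_output(leakage_current: float, signal_current: float) -> int:
    """Return 0 for sub-threshold leakage, 1 once the signal pushes past it."""
    total = leakage_current + signal_current
    return 1 if total >= THRESHOLD else 0

# Small leakage alone still reads as 0; leakage plus a real signal reads as 1.
print(gate_output(0.1, 0.0))  # leakage only -> 0
print(gate_output(0.1, 1.2))  # gate "opens" -> 1
```

The point of the sketch is just that a thresholded analog quantity can still yield a clean digital 0/1, which is the tolerance-to-leakage the comment is reaching for.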
Moore's law is not ending. We've just managed to push it further back. Eventually, maybe a decade from now this new generation of transistors will hit the same bump again. Unless we develop another ingenious way to do things again. Props to the people behind this development tho.
Not only does the number of transistors increase, but so does the operating frequency. Thus when the transistor count doubles, the computing power actually increases by well over double.
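A back-of-the-envelope sketch of this point, with hypothetical scaling factors (the simple multiplicative model is an assumption, not a real performance formula):

```python
# Crude model of the point above: when a new node doubles the transistor
# count AND raises the clock, the rough throughput gain is the product of
# the two factors. Both factors below are hypothetical examples.

def throughput_gain(transistor_factor: float, frequency_factor: float) -> float:
    """Relative throughput under the crude model: transistors x frequency."""
    return transistor_factor * frequency_factor

# Doubling transistors alone: 2x. Doubling plus a 30% clock bump: 2.6x.
print(throughput_gain(2.0, 1.0))  # -> 2.0
print(throughput_gain(2.0, 1.3))  # -> 2.6
```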
Walther Penne I don't think you have bothered to check the current state of technology, but we already do stacked-die CPUs and memory, as long as the wafers are sufficiently thin. You can just back down the frequency and voltage till you hit the peak of the efficiency curve. Cooling is a problem, but it isn't "the" problem. It has the same issues as other large-die solutions: more surface area means a higher chance of a defect, with the added bonus of damage during stacking and soldering the TSVs. Stacked dies will be popular outside of embedded processors not long after we have more reliable lithography and better testing equipment.
@Walther Penne So what can be done to speed up the process of turning the heat in your processor into Hot Air Dude? And what happens to Hot Air Dude after his transformation? Does he fly out of your PC on the wings of a stork? I am fascinated to learn more about this Hot Air Dude.
The not-so-obvious trend: past silicon, nothing is affordable in the consumer market. Which means that if we can't improve battery technology very soon, Moore's law for consumer computing basically comes to a standstill. Great video. What do you do for a living, and do you think technology or the need for computing power will outpace the other?
You may be right! There are some laws of physics limits we'll be approaching soon, however, there are also many alternatives the computing industry is beginning to shift towards. I'll be covering these in upcoming videos :) Thanks for watching!
Infinitely large, maybe; not small, though. Atomic theory is pretty well founded, and below that size you run into the fundamental information density limit of the uncertainty principle.
Transistor count can be increased, for a given slice of silicon, by stacking, an approach already used with NAND flash storage ("3D NAND"). When I was in high school 20 years ago, the death of Moore's law was also expected to arrive; at that time 180nm was still in development.
Should do a video on how AMD is Destroying Intel? And despite AMD achieving 7NM B4 intel!?! You still only show Intel's Grossly Overpriced Crap in your video?... LOL!...
@@marcusm5127 A true quantum computer would, but they are hampered because they still need to interface with the old technology: electronics. A true quantum computer doesn't run off electricity. For example, current quantum chips can be placed onto a motherboard that runs off electricity. The quantum-to-electric conversion introduces inefficiencies and delays. As we rely less and less on electrical computers we'll be able to do more.
My CPU is 22 nm architecture (Intel i5-4200M). It would be awesome to have a CPU from the 1990s to get a retro vibe, but I like the high performance of CPUs from the 2010s decade. I'm impressed by our digital technology over the past 5, 10, 20, and 50 years. Who knows what the 2020s would bring, maybe the second generation of quantum computers such as the D-Wave...
And for that reason, quantum computers got started. Though we still haven't made one for commercial usage, I think in the future it will become a commercial computer.
Over 8 years ago I bought a laptop for a certain price with 4 cores, 8 threads, and 8 GB of DDR3. Now, after taking inflation into account, you can buy a similarly priced laptop with 6 cores, 12 threads, and 16 GB of DDR4. This isn't going as it should under Moore's law. By now we should be at 32 cores and 256 GB of DDR6 for that price.
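For what it's worth, the doubling arithmetic behind that expectation can be sketched out (the starting specs are the commenter's; the 2-year doubling period is an idealized assumption). A strict 2-year doubling over 8 years is 4 periods, which would actually predict 64 cores and 128 GB:

```python
# Idealized Moore's-law doubling check for the comparison above.
# Starting values come from the comment; the 2-year doubling period
# is an assumption (Moore's original cadence varied between 1 and 2 years).

def doubled(value: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Scale a starting value by 2^(years / doubling period)."""
    return value * 2 ** (years / doubling_period_years)

print(doubled(4, 8))  # cores:  4 * 2^4 = 64
print(doubled(8, 8))  # GB RAM: 8 * 2^4 = 128
```

Either way, the actual 4→6 cores and 8→16 GB over 8 years falls far short of any exponential doubling schedule, which is the comment's point.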
You talk so fast it seems like a computer-simulated voiceover. Humanize the information more. Like, what's the ultimate goal here? Why the need to go toward the Planck scale? Give a useful example of something that could be made at that scale that currently can't be made at 7 nanometers. Why is 5 nanometers a useful goal? How will things change when that breakthrough is achieved?
alt f4 I understand that, but why isn't the current standard good enough? Again, give an example of something they want to build that they can't build currently. Or is it just about speed? Why is 7 nanometers too slow for a new application?
@@gl7011 I know this is a late reply, but here goes: think 50 years back. You could've asked that same question, and no one would have been able to tell you that we need more powerful processors to run fluid dynamics simulations, finite element calculations for engineering, and the amazing CG effects we see in cinemas today, simply because they couldn't even imagine it. Even if one step forward doesn't immediately bring huge changes to daily life, looking back after a few decades shows just how far we've come by taking one step at a time.
@@qtrg5794 Thanks for that. I'm much more informed now than I was just 6 months ago when I posted that question. Since then, more and more has been written about the dawn of artificial intelligence and its potential impact on life as we know it. I'm sure the area of AI will greatly benefit from these new breakthroughs. Computers will be able to perform in ways that mimic, if not surpass, the human brain. We live in very interesting times.
The example here is the human brain. Although neurons are slow by silicon circuitry standards, the brain does not particularly rely on the speed of neurons, but rather massive parallel processing. To this extent, you can regard the end of Moore's law as a good thing. We are still in baby steps as far as PP goes. The reason that GPUs do so well today is that image processing and graphics are problems well suited to parallel processing.
Slow down! You're going too fast even for people with multiple EE and CE degrees. If your intention is to educate and inform, slow down. If your intention is to sound cool, then just show some pretty videos and speak fast!
Hey, may I ask for the sources of info used in the vid? As a student in computer systems and technologies I am really curious to read them myself! Awesome vid.
I think it is worth pointing out that further miniaturization won't necessarily enable more IoT technology. The ESP8265 (not 8266) is available for pennies when ordered in bulk, and offers loads of performance in a fully integrated 5mm package. You could make an IoT ring with this and an OLED display as the jewel for less than $10, and this is on 40nm. These things are so cheap that they are disposable. They even offer capacitive touch on the chip, meaning you don't even have to pay to integrate buttons.
@@fer5839 Well, as time passes, technologies evolve. It happened with vacuum tubes, it happened with floppy disks, and so on. It is no surprise silicon chips won't cut it anymore. Isn't quantum computing the answer to that?
Does anyone know of a good video that explains in more detail the thing about gate transistors: how they work, why they start having problems at smaller scales, etc.? This is like the third video I've watched trying to get into the details, but all I get is always the same CG clip with the fancy cubes that seems to be rendered in ArchiCAD. Is there any detailed, precise, non-oversimplified explanation out there? I mean, I'm a computer scientist, but I have no clue about the physics behind this problem (beyond the repeated phrase "it's because of quantum tunneling", which doesn't explain anything). Thanks!
ruclips.net/video/RF7dDt3tVmI/видео.html ruclips.net/video/rtI5wRyHpTg/видео.html I hope these videos help explain. A person can spend years learning to understand the mechanics of this subject but this should be a good place to start.
Want to learn more about the Technological Revolution? Watch our playlist here: ruclips.net/video/ENWsoWjzJTQ/видео.html
- ALSO - Become a RUclips member for many exclusive perks from exclusive posts, bonus content, shoutouts and more! subscribe.futurology.earthone.io/member - AND - Join our Discord server for much better community discussions! subscribe.futurology.earthone.io/discord
What's the theme song?
They're going to stack silicon CPU dies on top of each other, just like HBM memory.
Also, as a side note: cosmic rays have been known to wreak havoc on silicon chips with transistors smaller than 45nm, creating currents in the electronic pathways that short out chips!
I love learning about ICs and electronics, specifically CPUs, because they are the cutting edge. And I've pondered since 22nm: what are we going to do when we get to 1nm? Great video; please SLOW DOWN YOUR VOICE.
Aaron ball 1nm is impossible. Once you get to 4nm, quantum tunnelling becomes an issue.
It’s like housing in a dense city area. When there’s no room left on the ground, you start building up with high rise...
Stupid!
Just build bigger chips then!
It's like buying more land in your analogy
@@justicewarrior9187
Problem is that that "more land" is more expensive than building skyscrapers.
@
Not sure if you didn't get my analogy. What I meant is that it is cheaper to build more layers, and thus need less area for one chip on a wafer, than to use fewer layers and take up more area. The greater the area of your chip, the fewer chips you get out of a wafer, which means they will be more expensive and your margins dwindle.
That's kinda what threadripper is
Yup. 3D chips are way past due.
Reads the news - Intel is having trouble with their 10nm process.
While AMD is already gearing up for a 7nm release in 2019.
Poor boi Intel
And it launched just yesterday.
And this Video is one year old, sooooo... yea
They're not really comparable, Intel's 14nm was equivalent to everyone else's 10nm
@@MyNameIsPetch That is true, but the 7nm TSMC node is slightly more dense than Intel's 10nm node. The last thing I've heard is that Intel is reducing the density to try and improve yields, but at the same time, news outlets have been stating that the node is basically dead. So idk?
@@MyNameIsPetch They sacrifice multi-core power draw for single core performance. Now that AMD has released their 7nm chips everyone will see through Intel's shit chips.
I remember how everyone talked about the end of Moore's law 20 years ago.
I guess we'll stop using a flat sheet as a place to store transistors but use tubes that can run liquid cooling through them
How many more iterations do you think we could get if we wrote better software? (Or designed programs to write better software.)
But it's for real now.
There is an old Byte magazine cover blabbering that the limit was 250 MHz... yeah.
Little did he know 5nm ended up coming out in 2021
How are you not insanely popular!? Proud early subscriber
Man I really admire all those engineers and researchers who come up with all this technology. 300IQ
No, not really. A tech junkie knows as much of his profession as a drug dealer knows his, but neither would understand the other's.
@@deathbydeviceable What do you mean "not really". Are you saying I DON'T admire them? Lol
@@miguelpereira9859 Just because a person can create tech doesn't make them 300 IQ. If you put those professionals in another environment they wouldn't know what to do. That's what I mean.
Take Elon Musk as an example. Put him in Jeff Bezos's shoes and vice versa, then watch both companies crumble.
@@miguelpereira9859 Lol this guy doesn't believe you actually admire them I guess
@@jessicalloyd2330 hahahaha
I can understand him clearly, you guys need to overclock.
🤣😂😂😂🤣
I'm not a native speaker.
I downclocked to 0.75 speed and got everything.
I think it's because of the hideous, annoying music in the background.
OK, for native speakers it's easy... haha
@@littlerussianmax5831 I could understand the French and Japanese clearly unlike the narrator.
Speed is OK for me but not for many others - so slow it down. Also, you should inflect more, else your flat, robotic tone (sorry) will repel people. Great job on the content! Subbed!
Yeah, I think the constant monotonous tone is throwing me off a bit. It would be fine for a 5-minute vid but not 10.
This is a wholesome comment
Andrew Palfreyman no he sounds fine
I actually listened at x1.25 lol
Andrew Palfreyman there is an option on RUclips to slow down the speed of a video. Btw i am not native in English but have no problems following the video
Ignore what everyone is saying about talking too fast; this video was the perfect pace, and you can always rewind/scrub the video if you missed something. To be honest, this video was a perfect refresher on the topic and I greatly appreciated the work that went into it.
To those complaining about the fast narration: adjust your setting to 0.25x playback. Goodbye.
life is too short to adjust settings
@@SineDeus life is too short to complain about trivial stuff. it would take longer to complain about the voice speed than it would to just change the speed.
@@TheTimmy4745 Did I complain about the video? I don't think so.
I just watched it at 1.5x because I realized 2x was too fast haha
@@camerica7400
Just put it on 0.75x much better.
Watching this 4 years later: IBM announced a 2nm chip.
Hmmm, many companies can announce 1nm or even angstrom-scale tech (beyond nanometers), but there are only 2 companies in the world that can make chips at 7nm and beyond: Samsung and TSMC. They're at 5nm and competing for 3nm. Samsung announced that 3nm with GAA is feasible and will start producing in Q1 of 2022. TSMC looks last in GAA tech and will stick with FinFET. My point is, they can announce all they want, but it won't be available until some companies are capable of 3nm tech.
2 years later we have AGI almost here..
We here bro
with worse specifications than its 5nm...
This video prompted me to go and learn how transistors work and come back. Great video, and holy cow, my mind is blown. Very appreciative of the scientists and engineers that have made this all possible.
Why does this seem like a Cold Fusion Vid...🤔
moonman_Z Probably the voice and quality
*_C O L D F U S I O N T V_*
Exactly!!!
My thoughts the same 😱
Same concept, less quality
Too bad RAM prices haven't come down in accordance with Moore's law.
Yeah, idk why people don't want RAM.
Why not even server owners? As a server owner and gamer, for my Minecraft server I use Corsair Ripjaws RAM; across all of my servers together, 160 sticks.
It has indeed come down. 12 years ago, 512 MB of PC133 cost me $139, and that was the going rate. Now I can buy 16 gigs of DDR4-3200 for around $100, give or take a few bucks. You new to computers?
@@garyr7027 Yes now it's cheap. Two years ago it wasn't.
@@freeman2399 Compared to 12 years ago, it still was. If prices today were still at the same per-megabyte rate as back then, 16 gigs would cost over $1,100. At that rate, many would still be using a gig or less.
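Redoing that per-megabyte arithmetic with the figures quoted in the comments above ($139 for 512 MB then, ~$100 for 16 GB now; the prices are the commenters', not verified):

```python
# Per-megabyte RAM price comparison using the commenters' own figures:
# $139 for 512 MB of PC133 twelve years prior, ~$100 for 16 GB of
# DDR4-3200 now.

old_price, old_mb = 139.0, 512
new_price, new_mb = 100.0, 16 * 1024

old_per_mb = old_price / old_mb  # ~$0.27 per MB then
new_per_mb = new_price / new_mb  # ~$0.006 per MB now

# What 16 GB would cost at the old per-MB rate:
cost_at_old_rate = old_per_mb * new_mb
print(round(cost_at_old_rate))  # -> 4448
```

At the old rate, 16 GB works out to roughly $4,400, so "over $1,100" actually understates how far RAM prices have fallen per megabyte.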
Lol ur an idiot...
This kind of presentation is the stuff we all dreaded doing in high school, and now RUclips is packed end to end with these kinds of videos with way more effort than could be expected.
The death of Moore’s Law? I have been hearing this for 20+ years. It's unlikely to happen anytime soon. Even when we hit the limit for transistor size we likely will just start layering them as we do with flash.
don't forget graphene
processors
How about heat dissipation?
It's already happened. Intel and Nvidia have both released flagship cpu/gpus that aren't really faster than their last year counterparts for the first time.
Andrew Scott
That's because of greed.
@@FrostEclipse21 or so we'd like to think atleast
Fantastic video - very informative, very high quality, and very well edited. Subscribed and hope to see your channel grow!
Thanks for watching!
Nvidia is cheating with Moore's law. Instead of relying on making the node size smaller, they simply increase die surface area. Moore's law assumes that each product has similar production costs, but that assumption no longer holds. If you plot computing power per price unit, you'll notice that the last few data points on the Moore's law graph are all outliers.
Exactly
@Akin Khoo I agree.
In fact you can pretty much extrapolate (see what I did there?) Moore's law to pretty much any important metric about computer power & digital performance, as he showed in the video.
And as he said at the end, it doesn't even really matter, ultimately, that the size of the chip is going down. As long as the computing power increases, even if it's from CPU processing optimization (better-planned computations so there's less overhead, basically, and compiling them in a way that avoids repeated computation where possible), we're good.
The letter of the law may be broken, but the spirit remains true.
No cheating in that.
For the PC market no one cares if the chip is 1 inch or 5 inches big ;D
Only "smart"phone hipsters need small chips so they can carry their surveillance device everywhere and look cool ;D
@@hyperhektor7733 You seem half serious, but still.
The density of computing power matters just as much as costs. You can see it as the number of operations you can perform per unit volume. Increasing computing density is what made the PC possible (it could never have happened with, say, the density achievable in the 70s, regardless of price). Greater computing density enables applications unthinkable before.
Of course price matters, otherwise there's no breadth to the market for computing power.
I see price as breadth & density as depth, in a way...
@Francis Vachon I've been on board since Windows 3.11 and the 386 CPU :D and since then I've installed every Windows version and built every PC myself (with the newest mid-price CPU available at the time).
So I can say I have a decent overview of the topic :).
What I see is that CPU power is wasted by the programmers, since they started using increasingly inefficient programming languages (for programs and operating systems). So the hardware gets faster but the user experience has stayed the same for 20 years xD.
We are already far beyond the point where performance per volume counts. I can buy a used Xeon CPU for $7 on eBay that scores 4,000 Passmark points, which is insane. People these days have really no clue how powerful CPUs are; they think an i7 is just good enough for web surfing lol.
BTW, the era when you could compare CPUs by GHz or cores or cache ended when the first Intel CPUs with ratings came out. The most consistent and largest database I've found is passmark.com. They have the very oldest CPUs and the newest, so I've used it to compare my CPUs for 15 years.
"bite-sized chunks of content" had to be the most comedic line in this video.
do love the style, learned so much in what only felt like 10 mins.
6:52 Home - Resonance...
Last I checked, the nm size of a processor doesn't actually have much relation to the size of the transistors or even the computing power anymore. It's mainly used to give a sense of noticeable improvement with each new process, and processes from different manufacturers with different "sizes" are often comparable to each other.
I looked on YouTube and found out that "5 nm" and "2 nm" chips are actually bigger than that, but because of their design they're equivalent in density to a theoretical 2 nm chip
The transistors are then marketed as 2 nm transistors even though they aren't actually that small
Link to the video that explains it ruclips.net/video/GdLRSF5wZpE/видео.html
I was waiting for the day somebody would use synthwave/vaporwave music to new technology. Love your video, sub'd!
It just fits so perfectly!
Here I am with a 5nm iPad in 2020. In just 3 years since this post it went from 10nm to 5nm!
vaporwave is strong with this one
Great Music, Great Graphics and Clear Fast Explanation!
Hands crossed and waiting for branches of computing other than classical type
Getting there!! Just want to run down classical computing first as a prelude!
Quantum computing exists. You can uncross those arms😂
An updated version of this video would be really cool
Intel is releasing 10 nm cpu in 2018 😂😂😂
Nice one xD
yeah lol'd at that
14nm* also weird way to spell Shintel
@@macrett 7nm coming from amd soon.
amd releasing 7nm soon, forget intel.
(On the funny side) Are you a robot in human form, to have so much information sequentially bombarded over us to short-circuit our brains to pieces? (On a serious note) This is just mind-blowing information with a never-seen-before presentation style which keeps going on and on, and there is no chance of skipping any part of the video. Hats off to you for all the hard work.
This video is top notch. Did a lot of research and analysis before uploading it. Thank you. Great video.
While things may stall at ~3nm, the cost of production will keep improving, effectively further increasing performance per dollar.
THANK YOUUUUU For listing the song names in the description 👌🏾
IBM just made the 2nm chip!
Awesome bro, im so proud to be your early subscriber before you go insanely popular!
Some constructive advice: slow down, and have some more inflection in your voice; it's a little monotone.
lilpandaftw No. his speed is fine
@@evanwatling3897 Look at the comments. A lot of people don't agree with you.
lilpandaftw That doesn’t mean that I’m wrong.
Sentient2x It kinda does though.
Zane Just because there’s a comment section full of flat earthers doesn’t mean that people disagreeing with them are incorrect.
I love the video, your voice is pretty nice and the speed you narrate is perfect.
This is a bit misleading, transistors in today's 10nm process are a lot bigger than 10nm and their smallest features are still bigger than 10nm.
That's true; too bad people don't know it. He made tons of mistakes in the video because of that. "7 nm" is actually something between 20-50 nm, as far as I remember.
Where did they come up with 10nm then? If the transistors are (twice?) that size, then what exactly IS 10nm? Is it just a made-up number? A goal? If 10nm isn't 10nm then what's the point of even using a measurement? 😟 Please tell me that something in that chip is 10nm.
@@Inertia888 nothing is 10 nm there, sorry...
@@hihtitmamnan That is a real bummer. It's a Big Fat Lie. How can I trust anything they claim?
@@Inertia888 For example, Intel's 10 nm process is equal in size to AMD's 7 nm. The names are a lie, but the truth is that each process is getting better, and I think that matters the most
Solid video. All the while I was building an argument about Moore's law as the narrow definition we know coming to an end, but the overall trend in increasing in computing power holding true for likely much longer. For the reasons you said, essentially.
When we started from discrete transistors, which were fairly big, the low-hanging fruit was downsizing them. Once we had integrated chips, the low-hanging fruit was still downsizing, because they were still huge and remained in effect a "first draft": the concept worked but was far from efficient and optimized space-wise.
That means we really haven't touched, in comparison, things like 3D stacking of chips, instruction compilation optimizations & so on.
Hopefully that buys us a few more years/decades of Moore's law until we can figure out some sort of commercially usable quantum computer. Then we can leap again. Even if such a computer never gets to the smartphone level of tech and remains confined to specialized, centralized data centers, it would still provide huge computing power that can be distributed through networks.
And all for what purpose - so we can watch more movies on our talking fridges! Will RUclips videos play any faster? All this technology for the sake of more technology is useless if it does not make any real advances in the quality of life as we know it. Does the world really need an All-seeing, All-knowing Cloud? Do we really need better video War games? Will real wars and conflicts over limited resources end? What happens to the Military Industrial Complex that runs the world - do those $Trillions get returned to the common-good? Will the AI tell us who really did 911 and how?
@@JohnnyBGoode-tt7yv Yeah, but in a sense you are talking about the by-products.
Parallel to those by-products, our ability to simulate a lot more things led to advances in medicine and other fields. More will follow. It does improve life as well, I would argue.
Moore’s law isn’t a law.
It was an interesting idea. The fact that it has had so much influence on this technology is completely artificial. It’s impossible to say where the industry might be had Gordon Moore not made what is essentially an economic guess.
Had to watch it on vapor wave speed, music also gets better this way, everything is more chill
What other aspects of computing should be considered before we say Moore's law may be dead by the mid-2020s?
Beautifully Explained.
5nm is here now
we're now at 2nm
This channel is gold....
Production costs go down, market prices go up! That's how you know you are enslaved!
But the price to research and figure out new tech does not go down. Figuring out how to make stuff smaller and smaller while keeping the same performance is not suddenly cheap
In terms of design the low-hanging fruit is gone; die shrinks represent more and more man-hours, new tools and techniques. The material costs go down, but the R&D budgets swell more and more.
You can't say the production costs go down. I work at Infineon, one of the biggest semiconductor companies. In our production halls there are machines that cost over 1 million USD/EUR each, and you have to replace them with new ones every 24 months. Just look at the money that Siemens earns and then at the money that Infineon earns. There's a huge difference, because Infineon has a lot of production costs tied up in its machines
Hey stupid ever heard of AMD?
perfect example of a dumb human being
Could you please add a link to the first part of the video?
watch in 0.75x speed
John Vatic lmao, you're right.
I thought he was speaking way too slowly :/
First thing I tried. The audio is "chopped" at 0.75x... speedup works fine, slowdown doesn't
I was watching at 2.0x. IT'S OKAY THOUGH!
at that speed it sounds like he is telling a bedtime story.
Some pauses between sentences would be a massive improvement. Very interesting information; F for delivery.
"Snapdragon 850 has 10nm transistors"
Snapdragon 888: laughs in 5nm
Thank you for the free education. We love you.
This isn't just happening with computers look at tons of other things. Compare a motocross bike from 1996 to 1986 huge difference. Now compare a 2006 to a 2016 not nearly as big of a difference. Same thing with cars etc. Have we hit a wall in an innovation sense?
Electric cars are entering the normal market. We need to change silicon to something better; graphene seems to have many problems nowadays.
I would like to see a bigger focus on efficiency, and it seems like that might be where we are headed.
I feel the next big innovation will be based in understanding how to bridge the gap of electrical signals between machine and human. If we could understand better the electrical signals our brain receives regarding simple things like physical sensation you could theoretically mimic real life sensations while in VR.
Subscribed.
Keep up with the good content!
Thanks!
What if the hardware limit is at 99%
but software optimization is still in the stone age at 0.000000001%? We don't need hardware shrinkage; we're more likely to get there with software magic...
MR BOLEUS For example, Apple's phones are silky smooth with 2GB of RAM
@@gaiazoulay9 Bullseye!!!
@@gaiazoulay9 No one cares; there are applications that need super-powerful chips for super-fast calculations
@Mikasa Imagine what kinda experience we'd have if every line of code was optimized ;)
@@carholic-sz3qv The only reason it needs a superpowerful chip is because the code sucks
I work in the electro-polymer business, so it's good to learn what our customers are up to. Thx.
just bought a phone with a 4nm chip : D
We are already in the singularity, and this is exactly the way I expect it to be the entire way - no one even notices. Anyone born today will never get to drive a car. They will never need a job. They will never die against their will. There will be no doctors, lawyers, experts, nothing, no careers.
The only question is, who owns the TS? The answer determines if the future is heaven or hell.
Get your head out of your ass please.
@@fdamoreg The funny part is that you think he is wrong. Look at the trends.
Amd 7nm gpu
_Oh yeah yeah_
Great video dude, this is the first Ive heard off nano sheets
please slow down a bit, great video though.
Simon Mayrand you can slow down the audio on youtube, or speed it up
Jordan Moorman thank you good trick😀
Set the speed at 0,75
Brain Food Spotify Playlist playing the whole video.
Good stuff.
Simply amazing info, video and narrating.
No, don't slow down your speech. It's perfect !
Short, concise and straight to the point, as it should be.
Other channels should learn from you.
AJ81 Number 15: Burger King Foot Lettuce...
Does quantum tunnelling stop happening when a particle is being observed? Is it the same as the wave function, that collapses when observed in the double slit experiment?
So for example, if you want to make faster CPUs under the 2nm lithography, would one solution be to observe when an electron passes via a logical gate?
lol, not how that works.
One atom is about 0.3 nm, and they say 2-3 nm is the limit for transistors...
On Wikipedia it's written that humans have built a working 0.4 nm transistor... But OK, I think for microprocessor chips maybe 2-3 nm is the limit...
At least on silicon ;)
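The arithmetic in this thread is worth making explicit. A quick sketch, using the ~0.3 nm atom size quoted above (a rough, order-of-magnitude figure, not a precise lattice constant):

```python
ATOM_DIAMETER_NM = 0.3  # rough silicon atom scale, per the comment above

def atoms_across(feature_nm: float) -> float:
    """Approximate number of atoms spanning a feature of the given size."""
    return feature_nm / ATOM_DIAMETER_NM

# A "2 nm" feature would be only about 7 atoms wide -- which is why
# these node sizes are marketing names rather than physical dimensions.
for size_nm in (10, 5, 2):
    print(f"{size_nm} nm ~ {atoms_across(size_nm):.0f} atoms wide")
```

At single-digit atom counts, quantum effects like tunneling dominate, which is the physical reason the thread treats 2-3 nm as a hard wall for conventional silicon transistors.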
Microchips in our smartphones are 12 nm. To record 8K video they will need to be 4x more powerful, so they need to be at 3 nm!
@Hernando Malinche I know :) It's the first mass-production device with a 7 nm chip. My Note 8 can record 4K video with, I would say, a 14 nm chip.
It gets very hot. If you have more transistors you can reduce the frequency or operation times, and this reduces heat... no?
Here’s a quick thought. Just know I haven’t thought any of this through, and as I write this I’m thinking “actually this is a shit idea because x, y and z”, but what if we were to switch to analog in terms of input? Here’s what I mean: a neuron works via little gates in the cell membrane for sodium, and they open as an (even more) positive charge comes through, like a chain reaction; once one opens, it triggers the opening of the sodium channel next to it. How it triggers the opening is simple: an impulse raises the charge of the local area past a threshold, which induces the gate to open and let sodium in (sodium raises the charge). In computers this could mean that electrons could leak, and as long as there is only a small current the output would read as a 0 until the gate is opened; then even more electrons would pass through, the current would increase, and it would be detected as a 1. I’m sorry if this is poorly explained, and after reading this I’m thinking “well, how would it know”, but I’m assuming this train of thought is where sci-fi (but plausible) biomechanical cyborg brains are made.
Whoa! Slow down turbo, you talk a bit fast and robotic! Otherwise, great video!
Moore's law is not ending. We've just managed to push it further back. Eventually, maybe a decade from now this new generation of transistors will hit the same bump again. Unless we develop another ingenious way to do things again. Props to the people behind this development tho.
awesome channel
but you talk too fast bro
Not only does the number of transistors increase, but also the operating frequency. Thus the computing power gained when the transistor count doubles is actually way more than double.
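That compounding effect is a one-line calculation. A sketch with illustrative numbers (the 1.4x per-generation frequency gain is an assumption for demonstration, not a measured figure; real gains varied widely by generation):

```python
transistor_gain = 2.0   # Moore's-law doubling per generation
frequency_gain = 1.4    # assumed clock-speed improvement (illustrative)

# Gains multiply, so the combined per-generation improvement
# exceeds a plain doubling of transistor count alone.
combined = transistor_gain * frequency_gain
print(f"combined gain per generation: {combined:.1f}x")
```

Under these assumptions each generation delivers ~2.8x, not 2x, which is the comment's point: transistor count understated the historical performance curve while frequencies were still climbing.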
Multiple layers on a single chip is the answer.
Walther Penne Well, maybe a different material can solve the heating problem. I have heard that the graphene is pretty good at this.
Tommy Hammernots I know, but I meant more layers, and probably with different materials.
ruclips.net/video/UUO-f0kIgVU/видео.html
Walther Penne I don't think you have bothered to check the current state of technology. But we already do stacked die CPUs and memory. As long as the wafers are sufficiently thin. You can just back down the frequency and voltage till you hit the peak of the efficiency curve. Cooling is a problem, but it isn't "the" problem. It has the same issues as other large die solutions, more surface area means more percentage chance that there is an error, with the added bonus of damage during the stacking and soldering the TSVs. Stacked dies will be popular outside of embedded processors not long after we have more reliable lithography and better testing equipment.
@Walther Penne So what can be done to speed up the process of turning the heat in your processor into Hot Air Dude? And what happens to Hot Air Dude after his transformation? Does he fly out of your PC on the wings of a stork? I am fascinated to learn more about this Hot Air Dude.
The not-so-obvious trend: past silicon, nothing is affordable in the consumer market. Which means that if we can't improve battery technology very soon, Moore's law for consumer computing basically comes to a standstill.
Great video. What do you do for a living, and do you think technology or the need for computing power will outpace the other?
I believe it will never end, the evidence being the universe itself will always be infinite.
You may be right! There are some laws of physics limits we'll be approaching soon, however, there are also many alternatives the computing industry is beginning to shift towards. I'll be covering these in upcoming videos :) Thanks for watching!
Singularity Prosperity waiting for those videos as an early subber
ZeN The universe isn't infinite, I think that's why we all die, even the universe itself. Just imagine if there was one though!
Infinitely large maybe, not small though. Atomic theory is pretty well founded, and after that size you run into the fundamental information density of the uncertainly principle
Joseph Moore
Uncertainty.
Transistor count can be increased, given a slice of silicon, by stacking, an approach already used with NAND flash storage ("3D NAND"). When I was in high school 20 years ago, the death of Moore's law was expected to arrive soon; at that time 180nm was still in development.
You can't stack too much though, because of heat.
There is also a big difference between storage and the actual processing parts
You do know AMD is on 7nm, right?
That is not the limit though. The limit is closer to 3-4nm. Also, using a combination of materials might push that limit even further.
Should do a video on how AMD is Destroying Intel? And despite AMD achieving 7NM B4 intel!?! You still only show Intel's Grossly Overpriced Crap in your video?... LOL!...
RawLu You realize you copy and pasted this onto someone with the same opinion as you? Complete and utter moron
@@sav22rem22 He put it on at least 4 others too. People are retarded.
What if we use two chips instead of one?
I feel drastically smarter now... (+ 1 sub)
Thanks for making such an informative video. Can you please decrease the bass level? The bass is high in the voice.
10:33 coronavirus first appears
😂😂
This was a really well made video. I loved it!!
*S Y N T H W A V E*
I know right
It's Home - Resonance
Dude, this video is awesome!
Quantum computers will continue the trend even if there's a delay...
No, they will double even faster, but with bigger jumps and longer times in between. Steeper curve, fewer points on it.
@@marcusm5127 a true quantum computer would but they are hampered because they still need to interface with the old technology - electronics. A true quantum computer doesn't run off electricity. For example, current quantum chips can be placed onto a motherboard that runs off electricity. The quanta to electric conversion introduces inefficiencies and delays. As we rely less and less on electrical computers we'll be able to do more.
My CPU is 22 nm architecture (Intel i5-4200M). It would be awesome to have a CPU from the 1990s to get a retro vibe, but I like the high performance of CPUs from the 2010s decade. I'm impressed by our digital technology over the past 5, 10, 20, and 50 years. Who knows what the 2020s would bring, maybe the second generation of quantum computers such as the D-Wave...
Great video. I understood perfectly I don't know if these snails got it though ;)
r/iamverysmart
It's comprehensible just not an enjoyable listen
Yeah, us damn snails. Can't even get a simple video.
r/iamverysmart
And for that reason, quantum computers got started. Though we still haven't made one for commercial usage, I think in the future it will be a commercial computer.
10:30 coronavirus in 2017
This video was sponsored by Illuminati.
💯💯
You do know viruses existed before COVID, right?
@@TheZenytram yes🤡
Over 8 years ago I bought a laptop for a certain price with 4 cores, 8 threads and 8 GB of DDR3. Now, after taking inflation into account, you can buy a similarly priced laptop with 6 cores, 12 threads and 16 GB of DDR4. This isn't going as it should with Moore's law. By now we should be at 32 cores and 256 GB of DDR6 for that price.
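For what it's worth, a strict two-year doubling predicts slightly different figures than the ones in that comment. A hedged sanity check (Moore's law concerns transistor counts, not cores or RAM, so applying it to either is only a loose analogy):

```python
def moore_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor under a strict doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 8 years => 4 doublings => a 16x increase over the baseline specs.
factor = moore_factor(8)
print(4 * factor, "cores,", 8 * factor, "GB")  # 64.0 cores, 128.0 GB
```

So under a strict reading the commenter's 4-core/8 GB baseline would project to 64 cores and 128 GB, while the actual market delivered roughly 1.5-2x, which is the gap the comment is pointing at.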
>intel releasing 10nm in 2018
Should do a video on how AMD is Destroying Intel? And despite AMD achieving 7NM B4 intel!?! You still only show Intel's Grossly Overpriced Crap in your video?... LOL!...
@@RawLu. intel make their own die, amd and nvidia dont
Was gonna switch to other video, nope, stayed here for HOME
You talk so fast it seems like a computer-simulated voiceover. Humanize the information more. Like, what's the ultimate goal here? Why the need to go toward the Planck scale? Give a useful example of something that could be made at that scale that currently can't be made at 7 nanometers. Why is 5 nanometers a useful goal? How will things change when that breakthrough is achieved?
Did you watch the video? He said it will let you fit more transistors in a chip, making it more efficient, generating less heat, etc.
alt f4
I understand that, but why isn't the current standard good enough? Again, give an example of something they want to build that they are unable to build currently. Or is it just about speed? Why is 7 nanometers too slow for a new application?
@@gl7011 I know this is a late reply, but here goes: think 50 years back. You could've asked that same question, and no one would have been able to tell you that we need more powerful processors to run fluid dynamics simulations, finite element calculations for engineering, and the amazing CG effects we can see in cinemas today, simply because they couldn't even imagine it. Even if just one step forward doesn't immediately bring about huge changes to daily life, looking back after a few decades shows just how far we've come by taking one step at a time.
@@qtrg5794
Thanks for that. I'm much more informed now than I was just 6 months ago when I posted that question. Since then, more and more has been written about the dawn of artificial intelligence and its potential impact on life as we know it. I'm sure the area of AI will greatly benefit from these new breakthroughs. Computers will be able to perform in ways that mimic, if not surpass, the human brain. We live in very interesting times.
The example here is the human brain. Although neurons are slow by silicon circuitry standards, the brain does not particularly rely on the speed of neurons, but rather massive parallel processing. To this extent, you can regard the end of Moore's law as a good thing. We are still in baby steps as far as PP goes. The reason that GPUs do so well today is that image processing and graphics are problems well suited to parallel processing.
Slow down! You're going too fast even for people with multiple EE and CE degrees. If your intention is to educate and inform, slow down. If your intention is to sound cool, then just show some pretty videos and speak fast!
I'm a random 17 yo dude, and I still understand what he's saying, I don't really see where's your problem.
Meta Cube i think he's trying to complain about an obvious flaw with the video, while attempting to passively flex lol
Hey, may I ask for the sources of the info used in the vid? As a student in computer systems and technologies, I am really curious to read them myself! Awesome vid.
I think it is worth pointing out that further miniaturization won't necessarily enable more IoT technology. The ESP8265 (not 8266) is available for pennies when ordered in bulk, and offers loads of performance in a fully integrated 5mm package. You could make an IoT ring with this and an OLED display as a jewel for less than $10, and this is on 40nm. These things are so cheap that they are disposable. They even offer capacitive touch on the chip, meaning you don't even have to pay to integrate buttons.
Moore's law is not a "law"; it is an observation. I don't see why the end of this observation is such a huge deal to everyone.
Because if we want more power, we will have to use something other than silicon chips. Someday scientists will figure out how to do that, I hope
@@fer5839 Well, as time passes, technologies evolve. It happened with vacuum tubes, it happened with floppy disks, and so on. It is no surprise silicon chips won't do it anymore. Isn't quantum computing the answer to that?
@@Pspet Quantum is a lot harder than silicon computing was back in the mid-20th century
You deserve way more subs/views tbh.
very good video man, keep it up!
This was 3 years ago... y'all realize we are living in the future?
How about multi layered CPUs or/and calculating with light?
Does anyone know of a good video that explains the thing about gate transistors better: how they work, why they start having problems at smaller scales, etc.? This is like the third video I've watched trying to get into the details of that stuff, but all I get is always the same CG video with the fancy cubes that seems to be done in an ArchiCAD renderer. Is there any detailed, precise, non-oversimplified explanation out there? I mean, I'm a computer scientist, but I have no clue about the physics behind this problem (beyond the repeated phrase "it's because of quantum tunneling", which doesn't explain anything). Thanks!
ruclips.net/video/RF7dDt3tVmI/видео.html
ruclips.net/video/rtI5wRyHpTg/видео.html
I hope these videos help explain. A person can spend years learning to understand the mechanics of this subject but this should be a good place to start.