Moore's Law Is Ending... So, What's Next?
- Published: May 26, 2017
- Scientists are engineering a new, more efficient generation of computer chips by modeling them after the human brain.
Can Supercomputers Predict The Future? - • Can Supercomputers Pre...
Discovering The Hidden Treasures of Mauritania's Deadly Sahara Desert - • Discovering The Hidden...
Sign Up For The Seeker Newsletter Here - bit.ly/1UO1PxI
Read More:
'Artificial Synapses' Mimic Neurons, Hint at Brainy Computers
www.seeker.com/artificial-syn...
"A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say. The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse."
How Quantum Computing Will Change Your Life
www.seeker.com/quantum-comput...
"Over a century ago, the advent of quantum theory rocked humanity. Now, we can manipulate the quantum world, opening our eyes to a powerful new age featuring quantum computers and quantum cryptography."
Self-learning neuromorphic chip that composes music
phys.org/news/2017-05-self-le...
"Today, at the imec technology forum (ITF2017), imec demonstrated the world's first self-learning neuromorphic chip. The brain-inspired chip, based on OxRAM technology, has the capability of self-learning and has been demonstrated to have the ability to compose music."
____________________
Seeker inspires us to see the world through the lens of science and evokes a sense of curiosity, optimism and adventure.
Watch More Seeker on our website www.seeker.com/shows/
Subscribe now! ruclips.net/user/subscription_c...
Seeker on Twitter / seeker
Trace Dominguez on Twitter / tracedominguez
Seeker on Facebook / seekermedia
Seeker on Google+ plus.google.com/u/0/+dnews
Seeker www.seeker.com/
Sign Up For The Seeker Newsletter Here: bit.ly/1UO1PxI
This episode of Seeker was hosted by Trace Dominguez.
Written by: Lauren Ellis.
We'll need moore technology!
...I'll let myself out..
acousticpsychosis ....dude
acousticpsychosis "let" yourself out? What?
moore research is needed!
Naked guy, if I have to explain, it's no longer funny, and it wasn't funny in the first place. Don't put me in that position!
acousticpsychosis that's actually pretty funny😂
Some people say we need Less Laws, not Moore Laws.
menofletters.
Master Therion Shakespeare liked puns...but he is dead so stop!
olly smith LOL Best come-back reply I've received in some time ^_^
As You Like It, I will go. But parting is such sweet sorrow...
Master Therion
**sigh...** thumbs up
Master Therion out damn comment out!
Moores law states that everything except my internet speed will increase exponentially.
basil katakuzinos unless you are in Korea
lol my internet speed increased by 500% in 15 years, not every 2 years, and there was a 300% increase this year, but finally I have decent internet. xD
@@gaming4K dayum mine is like 1mbps download speed and 0.4 upload speed.
@ODIN Force wtf kkkkkkkkk
Moore's 2nd Law:
"People will start saying that Moore's Law is ending every two years".
hahahhahaha
2024 here we come!
Fidget Spinner as World War 3 weapon is next
well, it's already in action since it's making kids autistic
Humans keep forgetting WW3 already happened.
Autism is not something that you can make someone be. Well, unless you pass it on to your own grandchildren with stress damage to your germ line.
Pink Program Has world war 3 happened? No, because a world war is when large countries around the world fight each other in an all-out war.
Pink Program, we need an autism vaccine!
I wonder how antivaxx would react to this.
But can it run Crysis?
the ultimate question for computing power huh?
TheDesertScrub Soon all phones will become mini quantum computers.
Devonzell Pernell No, they won't, quantum computers are only faster at select tasks
True, quantum computing is not conventional at all and requires about -270 degrees Celsius to operate.
The real question is......can it run minecraft?
how much dedicated wam do i need to run a minecraft server
Weapon X *ahem* dedatated wam
TheSomeoneXD lmao 😂😭😭😭
About 9000 wam's should do it
Dedotated waaAAAAMM!
It's more about the bam.
The thing going through my mind watching this, especially at the mention of combat machines and learning to decide for themselves was Isaac Asimov's Three Laws of Robotics and his other thoughts on the topic. He warned us about the dangers of AI, and we are clearly still ignoring that and seeking it anyway.
Not enough 4:20 comments cmon guys
AtomicBacon568 Hold up, wait a minute. Let me put some Kush up in it.
Barbara Punkelman inhale, exhale, inhale, exhale
AtomicBacon568 I wish I was at home...
The video stopped bro 😵
420 watcha smoking
Well the storage has increased over the years, but now I don't have enough storage in my phone
Actually the slowing down thing is because the phone was not built for the new updates.
Person above this one is a conspiracytard
On the internet, people do believe jokes like that.
Yes the logic is undeniable but it's also retarded. One can sue over that type of thing, did you know?
Conspiratard, please. If they did that they'd be disrespecting basic consumerism laws and rights.
You used Apple and Samsung as examples already. That is talking about specific brands.
If anyone is interested, I've just uploaded video which shows how transistor count changed from 1971 to 2020, check it out!
ruclips.net/video/Glk1Osql1KQ/видео.html
You forgot Graphine based chips.
Not Yet! AMD is planning to launch a 7nm microarchitecture processor next year, an upgrade from the current Ryzen architecture. Probably they will just ditch silicon as the material of choice, since if a silicon semiconductor is shrunk even more, it will stop containing electrons in the transistor properly.
Something that's a better semiconductor, heat resistant and somewhat cheap to manufacture. There might be a few materials, but I can't pinpoint anything exact. It's up for speculation.
It will still be the end, I am afraid. At that distance, quantum tunneling is a real danger; it can't last forever. With a better material you can probably do 3-5 nm... then what? A complete stop!
Smiltis, Time to use Germanium!
Oh, and I forgot to mention that Intel isn't far behind either. They will manufacture chips with their new 10nm process pretty soon as well.
Mike D, Shrinking the transistors doesn't bring better computing power, that's up to the architecture of the processor itself. It just lets you cram in more transistors in the same volume. And also increases the power efficiency. Probably the future will be in refining the architecture itself and making bigger, more power hungry chips.
Or the concept processors from this video.
I was reading about this and I made an article on Wikipedia about it; I feel accomplished.
Gaming Gold line lmao then your article is just gonna be edited and wrecked
Oh well, RIP.
nice job.
Gaming Gold line which article?
Lol telling people that it's becoming difficult to reason out cost and price increase
Cost and price increase? The GTX 1080 was 1000, now it's 500.
@@donnabrahamworsley5857 yeah but when GTX 1080 launched it was 1000$ but when RTX 2080 launched it was 1600$...
Clown Fiesta Both of you are so wrong...
@@lessdatesmoreonmyplates1457 Because Nvidia basically has a monopoly on the strongest high-end card. If AMD or Intel or maybe someone else steps up, prices will go down.
We have actually already seen that. With the new AMD graphics cards, Nvidia announced the "Super" version, which is basically a better card for the same price, and AMD made their cards a bit cheaper to counter that. So competition is always good for the customer.
@@lessdatesmoreonmyplates1457 it's 2021 and last year Nvidia launched a new lineup called "Ampere", of which the lowest preset, i.e. the 3060, costs $600 and performs better than a 2080 Ti. We're talking about a straight 28%-30% increase in performance here.
Moore’s law continues (in its own way) via cloud computing. Using the cloud, our phones' computation ability keeps growing exponentially.
False
Quantum computers are NOT a substitute for classical computers
AlphaOmega anywhere I could find out about the application of quantam computers?
Fabros I would Google "applications of quantum computers" and "limitations of quantum computers"
Quantum computers are very different machines, comparing them to classical computers is like comparing a flashlight to a fire torch. The flashlight is better at shining light but it's a completely different device and the fire can do things a flashlight can't.
Amadeus What can a classical computer do better than a quantum computer?/What can a classical computer do that quantum computer can't do?
Tristan Harris quantum computers work on a whole different set of physics than classical silicon computers. A silicon computer works by switching transistors; a quantum computer works by manipulating qubits.
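The qubit idea mentioned above can be sketched without any quantum hardware: a single qubit is just two complex amplitudes, and a gate is a small matrix multiply. This toy example (no quantum library assumed, the function name is illustrative) applies a Hadamard gate to put the |0> state into an equal superposition:

```python
import math

# A single qubit as a 2-entry amplitude vector; a Hadamard "gate" is a
# 2x2 matrix multiply. Purely illustrative simulation on a classical CPU.
def hadamard(state):
    """Apply the Hadamard gate to a (amp0, amp1) qubit state."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1.0, 0.0)            # the |0> basis state
superposed = hadamard(zero)  # equal amplitudes on |0> and |1>
```

Applying the gate twice returns the qubit to |0>, which is one small way classical simulation differs from nothing-up-my-sleeve magic: every quantum gate is reversible.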
every new technology will always have a combat use.
It's like we are always thinking of more creative ways to kill each other.
bang on
TheMastorio No that's just 'Murrica
Well abhijeet... they are kind of the leading front in tech so it's only natural they'd be first to tinker with it
abhijeet bharguv People did this long before America was founded.
so other countries don't have army?
I thought he was talking about the car taking you for ice cream after _taking_ a dump. I had to play it back to hear that he said after _getting_ dumped. XD
0:54 joke's on you Trace, I'm using a desktop computer to watch this
Moore's law, RIP 1965-2025, you will be missed
+
Its 2017
TheMastergabe ending, not ended
MEOW AWESOME 😭
Farouk Musa yeah but it's not much of a law if it isn't definitive. it's more of a theory or idea
0:17 from Samsung to iPhone is not progress.
Max Mark Why do you kids always have to make a comment war about a phone? A thing that was created to make your life easier, not to worship.
LoadGamePL 1(It's a joke) & 2 (I'm probably older than you)
remember I said probably
Max Mark k
LoadGamePL k & good morning
Still waiting on graphene chips. I'd be willing to switch over to looking forward to neuromorphic chips though. Though it would be awesome if the two technologies could combine.
Quantum Computing + Reverseable computing + Human Neuromorphic computing.
omfg that is the best shirt i had ever seen
*It checks itself before it wrecks itself...*
*Mutated DNA is bad for yo health..."*
Cool, but when will we have cheap earphones that won't break every month?
Low Key the day pigs fly. Cheap will always be shit.
Low Key truth!!!!
Get detachable earphones.
Sound magic are good apparently.
Have had my 5 euro earphones for 3 years, no damage whatsoever..
I believe they will double the amount of cpus in a device in the meantime.
There is still a lot of room for improvement, such as display quality, refresh rate, and switching from LCD to MicroLED. Also the transition to graphene chips, batteries and lightweight parts.
*Graphene can do anything but Coming out of Lab* ......
It's amusing how people think they can fully inform themselves from Sci-Fi. We aren't going to have phones which are like us; they don't lose and gain neurons like us because they would not be living organic systems, they would have a static neuron count. Furthermore, the way our cognition exists is split into parts with different functions in our brain, and the network they're discussing is completely different in structure. Just because you have a neural network doesn't mean you have consciousness, and too many people think one equals the other, which is a false equivalence. There are different types of neural network, and none of them have created human-level consciousness, which is frankly useless to us. We already have our own consciousness in abundance; these neural networks are simply for optimising an AI to perform a task, not think about its life goals. Even General AI, which is often talked about, is not truly "conscious".
Mr マックラ But what is contiousness then? How do you know if something is contious or not?
Mr マックラ if we based everything on the visions of Douglas Adams' h2g2 we'd be better off... plus.. Marvin!
Geolo2000 That's exactly what I'm talking about. The only thing we could do right now to create consciousness is 3D print a human brain, but even that is not feasible accurately because there are so many areas to specify for different chemical parts, maybe trillions of synapses and it cannot restore itself outside of the human circulatory body... our consciousness is so entangled with our bodies, and how we renew our dead parts through time, that it's constantly in physical change. Electronics are not, they don't grow and lose parts, they don't develop in the physical sense... So an electronic consciousness, if it can be achieved, would be very different.
But what I'm really saying is, there are no plans to create artificial people, because it doesn't have any value. The same problems we have with ordinary people would emerge, the same question of rights, of self-interest; this is simply not desirable in AI. Even a general AI that can carry out multiple tasks with advanced neural networking, may not have consciousness
Never say never... a decade ago, we all said speech commands were impossible with all the dialects and how unique everyone's voice is... well, look at your phone now.
Totally agree with this comment. Ppl are so slap happy with this kind of stuff they just eat it up. "Omg the future, ai and robots and cell phones that can order my food from Taco Bell for me! Wow! What a time to be alive!" Ppl are so bored with their lives they'll believe anything they hear or watch
transistors today are not 14nm across!!! the photolithography process is 14nm
nou
It's amazing how few people actually pointed that out and how few paid attention.
Yep. Global Foundries admitted that along with TSMC.
5nm
What does this mean? Anyone can explain?
You just explained von Neumann architecture to me better than two years of GCSE computing. Thank you
I see where this is going. The video went from Moore's Law to computer chips designed after the human brain. As long as you're not putting them inside of humans and changing their minds. But self learning and correcting computation is a great idea for the future.
neuromorphic computing sounds like a really bad idea
Michio kaku mentioned molecular transistors. Would be cool to have quantum computers with that new memory system model, that would be OP.
Michio Kaku mentions a lot of bullshit, and you should take everything with a grain of salt. He's a physicist, not an engineer, and there's a difference between what's physically possible (i.e. its working principle is correct in theory) and what's possible in engineering terms, i.e. what can be mass-produced correctly and cost-effectively. The technologies that work to make a prototype device might not work for mass production (e.g. really small transistors like 1nm have been produced as prototypes, but it's currently impossible to mass-produce them). Additionally, just because we can make something doesn't mean it's useful, and molecular transistors fall in that category: they are so small that electrons can basically quantum-tunnel to the other side, so they're basically useless as a switch.
TravelerInTime Um, physicists can be engineers too; in fact physicists have more knowledge in the field than engineers, and most of our things were invented by physicists or came from the ideas of physicists, not engineers.
Your Waifu Sucks, true, engineering is very closely related to applied physics, but all I'm saying is Michio Kaku isn't thinking like an engineer. He's thinking more like: "does it violate any known laws of physics? no? then we can use it!". Which is a great attitude if you want to do sci-fi (which I also love btw), but not a great attitude if you want to make actual products. Case in point, just because you can build a molecular transistor doesn't mean it's gonna work great (or that it can even be integrated) in actual devices, and just because you can prototype it doesn't mean that you can mass-produce it.
I've never once seen moore's law when using a computer, but murphy's law never fails to show up
Trace, you rock! I love learning from you and the rest of seeker. Please keep up the awesome videos!
Make 3D chips to keep moores law going
Nicotine Oob I don't think you understand.
Well, it's not that we can't do that; it's more like we can't get more performance from the same amount of silicon. This would increase the cost of the chips significantly, which defeats the point of Moore's law: increasing performance while cutting cost.
Hmmm, 3D chips, HMMMMM, this might really be a good idea, but still, Moore's law states that transistors get smaller every 2 years, not that chips get better or faster, just the size of the transistor :P
Water-cooling pipes in the CPU
light based computers
no
Square Photonic chips would be awesome
Neuromorphic might be a little far-fetched. We already made the first quantum processor in Germany, but the problem we are currently facing is that it takes a huge super freezer just to cool the processor down enough to function. We still need quantum graphics cards and everything else in order to make the computer function, but if we find a way we can make a super computer.
That's such a cool concept...neuromorphic computing!!
You guys have a "sister channel" ? last time you sell Seeker Daily to "NowThis", what next ? SeekerVR become "Buzzfeed VR" ?
And they'll call it "Skynet"
Moore's law has been found to be part of a much larger trend that goes back over 2,000 years. When they used vacuum tubes, nobody expected transistors, and there is most likely a technology few people know about, or is going to soon be invented, that will keep the long term trend moving. Software can be made more efficient, and chip architecture could be improved. It's happened in the past. (And the chips he is talking about are just that.)
Honestly, non-quantum ternary computers (also known as trinary, or base-3; in addition to 0 and 1 they have a -1) would be an advance, and they have existed since the 1950s. Punch cards and other storage methods really favored binary, which negatively affected research into ternary computing in the past.
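The balanced-ternary idea mentioned above (digits -1, 0, +1 instead of binary's 0 and 1) is easy to sketch; the helper names below are illustrative, not from any particular library:

```python
# Balanced ternary sketch: digits are -1, 0, +1 (often written -, 0, +).
def to_balanced_ternary(n):
    """Return the balanced-ternary digits of n, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:       # 2 ≡ -1 (mod 3), with a carry into the next trit
            r = -1
            n += 1
        digits.append(r)
        n //= 3          # floor division keeps negatives correct
    return digits[::-1]

def from_balanced_ternary(digits):
    """Evaluate balanced-ternary digits (most significant first)."""
    value = 0
    for d in digits:
        value = value * 3 + d
    return value
```

One nice property this shows: negative numbers need no separate sign bit, e.g. `to_balanced_ternary(-2)` gives `[-1, 1]`, which is -3 + 1.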
I have to say, my phone from 2013 is incredibly different from my 2011 phone, but isn't that different from a 2017 phone.. And my computer is from 2009 and completely squashes the computer I got in 2005, but looking at the computers on sale now hardly anything has changed in the 8 years since I got it. I definitely noticed that the raw power of electronics hasn't been changing much this decade, compared to last decade where I noticed a big difference as each year passed. I would HOPE that people would feel less inspired to replace everything they own so frequently now, but they'll probably maintain the illusion that their 2 year-old tech is obsolete anyway..
look im pretty happy with the tech we have tbh
You're happy with the WAY it's used now. There are so many other areas that could use better tech.
@@squamish4244 No, I think this person is happy with the tech he has.
I mean the way it is used. We have all this incredibly powerful tech, and so much misery and cruelty in the world. Why aren't we using tech to help humans be less miserable (in their chattering, never-satisfied, argumentative, angry, fearful, depressed minds)?
@@squamish4244 Great. Just dont invalidate something someone else says because of your beliefs.
RUclips is for discussions, right? If we can't question other's beliefs, then that 'invalidates' like half of the point of comment sections. It make progress impossible too.
Nice video length
Last year, researchers reported building a flash device that included layers of graphene and molybdenum disulfide, both of which form molecular sheets a single atom thick. But these devices required several layers of these materials to work, so the charge ended up stored in several stacked sheets of graphene. The Crystal chip also shows potential, but there just isn't enough R&D being put into these new technologies where it's going to make much difference. :/
I miss using my Moto Rzr!! I do still have mine somewhere around here. I think!
moores law: obey me
apple and every big pc company: hmm oh rly?
I want that T-shirt
Great video!! Thanks for sharing. I definitely miss my old Motorolla RAZR.
Neuromorphic chips sound a lot like FPGAs, which are essentially a matrix of memory cells used as look-up tables that serve as digital logic or memory depending on what the application needs. It's hard to see how these could surpass the density and speed of the present, simpler designs that are fixed when they are manufactured and don't carry the overhead of being reconfigurable. There still are smaller processes on the horizon, 7nm and smaller, but the physics required for design and manufacturing is becoming exponentially more difficult, and the time and cost to roll out these processes is much higher than for previous process improvements. Improvements in memory tech will ease the pain of Moore's law breaking down: gigabytes or even terabytes of non-volatile memory running at L2 cache speeds will make computers faster and change the way operating systems and software are designed. Fundamentally different devices such as memristors will eventually supplant the current silicon-based transistor logic, but it will take years or decades for them to catch up with the mature state-of-the-art silicon designs.
Wow when 10+ comment first what is the point
My take on that is that they must be attention-seekers to some degree. ;)
Man, this neuromorphic computing is creepy as hell. I don't want a computer that can do whatever the f*ck it wants; I want to control it as usual and not have a living thing in my room.
The new chip will want more porn and video games and pseudo science
Except allowing a computer to learn on its own is effectively the same as letting it reprogram itself.
I don't think it would learn on its own unless you programmed it to do really abnormal things without an input from someone telling it what to do (it won't be AI, just different computing)
David Santiago Barreto Mora but isn't it a living room?
It's time that people figure out what constitutes a mind. Learning is an umbrella term. The architecture should include a fixed part so it will always try to serve us. That fixed part is comparable to biological instincts.
IBM made a working 1nm transistor a while back using carbon nanotubes, so I think we still have a few years left for Moore's law. After that you are probably looking at 3D graphene, super diamonds, or a combination of a few different things to get better performance out of a chip. I have no idea which way the industry will end up going, but what I do know is that if I were to take one of the next-gen chips back in time and give it to myself as a kid, I could claim it was from an alien spaceship and people would believe me. That makes me happy for some reason lol.
Yas! I still use mine! I pay $20 a month to keep it as my "business phone" hahah
I watched this with my smartphone
hornetluca k
Congratulations
Checks Itself Before It Wrecks Itself....
Trace asked a question late in the video about our digital assistants (they are more than phones nowadays). I carry a Samsung flip-phone and teach 8th grade science. My students laugh whenever I take it out to check the time. But then I impress upon them that the only reason I have it is to (1) make calls and (2) receive calls but mostly to check the time (I no longer wear a wrist watch). And then I drop it on the floor. GASP! Generally it survives with no problem but occasionally the battery cover comes off -- and I just pop it back on. I then ask my students, "Can your phone do that?" My point is that the technology is "good enough" for my needs, I don't require/want anything more elaborate or expensive. Some day, I might need something else (for example, to direct my autonomous flying car to pick up a replacement flux capacitor) but that day has not yet arrived.
Yes, because silicon, the element used to make the chips, can only hold/do so much. This has been coming for a while; however, there's research going on to use a different element to make chips out of.
How about making desktop computer chips 10% larger? Sure it wouldn't solve the problem, but they could fit a ton more transistors in that 10% extra space.
You would pay 10% extra so nothing changes.
How about 20%? Why not 40%? Why did you choose 10% ?
Larger means more material, and heavier/clunkier logistics, thus higher costs. Also don't forget that these things getting smaller increases our capability to cool them. Making them large once again, increases cooling requirements.
Good Fortune chillax. They say die size is Ryzen.
I for one welcome our new robot overlords.
lol nice Simpson reference. XD
I still love my Motorola Razr very much and recently sent it in for repairs; I hope spare parts can still be found these days.
Von Neumann is pronounced "phonn noimun" by the way ;) These neural chips could be really interesting.
and that is how you get skynet
What's Next?
A computer that can learn to say "Get me to da choppa" like Arnold
Between current chips and quantum/neuro processing there's a much more immediate stopgap measure that will probably happen much sooner, which is a redesigned CPU architecture that's still using silicon transistors. Current Intel CPUs are based on the x86 standard, which carries with it 30+ years of baggage. It's an inefficient design kept around for backwards compatibility. Ditching it and making something more efficient from the ground up would use less energy, which would mean less heat, which means faster clock speeds and more cores, which could continue to eke out gains from silicon following Moore's law a bit longer...
0:04 I actually have that phone (yeah, I still have it, though I don't use it anymore), the Nokia 6021. Amazingly, the Facebook Java app for that phone that I downloaded around 2010 still works (last updated in 2012, I guess). For an almost 11-year-old phone, that's unexpected. Well, the last time I tried powering it on and using Facebook was last January, I guess; I should give it a try again!
I think
A.I. is a real concern, but then again, you can't stop technological advances for a couple of murdering robots 🤷♂️
Guy who joins the community of people scared of ai for no reason
When you realize you are living in the generation to best experience the peak evolution of tech
Aye, the Nokia 6020 at the beginning, one of my favorite devices back then
Don't worry about it. Why the rush to have more computing power?? More gadgets to play with? We'll be fine.
Yeah, just that anything mimicking the human brain and "learning" will be prone to errors, and that is not what we need in computers. Imagine a game running on one of those: crashes, glitches. We have to stay on the current architecture, but instead of going smaller (chip size) or bigger (more chips) we have to go faster, with states that can be quickly switched (electrical current is not ideal for switching, and wires have some resistance in them). Also, for a long time our circuits have been topologically 2D boards, with layers, but still 2D. Imagine if chips could communicate with adjacent chips right below or above them, as well as in any other direction. You'd map every node and switch it on the go, essentially rewriting its circuit, and since the components are so fast it wouldn't even be noticeable. You'd have a fraction of the chips running at the theoretical speed of light, with CPU and memory becoming one. Controlling each node, you could fork and backtrack results into the chip itself, rewriting only its logic without altering any of its physical properties. It's still sci-fi, but it's not a long way from here.
I like your version. Not sure how deeply you might have gone with this in your mind, but I've been studying this for the past 15+ years of my life with insane devotion, and it proves to be a key part of the best-functioning methods for our future general-purpose computing machines and multi-functional miniature systems, while at larger scale (as in supercomputers) there is still too much to be seen in order to doubt or not doubt the presence of this system.
i knew that someone will say something related to skynet in the comments before watching the video ......
K254 It's you.
"Neuromorphic" is a change in architecture that would only give us a one time boost in potential computing power. And it would only benefit a special area of computational tasks. Like a graphics card
My brain forgot everything you said 3:00 in. Don't know if this neuromorphic thing will work.
skynet is here
When a machine learns how to disobey its own algorithms, it will be the end of the world.
It is already done - every serious computer science student writes such an AI ;-)
I remember when cell phones came in a briefcase and looked much like bulkier sat phone units.
1:00 that is the distance between each transistor, not the size
I like how now iPhones are an upgrade to Samsung lmao
The government just keeps taking and it's frustrating. Why are the American people not allowed to vote on it? You know the government is taking too much power from us when they just "end" things like Moore's law.
because america is not a democracy.
Nico Flihan what are you on about? No one is ending Moore's law; it's simply an idea that is increasingly less sustainable due to things like size and energy consumption. If anything, you should be happy about any new revised system, as what we use is basically ancient technology at the rate we already progress.
It's almost like the Constitution is a 'living document'.
Nico Flihan America isn't free anymore. The freedom we think we have is a fabrication. That's why the government keeps us 100 years behind what they have. Would you let your dog live better than you do? The government won't either. Arf arf. Sucks,but it's true. We are little more than cattle to the government. Very little more.
Kai Miller sage trolling bro
no mention of skyscraper chips, nanotube processors or anything like that?
So pretty much what the iPhone X has with its A11 Bionic chip?
Yay finally we'll go towards the end of the world! At last!
Of course the biggest provider of microchips is going to say that it's getting more and more difficult to produce them...So they can *OBVIOUSLY* sell them for higher prices.
Yeah, thermodynamics is all a big conspiracy by Big Tech.
CrashDavi knowing Intel, it probably is. They literally released the exact same architecture for 3 generations in a row for a slightly higher price each time, only adding a couple extra cores to the CPU when a hint of competition showed up. They stopped giving meaningful improvements the day they gained a monopoly. I'm aware the limit of silicon is approaching, but I doubt it's as big a deal as Intel says; they'll keep shrinking until we get there. Hopefully by then, in the early 2020s, some breakthrough in a different semiconductor will be achieved to replace silicon.
Khoi Pham Xilinx is already testing 7nm, and have designed 5nm.
Birki gts I heard that people are looking into quartz microchips to replace silicon ones
Primitive west. shameless people of west copied from india vedas. WE ARE GOING TO SUE THE WEST. Vedic scientists like Razor Skidrow, Logical Hindu, Magical indian put lots of video evidence on youtube. We will form a society and sue the west for copying our ancient hindu scriptures
We are going to have to figure out something different in order to continue at this rate
Actually, it wouldn't be incremental, it would be exponential, as incremental change usually means a fixed scale, whereas the doubling of transistors is an exponentially increasing scale.
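The distinction drawn above can be made concrete: incremental growth adds a fixed amount per step, while Moore's-law growth multiplies. A tiny sketch, where the 2,300 transistors of the 1971 Intel 4004 is the only real number used and the projection is purely illustrative:

```python
# Exponential vs incremental growth: Moore's law doubles the transistor
# count every ~2 years rather than adding a fixed amount per step.
def moores_law(start_count, years, doubling_period=2):
    """Projected transistor count after `years` of doubling."""
    return start_count * 2 ** (years / doubling_period)

# Intel 4004 (1971): ~2,300 transistors. Forty years of doubling every
# two years is 20 doublings, i.e. a factor of 2**20 ≈ one million.
projected = moores_law(2300, 40)   # ≈ 2.4 billion
```

Forty years of *incremental* growth at, say, +2,300 transistors per year would reach only ~94,300, six orders of magnitude short, which is exactly the commenter's point.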
People's memory and ability to learn are horrible. Why would a computer be modeled after a human brain?
TrueReality Wtf are you on about? The human brain is more complex and more adaptive than any computer. An average human brain is at least 1000 times as fast as our fastest supercomputers.
No it's not.
Quespa are you stupid?
You make huge logical leaps in your video. How do smartphones lead to Intel chips, and how do neuromorphic chips (hardware machine learning) lead to faster electronics? Your argumentation is incoherent.
+Omegapede Prime the video is only 4 and a half minutes long...
And because it's only four and a half minutes, like Diego said, he has to connect the dots with leaps like that, or thousands of the fools flooding the comments with insults wouldn't even bother to watch half of it. You have a good point, but please be more positive, and you'll see there's a lot more to it than what four and a half minutes can fit in.
Robert Tompson then don't have it be four minutes. Let it go above and beyond, by the power of over 9000!
1:04 "...and each transistor is about 14nm across, that is smaller than most human viruses"
To correct him: this nanometer measurement refers to the transistor gate, which makes up only a portion of the transistor, not the whole transistor unit. There are no commercial consumer-grade chips out there today where the total size of a transistor unit is only 14nm (probably only in engineering labs working on 7nm architectures or even smaller).
And how small is 14nm? It's hard to give one answer, since it depends on which atom we're dealing with and the pattern they make up, but it's somewhere between 25 and 40 atoms across.
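A rough back-of-the-envelope check of that atom count (Python sketch; the two atom spacings are assumptions, roughly the silicon lattice constant of ~0.543 nm and a tighter effective spacing of ~0.35 nm, since the true answer depends on the crystal direction):

```python
# Rough estimate: how many silicon atoms span a 14 nm feature?
# Assumed spacings: the silicon lattice constant (~0.543 nm) and a
# tighter effective atom spacing (~0.35 nm). The real answer depends
# on the crystal direction, which is why the range is so wide.

FEATURE_NM = 14.0

for spacing_nm in (0.543, 0.35):
    atoms = FEATURE_NM / spacing_nm
    print(f"spacing {spacing_nm} nm -> ~{atoms:.0f} atoms across")
# -> roughly 26 to 40 atoms, consistent with the 25-40 figure above
```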
Would a tensor processing unit not count as a sort of neural mimicking chip?
This was meant to be a joke BUT...
I have a bad sense of humour.
Moore's law is about CPUs, not memory.
guildarius wrong, it's about transistor size.
Moore's law is about transistor density.
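To sanity-check the density-doubling framing above, here is a quick Python sketch projecting forward from a well-known historical data point, the Intel 4004 of 1971 with roughly 2,300 transistors (the two-year doubling period is the common restatement of Moore's observation, used here as an assumption):

```python
# Moore's law, common form: transistor count doubles roughly every 2 years.

def projected_transistors(start_count, start_year, end_year, doubling_years=2.0):
    """Project a transistor count forward under periodic doubling."""
    doublings = (end_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# Intel 4004 (1971): ~2,300 transistors.
proj_2017 = projected_transistors(2300, 1971, 2017)
print(f"{proj_2017:.2e}")  # ~1.93e10, i.e. on the order of 10 billion
```

That projection lands in the billions, the same order of magnitude as the largest chips shipping around 2017, which is why the "law" held up as a forecast for so long.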
While the newer technology is underway, for now I think manufacturers should focus more on software optimisation to make a better experience for users.
I used a Razr until this year, when AT&T dropped 2G service. Though to be honest, I loved my BlackBerry Curve the most. Five days of battery life.
Where are the facts in this video? Firstly, why would we as consumers ever need more computational power in the first place? There might be a reason, but why risk splitting the market? Secondly, quantum computers are never going to be mainstream: you have to operate near absolute zero, plus all this other technical stuff, and quantum computers are just really good at solving one problem of high difficulty, while a normal CISC-based computer could complete it in one or two clock cycles. And thirdly, computers with synapses are simply not there yet and probably won't be for at least 50 years. The real next thing after Moore's law is ARM-based high-core-count chips, with maybe 256 cores or more. Just look at GPUs: they are basically a very parallelized CPU with lots of low-power cores specialized to do math. Why don't you research what is real and stop with this sci-fi "quantum computing is the way to go" bullcrap?
Although I mostly do agree with your comment, I still believe quantum computers have a good shot, and we should wait for them with open arms to make sure we don't miss them. From what I see in the mathematical and logical development of today's technology, the quantum computer might actually be the last one I could count on to make the "quantum leap" from consumerist, quantity-over-quality tech to truly efficient machines that no longer depend on size or quantity to prove their worth. Simplicity of use and maintenance would come down to a personal choice based on efficiency, need, and interest, so people would only get what gets their job done, while science makes a quantum leap away from the consumerist market, which is degrading, not to say degenerating, our marketing and progress aims in different ways on a very large scale.
It could take a book to explain why quantum computers, when they're obviously not superior to the version discussed in the video, and possibly not even compared to the ARM approach, if I'm getting it right. But at the end of that book I would still simply close with "I hope it happens so I could tell you 'I told you so,' and we'd all be happy about it." Well, except today's marketing advisers and some of their bosses. :)
S.K.Y.N.E.T
(Insert full-form here; if you are smart.)
So Kitty, You're Not Extra Terrestrial
Interesting and informative video, great job.
I never owned a cell phone in my life. But let me know when they're done, and I might pick one up then.