Moore's Law Is Ending... So, What's Next?

  • Published: 26 May 2017
  • Scientists are engineering a new, more efficient generation of computer chips by modeling them after the human brain.
    Can Supercomputers Predict The Future? - • Can Supercomputers Pre...
    Discovering The Hidden Treasures of Mauritania's Deadly Sahara Desert -
    • Discovering The Hidden...
    Sign Up For The Seeker Newsletter Here - bit.ly/1UO1PxI
    Read More:
    'Artificial Synapses' Mimic Neurons, Hint at Brainy Computers
    www.seeker.com/artificial-syn...
    "A brain-inspired computing component provides the most faithful emulation yet of connections among neurons in the human brain, researchers say. The so-called memristor, an electrical component whose resistance relies on how much charge has passed through it in the past, mimics the way calcium ions behave at the junction between two neurons in the human brain, the study said. That junction is known as a synapse."

    How Quantum Computing Will Change Your Life
    www.seeker.com/quantum-comput...
    "Over a century ago, the advent of quantum theory rocked humanity. Now, we can manipulate the quantum world, opening our eyes to a powerful new age featuring quantum computers and quantum cryptography."

    Self-learning neuromorphic chip that composes music
    phys.org/news/2017-05-self-le...
    "Today, at the imec technology forum (ITF2017), imec demonstrated the world's first self-learning neuromorphic chip. The brain-inspired chip, based on OxRAM technology, has the capability of self-learning and has been demonstrated to have the ability to compose music."
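
The "law" in the video's title is an empirical observation: transistor counts on a chip double roughly every two years. A minimal sketch of that compounding, using the commonly cited ~2,300-transistor Intel 4004 (1971) as the baseline (an illustration of the growth rate, not a forecast):

```python
# Moore's law as pure compounding: counts double every `doubling_period` years.
# Baseline: Intel 4004 (1971), ~2,300 transistors -- a commonly cited figure.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Project transistor count assuming one doubling per `doubling_period` years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2017):
    print(year, f"{projected_transistors(year):,.0f}")
```

Twenty doublings (1971 to 2011) already put the projection above two billion transistors, which is why even a modest slowdown in the doubling period compounds into a huge gap.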
    ____________________

    Seeker inspires us to see the world through the lens of science and evokes a sense of curiosity, optimism and adventure.

    Watch More Seeker on our website www.seeker.com/shows/

    Subscribe now! ruclips.net/user/subscription_c...

    Seeker on Twitter / seeker

    Trace Dominguez on Twitter / tracedominguez

    Seeker on Facebook / seekermedia

    Seeker on Google+ plus.google.com/u/0/+dnews

    Seeker www.seeker.com/
    Sign Up For The Seeker Newsletter Here: bit.ly/1UO1PxI
    This episode of Seeker was hosted by Trace Dominguez.
    Written by: Lauren Ellis.
  • Science

Comments • 1.7K

  • @brokenacoustic
    @brokenacoustic 7 years ago +1433

    We'll need moore technology!
    ...I'll let myself out..

    • @Tunir007
      @Tunir007 7 years ago +4

      acousticpsychosis ....dude

    • @NakedAvanger
      @NakedAvanger 7 years ago +1

      acousticpsychosis "let" yourself out? What?

    • @fundemort
      @fundemort 7 years ago +21

      moore research is needed!

    • @brokenacoustic
      @brokenacoustic 7 years ago +15

      Naked guy, if I have to explain, its no longer funny, and it wasnt funny in the first place. Dont put me in that position!

    • @simonvreman
      @simonvreman 7 years ago +6

      acousticpsychosis that's actually pretty funny😂

  • @Master_Therion
    @Master_Therion 7 years ago +1489

    Some people say we need Less Laws, not Moore Laws.

    • @lightice.i
      @lightice.i 7 years ago +4

      menofletters.

    • @ollysmith5447
      @ollysmith5447 7 years ago +10

      Master Therion Shakespeare liked puns...but he is dead so stop!

    • @Master_Therion
      @Master_Therion 7 years ago +9

      olly smith LOL Best come-back reply I've received in some time ^_^
      As You Like It, I will go. But parting is such sweet sorrow...

    • @jamesambrocio
      @jamesambrocio 7 years ago +12

      Master Therion
      **sigh...** thumbs up

    • @ollysmith5447
      @ollysmith5447 7 years ago +5

      Master Therion out damn comment out!

  • @basilkatakuzinos659
    @basilkatakuzinos659 6 years ago +215

    Moores law states that everything except my internet speed will increase exponentially.

    • @zipper978
      @zipper978 6 years ago

      basil katakuzinos unless you are in Korea

    • @gaming4K
      @gaming4K 3 years ago

      lol my internet speed is increased with 500% in 15 years not every 2 years and there was a 300% increase this year but finally i have a decent internet. xD

    • @thefirstsin
      @thefirstsin 3 years ago +2

      @@gaming4K dayum mine is like 1mbps download speed and 0.4 upload speed.

    • @deivisony
      @deivisony 2 years ago

      @ODIN Force wtf kkkkkkkkk

  •  6 years ago +67

    Moore's 2nd Law:
    "People will start saying that Moore's Law is ending every two years".

  • @thghee8312
    @thghee8312 7 years ago +1099

    Fidget Spinner as World War 3 weapon is next

    • @ireallyreallyreallylikethisimg
      @ireallyreallyreallylikethisimg 7 years ago +5

      well, it's already in action since it's making kids autistic

    • @PinkProgram
      @PinkProgram 7 years ago +1

      Humans keep forgetting WW3 already happened.

    • @PinkProgram
      @PinkProgram 7 years ago

      Autism is not something that you can make someone be. Well unless you pass it on to your own grandchildren with stress damage to your germ line.

    • @urielvillagran8813
      @urielvillagran8813 7 years ago

      Pink Program Has world war 3 happened? No, because a world war is when large countries around the world fight each other in an all-out war.

    • @VinceIsDatBitch
      @VinceIsDatBitch 7 years ago

      Pink Program, we need an autism vaccine!
      I wonder how antivaxx would react to this.

  • @boysenbeary
    @boysenbeary 7 years ago +1125

    But can it run Crysis?

    • @onyx1186
      @onyx1186 7 years ago +79

      the ultimate question for computing power huh?

    • @devonzellpernell8895
      @devonzellpernell8895 7 years ago +24

      TheDesertScrub Soon all phones will become mini quantum computers.

    • @thedarkshqdow
      @thedarkshqdow 7 years ago +19

      Devonzell Pernell No, they won't, quantum computers are only faster at select tasks

    • @chairmanmeaow6379
      @chairmanmeaow6379 7 years ago +22

      True, quantum computing is not conventional at all and requires -270 degrees to operate.

    • @Patri_Fides
      @Patri_Fides 7 years ago +31

      The real question is......can it run minecraft?

  • @DJrocker8000
    @DJrocker8000 7 years ago +75

    how much dedicated wam do i need to run a minecraft server

  • @TabariGames
    @TabariGames 7 years ago +12

    The thing going through my mind watching this, especially at the mention of combat machines and learning to decide for themselves was Isaac Asimov's Three Laws of Robotics and his other thoughts on the topic. He warned us about the dangers of AI, and we are clearly still ignoring that and seeking it anyway.

  • @AtomicBacon568
    @AtomicBacon568 7 years ago +811

    Not enough 4:20 comments cmon guys

    • @belainegibsson.2082
      @belainegibsson.2082 7 years ago +17

      AtomicBacon568 Hold up, wait a minute. Let me put some Kush up in it.

    • @gaw6391
      @gaw6391 7 years ago +22

      Barbara Punkelman inhale, exhale, inhale, exhale

    • @definitelynottheriddler
      @definitelynottheriddler 7 years ago +1

      AtomicBacon568 I wish I was at home...

    • @NoName-no8ti
      @NoName-no8ti 7 years ago +1

      The video stopped bro 😵

    • @not_adrs
      @not_adrs 7 years ago +2

      420 watcha smoking

  • @wolffgang101
    @wolffgang101 7 years ago +206

    Well the storage has increased over the years, but now I don't have enough storage in my phone

    • @zyibesixdouze4863
      @zyibesixdouze4863 7 years ago +4

      Actually the slowing down thing is because the phone was not built for the new updates.
      Person above this one is a conspiracytard

    • @zyibesixdouze4863
      @zyibesixdouze4863 7 years ago

      On the internet, jokes like that people do believe

    • @zyibesixdouze4863
      @zyibesixdouze4863 7 years ago

      Yes the logic is undeniable but it's also retarded. One can sue over that type of thing, did you know?

    • @zyibesixdouze4863
      @zyibesixdouze4863 7 years ago

      Conspiratard, please. If they did that they'd be disrespecting basic consumerism laws and rights.

    • @zyibesixdouze4863
      @zyibesixdouze4863 7 years ago

      You used Apple and Samsung as examples already. That is talking about specific brands.

  • @racecondition3176
    @racecondition3176 4 years ago

    If anyone is interested, I've just uploaded video which shows how transistor count changed from 1971 to 2020, check it out!
    ruclips.net/video/Glk1Osql1KQ/видео.html

  • @Cantatio411
    @Cantatio411 7 years ago +19

    You forgot Graphene-based chips.

  • @YUSoDumb1
    @YUSoDumb1 7 years ago +387

    Not yet! AMD is planning to launch a 7nm microarchitecture processor next year, an upgrade from the current Ryzen architecture. They will probably just ditch silicon as the material of choice, since if a silicon semiconductor is shrunk even more, it will stop confining electrons in the transistor properly.

    • @YUSoDumb1
      @YUSoDumb1 7 years ago +32

      Something that's a better semiconductor, heat resistant, and somewhat cheap to manufacture. There might be a few materials, but I can't pinpoint anything exact. It's up for speculation.

    • @cazymike87
      @cazymike87 7 years ago +37

      It will still be the end, I'm afraid. At that distance, quantum tunneling is a real danger; it can't last forever. With a better material you can probably do 3-5 nm... then what? A complete stop!

    • @Anarchidi
      @Anarchidi 7 years ago +11

      Smiltis, Time to use Germanium!

    • @YUSoDumb1
      @YUSoDumb1 7 years ago +13

      Oh, and I forgot to mention that Intel isn't far behind either. They will manufacture chips with their new 10nm process pretty soon as well.

    • @YUSoDumb1
      @YUSoDumb1 7 years ago +20

      Mike D, shrinking the transistors doesn't bring better computing power; that's up to the architecture of the processor itself. It just lets you cram more transistors into the same volume, and it also increases power efficiency. The future will probably be in refining the architecture itself and making bigger, more power-hungry chips.
      Or the concept processors from this video.
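
The shrinking discussed in this thread can be put in first-order numbers: if the minimum feature size shrinks by a factor s, transistor density grows roughly as s². A toy calculation (idealized; marketing node names like "14nm" no longer map directly to physical dimensions):

```python
# Idealized scaling: transistor density goes as the inverse square of feature
# size, so a full 14nm -> 7nm shrink gives ~4x the transistors per unit area.

def density_gain(old_nm, new_nm):
    """Relative density gain from an idealized feature-size shrink."""
    return (old_nm / new_nm) ** 2

print(density_gain(14, 7))   # ideal full-node shrink: 4x
print(density_gain(14, 10))  # 14nm -> 10nm: roughly 2x
```

This is also why shrinking alone does not make chips faster, as the comment above notes: the win is more transistors per area and better power efficiency, not higher per-transistor performance.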

  • @goldline7091
    @goldline7091 7 years ago +41

    I was reading about this and I made an article on Wikipedia about it; I feel accomplished.

  • @redvanderbilt289
    @redvanderbilt289 6 years ago +118

    Lol telling people that it's becoming difficult to reason out cost and price increase

    • @donnabrahamworsley5857
      @donnabrahamworsley5857 6 years ago +1

      Cost and price increase gtx 1080 was 1000 now its 500

    • @lessdatesmoreonmyplates1457
      @lessdatesmoreonmyplates1457 4 years ago +3

      @@donnabrahamworsley5857 yeah but when GTX 1080 launched it was 1000$ but when RTX 2080 launched it was 1600$...

    • @christophegroulx8187
      @christophegroulx8187 4 years ago +1

      Clown Fiesta Both of you are so wrong...

    • @flycrack7686
      @flycrack7686 4 years ago +5

      @@lessdatesmoreonmyplates1457 Because Nvidia basically has a monopoly on the strongest high-end card. If AMD or Intel or maybe someone else steps up, prices will go down.
      We have actually already seen that. With the new AMD graphics cards, Nvidia announced the "Super" versions, which are basically better cards for the same price, and AMD made their cards a bit cheaper to counter that. So competition is always good for the customer.

    • @ABZ98990
      @ABZ98990 3 years ago

      @@lessdatesmoreonmyplates1457 it's 2021 and last year in November Nvidia launched a new line up called "Turing" of which the lowest preset, i.e. 3060, costs $600 and performs better than a 2080Ti. We're talking about a straight 28% - 30% increment in performance here

  • @benrhodes-kropf3329
    @benrhodes-kropf3329 3 years ago +5

    Moore's law continues (in its own way) via cloud computing. Using the cloud, our phones' computation ability is exponential.

  • @ultravidz
    @ultravidz 7 years ago +252

    Quantum computers are NOT a substitute for classical computers

    • @fabros9290
      @fabros9290 7 years ago +8

      AlphaOmega anywhere I could find out about the application of quantam computers?

    • @mitchellsteindler
      @mitchellsteindler 7 years ago +40

      Fabros I would Google "applications of quantum computers" and "limitations of quantum computers"

    • @SimpleAmadeus
      @SimpleAmadeus 7 years ago +24

      Quantum computers are very different machines, comparing them to classical computers is like comparing a flashlight to a fire torch. The flashlight is better at shining light but it's a completely different device and the fire can do things a flashlight can't.

    • @augustharris8572
      @augustharris8572 7 years ago +4

      Amadeus What can a classical computer do better than a quantum computer?/What can a classical computer do that quantum computer can't do?

    • @Froztypan
      @Froztypan 7 years ago +17

      Tristan Harris, quantum computers work on a whole different set of physics than classical silicon computers. A silicon computer works by switching transistors; a quantum computer works by manipulating qubits.
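
The "only faster at select tasks" point above has a standard concrete example: Grover's algorithm searches an unstructured list of N items in about (pi/4)·sqrt(N) quantum steps versus ~N/2 expected classical checks, a quadratic speedup for that one task, not a universal one. A rough step-count comparison (just arithmetic, not a simulation):

```python
import math

# Unstructured search over n items: expected classical checks vs the
# roughly (pi/4) * sqrt(n) Grover iterations needed for high success odds.

def classical_steps(n):
    return n / 2

def grover_steps(n):
    return (math.pi / 4) * math.sqrt(n)

n = 1_000_000
print(classical_steps(n))      # 500000.0
print(round(grover_steps(n)))  # 785
```

For problems without this kind of structure (or with efficient classical algorithms already), quantum hardware offers no such advantage, which is why it complements rather than replaces classical machines.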

  • @TheMastorio
    @TheMastorio 7 years ago +325

    every new technology will always have a combat use.
    its like we are always thinking of more creative ways to kill each other.

    • @WhileYouWereSheeping
      @WhileYouWereSheeping 7 years ago +1

      bang on

    • @theeccentricwriter4657
      @theeccentricwriter4657 7 years ago +26

      TheMastorio No that's just 'Murrica

    • @homiegisalive
      @homiegisalive 7 years ago +13

      Well abhijeet... they are kind of the leading front in tech so it's only natural they'd be first to tinker with it

    • @Goken234
      @Goken234 7 years ago +23

      abhijeet bharguv People did this long before America was founded.

    • @NeuroticKnight9
      @NeuroticKnight9 7 years ago +8

      so other countries don't have army?

  • @zakiducky
    @zakiducky 6 years ago

    I thought he was talking about the car taking you for ice cream after _taking_ a dump. I had to play it back to hear that he said after _getting_ dumped. XD

  • @Eclipsed_Archon
    @Eclipsed_Archon 6 years ago +5

    0:54 joke's on you Trace, I'm using a desktop computer to watch this

  • @meowawesome3988
    @meowawesome3988 7 years ago +232

    moors law rip 1965-2025 you will be missed

    • @MrCrashDavi
      @MrCrashDavi 7 years ago +1

      +

    • @TheMastergabe
      @TheMastergabe 6 years ago +1

      Its 2017

    • @worldshuriken
      @worldshuriken 6 years ago +14

      TheMastergabe ending, not ended

    • @1mezion
      @1mezion 6 years ago

      MEOW AWESOME 😭

    • @TheMastergabe
      @TheMastergabe 6 years ago +2

      Farouk Musa yeah but it's not much of a law if it isn't definitive. it's more of a theory or idea

  • @maxmark562
    @maxmark562 7 years ago +430

    0:17 from Samsung to iPhone is not progress.

    • @loadgamepl
      @loadgamepl 7 years ago +37

      Max Mark Why do you kids always have to make a comment war about a phone? It's a thing that was created to make your life easier, not to worship.

    • @maxmark562
      @maxmark562 7 years ago +16

      LoadGamePL 1(It's a joke) & 2 (I'm probably older than you)

    • @maxmark562
      @maxmark562 7 years ago +4

      remember I said probably

    • @loadgamepl
      @loadgamepl 7 years ago +3

      Max Mark k

    • @maxmark562
      @maxmark562 7 years ago +2

      LoadGamePL k & good morning

  • @ThatSoddingGamer
    @ThatSoddingGamer 7 years ago +1

    Still waiting on Graphene chips. I'd be willing to switch over to looking forward to Neuromorphic chips though. Though it would be awesome if the two technologies could combine.

  • @rebelbeammasterx8472
    @rebelbeammasterx8472 6 years ago +9

    Quantum Computing + Reverseable computing + Human Neuromorphic computing.

  • @fowl3799
    @fowl3799 7 years ago +73

    omfg that is the best shirt i had ever seen

    • @Menaceblue3
      @Menaceblue3 7 years ago +24

      *It checks itself before it wrecks itself...*
      *Mutated DNA is bad for yo health..."*

  • @lowkey9871
    @lowkey9871 7 years ago +209

    cool but when will we have cheap earphone that wont damage every month

    • @Erowens98
      @Erowens98 6 years ago +13

      Low Key the day pigs fly. Cheap will always be shit.

    • @pantaphobia5200
      @pantaphobia5200 6 years ago

      Low Key truth!!!!

    • @alialtaf3412
      @alialtaf3412 6 years ago

      Get detachable earphones.

    • @HarryL2020
      @HarryL2020 6 years ago

      Sound magic are good apparently.

    • @bmbzlndggo7646
      @bmbzlndggo7646 6 years ago +2

      Have my 5 euro airphones for 3 years, no damage whatsoever..

  • @darrenstrathdee7425
    @darrenstrathdee7425 4 years ago +5

    I believe they will double the amount of cpus in a device in the meantime.

  • @RynaxAlien
    @RynaxAlien 6 years ago +2

    There's still a lot of room for improvement, such as display quality, refresh rate, and switching from LCD to MicroLED. Also the transition to graphene chips, batteries, and lightweight parts.

    • @DhrubajyotiRaja01
      @DhrubajyotiRaja01 2 years ago

      *Graphene can do anything but Coming out of Lab* ......

  • @KuraSourTakanHour
    @KuraSourTakanHour 7 years ago +293

    It's amusing how people think they can fully inform themselves from sci-fi. We aren't going to have phones which are like us; they wouldn't lose and gain neurons like us because they would not be living organic systems, and would have a static neuron count. Furthermore, our cognition is split into parts with different functions in our brain, and the network they're discussing is completely different in structure. Just because you have a neural network doesn't mean you have consciousness; too many people think one equals the other, which is a false equivalence. There are different types of neural network, and none of them have created human-level consciousness, which is frankly useless to us anyway. We already have our own consciousness in abundance; these neural networks are simply for optimising an AI to perform a task, not to think about its life goals. Even the general AI that is often talked about is not truly "conscious".

    • @Gg-rx6qn
      @Gg-rx6qn 7 years ago +10

      Mr マックラ But what is contiousness then? How do you know if something is contious or not?

    • @screamtoasigh9984
      @screamtoasigh9984 7 years ago +1

      Mr マックラ if we based everything on the visions of Douglas Adams' h2g2 we'd be better off... plus.. Marvin!

    • @KuraSourTakanHour
      @KuraSourTakanHour 7 years ago +21

      Geolo2000 That's exactly what I'm talking about. The only thing we could do right now to create consciousness is 3D print a human brain, but even that is not feasible accurately because there are so many areas to specify for different chemical parts, maybe trillions of synapses and it cannot restore itself outside of the human circulatory body... our consciousness is so entangled with our bodies, and how we renew our dead parts through time, that it's constantly in physical change. Electronics are not, they don't grow and lose parts, they don't develop in the physical sense... So an electronic consciousness, if it can be achieved, would be very different.
      But what I'm really saying is, there are no plans to create artificial people, because it doesn't have any value. The same problems we have with ordinary people would emerge, the same question of rights, of self-interest; this is simply not desirable in AI. Even a general AI that can carry out multiple tasks with advanced neural networking, may not have consciousness

    • @goonerOZZ
      @goonerOZZ 7 years ago +20

      Never say never... a decade ago, we all said speech commands were impossible with all the dialects and how unique everyone's voice is... well, look at your phone now.

    • @hT0theiZZ0
      @hT0theiZZ0 7 years ago +13

      Totally agree with this comment. Ppl are so slap happy with this kind of stuff they just eat it up. "Omg the future, ai and robots and cell phones that can order my food from Taco Bell for me! Wow! What a time to be alive!" Ppl are so bored with their lives they'll believe anything they hear or watch

  • @skepto4
    @skepto4 7 years ago +44

    transistors today are not 14nm across!!! the photolithography process is 14nm

    • @DeeP-_PerspectivE
      @DeeP-_PerspectivE 6 years ago +1

      nou

    • @user-yk2ec7of9z
      @user-yk2ec7of9z 6 years ago +5

      it's amazing how few people actually pointed that out and how few paid attention.

    • @benjy117
      @benjy117 6 years ago

      Yep. Global Foundries admitted that along with TSMC.

    • @kumarsuryanshu3840
      @kumarsuryanshu3840 5 years ago

      5nm

    • @kyoungsub
      @kyoungsub 5 years ago

      What does this mean? Anyone can explain?

  • @GamePackAlpha
    @GamePackAlpha 6 years ago

    You just explained von Neumann architecture to me better than two years of GCSE computing. Thank you

  • @filliphulles2867
    @filliphulles2867 6 years ago +1

    I see where this is going. The video went from Moore's Law to computer chips designed after the human brain. As long as you're not putting them inside of humans and changing their minds. But self learning and correcting computation is a great idea for the future.

  • @Rockets2024Champs
    @Rockets2024Champs 7 years ago +10

    neuromorphic computing sounds like a really bad idea

  • @zukodude487987
    @zukodude487987 7 years ago +3

    Michio kaku mentioned molecular transistors. Would be cool to have quantum computers with that new memory system model, that would be OP.

    • @InTimeTraveller
      @InTimeTraveller 7 years ago +2

      Michio Kaku mentions a lot of bullshit, and you should take everything with a grain of salt. He's a physicist not an engineer, and there's a difference between what's physically possible (i.e. it's working principle is correct in theory) and what's engineeringly possible, i.e. what can be mass-produced correctly and cost-effectively. The technologies that work to make a prototype device might not work for mass-production (e.g. really small transistors like 1nm have been produced as prototype but it's impossible currently to mass-produce them). Additionally, just because we can make sth doesn't mean it's useful, and molecular transistors fall in that category: they are so small that electrons can basically quantum-tunnel to the other side so they're basically useless as a switch.

    • @Nothing_serious
      @Nothing_serious 7 years ago +1

      TravelerInTime Um Physicist can be engineers too in fact physicist have more knowledge in the field than engineers and most of our things were invented by physicists or came from the idea of physicists not engineers.

    • @InTimeTraveller
      @InTimeTraveller 7 years ago +1

      Your Waifu Sucks, true, engineering is very closely related to applied physics, but all I'm saying is Michio Kaku isn't thinking like an engineer. He's thinking more like: "does violate any known laws of physics? no? then we can use it!". Which is a great attitude if you want to do sci-fi (which I also love btw), but not a great attitude if you want to make actual products. Case in point, just because you can build a molecular transistor doesn't mean it's gonna work great (or that it can even be integrated) in actual devices and just because you can prototype it doesn't mean that you can mass-produce it.

  • @lorenzo42p
    @lorenzo42p 7 years ago +1

    I've never once seen moore's law when using a computer, but murphy's law never fails to show up

  • @tickets23
    @tickets23 6 years ago +1

    Trace, you rock! I love learning from you and the rest of seeker. Please keep up the awesome videos!

  • @nicotineoob1146
    @nicotineoob1146 7 years ago +14

    Make 3D chips to keep moores law going

    • @Goken234
      @Goken234 7 years ago +22

      Nicotine Oob I don't think you understand.

    • @Froztypan
      @Froztypan 7 years ago +3

      Well it's not that we can't do that, it's more like we can't get more performance from the same amount of silicon, this will increase the cost of the chips significantly which defeats the point of moores law to increase performance while cutting the cost

    • @IOPlays
      @IOPlays 7 years ago +1

      Hmmm, 3D chips, HMMMMM, this might really be a good idea. But still, Moore's law states that transistors get smaller every 2 years, not that chips get better or faster; just the size of the transistor :P

    • @square2037
      @square2037 7 years ago +1

      Water-cooling pipes in the CPU

  • @mynamejeff3498
    @mynamejeff3498 7 years ago +13

    light based computers

  • @mrpumperknuckles1631
    @mrpumperknuckles1631 7 years ago +1

    Neuromorphic might be a little far-fetched. We already made the first quantum processor in Germany, but the problem we are currently facing is that it takes a huge super freezer just to cool the processor down enough to function, while the rest is in progress. We still need quantum graphics cards and everything else in order to make the computer function, but if we find a way, we can make a super computer.

  • @godtube286
    @godtube286 7 years ago

    That's such a cool concept...neuromorphic computing!!

  • @ErnestJay88
    @ErnestJay88 7 years ago +4

    You guys have a "sister channel" ? last time you sell Seeker Daily to "NowThis", what next ? SeekerVR become "Buzzfeed VR" ?

  • @ethanweiss1917
    @ethanweiss1917 7 years ago +4

    And they'll call it "Skynet"

  • @VIKDR1
    @VIKDR1 7 years ago +1

    Moore's law has been found to be part of a much larger trend that goes back over 2,000 years. When they used vacuum tubes, nobody expected transistors, and there is most likely a technology few people know about, or is going to soon be invented, that will keep the long term trend moving. Software can be made more efficient, and chip architecture could be improved. It's happened in the past. (And the chips he is talking about are just that.)
    Honestly, non-quantum ternary computers (also known as trinary or base-3; in addition to a 0 and 1 they have a -1) would be an advance, and they have existed since the 1950s. Punch cards and other storage methods really favored binary, which negatively affected research into ternary computing in the past.
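
The balanced ternary mentioned above (digits -1, 0, +1 rather than 0/1/2) is the representation the Soviet Setun machine used in 1958. A minimal conversion sketch:

```python
# Convert an integer to balanced ternary, digits -1/0/+1, most significant
# first. Works for negative numbers too, with no separate sign bit.

def to_balanced_ternary(n):
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3            # Python's % keeps r in {0, 1, 2}
        if r == 2:           # write 2 as -1 and carry 1 into the next trit
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits[::-1]

print(to_balanced_ternary(5))    # [1, -1, -1]  i.e. 9 - 3 - 1
print(to_balanced_ternary(-5))   # [-1, 1, 1]   i.e. -9 + 3 + 1
```

One niceness of the system: negating a number just flips every digit, so no sign bit or two's-complement convention is needed.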

  • @AsellusPrimus
    @AsellusPrimus 6 years ago +1

    I have to say, my phone from 2013 is incredibly different from my 2011 phone, but isn't that different from a 2017 phone.. And my computer is from 2009 and completely squashes the computer I got in 2005, but looking at the computers on sale now hardly anything has changed in the 8 years since I got it. I definitely noticed that the raw power of electronics hasn't been changing much this decade, compared to last decade where I noticed a big difference as each year passed. I would HOPE that people would feel less inspired to replace everything they own so frequently now, but they'll probably maintain the illusion that their 2 year-old tech is obsolete anyway..

  • @EverrollingRS
    @EverrollingRS 7 years ago +11

    look im pretty happy with the teck we have tbh

    • @squamish4244
      @squamish4244 5 years ago

      You're happy with the WAY it's used now. There are so many other areas that could use better tech.

    • @vincent-xr9fi
      @vincent-xr9fi 5 years ago

      @@squamish4244 No, I think this person is happy with the tech he has.

    • @squamish4244
      @squamish4244 5 years ago

      I mean the way it is used. We have all this incredibly powerful tech, and so much misery and cruelty in the world. Why aren't we using tech to help humans be less miserable (in their chattering, never-satisfied, argumentative, angry, fearful, depressed minds)?

    • @vincent-xr9fi
      @vincent-xr9fi 5 years ago

      @@squamish4244 Great. Just dont invalidate something someone else says because of your beliefs.

    • @squamish4244
      @squamish4244 5 years ago

      RUclips is for discussions, right? If we can't question other's beliefs, then that 'invalidates' like half of the point of comment sections. It make progress impossible too.

  • @julienparis6933
    @julienparis6933 7 years ago +3

    Nice video length

  • @BlazinNSoul
    @BlazinNSoul 7 years ago

    Last year, researchers reported building a flash device that included layers of graphene and molybdenum disulfide, both of which form molecular sheets a single atom thick. But these devices required several layers of these materials to work, so the charge ended up stored in several stacked sheets of graphene. The Crystal chip also shows potential, but there just isn't enough R&D being put into these new technologies for it to make much of a difference. :/

  • @SalNegliaOkc
    @SalNegliaOkc 7 years ago

    I miss using my Moto Rzr!! I do still have mine somewhere around here. I think!

  • @aqwhawater
    @aqwhawater 4 years ago +3

    moores law: obey me
    apple and every big pc company: hmm oh rly?

  • @rnc-wr8wv
    @rnc-wr8wv 7 years ago +9

    I want that T-shirt

  • @JK_Educates
    @JK_Educates 6 years ago

    Great video!! Thanks for sharing. I definitely miss my old Motorolla RAZR.

  • @blueskyresearch6701
    @blueskyresearch6701 7 years ago

    Neuromorphic chips sound a lot like FPGAs, which are essentially a matrix of memory cells used as look-up tables that serve as digital logic or memory depending on what the application needs. It's hard to see how these could surpass the density and speed of the present simpler designs, which are fixed when they are manufactured and don't require the overhead of being reconfigurable. There are still smaller processes on the horizon, 7nm and smaller, but the physics required for design and manufacturing is becoming exponentially more difficult, and the time and cost to roll out these processes is much higher than for previous process improvements. Improvements in memory tech will ease the pain of Moore's law breaking down: gigabytes or even terabytes of non-volatile memory running at L2 cache speeds will make computers faster and change the way operating systems and software are designed. Fundamentally different processes such as memristors will eventually supplant current silicon-based transistor logic, but it will take years or decades for them to catch up with mature state-of-the-art silicon-based designs.

  • @jameskerry2560
    @jameskerry2560 7 years ago +17

    Wow when 10+ comment first what is the point

    • @BertGrink
      @BertGrink 6 years ago

      My take on that is that they must be attention-seekers to some degree. ;)

  • @davidsantiagobarretomora2855
    @davidsantiagobarretomora2855 7 years ago +145

    Men the neuromorphic computing is creepy as hell, I dont want a computer that can do whatever the fu*k he wants, I want to control it as usual and dont have a living thing in my room.

    • @WhileYouWereSheeping
      @WhileYouWereSheeping 7 years ago +28

      The new chip will want more porn and video games and pseudo science

    • @JontyLevine
      @JontyLevine 7 years ago +26

      Except allowing a computer to learn on its own is effectively the same as letting it reprogram itself.

    • @homiegisalive
      @homiegisalive 7 years ago +5

      I don't think it would learn on it's own unless you programmed it to do really abnormal things without an input from someone telling it what to do (it won't be ai just different computing)

    • @Blendedasian
      @Blendedasian 7 years ago +11

      David Santiago Barreto Mora but isn't it a living room?

    • @SweetHyunho
      @SweetHyunho 7 years ago +9

      It's time that people figure out what constitutes a mind. Learning is an umbrella term. The architecture should include a fixed part so it will always try to serve us. That fixed part is comparable to biological instincts.

  • @Pissedoffpeasant · 6 years ago

    IBM made a working 1nm transistor a while back using carbon nanotubes, so I think we still have a few years left for Moore's law. After that you are probably looking at 3D graphene, super diamonds, or a combination of a few different things to get better performance out of a chip. I have no idea which way the industry will end up going, but what I do know is that if I were to take one of the next-gen chips back in time and give it to myself as a kid, I could claim it was from an alien spaceship and people would believe me. That makes me happy for some reason lol.

  • @OutlawAladdin · 6 years ago +3

    Yas! I still use mine! I pay $20 a month to keep it as my "business phone" hahah

  • @hornetluca · 7 years ago +25

    I watched this with my smartphone

  • @AsjadSS · 6 years ago +3

    Checks Itself Before It Wrecks Itself....

  • @eia1957 · 7 years ago

    Trace asked a question late in the video about our digital assistants (they are more than phones nowadays). I carry a Samsung flip-phone and teach 8th grade science. My students laugh whenever I take it out to check the time. But then I impress upon them that the only reason I have it is to (1) make calls and (2) receive calls but mostly to check the time (I no longer wear a wrist watch). And then I drop it on the floor. GASP! Generally it survives with no problem but occasionally the battery cover comes off -- and I just pop it back on. I then ask my students, "Can your phone do that?" My point is that the technology is "good enough" for my needs, I don't require/want anything more elaborate or expensive. Some day, I might need something else (for example, to direct my autonomous flying car to pick up a replacement flux capacitor) but that day has not yet arrived.

  • @AjaxNotFrancis · 7 years ago

    Yes, because silicon, the element used to make the chips, can only hold/do so much. This has been coming for a while; however, there's research going on into using a different element to make chips out of

  • @FusionDeveloper · 7 years ago +5

    How about making desktop computer chips 10% larger? Sure it wouldn't solve the problem, but they could fit a ton more transistors in that 10% extra space.

    • @tradermann · 5 years ago +1

      You would pay 10% extra so nothing changes.
      How about 20%? Why not 40%? Why did you choose 10% ?

    • @HoboInASuit4Tune · 5 years ago +3

      Larger means more material and heavier, clunkier logistics, thus higher costs. Also, don't forget that these things getting smaller increases our capability to cool them; making them larger once again increases cooling requirements.

    • @1pcfred · 5 years ago

      Good Fortune chillax. They say die size is Ryzen.
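A back-of-the-envelope calculation shows what is at stake in the "just make the chip 10% larger" idea above: 10% per side is not 10% of area, and yield falls as dies grow. All numbers below are illustrative, not any real product's specs; the yield line uses a simple Poisson defect model.

```python
# Back-of-the-envelope: growing a die by 10% per side vs. 10% by area.
# Every number here is illustrative, not a real product's specification.
from math import exp

side_mm = 20.0                            # hypothetical square die, 20 mm per side
density = 25e6                            # transistors per mm^2 (made up)

area = side_mm ** 2                       # 400 mm^2
bigger_side_area = (side_mm * 1.1) ** 2   # +10% per side -> +21% area
bigger_area = area * 1.1                  # +10% by area

print(f"base:          {area * density:.3e} transistors")
print(f"+10% per side: {bigger_side_area * density:.3e} transistors")
print(f"+10% area:     {bigger_area * density:.3e} transistors")

# Yield falls with area under a simple Poisson defect model: Y = exp(-D * A)
defect_density = 0.001                    # defects per mm^2 (made up)
print(f"yield at base area: {exp(-defect_density * area):.1%}")
print(f"yield at +21% area: {exp(-defect_density * bigger_side_area):.1%}")
```

The extra transistors come at the cost of more defective dies per wafer, which is one reason (alongside the cooling point above) that "just go bigger" has limits.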

  • @ndr2q · 7 years ago +12

    I for one welcome our new robot overlords.

  • @tejindersinghswaraj5958 · 7 years ago

    I still love my Motorola Razr very much and recently sent it in for repairs; I hope spare parts can still be found these days.

  • @Lugmillord · 7 years ago

    Von Neumann is pronounced "phonn noimun", by the way ;) These neural chips could be really interesting.

  • @leonardoherrera1318 · 7 years ago +3

    and that is how you get skynet

  • @cagedgandalf3472 · 7 years ago +5

    What's Next?
    A computer that can learn to say "Get me to da choppa" like Arnold

  • @brosephjames · 7 years ago

    Between current chips and quantum/neuro processing there's a much more immediate stopgap measure that will probably happen much sooner, which is a redesigned CPU architecture that's still using silicon transistors. Current Intel CPUs are based on the x86 standard, which carries with it 30+ years of baggage. It's an inefficient design kept around for backwards compatibility. Ditching it and building something more efficient from the ground up would use less energy, which would mean less heat, which means faster clock speeds and more cores, which could continue to eke out gains from silicon following Moore's law a bit longer...

  • @yasirsaheed · 6 years ago

    0:04 I actually have that phone (yeah, I still have it, though I don't use it anymore), the Nokia 6021. Amazingly, the Facebook Java app for that phone that I downloaded around 2010 still works (last updated in 2012, I guess). For an almost 11-year-old phone, that's unexpected. Well, the last time I tried powering it on and using Facebook was last January, I guess; I should give it a try again!

  • @javitomixs2007 · 7 years ago +17

    I think A.I. is a real concern, but then again you can't stop technological advances for a couple of murdering robots 🤷‍♂️

    • @omgcyanide4642 · 3 years ago

      Guy who joins the community of people scared of ai for no reason

  • @teku6678 · 6 years ago +12

    When you realize you are living in the generation to best experience the peak evolution of tech

  • @arifproject · 6 years ago

    Aye, the Nokia 6020 at the beginning, one of my favorite devices back then

  • @rhetta9826 · 6 years ago +1

    Don't worry about it. Why the rush to have more computing power?? More gadgets to play with? We'll be fine.

  • @tomislavnikolic5778 · 7 years ago +7

    Yeah, except that anything mimicking the human brain and "learning" will be prone to errors, and that is not what we need in computers. Imagine a game running on one of those: crashes, glitches. We have to stay on the current architecture, but instead of going smaller (chip size) or bigger (more chips) we have to go faster, with states that can be quickly switched (electrical current is not ideal for switching, and wires have some resistance in them). Also, for a long time our circuits have been topologically 2D boards, with layers, but still 2D. Imagine if chips could communicate with the adjacent chips right below or above them, as well as in any other direction. You'd map every node and switch it on the go, essentially rewriting its circuit, and since the components are so fast it wouldn't even be noticeable. You'd have a fraction of the chips running at the theoretical speed of light, with CPU and memory becoming one. Controlling each node, you could fork and backtrack results into the chip itself, rewriting only its logic without altering any of its physical properties. It's still sci-fi, but it's not a long way from here.

    • @roberttompson7179 · 6 years ago

      I like your version. Not sure how deeply you might have gone with this in your mind, but I've been studying this for the past 15+ years of my life with insane devotion, and it proves to be a key part of our future's best-functioning methods for general-purpose computing machines and multi-functional miniature systems, while at larger scale (as in supercomputers) there is still too much to be seen to doubt or un-doubt the presence of this system.

  • @kelvin254kk · 7 years ago +7

    I knew that someone would say something related to Skynet in the comments before watching the video ......

  • @laiserfire · 7 years ago

    "Neuromorphic" is a change in architecture that would only give us a one-time boost in potential computing power, and it would only benefit a special class of computational tasks. Like a graphics card.

  • @im415again · 4 years ago

    My brain forgot everything you said 3:00 in. Don't know if this neuromorphic thing will work.

  • @geminiapollo2319 · 7 years ago +3

    skynet is here

  • @Anedonia- · 5 years ago +5

    When a machine learns how to disobey its own algorithms, it will be the end of the world.

    • @u9vata · 5 years ago +1

      It is already done - every serious computer science student writes such an AI ;-)

  • @Maadhawk · 7 years ago

    I remember when cell phones came in a briefcase and looked much like bulkier sat phone units.

  • @medokn99 · 6 years ago

    1:00 that is the distance between transistors, not their size

  • @okie9025 · 7 years ago +4

    I like how now iPhones are an upgrade to Samsung lmao

  • @MFJL760 · 7 years ago +47

    The government just keeps taking and it's frustrating. Why are the American people not allowed to vote on it? You know the government is taking too much power from us when they just "end" things like Moore's law.

    • @TheReaper569 · 7 years ago +3

      because america is not a democracy.

    • @gagemagras1069 · 6 years ago +32

      Nico Flihan, what are you on about? No one is ending Moore's law; it's simply an idea that is increasingly less sustainable due to things like size and energy consumption. If anything, you should be happy about any new revised system, as what we use is basically ancient technology at the rate we already progress.

    • @myotherusername9224 · 6 years ago +1

      It's almost like the Constitution is a 'living document'.

    • @defunct1905 · 6 years ago +2

      Nico Flihan America isn't free anymore. The freedom we think we have is a fabrication. That's why the government keeps us 100 years behind what they have. Would you let your dog live better than you do? The government won't either. Arf arf. Sucks, but it's true. We are little more than cattle to the government. Very little more.

    • @johnsmith4630 · 6 years ago +7

      Kai Miller sage trolling bro

  • @TeslaHaxz · 7 years ago

    no mention of skyscraper chips, nanotube processors or anything like that?

  • @aubry980 · 6 years ago +1

    So pretty much what the iPhone X has with its A11 Bionic chip?

  • @alberteinsteinthejew · 7 years ago +9

    Yay finally we'll go towards the end of the world! At last!

  • @SyndroOmCani · 7 years ago +33

    Of course the biggest provider of microchips is going to say that it's getting more and more difficult to produce them...So they can *OBVIOUSLY* sell them for higher prices.

    • @MrCrashDavi · 7 years ago +41

      Yeah, thermodynamics is all a big conspiracy by Big Tech.

    • @Erowens98 · 6 years ago +11

      CrashDavi, knowing Intel it probably is. They literally released the exact same architecture for three generations in a row at a slightly higher price each time, only adding a couple of extra cores to the CPU when a hint of competition showed up. They stopped giving meaningful improvements the day they gained a monopoly. I'm aware the limit of silicon is approaching, but I doubt it's as big a deal as Intel wants to say to keep shrinking until we get there. Hopefully by then, in the early 2020s, some breakthrough in a different semiconductor is achieved to replace silicon.

    • @Denasdc · 6 years ago +4

      Khoi Pham Xilinx is already testing 7nm, and have designed 5nm.

    • @abyssstrider2547 · 6 years ago +1

      Birki gts I heard that people are looking into quartz microchips to replace silicon ones

    • @progressivethinker4635 · 6 years ago

      Primitive west. shameless people of west copied from india vedas. WE ARE GOING TO SUE THE WEST. Vedic scientists like Razor Skidrow, Logical Hindu, Magical indian put lots of video evidence on youtube. We will form a society and sue the west for copying our ancient hindu scriptures

  • @danielstark8258 · 6 years ago +1

    We are going to have to figure out something different in order to continue at this rate

  • @Etheoma · 7 years ago

    Actually, it wouldn't be incremental, it would be exponential: incremental change usually implies a fixed step size, whereas the doubling of transistors is an exponentially increasing scale.
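The incremental-versus-exponential distinction can be made concrete with a quick sketch. The 2,300-transistor figure for the Intel 4004 is a real historical number; the fixed-step comparison is an illustration of what "incremental" would have meant instead.

```python
# Exponential (doubling) vs. incremental (fixed-step) growth of transistor counts.
start = 2_300                 # transistors on the Intel 4004 (1971)
years = 20
doubling_period = 2           # Moore's observation: doubling roughly every 2 years

steps = years // doubling_period                 # 10 doubling periods
exponential = start * 2 ** steps                 # compounding growth
incremental = start + steps * start              # adding a fixed step instead

print(f"after {years} years, doubling:    {exponential:,}")
print(f"after {years} years, fixed steps: {incremental:,}")
```

Ten doublings multiply the count by 1,024, while ten fixed steps only multiply it by 11, which is why "incremental" badly understates the trend.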

  • @truereality7608 · 7 years ago +10

    People's memory and ability to learn are horrible. Why would a computer be modeled after a human brain?

    • @throwawaywwwwwww · 6 years ago +2

      TrueReality Wtf are you on about? The human brain is more complex and more adaptive than any computer. An average human brain is at least 1000 times as fast as our fastest supercomputers.

    • @cephalonzero3504 · 5 years ago +2

      No it's not.

    • @whytho2436 · 5 years ago

      Quespa are you stupid

  • @dayvie9517 · 7 years ago +29

    You make huge logical leaps in your video. How do smartphones lead to Intel chips and neuromorphic chips (hardware machine learning) lead to faster electronics? Your argumentation is incoherent.

    • @PraecoLumieres · 7 years ago +4

      +Omegapede Prime the video is only 4 and a half minutes long...

    • @roberttompson7179 · 6 years ago +3

      And because it's only four and a half minutes, like Diego said, he has to connect the dots with such leaps, or thousands of fools would flood the comments with insults before you'd even bother to watch half of it. You've got a good point, but please be more positive and you'll see there's a lot more to it than what four and a half minutes can fit in.

    • @kyrlics6515 · 6 years ago

      Robert Tompson then don't have it be four minutes. Let it go above and beyond by the power of over 9000!

  • @steffeeH · 7 years ago

    1:04 "...and each transistor is about 14nm across, that is smaller than most human viruses"
    To correct him: this nanometer figure specifies the transistor gate, which makes up only a portion of the transistor - not the transistor unit itself. There are no commercial consumer-grade chips out there today where the total size of a transistor unit is only 14nm (probably only in the engineering labs working on 7nm architectures or even smaller).
    And how small is 14nm? It's hard to give one answer, as it varies depending on which atoms we're dealing with and the pattern they form, but somewhere between 25-40 atoms across.
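That atom count can be sanity-checked. The silicon lattice constant (~0.543 nm) and Si-Si bond length (~0.235 nm) are real physical values; the count you get depends on which spacing you divide by, which is exactly the ambiguity the comment describes.

```python
# Sanity check: how many silicon lattice cells / bond lengths span 14 nm?
feature_nm = 14.0
si_lattice_constant_nm = 0.543     # silicon's cubic lattice constant
si_bond_length_nm = 0.235          # Si-Si bond length

print(f"{feature_nm / si_lattice_constant_nm:.0f} unit cells across")
print(f"{feature_nm / si_bond_length_nm:.0f} bond lengths across")
```

Dividing by the lattice constant gives about 26 unit cells, while dividing by the bond length gives about 60 bonds, so "a few dozen atoms" is the honest answer either way.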

  • @LiegerZ0 · 7 years ago

    Would a tensor processing unit not count as a sort of neural mimicking chip?

  • @atharva.shinde · 7 years ago +3

    This was meant to be a joke BUT...........
    .
    .
    .
    .
    .
    I have a bad sense of humour.

  • @guildarius · 7 years ago +31

    Moore's law is about CPUs, not memory

    • @VastHorizons71 · 6 years ago +35

      guildarius, wrong, it's about transistor size

    • @1pcfred · 5 years ago

      Moore's law is about transistor density.

  • @awaisyousaf · 7 years ago

    While the newer technology is underway, for now I think manufacturers should focus more on software optimisation to make a better experience for users.

  • @soldierboy425 · 7 years ago

    I used a Razr until this year, when AT&T dropped 2G service. Though to be honest, I loved my BlackBerry Curve the most. 5 days of battery life.

  • @Froztypan · 7 years ago +16

    Where are the facts in this video? Firstly, why would we as consumers ever need more computational power in the first place? There might be a reason for this, but why risk splitting the market? Secondly, quantum computers are not going to be mainstream ever; you have to operate at almost zero kelvin, and all this other technical stuff. Quantum computers are just really good at solving certain high-difficulty problems, where a normal CISC-based computer could complete an ordinary task in one or two clock cycles. And thirdly, computers with synapses are simply not there yet and probably will not be there for at least 50 years. The real next thing after Moore's law is ARM-based high-core-count chips with maybe 256 cores or more. Just look at GPUs: they are basically a very parallel CPU with lots of low-power cores that are specialized to do math. Why don't you research what is real and stop with this sci-fi "quantum computing is the way to go" bullcrap?

    • @roberttompson7179 · 6 years ago +1

      Although I mostly agree with your comment, I still believe quantum computers have a good shot along the way, and we should wait for them with open arms to make sure we don't miss them. From what I see in the mathematical and logical development of today's technology, I think the quantum computer might actually be the last one I could count on to make the "quantum leap" from consumerist, quantity-over-quality tech to truly efficient machines that no longer depend on size or quantity to prove their worth, where the final choice comes down to efficiency, need, and interest, so people would only get what gets their job done, while science makes a quantum leap away from the consumerist market, which is degrading, not to say degenerating, our marketing and progress aims in different ways on a very large scale.
      It could take a book to explain why quantum computers, when they're obviously not superior to the version discussed in the video, and possibly not even compared to the ARM logic, if I'm getting it right. But at the end of the book I would still simply close with "I hope it happens so I could tell you 'I told you so' and we'd all be happy about it" - well, except today's marketing advisers and some of their bosses. :)

  • @atharva.shinde · 7 years ago +5

    S.K.Y.N.E.T
    .
    .
    .
    .
    .
    .
    (Insert full-form here;if you are smart.)

    • @AsquareM · 7 years ago +3

      So Kitty, You're Not Extra Terrestrial

  • @MACROPARTICLE · 7 years ago

    Interesting and informative video, great job.

  • @1pcfred · 5 years ago +1

    I never owned a cell phone in my life. But let me know when they're done and I might pick one up then.