The Race to Build a Perfect Computer Chip

  • Published: Nov 9, 2022
  • Digital activity uses a huge amount of electricity, and semiconductors are near the limit of their efficiency. Now scientists are racing to perfect new chips that use much less power and handle much more data.
    #thespark #technology #green
    --------
    Like this video? Subscribe: ruclips.net/user/Bloomberg?sub_...
    Become a Quicktake Member for exclusive perks: ruclips.net/user/bloombergjoin
    Subscribe to Quicktake Explained: bit.ly/3iERrup
    QuickTake Originals is Bloomberg's official premium video channel. We bring you insights and analysis from business, science, and technology experts who are shaping our future. We’re home to Hello World, Giant Leap, Storylines, and the series powering CityLab, Bloomberg Businessweek, Bloomberg Green, and much more.
    Subscribe for business news, but not as you've known it: exclusive interviews, fascinating profiles, data-driven analysis, and the latest in tech innovation from around the world.
    Visit our partner channel QuickTake News for breaking global news and insight in an instant.

Comments • 782

  • @firstnamelastname7941 · a year ago · +341

    0:00 The importance of developing low-energy computer chips
    4:12 Carbon Nanotube Transistors
    11:07 Photonics Chips
    15:26 Neuromorphic Computing
    24:47 Conclusion

  • @freddoflintstono9321 · a year ago · +109

    From my perspective this was one of the most interesting quicktakes I've seen. Well assembled and presented.

  • @adrielamadi8585 · a year ago · +59

    I'm a computer scientist specializing in software development, but this made me appreciate the hardware side of things. It's really inspiring.

    • @7eVen.si62 · a year ago

      Pr!ck

    • @dtrueg · 3 months ago

      What do you develop? Just wondering, since AI is taking over. I was the complete opposite: I would take my dad's old computers in the early '90s (from IBM at the time; I think they only ran Windows/DOS, back when 128 MB and floppy disks were awesome, haha), take them apart, and learn how to solder and what did what and how. Now I can build and mod pretty much anything for optimal results: mining, model training, servers, etc. I just recently started learning programming languages. Wish I'd started sooner, tbh.

    • @adrielamadi8585 · 2 months ago · +1

      @@dtrueg I develop web applications and desktop applications, but I want to explore AI development and cloud computing.

    • @dtrueg · 2 months ago

      @@adrielamadi8585 Get the best GPU you can, preferably the Nvidia 40 series; you may also need to beef up cooling with extra fans if you get into training models. You basically need server hardware for any real results, but it can be done slowly. Making agents, LLMs, or assistants is fairly simple: you just need the right programs, such as Visual Studio / textwebui / LM Studio / AutoGen, then you can pick a model from Hugging Face and go from there.

  • @thinktoomuchb4028 · a year ago · +227

    I'd heard of these technologies to varying degrees, but this piece on the current progress of all of them was informative and fascinating. Thank you!

    • @rawallon · a year ago · +5

      Yeah, and personally I find it quite odd that I never thought about the carbon footprint of our reliance on computers/tech in general.

    • @Ottee2 · a year ago

      Fascinating, indeed. Not only do we need to consume energy more efficiently, we also need to devise novel ways to create more energy on the planet. Maybe one day, for example, we will have massive solar energy collectors in space, which then transmit that energy to the planetary surface.

    • @ko7305 · a year ago

      Epyc.

    • @notevennelson · a year ago · +1

      No one asked

    • @alanhat5252 · a year ago

      @@Ottee2 ...without chargrilling intervening birds.

  • @shivangsingh2463 · a year ago · +316

    Just want to thank the team at Bloomberg Quicktake for making this really high-quality content for us 🙏🏻♥️

    • @JonahNelson7 · a year ago · +3

      It’s genuinely great

    • @ko7305 · a year ago · +1

      Epyc.

    • @JakeWitmer · a year ago

      Yep. Too bad they're associated with the totalitarian name "Bloomberg." I've met Bloomberg employees before who were rightfully ashamed to be associated with the name...

    • @AndrewMellor-darkphoton · a year ago · +1

      they said something without saying anything

    • @ROSUJACOB · a year ago

      You are welcome, Mr. Singhania.

  • @CoreyChambersLA · a year ago · +4

    The atom is not the limit to size reduction. Subatomic particles can perform the same functions, better, cheaper and faster.

  • @kayakMike1000 · a year ago · +77

    The greatest enemy of a wonderful technological breakthrough is the advanced technology that works well enough.

    • @ko7305 · a year ago · +2

      Epyc.

    • @Typhonnyx · a year ago · +3

      Yup, consistency is the death of development.

    • @khatharrmalkavian3306 · a year ago · +1

      Nonsense. We're using all of our existing technology and pouring hundreds of billions of dollars per year into researching new methods.

    • @mnomadvfx · a year ago · +1

      Well yes, but more than that: the tech that works well enough is the one already receiving most of the investment.
      If silicon had become untenable 10 years ago, we would have been forced to switch faster to something radically different.
      The same thing is true of NAND flash memory: it's truly terrible for power consumption, and the wear rate of each memory cell is anything but great for long-term storage.
      But because the investment is so deep, even the most promising advanced replacement tech has constantly been left to rot, even as flash becomes an ever greater power drain on our mobile devices.

    • @mnomadvfx · a year ago

      @@khatharrmalkavian3306 All the nope.
      Hundreds of billions of dollars are going into silicon semiconductor logic and all the other standard computer and information tech of the moment.
      Only a tiny fraction of that amount is going into alternative paths.
      Quantum dots were predicted to replace CMOS image sensors years ago, but nothing is forthcoming, simply because the industry investment in CMOS sensors is too high and QDs are not regarded as enough of a benefit to pull money away from CMOS sensor improvement research.
      You can create a revolutionary memory tech for not a huge amount of money in a lab like Rice University's, but making a chip with it that scales to a bit density competitive with a modern 3D NAND chip costs a frickin' shipping port full of money in staff and time, compounded by the industry's lesser experience and knowledge of the newer technologies.
      There are some things being pursued more vigorously, such as metalenses: they can be produced faster and more cheaply than conventional lenses and offer dramatically increased compactness and utility, since a single achromatic metalens element can replace many elements in an optical system by focusing all wavelengths onto a sensor in one super-thin piece, rather than needing one element per wavelength and others for extra adjustment.
      So they are basically winners across the board relative to the technology they aim to replace. Twenty years from now we will wonder why camera lenses ever used to be so heavy.

  • @nayeb2222 · a year ago · +135

    This just renewed my interest in engineering, truly an inspiring documentary.

    • @atlantic_love · a year ago · +1

      I'm sure tomorrow you'd see a kitty rescue and go "this renewed my faith in humanity", right? You renew your interest in something by doing something.

    • @nayeb2222 · a year ago

      @@atlantic_love nothing renews my faith in humanity, I know it's doomed at this point

    • @HueghMungus · a year ago

      @@nayeb2222 Then be a prime example, and do not have kids, then dissolve your body and donate all your organs to charity. Thanks!

    • @nayeb2222 · a year ago

      @@HueghMungus I do have faith in religion, so that won't be an option for me.

    • @joshuathomas512 · a year ago · +2

      @@nayeb2222 you have faith in fiction

  • @djayjp · a year ago · +52

    Still a lot better than delivering physical goods (e.g. Blockbuster vs streaming). I'm sure the former uses more than 100x as much energy.

    • @niveshproag3761 · a year ago · +3

      Yes, delivering physical goods uses 100x more energy, but the convenience of digital means we use it 1000x more. Some people just leave Netflix/YouTube running in the background the whole day.

    • @djayjp · a year ago · +6

      @@niveshproag3761 I highly doubt that figure.

    • @niveshproag3761 · a year ago · +2

      I highly doubt the 100x too. I just mean our consumption outpaces the increases in efficiency, as shown by the fact that our electricity consumption increases every decade.

    • @djayjp · a year ago · +4

      @@niveshproag3761 Certainly not proven as you're not isolating variables.

    • @florisr9 · a year ago · +1

      Exactly. We should focus on the way digital equipment makes our lives more productive and efficient, rather than on how it consumes "much" energy (it doesn't). The ratio of energy consumption to value added is tremendously small compared to other sectors like transportation.

  • @Rnankn · a year ago · +2

    Progress is not tied to computers getting better; it is contingent on excising technology from our lives. Apparently these folks failed to learn the Jevons paradox: "an increase in efficiency in resource use will generate an increase in resource consumption rather than a decrease". Nor do they consider the exponential power of compound growth to exceed any linear reduction or transition. Even the Greeks knew Sisyphus would never get the boulder to the top of the hill, yet techno-utopians gleefully assume a smaller transistor is going to solve all problems.

  • @siddhantjain243 · a year ago · +20

    The lithography "nm" label these days doesn't refer to an exact number, i.e. "5 nm" doesn't actually mean a 5 nm manufacturing process.

    • @djayjp · a year ago · +5

      Yeah, TSMC state it's more of a marketing term than anything.

    • @siddhantjain243 · a year ago · +3

      @@djayjp same goes for Samsung & Intel

    • @DementedPingu · a year ago · +3

      @Cobo Ltger Isn't it referring to the size of transistor gates?

    • @mmmmm49513 · a year ago · +1

      It did at one point, but now it's just used to say it's 2x better than the old process, etc.

  • @hgbugalou · a year ago · +102

    Software development is also important, and it is rarely considered in these types of discussions about compute efficiency and carbon output. Today's developers are writing bloated, inefficient code using high-level languages that add even more overhead. This comes out as wasted CPU/GPU/DPU cycles and thus wasted energy. To some degree the increase in hardware power has caused this, since developers previously had to be much more diligent about writing lean code.

    • @zhinkunakur4751 · a year ago · +8

      C'mon, are you really suggesting high-level languages are bad and inefficient? I believe high-level languages are really inevitable.

    • @zhinkunakur4751 · a year ago · +4

      What we should look at more is highly efficient conversion from upper-bound languages, like basic English instructions, down to machine language using machine learning, exploiting the analog energy-efficiency advantage we have. You cannot stop the inevitable, but we can get more efficient code, and there isn't only one way to do it.

    • @hgbugalou · a year ago · +7

      @@zhinkunakur4751 I am not suggesting that entirely. High-level languages are awesome; things like Python have enabled countless cool and invaluable solutions and gotten a lot of people into coding. Part of the benefit of more powerful hardware is the amount of abstraction that can be done while still getting the job done nicely for the end user. My point was only that I worry that as these things advance, the lower-level stuff will start to become lost, and appreciation for how efficient low-level languages can be will be more and more underappreciated, whether from lack of understanding or from thinking it's voodoo not worth getting into. There are still a lot of scenarios where efficient code matters, and the closer to the hardware you are the better. It is important we do not lose sight of that or let that knowledge become stale.

    • @zhinkunakur4751 · a year ago

      @@hgbugalou I see; agreed. I too am a little worried about the increasing unpopularity of low-level languages. Or maybe the attention only seems to be going down because more people are getting into coding and the vast majority of them will be using high-level languages; it's not that the growth rate of low-level languages is falling, maybe high-level languages just have a higher growth rate.

    • @mnomadvfx · a year ago · +3

      Hopefully generative AIs can do something about that.
      It's something I have often thought about when observing the painfully slow development process of new video codecs, from ISA-portable C code to fast, efficient ISA-specific SIMD assembly.

  • @oldgamer856 · a year ago · +4

    24:36
    Where's the mouse?
    * Points at camera *
    What a burn

    • @TheSano · 2 months ago

      Yeah, money wasted type moment 😂

  • @AlanTheBeast100 · a year ago · +18

    I heat my house with electricity for nearly 6 months per year, so during that time 100% of the electronics use in the house effectively consumes zero extra energy.
    Data centres should be located where there is a need for heating water or other materials, so that all of the heat can be dumped into the early part of the manufacturing process.

    • @Beyondarmonia · a year ago · +1

      You're correct about the second part. But unless you're using a space heater instead of an HVAC, it's not zero: heat pumps, which move thermal energy, are actually more efficient than pure electric-to-heat converters.
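
      A quick worked comparison of those two heating modes (illustrative numbers, not from this thread; a coefficient of performance, COP, of about 3 is typical for a modern heat pump):

      $$Q_{\text{resistive}} = P\,t = 1\,\text{kW} \times 1\,\text{h} = 1\,\text{kWh}, \qquad Q_{\text{heat pump}} = \mathrm{COP} \cdot P\,t \approx 3\,\text{kWh}$$

      So a PC's waste heat offsets heating cost one-for-one only against resistive heating; against a COP-3 heat pump it displaces only about a third as much delivered heat.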

    • @AlanTheBeast100 · a year ago

      @@Beyondarmonia Regardless of how I heat, the heat from all things dumps into the house, so no extra charge ($). As to heat pumps: true enough, but it's cold here (-24°C presently, -30°C tomorrow; it will be mild next week) and electricity is very, very cheap, whereas heat pumps are expensive to buy and install, and fail expensively. That said, Hydro Quebec will subsidize about $1000 if I throw a heat pump at it. Maybe some day.

    • @Beyondarmonia · a year ago · +1

      @@AlanTheBeast100 Makes sense.

  • @madad0406 · a year ago · +10

    Literally just passed my computer architecture final and then this was recommended to me haha. Great video!

  • @edeneden97 · a year ago · +5

    I really like the jumps between the founders and the skeptic guy

  • @mdaverde · a year ago · +146

    What a time to be alive.
    This is obviously exciting work, but the skeptic in me assumes that once vendors see they can do more with less, they'll just do more, and so the cycle continues. What also makes it brutally hard for this work to take full effect is that the layers above it also need to change to take advantage of these performance/energy gains (the firmware, instruction sets, algorithms, programming languages and so on). Over the long term, though, I'm optimistic about this shift.

    • @michaeledwards2251 · a year ago · +2

      Rust and other modern languages will become more significant. Hardware implementation of typing, inheritance, bounds, soft cells, networks, and flexible control assignment will all be implemented.

    • @jackdoesengineering2309 · a year ago · +3

      I'm using an APU, not a CPU. It's 35 watts and extremely capable. With computational progress comes reduced power; it's just that the two are in tension, and at some point people have to choose lower power over performance gains. As energy prices rise, that choice gets reconsidered.

    • @michaeledwards2251 · a year ago · +1

      @@jackdoesengineering2309
      Underclocking tricks allow the computational rate to be reduced to the demand rate, with lower power consumption. This still allows high computational rates, with high power consumption, whenever they are needed.

    • @amosbatto3051 · a year ago · +9

      To some degree the energy consumption hasn't fallen, especially with desktop PCs, but greater energy efficiency has made possible all sorts of new form factors which are much more energy efficient: laptops, netbooks, tablets, smartphones and smartwatches. Eventually we will get to smart glasses, smart earbuds and smart clothes that are extremely energy efficient and can replace much of the functionality of traditional PCs. If you look at energy consumption in advanced economies, it is actually falling, which is an indication that we are doing more with less energy.
      As a computer programmer, I can tell you that energy efficiency is becoming increasingly important in programming. Not only are programmers focusing more on code that can deal with low-energy systems running on a battery, they are also focusing more on compiled languages, such as Rust, Swift, Go and Julia, that use less memory and fewer computing cycles than interpreted languages.

    • @boptillyouflop · a year ago · +4

      @@michaeledwards2251 Hardware implementation of typing, inheritance and bounds hasn't yet been able to make any of these things faster for code that uses them:
      - Inheritance is basically a fancy jump instruction. The main problem is that with inheritance, your jump address usually has to be loaded from memory, which can take many cycles, so the CPU has to guess the branch target and run a whole bunch of speculative instructions while the address loads for real. Having a special version of "jump to variable address" just for inheritance doesn't gain much over the regular variable jump.
      - Bounds checking is likewise a fancy conditional branch. Conditional branches that rarely get taken are already quite cheap on modern CPUs: they take up slots in the instruction decoder and micro-op execution, but they don't compete for the really important slots (memory loading/storing). In fact, loading the bound is definitely slower than testing it (since it uses a memory load instruction). The speed gain from adding hardware bounds tests is likely to be rather small.
      - Typing is in a similar situation. Usually dynamically typed variables are either dynamic versions of small fixed-size static types (double, float, bool, int32_t, int64_t) or larger dynamic-sized types (strings, objects, maps, etc). The larger dynamic-sized types have to be handled in software (too complex for hardware), so you'd still have to load the type and test for it. The small fixed-size types could conceivably be handled in hardware, but you'd probably just end up using the largest type all the time.
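
      To make the "fancy jump instruction" point concrete, here is a minimal C sketch (illustrative only, not from the video or the comment above): a hand-rolled vtable, which is roughly what compilers lower virtual dispatch to. The call site loads a function address from memory and jumps to it, exactly the variable-address jump described above.

      #include <stdio.h>

      /* A hand-rolled "vtable": each object carries a pointer to its method
         table. A virtual call is a memory load (fetch the function address)
         followed by an indirect jump. */
      typedef struct Shape Shape;
      typedef struct {
          double (*area)(const Shape *self);
      } ShapeVTable;

      struct Shape {
          const ShapeVTable *vtable;  /* loaded from memory at each call */
          double a, b;
      };

      static double rect_area(const Shape *s) { return s->a * s->b; }
      static double tri_area(const Shape *s)  { return 0.5 * s->a * s->b; }

      static const ShapeVTable RECT_VT = { rect_area };
      static const ShapeVTable TRI_VT  = { tri_area };

      int main(void) {
          Shape shapes[2] = { { &RECT_VT, 3.0, 4.0 }, { &TRI_VT, 3.0, 4.0 } };
          for (int i = 0; i < 2; i++) {
              /* two dependent loads, then an indirect call: the CPU must
                 predict the target while the loads are still in flight */
              printf("area = %f\n", shapes[i].vtable->area(&shapes[i]));
          }
          return 0;
      }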

  • @trolly4233 · a year ago · +45

    Can confirm we are running out of chips, the chip-to-air ratio changed from 50-50 to 35-65. Trying times indeed. The bag itself is worth more than the chips inside now.

  • @hyperteleXii · a year ago · +42

    This new chip sounds like a pathfinding co-processor to my game-developer ears. Navigating an order of magnitude more agents in real time in a dynamic world would revolutionize game development. Everybody's stuck on pathfinding; we're still using algorithms from the 1960s.
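
    For readers wondering which 1960s-era algorithm that is: A* (Hart, Nilsson and Raphael, 1968) is still the workhorse. A minimal sketch in C, with a toy map of my own and a linear-scan open list for brevity (real engines use a binary heap):

    #include <stdio.h>
    #include <stdlib.h>

    #define W 8
    #define H 8
    #define INF 1000000

    /* '.' = open, '#' = wall; start is (0,0), goal is (7,7) */
    static const char *grid[H] = {
        "........",
        ".######.",
        ".....#..",
        "....##..",
        "........",
        ".##.....",
        "........",
        "........",
    };

    /* admissible Manhattan-distance heuristic */
    static int heur(int x, int y, int gx, int gy) {
        return abs(x - gx) + abs(y - gy);
    }

    int main(void) {
        int gx = 7, gy = 7;
        int g[H][W], open[H][W], closed[H][W];
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++) {
                g[y][x] = INF;          /* best known cost from the start */
                open[y][x] = closed[y][x] = 0;
            }
        g[0][0] = 0;
        open[0][0] = 1;

        const int dx[4] = { 1, -1, 0, 0 }, dy[4] = { 0, 0, 1, -1 };
        for (;;) {
            /* pick the open cell with the smallest f = g + h */
            int bx = -1, by = -1, bf = INF;
            for (int y = 0; y < H; y++)
                for (int x = 0; x < W; x++)
                    if (open[y][x] && g[y][x] + heur(x, y, gx, gy) < bf) {
                        bf = g[y][x] + heur(x, y, gx, gy);
                        bx = x;
                        by = y;
                    }
            if (bx < 0) { printf("no path\n"); return 1; }
            if (bx == gx && by == gy) { printf("path cost: %d\n", g[by][bx]); return 0; }
            open[by][bx] = 0;
            closed[by][bx] = 1;

            for (int i = 0; i < 4; i++) {   /* relax the four neighbours */
                int nx = bx + dx[i], ny = by + dy[i];
                if (nx < 0 || ny < 0 || nx >= W || ny >= H) continue;
                if (grid[ny][nx] == '#' || closed[ny][nx]) continue;
                if (g[by][bx] + 1 < g[ny][nx]) {
                    g[ny][nx] = g[by][bx] + 1;
                    open[ny][nx] = 1;
                }
            }
        }
    }

    The inner loop (pick the cheapest frontier cell, relax its neighbours) is exactly the kind of parallel, event-driven workload the comment imagines offloading to a co-processor.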

  • @omid_saedi · a year ago · +36

    Such awesome content you produce. It has something to teach nearly anybody, at any level of knowledge about the problem.

  • @stevesedio1656 · a year ago · +2

    Most of the data that moves around a server farm goes over copper, even when computers are paralleled.
    Light travels through fiber at about 65% of the speed of light in vacuum; signals travel through copper at about 60%.
    The devices that convert data to light have the same limits as the devices that drive wire.
    Light can send more than one signal using color (wavelength), but that only uses a small slice of the available bandwidth.
    Copper wire operates at a lower frequency (maybe 10 GHz vs 50,000 GHz), but uses the entire bandwidth of the wire.
    The big advantage fiber has is how far a signal can travel.
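
    For scale, a quick worked comparison of the propagation figures quoted above, over a hypothetical 100 m data-center run:

    $$t_{\text{fiber}} = \frac{100\,\text{m}}{0.65c} \approx 513\,\text{ns}, \qquad t_{\text{copper}} = \frac{100\,\text{m}}{0.60c} \approx 556\,\text{ns}$$

    A difference of only about 43 ns, which is why the case for fiber rests on bandwidth and reach rather than raw signal speed.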

  • @johnsavard7583 · a year ago · +22

    We've already made amazing strides in the power efficiency of computers. An IBM 360/195, with cache and out-of-order execution like most modern computers, used much more power. And go back to the days when computers used vacuum tubes instead of transistors, and their power consumption relative to the work they could do was much higher still.

    • @ko7305 · a year ago

      Epyc.

    • @mnomadvfx · a year ago

      That is true, but back then the worldwide use of computers was just a tiny fraction of what it is now.
      The increase in use means we need to push hardware efficiency ever further to keep up.

    • @0xD1CE · a year ago · +3

      Wirth's law. We don't necessarily need better computers; we need software to be more efficient. Nowadays it's normal for computer programs to occasionally crash due to memory leaks or bugs in the code. I work at a datacenter and I have to use an app on my phone for daily routine inspections, and the app crashes when open for too long... It's crazy how tolerant we've become of unstable software.

  • @aresmars2003 · a year ago · +5

    At least in Minnesota, during the winter heating season, I figure the waste heat from my PC all goes into heating my home. Of course, if I had better home insulation, I'd probably save more on heating that way!

  • @pterandon · a year ago · +6

    Superb presentation. Both "pop culture" exposure and real technical info for experts.

  • @RB747domme · a year ago · +8

    Fascinating, as Spock would say. It's an interesting decade or two ahead as we grapple with these different technologies, in the hope that one of them will become commercially mainstream and breathe new life into the industry for another 50 years or so, until something newer and more radical appears.
    Ica is also fascinating, especially its neuromorphic learning paradigm; it will definitely accelerate the rate at which robots can learn their surroundings and interact, as well as learn from their past and build on their intelligence.
    The future is definitely bright.

  • @michaelmccoubrey4211 · a year ago · +6

    Photonic computers, neuromorphic computers, and CPUs that use carbon nanotubes are very interesting, but frankly, if we wanted to dramatically reduce computer power consumption we could already do it today.
    We could:
    - use the programming languages C or Rust instead of popular programming languages like Python (which is something like 45 times less efficient)
    - use RISC-based CPUs such as ARM or RISC-V chips
    - underclock CPUs so that they maximise power efficiency rather than trying to maximise performance (see the sketch after this list)
    - use operating system drivers that aim to use minimal power
    If we did these things we could probably use < 1% of the power we currently use. We don't do them largely because it would be slightly more inconvenient and would require social change rather than innovations in technology.
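
    On the underclocking point: Linux already exposes per-core frequency policy through the cpufreq sysfs interface. A minimal sketch (assumes a Linux system with a cpufreq driver loaded and root privileges; the set of available governors varies by kernel and hardware):

    #include <stdio.h>

    /* Read cpu0's current cpufreq governor, then request "powersave".
       Valid governor names are listed in scaling_available_governors. */
    int main(void) {
        const char *path =
            "/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor";
        char cur[64] = {0};

        FILE *f = fopen(path, "r");
        if (!f) { perror("open (is cpufreq available?)"); return 1; }
        if (fgets(cur, sizeof cur, f)) printf("current governor: %s", cur);
        fclose(f);

        f = fopen(path, "w");
        if (!f) { perror("open for write (need root?)"); return 1; }
        fputs("powersave\n", f);  /* trade peak clocks for lower power */
        fclose(f);
        return 0;
    }

    Selecting the "powersave" governor trades peak clock speed for lower power draw, the software-only version of the trade-off the list above describes.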

    • @User9681e · a year ago · +2

      There is benefit to higher-level languages too. C is only for low-level / high-performance stuff, and it is absolutely irreplaceable there; for other stuff you need higher-level languages like Rust, Python or Java to actually get projects done in time and not mess with optimization too much. There is always a use for both.
      About underclocking: the PC has a thing called a governor that decides clock speed per workload, so that's already done when you don't need max performance.
      Plus we have E-cores.

    • @rajuaditya1914 · a year ago · +5

      This is such a normie take that it is hilarious.

    • @User9681e · a year ago · +3

      @@rajuaditya1914 We're all interested in learning, and yeah, we normies may not understand key concepts.

    • @ericmoulot9148 · a year ago

      @@rajuaditya1914 Sounds like a reasonable argument to me. Maybe you have some insights to share that'd change my mind and his?

    • @bhuvaneshs.k638 · a year ago

      Bruhhh, your point number 1 doesn't make any sense; you're talking in the software domain. Plus, there's a reason everyone uses Python: it's faster to build and test.

  • @user-ue7wu2dh4o · a month ago

    Brilliantly explained to the layman, with such recondite acumen.

  • @dan_rad · a year ago · +20

    Could we solve a lot of the energy problem by writing more efficient code? It seems that as processing power has increased, developers have become less concerned with memory constraints. There is also a lot of pressure to push new features at the expense of optimised code (and of course more and more abstraction layers in coding).
    It's like Parkinson's law, but with computer memory.

    • @zonegaming3498 · a year ago · +4

      I could see AI generating new ways to solve computational problems that reduce the need to compute them.
      For example, DLSS or AI upscaling.

    • @Meleeman011 · a year ago · +1

      Nope, because in order to make money, code needs to be shipped fast. What we can do is encode more in less: instead of binary computers we could use ternary, or even quaternary, computation, which would increase the number of possible calculations. The reason developers are less concerned with memory constraints is that it's expensive to write efficient code; it takes longer, you need to understand more math and more about how computers work, and it's also more prone to bugs and errors. What you need is something simple enough to write but with enough control for the task at hand, and most people don't even know until the product is shipped; optimizations happen after the product is built. A real solution would be using analog computers, a whole bunch of them, to do specific calculations and then translate the results into binary. This, in principle, is how and why ASIC mining exists: instead of abusing sand and making it think, we simply let the silicon read the electrical charge outputs from several mechanical computer counterparts and process those inputs, so it needs less power, only having to read from those counterparts and do a few calculations here and there.

    • @Meleeman011 · a year ago · +2

      The quickest thing you could do is learn how to use Linux and a terminal; you'd already be using less power than the majority of people. Use a window manager like i3wm, and use more terminal applications, including on your phone with Termux. It's not as convenient, but you can do quite a bit with a terminal; so much that I'm convinced real work is done in a terminal.

    • @velvetypotato711 · a year ago · +2

      @@Meleeman011 Using C++ server-side can also reduce energy usage.

    • @andyfreeze4072 · a year ago

      @@Meleeman011 Mate, I don't want to learn any more than necessary when it comes to computers. They are supposedly made to adapt to us, not us to binary code. Yes, I have done Unix and Linux before, but I gave up; I don't wish to reinvent the wheel, I will let other nerds like you do that. Yes I can do this and that, but I choose not to. I can think of better things to do with my life.

  • @deiphosant · a year ago · +5

    People always talk about how more efficiency will lead to less energy consumption. But if I know anything about humans, it's that they will always push to the limits (and power/thermals are the limiting factor right now), so I feel more efficient chips are just going to lead to even more computers and increased performance instead of decreased power draw.

    • @RobinOnTour · a year ago · +2

      Energy consumption would increase either way

  • @alecs536 · a year ago · +11

    I'm glad that Pam from "The Office" has finally found her calling

  • @chi4829 · a year ago · +2

    15:11 The interconnect cables devised to mitigate energy-consumption challenges in data centers are simply optical-fiber interconnects plugged directly into the ASIC. Co-packaged optics bridges the gap between electronics and photonics by integrating them on a common platform, with the photonic chip serving as the first point of contact to the external world.

  • @gusauriemo · a year ago · +2

    Nengo is software that currently works with Loihi, with the intention of enabling software applications for neuromorphic chips. The research they do at Waterloo in general is quite interesting.

  • @roopkaransingh1794 · a year ago · +5

    Amazing content, such cool information; please keep coming up with these kinds of videos.

  • @gkess7106 · a year ago · +2

    Light travels in "conductors", not "wires".

  • @vrajpatel3139 · a year ago · +7

    The neuro computer looked fire 🔥

  • @zAlaska · a year ago · +2

    The Cerebras wafer-scale exascale processor has 850,000 cores, each core itself a supercomputer, all ultra-interconnected without bus-speed limits; it outperforms every supercomputer ever built, all on one chip. They believe they have found the pathway to singularity. I gather the only supercomputer that's faster doesn't exist. At 44,000 watts, perhaps it could do two jobs at once, heating water in the room while predicting the future; it's that fast. You know how when you make a movie it takes forever for the processor to finish with it? Picture simulating a nuclear explosion, or fluid dynamics: current supercomputers draw the event much more slowly than it happens in actuality, while this chip can do more work faster and predict the event accurately, in great detail, faster than it occurs. Made at TSMC at the 5-nanometer scale. Strangely, Moore's law will continue: IBM has already produced chips at two nanometers, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.

  • @PrinceKumar-hh6yn · 8 months ago

    I am heavily impressed, and amazed at the same time, by the kind of presentation Bloomberg has put together here. PURELY scientific...

  • @ojhuk · a year ago · +49

    Photonics is the future. I've been blown away by the ways they are devising to build logic gates that function by altering photons at the quantum level. Light-based computers have been a mainstay in science fiction for a long time now, and it's amazing to see actual real-world advances with practical applications being made.

    • @koiyujo1543 · a year ago

      Well yeah, but maybe; it depends. We can always make our current electronic ones better besides trying to add more transistors. I mean, we'll need better materials; something like graphene could make computers hundreds of thousands of times faster.

    • @ojhuk · a year ago · +4

      @@koiyujo1543 Yeah, I agree, there are still advancements to be made in electronics. I imagine hybrid photonic/electronic systems will become a thing before we get any fully photonic chips; from what I understand, the benefits of photonics for latency and efficiency go far beyond what is possible with electronics.

    • @katarn848 · a year ago · +4

      I have my doubts about carbon, for after High-NA EUV lithography reaches its limit with silicon wafers; that's maybe two decades out. I think there will be limits found in how far you can go in complexity, layers and materials.

    • @billfarley9015 · a year ago

      Offhand I can't think of any examples of light-based computers being a mainstay of science fiction. Can you cite any?

    • @ojhuk · a year ago · +5

      @@billfarley9015 I can't offhand either. I thought about Data from Star Trek: TNG, but he's positronic. I also thought about Voyager's computer, but IIRC that's organic. There is Orac, the Liberator's supercomputer from Blake's 7; I always assumed that was photonic, but I may be wrong. I'm sure if I looked hard enough I'd find something soon enough; sci-fi writers have a far greater imagination and scientific knowledge than myself. :)

  • @_Pickle_Rick_ · a year ago · +6

    So maybe the AI bottleneck (i.e. level-5 autonomy, 'general AI', etc.) is due to the binary nature of the base-layer architecture; from this, it sounds like the analogue stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...

  • @ankitroy3319 · a year ago · +1

    This is really what YouTube should recommend.

  • @BradenLehman · a year ago · +1

    24:19 "What is this called" *flips off the teacher* 🤣

  • @AlexTrusk91 · a year ago · +3

    I'm glad to own some old-school household hardware, like my kettle and toaster, that doesn't rely on chips and hasn't lasted just 3 years but over 30 years and counting (got the stuff from my parents, and maybe I'll give it to the next generation as magical relics of another time).

  • @sergioespitia7847 · a year ago · +7

    Definitely. This is a pretty important documentary for inspiring engineers all over the world.

  • @weirdsciencetv4999 · a year ago · +4

    Neuromorphic computers are the next key technology.
    What would be interesting is if chips could be made more three-dimensional, as opposed to the relatively two-dimensional chips afforded by conventional lithography techniques.

  • @sergebillault730 · a year ago · +6

    The best alignment process I know of uses magnetic fields. Is there a way to make these nanotubes, or the environment in which they are stored, temporarily magnetic?

  • @nicolasdujarrier · a year ago · +6

    I think a few other options have not been discussed, like spintronics (with MRAM already on the market), and maybe (flexible) organic electronics…

  • @drivenbyrage5710 · a year ago · +3

    Imagine how much we could save if millions of people weren't living online every waking minute seeking validation, and simply put their phones down. It would save not only energy, but humanity itself.

    • @SumitPalTube · a year ago

      Remember, it is these validation-seeking individuals who are pushing scientists and engineers to innovate and come up with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.

  • @jaredspencer3304 · a year ago · +7

    It's got to be tough for these new technologies to compete with silicon, which has had 50 years of uninterrupted exponential growth. Even if a new technology could be better than silicon, it might never get there, because it can't immediately be small enough or fast enough or cheap enough to compete with the cutting edge.

    • @latvialava6644 · a year ago · +1

      New challenges will create new opportunities. Maybe not with commercial applications, but these technological breakthroughs will begin their own journey with defence and space applications!

    • @femiairboy94 · a year ago

      They will eventually get cheap enough; that's the beauty of the capitalist system. 50 years ago owning a computer was impossible; today the average American has two computers.

  • @BlackBirdNL · a year ago

    24:33, "Here is the mouse." Proceeds to point at the Jell-O.
    Jump cut to him pointing at the mouse.

  • @CrackDavidson1 · a month ago

    Even though these chips would be more expensive to produce, the power savings in use make a huge dent in the lifetime cost of those chips; this is what big data centers live on. So if there is a 1000-times reduction in power usage, *theoretically* they can cost 1000 times more to produce and still be competitive.
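
    A rough worked version of that lifetime-cost argument, with made-up numbers (the 1000x price premium only pencils out where electricity dominates lifetime cost, e.g. long service lives, high utilization and cooling overhead; with the toy figures below the tolerable premium is closer to 4x):

    #include <stdio.h>

    /* Toy lifetime-cost comparison. All figures are illustrative;
       real TCO also includes cooling (PUE), racks, maintenance, etc. */
    int main(void) {
        double hours = 5.0 * 365.0 * 24.0;  /* 5 years of 24/7 operation */
        double usd_per_kwh = 0.10;

        double a_price = 500.0, a_watts = 300.0;  /* conventional chip */
        double b_watts = a_watts / 1000.0;        /* 1000x less power  */

        double a_energy = a_watts / 1000.0 * hours * usd_per_kwh;
        double b_energy = b_watts / 1000.0 * hours * usd_per_kwh;

        double a_tco = a_price + a_energy;
        /* the efficient chip breaks even at any price below this: */
        double b_breakeven = a_tco - b_energy;

        printf("chip A lifetime cost : $%.0f ($%.0f of it electricity)\n",
               a_tco, a_energy);
        printf("chip B breaks even at: $%.0f (%.1fx chip A's price)\n",
               b_breakeven, b_breakeven / a_price);
        return 0;
    }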

  • @xanokothe · 6 months ago · +1

    The thing is that while building data centers, these tech companies also build solar and wind.

    • @phelan8385 · 3 months ago

      They're gonna start building reactors in the data centers

  • @grumpeepoo · 9 months ago · +1

    There is already a neuromorphic chip company from Australia called BrainChip, with their 2nd-generation Akida chip out. They have partnered with ARM, Intel, Prophesee, and MegaChips, to name a few.

  • @vigneshs6232 · a year ago

    Wonderful... Great work... Enormous knowledge... Thank you all....

  • @duckmasterflex · a year ago · +1

    Reducing energy consumption is like adding more lanes to a highway: it won't reduce traffic, it will just add more cars.

  • @ryansumbele3552 · a year ago · +2

    Very informative, thank you from Cameroon ❤️

  • @nishantaadi · a year ago · +10

    Semiconductors are the world's new gold and oil.

    • @anantsky · a year ago

      There's nothing new about semi-conductors.

    • @organicfarm5524 · a year ago

      Semiconductors came about in the 1930s; they are not new.

    • @zzmmz3789 · a year ago

      But still begging for oil

  • @carl8790 · 8 months ago

    @3:32 There should have been a huge asterisk on the figure of 5 nm (nanometers). The transistors aren't actually 5 nm in size; "5 nm" is just the manufacturing company's marketing for its next-gen transistors. Because transistors can be manufactured and packaged differently, there's no agreed industry-standard measure. Some fabs (the places where semiconductors are built) give a density figure of x transistors per mm² of die, but even that is difficult to verify independently.

  • @rheung3 · a year ago · +1

    Thanks so much for such a beautifully illustrated video about the modern economy and its current limits, and about those with the courage and wisdom to go beyond them, hopefully while also starting to reverse the energy footprint driving the climate-change disaster.

  • @jessty5179 · a year ago · +1

    0:00 Misleading image: a nuclear power plant doesn't produce carbon emissions; in fact it's one of the best alternatives for that specific problem.

  • @BloodyMobile · a year ago · +2

    I'd like a study on what percentage of this power consumption falls on user profiling and the processing needed for it.
    I wouldn't be surprised if it's around or above half...

  • @rafaelysais8483 · a year ago

    The advertisement just made tinnitus an effing symptom.

  • @martinblake2278 · a year ago · +1

    I was surprised that Mythic's chip was not included in this video to represent analog. The neuromorphic computing part being developed in Mumbai (in this video) was already created by those guys years ago, and it currently has computing power equivalent to current digital standards while using only 3 watts of energy.

    • @I___Am · 10 months ago

      Mythic chip?

  • @stevegunderson2392 · a year ago · +4

    Putting a computer into a toaster is the dumbest use of computing power one can imagine. Are you so lonely that you need emails from your refrigerator? Security = control.

  • @jaqhass · a year ago · +1

    Wait... recognize odors? Does that mean chips will be used to snag drug mules? That's awesome!

  • @johndawson6057 · a year ago · +3

    This was the best explanation I have heard of quantum tunneling. Thanks, guys.

  • @shapelessed · a year ago · +1

    There are datacenters that reuse the heat they generate to provide central heating to the towns around them. That's just one example of how much power is spent on computing, if it's enough to heat the houses around you and it's actually even profitable to do that.

  • @mousatat7392 · a year ago · +22

    Even though there are hundreds of companies racing in this field, all of them are pushing the world forward, even those that don't win the pie at the end.

  • @pedropimont6716 · a year ago · +1

    People forget to consider how a smartphone saves energy by replacing old solutions like physical maps, books and so on, which also take energy to produce and pollute the planet.

  • @laughingvampire7555 · a month ago

    The real benefit comes from using analog computers, which can optimize the usage of energy.

  • @Wulfcry · a year ago · +1

    Reducing environmental impact by choice is said to be used as a performance measure in advancing chip design. It could be worse to do no development and never release them: most designs end up on the shelf without ever being released, not even as small, partly functional designs. All the cost goes to the larger processing of data, while the least data processing works just as well to uncover much of a design. However, I applaud how they go about it.

  • @D-Z321 · a year ago

    My dad works as an electrical engineer in the semiconductor industry. Pretty crazy stuff.

  • @TotallyRandomHandle · a year ago · +1

    I'm a science nerd and fan, but I hope there are simultaneous efforts to develop safe disposal of the carbon-nanotube solution (5:19). The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of the output is already known to be unwanted by-products (always-conducting metallic nanotubes).

  • @qwertyali2943 · a year ago

    This really got my attention because of my deep interest in science and tech. Thanks, Bloomberg!!

  • @bhuvaneshs.k638 · a year ago · +15

    As an Indian, I have to compliment the US for pushing this bleeding-edge R&D. I work in the neuromorphic computing area in India, and I'm sure India will start competing with the USA soon.

    • @lophilip · a year ago · +2

      I don't work in that specific field but I'm certain that India will have much to contribute in that technology!

    • @Zipsnis608 · a year ago

      India has a great future. However, the day the world starts to cooperate more instead of competing will be the best day for humanity; I am also sure that it won't happen any time soon. People are just too self-centred and primal.

    • @J_X999 · a year ago

      China's already much further ahead in all types of chips, whether carbon, photonic or RISC

    • @bhuvaneshs.k638 · a year ago

      @@J_X999 You mean compared to the USA?

    • @J_X999 · a year ago · +1

      @@bhuvaneshs.k638 Compared to India. India has potential, but it just isn't as big as many people think.

  • @TheRomanTimesNews · a year ago

    13:00 talk to me boi
    you got me at photon

  • @studiolezard · a year ago · +1

    Wouldn't a high-frequency vibration, like ultrasound, while in suspension help to align the nanotubes?

  • @pinkliongaming8769 · a year ago

    Wow I can't wait for my Quantum Carbon Photonic Neuromorphic Toaster

  • @MrChronicpayne · a year ago · +1

    The guy with the beard is a great commentator/middleman for this Quicktake. Hope to see him again.

  • @blueguy5588 · a year ago

    Some corrections: 1) the 5 nm process isn't actually 5 nm, it's a marketing term, so the graphic is inaccurate; 2) modern chips are already layered.

  • @spaceprior · a year ago · +1

    Hey Bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look further into things and read more.

  • @PrivateSi · a year ago

    Nice succinct, informative, up-to-date vid and objective analysis. Photonic computing is definitely the way forward. Neuro-photonic and even bio-photonic computing will combine well in the future once the tech is worked out: 1000x more computing using 1000x less power within 20 years. Moore's law will be utterly broken, but in a productive way, via a large tech leap or two, rather than slowing to a standstill as many YouTube videos claim.

  • @TheDane_BurnAllCopies · 4 months ago

    0:40 Factories use more energy than all the smartphones all over the world.

  • @celdur4635 · a year ago · +3

    I think we will always pump in more power even if we have more efficient chips, because there is no limit to what we want to do with them.
    So cool, more efficient chips are great, BUT we will still increase our energy consumption.

  • @FS-ft8ri · a year ago · +1

    Maybe dielectrophoresis, in combination with flow fields in solution, is a way of tuning and improving the alignment over pre-treated (for instance by lithography) Si wafers with inhomogeneous surface energy.
    It worked out pretty well for GaAs nanowires in a study we conducted at the university, aligning them parallel at the contacts.

    • @ivanlam1304 · a year ago

      Am I correct in thinking that the aligned nanotubes would form a large scale matrix of potential MOSFET transistors?

    • @FS-ft8ri · a year ago

      @@ivanlam1304 In principle I think you could manage to make it that way; however, I have to admit that I am no expert in transistor technology.
      My knowledge comes more from surface science/electrochemistry/interface science, especially solid-liquid interfaces.

  • @anushantony · a year ago · +1

    Beautifully put together.

  • @JG_UK · a year ago · +5

    Amazed at how long these alternatives to silicon have been in development. Seems like we're stuck with silicon wafers for this generation.

  • @kayakMike1000 · a year ago · +2

    I will have that researcher know... my brain uses at least 25 watts when I am sleeping. It's well over 32 watts when I am awake.

  • @JJs_playground · a year ago

    This was a great little mini-documentary.

  • @ChrisBrengel · 9 months ago

    The first minute does a great job explaining how much electricity computers use.

  • @piyh3962 · a year ago

    This is underselling the photonics already in the market and in data centers: millions of miles of fiber optics have been running for years on photonics.

  • @QuaaludeCharlie · a year ago · +1

    All we need is DVD-RAM, don't you get it? :\ QC

  • @sciamachy9838 · a year ago

    Yeah, when I discovered that, I started watching videos in 720p, and 144p when I'm doing other things; less data transmitted equals less CO2.

  • @dekev7503 · a year ago · +1

    As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, and some have led to entirely different technologies.

    • @Typhonnyx · a year ago

      Really? Like what, for example?

    • @boptillyouflop · a year ago

      Any technology that can build a >1 GHz 32-bit exact adder...

  • @snooks5607 · a year ago · +1

    0:10 Ah yes, the classic: "carbon emissions" over footage of water vapor from cooling towers. Why do they keep doing that?

  • @jamesjanse3731 · a year ago · +3

    If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before they are removed from the semiconducting ones?

    • @gkess7106 · a year ago

      Not when they are copper.

  • @nickvoutsas5144 · a year ago

    Light traveling through optical chips is the future.
    Combining the calculations of a traditional binary computer with a quantum computer makes sense.

  • @femiairboy94 · a year ago

    It’s amazing that just a hundred years ago we barely had cars on the road. The speed at which technology is developing is something else.

  • @wonkafactory936 · a year ago

    Gaming at the speed of light. Trading stocks at the speed of light. Downloading at the speed of light. Money transactions at the speed of light. I love the thought already.

  • @chrismv102 · a year ago · +4

    So... how is it that a computer materials scientist expresses himself by saying "100 times smaller..."? It's become popular to use "times smaller" in the media, but it's very imprecise.

    • @3rdvoidmen594 · a year ago · +2

      Easier to understand for a layman. How would you phrase it?

  • @shadow-sea · a year ago

    absolutely fascinating