Photonic Neuromorphic Computing: The Future of AI?

  • Published: 25 Dec 2024

Comments • 636

  • @quimblyjones9767
    @quimblyjones9767 3 years ago +107

    The way you are able to compress complicated ideas or concepts down into easily digestible bytes... It really is astounding.
    Thank you for making this channel.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +18

      You're very welcome!

    • @Reziac
      @Reziac 3 years ago +7

      And without dumbing it down, too. The mark of a great teacher.

    • @gusgone4527
      @gusgone4527 3 years ago +4

      His manner and style are those of a very good British teacher. I bet he is even better with a live class of students giving real-time feedback. So watch, listen and learn from a master.

    • @slimeball3209
      @slimeball3209 3 years ago +1

      @@Reziac In fact, when we understand something, we compress complicated ideas down to their meaning (like iron ore: there are too many atoms to store every single one, so we just need to know which atoms and structures the piece of ore contains and that they repeat).
      But in the process of understanding, we really do store a lot of "trash" data in the brain.

    • @Reziac
      @Reziac 3 years ago

      @@slimeball3209 Tell me about it. My brain saves everything... and can't be arsed to index it. So when one datum is processed, piles of junk data swirl up from sludge...

  • @zebop917
    @zebop917 3 years ago +112

    Intel's choice of Loihi (Lo-eee-he) as a name is quite subtle. It's presently an undersea volcano on the flank of the Big Island of Hawai'i, hidden from view, but in the future it'll be a giant island all of its own.

    • @ianchan2624
      @ianchan2624 3 years ago +1

      You mean the Pacific garbage patch's reclamation?

    • @synoptic1505
      @synoptic1505 3 years ago

      It means, by the idea of petite androgynous boy children

    • @maloxi1472
      @maloxi1472 2 years ago

      @@ianchan2624 Nope, but you already knew that 🤷🏻‍♀

  • @johnphilippatos
    @johnphilippatos 3 years ago +18

    Nothing better than watching a video on a difficult subject that provides its research sources. Shows that the creator of the video did his homework.

  • @aloysiussnailchaser272
    @aloysiussnailchaser272 3 years ago +27

    This is exceptionally high quality stuff. It reminds me of what the OU used to broadcast on BBC2 in the 1970s. I used to get up in the early hours to watch it, before we had a VCR. Not just computing, but physics, English, maths, history, chemistry. Whatever was on.

    • @Pulsonar
      @Pulsonar 1 year ago +1

      Yep, me too. I'm a 70s kid; I derived a strange masochistic pleasure from those dry OU lectures at night/early morning on BBC2 😂 This was well before the BBC Micro show in the early 80s put domestic and school computing on the map in the UK. The great old days 😉

  • @John_Locke_108
    @John_Locke_108 3 years ago +59

    Why do I have a feeling that this one is going to make my brain hurt a little bit? Man, how I look forward to 9:00 a.m. EST on Sunday.

    • @kevinshumaker3753
      @kevinshumaker3753 3 years ago

      EDT

    • @John_Locke_108
      @John_Locke_108 3 years ago

      @@kevinshumaker3753 Yes. I forgot to say EDT this time of year.

    • @gklinger
      @gklinger 3 years ago +2

      If your brain hurts a little it is because you’re using it correctly.

    • @DarthVader1977
      @DarthVader1977 3 years ago +3

      I like ketchup.

    • @John_Locke_108
      @John_Locke_108 3 years ago

      @@DarthVader1977 Mayo for the win.

  • @duncanmcneill7088
    @duncanmcneill7088 3 years ago +3

    Back in the late 70’s, when I studied electronics at University, we were imagining neural networks based on a self modifying, distributed type structure using a hybrid of analog and digital techniques. The technology didn’t exist back then.
    Digital processing took over and got us to the point we are today and now, 40 years later, the technology may be finally catching up.

  • @Kevin-mx1vi
    @Kevin-mx1vi 3 years ago +18

    When I saw the subject of this video I thought I could hear it whooshing clean over my head, but Chris explained it so well that I actually understood it!
    Dunno whether I'm more surprised at Chris or myself, but I definitely learned something today! 😀

  • @Grandwigg
    @Grandwigg 3 years ago +6

    I love the way Chris stated his conclusion of 'not overtake, but sit side by side with von Neumann architecture... as best fits the use case'. I wish we'd see more of this with new technologies, rather than the unrestrained hype like with graphene, carbon nanotubes, teflon, and so on.
    Loved this video.

  • @warrengibson7898
    @warrengibson7898 3 years ago +2

    I'm reminded why this is my favorite channel on all of YouTube. Tinkering with SBCs one week, looking at cutting-edge R&D the next.

  • @TopHatCentury
    @TopHatCentury 3 years ago +2

    Thank you very much for this amazing video! When I was taking a Cisco networking class a few years ago, I thought about how light-based computing was possible but I had a bit of trouble figuring it out in my mind. A few years later, here is a fantastic video explaining the concept of light-based computing. The times have certainly changed.

  • @Uniblab8
    @Uniblab8 3 years ago +23

    I hope I live long enough to experience some of this. It's terribly exciting!

    • @leendert1100
      @leendert1100 3 years ago +4

      Terrible indeed, not exciting at all.

    • @Prakhar_Choubey
      @Prakhar_Choubey 3 years ago +3

      @@leendert1100 It's completely alright if you can't understand it. It's rather complicated and requires some background, so no worries. But... it's awesome!

  • @Crobisaur
    @Crobisaur 3 years ago +6

    Working in my photonics lab in grad school, I always hoped to see the day computer architectures would change to enable new use cases for photonic computing.

  • @dxutube
    @dxutube 3 years ago +3

    Very inspiring. Good to hear classic architecture will be carrying on for decades yet.

  • @jk-ml5fb
    @jk-ml5fb 3 years ago +28

    When we get to the year 2030, Christopher will still not have aged one bit, and then we'll realize he has been computer generated all this time.

  • @LoporianIndustries
    @LoporianIndustries 3 years ago +2

    I imagine Photonic Quantum Computing as Neural Net Architecture, in the progression toward the complete quantification of the human being like a large set of algorithmic logic, as a fluid metamaterial of nanotechnology, is going to lead to the eventual development of the Physical Spirit Being. You are a Human. You are a Machine. You are Fluid Nanotech Photonic Quantum Neural Human, and you are Human Energy that can generate yourself. You are Physical. You are Virtual. You are Living Data whose Sentience translates that Data into Human Information.

  • @martt9889
    @martt9889 3 years ago +1

    As someone who's studying electronic engineering AND psychology, I loved this video, and I'll keep an eye on photonic neuromorphic computing in the near future.

  • @gaylancooper497
    @gaylancooper497 3 years ago +1

    While this topic is way above my knowledge, the way you explained the subject matter just blew my mind. I really didn't think I would understand it, but I definitely have an understanding of this type of computing now. Thanks for the great explanation and another great video.

  • @alandean2
    @alandean2 3 years ago +1

    We already have the nearest computational mechanism to the human brain (and mind). It is called the thermostat, which "thinks" just like the human brain by having the subjective experience of reporting "Now it is too hot, now it is too cold".

  • @jeus2884
    @jeus2884 3 years ago +4

    It's about time. When I first started studying this type of technology 25 years ago, before it even became available, I never thought I would actually live long enough to see other people making it a reality.

  • @deechvogt1589
    @deechvogt1589 3 years ago +1

    Chris, wow, okay, mind blown. I love taking a look ahead into the future of computing in all of its possible forms. Thanks for sharing this information.

  • @Colin_Ames
    @Colin_Ames 3 years ago +3

    That was a little different from our usual Sunday morning fare. Interesting topic that certainly provides food for thought. Thanks Chris.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +5

      I like to throw in a wildcard now and then. Next week we are controlling stuff with a Raspberry Pi Pico! :)

  • @squelchstuff
    @squelchstuff 3 years ago +3

    Brilliantly accessible coverage of the subject, Christopher. Thanks also for your research sources. Photonic Neuromorphic Computing is such a mouthful, so it's only a matter of time before the marketing bods come up with some strange acronym. PNC has a rather different meaning in the UK (depending on your proclivities/recidivism) :)

  • @scdesign1565
    @scdesign1565 3 years ago +1

    A character in a novel I am writing has such a brain. I'll let you know how it all works out! Very nice presentation of the concepts!

  • @srtcsb
    @srtcsb 3 years ago +1

    I had to watch this one twice. The first time, I was having brunch and couldn't give this information my full attention. Alas, the gray matter is still reeling. I used to think the physical foundation (the silicon wafer/chip) needed to be changed because it's (basically) reached its maximum efficiency. As it turns out... It's the bus, stupid! Moving the data around is where the bottleneck is. Our computing devices might be going back to the beginning in ten or fifteen years: The Radio Shack Light Computer, The Commodore 64000 Quantalaser, The IBM Laser Jr. (made by Lenovo, of course). Thanks for another great video Chris.

  • @jmsiener
    @jmsiener 3 years ago +1

    Who provides sources for their videos? Chris does!! Really, thank you for that. I know there are other channels that do it, but I think it raises the bar.

  • @thispandaispurple
    @thispandaispurple 3 years ago +1

    Sunday morning coffee and an interesting explainingcomputers video to watch! Off to a great start today!

  • @CnCDune
    @CnCDune 3 years ago +7

    I remember a movie about Time Travel and a photonic intelligence.
    "Time travel. Practical application."
    On that note, there's also a game featuring a von Neumann probe. Grey Goo, a fairly decent real-time strategy game.

    • @AlRoderick
      @AlRoderick 3 years ago

      The von Neumann probe is a different concept than the von Neumann architecture, invented by the same guy (he was a prolific futurist). He posited the idea of self-replicating machinery: macro-scale space probes that would go to other star systems, build copies of themselves, and send the copies forward to still more star systems. This process would repeat until you had visited every single star system in the galaxy. Other later thinkers applied that same concept to nano-scale machinery, which is where we get the grey goo idea that's the source of that game's name. Ironically, a nanomachine would probably not use the von Neumann architecture.

  • @madworld.
    @madworld. 3 years ago +34

    Even Star Trek didn't dare use such a name 😂😂😂
    Fascinating subject, indeed.

    • @janglestick
      @janglestick 3 years ago +3

      apparently, positronic networks = photonic neuromorphic networks + tasha yar

    • @saalkz.a.9715
      @saalkz.a.9715 3 years ago +3

      Duotronic circuitry and the M-5 Multitronic unit (TOS)... isolinear chips and the positronic neural network (TNG)... synthetic bio-neural gel packs (VOY)... the Borg... Are we all a joke to you?

    • @victorarnault
      @victorarnault 3 years ago

      I loved your comment.

    • @victorarnault
      @victorarnault 3 years ago

      @@saalkz.a.9715 I liked your comment even more.

    • @MikaelMurstam
      @MikaelMurstam 3 years ago

      @@janglestick No, positronics use positrons, which are positive electrons (anti-electrons).

  • @ManyHeavens42
    @ManyHeavens42 2 years ago +3

    We are at our greatest with the help of computers, not without them. Don't just see and hear. Think! Remember?

  • @ludwigvanbeethoven61
    @ludwigvanbeethoven61 3 years ago +1

    Wouldn't have expected that topic from you. Very cool. Thank you

  • @chriholt
    @chriholt 3 years ago +1

    Chris, you never cease to amaze me with your in-depth research and very clear explanations of new technologies. Thanks as always!

  • @PS_Tube
    @PS_Tube 3 years ago +5

    I'm getting notifications at least 2 minutes late. But once the notification pops up, I arrive here for my weekly dose of EC.

  • @Tense
      @Tense 3 years ago

    One of your best videos to date!

  • @LarryKapp1
    @LarryKapp1 3 years ago +1

    Thanks for explaining a complex subject in an understandable way.

  • @hasanalpaslan8202
    @hasanalpaslan8202 3 years ago +1

    Thanks for the very interesting and informative video. Both easy to understand and high quality in presentation.
    On the other hand, the possibility of such powerful machinery ending up in the wrong hands is scary indeed.

  • @lastinline1958
    @lastinline1958 3 years ago +11

    This sounds a lot like the premise for "Terminator".

  • @jogon1052
    @jogon1052 3 years ago +3

    What an interesting subject, especially when IBM has released information on the 2-nanometre chip it has been researching and is able to produce. Just imagine trying to keep up with all of this future technology. Great video, Chris, and congrats on keeping up with all the research you must have to do to stay on top of this information.

  • @jeraldgooch6438
    @jeraldgooch6438 3 years ago +2

    Christopher,
    1. Please excuse the tardiness of this note. I watched the video on Sunday, but got waylaid by the world. Namely Mother's Day.
    2. I can see why you would be a good and effective lecturer at university. You took a very complex topic and condensed it down to a number of relatively easily digestible bits and presented them effectively. In other words, you effectively dumbed down a complex topic without talking down to me. That is hard to do.
    3. So, it appears computers are making advances in several areas:
    a. semiconductors go for ever smaller trace sizes, compressing more and more transistors into smaller and smaller spaces
    b. quantum computing is still being researched, and one sees bits and pieces about advances in this area and some semi-practical devices
    c. and now photonic computers
    d. (no doubt there is still someone out there touting fluidics as the cure for what ails you)
    4. One wonders what programming languages (and operating systems) will look like for photonic and quantum computing devices. Will the interfaces still be electronic, with visual and touch devices, or will there be a more direct interface with the brain?
    5. When will we be seeing the SBC version of a photonic computing device from someone like Raspberry Pi? 😊

  • @kerriepatterson7641
    @kerriepatterson7641 3 years ago

    I wish this video had been available when I was doing research for my novel, Twisted Light. I absolutely believe that photonics will be the future in computing and in communication. Great video. Thanks.

  • @jhonbus
    @jhonbus 3 years ago +20

    I can't be the only one who joins in the recitation of Chris's intro and outro for every video like I'm chanting some sort of religious creed?

  • @megatronDelaMusa
    @megatronDelaMusa 1 year ago +1

    Kamasutra has evolved beautifully. A neuromorphic internet infrastructure would set the world ablaze. Our ability to build synthetic artificial synapses and dendritic branching would give a whole new meaning to the future. AGI has arrived on our blind side.

  • @savirien4266
    @savirien4266 3 years ago +1

    Some of our shortest wavelength UV lasers are in the 270nm range. That’s quite large compared to current transistor sizes. There’s a reason electron microscopes can image smaller objects than optical microscopes.

  • @alexhudspeth1213
    @alexhudspeth1213 3 years ago

    This video is "required watching" for the Human Resistance against Skynet. Wonderful video, Chris; thanks!

  • @ObsidianMercian
    @ObsidianMercian 3 years ago +1

    Thank you Chris for this incredibly informative video. I had already come across neuromorphic computing, but was unaware of photonic neuromorphic computing, so this information is greatly appreciated.

  • @ManyHeavens42
    @ManyHeavens42 2 years ago +2

    It's true, you're creating a symphony! Genius.
    You rock!

  • @akk5830
    @akk5830 2 years ago +1

    It would be highly appreciated if you could explain the technology of photonic neuromorphics in depth.

  • @piconano
    @piconano 3 years ago +2

    Back in the early 80's, IBM patented a process that used red and green lasers to read and write a crystal made of the same molecule found inside your eyes.
    When the red laser hit this molecule, it turned into an "L" shape. When hit with the green laser, it would turn into an "I" shape and straighten out. Or something like that.
    They said they could fit 10 Libraries of Congress on a crystal the size of a sugar cube! They were going to make solid-state optical drives with fast read and write speeds (in nanoseconds).
    I never heard anything after that. Does anyone else remember this?

  • @sallienewton7184
    @sallienewton7184 3 years ago +1

    This is one invaluable lesson. Thank you for your clear and concise explanation of photonic neuromorphic computing!

  • @larrywebber2971
    @larrywebber2971 3 years ago

    Another consideration is the yet-to-be-designed programming languages that will give the best chance of taking advantage of this newer architecture while remaining reasonably approachable by us mere mortal programmers. It seems there's a big learning curve ahead for both hardware and software designers, IMO.

  • @cdl0
    @cdl0 3 years ago

    The photonic part of this technology is something people have been thinking about for more than forty years; however, suitable materials have always remained a tricky problem.

  • @semuhphor
    @semuhphor 3 years ago +3

    ok ... I have to admit, Christopher, that this one makes me feel like a photonic neuromoron. Thanks for the vid. :D

  • @shyamasingh9020
    @shyamasingh9020 3 years ago +1

    Thanks very much for explaining complex concepts in a crystal clear concise manner.

    • @meetoptics
      @meetoptics 3 years ago

      We agree!
      From the MEETOPTICS team!

  • @danieldc8841
      @danieldc8841 3 years ago +1

    This is really well-explained; you did a great job of explaining the key concepts and why they're important. Thanks for making this!

    • @meetoptics
      @meetoptics 3 years ago

      Yes, the MEETOPTICS team completely agrees!

  • @Barnardrab
    @Barnardrab 3 years ago +1

    We often think of data in terms of bytes instead of bits.
    So regarding the point at 3:00, 100 gigabits is equal to 12.5 gigabytes.
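The conversion this comment makes can be sanity-checked in a couple of lines of Python (the 100-gigabit figure is taken from the comment itself; the only assumption is the standard 1 byte = 8 bits):

```python
# Convert a quantity quoted in gigabits to gigabytes (1 byte = 8 bits).
gigabits = 100
gigabytes = gigabits / 8
print(gigabytes)  # 12.5
```

The same factor of 8 is why a "100 Gbit/s" link moves only 12.5 GB of data per second.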

  • @edugio
    @edugio 3 years ago

    This is one of the best tech videos I've ever seen.

  • @aaronmatos5581
    @aaronmatos5581 2 years ago +1

    Thank you, sir! So much golden info.

  • @joaocarlosalmeida7325
    @joaocarlosalmeida7325 3 years ago +1

    Awesome video! Very inspiring for a physics student like myself; I hope it inspires more people in the technology sector! Great references as well!

    • @meetoptics
      @meetoptics 3 years ago

      It is so good! It has inspired MEETOPTICS, definitely!

  • @1000left
    @1000left 3 years ago

    WOW!!! That is a great video thank you!!!! It's a GREAT time to be alive!!!!!

  • @saturno_tv
    @saturno_tv 3 years ago +3

    Good morning Mr. Barnatt. Here finally for the 10th 🥇 gold. Always supporting. The best tech stuff on the internet is here. Many thanks. First.

    • @ExplainingComputers
      @ExplainingComputers 3 years ago +1

      Thanks for your support -- 10th Gold medal awarded! :)

    • @williamhorton9763
      @williamhorton9763 3 years ago +1

      @@ExplainingComputers Does Saturno live next door to you?

  • @timmurphy5541
    @timmurphy5541 3 years ago

    As you mention at the end, our brains don't use photonics and we haven't yet learned how to make an intelligence that's even as good as our own. So there's something more to the design than just going fast.

  • @akk5830
    @akk5830 2 years ago +1

    This is the ultimate goal of computing

  • @walterig33
    @walterig33 3 years ago

    Your videos truly are great. I thoroughly enjoy them, they are so well structured and informative. Thank you kindly. Greetings from Barcelona.

  • @marceloabreu5749
    @marceloabreu5749 3 years ago

    Great work, Chris! Greetings from Brazil.

  • @Aditya-wb2uo
    @Aditya-wb2uo 3 years ago +2

    The video was great. I always like stuff like this. Keep making great videos like this :)

  • @slimplynth
    @slimplynth 3 years ago +28

    I always click like before it's even started :)

  • @blevenzon
    @blevenzon 3 years ago +1

    Brilliant stuff. Thank you so much

  • @TheOrganicartist
    @TheOrganicartist 3 years ago

    Excellent video. I've been following photonics & neuromorphic research for a while, and the possibilities are exciting.

  • @lorderectus1849
    @lorderectus1849 3 years ago +1

    Welcome to another video from Chris!

  • @TARS..
    @TARS.. 3 years ago +1

    When you mentioned wavelength multiplexing in photonic hardware my mind blew a little.

  • @meetoptics
    @meetoptics 3 years ago

    Congratulations on introducing this topic so well on YouTube. This platform lets us spread all we know about the field, and at MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward.

  • @desert-stormvet88
    @desert-stormvet88 3 years ago +2

    "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." - Dr. Ian Malcolm
    (Jeff Goldblum, Jurassic Park)

  • @G-3-A-R-Z
    @G-3-A-R-Z 3 years ago +2

    A video all about fixing screen tearing would be amazing, both on the AMD side and the Intel side, from browser tricks to enabling flags. Thanks. Great stuff.

  • @emmanueloluga9770
    @emmanueloluga9770 2 years ago +2

    THE NEXT FRONTIER

  • @largepimping
    @largepimping 3 years ago +1

    Very different from the typical content, but fascinating!

  • @4.0.4
    @4.0.4 3 years ago +2

    It's kinda surreal if you had to explain this to someone from the past. We'll go from using little magic sand squares into making our metal golems do stuff using microscopic rainbow magic.

  • @sbc_tinkerer
    @sbc_tinkerer 3 years ago +2

    Awesome deviation from the "normal" videos. Thank you, sir, for the glimpse into the future. Hope I live to see it. Must get those papers cited! 12 likes!!

  • @vroomik
    @vroomik 3 years ago +2

    Check out the company called Lightmatter, which is supposed to start selling its photonic processor this year, claiming up to 10x the speed of an Nvidia A100 on BERT while using 90% less energy (we'll see when Envise is tested IRL...).

  • @rogerkoh1979
    @rogerkoh1979 3 years ago +1

    Running faster and more energy efficient: it could mean lighter and longer-lasting mobile devices. Thanks for sharing. Looking forward to next week's show.

  • @lawrenceallwright7041
    @lawrenceallwright7041 3 years ago +12

    I'm just getting a sneaking feeling that my old PC with its built-in 5.25" floppy drive might be getting a bit out of date.

  • @ElmerFuddGun
    @ElmerFuddGun 3 years ago +3

    PICs... yeah, let's make up a _new definition_ of the PIC acronym/abbreviation. It is already used for _picture_ and _Peripheral Interface Controller_, but why not something else? - 3:18

    • @ElmerFuddGun
      @ElmerFuddGun 3 years ago +3

      I mean... the more definitions, the better, right? These people are smart enough to come up with this laser/light tech, but don't know how to use Google to see if their acronym is already in use?

    • @janglestick
      @janglestick 3 years ago

      Hey, uh... what has happened to reality, am I in the Mandela effect!?!
      Doesn't PIC mean Programmable Interrupt Controller to you people?

    • @ElmerFuddGun
      @ElmerFuddGun 3 years ago

      @@janglestick - Oh, I forgot about that. But the first thing that comes to my mind with PIC (in all caps) is the Microchip brand PIC microcontrollers. Billions of these little things are in use around the world, and they are the first MCUs I learned on.

  • @insanemainstream3633
    @insanemainstream3633 3 years ago +2

    Woot! Sunday EC!!!

  • @TradersTradingEdge
    @TradersTradingEdge 3 years ago +1

    Great video and starting point.
    Thanks very much!

  • @williama29
    @williama29 3 years ago +2

    Today is the day I wake up and see a new video from Explaining Computers on Mother's Day. Yay 🙂

    • @SBCBears
      @SBCBears 3 years ago +1

      You mean, "birthing persons", of course. We must start learning robo-speech asap. 😃
      Happy Mother's Day to you and yours.

    • @williama29
      @williama29 3 years ago

      @@SBCBears Thanks, I agree. This video has me curious.

  • @Antonio-fl3nr
    @Antonio-fl3nr 3 years ago +1

    I'm a bit late to the party. I didn't know this existed at all. I just hope it gets used for good purposes.
    Thanks. I understood everything, as always.

  • @sinjhguddu4974
    @sinjhguddu4974 3 years ago +1

    That was quite a head full! Thank you very much! Never knew about this at all.

  • @enocescalona
    @enocescalona 1 year ago

    I hope this stuff about light-based artificial-neuron computers can exist in RL soon. The idea sounds cool.

  • @gpalmerify
    @gpalmerify 3 years ago

    The primary limitation I read about decades ago for photonic computing in general was miniaturization. Simply put, the smaller and/or thinner a material (be it silicon or film) is, the more transparent it becomes. Try to get down to our modern IC sizes and signals will be lost.

  • @BlazeMaster
    @BlazeMaster 3 years ago +1

    I think they might be used in parallel and share some functionality; they might, however, eventually replace traditional computing, or simply continue to exist in parallel.

  • @D.u.d.e.r
    @D.u.d.e.r 1 year ago

    Thanks for this episode! It's clear that besides Moore's law dying, the von Neumann architecture is also dying, although much more slowly. I believe we are at the beginning of a longer transition phase, and I agree that we will see changes to classical computing in the next decade.
    Still, in comparison to the advancements of silicon photonics, the organic brain has HUGE advantages, especially in efficiency/energy consumption. It's truly a miracle what the human brain can do with such minimal consumption; it's on a completely different level compared to what we can create. Looks like our creators/gods are in a completely different league. Even though organic computing is something out of this world, it still has one BIG disadvantage: it doesn't last as long as its synthetic brothers. Yet it is still far ahead of what we can build, especially when it works in tandem and cooperation with other brains as one superorganism. This is what we haven't achieved as a species: fully interconnecting our brains' capabilities to solve the toughest issues and problems.

  • @SJPretorius000
    @SJPretorius000 3 years ago +1

    Oh yes, my fav channel, hey Chris!

  • @mcconkeyb
    @mcconkeyb 3 years ago +1

    Excellent video! This will be the foundation for true AI.

  • @MikaelMurstam
    @MikaelMurstam 3 years ago +1

    11:50 ...but photonic regular computers will also happen, and they will replace the ones we have. Photonics is not just for neuromorphic processors.

  • @SSingh-nr8qz
    @SSingh-nr8qz 3 years ago +1

    This video stimulated my aging Neuromorphic Computer in my skull.

  • @bigjoeangel
    @bigjoeangel 3 years ago +1

    I can see the presentation for Nvidia's 2036 flagship graphics cards including the words "Using actual light to process the AI to improve ray tracing, new photonic neuromorphic RTX cores boost performance by a factor of 1000."

  • @zackaboy1236
    @zackaboy1236 3 years ago +1

    This is amazing technology; the only problem is that components etc. will have to be bigger until light can be directed at microscopic scales. Hopefully I'm making some sort of sense, Chris! 👍😃

  • @stanpotter7764
    @stanpotter7764 3 years ago +1

    I understood this video. I'm not that bright. Well done, Chris! 👏 Chris' next trick, teaching Calculus to a sea slug.

  • @bobwong8268
    @bobwong8268 3 years ago +2

    👍👍👍👍👍
    I never fail to learn something new from you.
    Thank you Christopher!

  • @erlinglorentsen4262
    @erlinglorentsen4262 3 years ago +2

    Interesting.
    A couple of questions though:
    I generally don't do conspiracy theories, but I can't help wondering whether we should be scared for our own survival as a species. Yes, species evolve, but are we actively building our own replacements?
    I kinda felt like this was more of an "Explaining the Future" video.
    Finally, the basics: how is photonic logic built up currently? AND/OR/XOR etc. I can't visualize a photonic transistor.

  • @benh9350
    @benh9350 3 years ago

    The cutting edge of technology and futurecasting is always interesting! The story of how and when we get there is just as interesting. My sincere hope is that more good than bad comes from development; I'm sure the vast majority of happenings will be very positive. There is a nice kind of security in the time delay due to hardware or processing times, but having things move quickly and fluidly is really nice too. So I'm looking forward to seeing what comes along. I'm really excited about using entanglement in communication/information tech. I kind of wonder how far we will go... I guess only time will tell.

  • @HeavenlyWarrior
    @HeavenlyWarrior 3 years ago +1

    I miss this kind of subject on your channel. Perhaps it's better suited to your other channel?
    Very interesting content!