@@janglestick - Oh, I forgot about that. But the first thing that comes to my mind with PIC (in all caps) is the Microchip brand of PIC microcontrollers. Billions of these little things are in use around the world, and they are the first MCUs I learned on.
I'm a bit late to the party. I didn't know this existed at all. I just hope this gets to be used for good purposes. Thanks. I understood everything as always.
The primary limitation I read about decades ago for photonic computing in general was miniaturization. Simply put, the smaller and/or thinner a material (be it silicon or film) is, the more transparent it becomes. Try to get down to our modern IC sizes and signals will be lost.
I think they might be used in parallel and share some functionality; they might, however, eventually replace traditional computing, or simply continue to exist alongside it.
Thx for this ep! It's clear that besides Moore's law dying, the von Neumann architecture is dying too, just much more slowly. I believe we are at the beginning of a longer transition phase, and I agree that we will see changes to classical computing in the next decade. Still, compared to the advances in silicon photonics, the organic brain retains HUGE advantages, especially in the efficiency/energy-consumption area. It's truly a miracle what our human brain can do with such minimal consumption; it's on a completely different level compared to anything we can create. Looks like our creators/gods are in a completely different league. Even though organic computing is something out of this world, it still has one BIG disadvantage: it doesn't last as long as its synthetic brothers. But it is still far ahead in what it can do, especially when it works in tandem and cooperation with other brains as one superorganism. This is what we haven't achieved as a species: to fully interconnect our brains' capabilities to solve the toughest issues and problems.
I can see the presentation for Nvidia's 2036 flagship graphics cards including the words: "Using actual light to process the AI to improve ray tracing, new photonic neuromorphic RTX cores boost performance by a factor of 1000."
This is amazing technology; the only problem is that components etc. will have to be bigger until light can be directed at microscopic scales. Hopefully I'm making some sort of sense, Chris! 👍😃
Interesting. A couple of questions though: I generally don't do conspiracy theories, but I can't help wondering if we should be scared as a species for our own survival. Yes, species evolve, but are we actively building our own replacements? I kinda felt like this was more of an "Explaining the Future" video. Finally, the basics: how is photonic logic built up currently? AND/OR/XOR etc. I can't visualize a photonic transistor.
The cutting edge of technology and future casting is always interesting! The story of how and when we get there is just as interesting. My sincere hope is that more good than bad comes from development. I'm sure the vast majority of happenings will be very positive. There is a nice kind of security in the time delay due to hardware or processing times, but having things move quickly and fluidly is really nice too. So I'm looking forward to seeing what comes along. I'm really excited about using entanglement in communication/information tech. I kind of wonder how far we will go... I guess only time will tell.
The way you are able to compress complicated ideas or concepts down into easily digestible bytes... It really is astounding.
Thank you for making this channel.
You're very welcome!
And without dumbing it down, too. The mark of a great teacher.
His manner and style is that of a very good British teacher. I bet he is even better with a live class of students giving real time feedback. So watch, listen and learn from a master.
@@Reziac In fact, when we understand something, we compress complicated ideas down to their meaning (like iron ore: there are too many atoms to store every single one, so we just know that this piece of ore contains certain atoms and structures, and that they repeat).
But in the process of understanding, we really do store a lot of "trash" data in the brain.
@@slimeball3209 Tell me about it. My brain saves everything... and can't be arsed to index it. So when one datum is processed, piles of junk data swirl up from sludge...
Intel’s choice of Loihi (Lo-eee-he) as a name is quite subtle. It’s presently an undersea volcano on the flank of the Big Island of Hawai’i and hidden to view but in the future it’ll be a giant island all of its own.
You mean the Pacific garbage patch's reclamation?
It means, by the idea of petite androgynous boy children
@@ianchan2624 Nope, but you already knew that 🤷🏻♀
Nothing better than watching a video on a difficult subject that provides its research sources. Shows that the creator of the video did his homework.
This is exceptionally high quality stuff. It reminds me of what the OU used to broadcast on BBC2 in the 1970s. I used to get up in the early hours to watch it, before we had a VCR. Not just computing, but physics, English, maths, history, chemistry. Whatever was on.
Yep me too, Im a 70s kid, I derived a strange masochist pleasure from those dry OU lectures at night/early morning on BBC2 😂 This was well before the BBC micro show in the early 80s put domestic and school computing on the map in the UK, the great old days 😉
Why do I have a feeling that this one is going to make my brain hurt a little bit? Man how I look forward to 9:00 a.m. EST on Sunday
EDT
@@kevinshumaker3753 Yes. I forgot to say EDT this time of year.
If your brain hurts a little it is because you’re using it correctly.
I like ketchup.
@@DarthVader1977 Mayo for the win.
Back in the late 70’s, when I studied electronics at University, we were imagining neural networks based on a self modifying, distributed type structure using a hybrid of analog and digital techniques. The technology didn’t exist back then.
Digital processing took over and got us to the point we are today and now, 40 years later, the technology may be finally catching up.
When I saw the subject of this video I thought I could hear it whooshing clean over my head, but Chris explained it so well that I actually understood it !
Dunno whether I'm more surprised at Chris or myself, but I definitely learned something today ! 😀
I love the way Chris stated his conclusion of "not overtake, but sit side by side with the von Neumann architecture... as best fits the use case". I wish we'd see more of this with new technologies, rather than the unrestrained hype we got with graphene, carbon nanotubes, Teflon, and so on.
Loved this video.
I’m reminded why this is my favorite channel on all of YouTube. Tinkering with SBCs one week, looking at cutting-edge R&D the next.
Thank you very much for this amazing video! When I was taking a Cisco networking class a few years ago, I thought about how light-based computing was possible but I had a bit of trouble figuring it out in my mind. A few years later, here is a fantastic video explaining the concept of light-based computing. The times have certainly changed.
I hope I live long enough to experience some of this. It's terribly exciting!
Terrible indeed, not exciting at all.
@@leendert1100 It's completely alright if you can't understand it. It's rather complicated and requires some background. So no worries. But..... It's awesome!
Working in my photonics lab in grad school, I've always hoped to see the day computer architectures change to enable new use cases for photonic computing.
Very inspiring. Good to hear classic architecture will be carrying on for decades yet.
When we get to the year 2030, Christopher will still not have aged one bit; then we'll realize he has been computer generated all this time.
I imagine Photonic Quantum Computing as Neural Net Architecture, in the progression toward the complete quantification of the human being like a large set of algorithmic logic, as a fluid metamaterial of nanotechnology, is going to lead to the eventual development of the Physical Spirit Being. You are a Human. You are a Machine. You are Fluid Nanotech Photonic Quantum Neural Human, and you are Human Energy that can generate yourself. You are Physical. You are Virtual. You are Living Data whose Sentience translates that Data into Human Information.
As someone who's studying electronic engineering AND psychology, I loved this video, and I'll keep an eye on photonic neuromorphic computing in the near future.
That's a nice combination of subjects to be studying. :)
While this topic is way above my knowledge level, the way you explained the subject matter just blew my mind. I really didn't think I would understand it, but I definitely have an understanding of this type of computing now. Thanks for the great explanation and another great video.
We already have the nearest computational mechanism to the human brain (and mind). It is called the thermostat, which thinks just like the human brain by having the subjective experience of reporting "Now it is too hot, now it is too cold."
It is about time. When I first started studying this type of technology 25 years ago, before it was even available, and thought of something like this, I never imagined I would actually live long enough to see other people making the technology a reality.
Chris, wow, okay, mind blown. I love taking a look ahead into the future of computing in all of its possible forms. Thanks for sharing this information.
That was a little different from our usual Sunday morning fare. Interesting topic that certainly provides food for thought. Thanks Chris.
I like to throw in a wildcard now and then. Next week we are controlling stuff with a Raspberry Pi Pico! :)
Brilliantly accessible coverage of the subject, Christopher. Thanks also for your research sources. Photonic Neuromorphic Computing is such a mouthful, so it's only a matter of time before the marketing bods come up with some strange acronym. PNC has a rather different meaning in the UK (depending on your proclivities/recidivism) :)
A character in a novel I am writing has such a brain. I'll let you know how it all works out! Very nice presentation of the concepts!
I had to watch this one twice. The first time, I was having brunch and couldn't give this information my full attention. Alas, the gray matter is still reeling. I used to think the physical foundation (the silicon wafer/chip) needed to be changed because it's (basically) reached its maximum efficiency. As it turns out... It's the bus, stupid! Moving the data around is where the bottleneck is. Our computing devices might be going back to the beginning in ten or fifteen years: The Radio Shack Light Computer, The Commodore 64000 Quantalaser, The IBM Laser Jr. (made by Lenovo, of course). Thanks for another great video Chris.
Hi Steve. :)
Who provides sources for their videos? Chris does!! Really, thank you for that. I know there are other channels that do that but I think it raises the bar.
Sunday morning coffee and an interesting ExplainingComputers video to watch! Off to a great start today!
I remember a movie about Time Travel and a photonic intelligence.
"Time travel. Practical application."
On that note, there's also a game featuring a von Neumann probe. Grey Goo, a fairly decent real-time strategy game.
The von Neumann probe is a different concept from the von Neumann architecture, though invented by the same man (he was a prolific futurist). He posited the idea of self-replicating machinery: macro-scale space probes that would travel to other star systems, build copies of themselves, and send the copies onward to still more star systems. This process would repeat until every single star system in the galaxy had been visited. Later thinkers applied the same concept to nano-scale machinery, which is where we get the grey goo idea that is the source of that game's name. Ironically, a nanomachine would probably not use the von Neumann architecture.
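The replication scheme described in that comment compounds geometrically; here is a toy sketch (the copies-per-probe figure is purely illustrative, not something from the comment):

```python
# Toy model of von Neumann probe spread: each probe reaches a new
# star system and builds copies, which then repeat the process.
def probes_after(generations: int, copies_per_probe: int = 2, start: int = 1) -> int:
    """Total probes ever built after a number of replication generations."""
    total = start
    current = start
    for _ in range(generations):
        current *= copies_per_probe
        total += current
    return total

# Even at only 2 copies per probe, 30 generations yields over
# two billion probes (2**31 - 1).
print(probes_after(30))  # 2147483647
```

With a few hundred billion star systems in the galaxy, a few dozen replication generations is all the scheme needs, which is why the idea is so striking.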
Even Star Trek didn't dare such a name 😂😂😂
Fascinating subject, indeed
Apparently, positronic networks = photonic neuromorphic networks + Tasha Yar
Duotronic circuitry and the M-5 Multitronic unit (TOS)... Isolinear chips and positronic neural networks (TNG)... Synthetic bio-neural gel packs (VOY)... The Borg... Are we all a joke to you?
I loved your comment
@@saalkz.a.9715 I liked your comment even more.
@@janglestick No, positronics use positrons, which are positive electrons (anti-electrons).
We are at our greatest with the help of computers, not without. Don't just see and hear; think! Remember?
Wouldn't have expected that topic from you. Very cool. Thank you
Chris, you never cease to amaze me with your in-depth research and very clear explanations of new technologies. Thanks as always!
Hi Chris! :)
I'm getting notifications at least 2 minutes late. But once the notification pops up, I arrive here for my weekly dose of EC.
One of your best videos to date!
Thanks for turning a complex subject into something understandable.
Thanks for the very interesting and informative video. Both easy to understand and also high quality in presentation.
On the other hand, the possibility of having such powerful machinery end up in the wrong hands is indeed scary.
This sounds a lot like the premise for "Terminator".
What an interesting subject, especially when IBM has released information on the 2-nanometre chip it has been researching and is able to produce. Just imagine trying to keep up with all of this future technology. Great video, Chris, and congrats on managing all of the research you must have to do to stay on top of this information.
Christopher,
1. Please excuse the tardiness of this note. I watched the video on Sunday, but got waylaid by the world, namely Mother's Day.
2. I can see why you would be a good and effective lecturer at university. You took a very complex topic and condensed it down to a number of relatively easily digestible bits and presented them effectively. In other words, you effectively dumbed down a complex topic without talking down to me. That is hard to do.
3. So, it appears computers are making advances in several areas:
a. semiconductors go for ever-smaller trace sizes, compressing more transistors into smaller and smaller spaces
b. quantum computing is still being researched, and one sees bits and pieces about advances in this area and some semi-practical devices
c. and now photonic computers
d. (no doubt there is still someone out there touting fluidics as the cure for what ails you)
4. One wonders what programming languages (and operating systems) for photonic and quantum computing devices will look like. Will the interfaces still be electronic with visual and touch devices, or will there be a more direct interface with the brain?
5. When will we be seeing the SBC version of a photonic computing device from someone like Raspberry Pi? 😊
I wish this video had been available when I was doing research for my novel, Twisted Light. I absolutely believe that photonics will be the future in computing and in communication. Great video. Thanks.
I can't be the only one who joins in the recitation of Chris's intro and outro for every video like I'm chanting some sort of religious creed?
Certainly not!
Me too :)
Let's go and take... a closer look... at you 😜
Kamasutra has evolved beautifully. A neuromorphic internet infrastructure would set the world ablaze. Our ability to build synthetic artificial synapses and dendritic branching would give a whole new meaning to the future. AGI has arrived on our blind side.
Some of our shortest wavelength UV lasers are in the 270nm range. That’s quite large compared to current transistor sizes. There’s a reason electron microscopes can image smaller objects than optical microscopes.
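For a rough sense of that mismatch, here is a quick ratio check (the node figures below are illustrative marketing-node numbers I've assumed for comparison, not physical gate lengths):

```python
# Compare a 270 nm UV laser wavelength against commonly quoted
# process-node figures to see how much larger the light is.
UV_WAVELENGTH_NM = 270

def wavelength_ratio(node_nm: float) -> float:
    """How many times larger the UV wavelength is than a given node size."""
    return UV_WAVELENGTH_NM / node_nm

for name, node_nm in [("2 nm-class research chip", 2),
                      ("leading edge", 5),
                      ("mature node", 28)]:
    print(f"{name}: light is {wavelength_ratio(node_nm):.0f}x the {node_nm} nm figure")
```

Marketing node names no longer equal real feature sizes, but even against generous physical dimensions the wavelength is orders of magnitude too big to match electronic transistor density.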
This video is "required watching' for the Human Resistance against Skynet. Wonderful video, Chris; thanks!
Thank you, Chris, for this incredibly informative video. I had already come across neuromorphic computing, but was unaware of photonic neuromorphic computing, so this information is greatly appreciated.
It's true, you're creating a symphony! Genius.
You Rock !
It would be highly appreciated if you could explain the tech behind photonic neuromorphics in depth.
Back in the early 80's, IBM patented a process that used red and green lasers to read and write a crystal made of the same molecule found inside your eyes.
When the red laser hit this molecule, it turned into an "L" shape. When hit with the green laser, it would straighten out into an "I" shape. Or something like that.
They said they could fit 10 Libraries of Congress on a crystal the size of a sugar cube! They were going to make solid-state optical drives with fast read and write speeds (in nanoseconds).
I never heard anything after that. Does anyone else remember this?
I do remember that.
This is one invaluable lesson. Thank you for your clear and concise explanation of photonic neuromorphic computing!
You're very welcome!
Another consideration is the yet-to-be-designed programming languages that will give us the best chance of taking advantage of this newer architecture while remaining reasonably approachable for us mere mortal programmers. It seems there is a big learning curve ahead for both hardware and software designers, IMO.
The photonic part of this technology is something that people have been thinking about for more than forty years; however, suitable materials have always remained a tricky problem.
ok ... I have to admit, Christopher, that this one makes me feel like a photonic neuromoron. Thanks for the vid. :D
Thanks very much for explaining complex concepts in a crystal clear concise manner.
We agree!
From the MEETOPTICS team!
This is really well-explained; you did a great job of explaining the key concepts and why they're important. Thanks for making this!
Yes, the MEETOPTICS team completely agrees!
We often think of data in terms of bytes instead of bits.
So regarding the point at 3:00, 100 gigabits is equal to 12.5 gigabytes.
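That conversion is just a divide-by-eight, for anyone who wants to double-check:

```python
# 8 bits per byte, so a 100 gigabit/s link moves 12.5 gigabytes
# of data per second.
def gigabits_to_gigabytes(gigabits: float) -> float:
    return gigabits / 8

print(gigabits_to_gigabytes(100))  # 12.5
```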
This is one of the best tech videos I've ever seen.
Thank you, sir! So much golden info.
Awesome video! Very inspiring for a physics student like myself; I hope it inspires more people in the technology sector! Great references as well!
It is so good! It has definitely inspired MEETOPTICS!
WOW!!! That is a great video thank you!!!! It's a GREAT time to be alive!!!!!
Good morning, Mr. Barnatt. Here finally for the 10th 🥇 gold. Always supporting. The best tech stuff on the internet is here. Many thanks. First.
Thanks for your support -- 10th Gold medal awarded! :)
@@ExplainingComputers Does Saturno live next door to you?
As you mention at the end, our brains don't use photonics and we haven't yet learned how to make an intelligence that's even as good as our own. So there's something more to the design than just going fast.
This is the ultimate goal of computing
Your videos truly are great. I thoroughly enjoy them, they are so well structured and informative. Thank you kindly. Greetings from Barcelona.
Great work, Chris! Greetings from Brazil.
The video was great. I always like stuff like this. Keep making great videos like this :)
I always click like before it's even started :)
Standard procedure for me on this channel.. 😉 👍
Wonderful!
Brilliant stuff. Thank you so much
Excellent video. I've been following photonics & neuromorphic research for a while, and the possibilities are exciting.
Welcome to another video from Chris!
When you mentioned wavelength multiplexing in photonic hardware my mind blew a little.
Cool.
Congratulations on introducing this topic so well on RUclips. This platform lets us spread all we know about the field, and at MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward.
”Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.” - Dr. Ian Malcolm
(Jeff Goldblum Jurassic Park)
A video all about fixing screen tearing would be amazing, on both the AMD and Intel sides, from browser tricks to enabling flags. Thanks. Great stuff.
Now this is a good video idea, noted.
THE NEXT FRONTIER
Very different from the typical content, but fascinating!
It's kinda surreal if you had to explain this to someone from the past. We'll go from using little magic sand squares into making our metal golems do stuff using microscopic rainbow magic.
Awesome deviation from the "normal" videos. Thank you sir for the glimpse into the future. Hope I live to see it. Must get those papers cited! 12 likes!!
Check out the company called Lightmatter, which is supposed to sell its photonic processor this year, claiming to be up to 10x faster than an Nvidia A100 on BERT while using 90% less energy (we'll see when Envise is tested IRL...)
Running faster and more energy efficiently could mean lighter, longer-lasting mobile devices. Thanks for sharing. Looking forward to next week's show.
I'm just getting a sneaking feeling that my old PC with its built-in 5.25" floppy drive might be getting a bit out of date.
PICs... ya, let's make up a _new definition_ of the PIC acronym/abbreviation. It's already used for _picture_ and _Peripheral Interface Controller_, but why not something else? - 3:18
I mean... the more definitions, the better, right? These people are smart enough to come up with this laser/light tech but don't know how to use Google to see if their acronym is already in use?
hey uh ... wth has happened to reality, am I in the Mandela effect !?!
PIC doesn't mean Programmable Interrupt Controller to ya ppls?
@@janglestick - Oh, I forgot about that. But the first thing that comes to my mind with PIC (in all caps) is the Microchip brand PIC micro controllers. Billions of these little things in use around the world and they are the first MCUs I learned on.
Woot! Sunday EC!!!
Here we are again!
@@ExplainingComputers Sir you have taught me so much and really appreciate your teaching style.
Great video and starting point.
Thanks very much!
It is great!
Today is the day I wake up and see a new video from Explaining Computers on Mother's Day, yay 🙂
You mean, "birthing persons", of course. We must start learning robo-speech asap. 😃
Happy Mother's Day to you and yours.
@@SBCBears thanks I agree to that this video has me curious
I'm a bit late to the party. I didn't know this existed at all. I just hope this gets to be used for good purposes.
Thanks. I understood everything as always.
Thanks for watching! :)
That was quite a head full! Thank you very much! Never knew about this at all.
I hope this stuff about light-artificial neuron computers can exist in RL soon. The idea sounds cool.
The primary limitation I read about decades ago for photonic computing in general was miniaturization. Simply put, the smaller and/or thinner a material is (be it silicon or film), the more transparent it becomes. Try to get down to modern IC sizes and the signals will be lost.
I think they might be used in parallel and share some functionality; they might, however, eventually replace traditional computing, or simply continue to exist alongside it.
Thx for this ep! It's clear that besides Moore's law dying, the Von Neumann architecture is dying too, though much more slowly. I believe we are at the beginning of a longer transition phase, and I agree that we will see changes to classical computing in the next decade.
Still, in comparison to the advancements of silicon photonics, the organic brain has HUGE advantages, especially in the efficiency/energy consumption area. It's truly a miracle what the human brain can do with such minimal consumption; it's on a completely different level compared to what we can create. Looks like our creators/gods are in a completely different league. Even though organic computing is something out of this world, it still has one BIG disadvantage: it doesn't last as long as its synthetic brothers. However, it's still far ahead in what it can do, especially when it works in tandem and cooperation with other brains as one superorganism. This is what we haven't achieved as a species: fully interconnecting our brain capabilities to solve the toughest issues and problems.
Oh yes, my fav channel, hey Chris!
Hey! :)
Excellent video! This will be the foundation for true AI.
11:50 ...but photonic regular computers will also happen, and they will replace the ones we have. Photonics is not just for neuromorphic processors.
This video stimulated my aging Neuromorphic Computer in my skull.
:)
I can see the presentation for nVidia's 2036 flagship graphics cards including the words: "Using actual light to process the AI to improve ray tracing, new photonic neuromorphic RTX cores boost performance by a factor of 1000."
:)
This is amazing technology; the only problem is that the components etc. will have to be bigger until light can be directed at microscopic scales. Hopefully I'm making some sort of sense, Chris! 👍😃
I understood this video. I'm not that bright. Well done, Chris! 👏 Chris' next trick, teaching Calculus to a sea slug.
👍👍👍👍👍
Never failed to learn something new from you.
Thank you Christopher!
Interesting.
A couple of questions though:
I generally don't do conspiracy theories but I can't help wondering if we should be scared as a species for our own survival? Yes - species evolve - but are we actively building our own replacements?
I kinda felt like this was more of an "Explaining the future" video.
Finally, the basics: how is photonic logic built up currently? AND/OR/XOR etc. I can't visualize a photonic transistor.
The cutting edge of technology and future casting is always interesting! The story of how and when we get there is just as interesting. My sincere hope is that more good than bad comes from development. I’m sure the vast majority of happenings will be very positive. there is a nice kind of security in time delay due to hardware or processing times, but having things move quickly and fluidly is really nice too. So I’m looking forward to seeing what comes along. I’m really excited about using entanglement in communication/ information tech. I kind of wonder how far we will go.. I guess only time will tell.
I miss this kind of subject on your channel. Perhaps it's more suited to your other channel?
Very interesting content!