News like this gets me giddy with excitement. I am a 3D modeler/animator/simulator with a huge need for fast processing of physical simulations and image rendering. Anything that will speed those processes up for me is something I hunger for. Right now, building a single render server with four GPUs is very expensive, chews up a lot of power, and turns my office/studio into an oven. With technology like this, I could be rendering full-on photorealistic imagery at 8K resolution in realtime. Chewing my way through massive city destruction simulations would also be dramatically faster. And being able to do that in one box sitting by my desk, instead of having to have access to an entire render server farm, would be a very big deal to me.
The level of computing advancements today is incredible. It’s happening in multiple areas simultaneously. And the implications of how these will affect life in the future... I’m really tired of having my mind blown.
The last video I watched from you must have been a few months ago. From the first seconds of this one, one thing stood out: your fluency in English has improved! Great content as usual!
Happening today: they are using photons as qubits. Now imagine where this goes in communications when they can entangle these photons, or at least enough of them, and make them addressable to enable FTL communication between disparate systems. Yeah, we are talking Star Trek-like subspace communication across the universe.
What is the speed of transmissible light in this medium? Does the waveguide affect the refractive index? Are they using the electro-optic effect to reduce the refractive index to make some Matrix-like computer that we will be forced to jack into? I ask because I know nothing about this field.
I'm interested in how this may be used in differential image recognition in astronomy. Could this help to go through the vast amounts of data from telescopes to find anomalies?
The efficiency might very well still hold up. I didn't read the paper, but if the photon-microwave generator uses 100 watts to deliver 400x the compute, then it's already better than a silicon chip that would need 40,000 watts for the same throughput. In other words, if the task gets completed 400 times faster while drawing 10x more power, that is still a 40x gain in energy per task.
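A back-of-the-envelope sketch of that energy-per-task argument (purely illustrative numbers, not figures from the paper):

```python
# Energy per task = power draw x time to finish the task.
# All numbers below are hypothetical, chosen only to mirror the comment.
silicon_power_w = 100.0        # assumed silicon chip power draw
photonic_power_w = 1000.0      # assumed photonic system: 10x the power...
speedup = 400.0                # ...but finishing the same task 400x faster

task_s_silicon = 3600.0                      # one hour on silicon
task_s_photonic = task_s_silicon / speedup   # nine seconds on photonics

energy_silicon = silicon_power_w * task_s_silicon      # 360,000 J
energy_photonic = photonic_power_w * task_s_photonic   # 9,000 J
print(energy_silicon / energy_photonic)                # 40.0x less energy per task
```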
Is this for quantum computing or normal computing? I'd also like to point out that lasers can generate a lot of heat at their end points. Wherever the light stops is where the heat will be generated. You'd need a cooling system of some sort which would use power.
A computed image is a grid or array, and the data in each cell is a value. If you can do this, then you can use an array to, for example, replicate a binary byte (8 bits) or larger binary arrays. Ergo, these chips can do much more than process images.
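A toy sketch of that point, treating one row of an image-like array as a byte (the encoding here is hypothetical, just to make the comment concrete):

```python
import numpy as np

# A "computed image": each cell holds an analog intensity value.
row = np.array([0.9, 0.1, 0.8, 0.7, 0.05, 0.2, 0.95, 0.1])

# Threshold the analog cells into bits, then pack the bits into one byte.
bits = (row > 0.5).astype(int)          # [1, 0, 1, 1, 0, 0, 1, 0]
byte = int("".join(map(str, bits)), 2)  # 0b10110010 = 178
print(bits, byte)
```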
Photonic computing will never be that common. The reason electrons are so fantastic to work with vs. photons is that electrons can be moved easily; all you need to accelerate them is a small charge and they jump at your command. And you can stop them and have them wait around. Photons, on the other hand, only travel in a straight line, and inside an optical fiber they zig-zag, bouncing off the walls; but the real problem is that photons will not sit still. You can bounce them back and forth, but that's not really the same thing as standing still. Photonics is a fascinating field, and clearly the 400 Gbit fiber optic transceivers that exist are doing very clever things. I am personally getting very tired of these PR releases from research groups claiming massive improvements, and then when you get down to it, it's a bunch of smoke and mirrors that is really just a quest for more funding.
Hi Anastasi. This paper certainly seems to stir up high hopes that it is possible. Ultimately they need to demonstrate the actual hardware that does this. I am an electronic engineer and always try to look out for new technological breakthroughs. In my opinion, photonics seems to solve the inherent problems associated with silicon, namely capacitance and inductance, which both pose their own set of problems, such as limiting maximum clock speed and adding to overall power consumption, which is always multiplied by the billions of transistors on the chips. Is it so that photonics solves some of these limiting factors? Am I missing something? And then there is also the noise issue, which limits the minimum operating core voltage. If you have a conductor that carries a signal, it is prone to electromagnetic interference from surrounding signals. In my understanding photonics is not prone to electromagnetic interference, so how does photonics compare to regular electronic signals when it comes to noise? What other limiting factors does photonics have on the other hand?
Not long after optical data storage (CDs/DVDs, etc.) came into common use, I expected data storage to quickly evolve from the initial one-dimensional storage used for CDs, DVDs and Blu-ray to three-dimensional data storage using some kind of crystal storage medium. Just as an example, imagine a wedding ring where the video footage of the wedding is stored inside the diamond and accessible by a device specifically designed to read the data on the ring, by placing the diamond (or other gemstone) onto a special optical reader. Of course so much more than video/audio could be stored in that diamond, but it would be a great and novel way to help bring the technology into the public eye.
Knew it would be lithium niobate. It is an extremely versatile material. A bulk chunk makes a fast Q-switch for a laser; a thin piece is used in acousto-optic switches; apply an AC electric field while cooling and you get a wavelength converter to turn a beam from IR to any wavelength you wish that the material is transparent to. Oh, the laser used for a chip like the one described would be similar to the one in a Blu-ray disc burner. The RF generator would not require much; it uses the electrostatic field of the wave to change the polarization rapidly. The main issues would be the extreme coherence length required of the laser and extremely precise frequency stabilization of the RF source.❤
I remember when this sort of thing first appeared decades ago. My thought then was that it may be fast, but it still needs to interface with regular electronics at some point, and that would be the bottleneck.
It appears that the operations performed in photonics are analog. Sort of like what you would do with an op-amp based integrator. If this is correct, then they are suitable for solutions that can tolerate some inaccuracies or lower precision like AI matrix operations. Is this the case?
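A small sketch of why AI-style matrix work tolerates that kind of analog imprecision; Gaussian multiplicative noise stands in for device error here (an assumption for illustration, not a model from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
x = rng.standard_normal(64)

exact = W @ x
# Model an analog multiply-accumulate with ~1% multiplicative error per weight.
noisy = (W * (1 + 0.01 * rng.standard_normal(W.shape))) @ x

rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(rel_err)  # ~1%: too coarse for scientific computing, often fine for inference
```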
The main question is: what will be better, diamond quantum accelerators (from the company Quantum Brilliance), photonic quantum computers, or both in connection with each other?
I think the better choice is a solid-state nuclear magnetic resonance quantum computer or, more likely, a spin-qubit quantum dot computer (from Intel, for example).
All speculation. The processing can run asynchronously at far higher speeds, but there is no form of optical flip-flop. All modern silicon designs are synchronous designs that use flip-flops.
This is an exciting idea; I hope it comes to fruition one day! With such low power requirements, we can start building chips into the 3rd dimension without running into heat issues.
Me no good at science. Your presentations are so clear, however, that even those of us without great scientific backgrounds can garner the gist of your messages. Really, really, well done. Thank you.
Cooool! By the by, any chance you could put some sound deadening behind and to the side of your mic (out of shot)? I think the echo is affecting the sound picked up by your mic. Also, if you could direct the top of the mic towards your mouth a little more and maybe move it a little closer, rather than speaking horizontally over the top of it, then, it will likely bring out more tone in your voice too.
@8:22 , the 10-core Xeon is 85 W, so 1/400th of that is about 0.21 W... they must just be measuring gate power usage, no way they are including photon generation. But the thing I do like about these is lower temps; IBM has been tinkering with photonics for quite some time.
I think it is one of the next obvious avenues for computing. Hardware will eventually catch up with the chip. What was explained seems to present an opportunity for embedded systems, maybe as some type of accelerator for enumerators in a hybrid die? The efficiency is something maybe we shouldn't pass on. Pretty cool, and thanks for sharing.😊
Thanks for the video. Photonics may be useful in the future, but as you explained, this technology must go through considerable development before it is ready to replace electronics. I've been able to watch the progress of lithography, including several promising technologies which never reached large scale commercialization.
Get TypeAI PREMIUM now! Start your FREE trial by clicking the link here: bit.ly/Mar24AnastasiInTech
Cloud-based keyboard apps are a really bad idea for security and privacy. They send all your keystrokes to a third party, often including passwords and such.
@@protocol6 it's really private we really care about our customer we make him really happy! so they come!
What about these computers vs graphene-transistor computers; how do they compare?
If this were real, all the standard chip making company stocks would be plummeting big time.
There was a recent announcement about a “fabric chip” from a startup out of Carnegie Mellon University called Efficient Computer Corp. that claims their chip is 100x more efficient than current CPUs and 1000x more economical than GPUs. Would appreciate a video on this tech too; I’ve been trying to look into it and can’t find out anything.
A few corrections. The entire optical transmission industry uses infrared light, typically centred around 1550nm. Silicon is perfectly transparent at these wavelengths and is an excellent optical semiconductor for some functions. The concept of the photonic integrated circuit (PIC) dates back to a Bell Labs paper in 1969, and companies like Infinera have been building PIC-based optical transmission equipment based on indium phosphide since 2005. Even the original PIC chips from Infinera have over 200 individual optical elements integrated. Lithium niobate is an excellent electro-optic material as a modulator, but there is very limited experience of using it in a PIC. Perhaps the main challenge of scaling the number of components on a single chip (be it electrical or optical) is the killer defect rate - which is when material faults randomly occur in the chip structure during manufacture. If these defects occur where there is a critical chip function, then the chip will not work properly. One of the major successes of the silicon chip industry has been to reduce killer defect rates enough to allow large die areas with acceptable yield. Indium phosphide is maybe a decade behind silicon in this respect, but lithium niobate is twenty or thirty years behind silicon. Companies like IBM and Intel have poured billions of dollars into silicon photonics, so I would still put my money on silicon winning the race - even though it's not an ideal semiconductor for optical computing, given that it cannot work as an electrically pumped laser (silicon is an indirect bandgap semiconductor).
This is what I thought when she said silicon wasn't transparent: zoom in on silicon sand and it's transparent 😮
@@patrickday4206 The transparent part of ordinary sand is silica -- chemically silicon dioxide: SiO_2 -- amorphous quartz. That is indeed transparent in the visible and even into the UV. But...
_Silicon,_ which the overwhelming majority of microelectronics is based on, has a bandgap of 1.1 eV, so it's transparent only to wavelengths a bit longer than about one micron, or 0.25 microns longer than wavelengths at the red (long-wavelength) end of the visible spectrum.
The oxide of silicon _is_ used in conventional chips, mostly as the transistor-gate insulator; the gate itself is typically polysilicon, which doesn't have to be a very good conductor at all. That polycrystalline stuff is not very useful for electrical conduction generally _or_ for optical transmission. I have no idea what the current status of SiO_2 crystal (i.e., quartz) growth technology (or doping options) is.
Good points, but regarding defects: The image at 0:10 suggests a very simple periodic circuit with many independent circuit lines. In that case, the yield problem might be mitigated with a standard trick used in memory arrays: build in a lot of redundancy and discard lines that test bad. You could even do it in software. The killer defect rate is then measured in defects per unit length of a broad "line" and only scales, roughly speaking, with the inverse of the "width," or the square root of the chip area. Of course, that might solve the defect problem, but it rather limits the chip architecture.
It also kinda looks like the device density sucks.
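A rough sketch of that yield argument using the standard Poisson defect model (defect density and die area are made-up illustrative values):

```python
import math
from math import comb

defect_density = 0.1   # killer defects per cm^2 (illustrative)
die_area = 4.0         # cm^2

# Monolithic die: works only if it catches zero killer defects.
monolithic_yield = math.exp(-defect_density * die_area)   # ~0.67

# Redundant design: 100 independent lines, ship if at least 90 test good.
n_lines, need = 100, 90
line_yield = math.exp(-defect_density * die_area / n_lines)
redundant_yield = sum(
    comb(n_lines, k) * line_yield**k * (1 - line_yield)**(n_lines - k)
    for k in range(need, n_lines + 1)
)
print(monolithic_yield, redundant_yield)  # ~0.67 vs ~1.00
```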
@@patrickday4206 Sand is silicon oxide (silica, or glass) and is reasonably transparent to visible light. The elementary form of silicon is transparent at wavelengths above about 1100 nm, while we generally say that the longest wavelength of visible light is 700 nm.
Underrated comment, considering the hundreds of thousands of views of the video! 18 likes vs 11k likes at the time I'm typing: a 0.16% ratio!
I remember writing a paper on optical computing back in the late 80's. There were high hopes back then. Not much has happened in this space since that time. It's encouraging to see some progress.
I ran an optical switching startup in the late 90's. Met several times with the NSA to explore optical computing. Nice to see it developing.
It is funny. Your comment brought to mind Stephen Wolfram's idea of "computing" or "calculating" (where pretty much anything that exists dynamically is doing that). Actually, IMO, there have been huge advances in photovoltaics and photonics in the last 20-30 years. Your high-resolution TV and display screens are examples, as are solar power and Starlink and other satellite networks interconnected by lasers, not to mention bio-medical applications, GPS, all sorts of sensors, etc. Google says China is first in this field - but that may be for applications. Numerous universities (Colorado at Boulder, Stanford, MIT, University of Rochester come to mind) are doing most of the world-class research in these fields IMO.
I remember pencils and paper. I miss the good old days
Well, actually there's been a lot of progress, just not as general purpose CPU computing, but in fiber optic communications switching (it's pretty much right under everyone's nose). :)
@@davestorm6718 Correct. There have been all-optical switches in comms networks for over 20 years now. And optical compute is far different from optical switching.
This is an amazing way of injecting a lot of data into a silicon microchip so you can actually have faster processing. Heck, we could even bring back ring loops as memory: store data at GHz rates inside a loop of optical fiber. Terabytes of memory faster than DRAM.
Forget about using it for computing, just moving data and storing data are already amazing. Modern CPU/GPUs already spend most of their time just waiting for data to come to their caches. The rate at which computing can happen is severely restricted by memory and bandwidth speeds.
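A quick sketch of what a fiber delay-line ("ring loop") memory would hold, under assumed numbers (light travels at roughly c/1.5 in glass; the 100 Gbit/s modulation rate is an assumption):

```python
speed_in_fiber = 2.0e8   # m/s, roughly c / 1.5 in glass
bit_rate = 100e9         # bits/s circulating through the loop (assumed)
loop_length = 1000.0     # m of fiber in the loop

round_trip = loop_length / speed_in_fiber   # 5e-6 s: data "lives" 5 us per lap
bits_stored = bit_rate * round_trip         # 500,000 bits in flight
print(round_trip, bits_stored / 8 / 1024)   # ~61 KiB per kilometre of loop
```

By these numbers a kilometre of loop holds tens of kilobytes, so terabyte-scale loop memory would need far higher line rates or heavy wavelength multiplexing.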
You make a good point, similar to Anastasi's final comments, that this "chip" and similar devices might be better suited for data transfer and communications (not sure about "data storage") than for typical computing. At least perhaps for the time being.
I still remember the first time I allocated a Tb of memory on our HPC, the sense of power was awesome.
Still took half a day to process my data though :(
This is NOT new... it is already being done at many trading exchanges, in the real world, for something like the last 10 years.
They are "storing" the data feeds in optical fiber BEFORE processing them.
@@stevesteve8098 Yes, this is not new, but I think it requires the optical feed and then 2 more lasers, one to convert the optical information into phonons and a second to read that data. However, as I understand it, that only "stores" the data for 10-20 nanoseconds... not very long, even for day- or second-traders.
Ring loops... ring loops, ring loops... the answer to the speed benefits of lithium-niobate-encrusted photonics.
Did you know Rain Neuromorphics' chip department wants to transition to ring-loop techniques for dealing with memory bottlenecks? Very interesting point there.
This is absolutely HUGE. Photonics needs way more coverage. Great breakdown of the paper!
Yep, a thousand times faster while CONSUMING 400 times LESS energy could (SHOULD) be revolutionary indeed 😛 Hey, a DARK thought just crossed my mind . . . what would this bring to the world of things such as blockchains and crypto networks ?? . . . BTW are blockchains not BASED on some philosophy called "proof of WORK" and the need to consume INSANE amounts of power and dissipate insidious amounts of heat . . . just [for a lifeless Machine 🙄] to "PROVE" that it has apparently "WORKED" 😅😅 ?
Ha fancy seeing you here! It takes a company like QCOM or other to pick up one of these research companies, and things get very interesting very quickly
is it 30 years away, like fusion?
It isn't huge. This concept has been theorized for over 20 years and it's still nowhere near marketable.
by the time photonics for actual compute (not transport) becomes even a niche adoption, we'll have viable quantum computers that don't require supercooling, fit in a 2U rack-mount case, and are already capable of 5000x the compute expected of this paper-only design. That will render photonic-over-medium computing obsolete for anything other than a storage medium.
TBH as a programmer, I don't understand your domain well enough to understand everything you teach.
But I am curious, and the main takeaway for me from your presentations in general is that it's very inspirational. You have a very good way of explaining things. Thank you for that!
Don't worry, the people who do understand will develop a framework or library for you to connect up to, if needed.
As you said there's a basic takeaway. She's not presenting to physicists.
I understood the basic concept of limited use of optical transmission between chip components & possibly basic maths.
But as she explains in the end, this is only a working concept; all the processing components are not there yet... currently they need to supplement traditional (higher-power) components.
@@raheesom She said nothing. Photonics can still be digital. Digital means 1 and 0; it doesn't mean the physics behind the CPU. Most likely, nothing will change for programmers with photonics.
Regarding your comment about being a programmer, which language do you personally program in which requires you to understand how a regular chip works?
she doesn't teach, this is porn
You remind me of one of my old professors. She was an older woman and she was a wonderful professor, and I'm very fond of her. She taught data structures and algorithms. Thanks for the breakdown of this. It's very clear and understandable.
Was looking forward to your take on this paper. Thank you 🙏
We don't have electronics that can feed an optical encoding device at THz speeds; that's why they're using microwave encoding. It's the fastest we can transmit data using conventional computers to encode the optical signal. The optical signal is the carrier and the radio signal is the modulator. The only reason they are converting to light is that they can optimize certain computations (like matrix multiplication and Fourier series) natively, due to the innate physics of the system that is designed. This is essentially an analog signal with no discrete values, which makes repeatability a bit of a challenge due to noise and chaotic sensitivity. It would be cool to see some sort of optical transistor that can be switched easily with optical logic gates and possibly do discrete calculations instead of analog ones.
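A toy model of the repeatability point: the same analog "computation" run twice with small carrier noise never agrees exactly (the noise model is an assumption for illustration):

```python
import numpy as np

def analog_fourier(signal, noise=0.005, rng=None):
    """Model an analog optical Fourier transform: exact math plus carrier noise."""
    rng = rng or np.random.default_rng()
    out = np.fft.fft(signal)
    return out * (1 + noise * rng.standard_normal(len(out)))

x = np.sin(np.linspace(0, 8 * np.pi, 256))
run1, run2 = analog_fourier(x), analog_fourier(x)
print(np.max(np.abs(run1 - run2)))  # nonzero: two analog runs differ slightly
```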
I have been saying photonics is the ultimate in processing for 20 years; finally good to see it's starting to take hold.
Thanks to you for making this happen.
I could listen to you breakdown super intense subjects that I'll never really need in my lifetime, forever! Your approach and ease of translating things into near layman's terms is great. Grazie mille Anastasia.
POET Technologies ($POET) is a pre-revenue company that created the first-ever wafer-level integration of electronics and photonics into a single device
By the way, what wonderful content you are sharing, in such a simple and very explanatory way! Thank you!🙏🏽
Thank you for all this incredible information and for your way of presenting everything. Greetings from Bosnia.
Statements like "silicon is black and therefore does not work for optical computing" are true for visible light, but in the IR silicon is transparent, and in that range silicon works well for optical computing.
I learn a lot from you! thanks for posting these videos!
I think the next age of innovation is sort of dependent on breakthroughs in materials science. It is the key that will unlock a whole lot; everything will just fall into place and we will move faster and much further.
The next age of innovation is 1000% in AI, and it's gonna happen really soon. AI will boost material sciences drastically
a lot of work here: congratulations! I totally agree with the last part of your video.
Just a comment about what you say at the beginning of your video on silicon chips (3:30). Silicon photonics does exist and is also working very well. Silicon may be "black" in the visible optical domain, but it is completely transparent in the near-infrared region where all these photonic developments are made (mainly for optical telecommunications). And the chips described in the Nature paper you present use silicon, silicon oxide, and silicon nitride material for the passive waveguides (all the circuitry you describe), and thin-film lithium niobate above these waveguides for electro-optic modulation (the mixing part), because it's a bit faster than silicon modulators, which do exist as well. By the way, the wafer shown in the video at 11:35 is indeed a silicon wafer integrating silicon photonics technologies.
Thank you for another interesting video. I look forward to the day when the technology is perfected for photonics. I think it'll be history making.
Chip advancement is actually an amalgamation of many technologies. Especially the material science and integration of optics and quantum computing I guess
There have been recent breakthroughs in
1 AI
2 Quantum
3 Photonics
Now imagine an AI running on a Photonic Quantum computer!
It would make all our existing computers as useful as handheld calculators.
Super AGI
Doesn't even need to be entirely photonics, especially if later we develop a way of fusing photonics and electronics.
You can do a lot of computing in parallel, so latency in electronics wasn't the problem; the problem is always the von Neumann bottleneck, the bandwidth at which you can inject data into a silicon chip.
For example, the H100 GPU can do merely 3.35TB/s of data transfer internally. It does 51 Teraflops (FP32), but because it can only transfer 250GB/s from memory, it doesn't get even close to that.
The H100 has 456 tensor units; if you could feed data without ever having to wait a single clock cycle, you would need at least 153TB/s.
But if your chip can do 1 multiplication per clock cycle (*1) at 3.5 GHz, that would consume at least 0.336 TB/s for a single unit, or 153TB/s total.
So if you could feed data at the rate it can consume, you can easily do 50 times more computing. You can do 2.5 Petaflops instead per chip.
(*1 - tensor cores in Volta actually have 5 stages, so those use at least 5 cycles per "instruction"; I'm considering 3 float inputs)
But there's no way to inject that much data into a silicon chip; the die size would have to be huge for the number of pins it would need. Maybe some IBM mainframes can do that. 150TB/s is a lot of data.
Now with photonics, that's a possible optimization, far away (aka, in the same server, but not on the same PCB) you can have terabytes of RAM in parallel feeding data to a microwave emitter and then to a light pipe and then to a single microchip. (it would still need to have another back-converter to microwave and a receptor to feed the data to the silicon above, but that would be flip-chip interconnect instead)
Also, you can now make it even denser with many more tensor units and remove all the cruft used for managing caches; you don't need caching or even registers, it's pure computing, data in / data out.
You can even do crazy things like active cooling inside a microchip if you don't have so much die size being spent on interconnect or cache.
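Re-running that feed-rate arithmetic under the comment's own assumptions (456 units, one multiply of three FP32 operands per cycle, a hypothetical 3.5 GHz clock): the per-unit figure comes out as 42 GB/s, i.e. 0.336 Tbit/s, so the 0.336 and 153 totals above read most consistently in terabits per second:

```python
units = 456              # tensor units (H100 PCIe figure used in the comment)
clock_hz = 3.5e9         # hypothetical 3.5 GHz, one multiply per cycle
operands = 3             # three float inputs per multiply (the comment's *1 note)
bytes_per_float = 4      # FP32

per_unit = operands * bytes_per_float * clock_hz  # 42 GB/s = 0.336 Tbit/s
total = units * per_unit                          # ~19.2 TB/s = ~153 Tbit/s
memory_feed = 250e9                               # the comment's assumed 250 GB/s

print(per_unit / 1e9, total / 1e12, total / memory_feed)  # 42.0, ~19.2, ~77x gap
```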
Handheld calculators are still useful 😛 Especially the ones that use solar for power. I still have mine from decades ago, that works, using LCD and a small solar panel.
@@monad_tcp oh yes, talk dirty to me! 😄
I don't think a photonic quantum computer would benefit from many of the advancements in classical photonic computing.
I could be wrong, but I imagine quantum vs classical photonics is a whole different beast.
But yeah, photonics could certainly speed up AI if it gets good enough, and quantum computers make AI go crazy, but that's probably further away. We'd probably have to relearn a lot of what we know about training classical AIs too.
Good to see a sober assessment of the developments. Eager to see systems that exploit the full parallelism possible with optical computing.
This is crazy technology if they can master it. Because of light's ability to be split into different wavelengths, or colors, this will significantly increase data storage and speed. It'll turn Silicon Valley into the stone age.
Well done Anastasi. You really have the courage to bring new information about important research in computation. Thank you for your video. Impressive information. Congratulations.
As ALWAYS, excellent content, incredibly complex ideas described simply and thoroughly. These videos are essential and I thank you for making them!
Thank you
Although I'm not highly knowledgeable about computer chips (I've watched a couple of your YouTube videos because I'm interested in AI and thought it would be worth learning about the hardware too), I've heard that transistors are nearing the size of atoms, suggesting an imminent impasse, a 'brick wall'. However, learning about these innovative AI chips, such as the Groq chip (not to be confused with Elon's Grok AI), analog chips, and photonics, has utterly astonished me. I'm being shocked and stunned to death. That 'brick wall' is more like a paper wall, and we are shooting right through it with bullets. The convergence of exponential advancements in both hardware and software is mind-bending. Everyone around me seems oblivious to what's on the horizon, and I am too, yet I am certain that these developments will render the world beyond recognition. 5-HT2A receptor agonism level beyond recognition.
Interesting to see a video on this when I work at a company designing optical computers :D I am happy to see more coverage. We are talking about clock speeds on the order of terahertz; we don't have to deal with inherent capacitance or thermal limits. Also, the goal isn't to replicate electronic architectures but to utilize some properties of light to do operations in the analog domain; e.g., Fourier transforms are extremely easy and efficient and only take a handful of components. That said, afaik my company is the only one doing digital computation as well (which can be done!)
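On the Fourier point: in a 4f optical setup a single lens produces the transform of an input aperture "in flight"; the digital equivalent is an explicit FFT, as in this sketch:

```python
import numpy as np

# What the optics gives "for free": the focal-plane intensity of a lit
# transparency is (up to scaling) the squared 2D Fourier transform of it.
aperture = np.zeros((256, 256))
aperture[120:136, 100:156] = 1.0  # a rectangular slit as the input "image"

focal_plane = np.abs(np.fft.fftshift(np.fft.fft2(aperture)))**2
print(focal_plane.shape, focal_plane.max())  # classic sinc^2 diffraction pattern
```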
I love this channel!🙃🙂
I love it too!
It's not exactly my domain, but I like to occasionally dive into the tech world outside of my programmer world, and Anastasi paints an intriguing picture of that realm.
SAW (Surface Acoustic Wave) devices are conceptually similar. The interaction and coupling of different energy types. Resonators, couplers and transmission lines are designed into the system to accommodate each energy type.
Reliable low power laser emitters are readily available from the fiber optic communications industry.
For photonics to expand there needs to be a toolbox of active elements which can be simulated.
Much like the inductive, resistive, and capacitive components in electronics.
SAW is vastly slower though, I think.
@@tortysoft The comment was intended to highlight the difficulties with coupling dissimilar modalities.
Wow, I’ve waited for this! Awesome.
Your channel seems a bit ahead of its time. Keep up the good work, you're great !
This could be streamlined AI.
"Ahead of its time" is the judgment made by those who try to find better solutions by extrapolating from past experience, without accepting that past experience becomes almost useless once the fundamental ground rules have changed.
The only real leaders in any field, the ones who come up with revolutionary ideas and solutions, are those who accept that "business as usual" is a thing of the past and has become more of a roadblock than a help when new technologies come into play.
@@juanluismartinez4587 Haha, nice comeback!
@@juanluismartinez4587 I guess it could, but is it ? What made you think it is ?
Love your presentation Anastasi. Would love to see an episode on advances in storage tech.
As a retired physicist, I was wondering where this research had gone. I've been out of the scene for a long time but I'm encouraged to see it making a comeback. Long way to go yet but I always said (in my youth) that one day we would have photonic processors running at mind-blowing speeds and, possibly, even fast enough to be used in human-replicative systems like 'real-life' robots. Maybe the days of "SkyNet" are closer than we might like! Brilliant video! Thank you for this. You've just won another subscriber.
Excellent article. You were very honest with the review. I appreciate your honesty.
Great video. Just would like to point out that even silicon is used for photonics, and the field is known as silicon photonics. Lithium niobate makes good modulators, but currently the foundries also fabricate silicon photonic modulators.
I believe the power consumption referenced in the paper is the actual power consumption of the chip not the peripheral equipment supplying the clock and displaying the results.
Great research and video Anastasi !
This is amazing!
This is going to be such a huge step forward!
Hope it will be available to customers one day!
This lady could easily be a professor at practically 99 percent of the universities of the world! In one compelling video she brings so much "light" to the basic understanding of the world of photonic chips.
As Einstein once said to the effect, if you cannot explain something simply so that a six year old can also understand it, then you do not know it yourself!
I think she should be in R&D.
She is mostly talking nonsense, i.e. "Light is the fastest thing in the universe" is not something an educated person would say.
That is what distinguishes a genius from the rest of us. To them, everything seems simple, since their mind can zoom like a laser beam to the root of the issue and follow the logical steps from there.
And at no time has she demanded $ billions to be sent to the UN - since only money can solve the issue.
@@RichardDuncan-ju1xk I would have thought light was the fastest (i.e. fibre optics). What is faster than light?
@@raheesom Light travels at the speed of causality, as all massless particles do. Light is simply a small part of the electromagnetic spectrum. It isn't special. It also isn't much faster than normal electronics, its speed is not why we use it.
As always, a very cool video, thank you.
I theorised this in the 90's but the finesse here is amazing. Light sensor/emitter arrays are only limited by the medium the light travels through and the switching states of the sensors. Different diffraction patterns can instantiate different data outputs from inputs, and the need for a medium apart from storage is minimal, compared to circuit boards. Plus points include higher resistance to EMP events. Cons include likely sensitivity to cosmic radiation.
Perhaps the cosmic radiation could be cancelled out, but that slows down processing.
How do you store the data?
Where did you take that list of startups from? It's very interesting to see!
My relaxing channel. Anastasi! Thank you for being as you are.
Anastasi... You and your videos are so wonderfully informed.
I just love ♥ the cat 🙀 that limits the clock 🕒 speed in your video. 😂
People are saying that quantum computing is only suitable for some calculations. And that is true. And they say that this will decrease the speed of the quantum computer, cos it will only be suitable for a small pool of stuff.
But I think they might be wrong. I think it might be like when classical compute was in its infancy, in the 40's and 50's, when the first transistor was made. They were probably saying to themselves back then, how do we do stuff with zeros and ones? And it just took them a long time to think in zeros and ones. And today we have multiple languages, and can take it almost anywhere we want, to the point that the zeros and ones are not a barrier to creativity like they were.
I expect quantum might be the same. I think it will just take us a certain amount of time to think in the way that quantum calculates stuff, and eventually the same thing will happen. When we learn to think in terms of quantum calculations, the pool of stuff we can do with it will expand. No different than in the 50's with classical compute.
Setting aside the substance of your comment, do note that this "breakthrough" has nothing directly to do with quantum computing.
Anastasi, it was very refreshing getting your channel recommended to me via YouTube. From the outset, despite the near-no-power claim, the energy needed for the input laser and subsequent readout was apparent. This technology is very promising for future development, and perhaps as an initial step it can be a valuable component of a computer system.
Advances in energy storage are just as important as this (sodium vs. lithium, for example). I look forward to exploring your past posts!
I will subscribe, as I value your communication skills and your ability to translate advanced concepts into terms I can understand. Thank you, and cheers from Seattle!
Deployed Worldwide Through My Deep Learning AI Research Library, Thank You
It's also immune to EMP. It's been used in military applications for years.
Not immune, but much more resistant. You still need to generate the light somehow, which requires some form of electricity.
It is so refreshing to have facts presented by a human who understands the subject and does not feel the need to plaster pointless imagery all over everything! And... the content is new and interesting too!
For a project in school, I made simple logic gates based on IR light and laser resonators which could convert that IR to green. Neat stuff
And since "power saving" is involved, how about demanding that this project be financed by the government, since it saves the planet?
At first glance people laugh, but in reality it demonstrates how ridiculous the entire climate-change charade is: billions of dollars and more, but absolutely no solution on the table. Why? Making money flow from the taxpayers into the pockets of the politicians is the game, not the climate.
This is an interesting development. Quite a few years ago, whilst serving in the military, we were made aware of a program that was working on using light for chips. I have often wondered since then how far they got.
Does anyone watch this on a hi-fi system? Not only is she brilliant and beautiful, but her voice and accent sound amazing. I could listen to her for hours on end. Your videos are always very detailed and informative about what's coming in the world of computers.
Ray Kurzweil will love this one :)
You got a chip on ya shoulder means you got a problem!!!
Interesting, well-presented idea. Although it is not yet practical, as you note, there is so much potential in combining speed with low power that by and by someone will find a problem this solution can solve. Thank you for sharing!
This was a great laugh 😂 thanks for the video 🤙🏻
Skip to 5:55
Complexity of systems increases the probability of errors and breakdowns. Truly optical computing, in an end-to-end open system where you can pick up and move or join a device, is still many decades away, but the groundwork is being laid today. The next Bob Metcalfe or Tim Berners-Lee could become synonymous with photonic computing soon. Perhaps it will be Anastasi?
With the dawn of A.I., prediction will make less and less sense.
@@particleconfig.8935 Never forget... GIGO.
I love your presentation! Very few people manage to be both a business professional and gracious at the same time. Love that! If you're ever in my town, you can come by, whether it's technical or not; I'm not good at talking in real life, but I have loved watching you.
News like this gets me giddy with excitement. I am a 3D modeler/animator/simulator with a huge need for fast processing of physical simulations and image rendering. Anything that will speed those processes up for me is something I hunger for. Right now, building a single render server with four GPUs is very expensive and chews up a lot of power, turns my office/studio into an oven. With technology like this, I could be rendering full-on, photorealistic imagery at 8k resolution in realtime. Chewing my way through massive city destruction simulations would be also dramatically faster. And being able to do that in one box sitting by my desk instead of having to have access to an entire render server farm would be a very big deal to me.
The level of computing advancement today is incredible. It's happening in multiple areas simultaneously, and the implications of how these will affect life in the future are hard to grasp.
I’m really tired of having my mind blown.
This kind of technology is not coming out fast enough. Love research and development. Just love it.
Except lithium is relatively scarce, and silicon is abundant. Not to mention electrical signals already travel at a large fraction of the speed of light anyway.
The last video I watched from you must have been a few months ago. From the first seconds of the video, one thing standed out: your fluency in English has improved!
Great content as usual!
Thank you so much, I'm working on it :)
His fluency in English: "standed out"! High praise.
Imagine photonics and quantum computing... ❤
It's happening today; they are using photons as qubits. Now imagine where this goes in communications if they can entangle these photons, or at least enough of them, and make them addressable to enable FTL communication between disparate systems. Yeah, we are talking Star Trek-like subspace communication across the universe.
Quantum liquid-metal variant photonic crystalline shape-shifting cybernetic metamorphic robotic limbs (arms, hands, legs, feet, torsos, heads) assembly line; new Tesla reverse engineering back in play.
Beauty and brains. What a wonderful combination !
What is the speed of transmissible light in this medium? Does the waveguide affect the refractive index? Are they using the electro-optic effect to reduce the refractive index to make some Matrix-like computer that we will be forced to jack into? I ask because I know nothing about this field.
In the blink of an eye... thanks again for a great glimpse at the future. Cheers.
8:08 How is this low power when it uses a laser and microwaves? Is it really power-efficient at scale?
I'm interested in how this may be used in differential image recognition in astronomy. Could this help to go through the vast amounts of data from telescopes to find anomalies?
Anyone else get 😵💫 hypnotised just by listening to her voice?
The efficiency might very well still hold up; I didn't read the paper, but if the photon/microwave generator uses 100 watts for 400x the compute power, it can still come out well ahead of a silicon chip. In other words, if the task gets completed 400 times faster while drawing 10x more power, that is a huge gain in energy per task.
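As a back-of-the-envelope check of that logic (all numbers assumed for illustration, not taken from the paper):

```python
# Energy per task = power x time. Assumed numbers, per the comment's scenario.
baseline_power_w = 100.0                  # hypothetical electronic chip draw
photonic_power_w = 10 * baseline_power_w  # photonic setup draws 10x more power
speedup = 400.0                           # but finishes the task 400x faster

task_seconds = 1.0                        # baseline task duration
baseline_energy_j = baseline_power_w * task_seconds
photonic_energy_j = photonic_power_w * (task_seconds / speedup)

print(baseline_energy_j / photonic_energy_j)  # 40.0 -> 40x less energy per task
```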
Photonics is very intriguing. It’ll be very interesting to see how it goes.
If there were ever a place for AI generated dubbing, this channel is it.
Is this for quantum computing or normal computing? I'd also like to point out that lasers can generate a lot of heat at their end points. Wherever the light stops is where the heat will be generated. You'd need a cooling system of some sort which would use power.
Compliments on your critical review. Yes, I think this new technology may be useful in optical routers.
A computed image is a grid or array, and the data in each cell is a value. If you can do this, then you can use an array to, for example, replicate a binary byte (8 bits) or larger binary arrays. Ergo, these chips can do much more than process images.
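As a toy illustration of that point (my own sketch, not anything from the video): a byte can round-trip through an 8-cell "image" row where each cell's value is one bit:

```python
# Pack a byte into an 8-cell array (MSB first), then recover it.
value = 0b10110010
cells = [(value >> i) & 1 for i in range(7, -1, -1)]
print(cells)  # [1, 0, 1, 1, 0, 0, 1, 0]

recovered = 0
for bit in cells:
    recovered = (recovered << 1) | bit
assert recovered == value
```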
Photonic computing will never be that common. The reason electrons are so fantastic to work with vs. photons is that electrons can be moved easily: all you need to accelerate them is a small charge, and they jump at your command. And you can stop them and have them wait around. Photons, on the other hand, only travel in a straight line; inside fiber optics they zig-zag, bouncing against the walls, but the real problem is that photons will not sit still. You can bounce them back and forth, but that's not really the same thing as standing still.
Photonics is a fascinating field, and clearly the 400 Gbit fiber optic transceivers that exist are doing very clever things.
I am personally getting very tired of these PR releases from research groups claiming massive improvements, and then when you get down to it, it's a bunch of smoke and mirrors that is really just a quest for more funding.
I imagine that some hundreds of years in the future, chips will be crystals like on Krypton.
Hi Anastasi. This paper certainly stirs up high hopes that it is possible; ultimately they need to demonstrate actual hardware that does this. I am an electronic engineer and always try to look out for new technological breakthroughs. In my opinion, photonics seems to solve the inherent problems of silicon, namely capacitance and inductance, which pose their own set of problems, such as limiting the maximum clock speed and adding to overall power consumption, multiplied across the billions of transistors on a chip. Is it so that photonics solves some of these limiting factors, or am I missing something? And then there is the noise issue, which limits the minimum operating core voltage: a conductor carrying a signal is prone to electromagnetic interference from surrounding signals. In my understanding, photonics is not prone to electromagnetic interference, so how does photonics compare to regular electronic signals when it comes to noise? What other limiting factors does photonics have?
Not long after optical data storage (CDs/DVDs, etc.) came into common use, I expected data storage to quickly evolve from the initial one-dimensional storage used for CDs, DVDs and Blu-ray to three-dimensional data storage using some kind of crystal storage medium. Just as an example, imagine a wedding ring where the video footage of the wedding is stored inside the diamond and accessible by a device specifically designed to read the data on the ring, by placing the diamond (or other gemstone) onto a special optical reader. Of course, much more than video/audio could be stored in that diamond, but it would be a great and novel way to help bring the technology into the public eye.
Well... technology had to evolve, and it's nice to see someone actually bringing it forward to practical application. Continued success!
When will we see products?
Knew it would be lithium niobate; it is an extremely versatile material. A bulk chunk makes a fast Q-switch for a laser; a thin piece is used in acousto-optic switches; apply an AC electric field while cooling and you get a wavelength converter to turn a beam from IR to any wavelength you wish that the material is transparent to. Oh, and the laser used for a chip like the one described would be similar to the one in a Blu-ray disc burner. The RF generator would not require much; it uses the electrostatic field of the wave to change the polarization rapidly. The main issues would be the extreme coherence length required of the laser and extremely precise frequency stabilization of the RF source. ❤
I remember the University of Crete, in Greece, had breakthrough research on this type of computing 20 years ago.
I remember when this sort of thing first appeared decades ago. My thought then was that it may be fast, but it still needs to interface with regular electronics at some point, and that would be the bottleneck.
This sounds really interesting.
It appears that the operations performed in photonics are analog, sort of like what you would do with an op-amp-based integrator. If this is correct, then they are suitable for workloads that can tolerate some inaccuracy or lower precision, like AI matrix operations.
Is this the case?
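For what it's worth, here is a minimal sketch of why that tolerance plausibly holds, modelling the analog path as small multiplicative noise on each product term of a matrix multiply (the 1% noise level is an arbitrary assumption of mine, not a figure from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((64, 64))
b = rng.standard_normal((64, 64))

exact = a @ b

# Each product a[i,k]*b[k,j] gets ~1% multiplicative noise before summation,
# a crude stand-in for analog imprecision.
noise = 1 + 0.01 * rng.standard_normal((64, 64, 64))
noisy = (a[:, :, None] * b[None, :, :] * noise).sum(axis=1)

rel_err = np.abs(noisy - exact).mean() / np.abs(exact).mean()
print(f"mean relative error: {rel_err:.3%}")  # well under 1%; often tolerable for AI
```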
Running It Wild
The main question is: what will be better, diamond quantum accelerators (from the company Quantum Brilliance), photonic quantum computers, or both in connection with each other?
I think the better choice is a solid-state nuclear magnetic resonance quantum computer, or more likely a spin-qubit quantum-dot computer (from Intel, for example).
All speculation. The processing could run asynchronously at much faster speeds, but there is no form of optical flip-flop, and all modern silicon designs are synchronous designs that use flip-flops.
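To make the flip-flop point concrete, here is a toy model (my own, not any optical design) of the clocked storage element that synchronous silicon depends on; it is precisely this hold-the-state-between-edges behaviour that photons, which won't sit still, don't give you:

```python
# Minimal D flip-flop model: output changes only on the clock's rising edge.
class DFlipFlop:
    def __init__(self) -> None:
        self.q = 0           # stored state, held between clock edges
        self._prev_clk = 0

    def tick(self, d: int, clk: int) -> int:
        if clk == 1 and self._prev_clk == 0:  # rising edge detected
            self.q = d                        # latch the input
        self._prev_clk = clk
        return self.q

ff = DFlipFlop()
# Input changes while the clock is low are ignored; state updates on edges.
for clk, d in [(0, 1), (1, 1), (0, 0), (1, 0), (0, 1)]:
    print(f"clk={clk} d={d} q={ff.tick(d, clk)}")
```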
This is an exciting idea; I hope it comes to fruition one day! With such low power requirements, we could start building chips into the third dimension without running into heat issues.
Me no good at science. Your presentations are so clear, however, that even those of us without great scientific backgrounds can garner the gist of your messages. Really, really, well done. Thank you.
Cooool!
By the by, any chance you could put some sound deadening behind and to the side of your mic (out of shot)?
I think the echo is affecting the sound picked up by your mic.
Also, if you could direct the top of the mic towards your mouth a little more, and maybe move it a little closer rather than speaking horizontally over the top of it, it will likely bring out more tone in your voice too.
2:20 That cat is a scientist :).
@8:22 The 10-core Xeon is 85 W, so 1/400th of that is about 0.21 W... they must be measuring just gate power usage; no way they are including photon generation. But the thing I do like about these is lower temps. IBM has been tinkering with photonics for quite some time.
I think it is one of the next obvious avenues for computing; hardware will eventually catch up with the chip. What was explained seems to present an opportunity for embedded systems, maybe as some type of accelerator in a hybrid die? The efficiency is something we maybe shouldn't pass on. Pretty cool, and thanks for sharing. 😊
601 reporting for duty. This sounds like a revolution for chips. Light, crystals, computing, fascinating materials... what will make us humans better!?
Thanks for the video. Photonics may be useful in the future, but as you explained, this technology must go through considerable development before it is ready to replace electronics. I've been able to watch the progress of lithography, including several promising technologies which never reached large scale commercialization.