It's really cool to see one of our episodes (especially one on such an underrated topic, photonics!) perform so well out of the gate! One thing I (Jason) want to set the record straight on: we were not paid to make this video, and neither I nor anyone on my team has shares in Lightmatter. I'm seeing a lot of comments from people saying this is an "ad" or a "paid video," which deeply annoys me, because we have worked incredibly hard for 2 years (and said no to A LOT of sponsorships and companies who have wanted to pay us) to make S3 an independent, non pay-to-play / pay-for-coverage channel. This is unlike many other tech-adjacent YouTube channels/news outlets in the space... You cannot pay to be on S3.

I understand my tone and rhetoric of general optimism and excitement in our videos can come off as overly enthusiastic and maybe even biased, but that's just how I communicate. I'm sick of seeing so much negativity around discussions involving advanced technology; the reason I am making these videos is to try and change the vibe around innovation. We make these videos because we believe that stories and conversations about technology should happen more broadly, and that building the future is a really good thing to do!

Anyways, I'm always happy to answer specific questions about how or why we do what we do! I'm also VERY aware of things to improve on. Right now I'm mostly focused on: (a) more sources in videos, (b) moving almost completely away from single-company videos in 2025, (c) better editing, (d) presenting a more balanced/educated view of any given topic we cover. I'm always open to feedback too!

Thanks for watching our stuff for the past few years and for watching this episode! I hope you learned something new about photonics :)

(Also, side note: it's unfortunate to see so many highly upvoted comments from people who didn't watch the entirety of our video. Sometimes, oftentimes, YouTube is painful lol.)
@@mateosantos4980 Stick around in 2025! We've got some really great stuff coming. I feel like, after almost 2 years, we're finally JUST beginning to figure out our style.
I don't build or program anything these days, but the advantages of photonic over electronic processing seem quite clear. They're working on what my dad said, over 30 years ago, was the main bottleneck in computing: system bus speed, meaning chip-to-chip communication and chip-to-memory/storage. My question is when the broader industry will begin R&D in earnest, or whether there's a photonics skunkworks somewhere. It seems like the natural next step in computing. Then combine that with better AI programming that's 10x more efficient.
@@storyco2000 I did go off on you. I'll cut you some slack; I'm just incredibly nitpicky about exactly that kind of "rhetoric." Overly excited? Yeah, I do that sometimes myself when trying to explain AI code I'm designing (which I design for free at the moment), although I did write many manifestos which summarized, concluded, or mapped out a more specific use of terminology. For me, the issue was that you don't seem to fully know what you are saying, or how to say it. Excitement can complicate that, since over-explanation is more likely while excited, along with a sloppy run through terminology and its more definite definitions. I'll leave you with this: AGi means the ability for the AI to have human-like context and language-fluent understanding; GAi would be fluent spatial understanding; AMi is human-mutual intellectual intelligence ability 🤓👍 🤣. QAi is quantum and/or fully parallel processing and understanding between all notions. All of these combined, with AMi at varying degrees (right under, at, right above, or surpassing human norms of intellect), = ASi.
@@storyco2000 The code I write is for generative AI, although it most likely serves some backend-access developer as an actual foundation for what will birth any one of these forms, if not all of them, and later reach forms of pre-ASi. AMi is simply a form of vectorization code; it takes a lot of processing power, and it requires quantum tandems for its true potential. I actually wrote the first pseudocode of what makes up an AMi.
@@HyenaEmpyema In case you didn't know, today's "3nm" chips have transistors whose smallest features are around 20 nm, and the whole transistor is around 130 nm total. The "3nm" is really just marketing.
@@boohoo5419 And electric cars were invented before gas-powered cars, yet for about a hundred years electric cars weren't used. Sometimes a technology can have more potential but be harder to get started than a similar technology.
The more I watched, the more I grew frustrated how things were being misrepresented. Then I scrolled down to the comments and breathed a sigh of relief.
This is why most sites removed the comment section from their articles. People were reading the comments and trusting those over the content of the article.
I had to stop after about 90 seconds. I've seen the rise and fall of Moore's law, and I've watched the price of memory decrease by a factor of 10,000,000× (maybe more) since I paid $400 for 16K in 1978; access times and clock cycles have decreased too.
@@storyco2000 You keep saying that in replies but maybe a timestamp for when it changes back to being about computing would help? I must have missed it, as this appears to be technology to optically couple GPUs together to reduce latency. Certainly interesting, but not going to reduce power consumption if it just means these companies will chain more GPUs together.
After seeing countless comments stating that this is "photonic transfer," not "photonic computation," and seeing replies to them saying that obviously they didn't watch the whole video, I can confidently say that this is absolutely not photonic computation. When the process of assembling an Envise chip is explained, it's specifically stated that these chips merge existing electronic tech with this new photonic tech: there are transistor dies within these chips, a substrate for the dies is added, and then photonic chips are added to the assembly, followed by a substrate that lets the photonic and electronic components communicate. I'm no expert, but it sure seems like we can't accurately call this "photonic computation" if electrons are still doing most of the heavy lifting. I will say, though, that if photons can be used to actually perform computation, as opposed to the aforementioned transistor dies using electrons, the sky would be the limit.
There was no demo of a functional unit, and this reads like another investor-bait scam. If you fell for it, you'll believe anything. Now tell us about your travels in outer space.
I watched a guy's PhD thesis defense on YouTube maybe a year ago that was entirely about computations performed using light and some physical geometry. There is a very cool future for optical computing beyond the obvious speed increase, especially with parallel computing, using different wavelengths as what are essentially cores, but even more compact. I went and found the link from my watch history: ruclips.net/video/Mdh2pLwsK8Y/видео.html (the video title is "OPTICAL COMPUTING with PLASMA: Stanford PhD Defense"). This guy's video here, by contrast, is mostly about establishing the use of photonics for interconnects: like regular fibre optics, but for smaller communication paths like computer chips.
This is true. I guess one day, when looking at my GPU, the rainbow colors on it would actually be there for something. (I know you can't see infrared, but humor me here.)
Fiber is not "faster" than electric signals, it is in fact slower: electrical signals travel at 99% of the speed of light in vacuum, fiber signals travel at 70% the speed of light in vacuum. It's just that light allows to transport more data at once. That is why some datacenters use radio instead of fiber for low bandwith, low latency connections.
100%. I scoured what Google had available for some reference benchmarks for where they currently are and found nothing. I've sent an email asking for a reference point but I'm pretty sure I won't get an actual answer back. Conclusion: Whatever they currently have is worse than what already exists. Probably by a lot. This is an ad to drive funding because in theory if they can work out all issues it'll be better. I'm guessing much better from my own research into photonics.
@@LiveType Yeah, no. If they had even a significantly worse chip than what's on the market, that would imply they have something at all, which they would've used for a demonstration. My guess is that they have absolutely nothing but theory.
I work in a steel manufacturing industry and we're improving our method of steel production to create better racks for the GPUs and thereby enabling AGI progress.
AGI is but an illusion. You cannot simulate the entire world through technology. Progress will lead us somewhere; however, we will never get anything truly close to AGI, which basically equates to a real human being.
This channel is full of bullshit tech. Not one thing that was shown has actually come to market. This idea alone is 80 years old; computing with light was invented before the Turing test! If you're totally clueless, this channel looks really high-tech, but every video I clicked showed some investor money sink and nothing else.
I would guess that the size and power of the ADC/DAC array is a limit for analog computing in general. Consider that the matrix multiplication can be over 10,000 by 10,000.
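To make that concrete, here's a back-of-the-envelope sketch; the one-DAC-per-input-row / one-ADC-per-output-column layout and the ~1 pJ conversion energy are illustrative assumptions, not figures from the video:

```python
# I/O conversion cost of an N x N analog matrix-vector multiply.
# Assumed (hypothetical) figures: one DAC per input, one ADC per
# output, ~1 pJ per conversion -- illustrative numbers, not specs.
N = 10_000
E_CONV_PJ = 1.0           # assumed energy per DAC/ADC conversion, pJ

converters = 2 * N        # N DACs on the inputs + N ADCs on the outputs
energy_per_matvec_uj = (2 * N * E_CONV_PJ) / 1e6  # pJ -> microjoules

print(f"converters needed: {converters}")                       # 20000
print(f"conversion energy per matrix-vector op: {energy_per_matvec_uj} uJ")
```

Even with optimistic per-conversion numbers, the converter array (not the optics) starts to dominate area and power as N grows.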
This is not a credible video. You never ask these CEOs anything except softball questions, nor do you ever put numbers to anything. "What are the biggest challenges your company is facing, and how do you plan to solve them?". "How does the existing tech work, and how EXACTLY does your new tech disrupt it?". You're terrible at digging deep when you are interviewing these really smart people who should be able to answer these questions. Instead, we get questions like "When did you first get into computers". Like really??? 10:45 "Imagine data flowing through the computer at literally the speed of light... while traditional computers are stuck waiting for electrical signals to bounce back and forth... light moves through these chips without any delay." -- at this point you REALLY need to put actual numbers to this. What actual delays in existing architectures can be solved with this? What is the current speed and what do you expect the speed to be with the photonic system? How much performance does this get you on the chip? Also, why the hell are you stuffing AGI and AI into this video. This isn't an AI company. This seems like an attempt to hitch onto the excitement around AI. Are there actual bottlenecks with regard to connectivity between chips? Don't we already use fiber (infiniband) in these datacenters?
"10:45 "Imagine data flowing through the computer at literally the speed of light... while traditional computers are stuck waiting for electrical signals to bounce back and forth... light moves through these chips without any delay." -- at this point you REALLY need to put actual numbers to this. What actual delays in existing architectures can be solved with this? What is the current speed and what do you expect the speed to be with the photonic system? How much performance does this get you on the chip?" Oh my lord, this, so much this. One thing though before the rest... they did mention bandwidth performance. Not clock speeds or anything like that, but the bandwidth is apparently numbers like 100terabits per second. I'd like to see some proof of that though. Now on to the rest. I have to wonder if these guys have ever used any sort of satellite internet before from before Starlink became a thing. (It's far better by comparison to the other options further out in orbit.) I used to have Xplornet a while back. Sure, they could reach some higher speeds than some of the other options I had out in the sticks. But their latency was god awful. Trying to do anything on it that required a quick ping time was basically a no go. You wanna try to play a game with not just full 1 second ping time, but potentially 2-3 seconds of ping time? That's your ticket. Like Jason said, all radio waves are just lightwaves. Satellite internet is just light, by their own definition. If light speed is truly that fast in regards to networking and such, then how is it that earthbound cables can transmit the same data faster in similar distances using different servers to account for the fact the satellite is in space and thus the hop adds distance? I.E. same length of signal travel in test, land based cable internet is faster. So yeah, I agree, we need to see some real numbers here. Because sure, light is fast, especially so in small spaces with very little distance to travel. So in this case light may really be the winner. But the radio wave example... it's kind of not really hitting the mark. 50ms latency on land vs 1-3 seconds latency using satellite. (Not starlink. I've heard better things about that, but it's also once again... closer to land. It's more like terrestrial line of sight tower based internet at that point. And apparently has similar latency from what I have been told of the numbers some people get compared to my own past experience using line of sight tower based net.) And one more thing to consider. Some of these towers are as far away on a earthly horizontal plane, as those further out satellites of companies like Xplornet. And even the terrestrial tower based I am talking about right now was faster in latency than Xplornet. AND it uses radio waves too. What gives?!?
@@francisreagan Were you, uh, mistakenly looking into a mirror while you typed that? Because your comment makes zero sense unless you're projecting without realizing you're looking at yourself. How glossy is your screen?
The funny thing is that they likely didn't even make it better. Not only are fiber optics slower than EM waves in metal, there also needs to be an electronic-to-photonic conversion and a photonic-to-electronic conversion.
He's saying that making computers bigger isn't a sustainable strategy, yet their technology seems to be centered on inter-chip communication and allowing computers to be bigger.
> "this isn't just a small upgrade, it's a completely different way of thinking about computation" It's an upgrade. You're swapping out the electrical bus for data transfer for a "photonic bus". Most else seems to be the same. You've upgraded the speed/bandwidth for data transfer, possibly reduced the latency as well.
@@joshshepherd5660 Theoretically, if it's the speed of light, you could house the entire photonic computation system in a separate building, since the extra distance would increase latency less than heat soak would hurt performance.
Ehhh... no, it is possibly way more than just data transfer, since it's entirely possible to create all-optical computational elements which would work like differential amplifiers, transistors, logic gates etc., while also creating entirely new components that behave in ways in which usual electricity cannot.
Don't let this distract you from the fact that a machine with computational power less than a calculator landed humans on moon while we're doomscrolling our life away on the most advanced piece of tech available to us today. With AGI and stuff, we can create and consume memes and waste everyone's time at light speed.
I would never call it photonic computing. From what I understand, it is just adding light for communication from chip to chip or chip to memory. It is not computing, it is only transporting data, so what is the point of talking about Moore's law or the size of a transistor if computation is still being done on classic chips?
Yes, I agree. That's what I understood as well, and I think you're right that the title and tone of this video are exaggerated. Don't get me wrong, I like the idea behind this video, and I like this channel, but... It seems to me that they're describing here a system that can be analogized to one that makes the change between trains (on your route to work, for instance) faster - not a system that makes the trains themselves faster. Yet, they seem to talk about it as if they're describing something that speeds up the train.
It's just encoding. We have, in a way, been using what you describe with the internet and disks. Much like with quantum computers, you just need to develop a new language to do photonic computing.
The start of the video really made me feel like we were talking about computing with photons, but as the video went on, I realized that this is about reducing data-transfer bottlenecks in today's systems. I thought I had misinterpreted the guy, but now I can see others were confused as well. I totally agree that reducing these bottlenecks will have a huge impact on things like generative AI and graphics, but you're right that we would still eventually be limited by transistor size. In IC-design terms, this means that present AI systems are more limited by the data path than by their control path or ALU, and replacing electronic data paths with photonic ones will let us conquer this bottleneck. IMO, this makes the tech even more promising, because they are not challenging existing norms, but rather making an improvement that all major manufacturers will soon adopt if it turns out to be cost-effective. It's just that the tone of the video is a bit misleading.
I think you might be mistaken, my friend. Light in optical fibers is inherently faster than electrical signals in wiring when comparing their propagation speeds directly. Light in fibers travels at about 200,000 km/s, while electrical signals in even the best wiring propagate at 150,000-200,000 km/s due to the medium's properties. The confusion may stem from latency in real-world systems, such as delays caused by converting light to electricity and back. However, this reflects system efficiency, not the inherent speed of the signals. The physics is clear: light is fundamentally faster because nothing in electrical wiring can reach the speed of light in a vacuum (300,000 km/s), and electrical signals are always limited by the properties of the conductor. If I'm wrong please tell me why. That's what I've learned anyway.
@@alexhamilton3522 The true statement is that **light in optical fibers is faster than electrical signals in wires**. However, light in fibers is slowed by the refractive index of the glass (and by bouncing inside the fiber), while electrical signals are slowed by the properties of the wire.
@@HailAzathoth That's what I'm confused about: is it actually faster to use this to transmit data between memory and the processor, even considering that it has to be translated between mediums? Maybe in the context of a supercomputer?? I wish that had been explained better. I'm sure it makes sense on paper, otherwise nobody would be spending time or money on it. But honestly, even after this video, I have no idea.
Correction: light does NOT travel at "the speed of light" through optical fiber. It travels at v = c/n, which is the speed of light c divided by the refractive index n of the optical fibre core. Light only travels at the speed of light in a vacuum.
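Worked out with a typical value (assuming n ≈ 1.468 for a standard silica single-mode fiber core; the exact index depends on the fiber and the wavelength):

```latex
v = \frac{c}{n} \approx \frac{3.00 \times 10^{8}\ \text{m/s}}{1.468}
  \approx 2.04 \times 10^{8}\ \text{m/s} \approx 0.68\,c
```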
@@1q3er5 Why do you sound so dismissive? The entire point of the video is about computing at the Speed of Light, and this is a correction that no, this would not allow computation at the speed of light. Or at least, the speed of light everyone thinks of.
Where fiber excels is: 1. Much lower loss, so you can send high-speed data much further. 2. Each fiber is smaller, so more fit in a given area. 3. Fiber can carry multiple signals, each using a different color, which gives it a much higher bandwidth.

But data rate is limited by the reflections inside the fiber. A signal that goes down the center of the fiber arrives first; one that bounces off the sides at a shallow angle arrives a bit later, and at a steeper angle even later. If the center light from the next bit arrives too early, it is hard to extract the data. So even though light is 400+ terahertz, the data is still in gigahertz.

To get higher throughput, CPUs are replaced with GPUs (graphics processors) because they process in parallel. Communication speed is a major factor, but on motherboards data over copper is already travelling at 70% or so of the speed of light, about the same as fiber. To reduce compute time, Intel and others have moved high-speed memory and communication inside the CPU package, less than a mm away, compared to tens of cm on a motherboard. To improve yield, lower-cost processing is done with multiple "chiplets" instead of one large CPU. Each chiplet has its own power supply to optimize performance.

A major advantage of making transistors smaller is that they can operate faster with the same amount of power. As Moore's law runs into physical limits (the size of the atom), more performance requires more power. That power creates heat, limiting how many CPUs you can put in a given area with air cooling. So they are moving to liquid cooling: first using liquid instead of air, and now actually allowing the liquid to boil, since the phase change from liquid to vapor carries away far more heat. But even that has limits. So, as shown in the video, they build large racks filled with processors, and fiber allows you to connect more racks.

Much of the fiber optics you see in server farms, and coming to your home, connects to the computer through an optical-to-copper interface. A direct chip-to-optical link uses less power and can operate at higher speed. Fiber-optic "connectors" exist, but they require high precision and a very clean environment, which can be controlled in a server farm.
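The center-ray-vs-bounced-ray effect described above is modal dispersion in step-index multimode fiber, and it's easy to estimate. A minimal Python sketch with textbook index values (assumed, not from the video):

```python
# Arrival-time spread between the axial ray and the steepest guided
# ray in a step-index multimode fiber, and the crude bit-rate limit
# it implies. n1/n2 are typical textbook values (assumed).
C = 3.0e8             # speed of light in vacuum, m/s
n1, n2 = 1.48, 1.46   # core / cladding refractive indices (assumed)
L = 1000.0            # fiber length, m

dt = (L * n1 / C) * (n1 / n2 - 1)   # modal arrival-time spread, s
max_bitrate = 1 / (2 * dt)          # crude pulse-overlap limit

print(f"spread over 1 km: {dt*1e9:.0f} ns -> ~{max_bitrate/1e6:.0f} Mb/s")
# roughly 68 ns and ~7 Mb/s, which is why long links use single-mode fiber
```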
So, I get that this isn't an advertisement in the literal sense, in that money wasn't exchanged, you don't own any shares, etc. But it feels like an advertisement. Or maybe, more fittingly, white noise.

Don't get me wrong, this was a technically great video, and I can tell you poured tons and tons of time and effort into it. I respect and appreciate that, and I really, really want you to continue making content. This was good content! But I feel like it didn't have any substance. You talked about a company which claims to be making advancements in photonics. Straight from the description: "Lightmatter - the company leading the photonic revolution." And you explained what photonics is, how it differs from traditional electronics and electronic computing, and touched on its history and current applications.

You mentioned that Lightmatter has developed a product called Passage, which is essentially a reimagined PCB that uses microscopic waveguides and fiber optics embedded in (or created out of? I'm not sure) a traditional silicon wafer to transmit signals between chips. That sounds pretty cool! But I still have lots of questions about it. Electricity isn't light. So how does Passage convert between the two? My understanding is that this can often be a bottleneck, so how did they work around or mitigate that? Or is it actually a non-issue? Or... anything. I know about electricity, I know about light; I'd like to know how the connection between them works.

Some rough examples would have been nice as well. Instead of saying buzzwords like "Passage bridges the interconnect between chips with light, allowing them to communicate at lightspeed!" you could have said "Imagine an Arduino or Raspberry Pi, but instead of copper traces on a PCB connecting each critical component, they're fixed to a board that lets them communicate with light, allowing higher bandwidth, lower latency, and more efficient energy usage!" That would give some insight into the "how" instead of just the "what," and show that you yourself actually understand what this technology is, so that we can trust your enthusiasm and hype.

You briefly mention Envise, and how it can perform computations with light. But that's pretty much all you said about it. Photonic computation is what I assumed this video would be about, considering that Moore's law is always associated with computation, and Moore's law is literally in the title. But there was very, very little discussion of photonic computation, and I have absolutely no reason to believe, based on what I saw in this video, that Lightmatter has made any kind of advancement whatsoever regarding it. That bit felt 100% like "trust me bro, it can think with light, and it's super freaking cool."

Again, there just wasn't any substance. I didn't learn anything, and I think that might be the main reason this felt more like an advertisement than a video about some cool advancements in an important field. Well, I learned plenty about Lightmatter and what they claim their products can do, but that's it... That's all I learned. I didn't get to see the advancements; I was just told that they theoretically exist. I'm constantly being told that some new thing is really cool and revolutionary, but it very rarely is. I think that's why people are so skeptical when it comes to topics like these. Look at crypto and NFTs: they were supposed to be revolutionary, but they're a joke now.
People are always claiming to have discovered something revolutionary with fusion energy or true AR/VR experiences or whatever, but it never is revolutionary. Over-hyping something usually has the opposite of the effect you're going for. And that's not even mentioning the heavy focus on AI, which I think most people are getting sick of hearing about. Or that people have been saying Moore's law is dead for over a decade. (Not saying it's not dead, just that no one believes anyone who says it is.) Or probably some other smaller nitpicks that I've forgotten about while typing this out.

Again, the video was a great video on the surface. The editing was great, the narrative structure was good, it was a fun and interesting topic, etc. There just wasn't any depth, outside of talking a lot about some random startup. This isn't meant to be an attack or bait or anything malicious; I simply wanted to share my thoughts and feelings about the video. Hopefully the criticisms are constructive, or hopefully I'm just wrong about some of them. But regardless, this was my take, and I think some of the sentiment is shared by others as well.
Liquid crystal, the physics of light bouncing off the surfaces of different materials, the prism effect - these are keywords I wanted to hear here. You need to be able either to change wavelengths during the computation process or to activate/energize a crystal microstructure to block or open the signal, in order to run binary. And where is the solution for memory in these chips? The only thing I understood from the material is that they are trying to make a photon-based ALU chip, which is really bad news - the bottlenecks will be immense. Also, I bet they did not want to say much about this tech so as not to make Nvidia, Microsoft and the other giants interested. Their budget is relatively tiny; their only chance is to stay quiet.
I find “Undecided” to be far more interesting in terms of keeping up with new tech. I only watched this because “Undecided” doesn’t usually go into new data tech, but is more of an energy guy.
Something that might have been helpful to note: This is not the first company to bring photonic technology to a commercial environment. Fiber optic cables are a photonic technology that we’ve used for over 50 years. When you put this in the context of fiber optics, this is just the same change to computing that we’ve already made to data transfer, making this technology sound much less foreign. Or in short, this is just applying the same technology we’ve used for decades in fiber optic cables for data transfer to all forms of data transmission between a processor and its inputs/outputs
Optical computation is also an old technology. Too bad the presenters lack the background to understand and explain what's actually new and interesting here.
Even without a physics degree, I think that statement at 12:33 is incorrect. Transistors are not approaching the size of an electron by any means, we are reducing transistor sizes by just nanometers per year
That’s a great observation, and it’s true that transistor sizes are shrinking by just a few nanometers each year. However, the comparison to the size of an electron is a bit nuanced and often misunderstood. Electrons are not classical particles with a fixed size like a marble; instead, they are described by their wavefunctions in quantum mechanics. The ‘size’ of an electron is related to the extent of its wavefunction, which can be spread over nanometer scales in a material like silicon. Modern transistors, especially at the cutting edge (e.g., 3nm and beyond), are indeed reaching sizes comparable to the spatial extent of electron wavefunctions. This is why quantum effects like tunneling and confinement become significant in such small devices, influencing their behavior and performance. So while transistors aren’t approaching the “size of an electron” in a literal sense, they are reaching scales where the wave nature of electrons dominates the physics of how transistors work.
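For scale, here's a rough de Broglie estimate (assuming the common silicon conduction-band effective mass m* ≈ 0.26 mₑ and a thermal energy of ≈ 25 meV; both are standard textbook values, not figures from the video):

```latex
\lambda = \frac{h}{\sqrt{2 m^{*} E}}
        = \frac{6.63 \times 10^{-34}\ \text{J·s}}
               {\sqrt{2\,(0.26 \cdot 9.11 \times 10^{-31}\ \text{kg})\,(25 \times 10^{-3} \cdot 1.60 \times 10^{-19}\ \text{J})}}
        \approx 15\ \text{nm}
```

which is indeed on the order of modern gate dimensions, supporting the point that the electron's wave nature, not its "size," is what matters at these scales.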
@@Redwoodz112 "ChatGPT please make me sound more correct on top of this already correct statement to hopefully reach the correction singularity and secure infinite energy." ur prompt b like
As someone who's been building my own computers for more than half my life, I can tell you the main problem with these sorts of things is integration. You can have the fastest, most advanced computer chip ever, but if you can't plug it into anything, it won't do anything to replace what's already there.
I read an article in the 80s describing IBM’s research with photonic computing. An innovation they had proposed, due to light’s properties, is the ability to easily multiplex using different wavelengths of light. This ability also allowed them to switch from binary to decimal for the underlying computation engine. Glad to see a company is finally commercializing a viable product.
I remember reading something like that in Scientific American many decades ago. I couldn't see how it could be used except in very simple ways with large chips.
That is absolutely not what is happening here. Lightmatter is an optical link company now. They pivoted away from optical compute because they built it and it wasn't competitive with CMOS. It turns out you can write lots of flashy papers about optical compute, but you can't make a product that makes any sense.
If you have fiber internet at your home/office, the light is already multiplexed with different wavelengths, at least to your building. This is actual technology in use right now.
To summarize the issues between light computation and light communication: Nick is an expert on particle physics. Light is a particle and a wave. Electrons are about 1 Angstrom across if we look at them based on how they interact with their environment. Our current chip gate "sizes" (in quotes because this is a hand-wavy measure) are 2 nm if you believe TSMC, and 2 nm is 20 Angstroms. Lightmatter creates an INTERPOSER, which sits between the silicon dice and the substrate to which the chip is bonded. This is standard chiplet packaging. In "The Matrix," there is no spoon; at Lightmatter, there is no compute being done in the interposer. Light waves can interact with each other, but likely not in a way that can be used to create a computation gate. Having studied this at M.I.T., Nick is one of the people in the world best positioned to know whether computation in the interposer is even possible based on what we know now (or when he went to school). Cheers! Todd B
I've read about this concept in Popular Science, the magazine. Over a decade ago, around 2010-2012. This has nothing to do with the computational speeds of silicon gates, as it only improves data transference speeds between them. There's still a need for a light encoder (turns electrical signals into light) and a light decoder (turns light into electrical signals), which are both likely bulky (unlikely to connect individual gates, but more clusters, or entire chips as shown in the video) and have their own inherent delays (offsetting the benefit of light paths). The significant potential advantage is the use of different wavelengths (colors) to transfer several data streams on the same paths.
Utilizing the natural interference patterns of light, a computational system can be devised. This system harnesses the interference effects to create a logical loop, leveraging the phenomenon to its advantage. The process involves: - Creating a series of patterns that the light passes through, generating an interference matrix. - Employing the interference to produce a self-sustaining light loop, even in vacuum conditions. - Utilizing mirrors to redirect and manipulate the light, creating a computational framework. Notably, light travels faster in vacuum than in crystal or glass, making this approach potentially more efficient. By harnessing short-frequency wavelengths, such as X-rays, the system can be designed to be compact and intense. This innovative approach eliminates the need for fiber optics, instead relying on the principles of interference and reflection to create a computational matrix
Sorry, but if it is just a bandwidth increase, it isn't really fundamentally changing how computers work, because the actual computation is still being done by semiconductors powered by electrons. All they are doing here is increasing the speed at which chips communicate with each other, by basically putting fiber optics into the silicon substrate.
We now have electrically pumped group IV lasers - meaning lasers embedded directly in the silicon. This is HUGE news, because now we can create fully enclosed optical circuits ready for production, whereas before this technology was limited to proof-of-concept lab experiments using external light sources.
I used to build semiconductor fabs back in the late 90's and early 2000's. If that's a "cleanroom" I've got a bridge to sell you really cheap, and some nice high ground in FL.
"imagine data going at litterally the speed of light" : well that's already the case with electricity you know, signals are transmitted at the speed of light. nowaydays we use fiber optics in communication, not because it goes at the speed of light, but because at high frequency copper impedance increase degrading your signal, but it's not a signal speed issue. Having various colors for information is also already done, we are multiplexing various frequencies, which is exactly the same as using different colors.
Electrical signals do travel at nearly the speed of light, but that's not the main problem. You see, communication inside the chip is really fast, but when the chip tries to communicate with other chips, the speed drops dramatically.
Lol these replies are killing me. Guys, get an education. Voltage (i.e. scalar potential) wavefronts in conductors travel at about the speed of light, but the dispersion and impedance characteristics of those conductive mediums lead to lower informational bandwidths and higher power consumption.
So, if I'm not mistaken, they've basically built a light bus. All they're doing is replacing the electrical bus with a photonic bus, thus allowing multiple communication paths on the same bus line, since they can use different light frequencies transmitted simultaneously.
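That "light bus" idea is wavelength-division multiplexing. Here's a toy Python/NumPy simulation of the principle; the three carrier frequencies and amplitudes are arbitrary illustrative values, scaled far down from real optical frequencies:

```python
# Toy WDM model: several channels ride one line on different carrier
# frequencies (superposition), and a receiver tuned to each frequency
# recovers its own channel at the far end.
import numpy as np

fs = 10_000                                   # simulation sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
carriers = {1000: 1.0, 2000: 0.5, 3000: 0.25}  # freq (Hz) -> amplitude

# "Multiplex": all channels share the same line.
bus = sum(a * np.sin(2 * np.pi * f * t) for f, a in carriers.items())

# "Demultiplex": pick each carrier back out of the shared signal.
spectrum = np.abs(np.fft.rfft(bus)) / (len(t) / 2)
freqs = np.fft.rfftfreq(len(t), 1 / fs)
for f, a in carriers.items():
    recovered = spectrum[np.argmin(np.abs(freqs - f))]
    print(f"{f} Hz channel: sent {a}, recovered {recovered:.2f}")
```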
These are planar waveguides using multiple wavelengths, great optical networking on a chip. No mention of photonic logic gates let alone an entire processor.
Sorry, so many things wrong with this video. 1. The speed of electrical signals is basically equal to the speed of light, so upgrading an electrical data bus to a light one does not improve the speed of data transmission. 2. When the CEO of Lightmatter says that "the GPU sits and waits for the data from memory and the CPU," he makes it sound like the data bus is the bottleneck. However, the GPU waits for data from memory and the CPU only because the speed of memory and the throughput of the CPU are far below the GPU's, not because the data-transmission bus is slow! His statements are misleading. Considering this, I agree with the comments saying it gives Theranos vibes…
Wrong! The speed of light is significantly faster than the speed of electricity; while light travels at the constant speed of approximately 300,000 kilometers per second in a vacuum, electricity typically travels at a speed much slower, usually around 85-95% of the speed of light depending on the medium it's traveling through, like a wire, due to factors like the material's conductivity and the wire's thickness.
Electricity only matches the speed of light in an ideal state: with no resistance, inductance, or capacitance in the circuit, electrical signals would travel at the speed of light.
@@ModernDatingLifeFJ Light isn't traveling in a vacuum. It's traveling in a glass fiber. Either way, the point stands, the slowest part of computation is fetch and store, not data transfer.
So it's Sid Meier's Alpha Centauri: "To achieve the most powerful computing power we need the fastest possible element moving the shortest possible distance."
Moore's law doesn't claim that the number of transistors in a microchip will double every two years. The law claims that the density of transistors will double every two years, which does not imply exponential computing power.
If the density of transistors in a microchip doubles every two years then the number of transistors in the same area doubles by definition of density so what are you talking about? Sure, speed of computing doesn’t double obviously.
Light signals and electrical signals travel at comparable speeds. The advantage of light is that its high frequency allows it to carry more information. Another advantage is that multiple frequencies can travel through the same waveguide simultaneously.
Yeah it's maddening to hear them say multiple times in the video that photonics "travel at the speed of light"...as if electronics didn't. Marketing bullshit. There are some real advantages to photonics, but the ability to travel at the speed of light isn't one of them.
That's an excellent utilization of current photonics but I'd love to see the dawn of boson-fermion transistors that use light to change states and enable photon based logic instead of electron flow logic.
Hopefully we get there one day! I think that this is a good stepping stone that will have actual business use cases and then from there we may see more investment toward this end by this company and competitors. I’m no expert, but it seems to be a pretty hard thing to implement
You keep making it sound like the signal in Lightmatter's chips moves faster than in conventional chips. It doesn't, unless their waveguides are hollow. As far as I am aware, light in a standard fiber-optic cable moves at the same speed as, or slower than, electrical signals in a copper cable (both at about 70% of the speed of light). Thus, this will not make anything faster in terms of response time unless their waveguides are hollow-core (which would be closer to 95% of the speed of light).
Yes, a similar speed in physical lines; however, you can't stack 16 overlapping signals on one wire as you can with their light setup. The frequency is also higher, so it's still an improvement.
@@Flyingwigs Oh, please. What do you think cable TV does? Hundreds or thousands of independent signals overlapping on one wire through frequency-division multiplexing (FDM) which is literally exactly how photonic communication "stacks" multiple signals on one optical fiber. The two benefits are that the frequencies are higher, as you said, and that losses are lower.
@paulgolff4996 It still stands as an improvement: FDM etc. can handle hundreds to thousands of signals, but fiber optics COULD handle thousands to millions of signals, depending on the system used.
@@paulgolff4996 Light is all-encompassing, though. It doesn't interfere with itself, unlike frequency bands - like WiFi bands or radio bands, where if you get them close enough together you run into interference. AND light has 16 "bands" (aka colors) in one instance of light. One instance of light is 16 times more effective than one band of frequency multiplexing. So before you "Oh, please" someone, use your head and think a bit more. Otherwise you just sound pretentious, which is probably what you wanted anyway, so people wouldn't question you.
@@Flyingwigs Yeah, no. Practical optical amplifiers have limited bandwidths. Commercial systems using FDM over coax are available with far more channels than commercial systems using WDM over fiber. Fiber still wins on aggregate bandwidth and range, of course.
The tagline says "It's time to compute with photons," but I don't really see any optical computation executing on the chip, just chip interconnects that are implemented optically. And not a word about the photodiodes or laser diodes that are responsible for the electro-optical conversion, or about how the light is modulated to achieve the necessary transmission rates.
Converting photons to electrons and vice versa is still an issue they are dealing with. The founder has some long lectures here on YT. Pretty enlightening. Haha.
@@theseemsligitguy8848 For a transistor (logic/gate element) to work, it must be able to turn on and off. This is done by letting electrons move or stopping them. Light must be converted into electrons in order for electrons to move or stop (i.e. digital 1 or digital 0). After the electrons move, they then need to be converted back into light so the signal can move to another location. Converting between light and electrons requires additional silicon and produces heat. This will make chips bigger. The ideal solution is a transistor which runs purely on light, so no electrons are needed. No one has figured out how to do this. Until this breakthrough is found, the idea of photonics will be like quantum computing: always being worked on, but a working CPU is never created. Edit: Another issue with purely light-based CPUs is that to make a small CPU, the wavelength of light would need to be 10 nm or less. This puts the photons into the X-ray region, which creates other challenges.
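That last point is easy to check with the photon-energy relation, using the standard hc ≈ 1240 eV·nm shortcut:

```latex
E = \frac{hc}{\lambda} \approx \frac{1240\ \text{eV·nm}}{10\ \text{nm}} = 124\ \text{eV}
```

That's in the EUV/soft-X-ray range, versus roughly 0.8 eV for the 1550 nm light used in telecom photonics, so the optics, materials, and safety problems change completely at those wavelengths.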
The "Clean room" for this technology does seem a lot more easy on the humans rather than the nano scale clean room of pure silicon foundries in Taiwan.
I don't know anything about computers, but from what I understand it's a bit like what happens in nature: if we put all our neurons in a line, they would take up a lot of space and the signals would arrive with a long delay, whereas with information interconnected at many levels, it can be shared simultaneously and much faster.
I've been following the news around this company for a couple of years now; it's great to finally see someone visit their facility and make a video about their work. Great video.
I'm baffled by why Moore's law being dead is so important to many people. People get really passionate about it, but everyone knew that it wouldn't last; physics would make it impossible to continue at some point.
I got to "light travels so much faster than electricity" when my eyes glazed over. Clearly there are certain benefits with fiber optics, like bandwidth and such, but the speed of electromagnetic waves (be it light, radio, or electricity) isn't one of them. I like the idea of using light speed for AGI; what better way to speed-run our extinction? 😂 There was an ad during your video with a lady asking her Meta glasses which pool ball to hit while playing pool. Remember when we used to play games to have fun? Now we can have our AI play our games for us; the future is looking really good! (Sarcasm intended)
I really love the editing, and how much work you put in: actually visiting the company, doing interviews, motivating and exciting people. But I'd love for there to be more actual information in these episodes as well... I think you repeat some points, and you're afraid the average viewer is incredibly uneducated. I'd love for you to mention interesting properties of optical interconnects, like layering different wavelengths of light down the same channel and filtering them out, meaning each optical channel can act like multiple cables occupying the same space, etc.
Light has been used for communication for 100,000 years. The first known lighthouse was the Pharos of Alexandria, which was built in Egypt around 280 BC.
To throw in my 2 cents: I think the main confusion is that "light" is many, many different things. In its most basic form, it's just a wave traveling through the electromagnetic field that deposits its energy at a single point; more technically, photons are the force-carrying boson of the electromagnetic field. In practice these are the same thing, but conceptually we don't often consider magnetism a form of light (virtual photons), or really consider how strangely photons behave, having no rest mass and neutral charge, compared to electrons.
I am frustrated that it took until 8:25 for a number to be mentioned. I understand that context for the issue needed to be provided, but much of this video is restating "traditional silicon can't shrink further" in different ways.

I am also confused. At first, it seemed as if Lightmatter was claiming to solve a problem in the space of digital logic, but the two products they show are system interconnects. Are they addressing limitations in networks, or in logical computation on a single chip?

The end of Dennard scaling was not the end of transistor-count scaling. Multicore processors, FinFET, Gate-All-Around, 3D packaging, backside power, etc. are all new approaches enabling transistors to scale down further.

I do appreciate the on-screen source information as you show external content. Please keep doing that.

At 18:09: correct me if I am wrong, but impedance would be delay, no? An inductor is also an energy-storing device, like the capacitor. Both capacitors and inductors have a delayed response, characterized by reactance, the combination of which (with resistance) comprises the impedance of a circuit.
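On that last question: impedance isn't a delay by itself. The standard relations are

```latex
Z = R + jX, \qquad X = X_L - X_C = \omega L - \frac{1}{\omega C}, \qquad \varphi = \arctan\frac{X}{R}
```

and it's the phase shift φ contributed by the reactive part (together with RC/LC rise-time limits) that shows up as signal delay on a real trace.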
I agree with you. The video initially makes it seem like they have created some new form of digital logic that computes with photons, when in fact they are only focusing on creating faster chip interconnects. I wish the video had also covered some of the current challenges in photonics, instead of just saying that it's better than traditional silicon methods. The concept seems interesting, but the video presents it as something completely new and revolutionary, which I don't believe it is.
@sunhome22 Good point. More performance metrics, especially as compared to traditional solutions, would also be appreciated. Since they are the first, allegedly, it would have been interesting to hear about what specific physical challenges they solved to make their solutions possible.
This was a great commercial you made for this company. Very well produced. I wonder how you set up these deals with companies to produce commercials for them? Are you paid a flat fee, or is it based on the performance of the video? I did notice you neglected to include the required disclosure that this is a commercial.
This is not a commercial and we were not paid to make this video. We've been making videos about deep tech companies for almost 2 years now and are only now beginning to receive sponsorships to cover our costs.
This is the 10th time in the past 10 years that we've been told "next year" we'll hit whatever cap Moore's law sets, especially whenever a new-gen CPU drops. They love bringing it up every time.
Sorry, but no, this doesn't kill Moore's law. It isn't computing, it's just data transfer. It helps the compute spend less time waiting around for data, but it doesn't change anything about the actual computing. Also, why tf does AI have to be shoved into this?
AI does increase our energy usage at an incredible rate (we're talking about millions of GPUs for an AI company like xAI), and if this chip can reduce that usage, then it's already worth investing in, IMO.
This is really cool! Now I wish there were other companies that also hedged their bets and research prowess on spintronics, instead of fully betting on photonic ICs. Making waveguides and lasers and so on still faces manufacturing challenges and the costs associated with them, whereas for spintronics you need spin polarizers, spin transistors, and GMRs, which are relatively easier to make.
Don't like that this is focused on AGI, because AGI and light computing are at best loosely connected. More processing power won't bring AGI; we just don't know how to build such a thing yet. And no, LLMs aren't AGI and never will be, regardless of how much data they are fed.
Tbf, optical neuromorphic computing is possibly quite a reliable solution, so it'd work perfectly fine with neural nets and other types of AI. Edit: you are right on AGI, mostly.
@@allepiccondor1166 That's so ass. I mean, light computing is cool enough anyway, and worth investing in if you want more powerful computers in the future.
@@baltofarlander2618 That might be; photons are quantum objects, after all. And I am not saying that AGI is never going to happen, it's just not going to happen with LLMs. They are just not built for it. And I hate it when people tell me that ChatGPT is already smarter than humans and can also code better than most people. As a hobby programmer, I can absolutely falsify this.
If not AGI, then what? Games? Faster tick rates? Y'all don't like the buzzwords, but you also don't realize what's needed for the next advancement. The grass is always greener on the other side, right?
It's just a silicon chip where fiber-optic links are used for communication between chips, and at the end of the day it might increase the speed at which data travels inside a package, but it is not going to compute anything. The computation will still happen on the silicon. For example, every year Apple says it has developed a way to connect two M-series Max chips to make one M Ultra chip; this technology is there to make that connection faster, that's it. The computation will still happen on the chips.
This video is confusing, since it describes a technology for solving the interconnection between chips, but on the other hand claims that the actual computation is done with photonics. This claim is by no means explained in the video. What I would like to understand about the interconnects is how the light is generated and detected. I suppose there is a laser generating the light; is it a VCSEL? And what data rate can this laser-fiber-detector link achieve? I get the feeling that the intention of this video is to raise more funding, without explaining the key performance indicators.
It is totally incorrect to say that integrated photonics has never before seen mainstream commercial usage! Hello Luxtera, Acacia, and Cisco? Lightmatter is absolutely NOT pioneering this field, and the actual subject matter experts literally yell "Theranos!" at Nick Harris when he makes appearances at academic/industry conferences with this pitch. According to Nvidia, the Grace Blackwell platform achieves ~1Tbps chip-to-chip links at ~1pJ/bit and near free cost while optical interposers will immediately increase power by 3-5x and cost by 30x-50x with ZERO added functionality. Short-reach links with silicon photonics are completely unnecessary and counterproductive from a system perspective. They do offer benefits for scale-up and scale-out links that can take full advantage of the low loss of optical fibers vs copper, but this is already being done with co-packaged optics and chiplets by the rest of the industry, which is productively humming along without making all kinds of flamboyant claims as this company does. By the way, GlobalFoundries is the one actually developing the new process technology and making all of the stuff in this video, these guys are just drawing GDS files like all their other customers.
The human body had exceeded 25 knots well before that point (MPH wasn't often used in the 1700s as a measurement); in a sailboat hitting a hurricane, you'll be sailing pretty fast, and many survived. The real argument behind that was about sudden deceleration and instant acceleration, and they were right: get going 25 mph and come to a dead, sudden stop with nothing protecting you. With modern technology and medicine you could very likely live, but you would not say your body "withstood" it.
The "speed of light" as we call it is just the maximum speed any information can travel and is always depending on the medium it is traveling through! The maximum speed of information transfer inside a copper wire is about 290,000,000 meters per second (according to wikipedia) The maximum speed of information transfer in fiber optics is about 200,000,000 meters per second (according to wikipedia) This is not - as is repeatedly advertised - "faster" - but this tech enables more bandwidth, much higher frequency due to less resistance and it produces less heat and needs less energy.
Basically, what we have here is a replacement for "Infinity Fabric"/the interconnect that ties chiplets or cores together, and for the links between multiple chips. This is a great thing, but it's not quite replacing the switches yet. When that happens, that's the real revolution we are looking for.
Exactly what came to mind during the first half of the video. I wondered if it was replacing links at the SXM/OAM level or the NVLink/Infinity Fabric level.
Wow, these are such young guns that are now leading us into this new tech that will transform the way we communicate and process vast amount of data. So impressive.
So many comments splitting hairs over definitions. As long as it makes things better, I don’t care. It’s over hyped but it kinda needs to be for their investors. It is new and certainly needs to be looked into, even if it is just a stepping stone to true photonic computing.
I've been working and experimenting with optical communication since my childhood, and I'm 19 now; I still see a future in light-based communication methods... After finding this video, my brain immediately said: one day you should be over there! LIGHTMATTER, I will remember this company... I'll be coming there with a resume one day...
@AfifFarhati I'm working on Light Fidelity, AKA LiFi: data transmission through light. I convert my data into binary through a microcontroller (Arduino), this binary data modulates the light, and those pulses are received and decoded by another Arduino with a photodiode.
@@millieh3179 LiFi is optical communication done externally, not inside an enclosed medium like a fibre cable... You need to transfer both the clock and the data stream simultaneously under another ambient light source, so I've built my own communication protocol to make it work under various types of ambient lighting... I learn things by doing rather than making critical comments under YouTube videos...
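For anyone curious, the encode/modulate/decode loop being described is simple to sketch. This is a minimal Python simulation of the on-off-keying idea, not the commenter's actual Arduino protocol (which also has to handle clock recovery and ambient-light offsets):

```python
# Bytes -> bits -> "light levels" -> bits -> bytes, with a simple
# threshold standing in for the photodiode's decision circuit.
def encode(message: bytes) -> list[int]:
    """Each bit becomes one LED state: 1 = on, 0 = off (MSB first)."""
    return [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]

def decode(levels: list[float], threshold: float = 0.5) -> bytes:
    """Photodiode readings above threshold count as 1; regroup into bytes."""
    bits = [1 if v > threshold else 0 for v in levels]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

tx = encode(b"hi")
rx = [b + 0.1 for b in tx]   # crude stand-in for ambient-light offset
assert decode(rx) == b"hi"
print("recovered:", decode(rx))
```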
This is one of those videos where you have to sort the comments as "newest first"... and have the dislike-counter addon. It's a clickbait fluff piece. Solidly inauthentic.
I once met a man in a rainforest in QLD who claimed he had actually made a computer chip that was faster than light; however, when he tried to patent it, they said they were sorry, but as this breaks the laws of physics, they could not accept it. Not word for word, but you get the picture.
If this were the case, current silicon manufacturers would keep expanding and putting more resources into conventional chips, and would also turn to this. I wonder why they don't.
Light is not faster than electricity. Yes, the actual electrons move slowly, but electrical signals, which are what's relevant in computing, already travel at near the speed of light.
Fantastic video, I will now binge all your others.
@@storyco2000 The code I write is for generative AI, although it will most likely serve some backend developer as an actual foundation for what will birth any one of these forms, if not all of them, and later reach forms of pre-ASi. AMi is simply a form of vectorization code which takes a lot of processing power; it also requires quantum tandems for its true potential. I actually wrote the first pseudocode of what makes up an AMi.
"The number of people predicting the death of Moore's law doubles every two years."
-Peter Lee, Microsoft
Lol.
LMAO
So it's going to continue to sub-atomic scale transistors? What will the gates be made of?
I like this recursive property of Moore's Law.
@@HyenaEmpyema In case you didn't know, today's "3nm" chips have transistors whose smallest features are around 20 nm, and the whole transistor is around 130 nm total. The "3nm" is really just marketing
Moore's law has died like 900 times by now
computing with photons was thinked of even before the turing test. i wonder how the last 100 years went. oh yeah. we dont use photons.
@@boohoo5419 thinked of 🫵😂
@@boohoo5419 thats an ignorant statement
It sounds so sexy for investors that there are surely going to be another 100,000 pronouncements of its death
@@boohoo5419 And electric cars were invented before gas-powered cars, and for about a hundred years electric cars weren't used. Sometimes a technology can have more potential but be harder to get started compared to a similar technology.
The more I watched, the more I grew frustrated how things were being misrepresented. Then I scrolled down to the comments and breathed a sigh of relief.
This is why most sites removed the comment section from their articles. People were reading the comments and trusting those over the content of the article.
I'm with ya, most channel providers are noobs and peacocks.
@@bigTBossbecause free mind is bad for revenue
@fqertexirte3054 Irrelevant... also, "free minds" tend to be too free and forget that ignorance isn't a form of knowledge.
I think you're right
Before you ask me if I watched the whole video: I did. This is NOT computing, it's just faster BUS speed.
Which is great, but misleading.
Entire thing is misleading
yea, it's not like electricity already moves at nearly the speed of light... wtf is this invention
@@john-thejohn-johnson4403 haha yes, the "less heat" is promising. But the video is more hype and smoke than what the thing really is.
that won’t mitigate context switching. 😂
I had to stop after about 90 seconds.
I've seen the rise and fall of Moore's law, and watched the price of memory decrease by a factor of 10,000,000x (maybe more) since I paid $400 for 16K in 1978; access times and clock cycles have come down too.
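(The arithmetic on that factor roughly checks out. A quick sanity check, with today's DRAM assumed at about $3/GB, a ballpark guess, not a quote:)

```python
# Sanity-check the ~10,000,000x claim: $400 for 16 KB in 1978 vs ~$3/GB today.
dollars_1978, bytes_1978 = 400, 16 * 1024
per_gb_1978 = dollars_1978 / bytes_1978 * 2**30   # ~$26.2 million per GB
per_gb_today = 3.0                                # assumed ballpark figure
print(f"1978: ${per_gb_1978:,.0f} per GB")            # 1978: $26,214,400 per GB
print(f"factor: {per_gb_1978 / per_gb_today:,.0f}x")  # factor: 8,738,133x
```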
The premise of this company changed halfway through the video from "Computing with light" to "transferring data with light".
(Watch until the end)
@@storyco2000 You keep saying that in replies but maybe a timestamp for when it changes back to being about computing would help? I must have missed it, as this appears to be technology to optically couple GPUs together to reduce latency. Certainly interesting, but not going to reduce power consumption if it just means these companies will chain more GPUs together.
@@mcs9702If he told you a timestamp, you wouldn't have to watch the entire video
@@mcs9702 it's 100% so the retention on the video is good - he's just lying
Watched until the end. Yes, it's just some interconnect.
Still didn't get the advantage of it over InfiniBand though.
After seeing countless comments stating that this is "photonic transfer", not "photonic computation", and seeing the replies saying that obviously they didn't watch the whole video, I can confidently say that this is absolutely not photonic computation.
When the process of assembling an Envise chip is being explained, it's specifically stated that these chips merge existing electronic tech with this new photonic tech: there are transistor dies within these chips, a substrate for the dies is added, and then photonic chips are added to the assembly, followed by a substrate for the photonic and electronic components to communicate. I'm no expert, but it sure seems like we can't accurately call this "photonic computation" if electrons are still doing most of the heavy lifting.
I will say, though, that if photons can be used to actually perform computation, as opposed to the aforementioned transistor dies using electrons, the sky would be the limit.
Nice
Envise does indeed do analog matrix multiplication with light.
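Roughly, "analog matrix multiplication with light" means something like the toy Python model below. This is an idealized illustration (no loss, no noise, non-negative values only), not Lightmatter's actual design, which uses interferometers and phase tricks:

```python
# Toy model of an analog optical matrix-vector multiply:
# the input vector rides in as light intensities, the weights are per-path
# transmission factors, and each output photodetector sums the incident power.
x = [0.8, 0.2, 0.5]            # input vector, encoded as light intensities
W = [[0.9, 0.1, 0.3],          # transmission factor of each waveguide path
     [0.2, 0.7, 0.4]]

y = [sum(w * xi for w, xi in zip(row, x)) for row in W]
print([round(v, 6) for v in y])  # [0.89, 0.5] -> the multiply-accumulate happens "in flight"
```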
There was no demo of a functional unit, and it is another jeet-run scam. IF you fell for it, you just proved your IQ does not exceed 50. Now tell us about your travels in outer space.
Watched a guy's PhD thesis defense on youtube maybe a year ago that was entirely about computations performed using light and some physical geometry. There is a very cool future for optical computing beyond the obvious speed increase, especially with parallel computing, using different wavelengths as what I guess are essentially cores, but even more compact.
Went and found the link from my watch history: ruclips.net/video/Mdh2pLwsK8Y/видео.html
the video title is "OPTICAL COMPUTING with PLASMA: Stanford PhD Defense"
This guy's video is more about establishing the development of photonics for interconnects, like regular fibre optics but for smaller communication paths like computer chips.
This is fact and true. I guess one day when looking at my GPU the rainbow colors on it would actually be there for something. (ik you can't see infrared but humor me here)
Fiber is not "faster" than electric signals, it is in fact slower: electrical signals travel at 99% of the speed of light in vacuum, fiber signals travel at 70% the speed of light in vacuum. It's just that light allows to transport more data at once. That is why some datacenters use radio instead of fiber for low bandwith, low latency connections.
Thank you. Saved me typing this
Yeop
"Imagine information travelling at LITERALLY the speed of light" 😂
@@H4RM0N1C5 it does travel at speed of light. The speed of light in glass 😂
@@H4RM0N1C5 yea, like the device you're using to watch this video lol
This is a hype video. It doesn't get into the technical details of the chip. Adding references to AGI leads me to question the validity of the claims.
Indeed, it feels like the whole video is just stuffed with hype and buzzwords
same feeling
100%. I scoured what Google had available for some reference benchmarks for where they currently are and found nothing. I've sent an email asking for a reference point but I'm pretty sure I won't get an actual answer back.
Conclusion: Whatever they currently have is worse than what already exists. Probably by a lot. This is an ad to drive funding because in theory if they can work out all issues it'll be better. I'm guessing much better from my own research into photonics.
Something more specific than "It's a completely different way of thinking about computation because LIGHT!!!!" would be nice
@@LiveType Yeah, no. If they had a significantly worse chip than what's on the market, that would at least imply they have something at all, which they would've used for a demonstration. My guess is that they have absolutely nothing but theory
At first you think they've created a transistor replacement, then you realise it's just the interconnect.
for 20 minutes I was trying to understand how light will be the new transistor
I work in a steel manufacturing industry and we're improving our method of steel production to create better racks for the GPUs and thereby enabling AGI progress.
Best comment. Everyone just wants to hop on the AI train 😂
AGI is but an illusion. You cannot simulate the entire world through technology. Progress will lead us somewhere; however, we will never get anything close to AGI, which basically equates to a real human being.
Well, i'll TELL YOU THIS, GPUs are not going to be the future of AGI
@CSol-s6j Missed the joke?
Are you using some NEW, REVOLUTIONARY and [fill_your_own_buzz_word] method?
Tell me 7 different ways that you're connecting the silicon substrate to light, but take 20 minutes to do it.
True. What an idiotic waste of time.
Not everyone has knowledge of this topic.
Yeah they never really explained how photonic computing actually works. He kind of just said the same thing over and over
Thank you
Yup this way of video making is sooooo annoying. So many are doing this. I had to skip quite often.
I've said it once and I'll say it again. Now we can literally say that RGB improves performance.
Studies have shown RGB increases FPS 😂
While I enjoy the content, you definitely have to include a second viewpoint where you ask what the CURRENT problems with photonics are
what's the issue with photonics ?
We've got something coming in ~1 month that I think you'll like re multiple perspectives in a single video
This channel is full of bullshit tech. Not one thing that was shown actually came to market. This idea alone is 80 years old; computing with light was invented before the Turing test! If you're totally clueless this channel looks really high tech, but every video I clicked showed some investor money sink and nothing else.
I would guess that the size and power of the ADC/DAC array is a limit for analog computing in general; consider that the matrix multiplication can be over 10,000 by 10,000
Didn't you hear him: "the problem is that people haven't quite realized that light's going to be going into computers shortly"
This is not a credible video. You never ask these CEOs anything except softball questions, nor do you ever put numbers to anything. "What are the biggest challenges your company is facing, and how do you plan to solve them?". "How does the existing tech work, and how EXACTLY does your new tech disrupt it?". You're terrible at digging deep when you are interviewing these really smart people who should be able to answer these questions. Instead, we get questions like "When did you first get into computers". Like really???
10:45 "Imagine data flowing through the computer at literally the speed of light... while traditional computers are stuck waiting for electrical signals to bounce back and forth... light moves through these chips without any delay." -- at this point you REALLY need to put actual numbers to this. What actual delays in existing architectures can be solved with this? What is the current speed and what do you expect the speed to be with the photonic system? How much performance does this get you on the chip?
Also, why the hell are you stuffing AGI and AI into this video. This isn't an AI company. This seems like an attempt to hitch onto the excitement around AI.
Are there actual bottlenecks with regard to connectivity between chips? Don't we already use fiber (infiniband) in these datacenters?
You sir need to REALLY watch the whole video 😅 make an index ☝️
@@hunterdna2257 bro didn't watch the video at all lmaoo ASK THEM YOURSELF!
"10:45 "Imagine data flowing through the computer at literally the speed of light... while traditional computers are stuck waiting for electrical signals to bounce back and forth... light moves through these chips without any delay." -- at this point you REALLY need to put actual numbers to this. What actual delays in existing architectures can be solved with this? What is the current speed and what do you expect the speed to be with the photonic system? How much performance does this get you on the chip?"
Oh my lord, this, so much this. One thing though before the rest... they did mention bandwidth performance. Not clock speeds or anything like that, but the bandwidth is apparently numbers like 100 terabits per second. I'd like to see some proof of that though.
Now on to the rest.
I have to wonder if these guys have ever used any sort of satellite internet before from before Starlink became a thing. (It's far better by comparison to the other options further out in orbit.)
I used to have Xplornet a while back. Sure, they could reach some higher speeds than some of the other options I had out in the sticks. But their latency was god awful. Trying to do anything on it that required a quick ping time was basically a no go. You wanna try to play a game with not just full 1 second ping time, but potentially 2-3 seconds of ping time? That's your ticket.
Like Jason said, all radio waves are just light waves. Satellite internet is just light, by their own definition. If light speed is truly that decisive for networking and such, then how is it that earthbound cables can transmit the same data faster over similar distances, using different servers to account for the fact that the satellite is in space and thus the hop adds distance? I.e., for the same length of signal travel in the test, land-based cable internet is faster.
So yeah, I agree, we need to see some real numbers here. Because sure, light is fast, especially so in small spaces with very little distance to travel. So in this case light may really be the winner. But the radio wave example... it's kind of not really hitting the mark. 50ms latency on land vs 1-3 seconds latency using satellite. (Not starlink. I've heard better things about that, but it's also once again... closer to land. It's more like terrestrial line of sight tower based internet at that point. And apparently has similar latency from what I have been told of the numbers some people get compared to my own past experience using line of sight tower based net.)
And one more thing to consider. Some of these towers are as far away on an earthly horizontal plane as those farther-out satellites of companies like Xplornet. And even the terrestrial tower-based service I am talking about right now had better latency than Xplornet. AND it uses radio waves too. What gives?!?
Don't quit your day job
@@francisreagan You, uh, mistakenly looking in a mirror while you typed that? Cause your comment makes zero sense unless I assume you are projecting without realizing you're looking at yourself. How glossy is your screen?
"We made marginally better wires for connecting dice" being hyped for 20 minutes is crazy
The funny thing is that they didn't likely even make it better.
Not only is fiber optics slower than EM waves in metal, there also needs to be an electronic -> photonic conversion and a photonic -> electronic conversion.
@@stanieldev EM in metal is ~0.6c IIRC, and optics have better signaling, so I would say achieving better results with light is logical, but overhyped
Ye, but it's MIT graduates, and they are in Palo Alto. And Nvidia got a mention as well.
He's saying that making computers bigger isn't a sustainable strategy, yet their technology seems to be centered around inter-chip communication and allowing computers to be bigger
oyvey pajeeet tech
Do you work on some interesting tech projects?
Bruh, you forgot to say "with less energy usage". I forgot the part where that's my problem
> "this isn't just a small upgrade, it's a completely different way of thinking about computation"
It's an upgrade. You're swapping out the electrical bus for data transfer for a "photonic bus". Most else seems to be the same. You've upgraded the speed/bandwidth for data transfer, possibly reduced the latency as well.
The Magic Light Bus.
And made it more efficient from less heat lost to some degree, I would assume. May not be a lot but it should be something.
@@KkfightStarBaal The Magic School Bus makes a rainbow
@@joshshepherd5660 Theoretically, if it's the speed of light, you could house the entire photonic computation system in a separate building, since the added distance would cost less latency than heat soak does.
Ehhh... no, it is potentially way more than just data transfer, since it's entirely possible to create all-optical computational elements which would work like differential amplifiers, transistors, logic gates etc., while also creating entirely new components that behave in ways ordinary electricity cannot.
Don't let this distract you from the fact that a machine with less computational power than a calculator landed humans on the moon while we're doomscrolling our lives away on the most advanced piece of tech available to us today. With AGI and stuff, we can create and consume memes and waste everyone's time at light speed.
Need to put this comment on human civilization's tombstone.
Kinda like the time wasting nature of human beings. 😂
I would never call it photonic computing; from what I understand it is just adding light for communication from chip to chip or chip to memory. It is not computing, it is only transporting data, so what is the point of talking about Moore's law or transistor size if the computation is still being done on classic chips?
Yes, I agree. That's what I understood as well, and I think you're right that the title and tone of this video are exaggerated. Don't get me wrong, I like the idea behind this video, and I like this channel, but... It seems to me that they're describing here a system that can be analogized to one that makes the change between trains (on your route to work, for instance) faster - not a system that makes the trains themselves faster. Yet, they seem to talk about it as if they're describing something that speeds up the train.
it's just encoding. We have been using what you said in a way with the internet and disks.
much like quantum computers, you just need to develop a new language to do photonic computing.
The start of the video really made me feel like we were talking about computing with photons, but as the video went on, I realized that this is about reducing data-transfer bottlenecks in today's systems. I thought I had misinterpreted the guy, but now I can see others were confused as well. I totally agree that reducing these bottlenecks will have a huge impact on stuff like generative AI and graphics, but you're right that we would still eventually be limited by transistor size.
In IC design terms, this means that present AI systems are more limited by the data path than by their control path or ALU, and replacing electronic data paths with photonic ones will let us conquer this bottleneck. IMO, this makes the tech even more promising because they are not challenging existing norms, but rather making an improvement that all major manufacturers will soon adopt if it turns out to be cost-effective. It's just that the tone of the video is a bit misleading.
Yep. This is not photonic computation or whatsoever. Hype to get investors.
Why are you fussing about it? It's an important step for future computers to fully go the photonic route; baby steps..
2:10 Just a correction here. Light in optical fibers is NOT faster than electrical.
Yup, it's around 30% slower
Nope, it is slower than light in a vacuum; you can google it. It depends on the medium being used; only in a vacuum does it stay at the full speed of light
I think you might be mistaken, my friend. Light in optical fibers is inherently faster than electrical signals in wiring when comparing their propagation speeds directly. Light in fibers travels at about 200,000 km/s, while electrical signals in even the best wiring propagate at 150,000-200,000 km/s due to the medium's properties. The confusion may stem from latency in real-world systems, such as delays caused by converting light to electricity and back. However, this reflects system efficiency, not the inherent speed of the signals. The physics is clear: light is fundamentally faster because nothing in electrical wiring can reach the speed of light in a vacuum (300,000 km/s), and electrical signals are always limited by the properties of the conductor.
If I'm wrong please tell me why. That's what I've learned anyway.
@@AnkurSharma_321 wrong
@@alexhamilton3522 The true statement is that **light in optical fibers is faster than electrical signals in wires**. However, light in fibers is slowed slightly by the refractive index of the glass, while electrical signals are slowed by the properties of the wire.
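For anyone who wants numbers, the comparison is easy to compute (the velocity factors below are ballpark figures, not datasheets; they vary by fiber and cable type):

```python
# Propagation speeds: light in fiber is v = c/n; copper links are quoted
# as a "velocity factor" (fraction of c). Ballpark values for illustration.
c = 299_792_458                 # speed of light in vacuum, m/s
links = {
    "silica fiber (n ~ 1.468)": c / 1.468,
    "typical coax (VF ~ 0.66)": 0.66 * c,
    "twisted pair (VF ~ 0.64)": 0.64 * c,
}
for name, v in links.items():
    print(f"{name}: {v / 1e3:,.0f} km/s ({v / c:.0%} of c)")
```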
This is about using light to replace electrons to transmit data, not about replacing silicon gates itself. Moore’s law is definitely not dead 🫤
Moore's Law isn't dead, it just smells funny. 🧟
Yes. Just the interconnect. Not the gate.
Light is already used to transmit data and has been for ages. How do people think their Internet works
@@HailAzathoth ditto
@@HailAzathoth That's what I'm confused about: is it actually faster to use this to transmit data between the memory and processor even when considering it has to be translated between mediums? Maybe in the context of a supercomputer?? I wish that had been explained better. I'm sure it makes sense on paper, otherwise nobody would be spending time or money on it. But honestly, even after this video, I have no idea.
Correction: Light does NOT travel at the speed of light through optical fiber. It travels at v = c/n which is speed of light, c, divided by the refractive index, n, of the optical fibre core. Light only travels at speed of light in a vacuum.
Correct: Light travels about 70% of the speed of light through fiber.
well ok there mr smarty pants
@@1q3er5 Why do you sound so dismissive? The entire point of the video is about computing at the Speed of Light, and this is a correction that no, this would not allow computation at the speed of light. Or at least, the speed of light everyone thinks of.
Where fiber excels is
1. Much lower loss, so you are able to send high speed data much further.
2. Each fiber is smaller, more will fit in a given area
3. Fiber can carry multiple signals, each using a different color. That gives it a much higher bandwidth.
But data rate is limited by the reflections inside the fiber. A signal that goes down the center of the fiber arrives first, one that bounces off the sides at a shallow angle a bit later, and one at a steeper angle even later. If the center light from the next bit arrives too early, it is hard to extract the data. Even though light is 400+ terahertz, the data is still in gigahertz.
To get higher throughput, CPUs are supplemented with GPUs (graphics processors) because they process in parallel.
Communication speed is a major factor, but on motherboards data over copper is already travelling at 70% or so of the speed of light, about the same as fiber. To reduce compute time, Intel and others have moved high-speed memory and communication inside the CPU package, less than a mm away, compared to tens of cm on a motherboard. To improve yield, lower-cost processing is done with multiple "chiplets" instead of one large CPU. Each chiplet has its own power supply to optimize performance.
A major advantage of making transistors smaller is that they can operate faster with the same amount of power. As Moore's law runs into physical limits (the size of the atom), more performance requires more power. That power creates heat, limiting how many CPUs you can put in a given area with air cooling.
They have been moving to liquid cooling: first using liquid instead of air, now actually allowing the liquid to boil. The phase change from liquid to vapor carries away far more heat. But even that has limits.
So, as shown in the video, they build large racks filled with processors. Fiber allows you to connect more racks.
Much of the fiber optics you see in server farms, and coming to your home, connects to the computer using an optical-to-copper interface. A direct chip-to-optical link uses less power and can operate at higher speed. Fiber-optic "connectors" exist, but they require high precision and a very clean environment, which can be controlled in a server farm.
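To put a number on the reflection problem mentioned above, here is the textbook modal-dispersion estimate for a step-index multimode fiber (the indices and length are example values, not any specific product):

```python
# Modal dispersion: the axial ray and the steepest guided ray arrive at
# different times, smearing pulses and capping the usable bit rate.
c = 3.0e8             # m/s
n1, n2 = 1.48, 1.46   # example core and cladding indices
L = 100.0             # link length, meters

spread = (L * n1 / c) * (n1 - n2) / n2   # arrival-time spread, seconds
ceiling = 1 / (2 * spread)               # crude bit-rate ceiling

print(f"pulse spread: {spread * 1e9:.2f} ns over {L:.0f} m")   # ~6.76 ns
print(f"crude rate ceiling: {ceiling / 1e6:.0f} Mbit/s")       # ~74 Mbit/s
# Single-mode fiber avoids this by allowing (essentially) only the axial path.
```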
So, I get that this isn't an advertisement in the literal sense, in that money wasn't exchanged, you don't own any shares, etc etc. But it feels like an advertisement. Or maybe more fittingly, white noise. Don't get me wrong, this was a technically great video and I can tell you poured tons and tons of time and effort into it. And I respect that and appreciate that, and I really really want you to continue making content. This was good content! But I feel like it didn't have any substance.
You talked about a company which claims to be making advancements in photonics. Straight from the description, "Lightmatter - the company leading the photonic revolution." And you explained what photonics is, how it differs from traditional electronics and electronic computing, and touched on its history and current applications. You mentioned that Lightmatter has developed a product called Passage, which is essentially a reimagined PCB which uses microscopic waveguides and fiber optics embedded in (or created out of? I'm not sure) a traditional silicon wafer to transmit signals between chips. That sounds pretty cool! But I still have lots of questions about it.
Electricity isn't light. So, how does Passage convert between the two? My understanding is that this can often be a bottleneck, so how did they work around that or mitigate it? Or is it actually a nonissue? Or.. anything. I know about electricity, I know about light, I'd like to know how the connection between them works. Some rough examples would have been nice, as well. Instead of saying buzzwords like "Passage bridges the interconnect between chips with light, allowing them to communicate at lightspeed!" you could have said "Imagine an arduino or raspberry pi, but instead of copper traces on a PCB connecting each critical component, they're fixed to a board that allows them to communicate with light, allowing higher bandwidth, lower latency, and more efficient energy usage!" giving some insight into the "how" instead of just the "what," and showing that you yourself actually understand what this technology is and that we can trust your enthusiasm and hype.
You briefly mention Envise, and how it can perform computations with light. But, that's pretty much all you said about it. Photonic computation is what I assumed this video would be about, considering that moore's law is always associated with computation, and moore's law is literally in the title. But there was very very little discussion about photonic computation, and I have absolutely no reason to believe, based on what I saw in this video, that Lightmatter has made any kind of advancement whatsoever regarding it. That bit felt 100% like "trust me bro, it can think with light, and it's super freaking cool." Again, there just wasn't any substance. I didn't learn anything.
I think that might be the main reason this felt more like an advertisement than a video about some cool advancements in an important field. I didn't learn anything. Well, I learned plenty about Lightmatter and what they claim their products can do, but that's it... That's all I learned. I didn't get to see the advancements, I was just told that they theoretically exist. I'm constantly being told that some new thing is really cool and revolutionary, but it very rarely is. I think that's why people are so skeptical when it comes to topics like these. Look at crypto and NFTs. They were supposed to be revolutionary, but they're a joke now. People are always claiming to have discovered something revolutionary with fusion energy or truly immersive AR/VR experiences or whatever, but it's never revolutionary. Over-hyping something usually has the opposite effect you're going for.
And that's not even mentioning the heavy focus on AI, which I think most people are getting sick of hearing about. Or that people have been saying moore's law is dead for over a decade. (Not saying it's not dead, just that no one believes anyone who says it is.) Or probably some other smaller nitpicks that I've forgotten about while typing this out.
Again, the video was a great video, on the surface. The editing was great, the narrative structure was good, it was a fun and interesting topic, etc. There just wasn't any depth, outside of talking a lot about some random startup. This isn't meant to be an attack or bait or anything malicious, I simply wanted to share my thoughts and feelings about the video. Hopefully the criticisms are constructive, or hopefully I'm just wrong about some of them. But regardless, this was my take, and I think some of the sentiment is shared by others as well.
💯
Well said.
What a well thought out critique.
Liquid crystal, the physics of light bouncing off the surfaces of different materials, the prism effect: keywords I wanted to hear here. You need to be able either to change wavelengths during the computation process or to activate/energize a crystal microstructure to block or open the signal to run on binary. Where is the solution for memory in those chips? The only thing I understood from the material is that they are trying to make a photon-based ALU chip, which is really bad news: the bottlenecks will be immense here. Also, I bet they did not want to say anything about this tech so as not to make Nvidia, Microsoft and the other giants interested. Their budget is relatively tiny. Their only chance is to be silent.
I find “Undecided” to be far more interesting in terms of keeping up with new tech. I only watched this because “Undecided” doesn’t usually go into new data tech, but is more of an energy guy.
this is the kind of future i want for humanity, just making laws and then immediately breaking them with some incredible new sci-fi technology
Something that might have been helpful to note:
This is not the first company to bring photonic technology to a commercial environment. Fiber optic cables are a photonic technology that we’ve used for over 50 years.
When you put this in the context of fiber optics, this is just the same change to computing that we’ve already made to data transfer, making this technology sound much less foreign.
Or in short, this is just applying the same technology we’ve used for decades in fiber optic cables for data transfer to all forms of data transmission between a processor and its inputs/outputs
Short he saying is Zionist light matter making money out US money....
It seems their advancement is making optical interlinks at a scale small enough to implement in a large number of chips.
Optical computation is also an old technology. Too bad the presenters lack the background to understand and explain what's actually new and interesting here.
I wonder, then, why are top chip companies not even trying to invest in this technology? There might be a reason for that, and I want to know it.
@@paulgolff4996 I feel like that was actually explained. Did you watch the whole video without skipping?
Even without a physics degree, I think that statement at 12:33 is incorrect. Transistors are not approaching the size of an electron by any means; we are reducing transistor sizes by just nanometers per year
Yeah that's a pretty shady statement. Transistors aren't anywhere near the size of an electron or even an atom.
That’s a great observation, and it’s true that transistor sizes are shrinking by just a few nanometers each year. However, the comparison to the size of an electron is a bit nuanced and often misunderstood.
Electrons are not classical particles with a fixed size like a marble; instead, they are described by their wavefunctions in quantum mechanics. The ‘size’ of an electron is related to the extent of its wavefunction, which can be spread over nanometer scales in a material like silicon.
Modern transistors, especially at the cutting edge (e.g., 3nm and beyond), are indeed reaching sizes comparable to the spatial extent of electron wavefunctions. This is why quantum effects like tunneling and confinement become significant in such small devices, influencing their behavior and performance.
So while transistors aren’t approaching the “size of an electron” in a literal sense, they are reaching scales where the wave nature of electrons dominates the physics of how transistors work.
@@Redwoodz112 "ChatGPT please make me sound more correct on top of this already correct statement to hopefully reach the correction singularity and secure infinite energy." ur prompt b like
@@Redwoodz112 Literally the most ChatGPT style message I've ever seen
@@Redwoodz112 oh shit science yay
As someone who's been building my own computers for more than half my life, I can tell you the main problem with these sorts of things is integration. You can have the fastest, most advanced computer chip ever, but if you can't plug it into anything, it won't do anything to replace what's already there.
I read an article in the 80s describing IBM’s research with photonic computing. An innovation they had proposed, due to light’s properties, is the ability to easily multiplex using different wavelengths of light. This ability also allowed them to switch from binary to decimal for the underlying computation engine.
Glad to see a company is finally commercializing a viable product.
I remember reading something like that in Scientific American many decades ago. I couldn't see how it could be used except in very simple ways with large chips.
That is absolutely not what is happening here. Lightmatter is an optical link company now. They pivoted away from optical compute because they built it and it wasn't competitive with CMOS. It turns out you can write lots of flashy papers about optical compute, but you can't make a product that makes any sense.
Fundraising on YT. 😆 Clickbaited.
If you have fiber internet at home/office, the light is already multiplexed with different wavelength at least to your building. This is actual technology used right now.
19:25 i love chips
To summarize the issues between light computation and light communication:
Nick is an expert on particle physics. Light is a particle and a wave. Electrons are about 1 Angstrom across if we look at them based on how they interact with their environment. Our current chip gate "sizes" (in quotes because this is a hand-wavy measure) are 2 nm if you believe TSMC. 2 nm is 20 Angstroms.
Lightmatter creates an INTERPOSER, which sits between the silicon dice and the substrate to which the chip is bonded. This is standard chiplet packaging.
In "The Matrix", there is no spoon. At Lightmatter, there is no compute being done in the interposer.
Light waves can interact with each other, but likely not in a way that can be used to create a computational gate.
Having studied this at M.I.T., Nick is one of the people in the world most likely to understand whether computation in the interposer is even possible based on what we know now (or what was known when he went to school).
Cheers!
Todd B
I've read about this concept in Popular Science (the magazine) over a decade ago, around 2010-2012.
This has nothing to do with the computational speed of silicon gates, as it only improves data-transfer speeds between them. There's still a need for a light encoder (turning electrical signals into light) and a light decoder (turning light into electrical signals), which are both likely bulky (unlikely to connect individual gates; more likely clusters, or entire chips as shown in the video) and have their own inherent delays (offsetting the benefit of light paths).
The significant potential advantage is the use of different wavelengths (colors) to transfer several data streams on the same paths.
So it sits between the cpu and gpu? So you guys are making motherboards?
Basically.
The presentation is confusing, on purpose.
cables for motherboards
Fancy cable company
😂😂
Utilizing the natural interference patterns of light, a computational system can be devised. This system harnesses the interference effects to create a logical loop, leveraging the phenomenon to its advantage.
The process involves:
- Creating a series of patterns that the light passes through, generating an interference matrix.
- Employing the interference to produce a self-sustaining light loop, even in vacuum conditions.
- Utilizing mirrors to redirect and manipulate the light, creating a computational framework.
Notably, light travels faster in vacuum than in crystal or glass, making this approach potentially more efficient. By harnessing short-frequency wavelengths, such as X-rays, the system can be designed to be compact and intense.
This innovative approach eliminates the need for fiber optics, instead relying on the principles of interference and reflection to create a computational matrix.
Sorry, if it is just a bandwidth increase, it isn't really fundamentally changing how computers work, because the actual computation is still being done by semiconductors powered by electrons. All they are doing here is increasing the speed at which the chips communicate with each other, by basically putting fiber optics into the silicon substrate.
Sabine Hossenfelder literally 3 days ago: Moore's law is so back.
We now have electrically pumped group IV lasers, meaning lasers embedded directly in the silicon. This is HUGE news, because now we can create fully enclosed optical circuits ready for production, whereas before this technology was limited to proof-of-concept lab experiments using external light sources.
Idk why, this gives me Theranos vibes
Same here, they didn't show a single example of it computing.
For sure! The whole "lab" looked a bit too ... perfect. The cool white guy MIT graduate + a bunch of nerdy Indian engineers ... hmmmm. 😂
I used to build semiconductor fabs back in the late 90's and early 2000's. If that's a "cleanroom" I've got a bridge to sell you really cheap, and some nice high ground in FL.
@@TexasRiverRat31254 There should be a new measure of IQ - the LQ Low Quotient. People would be happy getting a bigger number.
@@bigminifridge It is the same as QC (quantum computing) - 100% BS.
"imagine data going at litterally the speed of light" : well that's already the case with electricity you know, signals are transmitted at the speed of light. nowaydays we use fiber optics in communication, not because it goes at the speed of light, but because at high frequency copper impedance increase degrading your signal, but it's not a signal speed issue.
Having various colors for information is also already done, we are multiplexing various frequencies, which is exactly the same as using different colors.
communication lines travel at the speed of light, but signals within systems do not. That is what photonics is trying to break
tell me you didn't actually watch the video and understand it without telling me 🤣
@@zapman2100plz stahp using that stoopid saying ffs
Electrons do travel at the speed of light, but that's not the main problem. You see, communication inside the chip is really fast, but when the chip tries to communicate with other chips the speed becomes really low.
Lol these replies are killing me. Guys, get an education. Voltage (i.e. scalar potential) wavefronts in conductors travel at about the speed of light, but the dispersion and impedance characteristics of those conductive mediums lead to lower informational bandwidths and higher power consumption.
So if I'm not mistaken they've basically built a light bus. All they're doing is replacing the electrical bus with a photonic bus, thus allowing multiple communication paths on the same bus line, since they can use different light frequencies transmitted simultaneously.
These are planar waveguides using multiple wavelengths, great optical networking on a chip.
No mention of photonic logic gates let alone an entire processor.
Sorry, so many things wrong with this video.
1. The speed of an electrical signal is basically equal to the speed of light, so upgrading an electrical data bus to a light one does not improve the speed of data transmission.
2. When the CEO of Lightmatter says that "the GPU sits and waits for the data from memory and the CPU," he makes it sound like the data bus is the bottleneck. However, the GPU waits for data from memory and the CPU only because the speed of the memory and the throughput of the CPU are far below the GPU's, not because the data-transmission bus is slow! His statements are misleading.
Considering this, I agree with comments saying it gives Theranos vibes…
Wrong!
The speed of light is significantly faster than the speed of electricity; while light travels at the constant speed of approximately 300,000 kilometers per second in a vacuum, electricity typically travels at a speed much slower, usually around 85-95% of the speed of light depending on the medium it's traveling through, like a wire, due to factors like the material's conductivity and the wire's thickness.
Electricity only approaches the speed of light
In an ideal state, with no resistance, inductance, or capacitance in the circuit, electricity would travel at the speed of light.
@@ModernDatingLifeFJ Nice
@@ModernDatingLifeFJ Light isn't traveling in a vacuum. It's traveling in a glass fiber. Either way, the point stands, the slowest part of computation is fetch and store, not data transfer.
5:50 my g went from gaming to quantum computing in like 20 seconds. I’m still on overwatch 2.
So it's 'Sid Meier's Alpha Centauri': "To achieve the most powerful computing power we need the fastest possible element moving the shortest possible distance."
Moore's law doesn't claim that the number of transistors in a microchip will double every two years. The law claims that the density of transistors will double every two years, which does not imply exponential computing power.
That's why we're stuck; we need another technology for building CPUs, and ingeniously the photonic idea appears
*doesn't
Yes! Thank you!
Yes, even with a better and more efficient architecture it is possible to gain more computing power with a smaller number of transistors
If the density of transistors in a microchip doubles every two years, then the number of transistors in the same area doubles by definition of density, so what are you talking about? Sure, the speed of computing obviously doesn't double.
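The compounding itself is easy to check either way (the starting point below is the Intel 4004's rough transistor count, and the 2021 comparison chip is just one example):

```python
# Moore's law as compounding: transistor count doubles roughly every 2 years.
# It says nothing about clock speed or single-thread performance doubling.
start_year, start_count = 1971, 2_300        # Intel 4004, approximate count
for year in (1981, 1991, 2001, 2011, 2021):
    projected = start_count * 2 ** ((year - start_year) / 2)
    print(f"{year}: ~{projected:,.0f} transistors projected")
# 2021 projects to ~77 billion; real 2021 flagships (e.g. Apple's M1 Max,
# ~57 billion transistors) land in the same ballpark.
```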
Light and electrons both travel at about the same speed. The advantage of light is that the high frequency allows it to carry more information. Another advantage of light is that multiple frequencies can travel through the same waveguide simultaneously.
okay someone else noticed, just commented the same, cheers.
electrons tend to lose a lot of energy as heat
reducing heat emission is already enough to improve speed
Yeah it's maddening to hear them say multiple times in the video that photonics "travel at the speed of light"...as if electronics didn't. Marketing bullshit. There are some real advantages to photonics, but the ability to travel at the speed of light isn't one of them.
@RenderingUser yes, but light loses some energy too; it has to be reflected or generated, which adds its own losses
Ehm, electrons don't travel at the speed of light in the wire. It's the electric field which travels at the speed of light and transfers the information.
Remember this is the tech that's publicly available. Imagine what the Military has.
That's an excellent utilization of current photonics but I'd love to see the dawn of boson-fermion transistors that use light to change states and enable photon based logic instead of electron flow logic.
Hopefully we get there one day! I think that this is a good stepping stone that will have actual business use cases and then from there we may see more investment toward this end by this company and competitors. I’m no expert, but it seems to be a pretty hard thing to implement
These are comments I’m looking for!
Wouldn't you still need to read the changes at the speed of light and how would that process data for output?
All I see is "let's all believe" hype.
oyvey pajeeets can't make it
You keep making it sound like the signal in Lightmatter's chips moves faster than in conventional chips. It doesn't, unless their waveguides are hollow. As far as I am aware, light in a standard fiber-optic cable moves at the same speed as or slower than electronic signals in a copper cable (both at about 70% of the speed of light). Thus, this will not make anything faster in terms of response time unless their light waveguides are hollow-core (which would be closer to 95% of the speed of light).
Yes, a similar speed in physical lines, however you can't stack 16 signals overlapping on one wire as you can with their light setup. The frequency is also higher, so it's still an improvement.
@@Flyingwigs Oh, please. What do you think cable TV does? Hundreds or thousands of independent signals overlapping on one wire through frequency-division multiplexing (FDM) which is literally exactly how photonic communication "stacks" multiple signals on one optical fiber. The two benefits are that the frequencies are higher, as you said, and that losses are lower.
@paulgolff4996 It still stands as an improvement: FDM etc. can handle hundreds to thousands of signals, but fiber optics COULD handle thousands to millions of signals depending on the system used.
@@paulgolff4996 Light is all encompassing though. It doesn't interfere with itself, unlike frequency bands. Like Wifi Bands or Radio bands - if you get them close enough together, you run into interference. AND light has 16 "bands" (aka colors) in 1 instance of light. One instance of light is 16 times more effective than one band of frequency multiplexing.
So before you "Oh, please" someone, use your head and think a bit more. Otherwise you just sound pretentious, which is probably what you wanted anyway so people wouldn't question you.
@@Flyingwigs Yeah, no. Practical optical amplifiers have limited bandwidths. Commercial systems using FDM over coax are available with far more channels than commercial systems using WDM over fiber. Fiber still wins on aggregate bandwidth and range, of course.
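For concreteness, the aggregate math is the same arithmetic in both schemes; only the headroom differs. The channel counts and rates below are illustrative, not from any datasheet:

```python
# Aggregate capacity = channels x symbol rate x bits per symbol,
# whether it's FDM over coax or WDM over fiber.
def aggregate_gbps(channels: int, gbaud: float, bits_per_symbol: int) -> float:
    return channels * gbaud * bits_per_symbol

wdm = aggregate_gbps(channels=16, gbaud=56, bits_per_symbol=2)       # PAM4-style lanes
fdm = aggregate_gbps(channels=158, gbaud=0.005, bits_per_symbol=10)  # cable-TV-ish QAM

print(f"illustrative WDM fiber link: {wdm:,.0f} Gbit/s")  # 1,792 Gbit/s
print(f"illustrative FDM coax plant: {fdm:,.1f} Gbit/s")  # 7.9 Gbit/s
# Fiber's real advantage is loss and optical bandwidth headroom, not "light is faster".
```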
The tagline says "It's time to compute with photons," but I don't really see any optical computation executing on the chip, just chip interconnects that are being implemented optically. And not a word about the photodiodes or laser diodes that are responsible for the electro-optical conversion, or about how the light is modulated to achieve the necessary transmission rates.
Converting photons to electrons and vice versa is still an issue they are dealing with. The founder has some long lectures here on YT. Pretty enlightening. Haha.
Could you explain in brief? I couldn't find the video
@@theseemsligitguy8848 For a transistor (logic/gate element) to work, it must be able to turn on and off. This is done by allowing electrons to move or stop. Light must be converted into electrons in order for electrons to move or stop (i.e. digital 1 or digital 0). After the electrons move, they then need to be converted back into light so the signal can move to another location.
Converting between light and electrons requires additional silicon and heat. This will make chips bigger.
The ideal way to do this is have a transistor which runs purely on light so no electrons are needed. No one has figured out how to do this. Until this breakthrough is found the idea of photonics will be like quantum computing, always being worked on but a working CPU is never created.
Edit: Another issue with purely light based CPUs is that to make a small CPU the wavelength of light would need to be 10 nm or less. This puts photons into the x-ray region, which creates other challenges.
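A rough latency budget shows why that conversion overhead matters so much at short reach (all of the numbers here are invented placeholders for illustration, not measured values):

```python
# Why E/O + O/E conversion matters: flight time is tiny at chip scale,
# so (assumed) conversion latency dominates short links.
c = 3.0e8

def link_latency_ns(distance_m: float, n: float = 1.5,
                    conversion_ns_per_end: float = 5.0) -> float:
    flight = distance_m * n / c * 1e9       # time of flight in the guide, ns
    return flight + 2 * conversion_ns_per_end

for d in (0.02, 1.0, 100.0):                # on-package, in-rack, across the hall
    print(f"{d:>6} m: {link_latency_ns(d):6.1f} ns")
# 0.02 m: 10.1 ns -> the assumed 10 ns of conversion dwarfs 0.1 ns of flight.
# Optics pays off on bandwidth and energy per bit, not raw short-range latency.
```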
The "Clean room" for this technology does seem a lot more easy on the humans rather than the nano scale clean room of pure silicon foundries in Taiwan.
I don't know anything about computers, but from what I understand it's a bit like what happens in nature: if we put all our neurons in a line they would take up a lot of space and the signals would arrive with a long delay, whereas with information interconnected at many levels it can be shared simultaneously and much faster
I've been following the news around this company for a couple of years now; it's great to see someone finally visit their facility and make a video about their work. Great video.
It could be argued that smoke signals were communicating with light.
bro ☠
@@ohzinteractive.studio Also, signaling with mirrors catching the light was around way before the late 1800s.
@@GudasWorld_2 What about duct-taping a USB stick to a homing pigeon?
well they did
Semaphore used flags, and light
In my opinion, these light chips have the potential to break the boundary between technology and organic life, seamlessly merging them together.
I'm baffled by why Moore's law being dead is so important to many people. People get really passionate about it, but everyone knew that it wouldn't last; physics would make it impossible to continue at some point.
It's like "oil is going to run out in 50 years", said 50 years ago 😂
Plus I'm sure we are seeing that limit; since transistors got so small, they begin to exhibit quantum properties.
I got to “light travels so much faster than electricity” when my eyes glazed over.
Clearly there are certain benefits with fiber optics, with things like bandwidth and such, but the speed of electromagnetic waves (be it light, radio, or electricity) isn't one of them.
I like the idea of using light speed for AGI. What better way to speed run our extinction?😂 There was an ad during your video with a lady asking her Meta glasses which pool ball to hit while playing pool, remember when we used to play games to have fun? Now we can have our AI play our games for us, future is looking really good! (Sarcasm intended)
Laws of computer physics don't die, they just evolve into something more than they previously were!
I really love the editing, and how much work you put in: actually visiting the company, doing interviews, motivating and exciting people. But I'd love for there to be more actual information in these episodes as well... I think you repeat some points, and you seem afraid the average viewer is incredibly uneducated. I'd love for you to mention interesting properties of light interconnects, like layering different wavelengths of light down the same channels and filtering them out, meaning each optical channel can act like multiple cables occupying the same space, etc.
Getting into the nitty gritty of the matrix multiplication would have been cool! Thanks for the feedback
@ 7:00 The juicy bit
Was looking for this comment 😂
This honestly feels like the real start of the video.
I don't know if you put this together yourself but it is FIRST RATE !! completely professional. Fantastic voice over and script SUBSCRIBED !!!👍
Metamaterials in Photonics would be the ultimate way
Light has been used for communication for 100,000 years. The first known lighthouse was the Pharos of Alexandria, which was built in Egypt around 280 BC.
Lighthouses use a very slow "digital" type of signaling system: stop-start light.
Dude, smoke signals?
Whowudathunk
To throw in my 2 cents: I think the main confusion is that "light" is many, many different things. In its most basic form it's just a wave traveling through the electromagnetic field that deposits its energy at a single point; more technically, photons are the force-carrying boson of the electromagnetic field. In practice these are the same thing, but conceptually we don't often consider magnetism a form of light (virtual photons), or really consider how strangely photons behave, having no rest mass and neutral charge, compared to electrons.
I am frustrated that it took until 8:25 for a number to be mentioned. I understand that context for the issue needed to be provided, but much of this video is restating "traditional silicon can't shrink further" in different ways.
I am also confused. At first, it seemed as if Lightmatter was claiming to solve a problem in the space of digital logic. But the two products they show are system interconnects. Are they addressing limitations in networks or in logical computation on a single chip?
The end of Dennard scaling was not the end of transistor-count scaling. Multicore processors, FinFET, gate-all-around, 3D packaging, backside power delivery etc. are all new approaches enabling transistors to scale down further.
I do appreciate the on-screen source information as you show external content. Please keep doing that.
At 18:09: correct me if I am wrong, but impedance would be delay, no? Inductance is also an energy-storing element, like the capacitor. Both capacitors and inductors delay the circuit's response; that opposition is called reactance, and together with resistance it makes up the impedance of a circuit.
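(To make that concrete, here is the textbook series-RLC form, with made-up component values:)

```python
# Series RLC impedance: reactance is the frequency-dependent opposition
# from L and C; |Z| combines it with resistance R.
import math

def impedance(R: float, L: float, C: float, f: float):
    XL = 2 * math.pi * f * L           # inductive reactance, ohms
    XC = 1 / (2 * math.pi * f * C)     # capacitive reactance, ohms
    return XL, XC, math.hypot(R, XL - XC)

R, L, C = 50.0, 10e-9, 1e-12           # made-up trace-ish values
for f in (1e9, 5e9, 10e9):
    XL, XC, Z = impedance(R, L, C, f)
    print(f"{f/1e9:>4.0f} GHz: XL={XL:7.1f}  XC={XC:7.1f}  |Z|={Z:7.1f} ohm")
```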
I agree with you. The video initially makes it seem like they have created some new form of digital logic that computes with photons, when in fact they are only focusing on creating faster chip interconnects. I wish the video had also covered some of the current challenges in photonics, instead of just saying that it's better than traditional approaches in silicon. The concept seems interesting, but the video presents it as something completely new and revolutionary, which I don't believe it is.
@sunhome22 Good point. More performance metrics, especially as compared to traditional solutions, would also be appreciated.
Since they are the first, allegedly, it would have been interesting to hear about what specific physical challenges they solved to make their solutions possible.
This was a great commercial you made for this company. Very well produced. I wonder how you setup these deals with companies to produce commercials for them? Are you paid a flat fee or is it based upon the performance of the video? I did notice you neglected to include the require notation that this is a commercial.
This is not a commercial and we were not paid to make this video. We've been making videos about deep tech companies for almost 2 years now and are only now beginning to receive sponsorships to cover our costs.
@@storyco2000"Sponsorships"
Hehe 😁
What an idiot
How much d do you suck?
This is the 10th time in the past 10 years that "next year" we will hit the cap of whatever Moore's law says the limit is... especially whenever a new-gen CPU drops. They love bringing it up every time.
Sorry, but no, this doesn't kill Moore's law. It isn't computing, it's just data transfer. It helps the compute spend less time waiting around for data, but doesn't change anything about the actual computing
Also, why tf does AI have to be shoved into this?
I feel it's a sponsored video, and one to show investors as well 😪
@@ComicalChonkCat That's the only logical explanation, either this or clickbait for ad revenue
They explained how AI fits into this. The amount of energy and heat used by AI can be drastically reduced by using light.
@@akatsukilevi they explained it in the video.
AI does increase our energy usage at incredible rates (we're talking about millions of GPUs for an AI company like xAI), and if this chip can reduce that usage, then it's already worth investing in, IMO.
This is really cool! Now I wish there were other companies that also hedged their bets & research prowess on spintronics, instead of fully betting on photonic ICs.
Because making waveguides, lasers, and the rest still faces manufacturing challenges and the costs associated with them. Whereas for spintronics you need spin polarizers, spin transistors & GMR devices, which are relatively easier to make.
Light can have resistance if it goes through a slightly opaque medium. We just have lots of room-temperature light superconductors.
Light speed in fiber optics is about 100,000,000 meters per second slower than a signal through copper.
bro the audio mixing wild. ts a whole experience while high
😂
AYO WHAT
lmao yes
please internet, enough with the "bro"
@@The_SUN1234 womp
imagine if this hits mainstream, we'll go from 2000-watt PSUs to a mere 80 W or less
The tech is interesting without the AGI smoke
Don't like that this is focused on AGI, because this and light computing are at best loosely connected. More processing power won't bring AGI. We just don't know how to build such a thing yet. And no, LLMs aren't AGI and will never be, regardless of how much data they are fed.
Came here to say this but saw you already did. Seems they are just throwing the word around for buzz
Tbf, optical neuromorphic computing is possibly quite a reliable solution, so it'd work perfectly fine with neural nets and other types of AI.
Edit: you are right on AGI mostly, though.
@@allepiccondor1166 that's so ass. I mean light computing is cool enough anyway and worth investing in if you do want more powerful computers in the future
@@baltofarlander2618 That might be. I mean, photons are quantum objects after all. And I am not saying that AGI is never going to happen, it's just not going to happen with LLMs. They are just not built for it. And I hate it when some people tell me that ChatGPT is already smarter than humans and can also code better than most people. Me, a hobby programmer, can absolutely falsify this.
if not AGI then what? games? faster TikToks? y'all don't like the buzzwords but also don't realize what's needed for the next advancement. the grass is always greener on the other side, right?
It's just a silicon chip where fiber-optic cables are used for communication between chips, and at the end of the day it might increase the speed at which data travels inside a chip, but it is not going to compute anything. The computation will still happen on the silicon. For example, every year Apple says that they have developed a way to connect two M Max chips to make one M Ultra chip; this technology is there to make that connection faster, that's it. The computation will still happen on the chips.
This video is confusing, since it describes a technology for solving the interconnection between chips, while on the other hand claiming that the actual computation is done with photonics. This claim is by no means explained in the video. What I would like to understand regarding the interconnects is how the light is generated and detected. I suppose there is a laser for generating the light; is it a VCSEL? And what data rate can this laser-fiber-detector link achieve? I get the feeling that the intention of this video is to raise more funding, without explaining the key performance indicators.
> Humanity's 'bout to invent a new computer.
> Still hasn't figured out how to eradicate bedbug infestations.
They have made the connections, now all we need is a light based architecture.
It is totally incorrect to say that integrated photonics has never before seen mainstream commercial usage! Hello Luxtera, Acacia, and Cisco? Lightmatter is absolutely NOT pioneering this field, and the actual subject matter experts literally yell "Theranos!" at Nick Harris when he makes appearances at academic/industry conferences with this pitch. According to Nvidia, the Grace Blackwell platform achieves ~1 Tbps chip-to-chip links at ~1 pJ/bit and near-zero cost, while optical interposers will immediately increase power by 3-5x and cost by 30-50x with ZERO added functionality. Short-reach links with silicon photonics are completely unnecessary and counterproductive from a system perspective. They do offer benefits for scale-up and scale-out links that can take full advantage of the low loss of optical fibers vs copper, but this is already being done with co-packaged optics and chiplets by the rest of the industry, which is productively humming along without making the kinds of flamboyant claims this company does. By the way, GlobalFoundries is the one actually developing the new process technology and making all of the stuff in this video; these guys are just drawing GDS files like all their other customers.
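To put those quoted figures in perspective (taking the comment's numbers at face value, not as verified specs), the per-link power works out like this:

```python
# Back-of-envelope link power from the figures quoted above.
data_rate_bps  = 1e12    # ~1 Tb/s chip-to-chip link (quoted figure)
energy_per_bit = 1e-12   # ~1 pJ/bit (quoted figure)

electrical_link_watts = data_rate_bps * energy_per_bit  # = 1 W per link
optical_low  = 3 * electrical_link_watts   # claimed 3x power penalty
optical_high = 5 * electrical_link_watts   # claimed 5x power penalty

print(f"electrical: {electrical_link_watts:.1f} W per link")
print(f"optical interposer (claimed): {optical_low:.0f}-{optical_high:.0f} W per link")
```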
In the 1700s, experts said and "proved" there's no way the human body can withstand speeds in excess of 25 mph.
We all know how that worked out.
Falling for 2 seconds is about 40mph lol
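Quick sanity check on that claim, ignoring air resistance (v = g·t):

```python
# Speed after 2 seconds of free fall, no air resistance.
g = 9.81                 # gravitational acceleration, m/s^2
t = 2.0                  # seconds of falling
v_ms  = g * t            # ~19.6 m/s
v_mph = v_ms * 2.23694   # convert m/s to mph
print(f"{v_mph:.0f} mph")  # ~44 mph, so "about 40 mph" checks out
```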
The human body had exceeded speeds of 25 knots per hour at that point (MPH wasn't often used in the 1700s as a measurement); being in a sailboat and hitting a hurricane, you'll be sailing pretty fast, and many survived. The real argument behind that was related to deceleration and instant acceleration. And they were right. Get going 25 mph and come to a dead and sudden stop with nothing protecting you. You could very likely live with modern technology and medicine, but you would not have said your body "withstood" that
@@Diamonte420 knots per hour is redundant, a knot is already a unit of speed (nautical miles per hour); "knots per hour" would be an acceleration, like saying "mph per hour"
It wasn't a law but a powerful tool for companies to make more money. Even when the technology allowed it, companies did not make those chips commercially available.
Is this one giant ad?
"Invest before Nvidia buys us"!
No? If it was an ad, we would have been paid to make this. It's just a video about a company doing cool photonics stuff
@@storyco2000Why not both? /s
It is an awesome video, I'm also investing lol.
@@luttman23 It's not publicly traded, so what are you investing in? 🤔
@@latorn buggar all, couldn't find it as it isn't publicly traded
In your 'Why I do S3' video, you talked about reading all these books about startups. Could you share some of your favorites?
The "speed of light" as we call it is just the maximum speed any information can travel and is always depending on the medium it is traveling through!
The maximum speed of information transfer inside a copper wire is about 290,000,000 meters per second (according to wikipedia)
The maximum speed of information transfer in fiber optics is about 200,000,000 meters per second (according to wikipedia)
This is not - as is repeatedly advertised - "faster" - but this tech enables more bandwidth, much higher frequency due to less resistance and it produces less heat and needs less energy.
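Using those two Wikipedia figures, here's a quick sketch of what that means for propagation delay over a short chip-to-chip distance (the 10 cm link length is chosen arbitrarily for illustration):

```python
# Propagation delay over an assumed 10 cm chip-to-chip link,
# using the signal speeds quoted above.
distance_m = 0.10   # assumed link length

v_copper = 2.9e8    # m/s in copper (figure quoted above)
v_fiber  = 2.0e8    # m/s in optical fiber (figure quoted above)

t_copper_ps = distance_m / v_copper * 1e12
t_fiber_ps  = distance_m / v_fiber  * 1e12

print(f"copper: {t_copper_ps:.0f} ps, fiber: {t_fiber_ps:.0f} ps")
# ~345 ps vs ~500 ps: fiber is actually *slower* per meter, so the
# advantage really is bandwidth and energy, not raw signal speed.
```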
Achieving AGI will require a new paradigm of AI models
basically what we have here is a replacement for Infinity Fabric / the interconnect that connects the chiplets or cores together, and the connection between multiple chips. this is a great thing, but it's not quite replacing the switches yet. when that happens, that's the real revolution we are looking for.
Exactly what came to mind during the first half of the video.
I wondered if it was replacing links at the SXM/OAM level / the NVLink/Infinity Fabric level.
Wow, these are such young guns that are now leading us into this new tech that will transform the way we communicate and process vast amounts of data. So impressive.
Skip to 15:45 to get past the bloat
"so we communicate with light speed", basically the whole video
Dumb.
What's the name of the background music starting at 1:30?
So many comments splitting hairs over definitions. As long as it makes things better, I don't care. It's overhyped, but it kinda needs to be for their investors. It is new and certainly needs to be looked into, even if it is just a stepping stone to true photonic computing.
I have been working on and experimenting with optical communication since my childhood, and I'm 19 now; I still see a future in light communication methods... after finding this video my brain immediately said, one day you should be over there! LIGHTMATTER, I will remember this company... I'll be coming there with a resume one day...
Bruh, how do you even experiment with optical communication as a child/teen?
@AfifFarhati I'm working on Light Fidelity, AKA LiFi: data transmission through light. I convert my data into binary through a microcontroller (Arduino), and this binary data modulates the light; those pulses are received and decoded by another Arduino with a photodiode.
He probably put together a FOBOT with pre-term fibre and thinks it's on par with a degree in photonics engineering.
@@millieh3179 LiFi is optical communication done in an open environment, not in an enclosed one like a fibre cable... you need to transfer both the clock and the data signal simultaneously under ambient light, so I built my own communication protocol that adjusts to various ambient lightings... I learn things by doing rather than making critic comments under youtube videos!
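For what it's worth, one common way to carry clock and data together over an on-off-keyed light link is Manchester coding. Here's a minimal Python sketch of that idea; it's just an illustration of the general technique, not this commenter's actual protocol:

```python
# Manchester coding: each bit becomes a transition, so the receiver
# can recover the clock from the light pulses themselves.

def manchester_encode(data: bytes) -> list[int]:
    """Each bit becomes two light levels: 1 -> (on, off), 0 -> (off, on)."""
    levels = []
    for byte in data:
        for i in range(7, -1, -1):          # MSB first
            bit = (byte >> i) & 1
            levels += [1, 0] if bit else [0, 1]
    return levels

def manchester_decode(levels: list[int]) -> bytes:
    """Read level pairs back into bits, then pack bits into bytes."""
    bits = []
    for i in range(0, len(levels), 2):
        pair = (levels[i], levels[i + 1])
        bits.append(1 if pair == (1, 0) else 0)
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

msg = b"hi"
assert manchester_decode(manchester_encode(msg)) == msg
```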
This is one of those videos where you have to sort comments as "newest first"... and have the dislike counter addon. It's a clickbait fluff piece. Solidly unauthentic.
“Inauthentic”
I once met a man in a rainforest in QLD who claimed he had actually made a computer chip that was faster than light; however, when he tried to patent it, they said they were sorry, but as this breaks the laws of physics, they could not accept it. Not word for word, but you get the picture.
12:36 * the size of an atom
Yes, I just paused and came to correct it, thank you
Wanted to comment the same thing 😅
CEO of a company dealing with this with an MIT degree in the field and still getting it wrong? 🫣
@@Janiaje even the best make mistakes, and in a video with so much information it's easy to miss something
6:29 Veritasium B-roll spotted
I was trying to find at least one person who noticed that. Glad I found one.
If this was the case, current silicon manufacturers would keep expanding and putting more resources into conventional chips and would also turn to this; I wonder why they don't.
Light is not faster than electricity. Yes, actual electrons move slowly, but electrical signals, which are what is relevant in computing, already travel near the speed of light.