Hey man, you make nice videos, but you're going too far with the word "fabless". Fabless (*fabrication*-less) just means that nVidia only designs their chips but doesn't print them. It has nothing to do with simulating real-life or training robots. There is no fabless universe. I think you've just extrapolated NVidia's strategy of virtually prototyping their chips to something that "fabless" isn't.
i don't know - i think it makes sense that the importance of emulation to their success would have convinced them that emulation could be a similar panacea for other industries. i do sorta think the omniverse fits that fabless universe definition. but i do get it's a stretch.
It is funny how intimidating certain computing concepts are to many. I would like to say that NVidia's chip manufacturing process is not that difficult to explain, and using a potato chip to describe what NVidia does doesn't simplify anything; it "dumbs it down". It is an abstraction so far removed from the underlying truth that the only people who could understand the abstraction are the very people who understand the actual process. All a simplification gag, such as pulling out a potato chip, does is reinforce in the minds of the many that this is complex and too hard for them to understand. I say let them decide that: give an attempt to explain the reality of what NVidia does, and if you as the author of the video think it is too boring or not compelling enough for the narrative, then, understandably, remove the "dumbing down" from the video entirely.
yeah, i think you have a bit of a point, but at the same time, i think that if you start getting into 5 minutes on stuff like this (en.wikipedia.org/wiki/List_of_semiconductor_scale_examples) you're going to lose people, especially if they're trying to get a grounding on nvidia.
nvidia makes specialized processors for highly parallelizable workloads like graphics and ai. why didn't you just ask me dude, i could have told you that in the first place
Crypto really boosted their company with perfect timing, and then AI became mainstream to keep their stock price going upward. I did hear that their markups on chips are so crazy (like 800% profit on each chip) that companies like Microsoft and Amazon want to design and use their own chips to bring costs down, so we will see what happens in the long run.
There's also this consistent idea that Nvidia does not work well with other companies. Apple forsook them on Macs forever back in the early 2010s; Microsoft and Sony collaborate with AMD for their consoles. Only Nintendo is a true long-term partner. Outside of *you buy our cards and stick 'em in a box*, Nvidia does shockingly little with other companies. The Apple of graphics.
The first graphics card I had in early 2000 was an Nvidia Riva TNT2 Ultra 32 MB; it was leaps and bounds ahead at the time. Their 3000 series cards were good, but gaming is such a small percentage of their business now that they're neglecting it. Sad times.
@@PhilEdwardsInc part of the fun of cheetos is getting your fingers covered in bright orange cheesy dust. Then being in a constant state of potential destruction if you ever touch anything. With the glorious finish of licking your fingers clean.
My typical spiel was cut off by Circle Glasses Phil (the one true Phil who controls them all), but you can find the Patreon reaction here: www.patreon.com/posts/reaction-video-2-101845366
Full sources in description, but biggest applause goes to Acquired: ruclips.net/user/AcquiredFM
You'll find 10 hours of NVidia stuff there, and it really helped ground this vid. Great place to become Nvidia obsessed.
Thanks for watching.
For someone who's been into computers for years, Nvidia has always been a company I've been aware of. I never thought of how weird it must be for people who had never heard of it until it became worth over a trillion dollars out of nowhere. One of my family members, who doesn't know much about computers, thought it was so cool that I had an RTX card in my PC simply because it was Nvidia.
my journey was coveting those graphics cards on video game magazines in the 90s, forgetting about it for a while except when i started 3d graphics, and then waking up to find it worth $2T.
Do gamers still hate Nvidia for their price gouging on consumer cards? I stopped paying attention years ago.
@@beetooex Yes they still do. But AMD and Intel GPUs are missing features/not as good, so they're forced to buy nVidia
@@beetooex of course, I would've been trashed if I posted my 4070ti purchase when I got it but I was upgrading from a 960 so I really didn't care
@@beetooexGamers aren't forced, but professionals are. CUDA is overwhelmingly dominant in the professional space, though AMD is working to catch up with ROCm and its professional GPUs are significantly cheaper
that leather jacket. you know he had a team of people that picked this for him.
he says his wife bought it for him! there's a whole mythology...
@@PhilEdwardsInc oh no, the PR department and the stylists share an office! 🤣. rabbit hole opened-afternoon ruined :)
@@onemorechris I am so split on what I think is true....
Nah he just saw todd howard
@@PhilEdwardsInc i noticed it's not the same jacket, so unless his wife bought him a set of jackets…or he gets the same gift every birthday…🧐🕵️
Still remember the day nVidia bought 3dfx. So very world-changing to my early-college-years brain.
And 99 % of the people worldwide don't even know what 3dfx is! But it was a gamechanger!!! 😄
You see, I think that just shows how they’ve always been abusing the market for their own gain…
@@nottucks what is? Buying competitors?
Cuz that's... standard free market behavior
Or... Do you mean "them acting like this isn't new"?
0:22 that camera shake as you sat down was brilliant!
thank you, it took some planning and post work, but it was worth it (I will never admit that I just fail to sandbag my tripod).
You are rapidly becoming one of the best presenters on RUclips. Another great video.
Translating one type of nerd to the nerd populous is such a nerdy thing to do... And I appreciate the heck out of it
Asianometry+Phil = the crossover I didn't ask for, but desperately needed!
such an epic channel. I did not know about it before this vid but it definitely was a reason to have this sort of high level approach since Asianometry is so good at the detailed stuff. The TSMC videos are really nuts...so detailed.
@@PhilEdwardsInc seconding (thirding?) the Asianometry channel. Everything you wanted to know about making computer chips and more.
@@PhilEdwardsInc duuude that channel itself is an entire academic course in some ways. There's also TechTechPotato with Dr Ian Cutress for some occasional in depth stuff.
Did not expect that crossover! Phil is killing it with these great videos
Jensen Huang is so weird. That said, I bought Nvidia stock in the long long ago, so it's not like I'm complaining.
I had such an emotional journey around him. I ended up thinking he's kinda awesome. That said, I did not trot out the leather jacket for this vid...
the world currently runs on nvidia - you want to do anything with AI, you're using nvidia GPUs
12:15 That laugh reminds me of the laughing without smiling trend
Now it’s the MOST valuable
It never occurred to me that the general public wasn't particularly aware of Nvidia. I mean, anyone who has ever built a PC in the last 20 years had to go down the rabbit hole of deciding between a GeForce or Radeon GPU and researching which model was the better bang for your buck at that particular point in time. I often forget that there's people out there that aren't into PC's and/or gaming.
Yeah I was sort of a version of this myself - coveted them all totally in the 90s as a kid, fell away, and then came back in 2024 to be like - wait, why is the thing that made Starsiege Tribes look cool suddenly worth 2 trillion!?!?
great video keep it up Phil!
I still remember the NVIDIA vs 3Dfx flame-wars...
You're waiting for a train, a train that will take you far away. You know where you hope this train will take you, but you don't know for sure. But it doesn't matter.😱
NVIDIA is transforming every industry and every company
I can see TSMC charging more knowing they sell high.
if only the engineers figured out their power connector, nvidia would've been fine
4:06 says he wants to look at a chip, and instead looks at a crisp 😉
i totally forgot about the atlantic divide. every time i say silicon chips in this video, imagine silicon crisps
@@PhilEdwardsInc 😀
The hype train is going to keep pushing for the current round of "AI," but so far it seems the costs will remain too high and the results too poor for machine learning to actually be an effective tool for most use cases. Like previous AI hype waves (since the 1950s!), there will be specific, niche cases where the current hyped tech is a great fit, and in those niches it will stick around, and people will stop calling it "AI." Then sometime down the road a new technology will be hyped as "AI," and the cycle will start over again.
(i secretly agree!)
nvidia makes supercomputers and silicon for the military... they make components for cars, AI, and medical robots, all that on top of regular consumers and crypto miners.
how could they possibly not be worth 2trillion 😂
Great video thanks
In a gold rush, the ones who sell the shovels are the richest
Why is CUDA a moat? That was never explained.
I think nvidia’s growth has left other tech stocks a bit nvious.
The third Smothers Brother...
they are not!
You're a bit too good at that creepy dead-eyed laugh, mate. Worrying.
Nvidia has a hell of a chin stuck out, ready to get hit by the fist of competitive silicon design. The only thing keeping them safe... software (CUDA integration). As we know, software moats are often shallow.
sure seems that way to me.
@PhilEdwardsInc so one of the notable things about most AI implementations is how inefficiently they are implemented at the orchestration level, but they're well tied together by ubiquitous frameworks: PyTorch, TensorFlow, MXNet. Because of this, you can easily park a competitive hardware compute solution on top of the stack so long as those frameworks are able to interface with it (insert Intel here). Intel already can in some ways.
Also, you should look at the value of Tesla vs. the value of Nvidia over time. You will see a trend that suggests Nvidia's valuation is a bit of a bubble inherited from Tesla investors moving over.
Nvidia also has a vulnerability around how they implement their chips: very jack of all trades, and brute-force master of all via an everything-and-the-kitchen-sink design. This opens the door for pure-play chips to come in and scoop up market share.
Lastly, Nvidia is going to find themselves very vulnerable to power-efficient competition that can run very large models. Most power costs come from moving data, and large models use a lot of memory and data, so narrow-bandwidth chips don't handle them efficiently given all the data movement. If a competitor focuses on wide bandwidth and efficiency, not pure speed, they will gobble up a large chunk of data center clients with proven fixed-run models.
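The point above about frameworks weakening a hardware moat can be sketched in a few lines of plain Python. This is a toy dispatch layer, not any real framework's API; the backend names and functions are invented for illustration:

```python
# Toy sketch of why frameworks like PyTorch weaken a hardware moat:
# user code targets the framework, and the framework dispatches to
# whichever backend is registered. Backend names here are made up.

BACKENDS = {}

def register_backend(name, matmul_fn):
    """Any vendor can plug in by providing the kernel the framework expects."""
    BACKENDS[name] = matmul_fn

def matmul(a, b, device="cuda"):
    """User-facing op: the call is identical regardless of the chip underneath."""
    return BACKENDS[device](a, b)

# A pure-Python "kernel" standing in for a vendor library.
def naive_matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

# Two "vendors" register the same interface.
register_backend("cuda", naive_matmul)
register_backend("rocm", naive_matmul)

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
# Same user code, swappable hardware target:
assert matmul(a, b, device="cuda") == matmul(a, b, device="rocm")
```

As long as a competitor can supply the kernels the framework dispatches to, application code never has to change — which is why the moat lives in the software layer rather than the silicon.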
I appreciate the version of you that is incapable of putting the carafe back in the coffee maker
he's still there
Plot twist that's the human one
I thought he was going to accidentally shatter the carafe.
keep tryin buddy. you'll get there
NVIDIA also cashed in big time on the bitcoin boom, I think this deserves more than just a footnote. Anyone remember the graphics card shortage, when everyone was into mining crypto?
yeah i was pretty split on how to portray it - ultimately, the fact that nvidia tried to dissuade and regulate crypto miners made me think that it wasn't central to their mission (even though they did eventually sell crypto-focused gear). the acquired pod does a good job of kinda contextualizing this.
@@PhilEdwardsInc I think your very-high-level take here is pretty accurate, as someone who's watched the 3D industry since it was born. The crypto boom wasn't a BAD thing for them; it put a fair amount of cash in their reserves. And while it really hurt us in the gaming space due to the supply constraints, which made it feel big in the consumer-facing market, ultimately it's pretty damn small potatoes compared to the AI-driven sales of the $10-50-100k enterprise market and the resulting 2 trillion dollar market valuation.
@@jellorelic Basically a shovel retailer in back to back gold rushes.
@@PhilEdwardsInc Agreed. Overall crypto ran parallel to the AI story. Crypto made Nvidia some money but was just something they happened to be good at and had to scale up for when their GPUs came into high demand. It was a headache for their gaming division, since their hardware wasn't meant for crypto miners, so they tried many ways to segment crypto away from gaming.
Mining Ponzi, lol
I regret I didn't invest 5k in Nvidia 5 years ago 😔 😪
If you just invested when you made this comment you would be up 30%
I regret not investing in whatever company out there that increased the most percentage wise
As an electrical engineer, this is one of the best explanations of Nvidia I’ve ever seen- amazing job!
Phil I gotta say the whole meta parts of the videos that has become your style is amazing and I hope you never stop because I love it
+1
My first PC in the 90s had an NVIDIA graphics chip. They also gave buyers demo games to demonstrate their 3D gaming capacity. My cousin and I played Future Cop all the time. For the 90s it was incredible!
Yet they still can’t make good Linux drivers…
"Nvidia Fuck You!" - Linus Torvalds (at most from) 2012.
That man's personality is that leather jacket
i have a whole conspiracy theory about this.
@@180_S well, just to start with, it's this core element of his personality (with an explanation tied to his wife) but the older GTCs actually have him dressing in totally normal polo shirts...
@@PhilEdwardsInc I bet there is a not insubstantial correlation between nvidia's stock value and when the leather jacket was rolled out.
That jacket IS NVIDIA. It has him under its control. He's trapped in the jacket.
Nvidia stopped being a gaming company in 2020+, but fanboys still buy their overpriced GPUs.
AMD is better suited for gamers today.
Unless their GPUs burn in which case they will blame it on you and refuse to replace it
Wow this video is fantastic. I am a 5 year fan boy and investor of Nvidia but this actually taught me new stuff. Great thx
And today NVIDIA is the same as EA or Activision: hated worldwide. And for a good reason.
It's very simple why: because Nvidia provides "shovels" to gold diggers
The only stock I’ve ever called to be one to explode in value is Nvidia. Back in ~2016 I told a close friend who was very into trading that he needs to go all in with Nvidia. It wasn’t just the obvious thing that they would supply so much of the worlds AI chips, but the market analysts also agreed unanimously that it’s going to see spectacular growth. The next couple of years went by and it did indeed happen. My friend later tells me, “man, I should have listened to you” 😂
ugh i wish we were friends. there were some old marc andreesen tweets i ran into from around that time - made me feel pretty silly for not betting on them (though, to be fair, I still have no idea if it'll work out for them).
Nvidia's fortune changed with crypto mining. Right now the AI "datacenter startups" are the leverage used to price it so high; the valuation is kind of cooked up. Nvidia has great hardware, but the price is not right.
They're just good at locking in users with shiny proprietary features, whereas their competitors fail to do the same. Where else would I get 3D Vision support but on Nvidia? Even when they deprecate features like the aforementioned 3D Vision, it's still hanging around in the professional driver. CUDA isn't even that special, but the platform lock-in definitely is.
too bad Steve Jobs never had a mustache
his greatest flaw
Now it's the world's most Valuable Company...mind blowing 😮. Glad I own it😊
Great topic! A dive into the benefits of specialization, when paired with cooperative-competition among other industry specialists, to push technology further and further. I hope this VOD is a home run, great current events information
When I got into investing during the pandemic I didn't have any knowledge about the stock market. I decided to pick what's going to be in high demand in the future, and I thought about AI. I just googled what would be good AI stocks and Nvidia came up. I dollar-cost averaged and didn't imagine this stock would blow up in just a couple of years.
AI clone: “Escape…alt-f4… shutdown! Logout! Exit!”
Phil, I want you to explain why this video is a few years late. It should have been made a couple of years ago, when Nvidia had that much potential, not now when it's hit the 3 trillion market cap.
i'm not a prognosticator!
0:18 the remake of Multiplicity I never knew I wanted so badly
is the robot glitching or cheersing a fellow robot on a great video at the end? (coffee maker) Either way it makes sense. Keep it up!
haha perhaps both
I remember doing a social studies stocks test in 8th grade and putting 20% of an imaginary 1000 euros (which is ~1050 dollars) into Nvidia right before the summer break, and seeing how much it has grown really puts a smile on my face. Even though I didn't use real money, just a test calculation, I feel like I won big money.
WTF, is that Musk guy rich? His cars don't sell even an 18th as much as Toyota's!
Why, during the downturn, did it drop one of the hardest, and why is it so quickly climbing back up?
Okay, so those are the 2 guys one would theoretically need to stop if they went back in time to halt the AI emergence at the source. Got it.
unfortunately john conner loves geforce
It’s Connor. You were probably thinking about the Conners.
AI AI AI, AI AI AI AI, AI AI AI AI, AI.
PUMP and DUMP...
That's all you will get here...
Why are people so gung ho on allowing artificial intelligence and super fast chips to dictate their future....?
It's simple. Loose monetary conditions, a weak federal reserve and an AI bubble are good ways to be overvalued.
Nice video, friend, but my problem is that the market structure keeps having fluctuations and is causing quite an issue for us investors, especially with today's rapid changes and complexities. But it's quite favorable to options traders who understand the dynamics of the market.
Brilliant, interesting and fun! You’re fantastic Phil! Thank you for all your hard work!
Hey look a hipster LARPing as a nerd. Jokes aside good intro video for those previously unaware of Nvidia.
"NVideos": Same humor wavelength here 😅.
Great video as always.
member when nvidia used to make motherboard chipsets? and you needed a motherboard with a specific chipset to use SLI? and remember goddamn SLI?
So, a lot of this seems like the best sort of success story, an accidental one. In the late 1970s, IBM began experimenting with Reduced Instruction Set Computers, or RISC. The theory was, by making the machine-code for the internal procedures on a CPU simpler (as opposed to specialized), the chip architecture can be optimized to run more computations faster (the assembler/compiler would then transform a single more complicated instruction into several smaller but faster to compute instructions). Companies like Silicon Graphics made their name in constructing powerful new computers for high-end clients, only to gradually be edged out by cheaper personal computers that were becoming faster.
Originally, the graphics card was just a specialized bit of hardware designed to (very quickly) compute a lot of similar data all at once. What companies like NVIDIA stumbled across was a new way to think about computing altogether: simultaneous computation of many similarly structured problems. Video game graphics, cryptocurrency blockchains, and artificial intelligence all thrive on this sort of distributed, parallel computation. It opens up a new era in computer science.
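That "many similarly structured problems at once" idea can be sketched in plain Python. This is a toy illustration only — a thread pool standing in for GPU lanes; real GPU kernels don't work this way, but the shape of the work is the same:

```python
# Data parallelism in miniature: the same simple operation applied
# independently to every element, which is exactly the shape of work
# (pixels, hashes, neuron activations) that maps well onto GPU cores.
from concurrent.futures import ThreadPoolExecutor

def shade_pixel(value):
    # Stand-in for a per-pixel kernel: cheap, independent, no shared state.
    return min(255, value * 2)

pixels = [10, 40, 90, 130, 200]

# Sequential version: one element at a time.
sequential = [shade_pixel(p) for p in pixels]

# "Parallel lanes" version: same result, because each work item is
# independent of the others — the property GPUs exploit at huge scale.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(shade_pixel, pixels))

assert parallel == sequential
```

The key design property is independence: because no element's result depends on another's, the hardware is free to compute all of them simultaneously.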
My stock in them is still going down day after day 😂😅
Aged well
both companies, amd and nvidia, are led by taiwanese-americans from the same extended family. this is only us tech cus it was made in the us, not by the rest of the west
Great Video! Subscribed! NVIDIA is the future of technology. If you are astute enough to buy and hold this stock you will be wealthy!
Wym why? 😂 They've been dominating one entire section of chips for almost 2 decades now
"Magno-electritism"
"Artificial intelligence"
"Mcgriddles"
"Infered laser thermometers"
I love my NVIDIA video card. Never done me wrong.
What's Nvidia worth without TSMC?
i'm sure they could pivot but...it seems hard!
I work in the industry and know Nvidia's story pretty well. Have to commend you for breaking down the history and industry in a very approachable way. You clearly did your research.
The discussion of fabless at the beginning was especially good and something people don't appreciate about how the industry has changed compared to decades ago. Also the introduction of CUDA from the early days before DL became a thing.
AI is just a buzz word that helps to sell more "NVIDIA stuff"
Like GOLD is a buzz word to sell more shovels
the real value comes in the potential use of these chips
More shovels = better digging
more chips = more ...
first it was parallel compute in research
then it was parallel compute in gaming
then it was parallel compute in crypto
then it was parallel compute in ai
so the real question is what can you do with parallel compute
what can you do BETTER now and even better tomorrow
and what will it even enable?
and when it will enable a new market with new products its value can increase dramatically
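A toy illustration (my own, not from the video) of why the same chips serve research, gaming, crypto, and AI: each workload boils down to simple arithmetic repeated independently over big arrays. The dot product below is a building block of both 3D transforms and neural-network layers.

```python
def dot(a, b):
    # Every a[i] * b[i] is independent and could run on its own GPU core;
    # only the final sum requires any coordination between them.
    return sum(x * y for x, y in zip(a, b))

# A tiny "layer": two dot products with no dependency between them,
# so they could execute fully in parallel.
weights = [[1, 2], [3, 4]]
x = [10, 20]
outputs = [dot(row, x) for row in weights]
print(outputs)  # [50, 110]
```

Swap in millions of rows and you have, in miniature, the matrix math behind all four of the markets listed above.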
It will increase productivity in all sectors because it has to. With those birthrates we are doomed; we will have 60 elderly out of 100 people by 2100. We have until then to figure out how to use AI, or many will die of starvation when the workers aren't able to feed the elderly.
Hey man, you make nice videos, but you're going too far with the word "fabless". Fabless (*fabrication*-less) just means that nVidia only designs their chips but doesn't print them. It has nothing to do with simulating real-life or training robots. There is no fabless universe. I think you've just extrapolated NVidia's strategy of virtually prototyping their chips to something that "fabless" isn't.
i don't know - i think it makes sense that the importance of emulation to their success would have convinced them that emulation could be a similar panacea for other industries. i do sorta think the omniverse fits that fabless universe definitions. but i do get it's a stretch.
💚🖤📈🇺🇸. nVidia is the GOAT 🐐. Jensen 🙏
It is funny how intimidating certain computing concepts are to many. I would like to say that NVidia's chip manufacturing process is not that difficult to explain, and using a potato chip to describe what NVidia does doesn't simplify anything; it "dumbs it down". It is an abstraction so far removed from the underlying truth that the only people who could understand the abstraction are the very people who understand the actual process. All a simplification gag, such as pulling out a potato chip, does is reinforce in the minds of the many that this is complex and too hard for them to understand. I say let them decide that: give an attempt to explain the reality of what NVidia does, and if you as the author of the video think it is too boring or not compelling enough for the narrative, that is understandable; remove the "dumbing down" from the video entirely.
yeah, i think you have a bit of a point, but at the same time, i think that if you start getting into 5 minutes on stuff like this (en.wikipedia.org/wiki/List_of_semiconductor_scale_examples) you're going to lose people, especially if they're trying to get a grounding on nvidia.
nvidia makes specialized processors for highly parallelizable workloads like graphics and ai. why didn't you just ask me dude, i could have told you that in the first place
dang coulda saved a few weeks
@@PhilEdwardsInc cuda saved*
*ba-dum-tss*
I learned a lot, well done video!
everyone knows nvidia, what agenda are you pushing?
They gonna be the leading company for ai chips, easy
Crypto really boosted their company, with perfect timing for AI to become mainstream to keep their stock price going upward. I did hear that their markups on chips are so crazy (like 800% profit on each chip) that companies like Microsoft and Amazon want to design and use their own chips to bring costs down, so we will see what happens in the long run
There's also this consistent idea that Nvidia does not work well with other companies. Apple forsook them from Macs forever back in the early 2010s; Microsoft and Sony collaborate with AMD for their consoles. Only Nintendo is a true long-term partner. Outside of *you buy our cards and stick 'em in a box*, Nvidia does shockingly little with other companies. The Apple of graphics.
Skynet just might be nvidia in disguise.
First graphics card I had, in early 2000, was an Nvidia Riva TNT2 Ultra with 32 MB; it was leaps and bounds ahead at the time.
Their 3000 series cards were good, but gaming is such a small percentage of their business now that they're neglecting it. Sad times.
Ans: Bubble
Saved you a lot of time
Didn't need to watch it; it is ofc a bubble. If that's what was shown in the video, then the author is correct.
Can we talk about how much your clones remind me of some modern Twin Peaks meets I, Robot? I need the next episode already.
That's a really cool and interesting video, Mr. Edwards, it really is, but that's a really long way of saying "AI boom"
ai boom isn't enough though!
Just the idea that "math!" is a punchline
I love the video Phil. We might need to do an intervention on how you eat chips though.
i don't eat chips like that but if i ever get cheetos i will be tempted...
@@PhilEdwardsInc part of the fun of cheetos is getting your fingers covered in bright orange cheesy dust, then being in a constant state of potential destruction if you ever touch anything, with the glorious finish of licking the fingers clean.
My parents harangue me for licking my fingers.
@@ryanortega1511 they are wrong. Licking the finger is the proper etiquette.
I was told it’s because they’re susceptible to germs. My mother is a doctor, so she has it on some authority.