You really deserve more subs. Your videos are the most technical ones I've ever seen among tech channels.
Word!! He has come a long way though! Although it will be great if he gets as many subs as he deserves, it may mean he won't be able to reply to as many comments as he does, but that's me being selfish lol.
Yeah that's why I subscribed. 😮
They're not even *that* technical, but they're like the perfect level of technical for armchair enthusiasts. He also explains things very well. But more importantly, he's not sensationalist at all and actually grasps the things he's saying, so whenever he gets into more of a personal analysis situation, he's generally making good points and not just flying off the handle like you often see with others.
@@maynardburger Thank you, I honestly think you perfectly described what I am going for. I'm not a semi engineer so I actually have less technical knowledge than for example someone like Ian Cutress on his channel. I often need a lot of time and research to fully understand something, but once I do, I think I can explain it very well, because I had to put in the time. If I was a semi engineer I might glance over basic things because I would consider them general knowledge.
I concur.
The production value of this video is incredible.
The chip is called V1 for a reason: Meta starts counting from 1, not from 0.
Why's that
Another excellent analysis! This channel is awesome - SUBSCRIBED!🤘
Thank you for the very technical videos. Keep up the good work!
Thank you for the tip!
Nice video. The ants-and-elephant approach is similar to the early days of parallel computing.
0:12 that should've been a clip from Satya Nadella.
You are right, but a lot more ppl know Bill Gates.
Wow, just discovered the channel. Amazing work so far! :)
Glad you like my videos!
I tried the beta for Apple's AI autocorrect, and it's good! Same thing with the AI-powered speech-to-text for dictation. It's quite impressive!
I've switched over to using the M1 Max and M2 Max for training small computer vision models. No point forking over $3,500 for two RTX 4090s when I can get an M2 Max with 96GB of memory, in a full computer, for the same price. Nvidia has gotten too greedy and I hope the market smacks them straight.
Another small AI chip company developing an accelerator is GSI Technology. May be worth a look.
The problem all these companies face is that Nvidia themselves aren't sleeping, and the nature of this work allows their current lead to be leveraged into future advantages as well. Efficiency and cost of running are obviously important, but if you want your software AI solutions to be the world leader (where there are great riches to be had), then you will justify spending for the best hardware you can, costs be damned. That said, as a gaming enthusiast, I'm absolutely hoping Nvidia does get knocked down a peg by these companies and other growing AI chip businesses. Would be nice for Nvidia to remember where their bread has always been buttered so they can stop trying to screw us over.
They all do pay for the best hardware, TSMC 3nm, 4nm. NVDA doesn't manufacture the best hardware themselves. In the big picture, I'm pretty sure every tech giant does not want to be captive to any company if they can avoid it, so it makes sense they'll have their own internal solutions as well. And if it's a matter of having the tech or not, then it's understandable the costs are secondary. But if there are cheaper solutions, I'm sure shareholders and customers do care. After all, the costs just get passed to the customer and then their customer and so on, so in the end are you willing to pay more?
"Cost be damned"
No... not really. If you can make a chip of just SIMILAR performance but for a much cheaper price, you will use your own stuff instead of buying NVIDIA. Of course research also eats up loads of budget, but an H100 reportedly only costs around $3,200 to produce. And since a lot of these non-NVIDIA chips are smaller and on cheaper processes, they are even cheaper, maybe only $2,000 or $2,500. That is a gigantic difference to the $45,000 that NVIDIA is demanding for their H100. And so even if Google, Microsoft etc. produce their own chip that has a quarter of the performance at half the wattage, it would STILL be much cheaper to manufacture four of your own chips to match one of NVIDIA's chips, even if those 4 chips use 2x the power of that one NVIDIA chip, because you're literally saving tens of thousands of dollars.
I mean just do the math.
Let's say Google has a chip with the aforementioned performance. ¼ of the performance for ½ the wattage.
4 of those chips then provide the power of one NVIDIA chip. Same performance, but using 1,400W instead of 700W. Too expensive in the long run? No. Because even if they run 24/7, it's 0.7 kW × 24 hrs = 16.8 kWh/day, and 16.8 × 365 = 6,132 kWh/year.
At a price of $0.10/kWh, that's ~$600 more in electricity per year. But they saved $20 to $30K by not buying NVIDIA. So even after 10 years the extra power draw would be MUCH cheaper than just buying one NVIDIA chip.
It really does make sense for them. Even if THEIR hardware isn't the best, which can also change because all of these companies are trillion dollar mega corporations.
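(For anyone who wants to sanity-check that back-of-envelope math, here's a minimal Python sketch. All the numbers are the assumptions from the comment above — the $45K H100 price, the $2,500 per-chip cost, the 4:1 chip ratio, and $0.10/kWh — not verified figures.)

```python
# Sanity check of the comment's numbers (all figures are the commenter's
# assumptions, not confirmed pricing).
H100_PRICE = 45_000      # assumed NVIDIA H100 price, USD
IN_HOUSE_COST = 2_500    # assumed per-chip manufacturing cost, USD
CHIPS_NEEDED = 4         # four in-house chips to match one H100
EXTRA_POWER_KW = 0.7     # 1400 W vs 700 W -> 700 W of extra draw
PRICE_PER_KWH = 0.10     # assumed electricity price, USD/kWh

capex_saved = H100_PRICE - CHIPS_NEEDED * IN_HOUSE_COST      # $35,000
extra_kwh_per_year = EXTRA_POWER_KW * 24 * 365               # 6,132 kWh at 24/7
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH     # ~$613

print(f"Upfront savings:       ${capex_saved:,.0f}")
print(f"Extra power cost/year: ${extra_cost_per_year:,.0f}")
print(f"Years to break even:   {capex_saved / extra_cost_per_year:.0f}")  # ~57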
As a gaming enthusiast, what you really want is AMD to come out on top, you just don't know it yet.
The ultimate gaming PC is a PS5 that fits in an AM5 socket.
The gaming future will be all about APUs.
So in summary, NVIDIA is a stopgap for all of its biggest customers' AI training... as soon as they develop their own chips, they're going to drop NVIDIA like a bad habit.
Nvidia is still good for more general processing.
But even there, yes, I too expect Nvidia to be badly sidelined.
Once again a tech feast! Highly informative and superbly produced! Top notch!
I had to smile when I liked, commented, and added your video just now to a playlist so I could increase your exposure to YouTube's recommendation model. I suspect YouTube's recommendation engine is tensor based. Another thought: I think you're correct that the gold rush mentality of AI with NVIDIA and their insane stock price isn't scalable. Loving your channel and thank you!
I think Nvidia will remain on top for at least the next 5 years, not only because of the hardware, which is a strong reason, but also because of the software support: CUDA, to be precise. Most developers starting out in ML are learning and testing on GeForce GPUs. Libraries that depend on CUDA are pretty much what everyone who is learning ML learns to use. Once these developers get employed, companies are more likely to go with Nvidia GPUs because their ML engineers are already familiar with them. I believe this foothold isn't going away any time soon.
I tend to agree with you because of the "CUDA entry learning" factor. But keep your eye on Jim Keller and the RISC-V based Tenstorrent.
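(To illustrate the CUDA lock-in point above: nearly every mainstream ML tutorial starts with PyTorch boilerplate like the sketch below, which treats an NVIDIA card as the default and everything else as the fallback. The toy model is just an illustration.)

```python
import torch

# The line almost every ML tutorial opens with: use CUDA (i.e. an NVIDIA
# GPU) if present, otherwise fall back to CPU. Alternative backends such
# as ROCm or Apple's MPS require deliberate, less-documented setup.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)  # toy model, moved to the device
x = torch.randn(32, 128, device=device)      # a random input batch
print(model(x).shape, "on", device)          # torch.Size([32, 10]) on cuda/cpu
```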
Need more TSMC shares :D
Oh yeah!!!
One of the best, most informative, and most relevant videos for our time, when companies are racing to implement AI in their services.
Thank you! Interestingly, there are not a lot of videos covering this topic.
Interesting. Didn't know so many companies were working on their own tech to this extent.
Yeah, it also surprised me during my research. Big Tech is going all in.
Very Good in-depth analysis.
It's very interesting to see everyone doing custom hardware. Because of where FAANG expects to add value with AI, costs are very, very important to them: if you have cheaper inference, you can make more margin or run better/smarter networks than your competitor. This, combined with their insane scale, incentivizes them to develop alternatives to NVIDIA. It also means that AI hardware could be commoditized much quicker than if all of them just bought NVIDIA hardware. We'll have to see what happens, but I wouldn't be surprised if in 5 years inference is a completely commoditized, low-margin business and NVIDIA is trying to defend their number one position in the higher-margin training business with everything they have.
It’s gonna be a tough fight for Nvidia. Very interesting times ahead of us.
How nice that Tesla is measuring their chips in length of wire. So the question is: whose is longer?
Great TPU Pods
Meta's LLaMA 2 is probably one of the most popular open source LLMs with downloadable weights. So Meta definitely has some much more powerful models for internal use.
Yay! Been looking forward to your new video! I can't finish it now, but I wanted to give you the fast like and comment for the algorithm!! Can't wait to watch it later!!
I made time lol. Great video! I'm looking forward to seeing what happens to Nvidia. After their ____ show of a Computex keynote and their sad cash-grab GPU launches, it seems like they are probably going down. I'm looking forward to what AMD does, especially with their APUs, and hopefully applying AI chiplets to them, potentially making excellent mini/SB computers that can be used for automation/AI applications in a super small form factor with low energy use! I would hate for all AI hardware to be proprietary, as I would rather design my own than allow Google, Apple, or worse yet FB to have access to my systems just so I can use the tech. I am also hopeful that their APUs will essentially kill the market for consoles like Xbox and PS! Once that happens we will see an explosion of awesome games with incredible innovations, especially when AI gets better at helping with that! UE5.2 has some incredible PCG tools now, along with easy-to-use face/motion capture tools, and with systems like ChatGPT, games could literally generate content in real time based off of what you have done in a game up to that point, informed by the lore/parameters of the game world. As long as we are not fighting Skynet before that can happen lol. Anyway thanks!! Have a great week!
Right now, Nvidia is clearly on top, but it could change rapidly over the next couple of years. The entire AI space is very volatile right now, lots of possibilities for disruption.
@@theminer49erz AMD has good GPUs, however they're constrained by their software, or by poorly understood software.
😂 Year 2030: Back when we were young there was a very innovative company named NVIDIA. They became too greedy and hence died!
Can you do an analysis of the Tesla self driving chip (FSD chip)? Die shots are available on the internet
Awesome video! Thanks. I wonder where you place Gaudi 1 and Gaudi 2?
Gaudi 2 seems very promising, but I haven't looked at it in-depth enough. Gaudi 3 is offering a huge jump in performance too.
Very excited to see Tenstorrent's hardware. Also excited for Nvidia to get sidelined by companies producing their own chips :P
Question is when...
So long-term Google for the win...
Just got myself an Nvidia Jetson Nano 🔥 Everything AI is just so cool 😎
You are a beast!
Please keep the AI computing content coming!
Who will come on top? Customer, of course. 😜
I hope not, as those are all Malthusians with trillions to spend on our end.
The user needs to be educated to behave in a non-predictable, or at least less predictable, manner. This would probably include proactive and systematic rejection of the advertising that AI/Meta offers.
Then the next AI model would include these behaviors ;)
@@HighYield Then only tons of gallium to end it.
Nice vid covering the whole topic... all the main AI chip developers.
It could happen that Nvidia, meeting such strong competition and the growing independence of its big potential AI customers, returns its main focus to the gaming GPU market. Although that seems less profitable, the AI market could become a more competitive environment for them than AMD is in gaming GPUs.
Smaller companies are totally destroying it in the performance department, like Helios.
I think Team Green is getting the most attention because they are the "for everyone" hardware, while the rest will probably keep theirs internal or behind huge paywalls.
Two "not Big Tech" companies to look at for AI accelerators.
Cerebras and Graphcore...
How does Tesla's in-house designed D1 AI chip compare to the MI300 and H100? I am new to this type of content, but as an AMD and TSLA investor, I am trying to understand more. *new sub*
The 3060 12GB is the best lower-end machine learning card: $200 used. The A2000 12GB uses less power, even against an undervolted 3060, but costs $400+. VRAM counts more than speed.
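(A rough sketch of why VRAM caps what you can even run, while speed only changes how fast. The 20% overhead factor and the 7B example are illustrative assumptions, not measured numbers.)

```python
def fits_in_vram(n_params: float, bytes_per_param: float, vram_gb: float,
                 overhead: float = 1.2) -> bool:
    """Rough check: weights plus ~20% overhead for activations/buffers
    during inference vs. available VRAM. The overhead factor is a guess."""
    needed_gb = n_params * bytes_per_param * overhead / 1e9
    return needed_gb <= vram_gb

# A 7B-parameter model in fp16 (2 bytes/param) needs ~16.8 GB with overhead:
# too big for a 12 GB card, no matter how fast that card is...
print(fits_in_vram(7e9, 2, 12))    # False
# ...but 4-bit quantized (~0.5 bytes/param) it squeezes into ~4.2 GB.
print(fits_in_vram(7e9, 0.5, 12))  # True
```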
I know this is an old video, but a good example of efficiency is Tesla and their own chip in the cars, which has been in them for many years. Dojo is popular now, but what Tesla uses in cars is the better example, and it's where Nvidia lost a partner (Tesla used Mobileye and Nvidia before designing their own).
Great video, would love an update. 1 year later Nvidia still has 95% market share it seems.
Love the video. Jensen bringing those sexy chips out of the oven!
Imagine if those companies collaborated in chip creation
I think an important point to make, and one you missed, is that Nvidia has attained complete market dominance by combining hardware and software into a unified solution. In today's AI market, one of the most important factors is to deliver first. Delivering the product before anyone else allows you to sway customers and establish your product in the market. Now, if you are one of those big companies, you could use your in-house hardware, but it's going to be slower than Nvidia. That's not an argument, that's a fact. And what if you found out that your competition is using Nvidia? If you continue using your in-house hardware, you will not be first to market and your model might not be as accessible to your customers.
Nobody cares about efficiency at that point, because market capture is the only thing that matters. If you fail at that, who cares if your in-house TPU is 100% more efficient?
That is the major driving decision that makes everyone buy Nvidia: they have the best integrated solution, so they are fastest. And that means if the competition is using them, you will have to use them too, or get used to being late to the party.
For training and experimentation Nvidia is the way, but when you need to deploy those models at large scale, Nvidia hardware becomes very expensive.
Data scientists at these tech companies use the Nvidia hardware and software stack to train their models because it's easy and fast for running large numbers of experiments. But once a model is ready to be deployed, it will be deployed on an ASIC made only for inference.
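(One concrete picture of that train-on-Nvidia, deploy-on-an-ASIC split: models are typically exported to a portable graph format such as ONNX before being compiled by a vendor's inference toolchain. A minimal sketch; the toy model and file name are made up.)

```python
import torch

# Stand-in for a model trained on Nvidia GPUs.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).eval()

example_input = torch.randn(1, 128)

# Export a portable graph; inference-ASIC toolchains generally ingest
# something like this rather than the raw training framework.
torch.onnx.export(model, example_input, "classifier.onnx",
                  input_names=["features"], output_names=["logits"])
```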
@@sagyamthapa Dead wrong, those at the top funding all this have more $$$$ than God and every atheist that will ever live. Money is no object, think way, way bigger.
@@mannyc19 They don't have all that money by wasting it. The OP who thinks efficiency (power usage) isn't a driving factor is, like you, very ignorant about how decisions like these get made. Stock valuation is everything. There isn't any actual physical worth. They don't have "money" (which is fiat regardless)... they have VALUE. Perceived value. You have a lot to learn.
What's the only viable AI accelerator you can actually buy for your PC today? Yeah, nVidia GPUs... (There is also AMD with ROCm if you want to waste a lot of time.)
Right now Meta is leading everyone, including ChatGPT and even its newer models, and Google isn't even in the conversation.
Very interesting... and what about Graphcore from the UK?
I know there are a lot more custom AI chips around, but this is the first time I've heard about Graphcore. Sounds super interesting (just checked out their website).
@@HighYield They are funded with 700 M€, and it is a real 3D chip. Their next move will be to put it on an interposer close to HBM. Yes, it's a great technology that can potentially do very well. My BR
Couldn't Nvidia do a dedicated AI chip by producing a 'GPU' with only Tensor cores?
Having 100 FLOPS and 128GB of RAM vs. 1000 FLOPS and 80GB of RAM means you are doing different classes of LLM work for different goals.
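(A toy illustration of that tradeoff, reading the comment's numbers as TFLOPS: RAM caps the largest model you can hold at all, while FLOPS caps how fast you can serve one that fits. The 2-FLOPs-per-parameter-per-token rule of thumb and the fp16/30B figures are assumptions.)

```python
# Two hypothetical machines, per the comment (units assumed to be TFLOPS).
machines = {
    "big-memory":  {"tflops": 100,  "ram_gb": 128},
    "big-compute": {"tflops": 1000, "ram_gb": 80},
}

BYTES_PER_PARAM = 2   # fp16 weights
MODEL_PARAMS = 30e9   # a 30B model fits in both machines at fp16 (~60 GB)

for name, m in machines.items():
    # Largest model the RAM can hold at all (weights only, fp16).
    max_params_b = m["ram_gb"] * 1e9 / BYTES_PER_PARAM / 1e9
    # Rough compute-bound throughput: ~2 FLOPs per parameter per token.
    tokens_per_sec = m["tflops"] * 1e12 / (2 * MODEL_PARAMS)
    print(f"{name}: holds up to ~{max_params_b:.0f}B params, "
          f"~{tokens_per_sec:,.0f} tok/s on a 30B model")
```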
Google just released the Trillium TPU.
If Intel optimized its drivers, their Arc could be good for AI given the cost.
Apple has been making their custom chips for over a decade now.
Pretty sure AI will come out on top when it destroys humanity.
NVDA is the king of the world. The only one that can kill NVDA is NVDA itself, using its compute to find the new king of the world.
I don't know who will come out on top, but I do know who will be on the bottom. That's right: us. We are living in a world where every word we say and every place we go will be used to sell us something.
No. The 0.01% don't want to sell us stuff anymore, they want most of us dead. Hence the Great Reset and Carbon Zero insanity and the obsessions with self-driving cars (food delivery, think truckers' strike...) and A.I.
We basically already live in that world. Though I agree that recommendation engines are really just the most lousy and disgusting use of AI. That said, people really need to start learning to curb their consumerist habits, be it products or services. We are still ultimately responsible for our own behavior.
You missed out IBM
Oh I missed out on many more custom AI chips. There will be more videos on AI hardware in the future :)
I really do not like Meta. But. Their R&D section is doing real wonders. A lot of companies like that. Eh. I guess google too. I would love to work for them. But. Dang they are evil.
Meta knows that I only use Reels to watch brain rot and absurd humor videos.
It seems Nvidia's A.I. M&S ambition might be short-lived.
Nvidia stock was around $470 with a $1.2 trillion market cap; stupid Wall Street analysts are falling over each other to upgrade their ratings to buy at a super high valuation.
It's very costly to build models using Nvidia AI chips. It's not sustainable for these companies to continue using Nvidia. If you don't build your own chips, then you are done.
How to ruin humans' jobs.
SORRY ONLY AMD DID IT NOW
Everything you showed is outdated tech. We have a better solution, but we're struggling to find an investor with $1M USD.
What’s your solution? Which company are you talking about?
Enti benung berapi bejimat
Bala ke nginsap naka ulh anang ba ruai ba luar ja mh
Whatever algorithms meta is using to put content in my feed, I can tell you it is a miserable failure. I don't even keep Facebook open anymore. All I see is AI generated celeb photos and political arguments. I haven't seen a funny monkey video in more than a decade. If AI knew what it was doing, it would show my funny monkey videos.
Nvidia is so overrated.
An ASIC beats Nvidia's garbage in efficiency by hundreds of times. Sure, it is good for one thing only, but that makes it incredibly fast and cheap. Nvidia is garbage.
🧞 Thanks