Google TPU & Other In-House AI Chips

  • Published: 2 Jan 2025

Comments •

  • @TuxikCE
    @TuxikCE 1 year ago +59

    You really deserve more subs. Your videos are the most technical tech videos I have ever seen.

    • @theminer49erz
      @theminer49erz 1 year ago

      Word!! He has come a long way though! Although it would be great if he gets as many subs as he deserves, it may mean he won't be able to reply to as many comments as he does, but that's me being selfish lol.

    • @christophermullins7163
      @christophermullins7163 1 year ago

      Yeah that's why I subscribed. 😮

    • @maynardburger
      @maynardburger 1 year ago +2

      They're not even *that* technical, but they're the perfect level of technical for armchair enthusiasts. He also explains things very well. But more so, he's not sensationalist at all and actually grasps the things he's saying, so whenever he gets into more of a personal-analysis situation, he's generally making good points and not just flying off the handle like you often see with others.

    • @HighYield
      @HighYield  1 year ago +6

      @@maynardburger Thank you, I honestly think you perfectly described what I am going for. I'm not a semi engineer, so I actually have less technical knowledge than, for example, someone like Ian Cutress on his channel. I often need a lot of time and research to fully understand something, but once I do, I think I can explain it very well, because I had to put in the time. If I were a semi engineer I might gloss over basic things because I would consider them general knowledge.

    • @odytrice
      @odytrice 1 year ago

      I concur.

  • @tomashrivnak5655
    @tomashrivnak5655 1 year ago +1

    The production value of this video is incredible.

  • @builtofire1
    @builtofire1 1 year ago +13

    The chip is called V1 for a reason: Meta starts counting from 1, not from 0.

  • @D.u.d.e.r
    @D.u.d.e.r 10 months ago +3

    Another excellent analysis! This channel is awesome - SUBSCRIBED!🤘

  • @haoshan7253
    @haoshan7253 1 year ago

    Thank you for the very technical videos. Keep up the good work!

  • @maneeshs3876
    @maneeshs3876 26 days ago

    Nice video. The ants-vs-elephant approach is similar to the early days of parallel computing.

  • @johanneskingma
    @johanneskingma 1 year ago +3

    0:12 that should've been a clip from Satya Nadella.

    • @HighYield
      @HighYield  1 year ago

      You are right, but a lot more people know Bill Gates.

  • @ralllao7295
    @ralllao7295 1 year ago +2

    Wow, just discovered the channel. Amazing work so far! :)

    • @HighYield
      @HighYield  1 year ago

      Glad you like my videos!

  • @jamesrosenberg1612
    @jamesrosenberg1612 1 year ago +5

    I tried the beta for Apple's AI autocorrect, and it's good! Same thing with the AI-powered speech-to-text for dictation. It's quite impressive!

  • @woolfel
    @woolfel 1 year ago +9

    I've switched over to using an M1 Max and M2 Max for training small computer-vision models. No point forking over $3,500 for two RTX 4090s when I can get an M2 Max with 96 GB of memory, in a full computer, for the same price. Nvidia has gotten too greedy and I hope the market smacks them straight.

  • @boot-camp2197
    @boot-camp2197 1 year ago +3

    Another small AI chip company developing an accelerator is GSI Technology. May be worth a look.

  • @maynardburger
    @maynardburger 1 year ago +8

    The problem all these companies face is that Nvidia themselves aren't sleeping, and the nature of this work allows their current lead to be leveraged into future advantages as well. Efficiency and cost of running are obviously important, but if you want your AI software solutions to be the world leader (where there are many riches to be had), then you will justify spending for the best hardware you can, costs be damned. That said, as a gaming enthusiast, I'm absolutely hoping Nvidia does get knocked down a peg by these companies and other growing AI chip businesses. Would be nice for Nvidia to remember where their bread has always been buttered so they can stop trying to screw us over.

    • @alwanexus
      @alwanexus 1 year ago +1

      They all do pay for the best hardware: TSMC 3 nm, 4 nm. NVDA doesn't manufacture the best hardware themselves. In the big picture, I'm pretty sure every tech giant wants to avoid being captive to any company if they can, so it makes sense they'll have their own internal solutions as well. And if it's a matter of having the tech or not, then it's understandable that costs are secondary. But if there are cheaper solutions, I'm sure shareholders and customers do care. After all, the costs just get passed to the customer and then their customer and so on, so in the end, are you willing to pay more?

    • @kaystephan2610
      @kaystephan2610 1 year ago

      "Cost be damned"
      No... not really. If you can make a chip of just SIMILAR performance for a much cheaper price, you will use your own stuff instead of buying Nvidia. Of course research also eats up loads of budget, but an H100 reportedly costs only around $3,200 to produce. And since a lot of these non-Nvidia chips are smaller and on cheaper processes, they are even cheaper, maybe only $2,000 or $2,500. That is a gigantic difference from the $45,000 that Nvidia is demanding for its H100. So even if Google, Microsoft, etc. produce their own chip that has a quarter of the performance at half the wattage, it would STILL be much cheaper to manufacture four of your own chips to match one of Nvidia's chips, even if those 4 chips use 2x the power of that one Nvidia chip, because you're literally saving tens of thousands of dollars.
      I mean, just do the math (see the back-of-the-envelope sketch after this thread).
      Let's say Google has a chip with the aforementioned performance: ¼ of the performance for ½ the wattage.
      4 of those chips then provide the power of one Nvidia chip. Same performance, but using 1,400 W instead of 700 W. Too expensive in the long run? No. Even running 24/7, the extra draw is 0.7 kW × 24 h = 16.8 kWh/day, and 16.8 kWh × 365 ≈ 6,132 kWh/year.
      At a price of $0.10/kWh that's ~$600 more in electricity per year. But they saved $20-30K by not buying Nvidia. So even after 10 years the extra power draw would be MUCH cheaper than just buying one Nvidia chip.
      It really does make sense for them, even if THEIR hardware isn't the best, which can also change, because all of these companies are trillion-dollar mega-corporations.

    • @user-lp5wb2rb3v
      @user-lp5wb2rb3v 6 months ago

      As a gaming enthusiast, what you really want is AMD to come out on top, you just don't know it yet.
      The ultimate gaming PC is a PS5 that fits in an AM5 socket.
      The gaming future will be all about APUs.
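
    A back-of-the-envelope sketch (in Python) of the math in the thread above. Every figure in it (the $45,000 H100 price, the ~$2,500 custom-chip cost, the wattages, and the $0.10/kWh rate) is the commenter's assumption, not a verified cost:

      # TCO sketch using the commenter's assumed numbers (not verified costs).
      H100_PRICE = 45_000            # assumed Nvidia H100 street price, USD
      CUSTOM_CHIP_PRICE = 2_500      # assumed in-house chip cost, USD
      CHIPS_PER_H100 = 4             # custom chips needed to match one H100
      H100_POWER_KW = 0.7            # 700 W
      CUSTOM_POWER_KW = 0.35         # half the wattage at a quarter the performance
      USD_PER_KWH = 0.10             # assumed electricity price
      HOURS_PER_YEAR = 24 * 365

      extra_kw = CHIPS_PER_H100 * CUSTOM_POWER_KW - H100_POWER_KW        # 0.7 kW
      extra_usd_per_year = extra_kw * HOURS_PER_YEAR * USD_PER_KWH       # ~$613
      upfront_savings = H100_PRICE - CHIPS_PER_H100 * CUSTOM_CHIP_PRICE  # $35,000

      print(f"Extra electricity: ~${extra_usd_per_year:,.0f}/year")
      print(f"Up-front savings:   ${upfront_savings:,}")
      print(f"Break-even after:  ~{upfront_savings / extra_usd_per_year:.0f} years")

    Under those assumptions, the four cheaper chips burn about $613 more in electricity per year but save $35,000 up front, so the H100's power advantage would take roughly 57 years to pay for itself.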

  • @stachowi
    @stachowi 1 year ago +9

    So in summary, Nvidia is a stopgap for its biggest customers' AI training... as soon as they develop their own chips, they're going to drop Nvidia like a bad habit.

    • @musaran2
      @musaran2 1 year ago +3

      Nvidia is still good for more general processing.
      But even there, yes, I too expect Nvidia to be badly sidelined.

  • @danimatzevogelheim6913
    @danimatzevogelheim6913 1 year ago +3

    Once again a tech feast! Highly informative and superbly put together! Top notch!

  • @stefanbuscaylet
    @stefanbuscaylet 1 year ago +1

    I had to smile when I liked, commented, and added your video to a playlist just now so I could increase your exposure to YouTube's recommendation model. I suspect YouTube's recommendation engine is tensor-based. Another thought: I think you're correct that the gold-rush mentality around AI, with Nvidia and their insane stock price, isn't scalable. Loving your channel, and thank you!

  • @odytrice
    @odytrice 1 year ago +11

    I think Nvidia will remain on top for at least the next 5 years. Not only because of the hardware, which is a strong reason, but also because of the software support: CUDA, to be precise. Most developers starting out in ML learn and test using GeForce GPUs, and libraries that depend on CUDA are pretty much what everyone learning ML learns to use. Once these developers get employed, companies are more likely to go with Nvidia GPUs because their ML engineers are already familiar with them. I believe this foothold isn't going away any time soon.

    • @marktwain5232
      @marktwain5232 8 months ago

      I tend to agree with you because of the "CUDA entry learning" factor. But keep your eye on Jim Keller and the RISC-V-based Tenstorrent.

  • @pw1169
    @pw1169 1 year ago +7

    Need more TSMC shares :D

  • @jihadrouani5525
    @jihadrouani5525 1 year ago +3

    One of the best, most informative, and most relevant videos for our time, when companies are racing to implement AI in their services.

    • @HighYield
      @HighYield  1 year ago +3

      Thank you! Interestingly, there are not a lot of videos covering this topic.

  • @RM-el3gw
    @RM-el3gw 1 year ago +4

    Interesting. Didn't know so many companies were working on their own tech to this extent.

    • @HighYield
      @HighYield  1 year ago +3

      Yeah, it also surprised me during my research. Big Tech is going all in.

  • @daimanfatan
    @daimanfatan 1 year ago

    Very good in-depth analysis.

  • @bartios
    @bartios 1 year ago +3

    It's very interesting to see everyone doing custom hardware. Because of where FAANG expects to add value with AI, costs are very, very important to them: if you have cheaper inference, you can make more margin or run better/smarter networks than your competitor. This, combined with their insane scale, incentivizes them to develop alternatives to Nvidia. It also means that AI hardware could be commoditized much quicker than if all of them just bought Nvidia hardware. We'll have to see what happens, but I wouldn't be surprised if in 5 years inference is a completely commoditized low-margin business and Nvidia is trying to defend its number-one position in the higher-margin training business with everything it has.

    • @HighYield
      @HighYield  1 year ago +4

      It’s gonna be a tough fight for Nvidia. Very interesting times ahead of us.

  • @builtofire1
    @builtofire1 1 year ago +2

    How nice that Tesla is measuring their chips in length of wire, so the question is: who's longer?

  • @aditisingh7606
    @aditisingh7606 8 months ago

    Great TPU Pods

  • @dr.jameshwang9309
    @dr.jameshwang9309 11 months ago

    Meta's Llama 2 is probably one of the most popular open-source LLMs with downloadable weights. So Meta definitely has some much more powerful models for internal use.

  • @theminer49erz
    @theminer49erz 1 year ago

    Yay! Been looking forward to your new video! I can't finish it now, but I wanted to give you the fast like and comment for the algorithm!! Can't wait to watch it later!!

    • @theminer49erz
      @theminer49erz 1 year ago +1

      I made time lol. Great video! I'm looking forward to seeing what happens to Nvidia. After their ____ show of a Computex keynote and their sad cash-grab GPU launches, it seems like they are probably going down. I'm looking forward to what AMD does, especially with their APUs, hopefully applying AI chiplets to them and potentially making excellent mini/SBC computers that can be used for automation/AI applications in a super small form factor with low energy use! I would hate for all AI hardware to be proprietary, as I would rather design my own than allow Google, Apple, or worse yet FB to have access to my systems just so I can use the tech. I am also hopeful that their APUs will essentially kill the market for consoles like Xbox and PS! Once that happens we will see an explosion of awesome games with incredible innovations, especially when AI gets better at helping with that! UE5.2 has some incredible PCG tools now, along with easy-to-use face/motion-capture tools, and with systems like ChatGPT, games could literally generate content in real time based on what you have done in the game up to that point, informed by the lore/parameters of the game world. As long as we are not fighting Skynet before that can happen lol. Anyway thanks!! Have a great week!

    • @HighYield
      @HighYield  1 year ago +1

      Right now, Nvidia is clearly on top, but it could change rapidly over the next couple of years. The entire AI space is very volatile right now, lots of possibilities for disruption.

    • @Fractal_32
      @Fractal_32 1 year ago

      @@theminer49erz AMD has good GPUs; however, they're constrained by their software, or by poorly understood software.

  • @prashanthb6521
    @prashanthb6521 1 year ago +4

    😂 Year 2030: "Back when we were young there was a very innovative company named Nvidia. They became too greedy and hence died!"

  • @BGTech1
    @BGTech1 1 year ago

    Can you do an analysis of the Tesla self-driving chip (FSD chip)? Die shots are available on the internet.

  • @masumasi
    @masumasi 1 year ago

    Awesome video! Thanks. I wonder where you place Gaudi 1 and Gaudi 2?

    • @HighYield
      @HighYield  1 year ago +2

      Gaudi 2 seems very promising, but I haven't looked at it in enough depth. Gaudi 3 offers a huge jump in performance too.

  • @VideogamesAsArt
    @VideogamesAsArt 1 year ago +2

    Very excited to see Tenstorrent's hardware. Also excited for Nvidia to get sidelined by companies producing their own chips :P

  • @GOODTAGO
    @GOODTAGO 5 months ago +1

    So long-term Google for the win...

  • @hanspeter24
    @hanspeter24 1 year ago +2

    Just got myself an Nvidia Jetson Nano 🔥 Everything AI is just so cool 😎

  • @furtsmagee1513
    @furtsmagee1513 8 months ago

    You are a beast!

  • @klaudialustig3259
    @klaudialustig3259 1 year ago +6

    Please keep the AI computing content coming!

  • @blablabic2024
    @blablabic2024 1 year ago +3

    Who will come out on top? The customer, of course. 😜

    • @mannyc19
      @mannyc19 1 year ago

      I hope not, as those are all Malthusians with trillions to spend on our end.

  • @a.tevetoglu3366
    @a.tevetoglu3366 1 year ago +3

    The user needs to be educated to behave in a non-predictable, or at least less predictable, manner. This probably would include proactive and systematic rejection of the advertising that AI/Meta offers.

    • @HighYield
      @HighYield  1 year ago +1

      Then the next AI model would include these behaviors ;)

    • @a.tevetoglu3366
      @a.tevetoglu3366 1 year ago

      @@HighYield Then only tons of gallium to end it.

  • @gstormcz
    @gstormcz 1 year ago

    Nice vid covering the whole topic... all the main AI chip developers.
    It could happen that Nvidia, facing such strong competition and the growing independence of big potential AI customers, returns its main focus to the gaming GPU market. Although that seems less profitable, the AI market may be a more competitive environment for them than AMD is in gaming GPUs.

  • @lazerusmfh
    @lazerusmfh 1 year ago +3

    Smaller companies are totally destroying it in the performance department, like Helios.

  • @abpdev
    @abpdev 8 months ago

    I think team green is getting the most attention because theirs is the "for everyone" hardware, while the rest will probably keep theirs internal or behind huge paywalls.

  • @glenyoung1809
    @glenyoung1809 1 year ago +4

    Two "not Big Tech" companies to look at for AI accelerators:
    Cerebras and Graphcore...

  • @eugeniustheodidactus8890
    @eugeniustheodidactus8890 1 year ago

    How does Tesla's in-house-designed D1 AI chip compare to the MI300 and H100? I am new to this type of content, but as an AMD and TSLA investor, I am trying to understand more. *new sub*

  • @trashtrashisfree
    @trashtrashisfree 1 year ago

    The 3060 12 GB is the best lower-end machine-learning card, $200 used. The A2000 12 GB uses less power, even compared to an undervolted 3060, but costs $400+. VRAM counts more than speed.

  • @m_sedziwoj
    @m_sedziwoj 8 months ago

    I know this is an old video, but a good example for efficiency is Tesla and their own in-car chip, which has been in their cars for many years. Dojo is popular now, but what Tesla uses in cars is the better example, and it's where Nvidia lost a partner. (They used Mobileye and Nvidia before they designed their own.)

  • @michaelmuller8494
    @michaelmuller8494 29 days ago

    Great video, would love an update. One year later, Nvidia still has 95% market share, it seems.

  • @esyjournal
    @esyjournal 9 months ago

    Love the video. Jensen bringing those sexy chips out of the oven.

  • @kezif
    @kezif 10 months ago

    Imagine if those companies collaborated in chip creation

  • @FrantisekPicifuk
    @FrantisekPicifuk 1 year ago +5

    I think an important point to make, and one you missed, is that Nvidia has attained complete market dominance by combining hardware and software into a unified solution. In today's AI market, one of the most important factors is delivering first. Delivering the product before anyone else allows you to sway customers and establish your product in the market. Now, if you are one of those big companies, you could use your in-house hardware, but it's going to be slower than Nvidia. That's not an argument, that's a fact. And what if you found out that your competition is using Nvidia? If you continue using your in-house hardware, you will not be first to market and your model might not be as accessible to your customers.
    Nobody cares about efficiency at that point, because market capture is the only thing that matters. If you fail at that, who cares if your in-house TPU is 100% more efficient?
    That is the decision, the major driving decision, that makes everyone buy Nvidia: they have the best integrated solution, so they are fastest. And that means if the competition is using them, you will have to use them too, or get used to being late to the party.

    • @sagyamthapa
      @sagyamthapa 1 year ago +3

      For training and experimentation Nvidia is the way, but when you need to deploy those models at large scale, Nvidia hardware becomes very expensive.
      Data scientists at these tech companies use the Nvidia hardware and software stack to train their models because it's easy and fast for running large numbers of experiments. But once a model is ready to be deployed, it will be deployed on ASICs made only for inference.

    • @mannyc19
      @mannyc19 1 year ago

      @@sagyamthapa Dead wrong, those at the top funding all this have more $$$$ than God and every atheist that will ever live. Money is no object, think way, way bigger.

    • @THE-X-Force
      @THE-X-Force 1 year ago +2

      @@mannyc19 They don't have all that money by wasting it. The OP, who thinks efficiency (power usage) isn't a driving factor, is, like you, very ignorant about how decisions like these get made. Stock valuation is everything. There isn't any actual physical worth. They don't have "money" (which is fiat regardless)... they have VALUE. Perceived value. You have a lot to learn.

  • @couldntfindafreename
    @couldntfindafreename 1 year ago +1

    What's the only viable AI accelerator you can actually buy for your PC today? Yeah, Nvidia GPUs... (There is also AMD with ROCm, if you want to waste a lot of time.)

  • @Elegant-Capybara
    @Elegant-Capybara 7 months ago +1

    Right now Meta is leading everyone, including ChatGPT and even its newer models, and Google isn't even in the conversation.

  • @rcubillo4204
    @rcubillo4204 1 year ago

    Very interesting... and what about Graphcore from the UK?

    • @HighYield
      @HighYield  1 year ago

      I know there are a lot more custom AI chips around, but this is the first time I've heard about Graphcore. Sounds super interesting (just checked out their website).

    • @rcubillo4204
      @rcubillo4204 1 year ago

      @@HighYield They are funded with 700 M€, and it is a real 3D chip. The next move for them will be to put it on an interposer close to HBM. Yes, it's a great technology that can potentially do very well. My BR

  • @Steamrick
    @Steamrick 10 months ago

    Couldn't Nvidia do a dedicated AI chip by producing a "GPU" with only Tensor cores?

  • @idiomaxiom
    @idiomaxiom 1 year ago

    Having 100 FLOPS and 128 GB of RAM vs. 1,000 FLOPS and 80 GB of RAM means you are doing different classes of LLM work for different goals.
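
    A rough sketch of the memory side of that tradeoff. The 2 bytes per fp16 parameter and the ~20% headroom for KV cache and activations are common rules of thumb, not exact requirements:

      # Largest model whose fp16 weights fit on a device, leaving ~20%
      # headroom for KV cache and activations (both are rules of thumb).
      def max_params_billion(mem_gb: float,
                             bytes_per_param: float = 2.0,
                             overhead: float = 1.2) -> float:
          """Largest model size (billions of params) that fits in mem_gb."""
          return mem_gb / (bytes_per_param * overhead)

      for mem_gb in (80, 128):
          print(f"{mem_gb} GB device: up to ~{max_params_billion(mem_gb):.0f}B params at fp16")

      # 80 GB  -> ~33B params; 128 GB -> ~53B params. The smaller-memory
      # device may be far faster, but it cannot hold the larger model
      # without quantization or sharding, hence "different classes of work".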

  • @nike5428
    @nike5428 1 month ago

    Google just released the Trillium TPU.

  • @Phil-D83
    @Phil-D83 1 year ago

    If Intel would optimize its drivers, their Arc could be good for AI, given the cost.

  • @bharatgroverkuuni171290
    @bharatgroverkuuni171290 1 month ago

    Apple has been making their custom chips for over a decade now.

  • @Joker-no1fz
    @Joker-no1fz 1 year ago +1

    Pretty sure AI will come out on top when it destroys humanity.

  • @nanometer7
    @nanometer7 7 months ago

    NVDA is the king of the world; the only one that can kill NVDA is NVDA itself, using its own compute to find the new king of the world.

  • @jelipebands1700
    @jelipebands1700 1 year ago +2

    I don't know who will come out on top, but I do know who will be on the bottom. That's right: us. We are living in a world where every word we say and every place we go will be used to sell us something.

    • @mannyc19
      @mannyc19 1 year ago

      No. The 0.01% don't want to sell us stuff anymore, they want most of us dead. Hence the Great Reset and Carbon Zero insanity and the obsessions with self-driving cars (food delivery, think truckers' strike...) and AI.

    • @maynardburger
      @maynardburger 1 year ago +1

      We basically already live in that world. Though I agree that recommendation engines are really just the lousiest and most disgusting use of AI. That said, people really need to start learning to curb their consumerist habits, be it products or services. We are still ultimately responsible for our own behavior.

  • @cmdrwhiskeygalore3044
    @cmdrwhiskeygalore3044 1 year ago

    You missed out IBM

    • @HighYield
      @HighYield  1 year ago +1

      Oh I missed out on many more custom AI chips. There will be more videos on AI hardware in the future :)

  • @ristopoho824
    @ristopoho824 8 months ago

    I really do not like Meta, but their R&D section is doing real wonders. A lot of companies are like that. Eh, I guess Google too. I would love to work for them. But dang, they are evil.

  • @ikramramli6410
    @ikramramli6410 4 months ago

    Meta knows that I only use Reels to watch brain rot and absurd humor videos.

  • @Dmwntkp99
    @Dmwntkp99 1 year ago

    It seems Nvidia's AI M&S ambition might be short-lived.

  • @ukhalid238
    @ukhalid238 1 year ago

    Nvidia stock was around $470 with a $1.2 trillion market cap; stupid Wall Street analysts are falling over each other to upgrade their rating to Buy at a super-high valuation.

  • @kirankumarsukumar
    @kirankumarsukumar 11 months ago

    It's very costly to build models using Nvidia AI chips. It's not sustainable for these companies to continue using Nvidia. If you don't build your own chips, then you are done for.

  • @mr.libluckiestinfinitebene2589
    @mr.libluckiestinfinitebene2589 1 year ago +1

    How to ruin humans' jobs.

  • @MaxKrumholz
    @MaxKrumholz 1 year ago

    SORRY ONLY AMD DID IT NOW

  • @wiedenn
    @wiedenn 1 year ago

    All that you showed are outdated techs. We have a better solution, but we struggle to find an investor with $1M USD.

    • @HighYield
      @HighYield  1 year ago

      What’s your solution? Which company are you talking about?

  • @HuntaKiller91
    @HuntaKiller91 1 year ago

    Enti benung berapi bejimat
    Bala ke nginsap naka ulh anang ba ruai ba luar ja mh

  • @ChristopherPisz
    @ChristopherPisz 11 months ago

    Whatever algorithms Meta is using to put content in my feed, I can tell you they are a miserable failure. I don't even keep Facebook open anymore. All I see is AI-generated celeb photos and political arguments. I haven't seen a funny monkey video in more than a decade. If the AI knew what it was doing, it would show me funny monkey videos.

  • @Capeau
    @Capeau 1 year ago

    Nvidia is so overrated.

  • @simplemechanics246
    @simplemechanics246 1 year ago

    An ASIC beats Nvidia in efficiency by hundreds of times. Sure, it is good for one thing only, but it does that one thing incredibly fast and cheap. Nvidia is garbage.

  • @PravdaSeed
    @PravdaSeed 7 months ago

    🧞 Thanks