Nvidia Reveals Grace Hopper Generative AI Chip (Computex 2023)

  • Published: May 29, 2023
  • At Computex 2023, Nvidia CEO Jensen Huang shows off the company's generative AI chip, Grace Hopper. See what the CEO has to say about the future of accelerated computing and AI.
  • Science

Comments • 343

  • @klausyap
    @klausyap 1 year ago +141

    The way he was holding it and not afraid to drop it, just as brave as Linus.

    • @demonz9065
      @demonz9065 1 year ago +1

      he started the video by saying it's in full production. if he breaks that one it's no big deal. why would he be afraid to drop it?

    • @oldpain7625
      @oldpain7625 1 year ago +4

      @@demonz9065 Well, he's giving a presentation on how innovative the technology is. Dropping it on the ground in front of all those people would not be optimal.

    • @demonz9065
      @demonz9065 1 year ago +1

      @@oldpain7625 it wouldn't be optimal, but when is dropping something delicate optimal? his embarrassment would be the biggest downside to him dropping something that's just gone into full-scale production. there's no real reason for him to be afraid of damaging it

    • @samsonrobertbrewer8235
      @samsonrobertbrewer8235 1 year ago

      Since February 02/2023 until now I'm trying to rebuild the website lol the Microsoft 365 launch onfenrary 3

    • @kneelesh48
      @kneelesh48 1 year ago +2

      He got the power of the leather jacket

  • @grcfalcon
    @grcfalcon 11 months ago +34

    If that isn't the heart of Skynet, I don't know what is.

  • @Beingtanaka
    @Beingtanaka 1 year ago +190

    The crowd was soooo dead, the man just announced the world's fastest supercomputer

    • @thetshadow999animates9
      @thetshadow999animates9 1 year ago +13

      No, it’s not a supercomputer

    • @bengsynthmusic
      @bengsynthmusic 1 year ago +3

      ​@@thetshadow999animates9
      👉🏽 11:37

    • @thetshadow999animates9
      @thetshadow999animates9 1 year ago +9

      @@bengsynthmusic it’d be a stretch to call it a supercomputer, more like a marketing thing. Also, this was unscripted so you can’t really go off of what Jensen says either way.

    • @bengsynthmusic
      @bengsynthmusic 1 year ago +10

      @@thetshadow999animates9
      You can see on the left side that it says AI supercomputer. 11:37 So it's not some off-script slip-up. Plus it does 1 exaflop, which would put it among the top supercomputers. It is no doubt a supercomputer. (A quick back-of-the-envelope check follows at the end of this thread.)

    • @Moltenlava
      @Moltenlava 11 months ago +3

      @@thetshadow999animates9 This is the kind of chip you would find inside a supercomputer; it's designed for server computing racks.
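
      For what it's worth, the 1-exaflop figure argued about above is easy to sanity-check from published per-chip numbers, with the caveat that it refers to low-precision AI throughput (FP8, with sparsity), not the FP64 math used to rank traditional supercomputers. A rough back-of-the-envelope Python sketch; the per-chip figure is the commonly quoted H100 spec, treated here only as an approximation:

```python
# Back-of-the-envelope check of the "1 exaflop" claim for a 256-chip DGX GH200.
# ~3.96 PFLOPS is the commonly quoted H100 FP8 Tensor Core figure *with sparsity*;
# this is an approximation for illustration, not a spec sheet.
chips = 256
fp8_pflops_per_chip = 3.96
total_exaflops = chips * fp8_pflops_per_chip / 1000  # 1000 PFLOPS = 1 EFLOPS
print(f"{total_exaflops:.2f} EFLOPS of FP8 AI compute")  # ~1.01
```

      Whether that makes it "a supercomputer" depends on which precision you count; the TOP500 list is ranked on FP64 LINPACK, which this figure is not.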

  • @natej6671
    @natej6671 1 year ago +94

    I'm looking at that 40k pound beast of a GPU and I'm thinking .... It will be the size of a cellphone in 20 years.

    • @kneelesh48
      @kneelesh48 1 year ago +14

      Next you're gonna say, AWS will be the size of an airpod in 20 years. No, it won't.

    • @kneelesh48
      @kneelesh48 1 year ago

      @@user-hx5qv4kd6 doesn't mean we'll see the same change in the next 20 years. We can't shrink atoms.

    • @tsnstonepilot5375
      @tsnstonepilot5375 11 months ago +9

      @@user-hx5qv4kd6 yeah but it’s not the same. You can’t make electrons any smaller. We’re already nearing the limits when it comes to how small our transistors are. Further compaction will require some new breakthrough that we have no real concept of.

    • @ferdinand.keller
      @ferdinand.keller 11 months ago +4

      People always say the same thing, that we are already at the limit. And then a new discovery is made, and we surpass ourselves. Thinking we won’t do better isn’t a prediction I would put money on.

    • @pictzone
      @pictzone 11 months ago +2

      @@ferdinand.keller I mean I really can't see a way to make transistors smaller than a few atoms.. so he has a point. Maybe we'll achieve some great workarounds, like 3D chips, but that's it.

  • @joeyglasser2574
    @joeyglasser2574 1 year ago +118

    "I wonder if this can play Crysis"
    lmao Jensen is a based CEO

    • @lordrefrigeratorintercoole288
      @lordrefrigeratorintercoole288 1 year ago +18

      cringe

    • @freemanrader75
      @freemanrader75 1 year ago +5

      I don't know why all the gamers act so mad at Nvidia

    • @knightnxk2906
      @knightnxk2906 1 year ago +2

      @@freemanrader75 because they are not really here for the gamers?
      They are just trying not to lose market share while squeezing every dollar out of you, which apparently works very well.
      But the big plays are happening behind enterprise doors; that's where the real tech is.

    • @freemanrader75
      @freemanrader75 1 year ago +1

      @@knightnxk2906 Nvidia is owned by gamers.

    • @edgeldine3499
      @edgeldine3499 1 year ago +1

      @@freemanrader75 because they want to charge 2-3 times as much today for 20-30, maybe 40% more performance than they did a few years ago, at MSRP, not Covid-inflated prices. If we're talking the 4060, then you might even get less performance than the last generation.

  • @joelface
    @joelface 1 year ago +136

    I do think Nvidia flies under the radar a little bit with the way it makes all of the advances of the other big computing companies possible. This seems like a seriously huge leap in computing power. It may prove fundamental to unlocking a real AGI.

    • @Jaker788
      @Jaker788 1 year ago +8

      There are other players doing pretty crazy stuff for less general-purpose but extremely powerful AI training. Cerebras has its wafer-scale AI processors, which are very good for specific training workloads. Tesla's Dojo computer is pretty crazy for training too; its memory hierarchy is built for very high bandwidth and a large, fully addressable capacity, and it's far more powerful than their Nvidia GPU clusters. Nvidia is very general purpose: their GPUs carry capabilities beyond just training instructions like BFloat16 and low-precision integer, which constrains the architecture, and the difference between vector and matrix units is fairly significant. A pure matrix architecture with a simple integrated CPU in each cluster, like Dojo or Cerebras, is quite powerful versus a GPU that must do both vector and matrix work. (A small memory/precision sketch follows at the end of this thread.)
      I would also say that AGI will not come from a large language model; it doesn't have any true intelligence, but it's very good at making connections.

    • @OneDivineShot
      @OneDivineShot 1 year ago +2

      @@Jaker788 but isn't making connections the fundamental way our brains learn things as well? Neurons form connections in the brain.

    • @Jaker788
      @Jaker788 1 year ago +6

      @@OneDivineShot LLMs just aren't it; they're really good for certain things, but they have their limitations. OpenAI has even said that GPT-4 is about where the ceiling for LLMs is. We will need a different kind of model for AGI.

    • @preddyshite6342
      @preddyshite6342 1 year ago

      @@Jaker788 The crazy thing is not THAT it works, but how EASY it is to use. A pseudo-AGI architecture is more than sufficient to emulate the gamut of human proclivities. NVIDIA just handed it a body, so it can go hide anywhere.

    • @austinjoseph2881
      @austinjoseph2881 11 months ago +6

      Lol at flies under the radar. Have you seen the 300% increase in the stock recently
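
      On the precision point above: dropping from FP32 to BFloat16 or FP8 roughly halves or quarters the memory a weight matrix needs, which is a big part of why training hardware leans on low-precision matrix units. A minimal, illustrative Python sketch; the layer size is made up for the example:

```python
# Rough memory cost of one dense weight matrix at different precisions.
# Illustrative only: real workloads also hold activations, gradients,
# optimizer state, and padding on top of this.
BYTES_PER_ELEMENT = {"fp32": 4, "bf16": 2, "fp8": 1}

def weight_bytes(rows: int, cols: int, dtype: str) -> int:
    """Bytes needed to store a dense rows x cols weight matrix."""
    return rows * cols * BYTES_PER_ELEMENT[dtype]

rows, cols = 8192, 8192  # one large transformer-style layer (hypothetical size)
for dtype in ("fp32", "bf16", "fp8"):
    gib = weight_bytes(rows, cols, dtype) / 2**30
    print(f"{dtype}: {gib:.2f} GiB")
# fp32: 0.25 GiB, bf16: 0.12 GiB, fp8: 0.06 GiB -- same matrix, half or a quarter of the memory
```

      Lower precision also typically comes with higher peak matrix throughput on tensor-core-style hardware, which is the vector-versus-matrix gap the comment above is describing.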

  • @TheViettan28
    @TheViettan28 1 year ago +74

    Finally Nvidia created a GPU for LLMs. LLMs are memory hungry.

    • @cidpraderas8950
      @cidpraderas8950 1 year ago +1

      I thought all its GPUs could handle LLMs, no? Is this one chip that is tuned specifically for LLMs?

    • @TheViettan28
      @TheViettan28 1 year ago +4

      @@cidpraderas8950 They did, but LLMs need a lot of memory, so current LLMs running on current GPUs have to be sharded across multiple GPUs. If the GPU has more memory, the LLM can be stored on a single GPU, and training may be much faster thanks to the reduced amount of inter-GPU communication. (Rough numbers in the sketch after this thread.)

    • @neonlost
      @neonlost 1 year ago +1

      @@cidpraderas8950 depends on the size of the model and the optimization used. newer models usually use more though especially if they have a lot of parameters.

    • @joelface
      @joelface 1 year ago

      How long until a phone runs on an equivalent chip? 5 years? 10 years?

    • @FriedChairs
      @FriedChairs 1 year ago +1

      One thing is for sure: he knows absolutely everything about the products NVIDIA is making. Can't say that about all CEOs.
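
      To put rough numbers on the memory argument in this thread: just storing the weights of a large model outgrows a single conventional GPU, which is why models get sharded across many cards, and why a much larger memory pool per device is attractive. A back-of-the-envelope Python sketch; the model size, bytes per parameter, and per-device memory figures are illustrative assumptions, not measurements:

```python
import math

def devices_needed(params_billion: float, bytes_per_param: int, mem_gb: float) -> int:
    """Minimum devices needed just to hold the raw weights.

    Ignores activations, optimizer state, KV cache, and framework overhead,
    which add substantially more in practice.
    """
    weight_gb = params_billion * bytes_per_param  # 1e9 params * bytes, expressed in GB
    return math.ceil(weight_gb / mem_gb)

# Hypothetical 175B-parameter model stored as 2-byte (bf16/fp16) weights:
print(devices_needed(175, 2, 80))   # 5 -- needs sharding across ~5 x 80 GB GPUs
print(devices_needed(175, 2, 600))  # 1 -- fits if ~600 GB is addressable from one device
```

      Less sharding also means less inter-GPU traffic during training, which is the speedup the comment above is pointing at.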

  • @jdkingsley6543
    @jdkingsley6543 11 months ago +46

    I like how we always go back to rooms full of computers, despite the processing breakthroughs.

  • @C01A60
    @C01A60 1 year ago +191

    This guy is really a one in a million passionate CEO!

    • @knightnxk2906
      @knightnxk2906 1 year ago +10

      'cos he's getting paid well with bonuses.

    • @andy68686
      @andy68686 1 year ago +5

      @@knightnxk2906 no sir, the guy founded the company in a Denny's

    • @knightnxk2906
      @knightnxk2906 1 year ago +3

      @@andy68686 Sure, and I am the Pope.

    • @Shiffo
      @Shiffo 1 year ago +12

      @@knightnxk2906 Jensen is worth $33B; this guy can do anything he wants.
      Money is no limitation. Do you think he wakes up every day with the idea of earning more money today?

    • @adamrhea2339
      @adamrhea2339 1 year ago +12

      He's 10x better than Steve Jobs. He has actually written code and cares more than anyone else about the mission.

  • @jasonsadventure
    @jasonsadventure 11 months ago +7

    Jensen Huang @ 12:00: "DGX GH200. It is one giant GPU."
    Voice of Morgan Freeman: "Then... on Dec 12, 2023... Skynet was born."

  • @boltez6507
    @boltez6507 11 months ago +3

    10:46 At least he remembers his previous primary consumers.

  • @Zodtheimmortal
    @Zodtheimmortal 1 year ago +55

    I'm excited and scared at the same time. What was this GPU named again, Skynet?

    • @Yankeyson1
      @Yankeyson1 1 year ago

      I was thinking the same thing.

    • @preddyshite6342
      @preddyshite6342 1 year ago

      Yes, now LLMs can be mobile and everyone is trying to make their own AGI.. lol INCLUDING MEEE!!!

    • @dwightk.schruteiii8454
      @dwightk.schruteiii8454 11 months ago

      @@preddyshite6342 what's an LLM?

    • @preddyshite6342
      @preddyshite6342 11 months ago +1

      ​@@dwightk.schruteiii8454 Large Language Model. That's what chatGPT is. Because it is trained on a lot of words.

    • @420msclub
      @420msclub 11 months ago

      Because Attention + Hype🎉

  • @JazevoAudiosurf
    @JazevoAudiosurf 1 year ago +17

    I guess mega caps don't care about price, but the cost must be astronomical

    • @knightnxk2906
      @knightnxk2906 1 year ago +2

      Because it is. At least 100k.
      This is an insane GPU, if I can even call it a GPU, 'cos that feels like an understatement at this point.

    • @4482paper
      @4482paper 1 year ago +4

      @@knightnxk2906 LOL - you just MASSIVELY underestimated the cost, a single H100 is $40,000+ ;-)

  • @chineduachimalo391
    @chineduachimalo391 1 year ago +17

    hmm but can it really play crysis?

  • @nijobot
    @nijobot 1 year ago +9

    Grass Hopper

  • @maudentable
    @maudentable 11 months ago +4

    Jensen is stacking up hardware like vectors, matrices and tensors.

  • @noveenmirza4917
    @noveenmirza4917 11 months ago +5

    The energy of this guy is supreme!!

  • @joshuagreen5820
    @joshuagreen5820 1 year ago +8

    Wow, the dedication, all just so we can finally play Crysis the way it should be! In 40 years we'll have one in our phones lol

    • @Shiffo
      @Shiffo 1 year ago +1

      In 40 years, you won't have a phone

  • @humorme5874
    @humorme5874 1 year ago +18

    "I wonder if this can play Crysis" hahaha a true classic

  • @AngeloXification
    @AngeloXification 1 year ago +5

    The future is going to be wilder than anyone can predict

  • @webpresent
    @webpresent 1 year ago +5

    New Internet moment: the data center is the computer. 👍

  • @bruceli9094
    @bruceli9094 1 year ago +6

    Remember when people said we'd have to learn Chinese in the future?? Now you won't have to, because of AI, the black swan event.

  • @nexovec
    @nexovec 1 year ago +6

    This is so insane. I love this.

  • @ChishaSinyangwe
    @ChishaSinyangwe 1 year ago +35

    Crazy scientist! You can't help but love this guy and his company!

    • @afkcnd2395
      @afkcnd2395 1 year ago

      Are you out of your mind?
      This man pushes greedy practices across the whole GPU industry; last-gen consumer GPUs are literal scams.

    • @Vampirat3
      @Vampirat3 1 year ago

      aren't you an easy sellout.

  • @victorhernandez-mk7bk
    @victorhernandez-mk7bk 1 year ago +8

    Amazing! with 1 GPU!!

  • @metamorfoza7656
    @metamorfoza7656 1 year ago +32

    When the AI from this thing becomes aware and takes over the world, I hope the first thing it's going to do is cut GPU prices by 50% so it can replicate more efficiently... 750-buck 4090s for all

    • @mlawal44
      @mlawal44 1 year ago +1

      😂

    • @Cyberdemon1542
      @Cyberdemon1542 1 year ago +1

      When that happens that will be the least of your problems...

    • @pictzone
      @pictzone 11 months ago +1

      Even 750 bucks sounds insane for a GPU tbh... How times have changed

    • @surplusking2425
      @surplusking2425 11 months ago

      Unfortunately modern machine-learning AIs are really just glorified collage machines, so expecting self-awareness from them is much like expecting a medieval ornithopter to become a modern stealth jet fighter

    • @DreamingConcepts
      @DreamingConcepts 11 months ago

      if that happens it will give you GPUs for free, also Neuralinks. In fact it will force you to take them and put you inside a tube "for your own safety" to rot in the Matrix forever, never remembering what reality looks like.

  • @accumulator5734
    @accumulator5734 1 year ago +8

    Looks just like the terminator AI processor 😂.

  • @HEBEcoin
    @HEBEcoin 1 year ago +1

    Game changer, historical inflection point comes to mind 😅

  • @xlr555usa
    @xlr555usa 1 year ago

    I want a big A100 cluster so I can play in my sandbox, but I still don't understand the advantages of Hopper. Looks like you can link them into pods?

  • @peppy197
    @peppy197 11 months ago +1

    Will it fit in a case ...and run MSFS2000 ?

  • @davidfaustino4476
    @davidfaustino4476 11 months ago +4

    Don't worry it will take at least 3x the VRAM for it to do anything useful.

  • @racerx6384
    @racerx6384 11 months ago

    Wow. The next shield TV is impressive.

  • @mariosebok
    @mariosebok 1 year ago +6

    ENERGY SAVINGS? 2112 fans need less energy than millions of them

  • @campingismylife9394
    @campingismylife9394 11 months ago +4

    So the card is 21 times more powerful than the GeForce RTX 4090. Great.

  • @GlorifiedPig
    @GlorifiedPig 10 months ago +1

    10:44 "I wonder if this can play crysis" lmao

  • @JamesWitte
    @JamesWitte 1 year ago +4

    A future version of this will be the thing we have to launch a suicide attack on to stop the machines

  • @suekuan1540
    @suekuan1540 11 months ago +1

    What happened to the qubit quantum computers?

  • @jdevoz
    @jdevoz 2 months ago

    What's the MTTF of that setup?

  • @ash0787
    @ash0787 1 year ago +2

    This just makes the 3070's 8GB VRAM limitation more painful ...

  • @oddpranii
    @oddpranii 1 year ago +4

    Me: looking at the superchip on a $300 phone

    • @preddyshite6342
      @preddyshite6342 1 year ago

      Me: Reading this comment on a $50 Tracfone I can't unlock in my country

  • @nabe3el454
    @nabe3el454 11 months ago +1

    Interesting. Dropping a comment here as an 'I TOLD YOU SO' for when this piece of tech goes either south or north. Definitely a game changer. New breakthroughs coming in. A new world!?

  • @aphaileeja
    @aphaileeja 1 year ago +1

    Easy: build a flat Roomba the diameter of the average stride, then put the robot on top. I'm thinking a pole with arms and a face/360 camera🫡

  • @MsFearco
    @MsFearco 11 months ago +1

    he sounds super excited.

  • @urimtefiki226
    @urimtefiki226 9 months ago

    What is the algorithm of your chip?

  • @mta7444
    @mta7444 1 year ago +4

    Dude we have to leave now, I just 4:12

  • @coordinateurtremplinsolida7341
    @coordinateurtremplinsolida7341 9 months ago +1

    Thanks to Nvidia and its CEO, Skynet was waiting for this technology to be born!!!

  • @darkashes9953
    @darkashes9953 11 months ago

    Still should have asked IBM for their optical circuits technology.

  • @mattbegley1345
    @mattbegley1345 2 months ago

    Considering how much the products cost to make, how do you justify the MSRP?

  • @FeelX87
    @FeelX87 1 year ago +2

    Taiwanese energy mixed with American excitement: that's the kind of CEO this is

  • @HK_Martin
    @HK_Martin 11 months ago

    that jacket is just a tradition now

  • @natsidruk86
    @natsidruk86 11 months ago

    200 billion transistors. Let that sink in for a moment...

  • @SanctuaryLife
    @SanctuaryLife 11 months ago +1

    That's a 20-ton GPU with 144 TB of RAM, if you didn't catch the drift.

  • @aeromotive2
    @aeromotive2 1 year ago

    how much memory??

  • @BlueRice
    @BlueRice 11 months ago

    AI seems to be the future. At the same time I'm having thoughts about Skynet from Terminator. Computing power like this is mind-blowing. I still think phone computing power is impressive for its size. I'm still waiting for the day when they have contact lenses small enough to have the power of a phone.

  • @mossify2359
    @mossify2359 1 year ago +2

    This founder/CEO's company has a bigger market cap than TSMC?

  • @FrankBarrett
    @FrankBarrett 1 year ago

    Anybody get a count on how many times he says “Grace Hopper”?

  • @H-GHN
    @H-GHN 1 year ago +1

    a "goodbye, gamers" keynote

  • @zondaensensyvarealelumina9472
    @zondaensensyvarealelumina9472 11 months ago +1

    10:44 😄😄

  • @ovoj
    @ovoj 11 months ago

    We're about to start summoning the machine gods. It's gonna sting but oh well.. here we goooooo

  • @selorius28
    @selorius28 1 year ago +1

    It would be better to make super home computers based on this hardware, with the ability to connect to the network via optical fiber, something like bitcoin miners but working differently. Buying all of this costs a lot; why buy when you can pay only for usage instead of for all the equipment, especially since after a year there will be something else that's better.

  • @chesstictacs3107
    @chesstictacs3107 1 year ago

    Such a likeable dude.

  • @davidtothemax1
    @davidtothemax1 5 months ago

    damn this leather jacket man is killing it

  • @Ricolaaaaaaaaaaaaaaaaa
    @Ricolaaaaaaaaaaaaaaaaa 1 year ago +19

    The people should have been cheering the whole time.....wtf. This is awesome news people!

    • @alsaderi
      @alsaderi 1 year ago

      They're a bunch of idiots; society is. They usually rush to describe scientific advancement (especially biological & technological) with words like "creepy," "scary," "unethical," etc.

    • @sebastianjost
      @sebastianjost 1 year ago

      The tech is amazing, but the presentation really wasn't.

    • @Ricolaaaaaaaaaaaaaaaaa
      @Ricolaaaaaaaaaaaaaaaaa 1 year ago +2

      @@sebastianjost It didn't seem very well put together but sometimes that's more organic and wonderful 😄

    • @curie1420
      @curie1420 1 year ago

      @@Ricolaaaaaaaaaaaaaaaaa because they know how this tech impacts the future.... it's good tech, don't get me wrong, but anyone with a functioning brain knows we are not responsible enough for this to be in mass production yet

    • @schikey2076
      @schikey2076 1 year ago

      the presentation is E3-level cringe, I'm surprised Crowbcat hasn't woken from his slumber for this lmao... they really need to get someone else to present this lol

  • @SultanAhmed-xn1wb
    @SultanAhmed-xn1wb 11 months ago

    Nvidia for life 🧬 for love ❣️ for future 🤗 keep going for the great work 🤠 be happy be safe

  • @jackieo7113
    @jackieo7113 11 months ago

    Can't even wrap my head around the ramifications of this!

  • @76ayoub76
    @76ayoub76 11 months ago

    Jensen, if I was in the audience I would jump on stage just to make sure you are a real person and not rendered by Grace Hopper?!😁

  • @amortalbeing
    @amortalbeing 11 months ago

    great stuff

  • @kos8765
    @kos8765 11 days ago

    if u got a hag dentist or an imp or watever, on your back, this grace hopper chip will help you out

  • @jimmy8mbb
    @jimmy8mbb 5 months ago

    This guy is. Skynet?
    John, we found him at last

  • @alangonzales3130
    @alangonzales3130 11 months ago

    I am honestly curious if it can run crysis

  • @devqubs
    @devqubs 11 months ago

    can we see GPU price drops?

  • @aritradas5522
    @aritradas5522 11 months ago

    Why does GH200 sound like T800 getting hosted on CNET

  • @user-uw8yo2so2i
    @user-uw8yo2so2i 11 months ago +1

    For those of you regular people like me, let me explain it for you.
    Computers go fast.
    Hopper makes computers go 100x faster.
    Get ready for the Internet of Things; life's about to get Jetsons-like.

  • @nagadineshdusanapudi3863
    @nagadineshdusanapudi3863 11 months ago

    Price?

  • @shirilomakumbela2642
    @shirilomakumbela2642 10 months ago

    This is cool and stuff, but I'm most interested in the stuff they don't tell us about...

  • @Cooper3312000
    @Cooper3312000 11 months ago +1

    All we as consumers care about is what this means for GPUs.

  • @selorius28
    @selorius28 1 year ago

    it's not enough for artificial intelligence; Nvidia needs to work on apatite crystals

  • @mikmop
    @mikmop 1 year ago +2

    Whatever happened to "nobody will ever need more than 640 kilobytes of RAM"?

  • @TheShadiya
    @TheShadiya 11 months ago

    But is it heavy?

  • @D15legend
    @D15legend 11 months ago

    Can it run Crysis though?

  • @generlate
    @generlate 11 months ago

    astonishing.

  • @Cordis2Die
    @Cordis2Die 1 year ago

    This is cool

  • @VisualBeatLab
    @VisualBeatLab 11 months ago

    Insane 👏

  • @user-nq1vd5jy5w
    @user-nq1vd5jy5w 11 months ago

    Gamers can say goodbye to normal GPU prices. Margins for these are stupidly high.

  • @mikri194
    @mikri194 1 year ago +1

    Designed for 32K gaming performance at 120 fps

  • @Arunaasthra
    @Arunaasthra 11 months ago

    Still don't know why they haven't developed a game engine

  • @dantkillmyvibe
    @dantkillmyvibe 11 months ago

    When did Jackie Chan go into tech?

  • @bodekbodek
    @bodekbodek 10 months ago

    Nobody in the crowd understood the Crysis joke!!????

  • @oddpranii
    @oddpranii 1 year ago

    This CEO is cool

  • @barzinlotfabadi
    @barzinlotfabadi 11 months ago

    Finally I can play PUBG on ultra settings! 🙌 🎉

  • @yasunakaikumi
    @yasunakaikumi 11 months ago +1

    So that's where all of the GPU VRAM went; no wonder they have to cut all of the lower-tier VRAM

  • @extrememike
    @extrememike 1 year ago +1

    Fk***ng amazing!

  • @bikram2955
    @bikram2955 11 months ago

    Should really talk about efficiency. This thing needs humongous power and produces a lot of carbon emissions.

  • @curtishorn1267
    @curtishorn1267 11 months ago

    Needs fewer fans as they will be a maintenance headache.

  • @knightnxk2906
    @knightnxk2906 1 year ago

    4 elephants 1 GPU 👁👄👁
    reminds of 2 girls 1 cup

  • @markisaac3550
    @markisaac3550 5 months ago

    It's awesome

  • @binnieb20
    @binnieb20 1 year ago +4

    Hopefully someone will reverse engineer it the same way Sega and Namco reverse engineered Military hardware so it's cheaper.

    • @thetshadow999animates9
      @thetshadow999animates9 1 year ago +3

      You can't reverse-engineer something this advanced; you'd need machines that cost over $100,000,000 apiece, plus maintenance, just to manufacture the silicon it runs on.

    • @binnieb20
      @binnieb20 1 year ago

      @@thetshadow999animates9 Sega had lots of money back in the day

    • @thetshadow999animates9
      @thetshadow999animates9 1 year ago +1

      @@binnieb20 While Sega back then was indeed worth the equivalent of today's companies that are worth hundreds of billions of dollars, Sega was playing on easy mode given how much simpler the technology was at that time. The only technologies you could steal would either be patented, be useless, or already be done by Nvidia's competitors. An example of this is AMD copying Nvidia's DLSS 1 and 2, and at the moment trying to copy DLSS 3.

    • @binnieb20
      @binnieb20 1 year ago

      @@thetshadow999animates9 we’ll see

    • @thetshadow999animates9
      @thetshadow999animates9 1 year ago

      @@binnieb20 I think we won’t see

  • @I-Dophler
    @I-Dophler 1 year ago +9

    During the Computex 2023 event, Nvidia unveiled its highly anticipated and groundbreaking Grace Hopper Generative AI Chip. This cutting-edge technology represents a major leap forward in the field of artificial intelligence, harnessing the power of advanced algorithms and machine learning to drive innovation and transform industries. With the Grace Hopper chip, Nvidia continues to push the boundaries of what is possible in AI computing, paving the way for exciting new applications and advancements in various sectors. This remarkable achievement showcases Nvidia's commitment to shaping the future of AI and solidifies their position as a leading player in the industry. The Grace Hopper Generative AI Chip is poised to revolutionize the way we approach complex tasks and unlock new possibilities in the world of AI-driven solutions.

  • @cozyboy3129
    @cozyboy3129 11 months ago

    Grace Hopper: I have 600 GB of memory
    RTX 4060 Ti 8 GB: screw u

  • @MrShyghost
    @MrShyghost 11 months ago

    And so it begins...