NVIDIA Made a CPU.. I’m Holding It. - Grace CPU/Hopper SuperChip @ Computex 2023

  • Published: 14 Oct 2024

Comments • 4.9K

  • @RevJR
    @RevJR Год назад +4968

    They don't want you to know this, but the processors at the convention are free. You can just walk up and take one.

    • @maxmustermann2370
      @maxmustermann2370 Год назад +106

      That's just the reason Nvidia squeezes the money from gamers. Now they can gift their new CPUs; the gamers already paid for the development and research.
      Btw, HP Moonshot was a fail. I see no difference here, just a bunch of desktop GPUs crunched into a 14" laptop blade. But it's AI man!!!11!1!!, the next hype hardware for failing startups.

    • @russellzauner
      @russellzauner Год назад +42

      *doorbell rings*
      Linus 2 weeks from publishing this clip: Why did [ship company] just pull up in a semi?

    • @HanSolo__
      @HanSolo__ Год назад +20

      Also, these are our available GPUs.

    • @user-go7mc4ez1d
      @user-go7mc4ez1d Год назад +97

      Can confirm, I went to one of these conventions and offered $1000 for one of their processors.
      Their answer? "It's not for sale".
      Snooze you lose, Nvidia. Thanks for the freebie

    • @oscarsh2909
      @oscarsh2909 Год назад +7

      I think we all know that you made a joke like this because you thought about stealing that poor CPU.

  • @Krolitian1
    @Krolitian1 Год назад +20424

    I love the idea of Linus just going into conventions and just unscrewing random tech he finds all over the walls without permission.

    • @faceboy1392
      @faceboy1392 Год назад +1116

      sinus lebastion is just too dangerous at conventions

    • @MrJosephrandall
      @MrJosephrandall Год назад +378

      I was about to comment this myself, goes to show how much the companies trust him now

    • @CrustySofa
      @CrustySofa Год назад +110

      seems like something he does everywhere he goes

    • @Vysair
      @Vysair Год назад +79

      That's probably what he did in the old days: on-the-spot permission with no prior planning

    • @jaxtinpetersen62
      @jaxtinpetersen62 Год назад +69

      the way he chuckles as well, when he actually gets permission. lol

  • @HStorm26
    @HStorm26 Год назад +6262

    A green cpu with a blue gpu may soon be possible.
    Scary times.

    • @justinbiggs1005
      @justinbiggs1005 Год назад +601

      Scary times with pricing and greed. But interesting times hardware/software technology wise

    • @TheCaseyCunningham
      @TheCaseyCunningham Год назад +117

      What the hell kind of bizzaro world are we in?

    • @DuckAutomata
      @DuckAutomata Год назад +495

      Highly doubt the green goblin is interested in making a cpu for peasants like us.

    • @BudgetGamingEnthusiast
      @BudgetGamingEnthusiast Год назад +16

      It already is
      The world is ending

    • @SWOTHDRA
      @SWOTHDRA Год назад

      ​​@@justinbiggs1005 scary times indeed for the PC cucked race, hold up ya'll are now gonna get cucked the other way around Nvidia proc and intel gpu??? Damn the PC peasant race keep taking L's.

  • @billygilbert7911
    @billygilbert7911 Год назад +442

    I'm not surprised they let you take it apart. 1.5 million views in less than 24 hours is more coverage than this would get anywhere else. I love these types of videos.

    • @pr0f3ta_yt
      @pr0f3ta_yt Год назад +6

      Yeah, duh. This whole video is full of shilling. Do people actually think this isn’t paid for by Gigabyte or Nvidia? It might as well be a marketing video for them

    • @billygilbert7911
      @billygilbert7911 Год назад +38

      @@pr0f3ta_yt Who cares if it's marketing. Its still cool.

    • @draketurtle4169
      @draketurtle4169 Год назад +2

      @@pr0f3ta_yt at least through Linus we can get some transparency from these big tech companies.
      We actually get to see up-and-coming tech, and Linus explains its use cases etc. to us normies.

    • @nguyenhanh9479
      @nguyenhanh9479 Год назад

      @@pr0f3ta_yt how do you expect them to make money? YouTube pay is sh*t, everyone knows that.

    • @hjf3022
      @hjf3022 Год назад

      @@pr0f3ta_yt they would have to disclose that fact if it was.

  • @ANeMzero
    @ANeMzero Год назад +1281

    For reference on the name: Grace Hopper was the US Navy computer scientist who wrote some of the earliest theory on machine-independent programming languages and is credited for writing the first compiler, two incredibly important steps towards modern computing.

    • @idova
      @idova Год назад +63

      yes, hearing 'grandma COBOL' mentioned did bring a smile to my face

    • @crimson-foxtwitch2581
      @crimson-foxtwitch2581 Год назад +50

      yeah, NVIDIA names a lot of their architectures after important people in science history.

    • @markus1351
      @markus1351 Год назад +32

      also ranked Rear Admiral on top of that

    • @legerdemain
      @legerdemain Год назад +7

      Grace Hopper has a Posse.

    • @cruzer789
      @cruzer789 Год назад +44

      She's also credited with popularizing the term 'bug' in computer science, because she found an actual bug in one of their systems and then taped it into the maintenance log book.

  • @macias22
    @macias22 Год назад +2495

    I love how Linus just HAS to disassemble everything he gets his hands on

    • @superintendent1152
      @superintendent1152 Год назад +16

      thats how he rolls

    • @grumpyratt2163
      @grumpyratt2163 Год назад +48

      Someone somewhere was holding their breath saying don't f'ing drop it Linus don't you dare drop it 😂

    • @dilbertron2
      @dilbertron2 Год назад +4

      thats how he rick rolls

    • @jordi95
      @jordi95 Год назад +17

      He could not use the LTT screwdriver though! What a missed oportunity!

    • @Ander01SE
      @Ander01SE Год назад

      Imagine if it was GN Steve...

  • @shorty685
    @shorty685 Год назад +1988

    The confidence that some manufacturers have in Linus despite his track record is impressive.

    • @Fishmanistan
      @Fishmanistan Год назад +240

      That's because if Linus drops their product, it's free advertising through clips for years to come lol

    • @TAMAMO-VIRUS
      @TAMAMO-VIRUS Год назад +142

      Also, these are display units. Meaning they either don't work at full capacity, or might not even work at all.

    • @SockyNoob
      @SockyNoob Год назад +83

      @@TAMAMO-VIRUS big companies rarely put something valuable out there in public view, sometimes it's just a dummy unit.

    • @huskycruxes7232
      @huskycruxes7232 Год назад +15

      ​@@FishmanistanI did not think of that. I have found myself watching Linus drop compilations

    • @michaelshafer2996
      @michaelshafer2996 Год назад +6

      Walk in there with a fat wallet and/or a million-dollar business insurance policy and they'd let you do it too 🤷🏻

  • @arshkhanlm3086
    @arshkhanlm3086 Год назад +581

    Intel made a GPU, now NVIDIA made a CPU, what a time we live in

    • @i_Kruti
      @i_Kruti Год назад +2

      😂🤣😂🤣😂🤣😂🤣😂🤣

    • @crashnreset6987
      @crashnreset6987 Год назад

      Yea,, what's next?..... Men making babies and women getting drunk and having tattoos ? ;p

    • @KanadeYagami
      @KanadeYagami Год назад +38

      NVIDIA should start making motherboards again to go with that new CPU. That would be a real trip. 😆

    • @i_Kruti
      @i_Kruti Год назад +10

      @@KanadeYagami and we won't be able to buy that motherboard due to its price.....😂🤣😂🤣😂🤣😂🤣😂🤣

    • @adamhassam
      @adamhassam Год назад

      meanwhile we "gamers" are still fine..

  • @Ljudet_Innan
    @Ljudet_Innan Год назад +2681

    I've never been so nervous watching Linus holding new tech.

    • @mike-tq5es
      @mike-tq5es Год назад +63

      the intel fab tour was more nerve-wracking lol~ even though he wasn't holding anything like here, his hand gestures and body movement so near all those precision machines, right after saying we shouldn't touch anything, were true anxiety. (oh yea, and he actually did pat the machines anyway) XD

    • @phoenux3986
      @phoenux3986 Год назад +44

      Something tells me the display units are probably nonfunctional if they're willing to let Linus take one off the wall and open it up with little to no supervision.

    • @Xorthis
      @Xorthis Год назад +32

      @@phoenux3986 Nope. I'm sure they are fully functional hardware items. I'm kinda sad he didn't drop one!
      Next week: Repairing the $150,000 server we had to buy after breaking it!

    • @gloamglozer
      @gloamglozer Год назад +1

      @@Xorthis haha :D I need to see it! But I gues it costs much more.

    • @IngwiePhoenix
      @IngwiePhoenix Год назад +5

      If you've watched him for years, you get used to it.
      Gold controller, 10k Intel CPU (which he dropped) are just among the first things that come to my mind. xD

  • @jamr3y
    @jamr3y Год назад +923

    Linus: we haven’t been on good terms with nvidia for a long time
    Also Linus: proceeds to dismantle latest nvidia tech

    • @Saint_Chompy
      @Saint_Chompy Год назад +70

      It's gigabyte's booth

    • @brandonmoss7976
      @brandonmoss7976 Год назад +7

      @@Saint_Chompy they are third-party seller this is NVIDIA tech tho

    • @Gabu_
      @Gabu_ Год назад +34

      @@brandonmoss7976 Realistically, Nvidia can't do shit if Gigabyte wants to show off their new stuff that's already available.

    • @prestonvarner611
      @prestonvarner611 Год назад +10

      @@brandonmoss7976 Gigabyte can let linus do what he wants... Nvidia would not stop him lets be real here.

    • @Petrixxxxxxx
      @Petrixxxxxxx Год назад +7

      @@brandonmoss7976 Which does not matter?
      If you bought a car from Toyota and started dismantling it, do you think Toyota could tell you to stop?

  • @xtr0city
    @xtr0city Год назад +923

    Gigabyte allowing Linus to disassemble a product mounted vertically is a level of trust I didn't know was possible, glad it worked out for them cause Jensen made it very clear how much it costs lol.

    • @hariranormal5584
      @hariranormal5584 Год назад +20

      They got a visit to ASML... that beats everything. A visit to arguably one of the most complicated machines on earth is not an easy thing to arrange.

    • @karmatraining
      @karmatraining Год назад +64

      That module was probably a dud or scrap part that they just used to show how it looks. Ain't nobody leaving a $100K chip hanging on a wall

    • @miroslavmilan
      @miroslavmilan Год назад +4

      At first I thought he was going to DELID it.

    • @yourboi1842
      @yourboi1842 Год назад +2

      I’d imagine a sponsor spot on a Linus Tech Tips video is a few grand. But Linus making an entire video directly on your product is somehow not worth him dropping it once in a blue moon?

    • @miroslavmilan
      @miroslavmilan Год назад +1

      The thing is, that probability is a lot higher than once in a blue moon 😄
      Anyhow, it’s mostly just a banter from his loyal fans.

  • @landonvincent9586
    @landonvincent9586 Год назад +73

    Linus is literally the legend of the tech industry. imagine not only being invited to a pre-show, but also being allowed to play with the displays.

  • @fatrobin72
    @fatrobin72 Год назад +632

    Fun fact... one of the first boards Acorn (the company who created ARM) made had a broken power connection to the CPU... but as ARM chips were so low powered, it was still fine

    • @someoneelse5005
      @someoneelse5005 Год назад +96

      I watched that... the insanity was that residual power from capacitance all around the chassis managed to power the circuits!

    • @Dragoon710
      @Dragoon710 Год назад +7

      @@someoneelse5005 that seems very interesting how can I find this video?

    • @brandonw1604
      @brandonw1604 Год назад +11

      Then there's RISC, which ARM is built on - that CPU kept running after power was disconnected.

    • @createusername6421
      @createusername6421 Год назад

      😮

    • @tanmaypanadi1414
      @tanmaypanadi1414 Год назад

      ​@@Dragoon710
      You are in for a treat
      Lowspec gaming YT channel has a couple of videos covering ARM .
      ruclips.net/video/gKYOjDz_RT8/видео.html
      ruclips.net/video/nIwdhPOVOUk/видео.html

  • @mylestheman
    @mylestheman Год назад +1761

    I can’t believe they trusted Linus not to drop one of these 😂

    • @LegionofKyle
      @LegionofKyle Год назад +38

      I think they more trust he can compensate fairly when he does, plus it would be good advertising.

    • @TravisFabel
      @TravisFabel Год назад +153

      I think these are non-operational demo examples. That's why they don't care.
      You don't hang a $100,000 machine on the wall of a convention. You put up the dead CPUs and mockup PSUs that are basically worthless.

    • @mattsnyder4754
      @mattsnyder4754 Год назад +20

      I can’t believe you think that they hung functional hardware on the wall of a convention center 😂

    • @oddball_the_blue
      @oddball_the_blue Год назад +3

      I came here just to say the same...

    • @hw2508
      @hw2508 Год назад +3

      Mounted to the wall? Probably just models with damaged CPUs anyways.

  • @idonotcomplyrevolution
    @idonotcomplyrevolution Год назад +493

    What I'm beginning to notice is, "compute modules" are essentially the PC and the motherboard isn't really a motherboard anymore, its just an I/O interface for the compute module. Which if you remember is how we used to make computers 40 years ago, just with wildly more advanced tech.

    • @krozareq
      @krozareq Год назад +37

      Yep. Everything got shoved into an ISA slot. Keyboard controller, mouse controller, VGA card, memory, etc.

    • @bassplayer3974
      @bassplayer3974 Год назад +7

      We're reaching the limits of shrinking, and the motherboard is a bottleneck. You buy it in a fab package.

    • @brodriguez11000
      @brodriguez11000 Год назад +7

      Still do with industrial computers. e.g. VME, etc.

    • @threalismaradona9899
      @threalismaradona9899 Год назад +13

      Cloud is just the mainframe and time sharing albeit as you said with very much more advanced tech

    • @SockyNoob
      @SockyNoob Год назад +10

      Yep. It's going back to everything being on a card and having it all connect to a backplane.

  • @thefreebooter8816
    @thefreebooter8816 Год назад +64

    Linus holding a ~$150,000 compute module like it's a boombox will never get old

  • @DJAutonomous
    @DJAutonomous Год назад +157

    That totally natural scan around the room before he takes the thing apart is just brilliant.

  • @DarkSwordsman
    @DarkSwordsman Год назад +506

    It's ironic that in an era where we went from needing dozens of dedicated cards to having most things handled in software, we are now going in reverse: Hardware processing things with dedicated chips or cards.

    • @Ferretsnarf
      @Ferretsnarf Год назад +106

      About 10 years ago when I was in college for Electrical and Computer Engineering this is actually one of the things we were talking about. We're more or less hitting a brick wall in miniaturization and increasing the raw speed of individual components. How do we improve performance when we can't miniaturize our chips any more than we already have (At this point we're talking about transistors that are so small that you can count their width in atoms)? Well you offload tasks into different chips (TCP/IP on the network adapter and like Linus showed putting the encryption workload on the adapter). If you find there's a specific workload that you're constantly asking your general-purpose CPU to do, it might start to make sense to put that task on a specialist chip rather than putting it on your CPU.
      ASICs are on the rise and expansion cards are coming back.

    • @autohmae
      @autohmae Год назад +17

      Do you remember some people were saying: the end of Moore's law ? That's what is going on here...

    • @Ferretsnarf
      @Ferretsnarf Год назад +28

      @@autohmae Yeah, we were talking about that at the time as well. I avoided saying it because I kind of hate talking about Moore's law online - you almost always get some kind of blowback when you talk about moore's law being dead. On the consumer side of things I could almost see why you might think moore's law isn't dead. We're not really seeing smaller+faster all that much anymore. We occasionally barely scrape by into a smaller node, but you're not really getting faster and more efficient transistors out of it anymore, instead you're mostly cramming more stuff onto the die and subsequently aiming a firehose at it to hope you cool it enough to not explode.

    • @autohmae
      @autohmae Год назад +5

      @@Ferretsnarf do you know why consumers with some technical knowledge don't know it's dead ? Because of the marketing with CPU node process size.

    • @JustSomeDinosaurPerson
      @JustSomeDinosaurPerson Год назад +8

      @@Ferretsnarf This has happened before, and ASICs have always had a need over general purpose processors. Our reasons for stagnation in tech is more of a complex problem as opposed to exclusively being down to physics. As it is, quite a few clever people in fields of research have proposed numerous workarounds that are plausible in theory, but simply not testable at the moment and not feasible on a wide scale, especially without aggressive grant funding like in the past.
      If anything, I would say that we're actually quite lucky that AI has brought about a bit of a resurgence in potential general optimization and advancement.
      Finally, Moore's law was always more of a "loose observation" and never intended to be indefinite, with Moore himself saying that he was certain the trend would not hold for long and become irrelevant to the next abstract steps in advanced design.

  • @EposVox
    @EposVox Год назад +497

    I feel like we're looking at the future of consumer platforms in 5-10 years, just in BIG form

    • @z0phi3l
      @z0phi3l Год назад +38

      Like mentioned Apple is there, Microsoft is close, question is, who will do mass ARM based consumer chips first, Intel or AMD?

    • @NostraDavid2
      @NostraDavid2 Год назад +7

      Some powerusers, maybe. I don't see windows hardcore switching to ARM. Who knows... Maybe we'll be surprised.

    • @jonforhan9196
      @jonforhan9196 Год назад +6

      @@z0phi3lMicrosoft is far from there with their Qualcomm chip surface laptops, maybe for a student taking notes and using a web browser but it’s basically the compute power of a phone lol

    • @wright96d
      @wright96d Год назад +14

      @@NostraDavid2 I think you have that switched. I’ll be surprised if 90% of consumer PCs aren’t running ARM SoCs in 10 years. And I’m talking mostly pre-builts here.

    • @z0phi3l
      @z0phi3l Год назад +2

      @@NostraDavid2 if this goes like I think it will, we won't have a choice. Wild guess is Intel x86 will make it to 16th gen before they kill it, same with AMD, 2-3 more x86 before they also switch to all ARM

  • @williambrunelle9050
    @williambrunelle9050 Год назад +33

    The little giggle of holding a server... a very expensive server and not dropping it made everyone's day! Like a kid in the toy store... Would love to see how hard it was for Jake to pull him away kicking and screaming.

  • @dariusz.9119
    @dariusz.9119 Год назад +624

    Imagine if Nvidia's reps didn't know Linus has a screwdriver and he just looked around, saw the reps moved away and started dismantling the showcase board before anyone could take a notice 😅😅

    • @davidhanousek9874
      @davidhanousek9874 Год назад

      he has mounted ltt screwdriver in his butt... like always 😁

    • @thewiirocks
      @thewiirocks Год назад +21

      If they're not used to Linus by now, that's on them!

    • @U1TR4F0RCE
      @U1TR4F0RCE Год назад +9

      To be fair NVIDIA didn't let him do that, it's Gigabyte who does work with NVIDIA who did the presentations.

    • @Landen79Foff-wc5ej
      @Landen79Foff-wc5ej Год назад +3

      and then when they noticed, they'd be like "HEY!" and then Linus drops it. 😏🤣

    • @8088I
      @8088I Год назад +1

      Get ready for Not Super Ai🤖,
      Bot for Super Duper AI👾👧!
      Westworld is only a generation
      away🤠👧(👾). ... :-))

  • @stratonarrow
    @stratonarrow Год назад +325

    For Linus to not drop whatever he’s holding immediately after saying “I don’t even wanna know what this thing costs” is pretty astounding to me.

    • @tombrauey
      @tombrauey Год назад +13

      I personally doubt that those are working chips. It‘s more likely that they are defective and are used for exhibition purposes.

    • @jackturner269
      @jackturner269 Год назад

      Linus has enough money to replace what ever gets broken guaranteed

    • @Demidar665
      @Demidar665 Год назад

      you really think Nvidia is going to let him disassemble working systems? one of those racks is probably 500k

    • @stratonarrow
      @stratonarrow Год назад +1

      @@tombrauey oh it is known. Still funny though.

    • @stratonarrow
      @stratonarrow Год назад

      @@Demidar665 I didn't say that

  • @Yoshi-ux9ch
    @Yoshi-ux9ch Год назад +173

    It's a very brave move to allow Linus to hold anything important

    • @trapical
      @trapical Год назад +7

      I mean let's be honest. This video is going to get more views than anything in the entire rest of the weekend of this convention... It's worth the risk of having to drop something when he is the headliner.

    • @Gabu_
      @Gabu_ Год назад +11

      They're almost certainly dummy chips that already don't work.

    • @aleks138
      @aleks138 Год назад

      considering how much there is at stake if someone steals one, they're not real chips

    • @brodriguez11000
      @brodriguez11000 Год назад

      @@trapical The people who can afford this most likely aren't in Linuses demographic.

  • @Flots1111
    @Flots1111 Год назад +9

    Fantastic overview of the new NVIDIA products and a stellar breakdown on ARM procs and where they work best. I'm working through some NVIDIA certification courses and the info is all there but they provide no context other than a dizzying array of multiplier comparisons against previous gen hardware and this video brought it all into focus. Thanks so much, really helpful!

  • @billyeveryteen7328
    @billyeveryteen7328 Год назад +616

    Hopefully Linus is still making content fifteen or twenty years later, when you can pick these up for relatively cheap to see how they perform in games.

    • @Jaroartx
      @Jaroartx Год назад +22

      i could imagine lets install batorcera for running ps6 games 😂😂😂 at ease, and for you dirty otakus create your own living A.I waifu cat girl

    • @JavierPhotoMX
      @JavierPhotoMX Год назад +15

      On Ali express 10 years from now

    • @mika2666
      @mika2666 Год назад +16

      Sadly the "100" series, previously known as Tesla, does not support any video outputs and does not support any graphics APIs, it's only for compute

    • @Xorthis
      @Xorthis Год назад +6

      @@mika2666 That hasn't stopped people running games on them. There's a few benchmarks out there.

    • @honam1021
      @honam1021 Год назад +6

      @@Xorthis A100 and newer would perform very poorly in games as only a small subset of the chip supports graphics workload.
      From the H100 architecture whitepaper: "Only two TPCs in both the SXM5 and PCIe H100 GPUs are graphics-capable (that is, they can run vertex, geometry, and pixel shaders)." (a full H100 has 72 TPCs)

  • @VertexTuner
    @VertexTuner Год назад +99

    What I find amazing about ARM architecture CPUs is that the very first one was simulated and developed by Acorn Computers on an Acorn BBC Microcomputer (which used a MOS 6502 CPU). The original name for Advanced RISC Machine was Acorn RISC Machine. I'm happy to say as a Brit I saw the beginning of this CPU legacy and still own both my BBC Microcomputer Model B and my Acorn Archimedes A3010 (which featured an ARM250, the 2nd generation ARM CPU). There was an actual ARM upgrade system for the BBC Micro, but it was far out of anyone's league/access and was mainly used by Acorn to develop the Archimedes.

    • @robinpage2730
      @robinpage2730 Год назад +10

      Fun fact: when the first ARM CPU passed its bench test, the testers went to unplug it, and realized it was already unplugged. It had passed its bench test entirely on residual stored energy. It was THAT power efficient.

    • @100daysofmeh
      @100daysofmeh Год назад

      The first computer i ever used was a bbc micro. When i got to infant school and even highschool we had Acorns

    • @autohmae
      @autohmae Год назад

      This is why the Raspberry Pi exists, to re-ignite the BBC Micro experience, as a teaching tool. And I can report: the Raspberry Pi is the most sold computer from the UK ever.

    • @Xorthis
      @Xorthis Год назад +1

      @@autohmae I was just reading about the shortages! Insane that it's so popular. I also just bought an RP2040 to mess about with. Incredible little devices!

    • @Xorthis
      @Xorthis Год назад +1

      Granny's Garden and Suburban Fox were two of the biggest games in my primary school :D Damn I miss the old BBC machines. Just before my computer lab went to x86 to keep up with the newest trends, we had one RISC machine with module cards. It could run as an Acorn, as a BBC, or even as a 486 (Each module card had the CPU to run that standard). I have no idea why this kind of system never made it.

  • @gaborkeresztes1739
    @gaborkeresztes1739 Год назад +17

    Mad respect to Gigabyte for letting this chip get into Linus' hands.

  • @jamesswaby5676
    @jamesswaby5676 Год назад +402

    Linus: 'I didn't ask permission for this part but nobody seems to be stopping me.'
    Security: 'That's Linus... just let him do his thing. He'll put it back together... probably.' 😂

    • @RK-252
      @RK-252 Год назад +23

      "trust me bro". 😉

    • @kameljoe21
      @kameljoe21 Год назад +12

      he might even drop it!

    • @whitekong316
      @whitekong316 Год назад +5

      Put it back with half the screws…

    • @KalebG
      @KalebG Год назад +1

      let him cook

    • @TheZanzou
      @TheZanzou Год назад +8

      I mean knowing how security goes for various events they probably weren't fully informed of what he could and couldn't do, just that he was allowed to mess with the display over there.

  • @artamereenshort6610
    @artamereenshort6610 Год назад +277

    It's a real moment of pure pleasure, to see Linus with eyes that shine, like a kid in a toy store

  • @TheSickness
    @TheSickness Год назад +1131

    Nvidia: we can connect multiple GPUs in multiple racks into one room-filling huge GPU
    Also Nvidia: SLI... yeah that don't work

    • @nhiko999
      @nhiko999 Год назад +108

      Just in case: SLI works, but it's mainly dependent on the type of work asked of the GPU, and games don't benefit much from the multiple nodes. For scientific computation however...

    • @EmilKlingberg
      @EmilKlingberg Год назад +102

      Well, SLI is actually a great technology, but it requires high competency from game developers, and let's just say that's not too common. Look at simulation programs or modeling and raytracing software and you realize how awesome SLI setups are when running proper software.

    • @coccoborg
      @coccoborg Год назад +24

      ​@@EmilKlingberg on point! If you want to see a well optimized game for sli/cf, have a look back at crysis 2! May not have been the best in the series, but multi-GPU support in that title was wildly effective!

    • @TheSickness
      @TheSickness Год назад +12

      @@EmilKlingberg yeah, feels like game devs these days need 'guardrails' enforced by Sony and a 'one click - enable' to implement feature button.
      (Thinking about the interviews on Morres law is dead channel)
      For the mentioned use cases you can forget running that on consumer cards as none have the connectors anymore

    • @krozareq
      @krozareq Год назад +18

      The difficulty with SLI is that it has to rasterize frames in real time for a 144+ Hz display. GPU-offloaded work, such as NN machine learning, is a much easier task to parallelize.

  • @1over137
    @1over137 Год назад +3

    I have used a system with 1440 cores and 64Tb RAM, but it was a few hundred physical commodity boxes. The latest compute space stuff that is replacing the likes that I used, is insane.

    • @1oglop1
      @1oglop1 Год назад +1

      And can it run Crysis?

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 Год назад +398

    Nvidia has been making CPUs for over a decade now. Tegra initially for high end tablets and now for high end (~$700-$2,500) embedded systems. And they've been making Grace for AI prototyping workstations for about 5 years (if you have a spare $25,000).
    If you only have $5,000, there are a few options with the Ampere Altra if you really must have ARM.
    The power savings are very suspect, Jeff Geerling tested and found it to not be much different than Threadripper.

    • @GauravSharma-dy8xv
      @GauravSharma-dy8xv Год назад +27

      I seriously forgot about tegra, it was sooo long ago

    • @FilippRoos
      @FilippRoos Год назад +26

      and the Switch

    • @rafradeki
      @rafradeki Год назад +50

      Arm is no magic bullet to energy efficiency, if using arm alone would be enough to make cpus more efficient even at high power, we would only have arm cpus

    • @andywolan
      @andywolan Год назад +10

      Is it true that ARM instructions are more energy efficient, but require more instructions to get the same task done than x86 instructions?

    • @genderender
      @genderender Год назад +16

      Nvidia is betting that people will use Hopper and most definitely betting that people will buy their expensive ass interconnect modules. The actual performance of these chips is probably meaningless outside of the context of "shove a shit ton of DDR5 at it", much like Apple Silicon. And plus, AMD already beat Nvidia to the punch here. MI300 is CDNA3 + Zen 4 on a single package, using their Infinity Fabric (which is literally the same technology but packaged differently)
      Epyc still exists, and is impossible to actually beat because it's much more versatile than these bespoke solutions. Until Arm can compete outside of the niche, we will keep hearing these arguments for years to come. Zen 4 is extremely efficient, as good as many Arm chips, so x86 isn't out of the game yet

  • @MaxXxoUh
    @MaxXxoUh Год назад +155

    That Grace Hopper is a freaking piece of art. It makes you want to code an entire OS and Game just to test it. Just imagine what a crazy project that would be.

    • @TheCHEATER900
      @TheCHEATER900 Год назад +10

      I love the homage to grace Hopper btw. Excellent naming

    • @chrishousby2685
      @chrishousby2685 Год назад +4

      ​@@TheCHEATER900 imagine going back and telling her how big a transistor will become. The ones she developed software languages with were vacuum tubes several inches long.

    • @brodriguez11000
      @brodriguez11000 Год назад

      We know what future Crysis developers will be using.

    • @puppergump4117
      @puppergump4117 Год назад +2

      @@chrishousby2685 I imagine that anyone working with computers understands how quickly they will improve. The stuff shown in this video may be made for personal computers in 20 years. Just as 20 years ago, personal computers had millions of times less space and compute power.

    • @chrishousby2685
      @chrishousby2685 Год назад

      @@puppergump4117 honestly I don't think personal computers will be a thing in 20 years, it'll probably be closer to cloud based hardware. It'll be cheaper and more efficient than making individual hardware for both parties.

  • @Morecow
    @Morecow Год назад +503

    No no no this can’t be right

    • @thischannelisforcommenting5680
      @thischannelisforcommenting5680 Год назад +26

      technically nintendo switch is Nvidia GPU

    • @chillaxen2113
      @chillaxen2113 Год назад +5

      That’s what I said bro it’s not real I swear

    • @skylovescars69420
      @skylovescars69420 Год назад +24

      No! And now Intel is making GPUs! And they are good value!!

    • @Chowy2
      @Chowy2 Год назад +2

      GOD NO

    • @FalloutProto
      @FalloutProto Год назад +2

      @@thischannelisforcommenting5680 isn’t the switch more of an SOC?

  • @Zensiji
    @Zensiji Год назад +8

    @Linus, I'm so glad you decided to step down as CEO so you could focus on the magic! Every day I tune into this channel to learn something new and you guys always manage to keep it fresh and engaging! Long Live LTT!

  • @raymondm.3748
    @raymondm.3748 Год назад +295

    This is nothing short of insane, the fact that there is so much processing power with less power means that we will have much higher speeds throughout our internet!

    • @OmniKoneko
      @OmniKoneko Год назад +19

      Not only that but it will decrease the heat generated from it so it provides more cushions for the coolers

    • @KalebG
      @KalebG Год назад +6

      also cheaper hosting!

    • @JelvinCS
      @JelvinCS Год назад +36

      This can't happen. What will I blame my awful Counter Strike performance on?

    • @SlyNine
      @SlyNine Год назад +9

      Just lower the clock speed on your cpu and you'll have much better performance per watt.
      Our home chips run way beyond the efficiency curve

    • @phantomedits3716
      @phantomedits3716 Год назад

      This won't really make your internet faster. But, there's a case to be made that it might, in a roundabout way, make your websites load faster because the website is running on this hardware.

  • @theindependentradio
    @theindependentradio Год назад +15

    4:14 who else was waiting for him to drop it

  • @jackoboy1754
    @jackoboy1754 Год назад +70

    linus: *walks in*
    also linus: *randomly starts unscrewing things from the wall*

  • @TiagoRamosVideos
    @TiagoRamosVideos Год назад +4

    Incredible product! 👏 And it was great to see the confidence brands put on you Linus 👌

  • @CMoney1401
    @CMoney1401 Год назад +179

    I don’t know what’s more impressive, the technology or Linus not dropping it!

    • @bricefleckenstein9666
      @bricefleckenstein9666 Год назад +6

      I vote for Linus not dropping it.
      9-)

    • @davidgoodnow269
      @davidgoodnow269 Год назад +3

      "You drop it, you pay for it.
      Don't worry, we have payment plan options available."

    • @emerje0
      @emerje0 Год назад +1

      Linus goes into these tech booths with all the abandon of a kid with boundary issues walking into a toy store unattended.

  • @trapical
    @trapical Год назад +242

    This hardware is utterly and completely insane. If you can comprehend the slightest bit of the numbers behind this, it's just madness.

    • @MarkOakleyComics
      @MarkOakleyComics Год назад +37

      Those AI art programs which can produce 30 photo-real variations of, "A mountain of cookies" in under a second strongly suggests that we're living the last generation before everybody is born in pods and never uses their eyes. I'm legit alarmed by these compulsive engineers who know deep down that they should put the brakes on, but just can't stop themselves.

    • @rudisimo
      @rudisimo Год назад +39

      ​@@MarkOakleyComics​you should make your tinfoil hat tighter, perhaps that will help.

    • @Leanzazzy
      @Leanzazzy Год назад

      ​@@MarkOakleyComicsOnly idiots think AI will overthrow the world.
      If you actually understood what AI is and how it works you wouldn't think that. It's not some magical sentient being. It's literally just mathematical models and equations used to predict future outcomes based on inputs datasets.
      Datasets, which need I remind you, need to come from living, active, intelligent humans. If there aren't humans producing new, creative, informative data, AI would be useless.
      AI is a good thing. It is simply a tool to help us simplify our work and reach our goals. It can, and hopefully will, be used to ease and remove the burden of existence from mankind, so we can truly be free to do what we want and not struggle just to survive.

    • @Leanzazzy
      @Leanzazzy Год назад

      It's insane only if you compare it to consumer-level hardware and software.
      Remember, governments all over the world have and maintain far higher tech than the public can even dream of. They secretly use this tech for military, scientific, and usually espionage purposes.
      We get only the bottom of the barrel. Most of the tech we use today were once government secrets.
      The Internet itself started as a US military defence and research project.

    • @MarkOakleyComics
      @MarkOakleyComics Год назад +15

      @@rudisimo Right. Because there aren't any examples of technology getting ahead of our ability to adapt without catastrophic results. I can think of a couple items of note just from the last few years.
      Meanwhile.., Neuralink is entering human trials.

  • @chriskaprys
    @chriskaprys Год назад +75

    Wild to see just how far-wide-deep the subscription model has reached. If the contemporary fiscal landscape were a chess board, the pawn could only move to a square that it's rented from the opposing king.

    • @autohmae
      @autohmae Год назад +2

      IBM has been doing this for decades, so no surprise

  • @ianmchale770
    @ianmchale770 Год назад +12

    Hearing Linus saying “I can’t believe they let me take this off the wall” and proceeding to laugh like a small child made my day. Linus is the geeky adult version of a kid in a candy store 😅

  • @dannymitchell6131
    @dannymitchell6131 Год назад +54

    I just love how excited Linus always is for new tech. Never change bro.

  • @miyaayazaki4273
    @miyaayazaki4273 Год назад +44

    Can we all take a second and appreciate how casually Linus holds that GPU on his shoulder? ( 5:12 )

  • @Thohean
    @Thohean Год назад +309

    I think the most surprising thing to me is that Gigabyte has enterprise class hardware.

    • @konnorj6442
      @konnorj6442 Год назад +29

      Ah but notice they did NOT show you the GB power supply?
      Lmfao

    • @ThyXsphyre
      @ThyXsphyre Год назад +1

      Pleaase dont say that about Gigabyte

    • @georgevel
      @georgevel Год назад +1

      My whole system is aorus bro

    • @Tehkezah
      @Tehkezah Год назад +14

      @@georgevel and nothing of it is enterprise class hardware

    • @georgevel
      @georgevel Год назад +1

      @@Tehkezah I said that bc ppl are complaining about their PSUs, and I wanna note that mine have had no problems

  • @Hisham_HMA
    @Hisham_HMA Год назад +8

    whats more amazing is them letting Linus handle those parts with his steady hands

  • @egillis214
    @egillis214 Год назад +63

    ARM is a RISC instruction set. The Hewlett-Packard PA-RISC was way ahead of its time. I worked on the first HP 3000 on MPE & HP 9000 HP-UX systems. Some of the desktop workstations, like the tiny 715 systems, were incredible in the 1980's.

    • @konnorj6442
      @konnorj6442 Год назад +1

      Ah, the toys of my youth! I worked with some of that wayyyy back when, along with many other goodies that all of the winbloze babies wouldn't have any clue what it is now, never mind how to use it. And due to the millennials' and beyond idiotic, overly entitled, arrogant bs, they don't even appreciate that which was gained to make the current toys even possible via our hard work, long before they were a wet spot on cheap hotel sheets

    • @danielwoods7325
      @danielwoods7325 Год назад +1

      Beat me to this lol good comment

    • @kellecetraro4807
      @kellecetraro4807 Год назад +2

      You beat me to this comment, but I made it anyway 😂
      I'm sort of scratching my head as to why he's (Linus??) acting like it's a new thing... Just for novelty I still have a SUN E450 still running and productive 😂

    • @archgirl
      @archgirl Год назад +1

      @@konnorj6442Pardon?

    • @1Sonicsky1
      @1Sonicsky1 Год назад +6

      ​​@@konnorj6442 First of all, it is this exact toxicity that completely stagnates any real intelligence... I would rather be stuck fixing Windows 3.1 and Vista installations for the rest of eternity than ever hold a mindset akin to yours. Every architecture, operating system, and programming language has its strengths and weaknesses, and it is our responsibility as technicians to learn and understand each one so that we can always provide the best for whatever our client is trying to achieve. I have met both old and young people who are kinder, more intelligent, and exhibit far more competence than you have shown here.

  • @Glitch5970
    @Glitch5970 Год назад +8

    4:22 dude you sounded like Ramon Salazar from og RE4 laughing like that LMAOOO

  • @sjgonline
    @sjgonline Год назад +117

    The smile on Linus’ face is like a 80’s kid going to a toy store… you know you won’t leave the place with anything, but just being surrounded with the toys is a joy

  • @BrunoTorrente
    @BrunoTorrente Год назад +1

    Grace Brewster Hopper (née Murray; December 9, 1906 - January 1, 1992) was an American computer scientist, mathematician, and United States Navy rear admiral. One of the first programmers of the Harvard Mark I computer, she was a pioneer of computer programming who invented one of the first linkers. Hopper was the first to devise the theory of machine-independent programming languages, and the FLOW-MATIC programming language she created using this theory was later extended to create COBOL, an early high-level programming language still in use today.

  • @dragon2knight
    @dragon2knight Год назад +378

    This is NVidia's future, and they know it. Good, now we can let folks like Intel and AMD shine a bit more, especially when they get their drivers ironed out.

    • @maxjames00077
      @maxjames00077 Год назад +24

      You think Intel and AMD can get some GPU market share in HPC?

    • @RedEverything
      @RedEverything Год назад +43

      AMD drivers are really nice. Meanwhile, Intel drivers struggle to play old games.

    • @Mr.Morden
      @Mr.Morden Год назад +8

      As long as Nvidia continues to fail at purchasing a CPU vendor.

    • @mcslender2965
      @mcslender2965 Год назад +11

      Im sure NVidia will be sad about that as they control whatever runs the AI customer support you work with, the AI that power your online services, the servers that renders your movie...

    • @maxjames00077
      @maxjames00077 Год назад +13

      @@RedEverything My Arc A770's drivers are amazing after the updates.. AMD has gone nowhere in the last 10 years 😂

  • @ardentdfender4116
    @ardentdfender4116 Год назад +19

    My heart definitely skipped a beat at 8:34 when someone threw the Network Card. That would have been a hell of a drop.

    • @tormodhag6824
      @tormodhag6824 Год назад +3

      Like that thing can cost as much as a car

    • @Vortex001_MLG
      @Vortex001_MLG 9 месяцев назад

      @@tormodhag6824it probably does 😮

  • @shahidahamed4785
    @shahidahamed4785 Год назад +208

    If they are able to optimise their cpu and gpu together properly... It's gonna be fire

    • @xustom250
      @xustom250 Год назад +32

      quite literally lmao

    • @s8080_
      @s8080_ Год назад +19

      Yes, fire, AND a housefire

    • @empressofmadness
      @empressofmadness Год назад +1

      and ten million dollars

    • @realghostxd
      @realghostxd Год назад

      @@s8080_ lol

    • @XecularOfficial
      @XecularOfficial Год назад

      As long as they don't try making it so certain software features are incompatible with non-nvidia CPUs

  • @jpdj2715
    @jpdj2715 Год назад +1

    ARM - "Acorn RISC Machine - first used in 1983. The ARM company never made processors/chips themselves, but designed them in specialised CAD systems. The CAD logical design file then was converted into a physical design that could be "printed" (my term) by the "foundry" (industry jargon). Such a logical design actually facilitates simulation in software of how the processor will work. The first physical batch of ARM came back to the ARM company and they had their physical test motherboards. Set the mobo up, plug the CPU in, run tests. Overnight, one of the engineers wakes up and becomes aware there was a connection or configuration issue in the power-lines and the test should have failed. Turned out the processor needs so little power that it had run off the power leaked into the processor from I/O presented to the processor. That's why almost all CPUs in smartphones are derived from that first ARM and why Apple derived their current generation of "proprietary" Apple chips from ARM too.

  • @Akizurius
    @Akizurius Год назад +38

    I can imagine Computex chief security officer watching this and thinking to himself "Why didn't anyone stop him? Well... At least he didn't drop anything."

    • @andreastyrberg7556
      @andreastyrberg7556 Год назад +2

      Security officer maybe. PR leader says "lovely" and maybe "sad he didn't drop it for the meme-videos". It is worth a lot in advertising

  • @LPcrazy_88
    @LPcrazy_88 Год назад +106

    This might be the first video where Linus hasn't damaged or at least recklessly handled expensive electronics. So it IS possible for him to not break stuff!

    • @autumntechdruid
      @autumntechdruid Год назад +6

      Must be a robot Linus.....

    • @lonelyPorterCH
      @lonelyPorterCH Год назад +5

      What about the network card jake threw?^^
      I guess that was not linus

    • @LPcrazy_88
      @LPcrazy_88 Год назад +3

      @@lonelyPorterCH The more surprising part about that was Linus actually yelling NO! I would have expected him to just carry on like it's normal to throw around electronics like that.

    • @DaftFader
      @DaftFader Год назад +2

      TBF, we haven't seen it powered on since he touched it ... ! xD

    • @TheOnlyAndreySotnikov
      @TheOnlyAndreySotnikov Год назад

      He probably did damage something, it was just edited out to avoid liability.

  • @J0URDAIN
    @J0URDAIN Год назад +22

    Crazy to think that at one point our future generations will see this as ancient technology just how we see OUR ancestor’s tech (tools).

  • @topsofwow
    @topsofwow Год назад +1

    Those network cards are also hypervisors allowing you to divide one system up and scale the compute needed per customer.

  • @bricoschmoo1897
    @bricoschmoo1897 Год назад +15

    0:16 I love how the worker jumps away as soon as he realizes he's going to inadvertently go between Linus and the camera!

    • @notjux
      @notjux Год назад +1

      He tried so hard to bail but only drew more attention to himself. May he rest in peace. o7

  • @viken3368
    @viken3368 Год назад +5

    8:33 throwing prototype/showcase tech to Linus 'Droptips' Sebastian is a very bold move

  • @KentHambrock
    @KentHambrock Год назад +15

    AWS has had a similar card to the last thing you showed off since around 2017. They just call them Annapurna cards inside the data centers (likely because they're made by Annapurna, a company they acquired back in 2015), but it's literally that. 1 or 2 SFP ports + an ethernet port and it gets used as the NIC inside pretty much all their servers these days. I assume the industry at large has had cards like that since 2015~ or even earlier, since I don't remember AWS being on the leading edge of anything in the data center space when I was working for them. xD

    • @ulthirm
      @ulthirm Год назад +1

      The only 'revolutionary' part of AWS was Elastic Compute, and almost entirely on the software side. They do self-manufacture certain things now tho (things the company would probably actually look to terminate me for discussing. Me and my PIP are already on thin ice heh)

  • @hedlund
    @hedlund Год назад +16

    Holy crapballs that bandwidth. Damn cool stuff. I'm so happy Arm remains independent. Gonna be cool seeing what all comes to market following this beastie.

  • @TheWendellpaulo
    @TheWendellpaulo Год назад +10

    It's an Arm CPU, the same architecture used in mobile phone processors. It's RISC-based but very powerful, and all of the major systems today have a version for Arm: Android, Linux, and Windows (with the Windows for IoT version). If developers keep building compatibility layers for x86, like with Exagear, compatibility starts to have a solution.

    • @SWOTHDRA
      @SWOTHDRA Год назад +1

      On apple devices everything is already compatible. Best we leave windows all together, make steam compatible with ARM and there you are, no more windows needed

    • @skirata3144
      @skirata3144 Год назад +2

      @@SWOTHDRA It doesn’t matter much for the immediate future (5-10 years) if steam is compiled for ARM or not since it’s not a high performance application and therefore works fine with a translation layer (e.g. macos). Also it doesn’t really matter for most people anyway since pretty much all games released till now are not going to get an update to run on ARM anyway so if you want to be able to keep playing your library making the switch won’t make any sense whatsoever.

  • @viktoraggerholm5102
    @viktoraggerholm5102 Год назад

    00:15 shoutout to that employee who saw you were filming and didn't want to be in the way

  • @henrycollins2478
    @henrycollins2478 Год назад +66

    Their data centers have been pretty helpful for their stock

    • @genderender
      @genderender Год назад +8

      I find H100's price tag nothing more than "well Google will pay no matter what" kind of price. A million dollar investment into a server for a company this large is both eaten up immediately by running costs, but also still a blip for total operating cost of the company. Nvidia shareholders must be happy as shit

    • @ZeLoShady
      @ZeLoShady Год назад

      Ya I think this is a good time to invest given their Q1 reports and all these new tech developments. Nvidia is currently operating like an innovator again and not some comfortable company (like intel was before AMD came back).

    • @ZebraGoat
      @ZebraGoat Год назад +4

      @@ZeLoShady their q1 financial report showed a decline in revenue and net income lol

    • @ZeLoShady
      @ZeLoShady Год назад

      @@ZebraGoat exactly.

    • @tjoloi
      @tjoloi Год назад

      @@ZeLoShady Considering they had a 26% rally yesterday over the span of a single hour, I'm gonna go and assume that you're not the only one to think this

  • @PsRohrbaugh
    @PsRohrbaugh Год назад +59

    In 2004, my home lab included a 186 drive NAS, full of 18GB 15k RPM drives. It had similar iops and bandwidth to the best SATA SSDs, a decade earlier.
    I connected my dual processor workstation to it directly with Fibre Channel.
    But this stuff is so advanced... I can't even wrap my head around using it outside of an enterprise environment.

    • @HarshJha
      @HarshJha Год назад +2

      That was some amazing hardware! What do you run these days?

    • @konnorj6442
      @konnorj6442 Год назад +2

      Ah, I have another clone out there in the wild.. of a sort
      Trick is I've worked on some of the bleeding edge goodies, and even in a pro environment some of it is so wild it's still really not usable by just one person per se
      Like my friend (asm coder of god-like level).. way back when I was sadly stuck several hundred meters out of the area where cable inet was avail at home, he got direct fiber to his home due to his work.. granted his home was only about 400 feet from the nearest main trunk data center by sheer luck.. but the speeds he got were so fast nothing he could really build for use at home could use the bandwidth he got.
      Since then it's even more insane.. decades later his connection is now so fast his server-class top-speed flash drive array can't write fast enough to fully use the fiber connection
      Lucky fukker he is lol

    • @PsRohrbaugh
      @PsRohrbaugh Год назад +1

      @@HarshJha My setup has become much more consumer-grade due to the rapid advancement in tech. I've got an i5 running FreeNAS with 4x 12TB and 6x 20TB mechanical drives, and 3x SSDs for cache. It has 10GbE ethernet, but none of my computers have more than 5GbE - and it can fully saturate that. But I almost never do. I only use it to archive home movies (I'm not a professional content creator) because nvme SSDs are just so big / fast / cheap compared to a few years ago. Dual 4TB SSDs is plenty for so many use cases.

  • @teach9023
    @teach9023 Год назад +19

    00:19 the commitment to not messing up the shot is admirable

    • @Tholi
      @Tholi Год назад

      this was funny

  • @v3nomdahunta
    @v3nomdahunta Год назад

    I love that you don't appear to double check that it's off or unplugged.
    I worry when seeing you with any screwdriver to hardware.

  • @Posh_Quack
    @Posh_Quack Год назад +8

    1:27 DON'T DROP IT

  • @ChristianStout
    @ChristianStout Год назад +5

    3:19 I'm more interested in those Intel Arc MXM cards in the background. Can we get some coverage on those?

  • @Woodie-xq1ew
    @Woodie-xq1ew Год назад +90

    What you don’t see, just below the camera shot, is an Nvidia employee with their hands out ready to catch that module if Linus drops it 😂

    • @BeatXaber
      @BeatXaber Год назад

      wait what time

    • @ccalvinn
      @ccalvinn Год назад +4

      ​@@BeatXaber just jokes m8

    • @damianea103
      @damianea103 Год назад +10

      @@BeatXaber 11:17

  • @Indrid__Cold
    @Indrid__Cold Год назад +2

    I love your contagious passion and enthusiasm for technology. I joined the PC industry as a hardware trainer/presenter in 1991. It took me months to accept the fact that I was actually getting paid to do something I was so passionate about. Best working years of my life!

  • @awp_silver
    @awp_silver Год назад +28

    They were fearless to risk it and trust Linus not to drop anything 😂

  • @timseguine2
    @timseguine2 Год назад +23

    The network card you showed at the end reminds me a lot of IBM Z-Series programmable IO. And yeah, offloading IO stuff to a coprocessor is the secret to crazy high throughput. You guys have seen in your reviews of desktop products how bogged down the system gets with high speed IO if it needs to be handled all by the CPU.

    • @mads7790
      @mads7790 Год назад +2

      Reinventing mainframes in 2023!

    • @derpataur1162
      @derpataur1162 Год назад

      As someone who had to use MLNX Connect-X4 and 5 cards -- the software for these cards is an absolute nightmare.
      1.) Sometimes the cards would just stop working until someone physically disconnected, then reconnected the cable.
      2.) Kernel Updates in most cases will brick the drivers, causing your network adapter to stop working until you can either reinstall the old driver or install whatever updated driver is released for the new kernel.
      3.) On at least one occasion, shortly after NVIDIA bought Mellanox they decided with zero warning to change the default operating mode of the driver from IPoIB to RDMA -- with zero help / instructions on how to revert it. This one left a really bad taste in my mouth.
      4.) The Documentation is bad. NVIDIA has done a terrible job of providing even a similar level of documentation to the old Mellanox documentation. Frequently when debugging issues with these cards, I found myself looking up old Mellanox pages for ancient secrets to try and equip myself with enough knowledge to debug stupid problems.
      5.)In order to fully utilize these cards you need to operate in RDMA mode, which literally requires that your application be compatible with RDMA/Configured for it. RDMA basically lets data get piped straight from the network to your application -- completely skipping the kernel. It's intended for crazy fast low latency. But they don't really tell you that. So if you're running in IPoIB then you're basically just paying a couple hundred bucks per cable for nothing.
      6.) Networking for these things requires you run a network manager on one of your nodes, or pay crazy licensing for a network device that can support running the network manager. So it's dummy thicc easy to setup out of the box -- but it's actually quite a pain to set it up correctly.
      TL;DR -- These cards are absolute trash if you don't fully invest into utilizing them correctly, and that impacts everything from hardware to the application. The software isn't robust, and your engineers are going to HATE it. I wouldn't recommend even entertaining the idea unless you absolutely intend to build your entire application/system around the idea of using RDMA.. cause its definitely not something you can just tack on for more performance.
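
      Worth unpacking that RDMA point with a concrete example: kernel bypass only works because the application explicitly registers (pins) its buffers with the NIC up front, which is exactly why it can't just be "tacked on for more performance". Below is a minimal, hedged sketch using the libibverbs userspace API; it assumes a ConnectX-class adapter is present and the rdma-core packages are installed, and it stops at memory registration rather than showing a full data exchange.

          /* rdma_reg_sketch.c - illustrative only; build with: cc rdma_reg_sketch.c -libverbs */
          #include <infiniband/verbs.h>
          #include <stdio.h>
          #include <stdlib.h>

          int main(void) {
              int n = 0;
              struct ibv_device **devs = ibv_get_device_list(&n);  /* enumerate RDMA-capable NICs */
              if (!devs || n == 0) { fprintf(stderr, "no RDMA devices found\n"); return 1; }

              struct ibv_context *ctx = ibv_open_device(devs[0]);  /* open the first adapter */
              struct ibv_pd *pd = ibv_alloc_pd(ctx);               /* protection domain for our resources */

              size_t len = 1 << 20;                                /* 1 MiB application buffer */
              void *buf = malloc(len);

              /* The key step: register the buffer so the NIC can DMA into it directly,
               * bypassing the kernel. Every RDMA-aware application must do this itself. */
              struct ibv_mr *mr = ibv_reg_mr(pd, buf, len,
                  IBV_ACCESS_LOCAL_WRITE | IBV_ACCESS_REMOTE_READ | IBV_ACCESS_REMOTE_WRITE);
              if (!mr) { fprintf(stderr, "memory registration failed\n"); return 1; }

              printf("registered %zu bytes, lkey=0x%x rkey=0x%x\n", len, mr->lkey, mr->rkey);

              /* A real application would go on to create completion queues and queue pairs,
               * exchange rkeys with its peer, and post RDMA read/write work requests. */
              ibv_dereg_mr(mr);
              ibv_dealloc_pd(pd);
              ibv_close_device(ctx);
              ibv_free_device_list(devs);
              free(buf);
              return 0;
          }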

  • @BenjaminWagener
    @BenjaminWagener Год назад +35

    The way NVIDIA focuses on cloud and AI while making local gaming more and more expensive, I fear local gaming will get rarer and rarer. They want to nudge us toward GeForce Now instead, because it's more efficient for them to share performance from their servers than to sell each of us a GPU.

    • @SWOTHDRA
      @SWOTHDRA Год назад +9

      True, that's the future. That's also the reason gaming companies want to go always-online, games as a service. With those games you can easily transition from local to cloud without the consumer knowing, and once you're locked in, you're going to pay rent for both the software and the hardware.

    • @brandonmoss7976
      @brandonmoss7976 Год назад

      Pretty soon your GPUs are going to come with their own custom CPUs 😄 along with a few petabytes of storage for the AI data, so every time you play a game the AI will be smarter, customized for every single game and every single play style for every single player 😳

    • @ryanw7196
      @ryanw7196 Год назад

      Honestly, with the percentage of their income that is now coming from AI, I don't really see them giving a shit about GeForce anything; they could be completely out of the commercial hardware space in 10 years. I mean, I imagine they may already have the RTX 5000 series in the wings and possibly even the 6000 series... But after that? If everything else goes according to plan, then Nvidia won't care much about consumer cash anymore.

    • @Wobble2007
      @Wobble2007 Год назад +1

      It will never be viable until the majority of internet infrastructure is pure fibre; copper is just way too high-latency (laggy) for remote gaming. Even 1 Gb fibre is borderline; in reality, 10 Gb full-fat fibre is the minimum for a good gaming experience over a remote connection. Even current HDMI standards struggle to carry enough bandwidth to keep up with modern video games, so even with 10 Gb fibre a heavy compression technique will need to be employed. I wouldn't ever want to use it for gaming personally.

    • @Gabu_
      @Gabu_ Год назад

      @@Wobble2007 What are you smoking? You can ALREADY play remotely at fairly decent latency on a regular 100 Mb connection. Barely anybody except competitive e-sports professionals cares whether the latency is 50 ms or 10 ms.

  • @Cyberguy42
    @Cyberguy42 Год назад +1

    6:55 "Giving them access to up to 150TB of High Bandwidth Memory" The total memory capacity is ~150TB, but only a fraction of it is HBM. Each node has 512GB LPDDR5X but 'only' 80GB HBM.

  • @oblivion_2852
    @oblivion_2852 Год назад +6

    I'd like to point out that because LLVM is the backend for quite a few modern languages, compiling for ARM is as simple as passing a parameter to your compiler.
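
    As a concrete illustration of that point, here is a hedged sketch (the file name and sysroot path are made up for the example): with an LLVM-based toolchain such as clang, the same source builds for ARM simply by changing the target triple.

        /* hello.c - the same source builds for x86-64 or AArch64; only the compiler
         * invocation changes. Illustrative invocations:
         *   clang -o hello_x86 hello.c
         *   clang --target=aarch64-linux-gnu --sysroot=/opt/aarch64-sysroot -o hello_arm hello.c
         * Rust behaves similarly: cargo build --target aarch64-unknown-linux-gnu
         */
        #include <stdio.h>

        int main(void) {
            printf("hello from whichever architecture this was compiled for\n");
            return 0;
        }

    The caveat is that this covers compilation only; dependencies with hand-written assembly or architecture-specific intrinsics still need real porting work.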

  • @dennisfahey2379
    @dennisfahey2379 Год назад +5

    Virtualization really pivoted the server market into making CPUs with massive memory requirements. Each virtual node may be okay with a small number of cores, but every single virtual machine needs a lot of dedicated memory - it adds up fast. And spinning that node up and tearing it down in an acceptable period of time changed the whole network architecture of datacenters. Tensor cores (AI) exacerbated the problem. And as bad as it is now - CXL is going to really do a number on it as things get more heterogeneous. Fun fact: Windows used to run on ARM, MIPS and x86. In fact it still does on the lowly Raspberry Pi (IoT), which is ARM. In the 90's the fastest Windows machine was actually a Silicon Graphics Octane server (MIPS) with many CPUs and an interconnect design that is very similar to NVLink. NVIDIA's acquisition of Mellanox really focused on the data center and the necessary interconnect at chip level. They have been "beyond" simple GPU/gaming for many years. The margins are much higher in the datacenter, as is the refresh cycle.

  • @isanova960
    @isanova960 Год назад +17

    This is 100% the content I live for, awesome video. Keep up the good work!

  • @carrino15
    @carrino15 Год назад +1

    When he threw the little processor network card, my heart stopped for a second.

  • @gandalphf2026
    @gandalphf2026 Год назад +17

    Finally a cpu that can emulate my mobile games with no frame drops!

    •  Год назад +5

      Genshin Impact 2000fps

    • @trapical
      @trapical Год назад +1

      But can it play Crysis?...

  • @YonatanAvhar
    @YonatanAvhar Год назад +46

    Man, the speed at which these things rip through your wallet is insane

    • @AM-uy1ez
      @AM-uy1ez Год назад +2

      The biggest problem is Intel, x86, and their royalty fees... which require everyone to pay them for using the x86 chip architecture while they do nothing to improve it... so chips are going to stay high in price; every new chip that comes out is at least 400-500 dollars. AMD uses the x86 architecture, so they need to pay Intel a fixed royalty fee for no reason other than using an architecture that has been globally adopted and invested in by developers around the world. I don't think Intel should be getting paid just because it "owns" a globally used architecture needed in all kinds of health sectors, food production companies, and businesses around the world, mostly because all the hard work comes from the developers who made the programs. Apple tried to exclude Intel from all of this, but they are no better for using the ARM architecture, which is also licensed. We need a licensing-free architecture that everyone can use; only that way will we get rid of money-sucking, leeching companies such as "Intel". I was rooting for Huawei chips, but poor Huawei got banned here.

    • @MichaelPohoreski
      @MichaelPohoreski Год назад

      @@AM-uy1ez Intel pays AMD licensing fees for AMD64, while AMD pays Intel licensing fees for x86. They were suing each other and reached a cross-licensing agreement years ago.

  • @sebbes333
    @sebbes333 Год назад +4

    4:44 Holding the "computer" like it was a bazooka :D

  • @CreativeMindsAudio
    @CreativeMindsAudio Год назад +6

    So, like an M2 Ultra at the server level. This makes me really happy; this kind of stuff is great for the environment energy-wise. Also, if that interconnect could be used for SLI or similar in the future, that would be huge!

  • @FemboyBussyEnjoyer
    @FemboyBussyEnjoyer Год назад +340

    I love seeing Linus having fun while disassembling all those things

    • @Fogolol
      @Fogolol Год назад +3

      yeah and he looks like a kid in a candy shop lmao

    • @tyson31415
      @tyson31415 Год назад +3

      Me too, but it also makes me nervous. He and gravity don't always get along so well.

    • @ddbrosnahan
      @ddbrosnahan Год назад

      anything on Cerebras Andromeda AI wafer-exa-scale technology?

  • @amazingdrewH
    @amazingdrewH Год назад +7

    Once the Windows for ARM exclusivity ends it’ll be interesting to see if Nvidia has plans for the consumer CPU market

  • @ProjectPhysX
    @ProjectPhysX Год назад +6

    6:59 my CFD software can fit 1.25 billion cells in a single GPU with 64GB VRAM. 150TB could hold 2.85 trillion cells, holy smokes! That is 14170³ resolution.
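
    That back-of-envelope scales consistently, treating cell count as linear in memory:

        1.25e9 cells / 64 GB ≈ 19.5 million cells per GB
        19.5 million cells/GB × ~150,000 GB ≈ 2.9 trillion cells
        14170³ ≈ 2.85 trillion, matching the quoted resolution

    (The small gap between ~2.9 and 2.85 trillion is presumably memory reserved for data other than the cells themselves.)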

  • @davidkelly132
    @davidkelly132 Год назад +3

    When I saw Linus start to unscrew the CPU I thought "wait.. what are you doing", then I saw him hold it and I was full of fear.

  • @richardmaher2600
    @richardmaher2600 Год назад +5

    Wonderful to see Linus actually excited about a product, and impressed as well. It shows his passion for tech.

  • @GalactusTheDestroyer
    @GalactusTheDestroyer Год назад +19

    8:32 You can see the sheer terror in Linus's eyes when someone from off camera threw that card to him. I'll never be able to afford anything that he showed, save the computer setup at the booth. But it's cool seeing a glimpse into what will be.

    • @Steamrick
      @Steamrick Год назад +4

      I'm about 90% sure that was Jake throwing expensive networking cards.

    • @Thats_Mr_Random_Person_to_you
      @Thats_Mr_Random_Person_to_you Год назад

      ConnectX cards are expensive, but actually not insanely expensive. Based on personal experience looking at the costs of other Mellanox cards, I'd bet the 'average' price across the entire ConnectX-7 SKU range (i.e. all the different speed and port-count options) would be in the $1500 range... the 400GbE ones will be expensive, but a 100GbE or 50GbE will be less.
      The 'Smart NIC' / DPU on the other hand... yeah, those would be very, very expensive, but equally, as Linus says, super cool, and for cloud providers a huge benefit to getting more CPU compute out of existing hardware.

  • @Mainyehc
    @Mainyehc Год назад +4

    4:32 “*This* is Grace Hopper. On the one side, we've got the same 72-core Grace ARM CPU we just saw, but on the other side, the “ooooooo shiny” latest and greatest NVIDIA H100 Hopper GPU. Today I am going to review Grace Hopper, and show you all of its quirks and features.”

  • @AlwaresHUN
    @AlwaresHUN Год назад

    At work we're already on the ARM architecture. Switching to it was just changing the amd64 values to arm in our infrastructure config. It took like half an hour with hundreds of microservices (+ testing, validating).

  • @MaybyAGhost
    @MaybyAGhost Год назад +48

    I like to think that Linus isn't supposed to be there and that he just started unscrewing the displays out of habit and nobody was brave enough to stop him
    Edit: Well 3:12 confirms that! Never change, Linus. Never change

    • @dota2tournamentss
      @dota2tournamentss Год назад +8

      Even when staff sees it they probably assume that someone allowed him to do it because no one is crazy enough to do it without permission.... right?

  • @michaelbushee3968
    @michaelbushee3968 Год назад +18

    The fact that their processor is named Grace just to make that Grace Hopper reference is perfection.

    • @acmenipponair
      @acmenipponair Год назад

      I hope they searched the CPU design for bugs :D

  • @HardcoreGamers115
    @HardcoreGamers115 Год назад +27

    Im excited to see what nvidia does with nvlink. Really cool technology!

  • @Lampe2020
    @Lampe2020 Год назад +2

    5:33 WHAT?!? 4TB/s?!? That's all computer data I ever have produced - in a single second?!?

  • @KenS1267
    @KenS1267 Год назад +42

    I first heard the prediction that RISC processors would soon displace x86 ones in the early 90's. It is now, checks calendar, 2023. 30 years is not any definition of soon in technology I've ever heard of.
    What non-professionals never seem to understand is that there is far more involved than just producing a "better" CPU or a "better" computer. You need a lot of people developing software for an architecture before it is sustainable and before the improvements that are needed become apparent, and of course the company doing the maintaining will have to respond to those needs. You need a good compiler/IDE (does anyone think Nvidia provides anything of the like for this?). You need profiling tools and all the rest of the sorts of tools needed not just to debug but to maintain and understand how an application performs once deployed. For x86 I can choose between literally dozens of any of these sorts of programs, including dozens of bundles of them, including freeware ones. On the RISC side you're stuck with whatever the HW manufacturer provides, or maybe a port of something, but nothing with any community behind it because there is no installed base. That will greatly drive up the cost of developing SW for these platforms, and ultimately, no matter how eye-watering the HW's prices, the cost to develop and maintain the SW is always higher.
    RISC found its way onto phones and tablets because there was no competing installed base of x86 chips. There will be some niche markets in the enterprise for RISC but this unending "RISC is the wave of the future" is dumb.

    • @abcpea
      @abcpea Год назад +5

      There's plenty of RISC software, a lot of it open-source. Don't forget SPARC and PowerPC.

    • @iamamish
      @iamamish Год назад +8

      All true but as Linus pointed out, given the right use case, the cost of porting your dev tools, compilers, OS, etc. often pales in comparison to the savings. Overall, we HAVE moved to a RISC world. X86 is really the lone holdout, and that's just due to backward compatibility. Look how quickly Apple was able to move to their own silicon, once they were willing to bite the bullet of running older software through emulation.

    • @Gabu_
      @Gabu_ Год назад +2

      And that's where you're wrong. All modern CPUs, INCLUDING x86, are already RISC - but with a thin translation layer.

    • @blxblx6843
      @blxblx6843 Год назад

      RISC GOES BRRR 😂

    • @phantomedits3716
      @phantomedits3716 Год назад +1

      You're dead on about the "soon" hype train. However - the story on software ecosystems is a bit different than you've suggested. Yes - for compiled binaries, there's an entire universe available for x86, and rather less so for ARM OS's on the same hardware (ARM Linux, ARM Windows). However, what's sneaking up on x86 (btw, it's pedantic but you really mean x64) is the fact that robust and mature ARM toolchains are getting built up *because* of ARM on mobile, and more importantly now, ARM on OSX desktop machines. It's not as simple as recompiling stuff for ARM, but honestly, most client applications these days are built in an interpreted language anyways - so the interpreter or VM runtime just needs to be recompiled for ARM. Windows on ARM will happily run the majority of .Net apps, as one example.

  • @elonwong
    @elonwong Год назад +6

    “That’s a lot of balls”
    -Jensen while describing the solder balls under the processor💀😂

  • @gmacdonalddc
    @gmacdonalddc Год назад +4

    Cool stuff! I've always wondered what cutting edge server tech stuff was capable of, enjoyed the vid!