Your Graphics Card Needs an SSD!

  • Published: Oct 14, 2024

Comments • 7K

  • @au5music
    @au5music 10 months ago +22101

    GPUs are becoming brotherboards

    • @tanvir.ahamed0
      @tanvir.ahamed0 9 months ago +464

      Underrated comment 😭😭

    • @TristonHughes-p1f
      @TristonHughes-p1f 9 months ago

      My thoughts exactly. How only 4 likes!? @@tanvir.ahamed0

    • @perochialjoe
      @perochialjoe 9 months ago +2457

      So the brotherboard plugs into the motherboard? That's a fucked up computer

    • @jamariiion
      @jamariiion 9 months ago +333

      @@perochialjoe nah frl🤣

    • @SilverAegis39
      @SilverAegis39 9 months ago +726

      Fatherboards, since they still need mother to be effective

  • @NP_Com
    @NP_Com 1 year ago +39208

    "my gpu ran out of space"
    Edit 7 months later: i got this video reccomended again and got shocked.

  • @Nullzd
    @Nullzd 1 year ago +15545

    "GPU: look at me, I'm the motherboard now" meme is starting to get too realistic

    • @mmrox4320
      @mmrox4320 1 year ago +527

      Since some CPUs also come with integrated graphics for how small they are, i don't see why not GPU with an integrated CPU also mini-ITX motherboards can fit too

    • @mr_juuq1916
      @mr_juuq1916 1 year ago +66

      Oh s#it... Oh no. :D Here we go again! :D

    • @endo9881
      @endo9881 1 year ago +99

      @@mmrox4320 Oh god what have you done

    • @tobiid1585
      @tobiid1585 1 year ago +120

      the funny thing is, gpu is basically can be called a computer itself with all the component inside that, except for the lack of psu.

    • @mr_juuq1916
      @mr_juuq1916 1 year ago +48

      @@tobiid1585 yep and then there is the thing about when we are going to plug a cord straight from our gpus to power outlets :D
      Also when are we transferring power to the mbo from gpu, hmm :D
      Future idea, make a computer that is only one component. Shouldn't be hard i guess.

  • @Fvistugdsggratugfh
    @Fvistugdsggratugfh 5 months ago +427

    “How much vram do you have?”
    “Uhhh 2 tb”

    • @valentds
      @valentds 3 months ago +24

      it's not for vram sadly

    • @Fvistugdsggratugfh
      @Fvistugdsggratugfh 3 months ago +14

      @@valentds 😑

    • @arthasmenethil2201
      @arthasmenethil2201 2 months ago +6

      In the future, we may expand VRAM like that. Until then, you ask an electronics guy to upgrade your GPU (delicate and expensive task).

    • @brotherhoodofsteel3090
      @brotherhoodofsteel3090 1 month ago +9

      Being able to add VRAM would be clutch.. but let's be honest, they'd never do that.

    • @ElvisRandomVideos
      @ElvisRandomVideos 1 month ago

      That’s what they should’ve done. But VRAM is way too fast at ddr6, the drive wouldn’t be able to keep up.

  • @tomekkwegna5881
    @tomekkwegna5881 11 months ago +5475

    Previously: built-in graphics on the motherboard
    The future: built-in motherboard on the graphics card

    • @ichdissdich98
      @ichdissdich98 10 months ago +106

      Your laughing but this is it my dude, gpu with built in CPU as approximate values are enough now. Same goes for society and that’s what really worries me. You don’t have to know stuff any more just approximates and where to look up the real values

    • @FocusedCat
      @FocusedCat 10 months ago +53

      Technically, that's an APU.

    • @benson4820
      @benson4820 10 months ago +29

      How about a full on prebuilt pc inside the gpu 😂

    • @xshade65
      @xshade65 10 months ago +4

      @@FocusedCat like that one that runs a fighter jet?

    • @SomeGuyNamedMy
      @SomeGuyNamedMy 10 months ago

      @@ichdissdich98 the fuck are you going on about lol

  • @Azuraeon
    @Azuraeon 1 year ago +11089

    - 4060: "What is my purpose?"
    - "You provide cooling for SSDs"
    - 4060: "Oh my god"

    • @samuelmatheson9655
      @samuelmatheson9655 1 year ago +147

      Lmao

    • @malikkelly
      @malikkelly 1 year ago +156

      Rick and Morty

    • @madfox285
      @madfox285 1 year ago +77

      😂😂😂 nice reference

    • @Dante2913
      @Dante2913 1 year ago +13

      😂

    • @charles3840
      @charles3840 1 year ago +44

      That'll be me when I have a 4060 and all I play is Factorio games smaller than 20GB with pixel graphics.

  • @BRMakesStuff
    @BRMakesStuff 7 months ago +4075

    "What computer do you have?"
    "An RTX 4090."

    • @karlhendrikse
      @karlhendrikse 7 months ago +113

      That's already how it is, pretty much. Lead with the most expensive part

    • @SaladDev
      @SaladDev 6 months ago +44

      This reminds me of diesel truck owners. I don’t drive a 3500, I drive a Cummins!

    • @Iheartpillzz
      @Iheartpillzz 4 months ago +6

      @@karlhendrikse fr

    • @Ashitaka0815
      @Ashitaka0815 4 months ago +8

      That one has no free pci express lanes though, so it's more likely a 4060 :P

    • @jmcgaming9457
      @jmcgaming9457 4 months ago +5

      Ohh I got the rtx 6090

  • @AZREDFERN
    @AZREDFERN 7 months ago +14

    My mini-ITX would be happy. It has 2 full size M.2 slots. But only one of them has a cooler, and the other is on the bottom of the board with no head space for a heat sink or any airflow if you found a low profile one.

  • @SoccerBoyAP
    @SoccerBoyAP 1 year ago +3354

    Your GPU is becoming the System within a System

    • @davidparker.2227
      @davidparker.2227 1 year ago +110

      Always was.

    • @CossackHD
      @CossackHD 1 year ago +69

      Meanwhile, SSD has its own little OS and 4 core ARM CPU that manages the data.

    • @RoninYoutube
      @RoninYoutube 1 year ago +23

      Cells within Cells

    • @club2772
      @club2772 1 year ago +4

      Not really. The traces are practically going straight from the ssd to the motherboard.

    • @minapenguin7107
      @minapenguin7107 1 year ago +9

      All pc components are systems within a system

  • @MrScy13
    @MrScy13 1 year ago +2371

    GPU: look at me, look at me. I am the computer now

    • @milefiori7694
      @milefiori7694 1 year ago +82

      From discrete GPU to discrete Computer 😂

    • @KReeMMeeNAL
      @KReeMMeeNAL 1 year ago +4

      Hahahahahaha, YES 👍🏻 exactly that

    • @hevityaus631
      @hevityaus631 1 year ago +1

      Kek

    • @magnuswright5572
      @magnuswright5572 1 year ago +5

      Somebody ported Linux to OpenCL, so yeah basically

    • @Kivsha
      @Kivsha 1 year ago

      DPU: look at me, look at me. I am the computer now

  • @creativename.
    @creativename. 1 year ago +3366

    Bro got that RTX 4060 512GB version 😂

  • @comptvlee
    @comptvlee 4 months ago

    Would super love this idea especially for small form factors where you're limited in space and mobo features

  • @zyuradesu89
    @zyuradesu89 1 year ago +931

    2077: and now you can also put some ram on your gpu

    • @DTPandemonium
      @DTPandemonium 11 months ago +80

      Finally can actually download more ram

    • @toucane2124
      @toucane2124 11 months ago +17

      more like "now you can also put some ram in your head"

    • @KaityKat117
      @KaityKat117 11 months ago +31

      Am I getting wooshed here, or are you not aware that GPUs already have their own dedicated RAM?

    • @zyuradesu89
      @zyuradesu89 11 months ago +11

      @@KaityKat117 i know, i mean we can upgrade it, its a joke anyway

    • @KaityKat117
      @KaityKat117 11 months ago +13

      @@zyuradesu89 oh like give the GPU DIMM slots.
      actually....... that doesn't sound like too bad an idea

  • @burgerbait
    @burgerbait 1 year ago +3981

    How can I cool my SSD?
    Nvidia: How about you strap it to the second hottest component in your computer?

    • @Nordlicht05
      @Nordlicht05 11 months ago +97

      I wonder how that should be the future? That means there will never be GPUs wich use all lanes on any Mainboard ever!?

    • @ericknorskr8568
      @ericknorskr8568 11 months ago +38

      @@Nordlicht05 if it's not necessary, they won't do it. Gold and copper are expensive; the less, the better for them

    • @w花b
      @w花b 11 months ago +50

      Hottest component requires the coolest fans

    • @infernaldaedra
      @infernaldaedra 11 months ago +32

      ​@@Nordlicht05GPUs will never match the full throughput of pcie technology. Ultimately that's a unnecessary and worthless endeavor unless pcie stops scaling

    • @Nordlicht05
      @Nordlicht05 11 months ago +4

      @@infernaldaedra then they could make fewer lanes on Mainboards ☝️

  • @chidubem826
    @chidubem826 1 year ago +3672

    4060: "What is my purpose?
    Me: "You are a house for my SSD."
    4060: "You've gotta be fckin kiddin me!"

    • @justacollegestudent5147
      @justacollegestudent5147 11 months ago +78

      Honestly being a place for an ssd is worth more than the worthless garbage 4060

    • @junkpow
      @junkpow 11 months ago +60

      4060 should apologize for existing... That thing is waste of my semen.

    • @Dante2913
      @Dante2913 11 months ago +3

      😂

    • @noelisgod1
      @noelisgod1 11 months ago

      @@junkpow you fucked your GPU? 😂

    • @TheCynicalOptimist88
      @TheCynicalOptimist88 11 months ago +12

      But wouldn't the heat created by the graphics card far exceed normal conditions that a hard drive would be sit at? Regardless of fans ... Water cooling I could concede being a more at reason for such close integration

  • @aquaman5464
    @aquaman5464 11 days ago

    Now that’s something I am looking forward too. Great concept. There may be few degrees hotter temps overall for gpu but the gains are worth it

  • @derthellevoon
    @derthellevoon 1 year ago +796

    this brings the „I am the motherboard now“ joke to a whole different level

    • @braulioasd283
      @braulioasd283 1 year ago

      ​@@ghst-vq8sv😂😂😂

    • @Aliyah_666
      @Aliyah_666 1 year ago +1

      ​@@ghst-vq8svasus 4090 matrix has entered the chat lol

    • @jaquelinegillisfrancine2923
      @jaquelinegillisfrancine2923 1 year ago +1

      This will no longer be a joke in 10 years

    • @GAMER000-CAT
      @GAMER000-CAT 10 days ago

      I still use a intel celeron now and run kali linux on it

  • @buenogoodlive
    @buenogoodlive 11 months ago +1225

    In the future, your graphics card will just be your computer.

    • @lucythefrogge
      @lucythefrogge 9 months ago +55

      C*nsole be like

    • @bragtime1052
      @bragtime1052 9 months ago +7

      @@lucythefrogge 😨

    • @RenanBecker
      @RenanBecker 9 months ago +8

      like smartphones?

    • @DiditCoding
      @DiditCoding 9 months ago +1

      This is what I’ve been saying from long time

    • @conqwiztadore2213
      @conqwiztadore2213 9 months ago +15

      No genius, gpus will be obsolete and cpus will do both in the future.

  • @Joshfarmpig
    @Joshfarmpig 1 year ago +4281

    this is a perfect opportunity for nvidia to make a 4060 that has an SLI bridge and like have a super gpu

    • @Joshfarmpig
      @Joshfarmpig 1 year ago +170

      or better yet, a compact 4060

    • @JackalTheProto
      @JackalTheProto 1 year ago +257

      They won't its NVIDIA were taking about

    • @WCGwkf
      @WCGwkf 1 year ago +121

      A double 4060 would not make a super gpu

    • @darealmaul
      @darealmaul 1 year ago +111

      Sli barely did any good. It was a waste of time and money.

    • @Jwalker76
      @Jwalker76 1 year ago +58

      problem with SLI is driver support and getting game devs to not only for new games but old games as well.

  • @phoenixfireclusterbomb
    @phoenixfireclusterbomb 1 month ago +23

    Manufactures should be selling the cpu, gpu, ram, and ssd already installed and discounted. Just like a phone with different tare levels and size.

    • @tekirandlime
      @tekirandlime 14 days ago +22

      I believe that's called a pre-built pc..

    • @gamingarc1
      @gamingarc1 14 days ago +4

      I was gonna say that @@tekirandlime

    • @raywieds
      @raywieds 14 days ago +1

      Have u not heard of pre-built pcs? 🤦🏻‍♂️

    • @julisod
      @julisod 14 days ago +5

      you would love laptops

    • @wFTHw
      @wFTHw 14 days ago +2

      Or consoles

  • @Y3SS1N
    @Y3SS1N 1 year ago +651

    I almost thought those cards would need 2TB VRAM

    • @Roach_Dogg_JR
      @Roach_Dogg_JR 1 year ago +86

      Ah yes, I can now load the textures of the entire GTA V map.

    • @smsff7
      @smsff7 1 year ago +45

      @@Roach_Dogg_JR So that should mean no load time for the game, everything is just there and ready to go.

    • @webbedshadow2601
      @webbedshadow2601 1 year ago +2

      Dude same

    • @narrativeless404
      @narrativeless404 1 year ago +4

      @@AnzaiDior Even unarchived GTA V files wouldn't be taking 2 TB of space

    • @meghanachauhan9380
      @meghanachauhan9380 1 year ago +2

      @@smsff7 you can load your entire games library onto the SSD just waiting to be launched

  • @Gornius
    @Gornius 1 year ago +1086

    "So what we can do to make 60°C SSDs cooler?"
    "Stick it to the board that can get up to 100°C, duh..."

    • @rage8010
      @rage8010 1 year ago +110

      Exactly, it's not by the actual chip. It's on the fucking backplate lmao That shit gets hot...

    • @Not_interestEd-
      @Not_interestEd- 1 year ago +34

      Cool concept, gonna take an arm and a leg to engineer.

    • @piotrkowalski3869
      @piotrkowalski3869 1 year ago +12

      @@Not_interestEd- What can i get for a rib? Guess a grilled SSD.

    • @leviofanh
      @leviofanh 1 year ago +7

      Attach the heatsink to the board of the SSD itself. It will be even more efficient.

    • @Not_interestEd-
      @Not_interestEd- Год назад +8

      @@hystaric_3013
      Problem: the NVMe slot is going to sit next to the chip.
      Solution: Move it to the other side of the board.
      Can't be THAT hard.... right?

  • @shadoutenma5774
    @shadoutenma5774 1 year ago +1868

    Imagine your gpu overheating and cooking your 2TB ssd

    • @ichisenzy
      @ichisenzy 1 year ago +133

      actually no cuz the gpu itself is not close to the ssd + the ssd is inserted through the plastic shroud so at worst it’ll get heated up by a few degrees by the hot air blown from the gpu heat sink

    • @redslate
      @redslate 1 year ago +122

      If it got that hot, your GPU would be toasted first.

    • @dp_dapper
      @dp_dapper 1 year ago +112

      Imagine your 2TB ssd overheating and cooking your gpu 🙃

    • @daniel4dev
      @daniel4dev 1 year ago +57

      ​@@ichisenzy if your GPU and SSD share the same heat sink then your SSD would actually heat up quite significantly. Heat sinks are designed to be conductors of heat, and your thermal transfer elements (like thermal paste) aren't directional (you can't put thermal paste on upside down or round the wrong way). Therefore if your GPU heats up, the heat sink heats up, the SSD heats up.

    • @daniel4dev
      @daniel4dev 1 year ago +40

      @@redslate Pretty sure SSDs have a lower maximum operating threshold. So in theory you could actually destroy your SSD, because your GPU will thermal throttle at a temperature higher than is safe for your SSD
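The throttle-threshold concern in the replies above can be put into rough numbers. A minimal sketch with assumed ballpark limits (the ~70 °C SSD throttle point and low-80s GPU throttle point are typical consumer figures, not measurements of this card; the `coupling` factor is an illustrative assumption):

```python
# Typical operating limits -- assumed ballpark figures, not measured values.
GPU_THROTTLE_C = 83  # many consumer GPUs begin throttling in the low 80s
SSD_THROTTLE_C = 70  # consumer NVMe controllers often throttle around 70 C

def ssd_temp_near_gpu(gpu_temp_c, coupling=0.6, ambient_c=25.0):
    """Crude linear model: the SSD sits between ambient and GPU temperature;
    `coupling` expresses how tightly the two are thermally tied."""
    return ambient_c + coupling * (gpu_temp_c - ambient_c)

# Loosely coupled (own bay in the shroud, own airflow): stays under throttle.
print(ssd_temp_near_gpu(80.0))               # 58.0
# Tightly coupled (shared heatsink): crosses the SSD throttle point.
print(ssd_temp_near_gpu(80.0, coupling=0.9)) # 74.5
```

The model supports both sides of the thread: whether the SSD cooks depends almost entirely on how directly it is tied to the GPU's thermal mass.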

  • @LetzBlend
    @LetzBlend 7 months ago +6

    Imagine releasing a custom firmware that uses that ssd slot as an extra vram slot. I might request that.

    • @leonhardmollney8638
      @leonhardmollney8638 6 months ago

      Nice multi tb of vram

    • @LetzBlend
      @LetzBlend 6 months ago

      @@leonhardmollney8638 😂😂

    • @regeneric928
      @regeneric928 5 months ago +1

      There is a reason why we don't use SSDs for RAM or VRAM...
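The reply above is right that flash can't substitute for RAM or VRAM. A back-of-envelope comparison with assumed typical figures (a ~1 TB/s high-end VRAM bus vs. a ~7 GB/s PCIe 4.0 x4 SSD; not numbers from the video) shows the gap:

```python
# Assumed ballpark figures for a high-end card and a fast PCIe 4.0 SSD.
GDDR6X_BW_GBPS = 1000     # ~1 TB/s aggregate VRAM bandwidth
NVME_BW_GBPS = 7          # ~7 GB/s sequential over PCIe 4.0 x4
VRAM_LATENCY_NS = 100     # on the order of 100 ns
NVME_LATENCY_NS = 50_000  # tens of microseconds, even for a good drive

print(f"bandwidth gap: ~{GDDR6X_BW_GBPS // NVME_BW_GBPS}x")   # ~142x
print(f"latency gap:   ~{NVME_LATENCY_NS // VRAM_LATENCY_NS}x")  # ~500x
```

On top of the two-orders-of-magnitude gap, NAND wears out after a finite number of writes, which RAM-style traffic would exhaust quickly.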

  • @smoothmiles24
    @smoothmiles24 9 months ago +1967

    "Yo dawg I heard you like SSDs"

    • @41dn
      @41dn 8 months ago +44

      An older meme, but it checks out

    • @Bin0mar
      @Bin0mar 7 months ago +9

      THAT'S MEAN MORE VRAM ??????

    • @Umar_nad
      @Umar_nad 7 months ago +2

      No @@Bin0mar

    • @EricOnline92
      @EricOnline92 7 months ago

      😂😂😂

    • @EthanDoezYT
      @EthanDoezYT 7 months ago +20

      “So we putting an SSD, ON YOU MOTHA FKIN GPU!”

  • @Morgothol
    @Morgothol 11 months ago +1021

    Soon, your SSD needs a SSD

  • @akaHarvesteR
    @akaHarvesteR 10 months ago +2367

    Back in my day, we just used to have more than one PCIe slot.

    • @ohlala9546
      @ohlala9546 10 months ago +123

      You still have. Except for the x16 and probably one m.2, they aren't directly connected to the cpu but handled by the chipset. It reduces the bandwidth (that SSDs especially use to it's maximum) thus using all direct PCIe lanes isn't too stupid.
      But nvidia not using the 16x and artificially throtteling their gpus is fucking stupid.

    • @ErgoProdigy
      @ErgoProdigy 9 months ago +14

      That shares lanes.

    • @krzysztofb9279
      @krzysztofb9279 9 months ago +38

      back in my day there was no pcie. you had isa. and maybe one pci slot. if you wanted to watch a multimedia clip you needed a gpu AND a mpeg decoder card. if you wanted a dvd player a few years later, you needed a dvd decoder card. I miss those days.

    • @irishdrunkass
      @irishdrunkass 9 months ago

      We understand how it works. It's just that the kinds of cards that aren't using 16 lanes are all low-end builds anyway, so it's just kind of gimmicky. Usually hardware innovation comes to high-end risk-taking users first and trickles down. This goes straight to the low end and the mediocre @@ohlala9546

    • @iso-didact789
      @iso-didact789 9 months ago

      @@krzysztofb9279 Not a card just a decoder installed like the k-lite codec pack.
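The lane discussion in this thread is the core of the video's pitch. A quick sketch of the arithmetic, using PCIe 4.0 figures (the x8-GPU detail matches cards like the RTX 4060; the rest is standard spec math):

```python
# PCIe 4.0 runs 16 GT/s per lane with 128b/130b encoding,
# so usable bandwidth is just under 2 GB/s per lane.
GBPS_PER_LANE = 16 / 8 * (128 / 130)

slot_lanes = 16  # a full-length x16 slot
gpu_lanes = 8    # cards like the RTX 4060 only wire up x8
spare = slot_lanes - gpu_lanes

print(f"spare lanes: {spare} -> room for {spare // 4} x4 NVMe drive(s)")
print(f"per-drive ceiling: {4 * GBPS_PER_LANE:.1f} GB/s")  # ~7.9 GB/s
```

As @ohlala9546 notes, the alternative is routing extra M.2 slots through the chipset, which shares uplink bandwidth; reclaiming the slot's otherwise-dead CPU lanes is the one real engineering argument for the design.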

  • @Graxu132
    @Graxu132 3 months ago

    this is actually so good for matx builds

  • @AmirRazan
    @AmirRazan 11 months ago +1447

    "I am all about this if this is the future."
    - Every Tech Company

    • @Klovaneer
      @Klovaneer 11 months ago +45

      narrator: it was not the future

    • @tobiaslewis8285
      @tobiaslewis8285 10 months ago +3

      Hahahaha, 😂

    • @nassozeebo
      @nassozeebo 10 months ago +7

      "I am rooting for whichever team is winning"

    • @JhanOjan
      @JhanOjan 10 months ago +1

      Because future is unpredictable. Every new technology implementation requires try and error. Some stopped in r&d phase, some stopped because users and market didn't receive it quite well. It'll become standard once market received it well and less than 10% negative feedback from the users

    • @Klovaneer
      @Klovaneer 10 months ago +6

      @@JhanOjan This isn't some new technology, just strapping an SSD to the hottest PC component.

  • @047Kenny
    @047Kenny 1 year ago +1332

    Can’t wait to see what computers look like in 10 years. Gonna be one RGB block

    • @northwastaken
      @northwastaken 1 year ago +41

      Mac studio lol

    • @zappalavigna
      @zappalavigna 1 year ago +85

      "What's you're specs?"
      34893020696969 rgb strips

    • @luckyluc25
      @luckyluc25 1 year ago +77

      In 10 years, 16 core - 32 thread, 128Gb of Ram, 400 TFLOPS GPU with 48Gb of VRAM just to run Unreal 7, or 8, at 1080p with no ray tracing at 47 FPS. LOL. I'm probably not to far off.

    • @bryansantillano
      @bryansantillano 1 year ago +7

      @@northwastaken basically lmao

    • @Tommmy1.0
      @Tommmy1.0 1 year ago +2

      Rgb ssd

  • @RogueAI
    @RogueAI 8 months ago +1327

    I wish vram could be expanded like that

    • @GameStationZ
      @GameStationZ 8 months ago +55

      for real...

    • @pakan357
      @pakan357 8 months ago +23

      Radeon Pro SSG. Sort of. Kind of. Eh.

    • @Dumanddd
      @Dumanddd 7 months ago

      @@bck187 lucky you

    • @susrut3664
      @susrut3664 7 months ago +8

      how much vram do you need?

    • @iambicpotato2198
      @iambicpotato2198 7 months ago +88

      This is exactly what crossed my mind when I clicked on this video.

  • @breadmerc5360
    @breadmerc5360 19 hours ago

    Love this, it's a great way to add value to your product and opens the door for ways to Differentiate Your product from the competition in a market Overly controlled by the Chip suppliers.

  • @Henry-Kuren
    @Henry-Kuren 9 months ago +382

    If only they used this tech to store prerenders and shader caches to take load off the GPU.

    • @vladimus9749
      @vladimus9749 6 months ago +59

      I expected it would be used for something like this. Super disappointed it wasn't, especially with how this guy presented it.

    • @clintonweir7609
      @clintonweir7609 6 months ago +9

      Give it time.

    • @darkinfuser_44
      @darkinfuser_44 5 months ago +14

      DirectStorage would be absolutely goated with this because the gpu can just utilise the ssd straight from the card instead of through the motherboard

    • @AcceptYourDeath
      @AcceptYourDeath 5 months ago

      @@clintonweir7609 There is no time because the logical step will be using all 16 lanes for newer GPU generations.
      The idea is flawed untill they start using SSD for direct access for a GPU like consoles do.

    • @jerseyse410
      @jerseyse410 5 months ago +4

      @@flsendzz I would love to be able to store Stable Diffusion directly on my GPU so its already there for use

  • @dietcokepapii
    @dietcokepapii 1 year ago +897

    Taking "direct storage" to a different level 😭

    • @thoreberlin
      @thoreberlin 11 months ago +4

      This is the opposite. Its having less bandwith instead of more with less latency. The SSD is still connectet to the CPU but physically located on the GPU Board that cheaps out on PCIeLanes; limiting the GPU Bandwith to half.

    • @dietcokepapii
      @dietcokepapii 11 months ago +1

      @@thoreberlin it was a joke mate

    • @thoreberlin
      @thoreberlin 11 months ago

      @nandishmunjal1998 Well, there were professional graphics cards that had direct storage from AMD: Radeon Pro SSG. Not a joke ;)

    • @xanderwusky
      @xanderwusky 11 months ago +3

      @@thoreberlin that is for now though, whos to say that cant be developed? Its got to at least be something they are thinking about. What if you could have a drive that is optimised for direct storage that will get gpu access directly with cpu second. If that would be possible you could probably do some crazy stuff too if you need a lot of vram but not as fast for example. ssds are so fast already that it can probably be fast enough for a lot of uses if it was accessible by the gpu directly

    • @royk1374
      @royk1374 11 months ago +1

      @@thoreberlin Unless it's used to load directly from the NVME, it might be possible that the GPU could use it for direct storage. Like he said, this GPU doesn't use the full bandwidth anyway and the nvme wouldn't either. So it might be profitable for optimized games. From NVIDIA:
      GPUDirect® Storage creates a direct data path between local or remote storage, such as NVMe or NVMe over Fabrics (NVMe-oF), and GPU memory.
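The GPUDirect Storage quote above is about skipping the CPU bounce buffer. A hedged back-of-envelope model of what that saves when streaming assets (the ~7 GB/s SSD and ~31.5 GB/s x16 link are assumed PCIe 4.0 figures, and CPU copy overhead is deliberately left out):

```python
ASSET_GB = 8     # e.g. a large game's texture set (illustrative size)
SSD_GBPS = 7.0   # PCIe 4.0 x4 NVMe, sequential read
X16_GBPS = 31.5  # PCIe 4.0 x16 link to the GPU

# Classic path: SSD -> system RAM, then RAM -> VRAM over the x16 link.
bounce_s = ASSET_GB / SSD_GBPS + ASSET_GB / X16_GBPS
# Direct path: the SSD DMAs straight into VRAM in one hop.
direct_s = ASSET_GB / SSD_GBPS

print(f"bounce: {bounce_s:.2f} s, direct: {direct_s:.2f} s")
```

The raw transfer-time saving is modest; in practice the bigger win of a direct path is removing the CPU memcpy and page-cache work that this simple model omits.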

  • @johng3292
    @johng3292 11 months ago +603

    Can't wait to upgrade my 12GB VRAM to 1TB.

    • @ShiggyCompPt2
      @ShiggyCompPt2 11 months ago +23

      It’s not for vram

    • @ryanpongracz8051
      @ryanpongracz8051 10 months ago +87

      ​@@ShiggyCompPt2it should be

    • @ShiggyCompPt2
      @ShiggyCompPt2 10 months ago

      @@ryanpongracz8051 its for storage, not vram. The gpu uses 8 lanes, They put a ssd on the gpu to make the most out of the x16 slot.

    • @Arterexius
      @Arterexius 10 months ago +28

      @@ryanpongracz8051 That would require a whole other type of solid state storage. If you just loaded, unloaded and flashed a standard M.2 disk like it was RAM, you'd have a fried M.2 disk very, very quickly

    • @SckMyPopsickle
      @SckMyPopsickle 10 months ago +7

      It does not require any m.2 storage at all , we're talking about vrams, they are totally different things, all they're saying is that the heat issues from m.2 drives can be cooled by your graphics card.
      Excuse me for bad English

  • @NotGoofyBenanymore
    @NotGoofyBenanymore 2 months ago

    Dude that’s actually awesome. I can’t wait to hear more abt these future projects with pcs. Thanks Zach!

  • @duderobi
    @duderobi 11 months ago +362

    Imagine putting the SSD on the Hottest place in the whole PC

    • @LuisPerez-5
      @LuisPerez-5 9 months ago +10

      Genius

    • @legominimovieproductions
      @legominimovieproductions 9 months ago +42

      And the gpu in the video has the ssd mounted on the back, so basically the part of that gpu that has the absolute least cooling on the entire gpu

    • @jankrnac3535
      @jankrnac3535 9 months ago +6

      CPU is still hotter but yeah.

    • @legominimovieproductions
      @legominimovieproductions 9 months ago +16

      @@jankrnac3535 not necessarily, my cpu runs at 69 to 70 degrees under full load, my gpu goes up to 85 under full load

    • @yeahh_idk_
      @yeahh_idk_ 9 months ago +2

      @@legominimovieproductionsNot always.

  • @cullenmanning141
    @cullenmanning141 1 year ago +797

    SSD: “AHHHHH GOD IT BURNSSSS AHHHHHH PLEASE END MEEEEEE GOD IT BURRRRR-“

    • @fennex7575
      @fennex7575 1 year ago +67

      "MICHAEEEL
      DONT LEAVE ME HEEERE
      MIICHAAEEL"

    • @cardoman3986
      @cardoman3986 1 year ago

      ruclips.net/user/shortsr7UG5bbjM60?si=hpy9siEeMNvbPC5M

    • @Chomta
      @Chomta 1 year ago +8

      @@fennex7575 hey Vsauce Michael here

    • @juliandelao4570
      @juliandelao4570 11 months ago

      @@fennex7575 I don't even want to tell you how bad I want to punch you for being a cringe furry

    • @Bakedfrijoless
      @Bakedfrijoless 11 months ago +2

      “that if you confess with your mouth the Lord Jesus and believe in your heart that God has raised Him from the dead, you will be saved.”
      ‭‭Romans‬ ‭10‬:‭9‬ ‭NKJV‬‬

  • @Southpawarsenal
    @Southpawarsenal 1 year ago +153

    At first I thought this was implying that we are now using NVMe drives as VRAM lmao

    • @astral_haze
      @astral_haze 11 months ago +9

      needed since nvidia scammed me out of 4 gb of vram (they make worse laptop gpus under the same name)

    • @arashkmahshidfar7780
      @arashkmahshidfar7780 11 months ago +4

      @@astral_haze that's on you for not doing research. The chip on each laptop card is the same as the one below its equivalent on desktop. So a 4090 mobile would be a 4080 on desktop, a 4080 mobile would be a 4070 desktop, etc.

    • @irnehhenri
      @irnehhenri 11 months ago +15

      @@arashkmahshidfar7780 I don't think it's fair to blame the consumer for "not doing research" when it is *vidia who is doing the fraudulent advertisement, like you basically explained. A rational person would expect a "3080 (laptop)" to be almost exactly the same as a "3080 (desktop)" if they're both called "3080"...

    • @arashkmahshidfar7780
      @arashkmahshidfar7780 11 months ago +3

      @@irnehhenri I see the implications and I know that the naming scheme sucks, but also everyone knows laptops are slower than PCs. On top of that, every website and computer store I went to says the amount of VRAM what they are trying to sell you has. So in practice, if you say they scammed you, that's your fault for not knowing what is basically common info for the PC enthusiast.

    • @luminousfractal420
      @luminousfractal420 11 months ago +1

      It got me too. Who woulda thought it was something even stupider 😂

  • @killerbunny9535
    @killerbunny9535 6 days ago +1

    Let's just rember that this is all 20 year old military technology.

  • @soumyadeepmitrayo94
    @soumyadeepmitrayo94 1 year ago +407

    This is probably the most useful feature for RTX 4060 😆😆

  • @Al13n1nV8D3R
    @Al13n1nV8D3R 1 year ago +721

    This is how you can cook an NVME in no time!

    • @-FAFO-
      @-FAFO- 1 year ago +47

      Exactly what i was thinking, that thing is gonna get toasted.

    • @FingalPersson
      @FingalPersson 1 year ago

      LoL! Uh... LOL im just gonna state that again without saying any further and see if you can understand how dumb you just sound by saying so LOL!!! :-D

    • @FingalPersson
      @FingalPersson 1 year ago +28

      @@-FAFO- Mainly the head you read of your GPU comes from the Vram and GPU-processor... And in the far behind on many cards there are just fins and fans, the actual card is pretty small. And in this video its almost as if there is added some extra space on the card to make it bigger, with no Vram anywhere near, which means the SSD is gonna do fine, usually they aren't cooled by fans at all, and here they are actually getting aircooled. My 4080 looks like a beast, but once you take the shell off and bring the real card out, you'll see its not that big of a card :)

    • @-FAFO-
      @-FAFO- 1 year ago +33

      @@FingalPersson Don't know if you've ever touched a 30 or 40-series card while running but it's hot af. Yes the insides get some air running through, but the SSD will only get some residual cooling from the casing the air is passing through. That's still not as ideal as having it on the motherboard where you can have fans directly blowing onto it. You're going to be running your SSD at constantly hotter temps than you would otherwise, but have at it.

    • @FingalPersson
      @FingalPersson 1 year ago +5

      @@-FAFO- I started building computers back in 2002, when I was 17, so I do know a thing or two about computers I think, and tbh I can touch the rear end of my 4080 while gaming Star Citizen at 2160p any day, it's mostly plastic anyway. I can't touch the fins while the card is running, but I touched the rear end of the backplate and it was not as hot as the GPU and VRAM, which showed 65 and 73 degrees Celsius under heavy gaming. 65 being the GPU and 73 being the VRAM.
      Also got tons of sensors on my motherboard and in the case itself, so I don't need to touch stuff to know the temps, I can read them off software, and so should you. Stop touching the inside of your computer while it's running, it can actually end up in a short circuit and break components in your system.
      I'll also let you know I own an Asus 3070 for my streamer rig and a ROG Strix RTX 4080 OC, so I guess that sorta answers your questions. In fact, I have had a Suprim X 3090 Ti, a 3060, 1070, 1060, 2080 Ti. I did also go for a 980 and I had a 480ti for quite a good while. A card I'll never forget was my Voodoo 2 from 1999, when GTA 2 launched.
      My first 3D card ever was a Sapphire Radeon.
      I've built computers for 21 years, my own, my friends', and their friends' computers I've built and upgraded over the years, and I've assembled (I don't like calling it building anymore; it was not as easy to build computers before as it is today).
      SO, when I say this can actually work, it is because I don't know if you know the temps an NVMe runs at, and I dunno if you have ever tried to run an NVMe without a block on top, either from the motherboard or one that comes with the NVMe, but I'll tell you this, I've done it. I've run a 1st-gen NVMe without any block on top of it, no fan giving it air except for the air that is in the case.
      It never had a problem; in fact it works today, in use in the very same rig it was built in, even though that build is no longer mine, as I gave it away to a friend.
      Usually the readings of my SSD back then were 45-50 degrees, without a block... You wanna know the temps of the 2 NVMes I'm currently using, which have speeds up to 7500/7200? Well, they have the blocks that come with the ROG Strix Z590-E Gaming mobo, and run about 35-45 degrees. Not much difference.
      WD says that the maximum operating temperature for NVMe is 72°C.
      Now with that in mind, that is pretty hot, right? And even at that temp you are still fine, but you don't want to push the temps any further.
      So with that said, imagine having it on my RTX 4080, which btw has outputs for fans on the card in the rear, actually 2 4-pins are on the far end of that card, which means I could easily customize additional fans to blow directly on the NVMe, which would be in the back of the GPU, near the fins, near the last GPU fan, near the intake fans, and with additional fans if you choose, even though I think that would be overkill and unneeded.
      So yeah, I think it would actually not melt at all, and a block on it also helps with temps. And in the very back of my card, as I said, you can touch the backplate and feel that it has some temperature, but NOTHING compared to if you touch the plate in the front, which I would not recommend doing, especially when the system is running.
      So now I have elaborated this a lot; now please elaborate the thoughts that make you so certain this will melt NVMes...

  • @roulifreeman4460
    @roulifreeman4460 1 year ago +120

    What we really need then is the GPU being able to directly decode textures and things from the built-in SSD

    • @EvilTim1911
      @EvilTim1911 11 months ago +24

      I thought this video was going in that direction, THAT would be cool. This will just make the SSD slower because the data has to pass through two interfaces instead of one. Plus the main M.2 slot on a mobo is usually very close to the CPU and there's a good reason for that. So this is adding an interface and moving the SSD further away. No thanks

    • @roulifreeman4460
      @roulifreeman4460 11 months ago +8

      @@EvilTim1911 there is DirectStorage or whatever it's called, for the GPU directly reading from the drive. So it might be useful

    • @soul_pantheon
      @soul_pantheon 11 months ago +1

      @@EvilTim1911 well, while the CPU is in charge of processing the majority of file transfers, you could in turn use the very many processors in the GPU to process the files instead; this is how workstation GPUs work

  • @KrotowX
    @KrotowX 5 months ago

    DirectStorage support will certainly benefit from this, with better game loading and video encoding times as a result. I believe video encoding that saves to the GPU SSD will run faster too.

  • @xGrifDog96x
    @xGrifDog96x 11 months ago +108

    You say the SSD is gonna get cooler from the GPU but my GPU is just gonna overheat my SSD lmao

    • @jax3r
      @jax3r 9 months ago +2

      I’ll stick to my 1080ti that runs at 25C

    • @JFat5158
      @JFat5158 9 months ago +5

      @@jax3r dude, is your ambient temp in the negatives or something?

    • @ashii_ii
      @ashii_ii 9 months ago +13

      @@JFat5158 bro lives in Antarctica

  • @deadselect
    @deadselect 1 year ago +284

    -Wee need direct storage on PC!
    -But we have direct storage at home
    Direct storage at home:

    • @MsHojat
      @MsHojat 1 year ago

      What do you mean? PCs already have DirectStorage. And the drive doesn't need to be near the GPU, it makes no difference.

  • @tni333
    @tni333 8 months ago +281

    I’d be more excited if the gpus just used the full 16 lanes by default

    • @heramaaroricon4738
      @heramaaroricon4738 7 months ago +7

      Same.

    • @myk1_sp
      @myk1_sp 7 months ago +8

      I'm not sure why some GPUs don't utilize all 16 PCIe lanes of the slot they're installed in.

    • @Not_interestEd-
      @Not_interestEd- 7 months ago +29

      ​@@myk1_sp it's cheaper to not do that. That's... The entire reason.
      What? You're disappointed? Sucks to be you, just buy a new card! Casual.

    • @charginginprogresss
      @charginginprogresss 7 months ago +31

      ​@@myk1_sp simply not powerful enough, they can either fully use 8 or half use 16 so they just use 8
      Note this applies to stuff like a 4060, a 4090 definitely uses all 16.
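The x8-vs-x16 point above is easy to sanity-check with rough numbers. A minimal sketch (the per-lane rates are approximate post-encoding figures, and the card examples are assumptions, not measurements):

```python
# Rough PCIe throughput math. Per-lane rates are approximate usable
# GB/s per direction after encoding overhead.
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

# A Gen4 x8 link already moves ~15.8 GB/s, which is more than a
# mid-range card typically streams over the bus in games -- hence
# the negligible loss from running x8 instead of x16.
print(round(link_bandwidth(4, 8), 1))
print(round(link_bandwidth(4, 16), 1))
```

This is why a cut-down x8 card usually benchmarks within a percent or two of the same card forced to x16.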

    • @Godjo_THO
      @Godjo_THO 7 months ago +17

      This is because we don't need that capacity at the moment. Your system is 64-bit; why don't you use 16 exabytes (17179869184 gigabytes) of RAM instead of just 16 gigabytes? Because we don't need that much yet. 32-bit systems could only address 4GB of RAM and we jumped to 64 bits. Be patient; maybe in the future we'll start needing those 8 extra lanes.
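The 64-bit figure quoted above checks out as a quick arithmetic sketch:

```python
# 2^64 bytes of address space expressed in GiB and EiB:
# 2^64 / 2^30 = 2^34 = 17,179,869,184 GiB, and 2^64 / 2^60 = 16 EiB.
address_space_bytes = 2 ** 64
gib = address_space_bytes // 2 ** 30  # bytes -> GiB
eib = address_space_bytes // 2 ** 60  # bytes -> EiB
print(gib, eib)  # 17179869184 16
```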

  • @nitrodion2005
    @nitrodion2005 7 months ago

    Actually an insane concept that would sell amazingly

    • @zacharysliva4020
      @zacharysliva4020 6 months ago

      There already was a product with this exact concept: the Radeon Pro SSG

  • @ZachMcphersonMusic
    @ZachMcphersonMusic 1 year ago +560

    Ahh yes i definitely need something that will warm my GPU up even more

    • @erictsui8802
      @erictsui8802 1 year ago +20

      They tested the temps; nothing overheated.

    • @Levinas-od5yx
      @Levinas-od5yx 1 year ago +41

      @@erictsui8802 GPU w/o M.2 slot = x temp, GPU with M.2 slot = x temp + "nothing was heated".
      It's as if you don't understand simple logic or what the word "warmer" means.

    • @erictsui8802
      @erictsui8802 1 year ago +18

      @@Levinas-od5yx If you understood Chinese, you could just check the test section of the original video. But you don't understand it, and you're still here writing a very simple equation intended to make your point; how sad is that?

    • @erictsui8802
      @erictsui8802 1 year ago +10

      @@Levinas-od5yx Just for pity's sake, I'll outline the test results for you. The SSD in this design is mounted opposite to how it normally is, with all the heat sources of the SSD in direct contact with the GPU heatsink, and the SSD temp stayed at 42 degrees C after a 10-minute Furmark GPU test.

    • @erictsui8802
      @erictsui8802 1 year ago +9

      @@Levinas-od5yx Oh, the GPU was 62 degrees C after 10 minutes of Furmark testing, lest you say I missed the point

  • @I_Am_Empyrean
    @I_Am_Empyrean 10 months ago +322

    Nah, what's really cool about this is the potential for the GPU to directly access the NVMe storage and essentially use it as a tier-2 memory cache. That's only if the storage controller for the NVMe is on the GPU rather than the mobo/CPU, though.

    • @hasangonendemirler9756
      @hasangonendemirler9756 9 months ago +9

      That WAS a thing back then.

    • @skizm5804
      @skizm5804 9 months ago +65

      no, what would really be cool is if games could be designed knowing that this is an option, and be able to install textures directly to a gpu-attached ssd, meaning that data could be loaded into vram much faster, with much less overhead, and using zero pcie bandwidth

    • @daniillasso2916
      @daniillasso2916 9 months ago +5

      ATI already did this,
      It didn’t pan out

    • @skizm5804
      @skizm5804 9 months ago +20

      @@daniillasso2916 that doesn't mean it was a bad idea, just came too early and didn't give it long enough to be adopted

    • @michaelhanson5773
      @michaelhanson5773 9 месяцев назад

      ​​@@daniillasso2916it was also ATI and not nvidia, which lets be honest, really is the dominating market so if they do something it has a better chance of making it further than if ATI tried it.

  • @Traymark
    @Traymark 1 year ago +374

    I had been paralyzed for 7 years before watching this video, but this video gave me enough energy to get on my feet to open my window and jump out.

    • @eye9have5you35
      @eye9have5you35 11 months ago +9

      Seems that jump at most paralyzed your legs this time 😜

    • @PrivateAccount80527
      @PrivateAccount80527 11 months ago +1

      fairs

    • @Valery0p5
      @Valery0p5 11 months ago +6

      Then you remember you are at the ground floor

    • @MoswenMedia
      @MoswenMedia 11 months ago +3

      Glad I'm not the only one who was disappointed you can't dislike a video twice

    • @LazySpaceRaptor
      @LazySpaceRaptor 11 months ago +1

      Are you paralyzed again?

  • @fly2841
    @fly2841 1 month ago +1

    "bro i got the 8TB graphics memory 4060"
    -bro's last words

  • @tadashiminami7595
    @tadashiminami7595 11 months ago +247

    Companies are getting more and more clever about how to get consumers to spend

    • @X_irtz
      @X_irtz 9 months ago +15

      I mean, yeah. That's the point of a business lmao.

    • @quest2782
      @quest2782 9 months ago +9

      @@X_irtz and they could lose that business by overcharging and nickel-and-diming their consumers.

    • @Nim...
      @Nim... 9 months ago +4

      @@quest2782 That's why competition is a good thing.

    • @HunterOfHalfLife
      @HunterOfHalfLife 9 months ago +8

      @@Nim... yeah until they all start offering the same price

    • @disguiseddv8ant486
      @disguiseddv8ant486 9 months ago +6

      People complain about companies not being innovative enough. Then when the company shows consumers its innovations, the consumers complain about the cost, about how it's not necessary, about it being too big/too small, about the color, etc. etc.

  • @Fernandosampaio_
    @Fernandosampaio_ 1 year ago +405

    Today on How to cook your SSDs and lose your data😂

    • @extendedclipricco3214
      @extendedclipricco3214 1 year ago +15

      right, imagine this came out on the 4090 FE THE ROFLS WOULD BE R34L

    • @xero6774
      @xero6774 1 year ago +78

      @@extendedclipricco3214 please never say that again

    • @kingeling
      @kingeling 1 year ago

      @@extendedclipricco3214 what the fuck

    • @benduffy4223
      @benduffy4223 1 year ago +12

      I couldn't tell you how many SSDs mounted under the GPU I've had to replace because of this.

    • @rambu3013
      @rambu3013 1 year ago +26

      @@xero6774 my man speaks early 2000s internet

  • @jamesmnguyen
    @jamesmnguyen 1 year ago +224

    We're slowly evolving PCs into SoCs.

    • @Demopans5990
      @Demopans5990 1 year ago +26

      Well, AMD did get there first. I'd expect GB of cache from them soon

    • @jamesmnguyen
      @jamesmnguyen 1 year ago +8

      @@Demopans5990 Looks like we might get there in 5-10 years.

    • @VanSanProductions
      @VanSanProductions 1 year ago +18

      @@Demopans5990 still waiting for an AMD SoC on par with consoles. If they make a new socket with RAM/CPU/GPU all-in-one, that'll be great. They are not catching Nvidia, so they should just start doing something different instead.

    • @RainyFoxUwU
      @RainyFoxUwU 1 year ago +6

      @@VanSanProductions bro, AMD literally did this. You can get a board with RAM, VRAM etc and the Jaguar cores from the Xbox... and stuff it in an ATX case.

    • @Sherolox
      @Sherolox 1 year ago +2

      @@Demopans5990
      Didn't Gamers Nexus make a News video about that?
      1GB cache already exists on high end server/workstation CPUs, iirc.
      Edit: Yeah, search for it. It already exists.

  • @shivs.goswami5379
    @shivs.goswami5379 4 months ago +3

    YouTuber: Are you excited?
    Broke me: Oh Hell No! I'm Good! Thank You!

  • @testtube173
    @testtube173 11 месяцев назад +742

    imagine being excited to overheat your GPU by attaching a SSD to it.

    • @koollio9724
      @koollio9724 11 months ago +5

      💀

    • @DesroQc
      @DesroQc 10 months ago +169

      Imagine thinking that your GPU will overheat because of an SSD...

    • @koollio9724
      @koollio9724 10 months ago +51

      @@DesroQc depending on the gpu the thermal situation would be far from ideal to say the least

    • @thomaskg3802
      @thomaskg3802 10 months ago +12

      "He's all about it, if it's the future"

    • @DanielDuhon
      @DanielDuhon 10 months ago +7

      @@DesroQc it's entirely possible if you're running higher settings than your hardware can handle…

  • @revenant_z
    @revenant_z 1 year ago +87

    Year 2090: ughhhh, my 10000 terabyte gpu is full again

    • @NickNieves112
      @NickNieves112 1 year ago +3

      Well, duh... you're still using measurements and terms like Terabytes from over sixty years ago. It's like when people from the 2020's still measured everything in bits... Wait, they didn't do that then either?!

    • @revenant_z
      @revenant_z 1 year ago +6

      @@NickNieves112 well, sorry! Misspelling. I meant 10000 gixabyte

    • @AbsoluteHuman
      @AbsoluteHuman 1 year ago +1

      ​@@revenant_z it's just 10 petabytes

    • @revenant_z
      @revenant_z 1 year ago +5

      @@AbsoluteHuman but its still a lot 😭 i am poor

    • @silvioantonio6952
      @silvioantonio6952 11 months ago

      They will be using googlebytes 😂😂😂

  • @PFG888
    @PFG888 1 year ago +75

    That is definitely good, but I'm more excited for the Frore Systems AirJet coolers to become available to the public. The AirJet could be a great solution to overheating M.2 drives.

    • @damohr69
      @damohr69 1 year ago +2

      this

    • @xAndrzej42
      @xAndrzej42 1 year ago +1

      I would be happier if they made PCIe x32. These GPUs have 2x, maybe even 4x, more computing power than PCIe can feed with data.

  • @achour.falestine
    @achour.falestine 4 months ago +1

    He sounds so happy

  • @limitlessrelaxation08
    @limitlessrelaxation08 1 year ago +335

    "How to cook your NVME 101"

    • @germanmosca
      @germanmosca 1 year ago +10

      How would it cook from this? You do realise that the cooling solutions of current GPUs are oversized? There is more than enough cooling reserve in those designs to take care of several NVMe SSDs.

    • @ichisenzy
      @ichisenzy 1 year ago +46

      @@germanmosca SSDs get damaged if they go over 60 degrees, and GPUs go way above that, even on a 4060

    • @gabor6200
      @gabor6200 1 year ago +7

      @@ichisenzy Bruh my laptop SSDs are constantly getting cooked

    • @ichisenzy
      @ichisenzy 1 year ago +32

      @@gabor6200 yeah you should expect that on a gaming furnace

    • @vonshtoyven3060
      @vonshtoyven3060 1 year ago +2

      M.2 ssd heatsinks exist, even with their own fans.

  • @cappuccino-1721
    @cappuccino-1721 1 year ago +154

    Use that SSD as VRAM… now that would be some good stuff!

    • @SlofSi
      @SlofSi 1 year ago +12

      That came to my mind too. If the enormous number of gigabytes on an SSD could somehow be used as an extension of VRAM. Of course it's a lot slower than normal VRAM, but it would at least be close to the GPU, unlike a normal SSD.

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 1 year ago +27

      Wouldn't want to write much of anything to it considering how much data a GPU goes through; it would kill the SSD fast. For reads it's ok.

    • @7kortos7
      @7kortos7 1 year ago +1

      my thoughts exactly! or even a dedicated slot for AMD SAM (smart access memory)

    • @7kortos7
      @7kortos7 1 year ago

      @@SlofSi
      AMD smart access memory is kind of in line with that.
      would be really cool to see.

    • @CosmicCustodian
      @CosmicCustodian 1 year ago +4

      This is what I was thinking he was gonna say, but nope, just more storage 😩

  • @viddesigner980
    @viddesigner980 1 year ago +37

    Not only does the GPU fan cool off the SSD, the GPU also transfers its heat to the SSD itself

    • @miranda.cooper
      @miranda.cooper 1 year ago +2

      Toasty

    • @farbenpracht
      @farbenpracht 1 year ago +3

      ikr.. putting the SSD on my already 80°C GPU that is at its cooling limits does not really sound right 🤷‍♂️

  • @BIeechy
    @BIeechy 2 months ago

    micro builds would go crazy with those

  • @VanBourner
    @VanBourner 1 year ago +49

    Heat transfer has some specifics as to why mounting SSDs onto GPUs is a bad idea.
    Sure, it can cool your SSD. But only if your GPU is not producing any heat or produces less heat than the SSD itself. And GPUs generate a lot of heat, in most setups they are the main part you are trying to cool down. And that heat goes into the heatsink and PCB. And from that PCB or heatsink it would transfer to the SSD.
    Upper limit for SSD temps is between 70-80 degrees Celsius. GPUs often operate at those temps during games and if the heatsink is at 60 degrees during gaming session, an SSD that would be otherwise at 45-50 is going to soak up the heat and get warmer.
    So while in theory it seems like a good concept, you'd need to thermally isolate the cooling, otherwise you'd just warm up the SSD for no reason.
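The soak effect described in this comment can be sketched with a simple lumped thermal model. All the numbers below (thermal resistance, heat capacity, temperatures) are illustrative assumptions, not measurements of any real card:

```python
# Newton's law of cooling: an SSD thermally coupled to a plate drifts
# toward the plate's temperature with time constant tau = R * C.
def ssd_temp(t_plate_c: float, t_start_c: float, r_c_per_w: float,
             c_j_per_c: float, seconds: float, dt: float = 1.0) -> float:
    """Integrate dT/dt = (T_plate - T) / (R * C) with simple Euler steps."""
    t = t_start_c
    for _ in range(int(seconds / dt)):
        t += dt * (t_plate_c - t) / (r_c_per_w * c_j_per_c)
    return t

# Mounted against a 60 C heatsink, a 45 C SSD drifts upward, not down,
# approaching 60 C after ten minutes.
print(round(ssd_temp(60.0, 45.0, r_c_per_w=5.0, c_j_per_c=20.0, seconds=600), 1))
```

The direction of heat flow is the whole argument: the SSD can only be "cooled" by the GPU's heatsink if the heatsink is colder than the SSD would otherwise run.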

    • @MünWeeew
      @MünWeeew 1 year ago +3

      the i9-13900K says hi

    • @josephschultz
      @josephschultz 1 year ago +3

      @@MünWeeew oh you mean my sandwich warmer it works as well as my microwave LOL

    • @cristianJoker2512
      @cristianJoker2512 1 year ago +1

      Finally someone who gets it right. This is just like the SD card burning issue on the ROG Ally: same spot, near the transistors.

    • @MünWeeew
      @MünWeeew 1 year ago

      Water cooled GPU and SSD lmfao

  • @hasangonendemirler9756
    @hasangonendemirler9756 9 месяцев назад +36

    Back in the history, graphics cards had user installable RAM slots. I was excited we were going back to the future for a second.

    • @SaraMorgan-ym6ue
      @SaraMorgan-ym6ue 6 months ago +1

      the only problem is that it adds heat to the GPU, which makes a lot of heat by itself; that's why they are getting so damned fat🤣

    • @ParadoxISPower
      @ParadoxISPower 6 months ago

      @@SaraMorgan-ym6ue The GPU chip is bigger, that's why they are thicker, and the GPU generates more heat than the SSD, so it isn't adding heat. GPUs triple in size and complexity, so do cards; it's 1:1.
      In a closed system, adding heat will increase the temperature, but it will not rise infinitely. The temperature will eventually reach an equilibrium where the heat added is balanced by the heat lost to the surroundings. The temperature will not exceed the heat source because the system will reach a point where the heat input is balanced by the system's ability to dissipate heat. This is known as thermal equilibrium.

    • @JessicaMorgani
      @JessicaMorgani 4 months ago

      @@ParadoxISPower Number still goes up

  • @redslate
    @redslate 1 year ago +30

    This is actually pretty cool.
    Kinda like how sound cards would pull double duty as a joystick adapter back in the day.

    • @djDenimusic
      @djDenimusic 1 year ago +2

      Ahh man, i forgot about those times.. pretty cool

  • @Jv-tx1kx
    @Jv-tx1kx 11 months ago +133

    It'd be really useful if the GPU could access the SSD and copy data straight to VRAM; otherwise the data still has to be staged in main memory and then copied to the GPU, which doesn't change anything

    • @banaana1234
      @banaana1234 11 months ago +21

      Did you even watch the video? It's not about performance, but putting the PCIe lanes to use.

    • @stargazersdance
      @stargazersdance 11 months ago +21

      @@banaana1234 Yeah, and they're saying that it'd be useful if it did that.

    • @MorganSaph
      @MorganSaph 11 months ago +12

      @@banaana1234 Which is about performance anyway.
      "Look at all this empty space. It could be used for something"
      "Maybe it could be used for ?"

    • @orbatos
      @orbatos 10 months ago +4

      None of this is about memory allocation at all, just direct PCIe addressing of the extra device, which is supported in hardware already.

    • @barrettgeibel9538
      @barrettgeibel9538 10 months ago

      It's called Resizable BAR. Look it up!

  • @ileapsy2071
    @ileapsy2071 11 месяцев назад +50

    what would be more cool is to see something similar to that but as "VRAM upgrade" like you could buy VRAM and upgrade VRAM capacity of your GPU

    • @imo098765
      @imo098765 11 months ago +3

      that's not going to work, SSDs are far too slow; imagine running DDR2 with increased latency instead of GDDR6/X
      You're going to have a stuttery mess. Not only that, the traces to the memory modules are all the same length; this SSD would be completely off the timing.
      It can never work as a VRAM upgrade because VRAM is so much faster and integrated into the design of the board
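The gap this reply is pointing at is easy to put in numbers. A rough order-of-magnitude sketch (spec-sheet ballparks I'm assuming here, not benchmarks of any specific drive or card):

```python
# Sequential read of a fast PCIe 4.0 NVMe drive vs the memory bandwidth
# of a GDDR6-class VRAM bus (both in GB/s, assumed ballpark figures).
nvme_gbps = 7.5
gddr6_gbps = 500.0

ratio = gddr6_gbps / nvme_gbps
print(round(ratio))  # ~67x: VRAM is roughly two orders of magnitude faster
```

And that is before counting latency, which is a far larger gap still, which is why flash works as storage behind VRAM but not as VRAM.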

    • @ileapsy2071
      @ileapsy2071 11 months ago

      @@imo098765 I mean who knows, we live in a world where technology is exponentially developing itself 🤷‍♂

    • @adamstewarton
      @adamstewarton 11 months ago

      VRAM is dirt cheap, so they won't allow it; they want to charge us a fortune for more VRAM on their cards.

    • @sebastien1047
      @sebastien1047 10 months ago +1

      It was a thing before, when there were RAM slots on GPUs

    • @endermaster08
      @endermaster08 10 months ago

      From what I've heard, you could do this on very old graphics cards.

  • @clintmurphy5436
    @clintmurphy5436 11 месяцев назад +36

    Imagine a GPU... that comes with ability to add extra VRam to the card. This needs to happen.

    • @sebay4654
      @sebay4654 9 months ago +3

      This is what I want

    • @hajjdawood
      @hajjdawood 9 months ago

      This is exactly what Apple does with Apple Silicon. Whatever RAM you get also goes towards the GPU. So if you order 256 GB of RAM, you'll also have 256 GB of video RAM.

    • @obj_obj
      @obj_obj 9 months ago +5

      @@hajjdawood that's also what every single other laptop or desktop PC with integrated graphics does, but Apple markets it like they came up with the idea

    • @hajjdawood
      @hajjdawood 9 months ago

      @@obj_obj There are several key differences. First, it's actually good integrated graphics (scoring better than the 3090 and worse than the 4090). Second, because it's an SoC and the RAM is on the same package, the memory bandwidth is 400 GB/s, versus a normal integrated CPU/GPU like the Intel i9, which runs at only 89 GB/s.
      I think it's a bit unfair to act like Apple didn't make a very strong leap in CPU technology. Non-Apple laptops are struggling to keep up with the nearly 4-year-old M1 chips, especially when it comes to performance per watt.
      Apple almost never invents anything new, they've admitted as much many times over, but when they do something they typically do it really, really well. iPads, Apple Watch, the M-series chips, the iPod, even smartphones (iPhone): Apple was definitely not the first, but they did such a great job that it leapfrogged them past the competition for years and years.

    • @kentaronagame7529
      @kentaronagame7529 9 months ago +1

      @@hajjdawood This feature is extremely old and has been used by Windows machines forever. However, if you actually think you can buy 256GB of RAM and allocate it all to VRAM, you're not just uneducated on the subject, you don't even know how Macs actually work, because what you're claiming isn't even possible on an M3 Max.

  • @benji-menji
    @benji-menji 22 days ago +1

    This could also be great for storing compiled shaders so they can be accessed faster than through the motherboard SSD. Perfect for when RAM isn't big enough and runtime compiled shaders require too much GPU power.

  • @Cold_waffles
    @Cold_waffles 9 months ago +24

    They should do this with VRAM. All my problems will be solved

  • @geografiainfinitului
    @geografiainfinitului 11 months ago +53

    In another timeline:
    Internal Email: NSA can now use the GPU to scan your files...

    • @Papitoti
      @Papitoti 9 months ago +2

      Don't worry mate, you're most likely not interesting enough to be monitored

    • @OsiDio
      @OsiDio 9 months ago +1

      @maxjes5 this. Hahaha, unless you've got a few files that'll land you in prison, I wouldn't worry. If the NSA wanted into your network, they'd be there.

    • @prophetzarquon1922
      @prophetzarquon1922 8 months ago +2

      With trust replacing secure practices, stego usage getting really creative, and the price of meta-analysis at an all-time low, everything is now "interesting enough" to be monitored; it's just that very few things require overt intervention

    • @maxheadman59
      @maxheadman59 8 months ago

      @@Papitoti haha, that's what you think...

    • @Papitoti
      @Papitoti 8 months ago

      @@maxheadman59 Why would I care? I don't live in the US.

  • @orangeapples
    @orangeapples 1 year ago +12

    As an ITX builder this sounds amazing. No need for storage options on the back of the motherboard with no airflow.

    • @Terraqueo22
      @Terraqueo22 1 year ago +2

      then you put an SSD on a piece of hardware that normally gets near 80 degrees?

  • @MuqriBlue
    @MuqriBlue 7 months ago

    I kinda wish that low-end, entry-level or mid-range cards could handle any game at 4K max settings.

  • @informagico6331
    @informagico6331 1 year ago +114

    This idea could be nice for letting the GPU access more memory to load game objects and/or shaders without involving many CPU instructions in the process.
    Edit: in terms of overall computation, the drivers that let GPUs and CPUs communicate follow interface rules that may not be as optimal as the GPU hardware loading information from its own disk. For those who say this is a limitation, it really isn't: during the "long" initialization of the game/OS, the GPU gains one more read/write channel that is independent of the computer's main disk, leading to faster effective load times because the GPU avoids two things:
    1. Driver control instruction overhead (may differ between applications)
    2. OS main-disk read/write operations that may be needed for other internal system purposes, or non-parallel instructions that require high-performance CPU clock time.
    This also introduces a potential new paradigm where GPU programming becomes essential, letting programs store the GPU's own internal files/resources on a totally separate memory channel without generating overhead from main-disk requests.
    So yeah, it is just better whenever disk speeds are similar. And the people throwing arguments at each other should stop and realize they're not spreading very accurate information, when they could be offering a different view of the trade-offs instead of a useless fight.
    Thanks.

    • @The_Keeper
      @The_Keeper 11 months ago +8

      Right, that was my thought process too.
      Letting it act as an on-card cache.

    • @PEACEMAKERS_OFFICIAL
      @PEACEMAKERS_OFFICIAL 11 months ago

      True

    • @4doorsmorewhors
      @4doorsmorewhors 11 months ago +1

      @@The_Keeper the memory on a GPU isn't close to being the same as an SSD LOLL

    • @4doorsmorewhors
      @4doorsmorewhors 11 months ago +2

      @@The_Keeper GPU memory is GDDR6... An SSD would never come close to being as fast. I can't believe you guys thought you could just strap a 1tb SSD on a GPU and call it a day 😂
      Edit: GDDR6 NOT GDDR6X

    • @4doorsmorewhors
      @4doorsmorewhors 11 months ago

      @@micaheiber1419 what does closer proximity have to do with anything?

  • @maskednil
    @maskednil 10 months ago +224

    EA's graphic card: We've locked out a portion of the RAM. Pay 9.99 a month for full access, or 59.99 a year for full access.

    • @OldRedOwl
      @OldRedOwl 9 months ago +2

      Barf. They would do this too

    • @piotrkowalski3869
      @piotrkowalski3869 9 months ago +5

      @@OldRedOwl they already do this: all 40-series cards use practically the same PCB, they just install less on each model. The cons are less performance and lower image quality; the pro is a cheaper price.

    • @AndreyCata
      @AndreyCata 8 months ago +2

      You know, BMW thought about that. They tried to sell a subscription for heated seats. 😅😅😅
      We can definitely see something like that on PC soon. Hopefully, people won't fall for it. Otherwise we are F***d😢

    • @lexecomplexe4083
      @lexecomplexe4083 8 months ago +3

      Intel actually did this on CPUs at one point. People were not happy.

    • @lexecomplexe4083
      @lexecomplexe4083 8 months ago +1

      @@piotrkowalski3869 this.. isn't even remotely close to the truth. You don't understand how GPUs are manufactured

  • @italianbasegard
    @italianbasegard 8 months ago +97

    These graphics cards require you to split the x16 lanes into x8 for the GPU and x4 for the NVMe drive using what's called "PCIe bifurcation". Even attempting to use such an SSD on a GPU like this requires your motherboard to support PCIe bifurcation.
    I've run server disks using PCIe bifurcation before. It is a nuisance to set up, a nuisance to manage, and the payoff is negligible (assuming you have spare M.2 slots on a standard motherboard, use those instead).
    So no, no I am not excited for this GPU or this approach at all lol.
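The lane budgeting being argued over in this thread can be sketched in a few lines. This is a hypothetical config layout for illustration, not a real BIOS or driver interface:

```python
# One physical x16 slot split among devices; a bifurcated layout is
# only valid if the requested widths fit the slot's lane budget.
SLOT_LANES = 16

def valid_bifurcation(devices: dict) -> bool:
    """True if the requested per-device lane widths fit the x16 slot."""
    return sum(devices.values()) <= SLOT_LANES

# The split this card proposes: x8 GPU + x4 per NVMe drive.
print(valid_bifurcation({"gpu": 8, "nvme0": 4, "nvme1": 4}))  # True: 8+4+4 = 16
print(valid_bifurcation({"gpu": 16, "nvme0": 4}))             # False: over budget
```

This also illustrates the counterpoint raised in a reply below the original comment: a card that only uses x8 leaves half the slot's lanes free, so the SSD's x4 doesn't take anything away from the GPU.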

    • @SirPano85
      @SirPano85 7 months ago +7

      But I feel the marketing team will like it...

    • @GrugGaming
      @GrugGaming 7 months ago +2

      I could imagine this adding additional input lag tbh

    • @RoganKayle
      @RoganKayle 6 months ago +2

      What a load of nonsense lol. What you say is simply not true.

    • @Klaevin
      @Klaevin 6 months ago +8

      I don't think you understand what's going on here. Bifurcation is a pain because it turns 1 link into 2.
      This GPU only uses 8 lanes, so 8 lanes are still available for something else. 2 NVMe drives, for example, which only use 4 lanes each.

    • @maciejgada740
      @maciejgada740 5 months ago +3

      1. Most AMD motherboards support bifurcation by default, but motherboard manufacturers do not include a BIOS interface to configure it.
      2. This does not require lane splitting.

  • @kairon156
    @kairon156 26 days ago

    Great timing for this video.
    I was recently wondering if video card slots needed an update in order to improve GPU tech. I didn't realize they had lanes to spare, but with how heavy graphics cards are, they need all the mobo support they can get.
    Note: I got a video card stand recently.

  • @keelysmash
    @keelysmash 9 months ago +78

    Honestly this could be super cool if GPUs in the future had like expandable VRAM

    • @lexecomplexe4083
      @lexecomplexe4083 8 месяцев назад +14

      Actually, they used to in the past. I still remember an old card from my early days that had 4 slots to plug in memory modules on the gpu for expandability. The only reason we don't see it anymore is because the timing of the memory modules running through the circuit to the gpu die itself would be too slow for modern components. Faster than you can think, but still not fast enough to keep up. The memory modules would have no way to compete with memory directly on the gpu die or on the card immediately next to it. Sad because it was such a novel idea and it was brilliant at the time. 500MB GPu, quadrupled to 2GB. That was a ton back then. Having 1GB even was like, baller.

    • @keelysmash
      @keelysmash 8 months ago

      @@lexecomplexe4083 Yes, I'm certainly much newer to PC building than you; the first GPU I ever bought for myself was a 1080

    • @SuperPickle15
      @SuperPickle15 8 months ago +8

      But how will Nvidia be able to make money? ;(

    • @Nocturnalphil123
      @Nocturnalphil123 8 months ago +1

      The 4070 would easily become a commonly used card😊

    • @gargolgaming8101
      @gargolgaming8101 6 months ago

      I have an old computer with an iGPU on board, with a slot for a VRAM card

  • @olympiaalkisbarmagiann7859
    @olympiaalkisbarmagiann7859 1 year ago +46

    Me who uses a 13-year-old laptop:

  • @-us3if
    @-us3if 11 months ago +365

    Putting an SSD on an oven is the most idiotic thing I've heard.

    • @harujp2
      @harujp2 11 months ago +24

      It'll be on the coolest part of the card. Add a layer of thermal transfer tape to it, touching the rad, and it can effectively cool the 2nd-hottest component with no effort ;) I would genuinely love to see thermals with/without an SSD on the card. I think it's a really good concept, especially since GPUs typically vent their hot air directly out of the case. Well, a lot of the air

    • @adamdunne6645
      @adamdunne6645 11 months ago +13

      Dude, that end of the GPU is not very hot at all. There's a fan blowing through the heatsink, that's it

    • @architectsxiii5379
      @architectsxiii5379 11 months ago

      Then don't put it on the hot part moron? 😂

    • @Armand79th
      @Armand79th 11 months ago +6

      Do you even solder, bruh?

    • @ihavetubes
      @ihavetubes 11 months ago +4

      The backplate runs hot as hell, you can cook an egg on it. I put a fan on my GPU's backplate because it is too damn hot for my comfort.

  • @bigteo90
    @bigteo90 3 months ago +2

    It sounds like they're selling us incomplete graphics cards

  • @TVWyrm
    @TVWyrm 11 months ago +29

    Honestly, if this works in the eGPU market it's a big W. Not only can you have an external GPU to run your more demanding games, but you can have storage as well without too much hassle. W

    • @Not_interestEd-
      @Not_interestEd- 10 months ago

      Also more storage options in mini-ITX builds. Cooling would be a pain, but a big enough heat sink to handle both the GPU and the NVMe isn't a far-fetched dream.
      A singular 4090 chip probably runs hotter than both combined.

    • @bun00b
      @bun00b 9 months ago +1

      Kind of like how back in the day we would install games on a USB stick to play on school PCs, since we couldn't install any software on them; now you bring your own GPU with it as well, so you can play games on any PC with a decent processor.

  • @BioGenx2b
    @BioGenx2b 1 year ago +47

    I'm more interested in the DirectStorage implications that this may have, since the SSD sits closer to the GPU and could significantly improve framerates during asset loads.

    • @fendysuryanto
      @fendysuryanto 1 year ago +6

      Interesting, but if your SSD sits closer to your GPU, then just run GPUDirect Storage instead of DirectStorage. Both GPU and CPU have specific conditions in which their workloads can be handled efficiently.

    • @Lagger625
      @Lagger625 1 year ago +10

      The short distance is irrelevant. The GPU is only giving the SSD its 8 unused lanes; their communication still goes through the CPU

    • @branbroken
      @branbroken 1 year ago +4

      @@Lagger625 On this concept build, yes, but the point is there's no reason a manufacturer couldn't build in an SSD like this and run it as GPUDirect Storage, passing data between the GPU and the SSD without needing to run through the CPU for everything.

    • @MrMediator24
      @MrMediator24 1 year ago +4

      @@branbroken They would basically need to add a lite storage controller to the GPU for that to work without involving the CPU's PCIe

    • @EmongTimothy
      @EmongTimothy 1 year ago

      I like how you think.
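The lane-sharing arithmetic behind this thread is simple enough to sketch. The ~2 GB/s-per-lane figure is a rounded PCIe 4.0 assumption, and the x8 GPU / x4 SSD split mirrors the concept card discussed in the video:

```python
# Back-of-the-envelope lane budget: a GPU wired for only x8 sitting in an
# x16 slot, with some of the leftover lanes routed to an M.2 socket.
# ~2 GB/s per lane per direction is a rounded PCIe 4.0 assumption.

GBPS_PER_LANE = 2.0   # approx. PCIe 4.0 throughput per lane, per direction
SLOT_LANES = 16
GPU_LANES = 8         # the card only uses half the slot's lanes
SSD_LANES = 4         # one NVMe drive on the routed-through lanes

spare_lanes = SLOT_LANES - GPU_LANES
print(f"GPU: x{GPU_LANES} -> {GPU_LANES * GBPS_PER_LANE:.0f} GB/s")
print(f"SSD: x{SSD_LANES} -> {SSD_LANES * GBPS_PER_LANE:.0f} GB/s")
print(f"still unused after one SSD: x{spare_lanes - SSD_LANES}")
```

Note that splitting a slot like this depends on the platform supporting bifurcation of the x16 slot (e.g. x8/x4/x4), or on the card carrying its own PCIe switch chip.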

  • @MarkReed-smokindeist
    @MarkReed-smokindeist 1 year ago +45

    Not that unusual in the past. A lot of Amiga boards had multiple items on them, such as RAM on a SCSI HD controller. I think it had something to do with how Autoconfig worked with Amiga Zorro and CPU boards.
    So this is a retro idea that is still a good idea.

    • @deairreoh
      @deairreoh 11 months ago +2

      Alienware x51 kinda did this

    • @tylisirn
      @tylisirn 11 months ago +2

      In this case it's more akin to a pass-through port. The GPU doesn't talk to the unused PCIe lanes; it just routes them somewhere potentially useful, like an M.2 socket.

    • @Space-O-2001
      @Space-O-2001 11 months ago +3

      I had to cut a hole in my A1200 for an 80GB drive that cost £315 back in 1999

    • @MarkReed-smokindeist
      @MarkReed-smokindeist 11 months ago

      @@Space-O-2001 Did you stick a full-sized drive in there? lol
      There's a reason I really liked the big-box Amigas. :D

    • @Space-O-2001
      @Space-O-2001 11 months ago

      @@MarkReed-smokindeist Yeah 3.5", 2.5" weren't really that common. A2000/A4000s weren't really in my budget back then lol

  • @SW20Turbo
    @SW20Turbo 8 days ago

    This is a symptom of moving from SATA-based SSDs to PCIe SSDs. While they are fast, they ultimately take up lanes. That's one of the few reasons left to run SATA. But if you want real speed, you need SAS. It's niche to the enterprise market, but I think even a simple upgrade to SAS-2 or SAS-3 would be a game changer for those who want to run SSD RAID setups. I personally run SAS and I love it! There's nothing better than SAS for both external and internal capability. It's a truly flexible system, and a worthy successor to parallel SCSI.

  • @kippo133
    @kippo133 1 year ago +77

    It’s a pretty interesting concept. But what if you could add more VRAM via a port on the back? Any takers?

    • @shen8243
      @shen8243 1 year ago +17

      Too slow for gpu

    • @babyhandsthegreat
      @babyhandsthegreat 1 year ago +13

      Too slow, terrible bandwidth too.

    • @Pasi123
      @Pasi123 1 year ago +13

      Some video cards in the 90s had a slot for extra VRAM

    • @rogeliocano9536
      @rogeliocano9536 1 year ago +6

      @@Pasi123 Yeah, and it wasn't good

    • @ujiltromm7358
      @ujiltromm7358 1 year ago +2

      Imagine trying to configure VRAM timings just as carefully as RAM timings... and then there's the additional variable of an extra VRAM port. UGH.

  • @TheRealCobrax08
    @TheRealCobrax08 1 year ago +68

    Sounds pretty good in theory and I am excited for the result😅

    • @l1ght608
      @l1ght608 1 year ago +3

      Dead NVMe incoming: the back plate of the GPU gets hot and isn't cooled by the heatsink and fans from the front. lol

  • @IrisCorven
    @IrisCorven 9 months ago +40

    Imagine if GPU manufacturers took that concept and used an SSD on your GPU as a way to handle shader caching in games, freeing up your PC HDD from the bulky shader files that build up with some games.

    • @meteor541
      @meteor541 8 months ago +6

      That's smart, ngl, but why not just use an SSD then? Moreover, a lot of bypassing is required to do so

    • @Cookiedible
      @Cookiedible 8 months ago +5

      HDD? What year was that written in?

    • @JohnSmith-fq3rg
      @JohnSmith-fq3rg 7 months ago +8

      "Freeing up your blah blah blah to do more..." You know full fucking well they ain't freeing up shit. Performance will get even worse with more hardware resources available, as lazier and lazier unskilled devs use modern specs to skate by with even worse-optimized games. The 360 lasted over 10 years on the market running the same games under the hood as modern PCs and consoles, with only graphical hits taken, all on MODEST specs for when it came out back in '05. Now with 8GB and now 16GB of RAM standard, internet browsers take up 10GB for like 5 tabs and YouTube, and standard run-of-the-mill FPS games can max out RAM usage when they used to get by on a gig or two max for the same amount of characters, physics, and underlying game logic. Even new graphics technology isn't being utilized by the majority of games, just the additional VRAM for people who don't understand how to optimize their texture cache usage and just demand you pay more for a card with more GB, or demand that the VRAM be GDDR6 instead of GDDR3 or 4 like it was only a few years ago. The new technologies ARE NOT groundbreaking or genuine improvements, just slightly larger bandwidth which is nowhere even close to being needed for consumer software applications. These NVMes on GPUs will only be another burden placed on the consumer's shoulders by devs who cannot actually develop software at a professional level

    • @marko3770
      @marko3770 5 months ago

      And why exactly would they do that when they could just solder additional memory if they wanted to, without the need for ports and shit?

  • @noobsplaysensei3324
    @noobsplaysensei3324 1 month ago

    Good thing you mentioned the SSD temperature issue. When an SSD overheats, it throttles its read and write speeds down toward zero.
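NVMe drives expose warning and critical temperature thresholds and throttle as they approach them. Here's a toy model of that behaviour; the thresholds and speed are ballpark values, not from any particular drive:

```python
# Toy model of NVMe thermal throttling: above the warning threshold the
# controller ramps down write speed, and at the critical threshold it stalls.
# Thresholds and full speed are typical ballpark values, not a real drive's.

WARN_C, CRIT_C = 70, 85       # warning / critical temperature, in Celsius
FULL_SPEED_MBPS = 3500        # sequential write speed when cool

def throttled_write_speed(temp_c: float) -> float:
    if temp_c < WARN_C:
        return FULL_SPEED_MBPS
    if temp_c >= CRIT_C:
        return 0.0            # the "down to zero" case the comment describes
    # linear ramp-down between the warning and critical temperatures
    frac = (CRIT_C - temp_c) / (CRIT_C - WARN_C)
    return FULL_SPEED_MBPS * frac

for t in (50, 75, 84, 90):
    print(f"{t} C -> {throttled_write_speed(t):.0f} MB/s")
```

Real controllers use firmware-specific curves, but the shape is the same: fine until the warning threshold, then a rapid fall-off.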

  • @therealkarokun
    @therealkarokun 1 year ago +75

    Waiting for a GPU that has expandable RAM 😅

    • @apryctic
      @apryctic 1 year ago +4

      holy that would be so awesome

    • @kayukshi4271
      @kayukshi4271 1 year ago +2

      I also thought of that, like we could just add a SODIMM RAM stick to the GPU and get better FPS

    • @wolvreigns
      @wolvreigns 1 year ago +2

      @@kayukshi4271 Nah, system RAM and graphics DRAM are quite different. Pretty sure the 1030 or something had a variant with DDR4 instead of GDDR5, and the difference it made was a lot

    • @kayukshi4271
      @kayukshi4271 1 year ago +2

      @@wolvreigns Maybe a SODIMM-style stick carrying VRAM for GPUs might work

    • @taylon5200
      @taylon5200 1 year ago

      @kayukshi4271 No, it won't.

  • @JJ-gn1tb
    @JJ-gn1tb 1 year ago +39

    Well, there is also a chance that the SSD will get corrupted due to the heat of the GPU, which, as we all know, heats up a lot, especially when gaming. If the SSD can withstand the heat of the GPU, then I'll support this concept.

    • @sjneow
      @sjneow 1 year ago +10

      The position they installed it in is pretty far away from the hotspot of the GPU

    • @ujiltromm7358
      @ujiltromm7358 1 year ago +2

      Non-issue. The air coming out of the heatsink is at a lower temperature than the SSD heatsink and will cool the SSD, even under load.

    • @Demopans5990
      @Demopans5990 1 year ago +3

      Probably best for watercooling. Some sort of waterblock that cools both

    • @fixminer9797
      @fixminer9797 1 year ago +1

      If it is designed with that in mind by competent engineers, it would not be an issue. The real problem is that you’d need an additional, expensive chip to do the PCIe bifurcation.

    • @Tatusiek_1
      @Tatusiek_1 1 year ago +1

      @@Demopans5990 Pretty sure that if you have the money for a water block, you might as well get a higher-end GPU that uses all 16 PCIe lanes.

  • @simeonwood3613
    @simeonwood3613 11 months ago +16

    Could work, though it shouldn't be standard since most GPUs do in fact use all 16 lanes.

  • @garykoh5974
    @garykoh5974 7 days ago

    I am as nerdcited as u are. This is awesome.

  • @Twoorah
    @Twoorah 1 year ago +25

    AMD did this 5+ years ago with multiple SSDs for TBs of extra storage. Imagine if they had their current RDNA and card architecture.

    • @3DMarkRGBA
      @3DMarkRGBA 11 months ago +1

      That device was a poorly conceived product. It was an x16 GPU meant to fully utilize that slot. The two SSDs added two x4 devices to that x16 slot. All of these lanes connect to the CPU, meaning the slot was 150% oversubscribed. You would only see contention between the lanes when you utilized them; in the end the device would flounder and drop frames when you put it to task. As for the GPU speaking directly to the SSDs, this could conceivably work, but only the GPU would be able to utilize the SSDs. It's better to keep devices on discrete slots/connectors for reliability and serviceability.

    • @josephschultz
      @josephschultz 11 months ago

      @@3DMarkRGBA True, but with some mods and even new games going crazy with RAM requirements, this might be helpful, although the heat being pumped out by some of these cards might just cook the SSDs

    • @revdarian
      @revdarian 10 months ago

      Speaking out of your ass about it floundering and dropping frames. First of all, it was a professional card, not for gaming; the use case was very specific, in video production and CAD.
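The "150% oversubscribed" figure from the reply above is easy to verify: an x16 GPU plus two x4 SSDs puts 24 lanes of downstream devices behind a slot that only has 16 (lane counts as described in the comment; the card in question is presumably AMD's old Radeon Pro SSG):

```python
# Checking the "150% oversubscribed" figure: total lanes of the devices
# behind the slot versus the lanes the slot physically has.

slot_lanes = 16
device_lanes = {"GPU": 16, "SSD 1": 4, "SSD 2": 4}   # layout per the comment

total = sum(device_lanes.values())
oversub_pct = total / slot_lanes * 100
print(f"{total} device lanes behind an x{slot_lanes} slot "
      f"-> {oversub_pct:.0f}% of capacity")
```

The contention only matters when the GPU and both SSDs are busy at once, which is the reply's point about only seeing it under load.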

  • @funey9541
    @funey9541 1 year ago +36

    This is a smoother loop than those smooth loop videos

  • @ronlaverdiere
    @ronlaverdiere 1 month ago

    This is so simple yet it took someone with an unorthodox vision to think of it! I hope whoever was behind this gets appropriately credited.

  • @Ffbjfdvjki
    @Ffbjfdvjki 1 year ago +8

    Soon we'll have Threadrippers built into our 5090s lol

  • @janluofficial
    @janluofficial 8 months ago +8

    Use an SSD as VRAM, now u got 8TB of VRAM

    • @namedauuid
      @namedauuid 5 months ago

      Great for if you want your GPU to be both low-end AND high-end at the same time.

    • @technorunner1
      @technorunner1 2 months ago

      😎😎😎😎

    • @Flesh_Wizard
      @Flesh_Wizard 1 month ago

      @@namedauuid Hell yeah, time to minmaxx my 4090