Why VRAM Is So Important For Gaming: 4GB vs. 8GB

  • Published: 16 Jun 2024
  • Thermal Grizzly: www.thermal-grizzly.com/en/kr...
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    Buy relevant products from Amazon, Newegg and others below:
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 4070 - geni.us/8dn6Bt
    GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
    GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
    GeForce RTX 4060 - geni.us/7QKyyLM
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3050 - geni.us/fF9YeC
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    Radeon RX 7800 XT - geni.us/Jagv
    Radeon RX 7700 XT - geni.us/vzzndOB
    Radeon RX 7600 - geni.us/j2BgwXv
    Radeon RX 6950 XT - geni.us/nasW
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6750 XT - geni.us/53sUN7
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6650 XT - geni.us/8Awx3
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6600 - geni.us/cCrY
    00:00 - Welcome to Hardware Unboxed
    00:24 - Ad Spot
    01:04 - A brief description of VRAM
    06:13 - Baldur’s Gate 3
    06:51 - Cyberpunk 2077: Phantom Liberty
    07:32 - Dying Light 2: Stay Human
    08:01 - Forza Motorsport
    08:17 - Immortals of Aveum
    09:08 - Ratchet and Clank: Rift Apart
    09:51 - Marvel’s Spider-Man Remastered
    10:23 - Investigating Texture Presets
    10:36 - Assassin’s Creed Mirage [Visual Comparison]
    12:51 - Banishers: Ghosts of New Eden
    14:02 - Hogwarts Legacy [Visual Comparison]
    17:47 - Skull and Bones
    18:30 - Star Wars: Jedi Survivor [Visual Comparison]
    19:43 - The Last of Us Part I [Visual Comparison]
    22:10 - Total War: Warhammer III [Visual Comparison]
    23:49 - Final Thoughts
    Read this feature on TechSpot: www.techspot.com/article/2815...
    Why VRAM’s So Important For Gaming: 4GB vs. 8GB
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science

Comments • 1.5K

  • @JarrodsTech
    @JarrodsTech 3 months ago +2193

    It's a good thing you can always just download more VRAM.

    • @EyesOfByes
      @EyesOfByes 3 months ago +82

      Yeah, I remember when we used to buy RAM on CD but then Napster came along...
      (I'm also joking)

    • @JarrodsTech
      @JarrodsTech 3 months ago +130

      @@EyesOfByes Now RAM is all subscription!

    • @Remi_Jansen
      @Remi_Jansen 3 months ago +19

      Usually for free too! No idea why people spent money on physical RAM, it's ridiculous.

    • @Splarkszter
      @Splarkszter 3 months ago +35

      NVidia is worse than Apple

    • @chadbizeau5997
      @chadbizeau5997 3 months ago +37

      ​@@JarrodsTech RAM as a Service!

  • @AthanImmortal
    @AthanImmortal 3 months ago +835

    I never understood why Hardware Unboxed caught so much flak originally for suggesting that GPU VRAM was not moving on as fast as the gaming industry was, and that 8GB cards were going to age a lot faster. Instead, customers should be angry that Nvidia was selling us the same 8GB of VRAM on the 1070 (2016), 2070 (2018) and 3070 (2020), and STILL wanted $400 for an 8GB card in 2023 in the form of the 4060 Ti.
    Go back 4 years from 2016 and in 2012 you have the 670 with 2GB. Memory *quadrupled* in the same time frame in the mid range. Yet everyone was barking "game devs need to optimise", sure if you want graphics to stay the same, why don't they optimise for 2GB then? Because at some point we need to up and move on.
    The fact is that someone that bought a 1070 in 2016 can still play at the same level they did 7 years ago, but at the point they run into VRAM issues, those same issues are going to affect someone that bought a 3070 just 3 years ago, regardless of the difference in 3d capability between the cards.
    I'm glad to see HUB still championing this point.

    • @lookitsrain9552
      @lookitsrain9552 3 months ago +153

      They bought cards with 8GB of VRAM for 800 dollars and have to argue about it to make their purchase seem good.

    • @vmafarah9473
      @vmafarah9473 3 months ago +94

      The 1070 had 8GB; the 2070 should have been 10GB, the 3070 12GB, and the 4070 16GB, in my opinion.

    • @RobBCactive
      @RobBCactive 3 months ago +33

      Ironically I remember HUB doing the opposite when many of us pointed out Nvidia were skimping on VRAM in 2020/21 and valuing the 12/16GB configurations RDNA2 offered.
      All you needed to do was listen to game devs.

    • @highlanderknight
      @highlanderknight 3 months ago +26

      I agree about NVIDIA not putting enough VRAM on their cards. HOWEVER, we are not forced to buy those cards. If you buy one, you have little right to complain.

    • @gamingunboxed5130
      @gamingunboxed5130 3 months ago

      ​@RobBCactive that didn't happen 😅

  • @Phil_529
    @Phil_529 3 months ago +327

    First 8GB card was the Sapphire Vapor-X R9 290X in Nov. 2014.

    • @magikarp2063
      @magikarp2063 3 months ago +31

      MSI had a 6GB version of the 280X, a mid-range card from 2013.

    • @lennartj.8072
      @lennartj.8072 3 months ago +18

      Sapphire da goat fr

    • @Pixel_FX
      @Pixel_FX 3 months ago +66

      Sapphire also created the first blow through cooler. Then almost a decade later Jensen made nvidiots believe they invented that with 30 series cooler lmao, it only took some 10 mins of bullshitting with AI, aerodynamics, thermal dynamics yada yada.

    • @mitsuhh
      @mitsuhh 3 months ago +2

      280X was high end, not mid range @@magikarp2063

    • @tomstech4390
      @tomstech4390 3 months ago +14

      @@magikarp2063 A 6GB HD 7970 (same card) existed before that.
      There was no architectural change from the HD 7000 to the RX 200 series (except the 260X Bonaire, which added TrueAudio).
      In fact the shaders were unchanged from the HD 7000 through Polaris RX 400, and even then it was a switch to 2x FP16 instead of 1x FP32 units; Vega added rapid packed math, but again not *that much* changed.
      From the Fermi 200 series to the RTX 20... GCN was amazing.

  • @sergiopablo6555
    @sergiopablo6555 3 months ago +77

    I don't know why, but this is the only channel where I click "like" even before the videos start. It may be the absolute lack of clickbait on the titles, how useful all of them are, or how relaxing it is to watch someone speak without yelling or jumping around.

    • @GankAlpaca
      @GankAlpaca 3 months ago +3

      Haha same. Just checked and I already liked the vid instinctively. Maybe it's because the topic is so important to me and generally a thing that a lot of people think about.

  • @andrexskin
    @andrexskin 3 months ago +46

    I guess we should already be looking at 8GB vs 12GB on "similar cards".
    An example would be the 3060 12GB vs the 3060 Ti 8GB, trying to spot whether there are already cases where the raster performance of the 3060 Ti isn't enough to offset the quality a 3060 12GB can achieve with higher texture settings.

    • @TheKims82
      @TheKims82 3 months ago +14

      HUB did test this earlier, where the 3060 actually outperformed the 3080 10GB. I believe it was in Hogwarts Legacy just when the game came out.

    • @andrexskin
      @andrexskin 3 months ago +9

      @@TheKims82 I think that unoptimized RTX is kinda a niche case.
      It would be better to extend the tests to more cases

  • @807800
    @807800 3 months ago +478

    Any GPU over $300 should have at least 16GB now.

    • @TTM1895
      @TTM1895 3 months ago +9

      ikr?

    • @Radek494
      @Radek494 3 months ago +71

      The 4060 Ti 16GB should be $350, the 8GB version should not exist, and the 4060 8GB should be $250

    • @MasoMathiou
      @MasoMathiou 3 months ago +65

      12GB is acceptable in my opinion, but definitely 16GB above $400.

    • @N.K.---
      @N.K.--- 3 months ago +25

      Even 12GB would make sense

    • @Eleganttf2
      @Eleganttf2 3 months ago +40

      lol stop being DELUSIONAL wanting 16GB of VRAM on a measly $300 card. Besides, why would they need to put a massive 16GB of VRAM on it? Just take a look at how the RX 7600 XT, Arc A770 16GB or 4060 Ti 16GB do, especially at just $300 where I don't expect the GPU to perform well even at 1080p. Why would they put 16GB on it? For "future proofing" BS? You do realize that if your GPU is getting weaker as it gets older, having more VRAM won't help, right? 😂 What we need is the right VRAM for the right PERFORMANCE SEGMENT, but for a $350-400 GPU I would agree that it needs 12GB at bare minimum

  • @BUDA20
    @BUDA20 3 months ago +138

    Also, some of those games cross the 16GB system RAM limit because of the low VRAM capacity, so... people with 4GB cards are likely to have 16GB of RAM... so it will tank even more

    • @kaznika6584
      @kaznika6584 3 months ago +16

      That's a really good point.

    • @DivorcedStates
      @DivorcedStates 3 months ago

      How is it that 16GB of RAM instead of 8, for example, is bad for a setup with a 4GB VRAM card? I don't understand.

    • @Pasi123
      @Pasi123 3 months ago +10

      @@DivorcedStates Some of the games used more than 16GB of system memory with the 4GB VRAM card, so if you only had 16GB of RAM you'd see an even bigger performance hit. With the 8GB card that wouldn't be a problem, because the system memory usage was below 16GB.
      8GB of single-channel RAM + a 4GB VRAM card would be straight from hell.

    • @Apollo-Computers
      @Apollo-Computers 3 months ago +1

      I have 16gb of ram with 24gb of vram.

    • @gctypo2838
      @gctypo2838 3 months ago +7

      ​@@DivorcedStates The point is that if you're using a 4GB VRAM card, the extra VRAM demanded gets "spilled over" into RAM, taking a game that might require 12GB of RAM to requiring 17GB of RAM. If you only have 16GB of RAM, this spills over into swap/pagefile which will tank performance _exponentially_ more. Very few people using a 4GB VRAM card will be running with 32GB of RAM, which makes that 16GB RAM threshold so significant.
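
      A quick toy model of that spillover chain in Python; the capacities, OS overhead and workload figures are illustrative assumptions, not measurements:

      ```python
      # Toy model of the spillover chain described above (all sizes in GB).
      # Capacities and workload numbers are illustrative assumptions only.
      def spillover(need_vram, need_ram, vram=4.0, ram=16.0, os_use=4.0):
          vram_spill = max(0.0, need_vram - vram)      # textures evicted from VRAM into RAM
          total_ram = need_ram + vram_spill + os_use   # game + spill + OS overhead
          page_spill = max(0.0, total_ram - ram)       # whatever is left hits the pagefile
          return vram_spill, page_spill

      # A game wanting 8GB of textures and 12GB of RAM on a 4GB card / 16GB system:
      print(spillover(8, 12))   # -> (4.0, 4.0): 4GB lands in RAM, 4GB lands in swap
      ```

      Swap in your own numbers; the point is that the pagefile term only appears once the VRAM spill pushes total RAM demand past the installed 16GB.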

  • @theslimeylimey
    @theslimeylimey 3 months ago +238

    Sitting here still happy with my "ancient" 1080 Ti with 11GB of VRAM and "only" 484 GB/s of memory bandwidth.

    • @N.K.---
      @N.K.--- 3 months ago +21

      Cmon bruh that's pretty legendary for 1080p and works decent on 1440p on any game except for games with horrible optimization

    • @shagohodds
      @shagohodds 3 months ago +19

      That's BS. I own a 1080 Ti, a 3080 and a 4080; the 1080 Ti is pretty much dead in the water for most recent games at 1080p and certainly at 1440p @@N.K.---

    • @KnightofAges
      @KnightofAges 3 months ago +78

      @@shagohodds You're coping hard for the fact you spend tons of cash on GPUs every gen. Except for Ray and Path Tracing, the 1080ti runs pretty much everything at 50-60fps at 1440p, and is much faster at 1080p. Even in Alan Wake 2, the one game the 1080ti could not run above 30fps due to the mesh shader technology, they are going to put out a patch that will allow it to run the game at around 50-60fps. Now, you're free to spend thousands on a GPU every gen for small gains, but don't try to gaslight owners of the 1080ti, who know very well the resolutions and settings they're gaming at, as well as the fps they get.

    • @Phil_529
      @Phil_529 3 months ago +21

      @@shagohodds Not really. 1080 Ti is pretty much PS5/Series X power but lacks proper DX12 support. It's still a fine 1440p card if you're using upscaling. Avatar medium settings with FSR quality mode averages 56fps.

    • @kosmosyche
      @kosmosyche 3 months ago +6

      @@Phil_529 Well, it depends on what games you play, really. And it's not mainly a VRAM-related problem; it's more that in general it's getting really, really old for modern games. I had a GTX 1080 Ti for many, many years (too many, because of the crypto craze and my refusal to buy a new card at idiotic prices) and I'll be honest, I'd take an RTX 3070 8GB over it any day of the week and twice on Sunday, just because it works substantially better with most modern DX12 games, despite the lower amount of memory.

  • @Rexter2k
    @Rexter2k 3 months ago +174

    Boy, we are so lucky that current generation gpu's have more vram than the previous gen, right guys?
    Guys?...

    • @shadowlemon69
      @shadowlemon69 3 months ago +30

      Double the VRAM, Double the price

    • @Azhtyhx
      @Azhtyhx 3 months ago +4

      I mean, this is not exactly something new. Let's take a look at the Nvidia side of things, going back to the 200 series in 2009, using data related to the '70 and '80 model cards:
      * Two generations saw a 0% increase in VRAM compared to the previous generation. The GTX 500 Series as well as the RTX 2000 Series.
      * The '70 has had an average amount of VRAM of 87.5% compared to the '80, in the same generation.
      * The '80 has had an average increase in VRAM of 38.5% compared to the previous generation.
      * The '70 has had an average increase in VRAM of 39.3% compared to the previous generation.
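
      For what it's worth, those averages are easy to sanity-check in a few lines of Python. The launch VRAM figures below start at the 400 series (the 200 series had no clean '70 part) and are the commonly cited ones, so treat them as illustrative rather than authoritative:

      ```python
      # Commonly cited launch VRAM (GB) for Nvidia '80 and '70 cards, GTX 400 onward.
      # Illustrative figures; check exact launch specs before relying on them.
      vram_80 = [1.5, 1.5, 2, 3, 4, 8, 8, 10, 16]   # 480, 580, 680, 780, 980, 1080, 2080, 3080, 4080
      vram_70 = [1.25, 1.25, 2, 2, 4, 8, 8, 8, 12]  # 470, 570, 670, 770, 970, 1070, 2070, 3070, 4070

      def gen_changes(vals):
          """Percentage change in VRAM from one generation to the next."""
          return [round((b - a) / a * 100, 1) for a, b in zip(vals, vals[1:])]

      print(gen_changes(vram_80))  # the two 0.0 entries are the 500 and 2000 series
      print(gen_changes(vram_70))
      # Average '70 capacity relative to the same-generation '80:
      print(round(sum(s / b for s, b in zip(vram_70, vram_80)) / len(vram_70) * 100, 1), "%")
      ```

      With these inputs the '70 averages about 87.6% of the same-generation '80's VRAM, close to the 87.5% quoted above.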

    • @shadowlemon69
      @shadowlemon69 3 months ago

      @@Azhtyhx I'm talking about the price increase that came with more VRAM though
      GTX 480 1.5GB - $500
      GTX 580 1.5GB - $500 (590 3GB - $700)
      GTX 680 2GB - $500 (690 4GB - $1,000)
      GTX 780 3GB - $650, 780 Ti 6GB - $700 (TITAN 6GB - $1,000)
      GTX 980 4GB - $550, 980 Ti 6GB - $650 (TITAN X MAXWELL 12GB - $1,000)
      GTX 1080 8GB - $600, 1080 Ti 11GB - $700
      RTX 2080 8GB - $700, 2080 Ti 11GB - $1,000 (TITAN RTX 24GB - $2,500)
      RTX 3080 10GB - $700, RTX 3080 Ti 12GB - $1,200, 3090 24GB - $1,500
      RTX 4080 16GB - $1,200, RTX 4090 24GB - $1,600
      Remember that the 10-series gave a massive jump in performance, while the 20-series increased the price with insignificant gains. The price hike that started in 2018 wasn't really justifiable. And it kept on increasing after 2 generations. So, it's something new lol.

    • @upon1772
      @upon1772 3 months ago +1

      ​@@shadowlemon69The more you spend, the more you save!

    • @OGPatriot03
      @OGPatriot03 3 months ago +2

      I've had an 8GB GPU since 2014... 8GB is absolutely out of modern spec these days; it would be like running a 2GB GPU back in 2014, which would've been low end for sure.

  • @DJackson747
    @DJackson747 3 months ago +184

    I can't imagine getting anything less than 12GB these days. Granted, most of what I play are older games or low-spec games, but it still lets me play newer software at med/high settings comfortably. A 4GB card should be like $99 USD at this point, as it's gonna age like milk.

    • @erikbritz8095
      @erikbritz8095 3 months ago +12

      Should be $75 max for a 4GB GPU in 2024

    • @ricky4673
      @ricky4673 3 months ago +8

      For me, I cannot do without at least 16. I prefer 24 to 32. Next year, I want 64.

    • @lorsch.
      @lorsch. 3 months ago

      @@ricky4673 32GB would be quite nice, really. A little more future proof than 24GB

    • @mitsuhh
      @mitsuhh 3 months ago

      You want 64GB of VRAM next year? @@ricky4673

    • @sebastian38488
      @sebastian38488 3 months ago +17

      Only 128gb vram

  • @ojassarup258
    @ojassarup258 3 months ago +152

    Would be interested in seeing 8 vs 12 vs 16GB VRAM comparisons too!

    • @vmafarah9473
      @vmafarah9473 3 months ago +12

      As the 30 series sucks on VRAM, developers are forced to reduce game quality to accommodate the handicapped 30 series, so 12GB is going to be enough for 1440p. Fc Ngreedia. The same company released 6GB 3060 laptops and 8GB 4070 laptops.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 3 months ago +8

      It's possible to use 18GB in modern games at 4K. The game will use what it can. If you have less than ideal, it will just stutter more and look worse, but it won't run out, because safety features keep you from running out. My friend had a 3080 with usage at 8.5GB, then got a 7900 XTX and saw those same games actually take advantage of more like 16-20GB. More than double the fps.

    • @c.m.7692
      @c.m.7692 3 months ago +1

      ... and vs 20GB!

    • @user-mz1if8oe9k
      @user-mz1if8oe9k 3 months ago +6

      There are no cards that offer 8/12/16GB variants at the same time. But you can re-watch RTX 3070 vs RX 6800, or look for videos like "RTX 3070 8GB vs RTX 3070 16GB (modified)"

    • @zfacersane272
      @zfacersane272 3 months ago +1

      @@vmafarah9473 What are you even talking about? Not true… but I agree with the laptop part

  • @Ale-ch7xx
    @Ale-ch7xx 3 months ago +41

    @24:29 Correction: the first Radeon card that had 8GB was the Sapphire Vapor-X R9 290X 8GB, released on Nov 7th, 2014.

    • @Hardwareunboxed
      @Hardwareunboxed 3 months ago +16

      We're talking about the first official AMD spec.

    • @guille92h
      @guille92h 3 months ago +6

      Sapphire knows how to download VRAM 😂

    • @Ale-ch7xx
      @Ale-ch7xx 3 months ago +20

      @@Hardwareunboxed The way it was worded I didn't know you were referring to the official AMD spec

    • @OGPatriot03
      @OGPatriot03 3 months ago +1

      That was an epic GPU

    • @andersjjensen
      @andersjjensen 3 months ago

      @@Ale-ch7xx They always are. Cards are set to official spec clocks for core and memory if a Reference or Founders Edition is not available, etc, etc. What individual board partners do will be put into specific model reviews.

  • @CharcharoExplorer
    @CharcharoExplorer 3 months ago +37

    The R9 390 was not pointless. Modders had a blast with texture mods and LOD/Model mods with it.

    • @pumcia718
      @pumcia718 3 months ago

      Oh, I had one of those from Sapphire. The thing didn't care what game I threw at it at the time.
      Eventually it started thermal throttling, plus some other issue that I couldn't fix. In early 2019 I got the 16GB Radeon VII to replace it; dude's still going strong.

    • @infinitycovuk699
      @infinitycovuk699 3 months ago +1

      Had the Sapphire 390 Nitro, it was a beast of a card for the price.

  • @crazylocha2515
    @crazylocha2515 3 months ago +28

    Surprised me how well Steve put things and how it kept getting more interesting at every level. Great piece 👍 (Thx Steve)

  • @chriscastillo291
    @chriscastillo291 3 months ago +101

    I love these VRAM videos! No one talks about this, let alone tests it... thanks for the vids! ❤

    • @bns6288
      @bns6288 3 months ago +7

      YouTube is full of videos like this ^^ The only rare comparison is 4GB vs 8GB imo. idk why we still talk about 4GB while 8GB is already too low.

    • @zodwraith5745
      @zodwraith5745 3 months ago +1

      Lolwut? There's a ton of videos chasing the vram boogieman.

    • @tyre1337
      @tyre1337 3 months ago

      @@zodwraith5745 Tons of VRAM drama clickbait videos, very little actual deep-dive testing like this; the only one I can think of is Daniel Owen

    • @dave4148
      @dave4148 3 months ago +2

      No one? Really?

    • @ElysaraCh
      @ElysaraCh 2 months ago +1

      "No one talks about this"? It's been talked about by all the major tech channels since the 30 series launched lmao

  • @Pandemonium088
    @Pandemonium088 3 months ago +100

    Next up, 24GB of VRAM and the Unreal 5 engine still has traversal stutter 😅

    • @tenow
      @tenow 3 months ago +19

      Recent UE5 titles lack good VRAM settings. Robocop, for example, always tries to fit into 8GB of VRAM and as a result always has texture pop-in. Saw it in some other recent titles as well.

    • @residentCJ
      @residentCJ 3 months ago +4

      @@tenow I thought Nanite was the holy grail in UE5 that eliminates texture pop-in.

    • @ROFLPirate_x
      @ROFLPirate_x 3 months ago +16

      @@residentCJ Only if you have enough VRAM to use it. Nanite doesn't stop texture pop-in; it better adjusts LOD over distance, i.e. the closer you get, the more detail it renders. If you go past the VRAM buffer, you are still gonna have pop-in, as your system is forced to use system RAM, which is much slower for the GPU to access. Also, devs have the ability to not use Nanite; it is quite resource intensive, so not every dev is gonna implement it.

    • @OutOfNameIdeas2
      @OutOfNameIdeas2 3 months ago +1

      Because not even 24GB is what you need for an 80-series card to use its power properly. I'm having trouble playing even Beat Saber on my 3080 with an Index.

    • @TheCgOrion
      @TheCgOrion 3 months ago

      No kidding. I have NVMe, X3D, 96GB of RAM (for other work), and 24GB of VRAM, and they still manage to not take advantage of that scenario. 😂

  • @gamingoptimized
    @gamingoptimized 3 months ago +204

    Well, I know I made a mistake when I didn't spend 50 dollars more for a GTX 1060 6GB... (I have the 3GB one...)

    • @Radek494
      @Radek494 3 months ago +48

      Ouch

    • @kingplunger6033
      @kingplunger6033 3 months ago +57

      yeah, that is like one of the worst gpus ever

    • @GloriousCoffeeBun
      @GloriousCoffeeBun 3 months ago +44

      I bought a 3070 over a 6800 just for the CUDA cores.
      Guess I'm next 🫠

    • @baophantom6469
      @baophantom6469 3 months ago +6

      Bruh ..

    • @PandaMoniumHUN
      @PandaMoniumHUN 3 months ago +19

      @@kingplunger6033 Hard disagree, it was a great GPU for its time in 2016. Sure, even back then it made more sense to buy the 6GB model, but I was using the 3GB variant of that card between 2016-2019 without any issues. Every game having 4K and 8K textures only started a few years ago; that's when having way more VRAM started to become necessary.

  • @pvtcmyers87
    @pvtcmyers87 3 months ago +8

    Thanks for creating the video for this and for taking a different look at the vram issue.

  • @Code_String
    @Code_String 3 months ago +31

    It sucks that some nice GPU chips will be held back by the VRAM buffer. The 12GB of my 6800M has been surprisingly more useful than I could have thought.

    • @roqeyt3566
      @roqeyt3566 3 months ago +5

      On mobile, that card was awesome for the generation.
      There wasn't a lot of VRAM to go around in affordable laptops, which the 6800M fixed

    • @timalk2097
      @timalk2097 3 months ago +2

      @@roqeyt3566 Bruh, you talk as if it's a decade-old GPU; it's still by far one of the best options for laptop GPUs atm (especially if you're on a budget)

  • @user-lk5kn2tr7k
    @user-lk5kn2tr7k 3 months ago +7

    Great job with the testing, Steve, and thanks for the update on the VRAM matter.

  • @9r33ks
    @9r33ks 3 months ago +27

    Yes, textures make all the difference. Back in the day of my GTX 1080 I'd always try to pump textures up as high as I could, while lowering everything else as much as reasonably practicable. Textures make all the difference.

    • @viktorianas
      @viktorianas 3 months ago +3

      Exactly, I put everything on low and then put textures on high/ultra; next goes lighting, effects, then shadows, draw distance and other bells and whistles, with ray tracing at the very bottom of the priority list.

    • @9r33ks
      @9r33ks 3 months ago +9

      @@viktorianas Indeed. I hate when developers try "optimising" and "balancing" graphical presets. Devs always set texture resolution and detail way too low in their presets, making the game look like ass, while it could look so much better with some useless particle effects disabled or lowered, and decent textures raised to a reasonable level. VRAM capacity and its importance really opened my eyes on the subject. I won't let Nvidia and AMD mislead me with "great deals" on GPUs having a reasonable price but low VRAM.

    • @redmoogle
      @redmoogle 3 months ago

      Nvidia doesn't want you to do this. They want you to crank up lighting and shadows while lowering textures. What's also insane is the motion blur modern games stuff in, and then the fake frames and filters added on top of that, pretending they are running at a higher resolution. It's really bad. The reason Nvidia can do this is their AI marketing and their superior drivers. If AMD or Intel could deliver flawless drivers for games going back 20 years, there would be no reason to even look at Nvidia anymore.

    • @Rapunzel879
      @Rapunzel879 3 months ago

      That's incorrect. Resolution makes the biggest difference, followed by textures.

    • @defeqel6537
      @defeqel6537 3 months ago

      @@Rapunzel879 Nah, I'd often rather crank up shadows than resolution

  • @TheIndulgers
    @TheIndulgers 3 months ago +9

    I don't know why people defend nvidia (trillion dollar company btw) for skimping on vram.
    This same company gave you 8gb for $379 EIGHT years ago with the gtx 1070. People out here coping over their $800 4070ti purchase.
    50% more vram for over double the price 3 generations later doesn't sound like progress to me.

  • @MrSmitheroons
    @MrSmitheroons 3 months ago

    I had been meaning to do some testing like this myself but found it too daunting. You deserve massive props for doing this, and doing the subject justice. The conclusion section at the end going over the timeline of "how we got here", along with the context of the multi-preset, multi-texture-quality and visuals comparisons, it is just *chef's kiss*. So rounded and complete.
    The only way to give more context would be to show more "negative results" where nothing interesting happened, to contextualize which games it doesn't matter in. But I imagine in 2024 this is getting to be not that many recent games, for one thing, and would arguably bloat an already long video at half an hour.
    But this was just a really great video, and I really do agree it adds good context for this moment we're in, trying to see how well 8GB will do not just today (we already see it *start* to struggle in certain games), but a few years down the line when 12-16+GB cards will be more normalized and devs will want to throw more "bang" into the higher presets to really give gamers something nice to look at in those presets.
    As you've shown it's not just about visuals, not just about performance, but often both. It either makes no difference, or you're starting to trade quality for performance despite the compute part of the card (actual GPU chip) being fast enough. The question is when it's going to be more relevant, but you make a strong case that it's "starting basically now," and that "the transition will probably be pretty complete within a year or three" to where 8GB will be felt badly as a limit.
    I know games will tend to be *launchable* with 8GB VRAM, but poorly optimized ports and AAA launches are still a thing, for those that jump on a title on day one (or worse, pre-order)... and you're still going to be leaving some performance *and* visuals or both on the table, that the GPU could have handled otherwise.
    I think it's high time we understood Nvidia and AMD are cheaping out if they give less than 12GB on a multi-hundred-dollar piece of hardware, when it's not costing them nearly as much as they're upselling us for 12GB+ VRAM. I don't consider Intel to be a trend-setter in this area, but at least they have leaned into VRAM a lot of the time, so I don't suppose they're too egregious, but I do hope they're listening as well.
    Sorry for the long post, this has been a topic on my mind for some time now. Thanks much again, for all the testing and well considered conclusions. Cheers to you and the team!

  • @Kiyuja
    @Kiyuja 3 months ago +13

    I think what many people don't realize is that modern games don't just crash with insufficient VRAM, but rather dynamically lower asset quality to prevent crashes or stutter. This doesn't mean you don't "need" more. Especially these days, where DLSS and RT are more and more important, these techniques store temporal data in VRAM, and that scales with your resolution. I consider 12GB to be low end these days.
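
    As a rough illustration of how resolution-scaled data grows, here is a back-of-the-envelope Python estimate; the buffer set and pixel formats below are invented for illustration and are not any real engine's or upscaler's actual layout:

    ```python
    # Back-of-the-envelope VRAM cost of a few resolution-scaled buffers.
    # The buffer set and formats are assumptions made for illustration only.
    def mb(width, height, bytes_per_px, copies=1):
        return width * height * bytes_per_px * copies / 2**20

    for w, h, label in [(1920, 1080, "1080p"), (2560, 1440, "1440p"), (3840, 2160, "4K")]:
        history = mb(w, h, 8, copies=2)  # two RGBA16F history frames of temporal data
        motion = mb(w, h, 4)             # RG16F motion vectors
        depth = mb(w, h, 4)              # 32-bit depth
        print(f"{label}: ~{history + motion + depth:.0f} MB for just these buffers")
    ```

    Under these assumptions the total roughly quadruples from about 47 MB at 1080p to about 190 MB at 4K, and a real engine keeps many more such buffers.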

  • @SweatyFeetGirl
    @SweatyFeetGirl 3 months ago +70

    That's exactly for the people that say "your card will run out of power before you run out of VRAM". They don't know that textures don't give you a performance hit and are a free major visual upgrade.

    • @redmoogle
      @redmoogle 3 months ago +13

      Yeah, I got tired of even saying it most of the time. I suspect Nvidia is leveraging their AI bots to market this way; it's the only explanation I have for why people are so against getting a quality product at a reasonable price. They actively want to overpay for poor products, but that's the very opposite of my real-life experience, where people want to pay less for a great product and brag about getting great deals.

    • @therecoverer2481
      @therecoverer2481 3 months ago +1

      Does this stay true with lower-end GPUs like the RX 6600? I'm currently looking for one on the used market, where they go for around $150, or should I go with an RX 6700 XT at like $200?

    • @SweatyFeetGirl
      @SweatyFeetGirl 3 months ago

      @@therecoverer2481 The RX 6700 XT has 12GB of VRAM compared to the 8GB on the RX 6600, and the 6700 XT is 40% faster, so yes, it's much better than the RX 6600! You spend 33% more money but you get 40-45% more fps; it's absolutely worth it

    • @stangamer1151
      @stangamer1151 3 months ago +1

      It depends on the game.
      But 4GB vs 8GB is still an extreme case. 4GB cards are either very old or very cheap, so no one should expect them to provide great results in modern games anyway.
      Talking about textures, they are not always the most noticeable visual change. Object and shadow pop-in is much more crucial IMO. When everything pops out of nowhere, it makes gaming unenjoyable. I'd rather sacrifice texture quality than draw distance. Also, low-resolution shadows and low-quality AA are very annoying too (constant flickering is awful). So for me it is draw distance first, shadow quality and AA quality next, and only then texture quality.

    • @triadwarfare
      @triadwarfare 3 months ago

      @@redmoogle They probably want their product to last longer. If tech moves too fast, your 8GB would be obsolete in a few years rather than lasting for a console generation

  • @teardowndan5364
    @teardowndan5364 3 months ago +14

    The difference between 4GB-to-8GB and 8GB-to-12+GB is that the baseline image quality has a much higher starting point, which makes lowering details to save VRAM and VRAM bandwidth far more palatable with 8GB when shopping in the $150-250 new range today. Anything much over $250 really should have 12GB at a bare minimum.

  • @xkxk7539
    @xkxk7539 3 months ago +31

    Some notable past examples are the 980 Ti (6GB) vs Fury X (4GB) and the 780 Ti (3GB) vs 290X (4GB). VRAM helped push out more performance for those cards

    • @Pixel_FX
      @Pixel_FX 3 months ago +4

      The 390X came with 8GB and released the same month as the 980 Ti. Typical Nvidia planned obsolescence

    • @XxGorillaGodxX
      @XxGorillaGodxX 3 months ago +1

      ​@@Pixel_FXThat was also with a good price on a 512 bit bus. Those were better days.

    • @PhAyzoN
      @PhAyzoN 3 months ago +1

      @@Pixel_FX I loved my 8GB 290X (essentially identical to a 390X) but let's be real here; the 980Ti was better across the board at everything and for a far longer period of time. That extra 2GB didn't do a damn thing for the 390X.

    • @OGPatriot03
      @OGPatriot03 3 months ago

      @@PhAyzoN Sure, but the 980 Ti was considerably newer than the Hawaii architecture. The fact that it was rebranded for so long was thanks to AMD's "fine wine" over the years.
      Could you imagine if the 290X had that later-spec performance all the way back in 2013? It was purely a software holdup all that time.

    • @defeqel6537
      @defeqel6537 3 months ago +1

      @@PhAyzoN GTX 980 Ti was also about 50% / $200 more expensive

  • @Starkiller12481
    @Starkiller12481 3 months ago

    Great video! Been looking for guidance on VRAM/RAM behavior under different texture loads; this was very informative.
    Thanks gentlemen 🙏🏾

  • @Jamelele
    @Jamelele 3 months ago +29

    I have a 3080 10GB, great thing they released a 3080 12GB

    • @Phil_529
      @Phil_529 3 months ago +36

      10GB was a terrible choice by NVIDIA. They marketed that as a "flagship" but it had less VRAM than the 11GB 1080Ti from years earlier for the same $700.

    • @mertince2310
      @mertince2310 3 months ago +9

      @@Phil_529 It's a great choice for Nvidia*, bad for users. If they can get away with less VRAM and people even defend this choice, why would they put more?

    • @Phil_529
      @Phil_529 3 months ago +4

      @@mertince2310 Well clearly they didn't have to. Ampere was peak crypto, and the day I got my 10GB for $700 I could've sold it for $1500, followed by it being sold out for almost 2 years. To be fair, the 10GB at 4K was mostly OK for the first 2 years, but it fell off a cliff once "next gen" games started to come out. The Dead Space remake and RE4 were the final straw for me and I upgraded to a 4090.
      Pretty sure the lack of VRAM is directly related to NVIDIA wanting to push AI users onto more expensive GPUs. At least now they offer a 16GB model for $450, so next generation shouldn't be so outrageous. These overpriced mid-range cards shouldn't be limited to 12GB while asking $600+.

    • @BitZapple
      @BitZapple 3 months ago +5

      @@Phil_529 Even with 10GB I personally never ran into actual problems, even playing games like Hogwarts Legacy at 1440p. I guess I just didn't reach that area yet, but even if I did, I can always go from Ultra textures to High (which looks the same), problem solved. There's even a mod that reduces VRAM usage.

    • @Phil_529
      @Phil_529 3 months ago +3

      @@BitZapple Hogwarts Legacy ate VRAM like no one's business at launch if you were using ray tracing (14.2GB at 1440p). That was another game that choked up my 10GB card. I also had problems with Need for Speed Unbound (easy fix: just go from ultra reflections to high and it's fine).

  • @metallurgico
    @metallurgico 3 months ago +9

    My 2014 980 Ti has 6GB of VRAM... 10 years later games still look like Half-Life 2 and we still have 4-8GB cards. That's real innovation.

    • @chillhour6155
      @chillhour6155 3 months ago +4

      Yeah, these Unreal 5 games look like boring, unappealing garbage, but everyone seems to have drunk the Kool-Aid; they REALLY must've liked that Matrix demo

    • @mitsuhh
      @mitsuhh 3 months ago

      Robocop is fun @@chillhour6155

    • @TheBoostedDoge
      @TheBoostedDoge 3 months ago +6

      They actually look worse, because devs rely more and more on upscalers instead of actually optimizing their games

    • @Grandmaster-Kush
      @Grandmaster-Kush 2 months ago

      An upscaling crutch + TAA + poor optimization + high development costs + lack of interest in their work / homogenization of coding due to thousands and thousands of programmers and developers being churned out of "bideogame skewls", and you have uninspired, low-risk modern AA and AAA games as a result.

    • @metallurgico
      @metallurgico 2 months ago

      @@Grandmaster-Kush that's why I started studying bass guitar instead of gaming.

  • @axlfrhalo
    @axlfrhalo 3 months ago +1

    Halfway through the vid, but I love this; it gives a very good understanding of how VRAM makes a difference, exactly when textures do and don't impact performance, and just how drastic it can get.

  • @BaggerPRO
    @BaggerPRO 3 months ago

    Very informative and descriptive. Thank you!

  • @menohomo7716
    @menohomo7716 3 months ago +20

    "no you don't understand you NEED to preload 8 gigs of texture in dedicated memory" -the devs probably

    • @Splarkszter
      @Splarkszter 3 months ago

      To be fair, textures do the majority of the job in making a game look good.
      Yes, 4K (or more) textures are absolutely dumb. But oh well.

    • @sush7117
      @sush7117 3 months ago +7

      well, yes. You can see what happens if textures are loaded in RAM instead of VRAM in this video

    • @Splarkszter
      @Splarkszter 3 months ago +3

      @@noir-13 Look up the storage space difference. It may seem like a small number, but the growth is exponential.
      More resolution doesn't necessarily yield higher detail.
      It's an incredibly widespread issue.
      The whole dev market is filled with people that don't know what a "job well done" is or even means.
      2K is fine I guess, 1K still is. 4K or more is just unnecessary because of the exponential storage space consumption (which also applies to texture loading speed, world loading speed and VRAM consumption)

    • @DarkNia64
      @DarkNia64 3 months ago +1

      Sounds like someone doesn't care to do their own research @noir-13

    • @mikehawk6918
      @mikehawk6918 3 months ago

      @@noir-13 Sounds like you're 12 years old. Also seems like it judging by your profile picture.

  • @vulcan4d
    @vulcan4d 3 months ago +9

    Just add sockets on the back of the GPU to upgrade the memory and create a new industry standard of upgradable vram modules.

    • @kenshirogenjuro873
      @kenshirogenjuro873 3 months ago +3

      I would SO love to see this

    • @klv8037
      @klv8037 3 months ago +5

      Nvidia's not gonna like this one ☠️☠️🙏🙏

    • @kajmak64bit76
      @kajmak64bit76 2 months ago +1

      @@kenshirogenjuro873 That is actually possible.
      I saw a video of some dude testing out a modified RX 580 from China; it had like 16GB of VRAM (or was it 32?).
      And it's real... it worked, it used the extra VRAM, and everything detected the extra VRAM.
      So it's entirely possible to just add more VRAM via soldering, but it may vary from card to card

  • @Kelekona_808
    @Kelekona_808 3 months ago +1

    Highlighting the visual differences in the Vram images was very helpful for me. Usually I fail to see the differences between cards when going over visual comparisons.

  • @mopanda81
    @mopanda81 3 months ago +2

    Once again doing tests on lower spec hardware at 1080p gives us a lot of perspective that doesn't exist with new card reviews and ranked framerate lists. Thanks so much for this work since it really fleshes out the whole picture.

  • @ivaniliev2272
    @ivaniliev2272 3 months ago +4

    I will tell you why: because game developers are too lazy to optimize their games. VRAM should not affect the framerate much, but rather draw distance, texture resolution and mesh LODs (levels of detail). If a game is overfilling the GPU memory, it means that someone, somewhere did a really bad job, and I am saying this as a game developer.

  • @TheZoenGaming
    @TheZoenGaming 3 months ago +75

    I always laugh when someone claims that more than 8GB isn't necessary on a card just because it isn't a high-end card.
    Not only does this show that they don't understand how VRAM is used, but mods can really crank up the amount of VRAM used by a game.

    • @Dragoonoar
      @Dragoonoar 3 months ago +8

      Yeah, but let's be real here, the only game you'd mod the shit out of until it utilizes more than 8GB of VRAM is Skyrim. There are strong arguments against 8GB of VRAM, but mods ain't it

    • @nimrodery
      @nimrodery 3 months ago

      Sure, if you tweak your settings you can sometimes find a game where you can run out of VRAM on one GPU while the equivalent GPU with more RAM keeps running fine, but in most cases you'll get bad performance with the higher VRAM GPU as well. Basically lower end cards don't need as much VRAM because they won't see as much benefit. I wouldn't say "no" to extra RAM but I'm not going to spend an extra $150 or $200 on it when buying a low end GPU (not a dealbreaker). For the midrange and up everything should be over 10/12 GB.

    • @TheZoenGaming
      @TheZoenGaming 3 months ago +21

      @@nimrodery I also laugh when people show that they failed to read the comment they are replying to, or, for that matter, that they didn't watch the video that comment is posted to.

    • @morgueblack
      @morgueblack 3 months ago +27

      ​@@nimrodery"lower end cards don't need as much VRAM".....Bruh, did you even watch the video? Hardware Unboxed literally used a 6500XT (that's a lower end card, don't you know?) to disprove your point. What are you on about?

    • @nimrodery
      @nimrodery 3 months ago +1

      @@morgueblack No, I was talking about 8 vs 12 or 16, HUB already has a video about 8 vs 16 in the form of the 4060 ti, which shows that for the most part there's no performance gains to be had. For instance you can enable RT and ultra settings on both, and while you may run out of VRAM on one, the other GPU isn't putting out high FPS because of the settings and the fact it's a 4060 ti. No GPU should have less than 8 at this point.

  • @vylrent
    @vylrent 2 months ago

    These videos are so nice to listen to. I just get a good dose of tech info so I can stay updated (I haven't been in a few months), with a relaxing, not-jumpy voice unlike other tech channels. Good job, HU team

  • @Celis.C
    @Celis.C 3 months ago +1

    Interesting video idea: statements/stances you've made in the past that you might reconsider now, as well as the why of it (new insights, new developments, etc).

  • @lencox2x296
    @lencox2x296 3 months ago +6

    @HUB: So the lesson to learn from the past: in 2024 one should recommend 16GB cards only, or at least 12GB as the bare minimum for mid-range GPUs.

  • @mxyellow
    @mxyellow 3 months ago +3

    I sold my 3070 as soon as I saw HUB's 16GB vs. 8GB VRAM video. Best decision ever.

  • @Hamsandwich5805
    @Hamsandwich5805 3 months ago

    I really liked this video, HU! Love the work you do. You ask good questions and seek out the tough answers.
    Just wanted to add: in some of those games where RAM usage crept over 12GB, you'll be in a really tight spot with most budget builds. Assuming you opted for the 6500 XT for budgetary purposes, I'd reckon you likely also only have 8 or 16GB of system RAM. That's going to make it nearly impossible to rely on system RAM for backup, even if your card's buffer is being fully consumed at 8GB.
    Windows will still take several GB of RAM, so those games will likely be totally unplayable or run very poorly when loading/switching between apps (the challenges of alt-tab, like in the Windows XP days).
    I'd really like to see a system that matches the budget direction - maybe a 3600/5600 CPU or i3 alternative with 16GB of RAM (or maybe even 8GB) and an 8GB 6500 XT - alongside the "ideal" scenario with the 7800X3D, to really compare how these will work for budget builds in years to come. People may not be realizing just how poorly the system will perform, forgetting you're testing on a 7800X3D with 32GB+ of RAM.
    For future technologies, I'm curious how DirectStorage can help users on tighter budgets take advantage of speedy, low-cost NVMe drives, instead of relying on potentially slower system RAM (what a bizarre statement, considering how slow HDDs were!). It's possible most new games begin to roll out DirectStorage alongside 1TB PCIe 4/5 NVMe drives that are relatively fast compared to budget GPU buffer speeds, meaning you won't need the upgrade to 16GB as quickly for 1080p gaming.
    Thanks! Keep up the awesome work!

  • @TheHurcano
    @TheHurcano 3 months ago

    I really appreciate how clear this comparison illustrated the differences. Now I am really curious about performance and quality differences in similar gaming titles when going from 8GB VRAM to 12GB (or 16GB) on the next tier up in cards, especially when comparing 1080P to 1440P along with varied quality settings at both resolutions. Seems like that comparison could end up being a little less obvious on what the "correct" breakpoints of acceptability/desirability are, but might be more relevant to buyers in the $300-$400 market.

  • @Zayran626
    @Zayran626 3 months ago +9

    would love to see a video like this with the 4060 Ti 8 GB / 16 GB cards

    • @vvhitevvizard_
      @vvhitevvizard_ 3 months ago +3

      The 4060 Ti 8GB should not exist at all, and the 4060 Ti 16GB should be priced at $350. Well, $400 at max

    • @Rachit0904
      @Rachit0904 3 months ago +3

      There already is! Look at the 4060 Ti 16GB review

  • @adink6486
    @adink6486 3 months ago +7

    How is it so hard for the fanboys to understand? Obviously you want to max out most of the graphical settings when you spend $600+ on a GPU. That's the whole point. We should never need to worry about running out of VRAM if the company gives us enough in the first place. Why would I want to lower my graphical settings just so that I have enough VRAM to run the game? We want NVIDIA to treat us better. That's it.

    • @innie721
      @innie721 3 months ago +1

      I want AMD to treat us better too. Imagine buying their highest end offering like the 7900 XTX only to get 25fps at 1440p with RT on medium in Alan Wake 2, a AAA title and one of the best titles of 2023. Scammed.

    • @defeqel6537
      @defeqel6537 3 months ago +1

      @@innie721 AMD is irrelevant in the market, and memory subsystems are a "solved" problem: it would be trivial for nVidia to increase VRAM amount. Heck, with the 3080 they had to put in work to cut the VRAM amount from 12 to 10 GB, by disabling parts of the chip.

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 2 months ago

      @@innie721 Again, it isn't ray tracing, it's ray-traced lighting and shadows. Nvidia may as well call it RTLS. Were it true ray tracing, the game would be unplayable. It's bad enough that "RT" has as bad a performance hit as it does on Nvidia, let alone AMD.

  • @Ariane-Bouchard
    @Ariane-Bouchard 3 months ago

    One thing I wish you'd done more of is zoomed-in/slowed-down scenes for visual comparisons. Most visual differences I wasn't able to see on my phone, even though I tried zooming in manually. Making things bigger, focusing on the important details, would've probably been a relatively simple way to get around that issue.

  • @justhereforgear3441
    @justhereforgear3441 3 months ago

    Thank you for the in depth breakdown. Very informative.

  • @MrHamof
    @MrHamof 3 months ago +11

    So what I'm getting from this is that the 6500 XT should always have been 8GB, and would have been much better received if it was.

    • @Hardwareunboxed
      @Hardwareunboxed 3 months ago +10

      Yes

    • @TrueThanny
      @TrueThanny 3 months ago +5

      Yes and no. Yes, it would have been better with 8GB. No, it would not have been better received, because it was a pandemic card, created explicitly to reach the $200 point in a time of drastic supply line disruption. Doubling the VRAM would have notably increased its price at the time, and it would still have received negative reviews on that basis, even from HUB, which ignored the effect of the pandemic on pricing of all cards for some bizarre reason. Specifically, they compared AMD's MSRP values for cards released in the midst of the pandemic, which accounted for supply line disruption, to nVidia's fictional MSRP values for cards released before the supply line disruption.
      The 6500 XT was only ever supposed to be a stop-gap measure that allowed people to get a functional graphics card for $200, when used versions of much slower cards were selling for a lot more.
      AMD should have at the very least ceased production of the 4GB model after supply lines cleared up.

    • @defeqel6537
      @defeqel6537 3 months ago

      @@TrueThanny Indeed, pandemic pricing for GDDR6 was about $15/GB (while it is around $3/GB now); an extra 4GB would have cost about $60 more, plus OEM/retail margins, which are often based on the price of the product (so about $80 more in total)

  • @Pamani_
    @Pamani_ 3 months ago +4

    20:48 I think there is an error here. The performance shown for the 4GB card is at high textures, while the script says it's low textures

    • @adnank4458
      @adnank4458 3 months ago +2

      Agreed, I noticed the same error. Don't know if it's the only one.

  • @KimBoKastekniv47
    @KimBoKastekniv47 3 months ago

    You're the first channel I go to for day-1 reviews, but these "investigation / for science" videos are the cherry on top.

  • @hmst5420
    @hmst5420 3 months ago

    Great video quality btw. Looks like something's changed in your equipment

  • @Jomenaa
    @Jomenaa 3 months ago +3

    My GTX 1070 Ti still going strong, OC'd to GTX 1080 lvls of performance and those sweet 8GB's of GDDR5 :)

  • @toad7395
    @toad7395 3 months ago +4

    It's so scummy how NVIDIA refuses to give their cards more VRAM (and when they do, it's absurdly overpriced)

  • @madarcturian
    @madarcturian 3 months ago

    We need more videos about this. Thanks so much, guys. It's sad how low some cards are on VRAM these days. I pray I live to see affordable cards with a lot of VRAM. VR and proper image upscalers really need a lot of VRAM.

  • @aaron_333
    @aaron_333 3 months ago +1

    Very nice! I wonder if you can do one which compares 10GB, 12GB and 16GB?

  • @pivorsc
    @pivorsc 3 months ago +3

    Doesn't more VRAM prevent random stuttering? When I play CoD my GPU uses around 14GB of VRAM, which I believe is a data preload to prevent loading from the drive.

    • @imo098765
      @imo098765 3 months ago

      It's not so much that VRAM reduces random stuttering as that running out of VRAM will introduce stuttering. CoD just asks for everything possible, and you won't see a difference on a 12GB card; it's just an "in case we need it" allocation

    • @sebastian38488
      @sebastian38488 3 months ago

      Definitely! On 24GB GPUs everything works smoothly.

    • @andersjjensen
      @andersjjensen 3 months ago

      Very simplistically you can say "There are two amounts of VRAM: Enough and Not Enough". Having more than enough doesn't help with anything. Having less than enough hits 1% lows the hardest and the AVG progressively harder the bigger deficit you have.
      Until the whole Direct Storage thing is rolled out the system RAM is still being used as a staging area before dispatch to VRAM, rather than the VRAM being used as a buffer for things that are still (far) out of view. This means that game stutters can also occur if you're low on system RAM, despite having enough VRAM.

    • @andersjjensen
      @andersjjensen 3 months ago

      @@sebastian38488 Not if you chuck them into a system with 8GB of system RAM.

  • @hasnihossainsami8375
    @hasnihossainsami8375 3 months ago +3

    I think the biggest takeaway here is that using higher quality textures almost never hurts performance, and so GPUs with insufficient VRAM should be avoided like the goddamn plague. 4GB cards are absolutely dead at this point, 6GB doesn't have much longer left, and 8GB only falls short in some newer games and some edge cases in others. Buying an older 8GB card at discount/second-hand prices still makes sense, imo. But newer GPUs? Considering the vast majority don't upgrade for at least 3 years, they aren't worth it. 10GB is the minimum for a new GPU.

  • @nicktempletonable
    @nicktempletonable 3 months ago +1

    Loved this video Steve!

  • @itsyaboitrav5348
    @itsyaboitrav5348 3 months ago +3

    You should compare running GPUs on PCIe Gen 3 vs Gen 4 next, looking at whether people on Gen 3 boards are being held back on their Gen 4 cards

  • @OutOfNameIdeas2
    @OutOfNameIdeas2 3 months ago +4

    Nvidia was the dumbest buying decision I made in years. I bought the 10GB 3080. It lasted me 1 month before it got limited by its VRAM. 4K max settings are undoable. It's not even good enough to run Beat Saber with an Index at 120fps.

    • @lorsch.
      @lorsch. 3 months ago +2

      Same, replaced it with a 3090.
      For VR, 24GB is definitely needed.

  • @ravanlike
    @ravanlike 3 months ago +3

    Speaking of VRAM, yesterday a Polish tech YouTuber compared a 3070 8GB vs a 3070 16GB (modded, all memory dies were replaced). The findings were very similar to yours (when you compared the 3070 8GB with a professional GPU that is in reality a 3070 chip with 16GB). 1% low fps was higher with more VRAM, especially when running games with RT enabled.
    ruclips.net/video/9mWJb25pU8A/видео.htmlsi=er0AK11pJyAEeMNC&t=172

  • @Petch85
    @Petch85 3 months ago +1

    Always check whether you can run the game with the highest texture setting. Sometimes you get a lot of quality for no performance loss at all, or very little.

  • @ziokalco
    @ziokalco 3 months ago

    If I'm not mistaken, games such as Hogwarts Legacy dynamically adjust the real texture quality when VRAM is saturated. The results may be missing some data

  • @gregorsamsa555
    @gregorsamsa555 3 months ago +3

    I guess the 4GB RX 580 performs worse than the 8GB RX 570 in the latest modern games...

  • @patricktho6546
    @patricktho6546 3 months ago +9

    It was really worth it to go for the 8GB version of the R9 290X instead of the 4GB version.

    • @hrayz
      @hrayz 3 months ago

      When my R9 290X-4GB card died I was able to get the RX 580-8GB card. That 8GB has allowed the card to survive to this day (good for 1080p gaming.)
      My friends and roommate are still using theirs, although I moved on to the RX 6900XT-16GB for 4k use.

  • @andersjjensen
    @andersjjensen 3 months ago

    I just ordered a Ryzen 7 7840U based laptop, which has 780M integrated graphics. The 780M and 6500XT are very evenly matched in terms of performance (+/- 10% for either depending on title). I opted to cough up the premium for 64GB (shared) RAM, as I tend to hold on to my laptops for a very long time (the one I'm typing this from is 12 years old). But that could very well come in handy, as shown here. HD texture packs for old games are a thing, and with more and more games targeting a Steam Deck preset I guess I'll actually be playing on this thing when I'm too lazy to get off my ass.
    What I find insane is that the 780M is a 15W solution while the 6500XT is a 107W solution. TSMC DUV N6 vs TSMC EUV N4 really makes one hell of a difference.

  • @Omni-mal
    @Omni-mal 3 months ago

    Great video. I wonder how it would impact performance if you didn't have the extra system RAM?

    • @Frozoken
      @Frozoken 3 months ago

      Likely a crash, if not unplayable. A similar further performance dip would occur at PCIe Gen 3 speeds because, remember, the GPU isn't getting system RAM bandwidth, it's getting the bandwidth of the PCIe link, which is way lower than even that, so your PCIe link speed basically determines your GPU's bandwidth from system RAM. In other words, if the 6500 XT ran at PCIe x16 it'd probably perform much better with the 4GB model too. To more directly answer your question: if RAM was only just running out and needed to be compressed, like I said, expect a similar performance drop as moving to PCIe Gen 3; if you went way over, almost definitely a hard crash
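
      For a sense of scale, here's a small Python sketch comparing the x4 link's bandwidth with local VRAM; the per-lane figures are the usual post-encoding approximations, and 144 GB/s is the 6500 XT's quoted memory bandwidth:

      ```python
      # Approximate PCIe bandwidth for the x4 link a 6500 XT uses, vs its VRAM.
      PER_LANE_GBS = {"PCIe Gen 3": 0.985, "PCIe Gen 4": 1.969}  # GB/s per lane
      LANES = 4
      VRAM_GBS = 144.0  # the 6500 XT's quoted local memory bandwidth

      for gen, per_lane in PER_LANE_GBS.items():
          link = per_lane * LANES
          print(f"{gen} x{LANES}: ~{link:.1f} GB/s, about {VRAM_GBS / link:.0f}x slower than VRAM")
      ```

      Under these rough numbers, anything spilled over the link arrives 18-37x slower than local VRAM, which is why the 4GB model degrades so sharply.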

  • @happybuggy1582
    @happybuggy1582 3 months ago +16

    1060 3GB owners are crying

    • @alternaterealityfilms
      @alternaterealityfilms 2 months ago +2

      7800GTX 256MB owner here. Literally all you need

    • @kajmak64bit76
      @kajmak64bit76 2 months ago

      The 1060 3GB was a mistake and should never have been made.
      Or if it was to be made, it should at least have had 4GB of VRAM, since the GTX 1050 Ti has 4GB, like wtf

  • @Azureskies01
    @Azureskies01 3 months ago +2

    Everyone with 3070s and 3080s is now on suicide watch

    • @Nurse_Xochitl
      @Nurse_Xochitl 3 months ago +3

      I still use a card with 4GB of VRAM.
      I'm pissed, not so much at NVIDIA for not including more VRAM (although I'm still pissed at them for different reasons; I use Linux and they have no open source drivers for the GTX series of cards), but at the gaming industry as a whole.
      The only thing the gaming industry optimizes is monetization, not the games themselves. A prime example of this is "EOMM", a way to rig matchmaking to increase "engagement", which basically refers to how much people play and pay.
      Modern games do NOT use SBMM!
      SBMM is actually a good thing. Game companies hate it because they can't milk players as well, and content creators/streamers (along with toxic no-life "veteran" players) do NOT tell the truth about it, because they hate that they can't "pubstomp" new players when SBMM is properly implemented.
      Content creators/streamers, BTW, are often paid shills, so when EOMM is brought up they often act skeptical at best, or tell lies about it (and refer to it falsely as SBMM), likely to cover the game companies' asses... because otherwise they could lose their partnerships, get hit with lawsuits and DMCAs, etc.
      Combine that with grinding/progression/unlocks (and of course "FOMO" limited-time events), and every player who doesn't spend a bunch of time grinding or spend money on "Pay To Progress" (which is Pay To Win) crap like boosters will always fall behind, be at a disadvantage, and generally have a worse time gaming.
      Engagement Optimized Match Making:
      web.cs.ucla.edu/~yzsun/papers/WWW17Chen_EOMM
      On top of that, I'd imagine there are probably some sketchy deals going on behind closed doors between hardware manufacturers and game companies.
      Perhaps game companies get new high-end hardware at a huge discount and/or are incentivized to "optimize" their games to run "well" only on the latest, highest-end hardware, while not giving older/weaker hardware any real optimization. ("Optimize" and "well" are in quotes because the goal isn't to actually make the game run well, just barely playable on only the newest, most powerful hardware... so vendors can sell more upgrades.)
      I would not be surprised if there was this much corruption in the gaming industry, as I have seen a lot of it personally as a gamer. (And BTW, as a nurse I can say the healthcare industry is also very corrupt, with big government and big pharma lobbying.) I'd imagine it's a somewhat similar deal here, as I follow the game industry somewhat closely. Heck, even the Australian government is defending microtransactions via state-owned media:
      In Defense of Microtransactions | THE LOOT DROP
      ABC Gamer
      ruclips.net/video/rUnO-njvVZA/видео.htmlfeature=shared
      TBH, we really don't need more than 4GB of VRAM... if game companies would just optimize stuff.
      People could just say to buy a better card, but then it's only a matter of time before that card also becomes useless... which generates more and more e-waste. Not everyone can afford to keep doing that anyway.
      There has to be a point where people put their foot down and crack down on bad optimization.
      People need to stop buying new hardware, especially higher-end hardware, and use their stuff longer.
      They also need to stop supporting bad games (online-only/DRM'd games, games full of grinding/P2W/gambling, games without self-hosted server support, games without modding support, etc.).
      Only then will we see a change in the industry.

  • @mashedindaed
    @mashedindaed 3 месяца назад

    Great video, I didn't realise VRAM had such an impact on performance once it's effectively saturated. One slight critique of the charts, especially when talking about percentage differences, is to always compare in the same direction, otherwise the numbers can be misleading. For example, 15 is 50% more than 10, but 10 is 33.3% less than 15, so the direction of the comparison matters a lot; the difference between 33.3% and 50% is potentially massive. Alternatively, an arrow on the charts indicating which way you're comparing could help de-obfuscate the numbers.
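
    To illustrate the asymmetry with a quick snippet (just the arithmetic from the example above):

    ```python
    # Percentage changes aren't symmetric: the baseline you divide by matters.
    a, b = 10, 15
    print(f"{b} is {(b - a) / a:+.1%} relative to {a}")  # +50.0%
    print(f"{a} is {(a - b) / b:+.1%} relative to {b}")  # -33.3%
    ```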

  • @dianaalyssa8726
    @dianaalyssa8726 3 месяца назад

    Great video. I'm a bit curious how the old 3GB (thinking of the 1060 3GB) vs 4GB vs 8GB vs 10GB would compare. I consider 12GB and up for new main rigs, if the budget is there.

  • @coganjr
    @coganjr 3 месяца назад +4

    I love how this community is always complaining about needing more VRAM while game developers freely get away with releasing unoptimized games.
    What an awesome community LOL

    • @Hardwareunboxed
      @Hardwareunboxed  3 месяца назад +3

      Don't see how the two are connected, but okay. It would be very odd to expect modern games to work well on old 4GB graphics cards, right?

    • @coganjr
      @coganjr 3 месяца назад

      @@Hardwareunboxed Yes, modern games will not run well on 4GB or even 8GB of VRAM.
      What I mean is, look at recent unoptimized games that need a lot of VRAM without delivering the graphical fidelity to match, like Alan Wake 2 or Cyberpunk 2077.

    • @Hardwareunboxed
      @Hardwareunboxed  3 месяца назад +2

      Alan Wake 2 is pretty heavy on VRAM, especially if you use RT, while CP2077 has fairly poor texture quality.

  • @neurofiber2406
    @neurofiber2406 3 месяца назад

    Another great video Steve. Since I'm sorely in need of a GPU upgrade...

  • @RetrOrigin
    @RetrOrigin 3 месяца назад

    This also helps demonstrate that texture quality/resolution doesn't really affect framerates much, if at all, as long as you have enough VRAM.
    I often see people turn texture quality down even when their video card has more than enough VRAM, thinking it will help with performance when it usually doesn't.

  • @LlywellynOBrien
    @LlywellynOBrien 2 месяца назад

    Were the screen caps for the texture quality on Warhammer backwards? It just looked like the ultra one with the additional effects was on the right. Unless this is a Cities: Skylines 2 situation again.

  • @rangersmith4652
    @rangersmith4652 3 месяца назад +1

    My first 100% home-assembled PC sported an FX-6300 and an R9 270X 4GB. Yes, even way back in 2014, going AMD meant we could opt for extra VRAM.

  • @alpha007org
    @alpha007org 3 месяца назад +2

    Thank you for not being upset, Steve. Mental stability is just as important as frametime stability.

  • @unclej3910
    @unclej3910 Месяц назад +1

    I don't know why the VRAM discussion is such a divisive hot-button topic.

  • @ipotato95
    @ipotato95 3 месяца назад

    This would be a great video to revisit with 8GB vs 12GB when the next generation of graphics cards is announced.

  • @timduck8506
    @timduck8506 3 месяца назад +1

    I'm so glad I bought an RTX 3080 16GB laptop version for my travels 3 years ago. Total specs are 32GB RAM and a 5900HX CPU with 8TB of storage. 😃

  • @_Azurael_
    @_Azurael_ Месяц назад +2

    To be honest, I'm just getting into this whole VRAM-importance thing now because I'm buying a new PC.
    My old PC with a GTX 1070 OC (8GB) is still able to run most games at 2K... I played Elden Ring last year at 50-60fps. I then started BG3, and that game forced me to lower graphics quality to medium/low, but it still runs at 50-60fps.
    Because it's getting to the point where I have to lower graphics below Medium, I'm upgrading, but I don't know if the problem is VRAM or simply raw power.
    The only time I see VRAM completely used is in very unoptimized games.

  • @jimmyjiang3413
    @jimmyjiang3413 3 месяца назад +2

    It reminds me why professional (Quadro) RTX and Radeon Pro cards use double the VRAM buffer. I'm not sure whether something like the RTX 4000 SFF (Ada) is worth it for a given SFF build for VRAM buffer reasons, or simply for performance per watt. 🤷🏻‍♀️

  • @AndersHass
    @AndersHass 3 месяца назад

    I didn't know liquid metal could dry out. I thought the main concern was that if it isn't contained, it could bridge connections and damage the electronics, whereas regular thermal paste and those cryosheets aren't electrically conductive, so there's no fear of any electrical damage.

    • @AndersHass
      @AndersHass 3 месяца назад

      ​​@@InnerFury666 I'm not sure PTM7950 thermal pads avoid the pressure issues that cryosheets had. At least I would think that for pads in general, pressure can be a bigger issue than for more liquid solutions like liquid metal and thermal paste (at least if you don't use too much). Maybe PTM7950 pads are thinner and thereby more easily get pressure onto the surfaces.

  • @rxpacman1893
    @rxpacman1893 3 месяца назад +1

    Would've been good to hear how memory bus width and memory type (i.e. GDDR6X) play a part in all this. Good vid though 👍

  • @ShaunRoselt
    @ShaunRoselt 3 месяца назад +1

    I'd love to see this, but at 4K. It would be really interesting to see the VRAM usage.

  • @lake5044
    @lake5044 3 месяца назад +5

    One important thing to keep in mind is that no matter which memory we're talking about (RAM, VRAM, etc.), utilizing it close to its limits (say, above 80%) will always carry some extra overhead. The details get very complicated (and I don't know them all), but it depends on many things, like the controller and the algorithm used to allocate and retrieve the data. Basically, the issue boils down to having to move data around, either to consolidate unused space or to avoid random reads across different module subdivisions for performance reasons. Those techniques sacrifice speed to use as much of the empty space as possible, but for performance the ideal is to have so much empty space that those data-shuffling techniques never trigger.
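
    A toy sketch of the consolidation point (purely illustrative; real memory controllers and drivers are far more sophisticated): as utilization climbs, total free space and the largest contiguous gap diverge, and that gap is when allocators start moving data around.

    ```python
    # Fill memory with small blocks, free random ones, then compare total
    # free space against the largest single contiguous gap.
    import random

    CAPACITY = 4096
    rng = random.Random(42)

    def fragmentation_at(used_fraction):
        blocks, cursor = [], 0
        while cursor < CAPACITY:  # pack memory end-to-end with small blocks
            size = min(rng.randint(8, 64), CAPACITY - cursor)
            blocks.append((cursor, size))
            cursor += size
        # Free random blocks until only `used_fraction` remains allocated.
        while sum(s for _, s in blocks) > CAPACITY * used_fraction:
            blocks.pop(rng.randrange(len(blocks)))
        gaps, cursor = [], 0
        for off, size in blocks:  # blocks stay sorted by offset
            if off > cursor:
                gaps.append(off - cursor)
            cursor = off + size
        gaps.append(CAPACITY - cursor)
        return CAPACITY - sum(s for _, s in blocks), max(gaps)

    for u in (0.5, 0.8, 0.95):
        free, largest = fragmentation_at(u)
        print(f"{u:.0%} used: {free} units free in total, largest contiguous gap {largest}")
    ```

    A big allocation can fail even though enough memory is free in total, and the fix (relocating live data) costs bandwidth exactly when you have the least to spare.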

    • @Citizenflaba
      @Citizenflaba 3 месяца назад +2

      4090 purchase vindicated

  • @williamrutter3619
    @williamrutter3619 3 месяца назад +1

    Interesting video. It would be good to see this mixed up with PCIe 3.0 and 4.0, DDR4 and DDR5, and an RX 580 4GB vs 8GB.

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 2 месяца назад +1

    Meanwhile in laptops, a 4070 costs $250-$300 MORE than a 4060 and both come with 8GB of memory.
    I remember buying an RX 480 with 8GB for just $250 over 7 years ago. Even back then we were saying that 4GB wasn't enough for 1440p.

  • @SPPACATR
    @SPPACATR 3 месяца назад

    Dude sounds sooooo chill in this video lol.

  • @carlkidd752
    @carlkidd752 3 месяца назад

    Console/PC gaming is my hobby, versus cars or camping or fishing, so I budget for the best I can afford at any particular time. My last and longest-serving GPU was an EVGA 1080 Ti FTW3, which my grandkids now have. Through various monitors, including my current Hisense 55U8G (a 4K 120Hz TV), it performed outstandingly. When the 7900 XTX briefly went on sale for $900, I upgraded. Very pleased with my Hellhound, and what I play can't use RT anyway. As the 1080 Ti also had no RT capability, I don't miss not having it.
    From your reviews and others, reasonably priced GPUs (under $600) have a tough time at 4K, let alone with RT enabled. For me, skipping RT and spending $500-550 for decent 1440p/4K gaming makes a compelling buy-AMD argument. Heck, imagine EVGA making new 1080 Tis for $400.
    Want versus need. I didn't need a new GPU, but I wanted all the non-RT eye candy at 4K. If your desire for "better" starts to overwhelm what you really need... skip the Friday night trip to the bar, skip Starbucks (my talk and text is $20 a month), keep one streaming service instead of 4 or 5 or 6, and shoot, in a couple of months you could buy any GPU you want. I'm retired, so I already do those "skip" things, which is why I could buy a $900 GPU. I could have bought a $1600 4090, but its RT functions would have been wasted. The $700 premium for a few more frames would equate to lighting cigars with $50 bills. I don't need or want a Rolls-Royce for trips to the grocery store.

  • @mleise8292
    @mleise8292 3 месяца назад

    24:59 While you're right that the Vega 56 wasn't an affordable 8 GB card originally, it was still the latest AMD offering when it came down in price to 260€ (VAT _not_ included) in the first half of 2019. I still use mine. 😅

  • @Raxiel497
    @Raxiel497 3 месяца назад

    It may have seemed excessive at the time, but my 1070 with its 8GB had some pretty long legs. Bought at launch, it has only just been replaced with a 4070 Super. The new card's 12GB frame buffer is a 50% improvement, but I took it begrudgingly and would have preferred 16GB (though not enough to spend an extra £200 on the Ti). I don't plan on replacing it for at least 6 years; time will tell if that's reasonable. The previous upgrade was a much larger jump, coming from an AMD HD 6970 with just 2GB of VRAM.
    As for the 1070, it will probably see a good few more years in the family PC.

  • @milanbajic242pcchannel
    @milanbajic242pcchannel 3 месяца назад +1

    For me, the Global Textures setting is always the most important in a game, and only then the other settings. I can play with High+ textures and everything else on Medium at 60 FPS without any problems; that's my MINIMUM, below which I won't go. My PC: 1080p, R5 5600X, (2x8) 16GB 3200MHz CL16, RTX 2070 8GB. For now everything works as I wish (playing games with everything on High+ settings). Thanks for the amazing video, have an awesome day Hardware Unboxed. 👍❤💯🔔

  • @EmblemParade
    @EmblemParade 3 месяца назад +1

    To be honest, this increasing appetite for VRAM happened faster than most of us expected. Some devs were signaling that this would happen for a while, but we assumed that the baseline represented by PS5 and Xbox Series X would limit requirements for PC, too. The takeaway is that we can't assume that anymore. Devs are targeting a higher resource profile for PC gaming than for consoles, period.

  • @kicsiduda
    @kicsiduda 3 месяца назад

    Very well made video, thank you

  • @owlmostdead9492
    @owlmostdead9492 3 месяца назад +5

    Same goes for RAM: I believe it's a crime against nature (as in, creating e-waste) to pair a modern CPU with less than 16GB of RAM. More so if it's soldered and not upgradeable.

  • @666Maeglin
    @666Maeglin 3 месяца назад +1

    Great video! Would it be possible to make one comparing 8, 12 and 16 GB of VRAM?

    • @Hardwareunboxed
      @Hardwareunboxed  3 месяца назад +4

      Not unless there is an 8, 12 and 16 GB model with the same GPU.

    • @666Maeglin
      @666Maeglin 3 месяца назад

      @@Hardwareunboxed That was my question, though badly formulated. With all the model fuck-uppery done by Nvidia, I thought it must be possible that such a model exists.

  • @johnk.7836
    @johnk.7836 3 месяца назад

    Appreciate you sharing the benchmarks and the knowledge - this is how we learn and become better shoppers and consumers.

  • @_Jayonics
    @_Jayonics 3 месяца назад

    I know the purpose of this video is to show what's needed and that graphics cards need more memory. But what about cases where buying a new graphics card is not an option (like a laptop, where it's soldered) or just not something you want to do?
    What other options do you have to improve performance in VRAM-constrained environments?
    On the settings and software side, what settings should you change to keep VRAM usage low? Just textures? Is it worth using DLSS or FSR to render at lower resolutions while putting more processing overhead onto the GPU? (A rough sketch of that trade-off follows below.)
    When the VRAM buffer is exceeded, system RAM is used. How much benefit would you get from upgrading your DRAM to something of higher density, higher frequency, lower latency, etc.?
    I think how system RAM affects graphics performance is going to become increasingly important as the AMD Strix Point and Strix Point Halo APUs take market share from low-end GPUs, and I suspect Intel will follow suit with Battlemage APUs, considering Iris Xe APUs exist. In that scenario DRAM is the only source of VRAM, so it has even more impact.
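
    On the render-resolution lever specifically, a rough sketch under my own assumptions (~6 full-resolution buffers at 4 bytes per pixel; real engines vary a lot, and textures, which usually dominate VRAM, are unaffected by this):

    ```python
    # Estimate how upscaling shrinks render-target VRAM. Illustrative only:
    # assumes ~6 full-res buffers (color, depth, G-buffer planes) at 4 B/px.

    def render_targets_mb(width, height, bytes_per_pixel=4, buffers=6):
        return width * height * bytes_per_pixel * buffers / 1024**2

    native = render_targets_mb(2560, 1440)
    # DLSS/FSR "Quality" mode renders at roughly 67% of native per axis
    quality = render_targets_mb(int(2560 * 0.67), int(1440 * 0.67))
    print(f"native 1440p: ~{native:.0f} MB, Quality upscaling: ~{quality:.0f} MB")
    ```

    So upscaling helps at the margin, but since textures are usually the biggest consumer, the texture preset remains the main lever, which matches what the video shows.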