How bad is 8GB of VRAM in 2023? The newest games, RT on/off, Ultra, High, 1080p, 1440p, 4K

  • Published: 21 Aug 2024

Comments • 2.1K

  • @danielowentech
    @danielowentech  Год назад +82

    Sell your old GPU to fund your upgrade at Jawa! bit.ly/JawaOwenGPUAug23

    • @ihor1337
      @ihor1337 Год назад +41

      They offered me $69.25 for my 1080 Ti, lol. It goes for $150-170 on eBay.

    • @Dumnorix76
      @Dumnorix76 Год назад +4

      Too bad it's only in US

    • @Z3t487
      @Z3t487 Год назад +17

      @@ihor1337 Really? Thanks for letting us know. Not gonna bother even trying it then.

    • @retakleksum111
      @retakleksum111 Год назад +9

      @@ihor1337 I don't see the point in a site like that since eBay is a bigger market for anything

    • @ging3rbreadtiger618
      @ging3rbreadtiger618 Год назад +1

      I cannot find this specific page on Jawa when I visit the site.

  • @guanyu210379
    @guanyu210379 Год назад +1327

    The problem with 8GB VRAM being called bad is not only the technical side of the story. It is also about pricing.
    GPU pricing today is simply outrageously high... it is normal that people expect to get more because they are paying more.

    • @matthewjohnson1891
      @matthewjohnson1891 Год назад +25

      I bought a Voodoo 2 card in the late 90s for $250. That's close to $500 today, so about the same as the budget cards. "Top tier" overkill like the 4090 didn't exist then because the market was so small.

    • @guanyu210379
      @guanyu210379 Год назад

      @@matthewjohnson1891 I haven't heard "Voodoo" in years ^o^
      I tend to use the last few GPU generations I bought as a reference. My older cards like the S3 Savage 4, Riva TNT2, etc. I don't use as price references anymore.
      Basically,
      I bought an HD 3850 for €80, an HD 6850 for €110, a GTX 970 for €330 and my current 1070 Ti for €375.
      Prices on RTX 3xxx, RTX 4xxx, RX 6000 and RX 7xxx have started to rise. Now €400 doesn't even get you mid-range... this is nuts... For me the RTX 4070 is mid-range, but look at the pricing.

    • @baronsengir187
      @baronsengir187 Год назад +6

      @@matthewjohnson1891 Did you forget the Voodoo 5 5500?

    • @honza9911
      @honza9911 Год назад +2

      I have a question for you: there are only two GPUs on the market, one 8GB and one 16GB. The 8GB has a 256-bit bus and the 16GB has a 128-bit bus. Which one do you choose if the 8GB is 400 and the 16GB is 500?

    • @yves1926
      @yves1926 Год назад +14

      The problem is that people (gamers, mostly) buy 8GB graphics cards, so Nvidia sells them...

  • @sparklingbra
    @sparklingbra Год назад +463

    Some games don't suffer an FPS hit when the GPU runs out of VRAM. Instead they suffer texture pop-in or textures not loading at all, so FPS numbers alone don't always tell the full story between 8GB and 16GB.

    • @teutorixaleria918
      @teutorixaleria918 10 месяцев назад +47

      More often it's both. Any game I've seen where textures start popping in late due to VRAM limits also gets stuttery AF; it might not show in average FPS but it definitely will in the lows (see the sketch at the end of this thread).

    • @TechnoFreak-IN
      @TechnoFreak-IN 7 месяцев назад +10

      On my 2060 with 6GB I'm getting the texture issue in Warzone; sometimes guns become invisible at the start and then appear after 30 seconds!

    • @lesterama6110
      @lesterama6110 5 месяцев назад

      Horizon Zero Dawn is the best example.

    • @crisr4667
      @crisr4667 5 месяцев назад

      @@TechnoFreak-IN Run it all at the lowest settings; fuck it, might as well.

    • @samplautz5586
      @samplautz5586 4 месяца назад

      I have a 1050 super, and it runs at over 60fps in flight simulator and Jedi Survivor. You are right though, the textures are very poor
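
A rough sketch of the point raised in this thread: average FPS can look fine while the 1% lows collapse. The frametime values below are invented for illustration.

```python
# Sketch: average FPS vs 1% low FPS from a frametime log (values in milliseconds).
# The frametimes below are made up; a real log would come from a capture tool
# such as CapFrameX or PresentMon.

frametimes_ms = [16.7] * 95 + [70.0] * 5  # mostly smooth, with a few VRAM-swap spikes

def average_fps(frametimes):
    return 1000.0 * len(frametimes) / sum(frametimes)

def one_percent_low_fps(frametimes):
    # Take the slowest 1% of frames and report the FPS they correspond to.
    worst = sorted(frametimes, reverse=True)
    count = max(1, len(worst) // 100)
    slowest = worst[:count]
    return 1000.0 * count / sum(slowest)

print(f"average: {average_fps(frametimes_ms):.1f} fps")         # ~52 fps, looks fine
print(f"1% low:  {one_percent_low_fps(frametimes_ms):.1f} fps")  # ~14 fps, feels awful
```

The average still looks playable, while the 1% low exposes the hitching that late texture streaming causes.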

  • @imbro2529
    @imbro2529 Год назад +224

    "8GBs isn't enough!!!!!"
    *meanwhile*
    Steam Hardware Survey: 4GB 1650 Most used card

    • @JamesBond77
      @JamesBond77 6 месяцев назад +8

      Stop mocking the poors my man.😇

    • @Who_Does
      @Who_Does 6 месяцев назад +44

      tbf 1650 4Gb is more than enough for Steam's consistent top games (Dota2, CSGO, PUBG)

    • @infernal-toad
      @infernal-toad 5 месяцев назад +9

      8GB VRAM will be enough for around a few more years for playing at minimum settings. That's what counts. As long as you can play the game with good performance at the lowest settings, don't complain. My laptop has 6GB VRAM and I only bought it a few months ago. And I am not someone who cares about playing at the highest settings.

    • @FireJach
      @FireJach 4 месяца назад +1

      @@infernal-toad 😂 According to that logic I can buy whatever GPU with more VRAM and be able to play games on the highest settings

    • @phantom16518
      @phantom16518 4 месяца назад +2

      I used to run valve games with my 512mb Vram laptop up to 2021 (CSGO wouldn't run on it anymore), dark times 😅

  • @Erik_Dz
    @Erik_Dz Год назад +130

    One thing to note for UE4 games (like Jedi Survivor): the engine prioritizes FPS when VRAM is capped. It will automatically load lower-resolution textures to maintain a stable frame rate and prevent stuttering. This is why games on other engines show more obvious FPS drops and stuttering in these benchmarks (see the sketch at the end of this thread).
    UE5's Nanite behaves similarly to how UE4 handles VRAM overflow, though it is much less noticeable than entire textures being loaded at lower resolution.

    • @slickrounder6045
      @slickrounder6045 Год назад +12

      So you are saying that Jedi Survivor is doing what games like Forspoken do, namely loading in lower quality textures when Vram capped?? If so this is a big deal. I don't think I've seen HUB ever show that with Jedi Survivor, and seemingly Daniel isn't aware of that either or he would have mentioned it in this video.

    • @LoneWolf-tk9em
      @LoneWolf-tk9em Год назад +7

      That's very important information

    • @Erik_Dz
      @Erik_Dz Год назад +4

      @@slickrounder6045 Daniel did mention it in passing in the video: that some textures may be lower resolution but he can't see it. Pretty sure he knows about it and has talked about it in previous videos. It was a bigger deal with Hogwarts, because it's super noticeable in that game. It happens in every UE4 game. It's a core 'feature' of the engine.

    • @xerxeslv
      @xerxeslv Год назад

      @@Erik_Dz So you think UE5 does that by default also? Like, I don't really care what kind of wizardry is going on in the background if the picture looks good and FPS is stable. Looking at Remnant at the same time, though, performance is pretty bad but VRAM consumption seems pretty low.

    • @1O1O11
      @1O1O11 Год назад +1

      @@Erik_Dz It is also one of those things that might not be noticeable in a testing scenario, but that you might start to notice after several hours of gaming.
      So on one hand, I think it is great that games will still be playable on older 8GB GPUs long into the future, and I am certainly not going to throw out the RTX 2070 that is in my TV PC rig... but I am also certainly not paying money to buy a brand new 8GB VRAM GPU in 2023...
      But anyway... Unreal Engine sometimes gets a bad rap because only the 3 or 4 most graphically demanding games ever get tested, while 99% of the Unreal games out there run at low enough specs that no YouTube tech channel would bother using them to benchmark. You are never going to see a highly optimized game like Hi-Fi Rush used for benchmarking purposes. In the next 5 years there will be hundreds of games like Hi-Fi Rush in Unreal Engine 5, but people will shit on the engine because of games like Remnant.
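
A toy sketch of the behavior described at the top of this thread. This is not Unreal's actual streaming code; it only illustrates the general idea of demoting textures to lower-resolution mips instead of stalling when the pool is over budget. The texture names, sizes and budgets are made up.

```python
# Toy illustration (not Unreal's real streamer): when the requested textures would
# exceed the VRAM texture pool, fall back to smaller mips instead of stalling.

def mip_size_mb(base_mb, mip):
    # Each mip level halves width and height, so memory drops by ~4x per level.
    return base_mb / (4 ** mip)

def fit_textures_to_budget(textures, budget_mb):
    """textures: dict of name -> full-resolution size in MB.
    Returns a chosen mip level per texture so the total fits the budget."""
    chosen = {name: 0 for name in textures}          # start with full-res mip 0
    def total():
        return sum(mip_size_mb(textures[n], chosen[n]) for n in chosen)
    while total() > budget_mb:
        # Demote the texture currently taking the most memory by one mip level.
        biggest = max(chosen, key=lambda n: mip_size_mb(textures[n], chosen[n]))
        chosen[biggest] += 1
    return chosen

scene = {"terrain": 512, "hero": 256, "props": 128, "skybox": 64}  # made-up sizes in MB
print(fit_textures_to_budget(scene, budget_mb=1000))  # roomy budget: everything stays at mip 0
print(fit_textures_to_budget(scene, budget_mb=300))   # tight budget: the biggest textures get demoted
```

FPS stays stable because nothing waits on memory, but the largest textures quietly lose resolution, which matches what the benchmarks show.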

  • @Steven-ex3ne
    @Steven-ex3ne Год назад +1167

    $100 for 10% extra performance. Thanks Nvidia.

    • @danielowentech
      @danielowentech  Год назад +425

      The more you buy, the more you save!

    • @Steven-ex3ne
      @Steven-ex3ne Год назад +434

      @@danielowentech the less I buy, the more I save!

    • @100500daniel
      @100500daniel Год назад +70

      So you'll get either the 8GB model or the 12gb 4070 and have VRAM related issues by the time RTX 50 launches lmao. They can't make the deal too sweet.

    • @GewelReal
      @GewelReal Год назад +20

      Depends on the title. Some titles show a massive difference, so the extra $100 is 100% worth it

    • @Onisak25
      @Onisak25 Год назад +86

      @@GewelReal No it's not. Not on a 128-bit bus lol.

  • @riven4121
    @riven4121 Год назад +874

    Really nice of Nvidia to give the 4050 16 gigs of memory

    • @oleksandrrogozhnikov5525
      @oleksandrrogozhnikov5525 Год назад +91

      And price around 350 bucks😂

    • @heyjoe335
      @heyjoe335 Год назад +164

      Also nice of them to charge a 70 series price for it. 👍

    • @jasonking1284
      @jasonking1284 Год назад +54

      It would be far, far more successful as a GPU if they un-launched the "4060/Ti" and re-launched it as the 4050/Ti starting from its correct price of $200...
      Then un-launch the "4070/Ti" and re-launch it as the true 4060/Ti starting at $300...
      Watch those cards fly off the shelves rather than, as now, gathering layers of dust...

    • @SwingArmCity
      @SwingArmCity Год назад +20

      Nvidia's response.....F you buy our products!

    • @RockyRacoon5
      @RockyRacoon5 Год назад

      @@oleksandrrogozhnikov5525 More like $500 lmao

  • @IrishShea
    @IrishShea Год назад +17

    An excellent video. Things get even worse for VR users such as myself; I've been considering replacing my 3070 Ti with the 4060 Ti 16GB just to improve my VR experience overall... mad times we live in!
    *Bought an RX 6800 XT days after posting this, has helped more than I thought it would.*

    • @michael.calisthenics
      @michael.calisthenics 6 месяцев назад +1

      Which VR headset have you got at home? Because I have a GTX 1660 Super and I want to get a 3070 for my new Quest 2. My calculations say that 8GB should be fine?

    • @IrishShea
      @IrishShea 6 месяцев назад

      @@michael.calisthenics I now own the Quest 3 / 6800 XT, a good combo. The Quest 2 / 3070 should be fine; just expect to drop texture quality to stay within the 8GB VRAM limit.

  • @karansingh1154
    @karansingh1154 Год назад +19

    Texture swapping should also be taken into consideration when VRAM-constrained.

    • @slickrounder6045
      @slickrounder6045 Год назад +6

      Yeah, this is a big thing many people miss. They see seemingly good FPS on these obsolete 8GB cards and say "oh look, everything is fine", meanwhile they don't realize that textures are being swapped or not loading properly... So it might SEEM like the FPS is similar, but the graphical difference between, say, 8GB of VRAM and 16GB could be huge if the former is stuck loading low-res default textures or shuffling textures around like a maniac.

    • @karansingh1154
      @karansingh1154 Год назад +5

      @@slickrounder6045 The exact thing happened to me. I was playing the RE4 remake on an RX 6600 and didn't realize how big the visual differences were until I upgraded to a 6800 XT.

  • @willia451
    @willia451 Год назад +696

    So 8GB is not obsolete. Just not desirable in 2023 and moving forward. Got it. Thanks for the great work, Daniel. Appreciated.

    • @hasnihossainsami8375
      @hasnihossainsami8375 Год назад +162

      I don't think the consensus was that 8GB is obsolete, but rather that it's not acceptable on mid/high-end GPUs; there's no excuse for putting 8GB of memory on a $400 card when GDDR has gotten so cheap. Even AMD isn't a hero for putting 8GB on the 7600 and asking $270, and imo an 8GB card shouldn't cost more than $200 in 2023.

    • @Splarkszter
      @Splarkszter Год назад +60

      You can play at 1080p high with 8GB no problem.
      Ultra is unnecessary; there is no visual difference between High and Ultra settings.

    • @MicaelAzevedo
      @MicaelAzevedo Год назад +117

      @@Splarkszter The point is not ultra. It's that 8GB bottlenecks games from 2020. An 8GB card cannot properly use DLSS 3 frame generation, which Nvidia advertises as a feature. Stop defending these greedy companies, as they do not care about you lol

    • @BrunoFerreira-fp1vb
      @BrunoFerreira-fp1vb Год назад +22

      @@Splarkszter Agreed, even at 1440p high I have no problem.

    • @Adrian-is6qn
      @Adrian-is6qn Год назад +18

      If you have an 8GB card it's fine; I would just not buy a new one with this amount. You definitely want to get a 6700 XT if you are getting a new GPU, or a 4070 for Nvidia.

  • @Humanaut.
    @Humanaut. Год назад +543

    It needs to be noted that the 16GB version of the 4060 Ti does NOT get the full benefit of 16GB because of the 128-bit bus (see the bandwidth numbers at the end of this thread).

    • @iDeparture
      @iDeparture Год назад

      THANK YOU. For fuck's sake, did everyone forget? Fucking Nvidia. Watch the testing where outlets bench it: its performance, Ti and non-Ti, 8GB and 16GB, versus the 30- and 20-series cards is pathetic. Gamers Nexus, anyone? It's a joke. Steve reams Nvidia like they should be reamed, as does any even half-honest review.

    • @OSemRival
      @OSemRival Год назад +118

      Yes, there was a test by a guy at MSI that clearly shows this.
      The 4060 Ti 16GB was worse in many scenarios...
      It got deleted from YouTube.

    • @Humanaut.
      @Humanaut. Год назад

      @@OSemRival Interesting to know!

    • @iDeparture
      @iDeparture Год назад

      @@OSemRival There's more, a lot more

    • @retrofizz727
      @retrofizz727 Год назад +21

      still better than 8gb tho
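
Quick bandwidth math behind this thread. The bus widths and GDDR6 data rates below are the published specs for these cards.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate per pin (Gbps).

def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbs(128, 18))  # RTX 4060 Ti (8GB and 16GB): 288 GB/s
print(bandwidth_gbs(256, 14))  # RTX 3060 Ti: 448 GB/s
```

Both 4060 Ti variants share the same 288 GB/s, so the 16GB model adds capacity but no extra bandwidth; the larger L2 cache is meant to compensate, which works until the working set spills.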

  • @dima6969
    @dima6969 6 месяцев назад +13

    Me watching with 3gb vram:

  • @s1p0
    @s1p0 Год назад +59

    This test would be interesting with RTX 3060 12GB (vs 4060 8GB).

    • @nickochioneantony9288
      @nickochioneantony9288 Год назад +12

      Done multiple times... the 4060 "generally" wins with higher FPS unless you pass the VRAM limit (which is still a rare case because 60-series buyers aim for 1080p high anyway)... the PROBLEM is that memory bus; it's simply unacceptable because it may bottleneck in many new games. It was designed to be obsolete in 2 years max, so you'll end up buying a new GPU in that timeframe

    • @fabrb26
      @fabrb26 Год назад

      @@nickochioneantony9288 That's why people buy the 6700 XT: 4060 Ti performance and 3060-level VRAM capacity. Easy pick

    • @rhdbenchmarks
      @rhdbenchmarks 11 месяцев назад +1

      The only reason I wanted a 4060 but settled for the 3060 12GB is DLSS FG, which doesn't even work correctly with 8GB of VRAM.
      I've seen plenty of games passing the 8GB threshold; hell, FH5 at Extreme uses 10GB of VRAM, and TLoU uses up to 11.7GB of VRAM at 1080p Ultra.
      I was afraid it would become obsolete really fast due to that VRAM lol

    • @xXXEnderCraftXXx
      @xXXEnderCraftXXx 9 месяцев назад +1

      @@rhdbenchmarks I am rocking an RTX 2080, which is an 8GB card. FH5 is playable at Extreme with a few tweaks in the settings. But you are mostly right.

    • @jbmusic4095
      @jbmusic4095 4 месяца назад

      @@rhdbenchmarks Are you happy with your 3060 12GB? I'm basically in the same boat now, considering a 3060 12GB vs a 4060 8GB

  • @zenix4416
    @zenix4416 Год назад +149

    I'm so glad that I bought an RX 6700 XT with 12GB of VRAM instead of an RX 6600 XT with 8GB a year ago. I didn't really need the performance or the VRAM then, but now I really appreciate them.

    • @ocha-time
      @ocha-time Год назад +23

      6700xt was one of the only deals in the house for so long, and even now comparing it to Nvidia price competitors, it still makes the most sense somehow.

    • @notrixamoris3318
      @notrixamoris3318 Год назад +7

      12gb vram is or will be the bare minimum...

    • @Onisak25
      @Onisak25 Год назад +44

      @@notrixamoris3318 Do you even know what bare minimum means? People play Ratchet and Clank on 4GB cards just fine when turning down the settings.

    • @Z3t487
      @Z3t487 Год назад +3

      @@Onisak25 I think he meant that for Ultra settings it is/will be the bare minimum.

    • @Onisak25
      @Onisak25 Год назад +23

      @@Z3t487 Every site that knows what it's talking about doesn't recommend ultra. So 8GB is totally fine.

  • @jasonnchuleft894
    @jasonnchuleft894 Год назад +10

    Just a quick note on performance when VRAM capped: in games that support it, decompressing assets prior to launching the game and having it load those instead gets rid of much of the frametime spiking whenever the VRAM is overloaded. It massively increases file-system usage, however, and it won't otherwise improve FPS; it just leads to a noticeably smoother experience. Still, it can be a viable way to salvage an otherwise unplayably skippy experience.

    • @SmoothSportsGaming
      @SmoothSportsGaming 2 месяца назад

      How do you decompress the assets prior to launching?

    • @jasonnchuleft894
      @jasonnchuleft894 2 месяца назад

      @@SmoothSportsGaming Depends on the game. In some, all you need is a tool like 7-Zip to extract the assets from the asset packs and put them in folders with the same name as the asset pack. In others you need modding tools to even extract the data. And some don't natively support loading the uncompressed assets, so you need a mod. Not sure how it works in more recent games without mod support though; I haven't had to look into it for years, since VRAM capping used to be pretty rare until Nvidia went retard mode with the 4000 series 🤔
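
A minimal sketch of the pre-extraction idea described above, assuming the game's asset packs happen to be zip-compatible archives (many are not and need dedicated modding tools, as the reply says). The install path and the .pak extension here are hypothetical.

```python
# Sketch: extract each asset pack into a folder with the same name, so the game
# (if it supports loose files) loads uncompressed assets instead of unpacking at runtime.
# Paths and extensions are hypothetical examples.

import zipfile
from pathlib import Path

GAME_DIR = Path("C:/Games/SomeGame/Content/Paks")  # hypothetical install path
PACK_PATTERN = "*.pak"                              # hypothetical pack extension

for pack in GAME_DIR.glob(PACK_PATTERN):
    if not zipfile.is_zipfile(pack):
        print(f"skipping {pack.name}: not a zip-compatible archive")
        continue
    target = pack.with_suffix("")      # folder named after the pack
    target.mkdir(exist_ok=True)
    with zipfile.ZipFile(pack) as zf:
        zf.extractall(target)          # loose files trade disk space for less runtime decompression
    print(f"extracted {pack.name} -> {target.name}/")
```

As the reply says, this trades a lot of disk space and file-system traffic for smoother frametimes, and it only helps in games that will actually load loose files.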

  • @theftking
    @theftking Год назад +1

    Having gone from the VRAM-starved 3070 Ti to a 4090, I think it comes down to this: 12GB of VRAM _should_ be enough for 1440p for the foreseeable future, save for a few poorly-optimized games that go over.
    ...the problem is, _gamers still want to play those games,_ even if they're poorly optimized. RE4 Remake already can't be maxed on 12GB at 1440p.
    It feels bad that you can buy a graphics card more expensive than the flagships of the day and then _not be able to play every game at your preferred resolution._ That sucks, and it's why VRAM will always be a sticking point.
    You said it best in your RE4 vid: it just FEELS BAD when your game crashes due to VRAM overallocation when you _know_ your GPU could handle it with $8 more GDDR.

  • @syncmonism
    @syncmonism Год назад +177

    8GB is fine for cards that are around 230 or below. Anything more expensive than that gets too close in price to the excellent 6700 XT, and, to a lesser extent, the 10GB 6700 (non-XT), and 16GB A770 as well. As of today, August 3rd, 2023, The Sapphire Pulse version of the 6700 XT is available for 330 on Newegg in the US, and that's a very good quality model, which I strongly believe is easily worth at least 20 dollars more than any of the cheaper models from other brands. 330 is an EXCELLENT price for that model of the 6700 XT. If you're building a new system, I think that's definitely worth paying around 130 more than a 6600. If you're upgrading an older system, or are ONLY going to be playing older or less demanding games, then an RX 6600 or 6600 XT, or A750, would all be decent choices as well, though the A750 does require a more modern CPU and motherboard than the 6600 or 6600 XT does in order to avoid bottlenecks.
    The 16GB A770 has been as low as 290 recently. It's significantly slower than the 6700 XT on average, but its performance could improve more with driver updates, it does run as well or even better than the 6700 XT in some games, and it does perform surprisingly well in ray tracing, and XESS upscaling is actually very good in games which support it as well. I would stay away from the Intel cards entirely unless you're comfortable understanding that you're probably going to have a less consistent experience, and might have to do a bit more trouble-shooting with them, but I still can't deny that they offer a good value as long as you're comfortable with that.
    The 16GB 4060 Ti does an amazing job of making the 8GB version look really bad, but the 6800 XT costs about the same, is roughly 30% faster than the 16GB 4060 Ti on average, is usually about the same or slightly faster even with ray tracing enabled, and also has nearly double the memory bandwidth, which makes both versions of the 4060 Ti look incredibly overpriced by comparison (see the quick value math at the end of this thread).

    • @angrysocialjusticewarrior
      @angrysocialjusticewarrior Год назад +16

      I disagree, all the cards you listed are too slow to be used in 2023. The only GPU I recommend is an RTX 4090. For those who are poor or on a budget, the RTX 4070 is the lowest I can recommend.

    • @Humanaut.
      @Humanaut. Год назад

      200 max

    • @globalzero
      @globalzero Год назад +5

      What about RTX 3060 12GB?! In my country ( Germany ) it is much cheaper than 6700XT.

    • @mystic5583
      @mystic5583 Год назад +7

      @@globalzero Much less powerful

    • @syncmonism
      @syncmonism Год назад

      @@angrysocialjusticewarrior I have no idea if you're joking or not. If you're not joking, you are seriously out of touch with reality.
      Assuming that you're being serious, the fact that you recommend the 4070 as the cheapest GPU you can recommend to "those who are poor or on a budget" shows how out of touch you are with what kind of budget most people actually have, but also demonstrates clearly that you think you know better than everything Daniel just said in this video, which is arrogant, and also just very incorrect. You're probably also trying to impress us with how wealthy you are, that you would only recommend the 4090 to everyone who isn't "poor or on a budget", but are unaware that it just makes you sound ignorant, rude, AND arrogant, all at the same time.
      The RTX 4090 is totally over-powered, and not a good value, even for a lot of people who have a well above average income. Realistically, you're not going to be getting a significantly better subjective gaming experience from the RTX 4090 vs. a 6700 XT, the 6700 XT will just require more careful tuning of the settings in some games, and you'll usually want to keep ray tracing turned off. Even a 6700 XT is totally over-powered for those who just want to play older and/or less demanding games, and even most very demanding games can run quite well with properly tuned settings on something like a 6600, which Daniel has clearly demonstrated in many, many different videos.
      Personally, I have used ray tracing, because I did own an RTX 3080 for a while, and I was really unimpressed with the difference between ray tracing being on vs. off in both Cyberpunk and Control, especially in Cyberpunk, because I had to turn DLSS on to get decent performance with ray tracing, and even then, the frame rate was still significantly worse as well. I really didn't feel strongly one way or the other about whether it was better to play with ray tracing and DLSS enabled or disabled with either game, and sometimes it could take me a long time to even figure out whether I had left ray tracing turned on or off when loading a game after not playing for a day or two, with the difference in frame-rate often being the most obvious indicator of whether ray tracing was currently turned on or off. I do think that ray tracing is a promising technology, and CAN look good, but the most impressive lighting effects in both games almost always looked just as good regardless of whether ray tracing was turned on or off, in large part because, even with ray tracing turned on, much of the lighting effects used were still not being done with ray-tracing.
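
Rough value math for the comparison above. The ~$500 price for the 4060 Ti 16GB and the 6700 XT's relative performance are assumptions; the "roughly 30% faster" figure for the 6800 XT is the one quoted in the comment.

```python
# Relative perf-per-dollar check using the figures discussed above.
# Prices and relative performance here are assumptions for illustration.

def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

r4060ti16 = perf_per_dollar(1.00, 500)   # baseline: 4060 Ti 16GB (assumed ~$500)
rx6800xt  = perf_per_dollar(1.30, 500)   # ~30% faster at about the same price (per the comment)
rx6700xt  = perf_per_dollar(0.95, 330)   # assumed roughly similar performance at $330

print(f"6800 XT vs 4060 Ti 16GB: {rx6800xt / r4060ti16:.2f}x the perf per dollar")
print(f"6700 XT vs 4060 Ti 16GB: {rx6700xt / r4060ti16:.2f}x the perf per dollar")
```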

  • @Scytherman
    @Scytherman Год назад +47

    Very informative video. I think what was especially interesting was seeing a side-by-side comparison of different settings to really get an idea of how much impact they have on VRAM in the first place. It was impressive to see how big a difference in VRAM usage even just DLSS Quality made at 1440p for Cyberpunk (see the quick math at the end of this thread).

    • @KamleshMallick
      @KamleshMallick 9 месяцев назад +1

      So always play with DLSS right?
      Not sure why he does not test with DLSS all the time.

    • @CommonSenserules1981
      @CommonSenserules1981 6 месяцев назад

      Rubbish. I have a 4070 Ti and I can play everything at 4K max settings way over 60 FPS, and I mean very demanding games like Ark Ascended etc... This info just isn't true.
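
Rough math on why DLSS Quality lowers VRAM use: the internal render resolution drops. The 2/3-per-axis scale factor is DLSS's standard Quality-mode ratio.

```python
# DLSS Quality at 1440p: the game renders internally at a lower resolution,
# so many render targets shrink even though the output stays 2560x1440.

output = (2560, 1440)
scale = 2 / 3                        # DLSS Quality mode scale factor per axis

internal = (round(output[0] * scale), round(output[1] * scale))
pixel_ratio = (internal[0] * internal[1]) / (output[0] * output[1])

print(internal)                                        # (1707, 960)
print(f"{pixel_ratio:.0%} of the native pixel count")  # ~44%
```

Not everything in VRAM scales with render resolution (textures largely don't), so the savings are noticeable but smaller than the ~56% cut in rendered pixels.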

  • @WhatWillYouFind
    @WhatWillYouFind Год назад +6

    8GB of RAM was high end for PCs... 5 years ago. It really is insulting they even released this damn thing at all. By the way, DANIEL, great job with this video presentation. A lot of tech and gaming reviewers include benchmark statistics but don't translate or describe what is actually on screen for people who are less savvy with this stuff. It is great that you talked about the SPILLOVER during the Ratchet segment and properly communicated WHY the 8 vs 16GB difference can matter. Great work; you deserve a channel 10 or 20x bigger than it currently is. I'm sure you will have just that in a short time with the content you're pushing out.

    • @87crimson
      @87crimson Год назад +1

      The sad thing is that it actually wasn't. The RX 480/580/590 were mainstream cards that had 8GB. Even the 570. We should have 16GB as mainstream now.

  • @HankBaxter
    @HankBaxter Год назад +16

    I love how Daniel moves himself around like a curser to point at things. Just cracks me up. 😆

    • @GrainGrown
      @GrainGrown 4 месяца назад +1

      Curser? That's not the word you're looking for, kiddo.

    • @HankBaxter
      @HankBaxter 4 месяца назад

      @@GrainGrown Oh, *^%$#@! 😆

  • @BogdanM116
    @BogdanM116 Год назад +19

    Just sold my 3060 Ti an hour ago and ordered a 6750 XT. The extra 4GB of VRAM at 1440p is game changing; also, Starfield and a new GPU with a warranty are worth the ~100 euros that I added on top of the 3060 Ti's selling price.

    • @caribbaviator7058
      @caribbaviator7058 Год назад

      Yep, 8GB will definitely limit you in some games at 1440p. I bought a 3060 Ti last year and returned it. While the FPS was good, VRAM was maxed out. I replaced it with an RX 6800 🙂
      I had money for the 3070 and 3070 Ti and even the 10GB 3080. None of them have enough VRAM!

    • @Cal316326
      @Cal316326 Год назад

      Congrats! That's a sweet deal with Starfield. Too bad the 6750/6700 XT aren't really worth it in my region, since they're too expensive and also rare. I just built a PC with a TUF 3060 Ti, as I got a sweet deal at under 200 new. I can still usually play at 1440p, and 4K in some games, but I have to use DLSS. For the money I paid I can deal with the 8GB of VRAM. At least the 3060 Ti has a 256-bit bus and GDDR6X RAM, which makes it perform on par with or even better than the 4060 Ti at 1440p and above.

    • @Egg.Of.Glory999
      @Egg.Of.Glory999 Год назад

      ​@@caribbaviator7058 If buying an AMD GPU makes me a neanderthal, guess I'm now a Neanderthal, because I'm gonna buy an RX 6950 XT for my custom PC build! 🤣
      Userbenchmark won't like that very much, but who cares about them, anyways? 🤣

    • @BogdanM116
      @BogdanM116 Год назад

      @@Cal316326 3060Ti is still capable at 1440p. Sucks that you couldn't go 6700-6750XT but i've used the 3060Ti at 1440p and it still hangs in there really well, especially with DLSS. Enjoy your build bro!

    • @spitfire1789
      @spitfire1789 Год назад +1

      @@Egg.Of.Glory999 I sold my 3060ti last year for $300 and upgraded to a 6800xt for $529. Love this card but i woulda been fine with the 3060ti as well, i usually optimize my graphics and lock the frames at 100-120 anyways. Coulda gotten a 4070 now too. After watching this video, I don't think there is any reason to worry about a 12gb card. It's gonna be plenty for another generation or so.

  • @samserious1337
    @samserious1337 Год назад +53

    Please note that the 2-3% performance differences mostly come from the clock speed differences between the GPUs, i.e. 2820MHz vs 2745MHz = 2.7% (see the quick math at the end of this thread)

    • @BlackJesus8463
      @BlackJesus8463 Год назад +6

      You really felt the need to explain that? Come on man!

    • @tonep3168
      @tonep3168 Год назад +6

      You deserve the stuttering mess that will be here by the end of the year if you can watch this video and still not comprehend what is being shown here.

    • @Blissy1175
      @Blissy1175 Год назад +3

      The GPU clock speed is being throttled on the 8gb card because the memory bandwidth is limiting the amount of data the 8gb card can load/unload from vram. So, sure, the 2-3% clock speed can be chalked up to "clock speed differences" but the clock speed differences are a symptom of a larger problem.

    • @larsjrgensen5975
      @larsjrgensen5975 Год назад

      GPU clocks and performance do not scale 100%, especially not with the memory speed limit.
      A AMD R9 Fury from 2015 has a much faster memory speed then a 4060.

    • @GrainGrown
      @GrainGrown 4 месяца назад

      ​@@larsjrgensen5975 *than
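
The clock gap quoted at the top of this thread, as a quick percentage check.

```python
# Clock-speed difference between the two cards, expressed as a percentage.
clock_16gb = 2820  # MHz
clock_8gb  = 2745  # MHz

delta = (clock_16gb - clock_8gb) / clock_8gb
print(f"{delta:.1%}")   # ~2.7%
```

As the replies note, clocks and FPS don't scale one-to-one, so this only roughly accounts for the small gaps seen outside VRAM-limited scenes.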

  • @jessietomich8043
    @jessietomich8043 Год назад +8

    The value proposition of an 8GB card is the issue. It may function perfectly well in most cases now; however, if you use a card for 3 to 5 years like I do, you will end up sacrificing a lot of visual quality to play games after a year. I think that texture quality will suffer big time eventually with these cards. So I don't know if this is a form of planned obsolescence or not.

    • @jamicassidy7955
      @jamicassidy7955 11 месяцев назад +1

      One other consideration is that 8 GB of GDDR6X VRAM is decently better than 8 GB of GDDR6 VRAM. A lot of reviewers don't seem to realise this. Sure, at the end of the day, all most people care about is frames per second, and many 8 GB cards with GDDR6 or GDDR6X perform similarly overall, since VRAM is not the only difference or the only thing that matters. Just a side note that MOST review sites and videos do NOT talk about GDDR6 vs GDDR6X, which heavily implies there is no difference, which is very much false. A hugely overlooked difference in cards and price.
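
For context on GDDR6 vs GDDR6X: the difference is the per-pin data rate, not the capacity. Quick numbers for two cards discussed in these comments, using their published bus widths and data rates.

```python
# GDDR6 vs GDDR6X changes the data rate per pin, not how many gigabytes you get.

def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 21))  # RTX 4070, 12GB GDDR6X: 504 GB/s
print(bandwidth_gbs(256, 16))  # RX 6800 XT, 16GB GDDR6: 512 GB/s
```

A wider bus of slower GDDR6 can match a narrower bus of GDDR6X, and the capacity question (12GB vs 16GB) is separate from the bandwidth question.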

  • @MADHIKER777
    @MADHIKER777 Год назад +7

    What a clear presentation demonstrating the real world difference between 8 and 16 Gb of VRAM.
    I am building a new system now and went with RTX 4070 with 12 Gb of VRAM. This video makes me happy with that choice.
    Keep up the fabulous work that you do!

    • @deagle7602
      @deagle7602 Год назад

      4070 is the only decent gpu. Still $50-$100 too expensive tho

    • @jochenkraus7016
      @jochenkraus7016 Год назад

      Due to the MSRP difference for 8GB more, I'd either think about a 4060 non-Ti (like "If 8GB it should at least be cheap") or if I'd be willing to spend 4060Ti 16GB money then add some more for the 4070 and be annoyed that I fell for Nvidia's 200$ upselling ;-)

    • @brohvakiindova4452
      @brohvakiindova4452 Год назад

      @@jochenkraus7016 This is painfully true. The situation remains really bad. I was really hoping that prices would finally come down, but the latest news was about production stops and the AI craze, so maybe it's going to become as ridiculous as during the mining situation.

    • @hadtoregister408
      @hadtoregister408 7 месяцев назад

      I also bought an "RTX 4070" but with 16GB of VRAM - it's called the RX 7800 XT. No more VRAM problems for me ;)

  • @Antody
    @Antody Год назад +78

    For games that I play, mostly e-sports, even 8GB is an overkill. But it's nice to have some buffer in case something heavier releases down the line.

    • @danielowentech
      @danielowentech  Год назад +45

      Yeah, this was focusing on AAA single player games. eSports (especially at competitive settings) don't usually need a lot of VRAM, so everyone should take their own needs into consideration.

    • @randxalthor
      @randxalthor Год назад +7

      Though similar stuff suffers even at 1440p. Halo Infinite starts failing to load textures over time as the VRAM buffer hits 8GB.

    • @Vfl666
      @Vfl666 Год назад +2

      Halo infinite looks average but the performance is shite.

    • @ThunderingRoar
      @ThunderingRoar Год назад +9

      The esport free to play games will always target lowest common denominator of PC hardware. Thats their entire business model, to cast the widest net.

    • @lobiankk77
      @lobiankk77 Год назад

      @@Vfl666 Halo Infinite looks like a 2018 game tbh.

  • @RogerWilco486
    @RogerWilco486 Год назад +28

    This makes us Radeon VII users feel vindicated. We've been rocking 16Gigs of HBM for 5 years...

    • @mariusvanvuuren4983
      @mariusvanvuuren4983 Год назад +3

      Yup. I said some years ago that the VII was just going to chug along, and it does. Great card.

    • @THU31
      @THU31 Год назад +4

      I mean, if you're happy having a card with lower performance than a 7600/4060 while drawing 300 W of power, then who am I to dispute that?

    • @rsltgc8706
      @rsltgc8706 10 месяцев назад

      At lower price

    • @coops1992
      @coops1992 5 месяцев назад

      Radeon sucks ass, Nvidia has long passed it in both price, performance and power usage.

  • @aspyr5681
    @aspyr5681 Год назад +6

    It's not that bad. I had a ton of fun with 6GB; VRAM was never maxed out regardless of whether my settings were Ultra or not, and those "few games" that they didn't optimize are not worth the upgrade. I just bought a 3060 Ti and it will serve me well for at least 3 years. I bought it specifically because I knew the 4060 isn't worth it and the 3070 is too expensive, and I bought it knowing that I'll resell it in a few years. And guess what? In a few years it will still be a pretty great card that will absolutely demolish the competitive gaming spectrum and also nail all the games that come out now, in 2024 and maybe even 2025; I am confident in that. Why? Because my GTX 1060 kicked ass until late 2022, it was amazing, and the 3060 Ti is how much faster than the 1060? 130%? Yeah, I'm good. My advice to people who don't know much about tech is: don't buy into the VRAM panic! Do what I did and buy something that's adequate for you and your budget, with the intent to upgrade in the near future. Feel free to ask me for opinions; I'll help anyone out

    • @ThimbaDM
      @ThimbaDM 8 месяцев назад

      I didn’t even know 8gb is too little vram. I have my 2060 6gb vram and I honestly don’t see any reason to upgrade. I play all my games pretty much maxed out on 1080p and games run over 60fps.

    • @dangsalty
      @dangsalty 8 месяцев назад

      Yeah, I'm still using my GTX 1060 3GB, which is not ideal at all, but I'm still getting by. 8GB of VRAM is definitely enough right now

    • @ok1_183
      @ok1_183 7 месяцев назад

      Still rocking my 3070 Ti; I don't really need to upgrade. I'm not a "full time" triple-A gamer anyway. I mostly play online PvP shooters, and I play those titles at low-medium settings, the so-called "optimized settings", to get the best balance of visibility, framerate, decent looks and gameplay. From time to time I will play some AAA titles, but the gameplay has to be really, really interesting for me to actually play them. And I honestly don't see any major issue with my 3070 Ti. The framerate is good enough for me to casually play those AAA games, and I'm at 2K resolution btw. I bought my card when GPU prices were way TOO HIGH; I literally paid $999 for my Strix 3070 Ti at my local Micro Center. That thousand bucks could easily get me a 4070 Ti today, which would double my framerate no doubt. But it is what it is, man. I'm happy with my 3070 Ti and will probably upgrade when the 5070 Ti comes around. Stay strong, 8GB VRAM gamers!!!!

    • @TheBoltMaster456
      @TheBoltMaster456 5 месяцев назад

      I have a laptop with a 3060, and the laptop version of the 3060 comes with 6GB instead of 12GB (stupid, I know), but it still does just fine for me on pretty much every game except Elden Ring, which has never been that well optimized.

  • @jpesicka999
    @jpesicka999 7 месяцев назад +1

    I still use a 3070 FE, I did switch back from 1440p 144hz to 1080p 240hz so there’s no issue with the vram.

  • @pf100andahalf
    @pf100andahalf Год назад +57

    I'm glad you mentioned something that no one seems to be talking about: that frame generation uses VRAM. It can use up to 2GB. So somebody who bought a shiny new RTX 4070 Ti can experience slowdowns and stutters by turning it on (see the rough budget math at the end of this thread).
    It's ironic that Nvidia gives you ray tracing and frame generation, then doesn't give you enough VRAM to use them at higher settings unless you bought a 4080 or 4090, or a 4060 Ti 16GB. It's ridiculous if you ask me.

    • @87crimson
      @87crimson Год назад +3

      Wot... I wasn't aware of that! Huge L for NVIDIA GPUs if true. I'll keep playing my backlog until GPUs with 24/32 GB of VRAM are mainstream. We had 8GB mainstream GPUs in 2016...

    • @PowellCat745
      @PowellCat745 Год назад +2

      @@87crimson That's why I never turn on frame gen or even RT unless it minimally impacts performance. But I have a 4090; I can't imagine what experience users of the 4070 Ti and lower will go through.

    • @MasQerRade
      @MasQerRade Год назад +1

      @@PowellCat745 Cyberpunk 2077 at 1400p with RT Psycho and frame gen enabled is running fine with my RTX 4070 Ti though.

    • @PowellCat745
      @PowellCat745 Год назад +2

      @@MasQerRade 1440p* imagine needing frame gen for 1440p lol
      I can’t tolerate the added latency. Sorry.

    • @MasQerRade
      @MasQerRade Год назад +2

      @@PowellCat745 Well, it's ray tracing after all, so it can't be helped for now. But for a story type of game, playing at 100+ FPS with RT is quite excellent to me.
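
A rough headroom check for the scenario in this thread. The ~2GB frame-generation overhead is the estimate quoted above, and the per-game usage number is made up for illustration.

```python
# Rough VRAM budget check: does the game plus frame generation still fit on the card?
vram_gb            = 12.0   # e.g. an RTX 4070 Ti
game_usage_gb      = 10.5   # hypothetical: maxed settings with RT at 1440p
frame_gen_extra_gb = 2.0    # upper-end estimate quoted in the comment above

needed = game_usage_gb + frame_gen_extra_gb
print(f"needed {needed:.1f} GB on a {vram_gb:.0f} GB card -> "
      f"{'fits' if needed <= vram_gb else 'spills into system RAM (stutter)'}")
```

When the total exceeds the card's VRAM, the overflow lands in system RAM over PCIe, which is where the stutter comes from.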

  • @mathlfr
    @mathlfr Год назад +27

    If we look at the Steam survey's most common GPUs, we're gonna have a problem real quick. Devs NEED to optimize better at native resolutions and stop relying on upscaling so much, even for 1080p native gaming. Streamers and YouTubers playing games at ultra 1440p are not the majority of people, far from it, and these games won't sell well if they can't run properly on low-end GPUs from the last 2 or 3 years.

    • @Chrissy717
      @Chrissy717 Год назад +8

      We are literally at a point where PC gaming is holding back game development.
      And people said it's the consoles causing bad looking games. This is just way too funny

    • @valrond
      @valrond Год назад

      @@Chrissy717 Indeed. It's been over 2.5 years since the new consoles arrived. They are using all of their RAM mostly as VRAM. PCs stuck with 8GB are a real drag. That's what GPUs had 8 years ago, like the previous consoles.

    • @sammiller6631
      @sammiller6631 Год назад

      Nvidia has the deep profits to fix this easier than any game dev with a much smaller budget and team. Game developers with one-tenth the size or one-hundredth the budget don't have the time to optimize for all resolutions.

    • @mathlfr
      @mathlfr Год назад +1

      @@sammiller6631 tell me how is it nvidia's problem to fix the performance of a small studio's game? If they don't have time to make their game work they should think about doing something else.

    • @mathlfr
      @mathlfr Год назад +1

      @@valrond PC players have all the tools and the hardware to make this work, but it just costs so much.

  • @ExacoMvm
    @ExacoMvm 11 месяцев назад +4

    I still use a 6GB GPU (hopefully not for much longer) and the only games that hit VRAM hard were Ratchet and Clank and The Last of Us Part I.
    In Ratchet, going from High textures to Medium alone gained me like +30 FPS. In TLoU I still managed to go beyond the limits and set essential textures to High without performance loss.

    • @kristijanceple6026
      @kristijanceple6026 9 месяцев назад

      Same, I still have the 5600 XT. Looking to go to 7800 XT tho once some money comes in, the 6 GB VRAM is really bottlenecking badly in some games

    • @thenonexistinghero
      @thenonexistinghero Месяц назад +1

      Well R&C is kind of a mess anyhow. I mostly played it in 4K on my 3080 Ti just fine at 60 fps, but it would randomly drop performance and never recover properly until a full PC restart. The game's VRAM management is no doubt a mess. Seems to be the case for most Sony game ports to PC actually. Surprised it wasn't the case for Returnal.

  • @Withing_
    @Withing_ 6 месяцев назад +1

    This solved my frame gen confusion, it's stuttering when it's pushing the VRAM, makes sense! Thank you!

  • @chrismullins6396
    @chrismullins6396 Год назад +58

    As a 3060 Ti user, this video helped relieve some VRAM anxiety for me. With a combo of DLSS and high instead of ultra settings, I should be good to go until the next-gen cards, even in new games. Appreciate your work!

    • @Ojas008
      @Ojas008 Год назад +4

      Thanks for the info, my friend. I'm considering buying a 3060 Ti or 4060 Ti, so this comment really helps.

    • @soulhalo7343
      @soulhalo7343 Год назад +14

      @Ojas008 I'd go with something with more than 10GB, but from Nvidia I don't recommend anything under the 4080; it's a bad deal for the price, and even the 4080 is overpriced. I'd say a 6800 XT 16GB will be good for years to come, or a 6700 XT

    • @svinjadebela6893
      @svinjadebela6893 Год назад +6

      Basically it's enough to just lower the textures from ultra to high (which is hard to even tell apart) and you are good to go. If needed, DLSS will also help.
      The 4060 Ti 16GB outperforms the 8GB version mostly in scenarios where the 16GB version is not strong enough to provide 60+ FPS anyway.
      The 3060 Ti is probably the best cost/performance card Nvidia has provided in the last 3 generations when it comes to mid-tier gaming.

    • @soulhalo7343
      @soulhalo7343 Год назад +4

      The 3060 Ti even beats the 4060 Ti in some cases, heck, the 3060 beats the 4060 too; it's game by game, but from what I've seen it still outperforms unless the game is using more than 8GB

    • @rnwilliams44
      @rnwilliams44 Год назад +2

      I feel the same way. I bought a 3060 Ti on Black Friday and I have that same anxiety. (It hasn't been a full year and now I'm hearing it could not be enough?) Then again, the only new game I'm waiting for is Tekken 8.

  • @ThunderingRoar
    @ThunderingRoar Год назад +32

    I have a good feeling that Starfield maxed out is gonna shatter 8GB GPUs, especially once the modding community gets their hands on it

    • @Mopantsu
      @Mopantsu Год назад +4

      If Starfield turns out to be THE game to get and gets good ratings it could really push Nvidia into a corner. Especially being bundled with AMD cards. But somehow I think Nvidia just don't care anymore. AMD needs to make the 7700 and 7800 cards 400 and 500 respectively but I am pretty certain they will push for more.

    • @Cookedfrfrfr
      @Cookedfrfrfr Год назад

      This is why I'm going to get an Arc Battlemage once it comes out since by then all the bugs in Starfield should've been squashed and a lot of mods should've come out.

    • @matthewjohnson1891
      @matthewjohnson1891 Год назад

      Not to mention games using lumen, nanite and water works all together.

    • @JoeL-xk6bo
      @JoeL-xk6bo Год назад +11

      every Bethesda game blows past 8Gb with enough mods.

    • @lth_lch
      @lth_lch Год назад +2

      I've seen Skyrim saves that eat up at least 12 GB lol

  • @InTheBoxDev
    @InTheBoxDev 7 месяцев назад +3

    I'm still playing with 4GB and it's fine for almost all games at medium to high settings

  • @halistinejenkins5289
    @halistinejenkins5289 Год назад +2

    Good work, dude. You're doing what Hardware Unboxed did when they were a young and hungry channel. I recommend doing CPU scaling next; channels shy away from it, probably because of the work involved.

  • @mayssm
    @mayssm Год назад +28

    This is the whole reason I went AMD this generation. The 4070 Ti has only 12GB of VRAM, while the 7900 XT has way more. I had to turn textures down in games on my 3070, and didn't want to have to do that on an $800 card in upcoming games. My FPS was fine in games, but you could see when it hit the limit because of freezes and stutters. Now, with 20GB, it's just buttery smooth.

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe Год назад

      For now my 4070ti is great for 1440p. Most I've ever seen is like 11gb running cyberpunk with path tracing and all ultra settings with frame generation. If those visuals can fit into 12gb then I'm not interested in any excuses from other devs.

    • @ericahearn9604
      @ericahearn9604 Год назад

      How does the 7900 XT compare to the 3070?

    • @mayssm
      @mayssm Год назад +2

      @@ericahearn9604 Big improvement. Sure, it uses a ton more power, but basically 50% faster frames in most games. I've got to give the 3070 props though. It's like half the size, weight and power draw and yet puts up great numbers. If not for the stuttering I was getting when the VRAM maxed out on high or ultra, I might have kept it.

    • @samgragas8467
      @samgragas8467 Год назад

      @@JudeTheYoutubePoopersubscribe It is OK, as proven in this video; both the 4060 Ti 8GB and the 4070 Ti are in a similar situation.
      I would not expect any problem apart from having to turn on DLSS or find optimal settings in games like GTA VI, Starfield, Alan Wake 2, games with heavy ray tracing, and photorealistic games using UE5.

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe Год назад

      @@samgragas8467 it's okay for me at 1440p. I doubt I'll ever upgrade my monitor I've had my 1440p 144hz for 4 years now.

  • @CrazyLegsFE
    @CrazyLegsFE Год назад +15

    I can't help but wonder if some pixel peeping is required to tell whether some games are dealing with VRAM overflow by degrading textures or simply not loading them. FPS might not be the only indicator in this situation. You mentioned it in the video and it got me thinking; I wonder if there is more to this than meets the eye, so to speak. This reminds me of when Nvidia cheated with color compression long ago. As always, great investigation, good detail, great video.

  • @_Devil
    @_Devil 9 месяцев назад +3

    Having a 6gb 1660S, 8gb would be a very good upgrade for me. I would prefer 10 or more but I could definitely settle for 8 if someone was offering.

  • @taz_artista
    @taz_artista Год назад +5

    Great video. It would be good to know the game versions that were benchmarked. I genuinely believe game optimization is at an all-time low for newer games as we lean heavily on upscaling.

    • @nextgen3ric
      @nextgen3ric Год назад +1

      I'd have hoped we would take longer to reach this stage, but remnant really blew me away with its reliance on upscaling.
      There's nothing wrong with a VERY useful feature I'm still excited about. There IS something wrong if it is misused as the primary performance optimization while developing, at least at this stage.

  • @FOGoticus
    @FOGoticus Год назад +4

    So the TL;DR of this whole video is pretty simple. Is 8GB enough? No for a new card, yes for an old one. Getting more VRAM will offer you roughly a 10% increase with the same GPU, and more games will require over 8GB in the future, even at 1080p maxed out, although lowering settings can mitigate that to a degree.
    Got 8GB? You're good. Got a budget for a new GPU? Better look at GPUs with more than 8GB if you want more future-proofing.

  • @seanmaclean1341
    @seanmaclean1341 Год назад +4

    Two of the *heaviest* VRAM gaming use cases I've run my 4060 Ti 16GB through have been maxed-out Street Fighter 6 World Tour mode, which has some spots that ate over 10GB of VRAM at 1080p, and Cyberpunk with Overdrive RT and DLSS frame gen on, which got around the same VRAM usage at just 1080p (see the logging sketch at the end of this thread).
    I definitely think 8GB of VRAM is on its way out even at low resolutions, and I'm worried that 12 may be cutting it close in only a couple of years.

    • @JathraDH
      @JathraDH Год назад

      Just because a card is reporting that much VRAM use doesn't mean it needs that much VRAM to run. It's a concept no one seems to understand because they don't really understand how computers work at all. This whole VRAM thing is largely copypasta word vomit propagated by people who have no idea what they are talking about.
      The fact of the matter is that the only way to tell how much VRAM a game actually needs to run is to have the GPU's VRAM 100% capped and also be at a point where you are seeing tangible performance hits. Games can run with 0% slowdown using capped VRAM, and do all the time. The more VRAM your GPU has, the more VRAM you will tend to see used.

    • @danielgiselde5509
      @danielgiselde5509 Год назад +2

      Always orient around the newest console generation and what specs it is using
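
One way to log numbers like the ones quoted above while playing: a sketch assuming an Nvidia card and the pynvml bindings (the nvidia-ml-py package). Note that this reports allocated VRAM, which, as a reply above points out, is not necessarily what the game strictly needs.

```python
# Sketch: print VRAM allocation on the first GPU once per second while a game runs.
# Requires the nvidia-ml-py package (imported as pynvml) and an Nvidia GPU.

import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values are in bytes
        print(f"{mem.used / 1024**3:5.2f} GB used of {mem.total / 1024**3:.2f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Overlay tools like MSI Afterburner expose the same counter in-game; some titles also report their own per-process estimate, which can differ from the whole-GPU figure.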

  • @Nerex7
    @Nerex7 Год назад +10

    Thank you for this phenomenal and informative video! I think I will be using my RTX 3070 until it becomes unbearable and then look towards a 16GB card (at least). The RX 6800 XT is becoming cheaper these days and so are the 6950 XTs (but their power consumption is too high for me). I actually have somewhat hope for the 7800 XT.

    • @jamicassidy7955
      @jamicassidy7955 11 месяцев назад

      To defend the 4070 just a little, the 12 GB is GDDR6X while 6800 XT and 6950 XT and even 7800 XT are 16 GB of GDDR6.
      12 GB of GDDR6X is basically as good as 16 GB of GDDR6. Some say it is better and only cards like 7900 XT with 20 GB of GDDR6 are better.
      VRAM is of course not the only thing that matters and 7900 XT (and even 6950 XT and 7800 XT) is often better than the 4070 overall.

  • @Seppe1106
    @Seppe1106 Год назад +2

    Well, the 16GB version's chip is apparently better, because it's consistently boosting higher than the 8GB chip. That also makes it perform better and makes the gap sometimes seem pretty big.

  • @colossusblack3653
    @colossusblack3653 Год назад +18

    I bought a RTX 3070 in 2020 when it was impossible to find a GPU. That was the last 8GB card I bought. Since then I got a 3080 12GB, which became my wife's GPU. I got a 6950XT which is mine, and then my daughter got into PC gaming and I got her a RX 6700 for her 1440p monitor. The 3070 does fine for most games at 1440p, but there are definitely issues with frame times and textures in some of the latest titles. I used to play at 4K with that card, but that's become impossible with most AAA titles in 2023 at even medium settings. There's a lot of texture pop in and just stuttering in general.

    • @angrysocialjusticewarrior
      @angrysocialjusticewarrior Год назад +2

      The 3070 was never a 4K card, even when new. It was suitable for 4K gaming only because it was a really good card. But it still excels at the resolution it was made for (1440p). Even with only 8GB of VRAM it still blows away the RX 6700 XT and is around 10 FPS slower than an RX 6800. It is the 3080-series cards that were meant for 4K gaming.

    • @bpcgos
      @bpcgos Год назад +2

      The shame is that the GPU is actually still strong enough, if only it had more than 8GB of VRAM; the potential of the GPU is truly wasted because of that (I have a 3060 Ti and can't imagine what it would be like if it had 12 or 16GB).

    • @siyzerix
      @siyzerix Год назад

      Yeah, my mobile 3070 Ti (a desktop 3070 in specs) was mainly bought for modded Skyrim SE and a few other titles like Elden Ring and Hogwarts, which it handles nicely maxed out.

    • @Egg.Of.Glory999
      @Egg.Of.Glory999 Год назад

      @@angrysocialjusticewarrior But, 8 GB of VRAM HAS been becoming... less reliable, for quite some time now. At some point, we're going to have to put 8 GB cards to rest, and move on to GPUs with a far more reasonable amount of VRAM, like 16 GB.

    • @colossusblack3653
      @colossusblack3653 Год назад

      @@angrysocialjusticewarrior I agree with just about everything except the idea that it blows away a 6700 XT; they trade blows depending on the title. And no, the 3080 is no longer a 4K card, at least not that weak 10GB version, and even now my 12GB card is getting left behind because of the lack of VRAM.

  • @Seboy1
    @Seboy1 Год назад +28

    The problem is that Nvidia went from releasing a card where everyone was like "Who the f needs 8GB!" to "Where the f is my VRAM for $700, Nvidia?" Also, what bothers me is that nobody is talking about VR and how detrimental low VRAM is there if you want a sharp image. 4K gaming is nothing compared to VR, and VR still looks blurry compared to a flat screen, which means VR resolution demands will only grow in the future.

  • @vipast6262
    @vipast6262 Год назад +3

    Very good breakdown of the stats. Simple and Concise. Now that we have a baseline, I wonder if there will be an easier way in the future to tell how much VRAM -> System RAM spillover we have in games.

  • @shalifi7774
    @shalifi7774 Год назад +2

    A better question is: how well do graphics cards with 2GB and 4GB work today? That will be a good reflection of where we will be in 3-5 years, which is where some people see themselves with the same GPU. I gave an old 2GB AMD RX 550 to someone who said they don't need anything major. For a server it's OK. Not for gaming at this point. The same will go for the 8GB cards.

    • @Nightcore_Paradise
      @Nightcore_Paradise Год назад +1

      A lot of games won't even launch, let alone play with 2 gb of vram.

    • @ThePipojp
      @ThePipojp 10 месяцев назад

      4gb was fine for almost a whole decade

  • @steve42069master
    @steve42069master Год назад +3

    So glad I returned my 4070 Ventus 3x OC for a Merc 310 7900 XT.

  • @patb3356
    @patb3356 Год назад +5

    Awesome testing and thank you for the summary. Your intense analysis is much appreciated!

  • @THU31
    @THU31 Год назад +1

    Those Ratchet & Clank results are really weird. Maybe it has something to do with DirectStorage? Could it be that GPU decompression requires more VRAM, but it doesn't show up as actual usage?
    It seems like a topic worth investigating. This game definitely shows the most unique behavior when it comes to VRAM.

  • @longplaylegends
    @longplaylegends Год назад +6

    One thing to sort of note is... Textures have (usually) gotten better and better, so using high textures compared to "yesterday's" ultra, will often give you similar quality in newer games. Although in some games, going lower than high looks like garbage, so it is game dependent for sure. Sometimes you'll go down to medium textures and things just don't look very good, especially in open world games where a lot more is tending to be loaded in from what I can tell.

    • @DamianSzajnowski
      @DamianSzajnowski Год назад +4

      Anything above high is a waste of FPS/GPU/VRAM money in almost every instance for a regular gamer. The visual difference is indistinguishable, and if you get more playable frames instead, that will look better. The only rare exceptions are maybe if you are a big-time streamer, have too much cash, or something. The only visible improvement of very-high or ultra-high textures over high in recent AAA games that I've personally seen was in RDR2. This is even more true for ultra textures/settings: the 80/20 rule is in play, or more like 95% additional hardware cost for at most 5% improvement. I'd personally only turn RT on, if possible with playable drops, since that huge FPS hit is offset by massive visual lighting upgrades whenever it's well implemented.

  • @KidGacy
    @KidGacy Год назад +10

    I remember people scratching their heads as to why the 3060 came out with 12GB. I'm currently using it for 1440p (with DLSS) and the extra VRAM has saved me a lot of headaches

    • @Onisak25
      @Onisak25 Год назад +10

      The 3060 is too weak to use that RAM before performance becomes a problem in most cases.

    • @Blue-920
      @Blue-920 Год назад +1

      @@Onisak25 I have a 3060 and play at 1440p. I just turn the settings down to high, use DLSS and max out all the texture settings, and I have zero performance issues in any game I throw at it. In games like The Last of Us Part 1, Forza Horizon 5, Resident Evil 4 Remake and Warzone 2 that 12GB of VRAM really comes in handy, and in other games like Cyberpunk 2077, The Witcher 3 and Far Cry 6 I have installed high-resolution texture mods and taken advantage of that extra 4GB of VRAM. Btw, higher-resolution textures are really noticeable in games, so 12GB of VRAM is definitely not overkill for the 3060, especially in the future, since next-gen games will have higher-resolution textures

  • @user-lk5kn2tr7k
    @user-lk5kn2tr7k Год назад +9

    Very detailed and interesting review. Thanks for your work Daniel.

  • @Bolt451
    @Bolt451 4 месяца назад +4

    Me still using 4 gb

  • @ozreshef5182
    @ozreshef5182 Год назад +1

    You mentioned it in the middle of the video, but not in the conclusion:
    the patches that "fixed" the problem of the 8GB card being slower just lowered the texture quality and make textures pop in and out, so the FPS is fine, but the visuals suck compared to the 16GB version.

  • @BriBCG
    @BriBCG Год назад +4

    Another potential problem, as Hogwarts Legacy showed, is that it's entirely possible to get rid of most of the performance problems a lack of VRAM causes by turning your visuals into complete trash. If all people are looking at is FPS charts (and that would be the norm when deciding what card is appropriate for your needs), it could be quite misleading.

  • @Cheesejaguar
    @Cheesejaguar Год назад +8

    Interesting to note that your 4060 Ti 16GB card is consistently running 50-100 MHz faster, and often running at a noticeably higher power draw as well. In Ratchet and Clank, your FE card is only drawing 87W while the 16GB card is drawing 132W. That's roughly 50% more power draw for a 33% boost in frame rates (a quick check of the arithmetic follows below). The cooler difference is certainly resting a heavy thumb on the scale here.

    • @nipa5961
      @nipa5961 Год назад +6

      The 16GB model can actually render frames as intended while the 8GB model has to wait for data and is forced to idle.
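
To put a quick number on the power and frame-rate figures quoted above: the 87 W and 132 W readings and the roughly 33% frame-rate gap are the values mentioned in this thread, not separately measured averages. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope perf-per-watt comparison using the figures quoted above.
# 87 W / 132 W are the overlay readings mentioned for the 8GB FE and 16GB cards;
# the +33% frame-rate figure is the commenter's estimate, not a measured average.
power_8gb, power_16gb = 87.0, 132.0   # watts
fps_gain = 1.33                       # 16GB card's relative frame rate (8GB card = 1.0)

extra_power = (power_16gb - power_8gb) / power_8gb
perf_per_watt_ratio = fps_gain / (power_16gb / power_8gb)

print(f"Extra power draw: {extra_power:.0%}")                              # ~52%
print(f"Relative perf per watt (16GB vs 8GB): {perf_per_watt_ratio:.2f}")  # ~0.88
```

If those numbers hold, the 16GB card spends disproportionately more power for its extra frames in that scene, though part of the gap is simply the 8GB card idling while it waits on PCIe transfers, as the reply above points out.
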

  • @foundcrab4742
    @foundcrab4742 Год назад +1

    8GB VRAM gaming is still going to live for a couple more years; most 8GB VRAM GPUs can run the games from 2023, 2022, 2021, etc. at high settings, 60fps, 1080p.

  • @Demaulicus
    @Demaulicus Год назад

    The answer is somewhere in between. 8GB VRAM is good at 1080p and higher settings while it is doable at lower settings on 1440p. 16GB VRAM is better for higher resolutions and settings. This is if you want a consistent higher end frame rate. For this to work you also need the GPU chip itself and bandwidth to be good enough to handle the extra VRAM.

  • @Schroinx
    @Schroinx Год назад +5

    Indeed. The real deal with more VRAM is that you can load better textures at the same performance, increasing the quality and longevity of the GPU. Ray tracing also uses VRAM, so that gets better too.
    The 4060 Ti should have been a 4060, born with more memory bandwidth and 12GB of VRAM (as the 3060 had), and priced at $300-350. But Jensen got greedy, and AMD has yet to take the opportunity created by Nvidia's poor Ada generation. The GPUs are good, but the memory bandwidth, VRAM, and price/performance suck.

    • @shadowrealms2676
      @shadowrealms2676 Год назад +4

      The 4070 should have been the 4060 and the 4080 should have been the 4070

    • @TV-xv1le
      @TV-xv1le Год назад +1

      Uh no. The 4080 is a 4080. The 4080 is around a 50% jump in performance from a 3080. It's a 90% or so performance jump from a 3070. Given inflation (which is real) I would have anticipated a launch price of $900 for the 4080 and $800 for the 7900xtx. But here we are. AMD trying to be Nvidia also.

    • @ThePipojp
      @ThePipojp 10 месяцев назад

      @@shadowrealms2676 Nah, the 4060 Ti is like 30-40% faster than a 3060. You seriously think that would've been a 4050 Ti or whatever?

  • @richardfarmer6570
    @richardfarmer6570 Год назад +3

    I admit I was in the group that thought 8gb was only a problem on a few poorly optimized games, seems there was more to it than I thought. Although it is the console ports that seem to suffer the most, I would expect Ratchet and Clank to improve with time.

  • @crow3327
    @crow3327 7 месяцев назад +3

    me using 128 MB VRAM GPU......

  • @JayzBeerz
    @JayzBeerz Год назад +1

    Millions and millions of people have 8GB cards and aren't getting rid of them over 1080p medium settings.

  • @FuelX
    @FuelX Год назад +31

    I bought an RTX 3070 Ti less than 2 years ago because that was my price limit. I've already replaced it because of its VRAM. That card was such an oversight from Nvidia. It was a great card for older titles, and I really liked it, but it has a terrible limitation. It's aging so badly.

    • @Gazer75
      @Gazer75 Год назад +14

      I guess if you really want to play new AAA titles at Ultra it won't work. I've yet to have any VRAM issues with my 3070 Ti in the games I play, but I tend to play games that are more CPU demanding: management games, automation, city-builder kinds of games.
      And I don't really buy new games at full price; they tend to be a buggy mess for the first year. I do buy some early access games, but they are often 25-50% cheaper at that stage than after full release.
      By doing this I save on both hardware and gaming costs. I can buy cheaper GPUs and still play games at Ultra because the GPU often came out after the game was released.

    • @Gazer75
      @Gazer75 Год назад

      @@garrusvakarian8709 If you refuse to lower textures then sure. If you can afford to replace the card, by all means.
      As I said before, I don't play new AAA titles needing more than 8GB, and if I did I'd reduce textures to high or medium if needed. By the time I play these games, if ever, I'll probably have a better card anyway.
      The only game I have that even comes close to maxing out the 8GB buffer is No Man's Sky; it's up around 7GB maxed out.
      Not sure what will happen to Satisfactory when they move to UE5, but it hasn't been a problem yet. That game is CPU limited anyway once you have plenty of machines going.
      Most games I play stay around 5.5-6.5GB.

    • @FuelX
      @FuelX Год назад

      @@garrusvakarian8709 I think your answers both show what's wrong with the 3070 Ti. It has a great GPU that suffers from bad design choices, or greed, from Nvidia. When you don't play those new VRAM-hungry titles it's a charm. But same as you, I had a similar experience in the last year: I had to return some games because they were unplayable.
      I didn't know whether to point at Nvidia or the developers for that, or maybe both. Now I believe Daniel Owen's review is proof that in 2023 an 8GB video card is no longer an oversight. It's bad design.

    • @noimageavailable2934
      @noimageavailable2934 Год назад +3

      @@FuelX To me it seems more like arrogance and greed on nvidia's part. Developers had been telling them for multiple years that 8gb won't be enough for 1440p and they just didn't care.

    • @Gazer75
      @Gazer75 Год назад

      @@FuelX Probably unplayable because you insisted on playing at maxed settings. Reduce textures and they work fine. The graphics settings are there for a reason.

  • @brandonkelleher2651
    @brandonkelleher2651 Год назад +6

    Would be interesting to see how system RAM speed would affect this test, such as DDR4 vs DDR5.

    • @historybugs
      @historybugs Год назад +3

      It won't matter at all, because there is a huge difference in bandwidth and latency between system RAM and GPU RAM. GPU RAM is much faster, which is what the GPU needs. Your question is like asking whether a faster SSD helps the Windows page file when the PC has a low amount of RAM. No, it won't. You simply need more VRAM. (Rough numbers are sketched after this thread.)

    • @SianaGearz
      @SianaGearz Год назад +3

      My guess is that it shouldn't, since the PCIe ingest bandwidth is much more limited anyway, but we really do need a test to know for certain, don't we?

    • @__-fi6xg
      @__-fi6xg Год назад

      @@historybugs But a PCIe 4.0 NVMe drive loads games faster than a PCIe 3.0 SSD. Now, if that's just one second on every loading screen, who knows...

    • @brandonkelleher2651
      @brandonkelleher2651 Год назад

      @@historybugs Some of the excess memory is being loaded into system ram when the gpu runs out of VRAM so I would assume that the speed of that memory would have some effect on how fast the game could run.
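
To put rough numbers on the bandwidth gap this thread is discussing: the figures below are theoretical peaks derived from published specs (assuming a 4060 Ti-class card, which uses a PCIe 4.0 x8 interface, and ordinary dual-channel system memory); real-world throughput is lower.

```python
# Theoretical peak bandwidth of the links data crosses when VRAM overflows.
# Values are approximate published/derived peaks, not measurements.
GB = 1e9

links = {
    "GDDR6 VRAM, 128-bit @ 18 Gbps (4060 Ti class)": 128 / 8 * 18,         # ~288 GB/s
    "Dual-channel DDR5-6000 system RAM":             2 * 6000e6 * 8 / GB,  # ~96 GB/s
    "Dual-channel DDR4-3200 system RAM":             2 * 3200e6 * 8 / GB,  # ~51 GB/s
    "PCIe 4.0 x8 link (per direction)":              8 * 1.97,             # ~16 GB/s
    "PCIe 3.0 x8 link (per direction)":              8 * 0.985,            # ~8 GB/s
}

for name, bandwidth in links.items():
    print(f"{name:<48} ~{bandwidth:.0f} GB/s")
```

Whatever spills out of VRAM has to come back across that ~16 GB/s PCIe link, so even the fastest desktop DDR5 cannot close the gap to the card's own ~288 GB/s memory. Faster system RAM helps a little at the margins, but the PCIe hop stays the choke point, which matches the replies above.
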

  • @shahrukhwolfmann6824
    @shahrukhwolfmann6824 Год назад

    2:47 Thank you for the integrity you bring to the YouTube tech scene. Blessings

  • @spladam3845
    @spladam3845 Год назад

    Daniel has been practicing moving himself around the screen while pointing and I think it's hilarious.
    Great stuff dude, good breakdown, thanks.

  • @exnozgaming5657
    @exnozgaming5657 Год назад +4

    still survivin with 4GB VRAM

    • @lionelstarkweather982
      @lionelstarkweather982 4 месяца назад

      I gave my brother my PC with a 2GB GTX 960; now I'm playing on a 1GB HD 5850 from 2009. There's plenty to do with just 1GB, but 2GB is amazing.

    • @talijunior
      @talijunior 3 месяца назад

      same my friend

  • @chiyolate
    @chiyolate Год назад +6

    As a power user, more VRAM means more power to me: I can run a game, record it, edit it in Adobe After Effects, and open numerous browser tabs and chat programs, all at once. Those all use up VRAM. These days 8GB is just not enough; your browser with multiple tabs alone can consume around 1GB, and After Effects maybe 3-5GB. So I hope more 16GB VRAM cards come out at better prices. If you're just a gamer, okay, then maybe 8GB is still sufficient for a couple of years.

    • @JathraDH
      @JathraDH Год назад +3

      Your VRAM is not really being used for any of that. Your system ram is being used for that. People have very weird misconceptions about what VRAM gets used for.
      The truth of the matter is that nearly 100% of your VRAM will be used for the active application you are running, because it is what needs access to the VRAM at any given time.
      Anything that is being used by a background application will have its VRAM swapped out to system RAM if the VRAM gets full, because its performance is not as important as the active application. It will then be swapped back in once that application regains primary focus.
      This is why, when you have something open in the background for a very long time and switch to it, there will be a second or two of lag when you bring it back into focus: its memory is being swapped from system RAM back into VRAM.
      People also have grave misconceptions over how much VRAM you need in gaming because they don't understand how memory caching works but that is another can of worms.

    • @chiyolate
      @chiyolate Год назад

      @@JathraDH Of course they can swap in and out of system RAM, but like you said yourself, it will lag. In games there will be frame drops, and in applications there will be lag. More VRAM means you will have a smooth experience across all your tasks.

    • @JathraDH
      @JathraDH Год назад

      @@chiyolate Yea but most of what you described is about system memory, not VRAM. System memory is what allows for such multitasking much more than VRAM does.
      Try doing all you described with even a 32GB VRAM GPU and only 8/16 GB of system ram and see how smooth your experience is.

  • @mondzi
    @mondzi Год назад +1

    Could you please separate the "per-process VRAM usage" and "allocated VRAM" readings (in your overlay) with their own labels, so it's not so confusing? Thanks for the comparison!
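
For anyone wondering why those two numbers differ: the device-wide "allocated" figure counts everything resident in VRAM (other apps, the desktop, the driver), while the per-process figure is just the game. A minimal sketch of reading both, assuming an NVIDIA card and the nvidia-ml-py package (imported as pynvml); per-process values can be unavailable under some Windows driver modes:

```python
# Minimal sketch: device-wide VRAM in use vs. per-process usage.
# Assumes an NVIDIA GPU and the nvidia-ml-py package (import name: pynvml).
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)

    # Device-wide counter: everything resident in VRAM (all apps + OS/driver).
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Device VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

    # Per-process counters: what each graphics process has committed.
    # Under Windows WDDM these figures may be reported as None.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = "n/a" if proc.usedGpuMemory is None else f"{proc.usedGpuMemory / 2**30:.1f} GiB"
        print(f"  PID {proc.pid}: {used}")
finally:
    pynvml.nvmlShutdown()
```

Overlay tools generally expose both counters too, which is why the two lines in the video's overlay can disagree by a gigabyte or more without either being wrong.
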

  • @FooFighter477
    @FooFighter477 Год назад +1

    If you already have something with 8GB, it's fine. Maybe decrease textures in some games if needed.
    If you're planning to buy something above the low end, at least 10-12GB should be a consideration. You don't want to buy something that doesn't have enough memory in a few quite realistic scenarios on day one.

  • @MrBeetsGaming
    @MrBeetsGaming Год назад +2

    I'm playing Ratchet and Clank right now on a 6600 XT, and as long as I leave texture quality on medium I get 60fps at 4K with FSR on Performance, and it looks great. The rest of the settings are maxed, except I believe I left the fur one on high instead of ultra and weather on medium. No ray tracing, of course. It does use the entire 8GB, though. (The render resolution behind FSR Performance at 4K is sketched after this thread.)

    • @vicente3k
      @vicente3k 4 месяца назад

      6600xt is a beast of a card.
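
For context on the comment above, "4K with FSR on Performance" means the GPU renders far fewer pixels than native 4K and then upscales. A quick calculation using FSR 2's published per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x):

```python
# Internal render resolution behind FSR 2 upscaling at a 3840x2160 output.
# Scale factors are AMD's published per-axis ratios for each quality mode.
OUTPUT_W, OUTPUT_H = 3840, 2160
MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

native_pixels = OUTPUT_W * OUTPUT_H
for mode, scale in MODES.items():
    w, h = round(OUTPUT_W / scale), round(OUTPUT_H / scale)
    print(f"{mode:<18} {w}x{h}  (~{w * h / native_pixels:.0%} of native 4K pixels)")
```

So the 6600 XT above is shading roughly a quarter of a native 4K frame, and much of the render-target memory scales with that ~1080p internal resolution, which is part of why medium textures keep it inside the 8GB budget.
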

  • @jddes
    @jddes Год назад +4

    I'm just constantly astounded by the absolute teacher rizz Daniel possesses every time he whizzes across the screen to point at numbers. Thanks for another great comparison!

  • @neomestar
    @neomestar Год назад +2

    Just bought a used 3070 without any delusions of ultra, aimed for 1440p high and it's all good. I'm fairly certain it'll hold up until publishers really start pushing for 8GB+ on high. But a new one with 8GB? No way.

    • @VS_Gamer84
      @VS_Gamer84 9 месяцев назад

      8GB is enough for 1080p native and 1440p with DLSS Quality. Just reduce textures from ultra to medium or high, as textures use a lot of VRAM in every game.

  • @slickrounder6045
    @slickrounder6045 Год назад +1

    Really nice inclusion of the graphs of all the games at the end! PLEASE also add the 1% low graphs in the future; those are just as important (or maybe even the most important).
    Also, Jedi Survivor allegedly drops to lower-quality textures when it hits the VRAM cap so it doesn't "lose" FPS. So it may seem like there isn't a big difference in FPS, but there is a graphics-quality difference that favors the cards not constrained to 8GB. That type of win can't be shown by an FPS difference, but it's still significant.

  • @Enigma1336
    @Enigma1336 Год назад +3

    Would be interesting to see how these games would fare with the minimum 16GB system RAM requirement. It seems that when the 8GB 4060 Ti runs out of VRAM, system memory usage suddenly balloons way above 16GB in several of these games. Running out of system memory because of VRAM spillover would be a double whammy. God help you if you are still chugging along with a mechanical HDD on top of that. Triple whammy.

    • @jochenkraus7016
      @jochenkraus7016 Год назад

      On a system slower than PCIe 4.0 it's actually a quadruple whammy :-o

  • @realpoxu
    @realpoxu Год назад +3

    Just dropping by to say you're doing an amazing job with your videos, very informative. Thanks!

  • @mishi4187
    @mishi4187 Год назад +1

    Me watching this video while my 1050ti 5g trembles in fear

  • @randobuy
    @randobuy Год назад +1

    So pretty much the performance will depend on what resolution/RAM the devs have decided to optimize the game for. But 16GB will future-proof your system for when more games are optimized for that amount of VRAM usage...

  • @timybcn
    @timybcn 11 месяцев назад +3

    Nvidia MUST recall ALL 8GB 4060 cards ASAP... it's a crime by Nvidia.

  • @SirTJ
    @SirTJ Год назад +3

    First 3 minutes I learned alot... no one has ever explained this...well I never looked anything up. Thanks for the video.

    • @GrainGrown
      @GrainGrown 4 месяца назад

      *a lot, "alot" isn't a word

  • @jm8080ful
    @jm8080ful Год назад

    I love how you fly all over the screen to point at the relevant information as you talk❤

  • @andipajeroking
    @andipajeroking Год назад +1

    I game on a 4GB VRAM card that I bought new for ~$150. Everything works at 1080p on medium-high.
    Technological advancements are minimal nowadays; you can game with no problems on a rig that's 5 or more years old.
    Stop being consumerists.

  • @gnomekingclive
    @gnomekingclive Год назад +3

    Ratchet and Clank will probably see a patch for the VRAM issues, just like the other games on the list. I've seen people complain about performance issues on 12GB cards in that game as well, which should not happen.

    • @ValkisCalmor
      @ValkisCalmor Год назад +4

      The PS5 and Series X have 16GB of shared system RAM, of which about 13.5GB can be allocated to the GPU. As games drop support for last-gen consoles you're going to see more and more games aiming to take advantage of all of that, and that will carry over to the PC ports. 12GB should be considered borderline, especially since RT and frame generation increase VRAM usage, and I'll be surprised if we don't see some particularly ambitious PC-first titles targeting 16GB+ in the next couple of years. If the rumors of a PS5 Pro hitting next year turn out to be true, that likely makes matters even worse.
      If you're looking at 1080p, then yeah, 12GB is definitely fine. Higher than that and I think you're going to see the same problems 8GB cards are having now by the time the 50 and 8000 series roll around. 'Course I don't know why you'd buy a card like a 4070 Ti for 1080p unless you're looking for 390Hz in esports or something.

    • @Onisak25
      @Onisak25 Год назад +5

      @@ValkisCalmor It's bad optimization. Very bad. Devs are just as much to blame as Nvidia. People forget that.

    • @16xthedetail76
      @16xthedetail76 Год назад +2

      The fact that you're just following a crowd of people that don't understand that technology moves forward speaks volumes. It is not the game's fault; it is Nvidia's and AMD's. It is a miracle that we've been at 8GB for so long, to be honest. Ratchet and Clank is a monumental game in terms of visuals, and the fact it can even run on 12GB cards is insane.

    • @Onisak25
      @Onisak25 Год назад +5

      @@16xthedetail76 The visuals are nothing special at all. It's bad optimization. Even people with 16GB have problems.

    • @gnomekingclive
      @gnomekingclive Год назад +1

      @@ValkisCalmor Being shared RAM means it realistically can't allocate all 13.5GB to VRAM-related tasks. I doubt the PS5 Pro will have any more system memory than the PS5.

  • @ARobotIsMe
    @ARobotIsMe Год назад +3

    Great work as always Dan, keep it up

  • @dennisbradley5620
    @dennisbradley5620 Год назад +1

    I've found that at 1440p my 4070 doesn't use more than 12GB (I often see around 10GB being used), but it does use more than 8GB. I also upgraded my RAM at the same time, and yes, some games will use more than 16GB if it's there, so 32GB of system RAM is just as important. I've seen games use over 20GB of system RAM even when VRAM isn't fully used, at 1440p ultra settings with RT.

    • @grandrtiv26
      @grandrtiv26 Год назад

      I recall using up to 15 gigs of VRAM while playing RE4R during the rainy sections. Crazy

  • @TheTryingDutchman
    @TheTryingDutchman Год назад +1

    awesome video, thanks man

  • @Onisak25
    @Onisak25 Год назад +3

    Ratchet and Clank is very buggy with VRAM. Not good optimization at all. Even 12GB cards run out of VRAM.

    • @endbringer241
      @endbringer241 5 месяцев назад

      Their recommended cards are RTX 2060 (6GB) and RX 5700 (8GB) lmao. They have no excuse to run so poorly on 8GB VRAM.

  • @jiml6822
    @jiml6822 Год назад +3

    In early 2022, I bought a 3070 (8GB) at Microcenter, replacing a 2060. After installing it, I was able to get a 6800 XT (16GB) reference card directly from AMD at MSRP, so I returned the 3070 to Microcenter. So glad I did. Now I've just replaced my 3600 with a 5600X3D (a Microcenter exclusive). I am very happy at 1440p!

    • @kennethpereyda5707
      @kennethpereyda5707 Год назад

      Why buy a card you'll throw away in less than 2 years? Now you'll be 2 generations behind.

    • @jiml6822
      @jiml6822 Год назад

      @@kennethpereyda5707 Back in early 2022 you couldn't easily buy any card at MSRP. The market has changed now with the death of crypto mining. Technology is always cheaper if you wait, but if you want to play, you've got to jump in at some point. I upgraded from a 2060 (6GB), and I have been loving the 6800 XT. The 3070 at the time was actually $50 more expensive than what I paid for the 6800 XT. The 6800 XT with 16GB of VRAM will age better than the 3070 or 3080. I am now an AMD fanboy. For 1440p it's a great card. I wouldn't buy any of the cards available today if I wanted to go 4K; $1600 for a 4090 is a joke. I have a nice 32-inch 1440p 165Hz monitor, which at the time was $350. I will be happy for years. I would be pissed if I had kept the 3070. I wouldn't consider upgrading until 4K monitors are well below $500 and cards get better and cheaper for running 4K.

  • @IndigoBlues-dt7ob
    @IndigoBlues-dt7ob Год назад

    Very informative. Thanks for all the testing and commentary!

  • @icy1007
    @icy1007 Год назад +1

    There is definitely something different between these two GPUs other than their VRAM amounts. In Ratchet & Clank, The 16GB version is clocking significantly higher than the 8GB version. OC model vs non-OC model. The GPU usage is also lower on the 8GB version along with the TGP.
    Also, more system RAM usage normally is due to more background apps/tasks open on the PC. The 8GB card is also allocating significantly less VRAM than its max which is weird.

    • @Jeycht
      @Jeycht Год назад

      He is talking about the RAM "issue" at the 2-minute mark; it's not a difference in apps running in the background.
      As for the cards, the 8GB is the Founders Edition and the 16GB is the Gigabyte OC version; he talks about that right after the RAM part.
      All the answers to your questions are in the video, yet you are talking like he is hiding something.

  • @NGreedia
    @NGreedia Год назад +5

    If the only version of the 4060 Ti was the 16GB, 256-bit one, you might actually have seen a worthy generational uplift over the 3060 Ti. Instead, this is what the 4050/4050 Ti should have been, but it costs $500.

  • @ZeppelinEXAhmed
    @ZeppelinEXAhmed Год назад +3

    Let's say you can buy a GPU with 24GB of VRAM but only a 128-bit bus; then the bottleneck is the memory bus bandwidth, not the amount of memory. For example, the RTX 4060 and RTX 4060 Ti.
    Just as the amount of memory is important for gaming in 2023, the memory bus bandwidth is also important. We need a balance of power, memory, and price. (A quick bandwidth calculation follows this thread.)

    • @Argedis
      @Argedis Год назад

      GPU performance needs to match VRAM, in my opinion.
      The only times in this test that the extra VRAM helped were at QHD/UHD and the highest settings.
      But no one's going to play at 20-30fps; they're going to lower settings anyway, and that will reduce VRAM usage.
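
A quick illustration of why bus width matters alongside capacity, using the usual peak-bandwidth formula (bus width in bits ÷ 8 × effective data rate in Gbps) with commonly listed memory specs. Ada's large L2 cache partly compensates, so treat this as an upper-bound comparison rather than a performance prediction:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
# The specs below are the commonly listed ones; they are illustrative, not authoritative.
def peak_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14.0),
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)": (128, 18.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gbs(bus, rate):.0f} GB/s")
```

A hypothetical 24GB card on that same 128-bit bus would have plenty of room for assets but the same ~288 GB/s ceiling for feeding the shaders, which is exactly the balance the comment above is getting at.
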

  • @TenkaichiMeister
    @TenkaichiMeister 5 месяцев назад

    I'm a pretty frequent gamer with a 2070 Super, so 8GB of VRAM.
    I'm playing at 1080p, 1440p, and 4K (depending on the use case), and so far I have nothing to complain about with my GPU. I haven't played any high-end stuff in a while (the heaviest probably being Marvel's Spider-Man or Cyberpunk), but yeah, so far I don't see any need to upgrade yet.

  • @ojonathan
    @ojonathan Год назад

    tl;dr: VRAM usage measurement is not easy, for sure. What we normally see is how much VRAM is allocated on the GPU, but games, drivers, and operating systems may decide whether something goes to VRAM or to system memory depending on how much VRAM the GPU has available.
    Long explanation (as always):
    In scenarios where you have less VRAM than the game needs, it's like not having enough RAM: some things go to the pagefile/swap, and as long as you don't need those things very frequently, you won't notice any considerable performance hit (but it's there). The OS and driver are smart enough to figure out what the game will need before it really does, and can even hide latency behind some parallel work (GPUs do this all the time).
    The problem is when the GPU frequently needs things from RAM, because the GPU memory hierarchy implies that you must copy data from RAM to VRAM before doing anything with it¹, and because VRAM is exhausted, it also needs to copy things from VRAM back to RAM before going the other way. That's also the reason we never see VRAM 100% utilized, but we can for sure see RAM 100% utilized (if your machine doesn't freeze).
    ¹: It's not strictly a must to have the data in VRAM; the GPU can somewhat operate on data in system memory for light workloads, but for graphics rendering, system memory lacks bandwidth and holds back the GPU, which depends on high bandwidth and high parallelism to hide latency.
    In general, it's hard to accurately measure memory usage, whether it's VRAM or RAM. For example, there are other problems like memory fragmentation, which can be caused both by bad memory management and by not having enough memory to hold everything. Games can also decide not to load some textures from storage when you don't have enough VRAM, or simply not set a preferred buffer location, which lets the driver decide where to store things based on how much VRAM you have available. The only way to get accurate information is by using profiling tools, but even those cannot tell you whether the game is using different strategies to reduce VRAM usage and driver overhead because of the lack of resources.
    One of the reasons for me to upgrade my GPU was not even the raw power; I was happy with my 6600 XT, but I was really worried about VRAM, so I got the RX 6800 non-XT: good price, 16GB of VRAM, amazing performance for 1440p. VRAM makes a lot of difference; better to have your GPU busy doing heavy calculations than busy shuffling data between RAM and VRAM all the time.
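
To make the "games can decide not to load some textures" point concrete, here is a deliberately simplified, hypothetical sketch of the kind of mip-budget heuristic a streaming system might use. The names, sizes, and policy are invented for illustration and do not correspond to any real engine's API:

```python
# Hypothetical texture-streaming budget heuristic (illustrative only; the names,
# sizes, and policy are invented and do not reflect any real engine's behavior).
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    mip_sizes_mb: list   # size of each mip level in MB, largest first
    dropped: int = 0     # how many of the largest mips we have refused to keep resident

    def resident_mb(self) -> float:
        return sum(self.mip_sizes_mb[self.dropped:])

def fit_to_budget(textures: list, budget_mb: float) -> None:
    """Drop the largest resident mip level until the whole set fits the VRAM budget."""
    while sum(t.resident_mb() for t in textures) > budget_mb:
        # Pick the texture whose top resident mip is biggest and drop that mip.
        candidates = [t for t in textures if t.dropped < len(t.mip_sizes_mb) - 1]
        if not candidates:   # nothing left to drop; the game will stutter or swap instead
            break
        max(candidates, key=lambda t: t.mip_sizes_mb[t.dropped]).dropped += 1

textures = [
    Texture("hero_character", [64, 16, 4, 1]),
    Texture("terrain_far",    [256, 64, 16, 4]),
    Texture("building_set",   [128, 32, 8, 2]),
]
fit_to_budget(textures, budget_mb=300)
for t in textures:
    print(f"{t.name}: dropped {t.dropped} mip(s), {t.resident_mb():.0f} MB resident")
```

The visible symptom of a policy like this is exactly what several comments here describe: the FPS chart looks fine while textures quietly load at lower detail or pop in late.
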

  • @16xthedetail76
    @16xthedetail76 Год назад +3

    VRAM should never be an issue, EVER. If it is an issue, that lies with Nvidia and AMD. More so Nvidia.

    • @Onisak25
      @Onisak25 Год назад

      Or the devs could just optimize the games a bit before releasing them. It goes both ways.

    • @16xthedetail76
      @16xthedetail76 Год назад +2

      @@Onisak25 They can only optimize so much. Technology moves forward.

    • @masterlee1988
      @masterlee1988 Год назад

      @@16xthedetail76 Yep, this is true. For newer games, higher resolutions, and ray tracing, more VRAM will always be needed. There are exceptions (along with some unoptimized games), but for the most part it holds true.

  • @zaraizara2794
    @zaraizara2794 Год назад +3

    "But for the vast majority of games out there, 99% at least, there is no difference between a 8 GB and 16 GB card, at 1080p and 1440p." - techpowerup

    • @ThunderingRoar
      @ThunderingRoar Год назад +4

      But you aren't upgrading to a brand new GPU to play old games; you're buying it to play current and future games.

    • @adriancioroianu1704
      @adriancioroianu1704 Год назад +6

      @@ThunderingRoar Current and future games are not just historically bad PS5 ports (like Ratchet and Clank or TLoU); there are plenty of current and future games that run and will run perfectly fine, pretty much the 99% figure that the OP quoted.
      Moreover, how is it that some game studios, like Playground for example, can release AAA titles that run and scale superbly across a wide range of GPUs, while a few others always rush products and fail, disregarding PC gamers? And of course benchmark YouTubers will always use these outliers in their talks and comparisons, because they show the largest discrepancies, which equals "content", and keep rolling with these scenarios. I'm not saying videos like this are useless, they are informative, but generalizing from them is just bad for everybody and usually misses the point of what needs to be improved and what's most worth criticizing.

    • @Z3t487
      @Z3t487 Год назад

      What kind of games? Indie games? 99% of games are indie games, right?

    • @masterpainter78
      @masterpainter78 Год назад

      OP, that is a horribly bad quote. My nephews and nieces all have so many browser tabs and other apps open while they play their games, then they complain about game optimization. My nephew had about 35 Brave tabs open, plus Discord, Paint, and some sort of game programming tool, and complained that Skyrim SE was at 25 FPS. I had him shut down all that stuff and it came right back to 60; weeks later on the phone I had to hear about 20 FPS again.

    • @ThunderingRoar
      @ThunderingRoar Год назад +1

      @@adriancioroianu1704 Because the PS5 can use 11-12GB of its unified memory as VRAM. And it's not just PlayStation ports; look at D4 or Plague Tale (both also NV-sponsored titles).
      People can cry, piss, and shit their pants about VRAM-unoptimized games, but the reality is that owning a 12GB+ GPU lets you brute-force through any potential issues instead of waiting months for patch notes and fixes. Modern-era midrange GPU buyers cannot fathom the idea of having enough VRAM to never worry about it.
      And lastly, I've been into PC hardware long enough to remember the exact same talking points once the PS4 became the baseline and 2/3GB GPUs started becoming irrelevant. 3070/Ti owners are starting to sound a lot like 3GB 1060 owners.

  • @angpham941
    @angpham941 6 месяцев назад +1

    Me still using 1060 3gb: Interesting

  • @MgelikaXevi
    @MgelikaXevi 7 месяцев назад

    your channel is packed with value, many thanks!