How Much VRAM do You Need in 2024?

  • Published: 15 Jul 2024
  • // Join the Community Discord! ► / discord
    How much VRAM do you need to game comfortably in 2024? Well, it really depends on your target game, your resolution, and, funnily enough, the memory available to your game. It's a complicated issue, but hopefully we can break it down a bit to help answer this popular and pertinent question!
    Test System Specs:
    - Intel Core i5-13600k (5.4GHz-P, 4.0GHz-E)
    - Corsair H150i 360mm AIO
    - Gigabyte Z690 Aorus Elite AX DDR4
    - 64GB 3600MT/s GSkill DDR4
    - Various GPUs Discussed & Shown
    - WD Black SN770 1TB PCIe 4.0 for games
    - Samsung 970 Evo 1TB PCIe 3.0 for OS/Boot
    - 650W EVGA PSU
    - Resizable BAR was Turned ON for testing
    Building a budget PC can be tough. Not only are GPUs and CPUs incredibly expensive, but they can be hard to find on a budget... But there are tips and tricks for finding your dream budget GPU and pairing it with a CPU that will give you the performance you want!
    Also, if you're reading this far - I've got more content coming!
    Timestamps:
    0:00 Intro
    0:36 Preface
    1:10 Types of Memory
    3:06 Memory Hierarchy
    4:42 GDDR vs DDR
    5:37 Uses for VRAM
    7:41 Required VRAM Capacities
    11:10 Recommended VRAM Capacities
    13:07 Guinea Pig Cam
    Have a Great Day!
    - Proceu
    #vram #gddr7 #gddr6 #gddr6x
  • Science

Comments • 63

  • @swagatrout3075
    @swagatrout3075 4 months ago +11

    next video title: "Do you need 20GB of VRAM in mid 2024?"
    coming next month

    • @AnonymousUser-ww6ns
      @AnonymousUser-ww6ns 3 months ago +4

      That might be true in a couple of years; by mid-to-late 2026 I would recommend 20GB of VRAM as a starting point.
      But for now I would say 16GB is fine

    • @P_K97
      @P_K97 3 months ago +1

      @@AnonymousUser-ww6ns Depends on resolution 😅 If you want to play on an 8K TV, well, good luck 🤣

    • @MrWasian
      @MrWasian 1 month ago

      @@AnonymousUser-ww6ns We won't see that sharp a climb, at least not until 4K gaming becomes affordable, which is still a few years away. Nvidia tried pushing 8K rhetoric already and immediately stopped when they realized how many people weren't going to adopt 8K when 4K isn't even reasonably affordable for most.

  • @P_K97
    @P_K97 3 months ago +5

    I have an RX 570 8GB. It is 7 years old; it was a mid-range card then.
    Considering that, I would need to get 10GB+ for it to be an upgrade.
    I am playing at 1440p, and considering how games are made right now, I believe new cards with 8GB shouldn't exist 🙄

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 2 months ago +2

      I agree. It baffles me that the 30 series made little to no improvement in VRAM capacity, right when RT was starting to kick off and the PS5 and Series X were about to launch a new generation. Nvidia timed it so well it's crazy, getting so many people to upgrade to cards that 6 months later were crippled by some games, because we saw a massive spike in what performance games needed. 8GB cards should be your 3050-class cards, with 10GB and 12GB at the 4060 and 4060 Ti mark, leaving 16GB and over for anything above that. Realistically the 4090 should have been 32GB or 36GB, since 24GB is plenty but it's not impressive or overkill for a card that is supposed to be, especially when AMD is so happy to put 16, 20, and 24GB on their cards.

    • @P_K97
      @P_K97 2 months ago

      @@mttrashcan-bg1ro It is a reason to upgrade your GPU faster 😅 More money for them 🙄
      Probably the reason why I am still able to play with OK fps at 1440p is that 8GB of VRAM.
      If I had gone for 4GB, Cyberpunk would probably not even run 🤣

  • @maticsb
    @maticsb 4 months ago +9

    ME WATCHING WITH 4GB VRAM

    • @yumri4
      @yumri4 4 months ago +3

      Watching it on a GTX 980, so 4GB of VRAM, which I hit almost daily with 3D modeling and AI compute tasks.
      For gaming I almost never hit a bottleneck, though that is probably because (1) I do not look at fps, so I do not notice unless it stutters, and (2) I almost never use max settings in games.

  • @Razor2048
    @Razor2048 4 months ago +5

    For many games, 8GB is still heavily limiting. For example, even in older titles such as Far Cry 6, the highest texture setting will use more than 8GB, and the developer recommends at least 12GB for it. Textures also carry little GPU overhead provided there is enough VRAM throughput; for example, Far Cry 6 with the HD texture pack shows a negligible performance difference on modern cards, especially ones like the 4070 Super.
    On a modern card with 500-600+ GB/s of VRAM throughput, doubling the texture resolution has virtually no impact on performance while significantly improving visuals, at the cost of a higher memory-controller load. On oddly crippled cards, such as ones with a large pool of VRAM but a 128-bit bus like the 4060 Ti 16GB, where the VRAM manages only 288GB/s (basically around 12GB/s faster than the VRAM on the GTX 970; and the extra L2 cache does not help with heavy texture demands, it only reduces bandwidth needs for tiny datasets unrelated to textures), a higher texture load will begin to negatively impact performance.
    Even when using a card for as long as possible, more VRAM helps greatly: if you reach a point where you are running games on low due to inadequate GPU compute performance, a card with more VRAM means overall low settings with high or ultra textures, rather than an overall low preset and low textures, and that makes a huge difference in the overall visual experience. It is easy to test: try a few games on low, then try maxing out the textures while everything else stays on low. The visual improvement from that will be more substantial than from almost any other setting, except in cases where low completely kills dynamic lighting.
    The worst combination/trend these days is cards with very little VRAM, e.g. 8GB, that also use a PCIe 4.0 x8 interface.
    A fast interface is highly important in scenarios involving VRAM spillover, as the PCIe bus load determines when you run into major hitching/stuttering in games where the developers allow shared memory use. On a PCIe 4.0 x8 or PCIe 3.0 x16 interface, games that actively use shared memory (e.g. earlier builds of Hogwarts Legacy; unlike Ratchet & Clank: Rift Apart, which stores other rifts in shared memory and can thus allocate 2-3GB of it without much of a performance hit) will typically run into stuttering after around 512-700MB is used, while cards on a PCIe 4.0 x16 interface can typically handle around 1.5GB or more allocated before major stuttering and hitching start. Shared memory use is never ideal, but it is important to consider when taking the aging of a card into account.
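    The arithmetic behind those bandwidth figures is easy to sanity-check. Here is a minimal sketch; the 18 Gbps GDDR6 data rate for the 4060 Ti and the encoding factors are my own assumptions, and all figures are theoretical peaks that ignore protocol overhead beyond line encoding:

```python
# Sanity-checking the VRAM and PCIe bandwidth numbers from the comment above.
# Data rates and encoding factors are assumptions, not vendor specifications.

def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak VRAM bandwidth: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Peak one-direction PCIe bandwidth in GB/s (gens 3-5 use 128b/130b encoding)."""
    raw_gt_per_s = {3: 8.0, 4: 16.0, 5: 32.0}[gen]
    return raw_gt_per_s * (128 / 130) * lanes / 8

# RTX 4060 Ti: 128-bit bus at 18 Gbps GDDR6 -> the 288 GB/s the comment cites
print(gddr_bandwidth_gbs(128, 18))          # 288.0

# PCIe 4.0 x8 equals PCIe 3.0 x16 -- the pairing the comment calls out as a trap
print(round(pcie_bandwidth_gbs(4, 8), 1))   # 15.8
print(round(pcie_bandwidth_gbs(3, 16), 1))  # 15.8
print(round(pcie_bandwidth_gbs(4, 16), 1))  # 31.5
```

    Spillover traffic over that roughly 16 GB/s link is more than an order of magnitude slower than the VRAM itself, which is one way to see why the stutter thresholds differ between x8 and x16 cards.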

  • @auntiepha8343
    @auntiepha8343 4 months ago +6

    🏆 Great video, you break down VRAM to where anyone can understand it. 👍

  • @afti03
    @afti03 4 months ago +2

    NOW I GET IT! what a great video! what a great teacher! it all makes sense now!

  • @desktopstu4145
    @desktopstu4145 2 months ago +2

    I purchased an Arc A770 16GB card as part of a system upgrade from Zen 2 to Rocket Lake. My old graphics card was a 1060 6GB; the game of choice is World of Tanks. The 1060 used to use around 4-5GB of VRAM playing at 4K medium settings; the Arc uses a whopping 10-11GB of VRAM playing at ultra.

  • @Randomgirl4629
    @Randomgirl4629 4 months ago +8

    9:49 I’m staring at this image you used as evidence that the 4060 ti 8gb performs comparably to the 16gb version, but are you seeing those 1% lows and horrendous frame time spikes on the graph for the 8gb version? Unless you are okay tolerating horrendous stutter in games, more vram is a must nowadays. The best example is the 4070 ti super may not be much more powerful than the 4070 super, but the extra 4gb of vram will prove useful in the future for keeping settings maxed out at 1440p without worrying about bad stutter and horrible 1% lows, which is more noticeable than the small performance increase over the 4070 super. This is the same debate we saw between 4gb and 8gb cards in the past, it has become clear that 4gb cards are UNPLAYABLE even at lowest settings 1080p in recent AAA games now while the old 8gb cards are definitely slow, but not completely unplayable with low settings. Hardware Unboxed’s most recent video is blatant proof, and you should also watch their 8gb vs 16gb comparison video for more.
    Going back to the 4060 ti 8gb vs 16gb, yes, the bus width is still tiny, but you do not want to have “vram misses” in your games where calls to your motherboard's system memory are made by the gpu to constantly cycle textures and such in and out of vram. It’s horrible to play and it’s why I upgraded from my 2070 this year: 8gb doesn’t feel like enough anymore for a smooth experience. A slow card with enough vram running a game at a smooth 60fps at 1080p max settings is better than 100fps on a faster but lower vram card stuttering like crazy at max settings because it’s out of vram. Frametime graphs are more important than average framerate when measuring the actual gameplay feel and how smooth/responsive the experience is.
    TL;DR: The 4060 ti 8gb has more potential to become an unplayable stuttery mess than the 4060 ti 16gb.
    EDIT: Modern Consoles have 16GB hybrid VRAM that is also used for system memory. This effectively means 12gb vram is a must now just to keep up with the PS5, 10gb to keep up with the Series X. Conversely, the Xbox Series S has 8gb of total system memory as vram which means 6gb really is the absolute lowest you should go, but you’re getting a $300 console experience at that point.
    I think the simplest method is to simply match the flagship console vram. This means 16gb in 2024 to stay ahead of console visual quality, performance, responsiveness, and smoothness.
    CONCLUSION: 16GB is the definitive vram amount if you want to have the best chance at a smooth, stutter free experience at 1440p/4k max settings playing the pc ports of upcoming console releases in the years to come. Consoles cannot exceed this amount of memory at this time under any circumstance as their total system memory is 16gb of vram. 4070 ti super is the best option from nvidia at this time, and the 7800 xt the best option from amd if you’re on a tighter budget. You could buy the 4060 ti 16gb or the 7900 gre instead but they are worse than the 7800 xt and the 4070 ti super, respectively. The 4070 ti super is a better buy than the 7900 xt; 20gb is overkill UNLESS you know you’re going to be stuck with the gpu for a long time. After all who knows? The next consoles might have 20 or even 24gb of vram as total system memory…

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 2 months ago

      This is exactly why it irks me how many people still recommend buying 30 series cards. The only 30 series cards worth buying are the 3090 and 3090 Ti, if you can get them for less than a 4080: while they might only perform as well as mid-tier 40 series cards, they still have more VRAM, and DLSS Frame Gen isn't currently worth using in most cases as it's very broken and causes a lot of problems.
      As for the most popular cards like the 3070 and 3080, even the 3080 Ti, they are 8GB, 10GB, and 12GB; that amount was not enough for what you were paying for the card when they were new at MSRP. The 3070 should have been 12GB with the other two at 16GB, then they'd still be insanely good cards and the 3080 and 3080 Ti would still be usable at 4K. The 3060 outperforms the 3080 in some games, which is an absolute joke.
      As for matching console VRAM: the PS4 and Xbox One had 8GB, yet 8GB cards didn't hit the mainstream until people started buying 1070s and RX 480s and such, so it's not a good guide at all. Personally, I think the comparison between console and PC specs should never even come up outside of straight-up comparing them out of curiosity. On PC we have to factor in that we often get bad ports that want a lot more than what the consoles have. Keep in mind that both last-gen consoles also had 8-core CPUs, when 8-core CPUs still aren't required in a lot of games.

  • @paulboyce8537
    @paulboyce8537 4 months ago +4

    Always very informative. In my opinion it comes down to VRAM/bus/wattage, and on Arc you also need to take into account the different architecture, where the CPU/RAM is paired much more equally with the GPU. For example, the 225W you have on the A750/A770 looks at first like a wattage limit. In some sense this is true, but you need to account for the architecture pairing the CPU equally. The behavior is a bit different and sometimes a bit weird: at 1080p you might not get that close to the game's FPS ceiling, but the drop to 1440p is small, and at 4K it sometimes holds even as the demand gets higher. Of course the same rules apply: 16GB/256-bit bus for 4K. I've noticed that if a game has an FPS ceiling, you usually get half of it at 1080p. Don't get me wrong, if the ceiling is 200FPS and you still get 100FPS, that is very playable. But at 1440p it might only drop to 85FPS, and at 4K to 70FPS.
    The threshold, I would say, is 40FPS for most 4K gaming with Arc, but the difference is that it stays playable. 60FPS works great, and you get that with a mix of medium and high settings, and often you can also add RT. You can't do the same with a 4070 Ti, for example, even if you have twice the FPS. In my understanding this is because the Arc GPU does one task and the CPU does the other half, and the two are brought together with minimal queuing and superior frame times, whereas on AMD/Nvidia almost everything is done within the GPU, and when you ask for higher resolution and push the card, you end up with queuing/overloading, and that gives stutter. That's also why you often see twice the FPS and yet it doesn't work, and we keep talking about stutter. The 4070 Ti's 192-bit bus doesn't help, but it does have 285W available, and on paper it should be by far the superior card, and it's a lot more expensive.

  • @cowman876
    @cowman876 4 months ago +4

    6700 xt with 12 GB vram

  • @spikykitten3502
    @spikykitten3502 16 days ago +1

    RTX 2070, 8GB. It still works great at 1440p, but it would be a lie to say the newest games don't have some VRAM trouble.

  • @AnonymousUser-ww6ns
    @AnonymousUser-ww6ns 3 months ago +2

    I will make sure to get 16GB of VRAM on a 256-bit memory bus from now on.
    New games will become demanding, and the extra VRAM will help.

  • @luka33luk42
    @luka33luk42 4 months ago +3

    512 MB vram 😎

  • @MooKyTig
    @MooKyTig 4 months ago +7

    Say it with me: TEXTURE QUALITY HAS NO REAL WORLD IMPACT ON PERFORMANCE.
    If you have enough VRAM you can have higher quality textures with basically ZERO impact on performance.
    All this crap about "enough for a card of this level" means you've swallowed Nvidia's marketing BS.
    PS. "Enough for a card of this level" could apply to the Bandwidth, yes. But not the total amount of VRAM. Total VRAM, although related, is not directly tied to bus width by using different size chips or so-called clamshell designs.

    • @da1punisher
      @da1punisher 4 months ago +2

      Hear hear. The latest HUB/Tech Spot article on the 6500XT 4GB vs 8GB bears out what you wrote. All else being equal even a slow GPU can benefit from extra VRAM. Being able to turn up textures even on low settings is a nice visual improvement. Better frame pacing and even higher FPS at times too.

    • @matthewjohnson3148
      @matthewjohnson3148 2 months ago

      This is the biggest load of horse shit ever....
      The 7900 XT is a 20GB, 320-bit bus card, yet it cannot run high textures at 4K. In games like Avatar it can't even manage 1440p native...

    • @MooKyTig
      @MooKyTig 2 months ago

      @@matthewjohnson3148 Ignorance is tough to combat.
      Exactly which game can the 7900 XT not "run" high textures in?
      Ignorance of how the Snowdrop engine works is not an excuse. Sounds to me like someone looked up the VRAM usage of Avatar and was actually foolish enough to think that the game was using that much VRAM...

    • @MooKyTig
      @MooKyTig 2 months ago

      @@matthewjohnson3148 There are something like 6 or 7 video cards on the planet that can do Avatar at 1440p 60fps.
      The 7900 XT is one of them. It trades blows with the 4070 Ti (its price competitor) depending on reviewer/resolution/settings, which seems pretty normal to me.
      PS. The 4090 can barely, and I mean barely, eke out 60fps at 4K in Avatar. So, if people want to say Avatar has crap performance, hey, I agree.

    • @matthewjohnson3148
      @matthewjohnson3148 2 months ago

      @@MooKyTig the 7900 XT is not one of them... you have no idea what you're talking about..
      My 4080 Super can, though.

  • @jonservo
    @jonservo 4 months ago +2

    Sure 8gb is fine for 1080p but what about us 1080p ultrawide monitor users, no one ever talks about us lol

    • @MooKyTig
      @MooKyTig 2 months ago

      1440p results are a very good proxy for 1080p ultrawide: take the 1440p numbers, add about 10%, and it's close enough.

  • @ethancbaker2002
    @ethancbaker2002 3 months ago +4

    Regret buying a 3070 Ti with 8GB of VRAM

    • @ProceuTech
      @ProceuTech  3 months ago

      How come? I'm curious because I don't have my 3070ti anymore, so I haven't experienced anything on it in 6+ months.

    • @ethancbaker2002
      @ethancbaker2002 3 months ago +1

      @@ProceuTech Not enough VRAM, tbh. I know it was nice at the time, but it's dumb that it only has 8GB of VRAM when older cards than it had more. Me being dumb, I also bought it during scalper prices when GPUs were short, which was a very dumb decision. Because I bought it during that time, I missed out on newer cards that were a better deal and had more VRAM. NVIDIA is notorious for skimping on VRAM compared to AMD. What I wish I had done was buy a worse card at the time, which I wouldn't have had much money in, and then upgrade it to a 4070 Ti Super, which has 16GB of VRAM, and then I'd be done. But eh, I'm gonna stick with my setup for a while and upgrade way later, unless I find a good AM5 motherboard for half price at like a flea market or something lmao

    • @ProceuTech
      @ProceuTech  3 months ago +1

      @ethancbaker2002 Seems valid. Sounds similar to my situation; however, I sold the card off and upgraded to a current gen one instead. Probably not the smartest, but 🤷‍♂️

    • @ethancbaker2002
      @ethancbaker2002 3 months ago +2

      @@ProceuTech Yeah, but the good thing with computers is that you learn a lot by being into them for a while lol. Nice vid btw!

  • @SafoGamer
    @SafoGamer 1 month ago +1

    Why don't you mention the AMD cards? They seem to offer the best bang for your buck, especially considering VRAM.

  • @Personalinfo404
    @Personalinfo404 4 months ago +2

    Still haven't found a game that runs at less than 60 FPS at 4K, and I have a 12GB 3080 Ti

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 2 months ago

      If it was a 3080 10GB, you'd be in serious trouble. 12GB isn't going to last much longer at 4K; in fact, I'm surprised you say that when most games show they use 12-14GB and allocate more like 15-16GB of my 4090's VRAM.

    • @Personalinfo404
      @Personalinfo404 2 months ago

      @@mttrashcan-bg1ro bold of you to assume that I play any games that have released in the past 8 years besides elden ring. stop buying bad games.

    • @Personalinfo404
      @Personalinfo404 2 months ago

      @@mttrashcan-bg1ro also, this was just your way of trying to tell people you have a 4090. reddit moment

  • @bauer9101
    @bauer9101 4 months ago +3

    Reminds me of the GTX 750 Ti 4GB GDDR5 my friend bought. He was convinced he ‘future proofed’ himself. He didn’t.

    • @Magnusrorsachhatty
      @Magnusrorsachhatty 4 months ago +4

      That's an extreme case; if he had bought a 970 4GB, it would have been usable until 2020

    • @ilhamkazimzadeisyourproducer
      @ilhamkazimzadeisyourproducer 4 months ago +2

      @@Magnusrorsachhatty I bought a 4060 Ti 8GB. Will it be usable until 2030 or not?

    • @MooKyTig
      @MooKyTig 4 months ago +4

      @@ilhamkazimzadeisyourproducer I highly doubt that. The big problem will be the next gen of consoles. They will almost certainly have access to even more VRAM than they do now. You'll be able to play some games for sure... but a lot of console ports will be out of your reach. They'll likely technically boot and run, but you'll have horrendous stuttering.
      This is an honest prediction. I could be wrong, but I doubt it.

    • @tlv8555
      @tlv8555 4 months ago +3

      Why did you buy the meme GPU 😂 @@ilhamkazimzadeisyourproducer

    • @ilhamkazimzadeisyourproducer
      @ilhamkazimzadeisyourproducer 4 months ago +1

      @@tlv8555 Bro, I don't have a good budget, so I had to buy it 😭😭

  • @GroundGame.
    @GroundGame. 4 months ago +2

    *Dat sweet, sweet 16GB 4070Ti SUPER doe* 😉

  • @-Rizecek-
    @-Rizecek- 4 months ago +4

    10-12GB is the minimum for 1440p
    8GB will still be enough for 1080p
    12-16GB is the minimum for 4K

    • @GrainOnTheGo
      @GrainOnTheGo 4 months ago +3

      Let me fix this for you.
      8-12GB 1080p
      16GB+ 1440p
      20GB+ 4K
      I don't know what games you're playing where 16GB isn't the minimum for 1440p. And it's going to get worse over the years.

    • @-Rizecek-
      @-Rizecek- 4 months ago +4

      @@GrainOnTheGo
      12gb for 1080p omg..

    • @GrainOnTheGo
      @GrainOnTheGo 4 months ago +4

      @@-Rizecek- if you’re playing stuff like Alan Wake 2, and UE5 games with RT? Yea you need 10-12GB. The only people who complain about VRAM not being important are people who don’t have a lot of VRAM (aka NVIDIA because they starve their cards of it).

    • @-Rizecek-
      @-Rizecek- 4 months ago +4

      @@GrainOnTheGo
      UE5 sucks; it's a very bad engine for modern games.
      Sometimes it looks worse than games from 2015-2018.

    • @knowingbadger
      @knowingbadger 4 months ago +5

      @@GrainOnTheGo Man, I've got 16GB of VRAM and almost only play at 4K. I've never gone above 10 gigs used.

  • @TobiasCramon12
    @TobiasCramon12 4 months ago +2

    From my experience, 8GB of VRAM works great for pretty much all games at 1080p

  • @willian9327
    @willian9327 4 months ago +7

    Just Cut the Crap,
    FHD - 8GB
    QHD - 12GB
    UHD - 16GB

    • @MooKyTig
      @MooKyTig 4 months ago +6

      All you've done here is displayed ignorance.