Is 8GB VRAM Enough For The Latest Games? If Not, Why Not?

  • Published: 18 Nov 2024

Comments • 343

  • @FMBriggs
    @FMBriggs 6 months ago +19

    20 years ago I bought a gaming laptop with 2GB of shared system memory, during a time when 128-256MB was standard. That laptop lasted for well over a decade as a netbook. It's not something I'd recommend though, especially for gaming. Don't buy a 4090 or a 7900XTX for the large memory buffer, because by the time you actually need that much VRAM a mid-range card that uses 1/3 the power will outperform your space heater.

    • @24sLight
      @24sLight 4 months ago

      So which one is better? I plan to buy a laptop.

    • @vac59
      @vac59 1 month ago

      @@FMBriggs The VRAM is now needed for extra features in the pipeline like DLSS and Nvidia RT. You're already peaking at 12GB with the RE4 remake. 16GB seems to be the sweet spot and Nvidia knows it; making people jump to a 5080-class card at probably 800 bucks is absurd.

  • @boaether
    @boaether 6 months ago +84

    8 GB should be enough, but it isn't. I got tired of my 3070 Ti as the VRAM amount couldn't keep up with the processing power of the card. It COULD process RT and high settings in many games, but it WOULD often run out of VRAM bringing performance and sometimes stability crashing down.
    Now I have a 16 GB card and I don't have to worry about VRAM at all. I would not in any way get an 8 GB card in 2024, if you plan on playing AAA games.

    • @Philo229
      @Philo229 6 months ago +17

      Not if you want to play them at max settings, with RT and at a high resolution. But, you can easily get away with 8GB at 1080p with lower settings.

    • @CaptainScorpio24
      @CaptainScorpio24 6 months ago +4

      Same here, I went from a 3-year-old 3070 Founders Edition to a Gigabyte RTX 4070 Ti Super Eagle OC 16GB at launch in Jan 2024.

    • @puffyips
      @puffyips 6 months ago +7

      And there's that guy saying "4GB is enough if devs optimize". C'mon man, devs have been waiting for 16GB to become the norm for VRAM, and for even more in the future.

    • @TheIndulgers
      @TheIndulgers 6 months ago +22

      VRAM should never be the limiting factor. The card should age from lack of performance before anything.
      Having to lower texture quality just to fit in the vram budget is unacceptable imo. Texture quality is literally free visual quality if you have enough vram - and imo has the largest impact on visuals. Much more than RT, or shadows, or SSR.

    • @tomthomas3499
      @tomthomas3499 6 months ago +8

      They knew full well what they were doing when they gave the 3070 and 3080 only 8 and 10GB of VRAM respectively: forcing you to upgrade to the newer gen, which you did. It's probably one of the many reasons Nvidia has kept its monopoly grip on the GPU market.

  • @Pawnband
    @Pawnband 6 months ago +41

    John didn't have to flex so hard with that 12mb Voodoo.

  • @returningwhisper
    @returningwhisper 6 months ago +46

    Nvidia cheaping out on the RAM solutions for their cards is not great.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 6 months ago +5

      They want you to buy their overpriced GPUs instead.

    • @johnarigi11
      @johnarigi11 6 months ago +1

      If rumors of RTX 5080 having 16gb are true, then that's just Nvidia being very stingy. For their stinginess I'd expect at least 18gb, I don't see them putting 20gb

    • @rustyshackleford4117
      @rustyshackleford4117 5 months ago +3

      @@johnarigi11 16GB is going to be plenty for any game for quite some time. I have never seen my 4090 hit even close to 16GB usage in any game, even maxed at 4K with all RT settings on Ultra. More memory in this scenario is only warranted if you're doing AI workloads. Besides, due to the memory bus size, it's not possible to have a card with, say, 18GB of VRAM; it's either 8, 16, or 24.
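      A minimal sketch of why capacities come in such coarse steps (not from the video; it assumes standard 1GB/2GB GDDR6 modules, one per 32-bit channel):

```python
# Illustrative only: total VRAM = number of memory chips x capacity per chip.
# A GPU memory bus is built from 32-bit channels, one chip per channel
# (or two per channel in "clamshell" mode).

def vram_options_gb(bus_width_bits, chip_capacities_gb=(1, 2), clamshell=False):
    channels = bus_width_bits // 32
    chips = channels * (2 if clamshell else 1)
    return sorted({chips * cap for cap in chip_capacities_gb})

print(vram_options_gb(256))                  # [8, 16]  <- a 256-bit card
print(vram_options_gb(256, clamshell=True))  # [16, 32]
print(vram_options_gb(384))                  # [12, 24] <- how 24GB cards get there
```

      Under those assumptions, 18GB isn't reachable with uniform chips on a 256- or 384-bit bus, which is the constraint the comment is pointing at.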

    • @hannes0000
      @hannes0000 5 months ago +1

      @@DeadPhoenix86DP They also want that people upgrade every generation.

    • @skychaos87
      @skychaos87 3 months ago +2

      It's part of their strategy: their mid-range GPUs are powerful, but they purposely pair them with low memory, so you buy them thinking you can max out a lot of games, then come next-gen games you realize your GPU suddenly performs like sh1t because of a memory bottleneck and you are forced to upgrade again. The 4060 Ti having only 8GB is a crime; my 5700 from 5 years ago already had 8GB. Even the current-gen 7700 XT, which is priced cheaper than the 4060 Ti, has 12GB.
      The 4060 should have 12GB and the 5060 should have at least 14GB. People buying GPUs should be future-proofed for at least 4+ years for the purchase to make sense.

  • @manna6912
    @manna6912 6 months ago +196

    Damn, here we go again. The PS3 with 256MB of RAM was able to run Watch Dogs and GTA 5. That was a hell of an optimization job back then.

    • @johnconnorpliskin7184
      @johnconnorpliskin7184 6 months ago +21

      Not to mention, low-spec indie games continue to come out and sell really well.

    • @OwtDaftUK
      @OwtDaftUK 6 months ago +39

      It is shocking how much they were able to do with the PS3's little RAM.

    • @gothpunkboy89
      @gothpunkboy89 6 months ago +62

      Run those games is being generous. They could barely maintain a frame rate in the upper 20s.

    • @KrazzeeKane
      @KrazzeeKane 6 months ago +35

      ​@gothpunkboy89 you tell no lies, my friend. I bought a ps3 to jailbreak and my god, do a lot of the ps3 games run like absolute crap.
      So many titles sub 30fps, or bad frame pacing and stutters and such. I ended up upgrading my pc so I can run the rpcs3 emulator and play ps3 games at a stable fps and a proper resolution lol

    • @manna6912
      @manna6912 6 months ago +7

      @@KrazzeeKane Yeah, agreed. But that time had its own golden-age vibe. Back then games used to be smaller and affordable. After The Witcher 3 every game became 100 hours of grinding, and the backlog keeps getting bigger and bigger. It's just kind of overwhelming. Back then games used to chase players; today we are chasing to finish every game. It seems impossible with RPGs.

  • @TheIndulgers
    @TheIndulgers 6 months ago +80

    VRAM should never be the limiting factor. The card should age from lack of performance before anything.
    Having to lower texture quality just to fit in the vram budget is unacceptable imo. Texture quality is literally free visual quality if you have enough vram - and imo has the largest impact on visuals. Much more than RT, or shadows, or SSR.
    Stop defending a trillion dollar company. It is weird.

    • @OG-Jakey
      @OG-Jakey 6 months ago +2

      DF has arguably helped corporations more than they realise. Pushing 2560x1440 has been the biggest contributing factor when you look at the big picture.

    • @timothyjn100
      @timothyjn100 6 months ago +3

      If you compare path tracing to 4K vs 1080p textures, textures actually have FAR less visual impact. Not even close.
      Not sure how you can say textures have the largest impact when objectively that is not true in the slightest.
      I agree with everything else you state. I agree that it's free improvement for nothing more than just having enough VRAM. But let's not get it twisted: it's night and day comparing path-traced lighting to higher-res textures. Take a game like Minecraft, for example. You can mod in crazy realistic high-res textures, but doing so alone is hardly game-changing. Now path-traced Minecraft with the standard 16x16 textures? It's game-changing. I rest my case. That is the simplest comparison I can put forth to express this basic point.

    • @Hayden2447
      @Hayden2447 6 months ago +7

      @@timothyjn100 Minecraft is an awful example, as the 16x16 textures are literally the art style of the game; making them higher res completely changes the Minecraft charm. Take a realistic game instead: if the textures are visibly low-res versus a higher setting that makes them look nice and detailed, then better textures are an all-round win. RT can be big, but only when you crank all parts of it, or use path tracing, which tanks frame rates, versus textures which are free if you have the VRAM. Higher resolutions do make higher-quality textures matter more, since there are more pixels to represent them and you can see more of the detail, such as at 4K or even 1440p. At 1080p, high-resolution textures are hidden more by the lack of rendered pixels, so you need to get closer to see the difference, and RT would give a relatively bigger boost in fidelity there.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 6 months ago +1

      @@timothyjn100 That's not what the OP meant at all. Compare low, medium, high and ultra textures in nearly any game and it'll always be the biggest impact on visuals, while having no performance impact if you have the VRAM. Most of the time those texture quality settings also don't state what "resolution" they are. "4K" or "HD" textures is just something applied to texture packs you download as mods, or for games like Far Cry 5. Games like GTA 5 have been said to have 4K textures, yet that is meaningless since it doesn't specify what texture quality it is, and it also has worse textures than newer games that are technically the same resolution.

    • @Sim7NYY
      @Sim7NYY 5 months ago

      That's not fair.
      Defend what deserves to be defended at all times.
      You're definitely right about the rest.

  • @BatCorkill
    @BatCorkill 6 months ago +24

    I noticed the Avatar game seemed to be using 10+GB of my 12GB card.

    • @zackmandarino1021
      @zackmandarino1021 6 months ago

      It used 13-15GB on my 20GB card lol. At 1440p ultra with Unobtanium mode enabled it was 17, I was like wtf.

    • @SimonBuchanNz
      @SimonBuchanNz 6 months ago +6

      Avatar is a little bit of an odd case, as it will upload as much as it can to reduce how much it needs to swap in and out.
      Since it's an open-world game, this means it can fill up pretty much any amount of memory, until most of the install is on the card!
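      A rough sketch of that opportunistic streaming idea, purely illustrative (none of the names or budget numbers come from the game or the video):

```python
# Keep uploading speculative assets while free VRAM remains; only evict the
# speculative ones when something actually visible needs the space.

class OpportunisticCache:
    def __init__(self, vram_total_mb, headroom_mb=512):
        self.budget = vram_total_mb - headroom_mb  # leave room for render targets
        self.resident = {}                         # asset id -> size in MB
        self.used = 0

    def prefetch(self, candidates):
        for asset_id, size_mb in candidates:       # e.g. assets near the player
            if asset_id not in self.resident and self.used + size_mb <= self.budget:
                self.resident[asset_id] = size_mb
                self.used += size_mb

    def require(self, asset_id, size_mb, evictable):
        while self.used + size_mb > self.budget and evictable:
            victim = evictable.pop()               # drop least-useful prefetch first
            self.used -= self.resident.pop(victim, 0)
        self.resident[asset_id] = size_mb
        self.used += size_mb
```

      On a 20GB card a cache like this simply keeps prefetching far longer before anything has to be evicted, which is why measured "usage" tends to scale with whatever card is installed.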

    • @adlibconstitution1609
      @adlibconstitution1609 6 months ago +2

      Rtx 3070ti can run avatar ultra 1080p

    • @ymiround
      @ymiround 6 months ago +1

      Funny enough this particular game runs pretty great on 8gb cards according to different testers. But for example Horizon Forbidden West and Ghost of Tsushima use around 8gbs so in case of 4060 and 4060 ti it's fine but you can't enable FG because FG also takes vram. I mean, you can enable it but it starts swapping vram into ram and you'll see the same fps at best

  • @DJBV
    @DJBV 6 months ago +13

    12 gb vram should be the minimum to match/exceed current consoles textures quality.

  • @sapphyrus
    @sapphyrus 6 months ago +19

    8GB can still make do if you bought it last gen, but it's a bad idea to buy in the current gen.

  • @mondodimotori
    @mondodimotori 6 months ago +18

    Again with this debate... Some people in the field already warned about this limit way back when the 3070 launched.
    8GB is obsolete in 2024 for resolutions higher than 1080p with max-quality textures.

    • @renatoramos8834
      @renatoramos8834 1 month ago +1

      8GB was already unacceptable for cards launching in 2022.

  • @clinten3131
    @clinten3131 6 months ago +7

    I would prioritize a 16GB card if I was making a new rig. But people can do what they want, of course; it's not my system or money. I just don't want the headache of having a potent GPU that I have to scale game settings down on because of VRAM.

    • @KrazzeeKane
      @KrazzeeKane 6 months ago +3

      Thats what I did when I built my pc last year. I wasn't going with any less than 16gb, as I wanted to be able to play games for the next 5 years minimum before having to turn down settings and upgrade.
      At that time the only card that fit was an rtx 4080, as this was long before the 4080 super or 4070 ti super existed.
      Managed to nab a 4080 for $950 and it has been treating me very well--paired it with an i7 14700k and 32gb ddr5 6400mt cl32 ram, so it will definitely last me the next 5 to 6 years of high settings + high fps gaming at 1440p before I need an upgrade.

  • @DigitalVirusX2
    @DigitalVirusX2 6 months ago +5

    I regret my 3070 Ti so much. $600, and at least in 2077 it's impossible to maintain 60fps with any sort of ray tracing on, because the VRAM gets eaten up so quickly and performance TANKS after an hour of play. I'll probably buy a 4070 Ti Super down the line when it's at a more reasonable price; hopefully the 50 series makes the prices drop on the 40s.

  • @deathshade777
    @deathshade777 6 months ago +31

    It's not enough; I run out on most current-gen games with my 4070 laptop. It should be 12GB minimum.

    • @desmond3245
      @desmond3245 6 months ago +10

      Wow 4070 mobile is such a scam. 8GB 128bit Vram with the performance of a 3060 desktop. Even if it's called 4060ti mobile it's still bad. I thought with better power efficiency mobile chips would be much better than last gen. But then I forget it's Nvidia.

    • @filippetrovic845
      @filippetrovic845 5 months ago

      @@desmond3245 You have no idea what you are talking about. With FG the 4070 laptop has 120% higher fps than the shitty 3060 in all FG-supported titles, which are exactly the most demanding games. I have a 4060 laptop and couldn't be happier; I've never run out of VRAM. However, if you bought a 1440p laptop with 8GB of VRAM, you don't know what you are doing.

    • @rustyshackleford4117
      @rustyshackleford4117 5 months ago

      @@desmond3245 I want to upgrade from my aging GTX 1080 laptop, but if you ignore ray tracing and DLSS as selling points, the 4070 mobile is about on par with the old school 1080 mobile in terms of pure raster performance. Back then, the mobile cards like this were full-fat GPUs with the same specs as the desktop parts.
      Now, we are power limited on cards with the highest-end mobile 4090 being functionally more similar to a desktop 4070, not even a 4080. Steam Deck has been plenty fine for 90% of my mobile gaming though, so getting a new laptop is more about having DLSS and an OLED screen...while 4090 on my desktop continues to provide more performance than I'll need for some time.

    • @panjak323
      @panjak323 3 months ago

      @@desmond3245 The 4060 Ti itself is a scam. It is a 4060.

    • @steveco1800
      @steveco1800 2 months ago

      It’s borderline based on settings. At least with your 40 series laptops you have PCIe 4.0. My 3070 laptop is sometimes crippled by being on PCIe 3.0 x8.

  • @sengan2475
    @sengan2475 6 months ago +22

    Ghost of Tsushima allocates 8GB at anything above medium textures. This isn't a "future proofing" thing; you will have trouble right now in many games unless you're planning to play at 1080p medium on a budget rig, which there's nothing wrong with. But if your expectations are any higher, 8GB isn't enough.

  • @f-35x-ii
    @f-35x-ii 6 months ago +8

    I've got an Asus TUF RTX 3060 Ti OC Gaming 8GB paired with an AMD R7 5800X3D and a curved 144Hz 1440p monitor, and I am having no issues with this at all, even in UE5 games. Of course you always need to optimize the settings, and I'm also getting high FPS. I'm not switching video cards anytime soon either, but when I do it will definitely be a 16GB one!

    • @puffyips
      @puffyips 6 months ago

      Play rust, it’s an old game at this point.

    • @Dhrazor
      @Dhrazor 6 months ago

      UE5 uses the least amount of VRAM out of any advanced engine, try running Alan Wake 2 with RT at 1440p, or 4K... if you had Frame Gen that would use even more VRAM.
      At 1440p MAX+PT+FG the game uses 14GB VRAM, at 4K it's almost 18GB... even at 1440p+RT the game can use 12-13GB so the 3060 beats the 3070Ti. 16Gb should be fine for now, but I hope there will be 24GB 5080Ti at around $1000 next year, I'd really like to move to 4k

    • @filippetrovic845
      @filippetrovic845 5 months ago +1

      @@Dhrazor Telling a guy to play a certain game he wouldn't normally play only to prove that 8GB of VRAM is not enough is exactly what I expect from a poor AMD fanboy. BTW, 8GB cards are meant for 1080p, so you shouldn't expect to max every game at 1440p with 8GB, but you can see for yourself that even at 1440p you mostly won't be disappointed.

  • @MA-jz4yc
    @MA-jz4yc 6 months ago +7

    There's no reason graphics cards should come with less than 12-16GB of VRAM in 2024. GDDR6 is dirt cheap (less than 2 dollars per GB); the only reason manufacturers do this is to up-sell you more expensive products.

  • @FormerHumanX
    @FormerHumanX 6 months ago +5

    8GB VRAM definitely isn't enough for what Nvidia is charging.

  • @pothitoskourtis
    @pothitoskourtis 6 months ago +5

    Badly optimised games are a plague.

  • @D.Enniss
    @D.Enniss 6 months ago +11

    IMO 12GB VRAM is the sweet spot for up to 1440p, but at 4K no less than 16GB VRAM is the new standard

    • @roguetwice469
      @roguetwice469 6 months ago

      I'm definitely cutting it close, but my 3080 ti (12GB) is actually doing just fine at 4K in every game I've tested (LoTF, Tsushima, Starfield, Dragon's Dogma 2, etc) I certainly wouldnt recommend buying a GPU with any less than 16GB for 4K if you were building a PC today, but if you're on a very tight budget and need 4K, it's very doable on a 4070 Ti/3080 Ti

    • @adlibconstitution1609
      @adlibconstitution1609 6 months ago

      So you're saying 8GB of VRAM is not enough at 1080p?
      Cards like the RTX 3070, RX 6650 XT/6600 XT, 3060 Ti, RX 6600, RX 5700 XT and RTX 2080 Super are all 8GB cards.

    • @jonas34ptfernandes38
      @jonas34ptfernandes38 2 months ago

      @@roguetwice469 What if I want 4K gaming but with DLSS?

  • @vac59
    @vac59 6 months ago +4

    The new Nvidia 5070 class of cards should have 16GB of memory. But leaks are saying 12 🙄

    • @renatoramos8834
      @renatoramos8834 1 month ago

      Because it's a 5060 under a fake name, at the price of a 5080.

  • @Hatecrewdethrol
    @Hatecrewdethrol 6 months ago +1

    I thought the title just said "RAM" and i did a double take that you guys uploaded it today

  • @ambientlightofdarknesss4245
    @ambientlightofdarknesss4245 23 days ago

    It's insane to me that in 2024 we are worrying about VRAM instead of the actual speed of cards to run games. I can't even remember a time when VRAM was the limiting factor of a card. VRAM usually meant extra performance and future proofing in the coming years.

  • @melody_lane5081
    @melody_lane5081 6 months ago +2

    I wouldn't buy new card with 8, but would buy used with 8 if a good deal. I have 8 now, don't have any problems at 1080 on ultra in everything. I have 5700 xt. Cheers!

  • @kazumakiryu7559
    @kazumakiryu7559 6 months ago +11

    I only really saw vram related issues on my 3070 Mobile in titles ported by Nixxes (Horizon Forbidden West, Ratchet and Clank and The Last of Us). I feel the issue is related with the way they are performing memory management and people are bashing the hardware for no reason. Games like Cyberpunk 2077, RDR2, Forza Horizon 5 and Metro Exodus are proof that you do not need the latest and greatest hardware for graphics to look good.

    • @YikesHehe
      @YikesHehe 6 months ago +5

      Small correction for ya, TLOU1 was NOT ported by Nixxes sadly 😭
      Iron Galaxy Studios ported it and that's one of the many reasons (at least imo) why tlou1 was such a bad port. Hopefully Nixxes will handle the TLOU2 port.

    • @JBrinx18
      @JBrinx18 6 months ago

      But you DO need time... Optimization takes time. RDR2 is a 7 year development game. So is Cyberpunk, though it arguably needed more. If we want games to have more variety, more assets, and more speed on release we have to grow VRAM

    • @steveco1800
      @steveco1800 2 months ago +1

      On laptop 3070, you may have just PCIe 3.0 x8. Hardware Unboxed have a recent video testing 8 and 16gb VRAM at different pcie speeds. When 8gb isn’t quite enough having the faster PCIe makes quite a difference. Ratchet and Clank gets really bogged down at times on my laptop.

  • @plume...
    @plume... 6 months ago

    I've never had a problem on my 3070TI laptop with 8gb vram (32gb ram) @1440p with high settings on every AAA game I have played (aim for, and do achieve 60fps). I set most things to high (not ultra), no RT (big caveat there, I know), apart from shadows to middle, no bloom, no DoF, middle volumetric, no chromatic aberration, no motion blur (all the usual rubbish that you immediately turn off if you can). BUUUUUUTT, I do believe I will start to get vram warnings very soon with newer games. I have recently ordered a 4090 laptop with 16gb vram and 64gb ram (i9 13980), so hopefully should be safe for a few years now. This 3070TI 8gb has been amazing though.

  • @adamkallin5160
    @adamkallin5160 6 months ago +1

    I agree with Alex. Devs will use the PS5 hardware as a baseline and may or may not spend time to optimize for 8 GB cards on PC. Even Switch 2 is expected to have 12 GB (total RAM).

  • @colaboytje
    @colaboytje 6 months ago +2

    No, not if we expect high-resolution games with good anti-aliasing.

  • @Tk-ed8ry
    @Tk-ed8ry 1 month ago +1

    I was using an 8GB RTX 2080 Super with MSFS 2020, and that game eats up my VRAM. I got a 7800 XT with 16GB of VRAM and it's a whole lot different now. So much better.

  • @smashishere
    @smashishere 3 months ago +1

    Avatar swaps out textures for lower quality when the card is starved for VRAM, and it does this so well that it's hard to notice during gameplay. Game developers often just don't optimize their games properly and expect the consumer to make up for it with more VRAM. That doesn't mean Nvidia shouldn't provide more VRAM, but the blame isn't just on Nvidia.
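    That graceful-degradation behaviour boils down to picking a lower mip level when the full-resolution texture set won't fit. A minimal sketch, with made-up sizes and uncompressed RGBA8 assumed for simplicity:

```python
def mip_bytes(width, height, mip, bytes_per_pixel=4):
    # each mip level halves both dimensions; mip 0 is the full-resolution texture
    return max(width >> mip, 1) * max(height >> mip, 1) * bytes_per_pixel

def pick_mip_levels(textures, budget_bytes):
    """textures: list of (width, height); returns one mip level per texture."""
    mips = [0] * len(textures)
    total = lambda: sum(mip_bytes(w, h, m) for (w, h), m in zip(textures, mips))
    while total() > budget_bytes and max(mips) < 12:
        mips = [m + 1 for m in mips]   # uniformly drop one mip until the set fits
    return mips

# 200 4096x4096 textures need ~12.5GiB at full res; a 6GiB budget lands one mip lower.
print(set(pick_mip_levels([(4096, 4096)] * 200, budget_bytes=6 * 1024**3)))  # {1}
```

    A real streamer would prioritise per texture (distance, screen coverage) rather than dropping everything uniformly, but the trade-off is the same: slightly blurrier textures instead of stalls.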

  • @Punisher6791
    @Punisher6791 6 months ago

    As someone who went from a 3060 Ti with 8GB of VRAM to a 4070 Ti Super with 16GB of VRAM, I can say it makes a world of difference.

  • @stonejito
    @stonejito 6 months ago

    Tbh, it's enough if you know what you're expecting out of the card. A 3070 or 3070 Ti is a really decent card power-wise, but the 8GB of VRAM holds it back for 1440p gaming. If someone wants that card for 1080p at "high" rather than "very high" with some DLSS, it will still be a decent card. Now, I think the minimum should be at least 12GB, or maybe 10GB of VRAM. I know most PC gamers want the highest performance at the highest resolution, but really knowing your card's limits and what games and targets are realistic for it is key.

  • @christopherbohling5719
    @christopherbohling5719 6 months ago

    I don't mind PC ports getting more demanding VRAM-wise, but what I do want is that if I need to set textures to medium in a 2024 game to get it to not stutter on my 2070 Super, I would hope those medium textures look at least as good as high-ultra textures on games from a few years ago. The texture quality available for 8GB cards, whatever it's labeled in the menu, shouldn't be visually worse than it is for games from a few years ago.

  • @SireDragonChester
    @SireDragonChester 6 months ago +9

    No, 8GB of VRAM is not enough for current-gen games. Current consoles have 16GB; that's the standard and what devs are targeting as of 2024. Game demands will only go up imo over the next 2+ years. 8GB of VRAM may be fine for older or less demanding games/indie games, but it's not enough for future games pushing 4K and more RT. UE5 games are already showing this.

    • @Katsuchiyo
      @Katsuchiyo 6 months ago +3

      With 10 to 13 GB allocated to video memory, since 2020.

    • @SireDragonChester
      @SireDragonChester 6 months ago +2

      @@Katsuchiyo
      It's true. When the Horizon Zero Dawn PC port came out, at max settings it was already using 10.5GB of VRAM at just 1080p on my GTX 1080 Ti. Granted, it wasn't running well, and we all know it was a very bad port. Games are only getting more demanding for current-gen PC gamers. The old 8GB VRAM GPUs aren't going to cut it much longer, and even 10GB GPUs may start having issues. My next GPU is going to have at least 12GB of VRAM or more. Imo 16GB of VRAM will become the bare minimum in a few years. Nvidia may say VRAM doesn't matter, but that is bullshit, considering all the lies and BS they've been pulling. Imo the whole RTX 4xxx line is a scam, since everything is cut down yet sold at a premium price.

    • @dr.sivavignesh664
      @dr.sivavignesh664 6 months ago +2

      That's shared memory. Watch some more df videos. The 16 gb memory is for vram plus RAM.

    • @SireDragonChester
      @SireDragonChester 6 months ago +2

      @@dr.sivavignesh664
      I know it's shared GDDR6 memory. Devs have access to roughly 12.5GB for games; the rest is for the OS/background tasks. The XSX has a weird split memory config, and the XSS has only 10GB, leaving devs probably 7.5 to 8GB for games, which isn't enough.
      Been gaming since the Intel 8088/86 days, building PCs since the 90s, and I'm an ex beta tester/server admin/ex moderator from a small indie dev team.

  • @virtualpilgrim8645
    @virtualpilgrim8645 6 months ago +26

    "640K ought to be enough for everyone." ~Bill Gates

    • @VicharB
      @VicharB 6 months ago +1

      Lol ... them times!

  • @CheckmateStallioN
    @CheckmateStallioN 6 months ago +1

    All those needing to upgrade to 4090s better chop off a part of your full tower chassis to fit it in if your full tower chassis is from like 2Kteens. Thats what I did with mine. I have a full tower Thermaltake Chaser MK-I blue LED version from 2013 and don't want to get rid of it as it is still one of the best looking towers out there combining the oldskool with modern look. Most chassis now are too bland and have no personality to their looks they are literally just simple angled boxes. I took out the metal side panel with the little glass window and swapped it with a full size acrylic panel. But even at full tower size it couldnt fit a 4090 in it bc of the metal frame holding all the harddrive bays so I had to chop off the whole metal frame so the 4090 can fit

  • @MACROSS2KYTB
    @MACROSS2KYTB 6 months ago +1

    Can DirectStorage resolve the VRAM hunger issue?

  • @nintendoconvert4045
    @nintendoconvert4045 6 months ago +22

    Nope! 12GB is the minimum standard.

    • @Dhrazor
      @Dhrazor 6 months ago +3

      Unless your game uses UE5, long term even 12GB won't be enough. Frame Gen and RT both use extra VRAM, and you will run out even at 1440p native if you turn everything up to max. When the 1070 with 8GB released, games used 3-4GB of VRAM; now that games are using 8-12GB, probably even 16GB won't be enough 4 years from now.

    • @mianlo2624
      @mianlo2624 6 months ago +2

      Nvidia: Best we can do is 8GB of GDDR6X & 4GB of GDDR6.

    • @JulliusTheGreat
      @JulliusTheGreat 6 months ago

      @@Dhrazor This is crazy; PC gaming is on the verge of being impossible for middle-class gamers.

    • @tonin7228
      @tonin7228 6 months ago +2

      @@Dhrazor Nvidia releasing the 4060 and the 4060 Ti even though they can't utilize FG (the main selling point of the 4000 series) effectively was super nasty.

    • @Aleksey-vd9oc
      @Aleksey-vd9oc 6 months ago

      @@Dhrazor Frame generation uses memory, but it needs a decent base frame rate, which you will most likely want to raise via upscaling anyway. In that case the upscaling greatly reduces video memory consumption, enough to run the generator.

  • @PSx1991
    @PSx1991 6 months ago

    Uncharted 1, 2, 3, The Last of Us, Beyond: Two Souls and GTA V ran with 256MB of VRAM and 256MB of RAM.

  • @EastyyBlogspot
    @EastyyBlogspot 6 months ago +1

    Thing is, say you had a new card with 8GB or an older card with 12 or 16... would you pick the higher VRAM, or the newer card with less VRAM that I'd assume has better performance? Though why they could not even make them 10GB still puzzles me.

  • @OwtDaftUK
    @OwtDaftUK 6 months ago +3

    If you're buying a PC today, halfway into the PS5's life cycle, I think it's unwise to assume that it has enough video RAM to play any PS5 game ported to PC.

    • @CheckmateStallioN
      @CheckmateStallioN 6 months ago +2

      PS5 is x86 architecture so porting its games to PC is not much of an issue. We're not dealing with exotic architecture like the PS3's Cell BE

    • @vitordelima
      @vitordelima 6 months ago

      @@CheckmateStallioN Somehow everything becomes a mess along the way even if all platforms are almost the same.

    • @KrazzeeKane
      @KrazzeeKane 6 months ago +1

      @@CheckmateStallioN People who don't know anything about programming always say this, as if because they are both x86 it's just some easy auto-convert button you push to port PS5 games to PC. There is SO much more involved, even though both are x86. It is absolutely still an issue, and it requires careful planning and development to make a proper port from PS5 to PC. Things like unified memory complicate direct conversions of RAM requirements on PC and such; it's not a simple equation.

    • @CheckmateStallioN
      @CheckmateStallioN 6 months ago

      @@KrazzeeKane Of course there's fine-tuning to be done when these consoles have pooled memory and everything is on an APU, but compared to an architecture like the PS3's, where you had to account for 8 different SPUs, it is much easier and one less major problem to deal with when you have a similar architecture to work with. This is why most third-party devs had major issues developing PS3 games: they had barely any training and experience with that architecture, unlike the first-party devs. The architecture itself posed a far bigger problem than achieving "parity" between unified memory in consoles and the dedicated memory split (VRAM and system RAM) that we see in PCs.

  • @mikelreborn3254
    @mikelreborn3254 6 months ago

    "I am currently utilizing a 2023 Lenovo Legion Pro 5i, equipped with an RTX 4070 and 8GB of VRAM. The performance at 1080p resolution is satisfactory, and the visual quality meets my requirements. I acquired this laptop at a competitive price, and it primarily serves as a platform for my digital audio workstation (DAW). Although it's intriguing that VRAM can be allocated from the SSD to the CPU, it's not possible to do the same for the GPU, nor is there an option to expand the GPU's VRAM using a microSD card. Nonetheless, as long as the graphics are pleasing and the frame rate is stable, I am content with the setup."

  • @dr.sivavignesh664
    @dr.sivavignesh664 6 months ago +36

    VRAM requirements have lately been overstated, especially due to lazy, piss-poor ports. Cyberpunk 2077 and RDR2 maxed out still use far less VRAM while looking better than anything released lately.

    • @muratssenol
      @muratssenol 6 months ago +11

      Cyberpunk 2077, as of version 2.12, at Hardware Unboxed's optimized settings at 1440p uses around 7.7GB of VRAM with DLSS Quality mode; there is not enough headroom for ray tracing or to install high-quality texture mods. I have an RTX 3070 Ti and I can't use ray tracing simply because the GPU does not have enough VRAM for it at 1440p. This card, for example, should have been a 12GB card imo.

    • @dr.sivavignesh664
      @dr.sivavignesh664 6 months ago +1

      @@muratssenol You have a valid point, but the point about high-quality textures is invalid. If we had something like 24GB of VRAM we could talk about high-quality texture mods, because we'd have the headroom and nothing else would use it. The point is that Cyberpunk had excellent textures to begin with; I don't know if you would notice much difference from the highest-quality textures on a 1440p display unless you are zooming in on every little detail. 2.12 is probably the final patch (they said they are done optimising the game), so it's reasonable they gave us scalability. PC has always been about scalability; that's what the wide range of cards is for.
      What I'm trying to say is that here you are talking about the highest possible setting being unusable with ray tracing, while the comparison games, obviously shitty ports, can't even load tolerable textures (even medium to high, not ultra) yet use 12GB and above of VRAM without any ray tracing. Just like Alex said, we can't trust developers to make it a great experience for 8GB VRAM users. It's not that we can't have great textures at 1440p in under 8GB; it's that the devs don't seem to think it's their problem to fix, and they expect the consumer to fix it by buying cards with more VRAM. Obviously, if it progresses like that, even 24GB won't be sufficient. Also, nobody could have predicted this when the 3070 Ti launched or when you bought it. But Nvidia rolling back to 8GB on the 4060 is an absolute blunder.

    • @vitordelima
      @vitordelima 6 months ago

      And they still use the usual terrible data structures for storing geometry, textures, light maps, G-buffers... which lack proper compression. They also lack any smarter method of managing memory use, and statically transfer everything that seems to be needed for rendering (according to simpler algorithms such as visibility tests) even when it isn't.

    • @mllecamill3
      @mllecamill3 6 months ago +2

      RDR does not look that good anymore and this whole "unoptimized game!!!11" is just a very simplistic view.

    • @robo_bravado
      @robo_bravado 6 months ago +4

      Cyberpunk will spike north of 10GB of committed vram at 1440p with ray tracing.

  • @coleshores
    @coleshores 6 months ago

    8GB of VRAM will be more than enough for AA and indie gaming for the foreseeable future, so there isn't a shortage of content that will operate under those limits. It's only the massive AAA games like GTA 6 where 8GB of VRAM is going to become an issue.

  • @rowsdower84
    @rowsdower84 6 months ago +42

    I blame both lazy game devs and Nvidia being stingy with VRAM.

    • @kphuts815
      @kphuts815 6 months ago +1

      This might be a bad take, but I don't think devs are that lazy, considering they provide lower texture settings to allow users with less VRAM to play smoothly. Of course there are some outliers, but they aren't the majority of VRAM-demanding games from what I've seen.

    • @CheckmateStallioN
      @CheckmateStallioN 6 months ago

      As if AMD cards are any better. In fact AMD still has a long way to go to catch up to Nvidia; look how bad AMD's video drivers are. They are almost caught up in CPUs, but their GPUs have a long way to go. AMD offering slightly more VRAM while at the same time downclocking their GPUs and having subpar driver support compared to Nvidia isn't helping the situation either. Pick your poison.

    • @mitchjames9350
      @mitchjames9350 6 months ago +1

      @@CheckmateStallioN There isn't much difference in performance between AMD and Nvidia; the only advantages Nvidia has are DLSS, which is fake frames, and ray tracing, which tanks frame rates and which no one uses. At least AMD future-proofed their cards with memory. Also, AMD didn't offer slightly more, they offered a lot more memory than Nvidia.

    • @ChiekoGamers
      @ChiekoGamers 6 months ago

      Blame the management and the publishers, not the workers 🤦‍♂️

  • @adriannavarro9962
    @adriannavarro9962 6 months ago

    I mean, I heard all of that back in '21 when I miraculously stumbled upon a 3070 Ti in the middle of the crypto crisis, and three years later it still beats the 12GB RX 6700 XT/6750 XT at 1440p in most games. By the time the RX 6700 XT overtakes it I'll have already upgraded. Does that mean I'd buy an 8GB card in '24? Heck no. But I think we're at the same point with 12GB cards right now, and I reckon they will be enough for many gamers who upgrade every 3-4 years, just like the 8GB 3070 Ti has been enough for me up to this point. Of course, if you plan on holding on to the card longer, then get as much VRAM as possible, but in my experience most regular PC gamers don't do that.

    • @Loundsify
      @Loundsify 6 months ago

      Tbf the RX 6800 was its main competitor, and there are articles now that show the RX 6800 comfortably beating the 3070; I can't remember how much faster the Ti is over the standard 3070.

    • @adriannavarro9962
      @adriannavarro9962 6 months ago

      That's not the point though. The RX 6800 was already faster at rasterization than both 3070s at launch. Back then MANY people were arguing the RX 6700 XT would also quickly overtake them because of its 12GB of VRAM, and it still hasn't happened. Has it performed better in an anecdotal number of games? Yes. Has it quickly overtaken both 3070s and beaten them across the board? No. Not at all.
      I was merely translating that situation into the current one, where 12GB cards are what 8GB cards were back then. I'd bet things are going to turn out similarly, with, say, the 4070 Super beating the RX 7800 XT throughout their prime, only to be overtaken once most gamers purchasing at that tier of GPU have already moved on to new ones.

  • @Louis_Bautista
    @Louis_Bautista 6 months ago

    I'm really not worried about the 12GB on my 4070 Super. I play at 4K60 and even though people say you need higher VRAM especially for 4K, what they forget is that once you're at 4K, DLSS works so well that at quality mode you're getting a better than native image while using less VRAM because you're only rendering at 1440p internally.
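    For what it's worth, a rough sketch of that effect. The scale factors below match DLSS's published modes; the 48 bytes per pixel of render-target cost is a ballpark assumption, and texture memory is not affected by any of this:

```python
MODES = {"Native": 1.0, "Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 0.5}

def per_pixel_buffer_mb(out_w, out_h, scale, bytes_per_pixel=48):
    w, h = int(out_w * scale), int(out_h * scale)   # internal render resolution
    return (w, h), w * h * bytes_per_pixel / 1024**2

for mode, scale in MODES.items():
    (w, h), mb = per_pixel_buffer_mb(3840, 2160, scale)
    print(f"{mode:12s} renders at {w}x{h}, ~{mb:.0f} MB of per-pixel buffers")
# Quality mode at a 4K output renders internally at 2560x1440, cutting the
# resolution-dependent buffers to well under half of the native-4K cost.
```

    So upscaling does claw back some VRAM, but only the part that scales with render resolution; the texture pool, BVH for RT, and frame-gen buffers stay the same size.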

    • @Wolfos530
      @Wolfos530 6 months ago +3

      12 should really be fine for any current-gen PC port. I get how 8 can get a little tight for lower budget ports - consoles do have 16 gigs shared and that's a bit more flexible - but 12 is significantly more.

    • @Sp3cialk304
      @Sp3cialk304 6 months ago +1

      Same thing applies at 1440p. DLSS quality looks better than native in every game released in the last year+. Last 2 years if you don't mind taking 5 seconds to update the DLSS version being used.

  • @brucevandermescht6957
    @brucevandermescht6957 5 months ago

    Couldn't care less, I only play games pre 2016. Everything after that is all style and fuckall substance.

  • @onomatopoeia162003
    @onomatopoeia162003 6 months ago +2

    Should be 12-16-20-24

  • @HiddenAdept
    @HiddenAdept 6 months ago

    At least with my 7900 XT (20GB) I probably won't have to worry for a long time.

  • @TurboPikachu
    @TurboPikachu 6 months ago +2

    Meanwhile, I'm still running my 4GB Radeon RX 480, and I have yet to own a 4K TV because the Nintendo Switch looks disgustingly pixelated or soupy when upscaled on the panels I've tried it with. Having yet to own a 2K/4K TV has effectively milked a lot more viability out of the 2016 midrange "4K ready" GPU.
    To this day, I haven't run into any real issue in the latest games (except for the i5-6400 causing judder in VR experiences), still achieving 60fps in practically every new game I've thrown at it around 800p-900p low settings, and a perfectly locked 30fps at 1080p medium in those same games. And it's especially been a delight for newer games like Stray and Hi-Fi Rush to be optimized enough to maintain a rock-solid 60fps at 1080p/high on the 7-year-old card.
    The only game I’ve seen so far that can’t get at least 720p/30fps at minimum is Alan Wake 2. Even after the requirement reduction for the game, it’s still a travesty of unoptimization even on newer cards.
    Once the Switch’s successor comes out, that’d probably be the day I get a 2K/4K TV and build a new PC with a Ryzen 5 and a 16GB Radeon RX 7600 XT

    • @CheckmateStallioN
      @CheckmateStallioN 6 months ago

      The mClassic works wonders upscaling Switch games to 1440p, and I heard that if you daisy-chain two mClassics you get even better results. A must-have device for 1440p and 4K monitors.

  • @VicharB
    @VicharB 6 months ago +5

    12GB should be the minimum with the next gen (budget range), RTX 5xxx and Radeon 8xxx. I am looking at the RTX 5080, preferably with 24GB, to be future-proof. Today, even in some AAA titles the whole 12GB of my 3080 Ti is utilised. All of this, of course, is assuming you are planning to play at max/highest/ultra graphics settings at 1440p to 4K.

    • @frowningboat8039
      @frowningboat8039 6 months ago +1

      I'd like to see RDNA4 have 16GB for most of the lineup with only the bottom card being 8 or 12GB. Ideally more on the top cards but it's not possible with a 256 bit GDDR6 unfortunately.

  • @ianism3
    @ianism3 6 months ago +3

    it also depends what resolution and quality settings you're comfortable with. I'm still happy at 1080p and medium-to-high settings, so 8GB is probably enough for 4 years for me, especially since I do 90% of my gaming on console anyways.

    • @puffyips
      @puffyips 6 months ago

      Exactly, you can't run high-to-ultra textures even at 1080p.

  • @maplegamer4736
    @maplegamer4736 6 months ago

    To correct misinformation, they should have addressed the claim that DLSS alleviates VRAM requirements. This is tied to the myth that resolution is the main culprit for VRAM usage. It's actually higher-res textures, which DLSS cannot address.
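    A quick sketch of why that is: a texture's footprint is fixed by its own resolution and mip chain, not by the output resolution, so upscalers leave it untouched (uncompressed RGBA8 assumed here for simplicity; block compression shrinks everything by the same factor):

```python
def texture_with_mips_mb(size_px, bytes_per_pixel=4):
    total, s = 0, size_px
    while s >= 1:                       # full mip chain: size, size/2, ..., 1
        total += s * s * bytes_per_pixel
        s //= 2
    return total / 1024**2

for size in (1024, 2048, 4096):
    print(f"{size}x{size} texture + mips: ~{texture_with_mips_mb(size):.0f} MB, "
          "whether you render at 1080p or 4K")
```

    Only the resolution-dependent buffers (G-buffer, depth, post-processing targets) shrink with DLSS; the texture pool, which usually dominates, does not.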

  • @TheUAProdigy
    @TheUAProdigy 6 months ago

    I hate hearing developer slander, especially here, where people should understand that developers don't control anything; the execs do.

  • @bogstandardash3751
    @bogstandardash3751 6 months ago

    Will games demanding enough to need more than 8/12GB of VRAM also need DLSS and frame gen to run 4K high settings on the same card?
    Pretty as it'll look, the ~1200p balanced mode will effectively be rendering at a lower resolution anyway, which solves the VRAM issue at the same time?
    Happy to be wrong as I'm not sure, but obviously were you buying a card today you'd go AMD, a 6750 XT or above.

  • @ImakeTanks
    @ImakeTanks 2 months ago

    Tried 1440p and 4K gaming on 27-32 inch monitors, then went back to 1080p 24 inch with an 8GB VRAM, 140-watt, $249 video card for hardcore competitive gaming. The entire consumer computer industry is a scam.

  • @GilbertGuilford
    @GilbertGuilford 6 months ago

    For people more informed than me: what config (CPU, GPU and amount of RAM) would I need to run modern games at 4K, highest possible settings, ray tracing where available, and a solid 60fps?
    Games I had in mind are stuff like Deus Ex Mankind Divided, GTA5, Horizon Zero Dawn, Metro Exodus, new Tomb Raider games, Control and Witcher 3.
    Is it really 4090 or GTFO, or can something else actually run those games properly?

    • @rustyshackleford4117
      @rustyshackleford4117 5 months ago

      For CPU, 60FPS is a very easy target -- an AMD 5800x is more than sufficient, but it's hard to recommend when the 7800x3D is only $350. You could technically get by with a Ryzen 3000-series and 60FPS most titles, but some newer games are getting CPU bound on the 3000 series now. In either case, a cheaper 7000-series would make much more sense now.
      For the GPU, you don't need a 4090 for 60FPS 4K maxed -- the 4090 is for if you want between 120 and 160FPS in all games maxed at 4K. A 4070 could probably get you by for 60FPS in all of the games you mentioned, but maybe minus 2023/24 titles like Alan Wake or Horizon Forbidden West where you may be close to 60 but not locked in demanding areas. So I'd lean towards a 4080 as it simply has a lot more grunt and will let you continue to hit 60FPS at 4k for the games of the next year or 2 at minimum. With the 4080 you'll be able to hit around 100FPS in a lot of games as well, or near 120 in any of the older titles you've mentioned except maybe Witcher 3, as even that game needs frame gen and the 4090 to hit close to 120FPS with the RT update enabled at 4k.

  • @scepticskeptic1663
    @scepticskeptic1663 6 months ago

    My laptop has a 3060 6GB and it seems fine. A few games require backing textures off from ultra/very high to simply high, and it's fine.

    • @filippetrovic845
      @filippetrovic845 5 months ago

      No!! AMD fanboys think you are not fine! You need to max out textures (which you can't even see at 1080p) or you can't possibly play the game. How dare you accept anything less than max settings. You are beta.

    • @mrman6035
      @mrman6035 5 months ago

      Tbh I think the only thing that makes or breaks a game's picture quality for me is good shadows. I already game on a 1650 Super, and shadows are the one setting I like on high.

    • @Shatterfury1871
      @Shatterfury1871 5 months ago

      Try Hogwarts Legacy and then comment on 6GB of VRAM being enough.

    • @scepticskeptic1663
      @scepticskeptic1663 5 months ago

      @@Shatterfury1871 Not interested in Harry Potter, and it doesn't look any better than many of the other games I play.

    • @scepticskeptic1663
      @scepticskeptic1663 5 months ago

      @@Shatterfury1871 Also, naming one game doesn't define reality. If 99.99999999999 percent of games are fine, then so what? If you want to be super picky, that's fine by you.

  • @jrmjrm2764
    @jrmjrm2764 6 months ago

    With the boost in L2 cache there is no need for 16GB of VRAM, as the 4070 vs 7800 XT comparison demonstrates, even at high resolution.

    • @puffyips
      @puffyips 6 months ago

      Boots up Rust at 1440p: 15GB of VRAM being used.

  • @masterquake7
    @masterquake7 6 months ago

    8 to 10GB is perfectly fine for the average gamer. The 3080 is 10GB and still kicks ass.

  • @Brent_P
    @Brent_P 6 months ago

    The bare minimum for discrete GPUs should be *16GB* of VRAM.

  • @nabiltaylor2012
    @nabiltaylor2012 6 months ago

    Upgrading from a 1660 Super to a 3060 this week. Should I do the 12GB 3060 or the 8GB 3060 Ti?

  • @GorillaKong
    @GorillaKong 6 months ago

    12 to 8 is a bit mad tho like why not just drop to 10 ahhhhhh

  • @virtualpilgrim8645
    @virtualpilgrim8645 6 months ago +1

    Alex looks very German/Scandinavian.

    • @adamkallin5160
      @adamkallin5160 6 months ago

      He looks a lot like my Polish cousin.

  • @Radek494
    @Radek494 6 months ago

    You can't even max settings in Ghost of Tsushima at 1440p with 8 GB VRAM, it will swap to RAM

  • @alanreynolds4262
    @alanreynolds4262 6 months ago

    I had a 4070 that was hitting the 12GB limit in Hogwarts Legacy, which would cause the game to crash from time to time. I ended up returning the card and went with an RX 7900 GRE with 16GB of VRAM. The game never crashes now. I hate how stingy Nvidia is with VRAM.

  • @XDGamerESP
    @XDGamerESP 24 days ago

    I would like to say that if you play at 1080p with mid-high settings (no RT) it still looks great. The thing that really annoys me is: why the hell do they sell the power to use RT if I don't have enough memory to actually activate it?

  • @JoeWayne84
    @JoeWayne84 5 months ago

    Show the games released this year and their VRAM use.

  • @Orlyy
    @Orlyy 5 months ago

    I'm glad I upgraded to my 4080 from a 3070. I've never once gone AMD, but if you're on a budget, and are looking at an 8gb Nvidia card just go get an AMD card.

  • @sonusmeister2325
    @sonusmeister2325 6 months ago

    Honestly I just play games on medium or even low textures; it's not like I'm streaming or anything...

  • @brtcobra
    @brtcobra 6 months ago

    Red Dead 2 uses only 6-7GB at 4K max settings. This limit is down to devs not optimising PC games.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 6 months ago

      Yeah, a lot of these games that use stupid amounts of VRAM look worse than RDR2.

    • @Hayden2447
      @Hayden2447 6 months ago +2

      rdr2 is from 2018 tho, 2019 for PC. It has no RT and the textures are lower res than some newer games. It is an amazing game for its time but of course anything new should run a 5 to 6 year old game made for last generation consoles. Optimization still matters of course but vram requirements will be higher than they were 5 years ago, that is a fact.

    • @mrbobgamingmemes9558
      @mrbobgamingmemes9558 6 months ago +1

      @@Hayden2447 True, but I would rather textures look like last-gen games if a small texture improvement uses 2-3 times more VRAM. In the world of tech, once you've hit the sweet spot, going beyond it results in an insane compromise for a tiny improvement.

  • @manostororosso2364
    @manostororosso2364 6 months ago +1

    Man, what are you talking about? All games since the PS3 are the same, just with better graphics.

  • @OG-Jakey
    @OG-Jakey 6 months ago

    I'd argue it's not just PC that devs don't give a shit about; it's consoles as well. Games are generally made poorly across all platforms at this point, and the only explanation I can come up with is a lack of skill in the industry. A veteran dev at a big studio has brought this up too; he didn't explicitly say it, and may be too kind-hearted to come to that conclusion, but he has mentioned frustration at 45 minutes' worth of code taking devs 4 weeks, which does point to a lack of skill plaguing the industry imo.

  • @nologig
    @nologig 5 months ago

    256MB, 512MB, 1GB, 3GB, 6GB, 12GB... next time I need to upgrade my graphics card, for newer technology beyond current RTX features, it will have to have 24GB too.
    Until that's available at a reasonable price point, my current RTX 3060 12GB will be good enough for me, as the GTX 1060 6GB was before it and the GTX 660 Ti before that.
    Only shills are defending 8GB in 2024 for a GPU upgrade at an over-200-dollar price point, when VRAM usage has always gone up along with the other hardware demands of the latest games.

  • @playneverends1228
    @playneverends1228 6 months ago

    "hello founders"
    That's a neat name for old people who criticize games and gaming hardware for a living.

  • @fredsorre6605
    @fredsorre6605 6 months ago

    I had to play Horizon Forbidden West at medium textures because my RTX 3080 runs out of VRAM. Everything else was kept at high or ultra, and I even had DLSS Quality on, yet my 10GB was almost always full, causing stutters I didn't wanna deal with.

    • @rustyshackleford4117
      @rustyshackleford4117 5 months ago

      Lack of texture optimization and compression is a major reason these newer games eat both VRAM and disk space for breakfast. We should start to see reductions within the next year or two, as Nvidia and AMD are looking to offload texture decompression from the CPU and stream data directly from the SSD to the GPU, such as with DirectStorage-type tech. The next phase would be to have AI processing speed this up further and allow much higher compression rates in real time, which for now isn't possible without introducing stutter/hitching while loading in new textures.

  • @kaseyboles30
    @kaseyboles30 6 months ago

    8GB is good enough for near-term 1080p (one or two years), 10 should add a year to that, and 12 should be enough if you upgrade GPUs more than twice a decade. At 1440p 8GB will do, but not with some settings (texture res maxed), whereas 10 buys you a year or two and 12 two to three years. For 4K, less than 12 is likely a mistake, and 16 will get tight in about 3-5 years. That's not taking into account the raw power of the GPU in question relative to the game.

  • @Funnylittleman
    @Funnylittleman 6 months ago +1

    I really wish I could get more than 8gb VRAM but I can’t afford a 4080 😢
    Maybe a 4070 Super but that’s not much better honestly.

    • @KrazzeeKane
      @KrazzeeKane 6 months ago

      4070 ti super is the card to go with if you need 16gb vram, it's what I would have bought if it had existed last year but I had to save up extra to go for a rtx 4080.
      The 4070 ti super isn't cheap that's for sure, but it will absolutely treat you right and last you the next 4 to 5 years easy at high settings and high fps. I wouldn't go with a 12gb card like the 4070 super unless you plan to upgrade in 2 to 3 years, or you wish to have to turn down texture settings.
      The 4070 ti super will have no such issues, and the 16gb vram will handle ANY possible titles you throw at it for the next 5 years at least.

    • @CaptainScorpio24
      @CaptainScorpio24 6 months ago

      @@KrazzeeKane Exactly. I upgraded from a 3-year-old 3070 Founders Edition to a Gigabyte RTX 4070 Ti Super Eagle OC 16GB at launch in Jan 2024 😊

    • @puffyips
      @puffyips 6 months ago +1

      I got a 16gb 6800xt back in 2023 for $399, you’re asking the wrong company to provide

    • @mitchjames9350
      @mitchjames9350 6 months ago

      Why not a 7800 XT or 7900 XT? Both are cheaper and better than their Nvidia counterparts.

  • @Shatterfury1871
    @Shatterfury1871 3 months ago

    No, it is certainly not.
    Yes, we could make it work, but that entails lowering the graphics.
    Ghost of Tsushima on very high can gobble up as much as 8.5GB of VRAM, for example, and that is at 1080p resolution.

  • @m-copyright
    @m-copyright 6 months ago +16

    Having 8GB VRAM in 2024 (especially for the price the cards go for) is like having a dual core processor in 2024.
    Can it be used? Yes.
    Will it be a good experience? No.

    • @Threewlz
      @Threewlz 6 months ago +1

      Idk man, I've had a great experience with my 3070 so far; the secret is to not max out settings and stay away from shitty ports.

    • @KrazzeeKane
      @KrazzeeKane 6 months ago +4

      @@Threewlz Staying away from AAA games (which are generally crappy ports) and having to run on low settings isn't what most people consider a "good experience". It's a "making do" experience imo.

    • @puffyips
      @puffyips 6 months ago +3

      @@Threewlz Exactly, you can't max out textures due to the limited VRAM on your card.

    • @Threewlz
      @Threewlz 6 months ago

      @@KrazzeeKane I'm not running anything on low settings though. Any game released in the last couple of years is manageable at high textures/settings. Horizon FW is one of the best looking games of this gen, guess what? High textures. Alan Wake has got the best geometry of this gen, guess what? High textures. This VRAM stigma is way overblown.
      An RTX 3070 delivers a better experience than the average consumer PC or console gamer is going to get, I can assure you that.
      Edit: even that mess of a port that was TLOU Part 1 is playable at high settings/textures.

    • @Threewlz
      @Threewlz 6 months ago

      @@puffyips When there isn't a visible difference between high and max textures, it doesn't matter. Most of the time it's just a VRAM streaming pool with no visual discrepancies whatsoever.

  • @djtomoy
    @djtomoy 6 months ago +3

    You need way more than 8gb, I shit 8gb 😅

  • @chris42069
    @chris42069 6 months ago

    8GB is okay for me on a 3070 because of DLSS, and only because of DLSS. It's a hardware dongle for the best tech in gaming.

  • @xxbiohatchetxx
    @xxbiohatchetxx 6 months ago

    I think it depends on the game and what kind of quality you're really looking for. If you're okay with a little bit less quality because of less VRAM, then yeah, it's fine. It really depends on the person; it's basically a person-to-person thing, because one person may be okay with a little less quality, and not everyone cares about top-end quality. Look at Minecraft, look at Grounded, look at even Killing Floor 2 or Skyrim/Fallout 4; the list goes on. There are a lot of games that aren't exactly perfect or really high quality but are still very popular. Of all of those I think the highest-quality one is Grounded, though most people won't understand or notice it, but they aren't looking in the right spots either. But yeah!

  • @danny200mph5
    @danny200mph5 6 months ago

    Good thing I bought a 4080 with 16gb.

  • @virtualpilgrim8645
    @virtualpilgrim8645 6 months ago +2

    I have the top-of-the-line factory-overclocked ASUS 2080 Ti with 11GB of VRAM and can play any game at 1440p at max settings with frame generation. 5800X CPU, 32GB of RAM, Windows 11.

  • @asmodeusml
    @asmodeusml 6 months ago +5

    The cope of the bro telling that "NVIDIA showed that 8GB is enough" is unreal.

  • @Payload25
    @Payload25 6 months ago

    So I can throw my 2060 in the trash for future games. Just because the devs no longer optimize their game.👍

  • @spawnkiller9779
    @spawnkiller9779 6 months ago +2

    With DLSS and FSR, I feel upscaling plays a part in keeping 8GB cards like the 2070 and 2080 scalable, more so than for higher-VRAM cards.

    • @rustyshackleford4117
      @rustyshackleford4117 5 months ago

      This is a good point, as the lower internal resolution will keep 8Gb of VRAM relevant for much longer than it would be otherwise. That said, if we're down to the point of needing DLSS to upscale to 1080p final resolution we're getting to a bad state, as even DLSS quality with this output resolution has a very noticeable degradation in image quality over native 1080p. FSR2 is far worse, as it's basically useless unless upscaling to 4k from 1440p, and even then it still completely jacks up hair effects and fast motion.
      XESS is a good middle ground though, as it doesn't have the same hair/motion issues nearly as apparent as FSR2 does. At this rate, unless AMD pulls something amazing with the next version of FSR, XESS is set to be the universal replacement for DLSS that doesn't require Nvidia's proprietary AI cores as it's progressing much more quickly than FSR and is a significant upgrade over it at this point.

  • @acardenasjr1340
    @acardenasjr1340 6 months ago

    For now, yes.

  • @soulreaper2557
    @soulreaper2557 5 months ago

    I have a 3080 with 10gb of vram and it's doing ok for now. I'm gonna sell it soon cause that's not enough for future games. Nvidia is cheaping out on us. I will not be buying another Nvidia GPU.

  • @dvxAznxvb
    @dvxAznxvb 6 months ago +2

    You aren't going to care once you're playing on what you currently have; these questions matter more when you're buying, and about buying mentality.
    Most people can control VRAM consumption through their target resolution and quality settings, and through the GPU choice they committed to, but they can't decide how much an engine deems suitable for what it needs to display at the target frame rate.
    It's always a tug of war with the components and technology available, and VRAM is just another aspect of it; not everyone wants 4K resolution, because if they did they would go for more VRAM.
    Just because you can..... doesn't mean you should...... developers CAN optimize but they don't; Nvidia CAN put in more VRAM but they don't.

  • @outsideredge
    @outsideredge 6 months ago +3

    I have a 3060 Ti with 8GB of VRAM. I don't think I have a game that's made it run out of memory.

  • @juhakallen
    @juhakallen 6 months ago +1

    I have a 4060 laptop. I don't play last-gen games. The only game where I had problems with too little VRAM was Portal with RTX; with other early RTX games I don't have any problems. I don't care about future-proofing because in the future I'll buy a new one. 8GB is enough for games that cost less than 10 euros on sale.

  • @Zel912
    @Zel912 6 months ago

    For 4K it should be 16GB. Needing more than that is bad optimization.

  • @myatko9246
    @myatko9246 6 months ago

    I'm still using my GTX 1660 Ti

  • @killermoon635
    @killermoon635 6 months ago +2

    I have not played any game that has issues with 8GB.
    The Last of Us Part 1 runs fine on 8GB; I just have to drop some texture settings to high instead of very high and it runs perfectly fine. I don't see any visual difference between high and very high. Forspoken is no issue with the latest patches either...
    Horizon Forbidden West, Ratchet etc. all ran fine.

  • @cezarstefanseghjucan
    @cezarstefanseghjucan 6 months ago

    Even 12 GB of VRAM isn't enough for 4K.

  • @mercuriete
    @mercuriete 6 months ago

    Meanwhile the Steam Deck has 16GiB of VRAM (more or less).

    • @MrCharlieCupcakes
      @MrCharlieCupcakes 6 months ago +1

      I thought it was shared RAM, so you wouldn't be getting anywhere near that, more like around 8GB.

    • @mercuriete
      @mercuriete 6 months ago

      @@MrCharlieCupcakes It's dynamic. You have all the memory available for both.
      If the game uses less RAM, more is available for the GPU.
      So technically of course it's not 16GiB, but it is more than 8.

    • @Aleksey-vd9oc
      @Aleksey-vd9oc 6 months ago +2

      @@mercuriete I think there is a direct relationship between the size of the video buffer and the consumption of the main memory. You will not be able to fill 8 gigs of the video buffer without pumping this volume through the main memory. And the main memory is also used for the OS (I'll assume 1-2 gigs)

    • @mercuriete
      @mercuriete 6 months ago

      @@Aleksey-vd9oc Main memory and VRAM are the same thing here.
      The Steam Deck pre-reserves 1GB contiguously, but the rest is the same memory as main memory; the Steam Deck doesn't have dedicated VRAM. That's why it will run new games (slowly, but they work) while low-VRAM discrete graphics cards won't even boot the game.
      I know discrete graphics cards are better and more powerful, but if you buy a mini PC with 32GB or even 64GB you will be able to run games for a lot of years.
      Integrated GPUs have their own attractiveness.

  • @sgttownley5800
    @sgttownley5800 6 months ago

    Running a 3080 Ti with 12GB of VRAM, and I can't play Gray Zone Warfare...

    • @mllecamill3
      @mllecamill3 6 months ago

      Yeah, 'cause that game has insane detail and LOD: Nanite for long-distance LOD and Lumen RT. It looks amazing.

  • @luiscantu4968
    @luiscantu4968 6 months ago

    Size matters