8 GB VRAM is a Problem. Is 10G any Better?

  • Published: 25 Jun 2024
  • 8 GB of VRAM is a problem with the games coming out; it isn't even uncommon for games to want about 12. The RTX 3080 released a couple of years ago and is still a beast of a card. Can it hold up?
    HUB: • VRAM Issues, Crashing ...
    MLID: • Death of 8GB GPUs, RX ...
    Danny Boy: • Not Enough VRAM!!!
    TechPowerUp: www.techpowerup.com/gpu-specs...
    My Spotify:
    open.spotify.com/artist/3Xulq...
    0:00 - The memory problem
    1:47 - 3080 in RE4
    3:04 - 3080 in Last of Us
    3:38 - 3080 in Fortnite
    5:13 - 3080 in Atomic Heart
    5:26 - THE IRONY
    6:51 - VRAM misconceptions
    7:39 - Who VRAM actually affects
    9:36 - Does AMD's approach work?
  • Games

Comments • 1.9K

  • @livedreamsg (1 year ago, +1708)

    If Intel can afford to put 16GB VRAM on a $350 card, I don't want to hear any excuses for Nvidia.

    • @BeatmasterAC (1 year ago)

      NVidia: "but...but...mUh TeNsOr CoReS...mUh DlLs...MuH FrAmE gEnErAtIoN...mUh iNfLaTioN...mUh HigHeR tSmC pRiCeS...mUh CuDa CoReS..."

    • @shanksisnoteventhatstrongbruh (1 year ago, +278)

      Agreed. I mean, Nvidia has a $350 card with 12GB (the 3060); the 3060 Ti, 3070, 3070 Ti, and 3080 having less VRAM than the 3060 is so stupid. Ngreedia at its best.

    • @KermenTheFrog (1 year ago, +20)

      The difference is Intel was selling those cards at or near cost.

    • @sorinpopa1442 (1 year ago, +43)

      Exactly. Puck Ngreedya. They've kept the gaming industry frozen these last few years because their GPUs severely lack VRAM (the 3090 being the only exception).

    • @karlhungus545 (1 year ago, +25)

      Unfortunately, Nvidia couldn't care less what you think... or anyone else on YT for that matter. They own the GPU market, and will for the foreseeable future. You don't need that VRAM anyways, unless you only play crap console ports at 4K with the 1% that have a 4K monitor 🙄😂 Buy AMD then (you won't), or better yet, have a brain and just get a console...

  • @Barrydick186 (1 year ago, +628)

    VRAM isn't the problem. Nvidia's VRAM is the problem.

    • @ForceInEvHorizon (1 year ago, +12)

      Lol more like AMD

    • @RationHans (1 year ago, +50)

      I thought it was the game devs that don't optimize xd

    • @ForceInEvHorizon (1 year ago, +6)

      @@RationHans If AMD hadn't released their FSR, we wouldn't have this problem. Nvidia's DLSS isn't the problem, since it's exclusive to RTX cards, but once AMD released FSR, which is available to every card, the devs got lazy about optimizing their games, since they know we can just use DLSS/FSR.

    • @V1CT1MIZED (1 year ago)

      @@ForceInEvHorizon you sound like a child

    • @Rivexd (1 year ago, +147)

      @@ForceInEvHorizon that’s like saying “if guns weren’t invented, people wouldn’t kill each other”

  • @LucidStrike (1 year ago, +112)

    I mean, the latest AAA games eventually become affordable Steam Sale games, and so the same problem eventually hits you even if you're not buying at launch.

    • @77wolfblade (1 year ago, +20)

      Remember no preorders!

    • @SpinDlsc (1 year ago, +9)

      True. I also think his Steam argument only has a limited degree of validity, because if you also look at the Steam Hardware Survey and see what kind of graphics cards most people have, it's 50, 60 and 70-class cards, and a lot of those are still in the 10, 16 and 20 series. A big reason many people haven't wanted to upgrade in the last couple of years is because of the recent pricing problem in the GPU space and the current recession, so by that metric, most of those people aren't exactly going to try running any of the newer, shinier games.
      Also, if VRAM not being needed is the argument we're going to make, then we also have to ask why NVIDIA is marketing ray-tracing, and now path-tracing, so hard in the first place when they aren't adding enough VRAM to help that feature run well on some of these cards long term. By the logic that NVIDIA "doesn't need" to add more VRAM than is necessary for most users, we should also argue that they shouldn't be marketing ray-tracing at all.

    • @OrjonZ (1 year ago, +2)

      A lot of people bought and played Hogwarts.

    • @ZoragRingael (1 year ago, +2)

      Plus, there are Steam sales.

    • @DeepfriedBeans4492 (1 year ago, +1)

      @@SpinDlsc idk, I think ray tracing is a good thing to push, but it's not there yet, so Nvidia should be taking a hit to their profit margins to keep prices actually making sense, as opposed to doing the complete and utter opposite like they currently are.

  • @Tubes78 (1 year ago, +580)

    I remember choosing the 6800xt 16gb over the 3080 10gb because of this.

    • @eldritchtoilets210 (1 year ago, +90)

      Same, I guess it's the "fine wine" effect starting to settle in.

    • @OffBrandChicken (1 year ago, +45)

      Same, starting to see my purchase was the correct one overall.
      While some people are saying "you don't need 8GB", they're also throttling their game, while I'm turning everything up to ultra.

    • @gruiadevil (1 year ago, +68)

      @@OffBrandChicken It's the same people that said "YoU dOn't nEeD a 4770K. What are you gonna do with 8 Threads? E8400, 2Cores/2Threads can run any game".
      Meanwhile, over the next 10 years, they swapped 6 CPUs + mobos + RAM kits, while I held on to my i7.

    • @OffBrandChicken (1 year ago, +18)

      @@gruiadevil Same. It's not about what tech is "technically fast right now in this specific use case", but "is this going to fulfill my use cases for the next X years?"

    • @OffBrandChicken (1 year ago, +26)

      @@gruiadevil I just love how when Nvidia adds more than is needed for the time, they're seen as the almighty Jesus bestowing greatness, while when AMD does something extra that frankly is very beneficial,
      the Nvidia users are like, "yes, but I could get 10% more speed in games that used less than 8GB 10 years ago, and that's what truly matters", instead of thinking about the 20% they'll gain in the future.
      The copium is so hard that they don't even see it. Even the youtubers.

  • @xEricGNx (1 year ago, +118)

    I remember when 512MB was enough for gaming.
    Thank you for your service, GeForce 8800 GT.

    • @ro_valle (8 months ago, +1)

      I remember asking my dad for an 8800 GTS 320MB and he surprised me with an 8800 GTS 640MB. I was amazed by that amount of VRAM.

    • @davidrenton (7 months ago)

      My 1st PC had 4MB of RAM (yep, MB, not GB), and not in the GFX card; it didn't have one, 3D acceleration didn't exist. I think my hard disk was 20MB.
      4MB of system RAM total, but hey, that wasn't the problem; the problem was trying to get all the DOS drivers (CD-ROM, sound, mouse) into the first 640K.
      Doom 1's final boss was stuttery, but then I spent an insane amount and went to 8MB. Doom final boss: smooth as butter.

    • @m.i.l9724 (6 months ago)

      @@davidrenton That means if you'd become a father when you were 18, your child could have a child my age, I guess. Damn.

    • @davidrenton (6 months ago, +1)

      @@m.i.l9724 Not yet; my child would have had to become a parent at 13, which is unlikely. I'm 49, so at 36 I would have had an 18-year-old kid if I'd had them at 18; hence my hypothetical grandkid could be 13.
      That said, it's not impossible: people have kids at, say, 15, their children at 15, and they're a grandparent by the time they're 30.
      I recently watched a TV bit from the '80s about 6 generations alive in one family, from the baby to mother, grandmother, great-grandmother, great-great-grandmother, and the still-living great-great-great-grandmother; they were all together in the studio.

    • @valentinvas6454 (5 months ago)

      In 2012 I thought the 2GB in the GTX 660 would be enough for a decade, but even 4-5 years later it was easily overwhelmed. Nvidia has often screwed us over on VRAM capacity. The popular GTX 970 only had 3.5GB of full-speed VRAM, and as soon as you touched the last, much slower 0.5 GB, you started to see huge stutters.
      With Pascal and Turing they were quite generous, but after that it's downhill once more. The 3070 Ti and 4060 Ti with 8GB are such jokes.

  • @madrain3941 (1 year ago, +250)

    As soon as I heard that the RTX 4060 was going to release with 8GB of VRAM, I instantly went ahead and purchased the RX 6700 XT with 12 GB, and honestly, it is a HUGE game changer, at least for me.

    • @weshouser821 (1 year ago, +11)

      What I don't understand is that we have systems with 64GB of memory; I really don't see why we can't have a card with 32/64GB of VRAM instead of messing around with this. Why can't we just make cards with upgradable VRAM slots? I don't know... it's over my head, but I really think it's because of "planned obsolescence".

    • @LeoMajor1 (1 year ago, +23)

      @@weshouser821 Yes, it is a bit over your head, because GPU VRAM and system RAM are not the same... and a card with 64GB of VRAM would be HUUUUGE and need heavy power draw and more cooling. It's even over my head, so someone else can add to that.

    • @weshouser821 (1 year ago)

      @@LeoMajor1 Would it really though?

    • @brkbtjunkie (1 year ago, +3)

      @@weshouser821 Have you seen the prices of DDR5? GDDR6X is a whole different ballgame as well. Apples and oranges.

    • @Elinzar (1 year ago, +5)

      The reason we don't have 64GB cards in the consumer space is simply that GDDR6 is still not dense enough; the enterprise counterpart of the 3090 Ti had 48GB of VRAM, and I think this gen might have an 80GB-class one or something like that.
      Capacities like that were only achieved with HBM in past gens.
      So yeah, I would say even 20GB+ midrange cards are still miles away, though 16GB will become the norm. Something I love about what AMD did with RDNA2 (absolutely underrated cards this gen) is that everything from the 6800 to the top tier got 16GB. I mean, the 6950 XT should have gotten 24 at least, but you get the point: upper midrange to high end got a fair bit of VRAM, and the 6700 got 10.
      Only the entry level got 8GB.

  • @sirab3ee198 (1 year ago, +10)

    So AAA games are a niche now????? :))) RE4 sold 5 million copies, Elden Ring sold 20 million, Witcher 3 sold 40 million, etc. I hate it when people misuse Steam charts to prove their point. Hogwarts Legacy is a singleplayer game, same as Elden Ring (kinda), but a month after launch people move on, because it is a singleplayer game!!! People finish it and move on. Nvidia giving you 8GB of VRAM on the x70 series was a slap in the face for consumers, and now they are doing the same with 12GB. People who bought the RTX 3070 and who will buy the RTX 4070 will want to play the latest AAA games.

  • @ConfusionDistortion (1 year ago, +56)

    Working adult here that buys AAA games, so yeah, this affected me. It was sobering to start up Company of Heroes 3 and find I couldn't max it out due to the VRAM limit on my 2070 Super. I've had this card for 3 years, and I still like it, but yeah, sign of the times. So now it sits in a secondary PC and I dropped the cash on a 7900 XT. Problem solved, and now I am back to running everything again and not sweating VRAM issues in The Last of Us, COH3, etc.

    • @kirbyatethanos (1 year ago, +7)

      Same boat as me. I recently had an RTX 2060 6GB and upgraded to an RTX 4070 Ti (the 7900 XT is more expensive where I live).

    • @xTurtleOW (1 year ago)

      Same, man. My 3080 was running out of VRAM very fast, so I dropped a 3090 Ti into my second rig and a 4090 into my main. Now, no VRAM issues anymore.

    • @xkannibale8768 (1 year ago, +4)

      So go from ultra to high? Like, lmao. There isn't even a difference you'd notice 90% of the time, and it uses half the VRAM 😂

    • @legionscrest143 (1 year ago, +1)

      ​@@xkannibale8768 really?

    • @JonesBeiges (1 year ago)

      @@legionscrest143 Yes, really. I did a CPU upgrade and can still play the newest titles with my 6-year-old GPU above console graphics.
      Many people like you seem to fall for those YouTube salesmen. Only idiot game devs create games for the 1% of elitist PC nerds who need 4K ultra settings at 144 Hz.

  • @trr4gfreddrtgf (1 year ago, +247)

    This is also why I'm going to take a 7900 XT over a 4070 Ti; 20GB seems a lot more future-proof than 12.

    • @VaginaDestroyer69 (1 year ago, +40)

      Yeah, and with FSR 3.0 coming out soon AMD is really closing the gap on Nshitia. RT is still not in a place where I would be willing to base my entire GPU purchase on just ray tracing performance alone. I can see AMD and Intel offering outstanding value to gamers in the future compared to Nshitia's extortionate pricing.

    • @trr4gfreddrtgf (1 year ago, +13)

      @@VaginaDestroyer69 Even the 7900 XT and 7900 XTX have made massive improvements in ray tracing; the 7900 XT is only a little behind the 4070 Ti (with ray tracing on), and I think we can expect the gap to close as the 4070 Ti runs out of VRAM over time.
      I can't wait to see how FSR3 compares to DLSS3. It probably won't beat DLSS in visual quality, but hopefully it gives a similar performance boost.

    • @CRF250R1521 (1 year ago, +10)

      The 7900 XT had poor 1% lows. I returned it for a 4080.

    • @chickenpasta7359 (1 year ago, +2)

      @@VaginaDestroyer69 You're acting like AMD is the hero in this situation. They literally wait until Nvidia drops their MSRP and then undercut them by a little.

    • @WackyConundrum (1 year ago, +4)

      @@CRF250R1521 Interesting! Do you remember a particular benchmark with these results?

  • @liberteus (1 year ago, +190)

    I own a 3080 10GB, and my favorite game has received tons of graphical updates, to the point where 10GB isn't enough anymore. I had to cut settings from ultra to a medium/high mix to keep it over 60fps, down from 90 two years ago.

    • @clownavenger0 (1 year ago, +19)

      So they added more demanding graphical settings and that hurt performance. Okay, cool.

    • @sirab3ee198 (1 year ago, +109)

      @@clownavenger0 The GPUs are limited by VRAM; it's nothing to do with the demanding graphics. This is the same as the R9 290X, which had 4GB of VRAM, versus the 780 Ti, which had 3GB, and the R9 290X aged much better than its Nvidia counterpart. Nvidia is playing the planned-obsolescence game, forcing you to upgrade your GPU because it is RAM-starved, not because it is slow. I can bet you my RX 6800 with 16GB will run games smoother in 2 years than your 10GB RTX 3080. When we told people 2 years ago about the limitations of 8GB of VRAM on the RTX 3070, they called us Nvidia haters...

    • @nazdhillon994 (1 year ago, +5)

      Which game?

    • @weakdazee (1 year ago)

      literally same

    • @Tubes78 (1 year ago, +3)

      @clownavenger0 That's always going to have an impact, but I can't help thinking the card would lose less performance with more VRAM.

  • @ngs2683 (1 year ago, +67)

    I just want to say one thing. I got a 1060 6GB in 2016 and spent nearly 7 years with it. Then I finally took my hard-earned money and bought a 3080 Ti in November, this Black Friday: 12GB of VRAM. Then IMMEDIATELY new AAA games became this crazy demanding, with devs saying that 12 GB is the minimum. On top of that, NVIDIA is effectively implementing planned obsolescence. The 4070 Ti, the superior card to my 3080 Ti, had no evolution in VRAM; it's a 12 GB card. I just gotta say... it hurts to get an upgrade after 6.5 years only to immediately end up as the new low tier for this future they speak of. And I do blame NVIDIA. No card above the 3060 should have only 8 GB, and the 3080 Ti should have been a 16 or 20 GB card. 3070 owners have every right in the world to be mad. NVIDIA KNEW this was an issue, but they don't care. They still don't.

    • @aquatix5273 (1 year ago, +5)

      The cards would be completely fine with this list of VRAM; these are the amounts they would actually need to match their compute power:
      RTX 3050: 6 GB
      RTX 3060 + RTX 3060 Ti: 8 GB
      RTX 3070 + RTX 3070 Ti: 10 GB
      RTX 3080: 12 GB
      RTX 3080 Ti + RTX 3090 + RTX 3090 Ti: 16 GB
      RTX 4050: 6 GB
      RTX 4060: 8 GB
      RTX 4060 Ti: 10 GB
      RTX 4070: 12 GB
      RTX 4070 Ti: 16 GB
      RTX 4080: 20 GB
      RTX 4090: 24 GB

    • @dimintordevil7186 (9 months ago, +1)

      @@aquatix5273
      RTX 3070 = 12GB of VRAM. Let's be honest, it's cheap for the factory.
      RTX 3070 Ti = 12GB
      RTX 3080 = 16GB
      RTX 3080 Ti = 16GB
      RTX 3090 = 20 or 24GB
      RTX 4050 = 8GB
      RTX 4060 = 10GB
      RTX 4060 Ti = 12GB
      RTX 4070 = 16GB
      RTX 4070 Ti = 16GB
      RTX 4080 = $1200 / 20GB
      RTX 4090 = 24GB

    • @ozgurpeynirci (9 months ago, +1)

      @@aquatix5273 The 4060 Ti is WAY more capable than 8 GB; that's why they made a 16GB version. The 4070 should be 16 as well, if not more. As a 10GB 3080 owner, this hurts.

    • @aquatix5273 (9 months ago, +1)

      @ozgurpeynirci Yeah, doubt it; the 4060 Ti is barely better than the 3060 Ti. Neither card has the performance.

    • @dimintordevil7186 (9 months ago, +2)

      @@aquatix5273
      The 3060 Ti is faster than the 1080 Ti. The 1080 Ti was faster than the 2080 Super. Nowadays the 2070 is as fast as the 1080 Ti. Therefore, the 3060 Ti is a great card.

  • @KobeLoverTatum (1 year ago, +112)

    Nvidia: “Here gaming studio, $$ to use more VRAM”
    Also Nvidia: “Higher VRAM costs $$$$$$$$$$$$$”

    • @hardrock22100 (1 year ago, +11)

      You do realize The Last of Us and RE4 are AMD-sponsored titles, right?

    • @gruiadevil (1 year ago, +29)

      @@hardrock22100 You do realize they use more VRAM precisely because AMD packs their GPUs with more VRAM, and Nvidia doesn't.

    • @hardrock22100 (1 year ago, +13)

      @@gruiadevil
      1. This person was trying to claim that Nvidia is paying devs to use more VRAM in games that are sponsored by AMD.
      2. It's interesting that AMD-sponsored titles are running like hot garbage.
      3. The company that ported The Last of Us was the same one that ported Arkham Knight.
      4. The Last of Us crashes when you run out of VRAM. That should not be happening. I've seen it even BSOD some PCs.

    • @AntiGrieferGames (1 year ago)

      @@hardrock22100 The Last of Us is just a piece-of-shit port to be getting BSODs.

    • @vaguedreams (1 year ago, +4)

      @@hardrock22100 3. The company that ported Arkham Knight is also the same company that ported the Uncharted: Legacy of Thieves Collection.

  • @stratuvarious8547 (1 year ago, +35)

    When Nvidia released the 3060 with 12 GB of VRAM, everything up to the 3070 Ti should have also had 12 GB, with the 3080 and 3080 Ti getting 16 GB. I just hope this is the straw that costs them enough market share to change their ways, instead of always thinking they can do whatever they want and people will just buy it.

    • @MarcoACto (1 year ago, +1)

      The thing is that the 12 GB version of the 3060 was obviously aimed at crypto mining, which required a lot of VRAM and was hot at the time. It was never designed with gaming in mind.

    • @stratuvarious8547 (1 year ago)

      @@MarcoACto Yeah, it's true, but that doesn't change the fact that the SKUs above it should still have been increased. Making GPUs obsolete 3 years after their release is inexcusable, and that's all that giving those cards 8 GB of VRAM has done.

    • @naturesown4489 (1 year ago)

      @@MarcoACto Yeah that crypto thing is a myth. NotABotCommenter has the correct reason.

    • @r3tr0c0e3 (1 year ago, +1)

      The 3060 will still get 30fps less than a 3070/80 regardless of how much VRAM it has, lol.
      You people are clueless.

    • @stratuvarious8547 (1 year ago, +1)

      @@r3tr0c0e3 Of course it'd have lower FPS; it's a lower-class GPU. I was talking about the longevity of the purchase. Maybe before calling someone "clueless", look at the context of the conversation.

  • @rajagam5325 (1 year ago, +90)

    I got a 6800 XT for 3070 money and am really happy with it, and the video export times are really good.
    (Don't support companies, support the better product :))

    • @gruiadevil (1 year ago, +2)

      This is the best attitude!
      If NZXT or Corsair make a GPU and it's good price/performance compared to the other ones, I'm buying it.

    • @shanksisnoteventhatstrongbruh (1 year ago, +2)

      Yep, right now the 6950 XT is CHEAPER than the 3070 Ti while having DOUBLE the VRAM (16GB vs 8GB) and being 36% faster!!! CRAZY

    • @IchiroSakamoto (1 year ago, +4)

      +1. I've bought Nvidia products all my life, but went for the 6800 XT for better value for money, as my 2070 Super really disappointed me in RT. I couldn't care less who the maker is, but I'm disappointed there are so many fanboys around.

    • @abdulqureshi2803 (1 year ago, +1)

      I got a 6800 XT for MSRP right from AMD, upgraded from a 3070 (which I sold for the price of the 6800 XT during the crypto boom), and it's so much better. I ran into the VRAM problem in FH5 a few times on the 3070; never had that issue with the RX 6800 XT, though I did get the black screen of death a couple of times when I first got it.

    • @Fluskar (6 months ago)

      Facts. I will never support brand loyalty; it's plain stupid.

  • @veda9151 (1 year ago, +50)

    It is very true that most people aren't actually affected by the VRAM issue right now. The real controversy is Nvidia not providing enough VRAM while pricing their GPUs as high-end models. No one is complaining that the 3050 or the 6600 only got 8GB. It's the 3070 and 3080 (10GB) that attract all the attention.

    • @MATRIX-ERROR-404 (1 year ago, +1)

      RTX 3070 / RTX 3070 Ti / RTX 3060 Ti = 8 GB of VRAM

    • @infiniteblaz3416 (1 year ago, +6)

      Because the 3050 and 6600 are entry-level cards priced accordingly. The 3070/Ti is a whopping ~$500 with the same amount of VRAM as the entry-level GPUs. Hence why people are calling out Nvidia's stupidity.

    • @vectivuz1858 (1 year ago, +3)

      @@infiniteblaz3416 That is exactly his point though. Price accordingly and people will understand.

    • @r3tr0c0e3 (1 year ago)

      System RAM like DDR4 or DDR5, or a fast NVMe drive, can easily be used to compensate for a lack of VRAM; devs just need to implement it, but they are lazy af.

    • @vectivuz1858 (1 year ago, +1)

      @@r3tr0c0e3 Uhm, yes, some games do that, and it causes major lag.

  • @Obie327 (1 year ago, +39)

    Very good observation, Vex. The older Pascal cards with 8 gigs of VRAM only use whatever feature sets they have baked in. The problem now is that all these new advanced DX12 features plus higher resolutions become more taxing on limited VRAM buffers in niche situations. There's a car analogy here: it's fast but runs out of gas (tiny tank), or the car gets to sixty really quick but tops out at 80 mph (low gearing). I really think everyone wants something that performs great and has future potential, practicality, and value, hoping their GPU will last a good while for their pricey investment. Limiting the RAM only limits the possibilities for game developers.

    • @user78405 (1 year ago, +6)

      Limiting RAM should force game developers to open doors, not milk them. That was John Carmack's philosophy: good-quality work ethic over quantity. A company always gets sloppy when it bites off more than it can chew, and Ion Storm was a good example back then.

    • @Obie327 (1 year ago, +1)

      @@user78405 I totally agree with you. But as in Moore's Law Is Dead's interview with the game developer, the modern consoles are using around 12 gigs of VRAM. I hate sloppy, lazy code, but I do like to experience everything the developers have to offer. Maybe AI can help clean this up? I feel that if more games adopt higher RAM limits, this issue won't be a problem going forward. I feel like we are in a weird transition period and Nvidia could be more generous with their specs. Have a great weekend and Easter!

    • @David_Raab (1 year ago, +1)

      Some people like to buy a newly released graphics card (3070) for $600-$700 and like that they have problems with already-released games because of too little VRAM. People who criticize this are obviously AMD fanboys.

    • @Obie327 (1 year ago, +2)

      @@David_Raab We've had 8 gigs of VRAM on premium GPUs for years. My GTX 1080 is 7 years old, and AMD's been doing it even longer. The latest consoles were the warning sign that more RAM was going to be needed, and now new console releases are using 12+ gigs. I just think it's a damn shame to pay $500+ for anything new with only 8 gigs and call it acceptable going forward. I think Nvidia just missed the mark with their current product stack. Also, Nvidia's new stuff still has the older display connector standard, which has me scratching my head since they have the same display tech on the $1600 RTX 4090 as well. Intel's Arc A770 LE is only $350 and has the latest display connectors, the DX12 Ultimate/Vulkan/XeSS feature sets, and 16 gigs of VRAM. Is video RAM that expensive to put more on an $800 4070 Ti? I just think the whole current rollout of GPUs is off on many levels. Time will rapidly tell how fast our cards become obsolete. Crossing fingers. Peace!

    • @David_Raab (1 year ago, +1)

      @@Obie327 I agree with all of that. I find it a shame. I've been buying Nvidia for nearly 20 years, and now I'm at the point of buying AMD instead. Nvidia now sells overpriced cards; the 4070 in my opinion should have been a 4060. I could live with such a card and 12GB if it cost 300€, but not 600€. And yeah, they can't tell me that 4GB or 8GB more GDDR6 RAM is that expensive. Any card over 300€ should have at least 16GB of VRAM.

  • @clownavenger0 (1 year ago, +33)

    Hogwarts was patched and works fine on a 3070 now. RE has a bugged implementation of RT, so if you turn that off the 3070 doesn't have any issue in that game either. TLOU has PS1 textures on medium and other graphical issues regardless of hardware. If the question is "Is 8 GB on the edge for the 3070?" I would say yes, but games are releasing completely broken, which increases the need for overpowered hardware. Some very good-looking 3D titles with high-quality textures use 4-6GB on ultra (Atomic Heart, for example), while TLOU uses 8 while looking like a PS2 game at medium settings. I run a 3080 10GB myself and play everything at 1440p or 1440p ultrawide, using DLSS Quality whenever offered, to push about 100 FPS. I have not had a single issue, but I only buy games on sale, so the game might be finished by the time I buy it. It seems like people just want to make excuses for game developers.

    • @DenverStarkey (1 year ago, +2)

      Well, these games were also designed around cards that have 16 gigs (the Radeon cards), so the devs got sloppy with VRAM usage.

    • @jorge86rodriguez (1 year ago, +4)

      Just buying the game on sale avoids a lot of headaches, hahaha. Early buyers are beta testers xD

    • @tyisafk (1 year ago)

      I played through RE on an RTX 2070 and an Arc A750, both 8GB cards. I agree that RT (and hair strands on the Intel) was the main issue with the game. To be fair though, neither reflection implementation is good at all, so it's worth having both RT and screen-space reflections turned off. I even used higher-quality textures than the game suggested with those disabled, as per Digital Foundry's suggestion, and the game ran flawlessly on both cards. I'm glad I don't often care for big AAA titles, and I have a PS5 if I'm that desperate to play one that isn't optimized properly on PC, but I do feel bad for AAA fans who exclusively play on PC. PC used to be the main go-to for long-term savings if you didn't mind paying more up front, but now a current-gen console is definitely the better option if you just want to be able to play anything decently.

    • @arenzricodexd4409 (1 year ago)

      @@DenverStarkey Those are high-end cards; how many people actually own such a GPU? All the talk surrounding this "not enough VRAM" issue is mostly, if not entirely, about max settings. Years ago I read an interview with a dev (forgot which tech outlet did it); they said that on PC they try to optimize their game even for Intel iGPUs because of how big that user base is. And back then Intel iGPs were considered super crap.

    • @DragonOfTheMortalKombat (1 year ago, +2)

      10GB will soon not be enough. Dying Light 2 maxes it out easily with RT. You'll have to increase your reliance on DLSS and lower texture quality in the coming years. Even Flight Simulator maxes out 10GB of VRAM at 1440p. So...

  • @franciscoc905 (1 year ago, +16)

    I was definitely waiting to see what you would contribute to the discussion. I definitely see this as a negative for people wanting to play AAA games in 2023 with high details, but it will mean a fire sale of great deals on second-hand graphics cards for competitive gaming.

  • @Tripnotik25 (1 year ago, +5)

    5:52 During this interview the dev brings up stuff like "using VRAM to have high quality on all the body parts, like the eyes, in case someone looks closer", and I'm thinking we're supposed to pay $500+ for the sake of having 4K resolution on 30 body parts of an NPC. You're not kidding when you say niche. How about these devs make good games with gripping writing and stop shipping crap ports that rely on DLSS/FSR to cover up laziness? 95% of the market is going to continue using 8-12GB products, and indies will thrive.

  • @stephenpourciau8155 (1 year ago, +4)

    One little flaw: you did not turn on the overlay setting that shows "memory usage \ process". That one in Afterburner/RTSS will show the ACTUAL VRAM usage of the application, not what is allocated on the whole card.
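
A minimal sketch of the same per-process vs. whole-card distinction, using nvidia-smi's CSV query interface instead of the Afterburner overlay. It assumes an NVIDIA GPU with nvidia-smi on PATH; the per-process query covers compute workloads, and a game may only appear in the plain `nvidia-smi` process table depending on driver and OS:

```python
import subprocess

def smi(*args):
    # Thin wrapper around nvidia-smi's CSV query interface.
    return subprocess.run(
        ["nvidia-smi", *args], capture_output=True, text=True, check=True
    ).stdout.strip()

# Whole-card usage: everything allocated on the GPU, roughly the number
# an overlay shows when the per-process readout is disabled.
print("whole card :", smi("--query-gpu=memory.used,memory.total",
                          "--format=csv,noheader"))

# Per-process usage: closer to what one application actually consumes.
print("per process:", smi("--query-compute-apps=pid,process_name,used_memory",
                          "--format=csv,noheader") or "none reported")
```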

  • @ner0718 (1 year ago, +5)

    I spend a lot of time in Blender (3D modelling and rendering), and being stuck on a 6GB card is incredibly frustrating, as I can't render most of my scenes on the GPU; my VRAM fills up. I can't upgrade, as I am still a student and don't have the money to buy a new GPU.

  • @Doric357 (1 year ago, +28)

    The 6800 XT with its 16GB seems to be a sweet spot. I'm a casual enthusiast, so I won't claim to know the ins and outs of all this, but creators have been talking about VRAM forever, and I always believe more is better. However, I don't believe it should come at such a high premium.

    • @tokki2490 (10 months ago)

      If you have a 6800 XT... you are not a casual enthusiast lol

  • @bladimirarroyo8513 (1 year ago, +8)

    Man, I'm about to buy my first GPU and your videos are answering all my doubts.
    Thank you so much 🤗

    • @Ghostlynotme445 (1 year ago)

      @@z3r009 If you got a console, buy another console.

  • @brandons9138 (8 months ago, +4)

    Nvidia doesn't design these cards in a vacuum; they are talking to developers and working with them. I'm thinking Nvidia has something cooking that will help mitigate the VRAM issues. I can't see them making 8GB the baseline knowing full well that games will not run well on it. We're all used to having cards with 12-24 GB of VRAM, because in the past GPU performance was based solely on brute-force rasterization. With technology like DLSS and FSR I think we'll see that change somewhat. I just benchmarked my base-model 4060 with 8GB and it runs Cyberpunk 2077 at 1080p on ultra with DLSS at an average of 90 FPS; even without DLSS it was over 60 FPS. I'm thinking that both Nvidia and AMD are working to make DLSS and FSR a way to keep performance high without resorting to massive, power-hungry chips that are too expensive to make and don't sell as well because of the price. The reason we are seeing some games crush these cards is that they were not developed with DLSS and FSR in mind. It may have been patched in, but who knows how well optimized it is.

  • @stratuvarious8547 (1 year ago, +18)

    I expected that when games designed for the current-gen consoles (Xbox Series, PS5) started releasing on PC, this was going to become a problem. That's why, when I was looking to upgrade my 2070, I was looking for something with a minimum of 12GB of VRAM. Since I couldn't get a new 3080 (or Ti) for a reasonable price, I went with the RX 6900 XT and its massive 16 GB of VRAM. At $650, it felt like the best price-to-performance in the range I was looking at.

    • @latlanticcityphil (1 year ago, +1)

      Man, I love my RX 6900 XT; no problems, and a great investment too. I can play all the games and have a great experience, even with Cyberpunk. 16GB of VRAM DOES MAKE A DIFFERENCE!

  • @friendofp.24 (1 year ago, +28)

    Heavily regretting buying the 3070 now. I camped outside of a Best Buy for 20 hours and had the option to choose any 30 series. I didn't understand at the time how much VRAM mattered.

    • @HUNK__S (1 year ago)

      😂😂 Sucks to be you

    • @soumen8624 (1 year ago, +16)

      It's not your fault; VRAM really didn't matter until very recently.

    • @Stephan5916 (1 year ago)

      @friend You live and you learn. VRAM always mattered.

    • @Stephan5916 (1 year ago, +2)

      @clockworknick9410 It's still Nvidia's fault. Before the 3080, their flagship card was the 2080 Ti, which had 11GB of VRAM. They should have at least matched or bettered that VRAM with the base 3080 model.

    • @naturesown4489 (1 year ago, +1)

      @Clockwork Nick There were people saying at the time of the 3070's release (Hardware Unboxed) that the VRAM wouldn't be enough in a couple of years. The 1070 had 8GB, the 2070 had 8GB... so why would they not have put more on the 3070?

  • @metroplex29 (1 year ago, +8)

    That's why I preferred to go for the 6800 XT with 16GB of VRAM.

  • @lukasbuhler1359 (1 year ago, +7)

    Planned obsolescence go crazy

  • @Lev-The-King (1 year ago, +18)

    Hope AMD takes the opportunity to clown Nvidia for this... They probably won't.

    • @nombredeusuarioinnecesaria3688 (1 year ago, +8)

      They did it at the time of the GTX 970 and its 3.5GB of VRAM.

    • @lotto8466 (1 year ago)

      @@nombredeusuarioinnecesaria3688 My 6750 XT has 12GB and plays Hogwarts on ultra perfectly.

    • @Jacob_Overby (1 year ago, +1)

      16GB on the 6900 XT, yum yum; the newer 7xxx cards are pushing 24.

  • @tech6294 (1 year ago, +133)

    Great video! We need more people talking about this. If Nvidia and AMD tomorrow made 24 GB the standard for midrange and 48 GB for the high end, games would overnight look photorealistic. And no, VRAM doesn't cost that much more going from 12GB to 24GB; you're probably only talking about a $40 hike in price (rough math in the sketch after this thread). These companies could easily make a $600 24GB card. They simply choose not to.

    • @Clashy69 (1 year ago, +23)

      AMD has already put enough VRAM on their cards, even the low-end 6000 series. Nvidia should do the same and add at least 12GB of VRAM to their lower-end cards; I'd even be fine if it was just 10GB, but we'll see, since they gave the 4070 12GB.

    • @bronsondixon4747 (1 year ago, +28)

      It'd make no difference if 24GB were the minimum. It just needs to be more VRAM than the current console generation has.
      Game developers wouldn't take advantage of more than 16GB, since that's all they have available in the PS5.

    • @66kaisersoza (1 year ago, +8)

      @@bronsondixon4747 The console RAM is shared with the OS.
      Around 10GB is for games and the other 6GB is dedicated to the OS.

    • @luisgarciatrigas3651 (1 year ago, +23)

      ​@@66kaisersoza 13.5 for games, 2.5 for OS 👍

    • @retrofizz727 (1 year ago, +14)

      24GB is an overreaction, wtf; you won't need 24GB for 4K before like 2030.
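
On the "$40 hike" estimate in the comment that opened this thread: a back-of-envelope check, assuming a roughly $3.50/GB GDDR6 spot price (an assumption for illustration, not a quoted bill-of-materials figure):

```python
# Rough bill-of-materials delta for going from 12GB to 24GB of GDDR6.
GDDR6_USD_PER_GB = 3.5        # assumed ~2023 spot price, not a quote
extra_gb = 24 - 12
print(f"memory chips alone: ~${extra_gb * GDDR6_USD_PER_GB:.0f}")
# ~$42 of chips; wider board routing, power delivery, and validation
# add more on top, so the $40 figure is a floor, not the full cost.
```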

  • @Sinflux420 (1 year ago, +35)

    Just got a 20GB 7900 XT. Being able to run RE4 at max with ray tracing and have 6GB left over is pretty nice, ngl. I didn't realize this was an ongoing issue until after getting the card; glad it's well equipped!

    • @Drake1701 (1 year ago)

      Out of curiosity, what resolution do you play at?

    • @Austrium1483 (1 year ago)

      What did you pay?

  • @ElladanKenet (1 year ago, +6)

    I upgraded from a GTX 960 4GB to a 3060 Ti in early 2021, and went from 720p to 1080p. The improvements were staggering, and it's still mostly impressive two years later, but there are a few games, like HL, that punish my system.

    • @jayclarke777 (1 year ago, +1)

      Went from a 1050 Ti to a 3060 Ti. It was like going from VHS to Blu-ray.

    • @Trainboy1EJR (1 year ago)

      @@jayclarke777 As someone who had a 2GB GT 640, I gotta say that textures high, shadows low, everything else off looked really good. Never went past 40% GPU usage because of a CPU bottleneck. XD Played Lost Planet 2 totally cranked at 17fps; had the game memorized from PS3, and it looked AMAZING on PC. Just Cause 2: 24fps in the city, only like 15% GPU usage. XD
      Upgraded to a 12GB 2060. The most VRAM for the minimum price; high textures are what matter. Can't wait to finish going through all my favorites in 4K 120fps! XD

  • @denerborba4994 (1 year ago, +29)

    Digital Foundry recently reviewed the RE4 remake, and there they tested a 2070 Super; by their analysis, it seems RE4 will never crash with textures at 8 GB unless you are using ray tracing. I have also been playing the game on an 8GB card and haven't faced any crashes so far either, about 13 hours in.

    • @viktor1master (1 year ago)

      I tested the RE4 remake with a 3070 and a Ryzen 9 3950X with 16 GB of RAM, with settings totaling 12 GB of VRAM and textures at 8 GB. Funny though, the settings screen showed my card as having only 7 GB of VRAM, so I don't know; in other games it's the normal 8 GB. But I was impressed: it ran 135 fps, dipping to 70/80 at the lowest and staying slightly over 60, in 30 minutes of testing the demo. Now I know I must have it; the game looks so good 🤣

    • @JoeL-xk6bo (1 year ago, +4)

      It still has issues loading in high-res textures. Stop and look at certain surfaces and the texture will pop in and out.

    • @r3tr0c0e3 (1 year ago)

      RT doesn't look particularly accurate or appealing in this game anyway; besides, it's just another condom they placed on a game that looks perfectly good without it, so it stayed off, just like in any other unoptimized garbage they've tried to sell us.
      Unless a game goes full path tracing it's simply not worth it, and as we can see, even a 4090 struggles to do that at playable fps.

  • @Saltbreather (1 year ago, +23)

    If you go into the settings for MSI Afterburner/RTSS, you can enable both dedicated and allocated VRAM readouts. That'll give you a more accurate number when looking at how much VRAM a game is actually using.

    • @r3tr0c0e3 (1 year ago)

      By that logic, if you have 48GB a game will allocate that much if needed, so yeah, we need 64GB of VRAM now lol.
      Funny how all the recent RE games supposedly use more than 8GB of VRAM yet run smooth and without any stutters, even while the settings screen indicates that the VRAM limit is exceeded.
      The RE4 remake didn't crash because of that, and it was eventually fixed later; you will get small stutters and dips if you enable RT, even on a 4090, so VRAM is not the issue in this case. Far Cry 6, however, will destroy your performance if you enable ultra textures with only like 8GB of VRAM, and they kind of look just like high lol. RE games will use system RAM to compensate; many games do that actually, simply because recent consoles have shared RAM, be it GDDR6, but still. Lazy devs simply can't be bothered to port them properly, hence the VRAM rage.

  • @themadnes5413 (1 year ago, +26)

    I had a 1080 Ti, and the main reason I did not get a 20 or 30 series was VRAM. The 3080 has less, and the 3080 Ti had 12GB; 1200€ for a 12GB GPU is kinda stupid and in this regard a sidegrade. Now I have a 4080. 16GB is still a bit on the low side for a 1200€+ GPU, but I can live with that. I know AMD is an option too, and I was about to get a 7900 XTX, but the price of the 4080 was like 50€ more. So I chose the Nvidia GPU; I also like RT and DLSS a lot.

    • @dededede9257 (1 year ago, +4)

      I think 16GB is still fine; yeah, it could be more for this price, but I don't think you'll get VRAM-limited.

    • @zdspider6778 (1 year ago, +11)

      1200€+ is the price of a "decent enough" second-hand car.
      The MSRP of the 1080 Ti was $699.
      Ngreedia is laughing all the way to the bank every time a schmuck buys one, lol. They're sitting comfortably on the shelves, not even scalpers are touching them. But enjoy it, I guess. LOL. You got a step-down, btw. From a "Ti" to a "non-Ti" 80-class, for much more money.

    • @paranikumarlpk (1 year ago, +3)

      You could have easily chosen the 7900 XT, but you just made an excuse to stick with Nvidia lol, ggs

    • @dededede9257 (1 year ago, +6)

      @@paranikumarlpk He made the right choice; for almost the same price the RTX 4080 is better than the XTX and doesn't have issues like 100W idle power draw with multiple monitors.

    • @vaghatz (1 year ago, +2)

      @@paranikumarlpk DLSS

  • @VDavid003 (1 year ago, +7)

    I'm just glad that the 3060 I bought used has 12GB of VRAM.
    I actually wanted as much VRAM for the money as possible, since last time I went with a 3GB 1060, and in the end that 3GB bottlenecked the card in some cases.

    • @0xEF666 (3 months ago)

      same

  • @dalebob9364 (1 year ago, +4)

    The main thing is, no one's telling everyone that 90% of the stuff eating up GPU memory is things you don't need to have on, and that wouldn't even be on in the console version of the game you're playing!

  • @terkiestorlorikin5958 (1 year ago, +8)

    7:39 The people that get affected by VRAM are the ones that like to run games on max settings. Seriously, in 90% of games the differences between High and Ultra are barely noticeable. Alex from Digital Foundry does amazing videos showing optimized graphical settings, and most of the time you have to zoom in 200% to spot the difference between High and Ultra. I understand that VRAM might be an issue in the future, but some people should chill a little and ask themselves, "Do I really need to run clouds at Ultra? Do I really need to run this specific setting at Ultra?"

  • @lucaoru502 (1 year ago)

    What is that headset you use in your videos? 🎧

  • @mechanicalpants (1 year ago, +1)

    I bought an RX 6700 10GB and I'm only gonna be playing at 1080p. Will this be enough for these newer games with their super-detailed textures going into the future? Because in games like Red Dead 2, The Last of Us, RE4, etc., going down to medium textures really starts to drop off in quality drastically (really muddy textures), and there doesn't seem to be a nice middle point anymore with some of these newer games.

    • @itsJPhere (1 year ago)

      Like my mom used to say: at 1080p, large textures are a waste of VRAM.

  • @VisibleVeil (1 year ago, +10)

    VRAM is not niche, because those AAA titles will be on sale for the rest of us in 1-2 years' time. When that happens, how will we play with the limited RAM on these cards?

    • @gruiadevil (1 year ago, +1)

      You won't. Lol. You'll buy a new, better, much more expensive card and thank big Daddy nVidia for giving you another shite product.

  • @konstantinlozev2272 (1 year ago, +3)

    The real problem with high VRAM requirements and even with raytracing requirements is not (!) that it taxes your hardware. It's that the visual output is very underwhelming for the uptick in hardware requirements.
    You seem to be younger, but I do remember the Crysis WOW moment when we were seeing what kind of visual fidelity was possible.
    I fired up Titanfall 2 yesterday and on high it is a stunning game. Fake reflections and all, but you know what? It runs on 6-7 year old mid-range hardware. And looks just gorgeous.

  • @commonsense-og1gz (1 year ago)

    Can the amount used be lowered in demanding games by running DLSS or FSR, or just by running medium settings?

  • @ChusmaChusme (1 year ago, +1)

    9:00 I'm a pretty heavy DaVinci user and I usually cap out my 16GB card, with 4GB getting paged to RAM. This is usually with something like 6K BRAW footage with frame interpolation, noise reduction, and a couple of other nodes applied. When it came to choosing between the RX 6800 XT and the RTX 3070 at the time (this was during the mining boom, so prices didn't make sense), the 16GB made sense for my use case.

  • @seaspeakss (1 year ago, +10)

    Tbh, I was expecting this. When Nvidia decided to put 8 gigs into the 1070, I was amazed and looked forward to the future. But after the release of the 1080 Ti, Nvidia got really comfortable and hasn't really come out with a great card considering cost-to-performance (the 90-series cards are powerful but super expensive, unlike the 1080 Ti back in its day). The 3070 STILL having 8 gigs of VRAM is what holds it back, and the 3080 only having 10 is also a major dealbreaker for me.

    • @Mr.Stalin116 (10 months ago)

      Tbh, I feel like 10 GB is enough for the 3080 given its performance. I recently got a 4070 Ti 12GB, and I'm playing at 4K just fine. It does run out of VRAM when I'm playing 4K ultra RT in some games, like Cyberpunk with RT Overdrive, but those games would run at 20-30 fps anyway with more VRAM, so there is not really any point in having more. And it sucks that there aren't many other options if you wanna experience RT; AMD just doesn't run well in RT. After trying RT in Cyberpunk, I was amazed by how much better it looks.

    • @y0h0p38 (7 months ago, +1)

      @@Mr.Stalin116 Right now, 10GB is plenty. What about the future, though? It's a higher-end card; you should be able to use it for years without any issues.

  • @Skylancer727 (1 year ago, +11)

    I completely disagree that this is a niche issue. I do agree that AMD advertising higher VRAM hasn't helped them, but this is an incredibly serious issue. People bought 3070s, 3080s, and 3060s only a couple of years ago and have the power to play newer games, yet it won't work. Remember that some even bought 3070s for over $800 during the mining gold rush. That's an incredibly rough place to be in, especially since most people seem to keep a GPU for over 2 generations, and even the 40 series is low on VRAM. Even at MSRP the 3070 was $500, the same as the current-gen consoles that are running these games, and again, this GPU is objectively faster. This could scare people away from PC gaming just after it recently took off again.
    And yes, it's only AAA games, today. But when games start adding features like DirectStorage (which most likely will happen), even 12GB will be in a tough spot. Hell, even Satisfactory announced moving to UE5 for new map-loading systems and Nanite. More games are going to continue to do this. And many people play at least one AAA game. Did you see the player counts for Hogwarts Legacy? They announced over 3 million sales on PC alone in the first week. And games like COD will likely go the same way in the next 2 years as they drop PS4 and Xbox One, probably starting with the next COD game.

    • @DenverStarkey (1 year ago, +2)

      I just bought a used 3070 in October of '22, and I already feel like Nvidia is standing on my dick. RE2R and RE3R both go over the limit and crash when ray tracing is on.

  • @Aleksander50853 (1 year ago)

    @vex Listening to HUB and other channels, they have pointed out multiple times that the VRAM number you are looking at does not give the correct answer on VRAM usage; it is in fact only VRAM allocation, which is usually a bit higher than the actual usage.

  • @patric.rogers (1 year ago)

    I wish we could add a RAM stick between the x16 PCIe slot and the GPU. Is that even possible to do? Or a SODIMM? Or the new laptop memory module?

  • @NootNoot. (1 year ago, +54)

    I have to agree with you on the whole 'VRAM niche' point. I myself don't usually play AAA games that tax my GPU, although I do run workloads other than gaming that need a lot of VRAM. Still, I do think this whole VRAM fiasco is a very important thing to discuss. Nvidia's planned obsolescence should be put to a stop; give consumers what they NEED for what they PAID for. Like you said, performance is what scales with VRAM.
    The 1070 doesn't need any more VRAM because of how it performs, unlike the 3070, which should be able to play at 1440p+ and shouldn't be bottlenecked by memory, causing stutters, instability, or the game not even booting up. It's a business move, and it totally sucks. While these videos may seem 'repetitive' or 'controversial', I appreciate you making this.

    • @johnny_rook (1 year ago, +5)

      Define "niche".
      AAA games sell by the millions on PC, and the new consoles have at least 12GB of VRAM addressed to a GPU on the tier of the 4-year-old RTX 2070. People (both devs and players) will use high-res textures if they can, regardless of GPU power, and textures are the biggest contributor to VRAM usage.

    • @Lyka-clock (1 year ago)

      So far my 3060 Ti works well, mostly for PC-type games like RTS and some RPGs. I'll use a console for AAA games, but these days most of it is trash, really; the ports haven't been that great to begin with. I tried Returnal on PC and it was stuttering no matter what settings I used, and it wasn't a VRAM issue either. I already played Dead Space, RE4, and TLOU. Maybe there needs to be a focus on making a new IP with great gameplay. That would be a wonderful idea! Let's hope Diablo 4 is actually fun. The demo was good but not great, and it isn't a new IP or anything.

    • @arenzricodexd4409 (1 year ago, +2)

      @@johnny_rook Millions of those PCs do not have a GPU with 12GB of VRAM. In fact, more than half of them are probably only using an iGPU.

    • @johnny_rook (1 year ago)

      @@arenzricodexd4409 Yeah, and not having enough VRAM is the issue, isn't it?
      How do you know that, exactly? Isn't it funny when people pull numbers out of their asses, without a shred of evidence to support them?

    • @alanchen7757 (1 year ago, +1)

      @@johnny_rook The proof is how AMD's GPUs competing against the 3070, 3080, etc. outlast Nvidia's, due to having more VRAM.

  • @Jerad2142 (1 year ago, +7)

    One bright side about laptop gaming is a lot of these chips have way more VRAM on them, for example my 3080 laptop has 16GB of VRAM.

    • @dededede9257 (1 year ago, +4)

      So for the first time in history, laptop gaming will age better than desktop.

    • @spaghettiupseti9990 (1 year ago, +2)

      @@dededede9257 Probably not; 3080 mobile GPUs don't perform like 3080 desktop cards, not even close.
      A 3080 mobile is about 40-50% slower than a 3080 desktop.

    • @whohan779 (1 year ago, +1)

      Correct, @@spaghettiupseti9990, their naming is hugely misleading. Even a 3080 Ti mobile may be trounced by a 3060 Ti desktop depending on clocks (it's realistic).
      This is mostly because the RTX 3080 mobile is almost identical to 3080 Ti mobile, so they need the same memory bandwidth. While I'm sure Nvidia could explain away 8 GB for all 3080 mobiles (as they do for some), this wouldn't fly for the Ti models, hence they always have 16 GB on mobile.
      The mobile 3070 and up are (according to somewhat unreliable UserBenchmark) just 20% apart vs. 38% on desktop, so the only reason to pay up for an 80(👔) SKU (apart from higher power-limit) is the additional VRAM.

    • @UNKNOWN-li5qp (1 year ago)

      But a laptop with a 3080 will be like 3000 dollars, and at that point just buy a 3090 or 4090 lol

    • @Jerad2142 (1 year ago)

      @@UNKNOWN-li5qp I don't even know if you can get "new" ones with a 3080 anymore; one with a 3080 Ti and a 12800HX is about $2,149.99 if you go Omen, though. My 4090 laptop was about $3,300. But yeah, you're definitely paying a premium for portability.

  • @SR-fi8ef (1 year ago)

    Could you do a video on DirectX DirectStorage, to see how much (or how little) it impacts the VRAM issue?

  • @T0pN0tchSoldi3r (1 year ago, +1)

    My question is: which card is better for streaming, AMD or Nvidia? And does VRAM size matter?

    • @Zxanonblade (11 months ago)

      Late reply, but go with Intel or Nvidia. They both destroy AMD with their encoding in the most common formats (H.264, H.265), and between Intel and Nvidia I would personally go Intel, because their encoding (Quick Sync + Arc Hyper Encode) is pretty much equal for way cheaper, and they also have AV1 (while being way cheaper than Nvidia's good options for AV1, the 4000 series).
      VRAM doesn't matter much; you can easily use a basic card like Intel's A380 to do the encoding on its own while gaming off a second GPU (so basically, encoding doesn't need a ton of VRAM).
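
For anyone who wants to compare encoders themselves: a minimal sketch driving ffmpeg's hardware encoders from Python. The filenames are hypothetical; the encoder names are real ffmpeg ones, but which of them work depends on your GPU and ffmpeg build (run `ffmpeg -encoders` to check):

```python
import subprocess

SRC = "gameplay.mkv"  # hypothetical input recording

# Real ffmpeg encoder names; availability depends on GPU and build.
jobs = {
    "h264_nvenc": "out_nvenc_h264.mp4",  # Nvidia NVENC H.264
    "hevc_nvenc": "out_nvenc_h265.mp4",  # Nvidia NVENC H.265
    "h264_qsv":   "out_qsv_h264.mp4",    # Intel Quick Sync H.264
    "av1_qsv":    "out_qsv_av1.mp4",     # Intel Arc AV1
}

for codec, dst in jobs.items():
    # A fixed 6 Mbit/s video bitrate keeps the quality comparison fair.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", codec,
                    "-b:v", "6M", dst], check=True)
```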

  • @Einygmar (1 year ago, +54)

    VRAM is a problem, but optimization affects this issue as well. Better texture/asset streaming and more optimized BVH structures for ray acceleration would fix a lot of problems. I think the bigger issue is memory bandwidth on modern cards, which limits throughput and hurts streaming capability (rough numbers sketched after this thread).

    • @Vasharan (1 year ago, +5)

      Yes, but games will continue to be unoptimized, as long as every developer isn't John Carmack* or Takahisa Taura (Platinum Games), and as long as studios have deadlines and cashflow constraints.
      As a consumer, you can either not buy unoptimized games, or not buy underprovisioned hardware, or some combination of both.
      * And even Carmack's Rage was a buggy mess for years as he tried to get texture streaming to work seamlessly

    • @gagec6390 (1 year ago, +4)

      @@SkeleTonHammer That's just not true, though. 4K monitors and especially TVs have become very affordable. 1080p is only still the standard among e-sports players and extreme budget builds or laptops. Most people who are even somewhat serious about PC gaming have at least a 1440p monitor, and the fact is that anything under 12GB of VRAM just isn't enough for even the near future, much less the 4-5 years most people keep their graphics cards. If you paid more than $300 for an 8GB card recently, then you got fucking scammed. (I would know; I unfortunately bought a 3060 Ti a year ago instead of a 6700 XT.)

    • @r3tr0c0e3
      @r3tr0c0e3 Год назад

      @@gagec6390 1440p, even 1080p, looks playable on OLED; upscaling made it possible.
      I'm never going back to a garbage TFT panel.

  • @rebelblade7159
    @rebelblade7159 Год назад +40

    I remember buying the GTX 960 4GB in 2015 for the equivalent of about $200 brand new. That amount of VRAM was considered overkill by many, but it let me use the card all the way up to 2021. VRAM matters a lot if you want to keep a GPU for a long time.

    • @FatheredPuma81
      @FatheredPuma81 Год назад +3

      At the time you gained almost nothing for the extra money, and now you're gaining around 15% extra performance for it.
      Just put the savings into a savings account, wait a few years, sell your 960 2GB, and get a 970 3.5GB for the exact same total cost.

    • @jacobhargiss3839
      @jacobhargiss3839 Год назад

      ​@@FatheredPuma81 that assumes the price actually does drop and you can find the cards.

    • @FatheredPuma81
      @FatheredPuma81 Год назад

      @@jacobhargiss3839 Always has, always will.

    • @FatheredPuma81
      @FatheredPuma81 Год назад

      @@DeepfriedBeans4492 Looking at the Wayback Machine, it was around $40 more. Toss that into a half-decent savings account (not your local garbage bank) and it turns into $44 minimum in 5 years.
      The GTX 970 was under $100 just before the mining craze, and I'd guess the GTX 960 2GB was at the very least above $55. I actually sold a GTX 960 that summer and bought an RX 580 8GB for $120, though I can't remember how much I sold it for. (Sold the RX 580 for $120 a year later though, lol.)
      Sucks to be you, I guess, if you're too terrified of potentially being mugged at a McDonald's in broad daylight, with people around, over $60. Sounds like you live in Ghetto-Siberia or something; I'd suggest moving.
      P.S. Do Ghetto-Siberian shippers not let you pay $2 to print a label? Do Ghetto-Siberians not order things online and have loads of boxes and packing materials lying around? Does Ghetto-Siberian eBay not give you free shipping for broken items?

    • @DeepfriedBeans4492
      @DeepfriedBeans4492 Год назад

      @@FatheredPuma81 Please tell me what savings account you use that gives 110% returns in 5 years, because the only kinds of accounts I know of with that much potential are not what I would call 'savings accounts', most certainly do not come without large amounts of risk, and are also called Ponzi schemes, which are illegal.

  • @TheSocialGamer
    @TheSocialGamer Год назад

    I'm really feeling this young techy. Smart and a cutie 😅 I've yet to see any issues with my 24 GB of GDDR6X.

  • @goblin3810
    @goblin3810 Год назад

    Whose stream or video did you keep cutting to?

  • @c523jw7
    @c523jw7 Год назад +5

    You bring up some good points here. All that matters is your personal experience and your card fitting your purpose. 10 GB has been more than enough in the games I play, and I'm really happy with my experience. Now, I do think Nvidia sucks for not giving their cards enough VRAM; no excuse for that. A 4-year upgrade cycle seems about right; it's just a shame VRAM usage has really spiked in the last few games. Best to future-proof any card purchase moving forward, though.

  • @MrGalax00
    @MrGalax00 Год назад +3

    I wanted a 3070 or 3060 Ti for my HTPC but ended up getting a 6700 XT because it was cheaper and had more VRAM. On my main PC I'm using a 3090 FE, so I don't have to worry about VRAM usage.

  • @hyxlo_
    @hyxlo_ Год назад +2

    Devs are not optimizing their games and we are blaming GPU manufacturers 🤦‍♂️

  • @StonnedGunner
    @StonnedGunner Год назад

    What if it were possible to upgrade the VRAM on GPUs?

  • @16xthedetail76
    @16xthedetail76 Год назад +5

    My GTX 980 Ti will hopefully keep going for another 2 years...

  • @simon6658
    @simon6658 Год назад +7

    It's always good to force GPU makers to add more VRAM.

  • @capnmoby9295
    @capnmoby9295 Год назад +4

    It's probably the devs becoming more and more complacent and the games becoming more and more complicated.

  • @nekumanner6605
    @nekumanner6605 Год назад

    What's the song in the RE gameplay from the Daniel Owen video?

  • @j.rohmann3199
    @j.rohmann3199 Год назад +4

    I've actually never had problems so far with my 3060 Ti... it does amazing for me at 1080p and decent at 1440p. I will still be using it in like 4 years (if I live that long).

    • @j.rohmann3199
      @j.rohmann3199 Год назад +1

      @@VulcanM61 Damn, I was going to do the same thing!
      From a 5600X to a 5800X3D... but maybe I'll just go for a 5900X instead. I haven't decided yet.

    • @j.rohmann3199
      @j.rohmann3199 Год назад +1

      @@VulcanM61 Epic!
      Yeah, the X3D versions are crazy good. And pretty future-proof!

  • @Ben256MB
    @Ben256MB Год назад +12

    I don't think it's a problem, because games are graphically more realistic and the texture sizes are bigger too.
    I remember the Unreal Engine 5 tech demo on PS5; I knew then that 1440p might consume all 8 GB or more.
    And let's keep it extra real: in most games there's very little difference between ultra and high settings.
    Just turn the settings down to high at 1440p or 4K on an 8 GB card. People are too sensitive!!

    • @OffBrandChicken
      @OffBrandChicken Год назад +3

      Or just add more VRAM and don't have the issue to begin with.
      You see, I'm not even remotely worried about my card anytime soon, and I'll be able to run my games at higher settings than on Nvidia because of it.
      I can crank up settings without issue.
      I just find it ironic that a supposedly worse-performing GPU is performing better now.

    • @gruiadevil
      @gruiadevil Год назад +4

      You can't ask people to crank down settings.
      I paid the nVidia Tax.
      I paid the 3000/4000 Series Tax.
      I expect to play a game at the highest possible settings at the resolution of its tier.
      You can't charge extra and deliver less just so people buy another GPU in 2 years' time, because you want to sell more GPUs.
      It's the same mentality General Motors had in the '70s-'80s, when they started making cars that break down in 10-15 years. And everyone followed suit.
      If you buy a BMW made before 2005, it's a tank.
      If you buy any BMW made after, it's going to start breaking piece by piece.

    • @Ben256MB
      @Ben256MB Год назад

      @@OffBrandChicken Lol, VRAM can't just be added; the chips are soldered onto the GPU's PCB!!

    • @Ben256MB
      @Ben256MB Год назад

      @@gruiadevil Lol bruh!! Because you bought a low-end or mid-range GPU, you don't get the same benefits as someone who paid $1,700 for a 4090.
      You have the choice of buying an AMD GPU, which has more VRAM for less money at slightly lower performance.

    • @OffBrandChicken
      @OffBrandChicken Год назад +1

      @@Ben256MB Are you serious? I'm saying add more VRAM to begin with. How was that hard to understand?

  • @calibula95
    @calibula95 Год назад

    Wait, I know this will be a problem for all cards eventually; it's just a matter of time.
    But if I play at 1080p, how likely is it to be a problem in the short term? At 1440p (the target for x700 and xx70 cards) I know 8 GB is too little, but what about lower resolutions?

  • @andreigroza12
    @andreigroza12 Год назад

    I also wanted to get a 4070 Ti a bit after its release, but thinking long term, it has just 12 GB of VRAM, so that might be limiting in new games if I want all the fancy stuff in a year or two. So I decided to go for the RX 7900 XTX despite spending a bit more.

  • @Sybertek
    @Sybertek Год назад +6

    No regrets with the 6800XT.

  • @ItsFreakinJesus
    @ItsFreakinJesus Год назад +4

    Adjust your settings and it's a manageable issue, even with AAA games. Shadows and lighting have massive VRAM hits with little to no visual difference at the higher settings, for example.

  • @DonreparD
    @DonreparD Год назад +1

    I noticed the Obsidian dev that was on MLID wasn't complaining.
    It's typically the VFX developers who want more VRAM, in the hope that more people will be impressed with their work on the explosions and fire effects. As far as I'm concerned, those developers would be happier working for _Scanline VFX_ .
    If people want to be "immersed" in visual effects, watch a movie or play one on a PlayStation.
    *This is just my opinion. I feel like the "EVERYONE & EVERYTHING NEEDS MORE VRAM" narrative is getting a bit ridiculous.

  • @cartm3n871
    @cartm3n871 Год назад

    I got an all-AMD laptop a few months ago, and it blows my mind that my 6800M has more VRAM than both of the cards you were testing. How is that even possible?

  • @ComradeChyrk
    @ComradeChyrk Год назад +6

    I have a 3070 and I was concerned at first about the 8 GB of VRAM, but so far I haven't had any issues. I play at 1440p, but I was never really interested in things like ray tracing or playing at ultra settings. As long as I can play 1440p at 100 fps, I'm happy with it.

    • @David-ln8qh
      @David-ln8qh Год назад

      I bought my 3070 for $1,000 deep into the card-pocalypse, when 3080s were in the $1,500-$1,600 range. I'm frustrated about the situation, but I still feel like I didn't really have many options and was probably better off pocketing that $500-600 for my next card, which I'm hoping is at least a couple of years out.
      For the record, I also play at 1440p at 80-120 fps and haven't yet run into problems.

    • @ComradeChyrk
      @ComradeChyrk Год назад

      @@David-ln8qh I'm glad I waited, because I got my 3070 at sub-$600. I was holding out with a 970 until the prices dropped. I got my 3070 about a year ago.

    • @Trainboy1EJR
      @Trainboy1EJR Год назад

      @@ComradeChyrk Still, wouldn’t it have made more sense to go with a 16gb AMD card if you weren’t going to bother with Ray tracing?

    • @ComradeChyrk
      @ComradeChyrk Год назад

      @@Trainboy1EJR I wanted DLSS. Plus it was in my price range. The AMD equivalent (6700 XT) was roughly the same price but didn't perform as well.

  • @StraightcheD
    @StraightcheD Год назад +4

    10:33 It obviously depends on what you want to play, so I think you're technically right. I think people worry because nobody likes to end up in a niche by accident and be told that the game they want to play next just happens to be one of the dozen titles to avoid on their $600 card.

    • @dashkataey1740
      @dashkataey1740 Год назад +1

      This. When you spend that much on a card, you kind of expect it to last you a few years and be able to handle new titles for a while.

    • @r3tr0c0e3
      @r3tr0c0e3 Год назад

      @@dashkataey1740 The problem is it's not 2016 anymore, and the big corps don't care (if they ever did); they'll just double down at this point.
      Consoles might be the best option for AAA players. Games will run like a turd, but at least you didn't pay $1,000 instead of $500 for a fake-4K 30 fps experience. All this DLSS and FSR is fake resolution and fake frames at this point.
      Everything else can be played on a potato anyway.

  • @eldraque4556
    @eldraque4556 Год назад

    I'm getting messages playing RDR2 on my 3060 Ti that I've run out of VRAM. The stream sounds good. I'm in the UK though; are we talking East Coast night time?

  • @bouxesas2046
    @bouxesas2046 Год назад

    I put on the same settings as you in the RE4 remake with a 3080 FE, and it worked fine until the village fight. Then it was a constant crash (D3D crash, error 25) until I lowered the texture setting to 2 GB.

  • @DeadlyKiller54
    @DeadlyKiller54 Год назад +7

    Just seeing this makes me super glad I got my 6700 XT for $360, with the 12 GB of VRAM it has. Yeah, not an Nvidia card, but she still performs well and streams decently.

    • @Trainboy1EJR
      @Trainboy1EJR Год назад

      Seeing this makes me even happier to have a $240 12 GB RTX 2060. XD Although it is almost certainly going to be my last Nvidia card; Intel is looking super good with Arc right now. Hopefully they teach Nvidia the lesson AMD hasn't been able to. And with EVGA out of the scene, I completely understand not wanting to touch the 40xx series! Honestly, I'm surprised more board partners didn't "nope" out of this generation. XD

    • @GrainMuncher
      @GrainMuncher 2 месяца назад

      @@Trainboy1EJR VRAM doesn't matter if the card is too weak to even use it. There's a reason the 3060 Ti 8GB destroys the 3060 12GB.

    • @Trainboy1EJR
      @Trainboy1EJR 2 месяца назад

      @@GrainMuncher Destroyed? HA! 65 fps vs 71 fps is just the 192-bit bus vs the 256-bit bus. I've learned to go with the card with the most VRAM and have never been disappointed: 1 GB GT 220, 2 GB GT 640, 4 GB 1050 Ti (laptop), 12 GB RTX 2060. Let me repeat that: NEVER DISAPPOINTED!!! I will never sacrifice textures to play a game, because textures have zero impact on performance if you have the VRAM for them.

  • @Hakeraiden
    @Hakeraiden Год назад +5

    At this point I'm scared to buy the 4070, which will be released soon. Not sure if I should wait for the 7800 XT or 7700 XT reviews from AMD. I recently did a complete upgrade of my PC after more than 9 years; I finally want to play games and not rely on streaming (which is still awesome as a stand-in). For now I'll play at 1080p, but I'm considering upgrading to 1440p in a year or less.

  • @WebbTech1
    @WebbTech1 Год назад

    I think we're "very much present" here in 2023 on the VRAM issue, which I have commented on in various tech channels (Vex, glad I discovered you, I dig your content!). This issue is not going away. To keep my comment short (which I don't do very often... lol): I have a 3070 (Asus TUF Gaming OC) and am looking very seriously at upgrading to the 7900 XT (20 GB VRAM), which is around the same price as the 4070 Ti (12 GB VRAM) that I was initially interested in. I want to play ANY game at ANY resolution without issues!

  • @nyrahl593
    @nyrahl593 Год назад +2

    The EVGA 3090s had their VRAM in clamshell mode, and it's frustrating, because those cards clearly did not have issues running their VRAM that way. So I really wonder how much more it would cost vendors to double the VRAM on current chips, since the x070 series clearly must support up to 16 GB (JEDEC standards and all), and the x080s 20 GB. (See the sketch below for the arithmetic.)
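
    The capacity arithmetic behind clamshell is simple. A minimal sketch, assuming 1 GB GDDR6/6X packages (each package has a 32-bit interface, so package count is fixed by bus width, and clamshell puts two packages on each 32-bit channel):

      # Max VRAM = number of memory packages * density per package.
      def max_vram_gb(bus_width_bits: int, module_gb: int, clamshell: bool) -> int:
          packages = bus_width_bits // 32 * (2 if clamshell else 1)
          return packages * module_gb

      print(max_vram_gb(256, 1, False))  # 3070-style 256-bit bus: 8 GB
      print(max_vram_gb(256, 1, True))   # same bus in clamshell: 16 GB
      print(max_vram_gb(384, 1, True))   # 3090-style 384-bit clamshell: 24 GB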

  • @OffBrandChicken
    @OffBrandChicken Год назад +4

    You know, the only people who try to even remotely justify 8 GB nowadays, even on lower-end graphics cards, tend to be Nvidia users.
    You notice that? I think some people are coping, hoping they didn't make a bad decision going with Nvidia this time.

    • @gruiadevil
      @gruiadevil Год назад

      The amount of copium is high in these comments :))
      I noticed that too.
      And they're not just nVidia users. They're nVidia users who bought a 3070-3080 card during the boom. Of course they're going to lower settings; they're still paying off the loan they took to purchase that card :)))

    • @OffBrandChicken
      @OffBrandChicken Год назад +1

      @@gruiadevil It's crazy how predictable their responses are. Like "most top Steam games don't even require it."
      As if people don't realize that maybe the reason those games are still the most played is that players' graphics cards can't handle more.
      Most "gamers" are gaming on laptops/prebuilts with lower-end graphics.
      People who are building/upgrading are doing so with the intent of playing modern games, because you wouldn't spend that kind of money otherwise.

    • @David-ln8qh
      @David-ln8qh Год назад

      Don't you need 8gb to say 8gb works for you?

  • @StuffIThink
    @StuffIThink Год назад +5

    Just got a 6650 XT. I don't really care whether I can run games on ultra in the future. As long as it holds out a few years playing on medium or higher, I'll be happy.

    • @gruiadevil
      @gruiadevil Год назад +4

      Yes. But you bought it cheap.
      Look at how much a 3070, 3070 Ti, 3080, or 3080 Ti costs. Those are the ones discussed here, not the cheap products.
      When you buy cheap, you tell yourself: "If I can play new games at a combo of medium/high settings and enjoy the game, I'm satisfied. If it lasts me the next 2-3 years, I'm satisfied."

    • @StuffIThink
      @StuffIThink Год назад +6

      @@gruiadevil He asked people with 8 GB cards what they thought; I'm just answering his question.

    • @vanquishhgg
      @vanquishhgg Год назад

      Haven't had any issues with my 6650 XT Hellhound. It will last me another year until I do a full rebuild.

  • @heavenpiercer5095
    @heavenpiercer5095 Год назад

    Can anyone tell me in layman's terms why 8 GB of VRAM in a GPU is bad? I thought more = better??? I'm new to this stuff and trying to buy a new card; any info is appreciated, thanks.

  • @junglejim101
    @junglejim101 11 месяцев назад

    Quick question: if you're just looking for console-quality settings in terms of performance and fidelity, then surely getting VRAM equal to a PS5's would be safe long term? Surely this is just a matter of GPU vendors lagging behind the current-gen consoles. Or will the requirements keep increasing year by year no matter what?

  • @tylerstokes6722
    @tylerstokes6722 Год назад +3

    I don't see VRAM being a huge problem, but I do feel all new GPUs this generation should have been at least 12 GB, or 10 GB for entry-level.

  • @hyena-chase2176
    @hyena-chase2176 Год назад +7

    I find 12 GB of VRAM about right and would not go any lower if buying a new GPU in 2023. I play a few games like Rust that use about 10+ GB at 1440p, so the more VRAM the better, imo.

    • @Pimp_Shrimp
      @Pimp_Shrimp Год назад

      Jesus, Rust got chunky. I used to play it on a 970 (albeit at 1080p) just fine many years ago.

  • @7lllll
    @7lllll Год назад

    I use an old laptop, and I notice specific problems: low screen resolution, low CPU core count, and slow storage. Those are the problems; other things, such as the amount of RAM and GPU performance, are fine. This experience, and this VRAM fiasco, tell me that the specific weak point of the next laptop I buy will likely be low VRAM capacity.

  • @aliencatmeow
    @aliencatmeow Год назад

    Why do you need this much VRAM? What are you using it for, besides games?

  • @user78405
    @user78405 Год назад +3

    A lot of people said that adding an extra 16 GB of system RAM to a system with a 3070 fixed the majority of frame stuttering in The Last of Us and the RE4 remake. I noticed those games using 24 GB of system RAM... what a shocker. Now I know why the 3070 dips to 20 fps: with 16 GB, RAM bottlenecks the data being processed. With 32 GB it stops, there's no more shader loading during gameplay, and the minimum fps went up to 41.

    • @OffBrandChicken
      @OffBrandChicken Год назад +3

      When your computer can't use more VRAM, it uses system RAM. It's still a VRAM issue at the end of the day.
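
      If you want to watch it happen on your own machine, here's a minimal sketch (assuming an Nvidia card with nvidia-smi on the PATH) that reads how full the VRAM is; once it sits pinned near 100%, the overflow is going into system RAM:

        import subprocess

        # Query used and total VRAM (in MiB) from nvidia-smi's CSV output.
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()

        # One line per GPU; take the first card.
        used_mib, total_mib = (int(x) for x in out.splitlines()[0].split(", "))
        print(f"VRAM: {used_mib}/{total_mib} MiB ({used_mib / total_mib:.0%})")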

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Год назад

      My system can get up to 16 GB of VRAM usage, but only 8 GB comes from my RTX 3070. Games crash or run at low fps when VRAM is not enough, and in your case your VRAM was not enough and spilled into RAM xD

  • @aeneasoftroy9706
    @aeneasoftroy9706 Год назад +3

    Next gen is here, and that's it. Moving forward you're just going to need high-end hardware to play the latest games on PC. However, that doesn't mean you can't PC game; it just means you need to have proper expectations for your hardware.

  • @fluffythecandleeater6196
    @fluffythecandleeater6196 Год назад

    I have a question: would running 2 graphics cards help with the issue?
    Let's say I had a 3070 and a 1660, would the 1660 help the 3070?
    PS: idk how stuff like this works, so please tell me if something like this would even work.

    • @vextakes
      @vextakes  Год назад

      No. They don’t share VRAM it wouldn’t help. Might actually confuse the system which gpu is the main one

    • @fluffythecandleeater6196
      @fluffythecandleeater6196 Год назад

      I see

  • @rarigate
    @rarigate Год назад +1

    It would be interesting to know if the RTX 3060 surpasses the 3060 Ti or even the 3070 just because of its 12 GB of VRAM (I have a 3060 Ti).

    • @wielkoduszny7401
      @wielkoduszny7401 6 месяцев назад

      In some cases, yes: the extra VRAM means more fps when you max out the settings and the data fills the smaller card's VRAM; in those cases the 3060 can be faster.

  • @admiralcarrot756
    @admiralcarrot756 Год назад +9

    Meanwhile, the AMD 6000 series offers VRAM based on card tier: 6500 at 4 GB, 6600 at 8 GB, 6700 at 12 GB, 6800 at 16 GB, and the 6900 at 16 GB too.
    The Nvidia 3000 series, meanwhile: 3050 8GB, 3060 8GB, 3070 8GB, 3080 10GB, 3090 24GB. See the problem there?

    • @user78405
      @user78405 Год назад +2

      Having 8 GB can feel like having 12 GB in games, while 10 GB can feel like 16 GB. There's a big difference in methodology between conservative usage that doesn't tax your system heavily and developers who lazily take more VRAM to cover up a bad game's flaws, like Forspoken, which I find embarrassing for a "quality" AAA title. The Doom brand, by contrast, has always been carefully respected and still runs on all cards today, even 4 GB GPUs, thanks to the genius work Carmack put into id Tech. I wish every developer were like him and had the brains to make games. With Forspoken, I can tell the developer was rude and super lazy; one person like that can bring an entire team down with the bad energy they spread. It reads like employees who don't like their jobs.

    • @VDavid003
      @VDavid003 Год назад +1

      Actually, the regular 3060 is 12gb which is even weirder.

    • @silvershines
      @silvershines Год назад

      Overall the line-up isn't too weird once you realize the original plan included an RTX 3080 20 GB. But then the crypto boom happened, and Nvidia decided to chase sales volume for that sweet crypto cash. Better to produce more cards than a good product.
      The past few crypto booms (which were isolated to AMD cards) also showed that regardless of what you do, you will have to deal with a bunch of cheap second-hand cards cannibalizing your new card sales. So no matter what happens your company is going to be the baddie anyway, so you might as well raise the price and condition people to accept higher prices.

  • @afgncap
    @afgncap Год назад +3

    AMD's approach worked for me in the past, when I could not afford a top-shelf GPU and upgraded once every 6 years. Their cards have aged fairly well up to now. However, I agree that having a ridiculous amount of VRAM at the moment of purchase doesn't really help you. I now have a 7900 XTX and I doubt I will ever be able to use its 24 GB of VRAM before I upgrade.

    • @scythelord
      @scythelord Год назад

      I've already used the 24 gigs of VRAM my 3090 has. It isn't difficult to do.

    • @afgncap
      @afgncap Год назад

      @@scythelord In a gaming scenario, unlikely. For workloads, sure.

  • @Tzalmavet
    @Tzalmavet Год назад +2

    Definitely experiencing this issue with my RTX 3070 Ti. The GPU itself runs fantastic, but it's handicapped by lack of VRAM when playing games at 1440p. So I'm in this dilemma: do I sell my EVGA RTX 3070 Ti FTW3 and put the funds toward a Radeon 7900 XT, or do I hold out and hope Nvidia will actually offer sufficient VRAM on next-gen cards marketed for high-resolution gaming?

    • @devashishjn
      @devashishjn Год назад

      Same here. It sucks to have a 3070 with all the graphical prowess I need for 1440p being handicapped by VRAM limitations. I've decided to sell it and buy a 6800 XT instead.

    • @blissseeker4719
      @blissseeker4719 Год назад

      Do your games stutter when they're at high VRAM usage?

  • @mentalhell4846
    @mentalhell4846 Год назад

    If I run these games at 1080p with 8 GB of VRAM (3070 Ti), will I face any issues at that resolution?

  • @pointlessless2656
    @pointlessless2656 Год назад +3

    I would argue that Nvidia isn't a full gen ahead of AMD in ray tracing, more like half a gen if anything. The 7900 XTX isn't that far behind a 4080; the only card you could argue has a huge ray-tracing advantage is the 4090, at a $600 markup in price.

    • @debeb5148
      @debeb5148 Год назад

      Meanwhile, PS5s just play games.

    • @m.h.4907
      @m.h.4907 Год назад

      @@debeb5148 Yeah, at medium settings and limited to 30 fps. GG

    • @debeb5148
      @debeb5148 Год назад

      @@m.h.4907 Your shitty 8-year-old PC might be, but PS5s go up to 120 fps lmfao

  • @AntiGrieferGames
    @AntiGrieferGames Год назад +10

    Stop with the VRAM drama and tell the devs to optimize their AAA games like indie games do!

    • @naipigidi
      @naipigidi Год назад +5

      I'm putting you on the list of people with a brain.

    • @AntiGrieferGames
      @AntiGrieferGames Год назад

      @@naipigidi I don't get what that means, lmao.

  • @aliviper1075
    @aliviper1075 Год назад

    Should I get an RX 6800M 12 GB with a Ryzen 9 5980 (the ASUS Strix Advantage Edition) for $1,550, or the MSI GP66 with an i7-11800 and an RTX 3080 8 GB for $1,480? Any suggestions?

  • @soup-not-edible
    @soup-not-edible Год назад +3

    When I bought a 16GB RX 6800, I wasn't thinking much about its VRAM, but it turned out to be a godsend.
    I prefer having more RAM that's slower to less RAM that's faster.
    (Definitely laughing at Nvidia over the backlash from this "supposed" planned obsolescence.)

    • @r3tr0c0e3
      @r3tr0c0e3 Год назад

      Ironically, the 3080, with less VRAM, will still be ~30% faster than your 6800, and no amount of VRAM will change that lol.
      30 fps with stuttering is just as bad as 30 fps without; when it comes to that, these cards will just be e-sports cards anyway lol.