
Not Enough VRAM!!!

  • Published: 25 Mar 2023
  • Is 8GB of VRAM enough? In some recent games, the answer is a clear NO. I began by testing Resident Evil 4 Remake and noticed that the RTX 3070 Ti cannot even run the game without crashing unless textures are turned down to keep VRAM in check, which is a shame because the card clearly has enough performance to run the game maxed out; it is simply running out of VRAM. I then discuss other recent games that use a lot of VRAM, like Hogwarts Legacy and Forspoken. I also believe this will become even more of an issue over time as game developers start to target their 1440p max settings at the newer generation of cards like the 4070 Ti and 4070, which feature 12GB of VRAM.
    Test system specs:
    CPU: Ryzen 7700X amzn.to/3ODM90l
    Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
    Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
    RAM: 32GB Corsair Vengeance DDR5 6000 CL36 amzn.to/3u563Yx
    SSD: Samsung 980 Pro amzn.to/3BfkKds
    Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
    PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
    Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
    Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
    Mouse: Logitech G305 amzn.to/3gDyfPh
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.
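The out-of-VRAM behavior described above is easy to watch for on your own machine. A minimal sketch, assuming an Nvidia card with `nvidia-smi` on the PATH; the helper names are my own:

```python
import subprocess

def parse_vram_csv(csv_text):
    """Parse 'used, total' MiB pairs from nvidia-smi CSV output (one GPU per line)."""
    gpus = []
    for line in csv_text.strip().splitlines():
        used, total = (int(x.strip()) for x in line.split(","))
        gpus.append({"used_mib": used, "total_mib": total,
                     "pct": round(100 * used / total, 1)})
    return gpus

def query_vram():
    """Ask nvidia-smi for current VRAM usage (requires an Nvidia driver install)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_csv(out)

# Example: on an 8GB card near its limit, parse_vram_csv("7423, 8192")
# reports roughly 90% of VRAM in use.
```

Polling this in a loop while playing shows whether a stutter or crash coincides with the card hitting its VRAM ceiling.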

Comments • 3.2K

  • @danielowentech
    @danielowentech  1 year ago +91

    I wanted this video to be a bit more general than just RE4, but I made a follow up video showing more details about the crashing in RE4, best ways to fix it, more details on what the texture settings mean and how they work, etc: ruclips.net/video/8fMB62F9_7I/видео.html

    • @tommypearson9260
      @tommypearson9260 1 year ago +2

      I agree with this game not needing features like RT or DLSS turned on; it's amazing to me that people want them when at 1080p-1440p this game still runs over 160fps.
      I tried to say this in a live stream but the person didn't understand; I was only talking about this specific use case, RE4, not every game on the planet lol.........

    • @chun1324
      @chun1324 1 year ago +2

      Capcom be like: you could lower the settings to use less VRAM, or you can always fix the issue with a 4090 🎉

    • @VMaster1997
      @VMaster1997 1 year ago +2

      I also think developers are really lazy about optimizing their games, because I can't really tell if a game that needs more VRAM looks better than a game from 4 years ago that needs less. I think they can do a lot with optimization on the VRAM side.

    • @alephnole7009
      @alephnole7009 1 year ago +1

      Try turning ray tracing off and shadow quality down.
      Then push the texture quality up and see how that works, because DF testing showed it's RT that's the issue. Something breaks with RT and texture quality above 4GB at the same time.
      They also mention shadows affecting the crashes.

    • @alephnole7009
      @alephnole7009 1 year ago +1

      @@ConfinedSpiral-xy8qz Yeah, it's the ray tracing that's actually causing crashes. People can go into the red on VRAM without it and not have any issues.

  • @winj3r
    @winj3r 1 year ago +984

    Nvidia knows very well what they are doing by offering GPUs with such low VRAM.
    It's a good way to force people to upgrade and spend more money, much sooner.

    • @AAjax
      @AAjax 1 year ago +68

      Entirely this. As Jensen said, Moore's Law is dead, so how else is Nvidia going to get everybody on-board for a 50 series upgrade? Especially since CPUs are now bottle-necking the top end GPUs.

    • @LarsH0NEYtoast
      @LarsH0NEYtoast 1 year ago +30

      Yeah, and it worked lol
      I played the demo with my 3060 Ti and it struggled to maintain 1440p 60fps in the demanding areas, so I ended up feeling compelled to upgrade. I got a 4070 Ti and now I'm running the game at 1440p 120fps with even higher textures. To think there are rumors of the new 4060 still coming out with 8GB of VRAM is ridiculous.

    • @Toxicin2
      @Toxicin2 1 year ago +80

      @@LarsH0NEYtoast News just in: gamers get played and even pay for it.
      That's simping, my son.

    • @Hombremaniac
      @Hombremaniac 1 year ago +22

      It's actually an ingenious move from Nvidia. You manufacture quite potent GPUs, but you gimp them with not enough VRAM. Nvidia still sold those for extra-high prices (and saved costs on VRAM) and used DLSS as a sort of excuse for the low VRAM and demanding ray tracing. They seem to be continuing this trend, since the 4070 Ti has a measly 12GB and, in order to produce good FPS with ray tracing on, you are yet again forced to use DLSS.

    • @gameordeath9464
      @gameordeath9464 1 year ago +13

      @@LarsH0NEYtoast The 4060 being rumored to have just 8GB should be a shame to Nvidia; AMD is out here with the 6700/6750 XT, 10GB on the non-XT and 12GB on the XT, which is just crazy to me. I surely hope they don't actually give it 8GB, as it will definitely be out of the "trend" soon with these new, very demanding games coming out. I'm just glad I game on my AMD card PC and keep my Nvidia for Adobe Suite creator stuff. It's just so dumb, and they know people will still buy it because it's just NVIDIA. I've seen people on Reddit talk about how their 3080/3080 Ti, with 10GB on the non-Ti and 12GB on the Ti, are struggling.

  • @emeritusiv1366
    @emeritusiv1366 1 year ago +2569

    The fact that they gave 12GB to the 3060 instead of the 3070, 3070 Ti, or 3080 is mind-blowing.

    • @PeterPauls
      @PeterPauls 1 year ago +901

      The original RTX 3060 has a 192-bit memory bus and each memory module uses 32 bits, so 192/32 = 6: you can fit 6 modules. Nvidia decided to go with 6x2GB modules because 6GB would have been too little. The RTX 3060 Ti, 3070, and 3070 Ti have a 256-bit memory bus, which means 256/32 = 8, so they just went with 8x1GB modules, and here was the biggest mistake IMO: they should have put 2GB modules on those cards, so the 3060 Ti, 3070, and 3070 Ti would have been 16GB cards. As for the RTX 3080, which has a 320-bit memory bus, 320/32 = 10, so they went with 10GB; but since they released the RTX 3080 Ti with a 384-bit bus, and 384/32 = 12, that one got 12GB. They F'd up their lineup. It should have been: RTX 3090 24GB, RTX 3080 20GB, RTX 3070 Ti/3070/3060 Ti 16GB, RTX 3060 12GB, and RTX 3050 8GB. I hope my logic is understandable even though English is not my best.
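      The bus-width arithmetic above can be sketched in a few lines (my own illustration; each GDDR6/6X chip sits on a 32-bit channel and ships in 1GB or 2GB densities):

```python
def vram_options(bus_width_bits, densities_gb=(1, 2), bits_per_chip=32):
    """Possible VRAM capacities for a given memory bus width.

    The bus width fixes the chip count (one chip per 32-bit channel);
    capacity then depends only on the chip density chosen.
    """
    chips = bus_width_bits // bits_per_chip
    return {d: chips * d for d in densities_gb}

# The Ampere examples from the comment: 192-bit -> 6GB or 12GB (RTX 3060),
# 256-bit -> 8GB or 16GB (3060 Ti/3070/3070 Ti), 320-bit -> 10GB or 20GB (3080).
```

      This is why the 3060 Ti/3070/3070 Ti could only have been 8GB or 16GB cards: any in-between figure would require a different bus width or mixed densities.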

    • @od1sseas663
      @od1sseas663 1 year ago +308

      @@PeterPauls Stop with the excuses. We know how it works with the bus thing. Just change the bus and put 12Gigs of VRAM or just make it 16.

    • @DanielOfRussia
      @DanielOfRussia 1 year ago +220

      @@od1sseas663 You do realize changing the bus impacts the performance as memory bandwidth either gets higher or lower?

    • @od1sseas663
      @od1sseas663 1 year ago +26

      @ロシアのダニエル That's why I said "just put 16Gigs". Same bus.

    • @od1sseas663
      @od1sseas663 1 year ago +22

      @Blue Yeah, that's why I said "or just put 16Gigs".

  • @Bestgameplayer10
    @Bestgameplayer10 1 year ago +276

    I tried telling people early on that this was going to be an issue very soon. This is part of the reason why I'm on AMD. My 6800XT has 16GB of VRAM.

    • @agusvelozo3189
      @agusvelozo3189 1 year ago +3

      I was thinking about buying a 6800; what model did you buy? I'm thinking about the Phantom Gaming.

    • @Bestgameplayer10
      @Bestgameplayer10 1 year ago +8

      @@agusvelozo3189 I just have the AMD Radeon (the reference model). I didn’t get an AIB this time around.

    • @agusvelozo3189
      @agusvelozo3189 1 year ago +2

      @@Bestgameplayer10 and is the performance good?

    • @Bestgameplayer10
      @Bestgameplayer10 1 year ago +2

      @@agusvelozo3189 I suppose so. I haven’t looked into the performance of the other brand 6800XTs to know for comparison but I ain’t had no performance issues.

    • @agusvelozo3189
      @agusvelozo3189 1 year ago

      @@Bestgameplayer10 Nice, and driver-wise, did you have any problems?

  • @Hypershell
    @Hypershell 1 year ago +932

    Well, I have a 3070, and Nvidia is doing their best to make sure my next upgrade is to AMD.

    • @83Bongo
      @83Bongo 1 year ago +82

      You only have yourself to blame. You knew the card had 8GB when you bought it.

    • @Mike0200000000
      @Mike0200000000 1 year ago +21

      My 3070 handles all games perfectly at FHD and QHD. If I have to lower textures from ultra to high, it still runs great. Ray tracing and DLSS make for nicer graphics, and the card is better than the 6700 XT.

    • @Hypershell
      @Hypershell 1 year ago +86

      @@83Bongo Yes, I did. I'm happy with it, I'm just saying it's something to think about in the future.
      I'm more a jack-of-all-trades user than a hardcore gamer, so Nvidia having the better feature set was a draw (especially that NVENC encoder). But AMD is getting better and Nvidia is habitually going nuts with the power draw lately, so I didn't expect to be sticking to one brand forever.

    • @realpoxu
      @realpoxu 1 year ago +71

      @@Mike0200000000 Well, the 6700 XT can run this at 1440p and max settings + RT at over 60 FPS, and FSR 2.1 does indeed look good. 3070 < 6700 XT.

    • @Steel0079
      @Steel0079 1 year ago +23

      Same here. Once I got the 3070, I started running every game at max. The games that give me problems are the ones needing more VRAM. The sad part is that these games run out of VRAM at 1080p. FFS.

  • @wakingfromslumber9555
    @wakingfromslumber9555 1 year ago +817

    NVIDIA: “We have ray tracing, path tracing, DLSS 3.0, but you need to pay 1200 dollars for it...”
    Consumers: “Can we have high-quality textures too?”
    Nvidia: “No, you need to pay at least 1500 bucks for that.”
    Consumer: “But I don't need all this ray tracing stuff.”
    Nvidia: “I don't care, you are having it.”

    • @arfianwismiga5912
      @arfianwismiga5912 1 year ago +16

      That's why Nvidia gives you DLSS.

    • @Mike80528
      @Mike80528 1 year ago +70

      Nvidia: "And tons of AI cores that you also don't really need, but our profitable endeavors require."

    • @winj3r
      @winj3r 1 year ago +96

      The funny thing is that RT also requires more VRAM. So people will eventually have to turn RT off because of VRAM limitations, despite the GPU being capable of RT.

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 1 year ago +7

      Not really. If you disable ray tracing, it uses much less VRAM.

    • @dreamcrabjr7408
      @dreamcrabjr7408 1 year ago +83

      That's why you get an RX 6800 XT with 16GB of VRAM 😎

  • @mirzaaljic
    @mirzaaljic 1 year ago +35

    After making a huge mistake of buying a 3GB version of GTX 1060 back in the day, I said that the first thing I'll be looking for in the next graphics card will be VRAM. Which is why I went for AMD this time around.

    • @happybuggy1582
      @happybuggy1582 1 year ago

      Agreed, with tears, from a 2GB GTX 960.

    • @Shatterfury1871
      @Shatterfury1871 10 months ago +1

      Welcome to team RED, we have cookies, enough VRAM, raw raster and a bit worse ray tracing.

    • @Crazical
      @Crazical 10 months ago +1

      @@Shatterfury1871 And the funky niche things like RSR and AFMF, yeahhh

    • @zazoreal5536
      @zazoreal5536 9 months ago

      GTX 1060 3GB club. I have it running in my backup PC.

    • @jagjot1697
      @jagjot1697 8 months ago +1

      I'm in the same boat. Yet another disappointed owner of the 1060 3GB. I'm thinking of going with the RX 6750XT now

  • @RAaaa777
    @RAaaa777 1 year ago +30

    The patch just automatically lowers the textures so that the game doesn't lag.
    If you pay attention, in some areas the textures change.

  • @Ladioz
    @Ladioz 1 year ago +390

    Running out of VRAM is probably the saddest thing ever. Having a fast, expensive card with every component in the build up to date and on point, and you have to worry about a VRAM meter in game........

  • @giuseppeabate7342
    @giuseppeabate7342 1 year ago +169

    Missed the opportunity to name the video "Not Enough VRAM, Stranger!"

    • @nishant5655
      @nishant5655 1 year ago +6

      🤔🤔

    • @maximmk6446
      @maximmk6446 1 year ago +6

      Lmfao mate 😂

    • @rohanchooramun7288
      @rohanchooramun7288 1 year ago +5

      Good one... lmao🤣🤣

    • @gamingbillyo333
      @gamingbillyo333 1 year ago

      Broo Thank You Giuseppe ! Gave Me The Title Name For My Next Video ! My 4080 16 Gb Was Upgraded To A 4090 24Gb Play This With Everything Maxed Out ! Not A Proud Moment Supporting Nvidias Insane Prices Twice ! Smh

  • @stusen6153
    @stusen6153 1 year ago +203

    While people slate Nvidia (and rightly so), a big part of this issue is the unoptimized mess that companies release just to meet deadlines. The recent Steam survey shows that the majority of people are still on 8GB or less of video RAM. Having anything below 4K require more than 8GB of VRAM is just poor work by the devs. As is often the case, VRAM issues are fixed by patches down the line, so in most cases they are entirely avoidable. Both manufacturers and game studios need to do better!

    • @francescobenelli2707
      @francescobenelli2707 1 year ago +3

      Exactly

    • @MelindaSordinoIsLiterallyMe
      @MelindaSordinoIsLiterallyMe 1 year ago

      Two points:
      - If you buy an Nvidia GPU, suck it. No seriously, if you buy one and then complain that you are "forced" to upgrade to a new overpriced Nvidia GPU, you absolutely deserve it, sucker.
      - I do agree that Nvidia has always skimped on VRAM, but I also don't think it is that serious. Everyone references The Last of Us Part I regarding that VRAM topic. That's basically the only big example and The Last of Us Part I is an awfully unoptimized PC port. Shader caching would lower the VRAM need by a lot. I don't want to play that game, so I don't f***** care either way.

    • @remmykun8315
      @remmykun8315 1 year ago +52

      The real problem is that consoles are, and have always been, the priority in the gaming industry. And since they are, as a PC gamer you should always pay attention to how those consoles are built in order to decide which PC parts to choose. The PS5 and Xbox Series X have 8 CPU cores and 16 gigs of unified memory. So if a company like Nvidia releases a new product with just 8 gigs of VRAM, they know it won't age well. And sometimes (depending on price, or because we cannot wait for whatever reason) they force you to buy a product with clear planned obsolescence. But if you have the option to skip the generation, go with the competition, or ultimately buy the "halo" product (24 gigs), then maybe you should do it, based on the new consoles' specs that tell you how the games are going to be... sooner rather than later.

    • @waifuhunter9709
      @waifuhunter9709 1 year ago +5

      @@remmykun8315 Really good mindset.
      I will remember your advice for my next purchase.

    • @noctus9731
      @noctus9731 1 year ago +8

      RE4R is greatly optimized and the 8GB still can't handle it mate.

  • @ookiiokonomiyaki
    @ookiiokonomiyaki 1 year ago +5

    Some folks on the internet complained about the small VRAM pool on Ampere RTX cards, but 95% of gamers claimed 8 gigs was enough and thought they were just hating. Now, with developers adapting to 9th-generation consoles, it seems 8 gigs is in fact not as future-proof as we had assumed.

    • @infini_ryu9461
      @infini_ryu9461 7 months ago

      Lol. Yeah. Not only are games so unoptimized for PC, the graphics cards don't have enough VRAM anymore.
      I mean, I have a 4090, so it's never gonna be a problem, but you shouldn't need that to play a game your graphics card could run normally if allowed to.
      Honestly, I think people are taking too much advice from 720p/1080p high fps competitive gamers. We all know to get beautiful games you're gonna want way more VRAM. 12GB will not be enough very soon.

  • @DrearierSpider1
    @DrearierSpider1 1 year ago +159

    Nvidia has a history of this. Recall that the 780 Ti launched for $700 at the same time as the PS4 & XB1. While that card is substantially more powerful than the 8th-gen consoles, it didn't have the VRAM to keep up. In 2013/2014, developers were still making games for cross-gen platforms, which meant they were still constrained by the 512MB of RAM in the 360/PS3, so the 3GB wasn't initially a problem. While most 8th-gen & PC versions of games featured higher-quality textures, those earlier games could still work within a 3GB limit.
    But once games started being developed for the 8GB consoles exclusively, the VRAM limits on Kepler really became an issue. We're in the same boat with Ampere, and probably the "midrange" Lovelace models if rumors are to be believed.
    The Radeon R9 390 launched 8 years ago with 8GB of VRAM for $330. It's long past time 8GB was a thing of the past for all but display-adapter-tier GPUs.

    • @frippyfroo6064
      @frippyfroo6064 1 year ago

      I upgraded to a 3070 ti from a GTX 660 lmao I have 0 complaints

    • @therockwizard.9687
      @therockwizard.9687 1 year ago +12

      @@frippyfroo6064 Should have bought a 6800

    • @andersjjensen
      @andersjjensen 1 year ago +5

      @@frippyfroo6064 If you're on 1080p it's not going to be a problem for the foreseeable future. If you're on 1440p you're going to have to adjust settings in almost every new title by the end of this year. If you're on 4K using the highest DLSS quality to get there, it's going to be the same story. If you're trying for 4K native you'll already be adjusting settings, because the card lacks both the grunt and the VRAM.

    • @Ben256MB
      @Ben256MB 1 year ago +6

      I remember when the Radeon VII came out with 16GB for 699, a lot of Nvidia fans were dissing the card, but now the Radeon VII is actually going to age well.
      An 8GB VRAM GPU shouldn't cost 400+ dollars at all, but Nvidia is so selfish.
      I just wish AMD priced their GPUs cheaper; then they would have a big market share.

    • @od13166
      @od13166 1 year ago +2

      Back then, people pointed out the R9 290X had 4GB and didn't have that issue.
      And because the 780 Ti is missing several DirectX features, it has aged horribly.

  • @Burdman660
    @Burdman660 1 year ago +326

    I skipped last generation because the 3070 only had 8gb of VRAM. I surpass 8gb VRAM in A LOT of titles. I ended up picking up the 7900XT, and it has been fantastic. Middle finger to you Nvidia!

    • @Hombremaniac
      @Hombremaniac 1 year ago +28

      Some folks (me included) just recently grabbed an RX 6800 XT for a good price and are going to ignore the current gen. Also a feasible solution, and a more budget-wise one I guess, since previously we had covid/crypto-inflated prices and only now are they slowly coming down.

    • @alsoyes3287
      @alsoyes3287 1 year ago +1

      How are the temps and the power draw? I was considering 7900xt, but it seems to have power draw issues with multiple monitors, draining 90w when idle at times.

    • @likeagrape57
      @likeagrape57 1 year ago +8

      I got a 3070 in early 2021 but ended up upgrading to a 7900xt just recently because I was tired of vram constraints. You made a good choice skipping the 3070

    • @oxfordsparky
      @oxfordsparky 1 year ago +1

      7900XT is such garbage value, literally bought the worst option.

    • @likeagrape57
      @likeagrape57 1 year ago +15

      @@oxfordsparky that's true. But at least it's garbage value that should last a while.

  • @OMaMYdG
    @OMaMYdG 1 year ago +61

    As a 3070 Ti owner myself, I'm starting to see this happen in certain games. It's very frustrating, but of course there are ways around it.

    • @webtax
      @webtax 1 year ago

      Could you summarize some tips to get around this, so I know what to look for when searching?

    • @OMaMYdG
      @OMaMYdG 1 year ago +9

      @webtax Basically you just have to mess with some of your graphics settings. Turn down the resolution or ray tracing.

    • @nape_soot_asd
      @nape_soot_asd 1 year ago

      @@OMaMYdG What does it look like when you don't use ray tracing
      and don't play at 4K?

    • @gonlyhlpz
      @gonlyhlpz 1 year ago +2

      The best option would be to use DLSS balanced or performance, or FSR 2.0; you can't run native 4K with this card. I have been using this card for more than a year, and it's a solid card for 1080p.

    • @OMaMYdG
      @OMaMYdG 1 year ago +5

      @gonlyhlpz I'm playing at 1440p and with most games it's very good. Just the recent games like RE4 have issues even with DLSS. I just turn the settings down a bit and I'm fine.

  • @parzival3632
    @parzival3632 1 year ago +24

    Same with RX 6600 XT. When I set it to prioritize graphics, my GPU is only used at 60 - 70% max. Meaning I can set the graphics to higher. But as soon as I do and exceed the 8GB limit waaay too much, the game just crashes. I kinda regret not getting the RX 6700 XT now. It had 12 GB VRAM. I never thought VRAM would be an issue this quickly at 1080p.

  • @josh4434
    @josh4434 1 year ago +81

    My 6950XT is crushing this game in 1440p. Mostly sits at 100-130FPS on Max settings RT ON. I'm at Chapter 5 now and the heavy rain makes it dip down to 80+fps. Still, awesome experience with settings maxed out. Not a single crash so far.

    • @tomhanke7703
      @tomhanke7703 1 year ago +1

      Yes, if you played in 4K you would have crashes like me, because it then sometimes uses more than 16GB of VRAM.

    • @josh4434
      @josh4434 1 year ago +1

      @Tom Hanke That sucks. My 5800X3D/6950XT system is set up for 1440p. The highest I've seen it hit is 14GB in a couple of games.

    • @garyb7193
      @garyb7193 1 year ago +3

      @@tomhanke7703 Almost everything in life comes with compromises and limitations. He says he's gaming at 1440p 120fps with RT. I say he's gained (in playability) more than what he's lost. I made the same choice, which is why I picked up a used RX 6800 XT 16GB for $425.

    • @DaddyHensei
      @DaddyHensei 1 year ago +9

      4K is not all that worth it anyway. Monitors are overpriced for a decent refresh rate, while at 1440p you can get a nice one for way cheaper that still looks pretty amazing. Josh hit the jackpot.
      Yeah, I can confirm that the 6950 XT actually can do RT pretty well at 1440p. I run a 6950 XT and a 7700X at 1440p. My gaming experience has been nothing but butter smooth with ultra settings across every game I've thrown at it: Callisto, Dead Space, Cyberpunk, heck, even Hogwarts (don't recommend RT though, not because of fps but because the reflections will blind you; Hogwarts simply isn't made for RT haha). All of those games have been wonderful for me at 1440p.

    • @tomhanke7703
      @tomhanke7703 1 year ago

      @@DaddyHensei Well, I can't even play in 2K anymore; it's too unsharp and doesn't look very good in my opinion, so I would rather play in 4K with everything maxed and without RT instead of 2K with RT. And money isn't the point for me: €800 for a 4K 32-inch 144Hz 1ms HDR monitor is an OK price.

  • @DrearierSpider1
    @DrearierSpider1 1 year ago +115

    I bought a GTX 1080 at launch and with its 8GB buffer the card was able to last me almost 5 years. Now with my 3080 I'm being VRAM limited only 2-2.5 years into its life.

    • @AlexanderMuresan
      @AlexanderMuresan 1 year ago +18

      I ended up going from a GTX 1080 to a 7900 XTX. The 4080 was 50% more expensive and the 4090 was double the price of the 7900 XTX. And I don't have any kind of brand loyalty. The 1080 served me incredibly well for 6 years, but I needed something to handle modern games at 1440p and 4K resolutions. Team red seemed like the obvious choice this time around.

    • @Azurefanger
      @Azurefanger 1 year ago +14

      Even the RX 6700 XT and 6800 XT have more VRAM and are cheaper. Nvidia just wants people to upgrade more often with these badly configured high-end GPUs. I mean, even the 3060 got 12GB of VRAM; why did the 3070 and 3080 get less, if not so that people need to buy another one?

    • @Hippida
      @Hippida 1 year ago +4

      Same with my Vega 64. To this day, its main limitation is the 8GB of VRAM.

    • @crimson7151
      @crimson7151 1 year ago +9

      I have a 3080 and this issue is killing me. The GPU itself is extremely powerful; with overclocking it can get really close to the 3090, and they decided to give it only 10GB of VRAM.

    • @Obie327
      @Obie327 1 year ago

      @@AlexanderMuresan I had the same problem running 1440p with my GTX 1080, so I decided to give Intel a try and switch to team blue. The ARC A770 16GB easily handles all my quality settings and gaming frame rates. It's a damn shame Nvidia's greed has pissed off their loyal fan base.

  • @aapknaap
    @aapknaap 1 year ago +13

    About a year ago I bought a 3060 12GB because I thought 8 gigs of VRAM wouldn't be future-proof, and here we are.

    • @christopherbrewer2154
      @christopherbrewer2154 1 year ago

      me too man

    • @verrat3219
      @verrat3219 1 year ago +2

      The 3060 has decent VRAM but it's a weak card due to its performance. So even though it's more future-proof than 8GB of VRAM, the card's performance itself is gonna be low in taxing games like the RE4 remake. The good thing is at least it will still be playable, unlike the 3070, where it's unplayable/crashing.

    • @aapknaap
      @aapknaap 1 year ago +4

      The 3060 ain't that bad. Never had performance issues so far.

    • @dinakshow924
      @dinakshow924 1 year ago +1

      @@aapknaap People draw their conclusions from 120fps, and others from 60fps... For example, my GPU is an RX 6800 and the box says "4K gaming", and I said "WTF, 4K?" This GPU runs games at 4K, but at an average of 60fps; for 144fps I need to play at 1080p. What I mean is that if they say the "RTX 3060 is a bad card", it's very likely because it doesn't reach 144fps at 4K.

    • @aapknaap
      @aapknaap 1 year ago

      Yeah

  • @dinakshow924
    @dinakshow924 1 year ago +4

    For that reason I like AMD. Nvidia is too expensive and still doesn't have enough VRAM.

  • @alchemira
    @alchemira 1 year ago +261

    The game runs perfectly fine on my rx6800 with 16GB VRAM @ 1440p. I chose 6800 over 3070(ti) because of VRAM. 4060(ti) will be DoA with 8GB VRAM.

    • @theelectricprince8231
      @theelectricprince8231 1 year ago

      Nvidia are delusional if they think they are Apple.

    • @muresanandrei7565
      @muresanandrei7565 1 year ago +6

      Why 1440p? I have the same card and I do 4K, fully maxed with ray tracing, fine at 60fps. 1440p looks like shit once you've played at 4K.

    • @atnfn
      @atnfn 1 year ago +103

      @@muresanandrei7565 Maybe he doesn't have a 4k monitor, I don't. I figure in fact most people don't.

    • @xiyax6241
      @xiyax6241 1 year ago +9

      Yee bro, same. A lotta games I play use up to 10-12GB of VRAM. AMD over Nvidia for GPUs right now.

    • @xiyax6241
      @xiyax6241 1 year ago +2

      @@muresanandrei7565 Ye man, I have the same card. It's insane, no limitations like Nvidia's 8GB cards.

  • @diablosv36
    @diablosv36 1 year ago +153

    This is what I had with the 970 back in the day; I was playing Tomb Raider and it would stutter like crazy. This is why I rejected the 3070 and got an RX 6800 instead. And now, 2 years later, with this card a game like RE4 is something I can comfortably max out.

    • @ml_serenity
      @ml_serenity 1 year ago +1

      Did you reject it because you knew the 6800 has 16GB of VRAM vs. 8GB on the 3070, or simply because you had a bad experience with the 970?

    • @azraeihalim
      @azraeihalim 1 year ago +23

      @@ml_serenity I think both

    • @zentar2646
      @zentar2646 1 year ago +34

      @@ml_serenity Because he had experience with a VRAM-gimped product like the 970 before, he knew he shouldn't repeat it and instead got the offering with more VRAM.

    • @EyeofValor
      @EyeofValor 1 year ago

      I'm maxed out on a 10GB card. The uploader is a moron and knows very little.

    • @xwar_88x30
      @xwar_88x30 1 year ago +10

      @@ml_serenity He's basically saying he's future-proofing by going for a GPU that has beefy VRAM, so he doesn't end up with a card gimped by a lack of VRAM. This happened last console generation: people said "ah, 2GB and 4GB is enough", which lasted like a year, and then the GPUs struggled. Now that the PS5 etc. are out, games are just going to get more VRAM-hungry as time goes on. Always future-proof when it comes to PCs, especially when it comes to VRAM. The 6800/6800 XT are going to dominate the 3070 and 3080 as time goes on. It's sad, because like Owen said, the 3070 and 3070 Ti could easily max these games out, but the lack of VRAM has gimped those cards.

  • @fredericothordendjent1408
    @fredericothordendjent1408 1 year ago +13

    Definitely would love for you to do a comparison between the rtx 3080 12gb and the 6800 xt all maxed out with this game, I feel like the same thing could happen with the 3080..

  • @j.c.denton2312
    @j.c.denton2312 1 year ago +21

    The main reason I bought the 6800 XT over the 3080 was the increased VRAM; kinda wild to see that paying off already. This game runs great! My only complaint is that the upscaling methods are poorly implemented.

    • @clownavenger0
      @clownavenger0 1 year ago

      How well does it run ray tracing in this game?

    • @scylk
      @scylk 1 year ago +3

      ​@@clownavenger0 ray tracing is a joke anyway. Coming from someone who bought a 4070ti

    • @clownavenger0
      @clownavenger0 1 year ago +1

      @@scylk Yeah, mostly. I like some RT GI in games with dynamic lighting when it's done well. Shadows and reflections can be faked pretty well.

    • @valenrn8657
      @valenrn8657 1 year ago

      @@clownavenger0 Refer to Resident Evil Village.

    • @clownavenger0
      @clownavenger0 1 year ago

      @@valenrn8657 okay. that was kinda poo

  • @Jefferson-he5rb
    @Jefferson-he5rb 1 year ago +77

    I think Nvidia is doing this on purpose. By shipping less VRAM, they force their customers to upgrade more often than they should. They add lots of useful features: DLSS, ray tracing, CUDA cores for streaming and 3D modelling. It feels future-proof, making you want to join the green team. But after 1-2 years a fact hits you: low VRAM... For example, the RTX 3070 is a great card, like an all-in-one: great raw performance, good for 3D modelling and streaming, better-than-competitors RT performance. However, when you render high-poly projects in Blender, the RTX 3060 outperforms the RTX 3070, because for lack of VRAM the RTX 3070 crashes. You try to play on high textures with the RTX 3070, but it gives less performance, or crashes, compared to the RTX 3060 because of the VRAM. Even though it is so clear why Nvidia is doing this, adding 2-4GB more VRAM shouldn't be that expensive. The RX 6700 XT is a more future-proof card than the RTX 3070 and 3060 Ti in my eyes.

    • @Frank-fg4jx
      @Frank-fg4jx 1 year ago +15

      It's the reason I will never buy another gpu from nvidia

    • @caribbaviator7058
      @caribbaviator7058 1 year ago +5

      @@Frank-fg4jx Same here. The RX 6800 gives me way better performance than any Nvidia card I own.

    • @DerpDerpson
      @DerpDerpson a year ago +6

      You hit the nail on the head. They started this practice with the GTX 970, but it went under the radar for most people because back then it wasn't causing much of an issue. It is a big one today, when nearly all developers can't manage texture compression (due to everybody and their mother using UE4 and now 5), so they just let it rip and max out VRAM allocation even when the game looks like it's from 2010.

    • @MelindaSordinoIsLiterallyMe
      @MelindaSordinoIsLiterallyMe a year ago

      Two points:
      - If you buy an Nvidia GPU, suck it. No seriously, if you buy one and then complain that you are "forced" to upgrade to a new overpriced Nvidia GPU, you absolutely deserve it, sucker.
      - I do agree that Nvidia has always skimped on VRAM, but I also don't think it is that serious. Everyone references The Last of Us Part I regarding that VRAM topic. That's basically the only big example and The Last of Us Part I is an awfully unoptimized PC port. Shader caching would lower the VRAM need by a lot. I don't want to play that game, so I don't f***** care either way.

    • @pooriaroohie
      @pooriaroohie a year ago

      @@DerpDerpson They actually started it with the GTX 780 and GTX 780 Ti, which both had 3 GB of VRAM, and then they made a variant of the GTX 760 with 4 GB of VRAM, which was the most absurd thing I ever saw.

  • @IrocZIV
    @IrocZIV a year ago +50

    The 1070 still had 8GB, always thought Nvidia was being stingy with RAM on their newer cards.

    • @sofyo2bib
      @sofyo2bib a year ago

      But not the same bandwidth — the 3070 has double the 1070's, but it's still not enough, hhhh

    • @tankdjsims
      @tankdjsims a year ago +3

      Yeah, I just had a 1070 and upgraded to a 3070; sad that they both have the same VRAM. That never made sense to me when upgrading, but I still went and bought a 3070 for $300.

    • @Deernailt
      @Deernailt a year ago

      @@tankdjsims I had a 1070 and upgraded to a 2080 in 2018-2019 — same 8GB of VRAM but more fps — then upgraded to a 4090 GIGABYTE GAMING OC in 2023, so I have 24GB of VRAM now. Skipped the 30 series.

    • @tankdjsims
      @tankdjsims a year ago

      @@Deernailt Even though I just got this maybe two weeks ago, I'm really thinking about selling it and upgrading to a 40 series 😩

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe a year ago

      I went from 1070 to 4070ti very recently.

  • @TheGameBench
    @TheGameBench a year ago +5

    Had a feeling the 3070/3070 Ti were going to age poorly due to VRAM limitations. Going to be a 1080p card sooner than it should be.

    • @alrizo1115
      @alrizo1115 a year ago +1

      It will cost

    • @NyangisKhan
      @NyangisKhan a year ago +2

      Sure the vram of not only the 3070/3070ti but the 3080 doesn't make much sense. But this isn't a GPU issue. This is strictly a developer issue. RE4 remake has a poor PC port. It looks like absolute horseshit compared to games like "The vanishing of Ethan Carter" which was released a decade ago. And funny enough that game used to run with a stable 60FPS at max settings on low end GPUs at the time like the GTX650ti. Basically these recent incidents are just Capcom and Sony being absolute clowns.

    • @hircine92h
      @hircine92h a year ago

      Even at 1080p it would struggle on some games and you'll have to turn some settings down lmao

  • @sL1NK
    @sL1NK a year ago +1

    I use a 3070 Ti at 3440x1440, and I've never had any issues with running out of VRAM. People really exaggerate "not having enough VRAM". Hogwarts Legacy has VRAM management issues, and it can be fixed with a few extra lines in the config file.
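
    For reference, the config-file fix usually circulated for Hogwarts Legacy (an Unreal Engine 4 game) is a set of texture-streaming overrides added to `Engine.ini`. The exact values below are an illustrative sketch, not official settings — tune the pool size to your card:

    ```ini
    ; %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini
    [SystemSettings]
    ; Cap the texture streaming pool below the card's physical VRAM
    ; (e.g. ~6500 MB on an 8 GB card) so other allocations don't spill over.
    r.Streaming.PoolSize=6500
    ; Keep the streaming pool clamped to what the GPU actually has.
    r.Streaming.LimitPoolSizeToVRAM=1
    ```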

  • @gromm225
    @gromm225 a year ago +73

    Main reason I went with a 6700 XT was VRAM. The new Xbox/PS5 use 16GB of RAM shared between the CPU and GPU, so essentially 12GB of VRAM assuming 4GB is reserved for the CPU. Since game developers tend to focus on consoles and their capabilities, it's probably a good idea to have AT LEAST 12GB of VRAM. I could be wrong, but I always thought the 30 series cards didn't have enough VRAM, and it seems to be starting to have an effect.

    • @pk417
      @pk417 a year ago +15

      The truth is even consoles don't use ultra settings.
      So imagine what is going to happen in 2024 with 8GB GPUs.

    • @victorsegoviapalacios4710
      @victorsegoviapalacios4710 a year ago +2

      That is not how it works; the max real VRAM available is around 10 GB in the best case, usually even less. There is a reason for the memory config of the XSX.

    • @larion2336
      @larion2336 a year ago +1

      Same here, also 6700 XT and it's been great. Runs a lot cooler than my old 1070 TI as well.

    • @saricubra2867
      @saricubra2867 a year ago

      But you are ignoring the 3080Ti and 3090.

    • @TankoxD
      @TankoxD a year ago +6

      @@saricubra2867 they are 2 times more expensive

  • @Avarent01
    @Avarent01 a year ago +58

    I am very happy that I chose my 6900 XT over the 3070/3080. Not only is it super power efficient, but yes, the 16GB of VRAM makes it potentially more future-proof as well.

    • @renoputra5219
      @renoputra5219 a year ago +20

      I sold my 6900XT for a 3080 10GB — the most stupid decision I've ever made in my life 😢

    • @vladimirdosen6677
      @vladimirdosen6677 a year ago +1

      @@renoputra5219 Lowering textures can be compensated for by using DLSS, which is superior in image quality to FSR. I would argue ray tracing and DLSS are worth more.

    • @rubsfern
      @rubsfern a year ago

      Same here :)

    • @nicane-9966
      @nicane-9966 a year ago +10

      @@vladimirdosen6677 DLSS and FSR are cheap techniques that decrease your image quality. Besides, the 6900 XT is at 3090 level, so it was already more powerful tbh.

    • @dampintellect
      @dampintellect a year ago +6

      @@vladimirdosen6677 That's all fine and good if you think so. But I would like to point out that Raytracing itself uses additional VRAM when turned on, so you would have to lower textures even further.

  • @BattlecryGWJ
    @BattlecryGWJ a year ago +12

    I bought a 3070 and a 3070 TI once I could get them at MSRP after prices crashed, wound up returning both and picking up a 6800 XT because I thought with more vram it would probably have better legs. Would love to see a comparison with a 6800 XT in these titles now.

    • @WalkerIGP
      @WalkerIGP a year ago +1

      You are not the only one. I am going to trade up my 3070 Ti in a few days to a 6950 XT or 7900 XTX.

    • @user78405
      @user78405 a year ago

      You should get a 7900 XT instead; a 6800 XT with more VRAM alone is still slower in some games that the 3070 Ti ran best, even with its smaller VRAM — an odd choice to pick last gen. For everyone else I would recommend holding on: Microsoft already patched the DirectX 12 VRAM/system-RAM copy issue with upload heaps, and devs will patch their games with new code that benefits 8GB VRAM by letting the CPU access VRAM directly, rather than adding latency through system RAM the old-fashioned way that has been around for over 30 years.

    • @WalkerIGP
      @WalkerIGP a year ago +1

      @@user78405 In my opinion a 6950XT would be better at just 600 to 700 USD; that is what I chose, and it's on its way now. The 7900XT is not good value when you can get an XTX for just 100 USD more, which means there is no reason to go for the 7900XT when a more powerful card is within reach. If the 7900XT were 700 to 800 USD I would have gone for it. I do agree that a 6800XT is not enough of an uplift from a 3070 Ti; you need at least a 6900XT or better to notice a difference. I also realized that in my earlier comment I wrote XT when I meant to write XTX.

    • @madtech5153
      @madtech5153 a year ago

      Time to watch Hardware Unboxed's vid then.

  • @GraaD-87
    @GraaD-87 a year ago +1

    You should have mentioned that the problem does not exist with RT off. You can go all the way up to ~14 GB of VRAM usage out of 8 GB available, and it will run just fine — no errors, no stutters, perfectly well. Turn ray tracing on and it instantly crashes.

  • @tristanweide
    @tristanweide a year ago +261

    When the VRAM is more limiting than the performance of the card, you have an issue. I think the 3080 10GB is going to be running into the same issues here soon. If you think this is bad, you should see the RTX 3050 Ti trying to run anything with 4GB of VRAM. My poor laptop has to play most games at 1080p medium because of textures, not performance.

    • @danielowentech
      @danielowentech a year ago +109

      Exactly! If a card runs out of VRAM, but it is at settings that wouldn't make sense to run anyway, it isn't that big of a deal. But cards like the 3070 Ti are frustrating because it is powerful enough to max out some games but runs into its VRAM issue like we are seeing here.

    • @GewelReal
      @GewelReal a year ago +38

      The 3080 10GB is already an issue at 4K.
      Also, no way Nvidia made a 3050 with 4GB of VRAM, bruh.

    • @mitanashi
      @mitanashi a year ago +18

      @@GewelReal The 3050 Ti is a laptop variant, I think.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL a year ago +14

      @@GewelReal Interesting claim. I've used a 3080 10GB since launch at 4K. Never had any issues.

    • @lifemocker85
      @lifemocker85 a year ago +21

      That's why they gimped the GPUs — so you'd have to buy another soon.

  • @TechnoTurtle9974
    @TechnoTurtle9974 a year ago +42

    This game is the one that let me know my 5700xt is already outdated because of the VRAM. Normally, lowering settings is fine, but there's something about lowering texture quality that hits different 😅

    • @raresmacovei8382
      @raresmacovei8382 a year ago +4

      We always knew that with just 8 GB of VRAM (and lacking Vega's HBCC option).
      Sticking to high-enough textures gives 80-120 fps at 1080p, maxed out, Hair Strands on, FSR2 Quality. It's fine.

    • @Jacob-hl6sn
      @Jacob-hl6sn a year ago +1

      My RX 5700 XT was already obsoleted by raw performance in new games; you need an RX 6000 upgrade like I did.

    • @demetter7936
      @demetter7936 a year ago +1

      Turn shadows from max to high and disable raytracing, it's only on water anyway.

    • @raresmacovei8382
      @raresmacovei8382 a year ago +1

      @@Jacob-hl6sn Which part of 80-120 fps feels like "lacking performance"?

    • @Jacob-hl6sn
      @Jacob-hl6sn a year ago

      @@raresmacovei8382 In some new games I only got 60 fps with that GPU, but that fps is fine if that's what it is in this game.

  • @chariothe9013
    @chariothe9013 a year ago +5

    Dunno why I chose a 2060 Super with 12GB for my second desktop PC at the time, but I feel lucky to finally be able to put it to use lol 😅

  • @MarioGoatse
    @MarioGoatse a year ago

    That knocking at the end of the video freaked me out. Thought it was coming from my house at 3am!

  • @adi6293
    @adi6293 a year ago +54

    My wife recently upgraded her 3070 to a 7900 XTX due to VRAM issues in some games she plays, and I will be doing the same. We were going to get a 4080 and 4090, but fuck Nvidia. I'm aware of AMD's shortcomings, but at least they aren't stingy with the VRAM on their cards. The 3070 should have had 12GB, full stop, and my 3080 only having 10GB is just a disgrace.

    • @GhOsThPk
      @GhOsThPk a year ago +8

      The 7900 XT has 20GB; the RTX 4080 has 16GB, the same amount AMD gave the RX 6800/XT last generation. NVIDIA does it because they know people will buy anyway — they gimp it to make a little more profit.

    • @NXDf7
      @NXDf7 a year ago +2

      7900xtx is a great card, just make sure you get a good AIB like sapphire nitro. I have had nitro cards for 4 years and have had 0 problems.

    • @adi6293
      @adi6293 a year ago

      @@NXDf7 That is what she's got 😜 I'm also getting Nitro+

    • @igorrafael7429
      @igorrafael7429 a year ago +1

      The 7900 XTX's ray tracing performance is on par with an RTX 3080-3090, which is pretty good, with considerably stronger raster performance. My 3080 is a beast but often limited by its VRAM, especially in RT games (RT is VRAM-hungry, which is ironic), which makes me want to upgrade after less than 2 years of using it, unfortunately.

    • @saricubra2867
      @saricubra2867 a year ago

      @@igorrafael7429 But that is quite mediocre for a flagship; on par with a 3080 and 3090 means it's the same as a 3080 Ti from the previous gen, and therefore an RTX 4070 Ti.
      A 4090 blows that Radeon out of the water. A GTX 1080 Ti 2.0.

  • @theboostedbubba6432
    @theboostedbubba6432 a year ago +69

    I tested my friends RX 6600 XT against my GTX 1080 Ti in RE4 remake and even at 1080p the Max preset option simply doesn’t work on the 8gb 6600 XT. They delivered similar experiences when lowering the settings a bit but I just find it funny that 6 years later, the 1080 Ti’s Vram capacity is starting to help it out.

    • @user-eu5ol7mx8y
      @user-eu5ol7mx8y a year ago +34

      1080 Ti is legend

    • @fwef7445
      @fwef7445 a year ago +22

      The 1080 Ti was the last decent product Nvidia made.

    • @steveleadbeater8662
      @steveleadbeater8662 a year ago +12

      @@fwef7445 It would be the 3080 if it actually stayed at its $649 MSRP. Better to say it was the last Value for Money product.

    • @puppy.shorts
      @puppy.shorts a year ago +4

      same with my rtx 3060 12gb 😊

    • @TheEvilPorkchop03
      @TheEvilPorkchop03 a year ago +3

      I had my 1080 Ti from a couple months after launch until 2 months ago, and for that reason it's got to be the best. An absolute unit, that card. Definitely better money spent than any computer component I've ever bought.

  • @icantthinkofagoodusername4464
    @icantthinkofagoodusername4464 a year ago

    You've probably heard this by now, but apparently the crashing issue that occurs when going over the VRAM limit only happens with ray-tracing enabled. My GTX1080 can max everything out including texture quality, and while it doesn't run at what I'd call a stable 60fps at 1080p, it certainly doesn't crash like that despite memory usage being well over the limit. Not being able to even turn on real-time raytracing is apparently a benefit here. From what I understand the raytracing implementation here isn't anything actually transformative anyway, so it's not that big of a loss to keep it turned off.

  • @GoldNugget
    @GoldNugget a year ago +2

    After a lot of thinking and comparison, last month I bought an RTX 2080 Ti rather than an RTX 3070. It's a shame that Nvidia's 3000 series mid-tier GPUs have the same VRAM as last gen's... The 11GB of VRAM in the RTX 2080 Ti is a deal-changer.

  • @zahnatom
    @zahnatom a year ago +182

    I'm glad I chose the 7900 XT over the 4070 Ti when there are memory issues like this.

    • @Gh0st_91
      @Gh0st_91 a year ago +6

      me too

    • @gloomy935
      @gloomy935 a year ago +17

      The 4070 TI doesn't have 8gb of vram.

    • @andrewsolis2988
      @andrewsolis2988 a year ago +21

      Early adopters of the 7900 xt are now laughing at team green!! 😎

    • @shebeski
      @shebeski a year ago +3

      I've been playing just fine. I've had to turn down the textures slightly but can't notice the difference the vast majority of the time.
      The benefits of this card vs the 7900XT show in a variety of other titles. You win some, you lose some.

    • @Dragonlord826
      @Dragonlord826 a year ago +59

      ​@@gloomy935 12gb is pretty pathetic for the price

  • @resonator7728
    @resonator7728 a year ago +28

    Just to think that the 1080 Ti had 11GB of VRAM way back then. Glad I held onto mine — just put it in a second computer with a 6700K.

    • @tourmaline07
      @tourmaline07 a year ago

      Same here with 2080ti , given the card's age I'm not too fussed if I now need to play high instead of ultra textures :)

    • @user-eu5ol7mx8y
      @user-eu5ol7mx8y a year ago +1

      Looks like high-end cards are best value after all.

    • @saricubra2867
      @saricubra2867 a year ago

      @@user-eu5ol7mx8y They ALWAYS have been (with the exception of the 2080Ti, Turing sucked so much). Graphics card get obsolete very quickly unlike CPUs.

    • @saricubra2867
      @saricubra2867 a year ago

      @@user-eu5ol7mx8y The 1080 Ti is doing fine, the 3080 Ti is chilling at 4K despite NOT being a 4K card, and the 4090 is a 1080 Ti 2.0 (a mistake). AMD is entry level, and Intel is fighting with Raja while they keep losing money from the failure of the Arc launch.

    • @otto5423
      @otto5423 a year ago

      @@saricubra2867 The 4090 is not a 1080 Ti 2.0, because the 4090 is a very expensive card compared to the 1080 Ti's launch price — 2000€ vs 700€ lol

  • @pRick7734
    @pRick7734 a year ago +45

    The reason this happens is that the game loads textures into VRAM, and loads even more textures when you're traversing a loading zone from one area of the map to another — there's a clear spike in VRAM usage when that happens. Digital Foundry covered this in a video; it's also the reason traversal stuttering exists in the game. They need to fix this asap.

    • @mimimimeow
      @mimimimeow a year ago +13

      This is how the PS5/XS is expected to work, since they have dedicated SSD IO stream decompression that goes directly into VRAM. Devs will stream data continuously, thus sidestepping the VRAM limit. PC DirectStorage isn't at that level yet, and what we see is basically the result of MS and Nvidia not caring enough to push radical new things to PC. After all, most PC gamers only care about getting 500 fps in CSGO *eats popcorn*

    • @brottochstraff
      @brottochstraff a year ago +9

      @@mimimimeow Also, we are seeing more and more shitty console-to-PC ports... again. The Last of Us, RE4, Plague Tale and many more. There are clear technical flaws like the one you mention above, where shader precompilation is needed and texture streaming is just not made for how PC hardware works best. If you look at games that kept PC in mind from the start, like Dying Light 2 and Cyberpunk, there are no such issues. I'm maxing out Dying Light 2 on an RTX 3070 and it looks way better than RE4.

    • @kitezzz360
      @kitezzz360 a year ago +1

      Actually, traversal stutter is loading the upcoming area into system memory; it has nothing to do with VRAM.

    • @danielosetromera2090
      @danielosetromera2090 a year ago +4

      They are clearly doing things inefficiently. RDR2, a game that looks a metric ton better than Hogwarts Legacy on every level, uses just 5.6 GB of VRAM at 4K. I repeat, AT 4K.

  • @Phoboskomboa
    @Phoboskomboa a year ago +4

    Even my 4080 at 4K required lowering the textures to 4GB to stop crashes. I was still in the orange at 6GB, but it did crash a couple of times, presumably when it hit a scene that used more VRAM. I've never been particularly sensitive to texture resolutions — I don't usually notice a difference between launch textures and the 4K fan-made ones that come out for a lot of games — so I don't really mind only allocating 4GB, but it's not a great feeling that the $1200 video card I just bought is already hitting a hard cap on what game settings it can run.

    • @badimagerybyjohnromine
      @badimagerybyjohnromine a year ago +3

      Yeah, this is unacceptable. If you pay a thousand-plus dollars for a GPU, you should be able to play whatever game you want on ultra settings.

    • @Phoboskomboa
      @Phoboskomboa a year ago

      @@badimagerybyjohnromine I've noticed load times increasing and more stable behavior in the last few days. I think they may have quietly patched it to keep less in active VRAM at the cost of needing to load more per area. Or, it could be in my head.

    • @abhirupkundu2778
      @abhirupkundu2778 3 months ago

      You're all here with RTX cards, while I play the game on my GTX 1650 GDDR6 — textures at 0.5 GB only, hair strands off, 1080p — and I still get lag in intense areas due to low VRAM.

  • @Dark88Dragon
    @Dark88Dragon a year ago +53

    Good thing I went with AMD's 12GB instead of the 8GB from Nvidia's counterpart in 2021; this was one of the crucial reasons I chose Team Red.

    • @RS_Redbaron
      @RS_Redbaron a year ago

      A 1080Ti had 11 so what is 12 hahaha

    • @adi6293
      @adi6293 a year ago

      @@RS_Redbaron The 1080 Ti was a $699 high-end card, you idiot; the 6700 XT is a mid-range card.

    • @RS_Redbaron
      @RS_Redbaron a year ago

      @Khel easy !

    • @silverwatchdog
      @silverwatchdog a year ago +1

      @@RS_Redbaron And the 1080 Ti was a flagship, overkill GPU, but somehow at a good price. It's not surprising it had 11GB in 2017, because it was meant to be overkill.

    • @mirzaaljic
      @mirzaaljic a year ago +1

      Same.

  • @DeadphishyEP3
    @DeadphishyEP3 a year ago +39

    Great video. We are about to see a 4060 Ti with 8GB of VRAM running just as fast as a 3070 Ti, and it will have the same issues.

    • @Mortalz2
      @Mortalz2 a year ago +3

      need rtx 4060 ti 16gb

    • @fwef7445
      @fwef7445 a year ago +1

      Because the mid-range 3000 cards have an identity crisis: they have the raw power for 1440p but the VRAM for only 1080p, and therefore they are badly designed. If you want to play at 1440p, you ideally want 12GB of VRAM. Absolutely nobody should be buying a 3070 or 3070 Ti; these cards suck. The 4060 Ti is going to suffer the same fate — it's just going to turn into yet another overpriced 1080p card.

  • @ericwheelhouse4371
    @ericwheelhouse4371 a year ago +2

    The issue isn’t just the capacity. It’s also how assets are loaded and managed. You don’t NEED all that vram if they optimized it to scale to what’s available. Adding more doesn’t fix the problem it just avoids it.

    • @WalkerIGP
      @WalkerIGP a year ago

      True, but I will take what solutions I can get. If the devs won't solve the problem then we gamers are forced to do it ourselves.

    • @cptnsx
      @cptnsx a year ago

      new games require MORE VRAM

    • @WalkerIGP
      @WalkerIGP a year ago

      @@cptnsx Yup, and that is because devs are not optimizing their games. A well-designed game uses fewer resources than are available. The fact that all these games require so much VRAM points to devs getting lazier — they know the VRAM is there, so why bother optimizing? I just bought the 6950 XT; no performance or VRAM issues for me!

  • @steve9094
    @steve9094 7 months ago

    I wasn't aware of the vram issue before I upgraded from a 1070 to the 6650 XT during Cyber Monday, and now I'm finding that games like Hitman 3 run fine with ray tracing on my card but start becoming unplayable once the VRAM gradually fills up. So close, yet so far...

  • @sjuric2435
    @sjuric2435 a year ago +16

    This is the reason I chose a used RTX 2080 Ti instead of a used RTX 3070 — it was even €50 cheaper, no matter what others said about the 3070 being newer, more modern, with less power consumption and better ray tracing. It turns out the RTX 2080 Ti with its 11GB will outlast both the 3070 and 3070 Ti. It's a pity: strong graphics but limited VRAM. Consoles are the baseline; they have 16GB of RAM, of which approximately 12-13GB is reserved for games. I think 12GB is the norm for upcoming games, especially if they're played with ray tracing.

    • @ZratP
      @ZratP a year ago +3

      Yup, you pointed out something there. The current gen of consoles (PS5 and Xbox Series X) is a much more capable baseline than the PS4 and Xbox One were compared to PCs three years after their respective releases.
      I think on the Xbox Series X, 10 GB of the RAM is faster than the other 6 GB, and those 10 GB are clearly used mainly for the GPU and textures. Meaning that 8 GB may start to become a bottleneck for some GPUs in the upcoming years for high texture quality and 1440p+.
      Actually, a lot of PC gamers will start to feel the effects of that new baseline in the upcoming years, as the cross-gen period is almost over and more and more games will move exclusively to the PS5 and Xbox Series (even if the existence of the Series S may control the damage). Their fast SSDs, 8 fairly recent cores, plenty of memory, and decent GPUs may push up minimum/recommended requirements.

    • @PeterPauls
      @PeterPauls a year ago

      Ah, no — on consoles they optimize, and to be fair the textures look very bad at 0.5GB, much worse than in other, older games. I'm sure the new consoles don't use more than 6-8GB of memory for textures.

    • @sjuric2435
      @sjuric2435 a year ago +3

      @@ZratP That's right. I also remember the PS4 era from 2013, when the consoles came with 8GB of RAM and people shouted that 2GB of VRAM was enough during cross-gen. I remember reading articles where programmers said to buy a graphics card with as much VRAM as possible; consoles then used 3GB for the OS and 5GB for games. When cross-gen ended in 2014, I bought a used R9 280X with 3GB. A few months later a number of games released that required 3GB for high textures — Far Cry 4, Rise of the Tomb Raider, Middle-earth: Shadow of Mordor — and I had no problem. And back then Nvidia only sold the GTX 780/780 Ti with 3GB, while AMD had, as it does now, the HD 7950/7970 with 3GB, the R9 280/280X with 3GB, and the R9 290/290X with 4GB. Nothing has changed at Nvidia regarding VRAM since then; they don't give it up easily.

    • @sjuric2435
      @sjuric2435 a year ago +7

      @@PeterPauls We are not in 2017 anymore. In 2016 I had a GTX 1070 with 8GB, and for $230 you could buy a new RX 480 with 8GB of VRAM. It's time to move away from 8GB and move on. People today play at higher resolutions, textures are getting bigger and more detailed, ray tracing is here, etc.

    • @mimimimeow
      @mimimimeow a year ago +1

      @@PeterPauls The reason console textures look bad is the CPU and GPU competing for the limited RAM bandwidth, not the capacity. It's also why consoles often use inferior texture filtering. They can still load a shitton of VRAM assets for a big level.
      There is no excuse for Nvidia skimping on VRAM. Looking at console RAM trends alone — 4MB, 64MB, 512MB, 8GB — the current gen would have 64GB. Sure, they only have 16GB, but these consoles are also betting on something we in the PC ecosystem lack:
      On the PS5, the dedicated SSD IO compression can swap between VRAM and SSD at up to 20-ish GB/s effective — that is PS3-era VRAM bandwidth. Now what's the point of this? A creative dev could theoretically buffer data so constantly that the console would seem to have unlimited VRAM. We have no equivalent of this on Windows; DirectStorage has some way to go.

  • @od1sseas663
    @od1sseas663 a year ago +58

    That's exactly what Nvidia wanted. Go ahead, upgrade to the $1000 12GB 4070 Ti that is also going to run out of VRAM in 2 years!

    • @ALM_Relaxed
      @ALM_Relaxed a year ago +6

      That's why you need to spend those 1600 bucks on the 4090 to stay relaxed for the upcoming future games.

    • @albertoalves1063
      @albertoalves1063 a year ago +1

      @@ALM_Relaxed But even the 4090 will not hold on much longer either; if the 4070 Ti lasts 2 years, the 4090 is about 4.

    • @ALM_Relaxed
      @ALM_Relaxed a year ago +1

      @@albertoalves1063 Well, 4 years is more than enough… then you can change the GPU/entire PC.

    • @albertoalves1063
      @albertoalves1063 a year ago +1

      @@ALM_Relaxed Bro, the 4090 is like 8 times more powerful than the PS5, and in 4 years the PS5 will still be running games that the 4090 will not be able to, because of VRAM usage.

    • @od1sseas663
      @od1sseas663 a year ago +3

      @@ALM_Relaxed $1600? I'm gonna buy 3!

  • @chrisclayton6788
    @chrisclayton6788 a year ago +14

    I've got a 6600 XT playing at 1080p. There are only a few games maxed out that produce stuttering for me (Hogwarts Legacy and Forza Horizon 5). The fact that Nvidia is producing such powerful cards with a VRAM capacity that was nice on a midrange GPU 5 years ago is beyond me.

    • @dammaguy1286
      @dammaguy1286 a year ago

      The 6600 XT is a powerful GPU; it should not stutter in FH5.

  • @glebati_hlebati
    @glebati_hlebati a year ago +2

    I don't know if a lot of people have the same issue, but I experienced some memory issues in Dead Space Remake. It feels like my 3070's VRAM is not enough for this game even at 1080p: after I start the game I have ~90 fps (at 1440p), but after 10 minutes of running back and forth my VRAM consumption increases from 6GB to 7.9GB and my fps drops to 25 and even lower. After restarting the game I get good fps again, but I cannot actually play it :/ It feels like I need to perform surgery and increase the VRAM to be able to play.
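
    A gradual VRAM fill like the one described above is easy to watch for. On NVIDIA cards, `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits` prints used and total memory in MiB; a minimal sketch that parses one line of that output (the 95% warning threshold is an arbitrary assumption, not a vendor recommendation):

    ```python
    def vram_usage_fraction(smi_line: str) -> float:
        """Parse one nvidia-smi CSV line like '7900, 8192' (MiB used, MiB total)
        and return the fraction of VRAM currently in use."""
        used, total = (int(field) for field in smi_line.split(","))
        return used / total

    # Example with a captured sample line from an 8 GB card:
    sample = "7900, 8192"
    if vram_usage_fraction(sample) > 0.95:  # 95% is an arbitrary warning threshold
        print("VRAM nearly full - expect texture swapping and frame drops")
    ```

    Polling this once a minute during a session would make a slow leak like the Dead Space one visible as a steadily climbing fraction.
    
    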

  • @radite2177
    @radite2177 a year ago +38

    This is the exact same reason I switched to AMD (RX 6700 XT) after almost 9 years of using Nvidia GPUs (my last Nvidia GPU was an RTX 2060). I almost continued "the tradition" and planned to upgrade to an RTX 3060 Ti, but I'm very glad now that I didn't and made the switch to an AMD GPU.

    • @lordmorbivod
      @lordmorbivod a year ago

      I got this one too; the game runs flawlessly at 60fps even with ray tracing on, but it keeps crashing with a D3D error about VRAM even when lowering settings.

    • @steadygaming7737
      @steadygaming7737 a year ago

      @GnarboK The D3D fatal error is due to a problem with the implementation of ray tracing. I also have an RX 6700 XT in my system; I can run the game at 1440p max settings at more than 60fps. When running the game with ray tracing on, it crashes within 5 minutes, even though I have more than sufficient VRAM.

    • @radite2177
      @radite2177 a year ago +2

      @@lordmorbivod Ah, too bad. I myself never liked the ray tracing feature, from the first time I tried it on the RTX 2060 to now with the 6700 XT. I just think it's not worth the performance impact and doesn't add anything substantial to my enjoyment of the game itself. But that's me — to each their own preferences. For me, normal reflection settings/features like SSR are more than enough.

    • @serphvarna4154
      @serphvarna4154 a year ago +2

      6700xt to 3060ti is a downgrade

    • @radite2177
      @radite2177 a year ago +1

      @@serphvarna4154 Yeah, glad I chose the 6700 XT instead of the 3060 Ti.

  • @ConfusionDistortion
    @ConfusionDistortion a year ago +12

    Resident Evil 4 tells you you don't have enough VRAM even when you do, and crashes at around the 16GB mark after saying you are in the orange warning range (the 7900 XT has 20GB of VRAM). It's a known bug that carried over from RE2 and RE3 — they both do the same thing. It seems to have been introduced when they added ray tracing to the engine they use.

    • @Spr1ggan87
      @Spr1ggan87 a year ago +5

      That update came out ages ago and they still haven't fixed it in those games; they just re-released the DX11 version of the game via a beta branch and called it a day.

    • @MistyKathrine
      @MistyKathrine a year ago +3

      @@Spr1ggan87 I hate half-assed Steam ports of games. It's a huge problem in a lot of games. Like, is it really that hard to release games that don't have serious bugs?

    • @saricubra2867
      @saricubra2867 a year ago +1

      I have the GameCube original and I would rather play that instead of the foggy and bland Unreal Engine art look of the remake — and we are talking about the GameCube; the optimization is perfect.

    • @Spr1ggan87
      @Spr1ggan87 a year ago +2

      @@MistyKathrine Try the PC version of RE4 HD with the HD Project mod installed; iirc it even has a set of lighting options to choose from, with one mimicking the style of the GameCube version. Imho it's better than the GC version, since in that version everything more than an arm's length away from Leon looks like a pixelated mess.
      ruclips.net/video/WAf9R8_cLPo/видео.html

    • @MistyKathrine
      @MistyKathrine a year ago

      @@Spr1ggan87 Yeah, there are a lot of games on Steam that pretty much require mods to make them playable. This reminds me of another GameCube title that's on Steam, Tales of Symphonia. What a disaster that port was; thankfully there were eventually mods that fixed most of the issues, but that game was quite literally broken and unplayable at launch. I just don't understand how companies think it's okay to release games in that state, as doing so just leads to bad reviews and poor sales.

  • @rebelblade7159
    @rebelblade7159 Год назад +13

    That's why I'm currently staying at 1080p with my 3070. The VRAM limitation is a shame because if it had 12GB or more, it (and the 3060 Ti) could have had great longevity for 1440p gaming and even 4K to some extent. My old GTX 960 4GB that I got in 2015 lasted up to mid 2021 thanks to its VRAM. 4GB was considered overkill then, but it let me be future-proof. For those of us who take our time upgrading, stuff like VRAM can matter a lot. Is it possible that Nvidia made this bad decision thinking everyone will use DLSS?

    • @andrerocha3998
      @andrerocha3998 Год назад +5

      No, Nvidia made this decision so you need to upgrade your GPU sooner, simple as that

    • @brettlawrence9015
      @brettlawrence9015 Год назад +1

      The fact that the 1070, 2070, and 3070 all have the same amount of VRAM is a joke.

  • @portman8909
    @portman8909 10 месяцев назад +1

    How many of you can tell apart medium or high textures from ultra though? I'm willing to wager almost none unless pixel peeping.

  • @hirakchatterjee5240
    @hirakchatterjee5240 Год назад +22

    I have an RTX 3060, and ultra textures at 1080p can take over 10 gigs of VRAM if you enable RT. It can handle it at a locked 60 fps. So the higher-end cards definitely need more VRAM.

    • @Anto-xh5vn
      @Anto-xh5vn Год назад +8

      Me with my 3060 Ti 8GB seeing it getting fucked because of vram 🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿

    • @Simpleman1995
      @Simpleman1995 Год назад +1

      Nvidia: say it louder, I don't hear 🗿🗿

    • @silverwatchdog
      @silverwatchdog Год назад

      You really don't need ultra textures in new games at 1080p. Sure, in older games like RDR2 anything other than ultra textures looked horrible, but now even low textures will work at 1080p. Ultra textures are mainly for 8K, and high for 4K. Depends on the game, though. However, if you have the spare VRAM, there is no reason not to put textures as high as they can go.

    • @nabilellaji6509
      @nabilellaji6509 Год назад

      @@silverwatchdog RTX 3060 is supposed to work better with 1080p anyways so you just cannot go wrong with this card if you don't mind playing at 1080p or even 1440p max.

    • @stygiann
      @stygiann Год назад

      I have a 3060 and yeah, the game can perform even better (75-80 fps) at 1080p, but I get the same VRAM crash, so it's either an Nvidia or a Capcom issue

  • @theyoungjawn
    @theyoungjawn Год назад +19

    Nvidia should’ve put at least 12GB VRAM on this card and 16GB on the 3080. Can’t believe how much they skimped last gen. Planned obsolescence.

  • @invasivesurgery4595
    @invasivesurgery4595 Год назад +11

    I was Intel/Nvidia for over 10 years. I made the switch to AMD for my CPU in 2021 and was so happy. When I saw the 4000 series Nvidia cards, I was happy to decide that my next GPU will be an AMD one. Currently rocking the 3060 Ti, but I'll be getting a 6800 XT next

    • @nobody1101
      @nobody1101 Год назад +7

      the 6800 XT isn't much faster than the 3060 Ti. Instead you should buy the 7800 XT or the 6950 XT. With the 7800 XT you will also have way better ray tracing performance.

    • @82_930
      @82_930 Год назад +2

      @@nobody1101 bro it’s nearly 45% faster what are you smoking💀

    • @TFXZG
      @TFXZG 11 месяцев назад +1

      Turn on ray tracing and the 3060 Ti is faster than the 6800 XT.

  • @mayconrodrigues2867
    @mayconrodrigues2867 Год назад +2

    All AMD-sponsored games are VRAM-heavy because the RX 5000 and RX 6000 series have more VRAM than the Nvidia RTX 2000 and 3000 series; that's just to force us to go to AMD.

  • @mikapeltokorpi7671
    @mikapeltokorpi7671 Год назад +14

    This is why I have always considered the amount of VRAM the most important feature of a GPU (I bought a very obscure FireGL workstation dGPU 20 years ago and the system was still a top performer 5 years later in benchmarking services).

  • @tonac13
    @tonac13 Год назад +9

    Excited for 4060 with 4GB of ram ❤gogo nVidia !

    • @renoputra5219
      @renoputra5219 Год назад +4

      I'm looking forward to 4050 Ti 1GB

    • @deaneyle3940
      @deaneyle3940 Год назад +2

      ​@@renoputra5219 and a standard 4050 with 500mb vram

    • @IISocratesII
      @IISocratesII Год назад +1

      This is why people used to call Nvidia "NoVideo" lmao

  • @johnkaczynski1905
    @johnkaczynski1905 Год назад +2

    This is truly saddening. I too am a 3070 Ti owner and I will probably switch to Team Red for my next GPU. I noticed the VRAM problem in Cyberpunk 2077 years ago and simply thought the game's poor optimization was causing a memory leak; oh, how wrong I was... Such a shame, though, that the 3070 Ti is actually a powerful GPU but can't do anything because of the VRAM limit.

  • @Azurefanger
    @Azurefanger Год назад +2

    One more reason to love the RX 6700 XT and RX 6800 XT: more VRAM for these exact problems. I don't understand why people waste money on GPUs with less VRAM and hope they will work like one with 4GB extra.

  • @GuyManley
    @GuyManley Год назад +16

    Yes sir. I have a 3080 10GB, and I HAD to turn off ray tracing to stop the crashes. Played the opening level 3 times on hardcore and crashed at the village fight 3 times.

    • @Grim_Jesus
      @Grim_Jesus Год назад

      For me, putting textures on 1GB and shadows on 'high' with everything else maxed out stopped the crashing.

    • @unclehowdy409
      @unclehowdy409 Год назад

      Nvidia pushing a feature you can't even use properly on the flagship card. Really glad I went amd after my 2070 super

  • @lflyr6287
    @lflyr6287 Год назад +5

    Daniel Owen: the 8GB VRAM buffer, standardized since 2015 (at least on AMD R9 300 series GPUs), was already an issue in Resident Evil 2 Remake (2019) and Resident Evil 3 Remake (2020) at 1080p Ultra details. Resident Evil 2 Remake used/allocated slightly above 13GB of VRAM at those settings.

  • @UnN3wBie
    @UnN3wBie Год назад +4

    I had the same problem with the RE2/RE3 Remake with the new ray tracing update; turning it off fixes the issue. There are people reporting crashes with a 4070 Ti or even a 4090, so it's not about 8GB of VRAM.

    • @ddd45125
      @ddd45125 Год назад

      4090, zero crashes in 44 hours of gameplay, RT on or off, 130% resolution scale. RT is currently off as it is kind of broken.

    • @UnN3wBie
      @UnN3wBie Год назад

      @@ddd45125 good for you, in any case rt is broken on this game (go check digital foundry's vid for details)

    • @valentds
      @valentds Год назад

      @@ddd45125 it's a 4090, bro; even with RT broken, 24GB of VRAM compensates for that

  • @zinarmagadan3751
    @zinarmagadan3751 11 месяцев назад +2

    I still have a 1080 Ti and I tried out RE4 Remake with settings well over the VRAM limit and it never crashed. I wonder why that's the case for the 3070?

    • @thepizzaboy7904
      @thepizzaboy7904 4 месяца назад

      Ray tracing could be the factor causing the crash

  • @Metical1312
    @Metical1312 Год назад +17

    I predicted this would happen. It made no sense that Nvidia released a GPU with 8GB in 2020 when they had a GPU with 8GB of VRAM in 2016. This was their plan to force an upgrade before you really needed one. Whenever you're buying a video card, you should always buy a higher tier than what the current-gen console has, and in this case both the Xbox and PS5 have more VRAM than the 3070/3070 Ti. The best part is that the incoming 4060 is rumored to have 8GB, and it will probably land somewhere between 3070 Ti/3080 performance but be kneecapped out of the gate by the low VRAM.

    • @greggreg2458
      @greggreg2458 Год назад +1

      That level of performance with 6GB is so wrong...

    • @Metical1312
      @Metical1312 Год назад +2

      @@greggreg2458 I have been on Nvidia ever since going back to school in 2013, not by choice but because of the need for CUDA. Throughout this whole time, with almost every card I've purchased from Nvidia, the AMD competitor has aged better. My GTX 780 6GB lost driver support as soon as the 900 series hit, meanwhile the RX 290 only stopped getting updates last year and is still viable for 1080p low gaming. GTX 1060 vs RX 480/580 is no longer a topic today because the 1060 can't really hang with its 6GB in 2023. I'm still a slave to CUDA, and because of that I now own a 3080 10GB instead of a 6800 XT 16GB. I'm sure AMD fine wine will win here too, and the 6800 XT will be viable longer than my 3080 even though they are roughly the same speed currently. Hopefully AMD gets their act together with the 7800/7700 lines, with decent performance, VRAM, and pricing, to really apply pressure to Nvidia.🤞🏾

    • @Nightceasar
      @Nightceasar Год назад +3

      Not to mention the 4060 will probably have 3070 performance but cost the same as the 3070 did as well. So basically you are just getting the same 3070 again for the same price.

    • @niebuhr6197
      @niebuhr6197 Год назад

      The 4060 might not even match the 3060 Ti, unless they release something different from the laptop version with better specs, but that looks unlikely.

  • @grimsaladbarvods8586
    @grimsaladbarvods8586 Год назад +15

    My 6700 XT will be the best purchase I'll make for the next 4-5 years.
    Near 3070 levels of performance (minus RT) with 4GB more VRAM.

  • @MarkSinister
    @MarkSinister Год назад +11

    That's why I picked up a 3060 12GB. My programs kept running out of VRAM for rendering and editing when I was using the 1070 Ti. I wasn't going to upgrade to another 8GB GPU no matter how fast it was; the only other options were the 3080 and the 3090.

    • @KRZStudios
      @KRZStudios Год назад

      I get your point, but still, the 3060 Ti or 3070 with 8GB are more powerful.

    • @MarkSinister
      @MarkSinister Год назад

      @@KRZStudios Yeah. But when you're using applications and you run out of RAM... I'd much rather have the RAM to work with than the speed.
      What's the point of going fast if you can't carry the load?
      In some of the 3D applications I use, if you run out of RAM you have to start deleting objects in your scene or start rendering stuff in parts, which in turn doubles the amount of time it takes to do something.
      For me, it's RAM first.

  • @Hitchens91
    @Hitchens91 Год назад +3

    Textures are the one setting that I will never compromise on, in any game. If I can’t play with max textures, I won’t play at all. Otherwise I’ll be forever wondering what could have been.

    • @Psevdonim123
      @Psevdonim123 Год назад

      Well, these days the difference is not as huge as it used to be. It's blurrier, but they're not compromising on details that much; at least you're not missing important things anymore (unless you set them REALLY low). Though yeah, it's really frustrating that you have to limit it.

    • @madtech5153
      @madtech5153 Год назад +1

      @@Psevdonim123 Only high vs ultra has a slight difference, which you can discard, but even at high settings it's already maxing out 8GB of VRAM.
      You can go and watch Hardware Unboxed's 16GB vs 8GB video.

  • @TheCompyshop
    @TheCompyshop Год назад +4

    This is another reason why I strongly dislike Nvidia as a company. They purposely put out cards with less-than-desirable VRAM: juuusst enough that the games tested at release don't show any problems, but as time progresses it ages the card like milk. Another great example is the 1060 3GB/6GB vs the AMD RX 480/580 4GB/8GB cards. The 1060 gets left in the dust while the AMD cards are still perfectly usable today at higher settings than the 1060. The 3GB 1060 reaaaalllllyyy struggles these days. But that's what Nvidia wants. They purposely kneecap the GPU with low VRAM so that you don't keep it too long. It just creates more e-waste at the end of the day.

  • @MrKevlad100
    @MrKevlad100 Год назад +28

    Glad I went with the 6700xt over the 3070, had a gut feeling vram would be an issue

    • @barrontrump3943
      @barrontrump3943 Год назад +2

      @@blue-lu3iz Gaming in general nowadays is a total rip-off and scam. Just bought a new rig that's 3 generations into RTX? Gotta have that turned off, fam, or you ain't playing at 120 fps natively. It's a joke, especially with ReMaKeS that are completely unnecessary, and people just gobble it up because they have nothing else meaningful to live for.

    • @lifemocker85
      @lifemocker85 Год назад +1

      No need for gut feelings when you use brains

    • @cripmeezy
      @cripmeezy Год назад

      Same here, my guy. No issues with Resident Evil 4 on high settings, ray tracing set to normal.

    • @thelazyworkersandwich4169
      @thelazyworkersandwich4169 Год назад +1

      @@barrontrump3943 Facts. I'm playing games on integrated graphics and have a PS4 for more demanding games, and doing just fine; there are tons of legacy titles that provide way better content and value than what's coming out.

    • @AAjax
      @AAjax Год назад

      @@blue-lu3iz gotta fund the leather jackets somehow

  • @RazielAU
    @RazielAU Год назад +1

    I'm not really sure what you expected... you have to pick a texture setting that will work with the amount of VRAM available on your graphics card. That's why higher-end cards with more VRAM exist. It's not just so they can put a number on the box with no impact on gameplay.
    The argument that your card is otherwise powerful enough is kind of silly. As you noted yourself, the size of textures in VRAM doesn't heavily impact performance, so are you expecting all graphics card tiers to have the same amount of VRAM? It doesn't make sense; the amount of VRAM you get is part of what placed that card at that price point, and it's a number that does impact picture quality, which is why you should consider it when buying a new graphics card.
    If you're expecting to be able to max out texture settings on future titles for a longer time, you need to buy a higher-end card that allows you to do that. For example, right now a 4070 Ti has 12GB and a 4080 has 16GB; obviously, the 4080 will let you max out texture settings in future titles for longer. And if you made this video a few years from now with a 4070 Ti, you'd probably conclude that 12GB is not enough...
    Keep in mind that texture quality doesn't scale with resolution, so when you pick the maximum texture quality setting, those are the same texture sizes that people on higher-end cards will be using for 4K gaming (and beyond). That setting is probably overkill for 1440p to begin with, and from what you've shown, the game still looks absolutely amazing with the texture quality setting that does work on your system. So if anything, it suggests that 8GB is actually enough...

  • @gustavo8647
    @gustavo8647 Год назад +20

    AMD knew what was going to happen, that's why they equipped their top of the line equipment with 16 gb of memory 😌

    • @jpa3974
      @jpa3974 Год назад

      Probably because they work with the new consoles, so they know what specs will be needed in the near future. But the question is... how was Nvidia "unable" to see something so obvious? I know a lot of RUclips "tech" guys were until recently saying that VRAM is not important, but they are either shills or idiots. Nvidia is not a company run by idiots, right? Why did they decide that 8GB of VRAM, the same as a GTX 1070 Ti, would be enough on the RTX 3070?

    • @Biggrittz
      @Biggrittz Год назад +2

      @@jpa3974 It was Nvidia's plan all along to cut the VRAM short on some of these cards. That way you will be forced into upgrading, which is unfortunate if you own a 3070 or 3070 Ti 8GB and were looking to play at 1440p, or in some cases at the time 4K. I have an R7 5800X/RX 6900 XT 16GB tower, and to run games in 4K I sometimes need over 12GB of VRAM; Far Cry 6 is a good example. So in 4K even the 3080 and 3080 Ti are unfortunately also cut short.
      Meanwhile, I have an Aorus i7 12700H/RTX 3070m 8GB as my main laptop. I play at 1080p on there, where 8GB is fine for now. The FPS on the 3070 and 3070 Ti is still beyond exceptional when given its proper VRAM space.
      P.S. Assuming the 3060 is powerful enough, I wonder if the 3060 12GB will be able to do things the 3070 and 3070 Ti couldn't. If so, that would be a crazy sight to see. It may be too weak to run 1440p max settings at a fair enough frame rate; I have not checked the benchmarks, so I'm not so sure.

  • @MrDabadabadu
    @MrDabadabadu Год назад +4

    Lazy developers. A game shouldn't crash no matter what. There is enough RAM to store some data in; yes, you will get lower FPS, but no crash. Devs should know what is on the market: a lot of 8GB cards out there (6600 XT, 3060 Ti, 3070, 3070 Ti, etc.) that should be able to work at least at MEDIUM textures if not HIGH (maybe not ULTRA). Yes, Nvidia is the problem, but the 6600 XT and 6600 shouldn't be discarded so soon.

  • @A_Potato610
    @A_Potato610 Год назад +21

    The diablo 4 beta also uses crazy amounts of vram. It was using 21gb on my 3090.

    • @theyoungjawn
      @theyoungjawn Год назад +14

      Pretty sure it’s been found the game has a memory leak. Hopefully they fix it during launch.

    • @MrSnake9419
      @MrSnake9419 Год назад +1

      changing the textures to medium should fix this

    • @tahustvedt
      @tahustvedt Год назад +6

      Some games will use whatever's available (up to a point, of course) even though the performance benefit rolls off at a much lower point.

    • @MrEditsCinema
      @MrEditsCinema Год назад +1

      @Grumpy RC Modeler Call of Duty: Cold War, for example, will allocate 80% of all your VRAM for the game even though it doesn't need it.
      It doesn't make the game more stable or improve frame times, etc.
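The allocation-versus-use distinction this thread describes can be sketched in a few lines of Python. This is a toy model with made-up numbers, not any real engine's behavior: an overlay that reports the reserved pool size will show far more "usage" than the bytes actually holding live texture data.

```python
# Toy model: an engine reserves a big VRAM pool up front, but only part of
# it ever holds live data. Monitoring tools typically report the reservation.

class VramPool:
    def __init__(self, reserved_bytes):
        self.reserved = reserved_bytes   # what an overlay reports as "used"
        self.live = 0                    # bytes actually holding texture data

    def upload(self, nbytes):
        if self.live + nbytes > self.reserved:
            raise MemoryError("pool exhausted")
        self.live += nbytes

pool = VramPool(reserved_bytes=8 * 2**30)   # engine grabs 8 GiB up front
pool.upload(3 * 2**30)                      # only 3 GiB of textures are live

print(pool.reserved // 2**30)  # overlay shows 8 GiB "in use"
print(pool.live // 2**30)      # but only 3 GiB are actually needed
```

This is why a game "using" most of your VRAM in an overlay doesn't by itself prove it needs that much.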

  • @tealuxe
    @tealuxe Год назад +2

    That's why I got the 3090 and decided to skip at least 2 or 3 generations of GPUs. I imagined the 10GB of the 3080 wasn't going to be enough; even the 1080 Ti had more VRAM. The 3090's 24GB of VRAM should allow me to skip until the 6000 or 7000 series.

  • @granolafunk6192
    @granolafunk6192 Год назад +1

    It's even worse with the 3080 ti. Them trying to save a few $ and only giving it 12GB has basically created an early EOL date.

  • @Orain88
    @Orain88 Год назад +12

    The textures in this game don't work the way they do in a lot of games. It basically has 3 texture sets: low, medium, and high, with low set at zero, and the medium and high categories subdivided into texture pool sizes that the game dynamically draws from. So when set to high textures at 0.5GB, you might see some lower textures, but if you focus on one area the higher textures will load in.
    I generally find that if you tune your other settings to allow for a 2GB high texture pool, the game will typically keep up with showing you the game's highest-quality textures at distances that actually make a difference to the experience at 1440p.
    Also, the RT in this game is totally not worth the performance hit. If it were RTGI, RTAO, or even RT shadows, I would be more interested. But just RT reflections? Not worth it imo.
    Edit: you can run everything 100% maxed out, ignoring the VRAM limit, so long as RT is off. The D3D crash only happens with RT on and VRAM overloaded.

    • @Keivz
      @Keivz Год назад +6

      Yup. Odd this wasn’t mentioned but it should be well known. Same with many other Resident Evil games. And I bet if he set the vram setting to 1gb, it wouldn’t have crashed and it would have looked fine.
      Meanwhile, his example #2 just showed how much it is the fault of developers and how such problems can be fixed.
      As for Forspoken, the demo has not been updated. Their latest patch notes says under optimization:
      Reduced the VRAM usage. (PC version).
      This video is much ado about nothing.

    • @Orain88
      @Orain88 Год назад

      @@Keivz yes indeed. I'd also add that this crash only seems to happen when RT is on in combination with too much VRAM usage. I haven't been able to cause this crash with RT off and all other settings maxed out.

    • @Spr1ggan87
      @Spr1ggan87 Год назад +2

      @@Keivz Nah, it just takes longer to crash at 1GB; the same thing is happening to me with the RE2 Remake. Maxing textures leads to a crash within minutes of playing; setting it to 1GB lets me play for around 4 hours before I get the D3D Fatal Error crash.

    • @MistyKathrine
      @MistyKathrine Год назад

      Yeah, certainly not worth running RT at high. You can run it at low or medium to reduce the performance hit and give more room for textures.

    • @ChaosAlter
      @ChaosAlter Год назад +1

      If only he did some googling for a couple of minutes
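The pool-based texture streaming described at the top of this thread behaves roughly like a byte-budgeted least-recently-used cache: the setting fixes the pool size, and textures are evicted to make room as you look around. A toy Python sketch — the names, sizes, and eviction policy are illustrative assumptions, not RE Engine internals:

```python
from collections import OrderedDict

class TexturePool:
    """Fixed byte budget; evicts least-recently-used textures to fit new ones."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # name -> size, oldest first

    def request(self, name, size):
        if name in self.resident:                 # already resident: mark as fresh
            self.resident.move_to_end(name)
            return
        # Evict oldest textures until the new one fits within the budget.
        while self.resident and sum(self.resident.values()) + size > self.budget:
            self.resident.popitem(last=False)
        self.resident[name] = size

pool = TexturePool(budget_bytes=2_000)     # think "2GB high" pool, scaled down
pool.request("village_wall", 800)
pool.request("leon_jacket", 900)
pool.request("merchant_coat", 700)         # forces eviction of village_wall
print(list(pool.resident))                 # ['leon_jacket', 'merchant_coat']
```

This also shows why a bigger pool setting mostly reduces churn (re-streaming) rather than changing which textures exist.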

  • @sm4sh3d
    @sm4sh3d Год назад +5

    It's funny, I have a 3080 mobile, so a GPU barely on par with a standard desktop 3070 and clearly behind a 3070 Ti. But I got 16GB of VRAM, and I can max everything here, hitting close to 14GB of VRAM, and it's butter smooth. VRAM was really underestimated last gen, especially by Nvidia.

  • @MerolaC
    @MerolaC Год назад

    Missed the chance to quote the merchant.
    "Not enough VRAM, Stranger!"

  • @user-ts9uk9kj3z
    @user-ts9uk9kj3z Год назад +2

    The texture setting at 1:30 confuses me, as you lowered the setting but the texture quality still remains at High. It seems the setting only affects the memory pool used to keep textures ready for the GPU to fetch, which is faster than streaming from the storage drive. I would guess the setting doesn't change the resolution of textures, but rather reduces the number kept loaded in the scene, including those outside screen space. Ray tracing fidelity will suffer most from lowering such a setting.

  • @cebuanostud
    @cebuanostud Год назад +28

    Intel's Arc A770 is starting to look really good

    • @UmbraWeiss
      @UmbraWeiss Год назад +2

      I have a 3060 Ti; the next upgrade I make will probably be AMD or Intel. In 2-3 years, when this card is really outdated at 1080p, Intel will probably be a really good option for an upgrade.

    • @gubbica99
      @gubbica99 Год назад +1

      @@UmbraWeiss I had an AMD card for 2 years before it broke. After that I bought an Nvidia card, and after only a year it's not working as it should; I'd say the performance of the card is at most 50 percent in games like The Witcher, Elden Ring, and such. Thinking about an Intel card in like 2 years.

    • @xxvelocity_0439
      @xxvelocity_0439 Год назад

      @@gubbica99 GPUs breaking down after a year of use (even intensive) is not common. If your AMD card died after two years and your Nvidia card is on the same road, then I'm not so sure the Intel card will be any better.

  • @varmastiko2908
    @varmastiko2908 Год назад +13

    I've been talking about this issue since Turing launched in 2018. It's refreshing that it's becoming more widely acknowledged now.

    • @pk417
      @pk417 Год назад +5

      2024 is going to be the year when people will realize this

    • @IgorBozoki1989
      @IgorBozoki1989 Год назад +3

      12GB of VRAM should be the absolute minimum. 16GB should be standard by now.

    • @earthtaurus5515
      @earthtaurus5515 Год назад

      Had this debate with a friend years ago when I bought my RX 480. I was being told off for spending a little more (about £20 or so, just before the crypto mining boom) on the 8GB version over the 4GB version while flat broke, but I needed a GPU. In summary, I told him I needed the VRAM for textures and it would be an issue in the future, so I was future-proofing by spending a bit more now instead of buying a new GPU later. It sucks to see this is still an issue several years later. I paid £250ish for the RX 480 six years ago, in Feb 2017. In a few short months, the price had tripled on the second-hand market. Some would say I could have sold it and got a better GPU, but everything went up and everything I could get had less VRAM, so it wasn't worth the trade-off. I've been happily gaming with high texture mods. Now running a 6750 XT that was supposed to go into a relative's PC, but they changed their mind, and since the GPU has gone up in price compared to what I paid (pricing fluctuating +/- ~£30 excluding extended warranty), I might as well use it instead of having it gather dust.

    • @clownavenger0
      @clownavenger0 Год назад

      @@pk417 so these cards will be 4 years old when it becomes an issue?

    • @pk417
      @pk417 Год назад

      @@clownavenger0 No, 3 years old; the 3070 Ti released in 2021.
      The irony is that the 5-year-old 1070 Ti is still relevant today, while the 3070 Ti still costs 600 dollars even after the launch of the 4000 series.
      I really don't understand what went wrong with Nvidia's GPUs after the 1000 series.
      I mean, the 3070 Ti is a strong GPU that is only bottlenecked by VRAM.
      Because of DLSS, the 3070 Ti would have lasted much longer if only it had more than 8GB of VRAM.

  • @airplaneB3N
    @airplaneB3N Год назад +3

    The fact that all these big content creators were saying to "just get a 3070" and "you don't need to buy a better card" makes this even more sad.

    • @MistyKathrine
      @MistyKathrine Год назад

      3070 is fine for the vast majority of people.

    • @airplaneB3N
      @airplaneB3N Год назад

      @@MistyKathrine Maybe for 1080p. I own one, and like in the video, I found that 8gb of VRAM is not enough for modern games at 1440p.

    • @MistyKathrine
      @MistyKathrine Год назад

      @@airplaneB3N I never had any problems with mine in games, and I was running at 2160p /shrug.

    • @saricubra2867
      @saricubra2867 Год назад +1

      I was defending the 3080 Ti; ignore these people saying a lot of bullsh*t.

  • @Frucht
    @Frucht Год назад

    You raise some valid points here about planned obsolescence in regard to the 30 series Nvidia GPUs, which will all be significantly reduced in usefulness in the future due to their actual potential being limited by insufficient, non-upgradable VRAM. However, the actual engine crash here is a bug and not a specific issue of not having enough VRAM. Resident Evil Village has the same texture slider options, and with 8GB of VRAM, I can still max them out. This does not cause an engine crash but only results in general stuttering when the game needs to transfer data from RAM to VRAM.
    Additionally, setting the textures to 0.5 GB (High) will still utilize the best textures the game has, but it will reduce the texture pool size on the VRAM, causing the texture streaming to load textures (asynchronously) more frequently. If you stood there looking at a scene with poor textures, the game would likely attempt to replace them at some point, provided it can do so without displacing another texture from the pool that is also in use.
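The asynchronous texture streaming described above (serve a lower-quality texture immediately, then swap in the full-quality one later if the pool has room) can be sketched like this. All names, sizes, and the two-tier quality model are illustrative assumptions, not the engine's actual code:

```python
class StreamingPool:
    """Toy model: textures render at 'low' first, then upgrade asynchronously."""
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.loaded = {}           # texture name -> ("low" | "high", size)

    def used(self):
        return sum(size for _, size in self.loaded.values())

    def sample(self, name, low_size):
        # The frame being rendered right now gets the fallback immediately.
        if name not in self.loaded:
            self.loaded[name] = ("low", low_size)
        return self.loaded[name][0]

    def stream_tick(self, name, high_size):
        # Background upgrade: swap in the high mip only if it fits the budget.
        quality, low_size = self.loaded[name]
        if quality == "low" and self.used() - low_size + high_size <= self.budget:
            self.loaded[name] = ("high", high_size)
        return self.loaded[name][0]

pool = StreamingPool(budget_bytes=100)
print(pool.sample("castle_door", low_size=10))        # fallback shown this frame
print(pool.stream_tick("castle_door", high_size=60))  # upgraded a moment later
```

With a tight budget, the upgrade step simply never happens for some textures, which matches seeing lower-quality textures linger instead of a crash.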

  • @iAmLanar
    @iAmLanar Год назад +34

    Hello Daniel! Thanks for sharing this with us. Now I'm a bit worried about my 3070. Was gonna play the game at 1440p/Max settings.
    Anyway, this is not supposed to work this way, I guess. The game should not crash. We might expect some stutters or visual glitches, such as popping textures and objects, at most. Data that cannot fit in VRAM should be stored in RAM and loaded on demand instead. For instance, there was a problem with the HD textures in Far Cry 6, where textures appeared blurry on GPUs that lacked sufficient VRAM. I'm not a game developer though...
    I hope it won't get worse in the future, since we now have technologies such as DirectStorage available on PC.

    • @danielowentech
      @danielowentech  Год назад +19

      Most games I've seen behave more like the Forspoken benchmark I showed. More stutters, but not a crash. Not sure why RE4 completely crashes. Maybe could be patched.

    • @iAmLanar
      @iAmLanar Год назад +2

      @@danielowentech I hope so.

    • @mck8292
      @mck8292 Год назад +12

      3070 owner too :/
      Sad I didn't go for a 6800 instead..

    • @adi6293
      @adi6293 Год назад +4

      @@mck8292 Don't make that mistake again :P

    • @PineyJustice
      @PineyJustice Год назад +5

      @@mck8292 The warning signs were there; Nvidia has been doing this garbage for over a decade...
      They sell up front with "oh, all we need is 2 or 4GB, that's fine for all current games at max settings," yet you're not buying a new card just to play old games; you want to use it for at least a year or two.
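The graceful degradation the thread starter asks for (spill over into slower system RAM and stutter instead of crashing) amounts to first-fit placement across two memory tiers. A hypothetical sketch; the asset names, sizes, and budget are made up:

```python
def place_assets(assets, vram_budget):
    """Assign each (name, size) asset to 'vram' until the budget runs out,
    then to 'ram': a slower path, but never a hard failure."""
    placement, used = {}, 0
    for name, size in assets:
        if used + size <= vram_budget:
            placement[name] = "vram"
            used += size
        else:
            placement[name] = "ram"   # expect stutter here, not a crash
    return placement

assets = [("world_geo", 4), ("ultra_textures", 5), ("shadow_maps", 2)]
print(place_assets(assets, vram_budget=8))
# {'world_geo': 'vram', 'ultra_textures': 'ram', 'shadow_maps': 'vram'}
```

Under this model, an over-budget game gets slower, not unplayable, which is roughly how the Forspoken benchmark in the video behaved compared to RE4's crash.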

  • @advanceddarkness3
    @advanceddarkness3 Год назад +8

    Exactly how Nvidia killed off most of its early GPUs too. Powerful core, limited amount of VRAM. The exact reason why I chose a 3090 over any other 30 series card.

    • @robr640
      @robr640 Год назад +3

      Same, I was pretty sure I'd be going Radeon 6000 series when I upgraded a while back. Then I happened across a 3090 for a great deal and jumped on it, because that was the only NV card I believed would last until the core itself gets tanked by games.

    • @TheGAVstudio
      @TheGAVstudio Год назад +2

      ............ smart!

    • @jonboy602
      @jonboy602 Год назад +1

      Agreed, my GTX 570 was obsoleted way before its time because of this: only 1.25GB of VRAM.

    • @saricubra2867
      @saricubra2867 Год назад +1

      3080Ti is fine as well, same die as the 3090, 12GB is the sweetspot. The 3090 is overshadowed by the 4090.

  • @Killa-rd6hj
    @Killa-rd6hj Год назад

    The windy, rainy section in the village is the best time to test your in-game fps without benchmarking

  • @daysgone3950
    @daysgone3950 Год назад +2

    Nvidia: only miners, AI companies and xx90 users deserve the VRAM.

  • @dcgerard
    @dcgerard Год назад +8

    I have a 3070 ti and I ran into this very problem in the chainsaw RE4 demo. The card had been great in just about everything else though, but yeah I do wish it had at least 12GB

    • @jacketofthe80s13
      @jacketofthe80s13 Год назад

      You don't, man. The highest texture quality setting means absolutely nothing; you're just using all that VRAM as a cache so the game loads textures without hiccups. So no, it's not your GPU, it's Capcom's backwards way of handling shaders.

  • @AdrianMuslim
    @AdrianMuslim Год назад +13

    This is why I game at 1080p with high end GPU and never worry about FPS or lowering settings.

    • @AndyHDGaming
      @AndyHDGaming Год назад +1

      When you have a 4K screen, you will find that the picture quality of a 1080p screen is really poor, and the pixelation is very noticeable

    • @trenchcoats4life891
      @trenchcoats4life891 Год назад

      @@AndyHDGaming agreed. I jumped from 1080p to 1440p and I cannot stand 1080p now.

    • @AdrianMuslim
      @AdrianMuslim Год назад +1

      @@AndyHDGaming When I have a 4K screen, I will also find that the frame rates are really poor. I barely get 60 FPS in some games at 1080p, let alone 4K. lol.

  • @eugenebebs7767
    @eugenebebs7767 Год назад +1

    I hope DirectStorage is gonna alleviate the VRAM limitations.

  • @TinMan445
    @TinMan445 Год назад +1

    I’m here learning why my 3070 ti was on sale for SO cheap

  • @gabriel_ramon
    @gabriel_ramon Год назад +17

    That's why I went with AMD instead of Nvidia this time. I had bought a 4070 Ti for R$7,300 ($1,391.38 USD) here in Brazil. It had issues 2 weeks later and unfortunately died.
    I sent it in for RMA and now they will refund me. I bought another graphics card to replace it for a much cheaper price, as I couldn't wait too long for the refund, and decided on a 6750 XT, which is priced at R$3,400 ($648.04 USD).
    I'm quite happy with the performance of the 6750 XT, as it runs games smoothly at 2K resolution. I can run everything on max settings except ray tracing in Resident Evil 4 Remake, which causes the game to crash after a while. However, I don't really care, as it doesn't make that much of a difference, and I prefer good performance over fancy lighting effects that drastically reduce it.
    My next card will be a 7900 XTX with 24GB of VRAM. I'm tired of Nvidia; this problem I had with the 4070 Ti was unforgivable, such an expensive card having issues after 2 weeks of use and still having only 12GB of VRAM.

  • @winslowpippleton7157
    @winslowpippleton7157 1 year ago +37

    Set it to 2GB; it will use the same high textures as the 8GB setting but doesn't allocate as much. Take some screenshots and see for yourself (this has been known for years for the RE Engine).

    • @pk417
      @pk417 1 year ago +4

      Not true in this game.
      Anything below 8GB looks low-res in this game, unlike previous RE games.

    • @Xilent1
      @Xilent1 1 year ago +4

      @@pk417 Also, in past RE games you could go over the VRAM limit and it still worked, from my knowledge.

    • @Makr0ss
      @Makr0ss 1 year ago +2

      He did, at 3:40

    • @pk417
      @pk417 1 year ago

      @@Xilent1 Yup, because that was only allocation.
      But this time games are literally using that amount of VRAM.

    • @Angel7black
      @Angel7black 1 year ago +1

      @@Xilent1 It's still the same; it's crashing because ray tracing is bugged, not because of the VRAM, at least not directly. If you turn RT off and max out the settings it still runs fine, but he didn't show that and I'm not convinced he knows that. Even on lower textures at 9GB of VRAM usage, I was crashing in engagements with RT on with a 12GB RTX 4070 Ti. After that I turned RT off, set everything to max at around 12.34GB of VRAM usage, and was completely fine. The game just needs a few optimizations right now, and based on past RE games, the in-game VRAM indicator is never going to be accurate to how much VRAM you're actually using.

  • @oktc68
    @oktc68 1 year ago +7

    You make an excellent point. I've been struggling along with my 2070 and very nearly bought a 3070 Ti, as in many respects it was better than its predecessor; it was the 8GB of VRAM that ultimately stopped me from buying that GPU. I'm glad I waited for the 4070 Ti. It's a great 1440p card, and the extra 50% of VRAM is just about OK for now and the next couple of years. 🤞🏻 Nvidia needs to stop being tight with the VRAM on their cards. 8GB is only adequate for 1080p gaming; for the prices they charge they should have doubled it to 16GB for 1440p cards (I don't know anything about 4K gaming; until you can run games at 144-165 FPS it doesn't really matter IMO). When paying $1,000-$1,600 for a GPU, I think it's fair that you get a card with some headroom. VRAM usage has been climbing in new titles, and I'd argue that Nvidia is strangling the performance of some of their cards by being so tight with the VRAM.

    • @katieadams5860
      @katieadams5860 11 months ago

      The 4070 Ti would be a good card if it were $600 instead of $800.

  • @cadewhite8670
    @cadewhite8670 1 year ago +1

    It's so funny how adding more memory isn't even a logistical problem. They're just being greedy a-holes about it. They could easily pump out GPUs with 16GB of VRAM and probably not hurt their bottom line.