How bad is 8GB of VRAM in 2024? The newest games, RT, DLSS, FG on/off, Ultra, High, 1080p, 1440p, 4K
- Published: 17 Jun 2024
- Sell your old GPU to fund your upgrade at Jawa! jawa.link/OwenGPUJune24 Use code OWEN10 to fund your upgrade!
In this video we use the 8GB and 16GB versions of the 4060 Ti head-to-head to pin down exactly when games need more than 8GB of VRAM, and how badly that impacts performance.
Test system specs (ResizeBAR/SAM ON):
CPU: Ryzen 7800X3D amzn.to/3Hkf7Qi
Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
RAM: 32GB DDR5 6000 CL30: amzn.to/41XRtkM
SSD: Samsung 980 Pro amzn.to/3BfkKds
Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
Mouse: Logitech G305 amzn.to/3gDyfPh
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
Chapters:
0:00 Ghost of Tsushima 4K Very High
0:22 How bad is 8GB of VRAM in 2024? What/how are we testing?
1:10 Sell your old GPU to fund your upgrade at Jawa!
2:41 How do you read the stats to see a VRAM issue?
4:35 Ghost of Tsushima 4K Very High DLSS P
5:31 Ghost of Tsushima 4K High DLSS Q
5:58 Frame Generation uses more VRAM: Ghost of Tsushima 4K High DLSS Q FG
6:52 Ghost of Tsushima 1440p Very High
7:19 Ghost of Tsushima 1440p Very High DLSS Q FG
7:47 Ghost of Tsushima 1080p Very High
8:14 Horizon Forbidden West 4K Very High
8:41 Horizon Forbidden West 4K Very High DLSS P
9:10 Horizon Forbidden West 4K High
9:36 Horizon Forbidden West 4K High DLSS P
10:03 Horizon Forbidden West 4K Medium DLSS P
10:30 Horizon Forbidden West 1440p Very High
10:57 Horizon Forbidden West 1440p Very High DLSS Q
11:24 Horizon Forbidden West 1440p High
11:56 Horizon Forbidden West 1440p High DLSS Q
12:08 Horizon Forbidden West 1080p Very High
12:20 Horizon Forbidden West 1080p High
12:43 Hellblade 2 4K High
13:10 Hellblade 2 4K High DLSS P
13:36 Hellblade 2 4K Med DLSS P
14:04 Hellblade 2 1440p High
14:31 Hellblade 2 1440p High DLSS Q
14:54 Hellblade 2 1440p Med
15:12 Hellblade 2 1080p High
15:23 Avatar 4K Ultra
15:50 Avatar 4K Ultra DLSS P
16:17 Avatar 4K High
16:44 Avatar 4K High DLSS P
17:11 Avatar 1440p Ultra
17:40 Avatar 1440p High
17:47 Alan Wake 2 4K High DLSS P
18:06 Alan Wake 2 4K Low DLSS P
18:29 Alan Wake 2 1440p RT High DLSS B
18:56 Alan Wake 2 1440p RT High DLSS B FG
19:27 Alan Wake 2 1440p High
19:47 Alan Wake 2 1440p High DLSS Q
19:58 Alan Wake 2 1080p RT High DLSS Q
20:13 Alan Wake 2 1080p RT High DLSS Q FG
20:44 Alan Wake 2 1080p High
21:02 Resident Evil 4 Remake 4K Max
21:19 Resident Evil 4 4K Prioritize Graphics
21:40 Resident Evil 4 1440p Max
22:00 Resident Evil 4 1440p Prioritize Graphics
22:23 Resident Evil 4 1080p Max
22:37 Resident Evil 4 1080p Prioritize Graphics
22:47 Starfield 4K Ultra
23:05 Starfield 4K Ultra DLSS P FG
23:26 Starfield 4K Ultra DLSS P
23:45 Starfield 1440p Ultra
23:55 Starfield 1440p Ultra DLSS Q FG
24:19 Starfield 1080p Ultra
24:26 Cyberpunk 2077 1440p RT Overdrive DLSS P
24:57 Cyberpunk 2077 1440p RT Overdrive DLSS P FG
25:23 Cyberpunk 2077 1080p RT Overdrive DLSS Q
25:50 Cyberpunk 2077 1080p RT Overdrive DLSS Q FG
26:17 Final Thoughts
The Avatar devs had a presentation discussing how they use a dynamic texture streaming method to prevent performance issues on low-VRAM cards; it allows even 4GB cards to work. DF has a video on it called "Avatar: Frontiers of Pandora Tech Deep Dive - GDC Reaction Special"; at 15:00 in, they explain it
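The idea behind that kind of streaming can be sketched in miniature. This is an illustrative toy, not the actual Avatar/Snowdrop code: it models a streamer that drops the highest-detail mip levels of the least recently used textures until the working set fits a VRAM budget (all names and numbers below are made up for the example):

```python
# Toy texture streamer (illustrative only): degrade the least recently
# used textures mip-by-mip until the total fits the VRAM budget.

def mip_chain_bytes(base_bytes, top_mip):
    """Memory for a mip chain whose largest resident level is `top_mip`.
    Each mip level is 1/4 the size of the previous one."""
    total, level_bytes = 0, base_bytes / (4 ** top_mip)
    while level_bytes >= 1:
        total += level_bytes
        level_bytes /= 4
    return total

def fit_to_budget(textures, budget_bytes):
    """textures: list of dicts with 'name', 'base_bytes', 'last_used'.
    Returns a {name: top_mip} choice that fits the budget."""
    top_mips = {t["name"]: 0 for t in textures}
    # Least recently used textures get degraded first.
    by_lru = sorted(textures, key=lambda t: t["last_used"])

    def total():
        return sum(mip_chain_bytes(t["base_bytes"], top_mips[t["name"]])
                   for t in textures)

    i = 0
    while total() > budget_bytes and i < len(by_lru):
        t = by_lru[i]
        if top_mips[t["name"]] < 4:   # never degrade below mip 4
            top_mips[t["name"]] += 1
        else:
            i += 1                    # move on to the next-oldest texture
    return top_mips

MB = 1024 * 1024
scene = [
    {"name": "rock",    "base_bytes": 64 * MB, "last_used": 10},
    {"name": "foliage", "base_bytes": 64 * MB, "last_used": 95},
]
print(fit_to_budget(scene, 96 * MB))  # the older "rock" gets degraded first
```

A real engine makes this decision per-frame from GPU feedback rather than timestamps, but the budget-and-evict loop is the core reason such games degrade gracefully on 4GB cards instead of stuttering.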
Went and checked, and the price is a rip-off; I can sell it on eBay at a much better price
No need to upgrade. My 6900 XT with 16GB VRAM does a great job and I've yet to find a game that taxes it. And I play VR games.😁
No thanks, it's offering $40 for my 5500 XT Mech 8GB OC
Could you compare a faster 8GB GPU to the 4060 Ti 8GB? Like RTX 3080 vs 4060 Ti 8GB.
Remember folks, Nvidia is still going to release an 8GB VRAM GPU with the RTX 50 series, and AMD will do the same 🤔
The more you buy the more you save
Yeah, who cares? Just don't buy it; then they will eventually stop releasing it. Nobody HAS to buy an 8GB 5060 or 5060 Ti. Ignore those products. If people always just cry about it but then still buy it, Nvidia won't change anything.
I personally don't care; I will buy a 5090, and even if that one stayed at 24GB I wouldn't care, because it's way more than enough. (It won't stay there, but will likely be 28GB or 32GB; more likely 28GB, though.) Once the 5060 is out, people can buy a 4070 Super if they want more than 8GB VRAM and can't afford a higher-class GPU from the 50 series. Or they can buy an AMD card if they want to.
Nvidia is banking on its new texture compression (NTC), which has to be implemented into games... greed or short-sightedness? I leave that up to you
And likely a 6gb version at some point like the 3050 6gb lol.
They’re probably going to release a 12GB 5070 and also slap 16GB onto a 5060 Ti in clamshell and charge an extra $100
Just download more VRAM
How bad is 8GB of VRAM in 2024...
Me watching this video with 4GB of VRAM in 2024
Dodged a bullet right there. Good thing you didn't get an 8GB GPU.
Yeah, I feel you. I am watching on a GTX 1050 Ti
I think it's funny that I'm watching this video about how 8GB isn't enough and reading this comment about you having 4, while I have 512 MEGABYTES of VRAM
😂😂 same here bud
Me with my 4GB 3050 laptop.
Meanwhile me with 6GB: Chuckles, I'm in danger. 💀
At minimum, only the RTX 5050 should be 8GB; everything else needs to be 12GB+
Sadly people keep buying, so they have no reason to up the VRAM.
There's another company offering higher VRAM capacities; look 'em up
@@Herr_Affe I guess it depends on availability in the end. Where I'm from I had to choose between the 7800 XT and the 4070 Super. The 4070 Super was around $100 more expensive. I went with the 7800 XT because I didn't think 8% more performance with less VRAM was worth $100. If they were the same price, it would've been a tougher pick. But welp, the Nvidia tax is an actual thing.
rtx 5030 4gb
@@Herr_Affe Intel is actually really good and heckin' cheap; got one as an "in-between" because my 3070 broke, and honestly... mostly everything just works and performance is good enough :D especially for like 150€ 💀
I see a lot of comments saying that the video just proves that 8GB is for 1080p. Let's make one thing clear from the video: the 4060 Ti chip can deliver more than what 8GB is able to handle, and it's a fair assumption that someone will try to run at 4K using DLSS Performance with frame generation. The worldwide gaming community is not only the US; there are countries where a 4090 costs the same as a car, so buying a 60-class card is usually the affordable way to go (and at least where I live, it still costs more than $2K). The time for giving a little more VRAM is well past now; 8GB for a 60-class GPU should be unacceptable.
1080p, or 2K with some upscaling, is maybe still enough. Not talking about new cards, though...
As long as there are fanboys who defend the skewed price to value ratio, it will never change.
@@Nif-kun I would presume companies don't care about fanboys or even opinion, only sales.
The question is: how much does fanboyism really contribute to sales?
@@xerxeslv
In all honesty, gamers don't really contribute that much to overall sales. The reason Nvidia and AMD now prioritize buzzwords like crypto or AI is that industry-level corpos buy them in bulk. Then again, we are still a demographic to consider, since we still pump money into them. The reason things seem to have stagnated is that people still buy them regardless of complaints. Voice has no say in sales, and until people stop buying these products, nothing will change. The fanboys simply propagate the mindset of buying without thinking, which leads to where we are.
The RX 6600 8GB is a pretty popular GPU
Saw this coming with my 3070 and sold it.
Got myself a 7800 XT instead and couldn't be happier with my decision.
I'm gaming at 1440p without many limitations right now.
Saw a Rx 6800xt on jawa for 390$
Still hanging on with a 3070, because a 7800 XT would cost me 3x what I paid for my 3070. And I'm not planning to upgrade before GTA 6 hits Steam on PC
Did the same with my 3070, got a RX 6950 XT on sale and could not be happier. CPU is the current bottleneck so will be the next upgrade.
@@jacomoolman6503 The 6950 XT is also apparently better than the 3090 without RT. Freaking insane that people still sleep on AMD.
I'm using a 3070 Ti; it's still fine, but maybe next year is when it really starts to struggle
My 1080 ti with 11gb of vram:
Best purchase you ever made if you got that thing at launch.
what about your 1080 Ti with 11GB of VRAM?
that card is too old..
@@groenevinger3893 and yet it plays newer titles just fine
@@groenevinger3893 VRAM isn't the be-all and end-all. Clock speed, bus width: these all factor in. The 352-bit bus on a 1080 Ti is still good in 2024 because that bus width lets the memory transfer much faster than if it had a 128-bit bus.
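The bus-width point can be made concrete with the standard peak-bandwidth formula: bus width in bytes times effective data rate. A quick sketch using the published memory specs of both cards mentioned in this thread:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate (Gbps).
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Published specs: the GTX 1080 Ti uses 11 Gbps GDDR5X on a 352-bit bus,
# while the RTX 4060 Ti uses 18 Gbps GDDR6 on a 128-bit bus.
print(bandwidth_gbs(352, 11))  # 484.0 GB/s (1080 Ti)
print(bandwidth_gbs(128, 18))  # 288.0 GB/s (4060 Ti)
```

So the seven-year-old card still has the higher raw bandwidth, though to be fair the 4060 Ti partly compensates with a much larger L2 cache, so raw bandwidth alone doesn't settle the comparison.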
90% of the stutter and lag complaints I read actually end up being just VRAM overflows. People are very uneducated about VRAM and how it affects your games.
Someone is going to make a meme about you moving your whole person around to point at things 😂
This is why we love Daniel. He is the real deal, a legend, and to some.. a mouse pointer
Cursor Daniel is probably my favorite edition of Daniel
bout to make my windows mouse cursor into a transparrent Daniel pointing his finger at anything i want him to point his finger at
Daniel Owen The RUclips Pointer Channel. Sounds about right.
The same way a lot of youtubers started copying Josh Strife Hayes' quip of holding a mug while talking and General Sam standing up with a big mic in his hand narrating things in the background.
It's only a matter of time until it starts catching on, really.
12GB VRAM is the sweet spot... 16GB VRAM is ideal...
For now.
@@sandboy5880 Thanks Captain Obvious.
then my 24GB VRAM is overkill... for now, that is... I don't trust lazy developers 😂
Well, yes, 24 is ridiculously overkill (except for AI or productivity applications). Of course, in 10 years that will change, but that's how it's always worked. @@brunoutechkaheeros1182
12GB is barely enough if you want to play at console settings...
Don't forget to mention that other software in the background also uses VRAM, and its usage grows with newer software versions. I can use up to 3GB even without any game: browser, Steam, VS Code, image editor, video player on a 2nd monitor, Discord, Telegram, etc.
Apps use system RAM, not VRAM; VRAM is used only for the monitor output and textures, so only a 2nd monitor can eat into your VRAM, hence 32GB of system RAM is recommended
One thing about this video, though, is that it's using a 4060-class card at 4K, something nobody has ever recommended, because 4K requires way more VRAM and graphics compute. However, this is basically the only way to show that VRAM is an issue: by running a 4060-class card against its 16GB counterpart at max settings at 4K... something people buy 4090s for.
@@alargecorgi2199 That's an artificial limitation. The fact that the 16GB variant can do fine at 4K shows these GPUs are being unnecessarily limited to lower visual fidelity. Also, the bigger problem is not being able to match console-quality textures, and probably resolution too, in the future.
@@RiasatSalminSami Yeah, but that's the thing. It's not an actual fair gaming comparison, because 99% of 4060 buyers don't use 4K and wouldn't run into these situations. The only reason, like we both said, is to show the limitation. It's like those CPU tests where they run a 4090 at 1080p to show max uncapped fps in a first-person shooter at low settings. Except in those situations, fps players actually do play on low settings at 1080p to get the most out of their game for reaction speed. In these RPGs? Ain't nobody going to sacrifice all their graphics or go to max resolution at 8 fps.
@@alargecorgi2199 99% won't use 4K because they can't use 4K, due to the artificial limitation. None of this changes the fact that these GPUs are basically garbage compared to consoles. And for the price they're being sold at, that's not a good sign.
Maybe if it were sold for $100, then 8GB would have been acceptable for a potato product.
Frame gen was such a scam.
Would love to see a How bad is 10gb vs 12gb video.
Same as here: in the vast majority of games you'll never go past 12GB at 1440p or upscaled 4K. 8-10GB is an automatic no-buy in 2024.
he did a 4070 vs 3080 months ago, so you have that; it's still from 2023 or early 2024.
🙂 the RX 6700 10GB is pretty decent; it should be enough for 1080p high, and it only uses a single 8-pin plug, so all you need is any 450W power supply. It's a bit faster than a 5700 XT
@@anitaremenarova6662 Oh boy, you are wrong. The same behavior he showed here, with the 8GB card never going close to 8GB in use, happens with 12GB too. My 4090 has enough VRAM for games, and most modern games I played used 13-15GB.
@@Just4Games2011 Allocated VRAM isn't the same as the VRAM actually being used.
Task Manager also shows when VRAM overflows into system RAM; handy to keep an eye on.
CapFrameX will show you as well
How do you see that on the task manager?
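Whichever tool you use to spot the overflow, a back-of-envelope model shows why even a small spill hurts so much: the spilled accesses travel over PCIe instead of the on-board memory bus. This is an illustrative harmonic-mean sketch, not a benchmark; real games cache and prefetch, so actual numbers will differ:

```python
# Illustrative model: when part of the working set overflows into system
# RAM, those accesses go over PCIe, which is far slower than on-board VRAM.

def effective_bandwidth(vram_fraction, vram_gbs, pcie_gbs):
    """Harmonic-mean bandwidth when `vram_fraction` of memory traffic
    hits VRAM and the rest spills over the PCIe link."""
    spill = 1.0 - vram_fraction
    return 1.0 / (vram_fraction / vram_gbs + spill / pcie_gbs)

# Example numbers: 4060 Ti-class 288 GB/s VRAM, but only a PCIe 4.0 x8
# link (~16 GB/s) to system RAM.
print(round(effective_bandwidth(1.00, 288, 16), 1))  # 288.0 - no spill
print(round(effective_bandwidth(0.90, 288, 16), 1))  # 106.7 - just 10% spill
```

A mere 10% spill drops effective bandwidth by roughly two thirds in this model, which is why a card can fall off a cliff the moment its VRAM fills rather than degrading gradually.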
I have 12GB VRAM right now; I'll wait 4 years before finally biting the bullet and upgrading. One thing's for sure, I'll get the top-of-the-line stuff next time, whenever I upgrade
: )
Just get enough Vram, RX 6800 was $400 two years ago
@@lupintheiii3055 I feel like at 1440p I will survive with 12GB, especially coming from 4GB VRAM up until two months ago; I won't mind lowering the settings, you know what I mean? Plus my GPU is very capable (4070 Super)
that's what I do. I currently have 8GB, a 1070. I wanted to buy a 4090 but decided to just wait for the 5090. The 24GB of the 4090, or the probably 28GB of the 5090, is more than enough for YEARS. I will play at 4K, but even then, developers won't make games where you need more than 24GB VRAM anytime soon. Hell, more than 16GB isn't something devs will ask for anytime soon, because the majority of users just don't have that much, let alone 24GB+; only a minority has that much VRAM.
So I plan to keep my 5090 for a bit and maybe look at the 7090 and how good it is. As long as I can comfortably drive my upcoming 240Hz 4K OLED monitor, I'm extremely happy. A 5090 will be able to do it in some games with DLSS, but there are already UE5 games where this probably won't be possible at 4K, even with DLSS Performance. So if anything, that will be the reason I upgrade to a 7090 or 8090, but certainly not because I run out of VRAM. I firmly believe VRAM won't be a thought for even a millisecond during the lifetime of my 5090.
1440p is going to be viable for that long with 12GB VRAM, unless they make more unoptimized shit. Also, I previously saw a huge difference when I maxed out texture settings in games, but these days high vs very high/ultra is barely noticeable.
@@DELTA9XTC The 90-class cards just don't seem worth it to me. Maybe I'll get the 6090 because of my immaturity and poor sense of humor, I don't know. But is a single computer part in the $2000 range worth it when you can get a whole, very capable computer for the same price? If you want to upgrade from a 1070 to a 5090 then go for it; it's not my money. I just think there are more reasonable options.
I'm glad I sold my 3070 and got a 4070 Ti Super when it launched; 8GB definitely became a hindrance for 4K in 2024.
dlss brother
@@xxNiceLeaderxx Isn’t always enough like in CP77 RT.
@@sapphyrus I don't use RT in my games so i'm good on that.
I still have 8gb (built in feb 2021). When I upgrade from 1440p to 4k and/or play newer titles, I'll upgrade to 16gb+.
Daniel chose an excellent batch of games to test.
Fr💀
These are the games which I also want to play
On my 16GB 4060Ti
Interesting that even when you can adjust things to get it to work, there are certain modes that just don't behave in newer games on the 8 GB version.
Rules of happy gaming:
1. Turn off fps counter
2. Use recommended settings
3. Enjoy the game
The results suggest that if you're buying a 4060 thinking it's a native 4K beast, you will get bad results with 8 or 16GB, because the card blows at native 4K no matter how much VRAM you give it.
You could struggle at native 4K even with 16GB (like how Ghost barely hits 30fps in a game with PS4-tier visuals), or you could play at something realistic for bottom-tier hardware and get 60 on 8GB or 16GB.
It would have been nice to add the 3060 12GB and see if it beats the 8GB one in some cases
It does; none of these examples would brick a 12GB card.
Academically: yes. Practically: it probably doesn't have the horsepower to lead to any desirable outcome.
Imagine spending money on an obsolete 8gb 5000 series card that would probably cost $300. "The more you buy, The more you SLAVE"
waiting for the gddr7 dweebs lol
Just turn down texture settings or get Radeon.
Nvidiot Tax
Those will sell a lot!
Slave lol
Great video as always, have you ever tried using Nvidia Broadcast to prevent annoying background audio on your mic during recording?
Watching this with an RX 580 bought back in 2017 and still rocking! Playing GoT with FSR3 at 1080p 60fps just fine here
People complain but still keep buying Nvidia. I'm on a 3060 Ti from AliExpress, paid $200; my next will be an AMD
True! I crossed the fence - upgraded from a 2060S 8gb to a 7900GRE 16gb
What are you talking about dude?
The AliExpress prices I see for the RTX 3060 Ti are around $400. So where do you find them for only $200?
@@Tiber234 same with sapphire nitro+ 7900 gre
@@cajampa Refurbished ex-cryptomining units from China, probably? They started flooding AliExpress when China tried to ban crypto in 2021.
Those ex-miner 3060 Tis are sold in my country at around 200 USD.
@@cajampa I got mine for $320 on eBay, 2 years ago now I think. It's used stuff, I guess.
Bought a 3070 Ti before the 40 series and the 12GB 30 series cards came out. Feels real bad now
Always happens to me every time I upgrade... like a month later a Ti version or something comes out with extra gigs or speed 😭
Eh, I have the same card and it still works out well enough for me at 1080p. Just manage your expectations, and perhaps be like me and play older games that don't suck as much.
@@X_irtz Bro, a 3070 Ti isn't a 1080p card, GPU-wise. I mean, people use(d) 3060s for 1440p sometimes, and that is a bit of a stretch imo, but the 3070 Ti is really a classic 1440p card. Or let's say it would be one if it had more than 8GB VRAM, and that's exactly the problem. Performance-wise it's quite strong, around an RX 6800 non-XT. A 3080 is like 20% stronger, and people used the 3080 as a 4K card when it got released. A 4070 non-Super is only like 15% faster than a 3070 Ti according to TechPowerUp's relative performance chart.
The thing is, you can't even use the full 3070 Ti at higher resolutions because of the VRAM limitation. And it's not about all new games sucking; it's that they often want more VRAM because of new graphically intensive features, and the 3070 Ti doesn't have it. That's what we can see here with Ghost of Tsushima at 4K Ultra.
Should've had the foresight, people who bought a 4070ti for 1440p and 4080 for 4K will feel the same soon as well.
@@X_irtz Why 1080p? I have a non-Ti 3070 and play everything at 1440p fine. Both our cards should absolutely have more VRAM, but I don't understand why you are playing at 1080p.
Great video, thanks for the information overload. My personal takeaway is that for my 8GB 3060 Ti I should think about trying 1080p with higher settings besides what I usually do (what I usually try first now is 1440p with high settings, then DLSS Q if that does not give me 60fps, but 1080p@60fps is fine for me as well, and as this video points out, DLSS takes up memory too).
Thanks from France for this video!!
Just lower textures from very high to high and the problem is solved. It also has almost no impact on image quality in most games
It is not as big a deal as you make it
Like literally it’s that simple smh or just don’t use FG. I like Daniel’s content but damn this VRAM discussion will always be the same.
And it’s just common sense to get more VRAM when you upgrade your gpu.
In fact, it's actually worse than he makes it out to be, since textures have zero impact on performance, so having to lower them because you lack VRAM is the dumbest thing ever
Ultra and high settings may not look too different, but to think that games moving forward aren't going to keep using more and more VRAM is delusional
Yeah, lower textures even though you bought an expensive $400+ GPU. Seems logical. Great times.
@@hircine92h The 4060 Ti is a 1080p GPU, and you're crying that you can see a difference at 4K. Both GPUs suck at that resolution; no difference between 8 or 30fps, you won't play like that either way. And if a game uses more than 8GB at 1080p, the 4060 Ti will be too slow to run at that res anyway, as every feature costs performance, not only VRAM; even frame gen costs you real fps, and you can easily see that with any monitoring program
The man doing his duty to say "nice" when he runs into the number 69. Salutations, sir!
Does FSR produce the same frame rates with 8GB and 16GB, given that video memory wouldn't be an issue?
It took this video 8 minutes to actually show me the true difference, which is why I tell people that 8GB is not the end of the world at all. Who in their right mind would be buying a 4060 and playing at 4K? Come on, man... there shouldn't even be any testing done at such a ridiculous resolution for a budget card. Make a new video with both cards side by side at 1080p only, and then let's see how unnecessary 16GB is for, what, twice as much money.
Still at 1080p; my 3060 Ti will last for a long time...
using a 4060 ti for 4k gaming...
It is like using an 800kg car to tow a 4,000kg trailer…
depends on what game
Quality and informative videos like this one are why I subscribe to his channel. Great factual work, not opinion-based work.
Tried to get my hands on a 4060 Ti 16GB. Unfortunately the price was way above MSRP and only the 8GB was available, so I got a new 3060 12GB.
I think these are entry-level cards and still have a place in the market. If you are not at maxed or ultra settings, or being ambitious with resolution on a low-end card, I think an 8GB card is enough. I think $250-350 is where 8GB belongs
I expect that next gen, 8GB will be close to $500... but who knows. GDDR7 will be really expensive in the beginning.
@@haukikannel Yeah, the die is too, but supposedly they are cutting down its core count as well as keeping it GDDR6. If I had to guess, it would most likely be maybe 10% faster than a 4060, priced either around $280 (if Nvidia ever threw gamers a bone) to maybe $330 until the 40 series drops in supply, or $299 with the 40 series dropping under MSRP. The 40 series is still sitting around MSRP with about a year left until a 5060 release
Price is key though; I got a 6700 XT 12GB for a secondary build, it was $245 on eBay with free shipping. 8GB shouldn't ever cost more than that. Personally, the 3060 12GB or 6700 non-XT 10GB are the lowest-end cards I'd even recommend for entry.
@@puffyips Comparing the used market is a different ball game entirely, but I agree with you on price point to an extent. The 3050 was 8GB for about $250 MSRP, but a 4060 (which should have been a 4050) probably wouldn't be able to use more than 8 regularly. They definitely should've kept it around $280, with the 4060 Ti 8GB around $329-350 and the 16GB at $399. It would've been better received.
This is regarding 5060 and 4060 though, I believe 5060ti will be 12gb
In my opinion 60fps should be the minimum. It's 24-30fps at 4K high on 16GB vs 8GB; the 4060 Ti just doesn't have enough power to make use of that VRAM. 30fps isn't enjoyable, and you'll get a much better experience if you just lower your settings a little to get 60fps, and by that point the difference between 8GB and 16GB is 5-15fps. And for the money, you could get a card that has more power and end up with more fps.
@jumbob That's what I was noticing too. The settings he used to create vram issues are so far beyond what those cards are suited for to begin with that it doesn't even really matter. The game is going to run like garbage at those settings either way, just marginally less so with more vram. Do people really think they're supposed to crank everything to the max on a midrange card? Higher settings don't necessarily even look better depending on what it is. Stuff like depth of field, motion blur etc I turn off anyway regardless of performance because I just don't like the way they look. A lot of stuff like that is a matter of personal taste. And when you dig a little deeper there are always settings that have a heavy performance impact for miniscule difference in visuals.
You wouldn't use some insane 4k config but the video literally shows it loses frames in Forbidden West at 1440p with DLSS quality. You're rendering a base 1080p image at that point and the FPS is still over 60 but it's less than it should be for that hardware. I don't think 1440p DLSS quality is some insane setting in a game that first came out in 2022 on freaking consoles.
We only need to watch the first 11 seconds, hahaha, job done. This could have been a short.
Great test scenario. I'd love to see you revisit this once system RAM clock speed can close the gap with GPU VRAM clock speed. I'm curious if solutions like overclocked CAMM2 memory will make this a nonfactor for the budget GPUs of the future.
I feel like 8GB is less of a problem than some think it is. Yes, I think the 4060 and 4060 Ti should have had at least 12GB, but is 8GB really a problem?
If you buy a 4060 Ti you are NOT aiming for 4K ultra, that is a fact; it's more of a "1440p high with DLSS Q" card, and for that, 8GB proves to be just (barely) enough.
It's kinda okay; it's just that games can't run smoothly because of VRAM rather than the GPU's power, which sucks :(((( Imagine having a decent GPU that can easily handle ultra settings, but it's limited by VRAM 🫠
@@blondegirl7240 Nah, 28 fps is already a crappy frame rate for any game; it does not matter if 8GB sends you to 8 fps instead, because you start from crappy performance anyway.
@@blondegirl7240 me who is having a 16GB 4060Ti but limited due to its power 🥲
The point is that as AAA games advance (unoptimized-game dramas aside), even 1080p can already demand close to 8GB of VRAM. With that kind of thinking, everyone should be stuck using just an iPhone 5 or Samsung S5, etc.
Test Forbidden West in the Burning Shores area; I find my 8GB card really struggles in busy areas such as the main town, even at 1080p.
That sounds more like a CPU bottleneck
@@arenzricodexd4409 Nah, zWORMz Gaming did a bunch of tests using various GPUs with a high-end CPU, and he also came to the conclusion that VRAM usage is higher in that area.
Fleet's End is an extremely CPU-demanding area. My tuned R5 5600 can barely hold 60 fps there. VRAM-wise, the DLC does not use more than the main game.
Played both the main game and DLC at 1620p/DLAA/Very High on my 12GB GPU w/o any issues. While at 4K + DLSS Quality this game needs more like 13-14GB.
Had a 12GB 3060 and I had no problem running some games at 4K with upscaling; at no point did the VRAM become an issue. Upgraded to a 4070 Super recently, no problems so far either; it gets close to that 12GB but 99% of games I played don't go over it. But it is certainly clear that at some point in the near future even 12GB won't cut it anymore.
Thanks Daniel 😊.
I run 8GB at 1440p and it's all good
It's alright but not ideal; it's so much easier to enjoy a game when you can just play on maxed-out settings from the get-go
@@puffyips it's the 4060 Ti, gonna wait for the 5090 to drop
@@Zombie101 A 4060 Ti 8GB💀 at what cost? I paid $400 for a 6800 XT 16GB last year; even 3080s were $450 at that time. I wouldn't even take a 4060 Ti 8GB for free.
I hope you have a capable monitor for a 5090, or else that's an easy way to waste $2000 on just a GPU.
@@puffyips Bought it for £379 and it's a temporary GPU. Monitor is a 2K 240Hz OLED
That's weird.
On my RTX 3070 at 4K, Ghost of Tsushima runs like the 4060 Ti 16GB. Why?
Is it because of the higher memory bandwidth or Resizable BAR?
Probably both. Higher memory bandwidth may help here a little bit. And enabled ReBar uses a bit more VRAM.
@stangamer1151 I have 32GB of fast DDR5 RAM too; I think the VRAM is swapping things to the RAM.
I didn't face any stuttering issues with my 3070 in most games.
I always play at 1440p, and sometimes 4K if I can get good fps with DLSS on, because at 4K there is no more aliasing in games
@@iikon01No aliasing? Haha. If only that were true. There's HEAVY aliasing in 4K.
@berkertaskiran What I meant: at 1440p there is noise and a little shimmering on the character's hair and edges, even with the best anti-aliasing solution the game offers.
But at 4K there is none; the game is so detailed and sharp.
And at 5K it's so clean there's no need to activate anti-aliasing at all. But at 5K my poor GPU can only manage not-very-intensive games.
@@iikon01 I always need AA at 4K. Either DLSS or MSAA. Even 4x doesn't look good.
Funny how Avatar runs just fine on an 8GB GPU, with no massive visual quality downgrade vs the 16GB GPU.
At GDC they explained how they integrated techniques to manage memory efficiently (something akin to sampler feedback).
8GB should be just fine at most resolutions
Great Video Daniel, please do a video of RX7600 8 and 16GB. Thank you!
I would be interested to know how many games ran differently at 1080p, which is the target resolution of the 4060 Ti in the first place. All the other resolutions are for the 4070 and up.
99.9% will work just fine with 8GB... even with 4GB! When running 1080p
If I buy a 16GB VRAM card like the 4070 Ti Super or 4080 Super for 1080p gaming, would it last for at least 6 years of gaming???
Yes, either card would last a very long time at 1080p. What frame rates are you trying to get?
@@MrAnimescrazy I just want basic rasterized 60 fps at native 1080p for at least 6-7 years, because those fps will be enough for features like frame gen, or just for normal smoothness of the gameplay
@satyamwathrey7704 OK, then yeah, either card would be overkill for 1080p 60 fps, but I would get the 4080 Super since it performs better than the 4070 Ti Super and will last a bit longer. My PC is in my profile picture, my first all-white build with a 4090/7800X3D/64 gigs of DDR5, so it will last a very long time, but I am looking toward the next-gen PC parts, so I may upgrade.
@@MrAnimescrazy Actually I want an overkill card because I want it to last long at 1080p, since a normally performing card won't last as long. I will wait 6-7 more months to see what the 50 series has to offer, then I'll decide according to my budget
@satyamwathrey7704 OK. It will be hard to get a 50 series card, but if you can get one, compare its performance to the 4070 Ti Super and the 4080 Super, depending of course on your budget.
For your future testing, you might want to enable the per-process shared VRAM metric in the Afterburner OSD. It could be helpful to see exactly how much more VRAM is needed (though some games even without vram overflow have some shared allocation). Additionally, it looks like you have a lot of background applications open during this test. In my own tests, I typically see no more than around 750 MB of per-process usage compared to the actual usage.
I'm very pleased I chose the RX 6700 XT instead of the RTX 3070, because at the time the RX 6700 XT was much cheaper. It's been 2 years of use now and its performance is still very good with 12GB of VRAM.
Thank you for showing us that 8GB cards are 1080p cards.
That was actually my takeaway too. But I didn't see it as a bad thing; from the results I just thought, well, I guess I should default to 1080p in modern games on my 3060 Ti.
@@sanderbos Yeah, I wish you could push ultra at 1080p tho. That way you could drop to high and enjoy 70+.
@@sanderbos The 3060 Ti does fine at 1440p on medium/high settings. Why expect top-of-the-line performance from a mid-range card?
@@oliversmith2129 I mean, I had a 3070 and had to lower settings in HFW at 1440p because it was using too much VRAM, even on DLSS Balanced.
@@userblame632 HFW gives me 60+ fps in most areas at 1440p high settings (texture, 4x anisotropic, level of detail, etc.) with DLSS Quality. It sometimes drops below 60 in some highly populated areas. This is a mid-tier card, not meant for ultra 1440p gameplay.
All this aside, RE4 Remake uses way too much VRAM for the graphics it's putting out, and I don't know why that is.
game engine
Only if you turn textures all the way up. At 1440p with 2GB textures and high shadows, RE4 Remake uses just over 4.6GB of VRAM; at 4K, 5GB.
It isn't as VRAM hungry as you've been led to believe.
Maxing out textures in RE4 Remake serves no purpose of any kind, same with maxing out newer titles in general.
@@nicholasxamotainiumgilgamesh It's not engine-related; most "issues" are not related to the engine at all. That's just a catch-all that people with zero idea of what they're talking about use to vent that their hardware sucks.
Anyway, the main difference is texture size and how many textures the game has in use at any given point in time (what's needed to draw the current scene).
However, it isn't just textures either: some games use the GPU for heavy calculations and can flood it with large amounts of data, which can gobble up VRAM as well.
Any engine, custom or not, can use the GPU this way. All 3D games do the first bit, but only a handful do the second, because it's more work to plan around it.
It ALLOCATES too much VRAM, but it doesn't really need that much. At native 4K, maxed out, with RT and high (8GB) textures, it runs fine on the 12GB 4070 Ti. That card can barely offer a locked 60 fps experience though, so I personally played this game with modded DLSS set to Quality. That way the game looks better than native and provides 90+ fps.
12GB for 4K + RT for a game with decent quality textures is pretty reasonable, I'd say.
It's because RUclips tech channels push people to buy more expensive cards with more VRAM, so game developers can release games that don't run well with less VRAM.
Meanwhile, there are more people with less than 8 GB VRAM in the Steam hardware survey than there are people with more than 8 GB. I'm getting the feeling that channels like this one don't really reflect what typical gamers are buying and playing.
There is something to be said about starting with the 4K very high tests vs. working your way up to them.
If you only have a 24" 1080p monitor, what are the best GPU and CPU?
I don't think anyone should be buying an 8GB card for 4K. 1440p is still fine in most games that are actually optimized, even with 8GB cards, especially if you turn down some settings.
8GB is not fine enough for 1440p.
@@BlackJesus8463 It is in many games if you turn down some settings.
@@BlackJesus8463 This video literally proves that 8GB is enough for 1440p. When you're buying a 4060 series card you should not be expecting MAX SETTINGS EVERY GAME NO MATTER WHAT kind of performance. This is nothing new. 60 series has always been the card where you may need to lower some settings here and there.
@@OneDollaBill No it's not; try using a GPU with more than 12GB, then compare image quality.
@@lupintheiii3055 You're a bot, look at the video. Stop posting BS.
12GB RTX 3060 here, no problems.
How about a test of a 10GB card? I'd like to know how VRAM-bound my 3080 is at 1440p on my flatscreen and at 4K in VR games.
That was a really well made analysis. 👍👍 Regarding the screen mode used for these tests, were you using Fullscreen or Borderless? Some games can stutter in either of those modes, depending on the game itself and the engine it runs on. One of the weirdest examples I know of is Dead Island Remaster along with DI: Riptide Remaster. On an RTX 3090 with 24GB of VRAM, both games stutter in Fullscreen but run super smooth in Borderless or Windowed, which is kinda odd. For comparison, they both run flawlessly on a 680 2GB and a 1080 8GB, regardless of the screen mode. 🙂
At 1080p there's no point complaining about 8GB of VRAM most of the time...
BS. I tried Hogwarts Legacy at 1080p with an RX 6700 XT 12GB. The game was eating around 9-10GB of VRAM. The Resident Evil games would also eat huge amounts of VRAM with textures and settings pumped up. Yeah, 8GB of VRAM in 2024 is just garbage.
“most of the time…”
@@hircine92h Yeah, that's called terrible optimization; there are even people with 24GB VRAM GPUs saying the game consumes almost all of it.
And yet people play this game on an RX 6600 at 1080p ultra settings at 54-60 fps, no problem.
So no, 8GB of VRAM in 2024 is not garbage; it's the AAA industry and their lack of optimization. Or they did it on purpose so people waste money on more GPUs, since companies don't find it profitable when someone keeps their GPU for more than a decade.
@@hircine92h It had memory leaks, which were fixed, plus you just don't need to use max settings every time... BTW, Nvidia has better VRAM compression, so 8GB =/= 8GB.
I'm playing Cyberpunk 2077 with max texture settings without problems on 6GB of VRAM 🤣
Nowadays every game should have automatic texture swapping when VRAM is full, like in the Avatar game; too bad that's not the case.
Nvidia is really ripping their customers off with these low VRAM cards.
And AMD with those drivers and stutters; pick your poison.
@@erisium6988 Drivers and performance are fine. Fanboys are really childish and don't contribute anything meaningful.
@@erisium6988 "me when I lie" :P
@@joshmonus I've only had crashes with my 7900 XTX in two games that I play quite a bit since swapping from an Nvidia card: Counter-Strike 2 (driver timeout issues) and Team Fortress 2 (which just randomly closes sometimes when connecting to a server). I'm trying to figure out if it's my DDR5-6000 RAM or something else, but other than that the software and AMD seem pretty good: no stutter when paired with my 7950X3D CPU.
You do not need to drop ALL game settings to fix such issues. Often reducing texture quality one step from Very High to High (or Medium) is already enough to make the game playable, if the GPU itself is fast enough.
Yes, this is a downgrade in image quality, but if you are buying lower midrange cards you should accept medium settings as your friend anyway.
That being said, medium in new games looks good most of the time.
It's not like the early 2000s, when shadows or meshes just went missing when you turned the settings down a bit.
An 8GB video card is barely adequate for 2024 with some compromises, and hardly futureproof. Buying a used one with 8GB is OK, but don't buy a new one with 8GB unless it's cheap: $250 or less.
Nahh bro, my 1060 with 3gb is enough!!!!
Literally not a big deal. You can't even tell texture settings apart unless you sit up close to your panel and nitpick.
Maybe enough for GTA San Andreas.
Funny thing about games is that many people play old titles, and your 1060 will be perfectly fine for older titles like Battlefield 4; it will perform pretty incredibly.
I don't judge people for having old GPUs; some don't even care about modern titles, and play DOTA 2 and old RTS-style games.
Having a powerful GPU may actually cause issues in older titles.
I have a 4090, and Battlefield 4 never utilizes 90% of my GPU, which in turn forces my CPU to work more: the CPU is not fast enough for the GPU, and the title isn't demanding enough to drive the GPU, creating some frame time issues.
You want that 90% sweet spot somewhere, with low CPU usage.
I play older titles, and yet I have a 4090 and 13900K. I was doing just fine with an RTX 3070 in BF4, which I play to this day.
It's the Unreal 4 and Unreal 5 engines that take a toll on your GPU. They are not optimized for responsiveness; they mostly look good and have potential for beauty, at the cost of performance. Unreal 5 is brutal on the GPU: even a 4090 struggles to keep 160 fps consistently in most titles.
I have a budget rig with a 780 Ti and it still amazes me in some games. 3GB GDDR5.
@@4m470 Depends on the game; in Cyberpunk and RDR2 there's a very big difference, but there are some tricks to improve that.
This is pretty much what I'm running into with an RTX 3080M 8GB laptop. It can boost all the way up to 140W, so it has the performance, but at 1440p in newer games I have to drop the textures to high or even medium to avoid performance issues. Thankfully, once I do that, I can game at mostly high settings in both Ghost of Tsushima and Horizon Forbidden West with DLSS at Quality. My main rig has the 3080 10GB and it's similar, but those extra few GB of VRAM make the issue less pronounced. This is why Lossless Scaling has been a game changer for me: I just lock games at 40fps and get 120fps with its 3X FG.
You're using an external monitor with a gaming laptop.
@@BlackJesus8463 I've done it before, big deal
@@BlackJesus8463 Yeah. The point is you can connect it with a cable or two to your monitor (which probably has a USB hub for keyboard/mouse/other accessories) and unplug it and take it wherever you go. Also, that way one has to replace only one device (gaming laptop) rather than two (a Macbook/XPS 13 like ultrabook + gaming PC) every 3-4 years and always has all their data in one place. Not everybody's trying to get 240+ FPS on a maxed out gaming PC because they have godlike aim and have to extract every single FPS on a high refresh rate monitor to be a pro, or a mediocre streaming career, or want to show off on Reddit/YT.
Why do people act retarded when it comes to setups based on gaming laptops?
People, stop buying 8/12GB VRAM GPUs. PLEASE!
I'm going with at least 16GB for my next GPU in 2025.
Daniel, do this test again with only 16GB of system memory. When the 8GB card runs out of VRAM, it spills over to system memory and uses over 16GB. I'd assume people buying an 8GB 4060 won't be buying more than 16GB of system memory, which would probably cause even worse results.
I've got a 7900 XTX; it's always interesting seeing how much VRAM is being used at 1440p maxed out. 14GB in Ghost of Tsushima maxed, for me.
Reality is that 8GB is more than enough for plenty of high-quality textures, and if games cannot run well with it, that's because developers are too lazy to optimize, or manufacturers push developers to use more memory to make them sell more expensive cards.
I've been using a 4GB laptop for a while now 😅. Though I just got all my desktop parts shipped, with a 4070 Super.
Daniel: How bad is 8GB of VRAM in 2024
Me: enjoying the video with a GTX 1650
People laugh and meme on the 4060 Ti 16GB for 4K gaming, but it's actually pretty capable with DLSS and frame generation.
I thought people memed on it for its price.
@@eclxys They do; it's memed because the 4070 makes absolutely more sense once you look at the price/performance. The base model 4060 is a good card, though it could still be $20 cheaper IMO.
Your grammar and opinions are laughable.
The 4060 Ti is in a very weird spot: it's too weak to handle a lot of games at maxed 4K, yet that's really the only area where you begin to see a huge difference in VRAM performance. Yes, 20-30 frames at maxed 4K with no DLSS is a lot higher than 5-8 frames, but it's still essentially unplayable.
Yep. In most scenarios, whenever you have a steady 60+ FPS with the 4060 Ti 16GB, you will also get 60+ FPS on the 8GB card, so it's not such a big deal; 8GB is kinda doing the job. The problems start when you have a 3070 Ti with just 8GB, or a 3080 with just 10GB. VRAM limits those much more.
It's for 3D and creative work. You can do a lot with that VRAM and power.
This is why most people whining about low VRAM weren't really completely justified. While 8GB was a downgrade from what the 3060 offered, if you treat the card as its own thing there's not much reason to justify more than 8GB, because it's only going to handle 1080p and light ray tracing workloads anyway; what would you actually need more VRAM for? The 4060 Ti definitely should have had at least 10GB, but the last-minute doubling of VRAM to appease complainers made no sense, because 16GB is completely overkill at its performance level. It might be worth it for some AI or productivity workloads on a budget, but other than that it's kind of silly.
@@jayceneal5273 No, just no to everything you just said.
You shouldn't even be having this conversation in the first place; having 8GB on anything more expensive than $200 in 2024 is just miserable, and the fact that you justify it is the exact reason these cards exist.
@@lupintheiii3055 Miserable above 1080p, sure. It's perfectly fine at that resolution and lighter graphical settings, which is what you should expect from a budget card. I have a 4060 laptop and have no problems when graphics settings are adjusted for its performance level. Why do you expect VRAM for full 4K path tracing on a card that is barely above console performance?
Honestly it depends on the resolution... for 1080p, 8GB of VRAM is alright, and hence I don't see a need to upgrade my 3070.
Hey Daniel, I'm wondering what microphone you use?
These GPUs are targeted at 1080p, and VRAM management isn't even a big deal in game development. 8GB is enough for 1080p and will stay that way for quite a while.
That may be true, and Nvidia may want us to see the RTX 4060 Ti 8GB as a 1080p GPU... However the price says otherwise.
There are no bad GPUs, only bad prices, and holy shit is the RTX 4060 Ti 8GB expensive for a 1080p GPU.
For ~$400 price I'd hope to get decent 1440p performance without any kind of issues or guaranteed +60 fps at max settings in every game at 1080p, and the RTX 4060 Ti 8GB isn't that.
@diego_chang9580 It's true and I agree 100%. I can't even imagine why we don't have a GTX 4060 instead of an RTX 4060; ray tracing is too heavy for cards below an RTX 3080 in RT cores.
It doesn't matter what Nvidia says they are targeted for.
The 4060 is capable of very high settings and even 1440p with DLSS, so why bottleneck it with low VRAM if it can do much more? You can clearly see that in the video.
It's literally bottlenecked even at 1080p with ray tracing and FG, not because it's weak but purely because of VRAM.
@@Rasingard Probably something to do with the architecture, and being an RTX card helps justify the price.
If it were a GTX 4060, people wouldn't even consider paying above $300, which is exactly why the card is overpriced.
The RTX 4060 can do some RT, sure, and it can use Frame Gen... most of the time. But honestly? The only reason I'd buy it right now is DLSS.
The more you buy, the more you slave. Strong copium in this section here.
Nvidia will add VRAM generation for 50 series
Deep learning video memory.
Lmao
Can someone explain whether I will notice any difference if I run my games on medium settings? Will the 16GB help at all, other than the 1-2% shown in this video? And most importantly, how will 16GB of VRAM compared to 8GB affect 3D modelling software like Houdini? What's the benefit of a 16GB GPU for such software?
You should be good with 16 gigs running games on medium settings. I don't know about the Houdini software; I only use my PC for gaming.
@@MrAnimescrazy since I run games on medium at best, I am wondering mainly what will I miss out on if I have an 8GB vram GPU
@JT11111 You should probably be fine with 8 gigs of VRAM, but I would get a 16-gig card just in case. I'd rather you have more than what you need instead of exactly what you need.
@@MrAnimescrazy Fair point. Since the GPU in question is a mobile 4070 inside a laptop, which only offers 8GB even on its highest GPU chip, I have no choice but to stick with 8GB. I was looking to import a laptop with a higher GPU chip and 16GB from outside my continent, but I don't think (and hope) it won't make that much of a difference in a laptop gaming setup. I'm curious how VRAM affects 3D software, since that's my specialty; if I knew it made a significant difference I would have to opt for a different laptop.
@JT11111 I'm sure you can just Google how much 8 gigs vs 16 gigs helps with Houdini, or someone may have a video on YouTube.
The most popular AMD card is the RX 6600 8GB, and it is a very capable card. VRAM issues are overrated.
For anyone curious, VRAM amount is mainly a concern when you increase your resolution.
1080p: 6-8GB is probably all you'll ever need.
1440p: 8-12GB; you should probably stay away from 8GB. It'll work fine for most games, but modern games will run better with 10+.
4K: 12-16GB+.
This is exactly my experience.
It is fairly rare for 4K to exceed even 12GB, except for the most demanding games.
I can downscale from 8K maxed settings most of the time without hitting 16GB, but occasionally the most demanding games can exceed it.
The 4090 is my reference. Nvidia does have a very good memory compression algorithm, so results can vary against AMD or Intel.
*Bro, thank you for saying it. I've been saying this for so long and people quite literally still don't get it. It also depends on the settings: you can max out most settings and turn VRAM-heavy settings like shadows or reflections down if you're worried about VRAM. It all comes down to buying the right card for the right resolution; not that hard.*
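The rule of thumb above (more resolution, more VRAM) can be sketched with back-of-envelope math. A minimal Python sketch, assuming illustrative per-pixel byte counts that are not from the video — real engines vary wildly, and textures and geometry usually dominate — just to show why the resolution-dependent slice of VRAM grows from 1080p to 4K:

```python
# Back-of-envelope VRAM estimate for resolution-dependent render targets.
# bytes_per_pixel and buffer count are ASSUMED illustrative values
# (G-buffer + HDR color + depth), not measurements from any real engine.

def render_target_mb(width: int, height: int,
                     bytes_per_pixel: int = 48,  # assumed combined cost per pixel
                     buffers: int = 2) -> float:  # assumed double-buffered targets
    """Rough MB consumed by render targets at a given resolution."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

Since 4K has exactly 4x the pixels of 1080p, this slice of the budget scales 4x too, which is why the fixed-size parts (textures, meshes) are what an 8GB card runs out of room for first at high resolutions.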
I just bought an XFX Speedster Qick 7800 XT to upgrade from my 3070 Ti. The VRAM was the primary reason for doing it.
Weird "upgrade".
@@groenevinger3893 Depends on the prices for both; used, $50 more wouldn't be bad.
Yup, people were complaining about the regular 4060 for almost no reason. It's a 1080p GPU and only has issues if you try to game at 4K with it.
I'm playing GoT on my 6GB 1660 Super and the game actually runs great: 70-80 fps on medium-high with FSR. I was on a 1060 3GB before, and it was impossible to load textures in the majority of newer games; even going from 3GB to 6GB was pretty big.
Idk, for full HD the 3060 Ti is still great... the RX 6700, etc. Want to play at 4K? Get a console or be ready to spend tons of money. Btw, never trade your GPU to Jawa; it's a rip-off.
I know, right? I get that 1440p and 4K are much better than 1080p, but that doesn't make 1080p bad. I'd rather play at 60fps ultra 1080p than waste money chasing a 1440p build.
As far as I know, a PS5 can't play the newest triple-A titles at 4K, cuz that's just too much. But I guess it could upscale it from 2k or 1080p.
@@Daniel-ru8je "from 2k or 1080p"
omg you're a noob
@@GrainGrown ? Is it because i used 1080p and 2k instead of 1080p and 1440p?
@@Daniel-ru8je Yes, and also everything about that silliness.
Ghost of Tsushima and Forbidden West are both terribly optimised when it comes to VRAM utilisation. I think it's more of a Nixxes problem.
It's nice when the solution to a performance issue is "drop only texture resolution one notch". Anyone only using quality presets and not happy with quality or performance either didn't spend enough on hardware or doesn't understand which dial incurs which cost and where the breakpoints are on their chosen hardware.
In Hellblade you can see screen tearing on the 8GB card, and a small black line popped up on the 8GB version.
Now 16GB is a must; you will easily hit 12GB in new games, and 8GB is not sufficient.
I hate that modern gaming makes that a necessity. Too many games just throw in 4K textures so we can see every mole on a character's face.
At 1080p, 12GB is enough in 99.9% of cases. If you play at 1440p or even 4K with a card that has 12GB of VRAM, it's obviously not a high-end card (or it wouldn't have 12GB), which means you can use DLSS. You SHOULD use it: you get a massive amount of performance for almost no impact to image quality, and sometimes DLSS upscaling is even better than native rendering with TAA anti-aliasing.
Don't forget, as soon as you upscale, the actual rendering resolution is what dictates how much VRAM you use, not the upscaled output resolution. And in new games, where you can run into problems with 12GB, you will very likely have DLSS. At 1440p and 4K, DLSS is really a godsend.
This doesn't excuse low VRAM offerings, but it's a realistic outlook on real-world usage. UE5 games with Nanite, Lumen, path tracing, etc. are so freaking intensive that you will need upscaling anyway, so VRAM is less of a concern than when rendering native 1440p or especially 4K.
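The point about the internal render resolution is worth making concrete. A small sketch using the commonly cited per-axis scale factors for the DLSS 2/3 modes (these ratios are public knowledge, not measurements from the video):

```python
# DLSS renders internally at a fraction of the output resolution and
# upscales; resolution-dependent VRAM (render targets etc.) mostly
# scales with this internal resolution, not the output resolution.

DLSS_SCALE = {
    "Quality": 1 / 1.5,          # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# e.g. 4K output with DLSS Performance renders internally at 1080p,
# so its VRAM footprint sits much closer to native 1080p than native 4K.
print(render_resolution(3840, 2160, "Performance"))
```

This is why a 12GB card that struggles at native 4K can be comfortable at 4K with DLSS Performance: the heavy per-pixel work happens at a quarter of the pixels.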
8gb of vram has been DOA for many years
Well, blame nvidia that refuses to give more vram even though people have been demanding it.
@@mrman6035 Those games aren't even fun and just milk gamers for money by way of DLC. Get a game from 5 to 10 years ago; it's just way more fun to play.
I mean, if you're playing 4K at ultra with an 8GB VRAM card, don't blame the company; blame yourself for being stupid.
That's just to illustrate the issue, but it gets a lot worse than that. Many games downgrade the texture packs, so you're not getting the visual quality you're supposed to get; you just don't notice a performance dip. In fact, since most reviewers only look at frames per second, 1% lows, etc., they miss the fact that the game looks like s***. And that's at 1080p/1440p...
In today's games, and yesterday's 2023 games, you shouldn't be paying $500 to $600 for 8GB of VRAM that will end up giving you a subpar experience. We're not talking about just framerates here; we're talking visual quality.
In addition, you shouldn't be paying a used-car price for a video card that the company wants you to throw away in 2 years.
That is absolutely on the manufacturer.
There's an old word for that: it's called usury.
Don't blame the customer just because they're trying to have a better time with their very hard-earned dollars, in a market where they cannot have what they actually want. This is monopolist behavior.
I have issues with 16GB of VRAM in Wayfinder at ultrawide 1440p. I turn the resolution down by like 5% and it runs smooooth.
I think the better way to do this would be to change just the texture/VRAM-intensive settings.
I don't care if a certain amount is "enough", I want some future guarantee. When I inevitably buy a new card, I want it to be because of power, not because of VRAM.
Futureproof is 16GB+ of VRAM and a 4070 Ti Super or RX 7800 XT and above, simple.
@@hannes0000 Yeah, though for 4K only 20GB and above is safe. That being said, only next-gen path-traced titles will require that much, so we'll have to either wait for the 60 series to drop or buy a 4090/5090.
"Future" proof is only finite. Specially when tech is advancing faster in many areas.
yeah, but as you can see, the power is held back by the VRAM
@@pinktuna3693 Nah, just know your settings and lower textures; not the end of the world. By the time my 4070 has VRAM issues, DLSS will be needed anyway.
Excellent analysis. It's important for people to know where to place their GPUs; the 8GB 4060 Ti, for example, is a GPU for 1080p with a foothold in 1440p, but nothing more.
The 16GB version can totally play at 1440p
@@lupintheiii3055 Depending on the game and quality, YES!
I was worried about vram when the 40 series released but now....
All I care about is gameplay. I don't want eye candy; I lower my settings even if my card is overkill so that I can actually see the important things happening and react in time. (4070 Ti)
Hi @Owen, hope this gets noticed.
Please try this test again on a system with 32GB of RAM.
In my testing and experience, stutters on the 8GB card should be alleviated a ton if you simply have plenty of system RAM as overhead. 😁
I experience VRAM spill on my 6700 XT 12GB but have 32GB of RAM, and I rarely, if ever, experience stutters.
I get VRAM spill because I always have Nox/BlueStacks open, and that app eats up almost half of my VRAM.
Thank god I don't give a flying fuck about AAA garbo
AAA garbo runs pretty well, it's the good AAA games that tend to push the limit
3080 10gb was a huge mistake 😂
A 3080 at 1440p will be fine for a couple more years.
@@brandon_gb While I can still run max settings with my 6800 XT because of its 16GB of VRAM.
You need to use a user-defined voltage curve so power usage doesn't bug out from dodgy driver support from n-poop-idia.