Didn’t mention it in the video because it didn’t flow well, but I had to cap the frame rate in most games. Or the footage would’ve been stuttery 😢 The a770 bugs out it’s recording when it under max load and the encoder gets overloaded
I have a 3080 10gb, I went to a 3090 24gb because the 3080 would sometimes run out of vram when playing Far Cry 5 arcade user created maps at 4k, some maps would have 100+ characters on screen, there was 1 map where a herd of 100+ buffalo run across a field. The 3080 would tank into the single digits once it was out of vram, the 3090 would play it at a smooth 45fps (It plays most non intensive maps at twice that fps). I don't know how many games have user created content, I imagine a game like Minecraft's user created maps could use a lot of vram as well.
I think you completely wrong about the subject and the sharpening part too. Almost all new games uses TAA and sharpening already. Last of us too, no point oversharpen the image more. We need more vram, 16GB minimum at least, we have 8GB since 2016, its a joke now. You can get an 8 core console with a decent gpu with 16 GB ram for 500 bucks, when nvidia want charge the costumers 400 bucks for a 8GB card we should laught at them.
Its funny how Nvidia advertises and pushes tech like Ray Tracing and Frame generation which both requires more vram and yet they offer less vram as compared to the competition unless you have $1200 in the bank😂.
Yeah and this translates into the rx6800 being FASTER than the 3070 with ray tracing in some titles because it doesn't run out of vram lmao. This gen must be skipped. We need at least 16gb or more of gdrr7 next gen, with 12gb for the 200€ gpus.
@@sidneyentr5640 They released the 4000 series with low VRAM so that the VRAM chips are moved to the "AI" cards in data centers. Nvidia is effectively killing PC gaming. Last gen should have had 16GB VRAM as standard, and it's not even standard THIS gen. Nvidia, go duck yourself!
@@enricod.7198 idk in which games I have RX6800 and man it sucks balls in Witcher 3 next gen when RT is turned on, same in Cyberbug, runs at 35FPS in 1080p with no upscaling lol
Cyberpunk at this point is just Nvidias playground to showcase new technology. That conveniently doesn't run well on AMD so they can show their new GPU and say: Yeah, we get triple the fps in Cyberpunk compared to AMD.
Ironically though we do need a variety and new stuff in features and technology and improvements aswell like imagine if Nvidia didn't start the Ray tracing war, we would still be stuck with traditional raster
@@PackardKotch to say RT doesn't improve much is a bit exaggeration my friend but im glad there's competition, RT not only revolutionizes gaming but don't forget 3d work has also been massively helped with it
You're absolutely right. Most new games are too expensive and are not optimized. There's so many good "older games" and will play nicely with 8 gigs of ram
I remember back when the rtx 3060 launched, i thought 12GB was an insane ampunt of vram, but here we are, my 8gb gpu i bought in Q32022 is already being bottlenecked by the amount of memory because god forbid a $300gpu actually keeps up with games released a year or two after it came out
Games using more VRAM doesn't necessarily mean the texture quality will be higher. Most games in the past used a smaller pool of textures that devs would simply repeat over and over again, and it was fine then. That tree you see there? There's an identical one after every 4 trees in an area with 100 trees. That's 20 identical trees that cost the space of whatever a single tree takes up on the VRAM. Now that we're starting to use more and more of photogrammetry and unique textures for objects instead of highly repeated ones the memory usage will go up even without much of an improvement in the actual quality. Of course it isn't noticeable right away, because devs aren't leveraging it fully; if they did, the memory demands would skyrocket to 30GB or even more at just 1080p. And that's before making higher quality textures, which devs aren't even considering right now. Just look at how Skyrim extreme texture mods work and their insane memory demands. RE4 Remake uses so much VRAM precisely because it uses photogrammetry, which is why the textures look so nice.
So many are completely missing this. They think increased VRAM usage is just result of higher resolution textures and therefore it's waste of resources and needs to be "optimized" away.
Whilst textures do play their part, it's unfortunately nothing so simple. Plenty of channels have covered everything that eats VRAM (and why), and what devs have had to do for years to mitigate it due to HW vendors (like nVidia) not putting nearly enough VRAM on their cards in recent years. I have a GTX 1070 that has 8GB VRAM. It's 7 years old. That should say all you need to know about who is to blame (because putting more more VRAM on a card is not expensive). It's planned obsolescense to make you buy new hardware more often than you reasonably need to.
@@PQED I upped from a 2gb 960 to a 3070ti because at the time it was about the same price as a 3060 for some reason and I'm blown away that it's still 8gb of ram. Luckily modern games suck so it still does pretty well on the older ones I actually play.
@@PQEDdepends actually. During the mining craze GDDR6 end up being very expensive. I think Steve (GN) talk to some partners and they told him selling the 3060 12GB at $330 can be considered selling at lost. So it is not always cheaper to add more memory. Second developer do really need to optimize more. John Carmack said ever since the 8th gen console giving significant memory upgrade over the 7th gen majority of game developer end up brute forcing things instead of more optimization. The 9th gen only double the memory from 8th gen so dev eventually need to be careful with their memory usage. But so far the issue is only on pc ultra setting which is a setting that is not used by games on console.
That allocation does not matter is an idea spread by Nvidia. It does matter! If the pc cannot allocate it needs to constantly shuffle data inbetween ram and vram which can stutters and slow downs. It's not as bad as if the graphics card needs to access something from the ram yet, but still has a mayor impact.
Yes, but no, It does matter in many games, the higher the vram amount, the more stutters the game will have when loading new textures or the texture streaming will be super slow, noticing pop in constantly.
@@metroplex29 I thought I said that? Maybe I could have expressed myself better. However it's not just texture loading. It can be everything from culling differences when turning your character around to the loading of calculations on the gpu like game "Ai" or destruction physics etc. But yes pop in is an additional problem.
While this is true, games will also use the vram they have available, just like your PC will use the system ram you have available. If a game needs say 6GB of ram to completely load a scene and you run it on a 24GB card, it has a high chance to use over 6GB because it's leaving the unused assets in ram until it needs to swap them out so it doesn't have to swap them back in later on the off chance it needs them again. It will not, however, cause a performance hit to not have a higher vram GPU and this is what idiots don't understand. It might cause a bit more temporary stutter on a new map load as stuff is swapped out but that will quickly resolve itself. This is why these type of ram videos are misleading. The same was true on videos for SLI back when it was still around. A bunch of misleading information got copypastaed and people used that info to try to run SLI with and then magically SLI sucked, just like everyone said it would. No, they just were clueless about how to use SLI properly. It always worked fine, it was a PEBKC issue.
I'm still on a 1050TI and will continue to play at (up to) 4k with motion interpolation to 120Hz till i played every game in my backlog that it still can run. It's probably going to take ~1 more year and then i'll have a lot of "new" games i can play at max settings^^ By limiting what games you play you can find some Indie-gems you wouldnt have played else way. It's like when you where a kid and only had a Nintendo with few games.
It's not really pointless though, yes sure it has "only" 8gb vram, but the power consumption is far lower than the 3060ti. For 1080P is enough. Simply stop playing poorly optimized games(mostly triple A)
Another solution is to just target 1080p 60 and you're set. I mean a lot of us played ps3/ xbox 360 games on 720p tv for years and enjoyed every moment.
LMAO no we didn't, and certainly not in the way that we enjoy them right now. Graphics back then were like "Oh, that looks so realistic!" These days I want to be able to masturbate when I look at the graphics of a videogame. IT IS NOT THE SAME THING.
No but seriously, that argument is f*cking retarded. "Oh people were able to live in mud huts for thousands of years so why don't we do the same?" I've had a PS3 for a year up until 2020 when I bought my gaming PC and PS3 games look horrible. I certainly didn't enjoy my PS3 other than Persona 5 I guess and I certainly didn't have half as much fun playing it as I did enjoying the beautiful visuals at 1440p 60FPS on the Witcher 3 with my 1650 Super. Gaming back then was a vomit-inducing experience compared to these days.
cuz their income depends on viewer engagement and controversy seem to be a good way to provoke viewers into interaction, such as liking/disliking and commenting vids. Tech related comunitites sometimes look pretty much like some kind of religious sect and things are blown out of proportions to the point of hating some unreleased hardware piece for being garbage cuz it's leaked specs are not as high as they should've been
The solution is for devs to start using better compression and better UV layouts. When 2K textures look worse than 1K textures from 20 years ago you know we have a problem. Compare for example HL2 or FEAR with it's tiny 1k textures to modern games and you'll see that something has gone seriously wrong.
That doesn't mean much. Games use the vram when is there to store extra assets. But they don't necessary need all that vram. They just use it since is there.
3:02 when You need 8GB of VRAM for your "High" textures that look like PS3 era (not to mention 6GB for PS2 era textures) - all I can really say is - You done fucked it up!
@@Dayn-El awnser is no a 4090 cant run cyberpunk at 8k at max settings its 1fps gaming but with some ai features its 40-48fps all less then 60fps isnt a great deal so 8k high? yes this works fine but not maxed settings! :D
There's another solution besides giving up on playing newer games: settle with a lower graphic settings. Honestly I play games because of how fun they are, not how fucking photorealistic they look. I'm fine with playing Cyberpunk with my piss poor RTX 2060 laptop even though I can only play it with 30 fps at most, but at least I'm having fun with it.
tbh if games like Diablo 4 come out with VRAM memory-leaks and reallocation-issues, no amount of VRAM is gonna help you. As much as GPU companies should stock up on VRAM, as much have the game-companies held responsible for not giving the slightest fck about optimizing their games performance.
I think devs actually stopped doing ANY optimization whatsoever. Like proper cpu multithreading? What's that? Texture compression is non-existing. They all rely on dlss these days, as many predicted would happen with dlss, used for "optimization" instead of extra performance, a new way to cut corners and expences in the shitty aaa market. Hope things lile sampler feedback streaming and the new ai texture compression by nvidia will be used to fix this memory management hellscape we are living in.
Probably also why a lot of games are so big in size these days, because they don't compress textures. It is gonna be funny when gta 6 needs 200gb of space.
The problem is that people are holding back the progress by their refusal to lower graphical settings. Even OG Crysis ran well when on low settings. I played it on my potato PC. Enjoyed graphics and gameplay. However, people went REEEE over it and even youtubers after all these years are repeating REEEEE by claiming incorrect facts about the game. The truth of a matter is. You are not entitled to ultra settings. Settings are here for a reason. Pick ones appropriate for your system.
No not really. I can play Battlefield 1 on my 2060 with over 100fps on Max settings. The finals on all low looks way worse and I cannot even get 60 on all low. Red Dead 2 can be played on high settings using 5.2gb VRAM while looking better than new games on all low. Worse performance and more VRAM usage should not be accepted by saying "just lower the settings". Using the steam hardware survey we can see that a majority of people do not have more than 8GB VRAM, and most GPU's people own are not even that powerful. Developers should optimize their games, to the best of their ability before releasing it. I do not expect to play ultra settings. I want to be able to play games at 60 fps high or medium settings or lower the settings for more fps. And people who bought a 3070 which was a $600 card should not be forced to lower settings (at 1080p) because developers cannot optimize their games.
@@cxngo8124 I had checked newest games. They look similar on low and old hardware can support them. Also, developers develop on consoles first as base standard due to their popularity.
@@REgamesplayer what new games are you talking about. For me it's bf2042, MW2, Hogwarts legacy, Dead Space, The Finals etc. There is a bug difference between low and medium when you have to use something like DLSS on balanced to get playable fps. Hogwarts legacy in general doesn't even look that good compared to other titles, MW2 looks like MW2019 but runs worse, Dead Space did a good job so I give it and bf2042 looks worse than bf1. If they don't care about making there game have a good performance to visuals ratio I dont care about buying the game till its on sale years down the line when they have updated the game.
The problems start appearing when the highest end card available struggles to run modern games in 4k, like immortals of aevum. This is due to shitty optimization (look at his latest few videos)
@@M_CFV It is difficult to tell without objective metrics. Immortals of Aevum is UE5 game. Maybe all new games are inherently demanding. In order to judge, first we need to see graphically demanding game which would be optimized. Furthermore, we got one shitty generation of video cards both from Nvidia and AMD. This generation barely pushed needle forwards while games had a massive requirements jump due to games being built for PS5 now (this is why they all suddenly need more than 8 GB of VRAM) and adoption of UE5 (which is why some games gotten so demanding). If Nvidia would not had raised each generation by a tier, picture would be different. Imagine RTX 4060 Ti being RTX 4070 and RTX 4090 costing as much as RTX 4080. Picture would be completely different. Furthermore, it is not unheard that new games practically demand flagship products to run well. Crysis was another game. It was well optimized, but required future gen hardware to run. If you would look up old benchmarking videos, you would see that games would run at around 60 FPS with high end cards for the time. So, part of the problem is that people have memory length span of a gold fish.
As someone that makes characters for Indie games, I can tell you Vram isn't all just about visual quality, it also about lighting, AI, animations, etc. I have videos on my channel right now of different animations I have made using AI to do the animation and lighting, and some scenes I used up to 18GB of Vram. By now 12GB of vram should be entry level, 16GB midrange, 24GB high-end, 32-48 should be prosumer/enthusiast. The thing is if they release cards with that much VRam now, people won't by the next gen, if there was a 32GB 4090, no one will buy the 5090.
" By now 12GB of vram should be entry level, 16GB midrange, 24GB high-end, 32-48 should be prosumer/enthusiast. The thing is if they release cards with that much VRam now, people won't by the next gen, if there was a 32GB 4090, no one will buy the 5090." And that makes sense to you? clown
Games usage of VRAM really depends on the game itself. Some games will use all available VRAM as a cache, so even if it's not actively using a texture it will keep it loaded in case you have a slow hard drive. So those games if they have 40GB of textures or some crazy amount will use all of your VRAM, unless you have VRAM more than the total texture size of the game files, or whatever upper limit the developer hard coded. Does not mean the game will run any better, might mean you get faster loading times and less stutter. What we need is for someone to do a test on each game with all the major VRAM sizes, see what the minimum amount of VRAM you need to play at each major quality setting without the game turning into a slideshow. Would be nice if someone made a database for that sort of crap where people can submit results from in game benchmarks, but only if the CPU is not 100% on any core.
This, and also shared memory will be used too if you hit the ceiling of vram, itll eat some of system memory (though i dont know about this in ddr5, but in previous systems, when this happens, we will see major stutters) the hardware communication isnt great for pc unlike game consoles that is optimized to communucate well on the default settings
there is a great reason why we went away from socketed VRAM on GPU's; those brackets cost money, are point of failure from both oxidation and mechanical failure and current GDDR is tuned for soldered to PCB applications
Yeah there is definitly a disconnect from the engine developers and the hardware the average gamer owns. It's almost like they are saying: this is premium graphics, this is a premium game, so you should get premium hardware to run it. It's a uncertain and scary road we are travelling in this space*. I don't know if it's like a planned thing or just a unintended side effect, but it's almost like they are trying to push the mid to budget PC gamers onto console (where they get ripped on game prices and live services) *seems to be a buzz word everyone is using these days?
You must consider that most dev studios make games considering that they will be run on consoles. ps4/xb1 had 8GB of ram and ps5/xbseries have 16 and 10/16 respectively, so it makes sense that they make games with that amount of memory as a target. Also consoles have a bigger market share, and PC gamers have on average most varied system, some having systems as old as 10 years, so optimizing for PC at the cost of consoles doesn't seem the smartest idea (and I say this as someone that plays only on console/steamdeck and my PC is 7 years old, i5-6600K and GTX1070 with 8GB of VRAM).
@@TheShitpostExperience I think you missed the point, console games are demanding more on PC than on consoles. The new engines i.e UE5 are expecting much higher Vram usage, so it's like go to console or buy a high end card/cpu to play PC games. This is what they are going towards, it may not be the case overall right now but just look in 2 years time. This was not the case in the past. So now someone getting anything with less than 16gb WILL need to upgrade in 2 years time to play new games are high/highish settings and probably 20gb for max. There is always a lag from engine developments to proper game utilisation, so I'm looking at whats coming, the software engines are more demanding than most of the mid-range and below market for AAA gaming currently available. This is why getting 4060 for example makes no sense.
@@RavenZahadoom To be fair, the reason someone may need more than 8GB of VRAM right now are because high quality textures (made for 4K resolutions) are being added to the games, which didn't happen 10 years ago, for example dark souls 3, that's why playing it on 4K resolution may only use like 4GB of VRAM vs a game with true 4K using up to 6 or 7, and if you add to this high or ultra graphics qualities, the VRAM needed may jump up to 8 or more. UE5 actually tries to solve some of the issues this can cause (specially in games where a lot of stuff is rendered at once, such as open world games) by using nanite to simplify poly and texture rendering. On the other hand Lumen is a tool for realistic lightning which taxes more the system (lumen is the UE5 version of raytracing), so disabling that on UE5 games may ease up on the GPU. I think it's unfair to blame engines (specially ones that are still on their first year of release) for not having the same performance of a game that uses an engine that is 10 years old for games that were meant to be played on 720p or 1080p. Current mid/high tier GPUs having less than 16GB of VRAM is a scam, specially at the prices they are released. And blaming devs for making games targeted at current gen hardware is dumb. I won't tell you to buy a 7800X3D+4090 PC to play current gen games, that's absurd, you would be better off buying a PS5 or an Xbox X, they cost a fraction of the cost.
@@TheShitpostExperience And this is my point: "I won't tell you to buy a 7800X3D+4090 PC to play current gen games, that's absurd, you would be better off buying a PS5 or an Xbox X, they cost a fraction of the cost." this is what the push is, want the top end of gaming? get the top end of hardware, this was not the case in the past apart from the very early days of 3d gaming, when for a period you pretty much had to get a 3dfx card to game while everything else on the market was a 2d card. But maybe I didn't make it clear, I wasn't only talking about the software side, the hardware guys are just as much to blame, but seen as Vex covered that side, I wanted to zoom in on the software side. In anycase I think we are actually agreeing with each other but coming from different angles.
Maybe some of us bought 4k 120hz OLED monitors because we want to escape into a beautiful world. I'm going blind. I can't see anything clearly past 3-4ft anymore. But my monitor is 2ft away from me. So I can actually see everything. I want to play all the new games at max settings because that's the only way to travel and explore for me. In conclusion, Vram matters a lot. Some of NEED it to keep our sanity. 16gs should be the minimum.
I don't think more than 12GB will be a real necessity until the next console gen, because good-but-not-great graphics fidelity is based on that hardware. But the fact that it takes so little to get to that point, and well beyond it, is still an issue. Games are being terribly optimized, and it's been proven by patches for new games fixing these issues. It's not that they really need to use that much vram, it's that the publishers just want the game out NOW, and maybe they'll fix it later.
A) RAM is relatively cheap. B) if you’re spending £1200 on a PC, why not expect it to be better then a crappy £450 console? C) consoles typically play their generation of games on low to medium settings, usually always. PS5 and Series X can barely render 1800p/3k 30fps which is probably why there’s rumours of pro models in the works. Game assets require multiple high resolution textures to load like metallic, normal and bump maps. Less ram, more blur, it’s that simple. And for something so cheap, why settle?
I’m really glad I paid the extra $50 for a 3080 12GB. Found them both at a similar price, and thought it was a no-brainer to go for the 12. The last 6 months have really showed us why VRAM matters.
I have the 10gb 3080 and I get around 35 fps in cyberpunk with 4k ultra settings ray tracing and path tracing. I keep dls on auto. But if I spend any time in the inventory menu I have to reload or the game drops to around 20 fps until I reload. I'm thinking it's a vram issue. But hey I'm OK with it.
Just turn down the settings a notch and you will be able to play games on your 8GB card for years to come. In PC gaming maxing all the settings always came at a high cost, back in the 90's i had to buy an entire new PC every 2-3 years to even launch the newest titles.
I'm happy to stick with my 8GB card for the time being. Not much new coming out that I really care about anymore, and what I do care about playing right now works just fine on an 8GB at 1080p. I refuse to go beyond 1080p, for the time being, as it would require me to get a new display. But the difference between 1080p and 4K just isn't enough for me to spend the money (not saying there isn't a noticeable difference, but that I just don't care enough).
@@HeathBlythe Yeah, but for a display this size you, once again, need to spend even more on a good graphics card that you confidently know will handle that resolution.
I honestly love your approaching to tech reviews/discussion videos. It doesn't have the extreme pretenciousness that a lot of the big names I have watched for years has.
Im starting to believe that whatever card I use either it be a 4090 or a 7900xtx, some games will just run bad whatsoever because most of it is down to devs optimizing their games.
One thing to note is that surpassing vram capacity a gigabyte or two isn't somthing to worry about. I played Re4 remake 5 times with a 4 gb vram. And no, it did not look that bad.
When you run out of Vram the game will use System Ram which is slower but still not bad yes but it will still hurt your performance a bit also some games will use more vram if you provide them with more but will run just fine if restricted
I played WoW for 12 years and GW2 for the last 3. With shadows, etc. all maxed, GW2 looks really nice. WoW does have a RT on/off toggle, but all I noticed was performance tanking. My monitor is a Hisense 4K TV 55U8G. My living room TV is their 65U8H. Sony, LG, Samsung all may have "better" 4K TVs but nowhere close to the $800 I paid. While the performance of your PC is important, what you use for a monitor is just as important.
I am an older gamer, I began my addiction back in 1986 on a Commodore Amiga. The question each person needs to answer is why play video games. I bet for 95% the answer is to have fun. For the remaining 5% they are the asses who love to talk down on all the rest. They are the ones who money means nothing and so they spend thousands on a PC just to have bragging rights that their shit don't stink. Every gamer are individuals and have different views as to what games are fun for them. For me I do not enjoy mmo's, there are far too many people who love to camp out at respawn points. For me this simply is not fun. As things have progressed, games have become more complex and graphically challenging. These do not make the game fun to play. If you cannot forgo buying the newest games you will forever be slaves to Nvidia and their grossly over priced GPU's. My current system was built back in 2019 and has a Radeon 7 GPU with 16GB's of HBM vram. I was getting ready to build a new system but as I was looking at most of the new games coming out there is only one that appeals to me and my current PC will be able to play it without issues. I have many older games that to me are classic and are a lot of fun to play. As a result, I did not spend 4K + on a new system.
The last generation of consoles (you can buy for $400) have 10-12GB of VRAM so I'm not surprised devs don't want to make their games look worse for some ancient 8GB GPUs. So if you want the console's level of details, textures, and the same number of NPCs in the cities then go and buy 12GB GPU. But if you want better visuals than consoles can offer and you want to use RayTracing then you need 16GB. 8GB GPU for the price of the PS5 Digital is a joke, an $800 GPU with only 12GB is an even bigger joke, and the real 16GB GPU for $1200 is just an insult. Thank you, Jensen.
you actually need more consoles dont have a heavy operating system on top and games on consoles because its all the same hardware can be optimized much better.
To be fair, regarding RE4 remake, lowering the texture pool size down to something more reasonable won't impact what you see very much. 3 gb high is more than enough for you to always see the highest textures.
Just because you don't like current games should not influence your recommendations for buying graphics cards. This reminds me of the old Linux mantra: "Just don't play those modern games they are bad anyways and older games are better." It's an extremely boomer mindset.
Is 8GB enough? If you require ultra preset at 1440p or better and only play natively? Then probably not, no, it's not going to be enough. For the majority of gamers? Yes, it is still enough because the majority of gamers will play at the default out of the box configuration. They won't go into settings aside from altering key bindings at most. Out of the box in a modern title means the game chooses for you, and applies upscaling by default. They also don't look at their FPS, that's uniquely an enthusiast thing. Meaning their 8GB card will never face an issue because the game is designed to kind of fix that issue for you. It's the disconnect of the mainstream and enthusiast, which is only getting larger as time goes on unfortunately. The mainstream gamer is basically 3-5 years behind in hardware, so the issues we want fixed today, realistically are 3-5 years out because they won't become a mainstream issue until then. Edit: And that 3-5 year gap is being generous, there is a large subset of the gamer population that is closer to 7-10 years. Nvidia recently admitted openly that only like 18% of Geforce users have a 3060+ and since they have been providing the vast majority of graphic cards, that means realistically at least 75% of entire discreet graphics gaming community is below a 3060.
The 3060 was aimed at 1080p. It can do 1440p, you're taking it beyond what it was meant for. This is a "you" problem, you don't know what you're doing.
@@chrispappas3750 good point there, I'm a 3060 owner but I play at 1080p due to my display being far from me so I can't see pixels thus I don't need to push 1440p. This channel has helped me understand why we should change resolution in one of his other videos.
No, no, you're being ironic... 'cause if not: HAHAHAHAHAHAHAHAHA, LOOK AT THE TYPICAL SLOP CONSUMER AND IT'S BEHAVIOR, IT'S IRL BODY IS OBVIOUS BUT YT TOS WON'T LET ME SAY
Some people are saying "but it's GDDR6 so it doesn't need as much vram" and while that's partially true, it's just a sad excuse for what we as consumers are currently paying for. The bare minimum.
Well i'm happy with my RX 6700XT Red Devil with 12gb vram :) The only thing I need to get to my PC is to replace my 16gb ram with 32gb when I feel the need for it, so far 16gb serve me well, but starting to be a bit low, but SAM push the ram usage down 1gb to 1.5gb so that's saving me for a few hard hitting Ram games.
I have the sapphire nitro version (I always go for that brand, love them). Play with a masssssivley modded cyberpunk on ultra settings with zero issues. Such an awesome card.
If you need more ram, just slot in an extra 16GB? Did the same in my rig from 2017. Sure it's only 3200MT sticks, cuz that's the fastest the mobo can handle but I paid less then $40 for it.
@@jadedrivers6794 nah i got a paired sticks. If you do this just make sure both sticks are from the same brand and have the same mhrz speed. Simple knowlege.
It was more directed at the OP, but yes you are right. Funny thing is I paid close to $200 for the Gskill Ripjaws V F4-3200 kit back in 2017 cuz memory prices exploded back then in Europe and they were the fastest available. Now the same kit is dirt cheap luckily.
Indeed 12 gb is enough... for a couple of years... Spending more money for 16 gb for me is not worth it, because after these couple of years i would spend money for the next gen graphics cards
This is why I bought a second hand RTX 3090. The price on the used market is great for the card, and I don't have to worry about VRAM at all, in any game, on any setting. Also mine supposedly wasn't a mining card, but even if it was I just don't really care, if anything mining cards should last longer than gaming cards (see LTT).
still using my 1080 TI. years ago i found out i dont like new AAA games anyway, dont know why. the last decade i missed a lot of gems and am playing those now. currently borderlands 2 with my wife (if she is in the mood, doesnt happen often). then solo elder scrolls oblivion heavily modded (only played morrowind and skyrim, somehow skipped oblivion). then theres borderlands 3 still left. neverwinter nights enhanced edition, dishonored, and literally hundreds of more games i didnt play. and the best part is, you get most of them for free to keep with occassional steam givaways, twitch prime gaming, and the regular weekly epic games.
It's worth mentioning that GPUs from different manufacturers and even different GPUs from the same manufacturer can have significant differences in VRAM allocated and used, based on various factors like the GPU's architecture, memory bus width, memory ICs used and the frequency they are clocked at, as well as the memory management and compression algorithms used.
Let me tell you what I think about this whole thing. As long as you have a good gpu, vram does not matter. The most you can go down in fps from having unlimited VRAM and just 8GB vram is about 20 fps. This is ofcourse if we're talking about the difference between 60 and 80 fps. An 8GB VRAM card won't be able to load textures fast enough in order to achieve more than 60 fps at max settings. So, as long as you're okay with running any game at 60 fps on something like an RX 7600 at 4k res, you're good. And yes, I said 4k because FSR and DLSS on performance renders at 1080p on 4k res. Rendering at 1080p on a 4k monitor looks a bit better than native 1440p. So, don't ever pick a 2k monitor. Get a 4k monitor and play with FSR or DLSS on performance. And since 4k displays are stuck on 60hz if you don't want to break the bank and spend 1000$ on a 144hz 4k display, you won't ever need more than 60 fps. with 8GB of Vram, you'll be set for a 60 fps experience at 4k FSR performance mode, with medium to high settings in any major new release title. But, make sure to have a second monitor. A 1080p 144hz is the minimum for competitive gaming. This ofcourse is the case if you play games where fps matters the most. Like Fortnite. Fortnite being a ping dependent game, it means that the higher fps you get displayed in game, the more responsive the game will feel when you edit/build/move mouse around. This won't be the case for games where ping isn't an issue. Like COD or Battlefield. Though, you might wanna just play shooters at 1080p if you want to see enemies before they can see you. The whole point I'm trying to make is that VRAM won't keep a gpu from reaching 80% of its maximum gaming capacity. But if you want to max your gpu in the literal sense of the word, you will need at least 16 GB VRAM.
As a game dev I would like to inform u that, more realistic textures = more vram. It's that simple. RE 4 is an well optimized game imo, capcom didn't lie, they inform u right inside the game that it needs more vram for ultra textures.
I always use Nvidia Freestyle to add sharpening and possibly color filters to games. I will note that not all games support it, but a large portion do. All you do is hit Alt F3 in game and an overlay will pop up where you can do whatever you want. Some games even have a simulated raytracing filter. I know one is Final Fantasy 15, but that game is still so heavy in the first place I don't recommend it. That game already required over 12GBs of VRAM for the 4K texture pack.
One game I'm waiting for is Stormgate, and it doesn't seem like a demanding game so I'm sticking with my 1070 until 16GB vram cards become mainstream at 300$ lmao
Same card, same boat. 2000 series brought little to no improvement (ray tracing was, and continues to be, a gimmick). Something like 12% improvement, wow. For more money. 😐 3000 series was better, but still too expensive, and the crypto bubble didn't help either. Only now these are under their MSRP, by a few percentages. Pathetic. 4000 series is a flop, all across the board. 1-2 tiers shifting in naming AND a higher price. They started out promising with 4090 then nosedived. Nvidia, duck you! If they don't get their shit pricing in check with the 5000 series, that's gonna be another skip.
@@zdspider6778 there's actually a Used 6800 asus TUF model for 330$ in my place but my CPU isn't powerful enough to drive it and i only have a 1080p monitor so I'll wait a bit more til i can upgrade my CPU and GPU at the same time (cpu is R5 2600)
Vex I gotta ask you this since you use an Arc a770 too. Do you sometimes get DirectX issues when launching a game you have launched before and have worked perfectly?
Great points! I believe there's 3 VRAM tiers for optimum performance: 1080p = 8-10gb, 1440p = 12-16gb, 4K = 20+ gb. Of course, some games are better than others and gamers can squeeze more performance between tiers by upsampling, lower texture and turning off RT and other memory hog settings. Additionally, 1440p is the sweet spot for consoles and mid-priced gaming PCs. I run my Series X and RX 6800xt at 1440p 80-144fps with vrr on a gaming monitor. I adjust all settings (including RT) to maintain at least 60fps w/o using upscaling. In the end, RT and Ultra Quality textures are eye candy, often not barely noticable or not worth the performance hit. In the end, it's about the playability of the games.
something like that, people thinking they are fine with 8~12gb for 4k assets is just crazy, the same 8~12gb they would use at 1080p, which is half the quality.
I'd say that 12gb might be enough for a while for the general user though people who want to play vr and do production would need to make sure they have a handsome amount of vram to spare. On top of that though it seems that having cpu cache has helped with frame times and stability especially in vr games, which tells that sometimes its not always the gpu at fault still even now. A variety of things could lower the performance and optimization of a build in any way and that is pretty interesting to me at least on a technical standpoint, I would like things to just work conventionally but that's not how things go. Best to tread carefully in the market and manage recourses efficiently to be the most content with the results.
Intel has a LOT of adjustments to make. However, they are on the right track and can actually pull this off. Nvidia is set to abandon the pc gamers altogether and AMD has had smash hit gpus over the years but they both need a new competitor to force 1 out and 1 to improve while putting prices back in check. I am not ready to give up on my 6950 for a few years but for my back up pc I have purchased a 770 to tinker with and watch it grow or stall out. I am not a streamer, content creator or any of that. I am the 99%, where the a770 will do a very nice job for a grand less
I did an experiment yesterday, connected my old rig (8700k rtx2080) to my 65" TV LG Oled 4k gsync. And played Cyberpunk on 4k mid, high, ultra mix (70-80 fps) and Jedi Survivor (EPIC) also 4k (also around 60-70 fps), Dead Space runs well to, around 60. The magic sauce: DLSS & FSR on auto and DISABLE Anti Aliasing!!! From the distance of the couch the games look unbelievable. Anti Aliasing became "usable" in games since the 10 series GPU'S, without eating 20-30% of the performance on Full HD, on 4k with DLSS/FSR it is antialiased already.
Ask an AMD GPU user if they're bothered by the increased VRAM usage in the latest games. With 20 GB at my disposal, I could care less and am instead happy that the VRAM is being put to use. The annoyance and frustrations comes from consumers who are unwilling to try AMD or Intel since they use Nvidia exclusively. While I also have used Nvidia exclusively for over two decades, it no longer makes sense to in my opinion, unless you're purchasing an RTX 4090. The reason I buy PC hardware is with the hope that developers will put my hardware to good use so that I can have a prettier and more efficiently running game. Developers programming a game to use available system resources is not poor optimization. You don't want developers developing games with 8 or 12 GB forever. Nvidia stiffing it's loyal customer base on VRAM when the rest of the GPU market is moving on without them, and customers not retaliating with their wallet is the problem. Don't buy a low VRAM graphics cards and expect to play at High or Ultra anymore in the latest games. AMD and Intel are providing more sensible alternatives, and if you don't want to consider them seriously, then be happy with low or medium settings and DLSS 3. It is what it is.
I don't get it. I have 4060 8gb and it runs Alan Wake 2 1440p high-medium (optimized) settings DLSS balanced in 80+ fps. No stutters, no missing textures, nothing. And this game is notorious for how heavy it is on gpu. I mean, I think I'd get problems if I tried 4k, but 4060 is not powerful enough for that anyway, so it's not vram that would be the issue.
I use to have the 3080-10gb till it fried in a storm then I got a RX6800 & even though I have 10% less fps my overall experience feels smoother & I've noticed when I had my 3080-10gb playing at 3440x1440 I would get very close to maxing the VRAM a lot & I would notice it wouldn't feel smooth so I ended up locking the fps a lot to feel smooth now with the RX6800 I have less fps but it's a smoother experience but this is just my experience.
The worst thing is that ultra settings aren't ultra anymore. There're so many polygons and so many pixels already used that on a 1080p monitor you barely see any difference in many games going from medium to ultra. If only they could use actual fonts for tiny text on something like a soda can instead of having to make it 8k*8k to make it readable... Or maybe use 8k normals and bump with 1k or 2k diffuse and AO maps.
I view any current gen card with 8GB to be budget class, entry level 12GB, mid range 16GB, high end 20GB+, personal opinion ofc i'm sure some may disagree.
This is why I stick with retro titles through emulation. If you're using up to date emulators, they continually optimize resource consumption and performance. Might take a few updates to fix a problem for good, but patience is a virtue. It's worth the improved experience in the long run.
How did you make RE4 use that much vram? I just finished the game. Played at 1440p ultra settings with rtx. It never used more than 8.5 Gb out of 10 Gb on my 3080
Next week I get to see if my new build posts. All goes well, I'll be there enjoying 16gb of VRAM. Bought an Acer Predator Bifrost A770 OC. I briefly thought about an 8gb card, but it just felt like a waste of money.
@@noelchristie7669 It's perfect for my primary use case as an editing rig. For creative workloads, coupled with a 12900k and lots of DDR5 I think it's going to be fantastic. Excited to see how they exploit hardware acceleration.
Dude, Cyberpunk 2077 is one of the best games to come out in the last 5 years. Sure it was a broken mess on release (what's game isn't these days), but once they fixed the issues, it was an incredible game. First playthrough I was on a 3060 playing 4k, with all low settings and DLSS performance mode, and I still throught it was amazing. When I went back and played again with a 13900K/4090, and ran it with path tracing at 110 FPS, it was even more incredible.
In my opinion, 16gbs is the perfect sweetspot, you'll get the extra buffer over 12gbs which will be better for longevity, and there are plenty of last gen cards that offer 16gbs, which can be found for decently cheap (RX 6800, 6800XT, 6900XT, 6950XT, and of course the A770 which was used in this video, sadly no options from Nvidia, aside from the very expensive 4080)
$400-600 isn't decently cheap for a lot of people. Having that said I been buying $400+ bucks gpus for years(my last one was $427 bucks) so I might as well go for a 16GB one next year.
i bought rx6800xt specifically for longevity. i am fine with 50-60 fps at 1080p. i think it is good to go for at least 10 years or so for me since i don't mind playing on medium either.
How much VRAM you need depends on what resolution you play at & what quality/special settings that you choose. With that said, I personally wouldn't buy a GPU in 2024 without at least 16GB VRAM. My current home gaming/productivity system build (1440p, 120/144Hz, w/64GB DDR5) is using an ASUS TUF version AMD RX 7800 XT 16GB GPU, paired with the AMD Ryzen 9 7900 12C/24T CPU. It's a system build that meets all my needs just fine at 1440p.
I agree with you, my goal is 1440p, 144Hz too, got almost the same setup but the 4080 Super and the AMD 9 7950X3D. I can play any game (even modded Skyrim) 3D Model, render, edit videos and modding. In abt 5 Years i will get a new GPU if needed like i always did in the Past. FPS per Watt is important to me that is why i got the Nvidia card and for the price of 2.3k over 5 years i cannot complain
Allocated memory does count because if you don't have enough VRAM to allocate then the game swaps to system memory when you get VRAM spikes due to high texture loadings. 16 GB should be the sweet spot for 1440p high/ultra. Look at Diablo 4, that games takes up 20 GB at 4K ultra.
Im using sharpening 50% on ac valhala without using fsr ,from nvidia control panel on 3440x1440.imo its like a an even higher visual fidelity(ps: im playing it at max)
There's a sharpen+ filter from Nvidia, looks better and enhances texture clarity but it does come with a performance impact. Alternatively, you can add AMD CAS through reshade, it is the best sharpening filter I've seen with no performance impact.
Games using more vram now isn't devs being lazy. We are simply at a generational leap. We got spoiled with games being made to run on a PS4. The current consoles not only have about 12-14gb that devs can use. They also have the ability to stream high quality textures from the SSD. Everyone acts like vram requirements going up is bad optimization. But the suggested dram jumped from 16 to 32 at the same time as vram requirements increasing. Look at the CPU requirements for stuff like the cyberpunk update, Immortals of Aveum also higher than games in the past. We are not in the PS4 generation anymore.
@@rimuru2035 yeah about the time that games stopped also coming out on the previous gen consoles. It's almost like games made from the ground up with the current consoles as the new min specs have higher requirements. If PS5 and XBsX is running a game using upscaling at a locked 30fps then PC requirements might be higher then they where for games that where made to run on hardware that wasn't even all that good in 2013.
@@Sp3cialk304 Yeah I agree with on that. It's for the best for hardware and requirements to evolve into the current gen. It's why I'm upgrading my PC as I already moved up to 32GB RAM and get a new cpu(8 cores) and gpu(12-16GB) to complete it.
The Last of Us is what made me realize how important VRAM is. The game kept crashing on my 3070 so I upgraded to 7900XT been having a blast with the performance.
I have been excited to play Fallout New Vegas, hope my 4070 with 12gb can handle it :P I picked up the 4070 just because my local shop had it 'unopened' open box for $479 and I wanted to play cyberpunk and my 2060 was already struggling at the 165hz 1080p curved displays I use. If I had not gotten the price I never would have done it.
@@ThisIsLau Thanks! I am so excited I had my power supply and motherboard go with it so I have been upgrading piece by piece. Only thing left is the ram, case, SSD, Fans :P
Vex i love you but this is just wrong. We also need to stop blaming "poor optmization" for VRAM issues when its NVIDIAs fault for cheaping out on VRAM. Technology is moving forward and developers can only optimization so much. It is up to GPU manufacturers to accomodate for the coming storm.
If Naughty Dog can release on of worst port then fix it including lowering VRAM usage by almost 4gb so 8gb now can play game on 1080p high i dont see reason why others cant.
They gotta make some design choices until GDDR7 memory is available, so I blame memory module speed and capacity then but not Nvidia. So if I buy a 4060 Ti this might not be my last grapnics card ever anyway.
No it isn't at all, and I can list several where it's blatant optimization. Forspoken ran like shit on everything including the console it was exclusive for. HL has multiple patches addressing memory leaks, and admitted a lack of optimization for Nvidia. Dead Space addressed a MASSIVE memory leak in it's only patch. Jedi Survivor had several memory leaks, and a 4090 was lucky to hit and maintain 60 FPS. Not to mention using more VRAM than the highest tier AMD GPU on an AMD SPONSORED TITLE! Redfall, nothing else needed to explain. That's 5 titles off my head that the devs fucked up on. manufacturers can't predict stuff like that, and it's obvious devs share the blame at the very least.
In abstraction we have two things making balance: precalculated data [this is when we have data stored in memory] and calculations [any type of procedural generations including image restoration]. The first thing requires more storage and the second - faster processing. And now you as developer will need to meet hardware capabilities cutting out stuff that just won't be possible to handle. The slowest thing is to copy the texture to vram from anywhere (ssd or ram) and so it can't happen frequently. Plus GPU processors are for the most part already overloaded with shaders calculations in most scenarios. So we might have a new component of a matherboard appear sooner that will handle this problem. But now when you have not enough vram - the geme should use your RAM and ddr5 is quite fast yet newr version should be even faster, snd so this also csn be the answer. Alternatively there may appear a new ways of texture compression (i have seen one that have promised to make miracles) and this will reduce the size of textures.
UE5 is the future just be glad they made it!! Making games easy to made So the game before take 5 years now takes only 2 years on ue5 Graphics more to real life looking Lighting and reflection also close to real life Just be glad man
@@GameRTmaster considering how launches went i don't think we have any gifted devs this ain't mid 00's where you had black box and codemasters re-invent racing genre instead this is time where we see more and more corruption ask for money but deliver shit products were fucked if you people buy pile of shit games and keep inducing devs to this behavior
In terms of not playing the latest games, I do tend to lag behind in what I play. Usually around 3-5 years behind. I’m not too concerned for playing AAA titles. I’ll get to them eventually and when I do, they will be cheaper and my hardware will easily surpass the requirements.
Love this video, 16Gb cards should be safe for years to come. My personal guess is 16Gb will be the new 8Gbs. As for the super demanding games you can always come back and revisit.
not as long as you'd hope. in 2 years time 16gb vram would be like 16gb dram now for computers, the standard and the mininum you'd want when buying a new laptop or pc today. Today, as of now no one should be buying any gpus with less than 8gb of vram, and if you have a gpu that has less than 8gb vram sell it off immediately
Didn’t mention it in the video because it didn’t flow well, but I had to cap the frame rate in most games. Or the footage would’ve been stuttery 😢
The a770 bugs out it’s recording when it under max load and the encoder gets overloaded
You could use the igpu renderer to work around the bug, tried with an uhd 630 and the crash are way less when both gpu used in tandem
A770 its ok for gaming? Any problems?
I have a 3080 10gb, I went to a 3090 24gb because the 3080 would sometimes run out of vram when playing Far Cry 5 arcade user created maps at 4k, some maps would have 100+ characters on screen, there was 1 map where a herd of 100+ buffalo run across a field. The 3080 would tank into the single digits once it was out of vram, the 3090 would play it at a smooth 45fps (It plays most non intensive maps at twice that fps). I don't know how many games have user created content, I imagine a game like Minecraft's user created maps could use a lot of vram as well.
tbh my 3070 kind of struggles at 1440p with the vram even at medium. It is very annoying tby
I think you completely wrong about the subject and the sharpening part too. Almost all new games uses TAA and sharpening already. Last of us too, no point oversharpen the image more. We need more vram, 16GB minimum at least, we have 8GB since 2016, its a joke now. You can get an 8 core console with a decent gpu with 16 GB ram for 500 bucks, when nvidia want charge the costumers 400 bucks for a 8GB card we should laught at them.
Its funny how Nvidia advertises and pushes tech like Ray Tracing and Frame generation which both requires more vram and yet they offer less vram as compared to the competition unless you have $1200 in the bank😂.
Yeah and this translates into the rx6800 being FASTER than the 3070 with ray tracing in some titles because it doesn't run out of vram lmao. This gen must be skipped. We need at least 16gb or more of gdrr7 next gen, with 12gb for the 200€ gpus.
Because they want you to spend the $1200.
they released the 4000 series with low vram so when they release the 5000 series with more vram people will upgrade
@@sidneyentr5640 They released the 4000 series with low VRAM so that the VRAM chips are moved to the "AI" cards in data centers. Nvidia is effectively killing PC gaming. Last gen should have had 16GB VRAM as standard, and it's not even standard THIS gen. Nvidia, go duck yourself!
@@enricod.7198 idk in which games I have RX6800 and man it sucks balls in Witcher 3 next gen when RT is turned on, same in Cyberbug, runs at 35FPS in 1080p with no upscaling lol
Cyberpunk at this point is just Nvidias playground to showcase new technology. That conveniently doesn't run well on AMD so they can show their new GPU and say: Yeah, we get triple the fps in Cyberpunk compared to AMD.
😂😂😂
Ironically though we do need a variety and new stuff in features and technology and improvements aswell like imagine if Nvidia didn't start the Ray tracing war, we would still be stuck with traditional raster
@@Eleganttf2we still are though, raytracing is too damn slow and doesn't improve the image much in most games
@@PackardKotch to say RT doesn't improve much is a bit exaggeration my friend but im glad there's competition, RT not only revolutionizes gaming but don't forget 3d work has also been massively helped with it
@@Eleganttf2 gaming is revolutionized at a silky smooth 20fps
You're absolutely right. Most new games are too expensive and are not optimized. There's so many good "older games" and will play nicely with 8 gigs of ram
How old is your graphics card?
A game from 2018 only needs like 2gb.
We went from 4GB is more than enough for 1080P gaming back in 2019 to 8GB to be the new minimum
exactly. more than 8 is for doing work like video editing.
@@ArcturusAlpha 8 is clearly not enough for video editing. People say its lagging all the way. Too slow of a process.
I remember back when the rtx 3060 launched, i thought 12GB was an insane ampunt of vram, but here we are, my 8gb gpu i bought in Q32022 is already being bottlenecked by the amount of memory because god forbid a $300gpu actually keeps up with games released a year or two after it came out
I still use gtx 1050 ti 4gb lmao
Bottlenecked in what game?
@@fitmotheyap me too
Just reduce the in game setting. Problem solved. Why people are so obsessed on using setting that they cannot see the difference vs the lower one?
@@arenzricodexd4409 i just bought rtx 4070 bc some games was laggy
Games using more VRAM doesn't necessarily mean the texture quality will be higher. Most games in the past used a smaller pool of textures that devs would simply repeat over and over again, and it was fine then. That tree you see there? There's an identical one after every 4 trees in an area with 100 trees. That's 20 identical trees that cost the space of whatever a single tree takes up on the VRAM.
Now that we're starting to use more and more of photogrammetry and unique textures for objects instead of highly repeated ones the memory usage will go up even without much of an improvement in the actual quality. Of course it isn't noticeable right away, because devs aren't leveraging it fully; if they did, the memory demands would skyrocket to 30GB or even more at just 1080p. And that's before making higher quality textures, which devs aren't even considering right now.
Just look at how Skyrim extreme texture mods work and their insane memory demands. RE4 Remake uses so much VRAM precisely because it uses photogrammetry, which is why the textures look so nice.
So many are completely missing this. They think increased VRAM usage is just result of higher resolution textures and therefore it's waste of resources and needs to be "optimized" away.
Whilst textures do play their part, it's unfortunately nothing so simple.
Plenty of channels have covered everything that eats VRAM (and why), and what devs have had to do for years to mitigate it due to HW vendors (like nVidia) not putting nearly enough VRAM on their cards in recent years.
I have a GTX 1070 that has 8GB VRAM. It's 7 years old. That should say all you need to know about who is to blame (because putting more more VRAM on a card is not expensive).
It's planned obsolescense to make you buy new hardware more often than you reasonably need to.
Replace vram with SSD
@@PQED I upped from a 2gb 960 to a 3070ti because at the time it was about the same price as a 3060 for some reason and I'm blown away that it's still 8gb of ram. Luckily modern games suck so it still does pretty well on the older ones I actually play.
@@PQEDdepends actually. During the mining craze GDDR6 end up being very expensive. I think Steve (GN) talk to some partners and they told him selling the 3060 12GB at $330 can be considered selling at lost. So it is not always cheaper to add more memory.
Second developer do really need to optimize more. John Carmack said ever since the 8th gen console giving significant memory upgrade over the 7th gen majority of game developer end up brute forcing things instead of more optimization. The 9th gen only double the memory from 8th gen so dev eventually need to be careful with their memory usage. But so far the issue is only on pc ultra setting which is a setting that is not used by games on console.
That allocation does not matter is an idea spread by Nvidia. It does matter! If the pc cannot allocate it needs to constantly shuffle data inbetween ram and vram which can stutters and slow downs. It's not as bad as if the graphics card needs to access something from the ram yet, but still has a mayor impact.
Yes, but no, It does matter in many games, the higher the vram amount, the more stutters the game will have when loading new textures or the texture streaming will be super slow, noticing pop in constantly.
@@metroplex29 I thought I said that? Maybe I could have expressed myself better. However it's not just texture loading. It can be everything from culling differences when turning your character around to the loading of calculations on the gpu like game "Ai" or destruction physics etc. But yes pop in is an additional problem.
While this is true, games will also use the vram they have available, just like your PC will use the system ram you have available.
If a game needs say 6GB of ram to completely load a scene and you run it on a 24GB card, it has a high chance to use over 6GB because it's leaving the unused assets in ram until it needs to swap them out so it doesn't have to swap them back in later on the off chance it needs them again.
It will not, however, cause a performance hit to not have a higher vram GPU and this is what idiots don't understand. It might cause a bit more temporary stutter on a new map load as stuff is swapped out but that will quickly resolve itself.
This is why these type of ram videos are misleading. The same was true on videos for SLI back when it was still around. A bunch of misleading information got copypastaed and people used that info to try to run SLI with and then magically SLI sucked, just like everyone said it would. No, they just were clueless about how to use SLI properly. It always worked fine, it was a PEBKC issue.
@@JathraDH erm have you seen Hogwarts Legacy on a sub 6GB card? the textures are not even loaded because there aint enough VRAM for them.
@@sjneow I've played HL with my gtx 1660 super and I don't remember having any problems at high settings.
I am ok with the 8gb of my 3060ti but I would never buy a 4060ti with 8gb. 2 years passed and times have changed. Built in obsolescence is not cool.
I'm trying to take this attitude. I bought mine during the shortage and it needs to last a while. I play at 1080p so that helps.
I'm still on a 1050TI and will continue to play at (up to) 4k with motion interpolation to 120Hz till i played every game in my backlog that it still can run.
It's probably going to take ~1 more year and then i'll have a lot of "new" games i can play at max settings^^
By limiting what games you play you can find some Indie-gems you wouldnt have played else way.
It's like when you where a kid and only had a Nintendo with few games.
It's not really pointless though, yes sure it has "only" 8gb vram, but the power consumption is far lower than the 3060ti. For 1080P is enough.
Simply stop playing poorly optimized games(mostly triple A)
@@plompudu2529Some of the best gaming experiences I've had were indie games. Inside was awesome, nothing has topped that for me in years.
Another solution is to just target 1080p 60 and you're set. I mean a lot of us played ps3/ xbox 360 games on 720p tv for years and enjoyed every moment.
LMAO no we didn't, and certainly not in the way that we enjoy them right now. Graphics back then were like "Oh, that looks so realistic!" These days I want to be able to masturbate when I look at the graphics of a videogame. IT IS NOT THE SAME THING.
No but seriously, that argument is f*cking retarded. "Oh people were able to live in mud huts for thousands of years so why don't we do the same?" I've had a PS3 for a year up until 2020 when I bought my gaming PC and PS3 games look horrible. I certainly didn't enjoy my PS3 other than Persona 5 I guess and I certainly didn't have half as much fun playing it as I did enjoying the beautiful visuals at 1440p 60FPS on the Witcher 3 with my 1650 Super. Gaming back then was a vomit-inducing experience compared to these days.
entitlement is the death of fun
This man is the greatest chad I have ever heard.
I play everything on 4k graphics
Why do outlets like Hardware Unboxed or LTT suddenly care about Ultra settings after telling us for years that Ultra is dumb?
cuz their income depends on viewer engagement and controversy seem to be a good way to provoke viewers into interaction, such as liking/disliking and commenting vids. Tech related comunitites sometimes look pretty much like some kind of religious sect and things are blown out of proportions to the point of hating some unreleased hardware piece for being garbage cuz it's leaked specs are not as high as they should've been
Cause the option is there and people will stubbornly want to feel they can use it or max the game anyway
Because ultra texture settings does matter as opposed to the other graphical settings.
The solution is for devs to start using better compression and better UV layouts. When 2K textures look worse than 1K textures from 20 years ago you know we have a problem. Compare for example HL2 or FEAR with it's tiny 1k textures to modern games and you'll see that something has gone seriously wrong.
no amount of pixels can save a smudgy, low contrast, compression artifact loaded, badly layered image.
making every pixel count is an artform.
Most of HL2's textures ain't even 1k, most of them are at like 512x512 tops (with a few exceptions going higher)
I hit 19gb VRAM on Resident Evil 4 MAX settings on a 7900xtx. It only happened on certain scenes.
That doesn't mean much. Games use the vram when is there to store extra assets.
But they don't necessary need all that vram. They just use it since is there.
Is that allocation or actual usage on VRAM?
@@masterlee1988allocation
@@MrRoro582 Ah, then that makes sense.
I played it so smoothly on my 6gb 1060m, so can't see why we need so much vram
3:02
when You need 8GB of VRAM for your "High" textures that look like PS3 era (not to mention 6GB for PS2 era textures) - all I can really say is - You done fucked it up!
The 2nd solution is a harsh reality not a lot of people have realized.
In 2015 we joke about can it run crysis.
In 2023 we joke about can it run cyberpunk💀
@@Dayn-El But can it run hogwarts?
@@Dayn-El awnser is no a 4090 cant run cyberpunk at 8k at max settings its 1fps gaming but with some ai features its 40-48fps all less then 60fps isnt a great deal so 8k high? yes this works fine but not maxed settings! :D
There's another solution besides giving up on playing newer games: settle with a lower graphic settings. Honestly I play games because of how fun they are, not how fucking photorealistic they look. I'm fine with playing Cyberpunk with my piss poor RTX 2060 laptop even though I can only play it with 30 fps at most, but at least I'm having fun with it.
@@wzx6x6z6wbro i also have rtx 2060 laptop and I can play on ultra settings with 30+ fps
tbh if games like Diablo 4 come out with VRAM memory-leaks and reallocation-issues, no amount of VRAM is gonna help you. As much as GPU companies should stock up on VRAM, as much have the game-companies held responsible for not giving the slightest fck about optimizing their games performance.
I think devs actually stopped doing ANY optimization whatsoever. Like proper cpu multithreading? What's that? Texture compression is non-existing. They all rely on dlss these days, as many predicted would happen with dlss, used for "optimization" instead of extra performance, a new way to cut corners and expences in the shitty aaa market. Hope things lile sampler feedback streaming and the new ai texture compression by nvidia will be used to fix this memory management hellscape we are living in.
Probably also why a lot of games are so big in size these days, because they don't compress textures. It is gonna be funny when gta 6 needs 200gb of space.
The problem is that people are holding back the progress by their refusal to lower graphical settings. Even OG Crysis ran well when on low settings. I played it on my potato PC. Enjoyed graphics and gameplay. However, people went REEEE over it and even youtubers after all these years are repeating REEEEE by claiming incorrect facts about the game.
The truth of a matter is. You are not entitled to ultra settings. Settings are here for a reason. Pick ones appropriate for your system.
No not really. I can play Battlefield 1 on my 2060 with over 100fps on Max settings. The finals on all low looks way worse and I cannot even get 60 on all low. Red Dead 2 can be played on high settings using 5.2gb VRAM while looking better than new games on all low. Worse performance and more VRAM usage should not be accepted by saying "just lower the settings". Using the steam hardware survey we can see that a majority of people do not have more than 8GB VRAM, and most GPU's people own are not even that powerful. Developers should optimize their games, to the best of their ability before releasing it.
I do not expect to play ultra settings. I want to be able to play games at 60 fps high or medium settings or lower the settings for more fps. And people who bought a 3070 which was a $600 card should not be forced to lower settings (at 1080p) because developers cannot optimize their games.
@@cxngo8124 I had checked newest games. They look similar on low and old hardware can support them. Also, developers develop on consoles first as base standard due to their popularity.
@@REgamesplayer what new games are you talking about. For me it's bf2042, MW2, Hogwarts legacy, Dead Space, The Finals etc. There is a bug difference between low and medium when you have to use something like DLSS on balanced to get playable fps. Hogwarts legacy in general doesn't even look that good compared to other titles, MW2 looks like MW2019 but runs worse, Dead Space did a good job so I give it and bf2042 looks worse than bf1. If they don't care about making there game have a good performance to visuals ratio I dont care about buying the game till its on sale years down the line when they have updated the game.
The problems start appearing when the highest end card available struggles to run modern games in 4k, like immortals of aevum. This is due to shitty optimization (look at his latest few videos)
@@M_CFV It is difficult to tell without objective metrics. Immortals of Aevum is UE5 game. Maybe all new games are inherently demanding. In order to judge, first we need to see graphically demanding game which would be optimized.
Furthermore, we got one shitty generation of video cards both from Nvidia and AMD. This generation barely pushed needle forwards while games had a massive requirements jump due to games being built for PS5 now (this is why they all suddenly need more than 8 GB of VRAM) and adoption of UE5 (which is why some games gotten so demanding).
If Nvidia would not had raised each generation by a tier, picture would be different. Imagine RTX 4060 Ti being RTX 4070 and RTX 4090 costing as much as RTX 4080. Picture would be completely different.
Furthermore, it is not unheard that new games practically demand flagship products to run well. Crysis was another game. It was well optimized, but required future gen hardware to run. If you would look up old benchmarking videos, you would see that games would run at around 60 FPS with high end cards for the time. So, part of the problem is that people have memory length span of a gold fish.
As someone that makes characters for Indie games, I can tell you Vram isn't all just about visual quality, it also about lighting, AI, animations, etc. I have videos on my channel right now of different animations I have made using AI to do the animation and lighting, and some scenes I used up to 18GB of Vram. By now 12GB of vram should be entry level, 16GB midrange, 24GB high-end, 32-48 should be prosumer/enthusiast. The thing is if they release cards with that much VRam now, people won't by the next gen, if there was a 32GB 4090, no one will buy the 5090.
Stream from system ram or ssd instead
@@samgee500 pcs aren't consoles you are asking for alot of latency and dropped frames.
Yeah words.
" By now 12GB of vram should be entry level, 16GB midrange, 24GB high-end, 32-48 should be prosumer/enthusiast. The thing is if they release cards with that much VRam now, people won't by the next gen, if there was a 32GB 4090, no one will buy the 5090."
And that makes sense to you?
clown
@@nightvision3182 saying that 12gb should be entry level is wild bro my 4gb card is doing just fine 💀
Games usage of VRAM really depends on the game itself.
Some games will use all available VRAM as a cache, so even if it's not actively using a texture it will keep it loaded in case you have a slow hard drive. So those games if they have 40GB of textures or some crazy amount will use all of your VRAM, unless you have VRAM more than the total texture size of the game files, or whatever upper limit the developer hard coded.
Does not mean the game will run any better, might mean you get faster loading times and less stutter.
What we need is for someone to do a test on each game with all the major VRAM sizes, see what the minimum amount of VRAM you need to play at each major quality setting without the game turning into a slideshow.
Would be nice if someone made a database for that sort of crap where people can submit results from in game benchmarks, but only if the CPU is not 100% on any core.
This, and also shared memory will be used too if you hit the ceiling of vram, itll eat some of system memory (though i dont know about this in ddr5, but in previous systems, when this happens, we will see major stutters) the hardware communication isnt great for pc unlike game consoles that is optimized to communucate well on the default settings
We don't need to. Games are made to played on consoles, so VRAM you need is RAM in console. Currently 16Gb VRAM is enough.
Wish we could just add VRAM modules like RAM ones. Imo, it'll solve everything.
That would be so nice. But sadly, the manufacturers of these cards would lose their precious revenue, so that's an automatic no from them.
there is a great reason why we went away from socketed VRAM on GPU's; those brackets cost money, are point of failure from both oxidation and mechanical failure and current GDDR is tuned for soldered to PCB applications
in theory yes, but that also adds latency which kinda defeat the purpose
That actually was a thing, my ATI mach 64 back in the 90's had 1MB video ram but had aditional chip sokets so you could upgrade it to 4MB.
Yeah there is definitly a disconnect from the engine developers and the hardware the average gamer owns. It's almost like they are saying: this is premium graphics, this is a premium game, so you should get premium hardware to run it. It's a uncertain and scary road we are travelling in this space*.
I don't know if it's like a planned thing or just a unintended side effect, but it's almost like they are trying to push the mid to budget PC gamers onto console (where they get ripped on game prices and live services)
*seems to be a buzz word everyone is using these days?
You must consider that most dev studios make games considering that they will be run on consoles. ps4/xb1 had 8GB of ram and ps5/xbseries have 16 and 10/16 respectively, so it makes sense that they make games with that amount of memory as a target.
Also consoles have a bigger market share, and PC gamers have on average most varied system, some having systems as old as 10 years, so optimizing for PC at the cost of consoles doesn't seem the smartest idea (and I say this as someone that plays only on console/steamdeck and my PC is 7 years old, i5-6600K and GTX1070 with 8GB of VRAM).
@@TheShitpostExperience I think you missed the point, console games are demanding more on PC than on consoles. The new engines i.e UE5 are expecting much higher Vram usage, so it's like go to console or buy a high end card/cpu to play PC games. This is what they are going towards, it may not be the case overall right now but just look in 2 years time. This was not the case in the past. So now someone getting anything with less than 16gb WILL need to upgrade in 2 years time to play new games are high/highish settings and probably 20gb for max.
There is always a lag from engine developments to proper game utilisation, so I'm looking at whats coming, the software engines are more demanding than most of the mid-range and below market for AAA gaming currently available. This is why getting 4060 for example makes no sense.
@@RavenZahadoom To be fair, the reason someone may need more than 8GB of VRAM right now are because high quality textures (made for 4K resolutions) are being added to the games, which didn't happen 10 years ago, for example dark souls 3, that's why playing it on 4K resolution may only use like 4GB of VRAM vs a game with true 4K using up to 6 or 7, and if you add to this high or ultra graphics qualities, the VRAM needed may jump up to 8 or more.
UE5 actually tries to solve some of the issues this can cause (specially in games where a lot of stuff is rendered at once, such as open world games) by using nanite to simplify poly and texture rendering. On the other hand Lumen is a tool for realistic lightning which taxes more the system (lumen is the UE5 version of raytracing), so disabling that on UE5 games may ease up on the GPU.
I think it's unfair to blame engines (specially ones that are still on their first year of release) for not having the same performance of a game that uses an engine that is 10 years old for games that were meant to be played on 720p or 1080p. Current mid/high tier GPUs having less than 16GB of VRAM is a scam, specially at the prices they are released. And blaming devs for making games targeted at current gen hardware is dumb.
I won't tell you to buy a 7800X3D+4090 PC to play current gen games, that's absurd, you would be better off buying a PS5 or an Xbox X, they cost a fraction of the cost.
@@TheShitpostExperience And this is my point: "I won't tell you to buy a 7800X3D+4090 PC to play current gen games, that's absurd, you would be better off buying a PS5 or an Xbox X, they cost a fraction of the cost." this is what the push is, want the top end of gaming? get the top end of hardware, this was not the case in the past apart from the very early days of 3d gaming, when for a period you pretty much had to get a 3dfx card to game while everything else on the market was a 2d card. But maybe I didn't make it clear, I wasn't only talking about the software side, the hardware guys are just as much to blame, but seen as Vex covered that side, I wanted to zoom in on the software side.
In anycase I think we are actually agreeing with each other but coming from different angles.
Maybe some of us bought 4k 120hz OLED monitors because we want to escape into a beautiful world.
I'm going blind. I can't see anything clearly past 3-4ft anymore. But my monitor is 2ft away from me. So I can actually see everything. I want to play all the new games at max settings because that's the only way to travel and explore for me. In conclusion, Vram matters a lot. Some of NEED it to keep our sanity. 16gs should be the minimum.
I don't think more than 12GB will be a real necessity until the next console gen, because good-but-not-great graphics fidelity is based on that hardware. But the fact that it takes so little to get to that point, and well beyond it, is still an issue. Games are being terribly optimized, and it's been proven by patches for new games fixing these issues. It's not that they really need to use that much vram, it's that the publishers just want the game out NOW, and maybe they'll fix it later.
A) RAM is relatively cheap.
B) if you’re spending £1200 on a PC, why not expect it to be better then a crappy £450 console?
C) consoles typically play their generation of games on low to medium settings, usually always. PS5 and Series X can barely render 1800p/3k 30fps which is probably why there’s rumours of pro models in the works. Game assets require multiple high resolution textures to load like metallic, normal and bump maps. Less ram, more blur, it’s that simple. And for something so cheap, why settle?
Ram/memory is dirt cheap. Nvidia is purposely limiting the amount of ram they're putting on their cards. It's obviously planned obsolescence.
not me with only 3gb vram 💀🙏
you can play roblox tho
@@CryingWolf916 i actually have a gtx 960 now with 2gb ram and no. roblox is hard to run.
i dont even have a grapicsh card :,(
@@bencebartha7136 aww..
I’m really glad I paid the extra $50 for a 3080 12GB. Found them both at a similar price, and thought it was a no-brainer to go for the 12. The last 6 months have really showed us why VRAM matters.
I have the 10gb 3080 and I get around 35 fps in cyberpunk with 4k ultra settings ray tracing and path tracing. I keep dls on auto. But if I spend any time in the inventory menu I have to reload or the game drops to around 20 fps until I reload. I'm thinking it's a vram issue. But hey I'm OK with it.
@@haroldfranklin3670 You're having such problems and you still haven't dropped the graphic settings from Ultra to High-Medium?
Must be a masochist.
Just turn down the settings a notch and you will be able to play games on your 8GB card for years to come. In PC gaming maxing all the settings always came at a high cost, back in the 90's i had to buy an entire new PC every 2-3 years to even launch the newest titles.
I'm happy to stick with my 8GB card for the time being. Not much new coming out that I really care about anymore, and what I do care about playing right now works just fine on an 8GB at 1080p. I refuse to go beyond 1080p, for the time being, as it would require me to get a new display. But the difference between 1080p and 4K just isn't enough for me to spend the money (not saying there isn't a noticeable difference, but that I just don't care enough).
I said that until I tried 21:9 3440x1440.
@@HeathBlythe Yeah, but for a display this size you, once again, need to spend even more on a good graphics card that you confidently know will handle that resolution.
I honestly love your approaching to tech reviews/discussion videos. It doesn't have the extreme pretenciousness that a lot of the big names I have watched for years has.
7:08 I laughed hard when Gollum appeared out of nowhere lol
Im starting to believe that whatever card I use either it be a 4090 or a 7900xtx, some games will just run bad whatsoever because most of it is down to devs optimizing their games.
Yeah the thought of paying 670 euro for a 4070 having 12GB, when my 7 year old 1080 has 8 is just...
I can't justify it.
One thing to note is that surpassing vram capacity a gigabyte or two isn't somthing to worry about. I played Re4 remake 5 times with a 4 gb vram. And no, it did not look that bad.
When you run out of Vram the game will use System Ram which is slower but still not bad yes but it will still hurt your performance a bit also some games will use more vram if you provide them with more but will run just fine if restricted
I played WoW for 12 years and GW2 for the last 3. With shadows, etc. all maxed, GW2 looks really nice. WoW does have a RT on/off toggle, but all I noticed was performance tanking. My monitor is a Hisense 4K TV 55U8G. My living room TV is their 65U8H. Sony, LG, Samsung all may have "better" 4K TVs but nowhere close to the $800 I paid. While the performance of your PC is important, what you use for a monitor is just as important.
I am an older gamer, I began my addiction back in 1986 on a Commodore Amiga. The question each person needs to answer is why play video games. I bet for 95% the answer is to have fun. For the remaining 5% they are the asses who love to talk down on all the rest. They are the ones who money means nothing and so they spend thousands on a PC just to have bragging rights that their shit don't stink. Every gamer are individuals and have different views as to what games are fun for them. For me I do not enjoy mmo's, there are far too many people who love to camp out at respawn points. For me this simply is not fun. As things have progressed, games have become more complex and graphically challenging. These do not make the game fun to play. If you cannot forgo buying the newest games you will forever be slaves to Nvidia and their grossly over priced GPU's. My current system was built back in 2019 and has a Radeon 7 GPU with 16GB's of HBM vram. I was getting ready to build a new system but as I was looking at most of the new games coming out there is only one that appeals to me and my current PC will be able to play it without issues. I have many older games that to me are classic and are a lot of fun to play. As a result, I did not spend 4K + on a new system.
The last generation of consoles (you can buy for $400) have 10-12GB of VRAM so I'm not surprised devs don't want to make their games look worse for some ancient 8GB GPUs.
So if you want the console's level of details, textures, and the same number of NPCs in the cities then go and buy 12GB GPU.
But if you want better visuals than consoles can offer and you want to use RayTracing then you need 16GB.
8GB GPU for the price of the PS5 Digital is a joke, an $800 GPU with only 12GB is an even bigger joke, and the real 16GB GPU for $1200 is just an insult. Thank you, Jensen.
you actually need more consoles dont have a heavy operating system on top and games on consoles because its all the same hardware can be optimized much better.
the ps5 and xbox X/S series gpu is similar to an RX 6700 or 6700 XT
To be fair, regarding RE4 remake, lowering the texture pool size down to something more reasonable won't impact what you see very much. 3 gb high is more than enough for you to always see the highest textures.
Just because you don't like current games should not influence your recommendations for buying graphics cards. This reminds me of the old Linux mantra: "Just don't play those modern games they are bad anyways and older games are better." It's an extremely boomer mindset.
Is 8GB enough? If you require ultra preset at 1440p or better and only play natively? Then probably not, no, it's not going to be enough.
For the majority of gamers? Yes, it is still enough because the majority of gamers will play at the default out of the box configuration. They won't go into settings aside from altering key bindings at most. Out of the box in a modern title means the game chooses for you, and applies upscaling by default. They also don't look at their FPS, that's uniquely an enthusiast thing. Meaning their 8GB card will never face an issue because the game is designed to kind of fix that issue for you.
It's the disconnect of the mainstream and enthusiast, which is only getting larger as time goes on unfortunately. The mainstream gamer is basically 3-5 years behind in hardware, so the issues we want fixed today, realistically are 3-5 years out because they won't become a mainstream issue until then.
Edit: And that 3-5 year gap is being generous, there is a large subset of the gamer population that is closer to 7-10 years. Nvidia recently admitted openly that only like 18% of Geforce users have a 3060+ and since they have been providing the vast majority of graphic cards, that means realistically at least 75% of entire discreet graphics gaming community is below a 3060.
So happy I got the 12 gb 3060. I noticed a few games won't even let you try certain textures if you go over the v ram limit.
The 3060 was aimed at 1080p. It can do 1440p, you're taking it beyond what it was meant for. This is a "you" problem, you don't know what you're doing.
@@chrispappas3750 good point there, I'm a 3060 owner but I play at 1080p due to my display being far from me so I can't see pixels thus I don't need to push 1440p. This channel has helped me understand why we should change resolution in one of his other videos.
I think VRAM depends on which resolution you are using (your monitor) 12gb for 1440p 16+ for 4k , 6-8gb for 1080p and 3-4 gb for hd 720
Damn, buying a 3080 10gb didn't age well
I was crying for more ram since 2018. If you will get a new card get at least 16GB, new games will demand it.
Imagine a 3070Ti or below with 8gb then
@@ShoteR_Omega I'm glad I didn't get a 3070 like I was supposed to. Instead going for a 12-16GB gpu next.
Vex: Want VRAM not to not matter? Just don’t play!
Me: *munches on air-fried dumplings while modding stuff on a 3090*
No, no, you're being ironic... 'cause if not: HAHAHAHAHAHAHAHAHA, LOOK AT THE TYPICAL SLOP CONSUMER AND IT'S BEHAVIOR, IT'S IRL BODY IS OBVIOUS BUT YT TOS WON'T LET ME SAY
Better games are not coming anymore. So if you have GPU (8GB) to play games which have been released until recently, you will be the winner.
Some people are saying "but it's GDDR6 so it doesn't need as much vram" and while that's partially true, it's just a sad excuse for what we as consumers are currently paying for. The bare minimum.
Well i'm happy with my RX 6700XT Red Devil with 12gb vram :)
The only thing I need to get to my PC is to replace my 16gb ram with 32gb when I feel the need for it, so far 16gb serve me well, but starting to be a bit low, but SAM push the ram usage down 1gb to 1.5gb so that's saving me for a few hard hitting Ram games.
I have the sapphire nitro version (I always go for that brand, love them). Play with a masssssivley modded cyberpunk on ultra settings with zero issues. Such an awesome card.
Go for cosair vengeange 32gb
If you need more ram, just slot in an extra 16GB? Did the same in my rig from 2017. Sure it's only 3200MT sticks, cuz that's the fastest the mobo can handle but I paid less then $40 for it.
@@jadedrivers6794 nah i got a paired sticks. If you do this just make sure both sticks are from the same brand and have the same mhrz speed. Simple knowlege.
It was more directed at the OP, but yes you are right. Funny thing is I paid close to $200 for the Gskill Ripjaws V F4-3200 kit back in 2017 cuz memory prices exploded back then in Europe and they were the fastest available. Now the same kit is dirt cheap luckily.
Indeed 12 gb is enough... for a couple of years... Spending more money for 16 gb for me is not worth it, because after these couple of years i would spend money for the next gen graphics cards
This. When the time comes for 16 gb vram to be actually necessary, the raw performance of today's 16 GB (or above) VRAM cards will be outdated anyway.
This is why I bought a second hand RTX 3090. The price on the used market is great for the card, and I don't have to worry about VRAM at all, in any game, on any setting. Also mine supposedly wasn't a mining card, but even if it was I just don't really care, if anything mining cards should last longer than gaming cards (see LTT).
I also bought 3090, didn't buy it for games but it's quite great to not have to worry about whether I can run a game or not
so true, exactly what I did; i bought my 3090 for £500, how much was yours?
still using my 1080 TI. years ago i found out i dont like new AAA games anyway, dont know why. the last decade i missed a lot of gems and am playing those now. currently borderlands 2 with my wife (if she is in the mood, doesnt happen often). then solo elder scrolls oblivion heavily modded (only played morrowind and skyrim, somehow skipped oblivion). then theres borderlands 3 still left. neverwinter nights enhanced edition, dishonored, and literally hundreds of more games i didnt play. and the best part is, you get most of them for free to keep with occassional steam givaways, twitch prime gaming, and the regular weekly epic games.
It's worth mentioning that GPUs from different manufacturers and even different GPUs from the same manufacturer can have significant differences in VRAM allocated and used, based on various factors like the GPU's architecture, memory bus width, memory ICs used and the frequency they are clocked at, as well as the memory management and compression algorithms used.
Not to mention cache to reduce bus traffic.
Let me tell you what I think about this whole thing. As long as you have a good gpu, vram does not matter. The most you can go down in fps from having unlimited VRAM and just 8GB vram is about 20 fps. This is ofcourse if we're talking about the difference between 60 and 80 fps. An 8GB VRAM card won't be able to load textures fast enough in order to achieve more than 60 fps at max settings. So, as long as you're okay with running any game at 60 fps on something like an RX 7600 at 4k res, you're good. And yes, I said 4k because FSR and DLSS on performance renders at 1080p on 4k res. Rendering at 1080p on a 4k monitor looks a bit better than native 1440p. So, don't ever pick a 2k monitor. Get a 4k monitor and play with FSR or DLSS on performance. And since 4k displays are stuck on 60hz if you don't want to break the bank and spend 1000$ on a 144hz 4k display, you won't ever need more than 60 fps. with 8GB of Vram, you'll be set for a 60 fps experience at 4k FSR performance mode, with medium to high settings in any major new release title. But, make sure to have a second monitor. A 1080p 144hz is the minimum for competitive gaming. This ofcourse is the case if you play games where fps matters the most. Like Fortnite. Fortnite being a ping dependent game, it means that the higher fps you get displayed in game, the more responsive the game will feel when you edit/build/move mouse around. This won't be the case for games where ping isn't an issue. Like COD or Battlefield. Though, you might wanna just play shooters at 1080p if you want to see enemies before they can see you. The whole point I'm trying to make is that VRAM won't keep a gpu from reaching 80% of its maximum gaming capacity. But if you want to max your gpu in the literal sense of the word, you will need at least 16 GB VRAM.
I went for a 24GB VRAM GPU because I legit cannot be bothered with textures not loading or there being general shittyness
I can't decide between 3080ti 12gb and 3090 24gb LoL. Some say 12gb should be enough. Others say 16gb is the sweet spot for the next 3-4 years.
@@thewebheadgt WIth 24GB you won't have to worry about it at all until the GPU itself is not powerful enough to run the games optimally
@@thewebheadgt if it’s not a lot extra I would go the 3090 over the 3080Ti personally.
As a game dev I would like to inform u that, more realistic textures = more vram. It's that simple.
RE 4 is an well optimized game imo, capcom didn't lie, they inform u right inside the game that it needs more vram for ultra textures.
some of the games I play allocates way more that 8 gigs and not even maxed out, so I definitely need the 11 gigs of my trusty 1080 Ti!
1080ti is still a beast!
Just because it's allocated doesn't mean it's actually being used, minecraft can be set to use 20 of 32gb and it will allocate 20gb but use 12gb.
I always use Nvidia Freestyle to add sharpening and possibly color filters to games. I will note that not all games support it, but a large portion do. All you do is hit Alt F3 in game and an overlay will pop up where you can do whatever you want. Some games even have a simulated raytracing filter. I know one is Final Fantasy 15, but that game is still so heavy in the first place I don't recommend it. That game already required over 12GBs of VRAM for the 4K texture pack.
One game I'm waiting for is Stormgate, and it doesn't seem like a demanding game so I'm sticking with my 1070 until 16GB vram cards become mainstream at 300$ lmao
Dude same lmao 😂
I think later on this year, RX 6800 will be at the price of 340 to 370 dollars.
Same card, same boat.
2000 series brought little to no improvement (ray tracing was, and continues to be, a gimmick). Something like 12% improvement, wow. For more money. 😐
3000 series was better, but still too expensive, and the crypto bubble didn't help either. Only now these are under their MSRP, by a few percentages. Pathetic.
4000 series is a flop, all across the board. 1-2 tiers shifting in naming AND a higher price. They started out promising with 4090 then nosedived. Nvidia, duck you!
If they don't get their shit pricing in check with the 5000 series, that's gonna be another skip.
@@zdspider6778 there's actually a Used 6800 asus TUF model for 330$ in my place but my CPU isn't powerful enough to drive it and i only have a 1080p monitor so I'll wait a bit more til i can upgrade my CPU and GPU at the same time (cpu is R5 2600)
@@mohamedrafik9271 I sure hope so because I would buy a RX 6800 at under 400 bucks.
What is the song in the background in the first minutes of the video?
I have heard it before but i don't remember where...
very glad i subbed to this channel 😊
Me too!
the hesitated "bye" ate the end should be how you always end videos..that was fuckin' awesome.
Vex I gotta ask you this since you use an Arc a770 too. Do you sometimes get DirectX issues when launching a game you have launched before and have worked perfectly?
I legit hate how sharpening effect looks in every game. I'm probably a minority, but i prefer a softer image
Imagine optimizing a PS3 game so poorly, you need 6 gigs of vram to run it with low preset
Dragon's Dogma 2 will outsell Starfield.
Great points! I believe there's 3 VRAM tiers for optimum performance: 1080p = 8-10gb, 1440p = 12-16gb, 4K = 20+ gb. Of course, some games are better than others and gamers can squeeze more performance between tiers by upsampling, lower texture and turning off RT and other memory hog settings. Additionally, 1440p is the sweet spot for consoles and mid-priced gaming PCs. I run my Series X and RX 6800xt at 1440p 80-144fps with vrr on a gaming monitor. I adjust all settings (including RT) to maintain at least 60fps w/o using upscaling. In the end, RT and Ultra Quality textures are eye candy, often not barely noticable or not worth the performance hit. In the end, it's about the playability of the games.
something like that, people thinking they are fine with 8~12gb for 4k assets is just crazy, the same 8~12gb they would use at 1080p, which is half the quality.
I lol'ed at the "can we just make good games??" frame with Forspoken in it, you Sir, have a great sense of humor, I like it!
I'd say that 12gb might be enough for a while for the general user though people who want to play vr and do production would need to make sure they have a handsome amount of vram to spare. On top of that though it seems that having cpu cache has helped with frame times and stability especially in vr games, which tells that sometimes its not always the gpu at fault still even now. A variety of things could lower the performance and optimization of a build in any way and that is pretty interesting to me at least on a technical standpoint, I would like things to just work conventionally but that's not how things go. Best to tread carefully in the market and manage recourses efficiently to be the most content with the results.
this was the reason why i bought the 7900XTX 24gb vram hope this is enough for the next 4-5 years
0:04 yay hollow knight music
average 1440p-4k gamer: OH NOOOOOO NEED MORE VRAMMMMM LIKE 24GB!!!!! WTF NVIDIA/AMD!!!!
gigachad budget 1080p enjoyer: 60 FPS ITS 60 FPS
Intel has a LOT of adjustments to make. However, they are on the right track and can actually pull this off. Nvidia is set to abandon the pc gamers altogether and AMD has had smash hit gpus over the years but they both need a new competitor to force 1 out and 1 to improve while putting prices back in check. I am not ready to give up on my 6950 for a few years but for my back up pc I have purchased a 770 to tinker with and watch it grow or stall out. I am not a streamer, content creator or any of that. I am the 99%, where the a770 will do a very nice job for a grand less
I did an experiment yesterday, connected my old rig (8700k rtx2080) to my 65" TV LG Oled 4k gsync. And played Cyberpunk on 4k mid, high, ultra mix (70-80 fps) and Jedi Survivor (EPIC) also 4k (also around 60-70 fps), Dead Space runs well to, around 60. The magic sauce: DLSS & FSR on auto and DISABLE Anti Aliasing!!! From the distance of the couch the games look unbelievable. Anti Aliasing became "usable" in games since the 10 series GPU'S, without eating 20-30% of the performance on Full HD, on 4k with DLSS/FSR it is antialiased already.
Ask an AMD GPU user if they're bothered by the increased VRAM usage in the latest games. With 20 GB at my disposal, I could care less and am instead happy that the VRAM is being put to use. The annoyance and frustrations comes from consumers who are unwilling to try AMD or Intel since they use Nvidia exclusively. While I also have used Nvidia exclusively for over two decades, it no longer makes sense to in my opinion, unless you're purchasing an RTX 4090. The reason I buy PC hardware is with the hope that developers will put my hardware to good use so that I can have a prettier and more efficiently running game. Developers programming a game to use available system resources is not poor optimization. You don't want developers developing games with 8 or 12 GB forever. Nvidia stiffing it's loyal customer base on VRAM when the rest of the GPU market is moving on without them, and customers not retaliating with their wallet is the problem. Don't buy a low VRAM graphics cards and expect to play at High or Ultra anymore in the latest games. AMD and Intel are providing more sensible alternatives, and if you don't want to consider them seriously, then be happy with low or medium settings and DLSS 3. It is what it is.
I don't get it. I have 4060 8gb and it runs Alan Wake 2 1440p high-medium (optimized) settings DLSS balanced in 80+ fps. No stutters, no missing textures, nothing. And this game is notorious for how heavy it is on gpu. I mean, I think I'd get problems if I tried 4k, but 4060 is not powerful enough for that anyway, so it's not vram that would be the issue.
Free tip: just do high settings. Ultra is barely an improvement. But it’s far more demanding.
I use to have the 3080-10gb till it fried in a storm then I got a RX6800 & even though I have 10% less fps my overall experience feels smoother & I've noticed when I had my 3080-10gb playing at 3440x1440 I would get very close to maxing the VRAM a lot & I would notice it wouldn't feel smooth so I ended up locking the fps a lot to feel smooth now with the RX6800 I have less fps but it's a smoother experience but this is just my experience.
thats a good choice
The worst thing is that ultra settings aren't ultra anymore.
There're so many polygons and so many pixels already used that on a 1080p monitor you barely see any difference in many games going from medium to ultra. If only they could use actual fonts for tiny text on something like a soda can instead of having to make it 8k*8k to make it readable... Or maybe use 8k normals and bump with 1k or 2k diffuse and AO maps.
I view any current gen card with 8GB to be budget class, entry level 12GB, mid range 16GB, high end 20GB+, personal opinion ofc i'm sure some may disagree.
Yeah this is spot on and what I think as well.
12gb is not "entry level" when it costs $500 on average
@@M_CFVit is sorry to say it bud
This is why I stick with retro titles through emulation. If you're using up to date emulators, they continually optimize resource consumption and performance. Might take a few updates to fix a problem for good, but patience is a virtue. It's worth the improved experience in the long run.
Thanks for doing this very intensive testing and showing us all of the results - so we can make a more informed and knowledgeable decisions.
How did you make RE4 use that much vram? I just finished the game. Played at 1440p ultra settings with rtx. It never used more than 8.5 Gb out of 10 Gb on my 3080
I was wondering too, like how? Maybe he should have done a couple of runs tried some other apps as a variable (in case the app's monitoring was bad)
Next week I get to see if my new build posts. All goes well, I'll be there enjoying 16gb of VRAM. Bought an Acer Predator Bifrost A770 OC. I briefly thought about an 8gb card, but it just felt like a waste of money.
Glad to see more and more people respecting Arc cards. There are many of us!
@@noelchristie7669 It's perfect for my primary use case as an editing rig. For creative workloads, coupled with a 12900k and lots of DDR5 I think it's going to be fantastic. Excited to see how they exploit hardware acceleration.
If VRAM is less it would lead to stutters, but it won't crash. I guess I can live with it.
I bought an rx 6750 xt with 12 gb of vram, hoping I will keep it for some time. Hopefully I just want to play Everspace 2 so it will do
Dude, Cyberpunk 2077 is one of the best games to come out in the last 5 years. Sure it was a broken mess on release (what's game isn't these days), but once they fixed the issues, it was an incredible game. First playthrough I was on a 3060 playing 4k, with all low settings and DLSS performance mode, and I still throught it was amazing. When I went back and played again with a 13900K/4090, and ran it with path tracing at 110 FPS, it was even more incredible.
In my opinion, 16gbs is the perfect sweetspot, you'll get the extra buffer over 12gbs which will be better for longevity, and there are plenty of last gen cards that offer 16gbs, which can be found for decently cheap (RX 6800, 6800XT, 6900XT, 6950XT, and of course the A770 which was used in this video, sadly no options from Nvidia, aside from the very expensive 4080)
$400-600 isn't decently cheap for a lot of people. Having that said I been buying $400+ bucks gpus for years(my last one was $427 bucks) so I might as well go for a 16GB one next year.
i bought rx6800xt specifically for longevity. i am fine with 50-60 fps at 1080p. i think it is good to go for at least 10 years or so for me since i don't mind playing on medium either.
How much VRAM you need depends on what resolution you play at & what quality/special settings that you choose. With that said, I personally wouldn't buy a GPU in 2024 without at least 16GB VRAM. My current home gaming/productivity system build (1440p, 120/144Hz, w/64GB DDR5) is using an ASUS TUF version AMD RX 7800 XT 16GB GPU, paired with the AMD Ryzen 9 7900 12C/24T CPU. It's a system build that meets all my needs just fine at 1440p.
I agree with you, my goal is 1440p, 144Hz too, got almost the same setup but the 4080 Super and the AMD 9 7950X3D. I can play any game (even modded Skyrim)
3D Model, render, edit videos and modding. In abt 5 Years i will get a new GPU if needed like i always did in the Past. FPS per Watt is important to me that is why i got the Nvidia card and for the price of 2.3k over 5 years i cannot complain
Allocated memory does count because if you don't have enough VRAM to allocate then the game swaps to system memory when you get VRAM spikes due to high texture loadings. 16 GB should be the sweet spot for 1440p high/ultra. Look at Diablo 4, that games takes up 20 GB at 4K ultra.
Oh yeah that's true about that when it comes to VRAM.
"takes up 20 GB at 4K ultra" AKA too damn much, that maxes out literally anything that isn't an RTX XX90.
@@jaronmarles941 Yep, pretty wild!
@@jaronmarles941 Indeed. You're gonna need enthusiast class GPUs to max out games at 4K.
5:16 Medium X FSR makes those props look better and crisper than the High quality.
24gigs vram is what i need? Got it
Wow, 5:15 that Medium + FSR quality looks better than high. While regular Medium vs High looks the same.
Finally, healthy take on this situation that's not spreading panics
As far as I know 1080P is still the standard, and I still play at 1080P. I see no reason to change.
Im using sharpening 50% on ac valhala without using fsr ,from nvidia control panel on 3440x1440.imo its like a an even higher visual fidelity(ps: im playing it at max)
There's a sharpen+ filter from Nvidia, looks better and enhances texture clarity but it does come with a performance impact. Alternatively, you can add AMD CAS through reshade, it is the best sharpening filter I've seen with no performance impact.
Games using more vram now isn't devs being lazy. We are simply at a generational leap. We got spoiled with games being made to run on a PS4. The current consoles not only have about 12-14gb that devs can use. They also have the ability to stream high quality textures from the SSD. Everyone acts like vram requirements going up is bad optimization. But the suggested dram jumped from 16 to 32 at the same time as vram requirements increasing. Look at the CPU requirements for stuff like the cyberpunk update, Immortals of Aveum also higher than games in the past. We are not in the PS4 generation anymore.
8gb of vram wasn't an issue until the last few months.
@@rimuru2035 yeah about the time that games stopped also coming out on the previous gen consoles. It's almost like games made from the ground up with the current consoles as the new min specs have higher requirements. If PS5 and XBsX is running a game using upscaling at a locked 30fps then PC requirements might be higher then they where for games that where made to run on hardware that wasn't even all that good in 2013.
@@Sp3cialk304 Yeah I agree with on that. It's for the best for hardware and requirements to evolve into the current gen. It's why I'm upgrading my PC as I already moved up to 32GB RAM and get a new cpu(8 cores) and gpu(12-16GB) to complete it.
You so right! Enjoy games at a lower FPS instead of been a sheep.
We win.
The Last of Us is what made me realize how important VRAM is. The game kept crashing on my 3070 so I upgraded to 7900XT been having a blast with the performance.
I have been excited to play Fallout New Vegas, hope my 4070 with 12gb can handle it :P I picked up the 4070 just because my local shop had it 'unopened' open box for $479 and I wanted to play cyberpunk and my 2060 was already struggling at the 165hz 1080p curved displays I use. If I had not gotten the price I never would have done it.
Damm, that's an ok price! Enjoy your card, mate 😄
@@ThisIsLau Thanks! I am so excited I had my power supply and motherboard go with it so I have been upgrading piece by piece. Only thing left is the ram, case, SSD, Fans :P
@@cheetahfurry9107 Good luck with it! What are the case plans? 👀
@@ThisIsLau Thinking the new Lin Li version of the Y-60 or the new Thermaltake tower
@@cheetahfurry9107 480 bucks is how much I would spend on a 4070 so that's a nice deal!
Vex i love you but this is just wrong. We also need to stop blaming "poor optmization" for VRAM issues when its NVIDIAs fault for cheaping out on VRAM. Technology is moving forward and developers can only optimization so much. It is up to GPU manufacturers to accomodate for the coming storm.
If Naughty Dog can release on of worst port then fix it including lowering VRAM usage by almost 4gb so 8gb now can play game on 1080p high i dont see reason why others cant.
They gotta make some design choices until GDDR7 memory is available, so I blame memory module speed and capacity then but not Nvidia. So if I buy a 4060 Ti this might not be my last grapnics card ever anyway.
No it isn't at all, and I can list several where it's blatant optimization.
Forspoken ran like shit on everything including the console it was exclusive for.
HL has multiple patches addressing memory leaks, and admitted a lack of optimization for Nvidia.
Dead Space addressed a MASSIVE memory leak in it's only patch.
Jedi Survivor had several memory leaks, and a 4090 was lucky to hit and maintain 60 FPS. Not to mention using more VRAM than the highest tier AMD GPU on an AMD SPONSORED TITLE!
Redfall, nothing else needed to explain.
That's 5 titles off my head that the devs fucked up on. manufacturers can't predict stuff like that, and it's obvious devs share the blame at the very least.
In abstraction we have two things making balance: precalculated data [this is when we have data stored in memory] and calculations [any type of procedural generations including image restoration]. The first thing requires more storage and the second - faster processing. And now you as developer will need to meet hardware capabilities cutting out stuff that just won't be possible to handle. The slowest thing is to copy the texture to vram from anywhere (ssd or ram) and so it can't happen frequently. Plus GPU processors are for the most part already overloaded with shaders calculations in most scenarios. So we might have a new component of a matherboard appear sooner that will handle this problem. But now when you have not enough vram - the geme should use your RAM and ddr5 is quite fast yet newr version should be even faster, snd so this also csn be the answer. Alternatively there may appear a new ways of texture compression (i have seen one that have promised to make miracles) and this will reduce the size of textures.
You know something? F UE4 and UE5.
UE5 is the future just be glad they made it!!
Making games easy to made
So the game before take 5 years now takes only 2 years on ue5
Graphics more to real life looking
Lighting and reflection also close to real life
Just be glad man
@@GameRTmaster yeah the future, when everything will look the same, and poorly written with stuttering all over the place. ;]
@@Enclosure_2k you just saw the bad side only
Just wait for the gifted developers
@@GameRTmaster considering how launches went i don't think we have any gifted devs
this ain't mid 00's where you had black box and codemasters re-invent racing genre instead this is time where we see more and more corruption ask for money but deliver shit products
were fucked if you people buy pile of shit games and keep inducing devs to this behavior
It's the consoles that has moved the target for video ram.
In terms of not playing the latest games, I do tend to lag behind in what I play. Usually around 3-5 years behind.
I’m not too concerned for playing AAA titles. I’ll get to them eventually and when I do, they will be cheaper and my hardware will easily surpass the requirements.
Love this video, 16Gb cards should be safe for years to come. My personal guess is 16Gb will be the new 8Gbs. As for the super demanding games you can always come back and revisit.
not as long as you'd hope. in 2 years time 16gb vram would be like 16gb dram now for computers, the standard and the mininum you'd want when buying a new laptop or pc today. Today, as of now no one should be buying any gpus with less than 8gb of vram, and if you have a gpu that has less than 8gb vram sell it off immediately
@@kenhew4641true vram=ram basically