The first mainstream GPU with 8GB of VRAM was the R9 390 in 2015, which cost $330. By 2020, we should've only been seeing 8GB on the most entry level cards.
@@oOZellzimaOo The nuance is that everyone SHOULD know NOT to get a graphics card like that in the modern era. It's essentially a waste of money compared to something that's actually useful. Sorry to be pedantic, but that's the nuance of it.
Yeah I bought a 1080 on launch day and that was probably the first generation for me that I stopped looking at vram as a large concern. ...But as you point out that was kind of a while ago at this point.
The "I told you so" angle is completely justified and I loved every second of it. However, the content is also extremely important, because a lot of RTX 3070 Tis, 3070s and 3060 Tis are still being sold new, and are going to be exchanged on the second-hand market. People need to know the value of these cards and the performance they can expect. Well done Steve!
And even worse, nVidia will continue to put out 8GB cards in the 40 series, while even new 12GB models should already be getting the warning the 8GB cards got back then. If people run out to buy 8GB 4050 Tis etc. at these highly inflated prices, a "told you so" will be more than justified next time.
For me it reeks of confirmation bias. You would not run a 30-month-old mid-range GPU at ultra settings these days. I've got a 6900 XT and certainly don't run most games on ultra, but on high or very high settings for good fps. Test the 3070 on high settings against the 6800 and it will do well.
@logirex I see what you're saying, but it's still great information for people looking at buying these cards, which sell in considerable numbers new and used today. The old reviews are no longer valid.
@@claytonmurray11 Going forward you certainly want cards with more memory, but it looks like they're getting it, with the RTX 4070 getting 12GB for instance. That said, even buying an older card, I would still take an RTX 3070 over the 6700 XT any day of the week. This channel tested those two across over 50 games and the 3070 was almost 15% faster. The only thing is, if you buy older or mid-range cards, DO NOT RUN THEM AT ULTRA SETTINGS. Both this channel and others such as LTT have pointed out multiple times that running at ultra settings is pointless. LTT has one video where they had people compare games at ultra vs. very high settings, and most preferred very high, as they clearly could not notice the difference.
I really appreciate that on this channel you guys make the pressure you're putting on companies through your views and influence extremely clear, with extremely specific target points, so that they can't weasel out of criticism with a few extra gigabytes of VRAM. Quite frankly, you guys are using your power in this market as efficiently as you possibly can without getting anti-consumer pushback.
I was using an RX 580 for many years, and I feel what made that card age so well was its 8GB of VRAM. At launch that was pretty generous, considering NVIDIA's equivalents were using 3 and 6GB. So when I finally moved away from that card, it was obvious that the RX 6800 would be the right pick for me. It's been good to see that card already aging well for the same reasons.
I switched from a 1060 to an RX 6800. I considered the 3070 at release, as the price was very similar, but I'm an enthusiast and I knew 8GB wouldn't cut it at 1440p within a couple of months. That's exactly what happened, yet Nvidia claimed it's not a problem. A 3080 with 10GB? What a joke. Nvidia made fools of consumers.
RX580 was the best bang for buck card I ever had, and yes, 8GB VRAM played a large part in that. Before that I had a 660 Ti as my favorite bang for buck card, it was quite excellent and the ONLY reason I had to replace it was because already then nVidia was just too stingy with their VRAM, it could've lasted me even longer if not for that.
Would have also been interesting to see the RTX 3060 12GB vs a higher-end 8GB RTX card, to see if there were cases where the former played better than its more expensive counterpart.
Retesting the 6700 XT would be a great idea, considering that it offers 12GB of VRAM and is already very close to the 3070 in compute power but cheaper. It would be interesting to see how it fares against the 3070 in the current VRAM-limitations meta.
Hopefully Steve can do a 2080 Ti vs 3070 vs 6750 XT. Turing would win this because of its much higher RT performance, and 11GB of VRAM is actually the sweet spot for that class of performance.
nVidia's planned obsolescence has worked... I'm gonna buy a new GPU to replace my 3070. Grats nVidia, you've driven me into getting an AMD RX 7900 XTX.
@@Dempig I have a 3070 FE. Got it a few weeks after release when the only way to find one was to get lucky on Best Buy with constant website refreshes on stock days. Should have resold it when it was going for >$1000 lol.
What about the planned obsolescence angle in terms of NVIDIA purposefully limiting the VRAM in order to encourage consumers to upgrade sooner? Seems a clear strategy at this point.
My second-ever notebook was a Lenovo Y510P with 2x 755M in SLI (Nvidia). At that time I didn't know anything about PC hardware, so I thought two GPUs were obviously better, right? Anyway, Nvidia came out with ShadowPlay and I was so happy I could use it to record stuff, as FRAPS really sucked. I was recording my League of Legends gameplay left and right. A few weeks passed, and I couldn't use ShadowPlay anymore. I googled it and found out Nvidia had disabled it, but there was a workaround, so I used said workaround. A few weeks passed again, and that workaround was disabled too. That was the moment I didn't want Nvidia products anymore. When I built the PC I use today, I bought a Vega 64, despite almost buying a 1080 Ti. Now I'm buying a new computer, and looking at the things Nvidia does, the obvious choice was the 7900 XTX. (The card is in the mail, and it's a Liquid Devil, so I can't use it yet.) Fuck Nvidia.
@@Hardwareunboxed So why did they come out with the 3060 12GB? It was a weird card when it came out in terms of the VRAM it got. Did Nvidia finally see the problem coming?
@@kaisersolo76 They saw 6GB as too little at the time, as it would have had issues with some titles at launch. Remember, you can only change VRAM amounts by doubling or halving because of the bus width.
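To illustrate the point about the bus width, here is a rough sketch of why VRAM capacities come in doubling steps. The numbers are simplifying assumptions for illustration: one 32-bit GDDR6 chip per memory channel, with chips available in 1GB or 2GB densities at the time (clamshell mode, with two chips per channel, is ignored here).

```python
# Sketch: VRAM capacity options implied by a GPU's memory bus width.
# Assumptions: one 32-bit chip per channel, 1 GB or 2 GB chip densities.

def vram_options(bus_width_bits):
    channels = bus_width_bits // 32                    # one chip per 32-bit channel
    return [channels * density for density in (1, 2)]  # capacities in GB

print(vram_options(192))  # RTX 3060's 192-bit bus -> [6, 12]
print(vram_options(256))  # RTX 3070's 256-bit bus -> [8, 16]
```

This is why the 3060's only realistic options were 6GB or 12GB, and the 3070's were 8GB or 16GB: anything in between would have required changing the bus width itself.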
Excellent work, Steve. I had no idea new games were using this much VRAM. I'll be sure to get something with no less than 16GB of VRAM next time I buy a GPU.
Some of the stuff I've been running has been using all 16GB of VRAM and *still* going into system memory... The emergency 6800 XT (thank you, card death 1 month out of warranty!) was a good purchase; thankfully I got one of the lower-binned versions on sale and saved about £300 at the time. I'd never get anything below 16GB now, and frankly I still want more. Less bothered about clocks.
@@orcusdei Did the thought occur to you that a decent PC case is way, way cheaper than a GPU? I'd never buy a crappy GPU because the alternative doesn't fit, especially not if that GPU is way more pricey than the alternative card. You could literally have bought a decent future-proof case, had a nice meal with the family at a restaurant AND bought a GPU instead of a 4070 🤣🤣🤣🤣💀
It would be very nice to see a 12GB model like the 6700 XT in the comparison as well. Or even an older 11GB card like the 1080 Ti. It would be hilarious if frametimes were more stable on the 1080 Ti than on a 3070 :D
I am having a much better gaming experience with my 1080 Ti compared to some other people with cards twice as fast but only 8GB. I still get 60+ fps in almost all new games at 1440p, no stuttering.
I'm actually pretty gobsmacked by how well the 6800 performs here. The 16gb buffer is an absolute life saver and I'm surprised how well it's doing with RT as well considering how questionable it was at launch. One GPU will be usable at higher settings for a few years yet probably and the other is relegated to lower settings and resolution for completely arbitrary reasons. Very frustrating.
To be fair, the 3070 has never competed with the 6800. The 6800 has always been one tier above the 3070 from the very beginning (even before this new VRAM scandal). The 3070 was built to play with the likes of the RX 6700 XT and RX 6750 XT (which the 3070 beats most of the time).
@@angrysocialjusticewarrior No no no. Even the $400 digital PS5 console has over 12GB of its memory available to games. Launching the 3070 with only 8GB of VRAM while knowing devs in the near future would target the consoles as a baseline is treacherous 😢
@@angrysocialjusticewarrior I agree that the 6800 and 3070 weren't direct competitors, but they were only separated by 70 bucks, and that 70 bucks gets you a MASSIVE advantage now. Like an absurd one.
What's frustrating is the fact that so many people bought these pieces of (as Steve would say) GAHBAGE just because they were in a green box. Anyone who spent that kind of money on 8GB deserves their fate.
One of my buddies got a 6700 XT recently and had horrible driver issues, and some games performed really poorly in comparison to others. It could've been his fault, but I think AMD still needs to work on their drivers more.
It's actually normal behavior. They bought something and they want people to assure them they made the right choice. It's human nature. Even when we know we're wrong, we'll continue telling ourselves we're right in hopes that we'll eventually believe it. It's just lying to ourselves. And we do it not just with things we buy, but with all sorts of things. Even relationships, to give an example.
@@schwalmy8227 You're saying that as if Nvidia drivers were perfect. I had so many issues with Nvidia's that for me it's not an argument for going team green. Their only advantage right now is the professional space, IMO. For someone who just plays video games, AMD is a clear choice. Hell, even Intel has a lot to say here.
I was in the market for a GPU upgrade at the start of the year and it came down to these two as the primary contenders. I had been a longtime user of Nvidia products, but this time around, I went with the RX 6800, with the VRAM issue being one of the major deciding factors. Watching this video, I'm very thankful that I did.
You know what would be an interesting follow-up: including benchmark results from the 16GB Intel A770 in the titles where the 3070 struggled. Just for fun, you could even include data from the Radeon VII for those same titles. That additional data would really drive your point home for owners of 8GB cards.
An RTX 4060 with 8GB of VRAM will be pretty much dead on arrival. I can't believe Nvidia's social media team is unable to clearly communicate this to the higher-ups.
They may give it 16GB in the end, just like they did with the 3060. I am almost certain the 3060 was planned to be a 6GB card (just like the 2060 was), but they changed their minds at the last moment. No way they planned the 3060 to have more VRAM than the 3080! And the only thing they could do at that point was double the buffer (same number of channels, double the density of the memory chips).
You think the higher-ups are not fully informed? They have NO incentive to offer anything more. This would be a different story if they were facing the AMD of 8 years ago, which would do anything to gain market share. As soon as a competitor comes in offering the same performance with triple the VRAM at half the price, and they truly lose market share, then the next gen will be different.
I'm glad I went with an RX 6700 XT. I remember people telling me the RX 6700 XT was no match for the RTX 3070, and the AMD card actually did perform a little worse initially, but now it is a completely different story, thanks to a bigger VRAM buffer and AMD's FineWine technology :)
@@LowMS3 I have been having VRAM problems with my 3080 10GB for some time, mainly with newer games, and after borrowing a friend's 6800 XT just last week I'm shocked at how much better so many games looked and played. Even some 2+ year old games look and play much better; with older games like GoW at 1440p, the texture pack crashes the 3080 but the AMD card runs fine. Anyway, I have just bought a 6800 XT new for nearly half what the 3080 12GB card costs.
Same here, I got an RX 6700 XT for what I play. I wanted more VRAM without breaking the bank, since the games I play need a lot of VRAM (Transport Fever 2, CityBus Manager, Cities Skylines, etc.). I had an 8GB card before, but I had unbearable frametimes and a lot of lag due to VRAM overflowing into system memory. And I had people saying "you need Nvidia for gaming, it's the best", but my 8GB card WAS an Nvidia GTX 1070 Ti. I'm like, no, I need more VRAM than the 8GB Nvidia was offering in the mid-to-high end. I get what feels best for me, not for others. It's like being back in high school choosing your college and your friends saying "come where I'm going". Sorry, getting off topic haha.
Just to add to this: it was a real kick in the nuts when I bought the 10GB 3080 and then the 12GB variant came out afterwards... I wanted the extra VRAM too; it should have been the first product they released.
Never buy the first version. While it's hip and cool to buy it first, you're usually just a test subject for something better to drop right after, be it Apple, Samsung, monitors, GPUs, CPUs, whatever. Don't get caught up in FOMO, fam.
@@eugenijusdolgovas9278 Ya, we are a wasteful, throwaway society. We think we need to replace things because corporations have trained us that way, to keep their profits nice. We are just modern-day serfs.
@@CheGames497 Whilst some are wasteful, I tend to buy GPUs from the used market, and I'm rocking a 6800 XT right now, which will keep working until I decide to switch to 4K competitive FPS gaming.
AMD have really improved their Radeon cards over the recent years. I used to use Nvidia cards all the time but have been using Radeon for the last 3 years now and love these cards.
@@SMILECLINIC-2004 It's fine, no worse than Nvidia, which recently got issues as well. They are just slower at releasing newer drivers, but that's it. However, I'm not one of those guys who updates their drivers immediately when new ones come out. Why would you do so? It's not like you're going to get much better performance in newer games; you need to buy a faster GPU for that. If it works well, then you don't need to touch it.
I'm using a RX 5700, and I haven't had any driver issues throughout my time with AMD cards (rx 580, vega 56). I'm happy that the RX 5xxx series cards are still getting support and FSR/RSR updates as its allowed me to play most games at 2560x1080 and higher than 60fps. I don't really like the metrics overlay that AMD uses but I can't use Rivatuner without MSI afterburner which messes up my undervolting settings.
It's been said since 2020: go AMD if you want the regular feature set. They have the best regular rasterization frames. But at the lower end, with these 3070, 3060, 6800 XT, 6700 XT etc., you won't get any real benefit from ray-traced lighting effects, so you need to go higher up the chain for that. This test obviously proves it.
I wish Steve would do this comparison next. More interesting to see Nvidia vs Nvidia: almost identical performance, but an extra 3GB of VRAM. I have a 2080 Ti and TLOU runs great at 4K with DLSS Balanced and high settings.
@@deamondeathstone1 Yeah, it would have been great to see how the 12GB 3060 fared in these games compared with the 3070. I imagine a lot smoother gameplay in many instances, especially with DLSS.
I'm glad that late last year, when upgrading, I went with a 6800 XT instead of the 3070 Ti. They were similarly priced, but I also thought 8GB was a bit too low, since I had a GTX 1070 which also had 8GB, so I wanted an upgrade in VRAM too. Thanks for your analysis, HW Unboxed!
@@highmm2696 Thanks, yes I know the 6800 XT is often better than the 3070ti, but here where I live, they were practically the same price. Like Apple, Nvidia has a "tax" with it that they usually are more expensive or drop the price less due to more demand. So AMD usually tends to be cheaper
It happened with 128MB, 256MB, 512MB, 1GB, 2GB and 4GB; it was only a matter of time until 8GB joined the "not enough" graveyard. I think some are just shocked at how fast it seemed to happen.
It really didn't happen fast, though. 2, 3 and 4GB came in pretty fast succession, but it took a while for 8GB to no longer be enough, because the RX 480 had 8GB a long, LONG time ago and it was still fine until a few years back. And at release, 8GB was already borderline: Doom Eternal was taking up 11GB, and at 1440p Cyberpunk took up enough VRAM that it had texture issues.
@@drek9k2 Agreed. I think the people that are shocked are the ones who recently bought a 3070 on sale or maybe even a 3080, while the rest of us from the r9 290x 8gb and rx 480 8gb era are not.
@@Bassjunkie_1 Oh true, that's a good point. I totally forget that for a lot of people this may be their first actual video card. I'm not saying you needed to be from the 8800 GT days, just that I keep forgetting many of these kids weren't even around for the RX 580 8GB. I guess a lot of people really are clueless.
Go play the new Uncharted version on PC. I have no complaints about getting a new card with 16GB of VRAM when the game looks like that. RIP 2070 Super and 8GB cards, you won't be missed.
I must admit, I didn't think 8GB would be an issue this soon. Up until recently I thought that, all things considered, the 3060 Ti was *_the_* card of this GPU generation. In the end, I bought a 6700 XT. Less so because of the additional VRAM, but mainly because it was cheaper - and now I'm really glad I did.
The rise of VRAM usage is due to consoles. Consoles back then were a hindrance to PC gaming, since companies were still focused on making their games for the PS4/Xbox of that time. Now that the current-gen consoles are quite good and have 16GB of RAM (although unified), expect most games at ultra to use around 10-12GB, leaving about 4GB for the OS. The safe play would be to buy a 12GB GPU, but you're only guaranteed to be safe with a 16GB GPU if you want 4K ultra + ray tracing.
@@madtech5153 The PS5 and Xbox Series X have 16GB of shared memory. That means the 16GB is shared between VRAM and system RAM duties. So how did 8GB of VRAM plus 16GB of DDR RAM become obsolete? If you consider texture sizes, the PS5 might only have ~6GB of VRAM and ~10GB of system RAM to work with. Dude, devs are f**king up.
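The budgeting argument in this thread can be sketched with some back-of-the-envelope numbers. These are illustrative assumptions, not official console spec breakdowns: roughly 3.5GB reserved by the console OS, and a guess that GPU-side data (textures, buffers) takes the larger share of what remains.

```python
# Rough console memory budget sketch (all figures are assumptions).
console_unified_gb = 16.0
console_os_reserved_gb = 3.5            # approx. OS reservation
game_budget_gb = console_unified_gb - console_os_reserved_gb

# If a port keeps a similar CPU/GPU split on PC, the GPU share alone
# can already exceed an 8 GB card's buffer.
gpu_share = 0.7                          # assumed fraction that is "VRAM-like"
pc_vram_needed_gb = game_budget_gb * gpu_share
print(f"game budget: {game_budget_gb:.1f} GB, "
      f"likely VRAM need: {pc_vram_needed_gb:.2f} GB")
```

Under these assumptions the GPU share comes out around 8.75GB, which is consistent with 8GB cards struggling and 12GB cards being comfortable in console ports.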
You showed very well that it's not all about fps. Most people were looking at the fps results and moving on. We understand again how complicated the user experience is. Congratulations!
Been waiting for this video... in 2020 I had the opportunity of getting either a Sapphire Pulse 6800 or EVGA 3070... and I'm so glad I went with the 6800.
On the future-proofing side of things, team red has always seemed to have an eye for longevity. Just nabbed a 7900 XTX myself after I saw my venerable RX 580 8GB was minimum spec; after 5 years of use it was a no-brainer to go red again.
@@Doomrider47 Agreed. I never really dipped into AMD until I got a Ryzen 2700 a while ago. I don't think I'll go back to an Nvidia card for the foreseeable future unless AMD starts sticking its fists in consumers' asses more than normal. I'm honestly hoping Intel brings a big update with its next GPUs; maybe both AMD and Intel putting pressure on Nvidia will knock them down a peg.
@@Doomrider47 how big was the jump? I have a gtx 1070ti (which has similar performance to the rx 580) and I was also planning to buy a 7900 xtx or maybe wait 2 years and buy a 5080.
@@pyronite3323 If you can grab the XTX anywhere close to $700-850, just go for it, it's worth it. TBH I have a 3-year-old 2070 Super 8GB that I got for $170 and I don't regret it.
I'd love to see a 3070 vs 2080 Ti comparison, since most recommended specs specifically mention a 2080 Ti instead of a 3070 for 1440p 60fps ultra settings, even though they're comparable in rasterization. Most likely due to VRAM.
Imagine making such a good core architecture and software like DLSS, only for it to be thrown away because of not enough VRAM. It really is the epitome of building an F1 car and then fitting it with tyres from a Prius.
Outdated ShadowPlay that doesn't even have a capture preview.
I agree with this. The tiers are now:
- 24GB -- high end
- 16GB -- mid range
- 12GB -- entry level
- 8GB -- "barebones", sub-$200 cards
This should be the standard in 2023.
I would agree, but this also highlights the laziness in game development now. There are still more complex games that look better than games like The Last of Us while using half the VRAM. Devs need to take responsibility as well, not just GPU manufacturers.
@@cairnex4473 Lol, that card was always a trap; I doubt anyone who bought it feels good about it now. I went for the 3080 Ti and the extra VRAM has been a godsend. Still, I think devs need to seriously re-evaluate their priorities when making games: if a game looks amazing but plays like absolute garbage, it's a bad game.
@@FouTarnished Not every developer became lazy; it's just that technology has advanced a lot. I'm not including bad PC ports like The Last of Us or beta-state releases like LOTR: Gollum. What I mean is games like Resident Evil 4 Remake: Capcom really improved the graphics, and we can't blame a game like that for increasing its VRAM usage.
NVidia planned this knowing what was coming, as they had visibility into what upcoming devs were working on. All their products had gimped VRAM amounts so they could justify everyone buying new cards sooner.
The worst part is that the customers who got burned by Nvidia's BS will just go "oh, I guess my GPU is getting old" and buy a newer Nvidia one without a second thought.
@@DebasedAnon I have a 3070 and my next card will be an AMD card. I've had enough of Nvidia's increasingly annoying nonsense for good, and AMD cards are now more than a viable alternative.
@@Thysamithan I'd say AMD outright won the previous generation before the entire GPU market went up in flames; if cards had stayed around MSRP, Nvidia would have been trounced across the board. If people cared a tiny bit more, the market wouldn't be as bad as it is now. It took the absolute farce that is the $1,200 4080 plus a recession to make people NOT buy Nvidia's products. That said, AMD is being way too passive for my taste. They could afford to go for the jugular right now due to having vastly cheaper costs, but they're content to undercut Nvidia by small amounts (relative to the price).
@@DebasedAnon My issue is that, while I know Nvidia is screwing me over, I'm split between gaming and doing AI projects on the side in my free time, so I have no choice but Nvidia, since my workflows rely on CUDA. If AMD bothered to push ROCm and make it as viable as CUDA, and made FSR better than DLSS, I wouldn't even consider an Nvidia card.
It's funny how Nvidia is pushing ray tracing, yet when you turn it on, that's when their card drops to its knees because of VRAM, despite probably having superior ray tracing accelerators.
It's also bewildering how people blame developers and poor optimization, justified or not, for low performance on 8GB cards, yet there doesn't seem to be much outrage about Nvidia pushing RT so heavily to sell 20-series cards back when support for it was practically non-existent; enabling it now on virtually any card from that generation will bring it to its knees. Is that also poor "optimization" on the part of developers, or Nvidia overpromising and underdelivering?
It's not even "probably", the RT cores make a huge difference and the Nvidia cards should be vastly better at it, so the lack of VRAM is without a doubt a big issue.
@@chronicalcultivation Don't exaggerate the difference RT cores make. Sucking less does not mean RT cores don't suck. And that's only usual limited RT. The RTX 4090 still only manages 17 FPS on Cyberpunk 2077 RT Overdrive mode.
@@sammiller6631 as someone who owns cards from both AMD and Nvidia, and also a PS5, no, it is not an exaggeration. The performance hit taken on an RDNA2 card or console for enabling RT is significantly more noticeable vs an Ampere GPU. It's not totally unusable on AMD, but it's a lot worse. I was really hoping the 7000 series would catch up and be equal to Ada in that department, but they seem to have only reached Ampere level RT instead. I've always been more of a Radeon guy for almost 20 years, so the last few generations have been a bit disappointing.. It was better when it was still ATi
I saw 8GB of VRAM being an issue back in 2020, because Shadow of War (a 2017 game) requires more than 8GB of VRAM if you want to use the highest texture pack with all settings at max. That's why I ended up going with the 6800 XT.
I had a 6700 XT with 12GB, and some games could easily fill 9-11GB on the highest preset at 1080p!!! 8GB is not enough... now I also have a 6800 XT, and there it's normal for games to use almost all 16GB at 4K...
Modded XCOM 2 used almost 8GB on my 1070 at 1440p in 2016. I upgraded this year, and anything less than 12GB was a non-option for me, which meant NVIDIA auto-excluded itself.
This reminds me of the issues we used to have way back when we'd run out of system RAM and Windows started using its swap file, causing massive slow-downs. The main difference being that we can upgrade system RAM. If only we could do the same with GPUs.
Moonstomper68 Swapping out RAM works much better on a motherboard because the motherboard PCB isn't completely covered with heatsinks blocking off the memory chips. And there isn't much need to upgrade VRAM when the GPU's raw performance is already fixed. It's much cheaper to just put on enough VRAM from the start, so it won't run out in situations where the card's raw performance could still hold up.
You cannot do upgradable VRAM because adding a socketed interface adds more trace length and latency. Considering how fast GPU memory is these days, you need every bit of performance you can muster. The only way to do that is to have the VRAM as close to the GPU as possible, for the shortest traces between them. Shorter traces = lower latency = faster speeds. As much as I myself would love to see socketed memory module upgrades for GPUs, it would sadly be more of a hindrance than a benefit.
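The swap-file analogy in this thread maps onto real bandwidth numbers. The figures below are approximate and used only for illustration, but they show the order-of-magnitude gap between on-card VRAM and the PCIe fallback path that spilled data has to travel:

```python
# Why VRAM overflow into system memory hurts: the fallback path is
# roughly an order of magnitude slower. Approximate/illustrative figures.
gddr6_bandwidth_gbps = 448.0    # e.g. an RTX 3070's memory bandwidth, GB/s
pcie4_x16_gbps = 32.0           # theoretical PCIe 4.0 x16, one direction, GB/s

slowdown = gddr6_bandwidth_gbps / pcie4_x16_gbps
print(f"on-card VRAM is ~{slowdown:.0f}x faster than fetching over PCIe")
```

That roughly 14x gap is why frametimes fall apart the moment textures start streaming over the bus instead of sitting in VRAM, even when average fps still looks acceptable.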
Once I had VRAM issues, I ditched my 3080 12GB for a 7900 XTX 24GB. What an amazing card; it just lets me max out anything at 4K with loads of VRAM remaining. I'm on the team red bandwagon for now. EDIT: When I mention maxing out everything, know that I do not use ray tracing; if you want good RT performance, go with the newer NVIDIA cards.
I couldn't be happier getting my 6800 last year for 480$. Apart from the odd driver issue (nothing serious) I'm a happy AMD customer so far. Nvidia became too rich for my taste with the 3000 series. Now both AMD & Nvidia are luxury goods with the newer gen products.
@@PotatMasterRace ?? There's nothing to do, it's just plug and play. Dunno why he had a problem, but if Linus had no problem, it more likely applies to the general population as well (yes, I'm dissing Linus).
What scares me off of buying AMD are all these driver issues I read about. For example, they had extremely high idle power usage for a really long time; I don't even know if it's fixed now. And then there's FSR, which is just not nearly as good as DLSS, ray tracing, etc.
I waffled between the 3070 and 6800 back in 2021, and probably would have bought a 3070 if I could have gotten it anywhere near MSRP. Instead, I bought a $900 Gigabyte RX6800 Aorus Master. While I wasn’t thrilled with the price, I’m glad to hear I made the right choice. HUB’s concerns about VRAM was a major deciding factor for me. Thank you, HUB!
I feel your pain at that price too... I had to buy a 6700 XT for 700€ in that same period... If only my GPU hadn't died on me back then, I'd gladly get a 6800 XT for 650€ right now...
Interesting. GTX 1080 user here looking to upgrade! The top card my 9700K can handle is an RTX 3070 or a 6800. The 16GB of VRAM is definitely pulling me to team red, for sure.
@@UKGBManny For the money right now, it's one of the best options. Its closest competition is over $100 more. If you can spend another $100 you can get the XT, whose biggest competition is WAY more expensive than that. Unless you can get an AMD 79xx or an NVidia 3080, every other comparable card is double to quadruple the price. I paid $370 for mine and am still amazed at how much it can handle.
The 6700 XT, the 6800, and the 2080 Ti are the only cards you should be looking at in the $350-450 range. Maybe sometime in the future we can add the A770 to that list.
@Cm You've been able to find the 6700 XT new for 399€ for weeks in Germany. The price bump to the 6750 XT is not worth it. Example: Mindfactory currently has the Sapphire Pulse 6700 XT for 383.05€.
My Aorus 6800 Master still runs anything I throw at it. As long as my online multiplayer games can pull 144fps, with 1% lows no lower than 120, I'm happy. As for offline games like Resident Evil or Atomic Heart, still all good. Ray tracing at medium, though.
Got a 3070 for 600 euros in 2020 (GPUs cost more here than in the US). This week I found a one-year-old 6800 XT for 360 euros on the used market and was able to sell my 3070 for about 300. Can't wait to enjoy the next few years with this beast of a card, it's still so powerful at 1440p!
@@Karimftw0.0 Yeah, but you got that performance 4 years later, while I've been using my RX 6900 XT for 4 years now. Getting that performance half a decade later isn't as big a flex when we'll just move on to something with 2x the performance.
If the rumored 8gb vram on the 4060 and 4060ti turn out to be true, I am very much looking forward to seeing HUB's review as there is no way any of those products will be priced accordingly.
@@josh2482 8GB cards are fine for crypto... not so much for gamers... even 12GB cards... As for me, I don't pay electric bills thanks to my solar power... so yeah, I don't live under a rock... and there are still GPU miners out there...
The sad part is that yes, the 4060(ti) will get scathing reviews for only having 8GB vram, and yes, they'll sure as hell be overpriced for that - but people will still buy them. And, from what it looks like, AMD won't go ahead and do the smartest thing they could do and launch both the 7800XT *and* the 7700XT with 4 MCDs and 16GB. It's such a pity, and I really hope that they reconsider. Already having chiplet design for RDNA3 implemented would certainly give them the chance, unless the binning for the GCD also includes defects in the fabric interconnect to the MCDs.
Far out, I was tossing up between a 3070 Ti and a 6800 XT late last year and I'm really glad I went with the Radeon GPU now! Great work as always Steve.
That video perfectly reflects what it feels like to buy a GPU with 8GB some time ago and suffer with it now. I would be glad to see a comparison between RTX 3060Ti vs RX 6700XT, as these are also direct competitors. Good job!
I've been gaming on a 6700 XT for 1.5 years now, at all ultra settings 1440p (with minor tweaks here and there). If I EVER experienced ANY stutters or anything I didn't like, I'd instantly switch to a better card. But I don't have to, because EVERYTHING TODAY RUNS BUTTER SMOOTH!
Are you really suffering? Just drop texture quality one step and this won't even affect you. But if you swap the situation around and high texture quality didn't exist, everyone would blast AMD for wasting money on RAM when they don't need it. Funny how that works.
Similar situation here. On the other hand I tend to not use raytracing and use a mix of low/medium settings so I don't think 8GB VRAM matters too much for my use cases. The testing shows 3070 is kinda bad but my 3070 will do just fine until it dies I'd say.
Some people in the comments are missing the point. Limited VRAM is planned obsolescence; most people don't upgrade their GPU often, and it intentionally keeps high-end graphics cards from reaching their full potential.
@@murdergang420 Have you noticed who sponsors the vast majority of these problem games, though? I seriously feel like I'm the only one who notices this coincidence. Every game in this video with serious problems is an AMD-sponsored title, which means the devs worked closely with AMD during development. I'm sure it's nothing though, right? No way AMD would be turning some knobs to make games run badly on certain Nvidia cards while playing to the one huge advantage AMD has... nahhh, couldn't possibly be, it's just "Nvidia BAD" and we need to remember that. Not like anything like this has happened before, say with Nvidia tessellation crippling the competition's cards. Honestly, I'm very surprised outlets like HWU don't at least acknowledge this and question, even a little, whether something underhanded and fishy is going on when every game with a problem has AMD's hands on it. Always the games that come out with terrible RT as well. Just a coincidence, I'm sure they have no say at all lol.
@@jinx20001 So do you have the same issues with video editing software like Adobe running better on Nvidia hardware? Because it's been that way for the longest and I've never seen conspiracy theories mentioned about performance. You should be more upset with Nvidia for putting out a crippled card with 8 GB of VRAM. If they had least put out a competent product, it would be easier to entertain accusations of bad optimization. That said, when AMD got the console contracts from Sony and MS, it was predicted that game optimizations might start to favor AMD.
@@murdergang420 for some games, have you experienced stutter due to shader caching with the 6800? Destiny, for example, stutters heavily for a bit and then gets better over time. I just found it annoying that this happens with every GPU driver update. It makes me not want to update the driver.
It's funny to me that you got comments complaining about your VRAM criticisms. Part of the reason I went with the 6800XT (aside from the fact I was able to get one for not much more than a 3070, and I would have been able to get a 6800 for a bit less than a 3070 if I'd gone that way) was because of your VRAM comments. I am, therefore, very grateful for your VRAM criticisms personally.
A lot of people who criticize HUB's VRAM criticism say that speed is all that matters and size doesn't. It was already quite clear back in The Last of Us Part 1 benchmark that speed doesn't help when there isn't enough VRAM.
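To put some rough numbers on the "speed vs. size" point: once the working set no longer fits in VRAM, the overflow streams over PCIe from system RAM, and no amount of on-board memory bandwidth helps with that. A back-of-envelope sketch (the bandwidth figures below are the commonly published spec-sheet numbers, used here as assumptions; real sustained rates are lower):

```python
# Published spec-sheet figures (assumptions for this sketch, not measurements):
GDDR6_BANDWIDTH_GBS = 448.0   # RTX 3070 on-board memory bandwidth, GB/s
PCIE4_X16_GBS = 32.0          # theoretical PCIe 4.0 x16 bandwidth, GB/s

# Assets resident in the 8GB buffer are read at full GDDR6 speed;
# anything that spills to system RAM comes back over the PCIe link instead.
slowdown = GDDR6_BANDWIDTH_GBS / PCIE4_X16_GBS
print(f"Spilled assets stream roughly {slowdown:.0f}x slower than resident ones")
```

Which is why a fast memory bus can't paper over a too-small buffer: the moment you spill, the bottleneck moves to the PCIe link, and that's where the frametime spikes come from.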
The 3070 Ti with 8GB is faster than the 6800, but he chose to compare two different tiers of performance and keeps riding on bottlenecking issues from trying to run Ultra on badly optimized titles with a mid-range card from years ago.
I actually ran into VRAM problems with my old RX 580 8GB. I sometimes needed to kill the browser/dwm.exe so games would run smoothly. That was in 2021, when I decided to upgrade my graphics card. And then I saw the 3080... 10GB. I was like "what the hell? How on earth is 10GB enough in 2021?" I ended up going RX 6900 because Nvidia cheaped out on VRAM for the whole 30 series. Nvidia plays a good game here.
@@siwexwot8994 and that's the point. The 3070 is only a 1080p card today due to its lack of memory... The takeaway is: the 6800XT was and still is the better buy. 😉👋🏼
@@siwexwot8994 yeah, you may be right, but lowering texture quality in 2023 is not future proof. I'm still glad that I sold my 3070 two months after the purchase and got a 6800XT. FSR 2.x and 3.x will keep it usable even longer than the 3070 ever will be. That's my personal opinion 😉
People who spent €900 on the 4070 Ti shouldn't sleep well; it's absolutely not a future-proof card despite the abusive price. Even at close to 1k, you can still get a card that isn't future proof. Nvidia is literally abusing its fanatics.
@@Ilestun true, but the 7900xt already can't hit 60fps in Cyberpunk maxed out at 1440p because of slow RT performance, while a 4070 Ti can exceed 60fps in just about any game currently available. I think both cards will eventually run into issues for different reasons this gen. Nvidia was greedy, short-changing gamers by 4GB of VRAM, while the 7900xt is just a more powerful GPU at rasterized performance but still dragging its feet with RT. AMD had better come up with a viable FG alternative soon as well, plus drivers. Otherwise the rasterized performance advantage won't hold either in upcoming games.
@@modernlogix I can't count how many times I told people to avoid upgrading to the 4070 Ti because of the VRAM shortage... but you just can't stop them from spending their money badly, I guess.
@@amineabdz I hope so, but I'll believe it when I see it. AMD cards have been great since RDNA 2, and growth has been slow. That mindshare is incredible.
@@amineabdz Probably not, AMD and Nvidia are just like Apple and other Android brands. It doesn't matter if the product is better and cheaper, the brand name matters the most and we already seen that with Iphone SE 2/3
Still rocking my 1070 I bought in 2016. Absolutely love this card; it's been enough to run recent games, granted at medium quality, but they still run and I can play in good conditions. I'm in no hurry to upgrade my GPU and I'm not a compulsive buyer, and thank god for that, seeing the state the GPU market is in. Hardware Unboxed is one of the channels whose advice I'll be sure to follow when deciding which GPU to upgrade to.
I agree, the 1000 series may have cost more than previous generations at first, but boy did they have longevity. I picked up a GTX 1070 Ti at the beginning of 2022, and man do I wish I had gotten it sooner! Nvidia, and now AMD, have lost track, at least when it comes to the gaming side of things. More so Nvidia.
Same here, EVGA 1070 FTW, an absolute diamond of a card. Lasted longer than any GPU I've ever bought, lasted through COVID and crypto, and plays almost everything I care to throw at it in near-silence. Just replaced it with a cheap second-hand 6800XT to last me until the next 'good' GPU release cycle whenever that will be, but that's only because I'm building an entire new system from scratch.
@@cameronblazowich2671 if we take AMDs perspective into consideration, then the only way they continue to increase profit margins is to charge their 5 customers even more money for their products.
I have run out of VRAM in one game with my 3080: Ghost Recon Breakpoint, with everything cranked to ultra at 3440x1440. It will run out of VRAM and crash after about half an hour to an hour of gameplay, at least for me, and I get the error that it's out of VRAM. So far it's the only game I've had that issue in.
I paid a shitload of money for my 6800 XT during the shortage (twice what it goes for now). Still extremely happy with it and still glad I did it. Happy to see it still performs so well today.
Same. Actually, at that time in my country 6800XT and 6900XT were roughly the same on price. I ended up getting the 69 one. Glad 68 users are still in good shape
Also got the 6800xt, at launch price. 8GB was a big no-no; it was very obvious that it was too little. Just like the GTX 580 that came in 1.5GB and 3GB variants: the 3GB lasted years longer. Same with the 3GB/6GB 1060.
Great video Steve, thanks. Another thing to consider is that calling these cards "aging"... for a lot of people, these cards have only recently become available to buy, due to shortages and pricing. And they are currently still the newest 'midrange' cards available! So these are actually current gen cards struggling!
Well well well, the anti-AMD man himself has no choice but to speak well of AMD. Even in ray tracing the 6800 slaps the 3070 back to the beach the silicon was made from. 😂
Thanks for the revisit, Steve! As I said over 2 years ago: the RX 6800 was probably the best graphics card released in recent years. It was faster, more efficient and more future-proof than the 3070 from the start, and yet the 8GB VRAM cripple had to be hyped because it can do the oh-so-great DLSS and has much better ray tracing performance. Good job, press and influencers.
They can only review what they know, whether games will end up using more VRAM imminently is speculation and the reality is VRAM usage of games *in general* has been pretty stagnant for years. (Largely due to consoles, imo) I agree that Nvidia shouldn't be skimping on memory size or memory bus, but you can't really blame press for reviewing the cards they got at the time they got them with the information they had.
Comparing the 3070 to the 6800 while skipping the 6700 XT is like comparing the 6800 XT to the 3090 while skipping the 3080... depends on whether you're a red or green team fanboy. Why not compare the 3070 to what AMD was promising as its direct competitor at the time, the 12GB 6700 XT? Both of those companies have crapped on gamers equally in recent years.
The whole argument for choosing Nvidia in the midrange for ray tracing is officially debunked here. 😂 I dunno why people keep funding Nvidia's scammy practices and crazy prices.
I'm not an Nvidia homer. But I did get the 3090 and 4090. I can confirm ray tracing still sucks. It's simply not worth it yet and won't be for 5-7 years min.
This is one of the reasons I went from a 3070 to a 6900xt: the extra VRAM. Also, shockingly, I had to spend no additional money on the 6900xt! The performance has been great for the last 6 months and I'm quite pleased with the AMD card. Also, knowing that AMD cards age much better than Nvidia cards makes me think maybe I should stick with AMD from now on.
@@ChiplockRock To my surprise I have yet to encounter any problem with the drivers; every game that I have played and tested worked great. I don't know about productivity applications though, as I only use Photoshop and Premiere Pro; anything else is CPU related.
@@ChiplockRock I've had a good number of Radeon cards... most of the driver issues I've had were when I tried obscure stuff like connecting multiple monitors of odd sizes, or Windows automatic updates borking the drivers. My Radeon VII is also picky about which driver it wants to work with, and my X570 Crosshair VIII board does not like my Radeon VII. AMD has always had a shaky history with drivers, and things like taking months to get RDNA 1 drivers right have only hurt them. I bought a 7900xtx Red Devil on launch day and no problems so far. My only complaint is there's a 25C difference between the hot spot and regular GPU temps. Otherwise the card doesn't get too hot.
@@ChiplockRock just got a 7900 XTX (new build, coming from a 2080ti) and have no issues; also the AMD software is way better than Nvidia's. I think the driver thing isn't really as applicable nowadays as it was in the past.
You are accurate as always. 8GB is now budget tier, 16GB is mid-tier, and 24GB is for the top end. When Nvidia launched (and then quickly unlaunched) the 4080 12GB it was such an obvious mistake. Even 16GB gives me pause because 4K gaming and supersampling need more VRAM than ever.
16GB may not technically be mid-tier considering the current generation of games. I would say 8GB is entry level like 4GB was back then, 10GB somewhat standard like 6GB was, and 12GB the mainstream for GPUs, while 16GB should be standard for cards that can actually run 4K, unlike the 3080 and 3080 Ti with their 10/12GB.
I just recently switched from team green (GTX 1070 8GB) to team red (RX 6900XT 16GB) and I gotta say, that extra VRAM is nice. The Resident Evil 4 remake uses a whopping 13.8GB of VRAM at 1440p if you set everything to max (with ray tracing and hair), so 16GB does seem like the way to go moving forward. Which isn't shocking, considering there was a time when we were jumping to double, triple, or quadruple the "at the time" acceptable VRAM amounts back in 2007-2010. I remember having a 128MB card and moving to a 768MB one; I remember having a 1.5GB card and moving to a 4GB, 4GB to 8GB, etc. As the years go by, we need more VRAM. It's that simple.
I have the 3070 Ti and this video is 100% accurate. At 1080 the lack of vram was constantly producing a stutter-ridden mess. At 3440x1440 the card can't handle any game without spikes in frametime and stutters. Turning my character around in games like No Mans Sky and Cyberpunk and Horizon zero dawn causes the game play to pause and detail/objects to pop in and out. I've been on the fence about this, but this video has convinced me to go with the 7900 xtx. With its vram and raster performance, I think that card will age the best.
This... isn't VRAM related. No Mans sky isn't filling the 8GB of VRAM, neither is Horizon Zero Dawn. Your stutters aren't VRAM caused in these titles because they don't fill the buffer. Seriously, go look up comparisons of your card to the 6700 or 6800 and look at the .1 or 1% lows in those two titles and they are pretty close, which means stutters aren't happening as a general rule to people with 8GB of vram vs 16.
@@tonymorris4335 My VRAM sits at 7892MBs used in Horizon, and similar in the other games. I've used both HWinfo and Afterburner to monitor, as well as Nvidia's own monitoring tool to confirm. And even if it wasn't the VRAM, then the stutters are caused by what? Poor gpu performance? Poor driver performance? Neither one makes me feel better.
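One caveat when reading those overlay numbers: tools like Afterburner and HWiNFO report *allocated* VRAM, not what the game strictly needs, so the useful signal is sustained usage sitting right at the physical cap, like the 7892MB reading above. A hypothetical helper sketching that check (the function name and the 5% margin are my own choices, not from any monitoring tool):

```python
def vram_pressure(used_mb: float, total_mb: float, margin: float = 0.05) -> bool:
    """Flag likely VRAM pressure: reported usage within `margin` of the
    physical cap, which usually means the driver has started evicting
    and re-streaming assets (the classic stutter signature)."""
    return used_mb >= total_mb * (1.0 - margin)

# The 7892MB reading on an 8GB (8192MB) card sits at ~96% full:
print(vram_pressure(7892, 8192))   # True  - likely thrashing
print(vram_pressure(7892, 16384))  # False - same load on a 16GB card
```

It's a rough heuristic, since allocation at the cap doesn't prove eviction, but usage pinned at ~100% alongside frametime spikes is exactly the pattern the video demonstrates.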
Would be awesome to compare the 3080 10GB to a 6800XT 16GB in these games. They are pretty closely matched in rasterisation performance, I wonder if the AMD card pulls away massively with more RAM.
Not with VRAM though. As a 3080 and 3090 user, I can tell you that 10GB isn't even that much of a difference from 8GB, since games that need more than 8GB usually jump to 12-14GB.
6800 XT is closer to 3080 ti/3090 performance in rasterization in newer games, outside of 1 or 2 random titles. Sometimes exceeds that. The 3080 would get shit on.
@Kevin P. Lmao NO, wtf are you watching to say bullshit like this? Just check the recent game benchmarks from HUB: the 3080 10GB still beats the 6800xt in EVERY game, even at 4K. The latest ones are TLOU, Hogwarts and Spider-Man; in fact the 3080 10GB even beats the 6900xt and 6950xt in Hogwarts and Spider-Man respectively at 4K ultra, and the 6800xt by like 10-15%. And if you check other reviewers, the 3080 is also better in other new games like Dead Space or A Plague Tale. The only recent game where the 6800xt could outperform the 3080 10GB was CoD MW2, and everyone knows that game is just an outlier that heavily favors AMD. Imagine saying "6800xt is on 3090 level" when in the very recent TLOU video the 3090 was faster by like 30% at 4K. Stop coping.
I have both a 3080 and a 6700XT. I can tell you that 10GB is not enough, and it's also a really weird amount, because the way texture packs work in most games, you either have settings that require 8GB or 12GB. Having 10GB just gives more headroom for 8GB textures while still being below the requirement for 12GB, so it's very weird. The 6700XT can handle higher texture settings in games I play now, like Resident Evil 4 and The Last of Us. The 3080 is stronger, but the 10GB means you have to use lower texture settings anyway, so it's like, wtf... you play without Ultra settings, what's the point then? I would recommend at least 12GB of VRAM from now on, and I mean really minimum 12GB, because I think even that is a risk. We will most likely need 16GB of VRAM soon for new games coming out this year and especially in 2024.
Probably a bit slow, but not choking on VRAM. The 3060 is a 60+ fps experience whereas the 3070 was intended to be HFR capable. The 3070 is likely faster when it is not choking on VRAM.
I mean, it would still be slower rasterization-wise, but at least the frametimes are far, far more stable and it wouldn't have that weird voodoo texture-flipping magic that the 3070 does on Hogwarts Legacy
@@peterpan408 I am not sure, because the 12GiB version is ~25% SLOWER than its 8GiB Ti variant! And the 3070 is nearly ~50% faster when VRAM is not an issue! I very much doubt you would get 60fps at ultra settings on that "turd" of a card despite it having 12GiB of RAM.
lol. Back in 2021 the saying was "If you just wanna use rasterization, buy AMD, if you wanna use RT, buy nvidia". Funny how that turned out. For me it was never a question, since nvidia doesn't give us Linux-nerds open source drivers.
Great video Hardware Unboxed. As a 3070 owner, I feel ripped off; I spent a decent amount on this card thinking it'd be good for years to come. I'm so extremely frustrated now, since in most new games, even at 1080p, I'm running out of VRAM. Selling the used GPU is going to get me nothing in terms of price, and the new RTX cards are outrageous; frankly I'll never buy an Nvidia product again. The 6900 XT is available and cheaper now, but I'm not sure I want to buy a new card that's already last gen, even if it's better than the 3070. Meanwhile, the 7900 XTs and such seem quite expensive. I'm torn between waiting it out with this absolute hot garbage product or emptying my wallet into an AMD GPU. Or waiting for next-gen AMD. I have no idea. Nvidia is terrible and extremely anti-consumer. My next GPU will be AMD and I cannot wait to be able to play games normally on a card that wasn't made to go obsolete in a few years.
Give Intel A770 a try then. Its cheap, has 16GB vram, supports raytracing, and Intel has been doing a lot of improvements to their drivers. Buy it from somewhere with a good gpu return policy in case it doesn't work out for you.
I had a lot of issues with drivers on the RX6700 XT, especially in old games, where it performs almost like my old GTX1060; in new games everything was OK. Look at internet forums, lots of people have the same issues. I'm going back to Nvidia. AMD drivers are not good.
I remember feeling a bit burnt when my 1GB 560ti couldn't play PS4 games once the generation swapped over. Funny to see this happening all over again. Nvidia has *always* been stingy with VRAM. Even back in the day it was easy to get Radeon GPUs with over double the VRAM for the same price. In 2015 the GTX 960 only had 2GB, and the 970 famously only had 3.5GB of full speed VRAM!
Same! I remember asking on a forum whether I should get the 560ti 1GB or 2GB, and everyone recommended the 1GB. I mean, people took you for an idiot if you dared to recommend the 2GB version. But within just a year, Battlefield 3 came out and destroyed that 1GB of VRAM. I was so mad. Yet it seems that 12 years later, history has repeated itself.
You Western clowns have worn me out with the endless mentions of the 970's 3.5GB, as if every one of you learned about it yesterday and is rushing to tell everyone else who "doesn't know".
Another reason to buy AMD. More VRAM at every price point since the beginning. Its been keeping the R9 390 and RX480 relevant with 8GB of vram and the new stuff with 12/16GB should be standard.
@@maxpower18 Not true, as the 6650xt is WAY cheaper than both the 3070 and 3060ti. The 6800 is at those cards' price point and has double the VRAM. Stop being an Nvidia simp.
I'm very happy using a 6800 with a Ryzen 7600 as my secondary gaming PC. Never owned an AMD CPU or GPU before and I have to say I'm impressed! It will mostly be used for emulation since I put it all in a compact case.
This is why I always go for the card variants with the most VRAM, despite what everyone says at the time. "8GB is fine for almost all recent games"? Sorry, but I never swallowed that one.
It was kinda fine in 2020, but technology moves on. Since then, developers have become used to having 10-12GB of VRAM for the GPU on Series X/PS5, and this is translating across to PC games.
It was enough. Depending on the game. I always asked what is the budget + list of games/kind of games, resolution, graphics settings, Target FPS and when they plan the next upgrade.
The thing is you gotta look ahead when buying a PC component like a GPU. When people say "It's fine for almost all games today", well what does that mean? Does that mean that it won't be fine for a number of games in 2 years from now?
It was enough for 1080p, and when devs were making games for previous-gen consoles they were forced to use less VRAM anyway just to get the games to run on them. Now that the new gen is a lot closer to PC specs, they can push gaming further. Nvidia are cucks; they should have shipped all their cards with a minimum of 10-12GB.
@@malekkmeid9263 They are no closer now than the previous-gen consoles were. The main difference is that they have 16GB of shared memory, so they can use more of it for image quality upgrades. This, combined with console efficiencies, leads to greater stress on PC VRAM requirements.
I am feeling mighty fine after buying my 7900XT to replace my 3070, these results are eye opening and hopefully Nvidia will have to pull their **** together. Thanks for all the hard work!
I am not trying to defend Nvidia here, but if you already own a 3070 I would stick with it for now. The point of the video was of course just showing the games which are really unoptimized in their VRAM usage; most games still work fine with 8GB of VRAM. I think you should stick with your 3070 and wait a bit longer, maybe for the next gen of GPUs, as the performance increase will be much greater.
@@otozinclus3593 I get what you mean. I wouldn't advise most people to go for the upgrade I went for, but I'm also upgrading my 3800X, and I tend to "donate" my used hardware to family members who also use computers; right now having an extra card will solve another issue, so my upgrade has that in mind too. My 3070 will go to someone who will enjoy it for years. I use my PC for work (I spend 10+ hours daily on it), so I tend to be quite "liberal" with my upgrades, as it's an investment in my case. (Hell, my old 2600+RX580 build is still in use.) It just amazes me how badly 8GB of VRAM is behaving in newer games. I was already experiencing a bit of it (and I have 1440p high-refresh monitors), but it will only get worse over time. I was thinking about a 4070ti, but even the 6800 sometimes uses over 12GB, so it was a good call on my side I think.
I can't believe, that after getting a reference model Rx 6800 at MSRP during launch week, I decided to trade it in for a RTX 3070... At the time it seemed like a good idea for my use case scenario... Both cards performed similar in then-current titles, 3070 technically had better RT performance (even though in reality I barely use RT and it actually runs pretty poorly on a 3070). I wanted to try streaming so Nvenc encoder was appealing, and at the time FSR wasn't really a thing so Nvidia had one over with DLSS. But the funny thing is, as a 1080p gamer, I don't even really use DLSS at all, so I really don't understand what I was thinking factoring upscaling features into my decision or hell even Ray Tracing performance, at the time. But either way, in hindsight, I'm kicking myself for not taking a chance on AMD. Lesson definitely learnt
Pretty sure you could sell the 3070 and buy a 6800 for the same price.. or trade it :) lots of people buy Nvidia for the name not the performance it provides. Lots of silly people out there. No harm in trying
@@defectiveclone8450 Yeah. At the time the decision felt right, but in hindsight it was a mistake. I was thinking of trading it into the local Cex store to recoup around £300, and then using that to either get a Rx 6000 or 7000 series gpu at some point. Since I'm only really gaming at 1080p 144hz @high-ish settings, I don't need anything too fancy
@@cal.j.walker As the other guy said, many people are still in the Nvidia bubble and will pay more for a 3070 over 6800. Maybe just try and trade it for 6800/xt.
Many YouTubers while reviewing the cards: "3060 12GB is all marketing and you'll never need that VRAM, buy the 3060ti or 3070..." That advice aged like milk. Also: 4070ti owners sweating that 12GB is pretty much low end now.
I recently replaced my 3070 with a 6800 after finding one for good deal actually. I initially thought it would be kind of a sidegrade performance wise, but with more VRAM to work with. After watching your video though, it seems like I definitely made a great decision going forward.
@@edgyjorgensen3286 seems like with every driver update AMD is squeezing a little more out of it and the 7000 series as well. My 7900xt just keeps getting a little better with each update.
@@ok-mx1mi Maybe not, but they had that same architecture for a while and had a lot of time to keep learning and getting more out of it. I think with RDNA 3 we might have a similar situation where they get more and more from it over time.
Would be nice to see a retest of these titles with the 3070 vs the 16GB A770, to see if there's a similar result with 16GB vs 8GB! Also potentially add the 12GB 3060 to see if it pulls ahead!
The answer to that is simple. The RTX 3070 will get better fps in some games, but the lows will be undeniably terrible at 1440p, and it might not load the textures properly either, same as what's happening in the video. So the bigger-VRAM 3060 might get lower fps but will be able to keep the game stable, with better lows and correctly loaded textures. What he is pointing out is that the stutters will be there as long as the VRAM isn't enough, and what makes it worse is that the textures won't load.
You know what would be fun to run through these exact tests? RTX 3060 12GB, RTX 3080 10GB, RTX 3080 12GB. See if the 3060 passes the 3080 10GB in some games, while comparing how much performance was left on the table had the 3080 originally come with 12GB!
Hate to burst your bubble but the 3060 wouldn’t get higher fps in ANY game compared to a 3080… 12GB or not lol, it wouldn’t even be close. Just google performance stats for both and compare.
Excellent video. Appreciate the effort highlighting the importance of adequate VRAM. For anyone that doesn't know, while the 3060Ti/3070/3070Ti have 8GB and the 3080 has 10GB, the 3060 hilariously has 12GB because Nvidia screwed up their tech/marketing and AFAIK was responding to competition by AMD. Competition is a wonderful thing.
Yup. It was designed to have 6GB in 1GB modules (6x32 = 192bit bus) but Nvidia realized late in the game that it wouldn't even cut it for launch-day review benchmarks so they went with 2GB modules instead.
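For anyone curious about the math here: GDDR6 chips each present a 32-bit interface, so the bus width is fixed by the chip count and the capacity by the density soldered on. A quick sketch (the function is mine, just illustrating the relationship under that one-32-bit-interface-per-chip assumption) of why swapping 1GB modules for 2GB ones doubles capacity without touching the bus:

```python
def gddr6_config(num_chips: int, gb_per_chip: int) -> tuple[int, int]:
    """Return (bus width in bits, capacity in GB) for a GDDR6 layout,
    assuming each memory chip contributes a 32-bit interface."""
    return num_chips * 32, num_chips * gb_per_chip

print(gddr6_config(6, 1))  # (192, 6)  - the 3060 as originally planned
print(gddr6_config(6, 2))  # (192, 12) - the 3060 as shipped
print(gddr6_config(8, 1))  # (256, 8)  - the 3070's layout
```

This is also why a card can't just "add 2GB": capacity moves in chip-count or chip-density steps, and changing the chip count changes the bus width too.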
Initially I bought my 3060 12GB for hobby AI training, professional workloads, and transcoding. Back then it was either this or the slightly more expensive 6600XT or 3060ti. I even had second thoughts, since so many reviewers and online comments said the 3060ti was much better value... and yet in the last three months many games I've played would have run into crippling texture load/quality problems if I'd gone with the 3060ti. With how DLSS 2 constantly improves, if fps feels low I just turn DLSS up; but if you hit the VRAM limit, you have to suffer through extremely visible texture issues. DLSS is a much lesser compromise than what those 3070ti people huffing copium endure while insisting that turning textures down a few notches is okay.
This! I'm still rocking a 2080 Ti and I'd really like to see such a comparison. Especially since the 2080/Super/Ti is missing in all recent benchmarks which is a shame. Maybe it's time to ditch the 3070 from benchmarks and replace it with a 2080 Ti 😂 The low VRAM on Nvidia cards has always been a pain and now it finally starts to show and piss people off. I hope people finally recognize how shitty Nvidia is about these things and stop buying these gimped cards.
I was happy with my 5700XT and its 8GB of VRAM, but I'm glad now that I bought a 7900XTX with 24GB. I hope AMD continues the trend of 12GB and 16GB in the low and midrange for the rest of the RX 7000 lineup.
You guys are making me feel EVEN better about my 6800 purchase for $499 back in november... it was way cheaper than the 3070 and 3070ti which is what i was originally looking at... and the VRAM on top of the price sold it for me...
Even crazier that in late Summer 2020, 2080Ti owners were falling all over themselves to sell their cards for peanuts so they could buy a $500 3070. That price, and availability, turned out to be a pipe dream.
@@darkfire3691 Mostly true in terms of intent, but I know at least a couple of people who did sell a 2080Ti thinking they were going to buy a 3080 but wound up "rebuying" a 3070 because the 3080 was unobtainable. And, of course, they overpaid for their 3070s.
@@rangersmith4652 Yeh it should be a pretty big warning sign to any sensible person that downgrading on vram while trying to "upgrade" gpu's, is a foolish strategy.
I wish I could like this video twice! This was so helpful and instrumental in my progress of understanding how a GPU works and what the heck VRAM was and what it did, and clearly demonstrated why the graphs can be misleading sometimes, like when the GPU compensates for lack of memory and avoid crashing or buffering by entirely not loading in the textures, which isn't shown any other way but watching the side by side Thank you so much! Wonderful video
I think 8GB for entry, 12GB for lower mid, 16GB for upper mid, and 20+GB for high tier GPUs would be fine for this generation, though for the generation after the 12GB may need to shift to entry level and replace 8GB
@@crossfire4902 exactly why it's god tier. It's expensive to buy and run, but all reviews show it's the fastest out there money can buy. The choice is the consumer's.
Amazing job once again. I've been so happy with my 6800 xt since day one. The 3080 10GB will probably have the same kind of issues. Who would have bet a dollar on AMD for ray tracing? Nvidia keeps claiming they have the superior tech and their cards are better, but they've lost their way... Prices are totally stupid, and even on fundamentals like the VRAM buffer, they took the wrong path.
I was thinking that too. With the 4070 Ti having 12GB along with the 4070, it seems $800 and $600 is way too much for those GPUs! They're already obsolete! Everything coming out is just a waste of silicon! These cards barely hit 60 fps at 1080p at times, and that's ridiculous for the prices. I've lost all hope of getting a fair deal when buying a new card... just outright ripping people off. 😢
Yeah these are truly terribly optimized titles. Games 10 years ago looked better using 4GB of vram, I don't get why the criticism isn't aimed at these terrible ports.
Let's all not forget: Nvidia was super close to releasing a 4080 12GB. 🤣 At $900. Imagine if they were the only company in the market. And if the 4080 had been 12GB, I'd guess they were planning for the 4070 to be 8GB. What a timeline that would have been.
I know this comparison would be a little out dated, but could you do a comparison of an older title with an RTX 3060 (12 GB) vs RTX 3070 (8 GB) to see how resolution and settings scaling do? This should also highlight the limitations of the 3070 compared to Nvidia's own cards.
The first mainstream GPU with 8GB of VRAM was the R9 390 in 2015, which cost $330. By 2020, we should've only been seeing 8GB on the most entry level cards.
@@istealpopularnamesforlikes3340 Ahem* RX 6300 2GB
@@oOZellzimaOo thats a sub 100$ GPU...8GB of Vram is like 50$+ alone...
There were variants of the R9 290X with 8GB as well
@@oOZellzimaOo nuance is that everyone SHOULD know NOT to get a gfx card like that in the modern era. Essentially, it's a waste of money over something that's actually useful. Sorry to be pedantic about it but that's the nuance of it.
In 2015, 8GB of VRAM was $330? Man, prices have come a long and sad way...
The 1070 had the same 8GB of VRAM as the 3070, and that was seven years ago...
My gtx 1080 TI got more VRAM 😂
Yeah I bought a 1080 on launch day and that was probably the first generation for me that I stopped looking at vram as a large concern. ...But as you point out that was kind of a while ago at this point.
@@Crimsongz 11GB, 1 more than the 3080. They are so stingy with VRAM ;(
Loved my 1070.
And the 1080 Ti had 11GB.
The "I told you so angle" - is completely justified and I loved every second of it. However, the content is also extremely important because a lot of RTX 3070ti's, 3070's and 3060ti's are still being sold new, and are going to be exchanged on the second hand market. People need to know the value of these cards and the performance they can expect. Well done Steve!
And even worse, nVidia will continue to throw out 8GB cards in the 4xxx series while even new 12GB models should already get that warning the 8GB cards got back then. If people run out to buy 8GB 4050 Tis etc. at these highly inflated prices a "told you so" will be more than justified next time.
@@chrissoclone exactly. A 6950XT seems like a much better deal than the 4070ti and the upcoming 4070.
for me it reeks of confirmation bias. you would not run a 30 month old mid range gpu at ultra settings these days. I got a 6900XT and certainly don't run most games on ultra but high or very high settings for good fps. test the 3070 on high settings against the 6800 and it will do well.
@logirex I see what you're saying - but it's still great information for people looking at buying these cards - which sell a considerable amount new and used today. The old reviews are no longer valid.
@@claytonmurray11 Going forward you certainly want cards with more memory, but it looks like they're getting it, with the RTX 4070 coming with 12GB for instance.
That said even getting an older card I would still take a RTX 3070 over the 6700XT any day of the week. This channel tested those over 50! games and the 3070 was almost 15% faster. The only thing is if you buy older cards or mid range cards DO NOT RUN THEM AT ULTRA SETTINGS.
Both this channel and others such as LTT multiple times points out running at ultra settings is pointless. LTT has one video where they had people look at games using ultra or very high setting and most preferred the very high setting as they clearly could not norice the difference.
I really appreciate the fact that on this channel you guys make the pressure you are putting on companies through your views and influence extremely clear to understand and remedy, with extremely specific target points so that they can't weasel out of criticism with a few extra gigabytes of VRAM. Quite frankly you guys are using your power in this market to the highest efficiency you possibly can without getting anti consumer pushback.
theyre still very, very far from done, although im very tempted to swap them
💯💯💯
yet, idiots still buy 3070ti in december 2023 "i need raytracing"
RIP in peace to anyone who paid $1400 for a GPU with 8GB of vram two years ago
Desperate times..
I paid $360 for one six months ago. In my defense, it was an upgrade from a 6GB 2060
Paid 400€ for 3070 strix a month ago. Now waiting for my 729€ 6950 xt red devil to arrive
Rest in peace in peace lol
I did pay like 1100$ for a 3070ti because I didn't really think things would get better, as far as price. I should have stuck to my 1070 and waited:)
I was using a RX 580 for many years, and I feel what made that card age so well was its 8GB of VRAM. At its launch it was pretty high knowing NVIDIA's equivalent was using 3 and 6GB. So when I finally moved away from that card it was obvious that the RX 6800 would be the right pick for me. Its been good to see the card has been aging well already for the same reasons.
I switched from a 1060 to an RX 6800. I considered the 3070 at release as the price was very similar, but I'm an enthusiast and I knew 8GB wouldn't cut it at 1440p within a couple of months. It happened, yet Nvidia claimed it's not a problem.
A 3080 with 10GB? What a joke. NVIDIA made fools of consumers.
RX580 was the best bang for buck card I ever had, and yes, 8GB VRAM played a large part in that. Before that I had a 660 Ti as my favorite bang for buck card, it was quite excellent and the ONLY reason I had to replace it was because already then nVidia was just too stingy with their VRAM, it could've lasted me even longer if not for that.
I made this same upgrade recently. The additional VRAM and Nvidia's bonkers pricing were key drivers in my decision.
you must have avoided the lemons quite well!!
@Varity Topic 6750 XT is still a pretty solid choice!
Would have also been interesting to see the RTX 3060 12GB vs a higher end RTX 8GB card. See if there were cases the former played better than its more expensive counterpart.
it was too much for the GPU though so it's not as good as you might expect. At least it won't crash in RE4 as much though LOL
The RX 6750 XT might be interesting to throw in the mix for those titles that report ~15GB of total VRAM usage already.
I'd love to see a 3070Ti with its G6X in this comparison.
My ancient RX580 has as much VRAM as that 3070. Now 8GB is just enough for 1080p gaming. Glad to see the Radeon card doing so well.
u_u
❤️
Gddr5 vs gddr6
@@jpesicka999 AMD's RX 6600 8 GB GDDR6 is reaching $199
And to think that even the 470 had 8GB variants back in 2016.
Retesting the 6700 XT would be a great idea, considering that it offers 12GB of VRAM and is already very close to the 3070 in compute power but cheaper. It would be interesting to see how it fares against the 3070 in the current VRAM-limitations meta.
Hopefully Steve can do 2080ti vs 3070 vs 6750xt.
Turing would win this because of much higher RT performance and 11GB of Vram is actually sweet spot for that class of performance.
@@KrisDee1981 Turing uses first gen RT cores just like 6750XT, So I doubt that. But hey, no problem in comparing.
Actually, that would be awesome.
Yes, very good idea. Hardware Unboxed pleasssseeee
@@KrisDee1981 🤣🤣 yeah let's do it until nvidia wins, not the consumers, Nvidia must win.
nVidia's planned obsolescence has worked... I'm gonna buy a new GPU to replace my 3070. Grats nVidia, you've driven me into getting an AMD RX 7900 XTX.
I paid $850 for a 3070 ti on launch day.....what a mistake lol but im now switching to a 7900 xtx as well soon
Nothing wrong with 4090 too, but I get that $1600+ isn't necessary for a good experience.
I’m enjoying my 7900 XTX. I think you will like it.
@@macdonald2k Yup, the best card right now. But I choose not to give nVidia any more of my money.
@@Dempig I have a 3070 FE. Got it a few weeks after release when the only way to find one was to get lucky on Best Buy with constant website refreshes on stock days. Should have resold it when it was going for >$1000 lol.
What about the planned obsolescence angle in terms of NVIDIA purposefully limiting the VRAM in order to encourage consumers to upgrade sooner? Seems a clear strategy at this point.
They're only going to give gamers as little as they can get away with. This is obviously an excellent strategy on their behalf.
My second ever notebook was a Lenovo Y510P with 2x 755M in SLI (Nvidia). At that time I didn't know anything about PC hardware, so I thought 2 GPUs were obviously better, right?
Anyways, Nvidia came out with ShadowPlay and I was so happy I could use it and record stuff, as FRAPS really sucked ass. I was recording my League of Legends gameplay left and right.
A few weeks passed, and I couldn't use ShadowPlay. I googled it and found out Nvidia had disabled it, but there was a workaround. So I used said workaround. A few weeks passed again, and said workaround was disabled too. That was the moment I didn't want Nvidia products anymore. When I built the PC that I use today, I bought a Vega 64, despite almost buying a 1080 Ti. Now I'm buying a new computer, and looking at the things Nvidia does, the obvious choice was the 7900 XTX. (The card is in the mail, and it's a Liquid Devil, so I can't use it yet.)
Fuck Nvidia.
Gotta love that profit driven economy
@@Hardwareunboxed So why did thy come out with the 3060 12gb. Its was a weird card when it came out in terms of the VRAM it got. Did Nvidia finally see the problem coming?
@@kaisersolo76 they saw 6 as too little at the time, as it would have had issues with some titles on launch. Remember you can only change VRAM amounts by doubling or halving because of the bus width.
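The doubling constraint mentioned above can be sketched with some quick arithmetic (my own illustration, not from the video): each 32-bit slice of the memory bus carries one GDDR6 chip, and GDDR6 chips ship in 1GB or 2GB densities, so for a fixed bus width the total capacity can only double or halve.

```python
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    """Possible VRAM capacities for a given bus width, assuming one
    GDDR6 chip per 32-bit channel and 1GB/2GB chip densities."""
    channels = bus_width_bits // 32  # one memory chip per 32-bit channel
    return [channels * d for d in chip_densities_gb]

# RTX 3060's 192-bit bus: 6GB with 1GB chips, or 12GB with 2GB chips
print(vram_options(192))  # [6, 12]
# RTX 3070's 256-bit bus: 8GB or 16GB -- nothing in between
print(vram_options(256))  # [8, 16]
```

That's why the 3060 jumped from a rumored 6GB straight to 12GB, and why a 10GB or 12GB 3070 was never on the table without redesigning the bus.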
Excellent work, Steve. I had no idea new games were using this much VRAM. I'll be sure to get something with no less than 16GB of VRAM next time I buy a GPU.
Some of the stuff I've been running has been using all 16GB VRAM and *still* going into system memory...
Emergency 6800XT ( thankyou card death 1 month out of warranty! ) was a good purchase, thankfully got one of the lower binned versions on sale & saved about £300 at the time. I'd never get anything below 16GB now & frankly still want more. Less bothered about clocks.
@@Karibanu what resolution do you play at?
@@tre-ou5gt 1440p, occasionally 1440p triple-screen.
I've just bought 4070 with 12... the issue is that I can't fit any AMD card into my computer.
@@orcusdei Did the thought occur to you that a decent PC case is way, way cheaper than a GPU?
I'd never buy a crappy GPU because it doesn't fit - especially not if that GPU is way more pricy than the alternative card. You could literally have bought a decent future proof case, had a nice meal with the family at a restaurant AND bought a GPU instead of a 4070 🤣🤣🤣🤣💀
It would be very nice to see a 12GB model like a 6700 XT in the comparison as well. Or even an older 11GB card like a 1080 Ti. It would be hilarious if frametimes were more stable on the 1080 Ti than on a 3070 :D
Actually, I recently ran God of War on my 1080 Ti and i7-7700K at 4K ultra/high with FSR 2 balanced or quality.
60 fps solid.
Blyat-Vidia gimping its own cards? Who would have thought.
I am having a much better gaming experience with my 1080 Ti compared to some other people with cards twice as fast but only 8 gigs.
I still get 60+ fps in almost all new games at 1440p, no stuttering.
GTX cards run The Last of Us like shit; zero optimization for that GPU generation, like in every game.
^This please Steve, you need to do this!!
I'm actually pretty gobsmacked by how well the 6800 performs here. The 16gb buffer is an absolute life saver and I'm surprised how well it's doing with RT as well considering how questionable it was at launch. One GPU will be usable at higher settings for a few years yet probably and the other is relegated to lower settings and resolution for completely arbitrary reasons. Very frustrating.
To be fair, the 3070 has never competed with the 6800. The 6800 has always been 1 tier above the 3070 from the very beginning (even before this new vram scandal).
The 3070 was built to play with the like of the RX 6700xt and RX 6750xt (which the 3070 beats most of the time).
@@angrysocialjusticewarrior And what about the 3070 Ti? Does it suffer from the same problem as the non-Ti 3070?
@@angrysocialjusticewarrior no no no. Even the $400 digital PS5 console has over 12gb vram dedicated to games. Launching the 3070 with only 8gb vram while knowing devs in the near future would target the consoles as a baseline is treacherous 😢
@@angrysocialjusticewarrior I agree that the 6800 and 3070 weren't direct competitors, but they were only seperated by 70 bucks and that 70 bucks gets you a MASSIVE advantage now. Like an absurd one
What's frustrating is the fact that so many people bought these pieces of (as Steve would say) GAHBAGE just because they were in a green box. Anyone who spent that kind of money on 8GB deserves their fate.
I can't believe in the mental gymnastics that people go through to justify them buying Nvidia when they don't absolutely require it
One of my buddies got a 6700 xt recently and had horrible driver issues and some games performed really poorly in comparison to others, could’ve been his fault but I think amd still needs to work on their drivers more
Well they're better products (albeit more expensive) so that's really the only justification needed.
@@schwalmy8227 if there was a problem with the driver he could just use a older one
It's actually normal behavior. They bought something and they want people to assure them they did the right choice.
It's our human nature. Even though we know we're wrong, we'll continue telling ourselves we're right in hopes that we'll believe it ourselves.
It's just lying to ourselves. And we do it not just with things we buy, but all sorts of things. Even relationships to give an example.
@@schwalmy8227 You're saying that as if Nvidia drivers were perfect.
I had so many issues with Nvidia drivers that for me it's not an argument for going team green. Their only advantage right now is the professional space IMO. For someone who just plays video games, AMD is a clear choice.
Hell, even Intel has a lot to say here
I was in the market for a GPU upgrade at the start of the year and it came down to these two as the primary contenders. I had been a longtime user of Nvidia products, but this time around, I went with the RX 6800, with the VRAM issue being one of the major deciding factors. Watching this video, I'm very thankful that I did.
how she been treating you so far?
@@briefingspoon380 I'm very satisfied with the 6800 and have had no issues with it at all.
@@vor78 Whats type of fps you getting and which games?
You know what would be an interesting follow up, including benchmark results from the 16gb Intel A770 in the titles the 3070 struggled. Just for fun, you could even include data from the Radeon VII for those same titles. That additional data would really drive your point home for owners of 8gb cards.
I’d also like to see a 3060 vs 3070 video to see if it can handle these better as well
hey man even a vega frontier at 16gb would dominate the 3070 lmao 3070 trash
Or a 2080 Ti. Similar GPU performance to the 3070, but more VRAM. Steve mentioned it, but didn't test it this time.
Yeah even going all the way to a 1080ti would be a good shout. I wonder if they even still have a Radeon vii?!😮
Nahh, they sold it for $2500 in the mining boom.
An RTX 4060 with 8gb of VRAM will be pretty much dead on arrival. I can't believe Nvidias social media team is unable to clearly communicate this to the higher ups.
and does it really matter? It will still sell really well
@@Ddofik as well as the 4080 did? 😂
Imagine thinking Nvidia care about the consumer lol
They may give it 16GB in the end actually, just like they did with the 3060. I am almost certain the 3060 was planned as a 6GB card (just like the 2060 was), but they changed their mind at the last moment. No way they planned the 3060 to have more VRAM than the 3080! But the only thing they could do at that point was double the buffer (same number of channels, double the density of the memory chips).
You think the higher ups are not fully informed?
They have NO incentive to offer anything more.
This would be a different story if they were facing the AMD of 8 years ago, that would do anything to gain market share.
As soon as a competitor comes in offering the same performance with triple the VRAM at half the price, and they truly lose market share...
Then next gen will be different.
I'm glad I went with an RX 6700 XT. I remember people telling that rx 6700 xt was no match for rtx 3070 and the amd card actually performed a little bit worse initially but now it is a completely different story thanks to a bigger VRAM buffer and AMD's FineWine technology :)
And in fairness, the 6700 XT has always been better, especially at 1080p and above, due to the VRAM.
My 6900xt reference beats my buddy's 3080ti in most games and synthetic benchmarks. His 80ti will out class my card in some games though!
@@LowMS3 I have been having VRAM problems with my 3080 10GB for some time, mainly with newer games, and after borrowing a friend's 6800 XT just last week I'm shocked how much better so many games looked and played. Even some 2+ year old games look and play much better; older games like GoW at 1440p with the texture pack crash the 3080, but the AMD card runs fine. Anyway, I have just bought a 6800 XT new for nearly half what the 3080 12GB card cost.
The 3070 has enough GPU power to drive 1440p at decent frame rates, but struggles even at 1080p just for lack of VRAM.
What a shame.
Same here, I got an RX 6700 XT for what I play. I wanted more VRAM without breaking the bank, as the games I play can need a lot of VRAM (Transport Fever 2, CityBus Manager, Cities Skylines, etc.). I had an 8GB card and I had unbearable frame times and a lot of lag due to VRAM overflowing into system memory, and I had people saying you need Nvidia for gaming, it's the best, but my 8GB card WAS Nvidia, a GTX 1070 Ti. I'm like, no, I need more VRAM than the 8GB Nvidia was offering in the mid-to-high end. I get what feels best for me, not for others. It's like being back in high school choosing your college and your friends saying come where I'm going. Sorry, getting off topic, haha.
Just to add to this, it was a real kick in the nuts when I bought the 10GB 3080 and then the 12GB variant came out after... I wanted the extra VRAM too; it should have been the first product they released.
They did and it was called 3080 Ti. 3080 was never meant to last longer than 2 years for 1440p ultra.
never buy the first version. while its hip and cool to buy it first. you usually are a test subject just for better to drop right after, be it apple, samsung, monitors, gpus, cpus, whatever. dont get caught up into fomo fam.
@@eugenijusdolgovas9278 ya we are a wasteful throw away society, we think we need to replace things because corporations have trained us that way, so we keep profits nice. we are just modern day serfs.
@@CheGames497 Whilst some are wasteful, I tend to buy GPUs from used market and rock 6800 XT rn, which will be working till I decide to switch to 4k competitive FPS gaming.
AMD have really improved their Radeon cards over the recent years. I used to use Nvidia cards all the time but have been using Radeon for the last 3 years now and love these cards.
How is the driver support for amd cards?
@@SMILECLINIC-2004 It's fine, no worse than Nvidia, which recently had issues as well. They are just slower at releasing new drivers, but that's it. However, I'm not one of those guys who updates their drivers immediately when new ones come out; why would you? It's not like you're going to get much better performance in newer games, you need to buy a faster GPU for that. If it works well, you don't need to touch it.
@@SMILECLINIC-2004 No issues as a former 6700xt and a current 7900xt owner.
I'm using a RX 5700, and I haven't had any driver issues throughout my time with AMD cards (rx 580, vega 56). I'm happy that the RX 5xxx series cards are still getting support and FSR/RSR updates as its allowed me to play most games at 2560x1080 and higher than 60fps. I don't really like the metrics overlay that AMD uses but I can't use Rivatuner without MSI afterburner which messes up my undervolting settings.
It's been said since 2020: go AMD if you want regular features. They have the best regular rasterization frames. But at the low end with these 3070, 3060, 6800 XT, 6700 XT etc. you won't get any good benefit from ray-traced lighting effects, so you need to go higher up the chain. This test obviously proves that.
Would have been interesting to see how the 2080ti with its 11gb vram compared to the 3070.
I wish Steve can do this comparison next. More interesting to see Nvidia vs Nvidia. Almost identical performance, but extra 3GB of Vram.
I do have a 2080 Ti, and TLOU runs great at 4K, DLSS balanced, high settings.
Now that would be interesting to see :D
Add the RTX 3060 with 12GB for shits and giggles.
The 2080 Ti is a beast; unfortunately those 2018 Micron GDDR6 chips are ticking time bombs.
@@deamondeathstone1 Yeah, it would have been great to see how the 12GB 3060 fared in these games compared with the 3070. I imagine a lot smoother gameplay in many instances, especialy with dlss.
I'm glad late last year, when upgrading, I went with a 6800 XT instead of the 3070Ti. They had similar price, but I also thought that 8gb was a bit too low since I had a GTX 1070 which also had 8gb, so I also wanted an upgrade in the VRAM. Thanks for your analysis HW Unboxed!
These two cards are not even in the same league except for RT overall, you made the right choice sir!
Very good choice. The 6800 XT is more in line with the 3080, often beating it at 1440p in alot of titles (Non RT of course)
I think it was HW Unboxed that sold me on my RX6800 due to vram and 1440p performance
@@highmm2696 Thanks, yes I know the 6800 XT is often better than the 3070ti, but here where I live, they were practically the same price. Like Apple, Nvidia has a "tax" with it that they usually are more expensive or drop the price less due to more demand. So AMD usually tends to be cheaper
Thats why I bought the 6800XT instead :)
It happened with 128MB, 256MB, 512MB, 1GB, 2GB and 4GB it was only a matter of time until 8GB joined the 'Not enough' graveyard, I think some are just shocked at how fast it seemed to happen.
And that NVIDIA is selling a 4060Ti with 8GB of VRAM in 2023.
It really didn't though. 2, 3, and 4GB fell in pretty fast succession; it took a while for 8GB to no longer be enough, because the RX 480 had 8GB a long, LONG time ago and it was still fine until a few years ago. And at the time of the 3070's release it was already not enough: Doom Eternal was taking up 11GB, and at 1440p Cyberpunk took up enough VRAM that it had texture issues.
@@drek9k2 Agreed. I think the people that are shocked are the ones who recently bought a 3070 on sale or maybe even a 3080, while the rest of us from the r9 290x 8gb and rx 480 8gb era are not.
@@Bassjunkie_1 Oh true that's a good point, you know I totally forget that a lot of people this may be their first actual video card, rather than, I'm not saying you needed to be from the GT 8800 days just that I keep forgetting many of these kids weren't even around for the RX 580 8gb yet. I guess a lot of people are really, really truly clueless.
Go play the new version of Uncharted on PC. I have no complaints getting a new card with 16GB of VRAM when the game looks like that. Damn, RIP 2070 Super and 8GB cards, you won't be missed.
I must admit, I didn't think 8GB would be an issue this soon. Up until recently I thought that, all things considered, the 3060 Ti was *_the_* card of this GPU generation.
In the end, I bought a 6700 XT. Less so because of the additional VRAM, but mainly because it was cheaper - and now I'm really glad I did.
As soon as you give a dev more ram to work with they will consume it as fast as possible 😂
The rise of VRAM usage is due to consoles. Consoles back then were a hindrance to PC gaming, since companies were still focusing on making their games for the PS4/Xbox at that time. Now that the current-gen consoles are quite good and have 16GB of RAM (although unified), expect most games at ultra to use around 10-12GB, leaving 4GB for the OS. The safe play would be to buy a 12GB VRAM GPU, but you're guaranteed to be safe with a 16GB VRAM GPU unless you want 4K ultra + ray tracing.
@@madtech5153 What about VRAM and teraflops on the XBSS compared to the XBSX?
@@madtech5153 PS5 and Xbox Series X have 16GB of shared memory. It means that 16GB is shared by both VRAM and system RAM. So how come 8GB VRAM plus 16GB DDR RAM has become obsolete?
If you consider texture size, the PS5 might only have 6GB of VRAM and 10GB of system RAM to work with.
Dude, devs are f**king up.
The 6700XT is the card of this generation, not the 3060Ti. It is cheaper, faster, and has more VRAM than the 3060Ti.
You showed very well that everything is not about fps. Most people were looking at the fps results and passing by. We understand again how complicated the user experience is. Congratulations
Been waiting for this video... in 2020 I had the opportunity of getting either a Sapphire Pulse 6800 or EVGA 3070... and I'm so glad I went with the 6800.
Came from a 5700XT to a 6800, worth it!
You must have friends at Nvidia, coz the RTX 3070 released in July 2021, so 1 year 8 months ago, and true availability came months later in late 2021.
@@gaav87 No it didn’t. Maybe the 3070ti released in 2021 but the 3070 def released 2020.. September or October I believe
@@RUclipsTookMyNickname.WhyNot Nice! I had a 5500XT, 5600XT and then a 5700XT before my 6800 lol… I kept wanting more power lmao
yeah you got lucky with all the scalpers and miners they were selling by bulk before arriving
My choice a year ago stood between the RX 6800 and GeForce 3070. I'm happy to say I made the right choice.
This makes me really feel good about my recent upgrade to the 6800XT, hopefully it doing great at 1440p for years to come
On the future side of things, team red has seemed to always had an eye for longevity. Just nabbed a 7900xtx myself after I saw my venerable rx580 8gig was minimum spec, after 5 years of use it was a no brainer to go red again.
@@Doomrider47 Agreed, I never have really dipped into AMD until I got a Ryzen 2700 a while ago. I dont think I will go back to an Nvidia card for the forseeable future unless AMD starts sticking its fists in consumers asses more than normal. Im honestly hoping Intel really brings a big update to its next GPUs, maybe both AMD and Intel putting pressure on Nvidia will knock them down a peg a bit.
@@Doomrider47 how big was the jump? I have a gtx 1070ti (which has similar performance to the rx 580) and I was also planning to buy a 7900 xtx or maybe wait 2 years and buy a 5080.
@@pyronite3323 wait a bit more.Or if u can get a 6800XT for less than 500 bucks thats a steal.
@@pyronite3323 if u can grab the XTX for anywhere close to 700-850$ - Just go for it - its worth it - tbh i have a 3yrs old 2070S 8Gb but i dont regret it for 170$
I'd love to see a 3070 Vs 2080ti comparison since most recommended specs specifically mention a 2080ti instead of a 3070 for a 1440p 60fps Ultra settings even though they're comparable in Rasterization, most likely due to VRAM
That would be the most sensible comparison.
and maybe add the 1080 ti in the mix just for the lulz
Imagine making such a good core architecture and software like DLSS only to be thrown away because of not enough VRAM. It really is the epitome of making an F1 car only to fit tyres from a prius.
Outdated ShadowPlay that doesn't have a capture preview.
I agree with this. The tiers are now:
- 24 GB -- High End
- 16 GB -- Mid Range
- 12 GB -- Entry Level
- 8 GB -- "barebones", sub 200$ cards
This should be the standard in 2023.
I would agree, but this also highlights the laziness in game development now. There are still more complex games that look better than games like The Last of Us while using half the VRAM. Devs need to take responsibility as well, not just GPU manufacturers.
I'm sure that makes people who bought 10GB 3080's feel good about their purchasing decision ;-)
@@cairnex4473 lol, that card was always a trap; I doubt anyone that bought it feels good about it now. I went for the 3080 Ti and the extra VRAM has been a godsend. Still, I think devs need to seriously re-evaluate their priorities when making games; if a game looks amazing but plays like absolute garbage, it's a bad game.
@@FouTarnished Not every developer became lazy; it's just that technology has advanced a lot. I'm not including bad PC ports like The Last of Us or beta-state releases like LOTR: Gollum. What I mean is games like Resident Evil 4 Remake: Capcom really improved the graphics, and we can't blame a game like that for increased VRAM usage.
Agree. Me playing with 1060 3gb ;-)
Nvidia planned this, knowing what was coming since they had insight from working with upcoming devs. All their products had gimped VRAM amounts so they could justify everyone buying new cards sooner.
The worst part is the customers who got burned by Nvidias bs will just go "oh i guess my GPU is getting old" and will buy a newer Nvidia one without a second thought.
Planned obsolesence doing it's work
@@DebasedAnon I have a 3070 and my next card will be an AMD card.
I have enough of Nvidias increasingly annoying nonsense for good and AMD cards are now more than a viable alternative.
@@Thysamithan I'd say AMD outright won the previous generation before the entire GPU market went up in flames, if they stayed at around MSRP Nvidia was trounced across the board.
If people cared a tiny bit more the market wouldn't be as bad as it is now, it took the absolute farce that is the 4080 at 1200$ + a recession to make people NOT buy Nvidias products..
With that being said AMD is being way too passive for my tastes, they could afford to go for the jugular right now due to having vastly cheaper costs but they're content to undercut Nvidia by small amounts (relative to the price).
@@DebasedAnon My issue is that, I know Nvidia are screwing me over but I'm split between gaming and doing AI projects on the side in my free time, so I have no other choice but Nvidia since the workflows rely on CUDA. If AMD bothered to push Rocm and make it as viable as CUDA, and make FSR better than DLSS, I wouldn't even consider an Nvidia card.
Its funny how NVidia is pushing ray tracing, yet when you turn it on thats when their card drops to its knees because of VRAM, despite probably having superior ray tracing accelerators.
It's also bewildering how people blame developers and poor optimization, justified or not, for low performance on 8GB cards, yet there doesn't seem to be much outrage about Nvidia pushing RT so heavily to sell 20-series cards back when support for it was practically non-existent and enabling it now on virtually any card from that generation will bring it to its knees. Is that also poor "optimization" on the part of developers or Nvidia overpromising and underdelivering?
It's not even "probably", the RT cores make a huge difference and the Nvidia cards should be vastly better at it, so the lack of VRAM is without a doubt a big issue.
@@chronicalcultivation Don't exaggerate the difference RT cores make. Sucking less does not mean RT cores don't suck. And that's only usual limited RT.
The RTX 4090 still only manages 17 FPS on Cyberpunk 2077 RT Overdrive mode.
@@sammiller6631 as someone who owns cards from both AMD and Nvidia, and also a PS5, no, it is not an exaggeration. The performance hit taken on an RDNA2 card or console for enabling RT is significantly more noticeable vs an Ampere GPU.
It's not totally unusable on AMD, but it's a lot worse.
I was really hoping the 7000 series would catch up and be equal to Ada in that department, but they seem to have only reached Ampere level RT instead.
I've always been more of a Radeon guy for almost 20 years, so the last few generations have been a bit disappointing..
It was better when it was still ATi
It's funny how, despite this, there were enough people stupid enough to buy them.
I saw the 8GBs of VRAM being in issue back in 2020 because Shadow of War (2017 game) requires more than 8GBs of VRAM if you wanted to use the highest texture pack with all the settings at max. That's why I ended up going with the 6800XT.
Aging like fine wine
@Steven Turner Shadow of Mordor wasn't a grindfest
Far Cry 5, too. I’ve seen it use almost 10GB when you supersample it
I had 6700XT with 12GB and some games could fill 9-11GB easily on the highest preset in 1080p!!! 8GB is not enough...now I have also 6800XT and there it is normal that games use almost all the 16GB at 4K...
Modded XCOM 2 used almost 8GB on my 1070 at 1440p in 2016. I upgraded this year and anything less than 12GB was a non-option to me, which meant NVIDIA auto-excluded itself.
This reminds me of the issues we used to have [way back] when we'd run out of system RAM and Windows started using it's swap file causing massive slow-downs. The main difference being we can upgrade system RAM. If only we could do the same with GPUs.
Maybe I should relive those times by running 4GB today
@@ArtisChronicles and then you can fix it by using an operating system that doesn't suck
@Moonstomper68 Swapping out RAM works much better on a motherboard because the motherboard PCB isn't completely covered with heatsinks blocking off the memory chips. There is not much need to upgrade VRAM when the GPU's performance is already fixed; it's much cheaper to just put on enough VRAM so it won't run out in situations where the card's raw performance could still hold up.
@@archthearchvile And what OS would that be Hmmmm?
You cannot do upgradable VRAM because adding a socketed interface adds more traces and latency. Considering how fast GPU memory is these days, you need every bit of performance you can muster. The only way to do that is to have the VRAM as close to the GPU as possible, for the shortest traces between them. Shorter traces = lower latency = faster speeds. As much as I myself would love to see socketed memory upgrades for GPUs, it would sadly be more of a hindrance than a benefit.
Once I had VRAM issues, I ditched my 3080 12GB for a 7900xtx 24GB what an amazing card, it just lets me max out anything at 4K with loads of VRAM remaining. Am on the Team Red bandwagon for now.
EDIT: When I mention maxing out everything, know that I do not use Ray Tracing if you want good RT performance go with the newer NVIDIA cards.
Can it play 4k 120 FPS with maxed out settings?
@@alexts4920 For which game are you asking??
I go green, 4090 > everything
@@AdiiS 4090 is too expensive man
@@alexts4920 not exactly
I couldn't be happier getting my 6800 last year for 480$. Apart from the odd driver issue (nothing serious) I'm a happy AMD customer so far. Nvidia became too rich for my taste with the 3000 series. Now both AMD & Nvidia are luxury goods with the newer gen products.
I tried an AMD CPU, and after how shit it was, how it didn't work, and how I couldn't even update the drivers, I'm just never trying AMD again.
@@valtonen77 Something's terribly wrong with your basic tech/soft skills.
Nvidia with their 4000 series GPU: you're too poor to buy our products so we don't need you anymore
@@PotatMasterRace ?? There's nothing to do, it's just plug and play. Dunno why he had a problem, but if Linus had no problem, it more likely applies to the general population as well (yes, I'm dissing Linus).
What scares me off buying AMD are all of these driver issues I read about. For example, they had extremely high idle power usage for a really long time; I don't even know if it's fixed now. And then there's FSR, which is just not nearly as good as DLSS, ray tracing, etc.
I waffled between the 3070 and 6800 back in 2021, and probably would have bought a 3070 if I could have gotten it anywhere near MSRP. Instead, I bought a $900 Gigabyte RX6800 Aorus Master. While I wasn't thrilled with the price, I'm glad to hear I made the right choice. HUB's concerns about VRAM were a major deciding factor for me. Thank you, HUB!
Best Buy is still selling that for a cool $1,400+ 😂😂😂
@@TBRANN oof! Newegg still has it listed for $1,000. I probably bought the most expensive RX6800 out there 😂.
@jacobvriesema6633 hell $900 is a steal apparently 😳
I feel your pain at that price too... I had to buy a 6700 XT for 700€ in that same period... If only my GPU hadn't died on me back then, I'd gladly get a 6800 XT for 650€ right now...
i can buy a 6950xt for 650€ atm lol
Literally just got my 6800 a week ago and it's been blowing my mind how powerful it is, even with ray tracing.
I’ve had one for a year now and I honestly haven’t run into a situation where I’m concerned about performance. It’s a really good card!
Interesting, GTX 1080 user here looking to upgrade! The top card my 9700K can handle is an RTX 3070 or a 6800. The 16GB of VRAM is definitely pulling me to team red for sure.
@@UKGBManny For the money right now it's one of the best options. Its closest competition is over $100 more. If you can spend another $100 you can get the XT, whose biggest competition is WAY more expensive than that. Unless you can get an AMD 79xx or NVidia 3080, the rest of either lineup that might compare is double to quadruple the price. I paid $370 for mine and am still amazed at how much it can handle.
Going to get my 6800 for $350 tomorrow and selling my 6700 XT; looking forward to getting that thing after seeing all those videos.
The 6700 XT, the 6800, and the 2080 Ti are the only cards you should be looking at in the $350-450 range. Maybe sometime in the future we can add the A770 to that list.
@Cm Not to mention the problems that arise with early gddr6. If your 20 series card has a certain type of early memory it could fail at any point now!
@Cm You've been able to find the 6700 XT new for €399 for weeks in Germany. The price bump to the 6750 XT is not worth it.
Example: Mindfactory currently has the Sapphire Pulse 6700 XT for €383.05.
Steve revisited the A770 after a major driver update, and if I remember correctly the A770 and 6700 XT were around on par.
My Aorus 6800 Master still runs anything I throw at it. As long as my online multiplayer games can pull 144fps with 5% lows no lower than 120 on the 1%, I'm happy. As for offline games like Resident Evil or Atomic, still all good. Ray tracing at mid, though.
@Cm €348 second hand here in Denmark, or a bit cheaper I've seen.
I had a 3070 for a little while, it was a fine card. Ended up with a Radeon 6800XT not long afterwards and sold the 3070. I don't regret it.
I’m curious what the value of a 3070 will be in a couple more years as this continues to happen.
@@Vis117 it'll sell very poorly I think. The market will be flooded with them in due time.
I went from 3060ti to 6800xt, happy about the decision especially since I play call of duty
Smart guy
i did exactly the same thing
I'm impressed by the RT performance of the 6800.
My 16GB RX6800 bought two years ago at $579 MSRP aged really well.
That's what I paid for my Gigabyte Gaming OC 6800XT a few months ago. Bout $585 after shipping.
bro i got mine for only 430 euros from xfx
lol I'm buying mine for 300 rn
Got a 3070 for 600 euros in 2020 (GPUs cost more here than in the US). This week I found a one-year-old 6800 XT for 360 euros on the used market and was able to sell my 3070 for about 300. Can't wait to enjoy the next few years with this beast of a card, it's still so powerful at 1440p!
@@Karimftw0.0 Yeah, but you got that performance 4 years later, while I've been using my RX 6900 XT for 4 years now. Getting that performance half a decade later isn't as big of a flex when we'll just move on to something with 2x the performance.
If the rumored 8gb vram on the 4060 and 4060ti turn out to be true, I am very much looking forward to seeing HUB's review as there is no way any of those products will be priced accordingly.
The RTX 4060 will be bought by miners... cryptocurrency...
It's joever
@@hugobalbino2041 Do you live under a rock? Crypto is dead. Nvidia even came out and said that cryptocurrency provides no value to society. Lmao.
@@josh2482 8 GB cards are fine for crypto... not so much for gamers... even 12GB cards... for me, I don't pay electric bills because of my solar power... so yeah I don't live under a rock... and there are still GPU miners out there...
The sad part is that yes, the 4060(ti) will get scathing reviews for only having 8GB vram, and yes, they'll sure as hell be overpriced for that - but people will still buy them.
And, from what it looks like, AMD won't go ahead and do the smartest thing they could do and launch both the 7800XT *and* the 7700XT with 4 MCDs and 16GB. It's such a pity, and I really hope that they reconsider. Already having chiplet design for RDNA3 implemented would certainly give them the chance, unless the binning for the GCD also includes defects in the fabric interconnect to the MCDs.
Far out, I was tossing up between a 3070 Ti and a 6800 XT late last year and I'm really glad I went with the Radeon GPU now! Great work as always Steve.
I had the same dilemma earlier this year and went with the 6800xt. Very happy with my choice
no DLSS...
@@Magaswamy716 fsr is pretty decent
Unfortunately people still buy an RTX 3070 Ti over the RX 6950 XT in some countries where prices are similar 😢😮
@@Magaswamy716 man you guys never give up........
That video perfectly reflects what it feels like to buy a GPU with 8GB some time ago and suffer with it now. I would be glad to see a comparison between RTX 3060Ti vs RX 6700XT, as these are also direct competitors. Good job!
They are not, though. The RX 6700 XT crushes the 3060 Ti everywhere except ray tracing.
I've been gaming on a 6700 XT for 1.5 years now, at all-ultra settings, 1440p (with minor tweaks here and there). If I'd EVER experience ANY stutters or something I don't like, I'd instantly switch to a better card. But I don't have to, because EVERYTHING TODAY RUNS BUTTER SMOOTH!
Are you really suffering? Just drop texture quality one step and this won't even affect you. But if you swapped the situation around and high texture quality didn't exist, everyone would blast AMD for wasting money on RAM they don't need. Funny how that works.
Similar situation here. On the other hand I tend to not use raytracing and use a mix of low/medium settings so I don't think 8GB VRAM matters too much for my use cases. The testing shows 3070 is kinda bad but my 3070 will do just fine until it dies I'd say.
I'm okay with my 5700 XT 8GB; its performance ain't nothing to write home about to begin with lmao
Some people are missing the point in the comments. Limited VRAM is planned obsolescence; most people don't upgrade their GPU often. It intentionally limits high-end graphics cards from their full potential.
As an owner of both gpus i approve of this video
I noticed it in CoD, Forza and other games with the 3070
Any questions feel free to ask 🙏
Does OC the memory help to mitigate the issue?
@@murdergang420 Have you noticed who sponsors the vast majority of these problem games, though? I mean seriously, I feel like I'm the only one who notices this coincidence. Every game in this video with serious problems is an AMD-sponsored title, which means they worked closely with AMD when developing the game... I'm sure it's nothing though, right? No way AMD would be turning some knobs here to make games run badly on certain Nvidia cards while taking advantage of the one huge advantage AMD has... nahhh, couldn't possibly be, it's just Nvidia BAD and we need to remember that.
It's not like anything like this has happened before, with Nvidia tessellation crippling the opposition's cards. Honestly I'm very surprised outlets like HWU don't at least acknowledge this and question, even a little, whether something underhanded and fishy is going on here when every game with a problem has AMD's hands on it. Always those games that come out with terrible RT as well; just a coincidence, I'm sure they have no say at all lol.
@@jinx20001 So do you have the same issues with video editing software like Adobe running better on Nvidia hardware? Because it's been that way for the longest time, and I've never seen conspiracy theories about that performance. You should be more upset with Nvidia for putting out a crippled card with 8 GB of VRAM. If they had at least put out a competent product, it would be easier to entertain accusations of bad optimization.
That said, when AMD got the console contracts from Sony and MS, it was predicted that game optimizations might start to favor AMD.
@@murdergang420 For some games, have you experienced stutter due to shader caching with the 6800? Destiny, for example, experiences heavy stutter for a bit and then over time gets better. I just find it annoying that this happens with every GPU driver update. It makes me not want to update the driver.
It's funny to me that you got comments complaining about your VRAM criticisms. Part of the reason I went with the 6800XT (aside from the fact I was able to get one for not much more than a 3070, and I would have been able to get a 6800 for a bit less than a 3070 if I'd gone that way) was because of your VRAM comments.
I am, therefore, very grateful for your VRAM criticisms personally.
A lot of people who criticize HUB's VRAM criticism say that speed is all that matters and size doesn't. It was already quite clear back in The Last of Us Part 1 benchmark that speed doesn't help when there isn't enough VRAM.
The 3070 Ti with 8GB is faster than the 6800, but he chose to compare two different performance tiers, and keeps riding on bottlenecking issues from trying to run ultra settings in badly optimized titles on a years-old midrange card.
I actually encountered VRAM problems using my old RX 580 8GB. I sometimes needed to close the browser or kill dwm.exe so a game would run smoothly. That was in 2021, when I decided to upgrade my graphics card. And then I saw the 3080... 10GB. I was like, "What the hell? How on earth is 10GB enough in 2021?" I ended up going RX 6900 because Nvidia cheaped out on VRAM for the whole 30 series.
Nvidia plays a good game here.
I managed to get an RX 6800 for RRP on launch day, very savvy purchase in hindsight 😀
Keep up the good work HW UB
Glad I sold the 3070 two months after purchasing it. I was very disappointed with it and bought a 6800XT - still in use and happy 😅
@@siwexwot8994 yes. That's what we saw in the video. Dunno why you mention it..? 🤔
@@siwexwot8994 you can still use it in combination with DLSS...
@@siwexwot8994 and that's the point. The 3070 is only a 1080p card today due to its lack of memory... The takeaway is: the 6800XT was and still is the better buy. 😉👋🏼
@@siwexwot8994 Yeah, you may be right, but lowering the texture quality in 2023 is not future-proof. I'm still glad that I sold my 3070 two months after the purchase and got a 6800XT in my hands. FSR 2.x and 3.x will keep it usable for longer than the 3070 will ever be. That's my personal opinion 😉
Makes a whole video to say I TOLD YOU SO! No, but seriously, great work. I bought a 6800 2.5 years ago based on your original videos!
The crazy thing about the 3070 is that many people will upgrade to a 4070/4070 Ti, and that card will face exactly the same issues SOON.
great, now we just gotta spend 1k dollars on a card and be done with it, right? What a joke
People who spent €900 on the 4070 Ti shouldn't sleep well; it's absolutely not a future-proof card despite the abusive price.
Even at close to 1k, you can still get a card that's not future-proof.
Nvidia is literally abusing its fanatics.
@@Ilestun It's(4070 Ti) already seeing stuttering at 4k in some games with RT. It's not even present proof, lol. Let alone "future proof".
@@Ilestun True, but the 7900 XT already can't hit 60fps in Cyberpunk maxed out at 1440p because of slow RT performance. A 4070 Ti can exceed 60fps in just about any game currently available.
I think both cards will eventually run into issues for different reasons this gen. Nvidia was greedy, shortchanging gamers by 4GB of VRAM, while the 7900 XT is just a more powerful GPU in rasterized performance but still dragging its feet with RT. AMD had better come up with a viable frame-generation alternative soon as well, along with drivers. Otherwise the rasterized performance advantage won't hold either in upcoming games.
@@modernlogix How many times have I told people to avoid upgrading to a 4070 Ti because of the VRAM shortage... but you just cannot stop them from spending their money badly, I guess.
This proves that what Nvidia did, giving only 8gb on their midrange was smart, for their business lol
they are geniuses indeed.
Not really, in the current market, everyone will be pivoting to AMD for cheaper, yet better cards. It was actually a disastrous move if you ask me.
@@amineabdz I hope so, but I'll believe it when I see it. AMD cards have been great since RDNA 2, and growth has been slow. That mindshare is incredible.
@@amineabdz I'll believe that the moment the Steam hardware survey shows AMD (discrete) in the top 10.
@@amineabdz Probably not. AMD and Nvidia are just like Apple and the Android brands. It doesn't matter if the product is better and cheaper; the brand name matters most, and we already saw that with the iPhone SE 2/3.
Still rocking my 1070 I bought in 2016. Absolutely love this card; it's been enough to run recent games, granted at medium quality, but they still run and I can play in good conditions. I'm in no hurry to upgrade my GPU and I'm not a compulsive buyer, and thank god for that, seeing the state the GPU market is in. Hardware Unboxed is one of the channels whose advice I'll be sure to follow when deciding which GPU to upgrade to.
Whatever makes you happy, that's the important thing in life. This man has his head screwed on straight!
I agree the 1000 series may have cost more than previous generations at first, but boy did they have longevity. I picked up a GTX 1070 Ti at the beginning of 2022, and man do I wish I had gotten it sooner! Nvidia, and now AMD, have lost their way, at least when it comes to the gaming side of things. More so Nvidia.
Same here, EVGA 1070 FTW, an absolute diamond of a card. Lasted longer than any GPU I've ever bought, lasted through COVID and crypto, and plays almost everything I care to throw at it in near-silence. Just replaced it with a cheap second-hand 6800XT to last me until the next 'good' GPU release cycle whenever that will be, but that's only because I'm building an entire new system from scratch.
@@cameronblazowich2671 if we take AMDs perspective into consideration, then the only way they continue to increase profit margins is to charge their 5 customers even more money for their products.
1070 is absolute beast of a card
You should make a video comparing the RTX 3080 (12GB VRAM) and the RX 6800XT. I want to see if 12GB VRAM is sufficient.
I also want to see this comparison.
12GB is fine for now.
But I'm sure in 2 years the 3080 12GB and the 4070Ti will suffer from the exact same problems the 3070 has now.
I have run out of VRAM in one game with my 3080: Ghost Recon Breakpoint, with everything cranked to ultra at 3440x1440. It will run out of VRAM and crash after about half an hour to an hour of gameplay, at least for me, and I'll get the error that it's out of VRAM. So far it's the only game I've had that issue in.
@@toothlessarcher Do you have the RTX 3080 with 12 GB VRAM or the one with 10 GB VRAM?
@@Unknown-ve6vp 10GB
I paid a shit load of money for my 6800 XT during the shortage (twice of what it is now). Still extremely happy with it and still glad I did. Happy to see it still performs so well today.
Same here. Felt lucky at the time to ONLY pay $900 for an AIB 6800 in 2020. I think I got my money's worth.
Same. Actually, at that time in my country 6800XT and 6900XT were roughly the same on price. I ended up getting the 69 one. Glad 68 users are still in good shape
Same here :)
Also got the 6800xt, at launch price. 8GB was a big no no. It was very obvious that it was too little.
Just like the GTX 580, which came in 1.5GB and 3GB variants. The 3GB lasted years longer. Same with the 3/6GB 1060.
Great video Steve, thanks. Another thing to consider is that calling these cards "aging"... for a lot of people, these cards have only recently become available to buy, due to shortages and pricing. And they are currently still the newest 'midrange' cards available! So these are actually current gen cards struggling!
Makes me wonder where the "value" is...
Well well well, the anti-AMD man himself has no choice but to speak well of AMD. Even in ray tracing the 6800 slaps the 3070 back to the beach the silicon was made from. 😂
Thanks for the revisit, Steve! As I said over 2 years ago: the RX 6800 was probably the best graphics card released in recent years. It was faster, more efficient and more future-proof than the 3070 from the start, and yet the 8GB VRAM cripple had to be hyped because it can do the oh-so-great DLSS and has much better ray tracing performance. Good job, press and influencers.
Yes, but in most places in Europe it was out of stock, or when it was in stock it was more expensive than a 3080 10GB.
They can only review what they know, whether games will end up using more VRAM imminently is speculation and the reality is VRAM usage of games *in general* has been pretty stagnant for years. (Largely due to consoles, imo) I agree that Nvidia shouldn't be skimping on memory size or memory bus, but you can't really blame press for reviewing the cards they got at the time they got them with the information they had.
yes, the efficiency in every metric is incredible, but that got it very close to the XT version in real world prices
Comparing the 3070 to the 6800 while skipping the 6700 XT is like comparing the 6800 XT to the 3090 while skipping the 3080... depends whether you're a red or green team fanboy. Why not compare the 3070 to what AMD was promising as its direct competitor at the time, the 12GB 6700 XT? Both of those companies crapped on gamers equally in recent years.
The whole argument about choosing Nvidia for ray tracing in the midrange is officially debunked here. 😂
I dunno why people keep funding scammy practices and crazy prices by nvidia
I'm not an Nvidia homer. But I did get the 3090 and 4090, and I can confirm ray tracing still sucks. It's simply not worth it yet and won't be for 5-7 years minimum.
This is one of the reasons I went from a 3070 to a 6900 XT: the extra VRAM. Also, shockingly, I spent no additional money on the 6900 XT! The performance has been great for the last 6 months and I'm quite pleased with the AMD card. Also, knowing that AMD cards age much better than Nvidia cards makes me think maybe I should stick with AMD from now on.
How are the drivers working for you? I'm very tempted to go red, but I don't know if AMD has resolved most of their driver issues.
@@ChiplockRock To my surprise, I have yet to encounter any problem with the drivers; every game that I have played and tested worked great. I don't know about productivity applications though, as I only use Photoshop and Premiere Pro; anything else is CPU-related.
@@ChiplockRock I've had a good number of Radeon cards... most of the driver issues I've had were when I tried obscure stuff like connecting multiple monitors of odd sizes, or when Windows automatic updates borked the drivers. My Radeon VII is also picky about which driver it wants to work with, and my X570 Crosshair VIII board does not like my Radeon VII.
AMD has always had a shaky history with drivers... and things like taking months to get RDNA 1 drivers right have only hurt them.
I bought a 7900 XTX Red Devil on launch day and no problems so far. My only complaint is there's a 25°C difference between hotspot and regular GPU temps. Otherwise the card doesn't get too hot.
@@ChiplockRock Just got a 7900 XTX (new build, from a 2080 Ti) and have no issues; also the AMD software is way better than Nvidia's. I think the driver thing isn't really as applicable nowadays as it was in the past.
What size PSU do you have? And what CPU do you have paired up with it? I've been looking at maybe going a different route from my 3070.
You are accurate as always. 8GB is now budget tier. 16GB is mid-tier and 24GB for top-end. When Nvidia launched (and then quickly unlaunched) the 4080 12GB it was such an obvious mistake. Even 16GB gives me pause because 4K gaming and SS needs more VRAM than ever.
especially with raytracing too
Like, really, I don't know how people trust them while they keep shamelessly cheaping out and setting higher prices.
I think 10-12 gb is mid tier. 16 and above is still pretty high end
16GB may not technically be mid-tier considering the current generation of games. I would say 8GB is entry-level like 4GB was back then, 10GB somewhat standard like 6GB was, and 12GB mainstream; 16GB should be standard for cards meant to run 4K, unlike the 3080 and 3080 Ti with their 10-12GB.
I just recently switched from team green (GTX 1070 8GB) to team red (RX 6900 XT 16GB) and I gotta say, that extra VRAM is nice. The Resident Evil 4 remake uses a whopping 13.8GB of VRAM at 1440p if you set everything to max (with ray tracing and hair), so 16GB does seem like the way to go moving forward. Which isn't shocking, considering there was a time when we were jumping to double, triple, or quadruple the then-acceptable VRAM amounts back in 2007-2010. I remember having a 128MB card and moving to 768MB, having a 1.5GB card and moving to 4GB, 4GB to 8GB, etc. As the years go by, we need more VRAM; it's that simple.
I have the 3070 Ti and this video is 100% accurate. At 1080p the lack of VRAM was constantly producing a stutter-ridden mess. At 3440x1440 the card can't handle any game without frametime spikes and stutters. Turning my character around in games like No Man's Sky, Cyberpunk and Horizon Zero Dawn causes the gameplay to pause and detail/objects to pop in and out.
I've been on the fence about this, but this video has convinced me to go with the 7900 xtx. With its vram and raster performance, I think that card will age the best.
This... isn't VRAM related. No Mans sky isn't filling the 8GB of VRAM, neither is Horizon Zero Dawn. Your stutters aren't VRAM caused in these titles because they don't fill the buffer.
Seriously, go look up comparisons of your card to the 6700 or 6800 and look at the 0.1% or 1% lows in those two titles; they are pretty close, which means stutters aren't happening as a general rule for people with 8GB of VRAM vs 16.
@@tonymorris4335 My VRAM sits at 7892MBs used in Horizon, and similar in the other games. I've used both HWinfo and Afterburner to monitor, as well as Nvidia's own monitoring tool to confirm.
And even if it wasn't the VRAM, then the stutters are caused by what? Poor gpu performance? Poor driver performance? Neither one makes me feel better.
100% correct. My old 3070 was shit compared to my 7900 XTX, and the only thing I don't like is the heat, with everything else being vastly superior.
@@ALmaN11223344 Same here on both cards. Received my 7900xtx 5 days ago and it's amazing.
@@tonymorris4335 You are so wrong.
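For anyone wanting to settle this kind of argument for themselves, the "7892MB used" figures quoted above can be sanity-checked by querying `nvidia-smi` directly. A minimal sketch, assuming the standard `nvidia-smi` CLI is available; the helper function names here are my own, not part of any tool:

```python
def parse_nvidia_smi_memory(csv_line: str) -> tuple[int, int]:
    """Parse one line of `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader` output, e.g. "7892 MiB, 8192 MiB"."""
    used_str, total_str = csv_line.split(",")
    used = int(used_str.strip().split()[0])    # "7892 MiB" -> 7892
    total = int(total_str.strip().split()[0])  # "8192 MiB" -> 8192
    return used, total


def vram_nearly_full(csv_line: str, threshold: float = 0.95) -> bool:
    """True when used/total is at or above the threshold, i.e. the card is
    likely close to spilling over into system RAM."""
    used, total = parse_nvidia_smi_memory(csv_line)
    return used / total >= threshold


# On a real system you would feed it live data, e.g.:
#   import subprocess
#   line = subprocess.check_output(
#       ["nvidia-smi", "--query-gpu=memory.used,memory.total",
#        "--format=csv,noheader"], text=True).strip()
# The 7892 MiB figure quoted above on an 8GB card flags as nearly full:
print(vram_nearly_full("7892 MiB, 8192 MiB"))  # True
```

Caveat: like Afterburner and HWiNFO, this reports allocated VRAM, which is not always the same as what the game strictly needs, so a near-full buffer is a hint rather than proof of a VRAM bottleneck.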
Nothing quite like some fine wine 🍷😂
Great job Steve! Id love to see this same type of comparison with the 3080 10GB & 12GB against the 6800XT.
Exactly! This would be a nice follow up video, probably including 4K resolutions.
only difference is that the 3080 price wise is more comparable to a 6950 xt
So glad I got the 12GB version when prices came down. Getting the 3070 would have caused buyer's remorse.
The 6800 XT also has 16GB, the cheaper 6700 XT and 6750 XT came with 12GB, they were more comparable to 3060 (Ti).
Wish got granted.
I'm coming back to this video 2 weeks later to say thank you; you single-handedly changed the consensus on the minimum VRAM requirement for 2023 onwards.
Would be awesome to compare the 3080 10GB to a 6800XT 16GB in these games.
They are pretty closely matched in rasterisation performance, I wonder if the AMD card pulls away massively with more RAM.
I would also like to see this comparison!
Not with VRAM though; as a 3080 and 3090 user, I can tell you that 10GB isn't even that much of a difference from 8GB of VRAM, since games that need more than 8GB usually just go to 12-14GB.
6800 XT is closer to 3080 ti/3090 performance in rasterization in newer games, outside of 1 or 2 random titles. Sometimes exceeds that.
The 3080 would get shit on.
@Kevin P. Lmao NO, wtf are you watching to say bullshit like this? Just check the recent game benchmarks from HUB; the 3080 10GB still beats the 6800 XT in EVERY GAME, even at 4K. The latest ones are TLOU, Hogwarts and Spider-Man; in fact the 3080 10GB even beats the 6900 XT and 6950 XT in Hogwarts and Spider-Man respectively at 4K ultra, and the 6800 XT by like 10-15%. And if you check other reviewers, the 3080 is also better in other new games like Dead Space or A Plague Tale. The only recent game where the 6800 XT could outperform the 3080 10GB was CoD MW2, and everyone knows that game is just an outlier that heavily favors AMD. Imagine saying "6800 XT is on 3090 level" while in the very recent TLOU video the 3090 was faster by like 30% at 4K. Stop coping.
The first 6 games tested used 12-14GB of VRAM, so the 3080 10GB is still going to struggle
Expanding this comparison with an rtx 3080 10GB could be very interesting, thanks for the great work, steve!
Also the 12 GB RX6750XT for those without the bigger bucks
I have both a 3080 and a 6700 XT.
I can tell you that 10GB is not enough, and it's also a really weird amount, because of the way texture packs work in most games: you either have settings that require 8GB or 12GB.
Having 10GB just gives more headroom for 8GB textures while still being below the requirement for 12GB, so it's very weird.
The 6700 XT can handle higher texture settings in games I play now, like Resident Evil 4 and The Last of Us. The 3080 is stronger, but the 10GB means you have to use lower texture settings anyway, so it's like, wtf... you play without ultra settings, what's the point then?
I would recommend at least 12GB of VRAM from now on, and I mean really a minimum of 12GB, because I think even that is a risk. We will most likely need 16GB of VRAM soon for new games coming out this year and especially in 2024.
or rtx 2080ti
The 3070 Ti would have been the same performance tier as the 6800.
The 3080 is matching the 6800 XT.
@@BitZapple this isn't about performance, it is about Vram.
I'd be curious to see how the 12 GB 3060 stacks up against the 3070 in those newer AAA titles.
Probably a bit slow, but not choking on Vram. The 3060 is a 60+ fps experience whereas the 3070 was intended as HFR capable.
The 3070 is likely faster if it is not choking on VRAM.
Likely only slightly worse rather than totally teeth-beaten as it should be
The 3060 on average would be slower, but in unoptimized games it will have much better 1% low frame rates.
I mean, it would still be slower rasterization-wise, but at least the frametimes are far, far more stable
and it wouldn't have that weird voodoo texture-flipping magic that the 3070 does on Hogwarts Legacy
@@peterpan408 I'm not sure, because the 12GiB version is ~25% SLOWER than its 8GiB Ti variant!
Meanwhile the 3070 is nearly ~50% faster when VRAM is not an issue!
I very much doubt you would get 60fps at ultra settings with this "turd" of a card, despite it having 12GiB of RAM.
lol. Back in 2021 the saying was "If you just wanna use rasterization, buy AMD, if you wanna use RT, buy nvidia". Funny how that turned out.
For me it was never a question, since nvidia doesn't give us Linux-nerds open source drivers.
Great video Hardware Unboxed.
As a 3070 owner, I feel ripped off; I spent a decent amount on this card thinking it'd be good for years to come. I'm extremely frustrated now, since in most new games, even at 1080p, I'm running out of VRAM. Selling the used GPU is going to get me nothing in terms of price, and the new RTX cards are outrageous; frankly, I'll never buy an Nvidia product again. The 6900 XT is available and cheaper now, but I'm not sure I want to buy a card that's already last gen, even if it's better than the 3070. Meanwhile, the 7900 XTs and such seem quite expensive.
I'm torn now between waiting it out with that absolute hot garbage product or emptying my wallet into an AMD GPU. Or waiting until next-gen AMD. I have no idea.
Nvidia is terrible and extremely anti-consumer. My next GPU will be AMD, and I cannot wait to be able to play games normally on a card that wasn't made to be obsolete in just a few years.
Totally agree. I was shocked to see these issues at 1080p; crazy, I feel for you bro. At this point I'm done with Nvidia.
Give the Intel A770 a try then. It's cheap, has 16GB of VRAM, supports ray tracing, and Intel has been making a lot of improvements to their drivers. Buy it from somewhere with a good GPU return policy in case it doesn't work out for you.
I bet you could sell a 3070 and buy a 6800 or 6800XT for a minimal difference. We will also have 7800XT and 7700XT sometime this year.
I had a lot of driver issues on the RX 6700 XT, especially with old games, where it performs almost like my old GTX 1060; in new games everything was OK. Looking at internet forums, a lot of people have the same issues. I'm going back to Nvidia. AMD drivers are not good.
7800 XT sounds promising!
I remember feeling a bit burnt when my 1GB 560ti couldn't play PS4 games once the generation swapped over. Funny to see this happening all over again.
Nvidia has *always* been stingy with VRAM. Even back in the day it was easy to get Radeon GPUs with over double the VRAM for the same price. In 2015 the GTX 960 only had 2GB, and the 970 famously only had 3.5GB of full speed VRAM!
Lets not forget about 1030 with DDR4. VRAM so crap it shouldn't even be called 1030.
The question is, did you learn from the 560 Ti or did you bend over for Jensen again?
Same! I remember asking on a forum whether I should get the 560 Ti 1GB or 2GB, and everyone recommended the 1GB. I mean, people took you for an idiot if you dared to recommend the 2GB version. But within just 1 year, Battlefield 3 came out and destroyed that 1GB of VRAM. I was so mad. Yet 12 years later, history has repeated itself.
You Western clowns are so exhausting with this eternal mention of the 970's 3.5 GB, as if every one of you just learned about it yesterday and is rushing to tell everyone else who "doesn't know."
Goes all the way back to the Geforce 3/4 days.
Another reason to buy AMD: more VRAM at every price point since the beginning.
It's been keeping the R9 390 and RX 480 relevant with their 8GB of VRAM, and the 12/16GB on the new stuff should be standard.
They don't offer more VRAM at every price point. All owners of a 6650XT and lower have the same "problems" as 3070 or 3060ti owners
To say more VRAM at every price point is a bit of a slap in the face to someone buying an RX 6600 XT that only comes with 8 gigs of VRAM.
@@maxpower18 exactly lol this guy is clearly on crack.
@@maxpower18 Not true, as the 6650 XT is WAY cheaper than both the 3070 and 3060 Ti. The 6800 is at those cards' price point and has double the VRAM. Stop being an Nvidia simp.
@@maxpower18 just play 1080
I'm very happy using a 6800 with a Ryzen 7600 as my secondary gaming PC. Never owned an AMD CPU or GPU before and I have to say I'm impressed! It will mostly be used for emulation, as I put it all in a compact case.
This is why I always go for the card variants with the most VRAM, despite what everyone says at the time: "8GB is fine for almost all recent games." Sorry, but I never swallowed that one.
It was kinda fine in 2020, but technology moves on. Since then, developers have become used to using 10-12GB of VRAM for the GPU on the Series X/PS5, so this is translating across to PC games.
It was enough, depending on the game. I always asked what the budget was, plus the list of games (or kinds of games), resolution, graphics settings, target FPS, and when they planned their next upgrade.
The thing is you gotta look ahead when buying a PC component like a GPU. When people say "It's fine for almost all games today", well what does that mean? Does that mean that it won't be fine for a number of games in 2 years from now?
It was enough for 1080p, and when devs were making games for previous-gen consoles they were forced to use less VRAM anyway just to get the games to run on those consoles. Now that the new generation is a lot closer to PC specs, they can push gaming further. NVIDIA are cheaping out; they should have shipped all their cards with a minimum of 10-12GB.
@@malekkmeid9263 They're no closer now than the previous-gen consoles were. The main difference is that they have 16GB of shared memory, so they can use more of it for image-quality upgrades. Combined with console-specific efficiencies, this leads to greater stress on PC VRAM requirements.
I am feeling mighty fine after buying my 7900XT to replace my 3070, these results are eye opening and hopefully Nvidia will have to pull their **** together. Thanks for all the hard work!
I'm not trying to defend NVIDIA here, but if you already own a 3070 I would stick with it for now. The point of the video was of course to show the games that are really unoptimized in their VRAM usage; most games still work fine with 8GB of VRAM.
I think you should stick to your 3070 and wait a bit longer, maybe next gen of GPUs, as the performance increase will be much greater
@@otozinclus3593 I get what you mean, and I wouldn't advise most people to go for the upgrade I went for. But I'm also upgrading my 3800X, and I tend to "donate" my used hardware to family members who also use computers; right now having an extra card solves another issue, so my upgrade has that in mind too. My 3070 will go to someone who will enjoy it for years. I use my PC for work (10+ hours daily), so I tend to be quite "liberal" with my upgrades, as in my case it's an investment. (Hell, my old 2600 + RX 580 build is still in use.)
It just amazes me how badly 8GB of VRAM behaves in newer games. I was already experiencing a bit of it (and I have 1440p high-refresh monitors), but it will only get worse over time. I was thinking about a 4070 Ti, but even the 6800 sometimes uses over 12GB, so I think it was a good call on my side.
@@otozinclus3593 they already replaced their 3070.
@@otozinclus3593 The PS5 and Xbox Series X have 16GB of shared memory; how do you optimize a game to use less VRAM than that?
Single generation upgrade is so not worth it. 7900xt will be outclassed in a few years
I can't believe, that after getting a reference model Rx 6800 at MSRP during launch week, I decided to trade it in for a RTX 3070...
At the time it seemed like a good idea for my use case scenario... Both cards performed similar in then-current titles, 3070 technically had better RT performance (even though in reality I barely use RT and it actually runs pretty poorly on a 3070).
I wanted to try streaming so Nvenc encoder was appealing, and at the time FSR wasn't really a thing so Nvidia had one over with DLSS.
But the funny thing is, as a 1080p gamer, I don't even really use DLSS at all, so I really don't understand what I was thinking factoring upscaling features into my decision or hell even Ray Tracing performance, at the time.
But either way, in hindsight, I'm kicking myself for not taking a chance on AMD. Lesson definitely learnt
Pretty sure you could sell the 3070 and buy a 6800 for the same price.. or trade it :) lots of people buy Nvidia for the name not the performance it provides. Lots of silly people out there. No harm in trying
@@defectiveclone8450 Yeah. At the time the decision felt right, but in hindsight it was a mistake.
I was thinking of trading it into the local Cex store to recoup around £300, and then using that to either get a Rx 6000 or 7000 series gpu at some point. Since I'm only really gaming at 1080p 144hz @high-ish settings, I don't need anything too fancy
@@cal.j.walker As the other guy said, many people are still in the Nvidia bubble and will pay more for a 3070 over 6800. Maybe just try and trade it for 6800/xt.
@@modernlogix Ah right, yeah that makes sense now. That's also a good shout, thanks
I really appreciate the revisit. I've been watching used GPUs and you saved me from a costly mistake.
Many YouTubers whilst reviewing the cards: "The 3060 12GB is all marketing and you'll never need that VRAM, buy the 3060 Ti or 3070..."
That advice aged like milk.
Also - 4070ti owners sweating that 12GB is pretty much low end now.
I recently replaced my 3070 with a 6800 after finding one for good deal actually. I initially thought it would be kind of a sidegrade performance wise, but with more VRAM to work with. After watching your video though, it seems like I definitely made a great decision going forward.
RDNA2 was and still is an excellent architecture. I believe it will continue to improve well into the lifecycle of RDNA3.
@@edgyjorgensen3286 seems like with every driver update AMD is squeezing a little more out of it and the 7000 series as well. My 7900xt just keeps getting a little better with each update.
@@toxicavenger6172 I agree, but it's not at the RX 5xx and 4xx level; those GPUs gained an insane amount from all the updates.
@@ok-mx1mi Maybe not, but they had that same architecture for a while and had a lot of time to keep learning and extracting more from it. I think with RDNA 3 we might have a similar situation where they get more and more from it over time.
It's still mostly a sidegrade... Would've bought a 6800XT
Would be nice to see a retest of these titles with the 3070 vs the 16GB A770 to see if there is a similar result with 16GB vs 8GB! Also potentially add the 12GB 3060 to see if it pulls ahead!
That’s a good recommendation.
The answer to that is simple. The RTX 3070 will get better FPS in some games, but at 1440p the lows will be undeniably terrible, and it may fail to load textures properly, just as in the video. A bigger-VRAM 3060 might get lower average FPS but will be able to keep the game stable, with better lows and correctly loaded textures. His point is that the stutters will be there as long as the VRAM isn't enough, and what makes it worse is that textures won't load at all.
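The tradeoff that comment describes, stalling and stuttering versus silently dropping texture quality, can be sketched as a toy model. Everything here (function name, the quarter-resolution fallback, the numbers) is a made-up illustration; real drivers and engines are far more complex:

```python
# Toy model of the two bad outcomes when a texture exceeds free VRAM:
# either stall while swapping over PCIe (a frame-time spike / stutter),
# or fall back to a lower mip level (texture visibly fails to load fully).
def load_texture(size_mb, free_vram_mb, allow_fallback):
    if size_mb <= free_vram_mb:
        return ("resident", size_mb)       # fits: full quality, no penalty
    if allow_fallback:
        return ("low_mip", size_mb // 4)   # quarter-size mip as a stand-in
    return ("stalled", size_mb)            # swapped over PCIe: stutter

print(load_texture(512, 1024, True))    # ('resident', 512)
print(load_texture(2048, 1024, True))   # ('low_mip', 512)
print(load_texture(2048, 1024, False))  # ('stalled', 2048)
```

Either branch of the over-budget case matches a symptom from the video: the "stalled" path is the 1% lows cratering, the "low_mip" path is the muddy textures.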
I'd like to see more Intel Arc a770 comparisons, since they cost about the same as a 6600 and less than 3060 in several part's of the world.
came here to say this, could be a fascinating vid
Would love to see that. Worse GPU with more memory vs Better GPU with less memory
You know what would be fun to run through these exact tests?
RTX 3060 12GB
RTX 3080 10GB
RTX 3080 12GB
And see if the 3060 passes the 3080 10GB in some games, while comparing how much performance was left on the table had the 3080 originally come with 12GB!
Yes, please.
Hate to burst your bubble but the 3060 wouldn’t get higher fps in ANY game compared to a 3080… 12GB or not lol, it wouldn’t even be close. Just google performance stats for both and compare.
@@alenko4763 yeah unfortunately I think 10GB is just enough to actually avoid the issues the 3070 has.
At 1080p Windows 11 desktop, about 0.5 GB VRAM is consumed.
@@alenko4763 The 3060 12GB has superior 0.1% lows compared to the RTX 3070 Ti 8GB.
Excellent video. Appreciate the effort highlighting the importance of adequate VRAM. For anyone that doesn't know, while the 3060Ti/3070/3070Ti have 8GB and the 3080 has 10GB, the 3060 hilariously has 12GB because Nvidia screwed up their tech/marketing and AFAIK was responding to competition by AMD. Competition is a wonderful thing.
Yup. It was designed to have 6GB using 1GB modules (6 × 32-bit = 192-bit bus), but Nvidia realized late in the game that that wouldn't even cut it for launch-day review benchmarks, so they went with 2GB modules instead.
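The arithmetic in that comment is easy to check: GDDR6 chips have a 32-bit interface, so the bus width fixes the module count, and the module density then fixes total capacity. A quick sketch (the function name is mine, not anything official):

```python
# GDDR6 modules are 32 bits wide, so a GPU's bus width fixes how many
# modules it carries; total VRAM is module count x module density.
def vram_gb(bus_width_bits, module_density_gb):
    modules = bus_width_bits // 32
    return modules * module_density_gb

print(vram_gb(192, 1))  # 3060 as originally planned: 6 GB
print(vram_gb(192, 2))  # 3060 as shipped, with 2 GB modules: 12 GB
print(vram_gb(256, 1))  # a 256-bit card with 1 GB chips: 8 GB
```

This is also why the 3060's only options on a 192-bit bus were 6GB or 12GB; there was no way to ship "just" 8GB without cutting the bus down.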
Initially I bought my 3060 12GB for hobby AI training, professional workloads, and transcoding. Back then it was either this or the slightly more expensive 6600XT or 3060 Ti, and I even had second thoughts since so many reviewers and online comments said the 3060 Ti was much better value... Yet in the last three months, many games I played would have run into crippling texture load/quality problems if I had gone with the 3060 Ti. With how DLSS2 constantly improves, if FPS feels low I just turn DLSS up; but if you run into the VRAM limit, you have to suffer through extremely visible texture issues. DLSS is a much lesser compromise than the one those 3070 Ti people huffing copium insist is okay, turning textures down a few notches.
Thanks Nvidia 😂 it would be great to see the rtx 3070 vs 2080ti on these titles as well as against the 12gb 3060
This! I'm still rocking a 2080 Ti and I'd really like to see such a comparison. Especially since the 2080/Super/Ti is missing in all recent benchmarks which is a shame.
Maybe it's time to ditch the 3070 from benchmarks and replace it with a 2080 Ti 😂
The low VRAM on Nvidia cards has always been a pain and now it finally starts to show and piss people off. I hope people finally recognize how shitty Nvidia is about these things and stop buying these gimped cards.
I have a 2080 Ti and TLOU runs perfectly.
4k DLSS balanced, high settings, locked 60fps.
@@erazzed 2080 Super also had 8 gb of VRAM, so it should be what a 3060 Ti performs like as far as I am concerned.
This for sure
@@modernlogix exactly 2080S=3060ti but I really miss 2080ti in HU comparisons.
I was happy with my 5700XT and its 8GB of VRAM, but I'm glad now that I bought a 7900XTX with 24GB. I hope AMD continues the trend in the low and midrange with 12GB and 16GB of VRAM for the rest of the RX 7000 lineup.
I woulda loved to see the 16GB A770 on these charts too, even just as a wildcard.
Intel is really providing a lot of value for entry level PC builders with the A770. If I was just starting out building PCs I would buy that one.
An unstable card, given these kinds of results?!?! No?
@@josh2482 No? It still has a lot of crashes and isn't for a PC beginner, tbh.
You guys are making me feel EVEN better about my 6800 purchase for $499 back in November... it was way cheaper than the 3070 and 3070 Ti, which is what I was originally looking at... and the VRAM on top of the price sold it for me...
I couldn't downgrade from the 11gb I had with my 1080 ti. Managed to get a rx 6800 xt for about $460 shipped.
It's crazy! I can remember during the GPU crisis, scalpers were trying to sell 3070's for $1700.
Even crazier that in late Summer 2020, 2080Ti owners were falling all over themselves to sell their cards for peanuts so they could buy a $500 3070. That price, and availability, turned out to be a pipe dream.
@@rangersmith4652 stop dreaming lmao, nobody sold their 2080 ti to rebuy 3070, those who sold bought 3080/3090
@@darkfire3691 Mostly true in terms of intent, but I know at least a couple of people who did sell a 2080Ti thinking they were going to buy a 3080 but wound up "rebuying" a 3070 because the 3080 was unobtainable. And, of course, they overpaid for their 3070s.
@@rangersmith4652 Yeh it should be a pretty big warning sign to any sensible person that downgrading on vram while trying to "upgrade" gpu's, is a foolish strategy.
@@rangersmith4652 That is horribly depressing. Looking at it from today's standpoint, it's the same as setting your hard earned money on fire.
What always gets me is that there are people out there who upgraded from a 1080 Ti to a 3080. Same price 3 years later for 1GB less VRAM.
I wish I could like this video twice! It was so helpful and instrumental in my understanding of how a GPU works, what VRAM is, and what it does. It also clearly demonstrated why the graphs can sometimes be misleading, like when the GPU compensates for a lack of memory and avoids crashing or buffering by simply not loading the textures, which isn't shown any way other than watching the side by side.
Thank you so much! Wonderful video
I think 8GB for entry, 12GB for lower mid, 16GB for upper mid, and 20+GB for high tier GPUs would be fine for this generation, though for the generation after the 12GB may need to shift to entry level and replace 8GB
I got a 7900 XTX last week and that thing is an absolute monster.
4090 is god tier this generation. problem is that only higher tier cards are getting looked after and are superexpensive
@fREAK Yeah, and the price for it, plus the electricity consumption, 400-600W overclocked, is even more god-tier lol.
@@crossfire4902 Exactly why it's god tier. It's expensive to buy and to run, but all reviews show it's the fastest out there money can buy. The choice is the consumer's.
Amazing job once again. I am so happy with my 6800 xt since the first time. Probably the 3080 10 gb will have the same kind of issues. Who would have bet a dollar on AMD for Ray Tracing? Nvidia keeps claiming they have the superior tech and their cards are better, but they lost the way... Prices are totally stupid and even on fundamentals like the VRAM buffer, they took the wrong way.
Feel a bit bad buying my 3070 now, haven't used RT once and the RX 6800 is smashing it in these benchmarks.
I was trying to choose between a 4070ti or a 7900xt. So glad I went with the 7900xt.
7900xt have even better RT
I am glad to picked up the 7900 XT instead of 4070ti :)
The VRAM usage is insane! Even 12 gigs could be a problem from what I'm seeing in this video!
I was thinking that too. With the 4070 Ti having 12GB along with the 4070, it seems $800 and $600 is way too much for those GPUs!
They're already obsolete! Everything coming out is just a waste of silicon! These cards barely hit 60 fps at 1080p at times and that's ridiculous for the prices of these cards.
I've lost all hope in getting a fair deal when it comes to buying a new card.. just outright ripping people off. 😢
Good luck for those who buy 4070 and 4070ti.
At least that's the level of current texture. Might take a few years before they increase again the level.
Yeah, these are truly terribly optimized titles. Games 10 years ago looked better using 4GB of VRAM; I don't get why the criticism isn't aimed at these terrible ports.
Let's all not forget. Nvidia was super close to releasing a 4080 12gb. 🤣 At $900. Imagine if they were the only company in the market.
And if the 4080 was 12gb, I'd probably guess they were planning for the 4070 to be 8gb. What a timeline that would have been.
I know this comparison would be a little out dated, but could you do a comparison of an older title with an RTX 3060 (12 GB) vs RTX 3070 (8 GB) to see how resolution and settings scaling do? This should also highlight the limitations of the 3070 compared to Nvidia's own cards.