16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit

  • Published: Sep 6, 2024

Comments • 6K

  • @DrearierSpider1
    @DrearierSpider1 1 year ago +5197

    The first mainstream GPU with 8GB of VRAM was the R9 390 in 2015, which cost $330. By 2020, we should've only been seeing 8GB on the most entry-level cards.

    • @oOZellzimaOo
      @oOZellzimaOo 1 year ago +97

      @@istealpopularnamesforlikes3340 Ahem* RX 6300 2GB

    • @singular9
      @singular9 1 year ago +523

      @@oOZellzimaOo that's a sub-$100 GPU... 8GB of VRAM is like $50+ alone...

    • @_Winfried_
      @_Winfried_ 1 year ago +233

      There were variants of the R9 290X with 8GB as well

    • @BleedForTheWorld
      @BleedForTheWorld 1 year ago +75

      @@oOZellzimaOo The nuance is that everyone SHOULD know NOT to get a graphics card like that in the modern era. Essentially, it's a waste of money compared to something that's actually useful. Sorry to be pedantic about it, but that's the nuance of it.

    • @LuisC7
      @LuisC7 1 year ago +137

      In 2015, 8GB of VRAM was $330? Man, prices have come a long and sad way...

  • @kougar7
    @kougar7 1 year ago +752

    The 1070 had the same 8GB of VRAM as the 3070, and that was seven years ago...

    • @Crimsongz
      @Crimsongz 1 year ago +88

      My GTX 1080 Ti has more VRAM 😂

    • @Accuracy158
      @Accuracy158 1 year ago +19

      Yeah, I bought a 1080 on launch day, and that was probably the first generation where I stopped looking at VRAM as a big concern. ...But as you point out, that was kind of a while ago at this point.

    • @Magjee
      @Magjee 1 year ago +47

      @@Crimsongz 11GB, 1 more than the 3080. They are so stingy with VRAM ;(

    • @His_Noodly_Appendages
      @His_Noodly_Appendages 1 year ago +14

      Loved my 1070.

    • @AvroBellow
      @AvroBellow 1 year ago +7

      And the 1080 Ti had 11GB.

  • @claytonmurray11
    @claytonmurray11 1 year ago +490

    The "I told you so" angle is completely justified, and I loved every second of it. However, the content is also extremely important, because a lot of RTX 3070 Tis, 3070s, and 3060 Tis are still being sold new and will be traded on the second-hand market. People need to know the value of these cards and the performance they can expect. Well done, Steve!

    • @chrissoclone
      @chrissoclone 1 year ago +40

      And even worse, nVidia will continue to put out 8GB cards in the 4xxx series, while even new 12GB models should already get the warning the 8GB cards got back then. If people run out to buy 8GB 4050 Tis etc. at these highly inflated prices, a "told you so" will be more than justified next time.

    • @claytonmurray11
      @claytonmurray11 1 year ago +29

      @@chrissoclone exactly. A 6950XT seems like a much better deal than the 4070ti and the upcoming 4070.

    • @logirex
      @logirex 1 year ago +9

      For me it reeks of confirmation bias. You would not run a 30-month-old mid-range GPU at ultra settings these days. I've got a 6900 XT and certainly don't run most games on ultra, but on high or very high settings for good fps. Test the 3070 on high settings against the 6800 and it will do well.

    • @claytonmurray11
      @claytonmurray11 1 year ago +7

      @logirex I see what you're saying, but it's still great information for people looking at buying these cards, which still sell in considerable numbers new and used today. The old reviews are no longer valid.

    • @logirex
      @logirex 1 year ago +4

      @@claytonmurray11 Going forward you certainly want cards with more memory, but it looks like they're getting it, with the RTX 4070 getting 12GB for instance.
      That said, even buying an older card, I would still take an RTX 3070 over the 6700 XT any day of the week. This channel tested those across over 50 games and the 3070 was almost 15% faster. The only thing is, if you buy older or mid-range cards, DO NOT RUN THEM AT ULTRA SETTINGS.
      Both this channel and others such as LTT have pointed out multiple times that running at ultra settings is pointless. LTT has one video where they had people look at games using ultra or very high settings, and most preferred very high, as they clearly could not notice the difference.

  • @ThreaT650
    @ThreaT650 1 year ago +556

    I really appreciate that on this channel you make the pressure you're putting on companies through your views and influence extremely clear, with extremely specific target points, so that they can't weasel out of criticism with a few extra gigabytes of VRAM. Quite frankly, you guys are using your power in this market as efficiently as you possibly can without getting anti-consumer pushback.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago +1

      They're still very, very far from done, although I'm very tempted to swap them

    • @nicknorthcutt7680
      @nicknorthcutt7680 9 months ago

      💯💯💯

    • @mircomputers
      @mircomputers 8 months ago +1

      Yet idiots still bought the 3070 Ti in December 2023: "I need ray tracing"

  • @philosophyetc
    @philosophyetc 1 year ago +2404

    RIP in peace to anyone who paid $1400 for a GPU with 8GB of vram two years ago

    • @tpmnrcks
      @tpmnrcks 1 year ago +125

      Desperate times..

    • @RifterDask
      @RifterDask 1 year ago +106

      I paid $360 for one six months ago. In my defense, it was an upgrade from a 6GB 2060

    • @znoozaros
      @znoozaros 1 year ago +72

      Paid €400 for a 3070 Strix a month ago. Now waiting for my €729 6950 XT Red Devil to arrive

    • @breadkrum6860
      @breadkrum6860 1 year ago +186

      Rest in peace in peace lol

    • @King-O-Hell
      @King-O-Hell 1 year ago +79

      I did pay like $1100 for a 3070 Ti because I didn't really think things would get better, as far as price. I should have stuck with my 1070 and waited :)

  • @MrLimeyLime
    @MrLimeyLime 1 year ago +259

    I was using an RX 580 for many years, and I feel what made that card age so well was its 8GB of VRAM. At launch that was pretty high, given NVIDIA's equivalents were using 3 and 6GB. So when I finally moved away from that card, it was obvious the RX 6800 would be the right pick for me. It's been good to see the card aging well already, for the same reasons.

    • @kaapuuu
      @kaapuuu 1 year ago +18

      I switched from a 1060 to an RX 6800. I considered the 3070 at release, as the price was very similar, but I'm an enthusiast and I knew 8GB wouldn't cut it at 1440p within a couple of months. That's exactly what happened, yet Nvidia claimed it's not a problem.
      A 3080 with 10GB? What a joke. NVIDIA made fools of consumers

    • @chrissoclone
      @chrissoclone 1 year ago +10

      The RX 580 was the best bang-for-buck card I ever had, and yes, its 8GB of VRAM played a large part in that. Before that, my favorite bang-for-buck card was a 660 Ti; it was quite excellent, and the ONLY reason I had to replace it was that even then nVidia was just too stingy with their VRAM. It could've lasted me even longer if not for that.

    • @MetalGeek464
      @MetalGeek464 1 year ago +3

      I made this same upgrade recently. The additional VRAM and Nvidia's bonkers pricing were key drivers in my decision.

    • @sacamentobob
      @sacamentobob 1 year ago

      you must have avoided the lemons quite well!!

    • @MrLimeyLime
      @MrLimeyLime 1 year ago +2

      @Varity Topic 6750 XT is still a pretty solid choice!

  • @instantstupor
    @instantstupor 1 year ago +166

    It would also have been interesting to see the RTX 3060 12GB vs a higher-end 8GB RTX card, to see if there were cases where the former played better than its more expensive counterpart.

    • @sinephase
      @sinephase 1 year ago +3

      It was too much for the GPU though, so it's not as good as you might expect. At least it won't crash in RE4 as much, LOL

    • @ole7736
      @ole7736 1 year ago +2

      The RX 6750 XT might be interesting to throw in the mix for those titles that report ~15GB of total VRAM usage already.

    • @opachki8325
      @opachki8325 1 year ago

      I'd love to see a 3070Ti with its G6X in this comparison.

  • @TheTardis157
    @TheTardis157 1 year ago +478

    My ancient RX580 has as much VRAM as that 3070. Now 8GB is just enough for 1080p gaming. Glad to see the Radeon card doing so well.

    • @Renekor
      @Renekor 1 year ago +2

      u_u

    • @MATTW3R
      @MATTW3R 1 year ago +1

      ❤️

    • @jpesicka999
      @jpesicka999 1 year ago +3

      GDDR5 vs GDDR6

    • @valenrn8657
      @valenrn8657 1 year ago +13

      @@jpesicka999 AMD's RX 6600 8 GB GDDR6 is reaching $199

    • @valentinvas6454
      @valentinvas6454 1 year ago +11

      And to think that even the 470 had 8GB variants back in 2016.

  • @amineabdz
    @amineabdz 1 year ago +206

    Retesting the 6700 XT would be a great idea, considering that it offers 12GB of VRAM and is already very close to the 3070 in compute power, but for cheaper. It would be interesting to see how it fares against the 3070 in the current VRAM-limitations meta.

    • @KrisDee1981
      @KrisDee1981 1 year ago +21

      Hopefully Steve can do a 2080 Ti vs 3070 vs 6750 XT.
      Turing would win this because of much higher RT performance, and 11GB of VRAM is actually the sweet spot for that class of performance.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +17

      @@KrisDee1981 Turing uses first-gen RT cores just like the 6750 XT, so I doubt that. But hey, no problem in comparing.

    • @HankBaxter
      @HankBaxter 1 year ago

      Actually, that would be awesome.

    • @ComputerProfessor
      @ComputerProfessor 1 year ago

      Yes, very good idea. Hardware Unboxed pleasssseeee

    • @cooper23231
      @cooper23231 1 year ago +4

      @@KrisDee1981 🤣🤣 yeah let's do it until nvidia wins, not the consumers, Nvidia must win.

  • @extragooey813
    @extragooey813 1 year ago +468

    nVidia's planned obsolescence has worked... I'm gonna buy a new GPU to replace my 3070. Congrats nVidia, you've driven me into getting an AMD RX 7900 XTX.

    • @Dempig
      @Dempig 1 year ago +59

      I paid $850 for a 3070 Ti on launch day... what a mistake, lol. But I'm now switching to a 7900 XTX as well soon

    • @macdonald2k
      @macdonald2k 1 year ago +11

      Nothing wrong with 4090 too, but I get that $1600+ isn't necessary for a good experience.

    • @pcdog8903
      @pcdog8903 1 year ago +38

      I’m enjoying my 7900 XTX. I think you will like it.

    • @extragooey813
      @extragooey813 1 year ago +36

      @@macdonald2k Yup, the best card right now. But I choose not to give nVidia any more of my money.

    • @extragooey813
      @extragooey813 1 year ago

      @@Dempig I have a 3070 FE. Got it a few weeks after release, when the only way to find one was to get lucky on Best Buy with constant website refreshes on stock days. Should have resold it when it was going for >$1000, lol.

  • @whiteglovepc
    @whiteglovepc 1 year ago +758

    What about the planned obsolescence angle in terms of NVIDIA purposefully limiting the VRAM in order to encourage consumers to upgrade sooner? Seems a clear strategy at this point.

    • @Hardwareunboxed
      @Hardwareunboxed  1 year ago +667

      They're only going to give gamers as little as they can get away with. This is obviously an excellent strategy on their part.

    • @TheHighborn
      @TheHighborn 1 year ago +5

      My second-ever notebook was a Lenovo Y510p with 2x 755M in SLI (nvidia). At the time I didn't know anything about PC hardware, so I figured two GPUs are obviously better, right?
      Anyway, nvidia came out with Shadowplay and I was so happy I could use it to record stuff, as FRAPS really sucked. I was recording my League of Legends gameplay left and right.
      A few weeks passed, and I couldn't use Shadowplay anymore. I googled it and found out Nvidia had disabled it, but there was a workaround, so I used said workaround. A few weeks passed again, and that workaround was disabled too. That was the moment I didn't want Nvidia products anymore. When I built the PC I use today, I bought a Vega 64, despite almost buying a 1080 Ti. Now I'm buying a new computer, and looking at the things Nvidia does, the obvious choice was the 7900 XTX. (The CPU is in the mail, and it's a Liquid Devil, so I can't use it yet.)
      Fuck nvidia.

    • @BleedForTheWorld
      @BleedForTheWorld 1 year ago +69

      Gotta love that profit driven economy

    • @kaisersolo76
      @kaisersolo76 1 year ago +11

      @@Hardwareunboxed So why did they come out with the 3060 12GB? It was a weird card when it came out, in terms of the VRAM it got. Did Nvidia finally see the problem coming?

    • @1armbiker
      @1armbiker 1 year ago +133

      @@kaisersolo76 They saw 6GB as too little at the time, as it would have had issues with some titles at launch. Remember, you can only change VRAM amounts by doubling or halving because of the bus width.

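The bus-width point in the reply above can be sketched numerically. This is a minimal illustration, not from the video; the helper name is made up, and the 1GB/2GB-per-channel GDDR6 chip densities are assumptions (clamshell mode, which doubles capacity again, is ignored):

```python
# A GPU memory bus is built from 32-bit channels; each channel hosts one
# GDDR6 chip, and chips only come in fixed per-chip densities (here 1GB or
# 2GB). Capacity options therefore come in doubling steps for a given bus.
def possible_vram_configs(bus_width_bits, chip_densities_gb=(1, 2)):
    channels = bus_width_bits // 32          # one chip per 32-bit channel
    return [channels * d for d in chip_densities_gb]

# 192-bit bus (e.g. RTX 3060): 6 channels -> 6GB or 12GB, nothing between.
print(possible_vram_configs(192))  # [6, 12]
# 256-bit bus (e.g. RTX 3070): 8 channels -> 8GB or 16GB.
print(possible_vram_configs(256))  # [8, 16]
```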
  • @selohcin
    @selohcin 1 year ago +452

    Excellent work, Steve. I had no idea new games were using this much VRAM. I'll be sure to get something with no less than 16GB of VRAM next time I buy a GPU.

    • @Karibanu
      @Karibanu 1 year ago +40

      Some of the stuff I've been running has been using all 16GB of VRAM and *still* going into system memory...
      The emergency 6800 XT (thank you, card death one month out of warranty!) was a good purchase. Thankfully I got one of the lower-binned versions on sale and saved about £300 at the time. I'd never get anything below 16GB now, and frankly I still want more. I'm less bothered about clocks.

    • @tre-ou5gt
      @tre-ou5gt 1 year ago +1

      @@Karibanu what resolution do you play at?

    • @Karibanu
      @Karibanu 1 year ago +5

      @@tre-ou5gt 1440p, occasionally 1440p triple-screen.

    • @orcusdei
      @orcusdei 1 year ago +2

      I've just bought 4070 with 12... the issue is that I can't fit any AMD card into my computer.

    • @kontoname
      @kontoname 1 year ago +33

      @@orcusdei Did the thought occur to you that a decent PC case is way, way cheaper than a GPU?
      I'd never buy a crappy GPU just because the better one doesn't fit, especially not if that GPU is way more pricey than the alternative card. You could literally have bought a decent future-proof case, had a nice meal with the family at a restaurant, AND bought a GPU instead of a 4070 🤣🤣🤣🤣💀

  • @PotatMasterRace
    @PotatMasterRace 1 year ago +727

    It would be very nice to see a 12GB model like the 6700 XT in the comparison as well, or even an older 11GB card like the 1080 Ti. It would be hilarious if frametimes were more stable on the 1080 Ti than on a 3070 :D

    • @TRONiX404
      @TRONiX404 1 year ago +64

      Actually, I recently ran God of War on my 1080 Ti / i7-7700K at 4K ultra-high with FSR 2 balanced or quality.
      A solid 60fps

    • @singular9
      @singular9 1 year ago +51

      Blyat-Vidia gimping its own cards? Who would have thought.

    • @legi0naire
      @legi0naire 1 year ago +29

      I'm having a much better gaming experience with my 1080 Ti compared to some other people with cards twice as fast but only 8 gigs.
      I still get 60+ fps in almost all new games at 1440p, no stuttering.

    • @razoo911
      @razoo911 1 year ago +1

      GTX cards run The Last of Us like shit; zero optimization for that GPU generation, like in every game

    • @The_Cokimoto
      @The_Cokimoto 1 year ago +6

      ^This please Steve, you need to do this!!

  • @recordatron
    @recordatron 1 year ago +236

    I'm actually pretty gobsmacked by how well the 6800 performs here. The 16gb buffer is an absolute life saver and I'm surprised how well it's doing with RT as well considering how questionable it was at launch. One GPU will be usable at higher settings for a few years yet probably and the other is relegated to lower settings and resolution for completely arbitrary reasons. Very frustrating.

    • @angrysocialjusticewarrior
      @angrysocialjusticewarrior 1 year ago +13

      To be fair, the 3070 has never competed with the 6800. The 6800 has always been one tier above the 3070 from the very beginning (even before this new VRAM scandal).
      The 3070 was built to play with the likes of the RX 6700 XT and RX 6750 XT (which the 3070 beats most of the time).

    • @sorinpopa1442
      @sorinpopa1442 1 year ago +17

      @@angrysocialjusticewarrior And what about the 3070 Ti? It suffers from the same problem as the non-Ti 3070

    • @RydarGames
      @RydarGames 1 year ago +37

      @@angrysocialjusticewarrior No no no. Even the $400 digital PS5 console has over 12GB of VRAM dedicated to games. Launching the 3070 with only 8GB of VRAM, while knowing devs in the near future would target the consoles as a baseline, is treacherous 😢

    • @hansolo631
      @hansolo631 1 year ago +20

      @@angrysocialjusticewarrior I agree that the 6800 and 3070 weren't direct competitors, but they were only separated by 70 bucks, and that 70 bucks gets you a MASSIVE advantage now. Like an absurd one

    • @AvroBellow
      @AvroBellow 1 year ago +11

      What's frustrating is the fact that so many people bought these pieces of (as Steve would say) GAHBAGE just because they were in a green box. Anyone who spent that kind of money on 8GB deserves their fate.

  • @GewelReal
    @GewelReal 1 year ago +424

    I can't believe the mental gymnastics people go through to justify buying Nvidia when they don't absolutely require it

    • @schwalmy8227
      @schwalmy8227 1 year ago +26

      One of my buddies got a 6700 XT recently and had horrible driver issues, and some games performed really poorly compared to others. Could've been his fault, but I think AMD still needs to work on their drivers more

    • @EarthIsFlat456
      @EarthIsFlat456 1 year ago +16

      Well they're better products (albeit more expensive) so that's really the only justification needed.

    • @remasteredzero4076
      @remasteredzero4076 1 year ago +58

      @@schwalmy8227 If there was a problem with the driver, he could just use an older one

    • @NostradAlex
      @NostradAlex 1 year ago +32

      It's actually normal behavior. They bought something and they want people to assure them they made the right choice.
      It's human nature. Even when we know we're wrong, we'll keep telling ourselves we're right in the hope that we'll eventually believe it.
      It's just lying to ourselves. And we do it not just with things we buy, but with all sorts of things. Even relationships, to give an example.

    • @GewelReal
      @GewelReal 1 year ago +68

      @@schwalmy8227 You're saying that as if Nvidia drivers were perfect.
      I've had so many issues with Nvidia's that for me it's not an argument for going team green. Their only advantage right now is the professional space, IMO. For someone who just plays video games, AMD is a clear choice.
      Hell, even Intel has a lot to say here

  • @vor78
    @vor78 1 year ago +61

    I was in the market for a GPU upgrade at the start of the year and it came down to these two as the primary contenders. I had been a longtime user of Nvidia products, but this time around, I went with the RX 6800, with the VRAM issue being one of the major deciding factors. Watching this video, I'm very thankful that I did.

    • @briefingspoon380
      @briefingspoon380 8 months ago

      how she been treating you so far?

    • @vor78
      @vor78 8 months ago +7

      @@briefingspoon380 I'm very satisfied with the 6800 and have had no issues with it at all.

    • @briefingspoon380
      @briefingspoon380 8 months ago

      @@vor78 What kind of fps are you getting, and in which games?

  • @Magmafire
    @Magmafire 1 year ago +226

    You know what would be an interesting follow-up: including benchmark results from the 16GB Intel A770 in the titles where the 3070 struggled. Just for fun, you could even include data from the Radeon VII for those same titles. That additional data would really drive your point home for owners of 8GB cards.

    • @ERZAY2
      @ERZAY2 1 year ago +18

      I’d also like to see a 3060 vs 3070 video to see if it can handle these better as well

    • @UTFapollomarine7409
      @UTFapollomarine7409 1 year ago +9

      Hey man, even a Vega Frontier with 16GB would dominate the 3070, lmao. 3070 trash

    • @StevenDolbey
      @StevenDolbey 1 year ago +8

      Or a 2080 Ti. Similar GPU performance to the 3070, but more VRAM. Steve mentioned it, but didn't test it this time.

    • @tompayne4945
      @tompayne4945 1 year ago +3

      Yeah, even going all the way back to a 1080 Ti would be a good shout. I wonder if they even still have a Radeon VII?! 😮

    • @piercewiederecht5135
      @piercewiederecht5135 1 year ago

      Nahh, they sold it for $2500 in the mining boom

  • @corriban
    @corriban 1 year ago +575

    An RTX 4060 with 8GB of VRAM will be pretty much dead on arrival. I can't believe Nvidia's social media team is unable to clearly communicate this to the higher-ups.

    • @Ddofik
      @Ddofik 1 year ago +89

      And does it really matter? It will still sell really well

    • @apostolos8734
      @apostolos8734 1 year ago +79

      @@Ddofik as well as the 4080 did? 😂

    • @BlueMax109
      @BlueMax109 1 year ago +124

      Imagine thinking Nvidia care about the consumer lol

    • @MoltenPie
      @MoltenPie 1 year ago +35

      They may give it 16GB in the end, actually, just like they did with the 3060. I'm almost certain the 3060 was planned as a 6GB card (just like the 2060 was), but they changed their mind at the last moment. No way they planned the 3060 to have more VRAM than the 3080! But the only thing they could do at that point was double the buffer (same number of channels, double the capacity per memory bank).

    • @fepethepenguin8287
      @fepethepenguin8287 1 year ago +15

      You think the higher-ups are not fully informed?
      They have NO incentive to offer anything more.
      This would be a different story if they were facing the AMD of 8 years ago, which would do anything to gain market share.
      As soon as a competitor comes in offering the same performance with triple the VRAM at half the price, and they truly lose market share, then the next gen will be different.

  • @falcone5287
    @falcone5287 1 year ago +419

    I'm glad I went with an RX 6700 XT. I remember people saying the RX 6700 XT was no match for the RTX 3070, and the AMD card actually did perform a little worse initially, but now it's a completely different story thanks to the bigger VRAM buffer and AMD's FineWine technology :)

    • @tilburg8683
      @tilburg8683 1 year ago +29

      And in fairness, the 6700 XT has always been better, especially at 1080p and above, due to the VRAM.

    • @LowMS3
      @LowMS3 1 year ago +16

      My reference 6900 XT beats my buddy's 3080 Ti in most games and synthetic benchmarks. His 80 Ti will outclass my card in some games though!

    • @hardergamer
      @hardergamer 1 year ago +22

      @@LowMS3 I've been having VRAM problems with my 10GB 3080 for some time, mainly with newer games, and after borrowing a friend's 6800 XT just last week, I'm shocked at how much better so many games looked and played. Even some 2+ year old games look and play much better: with older games like GoW at 1440p, the texture pack crashes the 3080, but the AMD card runs fine. Anyway, I've just bought a 6800 XT new for nearly half what the 12GB 3080 card cost.

    • @dontmatter4423
      @dontmatter4423 1 year ago +11

      The 3070 has enough GPU power to drive 1440p at decent frame rates, but struggles even at 1080p just for lack of VRAM.
      What a shame

    • @philipgibbon4473
      @philipgibbon4473 1 year ago +5

      Same here, I got an RX 6700 XT for what I play. I also wanted more VRAM without breaking the bank, but the games I play need more than 16GB of VRAM (Transport Fever 2, CityBus Manager, Cities Skylines, etc.). I had an 8GB card, but I had unbearable frame times and a lot of lag due to VRAM overflowing into system memory. And I had people saying "you need Nvidia for gaming, it's the best", but my 8GB card WAS an Nvidia GTX 1070 Ti. I'm like, no, I need more than the 8GB Nvidia was offering in the mid-to-high end. I get what feels best for me, not for others. It's like being back in high school choosing your college and your friends saying "come where I'm going". Sorry, getting off topic, haha.

  • @petarracic6740
    @petarracic6740 1 year ago +49

    Just to add to this, it was a real kick in the nuts when I bought the 10GB 3080 and then the 12GB variant came out afterwards... I wanted the extra VRAM too; it should have been the first product they released.

    • @eugenijusdolgovas9278
      @eugenijusdolgovas9278 1 year ago +1

      They did, and it was called the 3080 Ti. The 3080 was never meant to last longer than 2 years for 1440p ultra.

    • @CheGames497
      @CheGames497 1 year ago +5

      Never buy the first version. While it's hip and cool to buy it first, you're usually just a test subject, and something better drops right after, be it Apple, Samsung, monitors, GPUs, CPUs, whatever. Don't get caught up in FOMO, fam.

    • @CheGames497
      @CheGames497 1 year ago

      @@eugenijusdolgovas9278 Yeah, we are a wasteful, throwaway society. We think we need to replace things because corporations have trained us that way, to keep profits nice. We are just modern-day serfs.

    • @eugenijusdolgovas9278
      @eugenijusdolgovas9278 1 year ago

      @@CheGames497 Whilst some are wasteful, I tend to buy GPUs on the used market and rock a 6800 XT right now, which will keep working until I decide to switch to 4K competitive FPS gaming.

  • @scotty28653
    @scotty28653 1 year ago +348

    AMD have really improved their Radeon cards over the recent years. I used to use Nvidia cards all the time but have been using Radeon for the last 3 years now and love these cards.

    • @SMILECLINIC-2004
      @SMILECLINIC-2004 1 year ago

      How is the driver support for amd cards?

    • @nicane-9966
      @nicane-9966 1 year ago +34

      @@SMILECLINIC-2004 It's fine, no worse than Nvidia, which has had issues recently as well. They are just slower at releasing new drivers, but that's it. That said, I'm not one of those guys who updates their drivers the moment new ones come out. Why would you? It's not like you're going to get much better performance in newer games; you'd need to buy a faster GPU for that. If it works well, you don't need to touch it.

    • @mikem2253
      @mikem2253 1 year ago +37

      @@SMILECLINIC-2004 No issues as a former 6700xt and a current 7900xt owner.

    • @MisterDirge
      @MisterDirge 1 year ago +4

      I'm using a RX 5700, and I haven't had any driver issues throughout my time with AMD cards (rx 580, vega 56). I'm happy that the RX 5xxx series cards are still getting support and FSR/RSR updates as its allowed me to play most games at 2560x1080 and higher than 60fps. I don't really like the metrics overlay that AMD uses but I can't use Rivatuner without MSI afterburner which messes up my undervolting settings.

    • @bumperxx1
      @bumperxx1 1 year ago

      It's been said since 2020: go AMD if you want regular features; they have the best regular rasterization frame rates. But at the low end, with these 3070, 3060, 6800 XT, 6700 XT, etc. cards, you won't get any good benefit from ray-traced lighting effects, so you need to go higher up the chain. So obviously this test proves that

  • @itmegibby2459
    @itmegibby2459 1 year ago +672

    It would have been interesting to see how the 2080 Ti with its 11GB of VRAM compared to the 3070.

    • @KrisDee1981
      @KrisDee1981 1 year ago +78

      I wish Steve would do this comparison next. It's more interesting to see Nvidia vs Nvidia: almost identical performance, but an extra 3GB of VRAM.
      I do have a 2080 Ti, and TLOU runs great at 4K, DLSS balanced, high settings.

    • @Evie0133
      @Evie0133 1 year ago +14

      Now that would be interesting to see :D

    • @deamondeathstone1
      @deamondeathstone1 1 year ago +127

      Add the RTX 3060 with 12GB for shits and giggles.

    • @YH-lj9gy
      @YH-lj9gy 1 year ago +41

      The 2080 Ti is a beast; unfortunately those 2018 Micron GDDR6 chips are ticking time bombs.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago +24

      @@deamondeathstone1 Yeah, it would have been great to see how the 12GB 3060 fared in these games compared with the 3070. I imagine a lot smoother gameplay in many instances, especially with DLSS.

  • @gusgyn
    @gusgyn 1 year ago +102

    I'm glad that late last year, when upgrading, I went with a 6800 XT instead of the 3070 Ti. They had a similar price, but I also thought 8GB was a bit too low, since my GTX 1070 also had 8GB and I wanted an upgrade in VRAM as well. Thanks for your analysis, HW Unboxed!

    • @aHumanCookiee
      @aHumanCookiee 1 year ago +13

      These two cards are not even in the same league except for RT overall, you made the right choice sir!

    • @highmm2696
      @highmm2696 1 year ago +5

      Very good choice. The 6800 XT is more in line with the 3080, often beating it at 1440p in a lot of titles (non-RT, of course)

    • @3lDuK371
      @3lDuK371 1 year ago +1

      I think it was HW Unboxed that sold me on my RX6800 due to vram and 1440p performance

    • @gusgyn
      @gusgyn 1 year ago +3

      @@highmm2696 Thanks. Yes, I know the 6800 XT is often better than the 3070 Ti, but where I live they were practically the same price. Like Apple, Nvidia carries a "tax": their cards are usually more expensive and drop in price less due to higher demand. So AMD usually tends to be cheaper

    • @Kamikatze_Gaming
      @Kamikatze_Gaming 1 year ago

      That's why I bought the 6800 XT instead :)

  • @Puremindgames
    @Puremindgames 1 year ago +65

    It happened with 128MB, 256MB, 512MB, 1GB, 2GB and 4GB it was only a matter of time until 8GB joined the 'Not enough' graveyard, I think some are just shocked at how fast it seemed to happen.

    • @AFourEyedGeek
      @AFourEyedGeek 1 year ago +21

      And yet NVIDIA is selling a 4060 Ti with 8GB of VRAM in 2023.

    • @drek9k2
      @drek9k2 8 months ago +3

      It really didn't though: 2, 3, and 4GB came in pretty fast succession, while it took a while for 8GB to stop being enough, because the RX 480 had 8GB a long, LONG time ago and it was still fine until a few years ago. And at the time of release it was already not enough: Doom Eternal was taking up 11GB, and at 1440p Cyberpunk took up enough VRAM that it had texture issues.

    • @Bassjunkie_1
      @Bassjunkie_1 8 months ago +1

      @@drek9k2 Agreed. I think the people that are shocked are the ones who recently bought a 3070 on sale or maybe even a 3080, while the rest of us from the r9 290x 8gb and rx 480 8gb era are not.

    • @drek9k2
      @drek9k2 8 months ago +3

      @@Bassjunkie_1 Oh true, that's a good point. I totally forget that for a lot of people this may be their first video card. I'm not saying you needed to be from the GT 8800 days; I just keep forgetting many of these kids weren't even around for the 8GB RX 580 yet. I guess a lot of people really are truly clueless.

    • @alexarose2996
      @alexarose2996 8 months ago

      Go play the new version of Uncharted on PC. I have no complaint about getting a new card with 16GB of VRAM when the game looks like that. Damn, RIP 2070 Super and 8GB cards, you won't be missed

  • @bierrollerful
    @bierrollerful 1 year ago +102

    I must admit, I didn't think 8GB would be an issue this soon. Up until recently I thought that, all things considered, the 3060 Ti was *_the_* card of this GPU generation.
    In the end, I bought a 6700 XT. Less so because of the additional VRAM, but mainly because it was cheaper - and now I'm really glad I did.

    • @PeakKissShot
      @PeakKissShot 1 year ago +20

      As soon as you give a dev more ram to work with they will consume it as fast as possible 😂

    • @madtech5153
      @madtech5153 1 year ago +15

      The rise in VRAM usage is due to consoles. Consoles used to hold PC gaming back, since companies were still focusing on making their games for the PS4/Xbox of that time. Now that the current-gen consoles are quite good and have 16GB of RAM (although unified), expect most games at ultra to use around 10-12GB, leaving 4GB for the OS. The safe play would be to buy a 12GB VRAM GPU, but it's only guaranteed to be safe with a 16GB VRAM GPU, unless you want 4K ultra + ray tracing.

    • @chronology556
      @chronology556 Год назад +1

      @@madtech5153 VRAM and Teraflops on XBSS compared XBSX?

    • @qweasd-xf5pv
      @qweasd-xf5pv Год назад +4

      @@madtech5153 ps5 and xbox sx have 16 of shared memory. It means that 16 us shared buy both vram and ddr ram. Do how come 8gb vram and 16gb ddr ram become obselet
      If you consider texture size ps5 might only have 6gb vram and 10gb ddr ram to work with.
      Dude devs are f**king up

    • @dragonl4d216
      @dragonl4d216 Год назад +4

      The 6700XT is the card of this generation, not the 3060Ti. It is cheaper, faster, and has more VRAM than the 3060Ti.

  • @ziyaceliksu
    @ziyaceliksu 1 year ago +43

    You showed very well that not everything is about fps. Most people were looking at the fps results and passing by. We're reminded again how complicated the user experience is. Congratulations.

  • @kennethd4958
    @kennethd4958 1 year ago +158

    Been waiting for this video... in 2020 I had the opportunity to get either a Sapphire Pulse 6800 or an EVGA 3070... and I'm so glad I went with the 6800.

    • @MrDoBerek
      @MrDoBerek 1 year ago +2

      Came from a 5700 XT to a 6800, worth it!

    • @gaav87
      @gaav87 1 year ago

      You must have friends at Nvidia, because the RTX 3070 released in July 2021, 1 year and 8 months ago, and real availability came months later, in late 2021.

    • @kennethd4958
      @kennethd4958 1 year ago +2

      @@gaav87 No it didn't. Maybe the 3070 Ti released in 2021, but the 3070 definitely released in 2020... September or October, I believe.

    • @kennethd4958
      @kennethd4958 1 year ago

      @@RUclipsTookMyNickname.WhyNot Nice! I had a 5500 XT, a 5600 XT and then a 5700 XT before my 6800 lol… I kept wanting more power lmao

    • @kimjiniinii
      @kimjiniinii 1 year ago

      Yeah, you got lucky; with all the scalpers and miners, cards were being sold in bulk before they even arrived.

  • @stianh.4587
    @stianh.4587 1 year ago +32

    My choice a year ago was between the RX 6800 and the GeForce 3070. I'm happy to say I made the right choice.

  • @AxTROUSRxMISSLE
    @AxTROUSRxMISSLE 1 year ago +275

    This makes me feel really good about my recent upgrade to the 6800 XT; hopefully it'll do great at 1440p for years to come.

    • @Doomrider47
      @Doomrider47 1 year ago +39

      On the future side of things, team red has always seemed to have an eye for longevity. Just nabbed a 7900 XTX myself after I saw my venerable RX 580 8GB was minimum spec; after 5 years of use it was a no-brainer to go red again.

    • @AxTROUSRxMISSLE
      @AxTROUSRxMISSLE 1 year ago +8

      @@Doomrider47 Agreed, I never really dipped into AMD until I got a Ryzen 2700 a while ago. I don't think I will go back to an Nvidia card for the foreseeable future unless AMD starts sticking its fists in consumers' asses more than normal. I'm honestly hoping Intel brings a big update with its next GPUs; maybe AMD and Intel both putting pressure on Nvidia will knock them down a peg.

    • @pyronite3323
      @pyronite3323 1 year ago +3

      @@Doomrider47 How big was the jump? I have a GTX 1070 Ti (which has similar performance to the RX 580) and I was also planning to buy a 7900 XTX, or maybe wait 2 years and buy a 5080.

    • @nishanthadda8824
      @nishanthadda8824 1 year ago +7

      @@pyronite3323 Wait a bit more. Or if you can get a 6800 XT for less than 500 bucks, that's a steal.

    • @musouisshin
      @musouisshin 1 year ago

      @@pyronite3323 If you can grab the XTX for anywhere close to $700-850, just go for it; it's worth it. Tbh I have a 3-year-old 2070S 8GB and I don't regret it for $170.

  • @MonkeyMan125431
    @MonkeyMan125431 1 year ago +66

    I'd love to see a 3070 vs. 2080 Ti comparison, since most recommended specs specifically mention a 2080 Ti instead of a 3070 for 1440p 60fps ultra settings even though they're comparable in rasterization - most likely due to VRAM.

    • @Keivz
      @Keivz 1 year ago +6

      That would be the most sensible comparison.

    • @knuttella
      @knuttella 1 year ago +10

      And maybe add the 1080 Ti to the mix just for the lulz

  • @Alcatraz760
    @Alcatraz760 1 year ago +21

    Imagine making such a good core architecture and software like DLSS, only for it to be thrown away because of not enough VRAM. It really is the epitome of building an F1 car only to fit it with tyres from a Prius.

    • @od13166
      @od13166 1 year ago

      Outdated ShadowPlay that doesn't have a capture preview

  •  1 year ago +75

    I agree with this. The tiers are now:
    - 24 GB -- High End
    - 16 GB -- Mid Range
    - 12 GB -- Entry Level
    - 8 GB -- "barebones", sub-$200 cards
    This should be the standard in 2023.

    • @FouTarnished
      @FouTarnished 1 year ago +25

      I would agree, but this also highlights the laziness in game development now. There are still more complex games that look better than games like The Last of Us while using half the VRAM. Devs need to take responsibility as well, not just GPU manufacturers.

    • @cairnex4473
      @cairnex4473 1 year ago +1

      I'm sure that makes people who bought 10GB 3080s feel good about their purchasing decision ;-)

    • @FouTarnished
      @FouTarnished 1 year ago +2

      @@cairnex4473 Lol, that card was always a trap; I doubt anyone who bought it feels good about it now. I went for the 3080 Ti and the extra VRAM has been a godsend. Still, I think devs need to seriously re-evaluate their priorities when making games: if a game looks amazing but plays like absolute garbage, it's a bad game.

    • @jalma9643
      @jalma9643 1 year ago +1

      @@FouTarnished Not every developer became lazy; it's just that technology has advanced a lot. I'm not including bad PC ports like The Last of Us or beta-state releases like LOTR: Gollum. What I mean is games like Resident Evil 4 Remake: Capcom really improved the graphics, and we can't blame a company like Capcom for increasing VRAM usage.

    • @thierrybo6304
      @thierrybo6304 1 year ago

      Agree. Me playing with a 1060 3GB ;-)

  • @waynesabourin4825
    @waynesabourin4825 1 year ago +90

    Nvidia planned this, knowing what was coming from working with upcoming devs. All their products had gimped VRAM amounts so they could justify everyone buying new cards sooner.

    • @DebasedAnon
      @DebasedAnon 1 year ago +22

      The worst part is that the customers who got burned by Nvidia's BS will just go "oh, I guess my GPU is getting old" and will buy a newer Nvidia one without a second thought.

    • @cin2110
      @cin2110 1 year ago +1

      Planned obsolescence doing its work

    • @Thysamithan
      @Thysamithan 1 year ago +3

      @@DebasedAnon I have a 3070 and my next card will be an AMD card.
      I have had enough of Nvidia's increasingly annoying nonsense for good, and AMD cards are now more than a viable alternative.

    • @DebasedAnon
      @DebasedAnon 1 year ago +5

      @@Thysamithan I'd say AMD outright won the previous generation before the entire GPU market went up in flames; if cards had stayed at around MSRP, Nvidia would have been trounced across the board.
      If people cared a tiny bit more, the market wouldn't be as bad as it is now. It took the absolute farce that is the 4080 at $1200, plus a recession, to make people NOT buy Nvidia's products.
      With that being said, AMD is being way too passive for my tastes. They could afford to go for the jugular right now due to having vastly lower costs, but they're content to undercut Nvidia by small amounts (relative to the price).

    • @thestig026
      @thestig026 1 year ago +1

      @@DebasedAnon My issue is that, while I know Nvidia is screwing me over, I'm split between gaming and doing AI projects on the side in my free time, so I have no choice but Nvidia since the workflows rely on CUDA. If AMD bothered to push ROCm and make it as viable as CUDA, and made FSR better than DLSS, I wouldn't even consider an Nvidia card.

  • @kolle128
    @kolle128 1 year ago +62

    It's funny how Nvidia is pushing ray tracing, yet the moment you turn it on is when their card drops to its knees because of VRAM, despite probably having superior ray tracing accelerators.

    • @Kuriketto
      @Kuriketto 1 year ago +16

      It's also bewildering how people blame developers and poor optimization, justified or not, for low performance on 8GB cards, yet there doesn't seem to be much outrage about Nvidia pushing RT so heavily to sell 20-series cards back when support for it was practically non-existent, and enabling it now on virtually any card from that generation will bring it to its knees. Is that also poor "optimization" on the part of developers, or Nvidia overpromising and underdelivering?

    • @chronicalcultivation
      @chronicalcultivation 1 year ago +5

      It's not even "probably": the RT cores make a huge difference and the Nvidia cards should be vastly better at it, so the lack of VRAM is without a doubt a big issue.

    • @sammiller6631
      @sammiller6631 1 year ago +2

      @@chronicalcultivation Don't exaggerate the difference RT cores make. Sucking less does not mean RT cores don't suck. And that's only with the usual limited RT.
      The RTX 4090 still only manages 17 FPS in Cyberpunk 2077's RT Overdrive mode.

    • @chronicalcultivation
      @chronicalcultivation 1 year ago +2

      @@sammiller6631 As someone who owns cards from both AMD and Nvidia, and also a PS5: no, it is not an exaggeration. The performance hit taken on an RDNA2 card or console for enabling RT is significantly more noticeable than on an Ampere GPU.
      It's not totally unusable on AMD, but it's a lot worse.
      I was really hoping the 7000 series would catch up and be equal to Ada in that department, but they seem to have only reached Ampere-level RT instead.
      I've always been more of a Radeon guy for almost 20 years, so the last few generations have been a bit disappointing.
      It was better when it was still ATi.

    • @AvroBellow
      @AvroBellow 1 year ago

      It's funny how, despite this, there were enough people stupid enough to buy them.

  • @pb602
    @pb602 1 year ago +94

    I saw 8GB of VRAM being an issue back in 2020, because Shadow of War (a 2017 game) requires more than 8GB of VRAM if you want to use the highest texture pack with all settings maxed. That's why I ended up going with the 6800 XT.

    • @aaz1992
      @aaz1992 1 year ago +1

      Aging like fine wine

    • @jayclarke777
      @jayclarke777 1 year ago +1

      @Steven Turner Shadow of Mordor wasn't a grindfest

    • @Mark13376
      @Mark13376 1 year ago +3

      Far Cry 5, too. I've seen it use almost 10GB when you supersample it.

    • @tkanal1
      @tkanal1 1 year ago +3

      I had a 6700 XT with 12GB and some games could easily fill 9-11GB on the highest preset at 1080p!!! 8GB is not enough... now I have a 6800 XT, and there it's normal for games to use almost all of the 16GB at 4K...

    • @jkahgdkjhafgsd
      @jkahgdkjhafgsd 1 year ago +3

      Modded XCOM 2 used almost 8GB on my 1070 at 1440p in 2016. I upgraded this year and anything less than 12GB was a non-option for me, which meant Nvidia auto-excluded itself.

  • @WilliamOwyong
    @WilliamOwyong 1 year ago +48

    This reminds me of the issues we used to have [way back] when we'd run out of system RAM and Windows started using its swap file, causing massive slow-downs. The main difference being we can upgrade system RAM. If only we could do the same with GPUs.

    • @ArtisChronicles
      @ArtisChronicles 1 year ago +2

      Maybe I should relive those times by running 4GB today

    • @archthearchvile
      @archthearchvile 1 year ago

      @@ArtisChronicles and then you can fix it by using an operating system that doesn't suck

    • @youfreaker
      @youfreaker 1 year ago

      @Moonstomper68 Swapping out RAM works much better on a motherboard because the motherboard PCB isn't completely covered with heatsinks blocking off the memory chips. There's not much need to upgrade VRAM when the GPU's performance is already a given. It's much cheaper to just put on enough VRAM so that it won't run out in situations where the card's raw performance could still hold up.

    • @GMBazillion
      @GMBazillion 1 year ago

      @@archthearchvile And what OS would that be? Hmmmm?

    • @GMBazillion
      @GMBazillion 1 year ago +1

      You cannot do upgradable VRAM, because adding a socketed interface adds more traces and latency. Considering how fast GPU memory is these days, you need every bit of performance you can muster. The only way to do that is to have the VRAM as close to the GPU as possible for the shortest traces between them. Shorter traces = lower latency = faster speeds. As much as I myself would love to see socketed memory module upgrades for GPUs, it would sadly be more of a hindrance than a benefit.

  • @PKDeviluke25
    @PKDeviluke25 1 year ago +220

    Once I had VRAM issues, I ditched my 3080 12GB for a 7900 XTX 24GB. What an amazing card; it just lets me max out anything at 4K with loads of VRAM remaining. I'm on the Team Red bandwagon for now.
    EDIT: When I mention maxing out everything, know that I do not use ray tracing; if you want good RT performance, go with the newer Nvidia cards.

    • @alexts4920
      @alexts4920 1 year ago +1

      Can it play 4K 120 FPS with maxed-out settings?

    • @dybrado1133
      @dybrado1133 1 year ago +12

      @@alexts4920 For which game are you asking?

    • @AdiiS
      @AdiiS 1 year ago +3

      I go green, 4090 > everything

    • @megamanx1291
      @megamanx1291 1 year ago +62

      @@AdiiS The 4090 is too expensive, man

    • @chriswright8074
      @chriswright8074 1 year ago +1

      @@alexts4920 Not exactly

  • @nomustacheguy5249
    @nomustacheguy5249 1 year ago +201

    I couldn't be happier getting my 6800 last year for $480. Apart from the odd driver issue (nothing serious), I'm a happy AMD customer so far. Nvidia became too rich for my taste with the 3000 series. Now both AMD and Nvidia are luxury goods with the newer-gen products.

    • @valtonen77
      @valtonen77 1 year ago +2

      I tried an AMD CPU, and after how bad it was - it didn't work and I couldn't even update the drivers - I'm just never trying AMD again.

    • @PotatMasterRace
      @PotatMasterRace 1 year ago +63

      @@valtonen77 Something's terribly wrong with your basic tech/soft skills.

    • @thefurmidablecatlucky3
      @thefurmidablecatlucky3 1 year ago +8

      Nvidia with their 4000 series GPUs: you're too poor to buy our products, so we don't need you anymore

    • @madtech5153
      @madtech5153 1 year ago +1

      @@PotatMasterRace ?? There is nothing to do, it's just plug and play. Dunno why he had a problem, but if Linus had no problem, it more than likely applies to the general population as well (yes, I'm dissing Linus).

    • @sven957
      @sven957 1 year ago

      What scares me off buying AMD are all these driver issues I read about. For example, they had extremely high idle power usage for a really long time - I don't even know if it's fixed now? And then there's FSR, which is just not nearly as good as DLSS, ray tracing, etc.

  • @jacobvriesema6633
    @jacobvriesema6633 1 year ago +194

    I waffled between the 3070 and 6800 back in 2021, and probably would have bought a 3070 if I could have gotten one anywhere near MSRP. Instead, I bought a $900 Gigabyte RX 6800 Aorus Master. While I wasn't thrilled with the price, I'm glad to hear I made the right choice. HUB's concerns about VRAM were a major deciding factor for me. Thank you, HUB!

    • @TBRANN
      @TBRANN 1 year ago +3

      Best Buy is still selling that for a cool $1400+ 😂😂😂

    • @jacobvriesema6633
      @jacobvriesema6633 1 year ago +1

      @@TBRANN Oof! Newegg still has it listed for $1,000. I probably bought the most expensive RX 6800 out there 😂.

    • @TBRANN
      @TBRANN 1 year ago +4

      @jacobvriesema6633 Hell, $900 is a steal apparently 😳

    • @KaHzAdaNe
      @KaHzAdaNe 1 year ago +2

      I feel your pain at that price too... I had to buy a 6700 XT for 700€ in that same period... If only my GPU hadn't died on me back then, I'd gladly get a 6800 XT for 650€ right now...

    • @DELTA9XTC
      @DELTA9XTC 1 year ago

      I can buy a 6950 XT for 650€ atm lol

  • @thefumexxl
    @thefumexxl 1 year ago +99

    Literally just got my 6800 a week ago and it's been blowing my mind how powerful it is, even with ray tracing.

    • @MrBones105
      @MrBones105 1 year ago +12

      I've had one for a year now and I honestly haven't run into a situation where I'm concerned about performance. It's a really good card!

    • @UKGBManny
      @UKGBManny 1 year ago +4

      Interesting. GTX 1080 user here looking to upgrade! The top card my 9700K can handle is an RTX 3070 or a 6800. The 16GB of VRAM is definitely pulling me to team red for sure.

    • @thefumexxl
      @thefumexxl 1 year ago +4

      @@UKGBManny For the money right now it's one of the best options. Its closest competition is over $100 more. If you can spend another $100 you can get the XT, whose biggest competition is WAY more expensive than that. Unless you can get an AMD 79xx or Nvidia 3080, the rest of the cards that might compare are double to quadruple the price. I paid $370 for mine and am still amazed at how much it can handle.

    • @user-mx4gm6pc2o
      @user-mx4gm6pc2o 1 year ago +5

      Going to get my 6800 for 350 USD tomorrow and selling my 6700 XT; looking forward to getting that thing after seeing all those videos.

  • @rustler08
    @rustler08 1 year ago +194

    The 6700 XT, the 6800, and the 2080 Ti are the only cards you should be looking at in the $350-450 range. Maybe sometime in the future we can add the A770 to that list.

    • @themaxster_
      @themaxster_ 1 year ago +9

      @Cm Not to mention the problems that arise with early GDDR6. If your 20-series card has a certain type of early memory, it could fail at any point now!

    • @steelkinq3708
      @steelkinq3708 1 year ago

      @Cm You've been able to find the 6700 XT new for 399€ for weeks in Germany. The price bump to the 6750 XT is not worth it.
      Example: Mindfactory currently has the Sapphire Pulse 6700 XT for 383.05€.

    • @steelkinq3708
      @steelkinq3708 1 year ago +9

      Steve revisited the A770 after a major driver update, and if I remember correctly the A770 and 6700 XT were around on par.

    • @amcdonald7479
      @amcdonald7479 1 year ago +1

      My Aorus 6800 Master still runs anything I throw at it. As long as my online multiplayer games can pull 144fps on the 5% lows, and no lower than 120 on the 1% lows, I'm happy. As for offline games like Resident Evil or Atomic Heart, still all good. Ray tracing at medium, though.

    • @karma9007
      @karma9007 1 year ago

      @Cm 348€ second-hand here in Denmark, or a bit cheaper, I've seen.

  • @Moshenokoji
    @Moshenokoji 1 year ago +47

    I had a 3070 for a little while; it was a fine card. Ended up with a Radeon 6800 XT not long afterwards and sold the 3070. I don't regret it.

    • @Vis117
      @Vis117 1 year ago +1

      I'm curious what the value of a 3070 will be in a couple more years as this continues to happen.

    • @toxicavenger6172
      @toxicavenger6172 1 year ago +3

      @@Vis117 It'll sell very poorly, I think. The market will be flooded with them in due time.

    • @davidyang9902
      @davidyang9902 1 year ago +1

      I went from a 3060 Ti to a 6800 XT; happy with the decision, especially since I play Call of Duty.

    • @musouisshin
      @musouisshin 1 year ago

      Smart guy

    • @LikeAGroove
      @LikeAGroove 1 year ago

      I did exactly the same thing

  • @ksevinamaniego
    @ksevinamaniego 1 year ago +65

    I'm impressed by the RT performance of the 6800.

  • @bravestbullfighter
    @bravestbullfighter 1 year ago +100

    My 16GB RX 6800, bought two years ago at $579 MSRP, aged really well.

    • @elmSTREETnasty
      @elmSTREETnasty 1 year ago +4

      That's what I paid for my Gigabyte Gaming OC 6800 XT a few months ago. About $585 after shipping.

    • @amirhasamtemori1843
      @amirhasamtemori1843 6 months ago

      Bro, I got mine for only 430 euros from XFX

    • @Karimftw0.0
      @Karimftw0.0 6 months ago +1

      lol I'm buying mine for 300 rn

    • @stefannita3439
      @stefannita3439 6 months ago +1

      Got a 3070 for 600 euros in 2020 (GPUs cost more here than in the US). This week I found a one-year-old 6800 XT for 360 euros on the used market and was able to sell my 3070 for about 300. Can't wait to enjoy the next few years with this beast of a card; it's still so powerful at 1440p!

    • @NadeemAhmed-nv2br
      @NadeemAhmed-nv2br 6 months ago

      @@Karimftw0.0 Yeah, but you're getting that performance 4 years late, while I've been using my RX 6900 XT for 4 years now. Getting that performance half a decade later isn't as big of a flex when we'll just move on to something with 2x the performance.

  • @Finalterra1396
    @Finalterra1396 1 year ago +83

    If the rumored 8GB of VRAM on the 4060 and 4060 Ti turns out to be true, I am very much looking forward to seeing HUB's review, as there is no way either of those products will be priced accordingly.

    • @hugobalbino2041
      @hugobalbino2041 1 year ago

      The RTX 4060 will be bought by miners... cryptocurrency...

    • @Bokator2
      @Bokator2 1 year ago

      It's joever

    • @josh2482
      @josh2482 1 year ago +17

      @@hugobalbino2041 Do you live under a rock? Crypto is dead. Nvidia even came out and said that cryptocurrency provides no value to society. Lmao.

    • @hugobalbino2041
      @hugobalbino2041 1 year ago

      @@josh2482 8GB cards are fine for crypto... not so much for gamers... even 12GB cards... As for me, I don't pay electric bills because of my solar power... so yeah, I don't live under a rock... and there are still GPU miners out there...

    • @danieloberhofer9035
      @danieloberhofer9035 1 year ago +6

      The sad part is that yes, the 4060 (Ti) will get scathing reviews for only having 8GB of VRAM, and yes, they'll sure as hell be overpriced for it - but people will still buy them.
      And, from what it looks like, AMD won't go ahead and do the smartest thing they could do: launch both the 7800 XT *and* the 7700 XT with 4 MCDs and 16GB. It's such a pity, and I really hope they reconsider. Already having a chiplet design implemented for RDNA3 would certainly give them the chance, unless binning for the GCD also includes defects in the fabric interconnect to the MCDs.

  • @VredesbyrdNoir
    @VredesbyrdNoir 1 year ago +70

    Far out, I was tossing up between a 3070 Ti and a 6800 XT late last year, and I'm really glad I went with the Radeon GPU now! Great work as always, Steve.

    • @jays8287
      @jays8287 1 year ago +4

      I had the same dilemma earlier this year and went with the 6800 XT. Very happy with my choice.

    • @Magaswamy716
      @Magaswamy716 1 year ago +3

      no DLSS...

    • @ck275
      @ck275 1 year ago +15

      @@Magaswamy716 FSR is pretty decent

    • @sirab3ee198
      @sirab3ee198 1 year ago +1

      Unfortunately, people still buy an RTX 3070 Ti over the RX 6950 XT in some countries where prices are similar 😢😮

    • @sirab3ee198
      @sirab3ee198 1 year ago +8

      @@Magaswamy716 man, you guys never give up........

  • @user-kq7fe1ny7n
    @user-kq7fe1ny7n 1 year ago +40

    This video perfectly reflects what it feels like to have bought a GPU with 8GB some time ago and to be suffering with it now. I would be glad to see a comparison between the RTX 3060 Ti and RX 6700 XT, as these are also direct competitors. Good job!

    • @RealElevenTimes
      @RealElevenTimes 1 year ago +7

      They are not, though. The RX 6700 XT crushes the 3060 Ti everywhere except ray tracing.

    • @cccpredarmy
      @cccpredarmy 1 year ago +1

      I've been gaming for 1.5 years now on a 6700 XT at all ultra settings in 1440p (with minor tweaks here and there). If I EVER experienced ANY stutters or anything I didn't like, I'd instantly switch to a better card. But I don't have to, because EVERYTHING TODAY RUNS BUTTER SMOOTH!

    • @Aggnog
      @Aggnog 1 year ago

      Are you really suffering? Just drop texture quality one step and this won't even affect you. But if you swap the situation around and high texture quality didn't exist, everyone would blast AMD for wasting money on RAM they don't need. Funny how that works.

    • @pokemongo-py6yq
      @pokemongo-py6yq 1 year ago

      Similar situation here. On the other hand, I tend not to use ray tracing and use a mix of low/medium settings, so I don't think 8GB of VRAM matters too much for my use cases. The testing shows the 3070 is kinda bad, but my 3070 will do just fine until it dies, I'd say.

    • @alexworm1707
      @alexworm1707 1 year ago

      I'm okay with my 5700 XT 8GB; its performance is nothing to write home about to begin with lmao

  • @stealthshock1
    @stealthshock1 1 year ago +3

    Some people in the comments are missing the point. Limited VRAM is planned obsolescence; most people don't upgrade their GPU often, and it intentionally keeps a high-end graphics card from reaching its full potential.

  • @murdergang420
    @murdergang420 1 year ago +337

    As an owner of both GPUs, I approve of this video.

    • @murdergang420
      @murdergang420 1 year ago +9

      I noticed it in CoD, Forza and other games on the 3070.
      Any questions, feel free to ask 🙏

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 1 year ago +2

      Does overclocking the memory help mitigate the issue?

    • @jinx20001
      @jinx20001 1 year ago +7

      @@murdergang420 Have you noticed who sponsors the vast majority of these problem games, though? I seriously feel like I'm the only one who notices this coincidence. Every game in this video with serious problems is an AMD-sponsored title; that means they worked closely with AMD when developing the game. I'm sure it's nothing though, right? No way AMD would be turning some knobs here to make games run badly on certain Nvidia cards while taking advantage of the one huge advantage AMD has... nah, couldn't possibly be, it's just "Nvidia BAD" and we need to remember that.
      It's not like anything like this has happened before, say with Nvidia tessellation, which crippled the opposition's cards. Honestly, I'm very surprised outlets like HUB don't at least acknowledge this and question, even a little, whether something underhanded and fishy is going on here when every game with a problem has AMD's hands on it - always those games that come out with terrible RT as well. Just a coincidence, I'm sure they have no say at all lol.

    • @slylockfox85
      @slylockfox85 1 year ago +33

      @@jinx20001 So do you have the same issues with video editing software like Adobe's running better on Nvidia hardware? Because it's been that way for the longest time, and I've never seen conspiracy theories about that performance. You should be more upset with Nvidia for putting out a crippled card with 8GB of VRAM. If they had at least put out a competent product, it would be easier to entertain accusations of bad optimization.
      That said, when AMD got the console contracts from Sony and MS, it was predicted that game optimizations might start to favor AMD.

    • @crib8007
      @crib8007 1 year ago

      @@murdergang420 Have you experienced shader-cache stutter in some games with the 6800? Destiny, for example, stutters heavily for a bit and then gets better over time. I just find it annoying that this happens after every GPU driver update. It makes me not want to update the driver.

  • @JamesSomersetActor
    @JamesSomersetActor 1 year ago +16

    It's funny to me that you got comments complaining about your VRAM criticisms. Part of the reason I went with the 6800 XT (aside from the fact that I was able to get one for not much more than a 3070, and could have gotten a 6800 for a bit less than a 3070 if I'd gone that way) was because of your VRAM comments.
    I am, therefore, personally very grateful for your VRAM criticisms.

    • @madtech5153
      @madtech5153 1 year ago +1

      A lot of the people who criticize HUB's VRAM criticism say that speed is all that matters and size doesn't. It was already quite clear back in The Last of Us Part 1 benchmarks that speed doesn't help when there isn't enough VRAM.

    • @BitZapple
      @BitZapple 1 year ago +1

      The 3070 Ti with 8GB is faster than the 6800, but he chose to compare two different tiers of performance and keeps riding on bottlenecking issues when trying to run ultra on badly optimized titles with a years-old mid-range card.

    • @mmis1000
      @mmis1000 1 year ago

      I actually ran into VRAM problems when using my old RX 580 8GB. I sometimes needed to close the browser or kill dwm.exe so a game would run smoothly. That was in 2021, when I decided to upgrade my graphics card. And then I saw the 3080... 10GB. I was like, "what the hell? How on earth is 10GB enough in 2021?" I ended up going with an RX 6900 because Nvidia cheaped out on VRAM for the whole 30 series.
      Nvidia plays a good game here.

  • @NashvilleUK
    @NashvilleUK 1 year ago +61

    I managed to get an RX 6800 for RRP on launch day - a very savvy purchase in hindsight 😀
    Keep up the good work, HW UB

  • @HuebietheGuru
    @HuebietheGuru 1 year ago +13

    Glad I sold the 3070 two months after purchasing it. I was very disappointed with it and bought a 6800 XT - still in use and happy 😅

    • @HuebietheGuru
      @HuebietheGuru 1 year ago

      @@siwexwot8994 Yes, that's what we saw in the video. Dunno why you mention it..? 🤔

    • @HuebietheGuru
      @HuebietheGuru 1 year ago

      @@siwexwot8994 You can still use it in combination with DLSS...

    • @HuebietheGuru
      @HuebietheGuru 1 year ago

      @@siwexwot8994 And that's the point. The 3070 is only a 1080p card today due to its lack of memory... The takeaway is: the 6800 XT was and still is the better buy. 😉👋🏼

    • @HuebietheGuru
      @HuebietheGuru 1 year ago

      @@siwexwot8994 Yeah, you may be right, but lowering texture quality in 2023 is not future-proof. I'm still glad that I sold my 3070 two months after purchase and got a 6800 XT in my hands. FSR 2.x and 3.x will keep it usable even longer than the 3070 will ever be. That's my personal opinion 😉

  • @belema22
    @belema22 1 year ago +29

    Makes a whole video to say I TOLD YOU SO! No, but seriously, great work. I bought a 6800 2.5 years ago based on your original videos!

  • @Ilestun
    @Ilestun 1 year ago +16

    The crazy thing about the 3070 is that many people will upgrade to a 4070/4070 Ti, and that card will face exactly the same issues SOON.

    • @Threewlz
      @Threewlz 1 year ago +1

      Great, now we just gotta spend 1k dollars on a card and be done with it, right? What a joke.

    • @Ilestun
      @Ilestun 1 year ago +1

      People who spent €900 on the 4070 Ti shouldn't sleep well; it's absolutely not a future-proof card despite the abusive price.
      Even at close to 1k, you can still get a card that isn't future-proof.
      Nvidia is literally abusing its fanatics.

    • @modernlogix
      @modernlogix 1 year ago +2

      @@Ilestun It (the 4070 Ti) is already seeing stuttering at 4K in some games with RT. It's not even present-proof, lol. Let alone "future-proof".

    • @phattoni2117
      @phattoni2117 1 year ago +2

      @@Ilestun True, but the 7900 XT already can't hit 60fps in Cyberpunk maxed out at 1440p because of slow RT performance. A 4070 Ti can exceed 60fps in just about any game currently available.
      I think both cards will eventually run into issues for different reasons this gen. Nvidia was greedy, short-changing gamers by 4GB of VRAM, while the 7900 XT is just a more powerful GPU in rasterized performance but is still dragging its feet with RT. AMD had better come up with a viable frame-generation alternative in drivers soon as well. Otherwise the rasterized performance advantage won't hold in upcoming games either.

    • @Ilestun
      @Ilestun 1 year ago +2

      @@modernlogix How many times have I told people to avoid upgrading to the 4070 Ti because of the VRAM shortage... but you just cannot stop them from spending their money badly, I guess.

  • @fran117
    @fran117 1 year ago +47

    This proves that what Nvidia did, giving only 8GB on their midrange, was smart for their business lol

    • @madtech5153
      @madtech5153 1 year ago +2

      they are geniuses indeed.

    • @amineabdz
      @amineabdz 1 year ago +4

      Not really, in the current market, everyone will be pivoting to AMD for cheaper, yet better cards. It was actually a disastrous move if you ask me.

    • @benjaminoechsli1941
      @benjaminoechsli1941 1 year ago +9

      ​@@amineabdz I hope so, but I'll believe it when I see it. AMD cards have been great since RDNA 2, and growth has been slow. That mindshare is incredible.

    • @madtech5153
      @madtech5153 1 year ago +1

      @@amineabdz ill believe that the moment in steam hardware shows amd (discrete) in top 10.

    • @kazuhu580
      @kazuhu580 1 year ago

      @@amineabdz Probably not, AMD and Nvidia are just like Apple and other Android brands. It doesn't matter if the product is better and cheaper, the brand name matters the most and we already seen that with Iphone SE 2/3

  • @ademkin
    @ademkin 1 year ago +75

    Still rocking my 1070 I bought in 2016. Absolutely love this card; it's been enough to run recent games, granted at medium quality, but they still run and I can play in good conditions. I'm in no hurry to upgrade my GPU and I'm not a compulsive buyer, and thank god for that seeing the state the GPU market is in. Hardware Unboxed is one of the channels whose advice I'll be sure to follow to decide which GPU I want to upgrade to.

    • @ZeroG
      @ZeroG 1 year ago +8

      Whatever makes you happy, that's the important thing in life. This man has his head screwed on straight!

    • @cameronblazowich2671
      @cameronblazowich2671 1 year ago +3

      I agree, the 1000 series may have cost more than previous generations at first, but boy did they have longevity. I picked up a GTX 1070 Ti at the beginning of 2022, and man do I wish I had gotten it sooner! Nvidia and now AMD have lost their way, at least when it comes to the gaming side of things, Nvidia more so

    • @SquillagusNiggle
      @SquillagusNiggle 1 year ago +2

      Same here, EVGA 1070 FTW, an absolute diamond of a card. Lasted longer than any GPU I've ever bought, lasted through COVID and crypto, and plays almost everything I care to throw at it in near-silence. Just replaced it with a cheap second-hand 6800XT to last me until the next 'good' GPU release cycle whenever that will be, but that's only because I'm building an entire new system from scratch.

    • @ArtisChronicles
      @ArtisChronicles 1 year ago +1

      ​@@cameronblazowich2671 if we take AMDs perspective into consideration, then the only way they continue to increase profit margins is to charge their 5 customers even more money for their products.

    • @korana6308
      @korana6308 1 year ago +3

      1070 is absolute beast of a card

  • @Unknown-ve6vp
    @Unknown-ve6vp 1 year ago +72

    You should make a video comparing the RTX 3080 (12GB VRAM) and the RX 6800XT. I want to see if 12GB VRAM is sufficient.

    • @YT_243
      @YT_243 1 year ago +2

      I also want to see this comparison.

    • @mimon6706
      @mimon6706 1 year ago +38

      12GB is fine for now.
      But I'm sure in 2 years the 3080 12GB and the 4070Ti will suffer from the exact same problems the 3070 has now.

    • @toothlessarcher
      @toothlessarcher 1 year ago +7

      I have run out of VRAM in one game with my 3080: Ghost Recon Breakpoint, with everything cranked to ultra at 3440x1440. It will run out of VRAM and crash after about half an hour to an hour of gameplay, at least for me, and I will get the error that it is out of VRAM. So far it's the only game I've had that issue with.

    • @Unknown-ve6vp
      @Unknown-ve6vp 1 year ago +2

      @@toothlessarcher Do you have the RTX 3080 with 12 GB VRAM or the one with 10 GB VRAM?

    • @toothlessarcher
      @toothlessarcher 1 year ago +5

      @@Unknown-ve6vp 10GB

  • @flashmutex
    @flashmutex 1 year ago +112

    I paid a shit load of money for my 6800 XT during the shortage (twice of what it is now). Still extremely happy with it and still glad I did. Happy to see it still performs so well today.

    • @conchobar
      @conchobar 1 year ago +4

      Same here. Felt lucky at the time to ONLY pay $900 for an AIB 6800 in 2020. I think I got my monies worth.

    • @joaopescusa
      @joaopescusa 1 year ago +2

      Same. Actually, at that time in my country 6800XT and 6900XT were roughly the same on price. I ended up getting the 69 one. Glad 68 users are still in good shape

    • @Kamikatze_Gaming
      @Kamikatze_Gaming 1 year ago

      Same here :)

    • @SpeedRebirth
      @SpeedRebirth 1 year ago

      Also got the 6800xt, at launch price. 8GB was a big no no. It was very obvious that it was too little.
      Just like the GTX 580 that came in 1.5 and 3GB variants. The 3GB lasted years longer. Same with the 3/6GB 1060

    • @riba2233
      @riba2233 1 year ago +2

      Paid

  • @PeerensClement
    @PeerensClement 1 year ago +38

    Great video Steve, thanks. Another thing to consider is that calling these cards "aging"... for a lot of people, these cards have only recently become available to buy, due to shortages and pricing. And they are currently still the newest 'midrange' cards available! So these are actually current gen cards struggling!

    • @bendavis3545
      @bendavis3545 1 year ago +3

      Makes me wonder where the "value" is...

  • @stefensmith9522
    @stefensmith9522 6 months ago +6

    Well well well, the anti-AMD man himself has no choice but to speak well of AMD. Even in ray tracing the 6800 slaps the 3070 back to the beach the silicon was made from.😂

  • @HWMonster
    @HWMonster 1 year ago +112

    Thanks for the revisit, Steve! As I said over 2 years ago: the RX 6800 was probably the best graphics card released in recent years. It was faster, more efficient and more future-proof than the 3070 from the start, and yet the 8GB VRAM cripple had to be hyped because it can do the oh-so-great DLSS and has much better ray tracing performance. Good job, press and influencers.

    • @noanyobiseniss7462
      @noanyobiseniss7462 1 year ago +1

      ruclips.net/video/kKSBeuVlp0Q/видео.html

    • @thodorisi.2889
      @thodorisi.2889 1 year ago +5

      Yes, but in most places in Europe it was out of stock, or when it was in stock it was more expensive than a 3080 10GB

    • @Nimbus3000
      @Nimbus3000 1 year ago +8

      They can only review what they know, whether games will end up using more VRAM imminently is speculation and the reality is VRAM usage of games *in general* has been pretty stagnant for years. (Largely due to consoles, imo) I agree that Nvidia shouldn't be skimping on memory size or memory bus, but you can't really blame press for reviewing the cards they got at the time they got them with the information they had.

    • @mircomputers
      @mircomputers 1 year ago +1

      yes, the efficiency in every metric is incredible, but that got it very close to the XT version in real world prices

    • @DlAlYlVlilD
      @DlAlYlVlilD 1 year ago +3

      Comparing the 3070 to the 6800 while skipping the 6700 XT is like comparing the 6800 XT to the 3090 while skipping the 3080... it depends on whether you're a red or blue team fanboy. Why not compare the 3070 to what AMD was promising as its direct competitor at the time, the 12GB 6700 XT? Both of those companies crapped on gamers equally in recent years.

  • @AntonioLexanTeh
    @AntonioLexanTeh 1 year ago +17

    Whole argument about choosing nvidia for gaytracing in the midrange is officially debunked here. 😂
    I dunno why people keep funding scammy practices and crazy prices by nvidia

    • @rpospeedwagon
      @rpospeedwagon 1 year ago +5

      I'm not an Nvidia homer. But I did get the 3090 and 4090. I can confirm ray tracing still sucks. It's simply not worth it yet and won't be for 5-7 years min.

  • @Alirezarz62
    @Alirezarz62 1 year ago +49

    This is one of the reasons I went from a 3070 to a 6900 XT: the extra VRAM, and also, shockingly, I had to spend no additional money on the 6900 XT! The performance has been great for the last 6 months and I'm quite pleased with the AMD card. Also, knowing that AMD cards age much better than Nvidia cards makes me think maybe I should stick with AMD from now on

    • @ChiplockRock
      @ChiplockRock 1 year ago

      how are the drivers working for you? I'm very tempted to go red, but I don't know if AMD has resolved most of their driver issues.

    • @Alirezarz62
      @Alirezarz62 1 year ago +2

      @@ChiplockRock To my surprise I have yet to encounter any problem with the drivers; every game that I have played and tested worked great. I don't know about productivity applications though, as I only use Photoshop and Premiere Pro; anything else is CPU related

    • @m8x425
      @m8x425 1 year ago +1

      @@ChiplockRock I've had a good number of Radeon cards... most of the driver issues I've had were when I tried obscure stuff like connecting multiple monitors of odd sizes, and when Windows automatic updates borked the drivers. My Radeon VII is also picky about which driver it wants to work with, and my X570 Crosshair VIII board does not like my Radeon VII.
      AMD has always had a shaky history with drivers... and things like taking months to get RDNA 1 drivers right have only hurt them.
      I bought a 7900 XTX Red Devil on launch day and no problems so far. My only complaint is there is a 25°C difference between the hotspot and regular GPU temps. Otherwise the card doesn't get too hot.

    • @hobmaniac
      @hobmaniac 1 year ago +1

      ​@@ChiplockRock just got a 7900 XTX (new build from a 2080ti) and have no issues, also the AMD software is way better than nVidias. I think the driver thing isn't really as applicable nowadays as it was in the past.

    • @ItsFaber
      @ItsFaber 1 year ago

      What size PSU do you have? And what CPU do you have paired up with it? I've been looking at maybe going a different route from my 3070

  • @nhand42
    @nhand42 1 year ago +61

    You are accurate as always. 8GB is now budget tier. 16GB is mid-tier and 24GB for top-end. When Nvidia launched (and then quickly unlaunched) the 4080 12GB it was such an obvious mistake. Even 16GB gives me pause because 4K gaming and SS needs more VRAM than ever.

    • @Dell-ol6hb
      @Dell-ol6hb 10 months ago +1

      especially with raytracing too

    • @yonislav_9821
      @yonislav_9821 10 months ago

      Really, I don't know how people trust them when they keep skimping as much as they can while shamelessly setting higher prices.

    • @CSTITAN576
      @CSTITAN576 9 months ago +2

      I think 10-12 gb is mid tier. 16 and above is still pretty high end

    • @shinefake48
      @shinefake48 9 months ago

      16GB may not technically be mid-range considering the current generation of games. I would say 8GB would be entry level like 4GB back then, 10GB somewhat standard like 6GB was, and 12GB the mainstream for GPUs, and 16GB should be standard for cards that can run 4K, unlike the 3080 and 3080 Ti with that 12GB

  • @Nate_the_Nobody
    @Nate_the_Nobody 1 year ago +9

    I just recently switched from team green (GTX 1070 8GB) to team red (RX 6900 XT 16GB) and I gotta say, that extra VRAM is nice. The Resident Evil 4 remake uses a whopping 13.8GB of VRAM at 1440p if you set everything to max (with ray tracing and hair), so 16GB does seem like the way to go moving forward. Which isn't shocking, considering there was a time when we were jumping to double, triple, or quadruple the then-acceptable VRAM amounts back in 2007-2010. I remember having a 128MB card and moving to a 768MB one; I remember having a 1.5GB card and moving to a 4GB, 4GB to 8GB, etc. As the years go by, we need more VRAM, it's that simple.

  • @MrMcp76
    @MrMcp76 1 year ago +65

    I have the 3070 Ti and this video is 100% accurate. At 1080p the lack of VRAM was constantly producing a stutter-ridden mess. At 3440x1440 the card can't handle any game without frametime spikes and stutters. Turning my character around in games like No Man's Sky, Cyberpunk and Horizon Zero Dawn causes the gameplay to pause and detail/objects to pop in and out.
    I've been on the fence about this, but this video has convinced me to go with the 7900 XTX. With its VRAM and raster performance, I think that card will age the best.

    • @tonymorris4335
      @tonymorris4335 1 year ago +3

      This... isn't VRAM related. No Man's Sky isn't filling the 8GB of VRAM, and neither is Horizon Zero Dawn. Your stutters in these titles aren't caused by VRAM because they don't fill the buffer.
      Seriously, go look up comparisons of your card to the 6700 or 6800 and look at the 0.1% or 1% lows in those two titles; they are pretty close, which means stutters aren't happening as a general rule for people with 8GB of VRAM vs 16.

    • @MrMcp76
      @MrMcp76 1 year ago +3

      @@tonymorris4335 My VRAM sits at 7892MB used in Horizon, and similar in the other games. I've used both HWiNFO and Afterburner to monitor it, as well as Nvidia's own monitoring tool to confirm.
      And even if it wasn't the VRAM, then what is causing the stutters? Poor GPU performance? Poor driver performance? Neither one makes me feel better.
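
A sanity check like the one described above (a ~7892MB reading on an 8GB card) reduces to simple arithmetic. This is an illustrative sketch only, not anything from the tools mentioned; the 90% threshold is an assumption, since engines start spilling to system RAM at different points:

```python
# Hypothetical headroom check: is a reported VRAM reading close enough
# to the buffer size to explain spills and stutters?

def vram_pressure(used_mib: float, capacity_gib: float, threshold: float = 0.90) -> bool:
    """Return True when usage is near capacity (spill/stutter risk)."""
    capacity_mib = capacity_gib * 1024  # GiB -> MiB
    return used_mib >= threshold * capacity_mib

# The 7892 MiB reading quoted above:
print(vram_pressure(7892, 8))   # an 8 GiB card is right at the limit
print(vram_pressure(7892, 16))  # the same load is comfortable on 16 GiB
```

In practice the reading itself would come from HWiNFO or Afterburner as the commenter says, or programmatically from something like NVML; the sketch only illustrates the comparison.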

    • @ALmaN11223344
      @ALmaN11223344 1 year ago +3

      100% correct, my old 3070 was shit compared to my 7900 XTX and the only thing I don’t like is the heat with everything else being vastly superior.

    • @luisangelini2220
      @luisangelini2220 1 year ago

      @@ALmaN11223344 Same here on both cards. Received my 7900xtx 5 days ago and it's amazing.

    • @luisangelini2220
      @luisangelini2220 1 year ago

      @@tonymorris4335 You are so wrong.

  • @MenTal9R
    @MenTal9R 1 year ago +160

    Nothing quite like some fine wine 🍷😂
    Great job Steve! Id love to see this same type of comparison with the 3080 10GB & 12GB against the 6800XT.

    • @odi8359
      @odi8359 1 year ago +16

      Exactly! This would be a nice follow up video, probably including 4K resolutions.

    • @discipleofdagon8195
      @discipleofdagon8195 1 year ago +20

      only difference is that the 3080 price wise is more comparable to a 6950 xt

    • @taipeitaiwan10
      @taipeitaiwan10 1 year ago +4

      So glad I got the 12gb version when prices became low. Getting the 3070 would have caused buyers remorse.

    • @declde
      @declde 1 year ago +2

      The 6800 XT also has 16GB, the cheaper 6700 XT and 6750 XT came with 12GB, they were more comparable to 3060 (Ti).

    • @zaidlacksalastname4905
      @zaidlacksalastname4905 1 year ago +1

      Wish got granted.

  • @megumi_0
    @megumi_0 1 year ago +6

    I'm coming back to this video 2 weeks later to say thank you; you single-handedly changed the consensus on the minimum VRAM requirement for 2023 onwards

  • @highmm2696
    @highmm2696 1 year ago +151

    Would be awesome to compare the 3080 10GB to a 6800XT 16GB in these games.
    They are pretty closely matched in rasterisation performance, I wonder if the AMD card pulls away massively with more RAM.

    • @JamesSomersetActor
      @JamesSomersetActor 1 year ago +6

      I would also like to see this comparison!

    • @yasunakaikumi
      @yasunakaikumi 1 year ago +25

      not with VRAM though; as a 3080 and 3090 user, I can tell you that 10GB isn't even that much of a difference from an 8GB card, since games that need more than 8GB usually just go to 12-14GB

    • @OC.TINYYY
      @OC.TINYYY 1 year ago +26

      6800 XT is closer to 3080 ti/3090 performance in rasterization in newer games, outside of 1 or 2 random titles. Sometimes exceeds that.
      The 3080 would get shit on.

    • @lhv2k
      @lhv2k 1 year ago +3

      @Kevin P. Lmao NO, wtf are you watching to say bullshit like this? Just check the recent game benchmarks from HUB; the 3080 10GB still beats the 6800 XT in EVERY GAME, even at 4K. The latest ones are TLOU, Hogwarts and Spider-Man; in fact the 3080 10GB even beats the 6900 XT and 6950 XT in Hogwarts and Spider-Man respectively at 4K ultra, and the 6800 XT by like 10-15%. And if you check other reviewers, the 3080 is also better in other new games like Dead Space or A Plague Tale. The only recent game where the 6800 XT can outperform the 3080 10GB was CoD MW2, and everyone knows this game is just an outlier that heavily favors AMD. Imagine saying "the 6800 XT is on 3090 level" while in the very recent TLOU video the 3090 was faster by like 30% at 4K. Stop coping.

    • @madscientist6843
      @madscientist6843 1 year ago +13

      The first 6 games tested used 12-14GB of VRAM, so the 3080 10GB is still going to struggle

  • @alexs.2069
    @alexs.2069 1 year ago +82

    Expanding this comparison with an rtx 3080 10GB could be very interesting, thanks for the great work, steve!

    • @dandaly7305
      @dandaly7305 1 year ago +1

      Also the 12 GB RX6750XT for those without the bigger bucks

    • @nightdark2616
      @nightdark2616 1 year ago +2

      I have both a 3080 and a 6700 XT.
      I can tell you that 10GB is not enough, and it is also a really weird amount, because the way texture packs work in most games, you either have settings that require 8 or 12.
      Having 10GB just gives more headroom for 8GB textures but is still below the requirements for 12GB, so it's very weird.
      The 6700 XT can handle higher texture settings in games I play now like Resident Evil 4 and The Last of Us. The 3080 is stronger, but the 10GB means you have to use lower texture settings anyway, so it's like wtf... you play without ultra settings, what's the point then.
      I would recommend at least 12GB of VRAM from now on, and I mean really a minimum of 12GB, because I think even that is a risk. We will most likely need 16GB of VRAM soon for new games coming out this year and especially in 2024.
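
The tiering described above (texture presets that roughly want 8 or 12GB) boils down to picking the highest preset whose budget fits the buffer. A sketch with made-up preset names and GiB budgets, purely for illustration:

```python
# Hypothetical texture presets, highest tier first: (name, approx GiB needed).
TEXTURE_PRESETS = [
    ("ultra", 12),
    ("high", 8),
    ("medium", 6),
]

def best_preset(vram_gib: int) -> str:
    """Pick the highest texture preset whose budget fits in VRAM."""
    for name, need in TEXTURE_PRESETS:
        if vram_gib >= need:
            return name
    return "low"

print(best_preset(10))  # a 10 GiB card clears the 8 GiB tier but not 12, hence the awkward middle
print(best_preset(16))
```

Under these assumed budgets, a 10GB card lands on the same preset as an 8GB one, which is the "weird amount" the comment is pointing at.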

    • @madtech5153
      @madtech5153 1 year ago

      or rtx 2080ti

    • @BitZapple
      @BitZapple 1 year ago

      The 3070 Ti would have been the same performance tier as the 6800.
      The 3080 is matching the 6800 XT.

    • @nightdark2616
      @nightdark2616 1 year ago +1

      @@BitZapple this isn't about performance, it is about Vram.

  • @jamescavanaugh8211
    @jamescavanaugh8211 1 year ago +39

    I'd be curious to see how the 12 GB 3060 stacks up against the 3070 in those newer AAA titles.

    • @peterpan408
      @peterpan408 1 year ago

      Probably a bit slow, but not choking on Vram. The 3060 is a 60+ fps experience whereas the 3070 was intended as HFR capable.
      The 3070 is likely faster if it is not choking on VRAM.

    • @kingeling
      @kingeling 1 year ago

      Likely only slightly worse rather than totally teeth-beaten as it should be

    • @luciferxyXX
      @luciferxyXX 1 year ago +1

      The 3060 on average would be slower, but in unoptimized games it will have much better 1% low frame rates.

    • @cee_M_cee
      @cee_M_cee 1 year ago

      I mean, it would still be slower rasterization-wise, but at least the frametimes are far, far more stable
      and it wouldn't have that weird voodoo texture-flipping magic that the 3070 does on Hogwarts Legacy

    • @XPuntar
      @XPuntar 1 year ago +3

      @@peterpan408 I am not sure, because the 12GiB version is ~25% SLOWER than its 8GiB Ti variant!
      While the 3070 is nearly ~50% faster when VRAM is not an issue!
      I very much doubt you would get 60fps at ultra settings with this "turd" of a card despite it having 12GiB of VRAM

  • @leonavis
    @leonavis 1 year ago +3

    lol. Back in 2021 the saying was "If you just wanna use rasterization, buy AMD, if you wanna use RT, buy nvidia". Funny how that turned out.
    For me it was never a question, since nvidia doesn't give us Linux-nerds open source drivers.

  • @KorbiCS2
    @KorbiCS2 1 year ago +32

    Great video Hardware Unboxed.
    As a 3070 owner, I feel ripped off, having spent a decent amount on this card thinking it would be good for years to come. I'm extremely frustrated now, since in most new games, even at 1080p, I'm running out of VRAM. Selling the used GPU is going to get me nothing in terms of price, and the new RTX cards are outrageous; frankly I'll never buy an Nvidia product again. The 6900 XT is available and cheaper now, but I'm not sure if I want to buy a new card that's already last gen, even if it's better than the 3070. Meanwhile, the 7900 XTs and such seem to be quite expensive.
    I'm torn now between waiting and dealing with this absolute hot garbage product, or emptying my wallet into an AMD GPU. Or waiting until next-gen AMD. I have no idea.
    Nvidia is terrible and extremely anti-consumer. My next GPU will be AMD and I cannot wait to be able to play games normally on a card that wasn't made to be obsolete in just a few years.

    • @Justathought81
      @Justathought81 1 year ago +5

      Totally agree, I was shocked to see these issues at 1080p. Crazy, I feel for you bro. At this point I'm done with Nvidia.

    • @Sparksman23
      @Sparksman23 1 year ago +6

      Give the Intel A770 a try then. It's cheap, has 16GB of VRAM, supports ray tracing, and Intel has been making a lot of improvements to their drivers. Buy it from somewhere with a good GPU return policy in case it doesn't work out for you.

    • @Vis117
      @Vis117 1 year ago +3

      I bet you could sell a 3070 and buy a 6800 or 6800XT for a minimal difference. We will also have 7800XT and 7700XT sometime this year.

    • @qwerty_uiopas_dfgh
      @qwerty_uiopas_dfgh 1 year ago

      I had a lot of issues with drivers on the RX 6700 XT, especially with old games, where it performs almost like my old GTX 1060; in new games everything was OK. Look at internet forums, a lot of people have the same issues. I'm going back to Nvidia. AMD drivers are not good.

    • @KorbiCS2
      @KorbiCS2 1 year ago

      7800 XT sounds promising!

  • @TheLingo56
    @TheLingo56 1 year ago +122

    I remember feeling a bit burnt when my 1GB 560 Ti couldn't play PS4-era games once the generation swapped over. Funny to see this happening all over again.
    Nvidia has *always* been stingy with VRAM. Even back in the day it was easy to get Radeon GPUs with over double the VRAM for the same price. In 2015 the GTX 960 only had 2GB, and the 970 famously only had 3.5GB of full-speed VRAM!

    • @Stratos1988
      @Stratos1988 1 year ago +16

      Let's not forget about the 1030 with DDR4. VRAM so crap it shouldn't even be called a 1030.

    • @AvroBellow
      @AvroBellow 1 year ago

      The question is, did you learn from the 560 Ti or did you bend over for Jensen again?

    • @user-zl8sm6ok2r
      @user-zl8sm6ok2r 1 year ago +16

      Same! I remember asking on a forum whether I should get the 560 Ti 1GB or 2GB, and everyone recommended the 1GB. I mean, people took you for an idiot if you dared to recommend the 2GB version. But within just 1 year, Battlefield 3 came out and destroyed that 1GB of VRAM. I was so mad. Yet 12 years later, history has repeated itself.

    • @vega7481
      @vega7481 1 year ago

      You Western cuckold clowns have worn everyone out with this eternal mention of the 970's 3.5GB, as if each of you only found out about it yesterday and is rushing to tell everyone else who "doesn't know"

    • @supabass4003
      @supabass4003 1 year ago +4

      Goes all the way back to the GeForce 3/4 days.

  • @singular9
    @singular9 1 year ago +64

    Another reason to buy AMD: more VRAM at every price point since the beginning.
    It's been keeping the R9 390 and RX 480 relevant with 8GB of VRAM, and on the new stuff 12/16GB should be standard.

    • @maxpower18
      @maxpower18 1 year ago +9

      They don't offer more VRAM at every price point. All owners of a 6650XT and lower have the same "problems" as 3070 or 3060ti owners

    • @deancameronkaiser
      @deancameronkaiser 1 year ago +4

      To say more Vram at every price point is a bit of a slap in the face of someone buying a RX 6600XT that only comes with 8 gigs of Vram.

    • @deancameronkaiser
      @deancameronkaiser 1 year ago

      ​@@maxpower18 exactly lol this guy is clearly on crack.

    • @yellowflash511
      @yellowflash511 1 year ago +23

      @@maxpower18 Not true, as the 6650 XT is WAY cheaper than both the 3070 and 3060 Ti. The 6800 is at those cards' price point and has double the VRAM. Stop being an Nvidia simp.

    • @chriswright8074
      @chriswright8074 1 year ago

      @@maxpower18 just play 1080

  • @tosvus
    @tosvus 1 year ago +7

    I'm very happy using a 6800 with a Ryzen 7600 as my secondary gaming PC. I'd never owned an AMD CPU or GPU before and have to say I'm impressed! It will mostly be used for emulation as I put it all in a compact case.

  • @JP-mx1zs
    @JP-mx1zs 1 year ago +109

    This is why I always go for card variants with the largest VRAM amounts, despite what everyone says at the time. "8GB is fine for almost all recent games"? Sorry, but I never swallowed that one.

    • @FreeloaderUK
      @FreeloaderUK 1 year ago +13

      It was kinda fine in 2020 but technology moves on. Since then developers have become used to using 10-12GB VRAM for the GPU on Series X/PS5- so this is translating across to PC games.

    • @Gralorn
      @Gralorn 1 year ago +2

      It was enough, depending on the game. I always asked what the budget was, plus the list of games or kinds of games, resolution, graphics settings, target FPS, and when they planned the next upgrade.

    • @Rayer24
      @Rayer24 1 year ago +17

      The thing is you gotta look ahead when buying a PC component like a GPU. When people say "It's fine for almost all games today", well what does that mean? Does that mean that it won't be fine for a number of games in 2 years from now?

    • @malekkmeid9263
      @malekkmeid9263 1 year ago

      It was enough for 1080p, and also when devs were making games for previous-gen consoles they were forced to use less VRAM anyway to get the games to even run on those consoles. Now that the new gen is a lot closer to PC specs, they can push gaming further. Nvidia are cucks and they should have shipped all their cards with a minimum of 10-12GB

    • @robertt9342
      @robertt9342 1 year ago +1

      @@malekkmeid9263 They are no closer now than with previous-gen consoles. The main difference is that they have 16GB of shared memory, so they can use more of it for image-quality upgrades. This, combined with console efficiencies, leads to greater stresses on PC VRAM requirements.

  • @PapaRage
    @PapaRage 1 year ago +82

    I am feeling mighty fine after buying my 7900XT to replace my 3070, these results are eye opening and hopefully Nvidia will have to pull their **** together. Thanks for all the hard work!

    • @otozinclus3593
      @otozinclus3593 1 year ago +1

      I am not trying to defend Nvidia here, but if you already own a 3070 I would stick with it for now. The point of the video was of course just showing the games which are really unoptimized in their VRAM usage; most games still work fine with 8GB of VRAM.
      I think you should stick with your 3070 and wait a bit longer, maybe for the next gen of GPUs, as the performance increase will be much greater

    • @PapaRage
      @PapaRage 1 year ago +1

      @@otozinclus3593 i get what you mean, I wouldn’t advise a lot of people to go for the upgrade I went for, but I’m also upgrading my 3800x, and I tend to “donate” my used hardware to my family members that also use computers, and right now having an extra one will solve another issue, so my upgrade also has that in mind, my 3070 will go to someone that will enjoy it for years. I use my pc for work (spend 10+ hours daily on it) so I tend to be quite “liberal” with my upgrades as it is an investment in my case. (Hell, my old 2600+rx580 build is still in use)
      It just amazes me how bad 8gb vram is behaving on newer games, I was already experiencing a bit of it (and I have 1440p high refresh monitors), but it will only get worse over time. I was thinking about a 4070ti but even the 6800 sometimes uses over 12gb, so it was a good call on my side I think.

    • @PandaBearJelly
      @PandaBearJelly 1 year ago +11

      ​@@otozinclus3593 they already replaced their 3070.

    • @likeabiker
      @likeabiker 1 year ago +1

      @@otozinclus3593 the PS5 and Xbox Series X have 16GB of memory; how do you optimize a game with less VRAM?

    • @iamrobot396
      @iamrobot396 1 year ago +2

      Single generation upgrade is so not worth it. 7900xt will be outclassed in a few years

  • @cal.j.walker
    @cal.j.walker 1 year ago +26

    I can't believe that after getting a reference model RX 6800 at MSRP during launch week, I decided to trade it in for an RTX 3070...
    At the time it seemed like a good idea for my use case... Both cards performed similarly in then-current titles, and the 3070 technically had better RT performance (even though in reality I barely use RT and it actually runs pretty poorly on a 3070).
    I wanted to try streaming, so the NVENC encoder was appealing, and at the time FSR wasn't really a thing so Nvidia had one over with DLSS.
    But the funny thing is, as a 1080p gamer, I don't even really use DLSS at all, so I really don't understand what I was thinking factoring upscaling features, or hell, even ray tracing performance, into my decision at the time.
    But either way, in hindsight, I'm kicking myself for not taking a chance on AMD. Lesson definitely learnt

    • @defectiveclone8450
      @defectiveclone8450 1 year ago +8

      Pretty sure you could sell the 3070 and buy a 6800 for the same price.. or trade it :) lots of people buy Nvidia for the name not the performance it provides. Lots of silly people out there. No harm in trying

    • @cal.j.walker
      @cal.j.walker 1 year ago

      @@defectiveclone8450 Yeah. At the time the decision felt right, but in hindsight it was a mistake.
      I was thinking of trading it in at the local CeX store to recoup around £300, and then using that to get either an RX 6000 or 7000 series GPU at some point. Since I'm only really gaming at 1080p 144Hz at high-ish settings, I don't need anything too fancy

    • @modernlogix
      @modernlogix 1 year ago +4

      @@cal.j.walker As the other guy said, many people are still in the Nvidia bubble and will pay more for a 3070 over 6800. Maybe just try and trade it for 6800/xt.

    • @cal.j.walker
      @cal.j.walker 1 year ago

      @@modernlogix Ah right, yeah that makes sense now. That's also a good shout, thanks

  • @TheDukeofPretzels
    @TheDukeofPretzels 1 year ago +47

    I really appreciate the revisit. I've been watching used GPUs and you saved me from a costly mistake.

  • @Donmatteo1980
    @Donmatteo1980 1 year ago +4

    Many YouTubers whilst reviewing the cards: the 3060 12GB is all marketing and you'll never need that VRAM, buy the 3060 Ti or 3070...
    That advice aged like milk.
    Also, 4070 Ti owners are sweating now that 12GB is pretty much low end.

  • @Tomcat115
    @Tomcat115 1 year ago +52

    I recently replaced my 3070 with a 6800 after finding one for a good deal, actually. I initially thought it would be kind of a sidegrade performance-wise, but with more VRAM to work with. After watching your video, though, it seems like I definitely made a great decision going forward.

    • @edgyjorgensen3286
      @edgyjorgensen3286 1 year ago +5

      RDNA2 was and still is an excellent architecture. I believe it will continue to improve well into the lifecycle of RDNA3.

    • @toxicavenger6172
      @toxicavenger6172 1 year ago +2

      @@edgyjorgensen3286 seems like with every driver update AMD is squeezing a little more out of it and the 7000 series as well. My 7900xt just keeps getting a little better with each update.

    • @ok-mx1mi
      @ok-mx1mi 1 year ago +1

      @@toxicavenger6172 I agree, but it's not at the RX 5xx and 4xx level; those GPUs gained so much from all the updates it's insane

    • @toxicavenger6172
      @toxicavenger6172 1 year ago +1

      @@ok-mx1mi Maybe not, but they had that same architecture for a while and had a lot of time to keep learning and getting more from it. I think with RDNA 3 we might have a similar situation where they get more and more from it over time.

    • @jesusbarrera6916
      @jesusbarrera6916 1 year ago +1

      It's still mostly a sidegrade... I would've bought a 6800 XT.

  • @unpotat7672
    @unpotat7672 1 year ago +100

    Would be nice to see a retest of these titles with the 3070 vs the 16GB A770 to see if there is a similar result with 16GB vs 8GB! Also potentially add the 12GB 3060 to see if it pulls ahead!

    • @dv6342
      @dv6342 1 year ago +6

      That’s a good recommendation.

    • @ceferinamendoza-sh4vv
      @ceferinamendoza-sh4vv 1 year ago +9

      The answer to that is simple. The RTX 3070 will get better fps in some games, but the lows will be undeniably terrible at 1440p, and it might not load textures properly either, just like what is happening in the video. So the bigger-VRAM 3060 might get lower fps, but it will be able to keep the game stable, with better lows and the correct textures loaded. What he is pointing out is that the stutters will be there as long as the VRAM is not enough, and what makes it terrible is that it won't load the textures.

    • @TheSlowDude
      @TheSlowDude 1 year ago +3

      I'd like to see more Intel Arc A770 comparisons, since it costs about the same as a 6600 and less than a 3060 in several parts of the world.

    • @bilditup1
      @bilditup1 1 year ago

      Came here to say this; it could be a fascinating vid.

    • @GULIwer1980
      @GULIwer1980 1 year ago +1

      Would love to see that: a worse GPU with more memory vs a better GPU with less memory.

  • @Enivoke
    @Enivoke 1 year ago +38

    You know what would be fun to run through these exact tests?
    RTX 3060 12GB
    RTX 3080 10GB
    RTX 3080 12GB
    And see if the 3060 passes the 3080 10GB in some games, while comparing how much performance was left on the table by the 3080 not originally coming with 12GB!

    • @kanbin4189
      @kanbin4189 1 year ago +3

      Yes, please.

    • @alenko4763
      @alenko4763 1 year ago +11

      Hate to burst your bubble but the 3060 wouldn’t get higher fps in ANY game compared to a 3080… 12GB or not lol, it wouldn’t even be close. Just google performance stats for both and compare.

    • @ArtisChronicles
      @ArtisChronicles 1 year ago

      @@alenko4763 Yeah, unfortunately I think 10GB is just enough to actually avoid the issues the 3070 has.

    • @valenrn8657
      @valenrn8657 1 year ago

      At 1080p on the Windows 11 desktop, about 0.5GB of VRAM is consumed.

    • @valenrn8657
      @valenrn8657 1 year ago +2

      @@alenko4763 The 3060 12GB has superior 0.1% lows when compared to the RTX 3070 Ti 8GB.

  • @Daniel-vx5vc
    @Daniel-vx5vc 1 year ago +27

    Excellent video. Appreciate the effort highlighting the importance of adequate VRAM. For anyone who doesn't know: while the 3060 Ti/3070/3070 Ti have 8GB and the 3080 has 10GB, the 3060 hilariously has 12GB, because Nvidia screwed up their tech/marketing and AFAIK was responding to competition from AMD. Competition is a wonderful thing.

    • @andersjjensen
      @andersjjensen 1 year ago +10

      Yup. It was designed to have 6GB in 1GB modules (6 × 32-bit = 192-bit bus), but Nvidia realized late in the game that that wouldn't even cut it for launch-day review benchmarks, so they went with 2GB modules instead.

    • @Verpal
      @Verpal 1 year ago +6

      Initially I bought my 3060 12GB for hobby AI training, professional workloads, and transcoding; back then it was either this or the slightly more expensive 6600 XT or 3060 Ti. I even had second thoughts, since so many reviewers and online comments said the 3060 Ti was much better value... and yet in the last three months, many games I played would have run into crippling texture load/quality problems if I had gone with the 3060 Ti. With how DLSS 2 constantly improves, if fps feels low I just turn DLSS up, but if you run into a VRAM limit you have to suffer through extremely visible texture issues. DLSS is a much lesser compromise than what those copium-huffing 3070 Ti people endure while insisting that turning textures down a few notches is okay.

  • @pedroduarte5052
    @pedroduarte5052 1 year ago +130

    Thanks, Nvidia 😂 It would be great to see the RTX 3070 vs the 2080 Ti in these titles, as well as against the 12GB 3060.

    • @erazzed
      @erazzed 1 year ago +7

      This! I'm still rocking a 2080 Ti and I'd really like to see such a comparison. Especially since the 2080/Super/Ti is missing from all recent benchmarks, which is a shame.
      Maybe it's time to ditch the 3070 from benchmarks and replace it with a 2080 Ti 😂
      The low VRAM on Nvidia cards has always been a pain, and now it's finally starting to show and piss people off. I hope people finally recognize how shitty Nvidia is about these things and stop buying these gimped cards.

    • @KrisDee1981
      @KrisDee1981 1 year ago +2

      I have a 2080 Ti and TLOU runs perfectly:
      4K DLSS Balanced, high settings, locked 60fps.

    • @modernlogix
      @modernlogix 1 year ago +2

      @@erazzed The 2080 Super also had 8GB of VRAM, so it should perform like a 3060 Ti as far as I'm concerned.

    • @josephoverstreet5584
      @josephoverstreet5584 1 year ago

      This for sure

    • @KrisDee1981
      @KrisDee1981 1 year ago +2

      @@modernlogix Exactly, 2080S = 3060 Ti, but I really miss the 2080 Ti in HU comparisons.

  • @Astravall
    @Astravall 1 year ago +7

    I was happy with my 5700 XT and its 8GB of VRAM, but I'm glad now that I bought a 7900 XTX with 24GB. I hope AMD continues the trend in the low and midrange with 12GB and 16GB of VRAM for the rest of the RX 7000 lineup.

  • @kozad86
    @kozad86 1 year ago +27

    I woulda loved to see the 16GB A770 on these charts too, even just as a wildcard.

    • @josh2482
      @josh2482 1 year ago

      Intel is really providing a lot of value for entry level PC builders with the A770. If I was just starting out building PCs I would buy that one.

    • @georgepanathas2009
      @georgepanathas2009 1 year ago +3

      An unstable card for this kind of testing?!?! No?

    • @theplayerofus319
      @theplayerofus319 1 year ago

      @@josh2482 No? It still has a lot of crashes and is not for a PC beginner, tbh.

  • @brendanbutkus2392
    @brendanbutkus2392 1 year ago +35

    You guys are making me feel EVEN better about my 6800 purchase for $499 back in November... it was way cheaper than the 3070 and 3070 Ti, which are what I was originally looking at... and the VRAM on top of the price sold it for me.

    • @peaj4812
      @peaj4812 1 year ago +2

      I couldn't downgrade from the 11GB I had with my 1080 Ti. Managed to get an RX 6800 XT for about $460 shipped.

  • @zingwilder9989
    @zingwilder9989 1 year ago +19

    It's crazy! I can remember during the GPU crisis, scalpers were trying to sell 3070s for $1700.

    • @rangersmith4652
      @rangersmith4652 1 year ago +5

      Even crazier that in late Summer 2020, 2080Ti owners were falling all over themselves to sell their cards for peanuts so they could buy a $500 3070. That price, and availability, turned out to be a pipe dream.

    • @darkfire3691
      @darkfire3691 1 year ago +2

      @@rangersmith4652 Stop dreaming lmao, nobody sold their 2080 Ti to rebuy a 3070; those who sold bought a 3080/3090.

    • @rangersmith4652
      @rangersmith4652 1 year ago +3

      @@darkfire3691 Mostly true in terms of intent, but I know at least a couple of people who did sell a 2080Ti thinking they were going to buy a 3080 but wound up "rebuying" a 3070 because the 3080 was unobtainable. And, of course, they overpaid for their 3070s.

    • @slickrounder6045
      @slickrounder6045 1 year ago

      @@rangersmith4652 Yeah, it should be a pretty big warning sign to any sensible person that downgrading on VRAM while trying to "upgrade" GPUs is a foolish strategy.

    • @zingwilder9989
      @zingwilder9989 1 year ago

      @@rangersmith4652 That is horribly depressing. Looking at it from today's standpoint, it's the same as setting your hard earned money on fire.

  • @shremk9100
    @shremk9100 1 year ago +11

    What always gets me is that there are people out there who upgraded from a 1080 Ti to a 3080. Same price 3 years later for 1GB less VRAM.

  • @grey4980
    @grey4980 1 year ago +19

    I wish I could like this video twice! This was so helpful and instrumental in my understanding of how a GPU works, what VRAM is, and what it does. It also clearly demonstrated why the graphs can sometimes be misleading: for example, when the GPU compensates for a lack of memory, avoiding crashes or stutters by simply not loading in the textures, which isn't shown any other way than by watching the side by side.
    Thank you so much! Wonderful video

  • @alexripard
    @alexripard 1 year ago +5

    I think 8GB for entry, 12GB for lower mid, 16GB for upper mid, and 20+GB for high-tier GPUs would be fine for this generation, though in the generation after, 12GB may need to shift to entry level and replace 8GB.

  • @nova8747
    @nova8747 1 year ago +7

    I got a 7900 XTX last week and that thing is an absolute monster.

    • @sharathvasudev
      @sharathvasudev 1 year ago

      The 4090 is god tier this generation. The problem is that only higher-tier cards are getting looked after, and they are super expensive.

    • @crossfire4902
      @crossfire4902 1 year ago

      @fREAK Yeah, and the price for it, and the power consumption (400-600W overclocked), is godlier lol

    • @sharathvasudev
      @sharathvasudev 1 year ago

      @@crossfire4902 Exactly why it's god tier: it's expensive to buy and run, but all reviews show it's the fastest out there money can buy. The choice is the consumer's.

  • @kaiservulcan
    @kaiservulcan 1 year ago +22

    Amazing job once again. I have been happy with my 6800 XT from day one. The 3080 10GB will probably have the same kind of issues. Who would have bet a dollar on AMD for ray tracing? Nvidia keeps claiming they have the superior tech and their cards are better, but they lost their way... Prices are totally stupid, and even on fundamentals like the VRAM buffer, they took the wrong path.

  • @tylerkidby8991
    @tylerkidby8991 1 year ago +20

    I feel a bit bad about buying my 3070 now; I haven't used RT once, and the RX 6800 is smashing it in these benchmarks.

  • @ericstl4529
    @ericstl4529 1 year ago +10

    I was trying to choose between a 4070ti or a 7900xt. So glad I went with the 7900xt.

  • @Hunyika
    @Hunyika 1 year ago +5

    I am glad I picked up the 7900 XT instead of the 4070 Ti :)

  • @johnwick8516
    @johnwick8516 1 year ago +123

    The VRAM usage is insane! Even 12 gigs could be a problem from what I'm seeing in this video!

    • @purgatorysrath4455
      @purgatorysrath4455 1 year ago +25

      I was thinking that also. With the 4070 Ti having 12GB along with the 4070, it seems $800 and $600 is way too much for those GPUs!
      They're already obsolete! Everything coming out is just a waste of silicon! These cards barely hit 60 fps at 1080p at times, and that's ridiculous for the prices of these cards.
      I've lost all hope of getting a fair deal when it comes to buying a new card... they're just outright ripping people off. 😢

    • @spektrumB
      @spektrumB 1 year ago +34

      Good luck to those who buy a 4070 or 4070 Ti.

    • @adhahanif9792
      @adhahanif9792 1 year ago +4

      At least that's the current level of textures. It might take a few years before they raise the level again.

    • @kelownatechkid
      @kelownatechkid 1 year ago +36

      Yeah, these are truly terribly optimized titles. Games 10 years ago looked better using 4GB of VRAM; I don't get why the criticism isn't aimed at these terrible ports.

    • @terrymike7053
      @terrymike7053 1 year ago +19

      Let's all not forget: Nvidia was super close to releasing a 4080 12GB. 🤣 At $900. Imagine if they were the only company in the market.
      And if the 4080 had been 12GB, I'd guess they were planning for the 4070 to be 8GB. What a timeline that would have been.

  • @joskamps4711
    @joskamps4711 1 year ago +40

    I know this comparison would be a little outdated, but could you do a comparison of an older title with an RTX 3060 (12GB) vs RTX 3070 (8GB) to see how resolution and settings scaling do? This should also highlight the limitations of the 3070 compared to Nvidia's own cards.