AMD Radeon RX 6800 XT Benchmark Review, Smart Access Memory, Thermals & Gaming
- Published: 27 Jul 2024
- Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Radeon RX 6800 XT - amzn.to/2HbNT2r
GeForce RTX 3070 - amzn.to/34wLJTY
GeForce RTX 3080 - amzn.to/31K5tBS
GeForce RTX 3090 - amzn.to/3ouQ9Tu
Radeon RX 5700 XT - amzn.to/2Js33xB
Radeon RX 5700 - amzn.to/2FYq5v5
Video Index:
00:00 - Welcome back to Hardware Unboxed
02:19 - Godfall
03:07 - Watch Dogs Legion
03:45 - Assassin's Creed Valhalla
04:18 - Dirt 5
05:03 - Death Stranding
05:30 - Microsoft Flight Simulator 2020
06:01 - Shadow of the Tomb Raider
06:28 - Tom Clancy's Rainbow Six Siege
06:54 - F1 2020
07:27 - Gears 5
08:03 - Horizon Zero Dawn
08:30 - Assassin's Creed Odyssey
09:09 - World War Z
09:36 - Metro Exodus
10:02 - Resident Evil 3
10:37 - DOOM Eternal
11:11 - Wolfenstein: Youngblood
11:49 - Hitman 2
12:09 - 18 Game Average
13:24 - Cost Per Frame
14:40 - Ray Tracing
16:05 - Smart Access Memory
18:18 - Power Consumption
19:30 - FE Thermal Performance
20:05 - FE Overclocking
20:35 - Final Thoughts
Read the review on TechSpot: www.techspot.com/review/2144-...
AMD Radeon RX 6800 XT Review
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Music By: / lakeyinspired
HI STEVE, I HOPE WE GET TO SLEEP SOON
Our reaction when we see a dozen RX6800 XT reviews pop up at once.
Paul you are sexy as hell without sleep. Just wanted to say that.
LOL
GET BACK IN YOUR CAVE AND GET US NUMBERS, NUMBER SLAVE
hahaha you guys have busted your asses over the past month, you guys need a vacation
Well we can't buy one as per usual launch in 2020, but at least I can help reward Steve for all the hard work they put into the review of another great product we can't buy :) Thanks for the video.
Keep up the good work, Heir :),
didnt happen for cometlake tho 🤣 sad intel
I got one. And I'm a hunt-and-peck typist on the keyboard.
@@teamtechworked8217 if you learn to touch type you could get two or three.
@@teamtechworked8217 Personally I typed like that for a decade, then realized I knew where all the keys were without looking and I no longer had tiny child hands to make it necessary. Fully trained myself to do it by touch within days, and faster than ever within weeks; just had to commit to it. Totally worth it.
Your graphs are very clear; much clearer than those of other outlets. Coloring the relevant bars with colors that stand out makes it easy to immediately understand the graph without hunting around.
*angry buildzoid noises*
i agree. his graphs are the best out of all reviewers. his tests always seem to have amd cards higher than other reviewers as well. even the rx 5700 xt.
I was wondering if it's just because I'm Australian I prefer this channel by a lot. I guess not?
I agree with that
Sounds like my boss in every meeting
totally disregarding the performance: Am I the only one who finds the FE editions (both Nvidia and AMD) way more gorgeous than any board partner cards so far? Usually it was the other way around in previous generations...
I havent seen many AMD board partner designs yet, but yes. Both AMD and NVIDIA massively stepped up their games in the looks department.
I hate that red strip on the 6800/xt its obnoxious as hell. Makes it not look premium >
I don't know... those XFX teases are looking mighty sexy. 😏
@@MrMilkyCoco what is consider premium to you? Genuine question.
@@HickoryDickory86 they do actually... Sapphire as well. Most others though... I don't even wanna start with Nvidia... most of them are ugly as hell. I also don't like overly "gamey" cards; a little RGB or the occasional exquisite cooler design is fine, but most are overly edgy or plain boring. The Founders Editions hit the sweet spot between simple yet good looking... AMD more on the edgy little rebel side, Nvidia on the more elegant side. I personally don't mind the red strip, it is AMD's color after all, and some red accents are perfectly fine with pretty much any case color setup.
Steve be like: “SLEEP IS FOR THE WEAK” *passes out*
We'll never sleep, cos sleep is for the weak.
We'll never rest, till we're fucking dead
I know how demanding a repetitive job like PC content creation can be; I've been drawing in CAD and Catia for hours. But it is not like he is doing some crazy job. He is tired, but he can create 12-14 hours a day for a few days, no problem. After all, there is money in it for him.
I thought hardware news was for the weak
@@MarioAPN "12-14 hours a day for a few days, no problem" haha cute, imagine only working 14 hours per day, you're living the dream!
@@Hardwareunboxed the self employed will always work the hardest!
Who's ready to watch a combined two hours of Big Navi benchmarks from all these techtubers right now?
me
yup I'm ready bro
Sign me up
*my body is ready*
Except for GN, I actually need to be focused enough before watching their reviews
@@MattJDylan The second hour is reserved for Tech Jesus' review :P
Despite the whole DLSS and RTX thing its good to see that AMD is competitive again.
They have ray-tracing and will also have dlss in the future.
Yes but its nowhere near nvidia's performance in those categories thats what Romeo probably meant
Let's wait for driver updates; as we know, AMD doesn't always have the best drivers at launch. So let's wait at least a month and we may see more performance.
Sure but I don't think AMD can just fix their ray tracing and dlss with a simple driver update
while nvidia ... 2nd gen rt, amd 1st gen rt, so yeah.. rdna 3 will be the real battle. head to head with nvidia.
“In many ways the AMD GPU is the superior product”
What a good year for AMD. Just incredible.
Just a shame there is such a lack of stock, but that's an industry wide problem i suppose, not exactly AMDs fault
@@TheArrowedKnee This really sucks man. Sometimes I wish covid wasn't a thing, because of things like this. Most times I don't because this has honestly been a good year for me.
Only really in three ways is it the superior product: price, vram capacity, and power consumption. But I imagine that those three reasons alone can convince a lot of people, because it definitely almost convinced me. But for me the value advantage is negligible because of the extra features that rtx cards have. I’ll make a purchase sometime next year when all product choices are accounted for and planned features are properly implemented.
@@rct9617 Wise words.
now they just have to follow with good drivers
Finally! I can now start waiting for RDNA 3
Just in time for Covid 21 perhaps. o.O
Looooool
🤣🤣🤣
It is amazing how much ground AMD has made up in a single generation, and it will be great once stock is no longer a problem for us as customers; finally we will have a choice in the high end.
2020: When hardware releases became a physical form of "Early Access"
2021: Introduction of new marketing, the "TimeSaver" Edition - only £30 extra but your item is guaranteed for launch day*
*Terms and Conditions apply
Shhhh, Nvidia might hear you!
TimeSaver Edition = Anti-competitive Edition = Billions of dollars in fines, not that we ordinary citizens would ever see any of that money anyway.
Noooo What have you done, they will see this idea and steal it lol
I have full faith that retail will add $300AU to the $1050AU price when AIB cards are released...
I'm so happy Amd is now giving competition to other previous market dominating brands
What competition? They sell exactly zero of these cards in the Netherlands.
@@Superiorer and that's why Nvidia can price their cards however they want, even when competition is great, people buy Nvidia
@@sebastianhernandez1228 so true
@@Superiorer not anymore.
But they aren't.
Customers aren't buying their cards, especially not after the driver woes of the RX 5700 and 5700 XT.
As evidenced by the fact that there are more 2080 Ti owners than RX 5700 and 5700 XT owners combined.
How other people bench: Muscle gains.
How Steve Benches: FPS gains.
Brilliant comment Sir! :)
that's how Steve flexes
All kinds of gains. All kinds.
@@shredderorokusaki3013
Yeah, can't wait for those benchmarks to release. Think there'll be a non-XT 6900? The 6800 XT is good for its price, but I really am inclined to get a GPU labelled 6900 solely for the name.
Steve: most of you are here for the blue bar graphs
Me: I'm here for Steve's sick accent, but that's just me.
Haha you got to move to Australia then mate. I'm a newcomer here and enjoying the accent a lot.
@@EvgeniiNeumerzhitckii Did you move for work? Genuinely curious.
@@TheMetalGaia I was more trying to escape Russia and live in a country where people are nice and not bully me :)
@@TheMetalGaia I applied to Australian permanent residency and got it after two years of paperwork
@@EvgeniiNeumerzhitckii That's awesome! Glad you got it. I've considered moving to Canada from the U.S. before.
Very comprehensive review @hardware unboxed. I've been anxiously waiting for your review
Just give me a Sapphire Nitro+ version of this baby.
Review should be up on the 25th for you ;)
@@Hardwareunboxed Thanks. Do you think it's worth upgrading from a 3900X to a 5900X for SAM? I kind of wanna, but I kind of feel like I'll have buyer's remorse next year if I do.
@@praetorxyn you're still good bro
@@praetorxyn if you are planning to game, I might go for it. For anything else, no cause SAM only increases performance in games.
I want Toxic
Imagine that there were people who doubted that AMD could even match the 2080Ti...
Guilty.
i was guilty as well. finally ready to upgrade my 1080 Ti
@@ddlog420 That's understandable, AMD has disappointed a lot during the last couple of years, especially with GPU's. I'm just glad they finally have brought competition to the high-end GPU market!
It still doesn't, the benchmark numbers don't matter.
/s
Damn, those 1440p and 1080p numbers (Steve hinted that the 6800 XT beats the 3090 at 1080p in Watch Dogs Legion during his unboxing). I was hoping for better 4K performance, but whatever, I can stand 1440p in games.
Still can't compete in ray tracing or any other possible software feature, or even stock right now, so it's still not a win for AMD.
By far the best benchmark review again as always! Love you guys :-*
Thanks, have been waiting to see this review!
My 1080s fan just started squeaking after watching this video.
Best to upgrade before it explodes in your face then. xD
You can wait for rdna 3 with a 1080
My 1080's memory crapped itself last week. Had to upgrade to a 3070 ;-)
Hey, Im a computer engineering student and really need a new gpu for University , if you thinking to upgrade, how much can you sell me your GTX 1080 for?
@@TheAlbaniaGaming Look on Facebook market and Craigslist you can find them for 350$ and less if you spend a few hours looking 👍💯
11:44 - RTX 2060: "I can do this... I CAN DO THIIIIIS!"
😂
Why is it so low
😂
@@ihfazhassan1548 seems to be some driver bug with a memory bottleneck.
I saw that too and mumbled "Damn"
I'm always impressed with the sheer amount of data you guys crunch on each launch, really appreciate it. Thank you and lots of love from S. Korea
I'm actually impressed amd made a card this powerful. It's nice to see competition again.
They say all heroes wear capes, but Steve here proves it differently by being a wizard who delivered an unbelievable amount of benchmarks.
"They say all heroes wear capes, but Steve here wears hoodie. What's up with that?"
He can't wear a cape: since he's aussie the cape would always float behind him because he's upside down... that would mess with the lighting, you know..
omg , the 6800xt is a monster of gpu
So is the 3080
lol, there's a huge CPU bottleneck in every case at 1440p. There's no way the 3090 is only faster than the 3080 by a few fps, lmao, which indicates the CPU is bottlenecking. When you go to 4K, where the CPU bottleneck is removed, you immediately see the 6800 XT is way slower than the competition.
3080 is faster in 4k because of memory
@@legendarymop The cpu bottleneck is working against AMD not nvidia.
The reason that Nvidia has more performance at 4k is because of the FP32 compute (tons of cores) which doesn't scale well at lower resolution.
There is a CPU bottleneck at 1080p and a little one at 1440p, but it bottlenecks RDNA 2 more than Nvidia.
The rtx3090 is only slightly more powerful than a 3080 and runs at similar clocks, it's an architecture bottleneck at lower resolutions for the RTX3000 series.
So for fps snobs at lower resolutions it's RDNA 2 all the way now. (Massive 40%+ performance difference in some games)
For 4K snobs it's slightly more in favor of Nvidia. (18-20%+ performance difference in some games)
@@bastordd Doesn't the 3080 have less memory? How'd less memory make it faster.
This video really helped me out thank you
So happy I managed to get one today! Tomorrow I'll start testing it :)
AMD is finally back. I’ve been wanting to upgrade and it’s been a long time since I’ve owned an AMD card.
Amd reference design is "cool and quiet".
It's so weird hearing that after 'hot and loud' release after release after release.
improvement at its finest
Probably the biggest improvement ever made 😆
hmm... 75c is a full 15c higher than my 3080 when it's OC'd, seems to be weird considering AMD using a smaller node (7nm).
@@MLWJ1993 Likely due to the die being ~108 mm² smaller while still drawing almost 300W, albeit much less power than the 3080.
Thank you sir. That was very interesting, especially the SAM analysis. I was curious whether the performance they claim was real. Your test showing PCIe 3.0 makes no difference vs 4.0 is an extra that I really appreciated. Thank you for doing that test.
Just a few years ago AMD GPU'S were going for high compute capabilities at lower frequency. Nvidia was going for lower compute capabilities but at a higher frequency. Now they have completely switched places! The only difference is that they're both good now, not just Nvidia.
AMD costing 350 dollars less in australia even for AIBs! Hopefully they're in stock for more than 1 second.
LOL good luck Mate
144hz 1440p at max settings, finally!!!
Never cared for 4k that much....
Same, not until 4K 144Hz monitors become affordable; 1440p 144Hz is the sweet spot.
@@HollowRick I'm happy with 1080p 144hz. Cheers.
@@rahuloberoi9739 same bro, just sucks my 1060 can't run the new games at more than 60 fps on medium-to-high settings anymore :(
3440x1440 is where it's at. 21:9 is so much better for gaming.
@@HollowRick They're not that expensive now... you can pick up an LG 4K 144Hz for what a 1440p 144Hz monitor used to cost... 4K is really the new 1440p... amazing to play at 4K with everything maxed out.
A lack of ray tracing performance is what scares me now that the consoles have adopted it. I'd personally rather have less VRAM and 1080p performance than give up the option of decent ray tracing performance. Cyberpunk 2077 is just around the corner and it may deliver something really nice using this technology.
I agree, Raytracing is the future, no matter how much Steve doesn't care about it..
I also think that a lot of future titles will bring ray tracing support, starting with cyberpunk which is gonna be awesome :)
But consoles use RDNA 2, devs will work more with AMD's implementation of Raytracing, in the long run, AMD has an advantage as their Raytracing matures
@@notsoanonymousmemes9455 Sony doesn't use RDNA 2, more like RDNA 1.5. Also, the ray tracing they use is based on Microsoft's implementation and not locked down to just AMD, so Nvidia can use it too, as well as their own RTX.
@@choatus do you have a source for the nonsense you're saying? No. Sony uses RDNA 2.
And of course AMD has an advantage over Nvidia because their architectures are everywhere: in two consoles and the PC. Only an idiot would ignore that fact.
Great benchmarks and work! :)
love your work guys! best channel, better reporting standards than the majority of news outlets period.
Out of stock
BUYER RAGE MODE ON
It sold out when Newegg had crashed at 0901 and when it came back at 0914 it was sold out. AMD website was similar.
Steve, you really are the best at these reviews. Always the first source I go to on new releases. Love the content.
Well done on all the work you guys have been doing
You guys are the best. Still tossing numbers for 980s into the mix is amazing. Also having 1080p data available is awesome. So many don't even bother with it.
Very nice to see direct comparison for me as a 980ti owner
Seems like the faster pipes on RDNA2 help with 1080 and 1440, but the fatter pipes on Ampere help at 4K. Be interesting to see how it plays out in the future.
Its the 256 bit bus i think, but its still a great gpus for me
I agree. It'll be especially interesting to revisit this in say, 2-3 years to see if the low VRAM on the Ampere launch versions eats into their 4k lead to the point that it is nonexistent or flips to a deficit. Given that they are already scrambling to redo the lineup and add more VRAM, that'd be a BIG RIP to the people that bought a launch 3080 or 3070.
@@andrewwilliamson3513 10gb is plenty for 3+ years which by then most people will upgrade anyways, the Vram is GDDR 6x and is up to 4x faster than the GDDR 6 that AMD is using so it does help quite a lot especially in those "Fatter Pipes" it's really not as much of a big deal as people are making it out to be, most people don't understand that they are reading Vram allocated not Vram actually being actively used. Source- I own a 3080 plus a lot of other cards. Had my 3080 since almost launch and been loving it, worth every cent so far. But I have had many decent AMD cards in the past also, so don't call me a Fanboy, I will probably buy a 6800xt or 6900xt just for the fun of tinkering with it at some stage too.
@@choatus The basis of my argument is that the Nvidia cards only exceed the AMD card's performance in 4k, so their best use case is for 4k gaming, but the limited Vram might make that a moot point in the next cycle of games (without turning down settings.) Microsoft Flight Sim already allocates over 17Gb of Vram in 4k when under high load and nearly 10Gb in 1440p. Actual usage seems to be around 12Gb/8Gb respectively. The new trend in game development is to early access release games with shit optimization, and the optimization never seems to get addressed even at final release. AAA titles like COD that release every year are the same in a lot of ways. I think Warzone will use over 8Gb. It seems like Nvidia kept the Vram low because GDDR6X is expensive, but it keeps the 3070/3080 from being "Great" 4k cards for the future. They could have achieved the same memory bandwidth with more slower GDDR6 for roughly the same cost.
@@andrewwilliamson3513 To achieve the same bandwidth as GDDR6X they would need 3-4 times the amount of normal GDDR6, so no, it doesn't work like that; it would end up costing a ton more that way for starters. P.S. MSFS on my 3080 in 4K uses (not allocates) less than 7GB of VRAM right now at ultra, but that game is broken; it needs its DX12 update before we can see what it really runs like, it's basically running on 2 cores only atm.
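[Editor's note on the bandwidth back-and-forth above: peak memory bandwidth is set by bus width and per-pin data rate, not by capacity. A quick sketch using the publicly listed launch specs of these two cards (320-bit/19 Gbps GDDR6X on the RTX 3080, 256-bit/16 Gbps GDDR6 on the RX 6800 XT); treat the exact figures as spec-sheet assumptions:]

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    bus width in bits / 8 gives bytes transferred per cycle across the bus;
    multiplied by the per-pin data rate (Gbps) it yields GB/s.
    """
    return (bus_width_bits / 8) * data_rate_gbps

# Launch spec-sheet figures (assumed):
rtx_3080 = bandwidth_gbps(320, 19)   # GDDR6X -> 760.0 GB/s
rx_6800xt = bandwidth_gbps(256, 16)  # GDDR6  -> 512.0 GB/s

print(rtx_3080, rx_6800xt)
```

[So the 3080's raw bandwidth lead is roughly 48%, not a multiple, and capacity (10 GB vs 16 GB) never enters the formula; the 6800 XT additionally offsets its narrower bus with the 128 MB Infinity Cache.]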
Exactly what i need! Hope i can get one before the end of 2020.
I come to this channel first cuz we get straight benchmarks and no bullshit its just so time-saving. Thanks steve
Comparing RDNA to Zen
RDNA1 & Zen : "Meh"
RDNA2 & Zen 2 : "Interesting"
RDNA3 : "See Zen 3"
Comparing Nvidia to Intel's responses in both cases :
RDNA1/Zen1 : Nvidia/Intel - "Hahaha, lol wtf ffs, is that all you've got ?"
RDNA2/Zen2 : Nvidia/Intel - "Damn, we actually have to do something for the new generation now..."
RDNA3/Zen3: Nvidia/Intel - "FUUUUUUUUUUUUUUUUUUUU*K!!!!!"
EDIT: for clarity
I plan on upgrading my system to a zen 4 rdna 3 system when those both launch. It'll be a huge upgrade from my r5 1600 and gtx 1080 system.
If we assume (and I know they can do it now) that they can meet the challenge once, they can do it again. They took their performance straight from GeForce 1080 Ti level (5700 XT) to 3080-3090 level. They literally skipped a "2000 series" equivalent card. THAT is an accomplishment. And I think they will do it again with RDNA 3 :)
@@tusux2949 Eh, Nvidia still edged out the win this gen, but I believe AMD will come for blood next gen, I can guarantee it! This feels like their "first attempt" cards, like Nvidia's 2000 series.
@@djmccullough9233 The comparison to the 5700 XT is very flawed and extremely misleading. The 5700 XT had 40 CUs while the 6800 has 60 CUs and the 6800 XT 72 CUs. It was not impossible for AMD to create a 60CU or 72CU RDNA1 GPU, they just decided not to. So the gain from architecture to architecture will be much smaller when comparing 40 CU vs 40 CU (probably the 6700 or 6700 XT), and the difference between those two cards is the architectural gain. Probably more in line with 10-20%, since both use the same node. RDNA3 will probably be on a better node, so you will get some gains there as well.
But you will not see the same gains as when comparing 5700xt vs 6800xt. They will not produce 130-140CU gpus. Even their newly revealed MI100 data center chip only uses 128. So unless you want to pay 5000 bucks for a gaming gpu (and only leverage its performance at 8k) you will have to be satisfied with some architectural+node upgrades which probably will be more like 20-40% uplift.
BEGUN THE GPU WARS HAVE.
That's all that matters.
Spoken well you have, a like must I give.
@@edgain1502 Indeed a great comment have you given, a like doubled from me have you received.
BEGUN THE VAPORWARE WARS HAVE
There... fixed it for you
Yeah. That's actually where the real excitement is - _at least until I can actually afford an upgrade_
I get the feeling that Nvidia rushed their RTX products out to show off and claim their own ray tracing system in games first, knowing that AMD was going to give them a hard time.
Amazing review, thank you for all the work you do. My goto channel.
good job guys, also...
please compare the Rage mode and the Manual OC...
again excellent work!
This is the first time I've actually been anticipating a GPU release in a while. Hopefully AMD Radeon continues the momentum with their graphics cards, it's about time we got a return to competition within this market segment.
The price is still an utter joke. Last time we saw competition there was RX480
The RX 480 was only powerful enough to compete with the mid end market. These high end cards are naturally always going to be expensive to a degree.
@@AlexanTheMan It was still a flagship card that gave the performance of a GTX980 for $200. The equivalent of that today would be the RX6800 performing on par with the RTX2080 for $200.
@@AlexanTheMan Plus nodes have gotten ridiculously expensive to develop and produce. Gone are the days where you can get a fantastic card for $300 like in the late 2000s-early 2010s.
You guys realize that inflation is still a thing, right? When the Nvidia 10 series came out, it blew the RX 480 (and later rebranded 580) out of the water, and AMD never came out with a proper response after the failure of Vega. Also consider that we are still not quite out of this pandemic yet, so chip production is not at full force, but I'm not sure about that. So in my opinion, the prices don't surprise me one bit.
All sold out here in Switzerland. It took 4 minutes.
Prices were:
719 Swiss Francs for 6800 XT
689 Swiss Francs for 6800
So 6800 XT is better buy for sure. I got lucky and got 6800 but really wanted XT version
Give it time: with the same power limit shenanigans we saw with vega and rdna1, the 6800 will be a better value once oc'd
@@MattJDylan Doubtful, it has 17% fewer CUs (60 vs 72), so unless there's some way to unlock them it's unlikely it'll be able to top the 6800 XT in perf/dollar in any meaningful way.
Did you get the asus one at 689? I managed to order an msi 6800 at 669 from digitec before they greyed out the add to cart buttons.
@@cptwhite In perf/dollar the 6800 is already topping both the XT and the 3080, and I didn't mean the vanilla 6800 will beat both, but that it will come close enough for him not to worry too much.
@@helljester8097 Yes I did, but I really wanted 6800 XT I mean it was just 30 bucks more! And I didn't know there was XFX 6800 XT on stock too.
But 4 models in total and they had less than 100 pieces in stock I think.
Great video as always Steve...
Excellent informational video once again. Great job.
BTW, you mentioned that you saw an average of about a 5% boost with overclocking. So would that be on top of the boost from enabling SAM? It would be great to see some benchmarks where you overclock the card and also enable Rage + SAM to see what the absolute maximum performance of these cards is.
Again, job well done!
Purposefully stayed up doing homework to watch this premiere! So worth it, love your work Steve!
lol
Imagine being able to buy one of these
cannot imagine
@@alvaro.goitino7321 lol
thanks for all the hard work.
First of all, thank you for all the tests you have to endure to provide all this data to all of us!
From the benchmarks I guess this GPU best suits sim racers, as the differences in fps are quite significant.
let's be honest, we're not here for the 4K review, we want that 1440p
Who's we?
not the case at all. like your pic tho ;)
4K gaming isn't here yet. Yes, there are some titles you can play at over 60 fps, but for FPS games where you need 144 Hz or more, 4K is still out of reach. 1440p is the next step up from 1080p and will slowly become more mainstream. All you have to do is look at games like Cyberpunk: top GPUs struggle to play that game with all the eye candy on. 4K gaming is great for single-player, story-based games where high fps doesn't matter much.
@@0mnikron702 I disagree. I play all my games in 4k max settings. 90% of the time I get 80-120 FPS in games. very few games bring my FPS down to 60 fps or lower. I enjoy all the 4k numbers.
Looking forward to seeing the SAM equivalent improvements for the 30-series 😃
Yep, and luckily Nvidia won't lock this to a single processor brand (or only the Ryzen 5000 series, like AMD does). Nvidia's "SAM" will support ALL CPU brands.
Thank you thank you thank you Steve, you've answered the questions other reviewers failed to address for me. Only one question remains - GPU encoding
thanks guys, great work as always \o/
people then : big navi cannot even compete against RTX 3070
big navi now :
This isn't even considered the big navi right? Isn't 6900 xt supposed to be big navi?
@@shaquilleoatmeal5332 Big navi is the entire generation, including 6800, 6800 xt and 6900 xt.
@@shaquilleoatmeal5332 The 6900 XT is "the biggest big navi" :P
Well, it technically still can't. It doesn't have a DLSS competitor yet, has terrible video encoding performance and still gives glitched-out render outputs. Plus it's bad at ray tracing. It's only for games that don't support DLSS, I guess.
Still they are finally back.
These cards will especially be a great attraction to all 1440p and 1440p ultrawide gamers, which is the majority of gamers in this price range.
@@makisekurisu4674 AMD has always been lagging behind nvidia for those extra features... good thing about them is they make the price cheap so nvidia doesn't have a total monopoly and authority to increase prices like they would have if there was 0 competition from AMD.
11:30 those RTX 2060 4k Wolfenstein results xD 6fps average sounds good.
It seems weird, could just be a driver bug.
😂😂😂
@@PatalJunior Nah it ran out of Vram....6GB is not enough for 4k
@@subhajitchatterjee9688 funny enough the 980Ti has the same amount of VRAM (but GDDR5) & is still way faster. I would've expected both to crap out into similar abysmal performance.
@@MLWJ1993 Oh right...that seems strange how rtx 2060 can struggle so much then
July 2021 and this card is now $400 cheaper than the 3080 in the pre-built R12s from Alienware/Dell. That's such a huge difference for essentially the same performance.
Thanks Steve, really great review as usual, the level of quality HU puts out is incredible, lots of hard work and it shows. IMHO, what really impresses me the most and the reason I recommend you guys before anyone, is how you guys manage to cover everything so well, I've found the attention given to the VRAM amount difference of the cards really lacking from most reviewers and you guys touched the subject and warned about it. If 8-10GB is barely enough today, for the high end, according to my experience, as soon as the new consoles hit, you better have 12GB or more for the high end, really. Imagine paying U$ 700 for an amazing card today and a year or so from now it struggles, not because the GPU can't handle, but framebuffer. I've had that experience with a fair share of nvidia cards (i.e. GTX 295, GTX 480 SLI, GTX 580, GTX 690), very frustrating.
Thanks again for the great work!
If the 6800XT is this good, I can’t imagine the 6900XT benchmarks when it comes out.
It won't be so much better, I believe; just look at the 3080 vs 3090.
Between the 6800 XT and 6900 XT, there is a difference of only 8 Compute Units. As the other commenter said, it'll be the same as 3080 vs 3090
@@matteovukoja1240 also 8 more "ray accelerators" so hopefully slightly better RT performance too
@@Saigonas It will definitely beat the 3090 at 1440p, though. Because the 6800XT was only 5% slower at 1440p than the 3090.
Given how bad the 3090 overclocks I wouldn't be surprised to see the 6900 XT beat the 3090 by an inconsequential margin, and if it does, the real story is the $500 difference in MSRP haha
Now to wait for the VR performance reviews. That's really gonna be the final benchmark telling me where to go.
Yeah, but who does VR performance reviews? I WISH I could find some!
"VR"...is that when you put a helmet on your head with a pair of small LCD displays very close to your eyes and keep it on for hours and hours? :)
@@GholaTleilaxu yup. Loads of fun.
Fantastic work as always :)
I can't wait to see these available for purchase in 2022! It's like getting a window into the future!
Nvidia 4000 series and AMD 7000 series...
Rage mode is enabled for people trying to buy the cards.
'Copying same comment mode is also enabled for you
Yea Nvidia enabled that last month.
He really wants to see everyone like his comment in every video lol
or for the Nvidia fanboys
Lmao
11:30 look at the end of the list, the 2060 with 6fps avg, LMFAO
@@xtechtips152 vram
@@smp219 doesn't make all that much sense since the 980Ti runs into a similar limitation unless it somehow powers through that like the monster it was back then. 🤔
@@MLWJ1993 that was history
@@MLWJ1993 ROPs
Sooo much better than the other 6800XT reviews. Good job as always Steve.
Amazing work Steve, like always! I love that competition is back, there is nothing better for us consumers then to have great competition! :-) 3090 owner here.
11:39 the 2060 was a PowerPoint presentation
2060* the 2060 *super* had >60fps average ending up just below the 2070.
@@MLWJ1993 ok bro I just saw it and edited the comment
This video: Steve saying "6800 xt is better in 1440p and the 3080 is faster at 4k" in 20 different ways
Seems somewhat weird; many other reviews see them equal at 1440p, red winning 1080p, green winning 4K.
NyxNyxNyx Green 3080 at 1440p? Does this also count for the 3070?
he is using the 5950x for 6800xt results. it's ok for 4k but not 1440p when the 3080 results are with 10900k
I'm wondering how well all these reviews will hold up in the long run. If you compare results in recent games to the older ones Nvidia seems to be struggling. Dirt 5 and Valhalla in particular surprised me.
@@theeternal417 they are also done for 3080 using old drivers. I would rather he did less games but identical environments and up to date drivers
Great review 👍🏼
Guys, Nvidia offers a much more stable product with better infrastructure, like GeForce Experience, game recording in 4K HDR 60 fps, etc. Nvidia's driver support is also better than AMD's. Overall it's good to see AMD become an equal at the top end. However, when you buy an Nvidia card you just get a much more stable, reliable and better all-round ecosystem. Also, you should have used the 10900K, it's a better gaming CPU!
A really impressive jump in performance from AMD, and even more impressive seeing the 6800 XT pass the 3090 now and then. Looks like there will be a lot more all-red builds in the future. 4K scaling is a bit dodgy with and without SAM, but it's certainly the better choice than the 3080 for 1440p and overkill 1080p.
brooo i wish every game would be as well optimized as death stranding
To be fair there's not really a lot going on in Death Stranding. The game looks gorgeous in 4k 60fps on a 65" HDR TV though.
Resident Evil as well. Gorgeous graphics too.
Nicely done, man.
Very helpful, thanks :D
Man, Navi 2X truly is a Zen moment for Radeon
Nicely Done, AMD
14:41 worth mentioning:
The 6800 XT has the same RT performance as the 2080 Ti (even slightly better).
That's not too bad at all!
I think a lot of people don't even understand this fact. AMD is basically right in the middle of Nvidia's first- and second-gen RT. If AMD can keep this level of innovation for their second-gen RT cores, I think we will have some serious competition. It's the same with AMD's version of DLSS: this is their first generation, essentially their first go at it. As AMD said, they will age like fine wine. I just hope that holds true.
If it's an AMD-sponsored game, the RT performance is even better than Ampere, see Dirt 5. Otherwise it looked like 20% less than Ampere and a bit better than Turing.
@@eazen That's what I'm saying, RT is still pretty limited in games, and even more limited in games that can actually run it at 60+ FPS. For me personally, I would buy the 6800 XT because it's on par with, and oftentimes better than, the 3080 in performance. However, what I like about the 3080 is the audio isolation thing they have with the microphones.
I really love the cost per frame comparisons. I bought a used Windforce 1080 for $320 about a week before the RTX cards came out, and I'm still glad I did. Cost per frame at 1440p is $4.50, and none of the games I play drop below 90fps.
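The cost-per-frame metric is just the card's price divided by its average frame rate across the benchmark suite. A minimal sketch; the ~71 fps figure is inferred from the commenter's $320 price and $4.50-per-frame numbers, not taken from the review itself:

```python
# Cost per frame = card price / average fps across the benchmark suite.
def cost_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

# A $320 used GTX 1080 averaging ~71 fps at 1440p works out to about $4.50/frame.
print(round(cost_per_frame(320, 71.1), 2))  # → 4.5
```

Lower is better, and it's why a cheaper card can "win" this chart even while losing every raw fps chart.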
Honestly, god bless you guys for using 1440p as the default benchmarking resolution shown in the charts. Seriously, THANK you. Just six months to a year ago it was so unnecessarily arduous and annoying to find 1440p-only benchmarks for the games I care about. It was frustrating, so thank you for your hard work.
They attacc,
they protecc,
but most importantly:
AMD be bacc
I’m surprised they no longer include RDRII in their tests.
If I remember right, it's extremely hard to test in a consistently controlled manner (without using the built-in benchmark). I believe they said the built-in benchmark isn't very representative of true gameplay, so they try to test along a route in-game. However, a dynamic world with a day-night cycle can throw consistency off.
Steve probably scrapped that test for time since there's so many product releases smashed together.
But I like that they include lots of new(er) games, something most other reviewers fail to do, benching new hardware on 3-4 year old titles instead.
@@AngelicHunk That's fair. I mean, the most taxing area in the game is Saint Denis, I believe, but even then that area isn't so bad in itself. I'd still say the benchmark for RDRII is decently accurate.
@@gorjy9610 That's true. Like, people still testing GTA V is kinda ridiculous. Nearly anything can run it as long as you aren't trying to play on a laptop or something really old.
The spaghetti coding of the game probably contributed to this. They have yet to release a single update that hasn't introduced stability issues.
YOU ARE THE BEAST OF BENCHMARKING!!! Awesome job as always, man.
Wow. What a card! Great review as always!
Just subbed. I love how this review is less opinion and more data than some of the other channels I know.
Steve is a legend and a blessing to tech mankind. Welcome to benchmark heaven where sleep is for the weak and there are numbers everywhere. :P
Check out Gamers Nexus. They stuff data down your throat.
Are these day one drivers vs day one drivers? I mean the 30 series has had an extra 2 months now to iron out driver issues.
Why would they use day-one drivers for the 30 series? It's Nvidia's advantage for releasing earlier. You want to compare CURRENT performance to help your purchasing decision, not an anecdotal one. Steve will make an updated version with later drivers in the future, as usual anyway.
As a 3080 owner I can say for a lot of these games the average FPS with my card is way higher than what they got with their 3080 so who knows lol
@@Mkhalo3 But do you run the absolute maximum settings or a more reasonable "tuned ultra" setting?
This video isn't a guide for reachable fps, it's a comparison under the exact same circumstances.
@@paulelderson934 Everything maxed out with the highest FOV as well
Outstanding work as always
Thank you for your hard work 😘😘😘
And now I'm just patiently waiting for the 3080 Ti release.
Yeah, the 4K was mediocre at best. Let's wait to see the 6900 XT too, who knows, AMD might pull a last-minute prank on Nvidia :D Otherwise, 3080 Ti here we come, baby
Different results at 1080p and 1440p compared to Guru3D and TechPowerUp. Probably because of Dirt 5 favoring AMD heavily, an exact 30% lead at 1440p.
Yeah, something about this lot of benchmarks doesn't seem to make sense to me.
I was thinking the same thing...
It's because this game is optimized to run best on RDNA 2, since both consoles use that platform and games are made for consoles first. So we can expect a lot more titles that run better on AMD, especially next year when true next-gen games come out.
@@bo-_t_-rs1152 Consoles have had AMD CPU's and GPU's for years, more than 4 generations, but this has never meant that games were all better optimised for AMD hardware on PC... most were better on Intel and Nvidia, and expect it to remain the case for a long time yet.
@@choatus True, but the hardware in them was nowhere near the performance of the current gen. AMD was always behind Nvidia and Intel, but that's changed with Ryzen/RDNA 2 now.
Thank you for the continuously awesome data!
Though when you update the data to Ryzen 5000, it would be OK to leave out some older/lower-end GPUs, or at least split them into different projects. Health is more important :)
ohh excited to watch!! :D
"So, without wasting any time let's talk about the test system, and then jump into the blue bar graphs. As I know that's what most of you are here for."
*_Very angry AHOC noises in the distance_*
Sorry but what is AHOC?
Valentine you are sus
@@ExZ1te Actually Hardcore OverClocking
He recently made a video where he's upset about the bars in bar graphs
@@pythonner3644 uh- I Saw green vent to medbay
@@ValentineC137 but I am green
Damn, that 6800XT was seriously impressive, and waaaay better than I was expecting!! And excellent video as always guys, keep up the good work!!
Is there info on the test bench? Your SAM results are very different from other reviewers. Keep up the good work! Hi from Bacchus Marsh Vic!
I've noticed this Hardware Unboxed review is a bit different from other reviews on other sites. Steve also seems to just "dismiss" ray tracing as if it's a passing fad.
When AIB cards are released please check VRM temperatures with AMD's reference cooler.