Well we can't buy one as per usual launch in 2020, but at least I can help reward Steve for all the hard work they put into the review of another great product we can't buy :) Thanks for the video.
@@teamtechworked8217 Personally I typed like that for a decade then realized I knew where all the keys were without looking and I no longer had tiny child hands to make it necessary,fully trained myself to do it by touch within days and faster than ever within weeks,just had to commit to it. Totally worth it.
Your graphs are very clear; much clearer than that of other outlets. Coloring the relevant bars with colors that standout makes it easy to immediately understand the graph without hunting around.
i agree. his graphs are the best out of all reviewers. his tests always seem to have amd cards higher than other reviewers as well. even the rx 5700 xt.
I know how demanding repetitive job of PC creation can be, I´ve been drawing in CAD and catia for hours. But it is not like he is doing some crazy job. He is tired, but he can create 12-14 hours a day for a few days, no problem. After all, there is money for him.
@@TheArrowedKnee This really sucks man. Sometimes I wish covid wasn't a thing, because of things like this. Most times I don't because this has honestly been a good year for me.
Only really in three ways is it the superior product: price, vram capacity, and power consumption. But I imagine that those three reasons alone can convince a lot of people, because it definitely almost convinced me. But for me the value advantage is negligible because of the extra features that rtx cards have. I’ll make a purchase sometime next year when all product choices are accounted for and planned features are properly implemented.
It is amazing how much ground AMD has made uo in a single generation and it will be great once stocks not a problem for us as customers finally we will have a choice in the high end
@@ddlog420 That's understandable, AMD has disappointed a lot during the last couple of years, especially with GPU's. I'm just glad they finally have brought competition to the high-end GPU market!
It still doesn't, the benchmark numbers don't matter. /s Damn, those 1440p and 1080p numbers (Steven hinted that the 6800XT beats the 3090 at 1080p in Watch Dog Legions during his unboxing). I was hoping for better 4k performance but whatever, I can stand 1440p in games.
totally disregarding the performance: Am I the only one that finds the FE editions (both nVIDIA and AMD) both way more gorgeous than any board partner cards so far? Usually it was the other way around in previous generations...
@@HickoryDickory86 they do actually... sapphire as well. most others though... I don't even wanna start with nVIDIA... most of them are ugly as hell. I also don't like overly "gamey" cards, a little RGB or the occasional exquisite cooler desigin is fine, but most are overly edgy or plain boring. the Founders edition hit the sweet spot between simple, yet good looking... AMD more on the edgy little rebel side, nVIDIA on the more elegant side. I personally don't mind the red strip, it is AMDs color after all and some red accents are perfectly fine with pretty much any case color setup.
But they arn't. Customers arn't buying their cards, and especially not with the driver woes of RX 5700 and XT. As evidenced by the fact that there's more 2080Ti owners, than both RX 5700 and XT..
Lets wait for drivers updates as we know amd at launch always dont have the better drivers.. So lets wait at less 1 month and we may see more performance
@@Hardwareunboxed Thanks. Do you think it's worth upgrading from a 3900X to a 5900X for SAM? I kind of wanna, but I kind of feel like I'll have buyer's remorse next year if I do.
@@shredderorokusaki3013 Yeah can't wait for those benchmarks to release. Think there'll be a Non XT 6900? 6800XT is good for its price but I really am incline to get a GPU labelled 6900 solely for the name.
Hey, Im a computer engineering student and really need a new gpu for University , if you thinking to upgrade, how much can you sell me your GTX 1080 for?
2020: When hardware releases became a physical form of "Early Access" 2021: Introduction of new marketing, the "TimeSaver" Edition - only £30 extra but your item is guaranteed for launch day* *Terms and Conditions apply
Comparing Nvidia to Intel's responses in both cases : RDNA1/Zen1 : Nvidia/Intel - "Hahaha, lol wtf ffs, is that all you've got ?" RDNA2/Zen2 : Nvidia/Intel - "Damn, we actually have to do something for the new generation now..." RDNA3/Zen3: Nvidia/Intel - "FUUUUUUUUUUUUUUUUUUUU*K!!!!!" EDIT: for clarity
If we assume (and i know they can do it now) that they can meet the challenge once, they can do it again. they took their performance straight from geforce 1080TI level performance (5700xt) straight to 3080-3090 performance.. They literally skipped a "2000 series" equivilant card. THAT is an accomplishment. And I think they will do it again with Rdna3 :)
@@tusux2949 eh nvidia still edged out the win this gen. but i believe amd will come for blood next gen i can garuntee it! this feels like their “first attempt” cards like nvidias 2000 series
@@Dj-Mccullough The comparison to the 5700xt is very flawed and extremely missleading. 5700xt had 40 CUs while the 6800 has 60CUs and the 6800xt 72 CUs. It is not like it was impossible for AMD to create 60CU or 72CU RDNA1 GPU, they just decided not to. So the gain from architecture to architecture will be much smaller when we are comparing 40 CU vs 40 CU (probably 6700 or 6700xt) and the difference between those 2 cards is the architectural gain. Probably more in line with 10-20% though both use the same node. RDNA3 will probably be on a better node so you will get some gains there aswell. But you will not see the same gains as when comparing 5700xt vs 6800xt. They will not produce 130-140CU gpus. Even their newly revealed MI100 data center chip only uses 128. So unless you want to pay 5000 bucks for a gaming gpu (and only leverage its performance at 8k) you will have to be satisfied with some architectural+node upgrades which probably will be more like 20-40% uplift.
lol there's a huge CPU bottleneck in every case of 1440p. Theres no way the 3090 is only faster than the 3080 by a few fps lmao, which indicates the CPU is bottlenecking. When you go to 4k where the cpu bottleneck is removed, you immediately see the 6800xt is way slower than the competition.
@@legendarymop The cpu bottleneck is working against AMD not nvidia. The reason that Nvidia has more performance at 4k is because of the FP32 compute (tons of cores) which doesn't scale well at lower resolution. There is a CPU bottleneck at 1080p and a little one at 1440p, but it bottlenecks RDNA 2 more than Nvidia. The rtx3090 is only slightly more powerful than a 3080 and runs at similar clocks, it's an architecture bottleneck at lower resolutions for the RTX3000 series. So for fps snobs at lower resolutions it's RDNA 2 all the way now. (Massive 40%+ performance difference in some games) For 4K snobs it's slightly more in favor of Nvidia. (18-20%+ performance difference in some games)
Just a few years ago AMD GPU'S were going for high compute capabilities at lower frequency. Nvidia was going for lower compute capabilities but at a higher frequency. Now they have completely switched places! The only difference is that they're both good now, not just Nvidia.
@@HollowRick There not that expensive now ... can pick up an LG 4K 144hz for what a 1440P 144hz monitor use to be ... 4K is really the new 1440p ... amazing to play at 4K with everything maxed out
He can't wear a cape: since he's aussie the cape would always float behind him because he's upside down... that would mess with the lighting, you know..
I got feeling that , Nvidia had rush quickly they RTX product to show and try to claim own RayTrace system in games, knowing that AMD is gonna give hard time to them.
You guys are the best. Still tossing numbers for 980s into the mix is amazing. Also having 1080p data availible is also awesome. So many dont even bother with it
4k gaming isnt here yet yes there are some titles that you can play at over 60 fps but for fps games that you need 144hz or more 4k is a far grasp 1440p is the next step up from 1080p and slowly will become more main stream all you have to do is look at games like cyber punk top gpu struggle to playing that game with all the eye candy on, 4k gaming is great for single player games, story based games where high fps dont matter much,
@@0mnikron702 I disagree. I play all my games in 4k max settings. 90% of the time I get 80-120 FPS in games. very few games bring my FPS down to 60 fps or lower. I enjoy all the 4k numbers.
This is the first time I've actually been anticipating a GPU release in a while. Hopefully AMD Radeon continues the momentum with their graphics cards, it's about time we got a return to competition within this market segment.
@@AlexanTheMan It was still a flagship card that gave the performance of a GTX980 for $200. The equivalent of that today would be the RX6800 performing on par with the RTX2080 for $200.
@@AlexanTheMan Plus nodes have gotten ridiculously expensive to develop and produce. Gone are the days where you can get a fantastic card for $300 like in the late 2000s-early 2010s.
You guys realize that money inflation is still a thing, right? When the Nvidia 10 series came out, it blew the RX 480 (and later rebranded 580) out of proportion, and AMD never came out with a proper response after the failure of Vega. Plus also consider that we are still not quite out of this pandemic yet, so the production of the chips are not on full force, but I'm not sure about those. So in my opinion, the prices don't surprise me one bit.
All sold out here in Switzerland.it took 4 minutes. Price were 719 Swiss Francs for 6800 XT 689 Swiss Francs for 6800 So 6800 XT is better buy for sure. I got lucky and got 6800 but really wanted XT version
@@MattJDylan doubtful it hs 17% fewer CUs (60 vs 72), so unless there's someway to unlock them it's unlikely it'll be able to top the 6800XT in perf/dollar in any meaningful way.
@@cptwhite in perf/dollar the 6800 is already topping both the xt and the 3080, and I didn't meant the vanilla 6800 will beat both, but that it will come close enough for him not to worry too much
@@helljester8097 Yes I did, but I really wanted 6800 XT I mean it was just 30 bucks more! And I didn't know there was XFX 6800 XT on stock too. But 4 models in total and they had less than 100 pieces in stock I think.
A lack of ray tracing performance is what scares me now that the consoles adopted it. I personally have less vram and 1080p performance than giving up the option to have decent ray tracing performance. Cyberpunk 2077 is just around the corner and it may deliver something real nice using this technology.
But consoles use RDNA 2, devs will work more with AMD's implementation of Raytracing, in the long run, AMD has an advantage as their Raytracing matures
@@notsoanonymousmemes9455 Sony doesn't use RDNA2 more like RDNA 1.5 also the raytracing that they use is based on Microsoft implementation and not locked down to just AMD so Nvidia can use it also, as well as their own RTX.
@@choatus do you have a source for the nonsense you're saying? No. Sony uses RDNA 2. And of course AMD has a advantage over nvidia because their architectures are everywhere. In 2 consoles and the PC. Only a idiot would ignore that fact.
Given how bad the 3090 overclocks I wouldn't be surprised to see the 6900 XT beat the 3090 by an inconsequential margin, and if it does, the real story is the $500 difference in MSRP haha
Seems like the faster pipes on RDNA2 help with 1080 and 1440, but the fatter pipes on Ampere help at 4K. Be interesting to see how it plays out in the future.
I agree. It'll be especially interesting to revisit this in say, 2-3 years to see if the low VRAM on the Ampere launch versions eats into their 4k lead to the point that it is nonexistent or flips to a deficit. Given that they are already scrambling to redo the lineup and add more VRAM, that'd be a BIG RIP to the people that bought a launch 3080 or 3070.
@@andrewwilliamson3513 10gb is plenty for 3+ years which by then most people will upgrade anyways, the Vram is GDDR 6x and is up to 4x faster than the GDDR 6 that AMD is using so it does help quite a lot especially in those "Fatter Pipes" it's really not as much of a big deal as people are making it out to be, most people don't understand that they are reading Vram allocated not Vram actually being actively used. Source- I own a 3080 plus a lot of other cards. Had my 3080 since almost launch and been loving it, worth every cent so far. But I have had many decent AMD cards in the past also, so don't call me a Fanboy, I will probably buy a 6800xt or 6900xt just for the fun of tinkering with it at some stage too.
@@choatus The basis of my argument is that the Nvidia cards only exceed the AMD card's performance in 4k, so their best use case is for 4k gaming, but the limited Vram might make that a moot point in the next cycle of games (without turning down settings.) Microsoft Flight Sim already allocates over 17Gb of Vram in 4k when under high load and nearly 10Gb in 1440p. Actual usage seems to be around 12Gb/8Gb respectively. The new trend in game development is to early access release games with shit optimization, and the optimization never seems to get addressed even at final release. AAA titles like COD that release every year are the same in a lot of ways. I think Warzone will use over 8Gb. It seems like Nvidia kept the Vram low because GDDR6X is expensive, but it keeps the 3070/3080 from being "Great" 4k cards for the future. They could have achieved the same memory bandwidth with more slower GDDR6 for roughly the same cost.
@@andrewwilliamson3513 to achieve the same bandwidth as GDDR6x they need to have 3-4 times the amount of normal GDDR6 so no it doesnt work like that.. it would end up costing a ton more to do it that way for starters. P.s MSFS on my 3080 in 4k uses (not allocates) less than 7gb of vram right now in ultra. but that game is broken as it needs its update to DX12 before we can see what it really runs like, it's basically running on 2 cores only atm.
Why would they use day 1 drivers for the 30 series? It's Nvdia's advantage for releasing earlier. You want to compare CURRENT performance to help your purchasing decision, not anecdotal. Steve will make an updated version for later drivers in the future as usual, anyways.
@@Mkhalo3 but do you run the absolute maximum settings or a more reasonable "tuned ultra" setting. This video isn't a guide for reachable fps, it's a comparison under the exact same circumstances.
24:40 There's probably a few reasons to get a 3080 over a 6800XT, for some specific applications like ML or CUDA-acceleration in workstation tasks. With that said, I doubt most viewers here are buying either for workstation use primarily. For gaming, AMD's done immensely well here. Well done.
I know they have the audio improving software and better encoders for people that stream. Again. If you don't do that then the 6800XT looks pretty good. NVIDIA wins on the Value Adds. Something that AMD needs to start working on as soon as they can.
I do not agree for machine learning using Roc m with 16 gb of vram should work better than cuda with 10gb of vram because training saturates the ram very easely. Nvidia should be better at fp16 though with tensor cores.
@@KPTKleist CUDA's the big one for me. To a smaller extent, NVENC as well even though I've not done a lot of streaming but have been doing some recordings lately. AMD's got some catching up to do in the features department. The performance is finally right up there but that extra $50 might be enough to tempt some people to go 3080 for those features if they apply to them.
RTX cards have tons of gaming features that AMD users can only dream about. Just wait till you see Cyberpunk 2077 benchmarks, Nvidia will wipe the floor with AMD 6000 series. Biggest game release in many years and fully Nvidia backed with ALL RTX features including DLSS.
Well it technically still can't. It doesn't have DLSS compititor yet, has terrible video encoding performance and gives glitched out render outputs still. Plus bad at ray tracing. Its only for games that doesn't support DLSS I guess. Still they are finally back. These cards especially will of a great attraction to all 1440 and 1440p ultra wide gamers for sure which majority of gamers in this price range.
@@makisekurisu4674 AMD has always been lagging behind nvidia for those extra features... good thing about them is they make the price cheap so nvidia doesn't have a total monopoly and authority to increase prices like they would have if there was 0 competition from AMD.
@@nofreebeer Dont decades go from 0 to 9 ? Ie. we started counting years (and thus decades) in our current system in year 0, so the first 10 years/decade ended when year 9 ended/year 10 started. So we already are in the new decade and 2029 (or rather 01.01.2030) is the start of the next decade. Otherwise, the millenium would have started 2001, not 2000 and the first decade would have 11 years.
24:48 there are definite reasons to buy a 3080 over a 6800 XT and you already mentioned all of them so not sure what this statement means. Better 4k performance, DLSS, RT, driver support which has been garbage for AMD and seamless for Nvidia for years now and not something I'd personally give up easily until RDNA drivers prove to be normal for some time.
I've heard bad stuff about Nvidia drivers for Linux users... In a practical part of my studies, sb who was trying to get their 2080ti drivers working for Tensorflow on Ubuntu, the display wasn't working at all anymore 😅.
DLSS isn't supported in every game, in fact the number of titles that support DLSS are miniscule. AMD are working on a similar technology. Raytracing as it currently stands is nowt more than a gimmick and offers little in terms of visuals for the impact on performance not to mention most games that are not Nvidia sponsored will use the DXR version of RT and not Nvidia's version which will close the RT gap. Better 4k performance in SOME titles, sure. Driver issues have plagued both companies in various generations and will continue to do so, that's a cheap excuse. Why are you so salty bro? Competition is great and Radeon just spanked Nvidia.
1. Ray tracing is a gimmick as it stands today as it still takes a massive frame rate hit even with DLSS for minimal improvements in graphical fidelity. 2. The 4K performance is only 5%. 5% percent and the 3080 loses in 1440 and 1080p (although again within error of margin). 5% is within error of margin and not something anyone could notice. Most people are adopting 1440p and that’s the resolution most will play at soon. 3. AMD has their own implementation of DLSS coming soon. It’s not a feature that can only be implemented for only Nvidia. 4. The drivers are yet to be tested, but Nvidia’s track record is not seamless or spotless. Just a few weeks ago; they had to release patches for crashes to desktop and black screens although no one seemed to want to talk about that unless it’s AMD screwing up. The 3080 is a tough sale unless you want to play with slightly improved shadows and reflections for a fat loss in FPS with less vram and more power consumption.
Thanks Steve, really great review as usual, the level of quality HU puts out is incredible, lots of hard work and it shows. IMHO, what really impresses me the most and the reason I recommend you guys before anyone, is how you guys manage to cover everything so well, I've found the attention given to the VRAM amount difference of the cards really lacking from most reviewers and you guys touched the subject and warned about it. If 8-10GB is barely enough today, for the high end, according to my experience, as soon as the new consoles hit, you better have 12GB or more for the high end, really. Imagine paying U$ 700 for an amazing card today and a year or so from now it struggles, not because the GPU can't handle, but framebuffer. I've had that experience with a fair share of nvidia cards (i.e. GTX 295, GTX 480 SLI, GTX 580, GTX 690), very frustrating. Thanks again for the great work!
@@subhajitchatterjee9688 funny enough the 980Ti has the same amount of VRAM (but GDDR5) & is still way faster. I would've expected both to crap out into similar abysmal performance.
I think a lot of people don't even understand this fact. AMD is basically right in the middle of RT 1 and 2 of Nividia. If AMD can keep this level of innovations for their 2nd gen RT cores i think we will have some serious competition. It's the same with AMDs version of DLSS. This is their first generation and essentially first go at it. As AMD said they will age like fine wine. I just hope it stands true
If it's a AMD sponsored game the RT performance is even better than Ampere. See Dirt 5. Otherwise it looked like 20% less than Ampere and a bit better than Turing.
@@eazen thats what I’m saying, RT is pretty much very limited on games and even more so that can actually run it at 60+ FPS. For me personally, I would buy the 6800XT cause its on par, and often times better, at performance with the 3080. However, what I like about the 3080 is the Audio Isolation thing they have with the microphones.
3:58 why is Valhalla so low performing across the board. the performance deficit on every AC game on PC gets worse while i dont think the graphics level up proportionately with the performance drop in game to game. same thing with watch dogs. Ubisoft games dont look good enough to play as bad as they do imo.
I'm wondering how well all these reviews will hold up in the long run. If you compare results in recent games to the older ones Nvidia seems to be struggling. Dirt 5 and Valhalla in particular surprised me.
A really impressive jump in performance from AMD and even more impressive seeing the 6800XT pass the 3090 now and then. Looks like there will be a lot more all red builds in the future. 4k scaling is a bit dodgy with and without SAM but certainly the better choice for 1440p and overkill 1080p than the 3080.
"So, without wasting any time let's talk about the test system, and then jump into the blue bar graphs. As I know that's what most of you are here for." *_Very angry AHOC noises in the distance_*
Thirty seconds into sales going live, and the Newegg site is down. Here we go again, as expected. That said, is it just architecture differences that tend to give the 4K nod to Nvidia? Seems like somebody playing at 4K has a dilemma: the 3080 is faster but has less VRAM than the 6800XT. What to do?
This will be the dark generation I’m calling it. The vast majority of people will miss out on Zen 3, RNDA2 and Ampere. By the time they are buy able at regular means we will have zen 4, RNDA3 and Hopper.
@@ZackSNetwork I wouldn't call this gen the dark age, if anything this is a golden age for tech! We had massive generational leaps in multiple tech areas (motherboard with PCIE 4.0, CPU with Zen 3, and GPU for both Nvidea and AMD). More people are buying computers than ever before due to those reasons, and because people are stuck at home and realizing that 1 computer isn't enough for 5 poeple lol!
@@vgamedude12 Nor would I, but I don't game at 4K anyway. To me unless you can turn the graphics to max, there's no reason to game at 4K. I mean, 4K with settings dumbed down??? Opinions vary, and rightly so, but chasing high frame rates at 4k in games is a serious money pit. So it's 3440x1440 for me.
Thank you sir. That was very interesting specially the SAM analysis. I was curious if the performance they claim was true. Your test with 3.0 making no difference vs 4.0 is an extra that i really appreciated. Thank you for doing that test.
Yep, and luckily Nvidia wont lock this to a single processor brand (or only Ryzen 5000 series, like AMD does). Nvidia "SAM" will support ALL CPU brands.
Honestly, god bless you guys for using 1440p as the default benchmarking method shown in the charts, seriously, THANK You. It was so unnecessarily arduous / annoying just 6 months to a year ago finding 1440P only benchmarks for the games I care about. It was frustrating, so thank you for your hard work.
@Desktopia People have extremely short memories and horrifically limited attention spans. Anyway, it is true that RDNA is an exceptional GPU architecture and brought real competition to Nvidia that they had not had in a long time. The generational leap in performance from RDNA to RDNA 2 is nothing short of breathtaking and worthy of acclaim. And that AMD is targeting a similar ~50% increase in performance-per-watt with RDNA 3 is very exciting, but neither should detract from AMD's win here with RDNA 2.
Nobody needs expensive HBM if you have the power of Infinity Cache. The old problem which AMD had since 2014 is now solved. No HBM needed anymore for the highend cards.
@@eazen Never said I wished it had it, I said imagine. The infinity cache can only help so much before the gpu has to override the cache and fetch more data from system memory at higher resolutions. You can see how much frames it drops switching to 4k. Also this takes alot of space on the die (infinity cache estimated to take up about 6 billion transistors) which could've been used for more enhancements. Infinite cash though is the next step in graphics though, it is a great idea. HBM is just the overall better solution though, just more expensive.
If I remember right, it's extremely hard to test in a consistently controlled manner (without using the built in benchmark). I believe they said the built in benchmark isn't very representative of true gameplay, so they try to test along a route in game. However, a dynamic world with a day-night cycle can make consistency very off. Steve probably scrapped that test for time since there's so many product releases smashed together.
@@AngelicHunk Thats fair. I mean, the most taxing area in the game i believe is Saint Denis but even then that area isn’t so bad in itself. But I’d still say the benchmark for RDRII is decently accurate.
@@gorky_vk Thats true. Like people who still test GTA V is kinda ridiculous. Nearly anything can run it as long as you aren’t tryna play on a laptop or something really old.
July 2021 and this card is now 400$ cheaper than the 3080 in the pre built R12's from Aienware/Dell. Thats such a huge difference for essentially the same performance.
@@SBOG4215 I got 4 monitors. Three of them to work. The one for my gaming rig is a LG 27GL83A. Many games purposely don't support 21:9 aspect ratios you find on UW monitors. Tell you what, consoles are not even supporting 1440p resolutions, even less UW, pointless resolution really for gaming. For productivity it has its uses tho'
Guys nvidia offer a much more stable product with better infrastructure like geforce experience, game recording in 4k hdr 60 fps etc. Nvidia's driver support are also better vs AMD. Overall it's good to see AMD become a equals at the top end However, when you buy a nvidia card you just get a much more stable, reliable and better allround eco system. Also you showed have used the 10900k it's a better gaming cpu!
It's because this game is optimized to run best on rdna2 because both consoles are using that platform and games are made first for consoles. So we can expect to see a lot more titles that will run better on AMD, especially next year when true next gen games come out.
@@bo-_t_-rs1152 Consoles have had AMD CPU's and GPU's for years, more than 4 generations, but this has never meant that games were all better optimised for AMD hardware on PC... most were better on Intel and Nvidia, and expect it to remain the case for a long time yet.
@@choatus True but the hardware in them was nowhere near the performance of the current gen. AMD was always behind nVidia and intel but that's changed with Ryzen/RDNA2 now.
@@smp219 doesn't make all that much sense since the 980Ti runs into a similar limitation unless it somehow powers through that like the monster it was back then. 🤔
yeah the 4k was mediocre at best, lets wait to see 6900xt too who knows amd might pull a last minute prank on nvidia :D otherwise 3080ti here we come baby
Nice review, it seems NVIDIA has tight competition this "generation" of cards. I disagree with your RT assessment though. RT is relevant, else AMD wouldn't have added it.
It’s relevant but hardly. It’s implementation is games is few and far between. And what games do have it, the performance loss in exchange has hardly been worth it. I’m with Steve that it hardly matters in regards to which GPU is worth buying. The only reason why NVIDIA can tout about it at all is because of DLSS kind of “cheating” performance. Maybe in another 5 years it’ll be semi-standard in games and reasonable to run but as of now, not really.
Premature technologies like ray/path tracing get supported more as they mature, we're still in hybrid territory and it's really added bling at this stage, maybe in another two generations or the next console generation it'll be as significant as rasterisation capability. Just look at Control 4K which is 36FPS on a 3080 and needs DLSS to get to 60FPS and that's still a hybrid implementation. Fortnite with full RT effects (still hybrid) is even worse at 23FPS and 42FPS with DLSS. REF: ruclips.net/video/nX3W7Sx4l78/видео.html If it was as relevant as you say, AMD would have pushed their ray tracing implementation much more than just matching the RTX 2000 series. They've implemented just enough so that you can experience Hybrid real time ray tracing, but not so much that it becomes a primary selling point, because we aren't there yet.
Spot on. Yeah I'm happy AMD is catching up but anyone who says RT won't matter during this generation of cards is kidding themselves. Nvidia 100% this gen, and this isn't even including extra software features and better drivers.
RT will be thing now that consoles can use it. That's always the thing holding technologies like this back. We're going to see RT in just about every AAA game going forward now and AMD being so far behind is not great. We've yet to see if AMD's DLSS equivalent doesn't suck which given how terrible first gen DLSS was I'm not holding my breath.
Why do you think the FPS data that you are showing is so different to that of GamersNexus and Jayztwocents? Do you think it is the difference between enabling Ray Tracing and not? Or could it be something to do with the memory or other tuning of your test bench favouring AMD hardware somehow?
I've noticed this Hardware Unboxed review is a bit different than other reviews from other sites. Steve also seems to just 'dismiss' Ray Tracing as if its a passing fad.
Just want to say THANK YOU for having the 3950X in your test bench, you keep apologizing for it, but it's been a very welcome data point in reference to other reviewers, and closer to what a lot of your Ryzen 3000 viewers are still packing.
That's a lot for another excellent review, Steve! They must have made SAM available for all Ryzen users across the whole AM4 platform, no matter what motherboards or CPUs people use. I hope Nvidia will force AMD to do this soon. Anyway, I am impressed! AMD finally became a really dangerous competitor to both Intel and Nvidia. Ironically it happened almost at the same time, with the release of Zen 3 and RDNA 2. It will make things a lot more interesting!
I agree. these new cards are clearly a very good substitue for the RTX 3000. However, u also have think in the future and what we have right now. its obvious that the trend for games (AAA) is ray tracing...so u cannot say anymore that the number of games with ray tracing is limited. its just going to increase. Its a fact that ray tracing technology in AMD cards is inferior to nvidias at the moment. Furthermore they lack AI services like the nvidia broadcast app, which are pretty cool or even needed if u want to stream. I think if u dont care about streaming or ray tracing, sure amd is a better option. but like i said, more and more games are coming with better visuals and most people will be stuck with amd for a while... I think that AMD will crash Nvidia with their next generation RX cards though.
It depends on if older motherboards can manage it. Even nvdia did saynthatnthey will only domit for new amd motherboards, so older may lack some feature that is needed!
@@inrptn how can he know that? SAM doesn't even run on PCIe 3.0 right now. Only Ryzen 5000 series on an X570 supported, which guarantee PCIe 4.0. For more SAM benchmarks you can look at Hardware Unboxed. Some games have massive improvement like Assassin's Creed Valhalla.
Finally one thorough and clear 6800xt review. I'm tired of "big" reviewers making points with 3DMark benchmarks and 5-6 games of which 2 are 5+ years old. Who will buy these cards to play GTA V anyway???
Or red dead redemption 2, or control, or cyberpunk, or metro, or battlefield or any of the other titles that support it. Thats not even mentioning dlss where amd gets shit on :)
If AMD had a DLSS 2.0 competitor I could see the 6800xt as a great competitor to the 3080, without it there is just too much gained with that technology to switch from team Nvidia
@@jagoob that's the thing. DLSS is only going to get more and more optimized for a wider variety of games. And people with those cards will love the dedicated cores for minor ai workloads and the improving raytracing tech which is here to stay.
If you have 1080p screen then dlss isn't needed. 1440p or 4K yeah fine. They have super sampling or something like that and they said that they will release it in the near future. Judging by nvidia tho with their first gen of dlss( which was shit ) I dont know what to expect.
@@kostisap9350 .. Why judge by DLSS Technology by DLSS 1.0 (which was sh!t) when DLSS 2.0 already blows that first version outta the water. If DLSS 3.0 improves even half more the improvements of what DLSS 2.0 has provided, it will be something quite special imo ...
@@williammurphy1674 Yea, if they manage to improve even further it would be phenomenal. Even with DLSS 2.0 as it is now the game looks way better than with TAA or other AA methods and performs way better. I am enjoying Control so much on my 2k 240hz monitor right now with my 2070 super, getting steady 80+ framerates with gsync enabled. Such a smooth and beautiful experience!
In Canada there was no stock at the only major retailer in my city I could buy it from locally. Picked up my pre-ordered Nvidia card and the retailer told me they had more orders for 3080s today then any other day. Was considering the 6800xt but for me was a what came in stock for me to buy so Nvidia got my money.
@@syahmimi1425 They didn't get any extra stock I've been waiting weeks for card I back ordered. Sales associate told me they had more orders placed for back order on 3080s then any other day today. They did not get more stock just more orders for back order.
I'll give you the scoop. Nvidia would have slaughtered AMD because the RT stuff isn't just a little bit behind it's WAY behind and AMD has no DLSS equivalent yet. The review was dismissing RT so why bother showing it?
The stock levels here in Northern Europe was sufficient, had 7 min to pick one up :D But I missed some AIBs brands.. And for pricing, the 6800XT was easily the best buy.
Watch all reviews and draw the conclusion yourself. It doesn't matter what they say, it is only their own opinion. For me playing @1440*3440, result of 4k is a decisive comparison. I do care about RT and DLSS. Also, look at games they pick for the test. I am not going to play none of those AMD sponsored titles, so I do care about the titles that I am interested. For my personal perspective, 3080 is a go to option. But, none would regret buying 6800 series cause they are pretty darn good.
I dont even need to watch LTT's review to know they have once again created a supper click-baity title and a conclusion. never watch LTT for actual product review, its more of a drama channel for the masses.
HUB focused almost exclusively on rasterized price/perf, while LTT put a heavy emphasis on ray tracing performance and features. I wouldn't say either is right or wrong, just completely different priorities.
That is because Steve is an AMD fanboy 100%. He won't admit the defeat that these AMD cards have against Nvidia. He smooths over ruffled feathers with statements like: "And here the 6800XT does fall behind the 3080 by a 15% margin, which is quite a substantial difference, [THOUGH A 105 fps ON AVERAGE AT 4K IS STILL QUITE IMPRESSIVE]".
HI STEVE, I HOPE WE GET TO SLEEP SOON
Our reaction when we see a dozen RX6800 XT reviews pop up at once.
Paul you are sexy as hell without sleep. Just wanted to say that.
LOL
GET BACK IN YOUR CAVE AND GET US NUMBERS, NUMBER SLAVE
hahaha you guys have busted your asses over the past month, you guys need a vacation
Well we can't buy one as per usual launch in 2020, but at least I can help reward Steve for all the hard work they put into the review of another great product we can't buy :) Thanks for the video.
Keep up the good work, Heir :),
didnt happen for cometlake tho 🤣 sad intel
I got 1. And I am a hunter pecker on the keyboard.
@@teamtechworked8217 if you learn to touch type you could get two or three.
@@teamtechworked8217 Personally I typed like that for a decade then realized I knew where all the keys were without looking and I no longer had tiny child hands to make it necessary,fully trained myself to do it by touch within days and faster than ever within weeks,just had to commit to it. Totally worth it.
Your graphs are very clear; much clearer than that of other outlets. Coloring the relevant bars with colors that standout makes it easy to immediately understand the graph without hunting around.
*angry buildzoid noises*
i agree. his graphs are the best out of all reviewers. his tests always seem to have amd cards higher than other reviewers as well. even the rx 5700 xt.
I was wondering if it's just because I'm Australian I prefer this channel by a lot. I guess not?
I agree with that
Sounds like my boss in every meeting
Steve be like: “SLEEP IS FOR THE WEAK” *passes out*
We'll never sleep, cos sleep is for the weak.
We'll never rest, till we're fucking dead
I know how demanding repetitive job of PC creation can be, I´ve been drawing in CAD and catia for hours. But it is not like he is doing some crazy job. He is tired, but he can create 12-14 hours a day for a few days, no problem. After all, there is money for him.
I thought hardware news was for the weak
@@MarioAPN "12-14 hours a day for a few days, no problem" haha cute, imagine only working 14 hours per day, you're living the dream!
@@Hardwareunboxed the self employed will always work the hardest!
Finally! I can now start waiting for RDNA 3
Just in time for Covid 21 perhaps. o.O
“In many ways the AMD GPU is the superior product”
What a good year for AMD. Just incredible.
Looooool
🤣🤣🤣
“In many ways the AMD GPU is the superior product”
What a good year for AMD. Just incredible.
Just a shame there is such a lack of stock, but that's an industry wide problem i suppose, not exactly AMDs fault
@@TheArrowedKnee This really sucks man. Sometimes I wish covid wasn't a thing, because of things like this. Most times I don't because this has honestly been a good year for me.
Only really in three ways is it the superior product: price, vram capacity, and power consumption. But I imagine that those three reasons alone can convince a lot of people, because it definitely almost convinced me. But for me the value advantage is negligible because of the extra features that rtx cards have. I’ll make a purchase sometime next year when all product choices are accounted for and planned features are properly implemented.
@@rct9617 Wise words.
now they just have to follow with good drivers
It is amazing how much ground AMD has made uo in a single generation and it will be great once stocks not a problem for us as customers finally we will have a choice in the high end
Imagine that there were people who doubted that AMD could even match the 2080Ti...
Guilty.
i was guilty as well. finally ready to upgrade mt 1080ti
@@ddlog420 That's understandable, AMD has disappointed a lot during the last couple of years, especially with GPU's. I'm just glad they finally have brought competition to the high-end GPU market!
It still doesn't, the benchmark numbers don't matter.
/s
Damn, those 1440p and 1080p numbers (Steven hinted that the 6800XT beats the 3090 at 1080p in Watch Dog Legions during his unboxing). I was hoping for better 4k performance but whatever, I can stand 1440p in games.
Still cant compete in raytracing or any other possible software feature or even stock right now so its still not a win for AMD.
totally disregarding the performance: Am I the only one that finds the FE editions (both nVIDIA and AMD) both way more gorgeous than any board partner cards so far? Usually it was the other way around in previous generations...
I havent seen many AMD board partner designs yet, but yes. Both AMD and NVIDIA massively stepped up their games in the looks department.
I hate that red strip on the 6800/xt its obnoxious as hell. Makes it not look premium >
I don't know... those XFX teases are looking mighty sexy. 😏
@@MrMilkyCoco what is consider premium to you? Genuine question.
@@HickoryDickory86 they do actually... sapphire as well. most others though... I don't even wanna start with nVIDIA... most of them are ugly as hell. I also don't like overly "gamey" cards, a little RGB or the occasional exquisite cooler desigin is fine, but most are overly edgy or plain boring. the Founders edition hit the sweet spot between simple, yet good looking... AMD more on the edgy little rebel side, nVIDIA on the more elegant side. I personally don't mind the red strip, it is AMDs color after all and some red accents are perfectly fine with pretty much any case color setup.
Who's ready to watch a combined two hours of Big Navi benchmarks from all these techtubers right now?
me
yup I'm ready bro
Sign me up
*my body is ready*
Except for GN, I actually need to be focused enough before watching their reviews
@@MattJDylan The second hour is reserved for Tech Jesus' review :P
I'm so happy Amd is now giving competition to other previous market dominating brands
What competition? They sell exactly zero of these cards in the Netherlands.
@@Superiorer and that's why Nvidia can price their cards however they want, even when competition is great, people buy Nvidia
@@shernandez31 so true
@@Superiorer not anymore.
But they arn't.
Customers arn't buying their cards, and especially not with the driver woes of RX 5700 and XT.
As evidenced by the fact that there's more 2080Ti owners, than both RX 5700 and XT..
Despite the whole DLSS and RTX thing its good to see that AMD is competitive again.
They have ray-tracing and will also have dlss in the future.
Yes but its nowhere near nvidia's performance in those categories thats what Romeo probably meant
Lets wait for drivers updates as we know amd at launch always dont have the better drivers.. So lets wait at less 1 month and we may see more performance
Sure but I don't think AMD can just fix their ray tracing and dlss with a simple driver update
while nvidia ... 2nd gen rt, amd 1st gen rt, so yeah.. rdna 3 will be the real battle. head to head with nvidia.
Just give me a Sapphire Nitro+ version of this baby.
Review should be up on the 25th for you ;)
@@Hardwareunboxed Thanks. Do you think it's worth upgrading from a 3900X to a 5900X for SAM? I kind of wanna, but I kind of feel like I'll have buyer's remorse next year if I do.
@@praetorxyn your still good bro
@@praetorxyn if you are planning to game, I might go for it. For anything else, no cause SAM only increases performance in games.
I want Toxic
How other people bench: Muscle gains.
How Steve Benches: FPS gains.
Brilliant comment Sir! :)
that's how Steve flexs
All kinds of gains. All kinds.
@@shredderorokusaki3013
Yeah can't wait for those benchmarks to release. Think there'll be a Non XT 6900? 6800XT is good for its price but I really am incline to get a GPU labelled 6900 solely for the name.
I have full faith that retail will add $300AU to the $1050AU price when AIB cards are released...
My 1080s fan just started squeaking after watching this video.
Best to upgrade before it explodes in your face then. xD
You can wait for rdna 3 with a 1080
My 1080's memory crapped itself last week. Had to upgrade to a 3070 ;-)
Hey, Im a computer engineering student and really need a new gpu for University , if you thinking to upgrade, how much can you sell me your GTX 1080 for?
@@TheAlbaniaGaming Look on Facebook market and Craigslist you can find them for 350$ and less if you spend a few hours looking 👍💯
11:44 - RTX 2060: "I can do this... I CAN DO THIIIIIS!"
😂
Why is it so low
😂
@@ihfazhassan1548 seems to be some driver bug with a memory bottleneck.
I saw that too and mumbled "Damn"
2020: When hardware releases became a physical form of "Early Access"
2021: Introduction of new marketing, the "TimeSaver" Edition - only £30 extra but your item is guaranteed for launch day*
*Terms and Conditions apply
Shhhh, Nvidia might hear you!
TimeSaver Edition = Anti-competitive Edition = Billions of dollars in fines, not that we ordinary citizens would ever see any of that money anyway.
Noooo What have you done, they will see this idea and steal it lol
Comparing RDNA to Zen
RDNA1 & Zen : "Meh"
RDNA2 & Zen 2 : "Interesting"
RDNA3 : "See Zen 3"
Comparing Nvidia to Intel's responses in both cases :
RDNA1/Zen1 : Nvidia/Intel - "Hahaha, lol wtf ffs, is that all you've got ?"
RDNA2/Zen2 : Nvidia/Intel - "Damn, we actually have to do something for the new generation now..."
RDNA3/Zen3: Nvidia/Intel - "FUUUUUUUUUUUUUUUUUUUU*K!!!!!"
EDIT: for clarity
I plan on upgrading my system to a zen 4 rdna 3 system when those both launch. It'll be a huge upgrade from my r5 1600 and gtx 1080 system.
If we assume (and i know they can do it now) that they can meet the challenge once, they can do it again. they took their performance straight from geforce 1080TI level performance (5700xt) straight to 3080-3090 performance.. They literally skipped a "2000 series" equivilant card. THAT is an accomplishment. And I think they will do it again with Rdna3 :)
@@tusux2949 eh nvidia still edged out the win this gen. but i believe amd will come for blood next gen i can garuntee it! this feels like their “first attempt” cards like nvidias 2000 series
@@Dj-Mccullough The comparison to the 5700xt is very flawed and extremely missleading. 5700xt had 40 CUs while the 6800 has 60CUs and the 6800xt 72 CUs. It is not like it was impossible for AMD to create 60CU or 72CU RDNA1 GPU, they just decided not to. So the gain from architecture to architecture will be much smaller when we are comparing 40 CU vs 40 CU (probably 6700 or 6700xt) and the difference between those 2 cards is the architectural gain. Probably more in line with 10-20% though both use the same node. RDNA3 will probably be on a better node so you will get some gains there aswell.
But you will not see the same gains as when comparing 5700xt vs 6800xt. They will not produce 130-140CU gpus. Even their newly revealed MI100 data center chip only uses 128. So unless you want to pay 5000 bucks for a gaming gpu (and only leverage its performance at 8k) you will have to be satisfied with some architectural+node upgrades which probably will be more like 20-40% uplift.
AMD is finally back. I’ve been wanting to upgrade and it’s been a long time since I’ve owned an AMD card.
Steve: most of you are here for the blue bar graphs
Me: I'm here for Steve's sick accent, but that's just me.
Haha you got to move to Australia then mate. I'm a newcomer here and enjoying the accent a lot.
@@sausageskin Did you move for work? Genuinely curious.
@@TheMetalGaia I was more trying to escape Russia and live in a country where people are nice and not bully me :)
@@TheMetalGaia I applied to Australian permanent residency and got it after two years of paperwork
@@sausageskin That's awesome! Glad you got it. I've considered moving to Canada from the U.S. before.
omg , the 6800xt is a monster of gpu
So is the 3080
lol there's a huge CPU bottleneck in every case of 1440p. Theres no way the 3090 is only faster than the 3080 by a few fps lmao, which indicates the CPU is bottlenecking. When you go to 4k where the cpu bottleneck is removed, you immediately see the 6800xt is way slower than the competition.
3080 is faster in 4k because of memory
@@legendarymop The cpu bottleneck is working against AMD not nvidia.
The reason that Nvidia has more performance at 4k is because of the FP32 compute (tons of cores) which doesn't scale well at lower resolution.
There is a CPU bottleneck at 1080p and a little one at 1440p, but it bottlenecks RDNA 2 more than Nvidia.
The rtx3090 is only slightly more powerful than a 3080 and runs at similar clocks, it's an architecture bottleneck at lower resolutions for the RTX3000 series.
So for fps snobs at lower resolutions it's RDNA 2 all the way now. (Massive 40%+ performance difference in some games)
For 4K snobs it's slightly more in favor of Nvidia. (18-20%+ performance difference in some games)
@@bastordd Doesn't the 3080 have less memory? How'd less memory make it faster.
AMD costing 350 dollars less in australia even for AIBs! Hopefully they're in stock for more than 1 second.
LOL good luck Mate
Just a few years ago AMD GPU'S were going for high compute capabilities at lower frequency. Nvidia was going for lower compute capabilities but at a higher frequency. Now they have completely switched places! The only difference is that they're both good now, not just Nvidia.
144hz 1440p at max settings, finally!!!
Never cared for 4k that much....
Same not until 4k 144hz monitors become affordable 1440p 144hz is the sweet spot
@@HollowRick I'm happy with 1080p 144hz. Cheers.
@@rahuloberoi9739 same bro just sucks my 1060 cant run the new games more then 60 fps on medium to high settings anymore :(
3440x1440 is where it's at. 21:9 is so much better for gaming.
@@HollowRick There not that expensive now ... can pick up an LG 4K 144hz for what a 1440P 144hz monitor use to be ... 4K is really the new 1440p ... amazing to play at 4K with everything maxed out
Amd reference design is "cool and quiet".
It's so weird hearing that after 'hot and loud' release after release after release.
improvement at its finest
Probably the biggest improvement ever made 😆
hmm... 75c is a full 15c higher than my 3080 when it's OC'd, seems to be weird considering AMD using a smaller node (7nm).
@@MLWJ1993 Likely due to the smaller die at -108mm^2 drawing albeit much less power but still almost 300W.
I'm actually impressed amd made a card this powerful. It's nice to see competition again.
They say all heroes wear capes, but Steve here prove it differently by being a wizard that delivered unbelievable amount of benchs.
"They say all heroes wear capes, but Steve here wears hoodie. What's up with that?"
He can't wear a cape: since he's aussie the cape would always float behind him because he's upside down... that would mess with the lighting, you know..
Out of stock
BUYER RAGE MODE ON
It sold out when Newegg had crashed at 0901 and when it came back at 0914 it was sold out. AMD website was similar.
Sooo much better than the other 6800XT reviews. Good job as always Steve.
BEGUN THE GPU WARS HAVE.
That's all that matters.
Spoken well you have, a like must I give.
@@edgain1502 Indeed a great comment have you given, a like doubled from me have you received.
BEGUN THE VAPORWARE WARS HAVE
There... fixed it for you
Yeah. That's actually the where the real excitement is - _at least until I can actually afford an upgrade_
I got feeling that , Nvidia had rush quickly they RTX product to show and try to claim own RayTrace system in games, knowing that AMD is gonna give hard time to them.
You guys are the best. Still tossing numbers for 980s into the mix is amazing. Also having 1080p data availible is also awesome. So many dont even bother with it
Very nice to see direct comparison for me as a 980ti owner
lets be honest, were not here for 4k review we want that 1440p
Who's we?
not the case at all. like your pic tho ;)
4k gaming isnt here yet yes there are some titles that you can play at over 60 fps but for fps games that you need 144hz or more 4k is a far grasp 1440p is the next step up from 1080p and slowly will become more main stream all you have to do is look at games like cyber punk top gpu struggle to playing that game with all the eye candy on, 4k gaming is great for single player games, story based games where high fps dont matter much,
@@0mnikron702 I disagree. I play all my games in 4k max settings. 90% of the time I get 80-120 FPS in games. very few games bring my FPS down to 60 fps or lower. I enjoy all the 4k numbers.
This is the first time I've actually been anticipating a GPU release in a while. Hopefully AMD Radeon continues the momentum with their graphics cards, it's about time we got a return to competition within this market segment.
The price is still an utter joke. Last time we saw competition there was RX480
The RX 480 was only powerful enough to compete with the mid end market. These high end cards are naturally always going to be expensive to a degree.
@@AlexanTheMan It was still a flagship card that gave the performance of a GTX980 for $200. The equivalent of that today would be the RX6800 performing on par with the RTX2080 for $200.
@@AlexanTheMan Plus nodes have gotten ridiculously expensive to develop and produce. Gone are the days where you can get a fantastic card for $300 like in the late 2000s-early 2010s.
You guys realize that money inflation is still a thing, right? When the Nvidia 10 series came out, it blew the RX 480 (and later rebranded 580) out of proportion, and AMD never came out with a proper response after the failure of Vega. Plus also consider that we are still not quite out of this pandemic yet, so the production of the chips are not on full force, but I'm not sure about those. So in my opinion, the prices don't surprise me one bit.
All sold out here in Switzerland.it took 4 minutes.
Price were
719 Swiss Francs for 6800 XT
689 Swiss Francs for 6800
So 6800 XT is better buy for sure. I got lucky and got 6800 but really wanted XT version
Give it time: with the same power limit shenanigans we saw with vega and rdna1, the 6800 will be a better value once oc'd
@@MattJDylan doubtful it hs 17% fewer CUs (60 vs 72), so unless there's someway to unlock them it's unlikely it'll be able to top the 6800XT in perf/dollar in any meaningful way.
Did you get the asus one at 689? I managed to order an msi 6800 at 669 from digitec before they greyed out the add to cart buttons.
@@cptwhite in perf/dollar the 6800 is already topping both the xt and the 3080, and I didn't meant the vanilla 6800 will beat both, but that it will come close enough for him not to worry too much
@@helljester8097 Yes I did, but I really wanted 6800 XT I mean it was just 30 bucks more! And I didn't know there was XFX 6800 XT on stock too.
But 4 models in total and they had less than 100 pieces in stock I think.
A lack of ray tracing performance is what scares me now that the consoles adopted it. I personally have less vram and 1080p performance than giving up the option to have decent ray tracing performance. Cyberpunk 2077 is just around the corner and it may deliver something real nice using this technology.
I agree, Raytracing is the future, no matter how much Steve doesn't care about it..
I also think that a lot of future titles will bring ray tracing support, starting with cyberpunk which is gonna be awesome :)
But consoles use RDNA 2, devs will work more with AMD's implementation of Raytracing, in the long run, AMD has an advantage as their Raytracing matures
@@notsoanonymousmemes9455 Sony doesn't use RDNA2 more like RDNA 1.5 also the raytracing that they use is based on Microsoft implementation and not locked down to just AMD so Nvidia can use it also, as well as their own RTX.
@@choatus do you have a source for the nonsense you're saying? No. Sony uses RDNA 2.
And of course AMD has a advantage over nvidia because their architectures are everywhere. In 2 consoles and the PC. Only a idiot would ignore that fact.
If the 6800XT is this good, I can’t imagine the 6900XT benchmarks when it comes out.
It won't be so much better, i believe, just look at 3080 vs 3090
Between the 6800 XT and 6900 XT, there is a difference of only 8 Compute Units. As the other commenter said, it'll be the same as 3080 vs 3090
@@matteovukoja1240 also 8 more "ray accelerators" so hopefully slightly better RT performance too
@@Saigonas It will definitely beat the 3090 at 1440p, though. Because the 6800XT was only 5% slower at 1440p than the 3090.
Given how bad the 3090 overclocks I wouldn't be surprised to see the 6900 XT beat the 3090 by an inconsequential margin, and if it does, the real story is the $500 difference in MSRP haha
Seems like the faster pipes on RDNA2 help with 1080 and 1440, but the fatter pipes on Ampere help at 4K. Be interesting to see how it plays out in the future.
Its the 256 bit bus i think, but its still a great gpus for me
I agree. It'll be especially interesting to revisit this in say, 2-3 years to see if the low VRAM on the Ampere launch versions eats into their 4k lead to the point that it is nonexistent or flips to a deficit. Given that they are already scrambling to redo the lineup and add more VRAM, that'd be a BIG RIP to the people that bought a launch 3080 or 3070.
@@andrewwilliamson3513 10gb is plenty for 3+ years which by then most people will upgrade anyways, the Vram is GDDR 6x and is up to 4x faster than the GDDR 6 that AMD is using so it does help quite a lot especially in those "Fatter Pipes" it's really not as much of a big deal as people are making it out to be, most people don't understand that they are reading Vram allocated not Vram actually being actively used. Source- I own a 3080 plus a lot of other cards. Had my 3080 since almost launch and been loving it, worth every cent so far. But I have had many decent AMD cards in the past also, so don't call me a Fanboy, I will probably buy a 6800xt or 6900xt just for the fun of tinkering with it at some stage too.
@@choatus The basis of my argument is that the Nvidia cards only exceed the AMD card's performance in 4k, so their best use case is for 4k gaming, but the limited Vram might make that a moot point in the next cycle of games (without turning down settings.) Microsoft Flight Sim already allocates over 17Gb of Vram in 4k when under high load and nearly 10Gb in 1440p. Actual usage seems to be around 12Gb/8Gb respectively. The new trend in game development is to early access release games with shit optimization, and the optimization never seems to get addressed even at final release. AAA titles like COD that release every year are the same in a lot of ways. I think Warzone will use over 8Gb. It seems like Nvidia kept the Vram low because GDDR6X is expensive, but it keeps the 3070/3080 from being "Great" 4k cards for the future. They could have achieved the same memory bandwidth with more slower GDDR6 for roughly the same cost.
@@andrewwilliamson3513 to achieve the same bandwidth as GDDR6x they need to have 3-4 times the amount of normal GDDR6 so no it doesnt work like that.. it would end up costing a ton more to do it that way for starters. P.s MSFS on my 3080 in 4k uses (not allocates) less than 7gb of vram right now in ultra. but that game is broken as it needs its update to DX12 before we can see what it really runs like, it's basically running on 2 cores only atm.
Are these day one drivers vs day one drivers? I mean the 30 series has had an extra 2 months now to iron out driver issues.
Why would they use day 1 drivers for the 30 series? It's Nvdia's advantage for releasing earlier. You want to compare CURRENT performance to help your purchasing decision, not anecdotal. Steve will make an updated version for later drivers in the future as usual, anyways.
As a 3080 owner I can say for a lot of these games the average FPS with my card is way higher than what they got with their 3080 so who knows lol
@@Mkhalo3 but do you run the absolute maximum settings or a more reasonable "tuned ultra" setting.
This video isn't a guide for reachable fps, it's a comparison under the exact same circumstances.
@@paulelderson934 Everything maxed out with the highest FOV as well
24:40
There's probably a few reasons to get a 3080 over a 6800XT, for some specific applications like ML or CUDA-acceleration in workstation tasks.
With that said, I doubt most viewers here are buying either for workstation use primarily. For gaming, AMD's done immensely well here. Well done.
Same here, mainly because of emulators, but as an AMD user for a long time this makes me smile.
I know they have the audio improving software and better encoders for people that stream. Again. If you don't do that then the 6800XT looks pretty good. NVIDIA wins on the Value Adds. Something that AMD needs to start working on as soon as they can.
I do not agree for machine learning using Roc m with 16 gb of vram should work better than cuda with 10gb of vram because training saturates the ram very easely. Nvidia should be better at fp16 though with tensor cores.
@@KPTKleist CUDA's the big one for me. To a smaller extent, NVENC as well even though I've not done a lot of streaming but have been doing some recordings lately.
AMD's got some catching up to do in the features department. The performance is finally right up there but that extra $50 might be enough to tempt some people to go 3080 for those features if they apply to them.
RTX cards have tons of gaming features that AMD users can only dream about. Just wait till you see Cyberpunk 2077 benchmarks, Nvidia will wipe the floor with AMD 6000 series. Biggest game release in many years and fully Nvidia backed with ALL RTX features including DLSS.
people then : big navi cannot even compete against RTX 3070
big navi now :
This isn't even considered the big navi right? Isn't 6900 xt supposed to be big navi?
@@shaquilleoatmeal5332 Big navi is the entire generation, including 6800, 6800 xt and 6900 xt.
@@shaquilleoatmeal5332 The 9600XT is "the biggest big navi" :P
Well it technically still can't. It doesn't have DLSS compititor yet, has terrible video encoding performance and gives glitched out render outputs still. Plus bad at ray tracing. Its only for games that doesn't support DLSS I guess.
Still they are finally back.
These cards especially will of a great attraction to all 1440 and 1440p ultra wide gamers for sure which majority of gamers in this price range.
@@makisekurisu4674 AMD has always been lagging behind nvidia for those extra features... good thing about them is they make the price cheap so nvidia doesn't have a total monopoly and authority to increase prices like they would have if there was 0 competition from AMD.
I'm always impressed with the paramount
size of data you guys crunch on each launch, really appreciate it. Thank you and lots of love from S. Korea
Can't wait to buy literally any gpu at the end of this decade. I'm sure they'll be available then right?
Oh, you probably *can* get them pretty soon.
Just need to be prepared to pay 1000$+ to some random guy running a bot.
@@DeadNoob451 A good business as long as there is stupid people paying this ammount.
@@hermesgabriel yeah
the decade is over in less than 45 days. So probably not?
@@nofreebeer Dont decades go from 0 to 9 ?
Ie. we started counting years (and thus decades) in our current system in year 0, so the first 10 years/decade ended when year 9 ended/year 10 started. So we already are in the new decade and 2029 (or rather 01.01.2030) is the start of the next decade.
Otherwise, the millenium would have started 2001, not 2000 and the first decade would have 11 years.
24:48 there are definite reasons to buy a 3080 over a 6800 XT and you already mentioned all of them so not sure what this statement means.
Better 4k performance, DLSS, RT, driver support which has been garbage for AMD and seamless for Nvidia for years now and not something I'd personally give up easily until RDNA drivers prove to be normal for some time.
I've heard bad stuff about Nvidia drivers for Linux users...
In a practical part of my studies, sb who was trying to get their 2080ti drivers working for Tensorflow on Ubuntu, the display wasn't working at all anymore 😅.
Butthurt much?
DLSS isn't supported in every game, in fact the number of titles that support DLSS are miniscule. AMD are working on a similar technology. Raytracing as it currently stands is nowt more than a gimmick and offers little in terms of visuals for the impact on performance not to mention most games that are not Nvidia sponsored will use the DXR version of RT and not Nvidia's version which will close the RT gap. Better 4k performance in SOME titles, sure. Driver issues have plagued both companies in various generations and will continue to do so, that's a cheap excuse. Why are you so salty bro? Competition is great and Radeon just spanked Nvidia.
@@Sharki33 Only one who seems butthurt is you
1. Ray tracing is a gimmick as it stands today as it still takes a massive frame rate hit even with DLSS for minimal improvements in graphical fidelity.
2. The 4K performance is only 5%. 5% percent and the 3080 loses in 1440 and 1080p (although again within error of margin). 5% is within error of margin and not something anyone could notice. Most people are adopting 1440p and that’s the resolution most will play at soon.
3. AMD has their own implementation of DLSS coming soon. It’s not a feature that can only be implemented for only Nvidia.
4. The drivers are yet to be tested, but Nvidia’s track record is not seamless or spotless. Just a few weeks ago; they had to release patches for crashes to desktop and black screens although no one seemed to want to talk about that unless it’s AMD screwing up.
The 3080 is a tough sale unless you want to play with slightly improved shadows and reflections for a fat loss in FPS with less vram and more power consumption.
Thanks Steve, really great review as usual, the level of quality HU puts out is incredible, lots of hard work and it shows. IMHO, what really impresses me the most and the reason I recommend you guys before anyone, is how you guys manage to cover everything so well, I've found the attention given to the VRAM amount difference of the cards really lacking from most reviewers and you guys touched the subject and warned about it. If 8-10GB is barely enough today, for the high end, according to my experience, as soon as the new consoles hit, you better have 12GB or more for the high end, really. Imagine paying U$ 700 for an amazing card today and a year or so from now it struggles, not because the GPU can't handle, but framebuffer. I've had that experience with a fair share of nvidia cards (i.e. GTX 295, GTX 480 SLI, GTX 580, GTX 690), very frustrating.
Thanks again for the great work!
Just subbed. I love how this review is less opinion and more data than some of the other channels I know.
Steve is a legend and a blessing to tech mankind. Welcome to benchmark heaven where sleep is for the weak and there are numbers everywhere. :P
Check out Games Nexus. They stuff data down your throat.
11:30 those RTX 2060 4k Wolfenstein results xD 6fps average sounds good.
It seems weird, could just be a driver bug.
😂😂😂
@@PatalJunior Nah it ran out of Vram....6GB is not enough for 4k
@@subhajitchatterjee9688 funny enough the 980Ti has the same amount of VRAM (but GDDR5) & is still way faster. I would've expected both to crap out into similar abysmal performance.
@@MLWJ1993 Oh right...that seems strange how rtx 2060 can struggle so much then
14:41 worth mentioning:
The 6800xt has the same RT performance as the 2080 ti. (even slightly better)
That's not too bad at all!
I think a lot of people don't even understand this fact. AMD is basically right in the middle of RT 1 and 2 of Nividia. If AMD can keep this level of innovations for their 2nd gen RT cores i think we will have some serious competition. It's the same with AMDs version of DLSS. This is their first generation and essentially first go at it. As AMD said they will age like fine wine. I just hope it stands true
If it's a AMD sponsored game the RT performance is even better than Ampere. See Dirt 5. Otherwise it looked like 20% less than Ampere and a bit better than Turing.
@@eazen thats what I’m saying, RT is pretty much very limited on games and even more so that can actually run it at 60+ FPS. For me personally, I would buy the 6800XT cause its on par, and often times better, at performance with the 3080. However, what I like about the 3080 is the Audio Isolation thing they have with the microphones.
Purposefully stayed up doing homework to watch this premiere! So worth it, love your work Steve!
lol
3:58 why is Valhalla so low performing across the board. the performance deficit on every AC game on PC gets worse while i dont think the graphics level up proportionately with the performance drop in game to game. same thing with watch dogs. Ubisoft games dont look good enough to play as bad as they do imo.
Very comprehensive review @hardware unboxed. I've been anxiously waiting for your review
This video: Steve saying "6800 xt is better in 1440p and the 3080 is faster at 4k" in 20 different ways
Seems to be somewhat weird, many other reviews see equal at 1440p, red win 1080, green win 4K.
NyxNyxNyx Green 3080 at 1140p? Does this also count for 3070?
he is using the 5950x for 6800xt results. it's ok for 4k but not 1440p when the 3080 results are with 10900k
I'm wondering how well all these reviews will hold up in the long run. If you compare results in recent games to the older ones Nvidia seems to be struggling. Dirt 5 and Valhalla in particular surprised me.
@@theeternal417 they are also done for 3080 using old drivers. I would rather he did less games but identical environments and up to date drivers
A really impressive jump in performance from AMD and even more impressive seeing the 6800XT pass the 3090 now and then. Looks like there will be a lot more all red builds in the future. 4k scaling is a bit dodgy with and without SAM but certainly the better choice for 1440p and overkill 1080p than the 3080.
I come to this channel first cuz we get straight benchmarks and no bullshit its just so time-saving. Thanks steve
"So, without wasting any time let's talk about the test system, and then jump into the blue bar graphs. As I know that's what most of you are here for."
*_Very angry AHOC noises in the distance_*
Sorry but what is AHOC?
Valentine you are sus
@@ExZ1te Actually Hardcore OverClocking
He recently made a video where he’s upset about the bars in bargraphs
@@pythonner3644 uh- I Saw green vent to medbay
@@ValentineC137 but I am green
Now to wait for the VR performance reviews. That's really gonna be the final benchmark telling me where to go.
Yeah, but who does VR performance reviews? I WISH I could find some!
"VR"...is that when you put a helmet on your head with a pair of small LCD displays very close to your eyes and keep it on for hours and hours? :)
@@GholaTleilaxu yup. Loads of fun.
Thank you thank you thank you Steve, you've answered the questions other reviewers failed to address for me. Only one question remains - GPU encoding
Imagine being able to buy one of these
cannot imagine
@@alvaro.goitino7321 lol
Thirty seconds into sales going live, and the Newegg site is down. Here we go again, as expected. That said, is it just architecture differences that tend to give the 4K nod to Nvidia? Seems like somebody playing at 4K has a dilemma: the 3080 is faster but has less VRAM than the 6800XT. What to do?
Demand is just too high & they had to accommodate console silicon too, I think they have done well all things considered TBF
This will be the dark generation I’m calling it. The vast majority of people will miss out on Zen 3, RNDA2 and Ampere. By the time they are buy able at regular means we will have zen 4, RNDA3 and Hopper.
I would never buy a 10gb vram card for 4k
@@ZackSNetwork I wouldn't call this gen the dark age, if anything this is a golden age for tech! We had massive generational leaps in multiple tech areas (motherboard with PCIE 4.0, CPU with Zen 3, and GPU for both Nvidea and AMD). More people are buying computers than ever before due to those reasons, and because people are stuck at home and realizing that 1 computer isn't enough for 5 poeple lol!
@@vgamedude12 Nor would I, but I don't game at 4K anyway. To me unless you can turn the graphics to max, there's no reason to game at 4K. I mean, 4K with settings dumbed down??? Opinions vary, and rightly so, but chasing high frame rates at 4k in games is a serious money pit. So it's 3440x1440 for me.
Thank you sir. That was very interesting specially the SAM analysis. I was curious if the performance they claim was true. Your test with 3.0 making no difference vs 4.0 is an extra that i really appreciated. Thank you for doing that test.
Man,, Navi2x truly is Zen moment for Radeon
Nicely Done, AMD
Looking forward to seeing the SAM equivalent improvements for the 30-series 😃
Yep, and luckily Nvidia wont lock this to a single processor brand (or only Ryzen 5000 series, like AMD does). Nvidia "SAM" will support ALL CPU brands.
Honestly, god bless you guys for using 1440p as the default benchmarking method shown in the charts, seriously, THANK You. It was so unnecessarily arduous / annoying just 6 months to a year ago finding 1440P only benchmarks for the games I care about. It was frustrating, so thank you for your hard work.
Holy smokes, amd really did do a Zen improvement with RDNA2. Imagine that bad boy with HBM.
Lol don't forget to imagine the price too
@Desktopia People have extremely short memories and horrifically limited attention spans.
Anyway, it is true that RDNA is an exceptional GPU architecture and brought real competition to Nvidia that they had not had in a long time. The generational leap in performance from RDNA to RDNA 2 is nothing short of breathtaking and worthy of acclaim. And that AMD is targeting a similar ~50% increase in performance-per-watt with RDNA 3 is very exciting, but neither should detract from AMD's win here with RDNA 2.
who needs hbm2 when rnda2 is using ddr6 while amphere is using ddr6+ higher bandwidth...yet amd still inserting it in any orifix nvidia has
Nobody needs expensive HBM if you have the power of Infinity Cache. The old problem which AMD had since 2014 is now solved. No HBM needed anymore for the highend cards.
@@eazen Never said I wished it had it, I said imagine. The infinity cache can only help so much before the gpu has to override the cache and fetch more data from system memory at higher resolutions. You can see how much frames it drops switching to 4k. Also this takes alot of space on the die (infinity cache estimated to take up about 6 billion transistors) which could've been used for more enhancements. Infinite cash though is the next step in graphics though, it is a great idea. HBM is just the overall better solution though, just more expensive.
I’m surprised they no longer include RDRII in their tests.
If I remember right, it's extremely hard to test in a consistently controlled manner (without using the built in benchmark). I believe they said the built in benchmark isn't very representative of true gameplay, so they try to test along a route in game. However, a dynamic world with a day-night cycle can make consistency very off.
Steve probably scrapped that test for time since there's so many product releases smashed together.
but I like that they include lot's of new(er) games, something most other reviewers fail to do and bench new hardware on 3-4 year old titles.
@@AngelicHunk Thats fair. I mean, the most taxing area in the game i believe is Saint Denis but even then that area isn’t so bad in itself. But I’d still say the benchmark for RDRII is decently accurate.
@@gorky_vk Thats true. Like people who still test GTA V is kinda ridiculous. Nearly anything can run it as long as you aren’t tryna play on a laptop or something really old.
the spaghetti coding of the game probably contributed to this. They have have to release a single update that hasnt introduced stability issues
July 2021 and this card is now 400$ cheaper than the 3080 in the pre built R12's from Aienware/Dell. Thats such a huge difference for essentially the same performance.
Andddd they're sold out
Biggest paperlauncv in history. And they were priced way above amds msrp making the rtx3080 almost cheap.
*OUT OF STOCK* in not 3 minutes, not 3 second, *BUT IMMEDIATELY*
People triggering buy buttons instead of spraying bullets. lmao
Bruh these bots
Yeah it's sucks but we'll all be able to buy the GPUs eventually. Just gotta wait.
@@rickyrich93 exactly the best time to buy gpu's is not at release day.
I can't wait to see these available for purchase in 2022! It's like getting a window into the future!
Nvidia 4000 series and AMD 7000 series...
brooo i wish every game would be as well optimized as death stranding
To be fair there's not really a lot going on in Death Stranding. The game looks gorgeous in 4k 60fps on a 65" HDR TV though.
Resident Evil as well. Gorgeous graphics too.
It will be great to see ultra wide benchmarks ..
No really, very pointless resolution for gaming.
@@amaurytoloza1511 Have you got an UW? What makes you say that?
@@SBOG4215 I got 4 monitors. Three of them to work. The one for my gaming rig is a LG 27GL83A. Many games purposely don't support 21:9 aspect ratios you find on UW monitors. Tell you what, consoles are not even supporting 1440p resolutions, even less UW, pointless resolution really for gaming. For productivity it has its uses tho'
UW is trash
@@wallacesousuke1433 nope, only your taste.
Guys nvidia offer a much more stable product with better infrastructure like geforce experience, game recording in 4k hdr 60 fps etc. Nvidia's driver support are also better vs AMD. Overall it's good to see AMD become a equals at the top end However, when you buy a nvidia card you just get a much more stable, reliable and better allround eco system. Also you showed have used the 10900k it's a better gaming cpu!
Different result in 1080p and 2k compared to guru3d and techpowerup. Probably because of Dirt 5 favoring AMD heavily, exactly 30% lead in 1440p.
yeah something about this lot of Benchmarks doesn't seem to make sense to me.
I was thinking the same thing...
It's because this game is optimized to run best on rdna2 because both consoles are using that platform and games are made first for consoles. So we can expect to see a lot more titles that will run better on AMD, especially next year when true next gen games come out.
@@bo-_t_-rs1152 Consoles have had AMD CPU's and GPU's for years, more than 4 generations, but this has never meant that games were all better optimised for AMD hardware on PC... most were better on Intel and Nvidia, and expect it to remain the case for a long time yet.
@@choatus True but the hardware in them was nowhere near the performance of the current gen. AMD was always behind nVidia and intel but that's changed with Ryzen/RDNA2 now.
11:30 look at the end of the list, the 2060 with 6fps avg, LMFAO
@@xtechtips152 vram
@@smp219 doesn't make all that much sense since the 980Ti runs into a similar limitation unless it somehow powers through that like the monster it was back then. 🤔
@@MLWJ1993 that was history
@@MLWJ1993 ROPs
By far the best benchmark review again as always! Love you guys :-*
11:39 the 2060 was a PowerPoint presentation
2060* the 2060 *super* had >60fps average ending up just below the 2070.
@@MLWJ1993 ok bro I just saw it and edited the comment
14:45 - “Personally I care very little about ray tracing support since I feel there are no games where it is worth enabling.”
Laughs in Minecraft RTX
The whole 3 ppl that play it
The amount of work you guys did....clap that up people 👏
And now I'm just patiently waiting for the 3080 Ti release.
yeah the 4k was mediocre at best, lets wait to see 6900xt too who knows amd might pull a last minute prank on nvidia :D otherwise 3080ti here we come baby
Nice review, it seems NVIDIA has tight competition this "generation" of cards. I disagree with your RT assessment though. RT is relevant, else AMD wouldn't have added it.
It’s relevant but hardly. It’s implementation is games is few and far between. And what games do have it, the performance loss in exchange has hardly been worth it. I’m with Steve that it hardly matters in regards to which GPU is worth buying. The only reason why NVIDIA can tout about it at all is because of DLSS kind of “cheating” performance.
Maybe in another 5 years it’ll be semi-standard in games and reasonable to run but as of now, not really.
Premature technologies like ray/path tracing get supported more as they mature, we're still in hybrid territory and it's really added bling at this stage, maybe in another two generations or the next console generation it'll be as significant as rasterisation capability. Just look at Control 4K which is 36FPS on a 3080 and needs DLSS to get to 60FPS and that's still a hybrid implementation. Fortnite with full RT effects (still hybrid) is even worse at 23FPS and 42FPS with DLSS.
REF: ruclips.net/video/nX3W7Sx4l78/видео.html
If it was as relevant as you say, AMD would have pushed their ray tracing implementation much more than just matching the RTX 2000 series. They've implemented just enough so that you can experience Hybrid real time ray tracing, but not so much that it becomes a primary selling point, because we aren't there yet.
cyberpunk 2077 Rt heaven. You know CGI use RT rendering farm to look more realistic then just raster.
Spot on. Yeah I'm happy AMD is catching up but anyone who says RT won't matter during this generation of cards is kidding themselves. Nvidia 100% this gen, and this isn't even including extra software features and better drivers.
RT will be thing now that consoles can use it. That's always the thing holding technologies like this back. We're going to see RT in just about every AAA game going forward now and AMD being so far behind is not great. We've yet to see if AMD's DLSS equivalent doesn't suck which given how terrible first gen DLSS was I'm not holding my breath.
Why do you think the FPS data that you are showing is so different to that of GamersNexus and Jayztwocents? Do you think it is the difference between enabling Ray Tracing and not? Or could it be something to do with the memory or other tuning of your test bench favouring AMD hardware somehow?
(I’m assuming that it surely can’t be something as obvious as a driver version)
Damn, that 6800XT was seriously impressive, and waaaay better than I was expecting!! And excellent video as always guys, keep up the good work!!
Nvidia's days of being the performance king is numbered, Now they have to compete on price AND performance.
Is there info on the test bench? Your SAM results are very different from other reviewers. Keep up the good work! Hi from Bacchus Marsh Vic!
I've noticed this Hardware Unboxed review is a bit different than other reviews from other sites. Steve also seems to just 'dismiss' Ray Tracing as if its a passing fad.
AMD caught up with Intel and surpassed them and now they caught up with Nvidia too! Awesome 👍.
See how much Confidence makes a difference? Its like getting a new table.
Or like building a new PC, lel
Please build a new PC, I'm kind of bored and need some good entertainment.
Just want to say THANK YOU for having the 3950X in your test bench, you keep apologizing for it, but it's been a very welcome data point in reference to other reviewers, and closer to what a lot of your Ryzen 3000 viewers are still packing.
That's a lot for another excellent review, Steve!
They must have made SAM available for all Ryzen users across the whole AM4 platform, no matter what motherboards or CPUs people use. I hope Nvidia will force AMD to do this soon.
Anyway, I am impressed! AMD finally became a really dangerous competitor to both Intel and Nvidia. Ironically it happened almost at the same time, with the release of Zen 3 and RDNA 2. It will make things a lot more interesting!
I agree. these new cards are clearly a very good substitue for the RTX 3000. However, u also have think in the future and what we have right now. its obvious that the trend for games (AAA) is ray tracing...so u cannot say anymore that the number of games with ray tracing is limited. its just going to increase. Its a fact that ray tracing technology in AMD cards is inferior to nvidias at the moment. Furthermore they lack AI services like the nvidia broadcast app, which are pretty cool or even needed if u want to stream. I think if u dont care about streaming or ray tracing, sure amd is a better option. but like i said, more and more games are coming with better visuals and most people will be stuck with amd for a while... I think that AMD will crash Nvidia with their next generation RX cards though.
It depends on if older motherboards can manage it. Even nvdia did saynthatnthey will only domit for new amd motherboards, so older may lack some feature that is needed!
My guess is SAM aren't that good when running on PCIe 3.0. I could be wrong though, just wait for Nvidia or AMD to enable it on all platforms.
@@UltimateAlgorithm Steve said in the video there was no difference with SAM performance in PCIe 3 vs. 4.
@@inrptn how can he know that? SAM doesn't even run on PCIe 3.0 right now. Only Ryzen 5000 series on an X570 supported, which guarantee PCIe 4.0. For more SAM benchmarks you can look at Hardware Unboxed. Some games have massive improvement like Assassin's Creed Valhalla.
I wonder how much of a difference the 6900 xt will be
With these numbers I would say it’s going to stomp everything out right now.
Finally one thorough and clear 6800xt review. I'm tired of "big" reviewers making points with 3DMark benchmarks and 5-6 games of which 2 are 5+ years old. Who will buy these cards to play GTA V anyway???
Ugh. Disappointed with the encoder, the RT performance, and not much software like RTX voice, etc. everything else is great tho
AMD video encoder is rubbish for sure
@Melburn Sir
Yeah, but for the games where it does matter, like Minecraft, it sucks.
@@Wusyaname_ lmao Minecraft
Or red dead redemption 2, or control, or cyberpunk, or metro, or battlefield or any of the other titles that support it. Thats not even mentioning dlss where amd gets shit on :)
If AMD had a DLSS 2.0 competitor I could see the 6800xt as a great competitor to the 3080, without it there is just too much gained with that technology to switch from team Nvidia
Yes but only if the games you play support it. And if you play 1440p 144hz with no raytracing it mainly is no longer needed.
@@jagoob that's the thing. DLSS is only going to get more and more optimized for a wider variety of games. And people with those cards will love the dedicated cores for minor ai workloads and the improving raytracing tech which is here to stay.
If you have 1080p screen then dlss isn't needed. 1440p or 4K yeah fine. They have super sampling or something like that and they said that they will release it in the near future. Judging by nvidia tho with their first gen of dlss( which was shit ) I dont know what to expect.
@@kostisap9350 .. Why judge by DLSS Technology by DLSS 1.0 (which was sh!t) when DLSS 2.0 already blows that first version outta the water. If DLSS 3.0 improves even half more the improvements of what DLSS 2.0 has provided, it will be something quite special imo ...
@@williammurphy1674 Yea, if they manage to improve even further it would be phenomenal. Even with DLSS 2.0 as it is now the game looks way better than with TAA or other AA methods and performs way better. I am enjoying Control so much on my 2k 240hz monitor right now with my 2070 super, getting steady 80+ framerates with gsync enabled. Such a smooth and beautiful experience!
In Canada there was no stock at the only major retailer in my city I could buy it from locally. Picked up my pre-ordered Nvidia card and the retailer told me they had more orders for 3080s today then any other day. Was considering the 6800xt but for me was a what came in stock for me to buy so Nvidia got my money.
why nvidia suddenly have stock?
@@syahmimi1425 They didn't get any extra stock I've been waiting weeks for card I back ordered. Sales associate told me they had more orders placed for back order on 3080s then any other day today. They did not get more stock just more orders for back order.
Well-done AMD 🔥🔥🔥🔥🔥🔥🔥
You mention other RT games like watchdog legions and control, but dont include benchmarks of them lol
I'll give you the scoop. Nvidia would have slaughtered AMD because the RT stuff isn't just a little bit behind it's WAY behind and AMD has no DLSS equivalent yet. The review was dismissing RT so why bother showing it?
Great benchmarks and work! :)
The stock levels here in Northern Europe was sufficient, had 7 min to pick one up :D But I missed some AIBs brands.. And for pricing, the 6800XT was easily the best buy.
aaaaaand.... they're gone. Im from Germany
Well we all would be really surprised if They would not have gone!
;)
Kumpel hat eine bekommen :D
YOU ARE THE BEAST OF BENCHMARKING!!! awesome job as allways man.
This review vs LTT's review make me feel like I'm living in 2 different worlds. Their conclusions are so far apart.
Watch all reviews and draw the conclusion yourself. It doesn't matter what they say, it is only their own opinion. For me playing @1440*3440, result of 4k is a decisive comparison. I do care about RT and DLSS. Also, look at games they pick for the test. I am not going to play none of those AMD sponsored titles, so I do care about the titles that I am interested. For my personal perspective, 3080 is a go to option. But, none would regret buying 6800 series cause they are pretty darn good.
@@far_tech5555 That's what I do every launch and not just RUclips but there's other places to help inform a buying descion.
I dont even need to watch LTT's review to know they have once again created a supper click-baity title and a conclusion. never watch LTT for actual product review, its more of a drama channel for the masses.
HUB focused almost exclusively on rasterized price/perf, while LTT put a heavy emphasis on ray tracing performance and features. I wouldn't say either is right or wrong, just completely different priorities.
That is because Steve is an AMD fanboy 100%. He won't admit the defeat that these AMD cards have against Nvidia. He smooths over ruffled feathers with statements like: "And here the 6800XT does fall behind the 3080 by a 15% margin, which is quite a substantial difference, [THOUGH A 105 fps ON AVERAGE AT 4K IS STILL QUITE IMPRESSIVE]".