Watch our AMD RX 6800 XT GPU review here: ruclips.net/video/jLVGL7aAYgY/видео.html Watch our AMD RX 6800 XT tear-down here: ruclips.net/video/0s7bOaa6X9E/видео.html GPU TESTING METHODOLOGY EXPLAINER: ruclips.net/video/-P7-ML-bPCE/видео.html The best way to support our work is through our store: store.gamersnexus.net/
Great video. By the way, you forgot to mention that the AMD cards come with 16GB which will come in handy once those true next gen games start to come out.
I appreciate the work he does to give us grade-A quality data in regards to the new Pc hardware releases. GN is the first channel I would go to for that reason and I can be assured that whatever is presented is reliable. Ofcourse, there is room for improvements as with everything and I am sure Steve is his own biggest critic. Nevertheless, thanks for the dedication and love you put into PC building.
I feel like NVIDIA is going to have marketing issues if they don't introduce some sort of HULK mode to fight for mindspace in this team red RAGE future. Its what I think the modern PC space needs - more pointless marketing terms that mean nothing to confuse issues as much as possible.
@@brucetus of course...did you or did you not see the FE Nvidia grill design? It even has a fan blowing warm-ish air upward, to allow for sous-vide cooking.
30 seconds of a gopro swirling around and bouncing off trees and rocks, followed by a disturbing stillness. You see pieces of a white AMD bike strewn about the foliage. A red led on the side of the frame starts to blink as it uploads data to a distant server. A computer in Santa Clara turns on and a message appears on the screen: "Mission Accomplished: Target Eliminated."
"Being a Techtuber is easy they said, you just talk about products they said." I don't want to imagine how much work, how many hours/day you guys spend every day. BIG KUDOS for getting us the infos!
I believe he recently said he works around 100 hours a week, so as a random RUclips commenter, I obviously work 200 hours a week! Please ignore the fact that a week only has 168 hours in it.
Shoutout to the techtubers like gamers nexus and hardware unboxed who put in so much effort every launch. Always have the most trustworthy and reliable data as well. Make sure you're still getting enough you-time man.
Bought a new one JUST now for $320 It's the best "middle" I think you can get, man. The "sweet spot" was the 6800 XT, but they're more difficult to find at a good price, I'm seeing alot of them for over 550 to 630, I can't justify that.
@@dylanherron3963 bro, new for 320 is a steal. Also, from my experience, the 6800 is absolutely a high end card! Good speed, plenty of vram. You can play just about anything even in 4k relatively solid. That purchase should last you a good while :)
@@Ryzard My dude, I am happy to report, (after swapping out a 7 year old EVGA 550w for an EVGA 750w) that I just introduced a CPU bottleneck to my computer lmaoooo. Jokes aside, that's a good thing that I knew would happen, I figured an RX 6800 would barely lift a finger on Cyberpunk in 1080p, even completely maxed (RTX off) when paired with an i7 9700 (some extra cores would help). Like I imagined, when I'm zooming thru Night City on a bike at 140 mph, Rivatuner's got my CPU pegged at 99% (very CPU intensive game) loading in/out all the buildings, decals, and people. But during NORMAL play, I'm sitting between 119 and locked at 144. I realized I screwed myself by opening up a whole ass upgrade path now, lmaooo.
@@dylanherron3963 yep, you've opened the rabbit hole! That being said, your setup should be totally fine for 1080p at good frames for a good while. Glad you're enjoying!
And the XT is looking to be a couple of hundred cheaper trading blows with the 3080. Looks like I'm going to go team red this time. Be rally interesting to see who leads in different countries markets
A year later, I had the RTX 3070 in my cart for a few days and was doing research on which AIB to choose from, with many of them having a difference of up to $100 between them, then I watched comparison videos between the Rx 6800 Vs Rtx 3070 (and some forums too), after watching so many youtubers and benchmarks, i ordered the RX 6800 tonight. It was about $150-$200 cheaper and beats the 3070 considerable, for someone who doesn't care about RT. The games I play don't support it.
That is a hefty price difference, especially considering the performance. Which RX 6800 did you end up going with? Thinking about doing the same, but I will probably wait some more. In my country the 6800 is sadly still unreasonably expensive.
@@ashamancito4630 I bought the XFX speedster. I have no regrets...I also got a 3 game reward bundle from AMD (Sniper Elite 5, Forspoken and Saints Row). I think you should wait a little if you can. I say that because a few days after I bought my card, the price dropped even lower in the store where I bought it. So I think the price might still drop again and it might also drop further for the 3070 as well.
@@josephomoyajowo8509 Bro I have a doubt how is AMD drivers nowadays bro any issues ? Any stutter issues? Any black screen issues? Please share your experience
@@sajithsaji3606 drivers are still buggy. Personally, I haven't had any of those problems. The only issue I've had so far was being unable to 'record' my gameplay and the new update fixed that. Other than that, I haven't had any issues
I love the idea of it being is literally reaction, watching RUclips, seeing the "Rage Mode" on AMD's presentation and just saying "lmao", not even laughing, just saying that to himself
32:00 Thank you for making clear in your recommendations whether to buy a card for Ray Tracing performance or 1440p (no raytracing, upgrade). There are gamers like me who have one or two games they only play and those games are unlikely to be graphic wise updated by the developer to support full-feature raytracing (which also looks actually nice and is not a tank on performance).
@@shaung7441 the 10 commandments shall be etched in an IHS!!! His apostles will include Linus, Jayz2cents, Paul, Kyle, and Der8auer. He speeketh the benchmarks and the world shall know these truths come in good faith! For he is Tech Jesus ™️
@@Squilfinator Waiting for cheaper cards to launch like a 3060 Ti or 6700XT so I can do a full rebuild. The 3070 and 6080 currently cost more than I paid for my current core components (CPU, GPU, motherboard)! Then I'll probably pick up a 5600 non-X if they launch one.
Started watching this and was thinking, Steve sounds like he might be a little tired or drunk and he's talking slower and somehow I am following it better, then realized my playback speed was set to .75. Ugh.
In Eastern Europe the PowerColor RX 6800 is 100 euro cheaper than the Gigabyte RTX 3070, and the PowerColor 6800XT is 200 euro cheaper than the Gigabyte RTX 3080. All of them are like 650-900+ euro and ofc none of these have any stock anywhere.
@@PadyEos Indeed, it seems like in real life the 6800 is cheaper than the 3080. I hope Nvidia does an Intel, they were super dishonest about the pricing.
I'm really glad you guys still do 1080p testing. So many outlets have dropped 1080p, but it's definitely still important thanks to high refresh rate monitors. I would take refresh rate over resolution any day. (As long as it's not below 1080)
31:26 "That extra 80$ is what complicates it" Yep, that's my exact feelings when it comes to these cards. 499$ for the 6800 and 599$ for the 6800xt would have made the 6800 almost a no brainer, and the 6800xt way more compelling. Like it or not, the not-so-exciting RT performances need to be addressed in some way, and price would be the only way to do that for this gen. AMD still can't afford to not to be extremely aggressive when it comes to pricing in the gpu business.
It does not matter because stock issues. Also u will not find $580 and $500 cards. It will be mostly $550 vs 550. 6800 XT will be $700 and 3080 will be $800
@King Marco Louis III actually what happened in godfall and dirt 5 that's not the case point is you can't say that yet when the other company paid for for support and one just launch
How many games have raytracing ? Maybe 5 ? So why everyone so worried about raytracing eye candy? Even those games are not at acceptable level from both side. $80.00 extra from amd card gets you 16gb vram and 15 % higher performance over 3070. Simple as that ,rx 6800 is better card.
Wendell's review on Level1Tech implied that the RX6800 series outperforms the RTX30 series by quite a lot when using slower RAM. Is there any chance you could do a similar test to verify his results and maybe explain how come?
@@victorsegoviapalacios4710 Not claiming they are. But it's something that was not covered in the other reviews I've seen and it would be an interesting effect.
Basically, the card to buy if you're gaming at 1440p and don't care about RT features in general. Close enough to the 3080 to make that look ridiculously priced.
With NextGen gaming here, look to most new games feature RT. AMD was so close to having a killer card but their lackluster RT performance will see this current crop of cards on Ebay in a years time. Wait until Next your if you want to go team Red.
I dunno. With the new Playstation and Xbox *BOTH* using AMD as the CPU and GPU and with how much cross-platform gaming you have I don't think falling behind RT is going to be that big of a deal for this generation.
@@Keith2XS Don't be stupid mate. Very few people give a sh*t about RT. At least not in it's current form. Future games that will have it, will be designed specially for the RT features in RDNA2 because it's the dominant gaming platform for the next several years.
@@timpatrick564 This. The console GPUs have a RT baseline that any higher tier RDN2 GPU will exceed just fine. Where AMD really needs to catch up, and fast, is a strong DLSS competitor.
I've always had the impression that linus and jayz were more toward the entertainment side of the spectrum while GN is closer to the technical news side. Obviously both sides have both qualities (amd bike?) but that is what i get for an overall impression.
Well thats what GN does and is known for. LTT on the other hand tries to keep the videos short with only the information most people are looking for with a definite conclusion. But its always best to look at a lot of different channels when it comes to hardware reviews, as some consider things others don't at look at them from a different angle. LTT and GN especially are known for their transparency and honestly when it comes to reviews.
I wonder if in the coming years AMD's RT performance will start getting better performance in new games with the same cards, since all of the current implementations would have been optimized Nvidia cards; however now that consoles have RT devs will be incentivized to optimize games for AMD.
That´s the point. The mainstream developers are focused on consoles and set the standard for the base product from which we PC gamers get a port (in the worst case). That both main consoles run completly run on AMD means we will see more support in the PC space too.
Now that both companies released quite a few GPUs, will we get "Best Workstation GPUs 2020" video, or will that have to wait until close to end of the year so 6900XT is also included?
@@lyelllampero3748 yeah Nvidia will be better. Based on the pcworld interview on thier podcast with Scott. It sounds like AMD won't have content creation /workstation stuff till RDNA3. Which makes sense. The first thing they needed to do was to catch back up to Nvidia in rasterisarion, now that that's done, they can work on improving the arch and branching out to meet other needs. Hopefully we see AV1 encode in RDNA2 as well.
@@scarletspidernz Yea, I am looking to get a GPU for Blender work, and lack of VRAM on Nvidia's GPUs is killing me. I was hoping for 3070 to have at least like 12GB, but with 8 it's a hard pass, so 3080 is only reasonable choice since I can't really afford to double that for RTX 3090 24GB. RX6800 GPUs ended up being "good enough" in Blender workloads, but AMD and Blender have been having issues with crashes and what not for quite a while now, and CUDA is integrated and compatible with way more programs. So I am stuck, either RTX 3080, RX 6800/XT or again pray that mirical will happen and we suddenly get a 16GB 3070 for $600 ( more like $800-900 in my country.)
@@vator_rs The RTX 3060 is rumored to have 12 GB as a counter for the RX 6700 that's being rumored with 12 GB as well. Maybe that one would fit your case? or is it not powerful enough (around RTX 2080 performance expected)
Turing user here, why not show RT tests from AMD RT implementations as well, like Godfall and Dirt 5? Control is an independent title, but is still a very much Nvidia based RT implementation. Results swing back to Big Navi with those newer RT titles. It is relevant, even if the visual improvements, subjectively, are not.
@@victorsegoviapalacios4710 people who care about godfall and dirt 5? Also it’s just fair because it can show possible performance for something that is amd optimized and favored
Lovely review Steve... Just a bit odd that you keep referring to the "unicorn" $500 price point of the 3070. You even made a video that this price point isn't realistic as there is simply no FE stock left in the worlds and AIBs have their backs against the wall to produce a 3070 at that price point. I think the $500/$700 MSRP lie is the biggest "reviewers gotcha" that was pulled in this generation.... And it was successful because almost every single reviewer keep quoting that unrealistic FE price.
We have had two drops of FEs just this week here in the UK and another one two weeks ago. Picked one up myself for £469 which is our MSRP and came yesterday
@@Fabianthehunter Good example. Yes, there is definitely scalping going on in Europe (and not just by individuals). In the Benelux, there is only one official seller of AMD cards right now: Alternate (a German company that branched out here) and they are asking €999 for a 6800 and €1099 for a 6800X. Retailers are scalping right now. FYI: $500 = €421... multiply by 1.21 to account for VAT and you are stuck at €509. So you too paid €70 over MSRP for your custom card.
I think the RT test suite may need revision to include newer games that have been built alongside RDNA2 since games like Watchdogs Legion and Godfall show decent RT performance on AMD
I would wait and see how it performs in future raytracing games. The games that have Raytracing are primarly optimized for Nvidias RTX cards because those were the only ones with RT support. I wouldnt exspect more than RTX 2080 Raytracing performance but it will probably get better. You cant buy them anyway xD
Given that AMD is the vendor for the new raytracing capable consoles, the situation could change in the future. Though I wouldn't hold my breath either.
@@KyussTheWalkingWorm Like Tobias said, all those ray tracing games are likely optimized for Nvidia hardware as that was the only option until these new AMD gpu's got released, so the performance gap is likely going to become smaller, throw in that games target consoles and that's a lot more like these new AMD gpu's than Nvidia hardware as well as the drivers not being mature yet, I suspect performance is only going to get better in time. Another thing to remember is that ray tracing was poor on Nvidia hardware early on but with driver updates and software optimization, it got a lot better, if that happens on AMD hardware, the gap could close a lot but with that said, I doubt these cards will beat Nvidia's new cards in ray tracing performance but then it doesn't really have too.
I have an RX 6800 and a Gigabyte 3070 Vision coming this week. Both are the same price. A lot of this video brings up the extra $80 for the RX 6800 compared to the 3070 FE. I love the idea of RT and DLSS really seems to help. If I pick the RX 6800 for RT I feel like I'm banking on the promise of future super sampling or RT optimization. On the other side the 3070's 8GB feels limiting as time goes on.
RDNA2's RT never quite caught up to Ampere. That said, the 3070 doesn't have enough memory to enable RT in some new games, so it's become a moot point. Low VRAM always loses, too bad the mass market has gold fish memory and keeps falling for it every generation.
I would buy the AMD card if you don't need your Graphics card for 1. applications like Blender, Premiere Pro and so on 2. streaming 3. playing Games with Raytracing Effects The 8gb of the RTX 3070 are not enough in my opinion. 16gb will age much better and in games without Raytracing its faster. But if the points above are important to you then you dont really have a choice. I will probably wait and hope that Nvidia will release a 3070 Ti with 10GB RAM. 10gb is okay, I'd rather have more but I could live with that. You cant buy these cards anyway.
It depends. If you wanna save up 80 now and sell and buy something in let's say two years, no just get 3070. If you just want something for next 5 years? Yes 16gb ram will age far better given that developers can push AAA titles with more graphical elements that would be noticeable even in 1440p. 8gb is really reaching its limit.
Needs to be tested with applications that currently benefit from more VRAM, but yes, that's an unanswered question at the moment except that it's in a better position as a future-proofing card than the current 3070fe...
Hey Steve, what are your thoughts on future architecture optimisations at a game code/driver level? In specific, both the new gen consoles use AMD hardware, so one would assume a resizable BAR function would be less of a toggleable feature, and more of a baked in optimisation for those platforms. Furthermore, developers will have to optimise for AMD's RT architecture for the consoles, do you think this will transfer over to PC in uplift as it all matures? Maybe worth a short talking head piece or article, after you guys get some well deserved rest of course.
As far as I know, both Microsoft and Sony have implemented their own solutions somewhat to raytracing and neither are directly using AMD's solution or just standard DXR. They have their own software or hardware to help ray tracing on their respective platforms. But even if they're 90% the same, I don't think with such a big difference, it'll change things all that much until they bring out RDNA 3 to be honest and who knows what Nvidia will have by then. The thing is even with DXR optimised games, Nvidia's cards as far as I understand, would still run them better even if they're not using any of their proprietary RTX features directly due to the dedicated RT cores. Couple that with the fact that some renders apparently don't work properly in openCL anymore in Blender and the fact that Nvidia has most programs optimised for its cuda cores AND has more overall features on the cards and for me, it just has to be Nvidia I think. 10GB seems a little low for the future as I'd planned to get a card to last a good few years at least while pushing the best graphics or thereabouts so was looking forward to the 16GB with AMD. Alas with all of the above, I just can't risk it. The 3080ti would be perfect with its 20GB, could buy one of those and not upgrade for 5 plus years but I really don't think I can afford a 1000 dollar graphics card. Would be good to get an idea from someone a LOT more knowledgeable than me though for sure.
@@Jhakaro That's the flipside, yes, but the GPU in the consoles is RDNA2, I would assume the architecture is more or less the same. Which would mean they also have one RT accelerator per CU. Even if they're using more software trickery, the hardware is still there.
people seem to forget this is AMD first try on raytracing but still trade blow with 3000 series in non rt 4k games without dlss. They already forget usd1300 2080ti from Nvidia first try.
But to be fair there was no other gpu that came close in power to the 2080ti back then. Much of that price difference didn’t come from RT, it did come from the power. Like the 3090. You also had much cheaper cards with much more reasonable fps to price ratios, like 2060.
@@AverageDoomer69 there's no way you believe that to be true unless you're some AMDumb person. The consoles are RDNA2 and weaker than RX6800, they will barely have any raytracing. Also, there is no such thing as optimizing raytracing for Nvidia. You don't actually touch raytracing optimization in that way, it's DXR. It's a high level abstraction, game doesn't care what card raytraces it. It's part of DX12 specification and Nvidia never had its own API for raytracing in DX12.
I think its worth noting that because AMDs card is in the new Playstation and Xbox, game developers are most likely to start optimizing for AMD architecture. This may not be true of all games, but it has to be a serious determing factor when the majority of your customer base is going to be on 3/4 of the options available. With a portion on PS, a portion on Xbox, and a portion on AMD gpu PCs, its worth considering AMD may be in the driver seat to work with a ton of popular titles to optimize each title to their architecture. Please let me know if im looking at this the wrong way.
Honestly for me the RT performance leaves me with more questions than answers. So one of the big selling points for both of the 2 new Consoles is Raytracing. Both of them use the same RDNA2 Architecture yet so far the 6800 as well as 6800XT struggle a lot to consistently output 60 FPS even on 1440p with raytraced games. On paper however both cards have much more oomph and memory than the consoles. Are the few current DX12 RT games just really not well optimized yet (because RT is kinda an afterthought maybe) or is there something with the consoles that I am missing entirely that somehow can achieve better results with way less powerful hardware? Obviously I know that games can be optimized a lot better since the hardware is always the same but still something doesn't quite add up here.
It's simple: consoles are always running lower graphics settings than what you see on PC. It's been like that in the past gens, it will be in this one too. And sure, it comes down to optimization, but optimization for consoles often boils down to that very thing: how much (and in what way) can we scrape off graphics fidelity - and as such load for the gpu - before players begin noticing their game looks worse? The new consoles *already* are struggling keeping stable 60fps with their launch titles (look at Spiderman: if you play it in fidelity mode - read: rt on - it basically runs at 30 fps). Seriously, consoles have a different market, they can sell you hardware at cost, sometimes even at a loss, since they are proprietary platforms and they plan to recover that with the actual games sales. But there's a limit to that: you can't expect miracles when you pay for the whole platforms *less* than what just the kind of equivalent gpus would be supposed to be worth standalone. And if they are promising you such miracles, well: they're lying.
Consoles run on lower graphics and lower FPS not until recently have consoles even attempted to run at 60 FPS the previous gen ran games at 30 FPS that's why you see these discrepancies.
tldr; console RT will not be as good as Nvidia RTX because of inferior RT architecture like RDNA2 RDNA2 implemented RT because they had to compete with Nvidia. That's really the only reason for it being present in the new architecture, if Nvidia didn't push to start using this technology in real-time games rendering, neither company would probably be using it right now, and the same can be said for the new consoles. Its pretty easy to assume that if RX 6000 is unable to keep up with RTX 30 cards in RT, the consoles will struggle in a similar way too. Also, let's be honest, the vast majority of console gamer's don't have a clue what RT is or why they want it, it's pretty much just a marketing tool for Sony and Microsoft so they can say "well technically we have RT" and, with RT specifically, this isn't something that can simply "be optimized" by developers because RT performance is actually dictated by the actual architecture of the GPU die itself. That is why Nvidia will probably always have a lead here, because they are one generation ahead. Not to ramble on too much, but it's also worth mentioning that since AMD has been designing the hardware for consoles for generations now, and people have been saying games will always be optimized better for their architecture, but it never really measures out that way. I'm really happy to see RDNA2 trading blows with the top RTX cards though, this makes a really good market for us consumers. And just a PSA for anyone who wants to call me an Nvidia fanboy: I've had 7 different graphics cards in my life and 5 of them are Radeon , and I'm STILL currently using an FX-8350 processor. (Don't worry I just bought a Ryzen 7 5800X)
The consoles' performance are an overhyped lie spearheaded by usage of a useless marketing term: "Teraflop." The Series X and PS5 are worse at RT than a 2060 super, as was shown by digital foundry in watchdogs legion. Observer system redux performance on both consoles (rasterization or RT) take this point and nail it home.
So Nvidia's realtime raytracing performance lead is purely a hardware optimization result or can we expect the 6800 to get closer performance to the 3070 as AMD's software improves and works with game developers? I'm curious looking at this from a longevity perspective, since I do not frequently upgrade. Thanks for any answers!
Nvidia won’t just stagnate, so it’s more a question of AMD’s software catching up to Nvidias hardware AND software as they update the GeForce drivers, and they’re already starting off way behind the hardware. Do not bet on AMD for any kind of RT for at least another generation
In short; Nvidia leads over AMD due to more specialized hardware but if AMD keeps growing like now it could be that they shift the tables. So get Nvidia now or wait for next gen AMD graphics.
Agreed. I think avg fps is utterly useless number and it should be removed from tests - it tells nothing about how smooth the game plays, it's just a marketing number.
@@Cuthalu yeah like it’s nice to know I guess, but when they’re beyond 100 fps or within 10% it doesn’t seem very useful. I’d rather have a card that was 10% less avg but 10% higher 1% min
6800 is now much cheaper than 3070 here in the UK, almost £100 (about $115 US) Makes the decision much easier. Essp if you aren't fussed about ray tracing.
a 6800 is 400 euros cheaper at 625€ in my country than 3070 🤔 Think its worth upgrading from 1080ti to 6800 sapphire? I have no interest in ray tracing and rx7000 series cost 1000€
@@jaskajokunen3716 6800 and 6800 XT are very fine GPU's and will likely last for 1440 without ray tracing for perhaps 3 years at high quality settings or better (Some ultra I would imagine). For the price you mention - that seems a good deal
@@Thesuperhunter99 I have it it's great but tbh minecraft shaders still look better on java with a 1024x texture pack especially sues ptgi, minecraft rtx does keep the game similar tho and some people like thay
Why is it that I never see card/cpu reviews using a flight simulator as a test bench (No, MSFS2020 is NOT a simulator) but, using Xplane 11 which doesn't use multicores but depends on Frequency (for the CPU) - Would like to know how well this software fares with different CPUs and GPUs - Don't know whether to upgrade from a Ryzen 3800X + EVGA RTX 2060 Super in order to get better performance. I don't have a lot of money in order to try out different scenarios. Phil Nelson - Computers since 1975 (1st system: SWTPC 6800 with 4K Static ram) Thanks to you et al for getting me up to date on all this technology. Rasberry Pi - Who knew.
A lot of the performance issues are more to do with software and im going to bet that over the next few months as games companies get used to coding for the Xbox and PS5 we will see some huge changes
This card is in such a weird position. Being significantly cheaper than the 3080 and 6800xt. But it's significantly more expensive than the 3070. Maybe it'll compete with a future 3070 super or something.
Yeah it's not a director competitor to the 3070 exactly, given it's 80 dollars more expensive. But you get a lot for those 80 dollars, more vram for futureproofing and just downright a noticeable difference in performance. I think even if nvidia release a 3070 super, it's still not going to perform as good as a 6800. The only main drawback is lack of ray-tracing performance, but given it's available on so few games - does it really matter right now?
@@peymanstd very true. Just go with 6800XT. Nvidia's mid tier segmentation makes more sense. For what not to do, look at the 16+20 series from last gen.
Really interested in this card. But I'm running on a SeaSonic 550W 80+ Gold PSU. Do I really need to get a 650W one, or can I use my 550W with casual gaming just fine?
I got the ASUS Tuf gaming OC 6800, still need to install it though. No DLSS but so what and RT can care less about. I would not use DLSS to compare to AMD because it don't exist. I look at comparison with no DLSS that is more fair.
super cool broskiis think after your clock off with Jay on the kingpin card you fellas should take a bit of time to chill out its been a crazy tech season and I for one really appreciate all the content you've provided
@@gaochang784 , lol, I'm sure you are one of those who said 4 cores are plenty for another 5 years. Strange mentality from some of you . Ok you go buy handicapped 3070.
@@professionalinsultant3206 No its not. Most of the older games right now such as AC Odyssey etc already use 6 GB of Vram. The reason they are not using more, is because Nvidia was the only one in the market and they barely released cards with more than 8Gigs. So developer are capped at around 6GB which is the sweet spot. Go checkout 4k Benchmarks on Flight Simulator, Crysis, watch dogs legion, Godfall. These games already eat up around 8 to 9 GB of dedicated VRAM! Not allocated, dedicated! Doom eternal even loses around 20 FPS because it gets bottlenecked by the Vram usage. Same with Crysis Remastered. Godfall uses around 8 Gigs and 12 Gigs in benchmark mode. And Godfall doesnt even have Raytracing implemented which will be around 1 GB extra Vram too. www.overclock3d.net/gfx/articles/2020/11/16102420724l.jpg
@@GamersNexus Steve, we saw the same thing in HUB videos in other games, maybe there's something to it. Fun fact: My wife has seen you on the big screen so many times that she created you and your shop in Animal Crossing, and named you Tekku Jiisus, she's Japanese. Burst into laughter at work when she was sending me photos of that creation. Happy to send the photo over if you are interested!
I think I would have preferred living in a timeline where RT was handled by an additional board external to the GPU. I have so many PCIe slots doing nothing.
Wouldn’t get off the ground. Same way PhysX add-in boards didn’t. Besides, the extra board would need to have access to all the GPU data so it probably wouldn’t work anyway so it makes sense to have it handled inside the GPU.
@@MacGuyver85 Yeah, I'm sure there's a whole bunch of reasons it wouldn't have worked besides even those, but in the fantasy timeline I was imagining it using some kind of bridge connection like nvlink (ideally it wouldn't be proprietary but even my dreams have their limits). Anyway, this is just to say that I think it's a bit unfortunate that generally this card performs really well until ray tracing comes into play. I'm excited for the next generation of these cards, at least, when the ray acceleration hardware has matured (or whatever it is that needs to happen in order for it to be performant).
@@OMGItzFokral Sure, but latency would still be an issue. But I guess your alternative timeline could have some instantaneous quantum link to solve that ;) Yeah, the future is looking good with raytracing. Since DLSS or AMD’s Super Resolution will remain a necessity to have playable framerates for years, I think we should wait to judge the 6000 series until we see Super Resolution. Though I think Nvidia will hold the crown this round.
@@merwinpique I agree that choice is good, but if they had priced the RX 6800 at $550 or $560, that would make the performance difference between it and the XT just right. I feel like they're too close to each other right now.
Cycles run really badly with AMD cards and often comes with ugly artifacts + the lack of optix denoiser (which is hands down the best denoising algorithm ) atm means AMD cycles is a big no no. (edit:fixed phone auto correct and formatting)
However AMD does have its own render engine called Radeon pro render which should run really fast on the 6000 series cards and is available as a plugin for blender. Do note though it does handle materials differently than cycles so you might have to learn how shading works there.
If you wanna get back into cooler reviews - Alpenföhn launched their 240/360mm AIO's after being exclusively an CPU air-cooling brand before. I'm not sure on their availability though - it's a german brand.
When comparing the price points why is no review mentioning the fact the 6800 has DOUBLE the vram then the 3070 for just an extra $80. That adds so much value to the card.
I'd guess currently, in most cases, it does not have any benefits, but in productivity something else hurts Big Navi so it still lags behind even with the extra VRAM? I guess? It does show that, if trends keep going, AMD's cards should age better with time than Nvidia cards
Because you typically never get close to maxing out on your VRAM, and the ram SPEED on the Nvidia cards is better anyways so it's kind of a moot point.
Because he is a freakin schill!!!!!!!!! /jk Maybe because most titles don't need it yet, but I see that being a problem in the next year or so as some games already can use the memory.
Best review as always. Other reviews keep saying buy the 6800 over the 3070 and ignoring the higher price. I'm a dad building my sons first gaming PC he loves minecraft / fortnite / TR and hopes to play MS Flight Sim / Anno. 1 I'm on a budget and 2 the 3070 does better in those 3 games with ray tracing. For people in the same shoes as me the 3070 is the best budget 1440p gamers build and I'm pairing with the 3700x. Thanks again great review. I just hope I can get a hold of a 3070 in time for Christmas 🎄
I'd like to see some RT benchmarks on low settings. From what I've seen in Dirt 5, it seems AMD is better at handling RT in small amounts than Nvidia with less of a bottleneck, but chocks more in very RT heavy titles. With that I mean the percentage dip AMD gets from light RT is less than the percentage of Nvidia.
does anyone know for the non xt 6800, are amd using the hitachi pad on the gpu core? or normal paste? because i just bought a used unit with no warranty so i was thinking of cleaning and repaste it. And what about the sizing of pads everywhere else ? could not find any info regarding non xt 6800. everyone seems to be using the XT version :/
I found this comment on a video editing review for the rx6800... 1. the cooling paste is of such poor quality that you be wise to replace it with your own better quality, the paste that comes with the card, is already dried up and does not cool as good, so after 1 maybe 2 years depending on use or how long it was on display or in the stock, your card starts to artifact the screen, and at times create sort of screen blinking, quick off on. When used longer your card will crash and start with artefacting the crap out of it until it goes PROOF. 2. fan settings sucks, always check your gpu fan setting also nvidia users, better yet make your own, and to be sure use a external program and not radeon drivers, they always revert back to default nuts settings, when it crashes, try setting it at manual, and see what your default fan settings are, mine rx580 was set at max 90 degrees and fan speed at 80%, no wonder it was crashing. Make sure your fans run at max speed when they reach around 63 degrees, zero fan speed at certain low temp(DONT), disable that at once, make sure your fans spin at all times. 3. power control and options or settings as default, change it asap, undervolt your gpu and frequency on the gpu core, and store and run it as default in an external program(msi afterburner). Go -50 on the voltage, and minus 200 on your gpu core, check your temps while you adjust the settings during a heavy gpu test program. Go up 1 point on the voltage and 4 on the frequency until you reach a sweet spot at around 60 to 63 degrees, and then go 2 voltages below that, make sure your card does not reach higher then 63 degrees, you can go higher on that, but make sure you test it good, any artefacting, screen blinking, system reset/restart or program freezing go lower on the temps by loewring voltage and frequency. 4. last but not least the radeon software, man what can i say it does not even remember or apply your own settings when it reverts back to gpu default settings, during a crash or a freeze, and that is a huge problem on top of the 3 previous mentioned hardware misconfigure problems on hardware level, that makes your card go poof withing 2 years and 2 months on heavy use. I had issues after 8 months with my new rx6750xt that started blinking and later 1,5 years when it started to shutdown my pc for no reason or even freezed the game i was playing without any description on why yeh DXGI crash and driver reset to default. i changed my cooling paste, adjusted my fans to run at max speed at 63 degrees, undervolted to 1158(default=1200), and went to 2664 (default = 2694)on the gpu frequency, did not have a single issue after those changes. max temps at 63 degrees, but hardly reaches 58. The reason i keep using msi afterburner for my fan and volage control is, that amd radeon is a piece of crap software that keeps reverting to the crazy settings and burn your gpu out of exsistance within 2 years and 2 months, i lost a rx580 due to this. Amd default fan and voltage setting do not take in account the poor quality of the cooling paste and the dry up time of that same paste, they are to close and temps are to high and fan speeds to low at insane temps, and fine tuned the crap out of it to a literal dead spot, and a gpu card breaking spot with all 4 mentioned problems. This 7900xt will work fine if you change the cooling paste, lower your voltages, and temps, adjust your fan speeds,and start using msi afterburner as default voltage and fan control, do not rely on radeon software.PERIOD
@Gamers Nexus - 14:36 why are the 3080FE minimum frames so much lower than the other cards? Ahhh nm just seen it's only the OC'd score that does this. Must be throttling under load.
You guys should color code the names on some of the charts like at 24:46 , just making AMD cards Red and Nvidia green text then "OC" in orange or something will help a lot.
I'm holding onto my 2070 for awhile longer to see if Nvidia pulls a "3070 Super" type release with a lot more VRAM, but also not giving up DLSS so AMD also isn't an option for me at this point in time.
@@brazilpaes @PandaButtonFTW This is all that matters, you both had features you preferred, and got the cards that suited your desires. That’s the magic of choice that gets lost in the team bloodsports. 👍
Hello I have a question please I got a 620 +80 bronze psu ...with 5800x3d cpu ....can this psu handle the 6800 stock or undervolted ...with 2x 6+2 splitted from 1 cable ?
You could get a 6800/XT for about $900 from a scalper back then. Most gamers were so appalled at the price they regrettably held out, hoping for prices to drop. LOL!!! We know how that turned out.
Yeah but it's also clear why there were no RT benchmarks. These cards just don't compete, especially with no dlss type feature. Maybe that will change? Honestly though I'm still not seeing in crazy cool rt in games yet so...
@@codyjames3416 These games He is benchmarking are built for Nvidia's RT cores though, I've heard that AMD's ray accelerators work with a lower quality version of Ray Tracing really well. Vulkan RT will take full advantage of. The games coming for console will also work well on PC, don't forget the PS5 has HALF the Compute units than the 6800Xt, and yet they are saying they have playable ray tracing at even 4k. So we'll see this materialize next year I hope.
Hello Steve, I have a question about the USB Type-C port on the RX 6000 series cards. Would it be able to drive something like a Lenovo Thinkvision M14t portable touchscreen ? There was an article about the USB-C port of the RTX 2000 series cards by eurogamer.net (www.eurogamer.net/articles/digitalfoundry-2019-02-28-psa-the-usb-c-port-on-rtx-graphics-cards-isnt-just-for-vr) who reported that they were successful in connecting various devices. Is the USB-C port on the RX 6000 series cards also a full featured interface (video + data + power) ?
Hello. Will you look / cover in some of your video the flickering issue on nVidia new cards? Happened with DP1.4. Unfortunately my EVGA GeForce RTX 3070 XC3 BLACK GAMING and monitor BenQ BT 2420 PT has this issue. So, I have to run with HDMI cable. But for more monitors it is useless. :( Hard to say if it is problem of nVidia, or BenQ. As I have read discussions, most of users have this issue with BenQ monitors. When I was playing with setting, it helps to have 2560x1440 at 59Hz, not 60Hz, but it jumps back to 60 every time I start game.
I was angry that I had to pay 800 for a 6800 last November. Now things are 100 times worse. I only got this card because it was the least marked up of all graphics cards last November.
17:09 This is bizarre. The 3080 is equal to the 3090, the RX 6800 and RTX 3080 both aren't affected by OC, but the 6800 XT is much faster than the 6800. What could be causing such weird scaling? What does an RX 6800 XT have over the 6800 that a RTX 3090 doesn't have over a 3080, and that can't be improved by overclocks?
It's interesting to see how well can ray-tracing performance be uplifted by drivers and in-game optimizations by the developers for AMD platform... Curious. Does Steve consider this at all? Especially taking into account that original 2000 series RTX cards were uplifted pretty well by better drivers and optimizations made by the developers (IIRC, Dice scaled back some effects in BF5 to make it perform better, as an example). And then DLSS got to pretty decent level after some time, making it a viable option, especially at higher resolution. AMD should also have this feature with their Super Resolution hopefully soon. I believe there will be updates and retests, because it looks to me that AMD might not be the "worst RT peformance" out there, even with this generation and within a somewhat of a reasonable timespan (maybe even somewhat of a short one? who knows). My guess, if those improvements can be achieved, then maybe this first gen of RT cards by AMD can at least get to somewhat similar levels of 2070-2080 super cards, could be 2080TI/3070 level of performance, especially on upcoming 6900XT. Any thoughts?
Watch our AMD RX 6800 XT GPU review here: ruclips.net/video/jLVGL7aAYgY/видео.html
Watch our AMD RX 6800 XT tear-down here: ruclips.net/video/0s7bOaa6X9E/видео.html
GPU TESTING METHODOLOGY EXPLAINER: ruclips.net/video/-P7-ML-bPCE/видео.html
The best way to support our work is through our store: store.gamersnexus.net/
Get some rest boss. The bar runner arrived and it's very, very nice product!
Flash that XT bios Steve, you know you want to😎 thanks a lot for your hard work guys!
@Gamers Nexus Do you think that minecraft ray tracing will improve with game updates and drivers for it?
Imagine if scalpers started buying up all the Gamers Nexus merchandise. That would be cruel.
Great video. By the way, you forgot to mention that the AMD cards come with 16GB which will come in handy once those true next gen games start to come out.
Every time new GPU line is released, Steve lose a year of his life in a week.
HAH so true. All for our entertainment and education(and his livelihood). Thank you, Steve!
I mean, he's over 2000 years old by now, so it's a drop of wine in the ocean.
I appreciate the work he does to give us grade-A quality data in regards to the new Pc hardware releases. GN is the first channel I would go to for that reason and I can be assured that whatever is presented is reliable. Ofcourse, there is room for improvements as with everything and I am sure Steve is his own biggest critic. Nevertheless, thanks for the dedication and love you put into PC building.
not to mention his curl's
@@BWpepperrAgree one of the best
I just refunded order when he said no Rage Mode. I'm not stupid.
The real joke is that there was even an order to refund.
@@Zosu22 That was the joke.
@@fanboiahoy260 No it wasn't lol
ahahahah ur so cringe
Can I has your order?
Steve currently benchmarking how well a brain can do RTRT during REM sleep.
XD
The next step is looking at how DLSS can reduce testing times for new hardware, by predicting the results before the benchmarks.
@@bilalmalik5002 Username checks out.
@@uncivil_engineer8013 Oh yea!
@@bilalmalik5002 How apt to find someone named "The Imsomniac Scientist" under this comment 🤭
I feel like NVIDIA is going to have marketing issues if they don't introduce some sort of HULK mode to fight for mindspace in this team red RAGE future. Its what I think the modern PC space needs - more pointless marketing terms that mean nothing to confuse issues as much as possible.
GAMER RAGE MODE 1337 BBQ
@@GamersNexus Does that come with bundled burgers? I've heard FarBurger 7 is gonna be really good!
@@GamersNexus BBQ? Do the cards double as a grill or something
@@brucetus of course...did you or did you not see the FE Nvidia grill design? It even has a fan blowing warm-ish air upward, to allow for sous-vide cooking.
"1337"? Isn't that already taken for a gamer internet access plan?
This lack of sleep is going to make for a good mountain biking video later.
30 seconds of a gopro swirling around and bouncing off trees and rocks, followed by a disturbing stillness. You see pieces of a white AMD bike strewn about the foliage. A red led on the side of the frame starts to blink as it uploads data to a distant server. A computer in Santa Clara turns on and a message appears on the screen: "Mission Accomplished: Target Eliminated."
@@pirojfmifhghek566 RIP Steve. Another win for AMD.
@@pirojfmifhghek566 gold. Get someone on Fiverr to animate that 😆
hes gonna bike in zombie mode, half awake half asleep!going to wake up on top of a mountain with a red bull can in each hand!
Steve reviews his local hospital. Content is content!
"Being a Techtuber is easy they said, you just talk about products they said."
I don't want to imagine how much work, how many hours/day you guys spend every day. BIG KUDOS for getting us the infos!
I believe he recently said he works around 100 hours a week, so as a random RUclips commenter, I obviously work 200 hours a week! Please ignore the fact that a week only has 168 hours in it.
I don't know who says this shit
@@mrlescure haha, did 25 hrs on Tuesday. The hours in a "day" don't matter when the day never ends!
@@GamersNexus I promise I will still watch your videos if they're released a day late. Please rest your eyes lol
@@allenqueen I was just meme'ing but there are people who think being a techtuber is easy.
Shoutout to the techtubers like gamers nexus and hardware unboxed who put in so much effort every launch. Always have the most trustworthy and reliable data as well. Make sure you're still getting enough you-time man.
Odin Hardware as well
He even includes video, and 3 at a time side by side 👌
Linus is best
@@zomberkay if you like silly faces. Although I must admit I also like the fact his team did both cards on day one.
Did you see Steve from Hardware Unboxed on their last video? His face was screaming 'i need sleeeeeeeepp' 😅
@@Superiorer they are god
This card is aging incredibly well, especially used, with the power and vram it has. I just grabbed one for a friend at 300 dollars.
Bought a new one JUST now for $320 It's the best "middle" I think you can get, man. The "sweet spot" was the 6800 XT, but they're more difficult to find at a good price, I'm seeing alot of them for over 550 to 630, I can't justify that.
@@dylanherron3963 bro, new for 320 is a steal. Also, from my experience, the 6800 is absolutely a high end card! Good speed, plenty of vram. You can play just about anything even in 4k relatively solid. That purchase should last you a good while :)
@@Ryzard My dude, I am happy to report, (after swapping out a 7 year old EVGA 550w for an EVGA 750w) that I just introduced a CPU bottleneck to my computer lmaoooo. Jokes aside, that's a good thing that I knew would happen, I figured an RX 6800 would barely lift a finger on Cyberpunk in 1080p, even completely maxed (RTX off) when paired with an i7 9700 (some extra cores would help). Like I imagined, when I'm zooming thru Night City on a bike at 140 mph, Rivatuner's got my CPU pegged at 99% (very CPU intensive game) loading in/out all the buildings, decals, and people. But during NORMAL play, I'm sitting between 119 and locked at 144. I realized I screwed myself by opening up a whole ass upgrade path now, lmaooo.
@@dylanherron3963 yep, you've opened the rabbit hole! That being said, your setup should be totally fine for 1080p at good frames for a good while. Glad you're enjoying!
@@Ryzard Beyond enjoying, I've yet to dip under 80 fps tonight in the most intense settings lmao. Good chatting mate!
"TLDR: Don't do TLDR, watch the video"
😂 Fair enough.
In australia the 6800 is actually cheaper than the 3070, so it's an incredibly easy choice.
And the XT is looking to be a couple of hundred cheaper trading blows with the 3080. Looks like I'm going to go team red this time. Be rally interesting to see who leads in different countries markets
Even if it wasnt, that 16gb vram vs 8GB justifies the lil bit more premium imo.
in NZ the xt ref cost about the same as a low end AIB 3070. Hopefully custom Rx 6800 aren't too much more
Absolute steal, ordered mine in a heartbeat.
In Brazil as well (only about 40 USD, but that's enough to buy a 256gb boot ssd lol)
Nooooo the rage mode isn't there :((( Now we won't get to see those RAGE MODE animations which makes me sad. This is a major problem, AMD.
A year later, I had the RTX 3070 in my cart for a few days and was doing research on which AIB to choose from, with many of them having a difference of up to $100 between them, then I watched comparison videos between the Rx 6800 Vs Rtx 3070 (and some forums too), after watching so many youtubers and benchmarks, i ordered the RX 6800 tonight. It was about $150-$200 cheaper and beats the 3070 considerable, for someone who doesn't care about RT. The games I play don't support it.
That is a hefty price difference, especially considering the performance. Which RX 6800 did you end up going with?
Thinking about doing the same, but I will probably wait some more. In my country the 6800 is sadly still unreasonably expensive.
@@ashamancito4630 I bought the XFX speedster. I have no regrets...I also got a 3 game reward bundle from AMD (Sniper Elite 5, Forspoken and Saints Row). I think you should wait a little if you can. I say that because a few days after I bought my card, the price dropped even lower in the store where I bought it. So I think the price might still drop again and it might also drop further for the 3070 as well.
@@josephomoyajowo8509 Bro I have a doubt how is AMD drivers nowadays bro any issues ? Any stutter issues? Any black screen issues? Please share your experience
@@sajithsaji3606 drivers are still buggy. Personally, I haven't had any of those problems. The only issue I've had so far was being unable to 'record' my gameplay and the new update fixed that. Other than that, I haven't had any issues
@@josephomoyajowo8509 bro which ram you're using bro??? Do you use DDU before installing drivers or updates bro????
AMD: *implements RAGEMODE*
Gamers Nexus: "lmao"
I love the idea of it being is literally reaction, watching RUclips, seeing the "Rage Mode" on AMD's presentation and just saying "lmao", not even laughing, just saying that to himself
@Sour Typhoon025 all it does is raise the power limit.
That profile pic though
32:00 Thank you for making clear in your recommendations whether to buy a card for Ray Tracing performance or 1440p (no raytracing, upgrade). There are gamers like me who have one or two games they only play and those games are unlikely to be graphic wise updated by the developer to support full-feature raytracing (which also looks actually nice and is not a tank on performance).
This Jesus guy is very knowledgeable.
Silicon is just advanced carpentry
He can walk on liquid nitrogen, and turns water to liquid metal thermal paste, hail tech jesus
He's clearly Jesus because he doesn't want us to sleep by uploading all these videos....
@@shaung7441 the 10 commandments shall be etched in an IHS!!! His apostles will include Linus, Jayz2cents, Paul, Kyle, and Der8auer. He speeketh the benchmarks and the world shall know these truths come in good faith! For he is Tech Jesus ™️
Jesus Christ is Lord
"47 FPS average, not a pleasant level of performance to play at"
*cries in 30 FPS*
Not even 2k :(
yikes sorry m8. Hope you are blessed with a 6800 or 3070 (depending on which you want), as well as a better PC overall!
@@Squilfinator Waiting for cheaper cards to launch like a 3060 Ti or 6700XT so I can do a full rebuild.
The 3070 and 6080 currently cost more than I paid for my current core components (CPU, GPU, motherboard)!
Then I'll probably pick up a 5600 non-X if they launch one.
Started watching this and was thinking, Steve sounds like he might be a little tired or drunk and he's talking slower and somehow I am following it better, then realized my playback speed was set to .75. Ugh.
I just tried this, and thank you so much for introducing me to this amazing feature of GN videos. xD
0.5 for when you're out of red bull.
lol
even the ad spot is drunk xD
Holy shit 😂😂
Can't wait to get a 6800 when the 7000 series releases!
cant wait to get a rx 5700xt when the 6700 releases!
@@Xenoray1 I'm getting one if there's a good Black Friday deal. Fingers crossed for a sub 350€ value!
@ im waiting to get that for 250 buck used with waterblock XD
'Rage M'ode' feels like marketing ectoplasm left behind by the ghost of Raja Koduri
MSRP means nothing at this point in time. I can't imagine that getting any better anytime soon, rabid consumerism has gotten a bit out of hand.
Yeah, you can't find those cards for anything less than 70% markup here. Both amd and nvidia
Lowest price I’ve seen in my country is €899 for the 6800 and €1000 for 6800xt
Yep and the suppliers win
In Eastern Europe the PowerColor RX 6800 is 100 euro cheaper than the Gigabyte RTX 3070, and the PowerColor 6800XT is 200 euro cheaper than the Gigabyte RTX 3080. All of them are like 650-900+ euro and ofc none of these have any stock anywhere.
@@PadyEos Indeed, it seems like in real life the 6800 is cheaper than the 3080. I hope Nvidia does an Intel, they were super dishonest about the pricing.
I'm really glad you guys still do 1080p testing. So many outlets have dropped 1080p, but it's definitely still important thanks to high refresh rate monitors. I would take refresh rate over resolution any day. (As long as it's not below 1080)
I think Steve probably says "stock" more times in this video than the number of cards actually will be in stock ....
31:26 "That extra 80$ is what complicates it"
Yep, that's my exact feelings when it comes to these cards. 499$ for the 6800 and 599$ for the 6800xt would have made the 6800 almost a no brainer, and the 6800xt way more compelling. Like it or not, the not-so-exciting RT performances need to be addressed in some way, and price would be the only way to do that for this gen. AMD still can't afford to not to be extremely aggressive when it comes to pricing in the gpu business.
It does not matter because stock issues. Also u will not find $580 and $500 cards. It will be mostly $550 vs 550. 6800 XT will be $700 and 3080 will be $800
well, they got all sold so this crazy busniss going on just worth it lol
@King Marco Louis III actually what happened in godfall and dirt 5 that's not the case point is you can't say that yet when the other company paid for for support and one just launch
@@ZAGAN-OZ 550 to 550? But isn’t 6800’s price at least 580? Did you mean 650 to 550?
How many games have raytracing ? Maybe 5 ? So why everyone so worried about raytracing eye candy?
Even those games are not at acceptable level from both side.
$80.00 extra from amd card gets you 16gb vram and 15 % higher performance over 3070. Simple as that ,rx 6800 is better card.
Wendell's review on Level1Tech implied that the RX6800 series outperforms the RTX30 series by quite a lot when using slower RAM.
Is there any chance you could do a similar test to verify his results and maybe explain how come?
It does my buddy tested it on his testbench(he's a manager of a microcenter)
@@roboticvenom1935 Thanks.
@@victorsegoviapalacios4710 Not claiming they are. But it's something that was not covered in the other reviews I've seen and it would be an interesting effect.
@@poraktobask would be interesting but if it were true. Would it really be worth using slower RAM to squeak out more performance out of your GPU?
@@SmolLiftsRich i mean you would be lowering your performance on everything else execpt for gaming.
Basically, the card to buy if you're gaming at 1440p and don't care about RT features in general. Close enough to the 3080 to make that look ridiculously priced.
With NextGen gaming here, look to most new games feature RT. AMD was so close to having a killer card but their lackluster RT performance will see this current crop of cards on Ebay in a years time.
Wait until Next your if you want to go team Red.
I dunno. With the new Playstation and Xbox *BOTH* using AMD as the CPU and GPU and with how much cross-platform gaming you have I don't think falling behind RT is going to be that big of a deal for this generation.
@@Keith2XS Don't be stupid mate. Very few people give a sh*t about RT. At least not in it's current form. Future games that will have it, will be designed specially for the RT features in RDNA2 because it's the dominant gaming platform for the next several years.
@@timpatrick564 This. The console GPUs have a RT baseline that any higher tier RDN2 GPU will exceed just fine. Where AMD really needs to catch up, and fast, is a strong DLSS competitor.
@@Keith2XS Yep more games will definitely go for RT now that Nvidia has a stable card for it.
I prefer Nexus testing and documentation than Linus. This is more accurate.
I've always had the impression that linus and jayz were more toward the entertainment side of the spectrum while GN is closer to the technical news side. Obviously both sides have both qualities (amd bike?) but that is what i get for an overall impression.
Well thats what GN does and is known for. LTT on the other hand tries to keep the videos short with only the information most people are looking for with a definite conclusion.
But its always best to look at a lot of different channels when it comes to hardware reviews, as some consider things others don't at look at them from a different angle. LTT and GN especially are known for their transparency and honestly when it comes to reviews.
@@hfcred6754 Thats true but I prefer Nexus cuz he is more technical approach.
I wonder if in the coming years AMD's RT performance will start getting better performance in new games with the same cards, since all of the current implementations would have been optimized Nvidia cards; however now that consoles have RT devs will be incentivized to optimize games for AMD.
I believe 2022 will be the year of AMD with their 5nm cpus and their 2nd gen rt gpus but we will see
@@burakkose481 5nm coming next year
@@rdmz135 good luck buying it
@@burakkose481 dont crush our hopes 😂
That´s the point. The mainstream developers are focused on consoles and set the standard for the base product from which we PC gamers get a port (in the worst case).
That both main consoles run completly run on AMD means we will see more support in the PC space too.
Now that both companies released quite a few GPUs, will we get "Best Workstation GPUs 2020" video, or will that have to wait until close to end of the year so 6900XT is also included?
That'll be content for around Xmas
As well as Disappointment 2020
I think it will be Nvidia in terms of workstation due to 3d renderers having optics and Cuda from Nvidia while it's only opencl for amd
@@lyelllampero3748 yeah Nvidia will be better. Based on the pcworld interview on thier podcast with Scott. It sounds like AMD won't have content creation /workstation stuff till RDNA3. Which makes sense.
The first thing they needed to do was to catch back up to Nvidia in rasterisarion, now that that's done, they can work on improving the arch and branching out to meet other needs.
Hopefully we see AV1 encode in RDNA2 as well.
@@scarletspidernz Yea, I am looking to get a GPU for Blender work, and lack of VRAM on Nvidia's GPUs is killing me. I was hoping for 3070 to have at least like 12GB, but with 8 it's a hard pass, so 3080 is only reasonable choice since I can't really afford to double that for RTX 3090 24GB.
RX6800 GPUs ended up being "good enough" in Blender workloads, but AMD and Blender have been having issues with crashes and what not for quite a while now, and CUDA is integrated and compatible with way more programs.
So I am stuck, either RTX 3080, RX 6800/XT or again pray that mirical will happen and we suddenly get a 16GB 3070 for $600 ( more like $800-900 in my country.)
@@vator_rs The RTX 3060 is rumored to have 12 GB as a counter for the RX 6700 that's being rumored with 12 GB as well. Maybe that one would fit your case? or is it not powerful enough (around RTX 2080 performance expected)
Turing user here, why not show RT tests from AMD RT implementations as well, like Godfall and Dirt 5? Control is an independent title, but is still a very much Nvidia based RT implementation. Results swing back to Big Navi with those newer RT titles. It is relevant, even if the visual improvements, subjectively, are not.
Yes I agree
@@victorsegoviapalacios4710 people who care about godfall and dirt 5? Also it’s just fair because it can show possible performance for something that is amd optimized and favored
@@victorsegoviapalacios4710 Shadow of the Tomb Raiders sales tanked 2 years ago, but its benchmarked ad nauseum. Whats your point??
@@victorsegoviapalacios4710 Who cares about Control and Minecraft RTX?
@@odderphase everyone
I only fell asleep for 12.4% of the time compared to the 36.7% with traditional videos!!! Is that a win for AMD?? (jk, good job Steve!)
Lovely review Steve... Just a bit odd that you keep referring to the "unicorn" $500 price point of the 3070. You even made a video that this price point isn't realistic as there is simply no FE stock left in the worlds and AIBs have their backs against the wall to produce a 3070 at that price point.
I think the $500/$700 MSRP lie is the biggest "reviewers gotcha" that was pulled in this generation.... And it was successful because almost every single reviewer keep quoting that unrealistic FE price.
We have had two drops of FEs just this week here in the UK and another one two weeks ago. Picked one up myself for £469 which is our MSRP and came yesterday
Well, if it's a lie then I must have imagined when I bought a 3070fe at msrp and have been living in some sort of a dream world since then.
6800 costs 689€ in germany...i paid 579€ for my 3070 Custom!
@@Fabianthehunter Good example. Yes, there is definitely scalping going on in Europe (and not just by individuals). In the Benelux, there is only one official seller of AMD cards right now: Alternate (a German company that branched out here) and they are asking €999 for a 6800 and €1099 for a 6800X. Retailers are scalping right now.
FYI: $500 = €421... multiply by 1.21 to account for VAT and you are stuck at €509. So you too paid €70 over MSRP for your custom card.
@@elmirBDS yes, at Launch day, but still way cheaper...
I think the RT test suite may need revision to include newer games that have been built alongside RDNA2 since games like Watchdogs Legion and Godfall show decent RT performance on AMD
Exactly, as more console games adopt ray tracing, amd will be easier to port to I think.
Holy, seeing the 6800xt trade blows with the 3080 is pretty cool to watch
again, the ray tracing results make me sad, but damn those rasterized results look tempting.
We wanted more options and now that we have them it is harder to choose 😂
I would wait and see how it performs in future raytracing games. The games that have Raytracing are primarly optimized for Nvidias RTX cards because those were the only ones with RT support. I wouldnt exspect more than RTX 2080 Raytracing performance but it will probably get better. You cant buy them anyway xD
@@samgoff5289 not for me i don't need rtx I'm a fps gamer since the 90s I'm upgrading to 6800xt.
Given that AMD is the vendor for the new raytracing capable consoles, the situation could change in the future. Though I wouldn't hold my breath either.
@@KyussTheWalkingWorm Like Tobias said, all those ray tracing games are likely optimized for Nvidia hardware as that was the only option until these new AMD gpu's got released, so the performance gap is likely going to become smaller, throw in that games target consoles and that's a lot more like these new AMD gpu's than Nvidia hardware as well as the drivers not being mature yet, I suspect performance is only going to get better in time.
Another thing to remember is that ray tracing was poor on Nvidia hardware early on but with driver updates and software optimization, it got a lot better, if that happens on AMD hardware, the gap could close a lot but with that said, I doubt these cards will beat Nvidia's new cards in ray tracing performance but then it doesn't really have too.
I have an RX 6800 and a Gigabyte 3070 Vision coming this week. Both are the same price. A lot of this video brings up the extra $80 for the RX 6800 compared to the 3070 FE. I love the idea of RT and DLSS really seems to help. If I pick the RX 6800 for RT I feel like I'm banking on the promise of future super sampling or RT optimization. On the other side the 3070's 8GB feels limiting as time goes on.
RDNA2's RT never quite caught up to Ampere. That said, the 3070 doesn't have enough memory to enable RT in some new games, so it's become a moot point.
Low VRAM always loses, too bad the mass market has gold fish memory and keeps falling for it every generation.
is the extra Ram worth the $80? that's what I was wondering and was surprised he didn't bring that up
No
I would buy the AMD card if you don't need your Graphics card for
1. applications like Blender, Premiere Pro and so on
2. streaming
3. playing Games with Raytracing Effects
The 8gb of the RTX 3070 are not enough in my opinion. 16gb will age much better and in games without Raytracing its faster.
But if the points above are important to you then you dont really have a choice.
I will probably wait and hope that Nvidia will release a 3070 Ti with 10GB RAM. 10gb is okay, I'd rather have more but I could live with that.
You cant buy these cards anyway.
@@Febreezus Doom eternal would like a word with you
It depends. If you wanna save up 80 now and sell and buy something in let's say two years, no just get 3070. If you just want something for next 5 years? Yes 16gb ram will age far better given that developers can push AAA titles with more graphical elements that would be noticeable even in 1440p. 8gb is really reaching its limit.
Needs to be tested with applications that currently benefit from more VRAM, but yes, that's an unanswered question at the moment except that it's in a better position as a future-proofing card than the current 3070fe...
The next video, that I'll be most happy to see is an 8-hour long live-stream with Steve sleeping the whole time.
Let's be honest who actually plays games with Ray tracing on and in 4K, 0.01% of people?
Exactly :)
You know he’s been busy when the table is an absolute mess.
I feel like RageMode was marketing targeted right at dudes that look like Steve.
Good video i just started doing RX 6800 benchmarks
Hey Steve, what are your thoughts on future architecture optimisations at a game code/driver level? In specific, both the new gen consoles use AMD hardware, so one would assume a resizable BAR function would be less of a toggleable feature, and more of a baked in optimisation for those platforms.
Furthermore, developers will have to optimise for AMD's RT architecture for the consoles, do you think this will transfer over to PC in uplift as it all matures?
Maybe worth a short talking head piece or article, after you guys get some well deserved rest of course.
What I was thinking but said much more eloquently.
As far as I know, both Microsoft and Sony have implemented their own solutions somewhat to raytracing and neither are directly using AMD's solution or just standard DXR. They have their own software or hardware to help ray tracing on their respective platforms. But even if they're 90% the same, I don't think with such a big difference, it'll change things all that much until they bring out RDNA 3 to be honest and who knows what Nvidia will have by then.
The thing is even with DXR optimised games, Nvidia's cards as far as I understand, would still run them better even if they're not using any of their proprietary RTX features directly due to the dedicated RT cores. Couple that with the fact that some renders apparently don't work properly in openCL anymore in Blender and the fact that Nvidia has most programs optimised for its cuda cores AND has more overall features on the cards and for me, it just has to be Nvidia I think.
10GB seems a little low for the future as I'd planned to get a card to last a good few years at least while pushing the best graphics or thereabouts so was looking forward to the 16GB with AMD. Alas with all of the above, I just can't risk it. The 3080ti would be perfect with its 20GB, could buy one of those and not upgrade for 5 plus years but I really don't think I can afford a 1000 dollar graphics card. Would be good to get an idea from someone a LOT more knowledgeable than me though for sure.
@@Jhakaro That's the flipside, yes, but the GPU in the consoles is RDNA2, I would assume the architecture is more or less the same. Which would mean they also have one RT accelerator per CU. Even if they're using more software trickery, the hardware is still there.
@@Jhakaro also, being from Australia, the 6800 is already nearly $1000AU and so is the 3070.
people seem to forget this is AMD first try on raytracing but still trade blow with 3000 series in non rt 4k games without dlss. They already forget usd1300 2080ti from Nvidia first try.
But to be fair there was no other gpu that came close in power to the 2080ti back then. Much of that price difference didn’t come from RT, it did come from the power. Like the 3090. You also had much cheaper cards with much more reasonable fps to price ratios, like 2060.
Also they were optimized for Nvidia, once next gen games start coming out with rt we'll see better performance
@@AverageDoomer69 there's no way you believe that to be true unless you're some AMDumb person. The consoles are RDNA2 and weaker than RX6800, they will barely have any raytracing.
Also, there is no such thing as optimizing raytracing for Nvidia. You don't actually touch raytracing optimization in that way, it's DXR. It's a high level abstraction, game doesn't care what card raytraces it. It's part of DX12 specification and Nvidia never had its own API for raytracing in DX12.
I think its worth noting that because AMDs card is in the new Playstation and Xbox, game developers are most likely to start optimizing for AMD architecture. This may not be true of all games, but it has to be a serious determing factor when the majority of your customer base is going to be on 3/4 of the options available. With a portion on PS, a portion on Xbox, and a portion on AMD gpu PCs, its worth considering AMD may be in the driver seat to work with a ton of popular titles to optimize each title to their architecture. Please let me know if im looking at this the wrong way.
Honestly for me the RT performance leaves me with more questions than answers. So one of the big selling points for both of the 2 new Consoles is Raytracing. Both of them use the same RDNA2 Architecture yet so far the 6800 as well as 6800XT struggle a lot to consistently output 60 FPS even on 1440p with raytraced games. On paper however both cards have much more oomph and memory than the consoles.
Are the few current DX12 RT games just really not well optimized yet (because RT is kinda an afterthought maybe) or is there something with the consoles that I am missing entirely that somehow can achieve better results with way less powerful hardware? Obviously I know that games can be optimized a lot better since the hardware is always the same but still something doesn't quite add up here.
It's simple: consoles are always running lower graphics settings than what you see on PC. It's been like that in the past gens, it will be in this one too. And sure, it comes down to optimization, but optimization for consoles often boils down to that very thing: how much (and in what way) can we scrape off graphics fidelity - and as such load for the gpu - before players begin noticing their game looks worse?
The new consoles *already* are struggling keeping stable 60fps with their launch titles (look at Spiderman: if you play it in fidelity mode - read: rt on - it basically runs at 30 fps).
Seriously, consoles have a different market, they can sell you hardware at cost, sometimes even at a loss, since they are proprietary platforms and they plan to recover that with the actual games sales. But there's a limit to that: you can't expect miracles when you pay for the whole platforms *less* than what just the kind of equivalent gpus would be supposed to be worth standalone. And if they are promising you such miracles, well: they're lying.
It's all up to game developer the hardware on these cards is more than enough
Consoles run on lower graphics and lower FPS not until recently have consoles even attempted to run at 60 FPS the previous gen ran games at 30 FPS that's why you see these discrepancies.
tldr; console RT will not be as good as Nvidia RTX because of inferior RT architecture like RDNA2
RDNA2 implemented RT because they had to compete with Nvidia. That's really the only reason for it being present in the new architecture, if Nvidia didn't push to start using this technology in real-time games rendering, neither company would probably be using it right now, and the same can be said for the new consoles. Its pretty easy to assume that if RX 6000 is unable to keep up with RTX 30 cards in RT, the consoles will struggle in a similar way too. Also, let's be honest, the vast majority of console gamer's don't have a clue what RT is or why they want it, it's pretty much just a marketing tool for Sony and Microsoft so they can say "well technically we have RT" and, with RT specifically, this isn't something that can simply "be optimized" by developers because RT performance is actually dictated by the actual architecture of the GPU die itself. That is why Nvidia will probably always have a lead here, because they are one generation ahead. Not to ramble on too much, but it's also worth mentioning that since AMD has been designing the hardware for consoles for generations now, and people have been saying games will always be optimized better for their architecture, but it never really measures out that way. I'm really happy to see RDNA2 trading blows with the top RTX cards though, this makes a really good market for us consumers.
And just a PSA for anyone who wants to call me an Nvidia fanboy: I've had 7 different graphics cards in my life and 5 of them are Radeon , and I'm STILL currently using an FX-8350 processor. (Don't worry I just bought a Ryzen 7 5800X)
The consoles' performance are an overhyped lie spearheaded by usage of a useless marketing term: "Teraflop." The Series X and PS5 are worse at RT than a 2060 super, as was shown by digital foundry in watchdogs legion. Observer system redux performance on both consoles (rasterization or RT) take this point and nail it home.
So Nvidia's realtime raytracing performance lead is purely a hardware optimization result or can we expect the 6800 to get closer performance to the 3070 as AMD's software improves and works with game developers? I'm curious looking at this from a longevity perspective, since I do not frequently upgrade. Thanks for any answers!
I would say the 20 series RTX over time represents software improvements, whereas 30 series represents hardware improvements
Nvidia won’t just stagnate, so it’s more a question of AMD’s software catching up to Nvidias hardware AND software as they update the GeForce drivers, and they’re already starting off way behind the hardware.
Do not bet on AMD for any kind of RT for at least another generation
The worst result for the 6000Series is in an nVidia developed Renderer. It actually ran better than I expected.
In short; Nvidia leads over AMD due to more specialized hardware but if AMD keeps growing like now it could be that they shift the tables. So get Nvidia now or wait for next gen AMD graphics.
3000/6000 gen will be a pretty close fight, with slight edge to Nvidia for all-around feature set. 7000 gen is the time to get AMD.
Right hand side, Steve got plastic shroud of PlayStation 5, teardown maybe?
Please do more frametime tests, these are huge for for people who play competitive games, or value smoothness overall!
Yeah I put a higher value on 1% low and frame time than average high
Agreed. I think avg fps is utterly useless number and it should be removed from tests - it tells nothing about how smooth the game plays, it's just a marketing number.
@@Cuthalu yeah like it’s nice to know I guess, but when they’re beyond 100 fps or within 10% it doesn’t seem very useful. I’d rather have a card that was 10% less avg but 10% higher 1% min
Hopefully these cards are good room heaters. Need one this winter
Worse than NVIDIAs, they're too power efficient
@@xWood4000 Nvidia cards are less room heaters more like furnace(480)
@@ani0.biswas You're correct. Definitely enough for a sauna in the summer
6:30 graph .1% is significantly worse on 6800xt than 6800, that cant be right????
6800 is now much cheaper than 3070 here in the UK, almost £100 (about $115 US) Makes the decision much easier. Essp if you aren't fussed about ray tracing.
a 6800 is 400 euros cheaper at 625€ in my country than 3070 🤔 Think its worth upgrading from 1080ti to 6800 sapphire? I have no interest in ray tracing and rx7000 series cost 1000€
@@jaskajokunen3716 6800 and 6800 XT are very fine GPU's and will likely last for 1440 without ray tracing for perhaps 3 years at high quality settings or better (Some ultra I would imagine). For the price you mention - that seems a good deal
I'll only get excited about Ray Tracing in games when there is NO rasterisation at all. In other words, pure Path Tracing.
And none of the current generation hardware is capable of doing that. Minecraft or Quake II do not count...
@@djohnny1337I think Minecraft actually does count! I would absolutely love to be able to play that, I think it looks fantastic
@@Thesuperhunter99 I have it it's great but tbh minecraft shaders still look better on java with a 1024x texture pack especially sues ptgi, minecraft rtx does keep the game similar tho and some people like thay
@@christopher.costner What they like and what people are used too are different things, style is a choice.
@@christopher.costner sues ptgi still absolutely demolish every gpu there is, max settings on a 3080 gives 30fps
Why is it that I never see card/cpu reviews using a flight simulator as a test bench (No, MSFS2020 is NOT a simulator) but, using Xplane 11 which doesn't use multicores but depends on Frequency (for the CPU) - Would like to know how well this software fares with different CPUs and GPUs - Don't know whether to upgrade from a Ryzen 3800X + EVGA RTX 2060 Super in order to get better performance. I don't have a lot of money in order to try out different scenarios.
Phil Nelson - Computers since 1975 (1st system: SWTPC 6800 with 4K Static ram)
Thanks to you et al for getting me up to date on all this technology.
Rasberry Pi - Who knew.
A lot of the performance issues are more to do with software and im going to bet that over the next few months as games companies get used to coding for the Xbox and PS5 we will see some huge changes
What happened to the ps thermal video?
This card is in such a weird position. Being significantly cheaper than the 3080 and 6800xt. But it's significantly more expensive than the 3070. Maybe it'll compete with a future 3070 super or something.
I think they're planning on still coming out with a $500 card. I bet this gen is going to be highly segmented.
Yeah it's not a director competitor to the 3070 exactly, given it's 80 dollars more expensive. But you get a lot for those 80 dollars, more vram for futureproofing and just downright a noticeable difference in performance. I think even if nvidia release a 3070 super, it's still not going to perform as good as a 6800.
The only main drawback is lack of ray-tracing performance, but given it's available on so few games - does it really matter right now?
@@peymanstd very true. Just go with 6800XT. Nvidia's mid tier segmentation makes more sense. For what not to do, look at the 16+20 series from last gen.
its the cheapest 4k card.
We can see it like this.
Really interested in this card. But I'm running on a SeaSonic 550W 80+ Gold PSU. Do I really need to get a 650W one, or can I use my 550W with casual gaming just fine?
I have owned my AMD Reference Sapphire RX6800 for about two months now and have loved every minute!😁👍 My 3070 was garbage.
I got the ASUS Tuf gaming OC 6800, still need to install it though. No DLSS but so what and RT can care less about. I would not use DLSS to compare to AMD because it don't exist. I look at comparison with no DLSS that is more fair.
@@Danny_On_Wheels44 i got the same card, it's amazing
BLASPHEMY!! 😱
super cool broskiis think after your clock off with Jay on the kingpin card you fellas should take a bit of time to chill out its been a crazy tech season and I for one really appreciate all the content you've provided
Id say value is justified given 16GB vram vs 8gb on the 3070.
and more fps everyone forgetting that look at the charts
Most of that 16gb is going to waste anyway. Sure it’s nice knowing you have more but it’s not that useful realistically.
@@gaochang784 , lol, I'm sure you are one of those who said 4 cores are plenty for another 5 years.
Strange mentality from some of you . Ok you go buy handicapped 3070.
@@rauls9027 How's 3070 handicapped? By the time 16GB is needed these cards are already obsolete.
@@professionalinsultant3206 No its not. Most of the older games right now such as AC Odyssey etc already use 6 GB of Vram. The reason they are not using more, is because Nvidia was the only one in the market and they barely released cards with more than 8Gigs. So developer are capped at around 6GB which is the sweet spot. Go checkout 4k Benchmarks on Flight Simulator, Crysis, watch dogs legion, Godfall. These games already eat up around 8 to 9 GB of dedicated VRAM! Not allocated, dedicated! Doom eternal even loses around 20 FPS because it gets bottlenecked by the Vram usage. Same with Crysis Remastered. Godfall uses around 8 Gigs and 12 Gigs in benchmark mode. And Godfall doesnt even have Raytracing implemented which will be around 1 GB extra Vram too.
www.overclock3d.net/gfx/articles/2020/11/16102420724l.jpg
Seriously, thank you so much for you hard work on this.
WTF is up with Nvidia's 1% lows compared to AMD?
It's just in TWTK that it's abnormal.
@@GamersNexus Steve, we saw the same thing in HUB videos in other games, maybe there's something to it.
Fun fact: My wife has seen you on the big screen so many times that she created you and your shop in Animal Crossing, and named you Tekku Jiisus, she's Japanese. Burst into laughter at work when she was sending me photos of that creation. Happy to send the photo over if you are interested!
@@MadViking82 lol if that's not waifu material I don't know what is.
Maybe the Infinity Cache is helping a lot here?
@@MadViking82 wow she's a keeper
*What an amazing and complete in every aspect analysis* !!!!
*Thank you soooo much for your work Steve , you are the greatest* !!!!
You've all got it confused, S.A.M. : Slighty Above Margin
Can you please test video quality when recording gameplay please I like to see the video comparisons on video encoder they are using.
I think I would have preferred living in a timeline where RT was handled by an additional board external to the GPU. I have so many PCIe slots doing nothing.
Wouldn’t get off the ground. Same way PhysX add-in boards didn’t.
Besides, the extra board would need to have access to all the GPU data so it probably wouldn’t work anyway so it makes sense to have it handled inside the GPU.
@@MacGuyver85 Yeah, I'm sure there's a whole bunch of reasons it wouldn't have worked besides even those, but in the fantasy timeline I was imagining it using some kind of bridge connection like nvlink (ideally it wouldn't be proprietary but even my dreams have their limits).
Anyway, this is just to say that I think it's a bit unfortunate that generally this card performs really well until ray tracing comes into play. I'm excited for the next generation of these cards, at least, when the ray acceleration hardware has matured (or whatever it is that needs to happen in order for it to be performant).
@@OMGItzFokral Sure, but latency would still be an issue. But I guess your alternative timeline could have some instantaneous quantum link to solve that ;)
Yeah, the future is looking good with raytracing.
Since DLSS or AMD’s Super Resolution will remain a necessity to have playable framerates for years, I think we should wait to judge the 6000 series until we see Super Resolution. Though I think Nvidia will hold the crown this round.
I would gladly buy a GTX 3080 for 500 euro
God no. Nightmare for SFF builders
You should make your graphs always start at 0. Gives a much, much better idea of how close things are, and when noise matters or not.
I managed to purchase this card on launch day, but not the XT... Not sure if I regret it, seeing as the 6800 XT is not much more expensive :P
Its still beats 3070, be happy 😊
Choices bro. Choices.
There might be a BIOS flash to get XT performance similar to the 5700.
@@merwinpique I agree that choice is good, but if they had priced the RX 6800 at $550 or $560, that would make the performance difference between it and the XT just right. I feel like they're too close to each other right now.
@@TheJupiteL it has less compute units it's not the same thing. But he can OC it
Love your videos Tech Jesus 😊
I’d love to see some production benchmarks. That 16 gigs of ram’s looking really attractive for some cycles rendering on a bit of a budget.
Vram doesnt really mean anything when a lot of render engines now support rtx cores such as vray and arnold and probably more
Cycles run really badly with AMD cards and often comes with ugly artifacts + the lack of optix denoiser (which is hands down the best denoising algorithm ) atm means AMD cycles is a big no no.
(edit:fixed phone auto correct and formatting)
However AMD does have its own render engine called Radeon pro render which should run really fast on the 6000 series cards and is available as a plugin for blender. Do note though it does handle materials differently than cycles so you might have to learn how shading works there.
If you wanna get back into cooler reviews - Alpenföhn launched their 240/360mm AIO's after being exclusively an CPU air-cooling brand before.
I'm not sure on their availability though - it's a german brand.
When comparing the price points why is no review mentioning the fact the 6800 has DOUBLE the vram then the 3070 for just an extra $80.
That adds so much value to the card.
I'd guess currently, in most cases, it does not have any benefits, but in productivity something else hurts Big Navi so it still lags behind even with the extra VRAM? I guess?
It does show that, if trends keep going, AMD's cards should age better with time than Nvidia cards
Because you typically never get close to maxing out on your VRAM, and the ram SPEED on the Nvidia cards is better anyways so it's kind of a moot point.
Because he is a freakin schill!!!!!!!!! /jk
Maybe because most titles don't need it yet, but I see that being a problem in the next year or so as some games already can use the memory.
@@Febreezus doom eternal already bottlenecks the 3070, 8gb at this point is planned obsolescence by nvidia
@Невада большевик 3070 has 8gb not 10gb.
Best review as always. Other reviews keep saying buy the 6800 over the 3070 and ignoring the higher price. I'm a dad building my sons first gaming PC he loves minecraft / fortnite / TR and hopes to play MS Flight Sim / Anno. 1 I'm on a budget and 2 the 3070 does better in those 3 games with ray tracing. For people in the same shoes as me the 3070 is the best budget 1440p gamers build and I'm pairing with the 3700x. Thanks again great review. I just hope I can get a hold of a 3070 in time for Christmas 🎄
I'd like to see some RT benchmarks on low settings. From what I've seen in Dirt 5, it seems AMD is better at handling RT in small amounts than Nvidia with less of a bottleneck, but chocks more in very RT heavy titles. With that I mean the percentage dip AMD gets from light RT is less than the percentage of Nvidia.
In your dream
does anyone know for the non xt 6800, are amd using the hitachi pad on the gpu core? or normal paste? because i just bought a used unit with no warranty so i was thinking of cleaning and repaste it. And what about the sizing of pads everywhere else ? could not find any info regarding non xt 6800. everyone seems to be using the XT version :/
I found this comment on a video editing review for the rx6800...
1. the cooling paste is of such poor quality that you be wise to replace it with your own better quality, the paste that comes with the card, is already dried up and does not cool as good, so after 1 maybe 2 years depending on use or how long it was on display or in the stock, your card starts to artifact the screen, and at times create sort of screen blinking, quick off on.
When used longer your card will crash and start with artefacting the crap out of it until it goes PROOF.
2. fan settings sucks, always check your gpu fan setting also nvidia users, better yet make your own, and to be sure use a external program and not radeon drivers, they always revert back to default nuts settings, when it crashes, try setting it at manual, and see what your default fan settings are, mine rx580 was set at max 90 degrees and fan speed at 80%, no wonder it was crashing.
Make sure your fans run at max speed when they reach around 63 degrees, zero fan speed at certain low temp(DONT), disable that at once, make sure your fans spin at all times.
3. power control and options or settings as default, change it asap, undervolt your gpu and frequency on the gpu core, and store and run it as default in an external program(msi afterburner).
Go -50 on the voltage, and minus 200 on your gpu core, check your temps while you adjust the settings during a heavy gpu test program.
Go up 1 point on the voltage and 4 on the frequency until you reach a sweet spot at around 60 to 63 degrees, and then go 2 voltages below that, make sure your card does not reach higher then 63 degrees, you can go higher on that, but make sure you test it good, any artefacting, screen blinking, system reset/restart or program freezing go lower on the temps by loewring voltage and frequency.
4. last but not least the radeon software, man what can i say it does not even remember or apply your own settings when it reverts back to gpu default settings, during a crash or a freeze, and that is a huge problem on top of the 3 previous mentioned hardware misconfigure problems on hardware level, that makes your card go poof withing 2 years and 2 months on heavy use.
I had issues after 8 months with my new rx6750xt that started blinking and later 1,5 years when it started to shutdown my pc for no reason or even freezed the game i was playing without any description on why yeh DXGI crash and driver reset to default.
i changed my cooling paste, adjusted my fans to run at max speed at 63 degrees, undervolted to 1158(default=1200), and went to 2664 (default = 2694)on the gpu frequency, did not have a single issue after those changes. max temps at 63 degrees, but hardly reaches 58.
The reason i keep using msi afterburner for my fan and volage control is, that amd radeon is a piece of crap software that keeps reverting to the crazy settings and burn your gpu out of exsistance within 2 years and 2 months, i lost a rx580 due to this.
Amd default fan and voltage setting do not take in account the poor quality of the cooling paste and the dry up time of that same paste, they are to close and temps are to high and fan speeds to low at insane temps, and fine tuned the crap out of it to a literal dead spot, and a gpu card breaking spot with all 4 mentioned problems.
This 7900xt will work fine if you change the cooling paste, lower your voltages, and temps, adjust your fan speeds,and start using msi afterburner as default voltage and fan control, do not rely on radeon software.PERIOD
I got lucky getting a 3070, extremly happy with it. Espacially with Cyberpunk on the horizon.
same
@Gamers Nexus - 14:36 why are the 3080FE minimum frames so much lower than the other cards?
Ahhh nm just seen it's only the OC'd score that does this. Must be throttling under load.
For me, RT is going to be a next gen feature. For the time being I just don't care.
I do care about more memory though sooooo
Ding Ding!!
Agreed, waste of time IMO until it can easily handle it
Raytracing is compelling when combined with something like DLSS. It’ll be interesting what AMD cooks up with their SuperResolution solution.
You are top of every tech release with such great info! I hope you are getting some rest and a good meal, all this work must be exhausting.
These cards are wicked with clock speed, can’t deny that
You make the most expert review about this topic. Ty.
Reduction in performance because of Vulkan, maybe a driver issue? Just a guess
Doesn’t even make since
Be interesting to see if it's just rdr, or other games exhibit the same behavior
@@justmixah3408 This is red dead redemption we are talking here. Anything can go weird with that game.
@@Deliveredmean42 yea but dont say drivers.
@@justmixah3408 Red drivers redemption.
You guys should color code the names on some of the charts like at 24:46 , just making AMD cards Red and Nvidia green text then "OC" in orange or something will help a lot.
I wish you used aftermarket cards for this. Reference cards are pretty irrelevant to most of us I would think.
As always, very informative and detailed. Thanks
I care about RT and DLSS, i went with a 3070
I'm holding onto my 2070 for awhile longer to see if Nvidia pulls a "3070 Super" type release with a lot more VRAM, but also not giving up DLSS so AMD also isn't an option for me at this point in time.
I do not care about RT or DLSS, i went with 6800
@@brazilpaes @PandaButtonFTW
This is all that matters, you both had features you preferred, and got the cards that suited your desires. That’s the magic of choice that gets lost in the team bloodsports. 👍
RT and dlss is on 12 games LMAO. half the games with RT only do shadows too so it's useless
@@brazilpaes I care a lot about DLSS but RT is just a total waste of resources right now
So I'm doubting a lot on what to get
Hello I have a question please
I got a 620 +80 bronze psu ...with 5800x3d cpu ....can this psu handle the 6800 stock or undervolted ...with 2x 6+2 splitted from 1 cable ?
Yes it should work. In theory, 150 watts from the cable and 75 watts from motherboard which is 225 total. The 6800 is limited around 205 watts.
1 year ago, today this video released. I wonder if the first viewers got their GPUs.
You could get a 6800/XT for about $900 from a scalper back then. Most gamers were so appalled at the price they regrettably held out, hoping for prices to drop. LOL!!! We know how that turned out.
Well, There's a reference RX 6800 inside my PC right now...
What is the OC version? Is there a standard way you over clock or do you optimise it to an inch of its stability?
Oh man, those AMD slides at their investor meeting claiming Extreme Performance were not a joke.
Yeah but it's also clear why there were no RT benchmarks. These cards just don't compete, especially with no dlss type feature. Maybe that will change? Honestly though I'm still not seeing in crazy cool rt in games yet so...
@@codyjames3416 Yeah, very nice, RT and DLSS in like 10-12 games. Vast majority of people don't care about that.
@@codyjames3416 These games He is benchmarking are built for Nvidia's RT cores though, I've heard that AMD's ray accelerators work with a lower quality version of Ray Tracing really well. Vulkan RT will take full advantage of. The games coming for console will also work well on PC, don't forget the PS5 has HALF the Compute units than the 6800Xt, and yet they are saying they have playable ray tracing at even 4k. So we'll see this materialize next year I hope.
@@Omiicron no Gamer care about stupid RT, if 3070 didnt waste $ in RTcores and gave 16gb instead of 8gb,Nobrainer purchase
@@sujimayne Yeah nobody cares about a feature that gives 500% more performance.... you fanboys are hilarious.
Hello Steve, I have a question about the USB Type-C port on the RX 6000 series cards. Would it be able to drive something like a Lenovo Thinkvision M14t portable touchscreen ? There was an article about the USB-C port of the RTX 2000 series cards by eurogamer.net (www.eurogamer.net/articles/digitalfoundry-2019-02-28-psa-the-usb-c-port-on-rtx-graphics-cards-isnt-just-for-vr) who reported that they were successful in connecting various devices. Is the USB-C port on the RX 6000 series cards also a full featured interface (video + data + power) ?
These AMD cards have aged like fine wine while NVIDIA cards aren't faring too well.
Hello. Will you look / cover in some of your video the flickering issue on nVidia new cards? Happened with DP1.4. Unfortunately my EVGA GeForce RTX 3070 XC3 BLACK GAMING and monitor BenQ BT 2420 PT has this issue. So, I have to run with HDMI cable. But for more monitors it is useless. :( Hard to say if it is problem of nVidia, or BenQ. As I have read discussions, most of users have this issue with BenQ monitors. When I was playing with setting, it helps to have 2560x1440 at 59Hz, not 60Hz, but it jumps back to 60 every time I start game.
I was angry that I had to pay 800 for a 6800 last November. Now things are 100 times worse. I only got this card because it was the least marked up of all graphics cards last November.
3:31 "T.J. Maxx, Target" just wanted to say thanks for making the sponsor spots as minimal as possible
In a sane world, Nvidia's VRAM stinginess would push me to AMD this gen. But the sad fact is I'll take what I can get. Which will probably be a 3070.
17:09
This is bizarre. The 3080 is equal to the 3090, the RX 6800 and RTX 3080 both aren't affected by OC, but the 6800 XT is much faster than the 6800. What could be causing such weird scaling? What does an RX 6800 XT have over the 6800 that a RTX 3090 doesn't have over a 3080, and that can't be improved by overclocks?
sorcery (unoptimized for these cards)
It's interesting to see how well can ray-tracing performance be uplifted by drivers and in-game optimizations by the developers for AMD platform... Curious. Does Steve consider this at all? Especially taking into account that original 2000 series RTX cards were uplifted pretty well by better drivers and optimizations made by the developers (IIRC, Dice scaled back some effects in BF5 to make it perform better, as an example). And then DLSS got to pretty decent level after some time, making it a viable option, especially at higher resolution. AMD should also have this feature with their Super Resolution hopefully soon. I believe there will be updates and retests, because it looks to me that AMD might not be the "worst RT peformance" out there, even with this generation and within a somewhat of a reasonable timespan (maybe even somewhat of a short one? who knows). My guess, if those improvements can be achieved, then maybe this first gen of RT cards by AMD can at least get to somewhat similar levels of 2070-2080 super cards, could be 2080TI/3070 level of performance, especially on upcoming 6900XT. Any thoughts?
When we can expect the PS5 Video?
And - do i see one of the Internal PS5 plastic pieces to your left on the desk??