RX 6800 XT vs RTX 3080 in 2023!!! New Games, RT on/off, DLSS/FSR2 on/off, 1440p, 4K, 1080p

  • Published: 23 Aug 2024

Comments • 1K

  • @sailyourface • 1 year ago • +641

    Considering I just picked up a 6800 XT for $515 last month vs. a 3080 for $700+, this video is perfect for avoiding buyer's remorse.

    • @zacthegamer6145 • 1 year ago • +43

      Dude, the RTX 3080 is better for VR, RT, and work, has better software support and RTX Voice, and gets access to both DLSS and FSR.

    • @themegaminecrafteur • 1 year ago • +276

      @@zacthegamer6145 YEAH BUT 200 DOLLARS MORE, MATE :') IT'S PLAIN STUPID

    • @kaisyaya8492 • 1 year ago • +102

      @@zacthegamer6145 But what if someone just bought it for casual gaming?

    • @FOREST10PL • 1 year ago • +281

      @@zacthegamer6145 Great, I don't use my GPU for any of those tasks.

    • @jasonbrody1134 • 1 year ago • +80

      Honestly, ray tracing is a very niche thing to have. It's great for certain single-player games that can benefit from it, but that's where it stops. Second, DLSS should never be the biggest thing marketed on an 80-series card or higher, especially for the 30 and 40 series; it's downright shameful of Nvidia to pull that kind of crap. The GPU's overall rasterization performance and efficiency should speak for it. Third, people keep having this very biased and downright idiotic misconception that AMD GPUs are always inferior to Nvidia's, which is downright false. AMD GPUs are on par with, if not superior to, Nvidia GPUs in rasterization at 1080p and 1440p, which I believe is the resolution most people should be playing at if they aren't already, because let's face it, 4K is extremely overrated, overhyped, and just not worth spending an arm and a leg on in terms of the cost of building a 4K PC.

  • @Sly_404 • 1 year ago • +145

    In particular, the 6800 XT is probably a smart option if you are in the market for a new GPU at the moment. You can either jump to a newer card if we see significant price cuts down the line, or skip the current gen entirely.

    • @Z3t487 • 1 year ago • +5

      I bought my brand new RTX 3080 12GB in July 2022 for 950 euros, when prices were still a bit crazy. I don't know if it was such a wise choice because, you know, on January 5th, 2023, Nvidia launched the RTX 4070 Ti for much cheaper... and I was coming from an RTX 2080, so I wasn't in too much of a hurry to upgrade.

    • @pepoCD • 1 year ago • +21

      Dude, what?? Yes, the 6800 XT is okay at about €640 here in Germany, but the RTX 3080 is €800, and even if you think the extra price is worth it for DLSS and RT, you should just get the RTX 4070 Ti for €900, which not only offers better performance per price and way better power efficiency but also has Frame Generation.
      Maybe it's different in your region, but here the 3080 is a completely pointless option since the 4070 Ti released.

    • @nothingisawesome • 1 year ago • +4

      @@pepoCD Good point. I would note that occasionally the 6800 XT is on sale for around $500-$550 in the States. That price is still worth considering.

    • @SwingArmCity • 1 year ago • +1

      I think I agree with you. I would love a 6800 XT. Both (the 3080 and 6800 XT) are probably the top of their generations in value-to-performance, but like most Nvidia options, the 3080 is not good value. I see a valid point for forgetting the 3080 entirely and just going to the 4070 Ti, but I don't think that is the best value either.

    • @Phil_529 • 1 year ago • +6

      No point buying a 3080 right now. The 4070 Ti makes way more sense with DLSS 3 support.

  • @Scharamo • 1 year ago • +93

    280 vs 340 watts, and the 6800 XT costs much less. Glad I bought a 6800 XT.

    • @Dosudro • 1 year ago • +2

      Beat me to it. XD I am very happy with my 6800XT.

    • @PeterPauls • 1 year ago • +5

      My OC RTX 3080 12GB eats 350W sometimes. 8nm Samsung vs 7nm TSMC nodes. It is what it is; I bought my card on a deal…

    • @Dosudro • 1 year ago • +18

      @@PeterPauls Nothing wrong with catching a deal. 👍

    • @riannair7101 • 1 year ago • +8

      My RTX 3080, undervolted, consumes a maximum of 230W for 5% less performance, and the temps are more than good (under full load they never go over 60°C).

    • @PeterPauls • 1 year ago • +2

      @@riannair7101 Those are really great results. I have the Gigabyte Gaming OC version and it keeps the temps around 66-68°C with no loud noise, so I've never really tried to undervolt this card.

  • @adrianemporr58 • 1 year ago • +64

    Back in September I was deciding between these two GPUs. However, Nvidia's pricing was just a joke. At that time I could buy an RX 6800 XT for £599, whereas the RTX 3080 12GB was around £750 if it was on sale, and even then with bottom-tier coolers like the MSI Ventus 3, PNY, or even Inno3D. My choice was very easy. I got the Sapphire Nitro+ RX 6800 XT for £599, and I think it's a very good card for 1440p. For a few months (mainly the whole of October, and it happened two or three times in November) I had driver problems (my drivers crashed once or twice a day), but with the recent December update AMD smashed it and all my problems were fixed.
    AWESOME CARD

    • @karlhungus545 • 1 year ago • +6

      And this is the reason I (and 98% of people) won't buy AMD. Their drivers continue to be all over the place. Not worth the hassle. Their cards need to be AT LEAST 25% cheaper before I'd consider them, not MORE than Nvidia (i.e. the 7900 XT)! 🤣

    • @X_irtz • 1 year ago • +27

      @@karlhungus545 From personal experience, having had AMD cards for about 4 years now, they have improved drastically, and I have not yet encountered issues on my 6700 XT.

    • @moevor • 1 year ago • +24

      @@karlhungus545 Have you had personal experience with AMD cards? Just curious.

    • @nukesakuukasekun9050 • 1 year ago • +1

      @@karlhungus545 Just bought a brand new PowerColor Red Devil 6700 XT. Plugged that bitch right in out of the box and took out my RX 580 4GB. No driver issues at all, except ONLY when I fucked with the tuning settings; I'm not educated enough to use that feature yet. Plus, I just wanted 1440p at 90 FPS max, and the 6700 XT can do it all at that framerate and pixel density! Very happy with my purchase.

    • @mertince2310 • 1 year ago • +2

      I got an XFX 6800 XT (used for mining) for 400 bucks, which is absolutely incredible considering newer cards like the 7900 XTX start at 1300 dollars in my country, and the Nvidia ones are even pricier. I have been using it for about 4 days and so far the card looks really good; I overclocked a little bit and undervolted quite a lot. Even the junction temps don't go above 70 now, which I thought was really good.

  • @rwschumm • 1 year ago • +52

    Hi Daniel - always appreciate your reviews and performance comparisons! One factor I'm always concerned with is power usage. I noticed that the 6800 was nearly always using from 80 to over 120 watts LESS power than the 3080.
    I wouldn't even consider a 3080 in view of this and your performance comparisons. I'm retired and will likely be upgrading my PC this year at some point, and luckily I can afford high-end components. But I did want to mention this power comparison!

    • @felentus • 1 year ago • +6

      AMD measures power usage differently. They report TGP, which only includes the GPU die and the memory dies; it does not include any losses from the power delivery or usage from the rest of the card. Thus the number is a lot lower than the actual consumption. Nvidia meanwhile reports TBP, which is what the card pulls at the connectors and is much more accurate.

    • @soulonr419 • 1 year ago • +1

      Indeed, it would be nice if next time you gave some final thoughts about power consumption as well, Daniel, with these crazy energy prices nowadays.

    • @rwschumm • 1 year ago • +1

      @@felentus Hi - good to know. I don't recall how Daniel measures the power he's displaying. Maybe Daniel can clarify?

    • @felentus • 1 year ago

      @@rwschumm You can't really influence how the power consumption is measured with software; you just get what the sensors tell you, and AMD and Nvidia use different approaches. GPUs now usually pull what it says on the box, so 300W for the 6800 XT and 350W for the 3080 12GB. Some AIB cards have higher power targets, though.

    • @TheFalseShepphard • 1 year ago

      Damn, you must have a good retirement fund :)

  • @cyberspectre8675 • 1 year ago • +25

    I freaking love my WC 6800XT. It's the only card I bought at launch and it's the best card I've ever bought.

    • @adammartin4906 • 1 year ago

      Tell the truth: you got it because, if you're any kind of real gamer, you know you're going to need those 16 gigs in the future, and if you were to buy a 3080 it would be useless when games start needing 13 gigs, which is already happening. Plus, if you want to mod a game, again you've got to have more memory, and hmm, who has big memory for cheap, I wonder? Like always, AMD GPUs age like fine wine...

    • @cyberspectre8675 • 1 year ago

      @@adammartin4906 That's exactly why I got it, yes. I've always bought Nvidia in the past, but when it came time to build my latest rig, I could see RDNA2's long-term value from miles away. Superior rasterization performance, superior overclocking ability, and above all, 60% more memory! Two years later, and I already use 11 to 12GB of VRAM on a daily basis. I'd hate to be one of the suckers who bought a 12GB card now, let alone the 10GB 3080. Ampere was practically DOA. ☠

    • @jediflip2515 • 1 year ago • +2

      @@adammartin4906 That's why I bought my 6800 XT over a 3080 back in October. That 16GB was extremely appealing.

    • @Dantido • 1 year ago

      @@adammartin4906 To be honest, yeah, the 16GB is real nice, but I think it's going to be a LONG time until we see games consistently use over 12GB of VRAM.

    • @vinylSummer • 1 year ago

      @@Dantido Depends on the games! To name a few right now, we've got DCS World and Escape From Tarkov happily eating 16GB of VRAM on high textures. These games already run really poorly on, for example, a 3070 Ti with its 8GB of VRAM! I'm so happy the 6700 XT exists; its 12GB memory buffer is enough for my games not to stutter at 1440p high, and it cost just $350 for me. So there are games that consistently use ~12GB of VRAM now, and I think in a few years 16GB is going to be something you may need in some games.
      One of the other benefits of a big memory buffer is the ability to run two games simultaneously. While playing Escape From Tarkov, I need to wait a few minutes for my raids to start, and it feels nice to play some light games on my second monitor while I wait! 12 and 16GB are, unfortunately, only enough for the lightest games to run if Tarkov's settings are set to high. It's actually one of the reasons my next GPU will have 20+ GB of VRAM, but I understand that's a very niche use case.

  • @AntonMikhaylov • 1 year ago • +59

    I paid $570 for my 6800 XT, and the cheapest 3080 (non-Ti) 12GB I can find right now is $800 ($700 if out of stock). I am still convinced I made the right choice in terms of price/performance. Thank you for this video; it completely supports my point of view. Plus I play games (at 1080p all the time) that require up to the full 16GB of VRAM, so 12GB is not going to cut it.

    • @JathraDH • 1 year ago • +12

      I don't believe any game at 1080p needs 16GB of VRAM. It may use it by simply keeping textures/assets in VRAM it isn't otherwise using, instead of unloading them to make room for other textures, simply because it has the space. But I sincerely doubt it's giving you any actual performance gain unless you are using mega-sized textures or something, which again at 1080p is kind of pointless.
      A 1024x1024 32-bit texture uses 4MB of VRAM uncompressed. Even if you are using 4096x4096 textures (64MB per texture), you would need 256 of those textures in one map/area to actually require the 16GB of RAM. No game is going to be that poorly optimized, I think.

    • @moevor • 1 year ago • +2

      Price-performance and efficiency-performance! Nice :)

    • @TheVanillatech • 1 year ago • +29

      You paid 40% less to lose 5% performance. And you're "wondering" if you made the right choice?

    • @AntonMikhaylov • 1 year ago • +9

      @@TheVanillatech Quite the opposite - I am solidifying my confidence in that decision.

    • @JathraDH • 1 year ago • +1

      @@AntonMikhaylov Texture size typically has everything to do with screen resolution, what are you talking about?
      The size of the textures used ideally needs to be big enough that they aren't going to look too horrible when viewed close up, but not so huge that they eat too much VRAM.
      The bigger the screen is, the more pixels are needed to view close-up textures, thus the more they have to stretch and the worse they will look. This is why games always need more VRAM at higher resolutions: the game loads in larger textures so it doesn't look like trash. At 1080p you can get away with quite small textures.
      Yes, like I said, the game may leave stuff in memory if it has the VRAM for it, but that doesn't mean it's actively using all that VRAM; most likely it's not using remotely close to all of it.
      The minimum it needs so as not to affect performance is enough to load all the textures/assets that are in the current area and need to be rapidly accessed, and nothing more.
      I can't imagine a game at 1080p needing that much VRAM unless you are specifically loading in horribly oversized textures for no reason, or it is loading in an ungodly number of unique models, which is just an extremely poorly optimized game at that point.
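The texture arithmetic in the thread above can be checked directly. This is a quick sketch using the same illustrative assumptions the comments make (uncompressed 32-bit RGBA textures, no mipmaps, no block compression), not a claim about how any real engine budgets VRAM:

```python
# Uncompressed VRAM cost of square RGBA8 textures, per the comment's numbers.
# Real engines use block compression and mipmap chains, so actual usage differs.

def texture_bytes(side: int, bytes_per_pixel: int = 4) -> int:
    """Size in bytes of one square, uncompressed 32-bit (RGBA8) texture."""
    return side * side * bytes_per_pixel

MIB = 1024 ** 2
GIB = 1024 ** 3

print(texture_bytes(1024) // MIB)        # 4   -> the "4MB" figure
print(texture_bytes(4096) // MIB)        # 64  -> the "64MB per texture" figure
print(16 * GIB // texture_bytes(4096))   # 256 -> unique 4K textures to fill 16GB
```

Mipmap chains would add roughly a third more per texture, so somewhat fewer unique textures would fill the buffer, while block compression cuts each texture's footprint by 4-8x; either way, the comment's point stands that it takes hundreds of unique, uncompressed 4K textures resident at once to genuinely need 16GB.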

  • @OmRaZaTa • 1 year ago • +13

    Hey Daniel, the Lossless Scaling program that you reviewed before has a new version with an ML spatial upscaler called LS1, which has standard and performance modes. It is way better than FSR1 - not up to the DLSS level, due to being a spatial upscaler, yet way better than the standard solutions. Please check it out again. It is good both for lower-tier users and for people who want to upscale demanding games to 4K from 1440p or 1080p.

  • @tadsnyder2756 • 1 year ago • +11

    Big difference in the 1% lows, which means so much.

    • @adammartin4906 • 1 year ago • +1

      Yup, higher 1% lows mean less stutter.

  • @wallachia4797 • 1 year ago • +6

    In Romania, a standard 10GB 3080 costs anywhere between 1000 and 2000 euros, with most floating around the 1500 mark.
    An RX 6800 XT can be caught for as low as 600 euros, with most floating around the 800-900 mark.
    It's not even a competition; the 3080 is a complete meme GPU in some markets, where oftentimes 4090s are cheaper.

    • @stefannita3439 • 5 months ago

      One year later, and the prices on used 6800 XTs are pretty good here in Romania. Just got one for 1800 lei (360 euros) and sold my 3070 for 1500 lei (300 euros).

  • @ButteredBread10 • 10 months ago • +4

    With SAM and FSR enabled, the 6800 XT absolutely takes the cake.

  • @jamiereinig • 1 year ago • +8

    The 3080 is pulling around 70-100W more than the 6800 XT in these tests (Forza 5 @25:00 shows even more of a delta, closer to 120W).
    That's not insignificant. 25-50% more power for similar (and in some cases worse) performance is something to consider.
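For scale, the percentage range quoted above works out as follows. This is illustrative arithmetic only, using the 70-100W delta from the comment; the 280W and 200W baselines are assumed 6800 XT draw figures chosen to show where the 25% and 50% ends of the claim come from, not measurements:

```python
# Relative power overhead of the 3080 vs the 6800 XT, from the quoted deltas.

def extra_percent(base_watts: float, delta_watts: float) -> float:
    """Extra draw as a percentage of the baseline card's draw."""
    return 100.0 * delta_watts / base_watts

low = extra_percent(280, 70)     # 25.0: +70W on a full ~280W 6800 XT load
high = extra_percent(200, 100)   # 50.0: +100W on a lighter ~200W scene
print(f"{low:.0f}% to {high:.0f}% more power")
```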

    • @arenzricodexd4409 • 1 year ago • +2

      You can't compare AMD and Nvidia power consumption using software readings alone, because AMD's GPU sensors only measure GPU and VRM power, while Nvidia cards have sensors that measure the entire card's power consumption. That's why true professional reviewers never use software like Afterburner for power consumption measurements; if they don't have specific equipment that can measure the GPU's power consumption in isolation, they would rather measure power at the wall.

    • @moevor • 1 year ago • +1

      True, but that story is now reversed for Navi 31 vs AD103.

  • @bluesealol • 1 year ago • +7

    I got my 3080 around launch time in a prebuilt, and after adding thermal pads it runs very well at low temps. I don't plan on upgrading for many years because of how well it performs.

    • @erjohn5404 • 1 year ago

      I just bought the Aorus one, it's awesome.

  • @zalankhan5743 • 1 year ago • +7

    The performance may be somewhat similar, but the power consumption is like 80-100 watts lower for the 6800 XT.

    • @arenzricodexd4409 • 1 year ago • +5

      Did you look at the power reported by the in-game overlay (Afterburner)? The true difference is probably much smaller. AMD designs their cards to detect GPU and VRM power only (even RDNA3 is still like this), while Nvidia has the sensors to measure the power of the entire card.

    • @demnogonis346 • 1 year ago • +1

      @@arenzricodexd4409 So look at the total power consumption of the PC (from someone else); you will find that the PC with the 3080 draws 70-100W more than the same PC with the 6800 XT.

  • @dro_da_mobb • 1 year ago • +51

    Awesome, Daniel!! I am looking to build my first gaming rig soon, and I am going for the Ryzen 7 5800X3D + RX 6800 XT combo. I still have a couple of months' time; hopefully the GPUs become more available by then. Love the channel, keep up the good work!

    • @m4nc1n1 • 1 year ago • +1

      1440p?

    • @danigaminggt • 1 year ago • +3

      Not worth it; go for Ryzen 7000 with DDR5 and all that.

    • @dro_da_mobb • 1 year ago • +36

      @@danigaminggt sure.. I’m accepting donations 👌

    • @dro_da_mobb • 1 year ago • +4

      @@danigaminggt Jokes aside, I just might... if the prices come down a notch.

    • @bluehenz009 • 1 year ago • +7

      I have the exact same setup. Bought the 6800 XT used and the 5800X3D brand new. Playing at 1080p with a high refresh rate. If you don't have an AM4 motherboard, honestly just go with AM5 for a better upgrade path.

  • @Fantomas24ARM • 1 year ago • +12

    I'm happy with my 3080 10GB; bought it used for $500. Going to wait for the 5000 series cards.

    • @antasmunchie • 1 year ago • +1

      Same and same (assuming they aren't grossly overpriced like the 40 series).

  • @MisiuAcademy • 1 year ago • +18

    I got a 3080 FE on launch, by sheer luck, at MSRP in Sweden. An absolute monster, a beautiful card, quiet, and it still performs admirably. I think it can go strong for another 2 years!

    • @CaptainScorpio24 • 1 year ago • +1

      so lucky

    • @SwingArmCity • 1 year ago • +9

      I'll bet you are correct. Keep it. Don't fall into the 40xx series and their RIDICULOUS prices!!

    • @MisiuAcademy • 1 year ago • +3

      @@SwingArmCity Will not! I don't game that hard or often (though I still play comfortably and mostly maxed out), and it is still fantastic for encoding and video editing!

    • @carltonp5669 • 1 year ago • +3

      Thanks, this was very helpful. I am glad the 6800 XT did fine at 1440p. Also happy that the 3080 did well at 1440p, for those who have a 3080.

    • @Monolitus7 • 1 year ago

      In Sweden it seems people were actually able to get cards at MSRP at release, because a Swedish friend of mine also got a 3080 at MSRP.

  • @morrays1996 • 1 year ago • +8

    The driver update that increased performance on the Nvidia cards was the one that happened in October. I remember installing it and seeing about a 10% (sometimes more) increase in performance in the same games.

    • @W3TFART • 1 year ago • +1

      Still a shit choice buying Nvidia.

    • @morrays1996 • 1 year ago • +2

      @@W3TFART At the time, the 3080 and 6800 XT had very similar MSRPs. AMD just dropped the price of that card after they released the new cards.

    • @W3TFART • 1 year ago

      @@morrays1996 Still a shit choice supporting those grubs and their behaviour.

    • @morrays1996 • 1 year ago

      @@W3TFART okay

    • @W3TFART • 1 year ago

      @@omegalol7229 lol, I already have a PC and a laptop, as well as all the consoles.
      So how does a fact about price gouging make me poor?
      Any idiot that supports Nvidia at their price-to-performance is poor in mental faculties.

  • @razzorj9899 • 1 year ago • +1

    Do you have a 6950 XT? Would love to see a 4070 Ti vs 6950 XT.

  • @TeoshadeMC • 1 year ago • +6

    Hi! Nice work, a very detailed and varied comparison. Personally, between the two I prefer the 6800 XT, because it has practically identical performance to the 3080 but with decidedly lower consumption and 4GB more VRAM. What more do you want?

    • @ShadowCraft_YT • 1 year ago • +1

      I just bought a 6800xt! 🥳

    • @TeoshadeMC • 1 year ago

      @@ShadowCraft_YT Nice choice! You will not regret it.

  • @noahramos4002 • 1 year ago • +11

    Just recently got the same 6800 XT and it is such an amazing card. Crazy how people will go the more expensive route (Nvidia) when they can just get the competition for way cheaper with better FPS!

    • @user-tm9ho3bm4v • 1 year ago • +3

      Nvidia tends to offer better performance for more money in the higher tiers. But in the mid tier, AMD offers better performance for less money. Guess what people buy anyway 😂

    • @bjay5241 • 1 year ago

      I'll be sending back my 6800 XT because of consistent frametime spikes between 50-250 ms... I can't explain where they come from, but they happen in every game, and my old GTX 1070 was smoother...
      (i7-12700K, 32GB DDR5, 850W)
      Maybe you know a quick solution; otherwise I'll head out for an Nvidia card, because my experience was always better with Nvidia^^

    • @hardcorehardware361 • 1 year ago • +2

      @@bjay5241 You need to set things up properly, like your Adrenalin settings, and turn Windows auto-updates off; there are also a few reg settings you can use. I just got my 6800 XT for $425 USD, and I had a few issues, but now it's solid. I prefer it over my 3090, tbh. I updated from the 102 VBIOS to the 103 VBIOS along with proper settings in Adrenalin and the reg edits, and it's a great card. The only game I have issues with is Star Citizen, but according to the devs they expect us to roll back to an older driver; I put this down to SC being optimized like crap. All my other games run perfectly fine.
      Stick with the card, because the 6000 series cards are actually very, very good IF you set everything up properly.

    • @d.ink3d • 1 year ago • +4

      More expensive, but with features like DLSS, ultra-low-latency mode in every game, and RT, paying a bit more for Nvidia is worth it - just not at how the cards are priced at the moment. It's worth paying like $50-100 more for those features, so I'm thinking about buying a 6800 XT too if I find a good deal.

    • @bjay5241 • 1 year ago • +1

      @@hardcorehardware361 I tried a lot of the Adrenalin stuff, and after a reg edit it crashed completely... In Germany it costs €680 (it's almost the same in $).
      And it's just not okay to need so many detailed optimizations to maybe get it working...
      Now I have a 4070 Ti for €900, which still has one of the best FPS/€ values, even if it's super expensive.
      And it just works; even when it's running at 100%, it has no random frametime spikes.

  • @rorypants • 1 year ago • +3

    Hey, could you take a look at the Intel Arc A750? A really cheap card that uses AI with your Intel CPU and GPU to deliver better performance.

  • @malevolence89 • 1 year ago • +2

    Bought a 6800 XT for about $500 on sale from Newegg around Black Friday. It has been perfect.
    I've been running it with a 4K monitor on graphics-intensive games, upgraded from the 1060 6GB.
    I am able to run games on a 4K monitor with the 6800 XT at higher graphics settings than I could run the 1060 6GB at 1080p, which is funny.
    Most games I can max the settings; The Callisto Protocol cannot be maxed out at 4K, but most others can, or at least run near max settings at 4K.

  • @adi6293 • 1 year ago • +3

    You are right :P This comparison isn't very fair :P:P That extra 2GB can make quite a difference; my 10GB 3080 sometimes runs out of memory at 1440p...

  • @neilstack4194 • 1 year ago • +41

    Good to see the 3080 holding its own nearly 2.5 years after launch...

    • @snoochpounder • 1 year ago • +2

      I'm still happy with my 2080 because I don't game at 4K.

    • @kaisyaya8492 • 1 year ago • +27

      Holding its own against the 6800 XT? A cheaper card?

    • @tinyjotaro5316 • 1 year ago • +3

      @@kaisyaya8492 I think he means holding up after 2.5 years.

    • @AlexMiseroy • 1 year ago • +14

      The 12GB version is just over a year old.

    • @middleclassthrash • 1 year ago

      @@kaisyaya8492 ikr

  • @AttunedFlux • 1 year ago • +3

    Don't forget about the 6800 non-XT as well. Snagged one for $475 at Micro Center. My 4-year wait for a better GPU has now ended.

    • @matthewjuarbe5826 • 1 year ago • +1

      I'm using a Sapphire Pulse RX 6800, undervolted and overclocked, and with SAM enabled I'm getting 6800 XT performance. I paid 400 USD and am very happy with it. Nvidia was just too much; I think the cheapest I saw was like 700 USD+. My choice was super easy lol

    • @tomohawkcloud • 1 year ago • +1

      @@matthewjuarbe5826 What's SAM?

    • @matthewjuarbe5826 • 1 year ago • +1

      @@tomohawkcloud SAM is Smart Access Memory.

  • @Hyperdriveuk • 1 year ago • +1

    The price-to-performance gap between the two is massive. The 6800 XT just makes so much more sense. In fact, I can now get an RX 7900 XT Ultra for the same price as a 3080. LOL

  • @JVlapleSlain • 1 year ago • +7

    The Navi 21 chip is definitely the unicorn release of the previous generation (and I would say an overall better product than Navi 31), especially with the cut-down prices.
    My reference 6900 XT is running at a 2.6GHz core clock, fast memory timings at 2.1GHz, with +0% power limit and a -90mV undervolt. Absolutely monstrous. I think it's the perfect card right now for playing on 3440x1440 120Hz+ setups.

    • @OrdinaryViewer808 • 1 year ago • +2

      Glad to hear it. I got my 6950 XT and my 5800X3D today, and I'm stoked to put them into my system tomorrow!

    • @moevor • 1 year ago

      Sounds like a beast of a setup. I would have loved to snag a Navi 21 XTX with a water block. I was eyeing the EKWB/XFX variant and the OEM one that came with a 120mm rad.

    • @midniteoyl8913 • 1 year ago

      Yeah, I have the ASRock OC Formula 6900 XT, currently at 2735MHz @ 1125mV with memory at 2130 fast timings. ~70°C/~90°C core/hotspot max temps at a max fan speed of 60%.

  • @tristanweide • 1 year ago • +20

    I just gave my 3080 12GB to a friend; it's still an amazing card! She had a 2070, so it was a decent upgrade.

    • @danielowentech • 1 year ago • +9

      Nice gift!

    • @ipotato95 • 1 year ago • +15

      What’s it take to be your friend?

    • @mikem2253 • 1 year ago • +37

      @@ipotato95 be a girl.

    • @bconxept • 1 year ago • +5

      @@mikem2253 yup.

    • @lizzy1519 • 1 year ago • +1

      @@bconxept I am, I should test.

  • @Sebbz • 1 year ago • +7

    I bought the 3080; the 6800 XT was $20 cheaper. I wanted DLSS for 4K gaming and couldn't be happier with my decision so far. Dead Space Remake at max settings in 4K with DLSS Quality or Balanced runs amazingly!

    • @DC-Lever • 1 year ago

      How much did your RTX 3080 cost? I'm also planning to buy one, but the 6800 XT is $500+ while the 3080 is $800+.

    • @Sebbz • 1 year ago • +1

      @@DC-Lever I bought it brand new for $20 under MSRP in Norway. I would go with the 6800 XT if the price difference is that big! Both cards are the same price in my country

    • @ksks6802 • 1 year ago

      I found the traversal stutter too distracting on my 3080 using DLSS maxed out. It never went below 63 fps in the most demanding scenes. I wound up going 2560x1440 using TAA high. It looks way cleaner and the stuttering isn't as noticeable.

    • @Sebbz • 1 year ago • +1

      @@ksks6802 I still have stutter after the newest patch as well. I will try out 1440p and see if it is any better :)

    • @boboboy8189 • 1 year ago • +1

      @@Sebbz You are lucky, because around the world Nvidia is $100 more.

  • @jrossbaby141 • 1 year ago • +2

    I was rocking a 1050 Ti for the longest time and just picked up a 6800 XT after watching several of your videos. I couldn't believe the 3070 was the same price. Thank you for the many informative videos.

    • @erimpr0vise844 • 1 year ago

      I'm currently using a 1050 Ti, was the upgrade very noticeable?
      Also, which games do you play, so I have some kind of idea for comparison purposes?
      Been looking at 6800 XTs.
      Thanks

  • @PotatMasterRace • 1 year ago • +2

    6800XT is *significantly* cheaper though...

  • @adonisflos • 1 year ago • +4

    The RTX 3080 power draw is 70-100W higher...

  • @JPVee511 • 1 year ago • +3

    I snagged an XFX 319 Core RX 6800 XT for $549 USD before taxes, brand new. I was worried it was too good to be true, but a few days after ordering it arrived for store pick-up, brand spanking new. That's below the MSRP of the reference card, at that. I was close to getting a 6750 directly from AMD, but I just got incredibly lucky with Best Buy on a whim. It pairs great with my Ryzen 5700X.

    • @haroldbutler778 • 1 year ago • +1

      Newegg has them for the same prices: the 6800 non-XT for about $500, the 6800 XT for $540, and the 6950 XT for $699.

  • @Tropicocitwo • 1 year ago

    Is this using SAM on the 6800 XT or not? Thanks for this video; I'm in the market for a used card and these are my choices!

  • @drizzl8899 • 1 year ago • +5

    I picked up a 6800 XT for a good price. Since I'm not aiming for UHD, the card should serve me well throughout this current generation of cards. In addition, I didn't have to replace my 650W PSU, which I've had for some years; the system runs perfectly stable on a Corsair RM650x even under full load, so I can avoid spending money on that for some time too. Waiting for the 7800s, though.

  • @krasimirtodorov7135 • 1 year ago • +3

    The 6800 XT is £570 and the 3080 12GB is £900. I don't think we can compare them, as they are in completely different price ranges!

    • @garyb7193 • 1 year ago

      Wow, the 3080 12GB for £900. How much is AMD's new 7900 XT there, which has 20GB and is faster than the RTX 3080 in raster and ray tracing?

    • @krasimirtodorov7135 • 1 year ago

      @@garyb7193 The 7900 XT is £830. How much is the 12GB 3080 in your location?

    • @garyb7193 • 1 year ago • +1

      @@krasimirtodorov7135 At Newegg, $879. Nvidia's pricing is ridiculous. AMD's 7900 XT is a much better value than the 3080. It comes with 8GB more RAM, plus it beats it in raster and ray tracing, and memory is increasingly important for RT and 4K.
      Even Nvidia's 4070 Ti for $829 is better value than the 3080, but the 7900 XT still beats that too in most cases.

  • @videosbymathew
    @videosbymathew Год назад +5

    Thank you for the great analysis! I just bought a 6800XT right before watching this video. Was considering the new generation, but I'm really wanting something faster that has DP 2.1, and while AMD's cards are solid, I just can't justify the price for the heat and such they put out... not to mention DP 2.1 is sort of not needed still yet. This 6800XT will hold me over nicely for another couple of years to see what the next gen (2024+) provides.

    • @Zombie101
      @Zombie101 Год назад +3

      Thanks to this comment I just ordered the 6800 XT too. It makes total sense: it gives us time to enjoy 1440p now at a solid cost/performance while waiting for the next-gen GPUs to come along.

    • @videosbymathew
      @videosbymathew Год назад

      @@Zombie101 I'm happy to report that I've done further testing and really do like this card (specifically this Asrock variation). So far still good. Glad you got one too!

    • @videosbymathew
      @videosbymathew Год назад +1

      @@Zombie101 Small side note... it's still a fairly long card, so hopefully you have a case that fits it. Unfortunately, the last couple of generations of cards are just large no matter where they are in the stack :(.

    • @Zombie101
      @Zombie101 Год назад

      @@videosbymathew I think what puts me off the most is the 3x 8 pin pcie that it needs. Length I'm ok up to 330mm which I believe is the size of this one. Tbh I am tempted to go 7900xt and bite the bullet. Just seems like something is right around the corner though, a bit of a grey GPU area to be in rn

    • @puffyips
      @puffyips Год назад

      Just bought one for $460 (:

  • @laracameron9876
    @laracameron9876 Год назад +1

    Is it still a good idea to buy a 6800xt now even if it's a 2+ year old card?

    • @hardcorehardware361
      @hardcorehardware361 Год назад

      Imo yes. I just bought one and I honestly prefer it over my 3090. Just follow some guides on YouTube on how to set up Adrenalin for performance. Awesome card, I love it.

  • @GoldPunch
    @GoldPunch Год назад +1

    Hi. I know you're a Radeon professional. I just bought an ASRock 6900 XT OC Formula (the XTXH model) for $620. I've seen some benchmarks saying the OC Formula even exceeds the RTX 3090. It's second-hand, but it was very clean. Do you think it was a bad idea?

    • @hardcorehardware361
      @hardcorehardware361 Год назад

      I own a 3090 and 6800XT, you made the right choice. I prefer the 6800XT over the 3090 personally.

  • @billienomates7100
    @billienomates7100 Год назад +4

    6800 XTs here cost less than the 3070 Ti and border on 3070 prices; there's no price-performance parity.
    However, I can't justify the 6800 XT given the price of the 7900 XT. Chances are the 7800 XT will be released at about £650, which is where I expect the 4070 to land, and will be a better buy, so it's better to wait, see the price reductions afterwards, and compare against the relatively big performance uplift of the next-gen cards.

  • @frshunter
    @frshunter Год назад +5

    Nice job! At present prices I would feel much better buying the 6800 XT, given how little more the 3080 offers!

    • @johnpen269
      @johnpen269 Год назад

      So ray tracing is "a little more"? It's only the biggest graphical advancement we've gotten in the last 10 years, and now you've bought a new GPU that isn't capable of it. Congrats.

    • @frshunter
      @frshunter Год назад

      @@johnpen269 Yes, quite frankly, at almost double the current price it is marginal. At a similar price it would be a different discussion. Anyhow, for the price the 3080 is going for, I would look at a newer Nvidia card if I was going to spend that much for ray tracing.

    • @nickb1762
      @nickb1762 Год назад

      @@johnpen269 ray tracing? More like gay tracing

  • @Joschi6007
    @Joschi6007 10 месяцев назад

    Great video! Please compare more of these "old" GPUs. Thanks for your work. Greetings from Stuttgart, Germany.

  • @dontsupportrats4089
    @dontsupportrats4089 Год назад +1

    Hey Daniel, can you give your thoughts and an explanation on the "50" games and apps for DLSS 3.0? I mean, what's the apps part? Are these like browser games or something?

  • @josh4434
    @josh4434 Год назад +24

    I've had my 6800XT Gaming X Trio for about a year now and I have Zero regrets about picking it over a 3080. This card will be great for 1440p for a long time. Cool, very efficient and powerful.

    • @TheJamesKF
      @TheJamesKF Год назад +7

      Totally agree. 6800xt Tuf here and no interest in swapping it out any time soon.

    • @j2dzillak750
      @j2dzillak750 Год назад +7

      I also agree, this card is much better for stable game-play. I don't really care about ray tracing.

    • @niklasbreidenbach1638
      @niklasbreidenbach1638 Год назад +6

      Im going to build my first gaming pc tomorrow and I will use the 6800xt with a 1440p monitor. I’m glad that you have a good experience with it so I guess I made the right decision :) so excited

    • @spektrumB
      @spektrumB Год назад +8

      Got one last October to replace my 1080 Ti. Runs very smoothly with zero problems. AMD drivers (at least for the 6000 series) are just as stable as Nvidia's. All the fear-mongering from Nvidia fanboys is unfounded.

    • @josh4434
      @josh4434 Год назад +3

      @@niklasbreidenbach1638good choice. it's a beast. You will love it.

  • @kaiservulcan
    @kaiservulcan Год назад +11

    Hi Daniel. Very good in-depth analysis of these cards. I'm really glad to own a 6800 XT; it's a very solid performer. The 3080 is very good too, but the price premium for RT/DLSS doesn't attract me because they seem pretty useless to me. Anyway, keep up the hard work. I've followed you since you barely had 7000 subs; quite an impressive jump now. You deserve it for your good ideas and the quality of the work 😉

    • @puffyips
      @puffyips Год назад

      RT/DLSS are useless to the 3080 too😭😭😭

  • @Flying_Mushroom
    @Flying_Mushroom Год назад

    Is the power usage showing correct readings? Sometimes the 6800XT is pulling almost 100W less?
    Edit: nvm you did address that later in video, thanks

  • @adammartin4906
    @adammartin4906 Год назад +4

    I have a 3090 and an RX 6950 XT, and I'm not sure why, but the AMD GPU always has higher 1% lows and gameplay is smoother on AMD GPUs. I've been avoiding playing on the 3090 because of the stuttering issues Nvidia has in games.

    • @KenTWOu
      @KenTWOu Год назад +1

      Check Hardware Unboxed's 'Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs'. In that video GeForce drivers required slightly faster CPUs, because of the way driver scheduling works on Nvidia cards.
      UPDATE forgot to mention, there is a 2nd part: Nvidia Driver Investigation [Part 2] Owners of Old CPUs Beware

    • @adammartin4906
      @adammartin4906 Год назад +1

      @@KenTWOu Yeah, I've heard of this. It's basically Nvidia copying Apple's business model of making people upgrade to newer hardware, which is very evil and makes me wonder why people even still buy Nvidia GPUs.

    • @hardcorehardware361
      @hardcorehardware361 Год назад

      @@adammartin4906 Hello Adam, I have a 3090 and 6800XT and I concur. I prefer to play on the 6800XT because the motions feel smoother not to mention the color accuracy.

    • @adammartin4906
      @adammartin4906 Год назад +1

      @@hardcorehardware361 Yeah, it's kind of strange. I've tried everything, and it's not like there's a lot of stutter, but it gets annoying on the 3090 where you see these strange stutters come and go, and the pausing issues in certain games when I'm online make it a deal breaker.

  • @litguru4748
    @litguru4748 Год назад +5

    At least for the first three games (still watching), the 6800 XT draws way less power than the 3080. I'm running my 6800 XT sometimes with a powerdraw of 345 W, which makes it outperform a 3080 12G for sure. I like it a lot and have no problem waiting for prices to fall or a generation of GPUs to actually impress me again.

    • @Matt_Duke
      @Matt_Duke Год назад +1

      Yeah I got 3080 10G last year and feel like i should have gotten a 6800xt instead, having the flexibility to choose between the extra fps or efficiency is pretty nice

    • @thebcwonder4850
      @thebcwonder4850 Год назад +1

      @@Matt_Duke can’t you undervolt your 3080? I’ve heard of people getting it to run at 200W

    • @Wobbothe3rd
      @Wobbothe3rd Год назад

      @@thebcwonder4850 he's lying, lol.

    • @rmal8845
      @rmal8845 Год назад

      ​@thebcwonder4850 yes I have a 3080 12GB and undervolted it. Although now it runs at 260-270 watts and I haven't seen temps go above 70 degrees

  • @pockitsune6347
    @pockitsune6347 Год назад +4

    I actually got that $520 6800 XT to upgrade from my 1060 3GB, thanks for the heads up on that! I was able to get it for $410 because Amazon support gave me a $110 coupon for anything fulfilled by Amazon after they messed up my last GPU order TWICE lol

    • @RonnieMcNutt666
      @RonnieMcNutt666 Год назад

      hopefully amazon messes up my used gigabyte m32u 4k 144 monitor 2 times again ...

    • @pockitsune6347
      @pockitsune6347 Год назад +1

      @@RonnieMcNutt666 Lol, I was buying a 6650 and they sent me a 3050 both times. They just gave up after that and told me that if I returned it I'd get a full refund and a coupon to make up the difference between the current price and the price I bought it at.

    • @midniteoyl8913
      @midniteoyl8913 Год назад +1

      Pretty sweet deal..

    • @pockitsune6347
      @pockitsune6347 Год назад

      @@midniteoyl8913 Yeah im getting a pretty high end card for a midrange price xD

    • @SwingArmCity
      @SwingArmCity Год назад

      you got a great deal!

  • @Hotwhiskey1
    @Hotwhiskey1 Год назад +2

    I purchased a Sapphire RX 6800 XT Nitro last week, upgrading from a GTX 1080. I play at 1440p, mostly Warhammer 3: everything maxed out averages 71 FPS in the battle benchmark. After a slight undervolt and overclock, my 71 FPS turned into 88 FPS. God of War went from 103 to a max of 120 FPS. Incredible card the way prices are at the moment: got it for €700 + €20 postage, while the cheapest RTX 3080 here is €850. Temps while playing are 67-70°C. If you're not interested in the whole ray-tracing marketing, you can't go wrong with this card. I've always had Nvidia cards in my builds, but they have gotten really greedy.

    • @edh615
      @edh615 Год назад

      oh god, no used market?

  • @nimaarg3066
    @nimaarg3066 Год назад +1

    So 6800 xt is a 3060 ti level RT card but a 3080 level raster card.

  • @syncmonism
    @syncmonism Год назад +3

    It would be nice if Daniel actually had a 10GB 3080 to do testing and comparisons with, as that seems to be a much more common model of the card

    • @swagoneto7922
      @swagoneto7922 Год назад +2

      Yeah, 10GB models are everywhere on the used market.

    • @eclisis5080
      @eclisis5080 Год назад

      It's the same performance.

  • @MuhammadAhmad-sw1tr
    @MuhammadAhmad-sw1tr Год назад +6

    Do a 3070 vs 6700xt 2023 review

  • @luminous2585
    @luminous2585 Год назад +2

    Really good comparison. Everything's pretty much within expectations, though I'm still surprised with how far apart certain games can be when the cards perform pretty similarly on average. To be honest, I'm really not expecting much from FSR3 yet. It felt like some comment they threw out there as a reaction to DLSS3. Right now they seem to be more focused on FSR 2.3 (not sure if that was the right version number for what's coming next.) and they didn't want to give the impression that they're lagging behind.

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +5

      AMD is lagging behind on everything, stop the denial.

  • @MrBob2k99
    @MrBob2k99 Год назад +2

    For me at least, the FSR-versus-DLSS battle doesn't really exist if you buy an Nvidia GPU. If you own an Nvidia GPU you can use either technology; if you buy an AMD GPU you can only use FSR. So FSR is arguably an additive feature if you buy an Nvidia GPU.

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +2

      Yep, you can also use XeSS on an Nvidia GPU as well.

    • @boboboy8189
      @boboboy8189 Год назад +1

      That's because AMD lets Nvidia use their tech, but Nvidia doesn't reciprocate. AMD also gave FreeSync away as the open Adaptive-Sync standard, but Nvidia are capitalist freaks; they won't open-source their tech. AMD also created Mantle, which later became DirectX 12 and Vulkan. You know what Nvidia did? They made DirectX 11 and it made AMD cards slower.
      Weird, huh? Nvidia has been doing this to their competitors since the ATI days. Nvidia's BS is why there were only 2 GPU companies until Intel recently launched their Arc GPUs, and Intel uses AMD tech.

    • @Gl0ckb1te
      @Gl0ckb1te Год назад

      ​@@boboboy8189 yup and ppl who cant see past thier noses still shill for nvidia like its anti-consumer business practices are a good thing. AMD is also scummy slowing down shipments and delaying 7800-7700-7600 cards till June. Intel is not our savior either.

  • @Tagobert0815
    @Tagobert0815 Год назад +3

    DLSS 2.5.1 would be a better test baseline, because it's a big jump in DLSS quality.

  • @-TheLynx-
    @-TheLynx- Год назад +7

    After having to give back my mates 1080 after using it for nearly two years, I had to buy a new GPU after waiting for the 7800XT.
    Ended up picking up a 6800XT Red Devil for 7700NOK (760~USD). I considered the 3080, but it would cost 9500-10000 NOK (950-1000) USD. The 4070ti was even higher.
    It was a no brainer and the 6800XT is an awesome card. Drivers have been solid too.

    • @Goldwavez
      @Goldwavez Год назад

      Amd peaseant card. Drivers suck and the cards are for wannabe pc gamers aka peaseants

    • @-TheLynx-
      @-TheLynx- Год назад

      @@Goldwavez Haha, funny joke

  • @s.monesh04-rq4vb
    @s.monesh04-rq4vb Год назад +1

    Hey Daniel, I'm confused between the 6700 XT and the 3060 Ti. I know the 6700 XT is more powerful, but I'm afraid of issues like black screens, stuttering, and driver problems. Which should I choose for 1440p?

    • @SwingArmCity
      @SwingArmCity Год назад +1

      I have the 6700 XT. I've never had a black/blue/red/purple screen. Works as it should.

  • @jameswhitesell3076
    @jameswhitesell3076 Год назад +1

    I bought a 3060 12GB but found it lackluster for the nearly $400 I paid. I just recently got a 6800 XT 16GB for $540, so I'll update with my review when I get it.

  • @nicholasharfield8780
    @nicholasharfield8780 Год назад +13

    Just upgraded to a 6800 XT last week, great 1440p card!

    • @Arsa...
      @Arsa... Год назад

      Well ofc because it was intended for 4k

    • @nunyabidniz
      @nunyabidniz Год назад

      GREAT 4K CARD as well.....

    • @nicholasharfield8780
      @nicholasharfield8780 Год назад

      @@nunyabidniz great 4k means 60fps, great 1440p is 90+

    • @nunyabidniz
      @nunyabidniz Год назад

      @@nicholasharfield8780 yeah and my base 6800 non xt can get 60 in Elden Ring @ 4k...... Even with your opinion based Stats on what qualifies for great lol

    • @nicholasharfield8780
      @nicholasharfield8780 Год назад +1

      @@nunyabidniz that's not a demanding game tbf

  • @mikem2253
    @mikem2253 Год назад +3

    These cards are still plenty viable long-term for 4K, as long as you use upscaling (ultra quality upscaling makes no real-world difference) and turn settings down to high or medium where needed. Heavy games like A Plague Tale still look really good at medium settings.

    • @kaisyaya8492
      @kaisyaya8492 Год назад +2

      Certain new games look good even on low settings, but there are no real performance gains, because lately PC scalability has been very poor. That's another issue with gaming, though...

    • @Dempig
      @Dempig Год назад

      The 3080 isn't really viable for 4K, even the 12GB (in newer games). DLSS looks so much worse than native 4K that I'd hardly call it "no difference".

    • @mikem2253
      @mikem2253 Год назад +3

      @@Dempig Depends on the person I guess. I run FSR on my 7900XT attached to my 65" LG C1 OLED and don't notice any material difference.
      There are plenty of games where a 3080 is viable for 4k. What is your definition of viable? That can be subjective.

    • @Dempig
      @Dempig Год назад

      @@mikem2253 Idk man you either have bad vision or are pretending not to notice, because FSR looks even worse than dlss and dlss looks like crap compared to native 4k. Viable to me is 4k 60 fps high settings (no rtx) no upscaling. I game on a 65" as well, I just refuse to believe you dont see a difference. The aliasing looks TERRIBLE, so blurry compared to native 4k.

    • @mikem2253
      @mikem2253 Год назад +2

      @@Dempig lol, DLSS looks like crap. I guess DF is lying when they say "better than native." You do you, man; I'll do me. Thanks for coming out.

  • @Matthew-gg2hc
    @Matthew-gg2hc Год назад +1

    Thanks for doing a 1080p comparison 👍💯

  • @norbertnagy4468
    @norbertnagy4468 Год назад

    Can you please tell us where you found that the Resizable BAR whitelist was expanded? Not even the official Nvidia page mentions it.

  • @tahustvedt
    @tahustvedt Год назад +3

    6800 XT killing it in the COD games. Amazing performance per watt.

    • @krizby87
      @krizby87 Год назад +1

      COD games have Nvidia Reflex, which reduces input latency massively on Nvidia GPUs, so while the Radeons are faster, the competitive advantage falls to the Nvidia GPU.

    • @khanch.6807
      @khanch.6807 Год назад

      @@krizby87 If you have the dollars and the skills to make the dollars worth it.

    • @yeagmatic
      @yeagmatic Год назад +3

      @@krizby87 Radeon Anti-lag

    • @cesarpdc
      @cesarpdc Год назад

      @@krizby87 stop talking out your dirty boots with that nonsense 😒

    • @krizby87
      @krizby87 Год назад

      @@yeagmatic ruclips.net/video/7DPqtPFX4xo/видео.html
      Driver-side anti-lag is a POS compared to the Reflex API.

  • @ganthrithor
    @ganthrithor Год назад +3

    Great video as always. Yeah: it's wild seeing people list 3080 stock for $700+ USD. Who in their right mind is gonna buy those?
    Every day I go back and forth on buying an RX 6000 card (either a 6700XT or 6800)-- on the one hand they're by far the best value in the GPU market right now. On the other hand, it's still spending ~$500 (depending on XT or vanilla for the 6800) on a three year old, obsolescent GPU. It's pretty hard to stomach when up until the pandemic, about that amount of money would get you a brand new flagship card on launch day.
    Power draw on the 6800 XT in the video looks really good too honestly. I'm sure I'd probably be very happy with an RX 6000 card. The only other thing that's a little annoying is finding water blocks for some of these cards: there are just so many vendors and different models of card (for both AMD and Nvidia), and it seems like every time a good deal on a particular one pops up, nobody makes a full-cover block for that exact model.

    • @DarthAndre24
      @DarthAndre24 Год назад

      Agree with you. Currently have a 5600 XT. Its just not cutting it at 1440p. Too many hiccups with a dual monitor setup when the primary monitor is used.
      Will have to judge if it runs Hogwarts Legacy well at medium 1440p before upgrading to a 3080 or 6800 XT. Otherwise I will just get a PS5 since we won’t get to play Forbidden West and Ragnarok on PC soon anyway.

  • @muhahaha153
    @muhahaha153 Год назад +1

    Before buying, consider the power usage. In summer you'll be very happy to have a card that produces less heat.

  • @IghorMaia
    @IghorMaia Год назад +2

    so basically 3080 is a better card period

  • @ghost085
    @ghost085 Год назад +5

    I'm curious about FSR 3 because I don't have an 40xx series card, but I worry it was one instance where AMD marketing said "yes, we can do that too" before even getting started with R&D, meaning it could take a long while before we actually see something from them.

    • @jankratochvil9779
      @jankratochvil9779 Год назад

      It's their priority to finish it as soon as possible, since it's a great selling point, especially for Navi 32/33. My guess is mid-2023.

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      @@jankratochvil9779 It depends on what tech is actually needed to run it. Just look at FreeSync: they even hinted that their older 5000 and 6000 series had the capability to do VRR, but when FreeSync finally came out it was only usable on their latest GCN 2 (dubbed GCN 1.1 back then). Not even first-gen GCN could use it in games.

    • @moevor
      @moevor Год назад

      Honest question - do you have a use case for frame generation? Seems like many are listing FSR3 more so because it is an item on the spec/feature list rather than something they know they need.

    • @zwenkwiel816
      @zwenkwiel816 Год назад +2

      @@moevor VR? I think most of the reprojection is done in software atm. Having some hardware dedicated to this might help a lot in running more intensive games at the high framerates needed to not get sick in VR.

    • @ghost085
      @ghost085 Год назад +1

      @@moevor I love new tech, that's why I'm curious to see how it looks. If you ask me, I'd probably choose 75fps native over 120fps with frame generation if the former offered less input lag.

  • @mickeyriola9087
    @mickeyriola9087 Год назад +3

    AMD cards this gen blew me away. I've always had Nvidia, but I'm sick of all their BS, and I don't care for the cards, especially at those prices. I switched to a 6800 non-XT from a 1660 Ti and it was the best decision I ever made haha, I love it.

    • @boboboy8189
      @boboboy8189 Год назад

      But the AMD 7000 series is lackluster, though. I was actually planning to upgrade to the RX 7600, but after the 7900 XTX results were released I grabbed an RX 6600.

    • @Definitely_Melnyx
      @Definitely_Melnyx Год назад

      @@boboboy8189 The other 7000 series cards are supposedly releasing in Q3, so yeah, take what you can get now.

  • @misaelperez1077
    @misaelperez1077 Год назад +1

    The 6950XT is $700 at microcenter. Can you compare it to the 4070ti? Thanks. Great content by the way.

  • @JohnSmith-hv6ks
    @JohnSmith-hv6ks Год назад +1

    12gb is indeed unfair imo. It's significantly more expensive.

  • @cpt.tombstone
    @cpt.tombstone Год назад +3

    Hi Daniel! It has recently come to my attention that ReBAR is disabled by default on Nvidia cards for Call of Duty MW2 (even if it's enabled in the motherboard's BIOS), and enabling it from Nvidia Profile Inspector gives a 15% uplift for my 4090. I think that's one of the reasons CoD MW2 performs that much worse on Nvidia cards. The other, I think, is the L2 cache size difference: the RX 6800 XT has 132MB of L2+L3 cache, while the 3080 has only 5MB of L2 (no L3 at all).

    • @moevor
      @moevor Год назад +1

      Profile inspector is a 3rd party tool though. So, while there is an uplift to be had, it is not an out-of-the box experience for Nvidia GPUs.

    • @cpt.tombstone
      @cpt.tombstone Год назад +4

      @@moevor That's true, but Daniel said in the video that he doesn't know why AMD cards perform so much better in CoD MW2, and I think I have an explanation why. Lovelace has more cache compared to Ampere, so if the difference between a ReBAR-enabled 4080 and a 7900 XTX is smaller than between a ReBAR-enabled 3080 and a 6800 XT, we could confirm whether it comes down to the cache difference or not.

    • @the80386
      @the80386 Год назад +1

      @@cpt.tombstone You bring up a good point about the major difference in L3 size. But do keep in mind that AMD had to include such a large L3 cache to overcome their 33-46% memory bandwidth disadvantage compared to Nvidia. AMD pairs a large L3 + 512 GB/s memory to match Nvidia's small cache + 760-912 GB/s memory.

    • @cpt.tombstone
      @cpt.tombstone Год назад

      @@the80386 You are right, AMD chose to add more cache to balance the slower memory, but the amount of cache they added is huge, the 6800XT has almost two times the cache compared to the 4090 even, and that has 12 times more cache than its predecessor. Kind of interesting to see how different engines behave with this discrepancy, as far as I can recall now, Only Call of Duty and Forza display such behavior, where the cache size seemingly matters more than video memory bandwidth. I've noticed that Call of Duty is not using much VRAM, only about 4 GBs at 3440x1440 with my 4090, but once I force ReBAR on, VRAM usage goes up to 7 GBs, and performance improves a lot. Very interesting.
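      The cache-vs-bandwidth tradeoff described in the replies above can be sketched as a weighted average. Only the 512 and 760-912 GB/s VRAM figures come from this thread; the hit rates and the cache bandwidth below are hypothetical round numbers for illustration, since real hit rates vary by game and resolution:

```python
# Rough effective-bandwidth model: memory requests that hit the on-die cache
# are served at cache bandwidth, the rest fall through to VRAM.
# All hit-rate and cache-bandwidth numbers here are illustrative assumptions.

def effective_bandwidth(hit_rate: float, cache_gbps: float, vram_gbps: float) -> float:
    """Weighted-average bandwidth seen by the GPU, in GB/s."""
    return hit_rate * cache_gbps + (1.0 - hit_rate) * vram_gbps

# 6800 XT: ~512 GB/s GDDR6 plus a large Infinity Cache (assumed ~1900 GB/s)
amd = effective_bandwidth(hit_rate=0.55, cache_gbps=1900.0, vram_gbps=512.0)
# 3080: ~760 GB/s GDDR6X with a tiny L2, so assume most traffic goes to VRAM
nv = effective_bandwidth(hit_rate=0.10, cache_gbps=1900.0, vram_gbps=760.0)

print(f"6800 XT effective: {amd:.0f} GB/s")
print(f"3080 effective:    {nv:.0f} GB/s")
```

      Under these assumed hit rates the large cache more than closes the raw VRAM bandwidth gap, which is consistent with the behavior described in games like CoD MW2 and Forza.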

  • @weeooh1
    @weeooh1 Год назад +2

    The 3080 12GB is not just a 3080 with 2GB more; there are other significant spec differences: a 384-bit bus (vs 320-bit), slightly more CUDA, RT, and tensor cores, and higher memory bandwidth as well.

  • @connorsmith2556
    @connorsmith2556 Год назад +1

    Have you tried using AMD RSR in AMD Adrenalin? I use it to upscale from 1440p to 4K with my 6800 XT and constantly hit 90-100 FPS with max settings.

  • @A42yearoldARAB
    @A42yearoldARAB Год назад +1

    Good comparison. I don't see how comparing 12GB to 16GB is unfair. AMD people always have some excuse: "RT is a gimmick." AMD needs to catch up in RT cores; that doesn't make it a gimmick.

    • @boboboy8189
      @boboboy8189 Год назад +1

      Actually, RT sacrifices your FPS, so it's not worth it. It reminds me of HairWorks by Nvidia: sacrificing FPS for eye candy. What we need is a kind of RT that doesn't sacrifice FPS.
      When Nvidia makes proprietary software it always has some sort of hardware limitation, while when AMD does software, they give it away for free.

  • @Criftyman
    @Criftyman Год назад +7

    The 6800 XT is much cheaper and draws 70-100W less than the RTX 3080, which only averages about 10% more FPS. That's not worth it.

    • @noer0205
      @noer0205 Год назад

      You can't compare software readings of AMD cards and Nvidia cards, because they include different parts of the card's power consumption. Nvidia measures the whole board, but AMD only includes chip and VRM power. Google Igor's Lab; he did a great article documenting this.

  • @silverwerewolf975
    @silverwerewolf975 Год назад +4

    God, Fortnite is so badly optimized.

    • @pockitsune6347
      @pockitsune6347 Год назад

      ?????????

    • @pockitsune6347
      @pockitsune6347 Год назад +1

      If you're using maxed-out settings with ray tracing, sure, lmao, but you can get hundreds of frames at medium or even high without RT.

    • @silverwerewolf975
      @silverwerewolf975 Год назад

      @@pockitsune6347 He's getting 70/80 FPS on just high without RT with a freaking 3080/6800 XT. I'd expect that poor performance from a 3050 maybe, no, not even that, a 1060, but not from a top-tier card in a competitive game.

    • @silverwerewolf975
      @silverwerewolf975 Год назад

      @@pockitsune6347 Not even that. I remember when my RX 580 gave me 150 FPS on Epic; now a 3080 on High gives 70 🤣🤣🤣

    • @pockitsune6347
      @pockitsune6347 Год назад

      @@silverwerewolf975 Without HARDWARE raytracing, lumen is software raytracing dude

  • @Dtions
    @Dtions Год назад

    Thank you for the retest!! I'd just like to ask: would it be possible in the future to also include wall power draw for reference? The RX 6800 XT reports power draw at least 50W below the 3080 12GB, which is awesome, but I remember that in the past AMD's reported power draw wasn't accurate.

  • @Z-Crebs
    @Z-Crebs Год назад +1

    I got a used 6800xt for $400 + $50 shipping. Best upgrade I’ve ever made

  • @DeanerNotTechTips
    @DeanerNotTechTips Год назад +2

    I kinda wish amd was as close to nvidia as they were last generation. Last generation with the 3080/6800xt or 3090/6900xt they were so close it came down to personal preference. Now with the 7000 series I feel like amd's offerings just aren't as appealing especially with all the new features the 40 series has (the 4080 and 4070 ti are still scams though). I ended up going with a 3080 ftw3 10gb last year because I wanted more performance in blender and ray tracing mainly just for peace of mind in case I ever wanted to use it but honestly if I had a 6800xt or 6900xt I think that I'd still have a pretty great experience.

  • @reganmorben9248
    @reganmorben9248 Год назад

    keep on rockin Dan! Thank you for your insight.

  • @Vaidd4
    @Vaidd4 Год назад +1

    BRO up to 100W more usage for nvidia ?

  • @assassinscreedplayer9041
    @assassinscreedplayer9041 Год назад +1

    Should I get the Pulse 6800 XT for 2700 PLN or the 6950 XT for 3054 PLN? The power draw of the 6950 XT makes me sick; even after undervolting it's still 300W.

  • @mikkolindstrom1797
    @mikkolindstrom1797 Год назад +1

    The RTX 3080 10GB is 34% higher in price vs the 6800 XT right now at Finnish retailers (cheapest model in stock). Worth it? I don't think so.

  • @tofuguru941
    @tofuguru941 6 месяцев назад +1

    The fact that the 6800 XT uses 100 watts less almost everywhere is absolutely phenomenal.
    It's a VERY overlooked detail that one needs to take into consideration.
    You'd need a 100W+ stronger PSU to run those silly power-hog 3000-series GPUs.
    That also costs money.
    It's literally a win-win going with a 6800 XT.
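    For a rough sense of what that 100 W gap costs on the power bill over time, here's a quick sketch. The gaming hours and electricity price are assumptions, so plug in your own numbers:

```python
# Yearly running-cost difference for a card that draws 100 W more while gaming.
# Assumed figures: 20 h/week of gaming at $0.30/kWh. Adjust for your region.

def yearly_cost_delta(extra_watts: float, hours_per_week: float, price_per_kwh: float) -> float:
    """Extra electricity cost per year, in the same currency as price_per_kwh."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

delta = yearly_cost_delta(extra_watts=100, hours_per_week=20, price_per_kwh=0.30)
print(f"~${delta:.2f}/year extra")  # on top of any PSU upsize cost
```

    With these assumptions the difference is around $30 per year, so it's a real but modest cost next to the price gap between the cards themselves.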

  • @Xanaxluver
    @Xanaxluver Год назад

    FUCK YEAH! Daniel's videos are the best!

  • @killermoon635
    @killermoon635 Год назад +1

    In demanding games, you can always use DLSS at 1440p.
    No reason to play below 1440p on a 3080.

  • @forcezero2220
    @forcezero2220 Год назад

    I want to ask: is AMD's encoder good for compressing and rendering video?

  • @JaiAzura
    @JaiAzura Год назад +2

    Cheapest 3080 (10gb) is $950. Cheapest 6800xt is $550. So if you're looking for a new gpu, you should probably stay away from the 3080s.

    • @Dempig
      @Dempig Год назад

      Where are you finding these prices? Both seem 100% sold out new, and used i see 3080's for $400-$500 quite often

  • @bradeinarsen
    @bradeinarsen Год назад

    I love that you're a teacher IRL... do any of your students ever find out about your YouTube channel?

  • @amaangamer29
    @amaangamer29 7 месяцев назад

    You do the best and most detailed comparisons on YouTube. You're the best; thank you for making such a detailed video.

  • @msct6080
    @msct6080 Год назад +1

    Excellent video. That said, I'd like to point out that you sometimes skipped the 1% lows comparisons. The 6800 XT had many more cases where it was just much better. Like, A LOT. I'd rather have a card that's 10% slower but whose minimum FPS is 20-30% higher. And it also costs 40% less? Never buy the first iteration of Nvidia cards, lads & ladies: the 1060 with 3GB VRAM, the 3080 with 10GB, etc. With AMD you know the performance will scale up in time.

  • @marcbibeau1416
    @marcbibeau1416 Год назад +1

    Got a 3080 for $1200 CAD, tax in, just before COVID screwed everything up. Best GPU purchase ever.

  • @stratuvarious8547
    @stratuvarious8547 Год назад +1

    I can't believe how many people are trying to sell new 3080s and 3080 Ti for more than the 4070 Ti, it just doesn't make any sense. This is technology, which gets drastically cheaper when the next generation gets launched, at least it's supposed to.

  • @Definitely_Melnyx
    @Definitely_Melnyx Год назад +1

    Recently picked up a 3-month-old RX 6800 for 450 euros and it has performed fantastically. That price wouldn't even get me a 3070 here.

  • @Azhureus
    @Azhureus Год назад +1

    I have 6800xt, bought it for around 900 euros and it was much cheaper than 3080 back then. It runs games well, sure, in ray tracing it is not as good as 3080, but with some tweaking, you can still run games with RT on on decent 60+ fps. Ubisoft titles are running very well with RT on.

  • @MessiahFromR6
    @MessiahFromR6 Год назад +1

    I paid 500 euros for my Gigabyte 6800 XT OC. The card is awesome: great performance for the price, zero driver issues, and even under full load the temps don't go over 59 degrees. 100% recommend it.

  • @dafareyreynaldi9309
    @dafareyreynaldi9309 Год назад

    What is your RX 6800 XT driver version? Please tell me.

  • @dotxyn
    @dotxyn Год назад +1

    I think you should use footage from the main game vs the pre-game lobby in Fortnite. I've noticed, at least on my 6800XT system, there are more stutters and frame drops before you've landed on the island. Although, I'm unsure if this is a huge issue within the recorded playback.

    • @hardcorehardware361
      @hardcorehardware361 Год назад

      Fortnite stutters regardless of what GPU you use, though the UE5 update has made the game a lot better. I've played since season 1, so I went through all the good and bad updates. This update doesn't require as much caching: usually one match now after installing new driver updates, which is really good, because before it would take a few matches. It stutters on my 3090, 2080 Ti, 2070S, 6800 XT, and RX 580 8GB, but as I said, it's a lot better now with UE5. The only issue I ran into with the 6800 XT was DX12 Lumen at 4K: I get Unreal exception crashes, but imo that has to do with the small 256-bit bus or drivers, since the last driver for our cards was Dec 2022? I don't play at 4K, though; this was just me testing out Lumen and DX12 on the card. I can notice the slowdown even at DX11 4K comp settings; it's the bus, it has to be, but it's still a solid 1440p card. That's my personal experience.