AMD Radeon RX 6800 XT Benchmark Review, Smart Access Memory, Thermals & Gaming

  • Published: 16 Dec 2024

Comments • 3.1K

  • @paulshardware
    @paulshardware 4 года назад +2079

    HI STEVE, I HOPE WE GET TO SLEEP SOON

    • @azurai3934
      @azurai3934 4 года назад +94

      Our reaction when we see a dozen RX6800 XT reviews pop up at once.

    • @nialmurphy9622
      @nialmurphy9622 4 года назад +16

      Paul you are sexy as hell without sleep. Just wanted to say that.

    • @spxctreofficial
      @spxctreofficial 4 года назад

      LOL

    • @ChrisDupres
      @ChrisDupres 4 года назад +52

      GET BACK IN YOUR CAVE AND GET US NUMBERS, NUMBER SLAVE

    • @jayreed2692
      @jayreed2692 4 года назад +8

      hahaha you guys have busted your asses over the past month, you guys need a vacation

  • @HeirofCarthage
    @HeirofCarthage 4 года назад +847

    Well, we can't buy one, as per usual for a 2020 launch, but at least I can help reward Steve for all the hard work they put into the review of another great product we can't buy :) Thanks for the video.

    • @Hector3436
      @Hector3436 4 года назад +2

      Keep up the good work, Heir :),

    • @SuperJosteen
      @SuperJosteen 4 года назад +2

      didnt happen for cometlake tho 🤣 sad intel

    • @teamtechworked8217
      @teamtechworked8217 4 года назад +3

      I got one. And I'm a hunt-and-peck typist on the keyboard.

    • @ffwast
      @ffwast 4 года назад

      @@teamtechworked8217 if you learn to touch type you could get two or three.

    • @ffwast
      @ffwast 4 года назад +1

      @@teamtechworked8217 Personally I typed like that for a decade, then realized I knew where all the keys were without looking and I no longer had tiny child hands to make it necessary. Fully trained myself to do it by touch within days, and faster than ever within weeks; just had to commit to it. Totally worth it.

  • @georgelopez9411
    @georgelopez9411 4 года назад +247

    Your graphs are very clear; much clearer than those of other outlets. Coloring the relevant bars with colors that stand out makes it easy to immediately understand the graph without hunting around.

    • @somethinglikethat2176
      @somethinglikethat2176 4 года назад +8

      *angry buildzoid noises*

    • @korndud
      @korndud 4 года назад +6

      i agree. his graphs are the best out of all reviewers. his tests always seem to have amd cards higher than other reviewers as well. even the rx 5700 xt.

    • @knifeyonline
      @knifeyonline 4 года назад

      I was wondering if it's just because I'm Australian I prefer this channel by a lot. I guess not?

    • @sniperwolfpk5
      @sniperwolfpk5 4 года назад

      I agree with that

    • @8bitchiptune420
      @8bitchiptune420 4 года назад

      Sounds like my boss in every meeting

  • @boukm3n
    @boukm3n 4 года назад +355

    Steve be like: “SLEEP IS FOR THE WEAK” *passes out*

    • @TheHighborn
      @TheHighborn 4 года назад +7

      We'll never sleep, cos sleep is for the weak.
      We'll never rest, till we're fucking dead

    • @MarioAPN
      @MarioAPN 4 года назад +2

      I know how demanding a repetitive job like PC content creation can be; I've been drawing in CAD and CATIA for hours. But it is not like he is doing some crazy job. He is tired, but he can create 12-14 hours a day for a few days, no problem. After all, there is money in it for him.

    • @Arbiter099
      @Arbiter099 4 года назад

      I thought hardware news was for the weak

    • @Hardwareunboxed
      @Hardwareunboxed  4 года назад +35

      @@MarioAPN "12-14 hours a day for a few days, no problem" haha cute, imagine only working 14 hours per day, you're living the dream!

    • @andrewwilliamson3513
      @andrewwilliamson3513 4 года назад +9

      @@Hardwareunboxed the self employed will always work the hardest!

  • @big.atom37
    @big.atom37 4 года назад +152

    Finally! I can now start waiting for RDNA 3

    • @ShaneMcGrath.
      @ShaneMcGrath. 4 года назад +18

      Just in time for Covid 21 perhaps. o.O

    • @micaluzzo83
      @micaluzzo83 4 года назад +12

      “In many ways the AMD GPU is the superior product”
      What a good year for AMD. Just incredible.

    • @AndrewTJackson
      @AndrewTJackson 4 года назад +2

      Looooool

    • @wizzyno1566
      @wizzyno1566 4 года назад

      🤣🤣🤣

  • @ScantyMosquito
    @ScantyMosquito 4 года назад +424

    “In many ways the AMD GPU is the superior product”
    What a good year for AMD. Just incredible.

    • @TheArrowedKnee
      @TheArrowedKnee 4 года назад +17

      Just a shame there is such a lack of stock, but that's an industry-wide problem I suppose, not exactly AMD's fault

    • @chaniibak7702
      @chaniibak7702 4 года назад +2

      @@TheArrowedKnee This really sucks man. Sometimes I wish covid wasn't a thing, because of things like this. Most times I don't because this has honestly been a good year for me.

    • @rct9617
      @rct9617 4 года назад +36

      Only really in three ways is it the superior product: price, vram capacity, and power consumption. But I imagine that those three reasons alone can convince a lot of people, because it definitely almost convinced me. But for me the value advantage is negligible because of the extra features that rtx cards have. I’ll make a purchase sometime next year when all product choices are accounted for and planned features are properly implemented.

    • @MauriceTarantulas
      @MauriceTarantulas 4 года назад +5

      @@rct9617 Wise words.

    • @kaozz77
      @kaozz77 4 года назад +3

      now they just have to follow with good drivers

  • @blindtruth4614
    @blindtruth4614 4 года назад +65

    It is amazing how much ground AMD has made up in a single generation, and it will be great once stock is no longer a problem for us as customers; finally we will have a choice at the high end

  • @emilromano
    @emilromano 4 года назад +772

    Imagine that there were people who doubted that AMD could even match the 2080Ti...

    • @ddlog420
      @ddlog420 4 года назад +104

      Guilty.

    • @thetg2117
      @thetg2117 4 года назад +76

      I was guilty as well. Finally ready to upgrade my 1080 Ti

    • @emilromano
      @emilromano 4 года назад +101

      @@ddlog420 That's understandable, AMD has disappointed a lot during the last couple of years, especially with GPU's. I'm just glad they finally have brought competition to the high-end GPU market!

    • @tomhsia4354
      @tomhsia4354 4 года назад +22

      It still doesn't, the benchmark numbers don't matter.
      /s
      Damn, those 1440p and 1080p numbers (Steve hinted that the 6800 XT beats the 3090 at 1080p in Watch Dogs: Legion during his unboxing). I was hoping for better 4K performance but whatever, I can stand 1440p in games.

    • @retsu1182
      @retsu1182 4 года назад +12

      Still cant compete in raytracing or any other possible software feature or even stock right now so its still not a win for AMD.

  • @Luemm3l
    @Luemm3l 4 года назад +203

    Totally disregarding the performance: am I the only one who finds the FE/reference cards (both NVIDIA and AMD) way more gorgeous than any board partner cards so far? Usually it was the other way around in previous generations...

    • @gamergeek494
      @gamergeek494 4 года назад +15

      I havent seen many AMD board partner designs yet, but yes. Both AMD and NVIDIA massively stepped up their games in the looks department.

    • @MrMilkyCoco
      @MrMilkyCoco 4 года назад +10

      I hate that red stripe on the 6800/XT, it's obnoxious as hell. Makes it not look premium

    • @HickoryDickory86
      @HickoryDickory86 4 года назад +4

      I don't know... those XFX teases are looking mighty sexy. 😏

    • @Deliveredmean42
      @Deliveredmean42 4 года назад +3

      @@MrMilkyCoco what is consider premium to you? Genuine question.

    • @Luemm3l
      @Luemm3l 4 года назад +4

      @@HickoryDickory86 They do actually... Sapphire as well. Most others though... I don't even want to start with NVIDIA... most of them are ugly as hell. I also don't like overly "gamey" cards; a little RGB or the occasional exquisite cooler design is fine, but most are overly edgy or plain boring. The Founders Editions hit the sweet spot between simple and good looking... AMD more on the edgy little rebel side, NVIDIA on the more elegant side. I personally don't mind the red stripe; it is AMD's color after all, and some red accents are perfectly fine with pretty much any case color setup.

  • @vncube1
    @vncube1 4 года назад +309

    Who's ready to watch a combined two hours of Big Navi benchmarks from all these techtubers right now?

    • @cretene1
      @cretene1 4 года назад +6

      me

    • @roccociccone597
      @roccociccone597 4 года назад +5

      yup I'm ready bro

    • @Lionheart1188
      @Lionheart1188 4 года назад +4

      Sign me up

    • @MattJDylan
      @MattJDylan 4 года назад +7

      *my body is ready*
      Except for GN, I actually need to be focused enough before watching their reviews

    • @vncube1
      @vncube1 4 года назад +5

      @@MattJDylan The second hour is reserved for Tech Jesus' review :P

  • @newma5348
    @newma5348 4 года назад +524

    I'm so happy Amd is now giving competition to other previous market dominating brands

    • @Superiorer
      @Superiorer 4 года назад +19

      What competition? They sell exactly zero of these cards in the Netherlands.

    • @shernandez31
      @shernandez31 4 года назад +36

      @@Superiorer and that's why Nvidia can price their cards however they want, even when competition is great, people buy Nvidia

    • @justmixah3408
      @justmixah3408 4 года назад +5

      @@shernandez31 so true

    • @srinathshettigar379
      @srinathshettigar379 4 года назад +10

      @@Superiorer not anymore.

    • @Orcawhale1
      @Orcawhale1 4 года назад +6

      But they aren't.
      Customers aren't buying their cards, especially not after the driver woes of the RX 5700 and XT.
      As evidenced by the fact that there are more 2080 Ti owners than RX 5700 and XT owners combined.

  • @A7XKoRnRocks1
    @A7XKoRnRocks1 4 года назад +343

    Despite the whole DLSS and RTX thing its good to see that AMD is competitive again.

    • @misterPAINMAKER
      @misterPAINMAKER 4 года назад +6

      They have ray-tracing and will also have dlss in the future.

    • @xhydrox
      @xhydrox 4 года назад +53

      Yes, but it's nowhere near Nvidia's performance in those categories; that's what Romeo probably meant

    • @weil46
      @weil46 4 года назад +5

      Let's wait for driver updates; as we know, AMD at launch never has the best drivers. So let's wait at least a month and we may see more performance

    • @xhydrox
      @xhydrox 4 года назад +34

      Sure but I don't think AMD can just fix their ray tracing and dlss with a simple driver update

    • @wildanthea
      @wildanthea 4 года назад +8

      Nvidia is on 2nd gen RT while AMD is on 1st gen RT, so yeah... RDNA 3 will be the real battle, head to head with Nvidia.

  • @praetorxyn
    @praetorxyn 4 года назад +247

    Just give me a Sapphire Nitro+ version of this baby.

    • @Hardwareunboxed
      @Hardwareunboxed  4 года назад +211

      Review should be up on the 25th for you ;)

    • @praetorxyn
      @praetorxyn 4 года назад +4

      @@Hardwareunboxed Thanks. Do you think it's worth upgrading from a 3900X to a 5900X for SAM? I kind of wanna, but I kind of feel like I'll have buyer's remorse next year if I do.

    • @sanjivinsmoke2719
      @sanjivinsmoke2719 4 года назад +19

      @@praetorxyn you're still good bro

    • @panafricanmarxist7632
      @panafricanmarxist7632 4 года назад +5

      @@praetorxyn if you are planning to game, I might go for it. For anything else, no, because SAM only increases performance in games.

    • @yacchaga
      @yacchaga 4 года назад +4

      I want Toxic

  • @DeeonleeCobb373
    @DeeonleeCobb373 4 года назад +290

    How other people bench: Muscle gains.
    How Steve Benches: FPS gains.

    • @KannX
      @KannX 4 года назад +4

      Brilliant comment Sir! :)

    • @nine2nine929
      @nine2nine929 4 года назад +1

      that's how Steve flexs

    • @RawkL0bster
      @RawkL0bster 4 года назад

      All kinds of gains. All kinds.

    • @DeeonleeCobb373
      @DeeonleeCobb373 4 года назад

      @@shredderorokusaki3013
      Yeah, can't wait for those benchmarks to release. Think there'll be a non-XT 6900? The 6800 XT is good for its price but I really am inclined to get a GPU labelled 6900 solely for the name.

  • @WinnieBlue
    @WinnieBlue 4 года назад +19

    I have full faith that retail will add $300AU to the $1050AU price when AIB cards are released...

  • @tylerthere5832
    @tylerthere5832 4 года назад +251

    My 1080s fan just started squeaking after watching this video.

    • @moomcmoosson1992
      @moomcmoosson1992 4 года назад +8

      Best to upgrade before it explodes in your face then. xD

    • @astronot4955
      @astronot4955 4 года назад +9

      You can wait for rdna 3 with a 1080

    • @CharlieboyK
      @CharlieboyK 4 года назад +3

      My 1080's memory crapped itself last week. Had to upgrade to a 3070 ;-)

    • @TheAlbaniaGaming
      @TheAlbaniaGaming 4 года назад +1

      Hey, I'm a computer engineering student and really need a new GPU for university. If you're thinking of upgrading, how much can you sell me your GTX 1080 for?

    • @davidlegkodukh6969
      @davidlegkodukh6969 4 года назад +1

      @@TheAlbaniaGaming Look on Facebook Marketplace and Craigslist; you can find them for $350 and less if you spend a few hours looking 👍💯

  • @SauvagePT
    @SauvagePT 4 года назад +164

    11:44 - RTX 2060: "I can do this... I CAN DO THIIIIIS!"

  • @djmandrick
    @djmandrick 4 года назад +65

    2020: When hardware releases became a physical form of "Early Access"
    2021: Introduction of new marketing, the "TimeSaver" Edition - only £30 extra but your item is guaranteed for launch day*
    *Terms and Conditions apply

    • @NickMcDonald89
      @NickMcDonald89 4 года назад +8

      Shhhh, Nvidia might hear you!

    • @GholaTleilaxu
      @GholaTleilaxu 4 года назад

      TimeSaver Edition = Anti-competitive Edition = Billions of dollars in fines, not that we ordinary citizens would ever see any of that money anyway.

    • @SmokeyGamingAu
      @SmokeyGamingAu 4 года назад

      Noooo What have you done, they will see this idea and steal it lol

  • @whitemoses7913
    @whitemoses7913 4 года назад +165

    Comparing RDNA to Zen
    RDNA1 & Zen : "Meh"
    RDNA2 & Zen 2 : "Interesting"
    RDNA3 : "See Zen 3"

    • @tusux2949
      @tusux2949 4 года назад +57

      Comparing Nvidia to Intel's responses in both cases :
      RDNA1/Zen1 : Nvidia/Intel - "Hahaha, lol wtf ffs, is that all you've got ?"
      RDNA2/Zen2 : Nvidia/Intel - "Damn, we actually have to do something for the new generation now..."
      RDNA3/Zen3: Nvidia/Intel - "FUUUUUUUUUUUUUUUUUUUU*K!!!!!"
      EDIT: for clarity

    • @jwest88
      @jwest88 4 года назад +20

      I plan on upgrading my system to a zen 4 rdna 3 system when those both launch. It'll be a huge upgrade from my r5 1600 and gtx 1080 system.

    • @Dj-Mccullough
      @Dj-Mccullough 4 года назад +15

      If we assume (and I know they can do it now) that they can meet the challenge once, they can do it again. They took their performance straight from GeForce 1080 Ti level (5700 XT) to 3080-3090 level. They literally skipped a "2000 series" equivalent card. THAT is an accomplishment. And I think they will do it again with RDNA 3 :)

    • @ahdasick
      @ahdasick 4 года назад +1

      @@tusux2949 Eh, Nvidia still edged out the win this gen, but I believe AMD will come for blood next gen, I can guarantee it! This feels like their "first attempt" cards, like Nvidia's 2000 series.

    • @ralfrudi3963
      @ralfrudi3963 4 года назад

      @@Dj-Mccullough The comparison to the 5700 XT is very flawed and extremely misleading. The 5700 XT had 40 CUs, while the 6800 has 60 CUs and the 6800 XT 72 CUs. It is not like it was impossible for AMD to create a 60 CU or 72 CU RDNA1 GPU, they just decided not to. So the gain from architecture to architecture will be much smaller when we compare 40 CUs vs 40 CUs (probably the 6700 or 6700 XT), and the difference between those two cards is the architectural gain. Probably more in line with 10-20%, since both use the same node. RDNA3 will probably be on a better node, so you will get some gains there as well.
      But you will not see the same gains as when comparing the 5700 XT vs the 6800 XT. They will not produce 130-140 CU GPUs. Even their newly revealed MI100 data center chip only uses 128. So unless you want to pay 5000 bucks for a gaming GPU (and only leverage its performance at 8K), you will have to be satisfied with some architectural + node upgrades, which will probably be more like a 20-40% uplift.
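
The decomposition above can be made concrete with some quick arithmetic. A minimal sketch, assuming placeholder CU counts, clocks, and an illustrative 2x measured speedup (none of these are benchmark figures from the video):

```python
# Split a generational GPU speedup into unit-count scaling, clock scaling,
# and the residual per-CU ("architectural") gain.
# All inputs below are illustrative assumptions, not measured results.

def per_cu_gain(perf_ratio: float, cu_old: int, cu_new: int,
                clk_old_mhz: float, clk_new_mhz: float) -> float:
    """Residual gain left over after accounting for CU count and clock speed."""
    unit_scaling = (cu_new / cu_old) * (clk_new_mhz / clk_old_mhz)
    return perf_ratio / unit_scaling

# Hypothetical example: a 72 CU card measured 2.0x faster than a 40 CU card.
residual = per_cu_gain(perf_ratio=2.0, cu_old=40, cu_new=72,
                       clk_old_mhz=1900, clk_new_mhz=2250)
print(f"Residual per-CU gain: {residual:.2f}x")
# A value near (or below) 1.0x means most of the uplift came from more CUs
# and higher clocks rather than per-CU architectural changes.
```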

  • @Tylerjrb
    @Tylerjrb 4 года назад +51

    AMD is finally back. I’ve been wanting to upgrade and it’s been a long time since I’ve owned an AMD card.

  • @thatsounditmakes9177
    @thatsounditmakes9177 4 года назад +41

    Steve: most of you are here for the blue bar graphs
    Me: I'm here for Steve's sick accent, but that's just me.

    • @sausageskin
      @sausageskin 4 года назад +2

      Haha you got to move to Australia then mate. I'm a newcomer here and enjoying the accent a lot.

    • @TheMetalGaia
      @TheMetalGaia 4 года назад +2

      @@sausageskin Did you move for work? Genuinely curious.

    • @sausageskin
      @sausageskin 4 года назад +1

      @@TheMetalGaia I was more trying to escape Russia and live in a country where people are nice and not bully me :)

    • @sausageskin
      @sausageskin 4 года назад +2

      @@TheMetalGaia I applied to Australian permanent residency and got it after two years of paperwork

    • @TheMetalGaia
      @TheMetalGaia 4 года назад +2

      @@sausageskin That's awesome! Glad you got it. I've considered moving to Canada from the U.S. before.

  • @federicosciovolone
    @federicosciovolone 4 года назад +131

    omg, the 6800 XT is a monster of a GPU

    • @johns250
      @johns250 4 года назад +22

      So is the 3080

    • @legendarymop
      @legendarymop 4 года назад +8

      lol there's a huge CPU bottleneck in every case at 1440p. There's no way the 3090 is only faster than the 3080 by a few fps, which indicates the CPU is bottlenecking. When you go to 4K, where the CPU bottleneck is removed, you immediately see the 6800 XT is way slower than the competition.

    • @bastordd
      @bastordd 4 года назад +9

      3080 is faster in 4k because of memory

    • @dawienel1142
      @dawienel1142 4 года назад +37

      @@legendarymop The cpu bottleneck is working against AMD not nvidia.
      The reason that Nvidia has more performance at 4k is because of the FP32 compute (tons of cores) which doesn't scale well at lower resolution.
      There is a CPU bottleneck at 1080p and a little one at 1440p, but it bottlenecks RDNA 2 more than Nvidia.
      The rtx3090 is only slightly more powerful than a 3080 and runs at similar clocks, it's an architecture bottleneck at lower resolutions for the RTX3000 series.
      So for fps snobs at lower resolutions it's RDNA 2 all the way now. (Massive 40%+ performance difference in some games)
      For 4K snobs it's slightly more in favor of Nvidia. (18-20%+ performance difference in some games)

    • @Tinky1rs
      @Tinky1rs 4 года назад +2

      @@bastordd Doesn't the 3080 have less memory? How would less memory make it faster?

  • @Brisingrr_
    @Brisingrr_ 4 года назад +30

    AMD costing 350 dollars less in australia even for AIBs! Hopefully they're in stock for more than 1 second.

    • @choatus
      @choatus 4 года назад +2

      LOL good luck Mate

  • @ejkk9513
    @ejkk9513 4 года назад +9

    Just a few years ago AMD GPU'S were going for high compute capabilities at lower frequency. Nvidia was going for lower compute capabilities but at a higher frequency. Now they have completely switched places! The only difference is that they're both good now, not just Nvidia.

  • @ailolen3954
    @ailolen3954 4 года назад +106

    144hz 1440p at max settings, finally!!!
    Never cared for 4k that much....

    • @HollowRick
      @HollowRick 4 года назад +27

      Same, not until 4K 144Hz monitors become affordable; 1440p 144Hz is the sweet spot

    • @rahuloberoi9739
      @rahuloberoi9739 4 года назад +15

      @@HollowRick I'm happy with 1080p 144hz. Cheers.

    • @NikA-wr6px
      @NikA-wr6px 4 года назад +2

      @@rahuloberoi9739 same bro, just sucks my 1060 can't run the new games at more than 60 fps on medium to high settings anymore :(

    • @filipealves6602
      @filipealves6602 4 года назад +2

      3440x1440 is where it's at. 21:9 is so much better for gaming.

    • @256alexdt
      @256alexdt 4 года назад +1

      @@HollowRick They're not that expensive now... you can pick up an LG 4K 144Hz for what a 1440p 144Hz monitor used to be... 4K is really the new 1440p... amazing to play at 4K with everything maxed out

  • @ozker5448
    @ozker5448 4 года назад +90

    Amd reference design is "cool and quiet".
    It's so weird hearing that after 'hot and loud' release after release after release.

    • @R3in_Ch
      @R3in_Ch 4 года назад +5

      improvement at its finest

    • @MLWJ1993
      @MLWJ1993 4 года назад +6

      Probably the biggest improvement ever made 😆

    • @choatus
      @choatus 4 года назад +1

      hmm... 75C is a full 15C higher than my 3080 when it's OC'd, which seems weird considering AMD is using a smaller node (7nm).

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 7 месяцев назад

      @@MLWJ1993 Likely due to the smaller die (about 108 mm² smaller) drawing much less power, but still almost 300 W.

  • @Avalon1489
    @Avalon1489 4 года назад +10

    I'm actually impressed amd made a card this powerful. It's nice to see competition again.

  • @spade6774
    @spade6774 4 года назад +160

    They say all heroes wear capes, but Steve here proves it differently by being a wizard who delivered an unbelievable amount of benchmarks.

    • @novoid69
      @novoid69 4 года назад +3

      "They say all heroes wear capes, but Steve here wears hoodie. What's up with that?"

    • @MattJDylan
      @MattJDylan 4 года назад +2

      He can't wear a cape: since he's aussie the cape would always float behind him because he's upside down... that would mess with the lighting, you know..

  • @MasterOfInfinity
    @MasterOfInfinity 4 года назад +159

    Out of stock
    BUYER RAGE MODE ON

    • @orthy853
      @orthy853 4 года назад

      It sold out when Newegg crashed at 09:01, and when it came back at 09:14 it was sold out. The AMD website was similar.

  • @aaardvaaark
    @aaardvaaark 4 года назад +2

    Sooo much better than the other 6800XT reviews. Good job as always Steve.

  • @Arjun-eb1yc
    @Arjun-eb1yc 4 года назад +110

    BEGUN THE GPU WARS HAVE.
    That's all that matters.

    • @edgain1502
      @edgain1502 4 года назад +21

      Spoken well you have, a like must I give.

    • @dawienel1142
      @dawienel1142 4 года назад +8

      @@edgain1502 Indeed a great comment have you given, a like doubled from me have you received.

    • @wolfshanze5980
      @wolfshanze5980 4 года назад +2

      BEGUN THE VAPORWARE WARS HAVE
      There... fixed it for you

    • @SuperLotus
      @SuperLotus 4 года назад +1

      Yeah. That's actually where the real excitement is - _at least until I can actually afford an upgrade_

    • @GameBacardi
      @GameBacardi 4 года назад +1

      I get the feeling that Nvidia rushed their RTX products out to show off and try to claim their own ray-tracing system in games, knowing that AMD was going to give them a hard time.

  • @Bigbacon
    @Bigbacon 4 года назад +7

    You guys are the best. Still tossing numbers for 980s into the mix is amazing. Having 1080p data available is also awesome; so many don't even bother with it.

    • @Arbiter099
      @Arbiter099 4 года назад

      Very nice to see direct comparison for me as a 980ti owner

  • @k_dubs5222
    @k_dubs5222 4 года назад +42

    let's be honest, we're not here for the 4K review, we want that 1440p

    • @william_SMMA
      @william_SMMA 4 года назад

      Who's we?

    • @alangregory5580
      @alangregory5580 3 года назад

      not the case at all. like your pic tho ;)

    • @0mnikron702
      @0mnikron702 3 года назад

      4K gaming isn't here yet. Yes, there are some titles that you can play at over 60 fps, but for FPS games where you need 144Hz or more, 4K is a far reach. 1440p is the next step up from 1080p and will slowly become more mainstream. All you have to do is look at games like Cyberpunk: top GPUs struggle to play that game with all the eye candy on. 4K gaming is great for single player games, story-based games where high fps doesn't matter much.

    • @Flooberjobby
      @Flooberjobby 3 года назад

      @@0mnikron702 I disagree. I play all my games in 4k max settings. 90% of the time I get 80-120 FPS in games. very few games bring my FPS down to 60 fps or lower. I enjoy all the 4k numbers.

  • @AlexanTheMan
    @AlexanTheMan 4 года назад +80

    This is the first time I've actually been anticipating a GPU release in a while. Hopefully AMD Radeon continues the momentum with their graphics cards, it's about time we got a return to competition within this market segment.

    • @YourPalHDee
      @YourPalHDee 4 года назад +2

      The price is still an utter joke. Last time we saw competition there was RX480

    • @AlexanTheMan
      @AlexanTheMan 4 года назад +7

      The RX 480 was only powerful enough to compete with the mid end market. These high end cards are naturally always going to be expensive to a degree.

    • @YourPalHDee
      @YourPalHDee 4 года назад +2

      @@AlexanTheMan It was still a flagship card that gave the performance of a GTX980 for $200. The equivalent of that today would be the RX6800 performing on par with the RTX2080 for $200.

    • @ivanalvarado3646
      @ivanalvarado3646 4 года назад +1

      @@AlexanTheMan Plus nodes have gotten ridiculously expensive to develop and produce. Gone are the days where you can get a fantastic card for $300 like in the late 2000s-early 2010s.

    • @AlexanTheMan
      @AlexanTheMan 4 года назад +3

      You guys realize that inflation is still a thing, right? When the Nvidia 10 series came out, it blew the RX 480 (and the later rebranded 580) out of the water, and AMD never came out with a proper response after the failure of Vega. Plus, also consider that we are still not quite out of this pandemic yet, so chip production is not at full force, but I'm not sure about that. So in my opinion, the prices don't surprise me one bit.

  • @DS-pk4eh
    @DS-pk4eh 4 года назад +37

    All sold out here in Switzerland. It took 4 minutes.
    Prices were
    719 Swiss Francs for the 6800 XT
    689 Swiss Francs for the 6800
    So the 6800 XT is the better buy for sure. I got lucky and got a 6800, but really wanted the XT version

    • @MattJDylan
      @MattJDylan 4 года назад +1

      Give it time: with the same power limit shenanigans we saw with vega and rdna1, the 6800 will be a better value once oc'd

    • @cptwhite
      @cptwhite 4 года назад

      @@MattJDylan Doubtful, it has 17% fewer CUs (60 vs 72), so unless there's some way to unlock them it's unlikely it'll be able to top the 6800 XT in perf/dollar in any meaningful way.

    • @helljester8097
      @helljester8097 4 года назад +1

      Did you get the asus one at 689? I managed to order an msi 6800 at 669 from digitec before they greyed out the add to cart buttons.

    • @MattJDylan
      @MattJDylan 4 года назад

      @@cptwhite in perf/dollar the 6800 is already topping both the XT and the 3080, and I didn't mean the vanilla 6800 will beat both, but that it will come close enough for him not to worry too much

    • @DS-pk4eh
      @DS-pk4eh 4 года назад +1

      @@helljester8097 Yes I did, but I really wanted the 6800 XT, I mean it was just 30 bucks more! And I didn't know there was an XFX 6800 XT in stock too.
      But 4 models in total, and they had fewer than 100 pieces in stock I think.

  • @balaran4349
    @balaran4349 4 года назад +8

    A lack of ray tracing performance is what scares me now that the consoles have adopted it. I'd personally rather have less VRAM and lower 1080p performance than give up the option of decent ray tracing performance. Cyberpunk 2077 is just around the corner and it may deliver something really nice using this technology.

    • @choatus
      @choatus 4 года назад +5

      I agree, Raytracing is the future, no matter how much Steve doesn't care about it..

    • @xhydrox
      @xhydrox 4 года назад +3

      I also think that a lot of future titles will bring ray tracing support, starting with cyberpunk which is gonna be awesome :)

    • @notsoanonymousmemes9455
      @notsoanonymousmemes9455 4 года назад +2

      But consoles use RDNA 2, devs will work more with AMD's implementation of Raytracing, in the long run, AMD has an advantage as their Raytracing matures

    • @choatus
      @choatus 4 года назад +2

      @@notsoanonymousmemes9455 Sony doesn't use RDNA2, more like RDNA 1.5. Also, the ray tracing they use is based on Microsoft's implementation and not locked down to just AMD, so Nvidia can use it too, as well as their own RTX.

    • @eazen
      @eazen 4 года назад

      @@choatus do you have a source for the nonsense you're saying? No. Sony uses RDNA 2.
      And of course AMD has an advantage over Nvidia because their architectures are everywhere: in two consoles and the PC. Only an idiot would ignore that fact.

  • @free2mov
    @free2mov 4 года назад +74

    If the 6800XT is this good, I can’t imagine the 6900XT benchmarks when it comes out.

    • @Saigonas
      @Saigonas 4 года назад +26

      It won't be so much better, i believe, just look at 3080 vs 3090

    • @matteovukoja1240
      @matteovukoja1240 4 года назад +31

      Between the 6800 XT and 6900 XT, there is a difference of only 8 Compute Units. As the other commenter said, it'll be the same as 3080 vs 3090

    • @Arbiter099
      @Arbiter099 4 года назад +6

      @@matteovukoja1240 also 8 more "ray accelerators" so hopefully slightly better RT performance too

    • @clarkkent8747
      @clarkkent8747 4 года назад +6

      @@Saigonas It will definitely beat the 3090 at 1440p, though. Because the 6800XT was only 5% slower at 1440p than the 3090.

    • @andrewwilliamson3513
      @andrewwilliamson3513 4 года назад +12

      Given how bad the 3090 overclocks I wouldn't be surprised to see the 6900 XT beat the 3090 by an inconsequential margin, and if it does, the real story is the $500 difference in MSRP haha

  • @RavTokomi
    @RavTokomi 4 года назад +10

    Seems like the faster pipes on RDNA2 help with 1080 and 1440, but the fatter pipes on Ampere help at 4K. Be interesting to see how it plays out in the future.

    • @Zarkinn
      @Zarkinn 4 года назад +1

      It's the 256-bit bus, I think, but it's still a great GPU for me

    • @andrewwilliamson3513
      @andrewwilliamson3513 4 года назад

      I agree. It'll be especially interesting to revisit this in say, 2-3 years to see if the low VRAM on the Ampere launch versions eats into their 4k lead to the point that it is nonexistent or flips to a deficit. Given that they are already scrambling to redo the lineup and add more VRAM, that'd be a BIG RIP to the people that bought a launch 3080 or 3070.

    • @choatus
      @choatus 4 года назад +1

      @@andrewwilliamson3513 10GB is plenty for 3+ years, by which time most people will upgrade anyway. The VRAM is GDDR6X and is up to 4x faster than the GDDR6 that AMD is using, so it does help quite a lot, especially in those "fatter pipes". It's really not as big a deal as people are making it out to be; most people don't understand that they are reading VRAM allocated, not VRAM actually being actively used. Source: I own a 3080 plus a lot of other cards. Had my 3080 since almost launch and have been loving it, worth every cent so far. But I have had many decent AMD cards in the past too, so don't call me a fanboy; I will probably buy a 6800 XT or 6900 XT just for the fun of tinkering with it at some stage too.

    • @andrewwilliamson3513
      @andrewwilliamson3513 4 года назад

      @@choatus The basis of my argument is that the Nvidia cards only exceed the AMD card's performance in 4k, so their best use case is for 4k gaming, but the limited Vram might make that a moot point in the next cycle of games (without turning down settings.) Microsoft Flight Sim already allocates over 17Gb of Vram in 4k when under high load and nearly 10Gb in 1440p. Actual usage seems to be around 12Gb/8Gb respectively. The new trend in game development is to early access release games with shit optimization, and the optimization never seems to get addressed even at final release. AAA titles like COD that release every year are the same in a lot of ways. I think Warzone will use over 8Gb. It seems like Nvidia kept the Vram low because GDDR6X is expensive, but it keeps the 3070/3080 from being "Great" 4k cards for the future. They could have achieved the same memory bandwidth with more slower GDDR6 for roughly the same cost.

    • @choatus
      @choatus 4 года назад +1

      @@andrewwilliamson3513 To achieve the same bandwidth as GDDR6X they would need 3-4 times the amount of normal GDDR6, so no, it doesn't work like that... it would end up costing a ton more to do it that way, for starters. P.S. MSFS on my 3080 in 4K uses (not allocates) less than 7GB of VRAM right now on Ultra, but that game is broken; it needs its DX12 update before we can see what it really runs like, it's basically running on 2 cores only atm.
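
For reference, peak VRAM bandwidth is just bus width times per-pin data rate. A minimal sketch of that arithmetic, taking the commonly published spec figures for both cards as assumed inputs:

```python
# Peak theoretical VRAM bandwidth in GB/s:
#   (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
# The spec figures below are the commonly published ones, taken as assumptions.

def peak_bandwidth_gb_s(bus_bits: int, rate_gbps: float) -> float:
    return bus_bits / 8 * rate_gbps

cards = {
    "RTX 3080 (320-bit GDDR6X @ 19 Gbps)": (320, 19.0),
    "RX 6800 XT (256-bit GDDR6 @ 16 Gbps)": (256, 16.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
# -> roughly 760 GB/s vs 512 GB/s; RDNA 2 leans on its large on-die cache
#    to make up for the narrower, slower memory bus.
```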

  • @JeremyHansenblue2kid3
    @JeremyHansenblue2kid3 4 года назад +10

    Are these day one drivers vs day one drivers? I mean the 30 series has had an extra 2 months now to iron out driver issues.

    • @silentKeys20
      @silentKeys20 4 года назад +5

      Why would they use day 1 drivers for the 30 series? It's Nvdia's advantage for releasing earlier. You want to compare CURRENT performance to help your purchasing decision, not anecdotal. Steve will make an updated version for later drivers in the future as usual, anyways.

    • @Mkhalo3
      @Mkhalo3 4 года назад

      As a 3080 owner I can say for a lot of these games the average FPS with my card is way higher than what they got with their 3080 so who knows lol

    • @paulelderson934
      @paulelderson934 4 года назад +2

      @@Mkhalo3 but do you run the absolute maximum settings or a more reasonable "tuned ultra" setting.
      This video isn't a guide for reachable fps, it's a comparison under the exact same circumstances.

    • @Mkhalo3
      @Mkhalo3 4 года назад +1

      @@paulelderson934 Everything maxed out with the highest FOV as well

  • @NeXMaX
    @NeXMaX 4 года назад +15

    24:40
    There's probably a few reasons to get a 3080 over a 6800XT, for some specific applications like ML or CUDA-acceleration in workstation tasks.
    With that said, I doubt most viewers here are buying either for workstation use primarily. For gaming, AMD's done immensely well here. Well done.

    • @CrimsonKobaNakirigumi
      @CrimsonKobaNakirigumi 4 года назад

      Same here, mainly because of emulators, but as an AMD user for a long time this makes me smile.

    • @wolfgangjr74
      @wolfgangjr74 4 года назад

      I know they have the audio improving software and better encoders for people that stream. Again. If you don't do that then the 6800XT looks pretty good. NVIDIA wins on the Value Adds. Something that AMD needs to start working on as soon as they can.

    • @romanlf5620
      @romanlf5620 4 года назад

      I do not agree; for machine learning, using ROCm with 16GB of VRAM should work better than CUDA with 10GB of VRAM, because training saturates the VRAM very easily. Nvidia should be better at FP16 though, with tensor cores.

    • @NeXMaX
      @NeXMaX 4 года назад

      @@KPTKleist CUDA's the big one for me. To a smaller extent, NVENC as well even though I've not done a lot of streaming but have been doing some recordings lately.
      AMD's got some catching up to do in the features department. The performance is finally right up there but that extra $50 might be enough to tempt some people to go 3080 for those features if they apply to them.

    • @Dr.WhetFarts
      @Dr.WhetFarts 4 года назад

      RTX cards have tons of gaming features that AMD users can only dream about. Just wait till you see the Cyberpunk 2077 benchmarks; Nvidia will wipe the floor with the AMD 6000 series. Biggest game release in many years and fully Nvidia-backed with ALL RTX features, including DLSS.

  • @hyprmetlfan123
    @hyprmetlfan123 4 года назад +207

    people then : big navi cannot even compete against RTX 3070
    big navi now :

    • @shaquilleoatmeal5332
      @shaquilleoatmeal5332 4 года назад +12

      This isn't even considered the big navi right? Isn't 6900 xt supposed to be big navi?

    • @gabrielmendesmatos5121
      @gabrielmendesmatos5121 4 года назад +34

      @@shaquilleoatmeal5332 Big navi is the entire generation, including 6800, 6800 xt and 6900 xt.

    • @andersjjensen
      @andersjjensen 4 года назад +6

      @@shaquilleoatmeal5332 The 6900 XT is "the biggest big Navi" :P

    • @makisekurisu4674
      @makisekurisu4674 4 года назад +16

      Well it technically still can't. It doesn't have a DLSS competitor yet, has terrible video encoding performance and still gives glitched-out render outputs. Plus it's bad at ray tracing. It's only for games that don't support DLSS, I guess.
      Still, they are finally back.
      These cards will especially be a great attraction to all the 1440p and 1440p ultrawide gamers, which is the majority of gamers in this price range.

    • @hiimcortana1568
      @hiimcortana1568 4 года назад +1

      @@makisekurisu4674 AMD has always been lagging behind Nvidia on those extra features... the good thing about them is they keep prices down, so Nvidia doesn't have a total monopoly and the freedom to raise prices like they would with zero competition from AMD.

  • @SonGChaNSaO
    @SonGChaNSaO 4 года назад

    I'm always impressed with the sheer amount of data you guys crunch on each launch, really appreciate it. Thank you and lots of love from S. Korea

  • @seanporcelli3965
    @seanporcelli3965 4 года назад +45

    Can't wait to buy literally any gpu at the end of this decade. I'm sure they'll be available then right?

    • @DeadNoob451
      @DeadNoob451 4 года назад +2

      Oh, you probably *can* get them pretty soon.
      Just need to be prepared to pay 1000$+ to some random guy running a bot.

    • @hermesgabriel
      @hermesgabriel 4 года назад +1

      @@DeadNoob451 A good business, as long as there are stupid people paying that amount.

    • @Saigonas
      @Saigonas 4 года назад

      @@hermesgabriel yeah

    • @nofreebeer
      @nofreebeer 4 года назад

      the decade is over in less than 45 days. So probably not?

    • @DeadNoob451
      @DeadNoob451 4 года назад

      @@nofreebeer Don't decades go from 0 to 9?
      I.e., we started counting years (and thus decades) in our current system at year 0, so the first 10 years/decade ended when year 9 ended / year 10 started. So we are already in the new decade, and 2029 (or rather 01.01.2030) is the start of the next decade.
      Otherwise, the millennium would have started in 2001, not 2000, and the first decade would have had 11 years.

  • @omarcomming722
    @omarcomming722 4 года назад +9

    24:48 there are definite reasons to buy a 3080 over a 6800 XT and you already mentioned all of them so not sure what this statement means.
    Better 4k performance, DLSS, RT, driver support which has been garbage for AMD and seamless for Nvidia for years now and not something I'd personally give up easily until RDNA drivers prove to be normal for some time.

    • @AntonioNoack
      @AntonioNoack 4 года назад +1

      I've heard bad stuff about Nvidia drivers for Linux users...
      In a practical part of my studies, somebody was trying to get their 2080 Ti drivers working for TensorFlow on Ubuntu, and the display stopped working entirely 😅.

    • @Sharki33
      @Sharki33 4 года назад +1

      Butthurt much?

    • @kamachi
      @kamachi 4 года назад

      DLSS isn't supported in every game; in fact, the number of titles that support DLSS is minuscule. AMD is working on a similar technology. Ray tracing as it currently stands is nowt more than a gimmick and offers little in terms of visuals for the impact on performance, not to mention most games that are not Nvidia-sponsored will use the DXR version of RT and not Nvidia's version, which will close the RT gap. Better 4K performance in SOME titles, sure. Driver issues have plagued both companies in various generations and will continue to do so; that's a cheap excuse. Why are you so salty, bro? Competition is great and Radeon just spanked Nvidia.

    • @denkt
      @denkt 4 года назад +1

      @@Sharki33 Only one who seems butthurt is you

    • @apricotmadness4850
      @apricotmadness4850 4 года назад +3

      1. Ray tracing is a gimmick as it stands today, as it still takes a massive frame rate hit even with DLSS for minimal improvements in graphical fidelity.
      2. The 4K lead is only 5%, and the 3080 loses at 1440p and 1080p (although again within the margin of error). 5% is within the margin of error and not something anyone could notice. Most people are adopting 1440p and that's the resolution most will play at soon.
      3. AMD has their own implementation of DLSS coming soon. It's not a feature that can only be implemented by Nvidia.
      4. The drivers are yet to be tested, but Nvidia's track record is not seamless or spotless. Just a few weeks ago they had to release patches for crashes to desktop and black screens, although no one seemed to want to talk about that unless it's AMD screwing up.
      The 3080 is a tough sell unless you want to play with slightly improved shadows and reflections for a fat loss in FPS, with less VRAM and more power consumption.

  • @tilpesq
    @tilpesq 4 года назад +1

    Thanks Steve, really great review as usual, the level of quality HU puts out is incredible, lots of hard work and it shows. IMHO, what really impresses me the most and the reason I recommend you guys before anyone, is how you guys manage to cover everything so well, I've found the attention given to the VRAM amount difference of the cards really lacking from most reviewers and you guys touched the subject and warned about it. If 8-10GB is barely enough today, for the high end, according to my experience, as soon as the new consoles hit, you better have 12GB or more for the high end, really. Imagine paying U$ 700 for an amazing card today and a year or so from now it struggles, not because the GPU can't handle, but framebuffer. I've had that experience with a fair share of nvidia cards (i.e. GTX 295, GTX 480 SLI, GTX 580, GTX 690), very frustrating.
    Thanks again for the great work!

  • @maxplank2989
    @maxplank2989 4 года назад +16

    Just subbed. I love how this review is less opinion and more data than some of the other channels I know.

    • @tusux2949
      @tusux2949 4 года назад +1

      Steve is a legend and a blessing to tech mankind. Welcome to benchmark heaven where sleep is for the weak and there are numbers everywhere. :P

    • @TySoVm
      @TySoVm 4 года назад +1

      Check out Gamers Nexus. They stuff data down your throat.

  • @Alendo
    @Alendo 4 года назад +10

    11:30 those RTX 2060 4k Wolfenstein results xD 6fps average sounds good.

    • @PatalJunior
      @PatalJunior 4 года назад

      It seems weird, could just be a driver bug.

    • @rajdipdas1750
      @rajdipdas1750 4 года назад +1

      😂😂😂

    • @subhajitchatterjee9688
      @subhajitchatterjee9688 4 года назад +1

      @@PatalJunior Nah it ran out of Vram....6GB is not enough for 4k

    • @MLWJ1993
      @MLWJ1993 4 года назад

      @@subhajitchatterjee9688 funny enough the 980Ti has the same amount of VRAM (but GDDR5) & is still way faster. I would've expected both to crap out into similar abysmal performance.

    • @subhajitchatterjee9688
      @subhajitchatterjee9688 4 года назад

      @@MLWJ1993 Oh right...that seems strange how rtx 2060 can struggle so much then

  • @DC9V
    @DC9V 4 года назад +10

    14:41 worth mentioning:
    The 6800xt has the same RT performance as the 2080 ti. (even slightly better)
    That's not too bad at all!

    • @3CODKing
      @3CODKing 4 года назад +7

      I think a lot of people don't even understand this fact. AMD is basically right in the middle of Nvidia's 1st and 2nd gen RT. If AMD can keep this level of innovation for their 2nd gen RT cores, I think we will have some serious competition. It's the same with AMD's version of DLSS: this is their first generation and essentially their first go at it. As AMD said, they will age like fine wine. I just hope it holds true.

    • @eazen
      @eazen 4 года назад +3

      If it's an AMD-sponsored game, the RT performance is even better than Ampere; see Dirt 5. Otherwise it looks like about 20% less than Ampere and a bit better than Turing.

    • @3CODKing
      @3CODKing 4 года назад +1

      @@eazen That's what I'm saying, RT is still very limited in games, and even more so in games that can actually run it at 60+ FPS. For me personally, I would buy the 6800 XT because it's on par with, and often better than, the 3080 in performance. However, what I like about the 3080 is the audio isolation thing they have for microphones.

  • @devackroyd
    @devackroyd 4 года назад +13

    Purposefully stayed up doing homework to watch this premiere! So worth it, love your work Steve!

  • @SicSemperBeats
    @SicSemperBeats 4 года назад +7

    3:58 Why is Valhalla performing so low across the board? The performance deficit of every AC game on PC gets worse, while I don't think the graphics improve proportionately with the performance drop from game to game. Same thing with Watch Dogs. Ubisoft games don't look good enough to run as badly as they do, imo.

  • @samoluka3386
    @samoluka3386 4 года назад

    Very comprehensive review @hardware unboxed. I've been anxiously waiting for your review

  • @etmasikewo
    @etmasikewo 4 года назад +19

    This video: Steve saying "6800 xt is better in 1440p and the 3080 is faster at 4k" in 20 different ways

    • @UnEn666
      @UnEn666 4 года назад +1

      Seems somewhat weird; many other reviews see them equal at 1440p, red winning at 1080p, green winning at 4K.

    • @jarrajoseph-mcgrath9142
      @jarrajoseph-mcgrath9142 4 года назад

      NyxNyxNyx Green 3080 at 1440p? Does this also count for the 3070?

    • @fofal
      @fofal 4 года назад

      He is using the 5950X for the 6800 XT results. That's OK for 4K but not 1440p, when the 3080 results are with a 10900K

    • @theeternal417
      @theeternal417 4 года назад +1

      I'm wondering how well all these reviews will hold up in the long run. If you compare results in recent games to the older ones Nvidia seems to be struggling. Dirt 5 and Valhalla in particular surprised me.

    • @fofal
      @fofal 4 года назад +1

      @@theeternal417 they are also done for 3080 using old drivers. I would rather he did less games but identical environments and up to date drivers

  • @Bajicoy
    @Bajicoy 4 года назад +4

    A really impressive jump in performance from AMD and even more impressive seeing the 6800XT pass the 3090 now and then. Looks like there will be a lot more all red builds in the future. 4k scaling is a bit dodgy with and without SAM but certainly the better choice for 1440p and overkill 1080p than the 3080.

  • @justgaming8867
    @justgaming8867 4 года назад

    I come to this channel first cuz we get straight benchmarks and no bullshit its just so time-saving. Thanks steve

  • @ValentineC137
    @ValentineC137 4 года назад +100

    "So, without wasting any time let's talk about the test system, and then jump into the blue bar graphs. As I know that's what most of you are here for."
    *_Very angry AHOC noises in the distance_*

    • @ExZ1te
      @ExZ1te 4 года назад +2

      Sorry but what is AHOC?

    • @pythonner3644
      @pythonner3644 4 года назад

      Valentine you are sus

    • @ValentineC137
      @ValentineC137 4 года назад +5

      @@ExZ1te Actually Hardcore OverClocking
      He recently made a video where he’s upset about the bars in bargraphs

    • @ValentineC137
      @ValentineC137 4 года назад +1

      @@pythonner3644 uh- I Saw green vent to medbay

    • @pythonner3644
      @pythonner3644 4 года назад +3

      @@ValentineC137 but I am green

  • @wolfgangjr74
    @wolfgangjr74 4 года назад +5

    Now to wait for the VR performance reviews. That's really gonna be the final benchmark telling me where to go.

    • @JoshS5811
      @JoshS5811 4 года назад

      Yeah, but who does VR performance reviews? I WISH I could find some!

    • @GholaTleilaxu
      @GholaTleilaxu 4 года назад

      "VR"...is that when you put a helmet on your head with a pair of small LCD displays very close to your eyes and keep it on for hours and hours? :)

    • @wolfgangjr74
      @wolfgangjr74 4 года назад

      @@GholaTleilaxu yup. Loads of fun.

  • @emdea
    @emdea 4 года назад +2

    Thank you thank you thank you Steve, you've answered the questions other reviewers failed to address for me. Only one question remains - GPU encoding

  • @viniqf
    @viniqf 4 года назад +25

    Imagine being able to buy one of these

  • @rangersmith4652
    @rangersmith4652 4 года назад +7

    Thirty seconds into sales going live, and the Newegg site is down. Here we go again, as expected. That said, is it just architecture differences that tend to give the 4K nod to Nvidia? Seems like somebody playing at 4K has a dilemma: the 3080 is faster but has less VRAM than the 6800XT. What to do?

    • @5izzy557
      @5izzy557 4 года назад

      Demand is just too high & they had to accommodate console silicon too, I think they have done well all things considered TBF

    • @ZackSNetwork
      @ZackSNetwork 4 года назад +4

      This will be the dark generation, I'm calling it. The vast majority of people will miss out on Zen 3, RDNA 2 and Ampere. By the time they are buyable through regular means we will have Zen 4, RDNA 3 and Hopper.

    • @vgamedude12
      @vgamedude12 4 года назад +2

      I would never buy a 10gb vram card for 4k

    • @Shipprofile08
      @Shipprofile08 4 года назад

      @@ZackSNetwork I wouldn't call this gen the dark age; if anything this is a golden age for tech! We had massive generational leaps in multiple tech areas (motherboards with PCIe 4.0, CPUs with Zen 3, and GPUs from both Nvidia and AMD). More people are buying computers than ever before for those reasons, and because people are stuck at home and realizing that 1 computer isn't enough for 5 people lol!

    • @rangersmith4652
      @rangersmith4652 4 года назад

      @@vgamedude12 Nor would I, but I don't game at 4K anyway. To me unless you can turn the graphics to max, there's no reason to game at 4K. I mean, 4K with settings dumbed down??? Opinions vary, and rightly so, but chasing high frame rates at 4k in games is a serious money pit. So it's 3440x1440 for me.

  • @cocoleclown9480
    @cocoleclown9480 4 года назад +1

    Thank you sir. That was very interesting, especially the SAM analysis. I was curious if the performance they claim was true. Your test showing PCIe 3.0 making no difference vs 4.0 is an extra that I really appreciated. Thank you for doing that test.

  • @ZAR556
    @ZAR556 4 года назад +7

    Man, Navi 2x truly is the Zen moment for Radeon
    Nicely done, AMD

  • @kiwiasian
    @kiwiasian 4 года назад +4

    Looking forward to seeing the SAM equivalent improvements for the 30-series 😃

    • @Dr.WhetFarts
      @Dr.WhetFarts 4 года назад +1

      Yep, and luckily Nvidia won't lock this to a single processor brand (or only the Ryzen 5000 series, like AMD does). Nvidia's "SAM" will support ALL CPU brands.
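
For context, SAM is AMD's branding of PCIe Resizable BAR, which lets the CPU address the GPU's entire VRAM instead of the traditional 256 MB window. A hedged, Linux-only sketch of how one might inspect a card's BAR sizes from sysfs to see whether the larger aperture is active (the PCI address below is a placeholder):

```python
# Resizable BAR / SAM check via Linux sysfs. Each line of a PCI device's
# "resource" file is "start end flags" in hex; region size = end - start + 1.
# With Resizable BAR active, one BAR spans the full VRAM (e.g. ~16 GB)
# instead of the usual 256 MB window. The device address below is a placeholder.

from pathlib import Path

def bar_sizes(pci_addr: str = "0000:03:00.0") -> list[int]:
    resource = Path(f"/sys/bus/pci/devices/{pci_addr}/resource").read_text()
    sizes = []
    for line in resource.splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:  # skip unused (all-zero) regions
            sizes.append(end - start + 1)
    return sizes

if __name__ == "__main__":
    for size in bar_sizes():
        print(f"{size / 2**20:.0f} MiB")
```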

  • @KhromTX
    @KhromTX 4 года назад

    Honestly, god bless you guys for using 1440p as the default benchmarking method shown in the charts, seriously, THANK You. It was so unnecessarily arduous / annoying just 6 months to a year ago finding 1440P only benchmarks for the games I care about. It was frustrating, so thank you for your hard work.

  • @kingsolo5112
    @kingsolo5112 4 года назад +37

    Holy smokes, amd really did do a Zen improvement with RDNA2. Imagine that bad boy with HBM.

    • @rdmz135
      @rdmz135 4 года назад +11

      Lol don't forget to imagine the price too

    • @HickoryDickory86
      @HickoryDickory86 4 года назад +9

      @Desktopia People have extremely short memories and horrifically limited attention spans.
      Anyway, it is true that RDNA is an exceptional GPU architecture and brought real competition to Nvidia that they had not had in a long time. The generational leap in performance from RDNA to RDNA 2 is nothing short of breathtaking and worthy of acclaim. And that AMD is targeting a similar ~50% increase in performance-per-watt with RDNA 3 is very exciting, but neither should detract from AMD's win here with RDNA 2.

    • @quoclien3343
      @quoclien3343 4 года назад +2

      Who needs HBM2 when RDNA2 is using GDDR6 while Ampere is using GDDR6X with higher bandwidth... yet AMD is still shoving it into every orifice Nvidia has

    • @eazen
      @eazen 4 года назад

      Nobody needs expensive HBM if you have the power of Infinity Cache. The old problem which AMD had since 2014 is now solved. No HBM needed anymore for the highend cards.

    • @kingsolo5112
      @kingsolo5112 4 года назад

      @@eazen Never said I wished it had it, I said imagine. The Infinity Cache can only help so much before the GPU has to evict from the cache and fetch more data from memory at higher resolutions. You can see how many frames it drops switching to 4K. Also, this takes a lot of space on the die (the Infinity Cache is estimated to take up about 6 billion transistors) which could've been used for other enhancements. Infinity Cache, though, is the next step in graphics; it is a great idea. HBM is just the overall better solution, just more expensive.
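
The "can only help so much" argument can be sketched as an effective-bandwidth estimate: a hit-rate-weighted blend of cache and VRAM bandwidth, where the hit rate drops as the working set grows with resolution. The cache bandwidth and hit rates below are illustrative assumptions, not AMD's figures:

```python
# Effective bandwidth with a large last-level GPU cache, modelled as a
# hit-rate-weighted blend of cache bandwidth and VRAM bandwidth.
# All numbers are illustrative assumptions used only to show the trend.

def effective_bandwidth(hit_rate: float, cache_gb_s: float, vram_gb_s: float) -> float:
    return hit_rate * cache_gb_s + (1.0 - hit_rate) * vram_gb_s

VRAM_BW = 512.0    # GB/s: 256-bit GDDR6 @ 16 Gbps
CACHE_BW = 1800.0  # GB/s: assumed on-die cache bandwidth

for res, hit in [("1080p", 0.75), ("1440p", 0.65), ("4K", 0.50)]:
    print(f"{res}: ~{effective_bandwidth(hit, CACHE_BW, VRAM_BW):.0f} GB/s effective")
# The blended figure shrinks toward plain VRAM bandwidth as the hit rate falls,
# which is one way to read the smaller lead (or deficit) at 4K.
```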

  • @Bestgameplayer10
    @Bestgameplayer10 4 года назад +14

    I’m surprised they no longer include RDRII in their tests.

    • @AngelicHunk
      @AngelicHunk 4 года назад +9

      If I remember right, it's extremely hard to test in a consistently controlled manner (without using the built in benchmark). I believe they said the built in benchmark isn't very representative of true gameplay, so they try to test along a route in game. However, a dynamic world with a day-night cycle can make consistency very off.
      Steve probably scrapped that test for time since there's so many product releases smashed together.

    • @gorky_vk
      @gorky_vk 4 года назад

      But I like that they include lots of new(er) games, something most other reviewers fail to do, benching new hardware on 3-4 year old titles instead.

    • @Bestgameplayer10
      @Bestgameplayer10 4 года назад +1

      @@AngelicHunk Thats fair. I mean, the most taxing area in the game i believe is Saint Denis but even then that area isn’t so bad in itself. But I’d still say the benchmark for RDRII is decently accurate.

    • @Bestgameplayer10
      @Bestgameplayer10 4 года назад

      @@gorky_vk Thats true. Like people who still test GTA V is kinda ridiculous. Nearly anything can run it as long as you aren’t tryna play on a laptop or something really old.

    • @GoodGuyGaurav
      @GoodGuyGaurav 4 года назад

      The spaghetti coding of the game probably contributed to this. They have yet to release a single update that hasn't introduced stability issues.

  • @TheDigitalThreat
    @TheDigitalThreat 3 года назад +1

    July 2021 and this card is now $400 cheaper than the 3080 in the pre-built R12s from Alienware/Dell. That's such a huge difference for essentially the same performance.

  • @RepsUp100
    @RepsUp100 4 года назад +70

    Andddd they're sold out

    • @AndrewTSq
      @AndrewTSq 4 года назад

      Biggest paper launch in history. And they were priced way above AMD's MSRP, making the RTX 3080 almost cheap.

  • @zhongxina2614
    @zhongxina2614 4 года назад +33

    *OUT OF STOCK* in not 3 minutes, not 3 second, *BUT IMMEDIATELY*

    • @srinathshettigar379
      @srinathshettigar379 4 года назад

      People triggering buy buttons instead of spraying bullets. lmao

    • @dorsalispenile9891
      @dorsalispenile9891 4 года назад +1

      Bruh these bots

    • @rickyrich93
      @rickyrich93 4 года назад +1

      Yeah, it sucks, but we'll all be able to buy the GPUs eventually. Just gotta wait.

    • @Argoon1981
      @Argoon1981 4 года назад +1

      @@rickyrich93 Exactly, the best time to buy GPUs is not on release day.

  • @EmblemParade
    @EmblemParade 4 года назад +2

    I can't wait to see these available for purchase in 2022! It's like getting a window into the future!

    • @Dr.WhetFarts
      @Dr.WhetFarts 4 года назад

      Nvidia 4000 series and AMD 7000 series...

  • @パラドックス-v7p
    @パラドックス-v7p 4 года назад +7

    Brooo, I wish every game would be as well optimized as Death Stranding

    • @jakeblargh
      @jakeblargh 4 года назад

      To be fair there's not really a lot going on in Death Stranding. The game looks gorgeous in 4k 60fps on a 65" HDR TV though.

    • @DrCid123
      @DrCid123 3 года назад

      Resident Evil as well. Gorgeous graphics too.

  • @alankar911
    @alankar911 4 года назад +4

    It would be great to see ultrawide benchmarks.

    • @amaurytoloza1511
      @amaurytoloza1511 4 года назад +1

      Not really, it's a pretty pointless resolution for gaming.

    • @SBOG4215
      @SBOG4215 4 года назад

      @@amaurytoloza1511 Have you got an UW? What makes you say that?

    • @amaurytoloza1511
      @amaurytoloza1511 4 года назад

      @@SBOG4215 I've got 4 monitors, three of them for work. The one for my gaming rig is an LG 27GL83A. Many games purposely don't support the 21:9 aspect ratio you find on UW monitors. Tell you what, consoles don't even support 1440p, let alone UW; it's really a pointless resolution for gaming. For productivity it has its uses though.

    • @wallacesousuke1433
      @wallacesousuke1433 4 года назад

      UW is trash

    • @yrh002b8
      @yrh002b8 4 года назад

      @@wallacesousuke1433 Nope, that's just your taste.

  • @GLIDGaming
    @GLIDGaming 4 года назад +2

    Guys, Nvidia offers a much more stable product with better infrastructure like GeForce Experience, game recording in 4K HDR 60 fps, etc. Nvidia's driver support is also better than AMD's. Overall it's good to see AMD become an equal at the top end. However, when you buy an Nvidia card you just get a much more stable, reliable and better all-round ecosystem. Also, you should have used the 10900K; it's a better gaming CPU!

  • @daniellockwood4568
    @daniellockwood4568 4 года назад +6

    Different results at 1080p and 1440p compared to Guru3D and TechPowerUp. Probably because of Dirt 5 favoring AMD heavily, an exactly 30% lead at 1440p (quick illustration below).
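
    A quick, purely illustrative example of how a single heavily AMD-favouring title can move a head-to-head average, which is one plausible reason different outlets' aggregates disagree. The per-game margins below are invented for the example; only the ~30% Dirt 5 figure comes from the comment above.

```python
# Tiny illustration of how one outlier title shifts a multi-game average margin.
# The per-game margins below are invented, not measured data.

margins = [1.02, 0.98, 1.03, 1.01, 0.99]   # 6800 XT vs 3080, roughly even titles
outlier = 1.30                              # one title with a ~30% AMD lead

def geomean(xs):
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1.0 / len(xs))

print(f"without outlier: {geomean(margins):.3f}x")
print(f"with outlier:    {geomean(margins + [outlier]):.3f}x")
# Adding or dropping one such game can swing the headline average by several percent.
```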

    • @choatus
      @choatus 4 года назад +3

      Yeah, something about this set of benchmarks doesn't seem to make sense to me.

    • @xhydrox
      @xhydrox 4 года назад +1

      I was thinking the same thing...

    • @bo-_t_-rs1152
      @bo-_t_-rs1152 4 года назад

      It's because this game is optimized to run best on RDNA 2, since both consoles use that platform and games are made for consoles first. So we can expect to see a lot more titles that run better on AMD, especially next year when true next-gen games come out.

    • @choatus
      @choatus 4 года назад +1

      @@bo-_t_-rs1152 Consoles have had AMD CPUs and GPUs for years, more than 4 generations, but that has never meant games were all better optimised for AMD hardware on PC... most ran better on Intel and Nvidia, and I expect it to remain that way for a long time yet.

    • @bo-_t_-rs1152
      @bo-_t_-rs1152 4 года назад

      @@choatus True, but the hardware in them was nowhere near the performance of the current gen. AMD was always behind Nvidia and Intel, but that's changed with Ryzen/RDNA 2 now.

  • @Mostan_Games
    @Mostan_Games 4 года назад +21

    11:30 look at the end of the list, the 2060 with 6fps avg, LMFAO

    • @smp219
      @smp219 4 года назад

      @@xtechtips152 vram

    • @MLWJ1993
      @MLWJ1993 4 года назад +1

      @@smp219 doesn't make all that much sense since the 980Ti runs into a similar limitation unless it somehow powers through that like the monster it was back then. 🤔

    • @smp219
      @smp219 4 года назад +1

      @@MLWJ1993 that was history

    • @Hardwareunboxed
      @Hardwareunboxed  4 года назад +2

      @@MLWJ1993 ROPs

  • @errorclone3
    @errorclone3 4 года назад +1

    By far the best benchmark review again as always! Love you guys :-*

  • @anshanchaliya1383
    @anshanchaliya1383 4 года назад +4

    11:39 the 2060 was a PowerPoint presentation

    • @MLWJ1993
      @MLWJ1993 4 года назад +1

      2060* the 2060 *super* had >60fps average ending up just below the 2070.

    • @anshanchaliya1383
      @anshanchaliya1383 4 года назад

      @@MLWJ1993 ok bro I just saw it and edited the comment

  • @gjmcgee
    @gjmcgee 4 года назад +6

    14:45 - “Personally I care very little about ray tracing support since I feel there are no games where it is worth enabling.”
    Laughs in Minecraft RTX

  • @smarthousetech8593
    @smarthousetech8593 4 года назад +1

    The amount of work you guys did....clap that up people 👏

  • @InvntdXNEWROMAN
    @InvntdXNEWROMAN 4 года назад +4

    And now I'm just patiently waiting for the 3080 Ti release.

    • @SirKakalaCh
      @SirKakalaCh 4 года назад +1

      Yeah, the 4K results were mediocre at best. Let's wait and see the 6900 XT too, who knows, AMD might pull a last-minute prank on Nvidia :D Otherwise, 3080 Ti here we come, baby.

  • @andrezunido
    @andrezunido 4 года назад +13

    Nice review, it seems NVIDIA has tight competition this "generation" of cards. I disagree with your RT assessment though. RT is relevant, else AMD wouldn't have added it.

    • @Bestgameplayer10
      @Bestgameplayer10 4 года назад +4

      It’s relevant but hardly. It’s implementation is games is few and far between. And what games do have it, the performance loss in exchange has hardly been worth it. I’m with Steve that it hardly matters in regards to which GPU is worth buying. The only reason why NVIDIA can tout about it at all is because of DLSS kind of “cheating” performance.
      Maybe in another 5 years it’ll be semi-standard in games and reasonable to run but as of now, not really.

    • @magottyk
      @magottyk 4 года назад +1

      Immature technologies like ray/path tracing get supported more as they mature. We're still in hybrid territory and it's really just added bling at this stage; maybe in another two generations, or by the next console generation, it'll be as significant as rasterisation capability. Just look at Control at 4K, which runs at 36 FPS on a 3080 and needs DLSS to get to 60 FPS, and that's still a hybrid implementation. Fortnite with full RT effects (still hybrid) is even worse at 23 FPS, or 42 FPS with DLSS.
      REF: ruclips.net/video/nX3W7Sx4l78/видео.html
      If it were as relevant as you say, AMD would have pushed their ray tracing implementation much further than just matching the RTX 2000 series. They've implemented just enough so that you can experience hybrid real-time ray tracing, but not so much that it becomes a primary selling point, because we aren't there yet.

    • @xophaser
      @xophaser 4 года назад

      Cyberpunk 2077 is RT heaven. You know CGI studios use ray-traced render farms to look more realistic than plain raster.

    • @rorzerkthelurk
      @rorzerkthelurk 4 года назад +6

      Spot on. Yeah I'm happy AMD is catching up but anyone who says RT won't matter during this generation of cards is kidding themselves. Nvidia 100% this gen, and this isn't even including extra software features and better drivers.

    • @devilmikey00
      @devilmikey00 4 года назад +2

      RT will be a thing now that consoles can use it; that's always what holds technologies like this back. We're going to see RT in just about every AAA game going forward, and AMD being so far behind is not great. We've also yet to see whether AMD's DLSS equivalent is any good, and given how terrible first-gen DLSS was, I'm not holding my breath.

  • @towely8517
    @towely8517 4 года назад +1

    Why do you think the FPS data that you are showing is so different to that of GamersNexus and Jayztwocents? Do you think it is the difference between enabling Ray Tracing and not? Or could it be something to do with the memory or other tuning of your test bench favouring AMD hardware somehow?

    • @towely8517
      @towely8517 4 года назад

      (I’m assuming that it surely can’t be something as obvious as a driver version)

  • @barrycheesemore2928
    @barrycheesemore2928 4 года назад +4

    Damn, that 6800XT was seriously impressive, and waaaay better than I was expecting!! And excellent video as always guys, keep up the good work!!

  • @Mi2Lethal
    @Mi2Lethal 4 года назад +11

    Nvidia's days of being the performance king are numbered. Now they have to compete on price AND performance.

  • @Sneedoss
    @Sneedoss 4 года назад +1

    Is there info on the test bench? Your SAM results are very different from other reviewers. Keep up the good work! Hi from Bacchus Marsh Vic!

    • @highlanderknight
      @highlanderknight 4 года назад +1

      I've noticed this Hardware Unboxed review is a bit different than reviews from other sites. Steve also seems to just 'dismiss' ray tracing as if it's a passing fad.

  • @PaulSebastianM
    @PaulSebastianM 4 года назад +4

    AMD caught up with Intel and surpassed them, and now they've caught up with Nvidia too! Awesome 👍.

  • @StefanEtienneTheVerrgeRep
    @StefanEtienneTheVerrgeRep 4 года назад +19

    See how much confidence makes a difference? It's like getting a new table.

    • @edgain1502
      @edgain1502 4 года назад +1

      Or like building a new PC, lel

    • @UltimateAlgorithm
      @UltimateAlgorithm 4 года назад +2

      Please build a new PC, I'm kind of bored and need some good entertainment.

  • @JustinGoffinet
    @JustinGoffinet 4 года назад

    Just want to say THANK YOU for having the 3950X in your test bench, you keep apologizing for it, but it's been a very welcome data point in reference to other reviewers, and closer to what a lot of your Ryzen 3000 viewers are still packing.

  • @stangamer706
    @stangamer706 4 года назад +11

    Thanks a lot for another excellent review, Steve!
    They must make SAM available to all Ryzen users across the whole AM4 platform, no matter what motherboards or CPUs people use. I hope Nvidia will force AMD to do this soon.
    Anyway, I am impressed! AMD has finally become a really dangerous competitor to both Intel and Nvidia, and ironically it happened almost at the same time, with the release of Zen 3 and RDNA 2. It will make things a lot more interesting!

    • @sebastiancusis4650
      @sebastiancusis4650 4 года назад +1

      I agree, these new cards are clearly a very good alternative to the RTX 3000 series. However, you also have to think about the future, not just what we have right now. It's obvious that the trend for (AAA) games is ray tracing, so you can't say anymore that the number of games with ray tracing is limited; it's just going to increase. It's a fact that ray tracing on AMD cards is inferior to Nvidia's at the moment. Furthermore, they lack AI features like the Nvidia Broadcast app, which are pretty cool or even needed if you want to stream. If you don't care about streaming or ray tracing, sure, AMD is the better option. But like I said, more and more games are coming with better visuals, and most people will be stuck with their AMD card for a while... I think AMD will crush Nvidia with their next generation of RX cards though.

    • @haukionkannel
      @haukionkannel 4 года назад

      It depends on whether older motherboards can manage it. Even Nvidia said they will only do it for new AMD motherboards, so older ones may lack some feature that is needed!

    • @UltimateAlgorithm
      @UltimateAlgorithm 4 года назад +1

      My guess is SAM isn't that good when running on PCIe 3.0. I could be wrong though, just wait for Nvidia or AMD to enable it on all platforms.

    • @inrptn
      @inrptn 4 года назад +5

      @@UltimateAlgorithm Steve said in the video there was no difference with SAM performance in PCIe 3 vs. 4.

    • @UltimateAlgorithm
      @UltimateAlgorithm 4 года назад +1

      @@inrptn How can he know that? SAM doesn't even run on PCIe 3.0 right now; only Ryzen 5000 series on an X570 is supported, which guarantees PCIe 4.0. For more SAM benchmarks you can look at Hardware Unboxed. Some games see a massive improvement, like Assassin's Creed Valhalla.

  • @scipionyx
    @scipionyx 4 года назад +4

    I wonder how much of a difference the 6900 XT will make

    • @kennethd4958
      @kennethd4958 4 года назад +8

      With these numbers I would say it’s going to stomp everything out right now.

  • @odderphase
    @odderphase 4 года назад

    Finally, a thorough and clear 6800 XT review. I'm tired of "big" reviewers making points with 3DMark benchmarks and 5-6 games, of which 2 are 5+ years old. Who will buy these cards to play GTA V anyway???

  • @Wusyaname_
    @Wusyaname_ 4 года назад +7

    Ugh. Disappointed with the encoder, the RT performance, and the lack of software like RTX Voice, etc. Everything else is great though

    • @amaurytoloza1511
      @amaurytoloza1511 4 года назад +1

      AMD's video encoder is rubbish for sure

    • @Wusyaname_
      @Wusyaname_ 4 года назад

      @Melburn Sir
      Yeah, but for the games where it does matter, like Minecraft, it sucks.

    • @wallacesousuke1433
      @wallacesousuke1433 4 года назад +1

      @@Wusyaname_ lmao Minecraft

    • @xhydrox
      @xhydrox 4 года назад +1

      Or Red Dead Redemption 2, or Control, or Cyberpunk, or Metro, or Battlefield, or any of the other titles that support it. That's not even mentioning DLSS, where AMD gets shit on :)

  • @androsforever500
    @androsforever500 4 года назад +9

    If AMD had a DLSS 2.0 competitor I could see the 6800 XT as a great competitor to the 3080; without it, there is just too much gained with that technology to switch from team Nvidia

    • @jagoob
      @jagoob 4 года назад +1

      Yes, but only if the games you play support it. And if you play at 1440p 144Hz with no ray tracing, it's mostly not needed anymore.

    • @TK-ev
      @TK-ev 4 года назад +2

      @@jagoob that's the thing. DLSS is only going to get more and more optimized for a wider variety of games. And people with those cards will love the dedicated cores for minor ai workloads and the improving raytracing tech which is here to stay.

    • @kostisap9350
      @kostisap9350 4 года назад

      If you have a 1080p screen then DLSS isn't needed; at 1440p or 4K, yeah, fine. AMD has super sampling or something like that and said they will release it in the near future. Judging by Nvidia's first gen of DLSS (which was shit) though, I don't know what to expect.

    • @williammurphy1674
      @williammurphy1674 4 года назад +2

      @@kostisap9350 Why judge DLSS technology by DLSS 1.0 (which was sh!t) when DLSS 2.0 already blows that first version out of the water? If DLSS 3.0 delivers even half the improvement that DLSS 2.0 did, it will be something quite special imo...

    • @androsforever500
      @androsforever500 4 года назад +1

      @@williammurphy1674 Yea, if they manage to improve even further it would be phenomenal. Even with DLSS 2.0 as it is now the game looks way better than with TAA or other AA methods and performs way better. I am enjoying Control so much on my 2k 240hz monitor right now with my 2070 super, getting steady 80+ framerates with gsync enabled. Such a smooth and beautiful experience!

  • @FerrumBellator
    @FerrumBellator 4 года назад +1

    In Canada there was no stock at the only major retailer in my city I could buy from locally. I picked up my pre-ordered Nvidia card, and the retailer told me they had more orders for 3080s today than any other day. I was considering the 6800 XT, but for me it was whatever came in stock first, so Nvidia got my money.

    • @syahmimi1425
      @syahmimi1425 4 года назад

      Why does Nvidia suddenly have stock?

    • @FerrumBellator
      @FerrumBellator 4 года назад

      @@syahmimi1425 They didn't get any extra stock; I've been waiting weeks for the card I back-ordered. The sales associate told me they had more back orders placed for 3080s today than any other day. They did not get more stock, just more back orders.

  • @Night-Fox-XTX
    @Night-Fox-XTX 4 года назад +8

    Well-done AMD 🔥🔥🔥🔥🔥🔥🔥

  • @kimnkk
    @kimnkk 4 года назад +10

    You mention other RT games like Watch Dogs: Legion and Control, but don't include benchmarks of them lol

    • @devilmikey00
      @devilmikey00 4 года назад +1

      I'll give you the scoop. Nvidia would have slaughtered AMD, because the RT stuff isn't just a little bit behind, it's WAY behind, and AMD has no DLSS equivalent yet. The review was dismissing RT, so why bother showing it?

  • @johanvirebrand7196
    @johanvirebrand7196 4 года назад +1

    Great benchmarks and work! :)

  • @Paisa231
    @Paisa231 4 года назад +4

    Stock levels here in Northern Europe were sufficient; I had 7 minutes to pick one up :D But I missed some AIB brands... And on pricing, the 6800 XT was easily the best buy.

  • @laggisch
    @laggisch 4 года назад +40

    Aaaaaand.... they're gone. I'm from Germany

    • @haukionkannel
      @haukionkannel 4 года назад +2

      Well, we would all be really surprised if they hadn't!
      ;)

    • @Hadw1n
      @Hadw1n 4 года назад

      A buddy of mine got one :D

  • @caliptus85
    @caliptus85 4 года назад

    YOU ARE THE BEAST OF BENCHMARKING!!! Awesome job as always, man.

  • @cosmic_drew
    @cosmic_drew 4 года назад +7

    This review vs LTT's review makes me feel like I'm living in 2 different worlds. Their conclusions are so far apart.

    • @far_tech5555
      @far_tech5555 4 года назад +8

      Watch all the reviews and draw your own conclusion. It doesn't matter what they say, it's only their opinion. For me, playing at 3440x1440, the 4K results are the decisive comparison, and I do care about RT and DLSS. Also, look at the games they pick for the test: I am not going to play any of those AMD-sponsored titles, so I care about the titles I'm actually interested in. From my perspective, the 3080 is the go-to option. But no one would regret buying a 6800-series card, because they are pretty darn good.

    • @cosmic_drew
      @cosmic_drew 4 года назад +3

      @@far_tech5555 That's what I do every launch, and not just on RUclips; there are other places to help inform a buying decision.

    • @MrHakisak
      @MrHakisak 4 года назад +3

      I don't even need to watch LTT's review to know they have once again created a super click-baity title and conclusion. Never watch LTT for an actual product review; it's more of a drama channel for the masses.

    • @isakh8565
      @isakh8565 4 года назад +2

      HUB focused almost exclusively on rasterized price/perf, while LTT put a heavy emphasis on ray tracing performance and features. I wouldn't say either is right or wrong, just completely different priorities.

    • @Rickbearcat
      @Rickbearcat 4 года назад

      That is because Steve is 100% an AMD fanboy. He won't admit that these AMD cards lose to Nvidia. He smooths over ruffled feathers with statements like: "And here the 6800XT does fall behind the 3080 by a 15% margin, which is quite a substantial difference, [THOUGH 105 FPS ON AVERAGE AT 4K IS STILL QUITE IMPRESSIVE]".