AMD Radeon RX 6800 Review, Best Value High-End GPU?

  • Published: 10 Sep 2024

Comments • 2.6K

  • @MadSupra354
    @MadSupra354 3 года назад +1009

    "I'm not gonna go into all the specs"
    _Goes into all the specs_

    • @TheNike0223
      @TheNike0223 3 года назад +20

      Don't listen to what people say, look at what people do.

    • @d15p4tch6
      @d15p4tch6 3 года назад +10

      @@TheNike0223 don't listen to what people say, listen to what they say later...?

    • @TheNike0223
      @TheNike0223 3 года назад +2

      @@d15p4tch6 Hi there, I meant: "I'm not gonna go into all the specs"
      _Goes into all the specs_. He stated one thing but did another.

    • @WickedRibbon
      @WickedRibbon 3 года назад +7

      😒 Hardly.
      There's a big difference between providing a very brief overview that lasted all of 2 minutes and an entire video covering the RDNA 2 specs that lasted 18.

    • @anomanderrake3593
      @anomanderrake3593 3 года назад +3

      Future prime minister.

  • @Sunlight91
    @Sunlight91 3 года назад +1085

    You have the best bar graphs on RUclips.

    • @iamspencerx
      @iamspencerx 3 года назад +82

      Of all the tech reviewers, many of whom also have great content, Hardware Unboxed edges ahead by having the easiest-to-read graphs.

    • @xSyn08
      @xSyn08 3 года назад +24

      *Buildzoid screeching in the distance*

    • @joeschmoe5009
      @joeschmoe5009 3 года назад +4

      How dare you not mention Gamers Nexus! I don't know that any one of them is better, but they're all really good and thorough.

    • @theprophet2444
      @theprophet2444 3 года назад +44

      @@joeschmoe5009 GN is fine, but Steve puts too much info into his graphs; they're great when paused but not so nice when trying to keep up with the video.

    • @toxiccharger8121
      @toxiccharger8121 3 года назад +1

      Fax

  • @jimmy9916
    @jimmy9916 3 года назад +732

    The 6800 XT currently costs the same as a 3070 in Australia. It's insane.

    • @johngray3449
      @johngray3449 3 года назад +52

      The 6800 is also only $100 AUD less than the XT, $849, and this would make more sense.

    • @kh0a87
      @kh0a87 3 года назад +40

      I bought a Gigabyte 3070 Gaming OC yesterday. Good thing Scorptec hadn't shipped it yet. Placed an early pre-order for an MSI 6800 XT instead. Got both a 3080 FTW3 and a 6800 XT on pre-order. Waiting is a bitch.

    • @jimmy9916
      @jimmy9916 3 года назад +1

      @@kh0a87 I missed the preorder. Hoping they get more coming soon but not holding my breath.

    • @sausageskin
      @sausageskin 3 года назад +24

      Yeah, that's crazy. In Scorptec, the cheapest NVIDIA 3070 is $970 while AMD 6800 is $949 (AUD). Of course, nothing is in stock, so 🤣

    • @TT-bs2vm
      @TT-bs2vm 3 года назад +5

      @jimmy Same in NZ

  • @hellonearth6411
    @hellonearth6411 3 года назад +580

    Graphics card prices are disgusting at the moment. Here in the UK even a 2060 Super is well over £400. GPU prices in general are just not good. 😕

    • @stefanospap6288
      @stefanospap6288 3 года назад +9

      Wait for Navi 22 in early '21: 2080 Super performance for $300 from AMD.

    • @davida8119
      @davida8119 3 года назад +17

      Get a used 1080 Ti for £300 off eBay: cheaper, 11GB of VRAM, and faster than a 2060 Super.

    • @stefanospap6288
      @stefanospap6288 3 года назад +70

      @@davida8119 don't do that... Obsolete tech.

    • @nonaccettanullah301
      @nonaccettanullah301 3 года назад +11

      here in Italy I found a 6800xt available on ebay for only 10270€ XD

    • @zokuva
      @zokuva 3 года назад +24

      @@stefanospap6288 honestly, who cares, as long as it performs well

  • @Kailhun
    @Kailhun 3 года назад +249

    State of play: 3070, 3080, 6800, 6800XT: unavailable and priced up the wazoo.

    • @LiraeNoir
      @LiraeNoir 3 года назад +23

      And it's not as if their MSRPs were great in the first place. Nvidia's Turing price anchoring worked _very well_; even reviewers forgot the prices of Maxwell, Pascal, and AMD at that time.

    • @Drothen-
      @Drothen- 3 года назад +1

      @SmashStomp Inc you cheeky fox

    • @WCIIIReiniger
      @WCIIIReiniger 3 года назад +2

      @@LiraeNoir I cannot hammer the thumbs-up button enough.
      The graphics card market is so fucked up.
      I laughed today: I saw someone selling a broken 1080 Ti (blower model) for 200€. That is only one of many examples I see daily, since I'm looking for a good used card right now.
      Let that sink in.

    • @lindigj
      @lindigj 3 года назад +2

      @@WCIIIReiniger keep in mind, prices haven’t really dropped in the used market yet. Due to COVID, prices are still high because of demand, and the new cards being unavailable means their performance has not yet reset the market. Once these new cards actually hit shelves and you can just walk in and buy one, 1080 Ti will for sure drop. You’re just too early.

    • @klyplays
      @klyplays 3 года назад

      Intentionally out of stock so they can sell those through unofficial channels and make more. Hope people actually recognize the scam by these billion-dollar companies and stop worshipping daddy Jensen / mommy Su.

  • @NeVErseeNMeLikEDis
    @NeVErseeNMeLikEDis 3 года назад +452

    This is how you benchmark,
    not with 3-7 games
    like 80% of YouTubers do.

    • @cipcpa140
      @cipcpa140 3 года назад +43

      Yeah, that's what I noticed too: benchmark 3-5 games, then call it a day and declare that the 3070 is faster than the 6800.

    • @RajPlayZ75
      @RajPlayZ75 3 года назад +23

      True, and of those 3-7 games, 2-3 are titles nobody cares about.

    • @ShadowRomeo1784
      @ShadowRomeo1784 3 года назад +8

      @@cipcpa140 Never seen any benchmarks that showed the 3070 being faster than the 6800, unless you're talking about RT performance. Most averages I saw had the 6800 5-10% better than the 3070, and in the best case here with HW Unboxed, 10-15%.

    • @NeVErseeNMeLikEDis
      @NeVErseeNMeLikEDis 3 года назад +15

      @@Carrera92 Probably when AMD comes out with their own version, they will test both.

    • @katech6020
      @katech6020 3 года назад +10

      @@NeVErseeNMeLikEDis yes Tim will try to do an image quality comparison between both

  • @Ashachi
    @Ashachi 3 года назад +548

    This is great and all but...
    I'm poor. Where are the $150-$250 cards? I just want something better than my RX 580, something that performs like a GTX 1080 and won't cost me an arm and a leg.

    • @TanvirKhan-
      @TanvirKhan- 3 года назад +116

      Wait for 3050, 3050ti, RX 6500, RX 6500XT

    • @sirab3ee198
      @sirab3ee198 3 года назад +122

      Buy a second hand GTX1080 :)

    • @VascovanZeller
      @VascovanZeller 3 года назад +47

      Used market or wait

    • @perfectdiversion
      @perfectdiversion 3 года назад +11

      Save up and get something current gen

    • @henrylau1643
      @henrylau1643 3 года назад +65

      The RX 5600 XT costs around 279 USD, and its performance is similar to a GTX 1080 / Vega 64.

  • @kane31604
    @kane31604 3 года назад +279

    It looks like my RUclips captions have settled on Harvard Unboxed for the last couple of videos.

    • @ts6603
      @ts6603 3 года назад +3

      harvard peplaugh

    • @bxdroid
      @bxdroid 3 года назад +1

      So...unboxing videos of stuff from Harvard University?

    • @Raivo_K
      @Raivo_K 3 года назад +9

      I feel smarter already.

    • @TheBIOSStar
      @TheBIOSStar 3 года назад +5

      @@bxdroid No, the actual Harvard University unboxed.

    • @alshahriar6230
      @alshahriar6230 3 года назад +1

      @@TheBIOSStar I am curious how they managed to get a box the size of Harvard University

  • @JohnyMcNeal
    @JohnyMcNeal 3 года назад +195

    Finally someone calling it what it is for the price: a "high-end" GPU. If someone else calls it mainstream I'm gonna go nuts. Even if it were selling for MSRP at the moment, which is far from reality...

    • @DaFunkz
      @DaFunkz 3 года назад +62

      A lot of tech bros are out of touch with the reality of mainstream affordable cards.

    • @devindykstra
      @devindykstra 3 года назад +56

      Exactly! Just because the top of the line is $1500, that does not mean a $700 GPU is "mid range."

    • @DeadNoob451
      @DeadNoob451 3 года назад +24

      @@devindykstra Thing is that Nvidia's xx70 cards use their midrange GPU chip, the 104 die to be precise. It's just that Nvidia started charging $500-600 for their midrange hardware.
      So yeah, 100% with you that these prices are faaar away from being midrange, but the hardware inside is midrange. They are just about $200 above sane prices for these cards.
      The same thing really goes for the entry level (i.e. xx60 cards), which now cost the same as midrange cards.

    • @Krazie-Ivan
      @Krazie-Ivan 3 года назад +19

      agreed - high-end prices for mid-range silicon, sadly. No getting around the fact this is a 3rd-tier product. Even the $1500 3090 is a 2nd-tier chip, as we still haven't seen the uncut Titan, so that makes the $500 3070 a 4th-tier product (which used to be $180-250)... pretty disgusting.

    • @devindykstra
      @devindykstra 3 года назад +4

      @@DeadNoob451 yeah, in terms of silicon there is currently a 100, 102, and 104 die. They will likely make 106, 107, and 108 dies, so in that sense the 104 die (and therefore the RTX 3070) is mid-range.
      That being said, the 100 die is nearly exclusively used in massive data centers, making the 102 die the consumer top-of-the-line. That would leave the 106 as the mid-range, which I believe is much closer to where the mid range should be.

  • @dean8525
    @dean8525 2 года назад +20

    I just bought my 6800 today. I'm so glad I went with this over the 3070.

    • @camotech1314
      @camotech1314 2 года назад +5

      Same, got a Sapphire reference RX 6800 coming next week. Cannot wait to upgrade from my 3060 Ti FE; the Nvidia Control Panel is giving me 2001 flashbacks!

    • @LofiBtz
      @LofiBtz Год назад

      Got the ASUS TUF GAMING version, I'm so glad I did my research and didn't buy the 3070 :) Any tips for the card? I'm brand new to AMD cards.

    • @kaapuuu
      @kaapuuu Год назад

      Yes, you're even more glad now. Check what happened to those 8 gigs on the 3070... it's deeeead.

    • @stefannita3439
      @stefannita3439 6 месяцев назад +2

      Grabbed a second-hand one-year-old 6800 XT the other day; after also selling my 3070 I'm only out about 70 bucks. Super happy to have made the jump

  • @nicholaslemire7515
    @nicholaslemire7515 3 года назад +15

    I got a 6800 yesterday at microcenter. Waited 20 hours for it and it seems like the XT will be very hard to get for a long time as my store only had 2. I wanted the XT, but the 6800 is still a beast and I game at 1440p so it’s still great for me 👍

  • @CaptnDeadpool
    @CaptnDeadpool 3 года назад +168

    My used, ex-mining RX 580 for 100 bucks has the same amount of VRAM as the 3070. It should have had 10GB for that price, then 12GB for the 3080 and 16GB for the inevitable 3080 Ti.

    • @emulation2369
      @emulation2369 3 года назад +70

      They did it on purpose, so they can push new SUPER cards with more VRAM and slightly higher clocks...

    • @piotrj333
      @piotrj333 3 года назад +20

      And what's wrong with that amount of VRAM? Considering Nvidia already wins at 4K with less VRAM, AMD is going to have trouble in practice. What is the point of 16GB of VRAM if the card is too slow by the time VRAM requirements rise?
      It has been true since the ancient GeForce 4 days that VRAM speed matters much more than VRAM capacity, and actual VRAM usage is really low (much of it is used for caching, which is why games sometimes report high VRAM usage). Heck, the developer tools in Flight Simulator 2020 on max settings show at most 4GB of real usage, even though the amount allocated for caching is much higher. At worst, future GeForce users will have to drop textures slightly from ultra to high, and that is where the problems end, while Radeon Big Navi users won't be able to take advantage of those ultra textures anyway due to the lower VRAM speed.
      The only truly future-proof card here is the overpriced 3090, because it has both more VRAM and faster VRAM (wider memory bus, GDDR6X rated higher than the 3080's).

    • @SirKakalaCh
      @SirKakalaCh 3 года назад +53

      @@piotrj333 If you weren't looking at ancient times and instead at the future, you would know that games will demand a ton more VRAM in the next year with next-gen titles. The 3080 was a quick cash grab and a fuckup for Nvidia; if AMD can remotely answer DLSS, they win, and the 3080 will be obsolete at 4K in about a year. You will need to upgrade to the next series, but you can get long mileage out of AMD cards thanks to the huge VRAM advantage.

    • @arcanicc
      @arcanicc 3 года назад +15

      @@SirKakalaCh VRAM is just a marketing gimmick from AMD, let's face it. The engineers knew 8GB is enough for recent game titles; VRAM usage hardly goes beyond 4GB or 6GB for most games. By the time games take up 16GB (which I highly doubt they will), your GPU core is gonna be obsolete anyway...

    • @Gadtkaz
      @Gadtkaz 3 года назад +21

      This reminds me of the 8gb 390's back in the day. Sure as hell didn't stop people from buying 3.5gb 970's...

  • @Sonic6293
    @Sonic6293 Год назад +4

    More than two years later, and with the release of the RX 7800 imminent in August 2023, I don't know if that new card is gonna be relevant. This one is still a beast, even against some of NVidia's latest products. It's probably the most hidden gem from last generation.

  • @dipak002
    @dipak002 3 года назад +34

    I am totally speechless after seeing the 6800's performance and how it stacks up against the 3000-series lineup. When the RTX 3000 series launched, I wondered how AMD was going to compete with the mighty performance of the 3070 and 3080 cards, BUT AMD not only competed with these cards but even did better in the same price range. HATS OFF, AMD! I feel so proud of you guys. You are truly game-changers, fighting and winning on two different fronts (CPU and GPU). This is truly remarkable. WOWWW!

    • @spoton5981
      @spoton5981 Год назад +1

      Not only that, BUT DANG, the power draw difference, especially if undervolted. The 6800 becomes a sub-200W card in most games.

  • @yvalson6534
    @yvalson6534 3 года назад +135

    Competition has returned. And unlike Intel, Nvidia will actually fight AMD instead of flopping around like a Magikarp.

    • @gewdvibes
      @gewdvibes 3 года назад +5

      Yea I think the 4080 might be another 2x jump over 3080 lol nvidia is not going to sit around

    • @mat.quantum2671
      @mat.quantum2671 3 года назад +6

      There’s going to be Ti or Super variants for all of these cards very soon, mark my words. That’s why there’s no stock yet, they’re saving dies to compete with AMD when they need to.

    • @yvalson6534
      @yvalson6534 3 года назад +8

      @@gewdvibes now let's hope AMD sticks to their new performance level competition and fights nvidia's next gen as well.

    • @haukionkannel
      @haukionkannel 3 года назад +3

      @@gewdvibes 2x... nope... but 20 to 30% is quite possible, and more in ray tracing...

    • @gewdvibes
      @gewdvibes 3 года назад +2

      @@haukionkannel the 3080 was a 70-100% jump over 2080, why wouldn’t 4080 be the same? Especially with AMD actually competing now

  • @ellypsis603
    @ellypsis603 3 года назад +59

    The 3070 reminds me of the 670: it was great at the start, but the 2GB of VRAM made it obsolete really quickly.

    • @thetechnocrat6388
      @thetechnocrat6388 3 года назад

      It is like the 970 4gb tho...

    • @AdaaDK
      @AdaaDK 3 года назад +2

      I got a 670 cheap used when the 780s came out and ran that thing until almost the end of the 900 series' life, when I upgraded via a Black Friday deal on a 970. I don't think it was that bad.

    • @wobblysauce
      @wobblysauce 3 года назад

      Ya, I got the wind force and was able to OC it higher than the 680.

    • @ShadowRomeo1784
      @ShadowRomeo1784 3 года назад +3

      It reminds me more of the GTX 970/980 vs R9 390 debate. Many fanboys vouched for the R9 390 because of its 8GB of VRAM, yet the 970 and 980 still outsold it by a huge margin, and their VRAM didn't become obsolete as many AMD fans anticipated; even today they're still playable in the majority of games at 1080p. I expect the same with the RTX 3070 and RX 6800 at 1440p. At 4K you shouldn't get either card anyway, as they're not made for 4K gaming.

    • @SpeedRebirth
      @SpeedRebirth 3 года назад +18

      @@ShadowRomeo1784
      6800 not made for 4k?
      When the 1080Ti came out it was considered the only 4k capable gpu.
      When the 2080 Ti came it was considered the only 4k HIGH fps gpu.
      Now, we have the 6800 that performs 15% better than the 2080Ti and has 16GB ram, and you're telling us it's not a 4k capable card?
      Yes, you are wrong on that one :)

  • @kitark06
    @kitark06 Год назад +3

    You have amazing foresight, predicting and warning us buyers about VRAM issues when nobody even considered them a problem!

  • @drandersjiang
    @drandersjiang 3 года назад +77

    The performance of any card that you can't buy is zero.

    • @Drothen-
      @Drothen- 3 года назад +8

      Wise words my enlightened friend.

    • @JetskiSlumpt
      @JetskiSlumpt 3 года назад +1

      You heard the man, wait until the 25th and get your refresh key ready

    • @boredgunner
      @boredgunner 3 года назад +1

      @@JetskiSlumpt 25th of November, 2021 more like

    • @johnm2012
      @johnm2012 3 года назад

      By the same reckoning, the cost of any card you can't buy is also zero.

  • @pontusheimdahl5353
    @pontusheimdahl5353 3 года назад +2

    As one of the few people who have actually bought an RX 6800: it's great. Overclocked, it often nearly matches the 6800 XT, only 2 FPS behind in Horizon Zero Dawn at 115 FPS.

  • @hecticyak
    @hecticyak 3 года назад +7

    Wow, this review reached a much different conclusion than most others I've seen; most of them showed the 6800 roughly tied with the 3070 in performance. I'm assuming this is down to the selection of games tested. This channel seems to test a much wider variety of titles, and I can't imagine how long that must take. Much appreciated.

  • @debicadude
    @debicadude 3 года назад +150

    Holy moly, that's really impressive! The market is gonna be as competitive as ever!
    AMD is attacking on all fronts: CPU, consoles, GPU. The next few years are gonna be lit!

    • @Carebear_Pooh
      @Carebear_Pooh 3 года назад +23

      Disappointing though that AMD is happily going along with the sky-high prices that nVidia introduced.

    • @nossy232323
      @nossy232323 3 года назад

      We will probably have to wait as long before the parts become available.

    • @aayushvijayvergiya7252
      @aayushvijayvergiya7252 3 года назад +3

      Wait for Intel to arrive in 2021 with their GPU. Hopefully they can compete at least in the midrange; that will create even more competition. Three GPU companies, way to go.

    • @wisdoom9153
      @wisdoom9153 3 года назад +5

      @@Carebear_Pooh This is mid-to-high-end territory, what do you expect? $300?
      I do agree, though, that the RX 6800's price could be lower... eh, but what am I? I'm not a businessman; I know jack shit about business.

    • @Pixel_FX
      @Pixel_FX 3 года назад +18

      @@Carebear_Pooh nvidia fanboys are to blame. why did they buy those overpriced turing cards :(

  • @dio4296
    @dio4296 3 года назад +73

    Battle of the non-existent video cards.

    • @mattbutalt723
      @mattbutalt723 3 года назад +1

      Would be a faster battle if you didn't randomly stop ti- *BBBZzzzzzz*

    • @xgunxsh0txpr0x
      @xgunxsh0txpr0x 3 года назад

      Yare yare daze..

  • @jobidi99
    @jobidi99 3 года назад +92

    I'm just wondering how the market prices will settle in the end and whether both the 3070 and 6800 will be available long term at MSRP

    • @amaurytoloza1511
      @amaurytoloza1511 3 года назад +14

      I am guessing we'll be wondering for a long time

    • @AndrewTSq
      @AndrewTSq 3 года назад

      Swedish retailers have already told us that the third-party 6800 and 6800 XT cards will be way more expensive than this first batch, which is not for sale anymore. They even removed the AMD cards from their site, since they won't get any more reference cards now that they aren't being made.

    • @tuulofdstrxn
      @tuulofdstrxn 3 года назад +1

      Mining might explode again soon. I have a feeling we can look at getting these cards for a doable price some time in the summer, and for msrp maybe never.

    • @amaurytoloza1511
      @amaurytoloza1511 3 года назад +13

      @@tuulofdstrxn I have the same feeling. I refuse to pay more than 5% above MSRP. I just think it's unfair, and shame on people who buy from scalpers; they're worse than the scalpers themselves (unless you really, really need one because your job depends on it).

    • @paullasky6865
      @paullasky6865 3 года назад +1

      High. They will stay high.

  • @mantequillas1278
    @mantequillas1278 3 года назад +7

    Thanks for all the great reviews over the past couple of months. I'll be back in 5 years when 4k is priced for a reasonable amount.

  • @Ahmmunition
    @Ahmmunition 3 года назад +18

    Bitwit's RX 6800 barely beat the 3070 in games, and yours is cruising ahead. Wonder which one is right

    • @usernamecantbeblankwtf6469
      @usernamecantbeblankwtf6469 3 года назад

      Wait for the Gamers Nexus benchmarks; they'll be right.

    • @0rrchids
      @0rrchids 3 года назад

      I think both are right, just different hardware config on the test rig..

    • @Carebear_Pooh
      @Carebear_Pooh 3 года назад +4

      I know, the reviews are so all over the place with these AMD launches that I'm seriously starting to doubt the integrity of some channels. Also, as Linus pointed out, AMD is claiming that DLSS is a garbage feature, yet they are developing their own version of it. With the consoles and AMD also supporting ray tracing now, I'm gonna go ahead and predict that ray tracing in games is going to be a lot more common. When only Nvidia was doing it, yeah, okay, it made no sense to put it in your game. When everybody else is doing it as well, now it makes a ton of sense to offer ray tracing in your game.

    • @termiguin1
      @termiguin1 3 года назад

      @@Carebear_Pooh Guess which architecture the consoles will be ray tracing with? If the consoles do it, RX 6000 will be able to do it. Also consider that consoles are targeting no more than 60fps for most titles, so turning on ray tracing makes sense, whereas for PC gamers with high refresh rate monitors, having a higher framerate will give a better experience.

    • @Tunisiannaru
      @Tunisiannaru 3 года назад

      Bitwit tested with raytracing on for every game

  • @umbranoxx
    @umbranoxx 3 года назад +5

    I think all that VRAM is heavily underrated with how long I personally keep my GPUs (~4 years). Also I'm sure these next-gen games are going to chew through that VRAM much more quickly than we're used to. Thanks for the video Steve, always my go to guy for these releases!

    • @haukionkannel
      @haukionkannel 3 года назад +1

      Yes, it depends! I normally keep a GPU for more than 4 years, so more memory may become somewhat needed. But 8GB should be OK for at least the next 3 to 4 years easily. After that it may be necessary to have more. Maybe.

    • @Rspsand07
      @Rspsand07 Год назад

      You were very right about that vram lol

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j Год назад

      i think its overrated. an 8gb rx 580 is as bad as a 4gb one.

  • @crazybiscut
    @crazybiscut 3 года назад +29

    I'm glad I'm not building yet; this year just seems tricky.

    • @rangersmith4652
      @rangersmith4652 3 года назад

      Ditto. It remains smart to "overbuild" when you build so you can skip a cycle like this one--since you're being forced to skip it anyway.

    • @Ratched45
      @Ratched45 3 года назад +1

      Not really, if you have the money.
      CPU: 5600x
      GPU: 6800
      MB: Gigabyte GA-X570 Aorus Elite (wifi if you want)
      PSU: 850W
      Just right there is a very capable pc

    • @rangersmith4652
      @rangersmith4652 3 года назад

      @@Ratched45 A very capable PC whose key parts you can't acquire. Well I suppose you can buy them if you don't mind rewarding bots and scalpers. I see a bit of a problem there.

    • @Ratched45
      @Ratched45 3 года назад

      @@rangersmith4652 You don't have to pay the scalpers anything. The most difficult part is the GPU: when they go on sale for pre-order, you get one.
      Here in Australia at my go-to PC store, the 6800 XT was up for pre-order for about an hour or two before they stopped taking orders, and they only sell what the store actually gets. Third-party GPUs will be the same: as long as you pre-order, you'll get one. It's much the same with the other major retailers in Australia.

    • @rangersmith4652
      @rangersmith4652 3 года назад

      @@Ratched45 In the US, it's not working that way, as I hear it told.

  • @tma2001
    @tma2001 3 года назад +14

    something tells me the Ti models are no longer rumours !

    • @jskyg68
      @jskyg68 3 года назад +1

      Neither is the 6900 XT... I wonder how well a 400-watt Ti is going to sell, lol.

  • @thyscott6603
    @thyscott6603 3 года назад +31

    So interesting to see how different these benchmark results are. I watched, in order:
    LinusTechTips: AMD performed far worse
    Level1Techs: AMD did pretty well
    Jayz2Cents: AMD did mediocre
    Hardware Unboxed: AMD almost crushed it

    • @bastordd
      @bastordd 3 года назад +3

      So true. I watched all the reviews and the results are so different in every one. Who am I gonna trust?

    • @1Life2Little
      @1Life2Little 3 года назад +7

      I love Jayz and always have a fun time watching him... but his benchmarks are not really that serious; he even tested with a damn Intel CPU =P Oh, and btw, Linus has always been a huge Intel sellout, so no surprise with those benchmarks/reviews... Intel probably paid him just to get back at AMD =P

    • @bastordd
      @bastordd 3 года назад +28

      @@1Life2Little It's comments like that that make no one take you seriously. You are just a fanboy.

    • @thatoneguy3401
      @thatoneguy3401 3 года назад +11

      Out of all these gaming benchmarks, I lean towards hw unboxed and gn. I only watch the other channels when I take a number 2

    • @eazen
      @eazen 3 года назад +3

      In LTT's review AMD actually did fine if you consider the results with SAM. They used a Ryzen 5950X there.

  • @TheAngrySaxon1
    @TheAngrySaxon1 3 года назад +2

    Please note: MSRP simply doesn't exist anywhere in the UK right now.

  • @stipi6996
    @stipi6996 Год назад +3

    Good review, and thank you. I slept on this for a few years; today I ordered an RX 6800 and am retiring my 1080 Ti 😀

  • @lepari9986
    @lepari9986 3 года назад +17

    It's a winner if there is decent supply in the next month or so.

    • @gapinzonr
      @gapinzonr 3 года назад +2

      It won't

    • @DoomsdayR3sistance
      @DoomsdayR3sistance 3 года назад

      Well it's already out of stock now... so decent supplies seem unlikely. Maybe AMD can get some more out but it feels like AMD is suffering from the same issue as Nvidia, higher demand due to COVID and lower production due to COVID.

    • @BlaqZ
      @BlaqZ 3 года назад

      @@DoomsdayR3sistance It's also that some people who didn't get their Nvidia cards just wanted a card that's comparable with the 2080 Ti.

    • @lepari9986
      @lepari9986 3 года назад +1

      @@DoomsdayR3sistance What? The custom models haven't even been released yet. Everyone knew the reference cards would sell out in minutes (or even seconds); that was always going to happen. That's why I said in the next month or so: there have been rumours that the custom models will have better stock, so let's see. Even if they are priced higher they will sell a lot; they just need to be able to supply them during the last part of the year.

    • @lepari9986
      @lepari9986 3 года назад

      @@gapinzonr yeah we will see. Hopefully you're wrong. The winner of the two companies is the one who can actually provide the product, they're both so good.

  • @shockracer
    @shockracer 3 года назад +3

    Thanks for including the 980 TI, it's nice to see it's still holding on even for its age. Seems it is about time for me to finally upgrade, it's had a great 5 year run for me!

  • @will4may175
    @will4may175 3 года назад +1

    The 6800 XT showed up on one regular website, Scan, at £599.99, but the buy button wasn't there, though it said 6 people were buying it. Soon after, the price vanished, but it said 19 people were buying it. At no time after the 2pm update did a buy button appear, and a little later it said 7,000 people were watching the product.
    The other sites, Ebuyer and Overclockers, didn't have stock or a price for the 6800 or the XT, and Ebuyer has removed the XT altogether, leaving just four variants of the 6800 with still no pricing.
    Personally it feels suspicious that three big sites updated to show the cards but never posted prices (Overclockers now have prices but no stock) or had a buy button. I do feel they're holding them back to inflate the price, like what happened with the Nvidia cards.

  • @HairFollicle
    @HairFollicle 3 года назад +13

    Watching benchmarks from multiple channels, I changed my opinion multiple times, and now I've changed it again after seeing that DLSS and ray tracing are not deal breakers. If this continues I'm gonna lose my mind.

    • @jonasziegler3025
      @jonasziegler3025 3 года назад +4

      Ray tracing is not that good: slightly better visuals, but an insane hit to fluidity and performance.

    • @DaFunkz
      @DaFunkz 3 года назад +3

      Ray tracing whilst cool is just a gimmick for bragging rights at the moment. DLSS seems great but isn't widespread enough to be a big enough factor, for now at least.

    • @sijedevos2376
      @sijedevos2376 3 года назад +1

      @@jonasziegler3025 yeah only game that really impressed me was quake 2 rtx. But besides that I dont really care about ray tracing.

    • @jonasziegler3025
      @jonasziegler3025 3 года назад

      @@kleist5083 Not really, the hit is still quite big. I don't see it earlier than 2023 being really common to use it in 90% of the games tbh.

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @@jonasziegler3025 Uhh it's not that big now.. You lose 10 to 30 percent in frames depending on the game which isn't that much.

  • @blastermathz
    @blastermathz 3 года назад +16

    Most of the other reviews I've seen for the 6800 XT mention the lack of DLSS and the mediocre ray tracing as a pretty big weakness.

    • @jonasziegler3025
      @jonasziegler3025 3 года назад +23

      Seriously, I don't get that at all. Who plays with ray tracing on? It dumps the performance in the bin for slightly better visuals but way less fluidity in the game. Ray tracing is cool, but maybe in 3-4 years, when more games can make use of it and graphics cards can handle it without dropping performance like crazy.

    • @nimrodery
      @nimrodery 3 года назад +1

      This isn't a review of the 6800 XT, and there are two or three games currently where I'd like ray tracing turned on. A DLSS equivalent is in the works, and it's the same situation for Nvidia because you can only use DLSS in some games. It depends how you feel about ray tracing. I feel it uses the most GPU resources to deliver one of the least noticeable effects, other than in a couple of games. Once they're more common, a retest would be in order.

    • @ivesscripts8876
      @ivesscripts8876 3 года назад +2

      @@jonasziegler3025 **Cyberpunk 2077** will make great use of DLSS, and of Nvidia's better ray tracing performance, for better visuals. But that's up to anyone who'll play the game when it comes out.

    • @cipcpa140
      @cipcpa140 3 года назад

      It all comes down to the games you play. I, for one, want a raw-FPS card, since the game I play most, Black Desert, doesn't even support DLSS or ray tracing.

    • @edgain1502
      @edgain1502 3 года назад +4

      Lel, I have an rtx card and couldn't give 2 shits about ray tracing. Even if I had a 3000 series I wouldn't care much.

  • @emilromano
    @emilromano 3 года назад +89

    It's interesting how other YT channels are getting so much worse results for both the 6800 and 6800 XT

    • @GTRWelsh
      @GTRWelsh 3 года назад +98

      Cherry-picked metrics and settings, an exaggerated focus on RT Minecraft, using mainly Nvidia-sponsored titles, etc.

    • @1Life2Little
      @1Life2Little 3 года назад +91

      Hardware Unboxed specialize in gaming benchmarks, so I would trust them over anyone else.

    • @emilromano
      @emilromano 3 года назад +66

      @@1Life2Little Yeah, I trust them over any other YT channel when it comes to benchmarks! They deserve all the praise for doing 18 game benchmarks, while other channels do around 7 games

    • @ps2232
      @ps2232 3 года назад +24

      Not sure about individual tests, but for the 18-game average, the last-minute, purposeful addition of 3 AMD-financed titles that clearly have some sort of driver issue on Nvidia did skew the numbers sharply. But it's still overall a good job by AMD compared to their previous releases.

    • @GTRWelsh
      @GTRWelsh 3 года назад +41

      @@ps2232 it's good to use sponsored titles, as long as it's balanced by other sponsored titles for the other team. This does well enough I think. Others do not.

  • @0marble8
    @0marble8 3 года назад +15

    Kinda sad only seeing $300+ cards in the cost-per-frame charts.

    • @techpappee
      @techpappee 3 года назад

      High end vs high end perhaps?

    • @jasonmcgrody9472
      @jasonmcgrody9472 3 года назад

      That's because Steve didn't test cheap cards. The relevant resolutions are 1440 and 4k. No sub-$300 card is decent at 1440p.

  • @hurlman03
    @hurlman03 3 года назад +2

    I know it's not your forte, but have you ever thought of benchmarking some VR games? I'm really curious to know the VR performance of the new AMD cards vs the 3000 series in games like Half-Life: Alyx and Star Wars: Squadrons.

  • @tiredofthebs9
    @tiredofthebs9 3 года назад +1

    I went to MicroCenter for a keyboard, casually asked if they had 3080s or 3090s and they had twenty-five 3090s. Bought a EVGA FTW3 ULTRA! I can’t believe it was so easy. Makes up for my frustration since launch day.

  • @justinv3080
    @justinv3080 2 года назад +3

    If these drop closer to $500, the price-to-performance value really gets high on this card.

  • @sedraniM
    @sedraniM 3 года назад +6

    You need to completely rework those sheets once retail prices and availability are stable...
    A 3070 for $500 or a 6800 for $580 was just a massive PR stunt.

  • @chadfang2267
    @chadfang2267 3 года назад +16

    I'm so confused; every tech channel's test results are so different from the others'.

    • @Wheres_my_Dragonator
      @Wheres_my_Dragonator 3 года назад +9

      HUB doesn't lump RTX into their benchmarks, which makes sense, because competitive gamers will never turn it on. Why would they want distracting features that just make it harder to see your targets and bring down performance?

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @ You quite literally have no proof of what you're saying. Just as you claim Nvidia paid off these YouTubers with no proof, I could just as easily say that about AMD with no proof. You fanboys are something else.

    •  3 года назад

      @@georges-leeeugene2905 Oh dear, George, sorry to burst your 'paid reality bubble' with the simple truth of how things work in the real world - for some companies...Nvidia and Intel being prime examples of serial anti-competitive rule breakers.
      Enjoy reading that extremely easily accessible proof. You paid shills are, indeed, "something else"...!!!

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад +1

      @ Jesus fucking Christ, I'm done with you... now I'm a paid shill... You're acting as if AMD could not just as easily do what you claim Nvidia is doing. The only reason people aren't saying that is reputation, but their reputation isn't even that clean.

  • @riddlex
    @riddlex 3 года назад +2

    Even though SAM provides only single-digit gains, it's just free performance on full AMD builds (a quick way to check for the underlying Resizable BAR support is sketched after this thread). My next build will be full AMD for sure! Thanks for the excellent benchmarks, by the way. Much appreciated.

    • @Dr.WhetFarts
      @Dr.WhetFarts 3 года назад

      Dude... Nvidia will support this very soon and it won't be limited to AMD or Intel, but will work regardless of CPU choice. It's a STANDARD in PCI Express... It's like 1-2% performance increase... DLSS 2.0 provides like 50-100% compared to SAM.

    • @powerup9035
      @powerup9035 3 года назад

      Many games don't gain anything with SAM, so hold your horses.
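
A minimal sketch (an editor's addition, not from the video or the comments above): SAM is AMD's branding of the PCI Express Resizable BAR feature, so on Linux you can check whether a GPU exposes it by looking for that capability in `lspci -vv` output. The exact capability string and the privileges needed to read extended capabilities can vary between lspci versions, so treat this as an assumption-laden example rather than a definitive check.

```python
# Hedged sketch: list GPUs whose `lspci -vv` output advertises a Resizable BAR
# capability (the PCIe feature behind AMD's "Smart Access Memory").
# Assumptions: Linux, pciutils installed, and enough privileges for lspci to
# show extended capabilities (it may omit them otherwise).
import subprocess

def gpus_with_resizable_bar():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    devices = out.split("\n\n")  # lspci separates device blocks with blank lines
    return [
        dev.splitlines()[0]      # the first line names the device
        for dev in devices
        if dev and "VGA compatible controller" in dev and "Resizable BAR" in dev
    ]

if __name__ == "__main__":
    hits = gpus_with_resizable_bar()
    print("\n".join(hits) if hits else "No GPU advertising Resizable BAR found.")
```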

  • @bwv1044
    @bwv1044 3 года назад +121

    It's a $500 card taking advantage of shortages in the market.

    • @1Life2Little
      @1Life2Little 3 года назад +29

      I am sure they will drop it to 500 when the 3070 TI releases. AMD is just being smart.

    • @inferno2143
      @inferno2143 3 года назад +20

      cant take advantage of shortages if it's out of stock XD

    • @mzpedro1450
      @mzpedro1450 3 года назад +1

      lamo

    • @eazen
      @eazen 3 года назад +3

      @@1Life2Little why should they, it depends how fast the 3070 Ti will be. If it's not faster they will not touch the price at all.

    • @Raphael-
      @Raphael- 3 года назад +2

      It's at least 700€ here in France

  • @Nimbus3000
    @Nimbus3000 3 года назад +6

    Would like to see an analysis of the streaming/recording capabilities of the RDNA 2 vs Ampere GPUs; that isn't really covered in these reviews, and Nvidia has been the clear leader in this space in the past. Seriously considering RDNA 2, but if streaming at the same time cripples it, I'll have to pass.

    • @danielpindell1267
      @danielpindell1267 3 года назад

      That's my reason for getting the 3070, and I'm sorry, but not many have covered it. There's more to these cards than just FPS for a lot of people. There are other features nobody is talking about, only FPS.

  • @larsbaer3508
    @larsbaer3508 3 года назад +4

    I hope in January, when Christmas is over, they will have some 5600X and some 6800 non-XT stock. That seems to be the best bang-for-buck PC imo; if you want a new setup, add a B550 board and you can really build an epic PC... When the RTX 2080 Ti was a thing, you paid the same money for that card alone. Now you get a better experience out of a $1200 rig than you got a few months ago from a $1200 GPU... That's insane value imo.

    • @antonschaf4088
      @antonschaf4088 3 года назад

      Yes, that would be a "banger" :)

    • @makisekurisu1925
      @makisekurisu1925 3 года назад +1

      @@antonschaf4088 5800x + RX 6800

    • @antonschaf4088
      @antonschaf4088 3 года назад

      @@makisekurisu1925 :)
      I personally would buy a 5800X too.
      Not because of futureproofing alone. The 5800X seems to be the gaming monster in the Zen3 lineup.

    • @makisekurisu1925
      @makisekurisu1925 3 года назад +1

      @@antonschaf4088 Yeah, I mean I had the choice between the i7 4790K and the i5 4690K for not a lot more. I chose the 4690K, and look now: the 4790K can still be paired with a 1080 Ti or 2070 Super with no bottleneck at 1440p, but the i5 4690K stutters, has frame dips, etc., and that's just because of 4 fewer threads, imagine :) We are talking 2 cores / 4 threads more now, so I'll not make the same mistake so many made of buying a 6-core just because it's the norm, only for it to bottleneck me in the long run (2-4 years) and force me onto new RAM / mobo / CPU. Hell no, I'll add that extra $100-150 and get the 8 cores / 16 threads :)

    • @makisekurisu1925
      @makisekurisu1925 3 года назад +1

      And not just for gaming; 8 cores in general feels more responsive for multitasking etc. Pair it with a dirt-cheap 32GB 3200 kit for 110 bucks and I'm ready to go!

  • @dutchdykefinger
    @dutchdykefinger 3 года назад +16

    Radeon just loves racing games; even the cards that couldn't hold up to Nvidia's top end were usually insanely good in racing games somehow :')

    • @ironcloudstudios6550
      @ironcloudstudios6550 3 года назад +5

      the red colour of AMD makes it go fasta!

    • @bestopinion9257
      @bestopinion9257 3 года назад

      Well, I play racing games in full image quality so DLSS is not for me.

    • @Blackout201_
      @Blackout201_ 3 года назад +1

      That's probably because many racing games are from Codemasters and they used to work very close to AMD 9 or so years ago.

    • @kidShibuya
      @kidShibuya 3 года назад

      @@bestopinion9257 Seeing that often DLSS2 is better than native, what you are really saying is that you like inferior IQ?

    • @bestopinion9257
      @bestopinion9257 3 года назад

      @@kidShibuya What's "inferior IQ"? And why do you use it in this context? Are you a triggered fan boy?
      Speaking about IQ, dude, I have deep knowledge in math so go away.

  • @EldaLuna
    @EldaLuna 3 года назад +2

    It's so nice to see you include a fair number of GPUs, even my 1080 Ti; I was always wondering where my old card sits in all of this. Good to know it can still push on for a bit more time, but it also tells me it will be replaced sooner than I expected if I ever plan to play new games on it. Crazy how AMD pulled ahead like this; it's amazing.

  • @luigi229m
    @luigi229m 3 года назад +5

    I said this in a previous video and I'll say it one more time. Your reviews are as good as they come, but you cannot do the 18-game average calculation the way you are doing it; it is statistically a mistake. You cannot average frame rates across different games when they vary greatly, with some games averaging about 30 fps and others in the hundreds. What you should do is take the percentage difference for every title (say 10%, 14%, etc.) and then average those differences.
    Even if the end result turns out to be very similar between both methods, you are using an incorrect way of presenting data.
    I'll give an example below (a small worked sketch also follows this comment):
    GPU1 - game 1 - 40fps
    GPU2 - game 1 - 50fps
    Difference 25%
    GPU1 - game 2 - 600fps
    GPU2 - game 2 - 700fps
    Difference 16.6%
    Using your method you would have:
    GPU1 - 2-game average - 320fps
    GPU2 - 2-game average - 375fps
    Difference: 17%
    Using the method I describe:
    Average of 16.6% and 25% = 20.8%
    So, even using common sense and not knowing anything about statistics, which result makes more sense?
    You have a 25% lead in one game and a 16.6% lead in another, so it would make sense to conclude that the average lead is right about the halfway point, about 20-21%.
    It makes no sense to conclude that if one card has leads of 25% and 16.6%, its total lead is 17%.
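
A minimal worked sketch of the point above (an editor's addition using the comment's own hypothetical numbers): averaging raw frame rates lets the high-FPS game dominate the summary, while averaging per-game ratios, or taking their geometric mean as many benchmark summaries do, weights every game equally.

```python
# Contrast three ways of summarising two GPUs across games, using the
# hypothetical numbers from the comment above.
from statistics import mean, geometric_mean

fps_gpu1 = {"game 1": 40, "game 2": 600}
fps_gpu2 = {"game 1": 50, "game 2": 700}

# Method 1: average the raw frame rates, then compare the averages.
avg1 = mean(fps_gpu1.values())            # 320 fps
avg2 = mean(fps_gpu2.values())            # 375 fps
lead_raw = avg2 / avg1 - 1                # ~17.2%, dominated by the 600/700 fps game

# Method 2: average the per-game leads (the comment's suggestion).
ratios = [fps_gpu2[g] / fps_gpu1[g] for g in fps_gpu1]   # 1.25 and ~1.167
lead_mean_ratio = mean(ratios) - 1                       # ~20.8%

# Method 3: geometric mean of the per-game ratios; this also weights every
# game equally and stays consistent if you swap which GPU is the baseline.
lead_geomean = geometric_mean(ratios) - 1                # ~20.8%

print(f"raw-FPS average lead:   {lead_raw:.1%}")
print(f"mean of per-game leads: {lead_mean_ratio:.1%}")
print(f"geometric mean lead:    {lead_geomean:.1%}")
```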

  • @50H3i1
    @50H3i1 3 года назад +8

    6:52 steve again with the new gpu announcements 😅

  • @DrSpychology
    @DrSpychology 3 года назад +11

    This seems like the most well balanced card out of all the recent high-end releases, probably will be picking up one of these or a RX 6700 XT in a couple months

    • @danielcallahan976
      @danielcallahan976 3 года назад +1

      balanced haha..this is the most imbalanced card with horrible Ray tracing and productivity performance.

    • @BadKarma.-
      @BadKarma.- 3 года назад +1

      @@danielcallahan976 You're just angry AMD murdered Nvidia with this drop😂

    • @pitrass79
      @pitrass79 3 года назад +4

      @@BadKarma.- hes 2080ti owner 😂😂😂

    • @danielcallahan976
      @danielcallahan976 3 года назад

      @@BadKarma.- No, hahaha, I'm not angry, come on. I was actually supporting AMD. I will definitely be getting their CPU but can't go with the GPU... you have seen the RT performance, right? And how did they murder them, lol? This card isn't even better in rasterization at 4K. And no, I'm not a 2080 Ti owner; I don't have that much money to spend on a GPU, tbh. But I would be getting a 3080... unfortunately still not available.

    • @BadKarma.-
      @BadKarma.- 3 года назад

      @@danielcallahan976 No one cares about RT; I'm not nuking frames for some extra shadows and reflections that are barely noticeable, lmao. The 6800 is better than the 3070, and the 6800 XT is right at 3080 performance, and above it in some games, for less money...

  • @highlanderknight
    @highlanderknight 3 года назад +15

    Clearly impressed with AMD here but I'm interested in the benchmarks with "it all on" SAM, RT, DLSS etc. I was impressed and disappointed with the 6800XT review because some features were dismissed as 'ehh.'

    • @JMartinni
      @JMartinni 3 года назад +2

      Well, he's not wrong. While RT and DLSS are amazing features, they both rely on (so far rather limited) game support, and that makes them not a great selling point yet.

    • @highlanderknight
      @highlanderknight 3 года назад +3

      @@JMartinni Technically true, but if I get my hands on a new GPU, IF that is, I plan to hold onto it for a few years at least. So I'd like to (hopefully) see some benchmarks beyond the 'not alot of games support it right now' scenario, especially if the GPU supports it.

    • @KarlTheExpert
      @KarlTheExpert 3 года назад +2

      Agreed, dismissing the good stuff Nvidia has so quickly while hyping up SAM which absolutely does not improve 'almost all games' (see everyone else's reviews) left a bad taste. Show us the numbers and then let everyone decide if it's worth it for themselves.

    • @JS1066
      @JS1066 3 года назад +1

      The fact of the matter is Nvidia has had years to get RTX into more games and they have not done so at a better pace. This is what happens when you use proprietary tech. Open Source is the best way. Why pay more for less?

    • @highlanderknight
      @highlanderknight 3 года назад +1

      @@JS1066 But in this case RTX 3080 vs 6800 XT, if you go with the 6800 XT you are paying less for LESS. No thanks.

  • @scorpionking722
    @scorpionking722 3 года назад +2

    I am very happy with my 6800 reference model ! Just need a Ryzen 5000 series and a BIOS update to enable SAM on B450 from ASUS and it will be good for several years to come !

  • @kevsxp
    @kevsxp 3 года назад +2

    I still don't understand why he claims at 17:30 that the only advantages Ampere has over Big Navi are DLSS 2.0 and ray tracing. I know he knows about NVIDIA Reflex, CUDA, NVENC, and NVIDIA Broadcast. Why is it that when CPU tests are done he discusses both gaming and productivity use cases, but for GPUs it's just a gaming focus? Much respect to AMD for the performance improvements. My thoughts are that 8GB of VRAM is fine if one is only gaming at 1080p to 1440p; more VRAM is only needed for 4K gaming.

  • @camotech1314
    @camotech1314 2 года назад +6

    Just picked up a Sapphire Reference RX6800 card, cant wait to play some games on it!! :D

    • @camotech1314
      @camotech1314 2 года назад +3

      Update = I love it!!!😍 Way better than the 3070 I had.

    • @slumy8195
      @slumy8195 2 года назад +2

      @@camotech1314 hows the one month update

    • @camotech1314
      @camotech1314 2 года назад +2

      @@slumy8195 Still adore it, mate; it's heavy and relatively short but doesn't sag, which says a lot about the design and build quality!! Also quieter than any third-party-cooled card I've had before. I owned a Gigabyte 3070 Vision before, and that thing was way louder and greedier with the wattage while producing way fewer FPS.

    • @slumy8195
      @slumy8195 2 года назад +2

      @@camotech1314 Good to see you enjoy it! I had to return one; it was a lemon out of the box. Giving it one more chance with a new purchase. Really like the low wattage and huge VRAM capacity.

    • @camotech1314
      @camotech1314 2 года назад +1

      @@slumy8195 ahh that's a bummer mate, which one do you have your eye on? Surprised you got a lemon 1st time around. Which brand?
      You're right though, I do feel really safe for years to come now with 16GB Vram.

  • @807800
    @807800 3 года назад +10

    This is Radeon's Zen 2 moment. Will RDNA 3 be their Zen 3?

    • @Witya
      @Witya 3 года назад +1

      unlikely, nvidia is pretty conservative with their tech process, but their architecture was always superior.

    • @ivaylotsankov7292
      @ivaylotsankov7292 3 года назад +3

      @@Witya Not always.

    • @TheDanupro
      @TheDanupro 3 года назад +1

      Exactly what i thought...Dejavu...

    • @eazen
      @eazen 3 года назад +3

      @@Witya Unlikely? 😂 Yes, it was unlikely with Intel as well, but it happened. AMD is simply the best technology company right now: the best CPUs since 2019, the best gaming GPUs overall (see consoles and PC), even better CPUs since 2020, and the most sympathetic CEO in Lisa Su. And then look at that idiot with his leather jacket at Nvidia, overcompensating. And the Intel CEO is just a manager, not an engineer.

    • @807800
      @807800 3 года назад +4

      @@Witya @WreakingHavok Yep, quite unlikely, just like how Ryzen was able to take the performance crown from an Intel with 10x the revenue.
      For real though, Nvidia is focusing heavily on compute, especially AI, to the point that their gaming card is pretty much a compute-heavy card that gets bottlenecked at resolutions below 4K, just like the old GCN.
      Meanwhile AMD is splitting their GPUs between RDNA, which focuses on gaming, and CDNA, which focuses on compute. Of course, we shall see.

  • @sadman.saqib.zahin01
    @sadman.saqib.zahin01 3 года назад +6

    I don't know why you are so dismissive of ray tracing. It is NOT just some better shadows or spilling water; RT is a whole suite of graphical enhancements ranging from global illumination to diffuse lighting. Nvidia has a great video showcasing the true potential of ray tracing in Control.
    And all of this is just a start: with RT support on the newer consoles, the RT situation will just keep getting better. Also, you really didn't include the effect of DLSS in your benchmarks, even though you included SAM, and you yourself have shown DLSS to be pretty much the same quality as native resolution.

    • @DirtyPoul
      @DirtyPoul 3 года назад +1

      Steve is dismissive of ray tracing because current hardware is just not fast enough to take advantage of it in a meaningful way. It's a niche feature because of its insanely high impact to performance. I don't see that changing until at least RDNA 3 and Hopper.
      Ray tracing will result in a revolution in gaming, but it has not happened yet. Give it time.

    • @coolronz
      @coolronz 3 года назад

      @@DirtyPoul I agree. As a general rule of thumb from what I've seen, it's roughly a 45% hit for Nvidia and a 60% hit for AMD. It's sort of like PhysX: ever since Nvidia made it open source, it's pretty cool to go back to some older decent titles with advanced PhysX, go into the GameWorks settings, turn half of them on, and play them with my 5700 XT. Even Metro Exodus looks a little better; you just have to remember to install the latest PhysX software.

  • @mrfurion
    @mrfurion 3 года назад +1

    I feel like the 16GB VRAM point is strong if you're someone who doesn't want to upgrade again for 3+ years. If your upgrade cycle is more like 1.5 to 2 years it seems like RTSS and dramatically better RTX performance make the Nvidia GPUs more compelling, since I can't imagine 8GB VRAM is going to limit you much during that timeframe.
    Keep in mind that most people are still using 3-6GB cards not 8+, so game producers aren't going to make something that requires 10GB except at Ultra/Extreme presets.

    • @Cooe.
      @Cooe. Год назад

      This whole post was ignoring the whole "current gen consoles now have 16GB of GDDR6 memory" thing...

  • @popcorny007
    @popcorny007 3 года назад +1

    10:44 I think you misspoke here, the graph shows the 6800 is 14% faster, not 4% faster

  • @InternetEntity
    @InternetEntity 3 года назад +37

    Radeon 6800: Beating the 3070Ti before it exists.

    • @danielcallahan976
      @danielcallahan976 3 года назад

      It doesn't even beat the 3070 and you're talking about the Ti, lol.

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @mR Mossley Its called an outlier.

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @mR Mossley Most of the time they are relative to each other but there are sometimes where the 6800 has a lot more frames.

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @mR Mossley What? That's not what I said at all... You put a timestamp on what, compared to other games, is an outlier. I'm not going to bother reading whatever else you're saying, since you're clearly arguing against something I never said.

    • @georges-leeeugene2905
      @georges-leeeugene2905 3 года назад

      @mR Mossley The 6800 beating the 3070 is not the outlier; the outlier is by how many frames it's beating it.

  • @Bandoolero
    @Bandoolero 3 года назад +29

    When Cyberpunk comes out, that will be the ultimate benchmark!

    • @jpavideo
      @jpavideo 3 года назад +8

      DLSS Off : AMD first.
      DLSS ON : Nvidia First.

    • @swaggitypigfig8413
      @swaggitypigfig8413 3 года назад

      @@jpavideo : Always first.

    • @gabrielst527
      @gabrielst527 3 года назад +11

      Not really, it's heavily Nvidia "optimized" so we already know what that means. (like all games that share the same fate)

    • @mattbrownbrands
      @mattbrownbrands 3 года назад

      @@jpavideo Big facts

    • @inflatable2
      @inflatable2 3 года назад +1

      Cyberpunk supports DLSS so not interesting.. Results where AMD beats Nvidia are much more interesting obviously..

  • @harryspapadopoulos8818
    @harryspapadopoulos8818 3 года назад +4

    Here in Greece there isn't any stock. Also older cards aren't properly priced right now.

    • @GholaTleilaxu
      @GholaTleilaxu 3 года назад

      The "proper" price is the high one you have just looked at. Welcome to the new age!

  • @AshtonCoolman
    @AshtonCoolman 3 года назад +2

    1 to 2 years from now, that 16GB is going to be paying huge dividends. We'll have to see what impact ray tracing and DLSS will have over that time.

    • @weed46
      @weed46 3 года назад

      To be fair, people were saying that about the 390X that I had with 8GB of VRAM. Then I traded the 380X for a 2060 with only 6GB that performs loads better. To be fair, VRAM is bullshit; just stick to benchmarks.

    • @AshtonCoolman
      @AshtonCoolman 3 года назад

      @@weed46 that makes no sense. Let me explain why: You'd have to compare an Nvidia GPU from the same generation as the 380X.

    • @1301Marko
      @1301Marko 3 года назад

      What do you mean it will be paying huge dividends?

    • @killerspetsnaz
      @killerspetsnaz 3 года назад

      @@AshtonCoolman RX 480 8GB vs GTX 1060 6GB: neck-and-neck cards, and people worried about the extra 2GB of VRAM... which has been fine right up to today.
      R9 390 8GB vs GTX 970 4GB (or rather 3.5GB, lmao): again both traded blows with each other, and you can still play modern titles with a 970.
      The Fury cards also had 4GB of HBM and, despite being a year older, outperform cards like the RX 480 8GB GDDR5 in modern games.
      VRAM is an important factor, but the amount of it isn't necessarily future-proof; you have to take many other factors into account.
      EDIT: RX 580 4GB vs 8GB... same performance in games... the same card with 4GB more, and no difference.

  • @SirDaffyD
    @SirDaffyD Год назад +1

    I'm about to get the Gigabyte Gaming OC 6800 XT. Here at Centrecom in Australia, I can get it for $785 AUD. The Gigabyte Gaming OC RTX 3070 goes for $829 AUD, the RTX 3070 Ti model goes for $1049 AUD, and the 10-gig RTX 3080 goes for $1199 AUD ($1389 AUD for the 12-gig). This makes the RX 6800 a bit of a steal in my opinion. I could wait for the ASUS TUF Gaming model to come down in price to match the Gigabyte model, which it does from time to time, but I'm itching to upgrade from my Gigabyte Gaming OC RTX 2070 Super, lol. The ASUS TUF Gaming model is currently going for $899 AUD, btw.
    I think the RX 6800 will complement my R7 5700X & B550 AORUS ELITE AX V2 motherboard nicely.
    Great review, Steve.

  • @TechGamer-pq1gu
    @TechGamer-pq1gu 3 года назад +14

    As shown in this review, almost every new game that just recently came out runs incredibly well on the RX 6800/XT GPUs. That is the benefit of being the GPU in the new consoles, and expect it to only get better from here as more and more games start utilizing AMD GPUs better and using more and more VRAM; AMD having 16GB of VRAM on their high-end stack means you're future-proofed.
    Also expect Nvidia to convince more devs and publishers to use their GameWorks tech, so expect a few upcoming games to gimp AMD performance, just as they have done for well over a decade now.

    • @fiftyfive1s410
      @fiftyfive1s410 3 года назад +2

      It's called not letting Nvidia develop proper drivers for the game before the launch :D Also, that $80 price difference is all the magic there is here. It's not like 6800 XT vs 3080; this is linear progression, and of course the 3070 doesn't compete. $80 is a lot; there's a reason they didn't let it reach $100: marketing.

    • @fiftyfive1s410
      @fiftyfive1s410 3 года назад +2

      AMD gimped their own performance; devs didn't do anything. See the 5700 XT at launch vs after: absolutely stellar performance after AMD fixed their own drivers.

    • @1Life2Little
      @1Life2Little 3 года назад +1

      I think Nvidia has like 1-2 generations left unless they wake up... AMD's development has taken off under Lisa Su, and they will catch up to all of Nvidia's technologies within 1 or 2 generations.

    • @eazen
      @eazen 3 года назад +1

      AMD has more money now; they can counter Nvidia's GimpWorks. Nvidia abused their money to gimp Radeon, but that's over now, and AMD's console dominance will help as well.

    • @fiftyfive1s410
      @fiftyfive1s410 3 года назад

      @@eazen Nvidia didn't gimp anything, lmao; they, as per the laws of physics, had the better card every time until now.

  • @zeronin4696
    @zeronin4696 3 года назад +14

    *RX 6800 - the best out-of-stock card, sold for the price of a 6800 XT*

  • @affectedrl5327
    @affectedrl5327 3 года назад +3

    I'll only buy the 6800 if the drivers are perfect. I don't want to have the 5700 XT experience again. I have more fun with my 1060 than with my 5700 XT, which I sent back.
    Edit: I don't get why this review "favors" AMD so much. There was not a single game in which the 3070 was better than the 6800. That definitely happened multiple times in other reviews.

  • @Carebear_Pooh
    @Carebear_Pooh 3 года назад +1

    Maybe this is a crazy idea, but tech youtubers really need to come together and make a standard list of games that should be benchmarked. All these different pieces of advice are driving me nuts. HUB says 6800, Bitwit says 3070, JayzTwoCents says 6800 XT, Linus says 3070; who are we supposed to listen to at this point? Just make a standard already: games that are always benchmarked on raw performance (no DLSS, SAM, ray tracing, whatever), show the average of those, and then show your extra benchmarked games to put it in context with older models.

  • @mattbrownbrands
    @mattbrownbrands 3 года назад +2

    I'm super happy with my 3070, to be honest! A huge jump from my 1070, $80 cheaper than the 6800, which in my country (South Africa) is really significant with the import taxes and markup, and with DLSS on it will get a juicy boost in performance. Based on Linus' video, it's also the better choice as an all-round card, as the non-gaming performance is better. Also, Nvidia drivers and support, the Nvidia control panel and the recording software are what keep me leaning towards Nvidia!
    I'm really impressed with what AMD has produced this time round though, and I genuinely believe you can't go wrong either way! Both AMD and Nvidia are offering amazing performance at great prices! I'm pairing my 3070 with a Ryzen 5600X! Best of both worlds! :) It's an exciting time to be a gamer!!

    • @caidelander2561
      @caidelander2561 2 года назад

      Hey, I'm in South Africa as well. The 3070 and 6800 are now the same price on Wootware, but for a 10% average increase in fps I would need to get a new case to fit the massive card, so it ends up being R1000 more expensive :( I also just had my 5700 XT die on me.

  • @Mi2Lethal
    @Mi2Lethal 3 года назад +37

    AMD had the tee-ball set up with the launch of the RX 6000 series; all they had to do was whack it with decent stock numbers 🤦

    • @SubiKinubi
      @SubiKinubi 3 года назад +27

      Everything in the world is suffering manufacturing issues; the worldwide pandemic saw to that.

    • @Ghastly10
      @Ghastly10 3 года назад +3

      And by the looks of it, they sliced it into the rough. The irony is that pretty much everyone lambasted Nvidia for their screw-up of a launch, and now it looks like AMD has followed in Nvidia's footsteps with terrible stock availability.

    • @Bestgameplayer10
      @Bestgameplayer10 3 года назад +1

      Exactly. The ball was entirely in their court and they blew it.

    • @Bestgameplayer10
      @Bestgameplayer10 3 года назад +2

      @@SubiKinubi Yeah, but at this point there's no excuse. A delayed launch probably would've helped them more than it would've harmed, considering no one can even get an RTX GPU, so it's not like they'd have been losing sales with a delayed launch.

    • @zeta2209
      @zeta2209 3 года назад +10

      Well, let's see how many AIB cards are released. They should have the bulk of the die allocation.

  • @redmoruga4600
    @redmoruga4600 3 года назад +44

    For now these are called "youtuber cards", as only youtubers/reviewers got them. I don't see stock becoming available for the 3070/3080 or Ryzen 5000, so I don't see how Radeon 6000 will be any different. Sure, maybe in spring there will be stock. Maybe.

    • @JohnyMcNeal
      @JohnyMcNeal 3 года назад +4

      You can find any of those if you're willing to sell your house. They can stick their MSRPs up their butts.

    • @raawesome3851
      @raawesome3851 3 года назад +3

      Ryzen 5000 has only been out for 2 weeks. Let's wait a month before declaring any stock issues.

    • @TheShebulba
      @TheShebulba 3 года назад +1

      Just got a 3070 FE from Nvidia's website yesterday. The 30 series is slowly starting to pick up on refills.

    • @raawesome3851
      @raawesome3851 3 года назад +1

      @@TheShebulba that's good.

    • @somethinglikethat2176
      @somethinglikethat2176 3 года назад +1

      The 5600X and 5800X are in stock. Where are you looking?

  • @knives6683
    @knives6683 3 года назад +3

    Thank you for benchmarking all these new games and forgoing Port Royal, Fire Strike, 3DMark and 2-3 year old games that nobody plays anymore. Basically just getting to the meat!

  • @jasonbateman6222
    @jasonbateman6222 3 года назад +2

    Great effort as always, thank you Steve!

  • @kkhalifah1019
    @kkhalifah1019 3 года назад +1

    The RX 6800 is definitely the sweet spot for 3440 x 1440 ultrawides. If you want something with a bit more oomph than an RTX 3070 to drive roughly a third more pixels than a standard 1440p screen (quick math below), this is it... else cough up blood for a full-blown RX 6800 XT or RTX 3080.
    That's if you can find it on the shelves somewhere... I had to suss around in Taiwan to get mine...
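
    Quick check of the "more pixels" figure above: a minimal Python sketch, assuming the standard 2560x1440 and 3440x1440 modes (nothing here is specific to any GPU).

      # Compare pixel counts of a regular 1440p panel and a 3440x1440 ultrawide.
      std_w, std_h = 2560, 1440   # 16:9 1440p
      uw_w, uw_h = 3440, 1440     # 21:9 ultrawide "1440p"

      std_pixels = std_w * std_h  # 3,686,400 pixels
      uw_pixels = uw_w * uw_h     # 4,953,600 pixels

      extra = uw_pixels / std_pixels - 1
      print(f"Ultrawide pushes {extra:.1%} more pixels per frame")  # -> 34.4%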

  • @benjaminnaidoo9776
    @benjaminnaidoo9776 3 года назад +34

    I know I'm gonna love this

  • @pondopondo1497
    @pondopondo1497 3 года назад +16

    I guess we won't be seeing this card till freaking March, eh? Guys, convince me to buy a $900 3080. I can't do it. And I already gave my 1080 to a friend.

    • @MauriceTarantulas
      @MauriceTarantulas 3 года назад

      I think I am. I'd rather have the best perf that lasts...

    • @GTRWelsh
      @GTRWelsh 3 года назад

      Stock should trickle in every now and then after the AIB launch. Immediate shortages are expected.

    • @WayStedYou
      @WayStedYou 3 года назад +4

      AIB cards launch in a week so there could be 3/4 or more of their supply right there.

    • @1Life2Little
      @1Life2Little 3 года назад +13

      Do not pay $900; it is fucking insane... Just wait it out like most of us. If we are lucky we can get a 6000 series card in December.

    • @pondopondo1497
      @pondopondo1497 3 года назад

      @@1Life2Little I'm in talks with an enthusiast-type retailer in my region. They've been told not to expect any real RDNA stock till the end of December at the earliest...

  • @emanueldumitru6477
    @emanueldumitru6477 3 года назад +3

    Really appreciate what you did here. Very time consuming and we know it!

  • @sysakPL
    @sysakPL 3 года назад +1

    Steve, would you check whether it's possible to flash the 6800 XT BIOS like it was possible on the RX 5700? While it didn't unlock any cores, it did raise the stock power limit and clock, as well as their maximum allowed values, for much better OC.

  • @spoton5981
    @spoton5981 Год назад

    Wish you guys had shown how easy it is to undervolt/overclock the 6800 into a 200W-max gaming GPU, but really more like a 160W GPU, all while increasing performance (rough math below). It is truly an amazing card. Not to mention the 3070s were usually as expensive as, or more expensive than, the 6800 most of the time. Crazy.
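
    As a rough illustration of why an undervolt cuts power so sharply, dynamic power scales approximately with frequency times voltage squared. A minimal Python sketch with purely hypothetical numbers (the 250 W, 1.025 V and 0.900 V figures are illustrative assumptions, not measured RX 6800 values):

      # First-order estimate of board power after an undervolt/overclock.
      # Rule of thumb: dynamic power ~ frequency * voltage^2.

      def scaled_power(p_stock_w, f_stock_mhz, v_stock, f_new_mhz, v_new):
          """Scale a stock power figure by the f * V^2 rule of thumb."""
          return p_stock_w * (f_new_mhz / f_stock_mhz) * (v_new / v_stock) ** 2

      # Hypothetical example: 250 W at 2100 MHz / 1.025 V, undervolted to 0.900 V
      # while holding a similar clock.
      print(round(scaled_power(250, 2100, 1.025, 2150, 0.900)), "W")  # ~197 W

    (Actual headroom varies with silicon quality; this only shows why a modest voltage drop dominates the savings.)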

  • @MartijnterHaar
    @MartijnterHaar 3 года назад +6

    I feel like Steve is letting his personal opinion on ray tracing shine through too much in these reviews. He might like playing Fortnite at low settings for the ultra-high FPS, but there are also more than enough gamers who go for pretty images at decent frame rates.
    It's weird to pick a known bad implementation of ray tracing like SotT for the benchmark instead of Control, Metro Exodus or Watch Dogs Legion, where Tim has shown how good it can be and how much it adds to the atmosphere.
    It also makes no sense to me for Steve to say there are virtually no games with ray tracing, as Steve knows that now that the consoles and AMD GPUs support it too, a lot of games will have it in the coming years.

    • @Allyouknow5820
      @Allyouknow5820 3 года назад

      By the time there's a significant uptick in games using RT, we'll be in 2022/2023.
      And I don't doubt for a second that AMD/Nvidia will have new cards with far more robust RT performance by then.
      So it's not a question of preference: there are like 10 games with RT, and most of them don't offer anything other than murdered performance.
      ¯\_(ツ)_/¯

    • @teamtechworked8217
      @teamtechworked8217 3 года назад

      Well, make your own review channel and you can focus on the ray tracing. Pretty sure this is Steve's channel and he can test how he wants.

    • @MartijnterHaar
      @MartijnterHaar 3 года назад

      @@Allyouknow5820 I think it will be faster, as studios have been working on games with RT support for quite a while now.
      I don't doubt Nvidia, AMD and maybe even Intel and Apple will have GPUs with better RT support by then, but I also think you can't expect everyone to replace a $600+ GPU after just two years. That's wasteful, both economically and environmentally.

  • @amaurytoloza1511
    @amaurytoloza1511 3 года назад +9

    For just $70 more you might as well go for the XT, for that matter.

    • @Carebear_Pooh
      @Carebear_Pooh 3 года назад +6

      yeah, "only $70 more". I've never heard a more shilling statement. they were burning the RTX2000 series into the ground because of the slightly higher price, yet they are comfortable giving AMD a free pass on this matter. 6800 should've, at the very least, be the exact same price as the 3070, or even undercut it. With AMD adapting to the new ridiculous prices nVidia introduced, they are becoming part of the problem.

    • @amaurytoloza1511
      @amaurytoloza1511 3 года назад

      @@Carebear_Pooh I totally agree that "I'd pay an $80 premium just for the extra VRAM" is a weak argument, to be honest. I am criticizing that argument from Steve. I am on Linus' side on this one: for the price they're asking right now, they are hard to recommend. They should have been a tad cheaper, as you rightly said. I guess they are getting greedy too.

    • @amaurytoloza1511
      @amaurytoloza1511 3 года назад

      @@ReachTea The 5700 XT was at the same price as the 2070S when they launched in the UK, so I went for a 2070S for £480. The Red Devil was even more expensive than that. The 5700 XT calmed down after a few months, however. I think in this case it'll take more than a couple of months to see the new cards anywhere close to their MSRP.

  • @hoangd4132
    @hoangd4132 3 года назад +12

    Man, it would be a kickass GPU if it were cheaper, like 3070 MSRP.

    • @ShuffleToExpresss
      @ShuffleToExpresss 3 года назад +3

      If you were to overclock the 6800 you could get close to 3080 performance, it looks like, though 🤔 good value

    • @1Life2Little
      @1Life2Little 3 года назад +5

      Why would they sell you a GPU that beats the 3070 in everything for the same price?!?! You make no sense.

    • @totallyjerd1751
      @totallyjerd1751 3 года назад +4

      @@1Life2Little Considering the 3070 has NVENC, DLSS and better ray tracing, it's not actually that crazy a gap. Those are definitely elements that can contribute to the purchase decision.

    • @HarshGupta-zz9ur
      @HarshGupta-zz9ur 3 года назад +5

      @@totallyjerd1751 None of those features can make up for Nvidia's stinginess with VRAM. The 3070 won't be able to run next-generation games maxed out at 4K, making it bad value.

    • @Bestgameplayer10
      @Bestgameplayer10 3 года назад +3

      @@HarshGupta-zz9ur It's like people forget that games NOW (such as DOOM) are already using more VRAM than the 3070 has, and many other games come close to it. Nvidia really dropped the ball with that, since AMD is providing each card with 16GB.

  • @PerpetualPot
    @PerpetualPot 3 года назад

    I just bought one! Then watched this video! Kinda feels good right now! Then again, since I'm in South Africa we have some board partner cards available... Great video. Thanks.

  • @1973Hog
    @1973Hog 3 года назад

    18:50 I bit the bullet and ordered a system from CyberpowerPC with a 6800 and a 5800X CPU. The total cost was $1774, which is only slightly more than what the system would cost if I could get all the components and assemble them myself.

  • @tanjt
    @tanjt 3 года назад +39

    1080ti: hi all

    • @zorro5235
      @zorro5235 3 года назад +8

      1080ti, you are a legend

    • @daviddavidsonn3578
      @daviddavidsonn3578 3 года назад +8

      hello there fren, 1080ti still rocking

    • @KloVnPT
      @KloVnPT 3 года назад +2

      Still kicking

    • @luckysingh-lh4hf
      @luckysingh-lh4hf 3 года назад +4

      HI!! im 1080 for 200USD :D

    • @iceburglettuce6940
      @iceburglettuce6940 3 года назад +3

      5700 flashed to XT. Cost me £272 a year ago. The “fine wine” or whatever you want to call it is starting to kick in nicely.

  • @MrMatthias945
    @MrMatthias945 3 года назад +6

    Would like to see ray tracing plus DLSS, and DLSS without ray tracing.
    If you enable the shiny features on AMD, you should also do that for Nvidia.
    On the VRAM, I think you're right that Nvidia could have given the cards more.
    But keep in mind that with DLSS 2.0 giving image quality like native resolution, it is a proper thing to enable, and if enabled at 1440p you need much less VRAM because you only render internally at a lower resolution (see the sketch below).
    Also, did you guys take a look at the encoder for streaming? Looks like AMD is really bad there against Nvidia if you look at stream quality at similar settings.
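
    On the internal resolution point above: at 1440p the DLSS 2.0 modes actually render below 1080p, which is where the render-target VRAM savings come from (texture memory is largely resolution-independent). A minimal Python sketch, assuming the commonly cited per-axis scale factors of roughly 67% / 58% / 50%:

      # Internal render resolutions DLSS 2.0 would use for a 2560x1440 output,
      # assuming the commonly cited per-axis scale factors for each mode.
      OUTPUT_W, OUTPUT_H = 2560, 1440
      SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

      for mode, s in SCALE.items():
          w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
          print(f"{mode:12s} -> {w}x{h} internal render")
      # Quality      -> 1707x960
      # Balanced     -> 1485x835
      # Performance  -> 1280x720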

    • @erubalu
      @erubalu 3 года назад +3

      This review is very biased; it's like he's in love with AMD for some reason... only giving importance to the stuff the cards do well and not mentioning much of what they suck at... Watch GN, it's by far the best one and it's very objective.

    • @WENEVERLEFT
      @WENEVERLEFT 3 года назад +1

      @@erubalu No Godfall or Dirt on GN? Only games that were made for Nvidia RTX?

    • @wareverwarever9200
      @wareverwarever9200 3 года назад +1

      I have seen this kind of mentality before, when PhysX and tessellation were Nvidia's way to detrimentally influence performance and look less bad compared to the competition. RTX is a gimmick and it's being used for the same purpose as of now. It may change in the future, but as of now, that's how I see it. If it were a technology that improved framerate I would be all for it. You are just proposing something that degrades framerate and makes for an uneven playing field, aligning with Nvidia's marketing bullshit. DLSS is a great technology but, as usual, carries the ballast of all of GameWorks. No thanks. At least not yet.

    • @Scytian1
      @Scytian1 3 года назад

      But not all games will have DLSS support, and I don't want to pay $500 for a GPU and then need to reduce texture quality because I don't have enough VRAM. And DLSS is not that good at 1440p; I'd rather have it disabled at this resolution.

    • @erubalu
      @erubalu 3 года назад

      @@wareverwarever9200 I don't agree with you; RTX is not a gimmick, and I'm enjoying it a lot at the moment ;) I'm looking forward to you changing your opinion when you can't play Cyberpunk with all the bells and whistles on your new AMD card...

  • @jfelm7436
    @jfelm7436 2 года назад +6

    Now that we are getting to the end of the production run, I would love to see a post-mortem of the generation, especially against the 3070 Ti, which was the better head-to-head comparison. I am super happy I was able to pick up a reference card at MSRP. I feel like I am the only one running the card, lol.

    • @CrustyWhiteBread
      @CrustyWhiteBread Год назад

      I just got one yesterday my man so there's at least two of us now!

    • @Cooe.
      @Cooe. Год назад

      ... The RTX 3070 Ti, with its identical 8GB of VRAM, was everything bad about the RTX 3070 made that much worse. IMO (& Steve's too, see his review) it's an inferior product to BOTH the OG RTX 3070 & the RX 6800.
      By bumping up performance without bumping up the framebuffer, they actually managed to have an even WORSE VRAM bottlenecking problem! And that's right now, today! Just how bad will things get for the 3070 Ti a few more years into the 16GB consoles' lifespan?
      If you replace your GPU every ≈2 years or so, the RTX 3070 Ti shouldn't be a problem, but with its mere 8GB of VRAM plus raw performance somewhat near an RTX 3080, it's an absolutely terrible, TERRIBLE card to choose for a long-term build! No offense. 🤷‍♂️

  • @Destractoid
    @Destractoid Год назад +1

    Just snagged one of these for 400, so I'm now watching everything I can on it (even though I already know it was a good deal)

  • @TrueThanny
    @TrueThanny 3 года назад +1

    Given that anyone buying a 3070 from this point on is almost certainly going to be paying at least $600, your value estimation is going to prove conservative, so long as the AIB models of the 6800 don't go crazy with pricing (which I find unlikely).

  • @andydbedford
    @andydbedford 3 года назад +7

    I wish I could like this video more than once. Looks like I'm heading back to AMD for my next GPU.

  • @soma_eizo
    @soma_eizo 3 года назад +7

    Performance is good, but the value, considering the performance loss at 4K (I kind of expected better, because it's targeted at 4K with all of its 16GB of VRAM) and the ray tracing performance, is not there. Unconvincing. Does not make me want to buy one. Not that I could anyway. And about value, I don't even think there will be ANY.

  • @laggisch
    @laggisch 3 года назад +4

    Nice paper specs. This was the karmic revenge of the $1200 2080 Ti buyers.

  • @rudy_dstroys1821
    @rudy_dstroys1821 3 года назад +1

    I honestly think that the 6XXX series will get better with improved drivers... there is always a bit more to gain when drivers are optimized for AMD cards.

  • @Zelement911onrs
    @Zelement911onrs 3 года назад

    Great review, Steve! Really looking forward to the AIB reviews coming up.

  • @parthapratimroy8282
    @parthapratimroy8282 3 года назад +4

    I think I'll buy the RTX 3080 because of its features, whereas the RX 6800 XT's features are still in a development state. The rendering speed of the RTX 3080 is faster than AMD's.

    • @chaywarburton3488
      @chaywarburton3488 3 года назад

      @Liquid Sunshine It's not chance, lol; they fixed their issues months ago. I use one.

    • @djmccullough9233
      @djmccullough9233 3 года назад

      Depends on what kind of rendering, and what resolution, you're talking about.

    • @chaywarburton3488
      @chaywarburton3488 3 года назад

      @Liquid Sunshine What have you done? What I don't get is how drivers are perfect for some people and terrible for others.
      Makes me wonder about user error.
      Just use DDU. Then, from Radeon Software, get every darn update it'll let you have, from January forward.

    • @chaywarburton3488
      @chaywarburton3488 3 года назад

      @Liquid Sunshine Sorry, I'm not trying to be a jerk.
      Have you sent the card back for another?

    • @chaywarburton3488
      @chaywarburton3488 3 года назад

      @Liquid Sunshine Just go to Best Buy and get the cheapest monitor you can find to test quickly! If it's the monitor, then you can make a better purchase. I dunno.

  • @Droid3455
    @Droid3455 3 года назад +6

    Games will be more optimized for RDNA2 because of the new consoles, so expect even better performance out of these cards

    • @hurlman03
      @hurlman03 3 года назад +3

      While that makes sense in theory, I don't think that's been true in the past. Previous console generations, also done by AMD, did not result in games being better optimized for AMD on the PC.

    • @iceburglettuce6940
      @iceburglettuce6940 3 года назад +2

      The GPUs in the last-gen consoles didn't share as much of the architecture as the current ones do. I suspect this will be the case, especially in ray tracing. I'm sure AMD cards won't beat Nvidia ones in this, but like Dirt 2, when the code is not specifically written for RTX I expect the numbers to be closer.
      Ray tracing is still too slow for this generation anyway, whether NV or AMD.

    • @danielcallahan976
      @danielcallahan976 3 года назад

      @@iceburglettuce6940 Go and check GN's review, or wait for HUB to do a video on RT with these cards. Dirt hardly has any RT. With the increase in RT effects, AMD's suffering increases. And what do you mean by "didn't share much of the architecture"? It's the same, man. We will continue to get more Nvidia-sponsored titles. Consoles aren't going to do shit, and these consoles are weak as fuck in ray tracing. Check Digital Foundry's latest video on Watch Dogs.

    • @iceburglettuce6940
      @iceburglettuce6940 3 года назад

      @@danielcallahan976 time will tell I guess

  • @ethancarey3922
    @ethancarey3922 3 года назад +9

    Talks about ray tracing and DLSS 2.0 being of questionable value when almost all major AAA titles coming out now and in the future feature them? Then proceeds to say the 6800's 16GB is its best feature because it's future-proofing the card? This channel is consistently the only one writing off the value of Nvidia's extra features and somehow managing to call the new Radeon cards better when they're merely competitive.

    • @evol3852
      @evol3852 3 года назад

      The bias is strong with this one. This has always been the case with Hardware Unboxed when it comes to AMD stuff. Hope he reviews the bike as well and says it's amazing too.

    • @JoshS5811
      @JoshS5811 3 года назад

      Almost all new AAA games will support DLSS and/or RT?? Perhaps, but I think we would be lucky to see 10% of games overall. If you ONLY look at AAA games, "most" would perhaps be correct (like 51%), but MOST games that come out are not AAA games. So 50% of 10% is "most"? You could say the same thing about needing 16GB of VRAM, though, I guess.
      If we look SEVERAL years out, we can say with near certainty that a much higher percentage of games will need more than 8GB of VRAM. We can GUESS that a higher percentage will use RT, but DLSS? It could end up like HairWorks if the AMD version of DLSS turns out to be the dominant open-source upsampling implementation, which would not surprise me since AMD is in both the PS5 and Xbox and it's what they will be using. I would guess that's at least part of the reason Steve is saying the 16GB is more important than DLSS for the near future, and Steve has a good track record for this type of thing.

    • @ethancarey3922
      @ethancarey3922 3 года назад

      @@JoshS5811 DLSS is in the new CoD. I think major devs are on board. 1080p and 1440p in most games (especially if you're using DLSS) won't need more than 8GB for the next 1-2 years. After that, sure, you might have an effective argument. But the 3070 with DLSS is already hitting framerates higher than what the 6800 is capable of, at similar image quality. It's a big deal, and it will become bigger as time goes on if games like Cyberpunk are any indication. But sure, keep parroting his obvious bias. He goes out of his way to downplay the value of it. It's bizarre, if you ask me, when literally ALL his contemporaries do the opposite. Are you going to tell me they're all paid shills? I don't think he is either. I just think you're making him out to be a lot smarter than he really is. They test a lot. So what? That doesn't mean they can't have a shit take from time to time.

  • @josefsvitak3967
    @josefsvitak3967 3 года назад

    Otherwise an extremely well done review!!!! We should only see this kind of review, where everything is taken into account, not just performance in a few games with no price/performance or watt/performance ratios and none of the many other significant aspects. I am definitely subscribing to this channel after a long time and will recommend it to others! I have gone through so many reviews on RUclips till now, and NOTHING COMPARES TO YOUR REVIEW! Please keep doing this perfect job!

  • @jinx20001
    @jinx20001 3 года назад +6

    It seems such a strange way to review products, literally basing a recommendation on something that makes absolutely no difference today. We heard the same stuff said about the Radeon VII, and yet that card doesn't even belong on any charts anymore despite its 16GB of VRAM and how valuable that was supposed to be in the future. Well, here we are in the future, and its 16GB of VRAM does nothing. The 6800 should be the recommendation based on the fact that it's outright faster by a fairly decent margin; the 16GB of VRAM should never be the reason to choose one over the other, based on the simple fact that we learnt that lesson with the last 16GB card.
    It's pretty unfair to suggest ray tracing holds no value either; a more in-depth look at what these graphics are doing would be much more valuable for the viewers, and writing it off as small improvements to shadows is bullshit and you know it. The argument about it not being in many games is a cop-out too: most people don't care how many games get it, since most people are not playing 500 games a year. The majority of gamers play 5-10 triple-A games a year, so if those 5-10 games a year get RT, it should get more than 50 seconds of coverage in your review, since it WILL play a major role in buying decisions, especially at the higher end where the 6800 XT and 3080, for example, clearly trade blows in general gaming, so the question should be what else is on offer for the money. None of that ever gets mentioned or explored, though, and I think it's a slight stain on an otherwise good review process; the unwillingness to change it up for the sake of keeping the charts easy isn't great. I only prefer other reviews because they go into much greater detail: Digital Foundry will deep dive into the tech and show what it offers, and Tech Yes City even went as far as comparing the encoders on the cards for those few who use them to stream.
    I'm sorry for the negativity; I just feel you guys clearly make a huge effort to produce a massive list of benchmarks but leave out a lot of what makes a review even more enjoyable to watch. Maybe cut the benchmarks in half and use the time to explore other features: no mention of the tensor cores and how they make things like RTX Voice possible, no mention of FidelityFX; let's get in there and see what it can do for games and image quality. Many of us want to see this stuff; it's not wasted time. But I'm one person, and who gives a fack about one person who dares to challenge the traditional review process lol

    • @koustav318
      @koustav318 3 года назад

      Well said.

    • @rdmz135
      @rdmz135 3 года назад +1

      The 3070 is already bottlenecked by VRAM in Doom Eternal... obviously 16GB is overkill, but 8GB is just stupid.

    • @jinx20001
      @jinx20001 3 года назад +1

      @@koustav318 Thank you. I feel it doesn't get said enough, and this review process is starting to get a bit stale. It ALWAYS comes down to more frames and price, when the reality is nobody will notice the difference between 120fps and 130fps, so more time should be spent on what else the cards offer. These guys have the knowledge and skills, and frankly this is their job; I feel they should certainly go into more depth and explore what the technology does for its users. Saying shit like "RT is just nicer shadows for a massive performance drop" is of value to nobody; most of us already know that's bullshit, and it doesn't sound smart at all. They know better than this.

    • @escarretada
      @escarretada 3 года назад

      @@rdmz135 The 3070 is for 1440p ray tracing with DLSS 2.0. If you want 4K, where the 16GB of VRAM matters, you need an RTX 3080/3090 or a 6800 XT. If they had cut the 16GB to 8GB on the 6800 and priced it lower than the 3070, that would have been checkmate, but no, they got greedy...

    • @jinx20001
      @jinx20001 3 года назад +1

      @@rdmz135 But 8GB isn't stupid and hasn't been a problem for a long, long time; nobody has ever got a message on screen saying they have run out of memory. You know why? Because allocated memory and memory in use are two different things (see the sketch below). We can look at extreme cases where it looks like VRAM is an issue, but the reality is it's not a problem yet and likely won't be for a long time. The 3070 and 6800 are arguably not the cards to buy if 4K is your preferred resolution anyway, although they certainly can play at 4K with some quality setting changes, so like I say it's a non-issue.
      When we see that message saying you have run out of VRAM, then I'll take notice and I'll ask what settings and resolution you would be using to make it physically run out.
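
      On the allocated-vs-in-use point above: the VRAM numbers that overlays and monitoring tools report are allocations, not the working set a game actually touches each frame, which is why they tend to overstate what a game really needs. A minimal Python sketch using the pynvml bindings (assuming an NVIDIA GPU, its driver, and the nvidia-ml-py3/pynvml package are installed); it reads the same allocation-level figure those tools show:

        # Report GPU memory figures via NVML (pip install nvidia-ml-py3).
        # "used" below is allocated framebuffer memory across all processes,
        # not the subset of data a game actively reads each frame.
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

        gib = 1024 ** 3
        print(f"Total VRAM:         {mem.total / gib:.1f} GiB")
        print(f"Allocated ('used'): {mem.used / gib:.1f} GiB")
        print(f"Free:               {mem.free / gib:.1f} GiB")

        pynvml.nvmlShutdown()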