RTX 3080 vs RX 6800 XT! | Same FAST Performance, Is Nvidia or AMD Better??

  • Published: 29 Sep 2024

Comments • 813

  • @vextakes
    @vextakes  Год назад +238

    🚨🚨 NVIDIA SHILL ALERT 🚨🚨

    • @soupdrinker72
      @soupdrinker72 Год назад +8

      real

    • @GewelReal
      @GewelReal Год назад +34

      I love Nvidia! Daddy Jensen make me into one of your jackets

    • @phrog.4809
      @phrog.4809 Год назад +24

      I decided to buy a 3080 instead of a 6800 XT a few days ago to upgrade from my 6600 XT, the card just shipped too, and now you post this video 💀. The only reason I'm still happy with my decision though is that it was a 12GB card and I got it for $390.
      Update: I got it and I'm happy with it

    • @JasonEllingsworth
      @JasonEllingsworth Год назад +9

      There have been videos for years showing the 6800xt as the better card. It also overclocks far better. Mine is as fast as a 3090, using air cooling and amd's stock software, which has improved a lot in the past 3 years. No driver issues either.

    • @lucidnonsense942
      @lucidnonsense942 Год назад +10

      CUDA workloads aren't something you can utilize well with a 3080 - they need a lot of VRAM, which is why the professional cards have 24GB. 10GB makes that feature irrelevant - the low memory is a way to force prosumer buyers into paying the Quadro tax. The 3080 is useless in pro apps; you get better performance with an A4000, a 16GB Quadro card that uses the 3070 core.

  • @BogdanM116
    @BogdanM116 Год назад +349

    I'm coming back here in 3 days with snacks to read the comments.

  • @chrisswann7578
    @chrisswann7578 Год назад +58

    I just bought a used RX 6800 non-XT for 350 USD. I came from an RX 6600 XT and I cannot imagine using anything faster. I'll definitely keep it for as long as I can. And to add, I never see my card go above 200W, and that is with slight tuning which makes it just that much faster. I love being able to max out settings and not run out of VRAM.

    • @hiriotapa1983
      @hiriotapa1983 Год назад +4

      Bought a 6800 XT Sapphire Nitro+ for 355 usd....

    • @chrisswann7578
      @chrisswann7578 Год назад +2

      @@hiriotapa1983 Awesome! What a killer deal!

    • @Adrian-tl5ue
      @Adrian-tl5ue Год назад +2

      What processor do you have, and what PSU? I have an RTX 2070 Super and want to upgrade to an RX 6800 or 6800 XT with my R7 5700X, and I only have a 650W 80+ Bronze PSU. I think the 2070 Super draws about the same wattage as a 6800.

    • @kilroy5680
      @kilroy5680 Год назад +2

      @@Adrian-tl5ue it's good enough

    • @TheNewmen10
      @TheNewmen10 10 месяцев назад

      @@Adrian-tl5ue Yes, it works - I also use a 650W 80+ Bronze PSU!
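
A rough way to sanity-check the PSU question in the thread above is to add up board-power figures and compare them against the unit's rating. A minimal sketch; every wattage below is an assumed typical spec, not a measurement:

```python
# Rough PSU headroom estimate for a GPU upgrade.
# All figures are assumed typical board/package power specs, not measured values.
PSU_WATTS = 650           # the 650W 80+ Bronze unit mentioned above
GPU_TBP = {
    "RTX 2070 Super": 215,
    "RX 6800": 250,
    "RX 6800 XT": 300,
}
CPU_PPT = 105             # assumed package power for a Ryzen 7 5700X-class CPU
REST_OF_SYSTEM = 75       # board, RAM, drives, fans, USB - rough allowance

for gpu, tbp in GPU_TBP.items():
    load = tbp + CPU_PPT + REST_OF_SYSTEM
    print(f"{gpu}: ~{load}W sustained load, ~{PSU_WATTS - load}W spare on a {PSU_WATTS}W unit")
```

On these assumed numbers even the 6800 XT leaves roughly 170W of margin, though transient spikes can run well above board power, which is why some people prefer extra headroom.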

  • @MahBones
    @MahBones Год назад +25

    For me at the moment it's between the 4070 and the 7900gre but I'll have to see where that price lands in Aus. The 40 series power consumption is pretty compelling.

    • @Hydra_X9K_Music
      @Hydra_X9K_Music Год назад +2

      Not sure if this also matters to you, but the 4070 has a nicely small form factor. Can fit into a pretty wide range of case sizes

    • @Nein99x
      @Nein99x Год назад +2

      The 7900GRE will be limited to pre-built pcs. You might as well get the 4070.

    • @Dereageerder
      @Dereageerder Год назад +1

      Get the 4070, AMD is garbage beneath the 7900 xt and xtx

    • @chriswright8074
      @chriswright8074 Год назад +11

      @@Dereageerder Cap. The 6950 XT definitely gaps or beats the 4070 at times.

    • @Mangk89
      @Mangk89 Год назад +7

      @@chriswright8074 I think he meant anything in the 7000 lineup, but I get your point

  • @thebeautifulandthedamned572
    @thebeautifulandthedamned572 Год назад +2

    The GPU market has been confusing lately.
    Here in my country, Indonesia, there was a moment when you couldn't find an RX 6950 XT at $630 USD - it was being sold for around $700.
    But ironically, you also couldn't find a brand new RTX 3080 at $600 USD, because it was also being sold at $680-700 USD.
    So people who actually bought a brand new RTX 3080 were considered "stupid", since they could get a much better GPU, the RX 6950 XT.
    And now a brand new RX 6800 XT is being sold at $560 USD, just $60 cheaper than an RTX 4070, and you can't find the RX 6950 XT anymore, not even second-hand.

  • @dirtydoge756
    @dirtydoge756 Год назад

    That is definitely a hot take on the 16GB not being a big deal. There are a LOT of other YouTube tech channels that would vehemently disagree with that statement.

  • @professorchaos5620
    @professorchaos5620 Год назад +21

    The main reason 3080s are cheaper is that they were probably the most-produced card of all time. I don't know how to find stats on that, but so many miners like myself bought every one they could for a couple of years, and they have been re-selling them on eBay ever since.

    • @maximus3294
      @maximus3294 Год назад +3

      it's true. ga102 in general was so abundant that they made the following GPUs with them:
      3070 ti
      3080 10 GB
      3080 12 GB
      3080 Ti
      3090
      3090 Ti
      and those are just the GeForce ones. their serverside cards also used the cores.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat Год назад

      F*** you miners, f*** you miners

    • @CPSPD
      @CPSPD Год назад +1

      when shitcoins and cryptocurrencies collapse i will rejoice

    • @professorchaos5620
      @professorchaos5620 Год назад

      @@CPSPD Thats good central bank slave. Keep rejoicing in your life long slavery

  • @RobBCactive
    @RobBCactive Год назад +96

    That's cool that you're using the RX 6800 XT for editing; that used to be Nvidia Pro territory.
    The fact is AMD have improved the software a lot, and CUDA-compatible support via ROCm is coming out too.
    The thing is, Vex has taken the plunge late in the game. There were deals on the AMD cards long ago, so you'd have had the usage out of it instead of waiting for 3080 prices to settle on the used market. Starfield is nice but not relevant to everyone.
    Unfortunately a lot of people ignored the MSRP reductions and lower prices, scared off by all the FUD put out about drivers etc. That doesn't send the market leader the signals it needs to reduce its margins.
    Gamers complaining about prices isn't enough; AMD have to be able to justify investment into features.
    That requires sales when the products are close, because development costs are a real thing.

    • @molochi
      @molochi Год назад +5

      I thought it was considerate of NV to price their 4080 so high. Gives the 7900 XTX a chance to make money for AMD as well. They really are gentlemen.

    • @RobBCactive
      @RobBCactive Год назад +4

      @@molochi well, if Navi31 had met its expected performance targets, the 4080/4070 Ti would have looked very stupid, weak and totally over-priced.
      Scott Herkelman intended to "kick Nvidia's ass".
      So the 7900 XT would have been crushing the 4080 12GB, and the XTX the 16GB, while much cheaper. But fixes to the driver caused a severe performance drop and they haven't found a general mitigation. Radical changes in architecture are inherently risky; RDNA3 has disappointed.
      Had the targets been met, Nvidia's added features like fake frames & DLSS would not have been enough - they'd have been forced to cut prices or cede market share. Nvidia were maintaining high prices because of their vast RTX 30 GPU stockpile. Now they're cutting production to the contractual minimum rather than selling those cheaper; they figure gamers will relent eventually and cough up.
      The big money is in AI at the moment, so they're hoping to max out Hopper.

    • @ronaldhunt7617
      @ronaldhunt7617 Год назад

      The emulated cuda is not nearly as good as actual CUDA, I hear AMD is going to do the same with Tensor cores as well. So not as good, but way better than nothing. If you own AMD already it is a huge win, if you are buying a new GPU and will be utilizing CUDA/Tensor then you are better off with Nvidia. Video editing and other productivity software results are going to depend mainly on the software you are using, Premier Pro favors Nvidia much more, some free video editing software may or may not be more of a contender.

    • @RobBCactive
      @RobBCactive Год назад +3

      @@ronaldhunt7617 CUDA is a way to program a GPU; it takes source code and compiles it for an application. Calling it emulation just shows your game is spreading FUD and confusing people.
      The AI boom is for large models and so-called deep learning, with huge demand for data-center products.
      What you need for running neural nets versus training them is very different; laptops are coming out with AI acceleration in cooperation with MS.
      Like Apple has had - without any Nvidia hardware, just an accelerator on the CPU die.

    • @ronaldhunt7617
      @ronaldhunt7617 Год назад

      @@RobBCactive Not sure what you are trying to get at here, Nvidia has dedicated CUDA cores that process independently making certain computations a lot faster. AMD does not have CUDA cores, instead they (just like in everything else) saw what Nvidia had done and added stream processors which are not the same, not as good. Just like raytracing cores, and now it seems like Tensor cores (for AI) but only for the 6000 and newer GPUs, not to mention upscaling and frame generation (which AMD does not have as of now) Monkey see, monkey do... but not at the same level.
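
For what it's worth on the ROCm point in this thread: a ROCm build of PyTorch exposes AMD GPUs through the same torch.cuda API that the CUDA build uses, so a basic capability check is vendor-agnostic. A minimal sketch, assuming a PyTorch install with either CUDA or ROCm support:

```python
import torch

# On a CUDA build this reports the NVIDIA device; on a ROCm (HIP) build the same
# torch.cuda namespace is backed by the AMD GPU, so the call works unchanged.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"{props.name}: {props.total_memory / 2**30:.1f} GiB VRAM via {backend}")
else:
    print("No supported GPU backend found")
```

Whether a given professional app actually uses that backend efficiently is a separate question, which is really what the disagreement above is about.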

  • @speng5821
    @speng5821 Год назад +145

    16GB vs 10GB is no contest.
    Both of these cards are the lowest viable 4K options imo, and at 4K 10GB is going to age like milk (even with FSR2/DLSS Quality lowering the VRAM usage - which you'll probably be using, as 4K native requires serious horsepower).
    In blind tests comparing FSR2 and DLSS at 4K Quality, 99% of people won't be able to spot the slightly better image quality DLSS offers. The differences become more exaggerated with more aggressive upscaling at lower resolutions.
    If you don't stream or do any productivity work and have a 4K monitor, then the 6800 XT just seems like the better option. £400 used and lower power draw, which is a factor in the UK where energy is expensive.

    • @speng5821
      @speng5821 Год назад +11

      As for RT- I've always been about optimising graphics settings. Even before RT, I'd be turning shadows down a few notches from ultra for more performance for very little visual difference. Even on a 3080, RT is expensive and I'd much rather target 120FPS than have some eye candy.

    • @Ober1kenobi
      @Ober1kenobi Год назад +2

      Texture sliders.
      So long as you keep the VRAM in check,
      it'll be faster, no?

    • @speng5821
      @speng5821 Год назад +14

      @@Ober1kenobi but that's the issue - I don't want to have to think about keeping my VRAM in check. 4K textures are one of the best ways to improve the image; what's the point of playing at high resolution using low-resolution textures? PC gaming and the pursuit of smooth frametimes is finicky enough without having to worry about VRAM as well.

    • @stangamer1151
      @stangamer1151 Год назад +13

      Well, I can easily see the difference between DLSS Quality and FSR Quality at 4K rendered resolution. Even on my old 1080p screen, let alone 4K screen! DLSS provides much better AA quality and less ghosting. The overall image looks way smoother with DLSS.
      Plus, RTX cards also offer DLAA - the best currently available AA solution. And also DLDSR, which is an AI driven downscaler.

    • @thee-sportspantheon330
      @thee-sportspantheon330 Год назад +15

      @@stangamer1151 Cope. I will go nom nom on my vram.
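
The internal render resolutions being argued about in this thread are easy to put numbers on. The per-axis scale factors below are the commonly published presets for DLSS/FSR2 (treat them as assumptions; exact factors can vary by title and version):

```python
# Internal render resolution for common upscaler presets at a 4K output.
OUTPUT = (3840, 2160)
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

for name, scale in PRESETS.items():
    w, h = (round(d * scale) for d in OUTPUT)
    pixel_share = 100 * scale * scale
    print(f"{name:>17}: {w}x{h} internal (~{pixel_share:.0f}% of the output pixels)")
```

Quality mode at 4K renders roughly 1440p internally, which is why the VRAM saving is real but modest: the output-resolution buffers and the full-resolution textures still have to be resident.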

  • @rain8478
    @rain8478 Год назад +32

    I was in this boat and grabbed a used 6800XT over a used 3080. There are two main reasons for this:
    1. 16 GB VRAM, self explanatory. Yes I bought into the fearmongering, boohoo.
    2. GDDR6X is an actual risk when buying ex-miner cards, GDDR6 not so much.
    This is really all there is to it.
    I'd already had a 6700 XT before, so that kinda made it easy to know what to expect. It might be a little harder for others to risk going to a different brand, especially on the used market. Though I don't think the 3080 is cheaper because it's valued less; I think it has more to do with just how many 3080s Nvidia produced and sold to miners. I just don't think there are as many 6800 XTs out there in total.

    • @person1745
      @person1745 Год назад +2

      G6 might not be as big a risk as G6X, but I think it really depends on the AIB rather than the card itself. Also, fast G6 can produce just as much heat as slower G6X, like on the 7900 XT.
      The reference 6700 XT, for example, has G6 but is known to run its memory close to 100 degrees. It's all about how good the cooler is.

    • @rain8478
      @rain8478 Год назад

      @@person1745 well if you're buying an ex mining card known to run memory at 100c you're kinda asking for problems.

    • @molochi
      @molochi Год назад +2

      I think the fearmongering about 8GB is valid. 12GB on a 192-bit bus is probably enough, but I wanted 256-bit and that's 16GB.

    • @forog1
      @forog1 Год назад +1

      @@person1745 AMD overclock those GDDR6 cards from the factory, which is why they produce far more heat than they should. 20Gbps seems out of spec to me for non-X GDDR6. I think they did that out of desperation for more performance when they realized their RX 7900 series was not hitting target numbers at 16 or 18Gbps (stock GDDR6 speeds). Plus, on my 7900 XTX, if the VRAM clocks itself up to its full 20Gbps on the desktop, it consumes around 80W at idle, up from 10W with the VRAM clocked down... yeah, they pushed and overvolted these RAM chips.

    • @rain8478
      @rain8478 Год назад

      @@forog1 you can undervolt SoC for what its worth with MPT to reduce power draw a little bit, it works on RDNA2 but I cant guarantee anything for RDNA3.

  • @ballcandy1117
    @ballcandy1117 Год назад +142

    I grabbed a 6800 XT for $150 less than the cheapest 3080 on the used market, and 4070s are still over $300 more - idk why anyone would spend more over the AMD. But ultimately, besides the great price, the 16GB of VRAM was really what made me buy it.

    • @suzie9874
      @suzie9874 Год назад +5

      A couple of weeks ago I swapped my 6800 XT for a 3080 (10GB), and the only reason I swapped with my nephew was for AI. While many say the AMD GPU can run Stable Diffusion, I could never get it to work. So I swapped with my nephew, and SD works flawlessly now. Plus the games I play run about the same on both GPUs, so nothing lost there.

    • @xwar_88x30
      @xwar_88x30 Год назад +20

      Good choice my friend - the 6800 XT will age like fine wine and outperform the 3080 as time goes on, and it even matches or beats the 4070; with more VRAM it'll age very well. Nvidia cards get weaker as they get older while AMD GPUs just get better, as videos have been proving. Nvidia have been trash since the 2000 series. I'm hoping AMD can knock Nvidia off their high horse a bit, since they're just taking the complete mick out of their consumers. As I've said in another comment, look how smooth the frametime graph is on the 6800 XT compared to the 3080 - AMD is definitely the smoother experience. Even the 5000 series had smoother frames compared to the 2000 series. Nvidia are more interested in marketing BS - RT that tanks FPS on all GPUs, their DLSS, AI - it's all "buy our product for these features", while their 3070/3080/4060 Ti suffer because of VRAM limitations. Absolute joke. If the 3070, 3080 and 4060 Ti had 12GB+ of VRAM they would be amazing GPUs, but Nvidia are more interested in ripping customers off and forcing them to upgrade - horrible greed of a company.

    • @joee7452
      @joee7452 Год назад

      @@xwar_88x30 I would agree normally, but the 3080 vs 6800 XT seems to buck the trend. If you look around you can see a bunch of people and places doing the comparison again, and contrary to the norm, overall the 3080 has held up against the 6800 XT. I came here from a video comparing them that tested 50 games, and the 3080 was on average faster than the 6800 XT by a higher percentage than a couple of years ago. Now, it wasn't much of a change (like 9% on average at 4K vs the 7% it was a couple of years ago, and 1440p was up about the same), but it bucked the normal trend of AMD cards getting stronger vs Nvidia's. I thought it was funny.
      If I had to pick, though, I would still go with the 6800 XT, because if you watch the listings you can find them new for around 500 or a little under sometimes. You are not going to find a 3080 new in that range, or even in the same zip code.

    • @jamesdoe7605
      @jamesdoe7605 Год назад +3

      @@xwar_88x30 lol no

    • @xwar_88x30
      @xwar_88x30 Год назад +7

      @@jamesdoe7605 settle down there james, dont be silly now.
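
On the Stable Diffusion swap mentioned in this thread: whichever vendor you land on, the first things worth checking are whether the framework actually sees the card and how much VRAM a run really holds. A rough sketch, assuming a PyTorch-based pipeline:

```python
import torch

def report_vram(tag: str) -> None:
    # memory_allocated = tensors currently held; memory_reserved = what the caching
    # allocator has grabbed from the driver. Both are per-device counters.
    alloc = torch.cuda.memory_allocated() / 2**30
    reserved = torch.cuda.memory_reserved() / 2**30
    print(f"[{tag}] allocated {alloc:.2f} GiB, reserved {reserved:.2f} GiB")

if torch.cuda.is_available():            # true on CUDA builds and on ROCm builds
    report_vram("before load")
    # ... load the diffusion pipeline and generate an image here ...
    report_vram("after generation")
```

If the "before load" call already fails, the install is a CPU-only build, which is the most common reason "it just doesn't work" on either vendor.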

  • @RN1441
    @RN1441 Год назад +53

    After seeing how ridiculous the prices were getting on the new generations of cards at the end of last year and start of 2023, I decided to grab one of these 10GB 3080's used for a deal. The crop of new games which exhaust its VRAM started showing up almost immediately after this, so I probably should have waited a few months :D Oh well.

    • @michaelilie1629
      @michaelilie1629 Год назад +5

      what about medium settings and dlss/fsr?

    • @princekatana8792
      @princekatana8792 Год назад

      Imagine thinking about using medium settings with a last-generation high-end card. Pathetic @@michaelilie1629

    • @RN1441
      @RN1441 Год назад +2

      @@michaelilie1629 I'm mostly focused on 1440 since that's my native resolution, so I'll just have to turn off eye candy until it hits the gsync window

    • @mato_s
      @mato_s Год назад +7

      Same but i got a 3070 so im so screwed 💀

    • @stewenw4120
      @stewenw4120 Год назад +4

      @@michaelilie1629 Yeah i guess he bought a 400-600 € / $ card to play on medium settings. Nice advice^^
      The question is do you need more than ~80 fps in games that are requiring so much VRAM. I think 3080 should do the trick in the next 2 years.
      But since i don't care for RT and the other stuff NVidia offers i got me the 6800XT anyways.

  • @chrispittmanComputing
    @chrispittmanComputing Год назад +23

    Make sure your 5900X is running a negative all-core offset in Curve Optimizer (start at negative 15), and if you're not running Samsung B-die, get a set of 3600 CL14 and really tighten down the timings. This will give you a noticeable uplift on your 5900X.

    • @therecoverer2481
      @therecoverer2481 Год назад +1

      Hey, quick question: my RAM (G.Skill Sniper X 3600 CL19 @ 1.35V) is detected as Samsung B-die in Thaiphoon Burner, but when I try to tighten the timings even by just 1 it refuses to boot, even at 1.45V. The weird thing is I can lower the voltage to 1.25V when running XMP. Is it possible that I got trash B-die, or am I doing something wrong? My CPU is an R5 5600.

    • @Wabos123
      @Wabos123 Год назад

      @@therecoverer2481 It's the B-die. I had the same issue with my 5600X using Silicon Power RAM from Amazon. Ended up swapping to some Team Group T-Create 3600MHz CL18 and was able to undervolt and adjust the timings.

    • @droptoasterintub297
      @droptoasterintub297 Год назад

      @@therecoverer2481 3600 MT/s with C19? That's definitely not B-Die. Guaranteed B-Die bins would be the likes of 3200 C14, 3600 C14, 4000 C15, etc. C19 is incredibly loose even at 3600. You shouldn't trust Thaiphoon Burner as it is known to have false positives, and in this case, it is glaringly obvious this is one of those false positives. I almost would say those ICs could actually be Samsung C-Die as they are known to become unstable over 1.35v, no matter the timings. It would also explain the very loose primary timing(s). I'd get ahold of a 3600 C14 kit as this is the best sweet spot for performance on Ryzen. Getting kits over 3600 MT/s isn't beneficial as they aren't necessarily better bins, but pricier; and, you almost can never go over 1900 FCLK which is 3800 MT/s when synced 1:1. Some Zen 3 samples may not even be able to do 1900 FCLK and need to step down to 1866 or 1800 (3733 MT/s and 3600 MT/s respectively). A B-Die 3600 C14 kit should easily do 3800 at the same timings on stock voltage most of the time.
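
The MT/s-to-clock relationships described in the reply above come down to simple arithmetic: DDR transfers twice per memory clock, and 1:1 operation means FCLK matches that memory clock. A small illustrative sketch:

```python
# DDR4 transfer rate vs memory clock (MCLK/UCLK) and the FCLK needed for 1:1 operation.
def clock_for(mts: int) -> int:
    return mts // 2          # DDR: two transfers per clock

for kit in (3200, 3600, 3733, 3800, 4000):
    mclk = clock_for(kit)
    print(f"DDR4-{kit}: MCLK/UCLK {mclk} MHz -> FCLK {mclk} MHz required for 1:1")
```

That is why DDR4-3800 maps to the commonly quoted 1900 MHz FCLK ceiling on Zen 3, and why faster kits rarely help once the fabric can no longer stay synced.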

  • @Ratich
    @Ratich Год назад +13

    The problem with the encoding argument is that I don't use that feature, nor do I have a use case for CUDA, and I'm probably going to stick to pure rasterisation rather than turning on RT, because, upscaling or not, the performance hit is too much. So for me the 6800 XT is the better option.

    • @chetu6792
      @chetu6792 Год назад +4

      That’s the thing. Everybody talks about the Nvidia feature advantage. But beside DLSS, most features are used rather scarcely

    • @evrythingis1
      @evrythingis1 Год назад

      @@chetu6792 DLSS is just another scarcely used feature. They have to pay companies to implement it in their game, how ridiculous is that?

    • @gotworc
      @gotworc Год назад

      @@evrythingis1 I mean, that's literally what most companies do to push their tech. You gotta pay to play; why would a company offer to put your tech in their game if they have nothing to gain from it? AMD does it too. While I don't think Nvidia GPUs are particularly good value right now, since we're at the beginning of this era of AI being used to improve graphics computing, I don't think the technology is bad or a gimmick like most people are trying to say.

    • @evrythingis1
      @evrythingis1 Год назад +1

      @@gotworc go ahead and tell me what proprietary features AMD has had to pay game devs to implement, I'll wait.

  • @ChusmaChusme
    @ChusmaChusme Год назад +10

    0:05 I'm like 95% sure the mining boom began right when that generation launched. I remember it because it was impossible to buy any GPU at launch.

    • @Chrissy717
      @Chrissy717 Год назад

      That was actually covid shortages. 2020 and the first lockdowns caused supply chain issues on a global scale

  • @Pand0rasAct0r_
    @Pand0rasAct0r_ Год назад +4

    I have to disagree with the feature-set argument. And before people call me a shill, I have been using Nvidia all my life; the 6950 XT is the first AMD GPU I have owned and used.
    The DLSS vs FSR argument is in most cases highly blown up. Yes, DLSS provides a better picture, BUT you will almost never notice this during gameplay, only if you pause and compare them side by side. If you have to go to such lengths to spot something, then that's just not an upside in my opinion.
    And ray tracing is better on Nvidia, although not bad on AMD either. But the cards we have right now, and especially the 3080/3090/6800/6900, are just not ray-tracing cards. Neither is the 4070 Ti or 4080 or 7900 XT or XTX. The only card minimally capable of ray tracing is the 4090, and even that card struggles with it in many ways. So if you plan to ray trace, you really shouldn't be looking at these cards.
    The only upside is CUDA, but if you are just a gamer you wouldn't care about it.
    And the VRAM is just so important. There are so many games I see with my 6950 XT that these days shoot past 10GB.
    And I wouldn't choose the 4070 over the 6800 XT either. 12GB is alright, but as I said, I've seen games shoot past 10GB, heck even 13GB, so the 4070 would already be having issues. And that at 600+ bucks new in Europe - just not worth it in my opinion. At that point you may as well buy a 6950 XT if they are still available.
    Most people buy these cards for raster, and in that case most AMD cards just beat Nvidia's.

  • @Astravall
    @Astravall Год назад +2

    I do not stream to Twitch, I do not upscale, and I seldom use ray tracing (not that I could complain about the ray-tracing performance of my RX 7900 XTX - it is plenty sufficient). But I have games that use more than 10GB of VRAM, and ray tracing increases VRAM usage, so the 3080 running out of VRAM is kind of funny. So I have to disagree: I would choose the 6800 XT over a 3080 any time.

  • @enricod.7198
    @enricod.7198 Год назад +40

    Could you include a software showcase comparing the NV Control Panel and AMD Adrenalin? Because I've used both in recent times and AMD's is way better, especially since it comes with an OC/UV suite that's far easier to use than having to resort to Afterburner. These things should be considered when doing a comparison.

    • @Raums
      @Raums Год назад +9

      See I’d love to see this too, haven’t had an AMD card in years but hated the software back then and have heard it’s improved a lot. Never really had an issue with nvidia

    • @enricod.7198
      @enricod.7198 Год назад +15

      @@Raums I've tried both in the last 2 years. As I always undervolt my GPU, I must say that AMD's is way easier and works better. Needing third-party software maintained by basically one outside person to undervolt a GPU from the market leader is embarrassing imo. AMD's software is great these days, and the drivers, while they sometimes have issues like Nvidia's, bring way more performance improvements over the months compared to Nvidia's.

    • @memoli801
      @memoli801 Год назад +8

      They always leave this important fact out.
      And is CUDA really that important for a gamer?
      We are not all content creators! Don't give a shit.

    • @jjlw2378
      @jjlw2378 Год назад +2

      The fact that AMD has a built-in OC/UV utility is actually a bad thing. They have already used it to artificially limit the speeds in which you can tune your GPU. You want overclocking software to be a third party because they will provide the most impartial and least limited features.

    • @FenrirAlter
      @FenrirAlter Год назад +2

      ​@@jjlw2378🤡

  • @SuperShowDowns
    @SuperShowDowns Год назад +8

    I got the 6700 XT Spectral White edition last week for £300, and I'm so impressed with its performance. 1440p ultra, no upscaling 🎉 Glad I didn't get the 4060!

  • @syncmonism
    @syncmonism Год назад +14

    DLSS doesn't eliminate the value of Vram, but it can let you get away with using less with a small reduction to image quality at a given resolution. Also, while DLSS can help compensate for a lack of Vram, FSR can as well, it's just not as good at it at 1440p, but it gets quite hard to tell the difference between the two when upscaling to 4k, at least with the highest quality settings.
    I did own a 3080 for a while, and have played around a lot with DLSS and ray-tracing in Cyberpunk, and also in Control. Running without any upscaling at all is still better than running with DLSS, as long as you can get a high enough frame rate, and I find it very hard to recommend a card which costs 80-100 more, but has significantly less Vram, if it has about the same amount of standard raster performance. Ray tracing in Cyberpunk required using DLSS, and still ran significantly slower than running at 1440p native without ray tracing. I just didn't think that it was really worth using ray tracing. It never seemed obvious that it was worth running with it turned on, though it certainly was usable, and did look good, running at 1440p with DLSS. With Control, I found that the performance was good enough to run with ray tracing and without any upscaling but the game did still run noticeably smoother with ray tracing off, and the ray tracing itself was still not all that compelling to me.
    I found that the lighting in both games, even without ray tracing, was amazing. A lot of the lighting effects even with ray tracing turned on, are the same either way, unless you go with full path tracing, but I obviously had nowhere near enough performance to run with full path tracing.
    The 4070 isn't terrible, and it's not going to suddenly become obsolete because it doesn't have enough Vram at any time in the next 4-5 years, but it would have been a LOT better if it had had 16GB of Vram. It's not like 16GB would have been overkill on it because it has DLSS. That would have made the card that much better, and it would have also had more memory bandwidth as well, which would also be nice. A 16GB 4070 at 660 would have been a significantly better value than the 12GB version is at 600.

  • @alzarpomario889
    @alzarpomario889 Год назад +3

    I'll just put in my two cents here about what is keeping me, and a few other folks, on the AMD side when buying a new GPU.
    Support for open standards, like FreeSync, FSR and OpenCL/ROCm: I don't like vendor lock-in, so I support agnostic technologies.
    I'm not the guy who cracks professional software just to tell my friends: I have Photoshop 2026, idk how to use it but I have it.
    So I usually go for open software, and I've never had a regret, in both my private labs and professionally.
    But the main plus above all is the Unix support.
    At home I can play Windows games on Linux flawlessly without having to tinker with monthly driver updates; it just works... and 2005-class hardware is still supported.
    At work I greatly extend hardware lifespans for the same reason, and this philosophy allows us to offer fast and reliable Citrix-like remote desktops with GPU passthrough of graphics cards that would by now be e-waste if made by Nvidia.
    Intel is now in the middle between the AMD and Nvidia philosophies, and I hope it lands on the AMD view of the HW/SW stack.

  • @kaisersolo76
    @kaisersolo76 Год назад +1

    The reason Nvidia skimps on VRAM is so you have to buy the next gen; they give you DLSS to get by. AMD give you the VRAM, get blamed when there's not much improvement next gen, and offer a serviceable FSR.

  • @Games_and_Tech
    @Games_and_Tech Год назад +1

    Terrible comparison: a high-end 3080 vs a lower-end 6800 XT... the power consumption and the frequencies are too low for a 6800 XT.

  • @lucidnonsense942
    @lucidnonsense942 Год назад +14

    The problem with using a 3080 for productivity apps is that 10GB is REALLY not enough to do actual work with. PLUS, Nvidia's drivers limit many of the professional features you need to Quadro drivers. The 3080's Quadro equivalent is the A5000/5500 with 24GB of VRAM and a price around 1k-2k. You will get better performance than a 3080 in most CUDA workloads with an RTX A4000, a 16GB 3070 equivalent, because 10GB for any significant work is waaayyy too low - assuming the drivers even allow the full feature set in the application on a non-Quadro card.
    As far as AI workloads go, which is much more my wheelhouse: ROCm is feature-complete, and CUDA isn't as relevant in 2023 as it was in 2021 for the AI space. 10GB, again, cripples the card for anything more than fiddling around as a hobby. Try to render a 1440p scene with a 10GB card vs a 16GB one; it's not even funny how memory-crippled you will be. You will get performance equivalent to a 6700 XT with 12GB, which you can get for much cheaper. Additionally, we tend to put GPU render farms on a Linux distro, where AMD has much more mature drivers and ROCm support. Specialised AI accelerators are a whole different kettle of fish; in that space you will be writing your own custom libraries that you tune to whichever vendor allocated some boards for you. Nobody is going to be picky about which one - the lead times are insane, as everything is pre-sold before being fabbed. You take what you can get, pay what is asked and count yourselves lucky.

  • @Ouroboross-
    @Ouroboross- 4 месяца назад +1

    The 3080 isn't worth it. Just buy the 3090 - I found one for 760. You can't push the card without sufficient VRAM. I can max out every game, and a good percentage eat up 11GB or more.

  • @DeepteshLovesTECH
    @DeepteshLovesTECH Год назад +1

    AMD had a much more favorable time competing with the RTX 3000 series because of the huge node advantage AMD had compared to NVIDIA, which was on Samsung's crappy 8nm node.

  • @kon0212
    @kon0212 Год назад +1

    Tbh some of us - well, most of us - grew up without ray tracing. Why do you need ray tracing when you can get higher FPS at a higher resolution? Honestly, ray tracing is a useless invention when we can use its much easier counterpart, also known as pixels.

  • @coolumar335
    @coolumar335 Год назад +2

    Dude is capping by saying 16gb VRAM doesn't matter.
    Extra vram allowed cards like the RX 580 8gb and even R9 390 8gb to extend their lifespan way beyond what was initially expected out of them.
    The 6800 XT is a great long-term purchase and will be viable for 1440p for at least 3-4 more years.

  • @Nalguita
    @Nalguita Год назад +3

    I bought a 6800 XT a year ago. Where I'm from in Spain, for some reason AMD costs more than Nvidia. All my life I was Nvidia, except for an old ATI X800; the rest of the graphics cards I have had were all Nvidia - a 9800 GX2, GTX 280, GTX 285, GTX 480 and GTX 970. Even with the 6800 XT being more expensive here than the 3080, I opted for it. It was not for the performance, nor for the price; it was because personally I am fed up with Nvidia. First it was a board with the Nforce 790i Ultra chipset, an expensive board that would not stop giving problems; then the 9800 GX2, the worst graphics card I've had by far; then the PhysX marketing crap - add a second card to have physics and blah blah blah, it never ended up working well, pure marketing; then the 3.5GB of the 970. I ended up fed up, and as far as I'm concerned they can shove their marketing. It is clear that streaming and RT on Nvidia are superior, but for me it was not enough to opt for Nvidia again. RT is superior, but it is clear that the 3000 series falls short, so for me it is still marketing. DLSS is above FSR, there is no doubt, but who is to say that when the 5000 series comes out they won't leave me stranded like they did with the PhysX crap, 3D Vision and other technologies? Also, Intel's XeSS is not that bad and can be used on AMD. This is my humble opinion from years of owning Nvidia cards. I'm not saying that AMD hasn't had problems with their cards, but I personally haven't suffered from them. Sorry for the length and my English - I wrote it with a translator. Good video and good channel.

  • @MonkeGaming420
    @MonkeGaming420 7 месяцев назад +1

    Feels bad that I bought the RTX 3080 because it was the first one to come out. I really needed an upgrade at the time, and I had just checked the upcoming GPUs and the RTX 3080 was about to launch, so I went and ordered it on the day it came out. Little did I know that 10GB of VRAM would be too low; I was coming from 3.5GB (GTX 970), so I thought it was a really big upgrade and future-proof.

  • @gamerforever4837
    @gamerforever4837 10 месяцев назад +3

    Why do you say one thing, then contradict what you just said? "I don't think the extra 6 gigs of VRAM is that big of a deal," then less than a minute later, "but the 16 gigs is awesome, allowing you to enable higher settings."

  • @NinjaWeedle
    @NinjaWeedle Год назад +5

    Nabbed my Red Dragon 6800xt this prime day for 479$. Pretty happy with it, although i do sometimes find myself wishing I still had the CUDA support my 1070 Ti had. Looking forward to ROCm.

  • @moes95
    @moes95 Год назад +15

    Recently upgraded to an RTX 2070 Super for 150€ (the card needed to be repasted, not that hard tbh). I'll be good for a few years, or I can snag something newer on the cheap.
    Edit: I bought the GPU in a combo with a mobo, i7 9700K, 16GB of RAM and an AIO cooler for 200 + 150 = 350€, and sold my old setup for 400€ with screen and other peripherals.

    • @RFKG
      @RFKG Год назад +3

      Grats on your buy, mate. The 2070 super is still a very competent 1080p card.

    • @AndyViant
      @AndyViant Год назад

      The 2070 Super was a very good card for its era in bang per buck. More of a 1080p card now, but if that's your end use it has a lot of life left yet.

    • @prem3548
      @prem3548 Год назад

      Great buy for $150. It has slightly better performance than my RX 5700 XT and I play every game at 1440p at 60+ FPS with FSR enabled no problem.

    • @moes95
      @moes95 Год назад

      @@RFKG You mean 1440p - the current rig pushes 100 FPS in almost all games at high/max at 1440p. I don't use DLSS or other resolution scaling. I went from an i7 4770K with a GTX 1070 that also pushed 1440p at medium settings, albeit stuttery in demanding games.
      Example: Cyberpunk currently averages 80 FPS at high-to-max settings at 1440p.

    • @RFKG
      @RFKG Год назад +1

      @@moes95 That's amazing, i honestly had no idea that the 2070super could still do 1440p gaming

  • @asdasdasd-gx7zs
    @asdasdasd-gx7zs Год назад +1

    Remnant 2 is kinda broken on Nvidia and consoles. It forces the SSGI graphics option, which significantly reduces performance while making the graphics a bit worse by causing various little artifacts. On AMD it's disabled by default and can't be enabled at all, even through the console.
    Seriously, what's wrong with you? How can you look at a 3080 Ti performing worse than a 6800 XT in this game and think that this is normal? Or a 7900 XTX performing almost on par with a 4090?

  • @ProbablydecentIV
    @ProbablydecentIV Год назад +1

    6700xt

  • @daruthin
    @daruthin Год назад +1

    In the "new" market in europe (mostly france) :
    - For 599€ you got a brand new 3080, a 4070, or a 6900XT (the 6950XT is 629€)
    - The 6800XT is 60€ less
    I'm still waiting for the 7xxx from AMD. I don't get their strategy. Still, there isn't 7700 or 7800, XT or not, and they released the 7600.

  • @Bleckyyyy
    @Bleckyyyy Год назад +1

    The fact that AMD makes FSR 3.0 available to NVIDIA users is enough for me to go with AMD. Forget NGREEDIA...

  • @Azureskies01
    @Azureskies01 Год назад +1

    AMD doesn't really get destroyed by Nvidia in ray tracing; they get destroyed in games that have really shitty implementations of ray tracing. Word on the street is it's just like tessellation, where games over-use rays the player will never see, killing performance on both Nvidia and AMD, but killing it harder on AMD cards. Crysis 2 (Cyberpunk), anyone?
    I mean, in UE5.2 (Fortnite) AMD and Nvidia are pretty much at RT parity, so there are good implementations as well.

  • @danielcuchallocruz6986
    @danielcuchallocruz6986 15 дней назад +1

    I love raw power as much as I love the Gwen Stacy skin, and I love the RX 6800 XT

  • @100500daniel
    @100500daniel Год назад +3

    Bro, increase your parameters from the defaults through Adrenalin. PowerColor's default fan speed and power limit are kinda low.
    Run that bitch at 2000 RPM with the power slider maxed and an undervolt & OC and it will be even faster.

  • @dayadam16
    @dayadam16 Год назад +2

    Gotta love how at 3:54 it's "the Nvidia card is pulling ahead pretty significantly" (+14%), then 10 seconds later, when AMD is ahead by the same margin in the same game, it's just "the AMD is actually pulling ahead" (+14%). I feel like for the rest of this video Nvidia is gonna get way better choice of words. Smh🤥

    • @vextakes
      @vextakes  Год назад +1

      Literally said AMD is very favorable with Lumen there and that's a huge advantage in the future

    • @dayadam16
      @dayadam16 Год назад

      @@vextakes that doesn't change the choice of words that was used though. I see what your getting at but i'm saying using words like significant just sounds way more than what actually took place within that 12 seconds of that time stamp.

  • @Azureskies01
    @Azureskies01 Год назад +1

    DLSS has other issues that all temporal upscalers have; not going over those while bringing up FSR's issues is disingenuous. Be better.
    Also, there is no proof that DLSS is using any AI to do the upscaling other than Nvidia's word that it is. As I don't trust any of the three GPU makers, I'm going to say they train the AI on images in-house, and when it runs on your GPU it is just using the resulting algorithm, not the AI, for real-time upscaling. Still more involved than FSR, sure, but not nearly what Nvidia is claiming (without proof) that it does.

  • @SmokinDave7373
    @SmokinDave7373 Год назад +4

    Nice video man. I jumped ship from my 3070 to a 7900 XT because it was 200 AUD cheaper than the 4070 Ti, and besides not having DLSS as an option I am super happy with my AMD card. In pure raster performance I get on average 15-20% more FPS in most games (compared to the 4070 Ti). The only issue I have had with my 7900 XT TUF is bad coil whine, which I am hoping to send it back for soon and see if I get better luck of the draw. Keep up the videos - you are a great new entry source into the PC DIY learning YouTubes.

    • @xwar_88x30
      @xwar_88x30 Год назад +1

      You're not missing out on DLSS, since AMD has FSR built into their drivers, which you can use in any game and it still looks amazing. You should defo try it out. It's under Super Resolution in the driver menu; I tend to have the sharpness at either 40 or 60 - looks good imo.

  • @m4rc1n03
    @m4rc1n03 Год назад +1

    Valhalla, CODW2, Forza 5, Dying Light 2, For Honor: the RX 6800 XT is 30+ ahead of the 3080.

  • @JanJirásek-n1n
    @JanJirásek-n1n 3 месяца назад +2

    I got brand new 6800 asrock challenger pro (non-XT) for 360usd in May 2023 and I absolutely love the card

  • @keola2
    @keola2 Год назад +14

    Vram becomes a huge problem when you don't have enough, I'd like to see some benchmark examples with Starfield 8k or maybe even 16k texture mods. Watch that 10GB absolutely tank in performance. With that said, right now 10GB is good enough, it's just not very future proof. When new games are using more and more vram every year, I'm sure that's a concern for many people. Despite all of that, I'm sure the used market is just flooded with more 3080s because of the end of the crypto craze.

    • @Dereageerder
      @Dereageerder Год назад +7

      8k and 16k textures are more for the 90 class cards. 10gb is plenty for starfield

    • @Eleganttf2
      @Eleganttf2 Год назад

      wtf u need 8k texture for ? you crazy and delusional

    • @justjoe5373
      @justjoe5373 Год назад +1

      Only a concern if you max them out. Gaming at 1080p, never had to drop more than 3 settings from max for stable 60 at 8GB. Haven't played Hogwarts Legacy, TLOU etc. but they don't interest me anyways and I bet you can play them perfectly fine with a few nonsense settings turned down. It's the same deal for higher resolutions, turn down a setting or 2 and watch the game become playable, usually with little visual difference. Ultra settings are beyond the curve for diminishing returns for performance hit vs visuals, when an amount of VRAM isn't enough to run the game at medium-high then it's too little
      10GB VRAM is gonna be fine, XBox S has 10GB total that's shared between the GPU and CPU. 10GB may be laughable on a 3080 but in and of itself that amount isn't an issue

    • @Ober1kenobi
      @Ober1kenobi Год назад +7

      When was the card designed for 8k ?
      Do you have an 8k panel ?
      You have a $3000 monitor ?
      Lol
      Same as testing a 4060 in 2023 at 4K, it doesn’t make sense, it ‘can’ up to a certain texture level.
      Would I want to, no

    • @keola2
      @keola2 Год назад

      I agree with you both, like I said, 10GB is good enough, especially if you're doing 1080p. It's more of a worry with future, poorly optimized games from AAA devs using more vram than necessary. It's like that saying, I'd rather have it and not need it, than need it and not have it.
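
For a sense of scale on the texture-mod arguments in this thread, here is the raw arithmetic for a single RGBA texture at different resolutions. Real games use block compression (BC1/BC7, roughly 0.5-1 byte per texel), so these uncompressed figures are an upper bound and purely illustrative:

```python
# Approximate VRAM footprint of one square texture with a full mip chain.
def texture_mib(side: int, bytes_per_texel: float = 4.0) -> float:
    base = side * side * bytes_per_texel
    return base * 4 / 3 / 2**20      # a full mip chain adds roughly 1/3 over the base level

for label, side in (("2K", 2048), ("4K", 4096), ("8K", 8192), ("16K", 16384)):
    print(f"{label} ({side}x{side}): ~{texture_mib(side):.0f} MiB uncompressed, "
          f"~{texture_mib(side, 1.0):.0f} MiB at ~1 byte/texel (BC7-style)")
```

Even compressed, a handful of 8K or 16K textures resident at once eats into a 10GB buffer quickly, before frame buffers, shadow maps and any RT structures are counted.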

  • @franciscojaviervazquez2635
    @franciscojaviervazquez2635 Год назад +2

    I got the 6800 XT in November, and I love it. First, I had the good luck of getting a card without the annoying coil whine, which made me happy (I bought the Gigabyte OC version). Second, I truly love Adrenalin, because there's no need for Afterburner to adjust voltages, fan speeds or whatever, and it's very easy to monitor thermals without third-party software. To be honest, DLSS is a major feature I would like to have on my card, but considering that Nvidia didn't support the 30 series for frame generation while AMD promised FSR 3 will be supported on the 6000 cards, that seems like a deal. If AMD's promise is fulfilled, the 6800 XT would destroy even the 3090 Ti at half the price. We'll see if it's true... but in the end, the 6800 XT seems like a good deal if ray tracing is not what you want. I'm not a content creator or editor, not a streamer; I just use the card to play, and even without FSR 3 I love my card. No driver issues, more VRAM, more rasterization performance, and it includes Adrenalin. All for a lower price. Shut up and take my money!!

  • @evergaolbird
    @evergaolbird Год назад +18

    I would argue that VRAM matters for PC gamers who are not into competitive gaming but rather into the modding scene. NVIDIA is doubling down on this with RTX Remix on the RTX 40 series, because they know demand from the modding scene in the PC landscape remains high. Skyrim is almost 12 years old, but to this day it has not slowed down; in fact, the modding scene just keeps hitting new peaks (check the view counts on YouTube or the player counts on SteamDB - it's still relevant).
    High VRAM capacity and memory bandwidth are important in Skyrim. I have 1750 mods installed on my 5900X + RX 6700 XT, and even with 12GB of VRAM there's a lot of planning I do about which mods to keep because of the VRAM demand.

    • @siyzerix
      @siyzerix Год назад

      I agree. Skyrim modding also requires the fastest cpu to play smoothly. So a x3d cpu is almost a must.

    • @evergaolbird
      @evergaolbird Год назад

      Very true, Skyrim is one of those games that demands not just with the GPU but also the CPU. Not to mention at least having a GEN3 NVMe is required to even make Massive LODs work - System RAM too.
      Skyrim's modding scene is probably the only game out there which is like a symphony. Its uses everything in the PC especially if the person knows how to mod.

    • @siyzerix
      @siyzerix Год назад

      @@evergaolbird Indeed. My laptop has 32gb of ddr4 3200mhz ram, an I7 12650h and a 150w 3070ti and a 1tb gen 4 nvme ssd. Skyrim se still see's my CPU being the limitation. Thats despite me giving my CPU all the power it needs.

    • @siyzerix
      @siyzerix Год назад

      @@TheAscendedHuman The mobile 3070ti has the same chip and specs as the desktop 3070 except tdp and clocks. The 150w variant is within 10% of the desktop 3070.
      The i7 12650h is more akin to a i5 12600 non K (if it exists).
      I'd say thats a solid setup, even for desktop standards. Cause I know, not that many people are buying even a 3060ti or 6700xt on desktop. Most have a 3060 or 2060.
      My point is, modded skyrim se just needs way too fast of a CPU. I litearlly cannot use my GPU to its fullest, even if I let the CPU run wild. I hit the drawcall limit real fast and exceed it too causing FPS drops.

  • @RannonSi
    @RannonSi Год назад +24

    Personally, I think DLSS is a crutch technology and should be advertised as a way to keep your e.g. x080 viable for another generation, or three - not as something to make games playable when the card's still pretty new.
    Edit: fixing spelling mistakes and suchlike.

    • @KEWords
      @KEWords Год назад +1

      Like AA or anything else that improves quality over the rendered resolution does? Crutch. Do you play games, or freeze-frame and pixel peep? DLSS gives you a choice, and when I choose Quality I get more FPS at the same visual quality on the same card. It's like a 5th gear that gives me better mileage. If it was not valuable, AMD would not be bribing bundle partners like Starfield to not support it.

    • @_n8thagr8_63
      @_n8thagr8_63 Год назад +1

      it's not a 'crutch' technology. There's more to making a game run nice than just raw hp. DLSS is an excellent feature and a great example of hardware and software harmony

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      Depends on how you see it. If DLSS is "crutch" then rasterization is also a crutch.

    • @GeneralS1mba
      @GeneralS1mba Год назад

      @@_n8thagr8_63 it's only a crutch for devs who don't know what optimization means imo

    • @evrythingis1
      @evrythingis1 Год назад +1

      @@arenzricodexd4409 You don't know how DLSS works do you?

  • @ZeekMX
    @ZeekMX День назад

    First just let me state that what I'm about to say is only in relation to Final Reality from 1997 as my quick video performance test.
    I'm still attempting to find out why it is that when I just enable fast mem timings on my RX6800XT suddenly it performs better than overclocking.
    For the fact check, I tried about 3 different top recommended overclocking profiles. I'm still scratching my head about it.
    I know it can't be my new $34.00 pair of Thermal Take 2000RPM Pro Fans. I've sentenced the two of them to very light duty to prolong their lives in flat land. Signed: Gen X

  • @casedistorted
    @casedistorted Год назад +1

    I could only get a 3070 Ti thanks to the crypto scumbag bros.

  • @mmadevgame
    @mmadevgame Год назад +1

    In my country the 6800 XT is the same price as the RTX 3070, lol. No-brainer - picked up the 6800 XT.

  • @Paelmoon
    @Paelmoon Год назад +4

    I have a 3080 10GB. What's your opinion on what to upgrade to? The 40 series is so pricey, but the AMD cards aren't that much higher in performance.
    Edit: Cheers guys! I will stick with the card and hope it doesn't conk out soon :-P

    • @Mako2401
      @Mako2401 Год назад +19

      There is no need for you to upgrade unless you get a 4090

    • @Humanaut.
      @Humanaut. Год назад +8

      Keep the card for as long as you can and pray that next gen doesn't suck as hard as this gen and that ai doesn't become to next crypto boom.
      The only issue you could have is VRAM limitations, which is a good old Nvidia planned obsolescence strategy, other than that you should be good for a while.

    • @niggapyrat
      @niggapyrat Год назад +1

      3080 is still a good card but if u want more vram for 4k etc.. go with a 7900 XT or XTX. If u want Nvidia, minimum a 4070ti

    • @Blaze72sH
      @Blaze72sH Год назад +7

      4090 or just don't. You are set for now.

    • @endmymiseryyy7908
      @endmymiseryyy7908 Год назад +2

      wait for next gen

  • @user78405
    @user78405 Год назад +2

    Like I said, memory is not everything for running a game faster... it's the chip horsepower that the 6800 XT lacks... not enough CUDA cores... every game needs at least 8000 CUDA cores to run all the features

  • @notchipotle
    @notchipotle Год назад +3

    Because the RTX 3080 was used a lot in mining rigs (better hashrate), and those cheap listings are probably in terrible condition (rusty, dirty and in need of a repaste)

    • @professorchaos5620
      @professorchaos5620 Год назад

      This is a hugely wrong view. Ive been mining with cards for years and they still perform exactly the same as when brand new. Most miners learn how to reduce heat and power for max efficiency which is far easier on the cards than gaming.

    • @notchipotle
      @notchipotle Год назад +2

      @@professorchaos5620 running 24/7 on an open bench, inside a humid room, never get clean, yeah sure 🤣 most used GPUs were in terrible condition, dust combined with humid air is the worst, it became black gunk that's hard to clean, the fins & IO shield get rusty, some of the fan is starting to fail. I saw it myself, most miners don't give a shit.

    • @Very_Questionable
      @Very_Questionable Год назад

      @@notchipotle honestly, the issues with quality are a huge one with used cards, regardless whether those are mined or not.
      there are some mining operations out there that do keep their room dry, clean their cards, and try their best to take care of their hardware.
      The best way to grab a used GPU is to get one locally, like arranging a meet-up instead of delivery. You can even negotiate pricing while you're at it, and at the same time you can also inspect the quality of the card carefully.
      Oh, and by the way, if you have any safety concerns, just know that you can do this right in front of your local police department.

    • @notchipotle
      @notchipotle Год назад +3

      @@Very_Questionable I sell used PC parts since 2011, even the most careless gamer can keep their GPU in good condition for at least a year, but mining cards already looks like crap after a couple months. you can say whatever you want, I was just describing the reality, not a personal opinion 🤣

    • @alperen1383
      @alperen1383 Год назад +1

      nah i got mine for 400 and it was from a gaming rig in mint condition

  • @m4dp4ck3t
    @m4dp4ck3t Год назад +1

    Just turn on DLSS 3.0 to mitigate the 10GB frame buffer. Oh wait..

  • @victorhogvall2065
    @victorhogvall2065 Год назад +1

    Got a 6800 XT paired with a 43-inch 4K screen, and I can tell you that using Quality FSR at 4K I cannot tell the difference, and at 1440p with these cards you don't even have to use upscaling.

  • @Clint_the_Audio-Photo_Guy
    @Clint_the_Audio-Photo_Guy 7 месяцев назад

    I had a Red Devil 6800 XT and took it back when the 6750 XT refresh came out. I should have kept it. Incidentally, 6950 XT Reference cards are only $550 at Microcenter right now. Pretty comparable to the 7800 XT, or better in several games.

  • @5RWill
    @5RWill Год назад +3

    I got that one on prime day for $480 with starfield. But good points made. The only feature i really would nod the head to is dlss. I’m all for Ray tracing but it’s just not that big of a difference in most games for the performance. I wish amd would focus on fsr 3.0.

    • @GeneralS1mba
      @GeneralS1mba Год назад

      Imagine fsr 3.0 miraculously comes with starfield lol

  • @Powerman293
    @Powerman293 Год назад +1

    "Nvidia feature advantage" MFers when they run out of VRAM even at 1080p:
    😩

    • @roamn4979
      @roamn4979 Год назад

      AMD fans when their game looks like a wet sponge because of FSR:
      🧽

  • @AshtonCoolman
    @AshtonCoolman Год назад

    I have a 4090 and I dislike DLSS. DLSS, as you said, renders at a lower resolution. I'm not spending all of this money to be forced to render at a lower resolution because of a lack of VRAM. Upscaling is a crutch, but people have given into Nvidia's marketing. Rasterization performance is, and always will be king.

  • @aditrex
    @aditrex Год назад +1

    Your 3080 is drawing 100W more than the 6800 XT while getting smacked in raster performance, which is what really matters. Sorry, but the 6800 XT is just better unless you're a heavy single-player gamer; for FPS games AMD is the way to go, not to mention the power draw.

  • @johnrehak
    @johnrehak Год назад +1

    The 3080 shot itself in the leg with 10GB of VRAM. I would buy the 3080 12GB, or the 3080 Ti which also has 12GB, if it's not significantly more expensive.
    Was Smart Access Memory turned on when you tested both cards? In general both cards are about the same until you turn on SAM; then the 6800 XT starts pulling ahead in nearly all games except RT.

  • @giucafelician
    @giucafelician Год назад +3

    I am absolutely thrilled to share that I recently snagged an incredible deal on a 6800 for only $353! Needless to say, my excitement is through the roof!🥰

    • @gustavopena99
      @gustavopena99 3 месяца назад

      I got a used 6800 non-XT for 260 USD aiming at 1440p, and I'm so happy as well (1-year warranty) (in Argentina that's a good deal)

  • @ioannispoulakas8730
    @ioannispoulakas8730 Год назад +2

    10GB is veeeery borderline for 1440p. I very often see over 9GB. Over 10 is less often but we are getting there. I don't know for how long the 12GB on my 4070 will last. Ideally I wanted more VRAM but there wasn't any alternative on this price range. I didn't want to go with the 6800XT and lose DLSS 2,3 and better RT cause I am only playing single player games. Fingers crossed for the 12GB then :P

    • @xwar_88x30
      @xwar_88x30 Год назад +1

      That's all Nvidia is good for: borderline. With the VRAM scam on their GPUs they make people upgrade when their 3070/3080/4060/4060 Ti or whatever other BS Nvidia GPU starts to stutter in VRAM-heavy games. 16GB is the safe spot at the minute; anything over 16GB is even more future-proof.
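
If you want to see how close a 10GB or 12GB card actually gets in a given title, logging the counter once a second while playing is enough. A minimal sketch for NVIDIA cards via the standard nvidia-smi query interface (Radeon exposes the same figure in the Adrenalin overlay, or via sysfs on Linux):

```python
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

# Print dedicated VRAM usage for GPU 0 once per second; run it while the game is loaded.
while True:
    first_gpu = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used, total = first_gpu.split(", ")
    print(f"{used} MiB / {total} MiB in use")
    time.sleep(1)
```

Keep in mind this reports memory allocated, not strictly memory required; some engines will happily reserve most of whatever is available.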

  • @danburke6568
    @danburke6568 Год назад +1

    When next-generation games come out, 10GB is going to be a problem.
    Also, so many were used for mining; between those two factors there are just so many more of them about.
    The 3080 is a bit pricey used. Still a good card.

  • @adikonas1978
    @adikonas1978 Год назад +3

    Can you tell me what the temperatures of the card are, especially if you use a 1440p ultra wide monitor? For me, with the highest details, the temperature at the connector was 100 degrees Celsius

    • @Anto-xh5vn
      @Anto-xh5vn 4 месяца назад

      Um what holy shite bro that's probably not normal or is it

    • @adikonas1978
      @adikonas1978 4 месяца назад

      @@Anto-xh5vn As far as I checked it's normal - about 10 degrees before thermal throttling. With an undervolt it's around 90.

  • @Paulie8K
    @Paulie8K Год назад +13

    Nice breakdown. I've had the 3080 10GB since December 2021 and it's been amazing. Now I tend to play games that are a bit older, because I have a rule where I wait at least a year after release for games to get patched and go on sale, so I have nothing from 2023, but I've played dozens of games from 2015-2022 and the 3080 ran them beautifully at 1440p.

    • @MrBorisRoo
      @MrBorisRoo Год назад +1

      I don't have a problem at 4K with my i7-9700… 60 FPS is stable… Now I can never go back to 1080p

    • @unreleasedcontent9316
      @unreleasedcontent9316 Год назад

      Right, the 16GB of VRAM isn't all that; 10GB is all you need, and Nvidia still gives you advantages

    • @Cybersawz
      @Cybersawz 7 месяцев назад +1

      I have had that same 10GB card for almost 2 years, and use it for 4K gaming on high settings for most games a couple of years old. Granted, the latest AAA games can't be maxed out, but they still look pretty awesome on high. I run an i9-13900K with 32GB RAM.

    • @Paulie8K
      @Paulie8K 7 месяцев назад

      @@Cybersawz Nice. I have 32GB of RAM too, but a 3700X, which I may upgrade to a 5800X3D. At 4K I'd be GPU-bound though, so I'd probably get the same results as your 13900K there.

  • @MGrey-qb5xz
    @MGrey-qb5xz Год назад +1

    It's no longer about performance; we honestly have enough of that and gaming has stagnated. What matters now is how power-efficient the chips and cards are. Even the Series consoles went the undervolt route to do more with less.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Год назад

      I think it's the other way around: game devs want gamers to upgrade their rigs so they can make better games

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Год назад

      I mean, no one can hit 144 FPS in Cyberpunk with psycho RT for now

    • @MGrey-qb5xz
      @MGrey-qb5xz Год назад

      @@iikatinggangsengii2471 There are more important things to do than play Cyberpunk with RT at 144 FPS. We need to stop turning cards into freaking ovens in the summer and bring down the electricity bill. Outside of first-world countries, using modern cards is a big no because of the electricity bill and the AC bill to cool the effing room, and you can't even buy old ones now because they are no longer in production

  • @RannonSi
    @RannonSi Год назад +2

    When it comes to VRAM:
    As it seems right now (if I'm not mistaken), there are about 4-6 games that (for whatever reason) exceed 8, 10 and even 12 GB at 1080p.
    To me, that is entirely different from when ray tracing started showing up in games, since it was obvious it would take years before buying a graphics card just for ray tracing was a viable option (or at least a wise one - besides, how good was the RTX 2000 series at ray tracing, really?).
    *This*, on the other hand, looks like the start of a trend of VRAM usage spilling over the 8GB limit (with 10 and 12 following shortly, would be my guess). And I'm guessing it will happen at least twice as fast as the adoption of ray tracing did (at least after 2024).
    To me, ray tracing is an add-on, a luxury item. But being able to play games at 1080p on a card (released in the roaring 20's, mind you) that had an MSRP of $700 is (with exceptions, of course) the least one should expect.

  • @stratuvarious8547
    @stratuvarious8547 Год назад +2

    It's just a shame that Nvidia doesn't want to give enough VRAM to use their GPUs to the fullest. I can forgive the 3000 series, since 8GB was plenty at the time, but it was becoming clear early enough in the development of the 4000 series that they needed to increase the VRAM at every SKU except the 90 class. If AI weren't their focus right now, they probably would have, but because of AI, Nvidia just doesn't care.

  • @tomtomkowski7653
    @tomtomkowski7653 Год назад +1

    Now, for a card that competes with the xx80, you have to pay $1000 just because Nvidia raised the price of the xx80 from $700 to $1200. This is duopoly cooperation, not competition.
    The funny thing is that the 7900 XTX sits relatively lower in its stack than the 6800 XT did, yet it costs $1000 while pretending to be this generation's 6900 XT, when in reality the relative performance is not even close.
    6900xt = 3090 -7% -> $1000
    6800xt = 3090 -15% -> $650
    7900xtx = 4090 -20% -> $1000
    6800 non-xt = 3090 -24% -> $580

  • @patsk8872
    @patsk8872 8 месяцев назад +1

    "FSR doesn't look as good as DLSS" that's funny you say that right after showing a great comparison where both XESS and DLSS failed miserably at properly rendering a reflected light source, while FSR was the closest to the actual render. So that is anything but clear-cut. TBH all have problems, and I don't like the idea of using any of them unless I'm forced to. But if you really enjoy looking at an AI fever dream and pretending it's an actual render, you do you.

  • @hayseedfarmboy
    @hayseedfarmboy Год назад +2

    Upscaling is a lot better on Nvidia, and so is editing, but ray tracing is not really catching on with game makers. I've seen some amazing effects on metals in some games, but even in those games it's just a few small things and isn't used well throughout the whole game (at least for metals, which is where ray tracing is worthwhile for me); I mean, the armor in Elden Ring looks dry as sand even with ray tracing on. The real question, I assume, is whether in 5 years the card you have will still run well given the growing lack of optimization in new games. Some AMD tech is better optimized due to console partner games using AMD components, while I don't think Nvidia wants their last-gen GPUs running any better than they do; two gens back, maybe yes, but if the 3060 Ti drivers were really updated to do their best, the 4060 (really a 4050) would be selling even worse than it is. And losing EVGA's help with game-ready drivers going forward is a big hit for Nvidia owners.

  • @enricofermi3471
    @enricofermi3471 Год назад +1

    (Used) 3080s are cheap because there's an extremely high probability they were mined on. As crypto drops, there's little incentive left to mine, so the smaller farms get disassembled and sold off. We will not discuss the probable condition of those GPUs (which may vary greatly depending on how the farm was maintained).
    The fresh 3080s are indeed priced cheaper because there are only so many advantages they have over the AMD competitor (6800 XT): ray tracing, DLSS (no frame gen on the 3000 series, btw) and CUDA, out of which only CUDA is not optional (if you actually need it as a content creator, that is). On the other hand, the 3080 has only 10GB of VRAM, which is enough for now and may (keyword: may; it gets filled up in some games on ultra, but those are just singular examples... for now) last you until the 5000 series; the 6800 XT does not have that "may".
    Overall, I'd say the 6800 XT is the more balanced long-term solution, "the 1080 Ti of the current era", while the 3080 is an "I want the eye candy now and then I'll switch to a 5070 as soon as it launches".

  • @todd92371
    @todd92371 Год назад

    I finally left Nvidia for AMD. I love the Adrenalin software and the card is amazing. I'm really glad I jumped. To each his own, though.

  • @ravs1988
    @ravs1988 Год назад +1

    Upgraded from an RX 570 from 2017 to an RX 6800 (non-XT) for 300€ on the used market. Currently wasting its potential on Dota and WoW :)

  • @futurama420
    @futurama420 26 дней назад

    I went with the 6800 XT. In my country I found one for 400 bucks from MSY gaming, while a 3080 cost 470, but the main thing when choosing was a test I found on the internet where a guy undervolted both cards. Now I'm playing on the 6800 XT with a power draw of 130-140 watts at 2K, and in principle I'm happy with everything; the main thing is not to turn on ray tracing) Thanks to the Russian terrorists for forcing me to save energy.

  • @jeke8413
    @jeke8413 14 дней назад

    My goal was to get something that allowed me to max out details and textures in open-world environments without worrying about stuttering or skipping, and upscaling isn't that important to me. At a reasonable price of $400, the 6800 XT wasn't a hard decision.

  • @marufulislam4311
    @marufulislam4311 Год назад

    The RX 6800 XT is just right for 1440p. You will hardly ever need upscaling; it's just that much faster. Having 16GB means you can max out any game, and it's future-proof.

  • @eranbraun2
    @eranbraun2 Год назад

    IMO the Nvidia features are still not such a must. I still don't really like RT in every game; I don't lose that much performance (4090), but the visual bugs are not worth it. Sometimes I got blinded just by moving my mouse, and that's a bit stupid, as I don't remember Hogwarts Legacy having flashbangs. Other than that, like with every compression comparison, side by side you will spot some differences, but on its own everything looks the same, so the FSR vs DLSS debate is kinda meh if the average quality is close. And the VRAM thing is just future-proofing: you may think upscalers are a solution, but they are just a temporary fix. With the new games coming in time it may work or it may not, and that unnecessary worry, just because Nvidia can't put more than 10GB on an 80-class card, is stupid.

  • @paradoxeintervention5390
    @paradoxeintervention5390 Год назад +1

    I wanted to buy a 3080 2.5 years ago; ultimately it became a 6900 XT.
    At that time I had an FHD 144Hz monitor, then switched to WQHD 165Hz, and meanwhile I have a 4K OLED for gaming.
    Currently even 12GB would be too little for me, if only as a mental thing:
    with every stutter I would ask myself whether it's perhaps the VRAM.
    That's why I'm quite happy not to have gotten a 3080; back then I would not have thought how quickly the memory would become a problem.

  • @anthonyrizzo9043
    @anthonyrizzo9043 Год назад +1

    DLSS is what does it for me; FSR looks like crap in comparison, so you're basically getting better image quality with Nvidia by far. And eventually, if not today, you will be upscaling on these cards. I use DLSS on my 4080 even when I don't have to; I want more FPS, and it usually looks like native.

  • @RudolfSikorsky
    @RudolfSikorsky 2 месяца назад

    That's why I chose the 3080 12GB. Great card to this day. Cyberpunk 2077, for example: DF optimized settings, 1440p, DLSS Balanced, RT Psycho and path tracing enabled, and I get 40-60 FPS in the main game.
    Once you experience the difference that PT offers, you will never go back; you'll sacrifice some FPS instead.
    Would like to see how the 6800 XT performs under the same conditions.

  • @nathanielchapman7675
    @nathanielchapman7675 Год назад +1

    I disagree with the VRAM statement. The stutters and graphical artifacts are a dealbreaker for me. I had a 3070 that ran out of VRAM and the drawbacks were unbearable; the additional 2GB of VRAM on a 3080 would not have resolved the issue. Moreover, how good the ray tracing on the 3080 is gets quite overblown... and it requires more VRAM too. So what would you choose, better textures overall or better lighting?

  • @Psychx_
    @Psychx_ Год назад +1

    The 3080 10GB doesn't have enough VRAM to handle Cyberpunk at 1440p with Ultra RT. Scaling up from a lower resolution alleviates that memory pressure.

  • @Hito343
    @Hito343 Год назад +2

    People are overblowing the whole VRAM issue... yeah, bad PS ports exist, and badly optimized games exist, especially recently... but capping FPS is a thing, imagine that (why would you make the game render more frames than needed unless it's a competitive title and you need that 120 FPS), and a lot of the options in the Ultra preset make little to no visual difference while costing a big performance hit. ...yeah. Great video

  • @Aseapia
    @Aseapia Год назад +1

    Future games will crush both of these GPUs in ray tracing. The extra VRAM of the 6800 XT means it will age dramatically better. In two years no one's going to want a 3080 10GB.

  • @CitizenTechTalk
    @CitizenTechTalk Год назад

    It has nothing to do with brand. It has EVERYTHING to do with RAM/Video memory. You're a "Tech" channel and the most obvious driver for purchase wasn't even mentioned from the very beginning of the video!? Seriously!? And RAYTRACING is literally ONLY a marketing thing, literally no-one cares even for a second about Raytracing AT ALL! RAM, RAM AND MORE RAM, is the ONLY purchasing decision made here!

  • @DonaldHendleyBklynvexRecords
    @DonaldHendleyBklynvexRecords Год назад

    From what I see online, people are growing tired of chasing RTX 'and' FPS. Plus, at this point a bunch of us want raw FPS over frame gen etc., and like you said, it's nice to max out everything. If a gamer wants RTX etc., then the super high end is where it's at; lower-end card owners need to worry about FPS, and AMD gives them that!

  • @GewelReal
    @GewelReal Год назад +3

    10GB of VRAM really aged badly.
    And it even uses more power than the 6800 XT!

    • @vincentvega3093
      @vincentvega3093 Год назад

      Thanks, Samsung!

    • @CersionX
      @CersionX Год назад +1

      @GewelReal if you actually watched the video you would know that it doesn’t

    • @Altrop
      @Altrop Год назад

      @@CersionX According to HUB's power graph shown in this same video, it actually does use more power.
      He tested the power usage while standing still in a CPU-bound game lmao.

  • @user78405
    @user78405 Год назад +1

    I wish there were more 3080s and 3080 Tis on store shelves at low prices. But like the 4080... anything below $999 gets sold out too quickly... and it's back to $1099.

  • @The_Noticer.
    @The_Noticer. Год назад

    DLSS negating VRAM usage is a pretty bad argument. Even through YouTube encoding I can tell the DLSS image looks softer; notice at 7:62 that the actual suit the character is wearing is spotted, but both FSR and DLSS turn it into a uniform pink/blue color. And that argument is especially dubious after your previous video lamenting that developers are using DLSS as a stopgap solution...
    They are not like-for-like quality-wise, and thus cannot be compared that way. Much like you can't use FG as a mitigator either, because it's not an actual FPS increase; the input latency remains the same, or rather becomes worse. Hardware Unboxed is right not to include it in the results.
    We shall see in the future. But like most AMD cards, this will age better than the 4070 in my opinion, and especially next to the 3080. You will have to step down several quality settings on the Nvidia card before you have to on the AMD card. Additionally, ray tracing increases VRAM usage as well, so even with DLSS you will run out of VRAM.

  • @PoiZo_NG
    @PoiZo_NG Год назад +1

    The prices are WAY different in my country. A used 6800 XT goes for the same price as a 3060 or a 3060 Ti, which makes it a hugely better deal than the Nvidia cards. The 3080 is much more expensive than the 6800 XT here.

  • @roki977
    @roki977 Год назад

    I use an RX 6800/5800X3D with a 1080p 240Hz screen and still haven't used FSR; I tried it of course, and it looks like crap. I still own a 3060 Ti, and DLSS is a must-have on that GPU these days, even at 1080p. My point is that DLSS is much more needed on Nvidia GPUs than FSR is on AMD, because of the lack of VRAM and raw power.

  • @ocha-time
    @ocha-time Год назад

    Unfortunately, if you're turning ray tracing off or don't want to pay extra for what amounts to frame hacks, and you just want a card that works, and works well, with a minimum of dicking around, Nvidia's rarely the card you wanna stick with. And I don't like this mind game they play where turning the settings down is somehow a novel feature. If ten gigs is already starting to suck, I don't want to decrease settings and blur stuff to make it run well again; that's a loser mentality. That is, if I'm buying big. Buying small, a 6650 XT, sure, smear the pixels together all day. But if it's $220 for that or $300 for a 4060 when I have to do the same stuff to get good frames, Nvidia gets the middle finger again.

  • @paulburkey2
    @paulburkey2 5 месяцев назад

    Great points. Personally I like to use a driver-only install with AMD GPUs for lower overhead, plus Afterburner for undervolting; it's just simpler. Though I do lose the ability to overclock my monitor, as well as to use the setting (not sure what it's called) that lets you uncap the FPS without tearing. On this rig the monitor can be pushed to 78Hz vs 75Hz, not a big difference, and you lose FreeSync with the overclock, so it's not worth the overhead of the Radeon software; I just cap the frames at 74 FPS, which is all this RX 570 can handle with flat-line frame times anyway. I just purchased a used Red Devil RX 6800 XT on eBay for $395 ($445 after tax and shipping), currently waiting for delivery, to upgrade my RTX 2080 Super. I plan to undervolt, and I don't use ray tracing or DLSS; I mostly play Fortnite with TSR enabled at high-to-epic details, and it's all about latency for me at 1080p 144Hz. I plan to test it with my 4.75GHz R5 5600X, then with my 5600X3D, which I don't use at the moment because it runs hot on my 240 AIO; I may have to pick up a 360 AIO if the 6800 XT performs better with it vs the non-X3D sibling. That's fine, since I'd like to use this 240 AIO in my Rampage III Gene X5690 Xeon machine, paired with my ASUS Strix RTX 2070 Super; that machine currently has an XFX RX 570 that I customised with an XFX RX 580 cooler, two more heat pipes and a direct-contact RAM plate, and which I also undervolt and overclock. I got the untested RX 580 for parts on eBay for $20; I'm pretty sure it was just dirty and needed a repad/repaste, but my RX 570 is basically new and I didn't want to risk installing a dead GPU on any of my current hardware.

  • @Aaronnpool23
    @Aaronnpool23 2 месяца назад +1

    I have a 6800xt and it’s the best card I’ve ever had sold my 3060ti for it and happy with 16gb Vram

  • @hamzagulsoy1566
    @hamzagulsoy1566 Год назад +1

    RTX 3080 = RX 6900 XT at 4K

  • @bartdierickx4630
    @bartdierickx4630 Год назад +1

    If the extra 6GB of VRAM gives you 2 extra years of gaming (let's say an additional generation of GPUs) before forking out new money, it's worth $112 a year, without taking into account the resale value of both GPUs after 2 years (for the 3080) and 4 years (for the 6800) respectively. And that's without considering whether you do more than gaming on the GPU.
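
    One rough back-of-the-envelope way to land near that $112/year figure (just a sketch; the ~$445 used 6800 XT price quoted in an earlier comment is an assumption here, not something this commenter stated):

    $445 / 4 years ≈ $111 per year, if the 16GB card stays usable for four years
    $445 / 2 years ≈ $222 per year, if running out of VRAM forces an upgrade after two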