RTX 3080 vs RX 6800 XT! | Same FAST Performance, Is Nvidia or AMD Better??

  • Published: Dec 16, 2024

Comments • 834

  • @vextakes
    @vextakes  1 year ago +250

    🚨🚨 NVIDIA SHILL ALERT 🚨🚨

    • @soupdrinker72
      @soupdrinker72 1 year ago +8

      real

    • @GewelReal
      @GewelReal 1 year ago +35

      I love Nvidia! Daddy Jensen make me into one of your jackets

    • @phrog.4809
      @phrog.4809 1 year ago +25

      I decided to buy a 3080 instead of a 6800 XT a few days ago to upgrade from my 6600 XT, the card just shipped too, and now you post this video 💀. The only reason I'm still happy with my decision though is that it was a 12GB card and I got it for $390.
      Update: I got it and I'm happy with it

    • @JasonEllingsworth
      @JasonEllingsworth 1 year ago +9

      There have been videos for years showing the 6800xt as the better card. It also overclocks far better. Mine is as fast as a 3090, using air cooling and amd's stock software, which has improved a lot in the past 3 years. No driver issues either.

    • @lucidnonsense942
      @lucidnonsense942 1 year ago +10

      CUDA workloads aren't a thing you can utilize well with a 3080 - they need a lot of RAM, that's why the professional cards have 24GB. 10GB makes that feature irrelevant - the low memory is a way to force prosumer buyers into paying the Quadro tax. The 3080 is useless in pro apps; you get better performance with an A4000 - a 16GB Quadro card that uses the 3070 core.

  • @BogdanM116
    @BogdanM116 1 year ago +374

    I'm coming back here in 3 days with snacks to read the comments.

  • @chrisswann7578
    @chrisswann7578 1 year ago +70

    I just bought a used RX 6800 non-XT for 350 USD. I came from an RX 6600 XT and I cannot imagine using anything faster. I'll definitely keep it for as long as I can. And to add, I never see my card go above 200W, and that is with a slight tuning which makes it just that much faster. I love being able to max out settings and not run out of VRAM.

    • @hiriotapa1983
      @hiriotapa1983 1 year ago +4

      Bought a 6800 XT Sapphire Nitro+ for 355 usd....

    • @chrisswann7578
      @chrisswann7578 1 year ago +3

      @@hiriotapa1983 Awesome! What a killer deal!

    • @Adrian-tl5ue
      @Adrian-tl5ue 1 year ago +2

      What processor do you have, and what PSU? I have an RTX 2070 Super and want to upgrade to an RX 6800 or 6800 XT with my R7 5700X, and I only have a 650W 80+ Bronze PSU. I think the 2070 Super draws about the same wattage as a 6800.
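
      A rough way to sanity-check that PSU question is to tally typical board-power figures; a minimal sketch, where the wattage numbers are ballpark assumptions rather than measurements:

      ```python
      # Rough PSU headroom estimate for the build asked about above.
      # All wattages are ballpark board-power assumptions, not measurements.
      components = {
          "RX 6800 XT (typical board power)": 300,
          "Ryzen 7 5700X (PPT limit)": 88,
          "Motherboard, RAM, SSD, fans": 75,
      }

      total = sum(components.values())
      psu = 650
      print(f"Estimated load: {total} W of {psu} W ({psu - total} W headroom)")
      # RDNA2 cards can spike above board power for milliseconds, so leaving
      # roughly 30% headroom is a common rule of thumb; this budget clears it.
      ```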

    • @kilroy5680
      @kilroy5680 1 year ago +2

      @@Adrian-tl5ue it's good enough

    • @TheNewmen10
      @TheNewmen10 1 year ago

      @@Adrian-tl5ue Yes, it works. I use a 650 W 80+ Bronze as well!

  • @speng5821
    @speng5821 1 year ago +155

    16GB vs 10GB is no contest.
    Both of these cards are the lowest viable 4K options imo, and at 4K 10GB is going to age like milk (even with FSR2/DLSS Quality lowering the VRAM usage, which you'll probably be using, as 4K native requires serious horsepower).
    In blind tests comparing FSR2 and DLSS at 4K Quality, 99% of people won't be able to spot the slightly better image quality DLSS offers. The differences become more exaggerated using more aggressive upscaling at lower resolutions.
    If you don't stream or do any productivity work and have a 4K monitor, then the 6800 XT just seems like the better option. £400 used, and lower power, which is a factor in the UK where energy is expensive.
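
    For context on the upscaling point above: both FSR 2 and DLSS "Quality" modes render at roughly 1/1.5 of the output resolution per axis, which is where the VRAM savings come from. A small sketch of that arithmetic (the scale factors are the commonly published defaults; individual games may differ):

    ```python
    # Internal render resolution for common upscaler modes at 4K output.
    # Per-axis scale factors are the commonly published FSR 2 / DLSS defaults.
    MODES = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

    def internal_res(w, h, mode):
        s = MODES[mode]
        return round(w * s), round(h * s)

    for mode in MODES:
        w, h = internal_res(3840, 2160, mode)
        print(f"{mode:>11}: renders at {w}x{h}, presents at 3840x2160")
    # Quality at 4K shades a 2560x1440 frame, which is why upscaling also
    # trims VRAM use relative to native 4K.
    ```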

    • @speng5821
      @speng5821 1 year ago +11

      As for RT: I've always been about optimising graphics settings. Even before RT, I'd be turning shadows down a few notches from ultra for more performance with very little visual difference. Even on a 3080, RT is expensive, and I'd much rather target 120 FPS than have some eye candy.

    • @Ober1kenobi
      @Ober1kenobi 1 year ago +2

      Texture sliders.
      So long as you keep the VRAM in check,
      it'll be faster, no?

    • @speng5821
      @speng5821 1 year ago +14

      @@Ober1kenobi But that's the issue: I don't want to have to think about keeping my VRAM in check. 4K textures are one of the best ways to improve the image; what's the point of playing at high resolution using low-resolution textures? PC gaming and the pursuit of smooth frametimes is finicky enough without having to worry about VRAM as well

    • @stangamer1151
      @stangamer1151 1 year ago +15

      Well, I can easily see the difference between DLSS Quality and FSR Quality at 4K rendered resolution. Even on my old 1080p screen, let alone 4K screen! DLSS provides much better AA quality and less ghosting. The overall image looks way smoother with DLSS.
      Plus, RTX cards also offer DLAA - the best currently available AA solution. And also DLDSR, which is an AI driven downscaler.

    • @thee-sportspantheon330
      @thee-sportspantheon330 1 year ago +16

      @@stangamer1151 Cope. I will go nom nom on my vram.

  • @rain8478
    @rain8478 1 year ago +37

    I was in this boat and grabbed a used 6800XT over a used 3080. There are two main reasons for this:
    1. 16 GB VRAM, self explanatory. Yes I bought into the fearmongering, boohoo.
    2. GDDR6X is an actual risk when buying ex-miner cards, GDDR6 not so much.
    This is really all there is to it.
    I've already had a 6700 XT before, so that kinda made it easy to know what to expect. It might be a little harder for others to risk going to a different brand, especially on the used market. Though I don't think the 3080 is cheaper because it's valued less; I think it has more to do with just how many 3080s Nvidia produced and sold to miners. I just don't think there are as many 6800 XTs out there in total.

    • @person1745
      @person1745 1 year ago +2

      G6 might not be as big a risk as G6X, but I think it really depends on the AIB rather than the card itself. Also, fast G6 can produce just as much heat as slower G6X, like on the 7900 XT.
      The reference 6700 XT, for example, has G6 but is known to run its memory close to 100 degrees. It's all about how good the cooler is.

    • @rain8478
      @rain8478 1 year ago

      @@person1745 Well, if you're buying an ex-mining card known to run its memory at 100C, you're kinda asking for problems.

    • @molochi
      @molochi 1 year ago +2

      I think the fearmongering about 8GB is valid. 12GB on a 192-bit bus is probably enough, but I wanted 256-bit, and that's 16GB.

    • @forog1
      @forog1 1 year ago +1

      @@person1745 AMD OCs those GDDR6 cards from the factory, which is why they produce far more heat than they should. 20 Gbps seems out of spec to me for GDDR6 non-X. I think they did that out of desperation for more performance when they realized their RX 7900 series was not hitting target numbers at 16 or 18 Gbps (stock GDDR6 speeds). Plus, if my 7900 XTX clocks its VRAM up to the full 20 Gbps on the desktop, it consumes like 80W idle, up from 10W idle... yeah, they pushed and overvolted these RAM chips.

    • @rain8478
      @rain8478 1 year ago

      @@forog1 You can undervolt the SoC, for what it's worth, with MPT to reduce power draw a little bit. It works on RDNA2, but I can't guarantee anything for RDNA3.

  • @ballcandy1117
    @ballcandy1117 1 year ago +148

    I grabbed a 6800 XT for $150 less than the cheapest 3080 on the used market, and 4070s are still over $300 more. I don't know why anyone would spend more instead of going AMD; but ultimately, besides the great price, the 16GB of VRAM was really what made me buy it.

    • @suzie9874
      @suzie9874 1 year ago +5

      A couple weeks ago I swapped my 6800 XT for a 3080 (10GB), and the only reason I swapped with my nephew was for AI. While many say the AMD GPU can run Stable Diffusion, I could never get it to work. So I swapped with my nephew, and SD works flawlessly now. Plus the games I play run about the same on both GPUs, so nothing lost there

    • @xwar_88x30
      @xwar_88x30 1 year ago +21

      Good choice my friend. The 6800 XT will age like fine wine and outperform the 3080 as time goes on, and it even matches or beats the 4070; with more VRAM it'll age very well. Nvidia cards get weaker as they get older while AMD GPUs just get better, as videos have proven. Nvidia have been trash since the 2000 series. I'm hoping AMD can knock Nvidia off their high horse a bit, since they're just taking the complete mick out of their consumers. As I've said in another comment, look how smooth the frame-time graph is on the 6800 XT compared to the 3080; AMD is defo the smoother experience. Even the 5000 series had smoother frames compared to the 2000 series. Nvidia are more interested in marketing BS: RT, which tanks FPS on all GPUs, their DLSS, AI. It's all "buy our product for these features" BS while their 3070/3080/4060 Ti suffer because of VRAM limitations. Absolute joke. If the 3070, 3080 and 4060 Ti had 12GB+ of VRAM they would be amazing GPUs, but Nvidia are more interested in ripping customers off and forcing them to upgrade. Horrible greed of a company.

    • @joee7452
      @joee7452 1 year ago

      @@xwar_88x30 I would agree normally, but the 3080 vs 6800 XT seems to buck the trend. If you look around you can see a bunch of people and places doing the comparison again, and contrary to the norm, overall the 3080 is stronger against the 6800 XT. I came here from a video comparing them that tested 50 games, and the 3080 was on average faster than the 6800 XT by a higher percentage than a couple of years ago. Now, it wasn't much of a change (like 9% on average at 4K vs the 7% it was a couple of years ago; 1440p was up the same), but it bucked the normal trend of AMD cards getting stronger vs Nvidia's. I thought it was funny.
      If I had to pick, though, I would still go with the 6800 XT, because if you look and watch you can find them new for around $500 or a little under sometimes. You are not going to find a 3080 new in that range, or even in the same zip code.

    • @jamesdoe7605
      @jamesdoe7605 1 year ago +3

      @@xwar_88x30 lol no

    • @xwar_88x30
      @xwar_88x30 1 year ago +7

      @@jamesdoe7605 settle down there James, don't be silly now.

  • @RN1441
    @RN1441 1 year ago +55

    After seeing how ridiculous the prices were getting on the new generations of cards at the end of last year and start of 2023, I decided to grab one of these 10GB 3080's used for a deal. The crop of new games which exhaust its VRAM started showing up almost immediately after this, so I probably should have waited a few months :D Oh well.

    • @michaelilie1629
      @michaelilie1629 1 year ago +5

      what about medium settings and dlss/fsr?

    • @princekatana8792
      @princekatana8792 1 year ago

      @@michaelilie1629 Imagine thinking about using medium settings with a last-generation high-end card. Pathetic

    • @RN1441
      @RN1441 1 year ago +2

      @@michaelilie1629 I'm mostly focused on 1440p since that's my native resolution, so I'll just have to turn off eye candy until it hits the G-Sync window

    • @mato_s
      @mato_s 1 year ago +7

      Same but i got a 3070 so im so screwed 💀

    • @stewenw4120
      @stewenw4120 1 year ago +4

      @@michaelilie1629 Yeah, I guess he bought a 400-600 €/$ card to play on medium settings. Nice advice ^^
      The question is whether you need more than ~80 FPS in the games that require so much VRAM. I think the 3080 should do the trick for the next 2 years.
      But since I don't care for RT and the other stuff Nvidia offers, I got myself the 6800 XT anyway.

  • @RobBCactive
    @RobBCactive 1 year ago +98

    That's cool that you're using the RX 6800 XT for editing; that used to be Nvidia Pro territory.
    The fact is AMD have improved the software a lot, and CUDA support via ROCm is coming out too.
    The thing is, Vex has taken the plunge late in the game. There were deals on the AMD cards long ago, so you'd have had the usage out of it instead of waiting for 3080 prices to settle on the used market. Starfield is nice but not relevant to everyone.
    Unfortunately a lot of people ignored MSRP reductions and lower prices, scared by all the FUD put out about drivers etc. That doesn't send the market leader the signals it requires to reduce its margins.
    Gamers bitching about prices isn't enough; AMD have to be able to justify investment into features.
    That requires sales when they are close, because development costs are a real thing.

    • @molochi
      @molochi 1 year ago +5

      I thought it was considerate of NV to price their 4080 so high. Gives the 7900 XTX a chance to make money for AMD as well. They really are gentlemen.

    • @RobBCactive
      @RobBCactive 1 year ago +4

      @@molochi Well, if Navi31 had met its expected performance targets, the 4080/4070 Ti would have looked very stupid, weak and totally overpriced.
      Scott Herkelman intended to "kick Nvidia's ass".
      The 7900 XT would have been crushing the 4080 12GB, and the XTX the 16GB, while much cheaper. But fixes to the driver caused a severe performance drop, and they haven't found a general mitigation. Radical changes in architecture are inherently risky; RDNA3 has disappointed.
      Had the targets been met, Nvidia's added features like fake frames & DLSS would not have been enough; they'd have been forced to cut prices or cede market share. Nvidia were maintaining high prices because of their vast RTX 30 GPU stockpile. Now they're cutting production to the contractual minimum rather than sell them cheaper; they figure gamers will relent eventually and cough up.
      The big money is in AI at the moment, so they're hoping to max out Hopper.

    • @ronaldhunt7617
      @ronaldhunt7617 1 year ago

      The emulated CUDA is not nearly as good as actual CUDA, and I hear AMD is going to do the same with Tensor cores as well. So not as good, but way better than nothing. If you own AMD already it is a huge win; if you are buying a new GPU and will be utilizing CUDA/Tensor, then you are better off with Nvidia. Video editing and other productivity software results are going to depend mainly on the software you are using: Premiere Pro favors Nvidia much more, while some free video editing software may or may not be more of a contender.

    • @RobBCactive
      @RobBCactive 1 year ago +3

      @@ronaldhunt7617 CUDA is a way to program a GPU: it takes source code and compiles it for an application. Calling it emulation just shows your game: spreading FUD and confusing people.
      The AI boom is for large models and so-called deep learning, with huge demand for data center products.
      What you need for running neural nets versus training them is very different; laptops are coming out with AI acceleration in cooperation with MS.
      Like Apple has had, without any Nvidia hardware, as an accelerator on the CPU die.

    • @ronaldhunt7617
      @ronaldhunt7617 1 year ago

      @@RobBCactive Not sure what you are trying to get at here. Nvidia has dedicated CUDA cores that process independently, making certain computations a lot faster. AMD does not have CUDA cores; instead they (just like with everything else) saw what Nvidia had done and added stream processors, which are not the same and not as good. Just like ray-tracing cores, and now it seems Tensor cores (for AI), but only for the 6000 and newer GPUs; not to mention upscaling and frame generation (which AMD does not have as of now). Monkey see, monkey do... but not at the same level.

  • @chrispittmanComputing
    @chrispittmanComputing 1 year ago +24

    Make sure your 5900X is running a negative all-core offset in Curve Optimizer (start at negative 15), and if you're not running Samsung B-die, get a set of 3600 CL14 and really tighten down the timings. This will give you a noticeable uplift on your 5900X.

    • @therecoverer2481
      @therecoverer2481 1 year ago +1

      Hey, quick question: my RAM (G.Skill Sniper X 3600 CL19 @ 1.35V) is detected as Samsung B-die in Thaiphoon Burner, but when I try to tighten the timings even by just 1 it refuses to boot, even at 1.45V. The weird thing is I can lower the voltage to 1.25V when running XMP. Is it possible that I got trash B-die, or am I doing something wrong? My CPU is an R5 5600

    • @Wabos123
      @Wabos123 1 year ago

      @@therecoverer2481 It's the B-die. I had the same issue with my 5600X using Silicon Power RAM from Amazon. I ended up swapping to some Team Group T-Create 3600MHz CL18 and was able to undervolt and adjust timings.

    • @droptoasterintub297
      @droptoasterintub297 1 year ago

      @@therecoverer2481 3600 MT/s with C19? That's definitely not B-Die. Guaranteed B-Die bins would be the likes of 3200 C14, 3600 C14, 4000 C15, etc. C19 is incredibly loose even at 3600. You shouldn't trust Thaiphoon Burner as it is known to have false positives, and in this case, it is glaringly obvious this is one of those false positives. I almost would say those ICs could actually be Samsung C-Die as they are known to become unstable over 1.35v, no matter the timings. It would also explain the very loose primary timing(s). I'd get ahold of a 3600 C14 kit as this is the best sweet spot for performance on Ryzen. Getting kits over 3600 MT/s isn't beneficial as they aren't necessarily better bins, but pricier; and, you almost can never go over 1900 FCLK which is 3800 MT/s when synced 1:1. Some Zen 3 samples may not even be able to do 1900 FCLK and need to step down to 1866 or 1800 (3733 MT/s and 3600 MT/s respectively). A B-Die 3600 C14 kit should easily do 3800 at the same timings on stock voltage most of the time.
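
      The 1:1 arithmetic in that comment is easy to check: DDR4 transfers twice per memory clock, so the FCLK needed for sync is MT/s divided by 2. A minimal sketch, taking ~1900 MHz as the commonly reported (sample-dependent) Zen 3 FCLK ceiling:

      ```python
      # DDR4 is double data rate, so a 1:1 FCLK is the transfer rate halved.
      FCLK_CEILING = 1900  # MHz; commonly reported Zen 3 limit, sample-dependent

      for kit in (3200, 3600, 3733, 3800, 4000):
          fclk = kit // 2
          verdict = "1:1 OK" if fclk <= FCLK_CEILING else "breaks 1:1 sync"
          print(f"DDR4-{kit}: needs FCLK {fclk} MHz -> {verdict}")
      # DDR4-3800 -> 1900 MHz is the edge case described above; kits rated
      # faster than that usually can't hold 1:1 and lose latency instead.
      ```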

  • @gamerforever4837
    @gamerforever4837 1 year ago +4

    Why do you say one thing, then contradict what you just said? "I don't think the extra 6 gigs of VRAM is that big of a deal," then less than a minute later, "but the 16 gigs is awesome, allowing you to enable higher settings."

  • @Ratich
    @Ratich 1 year ago +15

    The problem with the encoding argument is that I don't use that feature, nor do I have a use case for CUDA, and I'm probably going to stick to pure rasterisation rather than turning on RT, because upscaling or not, the performance hit is too much. So for me the 6800 XT is the better option.

    • @chetu6792
      @chetu6792 1 year ago +5

      That's the thing. Everybody talks about the Nvidia feature advantage, but besides DLSS, most features are used rather rarely

    • @evrythingis1
      @evrythingis1 1 year ago

      @@chetu6792 DLSS is just another scarcely used feature. They have to pay companies to implement it in their games; how ridiculous is that?

    • @gotworc
      @gotworc 1 year ago

      @@evrythingis1 I mean, that's literally what most companies do to push their tech. You gotta pay to play. Why would a company offer to put your tech in their game if they have nothing to gain from it? AMD does it too. While I don't think Nvidia GPUs are particularly good value right now, since we're at the beginning of this era of AI being used to improve graphics computing, I don't think the technology is bad or a gimmick like most people are trying to say

    • @evrythingis1
      @evrythingis1 1 year ago +1

      @@gotworc go ahead and tell me what proprietary features AMD has had to pay game devs to implement, I'll wait.

  • @ChusmaChusme
    @ChusmaChusme 1 year ago +10

    0:05 I'm like 95% sure the mining boom began right when that generation launched. I remember it because it was impossible to buy any GPU at launch.

    • @Chrissy717
      @Chrissy717 1 year ago

      That was actually the COVID shortages: 2020 and the first lockdowns caused supply-chain issues on a global scale

  • @enricod.7198
    @enricod.7198 1 year ago +40

    Could you include a software showcase comparing the Nvidia Control Panel and AMD Adrenalin? I've used both in recent times and AMD's is way better, especially since it comes with an OC/UV suite that's far easier to use than having to resort to Afterburner. These things should be considered when doing a comparison.

    • @Raums
      @Raums 1 year ago +9

      See I’d love to see this too, haven’t had an AMD card in years but hated the software back then and have heard it’s improved a lot. Never really had an issue with nvidia

    • @enricod.7198
      @enricod.7198 1 year ago +15

      @@Raums Tried both in the last 2 years. As I always undervolt my GPU, I must say that AMD's is way easier and works better. Needing third-party software, made by a single outside developer, to undervolt a GPU from the market leader is embarrassing imo. AMD's software is great these days, and the drivers, while they sometimes have issues like Nvidia's, bring way more performance improvements over the months compared to Nvidia's.

    • @memoli801
      @memoli801 1 year ago +8

      They always leave this important fact out.
      But is CUDA so important for a gamer?
      We are not all content creators! Don't give a shit

    • @jjlw2378
      @jjlw2378 1 year ago +2

      The fact that AMD has a built-in OC/UV utility is actually a bad thing. They have already used it to artificially limit the speeds to which you can tune your GPU. You want overclocking software to be third-party, because a third party will provide the most impartial and least limited features.

    • @FenrirAlter
      @FenrirAlter 1 year ago +2

      @@jjlw2378 🤡

  • @MahBones
    @MahBones 1 year ago +26

    For me at the moment it's between the 4070 and the 7900gre but I'll have to see where that price lands in Aus. The 40 series power consumption is pretty compelling.

    • @Hydra_X9K_Music
      @Hydra_X9K_Music 1 year ago +2

      Not sure if this also matters to you, but the 4070 has a nicely small form factor. Can fit into a pretty wide range of case sizes

    • @Nein99x
      @Nein99x 1 year ago +2

      The 7900GRE will be limited to pre-built pcs. You might as well get the 4070.

    • @Dereageerder
      @Dereageerder 1 year ago +1

      Get the 4070, AMD is garbage beneath the 7900 xt and xtx

    • @chriswright8074
      @chriswright8074 1 year ago +12

      @@Dereageerder Cap. The 6950 XT definitely gaps or beats the 4070 at times

    • @Mangk89
      @Mangk89 1 year ago +7

      @@chriswright8074 I think he meant anything in the 7000 lineup, but I get your point

  • @evergaolbird
    @evergaolbird 1 year ago +19

    I would argue that VRAM matters for PC gamers who are not into competitive gaming but rather into the modding scene. NVIDIA is doubling down on this with RTX Remix on the RTX 40 series, because they know demand from the modding scene in the PC landscape remains high. Skyrim is almost 12 years old, but to this day it has not slowed down; the modding scene just keeps hitting new peaks instead (check the view counts on YouTube or the player counts on SteamDB - it's still relevant).
    High VRAM capacity and memory bandwidth are important in Skyrim. I have 1750 mods installed on my 5900X + RX 6700 XT, and even with my 12GB of VRAM there's a lot of planning I do about which mods to keep because of VRAM demand.

    • @siyzerix
      @siyzerix 1 year ago

      I agree. Skyrim modding also requires the fastest CPU to play smoothly, so an X3D CPU is almost a must.

    • @evergaolbird
      @evergaolbird 1 year ago

      Very true. Skyrim is one of those games that makes demands not just of the GPU but also the CPU. Not to mention at least a Gen3 NVMe drive is required to even make massive LODs work; system RAM too.
      Skyrim's modding scene is probably the only one out there which is like a symphony: it uses everything in the PC, especially if the person knows how to mod.

    • @siyzerix
      @siyzerix 1 year ago

      @@evergaolbird Indeed. My laptop has 32GB of DDR4-3200 RAM, an i7-12650H, a 150W 3070 Ti and a 1TB Gen4 NVMe SSD. Skyrim SE still sees my CPU being the limitation. That's despite me giving my CPU all the power it needs.

    • @siyzerix
      @siyzerix 1 year ago

      @@TheAscendedHuman The mobile 3070 Ti has the same chip and specs as the desktop 3070 except TDP and clocks; the 150W variant is within 10% of the desktop 3070.
      The i7-12650H is more akin to an i5-12600 non-K (if that exists).
      I'd say that's a solid setup, even by desktop standards. Because I know not that many people are buying even a 3060 Ti or 6700 XT on desktop; most have a 3060 or 2060.
      My point is, modded Skyrim SE just needs way too fast a CPU. I literally cannot use my GPU to its fullest, even if I let the CPU run wild. I hit the draw-call limit real fast, and exceed it too, causing FPS drops.

  • @DeadOrAliveGames
    @DeadOrAliveGames 1 year ago +8

    I got the 6700 XT Spectral White edition last week for £300, and I'm so impressed with its performance. 1440p ultra, no upscaling 🎉 Glad I didn't get the 4060!

  • @Paulie8K
    @Paulie8K 1 year ago +15

    Nice breakdown. I've had the 3080 10GB since December 2021 and it's been amazing. I tend to play games that are a bit older, because I have a rule where I wait at least a year after release for games to get patched and go on sale, so I have nothing from 2023; but I've played dozens of games from 2015-2022 and the 3080 ran them beautifully at 1440p.

    • @MrBorisRoo
      @MrBorisRoo 1 year ago +1

      I don't have problems at 4K with my i7-9700... 60 FPS is stable... Now I can never go back to 1080p

    • @unreleasedcontent9316
      @unreleasedcontent9316 1 year ago

      Right, the 16GB of VRAM isn't all that; 10GB is all you need, and Nvidia still gives you advantages

    • @Cybersawz
      @Cybersawz 10 months ago +2

      I have had that same 10GB card for almost 2 years, and use it for 4K gaming on high settings for most couple year old games. Granted, the latest AAA games can't be maxed out but still look pretty awesome on high. I run an I9-13900K with 32GB RAM.

    • @Paulie8K
      @Paulie8K 10 months ago

      @@Cybersawz nice. I have 32gb of ram too but have a 3700X which I may upgrade to a 5800X3D. But at 4K, I'd be GPU bound so probably would be the same results as your 13900K there.

  • @syncmonism
    @syncmonism 1 year ago +15

    DLSS doesn't eliminate the value of Vram, but it can let you get away with using less with a small reduction to image quality at a given resolution. Also, while DLSS can help compensate for a lack of Vram, FSR can as well, it's just not as good at it at 1440p, but it gets quite hard to tell the difference between the two when upscaling to 4k, at least with the highest quality settings.
    I did own a 3080 for a while, and have played around a lot with DLSS and ray tracing in Cyberpunk, and also in Control. Running without any upscaling at all is still better than running with DLSS, as long as you can get a high enough frame rate, and I find it very hard to recommend a card which costs $80-100 more but has significantly less VRAM if it has about the same amount of standard raster performance. Ray tracing in Cyberpunk required using DLSS, and still ran significantly slower than running at 1440p native without ray tracing. I just didn't think it was really worth using ray tracing; it never seemed obvious that it was worth running with it turned on, though it certainly was usable, and did look good, running at 1440p with DLSS. With Control, I found that the performance was good enough to run with ray tracing and without any upscaling, but the game did still run noticeably smoother with ray tracing off, and the ray tracing itself was still not all that compelling to me.
    I found that the lighting in both games, even without ray tracing, was amazing. A lot of the lighting effects even with ray tracing turned on, are the same either way, unless you go with full path tracing, but I obviously had nowhere near enough performance to run with full path tracing.
    The 4070 isn't terrible, and it's not going to suddenly become obsolete because it doesn't have enough VRAM at any time in the next 4-5 years, but it would have been a LOT better if it had had 16GB of VRAM. It's not like 16GB would have been overkill on it just because it has DLSS; that would have made the card that much better, and it would have had more memory bandwidth as well, which would also be nice. A 16GB 4070 at $660 would have been a significantly better value than the 12GB version is at $600.

  • @Pand0rasAct0r_
    @Pand0rasAct0r_ 1 year ago +4

    I have to disagree on the feature set. And before people call me a shill, I have been using Nvidia all my life; the 6950 XT is the first AMD GPU I have owned and used.
    The DLSS vs FSR argument is in most cases highly blown up. Yes, DLSS provides a better picture, BUT you will almost never notice this during gaming, only if you pause them and watch them next to each other. If you have to go to such lengths to spot something, then yeah, that's just not an upside in my opinion.
    And ray tracing is better on Nvidia, although not bad on AMD either. But the cards we have right now, especially the 3080/3090/6800/6900, are just not ray-tracing cards. Neither is the 4070 Ti or 4080 or 7900 XT or XTX. The only card capable of ray tracing at minimum is the 4090, and even that card sucks at it in many ways. So if you plan to ray trace, you really shouldn't be looking at these cards.
    The only upside is CUDA, but if you are just a gamer you wouldn't care about it.
    And the VRAM is just so important. There are so many games I see with my 6950 XT that these days shoot past 10GB.
    And I wouldn't choose the 4070 above the 6800 XT. 12GB is alright, but as I said, I've seen games shoot past 10GB, heck even 13GB, so the 4070 would already be having issues. And that at 600+ bucks new in Europe; just not worth it in my opinion. At that point you may as well buy a 6950 XT if they are still available.
    Most people buy these cards for raster, and in that case most AMD cards just beat Nvidia's.

  • @JanJirásek-n1n
    @JanJirásek-n1n 5 months ago +4

    I got a brand-new 6800 ASRock Challenger Pro (non-XT) for 360 USD in May 2023, and I absolutely love the card

  • @NinjaWeedle
    @NinjaWeedle 1 year ago +5

    Nabbed my Red Dragon 6800 XT this Prime Day for $479. Pretty happy with it, although I do sometimes find myself wishing I still had the CUDA support my 1070 Ti had. Looking forward to ROCm.

  • @adikonas1978
    @adikonas1978 1 year ago +3

    Can you tell me what the temperatures of the card are, especially if you use a 1440p ultrawide monitor? For me, with the highest details, the junction (hotspot) temperature was 100 degrees Celsius

    • @Anto-xh5vn
      @Anto-xh5vn 6 months ago

      Um what, holy shite bro, that's probably not normal... or is it?

    • @adikonas1978
      @adikonas1978 6 months ago

      @@Anto-xh5vn As far as I checked it's normal: 10 degrees before thermal throttling. With an undervolt it's around 90

  • @lucidnonsense942
    @lucidnonsense942 1 year ago +14

    The problem with using a 3080 for productivity apps is that 10GB is REALLY not enough for doing actual work with. PLUS, nVidia's drivers limit many of the professional features you need to the Quadro drivers. The 3080's Quadro equivalent is the A5000/A5500 with 24GB of VRAM and a price around 1k-2k. You will get better performance than a 3080 in most CUDA workloads with an RTX A4000, a 16GB 3070 equivalent, because 10GB for any significant work is waaayyy too low - assuming the drivers even allow the full feature set in the application on a non-Quadro card.
    As far as AI workloads go, which are much more my wheelhouse: ROCm is feature-complete, and CUDA isn't as relevant in 2023 as it was in 2021 for the AI space. 10GB, again, cripples the card for anything more than fiddling around as a hobby. Try to render a 1440p scene with a 10GB card vs a 16GB one; it's not even funny how memory-crippled you will be. You will get performance equivalent to a 6700 XT with 12GB, which you can get for much cheaper. Additionally, we tend to put GPU render farms on a Linux distro, where AMD has much more mature drivers and ROCm support. Specialised AI accelerators are a whole different kettle of fish; in that space you will be writing your own custom libraries, tuned to whichever vendor allocated some boards for you. Nobody is going to be picky about which one; the lead times are insane, as everything is pre-sold before being fabbed. You take what you can get, pay what is asked and count yourselves lucky.
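
    To put rough numbers on the 10GB-vs-16GB point for compute work, a back-of-the-envelope sketch; the 4x overhead multiplier (weights + gradients + optimizer state + activations) is a rule of thumb, not a measured figure:

    ```python
    # Back-of-the-envelope VRAM estimate for training-style workloads in fp16.
    # The 4x multiplier (weights + grads + optimizer state + activations)
    # is a rough rule of thumb, not a measurement.
    def training_vram_gb(params_billions, bytes_per_param=2, overhead=4.0):
        return params_billions * bytes_per_param * overhead

    for b in (1.0, 1.5, 3.0):
        need = training_vram_gb(b)
        print(f"{b}B params: ~{need:.0f} GB -> "
              f"fits 10GB: {need <= 10}, fits 16GB: {need <= 16}")
    # A ~1.5B-parameter job already overflows 10GB while fitting in 16GB,
    # which is the gap the comment above is describing.
    ```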

  • @keola2
    @keola2 1 year ago +15

    VRAM becomes a huge problem when you don't have enough. I'd like to see some benchmark examples with Starfield 8K or maybe even 16K texture mods; watch that 10GB absolutely tank in performance. With that said, right now 10GB is good enough, it's just not very future-proof. With new games using more and more VRAM every year, I'm sure that's a concern for many people. Despite all of that, I'm sure the used market is just flooded with more 3080s because of the end of the crypto craze.

    • @Dereageerder
      @Dereageerder 1 year ago +8

      8K and 16K textures are more for the 90-class cards. 10GB is plenty for Starfield

    • @Eleganttf2
      @Eleganttf2 1 year ago

      wtf u need 8k texture for ? you crazy and delusional

    • @justjoe5373
      @justjoe5373 1 year ago +1

      Only a concern if you max them out. Gaming at 1080p, I've never had to drop more than 3 settings from max for a stable 60 at 8GB. I haven't played Hogwarts Legacy, TLOU etc., but they don't interest me anyway, and I bet you can play them perfectly fine with a few nonsense settings turned down. It's the same deal for higher resolutions: turn down a setting or 2 and watch the game become playable, usually with little visual difference. Ultra settings are beyond the curve of diminishing returns for performance hit vs visuals; when an amount of VRAM isn't enough to run a game at medium-high, then it's too little.
      10GB of VRAM is gonna be fine; the Xbox Series S has 10GB total that's shared between the GPU and CPU. 10GB may be laughable on a 3080, but in and of itself that amount isn't an issue

    • @Ober1kenobi
      @Ober1kenobi 1 year ago +7

      When was the card designed for 8K?
      Do you have an 8K panel?
      You have a $3000 monitor?
      Lol
      Same as testing a 4060 in 2023 at 4K: it doesn't make sense. It 'can', up to a certain texture level.
      Would I want to? No

    • @keola2
      @keola2 1 year ago

      I agree with you both, like I said, 10GB is good enough, especially if you're doing 1080p. It's more of a worry with future, poorly optimized games from AAA devs using more vram than necessary. It's like that saying, I'd rather have it and not need it, than need it and not have it.

  • @coolumar335
    @coolumar335 1 year ago +2

    Dude is capping by saying 16GB of VRAM doesn't matter.
    Extra VRAM allowed cards like the RX 580 8GB and even the R9 390 8GB to extend their lifespans way beyond what was initially expected of them.
    The 6800 XT is a great long-term purchase and will be viable for 1440p for at least 3-4 more years.

  • @AZTech25
    @AZTech25 2 days ago

    This is my first time seeing your content. Very good job and full of useful information. Well-made video!

  • @SmokinDave7373
    @SmokinDave7373 1 year ago +4

    Nice video, man. I jumped ship from my 3070 to a 7900 XT because it was 200 AUD cheaper than the 4070 Ti, and besides not having DLSS as an option, I am super happy with my AMD card. In pure raster performance I get on average 15-20% more FPS in most games (compared to the 4070 Ti). The only issue I have had with my 7900 XT TUF is bad coil whine, which I am hoping I can send it back for soon and see if I get better luck of the draw. Keep up the videos; you are a great new entry point into the PC DIY learning YouTube channels.

    • @xwar_88x30
      @xwar_88x30 1 year ago +1

      You're not missing out on DLSS, since AMD has FSR built into their drivers, which you can use in any game, and it still looks amazing. Should defo try it out. It's under Super Resolution in the driver menu; I tend to have the sharpness at either 40 or 60, looks good imo.

  • @alzarpomario889
    @alzarpomario889 1 year ago +3

    I'll just put in my two cents about what is keeping me, and a few other folks, on the AMD side when buying a new GPU.
    Support for open standards, like FreeSync, FSR and OpenCL/ROCm: I don't like vendor lock-in, so I support agnostic technologies.
    I'm not the guy who cracks professional software just to tell my friends: I have Photoshop 2026, idk how to use it but I have it.
    So I usually go for open software, and I've never had a regret, in both my private labs and professionally.
    But the main plus above all is the Unix support.
    At home I can play Windows games on Linux flawlessly without having to tinker with monthly driver updates; it just works... and 2005-class hardware is still supported.
    At work I greatly extend hardware lifespans for the same reason, and this philosophy allows us to offer fast and reliable Citrix-like remote desktops with GPU passthrough of graphics cards that would now be e-waste if made by Nvidia.
    Intel is now in the middle between the AMD and Nvidia philosophies, and I hope it will land on the AMD view of the HW/SW stack.

  • @osmanbicer14
    @osmanbicer14 1 year ago +12

    Upgraded from 6700K, GTX 1080 and 1080p gaming to RTX 4070, 7600x, 6000 MHz 32 GB DDR5 and Dell G3223D monitor. It was a nice upgrade.

    • @fanATIc666x
      @fanATIc666x 1 year ago +3

      I upgraded from a 4690K and GTX 970 to a 7800X3D and 4080, with a G2724D monitor

    • @Ghostlynotme445
      @Ghostlynotme445 1 year ago +2

      I upgraded from an i3-9100F and the original GTX 1650 to a Ryzen 9 7900X3D and RX 7900 XTX

    • @xwar_88x30
      @xwar_88x30 1 year ago +3

      @@Ghostlynotme445 good choice, better upgrade than the other two noobs 😂😂

    • @fanATIc666x
      @fanATIc666x 1 year ago +1

      @@xwar_88x30 ATI fanboy here

    • @osmanbicer14
      @osmanbicer14 1 year ago

      AMD users have to adjust settings in the Radeon software "to get a stable experience", whereas Nvidia users just play the game without doing any adjustments. And when RT is turned on, AMD goes insane hahaha
      @@xwar_88x30

  • @victorhogvall2065
    @victorhogvall2065 1 year ago +1

    Got a 6800 XT paired with a 43-inch 4K screen, and I can tell you that using Quality FSR at 4K I cannot tell the difference, and at 1440p with these cards you don't even have to use upscaling.

  • @yukiokasamatsu
    @yukiokasamatsu 8 months ago

    Hey, if you see this, I have a question: these days, if there's no price difference, which would you take? (Basically RT or VRAM, for all-around use)

  • @RannonSi
    @RannonSi 1 year ago +24

    Personally, I think DLSS is a crutch technology and should be advertised as a way to keep your e.g. x080 viable for another generation, or three. Not as something to make games playable when the card's still pretty new.
    Edit: fixed spelling mistakes and suchlike.

    • @KEWords
      @KEWords 1 year ago +1

      Like AA or anything else that improves quality over rendered resolution does? Crutch. Do you play games, or freeze-frame and pixel-peep? DLSS gives you a choice, and when I choose Quality I get more FPS at the same visual quality on the same card. It's like a 5th gear that gives me better mileage. If it was not valuable, AMD would not be bribing bundle partners like Starfield to not support it.

    • @_n8thagr8_63
      @_n8thagr8_63 1 year ago +2

      it's not a "crutch" technology. There's more to making a game run nicely than just raw horsepower. DLSS is an excellent feature and a great example of hardware and software harmony

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +1

      Depends on how you see it. If DLSS is a "crutch", then rasterization is also a crutch.

    • @GeneralS1mba
      @GeneralS1mba 1 year ago

      @@_n8thagr8_63 it's only a crutch for devs who don't know what optimization means imo

    • @evrythingis1
      @evrythingis1 1 year ago +1

      @@arenzricodexd4409 You don't know how DLSS works do you?

  • @princekatana8792
    @princekatana8792 1 year ago +3

    Why doesn't anybody talk about the RTX 3080 12GB? Every review I see has the 10GB. Not that I think it would make that much of a difference, but I'm still curious, as that is the card I have.

    • @MahBones
      @MahBones 1 year ago +7

      Just never see them, never seen one for sale in my local shops.

    • @vincentvega3093
      @vincentvega3093 1 year ago +2

      It released a year later for almost 3080 Ti money. Only miners bought them

    • @shawnadams1965
      @shawnadams1965 1 year ago +1

      I think it's because most of us bought the 10-gig version when it came out; at the time we didn't know they would release a 12-gig version. I sold my 10-gig 3080 (that I bought as part of a full system upgrade at the same time as the rest of the components) for 450€ and bought a 4070 on sale for 580€, so a nice side-grade for 130€. It should hold me until I upgrade my entire system again in a few years.

    • @Legnalious
      @Legnalious 1 year ago +2

      Because the 10gb was the original. The 12gb was also $200 more expensive (at least). So odds are that there are significantly more people who have the 10gb than the 12 gb variant. So it just makes sense to use that version.

    • @princekatana8792
      @princekatana8792 1 year ago

      I always buy a generation behind, so by the time I bought it, the mining craze had all but died down. @@JJssss

  • @johnrehak
    @johnrehak 1 year ago +1

    The 3080 shot itself in the leg with 10GB of VRAM. I would buy a 3080 12GB, or a 3080 Ti which also has 12GB, if it's not significantly more expensive.
    Was Smart Access Memory turned on when you tested both cards? In general both cards are the same until you turn on SAM; then the 6800 XT starts pulling ahead in nearly all games except RT.

  • @moes95
    @moes95 1 year ago +15

    Recently upgraded to an RTX 2070 Super for 150€ (the card needed to be repasted, not that hard tbh). I'll be good for a few years, or I can snag something newer on the cheap.
    Edit: I bought the GPU in a mobo combo (i7 9700K, 16GB RAM and an AIO cooler) for 200+150 = 350€, and sold my old setup for 400€ with the screen and other peripherals

    • @RFKG
      @RFKG 1 year ago +3

      Grats on your buy, mate. The 2070 super is still a very competent 1080p card.

    • @AndyViant
      @AndyViant 1 year ago

      The 2070 Super was a very good card for its era in bang per buck. It's more a 1080p card now, but if that's your end use, it has a lot of life left yet.

    • @prem3548
      @prem3548 1 year ago

      Great buy for $150. It has slightly better performance than my RX 5700 XT and I play every game at 1440p at 60+ FPS with FSR enabled no problem.

    • @moes95
      @moes95 1 year ago

      @@RFKG You mean 1440p; the current rig pushes 100 FPS in almost all games at high/max @ 1440p, and I don't use DLSS or other resolution scaling. I came from an i7 4770K with a GTX 1070 that also pushed 1440p at medium settings, albeit stuttery in demanding games.
      Example: Cyberpunk currently averages 80 FPS at high-to-max settings @ 1440p

    • @RFKG
      @RFKG 1 year ago +1

      @@moes95 That's amazing, I honestly had no idea that the 2070 Super could still do 1440p gaming

  • @cartm3n871
    @cartm3n871 1 year ago +4

    I just recently upgraded to a 4070 a few weeks ago. I gotta say, I haven't felt the VRAM issues yet; every game I play runs flawlessly.

    • @antonnrequerme6256
      @antonnrequerme6256 1 year ago +1

      what resolution do you play on?

    • @cartm3n871
      @cartm3n871 1 year ago

      @@antonnrequerme6256 1440p, what about you?

    • @Altrop
      @Altrop 1 year ago

      You'll feel them in a year or so. 16GB is the new VRAM target for game devs, hence the sudden explosion in VRAM use in 2023. We went from "8GB is enough forevahhh" to some games using up to 14GB within months.
      When I buy a card I intend to keep it for 4 years, and for that I realized 10 or 12GB just wasn't enough, so I bought a 6800 XT. I used to own a GTX 1080 and saw that it was already being filled up in 2021; I saw the writing on the wall

    • @MahBones
      @MahBones 1 year ago

      Should be good on 12GB for most stuff at 1440p, where the 4070 is well matched. I'm on a 6700 XT atm at 1440p and don't have issues with VRAM.

    • @MahBones
      @MahBones 1 year ago

      @@Altrop Realistically, the only things using that much VRAM are unfinished garbage piles, or if you're set on ray tracing; but I don't think a 4070 is really the card for ray tracing. Until consoles move away from their ~12GB usable, that will be the "benchmark"

  • @Paelmoon
    @Paelmoon 1 year ago +4

    I have a 3080 10GB. What's your opinion on what to upgrade to? The 40 series is so pricey, but the AMD cards don't offer that much higher performance.
    Edit: Cheers guys! I will stick with the card and hope it doesn't conk out soon :-P

    • @Mako2401
      @Mako2401 1 year ago +19

      There is no need for you to upgrade unless you get a 4090

    • @Humanaut.
      @Humanaut. 1 year ago +8

      Keep the card for as long as you can, and pray that next gen doesn't suck as hard as this gen and that AI doesn't become the next crypto boom.
      The only issue you could have is VRAM limitations, which is a good old Nvidia planned-obsolescence strategy; other than that you should be good for a while.

    • @niggapyrat
      @niggapyrat 1 year ago +1

      The 3080 is still a good card, but if you want more VRAM for 4K etc., go with a 7900 XT or XTX. If you want Nvidia, a 4070 Ti minimum

    • @Blaze72sH
      @Blaze72sH 1 year ago +7

      4090 or just don't. You are set for now.

    • @endmymiseryyy7908
      @endmymiseryyy7908 1 year ago +2

      wait for next gen

  • @professorchaos5620
    @professorchaos5620 1 year ago +22

    The main reason 3080s are cheaper is that they were probably the most-produced card of all time. I don't know how to find stats on that, but so many miners like myself bought every one they could for a couple of years, and they have been re-selling them on eBay since then.

    • @maximus3294
      @maximus3294 1 year ago +3

      It's true. GA102 in general was so abundant that they made the following GPUs with it:
      3070 Ti
      3080 10 GB
      3080 12 GB
      3080 Ti
      3090
      3090 Ti
      and those are just the GeForce ones. Their server-side cards also used these dies.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago

      F*** you miners, f*** you miners

    • @CPSPD
      @CPSPD 1 year ago +1

      when shitcoins and cryptocurrencies collapse i will rejoice

    • @professorchaos5620
      @professorchaos5620 1 year ago

      @@CPSPD That's good, central bank slave. Keep rejoicing in your lifelong slavery

  • @Nalguita
    @Nalguita 1 year ago +3

    I bought a 6800 XT a year ago. Where I'm from in Spain, for some reason AMD costs more than Nvidia. All my life I was Nvidia; except for an old ATI X800 I had, the rest of the graphics cards I have owned were all Nvidia: a 9800 GX2, GTX 280, GTX 285, GTX 480, and GTX 970. Even with the 6800 XT being more expensive here than the 3080, I opted for it; it was not for the performance, nor for the price, it was because personally I am fed up with Nvidia. First it was a board with the nForce 790i Ultra chipset, an expensive board that did not stop giving problems; then the 9800 GX2, the worst graphics card I've had by far; then the PhysX marketing crap ("add a second card to get physics" and blah blah blah - it never ended up working well, pure marketing); then the 3.5GB of the 970. I ended up fed up, and as far as I'm concerned they can stick their marketing up their ass. It is clear that streaming and RT on Nvidia are superior, but for me that was not enough to opt for Nvidia again. RT is superior, but it is clear that the 3000 series falls short, so for me it is still marketing. DLSS is above FSR, there is no doubt, but who's to say that when the 5000 series comes out they won't leave me stranded as they did with the PhysX crap, 3D Vision and other technologies? Also, Intel's XeSS is not so bad and can be used on AMD. This is my humble opinion from years of owning Nvidia cards. I'm not saying that AMD hasn't had problems with their cards, but I personally haven't suffered from them. Sorry for the length and my English, I wrote it with a translator. Good video and good channel

  • @alaskacpu
    @alaskacpu 1 year ago +1

    I'm a 1080p 240Hz player for all games, and I never use RT with my 3080 or 6800. It's not worth trading speed for visuals, which I couldn't care less about. I get awesome visuals at 1080p, and I'm not going down that rabbit hole of comparing 1440p or 4K; I'll leave that up to you guys to complain about. Speed & efficiency for the games I play is absolute! Enjoy 🙂

  • @Plague_Doc22
    @Plague_Doc22 1 year ago

    I'm curious: do American retail stores like Micro Center and others sell last-gen products, like the 3000 series for Nvidia and the 6000 series for AMD?
    Because where I'm at, it's basically impossible to get last-gen stuff unless it's used or wasn't sold out prior.

    • @kirbyatethanos
      @kirbyatethanos 1 year ago

      Only the low-end/mid-range options:
      RTX 3050
      RTX 3060 8GB/12GB
      RTX 3070

    • @Plague_Doc22
      @Plague_Doc22 1 year ago +1

      @@kirbyatethanos Ah, OK. I mean, I can find some 6800 XTs, but they're much harder to find and aren't from brands I really trust.

  • @daruthin
    @daruthin 1 year ago +1

    In the "new" market in Europe (mostly France):
    - For 599€ you get a brand-new 3080, a 4070, or a 6900 XT (the 6950 XT is 629€)
    - The 6800 XT is 60€ less
    I'm still waiting for the 7xxx series from AMD. I don't get their strategy: there still isn't a 7700 or 7800, XT or not, and yet they released the 7600.

  • @Icureditwithmybrain
    @Icureditwithmybrain 1 year ago +1

    You would pick a 12GB card over a 16GB one? Do you play at 1080p or something?

    • @manusiaorang2842
      @manusiaorang2842 6 months ago

      This VRAM talk needs to stop; most of the tests where 12GB is barely enough are with 4K textures, and you can't even notice the difference below 4K

  • @thebeautifulandthedamned572
    @thebeautifulandthedamned572 1 year ago +2

    The GPU market has been confusing lately.
    Here in my country, Indonesia, there was a moment when you couldn't find the RX 6950 XT at $630 USD; it was sold at around $700.
    But ironically, you also couldn't find a brand-new RTX 3080 at $600 USD, because it was also sold at 680-700 USD.
    So people who actually bought a brand-new RTX 3080 were considered "stupid", since they could get a way better GPU, the RX 6950 XT.
    And now a brand-new RX 6800 XT is being sold at $560 USD, just $60 cheaper than an RTX 4070, and you can't find the RX 6950 XT anymore, not even second-hand.

  • @dayadam16
    @dayadam16 1 year ago +2

    Gotta love at 3:54: "the Nvidia card is pulling ahead pretty significantly" (+14%), then 10 seconds later on the same game, "the AMD is actually pulling ahead" (+14%). I feel like for the rest of this video Nvidia is gonna get way better choice of words. Smh 🤥

    • @vextakes
      @vextakes  1 year ago +1

      Literally said AMD is very favorable with Lumen there, and that's a huge advantage in the future

    • @dayadam16
      @dayadam16 1 year ago

      @@vextakes That doesn't change the choice of words that was used, though. I see what you're getting at, but I'm saying that using words like "significant" just sounds like way more than what actually took place within those 12 seconds of that timestamp.

  • @Hito343
    @Hito343 1 year ago +2

    People are overblowing the whole VRAM issue... yeah, bad PS ports exist, and badly optimized games exist, especially recently... Capping FPS is a thing, imagine that (why would you make it run more than needed unless it's a competitive game and you need that 120 FPS), and a lot of the options in the Ultra preset make little to no visual difference while costing a big performance hit... yeah. Great video

  • @Games_and_Tech
    @Games_and_Tech 1 year ago +2

    Terrible comparison: a high-end 3080 vs a lower-end 6800 XT... the power consumption and the frequencies are too low for a 6800 XT

  • @defennia
    @defennia 1 year ago

    It also depends on the card manufacturer. Do you have any idea how much EVGA 30-series cards are going up in value? I was just barely able to buy a 3090 before they started going past 1,200, and as for the Kingpin series 3090, I've seen a couple go for over 2k

  • @ravs1988
    @ravs1988 1 year ago +1

    Upgraded from an RX 570 from 2017 to an RX 6800 (non-XT) for 300€ on the used market. Currently wasting its potential on Dota and WoW :)

  • @sketters9400
    @sketters9400 1 year ago +1

    I got a 6900 XT for 400 euros used and I couldn't be any happier with it; crazy good GPU for its price

  • @kaisersolo76
    @kaisersolo76 1 year ago +1

    The reason Nvidia skimps on VRAM is so you have to buy the next gen; they give you DLSS to get by. AMD gives you the VRAM, gets blamed when there's not much improvement next gen, and offers a serviceable FSR.

  • @franciscojaviervazquez2635
    @franciscojaviervazquez2635 1 year ago +2

    I got the 6800 XT in November, and I love it. First, I had the good luck of getting a card without the annoying coil whine, so that made me happy (I bought the Gigabyte OC version). Second, I truly love Adrenalin, because there's no need for Afterburner to adjust voltages, fan speeds or whatever, and it's very easy to measure thermals without using third-party software. To be honest, DLSS is a major feature I would like to have on my card, but considering that Nvidia didn't support the 30 series for frame generation while AMD promised FSR 3 will be supported on the 6000 cards, that seems like the deal. If AMD's promise is fulfilled, the 6800 XT will destroy even the 3090 Ti at half the price. We'll see if it's true... but in the end, the 6800 XT seems like a good deal if ray tracing is not what you want. I'm not a content creator or editor, not a streamer; I just use the card to play, and even without FSR 3 I love my card. No driver issues, more VRAM, more rasterization performance, and it includes Adrenalin. All for a lower price. Shut up and take my money!!

  • @Astravall
    @Astravall 1 year ago +2

    I do not stream to Twitch, I do not upscale, and I seldom use ray tracing (not that I could complain about the ray-tracing performance of my RX 7900 XTX; it is plenty sufficient). But I have games that use more than 10GB of VRAM, and ray tracing increases the VRAM usage, so the 3080 running out of VRAM is kind of funny. So I have to disagree; I would choose the 6800 XT over a 3080 anytime.

  • @michaelfalabella6296
    @michaelfalabella6296 1 year ago

    Awesome video man, I love the detail you get into. That Red Dragon looks like it's performing great!

  • @shroomy8210
    @shroomy8210 1 year ago +1

    Why is my CPU usage at 100% in Cyberpunk with an i7-12700F?

  • @nathanielchapman7675
    @nathanielchapman7675 1 year ago +1

    I disagree with the VRAM statement. The stutters and graphical artifacts are a dealbreaker for me. I had a 3070 that ran out of VRAM, and the drawbacks were unbearable. The additional 2GB of VRAM on a 3080 would not have resolved the issue. Moreover, how good ray tracing is on the 3080 is quite overblown... also, it requires more VRAM. So what would you choose: better textures overall, or better lighting?

  • @sabatrol666
    @sabatrol666 9 months ago

    I bought the Lenovo LOQ with an Nvidia 4060 8GB, and an RX 6800 XT. What did I do? I am also planning on using it with an OCuLink adapter

  • @kon0212
    @kon0212 1 year ago +1

    tbh some of us, well, most of us grew up without ray tracing. Why need ray tracing when you can get higher FPS at a higher resolution? Honestly, ray tracing is a useless invention when we can use its much easier counterpart, also known as pixels.

  • @bearsgarage272
    @bearsgarage272 10 months ago

    Literally just bought a Red Dragon 6800 XT today, an open-box special at $430. Couldn't postpone it anymore; coming from a 5700, I hope to keep this around for a few years. Maybe next time around I'll go green, when RT is more optimized and adopted by more games

  • @AndersHass
    @AndersHass 1 year ago +2

    It is possible that something else in the system besides the GPU draws more power, but the selling point about power consumption certainly doesn't seem like something that would matter in this case.

  • @paradoxeintervention5390
    @paradoxeintervention5390 1 year ago +1

    I wanted to buy a 3080 2.5 years ago; ultimately it became a 6900 XT.
    At that time I had an FHD 144Hz monitor, then switched to WQHD 165Hz; meanwhile I have a 4K OLED for gaming.
    Currently 12GB would also be too little for me, if only for peace of mind.
    With every stutter I would ask myself whether it is perhaps the VRAM.
    That's why I'm quite happy not to have gotten a 3080; back then I would not have thought how quickly the memory becomes a problem.

  • @bartdierickx4630
    @bartdierickx4630 1 year ago +1

    If the extra 6GB of VRAM gives you 2 extra years of gaming (say, an additional generation of GPUs) before forking out new money, it's worth $112 a year, without taking into account the resale value of both GPUs after 2 years (for the 3080) and 4 years (for the 6800) respectively. And that's not considering whether you do more than gaming on the GPU.
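
    One way to arrive at that ~$112-a-year figure is straight amortization over the card's useful life; a minimal sketch, assuming a $450 street price (an illustrative number - plug in what you actually paid):

    ```python
    # Amortized cost per year of ownership, ignoring resale value.
    # The $450 price is an illustrative assumption.
    price = 450.0

    for years in (2, 4):
        print(f"{years}-year useful life: ${price / years:.2f}/yr")
    # 2 years -> $225.00/yr; 4 years -> $112.50/yr. If the extra VRAM
    # stretches the card's life from 2 to 4 years, the same sticker
    # price costs half as much per year of gaming.
    ```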

  • @5RWill
    @5RWill 1 year ago +3

    I got that one on Prime Day for $480 with Starfield. But good points made. The only feature I'd really nod my head at is DLSS. I'm all for ray tracing, but in most games it's just not that big of a difference for the performance cost. I wish AMD would focus on FSR 3.0.

    • @GeneralS1mba
      @GeneralS1mba 1 year ago

      Imagine fsr 3.0 miraculously comes with starfield lol

  • @robertdelacruz7920
    @robertdelacruz7920 1 year ago

    Can I ask what your ambient temp is for your CPU? Your CPU runs cooler while gaming.

  • @giucafelician
    @giucafelician 1 year ago +2

    I am absolutely thrilled to share that I recently snagged an incredible deal on a 6800 for only $353! Needless to say, my excitement is through the roof!🥰

    • @gustavopena99
      @gustavopena99 5 months ago

      I got a used 6800 non-XT for 260 USD, aiming for 1440p. I'm so happy as well (1 year warranty; in Argentina that's a good deal).

  • @laplace3710
    @laplace3710 1 year ago

    In my country both cards are priced almost the same, so which one should I get? I'm upgrading for 1440p resolution.

    • @Nekotaku_TV
      @Nekotaku_TV 10 months ago

      Always Nvidia if you can.

  • @Doile911
    @Doile911 10 months ago +3

    Meanwhile I'm still sitting on my 1070.

  • @synergistex7088
    @synergistex7088 2 months ago

    The problem is, the latest AMD GPUs have been disastrous with black screens. I currently have one and I'm stuck with it. I can't turn on Resizable BAR because it causes a black screen.

  • @DBimo
    @DBimo 1 year ago

    @Vex and others, I need help: should I buy a second-hand 3060 12GB or a 3070 8GB? I play games and render 3D. People say 12GB of VRAM is good, but I don't know what to do! I play at 1080p.

    • @TonyChan-eh3nz
      @TonyChan-eh3nz 1 year ago +1

      3070. Vram is good, but not that good.

  • @mRibbons
    @mRibbons 1 year ago +1

    I love AMD, but I'd never say no to a competitively priced 3080. Did the power connector get sorted out though?

    • @Very_Questionable
      @Very_Questionable 1 year ago

      Not really, but all the boards that aren't Founders Edition cards (for the 30 series) have standard 8-pin PCIe connectors.

  • @DeepteshLovesTECH
    @DeepteshLovesTECH 1 year ago +1

    AMD had a much more favorable time competing with the RTX 3000 series because of the huge node advantage it had over NVIDIA, which was stuck on Samsung's crappy 8nm node.

  • @ImJusSha
    @ImJusSha 1 year ago

    How does the 6750 XT compare to the RX 6800? My CPU will be a Ryzen 7 5700X; I'm trying to figure out which will be best.

    • @TonyChan-eh3nz
      @TonyChan-eh3nz 1 year ago

      The 6700xt/6750xt are great at 1080p and can do 1440p. The 6800xt can comfortably do 1440p. The 6950xt and 7800xt should also be on your radar.

  • @merlingt1
    @merlingt1 1 year ago +1

    3080 12GB exists:
    YouTubers:

    • @Nozzinator
      @Nozzinator 1 month ago

      Yeah, can't find anything on the 12-gig variant, but it still should've come with 12 gigs as standard imo.

  • @ProbablydecentIV
    @ProbablydecentIV 1 year ago +1

    6700xt

  • @Bleckyyyy
    @Bleckyyyy 1 year ago +1

    The fact that AMD makes FSR 3.0 available to NVIDIA users is enough for me to go with AMD. Forget NGREEDIA...

  • @Clint_the_Audio-Photo_Guy
    @Clint_the_Audio-Photo_Guy 10 months ago

    I had a Red Devil 6800 XT and took it back when the 6750 XT refresh came out. I should have kept it. Incidentally, 6950 XT Reference cards are only $550 at Microcenter right now. Pretty comparable to the 7800 XT, or better in several games.

  • @cata89deea
    @cata89deea 1 year ago

    Hi, I was thinking of upgrading my GPU. At the moment I'm using an ASUS ROG Strix GeForce RTX 2060 OC edition 6GB GDDR6, and my CPU is an Intel Core i7-9700K 3.6GHz with 12MB cache. Can you please recommend which GPU to buy for better performance? I'm playing Destiny 2, Overwatch 2, CS, Valorant. Thank you!

  • @100500daniel
    @100500daniel 1 year ago +3

    Bro, increase your parameters from the defaults through Adrenalin. The fan speed and power limiter at PowerColor's defaults are kinda low.
    Run that bitch at 2000rpm with the power slider maxed plus an undervolt & OC, and it will be even faster.
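
    For reference, the same knobs Adrenalin exposes in its GUI (power limit, manual fan speed) can also be driven programmatically. A minimal sketch for Linux via the amdgpu hwmon sysfs interface; it assumes root, that card0 is the Radeon, and that the standard amdgpu attribute names are present (on Windows you would just use Adrenalin as described above):

        # Minimal sketch: max the power limit and set a manual fan speed on an
        # amdgpu card via sysfs (Linux, run as root; exact paths vary per system).
        import glob

        # Locate the card's hwmon directory (assumes card0 is the Radeon).
        hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

        def write_attr(attr: str, value: str) -> None:
            with open(f"{hwmon}/{attr}", "w") as f:
                f.write(value)

        # "Power slider maxed": raise power1_cap to the board's maximum.
        with open(f"{hwmon}/power1_cap_max") as f:
            write_attr("power1_cap", f.read().strip())

        # Manual fan control; pwm1 is a 0-255 duty cycle, not rpm, so start
        # around 60% and tune while watching fan1_input (actual rpm) until
        # it reads roughly 2000 rpm.
        write_attr("pwm1_enable", "1")  # 1 = manual fan control
        write_attr("pwm1", str(int(255 * 0.6)))

    The undervolt/OC part goes through the separate pp_od_clk_voltage interface and is card-specific, so it is left out of this sketch.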

  • @CrazO9111
    @CrazO9111 1 year ago

    Hmm, a 3080 with better RT but a VRAM limit, or a 6800 XT with weaker RT but enough VRAM? Could FSR 3 be a game changer here?

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +2

    It's just a shame that Nvidia doesn't want to give enough VRAM to use their GPUs to the fullest. I can forgive the 3000 series, 8GB was plenty at the time, but it was becoming clear early enough in the development of the 4000 series that they needed to increase the VRAM at every SKU except the 90 class. If AI weren't their focus right now, they probably would have, but because of AI, Nvidia just doesn't care.

  • @StupidOnPC
    @StupidOnPC 1 year ago

    Hey, I have a question: can you send me your Red Dragon OC stock BIOS?

  • @enricofermi3471
    @enricofermi3471 1 year ago +1

    (Used) 3080s are cheap because there's an extremely high probability they were mined on. As crypto drops, there's little incentive left to mine, so the smaller farms get disassembled and sold off. We will not discuss the probable condition of those GPUs (which may vary greatly depending on the maintenance of the farm).
    Fresh 3080s are indeed priced cheaper, because there are only so many advantages they have over the AMD competitor (6800 XT): ray tracing, DLSS (no frame gen on the 3000 series, btw) and CUDA, of which only CUDA is not optional (if you actually need it as a content creator, that is). On the other hand, the 3080 has only 10GB of VRAM, which is enough for now and may (keyword: may; it gets filled up in some games on ultra, but those are just singular examples... for now) last you until the 5000 series; the 6800 XT does not have that "may".
    Overall, I'd say the 6800 XT is the more balanced long-term solution, "the 1080 Ti of the current era", while the 3080 is an "I want the eye candy now, and then I'll switch to a 5070 as soon as it launches".

  • @paulburkey2
    @paulburkey2 8 months ago +1

    Great points. Personally I like to use a driver-only install with AMD GPUs for the lower overhead, plus Afterburner, since undervolting there is just simpler. I do lose the ability to overclock my monitor, as well as a setting (I'm not sure what it's called) that uncaps the FPS without tearing. On this rig the panel can be pushed to 78Hz vs 75Hz, not a big difference, and you lose FreeSync with the overclock, so it's not worth the overhead of the Radeon software; I just cap the frames at 74 FPS, which is all this RX 570 can handle with flat frame times anyway.
    I just purchased a used Red Devil RX 6800 XT on eBay for $395 ($445 after tax and shipping) and am waiting for delivery; it will replace my RTX 2080 Super. I plan to undervolt, and I don't use ray tracing or DLSS. I mostly play Fortnite with TSR enabled at high-to-epic details; it's all about latency for me at 1080p 144Hz.
    I plan to test it with my 4.75GHz R5 5600X, then with my R5 5600X3D, which I don't use at the moment because it runs hot on my 240 AIO. I may have to pick up a 360 AIO if the 6800 XT performs better with the X3D than with its older non-X3D sibling, which is fine, because I'd like to move this 240 AIO into my Rampage III Gene X5690 Xeon machine, paired with my ASUS Strix RTX 2070 Super. That machine currently has an XFX RX 570 that I customized with an XFX RX 580 cooler, two extra heat pipes, and a direct-contact RAM plate, and which I also undervolt and overclock.
    I got the untested RX 580 for parts on eBay for $20; I'm pretty sure it was just dirty and needed a repad and repaste, but my RX 570 is basically new and I didn't want to risk installing a dead GPU in any of my current hardware.

  • @Squilliam-Fancyson
    @Squilliam-Fancyson 1 year ago

    Any GPU miners here? What coin are you mining atm? RVN seems quite attractive since the difficulty dropped massively.

  • @jcgongavoe337
    @jcgongavoe337 1 year ago

    Just checked on eBay; the non-auction options cost around 600 to 750 CAD, not really a good deal considering they are pre-mined.

    • @crazymcgee3604
      @crazymcgee3604 1 year ago

      Newegg has the ASRock RX 6800 XT for 750 CAD, new, with a premium Starfield code. If the used cards cost that much, you may as well buy new.

  • @Knaeckebrotsaege
    @Knaeckebrotsaege 1 year ago

    What confuses me is that he's still on a B350 motherboard (see 14:12, just below the GPU). I wouldn't want to put a 5900X anywhere near an ancient low-end (4+2 phase) board like that O_q

  • @tenniearmybarbie1959
    @tenniearmybarbie1959 8 months ago

    Will my i7-14700K still bottleneck if I'm using an RX 6800 XT, or will it even out? Or do I really need to pair my i7-14700K with a 4080?

  • @mjn5016
    @mjn5016 1 year ago

    Did you turn on SAM?

  • @blackwaldoduck159
    @blackwaldoduck159 1 year ago +2

    Is Vex using Risk of Rain 2 music 😮

  • @projecticy.
    @projecticy. 1 year ago

    How come you say your CPU is limiting your GPU but it’s only at 60% usage?

  • @MGrey-qb5xz
    @MGrey-qb5xz 1 year ago +1

    It's no longer about performance; we honestly have enough of that and gaming has stagnated. What matters now is how power efficient the chips and cards are. Even the Series consoles went the undervolt route to do more with less.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      I think it's the other way around: game devs want gamers to upgrade their rigs so they can make better games.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 1 year ago

      I mean, no one can run Cyberpunk with Psycho RT at 144 fps for now.

    • @MGrey-qb5xz
      @MGrey-qb5xz 1 year ago

      @@iikatinggangsengii2471 There are more important things to do than play Cyberpunk with RT at 144fps. We need to stop turning cards into freaking ovens in the summer and cut the electricity bill. Outside of first-world countries, running modern cards is a big no for the electricity bill and for the AC bill to cool the effing room, and you can't even buy the old ones now because they are no longer in production.

  • @ioannispoulakas8730
    @ioannispoulakas8730 1 year ago +2

    10GB is veeeery borderline for 1440p. I very often see over 9GB used; over 10 is less often, but we are getting there. I don't know how long the 12GB on my 4070 will last. Ideally I wanted more VRAM, but there wasn't any alternative in this price range, and I didn't want to go with the 6800 XT and lose DLSS 2/3 and better RT, since I only play single-player games. Fingers crossed for the 12GB then :P

    • @xwar_88x30
      @xwar_88x30 1 year ago +1

      That's all Nvidia is good for: borderline VRAM, a scam on their GPUs that makes people upgrade when their 3070/3080/4060/4060 Ti or whatever other BS Nvidia GPU starts to stutter in VRAM-heavy games. 16GB is the safe spot at the minute; anything over 16GB is even more future-proof.
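
    On actually measuring that usage: a quick way to see how close a game gets to the 10/12/16GB ceiling is to poll the card while playing. A minimal sketch using NVIDIA's NVML Python bindings (assumes pynvml is installed and an NVIDIA card at index 0; note NVML reports current allocation, which is not exactly the same as what the game strictly needs):

        # Minimal sketch: print current VRAM allocation on GPU 0 via NVML.
        # pip install nvidia-ml-py (imported as pynvml)
        import pynvml

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB ({mem.used / mem.total:.0%})")

        pynvml.nvmlShutdown()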

  • @loophole3526
    @loophole3526 9 months ago

    I have a 3080 Ti and a 6800 XT and usually can't tell them apart. But I think they are both CPU-limited, one on a 9900KF at 5.0 and the other on an 8700K at 5.0. I will say the textures in TLoU look muddy, as it's likely loading lower texture quality. Hardware Unboxed does some really good VRAM testing you should check out.

  • @IamMarkSmith
    @IamMarkSmith 1 year ago

    @Vex what CPU are you running?

  • @danburke6568
    @danburke6568 1 year ago +1

    When next-generation games come out, 10GB is going to be a problem.
    Also, so many were used for mining; between those two factors, there are many more coming onto the market.
    The used 3080 is a bit pricey. Still a good card.

  • @LVShonzy
    @LVShonzy 1 year ago

    Something's wrong with your RTX 3080. I have an MSI Suprim X 3070 Ti and it gets more fps at the same max settings at 1440p with DLSS disabled.

  • @RannonSi
    @RannonSi 1 year ago +2

    When it comes to VRAM:
    As it seems right now (if I'm not mistaken), there are about 4-6 games that (for whatever reason) exceed 8, 10 and even 12 GB at a resolution of 1080p.
    To me, that is entirely different from when ray tracing started appearing in games, where it was obvious it would take years before buying a graphics card just for ray tracing was viable (or at least wise; besides, how good was the RTX 2000 series at ray tracing, really?).
    *This*, on the other hand, looks like the start of a trend of VRAM usage spilling over the 8GB limit (with 10 and 12 following shortly, would be my guess). And I'm guessing it will happen at a rate at least twice that of the adoption of ray tracing (at least after 2024).
    To me, ray tracing is an add-on, a luxury item. But being able to play games at 1080p on a card (released in the roaring 20's, mind you) that had an MSRP of $700 is (with exceptions, of course) the least one should expect.

  • @Kylypyukko
    @Kylypyukko 1 year ago

    Which one should I buy if I'm going to play only CS2 with low graphics? I want high fps.

    • @Dreyz-pg2zz
      @Dreyz-pg2zz 11 months ago

      6800xt

    • @Nekotaku_TV
      @Nekotaku_TV 10 months ago

      Haha, for that you could get like a 1080 Ti.

  • @Psychx_
    @Psychx_ 1 year ago +1

    The 3080 10GB doesn't have enough VRAM to handle Cyberpunk at 1440p with Ultra RT. Upscaling from a lower resolution alleviates that memory pressure.
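
    Why upscaling relieves VRAM pressure: at Quality mode, both DLSS and FSR 2 render internally at roughly two thirds of the output resolution per axis, so the resolution-dependent render targets drop to under half the pixels (texture memory, the biggest consumer, is unaffected, which is why upscaling only alleviates rather than removes the pressure):

    $$ 2560 \times 1440 = 3{,}686{,}400\ \text{px} \quad \rightarrow \quad 1707 \times 960 = 1{,}638{,}720\ \text{px} \approx 0.44\times $$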

  • @nolive2nd180
    @nolive2nd180 1 year ago +1

    For the last 3 months I've been testing used 3080 and 6800 XT cards, all gotten for under $350 in great condition.
    I decided to keep the NVIDIA card in the end (a Gigabyte GAMING OC) because of the better performance with my Quest 2 for PCVR. I really fell in love with VR since I bought the Q2 (mainly for sim racing).
    In the titles I play in desktop mode (4K 120Hz TV), I never really saw a VRAM bottleneck. Maybe having a 5800X3D helps as well, but that's my user experience.

    • @JohnSmith-mk8hz
      @JohnSmith-mk8hz 1 year ago

      I also use a Quest 2 for PCVR. I was considering getting an RX 6800xt. What made the 3080 better? And how much better?