I Switched to AMD and Saved $300. Is it worth these tradeoffs?

  • Published: Jan 14, 2025

Comments • 1.2K

  • @TechLens
    @TechLens  Год назад +14

    SPONSOR 👉 25% Coupon code for software: TL20: -> biitt.ly/XQT7V
    👀 Watch Pt.2: The Bugs... ▶️ ruclips.net/video/WFb3MwXbvDs/видео.html
    Let me know what you would want to see in the follow up and I'll do my best to include it!
    Thanks guys! James - TechLens

    • @confusedsiren56
      @confusedsiren56 Год назад

      Yeah, I have an RTX 3050 right now. I was scared to go AMD and now I regret it: the 3050 isn't good value for money, and there are GPUs from AMD that cost less and perform better for the money you're paying.

    • @DYNAMiXRap
      @DYNAMiXRap Год назад

      Hey buddy, the idle power consumption is easy to fix:
      Either put your second monitor at 60 Hz and your main monitor at max refresh rate,
      OR
      enable VSR. That has been reported to drop idle draw from roughly 100 W to 20 W with full refresh rate on both monitors!

    • @MiracleManGaming
      @MiracleManGaming Год назад

      Are you the same guy who played as joker in Batman movies?

    • @korosoid
      @korosoid Год назад

      I didn't skip the ad because you have that countdown line on the bottom of the video.

    • @generalawareness101
      @generalawareness101 Год назад

      Hardware-wise they are fantastic, but their software side has stunk since the ATI days. My friend lost his 3090 and couldn't bring himself to buy a 4090 to replace it. He did AI model training and Stable Diffusion. I got him to buy the best 7900 XTX out there (my favourite AMD brand), the Sapphire Nitro+. He loved the card: so much cooler, and faster, than his 3090. Told him: yay, but not so fast once you leave the gaming world, because the main things he used the 3090 for just flat out don't work, are wonky, or aren't implemented (yet). ROCm, and attention, are the main downfalls of AMD for AI stuff, NOT the hardware. Throw in the fact that the Python crowd is so Nvidia-centric (to the point I speculate they're paid off) that for AI we are stuck, for now, with Nvidia. My hope lies with Intel come Druid or Celestial.
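
      For what it's worth, Stable Diffusion does run on Radeon cards under Linux once a ROCm build of PyTorch is in place; the pain is the setup, not the silicon. A minimal sketch, assuming the diffusers package and a ROCm PyTorch wheel (the checkpoint name and install commands are examples, not a recipe from the video):

      ```python
      # Sketch: Stable Diffusion on a Radeon GPU via ROCm PyTorch (Linux).
      # Assumes something like:
      #   pip install diffusers transformers accelerate
      #   pip install torch --index-url https://download.pytorch.org/whl/rocm6.0
      import torch
      from diffusers import StableDiffusionPipeline

      # ROCm builds of PyTorch expose the Radeon GPU through the regular "cuda" device API.
      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",   # example checkpoint
          torch_dtype=torch.float16,
      ).to("cuda")

      image = pipe("a photo of a red bicycle, 35mm").images[0]
      image.save("out.png")
      print("device:", torch.cuda.get_device_name(0))   # e.g. "AMD Radeon RX 7900 XTX"
      ```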

  • @solodagci
    @solodagci Год назад +103

    The issue you mention at 18:50 has a basic explanation. One capture (the darker one) uses full-range colors (0-255) and the other is limited range (16-235). It's an ancient safety feature for old TVs that can't show the full range, or are incompatible with it and crush all the blacks. Try checking your GPUs' control panels for this setting, or make sure you check the "legalize" option in the Atomos Ninja recorder menus.
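
    If you want to check the two ranges by hand, the mapping is a simple linear stretch. A small sketch, assuming NumPy, that expands a limited-range (16-235) frame to full range (0-255) and back:

    ```python
    # Limited ("legal"/video) range uses 16-235 per 8-bit channel; full (PC) range uses 0-255.
    # Displaying one as the other crushes or lifts blacks, which matches the darker/brighter
    # difference described above.
    import numpy as np

    def limited_to_full(frame: np.ndarray) -> np.ndarray:
        """Expand a 16-235 limited-range 8-bit frame to 0-255 full range."""
        x = frame.astype(np.float32)
        return np.clip((x - 16.0) * (255.0 / 219.0), 0, 255).round().astype(np.uint8)

    def full_to_limited(frame: np.ndarray) -> np.ndarray:
        """Compress a 0-255 full-range 8-bit frame into the 16-235 limited range."""
        x = frame.astype(np.float32)
        return np.clip(x * (219.0 / 255.0) + 16.0, 0, 255).round().astype(np.uint8)
    ```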

    • @ScoutReaper-zn1rz
      @ScoutReaper-zn1rz Год назад +9

      Or Limited RGB vs Full RGB.

    • @vito74m
      @vito74m Год назад

      muhahaha :D

    • @npip99
      @npip99 Год назад +3

      I think this actually has to do with a color profile being applied in borderless windowed mode vs fullscreen mode. The difference between the two images isn't _just_ in the two extremes, but an overall saturation difference even on colors between 16 and 235. This is most likely an incorrect sRGB vs DCI P3 color profile setting.

    • @almostprofessionalrecords6651
      @almostprofessionalrecords6651 Год назад +4

      Nvidia defaults to limited range because they think most people have 15-year-old TVs.

    • @ScoutReaper-zn1rz
      @ScoutReaper-zn1rz Год назад +2

      @@almostprofessionalrecords6651 It's more than that. Even modern 4K TVs are affected, because almost all media is mastered in limited 16-235. That means all of your movies, unless they are HDR, are in limited 16-235. After much experimentation, the 16-235 range was adopted by pretty much all cinematic and creative-arts applications. Your streaming services and Blu-rays carry content that's nearly always mastered in limited RGB.

  • @misterpinkandyellow74
    @misterpinkandyellow74 Год назад +491

    rx 7900 xt here, first AMD card in years, and no problems for me. Do not let the Nvidia shills scare you with fear mongering.

    • @nononsense846
      @nononsense846 Год назад +30

      Why do Nvidia shills try to force you to use Nvidia? I have an Nvidia + Intel gaming laptop and it sucks. It gets hot as hell, GeForce Now is dog water and makes me want to empty my airsoft rifle into the thing. I'm sorry, but I like my AMD computer waaay better. Ray tracing is nothing to me; I'm more interested in GPU settings and modding my games, and for me the AMD GPUs just work without making me want to kick the thing off my desk.

    • @righteousone8454
      @righteousone8454 Год назад

      Nvidia "shill" chiming in.
      After my 4090 had an issue and I had to RMA it, I bought an ASRock 7900 XTX (Taichi), and I get crashes in every game, stutters in every game that could not be fixed with any optimization tips on YouTube or Reddit, an overheating hotspot of 105 °C just while gaming, and 450 W draw (while my 4090 pulls 420 W max at 100% utilization, and in practice draws around 300-350 W in almost all scenarios because it's too powerful for any CPU right now). I haven't had crashes with the 4090, not once since I bought it. The drivers are beyond solid.
      Let me think what else on the AMD side... driver timeouts, after crashing to desktop.
      So yeah, don't let this "shill" stop you from buying an AMD card.
      Just pray you won't have at least two of my issues.
      The stutters are present even in Unigine Heaven, and ASRock is a solid company.
      So much for hoping AMD could get their act together in the 10 years I've been with Nvidia, since the Radeon HD 7950 (2013).
      Oh, and don't forget the infamous AMDip in frames, as the Frame Chasers channel puts it.

    • @JesusIsKing1001
      @JesusIsKing1001 Год назад +23

      Same gpu bro and the 7900xt is a great card 👍

    • @DonaldHendleyBklynvexRecords
      @DonaldHendleyBklynvexRecords Год назад +4

      Yes, XTX here. I can't complain, even with RT.

    • @moneysworthtv2410
      @moneysworthtv2410 Год назад +3

      I've run nothing but AMD. Something about Nvidia just rubbed me the wrong way as a company, from how they treat the board partners to the pricing. I kinda feel the same about Intel (though they have gotten better). But I'm worried, even more so with AI and AMD now talking about not having a flagship card for the 8000 series, about what the prices are going to be for the 5090, the 5080, and probably even the 5070.

  • @JohnnyEMatos
    @JohnnyEMatos Год назад +225

    I'm using a 7800X3D + 7900 XTX system and it is perfectly stable. I'm also blown away by the performance coming from a 5600x 6600 XT

    • @UtubemeNohomo
      @UtubemeNohomo Год назад +3

      What motherboard did you pair it with? I'm thinking of going this route.

    • @HardwareAccent
      @HardwareAccent Год назад +5

      I'm also interested in your motherboard. I've got a 7900XTX but I do need to upgrade my Ryzen 3950x to 7800x3D so will need a new mobo with it.

    • @JohnnyEMatos
      @JohnnyEMatos Год назад +10

      @@UtubemeNohomo Gigabyte B650 Gaming X AX. When I bought it it was the only decent board I could find under 200 USD. I would say it's on the premium side of quality, it just lacks overclocking features that the 7800X3D does not need. For the RAM I got G. Skill Trident c30 6000mhz, which seems to be the best ram out there for AM5

    • @ousoonthebeat6927
      @ousoonthebeat6927 Год назад +1

      @@HardwareAccent I'm on a 5600X and 6700 XT

    • @DeadPiixxel
      @DeadPiixxel Год назад

      @@ousoonthebeat6927 7600x and 6700xt for me

  • @themomaw
    @themomaw Год назад +130

    It's very easy to get back that slightly more contrasty look with AMD, if you want it. Open Adrenaline software, go to the "Gaming" tab at the top, open "Displays". Choose which screen you want to adjust. In the right hand column, set "Custom Color" to enabled, and tweak Brightness/Contrast settings to your preference.
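
    For a rough offline comparison, the same kind of tweak can be applied to a screenshot; a small illustrative sketch, assuming Pillow and a hypothetical screenshot.png:

    ```python
    # Illustrative only: nudge contrast/brightness on a capture to approximate the
    # "slightly more contrasty look" discussed above. The driver's Custom Color
    # sliders do this at the display output; this just mimics it on a file.
    from PIL import Image, ImageEnhance

    img = Image.open("screenshot.png")                 # hypothetical input file
    img = ImageEnhance.Contrast(img).enhance(1.10)     # +10% contrast
    img = ImageEnhance.Brightness(img).enhance(1.03)   # +3% brightness
    img.save("screenshot_tweaked.png")
    ```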

    • @Shadow62x
      @Shadow62x Год назад +1

      I tried that (6600xt) and the games would constantly flicker between default and custom.

    • @NFchegg
      @NFchegg Год назад +6

      @@Shadow62x set it on global, and it will apply to all that dont have custom profiles.

    • @Shadow62x
      @Shadow62x Год назад

      @@NFchegg I'll try, thank you.

    • @pyranitar
      @pyranitar Год назад +3

      Well, to be fair, he did say he was trying to minimize tweaking of settings, since most people would just plug and play, so he wanted it to represent the most common experience.

    • @ThisOLmaan
      @ThisOLmaan 11 месяцев назад

      @@NFchegg: I've been wondering about AMD's color adjustment settings, since I'm so used to the Nvidia Control Panel. I looked up videos, but they just mention setting color profiles for particular games. I'd like to adjust color across the board, so that when I'm not gaming and want to watch a YouTube video the color is still set the way I want. I've been wanting to switch to AMD and buy a 7800 XT. Hope you understand my question, thanks 👍

  • @dizzlery3628
    @dizzlery3628 Год назад +108

    My last Nvidia GPU was a 9500 GT back in the day. Since then I've run AMD GPUs because they have a better price-to-performance ratio. Never had any problems with those. Nvidia's recent actions only make me distance myself even more from them.

    • @Elkarlo77
      @Elkarlo77 Год назад +3

      My last Nvidia was a Gainward 9800 GT GS. It was a 9800 GTX+ in disguise that could run at 818 MHz with 128 shaders instead of the GT's 650 MHz and 112. And that's where the problems started. At the GTX 290 launch and the GTX 480 launch I suddenly had driver problems: the card was demoted to a plain GT, losing shader units and overclocking capability with some driver versions, always when a new Nvidia series launched. That was my last Nvidia. At work I still use Nvidia, and I do builds for other people with Nvidia cards, but not for myself.

    • @bigyundol3598
      @bigyundol3598 Год назад +2

      My last NVidia gpu was a Geforce 3 😂

    • @South_0f_Heaven_
      @South_0f_Heaven_ Год назад +1

      @@Elkarlo77had a 9800 GT myself paired with a Phenom II 965 X4 Black.

    • @Elkarlo77
      @Elkarlo77 Год назад

      @@South_0f_Heaven_ Curiously, I still have the 965 and the board it was later paired with, but the Gainward 9800 GT GS is currently in my 2008 nostalgia gaming PC with its first mate, the C2D E8500, on a P45 board. (And another grudge: PhysX had to run on the CPU when you had an AMD CPU, unless you used hacked PhysX drivers, so some games suddenly ran slower on the more powerful CPU.)

    • @MrDarknight651
      @MrDarknight651 Год назад

      same here.

  • @mr.guillotine1312
    @mr.guillotine1312 Год назад +82

    I like how you mentioned that most of the time people won't even notice the differences in the upscaling unless they are pixel peeping. It's a point I wish more people would make in their discussions of it, because the truth is, most people will not notice 99% of the differences between them during actual gameplay.

    • @bionicseaserpent
      @bionicseaserpent Год назад +24

      People really buy into the Nvidia propaganda. I'm really tired of hearing people dig themselves into this hole where they meatride Nvidia at every chance, saying "I'm NEVER touching Radeon" when they're told it's objectively a better GPU than the Nvidia option, or asking for the best GPU to buy and getting disgusted when the answer clearly isn't going to be Nvidia, and likely never will be again.

    • @ionrage
      @ionrage Год назад +12

      I notice it during gameplay; I don't need to pixel peep for it.
      A lot of people can't keep up with action games, and a lot of people don't have good eyesight either, which is also why they don't notice the difference. People never talk about human limitations and how they're a factor here.

    • @yeahitsme4427
      @yeahitsme4427 Год назад +10

      Blame the majority of big tech YouTubers who keep praising DLSS as if it were 100% better than FSR 2, when in reality the difference isn't even noticeable to the average Joe while gaming. One thing I think AMD needs to try: release a monster card better than the 4090 or a 4090 Ti. They can do it, even if it costs over $2k and draws 600 W. This is needed for "perception": YouTubers always benchmark with the best GPU so nothing is bottlenecked, so they're constantly using Nvidia and indirectly marketing it. Nvidia knows this, and that's why they always release a monster GPU for perception and mindshare.
      AMD cannot beat Nvidia the way it beat Intel with a Ryzen moment. They really need to compete neck and neck in ray tracing, and then launch a killer GPU.

    • @bionicseaserpent
      @bionicseaserpent Год назад +1

      @@yeahitsme4427 if they plan to release RDNA 4 instead of a Refresh (likely since they did say that RDNA 3 lineup is basically complete) they can definitely throw the 4090 for a loop. since there will not be a 4090 Ti. its a genius plan if Nvidia wishes to delay Blackwell till 2025.

    • @bionicseaserpent
      @bionicseaserpent Год назад +1

      @@ionrage you don't need to White Knight DLSS you know that?

  • @MarkVP
    @MarkVP Год назад +23

    I just changed to AMD 7800XT and am very happy I did. No problems so far and great performance. I went with the Sapphire Nitro. Awesome card

    • @markomarkovic5729
      @markomarkovic5729 Год назад +3

      The only card worth buying today, maybe a 7700xt if it drops in price or a 7900 GRE if you can find it. It recently appeared in Europe (7900 GRE) at a price of around 615 euros, which is only 60 euros more expensive than the 7800xt.

    • @Kleber_03
      @Kleber_03 11 месяцев назад +1

      @@markomarkovic5729its not available in my area 😢

    • @jskop566
      @jskop566 4 месяца назад

      Sapphire makes good cards. Haven't seen one yet that wasn't well designed.

  • @Night69420
    @Night69420 Год назад +18

    You don't mention this in the video, but in the Linux/BSD world Nvidia has the unstable, bad, closed-source drivers, and AMD has the stable, good, (mostly) open-source drivers.

    • @ssquirrel88
      @ssquirrel88 Год назад

      I second this comment. OS support.

    • @pip5528
      @pip5528 4 месяца назад

      Nvidia has gotten a lot better since then. 555.58 added explicit sync so no more jittering on Wayland. 560 is even better. I have a 3050 ti Mobile laptop, old 1050 ti machine, and got a 7900 GRE for my build.

  • @eyeOfAC
    @eyeOfAC Год назад +44

    I got into the RTX hype with a 2060S and later upgraded to a 3070, and all I can say is that both cards were nothing but disappointing when it came to RT, never quite managing a proper 60 fps. Every new RTX-sponsored game that came out was worse and worse. Then the 3070's VRAM size became an issue, so the 12 GB 4070 Ti wasn't looking good; all I saw was more disappointing RT and more VRAM-size issues. So I got the 7900 instead. It's been great.

    • @adi6293
      @adi6293 Год назад +6

      No offence but you actually fell for RT and bought a 2060S? 😅😅

    • @Andytlp
      @Andytlp Год назад +5

      @@adi6293 It's cool to check out once but thats it. You can do that on amd gpu. it runs like sht either way unless you have 4090 and play on 1080p, 1440p max.

    • @eliesercepeda686
      @eliesercepeda686 Год назад +2

      Again, it’s like a lot of you don’t know the graphical powers between GPUs. At LEAST your GPU had RT, vs NOTHING that AMD was offering lol it’s like complaining for having SOMETHING, vs nothing. You upgraded to 7900 which is similar to a 4080. And I play Cyberpunk Maxed out DLSS 3, with its NEW Ray Reconstruction technology, (AMD doesn’t have) and Frame Generation, I’m playing this game EASILY over 100 fps. Your 7900 wouldn’t even come close! Not even AMD’s top GPU can beat my FPS on my 4080 on cyberpunk.

    • @adi6293
      @adi6293 Год назад

      @@eliesercepeda686 2060S and RT 😂😂😂😂 what an idiot, also you have DLSS 3 on? So what is your real fps? 50? The game will soon get FSR3 too and you can already use AFMF to a great effect in Cyberpunk, I have a 7900XTX and I play with RT on no problem 👍

    • @o0Silverwolf0o
      @o0Silverwolf0o Год назад +2

      ​@@adi6293even without rt the 2060 super was a good midrange card and if you got the promo it came with Metro Exodus and Control making it a great value plus giving a taste of the new tech. Now with the 4090 rt is playable at high framerates.

  • @stratuvarious8547
    @stratuvarious8547 Год назад +16

    I was so disgusted with Nvidia's current business ethics, or lack thereof, when it comes to the GPU consumer that I bought an AMD card. Hopefully things will change with them, but I'm not holding my breath. Fortunately, I won't have to worry about upgrading for a couple of generations.

  • @davidhays2913
    @davidhays2913 Год назад +78

    Great video! I'm eager to see what you think about the card after using it for a while. I've been on the 7900XTX for about 6 months now and I couldn't be happier. I'm curious what experience you'll have.

    • @KamleshMallick
      @KamleshMallick Год назад

      Any games u played at 1440p ?
      How do they perform?

    • @davidhays2913
      @davidhays2913 Год назад +5

      @@KamleshMallick I play all my games at 1440p. They perform great! I play a lot of different stuff. Some older games, some new stuff. I haven't found anything yet that my card couldn't handle.

    • @chrissyboy7047
      @chrissyboy7047 Год назад +2

      Same card here for about the same length of time too. Im playing at 4k and can confirm this card still performs very well

    • @markomarkovic5729
      @markomarkovic5729 Год назад

      @@davidhays2913 It's a shame to use such a powerful card for 1440p gaming, the 6800 non-xt is more than enough for that. If you can, get a 4k monitor or connect pc to tv to take advantage of the full power of such a beast of a gpu

    • @larion2336
      @larion2336 Год назад +3

      @@KamleshMallick I only have a 6700 XT and I run 1440p flawlessly in anything I've tried, at 144 Hz too. It'll do 4K at 60 Hz as well. Needless to say, something like a 7900 XTX no doubt smashes these resolutions in all but the most demanding games.

  • @tqrules01
    @tqrules01 Год назад +13

    The drivers really got a lot better around 2018/19 and have been rock solid ever since. It was a shame they got that low-latency bug wrong, but that will be resolved in time, I'm sure.

    • @ogrogordo6084
      @ogrogordo6084 Год назад

      And when in doubt, running things at 720p and upscaling to 1080p with the GPU seems to solve a lot of issues.

    • @sshunt2987
      @sshunt2987 Год назад

      I wouldn't say rock solid. I have a 7900xt and some games will crash my system to a black screen. Others have had this issue and it's still ongoing. I thought it was power requirements but I meet them and others confirmed it wasn't.

    • @ishtiaqasif1185
      @ishtiaqasif1185 Год назад +1

      @@sshunt2987 That's why I changed to Nvidia. It would give me a black screen because Windows Update replaced the AMD driver with drivers from 2020. I could run various benchmarks fine, but once I started a game it would crash.

  • @larion2336
    @larion2336 Год назад +2

    19:00 You're mistaken about Nvidia / AMD looking different on your screen. They don't. What looks different is the video output when you record your screen with either GPU, caused by differences in video renderer software. So you won't see this visual difference in practical use at all.

  • @pokepress
    @pokepress Год назад +26

    The gaps are definitely shrinking. I currently mainly use my GPU for amateur AI, so NVIDIA has the edge in terms of support-for now.

    • @LessThanPeachy
      @LessThanPeachy Год назад +12

      True, it's rapidly improving thanks to AMD's recent efforts. It's good to see, because more and more AMD is viewed as the better option on price but lacking in anything other than gaming performance. If that's no longer the case, Nvidia will have no option but to lower their prices drastically and stop giving us the bare minimum hardware-wise, or face giving up a large chunk of their market share.

    • @ramsaybolton9151
      @ramsaybolton9151 Год назад +1

      @@LessThanPeachy I don't get the price argument. AMD is often similar to, or barely cheaper than, Nvidia in price, but you generally lose features and performance by going that route.

    • @LessThanPeachy
      @LessThanPeachy Год назад +2

      @@ramsaybolton9151 Depends. For me to have gotten something comparable from NVIDIA, I would've had to shell out $400-500 more.

    • @larion2336
      @larion2336 Год назад +2

      I've been doing that on my 6700 XT lately. AI, that is, specifically running LLMs on Linux. It definitely took _a lot_ more setting up and figuring out than it would have on an Nvidia card, but once you get ROCm working on Linux the performance, I think, is not bad. I am running q3 13B models with 6k context at somewhere between 5-9 tokens/s now. Still, it's not the best, but I could see the case being made to go for a 7900 XTX to get that 24 GB at considerably less cost than a 4090. Though a 3090 may still do you better if you can get one cheap. The biggest issue with AMD really isn't hardware, it's software compatibility. If you can get ROCm to work properly, it works well, but that's a big if sometimes. I couldn't get it to work on Windows at all and wound up needing to dual-boot Linux.
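
      For anyone trying the same thing, a quick sanity check that a ROCm PyTorch install actually sees the card looks roughly like this (a sketch; the gfx override shown is a commonly reported community workaround for the 6700 XT, not an official requirement):

      ```python
      # Sketch: verify a ROCm build of PyTorch can see the Radeon GPU.
      # RDNA2 cards like the 6700 XT (gfx1031) are often run with
      #   export HSA_OVERRIDE_GFX_VERSION=10.3.0
      # so ROCm treats them as the supported gfx1030 target.
      import torch

      print("HIP build:", torch.version.hip)               # ROCm/HIP version of the wheel
      print("GPU visible:", torch.cuda.is_available())     # ROCm is exposed via the cuda API
      if torch.cuda.is_available():
          print("device:", torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 6700 XT"
          x = torch.randn(4096, 4096, device="cuda")
          print("matmul ok:", (x @ x).shape)               # quick smoke test on the GPU
      ```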

    • @IamSH1VA
      @IamSH1VA Год назад

      @@larion2336 I had to do the same thing for my 6700 XT, but after switching to Linux, Stable Diffusion works great.
      I'm a technical person, so it wasn't a big deal for me to switch to Linux, but it is a huge disadvantage for the majority of consumers.

  • @npip99
    @npip99 Год назад +3

    18:30 Is your Monitor sRGB, or DCI P3 / AdobeRGB? I think the difference you're showing between 4070 Ti and 7800 XT is _not_ the difference between the GPUs, but rather the difference between rendering Full Screen and rendering in Borderless Windowed, since that can affect whether or not Microsoft applies its color profile conversion. Even if full screen vs windowed isn't the cause, the differences shown definitely look like a Color Profile issue, not a fundamental difference between the way the two cards render.
    Try to run Nvidia on fullscreen mode to verify. If that doesn't work, check your color profile settings in both GPUs, and/or use a Monitor that has 100% sRGB with no support or low percentages on DCI P3 / Adobe RGB. Also check any color profile assumptions with your capture device. Besides those, you can only try to apply an sRGB -> DCI P3 conversion to the final videos to see if that makes the two videos match in saturation (A post-processing conversion will be lower quality than originally capturing it correctly, but it may prove whether or not it's really a color profile issue)
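
    If you want to test the post-processing idea on a frame grab before touching a whole video, a rough sketch of that sRGB to Display P3 re-encode, assuming NumPy (the matrices are rounded from the commonly published D65 values, so treat the result as approximate):

    ```python
    # Rough sketch: re-encode sRGB pixel values into Display P3 coordinates to test
    # whether the saturation gap between the two captures is a color-profile issue.
    # Matrices are rounded approximations of the usual published D65 values.
    import numpy as np

    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    XYZ_TO_P3 = np.array([[ 2.4935, -0.9314, -0.4027],
                          [-0.8295,  1.7627,  0.0236],
                          [ 0.0358, -0.0762,  0.9569]])

    def decode(c):   # sRGB and Display P3 share the same transfer curve
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    def encode(c):
        c = np.clip(c, 0.0, 1.0)
        return np.where(c <= 0.0031308, c * 12.92, 1.055 * c ** (1 / 2.4) - 0.055)

    def srgb_to_display_p3(img):          # img: float array, HxWx3, values in 0..1
        xyz = decode(img) @ SRGB_TO_XYZ.T
        return encode(xyz @ XYZ_TO_P3.T)
    ```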

  • @TehShadowDeath
    @TehShadowDeath Год назад +12

    I made the switch years ago because I got sick of my Nvidia GPUs dying on me due to drivers. That was back when Nvidia's drivers kept killing GTX 400/500 GPUs. I switched to the 290X and have stayed with AMD since. I would rather have a GPU that runs slower than the competition's but isn't killed by planned obsolescence. The way I kept my GTX 560 Ti from dying was rolling back to older drivers, because anything newer than a certain point caused the cards to run so hot they would cook themselves; that is how my GTX 460 died.

    • @BNBPhotofr
      @BNBPhotofr Год назад +1

      Fun fact, the 290x actually still runs modern games at around 30 FPS. Kryzzp made a video showcasing this recently.

  • @MrRoxBrown
    @MrRoxBrown Год назад +12

    I've had Nvidia since 2008. I've had many little issues with my cards over the years that I didn't know weren't common. I've switched to a 7900 XTX and boy, my everyday experience is waaay better coming from a 2080 Super, both watercooled.

    • @theghostofthomasjenkins9643
      @theghostofthomasjenkins9643 Год назад

      LOL, if you thought nvidia's cards had tiny issues, wait until you start using amd.

    • @npip99
      @npip99 Год назад +3

      ​@@theghostofthomasjenkins9643 Literally the target audience for exactly this video. Statistics already show Nvidia having _more_ driver issues now. I swapped from Nvidia to AMD and it resolved a lot of driver issues I had with Nvidia (In my case I had Linux driver issues, but they were Nvidia's proprietary drivers, and Nvidia's main price/performance advantage is in Artificial Intelligence, which usually runs on Linux!).
      Either way, driver issues and support are really a toss-up between two brands now. And for either brand, you should never buy a card too soon after it comes out to let them fix driver issues for a bit.

    • @MrRoxBrown
      @MrRoxBrown Год назад

      @theghostofthomasjenkins9643 I've been daily driving for 3 months now. Streaming and recording video. Still no problems. 🤷🏿‍♂️

    • @theghostofthomasjenkins9643
      @theghostofthomasjenkins9643 Год назад

      @@MrRoxBrown and i've never had a problem with nvidia, but that's why anecdotal evidence doesn't mean anything. the issues are well documented on a global scale.

  • @mikerochburns4104
    @mikerochburns4104 Год назад +6

    I'd wanted to try AMD for a while and finally took the plunge after I watched the Gamers Nexus AMD tour. I got a real sense of "home brew" about the company, and I liked that. And if anything I now own has in some way been engineered by Bill Alverson or Amit Mehra, I feel blessed.

  • @jamielawrence7217
    @jamielawrence7217 Год назад +17

    I built a brand new PC about 4 months ago and went with a 6900 XT. My last PC had a 760 TI in it (laughs), and I wanted to build something for under 3k CAD, GPU choice being the toughest. It all came down to RT vs more GPU memory for the price and in some blind tests I actually preferred non-RT options in the samples I viewed, so I wound up deciding on a 6800 XT for my budget. Luckily, when actually pricing parts I found a brand new 6900 XT on Ebay, cheaper than any retailer by 100 CAD, so that's what I went with. So far I'm super happy with it and I figure it will keep me happy with 1440p gaming for a few years to come (fingers crossed). BTW, my monitor wouldn't shut off sometimes depending on the driver release. This happened with 2 or 3 of the Adrenaline drivers that I've used since the build.

    • @Aaronnpool23
      @Aaronnpool23 Год назад +1

      I was thinking a 6900 XT as well; the card is solid at 16 GB with its speed. I've seen them for 600 Canadian used.

    • @jamestkirkcameron9189
      @jamestkirkcameron9189 Год назад +1

      Thanks for sharing. I’m looking to upgrade from my 1060 laptop. Thinking AMD is the way to go.

    • @larion2336
      @larion2336 Год назад +2

      I'm happy gaming at 1440p with a 6700 XT. With a 6900 you can probably do 4K just fine (at 60hz I can also do it). So I think you are definitely right about getting a good few years out of that.

    • @zerorequiem42
      @zerorequiem42 Год назад +2

      Don't need to laugh about your 760ti. I'm looking to upgrade now and my last GPU was a 960. Plenty people out there like us that waited through the crypto/pandemic pricing.
      I'll probably end up getting a 7800xt, but still crossing my fingers on finding something on eBay like you did.

    • @jamielawrence7217
      @jamielawrence7217 Год назад

      Yes, I can do 4k in pretty much every game I play, but my 4k TV is trash compared to my 1440 gaming monitor. The 6700 XT is what I first had my eye on with this build as it seems to be a very respectable card from all the research I did. It really is only by luck that I got the card I have because I wouldn't have bought it at the prices on Amazon, Newegg, etc.@@larion2336

  • @knivestv0
    @knivestv0 Год назад +4

    Good video. I'm probably not your target (I don't care at all about price to performance ratios) but I do appreciate your testing, research, opinion and effort in what you've presented here. Very nice.

  • @danspencer4235
    @danspencer4235 Год назад +41

    When you have ONLY owned nVidia cards, you have no basis for having an opinion on AMD. I have owned both and I think most people who will only consider nVidia are just not very smart.

    • @thebossroleplay4105
      @thebossroleplay4105 Год назад +3

      I had a 6700 XT and I wanted to throw my GPU across the room more times than I could count. The drivers were fine... until I went to update them, and it would sometimes update no problem, and sometimes force me to reinstall Windows because it deleted crucial system files. I'm not biased when I say this, but with my Nvidia card so far, knock on wood, I have not had any issues; it's more mature, I guess. You pay the Nvidia tax for mature drivers, stability, ray tracing, and more-or-less better DLSS, at least if you upscale, which you will nowadays since games suck. I want my graphics not to shimmer and dance in the background, and until AMD fixes that in all games I'm not willing to do that to myself. Then again, I'm a graphics snob and I like my graphics as close to perfect as they can get. If I weren't a graphics snob I would say go out and buy AMD, but be aware they may put out a driver that breaks things. I've said enough; I tried hard to be unbiased in my wording, but someone is bound to turn this around.

    • @danspencer4235
      @danspencer4235 Год назад +5

      @@thebossroleplay4105 I have never had any of the problems you describe with an AMD card. Perhaps you had a bad card that should have been exchanged. Regardless, I wish you well and if buying nVidia is your solution, I hope that works out for you.

    • @thebossroleplay4105
      @thebossroleplay4105 Год назад +1

      @@danspencer4235 maybe I was a one off but I appreciate your opinion and glad you respect mine

    • @AlanakaBlackCat
      @AlanakaBlackCat 2 месяца назад

      @@thebossroleplay4105 6900 XT here. I was told it was on par with a 3090 when I bought it, and dear god, the graphics drivers are shit. Sometimes your performance will suffer. You may get more VRAM, but it does not put its power where its mouth is; it's terrible for VR, and for standard games it's been okay but always lagging behind Nvidia's frame rates. You'll save like 200 or 300 bucks, but you get what you pay for. I personally will never go with an AMD GPU again. Also, if you have an Intel CPU, do not go with AMD, you will miss out on features; if you have a Ryzen you get more of what you paid for. The mantra I've learned is AMD CPU, Nvidia GPU for the best performance.

    • @kotzer71
      @kotzer71 Месяц назад

      @@thebossroleplay4105 Been using AMD for 15 years now, currently have a 6700 XT, and I gotta ask: what the hell were you doing that updating drivers caused Windows to delete itself?

  • @r3n846
    @r3n846 Год назад +18

    The AMD card looks like it's using limited RGB and the Nvidia card is using full RGB. Go into AMD Settings, display, and set it to 4:4:4 Full RGB or YCbCr and compare output quality.
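
    A quick way to tell which range a capture actually landed in is to look at its extremes; a small sketch, assuming Pillow/NumPy and a hypothetical screenshot.png:

    ```python
    # Heuristic: if a supposedly full-range capture never dips below 16 or rises
    # above 235, it was almost certainly produced in limited RGB.
    import numpy as np
    from PIL import Image

    pixels = np.asarray(Image.open("screenshot.png").convert("RGB"))
    lo, hi = int(pixels.min()), int(pixels.max())
    print(f"min={lo}, max={hi}")
    if lo >= 16 and hi <= 235:
        print("Looks like limited-range (16-235) content.")
    else:
        print("Values outside 16-235 present, so this is full-range (0-255).")
    ```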

    • @shadowrealms2676
      @shadowrealms2676 Год назад

      Yep

    • @hamzaahmad1345
      @hamzaahmad1345 Год назад

      Bro, is it safe? What about the colour enhancement setting?

    • @Ashitaka0815
      @Ashitaka0815 7 месяцев назад

      I wouldn't be so sure about that; the examples in the video look more like oversaturation. Crushed details in bright and dark areas are what's typical for a color range beyond the monitor's range.

    • @r3n846
      @r3n846 7 месяцев назад

      @@Ashitaka0815 Could be, but I know AMD cards have a weird affinity for defaulting to limited RGB. One monitor I literally can not set it to full RGB unless I use HDMI for some reason.

  • @djmidnightwolf
    @djmidnightwolf Год назад +7

    I bought the 6950XT last week and it surprised me with how good it was.

  • @vanveenmatt
    @vanveenmatt Год назад +6

    The last AMD card I had was many years ago, the XFX Radeon HD 6970. I've had two Nvidia cards since, the 970 and the 2070 Super. Both were good cards. Just recently bought a 7900 XTX. So far so good. Very strong card.

  • @user-jd3pk1bz8e
    @user-jd3pk1bz8e Год назад +6

    I myself own a 6800 XT, coming from a 3070 Ti. I do production work (Photoshop, Illustrator, After Effects, Premiere) and gaming. No problem running any of that software.
    People need to know that AMD is more or less plug and play. No problems here.

    • @michaelschlosser1194
      @michaelschlosser1194 10 месяцев назад +1

      Have you run into any programs you can't run? I feel like Radeon is better at fine details, which is what I want in a GPU, but the extra programs that come with Nvidia make me hesitate.

    • @user-jd3pk1bz8e
      @user-jd3pk1bz8e 10 месяцев назад

      @@michaelschlosser1194 Never. And if you need stable drivers for software you can always install the pro (more stable) version of the drivers.

    • @user-jd3pk1bz8e
      @user-jd3pk1bz8e 10 месяцев назад

      Never got any issues

  • @Sonic6293
    @Sonic6293 Год назад +12

    I like AMD over Nvidia because I like to run Linux, and the compatibility with Nvidia is not worth the headache. Both are companies at the end of the day and they are not your friends. It just seems that AMD is more likely at the moment to not spit in your face.

    • @Roxor128
      @Roxor128 Год назад +2

      While they won't spit in your face, they are, however, a bit lax on their GPGPU stuff. If there's one area of software they need to spend time focussing their development efforts on, it's their Linux compute stuff.

  • @sobzeru
    @sobzeru Год назад +2

    I've noticed people expressing concerns about high prices for graphics cards, but it seems some are overlooking the potential increase in electricity bills that comes with more power-hungry cards. The situation varies by country, but it's something I find frustrating, especially where I live. Obviously, more power draw should at least mean better performance.

    • @South_0f_Heaven_
      @South_0f_Heaven_ Год назад +1

      Can’t afford an extra $2 a month?

    • @sobzeru
      @sobzeru Год назад

      @@South_0f_Heaven_ Unfortunately, it isn't just extra $2 in my side of the world.

    • @Bugs-o8k
      @Bugs-o8k Год назад

      If one can't afford the tiny difference in monthly electricity costs then a gpu upgrade was probably a bad idea in the first place and they have a much bigger problem financially. Who prioritizes a GPU when one can't reasonably afford an electric bill?

    • @sobzeru
      @sobzeru Год назад

      @@Bugs-o8k good for you guys on being charged tiny extra only.

  • @Pete856
    @Pete856 Год назад +5

    How did you make the 7800xt use so much power at idle? My 7900xt idles at 10w, unless I turn on desktop capture with a pre-recording buffer of a minute, then it increases to 40w.
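
    For anyone who wants to verify their own idle draw rather than trust an overlay, the amdgpu driver on Linux exposes the sensor through hwmon; a small sketch (card index and sensor file name vary between systems and kernels):

    ```python
    # Sketch: read the Radeon's reported board power from the amdgpu hwmon interface
    # on Linux. The sysfs value is in microwatts; some ASICs expose power1_input
    # instead of power1_average.
    import glob, time

    paths = (glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_average")
             or glob.glob("/sys/class/drm/card*/device/hwmon/hwmon*/power1_input"))
    if not paths:
        raise SystemExit("no amdgpu hwmon power sensor found")

    for _ in range(5):                       # sample the idle draw for a few seconds
        with open(paths[0]) as f:
            print(f"{int(f.read()) / 1_000_000:.1f} W")
        time.sleep(1)
    ```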

    • @earthtaurus5515
      @earthtaurus5515 6 месяцев назад

      Playing this video in normal mode only has my 7900XTX using less than 50 watts and I'm using a 43" TV as a monitor at 4096 x 2160 120 Hz.

  • @shaunthefriendlylesbian2950
    @shaunthefriendlylesbian2950 Год назад +2

    Hey, 7900xt gamer here.
    I have had a few issues with using this card driver wise, but they are mainly quite niche:
    - I have had 2 crashes in cs2 which is still a new game that is fairly early in development for a live service
    - I have had driver crashes in pcsx2 after using dx11/12, however Vulkan works fine
    - Implementing interlaced resolutions on my CRT/displaying a duplicate image on my LCD monitor and Plasma TV
    Other than that, the card works great, using fsr 2 quality on cyberpunk on RT psycho gives me upscaled 1080 60, the software works great, the buffer of VRAM gives peace of mind, RT in general works great on most games, especially non Nvidia sponsored titles. Gaming with AMD is great.

  • @spentcasing3990
    @spentcasing3990 Год назад +11

    Swapped out my 3080 for a 7900xt and the difference was night and day. The last few updates with Nvidia I saw worse fps and more problems in game. Haven't had a single issues with the 7900xt since switching.

    • @2284090
      @2284090 Год назад +1

      Same here, I just love my RX 7900 XT; I'm getting a better experience than with my RTX 3080 10 GB.

    • @Mr11ESSE111
      @Mr11ESSE111 Год назад

      Nvidia downgrades performance on older GPUs with new drivers, so it looks like the newer GPUs get more and more FPS in newer games.

    • @eliesercepeda686
      @eliesercepeda686 Год назад

      Lmao the 7900xt is the equivalent of a 4080. Of course there’s a difference! If you swapped an equivalent GPU in power and THEN saw something, perhaps your comment would make more sense lol

    • @spentcasing3990
      @spentcasing3990 Год назад +1

      @@eliesercepeda686 Did you even bother to read my comment ?
      I'm not talking about in game performance, being able to use higher graphic settings, or fps. What I mentioned was driver updates and how I haven't had any issues since switching to AMD. Compared to Nvidia which the last 4 or 5 updates have been a mess.

    • @eliesercepeda686
      @eliesercepeda686 Год назад

      @@spentcasing3990 Your first line in your comment stated that once you swapped it was night and day. That’s it. Then you added other things which are irrelevant. AMD has BARELY got its firmware issues together. But it was NEVER Nvidia who lagged behind. In fact in the last month or so I’ve had 2 firmware updates from Nvidia.

  • @berger15
    @berger15 Год назад +1

    I'm building a new PC, and the issue with which "mid-range" (under $600) GPU I should go for has been bugging me. Prices vary so much from week-to-week and month-to-month, and I still need to get a few other bits to finish the build.
    Looking at the games I play and my current setup - my little 1660 ti is doing a cracking job, but going forward I was falling toward AMD, and your video gives such a clear compelling argument for me to stick to that choice. I don't play many AAA titles - especially at launch, I can't use ray tracing yet (current GPU doesn't really support it) and probably won't bother, I don't game at 4K or 1440p (although that may change if my monitor gets a refresh too) and I don't do video or image editing.
    Thank you!

  • @chrissoclone
    @chrissoclone Год назад +3

    The color difference seems just like a calibration issue to me, I never noticed any differences between AMD and nVidia per se, but each graphics card change of course requires a new monitor calibration, or if you're not into that, needs different ICC profile adjustments, often even between GPUs from the same brand, and of course readjusting the gamma in games. One thing though, as someone also mentioned there's a "custom color" option in Adrenaline (maybe nVidia has an equivalent to that too), as someone using a Spyder for calibration I make sure everything there is disabled completely.

    • @arizona_anime_fan
      @arizona_anime_fan Год назад +3

      The color difference has been a thing for a decade now. It has to do with how Nvidia does data compression: their method results in higher fps at the expense of the intended coloring, because Nvidia sort of simulates the coloring. You can't really see it unless you're looking for it, but it's been well documented for a decade now.

    • @Andytlp
      @Andytlp Год назад +3

      You have to set the full dynamic range on Nvidia every time in the control panel; I don't know why it defaults to limited. But I did notice inferior rendering on Nvidia, likely due to overly aggressive memory compression. There's just not enough data any more with those pathetically narrow VRAM bus widths.

  • @Zebrax-kq4gg
    @Zebrax-kq4gg 7 месяцев назад

    17:45 Apparently the fix for the high energy consumption at idle is installing CRU and making a custom resolution where you set the blanking rates to around 55.
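
    The usual explanation for why extra blanking helps is that the card only re-clocks its VRAM during the vertical blanking interval, so a longer blank gives it a big enough quiet window to drop memory clocks. A back-of-the-envelope sketch with example timings (not your monitor's real values):

    ```python
    # Example numbers only: vertical-blanking time per frame for a 1440p144 mode
    # with default-ish blanking vs. extended blanking set in CRU. More blank lines
    # means a longer quiet window for VRAM re-clocking and lower idle memory clocks.
    def vblank_time_us(v_active, v_blank_lines, refresh_hz):
        v_total = v_active + v_blank_lines
        line_time_us = 1e6 / (refresh_hz * v_total)    # time to scan out one line
        return v_blank_lines * line_time_us

    for blank_lines in (41, 120):                      # default-ish vs. extended
        t = vblank_time_us(v_active=1440, v_blank_lines=blank_lines, refresh_hz=144)
        print(f"{blank_lines:>3} blank lines -> {t:6.1f} us of vblank per frame")
    ```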

  • @wade196840
    @wade196840 Год назад +4

    Before you start a game, go into Tuning and set the GPU to AMD's stock undervolt preset; you should get more performance while using less energy.

  • @GraphicallyChallenged
    @GraphicallyChallenged Год назад

    Great video!

  • @MichaelStanton
    @MichaelStanton Год назад +7

    I am super interested in this video I'm only several minutes in but my 6900 was my first AMD card and it changed my mind about AMD, but ultimately went 4090 because... Frame Gen and Ray Tracing. But now with the shortage I'm really considering cashing in and getting an AMD card and pocketing some cash.

  • @littl3spy
    @littl3spy Год назад

    I just switched to AMD, and I had many problems:
    - Performance much lower than expected: had to uninstall MSI Afterburner to fix it.
    - Solid green screens crashing the whole PC when I connected anything over HDMI: fixed by downloading some registry files from Nvidia's website.
    - Resolution scaling not working properly on my 21:9 monitor: it still sometimes works and sometimes doesn't; I can go and play with AMD settings and it doesn't change anything, and sometimes it works when I turn the PC on and sometimes it doesn't.
    Why can't AMD drivers work as they should?

    • @patrickstar787-d9n
      @patrickstar787-d9n 10 месяцев назад +1

      What PSU do you have?
      Maybe you just needed to change some settings.

    • @woocash80
      @woocash80 2 месяца назад

      I just switched from Intel/Nvidia to an all AMD build and I did run into issues to the point of wanting to return my 7900xtx. What I noticed is that AMD do require all drivers to be up to date, including and most importantly the motherboard chipset drivers. I kept asking why people use AMD when there are so many issues?! I got so many driver crashes until I updated the chipset driver and so far so good. Right now I don't think I'd go back but again it's not as 'plug and play' level of easy as Nvidia for sure...

  • @simonkelly8906
    @simonkelly8906 Год назад +3

    I was using an RX 6950 xt but since latest driver install the games keep crashing and I get a black screen. I tried all the fixes online but nothing worked. So I bought a 4070 ti and its been great. Never had any driver issues with NVIDIA and temps are much cooler too.

    • @phlixcarbon
      @phlixcarbon Год назад

      Did you clean the old gpu drivers off your system with a software tool such as DDU? Ive seen this happen when people have had old Nvidia Cards and still have the drivers installed then attempt to upgrade by installing an AMD GPU without ensuring that no other graphic drivers are installed.

    • @simonkelly8906
      @simonkelly8906 Год назад

      Yes I used DDU

  • @charlesballard5251
    @charlesballard5251 Год назад

    Last July during the Amazon Prime Day Sale I ordered a bunch of parts to build a new machine. I re-used the case from my then current desktop, got the Micro Center exclusive AMD Ryzen 5 5600X3D CPU, an MSI X570S Edge Max Wifi MB, a 1TB Samsung 980 Pro 4 GEN SSD for the boot drive, a 4TB MSI Spatium M4614 GEN SSD for the data drive, 32GB RAM, and I topped it off with the XFX SWFT 319 AMD Radeon RX 6800 XT 16GB GPU. This thing ROCKS!!!! I spent a grand total of around $1500 U.S., and couldn't be happier. It runs what I want it to run with only a few exceptions. I did put WIN11 on it, so there are some games that are not running. But I held onto my old computer transplanting it into another case I had laying around for when I want to run those things that are too old to run on the new machine. I agree, there is no reason to choose an Nvidia card these days. AMD is cheaper and just as good. I love this new machine, but I will be honest, I'm possibly building another new machine by summer... but only because I'm coming into a decent inheritance!!! I'm hoping that AM5 will have some of it's bugs worked out and prices will have dropped giving me the impetus to build an even more advanced machine than this one. I've looked at MBs and I'd want one with at least 4 GEN 5 M.2 slots. There aren't a lot of those out there. They're either all GEN 4 or lower, or one 5 and a 4, or a 5 and multiple 4's. I've only seen one or 2 that were all GEN 5 (a minimum of 3 slots) and as I recall they were a bit pricey. I think one was $699. Maybe, but then again... maybe not. In short, welcome to the AMD club. Coffee and tea will be served.

  • @Sercil00
    @Sercil00 Год назад +3

    What really surprises me are the color differences and the hugely higher power draw on the 7800 XT. I would have assumed those are just different settings with less contrast and less saturation... Also, I don't get how it's running at over 70W in idle.
    I was looking into the 7800 XT, and I've happened to run AMD almost forever. I still think it's a much better offer than anything Nvidia has right now, but those things do sour me on the 7800 XT. Electricity in my country costs about 45c/kWh.
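
    To put a number on that, the arithmetic is straightforward (a sketch; the 60 W figure is just the difference between the roughly 70 W idle shown in the video and a more typical 10 W, and the hours per day are an assumption):

    ```python
    # Back-of-the-envelope yearly cost of an extra ~60 W at idle, at 45 c/kWh.
    extra_watts   = 60      # ~70 W observed idle minus a typical ~10 W
    hours_per_day = 8       # assumed desktop/idle hours per day
    price_per_kwh = 0.45    # as quoted above

    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    print(f"{kwh_per_year:.0f} kWh/year -> {kwh_per_year * price_per_kwh:.0f} per year")
    # ~175 kWh/year -> ~79 per year at 0.45/kWh, so the idle bug is not a trivial cost.
    ```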

    • @kognak6640
      @kognak6640 Год назад +1

      That idle power consumption is abnormally high; it's not typical. With a single-monitor setup it's less than 10 W; my 6800 does 6 W. Radeons consume more with multi-monitor setups, especially high-refresh-rate ones, but not that much.

    • @nononsense846
      @nononsense846 Год назад +1

      I noticed that too. I need those color settings, so I go with AMD. I can't deal with the washed-out look on Nvidia GPUs, and their software sucks too; you can't even record the desktop with GeForce. Nvidia is overrated as hell, just like iPhones.

    • @JesusIsKing1001
      @JesusIsKing1001 Год назад

      Yeah it probably depends on monitor resolution and refresh rate when using the gpu on idle but you can undervolt it on amd adrenaline

    • @martinrisian3420
      @martinrisian3420 Год назад +2

      I do own 7800 XT nitro+, it idles at 10W doing nothing, 30-40W during video playback and max value i saw at idle was 60W spikes so i guess it depends on the model. I run 2 displays 1440p 144Hz main and 1080p 60hz secondary.

    • @Sercil00
      @Sercil00 Год назад

      @@martinrisian3420 Ok good, that sounds exactly like my setup and the model I was looking at.

  • @thescerigai
    @thescerigai Год назад +1

    I helped my friend build a PC and saw all the "oh look, AMD is so much better value" talking points everywhere, so I specced out an AMD CPU and GPU build for him. AMD does not "just work": crashing in games for no reason, buggy software, just random issues that should not happen. He switched to a 3060 and got a much more stable gaming experience. AMD must be saving on something to offer that "so much better value". If it works for you, fine, get AMD. But for me Intel and Nvidia have "just worked"; it's what's in my personal PC, so I will recommend Intel and Nvidia to anyone building their own PC.

    • @woocash80
      @woocash80 2 месяца назад

      Amd do require all drivers to be up to date, including and most importantly the motherboard chipset drivers. I scratched my head til I bled asking why I got so many driver crashes until I updated the chipset driver and so far so good. Gaming has never been so smooth. I can use adrenalin drivers to do everything including upscaling/frame gen all games I run no matter the age. And on top of that my 7900xtx is only going to get faster and better as it ages. I can run all my games 1440p ultrawide 144 hz at over 140 FPS maxed out most of the time with no upscaling. Even ue5 games run over 100fps. But again it's not as 'plug and play' level of easy as Nvidia for sure...

  • @Hullbreachdetected
    @Hullbreachdetected Год назад +3

    The worst problem I ever had with AMD GPUs in general is that the driver resets to default GPU tuning settings each time there is a system failure, so I have to reload my profile. That's it. If you want to buy AMD hardware, just do it. Both Nvidia and AMD are good enough for gaming; price to performance is the determining factor.

    • @Hullbreachdetected
      @Hullbreachdetected Год назад +1

      @High_On_Hope It is a safety feature, AMD driver will revert back to default tuning settings in any scenario where the system fails, regardless of reason. I think AMD does this to reduce complaints from people who aren't as hardware literate.

    • @earthtaurus5515
      @earthtaurus5515 6 месяцев назад

      @@High_On_Hope A CMOS clear won't remove a faulty GPU tune, lol. You would need to reinstall Windows or boot into safe mode if, for example, you had used MSI Afterburner to apply a faulty tune on boot to desktop. Fortunately, safe mode does kick in after a few failed boots. I do miss being able to press and hold the F8 key for safe mode options.

    • @woocash80
      @woocash80 2 месяца назад

      I had the exact same problem. It was fixed after a mobo chipset update. No more crashes 😊

  • @zeeMuniStacksBundles
    @zeeMuniStacksBundles Год назад +1

    I've used amd/ati on and off for 20 years for both cpu/gpu, everything works/worked as it should

  • @ScoutReaper-zn1rz
    @ScoutReaper-zn1rz Год назад +3

    My only concern with switching to AMD is PhysX. I still play the Batman Arkham games and can't find any info on if they can run the PhysX effects like Fog/Debris, Rain and interactive cloth/paper the same way that Nvidia does. I know back when those games were new AMD would defer all those effects to the CPU resulting in fps below 20.

    • @shazzi1626
      @shazzi1626 Год назад +2

      I played through Batman Arkham Knight on a 6800 XT recently and haven't noticed anything odd.

  • @jaketompson9269
    @jaketompson9269 Год назад +2

    In Australia the 4070ti midrange is priced to compete against the 7900xtx at ~$1500AUD.

  • @donaldslayer
    @donaldslayer Год назад +3

    7800 XT here. I got the Phantom Gaming as well, pretty good binned silicon. Got it like a couple of weeks after launch. It’s been a monster with my 5800X3D. I swapped from 2080 super. I do miss DLSS but that is the only thing so far. I agree that RT is overrated for now, I think in a few generations it will matter though.

    • @thebackroomstockboy7336
      @thebackroomstockboy7336 Год назад

      I have a regular 2080 and I know the performance gap to the Super isn’t much. How’s the performance compared to your 2080 Super? Do you see better frame rates? I don’t care at all about RT so I’m not really even considering nvidia this time around.

    • @donaldslayer
      @donaldslayer Год назад

      @@thebackroomstockboy7336 it’s been a massive jump at 1440p and especially 4K. I’ll admit I didn’t “need” an upgrade, but I didn’t want to hold out yet another generation as I was starting to see some pretty hefty gpu bottlenecks. With optimized settings at 1440p and no raytracing cyberpunk, I’m cpu capped at ~110-130fps. With XeSS (FSR is too ugly for me to even look at even at quality) I can do raytraced lighting and reflections at a fairly locked vsync half (72fps), maybe some additional tuning could get me local shadows ray traced as well. All in all, it’s better at RT than the 2080/S, and leaves it in the dust completely in raster performance

    • @PineyJustice
      @PineyJustice Год назад

      @@thebackroomstockboy7336 Obviously depends on the game and resolution, but at 1440p and 4k the 7800xt should be about 50-100% faster with a few outliers being as low as 30% faster

  • @PyromancerRift
    @PyromancerRift Год назад +1

    If AMD really were that crappy, they would be out of the GPU business with all the lawsuits they would face. I've had a 7900 XTX for 7 months, and the only driver crashes I've had were because of my overclock. With a stable overclock I have zero issues whatsoever.

    • @kotzer71
      @kotzer71 Месяц назад

      Most of the time when people complain about AMD, it's user error or Windows changing drivers on them.

  • @Antonis24
    @Antonis24 Год назад +4

    I upgraded from an RTX 3080 to the XFX 7900 XTX (used DDU in safe mode) in May, and I'm really happy with it. The rasterization performance at 1440P/180Hz is amazing.

  • @esteban1973
    @esteban1973 Год назад +2

    Peer pressure is still strong past high school. Look at the Apple fanboys. They will proudly pay hundreds more for less choice and less features.

  • @nbrown5907
    @nbrown5907 Год назад +13

    Frame generation is a game changer, AMD must do this to stay in the game.

  • @sharktooh76
    @sharktooh76 Год назад +2

    Nvidia has an ARMY of fanboys like Apple (AMD has them too but not nearly as much). NEVER listen to fanboys.
    Just gather info from reliable sources and get what you need at the lowest price you can get.
    P.S. Idle power was fixed in Adrenalin 23.12.1

  • @MohawkNinja636
    @MohawkNinja636 Год назад +5

    My buddy had an older RX480 and had TONS of driver issues that I just didn't have on my 1080Ti. I've been following AMD for a long time but only bought in on their CPUs. I just bought my 1st ever AMD GPU and I'm very excited to see how it performs. I went with the 7900XTX and even upgraded my monitor from 1080p to 4k for the 7900XTX

    • @UncannySense
      @UncannySense Год назад +2

      I had a 1080 Ti and an RX 580 on the same CPU/motherboard. Never had issues with either. Sold both during the crypto spike for a profit and gamed on a 5700G APU till GPU prices normalized. Currently using an RX 6600 with no problems, and looking at a used RX 6800 now. I mostly support AMD due to their open-source, consumer-friendly practices.

    • @mikfhan
      @mikfhan 3 месяца назад +1

      I think now in 2024 they finally stopped support for RX480 in the Adrenalin drivers, at least I keep getting an older driver recommended for mine when I search/update. It had a great run though since Witcher 3 launch.

  • @garyhall3919
    @garyhall3919 11 месяцев назад +1

    I upgraded from a 1080 Ti to a 7900 XTX, and after using it for 6 months I upgraded to a 4080. While using the 7900 XTX I experienced game crashes more frequently than I ever had with my 1080 Ti or with my 4080 now. This really disappointed me, as I was hyped for an AMD build (5900X CPU). I will continue to use AMD CPUs, as they are amazing, but I will forever be wary of AMD GPUs.

    • @sajithsaji3606
      @sajithsaji3606 6 месяцев назад

      Bro, I have a question: is there any difference in image quality between AMD and Nvidia, or are they the same?

  • @mungojerrie86
    @mungojerrie86 Год назад +4

    As for visual differences, I've immediately noticed them after switching from the 3070 to 6900 XT. Colors felt a bit more lively, image on the whole looked a tiny bit more colorful which I prefer. To make sure it's not placebo I've corroborated it with a mate who made a similar switch and he has confirmed my observation.

    • @user-jd3pk1bz8e
      @user-jd3pk1bz8e Год назад +2

      I notice better image fidelity on AMD also. Like Nvidia colors are way more saturated. Amd colors are more close to reality imo.

    • @systemBuilder
      @systemBuilder Год назад

      The 3070 was possibly doing texture compression to make up for the way they ripped off customers with too little memory. It astonishes me how stupid Nvidia's customers are. They buy the card that produces fake pixels, fake frames, fake colors, and then brag. About what? What you're getting is NOT 3D; you're getting 2.8D, because most of the richness is lost to all of Nvidia's cost-saving gimmicks! And yes, I'm a fool, I bought a 3070 Ti for myself, but my kids (mostly) own(ed) AMD (7900 XTX, RX 480, RX 580, 2060 KO).

    • @mungojerrie86
      @mungojerrie86 Год назад

      @@systemBuilder That wasn't texture compression, just different approaches to color management I guess.

  • @Gerry-K
    @Gerry-K Год назад +1

    With AMD updates causing trouble for its consumers, I think people are silly to use any AMD cards. I switched to Nvidia a few years ago and will never go back!

  • @elvertmack5039
    @elvertmack5039 Год назад +3

    7900 XTX PowerColor Red Devil here... caught it on sale on Amazon for 775 bucks, couldn't beat it. Had to repaste the die, but other than that, a cool card so far.

    • @Lodorn
      @Lodorn Год назад +3

      Yeah, that sounds like a PowerColor problem more than an AMD problem anyway. It's not like AMD ships the chips pre-pasted to the cooler manufacturers.

    • @elvertmack5039
      @elvertmack5039 Год назад

      @Lodorn Exactly, but thank God for Reddit. Then I learned how to break it down on YouTube; now the hotspot never reaches 80 with overvolting/undervolting.

  • @SuperBacon17yt
    @SuperBacon17yt Год назад +1

    I am distracted by the '90s tribal tat. It's been a minute since I've seen one in the wild.

    • @korosoid
      @korosoid Год назад

      @@strangerdanger1012 Stoopid! Goyimtracing and pathtracing are still raster.

  • @steveseybolt
    @steveseybolt Год назад +4

    I have the PG 6950 XT and the Pulse 7800 XT. The 6950 XT is better, not by a huge amount, but better nonetheless. The 6950 XT is the 4070 Ti's equal. Very good video! I love this, as I am thinking of going to a 7900 XTX next.

    • @ItzLnX
      @ItzLnX Год назад

      Did you try undervolting and overclocking the 7800xt? Seems to be an easy performance boost without the high power draw of the 6950xt

    • @steveseybolt
      @steveseybolt Год назад

      @@ItzLnX No, I replaced a 6700 XT in my son's PC. It's a banger: runs 1440p Ultra all day with the 5600xt, no problems. My 6950 XT is in my personal PC with a 5800X3D; I do 4K.
      And yes, the 6950 is insane with the power consumption!

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Год назад

      I cannot pass 365 W usage; it's capped there when I overclock. My Time Spy graphics score is 23,800.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi Год назад

      What is the highest wattage you can get up to?

  • @oxyuran5998
    @oxyuran5998 Год назад +10

    I have been running an all-AMD rig for 12 years with one graphics-card swap. This spring I upgraded to a 7800X3D / RX 7900 rig, and I have not had any of these crashy, buggy experiences.
    I had to limit the RAM speed with my X3D chip and do the BIOS update to work around the ASUS problem that existed early on, but that's not AMD graphics... So far it's a great system, no issues, and I'm as happy as I was back when I built the other rig 12 years ago.

  • @kenshirogenjuro873
    @kenshirogenjuro873 Год назад +2

    Cyberpunk for about 3 years now has been the de facto standard for checking out the best possible ray tracing. And that’s useful info in some ways, don’t get me wrong. The issue is it’s not that big of a game and is in no way representative of the library of games most people play…even among ray tracing titles. Sure it’s much more than a mere tech demo, it’s a fully fleshed-out game some people quite enjoy. But almost no games implement ray tracing to nearly the same extent, and across most ray-tracing games on average the AMD cards don’t really trail by nearly as much as the Cyberpunk results alone might make people think.
    But even closer to the point, it’s interesting that two of the most popular and widely-played recent games, Starfield and COD MW2, perform markedly better on AMD cards, even well beyond their already superior performance:cost average across all games, and it’s strange how unnoticed this tends to go. A LOT of people play these games. For many they are the impetus for buying/building a new gaming PC, the same way such games literally sell consoles. I myself built TWO PCs in anticipation of Starfield. So it’s nice to see the 7900XTX and 6950XT cards I have in them completely demolishing the nearest-priced Nvidia cards in this title. Oh and MW3 just came out. All these COD games use the same engine so the AMD advantage will likely carry over there. That’s THREE, HUGE titles where anyone putting together a gaming PC is doing themselves a titanic disservice not being informed of how badly Nvidia cards perform for the money. At times the degree by which Nvidia overcharges has been so severe you could be looking at nearly DOUBLE the FPS with the closest-priced AMD card.
    This is not to say AMD has some mystical hardware advantage Nvidia lacks. This clearly is NOT the case. Nvidia IS markedly ahead in ray tracing, so there’s nothing invalid about the Cyberpunk comparison. But some engines preferring some architectures is normal for the industry, and how valid is placing so much focus on just one novel game (Cyberpunk) that’s not representative of what most people play or even most ray tracing games while largely ignoring quite possibly the most popular recent games, that AMD cards happen to really excel in? When AMD is already giving the better performance:dollar ratio across all games at large, this is a point I think deserves more discussion.

    • @systemBuilder
      @systemBuilder Год назад +1

      People are so rabid about ray tracing, yet nobody mentions that you practically have to be playing a night-time game in the rain for it to be useful... and most night-time games in the rain frankly aren't that much fun.

  • @ImaITman
    @ImaITman Год назад +9

    I run Nvidia at the moment (a 3080), and Nvidia's drivers have been absolute crap in many games in my time owning it (I bought one when it was a fresh release). AMD's software is WAY more refined. If AMD hits the nail on the head with FSR 3.0 and can actually produce a 4090 competitor, then I'll make that switch.

  • @thatdudeghostyxd
    @thatdudeghostyxd Месяц назад +1

    I have a 6700 XT.
    I really hope the 8000 series has better RT performance so I can stick with team red.

  • @sirreal9691
    @sirreal9691 Год назад +6

    Ray tracing is so overrated; I'm glad you said it. I've been playing Alan Wake 2 on my 7900 XTX, and when I turn on RT the performance tanks. This would happen on Nvidia as well. I see no significant improvement in graphics with RT. To improve my frame rate I turn on FSR; the game is then playable but looks significantly downgraded. I have seen some comparisons and DLSS does seem slightly better looking, but to my eye nothing currently beats native resolution with raster graphics. I've had no major driver issues. I truly cannot see why anyone would prefer RT on with DLSS, even for single-player games. Maybe five years from now, when games can run better versions of RT at high resolutions, I'll look for the best RT card. For now, I am all raster at full resolution.

  • @DrkRydrProductions
    @DrkRydrProductions Год назад +1

    My first pc was equipped with a 1060 6g. From there it has been team Red. Next card was Vega 64 then 5700xt and now the 7900xtx. All cards from sapphire and all are Nitro plus version. They are very well made cards and have given me great gaming experiences for way less money.

  • @AvroBellow
    @AvroBellow Год назад +3

    I think that AMD must have fixed the idle power issue because my RX 7900 XTX idles at 5W total board power.

    • @systemBuilder
      @systemBuilder Год назад

      Yes, the fix was about 1-2 weeks before you posted your message ...

  • @gytis156
    @gytis156 Год назад

    That backlight not turning off is so annoying; it has been happening for me with an RTX 3060 Ti ever since I got it.

  • @mattcanich
    @mattcanich Год назад +6

    7800xt installed 2 weeks ago. First AMD card in a decade. Couldn't be more pleased

  • @matilija
    @matilija Год назад +1

    One more reason to choose green over red (very minor, but potentially a big deal for a home theatre enthusiast running something like a Plex server) is that Dolby Atmos home theatre doesn't currently work on AMD cards, and hasn't for over a year of driver updates now.

    • @South_0f_Heaven_
      @South_0f_Heaven_ Год назад

      Don’t tell this to the rabid Radeon crowd as the excuses will start flowing.

    • @matilija
      @matilija Год назад

      @@South_0f_Heaven_ I don't personally care about people's excuses; it just doesn't make any sense that they haven't fixed it. I mean, Atmos works on the consoles with Radeon GPUs... why not on PC???

    • @JackJohnson-br4qr
      @JackJohnson-br4qr Год назад +1

      What do you mean exactly? I have an RX 6800 XT and a Denon AVR-X1700H, and Dolby Atmos works fine. There was a problem with VRR 2 years ago that I didn't have with an RTX 3060, but that has been fixed for a year now.

    • @matilija
      @matilija Год назад

      @@JackJohnson-br4qr Are you sure? Have you double checked that your sound settings on your PC are actually set to Atmos and not just 7.1?

    • @JackJohnson-br4qr
      @JackJohnson-br4qr Год назад +1

      @@matilija Yes I'm totally sure. If the receiver displays Dolby Atmos it means it's receiving a Dolby Atmos signal just right.

  • @zargisan9017
    @zargisan9017 Год назад +6

    I’ve always used Nvidia. Upgraded my pc using all amd, 7800X3D with the 7900xt Taichi and I couldn’t be happier. Performance is absolutely impeccable. Love it

    • @evacody1249
      @evacody1249 Год назад

      I have an AMD setup, a 6700 XT and a 3700X. I enjoy my setup.

    • @ArtisChronicles
      @ArtisChronicles Год назад

      ​@@evacody1249 almost the same setup, except I'm using the 5700x

  • @HardwareAccent
    @HardwareAccent Год назад +1

    Very good video. Thank you.

  • @furynotes
    @furynotes Год назад +2

    Nvidia is still faster in Blender. The thing holding the mid-range Nvidia cards back in that price range is the small 192-bit bus, though they make great 1080p cards, even with path tracing. There isn't much I have FOMO over with AMD cards; for what I do, AMD still doesn't have enough support. The AMD feature I was excited about is useless now, unless maybe for single player: no Anti-Lag+ any more, and that was the only reason I would have considered AMD for multiplayer games. Frame gen is single player too, unless it gets implemented in multiplayer games; why not? It should be "you want it, you got it", whatever the brand.

    On ray tracing: I run it on current mid-range cards. I have a 4070 and still run ray tracing, but at 1080p, and still get good frames. Do you want Blu-ray quality or 4K Ultra HD Blu-ray? That's the way I see it. The argument for 1440p is questionable, especially for entry-level cards between $200 and $500. I would barely consider the 7800 XT a mid-range card, and the same goes for both 4070 cards. I would consider an RTX 4080 a solid 1440p card, same with the 7900 XT; both are entry-level 4K cards at this point. As always, even the highest-end cards are really the 1440p kings and merely decent 4K cards; otherwise you are upscaling all the time. Pretending the 7800 XT, or the 4070 and 4070 Ti, are amazing 1440p cards is a stretch. I saw one guy call Alan Wake a bloated game at roughly 1440p with no upscaling on a 7900 XT, max settings, no ray tracing, because he demanded 120 fps in it.

    I would agree that AMD still has a better color range than Nvidia. Ray tracing is playable with the newest cards paired with the newest games, at lower-than-expected resolutions and with a fair amount of upscaling; it has always been that way since ray tracing hit the mainstream with the 20-series Nvidia cards. People have weird expectations of GPUs in general.

    Is AMD a better value than Nvidia? Yes and no, in my opinion. I would still use Nvidia because it is better with heavier ray tracing and still better in Blender, though again the bus width really dictates what resolution those mid-range GPUs are good for. You have to be more aggressive with DLSS at 1440p on both 4070s, simply because of the bus width. With Performance DLSS I can get similar results at 1440p on a regular RTX 4070 compared to a 7800 XT at the same medium RT settings, with one big exception: I needed frame generation to match, and Ultra Performance DLSS with RT Overdrive to hold a minimum above 50 fps at 1440p. Conclusion: on the cheaper 4070 you have to use Performance or Ultra Performance plus frame gen to reach parity with the 7800 XT. The weakness, I believe, is the lower bus width of the Nvidia GPUs, and that goes for anything under an RTX 4080. But I can still do ray tracing at playable frame rates, even on the latest mid-range Nvidia GPU. So I would argue you are right to some extent, but also wrong to some extent; it is a give and take.

    Do I regret getting an RTX 4070 with the regular PCIe power connector? No. Do I want more? Yes. But even the 24 GB on the 3090 in my workstation isn't enough for what I do in Blender 3.6 and above. Would I buy an AMD card, or even an Intel one? They're not quite there yet for me to give up Nvidia, especially for ray tracing, whether frame-by-frame in Blender or real-time in a game or other applications that support it. This comment is so long, sorry. I doubt anybody will read it.

    • @tommeegunn9318
      @tommeegunn9318 Год назад +1

      I did read it all. Another perspective is always good.

    • @user-jd3pk1bz8e
      @user-jd3pk1bz8e Год назад +1

      True, the real advantage with Nvidia is 3D work.

  • @astrea555
    @astrea555 Год назад

    Wait, wtf is up with the raised blacks on the Nvidia Cyberpunk footage? Wrong gamma?

  • @albal156
    @albal156 Год назад +4

    Such a great video! I love these types of deep dives into the pros and cons of each manufacturer. I recently switched to a 7800XT myself and have been having a good experience so far as well.

  • @sniper1251
    @sniper1251 29 дней назад

    17:20 Didn't you set your screen to 60? Because it takes the refresh rate of the screen when in windowed mode.

  • @daviddesrosiers1946
    @daviddesrosiers1946 Год назад +4

    When I was building my rig, Intel was just getting on the scene, and I bought the Acer A770 out of curiosity, but I was still intent on a Red Devil 7900 XTX. So I got two GPUs, one of them a lovely curiosity piece, and still saved $200+ by not buying a 4090. Interestingly, it looks like AMD handles HDR better than Nvidia.

  • @Oceanborn712
    @Oceanborn712 Год назад

    Randomly got here by a RUclips video suggestion. Really nice video and honestly, I'm happy for everyone pointing out that you don't have to have Popular Brand XZY just because there's some horror stories out there. You got my sub and I'm looking forward to followup videos. Also you have a very pleasant voice to listen to!
    I personally switched to AMD in the middle of last year when they discounted the 6000-series cards. I got a 6900 XT back then and now also have a 7900 XTX in my main system (plus an Arc A770 LE and an RTX 3090 in my guest PC and HTPC respectively; my hardware generally plays merry-go-round across my systems as I upgrade, until I really have no use for it anymore), and I couldn't be happier with them. The 6900 XT initially had a lot of crashes, but that wasn't the driver or anything; it was a genuinely defective card that Sapphire replaced within a few weeks. That is pretty much the only negative thing that has happened to me so far on the modern cards. The last AMD card I had before these was an RX 560, and I got it instead of a theoretically stronger GTX 1050 Ti, which was giving me graphics issues due to engine problems with Pascal cards in Ragnarok Online (still my daily routine game to this day). So technically, I've been better off with AMD than Nvidia for a long time at this point.

  • @Tryptone
    @Tryptone Год назад +4

    I grabbed a lightly used 6800 XT earlier this year for $575 CAD. Until then I was considering a 3080 from a friend, used for mining, but he wouldn't budge from $1000. AMD hasn't let me down once.

  • @XeeDooX
    @XeeDooX Год назад

    7:55 how did you find this stock footage of Bulgarian money haha :D

  • @jesuschristislord77733
    @jesuschristislord77733 Год назад +3

    Sapphire Nitro 7900 xtx 👍

  • @leyterispap6775
    @leyterispap6775 Год назад

    Bravo, especially for the different saturation the two companies have, which is hard to spot. I noticed it when I went from a 1660 Ti to a 6600 XT, but you need to pay attention to notice it.

  • @rolandbruceguzmanmamani663
    @rolandbruceguzmanmamani663 Год назад

    This was extremely detailed, thanks! New sub

  • @coin777
    @coin777 Год назад +1

    How is your idle power draw with multiple monitor setup?

  • @secondsystems3479
    @secondsystems3479 Год назад

    Yo, the chopstick method! THANK YOU!!

  • @n1ckstars
    @n1ckstars 7 месяцев назад

    The fullscreen fix for the FPS cap also applies for me in Overwatch: it caps at your monitor's refresh rate even with VSync and the frame rate cap disabled, which is extremely weird.

  • @shadowrealms2676
    @shadowrealms2676 Год назад +2

    I think the difference in color is because you have to enable full RGB.

    • @furynotes
      @furynotes Год назад

      I tried full RGB. You're half right; there is still more range with the AMD cards.
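
      (Side note on the full-vs-limited RGB point in this thread: the usual culprit is limited video levels, 16-235, versus full PC levels, 0-255, set inconsistently between the GPU driver and the display. A minimal sketch of the standard levels expansion, assuming plain 8-bit values and no extra display calibration:)

      ```python
      def limited_to_full(value: int) -> int:
          """Expand one 8-bit limited-range (16-235) video level to full range (0-255).

          A limited-range signal shown without this expansion has its black at
          level 16 instead of 0 (the classic washed-out, raised-blacks look);
          the opposite mismatch crushes shadows instead.
          """
          clipped = min(max(value, 16), 235)        # keep to legal limited-range values
          return round((clipped - 16) * 255 / 219)  # map 16..235 onto 0..255

      # Limited "black" (16) maps to 0, limited "white" (235) maps to 255.
      print(limited_to_full(16), limited_to_full(235), limited_to_full(128))  # 0 255 130
      ```

      The same mismatch is a plausible explanation for the raised blacks a few people noticed in the Cyberpunk footage; checking that both the GPU control panel and the monitor agree on full or limited range usually settles it.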

  • @saulgoodman5873
    @saulgoodman5873 Год назад

    With your computer not going to sleep, it seems to be a Razer Synapse issue that has never been fixed by Razer and that affects NVIDIA HD Audio. If you disable NVIDIA HD Audio in Device Manager, then after a restart your PC should start going to sleep automatically.
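
    (For anyone who wants to check that theory on their own machine: Windows reports what is blocking sleep via powercfg /requests, and a PnP device can be toggled from PowerShell. A minimal sketch, wrapped in Python purely for illustration; the wildcard device name is an assumption, so confirm the exact name in Device Manager first, and both commands need an elevated prompt:)

    ```python
    import subprocess

    # List the processes and drivers currently holding the system awake.
    print(subprocess.run(["powercfg", "/requests"], capture_output=True, text=True).stdout)

    # If the NVIDIA HD Audio device shows up as the blocker, disable it via the
    # PnP cmdlets. The friendly-name wildcard below is an assumption; adjust it
    # to match what Device Manager shows on your machine.
    subprocess.run([
        "powershell", "-Command",
        'Get-PnpDevice -FriendlyName "*NVIDIA High Definition Audio*" | '
        "Disable-PnpDevice -Confirm:$false",
    ])
    ```

    Re-enabling the device later works the same way with Enable-PnpDevice.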

  • @blvk3
    @blvk3 Год назад +1

    All those RGBs hurt my eyes.
    I am sure they give a massive performance increase in gaming and editing.

  • @mkatakm
    @mkatakm Год назад +2

    After 10 months of experience with my 4080:
    DLSS, when your hardware is subpar.
    Frame gen, when you want to convince yourself that your hardware is not subpar.
    Ray tracing, when you need to turn your hardware into subpar. Also helps you feel you have the superior card.
    Path tracing, when ray tracing is not enough. Brings down even a 4090.

    • @ramsaybolton9151
      @ramsaybolton9151 Год назад +1

      " the future is scary"...you sound like those idiots that used to rag on about PhysX lmao.

    • @cdiff07
      @cdiff07 Год назад

      Sounds like regret for obvious reasons, lol. Why in the hell would you or anyone buy an RTX 4080? When you are paying over a grand, you might as well save a bit more and buy the 4090. Very dumb purchase!

    • @mkatakm
      @mkatakm Год назад

      @@cdiff07 Between $1,200 and $1,600, the difference is $400. If that looks like nothing to you, then you are parroting the same dumb argument that everyone else parrots from each other. That is called collective stupidity.

    • @mikfhan
      @mikfhan 3 месяца назад

      Budget might be a good reason - 90/Titan class cards are expensive (even more so than 80 class) and Nvidia keep promoting the RT stuff on those despite the heavy FPS cost. FSR/DLSS/frame gen is a band-aid "fix" for unoptimized games and WILL reduce visual clarity and control input response, it works best when you can already reach 60 FPS without it. Almost reminds me of "rubber banding" from online games with lag. Maybe in 5 years.
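
      (A rough back-of-envelope take on the "works best when you can already reach 60 FPS" point: with 2x interpolation the presented frame rate roughly doubles, but input is still sampled at the base rate, plus roughly one held frame for the interpolation. The doubling and the one-frame buffering below are illustrative assumptions, not measurements of any particular FSR or DLSS implementation:)

      ```python
      def frame_gen_estimate(base_fps: float) -> tuple[float, float]:
          """Presented FPS with 2x interpolation and a rough input-latency floor,
          assuming one extra base frame of buffering (illustrative only)."""
          base_frame_ms = 1000.0 / base_fps
          presented_fps = base_fps * 2            # one generated frame per real frame
          latency_floor_ms = base_frame_ms * 2    # held frame + interpolated frame
          return presented_fps, latency_floor_ms

      for fps in (30, 60, 120):
          shown, lag = frame_gen_estimate(fps)
          print(f"{fps:>3} FPS base -> ~{shown:.0f} FPS shown, >= ~{lag:.0f} ms input lag")
      ```

      At a 30 FPS base the latency floor is roughly 67 ms, which is why generated frames there feel like the rubber banding described above, while at a 120 FPS base the same doubling costs almost nothing you can feel.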

  • @diablosv36
    @diablosv36 Год назад +1

    Did you have full RGB set in Adrenalin? Oh, and I think you're the first to ever find a difference in FSR between AMD and Nvidia. I would say that's because no one ever bothered to check, and most comparisons are done between DLSS and FSR on an Nvidia GPU.

  • @yakovbl2968
    @yakovbl2968 4 месяца назад

    Got an RX 7900 XT about a year ago after using an RTX 2060 for 4 years. The only problems I've ever had were due to one bad driver update, and they were easily solved by the next update.

  • @babochee
    @babochee 11 месяцев назад

    The latest Nvidia driver update killed the HDMI port on my 4090. I had to roll back to the prior driver to get it working again. People can't tell me Nvidia's drivers are flawless.

  • @alonsogabriel9336
    @alonsogabriel9336 Год назад +1

    I'm personally saving for a 7800 XT myself, and I'm planning to keep it as my next 7-year GPU, coming from a GTX 1070, a GTX 750 Ti before that, a GTX 650 before that, and a GT 610 before that.
    Basically my first AMD anything.
    As someone who can't really see the point or the visual difference of ray tracing, the rasterized performance per dollar of the 7800 XT is probably the best there is.

    • @KamleshMallick
      @KamleshMallick Год назад

      Same! I have had a GTX 980 with 4 GB of VRAM since 2014, and I will move to a 7800 XT with 16 GB of VRAM in March 2024. Nvidia is apparently launching the 4070 Super in January, but AMD just offers superior value IMHO. What I understand is that ray tracing saves a lot of time in development, so many games will be using it in the future; that is where our AMD cards will be tested.

  • @PTQ4Q4Q4Q4
    @PTQ4Q4Q4Q4 Год назад +1

    AMD has never been an issue for me; I have never understood people who always went green. The 6700 XT is great value.

  • @balinttoth2339
    @balinttoth2339 Год назад +1

    RX 6800 here, absolutely the best GPU I've ever had. Runs any game 1440p max settings (without RT) well over 60 fps. If you are like me, and don't care about computationally unfeasible, artificially hyped-up features, and don't want to spend the price of an entire PC on a single component, AMD is the way to go. Currently, I'm not planning to ever go back to Nvidia.

  • @BleepingWorld
    @BleepingWorld Год назад

    Which card was better for editing and rendering videos?

  • @garyrichards6079
    @garyrichards6079 8 месяцев назад

    You've forgotten Nvidia's biggest new bug! The 12VHPWR connector...

  • @christianblanco2596
    @christianblanco2596 8 дней назад

    With a dual-monitor setup and a high-refresh main monitor, changing it from 144 Hz to 120 Hz will drop idle power usage from 78 W down to ~40 W.

  • @JamesJones-zt2yx
    @JamesJones-zt2yx Год назад +1

    Linux user here. I'm not a gamer, and until late last year I hadn't bought a graphics card in seven years; it was a 2GB GTX-960. That's the last Nvidia card I'll ever buy. I'm happy with the RX 6400 I use now. (Like I said, I'm no gamer.)

  • @buttergolem8584
    @buttergolem8584 Год назад +1

    Any recommendations for a very good Freesync monitor? My old one from 2016 only supports G-Sync.