Is Radeon REALLY better for old PCs? | Driver Overhead

  • Published: 22 Aug 2024

Comments • 534

  • @IcebergTech
    @IcebergTech  Год назад +200

    CLARIFICATION:
    I did want to go back and test the RTX 3070 with FSR 2 instead of DLSS, but I ran out of time with the Ryzen 7 1700. There is little to no performance difference between the two upscalers, with DLSS’s advantage mainly apparent in image quality, but ideally I’d like to have maintained consistency. Maybe I’ll revisit the topic with an Intel CPU in the future, and do it right next time!

    • @minhnguyennhat9556
      @minhnguyennhat9556 Год назад +2

      Thanks for your content!

    • @lordazyks3134
      @lordazyks3134 Год назад +2

      That wasn't a problem for me; I mean, you let each card perform at its best!

    • @mircomputers
      @mircomputers Год назад +2

      This test is so dumb; in no market is the 6700 XT the same price as the 3070. Should have got a 3060 12GB for comparison. And the fact that you even add a new CPU, for a grand total of double the upgrade cost on the Nvidia side, is just added injury 😂

    • @edgy843
      @edgy843 Год назад +1

      That's needless fear. It's better to just use FSR on both cards regardless. It's been a while since I've watched a low-sub YouTube video with that much fairness on all accounts. Cheers!

    • @edgy843
      @edgy843 Год назад

      ​ @Mir Computers Unfortunately, it was in my country, or at least at my nearest retailer. In fact, the 6700 XT was more expensive than the 3070. Dunno how that happened. The prices have probably changed by now, but those are the prices I saw when I built my PC last January.

  • @dorktales254
    @dorktales254 Год назад +681

    Driver overhead aside, can we at least commend AMD for having fully open-source drivers on Linux? They're rock solid too, almost never crashing.

    • @roccociccone597
      @roccociccone597 Год назад +32

      yep. I hope Intel gets there too. At least we can pick...

    • @Arsoonist
      @Arsoonist Год назад +23

      ​@@roccociccone597 I found the graphics driver for newer Nvidia cards (40xx series) to be "good enough". It's still not as seamless as AMD's drivers, but at least you won't be limited to Windows if you buy Nvidia anymore.

    • @roccociccone597
      @roccociccone597 Год назад +18

      ​@@Arsoonist Your experience with Nvidia GPUs is heavily tied to what distribution you use.
      And besides, even now, the experience on X is atrocious and lags as soon as you have an intensive application running. Wayland is hit or miss, where you don't have hardware acceleration for browsers and Electron apps. I use Wayland pretty much exclusively, except for when I deal with Nvidia.
      Finally, don't even start with hybrid laptops. They're probably the worst thing after Windows Updates.
      So they're on the "barely works" tier.

    • @Arsoonist
      @Arsoonist Год назад +5

      @@roccociccone597 Yeah, I tried to use a Wayland/Hyprland distro and Nvidia crashed every time. Tumbleweed was much, much better though.

    • @GewelReal
      @GewelReal Год назад

      Why are nerds all over Linux? Just pirate Windows

  • @hardcorehardware361
    @hardcorehardware361 Год назад +233

    This just shows how good the 6700 is for the money, great video mate I enjoyed it.

    • @white_mage
      @white_mage Год назад +8

      is this bait?

    • @GewelReal
      @GewelReal Год назад +6

      @@white_mage yes

    • @mircomputers
      @mircomputers Год назад +31

      He used a 3070 for comparison, which costs 50% more, just to show something positive on the Nvidia side when switching CPUs.

    • @faatihfattahillah4939
      @faatihfattahillah4939 Год назад +3

      this is how good amd is ☕

    • @PostWin
      @PostWin Год назад +7

      @@mircomputers Didn't seem positive to me when the 6700 beat it in some titles. Felt like even more of a smackdown for the more expensive card.

  • @DMS_134
    @DMS_134 Год назад +134

    The TLOU performance the 6700 is seeing is likely because of the bug that got fixed in 23.5.1; it happened all the time if you let the shaders compile and jumped into the game without restarting it. The VRAM and RAM usage that ballooned during compilation remained there during gameplay, messing up the performance.

    • @mircomputers
      @mircomputers Год назад +12

      Man, the 6700 costs 50% less than the 3070. Even with these issues it's still performing about right 😂

    • @koolin3613
      @koolin3613 Год назад +5

      ​@@mircomputers Did you watch the rest of the video? With the CPU upgrade installed, the RX 6700XT almost matches the RTX 3070, and if the price really is 50% less than the 3070, I would rather have the 6700XT.

    • @mircomputers
      @mircomputers Год назад +3

      @@koolin3613 It is a 6700 10GB in the video, not the XT.

    • @mircomputers
      @mircomputers Год назад +3

      @@koolin3613 TLOU is the only game where the AMD card's performance is worse with the 1700.

    • @koolin3613
      @koolin3613 Год назад +2

      @@mircomputers Yeah, and as the guy said above, that was apparently a bug that has since been fixed. It must have been a bug, because it's weird behaviour; how did it suddenly catch up with the 5600X installed? I wonder how it behaves with the patch installed.

  • @Rabbit_AF
    @Rabbit_AF Год назад +80

    Recently switched my son's PC from a RTX 3050 to a RX 6600 XT. He has an E5-2667V2, and I didn't think it would do much for him performance wise. I asked him how Starship Troopers: extraction played and he said it ran smooth. I ran it on an E3 1270 with a GTX 1070 ti and I thought it was a stuttery mess constantly dipping into the teens when bug hordes appeared. I'll have to watch him next time he plays to see if he just doesn't notice FPS dips or something.
    All I know, is that I am glad I upgraded from that e3 1270 Xeon to a 5700g with a RX 6950 XT. I can now brute force unoptimized games.

    • @kkrolik2106
      @kkrolik2106 Год назад +19

      If you can, return the 5700G and swap it for a 5700X or 5800X. The G has an iGPU but only 16MB of L3 and only PCIe 3.0; the X has 32MB of L3 and PCIe 4.0. This makes a difference in games.

    • @Rabbit_AF
      @Rabbit_AF Год назад +8

      @@kkrolik2106 The plan is/was to switch the 5700G out for a 5800X. I bought the 5700G used for a good deal and was planning to move it to HTPC duty eventually. I have been pleasantly surprised by its performance. On the CPU-Z benchmark I *can* get it to beat the 5800X reference, with a single-thread score of 652 and a multi-thread (16T) score of 6991, but that was just benchmarking. Usually the single-core score is in the low 630s. My R23 score at my normal OC frequency is 15112. Also, since it is 65W, I have a Wraith-like 4-heatpipe cooler that never lets the CPU get over 62°C. Finally, I'm running an X370 board, so I can't get Gen 4 anyway.
      However, you are right about missing out on cache, and I think that is ultimately what is going to push me to do a CPU and motherboard upgrade eventually. I got the 6950 XT because it was available and I didn't have to pay a lot for it, since I had gift cards from work. I was planning on getting a 6700/6800. I kinda wish AMD made a non-APU monolithic CPU with more cache instead of the iGPU.

    • @SvrM_
      @SvrM_ Год назад +1

      @@Rabbit_AF I went with a 12700kf and 6950xt and I’ve been loving it

    • @SaxaphoneMan42
      @SaxaphoneMan42 Год назад +2

      old Xeon and 1070ti to Zen 3 architecture with double the core count and top of the line RDNA2? That upgrade has gotta feel incredible, the games that used to cripple your system are probably now hardly warming up your computer. I like the idea of using the 5700G and that board in a separate system and upgrading for pcie gen 4 and more cache on the CPU, maybe the X3D will be reasonable enough price by then? If so, that will be a pretty noticeable bump in performance.

    • @MaxIronsThird
      @MaxIronsThird Год назад +5

      The 6600 XT is 50% faster than the 3050, so of course it would make a big difference, even with an old "i9"-class Ivy Bridge.
      Also, the E5-2667 v2 is several times stronger than the E3-1270; that's why it was running like shit on your PC.

  • @Prehistoric_Nerd
    @Prehistoric_Nerd Год назад +19

    Considering the price difference between the two cards it was surprising to see how well the RX 6700 held out even with the 5600x.

  • @joekoch8485
    @joekoch8485 Год назад +43

    It's really interesting to see how resource-intensive these games are and how that can be leveraged by graphics cards. Great video, good sir!

    • @roveradventures
      @roveradventures Год назад +1

      My 2080ti was using almost 5gb of vram running crysis 1 disc version at 1600x1200 🤣.

  • @AlexandruJalea
    @AlexandruJalea Год назад +24

    Ha, not just gaming. 2023 is a problem generally 😅

  • @ctsd623
    @ctsd623 Год назад +64

    I've read that the reason there's such a disparity is that AMD uses hardware schedulers on the GPU itself, whereas Nvidia relies on software scheduling on the CPU.

    • @alfonso5177
      @alfonso5177 Год назад +1

      It doesn't.

    • @CaptainScorpio24
      @CaptainScorpio24 Год назад +1

      right on

    • @einarabelc5
      @einarabelc5 Год назад +4

      I used to hear that a long time ago. I didn't believe it hasn't changed since then when you said it, wow

    • @sategllib2191
      @sategllib2191 Год назад +1

      It hasn't for a while. They are about to release hardware scheduling.

  • @AshtonCoolman
    @AshtonCoolman Год назад +20

    This is a great test! I've been preaching this about Nvidia driver overhead for years and years. AMD GPUs have had a hardware command processor since the ATI R600(HD 2900XT) era. There are folks on AM4 that can double their GPU performance by just upgrading to a mere 5600X. Anyone with an older Intel or AMD CPU that recently upgraded their GPU needs to see this desperately.

  • @Psychx_
    @Psychx_ Год назад +26

    The devs just skip optimizing new releases: thanks to the reduced development time, the publishers save money and release an even more unfinished game early, people buy new GPUs, and the economy is happy.
    The PC gaming industry is rotten and disgusting atm, and having all the increases in hardware performance eaten up by greed and laziness is frustrating.

    • @einarabelc5
      @einarabelc5 Год назад

      That's human ungratefulness for you. It's in every industry right now, it's gotten worse since the 60s, and it's going to get worse.

    • @goat_gaming1158
      @goat_gaming1158 Год назад

      I couldn't agree more, it's the most annoying ducking thing in the PC world rn.

    • @goat_gaming1158
      @goat_gaming1158 Год назад +7

      @@einarabelc5 What are you talking about, ungratefulness? You're goddamn right we're ungrateful; they are just pushing out these games with no optimization to save money and time, and this is not every industry.

  • @florider_hd
    @florider_hd Год назад +9

    I put a 1660 Super in my 2012 rig. It's been flawless so far, fitting right in at 500 watts. I know there are better options, but for 120€ it was a great purchase!

    • @robotsix6268
      @robotsix6268 Год назад +6

      A 1660S for 120 is an insane deal even today. Congrats on your purchase and keep gaming.

  • @sL1NK
    @sL1NK Год назад +4

    Been watching you (and also subbed) for a while. Honestly, your channel's content value is miles higher than creators like JayzTwoCents' nowadays. I truly appreciate your work and really hope your channel blows up asap.

  • @steveo4067
    @steveo4067 Год назад +9

    Thanks for making this video. This all makes a ton of sense. A buddy of mine wants to build a new PC, and we are thinking of putting a 12100F or a 13100F in it with a Radeon RX 5500 XT for an ultra-decent-ish budget build. This just kind of solidified that I am going with the RX 5500 XT over the Nvidia equivalent I was going to go for. Thanks for the awesome work on these videos; the production quality is top notch for such a small channel.

    • @unknownname6519
      @unknownname6519 Год назад +4

      My 12100 goes well with a rx6650xt.. Maybe a Text worth if Budget is ok

    • @zbrkesbris5987
      @zbrkesbris5987 Год назад +4

      Better go with RX 6600 (non-XT) or RX 6600M instead of 5500XT. The latter is a mobile chip on a PCI-E card that can be found on Ali - it runs almost identically to 6600 but is usually cheaper.

    • @dangerous8333
      @dangerous8333 Год назад

      Maybe you should ask your buddy if he wants to do more than just gaming with his GPU before you make that decision.

    • @robotsix6268
      @robotsix6268 Год назад

      The 7600 might be a better option

    • @zbrkesbris5987
      @zbrkesbris5987 Год назад +2

      @@robotsix6268 but also considerably more expensive

  • @airwolf1337
    @airwolf1337 Год назад +7

    I upgraded my Ryzen 7 1700 to a Ryzen 7 5800X3D on my Asus X370 mainboard. It's really a big difference.

    • @ismaelsoto9507
      @ismaelsoto9507 Год назад +3

      Gotta feel good when you have an old 5-year-old system and you can just slap in a new CPU that competes with the latest and greatest; you could never do that on Intel's platform! (It's technically possible to use modified 8th, 9th or 10th gen laptop CPUs, but it's nowhere near the same performance boost, besides needing a lot of troubleshooting and know-how to make it work...)

    • @NamTran-xc2ip
      @NamTran-xc2ip Год назад

      @@ismaelsoto9507 The only thing is, when I built my 8700K + 1080 Ti system in 2017, the Ryzen equivalent was the 2600, which sadly isn't as powerful.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      @@ismaelsoto9507 The last time that sort of worked on Intel was with LGA775. I have a board that supports a 3 GHz single core Pentium 4, and a 3.2 GHz Core 2 Quad. And in the right tasks performance difference is much more than 4x. Cinebench as a semi real world test (it does simulate rendering) does about 2.6x in single and 8.9x in multi, that is almost a 9-times increase just by having a different CPU on the same platform with nothing else changed.

  • @reinhardtwilhelm5415
    @reinhardtwilhelm5415 Год назад +5

    This is even more impressive when you remember that the 3070 is ordinarily ~20% faster than the 6700.

  • @Antilli
    @Antilli Год назад +4

    Great video... I remember the time when nVidia had lower driver overhead than AMD cards... It was all over the internet. Now that it's the opposite, what the internet says is not really that AMD has less overhead, but rather that nVidia has better scaling at higher resolutions...
    The nVidia hive mind is annoying.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq Год назад

      Incorrect. Nvidia didn't have lower overhead, the games had lower overhead, and Nvidia was using the slack. Then this became a problem in newer games that became CPU limited. 4 core CPUs took a long time to max out, but when they did it became a problem. Ryzen 1 shows this problem not because of cores, but lower IPC. Modern software still has a heavy reliance on IPC, Intel mostly bottlenecked itself by sticking to 4 cores for too long. 6 cores wouldn't have been such a problem, which is why Ryzen 1 was seen to be better at the time despite lower performance. Games right now don't scale past 8 cores, so it's all about IPC again.

    • @Antilli
      @Antilli Год назад

      @@JohnDoe-ip3oq The argument that it wasn't nVidia that had lower overhead, but games had, makes no sense. Both companies always used drivers, which is what causes overhead. It simply depends on how much CPU is being used by which company. In the past, AMD drivers used more CPU than nVidia, and now it is the opposite. That is why you see the bottleneck sooner on nVidia cards now.
      Your argument about IPC and cores is correct, but doesn't change anything about overhead.

    • @yancgc5098
      @yancgc5098 Месяц назад +1

      @@Antilli AMD drivers never used more CPU than Nvidia's. Nvidia's drivers were better back then because they spread the workload across multiple cores, while AMD's didn't, and since most games were single-threaded, AMD GPUs reached a CPU bottleneck faster than Nvidia GPUs.

  • @TheUnexpected6
    @TheUnexpected6 Год назад +2

    Absolutely love your documentary style of videos. Very well researched, and the structure of your script is super easy to follow and to the point. Subbed.

  • @allergictobs9751
    @allergictobs9751 Год назад +2

    AMD is so fking underrated, open source drivers on linux, Freesync, FSR, RSR, fking love them.

  • @Totto70770
    @Totto70770 Год назад +7

    A brand new Sapphire RX 6700 XT (good cooler) and a Ryzen 5600 cost almost the same (a bit more) as a cheap RTX 3070 (bad cooler). You get more VRAM on the AMD card, and the XT model is more competitive with the 3070 (compared to the non-XT model in the video).

    • @alfonso5177
      @alfonso5177 7 месяцев назад

      Also you get lower power usage and better future-proofing.

  • @scurbdubdub2555
    @scurbdubdub2555 Год назад +26

    Driver overhead was something I suffered from when I still had an i9 7980XE. The CPU has plenty of power, but the RTX 4080 I was using wouldn't reach full usage in some titles (like Spider-Man). The CPU wasn't getting utilized to 100% either; it had plenty of performance left, it just wouldn't use the rest of it. When moving to the 5955WX, that issue went away and now most games are fine. It's still there a bit in Spider-Man, but it is nowhere near as bad now. I'm likely going to be selling my 4080 and getting a 7900 XTX, as having to deal with driver overhead issues at all is just really annoying. Along with the card just being what I wanted originally, but MC didn't have it in stock at the time I went there to look for that GPU.

    • @Greez1337
      @Greez1337 Год назад +7

      To be frank, might be a memory latency bottleneck there, amigo. Quad channel at 3200mts and C14 might be the max you can hit. The nature of chiplets, amigo.

    • @Rhedox1
      @Rhedox1 Год назад +20

      > The CPU wasn’t getting utilized to 100%
      There's pretty much no game that does multi threading well enough to 100% saturate a semi-modern CPU. It's almost always a single thread that acts as the bottleneck.

    • @mikerandall273
      @mikerandall273 Год назад +11

      Dude you are beyond mistaken. In what universe would games ever utilize 18 cores. The weak single core performance was holding you back 🤣

    • @snaj9989
      @snaj9989 Год назад

      I have the same problem with my laptop featuring a Ryzen 5600H and an RTX 3060. No high utilization on either, yet games suffer.

    • @scurbdubdub2555
      @scurbdubdub2555 Год назад +1

      @@Rhedox1 Yeah, I can see that at this point. Most games really won’t take advantage of that many cores. Usually the most usage I can expect in a lot of games is 30% if it is CPU intensive. Only game that can actually fully utilize it is Minecraft Bedrock Edition on max render distance. Not like this is for gaming mainly anyways, it is more for my After Effects/Blender stuff with some gaming on the side.

  • @InternationalLiaison
    @InternationalLiaison Год назад +4

    For all the i7s from 7th gen and earlier, I started using the RX 6600, RX 6600 XT, and RX 6650 XT. They show far higher frame rates than the Nvidia RTX 2060, RTX 2060 Super, RTX 3050, RTX 3060, and RTX 3060 Ti.
    In many instances, on 1st through 3rd generation i7s with PCIe Gen 2 x16 slots, Nvidia GPUs with a Gen 3 interface tend to crash, whereas AMD Gen 3 cards do not.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      Wait, I got a PCIe3.0 Geforce on a PCIe1.1 board without any issues? Is there any incompatibility on more recent cards?

    • @InternationalLiaison
      @InternationalLiaison Год назад

      @@HappyBeezerStudios depends on the motherboard and system bios. HP and Dell tend to be the worst offenders with generational compatibility

    • @InternationalLiaison
      @InternationalLiaison Год назад

      @@HappyBeezerStudios Also, AMD cards don't have locked security kernels like Nvidia's do. So AMD GPUs have a broader range of use across a wider spectrum of motherboards and operating systems, whether Linux, macOS or Windows based.

  • @SilverforceX
    @SilverforceX Год назад +4

    Interesting results. Normally the 6700 (not XT) would be about 20% slower than 3070.

  • @BUBERMAIL
    @BUBERMAIL Год назад

    I liked the topic and the delivery, awesome video. Keep up the good work

  • @Derpynewb
    @Derpynewb Год назад +1

    I'd like to know about driver overhead on newer systems. Can you run some tests with the 5600 in cpu bound scenarios? Although uncommon there are some games that are not threaded so overhead is important.
    Also if someone is chasing the highest fps on some kinda budget. You don't have the games everyone plays so silly settings may not be so silly.
    Although most people probably don't fall under that use case.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      I'd like to see the difference with older games. There are quite a bunch that didn't get any fancy DX12 or Vulkan updates and would scale similarly to how they did back then.

  • @get-in-2-get-out774
    @get-in-2-get-out774 Год назад +2

    Having recently tested RE4 with an i5 6500 & Vega 64 on high settings and seeing around 100 FPS at times,
    I was just amazed. Those 4 cores sure are working fine with AMD drivers! 😂

  • @sehvekah7368
    @sehvekah7368 Год назад

    I just wanted to say thanks for this video, as it confirmed a suspicion I had with my latest upgrade, which left my rig very similar to one of your test machines (R5 1600, B350 Tomahawk, 32GB 3200 RAM and an RX 6750 XT).
    My system ended up in this state due to medical issues eating nearly the last two years of my life, and when I *finally* felt up to playing Cyberpunk again, the game no longer ran on my old card (an R7 360), so that's what I chose to focus my first upgrade on. It's worked out pretty well, especially as a nasty crash/corruption led to a Windows re-install and a big push towards making Linux my primary OS.
    Still, it's great to have confirmation that the Ryzen 5000 series chips I'm looking at for the next upgrade will be a solid match for the rest of my system, and it's *fantastic* that there are folks like you who will test out these various "edge case" scenarios to get to the hard data.

  • @xzerokillx
    @xzerokillx 11 месяцев назад +1

    This actually worked great. I have a PC for my kids with an i7 875K; it had a GTX 1080 in it, so I swapped it for an RX 580 8GB and games went from barely playable to running smoothly at 60+ FPS.

  • @hofnaerrchen
    @hofnaerrchen Год назад +1

    Darth Vader never runs! ^^ Apart from that, you might want to compare power consumption too. It would really be interesting to also see those numbers. A "simple" plug-in meter at the wall will do the job for total system consumption.

  • @ecchichanf
    @ecchichanf Год назад +3

    Thanks for the video :3
    Another advantage of the 5600X is that you can activate Resizable BAR.
    In some games you get a 10% boost in FPS.
    My friend has a 2700X with an AMD 6900 XT, and I sometimes get better framerates with my rig (i7-6950X/Vega 64).
    We both play at 1440p.

    • @b0ne91
      @b0ne91 Год назад

      Your CPU is just better than theirs, so it'll give you an advantage in CPU-limited titles, especially esports. If they bothered to overclock their CPU and RAM properly, you'd likely see the gap close to near zero.

    • @grygas8048
      @grygas8048 Год назад +1

      ReBAR works with 1st-gen Ryzen on my B350 board.

    • @Sylarito
      @Sylarito Год назад

      You can mod in Resizable BAR support for X99 too. Haven't tried it myself, as my current 980 Ti doesn't support Resizable BAR anyway.

    • @ecchichanf
      @ecchichanf Год назад

      @@Sylarito I activated ReBAR on my X99 China board. After activation I can't get into the BIOS anymore, because I only get a black screen. Still, the system boots and ReBAR is activated/works.
      @Grygas This is a good point :3

  • @Vixen1525
    @Vixen1525 Год назад +2

    I have a suggestion for you: test an AMD FX-8350 and an AMD FX-9590, compare them to the old Intel chips you have, and see in which ways they are still "good" nowadays. Beware that the 9590 needs an AIO and not a CPU fan; even a Dark Rock Pro cannot cool it :)

  • @HappyBeezerStudios
    @HappyBeezerStudios Год назад

    It's interesting how the perception of driver overhead has changed. I remember when Mantle was a big thing back in 2014, and AMD released Mantle to work on exactly that; back then it was the Radeon driver that had the massive overhead in DX9/10/11 games.
    For today's DX11/12 games the situation seems to have changed. The tests show that in some of today's games, with today's GPUs on today's drivers, it's the Nvidia driver that does worse. Or DLSS just hits harder than FSR, and upscaling wasn't a thing back then.
    But those older games are still played, so it would be worth seeing whether they benefit from the more recent drivers.
    And I'm pretty much exactly in that position: CPU limited on an older platform, but the game is from 2013 and got ported from DX9 to DX11 back in 2017. So no low-level API or renderer updates, and pretty much the same case as back when Nvidia had the smaller overhead.

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад +1

      Older games don't have issues on the CPU side; they already run at 200+ FPS.

  • @Axeiaa
    @Axeiaa Год назад +4

    This video would have been so much better with frametime graphs side by side and perhaps some more focus on the 1% lows. The 1% lows are shown but not highlighted.
    For example, in Spider-Man as you ran it: AMD's 53.3 fps 1% lows vs Nvidia's 36.4 fps. If you played on a 60 Hz monitor and capped performance at 60 fps, AMD would play noticeably smoother. By focusing on the average it's more of a "who cares, it's 85 vs 100.8 fps". Ironically, the only place this gets across (unintentionally) is in The Last of Us, where the AMD card is struggling hard and showing terrible frame times.
    It's good to see more testing done in this space though. AMD publishes close to no information about their hardware-based scheduler taking load off the CPU, whilst it could be a key selling feature.
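
A minimal sketch (not from the video or the thread above) of how average FPS and "1% low" figures are usually derived from a frametime capture; the frametime values are made up, and some tools define the 1% low slightly differently (e.g. as the 1st-percentile frametime):

```python
def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS) from a list of frametimes in milliseconds."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # "1% lows" here: average FPS over the slowest 1% of frames
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_percent_low = 1000.0 * n / sum(worst[:n])
    return avg_fps, one_percent_low

# 9 smooth frames and 1 long stutter frame: the average barely moves,
# but the 1% low collapses, which is why lows track perceived smoothness.
print(fps_metrics([10.0] * 9 + [40.0]))
```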

  • @Very_Questionable
    @Very_Questionable Год назад +1

    Turning off multi threading makes it slightly faster in gaming, and I think that will be a really cool thing to revisit, honestly.

  • @MortalityUnleashed
    @MortalityUnleashed Год назад +1

    I am running a B350 board, formerly with an R7 1700 and Vega 64. Roughly a year ago, I swapped them both for an R7 5700X and 6800 XT. Running quite well, even on the red-ringed LED cooler that the R7 1700 came with. Hurray for 65W TDP 8-cores.

  • @ronnie3626
    @ronnie3626 Год назад +4

    I have a RX 580 8GB with a 3700x and GTX 1060 6GB and 4790K @4.7GHz in my second pc. I used the 4790K system with the RX 580 before and there was not much difference in GTA5, CPU usage 50-70%. Not sure if it would benefit from the RX 580 nowadays.

    • @berry966
      @berry966 Год назад

      the overhead is a 30 series issue. The rx 580 is mainly faster than the 1060 in vulkan and dx12, so in GTA V the gtx 1060 may be faster than the 580.

  • @Rosaslav
    @Rosaslav Год назад +5

    Thanks for this video; it is definitely useful to have more than one source for this particular test, rather than just Hardware Unboxed alone.
    I am the owner of an AMD Vega 64, which I bought back when FreeSync monitors were yet to be supported by Nvidia GPUs, so I upgraded from my GTX 660 and bought a FreeSync monitor as well, for quite a bit cheaper than the G-Sync alternative. I didn't know about this advantage of AMD GPUs with older CPUs at the time.
    The talking point of this video is likely one of the reasons I am still able to comfortably use my now quite old Intel i5 3570, and I will likely be able to keep it until the end of Win 10 support in 2025, which should be a fitting time for a CPU upgrade.
    Are there games which I consider unplayable with this CPU? Definitely, but they are still a minority in my game library. The majority of games I play still sit comfortably above 60 fps, and many of them still seem to be more GPU than CPU limited.

    • @LukeMaster
      @LukeMaster Год назад

      You use that i5 with a Vega 64? I've been considering a similar setup; how do those two match together? And could you tell me what power supply you have? I'm not sure if my 650W can push a Vega 64.

    • @Rosaslav
      @Rosaslav Год назад +1

      @@LukeMaster Games I found limited by this CPU: the Tomb Raider games, Call of Duty Warzone, Assetto Corsa Competizione with 10+ cars, and Riftbreaker and Frostpunk, both during the lategame.
      I would go with at least an i5 8000 series, so you could move to Win 11 in two years.
      I am using a 550W PSU with an SSD+HDD in my system without any issues.
      The Vega 64 is at about the same performance level as midrange GPUs nowadays (RX 7600).

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      Try some lightweight Windows 11 builds like Windows X-Lite with that CPU; every bit of saved cycles helps.

    • @LukeMaster
      @LukeMaster Год назад

      @@Rosaslav Thanks a lot. I have a similar CPU, and I've been looking for a new GPU to make new games playable; the Vega 64 is definitely one of the main contenders.

    • @Rosaslav
      @Rosaslav Год назад +1

      @@LukeMaster Well, it all comes down to price, so if it is your best bang-for-buck option at the moment, then go for it; otherwise I would just go for a newer gen. Vega was always better suited to productivity tasks than to gaming, and it also performed better at higher resolutions, but currently you could use it only for 1080p, where it doesn't perform as well as the same-gen Nvidia cards.
      If you can choose between different Vega models, I would go for the PowerColor one. I have the Sapphire one and its fans start to spin up at fairly high RPM, so it can be a bit annoying when it cycles between 0 RPM and non-zero RPM. Avoid ASUS and the base blower-cooler models.

  • @Carstuff111
    @Carstuff111 Год назад

    I went from an AMD Ryzen 5 1600X CPU clocked at 4GHz all core, RAM running at 3200MHz, paired with a Zotac GTX 1070 that averaged a GPU clock of 2000MHz, and was happy with that over my previous Intel i5 3570K (clocked at 4.8GHz) and Gigabyte Windforce Radeon R9 290. I got my hands on an EVGA RTX 2080 Super and paired it with my 1600X CPU, got a nice increase in performance but you could tell it was being held back. Now, I have a Ryzen 5 5600X and my RAM is now running its full 3600MHz speed, and I am so very happy with the GPU and CPU. The 5600X really is a great bang-for-the-buck processor.

  • @bdunn7037
    @bdunn7037 Год назад

    I don't know how you don't have 100k subs with this quality of content.

  • @neondead2.0.15
    @neondead2.0.15 Год назад

    Amazing music choice. Love it !

  • @LucidStrike
    @LucidStrike Год назад +2

    Great video. Meanwhile, my 1800X is doing its best to feed a 7900 XTX. I can set it to 4K without performance loss most of the time. 😂
    I'll upgrade the rest of the system next year tho.

    • @evilleader1991
      @evilleader1991 Год назад

      You are severely CPU limited, ouchie.

    • @LucidStrike
      @LucidStrike Год назад +2

      @@evilleader1991 Indeed, that's the premise here. But between playing in 4K and the eventual use of FSR 3 Fluid Motion, I'll be fine until I can afford upgrading.

  • @jelipebands1700
    @jelipebands1700 Год назад +1

    Awesome video. If you can get an RX 6800 and compare it to the 3070, you could check driver overhead and VRAM-limited games at the same time.

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      As they cost the same. But they had to make something up to make the Nvidia side look good 😂

  • @quinncamargo
    @quinncamargo Год назад +1

    Great content as always. Now let's see if it matters whether you have an old Intel or an old AMD.

  • @jayrap94
    @jayrap94 Год назад +2

    As someone stuck on an i7-4790 who really wants to build a new system but can't afford it, thinking about an RX 6600 or RX 6700 (and then using it to its full potential in the future) is tempting. On a GTX 1060 6GB right now.

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      Even an 8GB RX 470 would be a good upgrade.

    • @lagginswag
      @lagginswag Год назад

      Nah, he said a 1060 6GB, not a 4GB 1050; even then a 580 beats or matches the 1060. Haswell can feed 1080/2060/5700/6600-level cards, but I wouldn't recommend higher. She's pretty long in the tooth. Hop over to AM4! 🎉

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      @@lagginswag A 470 8GB would unlock a lot of CPU power, so it's a very nice upgrade for a CPU-limited machine even if the graphics performance is the same.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      I'm on a 1060 6GB as well, but with an i5-3570k and the CPU limit is quite impressive.

  • @REgamesplayer
    @REgamesplayer Год назад +1

    The problem, or rather the perspective, here is that people who are looking to upgrade their ancient PCs are not looking for high-quality gaming. They only want to keep their system capable of running games at decent quality for as little as possible. Anything above 60 FPS is superfluous to that.
    I personally was forced to upgrade my GTX 760 when it developed a minor flaw in one of the pins. I bought a used GTX 1060 6GB. My i5-4670 immediately became the main bottleneck in a lot of games. However, they are both fairly balanced, as both parts fully max out when playing games. It is of course not what you would want (the GPU being held back by the CPU, and having no CPU headroom for other tasks), but I still appreciate the fact that those two parts are evenly matched and both of them get fully utilized.
    Here is where our perspectives differ. I bought the GTX 1060 6GB because I had to. However, even if I had bought it as an upgrade, I would have bought it because games which I wanted to play struggled, especially with Nvidia GeForce GPU recording. I was running out of VRAM in a lot of games. I wanted a cheap upgrade to get more life out of my system. Your suggestion of "splitting the budget between CPU and GPU" is cringy: in order to change the CPU, I would need to buy a new motherboard and new RAM. That is a lot of overhead just for buying a new CPU. On top of that, it would be a massive pain to upgrade, which not everyone is capable of. I would also need to leave enough money for the GPU too. At that point, it is not upgrading but replacing most of your system. You might as well buy a new computer: a lot less work, and you could cover part of the cost by selling the old one.

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      A 4th-gen i5 with an RX 580 does around 80 FPS on the Warzone 2 big map; it's only with the 1060 that you drop under 60.

    • @REgamesplayer
      @REgamesplayer Год назад

      @@gabrielecarbone8235 Good to know.

    • @omegaPhix
      @omegaPhix Год назад

      ​@@REgamesplayer You can cop a ~$30 Xeon (E3-1241 v3) for your motherboard. It performs the same as an i7 4770.

    • @REgamesplayer
      @REgamesplayer Год назад

      @@omegaPhix Yeah, a used i7 costs around 70-100 euros.
      Though I'm just waiting for the 14th generation. Waiting out 10 whole CPU generations is something symbolic.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      I'm almost in the same situation. Went from a GTX 660 Ti (which is about as fast as a GTX 760) to a GTX 1060 6GB. Paired with an i5-3570K. Noticed the same CPU limit. And while a Xeon would be interesting, those extra threads would cost me about 13% in clock at the same IPC, so won't help in those games where I would need a faster CPU.

  • @burtiq
    @burtiq Год назад +3

    You should try benchmarking the 1060 3GB in 2023, just to see how bad it does. I have one but I don't have/play newer games, would be fun to see what (little) it can do

    • @MLWJ1993
      @MLWJ1993 Год назад +1

      Could still be "fine" if you're okay with concessions made to VRAM intensive settings (texture/shadow/reflection/ambient occlusion/volumetric lighting resolution to name some). 🤔

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      oh, compare it to the 1050 Ti, which comes with 4 GB VRAM, but a much weaker GPU. The old question about GPU vs VRAM.

  • @100500daniel
    @100500daniel Год назад +2

    Your RDR2 results with the Ryzens are bad because you're using DX12 instead of Vulkan. With my 5600X/3060 Ti combo I noticed up to a 50% performance difference between the two APIs. Although with my current setup there's barely any difference, other than long stutters in DX12 when loading into new areas that don't appear with Vulkan.

  • @VeiLofCognition
    @VeiLofCognition Год назад

    HAF 912 is a LEGEND!!...rocking it now in 3.0 form 13600KF + 3080Ti

  • @yearyellow3640
    @yearyellow3640 Год назад +1

    Hurts to see the 6500 XT described as a bad product. I, as an extreme budget gamer / hobbyist 3D modeler, just bought it to upgrade from my RX 460 4GB. Good video though.

  • @rodrigomendes3327
    @rodrigomendes3327 10 месяцев назад

    Amazing video. It shows that Nvidia doesn't suffer from driver overhead anymore.

  • @hiriotapa1983
    @hiriotapa1983 Год назад

    Awesome channel, with great thought-provoking content! Nice British accent too....

  • @Kratochvil1989
    @Kratochvil1989 Год назад

    Pretty insane to see this; I saw people running a 4090 on 8th-gen Core CPUs and so on, they just admitted it on forums. That must be a pretty terrible experience for the buck :D ... Plus, the video itself is very, very well made: stylish, clear commentary, well-chosen music. Great job, it's perfection. Sub. Thank you and good luck with a million ;-) subs.

  • @RobertJohanssonRBImGuy
    @RobertJohanssonRBImGuy Год назад +1

    A 7800X3D and a 6700 XT is superb at 1440p, as the gaming experience is fluid in a way due to the CPU cache.
    I highly recommend you buy a 5800X3D or 7800X3D.
    Best gaming experience I've had in 30 years.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      The cache is really great in games that make use of it. Looking back, the i7-5775C has aged really great as well.

  • @EDARDO112
    @EDARDO112 Год назад +2

    One thing that I have noticed is that ray tracing is super demanding on the CPU; my 3700X drops to 40 FPS while using ray tracing in Cyberpunk 2077 (reflections make the biggest impact, but other options also make it go below 60 FPS). It would be cool to test CPUs for ray tracing, since every test that is done excludes ray tracing. Meanwhile I'm keeping my CPU for now. (For context, I'm using a 3090 with DLSS Auto and optimized settings at 4K; GPU usage is way below 100% even when ray tracing is on.)

    • @hardcorehardware361
      @hardcorehardware361 Год назад +1

      RT is heavier on the processor, you are correct, and it is also more VRAM intensive. I had my 3090 paired with a 3950X at a 4.3 GHz all-core OC and my GPU usage in some games was always under 80% at 1440p, but now, since I upgraded to the 7950X, it's at 99% usage. I saw a rather large increase in performance by upgrading, so if you do upgrade to a 7700X you will see quite a significant bump.
      DLSS and FSR lower VRAM usage with RT on, but I never use upscaling because it makes my games look washed out and low-res, and I never use RT because the performance impact is significant.
      Just read your comment and thought I would reply to it.

    • @thousandyoung
      @thousandyoung Год назад +2

      That's why it's useless for Gaming. Most Gamers won't be using that Shit in Real Life. I prefer Raw FPS Power than 100 Fake Frames FPS Gimmick with Massive Latency too.

    • @hardcorehardware361
      @hardcorehardware361 Год назад +1

      @@thousandyoung Honestly I agree with you.

  • @McLeonVP
    @McLeonVP Год назад +1

    Ryzen 3 3100 + RX 7900XTX
    Perfection

  • @ChrisPBolsak
    @ChrisPBolsak Год назад +1

    I have an RX 5700 XT, and even just going from an R9 3900X to an R7 5800X3D I got a decent performance increase.

  • @ezpzjayzaar
    @ezpzjayzaar Год назад

    'Morning! Nice day for fishing, ain't it? Hu-huh! 😁

  • @fickedyodad2137
    @fickedyodad2137 Год назад +1

    Bought a 7900 XTX and plan on keeping my non-3D 5800X for two or three more years.

  • @doriansanchez1961
    @doriansanchez1961 Год назад

    I'm an AMD guy who owns an Nvidia card, but I have to say: great job. Nothing but facts and knowledge, not biased at all, and full of good information for those in the old-CPU situation.

  • @skemmy215
    @skemmy215 Год назад

    My favourite tech channel rn

  • @SniperWolf2024
    @SniperWolf2024 10 месяцев назад

    As a PC flipper for 21 years now, I noticed something was wrong with the CPU load when I upgraded, a long time ago! But now I only use AMD, and Intel for editing and capturing!

  • @NAKADZI
    @NAKADZI Год назад

    😥Pity that there was no comparison (or at least mention of it) with the Spectre/Meltdown patch disabled, because it can give you an additional 2-5% performance on Zen 1-3 CPUs (and also on Intel equivalents)

  • @ThisIsLau
    @ThisIsLau Год назад

    Great video, I subscribed!

  • @TheMasterOfSafari
    @TheMasterOfSafari Год назад +7

    I'm pretty sure the games where the 3070 performs better are in DX11 titles..
    Historically Nvidia has always performed better in DX11, and Vulkan / DX12 ran better for AMD.

    • @mareck6946
      @mareck6946 Год назад +5

      That's basically what creates the CPU overhead: their software implementation of scheduling and instruction ordering on the CPU, which led to more parallelized DX11 workloads, while AMD has hardware schedulers that are hit or miss in DX11 but work very well with parallel APIs such as DX12 and Vulkan.

    • @mircomputers
      @mircomputers Год назад +1

      as the 3070 cost 50% more than the 10gb 6700 non Xt, the fact that you have to look hard for any advantage for the former is such a joke

    • @mareck6946
      @mareck6946 Год назад +2

      @@mircomputers you know most ppl are still "nvidiots" :P
      sadly lol - regardless of specs price perf

    • @evilleader1991
      @evilleader1991 Год назад

      Dont think thats true for newer gen Nvidia from 3000 series onwards.

    • @83RhalataShera
      @83RhalataShera Год назад

      @@evilleader1991 I think that it is still the case on all NV GPUs since GTX 600.

  • @maxmad4771
    @maxmad4771 Год назад +4

    Does it make any sense to pair a 2600K (Sandy Bridge), set at 4.3 GHz or so, with, say, an RX 6600? The rig is currently running a much older and slower video card. How much would it bottleneck (I guess a lot)?

    • @Moesuito
      @Moesuito Год назад +1

      The 2600K is a very capable CPU, but I don't think you'd get much out of it; that CPU doesn't have AVX2. I think an i7 4770K/4790K is a better choice for this case.
      I have a Haswell-E Xeon (E5-2666 v3) paired with 64GB of quad-channel DDR4 and an RX 6600, and I get basically 100% GPU usage in most cases. E-sports titles run very well (Warzone 2.0 at 90 FPS, COD MW1 at 160-180 FPS, Battlefield 4 at around 250-300 FPS at 1440p low settings, CS:GO above 300 frames per second), and all of that with a 2014 CPU with 10 cores and a 3.5 GHz boost.

    • @blackf0rd
      @blackf0rd Год назад +4

      Of course it will bottleneck but buying RX 6600 isn't a bad idea. You will surely get more performance in games than with card that you own now. You can later upgrade the whole PC but keep the 6600 and continue gaming without any problems

    • @rileyhance318
      @rileyhance318 Год назад

      depends on resolution. I would personally look at an am4 5600 based system if you are trying to keep price down. if you get an older mobo the whole platform would be under or near 300 dollars and would keep up with almost any gpu besides a 4090 at 1080p.

    • @blackf0rd
      @blackf0rd Год назад +2

      @@rileyhance318 He already owns a 2600K; he doesn't plan on buying a new rig.

    • @rileyhance318
      @rileyhance318 Год назад +1

      @@blackf0rd The 2600K is a borderline antique. You can find OptiPlexes that businesses are throwing out with 6th and 7th gen Intel in them. Running a 2600K isn't worth it for the power consumption and heat alone. Replacing it with almost anything else for dirt cheap is the way to go.

  • @BeefLettuceAndPotato
    @BeefLettuceAndPotato Год назад +1

    YES.
    Well I think so. I guess I'll go ahead and watch the video now lol.
    You know I'm always here for the AMD. 😅

  • @alt666
    @alt666 Год назад

    Man I remember when my i5-4440 could play games at mid to max settings by itself. Used to play modded Skyrim with hundreds of mods at a locked 60fps... Now you need a high end CPU and GPU if you like graphics and frames.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад +1

      And with upscaling, so not even at fully native res.

    • @alt666
      @alt666 Год назад

      @@HappyBeezerStudios And ai generated / fake frames smh. Why even buy a GPU past 30 series if it needs fake frames and upscaling to do the same thing. I'm still fine at 1080p.

  • @MadIIMike
    @MadIIMike 11 месяцев назад

    I wonder how extreme the difference would be on common use cases for budget PCs. That OC R7 1700 is still a pretty high end CPU from the perspective of a budget gamer and far in excess of requirements for most MMOs and so on.
    It would be interesting to see the difference between say... the GTX 1060 6GB and the RX 580 8GB on 4th Gen i5/i7.
    Also, it might be interesting for streamers to see the impact of the driver overhead if they have other software taking up CPU performance.

  • @h1tzzYT
    @h1tzzYT Год назад +1

    You are probably done with AMD/Nvidia driver overhead testing, but I would like to see the same test on a dead-end platform where an upgrade is much more painful than AM4, let's say a Haswell/Ivy Bridge/Sandy Bridge i7 or the HEDT variants. With first-gen Ryzen it's really not much of an issue, as a CPU upgrade is very easy and cheap; you can upgrade from a 1600/1700 to something like an R5 3600 from AliExpress for just ~70€. If you are buying a 400€+ GPU, then spending an additional 70-130€ on a 3600/5600 as a drop-in CPU upgrade should be a no-brainer.

  • @RolfWrenWalsh
    @RolfWrenWalsh Год назад

    I am an AMD guy at heart, but went with the 2070 Super from my Vega 56, and from there a 3090 12GB because of CDPR being in bed with Nvidia.
    I want to have an all AMD build once again, and since I plan on keeping my 5800X3D (OC'ed to 4.2ghz, paired with 64GB DDR4 3733mhz) for at least another 2 or 3 years, this video makes me think I will go for an AMD GPU once again in 2024/2025.

  • @Josue31627
    @Josue31627 Год назад +1

    I've seen that overhead playing a role when running certain older games like Dota 2 or GTA V. FX CPUs struggle a bit in those games especially with Maxwell and newer Nvidia GPUs. It's weird because normally those games prefer Nvidia GPUs. I know this video is supposed to be for newer games but I feel this applies even to older games and older GPUs.
    I don't know if 1st gen Ryzen also struggles in these older games.. Maybe they do but not as much as the FX series does I hope.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      And in slightly older games, around 2013-2016, on cards of that time, it was Radeon that had the worse driver overhead. They released mantle at the time, and in some cases improved performance by over 50% that way.

  • @jhonrock2386
    @jhonrock2386 Год назад

    Something I think should be considered: once the GPU core architectures are enough to handle the given game's graphics at the given resolution, MAYBE the clock speed of those cores could start to make some difference.

  • @prrocker9637
    @prrocker9637 Год назад +2

    I know it's not always possible, but if you could get your hands on a 6700 XT it would be a much closer comparison to the 3070, since the 6700 is a lot closer to something like a 6600 XT in terms of performance.

    • @GewelReal
      @GewelReal Год назад +3

      That was not the point of this video

    • @mircomputers
      @mircomputers Год назад

      The price difference is just stupid; the 3070 costs 50% more than the 6700 10GB, and this is totally omitted in the video commentary. Should have used a 3060 12GB.

    • @prrocker9637
      @prrocker9637 Год назад +1

      @@GewelReal It's meant to show the driver overhead between the cards, I'm aware; what I'm saying is it's more accurate when you pair cards that are closer in power.

    • @savagej4y241
      @savagej4y241 Год назад

      IMO the comparison of RX 6700 with RTX 3070 is a good choice. It shows that if you're on 1st gen Ryzen, a mid range NVIDIA card that's 40-50% more expensive than the lowest end comparable Radeon card with more than 8GB VRAM has on par performance with each other. Of course, you could drop in a Zen 3 chip in the same motherboard and that would fix the problem, but at that point its around $650 paired with the RTX 3070.
      If you're on an R7 1700, then paying just over half that ($335) for the RX 6700 and saving up for a new build instead of dropping in a Zen 3 chip and paying more for an NVIDIA card seems like the better deal if you're otherwise satisfied with the still serviceable 8-core CPU.

  • @hypersonicboom
    @hypersonicboom Год назад

    Excellent content, subbed.

  • @cowboybenbop
    @cowboybenbop Год назад

    This type of video is a godsend for people like me who are new to PC and looking for a bargain 👍

  • @little_shady_fox
    @little_shady_fox Год назад

    Wish there was a company that specialized in making low-to-mid-range GPUs with decent performance and extremely long driver support.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад

      Pretty much. That is where the big money is. The flagships are prestige products: for the company, the claim of having the fastest graphics card, and for the user, the hunt for the best.
      With better and better integrated graphics, the low-end OEM market is pretty much dead, but the midrange is where the money is.

  • @deltasixtwo
    @deltasixtwo Год назад +1

    Yes! That is the video I've been waiting for. TBH I'm an owner of a 3070 (laptop) myself, but knowing about driver overhead, the old CPU + Nvidia GPU pairing always felt a little off. Thank you for this video, and I kindly hope you will consider changing your old-CPU testing methodology 😄
    Also, let's hope that spreading more info about the driver overhead issue will incline Nvidia to rework their old-as-dinosaurs GPU drivers. 😄

  • @MD-se8ft
    @MD-se8ft Год назад +1

    If you're gonna do another video with an Intel CPU, use the 7700K.
    I was searching for this same topic while considering buying a 6700 these past few days, and I heard AMD has issues with PCIe 3.0 on their 6600 XT GPU, so yeah, I'm confused.

    • @lagginswag
      @lagginswag Год назад +1

      Some newer cards are PCIe 4.0 x8, so they'd be stuck at x8 on 3.0, which is half the bandwidth. Check videos; it's not too much of a difference. The RX 6500 XT is x4 however, which is gimped.

    • @MD-se8ft
      @MD-se8ft Год назад

      ​@@lagginswag thank you

    • @lagginswag
      @lagginswag Год назад +1

      @M D Also, 4.0 x8 is the same as 3.0 x16. I've been eyeing the RX 6600 but I'm on B450, so I wondered about a bottleneck. Maybe a 3-5% performance loss on average.
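
A quick back-of-the-envelope check of the bandwidth point above (my own numbers, not from the thread): PCIe 4.0 doubles the per-lane rate of 3.0, so a Gen4 x8 link matches a Gen3 x16 link, while a Gen4 x8 card dropped into a Gen3 slot runs at Gen3 x8, i.e. half of that.

```python
GT_PER_LANE = {3: 8.0, 4: 16.0}   # transfer rate per lane in GT/s
ENCODING = 128 / 130              # 128b/130b line encoding used by Gen3/Gen4

def link_bandwidth_gbps(gen, lanes):
    """Theoretical one-direction bandwidth of a PCIe link in GB/s."""
    return GT_PER_LANE[gen] * ENCODING / 8 * lanes

print(f"Gen3 x16: {link_bandwidth_gbps(3, 16):.2f} GB/s")  # ~15.75 GB/s
print(f"Gen4 x8 : {link_bandwidth_gbps(4, 8):.2f} GB/s")   # ~15.75 GB/s
print(f"Gen3 x8 : {link_bandwidth_gbps(3, 8):.2f} GB/s")   # ~7.88 GB/s
```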

  • @boonedawg1506
    @boonedawg1506 Год назад

    Based on this video, would you suggest upgrading an i7 6800K? Mine is clocked at 4.2 GHz all-core and does well in most AAA games paired with a 3070. But how much am I leaving on the table? How much more FPS could I get if I go with a 13600K or 7700X? Thank you for the video too; I like seeing cool videos like this.

    • @83RhalataShera
      @83RhalataShera Год назад

      Search for benchmarks of your GPU in your games and compare them with your performance. You can also monitor your GPU usage with tools like MSI Afterburner; if it is lower than 97-100% for a lot of the time, you are losing FPS.
      I think there is a 6800K video on this channel; since the guy tests all his CPUs with a 3070 and compares them to the 5600X, you can see how much you lose.

  • @Derpynewb
    @Derpynewb Год назад +1

    WHAT THE FUCK? THAT STARK OF A DIFFERENCE? I'm CPU bound a lot of the time. You're telling me if I went red I'd get a 20% performance improvement? What the fuck? That's like a whole generational increase.

  • @mikebruzzone9570
    @mikebruzzone9570 Год назад

    Good back gen coverage and upgrade observations. mb

  • @aparidaewithaclue3267
    @aparidaewithaclue3267 Год назад

    keep up this good work and style

  • @markr.1905
    @markr.1905 Год назад

    I recently purchased the RTX3070 for my 1700 RIG and am happy with it. The next step in the upgrade process will be the purchase of a new processor

  • @rileyhance318
    @rileyhance318 Год назад +5

    I don't think the 1700 is the best example, because for 80-130 bucks you can throw in a 5500 or 5600 and get a massive jump in IPC that will easily keep up with even modern high-end cards, if you're using the resolution the cards were aimed at.

    • @TheMasterOfSafari
      @TheMasterOfSafari Год назад +7

      Your argument is honestly... terrible?
      This video is MEANT to test an older CPU, which the 1700 definitely is 😅
      It's not about an upgradable socket.

    • @joemarais7683
      @joemarais7683 Год назад +3

      @@TheMasterOfSafari I don't think it's the worst argument, since he's kind of right.
      If you're spending $600 on a 4070 (4060) or $400 on a 4060 Ti (4050 Ti), then you have the money to spend $100 on an in-socket upgrade that, unlike the GPU choice, wouldn't be a massive ripoff.
      I more wish he'd chosen the 9600K though. That way, there would clearly be no other option but a costly platform upgrade.

    • @rileyhance318
      @rileyhance318 Год назад +2

      @@TheMasterOfSafari Who is spending 400 bucks on a GPU but can't spare 80 on a CPU to get the most out of it, or even to improve general everyday performance in web browsing? Using AM4 for this test is flawed due to the incredible upgrade path you have the option of taking when running a 1st-gen Ryzen chip.

    • @sjneow
      @sjneow Год назад +1

      I think he prefaced it that it is for someone looking for a GPU for older systems

    • @TheMasterOfSafari
      @TheMasterOfSafari Год назад

      @@rileyhance318 have u guys even watched the video... He did mention "especially if you have a dead socket" in the video as well.

  • @firisrozley5768
    @firisrozley5768 Год назад

    Speaking of the Ryzen 7 1700, that reminds me of 2 particular laptops (Asus & HP Omen), both powered by that same desktop CPU with a 4GB RX 580M as the graphics card. Those laptops could technically take Ryzen 5 2600 & Ryzen 7 2700 chips (for Windows 11 compatibility reasons) given the same AM4 socket, but those companies didn't bother to update the BIOS, so they're stuck on Windows 10 unless they go for Chimera OS.

  • @warrenskeen9203
    @warrenskeen9203 Год назад

    The Last of Us Part 1 is now not loading the textures on the 3070 because it doesn't have enough VRAM.
    The 6700 has more, so it has more work to do to load the more detailed textures.

  • @aavvironalex
    @aavvironalex Год назад

    Nice videos as always.

  • @dangerous8333
    @dangerous8333 Год назад

    This is funny.
    Because I have had nothing but problems with ATI/AMD Gpu drivers on older machines.

  • @pkgod99
    @pkgod99 Год назад

    If on a tight budget, why not just buy the GPU first and then get the CPU later, in a few months? Putting even just $30 or so aside every month would let someone upgrade the CPU to a 5600 in like 4 to 5 months.

  • @mayorplayz
    @mayorplayz Год назад

    bought i7 10700k and rtx 3070 on launch in 2020, still loving the combo

  • @lflyr6287
    @lflyr6287 Год назад

    Iceberg Tech: there are 3 things that define how efficiently a CPU and GPU work together:
    - The DX API and its lack of ASYNCHRONOUS computing (modern games don't really use DX12 truly; they just run through it and compute in a SYNCHRONOUS way behind it via the DX11 API, because modern games still mostly use derivatives of older graphics engines that were never coded for DX12).
    - AMD's parallelism in their architecture since GCN in 2012, whereas Nvidia's GPU architecture is older and very linear, hence it doesn't truly support DX12 yet.
    - AMD's GPUs only drag certain CPUs along faster, not all of them... some CPUs reach their full potential when coupled with Nvidia GPUs... it all depends on what graphics engine a game uses. Some prefer more linear utilization (which means the faster the GPU core clock and the larger the number of ROPs, the better), which is what older engines like Cyberpunk 2077's (which started development in 2011) like, and some prefer more parallel utilization (which is where AMD GPUs come into play, due to their stronger texture units and, since the RX 5000 series, above-2000 MHz GPU core clocks), which is what a much newer game engine like Spider-Man's likes.
    If a truly parallel API like AMD's 2011-developed Mantle were in use today, there would practically be no driver overhead or bottlenecking in the majority of PC configurations. Due to its ASYNCHRONOUS way of computing, those two API problems would almost, but not entirely, vanish from the majority of games.

  • @deilusi
    @deilusi Год назад

    One thing not mentioned is that 100% CPU usage means input lag as well, and that's annoying, so it's MUCH better not to utilize your CPU in full.
    If you have anything open in the background, those numbers will suffer as well, so any Spotify or such would cause another hit too.

    • @HappyBeezerStudios
      @HappyBeezerStudios Год назад +1

      Can confirm. It's worse than just low performance, it makes playing uncomfortable.

  • @epi23
    @epi23 Год назад +1

    Does the driver overhead issue matter in all CPU-limited situations, or only when all threads are utilized? Because you can be CPU limited with low overall CPU utilization.

    • @sjneow
      @sjneow Год назад +1

      Games are mostly single-threaded, so yeah, you don't need to wait for full thread utilization to encounter the driver overhead.

    • @tubaeseries5705
      @tubaeseries5705 Год назад

      It matters in DX12/Vulkan games because those use many threads for rendering.
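
A small illustration of the point raised in this thread (my own example, not from the video): one saturated game or driver thread can cap the frame rate while overall CPU utilization still reads low, which is why "CPU limited" and "100% CPU usage" are not the same thing.

```python
def frame_cap_and_utilization(busiest_thread_ms_per_frame, cores):
    """Max FPS imposed by the busiest thread, and the overall CPU usage it shows."""
    max_fps = 1000.0 / busiest_thread_ms_per_frame  # one thread fully busy
    overall_usage = 100.0 / cores                   # the remaining cores sit idle
    return max_fps, overall_usage

# e.g. 12 ms of single-threaded game/driver work per frame on an 8-core CPU:
fps, usage = frame_cap_and_utilization(12.0, 8)
print(f"capped at ~{fps:.0f} FPS while overall CPU usage reads ~{usage:.1f}%")
```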

  • @Baki-EGW
    @Baki-EGW Год назад +1

    Radeon is definitely better for old CPUs. Just replaced my 1660 Ti with an RX 6700 (non-XT) on a 4690K @ 4.6 GHz with 16GB DDR3 2000 MHz.
    Shadow of the Tomb Raider benchmark, 1080p highest: 1660 Ti = 80 FPS, RX 6700 = 111 FPS.

  • @saricubra2867
    @saricubra2867 Год назад

    I think Radeon is better for newer PCs as well. I always play emulators or CPU-limited games, and I'm not interested in the raw feature-set horsepower of Nvidia unless I need it for work, and I have an i7-12700K. I would brute force my CPU for that stuff; it's already tuned for that.
    A friend has a 3080 Ti but plays GPU-bound games, like Callisto Protocol when it came out.

  • @bardavidson2102
    @bardavidson2102 Год назад

    Great video, keep up the good work! About the RDR2 results: enabling Resizable BAR/SAM can hugely improve the 1% and 0.1% lows, sometimes roughly doubling them, especially on Radeon 6000 GPUs when paired with AMD CPUs like the 5600X. Nvidia GPUs can also benefit from it, just not to the same extent.

  • @Jesperkraakman
    @Jesperkraakman Год назад

    Built a really cheap second-hand scrap-parts PC for my brother: an i5 4440 with mobo for 30 bucks and a cheap RX 580. It works very well for esports titles at 1080p and even heavier titles at medium.

  • @snap_oversteer
    @snap_oversteer Год назад +2

    I ran an RTX 2080 with a 2010 AMD Phenom II X6 1055T for some time when my newer motherboard died, and oh yes, that was a serious bottleneck, but plenty of games from the previous decade were still perfectly playable even at 4K.

    • @timiko4
      @timiko4 Год назад

      "event at 4K"
      because resolution has almost nothing to add on CPU load, it's just GPU load

    • @deleater
      @deleater Год назад

      ​@@timiko4 Not always true. RDR2 at native 4K resolution runs at no more than 30-35 fps on all pre-11th-gen quad-core CPUs, no matter the overclock.

    • @liamsgamingpcs7473
      @liamsgamingpcs7473 Год назад

      ​​@@timiko4 So I might as well buy an i3 4160 and pair it with a 4090 then, huh?
      Surrree buddy

    • @timiko4
      @timiko4 Год назад

      @@deleater Not always, yes, because some parts of the rendering pipeline can be done by the CPU (the infamous Skyrim shadows, for example).
      Or the CPU just needs to handle more IO operations when using heavier assets.

    • @timiko4
      @timiko4 Год назад

      @@liamsgamingpcs7473 So read again.
      If the CPU is crap, it will run badly regardless of resolution.

  • @deagt3388
    @deagt3388 Год назад +1

    First-gen Ryzen was never a gaming processor; it's a workhorse! ;-)

  • @DarthAsdar
    @DarthAsdar Год назад

    Thank you so much for this video. I have recently upgraded my CPU from an i7-6700 to an i9-9900 (you can do it by modifying the BIOS of Z170/H170-chipset motherboards; the only limitation is the CPU power delivery). Now it is time to upgrade my old RTX 2060. Due to PCIe 3.0, an RTX 4060 Ti with its 8 lanes of PCIe 4.0 is a pretty bad idea (though it will be OK as long as there is enough VRAM, so the 16GB version could fit). But now I see that I should wait for the RX 7700. Thank you one more time, this video was really useful.

    • @thedamntrain5481
      @thedamntrain5481 Год назад +1

      The 2060 isn't that old. I'm using an R5 1600 overclocked to 4 GHz with 2x8GB at 3400 MHz (yes, I did manage to overclock the RAM from 2666 to 3400) and an RX 5600 XT.

    • @gabrielecarbone8235
      @gabrielecarbone8235 Год назад

      An RX 6700 XT will surely do it for that system.