CPUs Matter for 4K Gaming, More Than You Might Think!

  • Published: 19 Dec 2024

Comments • 1.3K

  • @fracturedlife1393 · 7 months ago · +1643

    Usually wallet limited before CPU limited these days!

    • @djpep94 · 7 months ago · +18

      So true 💔

    • @thesilentassassin1167 · 7 months ago · +14

      Not just these days

    • @theanimerapper6351 · 7 months ago · +68

      CPUs are actually pretty cheap nowadays. It's the GPUs that cause a wallet limit 😂

    • @RiasatSalminSami · 7 months ago · +35

      @@thesilentassassin1167 More so these days than in the past, especially since a potato GPU costs $500+

    • @empathy_ggs · 7 months ago · +3

      If there were a booklet of truths for hardware-related statements, this would be the truest lmao

  • @1Grainer1 · 7 months ago · +1782

    Obligatory "where is 5800x3d?" comment

    • @stephenwerner1662 · 7 months ago · +173

      Did you check behind the couch?

    • @InfernoTrees · 7 months ago · +53

      Check your pockets

    • @brucethen · 7 months ago · +9

      Lol you beat me to it

    • @1Grainer1 · 7 months ago

      @@stephenwerner1662 Found it! In the cookie jar

    • @GewelReal · 7 months ago · +8

      Still in a store

  • @StefandeJong1 · 7 months ago · +413

    I have an RTX 3090 and like playing simulation games, using my 42" LG C2. Upgrading from the 5600X to the 5700X3D (I reused the 5600X for my HTPC) gave me 60% higher 1% lows and 55% faster simulation speed in Cities:Skylines 2. It was night and day

    • @jay-5061 · 7 months ago · +44

      That game is also garbage in performance terms

    • @yahyasajid5113 · 7 months ago · +27

      I did a comparison with a 7500F and a 7900X3D in modded BeamNG.
      Up to a 40% performance difference with a 3090 at 1440p was nuts

    • @eswecto6074 · 7 months ago · +14

      I wish there were more data like this on game performance at 4K60 and 4K120, both with and without upscaling/FG and RT, across different games. It would save a lot of time spent digging through benchmark videos on YouTube.

    • @hilerga1 · 7 months ago · +20

      I’m on a 5800X (non-3D), an RTX 3080, and that 42” LG C2. Built the computer for retail prices at launch during the lockdown; I was stuck home with very little better to do than obsess over beating scalpers to good parts 😂.
      Picked up the 42” LG C2 later on sale for about $600, and I have to say it is killer for sim gaming and single-player RPGs! Best “monitor” I’ve ever owned.

    • @aftermarkgaming · 7 months ago

      @@jay-5061 that makes it a great real world use-case example

  • @gurshair · 7 months ago · +135

    "try not to fixate on the cpu used"
    As someone with an R5 3600, I will most definitely pay attention

    • @PhotoshopArt · 7 months ago · +6

      The 3600 is just fine. Don't throw away money. I was on a 2-module FX-4100 a while ago. Wait for 2-3 more generations; then you're gonna notice the difference. Thank me later.

    • @turtleneck369 · 6 months ago · +3

      @@PhotoshopArt False. A 3600 versus the 5000 series, not even counting the 3D chips, is a big difference in every scenario, and you don't have to upgrade the board or RAM. The X3D chips are a night-and-day difference versus any Ryzen 3000 part; it's a huge performance jump.

    • @TheNerd · 6 months ago · +6

      @@PhotoshopArt Depending on his situation, there can be a HUGE difference, like THE VIDEO YOU LITERALLY JUST WATCHED shows.
      Also: a comment like this from someone who upgrades every decade can't be taken seriously.
      The FX-4100 was released in _2011_, the year The Elder Scrolls V: Skyrim launched on PC, Xbox _360_ and PlayStation _3_. Yes, it was THAT LONG ago.

    • @aRealAndHumanManThing · 5 months ago · +1

      Yep.
      Got the 3600 in early 2020 because it was the only thing I could afford, and I really needed a PC back then.
      Today it's a good-value chip,
      but IIRC it's bottlenecking even my 6700 XT, just not enough to justify buying something new.
      I'm building an AM5 PC anyway, so it's just gonna be my backup from then on.

    • @cosmic_gate476 · 5 months ago · +1

      @@TheNerd It’s just a gaming CPU, no need to be so heated. You likely have a different definition of “notice the difference” than that person. Personally I only got the 5800X3D because I was on a 2700X and upgrading nearly doubled my performance in many cases. A 3000 series owner should go for AM5 so they aren’t tied down to DDR4.

  • @andrewcross5918 · 7 months ago · +230

    Of course they matter. Not everyone plays AAA titles that are often GPU bound. Plenty of people are out here conquering Europe in HOI4, exterminating the xenos in Stellaris, creating some traffic nightmare in Cities Skylines, trying to build the longest, most efficient production line ever in Satisfactory/Factorio, building a rocket to Mars in Civ 6, or guiding a team to victory in Football Manager, and I could go on. CPU performance is the primary driver in these kinds of titles, and yet they so rarely get tested, which is a real shame.

    • @nootonian4149 · 7 months ago · +3

      What is a good GPU for Cities Skylines 1 at 4K?

    • @RobBCactive · 7 months ago · +1

      Well, a bar showing fps isn't meaningful for those games. After the X3D CPUs came out I saw various tests, but the results didn't seem all that surprising.
      CPU reviews test various features, not just action games.
      For purchasing decisions, weighting the CPU, platform longevity and cache relatively more in the budget seems reasonable.

    • @Flaimbot · 7 months ago · +15

      Add WoW to the list. There's never "enough" CPU headroom.

    • @luzhang2982 · 7 months ago · +8

      Rimworld, CS (both Counter-Strike and Cities: Skylines), VR, etc. Lots of top-10 games and game categories are not what reviewers use.

    • @genocidehero9687 · 7 months ago

      Wow niche nerd games lmao

  • @KillFrenzy96 · 7 months ago · +372

    When a game throws you into a crazy situation, it's usually the CPU that suffers the most. Most benchmarks are not done during the unpredictable chaos that you get into for many games, especially modded games.

    • @roqeyt3566 · 7 months ago · +53

      Exactly, and in those situations you do want a better CPU to reduce frame skips, jitter, input lag, etc.
      For some folks that's worth $300 on a $1500 PC; for others, they'd rather pocket the money and deal with it

    • @Omar-Asim · 7 months ago · +12

      Or certain old games or MMOs

    • @francystyle · 7 months ago · +33

      Yeah, I’ve got a 4070 Ti Super with a 14600K, and at the beginning of a mission in Helldivers 2 I’m GPU limited, but at the end, with a ton of enemies, my CPU starts to limit my GPU and my frames drop

    • @InnuendoXP · 7 months ago · +3

      Yeah, though it'd be interesting to see to what extent multithreading would be beneficial, particularly as a modded game is probably loading heavily on tasks designed for a single thread, so single-thread performance is the main limiting factor once you're at 6 cores or more. The 5600X3D showed this pretty comprehensively, I think.

    • @snozbaries7652 · 7 months ago · +3

      @@francystyle Really?? I have a 4070 Ti + 5800X3D and I don't get CPU limited. I play at 4K with DLSS Quality.

  • @moevor · 7 months ago · +207

    Steve: "Try not to focus on the actual CPUs used here"
    Me: Yeah, good luck with that Steve!

    • @martineyles · 7 months ago · +6

      Well, it does matter. We don't see the in-between options like the 5600 or 7600, which are more likely upgrades for people like me currently running Skylake CPUs.

    • @deviantbuilds · 7 months ago · +12

      I mean, it matters a lot. That's like having a race between a current Olympic level runner and some old dude that was the fastest in high-school 😆

    • @MaxIronsThird · 7 months ago · +3

      people missed the joke

    • @MaxIronsThird · 7 months ago · +13

      @@martineyles Just look up the review for the CPU you're interested in. This video is not a CPU review comparing the 3600 and the 7800X3D; he's just trying to make a point, and it seems it's still going over people's heads.

    • @moevor · 7 months ago · +4

      But it doesn't. What matters most is your target frame rate, or more precisely frame time (CPU time as captured by PresentMon). CPU scaling is extra info you need only if your current CPU isn't providing what you need. Seems that message did not come across

  • @KimBoKastekniv47 · 7 months ago · +92

    Of course the 7800x3D is faster than the 3600 even at 4K, it has more cores!
    (Steve don't kill me I'm joking)

    • @Hardwareunboxed · 7 months ago · +91

      Agent has been dispatched...

    • @nathangamble125 · 7 months ago · +22

      Obviously an FX 9590 is even faster! It's a 4.7GHz 8-core CPU, which is more than the 7800X3D's 4.5GHz!

    • @bumperxx1 · 5 months ago

      @nathangamble125 For old potato games like World of Warcraft, War Thunder, World of Tanks, Roblox, etc., that FX-9590 is just fine 🙂 I used to have one with a GTX 1070 until I met Cyberpunk 2077 and Control on a 4K 120Hz OLED monitor and only got 19 to 32 fps with settings on bare-bones low. Upgraded to a 5800X (before the 3D) and a 3080 Ti in 2020, and it was a great experience even in those older games I mentioned.

    • @alecksandru · 4 months ago

      @@Hardwareunboxed Please tell me: for 4K games, is it wiser to buy a 7900X3D? I noticed that there is only a small price difference between it and the 7800X3D.

    • @Haydatsanime · 3 months ago

      @@alecksandru You may get slightly less fps with the 7900X3D in most games, but if properly set up, the 7900X3D is a very strong contender in gaming while also being superior in productivity workloads.

  • @lanelesic · 7 months ago · +55

    As an owner of the R5 3600 I must say it's one of the best CPUs I've bought. A perfect match for GPUs like the RX 6600, 6600 XT and 6650 XT.

    • @andersjjensen · 7 months ago

      In most games it can still pin those GPUs at 1080p. If you're gaming at 4k with upscaling it will still probably pin a 6800XT.

    • @lanelesic · 7 months ago · +1

      @@andersjjensen Pin?

    • @andersjjensen · 7 months ago · +3

      @@lanelesic "to max out"

    • @lanelesic · 7 months ago · +1

      @@andersjjensen It technically doesn't max them out, but those are the most reasonable combos.

    • @RohanSanjith · 7 months ago · +2

      I have a 3700X, primarily used for video editing. Very powerful, but it does run hot.

  • @UncannySense · 7 months ago · +87

    So basically, if I run my game (whatever that game might be) at 4K, change the quality settings, and notice next to no frame-rate change or uplift, then I need to upgrade my CPU....

    • @exscape · 7 months ago · +17

      Yep! That's basically always the case. Some settings will affect the CPU more than others, though.
      If you get similar frame rates at your standard resolution and HALF your standard resolution (or at say 720p), you're almost certainly CPU limited.
      You could also check GPU usage while gaming though. When GPU limited it will almost always be above 95%, typically 98%-99%.
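
      A minimal sketch of that GPU-usage heuristic in code (the 95% threshold and the function name are illustrative, not from any particular tool):

      ```python
      # Rough bottleneck check per the comment above: sample GPU utilization
      # while gaming. A GPU pegged near 99% points to a GPU limit; anything
      # well below that suggests the CPU (or a frame-rate cap) is the limiter.
      def classify_bottleneck(gpu_util_samples: list[float]) -> str:
          avg = sum(gpu_util_samples) / len(gpu_util_samples)
          return "GPU limited" if avg >= 95.0 else "likely CPU limited (or capped)"

      print(classify_bottleneck([98.0, 99.0, 97.5]))  # GPU limited
      print(classify_bottleneck([70.0, 65.0, 80.0]))  # likely CPU limited (or capped)
      ```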

    • @kleinerprinz99 · 7 months ago · +17

      @@exscape The problem is the many games from the last couple of years that are very unoptimised: they will idle your GPU and you'll run into a CPU limit. Famous examples are Jedi: Survivor (the second game) and Cities: Skylines 2, because the devs didn't even do their basic homework. Elden Ring is also a culprit; nothing was optimised even one year, or two years now, after release. No CPU in the world can compensate for bad and lazy programming.

    • @IIHydraII · 7 months ago · +2

      Could also be memory, but generally speaking, yes.

    • @White-nl8td · 7 months ago · +7

      Check your GPU utilization. If it's less than 95%, you are certainly CPU bottlenecked.

    • @moevor · 7 months ago · +5

      Use PresentMon from Intel and look at CPU time vs GPU time; this is the true metric of whether you are CPU bound or GPU bound. CPU and GPU time in ms can be converted to equivalent FPS = 1000 / time. The difference in CPU/GPU time will tell you how imbalanced each component is for that workload. Bear in mind that just because you are CPU bound does not mean you need to upgrade, for two reasons: 1) is the fps you are getting enough? 2) will a faster CPU allow the GPU to render more? Bonus point: 3) is there even a faster CPU for this workload? Your upper limit is not infinity.
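
      A minimal sketch of that frame-time arithmetic (the busy times are made-up example values; PresentMon reports the real ones in ms):

      ```python
      # Convert per-frame busy times (ms) into equivalent FPS and see which
      # component is the limiter, as described in the comment above.
      def to_fps(ms: float) -> float:
          return 1000.0 / ms

      cpu_ms, gpu_ms = 8.0, 12.5  # hypothetical sampled CPU/GPU busy times
      print(f"CPU can feed {to_fps(cpu_ms):.0f} fps, "
            f"GPU can render {to_fps(gpu_ms):.0f} fps")
      print("Bound by:", "GPU" if gpu_ms > cpu_ms else "CPU")
      # -> CPU can feed 125 fps, GPU can render 80 fps; bound by GPU
      ```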

  • @nannnanaa · 7 months ago · +16

    One thing is that I would rather be limited by the GPU than by the CPU. When limited by the GPU, the frame rates are lower but usually more stable than when limited by the CPU, where stutters and hitches occur more often, which I find more annoying. So I usually want my builds balanced towards more CPU power, especially with a high-refresh-rate monitor and playing mostly first-person games.

  • @macmuchmore1 · 7 months ago · +39

    I upgraded my Ryzen 5 3600 to 7800x3d, while keeping the same 3070. My FPS almost doubled! Had no idea my cpu was holding back my gpu that much. CPU is DEFINITELY important in a gaming pc.

    • @Argoon1981 · 7 months ago · +6

      Both are important; pairing a strong CPU with a weak GPU gives you the inverse problem.
      So the best option is a CPU and GPU combo that complement each other. But it's certainly not easy to know what to buy.

    • @JonBslime · 7 months ago · +7

      Yeah, obviously it's important when you're CPU bound, basically at 1080p and sometimes 1440p, but at 4K, which this video is referring to, you're 99% GPU bound and the CPU is not the main issue here…

    • @felixmdx · 7 months ago · +4

      What's your resolution!?

    • @CallMeRabbitzUSVI · 7 months ago · +6

      Yes but did it double your 4k performance?

    • @vigilant_1934 · 7 months ago · +2

      @@JonBslime Not 99% at 4K. It depends on the game; Steve showed about a quarter of these games being CPU bound at 4K. Did you watch the whole video?

  • @ericpisch2732 · 7 months ago · +79

    I suspect I might have been one of the people that triggered this video, this video was extremely helpful and very insightful. Thank you very much for it. I'm gonna stop asking him for 4K benchmarking :-)

    • @martineyles · 7 months ago · +4

      I don't think he's proven the point. Ignoring 1% lows didn't help. Neither did ignoring the 5600 and 7600 CPUs.

    • @geebsterswats · 7 months ago · +12

      I’ve been asking for this type of video for over a year. I game at 1440p and had bought an RTX 4090, but still had my 5-year-old 3700X CPU. Now I know I’ve been leaving performance on the table

    • @rooster1012 · 7 months ago · +20

      @@martineyles You missed the point of the video completely.....

    • @martineyles · 7 months ago · +2

      @@rooster1012 The video showed a lot of useless 1080p charts. From those charts I was unable to say whether a CPU could deliver the goods (60 fps consistently) at 4k. I needed to see the 1% lows at 4k. However, that's useless if it only shows an old CPU and the best available. We need those charts for other CPUs.

    • @lewzealand4717 · 7 months ago · +20

      @@martineyles This was an example video to show *why* CPU tests are done this way, not a comprehensive CPU test video. Look for other vids from HUB and others to find more specific data you need.

  • @Vince_IA · 25 days ago

    GREAT VIDEO!
    At 19:08: 4K native 64 FPS vs 4K XeSS Quality 82 FPS (+28% FPS); at 13:21: 1440p native 100 FPS (+56% FPS).
    Native 1440p gives double the FPS uplift of upscaled 4K.
    4K scaling: XeSS 1.3 Quality is 1.7x (renders at 2259x1271); XeSS 1.2 Quality is 1.5x (renders at 2560x1440).
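
    For reference, a quick sketch of how those internal resolutions fall out of the upscaler scale factors (the 1.7x and 1.5x values are as quoted in the comment above):

    ```python
    # "Quality" upscaling renders each axis at 1/scale of the output
    # resolution; pixel count therefore shrinks by the square of the factor.
    def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
        return round(out_w / scale), round(out_h / scale)

    print(internal_res(3840, 2160, 1.7))  # XeSS 1.3 Quality -> (2259, 1271)
    print(internal_res(3840, 2160, 1.5))  # XeSS 1.2 Quality -> (2560, 1440)
    ```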

  • @mornnb · 6 months ago · +3

    4K benchmarks are pretty important for a 4K gamer to be able to tell what value a better CPU actually provides. Otherwise you can't make an informed purchase that's actually applicable to how you use the product.

  • @danzydan2479 · 7 months ago · +12

    Great content as always. Buy yourself a slab and enjoy the comments. Cheers mate.

    • @Hardwareunboxed · 6 months ago · +2

      WOW thank you so much. A million apologies for missing this comment and donation. You didn't have to do that, but thank you again, we really appreciate the support.

    • @danzydan2479 · 6 months ago · +2

      @@Hardwareunboxed No worries. I know you guys are busy most of the time. Thank you for the great content the channel always provides.

  • @MrHamof · 7 months ago · +8

    Cryengine games increase the LODs when you're at a higher resolution, under the theory that at higher resolutions it's easier to spot lower LOD models. This is why in some circumstances increasing the resolution can reduce your framerate in those games even if the GPU still isn't at 100%.

  • @tommihommi1 · 7 months ago · +48

    Even with an RX 480, going from a 3600 to a 5800X3D is a crazy difference in real-world in-game experience.
    The main difference is that you get rid of those spikes where the CPU can't keep up and the game noticeably chugs, even if they don't show up in the 1% lows at all

    • @dagnisnierlins188 · 7 months ago

      Get Intelligent Standby List Cleaner, and there's a video on TechYesCity where you can unlock power settings in Windows and change how often your CPU gets polled for its data; the default is way too high. Just from this your system will be snappier and more responsive.

    • @martineyles · 7 months ago

      It would be good to see whether a new CPU or GPU would help more, as my i5 6600 + RX 480 8GB combo sometimes chugs in Cities Skylines and SWTOR at 4K, but it's hard to tell whether it's the CPU or the GPU.

    • @dagnisnierlins188 · 7 months ago

      @@martineyles Try Intelligent Standby List Cleaner and watch the TechYesCity video on how to unlock options in the power plan settings, where you can change how often your CPU gets polled for data. It will improve stuttering and responsiveness.

    • @SpringHaIo · 7 months ago · +6

      @@martineyles Download MSI Afterburner and look at the GPU and CPU usage graphs. If the GPU isn't at 90-100% usage, you have a CPU bottleneck. I would guess that at 4K with an RX 480 you're GPU bottlenecked.

    • @martineyles · 7 months ago · +2

      @@SpringHaIo I'm not always playing the latest games, and I get the impression that things like SWTOR and Cities Skylines may have CPU issues, but it's hard to get a definitive picture one way or the other. The GPU usage in AMD's overlay can keep going up and down.

  • @AK-Brian · 7 months ago · +3

    Really loved the methodical tone and approach of this video. Having it set up as a bit of an educational showcase should help a lot of people better understand how to plan for their upgrades, using information that might not always be intuitive at first glance. Those faster CPUs may often find themselves limited at higher resolutions, but the flexibility of being able to lower detail or apply upscaling lets them stretch their legs. It's a great option to have at your fingertips.

  • @baxsb1578 · 7 months ago · +7

    As someone running an R5 3600X with a 4070 Super and waiting for next-gen CPUs, this is as-good-as-it-gets clarification on putting more eggs in the CPU basket. Thank you!

  • @tyre1337 · 7 months ago · +13

    I've upgraded from a 9700K to a 13600K and my framerate has doubled in certain game scenarios.
    I'm using a 7800 XT at 1440p, and was told by people it wouldn't make a difference.
    For reference, I'm also using the exact same RAM

    • @mmaayyssoonn8858 · 7 months ago · +1

      I had a 5600X clocked to the moon with RAM at super-tight timings, and in a good number of the games I played, the 13600K I switched to either matched it or did worse. I found disabling 3 E-cores actually gave me a substantial lift in performance. I suspect it's to do with clearing up more cache for the other cores.

    • @H94R · 7 months ago · +2

      2x the perf and a few quid back from selling your old hardware; hard to argue with that! People seem to think anything above 1080p just bypasses the CPU and doesn't get held back by outdated hardware as long as you have a shiny GPU 🤣 I went from an AMD Athlon X4 to a 4690K back in the day and the difference was crazy, after being told my HD 7770 (I think) GPU wouldn't really benefit from it.

    • @Personalinfo404 · 7 months ago · +1

      willing to bet the people that told you that were children

  • @moskitoh2651 · 2 months ago · +1

    1. Most recommendations take the 5600 and not the 3600, which is an older generation and slower, to avoid the CPU bottleneck at 4K. Of course, with even slower processors you will get an issue.
    2. When we want to look at 4K, we spend much too much time on the 1080p and 1440p pages.
    3. Regarding frame rates, I strongly doubt that more is better! It needs to match. Adjust the 1% low fps to the Hz of your monitor and configure sync so that it only kicks in when the GPU is faster than the monitor (99% of the time). Enable motion blur. Then 60Hz will be enough for most games.

  • @MichaelChan0308 · 7 months ago · +93

    Glad I upgraded my 3600X to 5800X3D to pair with 4090 for 4K gaming.

    • @45eno · 7 months ago · +10

      The 4090 can get bottlenecked by a 5800X3D at 4K. If it is, make sure to run settings that load the GPU down, like Ultra.

    • @Deathscythe91 · 7 months ago · +7

      You saved up some money to pay the repair shop for when the power plug melts?

    • @samson7294 · 7 months ago · +40

      Enjoy mate! I got my 4090 about a year ago and I love it.
      Upgraded from a 8700k/2080 to a 7800X3D/4090
      Ignore the pocket watchers in the comments!
      All that matters is if you enjoy it.

    • @philliprokkas195 · 7 months ago

      @@samson7294 2.5x the price for 30% more performance over something like a 7900xt looooool

    • @thelmaviaduct · 7 months ago · +11

      ​@Cole_Zulkowski only GPU worth buying

  • @GewelReal · 7 months ago · +47

    I can see that Broadwell-based LGA 2011-3 CPU on the thumbnail 👀

    • @amidk75 · 7 months ago · +4

      E5-2673 v3 here 🙃

    • @winterkoalefant · 7 months ago

      good eye, Gewel!

    • @kainlamond · 7 months ago

      If only it had the L4 cache...

    • @Sylarito · 4 months ago

      Could be a Skylake-X or Cascade Lake-X CPU, which look very similar too.

  • @White-nl8td · 7 months ago · +3

    Forget about resolution. Think of it like this: your CPU has a frame rate and your GPU has a frame rate when running a game. The final frame rate you get is the minimum of these two. If in a game your GPU can deliver 120 FPS but your weaker CPU can deliver only 60 FPS, are you willing to settle for just 60 FPS?
    As mentioned in the video, even when your GPU maxes out at 60 fps, you can always enable DLSS/FSR or lower settings so that it can reach 90 or 120 fps, but what can you do about your weaker CPU? It will bottleneck.
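
    That "two frame rates, take the minimum" model is easy to sanity-check in a few lines (the 60/120 numbers are the hypothetical ones from the comment):

    ```python
    # Effective fps is capped by the slower side: the CPU's simulation/draw
    # rate or the GPU's render rate. Upscaling and lower settings raise only
    # the GPU side, so a CPU limit stays exactly where it was.
    def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
        return min(cpu_fps, gpu_fps)

    print(effective_fps(60, 120))        # 60: CPU limited
    print(effective_fps(60, 120 * 1.5))  # still 60 after upscaling the GPU side
    print(effective_fps(120, 60))        # 60: GPU limited; DLSS/FSR can help here
    ```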

  • @steventn22 · 7 months ago · +3

    This was a great learning experience for me, thank you!

  • @alifahran8033 · 7 months ago · +10

    Yes, I am guilty of saying "the CPU doesn't matter at 4K", but context matters here. I say it when it's a 7600 vs a 7800X3D for 4K gaming with a non-4090 GPU; in my case a 7600 + 7900 XTX. In this case the difference truly is almost unnoticeable without an FPS counter. There could be some very CPU-limited titles that might show the difference between the CPUs, but for the most part it is fine.
    At least for my case (single player, 4K native, Ultra/Very High, mostly highly GPU-demanding games) the CPU doesn't matter as much. I could probably have used a 5600 just fine as well. But who am I to pass on a 150 euro deal for the 7600 one year ago, when at the time the 7800X3D was newly released with a price tag of 450 euro?
    I am personally not willing to pay 3 times more for less than a 5% increase in performance.
    Now that the 7600 costs 200-220 euro and the 7800X3D is 340 euro, it's obvious that anyone who can afford it should buy it. It's really THAT good. But at the time, for the money, for my 4K use case, it simply didn't make sense to buy the 7800X3D.
    Hopefully Zen 6 is on AM5, so I can upgrade to Zen 6 X3D!

  • @klasssavage6581 · 7 months ago · +60

    The Ryzen 5 3600 lived a long and meaningful life. It is finally time to upgrade.

    • @puffyips · 7 months ago · +3

      I did at the beginning of 2023, 3600 to 7800X3D, and I'm still at 1080p @ 240Hz. 200+ 1% lows is a game changer. I can definitely see why people go over 240, but I'm good.

    • @ismaelsoto9507 · 7 months ago · +9

      Put in an R7 5700X3D or 5800X3D and you're good until AM6 :)

    • @deviousnate7238 · 7 months ago · +3

      A year ago my R5 3600 started its 4k 60hz HTPC duty. Once my RX 6800 is free to join it there I expect that combo to last quite a while longer. Long live the 3600!

    • @klasssavage6581 · 7 months ago · +3

      My monitor is 1440p at 144hz. I'm still gaming on the Ryzen 9 5900X.

    • @nipa5961 · 7 months ago · +4

      The 3600 was an amazing budget gaming CPU for the last (almost) 5 years.

  • @Jojo_Tolentino · 7 months ago · +162

    Can't wait for the comments about bottleneck

  • @MrMeanh · 7 months ago · +6

    I have a 4090 paired with a 5800X3D gaming at 4k. I target 90-120 fps when I can (using DLSS and lowering settings) and in many newer games I've noticed that my 5800X3D is a bottleneck in some areas/scenarios. Luckily it's still not at the point where 60+ fps is hard to get most of the time, but in a year or two I suspect that a CPU upgrade will be more needed than a GPU upgrade for me to hit my fps target.

    • @randysalsman6992 · 7 months ago

      It's most likely that the games you're playing benefit less from memory and more from CPU grunt, so with the non-3D R7 5800X I bet you'd have better performance. Also, for the games that would benefit from the X3D's memory, you could get most of it back with Samsung B-die RAM overclocked (or underclocked, lol) as close as it can get to double the CPU's fclk. I find RAM at 3800 MT/s is the best you can get (it's what my 32GB is clocked to), with the CPU's fclk at 1900 MHz. After that you tighten up the RAM's timings and subtimings as low (and stable) as you can get them, and don't worry too much about how far you need to push the RAM voltage, because Samsung B-die just seems to love it; mine is set to 1.55V. Any higher, though, and you'd probably want to start actively cooling those suckers. Oh, and it never hurts to get aftermarket RAM heatsinks and thermal pads, because that lets you use more voltage without active cooling; I did that and I have nothing beyond what my case fans provide. I got my RAM to 3800 MT/s CL14-14-14-14-28 @ 1.55V, plus all the subtimings lowered too, but I can't remember them off the top of my head. lol
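
      A minimal sketch of the clock arithmetic behind that 3800 MT/s / 1900 MHz pairing (the function name is illustrative):

      ```python
      # DDR4 transfers twice per clock, so 3800 MT/s means a 1900 MHz memory
      # clock (MCLK). Zen 2/3 latency is best when FCLK runs 1:1 with MCLK.
      def runs_one_to_one(ram_mts: int, fclk_mhz: int) -> bool:
          mclk_mhz = ram_mts / 2
          return mclk_mhz == fclk_mhz

      print(runs_one_to_one(3800, 1900))  # True: the combo described above
      print(runs_one_to_one(4000, 1900))  # False: FCLK would need 2000 MHz
      ```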

  • @andersjjensen · 7 months ago · +21

    I admire that Steve patiently, over and over again, explains this simple concept. That people can't get it into their heads that any system will cap out, in any given scenario, at the performance of whichever part is slowest in that particular scenario is completely bonkers to me. There are only two bloody factors to pair for gaming: how fast a given CPU can go, and how fast a given GPU can go. The slower one is your resulting performance.
    If you want nightmare fuel, get into database performance tuning. Suddenly you're juggling memory speed, memory latency, core count vs core speed, NUMA domain configuration, and umpteen different storage performance metrics... spread over wildly different tasks

  • @agent4seven856 · 7 months ago · +2

    8700K team here. Still happy with what I'm getting out of it with my 3080 Ti in terms of performance at 1440p/4K, and I OC'd it to 4.8GHz on all cores. I also don't mind a perfectly stable 30 FPS since I also play games on the PS5, and 60 FPS is in no way terrible. Moreover, most AAA single-player games can't even be played at over 120-144 FPS; in some cases they're locked to 120 or 144 FPS, and some are locked to 60. Some games also have bugs and other issues when running at higher FPS. My point is that if you only play single-player games, you'll be fine with a 144Hz monitor, be it 1440p or 4K (16:9, 21:9, etc.), and in some cases a 7800X3D won't help you get more than 120-144 FPS even if it can deliver more, because some games are locked to 60/120/144Hz.
    I don't care about competitive gaming, multiplayer games and such, so it doesn't matter to me if the 7800X3D is better. I also won't be playing games at medium or low settings because I'm perfectly fine with a locked 30/40/50/60 FPS. I mean, the 8700K is 7 years old already, and the fact that it can still provide the level of performance I'm comfortable with is one hell of an achievement after almost a decade. I do plan to upgrade the whole platform at some point, but it's hard to get the latest high-end mobos where I live and... at this point I'd better wait for the Ryzen 8000 series and new mobos. We'll see.

  • @garbuckle3000 · 7 months ago · +4

    This makes a lot of sense and just confirms what CPU reviews say: the CPU makes a difference for high-refresh gaming, meaning above 60 fps. How you get above that threshold (4K low settings or low resolution) is irrelevant. I used to game on a 6600 XT with an R5 3600, and playing at 1080p I noticed a difference when I upgraded to a 5700X. Now that I have a 6950 XT and play mainly at 4K60, there is less need to upgrade my CPU; however, I am still debating it for the times I want to increase my FPS (my display does 4K120) and for multitasking while gaming. For this reason I'm looking at a 7900X3D as an upgrade. I do wonder how much lower prices will go when Zen 5 is announced, though, or if I should just get it now. Thank you, Steve, for explaining this in detail, especially since 4K gaming is becoming more prevalent.

    • @iikatinggangsengii2471 · 5 months ago

      Those were indeed a lot of hard work and learning; I'm pretty happy with and proud of them

  • @jeffreybouman2110 · 7 months ago · +6

    One thing to add to this: I went from a 3600 to a 5800X3D with a 3060 Ti.
    My max FPS stayed the same, but my 1% and 0.1% lows got way better.
    Upgrading my CPU gave me much smoother gameplay ;)

    • @awebuser5914 · 7 months ago

      _"upgrading my CPU gave me much smoother gameplay"_ Yep, good 'ol confirmation bias in full-force...

    • @Amaan_OW · 7 months ago · +3

      @@awebuser5914 Most tests show that the 3D cache chips do have vastly better lows than prior CPUs. And regardless, 3600 to 5800X3D is a big jump

    • @Son37Lumiere · 5 months ago

      @@Amaan_OW Depends on the game; some are more prone to stuttering than others. Many games won't show any difference at all. I went from a 5800X to a 5800X3D with a 7900 XT and basically saw no real difference, as I play at 1440p ultrawide at max settings. A 3060 Ti isn't fast enough to really tax a 3600, even at 1080p.

  • @arradog3212 · 7 months ago · +6

    Recently upgraded from a 5950X with 64GB 3733CL14 to a 7800X3D with 32GB 6000CL30.
    I game at 4K with a 4080 Super, feels much smoother now.

    • @turboimport95 · 7 months ago · +3

      I went from a 5900X to a 7800X3D, both with a 4080, and I could tell a big difference: the lag spikes are now gone.

    • @Gielderst · 7 months ago

      I upgraded from a Ryzen 1700 + RTX 3070 to a Ryzen 7950X3D + RX 7900 XTX Red Devil Limited + 128GB of 6400 MT/s RAM.
      And so far it's been pretty good.
      But I'll try to save up for Zen 5 / Ryzen 9000.

    • @turboimport95 · 7 months ago · +1

      @@Gielderst You won't need the 9000 series for a long time; keep the same system for at least 5+ years, then maybe upgrade the GPU. I made an R7 1700 with an R9 390X and 16GB of RAM last me 7+ years

    • @Gielderst · 7 months ago

      @turboimport95 I know what you mean.
      But now, when I'm in a position where I could potentially upgrade one or two parts at a time
      if I have some saved-up cash, I wouldn't mind that at all. I'm an enthusiast and really like hardware; it's a hobby of mine to try to obtain the best stuff, if I can, of course. But that's not always the case, because it's not easy to save up for these things. So I'll see. I hope to have enough saved up by the time Zen 5 and, for example, a Ryzen 9950X3D is out. Time will tell.

    • @turboimport95 · 7 months ago · +1

      @@Gielderst I would wait and watch the benchmarks first. If the 9950X3D does not improve fps or performance by at least 50%, I would hold out for the generation that does. A 10-30% increase is not worth it at all.

  • @stefanrenn-jones9452 · 7 months ago · +1

    8:45 I bet the 3600 stutters like hell in that scenario even though the fps are the same. The moral of the story is that you need CPU headroom, regardless of the framerate, resolution or settings used, to prevent stutter and bad frame times, even with VRR enabled. And even with CPU headroom, micro-stutters can occur because of poor map or game optimization/design leading to poor frame times.

  • @AronHallan · 7 months ago · +7

    It does. Turn on ray tracing and go to a very crowded area in Cyberpunk.

    • @thetruth5232 · 7 months ago · +3

      Yup. My 5800X3D runs out of steam well before my 7900 XTX with RT+PT at 1440p60. The frames drop into the 50s while the GPU runs below 80%.

    • @andersjjensen · 7 months ago · +1

      @@thetruth5232 Yikes. My 7950X3D does 90-95 FPS very solidly, also on an XTX at 1440p. My settings are, however, a custom mix of RT and raster.

  • @Rogerkonijntje · 7 months ago · +2

    I always want to be GPU limited; you can upscale for better frames. There is no "upscaling" for the AI in Age of Empires, for the physics in BeamNG, or for all the logic in most Paradox games (Stellaris, HOI4, CK3, Victoria, ...), etc.

  • @christophermullins7163 · 7 months ago · +7

    I made this same comment on Daniel Owen's video on the subject. The only reason this video needs to be made, or holds value for the community, is that some people do not understand how a PC works with regard to a GPU or CPU limitation. It is indisputable that CPU-limited scenarios are less likely at higher resolution. It is ALWAYS beneficial to have a faster CPU in all scenarios; that is obvious. However, if your favorite game is very GPU demanding and you are OK playing 4K 60fps, then a CPU upgrade will most likely have ZERO impact on that situation, whereas it would at 1080p. Not every game/scenario can be treated like this, but a LOT can be. This video is helpful only for the ignorant gamers who do not understand PCs on that level. This is no different from seeing Steve get frustrated with people who continue to demand "real world testing" by showing CPU benchmarks at 1440p or 4K. That is simply not how it works, and this video is for the same people asking for those "real world benchmarks". No offense to anyone.

    • @martineyles · 7 months ago · +2

      Except the video shows that the 3600 struggles with 1% lows and can't make 4k60 in several of the games. We only know that because 4k charts are there. The 1080p charts tell us literally nothing about the 1% lows the CPU can achieve at 4k.

    • @tiimhotep · 7 months ago · +5

      @@martineyles Go touch some grass, man; you've replied to everyone's comments saying the same thing. If you think further testing is needed, or that they haven't proven why 4K testing isn't necessary, you simply didn't understand the video.

    • @martineyles · 7 months ago

      @@tiimhotep I think that there is nuance missing and that most other people don't understand that.

    • @awebuser5914 · 7 months ago · +2

      Maybe it's that Steve is obsessed with being "right", even though it's invariably pointless edge-cases? That sounds more like it to me...

    • @Son37Lumiere · 5 months ago

      @@awebuser5914 There's opinion and there's fact. What Steve is showing is simply factual data. But as with everything in life, the devil is in the details and there are many variables.

  • @VicharB · 7 months ago · +1

    Thanks for this; it clarifies certain questions I had. Good one, kudos. In the meantime I am happy with my upgrade to an R7 7700 with 32GB of 6400 CL30, retaining my older 3080 Ti and able to play 4K games, when required using frame generation mods, as in Alan Wake 2.

  • @mttkl · 7 months ago · +8

    Just a couple hours in and, as usual, a third of the comment section is still not getting the point lol.
    Steve: "Don't focus on the components, focus on the FPS"
    Comment section: "But WHY a 3600!!1, you should have tested a 5800X3D! All reviewers are dumb and doing everything wrong!"

    • @martineyles · 7 months ago

      Focus on the average FPS, but ignore the 1% low, frame stuttering etc. That seemed to be his message.

    • @enderfox2667 · 7 months ago

      ​@@martineyles He summarized his point in the last segment. The message is: "It's the framerate that's important, not the resolution. Looking at heavily CPU limited results that are often showing sub 60 FPS aren't useful for gauging CPU performance.". Hope this helps ❤

    • @martineyles · 7 months ago · +2

      @@enderfox2667 Looking at the 4k charts is what tells me whether a CPU can handle 4k60. That's the information I want to know, because my TV has a native resolution of 4k and a maximum framerate of 60fps. I just want 4k charts for the CPUs I'm interested in buying in the games I'm interested in playing.

    • @enderfox2667 · 7 months ago

      @@martineyles A CPU can handle 4k at the same fps as it would at 1080p. The bottlenecks there all come from the gpu. You can check the graphs in this video, and if you listen to Steve, he's trying to explain it to you! (if you dont want to watch the full video, you can skip to the summary at the end)

  • @nathangamble125 · 7 months ago · +1

    I really love this video. It shows a ton of the nuance involved in bottlenecking which is overlooked in typical benchmarking videos.

  • @nipa5961 · 7 months ago · +20

    It's so sad to see how many in the comments didn't watch or didn't understand the video.

    • @olo398 · 7 months ago · +1

      welcome to the internet?

  • @catwrangler420 · 7 months ago · +1

    Been waiting for a CPU 4k video from you guys, thank you for the hard work!

  • @qrcus · 7 months ago · +2

    5:00 "....7700xt which is a fairly low-end gpu by today's standards" Aren't you a bit delusional?

  • @garrettkajmowicz · 7 months ago · +4

    I'm still running a Ryzen 3600. I still don't see a large amount of value in upgrading so far. But then again I don't game a lot, am not very FPS sensitive, and wear dirty glasses. I mostly want my code to compile faster.

  • @band0lero · 7 months ago · +4

    7800X3D is a beast! I regret not giving enough attention to CPU performance in the past. Upgrading from 5800X to 5800X3D to 7800X3D has been great. Can’t stress enough how much better it is in 1% lows and overall performance stability.

    • @puffyips · 7 months ago · +2

      200+ 1% lows is a game changer.

    • @TheGuWie · 7 months ago

      This is interesting information for me (running an X570 board, a 5800X CPU, a 4090 GPU and a 240Hz monitor). Thx. But upgrading isn't just money; it's also time and stress.

    • @Son37Lumiere · 5 months ago · +1

      The 7800X3D is only about 15% faster than the 5800X3D, give or take. Performance is essentially identical at 1440p or above in most cases, though you could still see some improvement at 1440p with a 4090.

    • @band0lero · 5 months ago

      @@Son37Lumiere It really depends on the games and the performance target. For AAA gaming you might be right, but that's usually more of a GPU-bound scenario anyway.
      I'm seeing the most difference in indie/badly optimized titles on low graphics, and in those scenarios I'm getting over 20% gains in average FPS at 1440p, even with an RTX 3080 12GB. Not to mention the 1% low gains, shorter stutter spikes and all that.

    • @Son37Lumiere · 5 months ago

      @@band0lero Yes, lower graphics/settings will obviously put you in a CPU bound scenario which will show gains for the 7800x3d. But we're talking about one 3d vcache chip vs another here, the 7800x3d isn't that much faster. The 5800x3d already has very good 1% lows. Also, if you're running a decent mid range GPU (like a 3080/4070 ti/7800 XT) you're probably getting 200+ fps in most indie titles as they generally aren't that demanding.

  • @The_Noticer. · 7 months ago · +7

    The Nvidia overhead issue is never acknowledged on forums; it's really weird.

    • @Son37Lumiere · 5 months ago · +1

      Same reason their driver issues are never acknowledged. Because nvidia's fanbase are hardcore fanboys.

  • @dkphantomdk · 7 months ago · +3

    You should really put the GPU usage up next to the results; it would be nice to see how GPU limited it is at those settings.

  • @leyterispap6775 · 7 months ago · +2

    Even in situations where I happened to upgrade the CPU before the GPU, I found it interesting that in many games, although the average framerates didn't change, the 1% lows felt way better and buttery smooth, even when paired with a low-tier GPU.

    • @benjaminoechsli1941 · 7 months ago · +4

      That's fairly typical of a CPU upgrade. Definitely welcome!

  • @lazyboygamech · 7 months ago · +5

    If you compare against the normal 5600, the gap wouldn't be that significant, because Ryzen 3000 was never known for good single-core IPC; it just gave you more cores to make a better offer than Intel. It's Ryzen 5000's new architecture that changed the deal!

    • @CallMeRabbitzUSVI · 7 months ago

      Thank you. We all know he deliberately picked an older CPU against an X3D; if he'd at least tested a 5600X3D the numbers would be much closer

    • @robthegobbler · 5 months ago

      @@CallMeRabbitzUSVI he did say it was a deliberate extreme example.

  • @larsjrgensen5975 · 7 months ago · +1

    Thanks for the bonus info of Low+RT vs Ultra settings

  • @MaxIronsThird · 7 months ago · +3

    Why do people still not understand this?
    If a CPU reviewer does a 240p run and the 0.1% lows are 200fps, you know that at 16K it will have the same 0.1% lows (IF YOUR GPU CAN MATCH IT).
    The PC is always as fast as its slowest component (which one varies depending on workload)

  • @AvroBellow · 7 months ago · +1

    To Steve:
    Steve, this was another great video from you, and I swear that you must have the work ethic of a Japanese robot, because what you achieve through all of your hard work is just astonishing. This is one of the most misunderstood aspects of PC gaming, and you did a fantastic job showing that while CPU performance at 4K isn't as relevant as GPU performance, it still matters because it sets the minimum gaming performance that a PC is capable of, regardless of what video card is under the bonnet (I think that's how Aussies say it). I tip my hat to you and I'm really considering joining you on Patreon or Floatplane. I think that would be pretty cool. Good on ya mate!
    To everyone else:
    Steve's 100% right. It's not that the CPU is AS important as the GPU; it's that you need a CPU that's fast enough to deliver the framerate you want. This is because CPUs aren't really affected by graphics settings, as the number of draw calls made to the GPU will be roughly the same regardless of the resolution.
    A CPU bottleneck is far worse than a GPU bottleneck because there's really no way around it. I mean, sure, if you have Firefox open with 30 active tabs, closing it would probably go a long way towards alleviating the CPU bottleneck, but when Steve's testing CPUs in gaming, there's NOTHING open to slow them down. A GPU bottleneck is more common but far less critical, because you can turn graphics settings down, employ upscaling or lower the resolution to get those framerates up. A CPU's max framerate, however, is its max framerate.
    Ten years ago, you could OC the CPU to get better framerates (assuming that your cooling solution was good enough). This is why my attitude towards overclocking was to only do it when the CPU was getting old and no longer giving me the framerate I wanted. I would OC the CPU to buy time to save up for an upgrade, and the only CPU I never had to do that with was my FX-8350, because it gave me playable framerates for the whole five years I used it.
    However, once I upgraded to an R7-1700, overclocking became more or less irrelevant because modern CPUs auto-clock themselves as high as possible on their own. While you technically can overclock CPUs like the R7-7700(X) and i7-13700K, the gains possible by taking that route are nowhere near the gains that were possible with CPUs from the FX/Sandy Bridge era, an era that ended about seven years ago when things switched to the Ryzen/Skylake era we live in now.
    Overclocking becomes even less relevant when one considers that the best gaming CPUs are AMD X3D CPUs, and they can't be manually overclocked AT ALL. I actually like that, because X3D CPUs, despite their incredible gaming performance, are forced to have relatively low TDPs and are thus much easier to cool, since there is no PBO Max setting for them. This means that if I were to upgrade my R7-5800X3D to an R7-7800X3D, I would still be more than fine using the AMD Wraith Prism cooler that I got for free.
    This is also why I tell new builders to get an AM5 CPU that doesn't have a suffix like X or X3D, because they get a usable cooler with it for less than an X model costs without one. I usually recommend the R7-7700 because it comes with a Wraith Prism that is not only far better than the Wraith Stealth, but easily one of the most beautiful air coolers ever made, and it scratches that itch that some young'uns have for RGB aesthetics (and I totally get that because I like it too) without them having to resort to some $100 240mm AIO. An AIO on an R7-7800X3D is a complete waste of money, and that $100 would be far better spent on faster RAM, which, while not as relevant for an X3D CPU (because the 3D V-Cache is faster than the fastest RAM anyway), is completely relevant for unlocking the performance of CPUs with suffixes like X, K and F, as well as CPUs with no suffix at all.

  • @vulcan4d · 7 months ago · +5

    A 5700X3D vs the 7800X3D would have been an interesting comparison with a 7700 XT. I bet you they'd be identical until you hit a 4070 Ti Super or better.

    • @Son37Lumiere · 5 months ago · +1

      You would be correct. The 7700 XT is not fast enough to show any difference. The 7800x3d is only 15% faster than the 5800x3d with a 4090 at 1080p.

  • @walter274 · 7 months ago · +1

    I liked the illustration of how upscaling doesn't help when CPU limited.

  • @ArmchairMagpie · 7 months ago · +14

    I think the reason the CPU is often disregarded is that people forget there are operations that can only be done on the CPU side. The CPU also has to wait for GPU operations to finish, but when it is finally its turn again it can do its work, and the faster the CPU, the faster it gets that work done. This may be a round of logical or organizational tasks like sorting or iterating through collections, etc., and there is a real technological limit. The CPU isn't just there to present you the Save & Quit button; it's actually in charge of the whole game.

    • @GoldenSun3DS · 7 months ago

      I like the analogy that the CPU/RAM is the foundation of the computer and determines WHETHER you can play the game (or what FPS you can achieve) and the GPU determines how good the game will look.
      The argument of "CPUs don't matter as much at 4K resolution" isn't wrong, but it's using the wrong variable in the argument. What people are actually meaning is "CPUs don't matter as much if your target is 30FPS or 60FPS".
      I've seen benchmark videos before where the point of the video is to show how much performance you're missing out on with an old/weak CPU (saying that it's a waste to upgrade GPU if you're still on an old/weak CPU), and in their own graphs on screen, the old/weak CPU was still mostly hitting the 60FPS goal at maximum settings. Their words were saying one thing, but their own evidence was contradicting their point.
      A lot of people still are fine with or prefer 60FPS gaming, and oftentimes if someone is buying a 4K display, it's a 60Hz screen. So in a roundabout way, "CPUs don't matter as much at 4K resolution" is still TRUE unless that person specifically bought a high refresh rate 4K display AND they prefer high refresh rates.
      But with upscaling technology, your hardware still won't actually be running at 4K resolution, more like 1080P or 1440P.
      I love emulation, though, so I actually prefer a high end CPU + a medium end GPU. With emulation, even medium range GPUs can do 4K 60FPS, but the CPU/RAM is much more important.

  • @HazewinDog · 7 months ago · +1

    You betcha. I went from a heavily overclocked 3700X to a 5800X3D and that upgrade alone increased my fps by 102% in Assetto Corsa. No, not 2%. 102%. More than a doubling. Admittedly Assetto Corsa is quite an edge case, but needless to say, I'm holding onto this puppy and if I ever upgrade again, it will be to another cache monster.

  • @4kORCHILL · 7 months ago · +3

    Dang, what camera are you using to film this? The video is crisp

  • @ominoussage · 7 months ago · +2

    Very insightful video! Hope a lot more people watch this because this opens up the right way of thinking when buying for more performance. GPUs aren't the only things that improve performance!

    • @soppaism · 7 months ago · +3

      Which is: when optimizing for performance you need to balance your CPU, GPU and FPS target. Thinking about them in isolation is the cause of all confusion.

  • @terr281 · 7 months ago · +4

    Another reason to test at "low" resolutions (see the video's comments around the 21m mark) is that 1080p is still the preferred resolution for many gamers. Yes, 1440p (especially via upscaling) is gaining traction, but it has a LONG way to go. (The fact of the matter is that gaming at 60, or even 120, fps at 1080p costs much less when you factor in overall system costs.)
    My household has yet to upgrade past 27" IPS 1080p 60Hz good-response-time monitors. (Our monitors from 2018 haven't died yet.) Based on pricing today, our next monitors would be 27" IPS 2K 100Hz ones with good response times. (There is no need for 4K, in our opinion; you have to get there through software upscaling.)

    • @sjneow · 7 months ago · +1

      Yup, Steam Hardware Survey data also suggests that most screens out there are still 1080p

    • @ThunderingRoar · 7 months ago · +2

      @sjneow And how many of those are shitty 8+ year old laptops? Tim's Amazon referral data, even from a few years ago, showed that the majority of new monitors purchased are 1440p.

    • @terr281 · 7 months ago

      @@ThunderingRoar I'd argue it doesn't matter for testing purposes, especially since "decent enough" integrated graphics are now becoming the norm. Yes, these aren't in those 8+ year old laptops, but they are in the new ones.
      The enthusiast (desktop) gaming market first began to be killed by consoles, then "good enough" laptops, and now the new generations (in general) care more about gaming on their phones and tablets. In reality, I'll admit it: I lied about 1080p being the most played gaming resolution. It is actually 360x800. (I had to look it up.)

    • @robthegobbler · 5 months ago

      My opinion: 1080p 144Hz on laptops, and for desktop 1080p 24" 144Hz, 1440p 27" 144Hz, or 4K 32" at whatever refresh rate you want. For beginners, always start at 1080p 144Hz and just occasionally ask whether they have a genuinely strong desire for more clarity, at which point upgrade the monitor. The same resolution on a smaller panel has higher pixel density and looks even clearer.

  • @gametime4316 · 7 months ago · +3

    I actually would love a deep dive into what happens with X3D CPUs when they become CPU limited and not latency limited.
    I believe the X3D CPUs are only good for extreme FPS because they improve latency a lot, but sooner or later a game will come along that pegs them at 90-100% *CPU use* (like what has happened to 4c/4t CPUs today), and then they will be just as good as the same CPU without the 3D V-Cache.
    The best way to test something like that is to take a 7700X and a 7800X3D, drop the CPU frequency to 1 or 2GHz, and test with heavy CPU games that have good CPU scaling :)

    • @thetruth5232 · 7 months ago · +1

      The 5800X3D struggles to keep 60fps in Cyberpunk with ray tracing. It jitters just like any other CPU when it drops into the 50s at 90% usage.

    • @Deathscythe91 · 7 months ago · +1

      "what happens with X3D CPUs when they become CPU limited"
      Your whole story here already answered your own question

    • @gametime4316 · 7 months ago

      @@thetruth5232 Are you sure it's not your GPU? Do you see low GPU usage?
      My 12700 holds over 100 FPS with PT if I test it with Ultra Performance DLSS
      (4070 Ti, 1440p, Ultra Performance DLSS)

    • @thetruth5232 · 7 months ago · +1

      @@gametime4316 Go to market areas with many NPCs and high crowd densities. CPU usage spikes there, from 50-65% up to 90%. I play 1440p60 with RT+PT and my XTX runs at 80-90%.

    • @gametime4316
      @gametime4316 7 months ago

      @@thetruth5232 Give it a try at 1080p or even 720p with the same FSR setting; if your FPS doesn't go up, you really are CPU limited, and it's quite shocking. (See the sketch below.)

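      The logic of that test, as a minimal sketch with made-up numbers:

      def looks_cpu_limited(fps_native, fps_dropped, tolerance=0.05):
          # If dropping the render resolution barely moves the frame rate,
          # the GPU was not the limiter - the CPU (or the engine) was.
          return fps_dropped < fps_native * (1 + tolerance)

      # Hypothetical readings from the same scene:
      print(looks_cpu_limited(fps_native=58, fps_dropped=60))  # True  -> CPU-bound
      print(looks_cpu_limited(fps_native=58, fps_dropped=95))  # False -> GPU-bound
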
  • @stylist_bitter6194
    @stylist_bitter6194 7 months ago +1

    Cheers Steve, this is a great piece of content. I conceptually understood this, but this video did a great job of demonstrating the point and driving it home.

  • @englematic
    @englematic 7 months ago +4

    lmao it's pretty funny that you're using an R5 3600 for the low end. That's the exact CPU I have in the computer that's hooked up to my 4K TV, along with an RX 6800. I realize it's a pretty unoptimized build, but it was more of a spare parts PC than anything for streaming movies and the occasional older platformer or cozy indie game.

    • @christophermullins7163
      @christophermullins7163 7 months ago +2

      Honestly, that is a great balance for 4K. Similar balance to my setup: 5600 @ 4.7GHz and a 6950 XT at 4K.

  • @ShredZ79
    @ShredZ79 7 months ago

    It's great to see someone actually pay attention to this. Graphics can usually be tweaked down, be it via DLSS, Nvidia scaling (very useful for ultrawides), or whatever trick exists for a given game. For example, I ran into a CPU limitation on my 9700K in Cyberpunk, and after upgrading to a 7800X3D (following the same reasoning this video is about), I can run pretty much any game at whatever FPS I like on my 3080/160Hz ultrawide. Good stuff, guys. Enough with all the ultra 4K benchmarks as a general rule.

  • @paulmaydaynight9925
    @paulmaydaynight9925 7 months ago +4

    so... 8K 120fps is still unobtainable (4 years after 8K 120fps TVs became available in Japan)

  • @Swecan76
    @Swecan76 7 months ago +1

    It matters for me. I have a 5800X3D and I get CPU limited in several games with my RTX 4090, because the CPU isn't strong enough to feed frames, especially RT frames, to the GPU. It gets bogged down in some areas - easy to see when GPU utilisation doesn't hit 99%. (A polling sketch follows below.)
    I am upgrading to the next X3D, or maybe Arrow Lake - basically whichever is first out the door.

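    That "GPU utilisation below 99%" check can be polled from a script - a rough sketch, assuming a single NVIDIA GPU with nvidia-smi on the PATH:

    import subprocess
    import time

    # nvidia-smi prints one line per GPU; a single-GPU system is assumed here.
    QUERY = ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"]

    for _ in range(30):  # sample once per second for 30 s while playing
        util = int(subprocess.check_output(QUERY).decode().strip())
        tag = "likely CPU-bound" if util < 95 else "GPU saturated"
        print(f"GPU utilisation: {util:3d}% ({tag})")
        time.sleep(1)
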
  • @Exitar15
    @Exitar15 7 months ago +25

    Raise your hand if you have a 4K monitor...

    • @Garde538
      @Garde538 6 months ago +2

      Not many...

    • @electricant55
      @electricant55 6 months ago +2

      The monitors are dirt cheap; driving them is the problem. Any rusty trash pile of silicon can get 120+ fps at 1440p.

    • @abdullaal-naimi9075
      @abdullaal-naimi9075 5 months ago +2

      I got my brother's 4K monitor a while back when I built my current PC with an RX 6800 XT, and I'm absolutely "downgrading" to a 1440p 120+ Hz monitor soon. Game devs can't be bothered to support last-gen high-end cards, and/or Nvidia and AMD are pushing some back-breaking tech to get you to buy newer cards. I'm not playing that game, no pun intended.
      The only real upgrade I'm considering is a 5700X3D to replace my 5800X.

    • @bm373
      @bm373 2 months ago +2

      4K 120Hz is the way to go! Can't wait for 8K!! ✌️🇺🇲

  • @Gindi4711
    @Gindi4711 7 months ago +1

    I think the main point is:
    - If you buy a low-end gaming CPU like the 7600X for $200, it is fine for now, but as games get more demanding you will need to upgrade it in around 3-4 years with your next GPU upgrade.
    - If you buy a 7800X3D now, it will last more like 6-8 years.

    • @puffyips
      @puffyips 7 months ago

      I chose the 7800X3D in 2023 as my 5-year CPU, to last until PS6 hardware is being talked about.

  • @AngryChineseWoman
    @AngryChineseWoman 7 months ago +12

    Tech Deals likes this

  • @ThorDyrden
    @ThorDyrden 7 months ago

    Thanks - indeed, that corrected my CPU evaluation somewhat.
    I knew it wasn't irrelevant, as my old i7 8700K (also a 6-core) was indeed already limiting my gaming performance with a 3070 @ 1440p - and was one reason for last year's upgrade.
    But your numbers imply a rule of thumb: the CPU you choose defines your maximum fps, measured at low res/details in a given game - so shop for the GPU that supports the fps you target at your intended settings. In other words, set the CPU bottleneck high enough that the GPU selection defines your FPS. (A toy sketch of that rule follows below.)

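    That rule of thumb reduces to a single min(); a toy sketch with invented numbers:

    def effective_fps(cpu_cap, gpu_cap):
        # Whichever component runs out of headroom first sets the frame rate.
        return min(cpu_cap, gpu_cap)

    cpu_cap = 140  # fps the CPU can feed in this game, measured at low res
    for gpu, gpu_cap in [("mid-range GPU", 90), ("high-end GPU", 160)]:
        limiter = "GPU" if gpu_cap < cpu_cap else "CPU"
        print(f"{gpu}: {effective_fps(cpu_cap, gpu_cap)} fps ({limiter} limited)")
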
  • @Adnyeus
    @Adnyeus 7 months ago +5

    Honestly, this comparison would be better for me if the 5600 and 7600 were also included.

    • @thegreathadoken6808
      @thegreathadoken6808 7 months ago +3

      The point of the video isn't to give 3dge433 the benchmarks for the precise hardware they need to know about, but to make a point.

    • @martineyles
      @martineyles 7 months ago +1

      @@thegreathadoken6808 It doesn't make any point particularly well, and using only these two CPUs doesn't help.

    • @wertyuiopasd6281
      @wertyuiopasd6281 7 months ago +4

      @@martineyles Because you didn't understand the point of the video.
      The throughput limit of a CPU at 1080p is the same at 4K; what changes is whether the GPU can deliver that many frames.
      If the CPU can only drive 80 frames per second at 1080p low or medium, for example, it won't deliver any more frames than that at any resolution.

    • @martineyles
      @martineyles 7 months ago +2

      @@wertyuiopasd6281 I'd rather see the proof in a 4K chart than have to hope it's the case in every review that doesn't show one.

    • @enderfox2667
      @enderfox2667 7 months ago +1

      @@martineyles But this video has 4K and even 1440p charts, a lot of them.

  • @eswecto6074
    @eswecto6074 7 months ago

    Finally, a good 4K review; a little more 4K upscaling/frame-gen performance data and it would be perfect.
    For 4K60 max settings you can easily go with an R5 5600/R7 5700 + 4070 Ti Super/7900 XT and get the best bang-for-buck combo.
    I still have doubts about 16MB vs 32MB of L3 cache at those settings. As a B450 user I am leaning more towards the R7 5700; on B550 I would maybe pick the R5 5600.

  • @zodiacfml
    @zodiacfml 7 months ago +4

    Ironic that as you increase resolution, you actually have less need of an expensive CPU platform. Consider that the AM4 Ryzen 3600 is now dirt cheap while the 7800X3D likely costs five times as much. If I simplify these results, you only need the X3D CPUs for competitive or high-fps gaming.

    • @yt-mull0r
      @yt-mull0r 7 months ago +2

      What "you" need isn't what anyone else needs. The point of the video is that the CPU does matter in a lot of scenarios, e.g. when fiddling with resolution and/or quality settings.

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 7 months ago +1

      Exactly! At 4K high/ultra the gains were marginal, not warranting paying five times the price for the high end. That money can be spent on a better GPU.

    • @zodiacfml
      @zodiacfml 7 months ago

      @@CallMeRabbitzUSVI I've been dreaming of getting an X3D CPU someday because it looks so good in benchmark charts, only to realize that I don't play FPS games. Consider also DLSS and frame generation, which increase fps without help from the CPU.

    • @vigilant_1934
      @vigilant_1934 7 months ago

      @@CallMeRabbitzUSVI Some of that money saved should also go to a better CPU so you can build a balanced system; a cheap low-end CPU and a super fast high-end GPU don't match. Also, the 7800X3D is not expensive or high-end anymore: it has been under $400 for a while, and you could probably find it used for $300. That CPU will last you years longer than a 3600. It's an investment, not a one-time cost, so you need to think long term, not just about how much you can save now. The lowest-end CPU that won't be the bottleneck most of the time when paired with a fast high-end GPU is a 5700X3D, though I wouldn't go above a 4080 Super as far as the GPU. A balanced system with an inexpensive but fast CPU and a fast high-end GPU would be a Ryzen 7 7700(X) and a 4090. With a Ryzen 5 3600 I wouldn't go above a 3080/6800 XT, while a 3070 or 6700 XT would be a better balance.

  • @RyanProsser0
    @RyanProsser0 7 months ago

    Steve, I've listened to days' worth of your conclusions and Q&A answers.
    And I gotta say, you really upped the IQ of the writing and delivery of the explanations in this video.
    Top job, and thanks for all you do.
    My next CPU upgrade will be from the R5 3600 to, most likely, the 5700X3D.
    Cheers

  • @justinpatterson5291
    @justinpatterson5291 7 months ago +20

    Hopefully my 5800X3D still sits well on these charts.

    • @christophermullins7163
      @christophermullins7163 7 months ago +12

      We will never know as.. he didn't nest it here 😭

    • @calisto2735
      @calisto2735 7 months ago

      @@christophermullins7163 Bruh... there are plenty of older videos with the 5800X3D on the channel; you can infer.

    • @DavideDavini
      @DavideDavini 7 months ago +4

      @@christophermullins7163 I didn’t know CPU’s nested. 🤣💀

    • @christophermullins7163
      @christophermullins7163 7 months ago +4

      @@DavideDavini :P

    • @DavideDavini
      @DavideDavini 7 months ago +3

      @@christophermullins7163 it’s a very nice typo mate. Cheers. 🙂

  • @darreno1450
    @darreno1450 7 months ago +1

    With the introduction of the 3D V-Cache chips, buying a particular CPU to future-proof at higher resolutions isn't always straightforward. In those cases, how much future titles might take advantage of the 3D V-Cache should be considered. We've seen how well a V-Cache-friendly title can perform even at 4K.
    As for whether 4K data should be included or not: at a glance it immediately answers the question of what one's particular CPU (or future purchase) is capable of at 4K with the best graphics cards CURRENTLY on the market, instead of having to extrapolate from the lower-res data.

    • @motmontheinternet
      @motmontheinternet 7 months ago

      "At a glance they immediately answer the question about what one's particular CPU (or future purchase) is capable of at 4k with the best graphics cards CURRENTLY on the market instead of having to extrapolate based on the lower res data." No it doesn't, it just checks a box. If a game is GPU bound then you have no idea if the CPU you're buying is a good purchase or not and the benchmark is void. If I can buy a $200 CPU vs a $400 CPU and your benchmark shows the same results (because you're just GPU bottlenecking the computer) then your benchmark doesn't help my purchasing decision in the slightest.

    • @darreno1450
      @darreno1450 7 months ago

      @@motmontheinternet "If I can buy a $200 CPU vs a $400 CPU and your benchmark shows the same results (because you're just GPU bottlenecking the computer) then your benchmark doesn't help my purchasing decision in the slightest." How come? It's telling you that at 4k, you gain nothing by purchasing the $400 CPU even if you use a 4090.

  • @lharsay
    @lharsay 7 months ago +13

    "CPU doesn't matter at 4K" was more or less true when the RTX 2080 Ti and later the RTX 3090 were the top GPUs, but with the 4090 it's very apparent that the growth in GPU performance has outpaced CPUs. I would even go as far as to say that none of today's CPUs are good enough to fully utilize a 4090 in every game at 4K.

    • @EBMproductions1
      @EBMproductions1 7 months ago +4

      Next gen might show this.

    • @tommihommi1
      @tommihommi1 7 months ago +3

      Even if your max framerate is limited by the GPU, a faster CPU will still make the game run better overall, with fewer spikes.

    • @gametime4316
      @gametime4316 7 months ago +2

      Well, you are right and wrong at the same time.
      It's all about the game. Cyberpunk 2077 with PT runs at something like 20FPS on a 4090 at 4K with no upscaling... that won't be limited by an Intel 6700K.
      So if you count on that scenario, you can say the 6700K doesn't limit a 4090.
      And if you look at CS2 at 1080p, you can say the 7800X3D limits a 4060...

    • @Tech2C
      @Tech2C 7 months ago +1

      Which begs the question: will we see much of an uplift with the RTX 5090 if CPUs can't keep it fed?

    • @totalermist
      @totalermist 7 months ago

      @@Tech2C Depends on the game, I guess. Titles like Alan Wake 2 or Cyberpunk 2077 with PT will surely see a big uplift in performance, whereas e-sports titles like CS2, or more CPU-constrained ones like Baldur's Gate 3, probably won't see much of a difference.

  • @budthecyborg4575
    @budthecyborg4575 7 months ago

    19:05 For AMD Radeon owners it's pretty much universally better to never use upscaling, and for Nvidia owners it's still often best to keep running native 4K rather than DLSS.
    While a 30% framerate uplift isn't "nothing", considering you're cutting the render resolution in half, this is basically never worth the trade-off for FSR versus native rendering, and even DLSS can dramatically hurt image quality in scenes with a lot of random detail, like distant grass and trees. (See the pixel-count sketch below.)

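    For scale: the per-axis factor squares into the pixel count. A small sketch using commonly cited DLSS/FSR quality-mode factors (treat the exact numbers as assumptions and check your game):

    MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
    target_w, target_h = 3840, 2160  # 4K output

    for mode, s in MODES.items():
        w, h = round(target_w * s), round(target_h * s)
        # Half the resolution per axis means a quarter of the pixels rendered.
        print(f"{mode}: renders {w}x{h} (~{s * s:.0%} of the 4K pixel count)")
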
    • @motmontheinternet
      @motmontheinternet 7 months ago

      I hate using upscalers, and I agree with you, but you're ignoring XeSS, which still isn't great but is a lot better than FSR 2.1. Radeon users can use that if they want an upscaler, not FSR.

    • @budthecyborg4575
      @budthecyborg4575 7 months ago

      @@motmontheinternet Personally, the scenario that convinced me to swap a 3080 for a 7900 XTX was Flight Sim 2020 (nearly a thousand hours played).
      Even in 2022, when the 3080 was high end, I much preferred native 4K over DLSS.
      Of course, almost the entirety of the content in MSFS 2020 is natural foliage, so my use case is specific.
      Nonetheless, DLSS is not the panacea it is often promoted to be.

  • @seamon9732
    @seamon9732 7 months ago +6

    Love my 7800X3D.
    The CPU market is very good at the moment... GPUs, not so much.
    For roughly the same price, all I could get was an RTX 3070.
    And no, I don't upgrade them at the same time, before anyone says I "should've put more of the budget into the GPU".

    • @kerotomas1
      @kerotomas1 7 months ago +4

      You would have been way better off with a 7700 XT or 7800 XT; 8GB of VRAM is kind of pathetic these days, especially at the price of a 3070. Last-gen RX 6800s are dirt cheap now too.

    • @redlt194
      @redlt194 7 months ago

      @@kerotomas1 Unfortunately, AMD video cards may as well be invisible to most. Their loss.

    • @andersjjensen
      @andersjjensen 7 months ago +3

      An RTX 3070 is also a larger chip (on an older node), plus 8GB of VRAM, half a motherboard's worth of components, and a non-trivial cooler. Why people compare CPU prices and GPU prices is beyond me.

    • @seamon9732
      @seamon9732 2 months ago

      @@kerotomas1 The 7000 series wasn't out yet; it was during the crypto crash. I also like to play with RT on, use DLDSR, and much prefer the more accurate and faster DLSS (you can go with a lower setting and still retain image-quality parity with a higher FSR setting).

  • @Flaimbot
    @Flaimbot 7 months ago +34

    And despite this video, there will still be people who don't understand why CPU benchmarks are done at lower resolutions.

    • @martineyles
      @martineyles 7 months ago +8

      Mostly because they are wrong. He ignored the 1% lows not hitting 60fps at 4K60. I don't care about getting a massive framerate, but a nice, reliable 60fps without stuttering is important. I saw whether a 3600 can hit it, but that's useless, as I won't be buying one. We need to see the 5600 and 7600 included too, as those are the CPUs we would buy on a lower budget.

    • @rooster1012
      @rooster1012 7 months ago +12

      @@martineyles You are missing the point: people make ignorant comments saying the CPU doesn't matter for 4K gaming, and he is showing that it does.

  • @robertcarhiboux3164
    @robertcarhiboux3164 7 months ago

    When I said the same thing you've demonstrated here a few years ago, people called me an idiot and not knowledgeable as I explained the importance of GPU/CPU pairing to avoid losing performance on one side or the other. Many people over-weight the CPU or the GPU and end up wasting 30% of the budget and 30% of the performance, when they could have spent that 30% on balancing everything, or simply saved the money. It is a physical bound that keeps growing over time: a bottleneck calculator may tell you there is only a 2% bottleneck when in fact there is a 70% bound in absolute theoretical numbers that will sooner or later become reality. If you want to keep your computer as long as possible this is important to consider, and benchmarks like these are very useful for illustrating the value of pairing balanced components. For instance, over-weighting the GPU can make the budget climb 30% for maybe a 2% performance gain over a 30% cheaper model. Going the other way, you save that 30%, can put some of it into better RAM, and it also helps with the tiny fps drops that sooner or later leave you short of a stable 60fps. In the end you spend 20% less money and get 10% more performance than if you had spent 30% more on the GPU.

  • @evila9076
    @evila9076 7 months ago +14

    I'm not sure why this video exists. Obviously a better CPU can provide better framerates once the GPU bottleneck is removed (i.e. when you buy a better GPU or lower your settings).
    Low settings at 4K make no sense anyway; people who play at native 4K are looking for the sharpest image (it makes no sense to lower textures at that point).
    Especially when DLSS and FSR exist.

    • @motmontheinternet
      @motmontheinternet 7 months ago +8

      For the millionth time, "real people don't use their computers this way" is not the point, and you aren't benchmarking anything except the GPU by testing 4K high settings. The fact that such a benchmark only tests the GPU IS the point.

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 7 months ago +1

      @@motmontheinternet But the real people who play at 4K high settings are the ones asking him to do CPU benchmarks at 4K! Seriously, Steve is being deliberately stubborn against his own audience. Why?

  • @gerardfraser
    @gerardfraser 7 months ago +2

    Thanks for sharing; the testing makes sense.

  • @StingyGeek
    @StingyGeek 7 months ago +6

    Awesome content. Thank you!

    • @zer125
      @zer125 7 months ago +1

      What? He compares two CPUs. This content is irrelevant to most of the world. Why he released this, I'll never know.

    • @Hardwareunboxed
      @Hardwareunboxed  7 months ago +6

      Not everyone is as blunt as you gtech.

    • @martineyles
      @martineyles 7 months ago +1

      @@zer125 The video is released to get views. There's a gap in new hardware to review for a while, and zero views for a long period would probably hurt them in the algorithm when the new releases finally appear.

    • @StingyGeek
      @StingyGeek 7 months ago +2

      @@zer125 What he's done is use two CPUs of very different performance levels to demonstrate what a GPU upgrade means in each circumstance. He demonstrated that you can waste a shit-ton of money on an expensive GPU by not keeping your CPU balanced, and that it depends on your use case. To that end, he showed how to test for CPU bottlenecking. At a time of bullshit GPU prices, this is one of the best videos from the Hardware Unboxed team.

  • @pretentiousarrogance3614
    @pretentiousarrogance3614 7 months ago +2

    I thought this was going to be about whether CPU load changes as the resolution increases, but I guess not.
    Also, crazy to see how slow Zen 2 is these days.

  • @richardhunter9779
    @richardhunter9779 7 months ago +20

    Why are you considering average framerate to be more important than 1% lows? If anything, it should be the other way around.

    • @totalermist
      @totalermist 7 months ago +14

      Because 1% lows tell you very little on their own. Depending on the game engine (UE5 _cough_), frame drops can be unrelated to the hardware used, e.g. due to shader compilation/loading/cut scenes etc.
      It's simply not a very useful statistic in isolation: a single shader-compilation stutter during the benchmark run can drop the 1% lows significantly, depending on the duration of the run. Ideally you'd want to factor in the duration and the total frames rendered, but at that point frametime graphs, as used by Other Steve, are much more informative for illustrating what's really going on. A reasonable compromise is looking at the discrepancy between the 1% lows and the averages, not just either on their own. (A small sketch after this thread shows how one stutter skews the number.)
      Long story short: it's complicated.

    • @concinnus
      @concinnus 7 months ago +5

      Lows are more disruptive to the experience, but they're very inconsistent and hard to replicate. Ideally you'd want a scripted sequence for consistent results, but that means the built-in benchmarks, which never include the stutters from e.g. loading at level boundaries.

    • @kleinerprinz99
      @kleinerprinz99 7 months ago +4

      I'd say the average is important, and then observe whether you get stutter/jitter in the 1% lows. Nothing is useful out of context.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 7 months ago

      No, the average is more important. I'd rather have a 100fps experience with 60fps 1% lows than an 80fps one with 70fps 1% lows. I know one is more consistent, but I don't care.

    • @fracturedlife1393
      @fracturedlife1393 7 months ago +1

      Limit the frametime spikes and large frametime deviations, and don't bounce between high and low FPS, please!!! The average never told the whole story.

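      How much one stutter skews the statistic - a minimal sketch (tools differ; this uses one common definition, the average of the worst 1% of frames, expressed as fps):

      def one_percent_low(frametimes_ms):
          worst = sorted(frametimes_ms, reverse=True)
          n = max(1, len(worst) // 100)
          return 1000 / (sum(worst[:n]) / n)  # mean of the worst 1%, as fps

      smooth = [10.0] * 1000            # a steady 100 fps run
      stutter = smooth[:-1] + [120.0]   # the same run with one 120 ms hitch
      print(one_percent_low(smooth))    # 100.0 fps
      print(one_percent_low(stutter))   # ~47.6 fps, from a single bad frame
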
  • @256shadesofgrey
    @256shadesofgrey 7 months ago

    And people still tell me I'm wrong when I say they should buy the best CPU they can and go one tier lower on the GPU if it gives them a proportional improvement on the CPU side. The CPU will be the thing that limits your FPS, while what the GPU delivers can be adjusted with settings. Also, when you play a game and suddenly see an FPS drop, it's almost always because of the CPU. I haven't seen GPU-related FPS drops into unplayable territory since I had a laptop with an ATI Radeon 8500 that couldn't handle the particle effects in NFS Underground 2. If the GPU is to blame for low FPS, it will be consistently low, not sudden drops by a factor of 4 or more when 20 additional characters appear on screen.

  • @EBMproductions1
    @EBMproductions1 7 months ago +6

    Gents, I agree, but using the 7800X3D is a bit of an issue: it is the no.1 gaming CPU, so much so that it's an outlier. I'd much rather have seen a Ryzen 5 7600 used, as that is the part people are more likely to actually buy and it isn't an outlier performance-wise. That said, it wouldn't change the results too much, but a graph where the gap is 20% narrower changes everybody's views substantially.

    • @JohnM89
      @JohnM89 7 months ago +1

      Good point.

    • @EBMproductions1
      @EBMproductions1 7 months ago

      @@JohnM89 Yeah, I agree, but a more mid-range CPU would be better for proving the point, because as we know the 7800X3D is an outlier CPU at the moment. I hope X3D tech is in all newer AMD CPUs one day.

    • @Shieftain
      @Shieftain 7 months ago +1

      Agreed, especially since the 7600 is now at a pretty good price point in some places (as low as 190 Canadian dollars here, for example).

  • @heskoo2k
    @heskoo2k 7 months ago +1

    This is basically a comparison of my old CPU and my current CPU, lol. I'm so happy with the 7800X3D; I didn't realize how much my 3070 was being bottlenecked until I noticed a 2x fps increase in all titles.

  • @coolvinay
    @coolvinay 7 months ago +2

    Seems like Steve has too much spare time to kill. It's not even a 5600.

  • @ivanrozic9809
    @ivanrozic9809 7 months ago

    Very informative, really. Cheers from Croatia!

  • @FtGFA
    @FtGFA 7 months ago +4

    Honestly, if gaming were in a good place this would be interesting, but there are practically no games worth paying out for a high-end rig right now. Even the games tested would make me want to sell my PC.

  • @gametime4316
    @gametime4316 7 months ago +1

    It's like I've said all along: it's not that the CPU doesn't matter at 4K, it's that you are more GPU limited there.
    Say a GPU can do 400FPS at 1080p, 200FPS at 1440p and 100FPS at 4K.
    If you have a CPU that can do 130FPS in that game, you will be CPU limited at 1080p and 1440p.
    If you have a weaker CPU that only does 80FPS, you will be limited at all three resolutions.
    And if you upgrade to a new GPU that can do 200FPS at 4K, you will be limited by either CPU. (Tabulated in the sketch below.)

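    A tiny sketch tabulating those exact numbers - the delivered frame rate is just min(CPU cap, GPU cap):

    gpu_caps = {"1080p": 400, "1440p": 200, "4K": 100}  # fps the GPU can render

    for cpu_cap in (130, 80):  # fps each hypothetical CPU can feed
        for res, gpu_cap in gpu_caps.items():
            limiter = "CPU" if cpu_cap < gpu_cap else "GPU"
            print(f"{cpu_cap}fps CPU @ {res}: "
                  f"{min(cpu_cap, gpu_cap)} fps ({limiter} limited)")
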
  • @PookaBot
    @PookaBot 7 months ago +5

    Don't waste any more breath trying to convince the "why test at 1080p?" crowd of the truth. They're not worth even acknowledging.

    • @CallMeRabbitzUSVI
      @CallMeRabbitzUSVI 7 months ago +1

      We just want him to show the tests at all three resolutions: 1080p, 1440p and 4K. Hell, I'd like him to test at 8K too, since that's where I game today. But Steve seems either too stubborn or too lazy to do exactly that, and would rather make a slap-back video at all the people asking him to do thorough tests.

  • @meridian6042
    @meridian6042 7 months ago

    I appreciate this review. I got a 4K monitor about 6 months ago, and this is a permanent resolution change for me, so I'd like to know how 4K is impacted by every aspect of the hardware I use. "4K will be the same regardless of the CPU" is something I see too often, and hopefully this will properly inform the people sticking to that idea. As usual, "it depends" really hits home, but at least this shows it's not a hard rule by any means.

    • @meridian6042
      @meridian6042 7 months ago

      I realize this isn't really a review and is more of an informative video. My appreciation still stands.

  • @lsik231l
    @lsik231l 7 months ago +5

    I don't even game at 4K, just watched to support the channel.

  • @erictayet
    @erictayet 7 months ago

    Great video, Steve. Not everyone understands the test methodology. I'm surprised you didn't use a Zen 1 CPU. :D

  • @LyroLife
    @LyroLife 7 months ago +6

    But a 5600 or 5800 is still okay, right?

  • @thestrykernet
    @thestrykernet 7 months ago

    A really good comparison that offers a clear visual of what you guys have been saying for years. Minimum frame rates are still generally better even when GPU limited, and I'd imagine frame pacing is better overall.
    Given the performance drops at 4K and the upscaling recommendations, I'd be curious to see image quality/performance comparisons between native 1440p and upscaled 4K.