"CPU doesn't matter when gaming at 4k"

  • Published: 14 Oct 2024
  • Check out Jawa, the place where gamers buy, build, and sell! bit.ly/JawaOwe... Use code OWEN10 for $10 off your first purchase!
    Does CPU matter when gaming at 4K? There has been common wisdom floating around the PC gaming space that CPUs don't matter much when gaming at higher resolutions. And while from a certain perspective this is true, I think that in realistic use cases it is much less true.
    Sources:
    HUB CPU video: • How Slow Is The Ryzen ...
    GN Dragon's Dogma 2 video: • Dragon's Dogma 2 is a ...
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.

Comments • 888

  • @danielowentech
    @danielowentech  6 месяцев назад +22

    Check out Jawa, the place where gamers buy, build, and sell! bit.ly/JawaOwenApril24 Use code OWEN10 for $10 off your first purchase!

    • @connor040606
      @connor040606 6 месяцев назад +1

      As a Jawa Verified Seller, I will say there is some incredible value to be had in purchasing a prebuilt from a good seller on the platform.

    • @kaimojepaslt
      @kaimojepaslt 6 месяцев назад

      Never sell on Jawa, the prices are a joke; it's like you're giving it away for free.

    • @korinogaro
      @korinogaro 6 месяцев назад

      I like your videos but this is a complete miss without DLSS results to compare. You just make an educated assumption. But DLSS is not free performance; it still impacts the GPU's performance. I am not saying you are wrong, I just can't say you are right. And I can't test it myself as I don't have 2 CPUs. It would be nice, if you have 2 CPUs, to make a follow-up video where you take the same PC with a different GPU and test one scene native vs. upscaled. Then we could clearly see the proportional impact of using DLSS.

    • @THU31
      @THU31 6 месяцев назад +1

      @@ingamgoduka57 Late April Fools comment? UE and Unity are literally the two most CPU-limited engines out there, with tons of other issues like #stutterstruggle. 😄

    • @ingamgoduka57
      @ingamgoduka57 6 месяцев назад

      But GPUs are miles better at running physics; just use Nvidia PhysX, which is awesome & cross-platform. But modern game developers are not using PhysX, which is a missed opportunity. Rockstar Games have their own GPU-based physics engine called Euphoria; it's been used from GTA 4 to their latest games. You can also use compute shaders for some physics, but modern game developers are lazy. The only thing that kills CPUs is high-poly animated characters and too many different materials & shaders. But good modern game developers use shader vertex animation for NPCs, which only uses the GPU, and switch to a skinned mesh when you start interacting with the NPCs (GTA to Spider-Man NPCs). Cyberpunk implemented the same thing in their updates, which drastically improved performance. Capcom must fix or ditch their game engine.

  • @Lackesis
    @Lackesis 6 месяцев назад +249

    The CPU handles your:
    1. Render targets
    2. Physics
    3. NPC AI
    4. Data streaming
    The CPU definitely matters even at 4K gaming. You will get stutter, or worse 1% lows, if the CPU can't handle those things well.
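
    To make the point concrete, here is a minimal Python sketch of that per-frame CPU work. The class, method names, and timings are hypothetical stand-ins, not any real engine's API; the takeaway is that none of this work shrinks when you raise the output resolution.

        import time

        class World:
            """Hypothetical stand-in for a game's simulation state."""
            def stream_assets(self):    time.sleep(0.001)  # data streaming near the player
            def step_physics(self, dt): time.sleep(0.003)  # gameplay physics / collision
            def update_npc_ai(self):    time.sleep(0.004)  # NPC decisions, pathfinding
            def build_draw_calls(self): time.sleep(0.002)  # command lists handed to the GPU

        def cpu_frame_ms(world: World) -> float:
            """One frame of CPU-side work; it costs the same at 1080p and at 4K."""
            start = time.perf_counter()
            world.stream_assets()
            world.step_physics(1 / 60)
            world.update_npc_ai()
            world.build_draw_calls()
            return (time.perf_counter() - start) * 1000

        ms = cpu_frame_ms(World())
        print(f"CPU frame time {ms:.1f} ms -> CPU fps cap ~{1000 / ms:.0f}")
        # If this alone exceeds 16.7 ms, no GPU can get you above ~60 fps,
        # and spikes here show up as stutter and poor 1% lows.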

    • @saricubra2867
      @saricubra2867 6 месяцев назад +30

      CPU = Handles the gameplay or game feel.
      GPU = How pretty your game looks.
      I take gameplay over graphics any day, so i always overspend on CPUs.

    • @ingamgoduka57
      @ingamgoduka57 6 месяцев назад +11

      But GPUs are miles better at running physics; just use Nvidia PhysX, which is awesome & cross-platform. But modern game developers are not using PhysX, which is a missed opportunity. Rockstar Games have their own GPU-based physics engine called Euphoria; it's been used from GTA 4 to their latest games. You can also use compute shaders for some physics, but modern game developers are lazy. The only thing that kills CPUs is high-poly animated characters and too many different materials & shaders. But good modern game developers use shader vertex animation for NPCs, which only uses the GPU, and switch to a skinned mesh when you start interacting with the NPCs (GTA to Spider-Man NPCs). Cyberpunk implemented the same thing in their updates, which drastically improved performance.

    • @Argoon1981
      @Argoon1981 6 месяцев назад

      @@ingamgoduka57 Nvidia PhysX, even on Nvidia hardware, does not run everything on the GPU.
      I know this is hard to believe for many, but it's the truth. I know this for two reasons: one, I was an old Ageia PhysX user before Nvidia bought it (meaning I had an Ageia PPU...), and two, I've been a game developer (a modder to be exact) myself for more than 20 years now and I have worked with PhysX (and other physics engines).
      Nvidia spread this wrong idea about what PhysX can really do on purpose, just to sell you more GPUs, and they won, because the majority of people/gamers are totally mistaken about what GPU PhysX acceleration really does in games.
      In reality the only physics a GPU can accelerate, in real games, is the physics that doesn't impact gameplay.
      Meaning soft-body physics like cloth movement, destruction, liquids, dynamic fog effects, particle-based physics (e.g. glass breaking into pieces or similar effects), and rigid-body objects that can't affect the AI or the player in any meaningful way. Physics that can affect the AI or the player, meaning physics that can kill or block them, always runs on the CPU, no matter what.
      For example, rigid-body objects that you can pick up with a gravity gun like in HL2 and shoot at NPCs to kill them: that is always run on the CPU.
      Bullet detection/collision for shooters is always run on the CPU.
      Collision detection for AI navigation is run on the CPU.
      Collision detection for audio is run on the CPU.
      Ray tracing for gameplay is run on the CPU, etc.
      There's plenty of physics in real games that runs exclusively on the CPU, even on Nvidia, because there's no other way.
      Yes, Nvidia has demos where GPUs run a bunch of rigid-body objects around that look just like those moving around in games, but those demos have zero AI; it's all visual flair. The minute you put AI in the scene and the physics objects need to hurt the AI, the physics has to switch to the CPU.
      And just for those who may wonder, this was true with the Ageia PPU as well: in reality the PPU was just a low-power ASIC with a custom processor with a bunch of tiny cores, very similar to a GPU.

    • @vindicator879
      @vindicator879 6 месяцев назад +2

      Also, the more unique NPCs in a game, the higher the CPU load

    • @Adri9570
      @Adri9570 6 месяцев назад +9

      Analogy to understand the relation between CPU and GPU:
      - GPU = orange
      - CPU = orange juicer
      - Objective: get the highest amount of juice from the orange as quickly as possible (highest possible FPS) = get the biggest possible orange (better GPU), but don't forget that the best orange juicers (better CPUs) are the ones that get more juice from the same orange, and avoid the worst orange juicers that can't even penetrate the orange (very weak CPUs).

  • @hollisbostick2872
    @hollisbostick2872 6 месяцев назад +49

    Very well explained. Thank you very much for helping me understand this issue better. It's why I'm a member of your channel. I'm happy and proud to support such stellar work. I appreciate you🙂.

    • @zzavatski
      @zzavatski 6 месяцев назад +3

      Especially the false narrative from NVIDIA that when you set resolution to 4K and enable DLSS you still play at 4K.

    • @hollisbostick2872
      @hollisbostick2872 6 месяцев назад

      @@zzavatski Well.... yeah, but this is the fiction *everybody* is supporting, because (apparently) it is impossible to make hardware that will display games with any kind of effects in 4k at a decent speed for a reasonable price. If a 4090 -- agreed by all to be the most powerful hardware currently available, the "best of the best" -- can barely hit 60 fps at 4k Ultra in Alan Wake 2; can't even make 30 fps with RT on; if the only way to get decent framerates is through software 'hacks', *yet* Nvidia _still_ charges €1800 - €2100 for the hardware.... then we have to recognize that for whatever reason, the hardware is beyond its limit and everyone is lying and saying it isn't to get as much money as possible out of the users. Because it's ridiculous to think that I would spend €2000 on hardware that literally cannot do what I'm paying for unless you lied to me like a _boss_.

    • @XeqtrM1
      @XeqtrM1 6 месяцев назад

      Again, they do it on purpose, because in reality you don't need a super high-end GPU to play 4K with high frames; it's about game optimization. If you optimize a game well enough you don't need the latest and greatest, but they do it on purpose to make you buy the latest GPU. So in a sense they work together, because if games were optimized well enough to run 4K 144 on a 1080 Ti, for example, people wouldn't have a reason to upgrade for those games. It's sad but it's the truth. @@hollisbostick2872

  • @zozihn8047
    @zozihn8047 6 месяцев назад +626

    Let me just pair a Pentium with a 4090. It will be awesome.

    • @Dajova
      @Dajova 6 месяцев назад +39

      I just checked up the prices of them... oh boy, they arent even cheaper than a 12400F and have lower clockspeeds to boot xD

    • @khursaadt
      @khursaadt 6 месяцев назад +1

      Amen brother

    • @doctorspook4414
      @doctorspook4414 6 месяцев назад +20

      As one wise Al Yankovich once said, "it's all about the Pentiums!"

    • @TwinkleTutsies
      @TwinkleTutsies 6 месяцев назад +20

      glorious 4K at 1 FPS

    • @Kotnan
      @Kotnan 6 месяцев назад +1

      😂

  • @Louis_Bautista
    @Louis_Bautista 6 месяцев назад +58

    Yup, just upgraded from a 2070 Super to a 4070 Super with the intention of playing at 4K and after booting up Cyberpunk I realized that my Ryzen 3600 was clearly holding back my GPU. I decided to do a full upgrade to a 7600 system and now everything's alright. Turns out ray tracing takes a big hit on the CPU, even if you're playing at 4K

    • @lenscapes2755
      @lenscapes2755 6 месяцев назад +2

      I agree. I had 3600 as well and it's not a very good CPU to play with ray tracing effects. I'm still using 2070super and even it was getting bottlenecked by 3600 for raytracing. Now I'm using 13900K and it's buttery smooth.

    • @EliteGamingG786
      @EliteGamingG786 6 месяцев назад

      How is the 4070 Super holding up at 1440p?
      I am planning to buy one but kind of nervous

    • @X_irtz
      @X_irtz 6 месяцев назад

      @@EliteGamingG786 You can just... look up benchmarks online...

    • @42cuba
      @42cuba 6 месяцев назад +3

      LOL even a 7600 isn't good enough

    • @wertyuiopasd6281
      @wertyuiopasd6281 5 месяцев назад

      Yes but the good news is that a 7600 can max out GPU at 4k. While at 1440p or 1080p, a 7800x3d will do a better job.

  • @atirta777
    @atirta777 6 месяцев назад +150

    Good 3 steps to use when building a *gaming* PC:
    1. Pick a GPU according to the fps and resolution you like to play at in your games of choice.
    2. Pick a CPU that achieves *at least* that same framerate.
    3. Scale both to your budget while also considering longevity.
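
    As a rough illustration of those three steps (the part names, fps figures, and prices below are made up purely for the example), a small Python sketch:

        # Hypothetical benchmark averages (fps) at your target resolution, and prices in $.
        gpus = {"GPU-A": (70, 500), "GPU-B": (105, 800), "GPU-C": (140, 1200)}  # name: (fps, price)
        cpus = {"CPU-A": (90, 180), "CPU-B": (130, 300), "CPU-C": (170, 450)}   # name: (fps cap, price)

        def pick_parts(target_fps: int, budget: int):
            # 1. Cheapest GPU that reaches the fps you want at your resolution.
            gpu = min((g for g, (fps, _) in gpus.items() if fps >= target_fps),
                      key=lambda g: gpus[g][1], default=None)
            if gpu is None:
                return None
            # 2. Cheapest CPU whose framerate cap is at least what that GPU can deliver.
            cpu = min((c for c, (fps, _) in cpus.items() if fps >= gpus[gpu][0]),
                      key=lambda c: cpus[c][1], default=None)
            # 3. Keep the pair only if it fits the budget; otherwise lower the target and retry.
            if cpu and gpus[gpu][1] + cpus[cpu][1] <= budget:
                return gpu, cpu
            return None

        print(pick_parts(target_fps=100, budget=1200))  # -> ('GPU-B', 'CPU-B')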

    • @mertince2310
      @mertince2310 6 месяцев назад +3

      Can you even explain what "cpu according to the framerate you like to play at" means? What would you recommend to a person who "likes" 60 fps? Or 144 fps?

    • @aohjii
      @aohjii 6 месяцев назад +54

      2 steps when building a PC
      1. Buy the fastest CPU in the market
      2. Buy the fastest GPU in the market

    • @CheeseburgerEnjoyer
      @CheeseburgerEnjoyer 6 месяцев назад +53

      @@aohjiiYou forgot the first step
      Step 1: Get a job

    • @mickieg1994
      @mickieg1994 6 месяцев назад +8

      Respectfully disagree, the frames per second you might see on a chart will almost certainly not be your real world user experience.
      By the time you have a bunch of programs running in the system tray, a browser with 20 tabs open, discord and anything else you might have running in the background this will not be your experience, especially if you're like me and watching a 1080p or even 4k movie on a second screen while gaming.
      Typically I would recommend buying a CPU that is more powerful than you think you need, at the moment I'd say 8 cores minimum and scaling the GPU based upon your budget after that, temper your expectations and if you're not happy with your parts choice for your budget then wait and save more.
      It's always better to have a CPU that's more powerful than the GPU for so many reasons, the smoothness of operation is what you will notice day to day, gaming or not.

    • @Bballfan1992
      @Bballfan1992 6 месяцев назад +7

      @@mertince2310basically he just means you have to balance your system. The gpu should almost always be your limiting factor in gaming. Hence why you spend 50% of your budget on the gpu. you choose the cpu that will not be the limiting factor. Obviously a 12400f and a 4090 is crazy. The cpu is the limit. Something like the 14700k is much more in line with the power of the 4090 and is a good balance.

  • @ZackSNetwork
    @ZackSNetwork 6 месяцев назад +163

    I saw a person pair a ryzen 5 1600x with a RX 7900xtx and argue that there was no CPU bottleneck.

    • @kotztotz3530
      @kotztotz3530 6 месяцев назад +42

      Oof, Zen 1 was really bad for gaming. Zen 2 was better, but I think Zen 3 is when AMD actually started competing with Intel in gaming CPUs.

    • @atirta777
      @atirta777 6 месяцев назад +14

      I mean it's very possible in a few games at 4K Native like Doom Eternal or even RDR2, but the combo isn't very reliable😅.

    • @nathangamble125
      @nathangamble125 6 месяцев назад +4

      @@atirta777 I don't think those two games are very good examples. Neither of them is GPU-intensive enough relative to their CPU usage, and I expect an RX 7900 XTX would still be significantly bottlenecked by a Ryzen 5 1600X. I'd pick a different example like Cyberpunk 2077 with ray tracing enabled, which is so GPU-intensive at 4K (especially for AMD GPUs, which aren't great at ray tracing) that an RX 7900 XTX actually _wouldn't_ be bottlenecked by a Ryzen 5 1600X.

    • @atirta777
      @atirta777 6 месяцев назад +9

      @@nathangamble125 Doom and RDR2 are good examples don't you think? The R5 1600X drives nearly 200fps on Eternal and I doubt 7900XTX does better than that with maxed RT on. Same with RDR2, about 90fps and 7900XTX doesn't reliably hit that at 4K Max. Also, we're talking about playable experiences here, which isn't the case with Cyberpunk 4K Native + RT on the 7900XTX.

    • @conyo985
      @conyo985 6 месяцев назад +9

      Tell that person that he can upgrade to a 5700X3D for the cheap.

  • @MrXaniss
    @MrXaniss 6 месяцев назад +8

    Well i have a 4090 and went from a 5800x3D to a 7800x3D, my average fps stayed the same at 4k pretty much, but it was MUCH more stable.

    • @soots-stayingoutofthespotl5495
      @soots-stayingoutofthespotl5495 5 месяцев назад +1

      Can you please elaborate on what you mean by that, or what the insinuation is. Thanks.

    • @InfernusdomniAZ
      @InfernusdomniAZ Месяц назад

      Very old comment to reply to but I think he means that the variation in frames is smaller think like 40-120 fps on the 5800x3d but 80-120 on the 7800x3d.

  • @cks2020693
    @cks2020693 6 месяцев назад +184

    If you wanna play any RPG games with open world NPCs like animals, townspeople, or strategy games with million units and calculations, CPU is almost always the bottleneck

    • @grynadevilgearxd2442
      @grynadevilgearxd2442 6 месяцев назад +13

      True. Yesterday I tried Witcher 3 on both DX11 and DX12 on my i5-12600K and RTX 4070, with different settings and resolutions, and still have large frame drops in Novigrad.

    • @Ravix0fFourHorn
      @Ravix0fFourHorn 6 месяцев назад +28

      @@grynadevilgearxd2442Witcher 3 next gen runs really badly because they hacked together a dx12 layer instead of making it run natively. If you try the Classic version you will have 0 framedrops in busy areas with your system.

    • @saricubra2867
      @saricubra2867 6 месяцев назад

      @@Ravix0fFourHornWhat if we use Vulkan instead?

    • @grynadevilgearxd2442
      @grynadevilgearxd2442 6 месяцев назад +3

      Yeah I know😀 because i'm playing on dx11. I just mean that Novigrad is more CPU bound with that many NPC.

    • @ogaimon3380
      @ogaimon3380 6 месяцев назад +4

      @@grynadevilgearxd2442 Nah, Witcher 3 NPCs barely have any logic; they're legit just blocks of texture that have one line and move around, or don't at all. Witcher gets high fps even on a 4-core CPU xd. Now RDR2 is a better test; Days Gone is also heavily CPU-limited in towns, my 4-core can barely output 30-40 fps.

  • @ggogaming7441
    @ggogaming7441 6 месяцев назад +106

    If only you can feel the shiver that went down my spine when you mentioned DLSS. That realisation that kicked my brain to high gear and I was like...Holy timbers, he's right!
    If you're playing at 4k with DLSS quality, you're actually playing at 1440p.... You're more CPU bound than you thought.
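
    For reference, a quick Python sketch of the commonly cited per-axis upscaler ratios (the exact factors can vary by game and upscaler version):

        # Commonly cited per-axis DLSS scale factors (FSR and XeSS use similar ratios).
        MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 1 / 3}

        def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
            scale = MODES[mode]
            return round(width * scale), round(height * scale)

        for mode in MODES:
            print(mode, internal_resolution(3840, 2160, mode))
        # Quality (2560, 1440), Balanced (2227, 1253), Performance (1920, 1080), Ultra Performance (1280, 720)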

    • @KryssN1
      @KryssN1 6 месяцев назад +15

      What is an obvious thing to many is not obvious to others 🎉

    • @aberkae
      @aberkae 6 месяцев назад +5

      I've been saying this for years 😂.

    • @beemrmem3
      @beemrmem3 6 месяцев назад +23

      Not to mention there is almost no reason to not run DLSS quality at 4K. It's where it shines the most

    • @Angel7black
      @Angel7black 6 месяцев назад +2

      True, but you're good with a 4090 and even a 12600K or 5800X3D if you're going to use upscaling. It's why, with a 12700K, I'm not really worried about what new CPUs come out and haven't cared about anything newer. I'm more than fine with my 4070 Ti at 1440p, and I'd be fine with anything rivaling a 4090 in the future because I'd realistically be moving to 4K in that case. It's complicated. If I were going to buy a 4090-level GPU for 1440p in the future, because that's just how much harder it is to run 1440p max settings, I'd honestly just move to a used 14700K or 13700K, overclock it, tune the RAM timings, and probably be fine. Not planning on buying a new CPU+motherboard combo realistically till AM6 and whatever Intel has to offer at the time.

    • @krazyfrog
      @krazyfrog 6 месяцев назад +1

      Been pointing this out for a while especially to channels like Hardware Unboxed who have been at the forefront of 'CPU doesn't matter at 4K' nonsense because that reviewer personally doesn't like to use upscaling and only plays Fortnite.

  • @102728
    @102728 6 месяцев назад +63

    In short:
    - The CPU can achieve a mostly fixed framerate across all resolutions
    - The framerate the GPU can achieve scales inversely with the resolution
    - This is not set in stone, but dependent on the game and the settings
    - The lower of the two is the framerate you'll end up with, which generally causes a GPU bottleneck at higher resolutions and a CPU bottleneck at lower resolutions
    - This can easily be seen in GPU benchmarks, where you'll see the GPU achieve the same framerates between 1080p and 1440p in some games, or not scale proportionally to the drop in resolution
    - A third limiter is the game engine, which in some cases, mostly older engines, can't achieve higher framerates than a certain cap
    - Upscalers like DLSS, FSR and XeSS render at lower resolutions than the target resolution, providing close to the framerate of a lower resolution
    - This makes it more likely you'll hit a CPU bottleneck when using upscaling even at high monitor resolutions
    - BONUS: GeForce cards offload the driver overhead onto the CPU, while Radeon cards keep it on the GPU. This means that with lower-end CPUs there is a significant difference in performance between GeForce GPUs and Radeon GPUs, since the CPU bottleneck is more severe when paired with a GeForce card due to that driver overhead
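
    The bullet points above boil down to a min() over the possible limiters. A toy Python model with made-up numbers:

        def delivered_fps(cpu_fps: float, gpu_fps_at: dict, render_res: str,
                          engine_cap: float = float("inf")) -> float:
            """Framerate is the slowest of: the CPU, the GPU at the *rendered* resolution, any engine cap."""
            return min(cpu_fps, gpu_fps_at[render_res], engine_cap)

        cpu_fps = 110                                        # what this CPU can feed, roughly resolution-independent
        gpu_fps_at = {"1080p": 200, "1440p": 140, "4K": 75}  # GPU throughput falls as resolution rises

        print(delivered_fps(cpu_fps, gpu_fps_at, "4K"))      # 75  -> GPU-bound at native 4K
        print(delivered_fps(cpu_fps, gpu_fps_at, "1440p"))   # 110 -> 4K + DLSS Quality renders at 1440p: CPU-bound
        print(delivered_fps(cpu_fps, gpu_fps_at, "1080p"))   # 110 -> CPU-bound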

    • @slimeman2374
      @slimeman2374 6 месяцев назад +4

      TY bro

    • @klauserwin9860
      @klauserwin9860 6 месяцев назад +6

      I heard that the NVIDIA GPU driver offloads all the work to only a single CPU core, while the AMD GPU driver can distribute the driver workload to multiple CPU cores.
      Thus, you especially need a high single core performance CPU when you use a NVIDIA GPU.
      Someone please confirm, what I said.

    • @artiromanenko
      @artiromanenko 6 месяцев назад +4

      - Remember, if you have 60 hz DISPLAY, then DISPLAY is your 60 fps bottleneck, no matter how high fps your CPU and GPU can produce.

    • @devonmoreau
      @devonmoreau 6 месяцев назад +1

      excellent post! 🎉

    • @mecyanned
      @mecyanned 6 месяцев назад +1

      Very short 😂😂😂😂😂😂😂

  • @FromDesertTown
    @FromDesertTown 6 месяцев назад +3

    Just think of it as two different pipelines: CPU instructions and GPU instructions.
    Depending on the rendering engine and how the game was programmed, the balance of instructions sent to the CPU and GPU varies.
    For most games, changing resolution will mostly impact GPU load.

  • @WrexBF
    @WrexBF 6 месяцев назад +11

    When a game is very CPU intensive, that applies at any resolution. Higher resolutions have close to no impact on CPU performance. I said "close to no impact" because as the resolution increases, the framebuffer size increases, and that requires more memory bandwidth for transferring data between the CPU and VRAM. That increased data transfer can "technically" lead to slightly higher CPU usage. However, that increase is generally not significant compared to the overall workload of the GPU, which handles the bulk of the rendering process.
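
    As a back-of-the-envelope illustration of that framebuffer growth (a single RGBA8 color buffer; real games keep several render targets per frame), a short Python sketch:

        def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
            # One RGBA8 buffer: width * height * 4 bytes, converted to MiB.
            return width * height * bytes_per_pixel / 2**20

        for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
            print(f"{name}: {framebuffer_mib(w, h):.1f} MiB")
        # 1080p: 7.9 MiB   1440p: 14.1 MiB   4K: 31.6 MiB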

    • @christophermullins7163
      @christophermullins7163 6 месяцев назад

      Applies more so to lower resolutions***

    • @meurer13daniel
      @meurer13daniel 6 месяцев назад +8

      @@christophermullins7163 Not at all. There is no difference with resolution. If you are playing at 60 fps, the CPU requirement will be the same regardless of resolution. People just say stupid things like this because they think "lower res = more fps = more CPU demand", but they fail to realize the CPU requirement increases due to higher fps, not lower resolution. They are not the same thing

    • @saricubra2867
      @saricubra2867 6 месяцев назад +1

      @@meurer13danielHigher FPS means that the CPU has to handle a shorter game cycle therefore processing more inputs and outputs more quickly.

    • @christophermullins7163
      @christophermullins7163 6 месяцев назад

      @@saricubra2867 I said that "CPU matters less at 4K than at lower resolutions", and after watching this video, Daniel confirmed that my statement is accurate. The title is clickbaity AF. The point of the video is that when we say "play at 4K" we are more likely upscaling, but that does not invalidate the original statement at all. "Slower CPUs have less of a performance hit at 4K" and "CPU performance matters less at higher resolution" are irrefutably factual statements, and Daniel himself knows that without any doubt in my mind. It is all about the way we word the question; assuming we are using upscaling, lower graphics settings, etc., it cannot be argued that anything I said is wrong. You'd benefit greatly from trying to understand why that is the case if you still haven't realized that it is true.
      Go ahead and argue something irrelevant like Daniel's video.. no one said anything about upscaling, guys.. it's called clickbait.

    • @Shadowninja1200
      @Shadowninja1200 6 месяцев назад +1

      @@christophermullins7163 But you are making a completely different statement. He didn't confirm your statement. This entire video is digging into the nuance of why cpu performance *looks* (emphasis mine) like it doesn't matter at 4k but it actually does. The issue isn't that slower cpus doesn't matter at 4k but more so that you're gpu limited if you just slap on max setting and called it a day. It will matter if you adjust down the settings and use up-scaling to reach your potential max framerate which is determined by what cpu you have. Most people aren't playing at max setting on 4k.
      It's like you heard the phrase "you're kind of right" and just stopped watching after that. You can argue all you want about how you're right but you actually aren't because you're making a statement based on a faulty conclusion.
      The data from the video above us literally tells us that you can be cpu limited at 4k and it does matter what cpu you have especially if you're reducing your gpu bottleneck by paring down the settings and resolution. No one in their right mind would recommend pairing a 4080 with something like a 2600x "because cpu doesn't matter at 4k" because we all know that it does bottleneck.

  • @FilthEffect
    @FilthEffect 6 месяцев назад +20

    Going from a 9400 to a 12700k doubled my fps in certain titles at 4k.

    • @saricubra2867
      @saricubra2867 6 месяцев назад +1

      Is not a fair comparison, the 12700K kinda is a mistake by Intel, it is way too good.

    • @Dexx1s
      @Dexx1s 6 месяцев назад +2

      @@saricubra2867 Well, whenever you make such a jump in CPU power and then add 'in certain titles', the response is automatically: no shit Sherlock. The 9400 doesn't even have multithreading. It's 6 cores and the same 6 threads. Plugging a Ryzen 1600 into those Starfield charts would probably show the 7800X3D to be miles ahead as well.

    • @SinNumero_no41
      @SinNumero_no41 6 месяцев назад

      It's not that amazing; the 12600K is way better value in that regard since it almost matches the 12700K in games for way less. In fact he should've gone with that one instead, since he wasted money. @@saricubra2867

  • @Cheeseypoofs85
    @Cheeseypoofs85 6 месяцев назад +39

    An outlier to the 4K argument is MMOs. In MMOs, where a bunch of players are around each other, the CPU definitely matters; X3D chips really shine in those scenarios. My minimum fps in WoW in Valdrakken at 4K doubled when I switched from a 3700X to a 5800X3D. I get 80 fps at 4K native max settings in the city.

    • @Billyshatner88
      @Billyshatner88 6 месяцев назад +1

      You should fine-tune that even more. With a 14600K and a 7900 XTX I get 150 to 180 fps in Valdrakken at 4K, but the thing with Valdrakken is it artificially limits GPU and CPU usage; I'm always sitting at like 60% GPU and like 20% CPU usage. In the Emerald Dream I get 90%+ GPU usage and like 400+ fps at 4K.

    • @Iridiumcosmos
      @Iridiumcosmos 6 месяцев назад +15

      @@Billyshatner88Just get a 7800X3D which would blow both of them out of the water. Lol. In all seriousness, if the hardware you’re playing on is serving you well then no need to change it. The PC community STRUGGLES with this notion and continues to complain why they’re in debt.

    • @khaledassaf6356
      @khaledassaf6356 6 месяцев назад

      Yeah, had a similar experience from the exact same upgrade in flight sim titles.

    • @Krannski
      @Krannski 6 месяцев назад +3

      ​@@Billyshatner88 psst... You're CPU bound. That's what being CPU bound looks like. If your framerate isn't hitting max and your GPU utilisation isn't at 100% that means you're CPU bound.

    • @kerotomas1
      @kerotomas1 6 месяцев назад +2

      Yes, CPU performance is all about raw data, especially in multiplayer, where you have to calculate 15-20 players like in a WoW raid, which tanks the CPU hard, while in a singleplayer game everything is predetermined, so the CPU has much less to do.

  • @jasonroberts8416
    @jasonroberts8416 6 месяцев назад +4

    Thanks for doing this. I've seen way too many "CPU doesn't matter at 4K" comments over the past few years. Even at 1080p I noticed a big improvement in 1% lows alone when upgrading from an i7-8700K to a 13600K (carried over my existing RTX 3070).

    • @DarkSoul-pb6dv
      @DarkSoul-pb6dv 6 месяцев назад

      I'm still on an i7-8700K running at 4.8 GHz with a 3060 Ti, and I have people telling me all the time that my CPU is still fine for gaming. I play Battlefield games all the time and Helldivers 2, and it's always my CPU. Even when I play Metro Exodus, this CPU is just too old now; I cannot stay at 60 fps.

  • @sherr6847
    @sherr6847 6 месяцев назад +13

    i honestly love the way you do your videos. the moving dan to point at things is cute and humorous

  • @theinsfrijonds
    @theinsfrijonds 6 месяцев назад +5

    I’m glad that this isn’t in response to my post on one of these videos! 😅
    I did say that CPU doesn’t matter at 4K resolution but what I really meant was the difference between CPU’s of a particular segment. I was considering a 14900K at one point and there’s the 7800x3D, but I went with a 7950x3D.
    The graphs for those can show different frame rates but as you mentioned it’s highly dependent on the area that you’re in in the game. Also, I doubt that I would actually be able to appreciate any kind of uplift that’s less than 5 or 10%.

    • @michaelcarson8375
      @michaelcarson8375 6 месяцев назад

      @theinsfrijonds Why not upgrade to a CPU that has more than dual channel memory? When I made the switch I noticed the difference in frame pacing, but I was also able to truly multi task. It amazes me that streamers don't use workstations to game with. Yes there's a limit of what a cpu can do at 4K, but no one has put the Quad, hex, and Oct memory channel systems to the test in a public video. The cool thing is you can work and GAME at the same time with a workstation. You have enough pci-e lanes that you can run two VM at the same time and use more than one GPU.. Gaming is limited to single GPU since SLI and Xfire bit the dust. if you have another GPU you can use VT-d and pass it through a VM to run rendering., professional workloads, or a streaming for youtube/twitch/whatever box on the same system. The use of unlocked threadripper and unlocked Xeon is not well know by those that focus only on gaming.

    • @saricubra2867
      @saricubra2867 6 месяцев назад

      @@michaelcarson8375You save a lot more money buying an X3D cheap instead of buying an overpriced platform with quad channel support. The extra cache is equivalent to a giant memory OC.

    • @theinsfrijonds
      @theinsfrijonds 6 месяцев назад

      @@michaelcarson8375 Those processors were out of my budget. I mostly bought AMD because it's more power efficient. I do hate the fact that the memory controller in their processors is definitely worse than Intel (limited to 3000 mega transfers per channel.)
      Also I'm not sure that there is a motherboard where I could access the second PCI Express slot with my 4090 graphics card. It's just that huge (four physical slots taken up in space but three on the back of the case.)
      Good to know though that there are uses for quad channel and up. I can only imagine that channels haven't covered that due to the limited number of buyers plus the fact that it would be very expensive for them to review

    • @Shadowninja1200
      @Shadowninja1200 6 месяцев назад

      @@michaelcarson8375 Because workstation cpu are focused more on doing a bunch of things well rather than having high clock speed which some games really need. You could potentially limit yourself on performance if you go for something like a threadripper or a xeon cpu. Also the price difference doesn't really make sense for a streamer when they could put that money into second pc that they use for capture and rendering, keeping a eye on chat, and so on. They cut out the overhead on streaming without sacrificing performance this way.
      *edit* also using a vm while gaming could potentially get you banned if you play online if you can even launch the game at all due to anti cheat detection. A lot of cheaters used to use vms to run their cheats.

    • @michaelcarson8375
      @michaelcarson8375 6 месяцев назад

      @@Shadowninja1200 Excuses excuses. There are unlocked CPUs from both Intel and AMD Those systems are the true high end desktops. Dual channel cpus no matter the number of cores are not high end desktops.

  • @Jason-ol1ty
    @Jason-ol1ty 6 месяцев назад +28

    it all depends on how much RGB you have

    • @brunoutechkaheeros1182
      @brunoutechkaheeros1182 6 месяцев назад

      none, which means my 5800x3d is dogshit

    • @whitygoose
      @whitygoose 6 месяцев назад +3

      ​@@brunoutechkaheeros1182 put RGB on your cooler.

    • @giannimaccario9571
      @giannimaccario9571 Месяц назад

      True I was struggling with 30 fps and bad stutters with my 4090. I swapped all my fans and my ram with rgb stuff and now I'm getting a bare minimum of 200 fps, rgb is life 😂

  • @ConnorH2111
    @ConnorH2111 6 месяцев назад +88

    thank you for making this video, so annoying seeing so many people assume you can pair a 4090 with a 5600 and expect the same results as a 7800x3d at 1440p+ 🤣

    • @ZackSNetwork
      @ZackSNetwork 6 месяцев назад +19

      I saw a person pair a zen 1 ryzen 5 1600x with a RX 7900xtx. They argued there is no bottleneck.

    • @ezechieldzimeyor4541
      @ezechieldzimeyor4541 6 месяцев назад +9

      ​@@ZackSNetworkwow that is insane. I can understand a 5800x3d or something but a 1600 isn't even going to have a PCIE 4 unless you go get a new motherboard that costs 3x the CPU 💀

    • @aboveaveragebayleaf9216
      @aboveaveragebayleaf9216 6 месяцев назад +8

      Why would you drop $1,000 on a gpu and not at least get a 5600 or something for $130 more lol. Even if there is "no bottleneck"

    • @chy.0190
      @chy.0190 6 месяцев назад +5

      They are the people crying why their framerates are the same on both low and max settings in certain games, and then blame it on bad optimisation lmao

    • @V4zz33
      @V4zz33 6 месяцев назад

      People think that?

  • @Torso6131
    @Torso6131 6 месяцев назад +2

    My favorite case of this is the original Crysis (and remastered) where on top of being super GPU heavy, it also increases more CPU-driven rendering budgets with an increase in resolution. On top of that it's also not very well multi-threaded so you basically need a 7800x3D to run a 2007 (right?) game at 4k60fps maxed out settings. I think it ends up pushing out the LOD more or something super weird the higher your resolution, so you get more CPU calls. Digital Foundry did a pretty good video on it and in the clip where they got Alex (their PC focused member) a 7800x3D to replace his 12900k they discuss Crysis specifically.
    But yeah, basically at 4k your CPU is less likely to be a bottleneck, but it still very much can be, and you want something at least pretty good to make sure you don't get slammed with crap frametime spikes, as CPU limited gameplay will almost always be a stuttery mess where GPU limited will be better.

  • @arseniogarcia8631
    @arseniogarcia8631 6 месяцев назад +5

    Mainly game at 4k besides esports titles and just upgraded from the 5600x to the 5800x3d and it's been a big uplift. Average is only 10% to 20% higher but the 1%low is so much higher its great. Even if GPU is maxed out it still helps a lot!

    • @ChoppyChof
      @ChoppyChof 6 месяцев назад +2

      Did the same change and got the same results as you, those better 1% lows make such a difference. Really wanted a 7800x3d, but factor in the price of the new ram and MB and it was 3-4 times the cost of just dropping in the 5800x3d. Shall stick with this for a while.

    • @pengu6335
      @pengu6335 6 месяцев назад +1

      ​@@ChoppyChofYeah, the 1% lows matter just as much. Most people think average fps is all that matters.

  • @hyperfocal2002
    @hyperfocal2002 6 месяцев назад +2

    It's great to see a selection of videos explaining the connection between CPU/GPU and framerate at different resolutions come out in the past few days. While it works to test CPUs with nothing but a 4090, and likewise GPUs with a 7800X3D, for consistent benchmarks, it doesn't help people trying to match components in an upgrade or on a strict budget, where a midrange CPU wastes GPU power and vice versa.

  • @Highwayman777
    @Highwayman777 6 месяцев назад +38

    Core 2 Duo should pair well with the 50 series.

    • @nathangamble125
      @nathangamble125 6 месяцев назад +1

      The best CPU to pair with an RTX 5090 will obviously be an Intel 8080.

    • @Beeterfish
      @Beeterfish 6 месяцев назад +2

      Joke aside, that CPU was the greatest I've had. It was a huge leap in terms of x86 architecture and not overly expensive.

    • @campfiregames4633
      @campfiregames4633 4 месяца назад

      I still have my old computer with it, runs Linux Mint, I watch movies on it.

  • @sanzyboy3952
    @sanzyboy3952 6 месяцев назад +9

    Remember if you are gpu bottlenecked you can always just use upscalers and lower graphic settings.
    The same doesn't apply to a cpu bottleneck

    • @christophermullins7163
      @christophermullins7163 6 месяцев назад

      I decrease the CPU render resolution to put more of the resolution on the GPU instead.

    • @DarkSoul-pb6dv
      @DarkSoul-pb6dv 6 месяцев назад +4

      @@christophermullins7163 Are you talking about draw distance? I've never heard of CPU render resolution before. Draw distance helps a lot for both CPU and GPU, but you will still get a lot of lag spikes because of the pop-in from draw distance.

  • @johndoh5182
    @johndoh5182 6 месяцев назад +3

    I got to about 10:00 and you were going on and on, and maybe you made this point later? I tuned out.
    The bottom line for ANY resolution is what a person wants for fps. If I'm playing Starfield with a 4080, well, I have a 2K rig so it's great, or good enough. I prefer an avg. above 90 and a 1% low above 60 fps. I find that gives me a stutter-free experience in open-world games.
    But looking at a YouTuber's benchmark doesn't tell me much, because they will stick to certain settings just for consistency, but there's no way I'm playing Starfield at 4K with a 4080 UNLESS I scale down the quality to where, once again, my avg. fps is above 90 and my 1% is above 60, or I'm using DLSS, and if I'm using RT, DLSS is almost always going to be enabled.
    YES, because I'm pushing the fps back up, the CPU is going to matter. I'm not gaming at the settings HUB had for 4K in Starfield. I'm not going to let a GPU struggle like that and give stutters from time to time.
    AND THIS is the point most people get wrong when they look at YouTube benchmarks and listen to a person say "at 4K they're the same". And frankly, part of this held belief has been SPREAD by different YouTubers.
    And I don't need to get into what the CPU does and what the GPU does. What I can tell you is that when you play a game at high fps at 4K, the CPU is CERTAINLY going to matter. The CPU has to pass data to the GPU before the GPU can render a different frame, otherwise it's painting the same frame; the GPU doesn't know about movement, so there is change data that the CPU has to send to the GPU to get a different frame. The CPU tracks the game. You also have to keep updating RAM.
    And if YOU personally like playing at 4K with 1% lows at 40 fps, well, have fun with it.

  • @Roland_Deschain_of_Gilead19
    @Roland_Deschain_of_Gilead19 6 месяцев назад +7

    @danielowentech breaks out the teacher side of him! Always great to hear your explanations! 👍🏻

  • @arc00ta
    @arc00ta 6 месяцев назад +3

    Great video, as a hardware enthusiast I've known about this for many years but its really hard to articulate it as well as you did here. Going to bookmark this one for future use so I don't have to try and explain it.

  • @bogroll1881
    @bogroll1881 21 день назад

    Really helpful for tuning a setup at 4K: you can initially max everything at 1080p to find out what the CPU fps limit is, then go to 4K and target that frame rate; may as well crank up all the quality settings until you see the fps dip below the CPU limit.
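
    That procedure, sketched in Python (the measurement function and numbers are placeholders, not a real benchmarking API):

        def find_cpu_fps_cap(measure):
            # Step 1: benchmark at 1080p with maxed settings so the GPU isn't the limit.
            return measure(resolution="1080p", settings="ultra")

        def tune_4k_settings(measure, cpu_cap, levels=("low", "medium", "high", "ultra")):
            # Step 2: raise quality at 4K until the GPU drops below the CPU's ceiling, then back off one step.
            best = levels[0]
            for level in levels:
                if measure(resolution="4K", settings=level) >= cpu_cap:
                    best = level  # still CPU-limited: the extra quality is effectively free
                else:
                    break
            return best

        # Toy stand-in for benchmarking one scene (made-up numbers).
        fake = {"1080p": {"ultra": 120}, "4K": {"low": 150, "medium": 125, "high": 105, "ultra": 80}}
        measure = lambda resolution, settings: fake[resolution][settings]
        print(tune_4k_settings(measure, find_cpu_fps_cap(measure)))  # -> 'medium'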

  • @obeliskt1024
    @obeliskt1024 6 месяцев назад +1

    very informative and practical. I generally recommend to my friends that CPU doesn't have to be the best for their build, but I always ask them what their resolution is, their target "smoothness" (most people are not familiar with fps) and the games they wanna play.
    So most of the time I would recommend a mid-range CPU like a ryzen 5 or i5 + a decent midrange GPU for 1080/1440p. Sometimes I'd even sell them to the upscaling tech available for each side as well as show them that some games have beautiful raytracing BUT I always make sure to let them know that it's not always a good idea to crank up the settings and make them consider optimizing settings.

  • @erictayet
    @erictayet 6 месяцев назад +1

    This is why I change GPU once every 2.5 years but CPU every 5-6 years. My monitor and the game I'm playing are actually the key reasons that cause me to upgrade my PC components.
    You have to find out what your CPU/GPU are actually doing before you upgrade. There are plenty of tools like MSI Afterburner, AMD Adrenaline Software monitoring, Intel PresentMon, Task Manager, etc.

  • @androidx99
    @androidx99 6 месяцев назад

    Awesome content as always @Daniel Owen. I build alot of PCs and haven't considered this perspective, so thank you for bringing it to my attention!!

  • @Marco-bq3wc
    @Marco-bq3wc 6 месяцев назад

    One more aspect where the CPU is relevant for 4K / increasing frame-rates that Digital Foundry found (mentioned in passing in their Dragons Dogma 2 Tech Review - 19:00 - 19:27) :
    When CPU limited, the frame-rate in the same scene is still taking a (small) hit (~7FPS difference between 1080p and 4k as well as ~2 FPS between 1080p and 1440p) at higher resolutions.
    Though they didn't really test this in detail (as this was not relevant for this review) and just mentioned that they suspect this is due to this / some game(s) drawing more things in the distance while at higher resolution.

  • @kr00tman
    @kr00tman 6 месяцев назад +1

    Great video...I actually changed my opinion on this a while back. In CP77 with a 5950x and a 4090 with rt max, dlss off in 4k I was getting about 40 fps, when I upgraded to a 7800x3d it uped to about 48 fps so there was a 20% increase. I was actually pretty shocked.

  • @GamingOnArc
    @GamingOnArc 6 месяцев назад +1

    0:01 the face you are making while pointing to what looking like a red dog rocket covering a name is priceless.

  • @Bezzerkus
    @Bezzerkus 6 месяцев назад

    Thank you for addressing this! I would see so many comments just like the ones you were talking about. Hard to argue with misinformation

  • @teddyholiday8038
    @teddyholiday8038 6 месяцев назад +61

    Of course it matters. Better cpu can provide smoother frame times

    • @mickieg1994
      @mickieg1994 6 месяцев назад +7

      Will also handle background tasks better, plus more and more games can now take advantage of 12 or more cores, why hamstring yourself with 8?

    • @paul2609
      @paul2609 6 месяцев назад +7

      @@mickieg1994 Because budget?

    • @teddyholiday8038
      @teddyholiday8038 6 месяцев назад +2

      @@mickieg1994 only reason why I would go with 8 is if I’m hellbent on a x800x3D chip, but otherwise yeah the more cores the better

    • @mickieg1994
      @mickieg1994 6 месяцев назад +4

      @@paul2609 I disagree, if the budget is that tight that you can't afford the extra £100 or so to go from 6 to 8-12, then you should seriously reconsider whether you can afford a pc at all.

    • @saricubra2867
      @saricubra2867 6 месяцев назад

      @@teddyholiday8038It depends, for example, the 5900X has 12 big cores and the 12700K is hybrid. The only reason why i choose the 12700K besides the better IPC is the lack of CPU overhead (Intel thread director cleans interruptions for the main threads for any program).
      I wanted a X800X3D chip but they are 8 core only and they would feel like a slog when the scheduling isn't as good. Ryzen 9 5900X gets lower averages than these newer chips but 1% lows and frametimes are very, very smooth when the game is well optimized (like Cyberpunk 2077).

  • @Ralipsi
    @Ralipsi 6 месяцев назад +35

    +1000! Let's play games like Cities: Skylines or Microsoft Flight Simulator at any resolution: we will see that the CPU does matter.

    • @colevandyken2871
      @colevandyken2871 6 месяцев назад

      With a 7900xtx and 7800x3d i was gpu limited at ultrawide 1440p, i cant speak to city skylines, but at least for mfs2020 youll be gpu limited at 4k

    • @mertince2310
      @mertince2310 6 месяцев назад +9

      It's kind of infuriating that there is no comprehensive comparison of CPU performance in 4X/simulation games. It's always AAA stuff; I'd like to know how much a 7800X3D would improve my ticks per second in Vic 3.

    • @saricubra2867
      @saricubra2867 6 месяцев назад

      I bought my i7-12700K for PS3 emulation (besides for work), those games that you mentioned aren't as demanding.

    • @BlueSatt
      @BlueSatt 6 месяцев назад

      Im in this scenario. Currently have a ryzen 3700x paired with RTX2080 and planning to upgrade to Rx7900XT. Im playing sim games at 4k, 60hz monitor. Wondering if the CPU bottleneck will be a problem and if a 5800X3D upgrade is a smart move. What do you guys think ?

    • @Ralipsi
      @Ralipsi 6 месяцев назад

      @@BlueSatt A nice match i guess but that gear lifespan will be a bit shorter

  • @ronny332
    @ronny332 6 месяцев назад +2

    The point in (my own) short words: for 4K gaming, CPU performance is often not important simply because the framerates are lower and the GPU is maxed out. That's all. When the framerates rise, the CPU has to do a lot more work again.
    To be futureproof you'd need an unlimited-fast CPU and GPU %-). Nobody really knows which new game will "waste" more power on the GPU or the CPU. "Too much" just does not exist, but saving money on a top-level CPU while dropping in a 4090 is just wrong.
    Especially with the UE5 engine doing a lot of its ray tracing work on the CPU, we can all find ourselves in a state where a faster CPU could easily be needed. And that's before the need for intense AI NPCs, just because... why not?

  • @Plasmacat91
    @Plasmacat91 6 месяцев назад +1

    Another great video my man. You remain one of the best YouTubers in this space.

  • @HivisFree
    @HivisFree 6 месяцев назад

    You're 100% on it man, I use a 7900xtx and a 5800x and play in 4k native, Ive noticed on Cyberpunk that my 5800x bottlenecks the 7900xtx when a lot is going on like in combat situations and what not. I noticed it just like how you are saying, my 1% lows tank down to the 30s while my regular fps is still in the 80s. I think the new AI for the police and NPC's has a lot to do with it, especially when you turn the crowd density to low. I usually have to play with crowd density on medium. But even watching benchmark videos of a 7800x3d with a 7900xtx the 1% lows are still just as bad, which has made me hesitant on upgrading my cpu

  • @clouds5
    @clouds5 2 месяца назад

    So, TLDR: if you plan on optimizing your games for HIGH FPS, your cpu is more important.
    You open the game and put everything on max like 4k/ultra but then you realize you only get 40-50fps or you might get 60-70 but want 90fps+ etc. Then what you do is turn down graphics settings and/or use upscaling (duh!), and the more you do that, the closer you get to your CPU limit (=max fps your cpu can deliver before the gpu starts rendering). And once you hit it, you can turn down your settings as much as you want and upscale from 240p you won't get more fps. It has to be said though, if you have a semi modern CPU and want to play with "moderate" fps ~60-90fps you really have to look hard for games where this becomes an issue.

  • @British_Dragon-4K-Simulations
    @British_Dragon-4K-Simulations 6 месяцев назад

    Fly at low altitude in Microsoft Flight Simulator or DCS with a i9-12900K and an RTX 4090 and you will see your frames sink below 30fps in 4K.
    This also happens in Racing Sims when you have a full grid of cars.
    I find it hard enough to maintain just 72fps in a Pimax Crystal VR headset in DCS and MSFS2020.

  • @anitaremenarova6662
    @anitaremenarova6662 6 месяцев назад +2

    CPU does matter, but not nearly as much as people think. In 4K you'll need to spend a lot more on the GPU so it's perfectly fine to go with a 5600 if your only interest is casual gaming.

    • @wertyuiopasd6281
      @wertyuiopasd6281 5 месяцев назад

      7600*

    • @anitaremenarova6662
      @anitaremenarova6662 5 месяцев назад

      @@wertyuiopasd6281 Zen4 is still more expensive all around without providing too much of a boost. Budget builds are fine with Zen3 but if you really want to get the current gen then 7500f is the go-to if available.

  • @jesusfreakster101
    @jesusfreakster101 6 месяцев назад

    Thank you for this explanation- I am one of many who struggle with settings and tweaks - more education is needed on topics like this.

  • @tourmaline07
    @tourmaline07 6 месяцев назад +1

    As someone with an 8700k and 4080 Super combo , CPU performance does absolutely matter. I'm even CPU bound in Time Spy of all things 😂 . Bit imbalanced but my old 2080ti GPU died recently , so put this GPU into my old build. Planning to upgrade to Zen 5 though soon enough ;)

  • @c-dub8639
    @c-dub8639 6 месяцев назад

    Not to mention that the CPU has a major impact on system snappiness and load times even if you're 100% GPU bound while gaming.
    I have a TitanXp (1080 TI equivalent) paired with an R7 5800X and I'm itching for a CPU upgrade even though my GPU is 3 years older than my CPU.
    Additionally, the CPU can get interrupted by background tasks or if the game has to load in assets and that can come through as stuttering even in GPU bound scenarios.

  • @walter274
    @walter274 6 месяцев назад

    Good points! In many cases, you feel the difference between CPUs in the 1% and 0.1% lows. Also, most people don't play on a clean test bench, so in reality the CPU is probably going to be hit with random requests while gaming. The better your CPU, the better it can cope with stuff like that.

  • @13loodLust
    @13loodLust 6 месяцев назад +15

    The only bottleneck everyone has is how much is in their bank account.

    • @lenscapes2755
      @lenscapes2755 6 месяцев назад +2

      You would be surprised how many people are bottlenecked by their wrong decisions and not researching when they are buying PCs or parts and wasting their money.

  • @niko220491
    @niko220491 6 месяцев назад

    Nice video.. again. :) I can absolutely recommend the videos from Gamers Nexus regarding "GPU Busy", "Animation Error", "input lag" (Latency pipeline), "Optimization" and how GPU drivers work.. discussed with an engineers from Intel and Nvidia.

  • @Iamkaden
    @Iamkaden 6 месяцев назад

    I learned about this on my own tinkering with my PC like 6 years ago. A lot of people get it wrong. Heres a simple test you can do as well: if you want to know your highest FPS you can go in any game, first drop to the lowest resolution the game supports, then see what your FPS is uncapped. You can also add an upscale setting such as DLSS to further increase your FPS if you are somehow still GPU bound. 👌

  • @valuehunter5544
    @valuehunter5544 6 месяцев назад +1

    spider man remastered with ray tracing enabled is the perfect example of this. it's still very cpu influenced even at 4k.

  • @christophermullins7163
    @christophermullins7163 6 месяцев назад +3

    This feels clickbaity. The statement "CPU matters less at 4k" is just as valid as it was before this video.

    • @od1sseas663
      @od1sseas663 6 месяцев назад +1

      It’s not

    • @christophermullins7163
      @christophermullins7163 6 месяцев назад

      @@od1sseas663 make an argument numb nuts..

    • @christophermullins7163
      @christophermullins7163 6 месяцев назад

      @@od1sseas663 make an argument then numb nuts. It shows ignorance to think or say that higher resolution does not remove CPU bottleneck. It's irrefutable that it does.

    • @Centrioless
      @Centrioless 6 месяцев назад +1

      I agree. He intentionally avoids putting an actual benchmark at 4K because the negligible difference wouldn't suit his narrative

    • @od1sseas663
      @od1sseas663 6 месяцев назад +1

      @@Centrioless That’s the point. It’s not a cpu benchmark anymore when you’re testing 4K, it’s a GPU benchmark. 🤦🤦

  • @valentin3186
    @valentin3186 6 месяцев назад +2

    Is DD2 really cpu limited? Or just badly optimized. The industry dropped the ball when the highest of the highest end cpus dip to 20-30s fps

  • @perlichtman1562
    @perlichtman1562 6 месяцев назад

    Or to put it another way: when do you start to be GPU limited? One of the best tools for visualizing this is the Forza Horizon 5 benchmark because it shows what percentage of the time it’s limited by one vs. the other and how many FPS the CPU is managing in a given run vs. the GPU.

  • @rodriguezkev
    @rodriguezkev 6 месяцев назад +2

    So basically as nearly everyone plays games at upscaled 4k instead of native, the CPU is just as important as playing at a native lower resolution.

  • @novantha1
    @novantha1 5 месяцев назад

    Usually I look at 1% lows when judging a PC build, not the average frame rate. To an extent that’s personal preference, but anyway: I’ve noticed that a lot of higher end CPUs have better 1% lows even given the same average frame rate. Sometimes you also get a boost from faster memory, storage, and higher VRAM GPUs, and those tend to heavily influence my purchasing decisions.

  • @facegamefps6725
    @facegamefps6725 6 месяцев назад

    Good job Daniel! I brute forced Dogma so no issues. I recorded it on my channel with 4 pawns in the city.

  • @Nekudza
    @Nekudza 6 месяцев назад

    Also graphic settings are well scalable in most of the games while CPU load settings are way more limited. And, most important to me - good CPU will usually provide better 1% and 0.1% lows which are the most important metrics perception-wise

  • @themalanden
    @themalanden 6 месяцев назад

    This actually happened to me. I had a 32-core Threadripper 3970X and I upgraded my GPU to a 4090. In Forza Horizon 5 and The Witcher 3 in particular, no matter what settings I used, the framerate was about the same: 1080p, 4K, and 4K with any DLSS all performed about the same. Due to this I had to sell my 3970X for a more current CPU.

  • @mrhassell
    @mrhassell 4 месяца назад

    RAID 0 your NVMe M.2 drives and theoretically, double your throughput. Also turn off memory integrity, when gaming for extra FPS. In the most recent Insider Preview builds, Windows will notify the user that the Memory integrity feature is currently turned off so that action can be taken for the user to turn it back on so that their device is as secure as possible against malicious attacks.

  • @gloth
    @gloth 6 месяцев назад

    It's all about understanding where (and what) the bottleneck is. Increase the GPU load, either with resolution or higher settings, to the point that the GPU becomes the slowest component of your system; at that point the CPU does not matter. That point is obviously different on a 7800X3D vs a 5600, or on a 3090 vs a 1060. And of course that point is not a global value, but depends on how a specific game utilizes the CPU and GPU (or RAM and VRAM, or even drive speed).
    I usually find the opposite to be true, people seeing cpu gaming benchmarks (correctly done at the lower res with lowest graphic settings) and thinking that the x cpu is going to give them that much fps gains in 4k rt ultra, when they would gain a lot more with a gpu upgrade.

  • @zaleskar
    @zaleskar 6 месяцев назад

    9:41 Here's the key point. I used to think CPU didn't matter at 4K too. Then I actually got a 4K monitor and started using DLSS on every single modern game. You just need to look at the CPU usage rising after activating DLSS to figure out that your CPU actually *does* matter. If you play at 4K native it still matters, just a lot less since you're extremely GPU bound.

  • @jonogrimmer6013
    @jonogrimmer6013 6 месяцев назад

    Every day is a school day! A lot of people including me often think they know more than they actually do. Being wrong about something isn’t a bad thing as long as you learn from it. Great video

  • @Omega_21XX
    @Omega_21XX 6 месяцев назад

    Went from a 6950X at 4.2 GHz all-core to a 7800X3D build with the same GPU, an RTX 3080. The 4K claim does hold true 9 times out of 10. There are certainly more games today that NEED SSD storage to perform well, and most of those are the same ones struggling with older CPUs. I personally think for most people targeting 4K, the GPU is still going to be the main component. Outliers like Dragon's Dogma 2 are not the norm; I honestly haven't felt any noticeable difference from my upgrade.

  • @Khalid_Looby
    @Khalid_Looby 6 месяцев назад +1

    Intel 4004 could pair well with rtx 4090 or 5090.
    The Intel 4004 is a 4-bit central processing unit (CPU) released by Intel Corporation in 1971. Sold for US$60 (equivalent to $450 in 2023), it was the first commercially produced microprocessor, and the first in a long line of Intel CPUs.

  • @AbhishekG2000
    @AbhishekG2000 6 месяцев назад

    There are a lot of misunderstandings among gamers about how things work. I also came across some posts suggesting you turn on ultra textures even on a weak GPU just because it has a lot of VRAM. According to them it doesn't affect performance, but it absolutely does: mid-range GPUs come with limited memory bandwidth.

  • @o4karik41
    @o4karik41 6 месяцев назад

    The easiest way to explain how a "bottleneck" works is to understand that you have two possible FPS limiters: the CPU and the GPU (if we aren't talking about RAM). They are fairly independent of each other. Each can only prepare or draw a certain number of frames, that's all. If your CPU can't prepare more than 60 fps, you simply can't go above that without upgrading the CPU itself. The same applies to the GPU.
    I think that's the best piece of knowledge I picked up, about 8 years ago; it cleared up my confusion about PC-building theory.

  • @sethdou10
    @sethdou10 6 месяцев назад +3

    So a 5600 + 4080 shows a trivial difference from a 7800X3D + 4080 at 4K. What are you trying to prove?

  • @SingularitySource
    @SingularitySource 6 месяцев назад

    Would love to see someone do a video comparing CPUs at different DLSS/FSR presets, to see the performance difference between internal resolutions upscaled by the GPU vs. the same resolutions rendered natively to the display. Most might think it's 1:1, but the extra processing done by the GPU may affect overall in-game performance.
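
    For context, here is a rough sketch of what the internal render resolution looks like per preset at a 4K output. The scale factors are the commonly cited per-axis ratios for DLSS/FSR quality modes; treat them as approximations, since individual games can use different values.

    ```python
    # Hedged sketch: approximate internal render resolution per upscaler preset at a 4K output.
    # Scale factors are the commonly cited per-axis ratios; specific titles may differ.

    PRESET_SCALE = {
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 0.333,
    }

    def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
        scale = PRESET_SCALE[preset]
        return round(out_w * scale), round(out_h * scale)

    for preset in PRESET_SCALE:
        w, h = internal_resolution(3840, 2160, preset)
        print(f"4K output, {preset:>17}: renders ~{w}x{h} internally")
    ```

    So "4K with DLSS Performance" is internally a roughly 1080p render plus upscaling work on the GPU, which is why a CPU comparison under upscaling starts to look more like a 1080p benchmark than a native 4K one.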

  • @caelanb1711
    @caelanb1711 6 месяцев назад +7

    A smaller content creator made a video recently where he suggested pairing a 4080 with a Ryzen 5 7600 for a 4K build. I commented and said that was a bad pairing, and a bunch of people all replied to my comment giving me so much crap and roasting me while saying you didn't need a stronger CPU for 4K because games will pretty much always be GPU bound at 4K. I then made the mistake of bringing it over to reddit, where I then got roasted by like a million people who all agreed that you don't need more than a 7600 for something like a 4080. Truly a "be right even if you're alone" moment for me. I mean seriously, if you're pairing a $1,000 GPU with a $200 CPU in 2024, you've done it wrong, even if you're playing at 4K.

    • @g.eberling8700
      @g.eberling8700 6 месяцев назад +1

      this is so true. At this point you should be aiming for a 7800x3d and nothing less.

    • @pengu6335
      @pengu6335 6 месяцев назад +2

      @@g.eberling8700 Eh, I'd argue you can get away with a 5800X3D.

    • @ahmed_x5597
      @ahmed_x5597 6 месяцев назад +1

      They are right; the Ryzen 5 7600 is very, very good. It can handle a 7900 XTX in balanced games, just not in highly CPU-demanding ones

    • @g.eberling8700
      @g.eberling8700 6 месяцев назад +1

      @@pengu6335 You would. But at such a high price class I would go with AM5. There is a high possibility that you can upgrade the CPU in future without changing anything.

    • @pengu6335
      @pengu6335 5 месяцев назад

      @@g.eberling8700 That's only assuming you already have an AM4 motherboard.

  • @max5183
    @max5183 6 месяцев назад +1

    I never only play games. I need benchmarks where they game at 4K with a 2nd monitor showing 3 Chrome tabs streaming video, voice chat over Discord, Spotify, etc. all running. (No, I don't listen to people, music, and video at the same time, but the programs are running.)
    And THEN tell me I only need 4 cores and 16 GB of RAM in 2024, because that's BS

  • @lucianogerace9951
    @lucianogerace9951 5 месяцев назад

    I'm not completely sold by this argument. On paper I would agree with what you say Daniel, but my actual experience with a 4070 Ti and a 5600 is just so different. You mention Starfield for example, and I was able to play it all maxed out with DLSS at 4k120 with my CPU barely going over 70% usage.
    My general experience with recent games, Alan Wake 2 being the most demanding, has been consistent: GPU doing all the work while the 5600 is just sitting there.
    CPU intensive games feel like the exception, for now at least

  • @Jackjack1978.
    @Jackjack1978. 6 месяцев назад

    That was literally the best explanation of how CPUs do and don't affect performance I've ever heard. I don't know why the big YouTubers can't break it down simply like that so people can actually understand.

  • @machoortv
    @machoortv 6 месяцев назад

    I recently upgraded from a Ryzen 5 5600X to a Ryzen 7 7800X3D paired with a 3080, because I also play at 4K with DLSS and thought it would improve the frame rates a lot. But it didn't really change anything, and it feels like a waste of money.

  • @Stinger2578
    @Stinger2578 6 месяцев назад

    My last PC was put together around 2011 around a Phenom II X6 1100T, a 6-core CPU, paired with a GTX 560 that was later upgraded to a 980. I could play something like Just Cause 3 at 1080p at decent settings at higher-than-console framerates, between 40 and 60. Some time later I ended up with a GTX 1080, but without much improvement. By the end of 2018 I built a new machine around an i7 8700K, and with the same GTX 1080, Just Cause 3's framerate was much closer to a solid 120 fps at the same 1080p settings. So with a new platform and new CPU I basically doubled to tripled the performance. As of now I've replaced the GTX 1080 with a 4070 Super and I can get that same 120 fps, but running at 2160p, also called 4K.

  • @jaredangell8472
    @jaredangell8472 6 месяцев назад

    I paired a Xeon 2680v4 with an rx6800. I play everything at max settings in ultrawide 1440p. Getting 45 FPS in Cyberpunk with FSR 2, 25 FPS without it. Getting 35 FPS in Immortals of Aveum without any upscaling. Getting 40 FPS in Forbidden West without upscaling. I think I'll be able to go 5 years before I need to upgrade hardware. Little to no stutters. My build cost $500 and could smoke a PS5+.

  • @mrdappernature8861
    @mrdappernature8861 6 месяцев назад

    I got a 12700k paired with a 4080 and I am very happy playing at 4k. I get about 90 to 100 plus fps in certain games. It all depends on what type of frames you want and what type of cooling you will have.

  • @TheDaoistheway
    @TheDaoistheway 6 месяцев назад

    I went with an i7 7700K and a 7900 XT. Sure, I can play all games on high settings, but it's not smooth: the 1% lows are noticeable, with tearing and stutter. I upgraded to a 14600K; frame rates went up something like 20-40% and it feels much smoother

  • @mttrashcan-bg1ro
    @mttrashcan-bg1ro 6 месяцев назад

    Back in 2019 it didn't matter that much even if you were using the fastest GPU; most people using a 2080 Ti at the time had a good enough CPU to push it at 4K. Nowadays, with games being insanely CPU-heavy at times, a 4090 can't be fully pushed at 4K in some games by anything available right now. The CPU always matters; it's more of a 40-60 split, with the GPU being the 60% since it's definitely more important, but if your CPU isn't up to spec then you WILL notice problems

  • @AronHallan
    @AronHallan 6 месяцев назад

    It does; the faster the GPU, the bigger the disparity will be. For example, when the 5090 comes out, newer CPUs will pull further away from older models, e.g. 13900K vs. 10900K.

  • @fups1
    @fups1 6 месяцев назад

    Racing simulators like iRacing/Assetto also tend to be very CPU limited, even with triple screens

  • @raute2687
    @raute2687 6 месяцев назад +1

    If you're at 4k and your gpu pushes more frames than your cpu can handle, then cpu bottlenecks exist there too. You just don't see that in most newer games, because the gpu isn't likely to keep up. But if you play older titles (or less demanding indie games), then you can be cpu-bottlenecked at 4k.

    • @jodiepalmer2404
      @jodiepalmer2404 6 месяцев назад

      I currently play Bethesda games (poorly optimised), like modded Fallout 4, on a 4790K, 16 GB of 1600 MHz RAM, and a 1080 Ti. I have moments where my highs are around 90 fps and my lows are around 20 fps. This has been happening since Nvidia introduced something in one of their updates years ago (something to do with weapon debris); before that my framerate was buttery smooth. I'm saving to upgrade my PC (7800X3D etc.) but keeping my 1080 Ti until I can afford a better GPU. Hopefully that will fix my stutters and lag.

  • @lepolangopower
    @lepolangopower 6 месяцев назад

    I believe the reasoning to be that in 4k it is hard to reach somewhere above 60fps. So, as long as you stick with a monitor that is 75Hz for example, you are good with whatever processor. That's kinda my situation, I'm looking to buy a used 3070 to continue playing 1080p 75Hz forever. So I think my Ryzen 1700 will last a long time for that.

  • @JorgeMartinez-dp3im
    @JorgeMartinez-dp3im 6 месяцев назад

    Basically, both your CPU and GPU need to have the computing power to output at your chosen resolution and framerate target. With the added variable that those computing needs will vary from game to game. I honestly get the confusion.

  • @GENKI_INU
    @GENKI_INU 6 месяцев назад

    If you aren't a competitive/esports player, and you are fine with the maximum framerate and 1% lows that a given CPU can produce in a game, then there's no reason to worry or upgrade.
    This is especially true for the massive number of 60 Hz 4K monitors that people still use.
    And it's not a lost cause either, because even if your CPU is the bottleneck, you can still crank the graphics settings up "for free".

  • @bigarmydave
    @bigarmydave 6 месяцев назад

    It definitely does. 4k gaming on my 7950x3d/4090 vs my older 5800x3d/4090 can be a huge difference!

  • @haroldbutler778
    @haroldbutler778 6 месяцев назад +1

    I'd also bet people running 4K don't really need AA cranked up to 4x or 8x MSAA or TXAA; that's pretty taxing on the GPU at 4K. I bet turning it down would show an FPS uplift in some games, which is where a good CPU matters.

  • @felderup
    @felderup 6 месяцев назад

    You could say it doesn't scale linearly: some things put the same load on the CPU regardless of resolution.

  • @Tainted-Soul
    @Tainted-Soul 4 месяца назад

    Very well put, and a new way to look at CPU vs. GPU requirements

  • @kathrynck
    @kathrynck 6 месяцев назад

    I'd go with something above "sorta true", maybe "mostly true".
    There's some "every little bit helps" involved. And also that at 4k (with current graphics cards & games) is going to be heavily GPU bound 99% of the time.
    Ray Tracing throws this off somewhat though, because it kinda pumps some math workload back to the CPU. In small amounts, this is minor, in cyberpunk @ super-ultra-mega-RT... it's quite noticeable.
    Also, it matters if you are going to use frame generation. Which is going to artificially inflate the cpu workload with more frames to process. Personally I'm not a frame-gen fan (though I _am_ very impressed with the modern upscaling tech).
    As a statement, it's "right enough" to make fairly sound decisions on where to spend and where to save, in a gaming pc build, based on your display resolution.
    There will come a day (probably) where 4k will be "easy" for graphics cards, and it'll be cpu-bound. That's very rarely the case right now though.
    It does matter what you play though, definitely. Make a big 2000-part rocket in kerbal space program, and your cpu will be screaming about the physics simulation, while your GPU is sipping tea.
    I paired a 4090 with a 7900X. Because I got a 7900X for $350 _with_ 32GB free 6000 ram, and $20 off any mobo. Typical pricing for it _now_ but it was phenomenal pricing then. And 7900X had the best single-core of AM5 (at that time). I do look forward to upping it to something like a 9950X/9800X3D (or even a generation beyond that) when we get clear info on what the final boss form of AM5 will be. Plus whatever the best "reasonable" ram is at that time (8000? who knows). My ram is kind of an experiment, where I got 64GB of non-matched ram to run stable across 4 sticks (which isn't recommended, but I got it working 100% within my receipt return window). It will be more long-term reliable with 2x32GB sticks though. But for now, I got AM5 features (cheap end X670E), and a cheap cpu with great single core perf, and 12 cores. And even with a 4090, i'm usually GPU bound in 4k. Also, why 64GB? It's really a dual-use pc.
    Worth noting, a cpu can affect "responsive feel" sometimes too (I mean above & beyond frame rate, like the 'total input lag' for clicks & inputs), and can affect things like load times, etc.
    Upscaling... sure I can (and sometimes do, if the weather is hot) run at 1440p upscaled to 4k in DLSS. And that can shift the bottleneck more to the CPU certainly. But it has to go through a cable to the 4k display... and there's a bandwidth limit there. So 7900X is "good enough" to hold up it's end of things, even with upscaling. At 165hz 4k, there's a little bit of artifacting from compression, so I usually just run it at 120hz which is the full bandwidth for HDR 4k without any compression. 7900X meets that need, and my only reason to upgrade it in the future is the dual-use aspect of the pc, and future-proofing the cpu/mobo.
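
    On the cable-bandwidth point, a rough back-of-the-envelope calculation (a hedged sketch that ignores blanking intervals and link-encoding overhead, so real links need somewhat more than these figures):

    ```python
    # Hedged sketch: uncompressed pixel data rate only; blanking and link-encoding overhead are ignored.
    def data_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_channel: int) -> float:
        bits_per_pixel = bits_per_channel * 3  # RGB, no chroma subsampling
        return width * height * refresh_hz * bits_per_pixel / 1e9

    for hz in (60, 120, 165):
        rate = data_rate_gbps(3840, 2160, hz, 10)
        print(f"4K {hz} Hz, 10-bit RGB: ~{rate:.1f} Gbit/s of pixel data")
    ```

    That's roughly 15, 30, and 41 Gbit/s respectively before blanking and encoding overhead, which is why very high refresh 4K HDR ends up leaning on DSC while 120 Hz can still fit uncompressed on the fastest current links, matching the experience described above.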

  • @ColdMoongg
    @ColdMoongg 6 месяцев назад

    Enjoyed the unbiased perspective, and the fact that you explain why the common misconception gets repeated rather than just grilling people.

  • @RJ-pb1qx
    @RJ-pb1qx 6 месяцев назад

    In the age of multi-threaded CPUs, I'm amazed that devs aren't utilizing more cores in games. I would expect to see almost no stutter or hitching on something like mine, a 5900X; 12 cores / 24 threads should easily be enough, and you'd think even 8 cores with 8 or 16 threads would be enough as well.

  • @ShaneH5150
    @ShaneH5150 6 месяцев назад

    This does make sense so I want to say thank you for sharing the knowledge! My biggest pc bottleneck currently is my checking balance

  • @theMetallico666
    @theMetallico666 6 месяцев назад

    I don't fully understand it, but I remember I had an i5 7500 with a 1060 6GB. When I got an i7 10700K, my 1060 felt like a phoenix: I could play more games with better frames and stability. Afterwards I got my current 3070 and have been able to play smoothly at 2K for a while. If I'm upgrading, I'll go for the CPU first once again, LOL. (Aiming for a 13th-gen i9 or i7 if I see a good sale.)

  • @zerorequiem42
    @zerorequiem42 6 месяцев назад

    "They know just enough to get them in trouble."
    I feel personally attacked.

  • @Gameover571
    @Gameover571 6 месяцев назад

    I used to think it didn't matter at all; I kept my 6700K with a 6800 XT at 1440p ultrawide (75 Hz). I was wrong: moving from the 6700K to a 13400F is night and day. Stutters are gone, games load faster, and Cyberpunk at ultra basically doubled its FPS in some areas.

  • @GameHUB4k
    @GameHUB4k 2 месяца назад

    I noticed a massive difference when I went from a 5700X to a 5700X3D; it feels way smoother and I get absolutely no stutters now at 4K

  • @ludwigrevolution3288
    @ludwigrevolution3288 6 месяцев назад

    The 4090 was already bottlenecked by the 12900K or 5800X3D at 4K in some games back then!
    Also, for DD2 your 1% lows are a lot better when you stop being CPU bottlenecked (e.g. lock the frame rate to 60-70, or use a weaker GPU). This is a problem with the game itself rather than CPU bottlenecks inherently, but I've seen it happen often enough that you'd rather not be CPU bottlenecked, if possible, for smooth framerates.

  • @NGreedia
    @NGreedia 6 месяцев назад

    Driver overhead is an issue with Nvidia. In the 7600 vs 5600 video, the 4090 was barely ahead of a 6950xt on the 7600

    • @saricubra2867
      @saricubra2867 6 месяцев назад

      I'm an i7-12700K owner, what is driver overhead? (a joke).