Do You Really Need A Powerful CPU For 4K Gaming?

  • Published: 18 Oct 2023
  • ► Watch the FULL video here: • DF Direct Weekly #133:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on Twitter: / digitalfoundry

Comments • 170

  • @silence1977
    @silence1977 9 months ago +11

    Chips that are E-core only already exist: the N100 and N200. They actually work quite well; we use them for things like digital signage and kiosks.

  • @87crimson
    @87crimson 9 months ago +9

    There are lots of issues that you can't just power through with better HW. And you can absolutely do high-resolution gaming with a lower-end CPU. I replaced my GTX 760 with an RX 580 and it allowed me to run lots of 8th-generation AAA games at 1800p or 4K 30 FPS with ultra textures on an FX 6300 CPU.
    That's why system spec sheets have always troubled me. They include "4K specs" that ask for the best CPU on the market when you are going to be GPU-bottlenecked most of the time, so a lower-end CPU might do just fine.

  • @perlichtman1562
    @perlichtman1562 9 months ago +11

    Having benchmarked the A770, 3060, 4070 and 6800 XT on an i7-3820 (4 cores, 8 threads with HT or 4 without), a 13600KF and a 13900K, I would say there are a lot of games where, if you max out a lot of the settings at 4K, you can still get high GPU usage even on an old 3820. Will you still gain a little performance from a newer CPU most of the time? Yes, but I didn't really see much at the 3060 class, for instance in Forza Horizon 5 or F1 2022.
    On the other hand, there are more CPU-limited games (like Baldur's Gate 3 in the city of Baldur's Gate) where the CPU makes a bigger difference even at 4K.
    So it's not always one or the other. But there are certainly a ton of games out there that play great at 4K even on a CPU well below a 12400/13500.

  • @ionamygdalon2263
    @ionamygdalon2263 9 months ago +2

    Actually, do consider making a tutorial on minimizing the stutter in those games mentioned at the end. It would yield many views imho

  • @ChewyLoo
    @ChewyLoo 9 months ago +12

    I'm rocking an i9-9900K with a 4080 at 4K. I keep thinking about a CPU upgrade, but on the whole it's still delivering a premium experience in high-end games

    • @marrow94
      @marrow94 9 months ago +9

      Just don't waste money on a f***ing i9 next time; get an X3D CPU from AMD: less power, and faster than Intel every time.

    • @ChewyLoo
      @ChewyLoo 9 months ago

      I bought it in 2019, before X3D was a thing @@marrow94

    • @OPDR1P
      @OPDR1P 9 months ago +8

      Went to a 7800X3D from a 9900K. If you're just gaming and want to push your GPU to its limits, I'd recommend pulling the trigger. My 9900K drove my 3080 Ti pretty decently, but I'm definitely seeing much higher frame rates in every game with the 7800X3D. If you're playing on a giant TV and looking to keep more of a locked 60 rather than trying to squeeze out every fps you can, then the 9900K still has plenty of life left.

    • @Kojiros7th
      @Kojiros7th 9 months ago

      @@OPDR1P I just bought a 7800X3D for a new build; I'm still rocking a 2070 Super and an i7-8700K. I game mainly at 1440p, but I do plan on pushing 4K a bit more. I'm sure my GPU will be a colossal bottleneck paired with the 7800X3D for 4K gaming, but tbh I'm not all that interested in upgrading to a 40-series card. I'm content to wait for a 50-series card. Do you think I should wait, or buy one of the new refreshed 40-series cards slated to come out in 2024? If anything I'd upgrade to a 4080 Ti next year, but I'd rather wait for the 5080 Ti tbh

    • @NPTEEE
      @NPTEEE 9 months ago +1

      @@marrow94 AMD SUCK

  • @gameranch1995
    @gameranch1995 9 months ago

    Okay, so I have some questions about a couple of aspects of modern video games.
    1) Firstly, I'm confused about photo mode. I have noticed some games look more detailed in their photo mode, and we can also place spotlights to light the scene up however we want. Is this all being done in real time (on our GPU)?
    2) When a trailer is released, we often see "in-engine footage" mentioned. What exactly does this mean? Does it mean the game is being rendered on the hardware it is intended to release on?
    3) What are real-time cutscenes? Choice-based games like Detroit or the Dark Pictures Anthology don't have a lot of gameplay but have a lot of cutscenes we can interact with, for example through quick time events. Are those being done in real time? Like, what's real?
    4) In modern games I have noticed that some light sources, such as the flashlights characters use and local lights, cast really high-resolution shadows, while the sun casts low-resolution shadows with jagged edges, and in some games it's vice versa

  • @walter274
    @walter274 9 months ago

    I think the E-cores are supposed to handle all of the background stuff that can happen. I think the issues come from what gets scheduled on which cores.
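
A minimal sketch of the scheduling point above: on Windows you can sidestep the scheduler and pin a game to the P-cores yourself via CPU affinity. The psutil calls below are real, but the process name and the logical-CPU indices for the P-cores are assumptions; hybrid Intel chips typically enumerate P-cores first, so verify your machine's layout before trying anything like this.

```python
# Hedged sketch: restrict a game process to P-cores so background work
# lands on the E-cores. Assumes the P-cores' logical processors are
# enumerated first (typical for Alder/Raptor Lake, but machine-specific).
import psutil

GAME_EXE = "game.exe"                  # hypothetical process name
P_CORE_LOGICAL_CPUS = list(range(12))  # e.g. 6 P-cores with HT = 12 logical CPUs

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_LOGICAL_CPUS)  # scheduler may now use only these
        print(f"Pinned PID {proc.pid} to logical CPUs {P_CORE_LOGICAL_CPUS}")
```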

  • @AppleTaco
    @AppleTaco 9 months ago +2

    Ryzen 5 3600 + RTX 3080, and I can play at 4K without issues in well-optimized games. Even BF 2042 shows 98% usage on CPU and GPU and stays at 60-80 fps maxed;
    other games at 50-80% usage barely hold a constant 60+ fps

  • @Dirt33breaks
    @Dirt33breaks 9 months ago +1

    On my ASUS mobo you can disable the E-cores just by pressing the Scroll Lock key.
    It's very convenient

  • @Solidus-JD
    @Solidus-JD 9 months ago

    I've finally come over to the dark side. I get my i9-14900K & 4090 Monday. Thanx DF✌

  • @jameswayton2340
    @jameswayton2340 9 months ago +6

    Whatever you do: don't trust websites telling you you have a "bottleneck" when combining a certain CPU with a certain GPU at a certain resolution. Those websites are garbage and on average will tell you to buy a CPU that is way more powerful, and thus more expensive, than you need.
    For example, I built my PC with an RTX 4080 combined with an AMD Ryzen 5 7600X CPU. GPU usage is very good, on average 97-98%, even when using something like DLSS Balanced. But the website says this creates a 17.7% bottleneck. Absolute nonsense.

    • @dante19890
      @dante19890 9 months ago

      The difference is that they have done the A/B testing and you have not.
      Some games are more GPU-bound than others, so you have to exclude those

  • @NanoNutrino
    @NanoNutrino 9 months ago

    I have an i7-5820K at 3.30GHz and a 3080. My 4K 120Hz OLED runs pretty well. I did cash out for a 13900K but have yet to put it together

  • @doctorsatan4260
    @doctorsatan4260 2 months ago

    Still getting by with an i5-12400 at 4K. I usually get at least 60fps worst case in most unoptimized games, and have no problem running 100+ fps with DLSS and a little balancing that looks just as good to me. However, with RT in the newest 2024 titles, it feels like this is probably the last year I keep it going without feeling it's holding anything back.

  • @mondodimotori
    @mondodimotori 9 months ago +1

    Wait, when was core/thread optimisation in games "common"??

  • @leonidasdestoleo2761
    @leonidasdestoleo2761 9 months ago +1

    I'm currently on a 5600X paired with a 6900 XT LC GPU, and it's just fine for 1440p and 4K in plenty of titles, because my budget is kind of limited; but this month a 5800X3D is probably coming to give some love to my system.
    After that I'll be waiting for the next gen of CPUs and GPUs, but it will also depend on the pricing. Hopefully I won't have to sell my other kidney...

  • @cameron818
    @cameron818 5 months ago +1

    My R7 3800XT is starting to show issues at 4K. I think single-core performance is key. So for me it's either an R5 5600X or the X3D CPUs if I don't want to go AM5

  • @cosbusta
    @cosbusta 9 months ago +2

    Games that are fine work great on mid-range CPUs. Games with inherent stutter will have issues regardless; trying to power through engine and optimization issues by overpaying, only to still experience the problems (just lessened), feels like being robbed.

  • @molsen7549
    @molsen7549 9 months ago +1

    Glad my 5900x will be relevant for a bit longer, especially when an upgrade means switching to a whole new platform.

  • @DonaldHendleyBklynvexRecords
    @DonaldHendleyBklynvexRecords 9 months ago +1

    I was waiting for this. I have a quad-core, 8-thread CPU and I know it bottlenecks my 7900 XTX, but 90% of the time at 4K it really doesn't

  • @rolandrohde
    @rolandrohde 9 months ago

    I very recently returned to PC gaming after being a console gamer for 10 years. On consoles there was always a lot of talk about the CPUs being too weak to do 60fps, but when I run games on my PC (7800X3D, 7800 XT, 32GB DDR5, Samsung 990 Pro SSD) the CPU is idling at around 5% most of the time in basically every game I have tested so far, while the GPU is pretty much always at 100% load. And I'm not even playing at native 4K (I use FSR where possible), yet even fairly simple games like PUBG are extremely GPU-bound. Maybe things would be different if I had an RTX 4090...🤷‍♂️

  • @quikspecv4d
    @quikspecv4d 9 months ago +4

    I've been wondering this. I typically game at 4K with a 5800X (overclocked) and am hoping it will do well when I upgrade my GPU next gen. Maybe an RTX 5080 when it eventually releases

    • @daniil3815
      @daniil3815 9 months ago +1

      It will do fine, but I would suggest buying a new CPU too. The 5080 will be as powerful as the 4090, maybe even more powerful. And the 5800X doesn't utilize 100% of a 4090 even at 4K.
      I personally have a 5800X and a 3060 Ti and am planning to build a new PC in 2025. Hopefully I'll be able to afford a 5090 with whatever the best processor available in 2025 is.

    • @environm3ntalist549
      @environm3ntalist549 9 months ago

      you won't need a 5090 if you are just buying it for gaming @@daniil3815

    • @puffyips
      @puffyips 9 months ago

      @@daniil3815 The 5800X is just not enough; low-end AM5 is better for a higher-end GPU

    • @cameron818
      @cameron818 5 months ago

      What CPU did you come from? I have a 3080 Ti and I would like to buy the same CPU for myself. I'm curious how much difference it would make, as in some games at 4K 60 I'm sitting around 40% CPU usage and I get frame drops. I believe this would not be the case with a better CPU.

  • @halo2bounceguy
    @halo2bounceguy 9 months ago +1

    My 10700K has not given me any issues with my 4090. When I upgrade my mobo I'll get a new CPU. Literally every intensive ray tracing game I've played has been GPU-limited. The only one a better CPU would've helped with is Spider-Man on PC.
    4K120 is no issue with a 10th gen, is all I'm saying

  • @Lead_Foot
    @Lead_Foot 9 months ago

    Intel was working on an all-E-core server/workstation CPU.

  • @jeckyfreaky
    @jeckyfreaky 7 months ago +1

    I need more videos like this to stop me from considering buying an overpriced CPU just for gaming. I'm running a 4090 with a 12400F. It's not a perfect 99% GPU load all the time, but in most GPU-intensive games at 4K the performance difference is too small to justify spending 3-4x the CPU price. You can check my benchmark (not using a capture card, so the fps might be a little low) of Cyberpunk 2077 at 4K with path tracing; it can still push the 4090 to over 70fps

  • @jmangames5562
    @jmangames5562 9 months ago

    There are so many variables from PC to PC that it can't be generalized. My 13700K is flawless: never a noticeable stutter, every game smooth as butter; couldn't ask for better. With my 5700X I get some, but nothing game-breaking. I prefer Intel just because my overall experience with them has been so good. I don't have any of the E-core issues he speaks of.

  • @Daddy80sCool
    @Daddy80sCool 9 months ago +2

    The bottleneck that my i5-8400 was causing my RX 6800 XT was horrible. I upgraded to an i5-12400F and it made a huge difference.

    • @duxnihilo
      @duxnihilo 9 months ago +2

      I think the point here is more the difference between mid-tier and high-tier.

    • @PantsaBear
      @PantsaBear 9 months ago +2

      Obviously, in your more extreme case of using a 6-year-old CPU that was mid-range back then, it's going to bottleneck a GPU from 2-3 years ago that is playing games coming out in the present day.
      Like the other person said, the discussion was more about whether you need to upgrade from pretty modern and fairly powerful CPUs to nearly the highest tier of CPU to accommodate game optimization in 2023

    • @duxnihilo
      @duxnihilo 9 months ago

      @@PantsaBear Indeed, that was my point. A 5600 will be more than enough nowadays.

    • @Hobbes4ever
      @Hobbes4ever 9 months ago +1

      It's because that garbage CPU doesn't even have hyperthreading

  • @NarciisJr
    @NarciisJr 9 months ago

    My 5900X is holding my 3090 Ti back at 3440x1440 native. DLSS is even worse with RT. 3800 CL14 RAM, using PBO / PBO scalar x10 and also Curve Optimizer

  • @GitDatPC
    @GitDatPC 6 months ago

    I have a 9900k paired with my RTX 4090 gaming at dual-4K resolution 😀

  • @datageek9132
    @datageek9132 9 months ago

    The Intel N series is E-cores only (N100, N200, N305, etc.)

  • @sieq82
    @sieq82 9 months ago +2

    Paired a 5800X3D with a 4090 (I was running a 3700X before), and honestly this will do me just fine at 4K for some time, as I game on a 77-inch TV. A lot of reviews say the 5800X3D is a bottleneck, but honestly I cannot see it.

    • @dante19890
      @dante19890 9 months ago

      Yeah, you will get better performance with an i9-13900K and really fast DDR5 with a 4090. So yes, in a way you are bottlenecking your card

    • @Hobbes4ever
      @Hobbes4ever 9 months ago +1

      Those reviewers must be working for either AMD or Intel, or be sponsored by them. The 5800X3D is as fast as or faster than the 13700K in most games

    • @Kitten_Stomper
      @Kitten_Stomper 9 months ago

      Even if it does bottleneck a little bit, frame gen should make that irrelevant now.

    • @cameron818
      @cameron818 5 months ago

      How much did your CPU usage % change going from the 3700X to the 5800X3D? I have a 3800XT and I'm considering a 5800X or the 3D. My CPU usage in some games is 40% at 4K and I get frame drops and stutters

    • @sieq82
      @sieq82 5 months ago

      @@cameron818 The main thing I noticed is that minimum frame rates went up by a mile; stutters were gone as well, so if I have any now, they're mostly down to the game, not the CPU. Also, at 4K the GPU is being maxed out where it wasn't when I was on the 3700X. I can't recall exact numbers now, but the minimum-fps improvement is very, very noticeable. There are reviews that show this, so I would look them up, e.g. Hardware Unboxed or Gamers Nexus

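The "minimum frame rates" and "1% lows" mentioned in this thread are easy to compute yourself from a frame-time log. A minimal sketch, assuming a list of per-frame times in milliseconds (tools like PresentMon or CapFrameX export data of this shape; no particular file format is assumed):

```python
# Average fps vs "1% low" fps from per-frame times in milliseconds.
# The slowest 1% of frames is what registers as stutter; averages hide it.
import statistics

def fps_avg(frametimes_ms):
    return 1000.0 / statistics.mean(frametimes_ms)

def fps_one_percent_low(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # slowest 1%, at least one frame
    return 1000.0 / statistics.mean(worst[:n])

frametimes = [16.7, 16.9, 16.5, 40.2, 16.8, 17.1, 55.0, 16.6]  # toy data
print(f"avg: {fps_avg(frametimes):.1f} fps, 1% low: {fps_one_percent_low(frametimes):.1f} fps")
```
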
  • @Olivion17
    @Olivion17 9 months ago

    I have an i5 13th gen and I have zero issues

  • @justalex4571
    @justalex4571 9 months ago +1

    No. We need powerful GPUs plus good optimization of games 😊

  • @xXGreenTigerXx
    @xXGreenTigerXx 9 months ago +3

    I get an average of 83 fps in Cyberpunk 2077 Overdrive maxed out at 4K with DLSS Quality. I thought the CPU bottleneck would hit my 4090 harder. I have "only" a 10700K at 5.1GHz and 16GB DDR4. But of course it depends on the game; CP2077 is well optimized.

    • @daniil3815
      @daniil3815 9 months ago +1

      I don't really understand this conversation. If someone can afford a 4090 and a 4K 144Hz monitor, why is upgrading the CPU, mobo and RAM even a question?

    • @xXGreenTigerXx
      @xXGreenTigerXx 9 months ago

      And I don't really understand why you care. Every game I play runs perfectly smooth. I only play at 4K. I don't play competitive games, so I don't need every single frame. Resident Evil 4 Remake maxed out: 120-144fps. Forza Horizon 5 maxed out: 100-120fps. Doom Eternal maxed out: 200+ fps. Death Stranding maxed out: 200fps. Dead Island 2 maxed out: 144fps. Red Dead Redemption 2 maxed out with ReShade: 100+ fps. Tell me one reason why I should upgrade. I won't upgrade my CPU, motherboard and RAM only to get 5-10 fps more at 4K. @@daniil3815

  • @Lock2002ful
    @Lock2002ful 9 months ago +6

    I'm still on the Xeon 1230 v2 with 4 cores, 8 threads. Have a 3080 10GB and 24GB DDR3 with a Samsung 800-something EVO SSD.
    Let's just say, it depends.
    Funnily enough, most games are indeed still GPU-bound.
    Some games are unplayable or don't reach the fps they should with my GPU.
    RT, however, takes a big hit depending on the game.

    • @perlichtman1562
      @perlichtman1562 9 months ago +1

      That's pretty similar to my experience trying some GPUs on both an i7-3820 and a 13900K. Up to around a 3060, I'm finding that even on the i7-3820 I'm mostly GPU-limited at the highest 4K settings in Forza Horizon 5 and F1 2022; but go above that (like a 4070 or 6800 XT) and with RT the CPU becomes more of a limiting factor. And then there are games like Baldur's Gate 3 that are more CPU-limited in general, but they are less common once you get the settings high enough without RT.

    • @daniil3815
      @daniil3815 9 months ago +1

      When did you buy the 3080? In 2020/21 it was crazy expensive. I would definitely upgrade my CPU for that GPU. It might be GPU-limited, but think of the stutters, the 1% lows, the 0.1% lows.

    • @Lock2002ful
      @Lock2002ful 9 months ago +1

      @@daniil3815
      Bought it at the end of last year for about 400 bucks. ^^
      Yeah, I wanted to wait for the new AMD 3D CPUs; then they got released and are awesome as expected, but this year's been crazy and I rarely get time to play or do anything. I also wanted to wait for some big game releases like Alan Wake 2 to see how CPU and GPU performance shakes out. I'm planning to upgrade my 3080 to a new 5000-series card or a used 4090 in a year, depending on performance and price.
      Hoping for some good deals on Black Friday for motherboards, RAM and CPU.

  • @watkinsjason
    @watkinsjason 9 months ago +8

    I'm not sure if this is true (I haven't tested it myself), but if you use any upscaling tech (DLSS, etc.), I would think the answer is yes, you would benefit from a powerful CPU for 4K gaming. For example, if you are outputting 4K with DLSS set to Performance mode, the game is internally rendering at 1080p, where we all know CPU power can make a significant difference (see the sketch after this thread).

    • @dmtsmokingman
      @dmtsmokingman 9 months ago +2

      haha outsmarted them haha outsmarted them haha

    • @dmtsmokingman
      @dmtsmokingman 9 months ago

      )outsmarted them(

    • @mitsuhh
      @mitsuhh 9 months ago

      Then you aren't rendering at 4K resolution.

    • @watkinsjason
      @watkinsjason 9 months ago +1

      @@mitsuhh Yes, I know; that's what I said. That's how DLSS works. 4K DLSS Performance mode is 1080p upscaled to 4K.

    • @mitsuhh
      @mitsuhh 9 months ago

      The first half of your first comment contradicts the second half. @@watkinsjason
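
To make the internal-resolution point in this thread concrete, here is a minimal sketch. The per-axis scale factors are the commonly cited DLSS 2 mode ratios; treat them as approximations, not a vendor-confirmed specification:

```python
# Internal render resolution for a 4K output under each DLSS mode.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]  # per-axis scale factor
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K + DLSS {mode}: renders internally at {w}x{h}")
# Performance mode lands at 1920x1080, which is why CPU limits measured
# at 1080p become relevant again at "4K" output.
```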

  • @aforty1
    @aforty1 1 month ago

    I’m successfully gaming at 4k with a 9600k and 6800xt.

  • @ledooni
    @ledooni 9 months ago +1

    You either need a decent Intel CPU properly overclocked (incl. RAM) or a Ryzen X3D chip if you are interested in consistent frame pacing with fewer noticeable drops. The lack of CPU optimization in most modern titles can at least somewhat be mitigated that way. Being CPU-limited in any scenario always means much less consistency and a worse experience. So yes, a CPU setup that's quite a bit stronger than what you might initially think would be a good pairing for your GPU will ultimately give you a more enjoyable experience, though not necessarily in all games, of course.

  • @leemaniac9091
    @leemaniac9091 9 months ago +1

    I guess as long as you have one really, really high-IPC thread and at least 10-11 other threads, you can get away with it in most games now. So with a 6-core CPU with X3D and HT you might be good to go, once something like that comes out.

  • @Koozwad
    @Koozwad 9 months ago +1

    Went for a 10900K and currently run it with all cores at 4.9 (possibly 5.0 or 5.1 later on if required) with HT off. Simple but effective. Skipped that E-core mess on purpose (I could've gotten a 12900K at the time instead). 10 normal physical cores all the way!

  • @Endorsememe
    @Endorsememe 9 months ago +1

    yes

  • @dmer-zy3rb
    @dmer-zy3rb 9 months ago

    Since when does more resolution need more CPU power? What kind of a dumb question is that? You could play at 640x480 and it wouldn't matter.

  • @50H3i1
    @50H3i1 9 months ago

    With the 7800X3D going for $350 to $400, there is no question for people with a 4090 or 7900 XTX

  • @nicktronson2977
    @nicktronson2977 9 months ago +2

    My Sega Master System handles 4k all the way up to 440k.

  • @djtomoy
    @djtomoy 9 months ago

    Yes you do, true gamers know this to be true

  • @V3ntilator
    @V3ntilator 9 months ago +1

    My 3-year-old CPU is still barely used in new games. Starfield: 13%. No new game really uses my CPU in general.
    Sure, my old CPU is overkill, but it's underused as of today.

    • @daniil3815
      @daniil3815 9 months ago +1

      Try Battlefield 2042, Cyberpunk, Spider-Man. There are games that will use more than 13%. It also depends on your GPU and the resolution you play at; if the GPU is weak and you play at 8K, then you won't utilize your CPU much

    • @V3ntilator
      @V3ntilator 9 months ago +1

      @@daniil3815 I play BF 2042 daily and my CPU is still idling. Spider-Man only uses 25% of my CPU; poorly optimized game. Cyberpunk is poorly optimized too and uses only 23% or so.
      My CPU is 3 years old and nothing is using it.

    • @Scorpion95
      @Scorpion95 9 months ago

      @@V3ntilator Which CPU is it, exactly? Is it like a Threadripper? With my 3950X I get between 35-40% usage in Cyberpunk. The Division 2 and Doom Eternal both get around 50-60%. But yeah, most games don't really utilize all cores/threads. The only "game" that hits really hard is Star Citizen, which pushes the CPU to 80-100% while running around on a planet.

    • @V3ntilator
      @V3ntilator 9 months ago

      @@Scorpion95 Ryzen 9 5900X. I haven't checked CPU usage in Doom Eternal, but I just checked in The Division 2: mostly 31-33% CPU usage, with spikes sometimes to 39%.
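
The usage figures traded in this thread are whole-CPU averages, which can sit low while one thread is saturated. A minimal sketch of a more telling measurement, using psutil's per-core sampling (run it while the game is active):

```python
# Aggregate CPU % can read 13-35% while a single core is pegged.
# Per-core utilization exposes the hottest core, which the average hides.
import time
import psutil

psutil.cpu_percent(percpu=True)   # first call primes the counters
time.sleep(1.0)                   # sample one second of gameplay
per_core = psutil.cpu_percent(percpu=True)

print(f"average across cores: {sum(per_core) / len(per_core):.0f}%")
print(f"busiest core:         {max(per_core):.0f}%")  # the actual bottleneck signal
```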

  • @mick7727
    @mick7727 4 months ago

    Where are my graphs?

  • @ha-nocri9923
    @ha-nocri9923 9 months ago

    Even though AMD CPUs have outsold Intel regularly for years now, DF would never mention them. Just like they won't mention AMD GPUs, but for that they at least give the excuse that Nvidia is way more popular

  • @Shawner666
    @Shawner666 8 months ago

    I just upgraded to the i9-14900KF. Having 24 cores is really amazing. I like to multitask and stream movies at the same time as well. I could have got an i5-14600K and been fine with that. It was an impulse buy :(

  • @williamhardin8550
    @williamhardin8550 9 months ago

    Why don't they target 1440p and upscale that to 4K instead...

  • @thelifeofalex3546
    @thelifeofalex3546 9 months ago

    Short Answer: Probably
    me with potato :(

  • @dante19890
    @dante19890 9 months ago +1

    It's a really bad idea to have a mid-tier CPU with a 4090. You will definitely bottleneck that card even at higher resolutions.
    Get a CPU that's a good match for your GPU

    • @glenmcl
      @glenmcl 9 months ago +2

      I have an i9-13900K paired with a 4090 and 64GB DDR5.
      CPU usage shows around 3%. It's insane. I really think I could have got a cheaper CPU.
      Games still stutter regardless. I'm convinced devs need to optimise better.

    • @dante19890
      @dante19890 9 months ago

      @@glenmcl No, you will get more out of your 4090 with that CPU

  • @stanisawkowalski7440
    @stanisawkowalski7440 9 months ago +1

    The more fps you want, the more powerful a CPU you need to get it, and resolution doesn't matter here at all. If the CPU bottlenecks at Full HD but not at 4K, just swap the GPU for one able to deliver at 4K the same or more fps as the previous one gave at Full HD, and you'll see the same CPU bottleneck at 4K with roughly the same (~5-10% lower) fps numbers.
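
A toy model of the argument above, treating a frame as CPU work (roughly resolution-independent) plus GPU work (scaling with pixel count). All numbers are made up for illustration:

```python
# fps = 1000 / max(cpu_ms, gpu_ms): the slower stage sets the frame rate.
def fps(cpu_ms, gpu_ms_1080p, pixel_ratio):
    gpu_ms = gpu_ms_1080p * pixel_ratio  # crude linear scaling with pixel count
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0        # hypothetical CPU frame time -> 125 fps ceiling
GPU_MS_1080P = 4.0  # hypothetical GPU frame time at 1080p

for label, ratio in [("1080p", 1.0), ("1440p", 1.78), ("4K", 4.0)]:
    print(f"{label}: {fps(CPU_MS, GPU_MS_1080P, ratio):.0f} fps")
# 1080p and 1440p sit on the 125 fps CPU ceiling; at 4K the GPU (16 ms)
# takes over at ~63 fps. A faster GPU at 4K re-exposes the same CPU
# ceiling, exactly as the comment describes.
```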

  • @George-um2vc
    @George-um2vc 9 months ago

    I was rocking an 11600K with a 3080 Ti for some time. It seemed fine, but perhaps it could have been improved with something like a 13600K…

  • @Hobbes4ever
    @Hobbes4ever 9 months ago

    My old Ryzen 3700X works great with a 3090 Ti at 4K. Stop buying overpriced CPUs and motherboards

  • @glenmcl
    @glenmcl 9 months ago

    Dear devs, Optimise Optimise Optimise.

  • @puregarbage2329
    @puregarbage2329 9 months ago

    Please…for the love of god…do not revive the Celeron or the Pentium…what a horrible idea!

  • @danielgomez7236
    @danielgomez7236 9 months ago

    It is mostly a software issue: games developed with old tech. DirectStorage combined with the fastest SSDs has the potential to greatly improve game performance and stability.

  • @garethbryant2183
    @garethbryant2183 9 months ago +1

    My non-gaming laptop is really enjoying its new life as a mainly-gaming GeForce Now machine, bringing new life to a machine that would probably be in a landfill or on someone's junk pile by now.

  • @ryanroleke1304
    @ryanroleke1304 9 months ago +1

    *DLSS lowering the internal resolution to achieve 4K60, and thereby making the CPU more meaningful, should have been covered in the above video; I discuss it below. I am running an i7-7700K @ 4.9GHz with HT on and an RTX 4090 Suprim Liquid X, and I can say that a faster CPU does make a huge difference in titles that would struggle to hold 60fps even with a top-of-the-line CPU and a 4090, because DLSS is needed to reach that fps and it lowers the internal resolution, making the CPU matter more. I aim for 4K 120fps on my LG C1 OLED. For new metroidvanias and PC titles from the PlayStation 3 era, my rig is perfect for 4K 120Hz. But in Fortnite, I have to turn shadows off and turn DLSS Quality on to max out all other settings at 4K and average somewhere around 60fps. Since DLSS Quality lowers the internal resolution to 66% of the output resolution and then uses AI to fill the gap back up to the output resolution, it reduces the load on the GPU and intensifies my CPU bottleneck, blocking the more meaningful gains a faster CPU would have allowed; the sweet spot for fps gains from DLSS on my rig would have been around 70 to 80% of 4K. Going further with DLSS Balanced or DLSS Performance did hardly anything in this scenario, due to the intensified CPU bottleneck, other than reduce image quality. Go with an AMD Ryzen 7 7800X3D, since the cache is more meaningful than the higher-frequency CPUs available.

  • @1SaG
    @1SaG 6 months ago

    Judging from my results when I went from a 12600K to a 14700KF recently (CPU upgrade only), I'd say yes. From what I can tell from both benchmark results and actual gaming, my 4070 seems to have been held back quite a bit by the 12600K. That's at 1440p, but VR on my Reverb G2 (which pushes about as many pixels as 4K would) has also become smoother, and I'm now able to run the headset at its preferred refresh rate of 90Hz in IL-2 and DCS.

  • @mikeuk666
    @mikeuk666 9 months ago +22

    "Next-gen" consoles definitely need to stop with ray tracing and concentrate on higher, stable frame rates without making compromises.

    • @seethruhead7119
      @seethruhead7119 9 months ago +2

      Why I stick with PC: frame rate is far more important than visual fidelity

    • @firip255
      @firip255 9 months ago +4

      @@seethruhead7119 Most games are 60 fps. Only Starfield and Arkham Origins are 30 fps, and both are bad games

    • @nick4810
      @nick4810 9 months ago +4

      @@firip255 While technically true, that's usually only possible due to unacceptably low resolutions; for example, Spider-Man 2 is sub-HD at 60fps, same for FF16.

    • @firip255
      @firip255 9 months ago +3

      @@nick4810 What? Spider-Man is 1440p. And FF is a bad game.

    • @Zenzuu
      @Zenzuu 9 months ago

      Even so, they both still look great on a 4K OLED TV @@nick4810

  • @you2be839
    @you2be839 9 months ago +1

    4:55 E-cores have been bothering me since 12th-gen Alder Lake launched. I don't like a single bit of the approach and the requirements Intel imposes to take advantage of those "Efficiency cores", which, by the way, aren't even that power-efficient to begin with!

  • @NemoNobody0
    @NemoNobody0 5 months ago

    Outdated and inaccurate. Check recent e-cores on/off benchmarks.

  • @daniil3815
    @daniil3815 9 months ago +1

    I would still prefer top-end CPUs like the 7950X3D or 14900K.

  • @dennisole457
    @dennisole457 9 months ago

    The only benefit I see comes from developers' unfortunate reliance on upscaling tech. Native 4K, even with a 4090, is almost not doable.
    Throwing on an upscaler brings a benefit, and it's the only reason I upgraded to a 7800X3D. It's partly why I despise PC gaming, but it's the only way to easily keep a collection of games, since consoles either don't care or give us the smallest drive sizes possible.

  • @brutlern
    @brutlern 9 months ago +2

    So here is the issue. Every time there is a CPU benchmark, every tech tuber tests 1080p at low or medium settings. With a 4090! It's an absolutely pointless benchmark. People need to stop testing useless settings and test REAL-world scenarios. With a 4090, you run 4K high settings or more, in which case the CPU will stop mattering much. But unless someone actually starts testing this, NOBODY KNOWS! We all assume that in such a scenario we would be GPU-bottlenecked and most CPUs would perform extremely close to one another. But nobody knows for sure, because nobody bothers to test it. That is why this question comes up all the time: "Do you really need a powerful CPU for 4K gaming?" Nobody knows for sure until the people who have access to the hardware (like DF) start doing REAL benchmarks. TechPowerUp is the only site I know of that actually does 4K benchmarks for CPUs, but I don't fully trust those results because there isn't any other site to corroborate them.

    • @87crimson
      @87crimson 9 months ago +1

      It's not pointless; actually, they should be doing 720p or, hell, 640x480 tests. In a CPU bench you just want to get as much FPS as possible, to test the limits of the CPU.

    • @brianfunt2619
      @brianfunt2619 9 months ago +1

      @@87crimson No, I see his point. Testing at 720p will certainly let you compare ultimate CPU performance, but it doesn't tell you whether you can get away with an i5 for 4K 60fps

    • @jmangames5562
      @jmangames5562 9 months ago

      💯💯💯💯💯💯💯👌👍

  • @Zenzuu
    @Zenzuu 9 months ago +99

    It's kind of sad that the talk and focus, especially regarding GPUs, is all about DLSS/FSR and ray tracing, which are essentially just more accurate lighting, reflections and shadows nowadays. I would much prefer developers to use their resources to push for higher polygon counts, more interactive destructible objects/environments, better AI, and more advanced physics with interactive objects.

    • @WT83
      @WT83 9 months ago +32

      Games without RT aren't pushing most of these things either. Removing RT would just be removing better lighting, reflections, and shadows from the equation.

    • @danielecastagna3313
      @danielecastagna3313 9 months ago +9

      Didn't the first UE5 tech demo, like 3 years ago, advertise smaller-than-a-pixel polygons? How small do they need to be?

    • @MachProdigy
      @MachProdigy 9 months ago +11

      @@danielecastagna3313 Yeah, that's Nanite. We've already got the technology for polygon detail at a subpixel level. I want them to focus on animation and physics while pushing things like path tracing to become the industry standard.

    • @thedoomslayer108_YT
      @thedoomslayer108_YT 9 months ago +5

      @@MachProdigy Path tracing is very GPU-intensive; the 4090 still cannot reach a stable 30 FPS in Cyberpunk 2077 with RT Overdrive at 4K without DLSS.

    • @MachProdigy
      @MachProdigy 9 months ago +1

      @@thedoomslayer108_YT I know, I’m referring to the future of rendering. It would be cool if we eventually had the baseline power for it to become a standard.

  • @marklarz4399
    @marklarz4399 9 months ago +2

    Laughs in an AMD FX 6300

  • @nick4810
    @nick4810 9 months ago +3

    I recently went from a 3600X to a 5800X3D and I don't think it's made any difference in any game so far, tbh; CPU usage is nearly always really low in games anyway

    • @perlichtman1562
      @perlichtman1562 9 months ago

      I had some games where it was quite similar: even going from an old i7-3820 to a 13900K didn't make much difference at 4K with maxed-out settings on, for example, a 3060 in games like Forza Horizon 5 or F1 2022, though RT complicated things a bit. Above that level, I started to see more of a difference with a 4070, 6800 XT and 7800 XT.

    • @zoriiginalx7544
      @zoriiginalx7544 9 months ago +1

      Play Escape from Tarkov and then you'll really notice it.

    • @leonro
      @leonro 9 months ago

      You know you should state your GPU and monitor pairing as well for performance claims, right? If you're on an RTX 3060 at 1440p, then yeah, you probably could have even gone for a 4-core Ryzen/Intel and not lost out on performance.

    • @nico3064
      @nico3064 9 months ago +1

      It always depends on what you are going for. I use a 5600X and a 4070 @1440p, so technically I do have a CPU bottleneck, especially at 1080p. In reality it doesn't matter, since I cap my frame rate at 120 fps most of the time anyway and the 5600X is more than capable of that. In a bunch of games I even go as low as 60 or 30, so any stronger CPU alternative is just a waste of money

  • @CheckmateStallioN
    @CheckmateStallioN 9 months ago

    Can't wait for their Lords of the Fallen PC deep dive. My GOTY so far

    • @RetroBacon1
      @RetroBacon1 9 months ago +1

      Nah it’s too lame

    • @SolidBoss7
      @SolidBoss7 9 months ago +2

      @@RetroBacon1 Minority take. They will one hundred percent do a deep dive.

    • @CheckmateStallioN
      @CheckmateStallioN 9 months ago

      @@RetroBacon1 hmm looks like someone still has to GIT GUD

    • @Koozwad
      @Koozwad 9 months ago +1

      @@RetroBacon1 Not only that, but it's a really sh1t company: they still have Denuvo in their original Lords of the Fallen game (on Steam) after 10 FREAKING YEARS.

  • @jcdenton8750
    @jcdenton8750 9 months ago +5

    *Xbox Series S has left the chat*

    • @Padawan333
      @Padawan333 9 months ago +5

      All consoles left the chat; this video is about PC....

    • @phaedrussocrates7636
      @phaedrussocrates7636 9 months ago +1

      @@Padawan333 90% of PCs left the chat, decimated by PS5 exclusives' performance... 100% of PCs would run away if you did a $400 price comparison

  • @jimengab
    @jimengab 9 months ago

    I feel like games are less and less optimized as hardware becomes more powerful. CPUs don't really suffer that much, but graphics tend to be so poorly optimized that the game NEEDS some kind of reconstruction to run at decent frame rates. Looking at you, Starfield.

    • @DaBigBoo_
      @DaBigBoo_ 9 months ago +1

      True. When was the last time a CPU-bound game was pegging the cores at 99%? They don't; they sit at like 60% usage while the GPU is also at like 60%, and the engine is just hogging all the render time

    • @marrow94
      @marrow94 9 months ago

      You just don't understand how % usage works.
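
The arithmetic behind that reply: the headline CPU number is an average over all logical threads, so one saturated game thread barely moves it. A minimal worked example with assumed numbers:

```python
# One pegged thread on a 12-core/24-thread CPU, plus light background load.
LOGICAL_THREADS = 24
saturated = 1.0                            # render/submit thread at 100%
background = 0.05 * (LOGICAL_THREADS - 1)  # remaining threads at ~5% each

total = (saturated + background) / LOGICAL_THREADS * 100
print(f"reported total CPU usage: {total:.0f}%")  # ~9%, yet the game is CPU-bound
```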

  • @SpaniardNL
    @SpaniardNL 9 months ago

    The real question should be: do you actually need 4K gaming?
    To which I answer: hell no. Stay at 1080p and you won't have to deal with any of the modern gaming nonsense of 30fps or below, or horrible-looking games. Resolution is less important than you think.

    • @mornedhel1
      @mornedhel1 9 months ago +2

      What a dumb statement to make.

  • @queenform
    @queenform 9 months ago

    If you can afford a $1000 card for 4K, you can afford a $500-$700 CPU for 4K

    • @anubisbahamut
      @anubisbahamut 9 months ago

      Depending on where you're upgrading from, you might as well add RAM and a motherboard to that price, lmfao. But sure, in your perfect scenario where that's all you have to do, you might have a point. Just because you can buy a $1000 GPU doesn't make you rich; maybe you had a budget for just that. It's like telling a person who bought a PlayStation to spend double the price and get a PC instead, since they're already $500-600 in the hole. Some people play consoles out of convenience, but most can't afford the expense of a PC.

  • @anubisbahamut
    @anubisbahamut 9 months ago

    The guy was obviously talking about the need to upgrade an older CPU, not a mid-range or low-end CPU of the newest generation; think a 3900X or 5900X compared to the AMD CPUs of today. Come on, guys @DigitalFoundry. Basically he's asking: do you NEED to upgrade from a 3900X or 5900X if you are running a 4090 at 4K with those options/settings?