Does Multitasking Hurt Gaming FPS? 5600 vs. 5700X: YouTube + Discord Call

  • Published: 28 Sep 2024

Comments • 1.5K

  • @WTP_DAVE
    @WTP_DAVE 2 years ago +873

    This is the type of real world content that i have been looking for! Cheers boys

    • @Takashita_Sukakoki
      @Takashita_Sukakoki 2 years ago +16

      I would have hoped that people were smart enough to open Task Manager while using Discord/YouTube and put 2 and 2 together, but I guess people needed this video

    • @killzonearmed
      @killzonearmed 2 years ago +13

      @@Takashita_Sukakoki yea because I definitely have the components to buy and test it

    • @Takashita_Sukakoki
      @Takashita_Sukakoki 2 years ago +12

      @@killzonearmed who cares what HUB tests? Your setup is the one you should care about. So open up your epic CPU-intensive multitasking and monitor the CPU usage, then open up a game; if nothing feels amiss, boom, I just saved a YouTuber hours of pointless work.
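
      A minimal sketch of that check, for anyone who wants numbers rather than eyeballing Task Manager (it assumes the third-party psutil package is installed; the 10-sample duration is arbitrary):

          import psutil  # third-party: pip install psutil

          # Poll per-core utilization once per second while YouTube, Discord and
          # the game are running; if no core is pegged, background load is not
          # what is costing you frames.
          for _ in range(10):
              per_core = psutil.cpu_percent(interval=1.0, percpu=True)
              print(f"busiest core: {max(per_core):5.1f}%   average: {sum(per_core) / len(per_core):5.1f}%")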

    • @killzonearmed
      @killzonearmed 2 years ago +23

      @@Takashita_Sukakoki I'm going to upgrade my pc, that's why it's useful to have these videos, I think you don't understand that

    • @Takashita_Sukakoki
      @Takashita_Sukakoki 2 years ago +6

      as HUB keeps saying there is no "future proofing" so buy a capable 6 core alder lake/zen3 or next gen cpus and no need to overthink it.

  • @JarrodsTech
    @JarrodsTech 2 years ago +417

    To be fair, YouTube videos are pretty intensive, better go 64 cores.

    • @anusmcgee4150
      @anusmcgee4150 2 years ago +8

      Your content is awesome and you’re a phenomenal human being 🤘💪

    • @moldyshishkabob
      @moldyshishkabob 2 years ago +19

      Don't forget that it's best to have at least four times as much GB as number of cores. 256GB should be a good start.

    • @MLWJ1993
      @MLWJ1993 2 years ago +9

      Also requires RGB on as much components as possible to allow for lit gaming performance.

    • @FcoEnriquePerez
      @FcoEnriquePerez 2 years ago

      Oh please don't become the "Tech deals" dude lmao.

    • @robertt9342
      @robertt9342 1 year ago

      @@moldyshishkabob. You know that a lot of people have around 4 times the GB as cores for memory, right? Many people have 6-8 cores with 32GB of RAM. Same with those before who had 4 cores and 16GB…. So technically that is very common on the low end.

  • @Mario07101997
    @Mario07101997 2 years ago +205

    In my experience, what will definitely degrade gaming performance is playing a YouTube video/livestream on a second display, actively visible (not minimized). But that's on the GPU, not the CPU, and only if the GPU was already maxed out while playing

    • @flameshana9
      @flameshana9 2 years ago +22

      Which is because if a video is visible it's using hardware decoding (part of the gpu). If it's minimized that decoding is heavily reduced.

    • @jcbc5950
      @jcbc5950 2 years ago +31

      NVIDIA has addressed this, saying the NVDEC video decode part of the chip is completely separate from the 3D/gaming part of the chip (for the GTX 1650 Super and above). Video decoding does not affect the 3D/gaming part unless you run multiple 4K YouTube videos on multiple monitors and overload the video decode/encode block. If you have performance issues, open Windows Task Manager, go to the GPU section, and see if your Video Decode engine is at 100% and what % the 3D/gaming engine is at
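
      If you'd rather script that check than dig through Task Manager, here is a small sketch that asks the NVIDIA driver directly (it assumes nvidia-smi is on PATH, a single GPU, and that your driver exposes the decoder/encoder utilization query fields listed by nvidia-smi --help-query-gpu):

          import subprocess

          # Query 3D load vs. the dedicated decode/encode engines.
          fields = "utilization.gpu,utilization.decoder,utilization.encoder"
          out = subprocess.run(
              ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader,nounits"],
              capture_output=True, text=True, check=True,
          )
          gpu, dec, enc = (v.strip() for v in out.stdout.strip().split(","))
          print(f"3D/graphics: {gpu}%   video decode: {dec}%   video encode: {enc}%")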

    • @andersjjensen
      @andersjjensen 2 years ago +13

      @@jcbc5950 But it still uses VRAM I/O so games which are more memory I/O bound than compute bound will see a bit of a reduction (but it's only a fairly small bit, even on low-ish end GPUs, as video-decode-and-copy-to-scanout-buffer bandwidth is fairly limited)

    • @Your_Paramour
      @Your_Paramour 2 years ago +13

      If you have mismatched refresh rates between monitors that is far more likely to be the cause of stuttering if you have a modern system, but even this has largely been addressed in the later versions of windows 10.

    • @Trigath
      @Trigath 2 years ago +4

      Did they run the tests with YT on second monitor or minimized? I missed that part.

  • @JohnDuthie
    @JohnDuthie 2 years ago +339

    I watch videos all the time using Chrome while gaming on a 2600X, RX580. There are some scenarios where it causes a problem but the majority of games that I play don't seem to mind. The bigger problem is my brain can't multitask and I end up only half paying attention to each.

    • @GewelReal
      @GewelReal 2 years ago +7

      so relatable!

    • @IN-pr3lw
      @IN-pr3lw 2 years ago +4

      It also could be a lot smoother

    • @dralord1307
      @dralord1307 2 years ago +9

      I use a 3600, 32 gigs of RAM at 3200, GTX 1070 "hope to upgrade soon". Same as me: I watch vids on the 2nd monitor and run chat apps, comp monitoring software etc. while gaming on my 32-inch primary monitor. Most of the time multithreading is normally not an issue, but there are some times it is. Specifically in Space Engineers, or a couple of single-core-heavy games. The single-core games get affected when some other task decides it suddenly wants the prime cores. "Often some stupid windows service"
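
      One blunt mitigation when a background task keeps contending for the cores your game is on: raise the game's scheduling priority. A Windows-oriented sketch only, again assuming the third-party psutil package; the process name is a made-up placeholder to swap for your actual game executable:

          import psutil  # third-party: pip install psutil

          # Windows-only sketch: bump the game's priority so a noisy background
          # task is less able to steal its preferred cores mid-session.
          GAME_EXE = "SpaceEngineers.exe"  # placeholder, adjust to your game's process name
          for proc in psutil.process_iter(["name"]):
              if proc.info["name"] == GAME_EXE:
                  proc.nice(psutil.HIGH_PRIORITY_CLASS)
                  print(f"raised priority of PID {proc.pid}")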

    • @Hugh_I
      @Hugh_I 2 years ago +23

      brain needs more cores, confirmed

    • @danw7864
      @danw7864 2 years ago +57

      No one can actually multitask. Humans aren’t capable of multitasking. “Good multitaskers” really are just good at switching rapidly between tasks.

  • @TheCgOrion
    @TheCgOrion 2 years ago +239

    This is exactly what I have wanted to see. I've stuck with 8C or higher since AM4 launched (I do a lot of encoding), but I still wanted to see how a Zen 3 6C would do long term against 8C consoles. I realize a 5600X is only slightly behind a 3700X in total performance, and the consoles are basically cut down, heavily down clocked 3700X (sort of). The other side programs is a good idea too. I regularly assign half of the threads to a batch while gaming, so I will stay with 8C or more, but my use case isn't normal. How were the .1% lows, similar behavior?

    • @OrkDiktator
      @OrkDiktator 2 years ago +16

      The Ryzen 3700X has a base clock of 3.6 GHz. That's exactly the clock speed of the Series X APU in SMT mode. In the 8c/8t non-SMT mode, the Series X CPU has a clock speed of 3.8 GHz. They reduced the cache for the console APU, which is also normally the case for AMD's mobile and desktop APUs. The PS5 APU is clocked at 3.5 GHz. So, not much of a difference. The only advantage clock-wise is the boost clock that the 3700X (4.4 GHz), and normally every PC CPU, has. But we all know the advantage of the PC isn't the potentially more powerful hardware, but the open character of the hardware platform.

    • @TheCgOrion
      @TheCgOrion 2 years ago +15

      @@OrkDiktator One of my PC's has a 3700x, and it never runs at 3.6Ghz when it's actually being put to work. Even all core 100% load. Same with Intel, when the board and cooling isn't garbage, and it has a decently power budget. I understand the spec side of it though. I agree for sure on what makes the PC advantageous. I also thoroughly enjoy the flexibility of modding games, fixing games when the devs refuse to add options or fix issues, and the ability to push settings higher years later, or even applying post processing to an old game. Although I would say the cache reduction is also a big disadvantage. Look at the 5800X3D vs the 5800X, and the latter has a higher clock speed. I know the SOC has advantages going for it as well.

    • @erroroflife187
      @erroroflife187 2 years ago +34

      5600 is actually quite a bit better than a 3700x

    • @TheCgOrion
      @TheCgOrion 2 years ago +13

      @@erroroflife187 In gaming fps yes. I was referring to multithread performance, if you were to make use of all of the CPU's processing power. I was mostly referring to what a 5600x would do compared to a Series X, if they were being fully utilized. I realize that most or all current titles benefit from the superior IPC of the 5600x.

    • @OrkDiktator
      @OrkDiktator 2 years ago +5

      @@TheCgOrion I fully agree with what you said. I'm still on an R5 1600. Never checked when it runs at base clock. The problem with implementing more cache is the space. There just isn't enough space for more cache on APUs. Maybe the 5800X3D will motivate the console manufacturers to implement more cache next generation. Both consoles are still great pieces of hardware for their price points. I'd like to see similar APUs for desktop PCs. They could completely replace dedicated low-end and maybe even lower mid-range GPUs and give AMD market leadership in that field. Btw, my most played game over the last 8 months was Skyrim on PC with over 270 mods (mostly new lands, quests and companions, and some gameplay enhancements). I really like my Series X and Game Pass and all the stuff Xbox offers, but there's nothing that comes close to that nearly 11-year-old game with no mod limitations.

  • @koyomiararagi3018
    @koyomiararagi3018 2 years ago +133

    One possible scenario is a user streaming YouTube and Discord with hardware encoding and acceleration disabled. Otherwise most of the workload is offloaded to the GPU.

    • @trackingdifbeatsaber8203
      @trackingdifbeatsaber8203 2 years ago +1

      True, some people do use a 6500 XT, and also it would be interesting to see if you could get more in-game performance by shifting load off the GPU

    • @Dracossaint
      @Dracossaint 2 years ago +17

      i find that using hardware encoding can actually cause some video stutter (dropped frames) when loading things in a game. changing it to software encoding alleviates that for me.

    • @MrMeanh
      @MrMeanh 2 years ago +2

      @@Dracossaint I have found the same thing, but only on my 3080 and before that 2070s. When I use my 6800xt I have no such issues, so for me AMD has worked better in that regard.

    • @Dominus_Potatus
      @Dominus_Potatus 2 years ago

      @@Dracossaint Really? I never have issues with it. Using an RTX 3050, streaming 1080p 60 fps is a breeze.
      Sometimes I'm even video conferencing with 3 different Google Meets and my GPU utilization is still below 10%.
      Nvidia's NVENC sure is efficient; I believe the more efficient encoder is only available on the GTX 1660s and all RTX series.
      An old GPU such as the GTX 1060 6GB uses a lot of utilization to stream; my friend with a better CPU but an older GPU suffers from streaming issues.

    • @jcbc5950
      @jcbc5950 2 years ago +2

      ​@@Dracossaint Video encoding and decoding barely affects gaming with NVIDIA. NVIDIA has addressed this saying the NVENC video encode/video decode part of the chip is completely separate to CUDA/gaming part of the chip. While NVIDIA performance for 1080p/1440p gaming is abysmal compared to AMD, NVENC is very good so I have to use NVIDIA when streaming to Twitch.

  • @RustyChickn
    @RustyChickn 2 years ago +142

    The kind of benchmarking people actually need 😳

    • @popcorny007
      @popcorny007 2 years ago +24

      *think they need :P

    • @sudhanyahalder9964
      @sudhanyahalder9964 2 years ago

      Ya bro

    • @registeredblindgamer4350
      @registeredblindgamer4350 2 years ago +18

      You watch too much TechDeals, we don't need this at all. Anyone with half a brain knows tasks like this use hardly any resources. If you needed this video to know this you clearly couldn't tell your pc ran any slower.

    • @sudhanyahalder9964
      @sudhanyahalder9964 2 years ago

      @@registeredblindgamer4350 true as well, but I'm not upset with the upload of this video

  • @M4RC0Geek
    @M4RC0Geek 2 years ago +10

    Thank you! Very good content for those who are looking into this topic. Modern 6-core CPUs can take the "load" of these simple tasks.

  • @sakhayaangavriliev284
    @sakhayaangavriliev284 2 years ago +13

    Amazing work! But I want to add that sometimes people use software decoding, which uses up quite a lot of CPU power, and I regularly use software decoding on my system to watch videos and streams. This is due to the fact that when I use HW decoding and my GPU is at a 100% load, video feed becomes extremely choppy or straight up locks up. This is probably due to an older GPU, as I use a 1080 Ti with a 5900x, or maybe due to my browser (firefox)
    I'd like if you were to explore this possibility. As a sidenote, I can certifiably say, that a 5900x is not enough to watch 8k60fps videos.

    • @batbiebel
      @batbiebel 2 years ago

      Firefox doesn't behave well with long yt videos or streams. It eats memory till it becomes unusable. At least it does on my system.

    • @greggmacdonald9644
      @greggmacdonald9644 2 years ago

      @@batbiebel I've noticed that too

    • @djmccullough9233
      @djmccullough9233 2 years ago +3

      Well.. no, if your GPU is at 100% load, it obviously has no time to do additional work. It's going to drop frames if you are already at 100% usage and then ask it to do ANYTHING else. Also, Nvidia cards use a software scheduler that runs on the CPU, so if your CPU is fully utilized, your video card will start dropping frames wildly because the CPU doesn't have available time to manage the GPU. That being said, the 5900X shouldn't have any problem with the 1080 Ti. I've got my 1080 Ti driven by a Ryzen 1700X, and it can fully max the GPU. The 1080 Ti is a great card, but if any card is at 100%, expecting it to be smooth sailing when you introduce additional workloads to the device isn't going to go well. hehe.

  • @evenAndre
    @evenAndre 2 years ago +42

    Thank you for this! I think tech deals channel is responsible for a lot of worry about this (for me at least) as he kept hammering that 6 core in general would be struggling in a few years (a few years ago :) ), although it's been a while since I've watched his content.

    • @Dominus_Potatus
      @Dominus_Potatus 2 years ago +1

      I think we are already living in a time where, if there is a jump in performance, it will be minuscule.
      The only way to improve performance today is either going full quantum, increasing TDP and power consumption, changing the SoC, or everything combined.
      So your current hardware, I think, will still be relevant in 5-7 years unless somehow Intel/AMD/Nvidia go nuts and move to a new, more efficient architecture.
      We have seen x86, x64, and so on..., and now ARM is a big contender.
      Let's see if Intel, AMD, Nvidia, and Microsoft can combine efforts to make a new architecture.
      Where is Linux? They can improve themselves; they are impossible to kill and evolving every day

    • @antiwokehuman
      @antiwokehuman 2 years ago +15

      Yeah. He trolled me for using a perfectly working dual-core laptop for office work. He thinks everyone needs at least 8 cores and 64 GB of RAM.

    • @takehirolol5962
      @takehirolol5962 2 years ago

      Well, even games like RDR2 locked at 60 FPS 1440p HUB settings can use more than 6 threads.
      More is better, most of the time with PC resources. I only game though, no other applications are opened while I game.

    • @DarkReturns1
      @DarkReturns1 2 years ago +7

      Never really liked Tech Deals, dunno, his opinions don't sync with mine I guess

    • @DarkReturns1
      @DarkReturns1 2 years ago +8

      @@takehirolol5962 not necessarily true. Just because games will use the cores doesn't mean more is better. A 5600X will destroy anything in the first 3 generations of Ryzen for gaming.

  • @yearofthegarden
    @yearofthegarden 2 years ago +18

    This is among the very few videos that provide me with the information I actually need, as opposed to the game FPS tests that make up 99% of all YouTube tests. I am moving from an RX 570 to a 5700G APU because I no longer game. I am moving on with my life to only Photoshop and Blender, as well as moving to a solar-powered electrical source, so being able to undervolt with a low base TDP while keeping the capacity to process infrequently large files is going to be essential for me, all while keeping GPU capability close to that of a 2GB card.

  • @avallach2061
    @avallach2061 2 years ago +53

    Amazing benchmark, basically my experience with the 12400F, sure there's a FPS drop, but never relevant enough for me to even notice while playing. And if I notice a situation where this is the case, I can just well... close everything else but the game lol. No biggie, the 6-core is working for my needs pretty well.

  • @TheDaNuker
    @TheDaNuker 2 years ago +6

    Excellent content, this is the type of workload folks try to use their higher-core-count CPUs for. Although the biggest test would be OBS and the various x264 CPU options with the advanced audio/visual effects processing options that might not be GPU accelerated, scaling, etc...

  • @foxpants
    @foxpants 2 years ago +193

    Happy to see you deliver what's asked for, and that your insights are appreciated and as usual, on the money.

    • @mroutcast8515
      @mroutcast8515 2 years ago

      was about time to shut up all those tech idiots with bloatware and malware on their PCs 🤡😂

  • @ChrisPkmn
    @ChrisPkmn 2 years ago +7

    Thank you for this! I wonder if this was an issue with immature OSes and weaker hardware. As cache, virtualized sandboxes, and scheduling improve, less demand is put on having dedicated cores.

  • @Rachit0904
    @Rachit0904 2 years ago +9

    it’d be interesting to see if this is the same behaviour on Alder Lake E-cores as those have been sold as being good for background tasks. You could test the i5-12600K vs the i5-12600 (or ideally the i5-12490F as it has the larger L3 cache).

    • @brownjonny2230
      @brownjonny2230 1 year ago +1

      You see, the point is that the P cores alone can handle ALL lightly threaded background tasks, which negates the point of E cores taking care of the background while the P cores concentrate on 'important' tasks. Hell, even many P cores sit idle past a certain point, as the video suggests.
      The question is then: would E cores save more power than just using P cores for all the tasks? Unfortunately no. As much as the marketing guys at Intel want you to believe E cores are 'efficient', E cores are just as power hungry as P cores.
      So what the hell are E cores for? For all-core performance such as rendering, where latency isn't the issue. Other than that, E cores aren't useful for saving energy, contrary to Intel's marketing.

    • @RobBCactive
      @RobBCactive 4 months ago

      Active applications trigger i/o and access memory.
      Steve explained spare cores idling DO NOT help, when you have lightly loaded cores already.
      This obsession with core count is simply wrong.

  • @XxDeViLBrInGeRxX
    @XxDeViLBrInGeRxX 2 years ago +6

    For stream-dedicated PCs, in my experience I would recommend getting a platform with decent PCIe capacity. For example, I decided to keep my 5820K from 6 years ago since it had decent capacity, plus lots of PCIe connectors, which came in really handy when adding devices like, for example, the Elgato 4K mk2 capture card. I currently run two 4K capture cards with a 1650 Super, so that I can encode with the NVENC part as well as record at a higher bitrate using x264 to the streaming PC's hard drive.
    Now, since X99 is a bit of a rare platform unless you do a build from somewhere like AliExpress, I suggest that if you are looking for a more usual consumer CPU, something like the Ryzen 1600 will be plenty for that matter, and these can be found relatively cheap nowadays. As for integrated-graphics CPUs, a Ryzen 3400G and 16GB of RAM (since you need a bit more RAM for the iGPU) should do well as long as the encoding is set to x264 (CPU-based in OBS); with that it is possible to "get away" with not having to buy a dedicated GPU for a streaming-only PC. As for motherboards, besides having enough VRM for your chosen CPU, if you want to future-proof it to later have a dedicated GPU + a PCIe capture card like one of the ones mentioned above, an easy way to find one is to look for "Crossfire compatible" boards; that way it will be easier to find motherboards that have more than one x16 slot connector, which not all budget options tend to have.
    Hope this helps the ones interested in streaming-dedicated PCs.

    • @cromefire_
      @cromefire_ 2 years ago +3

      Nah, it'd be cheaper to add 2 to 4 cores to the existing platform or just use hardware encoding. An extra PC will add 500€ or so, which would basically upgrade you to a 5950X that will handle it all and you don't have to switch PCs all the time

    • @XxDeViLBrInGeRxX
      @XxDeViLBrInGeRxX 2 years ago

      @@cromefire_ As someone who has a 5900X system, I still kept my X99 for the reasons mentioned above. Besides, your GPU still gets a performance hit as well, since it needs to render the scenes for the other applications that you use too; it is not "CPU upgrade bound" only. Besides that, you will also be limited on PCIe bandwidth: using M.2 drives and your GPU, it is very likely that your motherboard won't have enough PCIe lanes left for adding a 4K capture card, and if you do, then you would be limiting the PCIe bandwidth on your GPU by half. The main purpose of using an external system is to have next to no performance impact on your daily driver/gaming PC.
      Also, the upgrade that you suggest alone can cost as much as a simple, used but cost-effective system good enough to be used as a stream-dedicated system, even more so if you skip the capture card and use a LAN-based system like NDI, for example.

    • @cromefire_
      @cromefire_ 2 years ago

      @@XxDeViLBrInGeRxX Well, you'd probably use the motherboard PCIe for the capture card; my rough estimate would be that x4 should probably be enough. And even if not, PCIe x8 is still enough with a 3090 to only lose a very few percent. Regarding HW encoding: no, HW encoding is done on dedicated silicon, so it's fine, and with a 5950X the performance cost is negligible anyway. I mean, after all you only need 1080p60 (some even do 800-something-p) for streaming anyway, at least on Twitch; for proper 4K you have to go to YouTube I think (which few do). After all, I can play games fine even when doing a software AV1, 16-thread encode, so even a 5700 or 5900X will probably be fine. Just be sure to pin the threads correctly so they don't interfere and everything is fine. And I mean, besides price you just lose out feature-wise too, because things like window capture obviously don't work and you need the capture card. On a single system a capture card only even starts to interfere if you game on both PC and console anyway, as you don't need the GPU with console gaming and you don't need the capture card with PC gaming. Add the second keyboard and a monitor for the second PC and it doesn't make much sense for most.
      If I built a PC for gaming nowadays, I'd probably just get an i5-12600K (non-F) and just use Quick Sync. Even with a second PCIe slot used you will still have PCIe 4.0 x8 for current gen and 5.0 x8 for next gen, so plenty of bandwidth, and if you want to do CPU encoding just put OBS on the E-cores; video encoding is something they do great at (in fact an E-core ≈ 1 P-core hyperthread), so think of it as a dedicated older quad-core for encoding only.
      But of course, you do you; if a dedicated machine is what you desire nobody can stop you. It's just probably ~50% more expensive and I find it more annoying than the easy solution (which is a 12600K, which is like 100-200€ more I think than a 12400F), and what you gain is ~5% performance in the best case, of which the stream won't see anything because viewers see neither 4K nor > 60Hz.

  • @greggmacdonald9644
    @greggmacdonald9644 2 years ago +7

    An excellent video, thanks! What I'd like to see is how much Unreal Engine 5 + DirectStorage in games (when they arrive) changes the calculus here. How many cores will those games like to see?

    • @anrucas
      @anrucas 2 years ago

      Unreal 5 only uses 8 cores, so not much change there, unless they actually decide to use more threads and go into an async style of CPU utilization. Quite disappointing really. But considering PS5 and Xbox Series X devs have the ability to limit the consoles to using only 8 cores instead of 16, I think we will see 8 cores be the max for a while still.

    • @MrGraywolf09
      @MrGraywolf09 2 years ago

      Not sure if my question is similar, but I'm building a PC for game development (including UE5 as one of the engines I consider using) and I wonder if the 5700X is worth the cost compared to the 5600.

    • @anrucas
      @anrucas 2 years ago

      @@MrGraywolf09 Going forward you will need an 8GB VRAM card minimum to run at 30 fps. So a 5700 XT or a 1080 will run the example/sample projects at that. You can get away with a 6600 XT/6650 XT or a 3060/Ti, but running the project at medium settings will be alright. A basic project takes 2GB of RAM with barely anything in it. If you can save up and get a 6700 XT or 6800 XT you should be fine for, say, 6 years if it lasts that long. You could pick the equivalent-level card from Nvidia. 64GB of RAM or more with a 16-thread CPU (8 cores/16 threads). NVMe is recommended going forward, something like a 256GB OS drive and a 2TB drive. You can get away with a hard drive but it will take minutes to load instead of seconds. UE5 is next gen so it has a heavy cost.

  • @AdjutorMusic
    @AdjutorMusic 2 years ago +6

    Love the videos lately. Y'all have become my current favorite tech channel! LOVE the monitor dedicated content as well

  • @mcunner
    @mcunner 2 years ago +1

    I've found websites like Tom's Hardware with a billion ads on a page can decimate the CPU even in the background, and ad blockers don't seem to work much anymore, well, on certain sites.

  • @normandy4
    @normandy4 2 years ago +3

    Discord had a bug where their active noise cancellation software would turn itself off due to "high system load" on Ryzen CPUs, but it has been fixed.
    Something else that I have noticed is that video playback will drop a lot of frames while I am in a game that is maxing out my VRAM (I still have a 580 4gb)
    It may be nice to test the performance loss when approaching 100% utilization of RAM or VRAM.

    • @flameshana9
      @flameshana9 2 years ago

      Any time you go near your max ram or vram things stop working well. That's to be expected. If you can juggle 5 balls at a time and someone says "can you please do 6?" what do you think will happen? You're going to fumble and drop some.

    • @normandy4
      @normandy4 2 years ago +2

      @@flameshana9 Obviously.
      The point would be to measure how much of a performance impact there is, and how much RAM/VRAM is enough for modern use cases.
      This can be an issue with new hardware as well. For instance, the 3080 10GB can run into performance issues at 4K in some games but not others. Potentially adding in a little bit of multitasking on a second monitor could push the card to its VRAM limit

    • @Zangetsu_X2
      @Zangetsu_X2 1 year ago

      @@flameshana9 how many do you think your mom can handle?

  • @davidbetancourt4028
    @davidbetancourt4028 2 years ago +3

    This is strange. While watching this video at 1440p and playing at 1.5x, my GPU kicks up from 0%-1% to the low 30s. I'm playing in an Edge browser with hardware acceleration turned off. This is on a 3440x1440 monitor. _NOTE:_ With Edge + no hardware acceleration, the GPU usage came from Client Server Runtime Process.
    When I switched to Brave _(hardware acceleration default on),_ the GPU utilization is in the low 40s. Same result with Firefox. The two top GPU usage apps were the browser + Desktop Window Manager, with the browser taking up most of the GPU usage.
    When I used Edge + no hardware acceleration, my CPU usage was between 12% and 18%. With Firefox and Brave, my CPU usage was between 3% and 6%. When playing games, I use Edge because with hardware acceleration turned on, YouTube would kick my quality down from 4K or 2K to 360p every single time. Now it doesn't.

  • @tomvandongen8075
    @tomvandongen8075 2 years ago +11

    How about flipping the question on its head: how low on CPU power do you need to go for this to be an issue? 1st gen Ryzen? Older?

    • @AlarmedJuggernaut8
      @AlarmedJuggernaut8 2 years ago +1

      1600x 6 core here

    • @finestPlugins
      @finestPlugins 2 years ago +1

      I7 2600k here (to be replaced soon).

    • @jamzqool
      @jamzqool 2 years ago

      Bet that all 4-core and lower processors are gonna have a harder time handling all those things

    • @finestPlugins
      @finestPlugins 2 years ago +1

      @@jamzqool Even worse, I'm running at 4K on a 2070S ;)
      The effect of Discord is negligible though - and I'm usually in a group voice chat.

    • @Natehalem1366
      @Natehalem1366 2 years ago

      I have done CPU-based recording/streaming in OBS (I think I did 720p 60fps, fast preset) while gaming on my old 4770 and it used about 20%. If you're just doing Discord and YouTube I doubt it would make a noticeable difference unless you are already on the edge of your FPS target. My guess would be you would have to go back to a 4-thread CPU like the old i5s to see a big difference from Discord and YouTube (but those barely run modern games anyway, so does it really matter?)

  • @santiagoale6232
    @santiagoale6232 2 years ago +2

    This is what I was looking for. I have a 3600 and am strongly considering upgrading to a 5700X. I do light gaming but run VMs, have daily calls, and use Docker with multiple processes running at the same time for work, all while listening to music

  • @NakamiJun
    @NakamiJun 2 years ago +9

    As a VTuber I actually run dual PCs because of how intensive the multitasking is when streaming... But that's because a lot of things are running: OBS (10-15%), VTube Studio (20-30%), audio player (2-5%), web browser (5-10%). Even with a second PC handling this so the other system can game, you sometimes add other things as well, like a video capture program (used for lighting the model) or the Twitch Integrated Throwing System. I have run this all on one PC, but the experience was horrible.

    • @crylune
      @crylune 2 years ago +4

      Why don't you just show yourself instead of some 3D model? Never got this "Vtuber" thing. I suppose I get it if you're too shy to do so, but you'd be great, I promise :3

    • @dakrawnik4208
      @dakrawnik4208 2 years ago

      Might wanna include some hardware specs instead of just cpu usage....

    • @awwtv2603
      @awwtv2603 2 years ago

      Web browser at 5-10%? I call BS. I have 30-60 tabs open at any given time and I only see at most 5%, and this is with a laptop processor, an i9-10980HK, and 64GB of RAM. I do movie editing for part-time work.

    • @johnbuscher
      @johnbuscher 2 years ago +1

      @@crylune Because Twitch is ostensibly marketing yourself, and being “normal” doesn’t stand out when you want new people to join your stream.

    • @NakamiJun
      @NakamiJun 2 years ago +2

      @@crylune Because that's not an option if I value my security. Showing my face means any crazy (including all to many of my neighbors) can find out way to much information about me with enough time. Don't like Vtubers? Don't watch. That's on you, but I'll take being able to be myself online and not needing to worry about getting SWATed or harassed.

  • @imadecoy.
    @imadecoy. 2 years ago +2

    YouTube and Discord are both GPU accelerated. That probably explains the results evening out at the end there.

  • @jakeek1946
    @jakeek1946 2 years ago

    Thank You, Thank You, Thank You!
    Now you finally understand us MMO gamers
    Where guild leader talks in discord and asks us to check out a video link for upcoming raids!

  • @prgnify
    @prgnify 2 years ago +7

    I will always opt for more cores even if in gaming it might be a 1% difference, or maybe even worse than fewer cores due to IPC and clocks. My use case is not common, but if you want to test a 'worst case scenario' based on real usage - I only game at night, after work hours, and usually at that time I'll have my daily backup running (using borg) and might or might not have a video rendering in the background, as well as the usual discord - youtube - spotify and 250+ tabs (unloaded) open in Firefox. Honestly, it is pretty rare for me to see more than 16 cores pegged while gaming (all the games I play are 'light' but CPU limited, like CSGO, StarCraft 2 and Rocket League). I'm sure if I just let everything run wild I'd have more cores pegged, but I use the taskset command (on Linux) to limit things, mostly the video rendering.
    The actual worst case for my computer is when my friends and I are analysing chess games while in queue for a match. Stockfish likes cores.
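
    For readers who haven't used taskset, a minimal Python sketch of the same idea on Linux (the ffmpeg command line and the core range are illustrative placeholders; os.sched_setaffinity and preexec_fn are POSIX/Linux-only):

        import os
        import subprocess

        # Start a background render confined to logical CPUs 8-15, leaving
        # cores 0-7 free for the game -- same spirit as `taskset -c 8-15 ...`.
        def run_pinned(cmd, cpus):
            return subprocess.Popen(cmd, preexec_fn=lambda: os.sched_setaffinity(0, cpus))

        run_pinned(["ffmpeg", "-i", "input.mkv", "output.mp4"], cpus=range(8, 16))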

  • @bwcbiz
    @bwcbiz 2 years ago +1

    Thanks for gathering and providing the data to back up your assertions. The oddities where multi-tasking reduces the performance gain for the 8 core CPU make me wonder if Chrome or Discord is doing some GPU acceleration for decoding or encoding.

    • @flameshana9
      @flameshana9 2 years ago

      You can see that in the Task Manager itself if you're using Windows 10. There's a gpu column. You can immediately tell if Chrome or whatever is using the gpu at all.

    • @andersjjensen
      @andersjjensen 2 years ago

      The boosting algorithm on the 8, 12 and 16 core parts are more opportunistic than the down-binned six core. This is plainly a loss of peak clock speeds.

  • @danonimob
    @danonimob 2 years ago +5

    I know it's too much, but as a long term project, it'd be great to see the Ryzen 1000/2000/3000 added to the sample. Great video! (:

  • @OniMetsuki
    @OniMetsuki 1 year ago +1

    4k60fps at 2x speed can eat about up to 60% (+about another 9% for background windows tasks) of my 2700X alone. I watch 90% plus of my content at 2x speed and still like quality ;)
    Looking to buy a 5700X in the next few days which should help things a Lot :)

  • @doctorfresh3856
    @doctorfresh3856 2 years ago +3

    I want to see the performance hit on Discord when using Discord's GPU acceleration feature.

  • @iamaboozer
    @iamaboozer 2 years ago +1

    This is like the 3rd 5600 vs 5700x video in the last couple weeks from this channel.

  • @johng4357
    @johng4357 2 years ago

    Thank you for running ACC. I have most settings on high or epic, 3440x1440 at 100% resolution and 90 FPS. My 3070 and 5600X manage just fine even with Discord chat, plus streaming via OBS to YouTube at 1720x720 60 FPS

  • @greenbay470
    @greenbay470 2 years ago +4

    Great video! I would love to see some follow-up with some 2000 or 3000 series parts in the future to see how they hold up

    • @Zangetsu_X2
      @Zangetsu_X2 1 year ago

      Ha! yeahhh, expecting these guys to develop a multitasking benchmark or follow up on anything will just leave us disappointed.

  • @anepicjourney
    @anepicjourney 2 years ago +1

    I don't know if it's because of how old my hardware is or if it's because it's 4 cores, but on an i7 3770, YouTube videos have a HUGE impact on performance. 1080p60 videos use 20% CPU. Whenever I have lower fps than usual, I minimize the game to see if I left a video on 1080p and reduce the quality, and there we go, fps increases instantly

    • @sudd3660
      @sudd3660 2 years ago

      I had issues with my 6700K like you have, but it would not show up as fps loss, but stutter. I hope that the 1% lows pick that up on the video chart.

  • @ShantalhaitianPrincess
    @ShantalhaitianPrincess 6 months ago +2

    My 5700X was only $10 more than a 5600X, so I bought it; plus I do some encoding, so it helps with that. I also bought a 5700G for an HTPC, and it was the same price as a 5600X at the time. So it depends on your usage and cost

  • @ZylonFPV
    @ZylonFPV 2 years ago +3

    How about.. what difference does it make to use NVENC to record, stream on discord or on twitch, compared to not using it with various cards? Since it’s the GPU that does the work here, it would be really interesting to see how it scales across all cards. Of course AMD uses something different but it would be good to see that too!

  • @bjn714
    @bjn714 2 years ago +1

    Even with my travel system, which has a 5600X, my typical use of 2 browsers with about 20 tabs open, YouTube playing, and Discord is not even remotely noticeable compared to my 5800X primary system.

  • @jordanmackay6746
    @jordanmackay6746 2 years ago +3

    Man, I thought you finally understood what we were asking for, but apparently not. Many gamers stream their game to Discord while watching their friends' Discord streams. I show 25% usage on a Ryzen 5 2600 when doing this and typically drop frames in demanding games. Up to 40% usage when I am streaming and watching 2 or 3 friends on Discord, which I frequently do when playing games like Ark, V Rising, Conan Exiles, Icarus, etc.

  • @noanyobiseniss7462
    @noanyobiseniss7462 2 years ago +1

    Wish you had added 0.1% lows, and also tested assigning cores.

  • @gusgyn
    @gusgyn 2 years ago +11

    You didn't see much CPU utilization while playing YouTube videos because by default Chrome offloads the task to the GPU; you would need to manually disable GPU decoding in Chrome's flags. Because of this, you would only see an impact while doing another GPU-heavy task, like gaming with titles such as Cyberpunk 2077 at high settings
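
    To reproduce that test, here is a sketch that launches Chrome with hardware video decode disabled so playback falls back to the CPU (the switch name is a Chromium flag that has existed historically and the install path is a placeholder; verify both against your own build, e.g. in chrome://gpu):

        import subprocess

        # Launch Chrome with GPU video decode off; YouTube playback should now
        # show up as CPU load instead of load on the GPU's decode engine.
        subprocess.Popen([
            r"C:\Program Files\Google\Chrome\Application\chrome.exe",  # adjust path
            "--disable-accelerated-video-decode",                      # assumed flag, check your version
            "https://www.youtube.com/",
        ])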

    • @MacCheetah3
      @MacCheetah3 2 years ago

      "So by this, you would only see impact while doing another GPU heavy task like gaming with games like Cyberpunk 2077 with high settings" Probably not as the decoder/encoder is separate from the graphics processing cores/compute units. Not specific to Nvidia but in their own words, "NVIDIA GPUs contain one or more hardware-based decoder and encoder(s) (separate from the CUDA cores) which provides fully-accelerated hardware-based video decoding and encoding for several popular codecs. With decoding/encoding offloaded, the graphics engine and the CPU are free for other operations."

    • @gusgyn
      @gusgyn 2 years ago

      @@MacCheetah3 I tested it on my 1070, and Cyberpunk had really bad frame times while watching videos. I disabled GPU decoding and that fixed it

  • @AdalbertSchneider_
    @AdalbertSchneider_ 2 years ago +1

    Steve, you are the BEST!
    And now please, all the things in the background, but a proper stream via OBS, 1080p / 6k bitrate, reasonable quality, please. Cause the Discord stream is dogshit. Thx 👍
    ( and ofc I mean streaming through CPU encoding, cause streaming via AMD GPUs... you know :P, so yeah, CPU H.264 )

  • @tusharjamwal
    @tusharjamwal 2 years ago

    I also have some topology optimisation running, ableton is open with a song that I'm working on, bunch of tabs from school, and sometimes word and or excel as well.

  • @Matthigast
    @Matthigast 2 years ago +2

    I'm watching this video in 1440p on Firefox, it's using 20-25% of my 6600K, Twitch often goes 30-40%
    Even playing Vanilla Skyrim I often get framedrops when I'm playing a video on my 2nd monitor

    • @thomasschraubt7497
      @thomasschraubt7497 2 years ago

      Firefox playback can stun my 5900X, and don't you dare switch to fullscreen (in my case 4K60 on the second monitor)

  • @scarletspidernz
    @scarletspidernz 2 years ago

    I've always taken it as up to a 10% hit on a midrange system (6/8 core with a 6600 XT/3060 Ti or 6700 XT/3070) when using a PC normally,
    i.e., a few tabs or up to 30 open (with things like emails, walkthroughs, 2-3 YouTube videos open (1 playing though)/Twitch), probably a Word document/Notepad or two open, 1-2 File Explorer windows open, Steam, Epic, GOG running in the system tray, RTSS/MSI Afterburner, HWiNFO, antivirus with active file and process monitoring, Discord group voice chat, webcam, Spotify chill music on low.
    This would be like my default multitasking setup that's been running like this for at least 2-4 hrs before starting a game and benchmarking.
    And maybe add recording & encoding via OBS as another layer.
    The biggest bottleneck I would assume is having a SATA SSD system vs NVMe, or running a pre-Ryzen-release generation system

  • @josir1994
    @josir1994 2 years ago +1

    It may be a nice quest to find out how far back and how low in CPU processing power you have to go to see a meaningful difference in games with or without background tasks, given how people upgrade CPUs far less often than GPUs

  • @Acid_Burn9
    @Acid_Burn9 2 years ago +3

    Would be interesting to see these benchmarks with a higher degree of multitasking (multiple game launchers checking for updates, Spotify streaming music, running a torrent client and/or downloading files from the internet, antivirus scans, hardware monitoring software, software for dynamic RGB control (e.g. Razer Visualizer) and maybe even a few common bloatware programs).
    Obviously these are light tasks on their own, but when running all of them at the same time they might add up to something more significant, and I am curious to see how big of an effect they might have together.
    Also, you might think of running the tasks with hardware acceleration disabled to maximize their load on the CPU instead of offloading it to the GPU. Sure, it's not a very realistic scenario, but nonetheless an interesting one.

    • @flameshana9
      @flameshana9 2 years ago

      You can try all of this yourself. I have a dozen things going on my computer at any time and it's only a 3rd gen i-7. All you need is the Task Manager and a free program called Afterburner to monitor your fps. It's not rocket science to count numbers on a screen.

    • @Dominus_Potatus
      @Dominus_Potatus 2 years ago

      It isn't realistic... if you are actively installing a game or running a virus scan, you'd better not be using your PC for gaming.
      Fun fact: if you have good internet bandwidth, your CPU will be utilized heavily, since the CPU decompresses the downloads at very high speed. Linus even jokes about making it a way to benchmark CPUs.
      Lesson to learn: if you want to download games while playing, you might want to throttle the download speed.
      A 1Gbps download speed might push an i9-12900KF to 40% usage. Why so high? Because the CPU decompresses at the same time it downloads.

    • @Acid_Burn9
      @Acid_Burn9 2 years ago +1

      @@Dominus_Potatus ik. that's why i suggested it

    • @Zangetsu_X2
      @Zangetsu_X2 1 year ago

      @@flameshana9 yes, we *can* do it ourselves. so why did you watch this video then since you can just test your own system yourself. but don't you dare make a video on it and put it on youtube for millions to see, because showing us your results would go against your ethos of everyone having to doing it themselves and keeping the info to themselves.

    • @Zangetsu_X2
      @Zangetsu_X2 1 year ago

      @@flameshana9 hi, have you ran your benchmarks on your old 3rd gen intel ivy bridge yet? how many things were you multitasking at once? what "things" caused the biggest fps hits? we are breathlessly awaiting your video on your youtube channel.

  • @QueueTeePies
    @QueueTeePies 2 years ago +1

    Great benchmark. I did a little test to confirm and it is true (for the most part). Using my brave browser in incognito so it's a clean version adds about 3-5% cpu usage on my 5800x3d at 4k60. The only issue is I never run a clean browser. I have a wide range of extensions that easily add ~10-15% extra. I noticed quite a performance hit especially to my games and video stutter on my old 5600X.

  • @Cylonknight
    @Cylonknight 2 years ago

    I have a 3 monitor setup, my 1440p 165hz, and 2 1080p, I personally see myself as a minimalist when it comes to software/shortcuts/files. I don't even install printer drivers to my desktop, only on my laptop, I only have my corsair and steelseries software launch on startup. I launch everything else myself, steam, battle.net, discord, and msi afterburner. One screen is dedicated to discord, the other to youtube or streaming content along with afterburner, I feel like I lose more gpu resources than CPU with my habits of watching something and playing a game with 3 monitors running off of a 2070 super.

  • @PaiyNstar
    @PaiyNstar 2 years ago +1

    Hi, maybe test it with 2nd monitor and 30+ browser tabs open (5 Twitch Streams) and discord + 3 different Launcher open while recording the game. This would be multitasking :D :D.

  • @wargamingrefugee9065
    @wargamingrefugee9065 2 years ago

    Holding up the coffee cup in the thumbnail had me laughing. Thanks for that. :-)

  • @ZLO_FAF
    @ZLO_FAF 2 years ago +1

    I have 100 Discord servers, and when streaming I usually have 3 Chrome windows with up to 50 tabs combined (though one of the windows is chat). Also, I use software to remove noise from the mic on the fly, and it uses quite a lot of CPU. Then, while waiting for one game to start, I might play some other game that can be paused, like Minecraft (but I play with 300 mods on Minecraft, so it is not a light task). There is also software to display APM, and dynamic objects in OBS also consume a lot of CPU even if you stream using hardware encoding

    • @ZLO_FAF
      @ZLO_FAF 2 years ago

      Oh, I also am a casual "pro gamer", but I play RTS and don't care about fps in game that much (I remember literally playing tournaments at 20 fps, but that was a long time ago)
      I used a Xeon E5-1650 with 16 GB of 1333 DDR3 just fine and only upgraded so I can fast-forward game replays faster, because they are very limited by CPU single-thread performance. Also, running two games at the same time is usually only a problem when the paused game does not stop using the GPU...

  • @Lightkie
    @Lightkie 2 years ago

    THANKS STEVE! I mean that completely unironically, the StarCraft II benchmark with the Ryzen 7 5800X3D is exactly what I was looking for (even if unrelated to the topic) and wasn't able to find from any of the other usual suspects.

  • @majinmew7745
    @majinmew7745 2 years ago +1

    PLEASE do an additional test using the current budget king, the Intel 12100(F). Another 2 cores less would be interesting to see in such a comparison with regards to FPS loss with background tasks...

  • @FawBio
    @FawBio 2 years ago +3

    What about a 4c/8t CPU like the 12100 or 4100?

  • @Clobercow1
    @Clobercow1 2 years ago +1

    This video felt like it was out to prove something by using a multitasking test that is as benign as possible.

    • @Hardwareunboxed
      @Hardwareunboxed 2 years ago

      Well sorry about that but it was addressing exactly what the bulk of the audience asked for.

  • @koyomiararagi3018
    @koyomiararagi3018 2 years ago +2

    While I don't think it would make a noticeable difference, I am wondering if you cleared your cache between tests, considering that you might be loading the video from cache after the first viewing of the same video.

  • @touhami_dz6458
    @touhami_dz6458 8 months ago

    I love this type of video; thanks to you I decided not to wait 2 more months and just get the R5 5600, 'cause that's what I have in my pocket right now

  • @donpowers5412
    @donpowers5412 2 years ago +22

    I picked up the 2700x over the 3600 back then betting on cores coming into play in the future, it is safe to say I will be grabbing a 5600 for my next upgrade

    • @ramen645
      @ramen645 2 years ago +2

      But muh cores!

    • @goa141no6
      @goa141no6 2 years ago

      Especially for overclocking, 6 cores is the sweet spot, since 8 cores cause latency, which at stock is compensated for with higher speeds.

    • @metamon2704
      @metamon2704 2 years ago +2

      @@goa141no6 There is no latency from the 3000 series and up; it was only the old Zen parts that had that.

    • @michaelbakker5720
      @michaelbakker5720 2 years ago +1

      i went from a 2700X and 2080 to 5600X and 6900XT. The IPC uplift is very significant, and it'll be a while still before games really challenge/utilize those 6 cores.

    • @gatsusagara6637
      @gatsusagara6637 2 years ago

      If I were you, I would examine what you play first. My experience with the R5 1600 -> R7 3700X and R9 5900X taught me that it really depends. I have 3 separate rigs at various locations and I tend to play on all three sporadically. For certain games, the 1% lows suffer mainly due to CPU resources not being enough. The worst offender is BF2042, but I can see that the newer titles are definitely more CPU demanding. Maybe skip a lunch and grab the R7 5700X just in case. Unless, of course, you don't tend to play games that require lots of multicore resources.

  • @snowdawn
    @snowdawn 2 years ago +1

    Excellent testing, Steve. Thank you.

  • @x1c3x
    @x1c3x 2 years ago +1

    I remember playing the then-new Call of Duty WW2 on my old Intel 4c/4t Q9550 and 750 Ti.
    It was heavily CPU bottlenecked and turning off Discord actually made a difference, but still all cores at full 100% usage. Also, if I was in a Discord voice channel, the audio would stutter heavily and sometimes just stop for a few seconds, then play the content of those few seconds "all at once" in 1s. lol.
    Don't remember the fps difference, but that stuck in my head as the worst CPU bottleneck I've had happen to me, with Battlefield 1 being a close 2nd at the time.

  • @ajkun5587
    @ajkun5587 2 years ago

    Thank you so much for this! this is what i needed for my planned build!

  • @cmja09
    @cmja09 2 years ago +1

    Guess who doesn't need an upgrade... me.
    Thanks Hardware Unboxed!

  • @christrubl5014
    @christrubl5014 2 years ago

    Could you perform a test while gaming that includes the following background tasks.
    1. Msi after burner (polling GPU temp, GPU core clock, GPU memory usage, system ram usage, CPU temp, CPU utilization, CPU clock, FPS and frame time)
    2. RGB controller software (e.g Razer, Corsair, ASUS…etc or run two rgb programs together for those that use a mix of manufacturers for keyboard, mice, headset and desktop rgb lights)
    3. Spotify (or any other music streaming service or a YouTube music playlist)
    4. Audio equalization software (APO/peace, Dolby, windows sonic, DTS or whatever else people use)
    5. Active discord audio session
    6. A common antivirus program installed (but no manual scans running)
    7. Loupedeck or elgato software running
    8. Some chrome tabs open (user guides…etc)
    9. Steam, Epic, Origin or other game platform open with chat tabs (for those friends that refuse to use Discord)

  • @truemyth3148
    @truemyth3148 2 years ago +1

    Love to see transcoding, file moving and photoshop like tasks and the effect on gaming if you need more ideas.

    • @phillipmarnik
      @phillipmarnik 2 years ago +2

      How many people do you think would ever find themselves in this scenario?

    • @truemyth3148
      @truemyth3148 2 years ago

      @@phillipmarnik oh you’re probably right, but are you saying you’re not curious?

  • @Aokimarcel
    @Aokimarcel 2 years ago +2

    I was on an R5 3600 and changed to a 5700G. Of course the single-thread performance difference between the 3000 and 5000 series was significant, but when running 4 BlueStacks instances + other utilities, I felt a great difference in my main game when I changed the CPU. I guess I would have to try it against the 5600G to see if the difference would still be significant.

    • @flameshana9
      @flameshana9 2 years ago

      Emulation is always cpu heavy. He wasn't testing that here.

    • @Aokimarcel
      @Aokimarcel 2 years ago

      @@flameshana9 I know. But he asked for examples of use cases where there could be an impact on performance.

    • @flameshana9
      @flameshana9 2 years ago

      @@Aokimarcel There _shouldn't_ be any issues unless you only have 2 cores. My old i7 (4 cores 8 threads) handles multitasking without a problem, so any modern machine has even more headroom.
      Whenever you have doubts about performance just use the tools that are built in to Windows and see for yourself. The Task Manager will show cpu and gpu performance in % or on a graph. What else do you need? Fps can be monitored by Afterburner which is free.
      Open only the game, note the fps, run all the programs, note the new fps. Unless your cores are full you can keep adding programs until you utilize them all. And even then Windows will prioritize the active window. So multitasking is a non-issue these days.
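
      If you want to turn those two readings into a number, the arithmetic is just a relative difference (the FPS values below are made-up placeholders):

          # Percentage cost of the background load, from two Afterburner readings.
          baseline_fps = 144.0   # game alone
          loaded_fps = 139.0     # game + YouTube + Discord + everything else
          loss_pct = (baseline_fps - loaded_fps) / baseline_fps * 100
          print(f"multitasking cost: {loss_pct:.1f}% ({baseline_fps - loaded_fps:.0f} fps)")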

  • @Kaori42
    @Kaori42 2 years ago +1

    Hello, I have a 5600X with an RTX 3070, playing on a 1440p 144Hz monitor (and a second 1080p 60Hz). The only performance hit that I saw was while playing a video: the higher the quality, the more fps I lost. It seems like the more I increased the video quality, the more my GPU usage dropped!

  • @UltraSolarGod
    @UltraSolarGod 2 years ago +1

    Thank you for doing this test. I know now that the 5600 is more than enough for gaming, even with a dual-monitor setup 🔥.
    One last question: I am someone who keeps his CPU for at least 6-8 years, so for that amount of time is the 5600 good enough?
    I have been using an i5 4690 for around 7 years; it's finally time to upgrade, thanks to your video

    • @flameshana9
      @flameshana9 2 years ago

      If you're like me and wait that many years then spending $200 more for a better cpu would be worth it. $200 divided by 6 years is not much more than $30 a year or around $3 a month.

    • @UltraSolarGod
      @UltraSolarGod 2 years ago

      @@flameshana9 you mean 5900x or 5950x or 7800x

  • @turk88
    @turk88 2 years ago

    It is all perception; we all know the background tasks do not really do much. The only time it matters is when one of the apps has a random crash, etc.

  • @Algorythm44
    @Algorythm44 2 years ago +18

    14:10 The 5600 is more than capable of streaming with reasonable settings without a noticeable hit to performance, you don't need to use gpu or a second pc like it's 2010.
    And for anyone that asks I mean reasonable as in like 720p 60fps 4000kbps faster/fast/medium, twitch doesn't like when you go over 6000kbps so 1080p is out of the question and looks horrible with less than 6000kbps
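
    For reference, those "reasonable settings" expressed as an x264 command line (a sketch only; OBS does the equivalent internally, the input capture source is a placeholder, and the Twitch ingest URL/stream key must be your own):

        import subprocess

        # 720p60 at ~4000 kbps with the x264 "faster" preset, under Twitch's ~6000 kbps guidance.
        subprocess.run([
            "ffmpeg", "-re", "-i", "capture_source.mkv",                  # placeholder input
            "-vf", "scale=-2:720", "-r", "60",
            "-c:v", "libx264", "-preset", "faster",
            "-b:v", "4000k", "-maxrate", "4000k", "-bufsize", "8000k",
            "-c:a", "aac", "-b:a", "160k",
            "-f", "flv", "rtmp://live.twitch.tv/app/YOUR_STREAM_KEY",     # placeholder
        ], check=True)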

    • @PlasticinaVerde
      @PlasticinaVerde 2 years ago +6

      The advantage of using a dedicated streaming PC is that the layouts can use a lot of VRAM once you start to add a lot of stuff to them. So yeah, there is that

    • @HazewinDog
      @HazewinDog 2 years ago

      Just get an Intel CPU and use Quick Sync; it's the only way to stream at high quality without affecting 1% and 0.1% lows. I mean, I guess a 5900X might do it as well, or of course a separate PC. Maybe there are even ways to use a secondary dGPU for it. But for most people starting out, Quick Sync is the best way IMO. Unless you're fine with using GeForce Now or AMD's equivalent of course; those are both great options if you don't need additional features.

    • @fafiteee
      @fafiteee 2 years ago +2

      @@HazewinDog how does quicksync compare to the (turing and later) nvenc encoder from nvidia?

    • @Angel7black
      @Angel7black 2 years ago +1

      I'm assuming you mean with an encoder, cause my old 3700X took hits even when using x264. You end up having to use x264 fast a lot of the time

  • @lorsch.
    @lorsch. 2 years ago

    I sometimes give 2-4 cores (4-8 threads) of the 5800x exclusively to a VM for "Cloud gaming" in the private network, even there the system memory is more of a bottleneck (16GB is rough) and when the VM runs a game simultaneously, it's the power limit of the GPU.

    • @Hardwareunboxed
      @Hardwareunboxed 2 years ago +1

      It goes without saying, but if you're running VM's obviously get as many cores as you can afford.

  • @TleafarafaelT
    @TleafarafaelT 2 years ago

    I love the fact that you test ACC; you're the only channel I see testing it, and I love it because I race there so much :D

  • @daboross2
    @daboross2 2 years ago +1

    I would bet the architectural limitation putting the 5600x and 5700x at the same performance is memory bandwidth! Need memory to feed the cores.
    For that reason I'd be really interested to see a 5800x vs 5900x test! With twice the memory write bandwidth, maybe the 5900x can overcome the single-CCD CPUs?

    • @Dominus_Potatus
      @Dominus_Potatus 2 years ago

      I mean... the 5800X3D is de facto the fastest gaming CPU from AMD; it beats even the Ryzen 9 because of the large cache.
      I place my bet that the 5800X3D will win against the 5900X in gaming and background multitasking.
      The 5900X might win if you are using your PC while actively running multiple heavy programs, such as rendering while gaming

  • @johnpaulbacon8320
    @johnpaulbacon8320 2 года назад

    Great video. I have always been a Multi-tasker. I try to build my computers to handle Multi-Tasking. Thanks for making this video.

  • @Yan1nc
    @Yan1nc 2 года назад +4

    The weird bottlenecking where the 5600X and 5700X become the same with multitasking is probably core boost behaviour. I've seen some 5700X chips boost less than a 5600X with the same number of cores being taxed. My 5600X runs 5 GHz on up to 4 cores (it's OC'd, obviously), but as soon as it goes to 6 cores it has to drop to 4.7, otherwise it's not even close to stable.
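
    One way to watch that boost behaviour directly is to sample per-core clocks while you ramp up the number of loaded cores; a minimal psutil sketch (per-core readings are reliable on Linux, while on Windows psutil may only report a single package frequency):

      # Sample core clocks for a few seconds to see boost drop as more cores load up.
      import time
      import psutil

      for _ in range(5):
          freqs = psutil.cpu_freq(percpu=True)
          print([f"{f.current:.0f} MHz" for f in freqs])
          time.sleep(1)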

    • @mustangboss9295
      @mustangboss9295 2 года назад +1

      Are you running a 103 BCLK? 48.5 is the max multiplier, at least with PBO, and 100 MHz × 48.5 only gets you to ~4.85 GHz, so you'd need roughly a 103 MHz BCLK to hit 5 GHz.

  • @tomdillan
    @tomdillan 2 года назад +1

    I must be doing something wrong with my system: i7-9700F, 32 GB DDR4, RTX 3060 Ti, WD SN750, SanDisk 1 TB SSD, Noctua NH-U12A with dual 120 mm fans, Win 10 with all drivers updated, in a pretty much open-air Fractal Node 304. At idle in Warzone the CPU is at 60 to 70 percent, and during a game it goes up to 95-ish percent, while the GPU goes up to 10%. I am running 1080p at 144 Hz refresh in all games.
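
    A quick way to see what is actually eating those cycles is to rank processes by CPU over a short window while the game is running; a minimal psutil sketch, not specific to that system:

      # Print the top CPU consumers over a ~3 second window, to see whether the
      # game itself or something in the background is responsible for the load.
      import time
      import psutil

      procs = list(psutil.process_iter(["name"]))
      for p in procs:
          try:
              p.cpu_percent(None)        # prime the per-process counters
          except psutil.Error:
              pass
      time.sleep(3)                      # sampling window

      usage = []
      for p in procs:
          try:
              usage.append((p.cpu_percent(None), p.info["name"]))
          except psutil.Error:
              pass
      for pct, name in sorted(usage, reverse=True)[:10]:
          print(f"{pct:5.1f}%  {name}")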

  • @Yad_HG
    @Yad_HG 2 года назад

    I know a certain Deals channel that might have a different opinion lol. The tone generally is "get 8 or 10 cores and up, or stop being poor". Although very knowledgeable in the field, some reviewers just tend to forget that not everyone has RUclips money or lives in the US.

  • @solocamo3654
    @solocamo3654 2 года назад +1

    BFV 64-player would definitely make a big difference. Even at 4K ultra, my 10900 is at 80% usage on an OC'ed 6900XT. And that's with the bare minimum of software running.

    • @MrHope86
      @MrHope86 2 года назад

      Indeed, this dude is testing with games that are easy on the CPU. BF or Warzone, even Tarkov, will show the difference.

  • @cser2453
    @cser2453 2 года назад +1

    I have a Ryzen 5 3600, which is an upgrade from my old CPU but a downgrade in cores, and I've noticed that when gaming with a video playing, the video stutters and lags behind to the point that it is unwatchable.

  • @registeredblindgamer4350
    @registeredblindgamer4350 2 года назад +8

    It is quite obvious that the people who thought you needed 8 cores because of multitasking watch far too much TechDeals, which is an awful outlet for testing.

    • @emlyndewar
      @emlyndewar 2 года назад +2

      "I couldn’t possibly use a 6 core CPU, I need those extra cores because I’m a gullible."

    • @evenAndre
      @evenAndre 2 года назад +1

      true :)

    • @MrNota500
      @MrNota500 2 года назад +1

      Ikr, Tech Deals recommends all top-end crap, which is ironic considering the name of the channel. Can't really take to the dude; he acts like he knows it all in a snobby way.

    • @registeredblindgamer4350
      @registeredblindgamer4350 2 года назад

      @@MrNota500 I couldn't agree more! Tech is very bad about only recommending top-tier hardware. Even if he recommends a mid-range GPU, he still advocates for a high core count CPU. He's extremely arrogant and tries to be a know-it-all, and ends up falling short every single time.

    • @thomasschraubt7497
      @thomasschraubt7497 2 года назад

      I guess great outlets for testing use GPU-accelerated tasks to prove something about CPUs, right?

  • @Yeahboii1
    @Yeahboii1 2 года назад

    Oath, reading the comments on the Q&A, this video was a necessity.

  • @f.miller801
    @f.miller801 Год назад

    I always play with YouTube in the background and I definitely feel like the performance tanks a little bit.

  • @Lordmagus87
    @Lordmagus87 2 года назад +1

    What about RGB control programs or dynamic wallpapers, or both?

  • @charleswidmore5458
    @charleswidmore5458 2 года назад +1

    Very interesting stuff.
    I figured the performance hit was exaggerated on my dual core i3 and 1050ti.
    Maybe it is time to upgrade....

  • @churblefurbles
    @churblefurbles 2 года назад +1

    Used to matter more with quad cores

  • @devonmoreau
    @devonmoreau 2 года назад

    Superb work, this answers a lot of questions for me.

  • @kajurn791
    @kajurn791 2 года назад +1

    Great video. Now I'd like to see how Alder Lake does with its E cores. I think that's going to be very important to know moving forward.

    • @djmccullough9233
      @djmccullough9233 2 года назад

      Why is the performance of low-power cores, which are by design not high-performance cores, important? Sure, if you are power limited they have their use. But given a choice between X number of Performance cores or the same number of Efficiency cores, the Performance cores are going to be far more desirable to almost every end user. It will only be "important" going forward for laptop form factors or tablets, where they have to sip power. Edit: also, I guess they'd be of mild interest to people who have very badly designed desktop systems that struggle to manage their own thermals, I suppose.

    • @xxch1coriaxx
      @xxch1coriaxx 2 года назад +1

      @@djmccullough9233 hmmm 🤔.... they are supposed to handle the background tasks??

    • @kajurn791
      @kajurn791 2 года назад +1

      @@djmccullough9233 E cores are there to handle Windows background processes and multitasking, so the stronger cores don't have to deal with other processes that interfere with gaming. That "low power core" nonsense you're claiming is silly, since they're on par with Skylake cores in performance, and Skylake is still more than powerful enough to game on; in fact, Ryzen CPUs up to Zen 2 struggled to match its gaming performance. That you mention badly designed desktops is disingenuous: I had a Ryzen that got to 89 degrees under load, and I have a 12600K that doesn't even get to 75. Don't be a stupid fanboy.
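
      A minimal sketch of manually steering a background app onto the E cores so the P cores stay free for the game; the E-core indices below are an assumption (e.g. logical CPUs 12-15 on a 12600K, after the 6 P-cores with SMT occupy 0-11), and in practice Windows 11's scheduler does this kind of steering automatically:

        # Push an example background process (Discord) onto assumed E-core indices.
        import psutil

        E_CORES = [12, 13, 14, 15]                   # assumed E-core logical CPUs, verify per CPU

        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == "Discord.exe":   # example background app
                try:
                    proc.cpu_affinity(E_CORES)
                except psutil.Error:
                    pass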

    • @djmccullough9233
      @djmccullough9233 2 года назад

      @@kajurn791 Well, true, but if the purpose was to see the impact of background tasks on the CPU and gaming performance, those extra threads are just going to insulate the data from showing the impact.

    • @djmccullough9233
      @djmccullough9233 2 года назад

      @@kajurn791 And what is nonsense about calling the E cores low power? They are meant to use less power and are slower than the performance cores.

  • @mhh3
    @mhh3 2 года назад +5

    RIP every "bUt i'Ve BaCKGroUnD TaSKS Open SO I NeED 8 CoRES" Comments in every 6 vs 8 cores for gaming discussion

  • @vmafarah9473
    @vmafarah9473 2 года назад +1

    We also wanna see a 4-core 12th-gen i3 in the comparison.

  • @andrewmcewan9145
    @andrewmcewan9145 2 года назад

    Just a point: only Arc, RDNA2 and Ampere have AV1 decode for YouTube.
    I believe YouTube uses H.264 up to 720p, but I don't know what makes it use VP9 vs AV1.
    Edit: that means your CPU does the decode work for YouTube, depending on resolution/framerate.
    Also, as a note, audio quality differs depending on the YouTube resolution AFAIK; that may have changed, as I did that test a long time ago.
    Although in most cases, as we are talking about weaker GPUs here, the impact of the extra load will be smaller anyway.
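
    One way to check which codecs YouTube actually offers for a given video (avc1 = H.264, vp9/vp09 = VP9, av01 = AV1) is to list the available formats with yt-dlp; the URL below is a placeholder:

      # List the video formats YouTube serves and keep lines naming a video codec.
      import subprocess

      result = subprocess.run(
          ["yt-dlp", "-F", "https://www.youtube.com/watch?v=VIDEO_ID"],
          capture_output=True, text=True, check=True,
      )
      for line in result.stdout.splitlines():
          if any(c in line for c in ("avc1", "vp9", "vp09", "av01")):
              print(line)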

  • @ericp1480
    @ericp1480 2 года назад

    This is why I subscribed to this channel. Great info, thank you!

  • @ChrisGreeceChris
    @ChrisGreeceChris 2 года назад +2

    What about stutters though????

  • @zoopa9988
    @zoopa9988 2 года назад

    This is why I prefer an i5-12600K with my RTX 3080 when aiming for 4K@120Hz.
    Don't need many cores, just fast cores, especially when running an old game like SC2.
    I would upgrade to an RTX 3080 TI or 3090 before I would upgrade my CPU.
    Or my ram from 16 to 32 GB, in case I want to run a server.
    Or just upgrade my storage from 1 to 2 TB of NVMe, just for convenience.
    Hell, I can upgrade both if I don't upgrade the CPU and still have money left over.

  • @SomethingCool51
    @SomethingCool51 2 года назад +1

    Curious about the performance hit when playing online games while hosting the server locally, like Minecraft or Rust (plus TeamSpeak). Maybe those games can't leverage 6 cores anyway, so no difference; who knows.
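
    For what it's worth, a minimal sketch of that setup: launch the local server with a capped heap and pin it to a couple of spare cores so the game keeps the rest; the jar path, heap size and core indices are placeholders:

      # Start a local Minecraft server and restrict it to two assumed spare cores.
      import subprocess
      import psutil

      server = subprocess.Popen(
          ["java", "-Xms2G", "-Xmx2G", "-jar", "server.jar", "nogui"],
          cwd="/path/to/minecraft_server",              # placeholder directory
      )
      psutil.Process(server.pid).cpu_affinity([4, 5])   # assumed spare logical CPUs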

  • @robertm9682
    @robertm9682 2 года назад

    So glad you did this!

  • @MrFredscrap
    @MrFredscrap 2 года назад +1

    Wouldn't this test be a bit better if an Nvidia GPU were used, with its known driver overhead problem/CPU handicap?
    I also wonder what happens when RAM is reduced to 16 GB rather than 32 GB, as I'd imagine most people with an R5-series CPU would actually have.
    There's also the question of how much variability is introduced when ReBar is enabled in this test.

  • @guiorgy
    @guiorgy 2 года назад

    I didn't think Discord would be much of a problem for a 6-core CPU anyway. However, what might be interesting to check out is downloading/installing/updating games or other apps/files from the browser in the background while gaming. After all, if you open Flight Simulator and see that a 50 GB update needs installing, would you decide against gaming, or would you choose to play something else to kill time while the game updates?

    • @davidandrew6855
      @davidandrew6855 2 года назад

      For me, I'd download first, but that would be less than 20 minutes on a good day. However, I would think that the hit to one's Internet connection would be more significant than the actual processing of an update to a game in the background, especially if it's going to an SSD vs an HDD. I could be wrong; I might look at CPU loads on my next update to see how much it eats up.

  • @flamespirit1
    @flamespirit1 2 года назад +1

    A little late for this content...

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +2

      Obviously not...

    • @phillipmarnik
      @phillipmarnik 2 года назад +1

      I would have been interested yesterday, today yawn.

    • @AgentSmith911
      @AgentSmith911 2 года назад +1

      @@Hardwareunboxed It's a timeless topic; it was relevant a decade ago when we moved away from one core, and it's relevant today and tomorrow as we move on with more cores and threads, E-cores, scheduler etc

  • @FooMaster15
    @FooMaster15 2 года назад

    Thanks for this. I've always wondered about this.