Agreed, I switched from a 4770 to a 5600X, and THAT was a true improvement. I'm good with the 5600X for the next 5 years, easily. Especially considering the fact that even an RTX 4090 will work with it well, and now I have a 4070, which should last me for years.
@@UltraSolarGod I'm still running on my 10-year-old Corsair 650W RMi, so no, not every PSU is the same. I like this rig a lot; it's semi-passive, with a Noctua cooler on the CPU and all fans off at idle. Even when gaming there is literally no noise from the computer 🙂
If you have a Ryzen 3000 series CPU, literally don't buy ANY 5000 series CPU. End of discussion, going by real-world testing. The only use case is an RTX 3090 or RX 6900 XT with a 5800X or 5900X, but a 3900X overclocked is exactly the same as the 5900X under the same conditions. The only difference is the chipset and PCIe 4.0, which at the moment is truly useless over PCIe 3.0: 2-4 frames at best in a very small range of selected titles, and if you go for "Rage Mode", up to 7 FPS with a 6900 XT, but still far less than a 3900X OC with an RTX 3090. End learning outcome: DON'T waste your money on Ryzen 5000 ANYTHING if you have a Ryzen 3000 CPU now. And if you're upgrading, STILL buy a Ryzen 3000 series CPU regardless! It's smarter and cheaper than wasting money on a bigger number for exactly the same performance! Ryzen 5000 was ONLY brought out for PCIe 4.0, and it's not even a 7nm+ node; it's just a refresh with some tiny improvements in the silicon. Nothing you'll EVER notice in ANY application aside from Cinebench R20. Save your money now for Ryzen 7000 on 5nm FinFET. Don't even bother with the tick-tock 7nm+ of Ryzen 6000 in the meantime. If you have technical evidence to the contrary, I'd genuinely love to see it 😂 If you do, I'd be gobsmacked to find that all of my research has been for naught from what the industry has told me so far!
HA! The Ryzen 9 3900X is more expensive than the Ryzen 9 5900X. The 3000-series Ryzen CPUs still cost the same as, or more than, the newer 5000-series Ryzen CPUs. I just picked up another 5900X for a second small-form-factor PC build that I can bring to friends' houses for LAN parties.
@@l.i.archer5379 That shows the quality difference though, doesn't it? If the 3900X still costs as much as or more than the 5900X, it says a lot: the 5900X is not as well made or quality controlled as the 3900X to this date, lol. I still stand by my original statement; this is just supporting evidence of it, right? Thank you for your feedback.
More than fine! Though Nvidia GPUs use the NVENC encoder, the new 6000 series supports the AV1 codec, which in my opinion is superior in its own way to Nvidia's. Just make sure your software supports the AV1 codec, e.g. Adobe or DaVinci Resolve.
I have no experience with picking hardware for productivity apps. You should check out some video editing specific channels to see comparisons or demos.
So what combination would you recommend for playing D4 and CS2 on a 1080p, 144Hz monitor? I'm currently thinking about the following combos:
Ryzen 5 5600X with Radeon RX 6950 16GB GDDR6
Ryzen 7 5800X/Ryzen 7 5800X3D with Radeon RX 6800 16GB GDDR6
Ryzen 7 5700X with Radeon Red Dragon RX 6800 XT 16GB GDDR6
It is important to note that what you see here is true ONLY IF you are PURELY GAMING at that moment. However, some people like to run a heck of a lot of background tasks: a web browser (lots of Chrome tabs open), the Discord app, streaming your game, some monitoring software, and even light emulators (Android games, AFK auto-gaming). So if you're one of those who constantly have other things open on your PC, you will benefit from having more cores. If you're perfectly fine with closing everything but the game you're playing, then the 5600X (or the Intel equivalent) will be more than enough. (Assuming you're on the right resolution and graphical settings, of course.)
Before I start, I want to say that I don't have any plans to play games at 4K; I will either stay with my 1080p monitor or maybe upgrade in the future to 1440p. I'm torn between RTX 3070 + Ryzen 5 5600X and RTX 3080 + Ryzen 5 5600X. Will it bottleneck my PC later on? Or should I just get an RTX 3080 + Ryzen 9 5900X? I know it's overkill for my 1080p-1440p setup, but I'm also taking into account that I have to stay with this PC for a long time, for future proofing.
I don't approve of the future-proofing concept with PC hardware. It especially does not make sense in 2021, when some parts cost a lot more than they should. I'd get whatever you need for right here and now. It will definitely get you through the next 2 to 5 years at 1080p. RTX 3070 + 5600X sounds like a good choice. However, if you insist on future proofing, then more CPU cores make sense, as modern games can spread the load across multiple cores much better than old games did. I expect this trend to continue.
If you're thinking about the RTX 3080, I would recommend the RTX 4070: lower power consumption, the same raw power as the 3080, and you get DLSS 3 and frame generation, which give a good 40% improvement over the 3080. I know because I had a 3080 and now a 4070 🙂. I used both on a 5600X. I play on a 3840x1600 LG 38" monitor and everything is amazing.
What about with a Quadro RTX 6000? My primary use is SolidWorks. My Razer Blade Pro 17 is taking hours to render, so I am building a desktop machine. Thanks!
I specialise in gaming performance only, so I can't answer that. If I were you, I'd look for answers in the professional community. Reddit is a good place to start.
Always invest in a better GPU first. When you see that it is not 98-100% loaded in games, then you can start worrying about upgrading the CPU. The 8700 is still a damn fine CPU imho.
@@HassanAchieved 3-4, probably. I think we will start seeing shorter breaks between releases. The competition is heating up on all fronts. Intel and AMD especially will have to speed up their releases if Apple starts releasing a new CPU/GPU every year like they do with iPhones.
I just figured out that this is a thing. I have a Ryzen 5600X with a 3080. Fans start blasting and my GPU usage jumps to 92%. I use triple monitors. What should I do to maintain the triple-monitor gameplay?
@@burntorange3 Sorry, the problem was solved a while back. I had to set a frame rate limit. My FPS was over 200 at times, so it was maxing out my GPU, etc. I set the limit to 75-80 max. But thank you!
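For anyone curious how such a cap works under the hood: a limiter just sleeps off the remainder of each frame's time budget. Here's a minimal, hypothetical sketch in Python (`render_frame` is a placeholder for whatever the game does per frame; real limiters like the driver's or RTSS's do this far more precisely):

```python
import time

TARGET_FPS = 75                  # the cap mentioned above
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted per frame

def limited_loop(render_frame, frames):
    """Run the render loop, sleeping off leftover time each frame.

    The GPU then idles for the rest of the budget instead of
    rendering 200+ uncapped FPS and maxing itself out."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        leftover = FRAME_BUDGET - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # never start the next frame early
```

Same effect as the in-driver limit: lower GPU load, lower heat, quieter fans, at the cost of the frames above the cap.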
Yes, someone actually says it. I don't know why people always claim the CPU is gonna bottleneck a high-end GPU, as if the person is gonna play CS:GO at 1080p lowest settings. Especially when it's a new-gen entry-level CPU like a 5600.
I'm gonna buy a used 3080 Ti with an AM4 CPU for 1440p gaming and a little bit of video editing. I was thinking 5800X or 5600X because of my low budget, and this video resolved my doubts about the 5600X. It's better than a 3080 + 5800X combo, I guess. You can wait longer for video rendering and get the same result, but you cannot wait for future frames in games :). Pretty good and useful video.
Go with the 4070, way better choice: lower power consumption, basically the same raw performance as the 3080, and DLSS 3 and frame generation take it even further ahead of the 3080. I used both a 3080 and a 4070 on a 5600X 🙂
@@krecikowi Got my 5800X and a used 3080 Ti. I'm pretty happy with the 3080 Ti, but cooling the 5800X is so freaking hard: I bought a new case, a 360mm liquid cooler and a bunch of fans. Undervolted and still 75 degrees while gaming, which drives me crazy. If you asked me today, I would go with a 4070 Ti too, and an Intel platform.
@@bruhyldrm5191 My 5600X is running at 85-90C 😁 (passive cooling) and it's not throttling too much. In general a CPU can run "hot", as long as the clocks are as expected.
4 hours? Reminds me of the stories my parents told me about communist USSR times. They had to stand in line for hours to get almost any product😅 But never mind. Congrats on your new GPU! What's the first game you're gonna fire up on that bad boi?
@@theivadim Haha. Yeah. I was born in a communist country too and remember as a little kid my grandma having to wait outside the store in a line to buy milk and eggs multiple times. I experienced this first hand back in December when I waited for about an hour outside a Micro Center to get a 6800 XT.
Of course the CPU isn’t that important as long as you have a modern six-core. Reviewers have said that many, many times. That said, driver overhead is a different, related problem for Nvidia cards.
I agree with Vincetas. Always upgrade the GPU first; you will get the most out of that upgrade even if the CPU bottlenecks. Then you can install CapFrameX, MSI Afterburner or a similar overlay app to see the CPU/GPU load stats during games. Test it. If the GPU is loaded at 90-95% or even lower, then you can start planning a CPU upgrade.
iVadim, thank you for the work you are doing! Just a quick question. I know there's no way of benchmarking CoD: Warzone, but I've heard it's a very demanding game CPU-wise. Therefore, wouldn't I see more FPS with an 8-core/16-thread CPU paired with a 2060 Super (as of today; I will upgrade the GPU at a later stage) at 1440p? I know my GPU is most likely bottlenecking my gaming rig, even though I'm on a Skylake 6700 OCed to 4.5GHz... but with the Nvidia overhead issue on older CPUs, I believe upgrading the CPU will improve my experience as well. Should I stick with a 5600X or its Intel Rocket Lake counterpart?
Always upgrade the graphics card first, especially at higher resolutions. The GPU is the bottleneck in your PC at this point. You can check it for yourself by installing CapFrameX or MSI Afterburner and enabling the overlay to check your CPU & GPU load. You will see that your GPU is most likely working at its full potential; hence you will not see any difference in performance by changing your CPU. That was the whole point of this video: to demonstrate that even top-of-the-line current-gen GPUs are too weak to keep up with modern CPUs in most cases.
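The overlay check described above can be boiled down to a rule of thumb. A hypothetical sketch (the 95% threshold is my assumption, not a number from the video): feed it the GPU-load samples from your CapFrameX/Afterburner log and it guesses which component is the limiter.

```python
def bottleneck_hint(gpu_load_samples):
    """Given GPU load samples in percent, guess which component limits FPS.

    GPU pinned near 100% -> GPU-bound, a faster CPU would change little.
    GPU well below that  -> CPU (or engine) bound, the GPU is waiting."""
    avg = sum(gpu_load_samples) / len(gpu_load_samples)
    return "gpu-bound" if avg >= 95 else "cpu-bound"
```

For example, samples like `[99, 98, 100]` suggest the GPU is already the limit, while `[70, 75, 65]` would point at the CPU.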
What about frame timing? What about system smoothness and no hiccups? What about multitasking, like gaming with Chrome and X tabs open? Or using an ultrawide or 2-3 monitors? That's what you cannot benchmark, and after all we can say IT DEPENDS.
The only time I recommend getting anything above the 5600X is if you're doing CPU-intensive tasks other than gaming, like photo and video editing, streaming, recording, rendering, etc.
@@theivadim Also, if you have an RTX 3000 series GPU, you can record and stream even with the 5600X, because RTX 3000 cards do most of the video encoding, which takes a lot of load off the CPU.
Can someone help me out? I'm looking to upgrade my i7-5820K to an AMD Ryzen 5600X. I'm running a 2080 video card that will be updated, just not right now. I want to know if I'll see a big increase in my video card's performance when upgrading. I'm running 1440p triples. I plan on upgrading the video card to a 3080 Ti once supply and demand hopefully change next year. If not, I'll wait and do the whole upgrade at once. I play iRacing and it's more CPU-driven than GPU-driven. I have not seen data showing the increase in FPS with a 2080 from that big of a jump in CPU.
I imagine that running 3x 1440p monitors on an RTX 2080, you are mostly GPU-limited, unless you are using low graphics settings. So I think you need to worry about that first. Also, by the time you upgrade your GPU, there may be other, better (new) CPU options available.
@@theivadim Well, no, not really. I'm running 130 to 80 FPS depending on the track, and yes, the eye candy is toned down a little. I have read that many people upgrade to a 3080 or 3090 without much of an increase on older CPUs, as iRacing is very CPU-dependent. And I'm asking: since my CPU is old, will I get more FPS by upgrading the CPU first? So you're really not sure, but thanks.
I have a Ryzen 5 5600 and an RX 6600 (theoretically the AMD equivalent of the RTX 3050 or something like that), and I've made numerous changes to my new PC (1 month old, everything new), but my graphics card can't hold 99% in the majority of games, while the Nvidia ones can. It's very weird, because in games like Dishonored 2, Far Cry 3, Watch Dogs, Spider-Man Remastered/Miles Morales, etc., the GPU utilization always drops and I simply can't understand why. And it's not only me: there are plenty of videos on YT with my config/same specs and this issue. When the GPU utilization drops, it causes some type of stutter. If someone knows how to fix this... I could really use some help.
Did you set a higher minimum clock speed for the GPU, like 900MHz? Did you undervolt your graphics card? Did you set the maximum allowable power limit in Adrenalin?
You're only going to get a difference at 4K and 8K, which he either totally lies about or does not know how to set up his PC properly for. You cannot just pop a 12-core or 16-core in and think there are no adjustments, because there are 3 major adjustments that must be made to Windows before you can utilize more than 8 cores.
To play modern games today you need a 4-core/8-thread CPU; with 4c/4t you will get stutters, and even 4c/8t will be at its maximum. And if you're buying a 6c/12t CPU, it will be perfect for the next 5 years.
Mmm... Even 4c/8t can be not enough in some CPU-demanding games, like Battlefield. 6c/12t is the minimum I'd recommend to make sure you are fully covered for all games, and it doesn't cost that much more than a 4c/8t CPU.
@@theivadim A regular user will not notice the difference between 4c/8t and 6c/12t. Most gamers are not that demanding. With 6c/12t you can have 100 FPS in almost all games; if not, just move to 1440p or 4K and you will be fine for the next 8-10 years. That is, of course, if you can see the difference between 100 FPS and 144 FPS. Myself, I see the difference 😉 but I only need it in shooters; in other games 100 FPS is fine, even 60 is enough.
@@iraklimgeladze5223 By that logic most gamers shouldn't notice 720p vs 1080p either; that claim was all over the internet back in the early 2000s. Can you be fine with 720p? A CPU is like a vehicle: you can drive many cars, but the response time, the lack of CPU lag, etc. is what makes the difference with 8+ cores.
Now throw up six Frostbite engine games and you'll see why 6 cores aren't enough. Or RDR2. Or DCS: World, now that they've optimized it for multicore, although it may be okay with 6c/12t. Try BF4, BF1, BF5, BF2042, Anthem, NFS: Heat. They all push my 5900X to 50-55%; BF4 might be a tad lower. When you have a 6-core (before this 5900X, I had an R5 3600), games just don't go beyond 85%. They won't peg it at 100% even though they technically "could". Not sure why, just something I noticed while owning a 3600. When you see 85%, it doesn't mean you are good and have "just enough"; it means the game would push it further if whatever was stopping it wasn't there. I literally got to see what 6c/12t did in comparison to 12c/24t, and technically all I have to do is double my CPU usage to get an idea of what a 5600X would be showing.

Now, ADMITTEDLY, a 6c/12t CPU is plenty fine for 80-90% of games out there, so it's not a big deal. Especially if all you're doing is gaming: no streaming/recording using the CPU, since you can do that with the GPU anyway. It's just that CPU streaming/recording/encoding gives much better quality and, at the same time, smaller file sizes. But hey, even with a 5900X, I don't want to wait 10 hours to encode some 100Mbit recorded game footage down to some super-high-quality format. I can't remember how large that file was, but it's a true story lol. I went to sleep, woke up, the CPU had been pulling 180-190 watts all night long, and I was like "ehhh... sorry, big guy, I'll let the 6700 XT do it." 8 minutes. 8 frickin' minutes using the GPU. I know the quality settings weren't identical, since you can't match them exactly in Handbrake, but STILL, they were as similar as I could get them. GPUs be crazy.

EDIT: Yeah, this is why AMD should have been putting integrated graphics on all their CPUs, like Intel has done forever.
I think they do now, but I mean, I don't want a "G" CPU; they lose half the cache and are just overall slower than the "X" or non-X models. The APUs have to compromise on cache and PCIe lanes to fit the GPU in. I need my PCIe lanes because I use everything on my motherboard. I don't want my PCIe x16 4.0 slot cutting down to x8 or 3.0 just because I slap an M.2 NVMe drive in there, which itself also gets cut down to PCIe x2 instead of x4. But I guess they're finally doing it right now.
I feel like the 5800X is the sweet spot, since I doubt anyone would want to only game on a PC. I think most people would want to do some work or video editing, so the 8 cores and 16 threads should make a good difference.
You make it sound like all PC owners are hustling and bustling all day long and only game after a hard day's work of video editing or 3D modelling😄 I think it's the opposite.
A very misleading video, man. While the average FPS may not be so different in a lot of games, the 1% and 0.1% lows will be. For example, I run a 4K 120Hz setup, and the 10900K performs much better than my 9700K in a lot of games. I do agree that for 4K 60 there is no difference between these CPUs, though.
1) 1% lows follow almost the same trend as average FPS in this particular case (the Ryzen 5000 series). 2) Are you saying that the 9700K performs so much worse in 1% lows vs the 10900K that you can spot it without benchmarking? My friend has a 9700K and it runs games smoothly at over 200 FPS. You'd need like 4x RTX 3090 to start noticing those kinds of differences. P.S. I know that you want to believe that the 10900K is giving you massive gains in games. However, the truth is, it doesn't, unless you put graphics on low and run an RTX 3090 at 1080p.
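Since 1% lows keep coming up in this thread: they are computed from frame times, not from the FPS counter. A sketch of one common definition (tools differ; some report the 99th-percentile frame time instead of averaging the slowest 1% of frames):

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms.

    '1% low' here = average FPS over the slowest 1% of frames,
    so a single long stutter drags it down hard."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    slowest = sorted(frame_times_ms, reverse=True)
    worst = slowest[:max(1, len(frame_times_ms) // 100)]  # slowest 1%, at least one frame
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps
```

E.g. 99 frames at 10ms plus one 50ms stutter averages about 96 FPS, but the 1% low is only 20 FPS, which is exactly the smoothness gap that averages hide.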
Lol, you should know that I'm playing on a 1080p monitor with an RTX 3070, and it's better than the GTX 1060... And my old CPU was an i7-7700K; my new R7 5800X is way better for playing FS2020.
I disagree, mate. Online competitive games like Fortnite or Warzone take advantage of a better CPU at 1080p low settings for maximum frame rate. Now, yes, I went from 150 FPS to 300 FPS in Fortnite by going from an i3-8100 to an i7-10700K, which is an extreme scenario, but it was still kinda worth it.
BTW, on the other hand, I have a 144Hz monitor (can be OCed to 160Hz) and I cap all my games at 90 or 100 FPS, since I simply can't tell the difference with refresh rates above 90Hz. 60 to 90, yes, but above that, basically nothing. Sure, maybe I'm not spending 10h a day playing Fortnite, but I do have a life outside of my computer.
Ryzen 5 5600X Build Recommendation - ruclips.net/video/szfBisM5p4w/видео.html
Support my work by shopping here (paid links):
Ryzen 5 5600X - geni.us/8MVw5G
Ryzen 7 5800X - geni.us/vHoouU
Ryzen 9 5900X - geni.us/Tjtfw6
Ryzen 9 5950X - geni.us/Bws7Ky
Best AMD motherboard - geni.us/jYLKhP
🛒Newegg - bit.ly/36axHIK
All of the above links are affiliate links. It means that I will get commission from any purchases you make using these links. It doesn’t affect the price you pay and it is a great way to support my channel. As an Amazon Associate I earn from qualifying purchases.
What you forget to mention in these videos are the 1% lows. Average FPS does not dictate how smooth a game will be. A 5950X, where the 1% lows are higher, also helps in big multiplayer lobbies like Warzone that take advantage of the cores, leading to smoother gameplay. For the competitive scene, why do you think competitive players get the most expensive CPU of the 5000 series? You rarely see competitive players use the 5600X in tournaments. You should redo the tests to show how important 1% lows are. It won't feel as stuttery using a 5950X compared to a 5600X, and I say this as someone who has experienced both.
Great content!!! We need these types of real-world usage tests so people can make the right choices. We expect a similar video with the Ryzen 3000 series and 10/16-series Nvidia graphics cards. Hope you find the time... Best wishes from India❤️
Great suggestion!
Back in December I upgraded my GPU from a 5700 XT to a 6800 XT while still holding on to my Ryzen 7 3700X. I did think about upgrading the CPU as well but then decided against it. I play at 1440p with settings maxed out, and while there are times when I do encounter a CPU bottleneck, it's not really all that common.
At the end of the day I had to weigh the benefits and the cost, and to me it's not worth spending $500 on a 5800X + motherboard to get a faster CPU. I'm very happy with my current PC. Who knows, maybe in the future, when the 5800X goes on a nice sale like the 3700X did when I got it for $230 a little over a year ago, I may upgrade.
That is a good state of mind that I agree with. Nice!👍
You might not need to upgrade your motherboard; I have a B450 Tomahawk and it supports Ryzen 5000 with a BIOS update.
The Ryzen 7 5800X is now $279.99 at MicroCenter, and the Ryzen 9 5900X is $349.99. Happy upgrading!!
Sir iVadim, truest words ever spoken. Spot-on correct, kudos Sir!
Right on
Wonderful review of the information. I have the 5600X and the Nvidia 3070 and play at 1440p. I can't wait for your thoughts on the next generation of graphics cards and CPUs. Thanks again.
Thanks, will do my best to deliver!🙂
I have a 5600X and RX 6600 with 16GB RAM. Nice set-up for 1080p gaming. Enough for me personally. ❤
Very nice, balanced PC👍 I'd happily game on such a machine as well🙂
This is the content I've been searching for. All the CPU reviews use 1080p gaming to help accentuate the differences in new CPUs. But why does that matter if we're playing at 4K 60-120Hz and GPU-bottlenecked on an RTX 3090 anyway?
Exactly. Thank you!
1080p tests are needed to see the maximum possible CPU performance, but they fail to showcase real-world usage scenarios.
Because you couldn't see any difference otherwise. The review would be absolutely pointless if the tests were run at 4K, as every CPU would perform as dictated by the GPU. Rating CPU differences should be done with actual benchmarks or real-world workloads, not games, save for some e-sports titles that actually can utilize CPUs.
NOW THAT'S SOME QUALITY INFO!!!
Great educational video... I went with the 5600X for now and am happy with it... paired with an RX 5700 that runs overclocked at almost 2000MHz most of the time, so... good for now. Thanks again!
I am listening to this video while playing War Thunder with a Threadripper 3970X and a GeForce GTX 1080 (without the Ti) ^^
It is amazing to run a workload on 60 threads (rendering with Blender) and play a game at the same time, and it works really well.
Beastly performance=)
Lows make a difference. On top of that, many people use multiple monitors and run an extra program or two.
Eight cores and up are preferable in real-world usage, like having a RUclips playlist going while you game, MSI Afterburner, things like that.
6-core 5600x with 32gb of RAM handles all of that just fine lol
@@burntorange3 I'm sure it can run it fine. It's the difference in FPS while running extras that makes the value show up. Those things never show up on charts.
Things change so fast. I have a 5600X CPU and an RX 6900 XT that I bought 2 years ago; nowadays, if I consider a GPU upgrade, for example an RTX 4090, my CPU would be a bottleneck.
Indeed. That is how PC master race gets you hooked😄
How is it a bottleneck???? The 4090 is a 4K card; that takes away any bottleneck.
And this is exactly why I am still sticking with my i7-8700k 😁
In my opinion, if you are on a budget, buy the Ryzen 5 3600; if not, buy the Ryzen 5 5600X.
Or Ryzen 5 2600 if you are on an even tighter budget=) I have it in my personal PC with RTX 2060 and it keeps that GPU load close to 100%.
@@theivadim i5 11400f (;
They're the same goddam price
Just buy the current generation
@MiniSparki cringe, intel and amd are just different. Picking sides cringe tho
@@CocoKoi321 i hope your ass realizes that it was 2 years ago
What's the best option for gaming and rendering? I have two options: the R5 5600X and the R9 5900X.
As the last CPU for AM4, it absolutely makes sense to use a more powerful one. Maybe there is no difference now, but over the next 2-3 GPU generations there will probably be a bigger difference.
Wouldn't it be more useful to include the 1% lows, as that is surely more likely to highlight core-count differences than peak FPS?
Will an AMD Ryzen 5600 or 5600X bottleneck a GTX 1050 Ti? If yes, which CPU is best for the GTX 1050 Ti?
You're using the wrong games to compare. You're only using demanding games that force every CPU into a GPU-bound situation regardless of resolution.
Exactly. He should be doing tests with older games such as CS:GO, and trying low settings to see if the FPS counter increases. You get to the point where lowering graphics doesn't increase the FPS counter, and that's when you know you have a CPU bottleneck.
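That test is easy to state as a rule: if dropping the graphics load barely moves the FPS counter, the GPU wasn't the limit. A hypothetical sketch (the 5% gain threshold is my assumption):

```python
def is_cpu_bound(fps_high_settings, fps_low_settings, gain_threshold=0.05):
    """Compare FPS measured at high vs low graphics settings.

    If lowering the settings raised FPS by less than the threshold,
    the CPU (not the GPU) is capping the frame rate."""
    gain = (fps_low_settings - fps_high_settings) / fps_high_settings
    return gain < gain_threshold
```

So 140 FPS that only climbs to 143 on low settings means CPU-bound, while 60 jumping to 110 means the GPU was the limit at high settings.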
This is great information, but as someone who does 3D assets and animation, I cannot find anything that shows this level of detail for workstation programs that benefit specifically from having extra cores.
Indeed, it's difficult to find any kind of professional app testing on RUclips. The audience is just not as large as it is for gaming-related content. The only guy I know who does some kind of professionally oriented testing on RUclips is Tech Notice.
Can you do this again but with 1% and 0.1% lows?
Hello,
I have an AMD Ryzen 9 5950X, an RTX 3070 Ti, and 4x8GB RAM modules, and my FPS drops from 240 to 90 for 1 second, then goes back to normal. I play Performance Mode on Low. Could it be because of the 4x8GB RAM modules, and should I go to 2x16GB modules? Or what is the problem?
Nice video, you got a new subscriber !
There are many people who game at 1080p with powerful cards such as the RTX 3090, because even with bottlenecks the FPS is always way higher at 1080p than at 1440p and 4K!
Me, I have the 5950X with an RTX 3090 on a 24" 1080p 144Hz monitor for content creation and gaming!
It would be pointless to have fewer FPS for a higher resolution, as on a 24" monitor you couldn't tell the difference in resolution, especially while gaming!
Thanks! I'm glad the video was helpful🙂
Don't you wish you had a bigger monitor and higher resolution for better content creation workflow? Video editing on a bigger monitor is such a nice experience.
@theivadim yes, bigger is always better; however, resolutions above 1080p only start to show on monitors bigger than 32"! Your videos are very informative and high quality! Keep up 👍
Nice, I was unsure about getting a 5600X for a 3090 for gaming but it looks like I won't have issues
Hi, I want a PC for music production and 4K video editing. A friend spec'd a PC with a Ryzen 9 5900X and an
Inno3D TWIN X2 OC WHITE GeForce RTX 4070 12 GB video card. Is that a good combo, or do you guys recommend something else?
I have a gaming rig with an R5 5600X, an RTX 3080 Ti and 16GB @ 3600MHz. Runs like a dream. GPU maxed out in everything thus far. 60+ FPS @ 4K!
Hey, I have a 3080 Ti with a Ryzen 7 3700X, and my CPU bottlenecks like heck. Can you tell me if you tried Cyberpunk, and if yes, what was your FPS? I want to compare whether changing the CPU is really worth it.
Interesting. I'm running a 7800 XT paired with a 5600X, and in Cyberpunk at 1440p my GPU is at 100% as expected, but in busy areas I see some cores almost hitting 90% when I get framerate drops, so I was expecting the better CPU to do better.
I can affirm the situation is much worse than this video describes.
I've accepted that computers are always going to be crippled garbage that nobody can understand and obsolete before you buy.
If you are replacing your CPU every 3-4 years, go for the 5600X; else the 5800X. I personally think a CPU should last 7-10 years while a GPU lasts around 5-6 years; otherwise you are wasting money.
Agreed. I switched from a 4770 to a 5600X, and THAT was a true improvement. I'm good with the 5600X for the next 5 years, easily. Especially considering the fact that even an RTX 4090 will work well with it, and now I have a 4070, which should last me for years.
@@krecikowi nice build 😁 did you also replace the PSU?
@@UltraSolarGod I'm still running on my 10-year-old Corsair 650W RMi, so no, the PSU is the same. I like this rig a lot; it's semi-passive, with a Noctua radiator on the CPU and all fans off at idle. Even when gaming there is literally no noise from the computer 🙂
I'm thinking of upgrading my CPU to a Ryzen 5600X for a GTX 1660 Ti. Will it have any bottleneck issues?
Excellent. Thanks for sharing.
My pleasure!
What RAM speed and timings were used?
Finally an honest RUclipsr 🎉🎉🎉🎉
But how do you explain that I upgraded from my Ryzen 5 3600X to a Ryzen 9 5900X with an RTX 2070 Super and now I get more FPS and WAY fewer stutters?
Maybe you changed your DDR (RAM) too?
If you have a Ryzen 3000 series CPU, literally don't buy ANY 5000 series CPU. End discussion, based on real-world testing.
The only use case is an RTX 3090 or RX 6900 XT with a 5800X or 5900X, but an overclocked 3900X is exactly the same as the 5900X under the same conditions. The only difference is the chipset and PCIe 4, which at the moment is truly useless over PCIe 3: 2-4 frames at best in a very small range of selected titles, and if you go for "Rage Mode", up to 7 FPS with a 6900 XT, but still far less than a 3900X OC with an RTX 3090.
End learning outcome: DON'T waste your money on Ryzen 5000 ANYTHING if you have a Ryzen 3000 CPU now.
And if you're upgrading STILL buy a Ryzen 3000 series CPU regardless!
It's smarter and cheaper than wasting money on a number greater for exactly the same performance!
Ryzen 5000 was ONLY brought out for PCIe 4, and it's not even the 7nm+ node; it's just a refresh with some tiny improvements in the silicon. Nothing you'll EVER notice in ANY application aside from Cinebench R20.
Save your money now for Ryzen 7000 5nm finfet.
Don't even bother with the tick-tock 7nm+ of Ryzen 6000 in the meantime.
If you have technical evidence to the contrary I'd genuinely love to see it 😂
Because if you do, I'd be gobsmacked to see that all of my research has been for naught, based on what industry has told me so far!
This some good advice👍
@@theivadim very appreciated, sir! I always love your content; I just thought this might help also. Looking forward to your next video!
100% correct!
HA! The Ryzen 9 3900X is more expensive than the Ryzen 9 5900X. The 3000-series Ryzen CPUs still cost the same as, or more than, the newer 5000-series Ryzen CPUs. I just picked up another 5900X for a second PC build in a small form factor that I can bring to friends' houses to play in LAN parties.
@@l.i.archer5379 that shows the quality difference doesn't it though? If the 3900x is still more or as much as the 5900x it says a lot for the 5900x not being as well made or quality controlled as the 3900x to this date lol. I still stand by my original statement. This is just supporting evidence of that right?
Thank you for your feedback.
I’m thinking of buying a 5600X. It should be fine for video editing with a rx 6800, right?
More than fine! Though Nvidia GPUs use the NVENC encoder, the new 6000 series supports the AV1 codec, which in my opinion is superior in its own way to the Nvidia codec.
Just make sure your software supports the AV1 codec.
I.e. Adobe or DaVinci Resolve, as examples.
I have no experience with picking hardware for productivity apps. You should check out some video editing specific channels to see comparisons or demos.
Ok thanks!
So a Ryzen 5 5600X with a ROG Strix RTX 3080 OC would be fine without any bottlenecking compared with the 5800X and 5900X? Hope to get your answer.
Yes. You will not see the FPS difference in 99% of the games available right now.
So what combination would you recommend for playing d4 and cs 2 on 1080p and 144 hz monitor? I'm currently thinking about following combos:
Ryzen 5 5600X with Radeon RX6950 16GB GDDR6
Ryzen 7 5800X/Ryzen 7 5800X3D with Radeon RX6800 16GB GDDR6
Ryzen 7 5700X with Radeon Red Dragon RX 6800XT 16GB GDDR6
It is important to note that what you see here is true ONLY IF you are PURELY GAMING in that instance. However, some people like to run a heck of a lot of background tasks: a web browser (lots of Chrome tabs open), the Discord app, streaming your game, some monitoring software, and even light emulators (Android games, AFK auto-gaming). So if you're one of those who constantly have other things open on your PC, you will benefit from having more cores.
If you're perfectly fine with closing everything but the game you're playing, then the 5600X (or the Intel equivalent) will be more than enough. (Considering you're on the right resolution and graphical settings, of course.)
Should I get a Ryzen 5 5600X or a Ryzen 9 5900X for my RX 6600 XT?
Is a 5950X a better idea than a 5800X3D for my 4080? Should I swap?
Before I start this, I want to say that I don't have any plan to play games on 4k and I will either stay with my 1080p monitor or maybe an upgrade in the future towards the 1440p.
Confused between RTX 3070 + Ryzen 5 5600x and RTX 3080 + Ryzen 5 5600x
Will it bottleneck my PC later in the future? or should I just get RTX 3080 + Ryzen 9 5900x?
I know it's overkill for my 1080p - 1440p setup but I'm also taking in the fact that I have to stay with this PC for a long time for future proof.
I don't approve of the future-proofing concept with PC hardware.
It especially does not make sense in 2021 when some parts cost a lot more than they should. I'd get whatever you need for right here and now. It will definitely get you through the next 2 to 5 years at 1080p. RTX 3070 + 5600X sounds like a good choice. However, if you insist on future proofing then more CPU cores makes sense as modern games can spread the load across multiple cores much better than the old games did. I expect this trend to continue.
If you're thinking about the RTX 3080, I would recommend the RTX 4070: lower power consumption, the same raw power as the 3080, and you get DLSS 3 and frame generation, which give a good 40% improvement over the 3080. I know, because I had a 3080 and now a 4070 🙂. I used both on a 5600X. I play on a 3840x1600 LG 38" monitor and everything is amazing.
What should such a video tell us? I played CoD Warzone at 1440p with a 3900X and RTX 3090; after upgrading to a 5900X I got 30 more FPS.
What about with a Quadro RTX 6000. My primary use is Solidworks. My Razer Blade Pro 17 is taking hours to render, so I am building a desktop machine. Thanks!
I specialise in gaming performance only. So, I can't answer that. If I were you, I'd look for answers in the professionals community. Reddit is a good place to start.
what about 5800x with rtx 3060? will it bottleneck?
hi is it good when pairing a ryzen 7 5800x and rtx 3080?
Yeah. It's very good I'd say.
@@theivadim will any FPS drops happen? Because they said this combo has a bottleneck.
or should i stick to rtx 3070?
Um a majority of gamers still use 1080p and especially so for competitive gaming. So a 3090 could very well be used for a 1080p build.
Thank you brother!
Beware that the situation has changed if you're looking to play the latest AAA games. Some of the new games have become more CPU intensive.
@@theivadim Yeah, things are evolving but in my case it's just perfect👏
Yes, a monitor upgrade made the difference in my gaming enjoyment. I upgraded from a 1080p TN monitor to a 1440p IPS 10-bit color monitor; so much better.
I have an i7 8700. Should I upgrade to the 5800X, or should I leave it as it is?
Always invest in a better GPU first. When you see that it is not 100-98% loaded in games then you can start worrying about upgrading the CPU. 8700 is still a damn fine CPU imho.
@@HassanAchieved 3-4 probably. I think we will start seeing shorter breaks between releases. The competition is heating up on all fronts. Especially Intel and AMD will have to speed up their releases if Apple starts releasing a new CPU/GPU every year like they do with iPhones.
I just figured out that this is a thing. I have a Ryzen 5600X with a 3080. Fans start blasting and my GPU usage jumps to 92%. I use triple monitors. What should I do to maintain the triple-monitor gameplay?
Are your monitors 4k?
@@burntorange3 sorry the problem was solved a while back. I had to set a frame rate limit. My FPS was almost +200 at times, so it was maxing out my GPU etc. I set the limit to 75-80 max. But thank you
Yes, someone actually says it. I don't know why people always say the CPU is gonna bottleneck a high-end GPU, as if the person is gonna play CS:GO at 1080p lowest settings. Especially when it's a new-gen entry CPU like a 5600.
I'm gonna buy a used 3080 Ti with an AM4 CPU for 1440p gaming and a little bit of video editing. I was thinking 5800X or 5600X because of my low budget, and this video solved my doubts about the 5600X. It's better than a 3080 + 5800X combo, I guess. You can wait longer for video rendering and get the same result, but you cannot wait for the future frames in games :). Pretty good and useful video.
Go with 4070, way better choice. Lower power consumption, basically the same raw performance as 3080 and dlss 3 and frame generation takes it even further away from 3080. I used 3080 and 4070 on 5600x 🙂
@@krecikowi got my 5800X and a used 3080 Ti. I'm pretty happy with the 3080 Ti, but cooling the 5800X is so freaking hard: I bought a new case, a 360mm liquid cooler and a bunch of fans. Undervolted, and 75 degrees while gaming drives me crazy. If you asked me today, I would go with a 4070 Ti and an Intel platform.
@@bruhyldrm5191 my 5600X is running at 85-90°C 😁 (passive cooling) and it's not throttling too much. In general a CPU can run "hot", as long as clocks are as expected.
Hello Vadim! I finally got an RTX 3060 for $390. I had to wait in line in front of a store for 4 hours but it was worth it!!
4 hours? Reminds me of the stories my parents told me about the communist USSR times. They had to stand in line for hours to get almost any product😅 But never mind. Congrats on your new GPU! What is the first game you gonna fire up on that bad boi?
@@theivadim the first game I played was minecraft RTX!
@@CruzGD nice! Enjoy it!
@@theivadim Haha. Yeah. I was born in a communist country too and remember as a little kid my grandma having to wait outside the store in a line to buy milk and eggs multiple times. I experienced this first hand back in December when I waited for about an hour outside a Micro Center to get a 6800 XT.
Of course the CPU isn’t that important as long as you have a modern six-core. Reviewers have said that many, many times.
That said, driver overhead is a different, related problem for Nvidia cards.
What about for lower generations ? i7 6700K overclocked at 4.7 GHz with a 3080 at 1440p ? Should I change the CPU ?
If you bought those parts, install Fraps and check which part hits 100%; upgrade that.
@@vincentas1 I am thinking of buying 3080, but a little bit scared of not lose a lot because of the CPU.
@@xadrian23 I think you will be getting CPU bottlenecked. You should buy a good GPU first and upgrade the CPU later, when Zen 4 or the Intel equivalent arrives.
I agree with Vincentas. Always upgrade the GPU first. You will get the most out of that upgrade even if the CPU bottlenecks. Then you can install CapFrameX, MSI Afterburner or a similar overlay app to see the CPU/GPU load stats during games. Test it. If the GPU is loaded 90-95% or even lower, then you can start planning a CPU upgrade.
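The rule of thumb above can be sketched as a tiny helper. This is a hypothetical example (the function name and the 90% threshold are my own choices), assuming you have exported per-second utilization samples, e.g. from an MSI Afterburner or CapFrameX log:

```python
def diagnose_bottleneck(gpu_loads, cpu_loads, threshold=90.0):
    """Classify the likely bottleneck from sampled load percentages (0-100)."""
    avg_gpu = sum(gpu_loads) / len(gpu_loads)
    avg_cpu = sum(cpu_loads) / len(cpu_loads)
    if avg_gpu >= threshold:
        # GPU nearly maxed out: a faster CPU won't add frames.
        return "gpu-bound"
    if avg_cpu >= threshold:
        # CPU pegged while the GPU idles: time to plan a CPU upgrade.
        return "cpu-bound"
    # Neither pegged: check for frame caps, V-Sync, or engine limits.
    return "other-limit"

print(diagnose_bottleneck([97, 99, 98, 96], [45, 50, 48, 52]))  # gpu-bound
```

Per-core spikes can still cause stutter even when the averages look fine, so treat this as a first-pass check, not a verdict.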
The 6700K is fine up to 2070 / RX 5700 XT levels; above that, it's the bottleneck.
You have to decide the rest.
Thank you so much brother! This kind of information helps out so much!!
I will go for a rtx 3080 and a 5 5600x. Thanks for the vid!
My 5600 is a bottleneck for my 3070
is the 5600x with the rtx3080 with 144hz 1440p monitor a good option?
Yes 👍
@@theivadim thx couse already have that in my pcpartpicker list :)
iVadim, thank you for the work you are doing! Just a quick question. I know there's no way of benchmarking with Cod:Warzone. But I've heard that's a very demanding game cpu wise, therefore, wouldn't I see more fps with a 8 core/16 threaded cpu paired with a 2060 super (as of today - Will upgrade the gpu at a later stage) at 1440p? I know my gpu is most likely bottlenecking my gaming rig, even though I'm on a 6700 skylake oced to 4.5ghz... but with the Nvidia overhead issue on older cpus, I believe upgrading the cpu will improve my experience as well.
Should I stick to a 5600X or the Intel Rocket Lake counterpart?
Always upgrade graphics card first. Especially at higher resolutions. GPU is bottleneck in your PC at this point. You can check it for yourself by installing CapFrameX or MSI Afterburner and enabling overlay to check your CPU & GPU load. You will see that your GPU is most likely working at its full potential. Hence you will not see any difference in performance by changing your CPU. That was the whole point of this video. To demonstrate that even top of the line current gen GPUs are too weak for CPUs in most cases.
What about frame timing?
What about system smoothness and no hiccups?
What about multitasking, like gaming with X Chrome tabs open?
Or using an ultrawide or 2-3 monitors?
That's what you cannot benchmark,
and after all we can say IT DEPENDS.
TECHDEALS... is that you..
The only time I recommend getting anything above the 5600x is if you're doing any cpu intensive task other than gaming like photo and video editing, streaming, recording, rendering etc
Same 👍
@@theivadim also, if you have an RTX 3000 series GPU you can even record and stream with the 5600X, because the RTX 3000 series does most of the video encoding, which takes a lot of load off the CPU.
What about over clocking
Can someone help me out? I'm looking to upgrade my i7 5820 to an AMD Ryzen 5600X. I'm running a 2080 video card that will be updated, just not right now. I want to know if I'll see a big increase in my video card performance when upgrading. I'm running 1440p triples. I plan on upgrading the video card to a 3080 Ti once supply and demand hopefully change next year. If not, I'll wait and do the whole upgrade at once. I play iRacing and it's more CPU-driven than GPU-driven. I have not seen data showing the increase in FPS with a 2080 from that big of a jump in CPU.
I imagine that running 3x 1440p monitors on an RTX 2080 you are mostly GPU limited. Unless you are using low graphics settings. So, I think you need to worry about that first. Also, by the time you will be upgrading your GPU there may be other better (new) CPU options available.
@@theivadim Well, no, not really. I'm running 130 to 80 FPS depending on the track, though yes, the eye candy is toned down a little. I have read that many people upgrade to a 3080 or 3090 without much increase on older CPUs. Since iRacing is very CPU-dependent and my CPU is old, I'm asking whether I'll get more FPS by upgrading the CPU first. So you're really not sure, but thanks.
I got a Ryzen 5 5600 and an RX 6600 (theoretically this is AMD's RTX 3050 or something like that), and I've made numerous changes on my new PC (1 month old with everything new), but my graphics card can't hold 99% in the majority of games, while the Nvidia ones can. It's very weird, because in games like Dishonored 2, Far Cry 3, Watch Dogs, Spider-Man Remastered / Miles Morales, etc., the GPU utilization always drops, and I simply can't understand why. And it's not even only me: there are plenty of videos on YT with my config / same specs and this issue. When the GPU utilization drops, it causes some type of stutter. If someone knows how to fix this, I could really use some help.
Did you set a higher minimum clock speed for the GPU, like 900mhz? Did you undervolt your graphics card? Did you set maximum allowable power to max in Adrenalin?
@@robertkubrick3738 yes to all the questions, I've tried any possible option in the adrenaline driver
@@Xspiderman400 Are you on the latest bios? What are the temps like? Did you try testing with the case side panel off and fans at 100%?
@@robertkubrick3738 latest bios, always less than 65 C. There is no temperature issue, I've always been monitoring
Will rx 6700 xt be ok with r5 5600x?
Yes, but you can go further, like a 7700 XT with minimum bottleneck.
You have to test at 720p and low settings.
COD WarZone with my 5600x 3080 ti locked to 170 FPS max 1440p.. switched to 5900x and instantly 200+ FPS.. just saying
Yea I'll buy either 5900x or wait for nextgen. Idk about stock.
You're only going to get a difference at 4K and 8K, which he totally lies about or does not know how to set up his PC properly. You cannot just pop a 12-core or 16-core in and think there are no adjustments, because there are 3 major adjustments that must be made to Windows before you can utilize more than 8 cores.
Thanks!!!
Today, to play modern games you need a 4c/8t CPU; with 4c/4t you will get stutters,
but a 4c/8t will be at its maximum.
And if you're buying a 6c/12t CPU, it will be perfect for the next 5 years.
Mmm... Even 4/8 can be not enough in some CPU demanding games. Like Battlefield. 6/12 is the minimum I'd recommend to make sure that you are fully covered for all games. And it doesn't cost that much more over a 4/8 CPU.
Perfectly fine will be 8c/16t, 4 cores was on market for too long and the games are getting more cpu intensive nowadays.
@@theivadim A regular user will not notice the difference between 4/8 and 6/12. Most gamers are not that demanding. With 6/12 you can have 100fps in almost all games. If not, just move to 1440p or 4K and you will be fine for the next 8-10 years. That is, if you even see the difference between 100fps and 144fps. Myself, I see the difference 😉 but I only need it in shooter games; for others 100fps is fine, even 60 is enough.
@@eikementira1604 true but most gamers will not notice it
@@iraklimgeladze5223 most gamers shouldn't notice 720p vs 1080p either; that news was all over the internet back in the early 2000s. Can you be fine with 720p?
The CPU is like a vehicle: you can drive many cars, but the response time and lack of CPU lag are what make the difference with 8+ cores.
Guys, I'm new to PC and I don't speak English very well. Can anyone tell me if a Ryzen 5 5600X and an RTX 3060 are good?
It is very good 👍
okay these are all generic games but what about VR?
Not true. If you have a CPU-bound game, the 5800X and 5800X3D will have increased performance over the 5600X.
you should have shown gpu usage
Now throw 6 Frostbite-engine games at it and you'll see why 6 cores aren't enough. Or RDR2. Or DCS: World, now that they've optimized it for multicore, although it may be okay with 6c/12t.
Try BF4, BF1, BF5, BF2042, Anthem, NFS: Heat. They all push my 5900x to 50-55%. BF4 might be a tad lower.
When you have a 6-core (before this 5900X, I had an R5 3600), in games you just don't go beyond 85%. It won't peg them at 100% even though they technically "could". Not sure why; just something I noticed while owning a 3600. When you see 85%, it doesn't mean you are good and have "just enough"; it means the game would push it further if whatever was stopping it from doing so wasn't there.
Because I literally got to see what 6c/12t did in comparison to 12c/24t. And, technically, all I have to do is double my CPU usage and get an idea of what a 5600x would be showing, too.
Now, ADMITTEDLY, a 6c/12t CPU is plenty fine for 80-90% of games out there, so it's not a big deal. Especially if all you're doing is gaming. No streaming/recording using the CPU, since you can do that with the GPU, anyways. It's just that CPU streaming/recording/encoding gives much better quality and also at the same time, smaller file sizes.
But hey, even with a 5900x, I don't want to wait 10 hours to encode some 100mbit recorded game footage down to some super high quality format. I can't remember how large that file was, but it's a true story lol. I went to sleep, woke up, CPU had been pulling 180-190 watts all night long and I was like "ehhh... sorry, big guy, I'll let the 6700XT do it."
8 minutes. 8 frickin' minutes using the GPU. But I know the quality settings weren't identical since you can't do that in Handbrake, but STILL, it was as similar as I could get them.
GPUs be crazy.
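For anyone wanting to reproduce a CPU-vs-GPU encode comparison like the one above, here is a rough sketch that builds ffmpeg command lines. The helper, file names and bitrate are my own placeholders; `libx264` (software x264) and `h264_amf` (AMD hardware encoder) are standard ffmpeg encoder names, but availability depends on your ffmpeg build:

```python
def build_encode_cmd(src, dst, use_gpu=False):
    """Build an ffmpeg command: CPU (libx264) or AMD GPU (h264_amf) encode."""
    if use_gpu:
        # Hardware encode: dramatically faster, but less efficient per bit.
        video_opts = ["-c:v", "h264_amf", "-b:v", "20M"]
    else:
        # Software encode: slow, but better quality at the same file size.
        video_opts = ["-c:v", "libx264", "-preset", "slow", "-crf", "18"]
    return ["ffmpeg", "-i", src, *video_opts, "-c:a", "copy", dst]

# Example (not run here):
# import subprocess
# subprocess.run(build_encode_cmd("capture.mkv", "out.mp4", use_gpu=True), check=True)
```

As the comment notes, the two paths aren't directly comparable quality-wise (CRF vs a fixed bitrate), which matches the Handbrake caveat above.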
EDIT: Yeah, this is why AMD should have been putting integrated graphics on all their CPUs like Intel has done forever. I think they do now, but I mean, I don't want a "g" CPU, they lose half the cache and are just overall slower CPUs than "x" or "non-x" models. The APUs have to compromise cache and PCI-e lanes to fit the GPU in it to make it an APU. I need my PCI-e lanes because I use everything on my motherboard. I don't need my PCI-e x16 4.0 slot cutting to x8 or 3.0 just because I slap an M.2 NVME drive in there, which also gets cut down to PCI-e x2 instead of x4.
But I guess they're finally doing it right, now.
simple and great
Looks like GPU bottleneck here.
You're being limited by your GPU; this is such a terrible test.
I feel like the 5800X is the sweet spot, since I doubt anyone would want to only game on a PC. I think most people would want to do some work or video editing, so the 8 cores and 16 threads should make a good difference.
You make it sound like all PC owners are hustling and bustling all day long and only game after a hard day's work of video editing or 3D modelling 😄 I think it's the opposite.
@@theivadim i mean...id agree with Kavin. Even if it's not production stuff , chrome will eat our cpus
@@IGarrettI what are you talking about? Chrome works fine even on my weak 2-core CPU in 2015 Macbook
@@theivadim I was making a human joke about the common meme'age of chrome being very hungry for yummy yummy resources mmmm
@@IGarrettI oh😅 My human feelings didn’t enable just then 😂
👍
A very misleading video, man. While the average FPS may not be so different in a lot of games, the 1% and 0.1% lows will be. For example, I run a 4K/120 setup and the 10900K performs much better than my 9700K in a lot of games. I do agree that for 4K/60 there is no difference between these CPUs, though.
1) 1% lows follow almost the same trend as average FPS in this particular case (Ryzen 5000-series).
2) Are you saying that 9700K performs so much worse in 1% lows vs 10900K that you can spot it without benchmarking it? My friend has 9700K and it runs games smoothly at over 200 FPS and more. You'll need like 4x RTX 3090 to start noticing those kind of differences.
P.S. I know that you want to believe that 10900k is giving you massive gains in games. However, the truth is - it doesn't unless you put graphics on low and run RTX 3090 at 1080p.
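For anyone curious what "1% lows" actually measure: the common definition averages the slowest 1% of frame times. A minimal sketch (the function name is my own; it assumes frame times logged in milliseconds, as overlay tools typically report):

```python
def one_percent_low(frametimes_ms):
    """Average FPS over the slowest 1% of frames (higher = smoother)."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~7 ms (~144 FPS) plus one 50 ms hitch:
times = [7.0] * 99 + [50.0]
print(one_percent_low(times))  # 20.0
```

This is why a single hitch barely moves the average FPS but craters the 1% low, and why the metric is sensitive to core count in CPU-heavy games.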
good video
Lol, you know I'm playing on a 1080p monitor with an RTX 3070 and it's better than the GTX 1060... And I went from an i7-7700K to an R7 5800X, which is way better for playing FS2020.
this is fake
I disagree mate, online competitive games like Fortnite or warzone take advantage of better CPU on 1080p low settings for maximum frame rate. Now yes I went from 150fps to 300fps in Fortnite by going from a i3 8100 to a i7 10700k which is an extreme scenario, but still it was kinda worth it
Do you play on a 360Hz monitor?
BTW, on the other side, I have a 144Hz monitor (can be OCed to 160Hz) and I cap all my games at 90 or 100fps, since I simply can't tell the difference with refresh rates above 90Hz. 60 to 90, yes, but above that basically nothing. Sure, maybe I'm not spending 10h a day playing Fortnite, but I do have a life outside of my computer.
For gaming,i5/i3/r3/r5 is enough
Invest more on gpu
Definitely GPU is more important=)