no, rather the 3070. The 3070 Ti is quite a bad deal: it costs 100 bucks more than the 3070 while consuming 70 watts more for like 8% more performance, and it still only has 8 GB of VRAM. DO NOT buy a 3070 Ti. The 3070/6700 XT is the way to go
@@theplayerofus319 Why not the 6800? It costs around 400-500 USD, performs better than the 3070 Ti while being more power efficient and having more VRAM (16 GB). The only con is its driver incompatibility with production software.
Screw these benchmark videos with low-mid range cpus. I want to see how much of a difference there is with a GPU you would actually pair it with not a 4090
not really, because only a GPU that is bottlenecked in both cpus at 1080p can truly show you the absolute maximum possible difference between both cpus.
No. The goal is to realize 100 percent of the theoretical performance of each CPU. Placing it into a CPU bound scenario by lowering resolution and creating as much headroom on the GPU is how you do it.
The Witcher 3 is so old, and the 5600X still fails to deliver 60 fps in the city. In most games the 5600X is enough for 60 fps gameplay, but there are some games where it fails hard... Makes me wanna get a 5800X3D; only changing the CPU would be nice...
Dude, if you're using the stock box cooler, that's just how it is, but you can undervolt. I, for example, plan to buy a 5600, which is equal to the 5600X in both power draw and temperature, and I recently got a Boreas E1-410 cooler, which is really good and will keep it nice and cool.
Which CPU should I choose from these prices I found second hand in my country? Only for gaming. RDR2, Mafia Definitive, Far Cry 6... 7600x 170 USD, 7800x3D 205 USD, 7900x 205 USD, 7500f 147 USD. My hourly wage is 3.40 USD.
Because the CPU power itself differs. You can think about it this way: 60% usage of a Hyundai engine is not the same as a 60% usage of a Lamborghini engine.
If you put in a 4090, of course the newer tech shows a bigger improvement. Very misleading, because that card is made for a high-end CPU, not to be paired with slow parts, so weak chips perform much worse; mid-tier CPUs would behave much more alike, which is more realistic.
This is not a good comparison. You do know that when you pair an RTX 4090 with an R5 3600, that's a huge bottleneck. It would've been better if you used a 3060 Ti or 3070 at 1440p resolution.
Argh, please think for one second... this test shows the power of the CPU. If you choose a weak graphics card, you show the power of the graphics card, not the CPU.
🤦♂️ Another one of those. You can only compare CPUs if all of them bottleneck the GPU, and that's achieved by maximizing CPU load (enabling RT, lowering resolution and any GPU-related settings, and, to be sure, using a 4090 so it's always underloaded). This way we see the frames the CPU can produce, and the GPU is out of the equation.
Games :
God of War - 0:21 - gvo.deals/TestingGamesGoWPC
Forza Horizon 5 - 1:23 - gvo.deals/TestingGamesForza5
Spider-Man - 2:31 - gvo.deals/TestingGamesSpiderManPC
CYBERPUNK 2077 - 3:34 - gvo.deals/TestingGamesCP2077
CS GO - 4:32
Microsoft Flight Simulator - 5:33 - gvo.deals/TestingGamesMFS20
PUBG - 6:40 - gvo.deals/TestingGamesPUBG
The Witcher 3 - 7:38 - gvo.deals/TestingGamesWitcher
System:
Windows 10 Pro
Ryzen 5 7600X - bit.ly/3ULWytC
Gigabyte X670E AORUS MASTER - bit.ly/3BRDIIG
G.SKILL Trident 32GB DDR5 6000MHz - bit.ly/3XlBGdU
Ryzen 5 5600X - bit.ly/36fmPbc
AMD Ryzen 5 3600 - bit.ly/30Es4Qk
GIGABYTE X570 AORUS PRO - bit.ly/2MKI0bQ
32Gb RAM DDR4 3600Mhz - bit.ly/35vyWko
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
What cpu cooler?
do the test in 4k pls, put a cheap cpu vs a costly cpu in native 4k, no dlss pls
try it on a 3060 Ti, then a 6700 or RX 6650 XT... then we'd see how these perform
Can you test CS:GO on Dust 2 or Mirage? The built-in CS:GO benchmark is low-stress.
the 7600x is a monster when it comes to gaming
a pity the AM5 platform is so expensive
@@notenjoying666 except you don't need to buy DDR5, and memory speed doesn't affect Intel CPUs as much as it does AMD
Microcenter offers free DDR5-6000 with zen 4 CPUs, as an American I’d prefer that over 13th gen eol CPUs
@@thetshadow999animates9 Me: Cries in Asian
but AM5 will be supported for a long time; it's expensive as a first-gen platform, and it needs time to become a good budget option
@@igm1571 I mean, 7600 + cheap AM5 and DDR5 ain’t that bad, microcenter gives free ram with zen4
The 7600X brings a big improvement in FPS, but I hope it runs at a more efficient frequency and has better thermals.
it's fake.... this difference is impossible. GPU power... 60 vs 90 watts? haha
@@UltraPEPE More FPS increases the power draw.... Negative IQ
Ryzen 7000 can basically tolerate high heat, I guess (totally random comment)
It doesn't; it overheats like a mf. Even undervolting won't work unless you won the silicon lottery. My 7600X with good cooling still gets a bit toasty. What makes it worse is that half the games today aren't optimized, so they put more stress on the CPU than usual for no reason.
Thank you! 5800X3D vs. 7600 next, please!
Hardware Unboxed did a great benchmark comparison of those two CPUs. Depending on the game, the 5800X3D has the edge, but in other games it's the 7600X. One has more L3 cache, the other has a higher frequency per core. It really depends; the results differ from game to game. Take care!
@@yan3066 Yep, it's true that the new-gen CPUs have higher clock speeds, and it's 50-50 between the 5800X3D and the new-gen CPUs, but the 70**X3D will be the best of the best
@@jadenbloemstein5507 yep, it will be the best, but at what cost ? 😨
@@yan3066 in pubg which of the two does better? 5800x3d or 7600x?
@@MateusSilva-ps6xr I don't know, but the difference wouldn't be that much in my opinion. No matter which one of those 2 you will use, I'm pretty sure the performance will be awesome.
The 4090 driver overhead might be skewing these results, a 7900xtx might be more effective at showing differences at low resolutions.
it's clearly bottlenecking everything; a 4070 or a 3080 would have sufficed
Nobody will pair an R5 3600 with an RTX 4090, but this comparison shows that the R5 3600 has, all in all, aged well.
I think I'm ordering a PC for 1500 with a 7600 and a 3060 Ti or 3070
@@JH-vb1wo building is cheaper
@@JH-vb1wo buy 3060 ti gddr6x
I did a 3600X with my 4090
@@CluckYou21 at what resolution?
I got a Ryzen 5 7600X with a Sapphire Pure 7800 XT, superb combo
Nice
Actually useful. Don't listen to the people complaining about pairing these with a 4090 (why would he pick a 4060 that would bottleneck all three in 90% of scenarios, jesus...). We're testing CPUs, not GPUs.
Btw, dear people here, don't be tricked by the frames. The reason for the big difference between the 5600X and 7600X is how much each bottlenecks the RTX 4090, which is unrealistic. An RTX 3070 would have been a fairer comparison, as no one is buying a budget CPU with a 4090.
This is about showing the maximum possible FPS difference between the CPUs, which can only be shown if both of them bottleneck a much more powerful GPU.
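This reply's logic can be sketched with a toy model (all FPS numbers below are made up for illustration, not taken from the video): the frame rate you see is roughly capped by whichever component delivers frames slower, so only a GPU faster than every CPU under test reveals the CPU gap.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The displayed frame rate is capped by the slower component."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: one CPU can prepare 145 frames/s, a faster one 250.
# With a mid-range GPU that renders 120 fps, both CPUs look identical:
print(effective_fps(145, 120), effective_fps(250, 120))  # 120 120

# Only a GPU fast enough to outrun both CPUs reveals the difference:
print(effective_fps(145, 400), effective_fps(250, 400))  # 145 250
```

This is why CPU benchmarks deliberately create a CPU-bound scenario: with a slower GPU, both rows of results would collapse to the same number.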
That is unnecessary. What's needed is to test them with a GPU you can actually afford. After all, that's why you watch CPU comparisons: to see what's the best value with your current or future GPU @@ImperialDiecast
@@TestingChannel1-kh9iz In the budget-to-midrange price bracket, choosing a CPU really doesn't matter as much as choosing the right GPU, because any extra edge from a better CPU only matters if you want the highest possible framerate in absolute numbers. With a midrange GPU that's only possible at 1080p low settings, since that's the only time your midrange GPU won't be at 100% because your CPU bottlenecks it; only in that case does a better CPU help it reach 100% usage even at lower resolutions and settings. You will always hit 100% GPU usage by playing at the recommended resolution (e.g. 1440p) and settings (high), unless you buy a high-end GPU where even 1440p won't be enough for 100% until you upgrade your CPU.
@@TestingChannel1-kh9iz I agree
@@TestingChannel1-kh9iz It does matter. Pairing a 7600 with a GPU that will never be the bottleneck in competitive games, like a 4070 Ti or 4080, actually makes sense: if the GPU is the limiting factor, latency suffers in competitive games. It also makes sense if you're going to record gameplay with the GeForce overlay; everyone has different needs. And it makes sense if you're thinking about upgrading.
Serious question: why do these videos not test actually CPU-intensive games? One test in a game like Football Manager, Cities: Skylines, EU4, Vic3, etc. would give far more realistic results. If one CPU can simulate a year in Football Manager with all the leagues in the world active in under 10 or 20 minutes, then oh boy, what a great CPU. Those are the tests we should be seeing.
it won't lol, but it will indeed load faster with more leagues enabled
@@iikatinggangsengii2471 I know it won't; no CPU is strong enough to have all FM leagues playable. But it would be a true test of a CPU's power, since games like that use almost no graphics card at all.
It's a demo of the CPU limit with a high-end GPU. But what about an RX 6650 XT or a 3060 Ti?
Can’t wait to see the 7xxx3D variant 🚀
I am confused about the low performance of the R5 3600 in Cyberpunk 2077. In many other tests it achieves much better results. Is it because of the CPU overhead of the Nvidia drivers? Can you test and show parallel results with this setup and then with the RX 7900 XTX, for example?
Cyberpunk is a very cpu demanding game. When i play it on my i7 8700k it makes me feel that the cpu is almost dead
Cyberpunk runs better on intel, on i9 13900k I average around 190
@@xTurtleOW it's not about frames. You can have 65 fps that's smooth, or 100 fps with stutters.
Yeah, the Cyberpunk bench is weird.
Ray tracing is ON, which decreases performance a lot on Ryzen CPUs
The difference between the 3600 and 5600X is okay, but the difference between the 7600X and 5600X is much bigger, though around 10% of that performance comes from DDR5. I'm waiting for the X3D CPUs and the 7000G APUs, if they release them.
@@istealpopularnamesforlikes3340 only 7800x3D.
@official_testinggames No thx. It's a scam.
Would be awesome if they did 7000x3g
@@gladiumcaeli I think adding a midrange GPU to an ultra-high-end CPU wouldn't make sense, but it could make sense if you plan to upgrade later and are planning to use the iGPU for a while
@@shadowlemon69 What I meant was it would be great if they made a high-end APU. It would be like the 5800X3D, but an APU version.
I love my ryzen 5 3600X ❤️
That temperature difference and additional wattage just isn't worth it for extra framerate at 1080p gaming. If high refresh rates are that important, start with frame generation and only top up with a CPU upgrade if you must. I'd still recommend the AM4 platform over AM5.
7600X is such a damn good processor. Paired mine with 7900XT
If you already have the 3600X or the 5600X, just get the 5800X3D. If you are building a new PC, get the 7600X or wait for the X3D variants.
More like get the i7 13700K, more cores, more performance.
@@Blacktrous it's about using the stuff you already have.
If you've got a 5600 or better, don't bother tbh. Enjoy what you got and wait 4 or 5 cycles for the next purchase.
@@Blacktrous yeah, the 13600KF is godly too.
@@Blacktrous E-cores don't help much in gaming.
@@Blacktrous It depends on your workload. Game performance usually depends on single-thread performance; that's why people don't game on a 64-core Threadripper. But if you do 3D rendering or video editing, more cores mean more performance. In games, too many cores just cause high power draw and high temperatures.
The CPUs are the ones bottlenecking the GPU, that's the point! The CPUs get pushed to their full potential. Now stop trying to make the GPU the bottleneck. lmao
My configuration : CPU- 7600X AM5 / GPU- 6800 XT 16GB GDDR6 / 64 RAM 5600 DDR5/ 2TB SSD M.2 Gen.4 / Waiting Starfield!!!
*IMPORTANT INFORMATION*
The GPU (4090) is in a much higher price range and is the reason for the FPS levels in games like Cyberpunk or Rust or anything that requires a supercomputer. If you buy a 3070 and a Ryzen 5, don't expect 80 FPS on high graphics. It won't happen.
yeah surely the 3070 wont manage above 80 fps for 1080p
Yeah but if you run low settings the fps is dramatically different
Yeah, but the reason for using a 4090 to test a CPU is to eliminate any possible GPU bottleneck from the results. You don't want to read the results in a vacuum; you just want to look at the difference in performance between the 3 CPUs. You can trust that the difference is entirely because of the CPU.
Yeah, unless the game is CPU-intensive like the Cyberpunk DLC. There you can lower graphics settings all you want on an RTX 3070, but it will only lower GPU load; CPU load stays the same, and you won't get a stable 60 fps on a 5600X. Simply said, with a 5600X in newer games you won't really be able to increase your FPS by lowering settings; with a 7600X there will be a difference, I think. That's what these videos are for.
It's good and all, but you're not gonna see that difference using a weaker GPU. Well done video, mate. 😊
yes. With something like a 6700 XT the difference is maybe 5%
Yeah true
Hella true
Exactly, this is just a bottleneck. How would it go with a 4070 Ti?
@@tryfergoodra552 fine, it only has bottlenecks for 2000 series and probably
Wait, 1080p??? With a 4090?? But why, lol. It really doesn't show much of a difference at 4K.
The RT implementation in The Witcher is so broken it tanks performance even at 1080p -> totally CPU bound. Only single-core IPC improvements help, as the game uses very few cores/threads. Watch Dogs: Legion and CP2077 work way better in RT, with very high CPU usage across the majority of cores/threads.
Thumbs up for including C.PWR draw (and also G.PWR).
Nice FPS boost, though. Just took a random part, 1:46, Forza 5:
3600 - 145FPS - 51W
7600x-251FPS - 89W
FPS increased by 73.1%
Watts increased by 74.5%
That being said, I'm running my 5800X at 1.0875 V at 4.0 GHz; it never exceeds 40 watts while gaming and still doesn't bottleneck my 1080 Ti at all. I also run my 144 Hz monitor at 120 Hz with FPS capped at 120. Overall total system power draw never exceeds 300 watts; running without limits it would reach 500 W max.
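The percentages quoted in this comment check out; here's a quick sketch of the arithmetic using the same Forza 5 numbers (3600: 145 FPS at 51 W, 7600X: 251 FPS at 89 W):

```python
def pct_increase(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# FPS and wattage gains quoted above:
print(round(pct_increase(145, 251), 1))  # 73.1  (FPS gain, %)
print(round(pct_increase(51, 89), 1))    # 74.5  (power gain, %)

# Frames per watt barely moves between the two chips in this scene:
print(round(145 / 51, 2), round(251 / 89, 2))  # 2.84 2.82
```

In other words, the commenter's point stands: in this particular scene the two CPUs deliver almost identical efficiency (FPS per watt), just at very different absolute levels.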
you can have the same wattage and have 99% of the performance with the 7600x. ryzen 7000 runs power hungry out of the box but you can fix it
@@SweatyFeetGirl You are absolutely right, hence my last paragraph about how I use my 5800X. Though you could also do this with the 3600. My living room PC connected to the TV uses a 3600, for which I also disabled all boosts, running it at 3.8 GHz (1.1 V if I remember correctly). With these settings the 3600 doesn't pull more than 35 W while gaming. It runs with an RX 580 on a 1080p 60 Hz TV panel, so it's totally fine downclocked.
EDIT: But yeah, I would guess the 7600X can easily run at 4 GHz at 1.0-1.1 V, which should probably also pull only around 30-40 watts. With those settings the 7600 would be very efficient compared to the 3000 and 5000 series.
Huh… didn’t know the 7600x was that much faster. Pretty cool. Excited to see x3d chips
Remember that it uses DDR5 RAM.
@@NickT9330 Also, the 4090 is seriously bottlenecked by those older CPUs, explaining the fps differences.
@@TerranigmaQuintet it's true.
@@TerranigmaQuintet That's literally the whole point of upgrading your CPU... if it doesn't bottleneck the GPU, you don't need to upgrade. So what are you trying to say?
@@lordad The perfect balance is never cheap. 7:51
This only matters if your GPU is a 4080 or above. If you have a 6700, you will see no difference.
I mean, if you have a 5600, it's not worth buying a new motherboard, processor, and RAM. Even if I only had to change the processor, it wouldn't be worth the money; it's a young new platform.
Literally all the more reason to buy AM5, but okay... AM4 is a dead platform at this point, and the socket itself is the problem: you can't get any better CPU than the 5800X3D.
What cooler did you use for the 7600X?
It would maybe have made more sense to use a 30XX / 6XXX GPU, to see if the vast majority of older graphics cards (which most of us have) can really benefit from a 7XXX CPU.
I'm still happy with my 3600 overclocked to 4.7 GHz, and my RTX 3080 is fully utilized in all games at 4K
Huh can it go to a 5600X ???
@@jadenbloemstein5507 Even if the clock speed is the same or a little higher, the 5600X will still be better than my 3600. But going from 3.9 to 4.7 GHz all-core is a good improvement.
Bruh, how did you overclock to 4.7? My stock 3600 is bottlenecking my RTX 3060 Ti when I turn on ray tracing.
@@MrRaphaelLippi are you playing in 1080p 1440p or 4k?
You can't overclock a 3600 to 4.7, stop lying kid. 🤡
Finally prices came down and an AM5 build is viable now
I prefer 5 3600X more than anything for its price...
Why is the performance in The Witcher 3 so bad while CPU and GPU load are so low?
That 7600x is a monster
God of War at 720p render resolution with DLSS Quality? I doubt how relevant that test is.
Amd killed its own cpu with expensive boards
The 7600 is the real AMD king actually; it produces 95% of the performance while remaining much cooler and drawing impressively little power
You mean nearly 99%. Cheers ☕
Does it matter at 1440p or 4K? Would like to know, because I have a 3900X and I don't want to have to buy a new CPU, just want to upgrade my graphics card. Thanks
There is not a big difference at higher resolutions. I also have a 3900X and play at 1440p, and my GPU will usually hit 96-98% usage.
The 4070 Ti is the most suitable GPU to see the difference. A 4090 is too much for these little CPUs...
Not really; it's a CPU comparison test, NOT a test about choosing the best CPU for an RTX 4090
And again - test them at EQUAL core and DRAM frequencies
That's an IPC test, bruh
@@m8x425 YES, then equalize the working frequencies, because if modern computers are in practice fast counting devices, then their performance increases almost linearly as frequency (aka clock rate) increases. More frequency means more instructions executed per second. In this case we're comparing performance between THREE different CPU generations (no matter that all are Ryzen).
Sadly, you're another one who has NOTHING to say.
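The point being argued here can be sketched with a toy model: performance ≈ IPC × clock, so testing at equal frequencies isolates the architecture (IPC) difference, while stock-vs-stock mixes both. The IPC and clock figures below are hypothetical, chosen only to illustrate the split.

```python
def perf(ipc: float, ghz: float) -> float:
    """Relative throughput: instructions per cycle times cycles per second."""
    return ipc * ghz

old_gen = perf(ipc=1.00, ghz=4.2)   # hypothetical older-generation chip
new_gen = perf(ipc=1.30, ghz=5.3)   # hypothetical newer-generation chip

# Stock-vs-stock gain mixes architecture and clock:
print(round(new_gen / old_gen - 1, 2))                   # 0.64

# Locked to the same 4.2 GHz, only the IPC gain remains:
print(round(perf(1.30, 4.2) / perf(1.00, 4.2) - 1, 2))   # 0.3
```

Both views are valid tests; they just answer different questions (an equal-frequency run measures IPC, while stock settings measure what a buyer actually gets).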
I have a question. Why does the FPS rate change in each 7600X test, even though the test setup is the same? There are videos with the 7600X doing 96, 108, 140, and now 111???
The high power consumption of the 7000 series, especially while gaming, is so deceiving, and it makes me wonder why AMD pushed these CPUs so hard when they didn't have to at all. I have my Ryzen 5 7600 at 5.4 GHz at 1.185 V and it consumes 30 watts less in multicore and around 40-45 W less while gaming. In my case that means 85 watts in Cinebench R23 with 15850 points multicore, and 50-55 watts while gaming in Cyberpunk 2077 at around 50-60% CPU usage at times. They are crazy efficient, and that's what AMD should have doubled down on with this release.
But why are you testing with an RTX 4090? Test this processor with, for example, an RTX 3060
To be honest, it won't be worth it if you are upgrading from a 5600X to a 7600X, as your 5600X system is probably only a year or two old.
But it's really worth it if you are going from a 3600 to the Ryzen 7600X 😅
Why was the RTX 4090 on the 7600X setup always running at a higher frequency,
while on the other two platforms it was always running at lower frequencies, yet with similar GPU usage?
AMD makes huge jumps in performance, unlike Intel.
What CPU cooler are you using for the test?
Usually he uses a Dark Rock Pro 4
Eagerly waiting for the idiots who will undoubtedly ask why the tests were done at 1080p, "because nobody plays at 1080p" (which is a false statement in and of itself).
Guys, should I buy a 3600 (100€) or a 12400f (180€)? I will buy the motherboard as well.
The Core i5-12400f is comparable to the Ryzen 5 5600 and 5600x. But if you're going to game on a high refresh rate monitor and you're going to tweak the settings to get a higher framerate then go with the i5-12400f or R5 5600. If you're happy with 60fps and you're going to go with a GPU like the RX 6600xt or RTX 3060, then you'd be okay with the Ryzen 5 3600.
The gaming performance of the R5 5500 is about the same as the R5 3600, so there's that too.
Judging by the CPU and GPU load graphs, there is an artificial limitation: the load on both the processor and the video card is incomplete, so they are not limiting each other. Perhaps it's in the code of the games themselves, in the form of profiles for different combinations of components. It's a fraud!
Yes, I was looking for the same thing... even the 3600 is never maxed out in any situation... some games have a 50% cap for the processor, some have a 50% cap for the GPU... either something else is bottlenecking or they clearly have a built-in profile... cheating to the core
@@ElangesWaranor the game just can't utilize all cpu threads
Spiderman games + Ryzen CPUs = ☠
I have a Ryzen 5 3600 with 32 GB of RAM at 3200 MHz and an RTX 3080, and I play at 4K, so I think the CPU really is enough given how little the CPU is taxed at that resolution
Besides the fact that all of them bottleneck the 4090, the lower CPUs bottleneck it even worse. It should have been tested with a 3080 Ti or something for a better comparison
the whole point of testing a CPU is to see how much it bottlenecks the GPU >-< these comments make me lose my sanity
@@lordad There will always be these type of comments on CPU test videos.
I think this benchmark goes wrong because you're using an RTX 4090 with the Ryzen 5 3600 and Ryzen 5 5600X, which causes them to bottleneck the graphics card: with the R5 3600 you're not even at 50% GPU usage, while with the R5 7600X you're at 80%, and that paints the wrong picture of the CPUs
Only a madman would pair a 4090 with a Ryzen 5, and then play at 1080p on top of that
COMPETITIVE PLAYERS WILL... those who care more about FPS, 1% lows, input lag, etc. Do your research, then speak.
Impressive 30% generational increase
3070 Ti should be ideal for Ryzen 5 3xxx/5xxx.
No, rather the 3070. The 3070 Ti is quite a bad deal: it costs 100 bucks more than the 3070 while consuming 70 watts more for like 8% more performance, and it only has 8 GB of VRAM. DO NOT buy a 3070 Ti. The 3070/6700 XT is the way to go
@@theplayerofus319 agree, the RX counterpart would be the better step up, except for ray tracing
@@MetaDude yes, and in the 3070/RX 6700 XT segment most people don't use ray tracing, so AMD is probably the better deal in the mid or low GPU class
@@theplayerofus319 Why not 6800? It costs around 400-500 USD.
It performs better than the 3070 Ti while being more power efficient and having more VRAM (16 GB).
The only con is its driver incompatibility with production software.
@@Nitin-vh4dp well, if you have the extra 170 bucks (price in Germany) to go from the 6700 XT to the RX 6800, I'd say go for it; it's also a great and VERY efficient card
I find it so strange that the CPU usage is so low across the board. You'd think the older-gen CPUs would show much higher usage.
65 average FPS in the Witcher with a 4090 and 7600x at 1080p? Something definitely isn't right here....
Screw these benchmark videos with low-mid range cpus. I want to see how much of a difference there is with a GPU you would actually pair it with not a 4090
With that huge of a jump you might think that even the 7600X might be a bottleneck?
The 7600x is a bottleneck. Look at the GPU utilisation. Not 100%. I would say the 4090 is not a 1080p card
Not at all, for now, unless the 4090 is later proven to benefit from 8 cores or more
usually they (the high-end ones) are just better clock- and feature-wise
7:32 81 fps for the 3600, REALLY?
The gap is large because the CPU is bottlenecking the GPU.
Something is wrong with the Witcher 3 benchmark. I have an R5 5600X and it runs at a stable 60 fps
What! No Division 1 or 2?
5800X3D vs 7600X please
they are on par: some wins for the 5800X3D, some for the 7600X
it would've made more sense to test it with a 3060 Ti or 3070, nothing more
not really, because only a GPU that is bottlenecked by both CPUs at 1080p can truly show you the absolute maximum possible difference between the two CPUs.
No. The goal is to realize 100 percent of the theoretical performance of each CPU. Placing it into a CPU-bound scenario by lowering the resolution and creating as much headroom on the GPU as possible is how you do it.
60 fps at FHD with an RTX 4090 and a 7600X in The Witcher 3? Is this a joke? I was at this level with a GTX 1070 and an i7-7700 seven years ago on high settings.
Asymmetrical: the 7600X's improvement rate is much higher than that of the previous two generations
Why is the GPU not at full usage in many games?
Because it's 1080p and the CPUs are too weak and can't produce enough frames to keep up with the 4090.
The Witcher 3 is so old, and the 5600X still fails to deliver 60 fps in the city.
In most games the 5600X is enough for 60 fps gameplay, but there are some games where it fails hard...
Makes me want to get a 5800X3D; only changing the CPU would be nice...
Hmm, wondering what the difference would be at the same frequency, since AM5 is nearly 1 GHz above its predecessor.
That's a manufacturing secret, but one example: a 3000 MHz core clock on 7nm performs differently from 3000 MHz on 4nm
bro, did you really test it at 1080p :D? with that GPU?
bro, please inform yourself on how CPU testing is done and why it's good to test at 720p or 1080p
That has to be one absolute UNIT of a CPU cooling system you've got there. Like how the f are your temps that low?
i5-13500 + B660 or 7600X + B650: which combo should I go with for a 3080 at 1440p?
I would go the AMD route, since you will be able to upgrade the CPU later down the line. Intel changes their socket like every 2 generations.
Brother, what the hell? Where is the cooling system?
I don't have time to change anything; they're yours, use them as you please
But can they run Minecraft?
Should I upgrade my Ryzen 5 3600? I have an RTX 3060 12GB.
I have an R5 3600X paired with a 6600 XT; should I upgrade the 3600X?
Which one is better for a GTX 1050 Ti?
R5 3600 or 5600X?
The 3600 is enough for a 1050 Ti; I have a 3600 paired with an RX 570 and I didn't see any CPU-limited scenario.
Get the cheaper CPU and invest in a better GPU.
I need that cooler from the 5600X. Mine feels like a furnace, running between 75 and 82 degrees
do a manual overclock
dude, if you're using the stock cooler that's just how it is, but you can undervolt
For example, I'm planning to buy a 5600, which is equal to the 5600X in both power consumption and temperature, and I recently got a Boreas E1 410 cooler, which is really great and will keep it nice and cool
@@atirador94tg I have a 5600 with a simple water cooler; in games the maximum it reaches is 60 degrees with the manual boost
@@atirador94tg I use the cooler that comes with the 3800X on it. I changed the thermal paste and it improved, but it still runs quite hot
@@thalison19 I'll try undervolting; if that solves it, great, if not, I'll try a new cooler
Nonsense: the new, expensive CPUs here can keep up with the 4090, and the old ones can't. Put in a 3060 or 3070 card so it's fair.
Truly impressive gains.
that's graphic card upgrade level performance
true, people often ignore a 'small' fps increase when in total it can be similar to upgrading to a far more expensive GPU
that's why reviewers always get better scores: they know how to optimize things
Which CPU should I choose from these second-hand prices I found in my country? Only for gaming: RDR2, Mafia Definitive Edition, Far Cry 6... 7600X $170, 7800X3D $205, 7900X $205, 7500F $147. My hourly wage is $3.40.
Bro pls include valorant in your games list
Can someone explain to me why all the CPUs show ~60% usage but different fps results?
Clock speeds and generational gaps; otherwise why would there be newer CPUs? They have to bring something better and faster
Because the CPU power itself differs. You can think about it this way: 60% usage of a Hyundai engine is not the same as a 60% usage of a Lamborghini engine.
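The engine analogy above can be put in a one-line sketch (with made-up numbers): equal percentage utilization on chips with different peak throughput yields different frame rates.

```python
# Sketch of the Hyundai-vs-Lamborghini analogy above.
# Peak FPS values are hypothetical, chosen only to illustrate the point.
def fps_at_usage(usage: float, peak_fps: float) -> float:
    """Frames delivered when a CPU runs at `usage` of its peak frame throughput."""
    return usage * peak_fps

# 60% usage on a faster chip still produces more frames than 60% on a slower one:
slower = fps_at_usage(0.60, 150)   # older-gen CPU
faster = fps_at_usage(0.60, 250)   # newer-gen CPU
assert faster > slower
```

So identical usage percentages across three CPU generations are expected; the percentage only says how busy each chip is relative to its own ceiling.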
@@scizzorxyz thanks man
@@ruxandy thanks for explanation
Running a 7600 with a GPU probably means monthly electricity bills like a refrigerator's... maybe 800 pesos per month just for the PC
how is the witcher 3 with a 4090 and 7600x just hitting 65fps?
Because of ray tracing.
i5-13400 vs i5-13600k please.
Thanks for the video bro....
I jumped from an R7 3700X to an R5 7600X. And it was worth it.
Guys, it's a 4090. The best card most people actually have right now is probably an RX 6800 XT
Pls next time compare ryzen X700 series
If you put in a 4090, of course the better tech will show a bigger improvement; that's very misleading, because it's made to be paired with a high-end CPU, not with slow parts, so weak chips will show much worse performance, while mid-tier CPUs would behave much closer to each other and more realistically.
This just convinced me to get 7600x
If you have a very powerful GPU, go for it. If not, the 5600X and 7600X should perform similarly
Do you have a 4090 or a 4080? Otherwise you won't notice any difference worth the money
@@AllanRoberto2711 You will notice a difference even with a lower end gpu. my Ryzen 7 3700X bottlenecks my RTX 2070 in many games..
@@SweatyFeetGirl of course, but will it be worth the money? Buying a new mobo, RAM, and CPU
nah, the 13600K is a better deal
This is not a good comparison. You do know that when you pair an RTX 4090 with an R5 3600, that's a huge bottleneck. It would've been better if you had used a 3060 Ti or 3070 at 1440p resolution.
Argh, please think for 1 second..... this test shows the power of the CPU... if you choose a weak graphics card you show the power of the graphics card, not the CPU.....
🤦♂️ another one of those. You can only compare CPUs if all of them bottleneck the GPU, and this is achieved by maximizing CPU load (enabling RT, lowering resolution and any GPU-related settings, and, to be sure, using a 4090 so it's always underloaded). This way we see the frames each CPU can produce, and the GPU is out of the equation.
Look at the GPU: it's not even at 90% usage; these CPUs bottleneck even a huge GPU.
God of War heats the processor 15 degrees more than Spider-Man? hm, strange