AMD Ryzen 2600 vs Ryzen 5600. CPU comparison in 10 games! Amazing improvement in two generations!

  • Published: Jan 9, 2025

Comments • 37

  • @starstreamgamer3704
    @starstreamgamer3704  2 years ago +4

    This is a comparison of two CPU models: the 2018 AMD Ryzen 5 2600 (based on the Zen+ architecture) and the 2022 Ryzen 5 5600 (based on the Zen 3 architecture).
    Graphics quality in most games was set to the highest available preset (or pushed even further where possible), and the in-game resolution was usually set to 720p. Where a GPU bottleneck still appeared, I had to lower some settings, decrease the resolution scale (when possible) or use the DLSS "ultra performance" preset (when available).
    In some games/scenes the 5600 was still GPU-bottlenecked even at the lowest possible resolution. This CPU is really fast and is a good match for a GPU much faster than my RTX 3060.
    In Star Wars: Jedi Fallen Order there is a 144 fps in-game limit, so the performance of the 5600 was capped in some scenes as well.
    Introduction - 00:00
    Assassin's Creed Odyssey - 00:04
    Assassin's Creed Syndicate - 01:12
    Cyberpunk 2077 - 02:47
    Cyberpunk 2077 with RT - 03:59
    Far Cry 5 - 05:12
    God of War - 06:20
    Horizon Zero Dawn - 07:45
    Metro Exodus Enhanced Edition - 10:46
    Rise of The Tomb Raider - 11:58
    Shadow of the Tomb Raider - 13:27
    Shadow of the Tomb Raider with RT - 16:19
    Star Wars: Jedi Fallen Order - 19:12
    System specs:
    GPU: Gainward RTX 3060 Ghost
    Storage: Western Digital Blue WD10EZEX 1TB HDD
    CPU №1: AMD Ryzen 5 2600 - OCed to 3800 MHz
    RAM №1: 16GB (2x8GB) DDR4 Crucial Ballistix BL2K8G32C16U4R - OCed to 3533 MHz [14-17-14-35 CR1 + tightened subtimings]
    Motherboard №1: MSI B450M Mortar MAX (AM4, mATX)
    CPU cooler №1: ID-Cooling SE-214 (1x120mm fan, 4 copper heat pipes, 150 W)
    CPU №2: AMD Ryzen 5 5600 - OCed to 4550 MHz (via Curve optimizer)
    RAM №2: 16GB (2x8GB) DDR4 Crucial Ballistix BL2K8G32C16U4R - OCed to 3800 MHz [16-20-20-40 CR2 + tightened subtimings]
    Motherboard №2: MSI B550M PRO-VDH Wi-Fi (AM4, mATX)
    CPU cooler №2: Thermalright Burst Assassin 120 (1x120mm fan, 6 copper heat pipes, ~180 W)
    Recorded with the help of NVIDIA ShadowPlay.

    • @arfianwismiga5912
      @arfianwismiga5912 2 years ago

      What GPU would you recommend to pair with this CPU if I want to upgrade from a 1660 and play only at FHD?

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago

      If you do not want to be CPU-bottlenecked at 1080p at all, then a 3060 Ti/3070/3070 Ti level of performance is the way to go. 8 GB of VRAM should not be a problem at FHD, although the 6700 XT with its 12 GB is a good option as well.
      But if you are planning to use the RT Ultra preset, then even a 3080 may not be fast enough in some cases, so the 4070 Ti is the GPU level you should consider.

  • @arfianwismiga5912
    @arfianwismiga5912 2 years ago +2

    My biggest upgrade is changing the R5 2600 to an R5 5600, because if I upgraded the GPU I would have to change the PSU as well. I may keep the GTX 1660 until it dies.

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago +1

      It depends on the power supply's quality and capacity. A 500W "80+ Gold" model should be good enough to replace a 1660 with a 3070/6700 XT.
      I personally have a 500W "80+ Bronze" PSU and I moved from a 1660 to a 3060 without any problems. It consumes 170W at stock, but with a slight undervolt (down to 0.925V) it consumes 130-140W (not far from the 1660's 120W TDP). Moreover, it uses the same single 8-pin power connector as the 1660.
      And the 3060 is a much more powerful GPU, making it a decent upgrade over the 1660 (50-70% faster on average). Here is my comparison of these two models: ruclips.net/video/4IcSmHkYAJc/видео.html

  • @frapeczek8570
    @frapeczek8570 1 year ago +3

    I understand that you set the resolution to 720p to see the real difference between the CPUs. However, people who watch videos like this want to see the difference they will experience on a daily basis (which means 1080p or 1440p) ;)

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago +3

      The 720p resolution was chosen simply to eliminate the GPU bottleneck. I was using an RTX 3060 12GB during this comparison. At 1080p (let alone 1440p) it would become a serious bottleneck, so you would not see any significant difference between these two CPUs.
      But if we are talking about a 3060 rendering at 720p, you can get roughly the same performance level at 1080p with a 3070/4060 Ti, or at 1440p with a 4070 Ti/3090 Ti, while a 4090 can provide this level of performance at 4K. In two years' time you will be able to get a 5060, which will probably be as fast as a 4070...
      So, as you can see, it depends on the specific GPU. That is why there is no sense in testing CPUs while being bottlenecked by the GPU. Major RUclips channels often test CPUs at 1440p or even 4K, but they usually use the fastest available GPU for this purpose, again, to reduce the GPU bottleneck as much as possible.
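The resolution-equivalence reasoning above boils down to pixel counts. A rough sketch (my own simplification, not from the video: it assumes GPU load scales linearly with rendered pixels, which is only approximately true):

```python
# Rough sketch: relative GPU load by rendered pixel count.
# Assumption: load scales ~linearly with pixels (an approximation only).
RESOLUTIONS = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name: str) -> int:
    w, h = RESOLUTIONS[name]
    return w * h

def relative_load(target: str, base: str = "720p") -> float:
    """Roughly how much more GPU throughput `target` needs vs `base`."""
    return pixels(target) / pixels(base)

for res in ("1080p", "1440p", "4K"):
    print(f"{res}: ~{relative_load(res):.2f}x the pixels of 720p")
```

So 1080p pushes 2.25x the pixels of 720p and 4K pushes 9x, which is why a much faster GPU is needed at each step up to hold the same framerate.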

    • @CL-jq1xs
      @CL-jq1xs 8 months ago

      @@starstreamgamer3704 I guess what he wanted to see is how big the difference is, no matter how insignificant, because it's more in line with what people actually experience. Such as upgrading from an R5 2600 to an R5 5600 with an RTX 3060, or buying a new system on a budget and deciding whether to get a lower-end CPU. For the most part, testing the limit of a CPU without a bottleneck is not useful information for most people, such as testing an R5 5600 with an RTX 4090. No one actually pairs those together in reality.

    • @starstreamgamer3704
      @starstreamgamer3704  8 months ago

      @@CL-jq1xs It depends. If you play at 4K/Ultra/60, you can easily pair a 4090 with a 5600, as there will be no bottleneck from the CPU side in most games.
      If you have a 1440p/120 screen, then there is no sense in pairing a 4090 with anything less performant than a 7800X3D or 13700K.
      Testing CPUs without any GPU bottleneck is actually very useful for those who are planning to upgrade their GPUs. That way you can see what your CPU can really do.
      The 3060 is just too slow to show any significant difference in performance between the 2600 and 5600 in most cases. So if I were testing these two CPUs at 1080p, let alone 1440p, the results would be almost identical in both cases. It would not really be a comparison, but more like a search for edge cases.

  • @zonelore
    @zonelore 2 years ago

    Nice video! Can you add a 0.1% low graph please?

  • @enggarrizkindo6992
    @enggarrizkindo6992 1 year ago

    I'm planning to upgrade my CPU from a 2600 to a 5600, but my GPU is currently an AMD 5500 XT.
    It's mainly used for CAD, design work and games.
    Is it worth it, or should I just invest in a new GPU?

  • @ph7neutral
    @ph7neutral 1 year ago

    WOW! Such a big improvement! :O But I play games on a 360p monitor with an RTX 4090 and a Ryzen 2600X in my PC. What do you think, should I change the processor?

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago

      The 4090 is an extremely powerful card. I'd say even a 5800X3D will bottleneck it in certain games, especially at resolutions lower than 4K. The 2600X is a severe bottleneck for this GPU, so a CPU upgrade is a must in your case.

    • @g3nny9342
      @g3nny9342 1 year ago +2

      A 360p monitor? Why?

  • @diblin.
    @diblin. 1 year ago +1

    Just what I needed, but the Far Cry 5 section is labeled Far Cry New Dawn. Though the difference is negligible.

  • @agussuparno741
    @agussuparno741 1 year ago +1

    Why 720p?

  • @kazukikazami1397
    @kazukikazami1397 1 year ago

    Is the upgrade worth it if I'm using an RX 5600 XT playing at 2K? I can only buy one: a new CPU, or a used 3060 Ti/RX 6700 XT.

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago +1

      Actually, your current PC configuration is well balanced, so the upgrade path depends on the games you play and how you play them. If you are OK with 50-60 fps gaming and prefer a higher resolution (1440p?), then you'd better upgrade your GPU. But if you are aiming for 1080p/75+ fps gaming, then the 2600 will not be able to satisfy that need.
      You can also consider the more affordable 3600 or 5500 CPU and combine it with a GPU upgrade. For example, even a 6600 XT is significantly faster than a 5600 XT. A combination of a 3600/5500 and a 6600 XT is perfectly balanced (it will provide ~25% improvement in both regards compared with your current setup).

    • @kazukikazami1397
      @kazukikazami1397 1 year ago

      @@starstreamgamer3704 Thank you, I think I'll go with the GPU upgrade first, mainly for 2K gaming.

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago

      You're welcome. Good luck.

  • @sownlengoc
    @sownlengoc 4 months ago

    Looks like 88W is the real TDP of the 5600, not 65W.

    • @starstreamgamer3704
      @starstreamgamer3704  4 months ago

      65W is its rated TDP, while its PPT is 88W. But it can consume up to 110-120W under some loads when not constrained by any limits, depending on its core clock and core voltage.
      Though the optimal limit for this CPU is 90W: anything higher brings negligible performance improvements but significantly increases the CPU temperature.
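The 65W/88W relationship is not arbitrary: on AM4 Ryzen parts the package power limit (PPT) is conventionally 1.35x the rated TDP. A quick check of that rule of thumb:

```python
# AM4 Ryzen convention: PPT (package power tracking limit) ~= 1.35 x rated TDP.
def ppt_from_tdp(tdp_watts: float) -> int:
    """Estimate the stock PPT limit from a rated TDP, rounded to whole watts."""
    return round(tdp_watts * 1.35)

print(ppt_from_tdp(65))   # 65W TDP  -> 88W PPT (e.g. Ryzen 5 5600)
print(ppt_from_tdp(105))  # 105W TDP -> 142W PPT (e.g. Ryzen 7 5800X)
```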

    • @sownlengoc
      @sownlengoc 4 months ago

      @@starstreamgamer3704 Then I wonder how the R5 2600/3600 would perform if they were forced to use more watts.

    • @starstreamgamer3704
      @starstreamgamer3704  4 months ago +1

      @@sownlengoc My sample was running at 3.8 GHz/1.22V. The next step for it was 3.9 GHz/1.28V, which would have increased its power draw to 65-80W in games. At 4 GHz/1.38V it was consuming ~90-100W, but its gaming performance was just 3-4% higher than at 3.8 GHz/1.22V. So a 70W limit is an optimal choice for the 2600.
      The 5600 just needs a bit more power to reach a good balance between power consumption and framerate. The 3600 is similar to the 2600 in this regard: a 70W limit is an optimal choice for that CPU as well.
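Those diminishing returns match first-order CMOS dynamic power scaling, P ∝ f·V² (a simplification that ignores static leakage). Plugging in the operating points from the comment above:

```python
# First-order CMOS dynamic power estimate: P ~ f * V^2 (ignores leakage).
def power_ratio(f1: float, v1: float, f2: float, v2: float) -> float:
    """Estimated power of operating point (f2, v2) relative to (f1, v1)."""
    return (f2 / f1) * (v2 / v1) ** 2

# Ryzen 5 2600 operating points quoted above (GHz, volts):
base = (3.8, 1.22)
for f, v in [(3.9, 1.28), (4.0, 1.38)]:
    r = power_ratio(*base, f, v)
    print(f"{f} GHz @ {v} V: ~{r:.2f}x the power of {base[0]} GHz @ {base[1]} V")
```

The 4 GHz/1.38V point comes out at roughly 1.35x the power of 3.8 GHz/1.22V, consistent with ~70W climbing to ~90-100W for only a 3-4% framerate gain.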

  • @bodasactra
    @bodasactra 1 year ago +1

    I suspect the Nvidia driver overhead is playing a part here. I doubt there would be much difference in these CPUs, even at 720p, if you used the RX 5700XT. That GPU is faster than a 3060 but gets along with older Ryzen 5 parts almost like they were made for each other :)

    • @starstreamgamer3704
      @starstreamgamer3704  1 year ago +1

      Yes, driver overhead is a problem, at least to some extent, but it generally decreases CPU-limited performance by 10-15%. So the 5700 XT would definitely still be bottlenecked by the 2600X at 720p, although not as significantly as the 3060 is.

  • @a.m.dbrutal8360
    @a.m.dbrutal8360 1 year ago

    Well, I don't know, I think the extra 20-30 fps isn't worth it; the 2600 is still fine.

  • @blackti4792
    @blackti4792 1 year ago

    With the Ryzen 5 2600 the GPU is at stock. With the Ryzen 5 5600 the GPU runs at a lower voltage and a higher clock. The tests are invalid.

    • @КириллМ-ъ9о
      @КириллМ-ъ9о 1 year ago

      3800 is not stock across all cores. 3900 is my overclocking limit; it works at 4.0, but the extra heat output is not worth it. That said, I agree the gains are not mind-blowing. Any AM4 system is only worth upgrading to a 5800X3D, and then leaving it to live out its days.

    • @blackti4792
      @blackti4792 1 year ago

      What overclocking limit? 3900 is the turbo boost frequency 🤦‍♂️, what does overclocking have to do with it? The stock vs. overclock point was about the GPU. Watch the tests carefully: with the R5 5600 the card is undervolted and overclocked.

  • @Michael-it6gb
    @Michael-it6gb 1 year ago

    This is what I find strange:
    Ryzen 2600: 4.8 billion transistors
    Ryzen 5600: 4.1 billion transistors