Ryzen 5 5600: When Will it Bottleneck your GPU?

  • Published: 11 Sep 2024
  • In this video, we'll take a look at when the Ryzen 5 5600 will start bottlenecking your graphics card.
    Looking to upgrade your graphics card but not sure which one to choose? We compare the different options available for the Ryzen 5 5600 against the Ryzen 5 5500 and 5600X, and discuss when the 5600 will start to bottleneck your graphics card so you can make an informed decision.
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/p...
    License code: 9PZN5ZEHDW9WDQW9
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/s...
    License code: RODD58Z9DOFWVUQ5
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/s...
    License code: 3STA0ZMRRIGUSPHJ
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/s...
    License code: XYG11GKQNKTZN825
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/p...
    License code: KPBSHICZAI186DXS
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/p...
    License code: RRJMQDFTHBHLO8IS
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/p...
    License code: FBO0VOQHYXTFM3VM
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/p...
    License code: 4X1B0QSFGFKZJTD6
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/t...
    License code: LD65UULTEMKDAZVC
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/d...
    License code: LOEWUNJK5BQ2DCX6
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/s...
    License code: ZHGJHH60CVPT23Y7
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/m...
    License code: CUZPRRDRSKTBZMWB
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/a...
    License code: LBJBHU0JVYTKGOQB
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/r...
    License code: RVMO02FHCAHQ6ULP

Comments • 913

  • @LegitCamelRetro
    @LegitCamelRetro  Год назад +52

    I had a lot of interest in doing an i5-12400 bottleneck video and finally finished it! Thanks for all the support you guys have shown on this video, I appreciate it more than you know. Check out my i5-12400 video if you haven't already: ruclips.net/video/gVFb6dBEv_A/видео.html

    • @nitinnetworker
      @nitinnetworker Год назад +1

      I have a 2080 and a Ryzen 1600. Which CPU would you recommend to eliminate the bottleneck?

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      @@nitinnetworker I would upgrade to the 5600, it's really well priced and should handle the 2080 just fine.

    • @edwqrd
      @edwqrd Год назад

      I have a question: will a Ryzen 5 5600 get bottlenecked by an RX 580? I mainly plan on using them for Valorant/CS:GO.

    • @jdquadrider
      @jdquadrider 8 месяцев назад +1

      @@edwqrd I'd get an RX6600 long before I ever got an RX580. The 5600 will handle either one of these just fine.

    • @edwqrd
      @edwqrd 8 месяцев назад +1

      @@jdquadrider I had a good deal here in my country, it was like 80 dollars used, so...

  • @coveyking
    @coveyking Год назад +613

    the results show that the 5600 handles all the cards well, with only minor performance differences compared to the more expensive 5800x 3D. Overall, the Ryzen 5 5600 is an excellent value CPU on the market.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +22

      very true!

    • @theanglerfish
      @theanglerfish Год назад +2

      x3D is 8 cores / 16 threads instead of 6/12

    • @hadifelani
      @hadifelani Год назад +25

      That's only around 50-70% true. Not quite _minor_ when it comes to games that enjoy huge chunks of L3 cache, though.

    • @fausto198
      @fausto198 Год назад +18

      I've got a 5600 and it handles a 6700 XT just fine. I've recently acquired a 6900 XT and boy do I get a CPU bottleneck in BF2042 @1440p, I mean close to 100% CPU usage. God of War maxed out runs fine, so at this point and after watching this video it looks like the CPU should be able to handle the GPU demand. It may just be a config in the game I need to tweak. Will test more over the weekend.

    • @brucewayyyyne
      @brucewayyyyne Год назад +6

      should i get a 5600 or 5600x to pair with a 6600xt?

  • @AshtonCoolman
    @AshtonCoolman Год назад +330

    It's important to know that AMD GPUs bottleneck less than Nvidia GPUs because they have a hardware command processor. Nvidia does that work in software and has far higher driver overhead on the CPU. That's why a 6950 XT with a 5600 is faster than a 4090 with a 5600 in non-RT, CPU-bound scenarios. Quite simply, if you don't have a fairly high-end CPU, you'll lose performance with a higher-end Nvidia GPU. The hardware command processor has been a feature since the HD 2900 XT series. This is why you got the results that you did with the 7900 XTX and other AMD GPUs.

    • @shadowkyun
      @shadowkyun Год назад +28

      huh never knew that

    • @AshtonCoolman
      @AshtonCoolman Год назад

      @@shadowkyun check out Hardware Unboxed's video comparing the 13400 to the 5700X along with some other CPUs. ruclips.net/video/OsA52DkP8WU/видео.html

    • @Strytax
      @Strytax Год назад +9

      nice to know...ty.

    • @Vantrakter
      @Vantrakter Год назад +7

      How are the Intel Arc cards in terms of hardware command processor / software overhead?

    • @jusala14
      @jusala14 Год назад +3

      so, should I pair AMD with AMD? I have a R5 5500 + 3060, should i change to a 6700 or so?

  • @COBRO98
    @COBRO98 Год назад +67

    Parts last as long as you're happy with their performance. A 5600 realistically will last you 2-3 GPU upgrades going forward(6+ years)

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +5

      For sure!

    • @martingolding4951
      @martingolding4951 2 месяца назад +1

      I have been using a ryzen 7th Gen cpu since 2019 it came with a gaming pc at £400 and just today, I've purchased the ryzen 5600. I thought it was about time for an upgrade costing me £279 through my catalogue

  • @earthling1984
    @earthling1984 11 месяцев назад +28

    Plan to keep my B550M, 5800X, 3060 Ti, 32GB RAM, and Gen 4 NVMe for a few more years. Built in November 2021. Started with a GTX 1650 (GPU prices were crazy), then recently upgraded to the 3060 Ti. I play at 1080p/60. My system does everything I want at max settings (with RT off). And video rendering times are great too (compared to the 4-core i7 4000 series I had before). 2 years in, hopefully 5 to go. I'll rebuild a new desktop at some point, but for now, this is working great!

  • @andrewfarrugia2688
    @andrewfarrugia2688 Год назад +61

    Love this video. I recently upgraded to amd 5600 and 6800 and you demonstrated I did not need to go to the 5800x 3d. I got a great deal for the 5600. $199 AUD. Great value cpu

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +6

      Glad I could help!

    • @TheRageKing31
      @TheRageKing31 Год назад +2

      Same setup as me! It's a fantastic combo and I am very surprised how powerful this little cpu is.

    • @jacobfeet4870
      @jacobfeet4870 Год назад +1

      Hey, have you run into any problems with your setup? And what brand of Radeon 6800 did you buy? Because I want to buy a Radeon GPU to pair with a 5600 I already have coming.
      Are the Sapphire ones any good? Or is there a certain brand that is just better than the others?

    • @TheRageKing31
      @TheRageKing31 Год назад +2

      @@jacobfeet4870 Sapphire and PowerColor are the "best brands." I have a reference model and it stays very cool while gaming. The only thing I would recommend is running DDU before installing the card and doing a fresh driver install. All models are incredible for overclocking if you choose to do so; they're a little better than a 3070 Ti in rasterized performance at stock clocks.

    • @jacobfeet4870
      @jacobfeet4870 Год назад

      @@TheRageKing31 so it being my first ever built pc, do i still run ddu?

  • @davetheslayerfan9357
    @davetheslayerfan9357 Год назад +102

    It would also be helpful to see the 1% and 0.1% lows. Frame times are also helpful. It's not just about average FPS. Smoothness matters.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +15

      I agree, I replied to a lot of other comments and fixed it in the 12400f video I did.

    • @logandeathrage6945
      @logandeathrage6945 Год назад +5

      ​@@LegitCamelRetro You should do an update video with the 5800x3d since it has higher 1% lows which is why the 3D processors are so good in gaming.

    • @ericsilva7430
      @ericsilva7430 11 месяцев назад +6

      Yeah, 1%, 0.1%, and min are all that matter.
      So many reviews don't show this, making the review useless.
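
For anyone who wants to reproduce these numbers themselves, below is a minimal sketch of one common way to compute average FPS and the 1% / 0.1% lows from a frame-time log. It assumes a PresentMon-style CSV with a MsBetweenPresents column and a hypothetical capture.csv file name; adjust both for whatever capture tool you use.

# Sketch: average FPS and 1% / 0.1% lows from a frame-time log.
# Assumes a PresentMon-style CSV with a "MsBetweenPresents" column (milliseconds)
# and a hypothetical file name; adjust both for your capture tool.
import csv

def fps_summary(csv_path, column="MsBetweenPresents"):
    with open(csv_path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]

    # Average FPS = total frames / total time.
    avg_fps = 1000.0 * len(frametimes) / sum(frametimes)

    # One common convention: the "1% low" is the FPS equivalent of the frame time
    # at the boundary of the slowest 1% of frames (similarly for 0.1%).
    slowest_first = sorted(frametimes, reverse=True)
    one_pct_ms = slowest_first[max(1, len(frametimes) // 100) - 1]
    point_one_pct_ms = slowest_first[max(1, len(frametimes) // 1000) - 1]

    return {
        "avg_fps": round(avg_fps, 1),
        "1%_low_fps": round(1000.0 / one_pct_ms, 1),
        "0.1%_low_fps": round(1000.0 / point_one_pct_ms, 1),
    }

if __name__ == "__main__":
    print(fps_summary("capture.csv"))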

  • @fandomkiller
    @fandomkiller Год назад +53

    3600 and 5600 were both incredible value for gamers.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +6

      Very true

    • @CHICKENmcNUGGIESMydude
      @CHICKENmcNUGGIESMydude Год назад +2

      no, the 3600 sucked lol. Why so many people got it and so few people got the 5000 series is beyond me

    • @shining_buddha
      @shining_buddha Год назад +3

      They still :)

    • @razerPh
      @razerPh 11 месяцев назад +4

      @@CHICKENmcNUGGIESMydude you're the one who's complaining about it. Hahaha. The 3600 is still the best budget processor right now 😂 the next in line was the 5600

    • @eltonmusg3910
      @eltonmusg3910 9 месяцев назад +1

      RX6700 10GB or Ryzen 5500 better value/performance

  • @heseman5726
    @heseman5726 Год назад +37

    From what I've seen and read online, the real benefit of the 5800X3D vs. other 5000-series CPUs is the better 1% lows, which create a smoother experience, not so much raw performance. Personally I'd not go back to non-3D V-Cache models; the 5600 is a great budget CPU though if you're already on the AM4 platform.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +3

      Yeah, its something I for sure need to record in the future.

    • @RobBCactive
      @RobBCactive Год назад +1

      But there are some games, often strategy titles, that really love the cache.
      Overall, having switched from a 5600X, there's a clear difference with the 5800X3D, which also has a more power-efficient stepping.
      Still, as I ended up rebuilding so my niece could have my 5600X for data analysis work, I should have gone with AM5, as the marginal cost is only €50 now if you need RAM & mobo too.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      @@RobBCactive I also enjoyed the new platform, built my girlfriend's PC with a 7700X and it feels butter smooth, hard to describe without feeling it yourself, but I would totally recommend it for a future build.

    • @TurboD16z6
      @TurboD16z6 9 месяцев назад

      Also, in 6 years when an RTX 6050 comes out, you won't be bottlenecked.

    • @mahmoudrabya585
      @mahmoudrabya585 6 месяцев назад

      I've had this combo of a 6800 XT and 5600X for a year now, and it works perfectly in triple-A games at 1440p with slightly lower than usual 1% lows. The CPU will only bottleneck you in highly demanding online games like Fortnite.
      I was skeptical at first about whether my CPU would bottleneck my GPU, but that didn't happen after testing more than 20 games.

  • @Pete856
    @Pete856 Год назад +5

    Nice to see realistic testing. I get sick of all the reviewers using an RTX 4090 at 1080p... it proves their point about CPU and RAM speeds, but does nothing to tell you whether you really need to buy the best CPU with the fastest RAM when you only have a mid-level graphics card.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      True! Hardware Unboxed does have a pretty decent video about how you can take their testing results and figure out what cards are good with what CPU, but sometimes I feel like it's good to have some hands-on footage for proof.

    • @SomeFrenchDude
      @SomeFrenchDude Год назад

      ​@@LegitCamelRetro hey, do you know which video that is?

  • @adamgroszkiewicz814
    @adamgroszkiewicz814 Год назад +48

    Went from a R3-1400 to an R5-5600 a couple of years ago. Currently paired with a 3060ti, primary use is single player gaming at 1440p. Still a solid CPU today.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      very true, nice set up!

    • @kumajin6207
      @kumajin6207 Год назад +4

      I'm confused, I thought the 5600 launched in 2022

    • @valerioocioccolanti227
      @valerioocioccolanti227 Год назад +3

      @@kumajin6207 it is 2022, idk, that man came from the future

    • @KotchcusDomesticus
      @KotchcusDomesticus Год назад +1

      @@kumajin6207 It is older ryzen 5600x from 2020 without the letter.

    • @wrusst
      @wrusst Год назад +1

      I did this for a guy last week lol , great upgrade

  • @strickerarts
    @strickerarts 8 месяцев назад +6

    Sweet, I was thinking of getting a radeon 7900xtx, just wanted to see how well it does with ryzen 5 5600. Looks great just have to adjust settings per game for performance.

    • @oscannail274
      @oscannail274 4 месяца назад

      If your CPU isn't good enough for a game, you'll not be able to do much about it in settings.
      You'll find about two settings out of all of them that actually matter for CPU overhead.
      This is because you either have a good enough CPU for a scene, or you don't. Dropping your display resolution isn't even a solution.
      Raising your resolution can be a solution if you're happy with the frame rates you're already getting.
      That said, if you already have a 5600, don't run out and replace it.
      You'll want to wait for the 9000 series if you want higher frame rates. The 7900 XTX is not going to be obsolete for a while.
      A friend of mine has put a 5700X in his X370 board in order to wait out the next release cycle for desktop CPUs.

  • @thescryingpool
    @thescryingpool Год назад +10

    One thing to note: the 5600G is the only CPU in the 5600 series that is ONLY PCIe 3.0, the others are 4.0. Just wanted to point that out in case there are others like me who assumed it was also 4.0 and missed that in the specs.

    • @DPLS77
      @DPLS77 Год назад +3

      there are benchmarks showing that when it comes to gaming, the performance gain you get with 4.0 is almost negligible. PCIe 4.0 becomes important when you are running multiple graphics cards or playing at higher resolutions like 4K.
      Edit: PCIe 4.0 begins to matter when your GPU has more than 16GB of VRAM. If you had a 4090 (24GB GDDR6X) limited to PCIe 3.0, your bandwidth would be reduced to ~16GB/s, maybe even 8GB/s depending on the mobo and CPU, instead of the ~32GB/s that PCIe 4.0 x16 offers. However, most current games do not fully utilize PCIe 3.0 x16, so PCIe 4.0 isn't going to be relevant until games start needing more than 16GB/s of bandwidth.

    • @thescryingpool
      @thescryingpool Год назад +1

      @@DPLS77 True, but if you want to enable things like ReBar you need a PCIe 4.0 compatible processor. Even if B550 supports it, the processor does not.

    • @DPLS77
      @DPLS77 Год назад

      @@thescryingpool I don't think so, I have resizeable bar enabled on a amd 5000 series 3.0 cpu

    • @thescryingpool
      @thescryingpool Год назад +1

      @@DPLS77 You have a 5600G? I couldn't find any info online that confirms if ReBar is possible on a 5600G, save for a few people saying SAM works. I was specifically wondering about ReBar. If you could confirm that, that would be great!

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Interesting info! Thanks for sharing that, I had no idea.
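
For context on the bandwidth figures discussed in this thread, here is a small sketch of the usual theoretical PCIe link-rate arithmetic (one direction, accounting only for 128b/130b line encoding); real-world throughput is somewhat lower.

# Sketch: theoretical one-direction PCIe bandwidth per generation and lane count.
GT_PER_SEC = {"3.0": 8.0, "4.0": 16.0, "5.0": 32.0}  # transfer rate per lane (GT/s)
ENCODING = 128 / 130                                  # 128b/130b line encoding (Gen 3+)

def pcie_bandwidth_gb_s(gen, lanes=16):
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    gbits_per_s = GT_PER_SEC[gen] * ENCODING * lanes
    return gbits_per_s / 8  # bits -> bytes

for gen in ("3.0", "4.0", "5.0"):
    print(f"PCIe {gen} x16 ~ {pcie_bandwidth_gb_s(gen):.1f} GB/s")
# Prints roughly 15.8, 31.5 and 63.0 GB/s, which is where the ~16 GB/s (Gen 3)
# and ~32 GB/s (Gen 4) figures quoted above come from.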

  • @konstantinlozev2272
    @konstantinlozev2272 Год назад +5

    The benchmark on Cyberpunk 2077 is a GPU test mostly. To check CPU bottlenecking in Cyberpunk 2077 you have to get in high crowd density in the busier parts of Night City.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Ah fair enough, might have to change that up then for cpu benchmarks in the future, thanks!

    • @konstantinlozev2272
      @konstantinlozev2272 Год назад +1

      @@LegitCamelRetro no problems 😁

  • @WidePhotographs
    @WidePhotographs Год назад +38

    This video confirms I made the right choice to go with the 5600x in December for $150 brand new rather than the 5800x3d for $350. Knowing myself, I will most likely upgrade my 3060ti in 2024.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Yeah for sure, there are some competitive games that are a little less GPU-bound that I could have tested, like CS:GO, but for the most part the 5600 still gets excellent frame rates in those too.

    • @chrismansour5191
      @chrismansour5191 Год назад +1

      @Arnoud van Lieshout 💀

    • @alkine3011
      @alkine3011 Год назад +6

      ​@Arnoud van Lieshout 4090 Saves Lives 🤣....At least in your case

    • @tarkitarker0815
      @tarkitarker0815 Год назад

      @UT2004 well say goodbye to your nostrils after sniffing 40g.....

    • @tarkitarker0815
      @tarkitarker0815 Год назад

      @UT2004 nah dude, its not about bad cut, its about remains of gasoline,benzol and concrete. and thats the LEGIT CLEAN way to make cocaine, the dirty way contains things you dont wanna hear about.

  • @omulamfibie
    @omulamfibie Год назад +7

    Thanks, I was actually looking for a good video about gpu bottleneck on 5600/5600x. I have a 5600x and won't be afraid to get a 6900xt or 3080 for high 1440p gaming now. I have a 3060Ti atm, but next year I might want to upgrade to a better gpu and 6900xt looks extremely good value/performance(especially since I couldn't care less about RT).

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Sweet! I am glad you found the video useful haha thank you!

  • @ivanbogdaue
    @ivanbogdaue Год назад +10

    Excellent video, exactly what I was always wondering about. I especially like the resolutions you chose.
    The only thing missing is a higher-end Nvidia card, because they seem to have higher CPU usage.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +4

      Thanks! Yeah I have heard that, I am too poor to get the higher end Nvidia stuff haha but maybe if the channel gets big enough I can add it to the benchmarks in the future

    • @nitinnetworker
      @nitinnetworker Год назад

      I have a 2080 and a Ryzen 1600. Which CPU would you recommend to eliminate the bottleneck?

  • @Angel7black
    @Angel7black Год назад +16

    This is what I've been saying for a while: any CPU right now that's around Intel 10th-gen K series / Zen 3 is enough for really any GPU on the market at its optimal resolution. There are a lot of people who are waiting for Zen 4 X3D for some reason just to go to AM5, when they're way better off saving money with AM4 or Alder Lake/Raptor Lake and a DDR4 board. That, or wait for Zen 5 when core counts go up and hopefully newer AM5 and Zen 5 CPUs can handle faster RAM than 6000 MHz DDR5 to make the extra costs matter. If you're on an AM4 board right now, the only real reason to upgrade away from AM4 is to get out of PCIe gen 3.0 or a bad 300-series board. I think CPUs are hella outpacing GPUs right now for gaming at any resolution that makes sense. For instance, you have to just hate money to buy a 4090 for anything other than 4K gaming, it literally makes no sense for even 1440p, and I've seen even a 3600 perform decently enough with a 4090 at 4K. By that I mean I think it's still noticeably faster than a 4080 at 4K, meaning it's technically not completely pointless.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Yeah totally agree, esports titles are the outlier with super high refresh rate monitors hitting the market which requires a pretty high end cpu but thats really the only case.

  • @LeitoAE
    @LeitoAE 9 месяцев назад +1

    I think you are not understanding what a CPU bottleneck really is and how it works. You can not check % usage the way you do for a GPU or RAM, at least not as overall CPU usage. If you have an 8-core CPU and a game uses only 1 core, but at 100%, while the rest sit at 0-5%, you will see around 10% overall CPU usage in Afterburner.
    What would be better is to watch every single core, its clock, and if possible every single thread in Afterburner, and see if any of them reaches 100% usage. And still, if a game is not well optimized you can have a situation where no core hits 100% utilisation but the game stutters because of too small a cache, a bad instruction mix, etc.
    Core clocks could also show us why the X3D was in some areas worse than the non-3D CPU - maybe it was because of core clocks?
    I had a 1700X and an RX 570 4GB. I changed the CPU to a 5700X and gained only 1 or 2 FPS in Forza Horizon, but the gameplay is finally smooth now. I don't have the annoying stutters that made me think my GPU was dying. It was always the CPU that was causing the problem, and according to those methods of testing it was at around 10% CPU usage, with the CPU temp at 40°C. You would think it was doing nothing.
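
The point above about overall CPU usage hiding one pegged core is easy to check directly. Below is a minimal sketch that logs per-logical-core load using the third-party psutil package; the 90% "pegged" threshold is just an illustrative choice, not anything from the video.

# Sketch: log per-logical-core CPU load, since a single overall percentage can
# hide one core sitting at 100%. Assumes the third-party psutil package
# (pip install psutil); the 90% "pegged" threshold is only an illustration.
import psutil

def watch_cores(samples=30, interval=1.0, hot_threshold=90.0):
    for _ in range(samples):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)  # blocks for `interval` seconds
        overall = sum(per_core) / len(per_core)
        pegged = [i for i, load in enumerate(per_core) if load >= hot_threshold]
        print(f"overall {overall:5.1f}% | per-core {per_core} | pegged cores: {pegged or 'none'}")

if __name__ == "__main__":
    watch_cores()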

  • @misot90
    @misot90 Год назад +3

    Running a 5600X and a 6700 XT.
    Pretty satisfied with the performance. I'm starting to save up for a new PC that I'll be getting in 5-6 years from now so I can have as much value for my money.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Not a bad idea, maybe by then pricing will normalize lol

  • @GB-zi6qr
    @GB-zi6qr Год назад +27

    Nice video. Thanks for showcasing just how good the R5 5600 is. I'd have to look back through the original reviews but, as I remember, the 5600X was really meant for the entry to mid-tier gaming PC. The 5600 will net about 85%-95% of the performance for less money. Depending on the silicon lottery, I'd guess it's possible a 5600 could beat the 5600X.
    I think we all know the X3D chip was designed to be the ultimate gaming CPU for its generation.
    Again, nice video.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +4

      Thanks very much! Yeah with the 5600 being so close in performance to the 5600x it feels like a no brainer, I was really impressed with it in testing and for the cost it is probably the best bang for buck CPU out there right now.

    • @macymorse80
      @macymorse80 Год назад

      I got the 5600x a month after it dropped at retail thankfully, but have it all core OC at 4.2 ghz and still don’t see any real gains with it yet but it’s nice to have.

    • @4olaka
      @4olaka Год назад

      @@macymorse80 all cores at 4.2 is leaving a lot on the table. Undervolt it without changing clocks and it will boost all cores to 4600 MHz instead of maxing at 4.3 or 4.4. With stock settings I get all cores at 4.3 in Cinebench R23, and with the undervolt they all run at 4.6 and give a better score at the end. Consider it :) Sorry for my English.

    • @macymorse80
      @macymorse80 Год назад

      @@4olaka not sure how to undervolt it. There was some software where you could do it but I tried and all I got was even more confused lol.

    • @4olaka
      @4olaka Год назад +1

      @@macymorse80 trough bios, nothing complicated

  • @surfx4804
    @surfx4804 Год назад +8

    I have a 5600 and it benchmarks almost the same as my 5800X up to a 3080 GPU.
    What made me jump up to a 5800X3D was scoring a 4090. It was just a direct CPU swap, no hassle, and the alternatives are not really much better.
    There was a difference, it was not massive, more like games were smoother.
    Also, with the 5800X3D you can undervolt the cores using some software called PBO2 Tuner. Mine is locked at -30 on all cores.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +4

      Yeah, in the future I'll add frame times and 1% lows to the tests, I'm sure both of those are better on the 5800X3D, leading to a smoother feel.

    • @ij6708
      @ij6708 Год назад +4

      PBO2 and curve optimizer can be on any Zen 3 processor including 5600

    • @surfx4804
      @surfx4804 Год назад

      @@ij6708 It can but the point is the 5800x3d is locked and so does not allow this. The PBO2 tuner tool will get around that side. Though I think some motherboards may have a bios update that also lets you do it.

  • @grzegorzkowalski3097
    @grzegorzkowalski3097 Год назад +6

    Great video. I wonder how these CPUs age, because in 4 years the difference could be more noticeable. I know that it is impossible to measure it right now...

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Yeah its always hard to know, but I think current hardware might age quicker than expected because of the new console generation allowing bigger developers more room to push graphics.

  • @justincowans2677
    @justincowans2677 Месяц назад

    Thanks for doing the 1440p and 4K testing.
    These are the benchmarks that some of us look for but can't get from the other channels.
    One big thumbs up from me.

  • @packetcreeper
    @packetcreeper Год назад +6

    Still so happy with my 5600x and it's coupled with a 7900XTX. I get silly frames on ultrawide 1440p. Planning on moving to 4k this year so this video reassured me I'll be fine.

  • @mattiasimone6929
    @mattiasimone6929 8 часов назад

    Bought a 5600X in 2021, best decision I've ever made for that PC. Still going strong after 2 GPU swaps, and from what I saw it'll be able to handle a 7800 XT with no problem. So I can still do another GPU change with no need to upgrade anything else.

  • @GLDragon93
    @GLDragon93 Год назад +5

    Was planning to wait for AM5 to come down in price, but it doesn't seem like it will. So I've decided to spend 330€ by the end of the month on an AM4 system built around the R5 5600, reuse my RX 5600 XT (currently paired with an old i7 4770), and probably upgrade the GPU in the future, although I'll likely keep this system for at least 5 more years and maybe hunt down a used 5800X3D later down the line.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      I actually think thats a pretty good move right now, pricing is crazy and it will be a really sick PC for much less cost.

    • @DaCat1337
      @DaCat1337 Год назад

      i got the 5600xt and will upgrade to am4 tomorrow, how has the system been for you so far?

    • @GLDragon93
      @GLDragon93 Год назад

      @@DaCat1337 Hey, as far as system goes, the overall experience is amazing. Went from a mechanical drive to a gen4 Nvme and boot time is about 10-15 seconds, so fast that if I'm not careful I need to restart to enter bios.
      Haven't tried really demanding titles, but at 1080p the first Plague Tale runs at 120fps with mostly high settings (some on medium, because I don't like some effects). Genshin Impact at 1080p medium- high (same reason as above), x1,3 rendering and SMAA the gpu sits at max 70% in the most demanding areas, while hovering in the 40-50%. Age of Empires 2 DE at high settings runs at 400-550 fps uncapped.
      Previously with the i7 4770/Xeon E3 1240V3, couldn't get past 70fps on Plague Tale, Genshin had pop up issues, on top longer loading time and delays for things to start, While AoE couldn't go past 120fps with very frequent dips below that (thus I locked it at 62fps). Cpu was almost always above 40% usage, with frequent spikes to 70-80% . Haven't seen Ryzen going over 35%, other than AoE in late game session.
      Paired my cpu with a budget Thermalright Assassin 120 SE (5 pipes), and with an undervolt of -50mV, it stays amazingly cool and quiet , idle is about 28°, while barely going over 60° under stress test. Mind you I have an old but decent case with good airflow (3 fans, 2 for push&pull and 1 for the gpu area) and lots of space inside (corsair air 540)
      One thing I noticed is that as opposed to intel, idle power consumption is quite high. My Xeon at low usage and idle sipped power (from 15 to 28w) while the R5 5600 at idle draws about 25w and 40w in low load scenarios despite the undervolt (can't UV any further as I didn't hit the silicon jackpot with mine)

    • @DaCat1337
      @DaCat1337 Год назад +1

      @@GLDragon93 Im glad it works so well for ya! cant wait to build my pc tomorrow :D

    • @GLDragon93
      @GLDragon93 Год назад

      @@DaCat1337 Hope everything goes well with your new build!

  • @crabjockey
    @crabjockey 7 месяцев назад +1

    very helpful for me. I'm teetering between the 7600 and 5600 but I run 1080P and don't intend to break the bank with a new GPU soon.
    I'm headed for the 5600 and saving the dough for when I build my next Rocket on the AM5 platform.

  • @DGCastell
    @DGCastell Год назад +3

    I can excuse my skew towards the X3D because I'm still on AM4 and plan to stick with it the most I can, so in the near future I'm gonna upgrade my 2700X with it. Also I do 1080p high FPS gaming mostly.

  • @magmer8427
    @magmer8427 Год назад +2

    Thank you so much! I wasn't sure if the 5600X would be enough for games, which is why I almost decided to build a PC with the 7600X for $200 more!

  • @giorx5
    @giorx5 Год назад +10

    5600 with tuned RAM can utilise properly any GPU below $1000 at 1440P max details. A great CPU for its price now ($120-130), especially with nice RAM costing a bit above $100 for 32GB.

    • @theHardwareBench
      @theHardwareBench Год назад +2

      Pretty much any CPU will perform better with ram at the right clock speed and timings, the memory controller makes even more difference. 32GB will cause extra latency at the bleeding edge, 16gb is faster until games need more.

    • @raghavtripathi564
      @raghavtripathi564 Год назад

      It would be better to just buy a 5800X3D, which is guaranteed to give solid performance on trash-tier 3200 CL16 RAM, rather than spending a huge amount on tight RAM and then putting in time and effort tuning it on a 5600 to gain less overall performance.

    • @starstreamgamer3704
      @starstreamgamer3704 Месяц назад +1

      True. Tuned RAM can provide ~15% performance improvement in CPU demanding gaming scenes (in case of 5600) in comparison with stock 3200Mhz/CL16 kit. While 5800X3D gets just 2-3% bump due to RAM tuning, since big L3 cache almost eliminates memory bandwidth bottleneck.
      And DDR4 RAM with decent overclocking potential is pretty cheap, just a little bit more expensive than cheapest DDR4 3200MHz/CL16. So in terms of price/performance ratio 5800X3D was and still is much worse than 5600 with tuned RAM. Even much cheaper 5700X3D can barely beat 5600 in this regard, since it costs 50-70% more, but is just ~15% faster.

  • @viqraabrar
    @viqraabrar Год назад +2

    Omg this is a very great video for 5600 owner like me, thank you so much!

  • @bournechupacabra
    @bournechupacabra Год назад +3

    Why does the 5600 perform so much worse on Spiderman 1440p RT with the 6900xt compared to 6800xt? Is that a mistake?

  • @mikem2253
    @mikem2253 Год назад +2

    I have a 5600 paired with a 7900xt and have no complaints. I’m capped at 120fps max on my LG C1 and that CPU gets me there or past it, so much so that I need to use Radeon Chill pretty often to cap frames.

  • @infinity2z3r07
    @infinity2z3r07 Год назад +10

    Always thought if CPU consistently pegged GPU 95-100% then you've maxed the GPU, but these scenarios showed potentially huge fps differences even at very similar GPU utilization. It's gonna take a while to understand that. Maybe X3D magic? lol
    Thanks for doing the video 👍

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Some people have mentioned it in the comments before, but frame times can make a pretty large difference; I couldn't get them working for this video, but that could be a reason. Also some games just run weird, requiring more CPU power for frames. Thanks for watching it!

    • @jasonluvisi
      @jasonluvisi Год назад

      There can be huge differences in fps between 95% GPU usage and 100%. 99 out of 100 times though if you are at 99% or 100% GPU the performance will be identical on both processors. Something weird is going on with these benchmarks though as the memory usage is wildly different. Memory usage should be fairly close to the same if all you are comparing is CPU.

    • @tilapiadave3234
      @tilapiadave3234 Год назад

      LMAO X3D magic ,, only magic it has is EMPTYING YOUR WALLET ,, waste of money way WAY over-priced EXTREMELY overhyped cult cpu

    • @jasonluvisi
      @jasonluvisi Год назад +3

      @@tilapiadave3234 It's definitely not the best value. But I wouldn't say it's overhyped... considering anybody who built an AM4 system 6 YEARS AGO can drop the 5800X3D in their system and instantly have one of the best gaming systems in 2023. Of course that's assuming they have a modern GPU already. It's pretty amazing what AMD was able to pull off. But clearly the 5600 is much better FPS per dollar.

    • @tilapiadave3234
      @tilapiadave3234 Год назад

      @@jasonluvisi Anyone wanting to upgrade their AM4 AND has INTELLECT selects the r5 5600 (non x ) at WAYA less than HALF the price. I can also see a reasonable argument for the 5700x with it's new discounted price

  • @DavidBSRK
    @DavidBSRK Месяц назад +1

    I've got an R5 5600 and an RX 6800 and it's not a very big bottleneck

  • @kaedeschulz5422
    @kaedeschulz5422 Год назад +2

    Really didn't expect those results! I guess I will stick with my 5600X for a good while longer! Especially as mine seems to be a rather good sample, being able to clock very high at a low voltage.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      oh yeah for sure, it will get by for a while yet

    • @kaedeschulz5422
      @kaedeschulz5422 Год назад +1

      @@LegitCamelRetro And then get a 5800X3D for real cheap later hehee. A cheaper upgrade path than moving to AM5 right away instead of getting the best chip for the platform you already have, I imagine. Then when AM5 gets cheap, move over. No way I'm gonna pay like 300 euros for an ITX board alone.

  • @EJPlayz22
    @EJPlayz22 4 месяца назад

    Just built a rig last night with a Ryzen 5 5600X and an RTX 3080 Ti. Running really well. That CPU is all-around good.

  • @Qbens-ez7gq
    @Qbens-ez7gq 7 месяцев назад +3

    So I can buy an RTX 3060 Ti for a Ryzen 5 5600? At 1080p?

  • @gamersanonymous4me131
    @gamersanonymous4me131 4 месяца назад

    Nice video, thanks. The biggest mistake gamers make when watching the glory of the tested GPUs is forgetting that when reviewers test a GPU they always pair it with the highest-performing CPU to push the GPU, which shows large FPS differences, and when they test CPUs they always use a 4090 (the highest-performing GPU) and again show large differences. But who is using a 4090 or a top-line CPU like the R7 7800X3D? Those large differences go away when using mid-level CPUs and GPUs, and while some differences and better experiences may be had in some games, the averages often even out. Having a balanced PC will always get you the best price/performance PC. I was using a Ryzen 3600 and RX 5700 XT (lower mid-range GPU) and upgraded to an R7 5700X, and the game was noticeably smoother, had a boost of 10-40 FPS, and cost me $190 w/tax on my AM4 B450 mobo. If I wanted a significantly higher boost I would have to invest in an AM5 mobo, DDR5 RAM, a high-end GPU like the 6800, and a better CPU, for maybe $1K.
    The thing that changed gaming forever is upscaling software (FSR and DLSS), which instantly improved FPS for mid-range systems, where only true purists need top-of-the-line components for the extra visuals (running the highest graphics settings). To their credit, a lot of testers add in production-work capability (like running all cores at 100% in Cinebench to see if you save time doing video editing and rendering) if that is something you do (like I do). The 5600 is a beast for gaming/price, and slightly edges out my undervolted R7, but they are very close except for production work.

  • @Salty_Nutella
    @Salty_Nutella Год назад +3

    I've got my 5600 on PBO +200 MHz, auto V-core, and temps never go above 70°C on a $20 ID-Cooling air cooler under hours of gaming. Just with this PBO setting it's scraping the single-core performance of a 5800X. Lucky me.

  • @emmanuelgeorge7199
    @emmanuelgeorge7199 8 месяцев назад

    I only just saw this video. I have a Ryzen 5600 with an RX 6800, running on a Fractal 550W Gold PSU. Runs perfectly and efficiently, the PSU blows out cold air. Putting this here in case anyone is wondering what PSU is needed, although bigger would be better for the future.

  • @divertiti
    @divertiti 4 месяца назад

    Some constructive feedback to make the results actually meaningful:
    1. Average fps tells you nothing; you need to look at the 1% and 0.1% lows to see the impact of the CPU on frame-time consistency and how smooth the gaming experience is
    2. The 7900 XTX isn't really that fast; you need a 4090 to remove the GPU from the equation as much as possible

  • @weyo14
    @weyo14 Год назад +3

    You should add 1% lows next time there’s a difference between 6 and 8 cores

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Yeah thats true, I've got a couple improvements in mind for the next one

    • @peterpan408
      @peterpan408 Год назад +1

      Smoothness 😎

  • @juancarloscastrejon3417
    @juancarloscastrejon3417 Год назад +2

    I just purchased a 7900 XT and I already have a 5600G (deactivated the integrated graphics already)… I have a 32” Samsung Odyssey 5, which is 1440p. All the games are running at ultra with at least 75 fps, but of course I don't feel I'm getting all the juice. What CPU should I get to reach at least 100 fps? Yes, I usually play single-player games, rarely competitive. Is a 5600X ok? Or should I get something a little bigger, like maybe a 5700X? Thank you!!

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      I would just get the 5800X3D, it's not very expensive and it's the best CPU you could slot into your current build without needing to upgrade the entire platform.

  • @George4815162342
    @George4815162342 Год назад +10

    This comparison video is exactly what I was searching for. Great work, thank you!

  • @theHardwareBench
    @theHardwareBench Год назад +10

    What you should have done is run the games with the card at stock and then overclock the cards to see the difference. A bottlenecked card will show little to no improvement when you overclock, and often it will perform worse. A stronger CPU will allow you to measure how much the card is bottlenecked by a weaker one, as long as that too isn't bottlenecked when you push the card clocks. You need to find the limits with a benchmark before testing games. I can see I'm going to have to do a video showing exactly how I do it, as the other overclockers like Buildzoid and der8auer aren't into games. Benches don't lie and the results transfer very well to the games I use for testing.

    • @rageofheaven
      @rageofheaven 11 месяцев назад

      Uh, the 6800XT was running at 2.4 ghz. It will overclock itself unless you limit it in software.

    • @theHardwareBench
      @theHardwareBench 11 месяцев назад

      @@rageofheaven I didn't know that, thanks. The cards I have boost but if you OC manually they will go further.

  • @vladislavkaras491
    @vladislavkaras491 7 месяцев назад

    Did not know that DLSS & Ray tracing would affect CPU usage, that was something new to learn!
    Also, as others have mentioned, the 1%, .1% fps are important factors, thanks for including it in your future videos!
    Cool to know that my CPU is still and will be viable for a while!
    Thanks for the video!

  • @h1tzzYT
    @h1tzzYT Год назад +6

    The 5800X3D is good, but it starts to crumble when ray tracing is introduced; you can check Hardware Unboxed's Spider-Man Remastered CPU comparison video for more info. It shows that either ray tracing overwhelms the 5800X3D's cache advantage, forcing it out to main RAM like its regular counterparts, or ray tracing just doesn't really care about CPU cache. So the future of the 5800X3D seems quite interesting, if you can call it that, as more and more games start using ray tracing at various degrees of complexity.
    Overall the 5800X3D is a very solid CPU but imo it's a bit overrated; if you can get a 5600X or 5800X for a significantly cheaper price than the 5800X3D, I would consider them instead, or just go straight for DDR5 stuff like the 7600X or i5 13500.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +4

      Ah yeah I sort of forgot about that, I have a 7700x and 13700k build I should probably use in the future, I just kind of felt for this video in particular using the 5800x3d would be interesting since they are on the same platform

    • @h1tzzYT
      @h1tzzYT Год назад

      @@LegitCamelRetro oh yeah for sure👍

  • @nervsouly
    @nervsouly Год назад +3

    I put the 7900 XTX in a machine with a R5 5600X, but I'm playing in 4k at all times and cap the frames at 60. I won't have any problems, right?

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      I mean you will for sure have some minor bottlenecking, and it kind of depends on whether you are turning ray tracing on or not, but for most games you will be fine. I wouldn't sweat it too much lol, you are leaving some performance on the table, but as the benchmarks showed with the 7900 XTX it isn't much at 4K.

    • @mastroitek
      @mastroitek Год назад +1

      The 5600X should not have problems pushing 60fps, but if it does, you will know. When the CPU starts to struggle delivering at least 60fps you will probably experience frame-time spikes (which are quite noticeable, especially when gaming at 60fps). To be clear, in any scenario there will be a certain amount of frame-time inconsistency, but these should be much more apparent. In other words, consider upgrading the CPU only when you feel like it is breaking your gaming experience/immersion.
      (For example I had a 7700K, and in Watch Dogs Legion I had an avg of 75fps but clear 1% lows that went down to < 35fps; after upgrading to a 12600K the 1% lows became > 55fps)

    • @nervsouly
      @nervsouly Год назад

      @@LegitCamelRetro yeah thankfully I'm not into ray tracing because I barely see a difference. From the benchmarks I mostly read there are losses above the 60 FPS mark that I wouldn't even notice.

    • @nervsouly
      @nervsouly Год назад

      @@mastroitek ohh that makes sense. Thank you.

  • @MrVizzle
    @MrVizzle 9 месяцев назад

    Smart test. I recently recommended someone the 7600 and he was not sure if the CPU would be too weak for 1440p with a mid-tier card. Now I need to show him this test.

  • @timothypattonjr.4270
    @timothypattonjr.4270 Год назад +8

    I understand you want to keep things to realistic use cases, but if every time you increase graphics card performance you also increase resolution or graphics settings, it's obviously going to minimize any performance limitations on the weaker gaming CPU. Keep up making videos dude. I also want to mention Hogwarts Legacy, which recently came out. It looks great but man, it is a system slayer on GPU, CPU, and RAM.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Thanks! Yeah I feel like the main point I was trying to get across in this video was that at 1440p and 4K specifically, having the best CPU sometimes doesn't matter as much as you would think, but that's why the MW2 1080p benchmark was tossed in there, because it shows how at those lower settings and resolutions you will start to bottleneck really badly.

  • @Litt02
    @Litt02 Год назад +1

    Won't be needing a CPU upgrade, I'll just stick with my 5600. I will however upgrade my GPU instead, from a 3060 Ti to a 4070, to see real performance gains.

  • @flyinhawaiian9174
    @flyinhawaiian9174 Год назад +3

    I apologize if you already knew this, but you don't mention RAM at all. Tuned RAM will have an impact on the 5600's performance. Just turning on XMP without doing anything else will give worse performance. Setting the Fabric to Linked in the BIOS will help you hit high RAM speed such as 3800. 4x8GB sticks will give it a 10% boost over 2x8GB. The X3D chips allow you to set XMP and forget it due to the larger amount of L3 cache.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +2

      Yeah I probably should have mentioned it, I just went with a cheap 2x8gb 3200 mhz cl16 kit with just XMP enabled for the 5600 and a 4x8gb kit for the 5800x3d. I wanted to give a realistic cheap build bench for the 5600.

    • @flyinhawaiian9174
      @flyinhawaiian9174 Год назад +2

      Last I looked you could pick up 2x8GB Patriot Viper Steel CL19 4000MHz on Amazon for 72 bucks. That RAM is Samsung B-Die which is the best for overclocking.
      You shoulda changed the RAM config around and given the 5600 a little boost. Honestly, the X3D couldn't care less about RAM. Still, I understand what you were looking to achieve. Just wanted to point out that there's more performance to be had with the 5600.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      @@flyinhawaiian9174 Oh yeah for sure, and ddr4 prices are so cheap right now, I honestly didn't realize how low they have gotten! lol

    • @AshtonCoolman
      @AshtonCoolman Год назад

      Excellent point. Primary timings on the memory kit will stay the same but sub-timings can vary wildly from CPU to CPU due to memory training. With Samsung B-die you can manually set certain parameters that'll work with different CPUs and their memory controllers.

    • @theHardwareBench
      @theHardwareBench Год назад +2

      @@AshtonCoolman Yeah but where do you draw the line? I could get gen 1 Ryzens to work better than any of the RUclipsrs in 2017, but getting everything properly optimized takes time. Not to mention every individual game and program will respond to different tweaks. Kaby Lake Intels will fly to the moon with the RAM and memory controller set right, along with a sweet overclock on the core, but most people are terrible at setting up that platform to be optimal. The video is about a GPU bottleneck at the end of the day, not how to tune a CPU and RAM.

  • @45eno
    @45eno 4 дня назад

    When people simply ask a bottleneck question without providing other information like the game, detail settings, resolution, and fps target, you know they don't understand bottlenecks.
    If someone targets the highest fps at lower resolution and quality settings, then even mid-range GPUs can easily be CPU bottlenecked.

  • @Largallama
    @Largallama Год назад +3

    Looking at the benchmarks, it's not just about the FPS; I noticed that about 95% of the time the usage and temps were lower for the 5800X3D, so that means less power for the same or more FPS.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Very true!

    • @robertkubrick3738
      @robertkubrick3738 11 месяцев назад

      Nope. One is a 105W processor that can go higher even without overclocking and the other is a 65W processor that might reach 72W when it's not overclocked. The usages are in % and it's not an even playing field.

  • @_.Egor._
    @_.Egor._ Год назад +1

    Great video! I now have a 5600 + RTX 3080 and am thinking about switching to a 5800X3D.

  • @MarginalSC
    @MarginalSC Год назад +4

    Not too surprising. Not every game benefits from the extra cache in the 5800X3D, and the 5600x can make up some ground by virtue of its higher boost clocks which the 5800x3D can't do.

    • @andrewfarrugia2688
      @andrewfarrugia2688 Год назад

      That was the non X 5600 CPU, slightly lower clocks and it was still keeping up

    • @MarginalSC
      @MarginalSC Год назад

      @@andrewfarrugia2688 Doesn't really matter. Can still boost.

    • @Turtizzle
      @Turtizzle Год назад

      @@andrewfarrugia2688 the X just means it can be overclocked. Non-X still has boost clock from factory.

    • @ianlennon-puthoff7613
      @ianlennon-puthoff7613 Год назад

      @@Turtizzle the 5600 can also be overclocked. i overclocked mine to match the 5600X lol its the same chip pushed to higher frequencies

  • @Unbornchikkenable
    @Unbornchikkenable 26 дней назад

    Great video! I bought my 1440p widescreen system and went with the R5 5600 and a 7800 XT, and it's amazing. The alternative was something on AM5 with a weaker GPU because of budget, and I'm glad that I took AM4 with a cheap but decent CPU. Much more fps for the budget. And in a few years I can still go with a beefier GPU and get the 5800X3D. So one upgrade is probably alright on this system.

  • @zfk1000
    @zfk1000 4 месяца назад

    This is a great video. It ultimately shows that for the majority of gamers a 5600 and by extension the 3600 is more than sufficient and it’s questionable whether they should jump to AM5 if they are on a limited budget.

  • @wilazn
    @wilazn Год назад +1

    This is exactly what I was looking for. Even have the same exact 6800xt thanks!

  • @roythunderplump
    @roythunderplump Год назад +2

    Very helpful when picking CPUs, ram size, GPU combos.

  • @PrinceVegeta-ep8sr
    @PrinceVegeta-ep8sr Год назад +1

    this is the closest video to exactly what ive been looking for. But the info is there. i just bought a 4080 and have a 5600x 240 hz 2k monitor. I think i need to upgrade hahaha

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Yeah I would recommend it, that performance level at that resolution probably needs something a little higher end to not bottleneck.

  • @panjipewe
    @panjipewe Месяц назад +1

    Ryzen 5600 is still the best CPU for performance to money...
    Compatible to high end GPU...
    Cheap RAM, Cheap Motherboard...
    The best for 1080p and 1440p gaming...

  • @bobwreck3775
    @bobwreck3775 4 месяца назад

    I love my AMD 5600. It kicks butt. I have it paired with a 6700xt VC and play games 1440p 2k. Nothing but love.

  • @dumbzero
    @dumbzero 8 месяцев назад

    I really needed to watch this video. I have the 5600 and I was going to upgrade only my GPU to a 7800 XT, coming from a 6700, and this video showed me that I can even go to 1440p without any issues. I might have to buy a new monitor lol

  • @Maksimov1337
    @Maksimov1337 9 месяцев назад +1

    My 5600x runs 4.7 GHz all core and compared to the 5800X3D at 1440p there is hardly any difference in all the games I actually play, so for the cost difference honestly not worth the hassle. Running an 11 year old Noctua NH-D14 as well, never see over 60 degrees on the CPU.

  • @MrMarrok657
    @MrMarrok657 Год назад +1

    if i remember right, aren’t cyberpunk and spiderman AMD gpu favoring titles?

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Actually for Cyberpunk it's the opposite, that's why Nvidia pretty frequently uses Cyberpunk to show their benchmarks in all the slide shows, it's very Nvidia-optimized. Spider-Man I don't think really favors either brand.

    • @MrMarrok657
      @MrMarrok657 Год назад

      @@LegitCamelRetro Ah gotcha. this was something I heard in a short, you never know what’s true with them anymore.

  • @RemyBDP
    @RemyBDP 11 месяцев назад +1

    Been using a 5600X and a 7900 XT and it has been a great experience, to be completely honest.

  • @RagnarokCo
    @RagnarokCo Год назад +2

    18:24 THANK YOU! The scene needs to evolve. If you are an esports professional that needs 240+ fps then you are sponsored or a niche market, I would love more benchmarks on the higher resolutions and details.

  • @br4713
    @br4713 Год назад +2

    Thank you for this demonstration. I play very often with XPlane (flight simulator) which is very cpu demanding. But even with this game I don't think it's worth it to upgrade from my 5700 to the 5800x3D. I'll buy the best Nvidia card I can afford, and change the cpu/ram/motherboard later (maybe 7800x3D?).

  • @siegtv9305
    @siegtv9305 9 месяцев назад

    r5 1600 to 3600 now with 5600x paired with gtx 1070 / rx 5700 / rx 6700xt still with the same motherboard, great value.

  • @havochowl6766
    @havochowl6766 Год назад +2

    I got this BNew 5600 OEM tray type version for $107.43 via online during sale ☺

  • @Physics072
    @Physics072 11 месяцев назад +1

    You have to turn the settings way down to see a CPU difference with a 3060 Ti. Sure, if you max it out then yes, you can move the load to the graphics card. An honest review without an agenda would have done it both ways.
    With a 3080 or 4070 in Shadow of the Tomb Raider the 5600X3D is 30+ fps faster at 1080p and 1440p. But if you run an old GTX 1070 the CPUs will appear similar. 5600X3D all the way no matter what GPU, as it will have fewer micro stutters, which aren't talked about in this video at all.

    • @LegitCamelRetro
      @LegitCamelRetro  11 месяцев назад

      Feel like you missed the point of the video. There are plenty of benchmarks out there that compare genuine CPU performance by lowering settings; it's a "will my CPU bottleneck this GPU" video and I am not going to just force a bottleneck, most people want to run games on high settings. I didn't have any agenda and I ran a multitude of quality settings. I am sure the 5600X3D is a great CPU but this video released well before that was ever created.

  • @vellocx
    @vellocx 11 месяцев назад

    I recently upgraded to the 5800X3D from the 5600X. Star Citizen greatly benefitted from that.

  • @l3ulldozer
    @l3ulldozer Год назад +2

    If you play Tarkov or Flight sims you get way better performance with the 5800x3d

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Yeah the extra V cache depending on the game can make a real big difference

  • @anthonybradway8925
    @anthonybradway8925 Год назад +1

    So basically stick with my 1080p setup Ryzen 5700x, RX6650XT, since all I do is play COD MW2? It’s absolutely hard to stick with what I have because I see so many deals now on higher end stuff and I want to buy it cause of the deal and not cause of the necessity.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      You can still reach high fps at 1440p depending on how much higher you want to go, what fps are you shooting for?

    • @anthonybradway8925
      @anthonybradway8925 Год назад +1

      @@LegitCamelRetro that’s the thing, I’ve never really let that be an issue. I’ve just wanted to at least match my monitor with mid to ultra settings. Some COD settings are off or low due to my need for gaming properly. My monitor is 1080p 144hz 1ms response time and my fps average is around 125. I can get it around 155fps with fidelity turned on.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      @@anthonybradway8925 You could get a gpu upgrade like the 6800xt and consistently get 144fps+ for your 144hz monitor

    • @anthonybradway8925
      @anthonybradway8925 Год назад

      @@LegitCamelRetro I've thought about that and upgrading to 1440p, but when I heard your comment about 1080p competitive play it made me feel ok about staying at 1080p.

  • @tarik8300
    @tarik8300 3 месяца назад

    Very useful video! So it turns out that for a 3060 there is no point in buying a processor better than the 5600? Well, if the priority is multiplayer and occasionally single-player games.

  • @tjadams8
    @tjadams8 5 месяцев назад

    Nice, I always assumed the CPU wasn't quite as important as some make it out to be. Especially since a lot have left 1080 behind...

  • @Bdot888
    @Bdot888 Год назад

    This makes me feel better about my 5700x. I need not upgrade until I make the jump to AM5 in a year. Currently on a 4070ti at 1440p

  • @justincowans2677
    @justincowans2677 Год назад +1

    Thank you for this video. I understand why reviewers use 1080p for CPU reviews, as that eliminates the GPU as a bottleneck and shows the difference in actual CPU performance.
    What gets forgotten is that if I'm looking at high-end GPUs, 6800 and up or 3080 and up, showing 1080p performance isn't relevant.
    We would like to know how it plays at 1440p.
    Thanks for giving the people what we want!😀

    • @UTFapollomarine7409
      @UTFapollomarine7409 Год назад

      not for a 3060, buddy, this video is kind of obsolete, the 3060 is the bottleneck even at 1080p

    • @peterpan408
      @peterpan408 Год назад

      @@UTFapollomarine7409 Indeed. Some of us cannot afford/won't spend the money for the higher cards these days.
      Some of us have been long-term 60-class customers despite Nvidia trying to force us up the stack 😉

  • @RobiLemos10
    @RobiLemos10 Год назад +2

    Sadly my Ryzen 3500X isn't enough even on a 1440p 144Hz monitor, so I switched to a 5600X instead. I was deciding between that and the 5800X3D, but unless you play at 1080p 240Hz or something like that... it doesn't matter at all.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Yeah, I am inclined to agree; from my experience it does well at 1440p

  • @hyd.e
    @hyd.e Год назад +1

    Exactly what I was looking for… subbed, and hoping to see a reversed CPU scaling comparison with low- and high-end GPUs

  • @kotarojujo2737
    @kotarojujo2737 6 месяцев назад

    Upgraded from a 3100 to a 5600; really surprised by the massive improvement in RDR2.

  • @mhameemad9683
    @mhameemad9683 2 месяца назад

    I was searching for a video like this bro
    Thank you so much

  • @bmdshred77
    @bmdshred77 Год назад +1

    I just bought a 7900 XT for $779, and Micro Center told me not to bother upgrading my CPU. I have a 5600 and I'm glad I didn't. It rarely bottlenecks at 1440p, and when it does, the fps is so high it doesn't matter

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      Oh yeah, I totally agree with them. Nice setup!

  • @6FootVampire
    @6FootVampire Год назад +2

    I'll probably get the 5600 since the cooler is included, and I'll pair it with the 6650 XT

  • @edhikurniawan
    @edhikurniawan Год назад +1

    Maybe games like Factorio, Stellaris, Cities: Skylines, or RimWorld would give different results. Honestly, that's the only type of game I play, which makes most benchmarked games kind of irrelevant to me.

  • @rayh91
    @rayh91 7 месяцев назад

    I went from a 5600X to a 7800X3D, and at 1440p (RX 6800) I didn't notice much of an FPS increase, but Cyberpunk has been noticeably smoother, which made it worth it for me.

    • @b3as_t
      @b3as_t 6 месяцев назад

      The X3D chips have their biggest impact on the 1% lows, which is probably why the game feels smoother

  • @chinesepopsongs00
    @chinesepopsongs00 Год назад +1

    You run CPU bottleneck tests at 1080p because it shows how many FPS (if the game is uncapped) the CPU can push. The CPU will be able to push about the same at 1440p and at 4K; at the higher resolutions you just have more chance of hitting the GPU limit.
    So if you watch a GPU test for the games you like at the resolution you want to target (let's assume 1440p), you know which GPU to buy to get the desired FPS at 1440p. Then you can watch CPU benchmarks at 1080p with the most insane GPU the reviewers have, and you want a CPU that can push at least a little bit more than what you targeted when choosing the GPU. Voila, now you've got a balanced system.
    One minor detail: when you look at CPU reviews, reviews that use the same brand of high-end GPU as the one you chose produce a slightly more accurate comparison. This is due to NVidia doing more work in the GPU driver as opposed to AMD; however, this difference is not huge. Watch this from Hardware Unboxed as they describe the same process: ruclips.net/video/Zy3w-VZyoiM/видео.html
    What will you experience with a GPU bottleneck? You will be limited in framerate, and when there are big changes in the scene the framerate will change accordingly; for example, going from inside to a bigger outside area will lower FPS. However, the game will feel as smooth as it can at the given FPS.
    What will a CPU bottleneck feel like? Besides changes in FPS due to the scene, things like more AI enemies will also impact max FPS, but the biggest problem with a CPU bottleneck is that frame times will be inconsistent. Meaning while average FPS can still be decent, the 1% lows will be worse; your game will stutter and feel bad.
    Tuning a game down for a weaker GPU (turn off the eye candy) is much easier and more effective than tuning for a weak CPU, although some games like Cyberpunk have settings that target the CPU, like crowd density.
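
    The "1% lows" mentioned above are commonly computed as the average frame rate of the slowest 1% of frames (capture tools differ slightly in the exact method). Below is a minimal Python sketch of that calculation, assuming you already have a frame-time log in milliseconds; the sample numbers are made up purely to show how a couple of stutter spikes drag the 1% low down while the average still looks fine:

      # 1% low calculation from a list of frame times (milliseconds).
      # The sample data is illustrative only, not from the video's benchmarks.
      def avg_and_one_percent_low(frametimes_ms):
          avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
          slowest = sorted(frametimes_ms, reverse=True)         # slowest frames first
          worst_1pct = slowest[: max(1, len(slowest) // 100)]   # at least one frame
          low_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
          return avg_fps, low_fps

      if __name__ == "__main__":
          sample = [6.9, 7.1, 7.0, 6.8, 25.0, 7.2, 7.0, 6.9, 30.0, 7.1]  # two stutter spikes
          avg, low = avg_and_one_percent_low(sample)
          print(f"average: {avg:.0f} fps, 1% low: {low:.0f} fps")  # ~90 fps avg, ~33 fps 1% low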

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Yeah, I watched that video when it came out and it was really helpful and informative. I kind of just put this video out there to streamline things so people can see it in action during the benchmarks. Nice comment though, I hope more people read it to help them make an informed decision when choosing a setup!

    • @peterpan408
      @peterpan408 Год назад

      The question is when it will bottleneck...
      Provided you are not chasing HFR and are aiming for a practical level of detail, most CPUs are fine.
      However, smoothness is another question entirely. Balancing the number of cores/threads and the performance per core/thread is the way to 'smoothness'.
      4-core CPUs definitely stutter... are 6-core CPUs stuttering already, and if not, when will they?
      Time for 8 cores??
      Maybe we need those extra cores to handle NVidia's driver overhead 🤣

  • @ramananyt4762
    @ramananyt4762 Год назад +1

    Bro, the seller recommended an R5 5600X with a GTX 1650... Should I buy a GTX 1660 Super and an R5 5600 instead, or go with what was recommended?

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      I would get a higher-end GPU. It depends on your budget, but the 1650 is about as low-end as I would recommend nowadays.

    • @ramananyt4762
      @ramananyt4762 Год назад

      @@LegitCamelRetro Bro, I have a choice... 1) I can go with a 1660 Super and a low-budget CPU like a 10th-gen i3, or 2) I can go with an R5 5600X and a GTX 1650. My budget is about 65k, so which should I choose?

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад +1

      @@ramananyt4762 Man, that's kind of tough; I guess I would go with the 1650 and the 5600.

  • @COLOFIDUTI
    @COLOFIDUTI Год назад

    I've got the 5600 for a little over $100 and was searching for a bottleneck video, as I plan to upgrade my 6600 to a 6800 XT. Thanks to you, now I know that even a 6900 XT would pair really well with it 😂

  • @jmac79ers
    @jmac79ers Год назад +2

    I'm late to the party here, but thanks for this. I finally pulled the trigger on an RX 6650 XT to run with my 5600G, and I didn't think it would bottleneck, but you pretty much confirmed that it won't

  • @avif007
    @avif007 Год назад

    So, bottom line, I'm in the clear pairing a 3090 with a 5600X, especially at 1440p/2160p resolutions (I mostly play VR anyhow). Thank you for the video!

  • @mre8ballgaming842
    @mre8ballgaming842 Год назад +1

    This was very interesting... I have a 5600 and was curious how far I could push a GPU upgrade before needing to upgrade the CPU... and now I have my answer

  • @oogieboogie9341
    @oogieboogie9341 Год назад +2

    This video was very helpful, Thank You.. You Got A Sub!

  • @sonytest5601
    @sonytest5601 3 месяца назад

    I have a system with 5600x CPU and 6800XT GPU. I was wondering if there is any reason to upgrade my CPU and this video gave me exactly the answer I needed. Thanks!

  • @wyc603
    @wyc603 Год назад +1

    Amazing video, this is exactly what I was looking for.

  • @junilog
    @junilog Год назад +1

    I'd take that TL;DR: it doesn't really fully bottleneck, since it's more up to the games to optimize how much they use the other cores? The usage seems pretty much the same, and they're not even close to using the CPU at 80%. Looks like most video games favor single-core performance over the total package, especially beyond 6 cores.

    • @LegitCamelRetro
      @LegitCamelRetro  Год назад

      Yeah, that is true, but optimizing games is more nuanced than it seems, and I don't think developers are going to put resources into utilizing more cores anytime soon. If you looked at a more detailed CPU utilization chart, you would probably see the first 2-4 cores at close to 100% usage while the others sit idle, creating the CPU bottleneck.
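
      A quick way to see this on your own machine is to log per-core utilization while the game is running. Here is a minimal Python sketch using the psutil library (assumed to be installed separately via pip; the 90% threshold is an arbitrary marker for illustration, not a figure from the video):

        # Print per-core CPU utilization over a short sampling window.
        # A few cores pinned near 100% while the overall average stays low
        # is the classic sign of a game limited by its main threads.
        import psutil

        def sample_per_core(interval_s: float = 1.0) -> None:
            per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
            for i, pct in enumerate(per_core):
                flag = "  <-- near saturation" if pct > 90 else ""
                print(f"core {i:2d}: {pct:5.1f}%{flag}")
            print(f"overall average: {sum(per_core) / len(per_core):.1f}%")

        if __name__ == "__main__":
            sample_per_core()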

  • @afaqahmed43
    @afaqahmed43 9 месяцев назад

    Lovely video. I was thinking of pairing a 6800 with a 5600; I knew it wouldn't bottleneck, but I was curious. I wish you'd included more 1080p results though. Regardless, cheers