The Evolution of CPU Gaming Performance, AMD vs. Intel

  • Published: Jan 11, 2025

Comments • 935

  • @jacobvriesema6633
    @jacobvriesema6633 3 years ago +231

    This whole series has been so cool to see. Thanks for going the extra mile as a tech channel. Your content, explanations, visuals, etc. are always top-notch!

    • @katech6020
      @katech6020 3 years ago +9

      I hope we get a GPU version

    • @Hardwareunboxed
      @Hardwareunboxed 3 years ago +33

      @@katech6020 GPUs aren't really possible to do this with as older models don't have enough VRAM and can't use the latest APIs. So you'd have to test with old games where the newer GPUs aren't fully utilized. Basically, you can't get useful data spanning that many generations.

    • @katech6020
      @katech6020 3 years ago +10

      @@Hardwareunboxed You're right, it won't be possible. Thanks for the reply, I love the content.

    • @theyellowlemon
      @theyellowlemon 3 years ago +1

      @@Hardwareunboxed I'd really love to see a performance-per-watt benchmark with many different GPUs when power limited. I still think you could get useful and quite interesting data even if only going back to 2015's 980 Ti or maybe even 2013's 780 Ti. It would be cool to see a variety of cards when limited to 150W, 200W, etc. It would even be interesting to see just this current generation's cards, since TDPs have increased so substantially.
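      To illustrate what such a power-limited efficiency test would report, here's a minimal sketch in Python; the cards and numbers are hypothetical placeholders, not measured data:

      ```python
      # Sketch of a performance-per-watt comparison at a fixed power cap.
      # All figures are hypothetical placeholders, not benchmark results.
      results = {
          # card: (avg_fps_at_cap, power_cap_watts)
          "GTX 980 Ti": (71.0, 200.0),
          "RTX 3080": (138.0, 200.0),
      }

      for card, (fps, watts) in results.items():
          print(f"{card}: {fps / watts:.3f} fps per watt at {watts:.0f} W cap")
      ```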

  • @JarrodsTech
    @JarrodsTech 3 years ago +596

    It's cool seeing how far behind AMD was in 2012 and then slowly climbing back up from 2017.

    • @MarkLikesCoffee860
      @MarkLikesCoffee860 3 years ago +54

      It's also cool that enough people bought AMD CPUs when they were far behind. Otherwise, AMD wouldn't have had enough money for R&D to make Ryzen.

    • @dalefrancis9414
      @dalefrancis9414 3 years ago +34

      @@MarkLikesCoffee860 Kudos to those people. They are the real winners now.

    • @hemenduchowdhury2648
      @hemenduchowdhury2648 3 years ago +2

      Hey brother, I have been following you for a long time now. Could you please tell me whether or not I should consider buying the ASUS ROG Strix LC 6900 XT? Please reply...

    • @nipa5961
      @nipa5961 3 years ago +24

      @@Pillokun AMD is beating Intel on all fronts while still being cheaper most of the time. Bringing back competition was the best thing AMD did for all customers. Otherwise we would still only see Intel 4C/8T CPUs at steadily increasing prices each "generation".

    • @patrickleeastarjohnson2918
      @patrickleeastarjohnson2918 3 years ago +3

      @@hemenduchowdhury2648 Just get an Nvidia GPU. I got a reference 6900 XT; tons of driver timeouts.

  • @sopanroy1506
    @sopanroy1506 3 years ago +130

    AMD's comeback is pretty motivational, I would say.

    • @wargamingrefugee9065
      @wargamingrefugee9065 3 years ago

      Cue Sinatra singing "High Hopes".

    • @bsh7390
      @bsh7390 3 years ago +4

      *TSMC's comeback

    • @rattlehead999
      @rattlehead999 3 years ago

      Considering AMD was better from 1982 to 2006, I'd say they were pretty great for the majority of CPU history.

    • @calv19
      @calv19 3 years ago +4

      To be fair, AMD's so-called failure of the FX processors allowed me and many, many more to actually afford to get into PC gaming.

    • @rattlehead999
      @rattlehead999 3 years ago +3

      @@calv19 The FX 6300 from 2012 to 2017 was a complete soldier; it ran cool and efficiently as a lower-power chip, and it played everything at 60+ fps, up to Battlefield 1 64-player multiplayer.

  • @fran117
    @fran117 3 years ago +72

    I didn't realize Haswell's performance jump was that big. I remember when Haswell came out there was a lot of negativity about it being only like a 5% performance bump over Ivy Bridge.

    • @chronicle_4
      @chronicle_4 3 years ago +7

      Me too. But then, this was the beginning of the end for DDR3, which dampened Haswell's general appeal. Add to that the fact that everyone knew Intel would just release a new socket the following year, and Haswell became unappealing to some.

    • @dondraper4438
      @dondraper4438 3 years ago +20

      That's why they always test CPUs with the most powerful GPU on the market: the fastest GPU around should give us a good idea of CPU bottlenecks.
      Turns out the fastest GPU at the time was bottlenecking Haswell's gaming performance.
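      The commenter's logic can be captured with a simple limiting-factor model: the delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain. A toy sketch with made-up numbers:

      ```python
      # Toy bottleneck model: delivered fps is capped by the slower component.
      # The numbers are illustrative, not benchmark data.
      def delivered_fps(cpu_ceiling: float, gpu_ceiling: float) -> float:
          """Frame rate is limited by whichever component is slower."""
          return min(cpu_ceiling, gpu_ceiling)

      # With a slow GPU, two very different CPUs look identical (GPU-bound):
      print(delivered_fps(cpu_ceiling=90, gpu_ceiling=60))    # 60
      print(delivered_fps(cpu_ceiling=140, gpu_ceiling=60))   # 60

      # Re-testing with a much faster GPU exposes the CPU difference:
      print(delivered_fps(cpu_ceiling=90, gpu_ceiling=300))   # 90
      print(delivered_fps(cpu_ceiling=140, gpu_ceiling=300))  # 140
      ```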

    • @dondraper4438
      @dondraper4438 3 years ago

      @Steve Dave Yeah, I don't think a major YouTuber would tell us how to disable data harvesting through the BIOS. Just my 2 cents though.
      So much performance was lost thanks to the Spectre and Meltdown mitigations.

    • @fran117
      @fran117 3 years ago +6

      @Steve Dave Oh that's right, totally forgot about that. I do remember pre-Haswell was A LOT more heavily affected by the patches. I'm also not sure if he mentioned it; I usually just time-skip through the graphs.

    • @Kynareth6
      @Kynareth6 3 years ago +3

      We were used to seeing +60% every year... it slowed down to a crawl.

  • @shinseitaizen
    @shinseitaizen 3 years ago +149

    A great history lesson. Good way to show someone new to tech why AMD became so popular.

    • @eth_saver
      @eth_saver 3 years ago +9

      AMD became popular because TSMC is dominating the entire market, not because of their amazing chip design. It would be nice to see in these graphs what node each generation is using. But that would just show the truth: AMD gained most of its performance by using better and better nodes, not better architecture. Zen 3 was the only architecture change that actually helped performance in some way; everything before was just because AMD's manufacturer got better, not AMD themselves. It's the same with 3D cache: it's not AMD's technology, it's TSMC who is leading the market.

    • @iChocoCookie
      @iChocoCookie 3 years ago +6

      @@eth_saver That's completely wrong. It doesn't matter if you shrink the transistors; the video even showed them at the same clock speed. Shrinking the node lets you increase clocks, but all the IPC improvements come from architecture, not the node shrink.

    • @eth_saver
      @eth_saver 3 years ago +2

      @@iChocoCookie A smaller node does not just mean you can increase clocks. It matters a lot more: when you shrink stuff, everything is closer together and you get lower latency and better performance, even at the same clock speed with the same architecture, and you can also fit much more cache into the same space, etc. Or you can improve "the architecture" by adding more execution units, but you do that because everything is smaller and you can add more features into the same space without hurting latency. It's not some magical Apple or AMD architecture; they simply have a new node which is better than the competition's, so they can do more with it. Intel was stuck on 14nm far too long, so they didn't have much opportunity to improve like AMD. Once Intel gets EUV production working, things will start to change rapidly again. Keep in mind that the 14nm process dates back to 2014, which is really ridiculous considering it's almost 2022.

    • @maxwellsmart3156
      @maxwellsmart3156 3 years ago +1

      @@eth_saver You could say that about Intel, if Intel hadn't bought off OEMs not to buy AMD product. AMD was stuck on 32/28nm while Intel was on 22/14nm: was it Intel's architecture or their process advantage? And Intel had the better process during Zen and Zen+, but AMD still killed Intel in multithreading. Basically you have no idea what you're talking about.

    • @eth_saver
      @eth_saver 3 years ago +1

      @@maxwellsmart3156 I am not sure what you are comparing, an 8-core Zen to some 4-core Intel CPU? Sure it was faster in some multithreaded applications, but the cores were not faster. Also, since Zen+ the node AMD used was about the same as what Intel had. With Zen 2 and now Zen 3 their node is superior and suddenly their chips are "better" again... this is not because of their amazing architecture.

  • @CuttingEdgeRetro
    @CuttingEdgeRetro 3 years ago +170

    What a journey it's been. For those of us who have been around even longer, seeing the rise and fall of AMD since the '90s, it's been a great period in personal computing. But for this video, I wish you had also stuck a Phenom II 1090T in the mix, as those can still outperform the 8350 in some scenarios. Awesome work, Steve. Maybe now an AMD vs. Nvidia 10-year GPU history lesson?

    • @shaneeslick
      @shaneeslick 3 years ago +13

      Yeah, there are so many people praising AMD as a hero with 'If it wasn't for AMD...' comments,
      but being 50 and having watched home PCs evolve, it was also AMD's lack of competition with FX after Athlon II and Phenom that caused the Intel stagnation.

    • @DanielFrost79
      @DanielFrost79 3 years ago +8

      @@shaneeslick Exactly.
      If it wasn't for AMD we would still only have those shitty quad-core CPUs from Intel, maybe 6 cores, but I digress.
      We should really thank AMD for pushing forward and not stopping at their first Ryzen series.
      Because of AMD, Intel went 'oh gosh darnit' and got out of their anti-consumer hole with the quad-core CPUs. They were not bad, but thanks to AMD we got bigger and better CPUs.
      EDIT: Just realised my CPU (8700K) is almost 4 years old.

    • @shaneeslick
      @shaneeslick 3 years ago +8

      @@DanielFrost79 I think you missed my point: it was also AMD's fault that Intel had no need to progress past 4c/8t in the first place.

    • @DanielFrost79
      @DanielFrost79 3 years ago +1

      @@shaneeslick I know. I just had some flashbacks and did not add the negative points, the comment being enormously long already. 🖖
      As you said, they had no reason to go further at that time because of AMD.

    • @Kestrel626
      @Kestrel626 3 years ago +13

      @@shaneeslick Don't forget Intel's dirty tricks when AMD was ahead, which did so much damage to AMD; Intel spent billions trying to put AMD out of business.

  • @SteffenMeyer101
    @SteffenMeyer101 3 years ago +120

    Interesting to see how the Intel graphs stagnate after Skylake, while with each AMD generation the progress is clearly visible. Shows that Intel didn't expect AMD to come back like that, and how 14nm was running out of juice.

    • @innocentiuslacrim2290
      @innocentiuslacrim2290 3 years ago +5

      @@djp3637 This is a good description of the issue.

    • @whiffa5438
      @whiffa5438 3 years ago +1

      @@djp3637 That's interesting, I've never heard that view before. Thanks for sharing!

    • @lucidnonsense942
      @lucidnonsense942 3 years ago +12

      @@djp3637 You are omitting the fact that Intel's 10nm process would have been viable if all they had to do was make small 4-core dies. What killed 10nm's viability was AMD providing competitive 8-cores for a lower price. Then AMD moved to a chiplet architecture, which made 8-core+ designs viable on a node shrink, using a process which (at the time) had yields about the same as Intel's 10nm.
      TL;DR: Intel's 10nm did not "fail," AMD killed it. If Intel had been allowed to keep dumping small 4-core dies on consumers and charging insane prices for higher core counts, 10nm would have worked out as expected. It never became viable because AMD kept twisting the knife.

    • @thejollysloth5743
      @thejollysloth5743 3 years ago +2

      @@lucidnonsense942 It could have worked if Intel had managed to release 4/6 cores (which would really be a higher core count due to performance and efficiency cores) on 10nm, instead of attempting to chase AMD in core count and number of SKUs. I think it would have been absolutely fine for the first generation of a new architecture, especially if they had managed to retake the gaming crown by, say, a minimum of 10% over AMD's low/mid-range CPUs at the time. If they couldn't yet perfect the larger die sizes for the i7s and i9s, they could at least have produced something worth buying for consumers, instead of giving us literally zero gaming performance uplift (and in many cases a regression) from 10th gen to 11th gen.
      I realize 11th gen had an IPC uplift which made it better for tasks other than gaming, and the 11600K caught up to the 5600X near enough in most cases, but it hasn't really been a success, has it? In fact the 11900K was roundly mocked by pretty much everyone for being worse than the 10-core 10900K in many tasks.

    • @TropicChristmas
      @TropicChristmas 3 years ago

      @@drunkhusband6257
      This is true. These performance gaps evaporate at 4K; I think even a 2600 was matching the 9900K with a 2080 Ti. Basically, no one with a modern processor is all that CPU-limited anymore. I have a 10400, and I absolutely don't sweat not having an 11-series or Zen 3 processor.

  • @FaceI3ss
    @FaceI3ss 3 years ago +49

    So holding onto my 6700K was a good idea! Hope new CPUs show a larger jump.

    • @innocentiuslacrim2290
      @innocentiuslacrim2290 3 years ago +7

      I switched from a 6600K just last year. If I had a 6700K, I might have stuck with it for one more year.

    • @nossy232323
      @nossy232323 3 years ago +2

      Me too! Planning to upgrade to Zen 4 or the Intel alternative next year.

    • @MGB2408
      @MGB2408 3 years ago

      Not for new games at 1080p. My 7700 was the bottleneck in COD. No more with my 9900K.

    • @tyre1337
      @tyre1337 3 years ago +1

      @METhOdhowtoKilLnoObS Depends on the GPU and resolution.

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 3 years ago +1

      For some games it does result in stuttering, especially if you want to have background apps open.
      At the same time, even a 2500K is still great for most games today.

  • @Two49
    @Two49 3 years ago +40

    This is such a good video. Everyone talks about Intel's stagnation but I've never seen it laid out this clearly.

    • @Hirnlego999
      @Hirnlego999 3 years ago +1

      Same thing happened with MS and IE when they dominated. Things stagnated badly.

    • @LBCAndrew
      @LBCAndrew 3 years ago +2

      It's not laid out clearly though. This comparison removes a massive factor which is Intel's 20% frequency lead. Those graphs would look completely different had it just been restricted to 4 cores.

    • @Hardwareunboxed
      @Hardwareunboxed 3 years ago +11

      How would they look completely different? Zen 3 is still faster in most games than 10th and 11th gen when all parts are stock and when overclocked.
      By 'completely different' do you mean Intel wouldn't have seen stagnation from Skylake, while AMD wouldn't have seen massive uplifts from the FX series through the Zen series? If anything, stock frequencies would show slightly, and I mean only slightly, greater improvements for Intel and much larger improvements for AMD.

    • @nktslp3650
      @nktslp3650 3 years ago +1

      @@Hardwareunboxed I assume he meant that the frequency increase was a big factor in Intel's gaming leadership for a long time. No one is questioning AMD's great comeback, but the comparison is biased because it doesn't take core frequency into account; it only shows the IPC increase, which doesn't look as good for Intel as for AMD.
      The same comparison at stock turbo boost frequencies would be a better comparison of both companies' processors. Still, AMD's comeback is just breathtaking.

    • @RandoBurner
      @RandoBurner 3 years ago +1

      @@Hardwareunboxed Intel's 4770K was better than a Ryzen 2700X... 4 years later. So why upgrade? Ryzen was better just for cost/performance. Only this gen do they hold the best-performance trophy and... they upped the prices, so who really cares then? I don't care who I get it from, I only care about prices, and prices seem to have increased.

  • @APU-iGPU
    @APU-iGPU 3 years ago +45

    Incredible analysis Steve 👍👍
    The Zen 2 architecture was the true "Zen" moment for AMD.
    Excited to see how the Zen 3D processors perform 🤩

  • @Fina1Ragnarok
    @Fina1Ragnarok 3 years ago +20

    Wow. I currently have a Ryzen 1600 and have a 5600X on the way. The improvement over my 1600 is nearly the same as my 1600's over an old Bulldozer/Piledriver CPU. Crazy how much AMD was able to accelerate their progress in the last 4 years.

    • @bsh7390
      @bsh7390 3 years ago +1

      Thanks to TSMC

    • @Luredreier
      @Luredreier 3 years ago +5

      @@bsh7390 Not just TSMC, most of these changes are due to architecture.

    • @Luredreier
      @Luredreier 3 years ago +3

      Yeah, I've got an FX 8350 and an R7 1700 system myself.
      I love them both.
      The 8350 was never a good gaming CPU, but you could do hardware-accelerated virtualization on it, while its price was lower than the i5s that *didn't* have that hardware acceleration.
      Also, the 8350 supported ECC memory; the Intel chip didn't.
      While the performance per core was way, way lower than anything Intel offered, having those cores still proved useful in some situations.
      And while the 3770K performed way better in games and had way more performance available per thread (when the others weren't hammered), you'd find that if you *really* hammered the cores, the FX's scaling from 4 to 8 cores on its 4 modules was better than from 4 to 8 threads on the Intel cores back then (Haswell cores crushed the FX modules, though).
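      The scaling point the commenter is making can be illustrated with Amdahl's law: the benefit of going from 4 to 8 cores depends almost entirely on how parallel the workload is. A small sketch (the parallel fractions are illustrative assumptions, not measurements):

      ```python
      # Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N), where p is the
      # parallel fraction of the workload. Illustrative values only.
      def speedup(p: float, cores: int) -> float:
          return 1.0 / ((1.0 - p) + p / cores)

      for p in (0.95, 0.60):  # heavily threaded load vs. a typical game of that era
          gain = speedup(p, 8) / speedup(p, 4)
          print(f"parallel fraction {p:.0%}: 4->8 cores gives {gain:.2f}x")
      ```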

    • @andersjjensen
      @andersjjensen 3 years ago +1

      @@bsh7390 A good node helps you nada if you don't have a good architecture. Sandy Bridge to Skylake shows this clearly. Sandy Bridge was very good for its time, but Intel (quite sanely) liked to run architectural revamps opposite of node shrinks, so it wasn't until Skylake that they fully utilized what 14nm could give them. AMD's gains from Zen 2 to Zen 3 tell the same story. I personally don't think AMD could squeeze much more out of TSMC 7nm, but they don't have to, since 5nm is well into mass production and reportedly has yields as good as 7nm at this point.
      Now, I can see that you have spouted this "Thanks to TSMC" reply in several other threads to downplay AMD's efforts, so let me spell it out clearly for you: Intel stagnated so badly on both architectural and manufacturing progress that they lost not only to AMD, but also to TSMC and Samsung. Even Samsung's 8nm is better than Intel's 14nm in terms of efficiency. It's only marginally better in density, but if you were to make a GeForce 3090 on 14nm you'd be looking at well over 700W. *Nothing any of the three have done is extraordinary given the time frame.* It's plainly and simply that Intel dropped the ball hard, for many years in a row.
      Bob Swan the bean counter tried to have his cake and eat it too. It's inarguable at this point, and it's going to go down in history as a case study in why near-monopolies have a tendency to eventually sort themselves out: greed fucks up everything.
      Bob retired with a biiiig golden handshake and doesn't care that he'll be remembered for The Dreaded Decade of the Quadcore. Pat Gelsinger has inherited a kingdom almost in ruins. If Alder Lake doesn't bring MASSIVE gains, Intel will be "the second supplier" within the next five years.

    • @mduckernz
      @mduckernz 3 years ago +1

      @@bsh7390 Writing this in every comment chain doesn't make you more right ;)

  • @JMartinni
    @JMartinni 3 years ago +24

    I upgraded from a Phenom II X6 1090T from early 2011 to a Ryzen 5 2600X in 2018, so I completely missed most of this - and I never really considered Intel given the price premium their CPUs and mainboards usually commanded. Looking at Ryzen 5000 now, I suppose if I want to stay in the same price category I'll still be stuck with a six-core ten years later... :/

    • @chriswright8074
      @chriswright8074 3 years ago +1

      Ryzen 5000 is cheap, tf is ya problem?

    • @koraptd6085
      @koraptd6085 3 years ago +4

      @@chriswright8074 wAt's yA pRoBleM mAte?

    • @talhamalik4910
      @talhamalik4910 3 years ago +2

      @@chriswright8074 Ryzen 5000 is overpriced compared to last gen and to Intel atm (at least in the mid-range).

    • @chriswright8074
      @chriswright8074 3 years ago

      @@talhamalik4910 If we're going by MSRP: you can't afford $300 for a 5600X when Intel was always charging $300 for an i7 all those years? What about the 8600K at $350 MSRP? AMD raised the price $50 over the X models, and there is no base 5000 series like they always had before.

    • @talhamalik4910
      @talhamalik4910 3 years ago +3

      @@chriswright8074 It doesn't matter what Intel was charging back then, because compared to AMD's own previous-generation CPUs and Intel's 11400/11400F, the 5600X is overpriced. The 11400F is just over half the price of a 5600X and is only ~10% slower for gaming. Look at the Hardware Unboxed videos about those CPUs; they agree that Intel is the better value right now because most of AMD's CPUs are overpriced. I am not a fanboy of either brand; I have a Ryzen 5 3600 because it was the best value at the time.

  • @S3lvah
    @S3lvah 3 years ago +8

    Watching this video made me realise that Steve's voice while reading graphs would be well-suited for radio. A smooth, articulate and calming timbre to listen to.

  • @jmv2015
    @jmv2015 3 years ago +23

    I feel better about my aging 6700; I'll rock this rig till it dies. Basically no reason to upgrade atm for me.

    • @SixStringViolence
      @SixStringViolence 3 years ago +2

      Till Windows 11 hits and the first Win11-only games appear on the horizon. ;)

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 3 years ago +2

      @@SixStringViolence Won't happen

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 3 years ago +1

      @@SixStringViolence Plus, a 6700 can run it even if not officially, so no point

    • @НикитаЕремеев-ц6ц
      @НикитаЕремеев-ц6ц 3 years ago

      @@user-wq9mw2xz3j Until some technologies become available only on Win11, like hardware AV1 decoding, together with YouTube's migration to AV1.

    • @Soverax
      @Soverax 3 years ago +4

      @@НикитаЕремеев-ц6ц By the time that happens his 6700 will only be good for heating. Give the man a chance.

  • @greggmacdonald9644
    @greggmacdonald9644 3 years ago +14

    I'd definitely like to see Alder Lake added to this data. It could also be interesting to see how running JUST 4 efficiency cores would do vs. 4 P-cores.

    • @andersjjensen
      @andersjjensen 3 years ago

      But we know the IPC of the efficiency cores... because they're Skylake. We've been watching Skylake for almost a decade... :P

    • @greggmacdonald9644
      @greggmacdonald9644 3 years ago +1

      @@andersjjensen Sure, but it's not Skylake in isolation, we don't know how the rest of the CPU affects it, and we've certainly never paired Skylake with DDR5 before! Also, just how cool will it run with just e-cores, and so forth!

    • @saricubra2867
      @saricubra2867 3 years ago

      @@andersjjensen Skylake cores have hyperthreading and higher clock speeds, something that Gracemont lacks.

    • @saricubra2867
      @saricubra2867 3 years ago

      They were running out of juice a long time ago with the laptops. Every single high-end laptop CPU after the i7-4700MQ has been mediocre; the 11800H is the first CPU since the 4700MQ (Haswell) that supports overclocking bins and doesn't get ridiculously hot.

  • @saricubra2867
    @saricubra2867 3 years ago +2

    What the hell happened to the i9-11900K in Rainbow Six Siege?
    Any explanation?
    That score doesn't make sense compared to the rest of the games, and it definitely had a significant impact on the 9-game average.

  • @jubayerwasidraiyan5874
    @jubayerwasidraiyan5874 3 years ago +3

    These must take so long. Thanks for these amazing tests, Steve; actually putting effort into making content for us plebs is just amazing.

  • @titusrichard4949
    @titusrichard4949 3 years ago +12

    Alder Lake sounds great, but I cannot wait for the new AM5 platform; I think it's gonna launch around the same time as Raptor Lake. That should be a massive performance jump, at least I hope, because we already know how good Zen 3 is, so it can't be worse and can't be just a 5-10% improvement. At least I hope.

    • @perfectdiversion
      @perfectdiversion 3 years ago +3

      According to the "Moore's Law Is Dead" channel, Raptor Lake is supposed to bring a huge leap forward, so it will be interesting to see how AM5 stacks up.

    • @saricubra2867
      @saricubra2867 3 years ago +1

      One Golden Cove core is 40% faster than a Skylake core and 20% ahead of a Willow Cove core from the Tiger Lake processors.

    • @soumen8624
      @soumen8624 3 years ago

      You may have to upgrade your RAM to DDR5 though.

    • @jacobc5747
      @jacobc5747 3 years ago

      @@soumen8624 you WILL need to find some DDR5 somewhere. I don't expect it to be cheap or have good timings for quite some time either.

  • @seylaw
    @seylaw 3 years ago +4

    The gap between Ivy Bridge and Haswell can be at least partly attributed to Spectre/Meltdown, as Haswell supports a newer instruction (INVPCID) which helped to preserve some performance with these mitigations enabled.
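    On Linux you can check whether a CPU exposes INVPCID (and can therefore take the cheaper mitigation path) by reading its feature flags; a minimal sketch:

    ```python
    # Check /proc/cpuinfo for the INVPCID feature flag (Linux only).
    # Haswell and newer Intel cores report it; Ivy Bridge and older do not.
    def cpu_has_flag(flag: str) -> bool:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return flag in line.split()
        return False

    print("invpcid supported:", cpu_has_flag("invpcid"))
    ```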

    • @Kepe
      @Kepe 3 years ago +1

      Didn't those mitigations mainly hurt IO performance in situations where there were tons of reads and writes going on at once? Not so much actual CPU number crunching performance.

    • @monotoneone
      @monotoneone 3 years ago

      Also, Sandy and Ivy Bridge typically use horrible secondary/tertiary memory timings if you use DDR3-2400.

    • @seylaw
      @seylaw 3 years ago

      @@Kepe It depends on the workload, of course, and as most games are GPU-bound anyway, the difference is less pronounced most of the time. The performance drop on Haswell and newer is less severe thanks to the mentioned instruction, but it's very noticeable on e.g. Westmere. And more performance-costing security vulnerabilities kept coming after Spectre/Meltdown, e.g. L1TF etc. - it would be great to revisit this topic (you can disable most of these mitigations via two registry entries; Microsoft has a dedicated site for this).
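      For reference, the two values Microsoft documents live under the Memory Management key; below is a sketch of setting them with Python's winreg (FeatureSettingsOverride = 3 disables the Spectre v2/Meltdown mitigations per Microsoft's published guidance at the time - verify against their current page, as this weakens security; needs admin rights and a reboot):

      ```python
      # Sketch: toggle the Spectre/Meltdown mitigation override on Windows.
      # Values per Microsoft's published guidance; run as admin, then reboot.
      import winreg

      KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

      with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as k:
          winreg.SetValueEx(k, "FeatureSettingsOverride", 0, winreg.REG_DWORD, 3)
          winreg.SetValueEx(k, "FeatureSettingsOverrideMask", 0, winreg.REG_DWORD, 3)
      print("Override written; reboot to apply.")
      ```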

  • @vibhav1011
    @vibhav1011 3 years ago +3

    Ooh so that's Monitor Steve's REAL name - CPU architecture progress benchmark series. Got it.

  • @qlum
    @qlum 3 years ago +8

    I would have loved to see the last Phenom II added, as that would really show how bad Bulldozer was. Granted, game compatibility would sting on that one.

  • @rangersmith4652
    @rangersmith4652 3 years ago +14

    AMD certainly needed something completely different after FX, and we all owe a lot to Lisa Su's vision and her development team for bringing Ryzen onto the scene. To all who are still using an FX chip, in the very mortal words of a certain graphics card company CEO: "It's safe to upgrade."
    I was one of those FX users, mostly based on the 8-core claim and the price, but if I could go back to 2014 (my first ever build), I'd bite the budget bullet harder and go with a 4770K, as that clearly would have been the better choice, though at a 40% price premium. Ah, the beauty of hindsight! The best that can be said about FX is that the series got many gamers by for a few years, keeping them in the hobby when Intel pricing would have shut them out. But since it did that, it was not a heap of crap. Comparatively weak and inefficient? By any measure, yes. But not a heap of crap.

    • @RamonInNZ
      @RamonInNZ 3 years ago +1

      I stayed with my Phenom II X6 1090T till buying an i5 6600/i7 6700 (cannot remember which) in mid-2015, went through to an i5 8600, then swapped to a Ryzen 7 2700 (2018), then a 3700X (2019). I have built Intel machines for various people, but for myself have gone with Ryzen; next upgrade is the 5000 series or next-gen when they arrive.

    • @rangersmith4652
      @rangersmith4652 3 years ago +1

      @@RamonInNZ I've built far more Ryzen PCs than Intel since relegating my FX chip to my growing tech museum, but I'm not particularly brand loyal--with CPUs, there's no "familiarity" or driver advantage to either AMD or Intel. It's about performance in the apps one uses and, of course, price. Power consumption is a drop in the cost bucket where I live, and heat is not an issue with modern CPU coolers.

    • @spectro742
      @spectro742 3 years ago

      "Safe to upgrade": sells GPU, can't buy a new one.

  • @jahjahlive1000
    @jahjahlive1000 3 years ago +13

    Blessings from Trinidad. I never make a purchase of any part unless it has passed through the hands of Hardware Unboxed. I commend you brothers for all your hard work, dedication and creativity. Your channel is amazing and tech information will not be the same without it....

  • @nipa5961
    @nipa5961 3 years ago +2

    Super interesting comparison! Thank you, Steve!

  • @bigbronx
    @bigbronx 3 years ago +4

    Great video! I like the methodology; I only missed power consumption!

    • @gamingunboxed5130
      @gamingunboxed5130 3 years ago +3

      Thanks! The only issue with power is that some parts are heavily overclocked while some are underclocked, so the data would be kind of useless and quite misleading.

    • @bigbronx
      @bigbronx 3 years ago +1

      @@gamingunboxed5130 I thought that power consumption would roughly stay the same or even decrease with newer, more efficient CPUs. Just ignorant intuition as I am no expert. Thank you for the clarification Steve! You guys are awesome.

  • @HDJess
    @HDJess 3 years ago +1

    Thanks for this history lesson. I haven't swapped CPUs that often: I started off with an AMD Athlon XP back in 2000, then had a Pentium 4 and then some Celerons, until I finally got an i3-something (3xxx series, can't even remember the name). My first 'gaming' CPU after those was an i5-6500, which I still keep today in my 2nd PC for multimedia. After that, I got the i5-10600K which I'm using right now, and I don't plan on changing soon.

    • @andersjjensen
      @andersjjensen 3 years ago +1

      Unless you're an e-sports dork who yells bUt HiGh RefREsH gAmINg Is kINg!!!111oneoneone every time someone says "more than a year old" then there should be no reason to upgrade CPU any more often than every 5-6 years. If you care more about game play than pretty graphics then the GPU cycle looks much the same.

  • @807800
    @807800 3 years ago +4

    Nice recap!
    It also shows AMD's progress from Zen 1 to Zen 3 is more impressive than Bulldozer to Ryzen.

  • @MadClowdz
    @MadClowdz 3 years ago +1

    Very interesting work, Steve. Thanks.

  • @MafiaboysWorld
    @MafiaboysWorld 3 years ago +5

    This was brilliant, Steve. Quite apart from showing Intel's stagnation, this is great for anyone who's on older Ryzen, as it gives the ultimate guide to the performance boost from upgrading from Zen 1 to Zen 3. 👍

  • @SwirlingDragonMist
    @SwirlingDragonMist 3 years ago

    Fascinating; I feel you're really coming into your flow in exhibiting the changing winds of the industry. I would recommend playing around with Earth Defense Force 5 as a benchmark: it features hordes of enemies that I presume would be strongly CPU-weighted. Lots of levels spawn hundreds of units all at once right after the level begins, so it would be a convenient benchmark, as you could just shoot the same rock with a bazooka when the voiceover kicks in. It would be especially awesome if you used the air-raider class to bring down huge area-of-effect blast physics on top of a huge number of ants that are then blasted to bits that fly all over the map. Which is visually stunning, and used to drop the frames of my Xbox down to the single digits.
    Some Supreme Commander benchmarks would also be a nice frame of reference. It's an old game, but well known and infamously challenging to run, as many a gamer has experienced what was colloquially called supreme-commander-lag; it may be a relatable measure of how technology has advanced, maybe even highlighting how systems still struggle during big battles and nuke animations. Supreme Commander-type games also have convenient save/reloads, so you could assign orders and save the game, allowing any reload to run on autopilot as a huge wave of fighters all converge into a maelstrom of physics and explosions. Which (for me) is a lot cooler to watch than just a guy running down an empty street. But hey, you guys are doing great! And you clearly put a lot of thought and effort into everything you do, at least when you have the time, ha ha; you guys must be so busy. Thanks for taking the time to read this. Unlock the air-raider class's super weapons and thank me later. ;)

  • @tinfever
    @tinfever 3 years ago +3

    Really enjoyed this. I echo what someone else said regarding testing with and without Meltdown mitigations; that'd be really interesting too. Really cool to quantify just how large an upgrade leaving my 3770K was.

  • @GENKI_INU
    @GENKI_INU 3 years ago

    Perhaps the main reason for the performance jump between Ivy Bridge and Haswell in modern games is that many games apparently use AVX instructions these days, and Haswell greatly expanded on AVX with AVX2 support.
    After that, the jump from Haswell to Skylake makes total sense because of the change to faster DDR4 memory instead of DDR3.
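    As a rough illustration of why wide vector units matter: libraries like NumPy dispatch to SIMD kernels (SSE/AVX2 where the CPU supports them), so element-wise math runs far faster than a scalar Python loop. A small timing sketch; results vary by CPU and NumPy build:

    ```python
    # Vectorized (SIMD-backed) math vs. a scalar Python loop.
    # Illustrative only; timings depend on the CPU and the NumPy build.
    import time
    import numpy as np

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)

    t0 = time.perf_counter()
    scalar = [a[i] * b[i] + 1.0 for i in range(n)]  # one element at a time
    t1 = time.perf_counter()
    vector = a * b + 1.0                            # vectorized kernel
    t2 = time.perf_counter()

    print(f"scalar loop: {t1 - t0:.3f}s, vectorized: {t2 - t1:.4f}s")
    ```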

  • @Vikushu
    @Vikushu 3 years ago +5

    I kind of wish this type of comparison were possible with laptop CPUs, but I know there are just too many variables to make any data accurate enough.

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j 3 years ago

      There are plenty of laptop hardware comparisons

    • @saricubra2867
      @saricubra2867 3 years ago

      The i7-11800H makes desktop chips look like Bulldozer CPUs. It's a really impressive, efficient, and fast chip; it obliterates Zen 3 mobile in extremely CPU-intensive gaming and makes the 10700K and 5600X look dumb.

  • @osgrov
    @osgrov 3 years ago +1

    This is great content; please keep this one alive as generations are added. :)
    I often get questions from friends about whether it's worth upgrading from X to Y, and given how games evolve every year, that question is sometimes hard to answer.

  • @Dj0rel
    @Dj0rel 3 years ago +6

    Maybe the reason Zen is so good is that Bulldozer was so bad. Steamroller and Excavator were mobile parts, which means AMD had to work extra hard to make them more efficient. Maybe what they learned from that transferred to Zen.

  • @Z4CH4
    @Z4CH4 3 years ago

    I don't know what it is, but your intro soundtrack has a very calming effect on me. Always a delight to watch your reviews, etc.

  • @martinbadoy5827
    @martinbadoy5827 3 years ago +17

    This video is like one of those history documentaries on the Battle of Stalingrad and how that was the turning point of the war

  • @TechRIP
    @TechRIP 3 years ago +1

    Well, you already have the template, so keep adding CPUs to it. It was an awesome visual.

  • @jeekie22
    @jeekie22 3 years ago +3

    It would be interesting to see the same comparison but with the clocks unlocked

  • @marekciostek1458
    @marekciostek1458 3 years ago +1

    When I changed from a Phenom II 955 BE to a Ryzen 5 3600 I felt a big improvement, even with my old Radeon RX 470 4GB. I waited a long time to upgrade my PC setup.

  • @phucnguyen0110
    @phucnguyen0110 3 years ago +6

    The monitor name jabs from Steve and Tim are so hilarious 🤣🤣

  • @tomislavkaltnecker6354
    @tomislavkaltnecker6354 3 years ago +2

    Outstanding video, Steve, from every point of view.

    • @SKHYJINX
      @SKHYJINX 3 years ago +1

      Agreed, so much better laid out than Linus and his IPC video.
      Glorious graphs and expertise here; no rush, just good stuff.

  • @quantruong5477
    @quantruong5477 3 years ago +12

    I already knew someone would mention this, but thank you to AMD and Lisa Su for their contribution to the CPU market.

    • @MarkLikesCoffee860
      @MarkLikesCoffee860 3 years ago

      Also thank you to all the people who still bought AMD CPUs when they were slower than Intel's. Otherwise, AMD wouldn't have had enough money to make Ryzen.

  • @VanillaWahlberg
    @VanillaWahlberg 3 years ago +1

    I went from an Athlon XP 3000+, to an Athlon 64 X2 5000+, to an Athlon II X4 640, to an FX-8350, to a Ryzen 1600, 2600, 3600, and now a 5600X.
    I've seen both sides of AMD, the good years and the bad.
    However, they've never stopped trying to innovate for the consumer. They technically made the first actual (native) quad-core processor and the first 64-bit x86 processor. They also didn't gouge customers when they were initially ahead with the Athlon 64 FX, or commit anti-competitive practices (like "convincing" Dell and the like to only use their processors) like Intel did.

  • @zsoltszilagyi4768
    @zsoltszilagyi4768 3 years ago +8

    It is weird to me that no one noticed while editing that Steve refers to all the Intel CPUs as being on the 14nm process. Actually, the biggest gains for Intel in the graphs come in the period up to reaching the 14nm node: Sandy Bridge was 32nm, Ivy Bridge was 22nm, as was Haswell. Broadwell CPUs were the first 14nm parts.
    While Intel was iterating its process every two years, AMD was stagnating on the 32nm and 28nm nodes while working on a new microarchitecture. The same thing happened to Intel when they reached 14nm, up to the current day, trying to mitigate the aging architecture in one way or another, while AMD focused on incremental changes and process node refinements.
    If Tiger Lake or any earlier 10nm CPU had ever become a viable desktop product, we might have seen the tables turning again at that point, if Intel could also have kept in line with AMD's increasing core counts. The architecture itself is good, but it never reached a viable product state. Their saving grace was that AMD was only catching up and was under no pressure to release a sub-optimal product, as opposed to when Bulldozer came out, when AMD had to act because Phenom II was already lagging behind once the Core i series appeared.
    I see no good things coming our way, and I wish for a surprise, whether 12th gen or Zen 4. In the current market, making money is even more important given the limited supply, and that will work against R&D. The only saving grace is that neither of the two vendors has a definite lead, so if the market stagnates it will not really turn into a monopoly again.

    • @zsoltszilagyi4768
      @zsoltszilagyi4768 3 years ago

      @VERY EVIL PERSON FROM ILLUMINATI I think you made fair points, and your explanation extends what I was trying to say.
      Had Intel brought Tiger Lake to desktop, it would have been a power hog, and on 10nm it would have caused more problems with heat density over a smaller footprint. Based on some reports, AMD is having a similar experience cooling-wise with some of their parts. On the server side Tiger Lake is also not available; the Ice Lake parts are the latest release, and they still don't offer a competitive product against EPYC. Beast Canyon, or however it is called, uses the highest-TDP Tiger Lake part, which is still a binned mobile CPU.
      As for AMD being fabless, it's both an advantage and a curse, but one cannot imagine that you can just take the 32 or 14 or whatever-nm design and transpose it to a new node/new manufacturer if they could secure a deal. It all needs adaptation and engineering effort. We might have seen the best they can offer with Zen 3 and its Zen 4 evolution for a while. I see them winning more by going back to the APU route for desktop parts if the shortages persist; the Cezanne parts perform exceptionally, and with RDNA it could be just what we need, while Intel tries to make a name for themselves in dedicated GPUs and still has no mainstream desktop part with a decent iGPU, which they could have had with the Tiger Lake G7 parts (a higher-clocked 1165G7 as an 11th-gen i3 would have been a killer deal in 2021, instead of a rebranded 10-series with external G7 graphics).

  • @rodrirm
    @rodrirm 3 years ago

    Amazing video!
    Thank you for doing such an extended hardware analysis! I can only imagine how long it took.
    It's great to be able to look at all this data in one place!

  • @thejollysloth5743
    @thejollysloth5743 3 years ago +3

    Yes, yes, yes!!! Please add Alder Lake to the list when it releases and you have the time. It will be very interesting to see what the results are with the same clock speeds and core count from 14nm (++++++++) to 10nm.

    • @saricubra2867
      @saricubra2867 3 years ago

      It would be the 2nd desktop architecture that is not based on Skylake.

    • @andersjjensen
      @andersjjensen 3 years ago

      @@saricubra2867 You are technically correct, but the 11th-gen architecture was badly neutered because it was originally designed for a much denser node, so something had to be axed to make it fit. 10nm is 2.3x denser than 14nm, so going from 10 to 8 cores was not enough on its own.

    • @saricubra2867
      @saricubra2867 3 years ago

      @@andersjjensen Currently 8 cores are the standard (thanks, Ryzen 7 1700). I see power consumption as more of a problem than core count; that's why the 11900K is just a bad flagship.

  • @sgomez3047
    @sgomez3047 3 years ago

    STEVE!! Totally enjoyed this walk down memory lane video brother. Thanks for posting!

  • @Mi2Lethal
    @Mi2Lethal 3 years ago +7

    AMD is the turtle that caught and beat the hare after Intel became lazy.

    • @andersjjensen
      @andersjjensen 3 years ago

      Yeah, but 5 years is also a pretty long nap! :P

  • @20rcRules
    @20rcRules 3 years ago +1

    Finally! I've been patiently waiting for this great video!

  • @orenm1454
    @orenm1454 3 years ago +6

    Basically, Intel refused to innovate until AMD forced their hand. Next gen will be very interesting for both companies.

    • @zjunk4877
      @zjunk4877 3 years ago +2

      Intel didn't refuse, they got stuck behind die shrinks. AMD moved off GF to TSMC. Considering this is a clock-for-clock comparison, it's surprising that 14nm, while losing, is still holding its own.

  • @duncanfirth
    @duncanfirth 3 years ago +2

    There are stability issues with the Zen 3 architecture. I have received two Ryzen 5900Xs that gave me continual black screens and restarts with WHEA exceptions related to the CPU core.
    I haven't heard anything about Zen 3 stability, only about performance.

    • @duncanfirth
      @duncanfirth 3 years ago

      @HardwareUnboxed can you please discuss these instability issues? They are quite widespread.

  • @FreeIreland32CountyRepublic
    @FreeIreland32CountyRepublic 3 years ago +3

    I ran Haswell with a Sapphire R9 290 for donkey's years before finally jumping to a 3600 and a Sapphire 5700 XT Nitro on ultrawide two years ago - a really nice upgrade. I think I'll wait another couple of years until the mid-range can run a super-ultrawide at 4K 140.
    Once you go high-refresh ultrawide there's NO going back 😜

    • @christophermullins7163
      @christophermullins7163 3 years ago

      Idk about ultrawide being something special, but I agree on high refresh rate. 60Hz feels like a slideshow these days.

    • @FreeIreland32CountyRepublic
      @FreeIreland32CountyRepublic 3 years ago

      @@christophermullins7163 Depends on the type of game you prefer, I guess. In something like Horizon, Death Stranding, or Cyberpunk I find the ultrawide makes the game worlds much more immersive. I can't stomach VR headsets, they make me 🤢, so this is defo the next best thing for me personally.

  • @Kallan007
    @Kallan007 3 years ago +1

    I survived from 2011 with an Intel i7-2600K until 2020, when I jumped to an AMD Ryzen 9 5900X.

  • @M00_be-r
    @M00_be-r 3 years ago +3

    Great summary of the last 10 years of CPU history. Should I call you Master Steve now after this great lesson? ;-) Have a great weekend!

  • @bb5307
    @bb5307 3 years ago +2

    All I see here is that the 8700K I bought in 2017 still holds up today. Shame it doesn't support PCIe 4.0, though.

  • @Michael-Ray
    @Michael-Ray 3 years ago +18

    They should teach this history lesson in high school.

  • @thestrykernet
    @thestrykernet 3 years ago

    Great series, and it has been my favorite thing you all have done recently. Looking forward to more unique takes.

  • @paolog3213
    @paolog3213 3 years ago +3

    There are some games, like AC Valhalla, in which having more than 4 threads is essential; there the FX-8350 is on par with the i5-3470. Thread count matters more today than yesterday.

    • @Luredreier
      @Luredreier 3 years ago +1

      Yes, and the FX processors also had hardware acceleration and support for ECC memory (although not all motherboards did).
      If you ran a host OS on 4 cores and a VM on 4 cores of an FX 8350, you'd get a way better experience than with an i5 3550K, for instance (the K SKUs didn't support virtualization, and the non-K didn't OC).
      Indeed, there are situations where an FX 8350 will even beat an i7 3770K, although those situations are few and far between.

  • @azzadazza1983
    @azzadazza1983 3 years ago

    Awesome work, thanks mate. For those of us with older processors, it helps give an idea of how much performance improvement we can expect from an upgrade.

  • @nathanddrews
    @nathanddrews 3 years ago +14

    I love this channel, that's why I support it via Floatplane. Looks like it's time to DOUBLE my minimum frames by upgrading my 3570K to a new CPU, eh? LOL

    • @azurai3934
      @azurai3934 3 years ago +1

      Just getting something like a 2600X CPU would be a big leap in performance.

    • @nathanddrews
      @nathanddrews 3 years ago +4

      @@azurai3934 Nah, I'd rather wait for AM5 and start with a new platform rather than a dead one.

    • @sujimayne
      @sujimayne 3 years ago

      The channel may be great but this video is hot garbage. Running CPU tests on Ultra settings is a great way to NOT test the CPU.

    • @zigaklun3395
      @zigaklun3395 3 years ago +2

      @@sujimayne It's 1080p and the GPU is a 6900 XT

    • @saricubra2867
      @saricubra2867 3 years ago

      12900K, close to 100% IPC improvement, 30% higher clockspeed and 4 times more cores?
      lmao

  • @jackboudreaux5883
    @jackboudreaux5883 3 years ago +1

    Great video; I know this took a lot of work, so thank you, Steve. I would love to know how much more power each chip had to use to achieve the gains. When I see, for example, a 20% gain but it used 20% more power, I kinda shrug my shoulders. But that's just me, I guess.

  • @paolosaporetti8293
    @paolosaporetti8293 3 years ago +3

    Best content, always ❤️

  • @MrThenorwood
    @MrThenorwood 3 years ago +2

    Hmm, there's a big gap between 2012 and 2017 in the AMD lineup. Why were the APUs not included (Trinity, Kaveri, Carrizo, etc.)? While these weren't massive gains, they were evolutions of Bulldozer, and it would be interesting to see them slotted into the graph to complete the evolution picture. They would be especially interesting to show how much could be gained from evolving a flawed architecture compared to a completely new design.

    • @gorky_vk
      @gorky_vk 3 years ago +1

      And FX vs. a same-price Intel offering, just for fun :)

  • @bafon
    @bafon 3 years ago +3

    It will be interesting to see what performance Intel's 12th gen and AMD's 3D-stacked CPUs bring to the table when they launch :)
    Think I'll hold on to my 5900X for some years to come. While it's cool to be able to run games at 1000fps, I still cap it at 240 or 360, as the benefit beyond that is marginal at best.

    • @nipa5961
      @nipa5961 3 years ago +1

      I can't wait to see benchmarks of Alder Lake and Zen 3D. It's so good to have real competition back.

    • @mryellow6918
      @mryellow6918 3 years ago

      Capping fps is better for response times anyway

  • @shaunlunney7551
    @shaunlunney7551 3 years ago

    Great one, thanks fellas! Maybe a follow-up including a couple of earlier HEDT setups, as they were still gaming-focused. They had large caches and quad-channel memory... really helped with platform longevity!

  • @trackingdifbeatsaber8203
    @trackingdifbeatsaber8203 3 years ago +4

    I was feeling concerned, as I have first-gen Ryzen... then I remembered my 60Hz screen.

    • @finck9226
      @finck9226 3 years ago +1

      Ahah, maybe that's the next upgrade you need. A new monitor is arguably the most notable upgrade to a PC next to a high-end GPU. It's literally the thing you look at 100% of the time; you'll notice a difference in everything, down to web browsing or even reading text on a higher-resolution screen.

    • @trackingdifbeatsaber8203
      @trackingdifbeatsaber8203 3 years ago +2

      @@finck9226 I have a 4K 60Hz screen and it looks nice. I recently added an NVMe drive, which was a good upgrade.

    • @finck9226
      @finck9226 3 years ago

      @@trackingdifbeatsaber8203 That's my next upgrade right there. 4K 60 is pretty godly, not gonna lie. I personally don't need more than 60 fps unless it's a shooter, and I don't play those too often, and when I do I'm trash 🗑️

    • @trackingdifbeatsaber8203
      @trackingdifbeatsaber8203 3 years ago +1

      @@finck9226 4K is very hard to run, and 1440p, despite having half as many pixels, gets pretty close according to my friend who switched from his 4K 60Hz to a 1440p ultrawide 144Hz screen. I have a 1080 Ti and most games I play run really well, but newer titles not as well as I would like.

  • @fleurdewin7958
    @fleurdewin7958 3 years ago +1

    Ivy Bridge was the first cost-cutting measure by Intel when the competition wasn't even there at all. The shift to using thermal poop instead of soldering the IHS to the silicon makes its predecessor, Sandy Bridge, look so good in temps. The only saving grace of Ivy Bridge is the much-improved memory controller; the introduction of PCI-E 3.0 didn't mean much back in those days. Then came Skylake, where Intel again tried to cut costs by making the substrate thinner than Haswell's, causing big heavy air coolers to damage the CPU and motherboard.

  • @hartsickdisciple
    @hartsickdisciple 3 years ago +4

    Ryzen got AMD back in the game, but it wasn't really a good gaming architecture until the 3000 series (Zen 2).

    • @1337Ox
      @1337Ox 3 years ago

      Well yeah, in terms of performance vs. Intel, but I mean you could play anything :D just not at super high fps.

    • @Luredreier
      @Luredreier 3 years ago +1

      Yeah, but for anyone into VMs it was a good idea to consider AMD even back in the FX days.
      An FX 8350 had hardware acceleration and 8 cores, as well as ECC support, at a lower cost than an i5 3550 that did *not* have any of those.

  • @Zwalter
    @Zwalter 3 years ago

    I really do hope this channel keeps on doing these reviews!!! Other channels don't really put that much time into benchmarking or have data of this quality.

  • @kamata93
    @kamata93 3 years ago +3

    I really think the 2 best CPUs are the Ryzen 5 5600X and the i5 11600(K).

    • @dylan.dog1
      @dylan.dog1 3 years ago +1

      Yes, they are both good, but if you find them close in price the R5 5600X is the obvious choice, with faster gaming performance and lower power draw.

    • @kamata93
      @kamata93 3 years ago

      @@dylan.dog1 Although I was one of the first Ryzen adopters back in the day with the R5 1600, I still prefer the BIOS layout and stability of Intel platforms over the Ryzen ones. In terms of performance, I agree. In my country the 11600(k) retails for a better price, so I might pick that one to upgrade my 8700.

    • @dylan.dog1
      @dylan.dog1 3 years ago

      @@kamata93 The RAM stability issue was fixed 3-4 months after the first Ryzen release.
      P.S. Upgrading from an 8700(K) to another 6-core for gaming makes no sense... the 8700 is still good; you can go with a 5800X/11700K or a 5900X.

  • @Metetron
    @Metetron 3 years ago

    My current PC of Theseus has gone through 4 CPUs, starting from a 955 BE back in 2011, to an FX-8320, an i7-3770K, and finally a 5600X. That last jump was massive, to say the least.

  • @theslicefactor4590
    @theslicefactor4590 3 years ago +15

    I just want a reasonably priced GPU 😒

    • @Zero11s
      @Zero11s 3 years ago +1

      you dance with the new normal, you get the new normal

    • @benjaminoechsli1941
      @benjaminoechsli1941 3 years ago +1

      Crash the cryptocurrency market, and you'll have your pick of graphics cards.

  • @ole7736
    @ole7736 3 years ago

    This video is part of a series of amazing content. Well done!👍

  • @dswift864
    @dswift864 3 years ago +4

    You should have also included the cost difference between the different platforms to get a clearer picture of why AMD was still a viable platform. I personally have been full AMD since the early days. I played the same games as everyone else just fine, and my systems cost me waaaayyyy less than my friends were paying for Intel-based systems. There is a point where the fps is good enough to play and enjoy a game fully.

    • @lucidnonsense942
      @lucidnonsense942 3 years ago

      While that is true, it is not a sustainable model from the corporation's side. AMD was bleeding money in the FX years, kept afloat by providing cheap cores to console manufacturers and by its GPU division (ha!). There is not enough money to remain profitable at the low end, especially when you need to harvest dies that are bigger than your competition's. It all comes down to performance per mm^2 of wafer and die yield - there was just no way for AMD to stay in business with FX.

    • @riba2233
      @riba2233 3 years ago

      Same, AMD only since 2004 :)

    • @Hardwareunboxed
      @Hardwareunboxed 3 years ago

      At the time you could get Core i3s for about the same price, and they were often much faster for gaming. Comparing the FX series back in 2012-2015 with Core i5s and Core i7s was brutal; they were Core i3 competitors at best.

    • @riba2233
      @riba2233 3 years ago

      @@Hardwareunboxed Well, I got a Phenom X2 555 for 50 bucks with a cheap mobo, unlocked it to an X4 and overclocked it, and it was good up until the 1500X :)

    • @xthelord1668
      @xthelord1668 3 years ago

      @@Hardwareunboxed Your charts show that frame-time consistency in a lot of games was much worse for Intel than for AMD.
      Even if FX was bad in nuke-level CPU workloads like Assassin's Creed or Hitman, where there are a lot of NPCs and it would be pegged to 100% the same as the 2nd-7th gen Core i3-i7 CPUs, the Intel side would lose much more frame-time consistency once its 4 cores and 8 threads could not offload the workload anymore, plus background tasks would lower cache hit rates, especially with old task schedulers, which were merciless about this.
      The Windows 10 and Windows 7 task schedulers are different, because back in the day many-core CPUs were HEDT or server parts, while the mainstream had 4c/8t or 4m/8t CPUs (unless you ran a 1st-gen Core i7 with triple channel and 6c/12t, or a Phenom X6). That meant everything went to one core, max two; Microsoft knew this, and devs were building games around it. The APIs we used were different too: we only had DX11, no DX12 or Vulkan.
      Today everything gets offloaded to all available cores, with devs focusing on 2-4 core allocation. That's why Zen, Zen 2, and Zen 3 were such big improvements where Zen+ wasn't (games crossing the CCXs added extra lag), and why the older Intel Core i-series CPUs can see big swings in frame rates: cache sizes were small and they lacked cores, so good luck having no freezes in new games unless you ran an AMD GPU and newer APIs like Vulkan or DX12, which were friendlier to weaker hardware than DX11.
      My friend's Core i3 8100 is a frame-time inconsistency shitshow even after being helped from all ends, which shows why max frame rate was never really a proper way to benchmark a CPU. You also have to look at how CPUs handle harsh loads like big crowds of NPCs, where frame-time charts tell the truth about the FX CPUs: by your charts, FX had fewer frame-time consistency swings than all the counterparts it fought at the time.
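      For readers unfamiliar with the metric the commenter is invoking: the "1% low" figures reviewers quote can be derived from a frame-time log roughly like this (a sketch of the idea, not Hardware Unboxed's exact pipeline):

      ```python
      # Sketch: average fps and "1% low" fps from frame times in milliseconds.
      # The 1% low summarizes the worst stutters, which a plain average hides.
      def fps_metrics(frame_times_ms):
          avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
          worst = sorted(frame_times_ms, reverse=True)      # longest frames first
          slice_1pct = worst[: max(1, len(worst) // 100)]   # worst 1% of frames
          low_fps = 1000.0 / (sum(slice_1pct) / len(slice_1pct))
          return avg_fps, low_fps

      # A steady 16.7 ms stream with a few 50 ms stutters: the average barely
      # moves, but the 1% low exposes the hitching.
      frames = [16.7] * 990 + [50.0] * 10
      print(fps_metrics(frames))
      ```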

  • @Robert-hb9mo
    @Robert-hb9mo 3 years ago +1

    A perfect video to watch during a Saturday dinner. Thank you so much, Steve!

  • @karansingh1154
    @karansingh1154 3 years ago +3

    Almost seems unreal; how far AMD progressed in like a decade is astonishing.

    • @Kynareth6
      @Kynareth6 3 years ago +1

      CPUs used to go 100x in performance over 10 years since the 1970s. It stopped being true in 2011 with Bulldozer. Intel made small progress, but AMD made even smaller before Zen.

    • @Kynareth6
      @Kynareth6 3 years ago

      @тαρ мє αи∂ ѕєχ ωιтн мє Zoey the 12900K is 78% faster than the 11900K in CB23; that's a bigger jump than the 8700K over the 6700K.
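
      For anyone wanting to check claims like that, "X% faster" is just a ratio of two scores; a tiny sketch (the scores are placeholders, not real Cinebench results):

      ```python
      def pct_faster(new_score, old_score):
          # "X% faster" = relative increase of the new score over the old one.
          return 100 * (new_score / old_score - 1)

      print(pct_faster(17800, 10000))  # -> 78.0, i.e. "78% faster"
      ```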

  • @SpontaneousWeasel
    @SpontaneousWeasel 3 years ago

    When I returned to PC gaming after the Pentium era, I chose an Nvidia GTX 275 GPU and an AMD 955BE over Intel's Core 2 Quads at the time... That chip served me well, and when I upgraded in 2016 it was a no-brainer to go to an Intel Skylake 6700K paired with an R9 390. Five years on, I've now switched back to team red with a 5600X, although to be honest there wasn't a huge difference (in gaming performance) between it and the equivalent Intel chips when paired with my 3070 Ti. It's fantastic for all consumers that AMD has caught up and is also vastly more competitive with its GPU output too. Ultimately I couldn't give a damn about brand loyalty - I will always seek the best price-to-performance ratio for my region on every upgrade I go for. Loved this video! ✌️

  • @m_sedziwoj
    @m_sedziwoj 3 years ago +1

    It's a great video, thx for making it!

  • @kelownatechkid
    @kelownatechkid 3 years ago

    Amazing work, Steve!

  • @ShadowMebs
    @ShadowMebs 3 years ago

    Fun to see how strong my old 2600K was for such a long time, keeping above 60 fps at 1080p... But the gap to my next upgrade probably won't be as long as the one between the 2600K and my 3900X.

  • @ESSA20
    @ESSA20 3 years ago +1

    Hey guys, I have a question for you tech-savvy folks: I have an i7-3770K - should I keep it or upgrade? It's paired with an Asus Strix GTX 980 Ti. What's the equivalent of the i7-3770K among today's CPUs?

  • @texedomel01
    @texedomel01 3 years ago

    Great video. I did well upgrading from Ivy Bridge all the way to a 5900X this summer, skipping the whole 14nm Intel era.

  • @ruxandy
    @ruxandy 3 years ago

    Awesome stuff, Steve! Now... you gotta go even further back and do a comparison for the previous decade: Pentium 3 / Athlon (Slot 1 vs Slot A, baby!), Pentium 4 Northwood / Athlon XP, Pentium 4 Prescott / Athlon 64, etc :-D

  • @raeferwilson2599
    @raeferwilson2599 3 years ago +1

    I upgraded my main machine from a Phenom II X4 840 and RX 460 to my current R5 2600 and RX 580 at the end of 2018. I paid $149.99 for the CPU and $220 for the GPU on Amazon. They're currently listed at $240 and $545.97.

  • @Liqtor
    @Liqtor 3 years ago +1

    Jumping from the i7-6700K to the R7 5800X has been eye-opening. So much progress.
    Gaming on PC has never been this good.

    • @四as
      @四as 3 years ago

      In what sense? Which resolution?

    • @Liqtor
      @Liqtor 3 years ago

      @@四as
      4K.
      My old GPU was the R9 390X.
      I built a new system: 5800X, 6900 XT, Gen4 NVMe, 32 GB of 4000 MT/s memory.
      Custom loop. I'm having such a good time.

  • @Thinker145
    @Thinker145 3 years ago +1

    This is legendary content. Thank you.

  • @DeLaVeGaGR
    @DeLaVeGaGR 3 years ago +1

    I still own my AM3+ FX-8350 PC as a second machine for media consumption. Back in the day I played a lot of BC2 / BF4 / DayZ on it with no issues. It wasn't an impressive CPU, but I was able to game on it at decent (60+) frame rates.

    • @mryellow6918
      @mryellow6918 3 years ago

      That's because 6900 XTs didn't exist, lol.

  • @andrey7268
    @andrey7268 3 years ago

    Thanks for the series; it was an interesting retrospective. Adding Alder Lake to the list would also be interesting, although difficult, since there are P- and E-cores and Alder Lake's position would depend on which 4 cores you select. Perhaps you would need two entries in the graphs.

  • @Austin1990
    @Austin1990 3 years ago

    Such good work!!!
    This gets me really excited to see how Alder Lake fits in. We don't even know the cache sizes of AL yet.

  • @BulletPr00fGAm3r
    @BulletPr00fGAm3r 3 years ago

    Really enjoyed these videos! Keep up the good work!

  • @pierregrobbelaar9116
    @pierregrobbelaar9116 3 years ago +1

    My R5 2600 is getting close to her 3-year anniversary and I still love her like the day I got her. I built her 2 days before my birthday. AMD did change the game somewhat, as Intel wanted to make you believe you don't need more than 4 cores for gaming. How that has changed over the last 2 years. I don't have a high-end setup, but it's good enough that everything I run, I run at a locked 60 fps. The next few years will be crucial for both Intel and AMD. I don't care who's faster; it's just price to performance. I can fit up to a 5000-series model on my B450, whereas with Intel I would have had to change motherboards to upgrade (I almost got a 9400F, but the AMD system worked out cheaper).

    • @zigaklun3395
      @zigaklun3395 3 years ago

      Well, technically you don't need more than 4 cores for gaming - did you see the video?

  • @KrastyoKrastev
    @KrastyoKrastev 3 years ago

    Lovely video, like all the videos from you guys!

  • @chuckie2880
    @chuckie2880 3 years ago

    Nice to see this kind of wrap-up, now with an eye on the cache, and on how differently each game behaves in this apples-to-apples comparison:
    some love more cache, some love fast RAM, some love other things in a µarch... Very interesting, nice job, thank you for doing this.

  • @lastxflfan420
    @lastxflfan420 3 years ago

    Excellent content, thanks Steve.

  • @RuffsTV
    @RuffsTV 3 years ago

    As bad as the 8350 was, it was at the heart of my first "proper" gaming PC build back in the day. Paired with a 280X, it was my gateway into PC gaming. It overclocked to 5 GHz with a beefy water cooler and served me well for a couple of years. After that I went to a 6600K and a GTX 1080, upgrading to the 6700K shortly after. I'm now planning on going back to team red with the 5600X.

  • @CossackHD
    @CossackHD 3 years ago +1

    I think these benchmarks show AMD Zen 1/Zen+/Zen 2 at a disadvantage, since the 2+2 core configuration forces cross-CCX communication, increasing latency.
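
    One way to probe that effect yourself on Linux is to pin the workload to one CCX versus a 2+2 split. This is only a sketch: the CPU numbering below assumes logical cores 0-3 sit on CCX0 and 4-7 on CCX1, which you should verify with `lscpu -e` or lstopo before trusting it.

    ```python
    import os

    # Assumed first-gen Ryzen topology (verify on your system!):
    one_ccx   = {0, 1, 2, 3}   # four cores sharing one CCX and its L3
    split_2_2 = {0, 1, 4, 5}   # 2+2 across CCXs, forcing cross-CCX traffic

    # Pin the current process; children (the launched benchmark) inherit this.
    os.sched_setaffinity(0, split_2_2)
    ```

    Running the same benchmark under each mask exposes the cross-CCX latency penalty; on Windows the equivalent is Task Manager's affinity setting or `start /affinity`.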

    • @dylan.dog1
      @dylan.dog1 3 years ago

      Yes, that's why Zen/Zen+ need low-latency RAM for gaming. I remember my old R7 1700 was 30% faster with 3266 MT/s CL14 and optimized sub-timings compared to the 3000 MT/s XMP profile.
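
      A quick sketch of why such a kit wins: convert CAS latency from cycles to nanoseconds (the CL16 figure for the 3000 MT/s XMP kit is my assumption, since it isn't given above):

      ```python
      def cas_latency_ns(mt_per_s, cl):
          # DDR transfers twice per clock, so one cycle lasts 2000 / MT/s ns.
          return cl * 2000 / mt_per_s

      for mt, cl in [(3000, 16), (3266, 14)]:
          print(f"DDR4-{mt} CL{cl}: {cas_latency_ns(mt, cl):.2f} ns")
      # DDR4-3000 CL16: 10.67 ns; DDR4-3266 CL14: 8.57 ns
      ```

      On Zen 1 and Zen+ the Infinity Fabric also runs at the memory clock, so faster RAM speeds up the cross-CCX hops discussed above on top of the lower CAS latency.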

    • @scaredycrow
      @scaredycrow 3 years ago

      It would also have been interesting to see the R9 series with a 2+2 split across chiplets, with vastly more L3 available.

    • @CossackHD
      @CossackHD 3 years ago

      @@dylan.dog1 yeah, I've seen around a +15% FPS increase in Assassin's Creed just by setting tight sub-timings, without even increasing the frequency. Your mobo probably picked very loose sub-timings, or Zen 1 is even more sensitive.

  • @INSANEDOMINANCE
    @INSANEDOMINANCE 3 years ago

    Love the videos you guys have been putting out, very interesting and insightful!

  • @biffington8240
    @biffington8240 3 years ago

    Still rocking a Haswell i7-4770K and DDR3-2400 in my daily rig, and I don't see a reason to upgrade just yet. Great video!

  • @andreimihai164
    @andreimihai164 3 years ago +1

    "Sounds like a monitor name." HP: hold my Omen 25lex52x37e