The i9 Killer? AMD Ryzen 5800X3D Tested

  • Published: 4 Jul 2022
  • Check prices on Amazon below
    AMD Ryzen 5800X3D: geni.us/9GXJrNS
    AMD Ryzen 5600X: geni.us/JXHkwGo
    AMD Ryzen 5900X: geni.us/Qt5VW
    Intel 12900KS: geni.us/pkXQ5Ar
    Intel 12700K: geni.us/zvjzvE4
    Intel 12600K: geni.us/1I7OXuq
    The AMD Ryzen 5800X3D crushed my expectations - here's how it stacks up against Intel's best.
    Video gear
    Camera: geni.us/cN16f
    Primary Lens: geni.us/Mfv0kQO
    / optimumtechyt
    / alisayed3
    / optimum
    As an Amazon Associate I earn from qualifying purchases.
  • Science

Comments • 967

  • @WilliamChoochootrain
    @WilliamChoochootrain 2 года назад +1354

    The most insane part is you can get this performance with a DDR4 board from 2017.

    • @MarioPL989
      @MarioPL989 2 года назад +97

      With 3200-3400 MHz ram instead of 3800. Above 3400 there is almost no difference for 5800X3D.

    • @NicksonYT
      @NicksonYT 2 года назад +122

      AMD's chipset support is admirable

    • @mastroitek
      @mastroitek 2 года назад +17

      True, it's such a good platform. I still have a 4790K and was looking for an upgrade. The 5800X looks fabulous, like most of the 5xxx series, but unfortunately they don't have an iGPU, which is a deal breaker for me.

    • @user-uw4uv3cy7l
      @user-uw4uv3cy7l 2 года назад +24

      @@mastroitek Not having an iGPU is not as bad as you think (in my opinion). Most systems have a universal GPU driver pre-installed, so as long as you have pretty much any GPU installed in your system, you should be able to boot. Some operating systems, like Windows, can also automatically detect which GPU you are using and download its latest drivers for you. I guess the downside is that you won't be able to use your computer if you don't have a GPU on hand.

    • @OmarDaily
      @OmarDaily 2 года назад +28

      @@mastroitek I have a 5950X and not having an iGPU has never been an issue. Just get a cheap GPU, which will give you more performance than any iGPU.

  • @luckyowl10
    @luckyowl10 2 года назад +615

    7:31 That power consumption difference is HUGE. The 5800X3D draws 82.6W in that Valorant 1080p low test, while the 12900KS draws 188.3W. That's roughly 2.28x the power draw for 4% fewer FPS (a quick FPS-per-watt calculation follows this thread). Intel sure likes to make the power bill high.

    • @MarioPL989
      @MarioPL989 2 года назад +105

      During summer this is a dealbreaker tbh.

    • @HosakaBlood
      @HosakaBlood 2 года назад +11

      Well, you're comparing it against a CPU with almost twice the cores, so the power draw comparison is somewhat misleading. The Intel chip games at a similar level to the 3D Zen part while performing close to a 5950X in multi-threaded work, so you're getting the best of both worlds going with Intel as well. The 3D is amazing for gaming, but having only 8 cores is a bummer for many who use their PC for both work and gaming, not to mention that AMD wants $500 for an 8-core CPU in 2022 while the competition gives similar gaming performance and more cores.

    • @lawrencetoh9510
      @lawrencetoh9510 2 года назад +79

      @@HosakaBlood The 12900K is $599 and the 5800X3D is $449... so they're both "around $500"? Good math.
      Did you price the motherboards the same way as well? 😂

    • @rdmz135
      @rdmz135 2 года назад +55

      @@lawrencetoh9510 Don't forget the crazy expensive DDR5 needed for Intel to unlock its maximum potential.

    • @luckyowl10
      @luckyowl10 2 года назад +57

      @@HosakaBlood You know the 12900KS has only 8 real performance cores (same as the 5800X3D) plus 8 efficiency cores, which are only around Skylake-level performance, right?
      Also, power draw is power draw; the number of cores doesn't matter. You can have server CPUs with 3x the cores and the same TDP as a 12900KS, because they run at lower frequencies.
      If a component offers 0-30% more performance for 2x the power draw, that's terrible power efficiency.
      If you don't care about power draw at all, you might as well get a 12900KS and a 3090 Ti, overclock them to the max and use a 1000W PSU.
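
For readers who want to check the arithmetic in this thread, here is a minimal Python sketch. The wattages are the Valorant figures quoted above; the 4% FPS deficit for the 12900KS is the commenter's claim, taken as-is rather than re-measured.

```python
# Rough perf-per-watt comparison using the numbers quoted in this thread.
amd_watts, intel_watts = 82.6, 188.3   # Valorant 1080p low, per the comment above
amd_fps = 1.00                         # normalize the 5800X3D's average FPS to 1.0
intel_fps = 0.96                       # ~4% lower, per the comment (an assumption)

power_ratio = intel_watts / amd_watts  # ~2.28x the power draw
amd_eff = amd_fps / amd_watts          # normalized frames per watt
intel_eff = intel_fps / intel_watts

print(f"12900KS draws {power_ratio:.2f}x the power of the 5800X3D")
print(f"5800X3D delivers {amd_eff / intel_eff:.2f}x the FPS per watt in this test")
```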

  • @Flyfishindog
    @Flyfishindog 2 года назад +194

    Love the color coding between brands. Makes a quick glance easier to compare overall

  • @tobo4694
    @tobo4694 2 года назад +294

    Imagine beating an i9 using only 80w and just adding more cache

    • @evo8150
      @evo8150 2 года назад +11

      Crazy

    • @LordRobbert
      @LordRobbert 2 года назад +17

      I'd love to see a 12600K with 96 MB of cache instead of 20 MB.

    • @jonathanariaslorenzo3214
      @jonathanariaslorenzo3214 2 года назад +1

      well, now we don't have to...

    • @eliezerortizlevante1122
      @eliezerortizlevante1122 2 года назад +1

      imagine beating i9 with 9W usage

    • @Abbrevious
      @Abbrevious 2 года назад +2

      @@eliezerortizlevante1122 Imagine that 9w is run off a solar panel.

  • @HuNtOziO
    @HuNtOziO 2 года назад +208

    I got mine recently. In MMOs this thing is an absolute power house. I went from a 5800x to the 5800x3d and in gw2, lost ark, destiny 2 and ffxiv the gains in areas with loads of players are absolutely mind boggling. The 1% lows are also much better. Far smoother overall.
    11/10 would buy again. Looking forward to zen 4 vcache models!!

    • @busterscrugs
      @busterscrugs 2 года назад +42

      AMD's ideal customer right here, buying the same CPU twice lmfao

    • @ashura7684
      @ashura7684 2 года назад +1

      I mostly play Destiny 2. I have a 10900K right now, but I might switch to AMD when the new 7000 series comes out.

    • @Travelfast
      @Travelfast 2 года назад +3

      Same thoughts about my 5800X3D. Games are just so much more fluid now.

    • @narius_jaden215
      @narius_jaden215 2 года назад +12

      @@busterscrugs Maaaan most computer people are nuts like this, you must know this :P.

    • @charliek9394
      @charliek9394 2 года назад +15

      My cousin bought this just for WoW lmao. He went from a 4600K that gave him 24 fps to this, and now gets about 210 fps with a GTX 1070. It's mental.

  • @steel5897
    @steel5897 2 года назад +87

    That massively lower TDP is what sells it for me over the i9.
    I really hope the power consumption creep doesn't get out of hand. Intel and Nvidia in particular are going ham with their new/upcoming products. If you start recommending a 1000w power supply for a god damn gaming rig, even if high end, to me that's just insane.

    • @gmaxsfoodfitness3035
      @gmaxsfoodfitness3035 2 года назад +2

      PSUs are most efficient at around 50% load; that's where their rating (Bronze, Gold, Platinum, etc.) is measured, and efficiency starts to drop off past roughly 55% load. If you build a PC that uses around 500 watts at full load, it's wise to go for a 1000W PSU over, say, an 850W one (a rough sizing example follows this thread). 1000W PSUs have been recommended since late 2020 for builds with the highest-end GPUs. If you ever use Newegg's PC builder or PCPartPicker you can hit 400-500 watts easily just messing around. You also have to remember power spikes, especially with Nvidia GPUs, so it's smart to have that extra headroom. I've even seen a 1000W PSU shut off for safety when a 3090 was spiking in one of Linus' videos; he ended up having to go with 1200W for everything to run smoothly with no shutdowns. The 3090 Ti uses even more power, and it doesn't seem like Nvidia cares about efficiency at this point.

    • @Nilssen84
      @Nilssen84 2 года назад +6

      Yes, it just shows how much better AMD's tech actually is, if Intel and Nvidia can only stay slightly on top by running twice the wattage through their CPUs/GPUs…

    • @seth4321
      @seth4321 2 года назад +3

      @@gmaxsfoodfitness3035 Have you seen the rumors for the upcoming 40 series nVidia cards? Possibly over 400W for some of the highest end SKU’s? Something has to change. Hopefully AMD will lead the charge with RDNA3 like they’ve done with Ryzen.
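
To put the PSU-sizing rule of thumb from this thread into rough numbers, here is a minimal sketch. The 50% efficiency sweet spot and the transient-spike multiplier are assumptions based on the comment above, not a vendor specification, and the example wattages are illustrative only.

```python
def suggest_psu_watts(cpu_w: float, gpu_w: float, other_w: float = 75.0,
                      spike_factor: float = 1.5, target_load: float = 0.5) -> float:
    """Rough PSU sizing: keep sustained load near the ~50% efficiency sweet spot,
    while leaving headroom for GPU transient spikes (spike_factor is a guess)."""
    sustained = cpu_w + gpu_w + other_w
    peak = cpu_w + gpu_w * spike_factor + other_w   # GPUs spike far harder than CPUs
    return max(sustained / target_load, peak)       # whichever constraint is larger

# Example: ~188 W CPU (the 12900KS figure from this video) + a hypothetical 350 W GPU.
print(f"Suggested PSU: ~{suggest_psu_watts(188, 350):.0f} W")
```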

  • @johnnyelijasialuk3879
    @johnnyelijasialuk3879 2 года назад +402

    All this boils down to is adding more cache and drawing less power; efficiency has come a long way for AMD.

    • @bgop346
      @bgop346 2 года назад +11

      AMD is lucky they got onto TSMC silicon, but their design teams obviously helped them out a lot more than the node did.

    • @pranavbansal5440
      @pranavbansal5440 2 года назад +5

      Efficiency doesn't even matter that much on desktops

    • @saadanjum8859
      @saadanjum8859 2 года назад +48

      @@pranavbansal5440 it matters to some people though

    • @Ahfeku
      @Ahfeku 2 года назад +60

      @@pranavbansal5440 I don't want my electric bill to be 400 bucks every month though.

    • @joe_ferreira
      @joe_ferreira 2 года назад +47

      @@pranavbansal5440 personally, I am tired of my office being 80-90 degrees F all the time because of my gaming desktop and getting AC dedicated for one room just proves the point that they all run too hot. CPUs and GPUs need to be more efficient.

  • @Lubinetsm
    @Lubinetsm 2 года назад +45

    I upgraded from a 3600 yesterday, and my god, I have never seen some of my heavily memory-intensive games run this smoothly before. Definitely a worthy purchase!

    • @evilemil
      @evilemil Год назад +2

      I'm currently using the 3600 and planning to upgrade, targeting the 5800X3D. Should I go for it, or upgrade my GPU instead?

    • @Ladioz
      @Ladioz Год назад +4

      @@evilemil cpu is more important than gpu. GPU gives you extra frames, CPU makes everything smooth

    • @alvabravo5655
      @alvabravo5655 Год назад +1

      @@evilemil I just upgraded my 3600 to a 5800X3D yesterday, got it for $300, and the extra cash let me buy an RX 6800 XT for $500 and put my RTX 2060 to rest..
      Now I live in another dimension..

  • @kennadod2080
    @kennadod2080 2 года назад +24

    I love mine. Fantastic chip and just slotted into my old system. Nice to see a company not forcing you to upgrade other pc parts too

  • @jelipebands1700
    @jelipebands1700 2 года назад +11

    Dude, your charts are the best now. I love the 1% and 0.1% lows on the chart, the most important metric for sure (a short sketch of how such lows are typically computed follows this comment). You also did an amazing job benchmarking while playing Apex for 25 minutes in a real match. I know you spent a long time making this video and it shows. Great work.
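
For anyone unfamiliar with the metric praised above, here is a minimal sketch of one common way 1% and 0.1% lows are derived from a capture of per-frame times. Tools differ (some take a strict percentile rather than averaging the slowest frames), so treat this as illustrative, not the channel's exact method; the frame-time list is made up.

```python
def fps_lows(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 for 1% lows)."""
    slowest = sorted(frametimes_ms, reverse=True)      # worst (longest) frames first
    n = max(1, int(len(slowest) * fraction))           # at least one frame
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms

frametimes = [4.2, 4.5, 4.1, 9.8, 4.3, 4.4, 15.2, 4.2, 4.6, 4.3]  # ms, invented capture
print(f"avg FPS:   {1000.0 / (sum(frametimes) / len(frametimes)):.0f}")
print(f"1% lows:   {fps_lows(frametimes, 0.01):.0f} fps")
print(f"0.1% lows: {fps_lows(frametimes, 0.001):.0f} fps")
```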

  • @zargham6719
    @zargham6719 2 года назад +1

    Hey, been watching your videos for a while but this change in background looked amazing. You looked very nice and personable in this video and the background switch up to a more colorful one is amazing. Anyways, great content as always!

  • @TheStubbenator
    @TheStubbenator 2 года назад +33

    Buying the 5800x3D overall was a way better investment to me than upgrading my RAM and motherboard to support a future gen processor of similar speed

  • @user-qd4gf8hg8f
    @user-qd4gf8hg8f 2 года назад +150

    Just a friendly reminder: the 5800X is competing with Intel's next-gen (12th-gen) CPUs instead of its same-gen (11th-gen) rivals.
    That alone shows how powerful it is.

    • @jondonnelly4831
      @jondonnelly4831 2 года назад +7

      u sound like stewie from family guy :D

    • @vishalsharma23k
      @vishalsharma23k 2 года назад +2

      @@jondonnelly4831 Now that you mention it , I hear stewie as well XD

    • @RafitoOoO
      @RafitoOoO 2 года назад +16

      it's worse, AMD released Zen 3 to compete with 10th gen, 11th gen was released 6 months after Zen 3 and still lost in all SKUs lmao.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 2 года назад +2

      @@RafitoOoO YUP, true. If they release a 5900x3d then its gaming performance might be on par with 13 gen.

    • @thelittleewokboss9929
      @thelittleewokboss9929 2 года назад +2

      5800x3d was meant to dethrone the 12900k/s for gaming king

  • @vailpcs4040
    @vailpcs4040 2 года назад +3

    This was the comparison I was looking for, thanks!

  • @The_Professional_Hobbiest
    @The_Professional_Hobbiest 2 года назад +2

    Awesome video. Super cool to see this from a sweaty gamer perspective, and the variety of benchmarks was sick

  • @AvWoN
    @AvWoN 2 года назад +1

    Super cool, thanks! Certainly my favourite tech channel; the way you present stuff is just golden, so clean. It would have been neat to see the 5900X in the lineup, both as an interesting comparison within AMD and because Intel is already represented by a more premium product in the 12900KS. If the results weren't interesting, at least a quick comment on where it should land between the items in the lineup would be nice.

  • @xlinnaeus
    @xlinnaeus 2 года назад +113

    Late, but we know it’s the best looking review in the tech space!!

    • @prot018
      @prot018 2 года назад +13

      @Od1sseas *best looking*

    • @Underground.Rabbit
      @Underground.Rabbit 2 года назад

      bit gay but okay

    • @aerosw1ft
      @aerosw1ft 2 года назад +1

      @@prot018 I'd argue HWC is on par. Those guys really go above and beyond with their cinematography

    • @NicksonYT
      @NicksonYT 2 года назад +1

      Absolutely

    • @Shoey
      @Shoey 2 года назад +2

      OT for eye candy and itx+esports perspective
      HUB/GN for detail
      us aussies have tech yt in a chokehold atm lol

  • @killerindustries49
    @killerindustries49 2 года назад +132

    I wanted this CPU so bad... but I went with the 5800X instead and got it on sale at Micro Center for $250. The 3D model is a bit expensive atm, but for a good reason.

    • @GamingXpaul2K10
      @GamingXpaul2K10 2 года назад +18

      In the UK right now the 5800X3D is £180 ($220) more expensive than the regular model on Amazon. Really not worth the price difference imo. I'd say only get the 3D models if they're comparably priced.

    • @killerindustries49
      @killerindustries49 2 года назад

      I agree with you, at least we have options and decent price points now 👌🏼

    • @Paraclef
      @Paraclef 2 года назад

      @@GamingXpaul2K10 link ? because i can not understand how you could get it as cheap as that.

    • @itisabird
      @itisabird 2 года назад +4

      It's not for a good reason in my opinion. In Germany you can buy the 5800x3D for 480 euros, while the top of the line 5950X is only 50 euros more expensive. Yes, I know that the 5950X won't be as good at gaming, but it is overall a better CPU.

    • @syed2694
      @syed2694 2 года назад +12

      @@Paraclef he meant £180 MORE than the 5800X. It's £440 and the 5800X is about £270

  • @joaojgac
    @joaojgac 2 года назад +1

    always presenting great content Ali ! Big fan here... cheers from BR 🇧🇷

  • @henry3397
    @henry3397 3 месяца назад

    Found your channel not long ago and your shots are amazing. Especially when explaining something through a visual, it hits my eyes at the same time it’s hitting my imagination

  • @Slayer9009
    @Slayer9009 2 года назад +8

    Love the benchmarking, you definitely went out of your way to get accurate results so thank you for that!

  • @sighheinrich
    @sighheinrich 2 года назад +7

    Great benchmark review test. Really well made, took everything I wanted into play. Power consumption is just crazy, should be reason enough to buy AMD for most people.

  • @bzdtemp
    @bzdtemp 2 года назад +7

    Upgraded from a Ryzen 3600 to the 5800X3D and for me it was just a really good step. Definitely worthwhile, since any other upgrade to similar performance would cost a lot more, and also take more time, as there is zero need to reinstall Windows or anything.
    The only trap is that you must update the motherboard BIOS with the old CPU still installed, and only then swap in the new CPU.

  • @lolilolplix
    @lolilolplix 2 года назад +1

    As always, great production and camera work

  • @Petr75661
    @Petr75661 2 года назад +36

    It's crazy how inefficient Intel is.

    • @MightBeBren
      @MightBeBren 2 года назад +2

      im not a fan of intel just throwing power at stuff to compete with lower power hardware

    • @marcturcotte1962
      @marcturcotte1962 2 года назад +2

      I remember saying the same thing with AMD Bulldozer .

    • @deadly_mir
      @deadly_mir 2 года назад +2

      We must remember, though, that 12th gen is basically like having 2-3 CPUs under one heat spreader (e.g. a 12900K is like having an 11900K and two 4670Ks under one heat spreader). Obviously it's going to take more power and produce more heat; 1+2=3, not 1+2=1. People like to completely overlook that E-cores exist in these comparisons. My 12600K at its current 5.3 GHz all-P-core, 4.0 GHz all-E-core settings pulls 180W. At stock speeds and with the E-cores disabled it only draws about 90-100W. That's an unlocked Intel CPU under 100W. Most of the power draw spikes are the motherboard's fault. This time around, 12th gen can be undervolted by 0.05-0.1V and take literally no performance hit; the CPUs were simply designed to request more voltage than necessary.

    • @LordRobbert
      @LordRobbert 2 года назад

      @@deadly_mir id love to see a 12600k with a 96mb cache instead of 20.

    • @deadly_mir
      @deadly_mir 2 года назад

      @@LordRobbert That would be super damn fast. But also probably pull well over 250w.

  • @BSEUNHIR
    @BSEUNHIR 2 года назад +7

    You're not the only one testing the 5800X3D so late after release. Is that just a coincidence or is there a second round of samples or something?
    I ordered this beast on release day, love it for my shitty unoptimized DX9 games that are permanently in draw call limit. Really helps in those, like Planetside 2.

  • @cezannee
    @cezannee 2 года назад

    Your videos are very helpful and make it easy to understand this rabbit hole of PC gaming.

  • @balin72
    @balin72 2 года назад +1

    thanks for taking the time to give all these chips a fair chance! loved this vid

  • @TimothyStovall108
    @TimothyStovall108 2 года назад +5

    With the new BIOS versions coming out for the 5800X3D to allow for more tuning flexibility, I would like to see how it performs with an undervolt offset as well, similar to the PBO undervolt you did, which really helped with my 5800X. I'm really looking at going to the 3D version, but now that there is word of more AM4 3D versions in the future I'm really hoping for a 5900X3D. I still get hitches and stuttering in certain applications that I know the 3D would just completely eliminate.

    • @iiimgdiii
      @iiimgdiii 2 года назад +1

      PBO2Tuner Tool

  • @narius_jaden215
    @narius_jaden215 2 года назад +10

    I went from a 9900k to this thing and I'm pretty happy with how things have gone. I find some of the games I play to be less stuttery than before, which is really what I was looking for. FPS is everything, don't get me wrong, but it's more about how stable those frames are delivered.

    • @danielkammer3244
      @danielkammer3244 2 года назад +1

      Was it worth it? I have a 9900K and everything runs smoothly.

    • @QueckSilber87
      @QueckSilber87 Год назад

      Yeah, please say... is it worth switching from an i9 9900K to this?

    • @mickeyriola9087
      @mickeyriola9087 Год назад

      It's just rough to spend the cash, because my CPU is really fast, has the same number of cores, and is faster in terms of straight clock speed, idk man.

  • @LoongJade
    @LoongJade 2 года назад +1

    Great vid but I wish you could include full spec for both systems in the description.

  • @hyypersonic
    @hyypersonic 2 года назад +1

    New background looks awesome man

  • @humbleweirdo2860
    @humbleweirdo2860 2 года назад +60

    It's by far the best CPU for sims in VR. All the frame drops I used to get in DCS have completely vanished with the 5800X3D, and that's coming from a 5950X!

  • @tuganerf
    @tuganerf 2 года назад +3

    Damn. makes me feel good to know that I have upgrade options within am4. My 5600x is going fine but I might get one of these used in 2-3 years

  • @testerzz
    @testerzz 2 года назад +1

    Great reviews as always, also a great Twitch channel for especially Apex Legends gameplay.

  • @cykablyat114
    @cykablyat114 2 года назад +1

    love the videos man one of the best tech channels on youtube

  • @Dionyzos
    @Dionyzos 2 года назад +30

    A 5600X3D would be very interesting

    • @killerindustries49
      @killerindustries49 2 года назад +6

      A budget gamers dream

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      A 5950X3D maybe, so I can switch from my 12900K; otherwise I'd be losing almost half my multi-threaded performance for 3% better gaming, idk man.

    • @WhiteoutTech
      @WhiteoutTech 2 года назад +5

      @@HosakaBlood huh why would you switch from a 12900K. Just throw in a 13900K if you already got an Intel mainboard.

    • @marsovac
      @marsovac 2 года назад +2

      @@WhiteoutTech Maybe for less power... the 12900K uses 240W in multi-core loads. Intel and Nvidia should be scolded for doing this: winning on paper with absurd power draws because they cannot take a loss. It will push AMD to do the same thing, to the disgrace of the industry. It has been said that the new RDNA and Ryzen parts will use more power, unfortunately.

    • @Mr11ESSE111
      @Mr11ESSE111 2 года назад +1

      @@HosakaBlood A 5950X3D or 5900X3D wouldn't have much point: productivity would be noticeably worse than the normal CPUs, gaming would be the same or worse than the 5800X3D, a 16-core 3D chip would run too hot, and it would be too expensive. So 12- and 16-core CPUs with 3D V-Cache don't make sense.

  • @theb4r138
    @theb4r138 2 года назад +27

    What sort of cooling were you using for the 5800x3d and what were the temps? Could you explore undervolting with it? I’m using a 240 custom loop with it and a negative voltage offset and I haven’t seen much of any performance loss but much better temps.

    • @lauchkillah
      @lauchkillah 2 года назад +1

      I'm on an NH-D15 and I'm seeing an average Tdie temp of around 70°C while playing Outriders, for example. Not too happy with it at the moment, but it was the first time I used Kryonaut on a CPU, and I believe I used too little.

    • @TheRockRocknRoll
      @TheRockRocknRoll 2 года назад

      i have a dark rock pro 4 on mine

    • @theb4r138
      @theb4r138 2 года назад

      @@TheRockRocknRoll what sort of temps are you getting at stock settings?

    • @theb4r138
      @theb4r138 2 года назад +1

      @@lauchkillah 70 C sounds nice to me. With a 240 in the FormDt1 at stock settings my 5800x3d was hitting 80C after playing Warzone for an hour. The negative voltage offset brings me down to the mid to high 60s. It is summer time so that has a little to do with my bad temps though.

    • @MiguelAngelAvilaVillalobos
      @MiguelAngelAvilaVillalobos 2 года назад

      @@theb4r138 What 240 aio are you using? EK basic?

  • @HardwareFRA
    @HardwareFRA 2 года назад

    Very nice video! Do you think the R9 5900X is worth buying now, or should I wait for Ryzen 7000?

  • @cherryfarther
    @cherryfarther 2 года назад +1

    Man half the time I already know the answer but I still watch your videos anyway. The editing and presentation style are just so sick

  • @macstheking
    @macstheking 2 года назад +4

    Would this cpu have any issues pairing with a rtx 40 series when they come out? I desperately need to replace my 10 year old i7 2600k pc but I want to wait for the new 40 series gpu and just ride out my 1080 ti til then.

  • @13thzephyr
    @13thzephyr 2 года назад +3

    Looking forward to the 5600X3D and 5900X3D down the line. That power difference also comes while comparing 7nm against Intel's 10nm, so imagine what 5nm will do for efficiency. Matching Intel's latest and greatest in "gaming" with an almost two-year-old CPU just by adding more cache is impressive.

  • @ryanreviews8566
    @ryanreviews8566 2 года назад +1

    damn that background looks as sick as this super clean review

  • @saab_9
    @saab_9 2 года назад

    Awesome review Ali! It would have been nice to see 4K framerates!

  • @Stong1337
    @Stong1337 9 месяцев назад +7

    5800x3d is the best cpu of the decade

  • @mike605
    @mike605 Год назад +5

    I just bought the 5800X3D and wow, what a difference it made compared to the 3700X that I had. MSFS looks so good now and runs smooth. I'm a very happy camper with this chip in my system.

    • @williamwen3477
      @williamwen3477 Год назад

      Nice, im gonna go to 5800x3d too from 3700x

  • @NarekAvetisyan
    @NarekAvetisyan 2 года назад

    Great review dude.
    But one question that's eating me alive, and that probably only you can test: does the V-Cache make a difference in Blender simulation baking?
    Bro, can you test that please? I want to know so I can decide whether to upgrade to the 5800X3D or the 5900X.
    Thanks.

  • @troycasinillo448
    @troycasinillo448 2 года назад

    Nice review, I have a question though. Does it have an integrated GPU, or do you still need to install a discrete GPU in your motherboard?

  • @Ken-zg3ze
    @Ken-zg3ze 2 года назад +80

    The graphs seem a little confusing to me. I completely understand that it's 0.1, 1, and avg, in that order but my brain wants the color to represent a range rather than a delta. Maybe it's because of the inclusion of 3 values on 1 bar rather than the usual 2. Great comparison though. AMD is killing it.

    • @TheAnoniemo
      @TheAnoniemo 2 года назад +10

      It's the colors that throw the readability out the window. They probably should be lighter shades of the same color instead of contrasting ones.

    • @reubnn
      @reubnn 2 года назад

      you dont understand cause ur a woman

    • @PunzL
      @PunzL 2 года назад

      @@TheAnoniemo Using pure white to represent the 1% average kinda creates a divide between the average and the 0.1% and the placement of the numbers does not help either. I think a different color other than white or swapping colors between 1% and 0.1% would easily fix the issue

  • @twinjuke
    @twinjuke 2 года назад +5

    Very easy to consume yet one of the best reviews on the net with lot of useful content. Thanks again! Cool channel as always!

  • @angelaguirre1289
    @angelaguirre1289 2 года назад +1

    Will you make a video about the new EKWB products announced such as their new pump/res/cpu block combo? I'm curious to see what the god of SFF has to say about these new ideas!

  • @sultanmm2012
    @sultanmm2012 2 года назад +1

    I love your videos, very informative

  • @connordaniels2448
    @connordaniels2448 2 года назад +43

    Crazy how much power the 12900ks pulls. I had to undervolt my 12700k because of temps but I guess it’s nothing compared to a 240w tdp

    • @jhellier
      @jhellier 2 года назад +1

      Does undervolting reduce performance?

    • @connordaniels2448
      @connordaniels2448 2 года назад +7

      @@jhellier it should actually increase it if done properly. My cinebench score went up a few hundred points since temps stayed lower as the 12700k will thermal throttle under constant 100% loads in an application such as that.

    • @najeebshah.
      @najeebshah. 2 года назад +8

      @@jhellier No, barely. My 12900K is actually more efficient than a 5950X: calibrated loadlines and an undervolt resulted in a Cinebench score of 27k at 158W; mind you, for the 5950X to break the 27k mark it needs 166-175 watts with PBO. The total performance drop was 5-7% for me in all-core loads and none in gaming, and I actually end up boosting longer and higher due to the reduced temps.

    • @williehrmann
      @williehrmann 2 года назад +1

      @@connordaniels2448 What cooler did you put on top of it? I only have an air cooler, a Noctua NH-D15, and I can let mine run for hours pulling 205W without throttling even in summer temps, though it's close to max temps.

    • @connordaniels2448
      @connordaniels2448 2 года назад

      @@williehrmann I’ve got a 240mm Corsair H115i Elite Capellix but it was close to max temps so I decided to undervolt it. Honestly it doesn’t affect performance at all and dropped my temps by about 20c.

  • @mordacain3293
    @mordacain3293 2 года назад +13

    I'm quite happy with my 5800X3D after switching back to my AM4 platform, having struggled with a 12700K for a few months. The 12700K is great (and really great for the price if you live near a Micro Center), but nearly a year later Linux support is still pretty rubbish, and the power consumption is ridiculous if you let it run unbridled. The 5800X3D is easier to cool, uses far less power, and still works fine for my non-gaming workloads. Side note: Ryzen seems to get quite a bit more benefit out of extreme memory tuning than Alder Lake. I settled on running my old B-die set at 3200 MHz CL13 (with ridiculously tight sub-timings) and was able to eke out another 5-10% over the CL14 XMP profile; it even matches my tuned 4000 MHz (fclk at 1:1) settings while using less power and without hammering the memory controller/SOC as hard (a quick latency calculation follows this thread).

    • @brandonj5557
      @brandonj5557 2 года назад

      I have no clue what you are talking about with regard to Linux support being rubbish. I have an i9 12900KS and mostly run a lot of Docker containers for both Ethereum and Cardano. DDR5 has also been very reliable at 6000 MHz with two 32 GB DIMMs.
      Linux has been absolutely great for Alder Lake, especially if you update your kernel.

    • @mordacain3293
      @mordacain3293 2 года назад +1

      @@brandonj5557 Good for you. Not sure what distro you are using, but that was not my experience. Intel's Thread Director wasn't added until kernel 5.18 (released in May), so prior to that there was no intelligent thread direction, which led to very unpredictable performance since there was no delineation between P-cores and E-cores. Personally, I was only able to get predictable performance by disabling the E-cores altogether (this was actually the only way to boot some distros for quite some time after release). The easier hypervisor distros (e.g. Proxmox) are still running older kernel versions as well.
      I wasn't saying it won't work, it is just a bit rubbish in my opinion and takes far more effort than my Ryzen setup did.
      As far as the memory comparison goes, it is a bit apples vs oranges. I was running a DDR4 setup that was never particularly stable with tight timings, and I could only run up to 3600 in Gear 1. Since you're running DDR5, you're in Gear 2 by default, which puts far less strain on the memory controller. I could manually switch to Gear 2 and it was more stable; it was also slower.
      Again, not saying Alder Lake is bad by any means, I was just sharing a small portion of my experience with it.
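
To see why 3200 MHz CL13 with tight timings can hang with nominally faster kits, the first-word CAS latency in nanoseconds is a useful back-of-the-envelope number. A minimal sketch follows; real performance also depends on sub-timings and the fabric clock, which this ignores, and the comparison kits are illustrative.

```python
def cas_latency_ns(transfer_rate_mt_s: float, cl: int) -> float:
    """First-word latency: CL cycles at the memory clock (= transfer rate / 2)."""
    clock_mhz = transfer_rate_mt_s / 2       # DDR: two transfers per clock cycle
    return cl / clock_mhz * 1000             # cycles / MHz -> nanoseconds

for rate, cl in [(3200, 13), (3200, 14), (3600, 16), (4000, 18)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
# DDR4-3200 CL13 comes out around 8.1 ns, lower than the faster kits listed here.
```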

  • @caml1720
    @caml1720 2 года назад

    I could just be stupid, but it wasn't clear whether the white block in the middle of the graph is a number we're supposed to infer, and the lack of a unified color choice in the far-right text (when the key at the top has avg FPS in orange) was slightly confusing. It turned one second's work into five, not a real problem by any means. Thanks for the comparison!

  • @rizrizriz
    @rizrizriz 2 года назад +1

    I really like this kind of graph. Easy to digest and understand.

  • @Xenoray1
    @Xenoray1 2 года назад +3

    I was running a 3600 and upgraded to a 5800X last month; I didn't consider the 3D variant because it would have cost me €300 more.. I'm fine right now :D

  • @Vectral555
    @Vectral555 2 года назад +15

    AMD has definitely found a winning formula for a top gaming CPU. With that being said, next gen 3D V-Cache CPUs are gonna be absolute monsters. If a mid range Zen3 based CPU clocked to "just" 4.5GHz and added extra 64MB of cache, can do this much damage in games, imagine what a Zen4 based CPU with the same technology can do. If I had to make an educated guess, the 7800X3D is gonna be clocked to *at least* 5GHz (or maybe even more, considering the jump from 7nm to 5nm AND the non-3D V-Cache models will be boosting to *at least* 5.5GHz as shown by AMD in their Zen4 tech demo, also you have to consider the matured 3D V-Cache technology by then) with the same 96MB of L3, then you add the IPC increase, etc. Now, pause for a second and think about the fact that AMD will also have dual CCX 3D V-Cache CPUs coming as well, and those will be 12/16-core 192MB (96MB of L3 per CCX) monsters. And what's best of all, with the 3D V-Cache technology you don't need to invest in expensive memory (especially DDR5) since there is very little benefit thanks to all that cache, so something like a basic DDR5-5200 or DDR5-5600 memory will be more than enough and those kits have already started to come down to reasonable prices.

    • @tywoods7234
      @tywoods7234 2 года назад

      Yes! And then double the power consumption! And double the price! Don’t forget that.

    • @Vectral555
      @Vectral555 2 года назад +1

      @@tywoods7234 So a brand new product with improved performance on a brand new platform will cost more than the old one? Shocker! As far as consumption, don't worry about it, it'll still be much more efficient than their competitor ;)

  • @siddhantjain243
    @siddhantjain243 2 года назад

    New camera? Love this insane contrast and vivid colours 👍

  • @timfngwena
    @timfngwena 2 года назад +1

    do you have a new camera? your portrait shots and close ups are just that bit crispier especially if you consider how much youtube smashes down the video.

  • @barrycooper8640
    @barrycooper8640 2 года назад +4

    I thought I was mad swapping out a 5600x for a 5800x3d on my gaming rig. I was so wrong. It's awesome and you feel the difference.

  • @KesumaofEO
    @KesumaofEO 2 года назад +6

    That's what I look for, efficient performance and not just adding more raw power. I recently built a new PC with a 5600x and a 3060ti and I can already feel that my room warms up faster. Not everyone has an AC so I don't wanna deal with how much heat the higher end parts give off using so much wattage.

  • @khazad-
    @khazad- 2 года назад

    Please do the same exact tests when the new AMD CPUs come out. Thanks you for testing valorant and cs go since 500Hz monitor is coming out, need to know what cpu can handle constant 500 fps.

  • @user-bp8yg3ko1r
    @user-bp8yg3ko1r 2 года назад

    Great video! Very well-made!

  • @CarlosWGomez
    @CarlosWGomez 2 года назад +9

    I've always been more of an Intel guy myself (just never had an AMD device), but after seeing the power consumption numbers I made the switch to a 5600X immediately and do not regret it. Hoping to upgrade my GTX 1080 Ti FE soon to a 4080, depending on whether an 850W PSU can support it.

    • @rexomi17
      @rexomi17 2 года назад

      Just get the i5 12600K since it's the better deal, and you don't have to spend a chunk of money on an R7 or an 11th-gen i7.
      The 12600K is about 10% better than the 11900K.

    • @budgetking2591
      @budgetking2591 2 года назад +4

      Never fall in love with a company. I know people that only buy a certain brand of motherboard because that's what they are used to, so a lot of the time they end up with bad boards; brands and their motherboard quality are always changing.

    • @FILIPLAMBOV
      @FILIPLAMBOV 2 года назад

      Really doubt 850w would be enough 😂

    • @aerosw1ft
      @aerosw1ft 2 года назад

      4080 would be kinda overkill with a 5600x imo (and that psu worries me too, especially with transient spikes)

    • @ysnyldrm73
      @ysnyldrm73 2 года назад

      @@aerosw1ft With 4k monitor it is okay with 5600x.

  • @hbgl8889
    @hbgl8889 2 года назад +32

    I think denser, higher-capacity cache is definitely the future. Memory accesses are really slow and a larger cache can help with that. Once larger caches become more common, game developers might actually start optimizing for them, which could give an even larger performance boost (a small illustration of that kind of optimization follows this thread).

    • @bersK00
      @bersK00 2 года назад +7

      If game devs weren't optimizing their games taking into consideration the cache sizes all of your games would run like shit. Bigger cache sizes are essentially free performance.

    • @hbgl8889
      @hbgl8889 2 года назад +2

      @@bersK00I am sure they already optimize for cache size, but if you statically know that the average CPU has 100MB L3 cache instead of 4MB, you might actually adjust the code. And maybe in the future the L3 cache will be 1GB or more.

    • @defectiveclone8450
      @defectiveclone8450 2 года назад +2

      @@hbgl8889 ALL THIS will do is make those games run like trash on anything other then a CPU with 100MB of L3 :) ..

    • @tim3172
      @tim3172 2 года назад +1

      @@defectiveclone8450 Yes, that... that's the way technology progresses, yes.

    • @defectiveclone8450
      @defectiveclone8450 2 года назад +1

      @@tim3172 Game engines lag behind and are made to run on the most common hardware of the time.. :) I don't see anything needing a large cache anytime soon.. we need the market to move to these new CPUs before that happens.
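
As a concrete, hypothetical illustration of what "optimizing for cache" can mean in game code: one common technique is keeping the hot per-entity data contiguous (structure-of-arrays) so more of the working set fits in a large L3 such as the 5800X3D's 96 MB. The sketch below shows the layout idea only; it is not taken from any particular engine, and in Python it only mimics the data layout rather than demonstrating real cache timings.

```python
from array import array

NUM = 100_000  # hypothetical entity count

# Array-of-structs style: each entity mixes hot fields (position, velocity)
# with cold ones (name, AI state), so the hot loop drags cold data along.
aos = [{"x": 0.0, "y": 0.0, "vx": 1.0, "vy": 1.0, "name": f"e{i}", "ai_state": {}}
       for i in range(NUM)]

# Struct-of-arrays style: the hot fields live in dense, contiguous arrays,
# so far more entities' positions fit per cache line / per MB of L3.
xs, ys = array("f", [0.0] * NUM), array("f", [0.0] * NUM)
vxs, vys = array("f", [1.0] * NUM), array("f", [1.0] * NUM)

def update_aos(dt: float) -> None:
    # Touches every entity object, pulling cold fields through the cache too.
    for e in aos:
        e["x"] += e["vx"] * dt
        e["y"] += e["vy"] * dt

def update_soa(dt: float) -> None:
    # Streams through four dense arrays with predictable, cache-friendly access.
    for i in range(NUM):
        xs[i] += vxs[i] * dt
        ys[i] += vys[i] * dt

update_aos(1 / 60)
update_soa(1 / 60)
```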

  • @franchellevanheerden
    @franchellevanheerden 2 года назад +1

    Would love to see more benchmarks in open-world games, since that is what I mostly play.

  • @PiotrMalicki
    @PiotrMalicki 2 года назад

    Got upgraded yesterday. Could not be happier. It's a beast for MSFS

  • @JuniorCloudHouse
    @JuniorCloudHouse 2 года назад +8

    My Gigabyte B350 Gaming 3 is 5 years old now. With support for the 5800X3D. I'm using the 5600 now.
    Glad that i still have a CPU upgrade on it.

  • @PATroll_knocking
    @PATroll_knocking 2 года назад +4

    With half the power and almost the same performance compared to the 12900KS, 3D V-Cache might be the right way to go for an extreme gaming setup in an ITX build 🤔

    • @rapamune
      @rapamune 2 года назад

      at ~half the cost

    • @13thzephyr
      @13thzephyr 2 года назад

      @@rapamune not just half the cost since you are reusing a good majority of your components

    • @rapamune
      @rapamune 2 года назад

      @@13thzephyr Valid point, indeed

  • @PaulRoneClarke
    @PaulRoneClarke 2 года назад +1

    Reminds me of the days of the FX57. An absolute monster from about 15 years ago.

  • @steviejoe66
    @steviejoe66 2 года назад

    You might consider using a Battlefield game (V or 2042) for one of your CPU benchmarks. They're known to be pretty heavy on the CPU. Would be a similar situation to Apex where it's hard to get a consistent benchmark but it still might be a nice thing to add.

  • @michaelthompson9798
    @michaelthompson9798 2 года назад +5

    I ordered mine 🥰👍 at launch and got it 2 weeks later and it’s been running on my 1080P ITX Skyreach Mini S4 setup with Alienware 360hz monitor and the fps is amazing🤩🤯, the temps vg, the fan noise much lower than my older R5 3600 (non-x) under gaming loads etc. The power draw (on5800x3D) was lower under load than the R5 3600. It’s definitely a powerhouse chip that down the road in a few years will still command a hefty premium 😉 and perform well for a long time.

  • @callmemesh
    @callmemesh 2 года назад +4

    Gonna wait, so I can spend less cash for more cache.

  • @riopower
    @riopower 2 года назад

    Less power and heat is very important for an SFF build. Thanks OT, my next upgrade will be the 5800X3D. Hope I can keep the Velka 7 when the RTX 4000 series is released.

  • @johanescandra
    @johanescandra 2 года назад +1

    dope studio man

  • @matthieuzglurg6015
    @matthieuzglurg6015 2 года назад +3

    Currently using a 3700X that I've overclocked the crap out of (sustaining 4.6GHz all-core, pretty happy with it, especially on my B450 Tomahawk board).
    I was considering waiting until AM5 CPUs drop, going with the first gen on that socket and then having the ability to upgrade as CPU generations go.
    But looking at this now, I'm wondering: I could very well just drop in a 5800X3D, get a GPU upgrade, and keep my RAM and motherboard for a few more years.
    I was drooling quite a bit over the prospect of a totally new system, but extending the life of the current one is also quite tempting.

  • @valentin3186
    @valentin3186 2 года назад +3

    The show starts with a 5600X3D.
    The 5800X3D was down-clocked to meet the 142W package power limit,
    but that headroom would let a 5600X3D become the new budget king.
    95W TDP? Let's go.
    Dead platform :)

    • @egalanos
      @egalanos 2 года назад +1

      AMD stated that they locked out over clocking due to the lower voltage limits of the V-Cache die.
      The higher the frequency, the higher the voltage is needed to be able to switch the transistor within that shorter timeframe.
      The V-Cache die is designed using a denser cell library as they can fit more cache per die and to lower power consumption. This is ideal for Epyc processors.
      Unsure if the lower voltage limit is due to that dense cell library, the hybrid bonding process, or perhaps limitations of both L3 caches being in the same voltage domain?

  • @vi683a
    @vi683a 2 года назад

    Will purchase this for my next build when price drops on NEXT GEN release..

  • @harbscantina
    @harbscantina 2 года назад

    Ok...sounds good. But if you compare it to a 12900k/s will it also be good for things like editing videos?

  • @phero6933
    @phero6933 2 года назад +10

    I love my x3D! Upgraded from the basic 5800x and saw a massive jump in low frame rates and tighter frame timings.

    • @kelownatechkid
      @kelownatechkid 2 года назад +2

      I migrated from the 5900x and it was way worth it. I put the 5900x in a server instead

    • @Cremepi
      @Cremepi 2 года назад

      Yeah I f*d up and got a 5800x my 5600x almost performs better

    • @MightBeBren
      @MightBeBren 2 года назад +1

      @@Cremepi ??? how does your 5600x almost perform better

    • @Cremepi
      @Cremepi 2 года назад

      @@MightBeBren Same single-core clock speeds, with lower temps. Unless you're doing heavy rendering or production work you'll mostly be in single-core workloads. Even gaming and streaming with multiple processes, my 5600X is better value for the money.

    • @MightBeBren
      @MightBeBren 2 года назад

      @@Cremepi what temps do you get on your 5600x? my 5800x doesnt go above 65c in games with a 4850mhz all core overclock at 1.4v and idles at 24c

  • @tudorgoina9997
    @tudorgoina9997 2 года назад +4

    Honestly I would just wait for amd ryzen 7000 to come out and I would wait for pci 5.0 and ddr5 to get cheaper

    • @katinahoodie
      @katinahoodie 2 года назад

      What do you need pci.5 for at this time, or at the time 7000 series comes out?

    • @mikeymaiku
      @mikeymaiku 2 года назад

      @@katinahoodie if your on ryzen 5000 already, theres no reason to get this and just wait for the next gen stuff is probly what he meant

  • @anant3940
    @anant3940 2 года назад +1

    great video as always

  • @Bass_therapy_
    @Bass_therapy_ 2 года назад

    What lens do you use ? they look amazing

  • @aliananza1990
    @aliananza1990 2 года назад +5

    What's even more awesome is that you can currently get the standard 5800X for 300-350 euros in Europe. Insane value for the fps it can output.

    • @aliananza1990
      @aliananza1990 2 года назад +2

      @@boutyemusic8296 lol almost same got mine 3 weeks ago undervolted it last night. Awesome cpu

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      @@boutyemusic8296 If you're building from scratch at those prices there's also the 12700K; overclock the hell out of it and you get a similar level of gaming performance while having more cores.

    • @aliananza1990
      @aliananza1990 2 года назад

      @@HosakaBlood yes thats good option too for new builds. I just upgraded from a 3600 on my sff using a x570 Aorus pro wifi

    • @aliananza1990
      @aliananza1990 2 года назад +1

      @@boutyemusic8296 about the same 1.25v at 4.6 stable - tested max temp of 82 degrees.

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      @@boutyemusic8296 if you already have amd board well your choice is ryzen 5000 then

  • @Breakfast_of_Champions
    @Breakfast_of_Champions 2 года назад +2

    A bit like the old i7-5775C with its 128mb L4 cache, to get the most out of an aged platform. The question is, how much of the game's main loop fits inside the special cache.

    • @AerynGaming
      @AerynGaming 2 года назад

      the 5775c's cache had 1/40'th of the bandwidth and 5x the latency IIRC. It was actually slower than overclocked DDR4 on a 6700k.

  • @matosindustriesllc2026
    @matosindustriesllc2026 2 года назад

    Love your videos but Ryzen 3d was released a while ago?

  • @Girvo747
    @Girvo747 2 года назад

    That power usage difference really matters to me in south east Queensland. In summer power hungry parts are brutal to live next to lol

  • @ilkerYT
    @ilkerYT 2 года назад +5

    You should have added Escape from Tarkov to the tests; the 5800X3D's 100MB of cache is really impressive there.

    • @kelownatechkid
      @kelownatechkid 2 года назад

      Or any VR/sim/RTS testing haha, the 5800x3d stomps all over the competition

    • @MarioPL989
      @MarioPL989 2 года назад

      Yeah, 5800x3D is huge in tarkov, even with weak gpu. I got 30-50% fps increase from 1700x with rx 470 gpu. With better gpu I could get wayyyyy more.

  • @Mikktor
    @Mikktor 2 года назад +3

    "No one's going to be playing at 200fps plus."
    Every time i hear this i feel like the only one in the world with a 240fps monitor. Though of course it is more important for competitive games i also enjoy having higher fps in single player games.

    • @T2NA
      @T2NA 2 года назад

      Do you play single player / story games at 200fps? I've always just capped at 120 as a force of habit, never really seen the benefit to it but I've always been kinda tempted to just let the system go wild and see what frames I could get in Horizon or something!

    • @crabosity
      @crabosity 2 года назад

      He probably says this because singleplayer games know that people play for the story/world so they can bump up the graphics, sacrificing fps so you wouldn’t hit 200fps plus in most singleplayer games if playing on high-ultra

  • @bazebaze2657
    @bazebaze2657 2 года назад

    would love to see how that performance translates to programs like blender or C4D, amazing video though!

  • @abaj006
    @abaj006 2 года назад +2

    Half the power usage for better gaming FPS, that is just insane! I have also seen other reviews show that the 5800X3D is the best CPU for VR.

  • @N0N0111
    @N0N0111 2 года назад +3

    1080p low is an e-sports thing for sure, chasing those ultimate high FPS numbers.
    To be fair, a lot of non-pro e-sports gamers are moving to 27" 1440p monitors.
    Showing some 1440p benchmarks would be fairer, because we know the 3D cache gives a big boost at 1080p.
    The fact that at 1:55 the Intel 12900KS has 1GHz more boost than the 5800X3D is insane, and the power draw is 2x on top of that, which is just bonkers!
    We have already seen AMD demo their upcoming gaming CPUs at 5.5GHz, so Intel is in big trouble!

    • @Njazmo
      @Njazmo 2 года назад

      I actually moved to 65" 4k OLED, so the CPU doesn't matter anymore. Gotta have some GPU power.

  • @nayanpitlam5621
    @nayanpitlam5621 2 года назад +6

    12700k is the sweet spot high end cpu atm!
    change my mind

    • @danielparks9035
      @danielparks9035 2 года назад

      Yeh that or 5700x/5800x for 50-75 less is still pretty good.

    • @chovekb
      @chovekb 2 года назад

      5900X tyvm bye.

    • @wixxzblu
      @wixxzblu 2 года назад

      i'd say 5800x or 12700F

    • @kelownatechkid
      @kelownatechkid 2 года назад

      12700F for new buyers, no contest. For upgrades, 5800x3d on the high end or 5900x for mid tier

    • @rdmz135
      @rdmz135 2 года назад

      For mixed workloads best value is the 12700F.
      For pure gaming its the 5600.

  • @Motishay
    @Motishay 2 года назад

    This is a proper send-off to the am4 socket, it was amazing while it lasted

  • @AwSomeNESSS
    @AwSomeNESSS 2 года назад

    I know you need to strive for accuracy above all else, but the issue with using non-standard maps for testing in those e-sports games is that the biggest (re: noticeable) drops usually do come in the all-out fights when you have a server full of players. So for instance, the benchmark map for CS or the training plaza in Val are known to not be representative for a CPU benchmark.
    For CS specifically, I've seen that running a 5v5 bot map can actually be more stressful for a CPU than a regular match, due to the added workload of the local computer having to run the bot calculations on top of the standard match calculations. In that case, you would be showing a worst-case scenario in your data, and regular matches of CS would actually be better than the data shown in the video. Would be an interesting separate video to find and show the best methods for testing these games.

    • @zazabean23
      @zazabean23 2 года назад

      Think of it in a simple way: there are synthetic loads and real-world loads, two kinds of system load that will perform very differently on different CPUs due to memory latency, cache size and latency, frequency, core-to-core latency, and more. These tests are cool, but take them with a grain of salt; they're aimed at casual viewers and the data isn't perfectly accurate. The only issue is that simulating a real-world scenario is slightly difficult. You could come up with a synthetic benchmark that is more intensive and more random.

  • @Amzyy
    @Amzyy 2 года назад +8

    Currently on a 5600X and everything is flawless, but when Zen 4 comes out and this CPU drops in price I might upgrade to it, since it runs on the same motherboard.

    • @gaeborg
      @gaeborg 2 года назад

      How long will amd be making these

    • @sebastianguerraty6413
      @sebastianguerraty6413 2 года назад +3

      zen 4 is going to use a new socket :/

    • @Ladioz
      @Ladioz 2 года назад

      I'm still using my i7 8700k! I have no idea when I should upgrade.... some people say im fine and some people say the newer CPUs make a pc feel so much smoother and snappier..

    • @cryx4
      @cryx4 2 года назад

      @@Ladioz 8700k is still a decent gaming CPU. I wouldn't upgrade right now, but you could consider upgrading to 12th gen when their prices fall a bit, or zen4 if it strikes your fancy

    • @aerosw1ft
      @aerosw1ft 2 года назад +1

      @@Ladioz upgrade when you feel you need to upgrade. Yes newer CPUs will definitely improve your performance but if what you're having now is enough for you then why spend the money?

  • @CannedMarmalade
    @CannedMarmalade 2 года назад +6

    Damn the i9 sucking up almost 190W of power

    • @MarioPL989
      @MarioPL989 2 года назад

      This is just Valorant. All threads power usage is way higher, about 250-320W, while ryzen never gets above 100-130W

  • @nicosilvestri7292
    @nicosilvestri7292 2 года назад

    Could you build a custom watercooled PC in the Fractal Design Torrent Nano case? It would be such a good-looking PC and I'm interested to see how its thermals would be.

  • @shadowarez1337
    @shadowarez1337 2 года назад

    I finally got hold of one of these for my Ghost S1 build / 3070 Ti FE / new RTX spine / 64GB 3800MHz, and of course the 5900X3D and 5950X3D immediately get rumored for release 🤦‍♂️