The i9 Killer? AMD Ryzen 5800X3D Tested

  • Published: 15 Dec 2024

Comments • 958

  • @WilliamChoochootrain
    @WilliamChoochootrain 2 года назад +1390

    The most insane part is you can get this performance with a DDR4 board from 2017.

    • @MarioPL989
      @MarioPL989 2 года назад +101

      With 3200-3400 MHz RAM instead of 3800. Above 3400 there is almost no difference for the 5800X3D.

    • @NicksonYT
      @NicksonYT 2 года назад +126

      AMD's chipset support is admirable

    • @mastroitek
      @mastroitek 2 года назад +17

      True, such a good platform. I still have a 4790K and was looking for an upgrade. The 5800X looks fabulous, like most of the 5xxx series, but unfortunately they don't have an iGPU, which is a deal-breaker for me

    • @古賀惣仁
      @古賀惣仁 2 года назад +25

      @@mastroitek Not having an iGPU is not as bad as you think (in my opinion). Most systems have a universal GPU driver pre-installed, so as long as you have pretty much any GPU installed in your system, you should be able to boot. Some operating systems like Windows can also automatically detect what GPU you are using and download its latest drivers for you. I guess the downside is that you won't be able to use your computer if you don't have a GPU on hand.

    • @omarfmercado
      @omarfmercado 2 года назад +29

      @@mastroitek I have a 5950X and not having an iGPU has never been an issue. Just get a cheap GPU, which will give you more performance than any iGPU.

  • @Flyfishindog
    @Flyfishindog 2 года назад +207

    Love the color coding between brands. Makes it much easier to compare overall at a quick glance

  • @luckyowl10
    @luckyowl10 2 года назад +630

    7:31 that power consumption difference is HUGE. The 5800X3D draws 82.6 W on that Valorant 1080p low test while the 12900KS draws 188.3 W. That's roughly 2.3x the power for about 4% less FPS. Intel sure likes to make the power bill high. (Rough math on those numbers is worked out just after this thread.)

    • @MarioPL989
      @MarioPL989 2 года назад +110

      During summer this is a dealbreaker tbh.

    • @HosakaBlood
      @HosakaBlood 2 года назад +11

      Well, you're comparing it against a CPU with almost twice the cores, so the power draw comparison is quite pointless here. The Intel chip can game at a similar level to the 3D Zen part while performing at a similar level to the 5950X in multi-threaded tasks, so you're getting the best of both worlds going with Intel as well. The 3D is amazing for gaming, but having only 8 cores is a bummer for many who use the PC for both work and gaming, not to mention that AMD wants $500 for an 8-core CPU in 2022 while the competition gives similar gaming performance and more cores

    • @lawrencetoh9510
      @lawrencetoh9510 2 года назад +82

      @@HosakaBlood The 12900K is $599 and the 5800X3D is $449... so they're both "around $500"? Ya, good math.
      Did you calculate the motherboard the same way as well?😂

    • @rdmz135
      @rdmz135 2 года назад +58

      @@lawrencetoh9510 Don't forget the crazy expensive DDR5 needed for Intel to unlock its maximum potential

    • @luckyowl10
      @luckyowl10 2 года назад +59

      @@HosakaBlood You know the 12900KS has only 8 real performance cores (same as the 5800X3D) plus 8 efficiency cores, which are only about Skylake-level performance, right?
      Also, power draw is power draw; the number of cores doesn't matter. You can have server CPUs with 3x the number of cores and the same TDP as a 12900KS, because they run at lower frequencies.
      If a component has 0-30% more performance for 2x the power draw, that's trash power efficiency.
      If you don't care about power draw at all, you might as well get a 12900KS and a 3090 Ti, overclock them to the max and use a 1000 W PSU.
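
A quick back-of-envelope check of the figures quoted in this thread. The 82.6 W and 188.3 W draws and the roughly 4% FPS gap are the commenter's numbers from the video's Valorant test, not re-measured; the sketch below only works out the ratios:

```python
power_5800x3d = 82.6    # watts, 5800X3D in the Valorant 1080p low test (commenter's figure)
power_12900ks = 188.3   # watts, 12900KS in the same test (commenter's figure)

ratio = power_12900ks / power_5800x3d
print(f"12900KS draws {ratio:.2f}x the power of the 5800X3D")   # ~2.28x

# Relative performance per watt, assuming the 12900KS delivers ~4% fewer frames:
fps_5800x3d, fps_12900ks = 1.00, 0.96
ppw_ratio = (fps_5800x3d / power_5800x3d) / (fps_12900ks / power_12900ks)
print(f"5800X3D frames-per-watt advantage: {ppw_ratio:.2f}x")   # ~2.37x
```

By that arithmetic the 5800X3D comes out roughly 2.3-2.4x ahead in frames per watt in this one test.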

  • @HuNtOziO
    @HuNtOziO 2 года назад +217

    I got mine recently. In MMOs this thing is an absolute powerhouse. I went from a 5800X to the 5800X3D, and in GW2, Lost Ark, Destiny 2 and FFXIV the gains in areas with loads of players are absolutely mind-boggling. The 1% lows are also much better. Far smoother overall.
    11/10 would buy again. Looking forward to Zen 4 V-Cache models!!

    • @busterscrugs
      @busterscrugs 2 года назад +43

      AMD's ideal customer right here, buying the same CPU twice lmfao

    • @Resio_23
      @Resio_23 2 года назад +1

      I mostly play Destiny 2. I have the 10900K right now, but I might switch to AMD when the new 7000 series comes out

    • @Travelfast
      @Travelfast 2 года назад +3

      Same thoughts about my 5800X3D. Games are just so much more fluid now.

    • @narius_jaden215
      @narius_jaden215 2 года назад +13

      @@busterscrugs Maaaan most computer people are nuts like this, you must know this :P.

    • @charliek9394
      @charliek9394 2 года назад +16

      My cousin bought this just for WoW lmao. He went from a 4600K that gave him 24 fps to this and now gets about 210 fps with a GTX 1070. It's mental.

  • @steel5897
    @steel5897 2 года назад +93

    That massively lower TDP is what sells it for me over the i9.
    I really hope the power consumption creep doesn't get out of hand. Intel and Nvidia in particular are going ham with their new/upcoming products. If you start recommending a 1000w power supply for a god damn gaming rig, even if high end, to me that's just insane.

    • @gmaxsfoodfitness3035
      @gmaxsfoodfitness3035 2 года назад +3

      PSUs are most efficient at around 50% usage. This is where their rating (Bronze, Gold, Platinum, etc.) is achieved, and efficiency drops off past roughly 55% usage. If you build a PC that uses around 500 watts at full load, it's wise to go for a 1000 W PSU over, say, an 850 W one. 1000 W PSUs have been recommended since late 2020 for builds with the highest-end GPUs. If you ever use Newegg's PC builder or PCPartPicker you can hit 400-500 watts easily just messing around. You also have to account for power spikes, especially with Nvidia GPUs, so it's smart to have that extra headroom. I've even seen a 1000 W PSU shut off for safety when a 3090 was spiking in one of Linus' videos; he ended up having to go with 1200 W for everything to run smoothly with no shutdowns. The 3090 Ti uses even more power, and it doesn't seem like Nvidia cares about efficiency at this point. (A rough sizing sketch follows after this thread.)

    • @Nilssen84
      @Nilssen84 2 года назад +7

      Yes, it just shows how much better tech AMD actually has. Intel and Nvidia can only stay slightly on top because they run twice the watts through their CPUs/GPUs…

    • @seth4321
      @seth4321 2 года назад +3

      @@gmaxsfoodfitness3035 Have you seen the rumors for the upcoming 40-series Nvidia cards? Possibly over 400 W for some of the highest-end SKUs? Something has to change. Hopefully AMD will lead the charge with RDNA3 like they've done with Ryzen.
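
A rough PSU-sizing sketch of the logic in this thread: run near the ~50% efficiency sweet spot and leave headroom for transient spikes. The 1.5x spike factor and the 500 W example load are assumed illustrative values, not measurements:

```python
def suggest_psu_watts(steady_load_w: float,
                      efficiency_sweet_spot: float = 0.5,
                      spike_factor: float = 1.5) -> float:
    """Suggested PSU rating: the larger of the sweet-spot rule and spike headroom."""
    for_sweet_spot = steady_load_w / efficiency_sweet_spot   # e.g. 500 W load -> 1000 W PSU
    for_spikes = steady_load_w * spike_factor                # assumed 1.5x transient allowance
    return max(for_sweet_spot, for_spikes)

print(suggest_psu_watts(500))   # 1000.0 -> matches the "500 W build, 1000 W PSU" rule of thumb
```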

  • @johnnyelijasialuk3879
    @johnnyelijasialuk3879 2 года назад +404

    All this boils down to is adding more cache memory and getting lower wattage; efficiency has come a long way for AMD.

    • @bgop346
      @bgop346 2 года назад +11

      AMD is lucky they got onto TSMC silicon, though their design teams obviously helped them out a lot more than the node did.

    • @pranavbansal5440
      @pranavbansal5440 2 года назад +5

      Efficiency doesn't even matter that much on desktops

    • @saadanjum8859
      @saadanjum8859 2 года назад +49

      @@pranavbansal5440 it matters to some people though

    • @Ahfeku
      @Ahfeku 2 года назад +61

      @@pranavbansal5440 I don't want my electric bill to be 400 bucks every month tho.

    • @joe_ferreira
      @joe_ferreira 2 года назад +48

      @@pranavbansal5440 Personally, I am tired of my office being 80-90 degrees F all the time because of my gaming desktop, and getting AC dedicated to one room just proves the point that they all run too hot. CPUs and GPUs need to be more efficient.

  • @tobo4694
    @tobo4694 2 года назад +308

    Imagine beating an i9 using only 80w and just adding more cache

    • @evo8150
      @evo8150 2 года назад +11

      Crazy

    • @LordRobbert
      @LordRobbert 2 года назад +17

      I'd love to see a 12600K with 96 MB of cache instead of 20 MB.

    • @jonathanariaslorenzo3214
      @jonathanariaslorenzo3214 2 года назад +1

      well, now we don't have to...

    • @eliezerortizlevante1122
      @eliezerortizlevante1122 2 года назад +1

      imagine beating i9 with 9W usage

    • @Abbrevious
      @Abbrevious 2 года назад +2

      @@eliezerortizlevante1122 Imagine that 9w is run off a solar panel.

  • @Lubinetsm
    @Lubinetsm 2 года назад +48

    I upgraded from a 3600 yesterday, and my god, I have never seen some of my heavily memory-intensive games run that smoothly before. Definitely a worthy purchase!

    • @evilemil
      @evilemil 2 года назад +2

      I'm currently using the 3600 and planning to upgrade, targeting the 5800X3D. Should I go for it or upgrade my GPU instead?

    • @Ladioz
      @Ladioz Год назад +4

      @@evilemil cpu is more important than gpu. GPU gives you extra frames, CPU makes everything smooth

    • @alvabravo5655
      @alvabravo5655 Год назад +1

      @@evilemil I just upgraded my 3600 to a 5800X3D yesterday, got it for $300, and the extra cash let me buy an RX 6800 XT for $500 and put my RTX 2060 to rest...
      Now I live in another dimension..

  • @kennadod2080
    @kennadod2080 2 года назад +25

    I love mine. Fantastic chip and just slotted into my old system. Nice to see a company not forcing you to upgrade other pc parts too

  • @TheStubbenator
    @TheStubbenator 2 года назад +37

    Buying the 5800X3D was overall a way better investment for me than upgrading my RAM and motherboard to support a future-gen processor of similar speed

  • @あなた以外の誰でもない
    @あなた以外の誰でもない 2 года назад +150

    Just a friendly reminder: the 5800X is competing with Intel's next-gen 12th-gen CPUs instead of the same-gen (11th-gen) CPUs.
    That alone shows how powerful it is

    • @jondonnelly3
      @jondonnelly3 2 года назад +7

      u sound like stewie from family guy :D

    • @vishalsharma23k
      @vishalsharma23k 2 года назад +2

      @@jondonnelly3 Now that you mention it , I hear stewie as well XD

    • @RafitoOoO
      @RafitoOoO 2 года назад +16

      It's worse: AMD released Zen 3 to compete with 10th gen. 11th gen was released 6 months after Zen 3 and still lost in all SKUs lmao.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 2 года назад +2

      @@RafitoOoO YUP, true. If they release a 5900X3D then its gaming performance might be on par with 13th gen.

    • @thelittleewokboss9929
      @thelittleewokboss9929 2 года назад +2

      The 5800X3D was meant to dethrone the 12900K/KS as the gaming king

  • @xlinnaeus
    @xlinnaeus 2 года назад +113

    Late, but we know it’s the best looking review in the tech space!!

    • @prot018
      @prot018 2 года назад +13

      @Od1sseas *best looking*

    • @Underground.Rabbit
      @Underground.Rabbit 2 года назад

      bit gay but okay

    • @aerosw1ft
      @aerosw1ft 2 года назад +1

      @@prot018 I'd argue HWC is on par. Those guys really go above and beyond with their cinematography

    • @NicksonYT
      @NicksonYT 2 года назад +1

      Absolutely

    • @lachlantula
      @lachlantula 2 года назад +2

      OT for eye candy and itx+esports perspective
      HUB/GN for detail
      us aussies have tech yt in a chokehold atm lol

  • @jelipebands1700
    @jelipebands1700 2 года назад +11

    Dude, your charts are the best now. I love the 1% and 0.1% lows on the chart, the most important metrics for sure. You also did an amazing job benchmarking while playing Apex for 25 minutes in a real match. I know you spent a long time making this video and it shows. Great work

  • @humbleweirdo2860
    @humbleweirdo2860 2 года назад +60

    It's by far the best CPU for sims in VR. All the frame drops I used to get in DCS have completely vanished with the 5800X3D, and that's coming from a 5950X!

  • @killerindustries49
    @killerindustries49 2 года назад +132

    I wanted this CPU so bad... But I went with the 5800X instead; got it on sale at Micro Center for $250. The 3D model is a bit expensive atm, but for a good reason.

    • @GamingXpaul2K10
      @GamingXpaul2K10 2 года назад +18

      In the UK right now the 5800X3D is £180 (~$220) more expensive than the regular model on Amazon. Really not worth the price difference imo. I'd say only get the 3D models if they're comparably priced.

    • @killerindustries49
      @killerindustries49 2 года назад

      I agree with you, at least we have options and decent price points now 👌🏼

    • @Paraclef
      @Paraclef 2 года назад

      @@GamingXpaul2K10 Link? Because I can't understand how you could get it that cheap.

    • @itisabird
      @itisabird 2 года назад +4

      It's not for a good reason in my opinion. In Germany you can buy the 5800X3D for 480 euros, while the top-of-the-line 5950X is only 50 euros more expensive. Yes, I know the 5950X won't be as good at gaming, but it is overall a better CPU.

    • @syed2694
      @syed2694 2 года назад +12

      @@Paraclef he meant £180 MORE than the 5800X. It's £440 and the 5800X is about £270

  • @bzdtemp
    @bzdtemp 2 года назад +7

    Upgraded from a Ryzen 3600 to the 5800X3D and for me it was just a really good step. For sure worthwhile, as any other upgrade to similar performance would be a lot more costly - and also time-wise, since there is zero need to reinstall Windows or anything.
    The only trap is that you must update the BIOS on your motherboard with the old CPU still installed, and then you can swap the CPU.

  • @narius_jaden215
    @narius_jaden215 2 года назад +10

    I went from a 9900k to this thing and I'm pretty happy with how things have gone. I find some of the games I play to be less stuttery than before, which is really what I was looking for. FPS is everything, don't get me wrong, but it's more about how stable those frames are delivered.

    • @danielkammer3244
      @danielkammer3244 2 года назад +1

      Was it worth it? I have a 9900K and everything runs smoothly

    • @QueckSilber87
      @QueckSilber87 2 года назад

      Yeah, say plz... is switching from an i9 9900K to this worth it?

    • @mickeyriola9087
      @mickeyriola9087 Год назад

      It's just rough to spend the cash, 'cause my CPU is really fast, has the same number of cores, and runs faster as far as raw clock speed goes, idk man

  • @mike605
    @mike605 2 года назад +5

    I just bought the 5800X3D and wow, what a difference it made compared to the 3700X that I had. MSFS looks so good now and runs smooth. I'm a very happy camper with this chip in my system.

    • @williamwen3477
      @williamwen3477 2 года назад

      Nice, I'm gonna go to a 5800X3D too from a 3700X

  • @Dionyzos
    @Dionyzos 2 года назад +31

    A 5600X3D would be very interesting

    • @killerindustries49
      @killerindustries49 2 года назад +6

      A budget gamer's dream

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      A 5950X3D maybe, so I could switch from my 12900K. Otherwise I'd be losing almost half my multi-tasking performance for 3% better gaming, idk man

    • @WhiteoutTech
      @WhiteoutTech 2 года назад +5

      @@HosakaBlood Huh, why would you switch from a 12900K? Just throw in a 13900K if you've already got an Intel mainboard.

    • @marsovac
      @marsovac 2 года назад +2

      @@WhiteoutTech Maybe for less power... the 12900K uses 240 W in multicore loads. Intel and Nvidia should be scolded for doing this: technically winning on paper with absurd power draws, because they cannot take a loss. It will make AMD do the same thing, to the disgrace of the industry. It's been said that the new RDNA and Ryzen parts will use more power, unfortunately.

    • @Mr11ESSE111
      @Mr11ESSE111 2 года назад +1

      @@HosakaBlood A 5950X3D and 5900X3D would have no point: productivity would be noticeably worse than the normal CPUs, gaming would be the same as or worse than the 5800X3D, the chip would be too hot with 16 cores, and lastly too expensive. So 12- and 16-core CPUs with 3D cache don't make sense

  • @vailpcs4040
    @vailpcs4040 2 года назад +4

    This was the comparison I was looking for, thanks!

  • @Ken-zg3ze
    @Ken-zg3ze 2 года назад +81

    The graphs seem a little confusing to me. I completely understand that it's 0.1%, 1%, and avg, in that order, but my brain wants the color to represent a range rather than a delta. Maybe it's because of the inclusion of 3 values on 1 bar rather than the usual 2. Great comparison though. AMD is killing it.

    • @TheAnoniemo
      @TheAnoniemo 2 года назад +10

      It's the colors that throw the readability out the window. They probably should be lighter shades of the same color instead of contrasting ones.

    • @reubnn
      @reubnn 2 года назад

      you dont understand cause ur a woman

    • @PunzL
      @PunzL 2 года назад

      @@TheAnoniemo Using pure white to represent the 1% average kinda creates a divide between the average and the 0.1% and the placement of the numbers does not help either. I think a different color other than white or swapping colors between 1% and 0.1% would easily fix the issue

  • @mordacain3293
    @mordacain3293 2 года назад +13

    I'm quite happy with my 5800X3D after switching back to my AM4 platform after struggling with a 12700K for a few months. The 12700K is great (and really great for the price if you live near a Micro Center), but nearly a year later Linux support is still pretty rubbish and the power consumption is ridiculous if you let it run unbridled. The 5800X3D is easier to cool, uses far less power, and still works fine for my non-gaming workloads. Side note: Ryzen seems to get quite a bit more benefit out of extreme memory tuning than Alder Lake - I settled on running my old B-die set at 3200 MHz CL13 (with ridiculously tight sub-timings) and was able to eke another 5-10% over the CL14 XMP profile, and it even matches my tuned 4000 MHz (fclk at 1:1) settings while using less power and without hammering the memory controller/SoC as hard. (A quick latency calculation on those timings follows after this thread.)

    • @brandonj5557
      @brandonj5557 2 года назад

      I have no clue what you are talking about in regards to Linux support being rubbish. I have an i9 12900KS and mostly run a lot of Docker containers for both Ethereum & Cardano. DDR5 has also been very reliable at 6000 MHz with two 32 GB DIMMs.
      Linux has been absolutely great for Alder Lake, especially if you update your kernel

    • @mordacain3293
      @mordacain3293 2 года назад +1

      @@brandonj5557 Good for you. Not sure what distro you are using, but that was not my experience. Intel's Thread Director wasn't added until kernel 5.18 (released in May), so prior to that there was no intelligent thread direction, which led to very unpredictable performance as there was no delineation between P-cores and E-cores. Personally, I was only able to get predictable performance by disabling the E-cores altogether (this was actually the only way to boot some distros for quite some time after release). The easier hypervisor distros (i.e. Proxmox) are still running older kernel versions as well.
      I wasn't saying it won't work, it is just a bit rubbish in my opinion and takes far more effort than my Ryzen setup did.
      As far as the memory comparison goes, it is a bit apples vs oranges. I was running a DDR4 setup and it was never particularly stable running tight timings, and I could only run up to 3600 in Gear 1. Since you're running DDR5, you're in Gear 2 by default, which puts far less strain on the memory controller. I could manually switch to Gear 2 and it was more stable; it was also slower.
      Again, not saying Alder Lake is bad by any means, I was just sharing a small portion of my experience with it.
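
For the memory-tuning point above, first-word CAS latency in nanoseconds is what tightened timings buy you. The 3200 CL13 and 3200 CL14 figures come from the comment; the 3600 CL16 and 4000 CL16 rows are assumed typical examples for comparison, since the comment doesn't list timings for those kits:

```python
def cas_latency_ns(data_rate_mts: int, cl: int) -> float:
    """First-word CAS latency in ns: CL cycles at the memory clock (half the MT/s rate)."""
    memory_clock_mhz = data_rate_mts / 2       # DDR transfers twice per clock
    cycle_time_ns = 1000 / memory_clock_mhz
    return cl * cycle_time_ns

for rate, cl in [(3200, 14), (3200, 13), (3600, 16), (4000, 16)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")

# DDR4-3200 CL14: 8.75 ns   (the XMP profile mentioned above)
# DDR4-3200 CL13: 8.12 ns   (the tightened daily settings)
# DDR4-3600 CL16: 8.89 ns   (assumed common kit, for comparison)
# DDR4-4000 CL16: 8.00 ns   (assumed timings for the 4000 MT/s kit)
```

So a tightened 3200 CL13 kit lands within a fraction of a nanosecond of a 4000 CL16 kit on this one metric, which is consistent with the commenter's experience; sub-timings and bandwidth matter too, so this is only part of the picture.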

  • @barrycooper8640
    @barrycooper8640 2 года назад +4

    I thought I was mad swapping out a 5600x for a 5800x3d on my gaming rig. I was so wrong. It's awesome and you feel the difference.

  • @hyypersonic
    @hyypersonic 2 года назад +1

    New background looks awesome man

  • @sighheinrich
    @sighheinrich 2 года назад +7

    Great benchmark review. Really well made; it took everything I wanted into account. The power consumption difference is just crazy - that alone should be reason enough to buy AMD for most people.

  • @crimsonram77
    @crimsonram77 2 года назад +1

    5:12 sheeesh absolute pogger skillz

  • @BSEUNHIR
    @BSEUNHIR 2 года назад +7

    You're not the only one testing the 5800X3D so late after release. Is that just a coincidence or is there a second round of samples or something?
    I ordered this beast on release day; love it for my shitty unoptimized DX9 games that are permanently draw-call limited. Really helps in those, like PlanetSide 2.

  • @henry3397
    @henry3397 8 месяцев назад

    Found your channel not long ago and your shots are amazing. Especially when explaining something through a visual, it hits my eyes at the same time it’s hitting my imagination

  • @Petr75661
    @Petr75661 2 года назад +39

    It's crazy how inefficient Intel is.

    • @MightBeBren
      @MightBeBren 2 года назад +2

      I'm not a fan of Intel just throwing power at stuff to compete with lower-power hardware

    • @marcturcotte1962
      @marcturcotte1962 2 года назад +2

      I remember saying the same thing about AMD Bulldozer.

    • @deadly_mir
      @deadly_mir 2 года назад +2

      We must remember, though, that 12th gen is basically like having 2-3 CPUs under one heat spreader (e.g. a 12900K is like having an 11900K and two 4670Ks under one heat spreader). Obviously it's gonna take more power and produce more heat, because 1+2=3, not 1+2=1. People like to completely overlook that E-cores exist in these comparisons. My 12600K at its current 5.3 GHz all-P-core, 4.0 GHz all-E-core settings pulls 180 W. At stock speeds and without E-cores enabled it only draws about 90-100 W. That's an unlocked Intel CPU under 100 W. Most of the power draw spikes are the motherboard's fault. This time around, 12th gen can be undervolted by 0.05-0.1 V and take literally no performance hit; the CPUs were simply designed to draw more voltage than necessary

    • @LordRobbert
      @LordRobbert 2 года назад

      @@deadly_mir I'd love to see a 12600K with 96 MB of cache instead of 20.

    • @deadly_mir
      @deadly_mir 2 года назад

      @@LordRobbert That would be super damn fast. But also probably pull well over 250w.

  • @The_Professional_Hobbiest
    @The_Professional_Hobbiest 2 года назад +2

    Awesome video. Super cool to see this from a sweaty gamer perspective, and the variety of benchmarks was sick

  • @hbgl8889
    @hbgl8889 2 года назад +32

    I think denser, higher-capacity cache is definitely the future. Memory accesses are really slow and a larger cache can help with that. Once larger caches become more common, game developers might actually start optimizing for them, which could give an even larger performance boost. (A toy working-set estimate follows after this thread.)

    • @bersK00
      @bersK00 2 года назад +7

      If game devs weren't already optimizing their games with cache sizes in mind, all of your games would run like shit. Bigger caches are essentially free performance.

    • @hbgl8889
      @hbgl8889 2 года назад +2

      @@bersK00 I am sure they already optimize for cache size, but if you statically know that the average CPU has 100 MB of L3 cache instead of 4 MB, you might actually adjust the code. And maybe in the future the L3 cache will be 1 GB or more.

    • @defectiveclone8450
      @defectiveclone8450 2 года назад +2

      @@hbgl8889 ALL THIS will do is make those games run like trash on anything other than a CPU with 100 MB of L3 :) ..

    • @tim3172
      @tim3172 2 года назад +1

      @@defectiveclone8450 Yes, that... that's the way technology progresses, yes.

    • @defectiveclone8450
      @defectiveclone8450 2 года назад +1

      @@tim3172 Game engines lag behind and are made to run on the most common hardware of the time. :) I don't see anything needing a large cache anytime soon; we need the market to move to these new CPUs before that happens
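
A toy working-set estimate of the idea discussed in this thread - whether a game's hot per-frame data fits in 32 MB vs 96 MB of L3. The entity count and per-entity size are made-up illustrative numbers, not measurements from any real game:

```python
L3_STOCK_MB = 32   # regular Zen 3 CCD
L3_X3D_MB = 96     # 5800X3D with stacked V-Cache

def working_set_mb(entities: int, bytes_per_entity: int) -> float:
    return entities * bytes_per_entity / (1024 * 1024)

ws = working_set_mb(entities=120_000, bytes_per_entity=512)      # made-up numbers
print(f"Hot working set: {ws:.1f} MB")                           # ~58.6 MB
print(f"Fits in {L3_STOCK_MB} MB L3: {ws <= L3_STOCK_MB}")       # False -> spills to DRAM
print(f"Fits in {L3_X3D_MB} MB L3:  {ws <= L3_X3D_MB}")          # True  -> stays in cache
```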

  • @zargham6719
    @zargham6719 2 года назад +1

    Hey, been watching your videos for a while but this change in background looked amazing. You looked very nice and personable in this video and the background switch up to a more colorful one is amazing. Anyways, great content as always!

  • @Vectral555
    @Vectral555 2 года назад +13

    AMD has definitely found a winning formula for a top gaming CPU. With that being said, next-gen 3D V-Cache CPUs are gonna be absolute monsters. If a mid-range Zen 3-based CPU clocked at "just" 4.5 GHz with an extra 64 MB of cache can do this much damage in games, imagine what a Zen 4-based CPU with the same technology can do. If I had to make an educated guess, the 7800X3D is gonna be clocked at *at least* 5 GHz (or maybe even more, considering the jump from 7 nm to 5 nm AND that the non-3D V-Cache models will be boosting to *at least* 5.5 GHz as shown in AMD's Zen 4 tech demo, plus the 3D V-Cache technology will have matured by then) with the same 96 MB of L3; then you add the IPC increase, etc. Now, pause for a second and think about the fact that AMD will also have dual-CCX 3D V-Cache CPUs coming as well, and those will be 12/16-core, 192 MB (96 MB of L3 per CCX) monsters. And best of all, with the 3D V-Cache technology you don't need to invest in expensive memory (especially DDR5), since there is very little benefit thanks to all that cache, so something like basic DDR5-5200 or DDR5-5600 memory will be more than enough, and those kits have already started to come down to reasonable prices.

    • @tywoods7234
      @tywoods7234 2 года назад

      Yes! And then double the power consumption! And double the price! Don’t forget that.

    • @Vectral555
      @Vectral555 2 года назад +1

      @@tywoods7234 So a brand new product with improved performance on a brand new platform will cost more than the old one? Shocker! As far as consumption, don't worry about it, it'll still be much more efficient than their competitor ;)

  • @13thzephyr
    @13thzephyr 2 года назад +3

    Looking forward to the 5600X3D and 5900X3D down the line. That power difference also comes while we're looking at 7 nm vs 10 nm, so imagine what 5 nm will do for efficiency. Matching Intel's latest and greatest in "gaming" with an almost 2-year-old CPU just by adding more cache is impressive.

  • @tuganerf
    @tuganerf 2 года назад +3

    Damn, it makes me feel good to know that I have upgrade options within AM4. My 5600X is going fine, but I might get one of these used in 2-3 years

  • @davidcole2337
    @davidcole2337 Год назад +1

    The answer to the question is yes. Even with Intel 14th gen and the Ryzen 7000 series out, this is still a top-5 processor for gaming. And you can pick it up for around $300 now.

  • @Slayer9009
    @Slayer9009 2 года назад +8

    Love the benchmarking, you definitely went out of your way to get accurate results so thank you for that!

  • @Stong1337
    @Stong1337 Год назад +7

    5800x3d is the best cpu of the decade

  • @TimothyStovall108
    @TimothyStovall108 2 года назад +5

    With the new BIOS versions coming out for the 5800X3D allowing more tuning flexibility, I would like to see how it performs with an undervolt offset as well, similar to how you did the PBO undervolt, which really helped with my 5800X. Really looking at going to the 3D version, but now that there is word of more 3D AM4 versions in the future, I'm really hoping for a 5900X3D. I still get hitches and stuttering with certain applications, which I know the 3D would just completely eliminate.

    • @iiimgdiii
      @iiimgdiii 2 года назад +1

      PBO2Tuner Tool

  • @CarlosWGomez
    @CarlosWGomez 2 года назад +9

    I've always been more of an Intel guy myself (just never had an AMD device), but after seeing the power consumption numbers I made the switch immediately to a 5600X and do not regret it. Hoping to upgrade my GTX 1080 Ti FE soon to a 4080, depending on whether an 850 W PSU can support it.

    • @LightMCXx
      @LightMCXx 2 года назад

      Just get the i5 12600K since it's a better deal, and don't spend a chunk of money on an R7 or an 11th-gen i7.
      The 12600K is ~10% better than the 11900K

    • @budgetking2591
      @budgetking2591 2 года назад +4

      Never fall in love with a company. I know people that only buy a certain brand of motherboard because that's what they are used to, so a lot of the time they end up buying shit motherboards; brands and their motherboard quality are always changing

    • @FILIPLAMBOV
      @FILIPLAMBOV 2 года назад

      Really doubt 850w would be enough 😂

    • @aerosw1ft
      @aerosw1ft 2 года назад

      4080 would be kinda overkill with a 5600x imo (and that psu worries me too, especially with transient spikes)

    • @ysnyldrm73
      @ysnyldrm73 2 года назад

      @@aerosw1ft With a 4K monitor it is okay with a 5600X.

  • @JuniorCloudHouse
    @JuniorCloudHouse 2 года назад +8

    My Gigabyte B350 Gaming 3 is 5 years old now, and it has support for the 5800X3D. I'm using the 5600 now.
    Glad that I still have a CPU upgrade available on it.

  • @joaojgac
    @joaojgac 2 года назад +1

    Always presenting great content, Ali! Big fan here... cheers from BR 🇧🇷

  • @theb4r138
    @theb4r138 2 года назад +27

    What sort of cooling were you using for the 5800x3d and what were the temps? Could you explore undervolting with it? I’m using a 240 custom loop with it and a negative voltage offset and I haven’t seen much of any performance loss but much better temps.

    • @lauchkillah
      @lauchkillah 2 года назад +1

      I'm on an NH-D15 and I'm seeing an avg. Tdie temp of around 70°C while playing Outriders, for example. Not too happy with it at the moment, but it was the first time I used Kryonaut on a CPU, and I believe I used too little

    • @TheRockRocknRoll
      @TheRockRocknRoll 2 года назад

      i have a dark rock pro 4 on mine

    • @theb4r138
      @theb4r138 2 года назад

      @@TheRockRocknRoll what sort of temps are you getting at stock settings?

    • @theb4r138
      @theb4r138 2 года назад +1

      @@lauchkillah 70°C sounds nice to me. With a 240 in the FormD T1 at stock settings my 5800X3D was hitting 80°C after playing Warzone for an hour. The negative voltage offset brings me down to the mid-to-high 60s. It is summertime, so that has a little to do with my bad temps though.

    • @MiguelAngelAvilaVillalobos
      @MiguelAngelAvilaVillalobos 2 года назад

      @@theb4r138 What 240mm AIO are you using? An EK Basic?

  • @abaj006
    @abaj006 2 года назад +2

    Half the power usage for better gaming FPS, that is just insane! I have also seen other reviews show that the 5800X3D is the best CPU for VR.

  • @connordaniels2448
    @connordaniels2448 2 года назад +43

    Crazy how much power the 12900KS pulls. I had to undervolt my 12700K because of temps, but I guess it's nothing compared to a 240 W TDP

    • @jhellier
      @jhellier 2 года назад +1

      Does undervolting reduce performance?

    • @connordaniels2448
      @connordaniels2448 2 года назад +7

      @@jhellier It should actually increase it if done properly. My Cinebench score went up a few hundred points since temps stayed lower, as the 12700K will thermal throttle under constant 100% loads in an application like that.

    • @najeebshah.
      @najeebshah. 2 года назад +8

      @@jhellier No, barely. My 12900K is actually more efficient than a 5950X; calibrated loadlines and an undervolt resulted in a Cinebench score of 27k @ 158 W. Mind you, for the 5950X to break the 27k mark it needs 166-175 watts with PBO. The total performance drop was 5-7% for me all-core and none in gaming; I actually end up boosting longer and higher due to reduced temps

    • @williehrmann
      @williehrmann 2 года назад +1

      @@connordaniels2448 What cooler have you got on top of it? I only have an air cooler (Noctua NH-D15) and I can let mine run for hours pulling 205 W without throttling, even in summer temps, but it's close to max temps.

    • @connordaniels2448
      @connordaniels2448 2 года назад

      @@williehrmann I've got a 240mm Corsair H115i Elite Capellix, but it was close to max temps so I decided to undervolt it. Honestly it doesn't affect performance at all and dropped my temps by about 20°C.

  • @cherryfarther
    @cherryfarther 2 года назад +1

    Man half the time I already know the answer but I still watch your videos anyway. The editing and presentation style are just so sick

  • @Xenoray1
    @Xenoray1 2 года назад +3

    Was running a 3600; last month I upgraded to a 5800X. Didn't consider the 3D variant cuz it would cost me €300 more... I'm fine right now :D

  • @curtismariani6303
    @curtismariani6303 2 года назад +1

    I went from a 5800X to a 5800X3D to pair with my 3090 FE, as I needed the 5800X for another build. As I game at 3440x1440 I wasn't expecting to see any difference in fps, and I didn't. The idea was to give me some headroom for the 4090 when it launches. However, my 1% lows have improved and overall I do feel like the smoothness has improved in games like Warzone, so I am pretty pleased with it and would definitely recommend it to an existing AM4 platform owner looking to upgrade their CPU.

    • @curtismariani6303
      @curtismariani6303 2 года назад

      @kureto 1294 I will try to offer some balance here so you can make the best choice for you. The 5700X should do the job nicely for a 3070 and in every measure will feel like an upgrade from what you have. The 5800X3D will not feel any quicker on the desktop or in non-gaming applications. However, I believe the 5800X3D will still give you better 1% lows and smoother gaming even with a 3070 (which is a cracking card btw). What the 5800X3D will likely give you is the ability to drop in a 4070 or better, or maybe even a 5070 at some point, and not feel the need to change your CPU. I've seen B550 MSI Tomahawks for around your budget, but also look at Asus ROG Strix F-Gaming and ROG Strix A-Gaming boards, as there may be something on sale at around that price point. The key is to get a board with the BIOS Flashback utility, as your board could be on an older revision and may not support a 5800X3D out of the box.
      **edit** I would also say, from reviews, that the 5800X3D is less sensitive to fast memory, so it will hold up better if you have something slower than the preferred 3200 MHz CL14 or 3600 MHz CL16.

  • @phero6933
    @phero6933 2 года назад +10

    I love my x3D! Upgraded from the basic 5800x and saw a massive jump in low frame rates and tighter frame timings.

    • @kelownatechkid
      @kelownatechkid 2 года назад +2

      I migrated from the 5900x and it was way worth it. I put the 5900x in a server instead

    • @Cremepi
      @Cremepi 2 года назад

      Yeah, I f*d up and got a 5800X; my 5600X almost performs better

    • @MightBeBren
      @MightBeBren 2 года назад +1

      @@Cremepi ??? how does your 5600x almost perform better

    • @Cremepi
      @Cremepi 2 года назад

      @@MightBeBren Same clock speeds on single core, with lower temps. Unless you're doing heavy rendering/production work you'll be in single-core workloads. Even gaming and streaming with multiple processes, my 5600X is better value for the $

    • @MightBeBren
      @MightBeBren 2 года назад

      @@Cremepi What temps do you get on your 5600X? My 5800X doesn't go above 65°C in games with a 4850 MHz all-core overclock at 1.4 V, and idles at 24°C

  • @lolilolplix
    @lolilolplix 2 года назад +1

    As always, great production and camera work

  • @KesumaofEO
    @KesumaofEO 2 года назад +6

    That's what I look for: efficient performance, not just adding more raw power. I recently built a new PC with a 5600X and a 3060 Ti and I can already feel that my room warms up faster. Not everyone has AC, so I don't wanna deal with how much heat the higher-end parts give off using so much wattage.

  • @balin72
    @balin72 2 года назад +1

    thanks for taking the time to give all these chips a fair chance! loved this vid

  • @michaelthompson9798
    @michaelthompson9798 2 года назад +5

    I ordered mine 🥰👍 at launch and got it 2 weeks later, and it's been running in my 1080p ITX Skyreach Mini S4 setup with an Alienware 360 Hz monitor. The fps is amazing 🤩🤯, the temps are very good, and the fan noise is much lower than with my older R5 3600 (non-X) under gaming loads, etc. The power draw (on the 5800X3D) was lower under load than the R5 3600. It's definitely a powerhouse chip that down the road in a few years will still command a hefty premium 😉 and perform well for a long time.

  • @rizrizriz
    @rizrizriz 2 года назад +1

    I really like this kind of graph. Easy to digest and understand.

  • @valentin3186
    @valentin3186 2 года назад +3

    The show starts with a 5600X3D.
    The 5800X3D was down-clocked to meet the 142 W package power limit,
    but that headroom would let a 5600X3D become the new budget king.
    95 W TDP? Let's go.
    Dead platform :)

    • @egalanos
      @egalanos 2 года назад +1

      AMD stated that they locked out overclocking due to the lower voltage limits of the V-Cache die.
      The higher the frequency, the higher the voltage needed to be able to switch the transistors within that shorter timeframe.
      The V-Cache die is designed using a denser cell library, so they can fit more cache per die and lower power consumption. This is ideal for Epyc processors.
      Unsure if the lower voltage limit is due to that dense cell library, the hybrid bonding process, or perhaps limitations of both L3 caches being in the same voltage domain? (A rough voltage/frequency power sketch follows below.)
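
A minimal sketch of the voltage/frequency trade-off described above, using the first-order CMOS dynamic-power relation P ∝ C·V²·f. The voltage and clock pairs are made-up illustrative values, not actual 5800X3D or 12900KS figures:

```python
def relative_dynamic_power(v: float, f_ghz: float,
                           v_ref: float = 1.20, f_ref: float = 4.5) -> float:
    """P ~ C * V^2 * f, normalised to a reference voltage/frequency point."""
    return (v / v_ref) ** 2 * (f_ghz / f_ref)

print(f"{relative_dynamic_power(1.20, 4.5):.2f}x")   # 1.00x baseline (5800X3D-like clocks)
print(f"{relative_dynamic_power(1.35, 5.2):.2f}x")   # ~1.46x for +0.7 GHz at a higher voltage
```

Because voltage enters squared and higher clocks also demand higher voltage, a modest frequency bump costs disproportionate power, and capping voltage (as with the V-Cache die) effectively caps frequency too.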

  • @cezannee
    @cezannee 2 года назад

    Your videos are very helpful and make it easy to understand this rabbit hole of PC gaming

  • @twinjuke
    @twinjuke 2 года назад +5

    Very easy to consume, yet one of the best reviews on the net with a lot of useful content. Thanks again! Cool channel as always!

  • @crf637
    @crf637 2 года назад +1

    Great reviews as always, and also a great Twitch channel, especially for Apex Legends gameplay.

  • @macstheking
    @macstheking 2 года назад +4

    Would this CPU have any issues pairing with an RTX 40-series card when they come out? I desperately need to replace my 10-year-old i7 2600K PC, but I want to wait for the new 40-series GPUs and just ride out my 1080 Ti till then.

  • @PiotrMalicki
    @PiotrMalicki 2 года назад

    Upgraded yesterday. Could not be happier. It's a beast for MSFS

  • @Mikktor
    @Mikktor 2 года назад +3

    "No one's going to be playing at 200fps plus."
    Every time I hear this I feel like the only one in the world with a 240 Hz monitor. Though of course it is more important for competitive games, I also enjoy having higher fps in single-player games.

    • @T2NA
      @T2NA 2 года назад

      Do you play single player / story games at 200fps? I've always just capped at 120 as a force of habit, never really seen the benefit to it but I've always been kinda tempted to just let the system go wild and see what frames I could get in Horizon or something!

    • @crabosity
      @crabosity 2 года назад

      He probably says this because single-player games know that people play for the story/world, so they can bump up the graphics and sacrifice fps; you wouldn't hit 200+ fps in most single-player games playing on high/ultra

  • @ryanreviews8566
    @ryanreviews8566 2 года назад +1

    damn that background looks as sick as this super clean review

  • @TBiscuitsmx5
    @TBiscuitsmx5 2 года назад +5

    Gonna wait, so I can spend less cash for more cache.

  • @timfngwena
    @timfngwena 2 года назад +1

    Do you have a new camera? Your portrait shots and close-ups are just that bit crisper, especially considering how much YouTube compresses the video.

  • @PATroll_knocking
    @PATroll_knocking 2 года назад +4

    With half the power and almost the same performance compared to the 12900KS, 3D V-Cache might be the right way to go for an extreme gaming setup in an ITX build 🤔

    • @rapamune
      @rapamune 2 года назад

      at ~half the cost

    • @13thzephyr
      @13thzephyr 2 года назад

      @@rapamune not just half the cost since you are reusing a good majority of your components

    • @rapamune
      @rapamune 2 года назад

      @@13thzephyr Valid point, indeed

  • @PaulRoneClarke
    @PaulRoneClarke 2 года назад +1

    Reminds me of the days of the FX-57. An absolute monster from about 15 years ago.

  • @aliananza1990
    @aliananza1990 2 года назад +5

    What's even more awesome is that you can get the standard 5800X for 300-350 euros currently in Europe. Insane value for the fps it can output.

    • @aliananza1990
      @aliananza1990 2 года назад +2

      @@tjames1994 lol almost the same; got mine 3 weeks ago and undervolted it last night. Awesome CPU

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      @@tjames1994 If you're building from scratch, for those prices there's also the 12700K; OC the shit out of it and you get a similar level of gaming performance while having more cores

    • @aliananza1990
      @aliananza1990 2 года назад

      @@HosakaBlood Yes, that's a good option too for new builds. I just upgraded from a 3600 on my SFF build using an X570 Aorus Pro WiFi

    • @aliananza1990
      @aliananza1990 2 года назад +1

      @@tjames1994 About the same: 1.25 V at 4.6 stable - tested max temp of 82 degrees.

    • @HosakaBlood
      @HosakaBlood 2 года назад +1

      @@tjames1994 If you already have an AMD board, well, your choice is Ryzen 5000 then

  • @looppp
    @looppp 11 месяцев назад

    Just got this to replace my 3600, combined with my 3070, it made The Finals run.... SOOOOOOOOOOO much smoother. Holy cow. What an insane upgrade.

  • @N0N0111
    @N0N0111 2 года назад +3

    1080p low is an e-sports thing for sure, getting that ultimate high FPS.
    To be fair, a lot of non-pro e-sports gamers are moving to 27" 1440p monitors.
    Showing some 1440p benchmarks would be much fairer, 'cause we know the 3D cache boosts a lot at 1080p.
    The fact that (1:55) the Intel 12900KS has 1 GHz more boost than the 5800X3D is insane, and the power draw is 2x too; that is just bonkers!
    We have already seen AMD demo their upcoming gaming CPUs at 5.5 GHz, so Intel is in big trouble!

    • @Njazmo
      @Njazmo 2 года назад

      I actually moved to 65" 4k OLED, so the CPU doesn't matter anymore. Gotta have some GPU power.

  • @siddhantjain243
    @siddhantjain243 2 года назад

    New camera? Love this insane contrast and the vivid colours 👍

  • @tudorgoina9997
    @tudorgoina9997 2 года назад +4

    Honestly I would just wait for AMD Ryzen 7000 to come out, and for PCIe 5.0 and DDR5 to get cheaper

    • @katinahoodie
      @katinahoodie 2 года назад

      What do you need PCIe 5.0 for at this time, or at the time the 7000 series comes out?

    • @mikeymaiku
      @mikeymaiku 2 года назад

      @@katinahoodie If you're on Ryzen 5000 already, there's no reason to get this; just wait for the next-gen stuff, is probably what he meant

  • @alyessamaddox7022
    @alyessamaddox7022 Год назад

    It's a rare thing in the computing market, but having this as a "special" product within AMD's product line actually makes sense. As impressive as the 3D V-Cache is, it really does not help in productivity uses, or even in any game that can't take advantage of that extra cache (older games that barely make use of the regular 32 MB of cache in the stock models).

  • @ilkerYT
    @ilkerYT 2 года назад +5

    You should have added Escape from Tarkov to the tests; the 5800X3D's 100 MB of cache is really impressive there

    • @kelownatechkid
      @kelownatechkid 2 года назад

      Or any VR/sim/RTS testing haha, the 5800x3d stomps all over the competition

    • @MarioPL989
      @MarioPL989 2 года назад

      Yeah, the 5800X3D is huge in Tarkov, even with a weak GPU. I got a 30-50% fps increase over a 1700X with an RX 470. With a better GPU I could get wayyyy more.

  • @johanescandra
    @johanescandra 2 года назад +1

    dope studio man

  • @nayanpitlam5621
    @nayanpitlam5621 2 года назад +6

    12700k is the sweet spot high end cpu atm!
    change my mind

    • @danielparks9035
      @danielparks9035 2 года назад

      Yeh that or 5700x/5800x for 50-75 less is still pretty good.

    • @chovekb
      @chovekb 2 года назад

      5900X tyvm bye.

    • @wixxzblu
      @wixxzblu 2 года назад

      i'd say 5800x or 12700F

    • @kelownatechkid
      @kelownatechkid 2 года назад

      12700F for new buyers, no contest. For upgrades, 5800x3d on the high end or 5900x for mid tier

    • @rdmz135
      @rdmz135 2 года назад

      For mixed workloads best value is the 12700F.
      For pure gaming it's the 5600.

  • @cykablyat114
    @cykablyat114 2 года назад +1

    love the videos man one of the best tech channels on youtube

  • @matthieuzglurg6015
    @matthieuzglurg6015 2 года назад +3

    Currently using a 3700X that I've overclocked the crap out of (sustaining 4.6 GHz all-core; pretty happy with it, especially on my B450 Tomahawk board).
    I was considering waiting until AM5 CPUs drop, going with the first gen on that socket, and then having the ability to upgrade as CPU generations go.
    But looking at this now, I'm wondering: I could very well just drop in a 5800X3D, get a GPU upgrade, and keep my RAM and motherboard for a few more years.
    I was drooling quite a bit over the prospect of a totally new system, but extending the life of the current one is also quite tempting

  • @AvWoN
    @AvWoN 2 года назад +1

    Super cool, thanks! Certainly my fav tech channel; the way you present stuff is just golden, so cleannn. Would have been neat to see the 5900X in the lineup, given that it would be an interesting comparison within AMD and given that the Intel side has a more premium product in the 12900KS. If the results were not interesting, at least a quick comment on where it would land between the items in the lineup would be nice.

  • @krispy4605
    @krispy4605 2 года назад +2

    Upgraded my 5800x, can certainly tell the difference in Warzone and WoW

    • @lore00star
      @lore00star 2 года назад

      waste of money

    • @krispy4605
      @krispy4605 2 года назад +3

      @@lore00star Should have just spent 1.5k on a new board, RAM and a 12th gen then. Would have made more sense... 🤪

    • @kelownatechkid
      @kelownatechkid 2 года назад

      @@krispy4605 ignore those morons lol, people who haven't seen it might not even believe how much smoother it is

    • @krispy4605
      @krispy4605 2 года назад

      @Skyfox Couldn't agree more. If you play at 240+ Hz, at 4K, or have an ultrawide, it's definitely worth it. At 3840x1600 in Warzone I went from 90-125 to 110-141 fps (capped), DLSS/RT disabled, max settings

  • @anant3940
    @anant3940 2 года назад +1

    great video as always

  • @Breakfast_of_Champions
    @Breakfast_of_Champions 2 года назад +2

    A bit like the old i7-5775C with its 128 MB of L4 cache, to get the most out of an aged platform. The question is how much of the game's main loop fits inside the special cache.

    • @AerynGaming
      @AerynGaming 2 года назад

      The 5775C's cache had 1/40th of the bandwidth and 5x the latency IIRC. It was actually slower than overclocked DDR4 on a 6700K.

  • @saab_9
    @saab_9 2 года назад

    Awesome review, Ali! It would have been nice to see 4K framerates!

  • @Amzyy
    @Amzyy 2 года назад +8

    Currently on a 5600X and everything is flawless, but when Zen 4 comes out and this CPU drops in price I might upgrade to it, since it runs on the same motherboard

    • @gaebolglancer
      @gaebolglancer 2 года назад

      How long will AMD be making these?

    • @sebastianguerraty6413
      @sebastianguerraty6413 2 года назад +3

      zen 4 is going to use a new socket :/

    • @Ladioz
      @Ladioz 2 года назад

      I'm still using my i7 8700K! I have no idea when I should upgrade... some people say I'm fine and some people say the newer CPUs make a PC feel so much smoother and snappier.

    • @cryx4
      @cryx4 2 года назад

      @@Ladioz 8700k is still a decent gaming CPU. I wouldn't upgrade right now, but you could consider upgrading to 12th gen when their prices fall a bit, or zen4 if it strikes your fancy

    • @aerosw1ft
      @aerosw1ft 2 года назад +1

      @@Ladioz Upgrade when you feel you need to upgrade. Yes, newer CPUs will definitely improve your performance, but if what you have now is enough for you, then why spend the money?

  • @urband10
    @urband10 2 года назад +1

    I got this CPU a month ago for only 350 USD. I was using a Ryzen 1700; the difference is incredible.

  • @CannedMarmalade
    @CannedMarmalade 2 года назад +6

    Damn the i9 sucking up almost 190W of power

    • @MarioPL989
      @MarioPL989 2 года назад

      This is just Valorant. All-thread power usage is way higher, about 250-320 W, while the Ryzen never gets above 100-130 W

  • @caml1720
    @caml1720 2 года назад

    I could just be stupid, but the fact that the white block in the middle of the graph isn't a number (you have to infer it), and the lack of a unified color choice on the far-right text when the key at the top has avg fps in orange, was slightly confusing. Made 1 second's work into 5; not a real problem by any means. Thanks for the comparison!

  • @TheKorusuk
    @TheKorusuk 2 года назад +3

    I think it's remarkable how good this CPU is compared to the i9 when you consider the TDP. Intel are just throwing clock speed and watts at their CPUs, that's all.

  • @garzolar
    @garzolar 2 года назад +1

    I used to have an AMD FX-8320; it was not the best - a ton of heat and no computing power. Then I upgraded to a 5600X. Ryzen solved all the issues and has beaten Intel ever since it came out. Congrats to AMD!

    • @Shayinator
      @Shayinator 2 года назад

      Same here, but from a 3770K to a 5600X, which I overclocked to 4.85 GHz!

  • @oak9428
    @oak9428 2 года назад +3

    The AMD Ryzen 5800X3D outclassed Intel's 12th-gen 8-core i9.

    • @LightMCXx
      @LightMCXx 2 года назад

      16*

    • @LightMCXx
      @LightMCXx 2 года назад

      Multi-tasking will favor the 12900K

    • @MarioPL989
      @MarioPL989 2 года назад

      @@LightMCXx With 2.5x power usage you get 1.5-2x speed.

  • @soraaoixxthebluesky
    @soraaoixxthebluesky 2 года назад +1

    AMD is king here.
    - doesn't need expensive DDR5
    - can run on an old board
    - low power consumption
    - way cheaper

  • @MrDutch1e
    @MrDutch1e 2 года назад +3

    Should have included some top-end DDR5 imo. A 12700K + DDR5 is roughly the same price as a 5800X3D + DDR4.

    • @ruxandy
      @ruxandy 2 года назад

      Say what?

    • @Corey_T
      @Corey_T 2 года назад +3

      That would ignore mobo price though; the 5800X3D can run in very cheap mobos without issue. The price difference would still be around $90 minimum.

    • @ruxandy
      @ruxandy 2 года назад +1

      Not sure where you guys are from, but in my country the 5800X3D is $20 cheaper than the 12700K... and let's not even discuss the price of DDR5 memory.

    • @kelownatechkid
      @kelownatechkid 2 года назад

      @@ruxandy in most locations the i7 is about 200 dollars cheaper. So for new buyers the 5800x3d doesn't make much sense in those jurisdictions.

    • @ruxandy
      @ruxandy 2 года назад

      @@kelownatechkid I see that the Intel CPU is currently $100 cheaper on Newegg, and that's with the current $50 discount that it has (so the price difference would normally be ~ $50). Anyway, not sure where you guys see a $200 difference, but if that's the case, yeah, it makes more sense to buy the 12700K.

  • @johnszatkowski6898
    @johnszatkowski6898 2 года назад +2

    This was a really great video! As MANY other articles/videos mention, there's the POWER consumption you noted here. Intel is absolutely ridiculous with its power draw! AMD uses SO much less "juice" yet outperforms the i9! What does that tell you? This is the very reason I switched to AMD over Intel in my latest build. I am only air-cooling, versus the ONLY real option for Intel, which is water-cooling, with the ADDED expense on top of the ALREADY pricey Intel offerings themselves! This is a NO-BRAINER from a cost standpoint! Now, the 5800X3D does have lower clock speeds and will "hamper" other operations such as video, coding, and compiling, but this CPU is billed as a "gaming" CPU, so there are trade-offs and it is NOT for everyone. For cost vs. performance, AMD's offerings make WAY better sense and will NOT have the added "hit" on your monthly power bill either! The other ADDED bonus is you can "drop" the 5800X3D into MANY existing AM4 motherboards, giving new life to an older build, whereas with Intel you need a NEW motherboard, a water cooler, and MUCH more cash to do it! The price difference between Intel and AMD is ridiculous!

  • @budgetking2591
    @budgetking2591 2 года назад +4

    AMD beating Intel with server CPUs, it's fucking amazing. Intel should be fucking ashamed lol (Ryzen CPUs are basically mini Epyc CPUs)

  • @Jeffrey_Wong
    @Jeffrey_Wong 2 года назад

    that Kraber shot at 5:22 is absolutely NASTY

  • @muhammadzinc5228
    @muhammadzinc5228 2 года назад +8

    AMD just needs to invest properly in its software side: drivers, crashes, BSODs.
    I'm referring to both its CPUs and GPUs.

    • @Constantin314
      @Constantin314 2 года назад +9

      I think you live in another era, my friend

    • @L4tinoR4g3
      @L4tinoR4g3 2 года назад +3

      I think you need to revise your so-called "knowledge" because none of what you said is being experienced by other AMD users.

    • @wixxzblu
      @wixxzblu 2 года назад +1

      @@Constantin314 Tell that to everybody having USB problems on AM4 platforms (yes, even with the latest AGESA) and black screens and other issues on the 5700 XT and 6000 series.

    • @kelownatechkid
      @kelownatechkid 2 года назад +3

      Lol, this is ridiculous. I have like 6 Zen systems including the 2600, 3600, 5900X, and 5800X3D, across B450 and X570, with varying amounts and types of RAM including ECC - they're all rock solid on Linux and Windows. Compared to my Xeon and Core systems, they're exactly the same in terms of reliability.

    • @wixxzblu
      @wixxzblu 2 года назад

      @@kelownatechkid I guess everybody was lying on the internet about the USB issues then... I still hear streamers' USB devices unplug and replug to this day. It's a real issue; just because your use case doesn't encounter it doesn't mean it doesn't exist.

  • @平和-v1z
    @平和-v1z 2 года назад

    Great video! Very well-made!

  • @Girvo747
    @Girvo747 2 года назад

    That power usage difference really matters to me in south east Queensland. In summer power hungry parts are brutal to live next to lol

  • @vi683a
    @vi683a 2 года назад

    Will purchase this for my next build when the price drops on the next-gen release.

  • @Sjbjo
    @Sjbjo 2 года назад +1

    Which DDR4 speed? That information is important!

  • @creed981
    @creed981 Год назад

    I swapped out my 3700X for the 5800X3D with an RTX 3070. My fps in every game went up at least 15 fps, and in some games a lot more. This was absolutely worth the upgrade. I couldn't be happier.