True, such a good platform, I still have a 4790k and I was looking for an upgrade. The 5800x looks fabulous like most of the 5xxx series but unfortunately they don't have an iGPU which is a deal breaker for me
@@mastroitek not having an igpu is not as bad as you think (in my opinion). most systems have a universal gpu driver pre-installed. So as long as you have really any gpu installed onto your system, you should be able to boot. some operating systems like windows can also automatically detect what gpu you are using and download its latest drivers for you. I guess the downside is that you won't be able to use your computer if you don't have a gpu on-hand.
7:31 that power consumption difference is HUGE. 5800X3D draws 82.6W on that Valorant 1080 low test, 12900KS draws 188.3W. That's 2.27x more power consumption and 4% less FPS. Intel sure likes to make the power bill high
Well your comparing a cpu with almost twice the core also so power draw comparasion is quite pointless here intel chip can games similar level to 3d zen while performing on similar level to 5950x in multi task so your getting best of both worlds going with intel aswell while the 3d is amazing for gaming having 8 core only is bummer for many who use the pc both work and gaming without mentioning Thas amd want 500$ for 8 core cpu in 2022 while the competition give similar gaming performance and more core
@@HosakaBlood You know 12900KS has only 8 real performance cores (same as 5800X3D) and 8 efficiency cores, which are only Skylake level of performance? Also, power draw is power draw, number of cores doesn't matter. You can have server CPUs with 3x number of cores and same TDP as a 12900KS, because they run on lower frequency. If a component has 0-30% more performance for 2x the power draw, that's trash power efficiency. If you don't care about power draw at all, you might as well get a 12900KS and a 3090 TI, overclock them to max and use a 1000W PSU.
I got mine recently. In MMOs this thing is an absolute power house. I went from a 5800x to the 5800x3d and in gw2, lost ark, destiny 2 and ffxiv the gains in areas with loads of players are absolutely mind boggling. The 1% lows are also much better. Far smoother overall. 11/10 would buy again. Looking forward to zen 4 vcache models!!
That massively lower TDP is what sells it for me over the i9. I really hope the power consumption creep doesn't get out of hand. Intel and Nvidia in particular are going ham with their new/upcoming products. If you start recommending a 1000w power supply for a god damn gaming rig, even if high end, to me that's just insane.
PSUs are most efficient at around 50% usage. This is where their rating (bronze, gold, platinum ect.) is achieved. Efficiency drops after 55% usage. If you build a PC that uses around 500 watts at full load it's wise to go for a 1000 watt PSU over say a 850 watt. 1000w PSUs have been recommended since late 2020 for builds with the highest end GPUs. If you ever use Newegg's pc builder or PC Part Picker you can hit 400-500 watts easily just messing around. You also have to remember power spikes especially with Nvidia GPUs so it's smart to have that extra headroom. I've even seen a 1000w PSU shut off for safety when a 3090 card was spiking in one of Linus' videos. He ended up having to go with 1200W for everything to run smoothly with no shut down. The 3090TI uses even more power and it doesn't seem like Nvidia cares about efficiency at this point.
Yes, It just shows how much better tech AMD actually has.. If Intel and Nvidia only can stay slightly on top because they run twice the amount of watts through CPU/GPU’s…
@@gmaxsfoodfitness3035 Have you seen the rumors for the upcoming 40 series nVidia cards? Possibly over 400W for some of the highest end SKU’s? Something has to change. Hopefully AMD will lead the charge with RDNA3 like they’ve done with Ryzen.
@@pranavbansal5440 personally, I am tired of my office being 80-90 degrees F all the time because of my gaming desktop and getting AC dedicated for one room just proves the point that they all run too hot. CPUs and GPUs need to be more efficient.
I upgraded from 3600 yesterday, and my god, I have never seen a few of heavily memory-intensive games run that smoothly before. Definitely a worthy purchase!
@@evilemil I just upgrade my 3600 to 5800X3D yesterday, got it for $300 and those extra cash made me bought Rx 6800 XT for $500 and put my RTX 2060 to rest.. now I live on another dimension..
just a friendly reminder, the 5800x is competing with intel's next gen 12th CPUs instead of the same gen (11th gen) CPUs. that alone shows how powerful it is
Dude your charts are the best now i love the 1% and .1% lows on the chart the most important metric for sure. You also did amazing job benchmarking while playing apex for 25 minutes in a real match. I know you spent a long time making this video and it shows great work
Its by far the best cpu for sims in VR. All the frame drops I use to get in DCS have completely vanished with the 5800X3D and thats coming from a 5950x!
I wanted this CPU so bad... But I went with the 5800x instead got it on sale at Microcenter for 250$. The 3D model is a bit expensive atm but for a good reason.
In the UK right the 5800x3D is £180 ($220) more expensive than regular model on Amazon. Really not worth the price difference imo. I'd say only get the 3D models if they're of comparable pricing.
It's not for a good reason in my opinion. In Germany you can buy the 5800x3D for 480 euros, while the top of the line 5950X is only 50 euros more expensive. Yes, I know that the 5950X won't be as good at gaming, but it is overall a better CPU.
Upgraded from a Ryzen 3600 to the 5800x3d and for me it was just a really good step. For sure worthwhile to do as any other upgrade to similar performance would be lots more costly - and also time wise since there is zero need to reinstall Windows or anything. The only trap is, that you must update the BIOS on your motherboard with the old CPU running and then you can swap CPU.
I went from a 9900k to this thing and I'm pretty happy with how things have gone. I find some of the games I play to be less stuttery than before, which is really what I was looking for. FPS is everything, don't get me wrong, but it's more about how stable those frames are delivered.
I just bought the 5800x3d and wow what a difference it made compare to the 3700x that i had. MSFS looks so good now and runs smooth. I'm a very happy camper with this chip in my system.
@@WhiteoutTech maybe for less power... he 12900K uses 240W in multicore. Intel and nVidia should be scolded for doing this. Technically win on paper with absurd power draws, because they cannot take a loss. It will make AMD do the same thing, for the disgrace of the industry. It was said that the new RDNA and Ryzen will use more power, unfortunately.
@@HosakaBlood 5950x3d &5900x3d will not have point because productivity will be pretty worse then normal cpu and gaming will be same or worse then 5800x3d and cpu will be too hot with 16 core and at last too expensive so 12&16 core CPUs with 3d don't have point
The graphs seem a little confusing to me. I completely understand that it's 0.1, 1, and avg, in that order but my brain wants the color to represent a range rather than a delta. Maybe it's because of the inclusion of 3 values on 1 bar rather than the usual 2. Great comparison though. AMD is killing it.
@@TheAnoniemo Using pure white to represent the 1% average kinda creates a divide between the average and the 0.1% and the placement of the numbers does not help either. I think a different color other than white or swapping colors between 1% and 0.1% would easily fix the issue
I'm quite happy with my 5800X3D after switching back to my AM4 platform after struggling with a 12700K for a few months. The 12700K is great (and really great for the price if you live near a Micro Center), but nearly a year later and linux support is still pretty rubbish and the power consumption is ridiculous if you let it run unbridled. The 5800X3D is easier to cool and uses far less power and still works fine for my non-gaming workloads. Side note, Ryzen seems to get quite a bit more benefit out of extreme memory tuning than Alder Lake - I settled on running my old B-Die set at 3200Mhz Cl13 (with ridiculously tight sub-timings) and I was able to eek another 5-10% over the CL14 XMP profile and it even matches my tuned 4000Mhz (fclk @ 1:1) settings while using less power and without hammering the memory controller / SOC as hard.
I have no clue what you are talking about in regards to Linux support being rubbish, I have a i9 1200KS I mostly run a lot of docker containers for both Ethereum & Cardano. DDR5 has also been very reliable at 6000Mhz with 2 32gb dimms Linux has been an absolute great for Alder-Lake especially if you update your Kernal
@@brandonj5557 Good for you. Not sure what distro you are using but that was not my experience. Intel's thread director wasn't added until 5.18 (released in May) so prior to that there was no intelligent thread direction which lead to very unpredictable performance as there was no delineation between P-Cores and E-cores. Personally, I was only able to get predictable performance by disabling the E-cores all-together (this was actually the only way to boot some distros for quite some time after release). The easier hypervisor distros (i.e: Proxmox) are still running older kernel versions as well. I wasn't saying it won't work, it is just a bit rubbish in my opinion and takes far more effort than my Ryzen setup did. As far as the memory comparison goes, it is a bit apples vs oranges. I was running a DDR4 setup and it was never particularly stable running tight timings and I could only run up to 3600 in gear 1. Since you're running DDR5, you're in gear 2 by default which puts far less strain on the memory controller. I could manually switch to gear 2 and it was more stable; it was also more slow. Again, not saying Alder Lake is bad by any means, I was just sharing a small portion of my experience with it.
Great benchmark review test. Really well made, took everything I wanted into play. Power consumption is just crazy, should be reason enough to buy AMD for most people.
You're not the only one testing the 5800X3D so late after release. Is that just a coincidence or is there a second round of samples or something? I ordered this beast on release day, love it for my shitty unoptimized DX9 games that are permanently in draw call limit. Really helps in those, like Planetside 2.
Found your channel not long ago and your shots are amazing. Especially when explaining something through a visual, it hits my eyes at the same time it’s hitting my imagination
We must remember though, 12th gen is basically like having 2-3 CPU's under 1 heat spreader. (ex: 12900k is like having an 11900k and 2 4670k's under 1 heat spreader.) Obviously its gonna take more power and produce more heat because 1+2=3, Not 1+2=1. People completely like to overlook E cores exist in these comparisons. My 12600K at its current 5.3 all p core, 4.0 all E core pulls 180w. At stock speeds and without E cores enabled it only draws about 90-100w. That's an unlocked Intel CPU under 100w. Most of the power draw spikes is the motherboards fault. This time around, 12th gen can be undervolted by .05v to .1v and take literally no performance hit. The CPUs simply were designed to draw more voltage than necessary
I think more dense and higher capacity cache is definitely the future. Memory accesses are really slow and a larger cache can help with that. Once a larger cache becomes more common, the game developers might actually start optimizing for it, which could give an even larger performance boost.
If game devs weren't optimizing their games taking into consideration the cache sizes all of your games would run like shit. Bigger cache sizes are essentially free performance.
@@bersK00I am sure they already optimize for cache size, but if you statically know that the average CPU has 100MB L3 cache instead of 4MB, you might actually adjust the code. And maybe in the future the L3 cache will be 1GB or more.
@@tim3172 game engines lag and are made to run on the most common hardware of the times.. :) dont see anything needing large cashe anytime soon.. we need the market to move to these new CPUs before that happens
Hey, been watching your videos for a while but this change in background looked amazing. You looked very nice and personable in this video and the background switch up to a more colorful one is amazing. Anyways, great content as always!
AMD has definitely found a winning formula for a top gaming CPU. With that being said, next gen 3D V-Cache CPUs are gonna be absolute monsters. If a mid range Zen3 based CPU clocked to "just" 4.5GHz and added extra 64MB of cache, can do this much damage in games, imagine what a Zen4 based CPU with the same technology can do. If I had to make an educated guess, the 7800X3D is gonna be clocked to *at least* 5GHz (or maybe even more, considering the jump from 7nm to 5nm AND the non-3D V-Cache models will be boosting to *at least* 5.5GHz as shown by AMD in their Zen4 tech demo, also you have to consider the matured 3D V-Cache technology by then) with the same 96MB of L3, then you add the IPC increase, etc. Now, pause for a second and think about the fact that AMD will also have dual CCX 3D V-Cache CPUs coming as well, and those will be 12/16-core 192MB (96MB of L3 per CCX) monsters. And what's best of all, with the 3D V-Cache technology you don't need to invest in expensive memory (especially DDR5) since there is very little benefit thanks to all that cache, so something like a basic DDR5-5200 or DDR5-5600 memory will be more than enough and those kits have already started to come down to reasonable prices.
@@tywoods7234 So a brand new product with improved performance on a brand new platform will cost more than the old one? Shocker! As far as consumption, don't worry about it, it'll still be much more efficient than their competitor ;)
Looking forward to the 5600X3D and 5900X3D down the line, that power difference is also is given that we are looking at 7nm vs 10nm so imagine what 5nm will do when it comes to efficiency. To match the latest and greatest of Intel in "gaming" by an almost 2-years old CPU just by adding more cache is impressive.
The answer to question is yes. Even with Intel 14th Gen and Ryzen 7000 Series this is still a Top 5 Processor for Gaming.. And you can pick it up for around $300 now.
With the new BIOS versions coming out for the 5800X3D to allow for more tuning flexibility. I would like to see how it performs with an under volt offset as well, similar to how you did the PBO under volt, which really helped with my 5800X. Really looking at going to the 3D version, but now that there is word of more 3D AM3 versions in the future I'm really hoping for a 5900X3D. I still get hitches and stuttering with certain applications in which I know the 3D would just completely eliminate.
I've always been more of an Intel guy myself (just never had a AMD device), but after seeing power consumption numbers, made the switch immediately to a 5600x and do not regret it. Hoping to upgrade my GTX 1080 Ti FE soon to a 4080 depending if an 850W PSU can support it.
never fall in love with a compagny, i know people that only buy certain brand of motherboards because thats what they are used to, so they buy shit motherboards a lot of times, brands and their motherboards quality are always changing
What sort of cooling were you using for the 5800x3d and what were the temps? Could you explore undervolting with it? I’m using a 240 custom loop with it and a negative voltage offset and I haven’t seen much of any performance loss but much better temps.
I'm on a nh-d15 and I'm seeing an avg. Tdie temp of around 70°C while playing Outriders for example. not too hapy with it at the moment, but it was the first time i used Kryonaut on a CPU, and I believe i used too little
@@lauchkillah 70 C sounds nice to me. With a 240 in the FormDt1 at stock settings my 5800x3d was hitting 80C after playing Warzone for an hour. The negative voltage offset brings me down to the mid to high 60s. It is summer time so that has a little to do with my bad temps though.
@@jhellier it should actually increase it if done properly. My cinebench score went up a few hundred points since temps stayed lower as the 12700k will thermal throttle under constant 100% loads in an application such as that.
@@jhellier no, barely. My 12900k is actually more efficient than a 5950x, calibrated loadlines and a undervolt resulted in a cinebench score of 27k @158w, mind you for the 5950 to break the 27k mark it needs 166-175 watts with pbo. Total performance drop was 5-7% for me for all core and bone in gaming, i actually end uo boosting for longer and higher dur to reduced temps
@@connordaniels2448 What cooler you built on top of it? I only have an air cooler Noctua NH-D15 and i can let mine run for hours pulling 205w and not throttling even in summer temps but it's close to max temps.
@@williehrmann I’ve got a 240mm Corsair H115i Elite Capellix but it was close to max temps so I decided to undervolt it. Honestly it doesn’t affect performance at all and dropped my temps by about 20c.
I went from a 5800x to a 5800x3d to pair with my 3090FE as needed the 5800x for another build. As I game at 3440x1440 I wasn’t expecting to see any difference in fps and I didn’t. The idea was to given me some headroom for the 4090 when it launches. However my 1% lows have improved and overall I do feel like the smoothness has improved in games like Warzone, so I am pretty pleased with it and I would definitely recommend to an existing AM4 platform owner looking to upgrade their cpu.
@kureto 1294 I will try to offer some balance here so you can make the best choice for you. The 5700x should do the job nicely for a 3070 and in every measure will feel like an upgrade from what you have. The 5800x3d will not feel any quicker in the desktop or none gaming applications. However, I believe the 5800x3d will still give you better 1% lows and smoother gaming even with a 3070 (which is a cracking card btw). What the 5800x3d will likely give you, is the ability to drop in a 4070 or better or maybe even a 5070 at some point and not feel the need to change your cpu. I’ve seen B550 MSI Tomahawks for around your budget, but also look at Asus Rog Strix F-Gaming and Rog Strix A-Gaming boards as they may be something on sale at around that price point. The key is to get a board with the BIOS flashback utility, as your board could be on an older revision and may not support a 5800x3d out of the box. **edit** I would also say from reviews the 5800x3d is less sensitive to fast memory, so it will perform better if you have something slower then preferred 3200mhz cl14 or 3600mhz cl16.
@@MightBeBren same clock speeds on single core, with lower temps. Unless you’re doing heavy rendering production you’ll be in single core work load. Even gaming streaming with multiple processes my 5600x is better value for the $
That's what I look for, efficient performance and not just adding more raw power. I recently built a new PC with a 5600x and a 3060ti and I can already feel that my room warms up faster. Not everyone has an AC so I don't wanna deal with how much heat the higher end parts give off using so much wattage.
I ordered mine 🥰👍 at launch and got it 2 weeks later and it’s been running on my 1080P ITX Skyreach Mini S4 setup with Alienware 360hz monitor and the fps is amazing🤩🤯, the temps vg, the fan noise much lower than my older R5 3600 (non-x) under gaming loads etc. The power draw (on5800x3D) was lower under load than the R5 3600. It’s definitely a powerhouse chip that down the road in a few years will still command a hefty premium 😉 and perform well for a long time.
The show starts with 5600X3D. 5800X3D was down-clocked to meet 142W package power limit But that headroom gives 5600X3D to become the new budget king 95W TDP? Let's go Dead platform :)
AMD stated that they locked out over clocking due to the lower voltage limits of the V-Cache die. The higher the frequency, the higher the voltage is needed to be able to switch the transistor within that shorter timeframe. The V-Cache die is designed using a denser cell library as they can fit more cache per die and to lower power consumption. This is ideal for Epyc processors. Unsure if the lower voltage limit is due to that dense cell library, the hybrid bonding process, or perhaps limitations of both L3 caches being in the same voltage domain?
Would this cpu have any issues pairing with a rtx 40 series when they come out? I desperately need to replace my 10 year old i7 2600k pc but I want to wait for the new 40 series gpu and just ride out my 1080 ti til then.
"No one's going to be playing at 200fps plus." Every time i hear this i feel like the only one in the world with a 240fps monitor. Though of course it is more important for competitive games i also enjoy having higher fps in single player games.
Do you play single player / story games at 200fps? I've always just capped at 120 as a force of habit, never really seen the benefit to it but I've always been kinda tempted to just let the system go wild and see what frames I could get in Horizon or something!
He probably says this because singleplayer games know that people play for the story/world so they can bump up the graphics, sacrificing fps so you wouldn’t hit 200fps plus in most singleplayer games if playing on high-ultra
do you have a new camera? your portrait shots and close ups are just that bit crispier especially if you consider how much youtube smashes down the video.
@@tjames1994 if your building from scratch for Thas prices there's also 12700k oc the shit out of and get similar level of gaming performance while having more core
1080p-low is an e-sports thing for sure, getting those ultimate high FPS. To be fair a lot none PRO e-sports gamers are moving to 27" 1440p monitors. Showing some 1440p benchmarks would be much more fair, cause we know that the 3D cache boost a lot on 1080p resolution. The fact that 1:55 intel 12900KS has 1GHz more boost is insane than the 5800X3D, the power drawn is a 2x too that is just bonkers! We have already seen that AMD demo their new coming gaming CPUs with 5.5GHz so intel is in big trouble!
It's a rare thing in the computing market, but having this as a "special" product within AMD's product line actually makes sense. As impressive as the 3D vcache is, it really does not help in productivity uses, or even any game that can't take advantage of that extra cache (older games that barely make use of the regular 32mb of cache in the stock models.)
Currently using a 3700X that I've overclocked the crap out of (sustaining 4.6GHz all core, pretty happy with it, especially on my B450 Tomahawk board) I was considering waiting until AM5 CPUs drop, go with the first gen on that socket and then have the ability to upgrade as CPU generations go. But looking at this now, I'm wondering : I could very well just drop a 5800X3D, get a GPU upgrade, and I'm able to keep my RAM and motherboard for a few more years. I was quite a bit drooled over the perspective of a totally new system, but extending the life of the current one is also quite tempting
Super cool, thanks! Certainly my fav tech channel, the way you present stuff is just golden so cleannn. Would have been neat to see the 5900x in the lineup given that it would be an interesting comparison within AMD and given that comparison on intel has a more premium product with the 12900KS. If the results were not interesting at least a quick comment where it should land between the items in the lineup would be nice.
@Skyfox couldn’t agree more. If you play at 240+, 4k or have an Ultrawide it’s definitely worth it. At 3840x1600 on Warzone I went from 90-125 to 110-141 (capped) dls/rt disabled / max settings
A bit like the old i7-5775C with its 128mb L4 cache, to get the most out of an aged platform. The question is, how much of the game's main loop fits inside the special cache.
I'm still using my i7 8700k! I have no idea when I should upgrade.... some people say im fine and some people say the newer CPUs make a pc feel so much smoother and snappier..
@@Ladioz 8700k is still a decent gaming CPU. I wouldn't upgrade right now, but you could consider upgrading to 12th gen when their prices fall a bit, or zen4 if it strikes your fancy
@@Ladioz upgrade when you feel you need to upgrade. Yes newer CPUs will definitely improve your performance but if what you're having now is enough for you then why spend the money?
i could just be stupid but the indication that the white block in the middle of the graph isn't a number we have to infer, and the lack of a unified color choice on the far right text when the key at the top has avg fps in orange, was slightly confusing. made 1 seconds work into 5, not a real problem by any means. thanks for the comparison!
I think it's remarkable how good this CPU is compared to the i9 when you consider the TDP. Intel are just throwing clock speed and watts at their CPUs, that's all.
I used to have an AMD Fx-8320, it was not the best. A ton of heat and no computing power then I upgraded to 5600x. The AMD Ryzen solved all the issues and beats intel ever since they came out. Congrats to AMD!
That would ignore mobo price though, which the 5800x3D can run in very cheap mobos without issue. The price difference would still be around $90 minimum.
Not sure where you guys are from, but in my country the 5800X3D is $20 cheaper than the 12700K... and let's not even discuss about the price of DDR5 memory.
@@kelownatechkid I see that the Intel CPU is currently $100 cheaper on Newegg, and that's with the current $50 discount that it has (so the price difference would normally be ~ $50). Anyway, not sure where you guys see a $200 difference, but if that's the case, yeah, it makes more sense to buy the 12700K.
This was a really great video! As MANY other articles/videos mention is the POWER consumption as you noted here. Intel is absolutely ridiculous with it's power-draw! AMD using SO much less "juice" yet out preforming the I9! What does that tell you here? This is the reason I switched to AMD over Intel is because of this very reason in my latest build. I am only air-cooling vs. the ONLY OPTION for Intel which is water-cooling and the ADDED expense with the ALREADY "pricey" Intel offerings themselves! This is a NO-BRAINER for a "cost" standpoint! Now the 5800X3D does have the lower clock-speeds and will "hamper" other operations such as video, coding, compiling but, this CPU is listed as a "gaming" CPU so there are trade-offs and is NOT for everyone. For cost vs. performance AMD offerings make WAY better sense and will NOT have the added "hit" on your monthly power bill either! The other ADDED bonus is you can "drop" the 5800X3D into MANY existing AMD4 mother-boards giving "new" life to an older build where as with Intel you need a NEW mother-board, water-cooler, and MUCH more cash to do it! The price difference for Intel vs. AMD is ridiculous!
@@Constantin314 tell that to everybody having USB problems on AM4 platforms (yes even with the latest AGESA) and black screens and other isues on 5700XT and 6000 series.
Lol, this is ridiculous. I have like 6 zen systems including the 2600, 3600, 5900x, and 5800x3d. Across b450, x570, with varying amounts and types of ram including ECC - they're all rock solid on Linux and Windows. Compared to my Xeon and Core systems, they're exactly the same in terms of reliability.
@@kelownatechkid I guess everybody was lying on the internet about the USB issues then... I still hear streamers USB devices unplug and plug in to this day. It's a real issue, just becuase your use case doesn't encounter it, doesn't mean it doesn't exist.
I swapped out my 3700X with the 5800X3D with an rtx 3070. My fps on everygame went up at least 15fps and some games a lot more.. this was absolutely worth the upgrade. I couldn't be happier.
The most insane part is you can get this performance with a DDR4 board from 2017.
With 3200-3400 MHz ram instead of 3800. Above 3400 there is almost no difference for 5800X3D.
AMD's chipset support is admirable
True, such a good platform, I still have a 4790k and I was looking for an upgrade. The 5800x looks fabulous like most of the 5xxx series but unfortunately they don't have an iGPU which is a deal breaker for me
@@mastroitek not having an igpu is not as bad as you think (in my opinion). most systems have a universal gpu driver pre-installed. So as long as you have really any gpu installed onto your system, you should be able to boot. some operating systems like windows can also automatically detect what gpu you are using and download its latest drivers for you. I guess the downside is that you won't be able to use your computer if you don't have a gpu on-hand.
@@mastroitek I have a 5950x and not having a iGPU has never been an issue. Just get a cheap GPU which will give you more performance than any iGPU.
Love the color coding between brands. Makes a quick glance easier to compare overall
Fr
7:31 that power consumption difference is HUGE. 5800X3D draws 82.6W on that Valorant 1080 low test, 12900KS draws 188.3W. That's 2.27x more power consumption and 4% less FPS. Intel sure likes to make the power bill high
During summer this is a dealbreaker tbh.
Well your comparing a cpu with almost twice the core also so power draw comparasion is quite pointless here intel chip can games similar level to 3d zen while performing on similar level to 5950x in multi task so your getting best of both worlds going with intel aswell while the 3d is amazing for gaming having 8 core only is bummer for many who use the pc both work and gaming without mentioning Thas amd want 500$ for 8 core cpu in 2022 while the competition give similar gaming performance and more core
@@HosakaBlood 12900k $599 and 5800x3d $449.....so they are around $500..ya good math.
did you calculate the motherboard the same as well?😂
@@lawrencetoh9510 dont forget the crazy expensive DDR5 needed for Intel to unlock its maximum potential
@@HosakaBlood You know 12900KS has only 8 real performance cores (same as 5800X3D) and 8 efficiency cores, which are only Skylake level of performance?
Also, power draw is power draw, number of cores doesn't matter. You can have server CPUs with 3x number of cores and same TDP as a 12900KS, because they run on lower frequency.
If a component has 0-30% more performance for 2x the power draw, that's trash power efficiency.
If you don't care about power draw at all, you might as well get a 12900KS and a 3090 TI, overclock them to max and use a 1000W PSU.
I got mine recently. In MMOs this thing is an absolute power house. I went from a 5800x to the 5800x3d and in gw2, lost ark, destiny 2 and ffxiv the gains in areas with loads of players are absolutely mind boggling. The 1% lows are also much better. Far smoother overall.
11/10 would buy again. Looking forward to zen 4 vcache models!!
AMD's ideal customer right here, buying the same CPU twice lmfao
I play mostly destiny 2 I have the 10900k right now but I might switch to amd when the new 7000 series comes out
Same thoughts about my 5800X3D. Games are just so much fluent now.
@@busterscrugs Maaaan most computer people are nuts like this, you must know this :P.
My cousin bought this just for WoW lmao. he went from a 4600k that gave him 24fps to this and has now about 210fps with a 1070gtx. its mental.
That massively lower TDP is what sells it for me over the i9.
I really hope the power consumption creep doesn't get out of hand. Intel and Nvidia in particular are going ham with their new/upcoming products. If you start recommending a 1000w power supply for a god damn gaming rig, even if high end, to me that's just insane.
PSUs are most efficient at around 50% usage. This is where their rating (bronze, gold, platinum ect.) is achieved. Efficiency drops after 55% usage. If you build a PC that uses around 500 watts at full load it's wise to go for a 1000 watt PSU over say a 850 watt. 1000w PSUs have been recommended since late 2020 for builds with the highest end GPUs. If you ever use Newegg's pc builder or PC Part Picker you can hit 400-500 watts easily just messing around. You also have to remember power spikes especially with Nvidia GPUs so it's smart to have that extra headroom. I've even seen a 1000w PSU shut off for safety when a 3090 card was spiking in one of Linus' videos. He ended up having to go with 1200W for everything to run smoothly with no shut down. The 3090TI uses even more power and it doesn't seem like Nvidia cares about efficiency at this point.
Yes, It just shows how much better tech AMD actually has.. If Intel and Nvidia only can stay slightly on top because they run twice the amount of watts through CPU/GPU’s…
@@gmaxsfoodfitness3035 Have you seen the rumors for the upcoming 40 series nVidia cards? Possibly over 400W for some of the highest end SKU’s? Something has to change. Hopefully AMD will lead the charge with RDNA3 like they’ve done with Ryzen.
All this boils down is adding more cache memory and gets lower wattage usage, the efficiency came along way for AMD.
amd lucky they got onto tsmc silicon/their design teams obviously helped them out a lot more than nodes.
Effeciency doesn't even matter that much on desktops
@@pranavbansal5440 it matters to some people though
@@pranavbansal5440 i dont want my electric bills to be 400 bucks every month tho.
@@pranavbansal5440 personally, I am tired of my office being 80-90 degrees F all the time because of my gaming desktop and getting AC dedicated for one room just proves the point that they all run too hot. CPUs and GPUs need to be more efficient.
Imagine beating an i9 using only 80w and just adding more cache
Crazy
id love to see a 12600k with a 96mb cache instead of 20mb.
well, now we don't have to...
imagine beating i9 with 9W usage
@@eliezerortizlevante1122 Imagine that 9w is run off a solar panel.
I upgraded from 3600 yesterday, and my god, I have never seen a few of heavily memory-intensive games run that smoothly before. Definitely a worthy purchase!
Im currently using the 3600 and planning to upgrade and is targeting the 5800X3D, Should i go for it or upgrade my GPU instead?
@@evilemil cpu is more important than gpu. GPU gives you extra frames, CPU makes everything smooth
@@evilemil I just upgrade my 3600 to 5800X3D yesterday, got it for $300 and those extra cash made me bought Rx 6800 XT for $500 and put my RTX 2060 to rest..
now I live on another dimension..
I love mine. Fantastic chip and just slotted into my old system. Nice to see a company not forcing you to upgrade other pc parts too
Buying the 5800x3D overall was a way better investment to me than upgrading my RAM and motherboard to support a future gen processor of similar speed
just a friendly reminder, the 5800x is competing with intel's next gen 12th CPUs instead of the same gen (11th gen) CPUs.
that alone shows how powerful it is
u sound like stewie from family guy :D
@@jondonnelly3 Now that you mention it , I hear stewie as well XD
it's worse, AMD released Zen 3 to compete with 10th gen, 11th gen was released 6 months after Zen 3 and still lost in all SKUs lmao.
@@RafitoOoO YUP, true. If they release a 5900x3d then its gaming performance might be on par with 13 gen.
5800x3d was meant to dethrone the 12900k/s for gaming king
Late, but we know it’s the best looking review in the tech space!!
@Od1sseas *best looking*
bit gay but okay
@@prot018 I'd argue HWC is on par. Those guys really go above and beyond with their cinematography
Absolutely
OT for eye candy and itx+esports perspective
HUB/GN for detail
us aussies have tech yt in a chokehold atm lol
Dude your charts are the best now i love the 1% and .1% lows on the chart the most important metric for sure. You also did amazing job benchmarking while playing apex for 25 minutes in a real match. I know you spent a long time making this video and it shows great work
Its by far the best cpu for sims in VR. All the frame drops I use to get in DCS have completely vanished with the 5800X3D and thats coming from a 5950x!
This guy gets it.
@@michaeltrivette1728 My man!
I wanted this CPU so bad... But I went with the 5800x instead got it on sale at Microcenter for 250$. The 3D model is a bit expensive atm but for a good reason.
In the UK right the 5800x3D is £180 ($220) more expensive than regular model on Amazon. Really not worth the price difference imo. I'd say only get the 3D models if they're of comparable pricing.
I agree with you, at least we have options and decent price points now 👌🏼
@@GamingXpaul2K10 link ? because i can not understand how you could get it as cheap as that.
It's not for a good reason in my opinion. In Germany you can buy the 5800x3D for 480 euros, while the top of the line 5950X is only 50 euros more expensive. Yes, I know that the 5950X won't be as good at gaming, but it is overall a better CPU.
@@Paraclef he meant £180 MORE than the 5800X. It's £440 and the 5800X is about £270
Upgraded from a Ryzen 3600 to the 5800x3d and for me it was just a really good step. For sure worthwhile to do as any other upgrade to similar performance would be lots more costly - and also time wise since there is zero need to reinstall Windows or anything.
The only trap is, that you must update the BIOS on your motherboard with the old CPU running and then you can swap CPU.
I went from a 9900k to this thing and I'm pretty happy with how things have gone. I find some of the games I play to be less stuttery than before, which is really what I was looking for. FPS is everything, don't get me wrong, but it's more about how stable those frames are delivered.
was it worth it.i have a 9900k and everything runs smoothly
Yeah say plz... Switch from i9 9900k to this it's worth?
It’s just rough to spend the cash cause my cpu is really fast and has same amount of cores and runs faster as far as straight speed idk man
I just bought the 5800x3d and wow what a difference it made compare to the 3700x that i had. MSFS looks so good now and runs smooth. I'm a very happy camper with this chip in my system.
Nice, im gonna go to 5800x3d too from 3700x
A 5600X3D would be very interesting
A budget gamers dream
A 5950x3d maybe so I can switch my 12900k otherwise losing almost twice to performance in multi task for 3% better In gaming idk man
@@HosakaBlood huh why would you switch from a 12900K. Just throw in a 13900K if you already got an Intel mainboard.
@@WhiteoutTech maybe for less power... he 12900K uses 240W in multicore. Intel and nVidia should be scolded for doing this. Technically win on paper with absurd power draws, because they cannot take a loss. It will make AMD do the same thing, for the disgrace of the industry. It was said that the new RDNA and Ryzen will use more power, unfortunately.
@@HosakaBlood 5950x3d &5900x3d will not have point because productivity will be pretty worse then normal cpu and gaming will be same or worse then 5800x3d and cpu will be too hot with 16 core and at last too expensive so 12&16 core CPUs with 3d don't have point
This was the comparison I was looking for, thanks!
The graphs seem a little confusing to me. I completely understand that it's 0.1, 1, and avg, in that order but my brain wants the color to represent a range rather than a delta. Maybe it's because of the inclusion of 3 values on 1 bar rather than the usual 2. Great comparison though. AMD is killing it.
It's the colors that throw the readability out the window. They probably should be lighter shades of the same color instead of contrasting ones.
you dont understand cause ur a woman
@@TheAnoniemo Using pure white to represent the 1% average kinda creates a divide between the average and the 0.1% and the placement of the numbers does not help either. I think a different color other than white or swapping colors between 1% and 0.1% would easily fix the issue
I'm quite happy with my 5800X3D after switching back to my AM4 platform after struggling with a 12700K for a few months. The 12700K is great (and really great for the price if you live near a Micro Center), but nearly a year later and linux support is still pretty rubbish and the power consumption is ridiculous if you let it run unbridled. The 5800X3D is easier to cool and uses far less power and still works fine for my non-gaming workloads. Side note, Ryzen seems to get quite a bit more benefit out of extreme memory tuning than Alder Lake - I settled on running my old B-Die set at 3200Mhz Cl13 (with ridiculously tight sub-timings) and I was able to eek another 5-10% over the CL14 XMP profile and it even matches my tuned 4000Mhz (fclk @ 1:1) settings while using less power and without hammering the memory controller / SOC as hard.
I have no clue what you are talking about in regards to Linux support being rubbish, I have a i9 1200KS I mostly run a lot of docker containers for both Ethereum & Cardano. DDR5 has also been very reliable at 6000Mhz with 2 32gb dimms
Linux has been an absolute great for Alder-Lake especially if you update your Kernal
@@brandonj5557 Good for you. Not sure what distro you are using but that was not my experience. Intel's thread director wasn't added until 5.18 (released in May) so prior to that there was no intelligent thread direction which lead to very unpredictable performance as there was no delineation between P-Cores and E-cores. Personally, I was only able to get predictable performance by disabling the E-cores all-together (this was actually the only way to boot some distros for quite some time after release). The easier hypervisor distros (i.e: Proxmox) are still running older kernel versions as well.
I wasn't saying it won't work, it is just a bit rubbish in my opinion and takes far more effort than my Ryzen setup did.
As far as the memory comparison goes, it is a bit apples vs oranges. I was running a DDR4 setup and it was never particularly stable running tight timings and I could only run up to 3600 in gear 1. Since you're running DDR5, you're in gear 2 by default which puts far less strain on the memory controller. I could manually switch to gear 2 and it was more stable; it was also more slow.
Again, not saying Alder Lake is bad by any means, I was just sharing a small portion of my experience with it.
I thought I was mad swapping out a 5600x for a 5800x3d on my gaming rig. I was so wrong. It's awesome and you feel the difference.
New background looks awesome man
Great benchmark review test. Really well made, took everything I wanted into play. Power consumption is just crazy, should be reason enough to buy AMD for most people.
5:12 sheeesh absolute pogger skillz
You're not the only one testing the 5800X3D so late after release. Is that just a coincidence or is there a second round of samples or something?
I ordered this beast on release day, love it for my shitty unoptimized DX9 games that are permanently in draw call limit. Really helps in those, like Planetside 2.
Found your channel not long ago and your shots are amazing. Especially when explaining something through a visual, it hits my eyes at the same time it’s hitting my imagination
It's crazy how inefficient Intel is.
im not a fan of intel just throwing power at stuff to compete with lower power hardware
I remember saying the same thing with AMD Bulldozer .
We must remember though, 12th gen is basically like having 2-3 CPU's under 1 heat spreader. (ex: 12900k is like having an 11900k and 2 4670k's under 1 heat spreader.) Obviously its gonna take more power and produce more heat because 1+2=3, Not 1+2=1. People completely like to overlook E cores exist in these comparisons. My 12600K at its current 5.3 all p core, 4.0 all E core pulls 180w. At stock speeds and without E cores enabled it only draws about 90-100w. That's an unlocked Intel CPU under 100w. Most of the power draw spikes is the motherboards fault. This time around, 12th gen can be undervolted by .05v to .1v and take literally no performance hit. The CPUs simply were designed to draw more voltage than necessary
@@deadly_mir id love to see a 12600k with a 96mb cache instead of 20.
@@LordRobbert That would be super damn fast. But also probably pull well over 250w.
Awesome video. Super cool to see this from a sweaty gamer perspective, and the variety of benchmarks was sick
I think more dense and higher capacity cache is definitely the future. Memory accesses are really slow and a larger cache can help with that. Once a larger cache becomes more common, the game developers might actually start optimizing for it, which could give an even larger performance boost.
If game devs weren't optimizing their games taking into consideration the cache sizes all of your games would run like shit. Bigger cache sizes are essentially free performance.
@@bersK00I am sure they already optimize for cache size, but if you statically know that the average CPU has 100MB L3 cache instead of 4MB, you might actually adjust the code. And maybe in the future the L3 cache will be 1GB or more.
@@hbgl8889 ALL THIS will do is make those games run like trash on anything other then a CPU with 100MB of L3 :) ..
@@defectiveclone8450 Yes, that... that's the way technology progresses, yes.
@@tim3172 game engines lag and are made to run on the most common hardware of the times.. :) dont see anything needing large cashe anytime soon.. we need the market to move to these new CPUs before that happens
Hey, been watching your videos for a while but this change in background looked amazing. You looked very nice and personable in this video and the background switch up to a more colorful one is amazing. Anyways, great content as always!
AMD has definitely found a winning formula for a top gaming CPU. With that being said, next gen 3D V-Cache CPUs are gonna be absolute monsters. If a mid range Zen3 based CPU clocked to "just" 4.5GHz and added extra 64MB of cache, can do this much damage in games, imagine what a Zen4 based CPU with the same technology can do. If I had to make an educated guess, the 7800X3D is gonna be clocked to *at least* 5GHz (or maybe even more, considering the jump from 7nm to 5nm AND the non-3D V-Cache models will be boosting to *at least* 5.5GHz as shown by AMD in their Zen4 tech demo, also you have to consider the matured 3D V-Cache technology by then) with the same 96MB of L3, then you add the IPC increase, etc. Now, pause for a second and think about the fact that AMD will also have dual CCX 3D V-Cache CPUs coming as well, and those will be 12/16-core 192MB (96MB of L3 per CCX) monsters. And what's best of all, with the 3D V-Cache technology you don't need to invest in expensive memory (especially DDR5) since there is very little benefit thanks to all that cache, so something like a basic DDR5-5200 or DDR5-5600 memory will be more than enough and those kits have already started to come down to reasonable prices.
Yes! And then double the power consumption! And double the price! Don’t forget that.
@@tywoods7234 So a brand new product with improved performance on a brand new platform will cost more than the old one? Shocker! As far as consumption, don't worry about it, it'll still be much more efficient than their competitor ;)
Looking forward to the 5600X3D and 5900X3D down the line, that power difference is also is given that we are looking at 7nm vs 10nm so imagine what 5nm will do when it comes to efficiency. To match the latest and greatest of Intel in "gaming" by an almost 2-years old CPU just by adding more cache is impressive.
Damn. makes me feel good to know that I have upgrade options within am4. My 5600x is going fine but I might get one of these used in 2-3 years
The answer to question is yes. Even with Intel 14th Gen and Ryzen 7000 Series this is still a Top 5 Processor for Gaming.. And you can pick it up for around $300 now.
Love the benchmarking, you definitely went out of your way to get accurate results so thank you for that!
5800x3d is the best cpu of the decade
With the new BIOS versions coming out for the 5800X3D to allow for more tuning flexibility. I would like to see how it performs with an under volt offset as well, similar to how you did the PBO under volt, which really helped with my 5800X. Really looking at going to the 3D version, but now that there is word of more 3D AM3 versions in the future I'm really hoping for a 5900X3D. I still get hitches and stuttering with certain applications in which I know the 3D would just completely eliminate.
PBO2Tuner Tool
I've always been more of an Intel guy myself (just never had a AMD device), but after seeing power consumption numbers, made the switch immediately to a 5600x and do not regret it. Hoping to upgrade my GTX 1080 Ti FE soon to a 4080 depending if an 850W PSU can support it.
Ms get i5 12600k since it better deal and didn't spend chunk of money to buy r7 or i7 11th gen.
12600k= 10% better than 11900k
never fall in love with a compagny, i know people that only buy certain brand of motherboards because thats what they are used to, so they buy shit motherboards a lot of times, brands and their motherboards quality are always changing
Really doubt 850w would be enough 😂
4080 would be kinda overkill with a 5600x imo (and that psu worries me too, especially with transient spikes)
@@aerosw1ft With 4k monitor it is okay with 5600x.
My Gigabyte B350 Gaming 3 is 5 years old now. With support for the 5800X3D. I'm using the 5600 now.
Glad that i still have a CPU upgrade on it.
always presenting great content Ali ! Big fan here... cheers from BR 🇧🇷
What sort of cooling were you using for the 5800x3d and what were the temps? Could you explore undervolting with it? I’m using a 240 custom loop with it and a negative voltage offset and I haven’t seen much of any performance loss but much better temps.
I'm on a nh-d15 and I'm seeing an avg. Tdie temp of around 70°C while playing Outriders for example. not too hapy with it at the moment, but it was the first time i used Kryonaut on a CPU, and I believe i used too little
i have a dark rock pro 4 on mine
@@TheRockRocknRoll what sort of temps are you getting at stock settings?
@@lauchkillah 70 C sounds nice to me. With a 240 in the FormDt1 at stock settings my 5800x3d was hitting 80C after playing Warzone for an hour. The negative voltage offset brings me down to the mid to high 60s. It is summer time so that has a little to do with my bad temps though.
@@theb4r138 What 240 aio are you using? EK basic?
Half the power usage for better gaming FPS, that is just insane! I have also seen other reviews show that the 5800X3D is the best CPU for VR.
Crazy how much power the 12900ks pulls. I had to undervolt my 12700k because of temps but I guess it’s nothing compared to a 240w tdp
Does undervolting reduce performance?
@@jhellier it should actually increase it if done properly. My cinebench score went up a few hundred points since temps stayed lower as the 12700k will thermal throttle under constant 100% loads in an application such as that.
@@jhellier no, barely. My 12900k is actually more efficient than a 5950x, calibrated loadlines and a undervolt resulted in a cinebench score of 27k @158w, mind you for the 5950 to break the 27k mark it needs 166-175 watts with pbo. Total performance drop was 5-7% for me for all core and bone in gaming, i actually end uo boosting for longer and higher dur to reduced temps
@@connordaniels2448 What cooler you built on top of it? I only have an air cooler Noctua NH-D15 and i can let mine run for hours pulling 205w and not throttling even in summer temps but it's close to max temps.
@@williehrmann I’ve got a 240mm Corsair H115i Elite Capellix but it was close to max temps so I decided to undervolt it. Honestly it doesn’t affect performance at all and dropped my temps by about 20c.
Man half the time I already know the answer but I still watch your videos anyway. The editing and presentation style are just so sick
was running a 3600 last month upgraded to 5800x, didnt consider 3d variant cuz this would cost me 300€ more.. im fine right now :D
I went from a 5800x to a 5800x3d to pair with my 3090FE as needed the 5800x for another build. As I game at 3440x1440 I wasn’t expecting to see any difference in fps and I didn’t. The idea was to given me some headroom for the 4090 when it launches. However my 1% lows have improved and overall I do feel like the smoothness has improved in games like Warzone, so I am pretty pleased with it and I would definitely recommend to an existing AM4 platform owner looking to upgrade their cpu.
@kureto 1294 I will try to offer some balance here so you can make the best choice for you. The 5700x should do the job nicely for a 3070 and in every measure will feel like an upgrade from what you have. The 5800x3d will not feel any quicker in the desktop or none gaming applications. However, I believe the 5800x3d will still give you better 1% lows and smoother gaming even with a 3070 (which is a cracking card btw). What the 5800x3d will likely give you, is the ability to drop in a 4070 or better or maybe even a 5070 at some point and not feel the need to change your cpu. I’ve seen B550 MSI Tomahawks for around your budget, but also look at Asus Rog Strix F-Gaming and Rog Strix A-Gaming boards as they may be something on sale at around that price point. The key is to get a board with the BIOS flashback utility, as your board could be on an older revision and may not support a 5800x3d out of the box.
**edit** I would also say from reviews the 5800x3d is less sensitive to fast memory, so it will perform better if you have something slower then preferred 3200mhz cl14 or 3600mhz cl16.
I love my x3D! Upgraded from the basic 5800x and saw a massive jump in low frame rates and tighter frame timings.
I migrated from the 5900x and it was way worth it. I put the 5900x in a server instead
Yeah I f*d up and got a 5800x my 5600x almost performs better
@@Cremepi ??? how does your 5600x almost perform better
@@MightBeBren same clock speeds on single core, with lower temps. Unless you’re doing heavy rendering production you’ll be in single core work load. Even gaming streaming with multiple processes my 5600x is better value for the $
@@Cremepi what temps do you get on your 5600x? my 5800x doesnt go above 65c in games with a 4850mhz all core overclock at 1.4v and idles at 24c
As always, great production and camera work
That's what I look for, efficient performance and not just adding more raw power. I recently built a new PC with a 5600x and a 3060ti and I can already feel that my room warms up faster. Not everyone has an AC so I don't wanna deal with how much heat the higher end parts give off using so much wattage.
thanks for taking the time to give all these chips a fair chance! loved this vid
I ordered mine 🥰👍 at launch and got it 2 weeks later and it’s been running on my 1080P ITX Skyreach Mini S4 setup with Alienware 360hz monitor and the fps is amazing🤩🤯, the temps vg, the fan noise much lower than my older R5 3600 (non-x) under gaming loads etc. The power draw (on5800x3D) was lower under load than the R5 3600. It’s definitely a powerhouse chip that down the road in a few years will still command a hefty premium 😉 and perform well for a long time.
I really like this kind of graph. Easy to digest and understand.
The show starts with 5600X3D.
5800X3D was down-clocked to meet 142W package power limit
But that headroom gives 5600X3D to become the new budget king
95W TDP? Let's go
Dead platform :)
AMD stated that they locked out over clocking due to the lower voltage limits of the V-Cache die.
The higher the frequency, the higher the voltage is needed to be able to switch the transistor within that shorter timeframe.
The V-Cache die is designed using a denser cell library as they can fit more cache per die and to lower power consumption. This is ideal for Epyc processors.
Unsure if the lower voltage limit is due to that dense cell library, the hybrid bonding process, or perhaps limitations of both L3 caches being in the same voltage domain?
your videos are very helpful and is easy to understand this rabbit hole of PC gaming
Very easy to consume yet one of the best reviews on the net with lot of useful content. Thanks again! Cool channel as always!
Great reviews as always, also a great Twitch channel for especially Apex Legends gameplay.
Would this cpu have any issues pairing with a rtx 40 series when they come out? I desperately need to replace my 10 year old i7 2600k pc but I want to wait for the new 40 series gpu and just ride out my 1080 ti til then.
Got upgraded yesterday. Could not be happier. It's a beast for MSFS
"No one's going to be playing at 200fps plus."
Every time i hear this i feel like the only one in the world with a 240fps monitor. Though of course it is more important for competitive games i also enjoy having higher fps in single player games.
Do you play single player / story games at 200fps? I've always just capped at 120 as a force of habit, never really seen the benefit to it but I've always been kinda tempted to just let the system go wild and see what frames I could get in Horizon or something!
He probably says this because singleplayer games know that people play for the story/world so they can bump up the graphics, sacrificing fps so you wouldn’t hit 200fps plus in most singleplayer games if playing on high-ultra
damn that background looks as sick as this super clean review
Gonna wait, so I can spend less cash for more cache.
do you have a new camera? your portrait shots and close ups are just that bit crispier especially if you consider how much youtube smashes down the video.
With half power and almost same performance compare to 12900ks, 3D-V might be a right way for extreme gaming setup in itx build🤔
at ~half the cost
@@rapamune not just half the cost since you are reusing a good majority of your components
@@13thzephyr Valid point, indeed
Reminds me of the days of the FX57. An absolute monster from about 15 years ago.
Whats even more awesome you can get the 5800x standard for 300-350 euro currently in Europe. Insane value for the fps it can output.
@@tjames1994 lol almost same got mine 3 weeks ago undervolted it last night. Awesome cpu
@@tjames1994 if your building from scratch for Thas prices there's also 12700k oc the shit out of and get similar level of gaming performance while having more core
@@HosakaBlood yes thats good option too for new builds. I just upgraded from a 3600 on my sff using a x570 Aorus pro wifi
@@tjames1994 about the same 1.25v at 4.6 stable - tested max temp of 82 degrees.
@@tjames1994 if you already have amd board well your choice is ryzen 5000 then
Just got this to replace my 3600, combined with my 3070, it made The Finals run.... SOOOOOOOOOOO much smoother. Holy cow. What an insane upgrade.
1080p-low is an e-sports thing for sure, getting those ultimate high FPS.
To be fair a lot none PRO e-sports gamers are moving to 27" 1440p monitors.
Showing some 1440p benchmarks would be much more fair, cause we know that the 3D cache boost a lot on 1080p resolution.
The fact that 1:55 intel 12900KS has 1GHz more boost is insane than the 5800X3D, the power drawn is a 2x too that is just bonkers!
We have already seen that AMD demo their new coming gaming CPUs with 5.5GHz so intel is in big trouble!
I actually moved to 65" 4k OLED, so the CPU doesn't matter anymore. Gotta have some GPU power.
New camera? Live this insane contrast and vivid colours 👍
Honestly I would just wait for amd ryzen 7000 to come out and I would wait for pci 5.0 and ddr5 to get cheaper
What do you need pci.5 for at this time, or at the time 7000 series comes out?
@@katinahoodie if your on ryzen 5000 already, theres no reason to get this and just wait for the next gen stuff is probly what he meant
It's a rare thing in the computing market, but having this as a "special" product within AMD's product line actually makes sense. As impressive as the 3D vcache is, it really does not help in productivity uses, or even any game that can't take advantage of that extra cache (older games that barely make use of the regular 32mb of cache in the stock models.)
U should have added Escape From Tarkov to the tests 5800x3D s 100mb cache is really impressive
Or any VR/sim/RTS testing haha, the 5800x3d stomps all over the competition
Yeah, 5800x3D is huge in tarkov, even with weak gpu. I got 30-50% fps increase from 1700x with rx 470 gpu. With better gpu I could get wayyyyy more.
dope studio man
12700k is the sweet spot high end cpu atm!
change my mind
Yeh that or 5700x/5800x for 50-75 less is still pretty good.
5900X tyvm bye.
i'd say 5800x or 12700F
12700F for new buyers, no contest. For upgrades, 5800x3d on the high end or 5900x for mid tier
For mixed workloads best value is the 12700F.
For pure gaming its the 5600.
love the videos man one of the best tech channels on youtube
Currently using a 3700X that I've overclocked the crap out of (sustaining 4.6GHz all core, pretty happy with it, especially on my B450 Tomahawk board)
I was considering waiting until AM5 CPUs drop, go with the first gen on that socket and then have the ability to upgrade as CPU generations go.
But looking at this now, I'm wondering : I could very well just drop a 5800X3D, get a GPU upgrade, and I'm able to keep my RAM and motherboard for a few more years.
I was quite a bit drooled over the perspective of a totally new system, but extending the life of the current one is also quite tempting
Super cool, thanks! Certainly my fav tech channel, the way you present stuff is just golden so cleannn. Would have been neat to see the 5900x in the lineup given that it would be an interesting comparison within AMD and given that comparison on intel has a more premium product with the 12900KS. If the results were not interesting at least a quick comment where it should land between the items in the lineup would be nice.
Upgraded my 5800x, can certainly tell the difference in Warzone and WoW
waste of money
@@lore00star should of just spent 1.5k on a new board, ram and a 12th gen then. Would of made more sense .. 🤪
@@krispy4605 ignore those morons lol, people who haven't seen it might not even believe how much smoother it is
@Skyfox couldn’t agree more. If you play at 240+, 4k or have an Ultrawide it’s definitely worth it. At 3840x1600 on Warzone I went from 90-125 to 110-141 (capped) dls/rt disabled / max settings
great video as always
A bit like the old i7-5775C with its 128mb L4 cache, to get the most out of an aged platform. The question is, how much of the game's main loop fits inside the special cache.
the 5775c's cache had 1/40'th of the bandwidth and 5x the latency IIRC. It was actually slower than overclocked DDR4 on a 6700k.
Awesome review Ali! It would be have been nice to see 4K framerates!
Currently on a 5600X and everything is flawless but when zen 4 comes out and this cpu drops might upgrade to it since it runs on the same motherboard
How long will amd be making these
zen 4 is going to use a new socket :/
I'm still using my i7 8700k! I have no idea when I should upgrade.... some people say im fine and some people say the newer CPUs make a pc feel so much smoother and snappier..
@@Ladioz 8700k is still a decent gaming CPU. I wouldn't upgrade right now, but you could consider upgrading to 12th gen when their prices fall a bit, or zen4 if it strikes your fancy
@@Ladioz upgrade when you feel you need to upgrade. Yes newer CPUs will definitely improve your performance but if what you're having now is enough for you then why spend the money?
I got this CPU a month ago for only 350 USD, I was using a Ryzen 1700 , the difference is incredible.
Damn the i9 sucking up almost 190W of power
This is just Valorant. All threads power usage is way higher, about 250-320W, while ryzen never gets above 100-130W
i could just be stupid but the indication that the white block in the middle of the graph isn't a number we have to infer, and the lack of a unified color choice on the far right text when the key at the top has avg fps in orange, was slightly confusing. made 1 seconds work into 5, not a real problem by any means. thanks for the comparison!
I think it's remarkable how good this CPU is compared to the i9 when you consider the TDP. Intel are just throwing clock speed and watts at their CPUs, that's all.
Exactly
I used to have an AMD Fx-8320, it was not the best. A ton of heat and no computing power then I upgraded to 5600x. The AMD Ryzen solved all the issues and beats intel ever since they came out. Congrats to AMD!
Same here but from 3770K to 5600X, which i Overclocked it to 4.85Ghz!
AMD Ryzen 5800X3D outclassed Intel i9 12th generation 8 cores.
16*
In multitask will favor 12900k
@@LightMCXx With 2.5x power usage you get 1.5-2x speed.
AMD is king here.
-don’t need an expensive DDR5
-can run on an old board
-low power consumption
-way cheaper
Should have included some top end ddr5 imo. 12700k+ddr5 is roughly the same price as 5800x3d+ddr4.
Say what?
That would ignore mobo price though, which the 5800x3D can run in very cheap mobos without issue. The price difference would still be around $90 minimum.
Not sure where you guys are from, but in my country the 5800X3D is $20 cheaper than the 12700K... and let's not even discuss about the price of DDR5 memory.
@@ruxandy in most locations the i7 is about 200 dollars cheaper. So for new buyers the 5800x3d doesn't make much sense in those jurisdictions.
@@kelownatechkid I see that the Intel CPU is currently $100 cheaper on Newegg, and that's with the current $50 discount that it has (so the price difference would normally be ~ $50). Anyway, not sure where you guys see a $200 difference, but if that's the case, yeah, it makes more sense to buy the 12700K.
This was a really great video! As MANY other articles/videos mention is the POWER consumption as you noted here. Intel is absolutely ridiculous with it's power-draw! AMD using SO much less "juice" yet out preforming the I9! What does that tell you here? This is the reason I switched to AMD over Intel is because of this very reason in my latest build. I am only air-cooling vs. the ONLY OPTION for Intel which is water-cooling and the ADDED expense with the ALREADY "pricey" Intel offerings themselves! This is a NO-BRAINER for a "cost" standpoint! Now the 5800X3D does have the lower clock-speeds and will "hamper" other operations such as video, coding, compiling but, this CPU is listed as a "gaming" CPU so there are trade-offs and is NOT for everyone. For cost vs. performance AMD offerings make WAY better sense and will NOT have the added "hit" on your monthly power bill either! The other ADDED bonus is you can "drop" the 5800X3D into MANY existing AMD4 mother-boards giving "new" life to an older build where as with Intel you need a NEW mother-board, water-cooler, and MUCH more cash to do it! The price difference for Intel vs. AMD is ridiculous!
AMD beating Intel with server cpu's, its fucking amazing. Intel should be fucking ashamed lol ( ryzen cpu's are basicly mini epic cpu's)
that Kraber shot at 5:22 is absolutely NASTY
AMD just needs to invest properly in its software side. Drivers, crashes, BSOD.
I'm referring to both its CPU and GPUs.
i think you live in another era, my friend
I think you need to revise your so-called "knowledge" because none of what you said is being experienced by other AMD users.
@@Constantin314 tell that to everybody having USB problems on AM4 platforms (yes even with the latest AGESA) and black screens and other isues on 5700XT and 6000 series.
Lol, this is ridiculous. I have like 6 zen systems including the 2600, 3600, 5900x, and 5800x3d. Across b450, x570, with varying amounts and types of ram including ECC - they're all rock solid on Linux and Windows. Compared to my Xeon and Core systems, they're exactly the same in terms of reliability.
@@kelownatechkid I guess everybody was lying on the internet about the USB issues then... I still hear streamers USB devices unplug and plug in to this day. It's a real issue, just becuase your use case doesn't encounter it, doesn't mean it doesn't exist.
Great video! Very well-made!
That power usage difference really matters to me in south east Queensland. In summer power hungry parts are brutal to live next to lol
Will purchase this for my next build when price drops on NEXT GEN release..
Which DDR4 speed? Those information are important!
I swapped out my 3700X with the 5800X3D with an rtx 3070. My fps on everygame went up at least 15fps and some games a lot more.. this was absolutely worth the upgrade. I couldn't be happier.