Our recent Ryzen 5 5600 review & benchmarks show how it’s a good value CPU to consider right now: ruclips.net/video/ifI9nnmW5sg/видео.html
Our GN Tear-Down Toolkits now have a 7-year (retroactive!) warranty! This is a great way to support our work, our reviews and purchasing of components, and get something useful in return! store.gamersnexus.net/products/gamersnexus-tear-down-toolkit
Coasters are also IN STOCK & SHIPPING: store.gamersnexus.net/products/3d-coaster-pack-4-component-coasters
Our full review for the R7 5800X can be found here: ruclips.net/video/6x2BYNimNOU/видео.html
Our review for the Intel i7-12700K is here: ruclips.net/video/B14h25fKMpY/видео.html
I love these in-depth reviews, but do you regularly do an overview as well? It seems every week it's "AMD is destroyed," then "AMD is dominating" - lather, rinse, repeat. I know they're different price points and market segments, but still.
@@VndNvwYvvSvv Wow, I just commented this stuff. It's complete trash reporting imo... Like when Linus said Nvidia was an evil company for not giving free GPUs to reviewers that refused to review the product... then went on to go "WOOOAAWOAWO! 8k gaming 3090 wow!!!"
Can't wait to peruse the OC settings for the 5900X in the article in the morning (midnight here). It sounds to me like AMD has kind of fibbed on the 15% if you're not running a well-tuned PBO and max boost overclock on the other 5000 CPUs (and it seems you are not, based on the frequencies you report the other 5000 CPUs running at), because you can't just compare stock apples to stock apples here solely because overclocking is disabled on the 5800X3D. Hope there's no goalpost shifting on it, because that really only obscures the true end-user performance.
I really appreciate the GN brand. No clickbait in the upload's title, diverse comparison metrics, no hyperbole for the sake of manufacturing sensation: Steve tells it like it is.
So here we are 11 months later and the 5800X3D is still in the conversation, even against its direct successor, the 7800X3D. AMD gave customers a hell of a CPU for AM4 before ending the platform. While some ask why, I see an intelligent motive. With the 5800X3D still in the conversation, not every Ryzen user needs to be an early adopter of AM5. Motherboards for AM5 are stupid expensive, so AMD said keep your X570 and install this. They also didn't price it to the moon and, more recently, you can find this chip for $320-350 on sale. That's insane value when you factor in not having to change literally anything else about your PC: just change the CPU, apply thermal paste to the cooler, hit F1, and you will see a noticeable upgrade for gaming no matter what previous chip you had.
I think it was a great idea. It cemented the X3D branding as a premium gaming focused chip line up in the eyes of consumers. If they started with a dud, consumers wouldn’t have been excited for the 7000X3D chips. It was also a massive cherry on top of the AM4 platform in general. It was such a great CPU, and it’s still relevant today.
Just installed my 5800X3D last night. Bought an X570 board (up from B550) and new RAM just to pass along my old sticks. But yea full AM4 overhaul in June 2023 and I’m not batting an eye about it
@@PushUphill hope it was one of the amazing X570S boards. Just upgraded from my X570 and I like that it has a lot of the quality of life upgrades that AM5 comes with.
What blows my mind is that I could stick with the same B450 motherboard that hosted my previous 1700X and currently has my 3700X, and put in this top-of-the-line 2022 CPU. That's remarkable.
Even better is that the 3700X already does perfectly fine, so you can just sit on it and wait until Zen 4 is out and decide if it's worth it. If not then the 5800X3D will, by then, be some $300-ish "life extender" that will still rip and tear in every game for at least 5 more years to come...
@@andersjjensen yup, and this is pretty much my current plan. I've got a bit of a CPU bottleneck but not a burn-$450-to-fix-it bottleneck. AM4 socket will be remembered fondly
@@paulpietschinski3282 For 400 series and 500 series boards, yes, there should be a compatible BIOS update available. For 300 series boards: your mileage may vary.
@@wotwott2319 Intel would rather nuke every single one of its fabs than give customers more than 2 generations of hardware support per socket. Their profits are more important than your budget or respect.
Thanks for everything y'all do for the tech community. Really appreciate this review. Also, love the merch too... I use everything I've bought from you guys on an almost daily basis. Well done guys, well done!
The best thing is, this was basically just a test run, and on a fairly old platform. Seeing how well it can do means the same technology can be applied to future platforms and architectures, including GPUs, since they also use L2 cache, which is fairly important but rarely ever talked about. Imagine if the same design were applied to a flagship like the 5950X, or used to give a midrange option like the 5600X. The possibilities are endless for the future, instead of just piling on more cores and power/heat when not all games or applications need it. Interested to see what it bodes for the future.
One of my first thoughts when I heard of it was of how well it might do as an infinity cache alternative for the RDNA2/3 GPUs. One of my later thoughts was cursing AMD for the lost potential in not launching a 5950X3D, as it would probably crush last-gen threadrippers and be the absolute best final product of a socket, heh
Imagine it on even lower-end parts, like a 5300X3D. It would be hands down some of the best bang-for-buck performance. I can't wait to see this proof of concept being actualized and further pushing innovation. Even if it completely crashes and burns, it will still be a push in the right direction for the entire market. Almost makes me regret buying a 5600X, because in theory a 5600X3D would be nearly the same performance as a 5700X in gaming.
@@mammy24 Eh, it would be cool... but not that plausible. They already bin these to be on the better side of the dies (probably not as good bins as the Epyc dies, but close) before they attach the cache, as you can see from the power consumption levels. So, at best they might get enough defects to warrant a 5600X3D, but not lower. Edit: plus, you've gotta think of the margins. With this being a newly-produced technology, it's gotta eat up some of the profit margins of the chips. Lower-end chips already have the lowest margins, so economically they make the least sense to sell. Maybe in a couple of years, when it has paid off its R&D costs, it might be plausible enough to make. But for now, the only other chips worthy of it are Epyc/Threadripper, because even on their lower end they're expensive enough to cover the costs.
@@MistahGamah This thing is pretty damn affordable for what it is: $400, and it walks all over Intel's top-of-the-line chip that's twice the price... I give it a year or two at most before this setup becomes more common.
I love how AMD has recently been upping their game and really offering new technology with almost every generation of CPUs they release. The 5000 series is most certainly the pinnacle of the AM4 socket.
I TRULY hope they do that with the next socket! That would be amazing! I'm on an i7-9700K and honestly AMD is getting more and more intriguing with their Ryzen series... I basically OC my chip to 5.2GHz and it has kept up within 30-40 fps. I don't game at 4K so it's honestly not an issue... but I think I just might make the swap this time around. 4K looks gorgeous... but then again I mostly play FPS games now and 4K seems like a waste imho.
"Thanks Steve" Gets me every time. Looks like a quick chip. I wonder if there will be 3D or XT versions of the 7000 series that release later incorporating the 3DCache
I am thinking this will become a normal thing they do to newer chips to massively increase gaming performance, as a default manufacturing method. After all, they did this to the 5800X3D and yet the price was exactly the same as the original 5800X.
The 3D version will be the gaming CPU, the non-3D the normal and productivity CPU. I don't think there will be much room for non-3D gaming CPUs in the near future. This is the new standard.
What makes this so impressive is that this is a year-old CPU that has been made to match, and at times beat, a brand new architecture just by adding innovative packaging. It's the same node, same architecture, with no clock speed improvements (a regression actually), and it can go toe to toe with the special edition of Intel's best new chip. What's going to happen when they put V-Cache on Zen 4????
The way I see it, you buy this CPU now, and you'll be largely future-proofed and won't have to upgrade for a few generations. Games are only now just about requiring 6 cores, and the consoles are mostly going to be running in 7-game-thread modes, so you're good on cores. Frequency is reasonable for the power efficiency, and the amount of cache it has will be more than sufficient for some time.
Not that impressive though, cause this CPU was released 6 months after the Alder Lake launch, and Alder Lake launched 6 months after Rocket Lake. So if you go by that logic, then Intel's a lot more efficient and better at making chips than AMD.
@@TechyBen Basically I'm using his own words to prove him wrong, and yeah, I know that it didn't take Intel 6 months to create the Alder Lake architecture.
What a story the AM4 arc has been. It had its ups and downs, but it goes out with a bang, and it managed to save AMD from bankruptcy. Can't wait to see what AM5 will do to the industry.
It's not a big bang. It would be if it were the thing to switch to if you wanted to stick with AM4. But since I have a 5950X and game at 4K, there is no point to it for me.
@@alexworm1707 Not if you have a general interest in PC tech, and wanted to find out if it would do anything for 4k. Do you want me to tell you how you should use your time?
@@alexworm1707 That's just not true for all games. Plenty of games are more CPU-driven than GPU-driven. For instance, I play Squad at 4K, and when I went from a 3700X and 2080 to a 5900X and 3090, I noticed more of a difference when my 5900X arrived 1 week after my 3090 than I did from the 3090 upgrade.
I’m finally the target customer :D upgrading to the final AM4 chip in a week! Been waiting for this review for a couple days now, great job as always delivering us the critical information ahead of the release, Steve!
Almost 2 years after this CPU came out, it still provides one of the best gaming experiences on the market. For AM4 users looking for more performance, this is the definitive answer.
@@kostastsitsis1309 I'd say a 5800X3D would be more than enough for any graphics card with 16GB of VRAM, especially going by the recent reviews and benchmarks of the 5700X3D that often compare it to the 5800X3D, along with comparisons to AM5 and Intel CPUs. For anything short of an RTX 4090 and its $1,600+ price point, the 5800X3D will be an excellent performer for gaming that won't be holding your card back in any significant way. If you do content creation or streaming though, moving up to AM5 with the 7800X3D would be a much better choice while not being much more expensive. More often I've seen games capped out by the graphics card than the CPU when it comes to testing, especially at 4K, where anything less than 16GB of VRAM on the graphics card will start to see major dips in performance.
One thing to keep in mind is to not cheap out on your system's RAM. If you get the Micro Center bundle for the 5800X3D, for example, it only comes with 16GB of RAM, and if you're on Windows 11 you really want 32GB to handle how much load Windows 11 and background/casual tasks can put on your RAM. So adding another 16GB of RAM on top of what that bundle provides would be a good idea; just make sure it's the exact same model of RAM.
My 5800X sadly died, but I didn't want to shell out the $ to upgrade to AM5 just yet, so I bought a 5800X3D and it has been awesome so far. This combined with my 3080 and 32GB of RAM should carry me for quite a while.
@@eliasroflchopper3006 To play Devil’s Advocate, if you could fully leverage the compute power, the M1 architecture is impressive as heck. The M1 Ultra can keep up with a 12900K in code compilation, beating it depending on which linker/compiler used. The built-in video encoders can make ProRes much easier to work with. And if you can use Metal, I’m sure the 64 core GPU would be more powerful than what we see now. There’s something wonky with scaling in some games that 24->48 is only a 50% performance bump and 48->64 is similarly disappointing currently. But still, even if we assume that it matches a 3070, it’s not an insane tax that we’re used to from Apple, especially as the power usage is far lower. Now if they could release a new generation of ARM Macs that supported nested virtualization and Vulkan out of the box…
@@johnbuscher The M1 Ultra chip is overpriced and an absolute scam for trendy hipsters. You can get equivalent performance for 1/4 of the cost, and as an extra bonus the CPU/GPU won't fry itself at 100C along with the components around it. At max TDP the CPU vcore is literally 2.0V; the M1's cores are going to degrade so fast it will be useless in 1 or 2 years, which is exactly what Apple wants, because then you have to buy their new garbage.
I will never get tired of "Thanks, Steve" popping up every now and again. Also so happy that the 5800X3D sitting at such a notably "nice" price. The 12900KS made me scared at its monstrous "higher than MSRP 10GB 3080" pricing. After a slew of "meh" and "barf" CPUs, I can see why AMD was so confident to release those alongside an actually great CPU. All good options for my older AM4 friends to slot in and then rub in my face that they got a better deal than my launch 5600X, whom I will just continue appreciating for years to come.
I'm with you homie, I got a 2400G at launch too, but I was desperate and didn't have enough for a good GPU. But it went hard for an iGPU; got a proper GPU a long time ago. I'm probably up for an upgrade, but I'll wait cause I'm busy with work and life rn anyways 😅
Have not seen any "real" prices yet for this CPU, but I think it will have a hard time. The difference between a 5800X ($400) and a 5900X ($480) is $80 here now, and if this is $450, I think I'd rather go with a 5900X for $30 more. At 1440p you don't notice any difference in fps anyway, and in workloads you get way more punch.
@@AndrewTSq I don't disagree that the 5900X is a compelling alternative in theory, but only if you can actually use the extra cores for something regularly. If you're doing mostly gaming with some occasional productivity tasks, you'll probably get more real value out of the 5800X3D than the 5900X.
@@ozzyp97 Since the 5900x is cheaper here than the 5800x3d, I would go for the 5900x. But if the 5800x3d is cheaper.. maybe.. and if you do 1440P, i would just go for the regular 5800x and save $130.
@@AndrewTSq Yeah, it would depend on your GPU and upgrade plans too. The point remains that the 5900X isn't that much faster than the 5800X/5700X in games, whereas the 5800X3D is a fair bit faster than both. Just spend the money on what makes the biggest difference in your actual use case, there's no value in theoretical performance.
There are already overclocks being submitted with this CPU. External clock generators and a 1.35V lock allow it to hit 4.9GHz with another 12-16% gain in performance, so ideally, with the right setup, this CPU can completely cream the 12900KS, despite the 12900KS using more power and having the advantage of high-speed (and very expensive) DDR5!
@@AstroAvenger Depends on what type of gaming. For esports gaming (generally the titles played at 1080p with high refresh rates), a higher clock speed CPU will be better than this one.
I can see it becoming the AM4 equivalent of the Q6600 down the line. Not the best choice at release, but great performance that will give it a long life for years.
I would like to see mid year experiment CPUs like the 5800X3D a lot more. It keeps us interested in how they can do some really cool things by modifying how the CPU is made. I would like to see Intel do that instead of releasing a KS SKU that doesn’t warrant its price.
Here I am in late 2023 putting a 5800X3D in an X370 motherboard I bought in 2016. Fucking solid platform. The next one will be AMD for sure if they keep pushing in this direction.
This reminds me of the 5775C. I really wonder why AMD and Intel didn't repeat that experiment sooner after having such good results back then. Great review!
This CPU costs a lot of money and silicon to make - essentially 50% more silicon, making it cost similar to a 5900X or even a 5950X depending on the yields of that 3D bonding process. It costs more raw materials, and the extra production steps might make it not worth manufacturing.
Cache is super expensive - that's why. Oh, and btw - if AMD hadn't raised the core count to 16/32 on AM4 and in the normal consumer market, we would still be using 4/8 or 6/12 (or maaaybe 8/16) Intel refreshes for ages. :D The last thing I expected from Intel after using their products for years is something really innovative that really pushes boundaries for all users. They simply react, and that is it.
@@notlikethis9932 Why do you even need more than 4c/8t on mainstream desktop? 1. Video recording is moving to ASICs, like NVENC and QuickSync. 2. Nope, it's just shitty off/online DRM that tanks the CPU. There's a benchmark that shows a pirated copy suddenly performing better, even on a 4c/4t CPU. 3. Console optimisation. And 12th gen i3 already proved that you can game just fine on a 4c/8t CPU. If you want more cores, you can go X299. You don't need quad-channel memory to run on X299; you can go dual-channel. Heck, even the entry X299 boards are heavily built compared to their mainstream counterparts. AMD hasn't done anything innovative. Zen is just the pre-Faildozer architecture--K10, souped up, modernised and modularised. AMD has no good engineers; they always have to call Jim Keller in to jump-start.
As someone new to the PC space, I greatly appreciate the in-depth reviews you guys do. I watched countless hours of your videos as I was trying to decide what to get for my first build.
It was GN's video on the H500M that sold me on the case, and boy, I'm glad I did, as it's allowed me to eke a hell of a lot more performance out of my 2700X (undervolted and overclocked).
Same. Steve for data, Jay for general info. Combined, helped a LOT when I built mine in 2018. And am continuously looking for upgrades.😉 It can get addicting.😅
Yep, a CPU price that is good. But the 6900 XT GPU price is still a big ouch. Same goes for the RTX 3090. Meanwhile, the 3800X I have is still waiting. It can also do 4550MHz at less than 1.300V all day. It's still good enough for either GPU that I refuse to pay for.
@@warrenpuckett6134 The 3800XT bottlenecked the 6800XT by 10-15% or so in some games at 1080p, and about 5-8% at 1440p, so your 3800X would bottleneck them a bit, considering the newer cards are even better.
@@vespa7961 It doesn't bottleneck an HD 7790 using Linux with a 60Hz 1440p monitor. Point is, I do not need more, cuz I am not buying any GPUs this year. Not buying a 144Hz monitor either. I do have an R9 390X sitting in the drawer; I bought that used for $250. Way more than what is needed for Linux. Pretty much all I have is a 5700G. Have not bought any game titles in 4 years. I can wait. If the day comes when it is the price I am willing to pay, I am sure the 5800X3D will be long in the tooth also. It is also alright with the old lady - she gets to shop more. But still, if you need it, get it.
@@warrenpuckett6134 I'm talking from firsthand experience; that CPU WOULD bottleneck those GPUs a bit, as I have the 6800XT and it was being bottlenecked in some games by 10-15%.
Still always love seeing the random tech company presentation clips, the injection of dry humor without acknowledgement is hilarious to me. Keep up the great work that you guys are doing!
When this chip launched I saw the hype around it but thought "I won't watch those reviews, no need to upgrade my 5600X anyway." Seems I was dead wrong! If DDR5 keeps not making any big leaps in performance, it seems I won't have to upgrade my X570 mobo after all! This chip makes me so happy since if I ever wish to upgrade to the RTX 40xx series this chip can handle those cards no problem.
Impressive product in my opinion. The performance increase with a wattage decrease shows actual technological improvements, as opposed to dual turbo charging and then super charging your CPU and blasting 300 watts through it. Much much better for sff builds! Going to be way easier to cool.
Not really. If you want an SFF build, you're better off with Intel; you've missed the point completely here. Intel is only less efficient when pushed to its limits. But in idle, in gaming, in streaming, in browsing, it's more efficient. Even if you decide you need a creative workload, Intel is twice as fast, which means it takes half the time to finish the task, which means half the time to stay at that workload, hence it cancels out the consumption differences.
@@TRX25EX I don't think you are very familiar with the CPU industry if you consider Intel to be efficient (and cooler) in multi-threaded applications. The efficiency cores cannot deliver the same performance if you take into consideration the same core count.
@@GansoGG You talk too much. The 12600K is more efficient than the 5800X + cooler + faster + cheaper... Full stop... Too much blah blah is not required... I don't care about the industry, I don't care Intel or AMD, I just state (FACTS). You have facts? Prove me wrong. The 12700K even beats the 5900X yet is cheaper, cooler, more efficient... If you're gonna speak about productivity then read my comment again (FANBOY). Edit: a video for a fanboy like you who's trying to defend AMD by being very specific, knowing that in general what I said is true... ruclips.net/video/PiBVuwBeoQc/видео.html
LMAO, I mean SFF has its limits as it is: talking air cooling, then AIO, then custom loop. And Intel K chips lie in the custom loop category, along with R9 Zen 3 CPUs and the 3950X. Best I'd consider in ITX would be a 12400 for Intel. At least the 5800X can be undervolted or have its PBO voltage tightened to cool better. The 96MB of cache makes the AMD chip the best gaming option in ITX, but still expensive at $450 vs the 12400. Especially if you jump to 4K or towards a racing sim setup with 4K displays; then the CPU doesn't mean much... vs a 100+W single-core 5GHz+ 12900 in ITX in a game like CS or Valorant. Lotta $$$ invested for some free shit games NGL.
Can I just say... this is just the beginning for this type of technology - the 3D caching they're using, that is. If I remember what AMD has stated in the past, the tech will be used on all future desktop CPUs, which is going to be pretty amazing after seeing this. Intel is going to need to up their game on something now. ^_^
I was wondering about this, I know this is essentially a POC and it looks really good. But if a Zen4 3D Vcache is right around the corner it makes this hard to justify.
@@GoodnotGreat88 I think this is for people on AM4 motherboards with second-gen Ryzen who haven't upgraded yet; the performance bump is substantial.
Yeah, can't wait for Intel to release a fist-thick CPU with a shit ton of fake multi-cores and a 3000-watt power draw, that bends, to barely beat AMD...
The most exciting thing for me is that by making a cache-only die, they managed to make it much more dense. Combined with 3D-stacking, they can vastly improve the cache size/latency ratio.
From my personal studies of software optimization I've learned that larger caches are excellent for poorly optimized code, or code where you have a lot of small objects and you don't bother keeping them in contiguous memory chunks. This is very much the case for many games and certain data formats like JSON, if you don't carefully design the code to collect all the little memory fragments into larger chunks. It is also very useful in cases where you have large datasets and need to jump around randomly to do calculations, since there is a greater chance that the data you need will still be in cache if it has more available cache lines. The improvements in cache make me very excited to see what they will manage to do with their next generation of CPUs.
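(To make the contiguous-chunks point concrete, here's a minimal C++ sketch of my own - the Particle struct and counts are invented for illustration, nothing from the video. Both loops do identical work, but the fragmented version chases one pointer per element, wasting most of every cache line it pulls in; a bigger L3 forgives far more of those misses:)

```cpp
#include <chrono>
#include <cstddef>
#include <iostream>
#include <memory>
#include <vector>

struct Particle { float x, y, z, mass; };

int main() {
    constexpr std::size_t N = 4'000'000;

    // Contiguous: one allocation, so each 64-byte cache line carries
    // several useful Particles.
    std::vector<Particle> packed(N, Particle{1, 2, 3, 4});

    // Fragmented: every Particle behind its own heap pointer, so the CPU
    // chases pointers and wastes most of each fetched cache line.
    std::vector<std::unique_ptr<Particle>> scattered;
    scattered.reserve(N);
    for (std::size_t i = 0; i < N; ++i)
        scattered.push_back(std::make_unique<Particle>(Particle{1, 2, 3, 4}));

    auto time_ms = [](auto&& fn) {
        auto t0 = std::chrono::steady_clock::now();
        fn();
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count();
    };

    float sum1 = 0, sum2 = 0;
    double t_packed    = time_ms([&] { for (const auto& p : packed)    sum1 += p.mass; });
    double t_scattered = time_ms([&] { for (const auto& p : scattered) sum2 += p->mass; });

    std::cout << "packed:    " << t_packed    << " ms (sum " << sum1 << ")\n"
              << "scattered: " << t_scattered << " ms (sum " << sum2 << ")\n";
}
```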
Software emulation too. The constant branching, setting of bit-flags, and giant instruction lookup tables really do a number on both the memory controller and the CPU's speculative branching, and only "by pure dumb luck I had that in L3" can help it.
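(For anyone who hasn't seen the pattern, here's a minimal C++ sketch of my own of the table-driven dispatch loop being described - the opcodes are invented. Every fetched opcode becomes a data-dependent indirect call through a big handler table, which defeats the branch predictor and lives or dies by whether the table and guest state are still in cache:)

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

struct Cpu { uint32_t regs[8] = {}; bool zero_flag = false; };

using Handler = void (*)(Cpu&, uint8_t);

void op_inc(Cpu& c, uint8_t r) { c.zero_flag = (++c.regs[r % 8] == 0); } // INC reg, set flag
void op_dec(Cpu& c, uint8_t r) { c.zero_flag = (--c.regs[r % 8] == 0); } // DEC reg, set flag
void op_nop(Cpu&, uint8_t) {}                                            // unimplemented opcodes

int main() {
    // 256-entry handler table: every fetched opcode turns into a
    // data-dependent indirect call through this table, which the branch
    // predictor can rarely guess and which must stay cache-resident.
    std::array<Handler, 256> dispatch;
    dispatch.fill(op_nop);
    dispatch[0x01] = op_inc;
    dispatch[0x02] = op_dec;

    // Tiny guest "program": INC r0, INC r0, DEC r0 (opcode, operand pairs).
    std::vector<uint8_t> program = {0x01, 0x00, 0x01, 0x00, 0x02, 0x00};

    Cpu cpu;
    for (std::size_t pc = 0; pc + 1 < program.size(); pc += 2)
        dispatch[program[pc]](cpu, program[pc + 1]);

    std::cout << "r0 = " << cpu.regs[0] << "\n"; // prints r0 = 1
}
```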
This CPU also blows everything out of the water in Factorio. That is a super-optimised game, so everything ends up in the cache, and it steams ahead. So both poorly optimised and highly (but cache-hungry) optimised code benefit? The stuff in the middle does not; it just runs the same or slower.
Unfortunately, these kinds of improvements only lead to developers getting lazier instead of improving overall performance and results. Software will expand inefficiently to abuse all the hardware power that is offered.
*"very excited to see what they will manage to do with their next generation of CPUs"* This aged incredibly well, as just 1 year later we all saw the Ryzen 7800X3D basically obliterate every other CPU when it comes to gaming while not breaking the bank with its price. The 7800X3D will go down as arguably the best gaming CPU of all time, and something you won't need to upgrade from for who knows how long.
Amazing how the 5800X3D competed with the i9-12900KS at a lower price and almost 3 times lower power consumption. Can't wait for this tech in gaming laptops.
I wouldn't hold my breath, as precisely one of the main differences between the Ryzen desktop and laptop models, is that the laptops have a much smaller cache- I figure due to size and power consumption woes.
@@albertobueno7805 The laptop chips have monolithic dies, they are all APUs, and they ship in large volumes, which means they are cost-sensitive. Only Apple has managed to build large monolithic dies for laptops, since they can charge a large premium. But since the 3D V-Cache tech only needs around 2% more die area on the main die plus a separate cache chiplet, AMD can now build 3D V-Cache top-end SKUs of their top laptop APUs, brand them as the ultimate laptop gaming CPUs, and charge a premium like Apple does. As we can see, the Ryzen clocks are lower too, so for laptops it's even better for power usage. Intel can't have a KS answer to this, since clock and power increases on laptops make it a no-go from the start.
@@stefangeorgeclaudiu You've missed a key consideration -- the cache is 3D stacked, which adds to the z-height of the package. In turn, that requires structural considerations for the silicon below that are contradictory to the goal of shrinking the z-height. If AMD chooses to bifurcate their dies between 15W and 45W parts then I could see it happening, but otherwise I'd put it at a 15% chance at best.
These graphs are misleading... You're missing the point completely here on power consumption, for 2 reasons: 1. Intel is more efficient because you're not gonna run a creative workload 24/7, which is the only place Intel falls behind; in idle, in gaming, in streaming, Intel is more efficient. 2. Even if you use it for creative work, the 12900KS is almost twice as fast as the 5800X3D, which means it takes half the time to finish the same task (so half the time to stay at that workload), which basically cancels out the power consumption difference...
It's a great CPU. I upgraded from a Ryzen 9 3900X to a Ryzen 7 5800X3D and my fps went from 70-80 to 120-135 with an RX 6900 XT Sapphire Nitro on a 34-inch 3440x1440 widescreen monitor.
Running the exact same. But the OC is gone, and though I never really use an actual base clock OC, I usually up the boost and max power draw in the BIOS. Not sure that will work flawlessly. Bit expensive anyway though; thinking I might be better off with a 5700X/5600X and a 2TB SSD overall.
@@jamegumb7298 I use a 5600X and it runs everything I play very well, but I also run some of my games off a 2TB hard drive (the price for the amount of storage was very good - half the price of a 1TB SSD). A good compromise would be the 5700X if you want to upgrade other parts too.
@@winonesoon9771 just noticed a huge price drop (including 5950X) in the UK as well, it could be more worthwhile to get a 5950X before the X3D even launches!
@@draketurtle4169 I only have 2 hdd left for mass storage. Still need 2TB to upgrade the gamedrive, less reinstalling all the time. I would go solid state all the way for the least amount of noise, but a good big ssd costs an arm and a leg. €1000 for the cheapest 8TB. We really need more M.2 slots, E-key should be standard, 3-4 nvme storage as standard, then I could just add 1 drive. Thinking of just ditching this and getting a low end SP3 cpu, it has enough lanes at least, maybe 72F3. Other system is a 3647 (just a 4114) and the lanes are a blessing, but I use it for other stuff and has other OS on it. Also needs a new SDR, which is costly.
Thanks for this great review. I bought the 5800X3D on crazy sale and sold my 5800X. For around $150 out of pocket, it was a drop in upgrade competing with all the new stuff. Wow. No reinstall of Windows. No nothing. Just swap and go. Ready to take on any new GPU I throw at it.
@@Captain-Chats I just did; massive difference. You need a 360 AIO for it though. With mine I hit around 60°C while gaming with non-aggressive fan and pump settings, and 80°C while pegging the CPU at 100%, so it's pretty good (PBO is off as well).
I just ordered the 3D, also gonna sidegrade from the 5800x, new AM5 platform sucks (for now) so I might as well go with the best in slot for AM4 and just wait a couple gpu gens before fully starting over on a new platform lol
@@franchocou Good luck trying to fit it in cache lol. Although, if it's that RAM-dependent, I bet it could benefit A LOT from more cache, as it probably needs a lot of data moved from and to it. Would be interesting to see.
They are focusing this technology on the server space right now, they just threw us a bone with this one (more to antagonize Intel really). If this was for us we'd see 12 and 16 core parts.
Milan-X (their HPC Epyc variant with 3D V-Cache) is what this was originally designed for. And there it absolutely rips the face off the competition in some scientific compute workloads. As in, some workloads see 2x the performance at the same wattage... That's a decade of Intel server technology in one fell swoop.
@@nayan.punekar 'Not that interesting tbh'? Getting about 10 to 35 fps or more with the same (old at this point, about a year and a half) Zen 3 architecture and DDR4. Hmm, am I missing something?
@@SoriPop Well, yup. It took 6 months to get this out, with minor gains mostly between 10-15% and rarely above 20%, while Intel took just 6 months to release a new architecture 😂
It looks like AMD has decided to take multiple crowns this week. I'm kinda sad that they went for the 'worst CPU' 5500 and 'best gaming cpu' 5800X3D at the same time, but hey this latter product is excellent for gaming both in absolute performance and price/performance.
to be fair to AMD, CPUs like the 5500 aren't really targeted at gaming anyways. They are perfectly fine low cost CPUs for your boring Desktop/Office PCs that aren't meant for high end gaming or serious production use. That's also why those non-X SKUs tend to be OEM first. (This gen though the competition from Intel in that price segment looks much better than we've seen in a while.)
The 12900KF costs about the same and gets the same gaming performance as the 5800X3D, and for everything besides gaming it will be faster. I don't mind the 5800X3D, as it gives a last gasp to AM4, but it is largely a pointless product. Especially given the next-generation AM5 is out this year.
@@Hugh_I Even such low-end CPUs as the 12100/5500 can push out 100% from most popular GPU models out there, so I really can't understand why you consider them something that can only go in an office PC. For that we still have Pentiums (even Celerons). If it can satisfy the needs of more than 60% of gamers, then it's a gaming CPU.
@@logirex it's pointless for you, but a lot of people on AM4 were looking for that last upgrade, and the 5800x3D would be that last upgrade. Jumping immediately to ddr5 AM5 Zen4 is complete waste of money this early on.
@@logirex It's more of a showcase of what we can expect to see in the future. If AMD can comfortably do a 10%-plus performance increase, just imagine what the new generation can do: maybe a 30% increase from IPC, clock speed, and the X3D cache is not so far-fetched. We can only hope for the best :)
This makes me VERY excited for AM5 since those will be using this stacked cache too, with a major difference. With a new socket instead of a retrofit they have a lot more room for error, which is a big part of why it took this long to launch.
Well, not promised, since AM5 will use DDR5, and DDR5 is still early stuff with bad timings, which can affect Infinity Fabric speed/latency a lot. We will see in a few months. Raptor Lake will get a big cache bump too, and boost clocks are rumoured to be up to 5.8GHz, meaning that pretty much every single chip will do 5+ GHz all-core with ease. Can't wait for Ryzen 7000 vs 13th gen; however, Ryzen 8000 vs Meteor Lake is going to be much more interesting (and that is when I will upgrade).
Stacked-cache Zen 4s aren't coming out this year. There's nothing circulating about "Genoa-X" server parts, so no Ryzen with V-Cache either. My guess is they're a minimum of a year away.
Was waiting for AM5/13th Gen Intel but 5800x3D upgrade from a 3700x for gaming at £300 (sell the 3700x) would also be a huge upgrade, plus I'd get to keep motherboard/RAM and let the early adopters be beta testers for the new platform. Nice to see competition again!
Or... just wait a few more months and buy Zen 4 directly after things get polished instead of buying a stopgap for something that's perfectly fine and fast.
@@GamerBoy705_yt Why, when this would be cheaper and easily fast enough to run a high end gaming PC for years to come? If AM4 is any indication, waiting for second or third gen AM5 might be the safer bet anyway.
@@GamerBoy705_yt Or buy 5800x3D because it'll help max out my RTX 3080 while costing me £300 vs what a new motherboard/RAM and CPU will cost (We also have no idea what the price and performance of Zen 4 will be). A 3080/5800x3D combo should last me 3 years, it's like telling someone not to buy a RTX 3080 FE (at RRP) when they're running a RTX 2070 Super because the next gen will be here soon, we can all keep waiting although my opinion is if something exists that makes you happy and you can easily afford it why not get it? If Zen 4 absolutely smashes the 5800x3D with the amount of use my PC gets it's not a bad thing (6 months use for me is like 24 months for others), selling my older motherboard/RAM will be easier with a 5800x3D installed vs my older 3700x. It's a pretty awesome upgrade for the price over my current 3700x with a performance lift of over 50% in some games, considering I sat out the 5000 normal series if the price is right with no stock issues I'll be a happy camper. You're also not wrong with your comment, just for me personally I think the 5800x3D suits my needs well.
These GTA 5 results actually make sense when you consider the amount of AI prediction it takes to run the game (pedestrian and car path prediction / traffic lights / also a lot of RNG-dependent spawns and random events); this is where the cache on the 5800X3D should thrive.
@@ArchusKanzaki It also came out top of the heap in Flight Simulator: 113fps vs 93fps for the 12900KS, and 113fps vs 80fps for the 5800X, which equates to gains of 21% and 41% respectively. Those are seriously respectable gains.
This is such an interesting CPU, and gives a chance to show where cache really matters. This really makes me wonder what an x3D 5900x and 5950x would have looked like. It would have been interesting to see how it worked across chiplets. I also can't wait to see what an architecture that's built from the ground up for 3d cache can do, rather than it just being tacked on to an already funcional cpu at the end. As 3d stacking gets more complex, and moves to logic, I bet we see some interesting thermal solutions too. The future is exciting! Well... at least in tech it is.
Hardware Unboxed called this ages ago back when Zen3 launched. In response to Tech Deals going on about cores they practically proved that even in the cases where the higher core count parts outperformed the 6 core, it was due to their increased cache and not the actual cores. You could probably get the same results with a theoretical 5600X3D. Cache seems to be the limiting factor for CPU gaming performance right now.
@@AVerySillySausage Mostly, though in some games having more cores can drastically improve perceived smoothness by raising lows and improving frametime consistency. But that really depends on how well the game scales, and how much you have going on in the background, or even how old your windows install is. But yeah, overall a 5600x3D would be the same. I just wonder how two chiplets on a 5900/5950x3D would work out.
If yields are not so great, we could see an 8 core composed of two half-disabled (or 3+5/2+6/1+7) chiplets with cache stacked on top of both. I'd imagine the cache would help fight the latency cost of having the cores split across 2 chiplets. Would be a great gaming CPU without taking Chiplets away from productivity models.
This seems like a smart final product release for the AM4 socket, gives a very clear upgrade path to people who want to get the most from AM4 before moving on to AM5 later down the road.
I was hoping for a 5950X with 3D-V cache 😂... as upgrading from a 2700X to a 5800X3D won't be much of an upgrade in terms of my workload as I rarely game lol.
@@Thunderstyle7 Indeed it would be huge for gaming. I really hope we get a 5950 version later lol. Yes, I want all of that cake (productivity and gaming) haha.
I feel like this CPU is just an appetizer before the main course: Zen 4. Quite amazing considering this chip, while using DDR4 RAM, can actually challenge Intel's top CPU using DDR5 RAM!
DDR5 is not mature, and there really isn't software that can even utilize such large transfer speeds as it is. Just looking at DDR4 vs DDR5 benchmarks on any 12th gen Intel CPU, the difference is overall negligible, with a massive financial penalty for adopting DDR5 early. You can't find DDR5 for less than $300, but you sure can find DDR4 for around $50-70 if you're not picky about RGB or heatspreader shape and color. This isn't some "Intel is bad because DDR5 performance bad" take. It just means that DDR4 is still the best even while CPU manufacturers are migrating to DDR5 support. Unless you're chasing benchmarks, you'll never see a true difference in RAM speed and latency going beyond 3600; best bang for buck is 3200.
But just in games. Let Intel get on the same node AND add the extra cache and it would be faster. And this CPU is just for gaming; if you use it for work it's shitty.
I've just bought a 5800X3D this week, specifically for racing simulators, where it is a monster. Got it for a lot less than a decent AM5 mobo. It's up to a 100% increase in frames in sims over my 5600X, for a net £200 after selling the 5600X. That's outrageous value for money.
You can tell Steve's wallet had a hand in a large part of the scriptwriting for this episode. xD AMD allowing reviews a week before launch demonstrates their confidence in the chip, and their confidence seems well-placed. What a wonderful product to reward those AM4 users who didn't switch to Alder Lake.
It's probably best not to think of this as a 'reward' because that would imply that loyalty means anything in the pc computing space. Very few people buy "the newest and greatest" cpu every year. For most people, buying into any platform is a gamble and longevity is the goal. Last pc I built was a 6700k back in 2016 and it's still going strong (currently relegated to being a media computer). That was a solid value. AMD most certainly was not up on its game back in 2016, and the early AM4 socket had serious problems. Some BIOS updates would literally brick the motherboards. It was a bad time until Ryzen finally matured. If I'd bought in back then I would have had a _bad_ time. Zen 3 has been a banger, though. That's one of the big reasons why my current system is a 5900x. But I got that after enough time had passed that I could justify throwing money at a new motherboard and cpu. If I'd chosen an AM4 chip back in 2016, I would have had to suffer with AMD during a time when it was seriously lagging behind. If I'd built my computer in 2019/2020 and bought into the AM4 platform at Zen 2, the value proposition would have been a little better but it would also feel too soon to buy a new chip today. I'd basically be admitting that the Zen 2 chip I bought was a waste of money that didn't have the longevity. Another thing to consider is that buying a 5800X3D would have only allowed me to get more life out of the existing system, but it'd basically be a last-ditch effort to keep my *motherboard* relevant. A motherboard costs what... a hundred bucks? Two hundred bucks if I want a nice b550 board? It's kind of a silly thing to base so many decisions on. What matters most is how much computer you can get for the price, _when it's time to buy a computer._ If enough time has passed that I'm getting ready to upgrade some parts, I'm usually gonna need a new motherboard anyway. So the chipset really is irrelevant unless the parts I buy are failing me right out of the gate. So if you're somebody who is in the market for a decent gaming computer and you don't care what the platform is, the 5800X3D is genuinely the best value out there. The AM4 technology is as mature as it's ever gonna get. It's stable and it kicks ass. I think that's really all that needs to be said.
@@pirojfmifhghek566 I mean, I bought the cheapest zen 2 on the assumption I’d buy the last AM4 cpu. I have to say, this is a pretty awesome pay off for that decision. I had a pc that met my expectations for two years and get a rather substantial upgrade at the end.
Hindsight 11 months later: this chip is a love letter to AMD customers. Dr. Su has said she loves gamers; there is no AM4 chip that beats this in gaming and, until 13th gen, almost no Intel CPU did either. 11 months later, Ryzen 7000 is in full swing and it's still in the conversation. That's love for your customers who may not have the insane amounts of money to be early adopters of AM5. They have made this chip widely available and widely compatible with everything but 1st gen boards. Good on AMD.
Holy... I actually believed that AMD wasn't gonna live up to their claims, especially with how good 12th gen is. Good to see new tech stretching its legs.
I think the 1% lows are really interesting, even in CS:GO where the average frame rate was so much lower than the competition, it still has the best 1% low score, which will make for a much more consistent experience.
Funny how AM4 was supposed to be retired earlier. But the pandemic delayed its retirement and now it's going out with a bang with the 5800X3D being the best value gaming CPU. AM5 better not disappoint. AM4 set the bar very high for platform longevity, performance and real value.
@@GodKitty677 The improvement is marginal for a $150 price increase if you're only looking at it from a gaming perspective vs the 5800X. Unless you have a use case for the extra L3 cache where it benefits greatly, this is more of a "what if" modification to an existing architecture.
Just bought one of these in Dec 2023 - I think this CPU has single-handedly extended the lifespan of AM4 by another two years. (And they ALL seem to be _incredible_ under-volters - I used -30 all core and dropped 10+ degrees in CB for _more_ performance.)
It uses 70-80W in The Last of Us Part 1. However, that game is known to be heavy on the CPU; some say it's not optimized. On the other hand, this CPU uses only 50-55W in AC Mirage, which is brilliant.
@@Ladioz My 5600 non-X goes up to 60°C with a Peerless Assassin and undervolted. You can still undervolt the 5800X3D too, and most people go -30 with no issues even after months and months.
3800X here. The 5800X3D is like going from a 1800X/2700X to the 3700X (I probably wouldn't bother with the 5800X3D if I already had a 5800X). Going from a 3700X to a 5800X only gained you 10-20% and Gen4 PCIe - a good gain, but not worth the cost. Whereas the 3700X to the 5800X3D is 50%+ consistently, so it is worth the cost overall (this was all using 3200 RAM as well, which is impressive; using 3600/3800 CL16, which increases the Infinity Fabric speed, would have lowered the latency even more). I have the RAM (3600 CL16 that my current X370 mobo limits to 2933); I only need a motherboard and CPU (and I'll probably buy a 2TB NVMe SSD as well), so the 5800X3D is probably going to be the way to go.
So much to process here. AMD shot themselves in the foot with some of the lower end CPUs just released, then release this intriguing, cache-heavy monster that could really boost performance for certain applications to levels we haven’t seen before for a competitive price. I’m currently on an 11700K (it is what it is). I wonder if RPCS3 is any sort of cache dependent and, if so, what the performance numbers would look like.
The fact that this AM4 chip can keep up with the i9 and surpass it in some places while only using DDR4 makes me impatient for the 7000 series release... I have a funny feeling that AM5 is going to completely blow Intel out of the water.
On one hand it makes me feel slightly bad that I built on AM4/DDR4/PCIe 4 (even numbers suck anyway) in 2020, but on the other it makes me realize, holy shit, how was that 2 whole years ago already? Think AM5 will last as long?
@@TheJimyyy Ummm, you and I can't have a realistic conversation about AMD vs. Intel if you really believe that. The Intel i9, based on the 5nm-class process, is the only chip that is slightly better than the processing power of AMD's Ryzen 9 5950X and Ryzen 7 5800X3D out of the box. So you arguing that it's based on last year's "refresh" is mildly irritating. All I was commenting on was the fact that AMD is going to smoke Intel out with the release of the 7000 series processors.
@@jordanplays-transitandgame1690 I agree, they're probably going to blow AMD away, and also maybe AMD will give them a reach-around like how they enabled FSR on the competition's old cards lol.
Seriously though, I've been hearing stuff like this for YEARS at this point. Do you remember back when we were told to wait for Alder Lake? Do you remember back when we were told to JustWait for the 11000 series? Because I sure 'member. Basically Nvidia and Intel just got so comfortable and allowed themselves to rest on their laurels that they're literally YEARS behind down the pipeline now, and at this point it's still them trying to play catch-up due to very bad planning on their parts from not taking AMD seriously. Which would be one thing were it only that, but the last few years increasingly just feel like Intel and Nvidia blowing lots of smoke (and power draw) up my ass in order to justify and cope their way out of losing massive market share. It's one small part of why I went from wishing for a high-end Nvidia card in my 2020 dream build to not only switching to red but outright shit-talking them constantly, because I can see through the lies at this point. I just don't believe in hype and I want results, particularly where my money is directly involved.
So ironically enough, AMD needs to fuck up pretty spectacularly to get me back on Nvidia at this point, whereas I actually would switch back to Intel, mainly because I'm tired of hearing sustained turbo boosts get called "overclocking", because they're not. Ryzen flatly cannot and does not overclock - which, unless Steve or somebody (upside-down or NA Steve, either one) can clarify this for me, leaves me feeling pretty bilked on the processing side, regardless of how good, efficient, and cheap it is, or that I'd have bought it anyway. On Intel you literally can get, say, a 3770K to sustained-boost to 3.9GHz OR overclock it in the BIOS at up to 4.7GHz. Now, that is an overclock: 3.5GHz base clock and up to 3.9GHz TURBO BOOST. A sustained boost clock of 3.9 on a 3770K is not "overclocking" and I am not ever going to accept it as such, and yet somehow I keep seeing boosts UNDER the specified turbo clocks on AMD's 3700X being called "sustained overclocks of 4.25GHz all-core on my 3.6-4.4GHz 3700X", like what the absolute fuck?
So just for that, I am much more willing to make fun of AMD on gaming, and more prone to saying you should get Nvidia + AMD CPU for productivity, and Intel CPU + AMD GPU for straight gaming. Yes, even with the tepid showing of Alder Lake next to Zen 3. The main thing is that Alder Lake mostly fails because of Intel's notorious power draw issues, although my understanding is you can easily push 250W on a 5950X with sustained all-core turbos, aka "overclocking", unless somebody can explain to me how I'm wrong about that.
Had to come back and watch this because I just bought this 5800X3D and a cheap B550 board, and got them at a very good price in the BF deals here in Finland 😁. Replacing an 8700K, which is still a good CPU, but I can't wait to see some results in gaming.
I bit too early on the AM4 platform with an X370 chipset motherboard, as well as starting out with the introductory R3 1200 CPU along with an RX 570 GPU. Granted, it was a far cry from the Phenom II X4 940 AM2+ based system I was coming from. It was a long time coming. I learned my lesson though. I upgraded to the 5950X and a Radeon RX 7900 GRE (AV1 codec support). I went with the 5950X because many of the review comparisons found that even at lower frame rates, the gameplay was often quite smooth in comparison to the X3D offerings. I literally can't stand stuttering, at all. Something to consider. Also, it will have other benefits beyond gaming that I will utilize. I think it squares out to about the same difference as a naturally aspirated V8 engine versus a turbo V6. I'd say that will probably max out that system for my next 5-7 year run, and I shall begin saving up for the next build. Only this time, I will wait a couple of generations on a new socket (likely AM6 or 7 if I stick with AMD and they stick with the current MO), and will invest more initially in the CPU and GPU. In my AM4 example, I should have waited another year for the updated chipset, the X470, an R7 8-core, and a Radeon RX 5700 - a good enough intro build on a maturing platform - and then a final maxing-out upgrade on the outro. At least that's the strategy anyway.
This is very impressive, but also, at the moment, I'm more interested in seeing AM5 stuff (and 40-series GPUs) before I make any decision as far as building a new PC. My 3700X and 3060 Ti are more than doing the job right now, so I'm not in a huge hurry.
@@TKIvanov Actually I can't, unfortunately. The motherboard in this prebuilt (which was the only way to get that 3060 Ti early last year) is unsupported by Gigabyte entirely and has no bios updates (It's the B450M DS3H Wifi V2, which isn't compatible with the bios updates of the DS3H Wifi OR the DS3H V2, and isn't even a motherboard listed on Gigabyte's website). So basically I'd need to get a new motherboard as well, and at that point might as well also get a few other new things (PSU and case). So I'll just wait. I did find solutions to actually update it with another board's bios, but I'd rather not risk bricking my machine.
@@taggerman6867 I've been playing newly-released games at 1440p at around 120-140fps (when they're optimized decently, as some games go down to the 90fps range), things are going great with just a minor stable overclock on the 3060 Ti. And there's DLSS helping a bunch too on some games.
@@nick524 There is Z690 with DDR4; the 12900K works just fine with it. Besides, I don't really recommend DDR5 with 12th gen (weak gains unless you're a pro RAM overclocker).
Fascinating silicon launch. AMD realised there was a niche market for faster gaming CPUs, maxing out cache as opposed to cores and threads. Great way to ping their target market.
I think that an interesting test of this CPU would be if the extra cache makes a difference in the FPS drop that comes from running a lot of stuff in the background while gaming Vs the normal 5800x
@Garrus Vakarian The only game I have that is listed in their testing as showing any improvement, was a 3% increase vs the 5800X. So no, not "half" of my games. I have a 12600k which out-performs it still, for a better price to performance ratio
@mike n yep and are people really playing these AAA games at 1080p when you’ve got such a high end chip? A lot of people are probably on 1440p where the performance difference is much lower.
@@rotor13 12600k with a nice new motherboard, and did you buy ram too? I can just drop this 5800X3D into my system and be done. 12600k isn't a better gaming CPU.
I just got mine, with my 3090 it’s absolutely incredible. I don’t know if any other cpu can touch it in gaming. And even if you’re not gaming you have 8 cores and 16 threads. Just stunning
I wonder how RAM frequency/latency affects the X3D. Going to be interesting to see if any reviews push the Infinity Fabric to its limits or break 1:1 IF and see if there are any gains. Essentially, test DDR4-3800 CL14 and DDR4-4000+ vs 3200 CL14.
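(For anyone new to the 1:1 ratio being discussed - this is general Zen 3 behavior, not figures from this review: the Infinity Fabric clock, FCLK, ideally runs locked to the memory clock, which is half the DDR4 transfer rate.)

```
FCLK = MEMCLK = DDR4 rating / 2
DDR4-3200  -> 1600 MHz FCLK  (easy 1:1)
DDR4-3800  -> 1900 MHz FCLK  (around the usual 1:1 ceiling for Zen 3)
DDR4-4000+ -> would need 2000+ MHz FCLK; most chips can't hold it, so the
              board falls back to 2:1 mode, adding latency that usually
              costs more than the extra bandwidth gains back
```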
@@clydecaballero1706 That 3800 RAM in that HU review was CL16, not the same as what the OP is correctly asking for. And by correctly, I mean that it's the really awesome RAM setup for Zen 3, and HU STILL has yet to understand this. Amazingly, all of these review channels are flawed in some major ways as it pertains to Zen 3; it's really funny to watch people kiss their feet.
Just imagine a 105w 5600X3D with higher or even the same clocks compared to the original 5600X. It would be killer in price/performance within that segment.
@@Ilegator No, more likely they did not want to kill the 5600 and 5600X. And really, the extra 2 cores on the 5800X3D sure make the CPU laughably good compared to what came before it. There are some 10- and 12-core CPUs from Intel or something like that, but even under a full 6-core load the cache is just great. And then you've got 2 cores that can be secluded to do stuff that would otherwise slow down the already-taxed 6 cores. And when doing something that needs cores, the extra 2 cores over the 5600X are very welcome. Really, it is just a really good compromise. And the 3D cache is an extra, expensive production step; doing it on a 6-core part when 8-core chips add almost no cost to AMD is just dumb. Better to try and make as much money out of the good 8-core silicon instead of making better 6-core chips for less money.
Vermeer has 8 cores per CCX sharing one L3 cache block between all 8 cores. A 5600X3D chip does not make sense. And of course, it would cannibalize sales from the 5800X3D.
@@jorelldye4346 Well, same as how the 7800X3D is destroying sales of its bigger brothers. That includes the 7900X3D, which only has the 3D cache on one of its chiplets, each one composed of, you guessed it, 6 cores. A 5600X3D makes no sense now, over a year after my original comment, because AM4 is gone. But of course we won't get a 7600X3D either.
@@sovo1212 I stand corrected. Thank you for commenting. I suppose it is only the minimum profit margin they've set for themselves that limits 3D cache to 8 cores and up.
I upgraded recently from a 3700X to a 5800X3D and the performance uplift is awesome. Getting anywhere from 15-30 fps more in virtually every game (paired with a 6800XT, which I also upgraded to from a 2060 Super). On top of that, when you undervolt the CPU and set limits, it's also very reliable, runs a lot cooler, and clocks higher. Overall a fantastic chip; for anyone on AM4 that wants to upgrade and boost their gaming performance, this is definitely the chip.
@@j_g2005 6800XT for the GPU and be quiet! Pure Power 12 M 850W for the PSU. I had to upgrade my PSU as well when I got the GPU; the old one was 600W. The 5800X3D draws 100W as well, so it ain't no slouch. If you're using a higher-end GPU that pulls a good amount of wattage, then yeah, 500W ain't gonna cut it. I suggest going for Corsair, be quiet! or Seasonic.
That is also why Broadwell has aged surprisingly well. The big 128 MiB L4 cache can do a lot in some games. Same with that one test on Intel 10th gen where cache could do more than cores in some titles. Or even back on LGA775, where the same clocks with bigger cache could improve performance.
I have popcorn ready to go for that. AMD is about to jump node to N5 which is 84% denser than N7. Intel is on the "tock" cycle with Raptor Lake so that's still on 10nm/"Intel 7". And considering how much more power Alder Lake already uses... my money (not that I'm actually a betting man) is on AMD pulling further ahead from Raptor Lake than Zen 3 did from 10th gen....
We'll probably see something like that with the upcoming Zen4 generation, but I don't think it will be coming for the current, soon-to-be-obsolete Zen3 generation.
Would have been cool, but AMD would probably have had to drastically reduce the clock speeds if they wanted to meet the same TDP. I wouldn't be surprised if AMD released a 5900X3D or 5950X3D out of the blue, but it's very unlikely.
Really impressive to trade blows with something that is twice the price and nearly triple the power consumption in gaming. And no doubt any measurable difference will be indistinguishable to the player anyway. Funny to think that what is both kind of experimental and a "halo" CPU basically repeats what made Ryzen so great from the beginning, at the end of its life cycle. Further to that, Ryzen has already turned many new ideas into things that are now standard in just a few years, with iterative improvements and refinement along the way, so it'll be good to see how 3D cache progresses.
I can see why the 5900X3D wasn't released... the 5800X3D performs very well, runs cooler than Intel's new i9-12900K/KS, and trades serious blows with Intel's flagship. A 12-core with 3D cache is most likely an AM5 release... hopefully with a slight speed bump it will be an absolute monster going forward. Great to see the competition between manufacturers again and drops in pricing occurring 🥰🥰😇🤩👍
The benchmarks on Tom's Hardware show that x264 and x265 encoding see a slight improvement despite the lower clock speed, so it would be cool to investigate what the improvement would be at the same clocks. More cores are best, but it could give some insight into how much 3D V-Cache could contribute when it's eventually included in higher core count CPUs (though not on AM4).
Oh phew. I was worried about the code compile test, since then I would have been tempted to swap out my 3950X. Because yes, I hate waiting for stuff to compile enough to pay $700 to get that as low as it can get.
Got a 4080 for myself for christmas, currently using the 3700X and these benchmarks made me quite certain I'm gonna pull the trigger on this CPU. My AM4 build will get a nice boost, so I can wait for AM5 parts to become more affordable.
Gotta remember this is the first time anyone has done anything close to this, and I'm fairly impressed. If they end up allocating resources towards this general idea, I'll probably end up getting one when I upgrade mine in approximately 6 years lol. The 3700X is gonna do me just fine for a good while.
Yeah, modern games are much smarter about using the cache as much as possible (usually at the engine level), often relying on batched rendering calls and SoA (structure of arrays) layouts to keep data close together, and running passes over all the data at once at the end or beginning of a frame, thus maximizing the likelihood of it entering cache and being reused in cache the entire time that data needs to be referenced. Older games were still primarily object-oriented, rather than data-oriented. So games like CS:GO will likely see very little benefit from double or even triple cache, as they don't bother with cache optimization nearly as much. Modern games, on the other hand, will likely continue to see massive benefits from more cache.

Unfortunately we can't just trade raw performance for cache, as there are too many applications that still need the clocks, and many people wouldn't be happy to see 10% improvements in some games while getting 20% reductions in others that aren't cache-heavy. On another note, though: this CPU will show people which games and applications care about cache and which don't, as you'll see a significant difference between those two types of application with this CPU.
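To picture the SoA layout engines lean on (a toy C++ sketch, not from any real engine — the types and fields are invented for illustration):

```cpp
#include <cstddef>
#include <vector>

// AoS (array of structures): each entity's fields are interleaved, so a
// pass that only updates positions still drags velocities and colors
// through the cache with it.
struct EntityAoS {
    float x, y, z;      // position
    float vx, vy, vz;   // velocity
    unsigned color;
};

// SoA (structure of arrays): each field lives in its own contiguous
// array, so a position-only pass streams cleanly through cache lines.
struct EntitiesSoA {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
    std::vector<unsigned> color;
};

// One tight loop per axis: contiguous reads and writes that the
// prefetcher can follow and the compiler can vectorize.
void integrate(EntitiesSoA& e, float dt) {
    for (std::size_t i = 0; i < e.x.size(); ++i) e.x[i] += e.vx[i] * dt;
    for (std::size_t i = 0; i < e.y.size(); ++i) e.y[i] += e.vy[i] * dt;
    for (std::size_t i = 0; i < e.z.size(); ++i) e.z[i] += e.vz[i] * dt;
}
```

The point for V-Cache: the SoA version's working set is dense and predictable, so a bigger L3 keeps whole passes resident, while the AoS version wastes a chunk of every cache line it touches.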
This is a big win for AMD in the efficiency department. Now, wouldn't it be interesting if Intel put 3D stacked cache on a 12900KS? Why brute-force fps with more wattage and a higher power bill when you can finesse the same fps or more with an elegant implementation of cache?
Considering AMD had to drop frequencies and implement strict voltage controls to be able to stack the cache, would Alder Lake perform well enough under that kind of voltage control at lower frequencies?
Problem there is heat. AMD specifically said a major reason for the clock reduction in the 5800X3D is the total power draw and thus heat output. Remember, the cache physically sits on top of the cores, so it sits between the cores and the cooler. Looking at how much power even the normal 12900 can draw, that would likely be too hot for the 3D cache.
Can't OC, meaning you'd better hope it has good cooling to make up for the trapped heat. It's great for games that like cache; it's just the same as a 5800X on a compute level.
Milan-X is the name for AMD's 3D-stacked server parts, I believe. Demand for it is absurd, by all counts. No need to worry about cooling on those parts. Server-grade blowiematrons are perfectly capable.
To post a comment or not to post a comment. Not sure if the GN team even reads these, but I just wanted to say I appreciate the content quality of this channel.
I see this being amazing for small form factor builds for gaming. You get the best for gaming while also being cooler to run. As well as just gaming only builds in general. Makes me wonder if Zen 5 will use more cache for easy gains. Or some Zen 4 CPUs. Or APUs. Those could be interesting too.
You whiplash too much... "AMD FAILS! WORST COMPANY EVER!" "AMD HITS HARD!"
C'mon man.
@@ErgonomicChair the low end parts that just launched are indeed bad. Praise when it's good, criticize when it's bad.
If they're giving time to review before launch, they know it hits.
I love to see comments like this! Woke tech buyer haha
Except for when it doesn't. Such is life.
@@justsomeperson5110 it does doe haha
That's the god damn truth! Haven't seen an embargo lift earlier than 1 day out in about a decade... Nvidia be doing launch-day review embargoes lol
On the other hand, they didn't sample the other six new CPUs they launched at all, which shows their confidence in them.
And no sponsors every five minutes.
I was just thinking that.
@@alexcain2855 And if he has a sponsor, it's a company he has long-term relations with, not just some random new company.
"Thanks Steve.", "Back to you Steve"
Instead he uses hyperbole to shit on companies like he did with the review of the Ryzen 5 4500.
I just bought a B550 mobo and a 5800X3D. The 7000 series is a joke at those prices, especially the mobos.
The 5800X3D is AMD's 1080 ti
Even better is that the 3700X already does perfectly fine, so you can just sit on it and wait until Zen 4 is out and decide if it's worth it. If not then the 5800X3D will, by then, be some $300-ish "life extender" that will still rip and tear in every game for at least 5 more years to come...
@@andersjjensen yup, and this is pretty much my current plan. I've got a bit of a CPU bottleneck but not a burn-$450-to-fix-it bottleneck. AM4 socket will be remembered fondly
Bro, I can throw this in the same MOBO that my 3600xt is sitting in?
@@paulpietschinski3282 Depends on the chipset... check whether a BIOS update for your board adds 5800X3D support.
@@paulpietschinski3282 For 400 series and 500 series boards, yes, there should be a compatible BIOS update available. For 300 series boards: your mileage may vary.
For all the hardships of am4, it’s amazing that someone could go from a first gen Ryzen to current flagship gaming performance in the same socket.
I just wish Intel got the memo and started making sockets that fit multiple generations of CPUs.
Yeah, AMD fans are complaining because it wasn't supported as long as it should've been, but it's either this or a new socket every couple of generations. Screw that.
That's just smart business, being able to use the same socket for so long saves everyone money and is also less waste
@@kingscarletbuilds Yeah, that's why I'm an AMD fan. It's not as much as they could do, but it's a lot more than other companies.
@@wotwott2319 Intel would rather nuke every single one of its fabs than give customers more than 2 generations of hardware support per socket. Their profits are more important than your budget or respect.
"This is probably the last AM4 processor being released" - Steve
2 years and 4 AM4 processors later...
Thanks for everything yall do for the tech community. Really appreciate this review.
Also, love the merch too... I use everything I've bought from you guys on an almost daily basis. Well done guys, well done!
Thank you!
The best thing is, this was basically just a test run, and on a fairly old platform. Seeing how well it can do means the same technology can be applied to future platforms and architectures, including GPUs, since they also use L2 cache, which is fairly important but rarely ever talked about. Imagine if the same design were applied to a flagship like a 5950X, or gave a midrange option like a 5600X. The possibilities are endless for the future, instead of just piling on more cores and power/heat when not all games or applications need them. Interested to see how it bodes for the future.
One of my first thoughts when I heard of it was of how well it might do as an infinity cache alternative for the RDNA2/3 GPUs.
One of my later thoughts was cursing AMD for the lost potential in not launching a 5950X3D, as it would probably crush last-gen threadrippers and be the absolute best final product of a socket, heh
Imagine it on even lower-end parts, like a 5300X3D. It would be hands down some of the best bang-for-buck performance. I can't wait to see this proof of concept being actualized and further pushing innovation. Even if it completely crashes and burns, it will still be a push in the right direction for the entire market.
Almost makes me regret buying a 5600x, because in theory a 5600x3D would be nearly the same performance as a 5700x in gaming.
@@mammy24 Eh, it would be cool... but not that plausible. They already bin these (as you can see in the power consumption levels) to be on the better side of the dies (probably not as good bins as the Epyc dies, but close) before they attach the cache. So, at best they might get enough defects to warrant a 5600X3D, but not lower.
Edit: plus, you've gotta think of the margins. With this being a newly-produced technology, it's gotta eat into the profit margins of the chips. Lower-end chips already have the lowest margins, so economically they make the least sense to sell. Maybe in a couple of years, when it has paid off its R&D costs, it might be plausible enough to make. But for now, the only chips that are worthy of it are Epyc/Threadrippers, because even on their lower end they're expensive enough to cover the costs.
@@mammy24 my understanding is that the packaging tech is quite expensive, so it's unlikely we'll see this on low end products at reasonable prices.
@@MistahGamah This thing is pretty damn affordable for what it is: $400, and it walks all over Intel's top-of-the-line chip that's twice the price... I give it a year or two at most before this setup becomes more common.
I love how AMD has recently been upping their game and really offering new technology with almost every generation of CPUs they release, most certainly the 5000 series is the pinnacle of the AM4 socket.
I TRULY hope they do that with the next socket! That would be amazing! I'm on an i7-9700K and honestly AMD is getting more and more intriguing with their Ryzen series. I basically OC my chip to 5.2 GHz and it has kept up within 30-40 fps. I don't game at 4K so it's honestly not an issue, but I think I just might make the swap this time around. 4K looks gorgeous... but then again I mostly play fps games now and 4K seems like a waste imho.
"Thanks Steve"
Gets me every time.
Looks like a quick chip. I wonder if there will be 3D or XT versions of the 7000 series that release later incorporating the 3D V-Cache.
I'm thinking this will become a normal thing they do to newer chips to massively increase gaming performance, as a default manufacturing method. After all, they did this to the 5800X3D and yet the price was exactly the same as the original 5800X's.
The 3D version will be the gaming CPU, the non-3D the normal and productivity CPU.
I don't think there will be much room for non-3D gaming CPUs in the near future.
This is the new standard.
Supposedly, they will not launch Zen4-3D this year but rather in 2023
Zen 4 was made with 3D stacked cache in mind.
Gotta love the fact that Steve is a common enough name to make it fit so well into GN videos
Been waiting for this review! Knew the embargo lifted today, so I was just waiting for your review to watch first!
Hardware Unboxed's review was also quite good, & they're planning another follow up comparing more games. Both HU & GN are my go-to's nowadays.
What makes this so impressive is that this is a year old CPU that has been made to match and at times beat a brand new architecture just by adding innovative packaging. it's the same node, same architecture, with no clock speed improvements (a regression actually) and it can go toe to toe with the special edition of Intel's best new chip. What's going to happen when they put V-cache on Zen 4????
The way I see it, you buy this CPU now, and you'll be largely future-proofed and won't have to upgrade for a few generations. Games are only now just about requiring 6 cores, and the consoles are mostly going to be running in 7-game-thread modes, so you're good on cores. The frequency is reasonable for the power efficiency, and the amount of cache it has will be more than sufficient for some time.
Not that impressive though, 'cause this CPU was released 6 months after the Alder Lake launch, and Alder Lake launched 6 months after Rocket Lake. So if you go by that logic, Intel's a lot more efficient and better at making chips than AMD.
Guzzling that copium.
@@nayan.punekar Huh? What?
@@TechyBen Basically I'm using his own words to prove him wrong, and yeah, I know that it didn't take Intel 6 months to create the Alder Lake architecture.
What a story the AM4 arc has been. It had its ups and downs, but it goes out with a bang, and it managed to save AMD from bankruptcy. Can't wait to see what AM5 will do to the industry.
Now wait for neko arc
It's not a big bang. It would be if it were the obvious thing to switch to for anyone sticking with AM4. But since I have a 5950X and game at 4K, there is no point to it for me.
@@Safetytrousers If you game at 4K, watching CPU videos is a complete waste of time.
@@alexworm1707 Not if you have a general interest in PC tech, and wanted to find out if it would do anything for 4k.
Do you want me to tell you how you should use your time?
@@alexworm1707 That's just not true for all games. Plenty of games are more CPU-driven than GPU-driven. For instance, I play Squad at 4K, and when I went from a 3700X and 2080 to a 5900X and 3090, I noticed more of a difference when my 5900X arrived 1 week after my 3090 than I did from the 3090 upgrade.
I’m finally the target customer :D upgrading to the final AM4 chip in a week! Been waiting for this review for a couple days now, great job as always delivering us the critical information ahead of the release, Steve!
Upgrading from what though and what is your current GPU? If your GPU is mid range and you are playing at 1440p you won't notice much of an uplift.
@@Mopantsu I appreciate the concern.
Check out Hardware Unboxed's review too. They show a few different memory options as well to help you weigh your options.
Almost 2 years after this CPU came out, it still provides one of the best gaming experiences on the market. For AM4 users looking for more performance, this is the definitive answer.
Well said! Do you think the 5800x3d will be enough for a 4070 ti super?
@@kostastsitsis1309 I'd say a 5800X3D would be more than enough for any graphics card with 16GB of VRAM, especially given the recent reviews and benchmarks of the 5700X3D that often compare it to the 5800X3D, along with comparisons to AM5 and Intel CPUs. For anything short of an RTX 4090 and its $1,600+ price point, the 5800X3D will be an excellent performer for gaming that won't hold your card back in any significant way. If you do content creation or streaming though, moving up to AM5 with the 7800X3D would be a much better choice while not being that distant in price.
More often I've seen games capped out by the graphics card than the CPU in testing, especially at 4K, where anything less than 16GB of VRAM on the graphics card will start to see major dips in performance.
One thing to keep in mind is to not cheap out on your system's RAM. If you get the Micro Center Bundle for the 5800X3D for example, it only comes with 16GB of RAM and if you're on Windows 11 you really want 32GB to handle how much load Windows 11 and background/casual tasks can put on your RAM. So adding another 16GB of RAM on top of what that bundle provides would be a good idea, just make sure it's the exact same model of RAM.
Yeah, of course, it's trading blows with the best CPUs. I just got an R7 5700X3D today :D We're good with them for years to come :D
@@kostastsitsis1309 More than enough. It is a beast of a cpu.
My 5800X sadly died, but I didn't want to shell out the $ to upgrade to AM5 just yet, so I bought a 5800X3D and it has been awesome so far. this combined with my 3080 and 32gb of ram should carry me for quite a while
I'm glad to see a company like this give proper time between review and launch, as well as the advertised gains not being wholly false.
One thing's for sure: AMD doesn't lie on their slides. Very cherry-picked, but never a lie.
@@GewelReal Their GPU comparisons have been shady before.
*I'm looking at you, apple*
@@eliasroflchopper3006 To play Devil’s Advocate, if you could fully leverage the compute power, the M1 architecture is impressive as heck. The M1 Ultra can keep up with a 12900K in code compilation, beating it depending on which linker/compiler used. The built-in video encoders can make ProRes much easier to work with. And if you can use Metal, I’m sure the 64 core GPU would be more powerful than what we see now. There’s something wonky with scaling in some games that 24->48 is only a 50% performance bump and 48->64 is similarly disappointing currently. But still, even if we assume that it matches a 3070, it’s not an insane tax that we’re used to from Apple, especially as the power usage is far lower.
Now if they could release a new generation of ARM Macs that supported nested virtualization and Vulkan out of the box…
@@johnbuscher The M1 Ultra chip is overpriced and an absolute, complete scam for trendy hipsters. You can get equivalent performance for 1/4 of the cost, and as an extra bonus the CPU/GPU won't fry itself at 100C along with the components around it; at max TDP the CPU vcore is literally 2.0V. The M1's cores are going to degrade so fast it will be useless in 1 or 2 years, which is exactly what Apple wants, because then you have to buy their new garbage.
I will never get tired of "Thanks, Steve" popping up every now and again.
Also so happy that the 5800X3D is sitting at such a notably "nice" price. The 12900KS scared me with its monstrous "higher than MSRP 10GB 3080" pricing. After a slew of "meh" and "barf" CPUs, I can see why AMD was so confident to release those alongside an actually great CPU.
All good options for my older AM4 friends to slot in and then rub in my face that they got a better deal than my launch 5600X, which I will just continue appreciating for years to come.
I'm with you, homie. I got a 2400G at launch too, but I was desperate and didn't have enough for a good GPU. It went hard for an iGPU though; I got a proper GPU a long time ago.
I'm probably up for an upgrade, but I'll wait, 'cause I'm busy with work and life rn anyways 😅
I haven't seen any "real" prices yet for this CPU, but I think it will have a hard time. The difference between a 5800X (400) and a 5900X (480) is $80 here now, and if this is $450, I think I'd rather go with a 5900X for $30 more. At 1440p you don't notice any difference in fps anyway, and in workloads you get way more punch.
@@AndrewTSq I don't disagree that the 5900X is a compelling alternative in theory, but only if you can actually use the extra cores for something regularly. If you're doing mostly gaming with some occasional productivity tasks, you'll probably get more real value out of the 5800X3D than the 5900X.
@@ozzyp97 Since the 5900X is cheaper here than the 5800X3D, I would go for the 5900X. But if the 5800X3D were cheaper... maybe. And if you play at 1440p, I would just go for the regular 5800X and save $130.
@@AndrewTSq Yeah, it would depend on your GPU and upgrade plans too. The point remains that the 5900X isn't that much faster than the 5800X/5700X in games, whereas the 5800X3D is a fair bit faster than both. Just spend the money on what makes the biggest difference in your actual use case, there's no value in theoretical performance.
There are already overclocks being submitted with this CPU: an external clock generator and a 1.35V lock allow it to hit 4.9GHz with another 12-16% gain in performance. So ideally, with the right setup, this CPU can completely cream the 12900KS, despite the latter using more power and having the advantage of high-speed (and very expensive) DDR5!
This cpu will become the go to for AM4 gaming for years and years to come.
God I wish someone would cream me 😉
@@AstroAvenger Depends on what type of gaming. For esports titles (generally the ones played at 1080p with high refresh rates), a higher clock speed CPU will be better than this one.
I can see it becoming the AM4 equivalent of the Q6600 down the line. Not the best choice at release, but great performance that will give it a long life for years.
@@HappyBeezerStudios how is it not the best choice for the price its going for?
I would like to see mid-year experimental CPUs like the 5800X3D a lot more. It keeps us interested in how they can do some really cool things by modifying how the CPU is made. I would like to see Intel do that instead of releasing a KS SKU that doesn't warrant its price.
KS is idiotic
SKU*
@@TKIvanov I was referring to the 12900KS being a variant of the 12900K.
@@worldofjoseup It's supposed to be "KS SKU", not "KS skew".
@@lapin_noir Ohhh. Mb. Thx for clarifying.
Here I am in late 2023 putting a 5800X3D in an X370 motherboard I bought in 2016. Fucking solid platform. The next one will be AMD for sure if they keep pushing in this direction.
This reminds me of the 5775C. I really wonder why AMD and Intel didn't repeat that experiment sooner after having such good results back then. Great review!
This CPU costs a lot of money and silicon to make. Essentially 50% more silicon, making it cost similar to a 5900X or even a 5950X depending on the yields of that 3D bonding process. It costs more raw materials, and the extra production steps might make it not worth manufacturing.
Cache is super expensive - that's why.
Oh and btw - if AMD hadn't raised the core count to 16/32 on AM4 and in the normal consumer market, we would still be using 4/8 or 6/12 (or maaaybe 8/16) Intel refreshes for ages. :D
The last thing I expected from Intel after using their products for years is something really innovative that pushes boundaries for all users. They simply react, and that is it.
@Darren Rushworth-Moore That eDRAM is for the Iris GPU.
We've seen that a simple cache increase can lead to higher performance.
@@AlfaPro1337 Nope, that L4 cache is also for the CPU on BW.
@@notlikethis9932 Why do you even need more than 4c/8t on a mainstream desktop?
1. Video recording is moving to ASICs, like NVENC and QuickSync.
2. Nope, it's just shitty off/online DRM that tanks the CPU. There's a benchmark that shows a pirated copy suddenly performing better, even on a 4c/4t CPU.
3. Console optimisation.
And the 12th gen i3 already proved that you can game just fine on a 4c/8t CPU.
If you want more cores, you can go X299. You don't need quad-channel memory to run X299; you can go dual-channel. Heck, even the entry X299 boards are heavily built compared to their mainstream counterparts.
AMD hasn't done anything innovative. Zen is just pre-Faildozer architecture--K10, souped up, modernised and modularised. AMD has no good engineers, they always have to call Jim Keller in order to jump start.
As someone new to the PC space, I greatly appreciate the in-depth reviews you guys do. I watched countless hours of your videos as I was trying to decide what to get for my first build.
It was GN's video on the H500M that sold me on the case, and boy, am I glad I did, as it's allowed me to eke a hell of a lot more performance out of my 2700X (undervolted and overclocked).
Same. Steve for data, Jay for general info. Combined, helped a LOT when I built mine in 2018. And am continuously looking for upgrades.😉 It can get addicting.😅
It was only at the release of the 14900K that I realized Steve was guiding us all along. Thanks Steve, back to you.
I'm surprised by how well the 5800X3D holds up after all. Quite compelling. Thanks for your work!
Yep, a CPU price that is good. But the 6900XT's price is still a big ouch. Same goes for an RTX 3090.
Meanwhile, the 3800X I have is still waiting. It can also do 4550 MHz at less than 1.300 volts all day. It's still good enough for either GPU that I refuse to pay for.
@@warrenpuckett6134 A 3800XT bottlenecked a 6800XT by 10-15% or so in some games at 1080p, and about 5-8% at 1440p, so your 3800X would bottleneck them a bit, considering they're even better.
@@vespa7961 It doesn't bottleneck an HD 7790 using Linux with a 60Hz 1440p monitor. Point is, I do not need more, 'cause I am not buying any GPUs this year. Not buying a 144Hz monitor either.
I do have an R9 390X sitting in the drawer.
I bought that used for $250. Way more than what is needed for Linux.
Pretty much what I have is a 5700G. I have not bought any game titles in 4 years. I can wait.
If the day comes when it hits the price I am willing to pay, I am sure the 5800X3D will be long in the tooth also. It is also alright with the old lady. She gets to shop more. But still, if you need it, get it.
@@warrenpuckett6134 I'm talking from firsthand experience; that CPU WOULD bottleneck those GPUs a bit, as I have the 6800XT and it was bottlenecked in some games by 10-15%.
Still always love seeing the random tech company presentation clips, the injection of dry humor without acknowledgement is hilarious to me. Keep up the great work that you guys are doing!
It's insane how this CPU is still the best bang-for-buck enthusiast-level gaming chip, even recently.
When this chip launched I saw the hype around it but thought "I won't watch those reviews, no need to upgrade my 5600X anyway." Seems I was dead wrong! If DDR5 keeps not making any big leaps in performance, it seems I won't have to upgrade my X570 mobo after all! This chip makes me so happy since if I ever wish to upgrade to the RTX 40xx series this chip can handle those cards no problem.
Impressive product in my opinion. The performance increase with a wattage decrease shows actual technological improvements, as opposed to dual turbo charging and then super charging your CPU and blasting 300 watts through it. Much much better for sff builds! Going to be way easier to cool.
Not really. If you want an SFF build you're better off with Intel; you missed the point completely here. Intel is only less efficient when pushed to its limits. But in idle, in gaming, in streaming, in browsing, it's more efficient.
Even if you decide you need a creator workload, Intel is twice as fast, which means it takes half the time to finish the task, which means half the time spent at that workload, hence it cancels out the consumption differences.
@@TRX25EX I don't think you are very familiar with the CPU industry if you consider Intel to be efficient (and cooler) in multi-threaded applications. The efficiency cores cannot deliver the same performance if you take into consideration the same core count.
@@TRX25EX That was a very weak Intel fanboy comment? lol
@@GansoGG You talk too much. The 12600K is more efficient than the 5800X + cooler + faster + cheaper... Full stop... Too much blah blah is not required... I don't care about the industry, I don't care about Intel or AMD, I just state (FACTS).
You have facts to prove me wrong? The 12700K even beats the 5900X, yet it's cheaper, cooler, and more efficient...
If you're gonna speak about productivity then read my comment again (FANBOY)
Edit: a video for a fanboy like you who is trying to defend AMD by being very specific, knowing that in general what I said is true...
ruclips.net/video/PiBVuwBeoQc/видео.html
LMAO, I mean SFF has its limits as it is: air, then AIO, then custom loop. And Intel K chips lie in the custom loop category along with R9 Zen 3 CPUs and the 3950X. The best I'd consider in ITX would be a 12400 for Intel. At least the 5800X can be undervolted or have its PBO voltage tightened to cool better. The 96MB of cache makes the AMD chip the best gaming option in ITX, but it's still expensive at $450 vs the 12400. Especially if you jump to 4K or towards a racing sim setup with 4K displays; then the CPU doesn't mean much vs a 100+W single-core 5GHz+ 12900 on ITX in a game like CS or Valorant. A lot of $$$ invested for some free games, NGL.
Can I just say... this is just the beginning for this type of technology, the 3D caching they're using, that is. If I remember what AMD was saying in the past, the tech will be used on all future desktop CPUs, which is going to be pretty amazing after seeing this. Intel is going to need to up their game on something now. ^_^
I was wondering about this. I know this is essentially a proof of concept, and it looks really good. But if Zen 4 with 3D V-Cache is right around the corner, it makes this hard to justify.
@@GoodnotGreat88 I think this is for people with an AM4 motherboard and second-gen Ryzen who haven't upgraded yet; the performance bump is substantial.
@@chrom4ful No good for me with my 5950x playing at 4k.
Yeah, can't wait for Intel to release a fist-thick CPU with a shit ton of fake multi-cores and a 3000-watt power draw, that bends, to barely beat AMD...
The most exciting thing for me is that by making a cache-only die, they managed to make it much more dense. Combined with 3D-stacking, they can vastly improve the cache size/latency ratio.
From my personal studies of software optimization, I've learned that larger caches are excellent for poorly optimized code, or code where you have a lot of small objects and you don't bother keeping them in contiguous memory chunks. This is very much the case for many games and certain data formats like JSON, if you don't carefully design the code to collect all the little memory fragments into larger chunks. It is also very useful when you have large datasets and need to jump around randomly to do calculations, since there is a greater chance that the data you need will still be in cache if there are more cache lines available.
The improvements in cache make me very excited to see what they will manage to do with their next generation of CPUs.
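To make the contiguous-vs-scattered point concrete, a toy C++ sketch (everything here is invented for illustration; real allocators and engines are messier):

```cpp
#include <memory>
#include <numeric>
#include <vector>

// Scattered: every value is its own heap allocation. A traversal
// pointer-chases across the address space, so each element can be a
// cache miss -- exactly the pattern a bigger L3 quietly rescues.
double sum_scattered(const std::vector<std::unique_ptr<double>>& vals) {
    double total = 0.0;
    for (const auto& p : vals) total += *p;  // dependent load per element
    return total;
}

// Contiguous: the same values in one flat array stream through cache
// line by line, and the hardware prefetcher hides most of the latency.
double sum_contiguous(const std::vector<double>& vals) {
    return std::accumulate(vals.begin(), vals.end(), 0.0);
}
```

Same arithmetic, very different memory behavior; extra cache mostly benefits the first version, which is what a lot of game and JSON-heavy code looks like in practice.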
Software emulation too. The constant branching, setting of bit-flags, and giant instruction lookup tables really do a number on both the memory controller and the CPU's speculative branching, and only "by pure dumb luck I had that in L3" can help it.
This CPU also blows everything out of the water in Factorio. That is a super-optimised game, so everything ends up in the cache and it steams ahead. So both poorly optimised and highly optimised (but cache-hungry) code benefit? The stuff in the middle does not; it just runs the same or slower.
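For the emulation case, a minimal C++ sketch of why interpreters lean so hard on cache (a toy VM; the opcodes and layout are made up for illustration — real emulators have far larger decode and handler tables):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// A toy bytecode interpreter. The dispatch switch is a hard-to-predict
// branch, and real emulators pair it with big lookup tables that compete
// with the guest's own data for cache space -- the kind of working set
// a large L3 can keep resident "by pure dumb luck".
enum Op : std::uint8_t { ADD = 0, SUB = 1, HALT = 2 };

std::int64_t run(const std::vector<std::uint8_t>& code) {
    std::int64_t acc = 0;
    std::size_t pc = 0;
    while (pc < code.size()) {
        switch (code[pc]) {                 // unpredictable dispatch
            case ADD:  acc += code[pc + 1]; pc += 2; break;
            case SUB:  acc -= code[pc + 1]; pc += 2; break;
            case HALT: return acc;
            default:   return acc;          // unknown opcode: bail out
        }
    }
    return acc;
}
```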
So basically Escape from Tarkov lol
Unfortunately, these kinds of improvements only lead to developers getting lazier instead of improving overall performance and results. Software will expand inefficiently to consume all the hardware power that is offered.
*"very excited to see what they will manage to do with their next generation of CPUs"*
This aged incredibly well, as just 1 year later we all saw the Ryzen 7800X3D basically obliterate every other CPU when it comes to gaming while not breaking the bank with its price. The 7800X3D will go down as arguably the best gaming CPU of all time, and something you won't need to upgrade from for who knows how long.
3:40 Really getting more Tech Jesus vibes than usual with that back lighting
Amazing how the 5800X3D competed with the i9-12900KS at a lower price and almost 3 times lower power consumption. Can't wait for this tech in gaming laptops.
I wouldn't hold my breath, as precisely one of the main differences between the Ryzen desktop and laptop models is that the laptops have a much smaller cache; I figure due to size and power consumption woes.
@@albertobueno7805 The laptop parts have monolithic dies, they are all APUs, and they ship in large volumes, so they are cost-sensitive. Only Apple has managed to build large monolithic dies for laptops, since they can charge a large premium.
But since the 3D V-Cache tech only needs around 2% more die area on the main die plus a separate cache chiplet, AMD can now build 3D V-Cache top-end SKUs of their top laptop APUs, brand them as the ultimate laptop gaming CPUs, and charge a premium like Apple does. As we can see, the clocks are lower too, so for laptops it's even better for power usage.
Intel can't have a KS answer to this, since clock and power increases on laptops make it a no-go from the start.
Let's be honest, gaming power consumption between the 12900KS and the 5800X3D is almost the same, just like gaming performance.
@@stefangeorgeclaudiu You've missed a key consideration -- the cache is 3D-stacked, which adds to the z-height of the package. In turn, that requires structural considerations for the silicon below that are contradictory to the goal of shrinking the z-height. If AMD chooses to bifurcate their dies between 15W and 45W parts then I could see it happening, but otherwise I'd put it at a 15% chance at best.
Those are misleading graphs... You're missing the point completely here on power consumption, for 2 reasons:
1. Intel is more efficient because you're not going to run a creator workload 24/7, which is the only place Intel falls behind; in idle, in gaming, in streaming, Intel is more efficient.
2. Even if you use it for creator work, the 12900KS is almost twice as fast as the 5800X3D, which means it takes half the time to finish the same task (so half the time spent at that workload), which basically cancels out the power consumption difference...
It's a great CPU. I upgraded from a Ryzen 9 3900X to a Ryzen 7 5800X3D and my fps went from 70-80 to 120-135 with an RX 6900 XT Sapphire Nitro on a 34-inch 3440x1440 widescreen monitor.
Steve's giggle after the "Thanks, Steve" did it for me; the review after that is just eye candy. Thanks, Steve!
As someone that doesn't overclock anyway - this is a very very very very tempting upgrade for the B550 board with the R3 3100 currently in it!
Running the exact same.
But the OC is gone, and though I never really use an actual base clock OC, I usually up the boost and max power draw in the BIOS. Not sure that will work flawlessly.
Bit expensive anyway though; thinking I might be better off with a 5700X/5600X and a 2TB SSD overall.
@@jamegumb7298 I use a 5600X and it runs everything I play very well, but I also run some of my games off a 2TB hard drive (the price for the amount of storage was very good, half the price of a 1TB SSD).
A good compromise would be the 5700x if you want to upgrade other parts too.
idk, I'd rather go with the 5900X instead for the same price and more cores.
@@winonesoon9771 just noticed a huge price drop (including 5950X) in the UK as well, it could be more worthwhile to get a 5950X before the X3D even launches!
@@draketurtle4169 I only have 2 HDDs left for mass storage. Still need 2TB to upgrade the game drive; less reinstalling all the time. I would go solid state all the way for the least amount of noise, but a good big SSD costs an arm and a leg. €1000 for the cheapest 8TB.
We really need more M.2 slots. E-key should be standard, and 3-4 NVMe drives as standard; then I could just add 1 drive. Thinking of just ditching this and getting a low-end SP3 CPU, it has enough lanes at least, maybe a 72F3. My other system is a 3647 (just a 4114) and the lanes are a blessing, but I use it for other stuff and it has another OS on it. It also needs a new SDR, which is costly.
Thanks for this great review. I bought the 5800X3D on crazy sale and sold my 5800X. For around $150 out of pocket, it was a drop in upgrade competing with all the new stuff. Wow. No reinstall of Windows. No nothing. Just swap and go. Ready to take on any new GPU I throw at it.
Thinking of swapping out my 5600x for this cpu. 🤔
@@Captain-Chats I just did, massive difference. You need a 360 AIO for it though; with mine I hit around 60°C while gaming with a non-aggressive fan and pump setting, and 80°C while pegging the CPU at 100%, so it's pretty good. (PBO is off as well.)
I just ordered the 3D. Also gonna sidegrade from the 5800X; the new AM5 platform sucks (for now), so I might as well go with the best in slot for AM4 and just wait a couple of GPU gens before fully starting over on a new platform lol.
I'm very interested to see how the 5800X3D performs in Unreal Engine 5 environments like the Matrix demo, which is heavily CPU-dependent.
Are they cache-dependent?
@@ladrillorojo4996 I have no idea, hence why it would be interesting to see.
The UE5 demo eats 70GB of RAM when running in the editor, is that right?
@@franchocou Good luck trying to fit that in cache lol. Although, if it's that RAM-dependent, I bet it could benefit A LOT from more cache, as it probably needs a lot of data moved to and from it. Would be interesting to see.
More L3 cache hits and more space for the branch predictor to run as well
Really hope this isn’t just a 1-off stopgap, the engineering is way too cool to not be used for more products
They are also using it in their server CPUs, so I doubt it's a one-off. I'm pretty sure AM5 will have at least one SKU with 3D cache too.
I don't see why it would be.
They are focusing this technology on the server space right now, they just threw us a bone with this one (more to antagonize Intel really). If this was for us we'd see 12 and 16 core parts.
Milan-X (their HPC Epyc variant with 3D V-Cache) is what this was originally designed for. And there it absolutely rips the face off the competition in some scientific compute workloads. As in, some workloads see 2x the performance at the same wattage... That's a decade's worth of Intel server technology overtaken in one fell swoop.
I am going to upgrade to this one this year. Thank you AMD!
In terms of gaming, quite a good improvement. Also interesting technology.
Not that interesting tbh
@@nayan.punekar 'Not that interesting tbh' — getting about 10 to 35 fps or more with the same (old at this point, about a year and a half) Zen 3 architecture and DDR4. Hmm, am I missing something?
@@SoriPop Well, yup. It took 6 months to get this out with minor gains, mostly between 10-15% and rarely above 20%, while Intel took just 6 months to release a new architecture 😂
@@nayan.punekar Good bait
@@nayan.punekar I don't think you get me.
It looks like AMD has decided to take multiple crowns this week. I'm kinda sad that they went for the 'worst CPU' 5500 and 'best gaming cpu' 5800X3D at the same time, but hey this latter product is excellent for gaming both in absolute performance and price/performance.
to be fair to AMD, CPUs like the 5500 aren't really targeted at gaming anyways. They are perfectly fine low cost CPUs for your boring Desktop/Office PCs that aren't meant for high end gaming or serious production use. That's also why those non-X SKUs tend to be OEM first. (This gen though the competition from Intel in that price segment looks much better than we've seen in a while.)
The 12900KF costs about the same and gets the same gaming performance as the 5800X3D, and for everything besides gaming it will be faster. I don't mind the 5800X3D, as it gives a last gasp to AM4, but it is largely a pointless product, especially given the next-generation AM5 is out this year.
@@Hugh_I even such low end CPUs like 12100/5500 can push out 100% from most popular GPU models out there so I really can't understand why you consider them something that can go only in office PC. For that we still have pentiums (even celerons).
If it can satisfice need of more than 60% of gamers than it's gaming CPU.
@@logirex It's pointless for you, but a lot of people on AM4 were looking for that last upgrade, and the 5800X3D would be it. Jumping immediately to DDR5/AM5/Zen 4 is a complete waste of money this early on.
@@logirex it's more of a showcase of what we can expect to see in the future.
If AMD can comfortably do a 10%-plus performance increase, just imagine what the new generation can do. Maybe a 30% increase from IPC, clock speed, and the X3D cache is not so far-fetched.
We can only hope for the best :)
This makes me VERY excited for AM5 since those will be using this stacked cache too, with a major difference. With a new socket instead of a retrofit they have a lot more room for error, which is a big part of why it took this long to launch.
Well, it's not promised, since AM5 will use DDR5, and DDR5 is still early stuff with bad timings, which can affect Infinity Fabric speed/latency a lot. We will see in a few months. Raptor Lake will get a big cache bump too, and boost clocks are rumoured to be up to 5.8 GHz, meaning that pretty much every single chip will do 5+ GHz all-core with ease. Can't wait for Ryzen 7000 vs 13th gen; however, Ryzen 8000 vs Meteor Lake is going to be much more interesting (and that's when I will upgrade).
Stacked-cache Zen 4 parts aren't coming out this year. There's nothing circulating about "Genoa-X" server parts, so no Ryzen with V-Cache either. My guess is they are a minimum of a year away.
@@Dr.WhetFarts Same here will skip the 1st gen of AM5.
Hoping for a good 5800x3d deal this coming Black Friday Sale.
Seems like unfortunately they will not, yet.
@@Dr.WhetFarts well. You were wrong
I was waiting for AM5/13th gen Intel, but a 5800X3D upgrade from a 3700X for gaming at £300 net (selling the 3700X) would also be a huge upgrade, plus I'd get to keep my motherboard/RAM and let the early adopters be beta testers for the new platform.
Nice to see competition again!
Or... just wait a few more months and buy Zen 4 directly after things get polished instead of buying a stopgap for something that's perfectly fine and fast.
@@GamerBoy705_yt Why, when this would be cheaper and easily fast enough to run a high end gaming PC for years to come? If AM4 is any indication, waiting for second or third gen AM5 might be the safer bet anyway.
@@GamerBoy705_yt Or buy the 5800X3D because it'll help max out my RTX 3080 while costing me £300 vs what a new motherboard/RAM and CPU would cost (we also have no idea what the price and performance of Zen 4 will be). A 3080/5800X3D combo should last me 3 years. It's like telling someone not to buy an RTX 3080 FE (at RRP) when they're running an RTX 2070 Super because the next gen will be here soon; we can all keep waiting, although my opinion is, if something exists that makes you happy and you can easily afford it, why not get it? If Zen 4 absolutely smashes the 5800X3D, with the amount of use my PC gets that's not a bad thing (6 months of use for me is like 24 months for others), and selling my older motherboard/RAM will be easier with a 5800X3D installed vs my older 3700X.
It's a pretty awesome upgrade for the price over my current 3700X, with a performance lift of over 50% in some games. Considering I sat out the normal 5000 series, if the price is right with no stock issues I'll be a happy camper.
You're also not wrong with your comment; just for me personally, I think the 5800X3D suits my needs well.
These GTA 5 results actually make sense when you consider the amount of AI prediction it takes to run the game (pedestrian and car path prediction, traffic lights, plus a lot of RNG-dependent spawns and random events). This is where the cache on the 5800X3D should thrive.
Yeah, LTT also used GTA as an example of how more cache can help keep a better average frame rate and consistency.
@@ArchusKanzaki It also came out top of the heap in Flight Simulator: 113fps vs 93fps for the 12900KS, and 113fps vs 80fps for the 5800X, which equates to gains of 21% and 41% respectively. Those are seriously respectable gains.
This is such an interesting CPU, and it gives a chance to show where cache really matters. This really makes me wonder what an X3D 5900X and 5950X would have looked like; it would have been interesting to see how it worked across chiplets. I also can't wait to see what an architecture built from the ground up for 3D cache can do, rather than it just being tacked onto an already functional CPU at the end. As 3D stacking gets more complex and moves to logic, I bet we see some interesting thermal solutions too. The future is exciting! Well... at least in tech it is.
Probably would have overheated.
Makes me wonder if AM5 CPUs will be split into "work" models with focus on more cores and "gaming" models with focus on more cache
Hardware Unboxed called this ages ago back when Zen3 launched. In response to Tech Deals going on about cores they practically proved that even in the cases where the higher core count parts outperformed the 6 core, it was due to their increased cache and not the actual cores. You could probably get the same results with a theoretical 5600X3D. Cache seems to be the limiting factor for CPU gaming performance right now.
@@AVerySillySausage Mostly, though in some games having more cores can drastically improve perceived smoothness by raising lows and improving frametime consistency. But that really depends on how well the game scales, and how much you have going on in the background, or even how old your windows install is. But yeah, overall a 5600x3D would be the same. I just wonder how two chiplets on a 5900/5950x3D would work out.
If yields are not so great, we could see an 8 core composed of two half-disabled (or 3+5/2+6/1+7) chiplets with cache stacked on top of both. I'd imagine the cache would help fight the latency cost of having the cores split across 2 chiplets. Would be a great gaming CPU without taking Chiplets away from productivity models.
This seems like a smart final product release for the AM4 socket, gives a very clear upgrade path to people who want to get the most from AM4 before moving on to AM5 later down the road.
Yeah, I've got a 3600XT and I'm looking at this as a mid-life upgrade to pick up this summer/autumn.
I was hoping for a 5950X with 3D V-Cache 😂... as upgrading from a 2700X to a 5800X3D won't be much of an upgrade in terms of my workload, as I rarely game lol.
@@earthtaurus5515 I also have a 2700x. I'm tempted... for gaming this upgrade would be freaking huge.
A farewell product for AM4 lovers.
@@Thunderstyle7 Indeed it would be huge for gaming. I really hope we get a 5950 version later lol. Yes, I want all of that cake (productivity and gaming) haha.
I feel like this CPU is just an appetizer before the main course: Zen 4.
Quite amazing considering this chip, while using DDR4 RAM, can actually challenge Intel's top CPU using DDR5 RAM!
Can't find the RAM specs, but you need above-"supported spec" DDR5 to get more performance than 3200 CL14.
DDR5 is not mature and there really isn't software that can even utilize such large transfer speeds as it is.
Just looking at DDR4 vs DDR5 benchmarks on any 12th gen Intel CPU, the difference is overall negligible with a massive financial penalty for adopting DDR5 early. You can't find DDR5 for less than $300. But you sure can find DDR4 for around $50-70 if you're not picky on RGB or heatspreader shape and color
This isn't some "Intel is bad because DDR5 performance bad".
This just means that DDR4 is still the best even while CPU manufacturers are migrating to support DDR5.
Unless you're chasing benchmarks, you'll never see a true difference in RAM speed and latency going beyond 3600. Best bang for buck is 3200.
Yeah wait til DDR5 matures more
But just in games; let Intel get on the same node AND add the extra cache and it would be faster. And this CPU is just for gaming; if you use it for work it's shitty.
@@PrefoX If that happens, AMD would be on a whole new node. AMD is more excited about Zen 5.
I've just bought a 5800X3D this week, specifically for racing simulators, where it is a monster. Got it for a lot less than a decent AM5 mobo. It's up to a 100% increase in frames for sims over my 5600X, for a net £200 after selling the 5600X. That's outrageous value for money.
You can tell Steve's wallet had a hand in a large part of the scriptwriting for this episode. xD
AMD allowing reviews a week before launch demonstrates their confidence in the chip, and their confidence seems well-placed. What a wonderful product to reward those AM4 users who didn't switch to Alder Lake.
It's probably best not to think of this as a 'reward' because that would imply that loyalty means anything in the pc computing space. Very few people buy "the newest and greatest" cpu every year. For most people, buying into any platform is a gamble and longevity is the goal.
The last PC I built was a 6700K back in 2016 and it's still going strong (currently relegated to being a media computer). That was solid value. AMD most certainly was not up on its game back in 2016, and the early AM4 socket had serious problems. Some BIOS updates would literally brick the motherboards. It was a bad time until Ryzen finally matured. If I'd bought in back then I would have had a _bad_ time. Zen 3 has been a banger, though. That's one of the big reasons why my current system is a 5900X. But I got that after enough time had passed that I could justify throwing money at a new motherboard and CPU.
If I'd chosen an AM4 chip back in 2016, I would have had to suffer with AMD during a time when it was seriously lagging behind. If I'd built my computer in 2019/2020 and bought into the AM4 platform at Zen 2, the value proposition would have been a little better but it would also feel too soon to buy a new chip today. I'd basically be admitting that the Zen 2 chip I bought was a waste of money that didn't have the longevity.
Another thing to consider is that buying a 5800X3D would have only allowed me to get more life out of the existing system, but it'd basically be a last-ditch effort to keep my *motherboard* relevant. A motherboard costs what... a hundred bucks? Two hundred bucks if I want a nice b550 board? It's kind of a silly thing to base so many decisions on. What matters most is how much computer you can get for the price, _when it's time to buy a computer._ If enough time has passed that I'm getting ready to upgrade some parts, I'm usually gonna need a new motherboard anyway. So the chipset really is irrelevant unless the parts I buy are failing me right out of the gate.
So if you're somebody who is in the market for a decent gaming computer and you don't care what the platform is, the 5800X3D is genuinely the best value out there. The AM4 technology is as mature as it's ever gonna get. It's stable and it kicks ass. I think that's really all that needs to be said.
@@pirojfmifhghek566 I mean, I bought the cheapest Zen 2 on the assumption I'd buy the last AM4 CPU.
I have to say, this is a pretty awesome payoff for that decision. I had a PC that met my expectations for two years and got a rather substantial upgrade at the end.
Hindsight 11 months later: this chip is a love letter to AMD customers. Dr. Su has said she loves gamers; there is no AM4 chip that beats this in gaming and, until 13th gen, almost no Intel CPU did either. 11 months later, Ryzen 7000 is in full swing and it's still in the conversation. That's love for your customers, who may not have the insane amounts of money to be early adopters of AM5. They have made this chip widely available and widely compatible with everything but 1st-gen boards.
Good on AMD.
@@pirojfmifhghek566 One hundred bucks would literally be 3/4 of the price of the CPU I want to use.
Holy... I actually believed that AMD wasn't gonna live up to their claims, especially with how good 12th gen is. Good to see new tech stretching its legs.
Good to see them competing
truly the 1080ti of CPUs, all hail the GOAT!
I think the 1% lows are really interesting, even in CS:GO where the average frame rate was so much lower than the competition, it still has the best 1% low score, which will make for a much more consistent experience.
Funny how AM4 was supposed to be retired earlier. But the pandemic delayed its retirement and now it's going out with a bang with the 5800X3D being the best value gaming CPU.
AM5 better not disappoint. AM4 set the bar very high for platform longevity, performance and real value.
I dunno man. My PC is already dead; a RAM slot is fucked up.
Should I wait for AM5 or just buy AM4?
They said you can use an AM4 CPU on the AM5 platform.
It is really fascinating to see how different applications depend on cache.
After seeing the benchmarks of the Epyc chips with 3D V-Cache, there are certainly non-gaming programs that benefit a lot from more cache.
The 5800X is faster than the 5800X3D in non-gaming programs. The 5800X3D is aimed only at gaming performance.
@@GodKitty677 some, not all
@@GodKitty677 The improvement is marginal for a $150 price increase if you're only looking at it from a gaming perspective vs the 5800X
Unless you have a use-case for needing the extra L3 cache where it benefits greatly, this is more of a "what if" modification to an existing architecture.
@@rotor13 If you are looking outside games then one would expect more cores as the more important attribute.
Just bought one of these in Dec 2023 - I think this CPU has single-handedly extended the lifespan of AM4 by another two years.
(And they ALL seem to be _incredible_ undervolters - I used -30 all-core and dropped 10+ degrees in Cinebench for _more_ performance.)
The best part about this CPU is that I rarely see it break 70 watts in games.
It uses 70-80 W in The Last of Us Part 1, but that game is known to be heavy on the CPU; some say it's not optimized. On the other hand, this CPU uses only 50-55 W in AC Mirage, which is brilliant.
@@Ladioz My 5600 non-X goes up to 60 with a Peerless Assassin and an undervolt. You can still undervolt the 5800X3D too, and most people run -30 with no issues even after months and months.
"After this, AM4 dies."
Famous last words.
I hope the stock is good on the 5800X3D; looking to upgrade my 3700X to this since I'm trying not to adopt AM5 for a few years.
I'm in the same boat. The last hurrah for our AM4 motherboards with DDR4.
3800X here. The 5800X3D is like going from a 1800X/2700X to the 3700X (I probably wouldn't bother with the 5800X3D if I already had a 5800X).
Going from a 3700X to a 5800X only gained you 10-20% plus Gen4 PCIe - a good gain, but not worth the cost.
Whereas the 3700X to 5800X3D is consistently 50%+, so it is worth the cost overall (this was all using 3200 RAM as well, which is impressive; using 3600/3800 CL16, which increases the Infinity Fabric speed, would have lowered the latency even more).
I have the RAM (3600 CL16 that my current X370 mobo limits to 2933), so I only need a motherboard and CPU (and I'll probably buy a 2 TB NVMe SSD as well), so the 5800X3D is probably going to be the way to go.
THANK YOU, I've been waiting on this since you started reviewing the AMD chips a week or so ago.
So much to process here. AMD shot themselves in the foot with some of the lower end CPUs just released, then release this intriguing, cache-heavy monster that could really boost performance for certain applications to levels we haven’t seen before for a competitive price. I’m currently on an 11700K (it is what it is). I wonder if RPCS3 is any sort of cache dependent and, if so, what the performance numbers would look like.
RPCS3 might actually perform better on 11th gen since it actually uses AVX-512.
The fact that the AM4 chip can keep up with the i9, and surpass it in some places, while only using DDR4 makes me impatient for the 7000 series release... I have a funny feeling that AM5 is going to completely blow Intel out of the water.
On one hand it makes me feel slightly bad that I built on AM4/DDR4/PCIe 4 (even numbers suck anyway) in 2020, but on the other it makes me realize, holy shit, how was that two whole years ago already?
Think AM5 will last as long?
@@TheJimyyy Ummm, you and I can't have a realistic conversation about AMD vs. Intel if you really believe that. The Intel i9 is based on the 5nm process, and only that chip is slightly better than the processing power of AMD's Ryzen 9 5950X and Ryzen 7 5800X3D out of the box. So you arguing that it's based on last year's "refresh" is mildly irritating. All I was commenting on was the fact that AMD is going to smoke Intel with the release of the 7000 series processors.
Raptor Lake and Meteor Lake will blow AMD. There are some serious improvements in Raptor Lake.
@@jordanplays-transitandgame1690 I agree, they're probably going to blow AMD and also maybe AMD will give them a reach around like how they enabled FSR on the competition's old cards lol
Seriously though, I've been hearing stuff like this for YEARS at this point. Do you remember back when we were told to wait for Alder Lake? Do you remember back when we were told to JustWait for the 11000 series? Because I sure 'member. Basically Nvidia and Intel just got so comfortable and allowed themselves to rest on their laurels that they're literally YEARS behind down the pipeline now, and at this point it's still them basically trying to play catch-up again thanks to very bad planning on their parts, due to not taking AMD seriously.
Which would be one thing were it only that, but the last few years increasingly just feel like Intel and Nvidia blowing lots of smoke (and power draw) up my ass in order to justify and cope their way out of losing massive market share. It's one small part of why I went from wishing for a high-end Nvidia card for my 2020 dream build, to not only switching to red but outright shit-talking them constantly, because I can see through the lies at this point. I just don't believe in hype and I want results, particularly where my money is directly involved.
So as a result, ironically enough, AMD needs to fuck up pretty spectacularly to get me back on Nvidia at this point, whereas I actually would switch back to Intel, mainly because I'm tired of hearing sustained turbo boosts get called "overclocking", because they're not. Ryzen flatly cannot and does not overclock. Unless Steve or somebody (upside-down or NA Steve, either one) could clarify this to me, because I do feel pretty bilked about all that when it comes to processing, regardless of how good and efficient and cheap it is, or that I'd have bought it anyway. On Intel you literally can get, say, a 3770K to sustain a boost of 3.9 GHz OR overclock it in the BIOS at up to 4.7 GHz. Now, that is an overclock: 3.5 GHz base clocks and up to 3.9 GHz TURBO BOOST. A sustained boost clock of 3.9 on a 3770K is not "overclocking" and I am not ever going to accept it as such, and yet somehow I keep seeing people describe sustaining a clock UNDER the specified turbo boost as "sustained overclocks of 4.25 GHz all-core on my 3.6-4.4 GHz 3700X". Like, what the absolute fuck?
So just for that, I am much more willing to make fun of AMD on gaming, and more prone to saying you should get Nvidia + AMD CPU for productivity, and Intel CPU + AMD GPU for straight gaming. Yes, even with the tepid showing of Alder Lake next to Zen 3. However, the main thing is that Alder Lake mostly fails because of Intel's notorious power-draw issues, although my understanding is you can easily push 250 W on a 5950X with sustained all-core turbos, aka "overclocking", unless somebody can explain to me how I'm wrong about that.
@@pandemicneetbux2110 Nvidia is definitely questionable, but what's not questionable is that they beat Radeon by a long shot every time.
Had to come back and watch this because I just bought this 5800X3D and a cheap B550 board, and got them at a very good price in the BF deals here in Finland 😁. Replacing an 8700K, which is still a good CPU, but I can't wait to see some results in gaming.
Nice man, same. I'm upgrading from an i7-8700.
One year later and this thing costs 300 bucks lmao. AM4 is outdated but it goes out with a bang. Bought it last week, it's amazing.
I like how the 5800X3D is still powerful enough for a 4090, and it will 100% be powerful enough for a 5090 as well. AM4 was a good fucking investment.
I bit too early on the AM4 platform with an X370-chipset motherboard, starting out with the introductory R3 1200 CPU and an RX 570 GPU. Granted, it was a far cry from the Phenom II X4 940 AM2+-based system I was coming from; it was a long time coming. I learned my lesson though. I upgraded with the 5950X and a Radeon RX 7900 GRE (AV1 codec support). I went with the 5950X because many of the review comparisons found that even at lower frame rates, the gameplay was often quite smooth in comparison to the X3D offerings. I literally can't stand stuttering, at all. Something to consider. Also, it has other benefits beyond gaming that I will utilize. I think it squares out to about the same difference as a naturally aspirated V8 engine versus a turbo V6. I'd say that will probably max out that system for my next 5-7-year run, and I shall begin saving up for the next build. Only this time, I will wait a couple of generations on a new socket (likely AM6 or 7, if I stick with AMD and they stick with the current MO), and will invest more initially in the CPU and GPU. In my AM4 example, I should have waited another year for an updated chipset (the X470), an R7 8-core, and a Radeon RX 5700 - a good-enough intro build on a maturing platform. And then a final maxing-out upgrade on the outro. At least that's the strategy anyway.
I upgraded from a 3600 to a 5800X3D and my goddddd, the gaming improvements were insane.
This is very impressive, but also, at the moment, I'm more interested in seeing AM5 stuff (and 40-series GPUs) before I make any decision as far as building a new PC. My 3700X and 3060 Ti are more than doing the job right now, so I'm not in a huge hurry.
I mean.. You could just take the 3700x out and (after a BIOS update) slap the 5800x3d in, you don't need a new build.
Yeah, your 3700X will be just fine with the 3060 ti. It shouldn't bottleneck you, even at 1080p.
@@TKIvanov Only if he wants to play Far Cry 6 or GTA 5 with max FPS on a 144 Hz screen; otherwise he won't notice a difference.
@@TKIvanov Actually I can't, unfortunately. The motherboard in this prebuilt (which was the only way to get that 3060 Ti early last year) is unsupported by Gigabyte entirely and has no BIOS updates (it's the B450M DS3H Wifi V2, which isn't compatible with the BIOS updates of the DS3H Wifi OR the DS3H V2, and isn't even listed on Gigabyte's website). So basically I'd need to get a new motherboard as well, and at that point I might as well also get a few other new things (PSU and case). So I'll just wait.
I did find solutions to actually update it with another board's BIOS, but I'd rather not risk bricking my machine.
@@taggerman6867 I've been playing newly-released games at 1440p at around 120-140fps (when they're optimized decently, as some games go down to the 90fps range), things are going great with just a minor stable overclock on the 3060 Ti. And there's DLSS helping a bunch too on some games.
Funny, you can almost buy two X3Ds for the price of one 12900KS, and those two will still draw only around 70-80% of a single KS chip.
Or you can just buy a K and overclock it...
@@Jonathan_T DDR5 prices though... it's a worse bin too, so it'll draw even more power at 12900KS clocks, not to mention the fact that you need a Z-series board.
@@nick524 There are Z690 boards with DDR4; the 12900K works just fine with them. Besides, I don't really recommend DDR5 with 12th gen (weak gains unless you're a pro RAM overclocker).
@@nick524 Bro... power? Who cares about CPU power draw, 100 or 200 W, if you have a 3090?
@@nick524 Also, next gen will double the power draw... and Intel will release a processor that will beat this one.
It would have been great to see comparisons with VR. This chip really shines for VR where frametime is most critical.
Fascinating silicon launch. AMD realised there was a niche market for faster gaming CPUs, maxing out cache as opposed to cores and threads. Great way to ping their target market.
I think an interesting test of this CPU would be whether the extra cache makes a difference in the FPS drop that comes from running a lot of stuff in the background while gaming, vs. the normal 5800X.
Doesn't seem to be that big of an improvement given the graphs they've shown for gaming, with only a very slight increase in the lows
@Garrus Vakarian The only game I have that is listed in their testing as showing any improvement saw a 3% increase vs. the 5800X. So no, not "half" of my games.
I have a 12600K, which still outperforms it, at a better price-to-performance ratio.
@Garrus Vakarian I'm still waiting to hear why people argue 230 FPS > 215 FPS when the average user has a 120/144 Hz monitor.
@mike n Yep, and are people really playing these AAA games at 1080p when you've got such a high-end chip? A lot of people are probably on 1440p, where the performance difference is much smaller.
@@rotor13 A 12600K with a nice new motherboard, and did you buy RAM too? I can just drop this 5800X3D into my system and be done. The 12600K isn't a better gaming CPU.
I just got mine; with my 3090 it's absolutely incredible. I don't know if any other CPU can touch it in gaming. And even if you're not gaming, you have 8 cores and 16 threads. Just stunning.
What mobo are you running?
@@horseradishwithchives X570 Aorus Elite
I wonder how RAM frequency/latency affects the X3D. It's going to be interesting to see if any reviews push the Infinity Fabric to its limits, or break 1:1 IF, and see if there are any gains. Essentially: test DDR4-3800 CL14 and DDR4-4000+ vs. DDR4-3200 CL14.
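For anyone newer to Zen tuning, here's a rough sketch of the arithmetic behind that 1:1 question (the 1900 MHz Fabric ceiling below is an assumed typical value; the real limit varies from chip to chip):

```cpp
#include <cstdio>

// Rough sketch of the DDR4 / Infinity Fabric coupling on Zen 3.
// The 1900 MHz FCLK ceiling is an assumption; real limits vary per sample.
int main() {
    const int ddr_rating = 3800;      // e.g. DDR4-3800
    const int mclk = ddr_rating / 2;  // double data rate: 1900 MHz memory clock
    const int fclk_limit = 1900;      // assumed Fabric ceiling for this chip

    if (mclk <= fclk_limit) {
        // Best case: memory clock, memory controller, and Fabric run in sync.
        std::printf("1:1 mode: MCLK = UCLK = FCLK = %d MHz\n", mclk);
    } else {
        // Past the ceiling the controller desyncs to a 2:1 ratio, and the
        // added latency usually costs more than the extra bandwidth returns.
        std::printf("2:1 mode: UCLK drops to %d MHz, latency penalty applies\n",
                    mclk / 2);
    }
    return 0;
}
```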
Watch Hardware Unboxed; they included 3800 MHz RAM in their review.
Basically around a 5% improvement.
@@clydecaballero1706 Seconded! Hardware Unboxed also compared with Intel Alder Lake both DDR4 & DDR5.
😃
@@clydecaballero1706 That 3800 RAM in the H.U. review was CL16, NOT the same as what OP is correctly asking for. And by correctly, I mean that it's the really awesome RAM setup for Zen 3, and H.U. STILL has yet to understand this. Amazingly, all of these review channels are flawed in some major ways as it pertains to Zen 3; it's really funny to watch people kiss their feet.
Just imagine a 105 W 5600X3D with higher, or even the same, clocks compared to the original 5600X. It would be a killer in price/performance within that segment.
They didn't want to kill the 7000X, unfortunately.
@@Ilegator No, more likely they did not want to kill the 5600 and 5600X. And really, the extra 2 cores on the 5800X3D make the CPU laughably good compared to what came before it. There are some 10- and 12-core CPUs from Intel, or something like that. But even under a full 6-core load the cache is just great, and then you've got 2 cores that can be set aside for stuff that would otherwise have slowed down the already-taxed 6 cores. And when doing something that needs cores, the extra 2 cores over the 5600X are very welcome. Really, it is just a really good compromise.
And really, the 3D cache is an extra, expensive step in production. Doing it on a 6-core part, when 8-core chips add almost no cost for AMD, is just dumb. Better to try to make as much money as possible out of the good 8-core silicon instead of making better 6-core chips for less money.
Vermeer has 8 cores per CCX sharing one L3 cache block between all 8 cores. A 5600X3D chip does not make sense. And of course, it would cannibalize sales from the 5800X3D.
@@jorelldye4346 Well, same as how the 7800X3D is destroying sales of its bigger brothers. That includes the 7900X3D, which only has the 3D cache on one of the chiplets, each one composed of, you guessed it, 6 cores. A 5600X3D makes no sense now, over 1 year after my original comment, because AM4 is gone. But of course we won't have a 7600X3D either.
@@sovo1212 I stand corrected. Thank you for commenting. I suppose it is only the minimum profit margin they've set for themselves that limits 3D cache to 8 cores and up.
I upgraded recently from a 3700X to a 5800X3D and the performance uplift is awesome - anywhere from 15-30 FPS more in virtually every game (paired with a 6800 XT, which I also upgraded to from a 2060 Super). On top of that, when you undervolt the CPU and set limits, it's also very reliable, runs a lot cooler, and clocks higher. Overall a fantastic chip; for anyone on AM4 who wants to upgrade and boost their gaming performance, this is definitely the chip.
Can I ask what GPU and PSU you have? I also want to upgrade to this chip, but my 500 W most likely isn't gonna cut it.
@@j_g2005 A 6800 XT for the GPU and a be quiet! Pure Power 12 M 850 W for the PSU. I had to upgrade my PSU as well when I got the GPU; the old one was 600 W. The 5800X3D draws 100 W as well, so it ain't no slouch.
If you're using a higher-end GPU that pulls a good amount of wattage, then yeah, 500 W ain't gonna cut it. I suggest going for Corsair, be quiet! or Seasonic.
Had I been on a 5000-series Ryzen already, I don't think I would care, but it looks like a pretty massive drop-in upgrade from my 3700X.
That is also why Broadwell has aged surprisingly well; the big 128 MiB L4 cache can do a lot in some games. Like that one test on Intel 10th gen where cache could do more than cores in some titles. Or even back on LGA775, where the same clocks with a bigger cache could improve performance.
Barring stuff like the 4500, I expect to see AM4 fondly remembered as a banger down the line. Thanks to you and your team for the hard work.
Damn. AMD with the counter-punch. Looking forward to the next generation match-ups.
I have popcorn ready for that. AMD is about to jump nodes to N5, which is 84% denser than N7. Intel is on the "tock" cycle with Raptor Lake, so that's still on 10nm/"Intel 7". And considering how much more power Alder Lake already uses... my money (not that I'm actually a betting man) is on AMD pulling further ahead of Raptor Lake than Zen 3 did of 10th gen...
If I had to choose the last CPU for my AM4 mobo, I'd buy the 5950X: 16 cores and a 4.9 boost. Still one of the best CPUs on the market, for gaming too.
Wow, I was expecting the review to be a bit brutal, but for its intended purpose it's rather solid.
Now I want to see someone make it overclock anyways. 😂
The 5950X 3D would have been sweet.
We'll probably see something like that with the upcoming Zen4 generation, but I don't think it will be coming for the current, soon-to-be-obsolete Zen3 generation.
i.e. Zen 4
Would have been cool, but AMD would probably have had to drastically reduce the clock speeds if they wanted to meet the same TDP. I wouldn't be surprised if AMD released a 5900X3D or 5950X3D out of the blue, but it's very unlikely.
@@stupiddog79 I wouldn't say obsolete, just end of the line.
@@stupiddog79 Zen 3 is going to be relevant for the next 10 years, easily.
Really impressive to trade blows with something that is twice the price and nearly triple the power consumption in gaming. And no doubt any measurable difference will be indistinguishable to the player anyway. Funny to think that what is both kind of experimental and a "halo" CPU basically repeats what made Ryzen so great from the beginning, at the end of its life cycle.
Further to that, Ryzen has already turned many new ideas into things that are now standard in just a few years, with iterative improvements and refinement along the way, so it'll be good to see how 3D cache progresses.
Alder Lake is more efficient than Zen 3 during gaming, not double or triple the power draw.
I've ignored every review just to see yours first! So excited to see how it goes.
I can see why the 5900X3D wasn't released… the 5800X3D performs very well, runs cooler than Intel's new i9-12900K/KS, and trades serious blows with Intel's flagship. Most likely a 12-core with 3D cache will be an AM5 release… hopefully with a slight speed bump it will be an absolute monster going forward. Great to see the competition between manufacturers again, and the drops in pricing occurring 🥰🥰😇🤩👍
The benchmarks on Tom's Hardware show that x264 and x265 encoding see a slight improvement despite the lower clock speed, so it would be cool to investigate what the improvement would be at the same clocks. More cores is best, but it could give some insight into how much 3D V-Cache could contribute when it's eventually included in higher-core-count CPUs (though not on AM4).
OMG! Thank you for vertical bars near graphs showing the time left until next slide! Such a neat detail!
"AM4 dies"
A year later and that phrase hasn't aged well at all. 😁
Oh phew. I was worried about the code compile test, since then I would have been tempted to swap out my 3950X. Because yes, I hate waiting for stuff to compile enough to pay $700 to get that as low as it can get.
Sounds like a 5950X will be in your future unless you build a whole new box.
@@johnbuscher Probably, yeah. So far it was just a bit too expensive for the improvement it would have given me so I've been holding off.
Pone
@@aquahertz6922 Pnoy
The 5950x is down to $550. Very tempting. So is the 5900x at $380.
Got a 4080 for myself for Christmas; currently using the 3700X, and these benchmarks made me quite certain I'm gonna pull the trigger on this CPU. My AM4 build will get a nice boost, so I can wait for AM5 parts to become more affordable.
Gotta remember this is the first time anyone has done anything close to this, and I'm fairly impressed. If they end up allocating resources towards this general idea, I'll probably end up getting one when I upgrade mine in approximately 6 years lol. The 3700X is gonna do me just fine for a good while.
Definitely curious how they're going to take this forward & improve on it! Seems like a decent high(er)-end gaming option.
Steve, your voice is getting better and clearer now. Ty, and it's getting less boring.
Yeah, modern games are much smarter about using the cache as much as possible (usually at the engine level), often relying on batched rendering calls and SoA (structs of arrays) layouts to keep data close together, and running function calls over all the data at once at the end or beginning of a frame, maximizing the likelihood of it entering cache and being reused there the entire time that data needs to be referenced. Older games were still primarily object-oriented rather than data-oriented, so games like CS:GO will likely see very little benefit from double or even triple the cache, as they don't bother with cache optimization nearly as much. Modern games, on the other hand, will likely continue to see massive benefits from more cache. Unfortunately we can't just trade raw performance for cache, as there are too many applications that still need the performance, and many people wouldn't be happy to see 10% improvements in some games while getting 20% reductions in others that aren't cache-heavy. On another note, though: this CPU will show people which games and applications cared about cache and which didn't, as you'll see a significant difference between those two types of application with it.
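To make that layout point concrete, here's a minimal sketch of AoS vs. SoA (the Particle types are hypothetical, not from any real engine): summing one field in the SoA layout streams a single contiguous array, while the AoS loop drags every unused field through the cache along with it.

```cpp
#include <cstdio>
#include <vector>

// Array of Structs (object-oriented style): fields interleaved in memory.
struct ParticleAoS {
    float x, y, z, mass;
};

// Struct of Arrays (data-oriented style): each field stored contiguously.
struct ParticlesSoA {
    std::vector<float> x, y, z, mass;
};

float sum_mass_aos(const std::vector<ParticleAoS>& particles) {
    float total = 0.0f;
    // Every cache line fetched here also carries x, y, z we never read.
    for (const ParticleAoS& p : particles) total += p.mass;
    return total;
}

float sum_mass_soa(const ParticlesSoA& particles) {
    float total = 0.0f;
    // Streams one dense array, so nearly every byte fetched is useful.
    for (float m : particles.mass) total += m;
    return total;
}

int main() {
    std::vector<ParticleAoS> aos(1000, {0.0f, 0.0f, 0.0f, 2.0f});
    ParticlesSoA soa;
    soa.mass.assign(1000, 2.0f);
    std::printf("AoS: %.0f, SoA: %.0f\n", sum_mass_aos(aos), sum_mass_soa(soa));
    return 0;
}
```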
No clue what half of that even means but I concur 👍
Okay, nerd.
jk, we're all nerds here
This is a big win for AMD in the efficiency department. Now, wouldn't it be interesting if Intel put 3D-stacked cache on a 12900KS? Why brute-force FPS with more wattage and a higher power bill when you can finesse the same FPS or more with an elegant implementation of cache?
Considering AMD had to drop frequencies and implement strict voltage controls to be able to stack the cache, would Alder Lake perform well enough under that kind of voltage control at lower frequencies?
The problem there is heat. AMD specifically said a major reason for the clock reduction in the 5800X3D is the total power draw and thus heat output. Remember, the cache physically sits on top of the cores, so it sits between the cores and the cooler.
Looking at how much power even the normal 12900 can draw, that would likely be too hot for the 3D cache.
I think the problem for Intel is yields, as their die is monolithic. The resulting price after defects would be far too high per chip.
Steve you and your team are fkn soooooooo good!
Thank you so much for your non-subjective results!
I would love to see the comparisons for their server CPUs using the 3D tech.
Can't OC, meaning you'd better hope it has good cooling to make up for the trapped heat.
It's great for games that like cache; it's just the same as a 5800X on a compute level.
Milan-X is the name for AMD's 3D-stacked server parts, I believe. Demand for it is absurd, by all counts.
No need to worry about cooling on those parts. Server-grade blowiematrons are perfectly capable.
ServeTheHome already covered the server 3D V-Cache chips, Milan-X. They're monsters with 768 MB of L3 cache.
What a chip, and what a way to end AM4. Currently debating upgrading to this CPU to replace my 3600.
I have a 3600X and am debating that as well; I think it will be a great investment.
I just ordered a 5800X3D to replace my 3600 👍
@@prashank Great! 👍 I have an R5 3600X and will order one too, as soon as I can :)
Replaced my 3600 with the 5800x3d and it's an amazing upgrade.
@@denikec Did the same. It is a great upgrade. Enjoying the 5800x3D, last CPU upgrade for my AM4 system :)
To post a comment or not to post a comment. Not sure if the GN team even reads these, but just wanted to say: appreciate the content quality of this channel.
I see this being amazing for small-form-factor gaming builds.
You get the best for gaming while also running cooler.
As well as just gaming-only builds in general.
Makes me wonder if Zen 5 will use more cache for easy gains. Or some Zen 4 CPUs.
Or APUs. Those could be interesting too.
Considering swapping out my 5900X for a 5800X3D because my ITX build can't seem to cool it properly.
Not the best for 4K.
@@Safetytrousers Maybe, maybe not; it depends on the workload... Even 4K is CPU-bound in MS Flight Simulator, for example.