Not sure why the chapters aren't showing up, but anyway, they are in the description. Also, regarding PBO: turning on PBO for the 9950X actually HURTS performance. Several other content creators I spoke with prior to launch saw this same behavior. It's also an overclock that voids your CPU warranty (still not sure how they would know), so no, not showing it.
@@flimermithrandir That's actually not why, it's because chapters have to be 10 seconds or longer, and I had 5-second chapters showing the charts. Once I removed those it worked. Adding the 0:00 didn't work on its own.
@@Jayztwocents Ok, strange. But it does say in the tutorial that it needs to start with a 0:00 or it won't work. Maybe that's an old leftover and it now works without that as well. Anyway, good that it works now :)
Jay, you gotta do CS2 because it's huge, and it's one game played primarily at 1080p and 1440p, on monitors now hitting 500Hz+. But people are also running high-refresh 4K monitors, which are becoming more common and will be a focus in the future (as sizes increase too). Knowing the averages, highs, and lows would be great. It's just a title that's not going anywhere. Add it back to the suite please!
I am a 3D/motion graphics animator. At 25%-33% faster than my 5950X, this looks like an AMAZING upgrade. What I would like to see is a build with the new 9950X for Phil, to see the time difference in production tasks compared to what he is running now. Complaining about the 9950X's gaming scores is like saying "This Ferrari sucks! When I go to the grocery store it can only hold half the number of bags my SUV can hold."
I really wish someone did benchmarks for simulations in Blender or something. Way more interesting than rendering in Blender. They could just compare the time it takes to bake a scene.
Yeah, I don't think people realize that when you're making your living off of it, a 5% upgrade is going to be well worth 650 dollars. You're likely to make that back in less than a month if that is your livelihood. Now, the 9700X and 9600X make no sense really, but the other two are amazing for productivity users.
@@Jayztwocents- I know you said you need more graphs, but don't go too far down the GN route. I think you've got the right balance between information and entertaining discussion. Some people will push for you to be more like GN, but some of us also find GN too dry to watch lots of their videos. I don't need to see 50 graphs telling me gaming performance; I trust you to do sufficient tests to draw your conclusion, and then I only need to see enough to understand why you've drawn it. I think you've found your niche in the market, lean into that even more.
@Drunken_Horse I would like to see a little more commentary when showing the graphs. A lot of times I'm doing other stuff while watching videos so the narration of the results (like what GN does) is nice.
@@Jayztwocents Not sure if it would be too much to add, but I would love to see some type of productivity/video editing additions (Puget has some preset testing suites for Davinci Resolve and I believe Premiere/After Effects). I know you're a primarily gaming focused channel, but would be a great addition to see Phil jump in to talk about some video editing stuff with the CPU/GPU benchmarks. Thanks for everything!
Can you color code the bar graphs? It makes it a little easier to read at a glance when you have 21 lines on like 30 graphs. Great video, thanks for the info. Love your content.
Quit giving me excuses to stay on my 5900X. This was supposed to be the reason to upgrade. I wonder what the results will show on the new X870E chipset.
I mean, we are saving money by staying on AM4 and not upgrading. This just gives me hope because I just upgraded to a 5800X3D two months ago to keep my RAM and motherboard. I do not need my PC to do renders hella fast. As a video editor, I just want to be able to work and play games on it after 5 PM, and it does both things very well despite the X3D chips performing worse for productivity.
I'm on a 5800x3d and for two generations now, I see no reason to buy a new motherboard/cpu. I game at 1440p ultrawide with a 3090, and for most of my games, I'm GPU limited. None of this matters, whatsoever if you're not gaming at 1080p on medium. I think I might see some CPU bottlenecking around the time the 6090 comes out in 8 years lol. The 4090 won't fit in my case so it's not really an upgrade path for me without buying a new case as well and redoing my entire watercooling setup. I'm hoping the 5090 is a little smaller, and if not, I'll probably do the full upgrade for 6000 series.
Can you compare it to the 7950X's performance at Zen 4's launch? Maybe we can predict whether Zen 5 will keep getting better with BIOS updates. AMD loves to ship unfinished products and then give updates so they age like fine wine.
Well, gamers shouldn't be looking at any other CPU. The 7800X3D is a beast, and it will still make sense 5 years from today. I don't think the 9800X3D will be any faster, 5% max, so grab a 7800X3D and start gaming at high FPS already, boys.😊
Pretty boneheaded decisions made with the pricing on these CPUs at a time when trust in Intel is on the decline. It's been frustrating seeing AMD not taking advantage of the situation.
They are taking advantage of the situation. AMD has a functional monopoly until the microcode update is proven to work, so bad value products are expected. AMD just forgot that it has to compete with itself too.
Yeah, because a CPU line up launch that had been planned for a long time will be changed on super short notice because in the weeks leading up to the release your competitor fumbled the bag. There just isn't enough time to change the plan.
Glad I just went with the 7950X3D back in August when it was $524. Huge upgrade from the 4930K I'd had since 2015. Your reviews and insight are very helpful on new and upcoming products.
Just bought a 7800x3d and can't be happier. Amazing performance in a SFF build running under 80c in games. We will see what the x3d chips offer in the future from the 9000 series.
Nah, the 5800X3D is still awesome with some PBO tuning... $180 for a B650 motherboard, $350 for a 7800X3D processor, and $100 for RAM, that's almost $650 for a few FPS gained?
@@SkrappyIE I'm expecting at least 2 more gens after Ryzen 9000, and I think AMD owes us that, seeing as AM4 got 5 generations and none of them were duds like Ryzen 9000. The 5800X3D is only better value if you're already on AM4; if you're switching from Intel or a really old PC, the extra cost of the 7800X3D and DDR5 is kind of negated by the massive performance increase.
I mean, Micro Center is offering $500 bundles with a B650 Strix and a kit of RAM. In July they were offering the 7800X3D with an X670 TUF and a RAM kit for $500, and a $1000 bundle which included a 7900 XT reference design. At that point I look at my 5900X/X570 Tomahawk build and consider selling it on Jawa/OfferUp in exchange for one of those bundles, versus paying damn near $300 for a 3+ year old CPU. And that's $300 at Micro Center, $340 elsewhere, for a 5800X3D. It's almost as if MC knew consumers are seeing these CPU reviews and going "oh shit, I might as well buy now," since waiting for next gen = Intel voltage creep/dud K-series chip/board combos and a 9000 series re-badge. XD
So glad to see the 5900x represented... Shows me that I am still current and my losses aren't that bad compared to the new shiny bits. I run it and a 3070 for production purposes. Music and video editing. For me, the gains do not warrant the costs. Thanks Jay and Gang!
One question that no review addressed: is the 9950X better (more stable, higher RAM frequency, etc.) than the 7950X with 192 GB of RAM, or even 128 GB? That could be the main advantage of the newer generation for many professionals.
I honestly kinda like this CPU as I am thinking of upgrading from a 3600, although I would (and I will) wait for the X3D variant. But maybe for my home server... Still, I agree that other than efficiency there is nothing impressive.
You guys have significantly stepped up your testing methodology, Jay. The inclusion of CPU power consumption, CPU frequency (even comparing frequencies between CCDs!), and temperature in your recent CPU reviews is fantastic. Kudos to you and the team for the effort and attention to detail. Keep up the great work!
AMD, just cause Intel fumbled the ball doesn't mean you should've too, now everyone's just skipping this generation CPU wise (Which honestly is fine by me cause I can't afford to upgrade anyways/my rig is perfectly fine as is. xD)
Same here. Got a 5600X/6800 which is still doing exactly the things I need it to do. Decided just this morning I will upgrade somewhere in April 2026 or later, but no sooner.
My 5950X does 30800 (R23) while pulling 90W (hours of core tuning, stability tests, hasn't crashed in years), best chip I've ever owned. I was thinking about jumping to the 9950X to greatly improve Handbrake times. I mean, the gains are good, but yeah, Zen 6's gonna be it; skipping the 7000 and 9000 series. Great video, nice testing.
There isn't much reason to upgrade every generation unless there is a huge uplift in performance or dramatic improvement in power efficiency these days.
That's one thing I don't get. "This Generation isn't 8 times better than the last one, what a waste" Yeah. So stick with the last Generation. Upgrading every 1 or 2 generations has been a waste of money for a while now, it's not a secret.
Nobody is interested in "dramatic improvement in power efficiency" these days, as people in the comment section believe that AMD GPUs are mostly the best buy because they're a bit cheaper with 5% extra raster performance, while Nvidia alternatives eat 100W less on average. Apparently, power efficiency is super important only in CPUs.
@@someperson1829 Any power efficiency claim is a tall tale; you're not actually getting what you paid for. If the spec sheet is pushing 65W for 5.5GHz, AMD knows this, and clearly what you actually end up buying is 65W and 4.4GHz. It's pretty much a cash-grab scheme. This launch has made them look terrible; I will never purchase from them again. Their marketing on the GPU side is pretty tragic too, as they claim to be head to head without being the go-to name like NVIDIA, and say that it uses less power, etc. But we all know the open-source tech on a GPU is dogwater, and it only works in some titles while most are strictly developed on NVIDIA. Instead of trying to match a card and still failing (the 4090), they would need to push out technology way greater than a 4090. They actually need customers to see that perhaps they are a better, more powerful company. But sadly AMD does not have the balls to do that. Why? Because they need efficiency to back up their marketing.
On the last AMD video I commented about the charts not indicating whether higher or lower was better. I'm pretty sure I was not the only one pointing that out. Sometimes the charts can be overwhelming (not the tester's fault). It felt really refreshing to see it addressed in this video. Kudos for reading the feedback and taking notes on how to become better. Nice video!
I'm definitely thinking of switching platforms and retiring my 12600K for the 7800X3D. Unfortunately, when it comes to Intel, I don't have much of an upgrade path due to 13th and 14th gen CPUs being defective. I just don't want to get something like a 13700K and risk having stability issues.
@@TheBeretGamer They're not defective. It's the microcode/voltages causing the issues. There's a new microcode update which is meant to fix it, but the one true fix is just undervolting.
@@XFXGX I've also heard from many others that the microcode patch and undervolting will help, but it's not an ultimate fix. From my understanding, the 13th and 14th gen Intel CPUs are physically flawed, as the silicon wears down and degrades way faster than it should (partly due to high voltage, but also due to the quality of the silicon itself), leading to instability issues.
Hello Jay and team! Great video; I know it's a lot of work to get these results. I just wanted to follow up on your tweet and the mention in the video about upping your game, so here is my feedback:
1. Although it's great to have a lot of CPUs for comparison, I think more than 10 CPUs in a chart is difficult to follow, as there are just a lot of numbers there. Maybe it's not the number of CPUs but how the results are dissected.
2. One thing that I miss is a value chart or a value segment. In the past you had dollar-per-frame for GPUs; I wonder if something similar is possible with CPUs, even if it's not FPS.
3. With CPUs, a "best pairing" slide would be nice, meaning the chipset and CPU combo, now that we have different chipsets that can run multiple CPUs but not at their full capacity.
4. A real-world experience mention: how stable was it? Were there issues? How many? I think that would say a lot about what the user can expect on a day-to-day basis.
This is just my opinion, maybe I'm alone in this, but I wanted to reply to your request. Cheers guys, you are doing great!
I think it is really that these aren't AMD's gaming chips. They can obviously do it, but it isn't what they were designed for. Games aren't going to use 16 threads, so I get why they have different clocks on each CCD, though they may need to adjust their scheduling better for it. The 9900X and 9950X are cheaper work CPUs; I am sure a game developer would do well with these, especially if they can't afford Threadripper. The 9700X and 9600X seem to be more general workstation processors, basically for people who do 90% of their work in a browser and an office suite, which would make sense for the lower power limits, as power matters a lot more when you have to worry about powering hundreds of them in an office building.
But that is the thing: these are processors for office use being reviewed as gaming chips, which is the root of the problem in my mind. AMD themselves have made the comparison, so it is fair, but I don't think it is where their strengths lie. What do gamers want then? Well, it is simple: these chips are not for you. You should be waiting anxiously for the X3D chips that I assume AMD has coming. We already know that the X3D chips dominate in gaming, and the lower power and heat of these chips may be what AMD was going for, as those have been areas of concern for the X3D chips in the past.
I also want to point out that the prices are high, but I think people still aren't used to how high inflation was these past few years. $650 in 2024 is comparable to about $490 in 2015. Yes, that is still high, but it also isn't absurd, especially considering that Intel's 6800K from that era was $434 and was only a 6-core processor. And the 6800K wasn't the top of their stack, it was just the only one a sane person would buy. Intel was perfectly happy to put out the $615 (launch MSRP) 6850K, the $1089 6900K, and the $1723 i7-6950X.
Thank you for including the 5900X here. That CPU is what I am currently using in my main rig. If you are going to test AM4 CPUs though, it would be nice to see the 5800X3D or maybe some of the XT parts. Otherwise, good job Jay and team!
I am still on an aging 3900X as my "office" machine. Looks like I'm going to ride it out a little longer still. I expected much more from AMD this time around, and they completely missed the mark. I didn't go for 7900/7950X because the cost to replace the entire platform didn't feel right, and now 9900/9950x are basically side-grades from the 7000 series, so it's still a no from me dawg.
Thanks so much for including the 5950X, the best AMD CPU I ever owned, never had a single issue with it. I am more on the Intel side but see no reason to update from my 12900K on an Asus Apex motherboard.
What I want is to actually use my cores. Just because I'm in a game is no good reason to basically turn off half my CPU (yes, I know that's not really what it's doing, but to keep things simple, it also kind of is).
@@someperson1829 The processors will still fail...just more slowly! This is the thing they aren't saying about the fix. It also won't magically repair damaged processors. They are all still junk. Anyone buying one new is asking for trouble.
@@paradoxicalcat7173 Just more slowly - like any other CPU. Newsflash - they all degrade. By the time it degrades, you'll have bought a new one. I have a 14900KF at 5.7/4.5 GHz with adaptive voltage and an undervolt - HWiNFO max registered voltage ~1.28V.
Did you hear Wendell's comment? If he runs Cyberpunk on the 9950X in Administrator mode, FPS shoots up from 188 to 197. That is mad; sounds to me like an OS update should get better performance.
Hey Jay, thanks for the content. I bet there's a lot of people, like me, on the 5800x3d that are starting to look for an upgrade. It would be nice to see that SKU included in the charts.
The part you pointed out at the end of the video is a big reason why a lot of people just don't move up to AM5 or whatever Intel is doing. There's just not enough of a performance boost for the newest cutting edge hardware to justify the massive upfront cost (and the domino effect of making more upgrades to keep up) A friend of mine and myself use the AMD Ryzen 9 5900X; our lifestyles and demand from our systems just doesn't line up with modifying/replacing our still fully functional computers to make another one that costs twice as much, eats almost twice as much power, and triples as a space heater.
I believe that AMD's X-series CPUs are geared more towards professionals, and the X3D series is geared towards gamers. That's the best way to look at it.
A bit different from you: I believe Ryzen 9000 is not geared towards consumers. I think what they did is improve the power efficiency of the architecture for the EPYC lineup. Unlike in consumer CPUs, power efficiency in EPYC is much more desirable. When the EPYC numbers are out, we will see whether my prediction is correct or I'm just babbling mindlessly.
Yeah, I feel all these sorta "gaming" channels have given the processors a bad rap. It's obvious the new AMD processors are beasts when it comes to "work" applications, and they still have the best processors for gaming with their X3D series. Not sure what the outrage is about.
iFixit has been a partner for years and I still love when you announce them, Jay. Haha. Too bad it's expensive af to import their products to my country.
I am still running a 3900X and I would love to see a comparison of, like, all the 12-core Zen CPUs, or the 8-core ones. I think it would be great to see the generational steps of AMD.
I'm running one too. I was minutes from getting an X670 and 7800X3D today but didn't because I just know the X870 boards are going to drop before the sound of the cash register dies out. Also because I am not building two rigs. My gaming rig has always been my general/work rig, and that has never been a problem running any games at max settings, the way I build. This is the first time I'm forced by AMD to choose one or the other or run two different computers and it's really annoying. I'll wait for the 98/9900X3D and just hope it's not as crap at "everything else" in comparison as the current X3D chips seem to be.
You kind of addressed it at the end. I know you can only do so many CPU's / Platforms, but 5800x3d (and maybe even 5700x3d) is relevant to this conversation. For the x3d drop I'd say this is more relevant than the 5950x you had here. Many who upgraded their AM4 platform to this had a case to skip Ryzen 7000, and are thinking about at what point it makes sense to jump to a new platform.
for productivity, this CPU rules! I mainly work with 3D software and do a lot of simulations and do not play games at all, so this CPU is a warm welcome for my long awaited upgrade. Hate the core parking drama though.. Microsoft Gamebar is just horrible.
Kinda amazed looking at the 14700K, since I got it open-box for less than $300, with no problems since I've run it undervolted from the beginning. The fact that it is able to keep up with faster CPUs at nearly half the price (or less) makes it such a good deal for productivity. I wonder why people hate on it so much. The AMD efficiency of my other build is really good, though, but the cost effectiveness of the 14700K is amazing.
AMD is leaving the door wide open for Intel to make a comeback. With the same performance as 14th gen and old school reliability, Intel will have a smooth ride.
Reliability? Old school? Where have you been when the 13th and 14th gen i7 and i9 were crashing? And AMD is currently the old school because they didn't bother to add E cores to their CPUs. Intel literally had to disable AVX512 on their new CPUs starting from the 12th gen because they had it on P cores and didn't have it on E cores (so running an AVX512 program on an E core would crash it). AMD just added it in Zen 4 because they don't have this problem with E cores. Good luck with that room warmer. I'll just stay chill.
Don't count on it. The 7800x3d is a 7700x that is clocked lower with the cache stack added. 9800x3d will absolutely perform the same as the 7800x3d and cost $150+ more.
If they are using core parking on the normal chips, I doubt they will use dual 3D cache on the 9950X; there would be no need unless they allowed all cores to game.
I think both Intel and AMD need to stop releasing chips so often. These tiny incremental changes are just a waste of time and nobody is impressed. I really don't see why this is necessary in 2024. GPU product cycles last years and when new ones are released, they are actual big improvements.
I have a 7950X3D, but I've always been an overclocker. I wanted a 7950X, but poor power efficiency kept me away. The 9950X seems to have addressed my issues, and now I'm interested. People seem to forget that scheduling, core parking, etc. go out the window with all-core overclocking. It basically zeroes out any latency issues, as you have no CCD switching and no sudden frequency spikes that often translate to latency spikes that feel like microstutter. This is the only way to minimize latency for 4K gaming, as the X3D cache doesn't impact gaming performance at higher resolutions; you don't notice a difference in FPS, you notice it in crispy, snappy gameplay.
Most Windows YouTubers: Zen 5 is "meh". Most Linux YouTubers: Zen 5 puts Linux on top of some gaming benchmarks over Windows. WILL SOMEONE INVESTIGATE WHAT IS GOING ON!
Lack of a proper CPU scheduler on the Windows side. Microsoft is in real minimum-effort mode with Windows these days. If it's not something that will wring every last fraction of a cent out of the user via ads or data collection, they don't care about it.
Imagine this "scheduling" as the standard in a Windows 11 landscape, with DX12U-oriented/lobbied/paid-for games and engines, and a world with little Vulkan. This is the only thing Intel and NV are hoping for.
I’m not a gamer, just a photographer and fine art printer. So, even with all the bad news and confusion re the Ryzen 9000 series, I decided to go ahead and replace my 7900 with a 9900X anyway. I have an MSI MPG B650i EDGE WiFi motherboard in an open case with an ID-Cooling SE-207-XT air cooler to which I have attached a second fan. The 7900 runs with DDR5-6000 memory, Game Boost, and the TDP elevated to 105W. It pulls 142W running Cinebench R23 with a high temperature of 79C for a multi-core score of 27550. When I replaced the 7900 with the 9900X, with the BIOS cleared except for EXPO, Cinebench R23 scored it at 32214 with a high temp of 81C. That’s a 17% improvement, much better than I expected. Think I’ll be keeping the 9900X and seeing how much I can wring out of it with a better cooler. I have no intention of playing the core parking game.
BTW, B&H is selling the 9900X for $50 less than everybody is reporting. They also pay the tax if you use their PayBoo card. No connection to B&H, just a happy customer for many years.
Some good news re the 9000 series is overdue. I appreciate that gaming drives the technology, but other users make up a large part of the marketplace, so it seems wrong for gaming to influence the whole picture. Even though AMD fumbled the rollout, the 9900X is still a great CPU for non-gamers.
I am glad I got desperate and bought a 7950x3d about a month ago. I was waiting for the 9950x but found a good deal on the 7950x3d. Really dodged a bullet there.
@@jojimonty2506 Absolutely! The fact that in workloads the 7950X3D isn't far off from the competition, and in gaming it's the top dog, really means this CPU will age gracefully. The 9950X seems like a total mess.
@@jojimonty2506 I'm considering going to the 7950X right now, but my needs are about 50/50 gaming/workstation. Upgrading from a 6700K. Don't think it's worth thinking about the 9950X. Maybe even the 7950X3D should be a consideration for me? Not sure since I'm pretty much doing a 50/50 split.
@@zombizombi I recommend the X3D then. I use my station for 50/50 as well; it's solid in Premiere Pro/After Effects, Blender, and my laser software. I do some AI stuff too, and it's pretty quick compared to my old 5900X. I am very happy with the performance of it, although it was a bit difficult to get running stable. My best Cinebench R23 score has been 37644, which is pretty good I'd say.
Dude, I started my computer career in 1990. So, yea, the new CPUs are amazing. What's more amazing? Storage. I remember paying 400 bucks for a 40 meg hard drive back in 1990 for my 286.
It's been a while since I really watched any PC part/performance reviews, it's nice to see where things stand right now between different cpu's. I will be waiting for the upcoming tests to see how the ram speeds work out since a friend of mine is wanting to get a PC. He's not dead set on a time frame to build it, but if the 9000x3d stuff isn't yet released by then I'll probably push him towards the 7800x3d.
People are mad because they felt that AMD should have really taken advantage here because Intel messed up. But the thing is that this AMD processor was probably built and ready to go months before Intel even messed up. However, if AMD messes up next year in 2025, then we can really spill hate on them. Right now is AMD's time to market the product until launch; that's just how the cycle goes.
Exactly this, it's like people think AMD knows exactly when Intel would make a monumental mistake and somehow schedule all their development and validation solely based on that. This chip was probably in the works for the past few years, and the next one is already underway. They have a roadmap, and they aren't going to change everything at a moment's notice because of a suddenly, and recently exposed flaw from its competitor.
It's the same way Nvidia doesn't take advantage of AMD's GPU department while AMD is winning the CPU market. Why would they launch their best CPU when they don't need to? Everyone is going to buy it regardless of what you launch, and your competition isn't a threat for now. AMD is waiting for Intel's new CPUs before answering, like every company would do.
@@DecemberPlaysX It does. Chips and cheese has a nice article on Zen 5, and it has some 'significant architecture changes' to it, and those don't happen overnight. Seeing that the Linux people (Phoronix, Wendel@Level1) love the performance increases, I'd bet the undershoot is more Windows than chip. We'll see.
I don't understand where people even get this thought process from. There have been generational improvements in every single Zen release. Zen 5 is now almost stagnant, like Intel's 14th gen. It is completely unexpected and also completely irrelevant to any of the situation with Intel. This release does not follow the pattern AMD set expectations for, especially with their marketing. It's false, it's misleading, and it's disappointing.
Serious gamers are waiting for the X3D chips anyway. But I think this isn't a wasted launch. AMD is trying to balance their processors and maintain efficiency. That's not a bad thing. The top end processors are more thirsty, of course, but still a bit more efficient than the previous chips.
The target for these chips is not those who purchased 7000 series chips. The target markets are 13th gen and 14th gen Intel owners who are sick of Intel giving them the runaround. The other target market is those with older Intel and AMD processors, say early Ryzen or even early Intel, who are looking to upgrade. If you're buying a new computer, the Ryzen 9000 looks more attractive. You also might look at the Ryzen 7000 chips, especially the 7800X3D. These are not a waste, and AMD is trying to avoid a situation like Intel has, with hot, thirsty processors pushed to their limits that end up damaging themselves, like the 13th and 14th gen.
I think AMD is being smart, but I don't intend to get a 9000 series processor; I intend to go 7800X3D for a long time. I am looking forward to where AMD will be with future generations of Zen-based processors and future iterations of these Ryzen processors. I have no problem with Ryzen in my desktop and Ryzen in my laptop with my 4800-powered laptop, which is a much older architecture, and it shows. It's great for my purposes, though, and I don't see myself replacing it for a few years yet. But when I do, I will likely stick with AMD if they keep up their pacing and balancing of power efficiency and raw horsepower. My previous desktop and laptop were both Intel, and I feel like Intel has violated my trust with how they have handled 13th gen and 14th gen in particular. With their current business problems too, I am reluctant to go Intel.
So I am still impressed with the 9000 processors, but I think sticking with the 7800X3D or even other 7000 series processors is not a bad idea. If you don't have a 7000 series, though, then the 9000 series processors are more attractive.
Or, you know, me upgrading from a 6700K in total fear of Intel's last two gens ;) Though even in my case, cost considered, going to a 7950X is going to be better than a 9950X. The actual tough choice is between the X3D and X versions, given my use is an equal split between gaming and workstation.
@zombizombi exactly... These reviewers seem to think we're constantly upgrading and that we always need to buy the latest thing. There's a place for the 9000 CPUs given they are more power efficient, but if you get cheap power, then the 7000 chips like especially the 7800X3D, are a more attractive option. Even my 4800 in my laptop is actually really great for work and gaming. So the chips on the market are all super options, and they get the job done.
@@zombizombi That's why I prefer Intel: you don't need to choose between gaming and productivity, it's just great at both. Yes, the last two gens are a mess, but undervolting is not a hard thing to do. All who did UV are rocking their Intel just fine, including myself with a 14900KF. In fact, sorting out AMD core parking is more time consuming than undervolting your Intel.
@@someperson1829 Yeah... I just spent the last few hours understanding what core parking is... Now I'm even more confused... It would be so much easier to go with Intel's single-die designs and cache... Now I'm gonna look into undervolting to fix the issues and the oxidation problems Intel has... Sigh.
Hey Jay, this is only a suggestion on your reviews: in the results charts, highlight (with different color of the bar) the reviewed CPUs for better reading 🙂
I really appreciate hearing about the potential to use these chips for non-gaming purposes. A lot of us only have one PC and it has to be dual use. I don't think that a majority of PC users have a rig dedicated only to gaming. It is great to hear the parallel use potential of these cpus.
That's me. My usage is basically 50/50. I don't know whether I should go with 7950X3D or 7950X. I guess I just need to decide whether I lean towards gaming or workstation more. (note i wrote in the wrong cpus but i edited them now). Coming from a 6700K I doubt I'm going to be disappointed with either.
Feels like AMD is just going through its own phase of becoming more efficient, like one of Intel's recent generational jumps did (I can't remember off the top of my head which one had no notable performance increase, but mainly an efficiency increase). It was interesting when you mentioned that AMD did their testing using a 7900 XTX, so I think a good video would be a comparison between the two GPUs just for sanity checking, since the numbers from other reviewers are kind of all over the place from what you were saying. Just a request for graphs, or more of a "would be nice to see overlayed or side-by-side": a comparison of power draws on similar tests for either upgraded CPUs (i.e. 9950X vs 7950X) or AMD/Intel equivalents, since efficiency is what the new hotness is pushing for (not a big issue worth complaining about by any means). I do think a good sanity-check video would be testing the full AMD CPU/GPU combo and seeing if there is any notable difference, because that would be interesting to see, but that's just me. I like my Jay videos 👀👍
Why are the tests being done at 5200? This media cycle is churning at AMD's expense if that spec isn't from AMD. Not trying to be negative, I'm honestly curious. I do appreciate that the tests will be redone at 6000. If AMD is saying test at 5200, that's a really interesting take.
Cuz Jay sucks at DDR5 RAM setup, or doesn't care to do it at all, or won't hire someone to do it for him. There's no other explanation for why all his AM5 videos use RAM so bad that the CPU is shot in the foot. That's not new at this point, but it is indeed very disappointing.
@@daishirokuma I don't remember who, but a tech content creator covered that point in a video this week. Plus Jay also mentioned in the video that he did the testing at the suggested frequency.
Was hyped to have this replace my 3900X. I guess I'll wait longer, since my current setup is still doing everything I need without much demand for more. Was just gonna go for the upgrade to justify going AM5.
Sitting on a 1800X and a 1080 Ti overclocked up the wazoo and water-cooled. I'm only now really looking into upgrading, and the most sensible option still seems to be a 5800X3D and a 4080 Super, precisely because of the AM5 snowballing. Heck, I don't even need to change anything about my cooling solution, because the setup should have zero trouble keeping the new CPU and GPU peachy. And investing "just" ~1.500€ (1.300€ for the GPU and 200€ for the CPU) into this once-5.000€ system, keeping it snazzy for another 3 to 4 years, seems just that much more enticing than anything AM5-related at the moment...
Love your vids, this is another good one. If it wasn't for folks like you, I'd likely just buy the most powerful thing available like I used to, instead of trying to figure out the best bang for the buck, but it's not as necessary as it used to be, especially with CPUs. I'm ready for a CPU upgrade in my primary rig... still running a 4.8GHz 4790K and it's showing its age. I really want to know about those new P-core-only Intel CPUs; not sure if they're available yet, but I'm really thinking about getting a 14901KE instead of either a 12900K or 14600K. I'd go AMD, but I already have 2 other LGA 1700 rigs and I want to keep my infrastructure on the same socket in case I have any problems, so I can just swap parts. I'd completely switch to AMD but I don't have the time or money to replace everything. Depends on the price/performance though. Feels weird having a primary rig that's significantly slower than my others, but this is the most efficient setup for how I use them.
These CPUs weren't designed for gaming, that's the X3D ones coming later. Overall the productivity and workload is an improvement over Intel 14th gen so win for AMD overall.
@@_J3RK_ on the Alienware site their top of the line R16 system (essentially max everything) performed no better than my several years old R10 system when doing the same CPU intensive scripts.
Hi Jay, great review! Thanks for your efforts. I would really like to see the reviewed products highlighted in the performance charts. It's sometimes hard to keep track without pausing the video and hunting for the CPU, since it's sometimes at the top of the chart and sometimes in the middle... Secondly, what I really miss in you guys' videos is timestamps. They would be nice especially for long videos such as this one, so you can find the crucial information more quickly and skip a little where needed. Cheers!
Because it was marketed as also being great for gaming. Is that really so hard to understand? People just don't like AMD talking sh't that's so far from reality that even calling it misleading is being generous.
I don't know what the general feedback on the graphs is gonna be, but I would have liked the CPUs sorted by brand/gen rather than performance, as looking for a specific chip on every slide was difficult. (I was following the AM4 chips in particular, and from one test to the next I sometimes struggled to keep an eye on them and analyse their performance before it went to the next slide.) Thorough test though! 👍
No sane person runs a 13900K or 14900K at 253W. Remove those limits with the newest BIOS, lock all cores at 5.6 or 5.7GHz, and suddenly the 9950X loses almost every benchmark you performed.
@@Longlius I literally suggested locking the cores below their advertised boost speed. Wipe your oil-covered glasses before you make a fool of yourself, mr sweaty prince of bacon and gravy.
No thank you, I already have a 13700KF that's now dead which Intel won't replace. I'm not interested in beta-testing more engineering samples from team blue.
“Just lock the cores, PBO, tune the chip and then…” OR you accept the chip's raw performance out of the box and view it like 95% of the people buying the thing are gonna do. Put it in and go.
@@Jayztwocents It seems sensible to use one color on the chart for the CPU being reviewed, and a different color for its direct competitor or predecessor. Like in this vid, highlighting the 7950X, 9950X and 14900K would be a good idea.
I'm more curious how the new X3D chips will do once we get more info and they're eventually released. Currently running the powerhouse 5800X3D, but I have to 100% agree that I feel like I'm in a holding pattern for upgrading to new stuff (which is why I got the X3D in the first place). My current issues with the market: 1. As Jay pointed out, there's not enough uplift on CPUs to make an entire platform upgrade worthwhile. 2. I really, REALLY want to upgrade my 3060 (I want to take advantage of frame gen for Microsoft Flight Sim), but the GPU market still feels like crap, and with the inevitable launch of the 50 series soon, I'm *hoping* they'll do a better job with those than they did with the 40 series. The rest of my build seems solid. Plenty of M.2 storage, my EVGA power supply, while old, is still rocking hard, and I have a pretty decent 280mm AIO from Corsair for my CPU that I'm sure I can transfer over if I eventually get new stuff...
AM I MISSING SOMETHING?! Can you please show me the person who is looking to buy these CPUs and a 4090 and then play at 1080p on medium settings? This is as synthetic and useless a test as it can be. Like, who benefits from it? Why test it like that? How can you even say “hope it helps you to choose” after that🤣 Edit: Dudes, I understand WHY(that’s why I called it synthetic) they do it. I am trying to ask, how is it in any way relevant for a real-life scenario and how does it "help you make a choice", as he says, when buying it?
At 1080p the GPU can compute the frames at an insane pace, which means it will demand data from the CPU at a faster rate, thereby stressing the CPU more, specifically in instructions per clock in most cases. At higher resolutions you're no longer stressing the CPU nearly as much, instead it's the GPU that you stress. This is why for CPU they test at lower resolutions and for GPU they test at higher resolutions.
@@LiamMartinKeane Aren't you proving my point? I do understand why they do it. So it's a synthetic and useless test for gaming, because you're GPU-bound 99 times out of 100. That's what I was talking about. And also, if you can buy a 4090, you're probably not playing at 1080p 🤪
Testing at 1080p with a 4090 puts MORE of the workload on the CPU (to keep up with the gpu), which is useful to show how the CPU performs. Testing at 2k/4k hides the specific performance of the cpu. Any channel testing at higher resolutions is just to accommodate people not understanding this concept.
Exactly. Think of it like a stress test. Pretty much any CPU can do general desktop office things nowadays, so the way to test them is to find tests that stress just the CPU. This can be artificial benchmarks (Cinebench, Prime95 etc) or gaming (1080p Low). In the same way a user isn't spending their whole life running Cinebench, they aren't expected to game at 1080p Low. It's telling you what the CPU is capable of, not what a regular user would see - that was "good enough" years ago
(of course, this raises questions about why people are convinced by companies and reviewers to buy stuff they don't need... But I'll stop as I would just sound like a grumpy old man at that point 🤣)
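The GPU-bound vs CPU-bound point in the replies above can be sketched as a toy model (the frame rates below are hypothetical, not measured data): the FPS you actually see is capped by whichever stage is slower, so dropping the resolution raises the GPU's ceiling until the CPU becomes the limiter.

```python
# Toy bottleneck model: delivered FPS is limited by the slower of the
# two stages. All numbers here are made up for illustration only.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate delivered is the minimum of CPU and GPU throughput."""
    return min(cpu_fps, gpu_fps)

# Same CPU (can prepare 250 frames/s) paired with a fast GPU at two resolutions:
fps_1080p = effective_fps(cpu_fps=250, gpu_fps=600)  # GPU has headroom: CPU-limited
fps_4k = effective_fps(cpu_fps=250, gpu_fps=120)     # GPU saturated: CPU hidden

print(fps_1080p, fps_4k)  # → 250 120
```

At 1080p the 250 vs 600 gap exposes the CPU's true throughput; at 4K every CPU faster than 120 fps would score identically, which is why low-resolution results differentiate CPUs.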
Appreciate the testing that you all do! Planned on keeping my 7800X3D and 3080 Ti for a while and only upgrading to a 5000-series GPU when the time comes. My only complaint with this chip is I wish I could get my OC to not dip below 4000MHz without locking all cores. Cheers!
@@SkrappyIE Hey! I would say gaming is great overall, however, I am using a 65 LG OLED 4k (was it a C1 or C2? can't remember). Anyway, I find myself still pulling things back which I assume is the 4k side of things. A lot of Dirt Rally 2 in VR which now runs better since the cpu upgrade. But sometimes other titles like Mechwarrior 5 and Starfield don't always seem to be happy maxed out. The part about the chip going sub 4GHz, even on OC, is what is really bothering me.
Hi Jay, I watched your 9700X video, and AMD really borked the launch. So much so that I went to Micro Center today and bought the 7800X3D combo special for $499. I am upgrading/fixing my 5800X3D gaming system; it's been down for 5 months. Now to find the time to rebuild it (flush and fill the custom water-cooled loops). I hope to get it running in a couple of weeks.
The difference in frequency between the two CCDs at 19:00 isn't entirely unexpected, although it does appear to be a rather significant gap (~400MHz). You mention only seeing a ~50MHz differential with your 7950X, but it might be worth checking again to create a similar graph and see where it lands. Most reviewers (HUB, GN, TechPowerUp, AnandTech, etc) saw CCD1 clock about 200MHz lower than CCD0 in multicore workloads on the 7950X, a difference which also extended to the 5950X and 3950X before it, although the frequency gap varies. The higher frequency cores are always on the first CCD in each of these cases, with each CCD (CCX for 3950X) having its own preferred cores.
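For anyone who wants to chart this themselves, here's a minimal sketch of computing the per-CCD clock gap from sampled per-core frequencies (the core numbering and MHz values below are hypothetical placeholders, not real 9950X telemetry):

```python
# Hypothetical per-core clock samples (MHz) from a 16-core, 2-CCD part.
# On these chips cores 0-7 sit on CCD0 and cores 8-15 on CCD1.
cores_mhz = {
    0: 5250, 1: 5240, 2: 5230, 3: 5250, 4: 5220, 5: 5240, 6: 5230, 7: 5250,  # CCD0
    8: 4850, 9: 4840, 10: 4830, 11: 4850, 12: 4820, 13: 4840, 14: 4830, 15: 4850,  # CCD1
}

def ccd_gap(samples: dict, ccd_size: int = 8) -> float:
    """Average clock of CCD0 minus average clock of CCD1."""
    ccd0 = [mhz for core, mhz in samples.items() if core < ccd_size]
    ccd1 = [mhz for core, mhz in samples.items() if core >= ccd_size]
    return sum(ccd0) / len(ccd0) - sum(ccd1) / len(ccd1)

print(f"CCD0 leads CCD1 by {ccd_gap(cores_mhz):.0f} MHz")  # → 400 MHz with these samples
```

Feeding this with logged sensor data (e.g. an HWiNFO CSV export) would produce the same kind of differential graph discussed in the comment.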
I think your conclusion hits the nail on the head. Unless your hardware was already overdue for an upgrade, there is nothing to see here. I am still sitting pretty comfortably with a 12700k/3080 Ti. It will probably be several more years before I even need to consider upgrading, even with an in-socket upgrade to 14th gen Intel available to me.
Not sure why the chapters aren't showing up, but anyway, they are in the description.
Also, regarding PBO: turning on PBO for the 9950X actually HURTS performance. Several other content creators I spoke with prior to launch saw this same behavior. It's also an overclock that kills your CPU warranty (still not sure how they would know), so no, not showing it.
It's because you forgot it needs to start with 0:00.
At least that's how I know they work. If they didn't change it, that should still do the trick.
@@flimermithrandir That's actually not why. It's because chapters have to be 10 seconds or longer, and I had 5-second chapters showing the charts. Once I removed those it worked. Adding the 0:00 didn't work alone.
@@Jayztwocents Interesting nuance to know. Thanks for the info, Jay; as always, it's totally useful.
@@Jayztwocents Ok, strange. But it does say in the tutorial that it needs to start with a 0:00 or it won't work. Maybe that's an old fragment and it now works without it as well. Anyways, good it works now :)
Jay, you gotta do CS2 because it's huge, and it's one game that's primarily played at 1080p and 1440p, on monitors with 500Hz+ now. But people are also running 4K high-refresh monitors, which are becoming more common and will be a focus in the future (as sizes increase too). Knowing averages, highs and lows would be great. It's just a title that's not going anywhere. Add it back to the suite please!
Future video Titles:
->My Daily issues with the 9950x
->I switched to the 15900k do I regret it?
->I went back to AMD. Here is why
Praying all but that last statement come to pass. lol
lol thats typical tech youtuber clickbait title
i mean there's only so much content variety on tech.
Would be pretty crazy if he got a 15900K, considering it won't exist; it'll be called Core Ultra 9 or whatever the fuck they call them nowadays.
@@ryanw1284ryansrants probably will lol
Thank you for always making great content - you do well delivering information in a concise and relatable way!
Did you just give this guy $500 and he didn't even bother to like your comment? 🥴
@@LewisTiburon I think so!!
We like your comment :)
@@LewisTiburon L O L
We've got you pal! Here, take my like!
the ifixit ad is [still] gold
It really is.
Made me buy a kit 😂
GOLD!
If you say so. To me it shows a lack of effort, using the same ads for years.
If it ain't broke why fix?
I'm a 3D/motion graphics animator. At 25%-33% faster than my 5950X, this looks like an AMAZING upgrade. What I would like to see is the new 9950X in a build for Phil, to see the time difference in production tasks compared to what he is running now. Complaining about the 9950X's gaming scores is like saying "This Ferrari sucks! When I go to the grocery store it can only hold half the number of bags my SUV can hold."
Stop being a FANBOY graphics animator. Wait for INTEL ARL on 3nm, will blow ZEN 0.5% out of the water.
You could have gotten that 25%-33% two years ago, for cheaper, and with AMD.
This gen is just 5% faster than zen 4
I am 100% sure there are more gamers than animators, by magnitudes, in the world right now.
I really wish someone did benchmarks for simulations in Blender or something. Way more interesting than rendering in Blender. They could just compare the time it takes to bake a scene.
Yeah, I don't think people realize that when you're making your living off of it, even a 5% upgrade is going to be well worth 650 dollars. You're likely to make that back in less than a month if that's your livelihood. Now, the 9700X and 9600X make no sense really, but the other two are amazing for productivity users.
Sad to hear how AMD wasted a big chance to take advantage of Intel's issues with this release. Hope the X3D versions will bring some improvements.
To be fair, those issues surfaced a few weeks ago... how does AMD design, build and ship a product that takes years to develop in just a few weeks?
It's so AMD of them...
@milktobo7418 the issues with Intel have been around since last year....
@@popper6633 cpu developement takes a lot longer than a year lol
@@remo7846 Plus, CPUs (or the x86 platform in general) have reached their plateau.
Oof. You hate to see it.
yoo its the greatest technician thats ever lived
Hi racoon finger
the greatest commenter thats ever lived
My mom lost her toothbrush, do you know where it is?
Nothing that the greatest technician that ever lived can’t solve
Intel: I am sinking... Help~!!!
AMD: Bro, I am not leaving you behind. Saving you now.
Intel: Where’s the life preserver?
AMD: I thought I’d just jump in with you.
@@Adam-xo5iq gold!
@@Adam-xo5iqlmfao!!!!!
Rather: "Bro, I'm not leaving you behind! I'm coming to sink with you in solidarity"
This is not as bad as Intel situation. And I own Intel.
I like how you always get Steve's best side. Thanks Stev....errr Jay.
Steve helped a ton this time around, I bounced a lot of meetings and info off him for sanity checks
@@Jayztwocents- I know you said you need more graphs but don't go too far down the GN route. I think you've got the right balance between information and entertaining discussion. Some people will push for you to be more like GN but some of us also find GN too dry to watch lots of their videos. I dont need to see 50 graphs telling me gaming performance, I trust you to do sufficient tests to draw your conclusion and then only need to see enough to see why youve drawn the conclusion. I think you've found your niche in the market lean into that even more.
@Drunken_Horse I would like to see a little more commentary when showing the graphs. A lot of times I'm doing other stuff while watching videos so the narration of the results (like what GN does) is nice.
@@Jayztwocents Not sure if it would be too much to add, but I would love to see some type of productivity/video editing additions (Puget has some preset testing suites for Davinci Resolve and I believe Premiere/After Effects). I know you're a primarily gaming focused channel, but would be a great addition to see Phil jump in to talk about some video editing stuff with the CPU/GPU benchmarks. Thanks for everything!
Can you color code the bar graphs? It makes it a little easier to read at a glance when you have 21 lines on like 30 graphs.
Great video, thanks for the info. Love your content.
Quit giving me excuses to stay on my 5900X. This was supposed to be the reason to upgrade. I wonder what the results will show on the new X870E chipset.
That'd be the 1 thing that might change things.
but why would you even upgrade?
I mean, we are saving money by staying on AM4 and not upgrading. This just gives me hope because I just upgraded to a 5800X3D two months ago to keep my RAM and motherboard. I do not need my PC to do renders hella fast. As a video editor, I just want to be able to work and play games on it after 5 PM, and it does both things very well despite the X3D chips performing worse for productivity.
I'm on a 5800x3d and for two generations now, I see no reason to buy a new motherboard/cpu. I game at 1440p ultrawide with a 3090, and for most of my games, I'm GPU limited. None of this matters, whatsoever if you're not gaming at 1080p on medium. I think I might see some CPU bottlenecking around the time the 6090 comes out in 8 years lol. The 4090 won't fit in my case so it's not really an upgrade path for me without buying a new case as well and redoing my entire watercooling setup. I'm hoping the 5090 is a little smaller, and if not, I'll probably do the full upgrade for 6000 series.
I figure the 5900X3D would be a better balance for work and play since those CPUs are dirt cheap.
I saw JayzTwoCents a few weeks ago but was too nervous/shy to say hello. My mom was wondering why I was looking at the snow-haired giant 💀
hopefully you will have the courage next time, don’t give up ❤
wow, great story
Can you compare it to the 7950X's performance at Zen 4's launch? Maybe we can predict whether Zen 5 will keep getting better with BIOS updates. AMD loves to ship unfinished products and then let them age like fine wine with updates.
Core parking on the 9950x? really?
AMD making the 7800X3D an easier and easier choice!
That is honestly the dumbest shit ever. That is why I went from my 7950x3d to the 7800x3d
@STKReacts aww man I just bought a 7950x3d
Well, gamers shouldn't be looking at any other CPU. The 7800X3D is a beast; this CPU will still make sense 5 years from today.
I don't think the 9800X3D will be any faster, 5% max, so grab a 7800X3D and start gaming already at high FPS, boys. 😊
Until they show the 9800X3D at the end of year or Q1 2025...
@@KAMS-r3s Might not be faster, but it will probably run cooler and be more efficient. I'm waiting for the 9800X3D, upgrading from a 7700K.
Pretty boneheaded pricing decisions on these CPUs at a time when trust in Intel is on the decline. It's been frustrating watching AMD not take advantage of the situation.
Amd are completely ignorant. They always fuck up a good opportunity even when it's handed to them
We entered AMD Zen++++ time
They are taking advantage of the situation. AMD has a functional monopoly until the microcode update is proven to work, so bad value products are expected. AMD just forgot that it has to compete with itself too.
Yeah, because a CPU line up launch that had been planned for a long time will be changed on super short notice because in the weeks leading up to the release your competitor fumbled the bag. There just isn't enough time to change the plan.
@danielgonzalezjimenez5677 They have changed product pricing days before release before. They even changed the price of a GPU after the embargo!
Smug mode with the 7800x3d is still going strong
You can keep that and be just fine for a number of years yet.
I love this... I just bought the 7800X3D and have less than zero regrets!
The GTX 1080 Ti of CPUs.
@@kunka592 I'm going to be gaming on my 5800X3D and 7800X3D systems for a decade.
Regardless of what the new CPUs do, you got a good one and can be happy with it for years.
Glad I went with the 7950X3D back in August when it was $524. Huge upgrade from the 4930K I'd had since 2015. Your reviews and insight are very helpful on new and upcoming products.
Just bought a 7800X3D and couldn't be happier. Amazing performance in an SFF build, running under 80°C in games. We'll see what the 9000-series X3D chips offer in the future.
Nah, the 5800X3D is still awesome with some PBO tuning... $180 for a B650 motherboard, $350 for a 7800X3D processor and $100 for RAM; that's almost $650 for a few FPS gained?
While this is true, the benefit of the 7800X3D is that it still has an upgrade path, if AM5 is still supported after the 9000 series.
@@SkrappyIE I'm expecting at least 2 more gens after Ryzen 9000, and I think AMD owe us that seeing as AM4 got 5 generations and none of them were duds like Ryzen 9000. The 5800X3D is only better value if you're already on AM4, if you are on Intel and make the switch or a really old PC, the 7800X3D and DDR5's extra cost is kind of negated by the massive performance increase.
I mean, Micro Center is offering $500 bundles with a B650 Strix and a kit of RAM. In July they were offering the 7800X3D with an X670 TUF and a RAM kit for $500, and a $1000 bundle which included a 7900 XT reference design. At that point I look at my 5900X/X570 Tomahawk build and consider selling it on Jawa/OfferUp in exchange for one of those bundles, versus paying damn near $300 for a 3+ year old CPU. And that's $300 at Micro Center; $340 elsewhere for a 5800X3D. It's almost as if MC knew consumers are seeing these CPU reviews and going "oh shit, I might as well buy now", since waiting for next gen = Intel voltage creep / dud K-series chip/board combos / a 9000-series re-badge. XD
@@anhiirr Yeah, the 5800X3D should be $200-250 by now.
A few?
I'm about to get my build worked on today with the 7950X. Good thing I stuck with it.
Same, I'm on that CPU and skipping the 9000 series.
I'm sticking with my 7950X3D and 4080 Super.
So glad to see the 5900x represented... Shows me that I am still current and my losses aren't that bad compared to the new shiny bits. I run it and a 3070 for production purposes. Music and video editing. For me, the gains do not warrant the costs. Thanks Jay and Gang!
5900x and a 3070 is not top dog, but that is still a beast.
@jay Mark the Intel bars in blue; it would be great for visibility!
One question no review has addressed: is the 9950X better (more stable, higher RAM frequency, etc.) than the 7950X with 192GB of RAM, or even 128GB? That could be the main advantage of the newer generation for many professionals.
Nope, both use the exact same IO-Die and thus also have the exact same memory controller.
@@mortlet5180 He said in the video that there is a different memory controller…
@@walterdavis9718 AMD's own docs list the same controller.
Sad... Leave it to AMD to grab disappointment from the jaws of victory. Buying this CPU would make zero sense to 99.9% of people.
And there are still people out there praising AMD's profit margins, lol
I honestly kinda like this CPU, as I'm thinking of upgrading from a 3600. Although I would (and will) wait for the X3D variant. But maybe for my home server... I agree that other than efficiency there is nothing impressive.
Especially when you don't have a high end board to put it in.
@@lunamiya1689 They won't have profits for much longer if they keep trashing 'hearts and minds' with utter insanity like this, lol
It’s not meant for 99.9%.
You guys have significantly stepped up your testing methodology, Jay. The inclusion of CPU power consumption, CPU frequency (even comparing frequencies between CCDs!), and temperature in your recent CPU reviews is fantastic. Kudos to you and the team for the effort and attention to detail. Keep up the great work!
AMD, just because Intel fumbled the ball doesn't mean you should've too. Now everyone's just skipping this generation, CPU-wise.
(Which honestly is fine by me cause I can't afford to upgrade anyways/my rig is perfectly fine as is. xD)
Same here. Got a 5600x/6800 which is still doing exactly doing the things I need it to do. Decided just this morning I will upgrade somewhere in April 2026 or later, but no sooner.
@@Scarfliotti Yeah my current CPU is the 7950X3D so I'm pretty much good for a while
Yeah tbh, I am probably going to buy a 7800x3d and just give it away or sell it when the 9800x3d or better comes out
@@Baesic999 Yeah if anything, just wait to see the x3D stuff, MAYBE it'll get better? Again maybe.
My 5600x and 6600 still rocking for 1080 medium high tho (crying in 8GB vram) :"
My 5950X does 30800 (R23) while pulling 90W (hours of core tuning and stability tests; hasn't crashed in years). Best chip I've ever owned. I was thinking about jumping to the 9950X to greatly improve Handbrake times, and I mean the gains are good, but yeah, Zen 6 is gonna be it; skipping the 7000 and 9000 series. Great video, nice testing.
There isn't much reason to upgrade every generation unless there is a huge uplift in performance or dramatic improvement in power efficiency these days.
That's one thing I don't get.
"This Generation isn't 8 times better than the last one, what a waste"
Yeah. So stick with the last Generation. Upgrading every 1 or 2 generations has been a waste of money for a while now, it's not a secret.
Nobody is interested in "dramatic improvement in power efficiency" these days; people in the comment section believe AMD GPUs are mostly the best buy because they're a bit cheaper with 5% extra raster performance, while the NVidia alternatives eat 100W less on average. Apparently, power efficiency is super important only in CPUs.
@@someperson1829 Any power efficiency claim is a tall tale; you're not actually getting what you paid for. If the chip is supposedly pushing 65W for 5.5GHz, AMD knows this, and clearly what you actually end up buying is 65W and 4.4GHz.
It's pretty much a cash-grab scheme. This launch has made them look terrible; I will never purchase from them again. Their marketing on the GPU side is also pretty tragic, as they claim to go head to head without having the go-to name recognition of NVIDIA, while touting lower power, etc.
But we all know their open-source GPU tech is dogwater; it only works in some titles, and most are developed primarily on NVIDIA. Instead of trying to match a card and still failing (the 4090), they would need to push out technology way greater than a 4090. They need customers to see that perhaps they are the better, more powerful company. But sadly AMD doesn't have the balls to do that. Why? Because they lean on efficiency to back up their marketing.
On the last AMD video I commented that the charts didn't indicate whether higher or lower was better, and I'm pretty sure I wasn't the only one saying that. Sometimes the charts can be overwhelming (not the tester's fault). It felt really refreshing to see it addressed in this video. Kudos for reading the feedback and taking notes on how to get better. Nice video!
Just upgraded to 7800X3D from i7-7700K, night and day difference in several games.
I'm probably about to go from an i7-6700K to a 7950X :D
I'm definitely thinking of switching platforms and retiring my 12600K for the 7800X3D. Unfortunately, when it comes to Intel I don't have much of an upgrade path, due to the 13th and 14th gen CPUs being defective. I just don't want to get something like a 13700K and risk stability issues.
@@TheBeretGamer They're not defective. It's the microcode/voltages causing the issues. There's a new microcode update which is meant to fix it, but the one true fix is just undervolting.
I'm still on an Intel Core i7-7700K. I'm waiting for the new Ryzen X3Ds, but I would buy a Ryzen 9 7950X3D today.
@@XFXGX I've also heard from many others that the microcode patch and undervolting will help, but it's not an ultimate fix. From my understanding, the 13th and 14th gen Intel CPUs are physically flawed: the silicon wears down and degrades way faster than it should (partly due to high voltage, but also due to the quality itself), leading to instability issues.
Hello Jay and team! Great video; I know these results take a lot of work to get. I just wanted to follow up on your tweet and the mention in the video about upping your game, so here is my feedback:
1. Although it's great to have a lot of CPUs for comparison, I think more than 10 CPUs in a chart is difficult to follow, as there are just a lot of numbers there. Maybe it's not the number of CPUs but how the results are dissected.
2. One thing I miss is a value chart or value segment. In the past you had dollar-per-frame for GPUs; I wonder if something like that is possible with CPUs, even if it isn't FPS.
3. With CPUs, a "best pairing" slide would be nice, meaning the chipset and CPU combo, now that we have different chipsets that can run multiple CPUs but not at their full capacity.
4. A real-world experience mention: how stable was it? Were there issues? How many? Etc. I think that would say a lot about what the user can expect on a day-to-day basis.
This is just my opinion; maybe I'm alone in this, but I wanted to reply to your request. Cheers guys, you're doing great!
I think the reality is that these aren't AMD's gaming chips. They can obviously game, but that isn't what they're designed for. Games aren't going to use 16 threads, so I get why they have different clocks on each CCD, though they may need to tune their scheduling better for it. The 9900X and 9950X are cheaper work CPUs; I'm sure a game developer would do well with them, especially if they can't afford Threadripper. The 9700X and 9600X seem to be more general workstation processors, basically for people who do 90% of their work in a browser and an office suite, which would make sense for the lower power limits: power matters a lot more when you have to worry about running hundreds of them in an office building. But that's the thing. These are processors for office use being reviewed as gaming chips, which in my mind is the root of the problem. AMD themselves made the comparison, so it's fair, but I don't think it's where their strengths lie.
What do gamers want then? Well, it's simple: these chips are not for you. You should be waiting anxiously for the X3D chips that I assume AMD has coming. We already know the X3D chips dominate in gaming, and the lower power and heat of these chips may be what AMD was going for, as those have been areas of concern for the X3D chips in the past.
I also want to point out that the prices are high, but I think people still aren't used to how high inflation was these past few years. $650 in 2024 is comparable to about $490 in 2015. Yes, that's still high, but it isn't absurd either, especially considering that Intel's 6800K from that era was $434 and was only a 6-core processor. And the 6800K wasn't the top of their stack; it was just the only one a sane person would buy. Back then, Intel was perfectly happy to put out the $615 (2015 MSRP) 6850K, the $1089 6900K and the $1723 i7-6950X.
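The inflation comparison above is easy to sanity-check as arithmetic. A quick sketch (the ~1.32 cumulative 2015-to-2024 CPI factor is my assumption for illustration, not an official figure):

```python
# Rough inflation sanity check for the "$650 in 2024 ≈ $490 in 2015" claim.
# The cumulative inflation factor below is an assumed ballpark, not BLS data.
CPI_2015_TO_2024 = 1.32  # hypothetical cumulative inflation multiplier

def in_2015_dollars(price_2024: float) -> float:
    """Deflate a 2024 price back to approximate 2015 dollars."""
    return price_2024 / CPI_2015_TO_2024

print(round(in_2015_dollars(650)))  # → 492, close to the ~$490 cited above
```

With any factor in the 1.30-1.35 range the result lands near $480-500, so the comment's figure is in the right ballpark.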
Thank you for including the 5900X here; that's the CPU I'm currently using in my main rig. If you're going to test AM4 CPUs, though, it would be nice to see the 5800X3D or maybe some of the XT parts. Otherwise, good job Jay and team!
I am still on an aging 3900X as my "office" machine. Looks like I'm going to ride it out a little longer still. I expected much more from AMD this time around, and they completely missed the mark. I didn't go for 7900/7950X because the cost to replace the entire platform didn't feel right, and now 9900/9950x are basically side-grades from the 7000 series, so it's still a no from me dawg.
Four more years
Thanks so much for including the 5950X, the best AMD CPU I ever owned, never had a single issue with it. I am more on the Intel side but see no reason to update from my 12900K on an Asus Apex motherboard.
I kinda think it makes sense. If you want to game. Get an x3d chip. If you want productivity get the 12 or 16 core. If you want both. Get the hybrid.
What I want is to actually use my cores. Just because I'm in a game is no good reason to basically turn off half my CPU (yes, I know that's not really what it's doing, but to keep things simple, it also kind of is).
If you want both - buy Intel. Just fix the voltage yourself (which is not that hard to do).
@@someperson1829 The processors will still fail...just more slowly! This is the thing they aren't saying about the fix. It also won't magically repair damaged processors. They are all still junk. Anyone buying one new is asking for trouble.
@@paradoxicalcat7173 Just more slowly, like any other CPU. Newsflash: they all degrade. By the time it degrades, you'll have bought a new one. I have a 14900KF at 5.7/4.5 GHz with adaptive voltage and an undervolt; HWiNFO max registered voltage is ~1.28V.
Did you hear Wendell's comment? If he runs Cyberpunk on the 9950X in Administrator mode, FPS shoots up from 188 to 197. That is mad. Sounds to me like an OS update should get better performance.
Apparently Win 11 has "user admin", and there is a hidden "admin admin" that has fewer security checks and therefore higher performance. WTF.
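For scale, the admin-mode jump quoted above works out to just under a 5% uplift. A quick sketch using only the two FPS figures from the comment:

```python
# Percentage uplift from the quoted Cyberpunk numbers (188 -> 197 FPS
# when running in Administrator mode, per Wendell's observation).
baseline_fps = 188
admin_fps = 197

uplift_pct = (admin_fps / baseline_fps - 1) * 100
print(f"{uplift_pct:.1f}% uplift")  # about 4.8%
```

Not nothing for a scheduling/permissions change alone, which is why it reads like something a Windows update could claw back.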
Hey Jay, thanks for the content. I bet there's a lot of people, like me, on the 5800x3d that are starting to look for an upgrade. It would be nice to see that SKU included in the charts.
I would have loved to see how the 5800X3D compares to these new CPUs
The part you pointed out at the end of the video is a big reason why a lot of people just don't move up to AM5 or whatever Intel is doing. There's just not enough of a performance boost for the newest cutting edge hardware to justify the massive upfront cost (and the domino effect of making more upgrades to keep up)
A friend of mine and myself use the AMD Ryzen 9 5900X; our lifestyles and demand from our systems just doesn't line up with modifying/replacing our still fully functional computers to make another one that costs twice as much, eats almost twice as much power, and triples as a space heater.
I believe that AMD's X series CPU's are geared more towards professionals... And the X3D series are geared towards gamers. Thats the best way to look at it.
A bit different take from yours: I believe this generation of Ryzen is not geared towards consumers. I think what they did is improve the power efficiency of their architecture for the EPYC lineup. Unlike consumer CPUs, power efficiency in EPYC is much more desirable.
When epyc numbers are out, we will see whether my prediction is correct or I'm just babbling mindlessly.
Yeah, I feel all these sorta "gaming" channels give the processors a bad rap. It's obvious the new AMD processors are beasts when it comes to "work" applications. And they still have the best processors for gaming with their X3D series. Not sure what the outrage is about.
What about professional gamers? Do they buy both and stick them together ?
@@JeremyBell The problem is that AMD is marketing these CPU's as best for gaming instead of focusing on the productivity side.
and the non x geared towards nobody amirite
Ifixit has been a partner for years and I still love when you announce them, jay. haha
Too bad it's expensive af to import their products to my country.
I am still running a 3900X and I would love to see a comparison of, like, all the 12-core Zen CPUs, or the 8-core ones. I think it would be great to see the generational steps of AMD.
I'm running one too. I was minutes from getting an X670 and 7800X3D today, but didn't because I just know the X870 boards are going to drop before the sound of the cash register dies out. Also because I am not building two rigs. My gaming rig has always been my general/work rig, and that has never been a problem running any games at max settings, the way I build. This is the first time I'm forced by AMD to choose one or the other, or run two different computers, and it's really annoying. I'll wait for the 9800X3D/9900X3D and just hope it's not as crap at "everything else" in comparison as the current X3D chips seem to be.
You kind of addressed it at the end. I know you can only do so many CPU's / Platforms, but 5800x3d (and maybe even 5700x3d) is relevant to this conversation. For the x3d drop I'd say this is more relevant than the 5950x you had here. Many who upgraded their AM4 platform to this had a case to skip Ryzen 7000, and are thinking about at what point it makes sense to jump to a new platform.
Glad that 7900 I bought will run me at least 2 more gens. See ya next release cycle.
I'm still split, because I think with 9000 you may be able to run 4 sticks of RAM.
2026
@@Thri11seeker That's a very good point. That's the biggest con with my 7950X. I miss running 64+ gigs of RAM.
@@boot-strapper I want him to test it
@@boot-strapper You can use 2 32 gig sticks on 7k series. That's what I'm doing with my 7950x3d
I lolled at the CPU box falling over. Same happened to me in my review 🤣
For productivity, this CPU rules! I mainly work with 3D software and run a lot of simulations, and I do not play games at all, so this CPU is a welcome choice for my long-awaited upgrade.
Hate the core parking drama though.. Microsoft Gamebar is just horrible.
Kinda amazed looking at the 14700K, since I got it open box for less than $300 and have had no problems, running it undervolted from the beginning. The fact that it is able to keep up with the faster CPUs at almost half the price (or less) makes it such a good deal for productivity. I wonder why people hate on it so much. The AMD efficiency of my other build is really good, but the cost effectiveness of the 14700K is amazing.
AMD is leaving the door wide open for Intel to make a comeback. With the same performance as 14th gen and old school reliability, Intel will have a smooth ride.
I think that they are working together. Old lawsuits may be making Intel fall back so that AMD can get paid in that way and stay afloat.
Reliability? Old school? Where have you been when the 13th and 14th gen i7 and i9 were crashing?
And AMD is currently the old school because they didn't bother to add E cores to their CPUs. Intel literally had to disable AVX512 on their new CPUs starting from the 12th gen because they had it on P cores and didn't have it on E cores (so running an AVX512 program on an E core would crash it).
AMD just added it in Zen 4 because they don't have this problem with E cores.
Good luck with that room warmer. I'll just stay chill.
Given recent SERIOUS issues Intel has with reliability, I don't see the door being wide open.
@@Hardcore_Remixer I never had a bad Intel CPU in my entire life. Currently using a 13900K. Never a single issue either.
Holy comprehensive list Batman! This is awesome, thank you!
hopefully the x3d chips will redeem this generation. really hoping the 9950x3d is going to be a dual 3dcache ccd design.
Don't count on it. The 7800x3d is a 7700x that is clocked lower with the cache stack added. 9800x3d will absolutely perform the same as the 7800x3d and cost $150+ more.
If they are using core parking on the normal chips, I doubt they will use dual 3D cache on the 9950X. There would be no need, unless they allowed all cores to game.
@@jimmer411 I'll be charitable and guess it will be 5% faster.
I love listening to the Jay Podcast, his voice is always warm and enjoyable and then there is this relaxing bit of funky music!
Love the iFixit ad!
That iFixit ad gets me every time. It's a joy to watch. I never ever in my life wanted to see a longer video of an ad. 😂😂
I think both Intel and AMD need to stop releasing chips so often. These tiny incremental changes are just a waste of time and nobody is impressed. I really don't see why this is necessary in 2024. GPU product cycles last years and when new ones are released, they are actual big improvements.
$$$$$
I have a 7950x3d but I've always been an overclocker. I wanted a 7950x but poor power efficiency kept me away. the 9950x seems to have addressed my issues and now i'm interested.
People seem to forget that scheduling, core parking, etc. go out the window with all-core overclocking. It basically zeroes out any latency issues, as you have no CCD switching and no sudden frequency spikes that often translate to latency spikes that feel like microstutter. This is the only way to minimize latency for 4K gaming, as the X3D cache doesn't impact gaming performance at higher resolutions. You don't notice a difference in FPS, you notice it in crispy, snappy gameplay.
Most Windows YouTubers: Zen 5 is "meh"
Most Linux YouTubers: Zen 5 puts Linux on top of some gaming benchmarks over Windows..
WILL SOMEONE INVESTIGATE WHAT IS GOING ON!
Lack of a proper CPU scheduler on the Windows side. Microsoft is in a real minimum-effort mode with Windows these days. If it's not something that will wring every last fraction of a cent out of the user via ads or data collection, they don't care about it.
Imagine this 'scheduling' as a standard in a Win 11 landscape... and DX12U or DXU oriented/lobbied/paid-for games/engines... and a world with little Vulkan... this is the only thing Intel and NV are hoping for...
I’m not a gamer, just a photographer and fine art printer. So, even with all the bad news and confusion re the Ryzen 9000 series, I decided to go ahead and replace my 7900 with a 9900X anyway.

I have an MSI MPG B650i EDGE WiFi motherboard in an open case with an ID-Cooling SE-207-XT air cooler to which I have attached a second fan. The 7900 runs with DDR5-6000 memory, Game Boost, and the TDP elevated to 105W. It pulls 142W running Cinebench R23 with a high temperature of 79C for a multi-core score of 27550. When I replaced the 7900 with the 9900X, with the BIOS cleared except for EXPO, Cinebench R23 scored it at 32214 with a high temp of 81C. That’s a 17% improvement, much better than I expected. Think I’ll be keeping the 9900X and seeing how much I can wring out of it with a better cooler. I have no intention of playing the core parking game.

BTW, B&H is selling the 9900X for $50 less than everybody is reporting. They also pay the tax if you use their PayBoo card. No connection to B&H, just a happy customer for many years.

Some good news re the 9000 series is overdue. I appreciate that gaming drives the technology, but other users make up a large part of the marketplace, so it seems wrong for gaming to influence the whole picture. Even though AMD fumbled the rollout, the 9900X is still a great CPU for non-gamers.
This video makes the 7950X3D look like a monster right now..
I am glad I got desperate and bought a 7950x3d about a month ago. I was waiting for the 9950x but found a good deal on the 7950x3d. Really dodged a bullet there.
@@jojimonty2506 Absolutely! The fact that in workloads the 7950X3D isn't far off from the competition, and in gaming it's the top dog, really means this CPU will age gracefully. The 9950X seems like a total mess.
@@jojimonty2506 I'm considering going to the 7950X right now, but my needs are about 50/50 gaming/workstation. Upgrading from a 6700K. Don't think it's worth thinking about the 9950X. Maybe even the 7950X3D should be a consideration for me? Not sure since I'm pretty much doing a 50/50 split.
@@zombizombi I think R9-7950X3D is the most balanced cpu for you, fastest gaming cpu and similar productivity performance to R9-9950X.
@@zombizombi I recommend the X3D then. I use my station for 50/50 as well. It's solid in Premiere Pro/After Effects, Blender, and my laser software. I do some AI stuff too, and it's pretty quick compared to my old 5900X. I am very happy with the performance, although it was a bit difficult to get running stable. My best Cinebench R23 score has been 37644, which is pretty good I'd say.
Love the new graphs! Very easy to follow!
Still running on my ryzen 9 5950x
Love it
Me too. Won't swap it out for a while .
I must have seen the Ifixit ads hundreds of times now and it still makes me laugh.
Dude, I started my computer career in 1990. So, yea, the new CPUs are amazing. What's more amazing? Storage. I remember paying 400 bucks for a 40 meg hard drive back in 1990 for my 286.
I remember paying the same for four 256MB RAM sticks. RAM used to be worth more than its weight in gold.
It's been a while since I really watched any PC part/performance reviews, it's nice to see where things stand right now between different cpu's. I will be waiting for the upcoming tests to see how the ram speeds work out since a friend of mine is wanting to get a PC. He's not dead set on a time frame to build it, but if the 9000x3d stuff isn't yet released by then I'll probably push him towards the 7800x3d.
People are mad because they felt that AMD should have really taken advantage here because Intel messed up.
But the thing is that this AMD processor was probably built and ready to go months ago before 'intel even messed up'.
However, if AMD messes up next year in 2025, then we can really spill hate on them.
Now really is AMD's time to market the product until launch... that's just how the cycle goes.
Exactly this, it's like people think AMD knows exactly when Intel would make a monumental mistake and somehow schedule all their development and validation solely based on that. This chip was probably in the works for the past few years, and the next one is already underway. They have a roadmap, and they aren't going to change everything at a moment's notice because of a suddenly, and recently exposed flaw from its competitor.
It's the same way Nvidia doesn't take advantage of AMD's GPU department: why launch your best product when you don't need to, since everyone is going to buy it regardless of what you launch and your competition isn't a threat for now? AMD is waiting for Intel's new CPUs to answer, like every company would do.
Months? The engineering groundwork for Zen 5 likely goes back years.
@@DecemberPlaysX It does. Chips and cheese has a nice article on Zen 5, and it has some 'significant architecture changes' to it, and those don't happen overnight. Seeing that the Linux people (Phoronix, Wendel@Level1) love the performance increases, I'd bet the undershoot is more Windows than chip. We'll see.
I don't understand where people even get to this thought process from.
There's been generational improvements every single Zen release. Zen 5 is now almost stagnant like Intel's 14th gen. It is completely unexpected and also completely irrelevant to any of the situation with Intel.
This release does not follow the pattern AMD set expectations with, especially now with their marketing. It's false, it's misleading, and it's disappointing.
I'll keep my 7950x3d. Going to run Revo when I get in. Hope it works like you demonstrated last video. Thanks Jay
Serious gamers are waiting for the X3D chips anyway. But I think this isn't a wasted launch. AMD is trying to balance their processors and maintain efficiency. That's not a bad thing. The top end processors are more thirsty, of course, but still a bit more efficient than the previous chips.
The target for these chips is not those who purchased 7000 series chips. The target markets are 13th gen Intel and 14th Gen Intel, who are sick of Intel, giving them the run around. The other target market is those with older Intel and AMD processors, say early Ryzen or even early Intel and looking to upgrade.
If you're buying a new computer, the Ryzen 9000 looks more attractive. You also might look at the Ryzen 7000 chips, especially the 7800 X3D. These are not a waste and AMD is trying to avoid a situation like Intel has with hot thirsty processors that end up becoming a problem like the 13th gen and 14th gen which pushed to their limits end up damaging themselves.
I think AMD is being smart, but I don't intend to get a 9000 series processor, but I do intend to go 7800X3D for a long time. I am looking forward to where AMD will be with future generations of zen based processors and future iterations of these Ryzen processors.
I have no problem with Ryzen in my desktop and Ryzen in my laptop with my 4800 powered laptop, which is a much older architecture, and it shows. It's great for my purposes, though, and I don't see myself replacing it for a few years yet. But when I do, I will likely stick with AMD if they keep up their pacing and balancing power efficiency and raw horsepower.
My previous desktop and laptop were both Intel, and I feel like Intel has violated my trust with how they have handled 13th gen and 14th gen in particular. With their current business problems too I am reluctant to go Intel. So I am still impressed with the 9000 processors but I think sticking with 7800X3D or even other 7000 series processors is not a bad idea but if you don't have a 7000 series then the 9000 series processors are more attractive.
Or you know, me upgrading from 6700K in total fear of Intels last two gens ;) Though even in my case going to 7950X, cost considered is going to be better than a 9950X. The actual tough choice is between the X3D and X versions given my use is an equal split between gaming/workstation.
@zombizombi exactly... These reviewers seem to think we're constantly upgrading and that we always need to buy the latest thing.
There's a place for the 9000 CPUs given they are more power efficient, but if you get cheap power, then the 7000 chips like especially the 7800X3D, are a more attractive option. Even my 4800 in my laptop is actually really great for work and gaming. So the chips on the market are all super options, and they get the job done.
@@zombizombi That's why I prefer Intel: you don't need to choose between gaming and productivity, it's just great at both. Yes, the last two gens are a mess, but undervolting is not a hard thing to do. All who did UV are rocking their Intel just fine, including myself with a 14900KF. In fact, sorting out AMD core parking is more time consuming than UV'ing your Intel.
@@someperson1829 Yeah... I just spent the last few hours understanding what core parking is.... Now I'm even more confused... Would be so much easier to go with Intels single die designs and cache... Now I'm gonna look into undervolting to fix the issues and the oxidisation problems Intel have... Sigh
@@zombizombi lol. True. The realities of the CPU market these days.
Hey Jay, this is only a suggestion on your reviews: in the results charts, highlight (with different color of the bar) the reviewed CPUs for better reading 🙂
I really appreciate hearing about the potential to use these chips for non-gaming purposes. A lot of us only have one PC and it has to be dual use. I don't think that a majority of PC users have a rig dedicated only to gaming. It is great to hear the parallel use potential of these cpus.
I rock a 2-pc Stream/setup
Streaming-5950x
Gaming 13900K July Microcode update: Waiting till August Micro Code update is OUT of Beta
That's me. My usage is basically 50/50. I don't know whether I should go with 7950X3D or 7950X. I guess I just need to decide whether I lean towards gaming or workstation more. (note i wrote in the wrong cpus but i edited them now). Coming from a 6700K I doubt I'm going to be disappointed with either.
Feels like AMD is just going through its own phase of becoming more efficient, like the 12th > 13th gen or 13th > 14th gen transition did (I can't remember off the top of my head which one had no notable performance increase, but mainly an efficiency increase). But since you were talking about how AMD did their testing using a 7900 XTX, I think a good video would be a comparison between the two GPUs just for sanity checking, since numbers are kind of all over the place from what you were saying other reviewers had.
Just a request for graphs, or more of a "would be nice to see overlayed or side-by-side" for a comparison of powerdraws on similar tests for either upgraded CPUs (ie 9950x vs 7950x) or AMD/Intel equivalents when efficiency is what the new hotness is pushing for.
(not a big issue worth complaining about by any means)
I do think a good video for sanity would be showing testing for using the full AMD cpu/gpu & seeing if there is any notable difference because that would be interesting to see, but that's just me. I like my Jay videos 👀👍
Why are tests being done at 5200? This current media cycle is churning at AMD's expense if that's not from AMD. Not trying to be negative; I am honestly curious. I do appreciate that the tests will be done again at 6000. If AMD is saying test at 5200, that's a really interesting take.
Because that's as per the Reviewer Guidelines provided by AMD themselves.
Did you watch the video? Do you need a timestamp? Instead of skipping ahead, listen to what the reviewer has to say.
Cuz Jay sucks at DDR5 RAM setup, or doesn't care to do it at all, or to hire someone to do it for him. There is no other explanation for why all his AM5 videos have RAM so bad that the CPU is shot in the foot. That's not new at this point, but it is indeed very disappointing.
@@Sup_D Do you have a source for this - actually curious if we can find where AMD actually said this.
@@daishirokuma I don't remember who, but there was a video by a Tech Content Creator this week who covered that point.
Plus Jay also mentioned in the video that he did the testing as per suggested frequency.
Was hyped to have this replace my 3900X.
I guess I'll wait longer, since my current setup is still doing everything I need without much demand for more. Was just gonna go for the upgrade to justify going AM5.
Yeah, I'm sitting on my 3900X right now, don't know when to upgrade... or what to upgrade to.
As long as you're gaming and don't want to move to am5, 5700x3d is a no brainer for value.
When the performance doubles in the same tier of CPU.
Definitely get 5000 3d chip and get 32 gb ram for cheap.
Sitting on a 1800X and a 1080 Ti overclocked up the whazoo and water-cooled. I am only now really looking into upgrading, and the most sensible option still seems to be just taking a 5800X3D and a 4080 Super, precisely because of the AM5 snowballing. Heck, I don't even need to change anything about my cooling solution, because the setup should have zero trouble keeping the new CPU and GPU peachy. And investing "just" ~1.500€ (1.300€ for the GPU and 200€ for the CPU) into this once-5.000€ system, and thus keeping it snazzy for another 3 to 4 years, seems just that much more enticing than anything AM5-related at the moment...
This launch is as real as it can be as a preview of what happens if AMD has "no" competitor. AMD=Intel
Thanks Jay. Still loving my efficient workhorse, the 5950x
14900KS is a monster.
Love your vids, this is another good one. If it wasn't for folks like you I'd likely just buy the most powerful thing available like I used to instead of trying to figure out what is the best bang for the buck, but it's not as necessary as it used to be, especially with CPUs.
I'm ready for a CPU upgrade in my primary rig... still running a 4.8GHz 4790K and it's showing its age. I really want to know about those new P-core-only Intel CPUs; not sure if they're available yet, but I'm really thinking about getting a 14901KE instead of either a 12900K or 14600K. I'd go AMD, but I already have 2 other LGA 1700 rigs and I want to keep my infrastructure on the same socket in case I have any problems, so I can just swap parts. I'd completely switch to AMD, but I don't have the time or money to replace everything. Depends on the price/performance though. Feels weird having a primary rig that's significantly slower than my others, but this is the most efficient setup for how I use them.
These CPUs weren't designed for gaming, that's the X3D ones coming later. Overall the productivity and workload is an improvement over Intel 14th gen so win for AMD overall.
No they didn't, the Intel 14th gen beat the new AMDs in 90% of all tests. Nice try though 😂😂😂😂😊
Are you trolling or denying reality?
@@_J3RK_I have two i9 machines (last two generations) and their performance is very poor. Suspect it has to do with keeping them from crashing.
@@brwa5176 send me your rig build and specs
@@_J3RK_ on the Alienware site their top of the line R16 system (essentially max everything) performed no better than my several years old R10 system when doing the same CPU intensive scripts.
Hi Jay, great review! Thanks for your efforts.
I would really like to see the reviewed products highlighted in the performance charts. It's sometimes hard to keep track without pausing the video, trying to find the CPU, since it's sometimes on top of the chart and sometimes in the middle of it...
Secondly, what I really miss in you guys' videos is timestamps. They would be nice, especially for long videos such as this one, so you can find the crucial information for yourself more quickly and skip a little where needed.
Cheers!
I honestly can't understand why gamers are complaining about a processor that wasn't made for gaming! Lol.
Its what gamers do. Complain while not understanding why
Even though it handles games just fine, most games are GPU-focused anyway.
Because it was marketed to also be great for gaming. Really that hard to understand? People just don't like AMD talking sh't that's so far away from reality that even calling it misleading is being generous.
@@Aaron-zl5gq true enough. The only way to make CPU a factor is to have a 4090 running at 1080p, which no gamer will ever experience
I don’t know what the general feedback on the graphs is gonna be, but I would have liked the CPUs sorted by brand/gen rather than performance, as looking for a specific chip every slide was difficult (I was following the AM4 chips in particular, and from one test to the next I sometimes struggled to keep an eye on them and analyse their performance before it moved on).
Thorough test though! 👍
No sane person uses a 13900K or 14900K at 253W. Remove those limits with the newest BIOS, lock all cores at 5.6 or 5.7 GHz, and suddenly the 9950X loses almost every benchmark you performed.
"dude just overclock the intel processors so they degrade and they can beat the 9950x!!!"
@@Longlius I literally suggested locking cores below their advertised boost speed. Wipe your oil-covered glasses before you make a fool of yourself, mr sweaty prince of bacon and gravy.
No thank you, I already have a 13700KF that's now dead which Intel won't replace. I'm not interested in beta-testing more engineering samples from team blue.
“Just lock the cores, PBO, tune the chip and then…” OR you accept the chip's raw performance out of the box and view it like 95% of the people buying the thing are gonna do. Put it in and go.
@@teganburgess If you're an average pauper then sure. However, ignorance is not something to brag about, so I wouldn't do it if I were you.
Honestly. I think this was the perfect level of info. Loving the vertical graphs. Nice and easy to look through!
Great work!
9950x for my workstation and the 9950x3d for my gaming build, thanks J!
I love that ifixit ad, its so good 😂
If Arrow Lake can get a minimum 5% gains with half the power over Raptor Lake, I can see Intel taking the lead this generation.
They better do that when they're going from 10nm to 3nm lol
Half the power would be a major improvement. It's not easy to achieve
Even if they'd achieve that it won't be enough. 7800X3D is 10% faster at one fourth the power draw.
Thanks Jay, really enjoy the content.
Oh for f..s sake, how hard is it to highlight the 9950X and 7950X on these charts with a different color?
Oh, you want me to make a chart just for you? Stand by... don't go nowhere.
@@Jayztwocents Should have made it strobe RGB puke just for that guy.
@@Jayztwocents It seems sensible to use a different color on the chart for the CPU being reviewed, and to give its direct competitor or predecessor a different color than the others. Like in this vid, highlighting the 7950X, 9950X, and 14900K would be a good idea.
I'm more curious of how the new X3d chips will do once we get more info and they are eventually released. Currently running the powerhouse 5800x 3D but I have to 100% agree that I feel like I'm in a holding pattern for upgrading into new stuff (which is why I got the x3d in the first place). My current issues with the market:
1. As Jay pointed out, not a lot of uplift on CPUs to make it worthwhile to get an entire upgrade on that end
2. I really, REALLY want to upgrade my 3060 (I want to take advantage of frame gen for Microsoft Flight Sim), but the GPU market still feels like crap, and with the inevitable launch of the 50 series soon, I'm *hoping* they'll do a better job with those than they did with the 40 series..
The rest of my build seems solid. Plenty of m.2 storage, my EVGA power supply, while old, is still rocking hard, and I have a pretty decent 280mm aio from Corsair for my CPU that I'm sure I can transfer over if I eventually get new stuff...
AM I MISSING SOMETHING?! Can you please show me the person who is looking to buy these CPUs and a 4090 and then play at 1080p on medium settings? This is as synthetic and useless a test as it can be. Like, who benefits from it? Why test it like that? How can you even say “hope it helps you to choose” after that🤣
Edit: Dudes, I understand WHY they do it (that's why I called it synthetic). I am trying to ask: how is it in any way relevant for a real-life scenario, and how does it "help you make a choice", as he says, when buying it?
At 1080p the GPU can compute the frames at an insane pace, which means it will demand data from the CPU at a faster rate, thereby stressing the CPU more, specifically in instructions per clock in most cases. At higher resolutions you're no longer stressing the CPU nearly as much, instead it's the GPU that you stress. This is why for CPU they test at lower resolutions and for GPU they test at higher resolutions.
@@LiamMartinKeane Are you proving my point? I do understand why they are doing it. So it's a synthetic and useless test for gaming, because you're GPU-bound 99 times out of 100. That is what I was talking about. And also, if you can buy a 4090, you're probably not playing at 1080p 🤪
Testing at 1080p with a 4090 puts MORE of the workload on the CPU (to keep up with the gpu), which is useful to show how the CPU performs. Testing at 2k/4k hides the specific performance of the cpu. Any channel testing at higher resolutions is just to accommodate people not understanding this concept.
Exactly. Think of it like a stress test. Pretty much any CPU can do general desktop office things nowadays, so the way to test them is to find tests that stress just the CPU. This can be artificial benchmarks (Cinebench, Prime95 etc) or gaming (1080p Low). In the same way a user isn't spending their whole life running Cinebench, they aren't expected to game at 1080p Low. It's telling you what the CPU is capable of, not what a regular user would see - that was "good enough" years ago
(of course, this raises questions about why people are convinced by companies and reviewers to buy stuff they don't need... But I'll stop as I would just sound like a grumpy old man at that point 🤣)
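The bottleneck argument in this thread can be pictured with a toy model where delivered FPS is simply the minimum of what the CPU and GPU can each sustain. All of the numbers below are made up for illustration, not measured benchmarks:

```python
# Toy bottleneck model: delivered FPS is limited by the slower component.
# All figures here are illustrative, not real benchmark results.
def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    """Frames only reach the screen as fast as both components allow."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 220.0  # hypothetical rate at which this CPU can prepare frames

# At 1080p Low a fast GPU is nowhere near its limit, so the CPU is the
# bottleneck and differences between CPUs show up in the final number.
print(delivered_fps(cpu_limit, gpu_fps_limit=400.0))  # 220.0 (CPU-bound)

# At 4K the GPU becomes the bottleneck, masking any CPU differences.
print(delivered_fps(cpu_limit, gpu_fps_limit=90.0))   # 90.0 (GPU-bound)
```

Swap in a faster CPU and only the first (1080p) result changes, which is exactly why reviewers test CPUs at low resolution with the fastest GPU available.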
Thank you for including 5950x as the results show my machine is still a beast, even if there are now bigger beasts on the prowl
9950x is 3% faster than 7950x hahahah what a joke
Somebody waited 2 years to get this 🤣👉🏻 HA-HA
With Curve Optimizer the 9950X is way faster than the 7950X and uses way less power. However, these are not the CPUs for the best gaming performance anyway...
@@TheLapari Lmao... I'm hitting a 35k score in Cinebench at 70 watts and 55 degrees. Can't beat that with a 9950X.
@@skrajina8037 You better watch some other benchmarks since 9950X can hit 42k on Cinebench 😂
Glad I did not wait and bought the 7950x 😆
Appreciate the testing that you all do! Planned on keeping my 7800X3D and 3080 Ti for a while and only upgrading to a 5000-series GPU when the time comes. My only complaint with this chip is I wish I could get my OC to not dip below 4000MHz without locking all cores. Cheers!
I have the same thing basically, just 7950x3D instead. How is gaming for you? I'm loving it.
@@SkrappyIE Hey! I would say gaming is great overall, however, I am using a 65 LG OLED 4k (was it a C1 or C2? can't remember). Anyway, I find myself still pulling things back which I assume is the 4k side of things. A lot of Dirt Rally 2 in VR which now runs better since the cpu upgrade. But sometimes other titles like Mechwarrior 5 and Starfield don't always seem to be happy maxed out. The part about the chip going sub 4GHz, even on OC, is what is really bothering me.
This was a great video. The charts are easy to understand and are extremely clean. They look really good.
Hi Jay, I watched your 9700X and AMD really borked the launch. So much that I went to Micro Center today and bought the 7800X3D combo special for $499. I am upgrading/fixing my 5800X3D gaming system. It's been down for 5 months. Now to find the time to rebuild it (flush fill custom water cooled loops). I hope to get it running in a couple of weeks.
The difference in frequency between the two CCDs at 19:00 isn't entirely unexpected, although it does appear to be a rather significant gap (~400MHz). You mention only seeing a ~50MHz differential with your 7950X, but it might be worth checking again to create a similar graph and see where it lands.
Most reviewers (HUB, GN, TechPowerUp, AnandTech, etc) saw CCD1 clock about 200MHz lower than CCD0 in multicore workloads on the 7950X, a difference which also extended to the 5950X and 3950X before it, although the frequency gap varies. The higher frequency cores are always on the first CCD in each of these cases, with each CCD (CCX for 3950X) having its own preferred cores.
I think your conclusion hits the nail on the head. Unless your hardware was already overdue for an upgrade, there is nothing to see here. I am still sitting pretty comfortably with a 12700k/3080 Ti. It will probably be several more years before I even need to consider upgrading, even with an in-socket upgrade to 14th gen Intel available to me.
Jay's CPU reviews are getting pretty good.