I'm still just so happy with my AM4 setup, all the way from an 1800X at launch to the upgrade to a 5800X3D last year. It is/was just an unbelievably good platform...
@@josephnorris4095 what's your resolution? I'm at 3440x1440 and my 5800X3D and 6900 XT are doing alright, but an upgrade would be awesome (GPU specifically)
Good timing. My brother JUST asked me what's the best current cpu for gaming (no productivity. Pure gaming) as he's gonna build a PC again after 5 years without one. I just linked the video haha. Thank you for your service Steve. We appreciate the hard work
@@toseltreps1101 He knows how to find that information. I mentioned AMD in passing during a casual conversation, and he said "I think I'm going to build a PC again soon. What's the best CPU right now?". So I said 7800x3d, and linked the video in case he wanted to check it out himself. It's just a casual conversation between brothers. Sheesh, you are definitely not fun at parties 🤦
@@MogCity2 you good, bro? Lol, idk why it's expected, at least seemingly so, that one must be thanked or thanked back by channels that barely have time to take a sht. And besides, it's not like I donated the GDP of Coruscant lmao
With my last power bill 28% higher than the previous one (even after a 10% rebate for lower consumption), power cost matters more and more... So much so that I started to cap FPS at 144. And they're already announcing more price rises...
@@johntotten4872 That may very well be, but I would rather see a chart in the video letting me compare that for myself instead of taking your word for it. It would also be really useful for comparing the actual gains per watt against older, very efficient gaming CPUs like the 5600X. The 7800X3D is anywhere from 30 to 100% faster... but how much more power does it use on average? That data is much harder to come by than it should be. This video includes both of these CPUs, and I was really looking forward to that comparison when I watched it - only to be left empty-handed, so to speak.
It's definitely good, but being on AM4 with DDR4 it's definitely held back. I wouldn't say it's holding up to the Intel 13/14th gen or Ryzen 7000 at all, but it's easily the best AM4 chip and for a reasonable price. If you're using a higher-end card like a 4080 or 4090 or anything in the future then the 5800X3D isn't gonna cut it.
@ksks6802 that's assuming the $800 wouldn't go to things like health expenses, rent, car repairs, or other things more important than a gaming PC. Hell, it could even go to games instead.
@@shadow105720 that's why it's called "savings". You try not to tap into it. If you can't save 20 bucks a month... you've made some unbelievably bad decisions in life. Go cry about it.
@@ksks6802 they could live in a country where work pays $10 a day; at 21 working days a month, $20 is two days of work and roughly 10% of monthly earnings... and you don't pay different prices for hardware - US pricing is usually the lowest, since there's no VAT or other taxes/fees on top.
@@ksks6802 $800 for a marginal performance upgrade, or invest that into equity of a company? I'd rather invest first. Only when it's a huge upgrade is it worth forking out for an entirely new platform.
@@MordWincer the total volume of any of those is almost nonexistent. It was always low - anywhere from 1% to 5% - but now it's more like 0.1% down to 0.001%. The people who used SLI/CrossFire/mGPU were the "PC enthusiast class", but now everyone just trolls those people to death and calls them dumb for owning or using SLI/CrossFire/mGPU. It's all straight-up gaming now, and most ideas come from Reddit, which is a horrible place.
I would have liked to see some more budget options here, like the 12400F. That's down to about $100 now and I think it's pretty relevant in the $600-800 sort of budget range. Maybe the 3600 too - it's still available, and with how cheap some AM4 boards are there could be people interested in it. I would have been much more interested in that, or some more locked i5s (which are very common in prebuilts), than in having so many Intel parts in both the K and KF variants, or having both the 5700X and 5800X. Your opening note about those being the same CPU covers that well enough for me.
@@MiguelSilva-sm7xl that's fair, but is it really so insanely pricey that you can't afford to game with whatever part you want? Even if the PC draws, say, 500 W while gaming, is it really that costly? Say you play 5 hours a day, every day, for a whole month... isn't that about $11.25 in total (give or take some cents)? That's really not a lot.
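Fwiw, here's the back-of-the-envelope version of that math in Python (the $0.15/kWh rate is my assumption - it's what makes the $11.25 figure work out; plug in your own tariff):

```python
# Monthly gaming electricity cost, rough sketch
power_kw = 0.5           # PC drawing 500 W while gaming
hours_per_day = 5
days = 30
rate_usd_per_kwh = 0.15  # assumed rate; substitute your local price

energy_kwh = power_kw * hours_per_day * days   # 75 kWh for the month
cost = energy_kwh * rate_usd_per_kwh           # $11.25
print(f"{energy_kwh:.0f} kWh -> ${cost:.2f}/month")
```

Even at double that rate it's ~$22.50/month, so the point stands.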
I won't be upgrading my own PC for quite some time (and Tim's latest video certainly hammered home why)... But I still like to keep on top of the latest goings on in the market! (I honestly find it more confusing to take a break and try and get caught up later, than to just stay on top of it now.) This channel is amazing for that! Sure, I check multiple reviewers, but if I was in too much of a time crunch to watch multiple videos, and had to be confident about something, I think I'd pick your videos. Also: Still the best B-roll in the business!
I've had mine for about 7 months now and it's killer. No issues and tackles video editing and gaming like a champ. Good temps and power draw too. No complaints.
Though it's a dead platform now, in retrospect the 13700K seems to have been quite a highlight of the LGA1700 platform. Mine has served me well, and I think it was good value for its time for mixed gaming and productivity rendering.
"Dead platform" is irrelevant when you don't upgrade often. 13700k is great don't worry. It'll serve us well for a few more gens. Offset the voltage down by 0.09v and it'll be a sleeping beast.
@@yuunashiki It's still relevant. If you bought a 1000-series Ryzen chip in 2017, you could buy a 5800X3D now and not need a new motherboard or RAM. That's 7 years apart. For a value-minded buyer, the ability to upgrade to a new CPU 7 years later for only the cost of the CPU itself is a massive cost saving.
Been very happy with my i7-14700K - love seeing it's still one of the top 5 CPUs! It screams in any game I throw at it, it's a beast in PS3 and Switch emulation (hello God of War 1 & 2 HD in 4K 60fps!), and it's super useful in production work such as 3D modeling and rendering. It's nice to know I can easily rock this CPU for the next 5 years, until I plan to upgrade my PC again around 2030.
What I don't like about the 13th- and 14th-gen i7s is the awful performance per watt vs the 12700K. It makes them impractical for audio/music work - an AIO doesn't make sense, so they must be air-cooled. My 12700K has AVX-512 and easily beats a 7800X3D (which also has AVX-512) in PS3 emulation.
I was going to buy that chip last week. Then I saw what's starting to happen with 13th/14th gen. I'll be avoiding them completely now, especially seeing how they've been fobbing off gamers by accusing them of using incorrect board settings. I'll be holding out for the AMD 9000 CPUs now.
@@saricubra2867 Performance per watt is miles better on the newer Intel gens - it's just that they can reasonably argue that most people buying an i7/i9 have top-of-the-line cooling, so 253 W CPUs are manageable. Anyway, I cool my 14900K with air and it's alright - not saying it's worth upgrading from a 12700K though.
@@saricubra2867 It is. 12th gen is from just before Intel's shift - they were more conservative with default CPU power limits back then (now boosting to 253 W is set as a target). AMD's 7000 series is, by the same metrics, less efficient. However, if you lock down the power limits, you'll find these chips are significantly more efficient and in fact don't need 200-250 W.
When I built my pc about a year ago, I didn’t really know all that much about anything. So when you upload a benchmark for gaming processors and I see my processor (7800x3d) is still that good, whether it was luck or my research, I get very pleased that I’m still very much set with that part of the build.
@@user-wq9mw2xz3j True, AM4 (DDR4/5700X3D) is still a bit cheaper than AM5 (DDR5/7500F). But the 7500F is also a bit faster than the 5700X3D - with tuned RAM it outperforms it by ~15%. Plus AM5 offers great future upgradeability, unlike AM4, since the 5700X3D is a dead end there. Taking those factors into account, AM5 is totally worth a bit of a premium IMO.
It was available for a few hours last week at 100 euros in Germany. Now it's back at 146, but still the best deal on AM5 - especially since 7600 prices went up by 10% since May.
Just to add the one millionth comment: these guys make the absolute best and most thorough breakdowns, which genuinely help me decide what I want to buy without a doubt in the end. Thank you!
@@laurentiudll The 12700K has more overclocking headroom than a 7600X. The latter is already around 5.2GHz, while the 12700K is that fast clocking at just 4.7GHz. The 7700X is definitely the trash CPU of Zen 4; it would fall apart if you set it to 12700K clocks.
I know you have been doing this for a long time, but I would argue that sorting by minimum frame rates first, then by maximum frame rates, should be the primary method of ranking performance. It ensures that the products recommended are absolute top performers.
Great vid. I know grand strategy and 4X games are a pain to test, but it would be wonderful if you could touch on the subject in future videos - like late-game Victoria 3 or Civilization turn times, etc. Keep up the good work ❤
Yeah, I really hate how game performance testing is usually limited to spectacular 3D games, which aren't really CPU-limited. There are also factory and colony management games like Factorio, RimWorld, or Oxygen Not Included that are greatly affected by CPU speed - and sometimes RAM/cache performance - in the late game.
I'd like to ask you to include power consumption to those as well, please. Energy prices here in Australia have doubled for some of us over the past couple of years, and this is now a big factor for me. Thank you.
In your comments about Starfield, you note that the dual CCDs of the 7950X3D and the way the game schedules its threads meant its lows took a hit. But the 14900K ALSO had a very similar hit, and the single-CCD 7800X3D had better lows than the big hitters from Intel's 14th gen - indicating to me that those were ALSO hit with a scheduling problem, landing threads on the weaker E-cores. Yet you only called out "multi-CCD scheduling", not "multi-tile scheduling". Fix it by pointing out that tile scheduling can hurt in some games too.
His comment makes sense because that was a comparison between two CPUs with similar performance (7800X3D and 7950X3D). While multi-tile scheduling surely can have an impact, there was nothing to compare against, since all the Intel CPUs at that level have E-cores. Meaning: while it's useful to point out that the 7800X3D can be the better purchase over other AM5 products because of that issue, it doesn't matter within Intel's lineup, since the other CPUs are all the same in that respect.
@@MattJDylan That's my point, and why his doesn't hold in the specific situation I pointed out - one that's solved with a small change: it depends on how many threads the game engine uses. Starfield uses more than 8 threads, and they're not well scheduled, so anything that HAS a choice of which core to use has a scheduling problem. The 7800X3D doesn't HAVE a choice - all threads HAVE to cooperate on that single CCX. Intel's 14th gen just upped the E-core count to stay competitive with AMD in multithreading, so 8 threads go to the faster P-cores, but since it has a choice it will put threads 9, 10, 11, etc. on an E-core - meaning if a thread has to be waited for, the fps lows go down. Same for the dual-CCD V-Cache options: the other CCD isn't as fast, so the game still has to wait until that thread finishes, hurting the lows. 12th-gen Intel doesn't have as much of a problem: it doesn't have many E-cores to put threads on, so it cooperatively multithreads on a P-core much more often. 13th gen ups the E-core count (e.g. on the 13600), so more threads run on the vacant E-cores, hitting the lows a bit more. And 14th gen ups the E-core count even further, hurting the lows even more. And since the higher-end chips from Intel, like AMD's, use more "not faster" cores to handle more concurrent threads, the lows take a bigger hit at the high end (e.g. 13900) than the low end (e.g. 14200). The 7800X3D with V-Cache is a low-end configuration with a LOT more cache, which means more cooperative multitasking - but with that larger cache, more hits stay on-die, so those threads run a lot faster.
Recently scored a 7900X3D for $280 US. For a mix of productivity and games, I couldn't pass it up. These charts are great to see where things generally land, but always keep an eye out for deals; you may find a diamond in the rough :) Also... didn't you forget to benchmark one particular game? ;) ha
I basically bought my 12700K new a month after its launch for $330. Not even the Ryzen 7 1700 at launch had that value. It always made the 5800X3D look completely stupid to me; then I tried a simulated 5800X3D by turning off the E-cores. 8 cores/16 threads really feels sluggish vs 12 cores with 20-24 threads. I never liked the 5800X3D, and in games where the X3D cache doesn't matter, the 5900X pulls ahead because of clocks and cores - it's also much faster in productivity.
This is something where you really have to keep an eye on prices to make sure you're getting the best bang for the buck at the time of your purchase. Right now the 5600 is at $117 and the 12600KF is at $159, for example. Knowing the performance and doing your due diligence will get you the best deal. Thank you for all your hard work on the benchmarks.
@@qu54re65 but we're talking about gaming, the 7900X3D is a inbetween CPU, worse than 7950X for productivity and worse than the 7800X3D for gaming while also being more expensive.
@@MaxIronsThird It's 5-10% worse in gaming than the 7800x3D while being 50%+ faster in multicore tasks. At the same time, it's ~20% slower than the 7950x3D in multicore tasks, but the 7950x3D costs currently 50% more. With the launch price it made no sense, with lowered prices it absolutely does make sense.
Hardware Unboxed - upon the releases of the new CPUs coming out, can you guys try to do another video like this shortly after? I want to do a new build and this is the type of stuff I want to base my information off of, but my current hardware isn't gonna last much longer.
When I got my new system I went with an R9 7900X3D paired with 64GB of RAM, an X670 motherboard and an RTX 4070 Ti Super, and so far it has been awesome. Hope to have this system last at least 10+ years.
@@Majoraexp I was thinking about that, but with what my system cost in total, I don't think I could have gone top-of-the-heap in the Ryzen series. I wanted the 4080 Super, but that was $500 AUD more than I could afford.
@@Ghastly10 I would have gone with a 4080, but I only upgraded because I won at the casino xD otherwise I'd still have an i5 9600K. If only I had won more. Quite a huge upgrade for me still
Got the 5700X3D a bit over a month ago for 200€ or so - a great last breath for the AM4 socket. Going AM5 instead, the 7800X3D would've been almost double the price, plus a motherboard (150-200) and 100 for RAM.
The cost per frame at MSRP is correct, but if you take into consideration what you pay in the long run given power consumption, I think the 14600K falls way behind the 12600K - especially for those whose PCs are on almost 24/7, since the 14600K uses a max of 181 W whereas the 12600K is 125 W. But great work and insight into these CPUs as always; I wouldn't expect anything else from this channel.
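For anyone curious, a quick worst-case sketch of that difference (this assumes both chips sit pinned at max package power 24/7, which they won't at idle, and an assumed $0.15/kWh rate - adjust for your tariff):

```python
# Upper-bound yearly cost delta between a 14600K and 12600K running 24/7
p_14600k_w = 181
p_12600k_w = 125
rate_usd_per_kwh = 0.15  # assumed rate; substitute your local price

hours_per_year = 24 * 365
delta_kwh_per_year = (p_14600k_w - p_12600k_w) / 1000 * hours_per_year
extra_cost = delta_kwh_per_year * rate_usd_per_kwh
print(f"{delta_kwh_per_year:.1f} kWh/yr -> ~${extra_cost:.0f}/yr extra")
```

The real-world gap is much smaller since both chips idle far below those max figures, but it's a non-zero cost for always-on rigs.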
1080p testing goes against Monitors Unboxed's ethos of "it's time for a monitor upgrade". 1440p at 120Hz needs to become the new baseline. If that doesn't show a significant difference between video cards, then that's part of the story. I think we've reached the point where objective data is good, but not at the cost of skewing the perspective to show a difference. Keep up the great work, and I look forward to the next round of new motherboard reviews. Could you create a "best power delivery per dollar" chart for motherboards? That seems to be a metric measured differently between vendors - for example VRM/power delivery/???/voltage/watts/amps... how about a scaled score of 1-10 called "this gives your CPU all the power it needs without getting too hot", or whatever it needs to be called. Thanks again.
If it were included, it would most likely top the price/perf chart. Saw it on Newegg for $120 the other day. It's an amazing budget/mid-tier gaming CPU.
Still rocking my i7-4790K and GTX 980. Would love to upgrade soon, but now seems to be a holding pattern with all these new releases coming. May just slap in an RX 6600 and keep 'er moving. Great content as always, Steve. Also, I've run out of podcasts to listen to at work now that I've finished all of yours.
Here's your engagement comment. It's hilarious to me how many people still won't watch the videos explaining why you test with 1080p and a 4090, despite you specifically addressing that in the video. I'm wondering if they're just doing it intentionally at this point for a gag. In any case, thanks for the rundowns! Also, it looks like the video might not be in your description, so here it is for anyone who wants it: ruclips.net/video/Zy3w-VZyoiM/видео.html
Hey Steve! Out of curiosity, would you ever consider adding an RFO benchmark to any kind of testing standard? Maybe every so often you could do a super-test of different professional software?
It is, mate. I've upgraded my GPU to a 4070 Super and I'm on a 12400 - no visible bottlenecks at 1440p. All the games I play utilise the GPU to 100%, or I'm limited by the refresh rate of my monitor. Don't worry and enjoy.
I strongly believe the results should be sorted by 1% lows instead of averages, since the overall experience is determined more by those dips than by the averages...
Digital Foundry would disagree with you. The 1% lows shown here and the 1% you have in mind are different things. What you're thinking of is frametime consistency. What if I spend 90% of my time in a quiet part of the game and only 10% in the heaviest part? The 1% lows wouldn't do it justice. Frametime consistency, meanwhile, really captures the feeling of judder or hitching even when you're constantly at 60+ fps - those 60 frames could be bunched up in the last half of the 1000ms instead of spread evenly across it. So in this case the average is better, because we're measuring peak performance, not the consistency of the game itself.
I bought a B350 Pro4 with a Ryzen 1600 and upgraded to a 5600 in January. With 32 GB of DDR4, in total it cost me about $250, and I'm happy with the jump.
With the release of the newer 5700X3D and it not being included in these charts, is it safe to assume that we can look at the metrics of the 5800X3D as the closest chip to its results in terms of performance?
1080p is best for CPU benchmarks because it exposes maximum FPS. The CPU is responsible for fps, while resolution and upscaling are handled by the GPU - so there's no point testing at 4K, where games become limited by the GPU's capability and show the same FPS on all CPUs, even if one is worse or better than the others.
This is exactly why you SHOULD test at 4K. This isn't 2011 anymore; most people watching already have a gaming computer and look to these mega-benchmarks to decide which CPU to upgrade to next. If he tested at 4K, it would most likely show that most people are fine with their current CPU, and it would show them where to spend their money next. But I only know this because I game at 4K and 8K. Most people don't, and will get the impression they need to upgrade to the 7800X3D or whatever to get the best performance - when in reality, the higher the resolution, the less relevant the CPU becomes.
@@CallMeRabbitzUSVI Most people are smart enough to get a CPU that supports the frame rate their GPU can produce in their favorite games at their monitor's resolution (which, according to the most recent Steam poll, is most likely 1920x1080 anyway).
@@CallMeRabbitzUSVI ?? It's a CPU benchmark test. Not letting the GPU bottleneck the CPU, so the differences actually show, is THE POINT of the whole benchmark - that's why it's done at 1080p with a 4090. Seriously, how do people not get that?
And the 7800X3D is yet to reach peak status. 1.5-2 years later, by the time even Zen 6 is released, the 7800X3D will be much cheaper but remain more than enough for the vast majority of gamers. And with ample headroom for future upgrades.
@@josephk6136 At stock, yes, the 7800X3D is fantastic - the best CPU for a normal consumer. It's not the best for peak performance once you start overclocking, though. A 13/14900K overclocked with high-speed, tuned DDR5 will destroy the 7800X3D in most games and benchmarks. It just comes down to whether you want to spend the extra money and FAFO with settings and BIOS for a week or two.
I think including emulation in these tests would show some rather interesting differences. As the benchmark, I recommend God of War 3/Ascension in RPCS3 - it destroys CPUs and is the hardest game to run in any emulator to date.
Steve, is there any chance you can do these benchmarks with an RX 7900 XT or an RTX 4070 Super? Only a tiny percentage of gamers have an RTX 4090. It seems that the only reason to go above a Ryzen 7600 is if you have a 4090. I'd love to see the CPU sweet spot at 1080p, 1440p, and 4k if you have a midrange card. If you were to do this, you would probably be the only reviewer to provide this useful information to millions of gamers who use these midrange cards. Thanks.
I'm not ashamed to admit it: I bought an i9-14900K! It was $50 less than the i9-13900K. Why _wouldn't_ I buy it? I can handle things. I'm smart. Not like everybody says. Like dumb. I'm smart. And I want respect!
An important question, especially regarding the scheduling issues with the AMD CPUs that have two CCDs but V-Cache on only one: what version of Windows 11 did you use for the testing? I'm reading that Windows 11 24H2 has largely sorted the scheduling issues, so something like a 7950X3D would now perform basically the same as the 7800X3D - meaning some of those games, like Cyberpunk 2077, should now give pretty much the same result for those two CPUs.
You know, I'm still running my roughly 10-year-old i7-5820K at 3.3GHz, which along with my 12GB RX 6750 XT can still run any game I have (with the possible exception of Starfield, which still has unexpected graphics glitches and pauses) beautifully at 1080p. My system is also doing a ton of other things while I'm gaming, like being the hub for my local network backups, home media server, home security camera hub, and serving a commercial web site through a Cloudflare tunnel. (No, it's not on Wi-Fi - just a gigabit CAT-5 cable instead.) Even with all that, most games are only using about 30% to 40% of the CPU resources - so the notion that I need to be buying a newer and much more expensive CPU every couple of years "for gaming" seems pretty absurd to me. Yeah, I usually try to upgrade my video card every 3 to 5 years or so - but the old 6-core, 12-thread, 22nm CPU just isn't a problem.
You don’t seem to be aiming for high refresh gaming. I could limit my FPS to 30 and my CPU would last for ages. If you’re okay with that, it’s your call. I want 100+ frames on all games. It can’t be stressed enough how much better of an experience it is.
I upgraded from a Haswell i7-4700MQ from 2013 to an i7-12700K in 2022. I more than doubled the single-thread performance, with fps gains of easily 7x (floating-point speed).
"We're less than a month away from [this video being completely out of date and inaccurately titled, but we did all the testing and wanted to get two videos out of it instead of one so here]" - ftfy
I probably missed it, in which case I apologize, but are the Intel 14th gen CPUs tested at the new "Intel Performance" power settings (PL1=253w, PL2=253w)?
@@madelaki Yes, the 5700X3D/5800X3D are good, but they make the most sense as an upgrade (and even then, the AM5 R5 7600/7500F is much cheaper and has similar gaming performance). There's really no point in building a new AM4 DDR4 system in 2024.
AM4 isn't a dead platform - AMD is going to release new CPUs for it at the end of July, alongside the new AM5 launch. I'd call AM4 a secondary platform, and still a worthwhile one for those who choose not to, or can't afford to, switch from AM4 to AM5.
Got an i5-13500 with DDR4 last year and I'm pretty happy with it. It performs awesomely in any normal game, but also shines in the RPCS3 emulator thanks to its huge number of cores (compared to my R7 5700G). The core count is also really useful in production work, like rendering in 3ds Max in software mode.
I bought the 13600K on launch before the 7800x3D existed and I'm very happy with it, I will of course consider AMD when it comes to my next CPU upgrade.
The 12700k doesn't get the love it deserves. Intel has been selling it for stupidly cheap for months now and I've built at least a dozen systems with it. Sure, it's not impressive compared to a 7800X3D with a 4090 running at 1080p, but when you pair it with a more realistic 3080 or 7800XT at 1440 there's simply nothing from AMD that can compete with that $210 CPU. (Where's my AM5 R3 AMD?!) Intel _knows_ they aren't in the lead anymore and their prices show it. X3D might be the king of gaming but the second you turn that game off core counts matter and I got a 14700k for a lot less than you would expect. Great prices for good CPUs are what made me love AMD, and Intel is killing it right now in the "bang for the buck" department. There's no question if you're doing a "cost is irrelevant" build you go for a 7800X3D or better yet 7950X3D, but if you're building on a budget Intel is killing it if you want a lot of cores for your money.
@@LiesThatBind You can find a 12700K with AVX-512 instructions beating a 7800X3D with AVX-512 in PS3 emulation. There's nothing more CPU-intensive than that. Heck, the i9-11900K - which was the fastest CPU for PS3 - loses with an overclock against the 12700K *at stock*. There's a 3% single-thread difference between the 12700K and 13600K, and the 12700K is easier to cool because it clocks at 4.7GHz (8 P-cores) while the 13600K clocks at 5.1GHz - at that point just get a 7600X, which is cheaper and clocks at 5.3GHz max. 6+8 big.LITTLE is just a weird configuration that won't age as well as 8+4. Intel realized the 12700K was a mistake, so they abandoned that 8+4 layout. The only other chip I know with a similar configuration is the M2 Pro, also 8+4.
The 14700K will pull ahead of the 7800X3D once the latter's lack of raw multithreaded muscle and clocks catches up with it. Right now the 12700K is pulling ahead of the 5800X3D in the same way.
Guys, you are legends! Great info. Can you do some updated "creator centric" benchmarking with benchmarks for Davinci Resolve, Blender and the like, it has been a couple of years since I have seen anything. Thanks!
Surprisingly, my choice for both gaming and productivity - the rather "unfashionable or meh" at launch 12700K - seems to have aged quite well. It now appears to be in the ballpark for results and value with the supposedly "better" 5800X3D 😵😵
@@shoobadoo123 True and I am aware of this. At the time it meant DDR4/DDR5 was 32GB rather than 16GB within the budget and again nearly always a compromise when considering game vs productivity but for a processor that was virtually written off at launch it doesn’t do too badly at all!!
Also, what's your power draw in gaming and under workloads? The US prices make me grin - in Germany we can get a 5800X3D for 299 VAT included, and for a while it was down to 269 VAT included. It's very game-dependent, but in sims like ACC or DCS the 5800X3D still rocks with insanely high, smooth fps.
@@LupusAries Although the power is high during gaming (as mine with decent cooling boost WAY higher than spec) and DDR4 is probably 4-5% slower in games (remember it won’t run DDR5 particularly quickly) the extra memory makes up in productivity. Also in productivity the IG and codecs give a real boost in some apps. During general use the E cores actually work and idle and general office use is used less than my previous AMD rig. As the system generally runs off the solar panels or battery backup then power/power costs are not a major decision point. Please be aware I’m not saying the 5800X3D is a poor choice or a bad chip. I am certainly NOT saying that and it’s a fabulous chip. I’m only surprised that the 12700K gets as close as it does as that was not what was suggested on release.
13:38 Better to include the price of motherboard and RAM in cost per frame - simply use the cost of the hardware used in the test. I think AM4 would dominate that chart. Hard to believe the 5600X is still the best here; I guess it's still the sweet spot for me, as gaming and HandBrake are the heaviest workloads I do.
I like the 12700K more because of the 8+4 configuration. The 6+8 config of the 13600K makes it look like an overclocked laptop i7-12700H which is basically what it is.
Awesome video, but where are the i3s? AM4 and LGA1200 may be worse value for new systems but here they cost 200-300 euros per kit and the CPUs you used would cost 400+ euros in my country. I can't look at efficiency if the price is double.
Platform longevity sounds grossly over-rated to me: I'd never put a CPU on a 2+ generations older board as that usually means forfeiting some of the new CPU's key features. What is the proportion of PCs that ever get a CPU upgrade? Likely around 1% when you account for the ~80% of the market that is laptops with soldered SoCs.
I disagree, motherboards have become much less relevant to performance than they used to. I think 1 motherboard per DDR generation should be the new standard. If you had been able to get a good quality DDR4 board in 2016 that could run a 5800X3D that would be 6 years with no lost performance, potentially offering 2 or 3 cpu upgrades. PCIE3 is still sufficient for all but the highest end GPUs, and slower SSD sequential doesn't impact games. Really the only thing motherboards have to offer is USB 4.2x2.2 squared or whatever they decide to call it
Look at all the 4-USB B650, B660 and B760 motherboards on both sides. Using multiple controllers or a racing wheel is going to cramp the space - many people's PC I/O panels are like a switchboard as it is. I'll take the 8 USB ports on the back of my Z790 Tomahawk any day. I know for a fact the onboard sound choices are mid at best on most of those boards, and the entire LGA1700 lineup from ASRock was stripped down - cheaper, but stripped.
@@kramnull8962 If IO is your only reason for spending $100+ extra on a much higher-end motherboard, most mid-range boards around $170 have headers for some combination of 6-10 extra USB ports, basically all of the CPU and chipset IO that isn't already on the rear IO. Beyond that, USB hubs are also an option. I have one on my desk simply to spare me the hassle of crawling under my desk whenever I need to temporarily plug something in.
I know this chip isn't in here, but the 7500F is a beast, especially overclocked. If you find good deals on RAM and a motherboard, you can go AM5 for such a good price, and you'll have room for improvement over time.
Sweet - glad I came back to this vid 5 months later. It helped me decide on going with a 7600X over a 7700X, since they're so close together in gaming, and it'll eventually be replaced with the last 8+ core AM5 X3D CPU they make for the platform. So saving $60+ in 2 months is better than getting the 9800X3D now - at least for me, just looking at the cheap route for a long upgrade path.
Before my 7800X3D I had a 7700X and a 9900KS, which I upgraded to from the i7 980XE I'd had for 10 years at 4.3GHz. The i7 980XE could play Metro Exodus at extreme settings at 3440x1440 60 fps (with a 1080 Ti FTW Hybrid) - the game looks better than some titles today for a 2/2019 release.
For me it's the opposite (for old games). Finally with my 12700K i can get 100-110fps on Crysis 1 and playing Minecraft without being a stuttery mess when loading chunks.
@@J1e9r9r6y No one is contesting that 4K is superior, man. It's only that a 4K gaming monitor is still way too expensive, and so is a GPU capable of playing all games at 4K.
Hey guys, I'm a bit concerned that the Assetto Corsa Competizione benchmark you guys are running may be misleading. In ACC, having more cars on the track at the same time stresses the CPU much more than when there's only a few - I can see in the gameplay footage that there are only 16 cars on track, when races can frequently have 30 cars or more. As such, a user of a 7800X3D and a 4090 may not see such high fps numbers, as more cars = more CPU stress = possibly fewer fps. Running a race with 30 cars (or however high you can set it) rather than 16 would likely give a better indicator of how well the CPUs are performing against each other in this title. The Opponent Visibility setting won't be an issue here as you're running the Epic preset, which sets this value to All by default, so just changing the number of AI cars will be sufficient. If you have any questions about this, please do let me know - I have a lot of experience in using ACC (including esports experience both as an event organiser and competitor/driver), so I would likely be able to answer any questions you have on this.
21:51 It looks like you don't remember the performance gain from the i7-11700K to the i7-12700K. If the generational increase turns out to be that big, you just pay the extra for the platform, and from a value standpoint it will end up being good because of the chips that you get. No one that I know upgrades their CPU within 3 years 🙄.
How are the prices of the 7800X3D in AU? In Europe, since July 1st the 7800X3D has undergone strong price increases and is sold out at the major European distributors. Retailers raise the price of the 7800X3D several times a day.
I mean it deserves the dang praise it gets. I wish it didn't heat so aggressively out of the box though, had to dial down some of the voltages to tame it. And the funny part is that I didn't even lose any performance, so it is needlessly aggressive out the box.
The 5800X3D is no longer interesting; it is way too expensive for what it offers. The 5700X3D is the CPU which deserves to be praised. Although even the 5700X3D is outperformed by the 7500F, the true value king.
Just like death and taxes, there will still be people in the comments asking why we're still benchmarking CPUs at 1080p.
Yeah, lets go back to using 720p for CPU benching.
Might be their first pc😉
I remember when they did 360p for cpu benchmarks in 1999.
And here I was, this weekend, searching for 1080p monitors on RTINGS.
@@arc00ta tpu has u covered 😅.
I'm still just so happy with my AM4 setup, all the way from a 1800X at launch to the upgrade to a 5800X3D last year. It is/was just an unbelievably good platform...
Wow, very good value, what Mobo?
@@AlexHusTech MSI X370 Gaming Pro Carbon... Not even a particularly good board, but it has served me well over the years
Yep, and I see no reason to upgrade at all, especially since I only have a RX6800XT and a R9 5900X.
@@josephnorris4095 what's your resolution? I'm at 3440x1440 and my 5800X3D and 6900 XT are doing alright, but an upgrade would be awesome (GPU specifically)
The GOAT!
Good timing. My brother JUST asked me what's the best current cpu for gaming (no productivity. Pure gaming) as he's gonna build a PC again after 5 years without one. I just linked the video haha. Thank you for your service Steve. We appreciate the hard work
so he is smart enough to build a pc but too stupid to figure out the best components?
@@frankieinjapan still, someone who doesn't know about the Ryzen 7 7800X3D is ignorant
@@toseltreps1101 He knows how to find that information. I mentioned AMD in passing during a casual conversation, and he said "I think I'm going to build a PC again soon. What's the best CPU right now?". So I said 7800x3d, and linked the video in case he wanted to check it out himself. It's just a casual conversation between brothers. Sheesh, you are definitely not fun at parties 🤦
@@toseltreps1101 don't talk anymore, ever.
@@toseltreps1101 bruh
Thanks, Australian Steve!
No thanks back 💀
When one Steve goes to sleep, the other is just waking up. How lucky we are to have 24/7 Steve.
@@Osprey850 *_STEVE INTERNATIONAL 24/7_*
@@MogCity2 you good, bro? Lol idk why it’s expected, at least seemingly so, that one must be thanked or thanked back by channels that barely have time to take a sht. And besides, it’s not like I donated the gdp value of Coruscant lmao
Not a single Power Consumption Chart in the entire Video? Makes me sad. I would have LOVED to see an fps/watt chart overview of these CPUs.
Yep, leaving out the Power Consumption and 4k Benchmarks makes this entire video incomplete
With my last power bill 28% higher than the previous one (even including a 10% rebate for lower consumption), power cost is more and more important... So much so that I started to cap FPS at 144.
And they are already announcing more price rises...
Simple really, if you care about power usage then buy a 7000 series Ryzen. They are the most efficient.
@@CallMeRabbitzUSVI 4k is pointless because they'd almost all be the same FPS
@@johntotten4872 That may very well be, but I would rather see a chart in the video allowing me to compare that for myself instead of taking your word for it. It would also be really useful for comparing the actual gains per watt against older, very efficient gaming CPUs like the 5600X. The 7800X3D is anywhere from 30 to 100% faster... but how much more power does it use on average? That data is really much harder to come by than it should be. This video includes both of these CPUs. I was really looking forward to that comparison when I watched it, only to be left empty-handed, so to speak.
There should be a golden light coming out of that 4090 box, like in Pulp Fiction
like the spongebob special spatula
RTX MacGuffin
We happy?
@@whoeverthisguyis thank you for making me smile
You mean like when it catches on fire?
Great performance from the 5800X3D. It practically still holds up with current generation.
It's still going strong for a DDR4 platform. Most likely gonna skip the 7 and 9k series.
still rocking mine with a 12GB 3080 and it's strong
i'm running it with a 4080 super 💪
It's definitely good, but being on AM4 with DDR4 it's definitely held back. I wouldn't say it's holding up to the Intel 13/14th gen or Ryzen 7000 at all, but it's easily the best AM4 chip and for a reasonable price. If you're using a higher-end card like a 4080 or 4090 or anything in the future then the 5800X3D isn't gonna cut it.
Best CPU purchase ever so far for me. Gonna hang on to it and skip 7&9k series.
I think Im gonna use my 5600 until it or my motherboard dies or I unexpectedly get rich enough to not care about money.
Rich? You mean you're horrible at saving money. Saving just 20 bucks a month between product releases will add up to like 800 bucks. Poor life skills.
@ksks6802 that's assuming that 800 wouldn't go to things like health expenses, rent, car repairs, or other things more important than a gaming pc. Hell it could even go to games instead.
@@shadow105720 that's why it's called "savings". You try not to tap into it. If you can't save 20 bucks a month... you've made some unbelievably bad decisions in life. Go cry about it.
@@ksks6802 they could live in a country where working pays them $10 daily, which over 21 work days a month means $20 is 2 days of work and ~10% of earnings... and you don't pay different prices for hardware; most often US pricing is the lowest, since there's no VAT or other taxes/expenses
@@ksks6802 $800 for a marginal performance upgrade, or invest that into equity of a company? I'd rather invest first. Only when it's a huge upgrade will it be worth forking out money for an entirely new platform
"If you believe yourself to be a PC enthusiast" 🤣 golden!!!
PC enthusiasts don't exist anymore.
@@kevinerbs2778 Yep, we died out when RGB took over 🌈
@@kevinerbs2778 what do you mean by that?
Based on this video, what would be the best video card for full HD using the 5950x and 5800x3d ultra quality?
@@MordWincer The total number of them is almost nonexistent. It was always low, anywhere from 1% to 5%, but now it's more like 0.1% at the high end down to 0.001%, like the people who use SLI/CrossFire/mGPU would be in the "PC enthusiast" class. But now everyone just trolls those people to death and calls them dumb for owning or using SLI/CrossFire/mGPU. It's all about straight-up gaming now, and most ideas come from Reddit, which is a horrible place
Really would've loved to see the 12400 in there as Intel's last CPU without E-cores...
Just subtract like ~5% from 5600X results
@@shoobadoo123 no
12600 was the last without E cores.
@@moebius2k103 12600 has e-cores, 12400 doesn't
@@moebius2k103 Not true
Nice video as always! Would love to see r5 7500f included in these type of videos too!
I would have liked to see some more budget options here, like the 12400F. That's down to about $100 now, and I think it's pretty relevant in the $600-800 sort of budget range. Maybe the 3600 too; that's still available, and with how cheap some AM4 boards are, there could be people interested in it. I would have been much more interested in that, or some more locked i5s (which are very common in prebuilts), than having so many Intel parts in both the K and KF variants, or having both the 5700X and 5800X. Your opening note about those being the same CPU covers that well enough for me.
12400 omission hurts a little but I'll forgive you this time...
Agreed. My 12400F has been a trooper.
They even forgot 12100f😮
Just subtract like ~5% from 5600X results
Agree, saw the 12400f on Newegg the other day for $120. 12100f was under $100. Amazing budget Cpus.
@@shoobadoo123 no? The 12400F is on par with the 5600X, not 5% worse.
Liked the review, just wish it had an average power draw per CPU, as energy costs aren't something to be underestimated these days
Don't you see, reviewers live in a limitless energy bubble where the electricity is live and free (energy pun)
How much is 1 kWh where you live? Here it's like $0.092 per kWh.
@@sermerlin1 It depends a lot on the contracted power and the company (e.g. combining energy + gas gives you benefits), but I currently pay €0.1472 per kWh.
@@MiguelSilva-sm7xl that's quite a bit more, alright, but is it so insanely pricey that you can't afford to game with any part you want?
Like even if the PC draws, say, 500W while gaming, is it really that costly?
Say you are playing 5 hours a day, every day, for a whole month... isn't that $11.25 in total spent (give or take some cents)? That's really not a lot.
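The back-of-the-envelope math above can be sketched like this (the 500 W draw, 5 hours/day, and ~$0.15/kWh rate are the thread's example figures, not measured values):

```python
# Rough monthly gaming electricity cost.
# Assumed example numbers from the thread: 500 W system draw,
# 5 hours/day over a 30-day month, ~$0.15 per kWh.
def monthly_energy_cost(watts, hours_per_day, days, price_per_kwh):
    kwh = watts / 1000 * hours_per_day * days  # total energy used, in kWh
    return kwh * price_per_kwh

cost = monthly_energy_cost(500, 5, 30, 0.15)
print(f"${cost:.2f}")  # → $11.25
```

Plug in your own tariff and measured wall draw to get a realistic figure; actual gaming draw is usually well below a CPU/GPU's rated maximum.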
AM4 and AM5 CPUs use far less power than Intel LGA1700 ones, so there is probably not much point 🫤🤷♂️
I won't be upgrading my own PC for quite some time (and Tim's latest video certainly hammered home why)... But I still like to keep on top of the latest goings on in the market! (I honestly find it more confusing to take a break and try and get caught up later, than to just stay on top of it now.) This channel is amazing for that! Sure, I check multiple reviewers, but if I was in too much of a time crunch to watch multiple videos, and had to be confident about something, I think I'd pick your videos.
Also: Still the best B-roll in the business!
7800x3d is such a tempting pick rn
Always has been. 🙂
I would wait for the 9800X3D which is rumored to launch in a few months
@@TheKims82 it will be 20-25% faster in games compared to the 7800X3D. Intel needs Arrow Lake to do very well in IPC uplift.
Me too...
I've had mine for about 7 months now and it's killer. No issues and tackles video editing and gaming like a champ. Good temps and power draw too. No complaints.
Though a dead platform now, I do feel like, retrospectively, the 13700K was quite a highlight of the LGA1700 platform. Mine has served me well, and I think it was good value for its time for mixed gaming and productivity rendering.
Same. Love mine.
"Dead platform" is irrelevant when you don't upgrade often. 13700k is great don't worry. It'll serve us well for a few more gens. Offset the voltage down by 0.09v and it'll be a sleeping beast.
@@yuunashiki It's still relevant. If you bought a 1000-series Ryzen chip in 2017, you could buy a 5800X3D now and not need a new motherboard or RAM. That's 7 years apart. For a value-minded buyer, the ability to upgrade to a new CPU 7 years later for only the cost of the CPU itself is a massive saving.
@@JoeNokers I think you may have misunderstood my comment.
13700k is an awesome Cpu. Had mine since release and plan to keep it a few more gens.
Been very happy with my i7 14700K; love seeing it is still one of the top 5 CPUs! It screams in any game I throw at it, and it's a beast in PS3 and Switch emulation (hello God of War 1 & 2 HD in 4K 60fps!), as well as being a super useful CPU in production work such as 3D modeling and renders. It's nice to know I can easily rock this CPU for the next 5 years until I plan to upgrade my PC again around 2030
What I don't like about the 13th and 14th gen i7s is the awful performance per watt vs the 12700K. It makes them impractical for audio/music work; an AIO doesn't make sense, it must be air-cooled. My 12700K has AVX-512 and easily beats a 7800X3D (which also has AVX-512) on PS3 emulation.
Was going to buy that chip last week. Then I saw what's starting to happen with 13th/14th gen. I'll be avoiding them completely now, especially seeing how they've been fobbing off gamers, accusing them of using incorrect board settings.
I'll be holding off for AMD 9000 CPUs now.
@@saricubra2867
Performance per watt is miles better on the newer Intel gens; it's just that they can reasonably argue that, when it comes to i7/i9, most people have top-of-the-line cooling solutions, hence the 253W CPUs.
Anyway, I cool my 14900K with air and it's alright. Not saying it's worth upgrading from a 12700K though.
@@tteqhu "is miles better on newer Intel gen"
It is not, because they pushed the silicon too far.
@@saricubra2867
It is.
12th gen is from just before Intel's shift. They were more conservative with default CPU power limits (now the boost to 253W is set as a target).
AMD 7000 series, is by the same metrics less efficient.
However if you lock down power limits, you'll find out that they're significantly more efficient and in fact don't need 200-250W.
Gotta love my i7 2600 in my Z67 Sabertooth motherboard! 13 years old and STILL RUNNING!!!
at 15fps on rdr2.
Naaaaaahhhh I’ve got more than that on a 2600k
@@TheBottleneckedGamer really? My top tier gt730 would beg to differ.
My 2410M and GT 540M from 2011 were doing well* until a few weeks ago.
* at 9fps
I own a Haswell i7-4700MQ laptop that gave me higher fps on CSGO than a *desktop* i7-3770, let that sink in.
When I built my pc about a year ago, I didn’t really know all that much about anything. So when you upload a benchmark for gaming processors and I see my processor (7800x3d) is still that good, whether it was luck or my research, I get very pleased that I’m still very much set with that part of the build.
7500f via AliExpress still the cost per frame goat
FR
True. It outperforms even 5700X3D in terms of price/performance ratio and demolishes 5800X3D, which is crazily expensive.
Depends entirely on what games one plays.
@stangamer1151 And as for AM4/5700X3D, the platform cost with mobo+RAM is a bit cheaper
@@user-wq9mw2xz3j True
AM4 (DDR4/5700X3D) is still a bit cheaper than AM5 (DDR5/7500f). But 7500f is also bit faster than 5700X3D. With tuned RAM it outperforms 5700X3D by ~15%. Plus AM5 offers great future upgradeability, unlike AM4, since 5700X3D is a dead end here.
Taking these factors into account, AM5 is totally worth a bit of a premium IMO.
It was available for a few hours last week at 100 euros in Germany. Now it's back at 146, but it's still the best deal on AM5, especially since 7600 prices have gone up by 10% since May.
well done sir. i've been waiting for another massive cpu scaling video from you guys for sometime. Thank you! damn...look at that 12600k go!
The $166 13400F at Walmart with DDR4-3600 RAM and a good Z790 would still be a stocking stuffer, if you need/want more than 4 USB ports.
7800x3d - "all I do is win, I am perfection, I am GOD!!!!"
best CPU is not even debatable
Just to add the one millionth comment, these guys make the absolute best and thorough breakdowns which actually help in deciding what I want to buy without a doubt in the end. Thank you!
7600x my beloved
especially overclocked
@@laurentiudll The 12700K has more overclocking headroom than a 7600X. The latter is already around 5.2GHz, while the 12700K can reach that speed despite stock-clocking at 4.7GHz. The 7700X is definitely the trash CPU of Zen 4; it will fall apart if you set it to 12700K clocks.
I know you have been doing this for a long time, but I would argue that sorting by minimum frame rates, then by maximum frame rates, should be the primary method of ranking performance. It ensures that the products recommended are absolute top performers.
Great vid.
I know that grand strategy and 4X games are a pain to test, but it would be wonderful if you could touch on the subject in future videos. Like late-game Victoria 3, or Civilization turn times, etc.
Keep up the good work ❤
Yeah, I really hate how game performance is usually limited to spectacular 3D games, which aren't really CPU limited.
There is also factory and colony management games like Factorio, Rimworld or Oxygen Not Included that are greatly affected by CPU speed and sometimes RAM / cache performance in the late game.
Thank you for your extensive work benchmarking. Please keep ACC in the rotation
HUB doing God's work out here
Thanks to AMD sponsoring these guys behind the scene lol
I'd like to ask you to include power consumption to those as well, please. Energy prices here in Australia have doubled for some of us over the past couple of years, and this is now a big factor for me.
Thank you.
In your comments about Starfield, you note that the dual CCD of the 7950X3D and the way the game schedules its threads meant its lows were hit, but the 14900 ALSO had a very similar hit, and the single-CCD 7800X3D had better lows than the big hitters from Intel's 14th gen, indicating to me that they ALSO were hit with a scheduling problem, landing on those weaker E-cores. But you weren't pointing out the issue of "multi-tile scheduling", only "multi-CCD scheduling". Fix it by pointing out that tile scheduling can hit in some games too.
It's not intel or AMD problems. It's Bethesda's problem.
His comment makes sense because that was a comparison between 2 cpus with similar performance (7800x3d and 7950x3d).
While, yes, multitile surely can have an impact, there was nothing to compare to, since all the intel cpus of that level have e-cores.
Meaning: while it's useful to point out that the 7800x3d can be a better purchase because of that issue over other am5 products, it doesn't matter for intel, since all the other cpus are all of the same.
@@MattJDylan you make a good point.
@@MattJDylan That's my point, and why his doesn't apply in the specific situation I pointed out: it depends on how many threads the game engine will use. Starfield uses more than 8 threads, and they are not well scheduled, so anything that HAS a choice of which core to use has a scheduling problem. The 7800X3D doesn't HAVE a choice; all threads HAVE to cooperate on that single CCX. Intel's 14th gen just upped the E-core count to get multithreaded performance up to competitive with AMD, so 8 of those threads will go to the faster P-cores, but since it has a choice, it will put threads 9, 10, 11, etc. on an E-core, meaning if a thread has to be waited for, the 1% lows go down. Same for the two-CCD V-Cache options: the other CCD isn't as fast, so the game still has to wait until that thread ends, hurting the lows.
12th gen Intel doesn't have as much of a problem: it doesn't have many E-cores to put threads on, so it will cooperatively multithread on a P-core much more often. 13th gen ups the E-core count (e.g. on the 13600), so more threads run on the vacant E-cores, hitting the lows a bit more. And 14th gen ups the E-core count even more, hurting the lows even more. And since the higher-end chips from Intel, like from AMD, use more "not faster" cores to handle more concurrent threads, the lows are hit more at the higher end (e.g. 13900) than the lower end. The 7800X3D with V-Cache is a single-CCD configuration that has a LOT more cache, which means more cooperative multitasking, but since it has a larger cache, more hits land on-die, so those threads run a lot faster.
Good preparation video, now we are ready to see where the new 9000 series lands.
Recently scored a 7900X3D for $280 US. For a mix of productivity and games, I couldn't pass it up. These charts are great to see where things generally land, but always keep an eye out for deals; you may find a diamond in the rough :)
Also... didn't you forget to benchmark one particular game? ;) ha
I basically bought my 12700K new a month after its launch for $330. Not even the Ryzen 7 1700 at launch had that value.
It always made the 5800X3D look completely stupid to me. Then I tried a simulated 5800X3D by turning off the E-cores: 8 cores/16 threads really feels sluggish vs 12 cores with 20-24 threads. I never liked the 5800X3D, and in games where the X3D cache doesn't matter, the 5900X pulls ahead because of clocks and cores; it's also much faster in productivity.
Long live the AM4 Platform! You have been very good to us
Q, is the 14900k in this result using performance or extreme power limits?
Extreme profile 253/253W
@@Hardwareunboxed wow u replied!, cheers for all the handwork you and your team does and ty for that clarification too
@@Hardwareunboxed Was gonna ask this same question, thank you for replying! (Might be helpful to pin this info or add it to the video description.)
this is something you really have to keep an eye on the prices to make sure you are getting the best bang for the buck at the time of your purchase. right now the 5600 is at $117 and the 12600kf is at $159 for example. knowing the performance and doing your due diligence will get you the best deal. thank you for all of your hard work on the benchmarks.
I like that you test the 7900X3D. Most reviewers have written it off, which I think is shortsighted given how the prices change.
that chip makes no sense though, no one should buy it
@@MaxIronsThird I disagree
As someone who uses a pc for more than gaming, the 7900x3d makes a lot of sense if you can grab one for $330 on sale.
@@qu54re65 but we're talking about gaming. The 7900X3D is an in-between CPU: worse than the 7950X for productivity and worse than the 7800X3D for gaming, while also being more expensive.
@@MaxIronsThird It's 5-10% worse in gaming than the 7800x3D while being 50%+ faster in multicore tasks. At the same time, it's ~20% slower than the 7950x3D in multicore tasks, but the 7950x3D costs currently 50% more.
With the launch price it made no sense, with lowered prices it absolutely does make sense.
Hardware Unboxed - upon the releases of the new CPUs coming out, can you guys try to do another video like this shortly after?
I want to do a new build and this is the type of stuff I want to base my information off of, but my current hardware isn't gonna last much longer.
When I got my new system I got a R9 7900X3D paired with 64GB of RAM, an X670 MB, and an RTX 4070 Ti Super, and so far it has been awesome. Hope to have this system last at least 10+ years.
You will be good; it's just that you will have to upgrade that graphics card every 3-4 years in that 10-year time frame.
I got a 7950x3d and 4070 super with 64GB of RAM, I'm hoping for around 5 years. Tech just advances so fast.
@@msg360 Thank you, yeah will be interesting to see what the future holds for GPU's and other hardware.
@@Majoraexp I was thinking about that, but with what my system cost in total I think that I could not go the top of the heap in the Ryzen series. I wanted to go to the 4080 Super but that was $500 AUD more than what I could afford.
@@Ghastly10 I would have gone with a 4080 but I only upgraded cause I won at the casino xD otherwise I'd still have a i5 9600k. If only I had won more. Quite a huge upgrade for me still
Fantastic video. In particular the cost per frame and value graphs. Very interesting and useful, thanks !
Got the 5700X3D a bit over a month ago for 200€ or so; a great last breath for the AM4 socket.
As for going AM5, the 7800X3D would've been almost double the price, plus a motherboard (150-200) and 100 for RAM.
The cost per frame at MSRP is correct, but if you take into consideration what you pay in the long run given the power consumption, I think the 14600K falls way behind the 12600K, especially for those whose PCs are on almost 24/7, since the 14600K uses a max of 181W whereas the 12600K is 125W. But great work and insight into these CPUs as always; I wouldn't expect anything else from this channel
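The long-run gap the comment describes is easy to estimate. A minimal sketch, using the peak wattages quoted above (181 W vs 125 W) and an assumed $0.15/kWh rate; real gaming draw sits below these peak figures, so treat the result as an upper bound:

```python
# Yearly electricity cost difference between two CPUs.
# Wattages are the peak figures quoted in the comment (181 W vs 125 W);
# the $0.15/kWh rate and 8 h/day load are assumed example values.
def yearly_cost_delta(watts_a, watts_b, hours_per_day, price_per_kwh):
    delta_kwh = (watts_a - watts_b) / 1000 * hours_per_day * 365  # extra kWh/year
    return delta_kwh * price_per_kwh

print(round(yearly_cost_delta(181, 125, 8, 0.15), 2))  # → 24.53
```

Even at full peak draw for 8 hours a day, the difference is on the order of tens of dollars per year, which helps put "cost per frame" and running cost in the same frame of reference.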
Got my 7500f for $100, considering it's only 5% slower than the 7600 the cost per frame is actually insane.
FR
Love the air quotes around “14th generation” in the intro. Nice touch! Also the look on Steve’s face as he opens the 4090 box - hilarious.
Thanks Steve! (And team) Great info.
1080p testing goes against Monitors Unboxed's ethos of "it's time for a monitor upgrade". 1440p at 120Hz needs to become the new baseline. If that does not show a significant difference between video cards, then that is part of the story. I think we have reached the point where objective data is good, but not at the cost of skewing the perspective to show that difference. Keep up the great work, and I look forward to the next round of new motherboard reviews. Can you create a "best power delivery per dollar" chart for motherboards, as that seems to be a metric that is measured differently between vendors? For example VRM/power delivery/voltage/watts/amps... How about a scaled score of 1-10 called "this gives your CPU all the power it needs without getting too hot", or whatever it needs to be called. Thanks again.
Have you not watched any of their videos about cpu and gpu limiting scenarios?
where is 12400?
Didn't make the cut.
If it were included it would most likely top the price/perf chart. Saw it on Newegg for $120 the other day. It is an amazing budget/mid tier gaming Cpu.
@@mrhassell It'll take the 9000 series 12 months to mature anyway. Just like the 7000s.
Still rocking my i7 4790K and GTX 980. Would love to upgrade soon, but now seems to be a holding pattern with all these new releases coming. May just slap in an RX 6600 and keep 'er moving. Great content as always, Steve. Also, I've run out of podcasts to listen to at work now that I've finished all of yours.
Here's your engagement comment.
It's hilarious to me how many people still won't watch the videos explaining why you test with 1080p and a 4090, despite you specifically addressing that in the video. I'm wondering if they're just doing it intentionally at this point for a gag. In any case, thanks for the rundowns!
Also, it looks like the video might not be in your description, so here it is for anyone who wants it:
ruclips.net/video/Zy3w-VZyoiM/видео.html
Hey steve! out of curiosity, would you ever consider adding RFO benchmark to any kind of testing standard? maybe every so often you do a super test of different professional softwares and such?
7:25 When your 1% low is higher than your competitor's average frame rate...
LOL
Thank you for still mentioning 1080p. Not everyone can afford the 1440p, let alone 4K, so this is nice and really helpful.
I have an i5-12400f and just ordered a 7800 XT, coming from a 6600 non XT for 1440p. Is the CPU still good or should i upgrade?
your CPU is about 8% slower than the 12600K benchmarked here
12400f is perfect imo. Im using it for my 7900 xt build basement system and its amazing value at 1440p
the 12400f is still good enough until like 7900xt or 4070 ti super; so no need to upgrade the cpu yet, at 1440P
It is, mate. I've upgraded my GPU to a 4070 Super and I'm on a 12400; no visible bottlenecks at 1440p. All the games I play utilise the GPU to 100%, or I'm limited by the refresh rate of my monitor. Don't worry and enjoy
aight good to know, thanks guys
Thank you so much, incredibly helpful. Remains one of my go to tech channels.
I strongly believe the results should be ordered by 1% lows instead of averages, since the overall experience is determined more by those dips than by the averages...
agreed, and it seems like the 7800x3D does really well with 1% lows.
Digital Foundry would disagree with you. The 1% lows shown here and the 1% in your mind are different things. What you're thinking about is frametime consistency. What if I spend 90% of my time in a quiet part of the game and only 10% in the heaviest part? The 1% figure wouldn't even do it justice.
Meanwhile, frametime consistency really represents the feeling of judder or hitching: even though you're constantly at 60+ fps, those 60 frames could be bunched up in the last half of the 1000ms instead of spread evenly across it.
So in this case the average is better, because we're measuring peak performance, not the consistency of the game itself
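The distinction the thread is drawing (average fps vs 1% lows vs frametime consistency) can be made concrete with a small sketch; the frame-time list below is made-up sample data, and the "consistency" measure is just the standard deviation of frame times, one of several possible definitions:

```python
# Average fps, "1% low" fps, and a simple frametime-consistency measure
# from a list of per-frame times in milliseconds (made-up sample data).
def frame_stats(frametimes_ms):
    fps = [1000 / t for t in frametimes_ms]
    avg = sum(fps) / len(fps)
    worst = sorted(fps)                     # lowest (slowest) fps values first
    n = max(1, len(worst) // 100)
    low_1pct = sum(worst[:n]) / n           # mean fps of the slowest 1% of frames
    mean_t = sum(frametimes_ms) / len(frametimes_ms)
    var = sum((t - mean_t) ** 2 for t in frametimes_ms) / len(frametimes_ms)
    return avg, low_1pct, var ** 0.5        # stddev of frame times = "jitter"

# 99 smooth frames plus one 50 ms hitch: the average barely moves,
# but the 1% low and the jitter measure both expose the stutter.
avg, low, jitter = frame_stats([16.7] * 99 + [50.0])
```

With this sample, the single hitch drags the 1% low down to 20 fps while the average stays near 59 fps, which is exactly why the two metrics tell different stories.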
@@rzkrdn8650 then your quiet environment is always fine with pretty much any 6+ core CPU from the last 3 years, but those heavy parts can become annoying
I bought a B350 Pro4 with a Ryzen 1600 and upgraded to a 5600 in January. With 32GB of DDR4, in total it cost me about $250, and I'm happy with the jump.
I like pure numbers..that's what we've been waiting for Steve..gg from Germany
More am5 board test pls 😅
With the release of the newer 5700X3D and it not being included in these charts, is it safe to assume that we can look at the metrics of the 5800X3D as the closest chip to its results in terms of performance?
The 5700x3d is only about 6 - 8% or less behind the 5800x3d. It's very comparable and half the price.
1080p is best for CPU benchmarks for measuring maximum FPS. The CPU is responsible for the frame rate, while resolution and upscaling are handled by the GPU, so there's no point in checking at 4K: games become limited by the GPU's capability and show the same FPS on all CPUs, even if one is worse or better than the others
that was the dumbest thing i have read this year.
This is exactly why you SHOULD test at 4K. This isn't 2011 anymore; most people watching will already have a gaming computer and look to these mega benchmarks to inform them on which CPU to upgrade to next.
If he tested at 4K, it would most likely show that most people are fine with their CPU, and it would show them where to spend their money next. But I only know this because I game at 4K and 8K. Most people don't, and will get the impression they need to upgrade the CPU to the 7800X3D or whatever to get the best performance, when in reality the higher the resolution, the less relevant the CPU becomes
@@CallMeRabbitzUSVI Most people are smart enough to get a CPU that supports the frame rate their GPU can produce in their favorite games at their monitor's resolution (which, according to the most recent Steam poll, is most likely 1920x1080 anyway).
@@CallMeRabbitzUSVI ?? it's a cpu benchmark test, not letting the gpu bottleneck the cpu and showing the difference is THE POINT of the whole benchmark, that's why it's done in 1080p with 4090
seriously how do people not get that
should have done a power consumption comparison as well, otherwise awesome vid
The 7800X3D will be remembered for a long time
And the 7800X3D is yet to reach peak status. 1.5-2 years later, by the time even Zen 6 is released, the 7800X3D will be much cheaper but remain more than enough for the vast majority of gamers. And with ample headroom for future upgrades.
@@Rexperto6454My thoughts exactly! The only thing about the 9000 series I'm interested in is how much they drop the prices of the 7000 series 🙂
@@Rorschach1488_tell me you are salty for buying an Intel cpu without telling me you are salty for buying an Intel cpu.
@@josephk6136 At stock, yes, the 7800X3D is fantastic. Best CPU for a normal consumer. It is not the best when it comes to peak performance once you start overclocking. A 13/14900K overclocked with high-speed (tuned) DDR5 will destroy the 7800X3D in most games and benchmarks. It just comes down to whether you want to spend the extra money and FAFO with settings and BIOS for a week or two.
Then it'll age like milk, like the 5800X3D is doing right now, surrounded by similarly priced i5s and i7s.
I think including emulation in these tests would show some rather interesting differences. As the benchmark I recommend God of War 3/Ascension in RPCS3, as it destroys CPUs and is the hardest game to run in any emulator to date.
Zen 4 would likely pull ahead because the entire platform supports AVX-512.
@@riven4121 12th gen can be found with AVX-512, and its IPC is higher than Zen 4's. A 12700K beats a 7800X3D there.
Seems like someone is doing overtime before release of Zen 5 😂👍
@@RUclipsTookMyNickname.WhyNot It all depends where you test. He is testing in the Act 3 city, which is very CPU intensive.
Steve, is there any chance you can do these benchmarks with an RX 7900 XT or an RTX 4070 Super? Only a tiny percentage of gamers have an RTX 4090. It seems that the only reason to go above a Ryzen 7600 is if you have a 4090. I'd love to see the CPU sweet spot at 1080p, 1440p, and 4k if you have a midrange card. If you were to do this, you would probably be the only reviewer to provide this useful information to millions of gamers who use these midrange cards. Thanks.
I'm not ashamed to admit it: I bought an i9-14900K! It was $50 less than the i9-13900K. Why _wouldn't_ I buy it? I can handle things. I'm smart. Not like everybody says. Like dumb. I'm smart. And I want respect!
So, an important question, especially related to the scheduling issues with the AMD CPUs that have two CCDs but only one with V-Cache: what version of Windows 11 did you use for the testing? I'm reading that Windows 11 24H2 has largely sorted out the scheduling issues, so something like a 7950X3D would now perform basically the same as the 7800X3D, and some of those games, like Cyberpunk 2077, should get pretty much the same result for those two CPUs.
Was about to complain on not seeing the 5800x3d, but i decided to wait. My vega 56 is still mad btw.
Happy with my i9 14900kf for gaming and productivity. Respect for the 7800x3d, gaming beast
You scoff in 40K R23 while gaming.....
This entire video can be summed up as: just buy a 7800X3D and you're good to go!
And here I am, still using a Ryzen 7 3800X with my RTX 4070 😄Bought this CPU back in 2020 and it's still pretty solid.
You know, I'm still running my roughly 10-year-old i7-5820K at 3.3GHz, which along with my 12GB RX 6750 XT can still run any game I have (with the possible exception of Starfield, which still has unexpected graphics glitches and pauses) beautifully at 1080p. My system is also doing a ton of other things while I'm gaming, like being the hub for my local network backups, home media server, home security camera hub, as well as serving up a commercial web site through a Cloudflare tunnel. (No, it's not on wi-fi - just a gigabit Cat 5 cable instead.) Even with all that, most games only use about 30% to 40% of the CPU resources - so the notion that I need to buy a newer and much more expensive CPU every couple of years "for gaming" seems pretty absurd to me. Yeah, I usually try to upgrade my video card every 3 to 5 years or so - but the old 6-core, 12-thread, 22nm CPU just isn't a problem.
You are severely limited by that CPU, you can cope as much as you like, it's not going to change that fact.
You don’t seem to be aiming for high refresh gaming. I could limit my FPS to 30 and my CPU would last for ages. If you’re okay with that, it’s your call.
I want 100+ frames on all games. It can’t be stressed enough how much better of an experience it is.
I upgraded from a Haswell i7-4700MQ from 2013 to an i7-12700K in 2022. I more than doubled the single-thread performance, and the fps gains were easily 7x (floating point speed).
Smart of you to make two videos about the 9000 reviews.
"We're less than a month away from [this video being completely out of date and inaccurately titled, but we did all the testing and wanted to get two videos out of it instead of one so here]" - ftfy
Never miss an opportunity to learn something.
I probably missed it, in which case I apologize, but are the Intel 14th gen CPUs tested at the new "Intel Performance" power settings (PL1=253w, PL2=253w)?
AM4 is dead? How is it dead when B450/B550 and Ryzen 5000 sells huge numbers all over the World?
no good new CPUs are being released on AM4 anymore, and the fastest one you can get is the 5800X3D, which is comparable to the 7600
@@autumn42762 Is the 5700X3D somehow not a "good new CPU"? Some of you are so lost lol
@@madelaki You're the one who's lost if you think the 5700X3D is new or good. It's only good if you're already on AM4.
@@madelaki yes, the 5700/5800X3D are good, but they make the most sense as an upgrade (and even then, the AM5 R5 7600/7500F is much cheaper and has similar gaming performance)
there is really no point in building a new AM4 DDR4 system in 2024
AM4 isn't a dead platform - AMD is going to release new CPUs for it at the end of July alongside the new AM5 release. I would call AM4 a secondary platform, and still a worthwhile one for those who choose not to, or don't have the financial means to, switch from AM4 to AM5.
Got an i5-13500 with DDR4 last year and I'm pretty happy with it. It performs awesome in any normal game, but it also shines in the RPCS3 emulator thanks to its huge number of cores (compared to my R7 5700G). The core count is also really useful for production work, like rendering in 3ds Max in software mode.
I know a couple of subreddits that are going to REEEEEEE😂
r/hardware gonna downvote
@@JoeL-xk6bo and the Intel mob will point out that this channel is biased, that AMD didn't invent anything, and that Intel is cooking something better 😬
I bought the 13600K on launch before the 7800x3D existed and I'm very happy with it, I will of course consider AMD when it comes to my next CPU upgrade.
The 12700K doesn't get the love it deserves. Intel has been selling it stupidly cheap for months now, and I've built at least a dozen systems with it. Sure, it's not impressive compared to a 7800X3D with a 4090 running at 1080p, but when you pair it with a more realistic 3080 or 7800XT at 1440p, there's simply nothing from AMD that can compete with that $210 CPU. (Where's my AM5 R3, AMD?!)
Intel _knows_ they aren't in the lead anymore and their prices show it. X3D might be the king of gaming but the second you turn that game off core counts matter and I got a 14700k for a lot less than you would expect. Great prices for good CPUs are what made me love AMD, and Intel is killing it right now in the "bang for the buck" department.
There's no question if you're doing a "cost is irrelevant" build you go for a 7800X3D or better yet 7950X3D, but if you're building on a budget Intel is killing it if you want a lot of cores for your money.
The problem with the 12700k is that the 13600k beats it in most games and for less money.
@@LiesThatBind You can find a 12700K with AVX512 instructions beating a 7800X3D with AVX512 on PS3 emulation. There's nothing more CPU intensive than that. Heck, the i9-11900K which was the fastest CPU for PS3, with overclock loses against the 12700K *at stock* .
There's a 3% single-thread difference between the 12700K and 13600K, and the 12700K is easier to cool than a 13600K because it clocks at 4.7GHz (8 P-cores) while the 13600K clocks at 5.1GHz. At that point just get a 7600X, which is cheaper and clocks at 5.3GHz max.
6+8 big-little is just a weird number that won't age as well as 8+4.
Intel realized the 12700K was a mistake, so they dropped that 8+4 configuration. The only chip I know of with a similar split is the M2 Pro, also 8+4.
The 14700K will pull ahead of the 7800X3D once the latter's lack of raw multithread and clock muscle starts to show.
Right now the 12700K is pulling ahead of the 5800X3D.
Guys, you are legends! Great info. Can you do some updated "creator centric" benchmarking with benchmarks for Davinci Resolve, Blender and the like, it has been a couple of years since I have seen anything. Thanks!
Surprisingly, the 12700K, my choice for both gaming and productivity and rather "unfashionable or meh" when it launched, seems to have aged quite well. It now appears to be in the ballpark for results and value with the supposedly "better" 5800X3D 😵😵
In this vid, they were using ddr5. So since you're probably on ddr4, your 12700k will not get the same results
@@shoobadoo123 True, and I am aware of this. At the time it meant DDR4 at 32GB rather than DDR5 at 16GB within the budget, and it's nearly always a compromise when weighing gaming vs productivity. But for a processor that was virtually written off at launch, it doesn't do too badly at all!!
@@deepend9376 With DDR4 the 12700k is basically a 5900x. It's a pretty decent drop in performance.
Also, what's your power draw in gaming and under workloads?
Also, the US prices make me grin, as in Germany we can get a 5800X3D for 299 VAT included, and for a while it was down to 269 VAT included.
It is very game dependent, but in sims like ACC or DCS the 5800X3D still rocks with insanely high, smooth fps.
@@LupusAries Although the power is high during gaming (mine, with decent cooling, boosts WAY higher than spec) and DDR4 is probably 4-5% slower in games (remember, it won't run DDR5 particularly quickly), the extra memory makes up for it in productivity. Also, in productivity the iGPU and codecs give a real boost in some apps.
During general use the E-cores actually work, and idle/general office power use is lower than on my previous AMD rig. As the system generally runs off the solar panels or battery backup, power costs are not a major decision point.
Please be aware I'm not saying the 5800X3D is a poor choice or a bad chip. I am certainly NOT saying that; it's a fabulous chip. I'm only surprised that the 12700K gets as close as it does, as that was not what was suggested at release.
13:38 Better to include the price of the motherboard and RAM in cost per frame. Simply use the cost of the hardware used in the test. I think AM4 is going to dominate that chart.
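For what it's worth, that platform-inclusive cost-per-frame is easy to work out yourself from any review's average FPS numbers. A rough Python sketch, with entirely made-up prices and FPS figures just to show the math:

```python
# Cost-per-frame two ways: CPU price alone (what most charts show)
# vs. CPU + motherboard + RAM (what you actually pay for the platform).
# All numbers below are hypothetical, not from the video.

def cost_per_frame(cpu_price, avg_fps, platform_price=0.0):
    """Dollars per average FPS; optionally fold in platform cost."""
    return (cpu_price + platform_price) / avg_fps

# (name, CPU price, average FPS, mobo + RAM price) -- illustrative only
builds = [
    ("5700X3D (AM4/DDR4)", 200, 150, 160),
    ("7600 (AM5/DDR5)",    210, 160, 230),
]

for name, price, fps, platform in builds:
    bare = cost_per_frame(price, fps)
    full = cost_per_frame(price, fps, platform)
    print(f"{name}: ${bare:.2f}/fps CPU-only, ${full:.2f}/fps with platform")
```

With numbers like these, the AM4 chip looks slightly worse CPU-only but pulls ahead once the cheaper DDR4 board and RAM are counted, which is exactly why the chart choice matters.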
Hard to believe the 5600X is still the best value here. I guess it's still the sweet spot for me; gaming and Handbrake are the heaviest workloads I do.
The 13600k remains my top pick for most people for Intel
13400f at walmart for $166 keeps them all honest.
I like the 12700K more because of the 8+4 configuration.
The 6+8 config of the 13600K makes it look like an overclocked laptop i7-12700H which is basically what it is.
where did you get that cool lego building with all the windows?
Awesome video, but where are the i3s? AM4 and LGA1200 may be worse value for new systems but here they cost 200-300 euros per kit and the CPUs you used would cost 400+ euros in my country. I can't look at efficiency if the price is double.
Platform longevity sounds grossly over-rated to me: I'd never put a CPU on a 2+ generations older board as that usually means forfeiting some of the new CPU's key features.
What is the proportion of PCs that ever get a CPU upgrade? Likely around 1% when you account for the ~80% of the market that is laptops with soldered SoCs.
That is definitely the case but it's nice to have the option to just pop in a contemporary CPU+GPU in 2-3 years time for a mid-term upgrade.
It's me! Rockin a 5600 on a X370 board. Yes, I'm missing PCIe gen5, SAM, and maybe something else. But I don't really care that much.
I disagree, motherboards have become much less relevant to performance than they used to. I think 1 motherboard per DDR generation should be the new standard. If you had been able to get a good quality DDR4 board in 2016 that could run a 5800X3D that would be 6 years with no lost performance, potentially offering 2 or 3 cpu upgrades. PCIE3 is still sufficient for all but the highest end GPUs, and slower SSD sequential doesn't impact games. Really the only thing motherboards have to offer is USB 4.2x2.2 squared or whatever they decide to call it
Look at all the B650, B660, and B760 motherboards with only 4 USB ports, on both sides. Using multiple controllers or a racing wheel is going to cramp the space. Many people's PC I/O panels are like a switchboard as it is.
I'll take my 8 USB on the back of my Tomahawk Z790 any day.
I know for a fact the onboard sound choices are mid at best on most of those boards. The entire LGA1700 lineup from ASRock was stripped. Cheaper, but stripped down.
@@kramnull8962 If IO is your only reason for spending $100+ extra on a much higher-end motherboard, most mid-range boards around $170 have headers for some combination of 6-10 extra USB ports, basically all of the CPU and chipset IO that isn't already on the rear IO.
Beyond that, USB hubs are also an option. I have one on my desk simply to spare me the hassle of crawling under my desk whenever I need to temporarily plug something in.
I know this chip isn't in there, but the 7500F is a beast, especially overclocked. If you find good deals on RAM and a motherboard, you can go AM5 for such a good deal, and you'll have room for improvement over time.
5800X3D FTW
Sweet, glad I came back to this vid 5 months later. It helped me decide on going with a 7600X over a 7700X, since they're so close in gaming, plus it will eventually be replaced with the last AM5 8+ core X3D CPU they make for the platform. So saving $60+ is better than getting the 9800X3D in 2 months, at least for me, taking the cheap route on a long upgrade path.
The best CPU is the CPU you bought yesterday to play today's game.
7500f 😂
@@RUclipsTookMyNickname.WhyNot we're almost the same.
5700X, 4070 Ti Super, System Shock (1994) and System Shock 2 (1999).
Before my 7800X3D I had the 7700X and 9900KS, which I upgraded to from the i7-980XE I had for 10 years at 4.3GHz. The i7-980XE was able to play Metro Exodus at extreme settings at 3440x1440, 60fps (with a 1080 Ti FTW Hybrid). The game looks better than some titles today for a 2/2019 title.
11900K, Fallout: New Vegas, 8 - 12% CPU utilization xD
For me it's the opposite (for old games). Finally, with my 12700K I can get 100-110fps in Crysis 1 and play Minecraft without it being a stuttery mess when loading chunks.
Thanks for the update!
To say that nobody is gaming in 1080p anymore when most gamers can't afford high end 4k gaming pc's, shows ignorance.
Yep, 50%+ still game at 1080p according to the Steam hardware survey.
@@AlexHusTechWhich is massively skewed by internet cafes in Asia....
@@LupusAries These cafes, what resolution are they playing on?
4k superior
@@J1e9r9r6y No one is contesting that 4K is superior, man. It's just that a 4K gaming monitor is still way expensive, and so is a GPU capable of playing all games at 4K.
Hey guys, I'm a bit concerned that the Assetto Corsa Competizione benchmark you guys are running may be misleading.
In ACC, having more cars on track at the same time stresses the CPU much more than when there are only a few - I can see in the gameplay footage that there are only 16 cars on track, when races frequently have 30 cars or more. As such, a user of a 7800X3D and a 4090 may not see such high fps numbers, since more cars = more CPU stress = possibly fewer fps. Running a race with 30 cars (or however high you can set it) rather than 16 would likely give a better indication of how the CPUs perform against each other in this title.
The Opponent Visibility setting won't be an issue here as you're running the Epic preset, which sets this value to All by default, so just changing the number of AI cars will be sufficient.
If you have any questions about this, please do let me know - I have a lot of experience in using ACC (including esports experience both as an event organiser and competitor/driver), so I would likely be able to answer any questions you have on this.
My Brother in Christ W H E R E is the FSR 3.1 analysis. I need it.
2 days away still.
THIS
21:51 It looks like you don't remember the performance gain from the i7-11700K to the i7-12700K. If the generational increase turns out to be that big, you just pay the extra for the platform, and from a value standpoint it ends up being good because of the chips you get. No one I know upgrades their CPU within 3 years 🙄.
Why are you benchmarking at 1080p, no one uses that resolution anymore.
Competitive gamers still very much use it and it's basically your only choice if you want to use a 360hz+ monitor in some games
I use that resolution and my name isn't "no one". Besides, what about 165+ Hz monitors at 1440p or 4K?!? You're out of your mind, son!
In my rig the 13600K was a fantastic upgrade from what I had, & yeah, I needed the upgrade from the 7600K. Good to see it's still up there in this list.
The best gaming cpu is the one you already have
How are the prices of the 7800X3D in AU?
In Europe, since July 1st the 7800X3D has undergone steep price increases and is sold out at the major European distributors. Retailers raise the price of the 7800X3D several times a day.
insert 5800x3d circlejerk comment
I mean, it deserves the dang praise it gets. I wish it didn't run so hot out of the box though; I had to dial down some of the voltages to tame it. And the funny part is that I didn't even lose any performance, so it's needlessly aggressive out of the box.
5700x3d benchmarks pls
5800X3D is no longer interesting, it is way too expensive for what it offers. 5700X3D is the CPU, which deserves to be praised.
Although even the 5700X3D is outperformed by the 7500F, the true value king.
@@stangamer1151 5700X3D is still better in MMOS and some high fps games.
@@wizarian Not if you tune the RAM. Then the 7500F is faster than the 5700X3D in all games. DDR5 is a big improvement over DDR4, and that helps the 7500F a lot.