Im still just so happy with my AM4 set up all the way from a 1800X at launch to the upgrade to a 5800X3D last year it is/was just a unbelievably good platform...
@@josephnorris4095what's your resolution? I'm at 3440x1440 and my 5800x3d and 6900xt is doing alright but an upgrade would be awesome (gpu specifically)
@@donsly375thankyou. He’s a hardworking man who loved his family and always put us above himself. He sadly pass away in 2020 but I love him a lot. Miss you dad
@thkhoa8805 yeah x3d is goat status like 5800X3D. If the rumors are true and the 9800X3D will be able to sustain higher clocks compared to the 9700x and still overclock then it will be a Goat to anything on the market unless Intel proves otherwise.
@ksks6802 that's assuming that 800 wouldn't go to things like health expenses, rent, car repairs, or other things more important than a gaming pc. Hell it could even go to games instead.
@@shadow105720that's why it's called "savings". You try and not tap into it. If you cant save 20 bucks a month...you've made some unbelievably bad decisions in life. Go cry about it.
@@ksks6802 they could live in a country where working gives them $10 daily, which in 21 work days/month gives 2 days of work and 10% of earnings... you don't pay different prices for hardware, most often US pricing is the lowest since there's no VAT or other taxes/expenses
@@ksks6802 800$ for a marginal performance upgrade or invest that into equity of a company? I'd rather invest first. Only when its a huge upgrade it'll be worth forking out $ for an entirely new platform
@@MordWincer total volume of any is almost nonexistent. Like it was always low like any from 1% to 5% but now it's 0.1% as high to 0.001% like the people who use S.L.I/crossfire/mGPU would be in the "PC enthusiast class". But now everyone just trolls those people to death & calls them dumb for owning or using S.L.I./Crossfire/mGPU. All it's about now straight up gaming & most ideas come from Reddit which is a horrible place
@@Garde538 you good, bro? Lol idk why it’s expected, at least seemingly so, that one must be thanked or thanked back by channels that barely have time to take a sht. And besides, it’s not like I donated the gdp value of Coruscant lmao
It's definitely good, but being on AM4 with DDR4 it's definitely held back. I wouldn't say it's holding up to the Intel 13/14th gen or Ryzen 7000 at all, but it's easily the best AM4 chip and for a reasonable price. If you're using a higher-end card like a 4080 or 4090 or anything in the future then the 5800X3D isn't gonna cut it.
Good timing. My brother JUST asked me what's the best current cpu for gaming (no productivity. Pure gaming) as he's gonna build a PC again after 5 years without one. I just linked the video haha. Thank you for your service Steve. We appreciate the hard work
@@toseltreps1101 He knows how to find that information. I mentioned AMD in passing during a casual conversation, and he said "I think I'm going to build a PC again soon. What's the best CPU right now?". So I said 7800x3d, and linked the video in case he wanted to check it out himself. It's just a casual conversation between brothers. Sheesh, you are definitely not fun at parties 🤦
With my last power bill higher 28% (including 10% rebate for lower consumption) then previous one, power cost is more and more important... So much so that I started to cap FPS at 144. And they are already announcing more price rises...
@@johntotten4872 That may very well be, but i would rather see a chart in the video allowing me to compare that for myself instead of taking your word for it. It would also be really useful for cmparing the actual gains/watt compared to older very efficient gaming CPUs like the 5600x. The 7800X3D is anywhere from 30 to 100% faster...but how much more power does it use on average? That data is really much harder to come by than it should be. This video includes both of these CPUs. I was really looking forward to that comparison when i watched it - only to be left empty handed, so to speak.
I would have liked to see some more budget options here like the 12400f. That's down to about $100 now and I think it's pretty relevant in the $6-800 sort of budget range. Maybe the 3600 too, that's still available and with how cheap some am4 boards are there could be people interested in it. I would have been much more interested in that or some more locked i5s (which are very common in prebuilts) than having so many intel parts in both the k and kf variants or having 5700x and 5800x. Your opening note about those being the same cpu covers that well enough for me.
If it were included it would most likely top the price/perf chart. Saw it on Newegg for $120 the other day. It is an amazing budget/mid tier gaming Cpu.
Steve you've lost it. How is AM4 dead ... 5800X3D says otherwise, R5 5600 says otherwise. You can't pronounce things dead just because you wouldn't do it. IT CAN BE DONE, and it would be very useful to do so. What do they smoke in Aussie-land mate?
8:08 why are the benchmark results for AM4 so much lower than in this video from you guys from a month ago? ruclips.net/video/l3b7T5OohSQ/видео.html Am i missing something? they are both tests with a 4090 at 1080p at what i'd assume to be the same settings. Feel free to correct me if im wrong and i still enjoyed the video btw. Thanks for your work!
@@user-wq9mw2xz3j True AM4 (DDR4/5700X3D) is still a bit cheaper than AM5 (DDR5/7500f). But 7500f is also bit faster than 5700X3D. With tuned RAM it outperforms 5700X3D by ~15%. Plus AM5 offers great future upgradeability, unlike AM4, since 5700X3D is a dead end here. Taking into account these factors, AM5 totally worth a bit of a premium IMO.
It was available for some hours last week at 100 Euros in Germany last week. Now it's back at 146 but still the best deal on AM5. Especially since 7600 prices went up by 10% since May .
WHY ARE YOU BENCHMARKING WITH A 4090!?!?!1?!?/1/ 😡😡😡😡 YOU SHOULD BE USING THE EXACT MODEL OF MY GPU BECAUSE I LIKE IT AND YOU GUYS STINK!!!!!1!!!!!!!!11!!!!!!! GRRRRRRRR
I would guess the 5950x3d in these benchmarks is not properly setup ☺️ There is a number of things that must be performed before the 5950x3d can perform at its best in gaming. Gamers Nexus has done a video about it in the past! Its a pretty simple solution really of 3 steps. BIOS, AMD Utility Software and Xbox bar.
Been very happy with my i7 14700K, love seeing it is one of the top 5 cpus still! It screams in any game I throw at it, and its a beast in ps3 and Switch emulation (hello God of War 1 & 2 HD in 4k 60fps!) as well as it is a super useful cpu in production such as 3d modeling and renders. It's nice to know I can easily rock this cpu the next 5 yeard until i plan to upgrade my pc next around 2030
What i don't like about the 13th gen and 14th gen i7s is the awful perfomance per watt vs the 12700K. It makes them impractical for audio/music work, an AIO doesn't make sense, it must be air cooled. My 12700K has AVX512, easily beats a 7800X3D (also has AVX512) on PS3.
Was going to buy that chip last week. Then seen what's starting to happen with 13th/14th gen. I'll be avoiding completely now, especially seeing how they've been fobbing off gamers accusing them of using incorrect board settings. I'll be holding off for AMD 9000 CPUs now.
@@saricubra2867 performance per Watt is miles better on newer intel gen, - it's just that they can reasonably argue that when it comes to i7/i9 most of the people have the top of the line cooling solutions so do 253W cpus. Anyway - I cool my 14900k using air, and it's alright - not saying it's worth updating from 12700k though.
@@saricubra2867 It is. 12th gen is just before the intel's shift. They were more conservative with default CPUs PLs (now the boost to 253W is set as a target). AMD 7000 series, is by the same metrics less efficient. However if you lock down power limits, you'll find out that they're significantly more efficient and in fact don't need 200-250W.
So, of the AM4 results, the only one that runs into real issues at any point is... Oh, the processor that every channel recommended as their pRiCe tO pErFoRmAnCe KiNg, the 5600X. Because it's a 6-core, and some devs finally figured out how to utilize 8 properly. Sure, AM4 is irrelevant if you're building a 4090/7900XTX system today, but it's a $150ish savings versus AM5 (or LGA1700 + DDR5). On the "I don't want to spend $1000 on a graphics card" side of things, that's a free upgrade from a 7600XT/4060 to a 7700XT/7800XT (you weren't going to buy a 4060ti, were you?).
Could u please carefully tuning the 7950x3d to be okay? Don't just test it by all default, In this case, 7950x3d will lose a lot of game-performance and cannot fully exert its real game-capabilities... In fact, most games are 7950x3d>78x3d>149k Also, do u need me to teach u how to completely & comprehensive knock out of 78x3d by 7950x3d in all games?😏It is related to the "windows power plan" "close 8 ccd1 cores when playing games" already a not-optimal & outdated solution🤭
I won't be upgrading my own PC for quite some time (and Tim's latest video certainly hammered home why)... But I still like to keep on top of the latest goings on in the market! (I honestly find it more confusing to take a break and try and get caught up later, than to just stay on top of it now.) This channel is amazing for that! Sure, I check multiple reviewers, but if I was in too much of a time crunch to watch multiple videos, and had to be confident about something, I think I'd pick your videos. Also: Still the best B-roll in the business!
@@laurentiudll 12700K has more overclocking headroom than a 7600X. The latter already is around 5.2GHz, the 12700K is that fast and it clocks at 4.7GHz, the 7700X is definetly the trash CPU of Zen 4, it will fall apart if you set it to 12700K clocks.
I've had mine for about 7 months now and it's killer. No issues and tackles video editing and gaming like a champ. Good temps and power draw too. No complaints.
@@MiguelSilva-sm7xl that's quite more alright but is it so insanely more pricey that you can't afford to game with any part you want? Like even if PC while gaming uses say 500Wh is it really that costly? Say you are playing 5 hours a day every day for a whole month... isn't that 11.25$ in total spent (give or take some cents)? That's really not a lot.
21:51 It looks like you don't remember the perfomance gain from the i7-11700K to the i7-12700K. If the generational increase turns out to be that big, you just pay the extra for the platform and from a value standpoint it will end up being good because of the chips that you get, no one that i know upgrades their CPUs in 3 years 🙄.
7800X3D might just be even TOO good for AMD, as it will gonna reduce the hype for their new gen when benchmarks come up and they can barely beat it by 1-2% if at all
Great vid. I know that grand strategy games and 4x are shedogs to test, but it could be wonderful if you can touch on the subject in future videos. Like late game Victoria 3 or Civilization turntimes ect. Keep up the good work ❤
Yeah, I really hate how game performance is usually limited to spectacular 3D games, which aren't really CPU limited. There is also factory and colony management games like Factorio, Rimworld or Oxygen Not Included that are greatly affected by CPU speed and sometimes RAM / cache performance in the late game.
Got the 5700x3d like bit over a month ago for 200€ or so, great last breath for the am4 socket. As for going for am5 the 7800x3d wouldve been almost double the price, and motherboard (150-200) and 100 for ram.
Recently scored a 7900X3D for $280 US. For a mix of productivity and games, I couldn't pass it up. These charts are great to see where things generally land, but always keep an eye out for deals; you may find a diamond in the rough :) Also... didn't you forget to benchmark one particular game? ;) ha
I basically bought my 12700K new a months after it's launch for 330$. Not even the Ryzen 7 1700 at launch had that value. It always made the 5800X3D look completely stupid for me, then i tried a simulated 5800X3D by turning off the E-cores. 8 cores 16 threads really feel sluggish vs 12 cores with 20-24 threads. I never liked the 5800X3D, and for games when the X3D cache doesn't matter, the 5900X pulls ahead because of clocks and cores, it's much more faster in productivity.
13:38 Better to include the price of motherboard and RAM for cost per frame. Simply use the cost of the hardware used in the test. I think the AM4 going to dominate this chart. hard to believe the 5600x still the best here. I guess it is still the sweet spot for me, gaming and Handbrake only the heaviest workloads I do.
Here's your engagement comment. It's hilarious to me how many people still won't watch the videos explaining why you test with 1080p and a 4090, despite you specifically addressing that in the video. I'm wondering if they're just doing it intentionally at this point for a gag. In any case, thanks for the rundowns! Also, it looks like the video might not be in your description, so here it is for anyone who wants it: ruclips.net/video/Zy3w-VZyoiM/видео.html
That Counter Strike 2 benchmark results is nonsense. Is that a deathmatch or casual matches? Perfomance varies by a lot CPU wise depending on the game mode, smoke grenades make the game way more GPU intensive.
@@J1e9r9r6y No one is contesting that 4k is superior man, is only that a 4k gaming monitor is still way expensive and a GPU capable of playing 4k for all games as well.
remove all of the zen 5 and the new’s lntel and amd video right now! I removed your worst video that said “upgrade your pc is pointless” and does wrong
As of July 2024 AM4 CPUs are still out selling AM5 by a 2:1 margin. Even on Amazon the majority of CPUs sold are on AM4. The fact of the matter is the 7800X3D made the rest of Zen 4's lineup irrelevant. 12th Gen parts are amazing value and with PCIe 5.0 GPU support will last several more years. Personally I'd get a dirt cheap 12600K, 13600K or 5700X3D.
Though a dead platform now, I do feel like that retrospectively, the 13700K seems to have been quite a highlight of the LGA1700 platform. Mine has served me well, and I think it was good value for its time for mixed gaming and productivity rendering.
"Dead platform" is irrelevant when you don't upgrade often. 13700k is great don't worry. It'll serve us well for a few more gens. Offset the voltage down by 0.09v and it'll be a sleeping beast.
@@yuunashikiIt's still relevant. If you bought a 1000 series ryzen chip in 2017, you could buy a 5800x3D now and not need a new motherboard or ram. Thats 7 years apart. For a value minded buyer, the ability to upgrade to a new CPU for only the cost of the cpu itself 7 years later is a massive cost saving.
It is mate. Ive upgraded my GPU to 4070s and I'm on 12400, no visible bottlenecks at 1440p. All the games I play ulitilse GPU to 100% or I'm limited by the refresh rate of my monitor. Don't worry and enjoy
I’m running my 4080 on my Ryzen 7 5700. Could it be faster? Yes. But my monitors are my bottleneck as they have slow refresh rates and I primarily play on my 4K TV
I bought the 13600K on launch before the 7800x3D existed and I'm very happy with it, I will of course consider AMD when it comes to my next CPU upgrade.
When I got my new system I got a R9 7900X3D paired with 64GB ram, X670 MB and a RTX 4070Ti Super, and so far it has been an awesome. Hope to have this system last at least 10+ years.
@@MajoraEXP I was thinking about that, but with what my system cost in total I think that I could not go the top of the heap in the Ryzen series. I wanted to go to the 4080 Super but that was $500 AUD more than what I could afford.
@@Ghastly10 I would have gone with a 4080 but I only upgraded cause I won at the casino xD otherwise I'd still have a i5 9600k. If only I had won more. Quite a huge upgrade for me still
In all of these comparisons i remember people bashing me in the comments section for making the claim that the 12700K in better than the 5800X3D in every single aspect. After all of these years i'm right.
@@madelaki yes the 5700/5800x3d are good, but they make the most sense as a upgrade (even then, the AM5 R5 7600/7500f is much cheaper and it has similar gaming performance) there is really no point in building a new AM4 DDR4 system in 2024
AM4 isn't a Dead platform - AMD is gonna release new CPU's for it at the end of July along side the new AM5 release. I would call AM4 a secondary platform and still a worthwhile platform for those who choose not to or don't have the financial capability to just switch from AM4 to AM5.
If the perfomance gain of Arrow Lake is so massive, the 7800X3D will be irrelevant kinda like the 5800X3D is vs the 12700K. In my case if i run PS3 emulation, my 12700K matches a 7800X3D and loses everywhere else (minus productivity).
In your comments about Starfield, you note that the dual CCD of the 7950X3D and the way the game schedules its threads all meant its lows were hit, but the 14900 ALSO had a very similar hit, and the single CCD 7800X3D had better lows than the big hitters from Intel's 14th Gen, indicating to me that they ALSO were hit with a scheduling problem, hitting those weaker E cores, but you weren't pointing out the issue of "mutli-tile scheduling" only "mutli CCD scheduling". Fix it by pointing out that tile scheduling can hit in some games too.
His comment makes sense because that was a comparison between 2 cpus with similar performance (7800x3d and 7950x3d). While, yes, multitile surely can have an impact, there was nothing to compare to, since all the intel cpus of that level have e-cores. Meaning: while it's useful to point out that the 7800x3d can be a better purchase because of that issue over other am5 products, it doesn't matter for intel, since all the other cpus are all of the same.
@@MattJDylan That's my point, and why his doesn't, in that specific situation I pointed out and is solved with a change: it depends on how many threads the game engine will use. Starfield will use more than 8 threads, and are not well scheduled, so anything that HAS an option of which core to use has a scheduling problem. The 7800X3D doesn't HAVE a choice. All threads HAVE to cooperate on that single CCX. Intel's 14th Gen just upped the E core count to get multithreaded up to competitive with AMD, and so 8 of those threads will go to the faster P cores, but since it has a choice, it will put thread 9, 10, 11, etc, on an E core, meaning if the thread has to be waited for, the lows in the fps count will go down. Same for the two-chip vcache options: the other core isn't as fast, so it still has to wait until that thread ends, hurting the lows. The 12th Gen Intel doesn't have as much of a problem: it doesn't have many E cores to play threads on, so it will cooperatively multithread on a P core much more often. 13th Gen ups the 13600 count of E cores, so more threads are run on the vacant E cores, hitting the lows a bit more. And 14th Gen ups the count of E cores even more, hurting the lows even more. And since the higher quality chips from intel, like from AMD, use more "not faster cores" to handle more concurrent threads, the lows are hit more at the higher end (e.g. 13900) than the lower end (e.g. 14200). The AMD 7800 with vcache is a low end configuration that has a LOT more cache, which means more cooperative multitasking, but since it has a larger cache, more hits on-die so those threads operate a lot faster.
@@qu54re65 but we're talking about gaming, the 7900X3D is a inbetween CPU, worse than 7950X for productivity and worse than the 7800X3D for gaming while also being more expensive.
@@MaxIronsThird It's 5-10% worse in gaming than the 7800x3D while being 50%+ faster in multicore tasks. At the same time, it's ~20% slower than the 7950x3D in multicore tasks, but the 7950x3D costs currently 50% more. With the launch price it made no sense, with lowered prices it absolutely does make sense.
I'm not ashamed to admit it: I bought an i9-14900K! It was $50 less than the i9-13900K. Why _wouldn't_ I buy it? I can handle things. I'm smart. Not like everybody says. Like dumb. I'm smart. And I want respect!
Why dont you test CPUs at 720p or lower? if you test at 1080p because the popularity of that resolution you cant complain of people asking you to test at 1440p or 4k.
The 12700k doesn't get the love it deserves. Intel has been selling it for stupidly cheap for months now and I've built at least a dozen systems with it. Sure, it's not impressive compared to a 7800X3D with a 4090 running at 1080p, but when you pair it with a more realistic 3080 or 7800XT at 1440 there's simply nothing from AMD that can compete with that $210 CPU. (Where's my AM5 R3 AMD?!) Intel _knows_ they aren't in the lead anymore and their prices show it. X3D might be the king of gaming but the second you turn that game off core counts matter and I got a 14700k for a lot less than you would expect. Great prices for good CPUs are what made me love AMD, and Intel is killing it right now in the "bang for the buck" department. There's no question if you're doing a "cost is irrelevant" build you go for a 7800X3D or better yet 7950X3D, but if you're building on a budget Intel is killing it if you want a lot of cores for your money.
@@LiesThatBind You can find a 12700K with AVX512 instructions beating a 7800X3D with AVX512 on PS3 emulation. There's nothing more CPU intensive than that. Heck, the i9-11900K which was the fastest CPU for PS3, with overclock loses against the 12700K *at stock* . 3% singlethread difference between 12700K and 13600K, the 12700K is easier to cool than a 13600K because it clocks at 4.7GHz (8 p-cores), the 13600K clocks at 5.1GHz, at that point just get a 7600X that is cheaper and clocks at 5.3GHz max. 6+8 Big-Little is just a weird number than won't age as well as 8+4. Intel realizes that the 12700K was a mistake, so they forgot that 8+4 number. I only know one chip that is similar to that anomaly, it's the M2 Pro also 8+4.
14700K will pull ahead of the 7800X3D once the latter lacks the raw multithread and clocks muscle. Right now the 12700K is pulling ahead of the 5800X3D.
FWIW, the 12600K is now $158, making it far and away the best cost per frame on here…also not included is the $109 12400F, which could easily have a spot as well. Just something to consider.
1080p is best for CPU benchmarks for getting maximum FPS, as CPUs are responsible for fps, the resolution and upscaling is taken care by GPU so no point in checking in 4k as games will become limited to GPUs capability and showing off the same FPS on all CPUs even if one is worse or better than others
This is exactlt why you SHOULD test at 4K, this isn't 2011 anymore most people watching will already have a Gamung Computer and look to these mega benchmarks to inform them unto which CPU to upgrade to next. If he tested at 4k, it would most likely show that most people are fine with their CPU and it wiuld show then where to spend their money next. But I only know this because I game at 4K and 8K. Most people don't and will get the impression they need to upgrade the CPU to the 7800X3d or whatever to get the best performance when in reality the higher the resolution you go the less relevant the CPU becomes
@@CallMeRabbitzUSVI Most people are smart enough to get a CPU that supports the frame rate their GPU can produce in their favorite games at their monitor's resolution (which, according to the most recent Steam poll, is most likely 1920x1080 anyway).
@@CallMeRabbitzUSVI ?? it's a cpu benchmark test, not letting the gpu bottleneck the cpu and showing the difference is THE POINT of the whole benchmark, that's why it's done in 1080p with 4090 seriously how do people not get that
I mean it deserves the dang praise it gets. I wish it didn't heat so aggressively out of the box though, had to dial down some of the voltages to tame it. And the funny part is that I didn't even lose any performance, so it is needlessly aggressive out the box.
5800X3D is no longer interesting, it is way too expensive for what it offers. 5700X3D is the CPU, which deserves to be praised. Although even 5700X3D is outperformed by 7500f, the true value king.
And the 7800X3D is yet to reach peak status. 1.5-2 years later, by the time even Zen 6 is released, the 7800X3D will be much cheaper but remain more than enough for the vast majority of gamers. And with ample headroom for future upgrades.
@@josephk6136 At stock, yes, 7800X3D is fantastic. Best CPU for a normal consumer. It is not the best when it comes to peak performance when you start to overclock. A 13/14900K overclocked with high speed DDR5 (tuned) will destroy the 7800X3D in most games and benchmarks. It just comes down to you wanting to spend the extra money and FAFO with setting and bios for a week or two.
Before my 7800X3D I had the 7700x and 9900ks, which I Upgraded from the i7 980xe Ihad for 10 years at 4.3 ghz. The i7 980xe was able to play Metro exodus at extreme settings at 3440x1440p 60 fps. The game looks better than some titles today for a 2/2019 title ( with 1080ti ftw hybrid).
For me it's the opposite (for old games). Finally with my 12700K i can get 100-110fps on Crysis 1 and playing Minecraft without being a stuttery mess when loading chunks.
I strongly believe the results should be set in order following 1% low instead of average, since the global experience will be determined more for those deeps than the averages...
Digital Foundry would disagree with you. What 1% shown here and 1% in your mind is different. What you're thinking about is frametime consistency. What if i spend 90% of my time in a not busy environment of the game and only 10% on the heaviest part? The 1% wouldn't even do justice. Meanwhile frametime consistency really represents the feeling of judder or hitch even though you're at 60+ fps constantly, those 60 frames could be bunched up in the last half of 1000ms, instead of evenly 60fps/1000ms. So in this case avg is better because we're measuring peak performance, not the consistency of the game itself
You know, I'm still running my roughly 10 year old i7-5820k at 3.3GHZ which along with my 12GB RX-6750XT can still can run any game I have (with the possible exception of StarField that still has unexpected graphics glitches and pauses) beautifully at 1080p. My system is also doing a ton of other things while I'm gaming too, like being the hub for my local network backups, home media server, home security camera hub, as well as serving up a commercial web site through a CloudFlare tunnel. (No, it's not on wi-fi - just a g-bit CAT-5 cable instead). Even with all that, most games are only using about 30% to 40% of the CPU resources - so the notion that I need to be buying a newer and much more expensive CPU every couple of years "for gaming" seems pretty absurd to me. Yeah, I usually try to upgrade my video card every 3 to 5 years or so - but the old 6 core, 12 thread, 22nm CPU just isn't a problem.
You don’t seem to be aiming for high refresh gaming. I could limit my FPS to 30 and my CPU would last for ages. If you’re okay with that, it’s your call. I want 100+ frames on all games. It can’t be stressed enough how much better of an experience it is.
I upgraded from Haswell i7-4700MQ from 2013 to an i7-12700K in 2022. I more than doubled the single thread perfomance gain, fps gains easily 7 times (floating point speed).
1080p benchmarks, well thank you very much for the useless info, got 7800xt and 7500f and they cost the same 1year later, I have never seen such thing in 30 years shopping PC parts, now comment on this.
I doubt I could explain basic economics to you if you can't understand why CPU benchmarking with CPU limited testing is anything but useless information.
"We're less than a month away from [this video being completely out of date and inaccurately titled, but we did all the testing and wanted to get two videos out of it instead of one so here]" - ftfy
I like the 12700K more because of the 8+4 configuration. The 6+8 config of the 13600K makes it look like an overclocked laptop i7-12700H which is basically what it is.
Just like death and taxes, there will still be people in the comments asking why are we still benchmarking cpu on 1080p.
Yeah, lets go back to using 720p for CPU benching.
Might be their first pc😉
I remember when they did 360p for cpu benchmarks in 1999.
And here I was, this weekend, searching for 1080p monitors on RTINGS.
@@arc00ta tpu has u covered 😅.
Im still just so happy with my AM4 set up all the way from a 1800X at launch to the upgrade to a 5800X3D last year it is/was just a unbelievably good platform...
Wow, very good value, what Mobo?
@@AlexHusTech MSI X370 Gaming Pro Carbon... Not even a particularly good board but it served me well over the last years
Yep, and I see no reason to upgrade at all, especially since I only have a RX6800XT and a R9 5900X.
@@josephnorris4095what's your resolution? I'm at 3440x1440 and my 5800x3d and 6900xt is doing alright but an upgrade would be awesome (gpu specifically)
The GOAT!
AMD is goated for producing the 7800x3D
your daddy is goat
and you are gonna say that again for the prolly 9800x3D something later this year bro
@@donsly375thankyou. He’s a hardworking man who loved his family and always put us above himself. He sadly pass away in 2020 but I love him a lot. Miss you dad
@@dogdie147 My heart is with you brother
@thkhoa8805 yeah x3d is goat status like 5800X3D. If the rumors are true and the 9800X3D will be able to sustain higher clocks compared to the 9700x and still overclock then it will be a Goat to anything on the market unless Intel proves otherwise.
I think Im gonna use my 5600 until it or my motherboard dies or I unexpectedly get rich enough to not care about money.
Rich? You mean your horrible at saving money. Saving just 20 bucks a month in between product releases will add up to like 800 bucks. Poor life skills
@ksks6802 that's assuming that 800 wouldn't go to things like health expenses, rent, car repairs, or other things more important than a gaming pc. Hell it could even go to games instead.
@@shadow105720that's why it's called "savings". You try and not tap into it. If you cant save 20 bucks a month...you've made some unbelievably bad decisions in life. Go cry about it.
@@ksks6802 they could live in a country where working gives them $10 daily, which in 21 work days/month gives 2 days of work and 10% of earnings... you don't pay different prices for hardware, most often US pricing is the lowest since there's no VAT or other taxes/expenses
@@ksks6802 800$ for a marginal performance upgrade or invest that into equity of a company? I'd rather invest first. Only when its a huge upgrade it'll be worth forking out $ for an entirely new platform
"If you believe yourself to be a PC enthusiast" 🤣 golden!!!
Pc enthusiast doesn't exsit anymore.
@@kevinerbs2778 Yep, we died out when RGB took over 🌈
@@kevinerbs2778 what do you mean by that?
Based on this video, what would be the best video card for full HD using the 5950x and 5800x3d ultra quality?
@@MordWincer total volume of any is almost nonexistent. Like it was always low like any from 1% to 5% but now it's 0.1% as high to 0.001% like the people who use S.L.I/crossfire/mGPU would be in the "PC enthusiast class". But now everyone just trolls those people to death & calls them dumb for owning or using S.L.I./Crossfire/mGPU. All it's about now straight up gaming & most ideas come from Reddit which is a horrible place
Thanks, Australian Steve!
No thanks back 💀
When one Steve goes to sleep, the other is just waking up. How lucky we are to have 24/7 Steve.
@@Osprey850 *_STEVE INTERNATIONAL 24/7_*
@@Garde538 you good, bro? Lol idk why it’s expected, at least seemingly so, that one must be thanked or thanked back by channels that barely have time to take a sht. And besides, it’s not like I donated the gdp value of Coruscant lmao
Great performance from the 5800X3D. It practically still holds up with current generation.
It's still going strong for a DDR4 platform. Most likely gonna skip the 7 and 9k series.
still rocking mine with 12gb 3080 and its strong
5800x3d is the 1080 Ti of CPUs. Punching hard for a long time
i'm running it with a 4080 super 💪
It's definitely good, but being on AM4 with DDR4 it's definitely held back. I wouldn't say it's holding up to the Intel 13/14th gen or Ryzen 7000 at all, but it's easily the best AM4 chip and for a reasonable price. If you're using a higher-end card like a 4080 or 4090 or anything in the future then the 5800X3D isn't gonna cut it.
There should be a golden light coming out of that 4090 box, like in Pulp Fiction
like the spongebob special spatula
RTX MacGuffin
We happy?
@@whoeverthisguyisthank you for making me smile
You mean like when on it catches on fire?
Good timing. My brother JUST asked me what's the best current cpu for gaming (no productivity. Pure gaming) as he's gonna build a PC again after 5 years without one. I just linked the video haha. Thank you for your service Steve. We appreciate the hard work
so he is smart enough to build a pc but too stupid to figure out the best components?
@@frankieinjapanstill, someone who do not know about ryzen 7 7800x3d is ignorant
@@toseltreps1101 He knows how to find that information. I mentioned AMD in passing during a casual conversation, and he said "I think I'm going to build a PC again soon. What's the best CPU right now?". So I said 7800x3d, and linked the video in case he wanted to check it out himself. It's just a casual conversation between brothers. Sheesh, you are definitely not fun at parties 🤦
@@toseltreps1101dont talk anymore, ever.
@@toseltreps1101 bruh
Not a single Power Consumption Chart in the entire Video? Makes me sad. I would have LOVED to see an fps/watt chart overview of these CPUs.
Yep, leaving out the Power Consumption and 4k Benchmarks makes this entire video incomplete
With my last power bill higher 28% (including 10% rebate for lower consumption) then previous one, power cost is more and more important... So much so that I started to cap FPS at 144.
And they are already announcing more price rises...
Simple really, if you care about power usage then buy a 7000 series Ryzen. They are the most efficient.
@@CallMeRabbitzUSVI 4k is pointless because they'd almost all be the same FPS
@@johntotten4872 That may very well be, but i would rather see a chart in the video allowing me to compare that for myself instead of taking your word for it. It would also be really useful for cmparing the actual gains/watt compared to older very efficient gaming CPUs like the 5600x. The 7800X3D is anywhere from 30 to 100% faster...but how much more power does it use on average? That data is really much harder to come by than it should be. This video includes both of these CPUs. I was really looking forward to that comparison when i watched it - only to be left empty handed, so to speak.
Really would've loved to see the 12400 in there as Intels last cpu without e cores....
Just subtract like ~5% from 5600X results
@@shoobadoo123 no
12600 was the last without E cores.
@@moebius2k103 12600 has e-cores, 12400 doesn't
@@moebius2k103 Not true
7800x3d - "all I do is win, I am perfection, I am GOD!!!!"
best CPU is not even debatable
12400 omission hurts a little but I'll forgive you this time...
Agreed. My 12400F has been a trooper.
They even forgot 12100f😮
Just subtract like ~5% from 5600X results
Agree, saw the 12400f on Newegg the other day for $120. 12100f was under $100. Amazing budget Cpus.
@@shoobadoo123no? 12400f is en par with 5600x and not 5% worse.
I would have liked to see some more budget options here like the 12400f. That's down to about $100 now and I think it's pretty relevant in the $6-800 sort of budget range. Maybe the 3600 too, that's still available and with how cheap some am4 boards are there could be people interested in it. I would have been much more interested in that or some more locked i5s (which are very common in prebuilts) than having so many intel parts in both the k and kf variants or having 5700x and 5800x. Your opening note about those being the same cpu covers that well enough for me.
where is 12400?
Didn't make the cut.
If it were included it would most likely top the price/perf chart. Saw it on Newegg for $120 the other day. It is an amazing budget/mid tier gaming Cpu.
@@mrhassell Take 9zzz 12 months to mature anyways. Just like 7xxx.
Steve you've lost it. How is AM4 dead ... 5800X3D says otherwise, R5 5600 says otherwise. You can't pronounce things dead just because you wouldn't do it. IT CAN BE DONE, and it would be very useful to do so.
What do they smoke in Aussie-land mate?
well done sir. i've been waiting for another massive cpu scaling video from you guys for sometime. Thank you! damn...look at that 12600k go!
The $166 13400f at Walmart with DD43600 ram With a good z790 would still be a stocking stuffer. If you need/want more than 4 USB ports.
8:08 why are the benchmark results for AM4 so much lower than in this video from you guys from a month ago? ruclips.net/video/l3b7T5OohSQ/видео.html
Am i missing something? they are both tests with a 4090 at 1080p at what i'd assume to be the same settings. Feel free to correct me if im wrong and i still enjoyed the video btw. Thanks for your work!
this entire video can be summed up saying : just buy a 7800x3d and your good to go!
HUB doing God's work out here
Thanks to AMD sponsoring these guys behind the scene lol
7500f via AliExpress still the cost per frame goat
FR
True. It outperforms even 5700X3D in terms of price/performance ratio and demolishes 5800X3D, which is crazily expensive.
Depends entirely on what games one plays.
And as for am4/5700x3d, the platform cost with mobo+ram is a bit cheaper@stangamer1151
@@user-wq9mw2xz3j True
AM4 (DDR4/5700X3D) is still a bit cheaper than AM5 (DDR5/7500f). But 7500f is also bit faster than 5700X3D. With tuned RAM it outperforms 5700X3D by ~15%. Plus AM5 offers great future upgradeability, unlike AM4, since 5700X3D is a dead end here.
Taking into account these factors, AM5 totally worth a bit of a premium IMO.
It was available for some hours last week at 100 Euros in Germany last week. Now it's back at 146 but still the best deal on AM5. Especially since 7600 prices went up by 10% since May .
WHY ARE YOU BENCHMARKING WITH A 4090!?!?!1?!?/1/ 😡😡😡😡 YOU SHOULD BE USING THE EXACT MODEL OF MY GPU BECAUSE I LIKE IT AND YOU GUYS STINK!!!!!1!!!!!!!!11!!!!!!! GRRRRRRRR
Looking young and healthy Steve, thanks for the hard work, again
I would guess the 5950x3d in these benchmarks is not properly setup ☺️ There is a number of things that must be performed before the 5950x3d can perform at its best in gaming. Gamers Nexus has done a video about it in the past! Its a pretty simple solution really of 3 steps. BIOS, AMD Utility Software and Xbox bar.
Been very happy with my i7 14700K, love seeing it is one of the top 5 cpus still! It screams in any game I throw at it, and its a beast in ps3 and Switch emulation (hello God of War 1 & 2 HD in 4k 60fps!) as well as it is a super useful cpu in production such as 3d modeling and renders. It's nice to know I can easily rock this cpu the next 5 yeard until i plan to upgrade my pc next around 2030
What i don't like about the 13th gen and 14th gen i7s is the awful perfomance per watt vs the 12700K. It makes them impractical for audio/music work, an AIO doesn't make sense, it must be air cooled. My 12700K has AVX512, easily beats a 7800X3D (also has AVX512) on PS3.
Was going to buy that chip last week. Then seen what's starting to happen with 13th/14th gen. I'll be avoiding completely now, especially seeing how they've been fobbing off gamers accusing them of using incorrect board settings.
I'll be holding off for AMD 9000 CPUs now.
@@saricubra2867
performance per Watt is miles better on newer intel gen, - it's just that they can reasonably argue that when it comes to i7/i9 most of the people have the top of the line cooling solutions so do 253W cpus.
Anyway - I cool my 14900k using air, and it's alright - not saying it's worth updating from 12700k though.
@@tteqhu "is miles better on newer Intel gen"
Is not because they pushed the sillicon too far.
@@saricubra2867
It is.
12th gen is just before the intel's shift. They were more conservative with default CPUs PLs (now the boost to 253W is set as a target).
AMD 7000 series, is by the same metrics less efficient.
However if you lock down power limits, you'll find out that they're significantly more efficient and in fact don't need 200-250W.
So, of the AM4 results, the only one that runs into real issues at any point is...
Oh, the processor that every channel recommended as their pRiCe tO pErFoRmAnCe KiNg, the 5600X. Because it's a 6-core, and some devs finally figured out how to utilize 8 properly. Sure, AM4 is irrelevant if you're building a 4090/7900XTX system today, but it's a $150ish savings versus AM5 (or LGA1700 + DDR5). On the "I don't want to spend $1000 on a graphics card" side of things, that's a free upgrade from a 7600XT/4060 to a 7700XT/7800XT (you weren't going to buy a 4060ti, were you?).
Q, is the 14900k in this result using performance or extreme power limits?
Extreme profile 253/253W
@@Hardwareunboxed wow u replied!, cheers for all the handwork you and your team does and ty for that clarification too
@@Hardwareunboxed Was gonna ask this same question, thankyou for replying! (Might be helpful to pin this info or add it into the video description.)
Could u please carefully tuning the 7950x3d to be okay? Don't just test it by all default, In this case, 7950x3d will lose a lot of game-performance and cannot fully exert its real game-capabilities... In fact, most games are 7950x3d>78x3d>149k
Also, do u need me to teach u how to completely & comprehensive knock out of 78x3d by 7950x3d in all games?😏It is related to the "windows power plan"
"close 8 ccd1 cores when playing games" already a not-optimal & outdated solution🤭
I won't be upgrading my own PC for quite some time (and Tim's latest video certainly hammered home why)... But I still like to keep on top of the latest goings on in the market! (I honestly find it more confusing to take a break and try and get caught up later, than to just stay on top of it now.) This channel is amazing for that! Sure, I check multiple reviewers, but if I was in too much of a time crunch to watch multiple videos, and had to be confident about something, I think I'd pick your videos.
Also: Still the best B-roll in the business!
I dont understand why anyone buys intel for gaming, especially a 14900 or 13900k...
More expensive and power draw than 7800x3d...
7600x my beloved
especially overclocked
@@laurentiudll 12700K has more overclocking headroom than a 7600X. The latter already is around 5.2GHz, the 12700K is that fast and it clocks at 4.7GHz, the 7700X is definetly the trash CPU of Zen 4, it will fall apart if you set it to 12700K clocks.
AM4 is goated. nuf said. Intel has had his era of forcing consumers to change platform every 2 gens. While AM4 been running forever. lol
7800x3d is such a tempting pick rn
Always has been. 🙂
I would wait for the 9800X3D which is rumored to launch in a few months
@@TheKims82it will be 20-25% faster in games compared to the 7800x3d. Intel needs Arrowlake to do very well in IPC uplift.
Me too...
I've had mine for about 7 months now and it's killer. No issues and tackles video editing and gaming like a champ. Good temps and power draw too. No complaints.
OmG!!!! I cAn’T bElIeVe yOu sTiLl tEsT aT 1080
Got my 7500f for $100, considering it's only 5% slower than the 7600 the cost per frame is actually insane.
FR
"Why do you use 1080p" yea, wheres my 720p 🤓
Liked the review, just wish it had an average power draw per CPU, as energy costs aren't something to be underestimated these days
Don't you see, Reviewers live in a limitless energy bubble where the elercticity is live and free (Energy Pun)
How much is 1 kW where you live? Here it's like 0.092$ per kW spent.
@@sermerlin1 It depends a lot on the contracted power and company (e.g. joining energy + gas gives you benefits) but I currently pay 0.1472€ per kWh.
@@MiguelSilva-sm7xl that's quite more alright but is it so insanely more pricey that you can't afford to game with any part you want?
Like even if PC while gaming uses say 500Wh is it really that costly?
Say you are playing 5 hours a day every day for a whole month... isn't that 11.25$ in total spent (give or take some cents)? That's really not a lot.
AM4 and AM5 CPU's use far less power than Intel 1700 ones so there is probably not much point 🫤🤷♂️
21:51 It looks like you don't remember the perfomance gain from the i7-11700K to the i7-12700K. If the generational increase turns out to be that big, you just pay the extra for the platform and from a value standpoint it will end up being good because of the chips that you get, no one that i know upgrades their CPUs in 3 years 🙄.
7:25 When your 1% low is higher than your competitor's average frame rate...
LOL
7800X3D might just be even TOO good for AMD, as it will gonna reduce the hype for their new gen when benchmarks come up and they can barely beat it by 1-2% if at all
Great vid.
I know that grand strategy games and 4x are shedogs to test, but it could be wonderful if you can touch on the subject in future videos. Like late game Victoria 3 or Civilization turntimes ect.
Keep up the good work ❤
Yeah, I really hate how game performance is usually limited to spectacular 3D games, which aren't really CPU limited.
There is also factory and colony management games like Factorio, Rimworld or Oxygen Not Included that are greatly affected by CPU speed and sometimes RAM / cache performance in the late game.
Got the 5700x3d like bit over a month ago for 200€ or so, great last breath for the am4 socket.
As for going for am5 the 7800x3d wouldve been almost double the price, and motherboard (150-200) and 100 for ram.
No 12400 or 12100?
maybe with DDR4 for a more realistic build, the 12400 is a much better choice than the 5600X.
Recently scored a 7900X3D for $280 US. For a mix of productivity and games, I couldn't pass it up. These charts are great to see where things generally land, but always keep an eye out for deals; you may find a diamond in the rough :)
Also... didn't you forget to benchmark one particular game? ;) ha
I basically bought my 12700K new a months after it's launch for 330$. Not even the Ryzen 7 1700 at launch had that value.
It always made the 5800X3D look completely stupid for me, then i tried a simulated 5800X3D by turning off the E-cores. 8 cores 16 threads really feel sluggish vs 12 cores with 20-24 threads. I never liked the 5800X3D, and for games when the X3D cache doesn't matter, the 5900X pulls ahead because of clocks and cores, it's much more faster in productivity.
13:38 Better to include the price of motherboard and RAM for cost per frame. Simply use the cost of the hardware used in the test. I think the AM4 going to dominate this chart.
hard to believe the 5600x still the best here. I guess it is still the sweet spot for me, gaming and Handbrake only the heaviest workloads I do.
Here's your engagement comment.
It's hilarious to me how many people still won't watch the videos explaining why you test with 1080p and a 4090, despite you specifically addressing that in the video. I'm wondering if they're just doing it intentionally at this point for a gag. In any case, thanks for the rundowns!
Also, it looks like the video might not be in your description, so here it is for anyone who wants it:
ruclips.net/video/Zy3w-VZyoiM/видео.html
That Counter Strike 2 benchmark results is nonsense. Is that a deathmatch or casual matches?
Perfomance varies by a lot CPU wise depending on the game mode, smoke grenades make the game way more GPU intensive.
To say that nobody is gaming in 1080p anymore when most gamers can't afford high end 4k gaming pc's, shows ignorance.
Yep 50%+ still game at 1080P according to steam hardware stats.
@@AlexHusTechWhich is massively skewed by internet cafes in Asia....
@@LupusAries These cafes, what resolution are they playing on?
4k superior
@@J1e9r9r6y No one is contesting that 4k is superior man, is only that a 4k gaming monitor is still way expensive and a GPU capable of playing 4k for all games as well.
remove all of the zen 5 and the new’s lntel and amd video right now!
I removed your worst video that
said “upgrade your pc is pointless” and does wrong
Gotta love my i7 2600 in my Z67 sabertooth matabored! 13 years old and STILL RUNNING!!!
at 15fps on rdr2.
Naaaaaahhhh I’ve got more than that on a 2600k
@@TheBottleneckedGamer really? My top tier gt730 would beg to differ.
My 2410m and GT540M from 2011 were doing well* untill a few weeks ago.
* at 9fps
I own a Haswell i7-4700MQ laptop that gave me higher fps on CSGO than a *desktop* i7-3770, let that sink in.
As of July 2024 AM4 CPUs are still out selling AM5 by a 2:1 margin. Even on Amazon the majority of CPUs sold are on AM4. The fact of the matter is the 7800X3D made the rest of Zen 4's lineup irrelevant. 12th Gen parts are amazing value and with PCIe 5.0 GPU support will last several more years. Personally I'd get a dirt cheap 12600K, 13600K or 5700X3D.
Though a dead platform now, I do feel like that retrospectively, the 13700K seems to have been quite a highlight of the LGA1700 platform. Mine has served me well, and I think it was good value for its time for mixed gaming and productivity rendering.
Same. Love mine.
"Dead platform" is irrelevant when you don't upgrade often. 13700k is great don't worry. It'll serve us well for a few more gens. Offset the voltage down by 0.09v and it'll be a sleeping beast.
@@yuunashikiIt's still relevant. If you bought a 1000 series ryzen chip in 2017, you could buy a 5800x3D now and not need a new motherboard or ram. Thats 7 years apart. For a value minded buyer, the ability to upgrade to a new CPU for only the cost of the cpu itself 7 years later is a massive cost saving.
@@JoeNokers I think you may have misunderstood my comment.
13700k is an awesome Cpu. Had mine since release and plan to keep it a few more gens.
*WhY oNlY 1080P tEsTiNg, wE wAnT tEsTiNg iN 16K!*
I have an i5-12400f and just ordered a 7800 XT, coming from a 6600 non XT for 1440p. Is the CPU still good or should i upgrade?
your CPU is about 8% slower than the 12600K benchmarked here
12400f is perfect imo. Im using it for my 7900 xt build basement system and its amazing value at 1440p
the 12400f is still good enough until like 7900xt or 4070 ti super; so no need to upgrade the cpu yet, at 1440P
It is mate. Ive upgraded my GPU to 4070s and I'm on 12400, no visible bottlenecks at 1440p. All the games I play ulitilse GPU to 100% or I'm limited by the refresh rate of my monitor. Don't worry and enjoy
aight good to know, thanks guys
I’m running my 4080 on my Ryzen 7 5700. Could it be faster? Yes. But my monitors are my bottleneck as they have slow refresh rates and I primarily play on my 4K TV
I know a couple of subreddit that are going to REEEEEEE😂
r/hardware gonna downvote
@@JoeL-xk6boand Intel mob will point out that this channel is biased, that AMD didn't invented anything, and that Intel is cooking something better 😬
The 5600 still plenty enough for 1080p gaming
Thanks Steve! (And team) Great info.
I bought the 13600K on launch before the 7800x3D existed and I'm very happy with it, I will of course consider AMD when it comes to my next CPU upgrade.
I like pure numbers..that's what we've been waiting for Steve..gg from Germany
More am5 board test pls 😅
12600KF for $159 dethrones 5600x in best fps/$
Seems like someone is doing overtime before release of Zen 5 😂👍
@@RUclipsTookMyNickname.WhyNot It all depends where you test. He is testing in the city act 3 which is very cpu intensive.
okay so it was 100% worth it to get 5600 in 2024, I can now sleep peacefully
When I got my new system I got a R9 7900X3D paired with 64GB ram, X670 MB and a RTX 4070Ti Super, and so far it has been an awesome. Hope to have this system last at least 10+ years.
you will be good its just you will have to upgrade that graphics card every 3-4 years in that 10 year time frame.
I got a 7950x3d and 4070 super with 64GB of RAM, I'm hoping for around 5 years. Tech just advances so fast.
@@msg360 Thank you, yeah will be interesting to see what the future holds for GPU's and other hardware.
@@MajoraEXP I was thinking about that, but with what my system cost in total I think that I could not go the top of the heap in the Ryzen series. I wanted to go to the 4080 Super but that was $500 AUD more than what I could afford.
@@Ghastly10 I would have gone with a 4080 but I only upgraded cause I won at the casino xD otherwise I'd still have a i5 9600k. If only I had won more. Quite a huge upgrade for me still
In all of these comparisons i remember people bashing me in the comments section for making the claim that the 12700K in better than the 5800X3D in every single aspect. After all of these years i'm right.
AM4 is dead? How is it dead when B450/B550 and Ryzen 5000 sells huge numbers all over the World?
no new good cpu's being released on am4 anymore, and the fastest one you can get is the 5800x3d which is comparable to the 7600
@@autumn42762 Is the 5700X3D somehow not a "good new CPU"? Some of you are so lost lol
@@madelaki You're the one who's lost if you think the 5700X3D is new or good. It's only good if you're already on AM4.
@@madelaki yes the 5700/5800x3d are good, but they make the most sense as a upgrade (even then, the AM5 R5 7600/7500f is much cheaper and it has similar gaming performance)
there is really no point in building a new AM4 DDR4 system in 2024
AM4 isn't a Dead platform - AMD is gonna release new CPU's for it at the end of July along side the new AM5 release. I would call AM4 a secondary platform and still a worthwhile platform for those who choose not to or don't have the financial capability to just switch from AM4 to AM5.
I guess it's time for me to stop being an Intel fanboy an upgrade my 9700k to a 7800x3d.
If the perfomance gain of Arrow Lake is so massive, the 7800X3D will be irrelevant kinda like the 5800X3D is vs the 12700K. In my case if i run PS3 emulation, my 12700K matches a 7800X3D and loses everywhere else (minus productivity).
5800X3D FTW
Long live the AM4 Platform! You have been very good to us
In your comments about Starfield, you note that the dual CCD of the 7950X3D and the way the game schedules its threads all meant its lows were hit, but the 14900 ALSO had a very similar hit, and the single CCD 7800X3D had better lows than the big hitters from Intel's 14th Gen, indicating to me that they ALSO were hit with a scheduling problem, hitting those weaker E cores, but you weren't pointing out the issue of "mutli-tile scheduling" only "mutli CCD scheduling". Fix it by pointing out that tile scheduling can hit in some games too.
It's not intel or AMD problems. It's Bethesda's problem.
His comment makes sense because that was a comparison between 2 cpus with similar performance (7800x3d and 7950x3d).
While, yes, multitile surely can have an impact, there was nothing to compare to, since all the intel cpus of that level have e-cores.
Meaning: while it's useful to point out that the 7800x3d can be a better purchase because of that issue over other am5 products, it doesn't matter for intel, since all the other cpus are all of the same.
@@MattJDylan you make a good point.
@@MattJDylan That's my point, and why his doesn't, in that specific situation I pointed out and is solved with a change: it depends on how many threads the game engine will use. Starfield will use more than 8 threads, and are not well scheduled, so anything that HAS an option of which core to use has a scheduling problem. The 7800X3D doesn't HAVE a choice. All threads HAVE to cooperate on that single CCX. Intel's 14th Gen just upped the E core count to get multithreaded up to competitive with AMD, and so 8 of those threads will go to the faster P cores, but since it has a choice, it will put thread 9, 10, 11, etc, on an E core, meaning if the thread has to be waited for, the lows in the fps count will go down. Same for the two-chip vcache options: the other core isn't as fast, so it still has to wait until that thread ends, hurting the lows.
The 12th Gen Intel doesn't have as much of a problem: it doesn't have many E cores to play threads on, so it will cooperatively multithread on a P core much more often. 13th Gen ups the 13600 count of E cores, so more threads are run on the vacant E cores, hitting the lows a bit more. And 14th Gen ups the count of E cores even more, hurting the lows even more. And since the higher quality chips from intel, like from AMD, use more "not faster cores" to handle more concurrent threads, the lows are hit more at the higher end (e.g. 13900) than the lower end (e.g. 14200). The AMD 7800 with vcache is a low end configuration that has a LOT more cache, which means more cooperative multitasking, but since it has a larger cache, more hits on-die so those threads operate a lot faster.
Based on this video, what would be the best video card for full HD using the 5950x and 5800x3d ultra quality?
Conclusion : if you don't havean Rtx 4090 (anything else), any cpu that matches Ryzen 5700x3d/7600 or i5 12600k is perfect for gaming.
Pretty much yea, this is for the top 1%
I like that you test the 7900x3D. Most reviewers have written it off and I think it's bad with how the prices change.
that chip makes no sense though, no one should buy it
@@MaxIronsThird I disagree
As someone who uses a pc for more than gaming, the 7900x3d makes a lot of sense if you can grab one for $330 on sale.
@@qu54re65 but we're talking about gaming, the 7900X3D is a inbetween CPU, worse than 7950X for productivity and worse than the 7800X3D for gaming while also being more expensive.
@@MaxIronsThird It's 5-10% worse in gaming than the 7800x3D while being 50%+ faster in multicore tasks. At the same time, it's ~20% slower than the 7950x3D in multicore tasks, but the 7950x3D costs currently 50% more.
With the launch price it made no sense, with lowered prices it absolutely does make sense.
The problem is now I play only at 4K so, I'm sorry but this video is useless
I'm not ashamed to admit it: I bought an i9-14900K! It was $50 less than the i9-13900K. Why _wouldn't_ I buy it? I can handle things. I'm smart. Not like everybody says. Like dumb. I'm smart. And I want respect!
i5 14600k - 358$ vs ryzen 5 5600x - 156$(300$?????)
The best gaming cpu is the one you already have
Why dont you test CPUs at 720p or lower? if you test at 1080p because the popularity of that resolution you cant complain of people asking you to test at 1440p or 4k.
4090 can be slower at 720p. Not enough load to utilise all the cores.
The 12700k doesn't get the love it deserves. Intel has been selling it for stupidly cheap for months now and I've built at least a dozen systems with it. Sure, it's not impressive compared to a 7800X3D with a 4090 running at 1080p, but when you pair it with a more realistic 3080 or 7800XT at 1440 there's simply nothing from AMD that can compete with that $210 CPU. (Where's my AM5 R3 AMD?!)
Intel _knows_ they aren't in the lead anymore and their prices show it. X3D might be the king of gaming but the second you turn that game off core counts matter and I got a 14700k for a lot less than you would expect. Great prices for good CPUs are what made me love AMD, and Intel is killing it right now in the "bang for the buck" department.
There's no question if you're doing a "cost is irrelevant" build you go for a 7800X3D or better yet 7950X3D, but if you're building on a budget Intel is killing it if you want a lot of cores for your money.
The problem with the 12700k is that the 13600k beats it in most games and for less money.
@@LiesThatBind You can find a 12700K with AVX512 instructions beating a 7800X3D with AVX512 on PS3 emulation. There's nothing more CPU intensive than that. Heck, the i9-11900K which was the fastest CPU for PS3, with overclock loses against the 12700K *at stock* .
3% singlethread difference between 12700K and 13600K, the 12700K is easier to cool than a 13600K because it clocks at 4.7GHz (8 p-cores), the 13600K clocks at 5.1GHz, at that point just get a 7600X that is cheaper and clocks at 5.3GHz max.
6+8 Big-Little is just a weird number than won't age as well as 8+4.
Intel realizes that the 12700K was a mistake, so they forgot that 8+4 number. I only know one chip that is similar to that anomaly, it's the M2 Pro also 8+4.
14700K will pull ahead of the 7800X3D once the latter lacks the raw multithread and clocks muscle.
Right now the 12700K is pulling ahead of the 5800X3D.
FWIW, the 12600K is now $158, making it far and away the best cost per frame on here…also not included is the $109 12400F, which could easily have a spot as well. Just something to consider.
1080p is best for CPU benchmarks for getting maximum FPS, as CPUs are responsible for fps, the resolution and upscaling is taken care by GPU so no point in checking in 4k as games will become limited to GPUs capability and showing off the same FPS on all CPUs even if one is worse or better than others
that was the dumbest thing i have read this year.
This is exactlt why you SHOULD test at 4K, this isn't 2011 anymore most people watching will already have a Gamung Computer and look to these mega benchmarks to inform them unto which CPU to upgrade to next.
If he tested at 4k, it would most likely show that most people are fine with their CPU and it wiuld show then where to spend their money next. But I only know this because I game at 4K and 8K. Most people don't and will get the impression they need to upgrade the CPU to the 7800X3d or whatever to get the best performance when in reality the higher the resolution you go the less relevant the CPU becomes
@@CallMeRabbitzUSVI Most people are smart enough to get a CPU that supports the frame rate their GPU can produce in their favorite games at their monitor's resolution (which, according to the most recent Steam poll, is most likely 1920x1080 anyway).
@@CallMeRabbitzUSVI ?? it's a cpu benchmark test, not letting the gpu bottleneck the cpu and showing the difference is THE POINT of the whole benchmark, that's why it's done in 1080p with 4090
seriously how do people not get that
Based on this video, what would be the best video card for full HD using the 5950x and 5800x3d ultra quality?
Best cpu is a cpu that doesn't have a 50% failing rate
Was about to complain on not seeing the 5800x3d, but i decided to wait. My vega 56 is still mad btw.
Based on this video, what would be the best video card for full HD using the 5950x and 5800x3d ultra quality?
Ryzen 7500f from Aliexpress says hi
insert 5800x3d circlejerk comment
I mean it deserves the dang praise it gets. I wish it didn't heat so aggressively out of the box though, had to dial down some of the voltages to tame it. And the funny part is that I didn't even lose any performance, so it is needlessly aggressive out the box.
5700x3d benchmarks pls
5800X3D is no longer interesting, it is way too expensive for what it offers. 5700X3D is the CPU, which deserves to be praised.
Although even 5700X3D is outperformed by 7500f, the true value king.
@@stangamer1151 5700X3D is still better in MMOS and some high fps games.
@@wizarian Not if you tune RAM. Then 7500f is faster than 5700X3D in all games. DDR5 is a big improvement over DDR4. This helps 7500f a lot.
Major problem with Intel not mentioned here is big power consumption. In some cases they take twice the power of AMD.
And that most motherboard configurations are broken lol
bad list doesnt have 13600k but has predecessor like wtf
7800X3D will be remembered long time
And the 7800X3D is yet to reach peak status. 1.5-2 years later, by the time even Zen 6 is released, the 7800X3D will be much cheaper but remain more than enough for the vast majority of gamers. And with ample headroom for future upgrades.
@@valentinvas6454My thoughts exactly! The only thing about the 9000 series I'm interested in is how much they drop the prices of the 7000 series 🙂
@@Rorschach1488_tell me you are salty for buying an Intel cpu without telling me you are salty for buying an Intel cpu.
@@josephk6136 At stock, yes, 7800X3D is fantastic. Best CPU for a normal consumer. It is not the best when it comes to peak performance when you start to overclock. A 13/14900K overclocked with high speed DDR5 (tuned) will destroy the 7800X3D in most games and benchmarks. It just comes down to you wanting to spend the extra money and FAFO with setting and bios for a week or two.
Then aging like milk like the 5800X3D is doing right now surrounded by similarly priced i5s and i7s.
If the 9700X doesn't trade blows with the 7800X3D, the 7800X3D will likely be out of stock in a lot of places I suspect.
The best CPU is the CPU you bought yesterday to play today's game.
7500f 😂
@@RUclipsTookMyNickname.WhyNot we're almost the same.
5700X, 4070 Ti Super, System Shock (1994) and System Shock 2 (1999).
Before my 7800X3D I had the 7700x and 9900ks, which I Upgraded from the i7 980xe Ihad for 10 years at 4.3 ghz. The i7 980xe was able to play Metro exodus at extreme settings at 3440x1440p 60 fps. The game looks better than some titles today for a 2/2019 title ( with 1080ti ftw hybrid).
11900K, Fallout: New Vegas, 8 - 12% CPU utilization xD
For me it's the opposite (for old games). Finally with my 12700K i can get 100-110fps on Crysis 1 and playing Minecraft without being a stuttery mess when loading chunks.
Not testing 5500 is huge omission, its currently 50$ less than 5600x and default choice for budget oriented builds for last 12 months at least.
I strongly believe the results should be set in order following 1% low instead of average, since the global experience will be determined more for those deeps than the averages...
agreed, and it seems like the 7800x3D does really well with 1% lows.
Digital Foundry would disagree with you. What 1% shown here and 1% in your mind is different. What you're thinking about is frametime consistency. What if i spend 90% of my time in a not busy environment of the game and only 10% on the heaviest part? The 1% wouldn't even do justice.
Meanwhile frametime consistency really represents the feeling of judder or hitch even though you're at 60+ fps constantly, those 60 frames could be bunched up in the last half of 1000ms, instead of evenly 60fps/1000ms.
So in this case avg is better because we're measuring peak performance, not the consistency of the game itself
@@rzkrdn8650 then your not busy environment is always fine with pretty much any 6+ core CPU of last 3 years, but those heavy parts can become annoying
Best CPU's for a 4090*
Shame you did not include the 7500f
Similar to 7600X really
He did, its called the 7600X.
Same FPS
You know, I'm still running my roughly 10 year old i7-5820k at 3.3GHZ which along with my 12GB RX-6750XT can still can run any game I have (with the possible exception of StarField that still has unexpected graphics glitches and pauses) beautifully at 1080p. My system is also doing a ton of other things while I'm gaming too, like being the hub for my local network backups, home media server, home security camera hub, as well as serving up a commercial web site through a CloudFlare tunnel. (No, it's not on wi-fi - just a g-bit CAT-5 cable instead). Even with all that, most games are only using about 30% to 40% of the CPU resources - so the notion that I need to be buying a newer and much more expensive CPU every couple of years "for gaming" seems pretty absurd to me. Yeah, I usually try to upgrade my video card every 3 to 5 years or so - but the old 6 core, 12 thread, 22nm CPU just isn't a problem.
You are severely limited by that CPU, you can cope as much as you like, it's not going to change that fact.
You don’t seem to be aiming for high refresh gaming. I could limit my FPS to 30 and my CPU would last for ages. If you’re okay with that, it’s your call.
I want 100+ frames on all games. It can’t be stressed enough how much better of an experience it is.
I upgraded from Haswell i7-4700MQ from 2013 to an i7-12700K in 2022. I more than doubled the single thread perfomance gain, fps gains easily 7 times (floating point speed).
1080p benchmarks, well thank you very much for the useless info, got 7800xt and 7500f and they cost the same 1year later, I have never seen such thing in 30 years shopping PC parts, now comment on this.
I doubt I could explain basic economics to you if you can't understand why CPU benchmarking with CPU limited testing is anything but useless information.
He has a video on why testers use 1080p benchmarks. You should watch it.
"We're less than a month away from [this video being completely out of date and inaccurately titled, but we did all the testing and wanted to get two videos out of it instead of one so here]" - ftfy
Never miss an opportunity to learn something.
My R5 7600x is more than enough for me.
The 13600k remains my top pick for most people for Intel
13400f at walmart for $166 keeps them all honest.
I like the 12700K more because of the 8+4 configuration.
The 6+8 config of the 13600K makes it look like an overclocked laptop i7-12700H which is basically what it is.
4:05 No one, just over 50%...
My Brother in Christ W H E R E is the FSR 3.1 analysis. I need it.
2 days away still.
THIS