After our 14600K review, is there anything else you want to see us do with these CPUs? Any one-off tests or points of interest worth exploring? Let us know in the comments (new top-level comments are easier to find). Remember that any of those one-offs would take the place of other content, so if you're not interested in seeing more of these CPUs, let us know that as well (so we can prioritize our next content)! Watch our Intel Core i7-14700K CPU review: ruclips.net/video/8KKE-7BzB_M/видео.html Learn about CPU sample size in reviews (68 CPUs tested): ruclips.net/video/PUeZQ3pky-w/видео.html
Would They are Billions be a good game to CPU benchmark with? It seems like a heavily CPU bound game when you have a large number of zombies. My i9-9900k lags like hell in some stages.
Yes, please. As winter is coming, test the 14900K under full load, how it performs as an auxiliary heating. Is it worth to move it as a production PC into the bedroom?
Intel could've just called these the 13950k, 13750k or some shit like that and it wouldn't have been much of a problem... Mid generational upgrades is never a bad thing, unless they raise the prices more than the performance increase.
@@freaky425 the fuss is about them marketing it as a new 14th generation which will lead some not-so-tech-savvy customers to assume that it is noticeably better than 13th gen, when it is in fact just 13th gen. This is intentionally misleading and anti-consumer behavior.
@@freaky425 The problem is the implied improvement that comes with a "bigger number better". Most consumers will look at the 14 series at, say, $500 and the 13 series at a discounted price of $450 and believe that they're getting a whole generational improvement in performance for just $50, or 11%, more. They think they're getting a good deal but they're overpaying for just a few extra percent in performance. It's scummy, and intentionally dishonest by Intel.
It's incredible that this guy was called Steve Fromgamersnexus and he ended up working at Gamers Nexus to review one of the cpus ever made, life just finds a way
I never get tired of the "thanks Steve" inserts. So uh, thanks Steve and GN crew for making yet another entertaining video even though the product was about as boring as it can get.
There's a movie/TV critic on youtube called Diasparu who's been using the steve "That's gaslighting" clips, and it's so damn good. I laugh my ass off every time.
Can I just call out how much I appreciate the editing team adding the little blue bars on the sides of the screen slowly moving down to indicate how long and when a section/topic will be and end? Its the little things in life you know?
@@Pressbutan This reply was 4.2% better than the average reply to a comment, which for a comment on an intel review video is a mindblowing increment over the others.
its still bought and am4 is still viable its not like the 5900 or 5950x are bad cpus either its just that its a dead end however if you dont upgrade that often thats not even an issue i guess the 5800x3d or the r9s of zen 3 still have plenty! of juice left
Still rocking my 5800X3D and don't plan on replacing it any time soon. This thing is just silly, nothing I play comes *close* to maxing it out (and that's including games that are CPU heavy like Star Citizen).
@@Valdore1000well am4 is a dead end its either r7 x3D or r9 both awesome Chips though I run a 5800X3D and I dont plan on upgrading that anytime soon either
250W is really as high as any desktop cpu should go at stock. It is not a good trend that they are pushing the power beyond diminishing return just to get the few percentage on perf that is only useful in marketing. Optimizing the design is the way to move forward as we are getting closer and closer to the upper limit that silicon semiconductor processes are going to eventually hit.
I'm not a fan of the power comparison to older i9s, which were obviously tested with power limits applied at the stock TDP of 95 and 125 W respectively. My 9700KF was drawing 140 W in an all-core workload at 4.5 GHz (official max turbo is 4.6), and it didn't have HyperThreading. That means that the 9900K at max official turbo of 4.7 GHz would draw around 200 W. At 95 W the frequency was probably around 4 GHz or less. These CPUs have two TDPs, 125 W and 253 W. That means we should never see more than 253 W at stock settings. If power limits are disabled by default, that's a choice made by either Intel or the motherboard manufacturers, but it doesn't showcase a realistic comparison to older CPUs, where power limits were observed.
@@Iwanttobefree42 please get your head out of the hyperbolic gutter Even at unoptimized stock on my 360mm R23 was just about 300w. After a basic offset that took like 15mins to guesstimate it was down to 265w @ 5.6 (KS). In real activity usage... it uses like max 150w for my games and pretty much always below 60c.
this is the most sad sack update I've ever seen. I've literally seen average cpu bins of the exact same product improve more over 6 months, than the improvements to efficiency and performance a 13900k to 14900k gives. 😕
@@ashryver3605 150W for real usage is still a lot. All my systems still run CPUs that don't even go above 100W max. And I still need massive 3 fan coolers to keep them reasonably quiet.
@@bliss_gore5194I agree maybe if this was a GPU comparison but CPUs? Nah. At this rate AMD is gonna maintain the performance lead for at least another generation
The amount of sarcasm dripping from this review was nearly enough to drown me as I listened to it driving into work this morning. I can't thank the GN crew enough for the actual out loud laughs you have given me, just in the last two reviews. I didn't expect the 14th "Gen" to set so many records (of mediocrity) with so many watts.
Why are people comparing this to 11th gen like seriously do you have memory of a goldfish? 11th gen was actual DOWNGRADE. 8 core i9 vs 10 previous gen. 14th gen has at least a clock improvement and even few more e cores for i7. It's not even a comparison. It's advertised as refresh and it's exactly that. This is like shuning AMD when they released 2000 ryzen series which was literally the same thing.
Those tests they do with the virtual loads has no bearing to the real world. Been running a 13900KS for almost a year now and the thing has never gone above 180W. But yes, that little extra heat is welcome during the winter months :D
@@SixDasher well then why did they add so many e-cores and high clocked P cores, when we are not supposed to use them in "real world", many people buying 13900k unlike you are actually pushing it harder than this video e.g i run multiple compiles and renders on background while doing gaming or using editor, and most of the it stays > 180 W, its your problem you are using a 9 series for watching youtube or facebook or doing only gaming, this ain't 2015, people don't close all applications before starting a game..
I was hesitating to either slot a 5800x3d into an existing machine, or go for 7800x3d with a new build. Thanks to these videos I finally desided to save a little money by switching to 5800x3d.
I did that, but my old CPU found a new home in a data analysis build, so I ended up with new RAM & mobo anyway. I would have been better to go with AM5
@@KoItai1 Yes because games won't even use 1/3rd of all available cores... buying 32 thread CPU just to game on is equivalent to getting 4090 to play Quake 3 and RimWorld...
can you describe what workload you will be using that will cause these CPUs to draw 300w? do you run benchmarks in your home office all day long? since you probably don't do any real work, this CPU idling at 20-25w less than its AMD counterparts (which gamersnexus fails to mention, for some reason) would probably lower your power bill. you seem like a very smart person though so maybe i'm wrong and you get paid to stress test a CPU all day long?
@@ibtarnine i7-12800H: literally starts using 100W easily just from doing mundane tasks in Teams, Outlook. If you do anything more serious than that, yeah, it's going to consume a lot of power.
You can just feel the disappointment through the screen. In fairness I'm sharing it with him - I expect better, given that these are crucial moments for the industry. They need to be on their best efforts and clearly Intel is off in left field, dancing around on the silver highway playing with their cell phones.
@@Pressbutan its only intel and nvidia doing the bullshit. AMD is best in everything. I hope AMD will overgrow intel soon because they definitely deserve it. Intel deserves to go bankrupt and break into a bunch of smaller companies whixh will then be way more innovative than gigacorp intel with their 10 years obsolete architecture
You would have expected by now that the "Thanks Steve" and "You can literally see it" clips would have gotten older by now. But no, I still crack up every time :D
Thank god I did my research and watch this content before buying. Literally had the 14900 in the shopping cart, you boys saved me a decent chunk of money.
@@SixDasheri mean if you use it for productivity it’s even more insane to buy Intel crap. AMD Epyc cpus pretty much destroy anything that Intel has ever made and i’m saying it as a former Intel diehard. I’m interested to see if they can finally make a proper next-gen cpu with 15th series.
@@SixDasher I used to think that but now realizing after 12 months of 13600K ownership all I do is play games and occasional video clip encodes for memes etc which already had fast enough encode times on my previous i7 8700K.
The Haswell refresh, Devil's Canyon (4790K) was a massive improvement over its predecessor (4770K), despite only minor changes. But this time they're presenting something that's even less of an improvement as a whole new generation of CPUs. It's like they reckon most people won't see through the marketing, and they're probably right.
@@LucasHolt YET Kaby lake could do hardware accelerated x/h.264 decoding and skylake could not. That's a big upgrade if you are trying to build a nuc sized system that runs at 15 watts. IPC of the cores also were the same with better power efficiency. Go look at the AnandTech write up. Kaby lake yields were higher "The combination of the two allows for more voltage range and higher frequencies, although it may come at the expense of die size. We are told that transistor density has not changed, but unless there was a lot of spare unused silicon in the Skylake die design for the wider pitch to spread, it seems questionable. It also depends which part of the metal stack is being adjusted as well."
Appreciate Gamers Nexus style of testing and benchmarking CPU's. I feel like other tech channels are just rushing videos out, using handpicked games and 1080p resolutions... You guys are showing the community better material right now. Just my opinion
I really considered upgrading from r5 3600 to a 12th gen or am5 upgrade, but I'm happy I went for a 5800x3D. Maybe I'll consider the 15th gen / the next AMD socket.
@@sethperry6616 AMD has promised at least Zen 5 but were purposefully vague about Zen 6. Given the leaks we've seen surrounding Zen 6's departure from the IO Die + CCDs layout that Zen 2,3,4 and 5 feature, it makes sense that AMD wouldn't set it in stone in case they simply can't retain pin compatibility. But if they can, then expect Zen 6 and Zen 7 to be AM5, as they won't switch until DDR6 and PCIe6 unless they absolutely have to.
I say, with the power draw and total lack of generational improvement, Intel may have finally met their own Bulldozer Moment on Desktop (since they already had one in servers). Of course, as we know Intel is too big to fail and will quickly bounce back from this mild embarrassment.
If possible, I'd love to see Intel i7/i9 with PL2 @ 140-ish Watts vs AMD Ryzen 9 with ""105 Watt"" eco mode. (Preferably with a focus on productivity.) Thanks for all you guys do!
Right now userbunchmork has 14900k at #1 The highest Ryzen CPU, according to them, is the 7950X3D at #10 The 7800X3D is at #17. Truly one of the websites of all time
The comedic delivery timing is the only thing tighter than the difference between these "generations". Absolutely... informative and hilarious presentation. 😂
Yeah haha. I couldn't care less about Intel anymore. I've been team red for the last 5 years. Intel Innovation has stagnated SO hard. They just keep uping power consumption because that's the only way they've figured out of producing more performance for the last 8 years.
is there anyway to get this sucker under 600 dollars, like perhaps get it for $50 bucks, or perhaps someone can pay me to extract my 12900K to install the 14900K on my Z690 mobo?
I disagree. I suspect it will sell. It will be headline sales pitch in the latest Dell or HP workstation (and eventually laptops). Intel just made a load of money without having to invest a dime in R&D, that's genius, dirty... but genius. Yeah they will have lost a few enthusiasts who understand how cynical it is, but there are 100 morons for every one of them, they all will be clicking their fingers to get these 'next gen' CPUs. Being savvy in a mass market dominated by morons is a dismal experience.
While it's also stomping on AMD's current 7000 series. From my knowledge, if a CPU from a previous generation is stomping a mudhole in your current gen lineup, you know you're in trouble.
What? If you're thinking your little 5800X3D is stomping on the 14900K, you're on crack. I don't even think you can get half the score in CB 2024 But let me know. You have 2258 points to beat in my case. I'd be surprised if you get more than 930.
I fucking love Steve's absolute disdain with Intel. It's entirely understandable, though, since their effort for improvement between 13 vs 14 is similar to that of a barely passing undergrad. Back to you, Steve.
It shows how far behind the curve Intel is. They dragged their feet far too long with Core architecture. 2008 designs in 2024 do not work. x86s and a whole new architecture are needed to compete, putting more lipstick on the pig and hitting it with more watts isn't going to solve this issue.
@@PressbutanYeah,all the current stuff they have is still a variant on Skylake. They need a Bulldozer -> Zen moment for their cpus. The only way they are even "competing" right now is with stuffing E cores and pushing their P cores to the max heat they can handle and even with all those tricks they only come close to Zen 4 or even Zen 3 X3d(5800x3d). Most of the time their 8P16E cpus at 32Threads still lose out to the 16C/32T 7950x and the power usage is a joke. Almost the same power draw as my 4060. All AMD needs to do is make a 16 Zen5 core and 16 Zen 5C core CPU at 64 threads and Intel will have nothing at all to compete with that. Even a 16 Zen5 and 8 Zen5C CPU would annihilate them, all the while taking less power than a 14900k. Now imagine that CPU but with x3d. Even less power draw and way better gaming performance.
@@rudrasingh6354all of their architecture is still based on scaling of the original Pentium 2 core. Their most recent architecture was “Netburst” for the Pentium 4, which was similar to bulldozer in that the frequency was high but IPC was low. The Pentium 4 and Pentium D used Netburst before they went back to the drawing board and revised the Pentium 2 architecture (which had been revised once before for Pentium 3) for the “core 2” CPU’s, and every generation since has just featured further revisions. This is notable a “refresh” in that the revisions made haven’t functionally effected the architecture at all.
few problems, intel is known for memory controllers amd just sucks at that max u can overclock amd is 6000 while intel can run at 8000 on 14 gen with good motherboards but idk intel is starting to suck
The Stellaris chart could use more game settings data. For example, map size (how many star systems), if L-Gates are enabled, in game year (important as number of pops affects performance), number of AI empires and whether or not Xeno-Compatibilty is enabled as that one setting can basically set fire to your CPU in the late game.
For stellaris, I'd actually recommend setting it to the highest speed on a late game save, then seeing how long it takes to the to, for example, get through 50 years (shorter would be better). On the fastest speed paradox games run as fast as the system can handle and I think this type of benchmark would show much better scaling.
12th and 13th gen was pretty hard to keep track of because you had to actually pay attention to tell what's good and what isn't. Now with 14th gen, that's not a problem. Thanks Intel.
This was a welcome reminder that the 11900K had two cores less then its predecessor. We got a "side"-grade instead of an upgrade this time, but at least we didn't get a downgrade. That's something, isn't it?
Once you look into the real world latency between 10th gen, 11th gen, and compare it to 12th and 13th gen, you definitely did get a side-grade. Overall system latency is higher with 12th/13th gen. So really depends on how you look at it.
@@germanmade1219 everything has gone downhill except benchmark performance since Intel introduce their e-core shite that should have never come to desktop
I'm astounded by how well AM4 holds up considering the DRAM bandwidth difference. Could you please start including DRAM layout in testing config? E.g. 2x 8GB single-rank or whatever. This can make a surprising amount of difference. Maybe even worth a video at some point. Find a set of 4 benchmarks that are GPU-limited/CPU-limited/dram latency limited/dram bandwidth limited, and run on different DRAM layouts and CAS-versus-speed tunings, to illustrate the tradeoffs.
@@atretador So does the 14th "gen". Intel just only manged to get it ready for two specific games... that Gamers Nexus don't feature in their test suite. I can't recall it's name (Steve mentions in the 14700k review if you're actually interested) but it's the very definition of a locked software gimmick, as it will not be released for 13th gen despite it being the exact same silicon.
That's a really ignorant comparison. The 4060 is disliked for poor performance. While to marketing of the 14th gen is B.S a 13900K/14900K is for all intended purposes a very powerful CPU.
I feel like getting a 7950x makes the most sense for people right now who need lots of cores. You get a brand new platform with upgrades down the line. AMD also did right by AM4 with the 5800x3d release. Pumped new life into a dated board.
That's what I was thinking. I'd like to build a workstation that I can game on as well, and AMD seems the clear way to go at this moment. I just remain hesitant regarding which CPU so these charts are helpful.
@@DissertatingMedieval a 7800x3d is really good, would future proof for gaming for a long time. In terms of productivity, it would be fine. I use a 5800x3d and my machine flies.
Next year I have to upgrade from my I7 8700k , it was one of the last great CPUs intel put out but it's showing its age. I cant take any of the 12-14 "gen" Intel BS because of the idiotic P/E cores. None of the vitrulizers I use can distinguish between the P and E cores. FFS Intel , E cores are only good for mobile platforms, saving few W on a 300 W CPU is not much. Just make a standard CPU for desktops... A couple of weeks ago me a friend tried to workaround 13900k in Red Hat KVM and VMware ESXi. ESXi just could not work stable because it cant understand the core difference. KVM , we only got around by leaving the E cores for the host OS and giving the P for the guest. What is the point in dumping that amount of cache and not be able to use most of it.... This is why I will have to go AMD but AMD-V was never on the same level as VTX/d , that's something Intel nailed log time ago because of their Enterprise CPUs..... I was given an advice to just buy Gen 11 Intel and stick with it. They are on very good prices now as retailers are trying to offload the remaining stock. The mobs for them are not on insane prices as well.
10:44 "that's when you know it's a good generational uplift; in fact, it's so exciting that we're not even gonna talk about any other production test" I'm crying
@@dangerous8333 While that is true, these Intel CPU are still largely less efficient than an AMD CPU even at productivity tasks. A 30% increase in your power bill for a 2~5% improvement in performances just isn't worth it. Hence why several datacentres in the world (such as Cloduflare's) have moved to EPYC.
Steve and the GamersNexus team together, made this launch something entertaining and worth to watch. Unfortunately the only downside to this is that I want more videos like this, but I hope not.
So put a power limit or undervolt? My 13700k Runs at 5.7Ghz and barely uses 60-120W depending on the game. 150W power limit and it drops to 5.3Ghz but that's only when I am rendering. I can save more power with the eCores disabled or parked. The GPU is still my biggest concern at 400W.
It's crazy how much power some of these newer top end CPUs use. My i7-8700k doesn't even get out of bed most days for gaming, and I was worried it'd run hot and had it delidded back when I got it.
That's exactly what I've been using since it was released. R5 5600 and a Sapphire Pulse 5600XT. Absolutely fantastic combo paired with a nice B550M mobo and some 3600mhz ram. I don't see any need to upgrade for the next few years. I play Starfield at 1440P medium settings no problem.
Can you imagine? A cpu model numbre like intel 14P8E10 and be able to identify specs of the product only with the name... my god... life could be so much simple
I haven't ran the numbers but it seems like even if you're putting together a brand new PC, if you're on a limited budget you might _still_ want to buy a 5800X3D and put all the savings you've made on AM4 platform components toward a bigger GPU.
I consider myself pretty faithful to Intel and if I had a friend building a PC asking for advice I'd steer them towards the same. If not 5800X3D, then absolutely a 7800. Either would be fine. In fact both seem to be fantastic value compared to team blue..
I wouldn't. The only reason I got the 5800x3d is bc I was on the AM4 platform already. Not to mention, PBO2 optimization also gives me nearly another 20% in performance with the 5800x3d. But with a new system, I'd totally save for AM5 for upgrade ability later down the line.
@@Purified1k yeah but looking at am5 now, they'll probably develop further so ddr5 and all can be better used, it's better to wait. Most b650 mobos are 6400mhz and stability issues are going on if you touch too much, ppl with this current boards and rams are probably gonna want to buy new ram or even new mobo as well for 8000s or 9000s ryzens, makes no sense to go am5 now imo unless you running a really old chip... I would wait myself until 2k25 and see how new gens come up
I upgraded from the 3900x to the 5800x3D back in Jan. Its hard to sell me on anything new for a while with the value I got on that chip during the holidays
Man, first the 4060 ti, then the 7800XT, now the entire 14th series. We're truly in the era of generational stagnation. At least they had the decency to price the 7800XT well and the 14th gen the... same.
yea, they said after the 13th release, that the 14th gen will not be a generation upgrade, and just a 13th gen refresh, that s why it has same slot and everything almost the same, ppl just need to read, but it s normal for everyone on youtube to make clickbaits and stuff for views, nobody said that 14th gen will be a leap, but views are views
@@KoItai1 Then why name it as if its another generation, could've named in a 13950k atleast, it's not like all people knows it's a refresh, but we can surely do say that a generation naming scheme from 13900k to 14900k does mislead people. intel is doing this for marketing obviously to decieve people
You live in an era where every industry is controlled ruthlessly by hedge funds, who demand not quality products but the maximum in shareholder return on investment. EA, Intel, Dreamworks Studios. Pick a company and they're guilty of being on strings, controlled by companies like Vanguard, Blackrock, et al
Was sad to see total war not included in the benchmark tests as it is one of my favorite franchises, but good to know this is literally just a refresh rather than a new generation!
Comments like these make me glad I spent the money during COVID lockdowns on a GN Wireframe mousemat. I'm glad I supported a channel that people enjoy so much.
I don't know if this is feasible (I suspect it would require a _lot_ of test time, even though you could tolerate substantially more noisy individual test points), but a benchmark _sweep_ of something like Factorio across different working set sizes would be interesting. One thing I've noted is that performance of Factorio, and DF, and a lot of other simulation-esque games in general, tends to tank once your working set no longer fits in cache, and it's definitely noticeable that the 5800X3D can handle a lot more than most other cpus before that happens - but most benchmarks don't really show the full story, instead just showing one or two datapoints at max.
Meh, when you are watching youtube and surfing internet cpu is basically on idle and consumes like 20w. If you play games like 5h per day you pay like 10c per day more if your electricity is 20c per 1kwh. So 35€ in year if you are a f**king nerd and play 5h every single day
@@DuBstep11510c? 20c? lmao. Where do you live, in russia? electricity in europe is more than 35c per kwh. In uk 2x5 so like 50/55c per kwh. Now redo the math, kid.
@@DuBstep115 During peak hours my electricity goes up to 60c/kwh. but Its usually 34c. But yeah sure if you just pretend the numbers are smaller the cost argument goes away
@@Salvo78106 I was being generous and shooting it high. I pay 8c for 1 kwh. I live in Finland. So its even lower, 100w vs 200w cpu difference is 15€ for me if you play 5h EVERY SINGLE DAY
A 30% increase in watts over my FX 9590. An engineering feat never before imagined. The bold future of more watts for more dollars, giving only as much performance as you need.
@@stephenallen4635 Point? The 9590 was hot garbage, sucking down power (in times where most boards could NOT handle it - there were numerous cases of fried mobos back then) and performing only marginally better than a 8350/8370 and miles behind 4770k that came out at the same time. The 14900k is also hot, but at least it performs.
@@HenrySomeone ok so you do get what he was saying is exactly that applied to the 14900k. Its barely faster than its predecessor, comsumed almost 25% more power to a ridiculous degree and should be laughed at lole the 9590
I’ve been using the i7 980 since 2008 the next five years from intel only had incremental performance. I have just bern updating the GPUs. Finally got the Ryzen 9 3950 in 2020, been great.
Can you test the ratchet and clank portal sequences and starfield with different combos of cpu and ssds. Apparently starfield is not just cpu bound but also ssd bound, same with ratchet and clank portals.
We're using a high-end NVMe SSD. There is no SSD bind in our CPU testing. As for doing a standalone on it, no plans right now but maybe as Direct IO becomes more prevalent!
Here's a question. At hat wattage does your CPU sit when playing say Cyberpunk? And tens of watts? Funny. I'm looking at my 14900K in HWM right now and it draws less than 10.
Hilariously, on the biggest Swiss electronics retailer’s website, the i9-14900K launches $20 cheaper than the current price of the 13900K…..it’s as if they knew that it would be what it takes to convince people to buy the newer one 😅 Edit: same story with the 14700K vs 13700K btw
That's strange. I wonder if Intel is offering any subsidies in order to shift these? It would be silly to not reduce the price on the older hardware first, but this is Intel we're discussing.
@@Pressbutan I don’t know, and interestingly, all four except for the 13700K are labeled “on sale” - but the discount on the 14900K is 13% while it is 7% on the 13900K - with both listed base price being virtually equivalent (the 13900K is listed as $5 cheaper)
That is interesting. Could be a retailer thing to manage their own inventory, I have no clue on either side but I am aware both retailer and vendor will mess with pricing like that to psychologically manipulate consumers into making poor decisions.@@polymniaskate
With every video comes an increasingly powerful and historical thumbnail. Can’t say they’re ineffective either, because it sure as hell got me to click on the video.
After our 14600K review, is there anything else you want to see us do with these CPUs? Any one-off tests or points of interest worth exploring? Let us know in the comments (new top-level comments are easier to find). Remember that any of those one-offs would take the place of other content, so if you're not interested in seeing more of these CPUs, let us know that as well (so we can prioritize our next content)!
Watch our Intel Core i7-14700K CPU review: ruclips.net/video/8KKE-7BzB_M/видео.html
Learn about CPU sample size in reviews (68 CPUs tested): ruclips.net/video/PUeZQ3pky-w/видео.html
“Will it blend? That is the real question.”
Would They are Billions be a good game to CPU benchmark with? It seems like a heavily CPU bound game when you have a large number of zombies. My i9-9900k lags like hell in some stages.
As Steve from HU said it's a waste of time
Personally I do not understand why you don't test the games with RT, as RT has a huge CPU impact.
Yes, please. As winter is coming, test the 14900K under full load, how it performs as an auxiliary heating. Is it worth to move it as a production PC into the bedroom?
Intel could've just called these the 13950k, 13750k or some shit like that and it wouldn't have been much of a problem... Mid generational upgrades is never a bad thing, unless they raise the prices more than the performance increase.
People will always complain no matter what, they didnt raise price, so I don't see what the fuss is about.
@@freaky425 the fuss is about them marketing it as a new 14th generation which will lead some not-so-tech-savvy customers to assume that it is noticeably better than 13th gen, when it is in fact just 13th gen. This is intentionally misleading and anti-consumer behavior.
@@freaky425 The problem is the implied improvement that comes with a "bigger number better". Most consumers will look at the 14 series at, say, $500 and the 13 series at a discounted price of $450 and believe that they're getting a whole generational improvement in performance for just $50, or 11%, more. They think they're getting a good deal but they're overpaying for just a few extra percent in performance. It's scummy, and intentionally dishonest by Intel.
Steve literally said this in the first video
@@AndrewB23 Great minds think alike.
5800X3D value proposition looks better every time Intel launches a new CPU, impressive.
Turns out incrementing the number on their existing CPUs was free marketing for AMD
Yeah and prices are slowly rising again because of it (at least where i'm from).
Not if you're buying it now though...
@@HenrySomeone5950
The x3d is good value as an upgrade not a fresh build, the chip is literally hyped to the gills, you can buy a 7600x for 100 less that beats it.....
On the bright side, it's a good demonstration of how consistent your testing is between different products/over multiple runs of the same test.
she don't XD
It's incredible that this guy was called Steve Fromgamersnexus and he ended up working at Gamers Nexus to review one of the cpus ever made, life just finds a way
Absolute comment right here.
A Steve moment if one may say
Truly one of the moments of all time
Thanks, Steve.
Love the sarcasm
I never get tired of the "thanks Steve" inserts. So uh, thanks Steve and GN crew for making yet another entertaining video even though the product was about as boring as it can get.
Back to you Sheve
There's a movie/TV critic on youtube called Diasparu who's been using the steve "That's gaslighting" clips, and it's so damn good. I laugh my ass off every time.
You can literally see it
@@Pressbutan I see you have also been watching that. Though Disparu does also tend to overuse clips at not necessarily the best timing.
Can I just call out how much I appreciate the editing team adding the little blue bars on the sides of the screen slowly moving down to indicate how long and when a section/topic will be and end? Its the little things in life you know?
Ooo it's the shroud fan
yessss, it's such a lovely little touch of detail, and they've gone and done more things like this in their newer videos as well.
Steve's review of the Core i9-14900K was 212.4% better than his review of the Core i9-13900K. Outstanding win for intel!
This comment was 9% better than the average reply, which for an Intel review video, is an amazing gain.
@@Pressbutan This reply was 4.2% better than the average reply to a comment, which for a comment on an intel review video is a mindblowing increment over the others.
I think this means we can call him Steve 2.0, increment the version number. Get marketing on the phone.
Thanks Steve!
the Steve™2000™ series™™
Only Intel can launch a new "Generation" of CPUs and, in so doing, make the 5800X3D look good again... Almost 2 years after it's launch!
its still bought and am4 is still viable its not like the 5900 or 5950x are bad cpus either its just that its a dead end however if you dont upgrade that often thats not even an issue i guess the 5800x3d or the r9s of zen 3 still have plenty! of juice left
Still rocking my 5800X3D and don't plan on replacing it any time soon. This thing is just silly, nothing I play comes *close* to maxing it out (and that's including games that are CPU heavy like Star Citizen).
@@Warsheep2k6 5950x dead end? yeah, maybe I will keep it until I die, correct hahaha
@@Valdore1000well am4 is a dead end its either r7 x3D or r9 both awesome Chips though I run a 5800X3D and I dont plan on upgrading that anytime soon either
@@Warsheep2k6 I upgrade CPU like in once in 8 - 10 years so it's pretty much alive
250W is really as high as any desktop cpu should go at stock. It is not a good trend that they are pushing the power beyond diminishing return just to get the few percentage on perf that is only useful in marketing. Optimizing the design is the way to move forward as we are getting closer and closer to the upper limit that silicon semiconductor processes are going to eventually hit.
I'm not a fan of the power comparison to older i9s, which were obviously tested with power limits applied at the stock TDP of 95 and 125 W respectively.
My 9700KF was drawing 140 W in an all-core workload at 4.5 GHz (official max turbo is 4.6), and it didn't have HyperThreading. That means that the 9900K at max official turbo of 4.7 GHz would draw around 200 W. At 95 W the frequency was probably around 4 GHz or less.
These CPUs have two TDPs, 125 W and 253 W. That means we should never see more than 253 W at stock settings. If power limits are disabled by default, that's a choice made by either Intel or the motherboard manufacturers, but it doesn't showcase a realistic comparison to older CPUs, where power limits were observed.
You can always set a power limit if you want
@@THU31 AMD got me with the r7-7700. 65W 8 core TDP is pretty insane. Putting my 6600k to rest.
@@norge696 yeah, for me it was r7 5700, even with SMT OFF by my choice its still better performance that what intel has to offer.
@@divinehatred6021 why have it off?
It’s hard to believe a cpu runs to 300w.
Power supply manufacturer right now are salivating!
Until you buy a 13900K and realizes that your 420mm AIO is just the bare minimum... this is unacceptable from a consumer viewpoint
@@Iwanttobefree42 please get your head out of the hyperbolic gutter
Even at unoptimized stock on my 360mm R23 was just about 300w. After a basic offset that took like 15mins to guesstimate it was down to 265w @ 5.6 (KS).
In real activity usage... it uses like max 150w for my games and pretty much always below 60c.
this is the most sad sack update I've ever seen. I've literally seen average cpu bins of the exact same product improve more over 6 months, than the improvements to efficiency and performance a 13900k to 14900k gives. 😕
@@ashryver3605 150W for real usage is still a lot. All my systems still run CPUs that don't even go above 100W max. And I still need massive 3 fan coolers to keep them reasonably quiet.
AMD's Ryzen 8000 series CPUs will have insane prices with this kind of competition from Intel
Nobody will buy any expensive product from AMDip.
@@kingyogesh441 hahahaahhahaaahhaah.... wait lemme breathe in to laugh even harder. HAHHAHAAHHAHAAHA
@@kingyogesh441bro with radeon i kinda get it with the drivers but ryzen naaah
@@bliss_gore5194I agree maybe if this was a GPU comparison but CPUs? Nah. At this rate AMD is gonna maintain the performance lead for at least another generation
@@bliss_gore5194graphic drivers are pretty good right now.
The amount of sarcasm dripping from this review was nearly enough to drown me as I listened to it driving into work this morning. I can't thank the GN crew enough for the actual out loud laughs you have given me, just in the last two reviews.
I didn't expect the 14th "Gen" to set so many records (of mediocrity) with so many watts.
14th gen. We definitely learned from our failures with 11th gen. We swear.
at least 11th gen was an attempt
Dude 11th gen is a killer deal at this point. Boards, CPUs and memory are dirt cheap. Price to performance is hard to beat.
At least 11th gen was a new uArch. And it brought AVX512.
14th gen is worse
Why are people comparing this to 11th gen like seriously do you have memory of a goldfish?
11th gen was actual DOWNGRADE. 8 core i9 vs 10 previous gen.
14th gen has at least a clock improvement and even few more e cores for i7.
It's not even a comparison. It's advertised as refresh and it's exactly that.
This is like shuning AMD when they released 2000 ryzen series which was literally the same thing.
So nice of intel to consider those freezing to death this winter, how would they have survived if not for the new 14-series?
Those tests they do with the virtual loads has no bearing to the real world. Been running a 13900KS for almost a year now and the thing has never gone above 180W. But yes, that little extra heat is welcome during the winter months :D
@@SixDasher well then why did they add so many e-cores and high clocked P cores, when we are not supposed to use them in "real world", many people buying 13900k unlike you are actually pushing it harder than this video e.g i run multiple compiles and renders on background while doing gaming or using editor, and most of the it stays > 180 W, its your problem you are using a 9 series for watching youtube or facebook or doing only gaming, this ain't 2015, people don't close all applications before starting a game..
@@SixDasherSo you bought a 24 core cpu to not use it? Could have just bought an i7 instead
@@ThunderingRoar he should of waited for the 14700k
I was hesitating to either slot a 5800x3d into an existing machine, or go for 7800x3d with a new build. Thanks to these videos I finally desided to save a little money by switching to 5800x3d.
I did that, but my old CPU found a new home in a data analysis build, so I ended up with new RAM & mobo anyway.
I would have been better to go with AM5
5800x3d and 5950x will keep socket am4 relevant for a long time to come
I've always wanted a furnace in my home office. Preferably one that triples my computer's utility bill. This is perfect!
u know that power consumption is not in gaming, right ?
@@KoItai1 Yes because games won't even use 1/3rd of all available cores... buying 32 thread CPU just to game on is equivalent to getting 4090 to play Quake 3 and RimWorld...
I have 12700H, 3060 laptop and ohh man it is basicly a hair dryer! When i feel cold i just launch a game..
can you describe what workload you will be using that will cause these CPUs to draw 300w? do you run benchmarks in your home office all day long? since you probably don't do any real work, this CPU idling at 20-25w less than its AMD counterparts (which gamersnexus fails to mention, for some reason) would probably lower your power bill. you seem like a very smart person though so maybe i'm wrong and you get paid to stress test a CPU all day long?
@@ibtarnine i7-12800H: literally starts using 100W easily just from doing mundane tasks in Teams, Outlook.
If you do anything more serious than that, yeah, it's going to consume a lot of power.
If you compile the intros and conclusions of this year's hardware launches from nvidia, intel and amd, you can see Steve's slow descend into insanity.
You can just feel the disappointment through the screen. In fairness I'm sharing it with him - I expect better, given that these are crucial moments for the industry. They need to be on their best efforts and clearly Intel is off in left field, dancing around on the silver highway playing with their cell phones.
@@Pressbutan its only intel and nvidia doing the bullshit. AMD is best in everything. I hope AMD will overgrow intel soon because they definitely deserve it. Intel deserves to go bankrupt and break into a bunch of smaller companies whixh will then be way more innovative than gigacorp intel with their 10 years obsolete architecture
Intel's GPU's at least seem to be an interesting thing to see not have any hardware changes and see performance increase as much as it has.
"Slow"? This industry's driving him mad at warp speed.
Though the Intel releases have made the dialogue surrounding AMD a lot better. It was quite negative at launch.
i love you guys for your relentless, blunt honesty, enabled by independence from the companies whose products you review.
You would have expected by now that the "Thanks Steve" and "You can literally see it" clips would have gotten older by now. But no, I still crack up every time :D
Same
Truly one of the CPUs of all time
I loved when Pat Gelsinger come to stage and said inteling time
Say what you will but it's a central unit and it processes.
I too deem this as a cpu in time.
@@Dr.RichardBanksmaybe it is outside of time. Coming back from bulldozer era to show what failure really looks like 😂
@christophermullins7163 it's definitely in this timeline. Possibly outside of it as well.
Thank god I did my research and watch this content before buying. Literally had the 14900 in the shopping cart, you boys saved me a decent chunk of money.
So what exactly do you use now?
@@kabishnando4286He bought a 14700k so in the end he didn't save any money
Thank you for Re-Reviewing the 13900K. im glad someone finally did it.
Keep those "weird off the rails" intros. Loved them. It summarized the review and also gave a personal signature to the video.
This review has once again shown that my plan of just upgrading my AMD 3700X to a 5800X3D on my existing system was money well spent.
7800x3d continues to look better & better, especially after the most recent price drops
Indeed... I would like to thank Intel for persuading me to buy an AM5 platform and make the jump to DDR5.
If you just use the pc for gaming maybe.
@@SixDasher agreed, I dont know why so many people recommend the 7800x3d when its only optimal for 95% of users.
@@SixDasheri mean if you use it for productivity it’s even more insane to buy Intel crap. AMD Epyc cpus pretty much destroy anything that Intel has ever made and i’m saying it as a former Intel diehard.
I’m interested to see if they can finally make a proper next-gen cpu with 15th series.
@@SixDasher I used to think that but now realizing after 12 months of 13600K ownership all I do is play games and occasional video clip encodes for memes etc which already had fast enough encode times on my previous i7 8700K.
The Haswell refresh, Devil's Canyon (4790K) was a massive improvement over its predecessor (4770K), despite only minor changes. But this time they're presenting something that's even less of an improvement as a whole new generation of CPUs. It's like they reckon most people won't see through the marketing, and they're probably right.
Drinks like a Sandy Bridge-EP, performs like a Haswell.
Mainly big companies are a target of this because those always order the newest i5 or i7 no matter what the actual performance is
True, but skylake vs kabylake were identical except a igpu improvement. It's not the first time intel has done this. (and no it's not ok)
@NobbsAndVagene Most "gaming" reviews do a bad job in general explaining cpu architecture changes in general.
@@LucasHolt YET Kaby lake could do hardware accelerated x/h.264 decoding and skylake could not. That's a big upgrade if you are trying to build a nuc sized system that runs at 15 watts. IPC of the cores also were the same with better power efficiency. Go look at the AnandTech write up. Kaby lake yields were higher
"The combination of the two allows for more voltage range and higher frequencies, although it may come at the expense of die size. We are told that transistor density has not changed, but unless there was a lot of spare unused silicon in the Skylake die design for the wider pitch to spread, it seems questionable. It also depends which part of the metal stack is being adjusted as well."
Appreciate Gamers Nexus style of testing and benchmarking CPU's. I feel like other tech channels are just rushing videos out, using handpicked games and 1080p resolutions... You guys are showing the community better material right now. Just my opinion
I really considered upgrading from r5 3600 to a 12th gen or am5 upgrade, but I'm happy I went for a 5800x3D. Maybe I'll consider the 15th gen / the next AMD socket.
AM5 boards last longer like 4 to 5 years. im on old 5 yo AB350 using R5 5600. my next system sure be AM5 with decent mem price
@@roko98332r R5 5600 still a beast for it's price
@@roko98332r it is not possible to know that yet. Am5 has only been out
The next gen of AMD CPUS is going to use the am5 board, while AMD is unsure if they will go to a new generation of socket for the gen after.
@@sethperry6616 AMD has promised at least Zen 5 but were purposefully vague about Zen 6. Given the leaks we've seen surrounding Zen 6's departure from the IO Die + CCDs layout that Zen 2,3,4 and 5 feature, it makes sense that AMD wouldn't set it in stone in case they simply can't retain pin compatibility. But if they can, then expect Zen 6 and Zen 7 to be AM5, as they won't switch until DDR6 and PCIe6 unless they absolutely have to.
Finally, an intel version of the FX 9590.
What a milestone.
Huh?
Ok I'm done reading comments, this is the one I was looking for. I would have also accepted "Bulldozer Lake" or "Ooh, the new Fermi looks hot"
Good luck having optimization issues in mainstream games :)
Intel have done a couple ++ refresh cpu's
I say, with the power draw and total lack of generational improvement, Intel may have finally met their own Bulldozer Moment on Desktop (since they already had one in servers). Of course, as we know Intel is too big to fail and will quickly bounce back from this mild embarrassment.
If possible, I'd love to see Intel i7/i9 with PL2 @ 140-ish Watts vs AMD Ryzen 9 with ""105 Watt"" eco mode. (Preferably with a focus on productivity.) Thanks for all you guys do!
Anyone else excited to see how userbenchmark views this?
I don't think intel made enough cash from 13th gen to pay the subscription fees
Right 😂
Right now userbunchmork has 14900k at #1
The highest Ryzen CPU, according to them, is the 7950X3D at #10
The 7800X3D is at #17.
Truly one of the websites of all time
@@GeordiLaForgery
Those blokes will do it for cheap.
lmao
The comedic delivery timing is the only thing tighter than the difference between these "generations".
Absolutely... informative and hilarious presentation. 😂
That final “back to you Steve” got me 💀
Honestly, we really no longer care about intel chip rebranding, but still keep watching all these reviews because how fun they are, well done.
Yeah haha. I couldn't care less about Intel anymore. I've been team red for the last 5 years. Intel Innovation has stagnated SO hard. They just keep uping power consumption because that's the only way they've figured out of producing more performance for the last 8 years.
This entire thing is 100% entertaining as well as informative. Intel seriously dropped the silicon on this one.
Definitely one of the consumer products of all time.
is there anyway to get this sucker under 600 dollars, like perhaps get it for $50 bucks, or perhaps someone can pay me to extract my 12900K to install the 14900K on my Z690 mobo?
@@dabneyoffermein595Of course there is! Buy 13th gen under $600 and SB OC! Wait for 15th gen Intel.
I disagree. I suspect it will sell. It will be headline sales pitch in the latest Dell or HP workstation (and eventually laptops). Intel just made a load of money without having to invest a dime in R&D, that's genius, dirty... but genius. Yeah they will have lost a few enthusiasts who understand how cynical it is, but there are 100 morons for every one of them, they all will be clicking their fingers to get these 'next gen' CPUs. Being savvy in a mass market dominated by morons is a dismal experience.
Agreed. I thought about waiting for 14th gen chips to build a new machine, but ended up getting 13th gen from Micro Center not too long ago.
It's kind of a wild that Intel is still getting stomped on by the 5800x3d. That V-cache is incredible!
While it's also stomping on AMD's current 7000 series.
From my knowledge, if a CPU from a previous generation is stomping a mudhole in your current gen lineup, you know you're in trouble.
Wat
@@zachcooper9102The 5800x3d is not stomping on the 7800x3d lmao
What? If you're thinking your little 5800X3D is stomping on the 14900K, you're on crack.
I don't even think you can get half the score in CB 2024 But let me know. You have 2258 points to beat in my case. I'd be surprised if you get more than 930.
Love watching Gamers Nexus. Steve is able to entertain even when there is really nothing to talk about😂
My *entire* system, including a 34" UW display, uses 350 W or so - 300W on only the CPU is beyond insane.
that s on 100% core load, since it has 24 cores, but in gaming, it uses like 70W most of the time
That's a weak system brah my GPU pulls 400w
My laptop uses 120W in gaming in the worst case.
@@KoItai1Nope. at least 110.
Based on gaming bench videos at least.
@@ms3862undervolt it
I fucking love Steve's absolute disdain with Intel. It's entirely understandable, though, since their effort for improvement between 13 vs 14 is similar to that of a barely passing undergrad. Back to you, Steve.
Videos like this one really highlight to me how good of a CPU the 7800X3D is. Very powerful, fast, cool, and not power hungry.
7800x3d has high idle power usage.
It shows how far behind the curve Intel is. They dragged their feet far too long with Core architecture. 2008 designs in 2024 do not work. x86s and a whole new architecture are needed to compete, putting more lipstick on the pig and hitting it with more watts isn't going to solve this issue.
@@PressbutanYeah,all the current stuff they have is still a variant on Skylake. They need a Bulldozer -> Zen moment for their cpus. The only way they are even "competing" right now is with stuffing E cores and pushing their P cores to the max heat they can handle and even with all those tricks they only come close to Zen 4 or even Zen 3 X3d(5800x3d).
Most of the time their 8P16E cpus at 32Threads still lose out to the 16C/32T 7950x and the power usage is a joke. Almost the same power draw as my 4060.
All AMD needs to do is make a 16 Zen5 core and 16 Zen 5C core CPU at 64 threads and Intel will have nothing at all to compete with that. Even a 16 Zen5 and 8 Zen5C CPU would annihilate them, all the while taking less power than a 14900k. Now imagine that CPU but with x3d. Even less power draw and way better gaming performance.
@@rudrasingh6354all of their architecture is still based on scaling of the original Pentium 2 core. Their most recent architecture was “Netburst” for the Pentium 4, which was similar to bulldozer in that the frequency was high but IPC was low. The Pentium 4 and Pentium D used Netburst before they went back to the drawing board and revised the Pentium 2 architecture (which had been revised once before for Pentium 3) for the “core 2” CPU’s, and every generation since has just featured further revisions. This is notable a “refresh” in that the revisions made haven’t functionally effected the architecture at all.
few problems, intel is known for memory controllers amd just sucks at that max u can overclock amd is 6000 while intel can run at 8000 on 14 gen with good motherboards
but idk intel is starting to suck
Intel truly excels at making the competition look even better than it already was.
Except the competition likes to explode...
@@HenrySomeone No it doesn't?
Whatever you have tell yourself.
@@HenrySomeonewhat? Lol
@@HenrySomeone???
It’s wild how CPUs now have such high thermals that contact frames and 420mm AIO radiators just to get comfortable temperatures.
The Stellaris chart could use more game settings data. For example, map size (how many star systems), if L-Gates are enabled, in game year (important as number of pops affects performance), number of AI empires and whether or not Xeno-Compatibilty is enabled as that one setting can basically set fire to your CPU in the late game.
This guy Stellarisis
Agreed
For stellaris, I'd actually recommend setting it to the highest speed on a late game save, then seeing how long it takes to the to, for example, get through 50 years (shorter would be better). On the fastest speed paradox games run as fast as the system can handle and I think this type of benchmark would show much better scaling.
Sadly, the rng would probably invalidate runs over multipe turns. That is why they only test a turn to reduce that (and other) variables.
7 months later, the "thanks Steve" still gets me lmao
This CPU is great marketing for the 7800X3D
and those with 5800X3Ds.
12th and 13th gen was pretty hard to keep track of because you had to actually pay attention to tell what's good and what isn't. Now with 14th gen, that's not a problem. Thanks Intel.
I just built myself an i9 14900k PC with a DeepCool LT720, and it runs nice and cool.
This was a welcome reminder that the 11900K had two cores less then its predecessor. We got a "side"-grade instead of an upgrade this time, but at least we didn't get a downgrade. That's something, isn't it?
11900K was even more shitty, it had 2 less cores yes but it still had higher power draw
At least 11th gen offered PCIe Gen 4 over 10th which actually mattered. Now 14th gen is pretty much garbage and offers nothing new.
Once you look into the real world latency between 10th gen, 11th gen, and compare it to 12th and 13th gen, you definitely did get a side-grade. Overall system latency is higher with 12th/13th gen. So really depends on how you look at it.
@@germanmade1219 everything has gone downhill except benchmark performance since Intel introduce their e-core shite that should have never come to desktop
I'm astounded by how well AM4 holds up considering the DRAM bandwidth difference.
Could you please start including DRAM layout in testing config? E.g. 2x 8GB single-rank or whatever. This can make a surprising amount of difference. Maybe even worth a video at some point. Find a set of 4 benchmarks that are GPU-limited/CPU-limited/dram latency limited/dram bandwidth limited, and run on different DRAM layouts and CAS-versus-speed tunings, to illustrate the tradeoffs.
best thing about X3D is them not really caring about ram speed, getting uberfast ram is a literal waste of money for them
I very much enjoy the Sahara Desert level humor You employ! Bravo! 👍👍👍 ®️
the 4060 of cpus, truly one of the cpus of all time.
The 4060 at least has a software locked generational software gimmick that makes it look not terrible in very specific tittles
@@atretador So does the 14th "gen". Intel just only manged to get it ready for two specific games... that Gamers Nexus don't feature in their test suite. I can't recall it's name (Steve mentions in the 14700k review if you're actually interested) but it's the very definition of a locked software gimmick, as it will not be released for 13th gen despite it being the exact same silicon.
That's a really ignorant comparison.
The 4060 is disliked for poor performance. While to marketing of the 14th gen is B.S a 13900K/14900K is for all intended purposes a very powerful CPU.
I feel like getting a 7950x makes the most sense for people right now who need lots of cores. You get a brand new platform with upgrades down the line. AMD also did right by AM4 with the 5800x3d release. Pumped new life into a dated board.
going from 5800x to the X3D was so worth it for me, I thought it will be just a sidegrade but it's a legit great final upgrade on that socket
That's what I was thinking. I'd like to build a workstation that I can game on as well, and AMD seems the clear way to go at this moment. I just remain hesitant regarding which CPU so these charts are helpful.
@@DissertatingMedieval a 7800x3d is really good, would future proof for gaming for a long time. In terms of productivity, it would be fine. I use a 5800x3d and my machine flies.
Next year I have to upgrade from my I7 8700k , it was one of the last great CPUs intel put out but it's showing its age. I cant take any of the 12-14 "gen" Intel BS because of the idiotic P/E cores. None of the vitrulizers I use can distinguish between the P and E cores. FFS Intel , E cores are only good for mobile platforms, saving few W on a 300 W CPU is not much. Just make a standard CPU for desktops... A couple of weeks ago me a friend tried to workaround 13900k in Red Hat KVM and VMware ESXi. ESXi just could not work stable because it cant understand the core difference. KVM , we only got around by leaving the E cores for the host OS and giving the P for the guest. What is the point in dumping that amount of cache and not be able to use most of it....
This is why I will have to go AMD but AMD-V was never on the same level as VTX/d , that's something Intel nailed log time ago because of their Enterprise CPUs.....
I was given an advice to just buy Gen 11 Intel and stick with it. They are on very good prices now as retailers are trying to offload the remaining stock. The mobs for them are not on insane prices as well.
@@mowtow90 yeah these e cores are absolutely stupid in desktops
10:44 "that's when you know it's a good generational uplift; in fact, it's so exciting that we're not even gonna talk about any other production test"
I'm crying
7800X3D is unbeatable in games for a fraction of price and power consumption.
People that buy this don’t just play games.
@@dangerous8333 While that is true, these Intel CPU are still largely less efficient than an AMD CPU even at productivity tasks. A 30% increase in your power bill for a 2~5% improvement in performances just isn't worth it. Hence why several datacentres in the world (such as Cloduflare's) have moved to EPYC.
@@dangerous8333 Yeah, they play themselfes
People that a work machine buy threadripper@@dangerous8333
@@dangerous8333in that case they should go for a 7950x3d. Better gaming, similar productivity, 1/3 the power.
I am really enjoying the step up in humor. You could almost add cut out graphics to make these reviews Montey Python skits
Steve and the GamersNexus team together, made this launch something entertaining and worth to watch. Unfortunately the only downside to this is that I want more videos like this, but I hope not.
Came to see if it made any difference to Starfield. Left a like. Moved on. Thanks Steve.
Thanks for the like and the comment! It actually does help a lot of someone's watch time is low (but they found it helpful) and balances it out!
If this continues were gonna have to start figuring in electrical breaker upgrades into build costs.
Lol
Adding 14 cores added 200watts. Were in trouble
So put a power limit or undervolt?
My 13700k Runs at 5.7Ghz and barely uses 60-120W depending on the game.
150W power limit and it drops to 5.3Ghz but that's only when I am rendering.
I can save more power with the eCores disabled or parked.
The GPU is still my biggest concern at 400W.
This generation will go down in history as one of the generations of Intel's cpu's.
I love the snark GN team. These Intel reviews have been hilarious… unexpected joy that the product itself will never deliver.
Intel can be a good deal during Intel discount season 😉
@@peterpan408 you'd still end up saving more from amd simply due to not having to spend extra keeping it cooled and powered 💀
The callbacks to the Intel presentation are never going to get old
This thumbnail belongs in a museum.
*K-KONO POWA DAAA!!!*
I’m not even going to try and make a better comment I’m just going to agree
userbenchmark review gonna be crazy🤯
I like the comedic approach here. Even the subtle parts like saying "13" when you meant "14" repeatedly. Well done!
It's crazy how much power some of these newer top end CPUs use. My i7-8700k doesn't even get out of bed most days for gaming, and I was worried it'd run hot and had it delidded back when I got it.
My 9700k gets like 50fps in starfield at 1440p I'll probably grab 15th gen or AMD maybe
R5 5600 still being relevant is amazing!
3700x still showing it's old but efficient as well
Truly unbeatable from a value per dollar perspective.
That's exactly what I've been using since it was released. R5 5600 and a Sapphire Pulse 5600XT. Absolutely fantastic combo paired with a nice B550M mobo and some 3600mhz ram. I don't see any need to upgrade for the next few years. I play Starfield at 1440P medium settings no problem.
looking at the efficiency of amds 7800x3d and 7950x3d i am so impressed and i somehow feel bad for not having bought amd stock before
My last 2 pcs are AMD CPUs. They really have stepped out it up, I'm looking forward to the Zen5X3D chips
AMD wasn't always this good, they have had some real stinkers too. Bulldozer for example...
Have you considered numbering the cores with preceding 0 for numbers
Can you imagine? A cpu model numbre like intel 14P8E10 and be able to identify specs of the product only with the name... my god... life could be so much simple
I haven't ran the numbers but it seems like even if you're putting together a brand new PC, if you're on a limited budget you might _still_ want to buy a 5800X3D and put all the savings you've made on AM4 platform components toward a bigger GPU.
I consider myself pretty faithful to Intel and if I had a friend building a PC asking for advice I'd steer them towards the same. If not 5800X3D, then absolutely a 7800. Either would be fine. In fact both seem to be fantastic value compared to team blue..
I wouldn't. The only reason I got the 5800x3d is bc I was on the AM4 platform already. Not to mention, PBO2 optimization also gives me nearly another 20% in performance with the 5800x3d. But with a new system, I'd totally save for AM5 for upgrade ability later down the line.
@@Purified1k pretty sure that for gaming on 1440p with a high-end gpu and the 5800x3d you can even skip AM5 easily
@sebastianferreira3595 Correct, easily. But a new build, you should just use AM5 so you have an upgrade path.
@@Purified1k yeah but looking at am5 now, they'll probably develop further so ddr5 and all can be better used, it's better to wait. Most b650 mobos are 6400mhz and stability issues are going on if you touch too much, ppl with this current boards and rams are probably gonna want to buy new ram or even new mobo as well for 8000s or 9000s ryzens, makes no sense to go am5 now imo unless you running a really old chip... I would wait myself until 2k25 and see how new gens come up
I upgraded from the 3900x to the 5800x3D back in Jan. Its hard to sell me on anything new for a while with the value I got on that chip during the holidays
Thank you for revisiting the 13900K, good stuff here.
Man, first the 4060 ti, then the 7800XT, now the entire 14th series. We're truly in the era of generational stagnation. At least they had the decency to price the 7800XT well and the 14th gen the... same.
yea, they said after the 13th release, that the 14th gen will not be a generation upgrade, and just a 13th gen refresh, that s why it has same slot and everything almost the same, ppl just need to read, but it s normal for everyone on youtube to make clickbaits and stuff for views, nobody said that 14th gen will be a leap, but views are views
Its not even a refresh tho. It's nothing. Less than 5%
@@DM16_ it is a refresh, that s why it s called intel refresh raptor lake, (13th gen being raptor lake)
@@KoItai1 Then why name it as if its another generation, could've named in a 13950k atleast, it's not like all people knows it's a refresh, but we can surely do say that a generation naming scheme from 13900k to 14900k does mislead people. intel is doing this for marketing obviously to decieve people
You live in an era where every industry is controlled ruthlessly by hedge funds, who demand not quality products but the maximum in shareholder return on investment. EA, Intel, Dreamworks Studios. Pick a company and they're guilty of being on strings, controlled by companies like Vanguard, Blackrock, et al
Was sad to see total war not included in the benchmark tests as it is one of my favorite franchises, but good to know this is literally just a refresh rather than a new generation!
this is THE MOST lowkey funny hardware channel i ever subscribed to. love you guys
I laughed so hard, love GN
Comments like these make me glad I spent the money during COVID lockdowns on a GN Wireframe mousemat. I'm glad I supported a channel that people enjoy so much.
I don't know if this is feasible (I suspect it would require a _lot_ of test time, even though you could tolerate substantially more noisy individual test points), but a benchmark _sweep_ of something like Factorio across different working set sizes would be interesting. One thing I've noted is that performance of Factorio, and DF, and a lot of other simulation-esque games in general, tends to tank once your working set no longer fits in cache, and it's definitely noticeable that the 5800X3D can handle a lot more than most other cpus before that happens - but most benchmarks don't really show the full story, instead just showing one or two datapoints at max.
Watch hardware unboxed 14900k factorio benchmark for this exact situation.
WoW! My 5800X3D still beats the 14900 in so many situations. And it consumes a fraction of the power.
Intel has done it again. Truly revolutionary
Given we pay ~2.5x as much for energy here in the UK, the cost is far higher than the initial price tag suggests.
Meh, when you are watching youtube and surfing internet cpu is basically on idle and consumes like 20w. If you play games like 5h per day you pay like 10c per day more if your electricity is 20c per 1kwh. So 35€ in year if you are a f**king nerd and play 5h every single day
@@DuBstep11510c? 20c? lmao. Where do you live, in russia? electricity in europe is more than 35c per kwh. In uk 2x5 so like 50/55c per kwh. Now redo the math, kid.
@@DuBstep115 During peak hours my electricity goes up to 60c/kwh. but Its usually 34c. But yeah sure if you just pretend the numbers are smaller the cost argument goes away
@@Salvo78106 I was being generous and shooting it high. I pay 8c for 1 kwh. I live in Finland. So its even lower, 100w vs 200w cpu difference is 15€ for me if you play 5h EVERY SINGLE DAY
@@stephenallen4635 Well get rekt then :D
that episodes intro was great, keep up the good work and creativity, computer channel number 1
Honestly, I would've like to see you add the i9 13900KS to see if the i9 14900K even compares because there both 6GHZ
They are exactly the same chip
@@damara226814900k has slightly higher memory bandwidth, other than that is it is the same chip
I cant believe i bought a 14900k 2 years ago
Been waiting for ONLY YOU to review this! No one else competes!
"Our room heaters are 3 times more powerful than the competition, with 5% less computing! Intel - we make winters warm!"
A 30% increase in watts over my FX 9590. An engineering feat never before imagined. The bold future of more watts for more dollars, giving only as much performance as you need.
Yeah well, it's also about 3000% more performance, so...
@@HenrySomeone so... what? did you miss the point?
@@stephenallen4635 Point? The 9590 was hot garbage, sucking down power (in times where most boards could NOT handle it - there were numerous cases of fried mobos back then) and performing only marginally better than a 8350/8370 and miles behind 4770k that came out at the same time. The 14900k is also hot, but at least it performs.
@@HenrySomeone ok so you do get what he was saying is exactly that applied to the 14900k. Its barely faster than its predecessor, comsumed almost 25% more power to a ridiculous degree and should be laughed at lole the 9590
@@stephenallen4635 Did you not watch the video where it shows it using even a few watts less than the 13900k, you blatant AMD fanboy?
I’ve been using the i7 980 since 2008 the next five years from intel only had incremental performance. I have just bern updating the GPUs.
Finally got the Ryzen 9 3950 in 2020, been great.
7800x3d 💯
not necessarily, 14700k can perform the same or better in games, while being 2x better in multicore performance
@@KoItai1did i even watch GN's review on the i7?
@@evilleader1991 did the guy that comment watch the review of 7800x3d to comment about it? no
what are you even on about bro, 14700k gets DEMOLISHED by 7800x3d in gaming.@@KoItai1
"Now for the 13..erm 14900K..." this cracked me up whenever Steve did this 🤣🤣
When a CPU draws almost like a RTX 3080. Wild times.
@@leanja6926
Parrot.
Can you test the ratchet and clank portal sequences and starfield with different combos of cpu and ssds. Apparently starfield is not just cpu bound but also ssd bound, same with ratchet and clank portals.
We're using a high-end NVMe SSD. There is no SSD bind in our CPU testing. As for doing a standalone on it, no plans right now but maybe as Direct IO becomes more prevalent!
@@GamersNexus If you want to do it several years from now, i recommend buying the 3dxpoint optane ssds now before they are forever gone. For science.
Just caught one on a $300 sale. Not a bad deal from a 12600k upgrade. Thanks for the summary.
That soothing feeling when my 5900X draws 133 Watts at PEAK load and barely draws tens of Watts when idle...
Yeah well, it's also barely a third of 14900k's multi thread performance, so...
actually, better no mention iddle, a 12700 pulls 6 watts at iddle, 13700 8, and even this hot comet that is the 14900 iddles at 14
Same here, i have the i7-12700K. Iddle power is way lower than a 5900X.
Here's a question. At hat wattage does your CPU sit when playing say Cyberpunk?
And tens of watts? Funny. I'm looking at my 14900K in HWM right now and it draws less than 10.
@@AB-80XI don't play Cyberpunk, never did, never will. I refuse to even pirate that crap of a game.
Hilariously, on the biggest Swiss electronics retailer’s website, the i9-14900K launches $20 cheaper than the current price of the 13900K…..it’s as if they knew that it would be what it takes to convince people to buy the newer one 😅
Edit: same story with the 14700K vs 13700K btw
That's strange. I wonder if Intel is offering any subsidies in order to shift these? It would be silly to not reduce the price on the older hardware first, but this is Intel we're discussing.
@@Pressbutan I don’t know, and interestingly, all four except for the 13700K are labeled “on sale” - but the discount on the 14900K is 13% while it is 7% on the 13900K - with both listed base price being virtually equivalent (the 13900K is listed as $5 cheaper)
That is interesting. Could be a retailer thing to manage their own inventory, I have no clue on either side but I am aware both retailer and vendor will mess with pricing like that to psychologically manipulate consumers into making poor decisions.@@polymniaskate
Bagged a 12900k/msi mobo/ripjaw ddr5 6000 combo from microcenter couple weeks ago for $400.. see ya'll in 3-5 years zzzzzzzz
Awesome and informative video like always! thanks for the hard work :D
Looks like Intel is all in on the home heating business. That's just good business diversification strategy.
Its weird how people dont realize this is the end goal
governments forcing people to switch to electric heating, so I am just helping the environment!
@@radiantveggies9348
Most people are sheep. Most people do not understand what the point of a 14900K is. They think it's a gaming CPU.
Given the uplift should these have been the 13901K and 13701K?
Thumbnail aged well.
I have the 5800X3D and the TUF 4090. I am sat here smiling. The 5800X3D has to be the most legendary CPU ever made.
Let me know if it last 10 years and then we could say that.
@@dangerous8333 no one who buys a 4090 keeps a CPU for 10 years, what a stupid statement. Go lick intels boots.
now if only AMD would let us overclock the darn thing. That's the one thing I regret switching from a 3800X to the 5800 X3D
@@KR4FTW3RK
Overclocking a 3D CPU. Yep, go right ahead. Please make a video. I've always wanted to see one of those go full Chernobyl.
With every video comes an increasingly powerful and historical thumbnail. Can’t say they’re ineffective either, because it sure as hell got me to click on the video.