Small error, the driver version used is "544.43", not 644.43. Also please note there are some errors in the 5800X3D data. Spider-Man should be 144 fps, not 122 fps ACC should be 175 fps, not 161 fps Baldur's Gate 3 should be 145 fps, not 132 fps Star Wars Jedi Survivor should be 140 fps, not 129 fps
It's worth noting that Intels draw significantly less power when at idle compared to AMD. Something like 9w for the 13900k vs 54w for the 7950x. So if you're not a gamer then intel is more power efficient in the long run as far as day-to-day use
@@Barncore I noticed that my Ryzen with MCM design draw more power during idle compared to single-chip Ryzen but not as high as 54W though. More around 20-30 watts. And this is quite a lot higher than the good old Ryzen 7 1700 which surprised me so much in efficiency back then in 2017.
@@eternalwingspro4454 that's the problem of MCM, the chip with single die often better at idle power consumption. If you into that thing then you should pick APU model from AMD (often made in single die). I have 5600G and 5950X, 5600G idle around 10W of power, but 5950X is ~31W.
@xiphosmaniac Well not really, I bet if people looked how much they are actually gaming, vs just browsing and watching youtube, its probably mostly not gaming.
@@bt82 one probably shouldn’t be getting such expensive processors if wattage on idle is a concern. Laptops or anything mobile are an obvious exception tho.
@@coffee7180 This may be true but for most people, like myself who stay about 90% in idle, even though I have a 4090 but just don't game that much, this would really offset any of the gains an AMD would have. Most people are not running their PC that hard and just staying around idle where the power difference isn't as much as AMD likes to claim and assumes people are running games the whole time. Either way, I think the intel runs too hot during games and that is why I decided to wait for Intel new lineup of CPUs, hoping they improve on this or at least offer much better performance for that heat.
To me it's very clear what they've been doing, this is NOT a refresh but CPU binning, they actually reserved better yields for the 14th gen while releasing the worse as 13th gen.
14700k has 2 more cores though. Wait a minute. Maybe that was binning too? Holy shit. Maybe they had a batch of chips that had 1-2 failed cores and decided to release them to as 13th gen. Rest is for sure just binning.
Well they have Meteor Lake on Intel 4, but it's just not coming to desktop. So they're making advancements, but they're seemingly a bit rushed and I imagine their capacity on their leading edge processes are being tied up with their HPC/DC parts primarily. Arrow Lake for desktop will amount to basically two architectural generations and two process generations ahead of Raptor Lake. That's what we'll be waiting for and Intel *needs* to deliver on it.
Coming up next Arrow lake, based on igorslab leak it will only bring + 3% ST and 10% MT over 14th gen. Also hyperthreading is gone on arrow lake because they fkd up the chip design but dont want to repair it because that way they would waste even more time on their obsolete and bloated "core" architecture
Well, should they release nothing at all? Technology takes time. Be happy for an oppourtunity to get 12th and 13th gen at a discount thanks to this new gen.
@@damara2268that’s a disappointment, I thought after two small improvements generations they would have another big leap in performance like the 12th gen.
Yeah... Intel has been doing it for years. From Sandy Bridge to Kaby Lake, they released the exact same chips for 5 whole years. Until AMD stepped up, Intel released borderline scams. EDIT: It was a miracle if people saw 3% gains.
.@gatsusagara6637 Sorry for misunderstanding... minus in the relation to 4060TI power consumption which is actually the only impressive thing about it while 14900K on the other hand as you said...well....
@@LordPaulusCobris I wouldn't mind a B550 to get some features but in a nutshell it's not worth it for me. I'll wait for Zen 5 with you and then decide.
Looks like Intel is back to it's 14nm ++++++ generational performance improvement. 3-4% performance uplift with more then 30% extra power. Yep classic Intel.
Those gains are only for this generation, this is because they couldn't get meteor lake desktop to come to the market so they had to come out with a refresh and as we all know, refreshes are nothing major. Next generation will have moderate - large gains if I'm not mistaken.
Yep. This is simply 10nm+++ You can tell its the exact same chip as 12th gen because the only way they can get it to go faster is by increasing the power. Im a little impressed. I actually thought last gen was as far as they could push it
@jacktheripper4458 Oddly enough I've only experienced stuttering with the 3 intel systems I had. I went to amd for the first time with the 7800x3d and I can count the stutters I've experienced on one hand in the 4 months I've had it.
@@jacktheripper4458There is truth to having stutter on b650m boards. I swapped my MSI b650m am5 board to a quality a620m board from Asrock and all stutters and issues went away. Most responsive pc I've ever had
The fact that you reused the thumbnail from the 13900k review fits perfectly 😁👍 Also, thank you for providing power usage numbers for all Games. Your CPU Reviews are the best
The power usage numbers really is a welcome change, one that all reviewers should embrace. Intel is making huge sacrifices to power efficiency to stay on the top end of the charts, and that's at the expense of their customers, who are getting a CPU that is hot, throttling even when water-cooled, and would need to be considerably cheaper than AMD for it to be considered an alternative to AMD's offerings, plus the AM5 socket longevity. As it stands, it has to be AMD for the large majority of people on the market for a new CPU.
@@filipealves6602Yeah, if AMD had more fabs reserved for them at TSMC, the desktop AND laptop side would be a complete blow out for AMD Ryzen CPUs. I mean, the Intel 19-13900K did one thing good: It got me onto AMD's AM5!
Impressive review. I really appreciate the updated Factorio benchmark and the power draw for each of the games. Much more useful than only comparing power comsumption using blender. Great work.
More than double the power draw, 60% more expensive, evidently almost impossible to keep cool even with a 360mm AIO and effectively no better in gaming compared to a 7800X3D... At this point I'd ask if this is some sort of out of season joke, but Intel has taught us all better by now.
Same here. I don't do much fps games. But I am just starting on Starfield. It's a little choppy, but I think that is my 5700xt. I mostly play civ6, and the performance boost over my old 3800x is noticeable.
Yes, its noticeably slower when not gaming, and just a bit more efficient when gaming and if you game at 4K, probably not that big of a difference. You basically just save maybe $20 bucks a year on electricity, makes you feel good buddy?
Why would they change though. It's basically what they've been doing for over a decade now, with the odd uplift, followed by several generations of refreshes.
£450 LOL. Nice work! I'm probably going to go and buy a 7800X3D now, but am still slightly concerned with enabling EXPO as apparently there's still some issues with it.
@@benjamind547 new bios updates mean that the best memory speed that is completely stable is 6000 CL30, no issues at the moment, but there used to be a few months ago
Well, at least for once they are launching the new CPUs at the right time - just before winter, so now you can heat not only your room, but your entire house floor with your PC. This is what real innovation looks like.
Intel had better have a 15th-gen ready for the Zen 5 launch, because there's no way this is going to be enough. If not, we'll have a quarter of AMDominance, possibly even longer.
Tempted to buy a 7800X3D now despite their previous issues with EXPO-enabling, but that might just be that some people got unlucky. I'd be upgrading from a 6600K.
@@scxrlethouse I did hear Jay from JayzTwoCents mention he's still having issues with random memory retraining and not being able to enable EXPO, which is leading him to consider the latest Intel release.
@@jeevejavari8461i think you will want to upgrade to amd's 8000 gen with how things are going to be honest. im on an ryzen 5 5600x amd im tempted to upgrade to the 7800x3d but it's very pricey since i need to upgrade to the am5 platform so also new ddr5 ram. i think im sitting this gen out and ill go straight to the 8800x3d
@@jeevejavari8461Yeah, no reason whatsoever to upgrade from 12th gen (personally have 12900KF). Waiting for arrow lake, maybe it brings finally something new to the table.
This has been my way of upgrading, getting a gen before the latest especially when they're heavily discounted. Don't wanna be the guinea pigs for these companies while still paying a premium.
@vinarrun3622 What production? Sucks in what way? People like to throw that word but don't really explain what. I have 3 systems, an i7 12th gen, an Apple M2, and Ryzen 7 2700. All good with my professional job, software engineering. The difference is the i7 heats up way faster, but it does the job.
As someone who has to game on a 7950X gotten for a work-from-home workstation, this just makes me hope the 8950X3D has cache on each CCD, as I imagine that would be a real beast.
@@thefurmidablecatlucky3, for sure. I mean, I'm using the 5800X3D with an RTX 4070Ti, at 1440p, on a 75Hz monitor (don't ask), so I limit all my games to 73 FPS for the best low latency experience. Not to mention that I only have time to play 5, maybe 6 games per year (I'm almost 40, so my 'avid gamer' days have long since passed). 🙂
@@thefurmidablecatlucky3Definitely not true. The massive uplift I saw on a 6900XT going from a 3700X to a 7600X was hilarious. I was so CPU limited that I ended up seeing 50% uplift in some games, and well over 30% in most others.
zen3 (5000series) was a bigger uplift over zen 2(3000) than zen 4 (7000) is over zen 3 so he will still be good for many years. notice how the 5800x3d follows the 7800x3d .. yes one is newer and faster but not by the same gap you will get comparing 3000series chips @@rustler08
@@rustler08that doesn't mean that the 5600 is bad, it's just that 7600x is really good, for the majority of gamer which have a 3070/6700XT or below that 5600 will still be good for the next few years
Thanks so much for the information on the power consumption of these parts; power consumption is a major consideration for me when looking at new parts due to my living conditions (stuffy room that gets hot if my system uses too much power), so this video is a super useful resource for me.
@@samarkand1585In reality, the 5800X3D gets destroyed in perfomance per watt by the i7-12700K, the i5-13600K, Ryzen 7 7700... While writing this comment, my i7-12700K is pulling 8 watts and is just Big-Little. Nothing of balanced power plan or anything like that. No Zen 3 chip draws 8 watts while watching youtube on high perfomance power plan at steady 5GHz.
Omg I remember Steve "unboxing hardware" videos and having the time of his life asking for cool stuff from the subs. When this channel had like... Less than 50k subs. Now 1M? Took you guys long enough! We don't always see eye to eye and I've caught many mistakes here in the past that I addressed and even grew tired of thr channel back and forth, but this 1M is absolutely deserved. Really hard fought. Regardless if our opinion match or don't match, the amount of work put is just incredible and the journey I witnessed is really worth rewatching. I'm very happy to have been a part of this, the good and rhe bad. You guys deserve it!
15th & 16th gen Arrow Lake & Arrow Lake refresh are going to be huge. 1st cpus released by Intel that were created under the legendary Jim Keller. Everything he makes is gold.
TBH they don't really need to, the only AMD cpu that makes sense are the 7950x or the 7800x3d. 14600k and 14700k are far ahead of the 7700x and the 7900x in productivity and also ahead of them in gaming. There is a bit of a gap for them to relax for a year or two provided they keep giving the same minimal improvements.
@@BroderickNorthmoorThat doesn’t mean much when AMD has been siphoning off data center customers. Money isn’t in the gamers, they need to offer more everywhere
I got a 7800X3D because I wanted good gaming performance without having a space heater on my desk. I'm glad I made that decision. Intel CPUs just use too much power and run too hot these last few generations. Of course if you care about productivity tasks then they're the best for you so turn on the air conditioner and enjoy!
It's weird that my i7-12700K is an anomaly. I'm using it for a music production PC, it's air cooled and i'm using a Corsair 5000D Airflow case. For gaming, it stays around 45 degrees, on Cinebench R23, rarely goes beyond 70 degrees. While writing this comment, it is around 24 degrees. By far the most efficent i7 ever made in a decade.
7950X3D scheduling issues can be solved by setting bios to favor frequency, then using Process Lasso to set games folder to prefer Cache. The performance is very good after those changes. I think it would be cool to see it be investigated (not saying these settings should be used in benchmark videos)
Agreed. This is the way to use the 7950x3D. The driver/game bar solution is just really inconsistent and checking the "this is a game" check is about as much time as setting affinity in process lasso anyways
Maaaaan I miss intel. These insane power draw and heat output keeps on steering me away for the past 8 years. That’s a lot of heat coming from a gaming pc when you live in a tropical country (like me). But I guess, intel is always considerate of their customers who needs a pc that doubles as a room heater 😭
@@aeronybrek4696 Radeon cards are way more competitive than you think. The only thing nvidia has going for them is the 4090, but every benchmark out there is still using drivers from launch and most are using reference cards. Partner cards are pretty considerably faster and the gap is way smaller, but nvidia is getting spanked at every price point except the 4090, which is 170% the price for about 15-20% more performance than a partner xtx. 7800xt vs 4070 is pretty funny, as is the 7900xt vs 4070ti or 7700xt vs 4060ti. Latest beta driver just made AFMF framegen available for pretty much every game and on the 6000 cards as well.
While these stock temps and power are ridiculous, you dont have to run them like this. They can be made to run a fair bit more efficiently, though ultimately they are still a generation behind Zen 4 and TSMC 5nm in this regard. Meteor Lake should be much closer in terms of efficiency.
THANK YOU *SO MUCH* for the IPC graphs! I've been wanting to see this comparison for a loooong time (way before the release of the 14th gen). I hope the others find it useful too, so that you continue doing it in the future :)
@@busetgadapet Maybe AMD CPU owners are not automatically AMD GPU owners? That said, personally I went with an 7900XTX over a 4080 because the price was 20% lower but the power consumption was only 11% higher (and I get 8GB extra), which is, all things considered, a pretty negligible amount. But we're not talking 11% here. In many instances the 14900k used 30-40% more power for 15% less performance. That's MASSIVE. And observe that Steve was "being nice to Intel" in the sense he compared total system power and not just CPU power. If you isolate it to just CPU power (as Gamers Nexus did) there are instances where the 14900k uses twice the power of the 7950X for practically identical performance. So you can actually perfectly complain about one but not the other. My GPU uses 35W more than the competition. Not 150-200W like if I had gone with a 12/13/14900k over my 7950X3D. TL;DR: Having principles can be a good thing. But only as long as those principles translates to tangible differences in the real world. Otherwise it's just benchmark masturbation.
Thanks Steve. Would have liked to also see the 13900KS on the charts -- just out of curiosity to see if there's any appreciable difference at all between the 14900K and 13900KS.
It is awesome you add power consumption charts to each game. To me, the 14600 efficiency seems to be a small but nice bonus over the 13600. Good for SFF I guess.
@filipealves6602 I'm not a chill, I have an AMD and an Intel PC, one for regular use and gaming, and the other for VR and Simracing. The AMD lags and stutters and I'm constantly tweaking settings to troubleshoot shit, the Intel/nvidia pc just doesn't. I've built my own PCs since before AMD bought up ATI, my 1st AMD pc was a 2004 Athlon 64. I adopted ryzen 1st gen when it came out, then got an r5 3600, then a 5700x, and today I play on a 7800X3D, gpus I've owned incude an RX 480s, 5600, 5700xt, and right now a and RX 6800, so I have plenty of experience with AMD and it's always the same, so are you an AMD chill for ignoring real users' complaints?
In the UKs top Amazon CPU sales list the whole top ten are AMD, #11 is a 13700K (for more money than a 7800X3D, I mean seriously?), and #12 is a Thermalright doohickey to stop Intel chips bending in the LGA1700 socket after Intel did such a bad job designing it.
@@JoeMaranophotographyEach Intel CPUs have their own seperate listing from 12100((k)(f)) to 13900((k)(f)(s)), whereas the contact plate only has one listing. Edit: spelling
7:46 if 1713 is 100% then 1779 is 103,85% so how do you get a 14% uplift? Because you have it right with the i7 and the i5 being 6% and 7% faster respectively. So why is the number for the 14900K so wrong?
paying for the privilege of having a harder to cool part with way more power consumption and more dependant on ram to extract meanigful performance out of it. seriously, the amount of extra cash you'd need to spend to get a better performaing 13900k/14900k system vs a 7800X3D system is absurd. you could run the latter at stock with 5600mhz ram, a $40 air cooler and a budget motherboard and have zero negative performance impact whatsoever. the Intel setup NEEDS a high end vrm motherboard, at bare minimum $120 investment into a cooling solution, 7200mhz ram and it will only cost you hours in tweaking and stability testing and literally more than double the power consumption and heat output, to hopefully not lose to the AMD setup in almost every game. Intel i9's are simply not worth the cost unless you're going for wolrd record benchmarking. as gaming solutions they perform well, but not nearly justified cost of ownership.
Moving one step closer to outlet max wattage being the limiting factor. "For those running a 15a house electrical system, if you can't upgrade your house, you will be power limited and so upgrading to current gen might not be worth it".
Very happy with my AMD Ryzen 5 7600, such a great value. Even lowered the TDP in the BIOS, making it use 30% less power for a mere 3% decrease in performance.
My i7-12700K is an anomaly, i bought it for 330usd years ago. At stock, 5 watts iddle, 150 max on Cinebench R23. Infinite turbo as well, 70 degrees on Cinebench R23 with air cooling and a Corsair 5000D Airflow... This is literally the best i7 ever made in a decade, the only weakness is the crappy DDR5 memory controller.
What a bloodbath! I hoped Intel had learned from the 11th gen disaster. Another full year of no competition for AMD is disappointing. Let's hope Arrow Lake will be on time next year at least.
I'm glad to have sidegraded my 13900K for a 7950X3D. I use the PC for gaming and for work. Code compiling on 13900K was very bad in terms of performance, power consumption and noise. 7950X3D is nearly silent during a C++ firmware compiling and it runs so much faster thanks to the 3D cache. My compile time decreased from 3.30 minutes to 2 minutes. I feel that AMD is a much more stable and "calibrated" platform over my previous 13900k. With 13900K the first AV2 load makes the CPU throttle like crazy, with 7950X3D no downclock happen during heavy AV2 loads.
Side grade? You essentially had to build a whole new computer, no? New mobo, cpu, ram. I guess everything else can be swapped over but still, that had to run at least 1000 to swap.
@@B1u35ky If he needs to compile his code many times a day, work-related, then he saves a tonne of time. Those extra money for ram, MB, etc., will be saved in a few months max.
Steve would you guys be able to change the graph colours a bit to make it easier to use Like 14th gen is royal /dark blue 13th gen light blue Ryzen 7000 dark red 5800x3d is light red or orange This would be dramatically more helpful
I have new settings which I am using for my i9-13900k and i9-14900k. I don’t like the uncertainty of undervolting so prefer working with power and clock limits. With these settings I got a couple of FPS BETTER on CP2077 and Forza Horizon Benchmark, went from 99th to 98th percentile on PC Mark Extended, passed the Time Spy Stress Test, Time Spy was within a whisker of the average for my hardware, Cinebench R23 finished between 37-38000, CPU-Z, Intel Diag tool and Prime95 was stable and < 90 degrees. MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.3 GHz and E-core boost to 4.0GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC values. With these settings, you should have no stability issues, be able to run on an air cooler like the NH-D15 (what I use on my 3 i9 13th/14th gen systems) and will barely notice any performance changes in gaming or productivity.
It's really good for general work. If you have a relative asking for a simple build for general workload, just give them a 5600G with 16 or 32 GB RAM. Set it and forget it. That's what I did for several people I know. It's not like it can't play too, most of these people play simple games.
If I pair the 14900K with my 7900XTX, I won't need my furnace to heat my room this winter. :) Actually the 9900K + 7900XTX is doing a pretty good job of that now. An AMD CPU is the next upgrade.
@@SwAT36 Not much, but some. I see it the perf tool where occasionally the 9900K is the bottleneck. And If I try to use FSR to upscale from 1440P to 4k the CPU bottleneck really shows. I thought it would boost the FPS but it doesn't and CPU pegs at 100% and GPU drops to 75%. also the platform my 9900K uses is behind in DRAM and SSD tech. This is playing TLoU.
@@SwAT36 I upgraded from a 9900k to 13900k and honestly I did find a decent performance increase in 4k, and especially just overall desktop performance, was worth the upgrade. The 14900k tho...
I think it's also fair to point out that if you're air cooling these intel chips (or using a smaller, less powerful clc) you will NOT be getting the same performance out of them as in the testing. They WILL throttle at lower clocks.
and i can add that even watercooling wont work if you go for a radiator with a smaller size or a lower fan speed. so yeah.. an additional 100 bucks in cooling, coil whine and high fan speed noise on top of the worse performance. best deal ever
The way the CPU's are going with regards to power and temp we should see a Kettle fitting to a PC' case so we can make a cup a tea when we are gaming 😁
This is not wasted time in testing, even tho it feels a bit that way... is 2 in 1 so called new cpu release and a wonderfull up to date revisit on up to date bios, drivers etc with relevant games on time for holiday season. You probably help consumers more than you realize Steve.👍
I wonder why the difference on the ranking for the 5800X3D in Cyberpunk and Baldur's Gate 3 is like that compared to what Gamers Nexus reported on it. On their testing this processor was on the top part of the chart for the most part, so I am curious to know on why this happens.
Yeah, was wondering about this as well. GN's result has the 5800X3D with a 40FPS uplift over these results. Perhaps different locations, Ultra vs. Medium on GN's side, maybe SMT wasn't enabled for the 5800X3D?
I recently chose to go with the 7600x over the 5800x3d after seeing it was cheaper to go with am5. 7600x = 199. 32gb 6000 cl30 =$95. AM5 Mobo= $80. Total am5 upgrade was $255. After selling my old hw for $120.
Unless you got your old PC for free. AM5 cost you what you paid for it. Guess we're all prone to those kind of mental gymnastics as justification lol I bought my 5800X3D for around $255 and that's what I paid for it. Even though I sold my older processor for around $100
@@ij6708 If the new components replace my old ones and I no longer have a use for the old ones, I always sell them and deduct the proceeds from the upgrade cost. My old Mobo's vrm was also too weak for a cpu upgrade. I was sure to get a an AM5 board with a good vrm this time around
@@zoopa9988 Actually the X was $30 cheaper. I wouldn't use a stock cooler anyway. The X is also the superior chip as it has higher clocks out of the box. It can also be limited to 65w and slower clocks like the non x
@@meh-tc6vv If it was cheaper for you to get the X version and you didn't wanna use the stock cooler then I fully understand. The stock cooler is totally fine though, the non-x only uses 65 watts, 45 in my case. I wouldn't call it superior though, they are basically the same, ain't no difference. It's just that the X version is way less efficient unless you also lower the TDP. Just like me with the non-x I can just increase TDP from 65 to 95/105 TDP and get the same performance as the X version. I would say the X version is basically the non-x with a bad TDP.
Thanks for including the power draw charts! I'm going to upgrade my i7 7700 to a ryzen5 7600x soon, power draw is quite important too. When gaming for some years, those extra kWh probably add up.
@@saricubra2867 my i7 7700 has 4cores (8threads), doesn't support Windows 11 and support will endofservice in march 2024 ;) 7600x is +- 3times faster on most gaming and prod. benchmarks i've seen. The Intel i7 7700 is quite old so it's time for an upgrade
And the 3D Vcache parts are also much more easy to undervolt, because they have to be binned for power due to having difficulties cooling otherwise. I can run my 5800X3D at -100mvtt baseline and then add -30 PBO2 tuner on top of it (yes that is actually 100% stable). I can't measure it accurately, but its 60-70W tops, running 60-65 degrees with a 120mm NH-U12S at 80-90% load in Cyberpunk's dogtown. If you then compare efficiency...well. It's embarassing for Intel.
There is no difficulty cooling X3D parts. I wish this absurd myth would just die. The stacked cache die lies on top of the existing L3 cache. It does not cover any part of the cores themselves. There's no extra silicon above the cores, so these parts have the same thermal transfer as non-X3D parts.
You can do the same thing with Intel CPU's. You can reduce their power consumption significantly just by appropriately adjusting LLC and AC/DC loadline. You can further undervolt them by directly adjusting the voltage.
Comparison really could've used some 12th gen models in there. Those are still widely available IIRC and the jump from 12th to 13th is quite a lot bigger than 13 to 14. I was thinking of upgrading my 12600K to 14th gen, but now that the reviews are in, I guess it'll be 13th gen for me - 13600K or 13700K, I suppose. Hopefully those will drop a bit in price once 14th gen hits the market, but I doubt it. EDIT: Scratch that - because prices. Just took delivery of a 14700KF - which was actually cheaper than the 13700KF and just 4 or 5 € more expensive than the 13700K. For some weird reason, the 13700KF is more expensive than the 13700K, just like it was for the 12600K/KF when I bought my "old" CPU last year. 14700K would've been another €30 on top of the 14700KF, and since I never used the iGPU on my 12600K, I went with the KF. I think the i7 is the one CPU in this "gen" that offers an actual incentive to go 14 instead of 13. At least it comes with a few more E-cores... :)
Just a heads up, be careful putting those flashing lights in the background (@27:27). They can trigger people's epilepsy. Maybe put a warning in the video if you want to continue including them.
I can't believe I didn't get this the first time. The thumbnail for this review is the same as the one for 13th gen with a different number because that's what 14th gen is
@@SB-pf5rc There are tests for that. Well not for the 14900K, but comparing the 13900K with the 7950x at different wattages. The lower your wattage goes, the bigger the lead of the 7950x becomes. If you lower the 7950x to 100w, it is not loosing all that much performance, while the 14900K is getting totally handicapped
After 3 weeks of testing my computer and updating the BIOS I finally cracked it. I ended up using the provoked pawns advice, putting the CPU cooler push pull as intake instead of exhaust. Receded the CPU three times and reapplied thermal paste. Now I have 192 GB of RAM running at 5000 megahertz with a negative.0400 under volt running smoothly with no overheating
UE5 loves V-Cache. The vast majority of upcoming AAA titles are all UE5. The 7800X3D is going to age like fine wine on the best oak barrels at the best spot in the cellar.
@@todddominoes9862 About half of all games announced for 2024 are UE5 games. Yes, the really big umbrella studios have their own engines, but more and more mid sized and independent studios are ditching them. CD Projekt Red is one of them. The Witcher 4 will be on UE5 as they're ditching their own RED Engine. UE4 also contracted quite a few AAA titles. The latest in my working memory is Hogwarts Legacy with a development budget of $150 million. That people don't say what you like, or have ever so slightly different ideas of when the budget is large enough to constitute a AAA title doesn't mean that they're lying.
Jeez, even during the dark ages of 2011-2016 when Intel was just giving us tiny little improvements every year for the same price, they'd do more than this. You'd at least get 10% improvement like clockwork. A well advertised price drop on the 13th gen would have been cheaper and easier while also generating good will amongst consumers. Everyone likes price drops. This feels less like it was done for marketing and more to please investor demands that Intel release a new product in 2023. I find it impossible to believe Intels marketing division thought releasing a nearly 0% improvement part masked as a new generation would garner them positive interest. They know tech press exists and was going to rip them a new one. That's why this smells like investor meddling to me, personally.
Nice video again. Two things: 1. Found small mistake: 3th @ 14th gen core series on Test System Spec section of the video, 2. I am watching content on my LG C2 OLED,and i can see,that your videos have great HDR and color accuracy comparing to other tech youtubers. Especialy,on the beginning of the video,where there was close shots of the processor. Yet,there is no option of description on RUclips,that your video is using HDR technology,can you explain that please? Cheers!
Congrats to i9 13900K owners you already have i9 14900K a year ago.
LOL
💀
🤣🤣🤣🤣
i9 13900KS is basically a 14900K 😂
I almost waited. Glad I went with a 13900k 4 months ago
Small error, the driver version used is "544.43", not 644.43.
Also please note there are some errors in the 5800X3D data.
Spider-Man should be 144 fps, not 122 fps
ACC should be 175 fps, not 161 fps
Baldur's Gate 3 should be 145 fps, not 132 fps
Star Wars Jedi Survivor should be 140 fps, not 129 fps
@@crazybeatrice4555 Really impressive stuff for a 3rd gen product!
You are like intel, inflating the number by 100/1000!
@@shraf2kay skill issue
Maybe future version 694.20 will work best
Unacceptable, unsubing 😂
Intel sets a new level of benchmark... in power consumption.
It's great to see Steve added power consumption benchmark for every game tested.
Especially since Steve initially didn’t think much of power consumption. It took quite a lot of comments and monthly questions before he came around.
😅
It's worth noting that Intels draw significantly less power when at idle compared to AMD. Something like 9w for the 13900k vs 54w for the 7950x. So if you're not a gamer then intel is more power efficient in the long run as far as day-to-day use
@@Barncore I noticed that my Ryzen with MCM design draw more power during idle compared to single-chip Ryzen but not as high as 54W though. More around 20-30 watts. And this is quite a lot higher than the good old Ryzen 7 1700 which surprised me so much in efficiency back then in 2017.
@@eternalwingspro4454 that's the problem of MCM, the chip with single die often better at idle power consumption. If you into that thing then you should pick APU model from AMD (often made in single die). I have 5600G and 5950X, 5600G idle around 10W of power, but 5950X is ~31W.
The difference in power consumption between Intel and AMD is pretty insane. Up to 40% for similar performance.
Depends what you do with it and how you measure. Intel peaks much higher but also idles a fair bit lower.
@xiphosmaniac Well not really, I bet if people looked how much they are actually gaming, vs just browsing and watching youtube, its probably mostly not gaming.
@@bt82 one probably shouldn’t be getting such expensive processors if wattage on idle is a concern. Laptops or anything mobile are an obvious exception tho.
@@Fighter4Street their is a 6 watt difference in idle that's nothing even 24/7 would make it more or less 1.30 bucks per month.
@@coffee7180 This may be true but for most people, like myself who stay about 90% in idle, even though I have a 4090 but just don't game that much, this would really offset any of the gains an AMD would have.
Most people are not running their PC that hard and just staying around idle where the power difference isn't as much as AMD likes to claim and assumes people are running games the whole time.
Either way, I think the intel runs too hot during games and that is why I decided to wait for Intel new lineup of CPUs, hoping they improve on this or at least offer much better performance for that heat.
To me it's very clear what they've been doing, this is NOT a refresh but CPU binning, they actually reserved better yields for the 14th gen while releasing the worse as 13th gen.
they should've binned the entire generation. as in sent it to the trash bin
14700k has 2 more cores though.
Wait a minute. Maybe that was binning too?
Holy shit. Maybe they had a batch of chips that had 1-2 failed cores and decided to release them to as 13th gen.
Rest is for sure just binning.
You could think it might be a tad bit too early for Intel to go back to providing their 2012 - 2016 level of improvements.
Well they have Meteor Lake on Intel 4, but it's just not coming to desktop. So they're making advancements, but they're seemingly a bit rushed and I imagine their capacity on their leading edge processes are being tied up with their HPC/DC parts primarily. Arrow Lake for desktop will amount to basically two architectural generations and two process generations ahead of Raptor Lake. That's what we'll be waiting for and Intel *needs* to deliver on it.
Coming up next Arrow lake, based on igorslab leak it will only bring + 3% ST and 10% MT over 14th gen. Also hyperthreading is gone on arrow lake because they fkd up the chip design but dont want to repair it because that way they would waste even more time on their obsolete and bloated "core" architecture
This is even worse than that. It's 2% performance for ~5% more power. That's abysmal.
Well, should they release nothing at all? Technology takes time.
Be happy for an oppourtunity to get 12th and 13th gen at a discount thanks to this new gen.
@@damara2268that’s a disappointment, I thought after two small improvements generations they would have another big leap in performance like the 12th gen.
Intel was able to release “4060TI 16GB” of CPUs minus the power consumption. Pretty impressive! ;)
Yeah... Intel has been doing it for years. From Sandy Bridge to Kaby Lake, they released the exact same chips for 5 whole years. Until AMD stepped up, Intel released borderline scams.
EDIT: It was a miracle if people saw 3% gains.
it would have been okay if it would be the same price as the 13900k, but it’s more expensive which just makes it laughable
Minus? More like + extra power consumption.
.@gatsusagara6637
Sorry for misunderstanding... minus in the relation to 4060TI power consumption which is actually the only impressive thing about it while 14900K on the other hand as you said...well....
3050*
Intel is really good at selling 3D v cache CPUs
its a new skill lol
It's a feature
a feature, not a bug
They also good at selling i7-12700K chips (the only good intel chip on Amazon's top 10).
Picked one up last week... lol
Great honest review as always. Just a word of caution, the B-roll starting at 27:26 doesn't seem apropriate for light sensitive viewers.
As a 5800x3D owner with B450 motherboard I still see no reason to upgrade to...anything. Good job AMD.
Same here... (except i have a b550)... Waiting for Zen5
@@LordPaulusCobris I wouldn't mind a B550 to get some features but in a nutshell it's not worth it for me. I'll wait for Zen 5 with you and then decide.
I'm running mine on B350 itx :) Still going strong...
Yup. The 5800x3D surprised everybody. Even AMD.
The 5800X3D has been released 1.5 year ago and people make it seem like it's been released 5 years ago
Looks like Intel is back to it's 14nm ++++++ generational performance improvement.
3-4% performance uplift with more then 30% extra power. Yep classic Intel.
Those gains are only for this generation, this is because they couldn't get meteor lake desktop to come to the market so they had to come out with a refresh and as we all know, refreshes are nothing major. Next generation will have moderate - large gains if I'm not mistaken.
They'll take a 2 nodes jump to 20A before the end of next year for Arrow Lake, for the compute tile at least. So not really if you can believe them.
Yep. This is simply 10nm+++
You can tell its the exact same chip as 12th gen because the only way they can get it to go faster is by increasing the power. Im a little impressed. I actually thought last gen was as far as they could push it
@@PaulsTechSpace nah. Next gen is only around 5% ipc increase expected on intel side in those leaks. 😂
atm AMD is 2gen's above intel. Thats hilarious.
@@PaulsTechSpaceso yeah classic Intel
The 7800x3d is a pure diamond for gaming. It is simply perfect.
Not expansive, incredible performance and low power consumption
@@jacktheripper4458wtf are you talking about 😂😂
Not expensive? It's 400-600$ dude. It's expensive
@jacktheripper4458 Oddly enough I've only experienced stuttering with the 3 intel systems I had. I went to amd for the first time with the 7800x3d and I can count the stutters I've experienced on one hand in the 4 months I've had it.
@@jacktheripper4458 did you even watch the video lmao
@@jacktheripper4458There is truth to having stutter on b650m boards. I swapped my MSI b650m am5 board to a quality a620m board from Asrock and all stutters and issues went away. Most responsive pc I've ever had
The fact that you reused the thumbnail from the 13900k review fits perfectly 😁👍
Also, thank you for providing power usage numbers for all Games. Your CPU Reviews are the best
he spent more effort changing the 3 to a 4 than intel did on this generation
The power usage numbers really is a welcome change, one that all reviewers should embrace. Intel is making huge sacrifices to power efficiency to stay on the top end of the charts, and that's at the expense of their customers, who are getting a CPU that is hot, throttling even when water-cooled, and would need to be considerably cheaper than AMD for it to be considered an alternative to AMD's offerings, plus the AM5 socket longevity. As it stands, it has to be AMD for the large majority of people on the market for a new CPU.
@@filipealves6602Yeah, if AMD had more fabs reserved for them at TSMC, the desktop AND laptop side would be a complete blow out for AMD Ryzen CPUs. I mean, the Intel 19-13900K did one thing good: It got me onto AMD's AM5!
@@cameronbosch1213 If AMD had not sold off it's own labs (now called Global Foundries), AMD wouldn't have to wait in line at TSMC.
I really appreciate the way you made this video, showing all of these benchmarks, explaining large outliers, good data, thanks!
Congratulations on the 1M subs Steve and Tim !!! It's well deserved considering the work you guys do. Here's to the next million !
Impressive review. I really appreciate the updated Factorio benchmark and the power draw for each of the games. Much more useful than only comparing power comsumption using blender.
Great work.
Anyone who hasn’t played Factorio yet beware, that game is more addictive than any drug 😂
100% Agreed you see the really good stuff.
More than double the power draw, 60% more expensive, evidently almost impossible to keep cool even with a 360mm AIO and effectively no better in gaming compared to a 7800X3D...
At this point I'd ask if this is some sort of out of season joke, but Intel has taught us all better by now.
It's like AMD's Bulldozer era inversed.
60% more what?
Do you guys not have phones?
except this at least has performance matching AMD at least AMD was nowhere while pulling more power also@@quasii7
@@WayStedYouwhat is your point my dude this comment doesnt make the 14900K magically better cause goddamn
I recently bought the 5800x3d and im just happy to see that its still doing amazingly well in these charts. HOPEFULLY this will last me around 5 years
Game at 4k then you will have no worries 😊
no
Same here. I don't do much fps games. But I am just starting on Starfield. It's a little choppy, but I think that is my 5700xt. I mostly play civ6, and the performance boost over my old 3800x is noticeable.
@@andrewsolis2988 yep. Got a 5800X non-3D and runs games fine as I play at 4K.
@@darthdadious6142do yourself a favour and play a better game, phantom liberty for example
Enjoying how the 7800X3D is still a chart topper.
*in gaming 🐣
Assetto Corsa gains are insane! Even in 4k...
I've got to say, with the 7800X3D, AMD has managed to create a product which I feel delighted to own, over and over again
Yes, its noticeably slower when not gaming, and just a bit more efficient when gaming and if you game at 4K, probably not that big of a difference.
You basically just save maybe $20 bucks a year on electricity, makes you feel good buddy?
@@Fighter4Street Sorry to hear about your hurt butt
also way cheaper@@Fighter4Street
intel also runs way hotter@@Fighter4Street
@@Fighter4Street If not for AMD you would be playing with that 2 cores 4 threads CPU for the rest of your life
Great channel, doesn't favor any brand, just straight honest opinions.
One of the best RUclips channels I'm subbed to.
Intel handed in the same essay they wrote for last semester's final this year and thought we were too stupid to notice.
To be fair some people will fall for it and since they didn't have any better ideas why not rebrand and catch a bunch of people not paying attention.
Why would they change though.
It's basically what they've been doing for over a decade now, with the odd uplift, followed by several generations of refreshes.
@@ultrademigod I made the blunder of believing in the engineer heading the company marketing bull... but you are not incorrect.
So you haven't found out what a "Ryzen 7520" is then...
they used GPT to "shift" things a little. Nothing else.
Feeling good about picking up a 7950x3d for £450 recently, can't believe just how much more wattage intel chips consume - it's insane!!
Really?! Where? I can't believe I paid almost £650 in August for mine! 😯
£450 LOL. Nice work! I'm probably going to go and buy a 7800X3D now, but am still slightly concerned with enabling EXPO as apparently there's still some issues with it.
@@filipealves6602 CEX, technically second hand with a 2 year warranty. It was basically new in the box and runs great.
@@benjamind547 do it! These 3d cache chips are insanely good. Is that just a 7800x3d thing? Expo seems to be all good on my system so far
@@benjamind547 new bios updates mean that the best memory speed that is completely stable is 6000 CL30, no issues at the moment, but there used to be a few months ago
That IPC test was very interesting! Also great job on recognizing the flaws on the old Factorio bench!
Well, at least for once they are launching the new CPUs at the right time - just before winter, so now you can heat not only your room, but your entire house floor with your PC. This is what real innovation looks like.
The difference between the new "generation" and the last generation are the quotes
Here we are at a watershed moment, AMD 7800X3D still on top with the 8000 series still to come 😮
Intel had better have a 15th-gen ready for the Zen 5 launch, because there's no way this is going to be enough. If not, we'll have a quarter of AMDominance, possibly even longer.
I think 2024 is looking strong for AMD, for at least two quarters. @@benjaminoechsli1941
Tempted to buy a 7800X3D now despite their previous issues with EXPO-enabling, but that might just be that some people got unlucky. I'd be upgrading from a 6600K.
@@benjamind547the expo issues have pretty much been ironed out
@@scxrlethouse I did hear Jay from JayzTwoCents mention he's still having issues with random memory retraining and not being able to enable EXPO, which is leading him to consider the latest Intel release.
One positive note. It did make 12 gem more discounted and 13th gen slightly discounted.
That's where the win's at. This release drops the prices of 13th and 12th gen, which is a win.
@@TheKelzyup this is the way. I'm staying on 12th gen til at least 15th gen, maybe even later if things don't pan out
@@jeevejavari8461i think you will want to upgrade to amd's 8000 gen with how things are going to be honest. im on an ryzen 5 5600x amd im tempted to upgrade to the 7800x3d but it's very pricey since i need to upgrade to the am5 platform so also new ddr5 ram. i think im sitting this gen out and ill go straight to the 8800x3d
@@jeevejavari8461Yeah, no reason whatsoever to upgrade from 12th gen (personally have 12900KF). Waiting for arrow lake, maybe it brings finally something new to the table.
This has been my way of upgrading, getting a gen before the latest especially when they're heavily discounted. Don't wanna be the guinea pigs for these companies while still paying a premium.
I don’t think I’m ever gunna regret my 7800x3d, all these videos just increase my bias 😂
yeah should be fine for most games.
i own the 7950x3d and the normal cores are just afk all the time when gaming.
@vinarrun3622 What production? Sucks in what way? People like to throw that word but don't really explain what. I have 3 systems, an i7 12th gen, an Apple M2, and Ryzen 7 2700. All good with my professional job, software engineering. The difference is the i7 heats up way faster, but it does the job.
As someone who has to game on a 7950X gotten for a work-from-home workstation, this just makes me hope the 8950X3D has cache on each CCD, as I imagine that would be a real beast.
I'm good with my 5800X3D for the next 2 - 3 years, thank you very much. 😀
even a 5600 non X will be good for the next 2 -3 years
@@thefurmidablecatlucky3, for sure. I mean, I'm using the 5800X3D with an RTX 4070Ti, at 1440p, on a 75Hz monitor (don't ask), so I limit all my games to 73 FPS for the best low latency experience.
Not to mention that I only have time to play 5, maybe 6 games per year (I'm almost 40, so my 'avid gamer' days have long since passed). 🙂
@@thefurmidablecatlucky3Definitely not true. The massive uplift I saw on a 6900XT going from a 3700X to a 7600X was hilarious. I was so CPU limited that I ended up seeing 50% uplift in some games, and well over 30% in most others.
zen3 (5000series) was a bigger uplift over zen 2(3000) than zen 4 (7000) is over zen 3 so he will still be good for many years. notice how the 5800x3d follows the 7800x3d .. yes one is newer and faster but not by the same gap you will get comparing 3000series chips @@rustler08
@@rustler08that doesn't mean that the 5600 is bad, it's just that 7600x is really good, for the majority of gamer which have a 3070/6700XT or below that 5600 will still be good for the next few years
Thanks so much for the information on the power consumption of these parts; power consumption is a major consideration for me when looking at new parts due to my living conditions (stuffy room that gets hot if my system uses too much power), so this video is a super useful resource for me.
Can't beat the 5800x3D for perf/watt
@@samarkand1585In reality, the 5800X3D gets destroyed in perfomance per watt by the i7-12700K, the i5-13600K, Ryzen 7 7700...
While writing this comment, my i7-12700K is pulling 8 watts and is just Big-Little. Nothing of balanced power plan or anything like that.
No Zen 3 chip draws 8 watts while watching youtube on high perfomance power plan at steady 5GHz.
30seconds in… and we’re already setting our expectations for this new “generation” 👀
Really appreciate the shot at 28:00 "in a nutshell" with the nuts flying through the background! :D Very nice cinematography.
Omg I remember Steve "unboxing hardware" videos and having the time of his life asking for cool stuff from the subs. When this channel had like... Less than 50k subs. Now 1M? Took you guys long enough!
We don't always see eye to eye and I've caught many mistakes here in the past that I addressed and even grew tired of thr channel back and forth, but this 1M is absolutely deserved. Really hard fought. Regardless if our opinion match or don't match, the amount of work put is just incredible and the journey I witnessed is really worth rewatching. I'm very happy to have been a part of this, the good and rhe bad. You guys deserve it!
You handed them a piece of shit with honey on? Really weird!
I hope Intel comes out swinging with their next generation. The ridiculous power consumption of these parts shows they've hit the performance wall.
15th & 16th gen Arrow Lake & Arrow Lake refresh are going to be huge. 1st cpus released by Intel that were created under the legendary Jim Keller. Everything he makes is gold.
TBH they don't really need to, the only AMD cpu that makes sense are the 7950x or the 7800x3d. 14600k and 14700k are far ahead of the 7700x and the 7900x in productivity and also ahead of them in gaming. There is a bit of a gap for them to relax for a year or two provided they keep giving the same minimal improvements.
@@BroderickNorthmoorThat doesn’t mean much when AMD has been siphoning off data center customers. Money isn’t in the gamers, they need to offer more everywhere
@@Dante_S550_Turbothey’re never coming. The delays will continue until morale improves.
They have hit the limit of their process node. Their only choice is to either improve the process ($$$$) or move to different architecture.
I got a 7800X3D because I wanted good gaming performance without having a space heater on my desk. I'm glad I made that decision. Intel CPUs just use too much power and run too hot these last few generations. Of course if you care about productivity tasks then they're the best for you so turn on the air conditioner and enjoy!
Gamers Nexus tested the main productivity softwares, and 7950x still came out on top most of the time
i've just done the same! First AMD chip for me
It's weird that my i7-12700K is an anomaly. I'm using it for a music production PC, it's air cooled and i'm using a Corsair 5000D Airflow case. For gaming, it stays around 45 degrees, on Cinebench R23, rarely goes beyond 70 degrees.
While writing this comment, it is around 24 degrees.
By far the most efficent i7 ever made in a decade.
@@saricubra2867what is your ambient
Temperature is not = power used@@saricubra2867
Pretty much what was expected and is what made me go for an AMD CPU for the first time since the DX2-80.
7950X3D scheduling issues can be solved by setting bios to favor frequency, then using Process Lasso to set games folder to prefer Cache.
The performance is very good after those changes. I think it would be cool to see it be investigated (not saying these settings should be used in benchmark videos)
is better to setup in bios for vcache cores as favor. You get better performance in games and applications.
Agreed. This is the way to use the 7950x3D. The driver/game bar solution is just really inconsistent and checking the "this is a game" check is about as much time as setting affinity in process lasso anyways
Love the new editing bits, also congrats on one mil
Maaaaan I miss intel. These insane power draw and heat output keeps on steering me away for the past 8 years. That’s a lot of heat coming from a gaming pc when you live in a tropical country (like me). But I guess, intel is always considerate of their customers who needs a pc that doubles as a room heater 😭
8 years, or In other words as soon as AMD got it's act together
@@ultrademigod yea I’m expecting their radeon division to get their shit together too. But I’m not holding my breath.
@@aeronybrek4696 Radeon cards are way more competitive than you think. The only thing nvidia has going for them is the 4090, but every benchmark out there is still using drivers from launch and most are using reference cards. Partner cards are pretty considerably faster and the gap is way smaller, but nvidia is getting spanked at every price point except the 4090, which is 170% the price for about 15-20% more performance than a partner xtx. 7800xt vs 4070 is pretty funny, as is the 7900xt vs 4070ti or 7700xt vs 4060ti. Latest beta driver just made AFMF framegen available for pretty much every game and on the 6000 cards as well.
While these stock temps and power are ridiculous, you dont have to run them like this. They can be made to run a fair bit more efficiently, though ultimately they are still a generation behind Zen 4 and TSMC 5nm in this regard. Meteor Lake should be much closer in terms of efficiency.
Just match a new Intel CPU with a biostar radeon 6600xt for all your winter heating needs😂
THANK YOU *SO MUCH* for the IPC graphs! I've been wanting to see this comparison for a loooong time (way before the release of the 14th gen). I hope the others find it useful too, so that you continue doing it in the future :)
As a 7800x3D owner I am laughing all the way to the bank with the extra money I save on my electric bill for better performance 😂
if i point out nvidia gpu power consumption to amd gpu owner they always said that electric bill doesnt matter, yet here we are
@@busetgadapetThe difference being Nvidia GPUs cost more. AMD CPUs cost less 😉
@@redrock425 RIght? There's no point saving $50 a year on power consumption when the GPU costs $500 more.
@@busetgadapeti mean rdna2 used as much/slightly less power than Ampere, its only the latest generation that nvidia has a decent power advantage
@@busetgadapet Maybe AMD CPU owners are not automatically AMD GPU owners? That said, personally I went with an 7900XTX over a 4080 because the price was 20% lower but the power consumption was only 11% higher (and I get 8GB extra), which is, all things considered, a pretty negligible amount.
But we're not talking 11% here. In many instances the 14900k used 30-40% more power for 15% less performance. That's MASSIVE. And observe that Steve was "being nice to Intel" in the sense he compared total system power and not just CPU power. If you isolate it to just CPU power (as Gamers Nexus did) there are instances where the 14900k uses twice the power of the 7950X for practically identical performance.
So you can actually perfectly complain about one but not the other. My GPU uses 35W more than the competition. Not 150-200W like if I had gone with a 12/13/14900k over my 7950X3D.
TL;DR: Having principles can be a good thing. But only as long as those principles translates to tangible differences in the real world. Otherwise it's just benchmark masturbation.
After having intel for 20+years. I m so happy that i bought the 7800x3d. It is soooo impressive with a rtx4090. The latest 14900k is disapointing
Hi, what mobo did you get for it?
Thanks Steve. Would have liked to also see the 13900KS on the charts -- just out of curiosity to see if there's any appreciable difference at all between the 14900K and 13900KS.
there the exact same, bench marks are exactly the same sometimes worse because apparently the 14900k uses a worse memory controller then the KS does.
It is awesome you add power consumption charts to each game. To me, the 14600 efficiency seems to be a small but nice bonus over the 13600. Good for SFF I guess.
It's the only thing standing out for this "generation". Looks like a CPU that's easy to recommend if it's priced well.
Very hard to recommend over either the 7800X3D or the 7700 non-X though. You still get an upgrade path with the AM5 socket.
@@filipealves6602 AMD = bugs might be an issue. Intel = Solid platform
@@ByGraceThroughFaith777 why am I not surprised that shills are spreading FUD on AMD for Intel like Nvidia has been doing for two decades... 🤦🏻♂️
@filipealves6602 I'm not a chill, I have an AMD and an Intel PC, one for regular use and gaming, and the other for VR and Simracing. The AMD lags and stutters and I'm constantly tweaking settings to troubleshoot shit, the Intel/nvidia pc just doesn't. I've built my own PCs since before AMD bought up ATI, my 1st AMD pc was a 2004 Athlon 64. I adopted ryzen 1st gen when it came out, then got an r5 3600, then a 5700x, and today I play on a 7800X3D, gpus I've owned incude an RX 480s, 5600, 5700xt, and right now a and RX 6800, so I have plenty of experience with AMD and it's always the same, so are you an AMD chill for ignoring real users' complaints?
In the UKs top Amazon CPU sales list the whole top ten are AMD, #11 is a 13700K (for more money than a 7800X3D, I mean seriously?), and #12 is a Thermalright doohickey to stop Intel chips bending in the LGA1700 socket after Intel did such a bad job designing it.
#12 best seller being the socket contact frame is too funny lol
a literal contact frame is selling better than intel cpus? LOL
@@TheUltraMineboxYeah that makes no sense as you need an Intel chip to use it....
@@JoeMaranophotographyEach Intel CPUs have their own seperate listing from 12100((k)(f)) to 13900((k)(f)(s)), whereas the contact plate only has one listing.
Edit: spelling
Thanks for including Factorio in your benchmark suite. I hope it continues :) its a great one for CPU performance.
You play factorio
Factorio is amazing!
how about cities skyline 2, its heavy cpu too
great video, the strobing lights on 27:25 isn't the best idea in case of epileptic viewers for the future
7:46 if 1713 is 100% then 1779 is 103,85% so how do you get a 14% uplift? Because you have it right with the i7 and the i5 being 6% and 7% faster respectively. So why is the number for the 14900K so wrong?
bought a 7800x3d few months back, good to see its still the best CPU to buy for gaming.
cpu
That total power draw difference between the 7950X3D and 14900k is obscene, wtf are you playing at Intel!
paying for the privilege of having a harder to cool part with way more power consumption and more dependant on ram to extract meanigful performance out of it. seriously, the amount of extra cash you'd need to spend to get a better performaing 13900k/14900k system vs a 7800X3D system is absurd. you could run the latter at stock with 5600mhz ram, a $40 air cooler and a budget motherboard and have zero negative performance impact whatsoever. the Intel setup NEEDS a high end vrm motherboard, at bare minimum $120 investment into a cooling solution, 7200mhz ram and it will only cost you hours in tweaking and stability testing and literally more than double the power consumption and heat output, to hopefully not lose to the AMD setup in almost every game. Intel i9's are simply not worth the cost unless you're going for wolrd record benchmarking. as gaming solutions they perform well, but not nearly justified cost of ownership.
BG3! Finally been waiting to see a CPU benchmark of BG3 since german reviewer, pcgameshardware did it. V-cache performs really well
I bought a 7800x3d system a couple months ago and no regrets!
Thanks for also testing the non-x3D CPUs!
*YIKES* Good thing Intel is using a hybrid design to keep that power efficiency in check...
On a different note - congrats on hitting 1M subs!!! 🙂
Moving one step closer to outlet max wattage being the limiting factor. "For those running a 15a house electrical system, if you can't upgrade your house, you will be power limited and so upgrading to current gen might not be worth it".
You get a like for the Thumbnail alone. This Intel CPU review making me loooooooove my 7800X3D you guys talked me into earlier. Noice.
How nice of intel to release a space heater right in time for winter! ☃️
Thanks for the new Factorio benchmarks, puts things into perspective!
These are so good the 7800X3D has gone up in price in the UK today from £369 to £380 lol.
Thank you for including all three CPUs in one review. Normally I prefer individual reviews, but it really wouldn't be warranted for these rehashes.
incredible benchmarks as always, and kudos for that factorio addon!
Im a fan of effeciency and couldnt be happier with the 5800x3D
Very happy with my AMD Ryzen 5 7600, such a great value.
Even lowered the TDP in the BIOS, making it use 30% less power for a mere 3% decrease in performance.
AMD FTW
My i7-12700K is an anomaly, i bought it for 330usd years ago. At stock, 5 watts iddle, 150 max on Cinebench R23. Infinite turbo as well, 70 degrees on Cinebench R23 with air cooling and a Corsair 5000D Airflow...
This is literally the best i7 ever made in a decade, the only weakness is the crappy DDR5 memory controller.
What a bloodbath! I hoped Intel had learned from the 11th gen disaster.
Another full year of no competition for AMD is disappointing.
Let's hope Arrow Lake will be on time next year at least.
3% Improvement, 20% power consumption again. 😂🤣
Yep, and the 7500f looks like a nice entry point....
I'm glad to have sidegraded my 13900K for a 7950X3D. I use the PC for gaming and for work. Code compiling on 13900K was very bad in terms of performance, power consumption and noise. 7950X3D is nearly silent during a C++ firmware compiling and it runs so much faster thanks to the 3D cache. My compile time decreased from 3.30 minutes to 2 minutes. I feel that AMD is a much more stable and "calibrated" platform over my previous 13900k. With 13900K the first AV2 load makes the CPU throttle like crazy, with 7950X3D no downclock happen during heavy AV2 loads.
Side grade? You essentially had to build a whole new computer, no? New mobo, cpu, ram. I guess everything else can be swapped over but still, that had to run at least 1000 to swap.
This does make me feel better about going with the 7950X3D over the 14900k. Still happy about going with it over the 7800X3D as well.
@@beeman4266If he can save almost 50% in compile time for work, its well worth it.
This doesn’t make any sense, you build like a whole new pc for no reason
@@B1u35ky If he needs to compile his code many times a day, work-related, then he saves a tonne of time. Those extra money for ram, MB, etc., will be saved in a few months max.
Steve would you guys be able to change the graph colours a bit to make it easier to use
Like 14th gen is royal /dark blue
13th gen light blue
Ryzen 7000 dark red
5800x3d is light red or orange
This would be dramatically more helpful
I have new settings which I am using for my i9-13900k and i9-14900k. I don’t like the uncertainty of undervolting so prefer working with power and clock limits. With these settings I got a couple of FPS BETTER on CP2077 and Forza Horizon Benchmark, went from 99th to 98th percentile on PC Mark Extended, passed the Time Spy Stress Test, Time Spy was within a whisker of the average for my hardware, Cinebench R23 finished between 37-38000, CPU-Z, Intel Diag tool and Prime95 was stable and < 90 degrees.
MCE off, PL1 and PL2 limit to 225, limit P-core boost to 5.3 GHz and E-core boost to 4.0GHz, and use balanced power profile in Windows (although I do disable core parking to keep system highly responsive). Oh and just XMP on the RAM. I didn’t change LLC values.
With these settings, you should have no stability issues, be able to run on an air cooler like the NH-D15 (what I use on my 3 i9 13th/14th gen systems) and will barely notice any performance changes in gaming or productivity.
Great review. Top marks Hardware unboxed team.
Great to see the 7800x3d kicking butt.
*"The power of the Sun... in the palm of my hand!"* - Intel Engineers when they are making a new i9.
its more like... "the heat of the sun..."
Interesting that 5600G is in the top sellers. I love that APU for ultra small form factors.
And that despite PCI 3.0 ...
It's really good for general work. If you have a relative asking for a simple build for general workload, just give them a 5600G with 16 or 32 GB RAM. Set it and forget it. That's what I did for several people I know. It's not like it can't play too, most of these people play simple games.
@@bassyey Indeed! And additionally it has very good thermal efficiency! If you wanna a super silent build, a Noctua low profile is perfect!
If I pair the 14900K with my 7900XTX, I won't need my furnace to heat my room this winter. :)
Actually the 9900K + 7900XTX is doing a pretty good job of that now.
An AMD CPU is the next upgrade.
cpu won't matter much when gaming on 4k
@@SwAT36 it will if you want high refresh rate in older games
@@SwAT36 Not much, but some. I see it the perf tool where occasionally the 9900K is the bottleneck. And If I try to use FSR to upscale from 1440P to 4k the CPU bottleneck really shows. I thought it would boost the FPS but it doesn't and CPU pegs at 100% and GPU drops to 75%.
also the platform my 9900K uses is behind in DRAM and SSD tech.
This is playing TLoU.
Yeah my 9900k is trying to burn its self through the motherboard recently.. may need changed out lol
@@SwAT36 I upgraded from a 9900k to 13900k and honestly I did find a decent performance increase in 4k, and especially just overall desktop performance, was worth the upgrade. The 14900k tho...
i feel better and better after snatching a 5800x3d last year as the last hoorah of the AM4 socket.
This might be the best review I've seen for the 14*00K CPUs. Good job
I think it's also fair to point out that if you're air cooling these intel chips (or using a smaller, less powerful clc) you will NOT be getting the same performance out of them as in the testing. They WILL throttle at lower clocks.
and i can add that even watercooling wont work if you go for a radiator with a smaller size or a lower fan speed.
so yeah.. an additional 100 bucks in cooling, coil whine and high fan speed noise on top of the worse performance.
best deal ever
@@robinspanier7017 even watercooling wont work for Intel chips?
The thumbnail absolutely fits this CPU release. 😂 Thanks for the review and including some new game titles for benchmarking.
The way the CPU's are going with regards to power and temp we should see a Kettle fitting to a PC' case so we can make a cup a tea when we are gaming 😁
4:19 small typo on 13th, missing a “1”, great video!
This is not wasted time in testing, even tho it feels a bit that way... is 2 in 1 so called new cpu release and a wonderfull up to date revisit on up to date bios, drivers etc with relevant games on time for holiday season. You probably help consumers more than you realize Steve.👍
Im so damn happy I bought the 7800x3d.
I wonder why the difference on the ranking for the 5800X3D in Cyberpunk and Baldur's Gate 3 is like that compared to what Gamers Nexus reported on it. On their testing this processor was on the top part of the chart for the most part, so I am curious to know on why this happens.
Yeah, was wondering about this as well.
GN's result has the 5800X3D with a 40FPS uplift over these results.
Perhaps different locations, Ultra vs. Medium on GN's side, maybe SMT wasn't enabled for the 5800X3D?
@@saikousocial RAM speed. HUB runs all AM4 CPUs with 3200CL14 and Gamers Nexus run with 3600CL16 (IIRC).
@@akirafan28 Yup. Typo. Edited.
I recently chose to go with the 7600x over the 5800x3d after seeing it was cheaper to go with am5. 7600x = 199. 32gb 6000 cl30 =$95. AM5 Mobo= $80. Total am5 upgrade was $255. After selling my old hw for $120.
Unless you got your old PC for free. AM5 cost you what you paid for it. Guess we're all prone to those kind of mental gymnastics as justification lol
I bought my 5800X3D for around $255 and that's what I paid for it. Even though I sold my older processor for around $100
Should've gone for the 7600 non-X.
Cheaper, comes with a cooler, and is more power efficient.
@@ij6708 If the new components replace my old ones and I no longer have a use for the old ones, I always sell them and deduct the proceeds from the upgrade cost. My old Mobo's vrm was also too weak for a cpu upgrade. I was sure to get a an AM5 board with a good vrm this time around
@@zoopa9988 Actually the X was $30 cheaper. I wouldn't use a stock cooler anyway. The X is also the superior chip as it has higher clocks out of the box. It can also be limited to 65w and slower clocks like the non x
@@meh-tc6vv If it was cheaper for you to get the X version and you didn't wanna use the stock cooler then I fully understand.
The stock cooler is totally fine though, the non-x only uses 65 watts, 45 in my case.
I wouldn't call it superior though, they are basically the same, ain't no difference. It's just that the X version is way less efficient unless you also lower the TDP. Just like me with the non-x I can just increase TDP from 65 to 95/105 TDP and get the same performance as the X version.
I would say the X version is basically the non-x with a bad TDP.
Was that a cancelled yawn at 15:32?
Thanks for including the power draw charts! I'm going to upgrade my i7 7700 to a ryzen5 7600x soon, power draw is quite important too. When gaming for some years, those extra kWh probably add up.
Why downgrade from 8 cores to 6 cores?
@@saricubra2867 my i7 7700 has 4cores (8threads), doesn't support Windows 11 and support will endofservice in march 2024 ;)
7600x is +- 3times faster on most gaming and prod. benchmarks i've seen. The Intel i7 7700 is quite old so it's time for an upgrade
I7 7700 has only 4 Cores ;)@@saricubra2867
@@saricubra2867from 4core to 6core 😂
And the 3D Vcache parts are also much more easy to undervolt, because they have to be binned for power due to having difficulties cooling otherwise.
I can run my 5800X3D at -100mvtt baseline and then add -30 PBO2 tuner on top of it (yes that is actually 100% stable). I can't measure it accurately, but its 60-70W tops, running 60-65 degrees with a 120mm NH-U12S at 80-90% load in Cyberpunk's dogtown.
If you then compare efficiency...well. It's embarassing for Intel.
In idle use (like browsing or similar stuff), Intel CPUs are more efficient. During full load tests, AMD CPUs are more efficient.
Is -30 supposed to be impressive? Both AM5 CPUs I've had all did this just fine.
There is no difficulty cooling X3D parts. I wish this absurd myth would just die. The stacked cache die lies on top of the existing L3 cache. It does not cover any part of the cores themselves. There's no extra silicon above the cores, so these parts have the same thermal transfer as non-X3D parts.
You can do the same thing with Intel CPU's. You can reduce their power consumption significantly just by appropriately adjusting LLC and AC/DC loadline. You can further undervolt them by directly adjusting the voltage.
@@rustler08 -30 with an offset vcore of 100mvtt on top of it. Read maybe? And yeah for zen3 thats good.
Comparison really could've used some 12th gen models in there. Those are still widely available IIRC and the jump from 12th to 13th is quite a lot bigger than 13 to 14.
I was thinking of upgrading my 12600K to 14th gen, but now that the reviews are in, I guess it'll be 13th gen for me - 13600K or 13700K, I suppose. Hopefully those will drop a bit in price once 14th gen hits the market, but I doubt it.
EDIT: Scratch that - because prices. Just took delivery of a 14700KF - which was actually cheaper than the 13700KF and just 4 or 5 € more expensive than the 13700K. For some weird reason, the 13700KF is more expensive than the 13700K, just like it was for the 12600K/KF when I bought my "old" CPU last year. 14700K would've been another €30 on top of the 14700KF, and since I never used the iGPU on my 12600K, I went with the KF.
I think the i7 is the one CPU in this "gen" that offers an actual incentive to go 14 instead of 13. At least it comes with a few more E-cores... :)
That Glorious thumbnail strikes again
Just a heads up, be careful putting those flashing lights in the background (@27:27). They can trigger people's epilepsy. Maybe put a warning in the video if you want to continue including them.
I can't believe I didn't get this the first time. The thumbnail for this review is the same as the one for 13th gen with a different number because that's what 14th gen is
Since I "unfollowed" LTT I have been enjoying this content more consistently than in the past.
Cheers for great informative content 👍
i would love to see some content comparing AMD vs intel in a power limited scenario. like cap both cpus at 100w or something.
What do you expect? It‘ll be a bloodbath with intel being completely annihilated in games.
@@achdubloedesau that would be fine, but i suspect that's not the case. hence the suggestion.
it'd probably get oblivioned in productivity though.
@@SB-pf5rc
There are tests for that. Well not for the 14900K, but comparing the 13900K with the 7950x at different wattages. The lower your wattage goes, the bigger the lead of the 7950x becomes. If you lower the 7950x to 100w, it is not loosing all that much performance, while the 14900K is getting totally handicapped
8 cores + a handful of cinebench accelerators for ~700€...
🤣🤣🤣🤣
After 3 weeks of testing my computer and updating the BIOS I finally cracked it. I ended up using the provoked pawns advice, putting the CPU cooler push pull as intake instead of exhaust. Receded the CPU three times and reapplied thermal paste. Now I have 192 GB of RAM running at 5000 megahertz with a negative.0400 under volt running smoothly with no overheating
This vid has convinced me that going the 7800x3d route for me, for the games i play, is the best choice for me. Thanks for the in depth testing.
UE5 loves V-Cache. The vast majority of upcoming AAA titles are all UE5. The 7800X3D is going to age like fine wine on the best oak barrels at the best spot in the cellar.
@@andersjjensen The vast majority of AAA titles are on their own proprietary engines. Stop lying.
@@todddominoes9862 About half of all games announced for 2024 are UE5 games. Yes, the really big umbrella studios have their own engines, but more and more mid sized and independent studios are ditching them. CD Projekt Red is one of them. The Witcher 4 will be on UE5 as they're ditching their own RED Engine. UE4 also contracted quite a few AAA titles. The latest in my working memory is Hogwarts Legacy with a development budget of $150 million.
That people don't say what you like, or have ever so slightly different ideas of when the budget is large enough to constitute a AAA title doesn't mean that they're lying.
Crazy I can run a 4090 with my 5800X3D on a 650w PSU. No way with an Intel CPU
Try 7800x3d 20 watts lower
Jeez, even during the dark ages of 2011-2016 when Intel was just giving us tiny little improvements every year for the same price, they'd do more than this. You'd at least get 10% improvement like clockwork.
A well advertised price drop on the 13th gen would have been cheaper and easier while also generating good will amongst consumers. Everyone likes price drops. This feels less like it was done for marketing and more to please investor demands that Intel release a new product in 2023. I find it impossible to believe Intels marketing division thought releasing a nearly 0% improvement part masked as a new generation would garner them positive interest. They know tech press exists and was going to rip them a new one. That's why this smells like investor meddling to me, personally.
Let's Go again AMD Unboxed.
Solid reviews as usual, but I just wanted to say kudos for the continued improvement on data representation and visualisation.
Nice video again. Two things:
1. Found small mistake: 3th @ 14th gen core series on Test System Spec section of the video,
2. I am watching content on my LG C2 OLED,and i can see,that your videos have great HDR and color accuracy comparing to other tech youtubers. Especialy,on the beginning of the video,where there was close shots of the processor. Yet,there is no option of description on RUclips,that your video is using HDR technology,can you explain that please?
Cheers!