Correction: The 12400 is listed as "i5-12400 (6P/6E/12T) [10/24]." It should be listed as "i5-12400 (6P/0E/12T) [10/24]." This is a specification listing error (name entry error) that has no impact on performance or results. Our apologies for the error - we're adding it to our Corrections page and have added an in-video pop-out correction card as well. Watch our Best CPUs of 2024 Video! Went up yesterday. ruclips.net/video/Zue30tcu0mY/видео.html Grab our brand new dice sets! We are donating 10% of ALL dice sales to Cat Angels, a local cat shelter near us, through November 21, 2024. Grab the SNOWFLAKE DICE: store.gamersnexus.net/products/snowflake-full-tabletop-mtg-dnd-premium-dice-set-7-piece-dice-wooden-box-cat-card Or the e-waste INDUCTOR DICE: store.gamersnexus.net/products/inductor-full-tabletop-mtg-dnd-premium-dice-set-7-piece-dice-wooden-box-token-card Also, cases and coolers on the list for the next week! Though we'll have a news episode likely before that.
@@Y0Uanonymous Good request! Haven't done that test in a few years. It can actually be pretty different. Maybe it's time to try again -- any requests on which CPUs you want to see tested on the Intel side? AMD seems obvious -- 9800X3D. What about Intel?
Also, I like this "look back" style video. It lets people know about what they'd get from an upgrade from a different perspective than normal. I think this is a great idea.
Yeah definitely agree! Plus I'm sure there's a market out there that prefers/doesn't have the budget for something new and is looking an older generation flagship used product, this is great for that :) I bought my 12900k used last year, was half the price of a 13900k brand new and considering I didn't really need the extra performance, it was a lot easier to justify the upgrade at that price point haha :D
i just put together a 12100 'look back' build, paired with an rx6600. it's my media pc, and i figure i'll be able to 'look back' at games from 2022 and before forever, while it delivers a pretty good media storage/streaming experience. i was using my old workstation for the task, but goodness that thing ate power while idling.
Yup. I'm still rocking a Ryzen 3600 because the increase in productivity performance isn't worth the money for me yet. But I'm always monitoring the price lists.
You were living dangerously, if you watch bryans series from techyes you'll find that intel made BIG sacrifices to achieve the P and E core layout with the source reason being confusion and marketing. There's latency everytime you wanna do something spontaneous ranging from clicking on a random mp4 in file explorer (11th gen faster) to stacked spontaneous tasks with stacking stutters disrupting workflow. This series is propagated by gamers to being the best entirely because games aren't affected by this and most gamers don't do much else other than game or aren't smart enough to differentiate from that toxic ecosystem. You did however make the right decision at the time luckily because Intel had a direct hand in messing up AMD's interaction with Win11 and only recently when Intel was extra vulnerable did the major bugs get fixed, but it was even worse before when AM5 was unstable as a whole for almost a year. Now however, there is no reason for anyone to go 12th unless you get a steal second hand.
my 13600kf got unusably unstable and intel refused to rma it. luckily, my seller just refunded me almost 2 years after i purchased it. straight up replaced it with a 12700k for half the amount that i once paid for the 13th gen.
Got a Dell laptop with a 12700h (7620 plus). Was very unstable the first year, but after 1~2 years of firmware updates, i finally reach uptimes of 7+ days :) still a very fast laptop
@@GamersNexus I think it's one of the most informative types of charts you can possibly put out. At the end of the day what people want to know is "is x better than y considering their prices?" and an equivalency chart in different challenges answers that VERY simply. Then it's just down to the viewer to compare the prices.
@@GamersNexus I really like it too, I think an aggregate row at the top or bottom that states the overall average closest CPU would make it easier to reference in a general sense, before looking at exact neighbours per title.
I wonder if GN could add a DB on the website where you could, input a CPU and it would provide a grouping of comparable items? Very much like these charts but always available. Sorry. I don't think I'm expressing this idea well at all. 😳🙃 But I also love the charts. 🖖🏻🤝🏻
Agreed. Everything can be distilled to numbers but the chart gives an interesting perspective and sort of serves as a TLDR if you feel like it. Good stuff.
It’s wild how the Alder Lake CPUs, especially the 12900K, have managed to stay so relevant even as newer generations keep rolling in. Intel really hit a sweet spot back in 2021, balancing decent performance, new tech (DDR5, PCIe Gen 5), and reasonable prices. Even now, CPUs like the 12900K and 12700K are holding their ground, performing like modern i5s in a lot of cases. That’s a pretty good deal if you snagged one for $120 in a Best Buy clearance. What I love about Alder Lake is the flexibility! it works with DDR4 or DDR5, so you don’t need to throw your whole build into the trash if you want an upgrade. Plus, it shares a socket with the newer chips, which is rare these days. You can slap in a 14900K if you really want to push your system, though most people probably don’t need that kind of power (or heat).
Agreed on all of this! This is what was exciting about it back at launch. The in-socket changes is also uncharacteristic from Intel, at least for this wide of a performance gap. That you can go from a 12900K to a 14900K in the same socket is what we need more of, especially from Intel (given AM4's track record).
but with ddr4 performance you will suffer performance regression by some margin maybe 8-10% compared to ddr5 so if you still want to build gaming pc with ddr4 5800x3d, 5600x3d etc.. still kicking ass sometime fell strange to see those cpu on top of the charts
Built a system with 12400f last month and just helped a friend building a gaming PC with 12600kf. Alder lake is the best bang for the buck for LGA 1700!
12400f paired with either rtx 3070 or any other equivalent gpu is still a very potent 1080p - 1440p gaming machine and will be for quite a while as well.
You will be good to go for next two years with that combo. That GPU is a beast, RX7800XT < RX6900XT < RX7900GRE but very little margin. I personally call them "cousins".
Got mine for 700$ just as the crypto boom started to slide and since it’s been a great performer at both 4K and 1440P. Easily one of AMD’s best flagships once the price came down a bit
Went from a 2600k (2011) to a 12600k! 11 years later. It's still going strong 2 years later can't see a need to upgrade, although I don't think I'll get over a decade of it like the 2600k goat. (Which is still running! But spending its retirement in a sims4 pc in our lounge for my wife)
I reckon that the 12th gen might last even longer than the 2nd gen. Back then we saw substantial performance jumps between generations, but it's been very lacklustre lately. If there are no major architectural changes that enable large performance jumps, I expect to get the same 10 years out of my current i5-13600K much like I did with my previous i5-4670K.
@@engrammiyup....12th gen to 14th gen are essentially the same processors with just more clock speed and the new ultra series are on par or even weaker than 12th gen chips. I think it will take another 3-4 generations to see any noticeable performance improvements.
in 2019 I also upgraded from an I7 2600...CPUs likely will never "live" as long as back then. Thats because PC gaming has changed a lot. For me personally gaming on 1440p 165FPS makes going lower than 120 basically impossible. 60 FPS feels choppy to me and sadly higher framerates also means you need a better CPU. Back then all we had was a 60Hz screens for ages, this is no longer the case
I too updated from my 2600K (2011) to a 9700x 13 years later. I couldn't justify a purchased from Intel given their recent behavior, from both product (13th/14th degrade fiasco) and individuals at the company executive level.
Same here!!! I had a i7-2600(non-k) from a Dell Optiplex 990, the 12600k is soooo nice, especially the ability to banish an app to the E-cores with Process Lasso.
Honestly, I think the X79-era was the peak of modern DIY/enthusiast computing. While six cores is standard now, quite a number of people still had dual core processors in 2011, and it wasn't until, what, 8th-gen that standard desktop i7s moved from quad to hex core chips, so the idea that you could, for ~$600 USD pickup an unlocked, six core HEDT processor -- that was still on the exact same architecture as the consumer-consumer parts, mind you -- was great. Plus, it came with a quad channel memory controller, something we still don't have on standard desktop parts, and FOURTY PCI-E 3.0 [1] lanes direct from the CPU. On top of that, it was funny, I remember buying a Rampage IV Extreme, which seemed absurdly expensive at the time at like ~$400 -- which feels barely above average now -- and not only did it have all the usual xoc features, a seven segment display (wow!), but it included a PLX PCI-E switch, in case those 40 CPU lanes weren't enough. And it wasn't just Intel; while Bulldozer may have been a flop, for several months, the absolute fastest consumer GPU on the market, by a solid margin, was the newly released 7970 -- which was "only" $500, not $1500. [1]: Officially, I think Intel still claims PCI-E 2.0, but it was totally capable of 3.0.
I'd say they've been legendary since the Core 2 days, peaked in the mid-2010s and then Ryzen's been destroying them in gaming since. Not sure why it's not possible to have two manufacturers at once producing baller CPUs and having proper competition at all times. It works in GPUs, right? There is very strong competition between AMD, Nvidia and Intel. Rarely does one manufacturer pull ahead massively. Sure, Nvidia is a market leader, but that's more of a brand image thing. Probably same as Intel in CPUs.
If I were forced to choose an Intel CPU this very minute out of all of the generations available, it would absolutely be 12th gen. 13th, 14th and "15th" gen all have anxiety written all over them. 12th gen has a proven track record of longevity, reasonably good performance, and now, they're dirt cheap.
@@GODFADED That's pretty high, your cpu is liquid cooled? I OC'd mine as well for a while. I got great results high fps in fn and notably less micro stutters and fps stability but the cpu got pretty hot talking about 85°. I have a dark rock 4 cpu cooler and it was barely keeping it from throttling when in heavy load.
I've only upgraded to a 12600KF like a half year ago after using a 12100F for 2+ years which served me really well for my use case so the 12600KF will defo last me a few years.🙂 The upgrade did not cost me much since I've also sold my 12100F to the same person who I've bought my 12600KF from. 'second hand but basically brand new CPU' Gave it a healthy ammount of undervolting in the BIOS and a cheapo contact frame and this way its very easy to cool even my budget ~35$ tower cooler is able to keep it under 60 celsius while gaming at stock speeds. 'B mobo so even if I wanted to I can't OC but since it was cheaper than a 13400F I've went with this instead cause its faster even at stock anyway so the 13400F made no sense to me over this'
@@bracusforge7964 True honestly been thinking about upgrading to a 360mm AIO. I think I got a decent chip that can overclock quite well. But the issue is keeping it cool. I started Dragon age and while building shaders thermal throttled. And forget about running anything AVX. Imagine I could always run an offset but what fun is that. 😜
One thing to consider about 12th Gen is it is now really the only option for ITX builds at the moment (excluding 13, 14th for obvious issues and heat and 200 series for price to performance). Am5 boards are fewer in number, more expensive (in Europe at least) and limited io compared to their 1700 counterparts. The amount of AM5 itx boards with only 1 graphics port on the rear io is ridiculous! Considering ITX is the only form factor where you are most likely need to use the IGPU if you need to use the PCI slot for something other than a graphics card.
You're already on thin ground when you want AM5 integrated graphics capable ITX - found 11. A quarter of those has 2 ports: * Gigabyte B650I AORUS ULTRA * Gigabyte A620I AX * Gigabyte B650I AX (You can see the trend, no?). But you also can just compromise and get a usb-c to hdmi adapter (mini-docking station) for a few bucks, shove it in of the 40gbps backports of an Asus ROG STRIX X870-I GAMING WIFI or equivalent. We're in PC world, mate - this should work just fine.
Got my 12700k at launch. I’m very tempted to go for the 9800x3d but will likely sit out a few more cycles. Just seems smarter to use that money towards a better gpu and upgrade when 9800x3d drops in price or better CPU’s come out.
I'm actually feeling the opposite, many of the games I play are more CPU bound and seeing a roughly 50% diffy between the 12700k and the AMD X3D CPUs makes me kinda want to upgrade. Kinda regret going Intel for my current build. Wish I had gone AMD so I didn't have to replace MOBO and stuff as well.
This just summed up for me to hold on to my 12900k for another 3-4 years as im not bottlenecked at 1440p with my 4070, loved this type of video super informative and just took all my doubts
Interestingly, the 12th gen's DDR4 support makes upgrading complicated. If you have a DDR4 motherboard, an in-socket upgrade from 12th gen stops making a lot of sense.
Just bought a 12th Gen machine few months ago. Dirt cheap, blazing fast. Don't really feel like I'm missing out on anything. No stability issues whatsoever. Even the integrated graphics is fast enough to play games in HD. I wasn't expecting it to preform this well, but my next upgrade will be an AMD. I'm bailing on Intel because of the last 2 gens.
Welcome Comrade! I just built a 12900k system, upgrading from my old 4790k build. It's the first new build I've done since 2008 (for myself). I just kept upgrading for 15+ Years. Not gonna lie, it was a little emotional moving the rig I've been building & working on for all those Years move to secondary duty- there's literal blood, sweat and tears in this new PC! I love it though- totally worth it! And I'm VERY glad it's over and I've had time to get comfy with it. It's one thing building a secondary PC or one for someone else but it's another thing to build my own PC, at least for Me.
@@Chaos_God_of_Fate I used to care about hardware. Always had dual GPU's and top of the line everything. My computers 10 years ago already had 256GB of RAM and then I suddenly stopped caring at all. And I just retired a laptop recently that was 12 years ago. The battery finally died and it doesn't work without it. I would've kept using it otherwise. I bout a 12th gen as a replacement because I had some reward points and the computer on offer ended up being practically free. All that about one week before Intel 13th and 14th gen defects became public. Fortunate buy that I'm still enjoying. I hate the idea of Microsoft forcefully retiring Win 10 that most of my computer fleet (that I use for work) operates on. The move to Win11 would make a whole ton of those 256GB 32-core beasts next to useless. Sure, there is a registry fix and it is possible to get a hardware board that makes the computer security compliant but it's all expenses out of nowhere on something that was considered fine only yesterday. It's BS. I can dodge Intel and go with AMD, if the brand messes up, but what about Windows? I'm completely stuck. Macs are overpriced and Linux is useless in a work environment. Most software is for Windows... 12th Gen intel is the last system I'll be enjoying, even though it already feels like I don't own the PC, and simply renting it as a service from Microsoft, who are worse than spyware... blah blah blah... I'm rambling.
As a weirdo with a 12600k+4090 system I appreciate this. Originally the thought process was "4k = max gpu, midrange cpu" but in retrospect it was too optimistic with a 12600k. All told, coming from a "if it works, I'm happy" background frankensteining PC parts together my system's still WAY more than enough, but a cheap 12900k wouldn't be bad. I didn't realize they'd dropped in price that hard. Cool vid!
I got a 9700k with a 3090 🤣 I am thinking of upgrading to 12th gen, still haven't decided if I want an i5 or the i7 but the i7 has been looking tempting but I'm short in cash so I'm still thinking about it
Have the same setup, similar thought process. I don't need the highest frames on 4k, 120+ us ideal but I'm fine with around 80+. Optimisation in games are atrocious now but works really well for most of my use cases just hoping MFS2024 will run well with it.
Love my 12700k. Picked it up, around a year ago or so, when a z790 board and 12700k combo was first available for $300. No regrets. I was able to get 5.3/ 4.3 OC on all p-core/e-core Respectively. With 4.4 oc on the ring ratio. Unfortunately, it reads ~230w of power with peaks at 250w in multicore CINEBENCH 24. Thankfully, all I do is game, so it's a non-issue. Use a 280mm thermalright liquid cooler. Zero issues.
@@WSS_the_OG certain emulators, such as RPCS3 make heavy use of it and it's night and day for some of those scenarios, but outside of the emulation scene, I'm don't think you'll find much in the gaming world, except possibly some random decompression libraries for assets or whatever, actually using it. However, it was nice being able to write and test AVX-512 routines as a programmer without needing to use Intel's emulator or purchase something like Sapphire Rapids, etc, and still getting best-in-class single-thread performance and all of that jazz. Similarly, as you mentioned, a solid number of video encoders make decent use of it. And contrary to what people might expect, unless you're encoding several things at once, or possibly splitting a video into segments, encoding them in parallel, and then trying to stich them together afterwards, you'd actually see faster encoding/render times on something like a 12900K/7950X than any EPYC or Sapphire Rapids [non-WS] parts -- most of these codecs can only efficiently be parallelized up to a certain point, and after that, you typically run into either too much overhead or worse compression ratios, so having fewer, yet significantly faster cores often wins out. And then there is also the scientific simulation realm, CPU raytracing, as well as a variety of database software (ClickHouse, for example, and I'm sure others) and [as noted] various compression/decompression and data marshalling libraries -- e.g., zlib-ng, simdjson, etc. While, at the time, I'd probably take something like a w7-2495X if I had the chance, there's just such a monstrous price difference between desktop and "HEDT" nowadays, so it was nice to have the ability to use those features when it made sense without paying $2K+ for the processor, $1K for a motherboard, and god knows how much on DDR5 RDIMMs. For now, I'll live with my ghetto Taobao 12900HX and Asus W680 board I repaired and got stupidly cheap since some poor soul must've immediately dropped their processor into the socket and bent half a dozen pins.
@@WSS_the_OG As the guy above you mentioned: emulation is a big thing (as certain instructions need to be ‘spoofed’/reimplemented which works much better on an architecture with more of them; hence ARM→AMD64 is much easier than vice versa). Also some forms of rendering that reach the limits of AVX2 not dragging them down.
I built a system when the 12600k came out (it was only $200 in my country). It has never crashed, stuttered and performs so well with my 3060ti. Perfect computer for 3 years now and I am really happy with it.
I helped building 2 Intel system this year. One with a 12100 ($94 at local retail) and another with a 12400 ($114). Both times the users were very happy about the upgrade. The 12100 which I personally worked on, booted blazingly fast compared to my 10400 rig. ADL is still super competitive in 2024 for low end and mid range.
This is also our concern. AMD has shown with the 5000 era that it will drift toward high prices and out of as competitive a position if Intel allows it to, which makes sense. What we don't know yet is if they'd also stop innovating like Intel did back in the Skylake era.
To be fair 10/7nm was held back by the invention of the UV process and Intel did miracles with 14nm, the scummy part was "4 cores are enough for consumers" and charging insane prices for HEDT and unlocked CPUs. AMD has already been on that train with 8 big cores per chiplet for like... 10 years?
@@GamersNexus I mean I like AMD but it was so obvious they'd do it if Intel lets them. After all they are business and no business/corporate is our friend.
@@plamen5358kinda? 3000 series used 4 cores on the CCXes which were, it I'm not mistaken, the base chiplet and then it slapped them together in pairs to form the CCDs. I think that that's one of the main reasons why the 3300X was quite a bit better than the 3100 while both being quadcores: 3300X was using the whole 4 cores on a CCX while the 3100 was 2 CCXes with 2 functional cores each. CCXes dropped off from relevance on the 5000 series, where the baseline was the octacore CCD, also the reason why it was such an uplift, since there was no latency issues brought by CCXes communicating with each other.
@@plamen5358To be fair to both Intel and AMD, there's definitely diminishing returns to sticking more cores into a CPU in terms of usefulness. Few, if any games and not a lot more apps can use more than 16 threads (a lot fewer than with 8 threads), so if you're not planning to use your PC for production workloads like heavy 3d modelling or video editing (which 95% of PC users don't), the offerings out on the market right now are perfectly sufficient.
Happy to see this video pop up. Just bought a 12900KS after multiple days of research for £270. The 13th and 14th Gen have FAR to many issues and not worth the hassle, stress and potential problems. The 12900KS also has much better resale value. I did look at the new 285k, But those in the UK are around £580 and unknown longterm. ❤❤❤
@@justmatt2655 yep 8th gen is rock solid if somewhat dated,,, I still use an 8th gen i7 as a backup to my newest 12th gen rig, looking forward to doing an AMD build after the world settles down on the new 9800X because Intell just aint got it with thier last 3 generations if you ask me.
Been using 12400 since 2021 and it's been flawless. Upgraded my 1080p rig to 1440p with 4070 super which it doesn't bottleneck, I play single player games mostly. I really don't see myself upgrading to 13th or 14th gen, anything worth upgrading to consumes 3X the power. I'd probably hold off till AM6 or AM5 if games I play start giving me a headache. Thanks for the video! It really solidified the belief that I shouldn't be entertaining FOMO as long as the PC performs as well as it should!
This video is awesome in a bittersweet way. Not only showing off that 12th Gen truly was a marvel, but also what could have been the 13th Gen if it wasn’t for the fact Intel fumbled the bag with the instability. Seeing my 13700K up there beating out the 12900K and being competitive was great to see but.. man. I’m unsure how damaged it is as of now. The BIOS update did actually stop my Windows from soft-locking, and my brother’s 13600K was affected, and that’s apparently rare. Thanks for the hard work as always though, great watch 🫡
Currently using my i5 12400F to run games at 3440x1440 max settings with my 7900 XT and I couldn’t be happier. Here’s not wasting money on an overkill CPU for gaming!
usually amd gpu have less driver overhead so even with weaker cpu, in the cpu bound scenatiors you ll have more performance compared to an nvidia gpu hardware u. made a video about that show you can have up to 30% more performance with same cpu
I recently upgraded from a 12100F in my ITX board, to a 14700K to help leverage my 7900XTX that I put in to replace the 6600XT. That 12100F though rocked solid low temps on a tiny Noctua cooler and never broke a sweat.
I really put 12100f-12600k in the "if ya have it, it's fine" but it's aging fast, and I think the upgrade to the 14700k was a wise choice in your scenario.
Tons of builders often forget that an old monitor is often the biggest bottleneck. 1080p/60 is ubiquitous, so new builders have to consider display targets.
Totally agree! And the performance gap shortens as you go to higher resolutions. So this kind of benchmark at 1080p only makes sense for someone who plan on playing at this resolution for the time they will keep the processor.
I had thought to upgrade the 12900K to 13th gen but the improvement was too small to be worth the effort. Then I was excited when 14th gen was compatible but still it didn’t seem worth the expenditure, so bought some parts and built two machines mostly out of spare parts and gave them to family members. I realized I got more fun from that than I would have getting a few more frames in flightsim. Then all the problems became known and I decided to hold off on a new build. Recently found a new build 7800X3D/4070S pre-built on sale at Microcenter and got two of those again for people in the extended family. Again, more fun in sharing. Maybe in the spring I’ll finally do my own upgrade but the thing is the 12900K/3090Ti just keep chugging along so I don’t have to. In my old age I find other things become appealing to me with hardware, like a GPU that’s got zero coil whine, low fan noise and is built solid so there’s no sag.
I said it already. Keeping my 12900K is the best decision I ever made. It runs anything. It is easy to cool. It is stable. Paired with RTX 4090, I am all set for many years. 12700(K) is also very good.
@@fatmike01Lol I still have the 4770k . Still running current gen games great, thanks to my 16 gb ram and my gtx 1660 ti. Newer games avg around 50 fps with 1080 resolution. Not shabby at all with a 10 year old CPU! Turns out most games nowadays rely mostly on GPU and ram.
I just went from a 10700k to a 12700k thanks to a Microcenter bundle. $250 with a good motherboard included. Very impressed with the power jump. Using a 4070 Super and everything runs like a dream.
My 13700k is over 2 years old now and has been oc since day one to 5.6ghz on all 8 p cores, All 8 E core oc to 4.4ghz, Ring oc to 5.0ghz at 1.34v. Even the mem controller has been amazing. 32g.b Corsair rgb vengeance 6400 cl32 hynix A-Die oc to 7200 34-41-41-83 at 1.45v. My computer is on 7 days a week over 8 hours a day for work and gaming. Must be some sort of 13th gen miracle. Haven’t even updated the bios since Jan of this year.
I've been running a 12700k at 4.9ghz p core 4.0ghz e core (all cores) with negative offset since launch. Runs perfect for games at 4k (3080 TI). If you don't care much about productivity and are playing games at 4k, you're GPU bound and wont see much uplift from a CPU. Right now I don't see a point to upgrading for probably another 2 years.
Wow you read my mind, such a well timed video,thank you. My only complaint would be , like with your standard reviews power consumption is a very real consideration so as a 12600k user I was hoping to see some insight as to if the 13700k used less power than the 12900k.
I'm so glad I went 12th Gen 12700K and not 13th gen when I had the option to choose. Who would've ever saw such a disaster coming? 😅😅 I paired this with a 4060TI 3x Ventus, and 64 GB of DDR5 6000Mhz WAM. I kind of regret getting an MSI motherboard (MPG Z790 Edge), because it won't let me overclock and use the full 6000Mhz without blue screening. The MOBO defultively caps the wam 5600Mhz. Despite that, I haven't experienced any issues regarding the 12700K itself. If only Windows 11 was as good as my PC...
Running a 13700k + 4090 at 4K. Probably just gonna use the 13700k and upgrade my gpu to a 5090 when the time comes as that will be "the biggest" upgrade in my case. Great video as always Steve!
Same here 13700k and 4090 for the past 2 years and has been absolutely flawless. It’s been oc since day one to 5.6ghz on all 8 p cores, All 8 E core oc to 4.4ghz, Ring oc to 5.0ghz at 1.34v. Still on bios from Jan of 24 and have no plans on updating the bios. Even the mem controller has been amazing. 32g.b Corsair rgb vengeance 6400 cl32 hynix A-Die oc to 7200 34-41-41-83 at 1.45v. My system is on 7 days a week over 8 hours a day for work and gaming. It’s been an absolute reliable work horse. The 13th/14th gen problem was so blown out of portion in my opinion. Same as the 4090 connector melting. Same as oled burn in. People loose sleep over the dumbest crap.
@ not rude at all to ask. The short answer, I don’t need to, I just really want higher fps in games like cp2077/alan wake 2, basically the stuff pushing the boundaries with path tracing / ray tracing.
My current PC has a 12400F, built it when Alder Lake first came out and it's served me well. I am thinking of doing an in socket upgrade to a 13700K within the next 1 or 2 months to get me a couple more years on the same motherboard/ram set up.
if you change socket it's better to go with am5 for me, or go with 12700k-12900k(12900k i think price it's too high) and keep same socket, i would avoid top 13-14 gen due to degradation problems that even with patch seems not resolved but only mitigated...in server/datacenter with cpu at work h24 they still have instability issue
i love my i5-12400F because of the non E cores tho if i was to upgrade i would get the i9-12900 and stick to a all intel setup ie motherboard ect but with a nvidea gpu.
@@dylan.dog1 The bios don't fix already degraded CPUs, i have a 13700K and is working perfect after 1,5 year, and if it was degraded and it breaks i'll ask intel for a replacement since they increased to 4 years warranty on 13 and 14.
@@guille92h for day to day desktop users it's more hard to see than in a server or datacenter ((with cpu working h24 and where you need everything stable)), and cpu affected by it are 20% so your 13700k can be fine, anyway you ll don't see a break, your pc will work anyway but you ll have stabilities issue, maybe a program won't work or crash or game crash or pc crash or some online navigation instability,with lag, restart etc.. or strange shuttering or various problems like black, screen blue screen errors, etc... some users can have only one or 2/3 of these problems that are related to cpu degradation and don't even know
Great video, as always. I'm glad that your considerations about the Alder lake series in 2024 matched exactly my conclusions when I was upgrading 5 months ago. I had a strict budget for the upgrand, and my primary choice - R7 5700X / B550 Mobo / 32gbram were still a bit high in price here in Brazil for the good reputation AM4 built over the years. Instead, I got me a 12600KF build with a B660 PG4 that smashes the 5700X in games and general performance, and still managed to save around 140usd by chosing that, which allowed me also to complement savings and upgrade from an RX5700 to an RX6700XT. The R5 7600 non-x was an option, but the build with DDR5 and the expensive mobos would've cost 220usd more for just 5~7% more performance.
I got back into building PCs with a 12700k. That was a fine machine, and the friend I sold it to is still using it. Also, from the time machine, "performance fortunately has also crept up but so has power," made me chuckle.
16:46 Minor correction, 12700k and 13600k have the same THREAD count, but not the same core count. Not a big deal and doesn’t change the message of the video, but thought I’d point it out given GNs never ending strive for the upmost accuracy.
I'm still rockin a 7700k + 1070 and Monster Hunter Wilds was finally the wake up call to get a new PC. Being an intel fan I decided to hold out for the 15th gen release and.. man, that 9800X3D sure does look sexy. Prior to the 9800X3D release I was considering getting a 12th gen intel since they're so damn cheap and still have good performance results without having to deal with problems like self immolation. New killer gaming CPU, DDR5 RAM, PCIe 5.0 GPUs in January, it's nice to be back on top. And broke, very broke.
@ yeah. it’s crazy how in just few short years we went from toe to toe battles for x86 competition to genuinely hoping either intel of all companies gets better or face the consequences. almost sad if it wasn’t so deserved.
I am happy to see this up-to-date revisiting and testing, thank you. I own a mini itx build (Lian Li Q58 case), which has an undervolted i9 12900F cpu, 240mm AIO, rtx 4070 super, and 6200MT/s 32GB DDR5. I am doing some light gaming, and some rendering hence i opted for an i9 cpu which was very cheap second hand. Been thinking to buy a newer processor(or not)already, but it seems even the i7 counterpart is still relevant in 2024 or onwards. Amazing.
I ve had 12400F, 12600kF and 12700K + 3060 Ti/ 7800 XT, 6700 XT and I am totally happy, no need for any form of upgrades. The key point is does the modern games give 1% lows that are drastically lower than the fps average. Most of the time other than 12400F the 12600k and 12700K almost all the times give good 1% lows and I am so happy with Intel 12th gens. I ve had 5600x + 6700xt pc as well as 5800x + 6900 xt PC. Not sayn these combos were bad but I d say the Intel Nvidia combos gave me best results in most competetive games And 12700k + 7800xt is a superb 1440p beast even today. I wud never upgrade at least till 2027.
@@drunkhusband6257 Are you serious? What planet do you live on? Because it sure as hell ain’t earth! There is a huge bottleneck even with a 14900K in every single modern game once you try to get near 144FPS. And in flight sims there is even bigger differences even at low frame rates. 14900K does 45 to 55 FPS while the 9800X3D does 90 to 100 FPS without changing any setting and with same GPU. And in cyberpunk no matter what u do you won’t go over a like 135FPS even at the lowest setting and ultra performance upscaling while the 9800X3D allows it to go past 190FPS without changing anything and again the same GPU. Same in Hogwarts Legacy, same in Jedi survivor, same in stars wars outlaws, same in watch dogs legion, and so on and so on. The performance is up to 70% faster so what the hell are you even saying, your clearly have no clue what u are talking about sorry not sorry.
@@pvdgucht You are totally wrong and making things up. lol performance at 1440p and above isn't even 30% faster. It's fine, live in your dream world that doesn't exist.
@@drunkhusband6257 Didn’t say it was 30% at 4K but anyway (well actually in flight simulator it comes close but that one is an exception. I not making anything up. I did all the test and benchmarked everything myself and noted all the FPS. I know what I am talking about. I saw a CPU bottleneck in almost any modern game at higher frame rates with Intel CPU compared to X3D. And I don’t get why you can’t accept that and are calling me a liar, I am just straight up telling what I saw, and I invite you to do the tests yourself and see that with a 12900K your frame rate stops at a certain point often near 130FPS if you lower all graphics settings and if you lowed them even more nothing happens, while with a 9800X3D it does go much higher indicating a CPU bottleneck. Or try to run flight simulators on these two CPU’s and you will instantly see what I am talking about.
I went from a 7700K to a 12900K back when it released. Also still rocking my RTX3080. I've not felt any need to upgrade whatsoever. It used to be that we'd have a Crysis type game come out that would push the performance envelope and requirements - but nothing pushes the boundaries that hard these days. I can easily play any current game at > 120fps+ at 1440p, and that's perfectly fine.
12th gen is absolute bangers to this day. Stocked up on 12700Ks for like $100 or something from the TikTok Newegg sale last year. Running 12th gen in all my systems and probably will stay that way for a while
My brother and girlfriend are both on 12400f, and i'm still rocking a 12700k. I play at 4k, so my games rarely cause it to break a sweat in gaming. It's amazing for running my audio software (Sony ACID Pro 11) as well.
When Intel was good: Sandy Bridge to Coffee Lake. That's when it worked right and felt right. After that, everything got weird. 10th gen was only good for the gold 10900Ks, the rest were crap. 11th gen was a lousy backport with virtually no cache. 12th gen was still needlessly complicated with P+e cores requiring a new scheduler, and the reason Intel didn't screw it all up was because they were comfortable in their (temporary) lead over AMD so they kept it sane. Everything Intel has done after 2021 has been an UTTER failure that often requires moronicism to excuse... Raptor Lake: fail, Arc Alchemist: EPIC FAIL, Raptor Refresh: fail, Error Lake: many different flavors of FAIL... Gelsinger: FAIL!!
"12th gen was still needlessly complicated with P+e cores requiring a new scheduler" The i7-12700K makes the P-core only i9-10900K look like a Bulldozer CPU.
Same. I have mine paired with a 4090 and have zero gaming issues with my rig. Some day I'll upgrade, but only for Windows 11 when it becomes mandatory.
It's a bummer seeing your CPU slowly slip off of the charts but like you said, it's important to take a step back and think about whether you're actually unhappy with it outside of just the numbers. My 9600k feels absolutely ancient when i watch reviews but still completely handles all of my games just fine.
LOVE that yall did that revisit! This gen may age similarly to Sandy. Unfortunately, I built my 12900k rig on ddr4 so I think it'll just make more sense to keep it as-is until I give it to a family member or something. One test I wish yall had done was power draw and or efficiency. Maybe I missed it. With the latter, I enjoy how yall adjust the parts for the same wattage and then the same performance (to see how many watts each pull for the close to the same perf). The only thing that drives me to a new build right now is how power hungry this system is. Based on your last video, this thing and for that matter all of 12, 13 and 14th gen high-end cpus are absolute behemoth porkin pigs lol. I might switch just to save on my power bill for premiere/topaz rendering which can be hours like Blender.
I went DDR5 which at the time cost £300 for 32GB 5600. I think you made the correct choice. Power isn't an issue for me as I use mine for coding and gaming. I have it PL1/2 locked at 125W and a .07 undervolt on the core which runs games ~60-75W. I can't remember the CBr23 score but think it was 24k vs 27k uncapped.
There is 2 points, that where not covered in this video 1. Non K parts can be overclocked via FSB on some boards 2. There is option to enable AVX-512 on some boards when disabling E-cores. May impact grately in some workload scenarios.
3. Oc ring bus way further with ecores turned off, and the cache on ecores gets fully available for the pcores. 4. No thread scheduler issues or cores going to sleep when they really shouldnt.
I love the comparison charts you do as it puts thing into perspective and ranks things for me which I am unfamiliar with. The comparison charts in your recent GTX 1080Ti video were really, really good as were these.
I’m still rocking my 12900k, Msi 3080 gaming z trio and 64gb of DDR5 6000mhz and it still kicks ass to this day. My Corsair 280x case and 240 AIO doubles as a space heater but hope to remedy that with a external mounted radiator outside my room where it gets down to -35 regularly (Alberta Canada) if y’all could give me some insight or tips on pros/cons to this it’d be super appreciated. I know condensation will probably be a problem but my pc stays on 24/7
I'm still rocking a 10700k and 3070 , honestly happy with gaming performance at 2K. Still love the meshify 2 case I bought based on Gamer's nexus review , best case I've ever owned
People in the comments saying: "im still running alder lake", guys. girls. You can still use an 6700 (non-K) no problem. was using one until recently. For youtube and competitve gaming (BF5, CS2 etc) it works. Bought an 12600K and of course it is a monster near the 6700, but if u only play cs2 and watch youtube, no problem with the 6700. Before the 6700 i was using an i7-4930K and only change it to go to ddr4 and then use the same ram when going to the 12600k.
Just got an i7-12700k and MSI mobo bundle from microcenter for a friend's daughter's PC repair/upgrade. Previously 8th gen Intel. Then stumbled on this video in my feed. Hopefully a good value. The hardware bundle that is. The GN videos are always a great value. 💰
Went from 4690k to the 12700k. I'm happy with it so far, had to upgrade cause the old boy just couldn't handle modern games at high frame rates but damn it was a great chip. Got it up to 4.7 overclock and was an absolute beast. Had to lower to 4.6 ghz after a few years cause of instability but still was very happy with my purchase. Also thank you micro center cause i got that chip for $186 at the time. Sentimental cause it was my first build and still run it as a backup PC if friends come over to game.
Great video as usual, and on a topic that is of particular interest to me, as an Alder Lake owner. I wish you'd tested the differences between DDR4 and DDR5, though. From what I've seen, those numbers are sort of all over the place. At release, for example, most reviewers found a negligible uplift for DDR5. In newer games, however, HUB found that Alder Lake can benefit enormously from DDR5, in some cases to a much larger degree than Raptor Lake. For me I'm not sure it really matters, because I'm not exactly a cutting edge gamer. But the memory issue seems like a substantial wrinkle in the 'should I or shouldn't I upgrade' discussion. You did touch on this in the conclusion, but direct and contemporary comparisons between DDR4 and DDR5 for Alder Lake are pretty hard to find, so more data would be appreciated. Anyway, in the general case upgrading from Alder Lake clearly isn't worth it right now. As you say, in-socket options (Raptor) are a mine field, and newer platforms don't offer a performance uplift sufficient to justify their huge cost. People running lower-end Alder can get a pretty good boost out of a cheap 12700 or 12900 right now, though, which is nice. Just have to watch those VRM; some cheaper boards might balk.
intel was at a crossroads after 12th gen: - invest into developing a new and better architecture and build it on the best possible process - don't invest anything and see if you can squeeze out a tiny bit more out of the old stuff no matter how or the consequences
RUclips comment section probably isn’t the best place for this but figured I would try. I just upgraded my PC after 10 years (not counting GPU). I now have the following new parts (again, not counting my 3080 GPU) 32gb DDR5 Corsair Vengeance 6000MHz Z790 Aorus Pro X 12th Gen Intel i9 12900k Corsair Nautilus 240 RS AIO The “issues” I’m encountering are that it takes much longer to boot up from POST. And my PC has become significantly louder than before. HWMonitor shows CPU temps don’t usually go about 40-50C but fans really start spinning hard. If anyone would have any suggestions or troubleshooting ideas, I’d be all for it.
Awesome comparison! This has been a great reference trying to figure out a good upgrade for my brother's setup so it's as powerful as my 12700kf / RTX 3080 12gb build. He's got my old 5820k / GTX 1080 and he wants to do 1440p ultrawide gaming, so it's been a real head scratcher trying to find the best bang for the buck cpu/gpu combo with all the deals going on for the holidays. Been thinking about either a 5700X3D / RX 6800 combo or a 5600X / RX 7800XT combo, but I'm worried about the 5700X3D being wasted on the RX 6800 and vice versa for the other combo option. Best option would probably be going with the 5700X3D and RX 7800XT, but my budget might not be able to be pushed that far. Regardless, excellent video! Gave me plenty to think about!
These kinds of reviews are cool as fuck, yo. Please keep revisiting older SKUs of products because... well, we're gonna need to know pretty soon how best we can get by with older hardware. We're ALL gonna be doing "budget builds" once the tariffs hit. It's important to know just how long one can expect a CPU to not only last, but stay relevant.
Correction: The 12400 is listed as "i5-12400 (6P/6E/12T) [10/24]." It should be listed as "i5-12400 (6P/0E/12T) [10/24]." This is a specification listing error (name entry error) that has no impact on performance or results. Our apologies for the error - we're adding it to our Corrections page and have added an in-video pop-out correction card as well.
Watch our Best CPUs of 2024 Video! Went up yesterday. ruclips.net/video/Zue30tcu0mY/видео.html
Grab our brand new dice sets! We are donating 10% of ALL dice sales to Cat Angels, a local cat shelter near us, through November 21, 2024. Grab the SNOWFLAKE DICE: store.gamersnexus.net/products/snowflake-full-tabletop-mtg-dnd-premium-dice-set-7-piece-dice-wooden-box-cat-card
Or the e-waste INDUCTOR DICE: store.gamersnexus.net/products/inductor-full-tabletop-mtg-dnd-premium-dice-set-7-piece-dice-wooden-box-token-card
Also, cases and coolers on the list for the next week! Though we'll have a news episode likely before that.
i'm stunlocked on 4th gen intel due funds and yes it's in a 2003 HP case because it works i guess
Somebody needs to compare the driver overhead of Intel graphics with AMD and Nvidia graphics.
Are you really sure you tested Baldur's Gate 3? ruclips.net/video/NQp9plxGHlQ/видео.html ruclips.net/video/97VEQAD_z3Q/видео.html
@@Y0Uanonymous Good request! Haven't done that test in a few years. It can actually be pretty different. Maybe it's time to try again -- any requests on which CPUs you want to see tested on the Intel side? AMD seems obvious -- 9800X3D. What about Intel?
@@GamersNexus i think it has to be 14900k, because the 200 series needs some fixing before it can be used for testing
Also, I like this "look back" style video. It lets people know about what they'd get from an upgrade from a different perspective than normal. I think this is a great idea.
Yeah definitely agree! Plus I'm sure there's a market out there that prefers/doesn't have the budget for something new and is looking an older generation flagship used product, this is great for that :)
I bought my 12900k used last year, was half the price of a 13900k brand new and considering I didn't really need the extra performance, it was a lot easier to justify the upgrade at that price point haha :D
i just put together a 12100 'look back' build, paired with an rx6600. it's my media pc, and i figure i'll be able to 'look back' at games from 2022 and before forever, while it delivers a pretty good media storage/streaming experience. i was using my old workstation for the task, but goodness that thing ate power while idling.
PROMOSM !
Yup. I'm still rocking a Ryzen 3600 because the increase in productivity performance isn't worth the money for me yet. But I'm always monitoring the price lists.
Definitely
I built a new system with the 12700K a couple months after they came out, and I thought I was living dangerously getting the brand new architecture...
hahaha, it was a HUGE change! Go figure it'd be the refreshes and follow-ups that botched it.
@@GamersNexus I went from a 6700k to a 12700k, in 2021, still running it! probably gonna have to go AMD in the next upgrade...
You were living dangerously, if you watch bryans series from techyes you'll find that intel made BIG sacrifices to achieve the P and E core layout with the source reason being confusion and marketing. There's latency everytime you wanna do something spontaneous ranging from clicking on a random mp4 in file explorer (11th gen faster) to stacked spontaneous tasks with stacking stutters disrupting workflow. This series is propagated by gamers to being the best entirely because games aren't affected by this and most gamers don't do much else other than game or aren't smart enough to differentiate from that toxic ecosystem.
You did however make the right decision at the time luckily because Intel had a direct hand in messing up AMD's interaction with Win11 and only recently when Intel was extra vulnerable did the major bugs get fixed, but it was even worse before when AM5 was unstable as a whole for almost a year. Now however, there is no reason for anyone to go 12th unless you get a steal second hand.
my 13600kf got unusably unstable and intel refused to rma it. luckily, my seller just refunded me almost 2 years after i purchased it. straight up replaced it with a 12700k for half the amount that i once paid for the 13th gen.
Got a Dell laptop with a 12700h (7620 plus). Was very unstable the first year, but after 1~2 years of firmware updates, i finally reach uptimes of 7+ days :) still a very fast laptop
The new "Equivalent" chart is interesting. I like it.
Thanks! We'll experiment with it some more.
@@GamersNexus I think it's one of the most informative types of charts you can possibly put out. At the end of the day what people want to know is "is x better than y considering their prices?" and an equivalency chart in different challenges answers that VERY simply. Then it's just down to the viewer to compare the prices.
@@GamersNexus I really like it too, I think an aggregate row at the top or bottom that states the overall average closest CPU would make it easier to reference in a general sense, before looking at exact neighbours per title.
I wonder if GN could add a DB on the website where you could, input a CPU and it would provide a grouping of comparable items?
Very much like these charts but always available.
Sorry. I don't think I'm expressing this idea well at all. 😳🙃
But I also love the charts. 🖖🏻🤝🏻
Agreed. Everything can be distilled to numbers but the chart gives an interesting perspective and sort of serves as a TLDR if you feel like it. Good stuff.
It’s wild how the Alder Lake CPUs, especially the 12900K, have managed to stay so relevant even as newer generations keep rolling in. Intel really hit a sweet spot back in 2021, balancing decent performance, new tech (DDR5, PCIe Gen 5), and reasonable prices. Even now, CPUs like the 12900K and 12700K are holding their ground, performing like modern i5s in a lot of cases. That’s a pretty good deal if you snagged one for $120 in a Best Buy clearance.
What I love about Alder Lake is the flexibility! it works with DDR4 or DDR5, so you don’t need to throw your whole build into the trash if you want an upgrade. Plus, it shares a socket with the newer chips, which is rare these days. You can slap in a 14900K if you really want to push your system, though most people probably don’t need that kind of power (or heat).
Agreed on all of this! This is what was exciting about it back at launch. The in-socket changes is also uncharacteristic from Intel, at least for this wide of a performance gap. That you can go from a 12900K to a 14900K in the same socket is what we need more of, especially from Intel (given AM4's track record).
but with ddr4 performance you will suffer performance regression by some margin maybe 8-10% compared to ddr5 so if you still want to build gaming pc with ddr4 5800x3d, 5600x3d etc.. still kicking ass sometime fell strange to see those cpu on top of the charts
@@dylan.dog1 My point was that you CAN run DDR4. Not that you should. Come on man.
@@visionary4787 you're right sorry
@@dylan.dog1Ddr4 is just as fast as ddr5
Built a system with 12400f last month and just helped a friend building a gaming PC with 12600kf. Alder lake is the best bang for the buck for LGA 1700!
same 12400f
12400f paired with either rtx 3070 or any other equivalent gpu is still a very potent 1080p - 1440p gaming machine and will be for quite a while as well.
Yeah rocking a 12400f. Does the job. Not in a hurry to upgrade
12400F @ 5GHz 💪
Planning to upgrade on 12700F in coming year, it's enough for me to play and work for another 5 years
12400 overclocked to 5-5.3Ghz was insane buck for money with mb that had external clock generator. Skatterbencher has some nice vids on how to to it.
I bought the 12700KF in 2021 after watching your video, and I've been loving it ever since!
I'm still running my i7 12700kf with a 6900xt. I haven't felt the need to upgrade yet. If anything I'll upgrade to ddr5. I'm running 64 gigs of 3200.
That's a great build overall. 6900 XT was a good price there for a year or so.
You will be good to go for next two years with that combo. That GPU is a beast, RX7800XT < RX6900XT < RX7900GRE but very little margin. I personally call them "cousins".
Got mine for 700$ just as the crypto boom started to slide and since it’s been a great performer at both 4K and 1440P. Easily one of AMD’s best flagships once the price came down a bit
I’m in almost the same position but I got a 7800xt just gonna upgrade to ddr5 at some point
12700k + 4090 here. I'm waiting to see if Inteil will announce the Non Ultra 1700 socket in january or else I'll buy a 14700k.
Went from a 2600k (2011) to a 12600k! 11 years later. It's still going strong 2 years later can't see a need to upgrade, although I don't think I'll get over a decade of it like the 2600k goat. (Which is still running! But spending its retirement in a sims4 pc in our lounge for my wife)
I reckon that the 12th gen might last even longer than the 2nd gen. Back then we saw substantial performance jumps between generations, but it's been very lacklustre lately. If there are no major architectural changes that enable large performance jumps, I expect to get the same 10 years out of my current i5-13600K much like I did with my previous i5-4670K.
@@engrammiyup....12th gen to 14th gen are essentially the same processors with just more clock speed and the new ultra series are on par or even weaker than 12th gen chips. I think it will take another 3-4 generations to see any noticeable performance improvements.
in 2019 I also upgraded from an I7 2600...CPUs likely will never "live" as long as back then. Thats because PC gaming has changed a lot. For me personally gaming on 1440p 165FPS makes going lower than 120 basically impossible. 60 FPS feels choppy to me and sadly higher framerates also means you need a better CPU. Back then all we had was a 60Hz screens for ages, this is no longer the case
I too updated from my 2600K (2011) to a 9700x 13 years later. I couldn't justify a purchased from Intel given their recent behavior, from both product (13th/14th degrade fiasco) and individuals at the company executive level.
Same here!!! I had a i7-2600(non-k) from a Dell Optiplex 990, the 12600k is soooo nice, especially the ability to banish an app to the E-cores with Process Lasso.
Intel peaked at 12th gen. It was their last gen that didn’t have vast issues and was also competitive.
Honestly, I think the X79-era was the peak of modern DIY/enthusiast computing. While six cores is standard now, quite a number of people still had dual core processors in 2011, and it wasn't until, what, 8th-gen that standard desktop i7s moved from quad to hex core chips, so the idea that you could, for ~$600 USD pickup an unlocked, six core HEDT processor -- that was still on the exact same architecture as the consumer-consumer parts, mind you -- was great.
Plus, it came with a quad channel memory controller, something we still don't have on standard desktop parts, and FOURTY PCI-E 3.0 [1] lanes direct from the CPU. On top of that, it was funny, I remember buying a Rampage IV Extreme, which seemed absurdly expensive at the time at like ~$400 -- which feels barely above average now -- and not only did it have all the usual xoc features, a seven segment display (wow!), but it included a PLX PCI-E switch, in case those 40 CPU lanes weren't enough.
And it wasn't just Intel; while Bulldozer may have been a flop, for several months, the absolute fastest consumer GPU on the market, by a solid margin, was the newly released 7970 -- which was "only" $500, not $1500.
[1]: Officially, I think Intel still claims PCI-E 2.0, but it was totally capable of 3.0.
I'd say they've been legendary since the Core 2 days, peaked in the mid-2010s and then Ryzen's been destroying them in gaming since. Not sure why it's not possible to have two manufacturers at once producing baller CPUs and having proper competition at all times.
It works in GPUs, right? There is very strong competition between AMD, Nvidia and Intel. Rarely does one manufacturer pull ahead massively. Sure, Nvidia is a market leader, but that's more of a brand image thing. Probably same as Intel in CPUs.
13600k is AWESOME
The 9th gen is peak intel.
@@timothybayliss6680 LMAO ,,, your kidding right?
Still rocking my 12700k, working perfectly and happy that I didn’t need to deal with the instability issues.
If I were forced to choose an Intel CPU this very minute out of all of the generations available, it would absolutely be 12th gen. 13th, 14th and "15th" gen all have anxiety written all over them. 12th gen has a proven track record of longevity, reasonably good performance, and now, they're dirt cheap.
It can handle every GPU besides 4080 and 4090
nice bro i got mine rocking 5.5ghz on p cores and 4.1ghz on e cores and its stable as ever
@@GODFADED That's pretty high, your cpu is liquid cooled? I OC'd mine as well for a while. I got great results high fps in fn and notably less micro stutters and fps stability but the cpu got pretty hot talking about 85°. I have a dark rock 4 cpu cooler and it was barely keeping it from throttling when in heavy load.
@ yessir got a ls520 deepcool aio i rarely get to 70 while gaming and barely touches 80-85c on full load for a hour
my 12600k is almost 3 years old, will use it for 3-4 more years
Hell yeah keeping mine for a while.
I've only upgraded to a 12600KF like a half year ago after using a 12100F for 2+ years which served me really well for my use case so the 12600KF will defo last me a few years.🙂
The upgrade did not cost me much since I've also sold my 12100F to the same person who I've bought my 12600KF from. 'second hand but basically brand new CPU'
Gave it a healthy ammount of undervolting in the BIOS and a cheapo contact frame and this way its very easy to cool even my budget ~35$ tower cooler is able to keep it under 60 celsius while gaming at stock speeds. 'B mobo so even if I wanted to I can't OC but since it was cheaper than a 13400F I've went with this instead cause its faster even at stock anyway so the 13400F made no sense to me over this'
don't forget to reapply the thermal paste every year or so
@@bracusforge7964 True honestly been thinking about upgrading to a 360mm AIO. I think I got a decent chip that can overclock quite well. But the issue is keeping it cool. I started Dragon age and while building shaders thermal throttled. And forget about running anything AVX. Imagine I could always run an offset but what fun is that. 😜
I want to try using it with RTX5080, I wonder if it can handle it at 2K resolution?
One thing to consider about 12th Gen is it is now really the only option for ITX builds at the moment (excluding 13, 14th for obvious issues and heat and 200 series for price to performance). Am5 boards are fewer in number, more expensive (in Europe at least) and limited io compared to their 1700 counterparts. The amount of AM5 itx boards with only 1 graphics port on the rear io is ridiculous! Considering ITX is the only form factor where you are most likely need to use the IGPU if you need to use the PCI slot for something other than a graphics card.
You're already on thin ground when you want AM5 integrated graphics capable ITX - found 11. A quarter of those has 2 ports:
* Gigabyte B650I AORUS ULTRA
* Gigabyte A620I AX
* Gigabyte B650I AX
(You can see the trend, no?). But you also can just compromise and get a usb-c to hdmi adapter (mini-docking station) for a few bucks, shove it in of the 40gbps backports of an Asus ROG STRIX X870-I GAMING WIFI or equivalent. We're in PC world, mate - this should work just fine.
Got my 12700k at launch. I’m very tempted to go for the 9800x3d but will likely sit out a few more cycles. Just seems smarter to use that money towards a better gpu and upgrade when 9800x3d drops in price or better CPU’s come out.
All these hype around CPU for gaming, when GPU is the real bottleneck.
I'm actually feeling the opposite, many of the games I play are more CPU bound and seeing a roughly 50% diffy between the 12700k and the AMD X3D CPUs makes me kinda want to upgrade.
Kinda regret going Intel for my current build. Wish I had gone AMD so I didn't have to replace MOBO and stuff as well.
@QPoily it would be a no brainer upgrade when 9950x3d gets released. I feel 9800x3d with only 8 cores is too limited.
Me too. That CPU still has significant headroom in all the new games I play.
@@sunnyokp lol u dont need more than 8 cores for games
Damn your hair was less grey but you look healthier now
I'll take it!
Steve is slowly turning into a wizard
Or Battlemage
@@GamersNexus Look into copper supplements. No more grey
For those of us with older Intel CPUs, it would be nice to see a 9900k in there since you have some older AMD CPUs like the R7 2700. Thx!
This just summed up for me to hold on to my 12900k for another 3-4 years as im not bottlenecked at 1440p with my 4070, loved this type of video super informative and just took all my doubts
I have a 5800X3D and I'll hold onto it until DDR6 platforms come out, at that point I think it'll be fair to upgrade.
Interestingly, the 12th gen's DDR4 support makes upgrading complicated. If you have a DDR4 motherboard, an in-socket upgrade from 12th gen stops making a lot of sense.
Why? DDR4 can still very much cut it vs DDR5 in most usage scenarios.
There are very few cases/games where the bendwidth is an issue, the fast DDR4 kits are still insane and as vendora clear inventory, quite cheap.
If you have some b-die ddr4 tuned to 4000mhz tight timings it can compete with some of the best ddr5 rigs
@@weyo14 this
@@plamen5358 Only if you're pairing with a high-end GPU. But the RTX 5000 series may change that.
Just bought a 12th Gen machine few months ago. Dirt cheap, blazing fast. Don't really feel like I'm missing out on anything. No stability issues whatsoever. Even the integrated graphics is fast enough to play games in HD. I wasn't expecting it to preform this well, but my next upgrade will be an AMD. I'm bailing on Intel because of the last 2 gens.
Welcome Comrade! I just built a 12900k system, upgrading from my old 4790k build. It's the first new build I've done since 2008 (for myself). I just kept upgrading for 15+ Years. Not gonna lie, it was a little emotional moving the rig I've been building & working on for all those Years move to secondary duty- there's literal blood, sweat and tears in this new PC! I love it though- totally worth it! And I'm VERY glad it's over and I've had time to get comfy with it. It's one thing building a secondary PC or one for someone else but it's another thing to build my own PC, at least for Me.
@@Chaos_God_of_Fate I used to care about hardware. Always had dual GPU's and top of the line everything. My computers 10 years ago already had 256GB of RAM and then I suddenly stopped caring at all. And I just retired a laptop recently that was 12 years ago. The battery finally died and it doesn't work without it. I would've kept using it otherwise. I bout a 12th gen as a replacement because I had some reward points and the computer on offer ended up being practically free. All that about one week before Intel 13th and 14th gen defects became public. Fortunate buy that I'm still enjoying. I hate the idea of Microsoft forcefully retiring Win 10 that most of my computer fleet (that I use for work) operates on. The move to Win11 would make a whole ton of those 256GB 32-core beasts next to useless. Sure, there is a registry fix and it is possible to get a hardware board that makes the computer security compliant but it's all expenses out of nowhere on something that was considered fine only yesterday. It's BS. I can dodge Intel and go with AMD, if the brand messes up, but what about Windows? I'm completely stuck. Macs are overpriced and Linux is useless in a work environment. Most software is for Windows... 12th Gen intel is the last system I'll be enjoying, even though it already feels like I don't own the PC, and simply renting it as a service from Microsoft, who are worse than spyware... blah blah blah... I'm rambling.
As a weirdo with a 12600k+4090 system I appreciate this. Originally the thought process was "4k = max gpu, midrange cpu" but in retrospect it was too optimistic with a 12600k.
All told, coming from a "if it works, I'm happy" background frankensteining PC parts together my system's still WAY more than enough, but a cheap 12900k wouldn't be bad. I didn't realize they'd dropped in price that hard. Cool vid!
Flight Simulator or City Builders-esque stuff will always stutter, but in "most" games the 12600 idles around.
I got a 9700k with a 3090 🤣
I am thinking of upgrading to 12th gen, still haven't decided if I want an i5 or the i7 but the i7 has been looking tempting but I'm short in cash so I'm still thinking about it
Have the same setup, similar thought process. I don't need the highest frames on 4k, 120+ us ideal but I'm fine with around 80+.
Optimisation in games are atrocious now but works really well for most of my use cases just hoping MFS2024 will run well with it.
Yeah i just bought a new 12900ks for 240$ theyve dropped
Built my current system 2 years ago with a 12400F/3070/ddr5 6000. Its been a great system and I don't think I'll be upgrading any time soon.
8 gb of vram in 2024 😂
Any AMD gpu in 2024..... Driver crashes and taking it in the butt.... 😂 frame stutters@@nitrowarrior-lj5ip
@@nitrowarrior-lj5ip and it's completely fine 😂
@nitrowarrior-lj5ip I have 24b of vram and 8gb of vram there's horses for courses so to speak
Love my 12700k. Picked it up, around a year ago or so, when a z790 board and 12700k combo was first available for $300. No regrets. I was able to get 5.3/ 4.3 OC on all p-core/e-core Respectively. With 4.4 oc on the ring ratio. Unfortunately, it reads ~230w of power with peaks at 250w in multicore CINEBENCH 24. Thankfully, all I do is game, so it's a non-issue. Use a 280mm thermalright liquid cooler. Zero issues.
throwback to avx-512 p-cores
[Horny RPCS3 sounds]
What mainstream / semi-mainstream programs actually use AVX 512 at the moment? I can only think of handbrake off the top of my head. Is there more?
@@WSS_the_OG certain emulators, such as RPCS3 make heavy use of it and it's night and day for some of those scenarios, but outside of the emulation scene, I'm don't think you'll find much in the gaming world, except possibly some random decompression libraries for assets or whatever, actually using it. However, it was nice being able to write and test AVX-512 routines as a programmer without needing to use Intel's emulator or purchase something like Sapphire Rapids, etc, and still getting best-in-class single-thread performance and all of that jazz.
Similarly, as you mentioned, a solid number of video encoders make decent use of it. And contrary to what people might expect, unless you're encoding several things at once, or possibly splitting a video into segments, encoding them in parallel, and then trying to stich them together afterwards, you'd actually see faster encoding/render times on something like a 12900K/7950X than any EPYC or Sapphire Rapids [non-WS] parts -- most of these codecs can only efficiently be parallelized up to a certain point, and after that, you typically run into either too much overhead or worse compression ratios, so having fewer, yet significantly faster cores often wins out.
And then there is also the scientific simulation realm, CPU raytracing, as well as a variety of database software (ClickHouse, for example, and I'm sure others) and [as noted] various compression/decompression and data marshalling libraries -- e.g., zlib-ng, simdjson, etc.
While, at the time, I'd probably take something like a w7-2495X if I had the chance, there's just such a monstrous price difference between desktop and "HEDT" nowadays, so it was nice to have the ability to use those features when it made sense without paying $2K+ for the processor, $1K for a motherboard, and god knows how much on DDR5 RDIMMs. For now, I'll live with my ghetto Taobao 12900HX and Asus W680 board I repaired and got stupidly cheap since some poor soul must've immediately dropped their processor into the socket and bent half a dozen pins.
@@WSS_the_OG As the guy above you mentioned: emulation is a big thing (as certain instructions need to be ‘spoofed’/reimplemented which works much better on an architecture with more of them; hence ARM→AMD64 is much easier than vice versa). Also some forms of rendering that reach the limits of AVX2 not dragging them down.
@@WSS_the_OG RPCS3.
I built a system when the 12600k came out (it was only $200 in my country). It has never crashed, stuttered and performs so well with my 3060ti. Perfect computer for 3 years now and I am really happy with it.
I'm also very happy with my 3060ti. 8gb seems to be just fine for 1440p at the moment.
@TheLongWind Yes, I mostly just play cs2 now and I can easily achieve 165fps at 1440p. It's truly amazing coming from 1080p60😖
Same. I got mine in 2022 and I absolutely love it. It also came with COD MWII.
1:33 young tech jesus
I helped building 2 Intel system this year. One with a 12100 ($94 at local retail) and another with a 12400 ($114). Both times the users were very happy about the upgrade. The 12100 which I personally worked on, booted blazingly fast compared to my 10400 rig. ADL is still super competitive in 2024 for low end and mid range.
i3-12100f can be found on the second hand used market for anywhere between 30$ and 55$ here, unbeatable value.
Love my 12900k. Does everything I need it to do without issues ❤
Same here love my 12900KS its an absolute beast of a CPU not going to upgrade for a few years
It's still a great CPU! If you're not feeling the years on it, then no need to upgrade!
I acutally hope I went for 12900k + a cl16 ddr4 kit rather than mine 13700k now, I'd have similar performence and more stability
Thanks to a microcenter bundle I just built a new PC with a 12900k.
I got my 12900k with an Asus Tuff D4 board for $420, when the 13900K came out. It's been rock solid for my 1440p 144Hz G-Sync monitor with gaming.
Late night GN edition. Love to see it. All at GN, be safe and keep it up. We love you!
Needed to switch it up with the publishing!
Hopefully they make a comeback soon, or we might see the 14nm+++++ again with AMD
This is also our concern. AMD has shown with the 5000 era that it will drift toward high prices and out of as competitive a position if Intel allows it to, which makes sense. What we don't know yet is if they'd also stop innovating like Intel did back in the Skylake era.
To be fair 10/7nm was held back by the invention of the UV process and Intel did miracles with 14nm, the scummy part was "4 cores are enough for consumers" and charging insane prices for HEDT and unlocked CPUs.
AMD has already been on that train with 8 big cores per chiplet for like... 10 years?
@@GamersNexus I mean I like AMD but it was so obvious they'd do it if Intel lets them. After all they are business and no business/corporate is our friend.
@@plamen5358kinda? 3000 series used 4 cores on the CCXes which were, it I'm not mistaken, the base chiplet and then it slapped them together in pairs to form the CCDs. I think that that's one of the main reasons why the 3300X was quite a bit better than the 3100 while both being quadcores: 3300X was using the whole 4 cores on a CCX while the 3100 was 2 CCXes with 2 functional cores each.
CCXes dropped off from relevance on the 5000 series, where the baseline was the octacore CCD, also the reason why it was such an uplift, since there was no latency issues brought by CCXes communicating with each other.
@@plamen5358To be fair to both Intel and AMD, there's definitely diminishing returns to sticking more cores into a CPU in terms of usefulness. Few, if any games and not a lot more apps can use more than 16 threads (a lot fewer than with 8 threads), so if you're not planning to use your PC for production workloads like heavy 3d modelling or video editing (which 95% of PC users don't), the offerings out on the market right now are perfectly sufficient.
Happy to see this video pop up. Just bought a 12900KS after multiple days of research for £270. The 13th and 14th Gen have FAR to many issues and not worth the hassle, stress and potential problems. The 12900KS also has much better resale value. I did look at the new 285k, But those in the UK are around £580 and unknown longterm. ❤❤❤
12th gen was the best since 7th gen.
Definitely up there! The 8700K was a pretty big overhaul too.
I'd say since 8th gen
@@justmatt2655 yep 8th gen is rock solid if somewhat dated,,, I still use an 8th gen i7 as a backup to my newest 12th gen rig, looking forward to doing an AMD build after the world settles down on the new 9800X because Intell just aint got it with thier last 3 generations if you ask me.
9 hot
10 almost the same
11 shit
12 the peak
13 problem
14 same problem
200 also shit
Since Sandy Bridge or Haswell. 7th gen was mediocre, Skylake overclocked.
i never tire of him saying the names of different CPUs so fast i barely follow what he's saying.
We need a SOAS (System of a Steve) song where it's Steve rattling off CPU and GPU names
Been using 12400 since 2021 and it's been flawless. Upgraded my 1080p rig to 1440p with 4070 super which it doesn't bottleneck, I play single player games mostly. I really don't see myself upgrading to 13th or 14th gen, anything worth upgrading to consumes 3X the power. I'd probably hold off till AM6 or AM5 if games I play start giving me a headache. Thanks for the video! It really solidified the belief that I shouldn't be entertaining FOMO as long as the PC performs as well as it should!
I upgraded from a 6700k & Vega56 to a 12700kf & 4070Ti Super with DDR5 a few months ago. No regrets! Really great price to performance.
This video is awesome in a bittersweet way. Not only showing off that 12th Gen truly was a marvel, but also what could have been the 13th Gen if it wasn’t for the fact Intel fumbled the bag with the instability. Seeing my 13700K up there beating out the 12900K and being competitive was great to see but.. man. I’m unsure how damaged it is as of now. The BIOS update did actually stop my Windows from soft-locking, and my brother’s 13600K was affected, and that’s apparently rare. Thanks for the hard work as always though, great watch 🫡
Super happy 12600k owner here. I’ve never needed more, it was a huge step up from the 6600k that I ran for years.
Currently using my i5 12400F to run games at 3440x1440 max settings with my 7900 XT and I couldn’t be happier. Here’s not wasting money on an overkill CPU for gaming!
usually amd gpu have less driver overhead so even with weaker cpu, in the cpu bound scenatiors you ll have more performance compared to an nvidia gpu hardware u. made a video about that show you can have up to 30% more performance with same cpu
I recently upgraded from a 12100F in my ITX board, to a 14700K to help leverage my 7900XTX that I put in to replace the 6600XT. That 12100F though rocked solid low temps on a tiny Noctua cooler and never broke a sweat.
I really put 12100f-12600k in the "if ya have it, it's fine" but it's aging fast, and I think the upgrade to the 14700k was a wise choice in your scenario.
Tons of builders often forget that an old monitor is often the biggest bottleneck. 1080p/60 is ubiquitous, so new builders have to consider display targets.
Totally agree!
And the performance gap shortens as you go to higher resolutions. So this kind of benchmark at 1080p only makes sense for someone who plan on playing at this resolution for the time they will keep the processor.
I had thought to upgrade the 12900K to 13th gen but the improvement was too small to be worth the effort. Then I was excited when 14th gen was compatible but still it didn’t seem worth the expenditure, so bought some parts and built two machines mostly out of spare parts and gave them to family members.
I realized I got more fun from that than I would have getting a few more frames in flightsim. Then all the problems became known and I decided to hold off on a new build. Recently found a new build 7800X3D/4070S pre-built on sale at Microcenter and got two of those again for people in the extended family. Again, more fun in sharing. Maybe in the spring I’ll finally do my own upgrade but the thing is the 12900K/3090Ti just keep chugging along so I don’t have to.
In my old age I find other things become appealing to me with hardware, like a GPU that’s got zero coil whine, low fan noise and is built solid so there’s no sag.
just upgraded to a 12400F paired with a B760m PG Riptide a few weeks ago. OCed to 5.0GHz, 1.15V. loving this thing
Same built here! Its a very nice performance gain when its overclocked to 5Ghz. I don't plan on upgrading for now. :d
I said it already. Keeping my 12900K is the best decision I ever made. It runs anything. It is easy to cool. It is stable. Paired with RTX 4090, I am all set for many years. 12700(K) is also very good.
Still running my i7-8700k and loving it. It's seven years old at this point but it still holds up.
I have just pulled the trigger on a new rig. Had the 4770k for 10 years so excited to say the least to get my hands on the new 9800x3d.
@@fatmike01Lol I still have the 4770k . Still running current gen games great, thanks to my 16 gb ram and my gtx 1660 ti. Newer games avg around 50 fps with 1080 resolution. Not shabby at all with a 10 year old CPU! Turns out most games nowadays rely mostly on GPU and ram.
I just went from a 10700k to a 12700k thanks to a Microcenter bundle. $250 with a good motherboard included. Very impressed with the power jump. Using a 4070 Super and everything runs like a dream.
Perfect timing, with my morning coffee in my hand.
Cheers from Sweden :)
My 13700k is over 2 years old now and has been oc since day one to 5.6ghz on all 8 p cores, All 8 E core oc to 4.4ghz, Ring oc to 5.0ghz at 1.34v. Even the mem controller has been amazing.
32g.b Corsair rgb vengeance 6400 cl32 hynix A-Die oc to 7200 34-41-41-83 at 1.45v. My computer is on 7 days a week over 8 hours a day for work and gaming. Must be some sort of 13th gen miracle. Haven’t even updated the bios since Jan of this year.
Same as me. Anyone who set the bios up right and undervolted properly since the start will be fine I feel.
Wanted to comment that I’ve got my dice set and they are wonderful- the wooden case is particularly impressive! Thanks Steve! Back to you
That's awesome! Thank you for being one of the first buyers and glad you like them!
My 12400f is doing fine. Won’t upgrade for a while yet. Thanks for the video update.
I've been running a 12700k at 4.9ghz p core 4.0ghz e core (all cores) with negative offset since launch. Runs perfect for games at 4k (3080 TI). If you don't care much about productivity and are playing games at 4k, you're GPU bound and wont see much uplift from a CPU. Right now I don't see a point to upgrading for probably another 2 years.
Still running my launch 12700K. Had a 3080ti that I eventually switched to a 4080S due to vram failure. My only regret was not going with DDR5.
Yup my 12400 still doing me just great.😁
Wow you read my mind, such a well timed video,thank you. My only complaint would be , like with your standard reviews power consumption is a very real consideration so as a 12600k user I was hoping to see some insight as to if the 13700k used less power than the 12900k.
We need an 11900k revisit. Would be fun
Waste of sand > melted silicon.
The new equivalent charts are very cool. Love that comparative performance info!
I'm so glad I went 12th Gen 12700K and not 13th gen when I had the option to choose. Who would've ever saw such a disaster coming? 😅😅
I paired this with a 4060TI 3x Ventus, and 64 GB of DDR5 6000Mhz WAM. I kind of regret getting an MSI motherboard (MPG Z790 Edge), because it won't let me overclock and use the full 6000Mhz without blue screening. The MOBO defultively caps the wam 5600Mhz. Despite that, I haven't experienced any issues regarding the 12700K itself.
If only Windows 11 was as good as my PC...
I never experienced issues with Windows 11 on my PC and the 12700K minus a weird Windows Update bug (my bad on my side).
Running a 13700k + 4090 at 4K. Probably just gonna use the 13700k and upgrade my gpu to a 5090 when the time comes as that will be "the biggest" upgrade in my case.
Great video as always Steve!
Same here 13700k and 4090 for the past 2 years and has been absolutely flawless. It’s been oc since day one to 5.6ghz on all 8 p cores, All 8 E core oc to 4.4ghz, Ring oc to 5.0ghz at 1.34v. Still on bios from Jan of 24 and have no plans on updating the bios. Even the mem controller has been amazing. 32g.b Corsair rgb vengeance 6400 cl32 hynix A-Die oc to 7200 34-41-41-83 at 1.45v. My system is on 7 days a week over 8 hours a day for work and gaming. It’s been an absolute reliable work horse. The 13th/14th gen problem was so blown out of portion in my opinion. Same as the 4090 connector melting. Same as oled burn in. People loose sleep over the dumbest crap.
Not being a rude but why do you need to go from a 4090 to a 5090?
@ not rude at all to ask. The short answer, I don’t need to, I just really want higher fps in games like cp2077/alan wake 2, basically the stuff pushing the boundaries with path tracing / ray tracing.
My current PC has a 12400F, built it when Alder Lake first came out and it's served me well. I am thinking of doing an in socket upgrade to a 13700K within the next 1 or 2 months to get me a couple more years on the same motherboard/ram set up.
if you change socket it's better to go with am5 for me, or go with 12700k-12900k(12900k i think price it's too high) and keep same socket, i would avoid top 13-14 gen due to degradation problems that even with patch seems not resolved but only mitigated...in server/datacenter with cpu at work h24 they still have instability issue
i love my i5-12400F because of the non E cores tho if i was to upgrade i would get the i9-12900 and stick to a all intel setup ie motherboard ect but with a nvidea gpu.
@@dylan.dog1 The bios don't fix already degraded CPUs, i have a 13700K and is working perfect after 1,5 year, and if it was degraded and it breaks i'll ask intel for a replacement since they increased to 4 years warranty on 13 and 14.
The 14700k was only like 5 dollars more expensive on new egg in the US 2 weeks ago. I have no idea if that will last.
@@guille92h for day to day desktop users it's more hard to see than in a server or datacenter ((with cpu working h24 and where you need everything stable)), and cpu affected by it are 20% so your 13700k can be fine, anyway you ll don't see a break, your pc will work anyway but you ll have stabilities issue, maybe a program won't work or crash or game crash or pc crash or some online navigation instability,with lag, restart etc.. or strange shuttering or various problems like black, screen blue screen errors, etc... some users can have only one or 2/3 of these problems that are related to cpu degradation and don't even know
Great video, as always. I'm glad that your considerations about the Alder lake series in 2024 matched exactly my conclusions when I was upgrading 5 months ago. I had a strict budget for the upgrand, and my primary choice - R7 5700X / B550 Mobo / 32gbram were still a bit high in price here in Brazil for the good reputation AM4 built over the years. Instead, I got me a 12600KF build with a B660 PG4 that smashes the 5700X in games and general performance, and still managed to save around 140usd by chosing that, which allowed me also to complement savings and upgrade from an RX5700 to an RX6700XT. The R5 7600 non-x was an option, but the build with DDR5 and the expensive mobos would've cost 220usd more for just 5~7% more performance.
I have built a PC with a 12600K back in 2022, and almost 3 years later I've never felt the need to upgrade
All the reviews lately are showing 1080p with over 300 fps. Seriously no real world benefit to anyone.
I got back into building PCs with a 12700k. That was a fine machine, and the friend I sold it to is still using it. Also, from the time machine, "performance fortunately has also crept up but so has power," made me chuckle.
12700k still probably the best price to performance on the intel side
Has the best perfomance per watt as well from an i7.
16:46 Minor correction, 12700k and 13600k have the same THREAD count, but not the same core count. Not a big deal and doesn’t change the message of the video, but thought I’d point it out given GNs never ending strive for the upmost accuracy.
Still rocking my i5-12600kf and no plans to upgrade for a long time unless it's really worth it.
It still handles well!
potentially bartlett lake next year could drop something decent and STABLE... hopefully.
@@GamersNexus paired it with an rtx 3060 ti, works like a champ. 💪
@@Neo_X90X It will also handle the future RTX5080 without any problems, for example at 2K resolution.
@@pontiac797 I've already been playing at 2k for years now, but I do plan on getting a 50 series card next year.
I'm still rockin a 7700k + 1070 and Monster Hunter Wilds was finally the wake up call to get a new PC. Being an intel fan I decided to hold out for the 15th gen release and.. man, that 9800X3D sure does look sexy. Prior to the 9800X3D release I was considering getting a 12th gen intel since they're so damn cheap and still have good performance results without having to deal with problems like self immolation.
New killer gaming CPU, DDR5 RAM, PCIe 5.0 GPUs in January, it's nice to be back on top. And broke, very broke.
In May 2024 I built a PC for 12600KF. СPU cost me $170. I still think this is a great solution for the price.
Thank you for yet again affirming my upgrade decision from earlier this year.
Now to just get the GPU up to date
it was beautiful…
That was a strange time! AMD was way too expensive and Intel was fighting pretty hard.
@ yeah. it’s crazy how in just few short years we went from toe to toe battles for x86 competition to genuinely hoping either intel of all companies gets better or face the consequences. almost sad if it wasn’t so deserved.
I am happy to see this up-to-date revisiting and testing, thank you.
I own a mini itx build (Lian Li Q58 case), which has an undervolted i9 12900F cpu, 240mm AIO, rtx 4070 super, and 6200MT/s 32GB DDR5. I am doing some light gaming, and some rendering hence i opted for an i9 cpu which was very cheap second hand. Been thinking to buy a newer processor(or not)already, but it seems even the i7 counterpart is still relevant in 2024 or onwards. Amazing.
More Testing Woot!
I ve had 12400F, 12600kF and 12700K + 3060 Ti/ 7800 XT, 6700 XT and I am totally happy, no need for any form of upgrades.
The key point is does the modern games give 1% lows that are drastically lower than the fps average.
Most of the time other than 12400F the 12600k and 12700K almost all the times give good 1% lows and I am so happy with Intel 12th gens.
I ve had 5600x + 6700xt pc as well as 5800x + 6900 xt PC. Not sayn these combos were bad but I d say the Intel Nvidia combos gave me best results in most competetive games
And 12700k + 7800xt is a superb 1440p beast even today.
I wud never upgrade at least till 2027.
Returned my 285K and finally switched to AMD. Ain’t going back any time soon!
Love the 9800X3D finally no more CPU bottleneck in modern games.
There is 100% no bottleneck in anything with a 12900k
@@drunkhusband6257 Are you serious? What planet do you live on? Because it sure as hell ain’t earth! There is a huge bottleneck even with a 14900K in every single modern game once you try to get near 144FPS. And in flight sims there is even bigger differences even at low frame rates. 14900K does 45 to 55 FPS while the 9800X3D does 90 to 100 FPS without changing any setting and with same GPU.
And in cyberpunk no matter what u do you won’t go over a like 135FPS even at the lowest setting and ultra performance upscaling while the 9800X3D allows it to go past 190FPS without changing anything and again the same GPU. Same in Hogwarts Legacy, same in Jedi survivor, same in stars wars outlaws, same in watch dogs legion, and so on and so on.
The performance is up to 70% faster so what the hell are you even saying, your clearly have no clue what u are talking about sorry not sorry.
@@pvdgucht You are totally wrong and making things up. lol performance at 1440p and above isn't even 30% faster. It's fine, live in your dream world that doesn't exist.
@@drunkhusband6257 Didn’t say it was 30% at 4K but anyway (well actually in flight simulator it comes close but that one is an exception. I not making anything up. I did all the test and benchmarked everything myself and noted all the FPS. I know what I am talking about. I saw a CPU bottleneck in almost any modern game at higher frame rates with Intel CPU compared to X3D.
And I don’t get why you can’t accept that and are calling me a liar, I am just straight up telling what I saw, and I invite you to do the tests yourself and see that with a 12900K your frame rate stops at a certain point often near 130FPS if you lower all graphics settings and if you lowed them even more nothing happens, while with a 9800X3D it does go much higher indicating a CPU bottleneck. Or try to run flight simulators on these two CPU’s and you will instantly see what I am talking about.
@@pvdgucht It's about 15% tops at 1440p, and less than 10% difference at 4k. Unless you play at 1080p, who cares.
I went from a 7700K to a 12900K back when it released. Also still rocking my RTX3080. I've not felt any need to upgrade whatsoever. It used to be that we'd have a Crysis type game come out that would push the performance envelope and requirements - but nothing pushes the boundaries that hard these days. I can easily play any current game at > 120fps+ at 1440p, and that's perfectly fine.
Intel 12th gen is the goat
Rocking 12700k here, bought based on your testing from 2 years ago. Power limited to 100W, still got 90% performance out of it and
12th gen is absolute bangers to this day. Stocked up on 12700Ks for like $100 or something from the TikTok Newegg sale last year. Running 12th gen in all my systems and probably will stay that way for a while
They are indeed hot product 😂
My brother and girlfriend are both on 12400f, and i'm still rocking a 12700k. I play at 4k, so my games rarely cause it to break a sweat in gaming. It's amazing for running my audio software (Sony ACID Pro 11) as well.
When Intel was good: Sandy Bridge to Coffee Lake. That's when it worked right and felt right. After that, everything got weird. 10th gen was only good for the gold 10900Ks, the rest were crap. 11th gen was a lousy backport with virtually no cache. 12th gen was still needlessly complicated with P+e cores requiring a new scheduler, and the reason Intel didn't screw it all up was because they were comfortable in their (temporary) lead over AMD so they kept it sane. Everything Intel has done after 2021 has been an UTTER failure that often requires moronicism to excuse... Raptor Lake: fail, Arc Alchemist: EPIC FAIL, Raptor Refresh: fail, Error Lake: many different flavors of FAIL... Gelsinger: FAIL!!
12 gen was al;so 12400, 12500 12600's NO e-cores!!!!!!!
@@sezwo5774That’s what I use i5-12600 non K, flawless.
"12th gen was still needlessly complicated with P+e cores requiring a new scheduler"
The i7-12700K makes the P-core only i9-10900K look like a Bulldozer CPU.
I was able to use my Intel employee discount for the 12900KS, still rocking it solid. Overclocked to the max. Paid $180 for it.
I miss sandy bridge.
If AMD didn't come out with Ryzen, we might have still been using quad core i7s to this day
Alder Lake IS Sandy Bridge 2.0.
I built a new system around a 12400F in late 2023 and I do not regret my decision.
Still Rocking my 6700K on a constant OC, unkillable :D
its dead
Same. I have mine paired with a 4090 and have zero gaming issues with my rig. Some day I'll upgrade, but only for Windows 11 when it becomes mandatory.
@@dominicpinchott7432 Nice :D Im using a 3070 Ti atm, but will probably do a full new build soonish and give this computer away to my brother.
It's a bummer seeing your CPU slowly slip off of the charts but like you said, it's important to take a step back and think about whether you're actually unhappy with it outside of just the numbers. My 9600k feels absolutely ancient when i watch reviews but still completely handles all of my games just fine.
Im rocking my i7-4790 3.60 GHz with a GIGABYTE GA-B85M-HD motherboard for 11 years still going strong
Thanks! This summer I upgraded my i5-12400 to 13600k with 160$ and have been very happy about it !
LOVE that yall did that revisit! This gen may age similarly to Sandy. Unfortunately, I built my 12900k rig on ddr4 so I think it'll just make more sense to keep it as-is until I give it to a family member or something.
One test I wish yall had done was power draw and or efficiency. Maybe I missed it. With the latter, I enjoy how yall adjust the parts for the same wattage and then the same performance (to see how many watts each pull for the close to the same perf).
The only thing that drives me to a new build right now is how power hungry this system is. Based on your last video, this thing and for that matter all of 12, 13 and 14th gen high-end cpus are absolute behemoth porkin pigs lol. I might switch just to save on my power bill for premiere/topaz rendering which can be hours like Blender.
I went DDR5 which at the time cost £300 for 32GB 5600. I think you made the correct choice. Power isn't an issue for me as I use mine for coding and gaming. I have it PL1/2 locked at 125W and a .07 undervolt on the core which runs games ~60-75W. I can't remember the CBr23 score but think it was 24k vs 27k uncapped.
@@alistermunro7090 The i7-12700K is the best perfomance per watt chip from that gen, the 12900K is worse.
@@saricubra2867 That's why I have mine PL1/2 locked and undervolterd.
There is 2 points, that where not covered in this video
1. Non K parts can be overclocked via FSB on some boards
2. There is option to enable AVX-512 on some boards when disabling E-cores. May impact grately in some workload scenarios.
3. Oc ring bus way further with ecores turned off, and the cache on ecores gets fully available for the pcores. 4. No thread scheduler issues or cores going to sleep when they really shouldnt.
@@impuls60 I don't think the cache on e-cores can be used by the p-cores, only L3 (which is always shared by all cores).
Love the revisiting videos. I already have nostalgic memories of the Alder Lake hype.
I love the comparison charts you do as it puts thing into perspective and ranks things for me which I am unfamiliar with. The comparison charts in your recent GTX 1080Ti video were really, really good as were these.
I’m still rocking my 12900k, Msi 3080 gaming z trio and 64gb of DDR5 6000mhz and it still kicks ass to this day. My Corsair 280x case and 240 AIO doubles as a space heater but hope to remedy that with a external mounted radiator outside my room where it gets down to -35 regularly (Alberta Canada) if y’all could give me some insight or tips on pros/cons to this it’d be super appreciated. I know condensation will probably be a problem but my pc stays on 24/7
I'm still rocking a 10700k and 3070 , honestly happy with gaming performance at 2K. Still love the meshify 2 case I bought based on Gamer's nexus review , best case I've ever owned
People in the comments saying: "im still running alder lake", guys. girls. You can still use an 6700 (non-K) no problem. was using one until recently. For youtube and competitve gaming (BF5, CS2 etc) it works. Bought an 12600K and of course it is a monster near the 6700, but if u only play cs2 and watch youtube, no problem with the 6700. Before the 6700 i was using an i7-4930K and only change it to go to ddr4 and then use the same ram when going to the 12600k.
Just got an i7-12700k and MSI mobo bundle from microcenter for a friend's daughter's PC repair/upgrade. Previously 8th gen Intel. Then stumbled on this video in my feed. Hopefully a good value. The hardware bundle that is. The GN videos are always a great value. 💰
Went from 4690k to the 12700k. I'm happy with it so far, had to upgrade cause the old boy just couldn't handle modern games at high frame rates but damn it was a great chip. Got it up to 4.7 overclock and was an absolute beast. Had to lower to 4.6 ghz after a few years cause of instability but still was very happy with my purchase. Also thank you micro center cause i got that chip for $186 at the time. Sentimental cause it was my first build and still run it as a backup PC if friends come over to game.
Great video as usual, and on a topic that is of particular interest to me, as an Alder Lake owner. I wish you'd tested the differences between DDR4 and DDR5, though. From what I've seen, those numbers are sort of all over the place. At release, for example, most reviewers found a negligible uplift for DDR5. In newer games, however, HUB found that Alder Lake can benefit enormously from DDR5, in some cases to a much larger degree than Raptor Lake.
For me I'm not sure it really matters, because I'm not exactly a cutting edge gamer. But the memory issue seems like a substantial wrinkle in the 'should I or shouldn't I upgrade' discussion. You did touch on this in the conclusion, but direct and contemporary comparisons between DDR4 and DDR5 for Alder Lake are pretty hard to find, so more data would be appreciated.
Anyway, in the general case upgrading from Alder Lake clearly isn't worth it right now. As you say, in-socket options (Raptor) are a mine field, and newer platforms don't offer a performance uplift sufficient to justify their huge cost. People running lower-end Alder can get a pretty good boost out of a cheap 12700 or 12900 right now, though, which is nice. Just have to watch those VRM; some cheaper boards might balk.
Off topic question: has fan and psu testing been abandoned, or are the processes still being worked on?
intel was at a crossroads after 12th gen:
- invest into developing a new and better architecture and build it on the best possible process
- don't invest anything and see if you can squeeze out a tiny bit more out of the old stuff no matter how or the consequences
RUclips comment section probably isn’t the best place for this but figured I would try.
I just upgraded my PC after 10 years (not counting GPU). I now have the following new parts (again, not counting my 3080 GPU)
32gb DDR5 Corsair Vengeance 6000MHz
Z790 Aorus Pro X
12th Gen Intel i9 12900k
Corsair Nautilus 240 RS AIO
The “issues” I’m encountering are that it takes much longer to boot up from POST. And my PC has become significantly louder than before. HWMonitor shows CPU temps don’t usually go about 40-50C but fans really start spinning hard. If anyone would have any suggestions or troubleshooting ideas, I’d be all for it.
Awesome comparison! This has been a great reference trying to figure out a good upgrade for my brother's setup so it's as powerful as my 12700kf / RTX 3080 12gb build. He's got my old 5820k / GTX 1080 and he wants to do 1440p ultrawide gaming, so it's been a real head scratcher trying to find the best bang for the buck cpu/gpu combo with all the deals going on for the holidays. Been thinking about either a 5700X3D / RX 6800 combo or a 5600X / RX 7800XT combo, but I'm worried about the 5700X3D being wasted on the RX 6800 and vice versa for the other combo option. Best option would probably be going with the 5700X3D and RX 7800XT, but my budget might not be able to be pushed that far. Regardless, excellent video! Gave me plenty to think about!
These kinds of reviews are cool as fuck, yo. Please keep revisiting older SKUs of products because... well, we're gonna need to know pretty soon how best we can get by with older hardware. We're ALL gonna be doing "budget builds" once the tariffs hit. It's important to know just how long one can expect a CPU to not only last, but stay relevant.
Had a 12400 and upgraded to a 12700kf that I use everyday. Amazing value and performance.