Forget PresentMon and GPU Busy, I think the number counting metric might actually have dethroned the number of Xs as the true measure of ultimate performance! Kudos to Gamers Nexus for always pushing the envelope and moving the industry forward.
Isn't it the same as the 13600K + 200MHz? I run my 13600KF at either 5.8GHz on all P-cores, or 6GHz on two P-cores and 5.8GHz on the rest. It runs cooler than my old 9600K @ 5.2GHz; typical gaming temps are around 50-55°C. I do have a contact frame and use a Liquid Freezer II 280mm AIO. Very capable chip (13th gen). I get why sales of the 13th gen are outstripping the 14th gen, though; it's a bit daft trying to say it's a different generation.
Steve absolutely kills it - 'Name Number Count' graph 💀 ( Updating from a 13600k to a 7800x3d to pair with my 4090, thanks to the info presented by this channel )
That upgrade is a waste of money. A whole platform costs up to 600-800€ just to gain a few frames in some games while losing in productivity. Having money and being smart never seem to go hand in hand.
@@techluvin7691 "...respect it deserves." not "respect." When combined with the video's introduction, I thought the sardonic humor was plain. I guess that's why I'm not a successful standup comedian.
Thank you for all that you do. I finally did a new build after like 12 years, and a lot of it was chosen based on what I've seen on GN. I'm finally ready to join the 21st century.
Doubling down on the sarcasm, I see. Love it. 😂 You and your team's high degree of skepticism around performance claims is one of the primary reasons for me tuning in. Please don't stop.
@@maxmustermann5932 I don't know how a functioning adult, seeing improvements that are within margin of error presented in a sarcastic manner, can't fathom that it is sarcasm.
This Intel gen is the equivalent of your mom telling you to make your room cleaner, and all you did was fold a sock and call it a day because it's technically "cleaner".
I honestly think they made it crap on purpose just so people go "whoa, 15th gen Intel is so good, what a lift over 14th gen!"... That's what happened with Alder Lake: the gen before it was pretty bad and very power hungry.
No, no -- clearly you don't understand: The letter "E" subtracts one. Unfortunately, the letter "G" is as yet unknown. We should put a bounty out on its research.
In terms of product names not confusing consumers, an example is AMD revealing the RX 6750 GRE GPU despite releasing the RX 7700 XT and RX 7800 XT a month earlier.
Nice review, guys! 😄 Could you please include "idle" and "light load" power consumption tests in the review? It would be really worthwhile information for office users.
This was extremely informative. Putting this Intel generation's overly disappointing K/D ratio in such a plain and straightforward way is credit to the GN Team's methodologies and commitment to getting it left.... I mean, right.
I'm very happy that I opted to go with that chip instead of jumping to AM5. Not that there is anything wrong with AM5, but I saved so much money going from a 3700X to the 5800X3D just by being able to reuse my AM4 motherboard.
With a large complex simulation like Stellaris, I think the most likely explanation is that memory latency is the limiting factor. If a game needs to fetch a ton of data from all over RAM, the CPU is effectively going to be idle a lot of the time.
For more math fun: this equation is known as Coulomb's law, and it describes the electrostatic force between charged objects. The constant of proportionality k is called Coulomb's constant. In SI units, k has the value k = 8.99 × 10⁹ N·m²/C².
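Plugging the constant quoted above into the formula gives a quick sketch; the two 1 µC charges and the 10 cm separation are made-up example values, not from the comment:

```python
# Coulomb's law: F = k * q1 * q2 / r^2
K = 8.99e9  # Coulomb's constant in N·m²/C², as quoted above


def coulomb_force(q1, q2, r):
    """Electrostatic force in newtons between point charges q1, q2 (coulombs) at distance r (meters)."""
    return K * q1 * q2 / r ** 2


# Example: two 1 µC charges 10 cm apart
print(coulomb_force(1e-6, 1e-6, 0.1))  # ≈ 0.899 N
```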
Long time watcher, first time commenter. This is a perfect review! Your approach to objectivity in measurements is truly second-to-none! I cannot imagine the painstaking labour that goes into double- and triple-checking your numbers. I also appreciate your commitment towards transparency in outlining your methodology so that we, at home, can verify your findings!
Nice to see GN approaching industry reviews with all the seriousness it deserves. Comparing the 14600K with the 5600X reveals the 14600K would net a 10% improvement in framerate and a 25% improvement in 1% lows. But most people aren't working with anywhere near the power of a 4090. For most people, the differences will be a lot lower and will have a minimal impact on overall experience. If you need the extra threads for some serious productivity work, that is one thing. But it would be hard for a gamer to justify spending twice as much on their CPU when building a midrange gaming PC. Even a previous gen intel CPU would be a more sensible choice. For that matter, it is hard to justify buying a current gen Ryzen CPU for a midrange build.
That's not 100% accurate. It depends on the game. In Ghostwire: Tokyo with a 5700X and max ray tracing, it drops to under 60 when you move fast; with a 14600K it will be over 100fps all the time. The same happens with Hogwarts Legacy, The Last of Us, Jedi: Survivor, The Witcher 3 RTX, and other CPU-bound games with ray tracing. Ryzen 5xxx is pretty bad in these games. I have had a 5700X for half a year and I know that. I'm gladly waiting for the 14600K to enjoy smooth gameplay in these heavily CPU-bound games.
@@GTGaming1990 I think my IMC is dying on my i7-12700K. Having issues with memory and errors in Memtest. I've tried other memory and still have issues. So I think the i5-14600K is the spot to be.
The intro made me spit my drink out several times. I can't believe AMD has 3x as many D's, way more than Intel's K's. These are the real numbers we look for in a review. Thanks to GN for getting straight to the numbers we watch these for.
Thanks Steve, I hope to see the CPU name number count sheet expand over the coming years. I'm truly taken aback by AMD's performance in their use of the 3D cache to increase their name number calculation. Congratulations to you and your team for cracking the complex meaning behind the K, X, and D in CPU names.
I am planning to upgrade my workstation and didn't want to waste the 64GB of DDR4 RAM. Your review was extremely useful; now I can settle on 13th gen and save some money 💰
May have to revisit these videos after factoring in Intel APO (application optimization), which only works on the 14 series via Intel DTT technology. Some are showing 20-30% performance increase in gaming applications.
I run game servers and seeing more single-thread benchmarks (especially power efficiency for a server that runs 24/7) would be very nice to see. It's pretty common for game servers to be using consumer hardware these days for the single-thread performance and ECC support on Ryzen.
Yeah the 7800 X3D is an awesome CPU. Price to performance is insane. And the 5800 X3D even better P2P than that. Also the 7800 XT is so worth it compared to a 4070 or a 4070ti. My new build will be AMD for sure.
I have the Ryzen 7 7800X3D at 30 to 50 watts + a Sapphire Pulse RX 7900 XT undervolted to 316 watts, and I'm very happy with this build. No changing to Intel again. It's the first time I've bought AMD and no regrets, and with AMD Adrenalin there's FSR 3 and other technologies I can use. I have an IPS 21:9 UWQHD 3440x1440p 144Hz 1ms monitor. This build is a beast for gaming 😊
Acer Nitro XV340CKPbmiipphzx - Gaming Monitor 34" 144 Hz 21:9 UltraWide QHD 86cm (3440x1440, IPS LED 1ms ...). When I bought it, it cost 800€, but today it costs less. It's a good monitor, I like it.
@@zee9709 40 series? No thanks, I like what I have. The undervolt works nicely. Nvidia, no more crazy prices; they think Nvidia graphics cards are made of solid gold 🙄. I play all games with no problems. Now full AMD and no regrets ATM 😁
@@burrfoottopknot Ummm, 40 series from Nvidia? You know, the RTX 4060 Ti being a 3060 Ti for the same price, and the 4060 being around a 3050. Did you miss the "Last year's performance, at this year's price" meme? :D
I loved the intro of this one almost as much as the 14th gen launch video, which to me was an instant classic. Refreshes are common in the hardware industry, but a new generation name signals 1 of 2 things to consumers: new or significant architecture improvements (Ryzen 5000), or price cuts on a refresh (Ryzen 4000G desktop).
The RX 590, for example, was only a refresh of the RX 580 and only around 7% faster, with the RX 580 being just a refresh of the RX 480, but at least it had 6 percent more. Which was still somewhat lackluster, but there you go.
@@leonavis So there were quite a few small changes between the RX 400 and RX 500 series of GPUs. Specifically, the RX 500 cards had higher power draw for slightly higher clocks, but they also had better power management for reduced power consumption under light or partial load. Many of the 500 series also included better cooling, etc. Yes, it was a refresh, but overall there were some clear design changes for the cards even if the CU count didn't change. Also, I believe OpenGL and Vulkan had slightly newer version support on the 500 series. Lastly, the RX 590 release wasn't called "RX 680". Plus, if memory serves, there were a lot of discounts and promotional offers making the 500 series effectively cheaper as well as just slightly better-designed cards.

The difference between this Intel launch and their past refreshes (or AMD and NVIDIA refreshes) is that Alder Lake's design was already improved upon; 13th gen already found some significant areas to improve. Plus Intel is asking a lot for these CPUs, so to see the power draw go up just to slightly increase some clock speeds is completely asinine when dealing with K-series CPUs. From all the data I've seen so far, 13th gen also performs better when overclocking. I know some people were able to get 14th gen up to some high clock speeds, but the 13900K/KS still gets better scores in some benchmarks when both the 13900K and 14900K are overclocked and compared. Now this may change with time as the extreme overclockers fine-tune their methods, but it's possible that 13th gen will remain the better overclocking platform because of lower power draw and/or less heat on the E-cores, allowing the P-cores more overhead.
@@aleksandarlazarov9182 It feels like Intel gets stingy with cores and sometimes even with hyperthreading, which I feel has also contributed to them falling so far behind the other industry players in process node size. Honestly, they would be doing so much better right now in every space: servers, laptops, desktops, CPUs, GPUs, workstation accelerator cards, etc. They have a really high-end AI card which can hit some pretty high performance figures, but holy crap does that thing suck down a lot of power: 600+ watts! I don't know which platform is the most affordable, but given AMD's MI210/MI250 and MI300, and the various platforms from Nvidia, I don't think it's a compelling option.
@@ComprehensiveTechnologyConsult I agree with you. A similarity, however, is that both refreshes took place because they didn't have anything better. If Intel had great improvements right now that would actually justify a new generation, they would use them. They don't. Same with AMD back then: they satisfied the middle class and lower end with that refresh just to have anything. That being said, over the years the cards actually gained great value due to the big memory capacity and long-running driver improvements by AMD. Still, as you rightly point out, they made improvements happen way beyond what Intel is doing right now. And we should always keep in mind: AMD's revenue is half that of Intel (including console SOCs and GPUs). Intel is, in general, much bigger than AMD and struggles so hard to keep up that they're doing this sort of nonsense out of necessity. It's almost painful to watch.
I have the 13600K and I think it is all you need for gaming right now. The next two big things will be 3D cache being used by Intel and the new AI cores, but I guess all of that will take at least until gen 17.
The efficiency that AMD brings to the market has to be appreciated. Being able to get that kind of performance out of a processor with only 5 numbers is no small feat. I think chiplets are involved.
The best GN review ever. Everyone knows that the 14th gen is the ring leader of the CPU circus, Steve's presentation of that fact is fitting AND entertaining. Although..... I am worried that my 7700k doesn't have NEARLY enough "K's" to run games on ultra.
I really love it when tech companies work together. See guys, both Intel and Nvidia teamed up to masterfully craft an entire generation of bad products that exist only to promote AMD's hardware. That's what I call friendship right there. So let's stop with all this bickering over GPU preferences and let's all come together to buy a 6700XT!
It would be interesting to compare idle and light load power consumption between Intel and AMD. While I occasionally game and also need speed for compiling, CAD work etc., most of the time I'm just browsing, watching RUclips, reading forums. From what I found out, Intel actually draws less power than AMD under light loads. It would be great to have information about mixed-use power consumption, I've found little information about this.
Came to say this. Most people's CPUs will spend 90% (blind guess) of the time idling, and as such, an Intel CPU with double power draw while under heavy load could actually use less power overall.
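The reasoning above, that a CPU with double the load power but a much lower idle draw can use less energy overall, can be made concrete with a back-of-the-envelope sketch. All wattages and hours here are hypothetical illustration values, not measurements:

```python
def daily_energy_wh(idle_w, load_w, load_hours, total_hours=24):
    """Energy used over one day in watt-hours, split between idle and full load."""
    idle_hours = total_hours - load_hours
    return idle_w * idle_hours + load_w * load_hours


# Hypothetical CPU A: very low idle draw, double the load draw
a = daily_energy_wh(idle_w=10, load_w=250, load_hours=2)  # 10*22 + 250*2 = 720 Wh
# Hypothetical CPU B: higher idle draw, half the load draw
b = daily_energy_wh(idle_w=45, load_w=125, load_hours=2)  # 45*22 + 125*2 = 1240 Wh
print(a, b)  # the high-load-power CPU still uses less energy over the day
```

With only two hours of heavy load per day, idle draw dominates the total; the crossover point shifts as the load hours increase.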
They don't compare idle power consumption because it would make AMD look bad. Intel's power consumption during idle is only 10-25% of what you will see on an AMD system. If you use your PC a lot for other things, or need to leave it on for certain tasks, that extra 30-35W during idle adds up, probably more than the extra power usage in gaming. And it's not something you can change either: any Intel CPU can be undervolted and power-limited for little to no performance loss, whereas with an AMD CPU, you're stuck with that 45W idle load no matter what. These channels live and die based on how much clickbait they can generate for zoomers. Real-world comparisons like idle power usage, and a power-limited perf/watt comparison between Intel and AMD to see how efficient each architecture *actually* is, are not priorities here. Dunking on Intel is what brings in the clicks from teenagers who can't figure out how to use an ad blocker on their phones, so that's what they publish. Look at their 14700K review: barely any mention that it's now the best value by a long shot for productivity, or mixed gaming/productivity, edging out the 7900X for a much lower price. Why isn't that emphasized? Probably because none of the very smart people who watch these videos care, or even understand what that sentence means.
@@ibtarnine I don't know what benchmark you've been running, but Intel CPUs under no circumstance consume "only 10-25%" of a competing AMD CPU when idling. It may be slightly lower in some scenarios, but in the benchmarks I've seen that actually measured idle power between RPL and Ryzen 7000, it's about the same. Intel's efficiency on desktop while idling is not bad, but that it's significantly different is a myth. For actual numbers, google for example "guru3d 13600 Power Consumption and temperatures" to find their review of the 13600K. Look at page 6: total system idle power with the 13600K is 67W, while with the Ryzen 7600X it's 68W. That's roughly 99%, not remotely close to a tenth! I indeed wish more reviewers on YT would benchmark this, just because I see this kind of misinformation repeated in comments regularly by apparent Intel fans.
People need to get into optimising / undervolting their voltages. I have a 13700k and in gaming it uses 60-120W depending on the game while at 5.6Ghz. I have a 150W power limit and it drops to 5.2-5.3Ghz. I could drop it to 125W even. But who cares, my GPU uses 350-450W.
But very expensive, unfortunately... It costs approx. 344 USD here, whereas the Ryzen 5 7600 (non-X) is only 206 USD (in Thailand), not counting discounts/sales of course, where it can be bought for perhaps up to 20% less.
Thanks Steve. I appreciated the added commentary on why Gamers Nexus reviewed these so harshly even though they were a refresh. I agree the naming of these processors makes no sense.
Got my 12600kf a while back and think it will last for me for quite awhile. Very happy with it; as in your video the price difference vs. the normal K model was a big seller. Hopefully, Intel makes strides back towards technical parity if not dominance in the coming generations. Who knows? Maybe AMD won't have TSMC's Taiwan fabs to rely on in a few years...though I guess that's why they are building new fabs in the US, Europe, and Japan.
People are SERIOUSLY sleeping on the 12600k(f) right now. It has basically the same performance as the 7600x and better multicore and it costs less. You can also get a fully featured z690 mb for less than an AMD MB with the same feature set. If you only want to game it's literally one of the best deals right now.
I ran my i5 2300 non-K for like 9 years and I was satisfied playing WoW with it. This time around I put more of my money into 10-years-plus quality components, like a 10-year-warranty 1300W power supply, a Noctua NH-D15 cooler, and a very roomy and easy-to-build mid-size case.
I just recently swapped my i7-2700K paired with a GTX 1060 3GB for an i5-12600K paired with an RTX 4070, and noticed a slight difference in CS:GO. Now playing CS2, and I do not think it is worth swapping the 12600K for a 14600K for CS2.
I've really been enjoying the review series. I have loved seeing the generational leap: finally the processor we deserve and not the one we needed... Also super disappointed, because I was so excited to upgrade my 12th gen processor thanks to something Intel has not done in a decade or more: having a socket last more than 2-ish years. Keep it up!
This isn’t to discredit the quality of your other videos but I could watch these sketches all day. “Bigger number better” was gold but this is also brilliant
E-cores seem totally useless to gamers. The superior productivity performance of an intel CPU isn't really worth the purchase price or power draw for most people. It seems the advantages would really only make sense in an enterprise environment where the time saved is worth more than the additional costs.
If you only used one TI-30 to do the count, how can you tell it calculated the count correctly? Multiple TI-30s, and indeed multiple brands and methods of counting, should have been employed. I can't take this video seriously.
They could have just not released 14th gen, kept making 13th gen as a 13650K or something, and cut the price of regular 13th gen. This would have been better for public perception.
They could have simply added an r (for refresh), so 13600Kr. The i7 should have been a 13800K (higher core count, still slotted below the i9); in retrospect, the only chip that should have been added was the 14700K, as a 13800K. This is supposed to be the last hurrah of the Intel Core i-branding. Next is supposed to be a naming change to coincide with the architecture change. Hoping for fewer growing pains than Intel Arc. Maybe they'll have realistic claims.
@@mariano3113 I think adding +++ would have been more fitting for Intel, like a 13600K+++. It could have been an inside joke with the enthusiasts, just like 14th gen is a joke with enthusiasts... LGA1700's last hurrah is pathetic compared to AM4's last hurrah.
@@mikesteve3039 AM4 X3D is historic in a good way. Intel LGA1700 is ending with a refresh amid cancelling desktop Meteor Lake and skipping to Arrow Lake. (I wonder how badly desktop Meteor Lake would have performed.)
Finally, a CPU review for the masses! I even screenshotted the digit count chart and printed it on paper for my wallet. Now whenever a wild CPU appears, I am prepared to hunt only for those with the most digits.
I too was worried that more numbers did not equal better. Thank you for performing the stressful amount of work necessary to confirm to us all that yes, more numbers do indeed equal more betterer.
@@N_N23296 Meteor Lake is not called Intel Core anymore, and it will launch in December this year. It uses the Intel 4 process node. Next year, Intel Core desktop will cease to exist as well, because the process node will be ready for desktops, making 14th "gen" just a stopgap filler, like 11th gen desktop was.
@@N_N23296 The name will be Intel Ultra 3, 5, 7, and 9. Desktop will get Arrow Lake; it's an SoP (system on package) chip with features in line with Apple M series chips, and it has an NPU (neural processing unit). That is very interesting, because in theory it can nullify NVIDIA's AI lead for graphics cards, forcing them to lower their prices.
@@N_N23296 No dude. Watch Moore's Law Is Dead, he has very good leaks of AMD, Intel, and NVIDIA. I already knew years ago that Raptor Lake and Raptor Lake refresh are a dud.
@@N_N23296 Intel did a Meteor Lake presentation this year, there's nothing fake about it, eventually that kind of design will be found on desktop, it's genuine Intel 4 process node.
I've only watched the beginning of this video, and I already "seriously" love it. I'm not sure if I want it to be more serious or less serious after the ad... Back to you, Steve...
On the subject of power efficiency, it's already quite established that AMD is presently more performant per watt, but another consideration is idle power consumption. I've been noticing that Intel CPUs under very light work or nothing at all only draw 9-12W (even i9s), whereas Ryzen seems to idle between 36-55W depending on if you've selected power saving mode or not. Why is this? Is it the extra power requirement for transmitting data off-silicon within the CPU on the infinity fabric? What accounts for this idle/web-browsing power draw?
LOL! I love it. A totally arbitrary, totally pointless scoring system, drawn out into far more slides than was ever necessary and then dryly described in corporate jargon. Steve, this is your best video :) The third this month! Genius.
I'm glad to hear you take your job seriously. Because if someone doesn't know, it could mislead and be confusing. It means a lot to those trying to learn.
You should label which CPU is running on DDR4 and which is running on DDR5. This is necessary information for making a proper comparison and decision when building a computer; it's needed to know how you came to these statistics so they don't change after a build because a variable changed, like RAM. DDR5 increases power draw, so it's definitely worth knowing all the variables that produced a specific statistic like power draw or performance. 13th gen gets a nice performance boost from DDR5, as seen in other test comparisons.

Thank you so much for the time and effort you put into making these informative videos. I'm in the process of gathering information so I can build myself a computer. I don't have a lot of money and the 2nd-hand market sucks, which means I'll probably only get one shot at it. I watch all your videos, and the information I'm getting from them is really appreciated for helping me not make poor choices when selecting hardware. Thanks again, Rigem
It's astonishing how the jump to 13th gen had the i5 trading blows with the 12th gen i9, and now with this one it barely trades blows with its own predecessor.
(Follow-up: The amount of people who don't get the counting joke with the i7-1185G7E is sort of crazy). Wait a minute: "K" often means "thousand," like 1K, 2K. Does that mean the 14600K is actually the 14,600,000? If that's true.... Intel has done it. They've won. They have dethroned the AMD X multiplier. Factor X.
Watch our CPU sample size testing here for something sane, but still involving internet comments: ruclips.net/video/PUeZQ3pky-w/видео.html
Watch our Intel Core i7-14700K CPU review: ruclips.net/video/8KKE-7BzB_M/видео.html
Watch our Intel i9-14900K CPU review: ruclips.net/video/2MvvCr-thM8/видео.html
Grab a GN Solder & Project Mat! store.gamersnexus.net/products/gn-project-soldering-mat
We need a new video to cover this huge finding
Yeah but AMD is 7800x3d and if we write this a bit differently it is 7800 * 10^3d and clearly 3d means 3days so 72 hours. 7800x3d is really 7800 * 1000000000000000000000000000000000000000000000000000000000000000000000000.
@@123JimmyTheCookie We would make one, but we fear the worldly consequences of such an initiative. No one knows what happens when we run out of numbers. What do product names become? The NVIDIA RTX X?
But AMD would maintain the number of Ds, whereas Intel's K standing for thousand means they just have a vague number of... something, but nothing to do with letters or anything important like that. So the question becomes: does Intel know that K is a number and is trying to top the number counting charts, or do they think their monopoly on the much rarer K (in terms of letter frequency in the English lexicon) is what will push them ahead?
Please review the Lian Li O11 Dynamic EVO XL
Wait, good point by commenters: Is the 'dash' in Intel product names actually a subtraction symbol?! Does that mean it's actually (imaginary) 7 MINUS 14900? This changes EVERYTHING.
😂
👍
Gasping for air, continues to laugh uncontrollably
The -14,900,000
The only CPU that will send your computer back in time
maybe they were on to something with 14+++++++++++
When a cpu is so boring that you go into depth on naming and number use. I love me some satire.
Also what does the i5 mean anyway?
This is it, the pinnacle of a CPU review.
No, it could be one higher.
we're missing a string of segues to our sponsor! oh wait
Just wait until the 15th generation of reviews come out.
The iSteve15600k will have at least 4 new performance gfuel cores.
A genuine masterpiece. If it was ray-traced it would be 10/10, but since it's raster I give this video 9 raster monkeys out of 10 Beve Sturke coffees ;)
They need to add bigger, better numbers to their video titles. They could learn something from the CPU industry.
I am more interested in Steve's cheeky takes on the 14th gen Intel CPU than in the CPU itself. I never thought I would have a blast watching this.
As PC marketing has gotten worse, Steve has gotten much better at entertaining! Otherwise he would be out of a job, having to review mediocre products so often.
@@aleksandarlazarov9182 Yeah, as if everyone forgot that there were Intel's 5th, 6th, and 7th gens, where the uplift was 5-10% each.
@@bsn4all-i9f Ooh, I haven't forgotten. I even have a 6th gen processor. XD
0.5% boost lol
I can't wait to see how User Benchmark makes this look like a huge upgrade generationally
“AMD DESTROYED”😂
Sometimes I wonder why Google doesn't take it down from their top results. I understand it tries to treat content indiscriminately and only cares about SEO, but surely that garbage of a website deserves special treatment. People genuinely buy suboptimal hardware because of it.
@@charlesbrown4483 AMD sure as heck did destroy Intel, even in power consumption. I was reading from other places that at max speed the 14900K uses 533W of power, which comes out to 311 degrees Fahrenheit, or 155 degrees Celsius.
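Watts don't translate into a temperature, but the two temperature figures quoted above do at least agree with each other; the standard Celsius-to-Fahrenheit conversion can be checked with a one-liner:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32


print(c_to_f(155))  # 311.0, matching the figures quoted above
```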
They say the only thing good about the 7800X3D is the price, and that the 14900K is faster at everything else lol.
Well... I just posted 129% for Processor performance with my new stock 14600K. It called me a Swiss, but I'm not! That matches stock 14700K / 13900K, and OC 13600K, and I only saw 65C with an air-cooler. I'm personally happy with that, and expecting to exceed the 14900K for gaming performance, with an overclock, when I need it. OC-ing even the 14700K is a no-go without a high-end AIO, which is another cost I don't need with the 14600K. Check TechPowerUp for confirmation on OC.
This is the intel generation of all time.
I thought it was 11th Gen?
@@dimancor2925 Maybe, but the 11th gen was a failed attempt at squishing a new arch into the old fab process. I mean, it was trash, but they actually tried something new.
This one is literally the 13th gen with a different name and some OC, I guess.
@@endless2239 You mean Rocket Lake (11th gen desktop), because Tiger Lake (11th gen mobile) was actually good: it got the full 24MB L3 cache improvement, for example, and the 10nm process node at the time.
this is definitely one of the intel generations of all time.
@@endless2239 11th gen was not that bad, but the decision to backport the arch to an older node was made way too late. The 11th gen should already have been the 10th gen, then things would have looked better
0:54
"3 count passes averaged" would've been even more hilarious if one of the processors had a .333 or .667
hahahaha, I am using this next time we run this... "test!"
@@GamersNexus Hopefully you won't have to, but given Intel's track record, it'll probably happen again with either the 15 or 16 series 😂
@@GamersNexus "we think this may have been due to a hardware fault with one of the buttons on our calculator"
@@GamersNexus Or introduce a fake floating point error. It'd be like, "What the hell? These are all integers!"
Glad to see you guys being very thorough and doing 3 count passes of the name. This level of detail and willingness to go above and beyond is remarkable.
LMG wouldn't have checked 3 times. This is what sets GN apart from the chaff
You know what they say: never two without three... so obviously doing 3 count passes of the name was the most scientifically accurate test to do, duh!
They even specified the exact methodology and measuring equipment used. As always GN continues to be the benchmark of benchmarking.
I am glad you posted the Model of Calculator you used to calculate the number of numbers in the CPU names. That gives me confidence in your reviews. Thanks Mr. Steve Dude.
3:27 Video End.
Fantastic review Steve, you've really blown the lid off the relaxed norms of CPU industry reviewing.
Hey Greg, it's 'clique', not 'click'
🤣🤣🤣
That was what my brain focused on during the intro as well. And the beautiful charts, hahahaha.
Yes “clique” not “click” thank you! That bugged me immediately lol
Yeah Greg 🙄
Back to school Greg. For French and CPU classes.
Sure, the number of digits in the name is the same for AMD and Intel, but if you calculate the digit sum, the results would be vastly different!
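For anyone who wants to run this groundbreaking digit-sum benchmark at home, here's a minimal Python sketch (the model names are just examples, and the metric is obviously as made-up as the joke):

```python
# Hypothetical "digit sum" metric: add up every digit in a CPU's name.
def digit_sum(name: str) -> int:
    return sum(int(ch) for ch in name if ch.isdigit())

print(digit_sum("i5-14600K"))  # 5+1+4+6+0+0 = 16
print(digit_sum("7800X3D"))    # 7+8+0+0+3 = 18
```

By this metric AMD pulls ahead, so the results really would be vastly different.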
Forget PresentMon and GPU Busy, I think the number counting metric might actually have dethroned the number of Xs as the true measure of ultimate performance! Kudos to Gamers Nexus for always pushing the envelope and moving the industry forward.
"always pushing the envelope and moving the industry forward"
Forgets GPU busy and presentmon afterwards.
The K rating is now the temperature in Kelvin the CPU needs to run at in order to have a meaningful performance uplift between generations.
I fucking lost it man. Good stuff.
That's genius, lol
Sure seems that way.
Dude, Thank You for that!
Isn't it the same as the 13600K + 200MHz? I run my 13600KF at either 5.8GHz on all P-cores, or 6GHz on two P-cores and 5.8GHz on the rest. It runs cooler than my old 9600K @ 5.2GHz; typical gaming temps are around 50-55°C. I do have a contact frame and use a Liquid Freezer II 280mm AIO. Very capable chip (13th gen). I get why sales of 13th gen are outstripping 14th gen, though; it's a bit daft trying to call it a different generation.
Thank you Greg, for indirectly giving us the single best video intro I’ve ever seen. Kudos.
Steve absolutely kills it - 'Name Number Count' graph 💀 ( Updating from a 13600k to a 7800x3d to pair with my 4090, thanks to the info presented by this channel )
Should have gotten a 7900 XTX, the number is bigger and it has the X factor TWICE:)
Glad you got something useful out of the real charts! haha
I prefer a 7800X3DF (8 letters).
@@threddast An XFX RX 7900 XTX, of course
That upgrade is a waste of money. A whole new platform costs 600-800€ just to gain a few frames in some games while losing in productivity. Having money and being smart never seem to go hand in hand.
Thank you. This introduction is pure gold, giving the latest Intel series the respect it deserves.
Respect? You’re joking…….right?
@@techluvin7691 "...respect it deserves." not "respect." When combined with the video's introduction, I thought the sardonic humor was plain. I guess that's why I'm not a successful standup comedian.
@@RichardJohnson-GW Haha, he's just not that smart 😅
Probably my favorite CPU review of the past few years.
Greg is the kind of client Intel dreams of. If more people were like Greg, we would still be at 14nm+++++++++++++
if more people were like Greg we would still be in Stone Age
So, what you are saying is: Greg runs UserBenchmark?
🤣😂
@@Zidakuh bruh🤣🤣
Thank you for all that you do. I finally did a new build after like 12 years, and a lot of it was chosen based on what I've seen on GN. I'm finally ready to join the 21st century.
what specs did u choose?
@@megapet777
i5-13600k
Thermalright Phantom Spirit
MSI PRO Z790-A WIFI
RTX 3080
32GB (2x16GB) DDR5 5600
1TB Samsung 980 PRO M.2
2TB Samsung 980 PRO M.2 (for games)
Corsair 1200watt PSU
Fractal Design Torrent
Also threw in some older SATA SSDs and HDDs
@@megapet777 Probably a Threadripper Pro with 96 cores paired with 8x RTX 4090 Ti in SLI
@@GoldenEDM_2018 ah man of culture then
Doubling down on the sarcasm, I see. Love it. 😂
You and your team's high degree of skepticism around performance claims is one of the primary reasons for me tuning in. Please don't stop.
Though please don't ridicule people who have trouble understanding sarcasm.
@@maxmustermann5932 Nah, it would be best if they continued to do so. That's a perfectly fine thing to mock people for.
@@maxmustermann5932 I don't know how a functioning adult, seeing margin-of-error "improvements" presented in a sarcastic manner, could fail to fathom that it's sarcasm.
This Intel gen is the equivalent of your mom telling you to make your room cleaner, and all you did was fold a sock and call it a day because it's technically "cleaner."
It's like, "Yes, I folded this sock," while also raising the room temperature by 2 degrees.
@@wijn1008 What?
@@canaconn2388 Some of the 14000 CPUs run even hotter than their 13000 counterparts.
@@Daniel-vx5vc Nanomachines son.
I honestly think they made it crap on purpose just so people go "whoa, 15th gen Intels are so good, what a lift over 14th gen!"...that's what happened with alder lake, the gen before it was pretty bad and very power hungry.
That first chart is wrong the i7-1185G7E clearly has 6 numbers in the name not 5. I am really disappointed in GN and expected more.
No, no -- clearly you don't understand: The letter "E" subtracts one. Unfortunately, the letter "G" is as yet unknown. We should put a bounty out on its research.
@@GamersNexus Real Gs move in silence, like lasagna.
@@GamersNexus G might just be Giga (1e9). This is truly revolutionary stuff: i7-11850000000007E
really makes you think
Could also apply scrabble letter scores.
K is 5 points
D is 2 points
This needs to be investigated
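The investigation above is easy to sketch. A minimal Python take on the proposed Scrabble metric, using the standard English tile values (the model names are just examples):

```python
# Standard English Scrabble tile values for letters that show up in CPU suffixes.
TILE_VALUES = {"K": 5, "D": 2, "X": 8, "F": 4, "S": 1, "E": 1, "G": 2, "T": 1, "I": 1}

def scrabble_score(name: str) -> int:
    # Sum the tile value of every letter in the name; digits score nothing.
    return sum(TILE_VALUES.get(ch, 0) for ch in name.upper() if ch.isalpha())

print(scrabble_score("14600K"))   # K = 5
print(scrabble_score("7800X3D"))  # X + D = 8 + 2 = 10
```

The X is worth 8 points, so AMD's suffixes dominate this leaderboard too.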
@@GamersNexus In physics, the g-factor is actually approximately 2.
In terms of product names and not confusing consumers: an example is AMD revealing the RX 6750 GRE GPU despite releasing the RX 7700 XT and RX 7800 XT a month earlier.
Nice review, guys! 😄 Could you please include "idle" and "light load" power consumption tests in the review? It would be really worthwhile information for office users.
This was extremely informative. Putting this Intel generation's overly disappointing K/D ratio in such a plain and straightforward way is credit to the GN Team's methodologies and commitment to getting it left.... I mean, right.
It's shocking how good the 5800X3D still is!
I'm very happy that I opted to go with that chip instead of jumping to AM5. Not that there is anything wrong with AM5, but I saved so much money going from a 3700X to the 5800X3D just by being able to reuse my AM4 motherboard.
there it is , daily 5800x3d comment
I'm so, so fucking happy with my 5800X3D + 7900XT combo. I'm a 1440p 144Hz gamer and it gobbles up anything I throw at it.
No……..actually, it isn’t shocking. It is specifically a gaming CPU and AMD got it right on the first try. Their AM5 x3d is not as good.
In half of games that take advantage of the extra cache**
Otherwise it's just a worse 5800X
With a large complex simulation like Stellaris, I think the most likely explanation is that memory latency is the limiting factor. If a game needs to fetch a ton of data from all over RAM, the CPU is effectively going to be idle a lot of the time.
I don't think I have ever seen tech reviews this entertaining and informative in the same video.
This was some damn fine work gentlemen.
For more math fun: this equation is known as Coulomb's law, and it describes the electrostatic force between charged objects. The constant of proportionality k is called Coulomb's constant. In SI units, k = 8.99 × 10⁹ N·m²/C².
AMD is truly groundbreaking with their Factor X. A 7800X3D plus a XFX RX 7900XTX is truly the pinnacle of X performance
MY GOD! THAT IS SO MANY X's!
Long time watcher, first time commenter. This is a perfect review! Your approach to objectivity in measurements is truly second-to-none! I cannot imagine the painstaking labour that goes into double- and triple-checking your numbers. I also appreciate your commitment towards transparency in outlining your methodology so that we, at home, can verify your findings!
I love this in-depth, unbiased review!
It's rare to find such a gem of a video on RUclips! ♥
The number of numbers in name graph is one of the most useful graphs i have seen!
Nice to see GN approaching industry reviews with all the seriousness it deserves.
Comparing the 14600K with the 5600X reveals the 14600K would net a 10% improvement in framerate and a 25% improvement in 1% lows. But most people aren't working with anywhere near the power of a 4090. For most people, the differences will be a lot lower and will have a minimal impact on overall experience. If you need the extra threads for some serious productivity work, that is one thing. But it would be hard for a gamer to justify spending twice as much on their CPU when building a midrange gaming PC. Even a previous gen intel CPU would be a more sensible choice. For that matter, it is hard to justify buying a current gen Ryzen CPU for a midrange build.
That's not 100% accurate; it depends on the game. In Ghostwire: Tokyo with a 5700X and max ray tracing, it drops to under 60 when you move fast, while with a 14600K it will be over 100fps all the time. The same happens with Hogwarts Legacy, The Last of Us, Jedi: Survivor, Witcher 3 RTX, and other CPU-bound games with ray tracing. Ryzen 5xxx is pretty bad in these games. I've had a 5700X for half a year, so I know. I'm gladly waiting for the 14600K to enjoy smooth gameplay in these heavily CPU-bound games.
@@GTGaming1990 I think the IMC is dying on my i7-12700K. I'm having memory issues and errors in Memtest; I've tried other memory and still get errors. So I think the i5-14600K is the spot to be.
People don't just play games on CPUs. They also do work.
The intro made me spit my drink out several times. I can't believe AMD has 3X as many D's, way more than Intel's K's. These are the real numbers we look for in a review. Thanks to GN for getting straight to the numbers we watch these for.
Thanks Steve, I hope to see the CPU name number sheet expand over the coming years. I'm truly taken aback by AMD's performance in using the 3D cache to boost their name number calculation. Congratulations to you and your team for cracking the complex meaning behind the K, X, and D in CPUs.
We should thank Greg for having Steve educate us on Ds and Ks. "Back to you, Steve!"
I am planning to upgrade my workstation and didn't want to waste my 64GB of DDR4 RAM. Your review was extremely useful; now I can settle for 13th gen and save some money 💰
This review is so serious and accurate I almost thought the sponsor spot was a joke
Do people not like your humor? I feel like its part of your charm and sarcastically roasting megacorps is pure entertainment.
I second that. If I only came here when I wanted to buy something, I'd drop by once a year, but I come back every video for the shitposting ;)
So I should definitely get a 7800X3d to pair with my XFX RX 7900 XTX to compound those multipliers further.
The 7800X3D is the bee's knees for gaming. Also, if I recall, you can Crossfire the iGPU with an AMD dGPU.
@ReLo_FZR how'd you miss the shitposting that hard
@@tomboomeronacrv just realised lol
@@ReLo_FZR For what it's worth, I thought you were riffing off the joke. Crossfire hasn't been supported since the RX Vega series.
Someone here gets it. JAJAJA
That weird feeling when Steve put more seriousness in this review than Intel put innovation in their 14th series
Thank you! GN is truly the best. Not only are your news, reviews, and testing methodology top notch but you hold companies accountable.
May have to revisit these videos after factoring in Intel APO (application optimization), which only works on the 14 series via Intel DTT technology. Some are showing 20-30% performance increase in gaming applications.
For the 2 games that currently support it*, but yes, it would be nice to see the APO uplift just for reference.
I run game servers and seeing more single-thread benchmarks (especially power efficiency for a server that runs 24/7) would be very nice to see. It's pretty common for game servers to be using consumer hardware these days for the single-thread performance and ECC support on Ryzen.
I love how the sarcastic intros keep one-upping themselves. I mean, the totally serious intros about totally serious processors.
Epic video... Edge of my seat, the entire time! Thank you Steve!
I'm happy for 14th "generation" just because of making 13th gen stuff cheaper
As somebody who has never left my room, I agree: the last review was unacceptably confusing. Thank you for keeping this review so serious.
Yeah, the 7800X3D is an awesome CPU; price to performance is insane, and the 5800X3D has even better P2P than that. Also, the 7800 XT is so worth it compared to a 4070 or a 4070 Ti. My new build will be AMD for sure.
I have the Ryzen 7 7800X3D (30 to 50 watts) + a Sapphire Pulse RX 7900 XT undervolted to 316 watts, and I'm very happy with this build. No changing back to Intel. It's the first time I've bought AMD and I have no regrets, and I can use AMD Adrenalin, FSR 3, and other technologies. I have a 21:9 UWQHD 3440x1440 144Hz 1ms IPS monitor. This build is a beast for gaming 😊
@@albertovilela9015 What's your monitor's name?
Acer Nitro XV340CKPbmiipphzx, a 34" 144Hz 21:9 UltraWide QHD gaming monitor, 86cm (3440x1440, IPS LED, 1ms)... When I bought it, it cost 800€, but today it costs less. It's a good monitor; I like it.
If you want a low-wattage combo, you go Ryzen + 40 series.
@@zee9709 40 series? No thanks, I like what I have; the undervolt works nicely. No more crazy Nvidia prices for me, they act like their graphics cards are made of solid gold 🙄. I play all games with no problems, now full AMD and no regrets ATM 😁
These 13th, cough........ I mean 14th gen reviews have been absolutely hilarious. Entertaining as hell. Good Work!
Ahhh, just like Nvidia 3000 and 4000 lack of gen improvement. 😂
It is almost like AMD and Intel are gaming the market, kind of surprising since Nvidia and AMD never do this type of underhanded thing with GPU's.
@@burrfoottopknot Ummm, the 40 series from Nvidia? You know, the RTX 4060 Ti being a 3060 Ti for the same price, and the 4060 being around a 3050. Did you miss the "Last year's performance, at this year's price" meme? :D
Steve, I have noticed a devastating error in this review: on the first chart, the Intel i7-1185G7E contains 6 numbers, not 5.
No, no. We counted it 3 times. That can't be right.
@@GamersNexus You may need to increase the number of test passes in future tests
I loved the intro of this one almost as much as the 14th gen launch video - which to me was an instant classic.
Refreshes are common in the hardware industry, but a new generation name signals 1 of 2 things to consumers: new or significant architecture improvements (Ryzen 5000), or price cuts on a refresh (Ryzen 4000G desktop).
And somehow 14th is neither. :(
Better than the 11th gen downgrade at least!
The RX 590, for example, was only a refresh of the RX 580 and only around 7% faster, with the RX 580 itself being just a refresh of the RX 480, though at least that one was 6 percent faster. Still somewhat lackluster, but there you go.
@@leonavis So there were quite a few small changes between the RX 400 and RX 500 series of GPUs. Specifically, the RX 500 cards had higher power draw for slightly higher clocks, but they also had better power management for reduced consumption under light or partial load. Many of the 500 series also included better cooling, etc. Yes, it was a refresh, but overall there were some clear design changes even if the CU count didn't change. Also, I believe OpenGL and Vulkan had slightly newer version support on the 500 series.
Lastly, the RX 590's release wasn't called "RX 680." Plus, if memory serves, there were a lot of discounts and promotional offers making the 500 series effectively cheaper, as well as just slightly better designed cards.
The difference between this Intel launch and their past refreshes (or AMD and NVIDIA refreshes) is that Alder Lake's design had already been improved upon: 13th gen already found some significant areas to improve. Plus, Intel is asking a lot for these CPUs, so seeing the power draw go up just to slightly raise some clock speeds is completely asinine for K-series parts. From all the data I've seen so far, 13th gen performs better when overclocking; I know some people got 14th gen up to high clock speeds, but the 13900K/KS still gets better scores in some benchmarks when both the 13900K and 14900K are overclocked and compared. This may change as extreme overclockers fine-tune their methods, but it's possible 13th gen will remain the better overclocking platform because of lower power draw and/or less heat on the E-cores, giving the P-cores more headroom.
@@aleksandarlazarov9182 it feels like Intel gets stingy with cores and sometimes even with hyperthreading, which I feel has also contributed to them falling so far behind the other industry players in process node size.
Honestly, they could be doing so much better right now in every space: servers, laptops, desktops, CPUs, GPUs, workstation accelerator cards, etc. They have a really high-end AI card which can hit some pretty high performance figures, but holy crap does that thing suck down a lot of power: 600+ watts! I don't know which platform is the most affordable, but given AMD's MI210/MI250 and MI300, and the various platforms from Nvidia, I don't think it's a compelling option.
@@ComprehensiveTechnologyConsult I agree with you.
A similarity, however, is that both refreshes happened because they didn't have anything better. If Intel had great improvements right now that would actually justify a new generation, they would ship them. They don't. Same with AMD back then: they satisfied the midrange and lower end with that refresh just to have something.
That being said, over the years those cards actually gained great value thanks to their large memory capacity and AMD's long-running driver improvements. Still, as you rightly point out, they delivered improvements well beyond what Intel is doing right now.
And we should always keep in mind: AMD's revenue is half that of Intel's (including console SoCs and GPUs). Intel is, in general, much bigger than AMD and struggles so hard to keep up that they're doing this sort of nonsense out of necessity. It's almost painful to watch.
I thought the last video intro would never be topped, yet somehow you have done it! Well done, you got me hooked.
I have the 13600K and I think it's all you need for gaming right now. The next two big things will be Intel adopting 3D cache and the new AI cores, but I guess all of that will take at least until gen 17.
The efficiency that AMD brings to the market has to be appreciated. Being able to get that kind of performance out of a processor with only 5 numbers is no small feat. I think chiplets are involved.
The best GN review ever. Everyone knows that the 14th gen is the ring leader of the CPU circus, Steve's presentation of that fact is fitting AND entertaining. Although..... I am worried that my 7700k doesn't have NEARLY enough "K's" to run games on ultra.
Congrats?! A bot stole your comment :/
I want steve to explain how intel and AMD had increased their Ks and Ds over the years
Four months later and this is still the best CPU review. Ever.
The 011-D for the ad, along with the chapter ad title, was a masterpiece
This truly is one of the most CPU reviews of all time... this made me cry from happiness - from up high and down below ;)
I really love it when tech companies work together. See guys, both Intel and Nvidia teamed up to masterfully craft an entire generation of bad products that exist only to promote AMD's hardware. That's what I call friendship right there. So let's stop with all this bickering over GPU preferences and let's all come together to buy a 6700XT!
Or a 7800XT. I was Nvidia and Intel for almost 20 years... thanks to a Micro Center grand opening, I now have a 7900X/7800XT combo and I love it.
It would be interesting to compare idle and light load power consumption between Intel and AMD.
While I occasionally game and also need speed for compiling, CAD work etc., most of the time I'm just browsing, watching RUclips, reading forums. From what I found out, Intel actually draws less power than AMD under light loads.
It would be great to have information about mixed-use power consumption, I've found little information about this.
Came to say this. Most people's CPUs will spend 90% (blind guess) of the time idling, and as such, an Intel CPU with double power draw while under heavy load could actually use less power overall.
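The back-of-the-envelope math behind that point is easy to sketch. Note that every wattage and hour count below is a made-up placeholder for illustration, not a measured figure for any real CPU:

```python
# Illustrative only: daily energy use (Wh) for two hypothetical CPUs.
def daily_wh(idle_w: float, load_w: float, load_hours: float, total_hours: float = 24) -> float:
    # Energy = idle draw over the idle hours + load draw over the load hours.
    return idle_w * (total_hours - load_hours) + load_w * load_hours

# CPU A: low idle draw, high load draw. CPU B: higher idle draw, half the load draw.
cpu_a = daily_wh(idle_w=10, load_w=250, load_hours=2)  # 10*22 + 250*2 = 720 Wh
cpu_b = daily_wh(idle_w=45, load_w=125, load_hours=2)  # 45*22 + 125*2 = 1240 Wh
print(cpu_a, cpu_b)
```

With these placeholder numbers, the CPU that draws twice as much under load still uses far less energy per day at 2 hours of load, which is why idle draw matters for light users.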
They don't compare idle power consumption because it would make AMD look bad. Intel's power consumption during idle is only 10-25% of what you will see on an AMD system. If you use your PC a lot for other things, or need to leave it on for certain tasks, that extra 30-35w during idle adds up. Probably adds up more than the extra power usage in gaming. And it's not something you can change either, while any Intel CPU can be undervolted and power-limited for little to no performance loss, while with an AMD CPU, you're stuck with that 45w idle load no matter what.
These channels live and die by how much clickbait they can generate for zoomers. Real-world comparisons like idle power usage, or a power-limited perf/watt comparison between Intel and AMD to see how efficient each architecture *actually* is, are not priorities here. Dunking on Intel is what brings in the clicks from teenagers who can't figure out how to use an ad blocker on their phones, so that's what they publish. Look at their 14700K review: barely any mention that it's now the best value by a long shot for productivity, or mixed gaming/productivity, edging out the 7900X for a much lower price. Why isn't that emphasized? Probably because none of the very smart people who watch these videos care or even understand what that sentence means.
@@ibtarnine I don't know what benchmark you've been running, but Intel CPUs under no circumstance will consume "only 10-25%" of a competing AMD CPU when idling.
It may be slightly lower in some scenarios, but in the benchmarks I've seen that actually measured idle power between RPL and Ryzen 7000, it's about the same. Intel's efficiency on desktop while idling is not bad, but the idea that it's significantly different is a myth.
For actual numbers, google for example "guru3d 13600 Power Consumption and temperatures" to find their review of the 13600K. Look at page 6: total system idle power with the 13600K is at 67W, while with the Ryzen 7600X is 68W. 99% is not remotely close to a tenth!
I indeed wish more reviewers on YT would benchmark this, just because I see this kind of misinformation repeated in comments regularly by apparent Intel fans.
@@ibtarnine UserBenchmark website owner, is that you?
People need to get into optimising their voltages / undervolting.
I have a 13700k and in gaming it uses 60-120W depending on the game while at 5.6Ghz.
I have a 150W power limit and it drops to 5.2-5.3Ghz. I could drop it to 125W even. But who cares, my GPU uses 350-450W.
Ryzen 5800X3D truly is one of the CPUs of all time
Agreed
The 1080ti of the CPU world
I would say the same for the i3 8100, truly one of the compute of all time
i5 2500K level legend
But very expensive, unfortunately... it costs approx. 344 USD here, whereas the Ryzen 5 7600 (non-X) is only 206 USD (in Thailand), not counting discounts/sales, of course, when it can be bought for perhaps up to 20% less.
Truly one of the cpu reviews of time
even among more time
I'm only 2 minutes in but I sure hope "Greg" is now satisfied with this review.
Thanks Steve. I appreciated the added commentary on why Gamers Nexus reviewed these so harshly even though they were a refresh. I agree the naming of these processors makes no sense.
You had me sold right up to the part where you started talking about the counting of product numbers lmao. You got me 😂
Yeah, I totally wasn't paying full attention until I was like "wait, what?" 😁 Got me too!
Got my 12600kf a while back and think it will last for me for quite awhile. Very happy with it; as in your video the price difference vs. the normal K model was a big seller. Hopefully, Intel makes strides back towards technical parity if not dominance in the coming generations. Who knows? Maybe AMD won't have TSMC's Taiwan fabs to rely on in a few years...though I guess that's why they are building new fabs in the US, Europe, and Japan.
People are SERIOUSLY sleeping on the 12600k(f) right now. It has basically the same performance as the 7600x and better multicore and it costs less. You can also get a fully featured z690 mb for less than an AMD MB with the same feature set. If you only want to game it's literally one of the best deals right now.
I ran my i5 2300 (non-K) for like 9 years and was satisfied playing WoW on it. This time around I put more of my money into 10-year-plus quality components, like a 1300W power supply with a 10-year warranty, a Noctua NH-D15 cooler, and a very roomy, easy-to-build mid-size case.
I just recently swapped my i7-2700K paired with a GTX 1060 3GB for an i5-12600K paired with an RTX 4070, and noticed a slight difference in CS:GO. Now I'm playing CS2, and I don't think it's worth swapping the 12600K for a 14600K for CS2.
I've really been enjoying this review series. I have loved seeing the generational leap: finally the processor we deserve, if not the one we needed... Also, I'm super disappointed, because I was so excited to upgrade my 12th gen processor, with Intel having done something it hasn't done in a decade or more by having a socket last more than two-ish years. Keep it up!
AMD gives Intel the D, and Intel replies: "K".
We are truly in a golden age of CPU names.
0:40 Wow, nice gesture. Keep it up as it is; we're not in school here, so fun is always welcome!
This isn’t to discredit the quality of your other videos but I could watch these sketches all day. “Bigger number better” was gold but this is also brilliant
The 14700k actually looks interesting. But at the cost, an x3D build would be a better choice.
E-cores seem totally useless to gamers. The superior productivity performance of an intel CPU isn't really worth the purchase price or power draw for most people. It seems the advantages would really only make sense in an enterprise environment where the time saved is worth more than the additional costs.
I need to try out a laptop with an AMD chip... I wonder if it's less laggy than Intel's? I have a 13980HX and this thing is SLLLLOW.
@@highlanderc Isn't that a top-tier laptop CPU?
I love that these processors are so bad that the beginning of every review is a comedy show
who told you these processors are bad!?
If you only used 1 TI 30 to do the count, how can you tell it calculated the count correctly? multiple TI 30s and indeed, multiple brands and methods of counting should have been employed. I can't take this video seriously.
That new GN intro logo looks pretty slick. Looks more real than ever.
0:54 This is the reliable and scientific approach to research that I keep coming back for. Keep up the good work.
They could have just not released 14th gen, kept making 13th gen as 13650k or something, and cut the price of regular 13th gen. This would have been better for public perception
They could have simply added an r (for refresh)
So 13600Kr
The i7 should have been a 13800K (higher core count, still slotted below the i9).
In retrospect, the only chip that should have been added was the 14700K, as a 13800K.
This is supposed to be the last hurrah* of the Intel Core i-branding
Next is supposed to be a change to naming to coincide with architecture change.
Hoping for less growing pains than Intel ARC.
Maybe they'll have realistic claims.
Well then nobody would buy the 13-50K series processors, just like Ryzen's 3000 XT series. Intel's investors wouldn't be happy with that.
@@mariano3113 I think adding +++ would have been more fitting for intel, like a 13600k+++. It could have been an inside joke with the enthusiasts, just like 14th gen is a joke with enthusiasts...
LGA1700 last hurrah is pathetic compared to AM4 last hurrah.
@@mikesteve3039 AM4 X3D is Historic in a good way.
Intel's LGA1700 ending with a refresh, amid cancelling desktop Meteor Lake and skipping to Arrow Lake.
(I wonder how badly desktop Meteor Lake would have performed)
I love the intentional error where the i7-1185G7E was listed at 5 Numbers. 🤣🤣🤣
It very obviously has 6 numbers.
Finally a CPU review for the masses!
I even screenshotted the digit count chart and printed it out for my wallet.
Now whenever a wild CPU appears, I am prepared to hunt only for those with the most digits.
I too was worried that more numbers did not equal better. Thank you for performing the stressful amount of work necessary to confirm to us all that yes, more numbers do indeed equal more betterer.
Wait is it april 1st already? Darn time truly flies when watching great content like this.
I can't believe Lian Li actually sponsored this. Video's fucking hilarious; what a pointless '14th' gen. Nicely done to you and the team!
Clearly, if Intel didn't want to mislead people, they would have called it a 13650K instead of changing it to 14600K.
They also misled the public with the i9-11900K, when clearly it's an i7-11700KS.
@@N_N23296 Meteor Lake is not called Intel Core anymore, and it launches this December.
It uses the Intel 4 process node. Next year, Intel Core desktop will cease to exist as well, once the process node is ready for desktops, making 14th "gen" just a stopgap filler, like 11th gen desktop was.
@@N_N23296 The name will be Intel Ultra 3, 5, 7 and 9. Desktop will get Arrow Lake, it's an SoP (system on package) chip with features in line with Apple M series chips, has an NPU (neural processing unit).
That is very interesting, because in theory it could nullify NVIDIA's AI lead for graphics cards, forcing them to lower their prices.
@@N_N23296 No dude. Watch Moore's Law Is Dead, he has very good leaks of AMD, Intel, and NVIDIA. I already knew years ago that Raptor Lake and Raptor Lake refresh are a dud.
@@N_N23296 Intel did a Meteor Lake presentation this year, there's nothing fake about it, eventually that kind of design will be found on desktop, it's genuine Intel 4 process node.
I get it now, why my girlfriend was asking for the R9 7950X3D back then: because it has a lot of D's.
Good one😂
I've only watched the beginning of this video, and I already "seriously" love it.
I'm not sure if I want it to be more serious or less serious after the ad..
Back to you steve...
I am super mad that I'm only just seeing this now. That intro was everything I needed in my life and then some.
On the subject of power efficiency, it's already quite established that AMD is presently more performant per watt, but another consideration is idle power consumption. I've been noticing that Intel CPUs under very light work or nothing at all only draw 9-12W (even i9s), whereas Ryzen seems to idle between 36-55W depending on if you've selected power saving mode or not. Why is this? Is it the extra power requirement for transmitting data off-silicon within the CPU on the infinity fabric? What accounts for this idle/web-browsing power draw?
That's just chiplet design vs. monolithic design; on APUs like the 5700G, idle consumption is lower because the chip is monolithic.
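For anyone wanting to check their own idle package power on Linux, a minimal sketch is below. It samples the kernel's RAPL energy counter twice and converts the delta to average watts. The sysfs path is an assumption: it varies by system (the `intel-rapl:0` domain exists on most modern Intel machines, and AMD exposes a similar interface on recent kernels), so treat this as a starting point, not a definitive tool.

```python
import time

# Hypothetical path; check /sys/class/powercap/ on your machine.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def watts(e0_uj, e1_uj, dt_s):
    """Average power in watts from two energy samples (microjoules)."""
    return (e1_uj - e0_uj) / dt_s / 1_000_000

def sample_package_power(interval=1.0):
    """Sample the RAPL package-energy counter over `interval` seconds."""
    with open(RAPL_ENERGY) as f:
        e0 = int(f.read())
    time.sleep(interval)
    with open(RAPL_ENERGY) as f:
        e1 = int(f.read())
    return watts(e0, e1, interval)
```

Run `sample_package_power()` with the machine sitting idle to get a number comparable to the 9-12 W / 36-55 W figures above. Note the counter occasionally wraps around, which a robust tool would handle.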
LOL!
I love it. A totally arbitrary, totally pointless scoring system, drawn out into far more slides than were ever necessary and then dryly described in corporate jargon.
Steve, this is your best video :) The third this month! Genius.
Amazing just how good the 5800X3D is
It's a little over a year old, what's amazing about it? People keep forgetting it wasn't launched alongside AM4 but in 2022.
I love this. Not laughed at an introduction so hard in a long time!
I'm glad to hear you take your job seriously.
Because if someone doesn't know, it could mislead and be confusing.
It means a lot to those trying to learn.
As usual another crappy gamers nexus review 💀
Silent subscriber here. But no longer. Significantly the best humor out of all reviewers around. Great job, GN! :)
You should be labeling which CPU is running on DDR4 and which is running on DDR5. This is necessary information to make a proper comparison and decision when building a computer. It's important to know how you arrived at these statistics so they don't change after a build because a variable like RAM changed. DDR5 increases power draw, so it's definitely worth knowing all the variables behind a specific statistic like power draw or performance; 13th gen gets a nice performance boost from DDR5, as seen in other test comparisons.
Thank you so much for the time and effort you put into making these informative videos. I'm in the process of gathering information so I can build myself a computer. I don't have a lot of money and the second-hand market sucks, which means I'll probably only get one shot at it. I watch all your videos, and the information I'm getting from them is really appreciated to help me not make poor choices when selecting hardware.
Thanks again,
Rigem
Good for you to review Core i5-13599K!
It's astonishing how the jump to 13th gen had the i5 trading blows with the 12th gen i9, and now with this it barely trades blows with its own predecessor.
That intro tho XD love the work that you guys at GN do!
Great intro! The naming was a true 23400-D move from AMD.