Intel's 300W Core i9-14900K: CPU Review, Benchmarks, Gaming, & Power

  • Published: 24 Jan 2025

Comments • 3.1K

  • @GamersNexus  1 year ago  +427

    After our 14600K review, is there anything else you want to see us do with these CPUs? Any one-off tests or points of interest worth exploring? Let us know in the comments (new top-level comments are easier to find). Remember that any of those one-offs would take the place of other content, so if you're not interested in seeing more of these CPUs, let us know that as well (so we can prioritize our next content)!
    Watch our Intel Core i7-14700K CPU review: ruclips.net/video/8KKE-7BzB_M/видео.html
    Learn about CPU sample size in reviews (68 CPUs tested): ruclips.net/video/PUeZQ3pky-w/видео.html

    • @hquest  1 year ago  +92

      “Will it blend? That is the real question.”

    • @AveiMil  1 year ago  +6

      Would They Are Billions be a good game to CPU-benchmark with? It seems like a heavily CPU-bound game when you have a large number of zombies. My i9-9900K lags like hell in some stages.

    • @dnakatomiuk  1 year ago  +16

      As Steve from HU said, it's a waste of time.

    • @ammanus356  1 year ago  +11

      Personally I do not understand why you don't test the games with RT, as RT has a huge CPU impact.

    • @sandornyemcsok4168  1 year ago  +48

      Yes, please. As winter is coming, test the 14900K under full load to see how it performs as auxiliary heating. Is it worth moving it into the bedroom as a production PC?

  • @SpankeyMcCheeks  1 year ago  +2434

    Intel could've just called these the 13950K, 13750K or some shit like that and it wouldn't have been much of a problem... Mid-generational upgrades are never a bad thing, unless they raise the prices more than the performance increase.

    • @freaky425  1 year ago  +67

      People will always complain no matter what. They didn't raise the price, so I don't see what the fuss is about.

    • @Acid_Burn9  1 year ago  +475

      @@freaky425 The fuss is about them marketing it as a new 14th generation, which will lead some not-so-tech-savvy customers to assume that it is noticeably better than 13th gen, when it is in fact just 13th gen. This is intentionally misleading and anti-consumer behavior.

    • @SpankeyMcCheeks  1 year ago  +243

      @@freaky425 The problem is the implied improvement that comes with a "bigger number better". Most consumers will look at the 14 series at, say, $500 and the 13 series at a discounted price of $450 and believe that they're getting a whole generational improvement in performance for just $50, or 11%, more. They think they're getting a good deal but they're overpaying for just a few extra percent in performance. It's scummy, and intentionally dishonest by Intel.

    • @AndrewB23  1 year ago  +42

      Steve literally said this in the first video

    • @SpankeyMcCheeks  1 year ago  +17

      @@AndrewB23 Great minds think alike.

  • @alexz1232  1 year ago  +2486

    5800X3D value proposition looks better every time Intel launches a new CPU, impressive.

    • @seeibe  1 year ago  +233

      Turns out incrementing the number on their existing CPUs was free marketing for AMD

    • @Bruhyeet42069  1 year ago  +42

      Yeah, and prices are slowly rising again because of it (at least where I'm from).

    • @HenrySomeone  1 year ago  +11

      Not if you're buying it now though...

    • @chriswright8074  1 year ago  +7

      @@HenrySomeone 5950

    • @rambosausage8881  1 year ago  +78

      The X3D is good value as an upgrade, not a fresh build. The chip is literally hyped to the gills; you can buy a 7600X for 100 less that beats it...

  • @JohnCarter04  1 year ago  +214

    On the bright side, it's a good demonstration of how consistent your testing is between different products/over multiple runs of the same test.

  • @aylim3088  1 year ago  +897

    It's incredible that this guy was called Steve Fromgamersnexus and he ended up working at Gamers Nexus to review one of the cpus ever made, life just finds a way

    • @kintustis  1 year ago  +66

      Absolute comment right here.

    • @GewelReal  1 year ago  +30

      A Steve moment if one may say

    • @ag8912  1 year ago  +28

      Truly one of the moments of all time

    • @elcazador3349  1 year ago  +28

      Thanks, Steve.

    • @stunytsuR  1 year ago  +1

      Love the sarcasm

  • @KefazX  1 year ago  +367

    I never get tired of the "thanks Steve" inserts. So uh, thanks Steve and GN crew for making yet another entertaining video even though the product was about as boring as it can get.

    • @Vekstar  1 year ago  +16

      Back to you Sheve

    • @Pressbutan  1 year ago  +8

      There's a movie/TV critic on youtube called Diasparu who's been using the steve "That's gaslighting" clips, and it's so damn good. I laugh my ass off every time.

    • @samuelrodgers2742  1 year ago  +8

      You can literally see it

    • @IRefuseToUseThisStupidFeature  1 year ago

      @@Pressbutan I see you have also been watching that. Though Disparu does also tend to overuse clips at not necessarily the best timing.

  • @Shivaxi  1 year ago  +57

    Can I just call out how much I appreciate the editing team adding the little blue bars on the sides of the screen, slowly moving down to indicate how long a section/topic will be and when it will end? It's the little things in life, you know?

    • @dylanloo9856  1 year ago

      Ooo it's the shroud fan

    • @peyton_uwu  2 months ago

      yessss, it's such a lovely little touch of detail, and they've gone and done more things like this in their newer videos as well.

  • @Mapantz1  1 year ago  +189

    Steve's review of the Core i9-14900K was 212.4% better than his review of the Core i9-13900K. Outstanding win for intel!

    • @Pressbutan  1 year ago  +23

      This comment was 9% better than the average reply, which for an Intel review video, is an amazing gain.

    • @rudrasingh6354  1 year ago  +7

      @@Pressbutan This reply was 4.2% better than the average reply to a comment, which for a comment on an intel review video is a mindblowing increment over the others.

    • @JBVXR  1 year ago  +8

      I think this means we can call him Steve 2.0, increment the version number. Get marketing on the phone.

    • @nunjacko  1 year ago  +2

      Thanks Steve!

    • @Sportivities  1 year ago

      the Steve™2000™ series™™

  • @winstonwright3613  1 year ago  +328

    Only Intel can launch a new "Generation" of CPUs and, in so doing, make the 5800X3D look good again... almost 2 years after its launch!

    • @Warsheep2k6  1 year ago  +14

      It's still bought and AM4 is still viable; it's not like the 5900X or 5950X are bad CPUs either. It's just that it's a dead end. However, if you don't upgrade that often, that's not even an issue. I guess the 5800X3D or the R9s of Zen 3 still have plenty of juice left!

    • @Brunnen_Gee  1 year ago  +12

      Still rocking my 5800X3D and don't plan on replacing it any time soon. This thing is just silly, nothing I play comes *close* to maxing it out (and that's including games that are CPU heavy like Star Citizen).

    • @Valdore1000  1 year ago  +10

      @@Warsheep2k6 5950X a dead end? Yeah, maybe; I will keep it until I die, correct hahaha

    • @Warsheep2k6  1 year ago  +5

      @@Valdore1000 Well, AM4 is a dead end; it's either the R7 X3D or the R9s, both awesome chips though. I run a 5800X3D and I don't plan on upgrading anytime soon either.

    • @Valdore1000  1 year ago  +9

      @@Warsheep2k6 I upgrade my CPU like once in 8-10 years, so it's pretty much alive.

  • @martinzhang5533  1 year ago  +64

    250 W is really as high as any desktop CPU should go at stock. It is not a good trend that they are pushing power beyond diminishing returns just to get the few percent of performance that is only useful in marketing. Optimizing the design is the way forward, as we are getting closer and closer to the upper limit that silicon semiconductor processes are eventually going to hit.

    • @THU31  1 year ago  +1

      I'm not a fan of the power comparison to older i9s, which were obviously tested with power limits applied at the stock TDP of 95 and 125 W respectively.
      My 9700KF was drawing 140 W in an all-core workload at 4.5 GHz (official max turbo is 4.6), and it didn't have HyperThreading. That means that the 9900K at max official turbo of 4.7 GHz would draw around 200 W. At 95 W the frequency was probably around 4 GHz or less.
      These CPUs have two TDPs, 125 W and 253 W. That means we should never see more than 253 W at stock settings. If power limits are disabled by default, that's a choice made by either Intel or the motherboard manufacturers, but it doesn't showcase a realistic comparison to older CPUs, where power limits were observed.
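
      For anyone who wants to check which limits their own system actually enforces, here is a minimal sketch using Linux's powercap (intel_rapl) sysfs interface; it assumes the driver is loaded and that CPU package 0 sits at the standard path. Values are reported in microwatts, and writing these files (to cap power) requires root.

      ```python
      from pathlib import Path

      RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")

      def read_watts(name: str) -> float:
          # sysfs reports power limits in microwatts
          return int((RAPL / name).read_text()) / 1_000_000

      # Constraint 0 is the long-term limit (PL1), constraint 1 the short-term (PL2).
      for c in (0, 1):
          label = (RAPL / f"constraint_{c}_name").read_text().strip()
          watts = read_watts(f"constraint_{c}_power_limit_uw")
          print(f"{label}: {watts:.0f} W")
      ```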

    • @KeinNiemand  1 year ago  +3

      You can always set a power limit if you want

    • @norge696  1 year ago  +2

      @@THU31 AMD got me with the R7 7700. A 65 W TDP on 8 cores is pretty insane. Putting my 6600K to rest.

    • @divinehatred6021  1 year ago  +1

      @@norge696 Yeah, for me it was the R7 5700; even with SMT off by my choice it's still better performance than what Intel has to offer.

    • @mparagames  1 year ago

      @@divinehatred6021 why have it off?

  • @CFI963  1 year ago  +996

    It's hard to believe a CPU runs at 300 W.

    • @GamersNexus  1 year ago  +644

      Power supply manufacturers right now are salivating!

    • @Iwanttobefree42  1 year ago  +229

      Until you buy a 13900K and realize that your 420mm AIO is just the bare minimum... this is unacceptable from a consumer viewpoint.

    • @ashryver3605  1 year ago  +47

      @@Iwanttobefree42 Please get your head out of the hyperbolic gutter.
      Even at unoptimized stock on my 360mm, R23 was just about 300 W. After a basic offset that took like 15 minutes to guesstimate, it was down to 265 W @ 5.6 GHz (KS).
      In real activity usage... it uses like max 150 W for my games and stays pretty much always below 60°C.

    • @anasevi9456  1 year ago  +31

      This is the most sad-sack update I've ever seen. I've literally seen average CPU bins of the exact same product improve more over 6 months than the efficiency and performance improvements the 13900K-to-14900K jump gives. 😕

    • @seeibe  1 year ago  +88

      @@ashryver3605 150W for real usage is still a lot. All my systems still run CPUs that don't even go above 100W max. And I still need massive 3 fan coolers to keep them reasonably quiet.

  • @alaintchimnang9238  1 year ago  +556

    AMD's Ryzen 8000 series CPUs will have insane prices with this kind of competition from Intel

    • @kingyogesh441  1 year ago  +19

      Nobody will buy any expensive product from AMDip.

    • @TheHighborn  1 year ago  +331

      @@kingyogesh441 hahahaahhahaaahhaah.... wait lemme breathe in to laugh even harder. HAHHAHAAHHAHAAHA

    • @bliss_gore5194  1 year ago  +139

      @@kingyogesh441 Bro, with Radeon I kinda get it with the drivers, but Ryzen? Naaah.

    • @Silentguy_  1 year ago  +107

      @@bliss_gore5194 I agree, maybe, if this was a GPU comparison, but CPUs? Nah. At this rate AMD is gonna maintain the performance lead for at least another generation.

    • @stefano1488hh  1 year ago  +1

      @@bliss_gore5194 Graphics drivers are pretty good right now.

  • @benmoore2684  1 year ago  +28

    The amount of sarcasm dripping from this review was nearly enough to drown me as I listened to it driving into work this morning. I can't thank the GN crew enough for the actual out loud laughs you have given me, just in the last two reviews.
    I didn't expect the 14th "Gen" to set so many records (of mediocrity) with so many watts.

  • @Cowlicorn  1 year ago  +469

    14th gen. We definitely learned from our failures with 11th gen. We swear.

    • @GewelReal  1 year ago  +23

      at least 11th gen was an attempt

    • @ag8912  1 year ago  +12

      Dude 11th gen is a killer deal at this point. Boards, CPUs and memory are dirt cheap. Price to performance is hard to beat.

    • @raresmacovei8382  1 year ago  +16

      At least 11th gen was a new uArch. And it brought AVX512.

    • @kawesu8781  1 year ago  +14

      14th gen is worse

    • @w04h  1 year ago  +22

      Why are people comparing this to 11th gen? Like, seriously, do you have the memory of a goldfish?
      11th gen was an actual DOWNGRADE: an 8-core i9 vs. 10 cores the previous gen.
      14th gen at least has a clock improvement and even a few more E-cores for the i7.
      It's not even a comparison. It's advertised as a refresh and it's exactly that.
      This is like shunning AMD when they released the Ryzen 2000 series, which was literally the same thing.

  • @manbearpigsereal  1 year ago  +132

    So nice of intel to consider those freezing to death this winter, how would they have survived if not for the new 14-series?

    • @SixDasher  1 year ago  +1

      Those tests they do with the virtual loads have no bearing on the real world. Been running a 13900KS for almost a year now and the thing has never gone above 180 W. But yes, that little extra heat is welcome during the winter months :D

    • @niks660097  1 year ago  +8

      @@SixDasher Well then why did they add so many E-cores and high-clocked P-cores if we are not supposed to use them in the "real world"? Many people buying a 13900K, unlike you, are actually pushing it harder than this video. E.g. I run multiple compiles and renders in the background while gaming or using an editor, and most of the time it stays above 180 W. It's your problem if you are using a 9-series for watching YouTube or Facebook or only gaming; this ain't 2015, people don't close all applications before starting a game...

    • @ThunderingRoar  1 year ago  +7

      @@SixDasher So you bought a 24-core CPU to not use it? Could have just bought an i7 instead.

    • @edwardgrant7708  1 year ago

      @@ThunderingRoar He should have waited for the 14700K.

  • @iskinmind7020  1 year ago  +29

    I was hesitating between slotting a 5800X3D into an existing machine or going for a 7800X3D with a new build. Thanks to these videos I finally decided to save a little money by going with the 5800X3D.

    • @RobBCactive  1 year ago  +1

      I did that, but my old CPU found a new home in a data analysis build, so I ended up with new RAM & mobo anyway.
      I would have been better off going with AM5.

    • @ryanvtec3885  1 year ago  +2

      5800x3d and 5950x will keep socket am4 relevant for a long time to come

  • @Slugbunny  1 year ago  +101

    I've always wanted a furnace in my home office. Preferably one that triples my computer's utility bill. This is perfect!

    • @KoItai1  1 year ago  +2

      You know that power consumption is not reached in gaming, right?

    • @Micromation  1 year ago  +13

      @@KoItai1 Yes, because games won't even use a third of all available cores... buying a 32-thread CPU just to game on is equivalent to getting a 4090 to play Quake 3 and RimWorld...

    • @ggoddkkiller1342  1 year ago  +1

      I have a 12700H, 3060 laptop and oh man, it is basically a hair dryer! When I feel cold I just launch a game...

    • @ibtarnine  1 year ago  +1

      Can you describe what workload you will be using that will cause these CPUs to draw 300 W? Do you run benchmarks in your home office all day long? Since you probably don't do any real work, this CPU idling at 20-25 W less than its AMD counterparts (which GamersNexus fails to mention, for some reason) would probably lower your power bill. You seem like a very smart person though, so maybe I'm wrong and you get paid to stress test a CPU all day long?

    • @nonci6  1 year ago  +7

      @@ibtarnine i7-12800H: literally starts using 100W easily just from doing mundane tasks in Teams, Outlook.
      If you do anything more serious than that, yeah, it's going to consume a lot of power.

  • @matao991  1 year ago  +265

    If you compile the intros and conclusions of this year's hardware launches from Nvidia, Intel, and AMD, you can see Steve's slow descent into insanity.

    • @Pressbutan  1 year ago  +10

      You can just feel the disappointment through the screen. In fairness I'm sharing it with him - I expect better, given that these are crucial moments for the industry. They need to be on their best efforts and clearly Intel is off in left field, dancing around on the silver highway playing with their cell phones.

    • @damara2268  1 year ago  +4

      @@Pressbutan It's only Intel and Nvidia doing the bullshit. AMD is best in everything. I hope AMD will overtake Intel soon, because they definitely deserve it. Intel deserves to go bankrupt and break into a bunch of smaller companies, which will then be way more innovative than gigacorp Intel with their 10-years-obsolete architecture.

    • @44R0Ndin  1 year ago  +4

      Intel's GPUs, at least, are an interesting thing to watch: no hardware changes, yet performance has increased as much as it has.

    • @TheRogueWolf  1 year ago  +13

      "Slow"? This industry's driving him mad at warp speed.

    • @IRefuseToUseThisStupidFeature  1 year ago

      Though the Intel releases have made the dialogue surrounding AMD a lot better. It was quite negative at launch.

  • @Tattersail  1 year ago  +9

    I love you guys for your relentless, blunt honesty, enabled by independence from the companies whose products you review.

  • @unclerubo  1 year ago  +37

    You would have expected that the "Thanks Steve" and "You can literally see it" clips would have gotten old by now. But no, I still crack up every time :D

  • @puokki6225  1 year ago  +318

    Truly one of the CPUs of all time

    • @DuBstep115  1 year ago  +26

      I loved when Pat Gelsinger came on stage and said "it's inteling time".

    • @unvergebeneid  1 year ago  +32

      Say what you will but it's a central unit and it processes.

    • @Dr.RichardBanks  1 year ago  +6

      I too deem this as a cpu in time.

    • @christophermullins7163  1 year ago  +3

      @@Dr.RichardBanks Maybe it is outside of time. Coming back from the Bulldozer era to show what failure really looks like 😂

    • @Dr.RichardBanks  1 year ago  +3

      @christophermullins7163 it's definitely in this timeline. Possibly outside of it as well.

  • @firstlast159  1 year ago  +6

    Thank god I did my research and watched this content before buying. Literally had the 14900 in the shopping cart; you boys saved me a decent chunk of money.

    • @kabishnando4286  7 months ago  +1

      So what exactly do you use now?

    • @d9zirable  5 months ago

      @@kabishnando4286 He bought a 14700K, so in the end he didn't save any money.

  • @Z3rgoon  1 year ago  +18

    Thank you for Re-Reviewing the 13900K. I'm glad someone finally did it.

  • @godisadude  1 year ago  +28

    Keep those "weird off the rails" intros. Loved them. They summarized the review and also gave a personal signature to the video.

  • @Stuka87  1 year ago  +14

    This review has once again shown that my plan of just upgrading my AMD 3700X to a 5800X3D on my existing system was money well spent.

  • @CarnivoryHODL  1 year ago  +258

    7800x3d continues to look better & better, especially after the most recent price drops

    • @GigAnonymous  1 year ago  +45

      Indeed... I would like to thank Intel for persuading me to buy an AM5 platform and make the jump to DDR5.

    • @SixDasher  1 year ago  +4

      If you just use the pc for gaming maybe.

    • @agafaba  1 year ago  +98

      @@SixDasher Agreed, I don't know why so many people recommend the 7800X3D when it's only optimal for 95% of users.

    • @kerotomas1  1 year ago  +18

      @@SixDasher I mean, if you use it for productivity it's even more insane to buy Intel crap. AMD Epyc CPUs pretty much destroy anything that Intel has ever made, and I'm saying it as a former Intel diehard.
      I'm interested to see if they can finally make a proper next-gen CPU with the 15th series.

    • @mathesar  1 year ago  +7

      @@SixDasher I used to think that, but after 12 months of 13600K ownership I now realize all I do is play games and the occasional video clip encode for memes etc., which already had fast enough encode times on my previous i7-8700K.

  • @NobbsAndVagene  1 year ago  +91

    The Haswell refresh, Devil's Canyon (4790K) was a massive improvement over its predecessor (4770K), despite only minor changes. But this time they're presenting something that's even less of an improvement as a whole new generation of CPUs. It's like they reckon most people won't see through the marketing, and they're probably right.

    • @Pressbutan  1 year ago  +3

      Drinks like a Sandy Bridge-EP, performs like a Haswell.

    • @damara2268  1 year ago  +3

      Big companies are the main target of this, because they always order the newest i5 or i7 no matter what the actual performance is.

    • @LucasHolt  1 year ago

      True, but Skylake vs Kaby Lake were identical except for an iGPU improvement. It's not the first time Intel has done this. (And no, it's not ok.)

    • @michaelcarson8375  1 year ago

      @NobbsAndVagene Most "gaming" reviews do a bad job of explaining CPU architecture changes in general.

    • @michaelcarson8375  1 year ago

      @@LucasHolt Yet Kaby Lake could do hardware-accelerated x/H.264 decoding and Skylake could not. That's a big upgrade if you are trying to build a NUC-sized system that runs at 15 watts. IPC of the cores was also the same, with better power efficiency. Go look at the AnandTech write-up; Kaby Lake yields were higher:
      "The combination of the two allows for more voltage range and higher frequencies, although it may come at the expense of die size. We are told that transistor density has not changed, but unless there was a lot of spare unused silicon in the Skylake die design for the wider pitch to spread, it seems questionable. It also depends which part of the metal stack is being adjusted as well."

  • @brandonunger1689  1 year ago  +7

    Appreciate Gamers Nexus' style of testing and benchmarking CPUs. I feel like other tech channels are just rushing videos out, using handpicked games and 1080p resolutions... You guys are showing the community better material right now. Just my opinion.

  • @sethperry6616  1 year ago  +52

    I really considered upgrading from my R5 3600 to a 12th gen or AM5 build, but I'm happy I went for a 5800X3D. Maybe I'll consider 15th gen / the next AMD socket.

    • @roko98332r  1 year ago  +4

      AM5 boards will last longer, like 4 to 5 years. I'm on an old 5-year-old AB350 using an R5 5600. My next system will surely be AM5, once memory prices are decent.

    • @Lethalwick5673  1 year ago

      @@roko98332r The R5 5600 is still a beast for its price.

    • @sethperry6616  1 year ago  +3

      @@roko98332r It is not possible to know that yet. AM5 has only been out for about a year.

    • @matthewtremain683  1 year ago

      The next gen of AMD CPUs is going to use the AM5 board, while AMD is unsure if they will go to a new socket for the gen after.

    • @andersjjensen  1 year ago  +1

      @@sethperry6616 AMD has promised at least Zen 5 but were purposefully vague about Zen 6. Given the leaks we've seen surrounding Zen 6's departure from the IO Die + CCDs layout that Zen 2,3,4 and 5 feature, it makes sense that AMD wouldn't set it in stone in case they simply can't retain pin compatibility. But if they can, then expect Zen 6 and Zen 7 to be AM5, as they won't switch until DDR6 and PCIe6 unless they absolutely have to.

  • @Ashachi  1 year ago  +70

    Finally, an Intel version of the FX 9590.
    What a milestone.

    • @BillyRazOr2011  1 year ago  +1

      Huh?

    • @Pressbutan  1 year ago  +21

      Ok I'm done reading comments, this is the one I was looking for. I would have also accepted "Bulldozer Lake" or "Ooh, the new Fermi looks hot"

    • @CROWNZI  1 year ago

      Good luck having optimization issues in mainstream games :)

    • @charlesbronson1959  1 year ago  +2

      Intel has done a couple of "++" refresh CPUs before.

    • @sulphurous2656  1 year ago  +5

      I say, with the power draw and total lack of generational improvement, Intel may have finally met their own Bulldozer Moment on Desktop (since they already had one in servers). Of course, as we know Intel is too big to fail and will quickly bounce back from this mild embarrassment.

  • @johnsmith1966  1 year ago  +6

    If possible, I'd love to see Intel i7/i9 with PL2 @ 140-ish Watts vs AMD Ryzen 9 with ""105 Watt"" eco mode. (Preferably with a focus on productivity.) Thanks for all you guys do!

  • @flingymingy  1 year ago  +97

    Anyone else excited to see how userbenchmark views this?

    • @GeordiLaForgery  1 year ago  +57

      I don't think intel made enough cash from 13th gen to pay the subscription fees

    • @anthonybaker5199  1 year ago  +7

      Right 😂

    • @werewolfmoney6602  1 year ago

      Right now userbunchmork has 14900k at #1
      The highest Ryzen CPU, according to them, is the 7950X3D at #10
      The 7800X3D is at #17.
      Truly one of the websites of all time

    • @fajaradi1223  1 year ago  +38

      ​@@GeordiLaForgery
      Those blokes will do it for cheap.

    • @Auziuwu  1 year ago  +3

      lmao

  • @T3hBeowulf  1 year ago  +72

    The comedic delivery timing is the only thing tighter than the difference between these "generations".
    Absolutely... informative and hilarious presentation. 😂

  • @the_shameless  1 year ago  +2

    That final “back to you Steve” got me 💀

  • @s26me  1 year ago  +52

    Honestly, we really no longer care about Intel chip rebranding, but we still keep watching all these reviews because of how fun they are. Well done.

    • @thecooljohn100  1 year ago  +2

      Yeah haha. I couldn't care less about Intel anymore. I've been team red for the last 5 years. Intel innovation has stagnated SO hard. They just keep upping power consumption, because that's the only way they've figured out to produce more performance for the last 8 years.

  • @Mystical_Zeus  1 year ago  +116

    This entire thing is 100% entertaining as well as informative. Intel seriously dropped the silicon on this one.

    • @blinkcatmeowmeow8484  1 year ago  +3

      Definitely one of the consumer products of all time.

    • @dabneyoffermein595  1 year ago  +1

      Is there any way to get this sucker under 600 dollars, like perhaps get it for 50 bucks, or perhaps someone can pay me to extract my 12900K and install the 14900K on my Z690 mobo?

    • @iLegionaire3755  1 year ago

      @@dabneyoffermein595 Of course there is! Buy 13th gen under $600 and SB OC! Wait for 15th gen Intel.

    • @codemonkeyalpha9057  1 year ago

      I disagree. I suspect it will sell. It will be the headline sales pitch in the latest Dell or HP workstations (and eventually laptops). Intel just made a load of money without having to invest a dime in R&D; that's genius, dirty... but genius. Yeah, they will have lost a few enthusiasts who understand how cynical it is, but there are 100 morons for every one of them, and they will all be clicking their fingers to get these 'next gen' CPUs. Being savvy in a mass market dominated by morons is a dismal experience.

    • @brandonl9286  1 year ago

      Agreed. I thought about waiting for 14th gen chips to build a new machine, but ended up getting 13th gen from Micro Center not too long ago.

  • @DarkRider2k3  1 year ago  +45

    It's kind of wild that Intel is still getting stomped on by the 5800X3D. That V-Cache is incredible!

    • @LOLMAN9538  1 year ago

      While it's also stomping on AMD's current 7000 series.
      From my knowledge, if a CPU from a previous generation is stomping a mudhole in your current gen lineup, you know you're in trouble.

    • @zachcooper9102  1 year ago

      Wat

    • @MIrroooo  1 year ago  +1

      @@zachcooper9102 The 5800X3D is not stomping on the 7800X3D lmao

    • @AB-80X  10 months ago

      What? If you're thinking your little 5800X3D is stomping on the 14900K, you're on crack.
      I don't even think you can get half the score in CB 2024. But let me know. You have 2258 points to beat in my case. I'd be surprised if you get more than 930.

  • @shaydza  1 year ago  +91

    Love watching Gamers Nexus. Steve is able to entertain even when there is really nothing to talk about😂

  • @Fabri91  1 year ago  +148

    My *entire* system, including a 34" UW display, uses 350 W or so; 300 W on the CPU alone is beyond insane.

    • @KoItai1  1 year ago  +19

      That's on 100% core load, since it has 24 cores, but in gaming it uses like 70 W most of the time.

    • @ms3862  1 year ago  +18

      That's a weak system brah, my GPU pulls 400 W.

    • @panjak323  1 year ago  +2

      My laptop uses 120W in gaming in the worst case.

    • @dy7296  1 year ago  +5

      @@KoItai1 Nope, at least 110.
      Based on gaming bench videos at least.

    • @thomasemory4352  1 year ago

      @@ms3862 undervolt it

  • @georgenem  1 year ago  +2

    I fucking love Steve's absolute disdain for Intel. It's entirely understandable, though, since their effort for improvement between 13th and 14th gen is similar to that of a barely passing undergrad. Back to you, Steve.

  • @Hirome_Satou  1 year ago  +28

    Videos like this one really highlight to me how good of a CPU the 7800X3D is. Very powerful, fast, cool, and not power hungry.

    • @gustarchitect5326  1 year ago

      7800x3d has high idle power usage.

    • @Pressbutan  1 year ago  +7

      It shows how far behind the curve Intel is. They dragged their feet far too long with the Core architecture; 2008 designs in 2024 do not work. x86S and a whole new architecture are needed to compete; putting more lipstick on the pig and hitting it with more watts isn't going to solve this issue.

    • @rudrasingh6354  1 year ago  +3

      @@Pressbutan Yeah, all the current stuff they have is still a variant on Skylake. They need a Bulldozer -> Zen moment for their CPUs. The only way they are even "competing" right now is by stuffing in E-cores and pushing their P-cores to the max heat they can handle, and even with all those tricks they only come close to Zen 4 or even Zen 3 X3D (5800X3D).
      Most of the time their 8P16E CPUs at 32 threads still lose out to the 16C/32T 7950X, and the power usage is a joke. Almost the same power draw as my 4060.
      All AMD needs to do is make a 16 Zen 5 core and 16 Zen 5c core CPU at 64 threads and Intel will have nothing at all to compete with that. Even a 16 Zen 5 and 8 Zen 5c CPU would annihilate them, all the while taking less power than a 14900K. Now imagine that CPU but with X3D: even less power draw and way better gaming performance.

    • @Dfm253  1 year ago

      @@rudrasingh6354 All of their architecture is still based on scaling of the original Pentium 2 core. Their most recent new architecture was "Netburst" for the Pentium 4, which was similar to Bulldozer in that the frequency was high but IPC was low. The Pentium 4 and Pentium D used Netburst before they went back to the drawing board and revised the Pentium 2 architecture (which had been revised once before for the Pentium 3) for the "Core 2" CPUs, and every generation since has just featured further revisions. This is notably a "refresh" in that the revisions made haven't functionally affected the architecture at all.

    • @kingofnoobs1739  1 year ago

      A few problems: Intel is known for memory controllers and AMD just sucks at that. The max you can overclock AMD to is 6000, while Intel can run at 8000 on 14th gen with good motherboards.
      But idk, Intel is starting to suck.

  • @Aggnog  1 year ago  +180

    Intel truly excels at making the competition look even better than it already was.

  • @YTAccount82825  7 months ago  +2

    It's wild how CPUs now run such high thermals that they need contact frames and 420mm AIO radiators just to get comfortable temperatures.

  • @GaryBusey-sLaserdiscCollection  1 year ago  +58

    The Stellaris chart could use more game settings data. For example: map size (how many star systems), whether L-Gates are enabled, in-game year (important, as the number of pops affects performance), number of AI empires, and whether or not Xeno-Compatibility is enabled, as that one setting can basically set fire to your CPU in the late game.

  • @Grimsace  1 year ago  +23

    For Stellaris, I'd actually recommend setting it to the highest speed on a late-game save, then seeing how long it takes to, for example, get through 50 years (shorter would be better). On the fastest speed Paradox games run as fast as the system can handle, and I think this type of benchmark would show much better scaling.

    • @glasseyemarduke3746  1 year ago  +5

      Sadly, the RNG would probably invalidate runs over multiple turns. That is why they only test a turn, to reduce that (and other) variables.

  • @imitzi  7 months ago  +2

    7 months later, the "thanks Steve" still gets me lmao

  • @Aggrofool  1 year ago  +21

    This CPU is great marketing for the 7800X3D

  • @golasticus  1 year ago  +10

    12th and 13th gen was pretty hard to keep track of because you had to actually pay attention to tell what's good and what isn't. Now with 14th gen, that's not a problem. Thanks Intel.

  • @jameysummers1577  1 year ago  +1

    I just built myself an i9 14900k PC with a DeepCool LT720, and it runs nice and cool.

  • @titaniummechanism3214  1 year ago  +21

    This was a welcome reminder that the 11900K had two cores fewer than its predecessor. We got a "side"-grade instead of an upgrade this time, but at least we didn't get a downgrade. That's something, isn't it?

    • @damara2268  1 year ago  +2

      The 11900K was even more shitty: it had 2 fewer cores, yes, but it still had higher power draw.

    • @kerotomas1  1 year ago  +1

      At least 11th gen offered PCIe Gen 4 over 10th which actually mattered. Now 14th gen is pretty much garbage and offers nothing new.

    • @germanmade1219  1 year ago

      Once you look into the real-world latency of 10th and 11th gen and compare it to 12th and 13th gen, you definitely did get a side-grade. Overall system latency is higher with 12th/13th gen. So it really depends on how you look at it.

    • @damara2268  1 year ago

      @@germanmade1219 Everything has gone downhill except benchmark performance since Intel introduced their E-core shite that should have never come to desktop.

  • @TheLoneWolfling  1 year ago  +28

    I'm astounded by how well AM4 holds up considering the DRAM bandwidth difference.
    Could you please start including DRAM layout in testing config? E.g. 2x 8GB single-rank or whatever. This can make a surprising amount of difference. Maybe even worth a video at some point. Find a set of 4 benchmarks that are GPU-limited/CPU-limited/dram latency limited/dram bandwidth limited, and run on different DRAM layouts and CAS-versus-speed tunings, to illustrate the tradeoffs.

    • @The93Momo93  1 year ago  +13

      The best thing about X3D is that it doesn't really care about RAM speed; getting uber-fast RAM is a literal waste of money for those chips.

  • @diqlabrie1278  1 year ago  +2

    I very much enjoy the Sahara Desert-level humor you employ! Bravo! 👍👍👍 ®️

  • @Tuxedo512  1 year ago  +96

    the 4060 of cpus, truly one of the cpus of all time.

    • @atretador  1 year ago  +3

      The 4060 at least has a software-locked generational gimmick that makes it look not terrible in very specific titles.

    • @andersjjensen  1 year ago  +1

      @@atretador So does the 14th "gen". Intel just only managed to get it ready for two specific games... which Gamers Nexus doesn't feature in their test suite. I can't recall its name (Steve mentions it in the 14700K review if you're actually interested), but it's the very definition of a locked software gimmick, as it will not be released for 13th gen despite it being the exact same silicon.

    • @AB-80X  10 months ago

      That's a really ignorant comparison.
      The 4060 is disliked for poor performance. While the marketing of the 14th gen is B.S., a 13900K/14900K is for all intents and purposes a very powerful CPU.

  • @profpigeon5441  1 year ago  +114

    I feel like getting a 7950x makes the most sense for people right now who need lots of cores. You get a brand new platform with upgrades down the line. AMD also did right by AM4 with the 5800x3d release. Pumped new life into a dated board.

    • @The93Momo93  1 year ago  +15

      Going from the 5800X to the X3D was so worth it for me. I thought it would be just a sidegrade, but it's a legit great final upgrade on that socket.

    • @DissertatingMedieval  1 year ago  +8

      That's what I was thinking. I'd like to build a workstation that I can game on as well, and AMD seems the clear way to go at this moment. I just remain hesitant regarding which CPU so these charts are helpful.

    • @profpigeon5441  1 year ago  +1

      @@DissertatingMedieval a 7800x3d is really good, would future proof for gaming for a long time. In terms of productivity, it would be fine. I use a 5800x3d and my machine flies.

    • @mowtow90  1 year ago  +3

      Next year I have to upgrade from my i7-8700K; it was one of the last great CPUs Intel put out, but it's showing its age. I can't take any of the 12th-14th "gen" Intel BS because of the idiotic P/E cores. None of the virtualizers I use can distinguish between the P and E cores. FFS Intel, E-cores are only good for mobile platforms; saving a few watts on a 300 W CPU is not much. Just make a standard CPU for desktops... A couple of weeks ago me and a friend tried to work around a 13900K in Red Hat KVM and VMware ESXi. ESXi just could not run stably because it can't understand the core difference. With KVM, we only got around it by leaving the E-cores for the host OS and giving the P-cores to the guest. What is the point in dumping in that amount of cache and not being able to use most of it...
      This is why I will have to go AMD, but AMD-V was never on the same level as VT-x/VT-d; that's something Intel nailed a long time ago because of their enterprise CPUs...
      I was given advice to just buy 11th gen Intel and stick with it. They are at very good prices now as retailers are trying to offload the remaining stock. The mobos for them are not at insane prices either.
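
      A process-level sketch of the KVM workaround described above (keeping a heavy workload on the P-cores; libvirt's vCPU pinning does the equivalent for guests). Linux-only, and the core numbering is an assumption: on a 13900K the 8 hyperthreaded P-cores usually enumerate as logical CPUs 0-15 and the E-cores as 16-31, but check `lscpu --extended` for your SKU.

      ```python
      import os

      # Assumed P-core logical CPUs (verify with `lscpu --extended`).
      P_CORES = set(range(16))

      # Pin the current process to the P-cores; child processes (a compile,
      # a render) launched from here inherit the affinity mask.
      os.sched_setaffinity(0, P_CORES)
      print("allowed CPUs:", sorted(os.sched_getaffinity(0)))
      ```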

    • @The93Momo93  1 year ago  +4

      @@mowtow90 yeah these e cores are absolutely stupid in desktops

  • @ctrlectrld  1 year ago  +1

    10:44 "that's when you know it's a good generational uplift; in fact, it's so exciting that we're not even gonna talk about any other production test"
    I'm crying

  • @slvrization  1 year ago  +84

    7800X3D is unbeatable in games for a fraction of price and power consumption.

    • @dangerous8333  1 year ago  +7

      People that buy this don’t just play games.

    • @impointr  1 year ago  +1

      @@dangerous8333 While that is true, these Intel CPUs are still largely less efficient than an AMD CPU even at productivity tasks. A 30% increase in your power bill for a 2~5% improvement in performance just isn't worth it. Hence why several datacentres in the world (such as Cloudflare's) have moved to EPYC.

    • @schinkenspringer1081  1 year ago  +75

      @@dangerous8333 Yeah, they play themselves.

    • @SweatyFeetGirl  1 year ago

      People that need a work machine buy Threadripper @@dangerous8333

    • @RJ-lh8vm  1 year ago

      @@dangerous8333 In that case they should go for a 7950X3D. Better gaming, similar productivity, 1/3 the power.

  • @Brian0wns  1 year ago  +7

    I am really enjoying the step up in humor. You could almost add cut-out graphics to make these reviews Monty Python skits.

  • @cubez9  1 year ago  +23

    Steve and the GamersNexus team together made this launch something entertaining and worth watching. Unfortunately the only downside is that I want more videos like this, though I hope there's no occasion for them.

  • @quantumjank3091  1 year ago  +11

    Came to see if it made any difference to Starfield. Left a like. Moved on. Thanks Steve.

    • @GamersNexus  1 year ago  +5

      Thanks for the like and the comment! It actually does help a lot if someone's watch time is low (but they found it helpful) and balances it out!

  • @redsnow846  1 year ago  +40

    If this continues, we're gonna have to start figuring electrical breaker upgrades into build costs.

    • @pegcity4eva  1 year ago

      Lol

    • @TheTastefulThickness  1 year ago

      Adding 14 cores added 200 watts. We're in trouble.

    • @griffin1366  1 year ago

      So put a power limit or undervolt?
      My 13700K runs at 5.7 GHz and barely uses 60-120 W depending on the game.
      With a 150 W power limit it drops to 5.3 GHz, but that's only when I am rendering.
      I can save more power with the E-cores disabled or parked.
      The GPU is still my biggest concern at 400 W.

  • @sarahjrandomnumbers  7 months ago  +2

    This generation will go down in history as one of the generations of Intel CPUs.

  • @anthonymoreno894  1 year ago  +42

    I love the snark GN team. These Intel reviews have been hilarious… unexpected joy that the product itself will never deliver.

    • @peterpan408  1 year ago  +3

      Intel can be a good deal during Intel discount season 😉

    • @Sportivities  1 year ago

      @@peterpan408 you'd still end up saving more from amd simply due to not having to spend extra keeping it cooled and powered 💀

  • @Zer0Log1c  1 year ago  +1

    The callbacks to the Intel presentation are never going to get old

  • @zeta2209  1 year ago  +48

    This thumbnail belongs in a museum.

    • @KillerInstinct1  1 year ago

      *K-KONO POWA DAAA!!!*

    • @itsmeee-ey8xv  1 year ago  +1

      I’m not even going to try and make a better comment I’m just going to agree

  • @PizzaDude-sc2jo  1 year ago  +6

    userbenchmark review gonna be crazy🤯

  • @FinanceNinja  1 year ago

    I like the comedic approach here. Even the subtle parts like saying "13" when you meant "14" repeatedly. Well done!

  • @HurricaneSparky  1 year ago  +4

    It's crazy how much power some of these newer top end CPUs use. My i7-8700k doesn't even get out of bed most days for gaming, and I was worried it'd run hot and had it delidded back when I got it.

    • @RCXDerp  1 year ago

      My 9700K gets like 50 fps in Starfield at 1440p. I'll probably grab 15th gen, or AMD maybe.

  • @fayis4everlove  1 year ago  +18

    R5 5600 still being relevant is amazing!

    • @darkkingastos4369  1 year ago

      The 3700X is still showing that it's old but efficient as well.

    • @JudeONeill  1 year ago  +1

      Truly unbeatable from a value per dollar perspective.

    • @mostlyinept  1 year ago  +1

      That's exactly what I've been using since it was released. R5 5600 and a Sapphire Pulse 5600XT. Absolutely fantastic combo paired with a nice B550M mobo and some 3600mhz ram. I don't see any need to upgrade for the next few years. I play Starfield at 1440P medium settings no problem.

  • @o8livion  1 year ago  +5

    Looking at the efficiency of AMD's 7800X3D and 7950X3D, I am so impressed, and I somehow feel bad for not having bought AMD stock before.

    • @lesserlogic9977  1 year ago  +2

      My last 2 PCs have AMD CPUs. They really have stepped it up; I'm looking forward to the Zen 5 X3D chips.

    • @Veganarchy-Zetetic  1 year ago

      AMD wasn't always this good, they have had some real stinkers too. Bulldozer for example...

  • @widnyj5561  1 year ago  +21

    Have you considered numbering the cores with a preceding 0 for single-digit numbers?

    • @MarKarda  1 year ago  +7

      Can you imagine? A CPU model number like Intel 14P8E10, being able to identify the specs of the product from the name alone... my god... life could be so much simpler.

  • @KillahMate  1 year ago  +18

    I haven't run the numbers, but it seems like even if you're putting together a brand new PC, if you're on a limited budget you might _still_ want to buy a 5800X3D and put all the savings you've made on AM4 platform components toward a bigger GPU.

    • @Pressbutan  1 year ago  +2

      I consider myself pretty faithful to Intel and if I had a friend building a PC asking for advice I'd steer them towards the same. If not 5800X3D, then absolutely a 7800. Either would be fine. In fact both seem to be fantastic value compared to team blue..

    • @Purified1k  1 year ago  +2

      I wouldn't. The only reason I got the 5800X3D is bc I was on the AM4 platform already. Not to mention, PBO2 optimization also gives me nearly another 20% in performance with the 5800X3D. But with a new system, I'd totally save for AM5 for upgradeability later down the line.

    • @sebastian-sfX  1 year ago

      @@Purified1k pretty sure that for gaming on 1440p with a high-end gpu and the 5800x3d you can even skip AM5 easily

    • @Purified1k  1 year ago

      @sebastianferreira3595 Correct, easily. But for a new build, you should just use AM5 so you have an upgrade path.

    • @sebastian-sfX  1 year ago

      @@Purified1k Yeah, but looking at AM5 now, they'll probably develop it further so DDR5 and all can be better used; it's better to wait. Most B650 mobos top out around 6400 MHz and stability issues crop up if you touch too much, so people with the current boards and RAM are probably gonna want to buy new RAM or even a new mobo as well for the 8000- or 9000-series Ryzens. Makes no sense to go AM5 now IMO unless you're running a really old chip... I would wait myself until 2025 and see how the new gens come out.

  • @slaytorr.  1 year ago  +1

    I upgraded from the 3900X to the 5800X3D back in Jan. It's hard to sell me on anything new for a while with the value I got on that chip during the holidays.

  • @mikeyX101  1 year ago  +4

    Thank you for revisiting the 13900K, good stuff here.

  • @salemthekit6143  1 year ago  +14

    Man, first the 4060 ti, then the 7800XT, now the entire 14th series. We're truly in the era of generational stagnation. At least they had the decency to price the 7800XT well and the 14th gen the... same.

    • @KoItai1  1 year ago  +2

      Yea, they said after the 13th gen release that 14th gen would not be a generational upgrade, just a 13th gen refresh; that's why it has the same socket and everything almost the same. People just need to read, but it's normal for everyone on YouTube to make clickbait and stuff for views. Nobody said that 14th gen would be a leap, but views are views.

    • @DM16_  1 year ago  +2

      It's not even a refresh tho. It's nothing. Less than 5%.

    • @KoItai1  1 year ago

      @@DM16_ It is a refresh; that's why it's called Raptor Lake Refresh (13th gen being Raptor Lake).

    • @kawesu8781  1 year ago

      @@KoItai1 Then why name it as if it's another generation? Could've named it a 13950K at least. It's not like all people know it's a refresh, but we can surely say that a generational naming scheme from 13900K to 14900K does mislead people. Intel is doing this for marketing, obviously to deceive people.

    • @Pressbutan  1 year ago

      You live in an era where every industry is controlled ruthlessly by hedge funds, who demand not quality products but the maximum in shareholder return on investment. EA, Intel, Dreamworks Studios. Pick a company and they're guilty of being on strings, controlled by companies like Vanguard, Blackrock, et al

  • @grant0317  1 year ago  +1

    Was sad to see Total War not included in the benchmark tests, as it is one of my favorite franchises, but good to know this is literally just a refresh rather than a new generation!

  • @duduza1  1 year ago  +7

    This is THE MOST lowkey funny hardware channel I ever subscribed to. Love you guys.

    • @Auziuwu  1 year ago  +1

      I laughed so hard, love GN

    • @Pressbutan  1 year ago  +2

      Comments like these make me glad I spent the money during COVID lockdowns on a GN Wireframe mousemat. I'm glad I supported a channel that people enjoy so much.

  • @TheLoneWolfling  1 year ago  +7

    I don't know if this is feasible (I suspect it would require a _lot_ of test time, even though you could tolerate substantially more noisy individual test points), but a benchmark _sweep_ of something like Factorio across different working set sizes would be interesting. One thing I've noted is that performance of Factorio, and DF, and a lot of other simulation-esque games in general, tends to tank once your working set no longer fits in cache, and it's definitely noticeable that the 5800X3D can handle a lot more than most other cpus before that happens - but most benchmarks don't really show the full story, instead just showing one or two datapoints at max.
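
    A rough sketch of the kind of sweep suggested above, in plain Python (interpreter overhead mutes the effect, so read the numbers as relative, not absolute): random-permutation pointer chasing makes each access depend on the previous one, so once the working set outgrows the cache, the ns/access figure climbs.

    ```python
    import random
    import time

    def ns_per_access(n, steps=1_000_000):
        # Chase a random permutation: every load depends on the last,
        # which defeats the hardware prefetcher and exposes cache misses.
        perm = list(range(n))
        random.shuffle(perm)
        idx = 0
        t0 = time.perf_counter()
        for _ in range(steps):
            idx = perm[idx]
        return (time.perf_counter() - t0) / steps * 1e9

    # Working sets from thousands to millions of elements.
    for k in range(10, 23, 2):
        n = 2 ** k
        print(f"n={n:>8}: {ns_per_access(n):6.1f} ns/access")
    ```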

    • @hamzashaikh9310  1 year ago  +2

      Watch Hardware Unboxed's 14900K Factorio benchmark for this exact situation.

  • @jessemiller70  6 months ago  +4

    WoW! My 5800X3D still beats the 14900 in so many situations. And it consumes a fraction of the power.

  • @crabosity  1 year ago  +6

    Intel has done it again. Truly revolutionary

  • @CuriousMoth  1 year ago  +12

    Given we pay ~2.5x as much for energy here in the UK, the cost is far higher than the initial price tag suggests.

    • @DuBstep115  1 year ago

      Meh, when you are watching YouTube and surfing the internet the CPU is basically idle and consumes like 20 W. If you play games like 5 h per day, you pay like 10c per day more if your electricity is 20c per kWh. So ~35€ a year if you are a f**king nerd and play 5 h every single day.
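
      A quick check of the thread's arithmetic, with the electricity rate left as a parameter since it varies so much by country (the 5 h/day and 100 W figures are the comment's own assumptions):

      ```python
      def extra_cost_per_year(delta_w=100, hours=5, rate=0.20):
          # Extra kWh per year from drawing `delta_w` more watts for
          # `hours` per day, priced at `rate` currency units per kWh.
          return delta_w / 1000 * hours * 365 * rate

      print(f"{extra_cost_per_year():.2f} per year at 0.20/kWh")           # ~36.50
      print(f"{extra_cost_per_year(rate=0.50):.2f} per year at 0.50/kWh")  # ~91.25
      ```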

    • @Salvo78106  1 year ago  +3

      @@DuBstep115 10c? 20c? lmao. Where do you live, Russia? Electricity in Europe is more than 35c per kWh. In the UK it's ~2.5x that, so like 50/55c per kWh. Now redo the math, kid.

    • @stephenallen4635  1 year ago

      @@DuBstep115 During peak hours my electricity goes up to 60c/kWh, but it's usually 34c. But yeah, sure, if you just pretend the numbers are smaller, the cost argument goes away.

    • @DuBstep115  1 year ago

      @@Salvo78106 I was being generous and shooting high. I pay 8c for 1 kWh; I live in Finland. So it's even lower: a 100 W vs 200 W CPU difference is 15€ for me if you play 5 h EVERY SINGLE DAY.

    • @DuBstep115  1 year ago

      @@stephenallen4635 Well get rekt then :D

  • @johnmarick6974  1 year ago

    That episode's intro was great. Keep up the good work and creativity, computer channel number 1.

  • @soggycatgang  1 year ago  +5

    Honestly, I would've liked to see you add the i9-13900KS, to see if the i9-14900K even compares, because they're both 6 GHz.

    • @damara2268  1 year ago  +1

      They are exactly the same chip

    • @danielstory2761  1 year ago

      @@damara2268 The 14900K has slightly higher memory bandwidth; other than that it is the same chip.

  • @leonardoariewibowo7867  1 year ago  +5

    I can't believe I bought a 14900K 2 years ago.

  • @ifell3  1 year ago

    Been waiting for ONLY YOU to review this! No one else competes!

  • @draxrdax7321  1 year ago  +5

    "Our room heaters are 3 times more powerful than the competition, with 5% less computing! Intel - we make winters warm!"

  • @kintustis  1 year ago  +11

    A 30% increase in watts over my FX 9590. An engineering feat never before imagined. The bold future of more watts for more dollars, giving only as much performance as you need.

    • @HenrySomeone  1 year ago  +1

      Yeah well, it's also about 3000% more performance, so...

    • @stephenallen4635  1 year ago  +1

      @@HenrySomeone so... what? did you miss the point?

    • @HenrySomeone  1 year ago  +2

      @@stephenallen4635 Point? The 9590 was hot garbage, sucking down power (in times where most boards could NOT handle it; there were numerous cases of fried mobos back then) and performing only marginally better than an 8350/8370 and miles behind the 4770K that came out at the same time. The 14900K is also hot, but at least it performs.

    • @stephenallen4635  1 year ago  +1

      @@HenrySomeone Ok, so you do get it: what he was saying is exactly that, applied to the 14900K. It's barely faster than its predecessor, consumes almost 25% more power to a ridiculous degree, and should be laughed at like the 9590.

    • @HenrySomeone  1 year ago  +1

      @@stephenallen4635 Did you not watch the video where it shows it using even a few watts less than the 13900k, you blatant AMD fanboy?

  • @darylcheshire1618  1 year ago

    I've been using the i7 980 since 2008; the next five years from Intel only had incremental performance. I have just been updating the GPUs.
    Finally got the Ryzen 9 3950 in 2020; it's been great.

  • @melihuygur6578  1 year ago  +14

    7800x3d 💯

    • @KoItai1  1 year ago

      not necessarily, 14700k can perform the same or better in games, while being 2x better in multicore performance

    • @evilleader1991  1 year ago

      @@KoItai1 Did you even watch GN's review of the i7?

    • @KoItai1  1 year ago

      @@evilleader1991 Did the guy that commented watch the review of the 7800X3D to comment about it? No.

    • @evilleader1991  1 year ago

      @@KoItai1 What are you even on about, bro? The 14700K gets DEMOLISHED by the 7800X3D in gaming.

  • @alfonsecouillon  1 year ago  +4

    "Now for the 13..erm 14900K..." this cracked me up whenever Steve did this 🤣🤣

  • @denisruskin348  1 year ago  +5

    When a CPU draws almost as much as an RTX 3080. Wild times.

    • @AB-80X  10 months ago

      @@leanja6926
      Parrot.

  • @hamzashaikh9310  1 year ago  +7

    Can you test the Ratchet & Clank portal sequences and Starfield with different combos of CPUs and SSDs? Apparently Starfield is not just CPU-bound but also SSD-bound; same with the Ratchet & Clank portals.

    • @GamersNexus  1 year ago  +4

      We're using a high-end NVMe SSD. There is no SSD bind in our CPU testing. As for doing a standalone on it, no plans right now but maybe as Direct IO becomes more prevalent!

    • @hamzashaikh9310  1 year ago

      @@GamersNexus If you want to do it several years from now, I recommend buying the 3D XPoint Optane SSDs now before they are forever gone. For science.

  • @takecarey  3 months ago

    Just caught one on a $300 sale. Not a bad deal as an upgrade from a 12600K. Thanks for the summary.

  • @tweed0929  1 year ago  +6

    That soothing feeling when my 5900X draws 133 Watts at PEAK load and barely draws tens of Watts when idle...

    • @HenrySomeone  1 year ago

      Yeah well, it's also barely a third of 14900k's multi thread performance, so...

    • @Solrac-Siul  1 year ago  +1

      Actually, better not to mention idle: a 12700 pulls 6 watts at idle, a 13700 pulls 8, and even this hot comet that is the 14900 idles at 14.

    • @saricubra2867  1 year ago  +1

      Same here, I have the i7-12700K. Idle power is way lower than a 5900X.

    • @AB-80X  10 months ago

      Here's a question. At what wattage does your CPU sit when playing, say, Cyberpunk?
      And tens of watts? Funny. I'm looking at my 14900K in HWM right now and it draws less than 10.

    • @tweed0929  10 months ago

      @@AB-80X I don't play Cyberpunk, never did, never will. I refuse to even pirate that crap of a game.

  • @polymniaskate  1 year ago  +6

    Hilariously, on the biggest Swiss electronics retailer's website, the i9-14900K launches $20 cheaper than the current price of the 13900K... it's as if they knew that it would be what it takes to convince people to buy the newer one 😅
    Edit: same story with the 14700K vs 13700K btw

    • @Pressbutan  1 year ago

      That's strange. I wonder if Intel is offering any subsidies in order to shift these? It would be silly to not reduce the price on the older hardware first, but this is Intel we're discussing.

    • @polymniaskate  1 year ago  +1

      @@Pressbutan I don’t know, and interestingly, all four except for the 13700K are labeled “on sale” - but the discount on the 14900K is 13% while it is 7% on the 13900K - with both listed base price being virtually equivalent (the 13900K is listed as $5 cheaper)

    • @Pressbutan  1 year ago

      @@polymniaskate That is interesting. Could be a retailer thing to manage their own inventory. I have no clue on either side, but I am aware both retailer and vendor will mess with pricing like that to psychologically manipulate consumers into making poor decisions.

    • @-in-the-meantime...  1 year ago

      Bagged a 12900K/MSI mobo/Ripjaws DDR5-6000 combo from Micro Center a couple weeks ago for $400... see y'all in 3-5 years zzzzzzzz

  • @KevinCastillo-hh1fn  1 year ago  +1

    Awesome and informative video like always! thanks for the hard work :D

  • @Wokenomics_PhD  1 year ago  +10

    Looks like Intel is all in on the home heating business. That's just good business diversification strategy.

    • @radiantveggies9348  1 year ago

      It's weird how people don't realize this is the end goal.

    • @SixDasher  1 year ago

      Governments are forcing people to switch to electric heating, so I am just helping the environment!

    • @AB-80X  10 months ago

      @@radiantveggies9348
      Most people are sheep. Most people do not understand what the point of a 14900K is. They think it's a gaming CPU.

  • @joredor994  1 year ago  +5

    Given the uplift, shouldn't these have been the 13901K and 13701K?

  • @joshdoldersum9132  5 months ago  +3

    Thumbnail aged well.

  • @moodswinggaming2972  1 year ago  +11

    I have the 5800X3D and the TUF 4090. I am sat here smiling. The 5800X3D has to be the most legendary CPU ever made.

    • @dangerous8333  1 year ago  +1

      Let me know if it lasts 10 years and then we could say that.

    • @moodswinggaming2972  1 year ago

      @@dangerous8333 No one who buys a 4090 keeps a CPU for 10 years, what a stupid statement. Go lick Intel's boots.

    • @KR4FTW3RK  1 year ago

      Now if only AMD would let us overclock the darn thing. That's the one thing I regret about switching from a 3800X to the 5800X3D.

    • @AB-80X  10 months ago

      @@KR4FTW3RK
      Overclocking a 3D CPU. Yep, go right ahead. Please make a video. I've always wanted to see one of those go full Chernobyl.

  • @toufusoup  1 year ago  +9

    With every video comes an increasingly powerful and historical thumbnail. Can’t say they’re ineffective either, because it sure as hell got me to click on the video.