Intel Did It: Core i9-12900K CPU Review & Benchmarks vs. AMD

  • Published: 3 Oct 2024

Comments • 4.4K

  • @GamersNexus
    @GamersNexus  2 years ago +187

    Grab a GN Tear-Down Toolkit on back-order now to guarantee you get one in the next run! We have arrival dates on the store now. These have been in production a long time and have been in constant demand, so to make sure you get one, pick up a back-order here: store.gamersnexus.net/products/gamersnexus-tear-down-toolkit
    Watch Windows 10 vs. Windows 11 performance on the Intel Alder Lake CPUs: ruclips.net/video/XBFTSej-yIs/видео.html
    Watch our Intel i5-12600K review here! ruclips.net/video/OkHMh8sUSuM/видео.html
    If you want to see the full Alder Lake specs, we covered that here: ruclips.net/video/9ypRgILsDLo/видео.html
    And if you missed the Alder Lake architecture deep dive, you can find that here:
    ruclips.net/video/htCvo9XJZDc/видео.html

    • @beavatatlan
      @beavatatlan 2 years ago

      ok

    • @calvintrinh4869
      @calvintrinh4869 2 years ago +1

      First

    • @beavatatlan
      @beavatatlan 2 years ago +1

      @@calvintrinh4869 no

    • @beavatatlan
      @beavatatlan 2 years ago

      @@EpicGamingEct
      Their box design might be cooler, but the CPUs are not.

    • @Muckylittleme
      @Muckylittleme 2 years ago

      Hi, great review as always.
      I would love to see a real-world cost per hour/week/month/year for power usage included (whatever arbitrary baselines you set), since electricity is getting so expensive.
      How much more would it cost to run an i9-12900K than a 5600X, for example, over those periods? Is the extra power usage significant or trivial in those terms?
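      A rough sketch of the calculation being asked for, in Python; the wattage figures and the electricity tariff below are illustrative assumptions, not measurements from the review.

      ```python
      # Rough yearly running-cost delta between two CPUs under sustained load.
      # All inputs are assumptions for illustration: substitute measured package
      # power and your local electricity price.

      def energy_cost(watts: float, hours: float, price_per_kwh: float) -> float:
          """Cost of drawing `watts` continuously for `hours` at `price_per_kwh` $/kWh."""
          return watts / 1000 * hours * price_per_kwh

      PRICE = 0.18      # $/kWh, assumed tariff
      HOURS = 3 * 365   # three hours of full load per day for a year

      i9_12900k_w = 240  # W, assumed all-core draw
      r5_5600x_w = 76    # W, assumed all-core draw

      delta = energy_cost(i9_12900k_w, HOURS, PRICE) - energy_cost(r5_5600x_w, HOURS, PRICE)
      print(f"Extra cost per year at 3 h/day of full load: ${delta:.2f}")  # ~$32
      ```

      Under gaming loads the two chips draw much closer to each other than under all-core loads (a point several replies below make), so the real-world gap is usually smaller than this worst case.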

  • @J_KM654
    @J_KM654 2 years ago +850

    “As a reminder… If your CPU drops below 100 FPS in this game, Jensen Huang personally shows up and repossesses your GPU” got me cracking up 😂

    • @n1t21r3
      @n1t21r3 2 years ago +14

      Does it count if it's integrated? 😅😭😂

    • @zdroid2651
      @zdroid2651 2 years ago +5

      time stamp?

    • @barcelo2304
      @barcelo2304 2 years ago +21

      @@zdroid2651 19:00

    • @ar1xx._.626
      @ar1xx._.626 2 years ago

      Who's Jensen Huang, btw?

    • @whyjay9959
      @whyjay9959 2 years ago +2

      I guess I shouldn't play it on my Phenom II. Unless he gives you a better one?

  • @jellorelic
    @jellorelic 2 years ago +316

    So, so weird seeing you on a different set, Steve. Looks great, don't get me wrong... it's just been sooo many years of those industrial shelves behind you. Glad the new office space is getting ready!

    • @chairwood
      @chairwood 2 years ago

      I'm getting u wrong 😔

    • @Grandwigg
      @Grandwigg 2 years ago +4

      Yeah. It's strange, and I'm a relatively recent follower of the channel. (I can't quite bring myself to call him Tech Jesus.) The new location is looking good, though.
      I've got to say, the new CPU does look appealing, but I imagine it will have a lot of the same early-adopter headaches for a bit.

  • @zCaptainz
    @zCaptainz 2 years ago +475

    Intel's answer:
    "Hit the gas pedal! Floor it!"
    "But sir, it's hitting 250 Watts"
    "F*** it.. 1000W PSUs are the new meta"

    • @teemuvesala9575
      @teemuvesala9575 2 years ago +31

      Well, Nvidia is planning a 450W GPU, so yeah...

    • @johnscaramis2515
      @johnscaramis2515 2 years ago +32

      @@teemuvesala9575 The upcoming PCIe5 connector for GPUs will support 600W. Won't take long until GPUs are running at this limit.

    • @ColdVenom159
      @ColdVenom159 2 years ago +4

      Um, yeah man, 1000-watt PSUs have been a thing for years. Get out from under the rock you've been living under. If you're running high-end hardware, you'd better be running at minimum 1000 watts.

    • @ColdVenom159
      @ColdVenom159 2 years ago +3

      @@teemuvesala9575 Combining GPU wattage with CPU wattage makes no sense at all. No game uses 100% CPU and GPU, and Intel still clocks higher, which requires more power.

    • @hanzhong_wang
      @hanzhong_wang 2 years ago +49

      @@ColdVenom159 The thing is those PSUs were targeted towards multi-GPU setups 3-4 years ago. Being available for HEDT consumers and being required for a mainstream build are two different concepts.

  • @brovid-19
    @brovid-19 2 years ago +42

    I never get tired of hearing Steve dump on the 11900K at every chance he gets. It's funny every time.

    • @Darkhalo314
      @Darkhalo314 2 years ago +3

      It never gets old. I love it too

    • @brovid-19
      @brovid-19 2 years ago +4

      @@Darkhalo314 I have an 11900K :D

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 1 year ago

      @@brovid-19 Well.. At least it isn't getting wasted, so long as you got it for a cheap price.

    • @RTSchofield
      @RTSchofield 1 year ago +2

      @@brovid-19 It was more valuable as sand

    • @djmysz
      @djmysz 1 year ago

      @@brovid-19 I'm sorry for you

  • @KennethWayne
    @KennethWayne 2 years ago +732

    Whenever Steve mentions the 11900k the disgust is PALPABLE

    • @roknroller6052
      @roknroller6052 2 years ago +5

      I think the 12900 will become similar, because the 12700 is so close in spec: 8+4 vs. 8+8. Maybe I'm wrong; we'll see with the benchmarks. But I can't see how the extra performance from 4 E-cores (roughly one P-core's worth) could justify the cost.
      It seems the i9 is prioritising efficiency over the i7, which doesn't make sense for the flagship; 10+4 would.

    • @FADHsquared
      @FADHsquared 2 years ago +24

      @@roknroller6052 Look at the power consumption; it's already borderline unacceptable. Imagine if they went with 10+4.

    • @Azerkeux
      @Azerkeux 2 years ago +5

      I am genuinely turned off by the favoritism shown by certain tech channels. Whether they subtly favor Intel or AMD, it is inappropriate given what they represent and how their reviews can affect sales.

    • @roknroller6052
      @roknroller6052 2 years ago +6

      @FADHsquared... That's like complaining about the fuel consumption of a supercar, lol. The i9 is the balls-out flagship; it's not made for people prioritising efficiency. 10+4 would be almost double the gain of the current 12700-to-12900 step, and I would happily take the ~15-20W more.

    • @elitecol69
      @elitecol69 2 years ago +14

      @@Azerkeux Literally every RUclips channel. Maybe stop watching RUclips.

  • @Wenti_Tempest
    @Wenti_Tempest 2 years ago +401

    "Probably better off as sand on a beach somewhere" that absolutely killed me

    • @SteveAkaDarktimes
      @SteveAkaDarktimes 2 years ago +14

      especially since we kinda are running out of sand.

    • @hmmmyes6934
      @hmmmyes6934 2 years ago +1

      @@SteveAkaDarktimes We always are

    • @killtyrant
      @killtyrant 2 years ago +9

      Currently hoarding sand so my great grand kids can be better off than myself

    • @various3394
      @various3394 2 years ago +4

      I think in the original 11900k video he said something like “this sand would be better served stuck in someone’s bathing suit”

    • @LtdJorge
      @LtdJorge 2 years ago

      @@SteveAkaDarktimes no, we're not

  • @Mr.Morden
    @Mr.Morden 2 years ago +463

    Can't wait to see AMD strike back with 3D Vcache. This is a good time for CPUs.

    • @nightruler666
      @nightruler666 2 years ago +15

      If you can find them

    • @TheEtueify
      @TheEtueify 2 years ago +43

      AMD admitted 3D cache is expensive, though, so it's basically unavoidable that the 3D CPUs will be expensive. So it's probably not gonna be good value (much like the 12000 series is bad value due to motherboard/DDR5 prices).

    • @Antonio_Sosa99
      @Antonio_Sosa99 2 years ago +7

      AMD is done this year, dude. AMD hasn't said anything this year and probably won't. They depend too much on TSMC.

    • @jaronmarles941
      @jaronmarles941 2 years ago +14

      @@TheEtueify More like a bad time due to retailers jacking up the price. B&H has a "sale" on the 5950X for $750... That's MSRP....

    • @theworksautomotive9445
      @theworksautomotive9445 2 years ago +3

      Actually, all the difference appears to be RAM, and RAM alone.

  • @JusticeWarden
    @JusticeWarden 2 years ago +42

    Steve, it was nice meeting you when you were getting more office equipment today! Love your channel and how informative it is for beginner and even advanced pc enthusiasts. Y'all definitely helped me decide on what pc parts are optimal.

  • @tzuyd
    @tzuyd 2 years ago +888

    "Single digit differences, often less than 10%"
    This solidifies everything I know about maths

    • @lefterislazaropoulos8846
      @lefterislazaropoulos8846 2 years ago +148

      A single-digit difference can be more than 10%. For example, with a baseline of 60 fps, a 10% gain would be 6 fps, so +9 fps would be a single-digit difference but a double-digit percentage gain.
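      The distinction the reply is drawing, as a minimal sketch (the 60 fps baseline is the example's own):

      ```python
      baseline_fps, gain_fps = 60, 9            # +9 fps: a single-digit absolute gain
      print(f"{gain_fps / baseline_fps:.0%}")   # -> 15%, a double-digit relative gain
      ```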

    • @NBWDOUGHBOY
      @NBWDOUGHBOY 2 years ago +18

      The 12900K at around $600 is faster than the 5950X at around $750. That's where it counts.

    • @nipa5961
      @nipa5961 2 years ago +124

      @@NBWDOUGHBOY Well, not really. The 5950X is at 719€, the 12900K is at 679€. Taking the much, much higher motherboard, memory and cooling costs into consideration, Alder Lake won't be any cheaper than Zen3 in the near future, but it's great to finally see real competition again.

    • @pilotguy1141
      @pilotguy1141 2 years ago +42

      @@NBWDOUGHBOY but good luck trying to get either one at MSRP

    • @sandyfordd1843
      @sandyfordd1843 2 years ago +33

      There are three types of people in the world, those who are good at maths and those who aren’t.

  • @fieldrat9046
    @fieldrat9046 2 years ago +587

    Clearly AMD’s mistake was using only efficiency cores in their design!
    /s

    • @noahokeyo
      @noahokeyo 2 years ago +37

      Underrated comment

    • @oksowhat
      @oksowhat 2 years ago +9

      underrated

    • @beavatatlan
      @beavatatlan 2 years ago +22

      overrated comment

    • @andersjjensen
      @andersjjensen 2 years ago +1

      Thank you for the laugh good Sir!

    • @andymath89
      @andymath89 2 years ago +42

      How DARE AMD do such a terrible thing? Being extremely efficient? WTF?!?

  • @bilbobaggin3
    @bilbobaggin3 2 years ago +322

    dang, the new set looks dope, and the lighting feels way better, too

    • @Greali
      @Greali 2 years ago +1

      @@BobDevV seems like green screen yup

    • @FleaOnMyWiener
      @FleaOnMyWiener 2 years ago +1

      It's real

    • @TheGinGear
      @TheGinGear 2 years ago +7

      It looks like a green-screen because the angles of the lighting on the background set and the foreground (including Steve) are different. Which is something that everyone can subconsciously pick up on.
      It's real; it's just shenanigans due to lighting

    • @WinderTP
      @WinderTP 2 years ago +1

      @@TheGinGear The lighting on Steve is also pretty even, especially on the face, which makes it look more like typical green-screen lighting. Loses some of the character IMO, but at least his hair remains fabulous.

    • @mathesar
      @mathesar 2 years ago

      @@BobDevV I thought it was green screen too, but then realized it's most likely just really bright lighting, maybe a bit overkill lol. EDIT: Ok, after watching the intro again I'm conflicted on real or not; something seems a little off with the lighting.

  • @TimeBucks
    @TimeBucks 2 years ago +248

    Great review as usual.

  • @FrankFloresRGVZGM
    @FrankFloresRGVZGM 2 years ago +88

    You and your team deserve all the praise for getting this done while moving. Great job!

  • @shepherd8762
    @shepherd8762 2 years ago +92

    I'd love to see comparisons of how it runs on DDR4 vs. DDR5 RAM and how big of an impact that makes.

    • @doublecrossedswine112
      @doublecrossedswine112 2 years ago +2

      bingo

    • @Jonw8222
      @Jonw8222 2 years ago +14

      Hardware Unboxed's review covered that. It pretty much makes no difference outside of certain apps/games; on average, not worth the money IMO. Which was to be expected, and is always the way with a new generation of RAM.

    • @shepherd8762
      @shepherd8762 2 years ago +2

      @@Jonw8222 I saw theirs after this one. I do still want to see it from Gamers Nexus.

    • @nicolasinvernizzi6140
      @nicolasinvernizzi6140 2 years ago +2

      I was reading the other day that in order to get better performance than 3200 CL16 DDR4, you have to get DDR5 over 6000 with CAS latency lower than 36; anything else will result in the same or worse performance. So most DDR5 kits right now will not get you a faster PC.
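      That claim lines up with the standard first-word latency rule of thumb (nanoseconds = CL × 2000 / data rate in MT/s); a quick check using the kit speeds quoted above:

      ```python
      def first_word_latency_ns(cas_latency: int, data_rate_mts: int) -> float:
          # CAS cycles converted to nanoseconds at the module's transfer rate
          return cas_latency * 2000 / data_rate_mts

      print(first_word_latency_ns(16, 3200))  # DDR4-3200 CL16 -> 10.0 ns
      print(first_word_latency_ns(36, 6000))  # DDR5-6000 CL36 -> 12.0 ns
      ```

      So even DDR5-6000 CL36 trails that DDR4 kit on first-word latency; DDR5's advantage is bandwidth, which is why only the fastest, lowest-CL kits come out ahead in latency-sensitive games.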

    • @Cyber_Akuma
      @Cyber_Akuma 2 years ago

      Same, especially since there are (or will be, at least) motherboards that support both, so it would be easier to eliminate any differences other than the RAM.

  • @names-are-for-friends
    @names-are-for-friends 2 years ago +139

    lmfao that power chart completely recontextualises the whole review

    • @ShogoKawada123
      @ShogoKawada123 2 years ago +4

      You mean the one that shows a big improvement vs. the 10900K, and similar results to the 5950X when running at similar all-core clock speeds? I don't see how it's Intel's problem that they simply made "stock" on these chips basically what would traditionally have been OC behaviour, whereas AMD put super-conservative stock limits on their parts. I'd really argue that the only truly accurate way to compare the actual *physical* efficiency of the chips would be to have the Intel ones in a "stock but without power limits" configuration, and the AMD ones in a "stock but with PBO enabled" configuration.

    • @champion1642
      @champion1642 2 years ago +29

      @@ShogoKawada123 Yeah, AMD might've used conservative limits, but they're still within 5-10% of the 12900K, which uses DOUBLE the power.
      It's weird that Steve decided to only show the power usage of the overclocked 5950X and not show its performance.

    • @bltzcstrnx
      @bltzcstrnx 2 years ago +5

      @@champion1642 An overclocked 5950X is not good: a 10% or so improvement in multi-threaded loads and a negative impact on lightly-threaded loads, all for 100% more power. Totally not worth it.

    • @ShogoKawada123
      @ShogoKawada123 2 years ago

      @@champion1642 The 5950X at 4.7GHz all-core uses basically the same amount of power as the stock 12900K. My point was that this is all entirely a matter of how high the chips are *allowed* to clock up, and not much else. Arbitrary manufacturer-imposed limits prove zero about how *physically* efficient the chips actually are. As always, also, these power concerns aren't going to matter at all if your primary workload is gaming.

    • @evrythingis1
      @evrythingis1 2 years ago +4

      @@ShogoKawada123 The problem is that AMD's 5950X crushes it in productivity if you overclock it to draw the same power the 12900K does...

  • @salitz91
    @salitz91 2 years ago +95

    "Waste of sand" never fails to make me laugh - y'all rock.

    • @salitz91
      @salitz91 2 years ago +1

      @@8jzX_u1LN_Uez2 They only bestow that moniker on a chip that either drastically underperforms or, in the case of the 5800X more recently, essentially shouldn't exist because its price/performance is covered by the 5600X and 5900X.
      I don't think GN believes they could ever make a CPU. They are looking out for consumers, giving us information, and telling manufacturers how to improve their products and pricing.
      Not sure if you took the term personally.

    • @salitz91
      @salitz91 2 years ago

      @@8jzX_u1LN_Uez2 So if Gigabyte makes PSUs that explode and GN does the quality-control tests to find out why, they shouldn't tell Gigabyte, just because GN isn't a PSU manufacturer?
      While manufacturers do know how to make their products, consumers have a vested interest in knowing their purchases are sound and in demanding quality. The fact that I and GN don't know exactly how a PSU or CPU is made doesn't mean we can't or shouldn't demand that they don't catch on fire, or that they be priced better.
      Anyway, at the end of the day, manufacturers are selling a product for profit. By Steve and the guys at GN calling a chip a waste of sand, they're telling consumers they are better off spending their money differently, and providing the manufacturer with some tongue-in-cheek feedback that their product offering or pricing needs revision. Demand versus price is a fundamental economic concept, and GN uses the platform to give feedback to suppliers to help the market work better.
      It wasn't as simple as an insult, and GN just wants to see quality products at reasonable prices. In a time when any CPU is hard to come by, GN told Intel it was essentially a waste to make the 11700K and that Intel would be better off economically by focusing on the other CPUs.

    • @salitz91
      @salitz91 2 years ago +1

      Yes. And if we all refuse to buy 11700Ks and Gigabyte PSUs, they know to change them or stop making them, just as Gamers Nexus points out, to encourage market action.
      I just wonder what your point is in all this. Just not a GN fan? A chip-manufacturer employee who feels slighted? Just like to make platitudes and point out fallacies to make others argue? You haven't made a specific claim in regard to the original comment.

    • @tristanreymendoza2402
      @tristanreymendoza2402 2 years ago +1

      @@8jzX_u1LN_Uez2 ok, sure Intel

    • @tomstonemale
      @tomstonemale 2 years ago +1

      @@8jzX_u1LN_Uez2 Like you need to make CPUs to criticize their performance-versus-price value, lol. Nobody could in good conscience have recommended that POS of a CPU at launch.

  • @pirojfmifhghek566
    @pirojfmifhghek566 2 years ago +182

    Me looking at the benchmarks: "Wow, that's nice. I guess Intel is really getting competitive for once."
    Power Consumption Benchmarks: "HOLY SHIT."

    • @josebr0
      @josebr0 2 years ago +25

      Imagine buying an Arctic Freezer only to run your CPU at 75 degrees. Insane.

    • @HolyRamanRajya
      @HolyRamanRajya 2 years ago +41

      Yeah, double the power of the 5950X. Looks like Intel's finishing touch was to just overclock and climb the leaderboards.

    • @jonasrossaert2836
      @jonasrossaert2836 2 years ago +3

      Intel triumphs, POWAH!

    • @uniktbrukernavn
      @uniktbrukernavn 2 years ago +26

      Looks like they learned from NVIDIA: just add more power and call it Advanced V-Tech or whatever. V stands for Volts, I guess.

    • @MRoach03
      @MRoach03 2 years ago +6

      Intel is competitive, though... There are reasons to buy Intel over AMD; only fanboys can't see that.

  • @Pezby69
    @Pezby69 2 years ago +234

    It's not "trouble" for AMD. It's competition, and AMD will rise to the occasion and design more competition for Intel. The cycle will continue. Who wins? We do as consumers, hopefully.

    • @arch1107
      @arch1107 2 years ago +26

      That is pure truth. Back when AMD was losing, I won: their parts were dirt cheap and great for my needs and others'.

    • @yama007
      @yama007 2 years ago +34

      What? Who will buy a 240W CPU providing similar performance to a 120W CPU? Come on.

    • @Scoelho83
      @Scoelho83 2 years ago +29

      @@arch1107 And Intel in some benchmarks is using DDR5-5200 while the R9 is on DDR4-3200, not to mention the Intel 12th gen consuming 400W. So where's the efficient green energy?

    • @drbali
      @drbali 2 years ago +1

      Hopefully. AMD's margins are honestly ridiculous compared to the rest of the industry.

    • @arch1107
      @arch1107 2 years ago +23

      @@Scoelho83 Definitely not here. AMD has that advantage, and AMD wants to cut its own power consumption in half over the next 5 years; Intel is doubling theirs.
      It's not even funny anymore.

  • @jackwiedemann
    @jackwiedemann 2 years ago +71

    240 Watts, holy hell that's a hot CPU. I was impressed by the performance jump, but this power jump explains a lot 😅

    • @ColdVenom159
      @ColdVenom159 2 years ago +5

      Lmfao, please. I'll clear 300 watts with my 10900K and I don't come close to 70C.

    • @addictedtoRS
      @addictedtoRS 2 years ago +31

      @@ColdVenom159 Yeah and my 5900X boosts to 6.5GHz and stays below 50C full load.

    • @sphinx4579
      @sphinx4579 2 years ago +6

      Yup, they have no other means until they get down to a 7nm process node, I think. By then AMD and Apple will be down to 5nm and 3nm respectively, so hard times are ahead for Intel.

    • @MsHojat
      @MsHojat 2 years ago +4

      I feel like we might be getting a bunch of early failures on these CPUs in the next 5 years (even if it's due to incomplete thermal paste application or weaker heatsinks), but maybe not.

    • @ColdVenom159
      @ColdVenom159 2 years ago +3

      @@addictedtoRS Lol, talks about boosting to 6.5GHz "NOT ALL CORE", as the 5900X will not do those speeds unless you're on LN2. Gtfoh lol.

  • @invisa_boi
    @invisa_boi 2 years ago +21

    Gamers Nexus: "As a reminder, if your CPU drops below 100 FPS, Jensen Huang personally shows up and repossesses your GPU"
    Me: running R6S at 50 fps with my 1050 Ti at 100% usage. Please take it and give me a 3080, thank you.

    • @mhobag
      @mhobag 2 years ago

      R6S should run higher on a 1050 Ti, though.

  • @SpinDlsc
    @SpinDlsc 2 years ago +93

    So what you're saying, Steve, is: "You can literally see it!"
    All joking aside, it's good to see Intel finally in a competitive spot again with a product that really is unique. Here's hoping both companies will continue to beat one another out so that we can keep getting better products from either side.

    • @insomniacbritgaming1632
      @insomniacbritgaming1632 2 years ago +1

      They have been competitive each generation... until AMD catches up on the generations, lol.

    • @dosko9980
      @dosko9980 2 years ago +8

      Damn, if only we got competitive pricing now.

    • @TheKazragore
      @TheKazragore 2 years ago +4

      If they couldn't catch up to AMD's previous gen after a year and a half and two generations of their own, I'd have been worried.

    • @pachodomi
      @pachodomi 2 years ago +1

      Damn, if only the power consumption were competitive.

  • @Scarlet_Soul
    @Scarlet_Soul 2 years ago +163

    The power consumption difference is huge, though: nearly double the power for generally comparable scores.

    • @krystina662
      @krystina662 2 years ago +56

      Yeah, in hindsight: people said that Zen 3 threw efficiency out of the window; Intel smashed it and set it on fire before throwing.

    • @KleinMeme
      @KleinMeme 2 years ago +23

      Yup, in my eyes that's insane.
      As I wrote: for ~0-15% more performance over the competition?
      I thought GN would criticise the consumption more, as they did in the past. I was a bit confused that they did not.
      I will see how much it will cost in reality.
      MSRP is no information worth considering nowadays. :(

    • @irridiastarfire
      @irridiastarfire 2 years ago +18

      Based on my napkin math, a 12900K running 24 hours a day for 365 days would cost $187 (dollarydoos) more than a 5950X for the same Blender performance (2.88 kWh/day @ $0.1831/kWh). Obviously that's an oversimplification, because most people don't run at full power all day and night for a year, but it's not a trivial difference.
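      Plugging the quoted figures straight in reproduces the napkin math (the small gap from the commenter's $187 presumably comes from rounding in the original inputs):

      ```python
      extra_kwh_per_day = 2.88  # energy delta quoted in the comment
      price_per_kwh = 0.1831    # tariff quoted in the comment
      print(f"${extra_kwh_per_day * 365 * price_per_kwh:.2f}")  # -> $192.47
      ```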

    • @DustinShort
      @DustinShort 2 years ago +16

      We had a decade of "you don't really need 1000W PSUs" and now we might be back to a world where we need to spend an extra $100 on power. Alder Lake looks solid, but I think the "value" proposition is minimal at the moment, considering that DDR5, mobos, and PSUs are all gonna be more pricey than a comparable Zen 3 platform. I'm interested to see what Zen 3+ with DDR5 support brings to the table... and then I'll continue to use my old busted HW because the chip shortage is real lol.

    • @needles_balloon
      @needles_balloon 2 years ago +4

      @@KleinMeme They might not have mentioned the performance-to-power ratio as much because 1) they're using Windows 10 instead of 11, which should be better, and 2) they were using a worse memory config than they wanted to. I doubt those two things will lower power consumption, but they may increase performance to make the trade-off a bit more worth it. Perhaps it's also because this is a $600+ flagship part, where cooling costs are less of a consideration compared to the 12600K.

  • @nickllama5296
    @nickllama5296 2 years ago +82

    The numbers were very impressive; then I got to the power section. Holy power draw, Batman! Double the power draw for a 10-15% performance gain is nonsensical.

    • @sophieedel6324
      @sophieedel6324 2 years ago +11

      Maybe you should add that the high power draw is only in a specific benchmark, and not in gaming. You seem like one of those "worried about power draw" AMD fanboys that twist facts.

    • @quirinaled2752
      @quirinaled2752 2 years ago +21

      @Sophie Edel And you seem like one of those people who think that graphs clearly showing much higher power consumption relative to performance mean nothing. The only person who probably doesn't care is someone with a pure productivity business case, where power draw, thermals, etc. may be acceptable versus the value of the work output.

    • @meunknown69420
      @meunknown69420 2 years ago +8

      240 watts is during a workstation workload;
      the 12900K draws closer to 125 watts during gaming, according to LTT.

    • @Finder245
      @Finder245 2 years ago +10

      @@sophieedel6324 It's funny, because when Intel processors were more power efficient than AMD processors, Intel fanboys were bashing AMD for making hot garbage.
      In practice though, power consumption is an important factor in total cost of ownership. This also tells you that Intel still won't be competitive in data centers, where power efficiency is even more important. Intel might be popular again for running per-core licensed software, though.

    • @michaelearl6765
      @michaelearl6765 2 years ago

      If I remember my EE classes correctly, double the power draw only translates to square-root-of-two times the frequency... So about 140% would be expected?
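      For what it's worth, the usual first-order model from those EE classes points to a cube root rather than a square root: dynamic power scales with voltage squared times frequency, and voltage rises roughly linearly with frequency near the top of the V/f curve, so

      ```latex
      P_{\mathrm{dyn}} \propto C V^2 f, \qquad V \propto f
      \;\Rightarrow\; P_{\mathrm{dyn}} \propto f^3
      \;\Rightarrow\; \frac{f_2}{f_1} = \left(\frac{P_2}{P_1}\right)^{1/3} = 2^{1/3} \approx 1.26
      ```

      That is, doubling power buys only about 26% more frequency under this model, and less in practice once static leakage is included.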

  • @bluediamond80
    @bluediamond80 2 years ago +7

    I do miss the IT-warehouse vibe of the previous set. Made me feel like I was at work talking to fellow tech geeks. Grats on the new place, though!

  • @Elc22
    @Elc22 2 years ago +341

    So, they managed to meet and exceed the 5950X by throwing wattage at it. Lol. At least they are improving and pressuring AMD to continually improve too. Seriously though, 2x the power draw of a stock 5950X... That's insane.

    • @notaname1750
      @notaname1750 2 years ago +25

      You know what they say.
      If it works, it works!!

    • @benjamintran5444
      @benjamintran5444 2 years ago +35

      Yep, Intel just set the CPU's operating voltage higher than the nominal one to beat AMD's CPUs. That explains why the power is so high. I am sure that will affect the lifetime of these CPUs... I predict we will hear a lot about Intel 12th-gen CPUs dying at a higher rate than usual within one or two years...

    • @jamesm568
      @jamesm568 2 years ago +5

      The last time I checked, increasing voltage and wattage has always helped a CPU.

    • @FunKaYxxD1sCO
      @FunKaYxxD1sCO 2 years ago +26

      The 5950x is a monster.

    • @isnogood-
      @isnogood- 2 years ago +26

      I need a cooler PC, not a hotter one. I need AC.

  • @Ocastia
    @Ocastia 2 years ago +122

    Ohh nice.
    But what I would really love to know now is: how does the performance scale with power?
    If I limit the 12900K to 150W, for example, how much performance do I lose? 50%? 40? 30? Or maybe even less?
    Because it seems to me like Intel took out the sledgehammer to get in front of AMD via brute force, so there might be potential for optimization.
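    On Linux you can run exactly that experiment through the intel_rapl powercap interface; a minimal sketch, assuming the driver is loaded and the conventional sysfs layout (paths vary by platform, and writing requires root):

    ```python
    # Cap the CPU package's long-term power limit (PL1) via Linux powercap/RAPL.
    # Assumes the intel_rapl driver and the usual sysfs layout; run as root.
    from pathlib import Path

    PKG = Path("/sys/class/powercap/intel-rapl:0")  # package 0 power domain

    def set_long_term_limit(watts: int) -> None:
        # constraint_0 is the long-term (PL1) constraint on most systems
        (PKG / "constraint_0_power_limit_uw").write_text(str(watts * 1_000_000))

    print((PKG / "name").read_text().strip())  # sanity check: should print "package-0"
    set_long_term_limit(150)                   # the 150W cap asked about above
    ```

    On Windows the same PL1/PL2 limits can usually be set in the BIOS or with Intel XTU; re-running an all-core benchmark before and after traces out the scaling curve.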

    • @lubossoltes321
      @lubossoltes321 2 years ago +17

      This. Is there an option to restrict the 12900 to 150W and see how it compares? I'd guess worse than the 11900...

    • @MrTrilbe
      @MrTrilbe 2 years ago +11

      More like Intel was struggling so badly against AMD that they had to copy something ARM did 11 years ago. Wonder how much Intel is paying for it; it might be part of the deal that allows Intel to manufacture ARM chips for third parties.

    • @unitedfools3493
      @unitedfools3493 2 years ago +49

      @@MrTrilbe You seem very uninformed making stuff up like that.

    • @IamUzyf
      @IamUzyf 2 years ago +19

      @@MrTrilbe AMD fanboys going crazy

    • @Erowens98
      @Erowens98 2 years ago +24

      Yeah. This thing runs hot as the sun because Intel jammed all the power they could into it to come out ahead. I expect a lot of real-world users will be downtuning this thing to help manage temps, since 360mm AIOs aren't cheap and yet seem like they're probably necessary here. This thing drinks wattage like an overclocked chip right out of the box.

  • @you2be839
    @you2be839 2 years ago +34

    "Efficiency Cores" = 243W!
    That really settles it for me: not interested in buying new PC hardware until at least the AM5 platform and Zen 4 CPUs launch, to see what they bring to the table.
    And if they too don't bring anything excitingly new to what is now known, and especially if USB4 is still missing by then on both Intel and AMD platforms, I guess late 2023 is the next plausible date... by which time we'll already know about the performance and TDP scaling of the 13900K compared to the 12900K, and we'll probably also know about the successor to the LGA 1700 platform, and therefore might have to wait further still, until 2024...
    (edited AM4 to AM5)

    • @reversedon8698
      @reversedon8698 2 years ago +6

      Intel 12th-gen desktop CPUs: now with a built-in engine-block heater for those frosty mornings.

    • @AzPakaTa
      @AzPakaTa 2 years ago

      It's worth it for someone like me with a 9700K, which is already not good enough for a GPU like the 3080.

    • @reversedon8698
      @reversedon8698 2 years ago

      @@AzPakaTa I've got a 4790K/GTX 1050 Ti and I'm not upgrading it yet. When Win10 support ends, it's time to upgrade; if Intel can't get their shit together, team red is my choice. A few months ago I upgraded my laptop, and AMD was my choice. 4 cores in an i7 is just a fucking joke in 2021, when mobile Ryzen 3 has better performance.

    • @ColdieHU
      @ColdieHU 2 years ago +2

      @@AzPakaTa Still would go with AMD, since you are going to need a new board anyway. The 12900K is not only going to cost you extra at purchase but also on your electricity bill.

    • @AzPakaTa
      @AzPakaTa 2 years ago

      @@ColdieHU I'm not getting the 12900K, I'm getting the 12700K, which is cheaper and better than the 5800X (where I live). Also, compared to the 5900X, which is also 12 cores, it costs like $150 less and I bet it's gonna beat it in some stuff. I need a new motherboard regardless of going with Intel or AMD. What I'm looking for is the better CPU, and that's the 12700K. Also, the electricity bill argument is big bullshit; it only matters if you run your PC 24/7 at 100%. I don't care if I have $5 more or less on my bill at the end of the month, because I don't sit at 100% CPU/GPU all the time.

  • @YuProducciones
    @YuProducciones 2 years ago +1

    Nice background, looks so much better 💙

  • @dirkmanderin
    @dirkmanderin 2 years ago +74

    Would have been nice to see power usage in a couple of games. If Alder Lake still uses ~100W more power than a 5900X for a small gain in FPS, it's definitely not worth it.

    • @agreatd
      @agreatd 2 years ago +12

      Power consumption appears to be extremely similar while gaming, which is the most important factor for people doing just gaming and casual use.

    • @dr.noscopegaming630
      @dr.noscopegaming630 2 years ago +3

      @@agreatd The 12900K is worth it for the price.
      But I don't know why, I feel all of AMD's and Intel's CPUs are priced perfectly for the performance they give.
      If the 5900X got a $100 cut, it would still be an amazing CPU.

    • @XTheLolX301
      @XTheLolX301 2 years ago +18

      @@dr.noscopegaming630 You have to add the price of a new motherboard, DDR5 RAM, and a high-end cooler.

    • @you2be839
      @you2be839 2 years ago +7

      Exactly my thoughts, those TDP values alone put me off... and I'm sorry for any joke I may have made about the TDP of the 11900K!!

    • @Aereto
      @Aereto 2 years ago +1

      That high power draw also makes people running narrow but efficient PSU margins take a hard pass.
      I have a Seasonic PRIME 850W, and there's already an RX 6900 XT eating a significant chunk of that headroom.

  • @Ashtree81
    @Ashtree81 2 years ago +7

    I bought the 9900K a couple of years ago, when people were mocking its high power draw. It now seems on the efficient end compared to the past couple of Intel generations.

  • @TommyCrosby
    @TommyCrosby 2 years ago +59

    RUclipsrs: I'm moving home; expect fewer videos and bad sound until I fix the room.
    GN: We're moving a full studio with new sets, and yet we still managed to make a full review of a new CPU.

    • @Nanerbeet
      @Nanerbeet 2 years ago

      I guess GamersNexus runs like a real gaming company where the staff nearly kills itself trying to make deadlines.

  • @ChairmanDDD
    @ChairmanDDD 2 years ago +78

    I'd like a DDR5 vs. DDR4 test. DDR5 memory is impossible to find and will likely be scalped through the holiday season. For those of us upgrading to this CPU, is it worth the premium and the battle against bots?

    • @GholaTleilaxu
      @GholaTleilaxu 2 years ago +3

      Is that a rhetorical question? If it's not, then let me provide you with an answer with another question: Why would you want to "upgrade" to this CPU?

    • @dslamngu
      @dslamngu 2 years ago +14

      Hardware Unboxed recommends sticking with DDR4 for now. DDR5 beats DDR4 on some tests, and DDR4 wins on others, but DDR5 is like double or triple the price.

    • @milesfarber
      @milesfarber 2 years ago

      DDR5 has ECC built in. Anyone who wouldn't upgrade to DDR5 is probably turned on by 0xc0000005 bluescreens and memory-address kernel panics. I mean, you ARE doing important work on your PC, right?

    • @Finder245
      @Finder245 2 years ago +14

      @@milesfarber DDR5 has on-die ECC. The memory bus does not have parity. The on-die ECC of DDR5 was added to maintain the same reliability that older (less dense) technology has without ECC. Servers will still use ECC DIMMs, so they will have on-die ECC and traditional ECC on top of that.

    • @neonspark707
      @neonspark707 2 years ago +1

      DDR5 will replace DDR4; it is not an if. It is just a question of buying a motherboard with Gen5 PCIe and DDR5 versus buying into the existing tech. If you're coming from a much older chip, of course you should. If you're on a more recent chip... maybe wait a couple of gens for prices to drop.

  • @RN1441
    @RN1441 2 years ago +66

    The loud part: congratulations, Intel, on outperforming AMD. The quiet part: a full year later, at 2x the power draw.
    I'm legitimately happy to have more choice and supply in the market; I just expected it to be a lot more power efficient, given Intel has been struggling with this 10nm process for the desktop since ~2016.

    • @chubbysumo2230
      @chubbysumo2230 2 years ago +3

      If this is priced well by retailers, it could completely take over the 5950X's space in the market. It could also force a price drop on the 5950X, which would be even better. Given the unit pricing, I expect these to come in at around $700 retail.

    • @RN1441
      @RN1441 2 years ago +4

      @@chubbysumo2230 Price drops and wider choices make everyone win, so I'm all for it.

    • @quintoblanco8746
      @quintoblanco8746 2 years ago +14

      Yeah, disappointed with the power draw. High performance PCs becoming heaters should not become the new standard.

    • @darreno1450
      @darreno1450 2 years ago +9

      Disappointed with the actual performance. I expected a lot better after the hype.

    • @thelegendaryklobb2879
      @thelegendaryklobb2879 2 years ago +5

      The whisper-quiet part: Zen 3D is incoming, likely to match or reclaim the top without an expensive platform change

  • @willg955
    @willg955 2 years ago +9

    Well this is a fun video if you own an 11900K. In my defense, I got it for $300 open-box.
    Compared to the last flagship, I guess I get where Steve is coming from, lol.

  • @myfavoriteviewer306
    @myfavoriteviewer306 2 years ago +117

    I don't take a side in the CPU wars, but I am absolutely thrilled that close competition seems to be back. Also, the new studio is looking sharp!

    • @srinathshettigar379
      @srinathshettigar379 2 years ago +14

      250W stock. Is this an Intel GPU? Lol.

    • @myfavoriteviewer306
      @myfavoriteviewer306 2 years ago +5

      @@srinathshettigar379 Yeah, I was really hoping for substantial power savings. I'm not upgrading any time soon, but it would have been nice to see them make a move in the right direction.

    • @jackwinter1507
      @jackwinter1507 2 years ago

      @@myfavoriteviewer306 they did? They’re not wasting sand like the 11 series

    • @pachodomi
      @pachodomi 2 years ago +1

      Double the power consumption isn't competitive yet.

    • @roknroller6052
      @roknroller6052 2 years ago +1

      @Black and Red Pill... Yeah, honestly you're better off saving money on the CPU and putting it toward a GPU. A used 3600X is a good saving over a new 5800/5900, which are just overkill. How times have changed: you used to need the flagship X6800 Core 2 or a 4770K, and now an old 8700K or 3600X is more than capable.
      I remember how 2008's GTA4, the most demanding and badly optimised game of its time (like Cyberpunk now), needed the latest Core 2 to run. Whereas now a 2012 Intel 3770K, a 9-year-old processor, can run CP77 playably at 80 fps.
      Can you imagine, in 2008, when GTA4 just barely ran on my X6800 Core 2, running it on an old Pentium III from 1999? Lmao. CPU progress has been huge recently.
      People could save themselves so much pain over GPU pricing if they just saved on the CPU. You can get a CPU, board, and RAM on eBay for the price of a new i7 or R7. If only people realised it's easily a $400-600 saving over the 2000-2017 era, with no impact on game playability; it helps offset the GPU inflation.

  • @earlm
    @earlm 2 years ago +128

    When everyone posts about a new release, GN is always the first to watch.

    • @JustxLiving909
      @JustxLiving909 2 years ago +5

      Hardware Unboxed?

    • @RandomLucker
      @RandomLucker 2 years ago

      and/or Igor's Lab ;)

    • @darkdraconis
      @darkdraconis 2 years ago +1

      @@JustxLiving909 nah, GN or DF

    • @nonamenameless5495
      @nonamenameless5495 2 years ago

      I go Hardware Unboxed first as well, then GN, Linus and Level1 next

    • @ihebmhamdi3451
      @ihebmhamdi3451 2 years ago

      Hardware Unboxed is best used for gaming results. The number of gaming benchmarks Steve does is quite remarkable, exceeding other RUclipsrs like GN and Linus.

  • @righteousred723
    @righteousred723 2 years ago +109

    Maybe add a fire extinguisher to the set...

    • @ckl3814
      @ckl3814 2 years ago

      That and a fire suppression system 😎

  • @corkscrew4585
    @corkscrew4585 2 years ago +8

    Gamers Nexus 2 years ago: "Ya, we like tested this lol and it's shit, but this one is good."
    Gamers Nexus now: "Today we tested all 52 pieces. Sorry, it's my fault the team didn't get to 55."

  • @SkillTimO
    @SkillTimO 2 years ago +97

    Cool new studio Steve!

    • @PotatInside
      @PotatInside 2 years ago +2

      Cool new studio. Hot new Intel CPU that consumes more energy than two 5950Xs...

    • @cytro
      @cytro 2 years ago

      @@PotatInside someone's mad

    • @PotatInside
      @PotatInside 2 years ago +3

      @@cytro Why thank you!

  • @milestailprower
    @milestailprower 2 years ago +137

    "lo and behold, the CPU market is becoming competitive" - Tech Jesus (2021)

    • @enlightendbel
      @enlightendbel 2 years ago +6

      Yeah, some dude on Twitter responded to me that Intel was "pwning" AMD with this new CPU.
      In context, Intel is competing with AMD's offering that came to market a year ago.
      All that's going to happen is that AMD will likely jump a few % over what Intel is capable of with their next full release.
      And then Intel will do the same again a while later.
      And that's all fucking awesome.
      It's been since the Phenom X6 days that we've had this level of performance competition in the CPU market.
      Now let's hope price competition follows.
      CPUs have been easy enough to get compared to GPUs, so scarcity shouldn't keep their prices this ludicrously high.

    • @MrLince-hr4of
      @MrLince-hr4of 2 years ago +1

      I just bought my second 3700X today! I love it; still the best CPU for the price (240,-) and only 65W TDP.
      But 3 weeks' shipping time :/

    • @enlightendbel
      @enlightendbel 2 years ago +2

      @@WhiteBoyGamer Unfortunately, the vast majority of people do not give a shit about power draw.

    • @Hoootaf
      @Hoootaf 2 years ago

      The stock market told the truth today, lmao.

    • @MsRShady
      @MsRShady 2 years ago

      @@WhiteBoyGamer No, gaming efficiency is about the same: about 118 watts vs. 109 watts. What you see is all-core load, where ADL is still too inefficient compared to AMD.

  • @IamKryptonite
    @IamKryptonite 2 years ago +38

    I am really happy for Steve and the team. Steve seems very happy to be in his new crib!

    • @kageofkonoha
      @kageofkonoha 2 years ago +2

      While I am glad they have more room for their work, I like the old background better. The new one is too clean.

  • @benjaminoechsli1941
    @benjaminoechsli1941 2 years ago +38

    "Thanks, Steve!"
    Tech memes aside, this was an excellent video, especially considering y'all're in the middle of moving! Kudos to you and the team. _Thank you_ for continuing to keep the "common man" in mind by using Windows 10.
    New set looks great, and I look forward to seeing many more videos in it. Now back to you, Steve. I have some other videos to catch up on.

    • @zanderhenriksen6776
      @zanderhenriksen6776 2 years ago

      The new Windows 10 vs. Windows 11 debacle made me think of how it would perform on newer Linux kernels. Hopefully, Linus Torvalds & Co. have added functionality to the scheduler to work well with 12th-gen CPUs.
      Mainly it just made me curious, because A) I only use Linux; I just got used to it eventually, and now I feel like an outdated boomer when I try to use Windows. And because B) most games on Steam work flawlessly and perform better on Linux with my HW than on Windows (e.g. because of the aggressive caching and low memory/CPU/GPU overhead; this means any games previously run will often have their files already cached in memory, unless you overwrote them or restarted since then, but why would you ever want to restart a Linux OS?).

    • @staggerdagger
      @staggerdagger 2 years ago +1

      The fact you used impostrophres in yallre keeps me up at night for 8 months

  • @VROOOOOOOOOOOOOOOM
    @VROOOOOOOOOOOOOOOM 2 years ago +102

    Holy crud the power consumption is insane. They cranked that thing to 11.

    • @VraccasVII
      @VraccasVII 2 years ago +5

      Cries in 750W during Cinebench... I should really get rid of my 7960X.

    • @scarletspidernz
      @scarletspidernz 2 years ago +13

      Don't worry, Intel will top its own power draw with 300W desktop CPUs in 13th gen 👌
      Higher numbers are better, right?

    • @satelliteharassment
      @satelliteharassment 2 years ago +5

      Well, actually I think they cranked it to 12.

    • @GCAGATGAGTTAGCAAGA
      @GCAGATGAGTTAGCAAGA 2 years ago +2

      @@laurabrown5527 Now you ladies cranked MY power consumption up to 11, oof!

    • @eclipsegst9419
      @eclipsegst9419 2 years ago +1

      @@scarletspidernz 250-300W has been the practical limit for a long time; these CPUs are apparently just refined enough to ship close to the limit, so why wouldn't they? Non-K is for people worried about power.

  • @ilnerdchuck
    @ilnerdchuck 2 years ago +71

    Man, the 2600K is so powerful that it's not even in the charts. I'm going to keep it, man.

    • @arch1107
      @arch1107 2 years ago +1

      It has performance that's off the charts! Out of the competition entirely.

    • @dudebroguymate
      @dudebroguymate 2 years ago +2

      HA HA. That's the best comment I've read all year. Well done.

    • @moorhen6156
      @moorhen6156 2 years ago +8

      I heard that if you run Cinebench on a 2600K, the render is already complete when you open the program.

    • @ilnerdchuck
      @ilnerdchuck 2 years ago +4

      @@moorhen6156 That's why Intel doesn't like Cinebench as a benchmark; they want to keep the secret.

  • @jebo4jc
    @jebo4jc 2 years ago +210

    Competition is good for us as consumers. Glad to see it.
    Now, hopefully they can keep them in stock.

    • @GinsuChikara
      @GinsuChikara 2 years ago +7

      RUclips needs a laugh react.
      Their yields are going to be complete shit, and they have to live with the same shortages as everyone else.

    • @dontmatter4423
      @dontmatter4423 2 years ago +13

      Intel's using their own fabs, so hopefully there won't be a problem.

    • @dontmatter4423
      @dontmatter4423 2 years ago +15

      ​@@GinsuChikara Are you sure about that? They have their own fabs, they're not using TSMC fabs 'as everyone else'.

    • @jeder6915
      @jeder6915 2 years ago

      @@dontmatter4423 They are using TSMC for some of their stuff, though.

    • @thisconnectd
      @thisconnectd 2 years ago +14

      Idk, a slight win with twice the power isn't really that good.

  • @andersjjensen
    @andersjjensen 2 years ago +32

    This was underwhelming, to say the least. Up to double the power draw. The gaming "advantage" is meh. Some wins and some losses in productivity, but basically a tie on average. Intel's "Intel 7" is 10% denser than TSMC N7, and we're a year into Zen 3. DDR5 is expensive, and Intel boards are expensive, so the price argument is a toss-up at best. That it basically requires an AIO to maintain what seems like a factory overclock is flabbergasting...
    The whole thing just doesn't seem professional, if you ask me. Requiring an OS redesign to run right, and outright crashing in valid applications (Total War), is so counter to the backwards-compatible nature of x86. Needing a cooling solution that has a dangly end with neither ATX nor vendor-defined mounting dimensions. Pushing enthusiast-GPU levels of wattage at questionable temperatures just to call it a win.
    Comparatively, this seems more like a '76 Shelby GT500 than a production car. Sure, it's faster, but here are all the additional maintenance requirements, and sign here for the reduced warranty, because performance tuning kills parts faster. Or like the Thanos meme: "What did it cost?... Everything..."
    Pat G likes to say "leadership" a lot. This doesn't feel like it. This feels like desperation. Like when AMD released the Radeon HD 7990 at 375W...

  • @browerkyle
    @browerkyle 2 years ago +19

    It's not that I don't care about your methodology; I've been watching long enough to know that you've done your due diligence in your testing. Any mistakes that are uncovered, you're quick to correct. Love your work and love my new GN mousemat!

  • @kjamesjr
    @kjamesjr 2 years ago +280

    “When all else fails… MORE VOLTAGE!!!” - Intel

    • @33ordie
      @33ordie 2 years ago +18

      If you take the 5950X at the same wattage, then who wins? Ahaaaa!!! Anyhow, Ryzen 4 plz, so we can get it over with!

    • @RandomLucker
      @RandomLucker 2 years ago +1

      Seems to be far better on Win 11.

    • @xFODDERx
      @xFODDERx 2 years ago +12

      @@33ordie What I always love:
      "Just wait for Zen 2"
      "Just wait for Zen 3"
      Now:
      "Just wait for Zen 4"
      Seriously guys, when am I waiting till?
      Although we all know that by then the 13900K will be out and AMD fans will say,
      "Yeah, but Intel is on DDR5 gen 2, it's AMD's first," just like with ray tracing, DLSS, etc.

    • @33ordie
      @33ordie 2 years ago +10

      @@xFODDERx Buy whatever you want. I was mostly thinking of shareholders. If you want to make money, buy Intel stock for the time being, then at Zen 4, buy AMD stock :P

    • @inSainTed
      @inSainTed 2 years ago +2

      @@33ordie Por que no los dos? :D

  • @ewenchan1239
    @ewenchan1239 2 years ago +18

    I'd be interested to see how the Noctua tower coolers, both single- and dual-tower, would fare with a thermal workload like this.

    • @kokilius
      @kokilius 2 years ago +1

      Noctua can't handle the i9 at 270W... you will never get that on-paper performance with premium air cooling. You need custom water blocks.

    • @ewenchan1239
      @ewenchan1239 2 years ago +5

      @@kokilius
      But the max power on the 12900K is 241 W, so it doesn't matter that the Noctua can't handle 270 W because the 12900K doesn't run at 270 W unless you start overclocking it.

    • @deepsp_ce
      @deepsp_ce 2 years ago

      @@ewenchan1239 Is the Noctua NH-D15 enough for a 10900K, not overclocked?

    • @ewenchan1239
      @ewenchan1239 2 years ago

      @@deepsp_ce
      Just to confirm, you're asking about the 10900K instead of the 12900K, correct?
      And if that's the case, then yes, in my experience, absolutely.
      I'm using the NH-D15 right now with my 12900K and it works.
      (The temps aren't ever going to be as good as with a watercooled AIO, but it's also good enough that the CPU isn't thermal throttling. On the 12900K, towards the high end, if you have a higher ambient temperature, you're going to get closer and closer to the point where the CPU *WILL* start to thermal-throttle itself.)
      But for the 10900K, I don't foresee any reason why the NH-D15 would be an issue.

    • @deepsp_ce
      @deepsp_ce 2 years ago +1

      @@ewenchan1239 Yes, that's correct, the 10900K. Thank you!

  • @shaunlunney7551
    @shaunlunney7551 2 years ago +193

    Alderlake...Give me ALL the WATTS!

    • @liviubita4238
      @liviubita4238 2 years ago +13

      It will TAKE all the watts! 😄

    • @MisterFribble
      @MisterFribble 2 years ago +14

      Seriously. I've heard it called a Bulldozer that is actually fast, and the heat output definitely matches.

    • @rasmusolesen5307
      @rasmusolesen5307 2 years ago +10

      more like AllWattLake.....XD

    • @Hogdriva
      @Hogdriva 2 years ago +8

      Pentium 4 Version 2: Electric Boogaloo

    • @riannair7101
      @riannair7101 2 years ago

      No more watts for ya 🤣

  • @davidbrennan5
    @davidbrennan5 2 years ago +17

    That power draw and heat output give me NetBurst feelings.

    • @aceofhearts573
      @aceofhearts573 2 years ago +6

      RIP Pentium 4. It warmed many winter nights

    • @Marc-zi4vg
      @Marc-zi4vg 2 years ago +3

      Or Bulldozer.

    • @elcouz
      @elcouz 2 years ago

      Prescott was especially bad.

  • @JustinEmlay
    @JustinEmlay 2 years ago +112

    Steve - "Intel Did It"
    Also Steve - "Meh, it's a little better"

    • @sonichedgehog36
      @sonichedgehog36 2 years ago +21

      Also, TWICE the power draw.

    • @fix0the0spade
      @fix0the0spade 2 years ago +4

      This reminds me of AMD CPUs ten years ago: matching Intel performance, provided your fusion reactor could supply enough wattage.

    • @saricubra2867
      @saricubra2867 2 years ago +3

      @@sonichedgehog36 The 12900K is significantly faster in one thread vs. the 5950X when both are at 240 watts.
      Running 240 watts for the 12900K isn't really stock, due to the infinite turbo; 105 watts vs. 125 watts would be a fair comparison, or 240 vs. 240 watts.

    • @quirinaled2752
      @quirinaled2752 2 years ago +1

      Gah, I had one of those; the cooler sounded like a turbo kicking in every time I loaded a game, and it ran hot enough to warm the room mid-winter.

    • @GamersNexus
      @GamersNexus  2 years ago +17

      Your representation of our conclusion is pretty inaccurate, unless you're talking about the gaming results. The production increase is significant, which is exactly what we said.

  • @mirekhavelka6233
    @mirekhavelka6233 2 years ago +9

    A long time fan here Steve, really appreciate your honest and transparent approach, the internet needs reviewers like you. Disagree a bit about the tone of your review though, as you seemed just a bit too upbeat considering what Intel actually delivered:
    1) On average 10% or so improvement over AMD's top chips
    2) A chip that runs really hot and demands high-end cooling
    3) A chip that can consume up to double that of the key competitor's chip
    4) A chip that requires a costly platform and DDR5 to see it performing its best
    Let's consider a few things:
    Apple came out with a very powerful and extremely efficient chip that easily beats Intel's previous best. Intel claimed that they will be able to compete even with Apple in the foreseeable future and win their business back. Then with great irony they came out with a very power hungry chip in Alder Lake, worse even than their previous generation, obviously to look good against AMD in benchmarks, but also going in the wrong direction to what they actually need to be doing, which is performance per watt. It's not just about AMD, the most efficient architecture will take the spoils going forward, it almost always works that way(unless you are poor like AMD used to be and can't quite ramp up production to meet demand).
    Intel obviously has process issues and that can't be turned around overnight. They are pushing their silicon too hard to look relevant against their competition. They would have looked better if they had matched AMD's TDP numbers and I bet they would have only been 5-10% behind in most cases, but would have created a more useful product, maybe with a 4.5GHz max on the P cores.
    Alder Lake smells of P4 desperation. But Pat talks the geek talk, and is well respected so Intel seems to be getting the benefit of the doubt here, primarily because you guys are getting bored of repeating the same story.
    Let's look at it from the context of an alternate reality: Intel delivers a chip that is 10% faster at same TDP as AMD, the world(experts) would call that a mild improvement, but nothing too impressive. The situation now is much worse though so it's hard to see it as anything other than another desperate effort by Intel that comes up way too short. AMD will likely be at parity in performance with V cache at the same power as current Zen 3, and that will be the first step forward. But both AMD and Intel have a lot of work to do to compete at the sharp end. Intel's CPU designs are fine, but their process is still lacking, so they push the chips, and AMD has been given a golden opportunity to lead because of this. They need to capitalize to stay ahead and to catch Apple. Nothing but their most ambitious designs will do, because Intel will get their process sorted sooner or later, and that will transform their (well-architected) product.
    Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake: all great performance-leading chips that were super-easy to cool and had low TDP. Then the wheels fell off. Alder Lake does not change that in a meaningful way.

  • @nemesis851_
    @nemesis851_ 2 years ago +23

    “Good Value” in 2021, “does not compute”

    • @Cyber_Akuma
      @Cyber_Akuma 2 years ago +1

      Wrong, this is a CPU; computing is literally its core function. 😉

  • @taproot0619
    @taproot0619 2 years ago +31

    Just a note: all the reviewers seem to be using DDR5 RAM, which I understand. But I would love to see how AMD fares against Intel if they were running the same DDR4 memory.
    I'd like to know how much of that performance uplift is from the processor itself, and how much of it is from the RAM.
    Edit: Hardware Unboxed did their review using both platforms, so you can check them out if you are looking for this info as well.

    • @TheProducerTV
      @TheProducerTV 2 years ago +14

      This^ I feel like it's a little disingenuous to compare Alder Lake to Ryzen 5xxx or older Intel generations using platforms that don't really give a level playing field, especially when Alder Lake is capable of running DDR4

    • @pennywisethedancingclown7139
      @pennywisethedancingclown7139 2 года назад +1

      The processor is designed and optimized for use with DDR5 though? You want to chop its legs off and test it? That's like saying I want to test the 5950X with SMT disabled... The 12th gen was already hindered by running Windows 10 instead of Win 11. It seems to have still held up.

    • @taproot0619
      @taproot0619 2 года назад +9

      @@TheProducerTV I don't think it's disingenuous, just like I don't think using Resizable BAR on AMD when it's not on Intel would be disingenuous. I just think it's a gap in knowledge and data that I would like filled.
      Because if the 12900K is only 3% or 5% better than the 5800X when using the same memory, then it's possible, if not likely, that AMD's DDR5-compatible CPUs will be just as good if not a little better than Intel.
      But if the gap is still 10% or 12%, then that might be enough of a lead that AMD's next CPUs may not be able to leapfrog them.
      So it would help people decide if they want to buy Intel now, or AMD in a few months.

    • @sunagwa5046
      @sunagwa5046 2 года назад +11

      Hardware Unboxed tested with DDR4, check it out.

    • @taproot0619
      @taproot0619 2 года назад +1

      @@pennywisethedancingclown7139 I just think it's a gap in knowledge and data that I would like filled.
      Because if the 12900K is only 3% or 5% better than the 5800X when using the same memory, then it's possible, if not likely, that AMD's DDR5-compatible CPUs will be just as good if not a little better than Intel.
      But if the gap is still 10% or 12%, then that might be enough of a lead that AMD's next CPUs may not be able to leapfrog them.
      So it would help people decide if they want to buy Intel now, or AMD in a few months.
      I'm not saying they should have done DDR4 INSTEAD of DDR5. I'm saying they should have done it AS WELL AS DDR5.

  • @Weezedog
    @Weezedog 2 года назад +97

    Are you going to test DRM on older games? There were concerns that the big.LITTLE architecture would break DRM and it would need to be updated. This could be a deal-breaker for many people, especially since many older games are now unsupported.

    • @thespiritofyoink
      @thespiritofyoink 2 года назад +25

      That's an issue. So's the heat. So's the compatibility and performance with anything other than Windows 11.
      As an SFF enthusiast, I wouldn't touch this with a ten-foot pole, especially considering it's being compared to last-gen tech.

    • @DuBstep115
      @DuBstep115 2 года назад +5

      @@thespiritofyoink The 12600KF is cheaper and better than the 5600X. But what matters most is the 12400; paired with a 3060 Ti, it will rule the budget gaming PC world (95% of the market) till Zen 4.

    • @i_cri_evertim
      @i_cri_evertim 2 года назад +36

      @@DuBstep115 It's cheaper, but the motherboard and RAM will kill your wallet.

    • @techbuildspcs
      @techbuildspcs 2 года назад +21

      @@DuBstep115 and eats like 2 times the power

    • @Aereto
      @Aereto 2 года назад +9

      @@DuBstep115
      Again, mainstream games use DRM, and a lot of people still play games that haven't been updated even to Windows 10 spec.
      They are not going to update those DRMs to be compatible with 11, and that will put an end to Intel's performance potential by rendering a significant chunk of games in the Literally Unplayable category.

  • @phillipzan2005
    @phillipzan2005 2 года назад +81

    It went like this through the years: Ryzen was "A New Hope". Now, "The Intel Strikes Back". Will we get a "Return of the Ryzen"? Stay tuned for the next episode of CPU Z....

    • @PyromancerRift
      @PyromancerRift 2 года назад +11

      Imagine if AMD made a thinner die and cranked up the power usage like Intel. It would be Death Star levels of performance.

    • @dumbcow1
      @dumbcow1 2 года назад +5

      @@PyromancerRift I am excited for what the new socket will hold. I think AMD is going to hit back H A R D. AMD is in a position to do so nowadays.

    • @Eng586
      @Eng586 2 года назад +6

      I really hope AMD takes out Intel.

    • @phillipzan2005
      @phillipzan2005 2 года назад +8

      @@Eng586 Honestly, I have no loyalty to any brand, so whichever one is the best for me gets my money.

    • @lonniebeal6032
      @lonniebeal6032 2 года назад +7

      @@Eng586 We need them both for competition... Just like we need another GPU manufacturer, thanks to crypto mining.

  • @smakfu1375
    @smakfu1375 2 года назад +82

    It's great to see Intel back in the game, even if it'll be a while before I consider dumping my 5950X for something else. That said, I think we're all losing sight of the real issue here, which is exploding Gigabyte PSUs.

    • @andersjjensen
      @andersjjensen 2 года назад +21

      Well, with the power consumption of the 12900K we're bound to hear more about them...

    • @shempeym
      @shempeym 2 года назад +21

      Yep, that 240W CPU power draw is a huge turn-off. I live in a tropical country, so NO, just NO haha

    • @_lumpia
      @_lumpia 2 года назад

      Should be fun to see how AMD swings back with their next-gen CPUs

    • @shempeym
      @shempeym 2 года назад +1

      @Sut Nack You shouldn't be. CPUs aren't really that significant at higher res. Better a new GPU than a bunch of new stuff that will surely have teething problems. 240W max? I'll just wait for next gen.

    • @superblitz2274
      @superblitz2274 2 года назад +2

      People forget that that power draw only happens under high-stress scenarios, and that even the 12900K doesn't cross 100W during gaming.

  • @Phosphor66
    @Phosphor66 2 года назад +48

    My favorite part of a new CPU launch is the folks who panic-sell their "old" stuff. Gonna see if I can snipe a good deal on a 5800X.

    • @mirage8753
      @mirage8753 2 года назад +2

      ye, nice idea

    • @AgentOrange502
      @AgentOrange502 2 года назад +9

      I'm relatively new to PC building. I made my first build in January 2020 with a 2070 Super. When Nvidia announced the 30 series, I "panic sold" my 2070 Super for $400.
      Guess how much I'm regretting that today (:

    • @hannesjansevanrensburg6627
      @hannesjansevanrensburg6627 2 года назад +1

      I just want a spicy deal on a Ryzen 9 and a PCIe Gen 4 mobo; I think the Ryzen 7 2700X with PCIe Gen 3 is holding back my RTX 3070 Ti.

    • @Cyber_Akuma
      @Cyber_Akuma 2 года назад

      I unfortunately have a Z490 board, but it does support PCIe 4 if I have an 11th gen CPU. Maybe I can snag an 11700K for a good price due to this, to at least have Gen 4 GPU support and to enable the third (and only Gen 4) NVMe slot on my board.

    • @chinturajput5571
      @chinturajput5571 2 года назад

      @@AgentOrange502 I'm a Feb 2020 2070 Super owner from India and I did not sell my GPU; it's need vs. greed, and I chose need. It will pay off once I sell it 6 months before Lovelace launches and cope for those 6 months with a dummy GPU like a GT 710. But bro, can you tell me if my Corsair 750 Gold PSU will be able to fuel the 60 or 70 series of the RTX 40 line coming in the 3rd quarter of 2022?

  • @junko4166
    @junko4166 2 года назад +34

    The power draw is a massive turn-off for me. Energy is pretty dang expensive where I live.

  • @manu4ever249
    @manu4ever249 2 года назад +103

    I'm curious: when overclocking the 5900X or 5950X to a similar power draw, would Intel still win in those categories? Gaining 5-10% in some scenarios doesn't seem to justify double the power consumption.

    • @neiliewheeliebin
      @neiliewheeliebin 2 года назад +27

      I don't think either company's chips have a lot of overclocking headroom currently. The i9 uses too much power for my liking, but the i5 & i7 are pretty decent.

    • @EbonySeraphim
      @EbonySeraphim 2 года назад +17

      @@neiliewheeliebin AMD overclocking, especially with the 5950X, can be complicated but offers great improvement. The baseline you can get by simply turning on PBO and auto overclocking is amazing and is probably where anyone not trying to set records wants to be.
      The dumb way to do it is simply OC on all cores - which, when you have 16 of them, is probably not what you want. That's where you'll suck up a lot of power and generate the most heat. That's only worth it if what you're running really needs all cores running as fast as possible. For most games, you'll only see benefit when 1-2 cores run super fast, and that is why PBO is amazing: it'll do that for one core without making the entire chip run hot or suck up huge amounts of power. I'm pretty sure GN has a great video on it; you can take a look at the improvements over stock, import them into this data, and compare that way.
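      (Not from GN's data, just a quick way to watch that per-core boost behavior yourself. A minimal sketch, assuming Linux and the psutil package; per-core frequency reporting is unreliable on Windows.)
      ```python
      # Rough sketch: sample per-core load and clocks to watch a boost algorithm
      # like PBO push only the busy core(s) high while the rest stay low.
      # Assumes Linux + psutil; psutil.cpu_freq(percpu=True) may not return
      # per-core entries on every platform.
      import psutil

      for _ in range(10):
          loads = psutil.cpu_percent(percpu=True, interval=1.0)  # blocks for 1 s
          freqs = psutil.cpu_freq(percpu=True)                   # MHz per logical CPU
          busy = [(cpu, round(f.current)) for cpu, (load, f) in
                  enumerate(zip(loads, freqs)) if load > 50.0]
          print("busy cores (cpu, MHz):", busy or "none")
      ```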

    • @RobertSmith-lh6hg
      @RobertSmith-lh6hg 2 года назад +2

      A 5950X on chilled water (55°F water temps) with a tuned Curve Optimizer overclock picks up between 5 and 10% performance across gaming and is within 2% of static all-core overclock results. It tops the charts vs a stock 12900K; however, I haven't yet seen 12900K OC headroom, so when the 12900K is OC'd it may take back the top spot.
      The same 5950X on ambient-temp water (70°F) gains very similar gaming performance and single-core boost; however, it doesn't do all-core as well as when chilled, so the productivity results are lower than a static overclock by varying amounts. It becomes thermally limited, so results are in line with the thermal load put on the CPU.

    • @Stockfish1511
      @Stockfish1511 2 года назад +3

      I have a 5950X and use it for server purposes. It overclocks just fine, performs incredibly well, and is an absolute monster in multicore performance. I use a Kraken Z liquid cooler.

    • @saricubra2867
      @saricubra2867 2 года назад +4

      Intel has a massive IPC lead now; even a 5950X without power limits (4.7GHz) and drawing 240 watts or so can fall behind a 12900K by a big margin in single-core performance and in productivity workloads that benefit from Alder Lake's DDR5 and hugely faster CPU cores.
      AMD fanboys don't tell you the IPC that Alder Lake has or the clock speeds it reaches. The power consumption of a 3.8GHz 5950X (150 watts, per AnandTech), clocked more than 20% lower, can't be compared with a 12900K (using big.LITTLE) at a 4.7GHz all-P-core clock: the P-cores have more than 20% higher IPC than Zen 3 cores, and the E-cores have Comet Lake IPC at basically the 5950X's base clock.

  • @shukterhousejive
    @shukterhousejive 2 года назад +72

    I was wondering what magic trick Intel pulled to get those numbers, and then I saw the power charts. This is why I'm skeptical of two-tier core designs in desktop packages: you can't hedge your bets on only hitting the big cores in short bursts like a phone, and you can't tune the package to hell and back like the M-series chips. Should be a killer arch for laptops at least.
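    (For the curious: software can actually tell the two core tiers apart. A minimal sketch, assuming a recent Linux kernel that exposes Intel's hybrid PMU devices in sysfs; non-hybrid machines simply won't have these paths.)
    ```python
    # Minimal sketch: on Linux (~5.13+) with an Intel hybrid CPU, the kernel
    # registers separate PMU devices for the two core types. Their 'cpus'
    # files list which logical CPUs are P-cores vs E-cores. The paths assume
    # a recent kernel; on non-hybrid systems they won't exist.
    from pathlib import Path

    for pmu, label in (("cpu_core", "P-cores"), ("cpu_atom", "E-cores")):
        cpus_file = Path(f"/sys/devices/{pmu}/cpus")
        listing = cpus_file.read_text().strip() if cpus_file.exists() else "not present"
        print(f"{label}: logical CPUs {listing}")
    ```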

    • @RandomLucker
      @RandomLucker 2 года назад +2

      I'm just wondering why Igor's Lab got such a low energy draw. Maybe because of Win11?

    • @MichaelPohoreski
      @MichaelPohoreski 2 года назад +1

      Don't forget Intel is also using DDR5. I'm curious to know how much of a factor that is.

    • @zodwraith5745
      @zodwraith5745 2 года назад

      This has been all over the board. When ADL is allowed to run to the teeth, it draws major power, just like the 5950X does when using PBO. That's why power usage varies wildly.

    • @system137
      @system137 2 года назад +1

      *power chart. GN just showed Blender. Looking at a range of applications, you will see that big.LITTLE is much more efficient for everything that does not push the CPU to the absolute limit (such as gaming). Intel's marketing somehow managed to bury that fact, which is hilarious.
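      (If you want to check that across your own workloads instead of trusting one Blender chart: a minimal sketch, assuming Linux with Intel's RAPL powercap counter exposed; the exact path varies per system and reading it usually needs root.)
      ```python
      # Minimal sketch: average CPU package power over a sampling window using
      # Linux's RAPL powercap counter (in microjoules). The path is an
      # assumption; it may be intel-rapl:0 or similar on your system.
      # Start your workload (game, Blender, ...) and run this alongside it.
      import time

      ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

      def read_uj() -> int:
          with open(ENERGY) as f:
              return int(f.read())

      e0, t0 = read_uj(), time.monotonic()
      time.sleep(10)  # sampling window; counter wraparound is ignorable over 10 s
      e1, t1 = read_uj(), time.monotonic()
      print(f"avg package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")
      ```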

    • @superblitz2274
      @superblitz2274 2 года назад +1

      During gaming the 12900k doesn't cross 100W

  • @Baoran
    @Baoran 2 года назад +21

    I would have liked to see Flight Simulator results when testing CPUs.

  • @evdb9255
    @evdb9255 2 года назад +8

    In this day and age, do we want processors with that power consumption?

    • @arch1107
      @arch1107 2 года назад +1

      I don't even want GPUs that use more than 100 watts; this thing overclocked uses 400 watts...

    • @evdb9255
      @evdb9255 2 года назад

      @@arch1107 I really don't know why Steve doesn't make a point of this. This should be criterion #1.

    • @bltzcstrnx
      @bltzcstrnx 2 года назад +1

      No, especially after Apple M1 launched.

  • @ColdRunnerGWN
    @ColdRunnerGWN 2 года назад +59

    It's great to see Intel back in the game. Unfortunately, this GPU situation means that upgrading for gaming is not worth it for a lot of people.

    • @samiam9059
      @samiam9059 2 года назад +4

      AMD is the better balance financially. The numbers are not that big of a breakthrough (imho).

    • @cardbored_
      @cardbored_ 2 года назад +15

      @@samiam9059 Huh? The 5950x is more expensive than the 12900k

    • @samiam9059
      @samiam9059 2 года назад

      @@cardbored_ I have the 5900x not the 5950X

    • @Nosuchthingasnormalhere
      @Nosuchthingasnormalhere 2 года назад +7

      @@samiam9059 Unfortunately, AMD is more unreliable. The compatibility issues with Windows have been an outcry. The sad part is most won't notice, as they think it's just Windows, but when you manage thousands at scale and see similar occurrences across them, you get the picture that something is at play. Maybe Intel is in bed with Microsoft to make sure their stuff works better than the competition; we don't really know. I don't like Intel, but we stopped supplying AMD for businesses. BTW, that's using OEM machines from Dell and HPE, not your gaming builds, or as we know them, "white label". That's not saying we have not tested building our own, from which we came to the same conclusion.

    • @samiam9059
      @samiam9059 2 года назад +3

      @@Nosuchthingasnormalhere Actually, AMD is more reliable; I have had no compatibility problems at all with my AMD CPUs, and I have loaded Win10 and all the latest tech and it works great. Intel, on the other hand, is sadly stuck in old-school ways of being faster with inefficient power management and more heat when you don't want it, and they actually said they do not support overclocking and you do it at your own risk... AMD's overclocking software works excellently on my 5900X (imo). Not to mention the older 4700G is amazing for how inexpensive it was, if you could get one, as they were OEM-only to manufacturers. My two i9s have not been as smooth....

  • @truckwrecker6822
    @truckwrecker6822 2 года назад +118

    It would be an "exciting time" if I could upgrade my GTX 1070 at a reasonable price!

    • @AlphaHorst
      @AlphaHorst 2 года назад +2

      Well, you are lucky if you want to upgrade just to a 3070. Those can be had close to MSRP... I sit on my old 1080 and want a 3080... I have to pay at least 100% more, sometimes even 150%...
      The 3070 will probably be back to stable pricing within the first quarter of next year.

    • @hasso0n
      @hasso0n 2 года назад +34

      @@AlphaHorst where is a 3070 selling for near MSRP?!

    • @AlphaHorst
      @AlphaHorst 2 года назад +1

      @@hasso0n Well, in Germany you can find it sometimes for around €750 (partner models), which, depending on the model, is close to the avg MSRP of €650.
      But you gotta be fast. Don't know about other countries, but the US and Canada usually have better pricing compared to Germany.

    • @yobi765
      @yobi765 2 года назад +1

      @@hasso0n I went to Micro Center and got a ROG Strix 3070 for $780ish, I don't remember exactly, but current retail prices. Again, like everything else, you gotta be quick.

    • @DuBstep115
      @DuBstep115 2 года назад +1

      @@hasso0n 1 year ago, why are people buying tech launched in 2020 when it's almost 2022

  • @ItsJustVV
    @ItsJustVV 2 года назад +12

    No, AMD is not in trouble at all. Here's what I said on HUB's review:
    "I'm really, really NOT impressed at all by the 12900k... Is this the best Intel has now? +7% better in games vs Zen 3 at 1080p?
    Alder Lake comes one year later, it's much more expensive (the entire platform: CPU + big water cooler, expensive MB and DDR5), consumes more power even though it has 8 big + 8 little cores vs Zen 3's 16 big cores (so that's laughable), it's hotter, it needs Win11 and DDR5 (though it malfunctions sometimes), and this is the result? Yeah, not impressed...
    I said when those leaked +50% "DEMOLISHES Zen 3" clickbait titles were everywhere that synthetic benchmarks are nothing; we needed trusted reviews of real-world scenarios and games, and this is the end result. I'm glad competition is back, because prices should get lower for Zen 3, which is good, but the best Alder Lake is a mixed bag and really not impressive.
    Zen 3D can't come soon enough; it will have no problem achieving at least parity in performance while also being a cheaper platform and more efficient.
    Now imagine Zen 4 with DDR5 when it's also matured... that one will also come before Raptor Lake."

    • @tycanuck
      @tycanuck 2 года назад

      Who are you exactly? Thanks for your opinion.

    • @592Johno
      @592Johno 2 года назад

      Cope.

    • @seeibe
      @seeibe 2 года назад +1

      If you were following the leaks, Alder Lake has turned out pretty much as expected. We have known for well over a year that Intel won't be blowing AMD out of the water anytime soon. What Intel needed was to not get completely left behind with Alder Lake, and they managed that, with a competitive offer, as Steve correctly points out. The company has a promising new direction under Pat Gelsinger, but it won't be until 2024 or even later that his changes in direction to the CPU department manifest in a product Intel can bring to market. Until then, the best they can do is not completely lose face by at least trading blows with AMD, and with Alder Lake they achieved this for now.

    • @ItsJustVV
      @ItsJustVV 2 года назад +1

      @@seeibe I agree, yet just look at some of these clickbait titles or shitty reviews (not this one) - JayzTwoCents paired the 12900K only vs the 5900X and only showed cherry-picked scenarios to make Intel look like they obliterated AMD...
      And so many silly people like the one above your post actually believe the shilling...

    • @koto483
      @koto483 2 года назад

      Benchmarks and other heavy CPU applications push it to its max wattage, though in gaming it's still very competitive with Ryzen.

  • @zackmatey1793
    @zackmatey1793 2 года назад +31

    I'd be interested to see a DDR4 test of Alder Lake. It seems like Alder Lake's performance is mostly due to power use and DDR5

    • @noahokeyo
      @noahokeyo 2 года назад +6

      Exactly, AMD still has the superior chips.

    • @supersanic3446
      @supersanic3446 2 года назад +4

      @@noahokeyo You keep saying that to yourself.

    • @resensegaming
      @resensegaming 2 года назад +7

      @@supersanic3446 I think if AMD increased the power usage of a 5950X to 250W from 120W they would slam, but I bet they care about more than just raw power, and they have lots of room to grow, unlike Intel at this point.

    • @Finder245
      @Finder245 2 года назад +5

      @@noahokeyo That depends entirely on the criteria you use. Some of us have inherently single-threaded workloads where paying double for a few percent better performance can be worth it.

    • @MunyuShizumi
      @MunyuShizumi 2 года назад +1

      @@resensegaming It's not even a hypothetical scenario - one can already find the 5950X outperforming the 12900K at similar wattages with a 5-minute cruise control overclock applied. Of course, one can't find a detailed comparison over a dozen benchmarks yet, but it's certainly not difficult to find better results achieved with comparable power draw & cooling.

  • @alaskanmalamute101
    @alaskanmalamute101 2 года назад +3

    lol, all was looking good for Intel until they showed power consumption... using more than double that of AMD's 5950X

  • @CaptnYestrday
    @CaptnYestrday 2 года назад +47

    Didn't realize how much I missed these main reviews by Steve. Been tooling around watching some other channels and coming back here for this.... Wow.
    Nobody does this like him. There are some decent channels out there but this one is the whole package.

    • @Rickcrazy100
      @Rickcrazy100 2 года назад +1

      Agreed. One channel that I do recommend is Jay… Jay is funny and super detailed, but is also straightforward and usually summarizes the vids… Also, I'm a big fan of watercooling, the main reason why I subbed to Jay.

    • @CaptnYestrday
      @CaptnYestrday 2 года назад

      @@Rickcrazy100 Wondered if someone would mention Jay. I also like Jay and his humor is great. Maybe it just matches my sense of humor, but his info is good too.

    • @Captain_Yesterday
      @Captain_Yesterday 2 года назад +2

      Got a bit of a shock, I saw this and thought "I don't remember leaving a comment!?!" 😂

    • @CaptnYestrday
      @CaptnYestrday 2 года назад +1

      @@Captain_Yesterday You just did the same to me! My brain just folded in half there for a little while. Well played.
      Also, you have excellent taste.

  • @solidstatealchemy9080
    @solidstatealchemy9080 2 года назад +58

    I can't be the only one watching the gaming bar graphs thinking, "The R5 5600X is a price/performance and power consumption/performance beast!"

    • @ivoivanov7407
      @ivoivanov7407 2 года назад +7

      Well, let's see what the 12600 and 12700 will bring to the table. And where AMD's prices will go /crossing fingers/.

    • @Jamesaroni
      @Jamesaroni 2 года назад +11

      @@ivoivanov7407 From what I've seen, the 12600 is doing damn good for the price, usually killing the 5600X, but of course this is with DDR5.

    • @ivoivanov7407
      @ivoivanov7407 2 года назад +3

      @@Jamesaroni Yeah, I've just watched LTT's video, and the 12600K is a beast! AMD will have to finally deliver something in the low-to-middle segment.

    • @Jamesaroni
      @Jamesaroni 2 года назад +4

      @@ivoivanov7407 I'd like to see a DDR4 comparison of the two to see the "true" value; DDR5 is a legit advantage over AMD, but I think Ryzen might still be my preference until DDR5 and Windows 11 are more streamlined. Either way, this is great for the market; both companies will have to put up for sure.

    • @waking00one
      @waking00one 2 года назад +1

      @@ivoivanov7407 "Finally"? What fucking world do you people live on, exactly?

  • @eigos
    @eigos 2 года назад +13

    TL;DR - It consumes a lot of power and provides single-digit performance improvement over AMD's best chips. Price is "up there", so is that worth it...?

    • @jayjaytp10
      @jayjaytp10 2 года назад +1

      you sound like a fanboy lol

    • @monke2361
      @monke2361 2 года назад

      no, 12700K is worth it

  • @HD_Heresy
    @HD_Heresy 2 года назад +5

    As an 11900K owner, these videos are really starting to feel like self-flagellation hahahaha

    • @BigJohnson911
      @BigJohnson911 2 года назад +1

      I don't understand the extreme hatred for 11900K. Makes me feel really shit about myself for buying it.

    • @HD_Heresy
      @HD_Heresy 2 года назад +1

      @@BigJohnson911 Yeah, it's not a great feeling. I understand why they go in so hard on it, but in a vacuum, regardless of price-to-performance or efficiency, it's still an awesome, powerful CPU!

  • @MunyuShizumi
    @MunyuShizumi 2 года назад +16

    Alder Lake: beats AMD (by 5%) with +100% power consumption
    Raptor Lake: beats AMD with included stock LN2 cooler
    Meteor Lake: beats AMD with Mr. Fusion PSU (banana peel not included)

    • @scarletspidernz
      @scarletspidernz 2 года назад +3

      Alder Lake: But you can earn money while cooking eggs on our CPUs
      Raptor Lake: But you can earn money by baking bread on our CPUs
      Meteor Lake: But you can earn money by baking pizzas on our CPUs

  • @pumpkinLive
    @pumpkinLive 2 года назад +39

    Steve looks a little bit like he was greenscreened in this new set.
    Congrats on the move guys

    • @20cent
      @20cent 2 года назад +1

      He definitely is

    • @rohesilmnelohe
      @rohesilmnelohe 2 года назад +4

      Pfff.. it's CGI Steve...
      (Rendered on an AMD R9 5950X)

    • @GamersNexus
      @GamersNexus  2 года назад +8

      @@20cent nope, it's a real set! Haha

    • @jonathanellis6097
      @jonathanellis6097 2 года назад

      Yeah, it does; he looks disconnected from his surroundings, and the background looks fake for some reason? Even though he says it's not a green screen, something just looks off. Great review as always though.

    • @Cosmstack
      @Cosmstack 2 года назад

      Yea I kinda felt the same. I wonder why

  • @TomBaito
    @TomBaito 2 года назад +81

    I was almost sold by how Alder Lake was leading most of the charts, until I saw the power consumption... 😂
    But still, this is great, because thank God 14nm++ is finally dead and we have competition again!

    • @ja-kidnb6416
      @ja-kidnb6416 2 года назад +11

      exactly... the whole video I was like "yes, finally competition, nice!" and then came the power draw..... damn, that's a hard no for me

    • @mryellow6918
      @mryellow6918 2 года назад +2

      And it's using DDR5

    • @Naddoddur
      @Naddoddur 2 года назад +2

      The power consumption is almost the same in gaming.

    • @thathologuy8299
      @thathologuy8299 2 года назад

      @@ja-kidnb6416 wait until you learn about the 12600k 😳😳😳

    • @xlr555usa
      @xlr555usa 2 года назад

      The Nvidia 4000 series will be power hogs also; time to look at 1200 or even 1600 watt power supplies for safe headroom. I'm praying that the climate weirdos don't try to ban these new PCs for not being power efficient. The gloves will come off then.

  • @teknoman117
    @teknoman117 2 года назад +15

    I'm just curious to see if there are any high E-core count Xeon SKUs in the future. If they can fit 4 E-cores into the space of 1 P-core, and there are 10 tiles on the 12900K, I want to see a 2 P-core / 32 E-core Xeon-D for a mega-NAS.

    • @Austin1990
      @Austin1990 2 года назад +1

      We do not know what the E-cores sacrifice yet. My guess is that they mostly give up cache.

    • @angelaizen2231
      @angelaizen2231 2 года назад +1

      @@Austin1990 and they give up hyperthreading

  • @LoneRiderz
    @LoneRiderz 2 года назад +43

    With all that hype, I kind of expected more performance and lower power usage.

    • @BasedAstraea
      @BasedAstraea 2 года назад +3

      There is, in Windows 11.

    • @jakejakedowntwo6613
      @jakejakedowntwo6613 2 года назад +5

      The TDP was expected to be insane anyway. Without the scheduler, it isn't fully optimized.

    • @LoneRiderz
      @LoneRiderz 2 года назад +1

      @@jakejakedowntwo6613 There was a process shrink and DDR5, so I guess I had my hopes high. I'm currently on B550 / Ryzen 3700X. I'll probably do a CPU upgrade next year to the 5800X. No better alternatives for someone who prefers small, powerful builds.

    • @hauptmanngilbertoduber9953
      @hauptmanngilbertoduber9953 2 года назад +10

      This is what you can expect from Intel.
      They did their early hype benchmarks on Windows 11, where AMD didn't have its performance patch yet, so it seemed like they had really done something.

    • @arch1107
      @arch1107 2 года назад

      lol, it's Intel: their previous CPU used 300 watts when overclocked; this one uses 400 watts when overclocked, and only 350 watts on normal parameters.
      Intel doesn't know what low power usage is.

  • @PotatInside
    @PotatInside 2 года назад +25

    *Did they though?* Looks like more of a *Kinda Did It…* 24:17 - two times the power consumption of a Ryzen 5950X at stock.

    • @WhenDoesTheVideoActuallyStart
      @WhenDoesTheVideoActuallyStart 2 года назад +2

      And as others have said, Intel's cores are not so much big.LITTLE, and more akin to big cores (without HT) paired with BIGGER cores (with HT).

    • @Real28
      @Real28 2 года назад +2

      I still can't figure out why people care about power. I've upgraded to a 10850K that's OC'd to 5.1GHz, which draws a lot more power on paper, and even after working 2 jobs all summer on my PC plus gaming, my electric bill went up... $10.
      Unless you're installing this into a dozen+ machines that are running at max all day, this seems very irrelevant; not once in my 25 years of building PCs and gaming/video/photo production have I cared about power draw.
      It's a tool. I need it to do a job, not draw less power.

    • @WhenDoesTheVideoActuallyStart
      @WhenDoesTheVideoActuallyStart 2 года назад +1

      Combined with the fact that it's still a monolithic design, all in all it means they're making way fewer CPUs than AMD from the same number of silicon wafers.
      It turns out Pat Gelsinger's "ingenious play" is... making very small margins off their CPUs. That is the move of a company desperately trying to retain market share and use their multi-billion-dollar foundries to fab *something* before they're rendered obsolete by TSMC.
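      (Rough napkin math on the wafer point, using the standard dies-per-wafer approximation. The die sizes are public estimates, i.e. assumptions: ~215 mm² for the 8+8 Alder Lake die vs ~81 mm² for a Zen 3 CCD; yield and AMD's separate IO die are ignored entirely.)
      ```python
      # Back-of-envelope dies-per-wafer comparison. Die sizes are public
      # estimates (assumptions), and yield differences are ignored.
      from math import pi, sqrt

      def dies_per_wafer(die_mm2: float, wafer_mm: float = 300.0) -> int:
          radius = wafer_mm / 2
          # Common approximation: gross dies minus the loss along the wafer edge.
          return int(pi * radius**2 / die_mm2 - pi * wafer_mm / sqrt(2 * die_mm2))

      print("Alder Lake 8+8 (~215 mm2):", dies_per_wafer(215))  # ~283 candidates
      print("Zen 3 CCD (~81 mm2):", dies_per_wafer(81))         # ~798 candidates
      ```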

    • @WhenDoesTheVideoActuallyStart
      @WhenDoesTheVideoActuallyStart 2 года назад +2

      @@Real28 Power = heat = lower lifetime/higher failure rate of components, less overclocking room for GPU and RAM, hidden costs in water cooling/extra fans and bigger CPU coolers, etc...

    • @phenom957
      @phenom957 2 года назад +1

      @@Real28 Not so much the electrical bill but the heat output.

  • @vg4902
    @vg4902 2 года назад +14

    Congrats on the new studio. Looks clean!

  • @nicholash8021
    @nicholash8021 2 года назад +7

    "The human eye can't see more than 24fps"... Just try reading text while dragging a window at 24fps. Even at 120Hz it's still not great (good, but not great).

    • @turbo11
      @turbo11 2 года назад +1

      That's what the iSheep said, before Apple gave them 120Hz on the iPhone 13.

    • @miguelzavaleta1911
      @miguelzavaleta1911 2 года назад +3

      Same. After I switched to 240Hz displays, scrolling on any other screen feels painful.

    • @Austin1990
      @Austin1990 2 года назад +1

      It is a joke, talking trash on a narrative (from over 10 years ago) that 24 FPS was all you need.

    • @nicholash8021
      @nicholash8021 2 года назад

      @@Austin1990 Right. I also once heard that 15 to 24 FPS "leaves more to the imagination" LOL. Now, I'll be the first to admit there is something nostalgic about a movie shot at 24FPS and actually played back at 24FPS. It somehow removes the feeling of "presence of camera", but I'll take my 165Hz+ monitor for my daily work, thank you.

  • @tommybronze3451
    @tommybronze3451 2 года назад +47

    This "strong cooperation" between Intel and Microsoft making sure the new arch works properly makes me wonder how much future AMD chip performance will suffer (possibly from the "beefy under-the-table brown envelope from Intel" syndrome - like we've seen countless times so far).
    PS. Thanks for the content.

    • @Marrrrtin
      @Marrrrtin 2 года назад +1

      It's very possible that AMD will also move to a big.LITTLE CPU design, so then they will benefit from Microsoft's development on Windows 11 as well.

    • @tommybronze3451
      @tommybronze3451 2 года назад +11

      @@Marrrrtin I don’t think you appreciate that alto up until today both intel and adm chips were both amd64 arch - they required different scheduler operations in kernel low level. So thou big-little high level might bring something in the future for amd, it’s not hard to see how intel could influence m$ to mess it up for amd specially in low level closed source kernel code. And by time amd finds it and m$ declares “whoops, this wasn’t intentional” - damage in reviews and the general public perception will already be done.

    • @NeverDoubtMe23
      @NeverDoubtMe23 2 года назад +3

      Switch to Linux, done.

    • @shukterhousejive
      @shukterhousejive 2 года назад +3

      We're going back to the DirectX 1-8 days where the sales rep that buys the dev team the most drinks wins all the shiny new features

    • @TheEtueify
      @TheEtueify 2 года назад +1

      @@Marrrrtin AMD said that at least Ryzen 6000, and possibly (not planned) 7000, will not be big.LITTLE.

  • @zackmatey1793
    @zackmatey1793 2 года назад +8

    Is it wrong that I'm more excited about the new GN office than Alder Lake?

  • @Paxmax
    @Paxmax 2 года назад +14

    The inefficiency just makes the performance subpar. Not impressed, Intel.

  • @deusvult6695
    @deusvult6695 2 года назад +1

    I have been a long-time fan of your videos but I don't often comment. I just wanted to say that I really appreciate the sheer number of figures that you guys manage to cram into your videos without making them boring or monotonous. I love the volume of statistics I receive and the confidence that gives me when selecting the next part for my machine. As always, your work is appreciated; keep it up.
    PS: I love the coasters, they are dope and I use them every day for my morning coffee XD

  • @anasevi9456
    @anasevi9456 2 года назад +19

    Everyone else: _Wow! Intel reestablished the status quo it lost only a year ago!_
    Me: _Wow! Intel finally has 10nm working well enough for a performance desktop part!_

    • @NeverDoubtMe23
      @NeverDoubtMe23 2 года назад +1

      Does anyone really care about the size? I mean, power consumption on a desktop doesn't mean much to me, but I am glad to see Intel beat AMD. Better prices and faster chips for us.

    • @thelegendaryklobb2879
      @thelegendaryklobb2879 2 года назад +5

      @@NeverDoubtMe23 When it's actually double the power consumption for at most 15% more performance, it does matter

    • @josebr0
      @josebr0 2 года назад

      It's not 10nm, it's Intel 7! 😂

    • @BenderBendingRodriguezOFFICIAL
      @BenderBendingRodriguezOFFICIAL 2 года назад

      So you're admitting that you're circlejerking the nanometer meme?
      I couldn't care less if they were on 36nm, as long as their chips perform well... which they always do.
      Nobody who's buying these cares about the nanometer process, mate, literally nobody. A small handful of people in a community drawn together by a single video isn't going to change that. People care about power consumption more than they do the size of the wafer, and yeah, I know the smaller the process, the more transistors, and in turn more performance, but there is more than one way to skin a cat, and Intel knows core efficiency is better than slapping 90+ cores on a die and calling it a day.
      I build computers for people regularly; none of them have ever bothered asking. They just want something fast that works.
      The semantics behind it are irrelevant in day-to-day interactions.

  • @dyewts
    @dyewts 2 года назад +10

    I'm a bit surprised at this; looking at the gains vs power consumption, Alder Lake looks pretty terrible to me.

    • @arch1107
      @arch1107 2 года назад +3

      It doesn't just look terrible to you;
      it is terrible. This is a desktop CPU glued to a laptop CPU, but since it was made by Intel, well, it looks like this.

  • @benbarclay3872
    @benbarclay3872 2 года назад +39

    The question still exists: Why did Intel even release the 11900K?

    • @arctifire5709
      @arctifire5709 2 года назад +9

      OEMs

    • @benbarclay3872
      @benbarclay3872 2 года назад

      @@arctifire5709 Good point.

    • @Code-n-Flame
      @Code-n-Flame 2 года назад +5

      To keep shareholders happy.

    • @gorjy9610
      @gorjy9610 2 года назад +1

      When you look at the 12600K results, the same question goes for the 12900K.

    • @msnehamukherjee
      @msnehamukherjee 2 года назад +1

      Same reason AMD released those Bulldozer and Piledriver SKUs.

  • @troygilbert1112
    @troygilbert1112 2 года назад +5

    Been here since 200K; this channel has exploded and they deserve every bit of it. Love the dedication and the work put in every day.

  • @HaakonHaug
    @HaakonHaug 2 года назад +56

    So buying Intel now requires water-cooling? I'd like to see performance on air if it starts to throttle.

    • @mel-ju7kp
      @mel-ju7kp 2 года назад +7

      No reason not to have an AIO already; they give far better value than similarly priced air coolers, and leaks are non-existent compared to when they first released.

    • @Spido68_the_spectator
      @Spido68_the_spectator 2 года назад +2

      Unless you plan to push the chip hard, there's no need to buy an OP cooler.

    • @superneenjaa718
      @superneenjaa718 2 года назад +16

      LTT ran it with a Noctua NH-D15 and their temps were reaching 100°C. So you can guess: any air cooler will have trouble keeping it cool under full load.

    • @nfugitt89
      @nfugitt89 2 года назад +8

      @@mel-ju7kp they’re louder and take up more room

    • @Conrad75
      @Conrad75 2 года назад +2

      Buying the 12900K requires water cooling, you mean… the 12600K does perfectly fine in all applications.

  • @gypsy6211
    @gypsy6211 2 года назад +11

    Kudos to Intel for a meaningful bump. But I get the feeling that when Zen 4 hits with DDR5, it'll be handing Intel a box of Kleenex, again.

    • @Austin1990
      @Austin1990 2 года назад +2

      Indeed. Alder Lake really excites me in some ways but disappoints me in others. Too much of the performance is from increased power consumption. Even Zen3D may take back at least the gaming crown.

    • @gypsy6211
      @gypsy6211 2 года назад +2

      @@Austin1990 Yes, I'd not be surprised by that. Honestly, I feel the 12900K itself was a letdown; compared to the ground gained by the 12600K/12700K, the 12900K seems a lackluster advancement. It needed to be beating the 5950X by at least 20%+ across the board, and it isn't.

    • @AwankO
      @AwankO 2 года назад

      It appears that a lot of the performance boost is from increased power draw and from the native performance boost of DDR5 itself.

    • @Austin1990
      @Austin1990 2 года назад

      @@AwankO
      I have not seen DDR4 vs DDR5 benchmarks yet.

    • @saricubra2867
      @saricubra2867 2 года назад

      @@gypsy6211 Single-core alone is +25% in Cinebench R23.

  • @PoRRasturvaT
    @PoRRasturvaT 2 года назад +6

    These 0.1% lows make me think of situations where one thread is capped at 100% and is holding back everything else. I wonder if disabling the efficiency cores might be beneficial while having no loss in average performance. That might be a test worth doing (a cheap way to approximate it is sketched below). The Win10 scheduler ain't getting any better, since they want to push Win11 (which currently has fewer users than 7).
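    (A minimal sketch, assuming the psutil package and the commonly reported 12900K enumeration where logical CPUs 0-15 are the P-cores with HT and 16-23 the E-cores; verify your own topology first.)
    ```python
    # Sketch: pin an already-running process (e.g. a game) to the P-cores only,
    # approximating "E-cores disabled" for an A/B test of 0.1% lows.
    # Assumes psutil and the usual 12900K layout: logical CPUs 0-15 are the
    # 8 P-cores with Hyper-Threading, 16-23 the 8 E-cores (an assumption,
    # check your own topology first).
    import sys
    import psutil

    P_CORES = list(range(16))  # assumed P-core logical CPU ids

    def pin_to_p_cores(pid: int) -> None:
        proc = psutil.Process(pid)
        proc.cpu_affinity(P_CORES)  # supported on Windows and Linux
        print(f"pinned {proc.name()} (pid {pid}) to CPUs {P_CORES}")

    if __name__ == "__main__":
        pin_to_p_cores(int(sys.argv[1]))  # usage: python pin.py <pid>
    ```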

  • @kennethkyledeluna5068
    @kennethkyledeluna5068 2 года назад +34

    "You may have won the fight, but not the war."
    -AMD, probably.

    • @UnrealOG137
      @UnrealOG137 2 года назад +1

      Abs-freaking-lutely. More competition is always a good thing.

    • @shstan96
      @shstan96 2 года назад +2

      I don't know what AMD really lost. 12900K is literally next-gen compared to 11th and 10th gens Vermeer was competing against. When the 7000 series arrives, we will know.

    • @kennethkyledeluna5068
      @kennethkyledeluna5068 2 года назад +1

      @@shstan96 AMD has the power and Intel has speed. Intel may have been quick, but power will catch up to them soon.

    • @kennethkyledeluna5068
      @kennethkyledeluna5068 2 года назад

      @@shstan96 Oh and also RIP Warhol.

    • @jeffthedream
      @jeffthedream 2 года назад +4

      @@shstan96 Agreed. The 5000 series came out before 11th gen. It has taken Intel 2 generations and a new platform to finally pass the 5000 series.

  • @adblockturnedoff4515
    @adblockturnedoff4515 2 года назад +61

    Would love to see those tests with DDR4 memory for the 12900K instead, for an apples-to-apples comparison with AMD.

    • @brownshit1
      @brownshit1 2 года назад +8

      Hardware Unboxed tested DDR4 and 5. In most cases it was fairly similar.

    • @damara2268
      @damara2268 2 года назад +4

      Then go watch the review from Hardware Unboxed; they compared DDR4-3200 CL14 vs ultra-expensive DDR5-6000 CL36.

    • @resColts
      @resColts 2 года назад +1

      DDR4 has the advantage: 3200 CL14 beats 6000 CL36. Not surprised. Would like to see comparisons with faster DDR4 to check whether the difference increases significantly. EDIT: there are a couple of rare cases where DDR5 has a good gain over DDR4; Watch Dogs: Legion is one.

    • @obsidian....
      @obsidian.... 2 года назад +5

      100%. This testing methodology is way off from what GN normally claims to do.

    • @dstblj5222
      @dstblj5222 2 года назад +1

      @@obsidian.... Not really; AMD doesn't support it, so it's a fair-play advantage.

  • @rickysargulesh1053
    @rickysargulesh1053 2 года назад +6

    Intel taking the lead again came quicker than expected. They weren't overselling it. Let's see how the power consumption compares to performance. Edit: now that I've gotten to the efficiency part, it's what I expected.

  • @MrMebigfatguy
    @MrMebigfatguy 2 года назад +268

    Luckily for AMD, Intel has taught us that benchmarks don't matter.

    • @joeykeilholz925
      @joeykeilholz925 2 года назад +25

      uno reverse card

    • @gureguru4694
      @gureguru4694 2 года назад +14

      AMD fanboys now using the loser card lmao

    • @dra6o0n
      @dra6o0n 2 года назад +32

      @@blitzwing1 It's funny: Intel fans praise it when they use more power to get performance, but attack AMD for doing the same thing.

    • @warmage247
      @warmage247 2 года назад +5

      @@dra6o0n It's almost like people (Intel fans) just go for whatever solution seems best so they can brag about something without actually understanding or knowing why... Pathetic.

    • @walzine316
      @walzine316 2 года назад

      they do lol

  • @SergeiSugaroverdoseShuykov
    @SergeiSugaroverdoseShuykov 2 года назад +8

    Kinda weird conclusion about value, since it's not clear how much you'll need to pay to actually get a working platform (with competitive performance, e.g. a DDR5 memory kit).

    • @giglioflex
      @giglioflex 2 года назад

      $300 for the RAM, $250+ for the motherboard, $160 for the CPU cooler. It's extremely pricey.

  • @gamersonlydotvip5337
    @gamersonlydotvip5337 2 года назад +5

    With the massive increase in power draw, it really seems like the 10th gen i9 chips are still really competitive. DDR5 plus two generations newer should have been a much bigger boost than that, even without a massive increase in TDP.

    • @Yogachara
      @Yogachara 2 года назад

      This is misguided because it's up to software to make use of newer hardware technologies. The potential for significant performance gains is certainly there for developers to exploit.

  • @darrenm5797
    @darrenm5797 2 года назад +37

    I keep seeing videos now tagged "AMD is in trouble now" and similar.
    I honestly expected more (or less, from a power consumption perspective) from Intel, which is being compared against AMD models released 12 months ago.
    I think the biggest disappointment for me is the lackluster attempt to make it look like they were going efficient.
    Sticking with this use-tonnes-of-power-to-get-performance approach is amplified in a server-type environment and is a much bigger disadvantage than anything else.

    • @chrisdpratt
      @chrisdpratt 2 года назад +13

      Yeah. Frankly, it just seems like clickbait, but I'd expect more from Steve. Still, this is really not that impressive. These are kind of basic generational uplifts, and it's not nearly enough to counter what AMD has in the pipeline. Maybe all these reviewers are just so happy to have something decent to review against AMD that they're more giddy than they should be.

    • @innocentiuslacrim2290
      @innocentiuslacrim2290 2 года назад +1

      I think this is a misinterpretation. The power consumption figures we have seen in these early reviews show very comparable power use to AMD chips in any normal (even intensive) workloads that are not synthetic benchmarks. We also see that the 12600K has a VERY efficient power-to-performance ratio in gaming workloads.

    • @brendago4505
      @brendago4505 2 года назад +1

      Intel's 13th gen is supposedly going to have 16 E-cores on the flagship. Additionally, if you look at HUB's reviews, where they point out lightly threaded games versus heavily threaded games, it becomes clear that the P-cores outperform Zen 3 cores, generally speaking, and the E-cores underperform, averaging out to be about the same. LTT found that 12th gen used less power than Ryzen while gaming, even if that wasn't true while running a Blender test. Plus, every reviewer let the power settings run at maximum hog-wild, so Windows is going to tell the processor to just crank up the wattage, efficiency be damned. We just don't know how these chips will behave when told by the OS to tone down the power draw.

    • @chrisdpratt
      @chrisdpratt 2 года назад +1

      @@brendago4505 Well, it will be worse. Running at maximum power gives Alder Lake its absolute best showing. If you take power away, you take performance as well. They're not reinventing the laws of physics.