Intel Core i9-12900KS Review, Ryzen 7 5800X3D Counter

  • Published: 24 Nov 2024

Comments •

  • @HerrAlien
    @HerrAlien 2 года назад +34

    Makes perfect sense to test the 12900ks with the 3090ti - those two deserve each other.

  • @KimBoKastekniv47
    @KimBoKastekniv47 2 года назад +353

    Intel: 12900KS is better than the 12900K
    Consumer: By how much?
    Intel: Don't worry about it, it's just better!

    • @emulation2369
      @emulation2369 2 года назад +52

      By one letter

    • @TheStraightGod
      @TheStraightGod 2 года назад +6

      @@emulation2369 S is honestly the worst letter in the alphabet. I always thought so. Do better next time, Intel.

    • @jmun3688
      @jmun3688 2 года назад +4

      @@TheStraightGod S is pretty mid

    • @koraptd6085
      @koraptd6085 2 года назад +3

      @@TheStraightGod C is much worse lol

    • @SevenHunnid
      @SevenHunnid 2 года назад

      I know this is random, but fam, my mom recently found out about my weed channel and now I'm thinking about quitting or deleting my stuff 😩😩

  • @kardeon4503
    @kardeon4503 2 года назад +145

    The 21°C ambient temperature saved your CPU. With summer temperatures the 12900KS is going to melt or run 20% slower.

    • @DimkaTsv
      @DimkaTsv 2 года назад +26

      That AC-cooled air must feel really cold to sit in.
      Also geez... 92 degrees peaked IN CINEBENCH! On a liquid cooler and 21°C ambient... That's an oven.
      I bet it would thermal throttle really fast if it were Prime95 for thermal tests.
      Update: for people who don't get it, a comfortable AC temp for me is 24-25°C; I'd be freezing at 21, as would many of you, probably.

    • @smifffies
      @smifffies 2 года назад +13

      @@DimkaTsv I believe he said 102 degrees, so even worse. Ideal for central heating in the winter though!

    • @DimkaTsv
      @DimkaTsv 2 года назад +2

      @@smifffies I swear I heard 102 too, but I believe TjMax for Intel's big CPUs is 100 degrees (105 is for the notebook ones, IIRC).
      And HWiNFO showed 92.

    • @nathangamble125
      @nathangamble125 2 года назад +1

      Nah, anyone with enough money to burn that they're buying this POS will have an obscenely overkill AC system to tame it with.

    • @DimkaTsv
      @DimkaTsv 2 года назад +1

      @@nathangamble125 as well as clothes for cold weather to wear with that obscene AC

  • @yurimodin7333
    @yurimodin7333 2 года назад +21

    "secured by other means" and no F's given for launching the review is why I am glad to be a Patron. This thing is hotter than a FX-9590.

    • @mdd1963
      @mdd1963 2 года назад +2

      At least this CPU is not outperformed in gaming by an i3-2120 clocked 1.5 GHz lower....

    • @Theworthsearcher
      @Theworthsearcher 2 года назад

      @@mdd1963
      The FX-9590 is clocked at about 5 GHz - 4.7 all cores. ;) ...

    • @mdd1963
      @mdd1963 2 года назад

      @@Theworthsearcher And they sold quite a few of them; probably 90% of the folks who bought them did so because 5.0 GHz 'sounded faster' than the 3.6-4.0 GHz of Intel's competing models, actual results be damned :)

  • @benjaminoechsli1941
    @benjaminoechsli1941 2 года назад +31

    This is why I'm a Patreon member. Give the lads down under the funds to buy these products themselves when [insert company here] won't supply a review sample, for reasons that become obvious once the review goes live.
    To take a line from Intel themselves, _"Thanks, Steve!"_

  • @aquapendulum
    @aquapendulum 2 года назад +191

    Once again, Intel is bringing the heat. Literally.

    • @fredfinks
      @fredfinks 2 года назад +4

      Do you use Blender or Cinebench? Look at gaming. I assume that's what both you and I are really interested in. For a lot more performance, look at the power consumption at 11:40. TL;DR: per unit of performance, Intel is bugger-all different from AMD in power draw while gaming.

    • @Staarfury
      @Staarfury 2 года назад +1

      @@fredfinks Indeed, 20+% higher performance for 5% more power (and that's including higher power usage from the GPU which has to render 20% more frames)
      It's quite possible that the 12900 uses less power while gaming than the 5950X does.

    • @aquapendulum
      @aquapendulum 2 года назад +2

      @@fredfinks I have observed that a lot of games tend to push the CPU clock as high as possible even though they only use a fraction of its power. Just do this experiment in some games: Limit FPS to 60 through in-game or driver-forced Vsync, make sure that they stay locked at that framerate and CPU power draw is at power limit, then use XTU or equivalent programs and lower power limit by just 10W, run the same benchmark again. My results had Serious Sam 4, Tales of Arise and rpcs3 all having the same framerate locked at 60FPS at both stock PL and PL-10W.

    • @wcg66
      @wcg66 2 года назад +5

      @@fredfinks with that logic, why not get an i7 or an i5? If you’re gaming at 4K, why upgrade at all? Power consumption and heat matter in reviews.

    • @thomaswest2583
      @thomaswest2583 2 года назад +3

      12900ks so you have chosen death. The fire spreads lol

  • @Bogdan00
    @Bogdan00 2 года назад +152

    I feel like this CPU is like the 3090Ti, refreshed old tech.

    • @shaneeslick
      @shaneeslick 2 года назад +26

      Yeah & with a STOOPIDLY RIDICULOUS price

    • @damara2268
      @damara2268 2 года назад +27

      Both products are aimed at the same group of customers: people with lots of money who always have to have the top product and don't care that they're getting scammed, paying much more for basically the same thing.

    • @lucingolo
      @lucingolo 2 года назад +22

      @@damara2268 you mean Apple customers?

    • @wherearemytesticles
      @wherearemytesticles 2 года назад +15

      Not even refreshed, just binned.

    • @nathangamble125
      @nathangamble125 2 года назад +3

      It's even worse than the 3090 Ti! The 3090 Ti at least has a better PCB and power delivery, and a VRAM layout that's much easier to cool. The i9-12900KS is literally physically identical to the 12900K except for being binned more rigorously and set to run at slightly higher clock speed.

  • @joakimp2711
    @joakimp2711 2 года назад +218

    The 1% lows in games are very impressive on Intel 12th gen with DDR5-6400 memory. I'm curious how the 5800X3D will do, and of course the new Zen 4.

    • @aravindpallippara1577
      @aravindpallippara1577 2 года назад +6

      Will definitely struggle with lows - but I expect big jumps in averages

    • @superneenjaa718
      @superneenjaa718 2 года назад +33

      @@aravindpallippara1577 I think the lows would be better than the 5800X's. That large cache will alleviate some of the memory bandwidth bottleneck caused by DDR4.

    • @ThunderingRoar
      @ThunderingRoar 2 года назад +27

      That DDR5-6400 CL32 32GB RAM kit costs like $700 though, and the DDR4-3200 CL14 32GB is $200.

    • @OCDyno
      @OCDyno 2 года назад +1

      My $270 5200 CL40 Kingston kit does 6400 30-37-36 with no issue; in fact, I'm limited by the motherboard in terms of frequency.

    • @OccultDemonCassette
      @OccultDemonCassette 2 года назад +11

      We're already at a DDR5 ram standard? Feels like it's only been a few years since DDR4 became standard. Didn't DDR3 last for a super long time before 4 came out?

  • @brettgriggs9888
    @brettgriggs9888 2 года назад +13

    Thank you for using AUD pricing on an Australian channel! It's appreciated.

  • @alevxzx
    @alevxzx 2 года назад +4

    It is somewhat misleading to mention 5800X3D in the title, as it is still weeks away until the review embargo is lifted.

  • @boingkster
    @boingkster 2 года назад +86

    Over 5ghz? The FX-9590 is gonna be pissed.

    • @wile123456
      @wile123456 2 года назад +9

      Yeah, but the FX CPUs were pissed when they existed, since they're all 4-core non-hyperthreaded CPUs (and a dual integer unit is not 2 cores).

    • @GewelReal
      @GewelReal 2 года назад +6

      i3s were laughing at it at not even 4GHz

    • @RaneBoDasch
      @RaneBoDasch 2 года назад +5

      Owning an FX CPU was like dating a trans person and only finding out 6 months later

    • @wile123456
      @wile123456 2 года назад +3

      @@RaneBoDasch no reason to be transphobic, the one thing more cringe than FX CPU's is being bigoted

    • @RaneBoDasch
      @RaneBoDasch 2 года назад +3

      @@wile123456 Wasn't being transphobic. It was a joke...just like the FX processors. Most, but not all, people would want to know if they were dating someone who was trans especially when it comes to the reproductive aspect of a relationship. A woman with a penis can't reproduce and bear children compared to a woman with a vagina. Same way most people would want to know if their processors were true 8 core or 4 core with "hyperthreading". Similar but not Identical. I have no more hate for trans people than I have for any other one of the billions of shitty human beings on this planet but I also can't respect anyone or anything being purposely deceptive. Guess it may have been more appropriate if I compared it to being intentionally lied to by a trans person for 6 months.

  • @ole-martinbroz8590
    @ole-martinbroz8590 2 года назад +16

    Australia wasn't hot enough already?

  • @petertanpt84
    @petertanpt84 2 года назад +29

    I'm more impressed with how much faster Intel is with DDR5 in gaming, than the 12900KS itself, which is basically a non event in my eyes.

    • @sheriffsinghbhullar6972
      @sheriffsinghbhullar6972 2 года назад

      Agreed, but is the premium justified? I don't know where you live, but here in India it's very hard to get and costs more than double the price of DDR4.

    • @andersjjensen
      @andersjjensen 2 года назад +4

      It's getting better, yes. But on the day-one review of the 12900K, the fastest DDR5 available was 5400 CL-you-won't-believe-how-bad-it-is, which made DDR5 look incredibly dumb for the price. I don't know if AMD was smart or lucky, but it looks like by the time Zen 4 launches, DDR5 prices should be at the point where everyone thinks they're a non-issue.

    • @jaybee0507
      @jaybee0507 2 года назад +2

      Dude, the DDR4 testing was done at 3200 CL14. A 12900K usually does 4000 CL14 easily. That's ~25% more bandwidth. And then you drop tRFC for even more performance. Yeah, it's not DDR5-6400, which is, I guess, pretty much the king, but it's not as far off as in this review. I mean, 3200 isn't exactly top-tier, you know. It's really basic stuff unless you overclock.

    • @mdd1963
      @mdd1963 2 года назад +1

      I was noticing those massive DDR5 gains myself!!!!!!!!

    • @mdd1963
      @mdd1963 2 года назад

      Now imagine Raptor Lake, on DDR5-7000! :)

  • @andrei007ps
    @andrei007ps Год назад +2

    I don't really understand why, in almost all KS CPU reviews, there is no emphasis on undervolting. That is a huge advantage on a binned CPU: you can basically retain 12900K performance at closer to 12700K power draw and thermals. There are also a lot of discounts on such high-margin parts, so you can get them for a very small premium over the non-S CPU.

  • @zenvi420
    @zenvi420 2 года назад +8

    The 12900KS is a gaming CPU and a heater for your house/room.

    • @aluckyshot
      @aluckyshot 2 года назад

      It has a lower TDP than processors from 6 generations back. You'll be fine.

  • @DrearierSpider1
    @DrearierSpider1 2 года назад +213

    This power guzzling CPU is brought to you by the looming worldwide energy crisis.

    • @n9ne
      @n9ne 2 года назад +1

      what energy crisis?

    • @TheHighborn
      @TheHighborn 2 года назад +24

      @@n9ne Well, the energy crisis in Europe, at a bare minimum.

    • @NXTLifeGame
      @NXTLifeGame 2 года назад

      @@TheHighborn Well, there is more than enough electricity, sooooo. I don't know about your PC, but mine doesn't run on gas.

    • @Immebehindyou999
      @Immebehindyou999 2 года назад +50

      @@NXTLifeGame This guy truly doesn't understand how the European electrical grid works.

    • @jpHasABadHandle
      @jpHasABadHandle 2 года назад +19

      @@NXTLifeGame And electricity is produced out of thin air :V

  • @petrkubena
    @petrkubena 2 года назад +130

    I must say that I'm quite surprised by DDR5 performance uplift in these tests. Definitely well above what day-1 12900K reviews showed. Is it only a choice of tests, or is there a real upgrade in current bios/ucode/windows scheduler/game optimizations for higher bandwidth?

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +114

      Combination of things: we've changed a few of the games, and the RTX 3090 Ti is faster at 1080p. We're also using faster DDR5 memory. All that said, previous DDR5 testing did show gains of 20% in certain games.

    • @vasudevmenon2496
      @vasudevmenon2496 2 года назад +13

      @@Hardwareunboxed does this CPU have new evolved spectre microcode and hardware fixes?

    • @wayland7150
      @wayland7150 2 года назад +11

      We've really not yet seen what DDR5 is capable of. Currently there's not much point in getting it, but there will be.

    • @peterhermina656
      @peterhermina656 2 года назад +5

      Steve, are there similar DDR5 gains over DDR4 for simulation games like MS Flight sim or Factorio?

    • @Mojave_Ranger_NCR
      @Mojave_Ranger_NCR 2 года назад +3

      @@wayland7150 Yeah the technology surrounding DDR5 hasn’t caught up yet, but that should change very soon

  • @GewelReal
    @GewelReal 2 года назад +14

    300 Watts out of the box... MADNESS!

  • @adamhafiddin9564
    @adamhafiddin9564 2 года назад +10

    this is the perfect cpu for home theater
    the "t" is silent

  • @theviewer1423
    @theviewer1423 2 года назад +2

    I bought a Ryzen 5 5600. I had a motherboard for it already, and it's $199 US. Using less power is like the icing on the Ryzen cake lol

    • @uncleelias
      @uncleelias 2 года назад

      I like quiet and cool PCs. I plan on picking up a 5700x when that's launched

  • @jasonflt
    @jasonflt 2 года назад +7

    This CPU is meant for Canadian winters of -40°C, to keep your gaming room at +40°C. Now we can put away our space heaters. Hot 🔥 garbage 🗑️

  • @GordonGEICO
    @GordonGEICO 2 года назад +73

    It's sad when the nicest thing Steve said about this processor was "a huge waste of money."
    I think the people that buy this will brag about how much it cost them and be proud of the fact that they pay +50% more money for a negligible performance increase. Unfortunately, there seems to be enough people out there willing to pay because manufacturers keep doing this kind of thing.
    The last couple of years have ruined PC gaming for me, at least for the foreseeable future. I was gifted a PS5 and it's going to be a while before I build a new gaming PC.

    • @standarsh8056
      @standarsh8056 2 года назад +7

      It's a limited-edition CPU.
      Essentially a collector's item, just like the 9900KS was.

    • @akev2794
      @akev2794 2 года назад +6

      Intel Alder Lake makes things competitive again, and prices of GPUs are improving.
      This is a collector's item, not the value option.

    • @mrdali67
      @mrdali67 2 года назад +9

      Intel have had Extreme Editions of their CPUs for many years. Even though they don't make much sense price-to-performance-wise, someone will always be willing to pay just for the bragging rights, which is fine. And I really don't see the KS models as being for extreme overclockers. They're for people who have enough money and just want the fastest their money can buy, without the hassle of trying to overclock a standard-binned CPU.

    • @Mark3rz101
      @Mark3rz101 2 года назад +8

      Why would this ruin PC gaming for you? This doesn't impact anyone other than those who buy it (which is a small percentage). This is the best time to build. AM4 is going down in price, ddr5 is becoming more affordable, and new architectures in CPU and the GPU space around the corner.

    • @wayland7150
      @wayland7150 2 года назад +1

      This is unobtainium unless you need to spend a lot of money to prove a point. You'd not cheap out on anything if you wanted this CPU, you buy the most expensive of everything to go with it. The fact that someone could come very close and spend a lot less money would be like when your Lambo gets chopped by a Supra. The gold diggers are still going to be more impressed by the Lambo.

  • @jahejsa
    @jahejsa 2 года назад +3

    Could be a funny video to watch:
    1: Top-of-the-line PC - all the best parts
    2: Build a PC at half the price of nr. 1
    3: Build a PC at 1/3 of the price of nr. 1
    And see what the difference between an unlimited budget and "buying smart" really is

    • @mapesdhs597
      @mapesdhs597 2 года назад

      4. XEON 8c 2650 v2 with an Ali X79 mbd and 32GB/1866 with $15 6-pipe cooler, leaves rather a large budget for a 1440p gaming GPU, which for some games is a good match (just not those that need clocks/IPC).
      Then of course there's the general used market, lots of people selling Ryzen 3600s just now, while for new parts the 10105F and 10400F still look very appealing.

  • @crylune
    @crylune 2 года назад +2

    I can't give you direct financial support as I'm massively in debt but what I can do is leave a like and this comment for that sweet algorithm boost, keep it up guys.

  • @blackmennewstyle
    @blackmennewstyle 2 года назад +22

    Man, if this power-hungry hardware trend continues, people are gonna need their own nuclear power plant at home lol

    • @Hugh_I
      @Hugh_I 2 года назад +7

      on the other hand if you combine this CPU with a 3090Ti, you won't be worrying about your gas bills for heating anymore, so there's that...

    • @monke2361
      @monke2361 2 года назад +3

      @@Hugh_I lol run 3d mark and prime95 at the same time and rip power supply

  • @MrAtthedrivein925
    @MrAtthedrivein925 2 года назад +7

    The thumbnail for this video is just perfect, thank you Steve for this excellent review

  • @djnorth2020
    @djnorth2020 2 года назад +5

    12900KS, support your local electricity company.

  • @sissifus7226
    @sissifus7226 2 года назад +3

    This is actually the perfect CPU for our lab - where the several million $ experiment is bottlenecked by a single-threaded process that cannot be parallelized (easily). In that case, a few thousand bucks for a bit more performance really don't matter that much. But as a consumer I fully agree - this is stupid for the vast majority of people. Thanks for the review, I very much appreciate that you went the extra mile to get a sample (even though you should not have to)!

    • @suntzu1409
      @suntzu1409 2 года назад

      "this is actually.........dont matter that much"
      Is that sarcastic?

  • @heickelrrx
    @heickelrrx 2 года назад +27

    If this is a higher-binned 12900K, wouldn't that mean it's the best 12900K silicon?
    Can we underclock and undervolt it to match a normal 12900K and get better temps '-')?

    • @panjak323
      @panjak323 2 года назад +17

      Better binning doesn't always equal lower power consumption. I think der8auer did a video on that some time ago with many samples of the 9900K.

    • @heickelrrx
      @heickelrrx 2 года назад

      @@panjak323 Too bad then. The current best ITX boards are on Z690; I thought we could make a decently powerful SFF build with this chip and some undervolting tweaks.

    • @chriswright8074
      @chriswright8074 2 года назад

      His 5950X numbers are off; they're around 25-30K.

    • @GodKitty677
      @GodKitty677 2 года назад +7

      For overclockers, basically. You are paying for chip binning. You'd most likely get better performance from better RAM timings or a more powerful GPU. The power draw is likely going to be stupidly high, which means expensive water cooling. For a gamer looking to play for long periods, you want to hit 4K@60 or 1440p@60 and limit power draw. Also, with hardware-accelerated GPU scheduling enabled, many games hardly touch the CPU. Things like Space Engineers are the exception. RAM overclocking is a pain and does give a decent performance boost, but it takes about 30 minutes to dial in and 10 hours of Prime95 large FFTs to prove stable.
      The 12900KS is just an easy way to get that higher Time Spy CPU score.
      For example, my 10900K system with a 3080 Ti, with an overclock only on the CPU and RAM, in Shadow of the Tomb Raider at 1080p with the same settings as the 12900KS review, gets 217 FPS at the highest preset with TAA. The 12900KS gets 190 FPS for some reason.

    • @mamamia5668
      @mamamia5668 2 года назад

      It lacks AVX-512, though.

  • @TheBlackIdentety
    @TheBlackIdentety 2 года назад +2

    My god! Such an insane Zen humiliation. Glorious! I very much doubt Zen4 can compete.

  • @green_block
    @green_block 2 года назад +28

    The 12700 non-K is the best consumer CPU right now for value, gaming, content creation, and efficiency.
    Value, because it's cheap and comes with a cooler.
    Gaming is excellent due to fast P-cores and low latency.
    Content creation is great due to the combined P- and E-cores as well as the excellent Quick Sync encoder. Also good for multitasking, since the E-cores take care of background tasks.
    Power efficiency is great due to this being a non-K chip. Peak power usage is very low. Idle power usage is best-in-class for x86 CPUs.

    • @aravindpallippara1577
      @aravindpallippara1577 2 года назад +6

      The cooler is pretty meh, though; the rest is great.
      I'd give power efficiency to the 12700K, since it's capable of undervolting/underclocking and gaining significant efficiency that way.

    • @wile123456
      @wile123456 2 года назад

      Intel stock coolers are landfill material destroying the climate

    • @Jabid21
      @Jabid21 2 года назад +1

      And with that extra money, you can go DDR5 and get a better gpu

    • @Duncan23
      @Duncan23 2 года назад +1

      The stock cooler really isn't sufficient for CPUs above the 12600. Even in the relatively cold UK winter, the 12700/12700K will rarely hit its full boost frequencies.

    • @nathangamble125
      @nathangamble125 2 года назад +1

      The Ryzen 5 5600 is probably slightly better value - its gaming performance isn't far behind, and it's much cheaper (especially when including platform costs), though it doesn't really get close in heavily multithreaded productivity tasks. Either way, the i7-12700 is one of the most compelling CPU options on the market. If you'd said this 2 days ago, it would be very hard to justify any other candidate for best-value new CPU.
      Though if you're also including old second-hand CPUs, and don't require something suitable for high-end PCs, a Xeon E5-2666 v3 is probably the best-value CPU on the market. It's an old 10-core Haswell CPU, about as fast as an i5-10400F overall (slower single-core, but faster multi-core), and only costs about $25.

  • @madf00bar15
    @madf00bar15 2 года назад

    Congrats on the day of release video!

  • @azaph_yt
    @azaph_yt 2 года назад +15

    The 12900KS seems designed to make the 12900K look like a reasonable purchase.

    • @timothyandrewnielsen
      @timothyandrewnielsen 2 года назад

      There always has to be something better than the best to keep people buying overpriced CPUs.

  • @anakondase
    @anakondase 2 года назад +7

    How long until we get reviews of the 5800X3D?
    Not that I'm gonna replace my 5900X but it will be interesting to see how it compares with the 5800X.

    • @wile123456
      @wile123456 2 года назад +3

      ~10% increase in performance according to leaks and rumours. It will be the final and best upgrade for 400- and 500-series motherboards (and AMD has talked about certain 300-series motherboards being allowed to add support, despite not supporting the 5000 series at launch).

    • @ThunderingRoar
      @ThunderingRoar 2 года назад +3

      April 20th, IIRC

  • @theigpugamer
    @theigpugamer 2 года назад +6

    I think in a few years, when we have an RTX 6090 and a Core i9-16900KS, we'll need our own power grid to start the computer.

    • @wayland7150
      @wayland7150 2 года назад +1

      This goes in stages. The Pentium 4 was very hot, whereas the Pentium III was not. They actually had to go back to the Pentium III design to produce the Core 2 series, which was not as hot as the Pentium 4. Similar with GPUs: the R9 290 was very hot, but the RX 480 that replaced it ran a lot cooler. Nvidia will have to come out with cooler chips to continue.

    • @theigpugamer
      @theigpugamer 2 года назад

      @@wayland7150 let me introduce a super cool and power efficient 3090 tie

    • @wayland7150
      @wayland7150 2 года назад

      @@theigpugamer The next gen will be cooler, or the one after that.

  • @D00m3dHitm4n
    @D00m3dHitm4n 2 года назад +1

    Do you guys plan to benchmark the 5k series CPUs (Just one or all of them) on X370(B350) vs X570(B550)?

  • @fVNzO
    @fVNzO 2 года назад +15

    The only thing I'm missing in your reviews, Steve, is the frequency-scaling chart that GN does. Yes, I can just go and watch their video (yours are better though), but it's a really useful chart for SKUs exactly like this one.

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +19

      Here's the clock multiplier table, but this will depend on the cooling used, so frequency scaling will vary. For our setup, 8 P-cores sustained 5050 MHz, not 5200 MHz.
      12900KS
      1 P-core = 55x
      2 P-core = 55x
      3 P-core = 52x
      4 P-core = 52x
      5 P-core = 52x
      6 P-core = 52x
      7 P-core = 52x
      8 P-core = 52x
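
      A quick way to read the table: Intel turbo multipliers apply to the ~100 MHz base clock, so 52x ≈ 5200 MHz. A minimal sketch of that arithmetic, assuming the standard 100 MHz BCLK (as the reply notes, sustained all-core clocks can land lower, e.g. 5050 MHz on their setup):

```python
# Rated P-core clocks from the multiplier table above.
# Assumes the standard 100 MHz base clock (BCLK); sustained clocks
# under load depend on cooling, as the reply points out.
BCLK_MHZ = 100

# Active P-core count -> turbo multiplier, as posted for the 12900KS
MULTIPLIERS = {1: 55, 2: 55, 3: 52, 4: 52, 5: 52, 6: 52, 7: 52, 8: 52}

def rated_clock_mhz(active_p_cores: int) -> int:
    """Rated turbo clock (MHz) for a given number of active P-cores."""
    return MULTIPLIERS[active_p_cores] * BCLK_MHZ

print(rated_clock_mhz(1))  # 5500 (1-2 P-cores active)
print(rated_clock_mhz(8))  # 5200 (all-core rating; 5050 sustained in their setup)
```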

    • @williama.2476
      @williama.2476 2 года назад +5

      @@Hardwareunboxed The mad lad!

    • @fVNzO
      @fVNzO 2 года назад +4

      @@Hardwareunboxed Very cool. It's surprising how difficult it is to find this data in reviews; I always find it very interesting for K vs non-K, X vs non-X and so forth.

  • @BeefLettuceAndPotato
    @BeefLettuceAndPotato 2 года назад +4

    Ahh yes. A nice new waste of silicon..
    Great job Intel, glad I switched and haven't had the slightest urge to go back.

  • @awsan111
    @awsan111 2 года назад +5

    3090ti kingpin with a 12900ks and a godlike mobo, damn that is almost a new car's worth.

    • @Bas-Man1
      @Bas-Man1 2 года назад +2

      pointless though

    • @awsan111
      @awsan111 2 года назад

      @@Bas-Man1 Of course it is, but then no one would buy anything other than a 12400F and a 3060 Ti for mainstream gaming if logic were the point.
      A build like the one I mentioned is just everyone's dirty dream, like it or not.

    • @Supcharged
      @Supcharged 2 года назад

      What new car can be had for that little money these days?

    • @awsan111
      @awsan111 2 года назад

      @@Supcharged It's a joke; those 3 parts are $5000 though.

    • @Supcharged
      @Supcharged 2 года назад

      @@awsan111 If you drop down to a 12700KF + 3080 Ti, though, you only lose maybe 10% performance for half the money.

  • @okramediev404
    @okramediev404 2 года назад +1

    The Ryzen 7 5800X3D is a better and cheaper choice than Intel's options for gaming. I don't need to mention Intel's power consumption; it's sky-high. AMD has better value and better performance out of the box, with no need for high-priced DDR5, and AMD motherboards are usually cheaper (B550/X570/X570S). AMD is faster with older hardware, and when AM5 comes out, Intel may have an even harder time keeping up. AMD platforms also last longer; they release BIOS updates even for older AMD motherboards.

  • @shahrukhwolfmann6824
    @shahrukhwolfmann6824 2 года назад +4

    I'll just get a 12700F then. Thank you, Steve, as always great content!

    • @whohan779
      @whohan779 2 года назад

      The non-K-SKUs seem to perform way better on DDR5 than DDR4 with their

    • @shahrukhwolfmann6824
      @shahrukhwolfmann6824 2 года назад

      Current gen speaking, MSI offers better VRM at a reasonable price.

  • @fyreblade1262
    @fyreblade1262 2 года назад

    Thanks so much for the Aussie prices. Best channel ever.

  • @white_shadow_123
    @white_shadow_123 2 года назад +11

    3090 Ti: finally, a worthy opponent!
    Our battle will be legendary!

    • @benjaminoechsli1941
      @benjaminoechsli1941 2 года назад

      Power supply: why do I feel like I'm about to be spitroasted?

  • @christophermullins7163
    @christophermullins7163 2 года назад +6

    this channel does it right every time. thank you for sharing such unbiased opinions. 😊

  • @andrewcross5918
    @andrewcross5918 2 года назад +4

    Great video. It seems barely worth it, and I'm surprised at the performance gain DDR5 brings, since I haven't really kept up on that since the 12th-gen launch.
    Any chance of adding non-FPS gaming metrics to CPU reviews? Stuff like late-game tick rates for CK3, Stellaris, Cities: Skylines and similar, or AI turn times for 4X games, as a couple of examples. It would be nice to see these kinds of results, as there are a wide variety of games where the GPU matters little but choosing the right CPU can make an appreciable difference to gaming performance.

  • @ЕвгенийТрофименко-л8в

    Waiting for the Ryzen 7 5800X3D review ❤️🔥

  • @HeadphoneHangover
    @HeadphoneHangover 2 года назад +5

    When you guys do compare the 5800X3D to the 12900K can you also please include 0.1% lows in addition to 1% lows and average FPS results? Would really appreciate it and would be interesting to see 0.1% lows comparisons between these CPUs.

    • @mapesdhs597
      @mapesdhs597 2 года назад +6

      One problem with the 0.1% Lows is that by definition the Lows metrics are themselves averages, hence they often cannot truly convey a situation where a very occasional frame time spike is really annoying to gameplay as a massive stutter, but doesn't happen often enough to produce a useful number. Steve at GN talked about this a few times, so has RandomGamingInHD. This is why GN focuses more on consistent frame times rather than Lows, but of course the latter is easier to represent in a comparison chart.

    • @rdmz135
      @rdmz135 2 года назад +2

      Frametime graph would be even better

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +2

      1% lows and 0.1% lows are not averages of anything.

    • @BNR_248
      @BNR_248 2 года назад +1

      I think the best way to see lows, frame timing and stutters is to watch in-game performance benchmark videos, done by TestingGames for example. It's hard to see "average" lows on graphs. They've recently released a video comparing the 12900K vs the 12900KS.

    • @mapesdhs597
      @mapesdhs597 2 года назад

      @@Hardwareunboxed That's odd, I could've sworn Steve @ GN referred to them in this way in a recent video.
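
      For anyone following the debate above: with the percentile method, a 1% (or 0.1%) low is the frame rate at the frame time bounding the slowest 1% (or 0.1%) of captured frames, a single threshold value rather than an average of anything. A minimal sketch of that approach, as an illustration only (not necessarily Hardware Unboxed's exact pipeline; the function name and sample numbers are made up):

```python
# Percentile-style 1% / 0.1% lows from a frame-time capture (in ms).
# Picks the frame time that bounds the worst `fraction` of frames;
# nothing is averaged.
def low_fps(frame_times_ms, fraction):
    """FPS at the threshold frame time for the slowest `fraction` of frames."""
    ordered = sorted(frame_times_ms)   # fastest frames first, slowest last
    idx = min(len(ordered) - 1, int(len(ordered) * (1 - fraction)))
    return 1000.0 / ordered[idx]       # convert ms per frame -> FPS

# 99 smooth 10 ms frames (100 FPS) plus one 50 ms stutter:
times = [10.0] * 99 + [50.0]
print(round(low_fps(times, 0.01)))               # 20  (1% low)
print(round(1000 / (sum(times) / len(times))))   # 96  (average FPS)
```

      Note how a single stutter drags the 1% low far below the average, which is why reviewers chart it alongside average FPS.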

  • @boomish69
    @boomish69 2 года назад +2

    426 watts for a CPU!! Holy cow, it’s gonna cost a fortune to run with Electricity prices going through the roof.

  • @PaulieVideos
    @PaulieVideos 2 года назад +5

    can't wait to use two PSUs in SLI to power my high end pc

    • @suntzu1409
      @suntzu1409 2 года назад

      Get this man a raise

  • @ilkerYT
    @ilkerYT 2 года назад +4

    Can you please do an OC on the P-cores and run a gaming-benchmark-only test?

    • @wile123456
      @wile123456 2 года назад +1

      check his earlier P vs E core videos

  • @WayStedYou
    @WayStedYou 2 года назад +3

    Bushfire season starting early Steve?

  • @msullami2909
    @msullami2909 2 года назад +2

    An $800 part to counter a $450 part... seems legit

  • @davidilie3795
    @davidilie3795 2 года назад +3

    $739 in the US according to intel's site. I'm sure plenty of people will buy this. If there wasn't a market for these products, they wouldn't be getting fabricated.

    • @michaelkeudel8770
      @michaelkeudel8770 2 года назад

      It's not like they did anything different than change the laser etching on the heat spreader and put a different sticker on the box.

    • @bltzcstrnx
      @bltzcstrnx 2 года назад

      Intel lists bulk pricing on their site, not MSRP. So it will probably be around 800 to 850 USD.

  • @Pedone_Rosso
    @Pedone_Rosso 2 года назад +2

    I have a question, or rather a curiosity I'd like to see addressed.
    Not really important, but still maybe interesting to answer for other people too?
    So, there's one kind of application that regularly uses 100% of CPUs' power for virtually unlimited times if set to do so,
    free to use (in many cases, at least), and, in a sense, in a "realistic" scenario (for those who use that kind of app, I mean):
    Chess engines.
    As these programs have been used for a lot of time by Chess players all over the world,
    and historically by students and researchers in informatics as model systems,
    I was wondering why I never see them used for CPU benchmarking.
    As you're clearly respected professionals of this field, could you tell me why that's the case?
    Thanks in advance for any answer on this curiosity of mine,
    and as always THANKS for your work and videos!

  • @nathanddrews
    @nathanddrews 2 года назад +8

    Good God, those gaming benchmarks... I never thought we would see such a shellacking like that ever again.

    • @gerv55
      @gerv55 2 года назад +2

      Ask yourself this: what enthusiast is gonna buy one of these and pair it with a 1080p screen? In the real world those numbers are meaningless and have been for quite some time.

    • @nathanddrews
      @nathanddrews 2 года назад

      @@gerv55 Meaningless to you, maybe. You're aware that most very high refresh displays (240Hz, 360, etc.) are 1080p, right? You're not getting much above 144 with 1440p and 4K at the moment.

    • @gerv55
      @gerv55 2 года назад +2

      @@nathanddrews those displays are niche market at best, 1440p and ultra wides are more popular.

    • @nathanddrews
      @nathanddrews 2 года назад

      @@gerv55 Consider for a moment what you're saying. First you say that this is unrealistic because no "enthusiast" would ever need it. After I provide a completely realistic enthusiast scenario, you call it niche and shift the goalposts to mainstream popularity. Take the L, Team Red.

    • @Cooe.
      @Cooe. 10 месяцев назад

      Lol did the X3D chips make your Intel fanboy brain explode? 🤣

  • @Windupmykilt
    @Windupmykilt 2 года назад

    Smoke machine is a utterly fantastic touch. *Chefs kiss*

  • @bestthingsinceslicedrice
    @bestthingsinceslicedrice 2 года назад +3

    I have a coworker I bet would build his computer this way, since to him it's like a dck-measuring contest over who has the top-of-the-line latest and greatest.
    Then later he would complain how his wages are not keeping up with today's standards 🙄.
    Judging by how inflated the market is, though, it will be tough to stay on budget, so I'll hold on to my 3700X AMD build as long as I can

  • @klklkl427586
    @klklkl427586 2 года назад +1

    What is the Vcore voltage at 5.5GHz?
    It looks like overclocked and overvolted 12900k, not necessarily better silicon.
    You get a guaranteed overclock at least...

  • @Nate7.75
    @Nate7.75 2 года назад +27

    I feel like the price premium is just insane. IDK if AMD will be able to take the crown back, but at the price the 5800X3D is going to release at, it's a no-brainer for gaming and competitive(?) for workstation workloads

    • @arya_amg
      @arya_amg 2 года назад +4

      More cash won't help in most workstation workloads

    • @Supcharged
      @Supcharged 2 года назад +10

      No chance the 5800X3D beats the 12700KF in workstation workloads. And at best it should be similar in gaming.

    • @Supcharged
      @Supcharged 2 года назад +3

      @@MrA-ir3me I am willing to bet it won't even beat the i5-12600KF paired with good DDR5 ram like in this video either.

    • @aravindpallippara1577
      @aravindpallippara1577 2 года назад +14

      Certain games and workloads scale very well with a large cache.
      Caching is like rolling a critical hit on a d20: if the previous result can be retrieved from cache, the cost of those lucky instructions effectively drops to zero.
      A larger cache means more chances of such lucky breaks, and in general more data stays close by without dipping into much slower system RAM.
      It will certainly compete in certain single-core workloads

    • @ssenyosdaviee1873
      @ssenyosdaviee1873 2 года назад +3

      Lots of fan bots...

  • @yogiwp_
    @yogiwp_ 2 года назад

    Really wish you'd include more pro-related tests other than graphics/rendering. Integer/ALU-heavy workloads like compilation are massively parallelizable but very different from floating-point-heavy rendering work. Unlike graphics, they do much less homogeneous computation and much more random memory access, stressing the system more. And those are the kinds that matter to a lot of people (not discrediting graphics work, they just feel under-represented). Please include things like Unreal Engine compile time (both code and shaders).

  • @IchNicht0
    @IchNicht0 2 года назад +3

    Why didn't you overclock it? You could easily have lowered the voltage and thereby not have been temperature-limited, giving you headroom for higher clock speeds.
    I think the higher binning, and therefore higher overclocking capacity, is the main selling point of the KS version over the K!

    • @markpogson3799
      @markpogson3799 2 года назад

      Yes, Steve said that.

    • @NalinKhurb
      @NalinKhurb 2 года назад +3

      That's a fair point. Lowering the voltage 'should' be the trump card for this CPU. They didn't overclock it because it saturated the CPU cooler, but I think a lower voltage and a slightly higher clock, maybe even 5100, could have been tried

    • @IchNicht0
      @IchNicht0 2 года назад

      Seriously, no one uses a 12900KS without undervolting + overclocking it. Standard voltages are always way too high, especially with stock OC models.

    • @ThunderingRoar
      @ThunderingRoar 2 года назад

      @@IchNicht0 that's not true, the majority of people will just run it stock

    • @NalinKhurb
      @NalinKhurb 2 года назад +2

      @@ThunderingRoar As Steve said, "it's for people with more money than sense". The point is, those kinds of people usually don't watch these channels.
      Everyone who's here wants a better idea of what exactly is going on behind the scenes. Or maybe I have the wrong idea. But when a product markets itself as better binned, there should be more investigation into whether that holds true or not. If you cannot cool it enough, then lower the Vcore. Quite simple

  • @Ojref1
    @Ojref1 2 года назад

    Thanks for using DDR5-6400 in your testing today. That seems like the best baseline to compare against the prior generations.

  • @kevinwood2044
    @kevinwood2044 2 года назад +4

    12700 non-k seems to be the way to go in terms of budget high performance. 20 threads, 25MB cache, 4.9GHz boost clock. Only $299 in U.S at the moment.

    • @GTFour
      @GTFour 2 года назад

      Or any of the 6 core offerings for better value

  • @griffin20
    @griffin20 2 года назад

    thank you for the videos :)

  • @bgop346
    @bgop346 2 года назад +10

    kinda hoped you would try undervolting like optimum tech when the 9900ks came out

    • @landmine36
      @landmine36 2 года назад +4

      With how little effort this "review" was, not even adjusting any settings, he may as well have compared it to the standard 12900, non-K variant, because that's how he treated it lol.

  • @stephandolby
    @stephandolby 2 года назад +1

    I'm wondering if some of these games that show decent gains with DDR5 would also see improvements by forgoing the 3200 DR kits in favour of faster SR kits.

  • @furynotes
    @furynotes 2 года назад +2

    The 3090 Ti wasn't really the best-of-the-best GPU. It's more of a transitional engineering standard, made for the GPUs that follow to capitalize on. At least in Nvidia's bubble, of course.

  • @Gen_66
    @Gen_66 2 года назад +2

    I like the mist/fog at 13:55, is it the cpu being overclocked or it’s being cooled down?

    • @DimkaTsv
      @DimkaTsv 2 года назад

      Maybe it is smoke or vapor that goes from in-case oven?) 102 degrees can boil water

  • @Jeffur2
    @Jeffur2 2 года назад +3

    The biggest takeaway is realizing that the reason why there are so many people buying this crap is because some people, despite the tumultuous times we live in, have so much more disposable income that the difference between buying mid range and literally the most expensive consumer computer you could possibly build would be like if you or I paid a bit more for extra guacamole in a burrito. This is like pennies to them and intel is happy to upsell.

    • @mapesdhs597
      @mapesdhs597 2 года назад

      Made worse by govt free stuff and some accumulating unspent cash because they were unable to spend their earnings or welfare in the normal way. My fiance works for a bank, you would not believe the anecdotes she's told me, eg. customers literally complaining that their account is in the black, that they have unspent money. It's quite bizarre. A heck of a lot of people have no idea what saving is anymore, add to that there are those who don't necessarily have the available spare cash as you describe, they just throw it all on a credit card.
      For those who do have the spare cash to spend, well in the end it's their money, nobody has the right to tell them what to do with it, but it sure didn't help that last year and before govts were handing out $1K+ "2080 Ti tickets" as people called them, while at the same time inducing a massive demand surge for gaming by forcing people to remain at home, during which people couldn't spend their money in the normal way anyway. A perfect storm that kicked PC parts prices in the goolies.
      Oh, also those in work who simply lied about having the coof coof to get free home holidays; that's happened *a lot*, but it's not something anyone will talk about openly for a long time.

    • @Jeffur2
      @Jeffur2 2 года назад +2

      @@mapesdhs597 your interpretation of my comment is so much different from what I expected

    • @mapesdhs597
      @mapesdhs597 2 года назад

      @@Jeffur2 Don't worry, I'm a bit whacko. :D I should have added, though, that you're nevertheless still right (my point was that other factors have exaggerated this kind of behaviour to unprecedented levels), in that there are a fair few people who despite being "able" to spend a lot on what they want, they do not have much of a buffer to cope when times are bad. They live beyond their means, but in a way which tends not to affect them for long periods of time, usually via exploiting credit sources or welfare. Worse, they don't bother saving at all (for some the very idea is alien), an issue which is easy to ignore when one is young, or seemingly earning enough to comfortably get by, but decades later it'll slam them like a hammer when the option for affording a live-in-carer during one's health-degraded retirement years is impossible because such services cost $2K/week (though maybe they will see this play out with their elderly parents). What could have been a comfortable latter half of their life, thrown away by as you say spending extraordinary amounts on products which make little sense.
      However, there are those who can genuinely afford it, those who've properly earned their significant surplus, but what proportion of all high SKU buyers they might be I've no idea. I talked to a guy once who said he'd happily spend twice as much on a newer RTX Titan even if it was only 5% faster because, "it would be the fastest", but he said nothing of how he was able to afford such things.
      I could easily buy a 3090 Ti if I wanted to, but I think such products are dumb as a bag of rocks. Hence why my last GPU purchase was a used 1080 Ti, bagged for a snip when people were dumping Turing cards prior to the Ampere launch. I can't imagine I'll need to upgrade again for many years.
      Some have commented about the likely appeal of these GPUs to solo professionals and I suppose that's certainly possible, though the artists I've worked with over the years have rarely had budgets that high.
      To draw any firm conclusions we'd need to know a lot more about who is buying what, how and why, but NVIDIA would never release that kind of data and probably various classes of users wouldn't want to admit they're using such products if they had them (movie studios have client perceptions to maintain, it's why some pretend they only use modern tech despite some still using certain old SGIs because even today they run pretty well for tasks like compositing). Still, it's hard to ignore the repeated anecdotal evidence which drips day by day, and yes, NVIDIA is milking it for all its worth, even boasting to their own investors about the upsell.

  • @h1tzzYT
    @h1tzzYT 2 года назад +1

    Ignoring the minuscule gains of this "KS" model, the difference between the 5950X and the 12900K with good memory is insanely high. I'm honestly shocked; even the upcoming 5800X3D doesn't have a chance against it

  • @amc6090
    @amc6090 2 года назад +3

    so basically the 5950x is still the better cpu for productivity

    • @amc6090
      @amc6090 2 года назад +1

      @@tilapiadave3234 maybe, but if you turn on PBO, it still uses less power and will score well over 29000 in Cinebench. It's the better, more consistent CPU. Yes, it's overpriced, but now you know why... and all that with DDR4... yeah, it's by far the better overall CPU

  • @gamingmarcus
    @gamingmarcus 2 года назад +1

    Should have named it 12900 KX. Everyone knows X is the best letter for gaming.

    • @wayland7150
      @wayland7150 2 года назад

      KY because you're getting shafted.

  • @jumpman1213
    @jumpman1213 2 года назад +6

    This further proves how beast of a CPU the 5950X is.

  • @MafiaboysWorld
    @MafiaboysWorld 2 года назад +1

    That 12700K vs 12900KS comparison is what I feel with 5800X3D vs 12900KS. Buy 2 for the price of 1. 😂👍

  • @big.atom37
    @big.atom37 2 года назад +3

    Why are AMD CPUs consistently slower in comparison to your review of 12900K?

  • @JuanDiegoPinillos
    @JuanDiegoPinillos 2 года назад

    Energy efficiency should become one of the main items when reviewing hardware. Utility bills aren't getting cheaper, and if things keep going the way they currently are, we should start thinking about how much money we will be spending on energy alone when gaming draws this much power

  • @sabishiihito
    @sabishiihito 2 года назад +5

    At least the KS is ultimately cheaper and less frustrating than binning multiple 12900K CPUs looking for a golden sample.

    • @WSS_the_OG
      @WSS_the_OG 2 года назад +1

      Because Intel has had the KS in the works for a while, you know they've been skimming the best binned dies out of the 12900K pool for a while, meaning you're less likely to get a golden sample 12900K (as they've already been skimmed for sale as the KS).

  • @saricubra2867
    @saricubra2867 2 года назад +1

    I think the 12900K can have some OC headroom to reach 12900KS performance, or the 12900KS would be better with an undervolt (better silicon). And maybe the price will improve?

  • @casualtake1497
    @casualtake1497 2 года назад +4

    That power consumption is ridiculous, may as well get a 1200-watt Nvidia GPU to go with it lol

    • @fredfinks
      @fredfinks 2 года назад +2

      Look at gaming. I assume that's what both you and I are really interested in, not Blender/Cinebench. Look at the power consumption at 11:40. TLDR: per unit of performance, Intel is bugger-all different vs AMD in power draw while gaming.

    • @bltzcstrnx
      @bltzcstrnx 2 года назад

      @@fredfinks not for people who do use Blender. For them the difference is almost 2x.

    • @fredfinks
      @fredfinks 2 года назад

      @@bltzcstrnx Yeah, but they wouldn't buy it. Regarding the OP and his complaining comment, what do you reckon he's doing with his PC apart from porn and games? Like the 99% of us here. We are gamers, browsing, and may stream or do the odd video encode. Per-frame performance, AMD and Intel are on par in power consumption.

  • @malifestro3319
    @malifestro3319 2 года назад

    Wow, the power draw on that is insane. I am not sure how many people realize that with a modern GPU and one of these KS processors you would be exhausting a serious amount of heat. That is around 40% of the BTU output of an electric space heater. Or, said another way, near half a circuit breaker.
    For the average user it is marginally better than the 5950X at double the power draw. The 12700K seems like a good option too.
    Steve - would it be possible to start a new series on best performance per watt? Would it even make sense to consider power draw, given the idle watts most systems run at during non-work times? Something to noodle on.
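The heat arithmetic in the comment above checks out with the standard conversion of 1 W ≈ 3.412 BTU/h. The 1000 W space-heater rating below is an assumption (heaters commonly range from about 750 W to 1500 W); only the 426 W figure comes from the review.

```python
# Heat-output sanity check for the figures above.
# 1 W = 3.412 BTU/h; the heater rating is an assumed typical value.
WATT_TO_BTU_PER_HOUR = 3.412
system_draw_w = 426                  # peak total system draw from the review
heater_w = 1000                      # assumed small electric space heater
system_btu = system_draw_w * WATT_TO_BTU_PER_HOUR
heater_fraction = system_draw_w / heater_w
print(round(system_btu))             # → 1454 (BTU/h dumped into the room)
print(round(heater_fraction * 100))  # → 43 (% of the heater's output)
```

Against a larger 1500 W heater the fraction drops to roughly 28%, so the "around 40%" figure depends on which heater you compare against.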

  • @StephenDeTomasi
    @StephenDeTomasi 2 года назад +17

    Part of me wants to see how badly this thing would throttle on an Intel stock cooler

    • @wayland7150
      @wayland7150 2 года назад +1

      Maybe but it would be a sad thing to see. I'd want to make it up to the CPU afterwards by getting it a really expensive water cooler.

    • @RaneBoDasch
      @RaneBoDasch 2 года назад +1

      The 12900KS uses integrated BAC (Bitch Ass Cooler) prevention. It will boot loop if it doesn't detect a cooling solution valued at least 15% the cost of the cpu.

    • @nathangamble125
      @nathangamble125 2 года назад

      @@RaneBoDasch Some say that the i9-12900KS has the potential to run at 3x the speed of the i9-12900K, but only if it is able to verify that you aren't a filthy peasant by detecting sufficient RGB within your PC case, otherwise it will deem you unworthy and run at the same speed as the i9-12900K.

    • @RaneBoDasch
      @RaneBoDasch 2 года назад

      @@nathangamble125 You should theoretically be able to trick it if you throw a bunch of confetti and glitter in your pc

  • @bryanaustin8362
    @bryanaustin8362 2 года назад

    These thumbnails show an evil demon who no one should mess with but then the video has a cute cuddly teddy bear you want to buy and give to your mother on Mothers Day.

  • @izidor
    @izidor 2 года назад +7

    426 Watt?!...no thanks.

    • @masx4813
      @masx4813 2 года назад +1

      even bigger than 3090 does 🤣

    • @syrynx4454
      @syrynx4454 2 года назад +2

      @@masx4813 it's total system draw, but even so it's absurd, because that means the CPU draws 300-320 W. With next gen, we might be looking at 1000 W power draw... Imagine living in a country where summer days are 30-40 Celsius. Might as well run that AC at full throttle while gaming.

    • @masx4813
      @masx4813 2 года назад

      @@syrynx4454 yeah, and furthermore we'll need a power plant to run a Core i9 with an RTX 4000 series

  • @eugkra33
    @eugkra33 2 года назад +1

    I'd like to see some memory benchmarks for the 5800X3D when you review it. My suspicion is that with 3800 MHz low-latency RAM on both the 5800X and 5800X3D, the performance gap between them will shrink substantially. Mainly because the larger cache means less reliance on RAM speed, while the smaller cache means the 5800X has more to gain.

    • @asmi06
      @asmi06 2 года назад

      Most of the memory bandwidth is utilized by PCIE devices, not CPU, so cache won't help much with those tasks.

    • @mapesdhs597
      @mapesdhs597 2 года назад

      @@asmi06 It depends on the task; AE uses mem bw extensively and could thus gain a lot. 10 years ago an AE test I was provided with by an artist was gobbling 40GB RAM. AE is perhaps unusual in that it can hammer all major parts of the system at once, but nevertheless it's a popular application.

  • @dracer35
    @dracer35 2 года назад +3

    Sadly my favorite game is limited by CPU single-core speed. The 12900KS would give me a bump in performance, but at around $800 it's just not worth it.

  • @MrDutch1e
    @MrDutch1e 2 года назад

    We need a tuned 3600+ DDR4 vs tuned 6000+ DDR5 video. A lot of work, but it's key. I've seen some other channels that heavily tune memory show essentially zero gains with DDR5 when both are tuned, since B-die DDR4 is so much more tunable than any DDR5.

  • @sdtx1112
    @sdtx1112 2 года назад +4

    With these latest offerings, Intel and Nvidia are just trolling at this point.

  • @shicocc
    @shicocc 2 года назад

    Is the intro music original or a short part of a full song?? It's so relaxing please tell me there's a full version

  • @HeliumFreak
    @HeliumFreak 2 года назад +6

    I know you are doing this for "CPU Performance" which is why you test at 1080p. But it would be nice to have some 1440p results in there as well. I know this pushes it into the more GPU usage category. But as a gamer who games at 1440p it would be nice to know what, if any difference, a new CPU can make to my FPS.

    • @rowdy1003
      @rowdy1003 2 года назад +1

      CPU choice matters less in higher resolutions so unless you're 2-3 generations behind a new CPU won't make a noticeable difference to your FPS.

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +2

      It also depends on the GPU used and quality settings, so 1440p alone doesn't answer all the questions. So for a CPU review it's just easier to stick with the semi realistic CPU limited testing.

    • @HeliumFreak
      @HeliumFreak 2 года назад

      @@Hardwareunboxed Sure I guess that is one aspect of it. But on the flip side. Who is spending this much on a CPU and running anything at 1080p? I would also argue that anyone spending this much money on a CPU will more than likely have a top end GPU as well. So yes, the 3080 test at 1440p with the 12900ks would also be relevant.

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад

      It's also about showing future performance, not just the here and now. Also we're using high quality settings, the margins will be similar with a slower GPU using low quality settings.

  • @Michael-Ray
    @Michael-Ray 2 года назад

    What does the S stand for in KS? Super? Second edition?

  • @Medsas
    @Medsas 2 года назад +4

    where are the 5800x3d results? this is clickbait…

    • @BeatsbyVegas
      @BeatsbyVegas 2 года назад +1

      learn how to read the title. 5800x3d doesn’t come out until April 20th.

    • @Medsas
      @Medsas 2 года назад +2

      @@BeatsbyVegas the title specifically mentions it's a counter to the 5800X3D without offering any comparison in the benchmarks; that's the definition of a clickbait title, bro
      also, street release dates don't mean anything to reviewers, they could easily have gotten review samples, so it's not a good excuse

    • @BeatsbyVegas
      @BeatsbyVegas 2 года назад +1

      @@Medsas That’s not how review samples work. You can get it early, but you can’t make the review public until embargo date. That date is usually either launch day or 1-3 days at maximum before launch.

    • @Hardwareunboxed
      @Hardwareunboxed  2 года назад +3

      @Medsas don't be dense. We're saying this is Intel's counter to the already announced 5800X3D.

    • @Medsas
      @Medsas 2 года назад +1

      @@Hardwareunboxed I specifically clicked this video to see a comparison with the 5800X3D, as that is what the title is saying, and I'm sure I'm not the only one who did. This title will be especially misleading in a month, when the 5800X3D will actually be available and people will click on this expecting a comparison.

  • @odizzido
    @odizzido 2 года назад

    I am most excited to see the power draw. I expect good(bad) things from this.

  • @ronitdutta5499
    @ronitdutta5499 2 года назад +4

    12900k/12700k + DDR5 is the clear winner. Zen3D is inadequate for taking the crown

    • @fredfinks
      @fredfinks 2 года назад

      No, still DDR4. Unfortunately Steve used 3200 DDR4 and, at the other end, extremely expensive top-end DDR5. Get fast and affordable DDR4 (e.g. 4000 CL17). Normal, but still much pricier, DDR5 is worse than good DDR4.

    • @ronitdutta5499
      @ronitdutta5499 2 года назад

      @@fredfinks 12900k and Zen3D are for the best. DDR5 is an advantage

  • @MrDutch1e
    @MrDutch1e 2 года назад +1

    I don't think Intel's worried about the 5800X3D. This is just to take back some of the media coverage.

  • @vaggeliskosiatzis5487
    @vaggeliskosiatzis5487 2 года назад +7

    It consumes twice the power of the Ryzen 9 5950X and still loses in most workloads... to a CPU that is 2 years old... what a shame of a company Intel is nowadays. AMD is the king from an efficiency and performance perspective, by far, and I don't think that will change in the future, because Intel seems incapable of it. Even though the Alder Lake architecture is good, it's nowhere near as good as it should be, unfortunately...

    • @Supcharged
      @Supcharged 2 года назад +1

      It literally beats the 5950x in most tests in this very video? Lmao

    • @vaggeliskosiatzis5487
      @vaggeliskosiatzis5487 2 года назад +1

      @@Supcharged only in Adobe, and it even loses in decompression to the Ryzen 9 5900X… you can say it wins in gaming, as it should against a 2-year-old architecture, but on the productivity side they are losing really hard to AMD, and all that while using twice the power… that's embarrassing…

    • @chriswright8074
      @chriswright8074 2 года назад

      @@vaggeliskosiatzis5487 the 5950X hits 25-30k in CB23; I don't know why his numbers are so low

    • @vaggeliskosiatzis5487
      @vaggeliskosiatzis5487 2 года назад

      @@chriswright8074 with Curve Optimizer it could reach that number for sure... I have a Ryzen 9 5900X, and with Curve Optimizer I get 22859 points multicore, the same as an i7-12700K, while using 30% less power, with much better productivity performance and the same gaming experience... but for people who have to buy now, either the Ryzen 9 5900X or the i7-12700K is a good choice, with the discounts AMD has and the overall cost of the motherboard-and-CPU combo being more or less the same... but the i9-12900K, let alone the KS part, are useless products...

    • @fredfinks
      @fredfinks 2 года назад

      No not at gaming. How many of us use blender and cinebench?

  • @alexhin4083
    @alexhin4083 2 года назад

    2:28 watch out, i think you left your 3090 Ti benchmarking in the background while you shot that scene

  • @steamstories1279
    @steamstories1279 2 года назад +4

    96MB of L3 cache won't increase performance by 100%, so even the "old" 12900K will be much faster than 5800X3D. Just watch the video, 12900KS is almost twice as fast as 5950X in some games.

  • @bonnome2
    @bonnome2 2 года назад

    2:00 Wait, that is an engineering/qualification-sample CPU.
    How did you guys get that one, and why did you show it?

  • @Time4House
    @Time4House 2 года назад

    great review as always! Id like to see in 1440p if the difference is enough to upgrade from a 5950X

    • @imo098765
      @imo098765 2 года назад +1

      I guess you could look at the normal 12900K benchmarks, as they're basically the same performance

  • @simon3461
    @simon3461 2 года назад

    Is this CPU fast enough for 4K gaming ?