A note on Ryzen on Windows 11: We did not experience the CPU-swap performance degradation bug in our testing. Our Ryzen bench was running a fresh install with the patch and chipset driver from the get-go; It was actually geared for Windows 10 if we couldn't get the patch + driver in time for testing, and wiped for the fresh install once they became available. -AY
@@penetrarthur it's about comparing the performance the typical person will get out of that chip. Overclocking, power matching, tweaking, liquid nitrogen cooling, those will make them compare the limits of the silicon for sure. But it doesn't help you decide what CPU actually fits your needs and budget.
The year is 2084. I lie on my deathbed awaiting the news I’ve been looking for my entire life. The alert comes through. “Intel has just announced it’s 69th gen processor.” Finally, after all these years, I can die in peace. “Nice.” I mutter to myself. Before drifting off.
Before Ryzen, Intel's financial incentive has been to improve products as little as possible to maximize profits. Now their financial incentive is to make as good products as possible to surpass AMD's performance and to stop bleeding market share to them. So now we have two companies with the financial incentive for better products, which is awesome for us consumers!
@@karanjoshi2662 There are also only two companies left who can manufacter cutting edge chips. And one of them only can in theory, and in reality appears to be behind.
@@tiborsejk1921 lmao please... AMD started to get dominant after 3rd gen and literally buried Intel alive in 5th gen. I'm glad Intel finally shaked themselves and being true competitors, this will be a benefit to us. But sometimes we need to accept things.
@@AlliaNCeRUclips Which dimension we talking about? An amd fanboy trying to "accept things" :-D Omg made my day. Amd was never any competition for gaming to Intel.
@@juveboy01033 I would prefer 1 or 2 less FPS to not having an heater inside my computer case. Just to suck out a little bit FPS they were making their CPUs like they got thwir IHS from magma. 12th gen is also hot but in lighter workloads like gaming they perform so well. If i was an AMD fanboy would i say i'm happy seeing Intel got into leadership again? Shut up please..
Would really love it if you started including an "Unreal Engine Developer" benchmark. Just loading the free "city park environment" can take forever due to all the shaders being built. Generally more cores means faster loads. Would love to see a benchmark on the "cold" load time of a map like this, as well as one showing the full rebuild of the map!
you expect them to do productivity benchmarks that people actually use rather than a bunch of stuff 1 person hiding away in a basement somewhere runs just to say "nice".
@@erin758 True there is a lack of such a channel I think. But for hobbyists like me with Potato pcs, Blender benchmarks are the way for me to imagine the processing power.
Damn, those thermals are rough. I'm impressed at the performance numbers, but I feel like the 90C+ temps and 230W power draw are going to make me go to a 5950X anyways.
The power draw and high temps are a no for me as well. Sure it's faster, but that doesn't make it "better" by any means. Hopefully it's a push in the right direction though
i doubt 99% of PC users will even use the CPU in a way where those thermals will matter tho, in gaming (and probably streaming too since it's not too taxing either) you'll be at industry standard temps. I really would only worry if you work in CAD etc etc, but those people most likely weren't looking for a upgrade anyway
@@O26. This. Max thermals are only a piece of the puzzle. If you're never going to be crushing every core with full load in your day-to-day use, it's only a limited issue at worst.
"Unix-like" makes it sound like you're embarressed of your target platform. Mac perhaps? Then again, it could also mean "I develop for everything but windows", and if so I salute you.
@@nobodyofconsequence6522 It they were using a Mac, then they wouldn't be interested in an Intel or AMD processor, would they? So no point in writing the initial comment and almost for sure they meant that they write code for everuthing that is Unix-like.
@@Tiberious_Of_Elona well the cool thing about efficiency isn’t just about reducing cost it’s also the power draw in total. The less power we consume from the grid the better.
@@Tiberious_Of_Elona funny enough, we all had to hear people whine about amd being less refined and eating up more power, but like always when intel does it, it's ok, negligible, just a few bucks more, when amd does it, well, crap amd, unable to deliver, intel the best! Get your crap sorted out dude.
@@lupuradu My fx-8350 feels kinda cool compared to the new CPUs. Only drawing about 180w when severely overclocked and keeping it under 70 degrees on a Noctua NH-D14
Hot damn, that wattage and thermals. Still though, as someone who is currently running a 5800X, I’m super happy to see Intel finally making real improvements this generation. I’ve never been actually seen real competition in the tech space like this.
"that wattage and thermals" That is the 12900K overclocked or infinite turbo. In real world, the use is way lower due to Big-Little design, you overclock a 5950X (infinite turbo) and the power draw and heat is the same (around 220 or 240 watts) but the 5950X has 20% lower IPC and lower clockspeed.
@@saricubra2867 In the real world, the 12900K is 3% faster in gaming, and with all benches combined it just tops the 5900X, but is still 9% behind the 5950X. And that is while it is using DDR5, that accounts for a 7% increase in games and overall. So I'm excited that Intel made this jump, like really excited! But I'm not impressed, where the 12900K shines in benches, it's also using way more power, almost 100% more power in fully MT loads and 35% in mixed loads. Even 8% in gaming, the only good this is that Intel at idle consumes 7watt less. I might want this chip for the single-tread performance, that looks amazing, but other than that, it's just bruteforced performance with the DDR5 uplift and a major increase in power consumption over the competition.
@@saricubra2867 LMAO, the 12900k is nowhere near 20% higher IPC. Did you even watch the video? Also, if the 5950x has the same power draw after "infinite turbo" with 8 extra full size cores, that definitely not a good thing for the 12900k. Especially since the 5950x beat the 12900k in multiple benchmarks *without* "infinite turbo."
Will be interesting to see the DDR4 benchmarks, also now that Intel has enabled all core boost, would make sense that the AMDs would have that enabled also in the testing.
This was long overdue. Finally improvements from Intel. Thank you AMD for bringing competition to the market. I can't wait to see how both will improve their cpus in the next 2 years.
@@noleftturnunstoned So we should handicap performance on new products because you're too broke to afford new parts? You're the joke here for thinking that lmao.
@@ImAzraa I'm using W11 on my laptop. Its 'fine' I guess. It did however bluescreen my desktop (i9-7940x/Prime x299 MB) so I won't be moving to W11 for my main rig for a while.
I'd love to know how much of this is just because of the ddr5 though. Is it really Intel doing well or is this is having an upgrade in the ram department? This will be very important to see how easily AMD will be able to come back with their next-generation. Also seriously those temps though
There are DDR4 benchmarks available on plenty of sites. In the ones I've had time to check so far, DDR5 wins out in games and productivity, but not by massive amounts. (DDR5-6000 CL40 vs DDR4-3600 CL16 btw.)
I'm honestly quite curious how the Linux kernel will handle this. It already has great scheduler support for ARM big.Little. I'm not sure if the code architecture-specific and needs to be rewrite for 12th-gen Intel though. We'll just have to see!
The kernel did not get patches by Intel yet to address this architecture and optimizing its scheduling. So it will probably take some time until we see Intel getting ahead on Linux because currently AMD is still your best option with Ryzen in most tasks and the patch/merge window for version 5.16 of the kernel seems closed already (5.16 might get ready in December).
I’m honestly impressed that the 5000 series is still on the heels of the 12th gen CPUs, very exciting times in the CPU Market. What a time to be a computer enthusiast, AMD, Intel and Nvidia are bringing so much competition and it’s amazing.
Have to agree. I use my 5950x for rendering, and the figures from Intel here made me consider a change. Until I saw that the Intel was running at 90C stock! I overclock my amd chip, so I get better results than the i9 while still running cooler and for slightly less power draw. So I'll stay where I am for now. But alderlake version 2 next year might be a game changer. But then amd will have their next chips out then too. Exciting times to be a nerd!
I'm just gonna wait and see what AMD's next generation of CPUs will be. Took Intel long enough to be able to compete again after getting kicked in the dick by Ryzen. Although I do wonder how much of the performance is from the DDR5 and Windows 11 sucking off Intel's new core architecture. I'm still gonna stick with my Ryzen 7 3800x until it breaks though, just because I need a new GPU before a new CPU.
What I would like to see is a breakdown of actual task scheduling and how the efficiency cores compare to the performance cores on a granular level. Something that really showcases the difference between heterogeneous cores vs the traditional homogeneous cores.
@@cnhtol1586 Like Intel didn't steal this power core design from ARM? X86 in the long run will be a dead architecture anyway, the future is RISC, even for desktop.
@@super_slav91 but we’re long away from the future on the idea of performance cores didn’t come from arm first. RISC-V is long away before they are gonna be used as desktop chips!
@@cnhtol1586 ...Every manufacturer does that, it's the whole thing about competition, every advantage will be recreated and to get ahead you need to innovate to get ahead. Like Ryzen started giving more cores to consumers so Intel has started doing it. Apple paved the way for desktop ARM chips and Intel is copying that. Etc etc
I find it a bit worrying for the "Intel advantage"(despite disclaimers at start) when using windows 11 AND DDR5, as the vast majority hasn't made the leap for either yet... And will probably not be representative for most users experience.... Perhaps another video with windows 10 comparison?
Windows 11 will be the standard in a few years (not because I think it's good, but that's how Windows is) so if they wanted this video to age well it wld need 11. And yeah benchmarks take a long time, they can't test on both with so little time before the embargo lifts.
Hardware unboxed has done a bit of benchmarking with win10, results are it's not that off, compared to Win11, just a bit varied from task to task. DDR5 is a genuine advantage for them, but Win11 is not exactly boosting perf in general, it just enables BIG.little to work properly. And since they are advancing their chips themselves on that front, I consider it still a win on merit.
Not even close, they were paid to compare ddr4 to ddr5.... LMFAO! and you ate it up like factual evidence.... Go put an iontel rig on ddr3, and compare to ddr4 with AMD. Youd see similar results lmao.
@@crisnmaryfam7344 Relax buddy, you can see DDR4 and DDR5 test on the internet, this is a good product and both cores are impressive, AMD still more efficient and competitive in performance but that doesnt make this Chip-Lineup bad. I'm fairly impress
8600k and 9600k were lacking even on their respective launch days. Still, they're the exception to the rule and "an i5 is good enough for gaming" has been true more often than not.
@@crisnmaryfam7344 so you are saying Intell should have to have their performance hindered because their competition doesn't have equal feature sets? That's absolutely idiotic
@@tomd4748 they were paid to compare ddr4 to ddr5.... LMFAO! and you ate it up like factual evidence.... Go put an iontel rig on ddr3, and compare to ddr4 with AMD. Youd see similar results lmao.
Given this is the "best case scenario", with Win11 and DDR5, I can't see it being better value than AMD's offerings, at least not while DDR5 is still priced as high as it is. Still, it's great to see competition again!
True. Though to be fair, this competition over the last years already gave us extremely good value on CPUs. It's very tough for Intel to keep up with AMD now, so I'm glad they're at least in the ballpark still. Without this release, AMD might have kept higher prices on their 50 series CPUs.
@@mitchjames9350 Yeah but the new intel chips still beat out the ryzen chips in just about everything on windows 10 too. And what we always see with brand new architecture is over the period of a year the performance gets better with OS and driver updates.
pulled the trigger and went full ddr5 and 12600k, from an i5-7600. What a difference! When the new gen GPU's, Direct Storage and resizable bar comes around, its going to be nuts!
I try to avoid having computers that feel like blast furnaces these days, so if I want multicore madness I think I'm sticking with Ryzen 9 over Core i9. But man, that new i5 is looking pretty sweet. If I was going to build a new game machine, I'd be giving that one a HARD look.
The temps and power draw though. So it's like a much faster Bulldozer :P That said, it'd be interesting to see how much improvement 3D V-cache makes to AMD and what their next gens are like
Like the old pentium4s that used to melt sockets. (edit - seriously the people saying power doesnt matter - your individual power user overclocked CPU doesnt mean a thing, millions of prebuilt systems coming off the shelf using that kinda power does. & The idea of this arch in data centers is nuts.)
I'm super pumped for the Zen3D chips as well. I'm running a 3700x at the moment, so I honestly have no reason to upgrade at this point, but I'm glad I at least have the path if\when I have the money to do so. I'm just glad there is some proper competition between these two again, up until Ryzen came out, shit was a snooze fest, and prices were NUTTY asf.
Doesn't California ban several game systems now due to their power draw and them blaming the computers of global warming.. Imagine what will happen regarding these chips and the systems they come in..
Agree, even if MS claims they "fixed" Ryzen issues it's still a brand new OS that was made for Alder Lake. It's bound to be plagued with bugs using other architectures.
@@sirs4878 with equivalent number of cores at max load - Intel. Idle power consumption is 6 times lower on Intel, and average power consumption during gaming too.
@@Baka_Oppai don't forget about that part that involved Intel playing dirty games in order to bring down the AMD. You want them to fight over your money, but legally, not by paying OEMs to handle miserable jobs.
@@saricubra2867 If you that were the case, then nobody, even Intel, would be clamoring for Apple's M1 efficiency. These CPUs are still a burning furnace but it is a welcome change for Intel.
You should have included DDR4 benchmarks too, as most people aren't too keen on spending $200 on only 16GB of RAM. And I don't think the 12600k is better than the 5600x (from a value perspective) for the reason that boards are much more expensive, you need a beefier cooler and PSU, and I don't imagine that it'd be all that great with DDR4.
That same scenario happened between DDR3 and DDR4, users hung unto 3 even after all makers switched to 4 lowering its cost gradually due to competition while demand on 3 raised its cost and we ended up in a weird situation for a bit where 3 costs more than 4! 😅
well that depends, if there is performance gains then yes people would be willing to pay more for ram it happens all the time it just came out and you are making generalizations on price? I have seen ~$180 alder mobos already, so value boards should be coming beefier cooler, sure but is negated by the price difference between the chips psu? nope, wattage is determined by the gpu you want to run
True. DDR5 played an important part In helping intel beat old gen DDR4 Ryzen cpu. Also since it's not just a simple cpu swap , the cost of DDR5 ram and new mobo will make it a lot expensive than rayzen. But linus here didn't focus on these points for reasons unknown. Maybe he believes that he will help competition by being a little biased .
Have to agree. To even the comparison as much as possible, platforms should have both been DDR4. This review left me wondering how much the performance with Intel was due to DDR5,
And then we have another question. Can you even get everything at the market? I bet some areas are cleaned off pretty fast and 90% of the people who want to upgrade can't because of no available parts ( mobo + ram + cpu + new cooler because of new layout for the screws). You cant upgrade missing even 1 part.
Buy i5 2nd 3rd gen lga 1155 and gtx 1050 ti 1060 or i7 2600 , Maybe i7 2600 is better than 4gen intel i5 . Buy from me hehhe soon I will sell new pcs with ryzen 5 cpus
@@Whyrureadingthat That's a combined TDP of the GPU and CPU. The CPU runs at 30-35W at full load and absolutely crushed every chip in existence in terms of performance per watt.
@@Whyrureadingthat 12th gen mobile CPUs are not going to be 45W. 11th gen mobile CPUs are already hitting TDPs upwards of 75W under full load. Another thing to consider is that the M1 Max comes close in performance to the 12900K that runs at 250W on full load. That performance per watt ratio is insane.
@@avinashanish3350 Dude, are you seriously comparing Arm chip to X86 chip? Your silly little M1 can't even encode AVC. Sure, there is power, but the M1 is missing TONS AND TONS of instructions and features, making it a mobile chip. This is like comparing a stripped sports car with no features or comfort to a tesla. You have no idea what you're talking about. Think about why respectable companies, cloud-services, etc. are avoiding stuff like Apple M1 chips even thought some youtuber said it's good at playing youtube videos. EDIT: I should elaborate. By no means M1 is bad, but 12th gen intel and M1 are two completely different worlds and are used for different things. Good luck hosting and playing minecraft on your M1, it can barely reach 100fps while intel goes into 1000+.
The other thing I've noticed after checking prices today that it's ungodly expensive to get ddr5 memory and to get a compatible motherboard so the platform cost as of right now is going to be more than double than the cost of getting a current generation AMD or Intel setup. I like to see how things pan out over the next month to see how the real cost of motherboard and memory add to the platform cost of alder lake. I mean hell, just check new way right now and it would cost me $770 to pick up 32 gigs of ddr5 memory. That is absolutely insane when the memory cost almost double the CPU price. The motherboard cost from what I can see so far is around 450 to $500 at this moment in time. Which means you're going to pay a hefty penny for that kind of performance and as far as I'm concerned unless I see something more reasonable there's absolutely no reason to spend so much extra money for such a minimal performance difference when you look at it overall. Definitely the new adopter tax in full swing. Be patient guys and let prices drop a little bit at least.
32GB of 5200MHz RAM is $280. Still really expensive, but not as expensive as you’re saying. Sure there’s nothing really in stock, but I wouldn’t be surprised if it gets resolved somewhat soon. NAND hasn’t been hit as hard by COVID as other industries.
Not to mention the performance increases. A year down the line, I wouldn't be surprised to see ram hitting 7000 mhz, for half the price it is now. Not to mention, Ryzen will likely soon have their 6000 chips out. 5nm on 5ghz will be a compelling argument to stay on team red.
I’m not worried. Since I don’t have a fixed income, if the prices never drop, that’s just another reason for me to quit gaming altogether. I’m already spending enough time in front of my screen as it is. Maybe it’s the perfect time to quit.
Exactly, I game at 4k 120hz and this is why I bought an 11th gen just 2 weeks ago. Ram, CPU ( I7 ) and mobo cost me $1400aud. The same amount of ram alone is $750aud The most ive seen the CPU used is 20%
Great news for competition, more so if it starts to drive prices down instead of ever increasing price inflation like AMD/Nvidia. I would've liked to have seen you use DDR4 on some of the benchmarks for an apples to apples comparison. Regardless, when natty gas goes through the roof this winter I can at least heat my home with an intel chip.
I thought the whole point of the of the E-Cores, was to help with when you're running apps in the background? but no where did you test having background apps running vs another CPU without E-Cores
Would like to see some comparison in the audio recording field as I would love to work with +- 500 - 700 tracks with alot of different effects and vst Instruments
Exciting! Will say I wish you tested with DDR4. The price puts it on a totally weird, whacko comparison no matter the performance, and God knows there's some serious benefits to it but yea. Totally screws with the price/perf.
DDR4 performance is the same as DDR5 at much lower cost. It just isn't worth investing into DDR5 right now because when you next upgrade, current DDR5 kits will be real bad, same as what happened with initial DDR4 years ago.
would be awesome to see some thermal results in some hot areas like, 30-35C room temp to see how much difference it make an air and an watercooler cheers from Brazil
When testing the 12900k on win 10 I got Cinebench R23 Score 2000+, in fact I saw the whole i9 perform much better in Win 10. Interesting to see these results though... 🤔
Interesting. Somehow I thought that releasing a Win10 video would just be a "Shhh" for AMD fanbase, but now that you mention it, it might be even more interesting.
@@hwstar9416 Intel didn't do a new architecture, they copied ARM from 11 years ago, well "borrowed the idea", without a full schematic of the chip we won't know how close the new chips are to ARM's big.LITTLE architecture
I'm curious if the numbers are actually accurate, considering the fixes for Ryzen on Windows 11 go out the window and require a fresh install when you swap out the CPU even once
@@lilkittygirl They've had the 12th gen chips for up to several weeks now (stated by the 12th gen ES video) and have been holding this video from release because of NDA. They also stated the second patch had just barely come out when they started testing, and the CPU swapping issue was only found 3 days ago.
I'm really interested to see the i5 running ddr4 because even though it looks like the obvious go-to, I'm not so sure if it's worth the extra money spent on the motherboard.
@@giornikitop5373 Yeah just depending on how the auto overclocking and scheduler are set up. I'm sure most lower tier boards would fry in the sustained benchmarks at this power draw!
If you ever upgraded your RAM in the middle of DDR5 era, any savings you get will be obeliterated when you really need to upgrade your CPU, and therefore your mobo and RAM to actual DDR5.
Good for Intel for doing well. In some countries outside the US, Intel has the better value per performance or very slight difference for the cheaper price. AMD is still saving Zen 4 for Ryzen in the coming months.
I give credit to Intel for making a major architectural change, which is the equivalent of turning a battleship at a company like Intel. But the power & temp graphs say it all, don't they? Intel got this slim performance edge at the high end by running the chips hot.
I get the impression the Ryzen chips could keep up if they were set to maintain all core boost clocks, like the 12900. I wonder how the i9 will overclock if it's basically already completely unlocked.
@@NightshiftCustom The chips are competitive at lower power, so it’s not purely a question of power. We should be happy that both companies are putting out very competitive silicon. It’s good for customers.
@@davepastern you didn’t watch the video. The Intel performs better and uses less wattage than the AMD 5950x.. maybe you’re just blindfolding yourself because you’re a AMD diehard fan
As someone who always used Intel chips, one of the main things that made me switch to my current 3600x was the simple fact I didn't need a whole new chipset/motherboard to upgrade the cpu everytime. While there will be benefits adopting the new platftom early as mentioned, the extra cost of even just a motherboard every generation outweighs the performance benefit as of right now.
@@Sandriell hehe I chuckled when i read that defence by amd's fan everytime, the only thing they benefit is the fact that they can use the same cooler, nothing else.
@@Sandriell Valid, I do intend to upgrade to a 5000 series. However it's less about the specifics and more about the trend. Intel has inherently had far more chipset iterations and 1-2 generations of cpu's compatible, usually with very mild improvements before needing to upgrade motherboard again. The fact that my B450 motherboard can run 1xxx, 2xxx, 3xxx and 5xxx cpu's with at most a bios update is the thing I am more trying to convey.
@@Phillz91 "yEaH bUt 6xxx WiLL nEeD a NeW MaInBoArD". 4 gens of CPU on the same AM4 slot and on the 5th now it's all of a sudden "well AMD needs new mainboards too". I'm in the same boat as you... was always a huge Intel fan and only ever used Intel CPUs but their need for new slots is ridiculous.
Hey! Great Video!!! I have a quick question sir. How long did Intel say they would stay on this platform? And as AMD Said how long they would be on the next platform? I have a 5900x now. But really want something I could put a new processor in a few years from now. Thank for your help
I would love to see a comparison between how zen 3 and 12th gen handle background tasks while gaming, in theory the efficiency cores should take care of light loads like discord and chrome, leaving the P cores to run games unencumbered. Would be an interesting comparison
The problem is that everyone is assuming that the scheduler will always make good choices. Jay has already seen one instance of a benchmark being shuffled onto E-cores by mistake. What about anti-virus. Background yes, but it often blocks other things running while it does checks. Do you really want it on E-Cores and everything stacking behind it while it takes its time? Lots of unanswered questions, lots that can go wrong. Trusting Intel and Microsoft to get it right on the first try (or second, third and forth) seems a little optimistic.
Worth keeping in mind the Intel results are also with DDR5, which puts this more into perspective for me. If you're still on DDR4 then the results are probably even less distinguishable. Taking that into consideration along with the insanely high thermals, it is not a very compelling chip imo.
They specifically said several times, that those thermals and power usage is only at full load in benchmarks and rare professional workloads. Y'all are biased.
Wait for atleast a year or so. There's still not much known about software compatibility wrt Alder Lake. The review ignores to mention the fact that there are nearly 30-40 AAA games released in the recent past that don't work with Alder Lake. Don't listen to id!0ts who say "Don't wait".
I'm interested in seeing the lower-spec i5 and higher-spec i3 chips for business applications. I bet maxing out at 2 p-cores and then having e-cores for the rest would make for some smooth office/warehouse laptops.
I think they already achieved that with 8th generation professors. Unless you're talking about the Y or U series then that would be really interesting to see.
i3s and low end i5s (12400/f) probably won't have e-cores but the performance would be amazing for the price since the only good low end CPUs last gen was the 11400f and 10105f
@@viztiz316 maybe not this generation since the process is so new, but possibly if e-cores become cheaper then we could see those processors on the bottom of the stack in the future. We also have to wait and see what AMD swings back with.
@@lyrebird712 "possibly if e-cores become cheaper then we could see those processors on the bottom of the stack in the future" Yea "what AMD swings back with" AMD will release zen3d(7nm) and zen 3+(6nm) cpus on Q1 22 for AM4 to compete with 12th gen Zen 4(5nm) will release for AM5 in H2 22 Zen 5+Zen4d(3nm+5nm) will release in H1 2023
That spec workstation benchmark is cool! @6:13 I do computer repair housecalls and that's actually really helpful to know to answer some questions i have about performance for day-to-day apps rather than just games! Thanks LTT!
This is fantastic, but I'll stay for a couple more years with AMD out of sheer loyalty. They are the reason why Intel is doing what it's doing right now. And I also don't want to forgive intel too soon for fucking me over all those years.
Don't worry. While performant again, the power draw and the wacky efficieny core idea from INTEL at full load is imho a lame hack to catch up. While OK for some, its not for me. AMD will soon bypass this short INTEL episode with advanced RYZEN versions. They just pumped theire EPIC CPUs. It will (hopefully) remain a interesting race xD
Well said. I'm glad there is competition in the market, but Intel is going to have to work much harder to lure back all those they milked for years. I just bought R9 5900x and Intel can go fuck them selves with their new CPUs, I don't care how much faster they are.
@@NedoBoi - Exactly. Up until the RYZEN CPUs appeared, i only bought INTEL. But this greedy crap the did in the recent past, did it. I usually buy upper middle tier CPUs. So i don't give a F*CK if INTEL is fastest or not. As long as price/performance comparison is right i stick with AMD for the near future. And power draw matters a lot to me, as the PCs can have a cheap, simple and quiet cooling solution with fans.
yall act like you only use your pc for blender 😂. If you just play fortnite all day, you cant sit there and say amd is a better value, because it just isnt. If you actually use your pc for serious productivity, you wouldnt be looking at comsumer chips anyway
@@jahterminatedlmao5473 if you want to play fortnite here and there you don't need any of the CPUs mentioned in this test, you can get by with a 4 core i5 from 3-4 years ago
Really curious how much of the improved performance is the jump to DDR5. Can’t wait for the DDR4 testing, as it should better isolate just the CPU gains or lack thereof. Also I’d like to see Windows 10 tested vs Windows 11, as 11 is still barely used by anyone and is unstable as hell.
ruclips.net/video/WWsMYHHC6j4/видео.html here it cover both ddr 4 vs 5 and win 10 vs 11, unless it ram bandwidth benchmark it makes only 2% difference average, funny some games run better on ddr4, note this video heavily bottleneck benchmark on gpu, this make no advantage on intel cpu.
@ BIG BeAR - Wait until the 2nd or 3rd wave of DDR5 when timings tighten up. You will be exceptionally surprised at the performance gains. DDR5 handles data so beautifully.
Which currently makes the LGA 1700 platform already terrible for upgrading down the road if you buy the i5 now, because 12900k doesn't look very compelling.
@WILDENZ I have to use my machine for work so I'm just spooked that updating will break everything. If I had a dedicated gaming machine I'd already have updated to fuck around with it.
Not only price, but thermals too, the performance is amazing, but I feel like they just boosted up power to the sky only to get that "we has best processorz" award
Same problem as in the past: only enthusiasts care. All these years of Intel delivering worse processors and yet AMD has made literally no inroads into the consumer market. No prebuilts on the shelf, no laptops on the shelf, Intel everywhere. We'll be back to 2005 in a jiffy.
@@crisnmaryfam7344 And a, what, 8 month old AMD product that is due a refresh in 1 months time. Yep, it's competition, but I can see some leapfrogging coming their way! :D
I really need to know, did Linus use the noctua lga 1700 bracket during testing? I heard temps can be worse on lga 1700 if using an asus board with lga 1200 compatibility, due to poor contact with the cpu chip. I'd like to see a review with the actual noctua lga 1700 bracket before I believe 90C is what the d15 reaches
I was almost regretting my recent upgrade to a 5950x until I saw the power draw on that i9. That being said...that i5 is an absolute monster if you're not in the crowd that needs a bunch of cores.
except for the $400 motherboard, the expensive ram and then top tier cooling you would need for the i5 at that point might as well buy a 5900x with all that and be in the same ball park monetarily speaking
@@Bulanesti i dont think that would be a huge problem for long. Wait awhile and the cost would probably go down by alot........ maybe not with that ram. Maybe wait a year😅
@@Bulanesti - That's overselling it a bit.. LGA1700 mobos are a bit more than b550s, but good ones can definitely be had for 250ish. DDR5 is not even helpful for most workloads currently so you can skip that. Assuming the power draws shown here are accurate the i5 shouldn't need anything particularly special for cooling - remember it stayed down in the 120-130w range, more than a 5600 but right in the mix with a 5900x.
Linus, PLEASE address audio/video production professionals. How does Intel’s new architecture compare to the previous when doing this kind of work? Using multitrack sessions in my mixing sessions, only the first core really mattered before. What about now?
I think you'll need to wait for the samples to make it to specialised reviewers. PugetBench is a good indication of things to come for video, but for audio the questions are way too many. E-cores have more latency: if a real time audio thread ends up there, there may be trouble. Sure enough, Windows 11's new scheduler was never tested with real time audio workload. It's really not a representative use case until Reaper comes out with a "Tested on windows 11 and alder lake" build. Also, alder lake runs HOT when number crunching, which spells further trouble for daws. Audio professionals need something reliable to invest on: Ryzen still is for 2021, let's see next year.
@@virgult I remember reading that long ago, the MPEG working group told people to stop using Pentium 4 to compress MPEG audio. Something about unstable timing increasing the compression artifacts or whatever.
7:53 I have an i7 12700k with a mobo that doesn't allow for overclock and when I stress test with Prime 95 I do get 239w ~ 241w TDP with liquid metal, way above what the image shows
Just as I thought the performance improvement is done by removing power limits and adding more coal into the furnace. Great if one does not care about electric bill
But it still has a significant performance advantage in gaming, where temperatures on the 12600K (the one that really matters) range from the same as the competition to being slightly higher (and very cool compared to the 5600X). That untamed power draw in all-core workloads is worrying, but it's only worrying for users who do heavy multithreaded workloads - which, to be fair to your argument, is the segment of the market that is probably most appropriate for the 12900K. I just think it's unfair to say that they're just "adding more coal into the furnace" when there's a dramatic improvement in performance per watt in most common user workloads.
Right me over here with r5 5600x and rx strix 5600xt using a total of 150 watts in my games I opt for the best effiecent parts for performance instead of most jam packed heated up tdp stuff but still is better than the older tdps in 2007
"All they did was remove the power limit" Bruh... they changed the processor node, they changed the entire architecture, they switched to having two types of cores on one chip, they drastically changed the physical size AND layout of the die, they moved to PCI gen 5, they upgraded to DDR5, and they vastly improved the IPC. They changed LITERALLY everything in this generation, and people like still roll their eyes and say "all they did was throw coal on the fire". Ffs.
@@NeoNoggie I have a Seasonic 650watt PSU; running a Ryzen 3800XT CPU, 32GB DDR4 3200 RAM, and 3060 GPU, on a B450 motherboard. My desktop ranks in the 92nd percentile of computers globally, no need to triple the budget just for bragging. I understand they have to cover new tech, but with the shortages and scalpers, stick with computers people can actually build. I've been rebuilding the same ATX case since 2006, nice to be able to reuse parts and not replace everything every ~3/5 years.
yep. intel adopted the big.little strategy but still lose out on performance per watt. This is more like a desperate attempt from Intel, trying to compete with Ryzen's superior architecture by pulling as much power as possible. Pretty sure AMD can easily crush Intel yet again if they use DDR5 and unlock the power to 240W.
@@kjc420 everyone knows heat kills electrics, why cooling is a big industry. I'll be curious how many Intel CPUs are still running in 10 years compared to AMD. I went 8 years on my last build, then upgrade for the pandemic quarantine video calls to family. Built everything but the GPU took 14 months to find a fair deal on eBay. I was running a GTX-650 from 2013 until I found the RTX-3060, sold the old GPU on eBay, somebody wanted it for half MSRP 8 years ago.
To appease the thermal envelope gods. Seriously though the core counts keep rising so having all cores as "high" power comes at a price to thermals and power usage. Focusing on half(hopefully 3/4 in the future) and making them the best you can with the extra thermal headroom can actually increase performance as I would say most common apps don't use more than 16 threads. Just my opinion tho
Go to 7:00, it prevents the throttling of the cores that are high power dealing certain workflows to the low power cores allowing it to be more efficient as a whole. At least how I understand it……..
Yeah, I'm aware of the concep, which is perfect for laptops, but on desktop... I don't get it. The underclocking of power cores (when thay are not needed at high clocks) is good enough by my opinion.
There is a push towards increasing power efficiency in desktops with california recently enforcing restrictions on desktop computers with high energy draws.
I'm mildly confused why the 12th gen chips were run with ddr5 when doing cross-platform comparisons. Since the AMD chips dont have ddr5 as part of their feature set (this gen) I would have thought that using the same (or match spec at least) ddr4 kit would have limited the number of variables as much as possible.
As Linus said, this is best-case scenario. So if you're considering running Alder Lake with DDR4, you may see a performance drop, making the purchase less worth it (albeit significantly cheaper).
This is why I have always bought an AMD CPU when possible. If AMD were to go under, there would be nobody to light a fire under Intel to keep them innovating.
So much this. Until Ryzen came onto the scene, Intel was more than happy to continue giving users crumb-sized performance increases in their chips. Once they saw an increase of AMD usage, they realized that they needed to finally do something with their products.
Absolutely right, competition and especially fierce competition is always a good thing. It both drives innovation and cost control. That's why I'm hopeful for Intel's GPUs because Nvidia sorely needs more competition.
@@signinforthebestexperience3461 What people said was that you can wait 2 years to get DDR4 that actually offers performance significantly better than that of DDR3 and the same goes for DDR5.
A note on Ryzen on Windows 11: We did not experience the CPU-swap performance degradation bug in our testing. Our Ryzen bench was running a fresh install with the patch and chipset driver from the get-go; It was actually geared for Windows 10 if we couldn't get the patch + driver in time for testing, and wiped for the fresh install once they became available. -AY
Why wasnt the Ryzen overclocked to the same wattage for the test?
@@penetrarthur sus
@@penetrarthur very sus indeed
@@penetrarthur it's about comparing the performance the typical person will get out of that chip.
Overclocking, power matching, tweaking, liquid nitrogen cooling, those will make them compare the limits of the silicon for sure. But it doesn't help you decide what CPU actually fits your needs and budget.
@@penetrarthur super sus
The year is 2084. I lie on my deathbed awaiting the news I’ve been looking for my entire life. The alert comes through. “Intel has just announced it’s 69th gen processor.” Finally, after all these years, I can die in peace. “Nice.” I mutter to myself. Before drifting off.
Nice
Good to see you here congrats on 200k
Damn
Very nice.
I hope I live to be 100 yo then.
Before Ryzen, Intel's financial incentive has been to improve products as little as possible to maximize profits. Now their financial incentive is to make as good products as possible to surpass AMD's performance and to stop bleeding market share to them. So now we have two companies with the financial incentive for better products, which is awesome for us consumers!
But we just have two companies that do everything in x86. It's not good in long term.
@@karanjoshi2662 There are also only two companies left who can manufacter cutting edge chips. And one of them only can in theory, and in reality appears to be behind.
@@karanjoshi2662 yeah, x86 is getting old, ARM is the future as proved by apple
A few thousand intel lover bois have been irreparably damaged inside by your comment.
@@karanjoshi2662 M1 from apple is arm based, and it's insanely powerful.
Seeing the 5950x/5900x with more "power unlocked" to match in intel power draw would be interesting to see.
Robbaz in the wild!
Where are the videos Robert!
Facts bro
But I heard Ryzens don't overclock that well.
@@OKNOWIMMAD12345678 CPU super saiyan showdown
As an AMD Fan, I'm very excited Intel is making a push back! We're finally seeing some innovation on their end!
You can see innovation from the 8th gen intel cuz they beating very bad your amd shit
@@tiborsejk1921 lmao please... AMD started to get dominant after 3rd gen and literally buried Intel alive in 5th gen. I'm glad Intel finally shaked themselves and being true competitors, this will be a benefit to us. But sometimes we need to accept things.
@@AlliaNCeRUclips Which dimension we talking about? An amd fanboy trying to "accept things" :-D Omg made my day. Amd was never any competition for gaming to Intel.
@@juveboy01033 I would prefer 1 or 2 less FPS to not having an heater inside my computer case. Just to suck out a little bit FPS they were making their CPUs like they got thwir IHS from magma. 12th gen is also hot but in lighter workloads like gaming they perform so well. If i was an AMD fanboy would i say i'm happy seeing Intel got into leadership again? Shut up please..
AND fanboys are always millennials. Somehow you forgot the 60 years Intel owned the market.
FINALLY SOME COMPETITION AGAIN! Lets go bois! Both Intel and AMD fans are gonna win from this
Well our only enemies are scalpers and miners
I'm SO stoked
we aint winning when prices keep going up
The virgin fanboys vs the chad good product enjoyers
AMD will wash em' out again with Ryzen 6000
Would really love it if you started including an "Unreal Engine Developer" benchmark. Just loading the free "city park environment" can take forever due to all the shaders being built. Generally more cores means faster loads. Would love to see a benchmark on the "cold" load time of a map like this, as well as one showing the full rebuild of the map!
Really wished they also did Unreal 4/5, Unity, Substance painter etc benchmarks.
second
This
Damn, as a game artist, I 100% agree.
you expect them to do productivity benchmarks that people actually use rather than a bunch of stuff 1 person hiding away in a basement somewhere runs just to say "nice".
Love that you guys started to introduce 3D rendering and Blender and stuff. Huge thanks for this!
132
More blender and production work loads is awesome.
more work-related software benchmarks should be a staple inclusion
@@erin758 True there is a lack of such a channel I think. But for hobbyists like me with Potato pcs, Blender benchmarks are the way for me to imagine the processing power.
Damn, those thermals are rough. I'm impressed at the performance numbers, but I feel like the 90C+ temps and 230W power draw are going to make me go to a 5950X anyways.
The power draw and high temps are a no for me as well. Sure it's faster, but that doesn't make it "better" by any means. Hopefully it's a push in the right direction though
Yep, imagine this with a 3090 and you got yourself a house warmer for the winter.
i doubt 99% of PC users will even use the CPU in a way where those thermals will matter tho, in gaming (and probably streaming too since it's not too taxing either) you'll be at industry standard temps. I really would only worry if you work in CAD etc etc, but those people most likely weren't looking for a upgrade anyway
At least they can't claim that you can't OC these chips - you just won't because of temperatures and power draw. Not viably, anyway
@@O26. This. Max thermals are only a piece of the puzzle. If you're never going to be crushing every core with full load in your day-to-day use, it's only a limited issue at worst.
I am a Unix-like developer and I compiled different programs every day. Thank you for adding the Firefox Compiling Test
Which OS are you working on?
"Unix-like" makes it sound like you're embarressed of your target platform. Mac perhaps?
Then again, it could also mean "I develop for everything but windows", and if so I salute you.
"Unix-like"
You can say MacOS, we won't judge you.
@@nobodyofconsequence6522 It they were using a Mac, then they wouldn't be interested in an Intel or AMD processor, would they? So no point in writing the initial comment and almost for sure they meant that they write code for everuthing that is Unix-like.
@@FL4SHK Linux is “Unix-like,” not Unix. It’s based on GNU (GNU’s not Unix). It functions like Unix, but technically isn’t Unix.
"Competition drives innovation" - someone, sometime
Milton Friedman, probably.
Sounds like something Marx wouldn’t say
That's why technology sucks under socialism. This isn't critique, it's a fact.
Sounds like Tsun Zu
God I love capitalism
AMD's power efficiency.....when thinking about not too long ago (Bulldozer)....dang, they STILL impress me.
You are talking about maybe a few dollars a month difference in electricity costs. Just keep moving that goalpost...
@@Tiberious_Of_Elona well the cool thing about efficiency isn’t just about reducing cost it’s also the power draw in total. The less power we consume from the grid the better.
Let alone heat lol
@@Tiberious_Of_Elona funny enough, we all had to hear people whine about amd being less refined and eating up more power, but like always when intel does it, it's ok, negligible, just a few bucks more, when amd does it, well, crap amd, unable to deliver, intel the best! Get your crap sorted out dude.
@@lupuradu My fx-8350 feels kinda cool compared to the new CPUs. Only drawing about 180w when severely overclocked and keeping it under 70 degrees on a Noctua NH-D14
Excited to see what Ryzen's response is!
It will take some time
@@SamuelLobo01230 perhaps amd will just go like snap
nothing
@@Ben_mgsp probably
Being affordable lmao
Hot damn, that wattage and thermals. Still though, as someone who is currently running a 5800X, I’m super happy to see Intel finally making real improvements this generation. I’ve never been actually seen real competition in the tech space like this.
Yeah it will definitely give Ryzen some incentive to do better, so whenever there's actual competition everyone benefits.
"that wattage and thermals"
That is the 12900K overclocked or infinite turbo. In real world, the use is way lower due to Big-Little design, you overclock a 5950X (infinite turbo) and the power draw and heat is the same (around 220 or 240 watts) but the 5950X has 20% lower IPC and lower clockspeed.
@@saricubra2867 In the real world, the 12900K is 3% faster in gaming, and with all benches combined it just tops the 5900X, but is still 9% behind the 5950X.
And that is while it is using DDR5, that accounts for a 7% increase in games and overall.
So I'm excited that Intel made this jump, like really excited! But I'm not impressed, where the 12900K shines in benches, it's also using way more power, almost 100% more power in fully MT loads and 35% in mixed loads. Even 8% in gaming, the only good this is that Intel at idle consumes 7watt less.
I might want this chip for the single-tread performance, that looks amazing, but other than that, it's just bruteforced performance with the DDR5 uplift and a major increase in power consumption over the competition.
1000% more competition mean we win more
@@saricubra2867 LMAO, the 12900k is nowhere near 20% higher IPC. Did you even watch the video? Also, if the 5950x has the same power draw after "infinite turbo" with 8 extra full size cores, that definitely not a good thing for the 12900k. Especially since the 5950x beat the 12900k in multiple benchmarks *without* "infinite turbo."
Will be interesting to see the DDR4 benchmarks, also now that Intel has enabled all core boost, would make sense that the AMDs would have that enabled also in the testing.
watch hardware unboxed video, they compare the DDR4 and DDR5
@@pixels_per_inch I didn't see mention that it was now enabled as it was considered overclocking
AMD already does this out of the box and have been since ryzen launched as far as I'm aware
Hardware unbox 😁
@@AllanSavolainen that’s what PBO is.
An Intel i5 being faster than a Ryzen 9 is something I never thought I’d see.
While drawing 200w and spitting fire
@@bigtitmaster Nigga what? Its thermals are at 60 Celsius or under and it draws 125W.
I believe the Ryzen 9 is less than 10% better than the Intel i3 which is such a small difference
@@supersuede91 WOAH WOAH WOAH WOAH
@@bigtitmaster what the fuck kind of cooler do you have? a copper block?
This was long overdue. Finally improvements from Intel. Thank you AMD for bringing competition to the market. I can't wait to see how both will improve their cpus in the next 2 years.
What a joke! Requires new motherboards, new Ram, new cooler, etc. Meanwhile my B450 should support AMD's 3D cache refresh. Poor Intel.
@@noleftturnunstoned you know right that AM5 is coming very soon.
(Atleast you can use your old cooler)
@@noleftturnunstoned So we should handicap performance on new products because you're too broke to afford new parts? You're the joke here for thinking that lmao.
@@Freestyle80 my pc has an intel cpu
@@Freestyle80 You're seeming like an Intel fanboy
I know you wanted a best case but I’m more interested in seeing how well the CPU performs without better RAM
This.
AND using windows 10. I mean, hell, who even is using 11 right now?
@@ImAzraa Me
@@ImAzraa I'm using W11 on my laptop. Its 'fine' I guess. It did however bluescreen my desktop (i9-7940x/Prime x299 MB) so I won't be moving to W11 for my main rig for a while.
@@ImAzraa The figure I've heard cited is that only 2% of Windows users are on 11.
@@praetorxyn I've heard 5.
I'd love to know how much of this is just because of the ddr5 though. Is it really Intel doing well or is this is having an upgrade in the ram department? This will be very important to see how easily AMD will be able to come back with their next-generation. Also seriously those temps though
Whatch Hardware Unboxed review. It is much more thorough.
I have a 5950x and 3800C14 ram, I’m going to run a few tests to compare.
I'd have to guess DDR5's not doing a lot but I'm sure it does cost more.
There are DDR4 benchmarks available on plenty of sites. In the ones I've had time to check so far, DDR5 wins out in games and productivity, but not by massive amounts. (DDR5-6000 CL40 vs DDR4-3600 CL16 btw.)
i'm also intrested for the comparison to windows 10 ryzen
Just in time, Intel! My house heater broke but now I can explain to my gf that we'll have warmth and a beast gaming PC. :)
I'm honestly quite curious how the Linux kernel will handle this. It already has great scheduler support for ARM big.Little. I'm not sure if the code architecture-specific and needs to be rewrite for 12th-gen Intel though. We'll just have to see!
it's hugely specific. like comparing latin to chinese
Probably very well, considering Intel is the largest code contributor to entire Linux kernel...
The kernel did not get patches by Intel yet to address this architecture and optimizing its scheduling. So it will probably take some time until we see Intel getting ahead on Linux because currently AMD is still your best option with Ryzen in most tasks and the patch/merge window for version 5.16 of the kernel seems closed already (5.16 might get ready in December).
Gaming benchs are wacky. It's hit or miss depending on what cores kernel decides on. Wendell did a video on it.
You are right, i saw benchmarks with Fedora, Archlinux, Ubuntu, ClearLinux and Windows 11 and all have better perfomance than Windows 11
I’m honestly impressed that the 5000 series is still on the heels of the 12th gen CPUs, very exciting times in the CPU Market.
What a time to be a computer enthusiast, AMD, Intel and Nvidia are bringing so much competition and it’s amazing.
Have to agree. I use my 5950x for rendering, and the figures from Intel here made me consider a change. Until I saw that the Intel was running at 90C stock!
I overclock my amd chip, so I get better results than the i9 while still running cooler and for slightly less power draw. So I'll stay where I am for now.
But alderlake version 2 next year might be a game changer. But then amd will have their next chips out then too.
Exciting times to be a nerd!
@@StudioNirin idk man i have 5950x + 420aio from arctic cooling and with pbo i get 77C under aida64 load test.
I'm just gonna wait and see what AMD's next generation of CPUs will be.
Took Intel long enough to be able to compete again after getting kicked in the dick by Ryzen. Although I do wonder how much of the performance is from the DDR5 and Windows 11 sucking off Intel's new core architecture.
I'm still gonna stick with my Ryzen 7 3800x until it breaks though, just because I need a new GPU before a new CPU.
What a time to be a scalper
Apple is the new kid in the block.
Would love to see how the I5 performs with a DDR4 system!
Hardware unboxed did that already, check it out
@@foxcell21 save us some time and tell us what he said lol
it's even better than i5+DDR5, lol
yea 12th gen with ddr4 performs even better than ddr5 in some games
ruclips.net/video/yZReEP8ixyk/видео.html
What I would like to see is a breakdown of actual task scheduling and how the efficiency cores compare to the performance cores on a granular level. Something that really showcases the difference between heterogeneous cores vs the traditional homogeneous cores.
Can't wait to see what happens when AMD picks up this design style in addition to their existing Ryzen chiplet system.
Competition gon' be goooooood
I have a feeling AMD is going the same route, and Lisa has it all ready to reveal any time now. Imagine 16 performance cores and 8 little ones 6950xt.
pickes up this design style - u mean stealing and calling it something else, thats AMDs way - always been
@@cnhtol1586 Like Intel didn't steal this power core design from ARM? X86 in the long run will be a dead architecture anyway, the future is RISC, even for desktop.
@@super_slav91 but we’re long away from the future on the idea of performance cores didn’t come from arm first. RISC-V is long away before they are gonna be used as desktop chips!
@@cnhtol1586 ...Every manufacturer does that, it's the whole thing about competition, every advantage will be recreated and to get ahead you need to innovate to get ahead. Like Ryzen started giving more cores to consumers so Intel has started doing it. Apple paved the way for desktop ARM chips and Intel is copying that. Etc etc
here when its still unlisted...
Hm
Hi hi
bro
How?
nice
I find it a bit worrying for the "Intel advantage"(despite disclaimers at start) when using windows 11 AND DDR5, as the vast majority hasn't made the leap for either yet... And will probably not be representative for most users experience....
Perhaps another video with windows 10 comparison?
I second this Idea. Thats what I was thinking throughout this whole video.
I guess that's in the follow up video they mentioned. They must have been very pressed for time in this video.
After years of misleading the market with graphs and charts... You can clearly see that nothing has changed.
Windows 11 will be the standard in a few years (not because I think it's good, but that's how Windows is) so if they wanted this video to age well it wld need 11. And yeah benchmarks take a long time, they can't test on both with so little time before the embargo lifts.
Hardware unboxed has done a bit of benchmarking with win10, results are it's not that off, compared to Win11, just a bit varied from task to task. DDR5 is a genuine advantage for them, but Win11 is not exactly boosting perf in general, it just enables BIG.little to work properly. And since they are advancing their chips themselves on that front, I consider it still a win on merit.
So im guessing watercooling would be the best way to go?
Gamers Nexus did a testing with Artic Liquid Freezer II 360 and the CPU was still at 70c, sounds like your gonna need to Liquid Nitrogen it.
If ur doing that kind of workload regularly, AMD is the better chip itseems
your gonna need like an actively cooled watercooler. meaning something is cooling the water while also pumping it around the system lol
No, if you are just gaming with maybe editing videos here and there and photoshop, get an LCD AIO and ur set
Noctua: hello
They always nail-it with the i5 for some reason, good showing with a overall comeback
Not even close, they were paid to compare ddr4 to ddr5.... LMFAO! and you ate it up like factual evidence.... Go put an iontel rig on ddr3, and compare to ddr4 with AMD. Youd see similar results lmao.
@@crisnmaryfam7344 Nobody cares
@@crisnmaryfam7344 Relax buddy, you can see DDR4 and DDR5 test on the internet, this is a good product and both cores are impressive, AMD still more efficient and competitive in performance but that doesnt make this Chip-Lineup bad. I'm fairly impress
8600k and 9600k were lacking even on their respective launch days. Still, they're the exception to the rule and "an i5 is good enough for gaming" has been true more often than not.
@@crisnmaryfam7344 so you are saying Intell should have to have their performance hindered because their competition doesn't have equal feature sets? That's absolutely idiotic
LOL. "Our sposor Corsair doesn't know" That's the best segway ever.
Segue*
Sponsor*
Xeneon!?
lol*
know.*
Game of Thrones :
Everyone : "The winter is coming."
Me : "Rest assured. I preordered Alder Lake. Now how hot do you want your winter?"
I got the i9 10980xe and my room stays at a perfect 90 degrees while it’s 50 F degrees outside
ADL actually uses less power and runs cooler than Zen3 during gaming.
@@Itaketoomanypics Toasty. We can combine and create our own fusion reactor.😄
Global Warming?
I'll do it myself.
You know nothing Jon Show
it's really refreshing to see AMD, Apple, ARM and Nvidia finally wake the sleeping giant known as intel.
Let's see when AMD's new ones come out later this year! Great time to be alive for us all no matter if you're team red or blue!
Go to the AMD fanboy forum then hahaha
@@tomd4748 salty intel fanboy?. He's not even shitting one intel and you're saying he's a fanboy? get you're shits together.
I'm team price/performance
@@vondamn9943 this guy gets it
@@tomd4748 they were paid to compare ddr4 to ddr5.... LMFAO! and you ate it up like factual evidence.... Go put an iontel rig on ddr3, and compare to ddr4 with AMD. Youd see similar results lmao.
Given this is the "best case scenario", with Win11 and DDR5, I can't see it being better value than AMD's offerings, at least not while DDR5 is still priced as high as it is. Still, it's great to see competition again!
True. Though to be fair, this competition over the last years already gave us extremely good value on CPUs. It's very tough for Intel to keep up with AMD now, so I'm glad they're at least in the ballpark still. Without this release, AMD might have kept higher prices on their 50 series CPUs.
Turns out Intel did there tests on the version of Windows 11 that had issues with Ryzen CPU’s.
Especially since the current gem ryzen processors are probably going to have their prices cut.
@@SnifferSock I wouldn't count on price cuts. AMD still has supply issues.
@@mitchjames9350 Yeah but the new intel chips still beat out the ryzen chips in just about everything on windows 10 too. And what we always see with brand new architecture is over the period of a year the performance gets better with OS and driver updates.
I upgraded to a ryzen 5 recently, and I don't plan to change again for years to come, but im still glad we finally have competition in this space.
lol competition
Same
Finally? Intel has never had competition until now, good that AMD is catching up to give us a fight the recent years
@@kaustubh_haha fanboi detected
@@Quantumfluxfield another fanboi detected
pulled the trigger and went full ddr5 and 12600k, from an i5-7600. What a difference! When the new gen GPU's, Direct Storage and resizable bar comes around, its going to be nuts!
I try to avoid having computers that feel like blast furnaces these days, so if I want multicore madness I think I'm sticking with Ryzen 9 over Core i9. But man, that new i5 is looking pretty sweet. If I was going to build a new game machine, I'd be giving that one a HARD look.
My thoughts exactly. Wife has a need to render/edit, so she'll be AMD still, while I might hop to Intel in the next update. Weird times. :D
@@tuurekeranen1771 tbf this are a generation apart. Wanna see when amd will drop their new cpus
Indeed! It *reeally* makes me want to see what the 12400 numbers will look like. That one could be the new value darling.
Yes though you still need to look at the entire platform cost, including RAM and MB prices. Let's wait for the next follow-up video.
@@jellorelic or wait for 3DZen in Q1. Which should net a 15%+ increase in gaming performance putting Intel's Alder lake pointless.
The temps and power draw though. So it's like a much faster Bulldozer :P
That said, it'd be interesting to see how much improvement 3D V-cache makes to AMD and what their next gens are like
kinda the point by having high efficiency cores, you'd never hammer all the cores at once in any real application
Like the old pentium4s that used to melt sockets.
(edit - seriously the people saying power doesnt matter - your individual power user overclocked CPU doesnt mean a thing, millions of prebuilt systems coming off the shelf using that kinda power does. & The idea of this arch in data centers is nuts.)
I'm super pumped for the Zen3D chips as well.
I'm running a 3700x at the moment, so I honestly have no reason to upgrade at this point, but I'm glad I at least have the path if\when I have the money to do so.
I'm just glad there is some proper competition between these two again, up until Ryzen came out, shit was a snooze fest, and prices were NUTTY asf.
Doesn't California ban several game systems now due to their power draw and them blaming the computers of global warming..
Imagine what will happen regarding these chips and the systems they come in..
@@zakofrx California where everything good and fun is illegal, Except weed.
I would like to see a best case scenario versus using Intel/Windows 11 and AMD/Windows 10, how much would gaming benchmarks differ in that scenario
Intels 12th beats itself around 8 percent on windows 11 in cinebench r20 multicore. amd takes a 10ish percent hit on windows 11 over all right now.
Hardware Unboxed was doing some good win 10 vs win 11 AMD, Intel work
Well, that's not fair
Hardware Unboxed already did the test - not much, around 5 percent off at worst
Agree, even if MS claims they "fixed" Ryzen issues it's still a brand new OS that was made for Alder Lake. It's bound to be plagued with bugs using other architectures.
It's funny to see AMD with all 16 "performance" cores score half the power usage than Intel's 8+8 P&E cores. So much for "efficiency" cores.
8 efficiency cores only consume like 40-50 watts at max load. It's the big cores that are super power hungry
@@TabalugaDragon Please tell me in short, which consumes more watts, AMD or INTEL?
@@sirs4878 with equivalent number of cores at max load - Intel.
Idle power consumption is 6 times lower on Intel, and average power consumption during gaming too.
@@TabalugaDragon How many VMs Ryzen 5950X can run?
@@sirs4878 that mostly depends on how you configure the Virtual Machine....
I would say that VM's are more RAM dependant than CPU
This just makes me more excited to see AMD's next chips.
Amd its exciting, Intel its Hotter
AMD's wattage and TDP is a huge advantage for them, this is gonna get really spicy when they release their new gen DDR5 cpu's
if history is any guide AMD will fall by the wayside like every other time intel came back and left them in the dust
@@Baka_Oppai don't forget about that part that involved Intel playing dirty games in order to bring down the AMD. You want them to fight over your money, but legally, not by paying OEMs to handle miserable jobs.
@@Baka_Oppai History isnt reliable when you are talking about entirely different people running things.
Intel sure are using a lot of power and heat to be competitive.
You mean "to win."
And 20% higher IPC with higher clocks, and DDR5, and PCIe Gen 5.0, revolutionary architecture for x86, etc.
Power consumption is a non issue.
@@saricubra2867 If you that were the case, then nobody, even Intel, would be clamoring for Apple's M1 efficiency. These CPUs are still a burning furnace but it is a welcome change for Intel.
That alone almost makes me want to wait to see AMD's response to this...
Same thing AMD did before Ryzen.
You gotta do what you gotta do.
You should have included DDR4 benchmarks too, as most people aren't too keen on spending $200 on only 16GB of RAM. And I don't think the 12600k is better than the 5600x (from a value perspective) for the reason that boards are much more expensive, you need a beefier cooler and PSU, and I don't imagine that it'd be all that great with DDR4.
That same scenario happened between DDR3 and DDR4, users hung unto 3 even after all makers switched to 4 lowering its cost gradually due to competition while demand on 3 raised its cost and we ended up in a weird situation for a bit where 3 costs more than 4! 😅
well that depends, if there is performance gains then yes people would be willing to pay more for ram
it happens all the time
it just came out and you are making generalizations on price? I have seen ~$180 alder mobos already, so value boards should be coming
beefier cooler, sure but is negated by the price difference between the chips
psu? nope, wattage is determined by the gpu you want to run
True.
DDR5 played an important part In helping intel beat old gen DDR4 Ryzen cpu.
Also since it's not just a simple cpu swap , the cost of DDR5 ram and new mobo will make it a lot expensive than rayzen.
But linus here didn't focus on these points for reasons unknown. Maybe he believes that he will help competition by being a little biased .
Have to agree. To even the comparison as much as possible, platforms should have both been DDR4. This review left me wondering how much the performance with Intel was due to DDR5,
And then we have another question. Can you even get everything at the market? I bet some areas are cleaned off pretty fast and 90% of the people who want to upgrade can't because of no available parts ( mobo + ram + cpu + new cooler because of new layout for the screws). You cant upgrade missing even 1 part.
Impressive chips. I wonder how task manager shows the cores/threads with a mixture of performance and efficiency chips?
that will be on Microsoft part
No way!! Does this mean I can finally afford 4th gen Intel now?
Nope
Not really
I think they're pretty expensive unless you're buying used
Go AMD and save some cash on the parts and your energy bill.
Buy i5 2nd 3rd gen lga 1155 and gtx 1050 ti 1060 or i7 2600 , Maybe i7 2600 is better than 4gen intel i5 .
Buy from me hehhe soon I will sell new pcs with ryzen 5 cpus
Lmao , i have 4th gen
intel: "watts go brrr"
m1: "u got a power bank?"
The M1 Max runs at 130W
@@Whyrureadingthat That's a combined TDP of the GPU and CPU. The CPU runs at 30-35W at full load and absolutely crushed every chip in existence in terms of performance per watt.
@@avinashanish3350 12th gen intel cpus will be 45w, we'll see about that. Also the M1 Max is 4 grand
@@Whyrureadingthat 12th gen mobile CPUs are not going to be 45W. 11th gen mobile CPUs are already hitting TDPs upwards of 75W under full load. Another thing to consider is that the M1 Max comes close in performance to the 12900K that runs at 250W on full load. That performance per watt ratio is insane.
@@avinashanish3350 Dude, are you seriously comparing Arm chip to X86 chip? Your silly little M1 can't even encode AVC. Sure, there is power, but the M1 is missing TONS AND TONS of instructions and features, making it a mobile chip. This is like comparing a stripped sports car with no features or comfort to a tesla. You have no idea what you're talking about. Think about why respectable companies, cloud-services, etc. are avoiding stuff like Apple M1 chips even thought some youtuber said it's good at playing youtube videos.
EDIT: I should elaborate. By no means M1 is bad, but 12th gen intel and M1 are two completely different worlds and are used for different things. Good luck hosting and playing minecraft on your M1, it can barely reach 100fps while intel goes into 1000+.
The other thing I've noticed after checking prices today that it's ungodly expensive to get ddr5 memory and to get a compatible motherboard so the platform cost as of right now is going to be more than double than the cost of getting a current generation AMD or Intel setup. I like to see how things pan out over the next month to see how the real cost of motherboard and memory add to the platform cost of alder lake. I mean hell, just check new way right now and it would cost me $770 to pick up 32 gigs of ddr5 memory. That is absolutely insane when the memory cost almost double the CPU price. The motherboard cost from what I can see so far is around 450 to $500 at this moment in time. Which means you're going to pay a hefty penny for that kind of performance and as far as I'm concerned unless I see something more reasonable there's absolutely no reason to spend so much extra money for such a minimal performance difference when you look at it overall. Definitely the new adopter tax in full swing. Be patient guys and let prices drop a little bit at least.
I imagine it will take longer than usual for prices to drop due to the chip shortage
32GB of 5200MHz RAM is $280. Still really expensive, but not as expensive as you’re saying. Sure there’s nothing really in stock, but I wouldn’t be surprised if it gets resolved somewhat soon. NAND hasn’t been hit as hard by COVID as other industries.
Not to mention the performance increases. A year down the line, I wouldn't be surprised to see ram hitting 7000 mhz, for half the price it is now. Not to mention, Ryzen will likely soon have their 6000 chips out. 5nm on 5ghz will be a compelling argument to stay on team red.
I’m not worried. Since I don’t have a fixed income, if the prices never drop, that’s just another reason for me to quit gaming altogether. I’m already spending enough time in front of my screen as it is. Maybe it’s the perfect time to quit.
Exactly, I game at 4k 120hz and this is why I bought an 11th gen just 2 weeks ago.
Ram, CPU ( I7 ) and mobo cost me $1400aud. The same amount of ram alone is $750aud
The most ive seen the CPU used is 20%
Great news for competition, more so if it starts to drive prices down instead of ever increasing price inflation like AMD/Nvidia. I would've liked to have seen you use DDR4 on some of the benchmarks for an apples to apples comparison. Regardless, when natty gas goes through the roof this winter I can at least heat my home with an intel chip.
I love it, when the Civ6 benchmark is in the review. All that counts for me. :-D
Timestamp?
@@HH-ni5hm 4:19
A competitive gaming CPU for just 300$ - sweet 😍 Now I just need to break out another 1800 bucks for a GPU and I'm golden 😌
If you're paying that much for a GPU you're not golden, you're rusty 30-year-old sheet of iron
@@tsakeboya Well I mean have you seen the 3080 prices recently?
Dont forget to add 60% extra for ddr5. The only reason why 11600k is faster than 5600x :D.
@@tsakeboya the gpu market is broken rn prices are ridiculously inflated
You might not need a top tier gpu if you're just playing competitive games they're fairly easy to run and intel's integrated is decent now
I thought the whole point of the of the E-Cores, was to help with when you're running apps in the background? but no where did you test having background apps running vs another CPU without E-Cores
There isn’t a way to test that unless you can get Alder Lake with Pcores only
@@shapshooter7769 I think OP meant to test it against AMD.
Would like to see some comparison in the audio recording field as I would love to work with +- 500 - 700 tracks with alot of different effects and vst Instruments
Exciting! Will say I wish you tested with DDR4. The price puts it on a totally weird, whacko comparison no matter the performance, and God knows there's some serious benefits to it but yea. Totally screws with the price/perf.
DDR4 performance is the same as DDR5 at much lower cost.
It just isn't worth investing into DDR5 right now because when you next upgrade, current DDR5 kits will be real bad, same as what happened with initial DDR4 years ago.
@@Steamrick It isnt. Go check LTTs, or anyone else's DDR4 vs DDR5 video. Tho i mean... I guess yeah the DDR5 kits in the future will get faster
@@Steamrick yep no gains in gaming from all benches ive seen so far
Check HWUB's review of the 12900K. DDR5 helps in benchmarks and in select games
@@Steamrick Not even dude. LMFAO!
The i5 seems to be the best deal so far
Yes, and I'm planning to buy it!
always is
When yoru doing the numbers remember to factor everything into it. Motherboard Ram etc etc.
@@dralord1307 i9 is 300$ more for not much extra performance, you can significantly upgrade your gpu for that
@@naveenbattula for gaming that is. In tasks that benefit from more cores the i9 might be better.
would be awesome to see some thermal results in some hot areas like, 30-35C room temp to see how much difference it make an air and an watercooler
cheers from Brazil
At that heat in Brazil I'd be worried about humidity damage. Water cooling would work better in that space, however.
Move to south Brazil
@Andreas ah yes, just buy stuff, it's so easy to do down here, an AC definitely doesn't cost 3x the minimum wage, nope.
Compra um extintor de incêndio.
@@williamclos6629 are you assuming he makes minimum wage jist because he's Brazilian? Racist.
Intel kicked all the ass for about a year and then quickly fell back into old habits.
When testing the 12900k on win 10 I got Cinebench R23 Score 2000+, in fact I saw the whole i9 perform much better in Win 10. Interesting to see these results though... 🤔
Interesting. Somehow I thought that releasing a Win10 video would just be a "Shhh" for AMD fanbase, but now that you mention it, it might be even more interesting.
Intel: we are back baby!
Ryzen: now hold on a minute
Lisa got something just for this occasion I can see it haha, AMD is gonna be like surprise mutha fukka!!!
@@super_slav91 I don't think AMD is going to do a completely new architecture this time. I think it'll just be a refresh of 5000.
@@hwstar9416 Intel didn't do a new architecture, they copied ARM from 11 years ago, well "borrowed the idea", without a full schematic of the chip we won't know how close the new chips are to ARM's big.LITTLE architecture
5nm, here it comes.
@@hwstar9416 I dont know, new socket, has to be more than a little. PGA to LGA.
That i5 is so tempting. We’ll see if they actually sell for MSRP though.
With the current market, it’s honestly a very good question. Scalpers could have a reawakening for these cpus considering how powerful they are.
@@VanquishR But intel is having a okay supply of silcon right? So why would they sell it for so High
@@b-beluga4510 they’ll probably price it in the middle of the 5600X and the 5800X, or they could just go ham and price it below the 5600X
@@hextobyte they have msrp lower than 5600x at 289 for 12600k and 269 for KF version
Don't forget you might also need a new mobo, new memory and new stuff. So even if it's MSRP the price overall will still be way higher.
“Our sponsor doesn’t help” - 10 most risky quotes in history
It's "doesn't know" instead of doesn't help
Listen carefully
Our sponsor doesn’t know
I'm curious if the numbers are actually accurate, considering the fixes for Ryzen on Windows 11 go out the window and require a fresh install when you swap out the CPU even once
Wait really? That sounds dumb.
@@WilliamChoochootrain Watch the Techlinked episode form November 1st, it's the first story.
@@Zullfix I mean if it was on tech linked then surely Linus knows abt it and would do the reinstall when swapping cpus
Yes it's accurate. They're the ones that did the techlinked to tell you about it lol
@@lilkittygirl They've had the 12th gen chips for up to several weeks now (stated by the 12th gen ES video) and have been holding this video from release because of NDA. They also stated the second patch had just barely come out when they started testing, and the CPU swapping issue was only found 3 days ago.
Way to go intel!
Although I would be interested to see benchmarks on windows 10 with ddr4...
Agreed. Not everyone wants to jump into windows 11 this early
Does it support DDR4? I don't think any of the motherboard manufacturers have released them yet at least.
@@stephen9894 DDR4 has been around for years. 5 is new.
This is only an early video, their doing a comprehensive review later.
GamersNexus did their testing win Win10 and ddr5
window 10 doesn't support these chips and the next and chips will support ddr5 so is better to use the latest tech to compare.
I'm really interested to see the i5 running ddr4 because even though it looks like the obvious go-to, I'm not so sure if it's worth the extra money spent on the motherboard.
i am also thinking about M.B. to costly
DDR4 lowers performance about 5.5% and the i5 in Linus' tests only held a 6% lead to begin with. Not worth it imo.
also the performance will be a little lower in "normal" motherboards and not a top of the line z690...
@@giornikitop5373 Yeah just depending on how the auto overclocking and scheduler are set up. I'm sure most lower tier boards would fry in the sustained benchmarks at this power draw!
If you ever upgraded your RAM in the middle of DDR5 era, any savings you get will be obeliterated when you really need to upgrade your CPU, and therefore your mobo and RAM to actual DDR5.
Good for Intel for doing well. In some countries outside the US, Intel has the better value per performance or very slight difference for the cheaper price. AMD is still saving Zen 4 for Ryzen in the coming months.
Gotta love the competition getting exciting again
I give credit to Intel for making a major architectural change, which is the equivalent of turning a battleship at a company like Intel. But the power & temp graphs say it all, don't they? Intel got this slim performance edge at the high end by running the chips hot.
I get the impression the Ryzen chips could keep up if they were set to maintain all core boost clocks, like the 12900. I wonder how the i9 will overclock if it's basically already completely unlocked.
crazy that they can just use 2x the power when there's a TDP rating for a reason
@@NightshiftCustom The chips are competitive at lower power, so it’s not purely a question of power. We should be happy that both companies are putting out very competitive silicon. It’s good for customers.
VERY hot. And consuming bucket loads of power. And expensive. It's still a no from me.
@@davepastern you didn’t watch the video. The Intel performs better and uses less wattage than the AMD 5950x.. maybe you’re just blindfolding yourself because you’re a AMD diehard fan
As someone who always used Intel chips, one of the main things that made me switch to my current 3600x was the simple fact I didn't need a whole new chipset/motherboard to upgrade the cpu everytime.
While there will be benefits adopting the new platftom early as mentioned, the extra cost of even just a motherboard every generation outweighs the performance benefit as of right now.
Well unless you are planning to update to a 5xxx then that benefit is entirely lost. The 6000 series will be coming with an entirely new platform.
@@Sandriell hehe I chuckled when i read that defence by amd's fan everytime, the only thing they benefit is the fact that they can use the same cooler, nothing else.
@@Sandriell Valid, I do intend to upgrade to a 5000 series. However it's less about the specifics and more about the trend. Intel has inherently had far more chipset iterations and 1-2 generations of cpu's compatible, usually with very mild improvements before needing to upgrade motherboard again.
The fact that my B450 motherboard can run 1xxx, 2xxx, 3xxx and 5xxx cpu's with at most a bios update is the thing I am more trying to convey.
@@Phillz91 "yEaH bUt 6xxx WiLL nEeD a NeW MaInBoArD". 4 gens of CPU on the same AM4 slot and on the 5th now it's all of a sudden "well AMD needs new mainboards too". I'm in the same boat as you... was always a huge Intel fan and only ever used Intel CPUs but their need for new slots is ridiculous.
@@Sandriell he can still jump to a ryzen 7 3000 to even a ryzen 9 5000.... unlike alderlake
Hey! Great Video!!! I have a quick question sir. How long did Intel say they would stay on this platform? And as AMD Said how long they would be on the next platform? I have a 5900x now. But really want something I could put a new processor in a few years from now. Thank for your help
I would love to see the performance difference between ddr4 and 5 on these new chips
der8auers mah man! - ruclips.net/video/6Y9S1wEtR3g/видео.html
I would love to see a comparison between how zen 3 and 12th gen handle background tasks while gaming, in theory the efficiency cores should take care of light loads like discord and chrome, leaving the P cores to run games unencumbered. Would be an interesting comparison
"light loads like discord and chrome" yeah, very light loads...
@@Chipsaru lol
Alder Lake is ridiculously snappy, not only from Big-Little alone, but the ridiculous IPC and frequency of Golden Cove.
The problem is that everyone is assuming that the scheduler will always make good choices. Jay has already seen one instance of a benchmark being shuffled onto E-cores by mistake. What about anti-virus. Background yes, but it often blocks other things running while it does checks. Do you really want it on E-Cores and everything stacking behind it while it takes its time? Lots of unanswered questions, lots that can go wrong. Trusting Intel and Microsoft to get it right on the first try (or second, third and forth) seems a little optimistic.
@@codemonkeyalpha9057 I think he used Windows 10 and DDR4 making his tests fundamentally flawed.
Worth keeping in mind the Intel results are also with DDR5, which puts this more into perspective for me. If you're still on DDR4 then the results are probably even less distinguishable. Taking that into consideration along with the insanely high thermals, it is not a very compelling chip imo.
This
No difference between ddr5 and ddr4 see HUB.
especially the i9 lol, i5 is the highest chip you should go for
They specifically said several times, that those thermals and power usage is only at full load in benchmarks and rare professional workloads. Y'all are biased.
This video is probably looking at the future where markets should hopefully settles and DDR5 should be as cheap as DDR4.
I'll give credit to Intel for being innovative again, like the good old days
0:00 this slap will be reverse in 2022 when AMD will release new one ? :')
shall we wait or shall we buy intel 12 gen ?
Wait
Buy 12th Gen. Don't wait. In the PC Universe, if you wait, you'll keep on waiting.
Wait if you're running well now. Pull the trigger if you need something now.
Wait for atleast a year or so. There's still not much known about software compatibility wrt Alder Lake. The review ignores to mention the fact that there are nearly 30-40 AAA games released in the recent past that don't work with Alder Lake. Don't listen to id!0ts who say "Don't wait".
@@promethbastard THIS is the Answer.
I would love to see the same chip be tested with DDR4 and 5 to see how big of a difference it makes.
Stolen comment
This is why I love being poor. Because I'll always be saving up to buy a pc, and as the years go by, better hardware becomes available
Just wait till next year when Zen 4 and RDNA3 are out or 18 months till zen 5 is out haha
Yeap not like they really care, but rich people literally waste their money upgrading their PC every year lol.
LOL SAME
@@morpheus_9 I'm a patient man
Whatever absolute maniac lined up the i9 box accents with the larger Intel box... I love and appreciate you
I'm interested in seeing the lower-spec i5 and higher-spec i3 chips for business applications. I bet maxing out at 2 p-cores and then having e-cores for the rest would make for some smooth office/warehouse laptops.
I think they already achieved that with 8th generation professors. Unless you're talking about the Y or U series then that would be really interesting to see.
I think the lower-end chips arent going to have E cores, There are i3's that are unlocked though, maybe 4p + 2 e cores?
i3s and low end i5s (12400/f) probably won't have e-cores but the performance would be amazing for the price since the only good low end CPUs last gen was the 11400f and 10105f
@@viztiz316 maybe not this generation since the process is so new, but possibly if e-cores become cheaper then we could see those processors on the bottom of the stack in the future. We also have to wait and see what AMD swings back with.
@@lyrebird712 "possibly if e-cores become cheaper then we could see those processors on the bottom of the stack in the future"
Yea
"what AMD swings back with"
AMD will release zen3d(7nm) and zen 3+(6nm) cpus on Q1 22 for AM4 to compete with 12th gen
Zen 4(5nm) will release for AM5 in H2 22
Zen 5+Zen4d(3nm+5nm) will release in H1 2023
Seeing Linus hold processors gives me anxiety after the Intel Xeon Platinum.
That spec workstation benchmark is cool! @6:13
I do computer repair housecalls and that's actually really helpful to know to answer some questions i have about performance for day-to-day apps rather than just games! Thanks LTT!
This is fantastic, but I'll stay for a couple more years with AMD out of sheer loyalty. They are the reason why Intel is doing what it's doing right now. And I also don't want to forgive intel too soon for fucking me over all those years.
Don't worry. While performant again, the power draw and the wacky efficieny core idea from INTEL at full load is imho a lame hack to catch up. While OK for some, its not for me. AMD will soon bypass this short INTEL episode with advanced RYZEN versions. They just pumped theire EPIC CPUs. It will (hopefully) remain a interesting race xD
Well said. I'm glad there is competition in the market, but Intel is going to have to work much harder to lure back all those they milked for years. I just bought R9 5900x and Intel can go fuck them selves with their new CPUs, I don't care how much faster they are.
@@NedoBoi - Exactly. Up until the RYZEN CPUs appeared, i only bought INTEL. But this greedy crap the did in the recent past, did it. I usually buy upper middle tier CPUs. So i don't give a F*CK if INTEL is fastest or not. As long as price/performance comparison is right i stick with AMD for the near future. And power draw matters a lot to me, as the PCs can have a cheap, simple and quiet cooling solution with fans.
yall act like you only use your pc for blender 😂. If you just play fortnite all day, you cant sit there and say amd is a better value, because it just isnt. If you actually use your pc for serious productivity, you wouldnt be looking at comsumer chips anyway
@@jahterminatedlmao5473 if you want to play fortnite here and there you don't need any of the CPUs mentioned in this test, you can get by with a 4 core i5 from 3-4 years ago
This i5 is an absolute beast...wasnt really expecting much of the 12th gen...but damn...this i5 surprised me alot
Really curious how much of the improved performance is the jump to DDR5. Can’t wait for the DDR4 testing, as it should better isolate just the CPU gains or lack thereof. Also I’d like to see Windows 10 tested vs Windows 11, as 11 is still barely used by anyone and is unstable as hell.
ruclips.net/video/WWsMYHHC6j4/видео.html here it cover both ddr 4 vs 5 and win 10 vs 11, unless it ram bandwidth benchmark it makes only 2% difference average, funny some games run better on ddr4, note this video heavily bottleneck benchmark on gpu, this make no advantage on intel cpu.
Windows 11 is such as mess on Ryzen laptops
@@ChiXty60 Windows 11 is another Vista / Windows 8.
😉 Ha ha ha!!
@ BIG BeAR - Wait until the 2nd or 3rd wave of DDR5 when timings tighten up. You will be exceptionally surprised at the performance gains.
DDR5 handles data so beautifully.
ddr5 currently has trash CAS latency though
I never thought that i5 will carry the whole 12th Gen line up.
The i5 10400f was carrying 10th gen tbh.
Never count an underdog out
Tbh the i5s carried the last 2 generations. 10400f and the 11400f
Which currently makes the LGA 1700 platform already terrible for upgrading down the road if you buy the i5 now, because 12900k doesn't look very compelling.
I have a Intel Core i5 12600 K(F). It needed a proper cooler fron Noctua, the NH-U12A Chromax. Runs great even under heavy load. No problem with heat.
I can't wait to have this in 5 years when I move to windows 11.
Always wait a generation for improvements and big fixes
And 13 is a bad number
So, i5-14xxx for me!
Yup
Exactly my friend.
@WILDENZ I have to use my machine for work so I'm just spooked that updating will break everything. If I had a dedicated gaming machine I'd already have updated to fuck around with it.
Exactly what I was thinking!
AMD is in a good position right now... Even with this "new generation", the cost of their current chips will be a huge factor for real consumers.
😆i
Not to mention the fact that the 5000 series is a year old at this point and AMD already has a drop in solution to this "issue", ie 3D Cache.
Not only price, but thermals too, the performance is amazing, but I feel like they just boosted up power to the sky only to get that "we has best processorz" award
Same problem as in the past: only enthusiasts care. All these years of Intel delivering worse processors and yet AMD has made literally no inroads into the consumer market. No prebuilts on the shelf, no laptops on the shelf, Intel everywhere. We'll be back to 2005 in a jiffy.
@@TherconJair Yeah I guess you are right, without customers caring it won't matter.
Customers: "Did you do it?"
Intel: "Yes"
Customers: "But what did it cost?"
Intel: "Everything"
Intel is still WAY bigger than AMD and now longer is at a technological disadvantage. They are fine.
@@NymezWoW wrong. Intel just paid youtubers to compare ddr4 to 5... not a fair comparison put AMD on ddr5 and it will be a slaughter.
@@crisnmaryfam7344 Is that why Intel 12th gen is still doing well when using DDR4?
@@crisnmaryfam7344 And a, what, 8 month old AMD product that is due a refresh in 1 months time. Yep, it's competition, but I can see some leapfrogging coming their way! :D
I really need to know, did Linus use the noctua lga 1700 bracket during testing? I heard temps can be worse on lga 1700 if using an asus board with lga 1200 compatibility, due to poor contact with the cpu chip. I'd like to see a review with the actual noctua lga 1700 bracket before I believe 90C is what the d15 reaches
I was almost regretting my recent upgrade to a 5950x until I saw the power draw on that i9. That being said...that i5 is an absolute monster if you're not in the crowd that needs a bunch of cores.
Undervolting could save 20 30 watts
except for the $400 motherboard, the expensive ram and then top tier cooling you would need for the i5
at that point might as well buy a 5900x with all that and be in the same ball park monetarily speaking
@@Bulanesti i dont think that would be a huge problem for long. Wait awhile and the cost would probably go down by alot........ maybe not with that ram. Maybe wait a year😅
@@Bulanesti - That's overselling it a bit.. LGA1700 mobos are a bit more than b550s, but good ones can definitely be had for 250ish. DDR5 is not even helpful for most workloads currently so you can skip that. Assuming the power draws shown here are accurate the i5 shouldn't need anything particularly special for cooling - remember it stayed down in the 120-130w range, more than a 5600 but right in the mix with a 5900x.
@@Bulanesti You DEFINITELY don't need a $400 motherboard nor do you need DDR5 to utilize an Alder Lake chip.
3:29 sheeeeesh
Linus, PLEASE address audio/video production professionals. How does Intel’s new architecture compare to the previous when doing this kind of work? Using multitrack sessions in my mixing sessions, only the first core really mattered before. What about now?
I think you'll need to wait for the samples to make it to specialised reviewers. PugetBench is a good indication of things to come for video, but for audio the questions are way too many. E-cores have more latency: if a real time audio thread ends up there, there may be trouble. Sure enough, Windows 11's new scheduler was never tested with real time audio workload. It's really not a representative use case until Reaper comes out with a "Tested on windows 11 and alder lake" build. Also, alder lake runs HOT when number crunching, which spells further trouble for daws. Audio professionals need something reliable to invest on: Ryzen still is for 2021, let's see next year.
@@virgult I remember reading that long ago, the MPEG working group told people to stop using Pentium 4 to compress MPEG audio. Something about unstable timing increasing the compression artifacts or whatever.
7:53 I have an i7 12700k with a mobo that doesn't allow for overclock and when I stress test with Prime 95 I do get 239w ~ 241w TDP with liquid metal, way above what the image shows
Just as I thought the performance improvement is done by removing power limits and adding more coal into the furnace. Great if one does not care about electric bill
But it still has a significant performance advantage in gaming, where temperatures on the 12600K (the one that really matters) range from the same as the competition to being slightly higher (and very cool compared to the 5600X). That untamed power draw in all-core workloads is worrying, but it's only worrying for users who do heavy multithreaded workloads - which, to be fair to your argument, is the segment of the market that is probably most appropriate for the 12900K. I just think it's unfair to say that they're just "adding more coal into the furnace" when there's a dramatic improvement in performance per watt in most common user workloads.
Yeah my thoughts exactly intel basically have the i9 running at the red line. The i5 does look way more compelling though
Right me over here with r5 5600x and rx strix 5600xt using a total of 150 watts in my games I opt for the best effiecent parts for performance instead of most jam packed heated up tdp stuff but still is better than the older tdps in 2007
"All they did was remove the power limit"
Bruh... they changed the processor node, they changed the entire architecture, they switched to having two types of cores on one chip, they drastically changed the physical size AND layout of the die, they moved to PCI gen 5, they upgraded to DDR5, and they vastly improved the IPC.
They changed LITERALLY everything in this generation, and people like still roll their eyes and say "all they did was throw coal on the fire". Ffs.
German wallets: [anxiety disorder intensifies]
Ah yes, the "this is fine" fire memes won't go away.
240watt fire meme 🤯😂
That's over a 1/3 of most PSU consumers actually own.
@@RJ_Cormac For real, I have a 650 watt PSU running a 3700x and 1080 Ti. No way I could run one of these LOL
@@NeoNoggie I have a Seasonic 650watt PSU; running a Ryzen 3800XT CPU, 32GB DDR4 3200 RAM, and 3060 GPU, on a B450 motherboard. My desktop ranks in the 92nd percentile of computers globally, no need to triple the budget just for bragging.
I understand they have to cover new tech, but with the shortages and scalpers, stick with computers people can actually build. I've been rebuilding the same ATX case since 2006, nice to be able to reuse parts and not replace everything every ~3/5 years.
yep. intel adopted the big.little strategy but still lose out on performance per watt.
This is more like a desperate attempt from Intel, trying to compete with Ryzen's superior architecture by pulling as much power as possible.
Pretty sure AMD can easily crush Intel yet again if they use DDR5 and unlock the power to 240W.
@@kjc420 everyone knows heat kills electrics, why cooling is a big industry. I'll be curious how many Intel CPUs are still running in 10 years compared to AMD.
I went 8 years on my last build, then upgrade for the pandemic quarantine video calls to family. Built everything but the GPU took 14 months to find a fair deal on eBay.
I was running a GTX-650 from 2013 until I found the RTX-3060, sold the old GPU on eBay, somebody wanted it for half MSRP 8 years ago.
_We threw double the power at these workloads and gained 10% faster speeds than the competition on average, aren't we innovative?_ ~Intel.
the green team is no better haha
I still don't understand why "I need" efficiency cores on desktop...
To appease the thermal envelope gods. Seriously though the core counts keep rising so having all cores as "high" power comes at a price to thermals and power usage. Focusing on half(hopefully 3/4 in the future) and making them the best you can with the extra thermal headroom can actually increase performance as I would say most common apps don't use more than 16 threads. Just my opinion tho
Go to 7:00, it prevents the throttling of the cores that are high power dealing certain workflows to the low power cores allowing it to be more efficient as a whole. At least how I understand it……..
Yeah, I'm aware of the concep, which is perfect for laptops, but on desktop... I don't get it. The underclocking of power cores (when thay are not needed at high clocks) is good enough by my opinion.
There is a push towards increasing power efficiency in desktops with california recently enforcing restrictions on desktop computers with high energy draws.
I'm mildly confused why the 12th gen chips were run with ddr5 when doing cross-platform comparisons. Since the AMD chips dont have ddr5 as part of their feature set (this gen) I would have thought that using the same (or match spec at least) ddr4 kit would have limited the number of variables as much as possible.
Agreed considering it most certainly accounts for a portion of their tiny lead in these benchmarks. Why not test it both ways? hmm.
As Linus said, this is best-case scenario. So if you're considering running Alder Lake with DDR4, you may see a performance drop, making the purchase less worth it (albeit significantly cheaper).
Well, Intel did give them something like $20k in upgrades a little while ago. They will likely do the DDR 4 test later.
After so many sponsorships from intel do you think that Linus is not going to give the advantage to Intel?
@@benjaminoechsli1941 its indeed best-case scenario but it seems a bit apples-oranges.
This is why I have always bought an AMD CPU when possible. If AMD were to go under, there would be nobody to light a fire under Intel to keep them innovating.
I mean ... same could be said for Intel 🤔
So much this. Until Ryzen came onto the scene, Intel was more than happy to continue giving users crumb-sized performance increases in their chips. Once they saw an increase of AMD usage, they realized that they needed to finally do something with their products.
Thanks for taking a bullet for us
Absolutely right, competition and especially fierce competition is always a good thing. It both drives innovation and cost control. That's why I'm hopeful for Intel's GPUs because Nvidia sorely needs more competition.
So what you are telling the company is keep making bad product and I will still support you? It works both ways.
I feel like the value of the i5 gets thrown out the window when you factor in platform costs lol
Maybe prices will drop in some months, let's wait - those early new tech prices are always very high
thats my main concern as well, like whats the point in getting that if u have to spend on a z board thats worth alot as well just to oc
DDR5 is not necessary
@@gurjindersingh3843 Pretty sure a lot of people said that about ddr4 when it came out.
@@signinforthebestexperience3461 What people said was that you can wait 2 years to get DDR4 that actually offers performance significantly better than that of DDR3 and the same goes for DDR5.
Can't find that platform overview video about chipsets/motherboards. Anyone got link?