Hate the naming scheme. As somebody who's on an older 9700K with a newer graphics card, I'm seeing a performance hit in games and looking to update my platform. Anything I pick today is going to be better. Platform improvements and power efficiency are attractive to me. I don't want 600 watts heating up my little gaming area in summer; I'm currently pulling 330-ish watts according to the ROG Thor's display, with a 60°C hard-line loop. Besides, my power bill keeps going up all on its own. I'm not willing to compromise on FPS, but I've really wanted to throw in some PCIe cards for years. Having more than 16 lanes available on the consumer platform would be amazing.
The 9700K is solid. No weirdness in power settings or Game Bar or BIOS trying to cook my chip. When you sit on the same Windows install for several years with a broad range of hardware, software issues compound and degrade and become terrifying. I already have to deal with Razer and everyone else's software; I don't want to deal with in-OS software that makes or breaks my CPU as well. Performance metrics are great, but when are we going to get another solid platform with no BS trickery needed to get it to function properly?
3:52 Case in point: Cities Skylines and its sequel are notorious for being horribly unoptimized games. The fact that you get around 90 FPS means it's plenty sufficient for playing them, but if the devs optimized their game, you'd easily see much bigger numbers - probably double or triple what you're seeing. Of course, that would invalidate your benchmarks. You should be calling out devs who make games that perform badly, not hardware manufacturers.
I haven't been into new tech in some years (I still use my Ryzen 5 3600XT and a 1070 on a 1080p monitor), but honestly, it scares me how fast components have gotten recently. Pushing more than 500 FPS in games is crazy to me.
@@sharzo7728 They did compare the power consumption to the 7800X3D and the 9950X, at least for productivity, and really didn't hide the fact that these chips pull less across the board. I don't know what you are looking for.
@@1Percentt1 In gaming it would be worth buying a 7800X3D, but in productivity the U9 285K beats it across the board and is neck and neck with the 9950X. The 9950X can draw up to 194-200 watts, while the U9 285K uses a fraction of that wattage. I can't say anything about the new 9000X3D CPUs, which haven't released yet. Conclusion: the U9 285K is a fast but lower-power CPU with AI capabilities. Hopefully Intel can build on this and create a CPU that's more performant for games rather than production.
@jeppe1774 did you watch the full video and read my comment? My comment was talking about gaming. But even then, the u9 was pulling 220w against the 9950x's 200w. It's not using a fraction of the power, it's using more power. And the 9950x is also faster at gaming.
Idk, I'm still running an i9 9900K + 3070 Ti, and another PC with an i9 11900K + 4070 Ti. It's completely enough for gaming; I don't see any reason to upgrade for the next 3-4 years. It would only be a waste of money.
When I got an i9 9900K 4 years ago, that was the best PC decision I've ever made. This thing is still on top and I have no desire to upgrade to these new generations. I have yet to run into any performance issues on this platform despite the processor being 6 years old. Intel really hit their peak with this one.
I also have a 9900K, but editing 4K video is already slow. I was waiting for this processor, but now I'm at a loss. Maybe I'll get a 14700, hoping it won't degrade )
I like this minimal-production out-in-public video; it reminds me of the first Linus video I watched over a decade ago (overview of the MSI P55-GD65)
13:04 Have you noticed any disparity between the software reported CPU package power, and actually measuring power consumption at the hardware level? I know it's not possible to isolate just the CPU's power, but you can isolate power consumption to the motherboard by using hardware interposers like how GN is doing it now (which includes CPU, RAM etc) and measure that. It means you should have far more trustworthy numbers compared to how Intel and AMD self-report power consumption at the software level. They also use different methods internally for reporting power, which can lead to apparent power consumption differences between vendors.
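For anyone who wants to sanity-check the software-reported numbers themselves: on Linux you can read the same package energy counter the overlays use straight out of RAPL via sysfs. A minimal sketch (the path and sampling window are just illustrative, and since this is still CPU-self-reported, it won't catch what an interposer catches):

```python
# Rough sketch: sample the RAPL package-energy counter twice and derive
# average package power. Still self-reported by the CPU, so it measures
# the same thing the software overlays do. Usually needs root.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 domain

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_energy_uj(), time.monotonic()
time.sleep(5.0)                              # sampling window
e1, t1 = read_energy_uj(), time.monotonic()

# The counter wraps at max_energy_range_uj; ignored here for brevity.
print(f"avg package power: {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")
```

Comparing that against a wall meter or an interposer is exactly how you'd spot the kind of vendor-to-vendor reporting gap you're describing.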
The naming scheme still works the same way. It just has the word "Ultra" probably to emphasize the new NPU. This is the 2nd generation of Ultra processors, hence the 2xx. The K and F suffixes still mean the same thing, the tiers are the same, and the xx part works the same way still. I find it hilarious how people see a different word and their brains turn off.
This looks like it was filmed at an airport. But the crazy thing is how good it looks and how good the composition is. For something that appears to have been an improvised film shoot, it's pretty good production quality.
It would be foolish to trust Intel after they tried to gaslight us for an entire year about their unstable processors. Many enterprise customers felt that firsthand.
I'm disappointed in Microsoft and Windows 11. Zen5 and Arrow Lake perform amazingly well in Linux. Microsoft is 90% of the problem for recent CPU generations
Intel and AMD are busy lowering cpu power draw just for Nvidia to offset it by increasing gpu power draw
Server/Enterprise market wants lower power draw.
@@SomeUserNameBlahBlah As a laptop user, I want it too
@@SomeUserNameBlahBlah Right, only the server/enterprise market.
Even scratching the high end nowadays costs way too much in electricity in loads of countries, to the degree that it's almost worth thinking about getting a laptop or console instead, simply because the savings in electricity over the next 2-3 years literally pay for a laptop, and within a year for a console.
They've simply reached a point where they're dangerously close to not being able to sell their products anymore - both GPUs and CPUs are in dire need of efficiency now - and game devs are in dire need of optimization, especially after some smaller dev teams figured out that UE5 can run perfectly on older hardware (and look far better than UE4) if actually used properly (while about 95% of releases strangely hide behind excuses) - and those that do it properly still call their games unoptimized, as there's plenty of room to improve.
Sorry to tell you, but no. We got a ton of power that requires a dumb amount of cash to run; now they can work on efficiency to make it worth it.
Power draw is starting to be a legislation issue for prebuilt PCs. Countries are trying to hit their green benchmarks and were running the numbers on how much power everyone's computers were taking. AMD and Intel had fought for performance numbers by just continually cranking up the wattage, and it was out of control. A bunch of countries said that if they didn't get it under control, they'd start adding taxes for going over a power budget or just outright banning sales of some SKUs.
The PSU mafia might be behind this.
POV: Linus talks to you about how disappointing Intel’s new processors are while your flight is delayed
(me: sips on a coffee to-go, quietly nodding)
@@aJazzyFeel HAHHAHAHAHAHAH I'm doing exactly that lmao. Broo that ambience is 10/10.
My flight got cancelled this morning, he's going to be talking about the Pentium 4 by the time I'm free
Lmaoo hey MysteryMii
@@isaacdantzler YOU FOUND ME!
This reminds me of when Intel was beaten by Intel. Back in 2021, when the 10900K was beating the 11900K.
@@Мирич-з4е Yup. I literally have an 11900k because it was over 20% CHEAPER than the 10900k at the time. Can't wait to upgrade to AMD's 9950x3D, I'm done with Intel 😂
It's NetBurst all over again
@@jrodd13 Recently I got hands-on with a 7nm Ryzen; it's much, much cooler and more powerful than a 14nm Intel. The feeling was like "I've been cheated by Intel for quite a few years now"... I guess the feeling is somewhat similar to yours.
The 11900K is in the top 3 worst CPUs Intel has ever released
@@AstroBot-bc0213 The CPU was good. The timing was bad. If it had come out 3 years earlier in 2018, when 14nm was still acceptable, it would have been great. AMD by that time (2021) had already moved to 7nm.
"Hey Linus, embargos coming up"
"Ah crap, I'll just film it in the airport"
I was just thinking “is he at a micro center? Why does it look like an airport” but no it actually is an airport lmao
does that take away from his professional delivery? I don't see a problem here.
@@SatongiFilms No, and nobody said it did? It's just a little funny.
Looks like a green screen to me.
Beats flying to Taiwan and waiting for rain.
Riley's commitment to the bit is top notch. Never let him go.
Yes!
I love riler murdered.
@@AffectionateLocomotive That's quite the double-typo.
This top chip feels too complex for Intel to pull off rn.
@@nottimothy5994 yes.
The Riley interlude to deal with an unexpected change in your testing results is truly the best way to hit those review curveballs.
Great job on the audio whoever mixed the episode. I can’t imagine how bad the raw feed was with all that background airport noise.
Maybe they are using RTX Voice?
@@halisturk8348 It's possible, but generally RTX Voice is used to emulate professional audio setups
They actually have the professional audio hardware specifically for events and media, so it's fully possible they didn't even need RTX Voice
@jormungand72 this is clearly not dubbed…
pretty sure they're probably just using RTX Voice
AI tools are amazing.
I do love this clear naming - "Core Ultra 200S" - WAYYY better than the annoying understandable clear naming they used to have
Yes. But also everyone who got used to the old one is pissed now XD
Edit: I was being sarcastic
why can't we just have gen 1,2,3,4, ... again?
@@Mafa0001 yeah, the best looking name was the i7-8700K!
Won't improve performance unfortunately
@@Mafa0001 he is being sarcastic, the older naming scheme was clear and easy to understand.
All I hear is that the R7 5800X3D was an amazing investment
So true. In fact any of the X3D CPUs are a great long-term investment.
I never regretted the 5700x3d and I'll use the money I saved to get a better gpu.
Amen, brother
Well upgrading to a 5800X3D on a dead platform sure is looking like the best "wrong" decision I've made 😅 RIP intel.
Still plenty of life in it, at least until Microsoft updates Windows to require a newer chip sometime after 10 is done.
@@bionicgeekgrrl ya windows 12 will probably not support core 9 285k 🤣
I recently went with the 5950X because I also wanted better performance in workstation tasks.
I thought 270€ was a good deal considering the AM5 motherboards cost 180€ and the CPUs even more.
@@bionicgeekgrrl There is no Windows after 10. Linux all the way babyyy
5800X3D going strong after all these years in the same socket- flawless design
We could stop processor development for the next 5 years, and it wouldn't be enough time for game developers to catch up on optimising, learn how to make games efficiently again, and take advantage of where we are at now.
Nah, they’ll just pump out more games every year with barely any optimization.
@@wyterabitt2149 indeed, it seems physical shrinking is at its limit. Most IPC gains are coming from better and smarter architecture choices
It is more cost effective to just dump more games rather than to stop and fix the previous one.
@@fujinshu the easy, mundane task of optimizations. game devs are so lazy
It seems like most AAA devs these days just hope that upscalers can pick up their optimization slack. I think that side of the industry is long due for a big shakeup.
Well, I'm NOT a millionaire by any means, so I might end up with a mid-sized processor for the new PC I'm building; I just need a good graphics card and Office/Win... any tips would be appreciated, folks!
Bro, you don't need much for a sweet PC, go with the flow: a Radeon graphics card and the rest from BNH Software, that's it 🤙
I appreciate the tip, thank you.
Depending on what you mean by mid-sized processor: if you want gaming performance, it's a better idea to go for the 5800X3D/7800X3D. The prices on those are likely to drop significantly as the 9800X3D has just released, and if you want gaming (assuming you do, because you mention a good graphics card) they'll be the best performance/cost ratio you'll be able to get. Both CPUs perform very well (you can see that the 5800X3D is second in many of the gaming tests in this video despite being two generations old at this point).
As for the best GPUs for the price, AMD cards are usually better for that - although it's always worth doing research, as you can quite likely find a sale/price drop: early next year the 50 series of Nvidia cards releases, so the 40 series (the 4070 especially being the midrange choice) will likely drop in price
Well, you are in luck: You are not buying a mac, so you don't need to be a millionaire, PC hardware is very competitively priced. You get a lot of power for very little money.
Go check out a more or less complete last-gen system if you are on a budget: it's a really good way to squeeze out more performance for the money, and the only thing to consider is a GPU upgrade in a couple of years.
It's going to be fun to read how UserBenchmark will spin the Core Ultra CPUs.
they already did on avg. bench.
@@WyattClark-hq7no It's probably going to be 70% on insulting AMD as usual
@@milksalmon UB says 285k is +17% over 9950x i think
@@thefriends5924 I really wonder who's behind that website.
"AMD-sponsored marketing trolls displaying high FPS and incredible 1% lows on all major benchmarking overlays (not ours) shouldn't divert users from the fact that the 7800X3D is a subpar product. Instead we suggest the better value Intel core ultra 245K or even better, the core i3-12100F for 99.999999% of gamers"
Yeah nah, I’m still calling it the i9-15900K
If you really want to piss Intel off- combine the old and new naming schemes together- i9-15285k. Much easier to pronounce & the "285k" end makes it seem like a low-end, wacky i3 variant!
@@benwhite6786 The most annoying part is they could have just called them Core 9 290K, Core 7 270K and Core 5 250K. But instead they decided to take 5 off for no apparent reason.
@@benwhite6786 fr. Ur cooking.🔥🔥
@@theemperorofmankind3739 this would be the way.
@@theemperorofmankind3739 Maybe they'll be releasing updated versions later on with those numbers planned? Similar to Super GPUs?
Buying a 5800X3D on sale near launch was one of the best tech purchases I've made in the last 10 years
Actually very curious what Linus was using for sound and video recording here in an airport; the quality on both ends looks and sounds great.
Wild guess: some Sony full-frame camera with a wireless mic inside his shirt. Reasons: those colors look really Sony-like, and audio nowadays can get crazy good with lavaliers - some really nice ones can deliver 32-bit audio even wirelessly, which could explain the good noise reduction (maybe even a phone recording the general noise to drive the noise reduction). For the picture, I don't really see any kind of strong light pointing at him; maybe a phone with the flashlight on
the usual lavalier mic + the camera they use at the studio since he's traveling to cover the event.
Green screen
It's a green screen, look at the edges
@@DJDocsVideos lmao this is not green screen. no idea where you see greenscreen edges. plus the camera focusing is way too synchronized with his hand movements.
My instinct is that this CPU generation reflects the fact that gamers simply don't buy in volumes that matter in a modern PC market where decent CPUs tend to be good enough for a minimum of 5 years. Both the volume and the margins probably sit with laptops and bulk commercial desktops, where performance has long been over the required threshold but efficiency would make a significant, marketable difference
Especially with how high energy costs are in most of the target market, this might be a huge win for Intel
I'd agree, if only Intel could actually do that, but it doesn't seem like they know what the word efficient means
@@LoFiAxolotl Even if efficiency is what wins this generation, AMD takes the crown. I'm not sure there is a "huge win for intel" in the foreseeable future.
That's not what's happening though. This is a stop-gap before their new ASML machines and chip fab are up and running, and it should be another 12-18 months before that even spins up. They have a whole new architecture planned and coming, but they can only optimize their current designs right now.
AMD is going to be the clear winner for at least another 1-2 years, as TSMC already has its ASML machines in its fabs, spinning out new AMD silicon.
It does feel like they saw power efficiency as one of the biggest factors for Apple Silicon's success and they're trying to bridge that gap. I'm not too down on this since focusing on power efficiency now means less heat which can lead to a higher ceiling on future chip designs.
LTT Wardrobe: Would you like to wear the V-Neck Jumper, Turtleneck or Tweed Jacket?
Riley: Yes
I love the sudden push for efficiency. The focus has been on raw power for ages; we can do with one interim gen where the efficiency boost carries over to future gens.
No choice with what happened to the last two generations
If they didn't push efficiency, those CPUs would now take more power than a mid-range GPU. I don't think they had a choice to keep pushing performance.
Intel can't push more power or they'll get the 13th and 14th gen issues again
Intel ran out of headroom for clock speed, they forced themselves into needing to change and chose efficiency as their next focus.
I prefer seeing these new efficient CPUs to seeing a CPU you need to run on a 35 amp breaker 😂
Say what you want but I appreciate hardware taking power consumption into account. Power isn't cheap everywhere in the world.
Even at super expensive energy prices, the price difference to even the inefficient Intel 14th gen CPUs will take years to recoup, and that is before you take motherboard cost into account.
And the AMD CPUs draw way less power and don't need new motherboards.
@@MajinOthinus I'm aware; I'm using a 5800X3D myself and bought an RTX 4070 specifically for its low power draw undervolted. I just think that it's generally good if technology uses less power.
@@MajinOthinus The power difference is insignificant.
I agree, that's why I'm going AMD this time around.
Tens of watts might not matter too much for the electric bill, but they certainly affect system heat and longevity.
I like that little pulsing mark next to the thing you're talking about; I found it very helpful
This generation feels like it’s for OEMs and not enthusiasts. I’m sure Dell, HP, etc. will love that it’s easy to cool either AMD or Intel CPUs while still getting reasonable performance.
With that said, I don’t think I’d ever choose any of the Intel 2xxx series over the AMD options.
People expect Intel to pull off a brand new CPU that beats AMD... They should already be happy with what they got. You don't need to buy them; buy AMD's CPUs, which are gonna get a price increase
While I was looking at the stats in the video, my first thought was "who would even look at buying a newer Intel CPU after seeing this"; the AMD Ryzens look way better in the benchmark tests.
@jeanytpremium the crazy thing is how quickly the tables turned on Intel. Who's to say they can't swing back in a couple generations with this new architecture? Then again, the new Ryzen architecture seems mighty, so who knows!
Lower power also means such OEMs can save on the PSU design for systems where they don't ship a powerful GPU.
Yup. Gamers are small fry. OEMs go into all industries globally
Whomever thought of the newscaster bit to correct the information needs a raise. Like, a fat one. Genius. Do it again.
More than anything, I'm grateful they added that bit to correct the info at all! It shows honesty and transparency beyond something simple like putting text over the screen to visually override what Linus was saying. This is exactly the sort of content we were hoping would be produced after the LTT review scandal, so props to the team for sticking to their promises, and thank you!
Riley is peak 100% committing to bits and it sells.
Shut up, Riley!
Whomst'd've
9:37 ahh yes, time (measured in meters)
@@cuttlefish8184 I mean honestly it does make sense (… in a vacuum). Though if it’s in meters it’s gonna be a real small unit lol
@@mduckernz It's minutes haha
You jest, but in certain fields of physics time is indeed measured in units of length.
Riley's newscaster segment was hilarious.
I would be kicked out of any studio where they're recording Riley. He is such a great standup comedian in the shape of a techtuber.
I'm very happy I went with a 7800X3D build two weeks ago instead of waiting for Arrow Lake. Coming from a 4th gen i7-4960X, it's absolutely blown me away. And my 4960X could run basically anything new, but now I see what it's like to be high end again.
7800X3D has 12600K multicore performance. The only high end you are experiencing is gaming IF you have a 4090.
@@JordanJ01 And yet it plays every single thing I throw at it on ultra with my 3070, including Flight Sim. Imagine using your personal gaming computer for work tasks💀 But I get it, you want to justify your 600 dollar Intel CPU, so go ahead. I'll keep my top tier 7800X3D and maybe upgrade to the even topper tier 9800X3D in a few years, or the toppest tier 11800X3D a few years later. All I can say is I am SO GLAD I passed on the 14900K and went with the 7800X3D. More power, cheaper, cooler, and I'm not some Hollywood movie producer, so... top tier 7800X3D with my 3070 it is.
@@KartKing4ever Yeah, no, ignore that clown, 7800x3D is the best CPU money can buy these days
@@KartKing4ever X3D CPUs are the best. Great gaming and good price.
@@JordanJ01 So? If I wanted great multicore performance, I would just buy a Threadripper or Epyc. Then I could run a 1-million-pop Cities: Skylines 2 map.
Man, that 5800X3D! It put a huge smile on my face!
14:20 Lol, this is hilarious considering we are comparing this launch to 14th gen, which literally had a 12% power reduction at best and a 0-2% performance increase. This is still a lackluster launch, but a 40% power reduction is definitely better; depending on your use case, you may not care. All to say, Intel has been in a sad state of affairs for a while, but hopefully they can turn this into "Zen 1" momentum. Then again, there is not much lower Intel can go...
Zen 1 was cheap though (CPUs and mobos)... Arrow Lake on the other hand... 🤣
_"yeah, about that..."_
- Intel's inner thoughts during every launch for the next 5 years, probably
Nowadays, AMD is the best choice for anyone, because the latest AMD CPUs use the AM5 socket, which will still get years of support going forward. Back in the 2010s professionals would recommend Intel because of reliability, but this is absolutely not an issue anymore. AMD has become the better of the two giants, with of course significantly lower prices.
@@FCNeil I agree, except for the rare use cases that need Intel Quick Sync. I have a Plex server with an Intel 12900K simply because I need Quick Sync. My personal PC has a 7800X3D and I love it!
AMD has been the no-nonsense choice for the last 5 years; it was even better earlier, because you were paying Intel 2x for the same perf. Intel caught up with the 13th and 14th gen processors, but not by much. Intel is only saved by OEMs pushing Intel CPUs; otherwise I don't think anyone would prefer Intel over AMD today.
@@Xxshadowman11xX AMD is catching up with Plex transcoding. I use my 5700G and can get four to five 4k -> 1080p streams at the same time without any hiccups.
@@itsmemac Yeah, you can always brute-force streams with multi-threaded processors, but Quick Sync is still preferable for the quality per watt. You're able to have a couple dozen streams with Quick Sync while only using 10 watts or less, which is crazy efficient for this particular process.
@@Xxshadowman11xX I think he's referring to Plex Media Server on Windows, which does in fact support AMD GPUs, which would probably include the iGPU of the 5700G
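For anyone who wants to poke at Quick Sync outside of Plex, here's a rough sketch of the same kind of 4K HEVC -> 1080p H.264 transcode driven from Python (assumes an ffmpeg build with QSV support; the filenames and bitrate are placeholders, not anything from the video):

```python
# Rough sketch: a 4K HEVC -> 1080p H.264 transcode done entirely on
# Quick Sync, so decode, scale, and encode all stay on the iGPU.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv", "-c:v", "hevc_qsv",   # hardware decode
    "-i", "input_4k.mkv",                    # placeholder input
    "-vf", "scale_qsv=w=1920:h=1080",        # hardware scaler
    "-c:v", "h264_qsv", "-b:v", "8M",        # hardware encode
    "-c:a", "copy",
    "output_1080p.mp4",
], check=True)
```

Keeping the whole chain on the iGPU like this is where the watts-per-stream advantage comes from; copy frames back to the CPU for a software filter and the efficiency win mostly evaporates.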
I love that I bought into the Ryzen 5800x3d.
I don't have many games worth playing because the industry sucks, but I can still feel confident that I bought a good product that has consistently outperformed newer products from its competitors for 3 years running!
The product lived up to the hype. Thanks for helping with my decision, LTT and all the others in the community.
TLDR 5800x3D is still the GOAT
Right. Imagine making a CPU that neither your own successors nor the competition can beat.
still the GOAT in gaming*
@@Sithhy nah, it's still the GOAT across the board; performance in productivity still isn't enough better to justify upgrading.
@@HerbaMachina It depends: if you don't care about gaming and only care about productivity, then it's better to look at both Intel's and AMD's higher offerings.
And unlikely to change any time soon, unless the new X3D chip is bonkers value
core ultra 200s sounds like if a german car made a phone
more like a samsung phone name 😂lol
Sounds more like if Chrysler made a CPU
@@watson1212 apple fanboy spotted, opinion rejected
@@watson1212 you mean like in the apple watch ULTRA? even be stealing names lol
Definitely has a modern Mercedes naming scheme to it
Wow! Really loved Riley's cameo. That interruption was so fitting. Well done
OMG I loved Riley doing the news segment thing. If you need to do any addendums like this in future please bring him back!
I read the title as 2005 at first and thought to myself - neat, some old-school review. Boy oh boy, was I surprised by the actual video
What can be misread as 2005? Did you get a different title?
@@malucart 200S could be misinterpreted as 2005, I can definitely see that happening.
@@malucart oh, the video had a different title before. It had 200S core, so I misread it as 2005 core
@@DoubleDragon5180 The title says "Core Ultra 285K & 245K" for me, so you got a different title
The 285K launch is barely any better than the Pentium D's, so like yeah... Intel has fallen badly
Thumbs up for Riley. Also, thanks for looking out! Sounds like next generation will be able to increase the power to get the performance again.
When my 13700K started acting up with the now well-known Intel-gate of 2024, I switched to AMD. The 7800X3D, to be precise.
My gaming performance significantly increased. Only my synthetic benchmarks decreased, but real-world performance you could say "doubled" (as I had to downclock the 13700K so much for stability that it should no longer be considered a 13700K).
Another added bonus? The POWER BILL.
The wattage my PC draws is significantly less. The 13700K would happily draw 200+ watts in ANY game/activity other than being a desktop YouTube sim (and even then it would easily go to 100-150 watts).
The 7800X3D? 78-82 watts. Yes, that's full power, with better gaming performance across the board.
Being a grown adult who has bills to pay, while energy prices are doubling yearly, yes, this is significant.
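For anyone curious what that wattage gap means in actual money, the back-of-envelope math (the 200 W / 80 W figures are from above; the 4 h/day load and 0.40 EUR/kWh tariff are my own assumptions, so plug in yours):

```python
# Back-of-envelope sketch of the yearly saving from a ~120 W lower draw.
old_w, new_w = 200, 80                 # 13700K vs 7800X3D, per the comment
hours_per_day, eur_per_kwh = 4, 0.40   # assumed load hours and tariff

kwh_saved = (old_w - new_w) / 1000 * hours_per_day * 365
print(f"{kwh_saved:.0f} kWh/year -> {kwh_saved * eur_per_kwh:.0f} EUR/year saved")
```

That works out to roughly 175 kWh, or about 70 EUR a year under those assumptions, before counting the smaller GPU/system overhead.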
Did you experience any crashing or stuttering after swapping to AMD?
@@theroyalcam Not OP, but I can tell you that crashes for me are way more likely to be caused by my GPU driver or the game itself than by the system having a complete meltdown due to the CPU. However, trying to be as unbiased as possible, the YMMV part comes more from the motherboard manufacturer and how mature their BIOSes are; I'm using a Gigabyte Aorus Master and I've had to roll back before due to BIOS bugs.
@@theroyalcam You probably won't experience any crashing or stuttering issues after switching because of the CPU; if anything occurs, it is most likely caused by the RAM, motherboard, or other components.
@theroyalcam Nope, none at all. Same GPU. Had to switch motherboard of course, and RAM due to compatibility. Same SSDs, SATA and NVMe. No stability issues, no crashes, no blue screens. On-demand performance for so little power.
@@avenage When diagnosing the horrendous instability I checked EVERYTHING. RAM, GPU, drivers, SSDs, software. To no avail. Clean installs, multiple times. The only thing that helped was downclocking the 13700K to what you could call 13500 performance levels...
12:04 For AI workloads, you should include YOLO for the vision models and 1B-3B-8B LLMs for the language tests
@@kunjs agreed, LLMs are a good test for scaling
YOLO is a vision model, not a platform to run models. Both OpenVINO and WindowsML can run YOLO, for example.
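To make that concrete, a minimal sketch of running an exported YOLO model through OpenVINO (the IR file and input shape are placeholders - you'd export your own model first - and you can swap "CPU" for "GPU" or "NPU" to target the iGPU or the new NPU):

```python
# Minimal sketch: load a YOLO export (OpenVINO IR) and run one inference.
import numpy as np
import openvino as ov

core = ov.Core()
print(core.available_devices)                 # e.g. ['CPU', 'GPU', 'NPU']

model = core.read_model("yolo.xml")           # placeholder IR file
compiled = core.compile_model(model, "CPU")   # pick a device from above

frame = np.random.rand(1, 3, 640, 640).astype(np.float32)  # dummy input
results = compiled([frame])                   # one inference request
print(results[compiled.output(0)].shape)      # raw detection tensor
```

Looping that over each available device and timing it would give exactly the NPU-vs-CPU comparison the AI benchmark section was after.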
We went from Intel fighting with AMD to Intel fighting itself and losing bruh.
You guys missed the chance for Linus to show off the commuter backpack since Linus is at the airport
I’m actually glad to see a clear return to Tick-Tock type roadmaps. Not every release needs to be exciting. Refinement releases are valuable, too. This product is a big pivot for Intel; it was never going to be an absolute win. This is basically their Zen 1 moment
How is this a tick-tock? It's a new socket. Tick-tock is a new platform with more power, then the tock for efficiency. We got a tock and a new socket. Not maintaining socket compatibility for more than two generations is really stupid.
@@MrPatateHead You're being pedantic. The higher-level idea of Tick-Tock is that you alternate between performance considerations and refinement ones. You're lost in the weeds and missing the forest. But last gen was your Tick from 13th, if you want to look at it that way? However, that's a false equivalency, because Intel hasn't been doing that pattern for ages now. Every gen has been attempting a "Tick" up in performance.
Tick-Tock also is not tied to socket; never has been, really.
@@aj0413_ Tick was a new process with the same architecture, Tock was a new architecture on the same process. That repeated for a while but scaling slowed and then they got stuck with 14nm without a contingency, the next future plans were all for 10nm and beyond and we got multiple refreshes before a back port of a new architecture and a watered down 10nm process from the original planned one.
I don't really believe in this tick tock method today. Sometimes you need a major overhaul, and then you can get about 4-5 refinements with smaller architecture changes before starting again. Process is upgraded in a less rigid cadence. AMD essentially did this, new architecture with Zen 1, refinements of different parts of the architecture each gen, then for Zen 5 they did a major overhaul on everything, the next 5 or so gens will be refinements and picking low hanging fruit until there isn't much left and they do another overhaul.
Every generation was kind of a tock, and sometimes also a tick with a new process node simultaneously. They didn't port the same architecture to a new process before making changes generally. Zen+ is a close exception with slight changes from Zen 1.
Hot take: I think power-efficient CPUs are the future, and I'm all about it. I hate using my power-heavy gaming rig for anything that isn't gaming, and I'm still on a 12th gen Intel CPU and a 3080 GPU.
Loved the South Park reference around 6:35, still one of my favorite scenes about Canada to this day 🤣
Honestly, I think it's good they're doing this; this isn't the first time we've had next-gen CPUs that were just a refresh
It's also a launch pad for future generations, with a brand new architecture and a strong pivot away from the high-power, low-efficiency i9s and i7s of the past. This is good for the future, even if it doesn't look that great today.
@@rmiah Yeah, just got a Ryzen 7900X, and yeah, it may get 10 FPS less compared with Intel, but it also uses less electricity... I'm not changing that CPU for a while: no overheating, clocking 5.4 GHz at 1.25 Vcore and maxing out at 70 degrees Celsius. 😜
@@peterpanini96 AMD has been way ahead in the efficiency game.. Intel is finally cluing in that people don't want to pay an extra $100 a year on their power bill for a few more FPS than the competitor.
@rmiah But why choose Intel? The problem isn't what they are trying, but that AMD is better at every step. More performance and cheaper CPUs; the motherboards are for sure way cheaper than the new Intel boards; and the Intel brand is absolutely not what it was, especially after the last incident showed terrible support at times, with Intel knowing about the problem for at least a year.
@@ehrlichgesagt863 Why did people choose AMD when Ryzen 1 dropped and it was marginally worse, if not merely equal to Intel's offerings (ignoring pricing)? Competition is good in the space no matter what your preconceived notions about each brand are. And as later tests in the video showed, Intel's new architecture still holds its own for productivity and AI tasks.
Nearly a decade later, Ryzen stands as one of the greatest hat-tricks in consumer electronics of all time. I can think of a few examples that would rival/surpass it, but I still find the turnaround mind-blowing.
We talk about "generational leaps" on an annual basis, but the shift of AMD to Ryzen was a shift that occurred over an actual generation.
Thanks for highlighting the discussed chip AND marking the topic chip! Really clear, and it helped me follow along really well. As a gamer who really won't be going over 1080p and RARELY plays AAA games, I am really here for the power savings. I am compelled by the results, and honestly I expect that while this gen might be a power focus, the next might take what they R&D'd here and apply it to proper performance. Love the vid.
0:38 I thought he said Error Lake
Would be accurate
@@WayStedYou "APO will be default" wasn't enabled on most motherboards…
It better not be Error Lake! Intel's already being sued for the last 2 generations!
It's almost like Intel and AMD brokered a truce to end the power consumption madness and both took a breather this generation to make improvements to the platforms. Honestly, I'm here for it. Tick-Tock baby!
7:50 we need this to be a recurring character
@@zachattack2775 A perfect bit to do an entire TechLinked episode with
no to pedos
One thing that's severely underrated about Linus is his consistency and commitment to quality. The man never shows us his bad days. He's in the middle of an airport, travelling for another video, and he's still putting this quality into his hosting. Major props, Linus. You're truly a seasoned entertainer.
Love seeing Wanderlust Linus filming on location while he waits for a flight. Now THAT’S dedication! 👌🏼
What were they thinking with this naming scheme?
Seems like they're all in on trying to create their own M1 style chip, with similar drawbacks. Maybe HEDT will become relevant for gaming again?
Hopefully. I want more PCIe lanes, too. I ended up going with a Threadripper 7960X for that reason but it would be nice to have cheaper options.
I feel like M1-style chips only make sense in laptops.
To be clear, I loved my M1 MacBook Air and I love my M3 Max MacBook Pro, but that's because they are laptops, not desktops. I would hate any Mac desktop, because when you're plugged into a wall, raw performance matters more than efficiency.
@@dbattleaxe nice *not* to have cheaper options?!
@@Chicanery_Artifice woops. Haha
Damn, the 9950x really is a beast
I should be pleased that my 5800X3D is still relevant after all this time. I quite enjoy an upgrade now and then, but it's just not worth it.
@@oortcloud210 There's no upgrade for this yet; not even the 9800 will make sense. The 5800 has a long way to go.
Soo true, the 5800X3D is like the 1080 Ti of CPUs.
If you feel the want or need to hop up to AM5, the 7800X3D is a meeeeean big brother to the 5800X3D, but at this point it's definitely worth riding it out until some 9800X3Ds or even 10800X3Ds come around before hopping onto AM5. Unless you're feeling that 'build-a-PC fever', then go for it, bestie.
Even if people might not care, because FPS tends to be the main selling point for gamers nowadays, refreshes like this that focus on efficiency are extremely important. They open the door to big jumps in performance in the future by lowering the amount of power needed to achieve the same or better performance than the previous generation.
Why would I care about power over performance when I can always just underclock?
Their competition created both performance and efficiency uplifts.
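If anyone actually wants to try the "just underclock it" route, here's a minimal Python sketch of capping clocks on Linux through the standard cpufreq sysfs files (needs root; the 4.0 GHz ceiling is a made-up example value, and governor behavior varies by system):

# Minimal sketch: cap every core's max clock via cpufreq sysfs.
# Needs root; the 4.0 GHz ceiling below is a hypothetical example.
import glob

CAP_KHZ = 4_000_000  # scaling_max_freq is in kHz, so this is 4.0 GHz

for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(str(CAP_KHZ))

Undervolting usually gets you more efficiency per FPS lost than a straight frequency cap, but this is the one-liner version of the idea.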
this video just shows how good of a cpu the 7800x3d is
Hot take: I don't necessarily think it's a bad thing if we're reaching a performance plateau for CPUs in general. It might be nice to shift the goals from just raw compute performance to efficiency and optimizations. "I couldn't make it faster, so I made it cheaper" should be a valid business strategy.
Nah, I don't care about efficiency, I want performance.
@@MajinOthinus Then it means you're peak brainrot to the grave. Have fun!
@@Buzzttok Have fun with your subpar performance.
@@MajinOthinus I understand the need for performance when running productivity applications, but if you're using it to rot your brain with "gaming", then indeed, have fun.
@@Buzzttok Right? I'd much rather have a CPU that's capable, yet can boost to max frequency on passive cooling, than whatever we have from Intel now. I find it surreal that so many people still buy Intel CPUs when AMD gives you comparable performance for a fraction of the power and thermals.
Can I be an optimist and ask the following question: maybe they are both (AMD and Intel) making major power-budget cuts in the current release so they can skyrocket clock speeds on the next gen? I am probably using the wrong terminology, but I think most will get what I mean.
I think I agree. This is probably setting up Intel's 1.8 nm next-gen processors to make a big splash, and also their GPUs that are supposed to come out soon.
I’m pretty sure both have said that somewhere. This is the foundation for the next generation and with all the headroom they can really push performance.
This is probably the direction the world doesn't want to see Intel go in, but from Intel's point of view, they're probably catering to the smaller audience that values efficiency in both performance and power draw. Don't get me wrong, I'm all for getting a little greener, but did we all forget how stupid it was to run the 13900K and 14900K? Now we don't have to rack our brains over the most optimal or bare-minimum cooler to slap on the newer chips.
But hey, no one is forcing you to buy the new Core 200 series. I wouldn't even hold it against ex-team-blue bros, because this launch is definitely an L.
6:34. A true South Park Canadian.
Someone missed the memo that Kirkland-brand stuff typically has the best value-to-cost ratio.
Hate the naming scheme. As somebody who's on an older 9700K with a newer graphics card, I'm seeing a performance hit in games and looking to update my platform. Anything I pick today is going to be better. Platform improvements and power efficiency are attractive to me. I don't want 600 watts heating up my little gaming area in summer. Currently pulling around 330 watts according to the ROG Thor's display. 60°C hard-line loop.
Besides, my power bill keeps going up all on its own.
I'm not willing to compromise on FPS, but I've really wanted to throw in some PCIe cards for years. Having more than 16 lanes available on the consumer platform would be amazing.
The 9700K is solid. No weirdness in power settings or Game Bar or BIOS trying to cook my chip. When you sit on the same Windows install for several years with a broad range of hardware, software issues compound and degrade and become terrifying. I already have to deal with Razer and other software; I don't want to deal with in-OS software that makes or breaks my CPU as well.
Performance metrics are great, but when are we going to get another solid platform with no BS trickery needed to get it to function properly?
That South Park bit was gold lol
8:35 I love that you were still doing that character.
I really appreciate that you guys included the 5800X3D. I think a lot of people still use it, including me, and this just confirms how good of a buy it was!
That kid's grandma is not happy to see this from the other side.
I was sure Linus was going to say "arrow lake" is more like "arrow puddle"
But "arrow through the heart" was a pretty good line too.
Missed chance to say "Arrow in the knee"
Always informative, love the show. Don’t forget for many of us in colder climes the heat from a PC helps warm the room!
Love the Airport vibe/video! Thanks for the information, too. It's great that I'm set on last generation, this is just... messy.
I love the vibe too.
End of an era 🫡🫡🫡
So instead of watching LTT videos while waiting for the plane, LTT will just do it live for me? Neat!
3:52 Case in point: Cities: Skylines and its sequel are notorious for being horribly unoptimized games. The fact that you get around 90 FPS means the hardware is plenty sufficient for playing them, but if the devs optimized their game, you'd easily see much bigger numbers, probably double or triple what you're seeing. Of course, that would negate your benchmarks. You should be calling out devs who make games that perform badly, not hardware manufacturers.
Another person who learned the word "optimization" and thinks they understand it. Oh goody.
Man, the 5800X3D is such an incredible CPU for gaming.
Doubt Intel will be able to beat that price-to-performance in the coming years, if ever.
Nope it’s already a historic one.
The L2 cache has finally jumped the L3 shark! That used to be how much L3 you got. Next up is L1 hitting 1 MB.
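If you're curious what your own chip reports, Linux exposes the cache hierarchy in sysfs; a quick Python sketch (the index*/level/type/size layout is the standard kernel interface, but what's populated varies by CPU):

# Quick sketch: print the cache hierarchy Linux reports for CPU 0.
# Uses the standard sysfs cache layout; fields vary by kernel/CPU.
import glob

for d in sorted(glob.glob("/sys/devices/system/cpu/cpu0/cache/index*")):
    level = open(d + "/level").read().strip()   # 1, 2, 3
    ctype = open(d + "/type").read().strip()    # Data / Instruction / Unified
    size  = open(d + "/size").read().strip()    # e.g. "48K", "2048K"
    print(f"L{level} {ctype}: {size}")

Run it on one of these new chips and the L2 entries showing multiple MB per core is exactly the "jumped the L3 shark" point above.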
I haven't been into new tech for some years (I still use my Ryzen 5 3600XT and a 1070 on a 1080p monitor), but honestly, it scares me how fast components have gotten recently. Pushing more than 500 FPS in games is crazy to me.
So it's slower but more power efficient in gaming? Isn't the 7800x3d faster and also efficient?
@@KevinEF yep, my 7800x3d consumes 60-70 watts
Notice how they never compared power draw with AMD? Yeah, they tried their level best to save this disaster of a launch.
@@sharzo7728 They did compare power consumption to the 7800X3D and the 9950X, at least for productivity, and they really didn't hide the fact that these chips pull less across the board. I don't know what you're looking for.
@@1Percentt1 In gaming it would be worth buying a 7800X3D, but in productivity the Ultra 9 285K beats it across the board and is neck and neck with the 9950X. And the 9950X can draw up to 194-200 watts while the Ultra 9 285K uses a fraction of that wattage. But I can't say anything about the new 9000X3D CPUs, which haven't released yet.
Conclusion: the Intel Ultra 9 285K is a fast but lower-power CPU with AI capabilities.
Hopefully Intel can use this to create a CPU that's more performant for games rather than production.
@jeppe1774 Did you watch the full video and read my comment?
My comment was talking about gaming.
But even then, the Ultra 9 was pulling 220 W against the 9950X's 200 W. It's not using a fraction of the power; it's using more power.
And the 9950X is also faster at gaming.
THANK YOU for including CIV turn times. Most reviews don't include games like this.
How the fuck did he get audio this good at an airport
It's a greenscreen; Linus isn't allowed in an actual airport since...
@@Fridge_FiendSince what?
@@Flomockia Since he tried to carry a plane and dropped it.
LTT, JTC and GN releasing videos at literally the same millisecond. Oh, the embargoes...
Yeah, I am subscribed to a lot of tech channels in multiple languages... My timeline just got flooded with CPUs :D
LMAO Riley doing the Ron Burgundy was perfect, you need to keep that going.
Idk, I'm still running an i9 9900K + 3070 Ti, and another PC with an i9 11900K + 4070 Ti. It's completely enough for gaming; I don't see any reason to upgrade for the next 3-4 years. It would only be a waste of money.
When I got an i9 9900K 4 years ago, that was the best PC decision I have ever made. This thing is still on top, and I have no desire to upgrade to these new generations. I have yet to run into any performance issues on this platform despite the processor being 6 years old. Intel really hit their peak with this one.
I also have a 9900K, but editing 4K video is already slow. I was waiting for this processor, but now I'm at a loss. Maybe I'll get a 14700, hoping it won't degrade )
I initially thought this was filmed on a convention floor and was confused because I didn't think there were any conventions at the moment.
I like this minimal-production out-in-public video; it reminds me of the first video of Linus I watched over a decade ago (an overview of the MSI P55-GD65).
Feels nicer and more grounded to me. Like a mate telling me the news.
13:04 Have you noticed any disparity between the software reported CPU package power, and actually measuring power consumption at the hardware level? I know it's not possible to isolate just the CPU's power, but you can isolate power consumption to the motherboard by using hardware interposers like how GN is doing it now (which includes CPU, RAM etc) and measure that. It means you should have far more trustworthy numbers compared to how Intel and AMD self-report power consumption at the software level. They also use different methods internally for reporting power, which can lead to apparent power consumption differences between vendors.
I don't think GN has been mentioned even once since last year, and I think they're simply avoiding it, so sadly I assume this won't be answered.
It is possible to isolate just the CPU power.
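For what it's worth, you can read the software-reported package energy counter yourself on Linux through the RAPL powercap interface; here's a minimal Python sketch (it assumes the common intel-rapl sysfs path and ignores counter wraparound, and note this is exactly the self-reported number being questioned above, not an interposer measurement):

# Minimal sketch: sample software-reported CPU package power via RAPL.
# This is the CPU's self-reported counter, NOT a hardware measurement.
# Assumes the common intel-rapl powercap path; adjust for your system.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy, microjoules

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_uj(), time.time()
time.sleep(1.0)
e1, t1 = read_uj(), time.time()

print(f"package power ~ {(e1 - e0) / 1e6 / (t1 - t0):.1f} W")

Comparing that number against a wall or interposer reading is the quickest way to see how far off the self-reporting is on a given platform.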
Congratulations on 16 million subscribers
How good is the 5800X3D? That CPU has to be AMD's 1080 Ti moment.
Lol not Linus recording at what looks like Pearson Airport... Anyone else feel like the ending was more like "No help... I'm being kidnapped"
I wonder what he was doing in Toronto. dbrand thing maybe?
Damn, all these new CPUs show what a golden chip the 5800X3D is, if not the chip of the century.
Arrow Lake is like an arrow in the knee
6:38 *Vsauce theme plays*
R7 5800X3D: “Uh... why am I still here, and at the top of the charts too? Are you guys even trying???”
Intel, "we're saving you loads of power consumption"...Nvidia, "hold my glass"
lol, 750 watts.
AMD for the win.
I loved the APO segment!
(I haven't watched anything else here, cause I already got the reviews from other channels.)
The naming scheme still works the same way. It just has the word "Ultra" probably to emphasize the new NPU. This is the 2nd generation of Ultra processors, hence the 2xx. The K and F suffixes still mean the same thing, the tiers are the same, and the xx part works the same way still. I find it hilarious how people see a different word and their brains turn off.
I also presume the non-Ultra versions will be rebadged Meteor Lake or even Raptor Lake CPUs.
Yeah they said the Ultra was to push the NPU and the idea of the “AI PC”
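Since the scheme really is mechanical, here's a toy Python decoder for the new names (the field meanings are my reading of the scheme as described in this thread, not an official Intel spec):

# Toy sketch: decode a "Core Ultra" model name into its parts.
# Field meanings follow this thread's description, not an official spec.
import re

def decode(name):
    m = re.fullmatch(r"Core Ultra (\d) (\d)(\d\d)([KF]*)", name)
    if not m:
        raise ValueError(f"unrecognized name: {name}")
    tier, gen, sku, suffix = m.groups()
    return {"tier": int(tier), "generation": int(gen), "sku": sku,
            "unlocked": "K" in suffix, "no_igpu": "F" in suffix}

print(decode("Core Ultra 9 285K"))
# -> {'tier': 9, 'generation': 2, 'sku': '85', 'unlocked': True, 'no_igpu': False}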
I want "Ultra" to become what Ryzen 1 was for AMD.
Competition good.
A launch with benchmarks that all cap out below AMD's Ryzen 9 9950X doesn't bode well for Intel. I love it.
This looks like it was filmed at an airport. The crazy thing is how good it looks and how good the composition is. For what appears to have been an improvised shoot, it's pretty good production quality.
The decrease in power consumption is very important for servers
and laptops
AMD server chips are still way better in that space too.
It would be foolish to trust Intel after they tried to gaslight us for an entire year about their unstable processors. Many enterprise customers felt that firsthand.
I'm disappointed in Microsoft and Windows 11. Zen 5 and Arrow Lake perform amazingly well on Linux. Microsoft is 90% of the problem for recent CPU generations.