Intel Core Ultra 285k, 265k, 245k specs and pricing
- Published: Oct 12, 2024
- Intel has finally announced their latest Core Ultra Series Desktop processors! Are they worth the hype?
They need to bring back those old cases with the turbo button, except now they toggle 65/105W power.
It's built into the CPU.
@@BlueRice but then I can't press a zoom zoom button so my brain makes a happy
@@BlueRice Tell me you didn't use a computer in the 80s/90s without telling me you didn't use a computer in the 80s and 90s 😂
Man, flashback from the Intel Pentium days. Case had a turbo button. 166MHz with the turbo on, 133 otherwise.
@@Gren83 Right, the button made the CPU slower. These kids today.
Everybody: listening to Jay
Me: Obsessively watching the PC behind him to see which fans turn on and off.
OCD by any chance??? 😄
Same here 😂😂
I always set my GPU fan speed to the minimum at which they won't turn off in MSI Afterburner; the noise when they turn on and off is so distracting, and it could probably shorten the lifespan of the fan? I don't know.
Same
lol now i’m watching them!
The person who invented that new naming scheme has probably previously worked for a monitor manufacturer.
All hail Iiyama!
🤣
marketing team is doing fentanyl version 2.0
Laptop manufacturer:
"Hold my 16IRX8-82WK00GEMX"
The people who compare that new naming scheme to monitor naming schemes have never looked at monitor naming schemes.
TBH power draw was kinda getting out of hand and is still out of hand with the GPUs, it is nice to see it getting better, we shouldn't need a dedicated circuit to run our computers in our houses.
We've even gone beyond Bulldozer/Piledriver. Crazy.
Yeah, if you're running an Intel CPU, 350 watts. But AMD has had the 80-watt 7800X3D for a long time. Looks like you only buy Intel, so you might need a power plant in the house.
UNLIMITED POWER
@@parm2-x7h i have a 5800x, I was speaking about the industry in general not my own setup ;)
Keeping my 7800X3D for a long time. :)
Lol
Intel: "yeah but they just glued cores together".....
Intel: We glued cores together.
Intel: We glued tiles together. Tiles is new sexy thing!
Tiles have a greater density of interconnects so they have a tiny slice of bragging rights - but as you suggest, certainly not enough to excuse their past criticism
Intel: We glued Chiplets together and called them Tiles. Now we gonna' glue Tiles together. [Or something.]
I hate this way of creating multi core systems. It’s just bad. I don’t care if it’s easier to manufacture, it’s just bad
AMD: We created FX! It's 1 core per thread.
Intel: Haha! We can do multithreading and we're faster!
AMD: We have Zen and it's multithreading capable with 2 threads per core.
Intel: We took FX and cloned it...
AMD: Um... 😳 WTF!?!
Thank you for pre-mashing my keyboard for me. Lower power consumption helps reduce the ridiculous cooler sizes, and helps noise and form factor, i.e. it helps make ITX viable because it makes low-profile coolers more viable. I like these developments from both Intel and AMD.
Small form factor is fun, and to really make something small, the cooler is never optimal. Power supplies are also lower capacity. In that world that often involves undervolting and heat management, these new improvements are huge.
Not to mention portability and how much it will help battery life on laptops!! Before, they had to have their own "M" series, but some laptops still use desktop processors and suffer greatly from power draw in battery life and heat.
Intel ditched HT due to 3 reasons.
1) Sidechannel attacks.
2) HT requires doubling the register file among other things - this saves die space.
3) Intel was planning cores that can be partitioned in-situ as part of the Royal Core architecture. Ripping out HT is a first step towards that. It's unclear whether Royal Core will ever come to market as planned though, as the project wasn't completed before Jim Keller left.
And probably they'll do a "reduced x86 instruction set" as the main instruction set to compete with ARM.
@@BozesanVlad X86S lies somewhere in the future. Probably well past 2034. Also, the instruction set size of x86 isn't really that much of a problem.
Modern instructions hit a dedicated fast path and the old stuff remains supported through microcode mappings, but isn't emitted often by compilers anymore anyways.
CISC applications tend to be smaller and require less I$ and I$ bandwidth btw., because loads and stores are most often an implicit part of other instructions instead of having to be explicitly added by the compiler.
At the same time, loads/stores can still be optimized, i.e. performed out of order and/or batched, once the instruction has been broken down into microOPs inside the CPU.
Just hope the new X3Ds don't disappoint.
X3D never disappoints.
As long as you don't expect a huge uplift, it'll be fine.
@@cizzymac Seems there is a large internal change in the 9000 series for L3 cache; we'll soon see what AMD is cooking there 😉
Depends if Microsoft doesn't break the task scheduler again to suit Intel CPUs like they did with Intel's E-core bollocks.
@cizzymac How? Zen 4 to Zen 4 3D is 20% faster in gaming while having 10% lower clocks.
It really is the battle of the Ultra 2,85% vs AM5%
CPU meme fight xD
Thats zen 5%
Not am5%
At least Intel is supposedly hitting the same performance at half the power, their main improvement being efficiency. I can't say the same for AM5 😅 seems like the same old AM4 on a newer node.
-2,85% ***
@@slimjimjimslim5923 Is that a joke? Comparing AM4 chips to AM5 and saying there's no difference is crazy. You might mean Zen, and even then, Zen didn't need a 50% drop in efficiency; it was already operating at like half the TDP of Intel, and that gap is still there — Zen is still more efficient. The Ultra 5 245KF has basically the same TDP as the 9900X (at about 120 W) with max socket power of about 160 W. That's a huge difference in performance for the same power draw.
I like the direction they are going with the efficiency, not only because it saves power, but because that is a passive way of battling heat, which in a way directly translates to less throttling and more stability and durability. If we have learned anything from the 2 previous Intel gens, going all-in on power for the sake of performance is not that wise.
Lol, so if you are a gamer, you are paying a lot more for the new Intel CPUs while getting less performance? What a joke. The 9800X3D is the only way to go. I couldn't care less about productivity benchmarks.
Hmm.. I like durability as an argument for a new CPU generation. Wish my Intel Q6600 from 2007 that i still use to play old games had that!.. Are you for real?
@@RawBejkon I am for real, in fact I wish GPUs would adopt the same direction. Power draw is getting insane. a 1500 watt PSU? No thanks.
Yes and no. Yes that Intel's previous generations of CPU are using too much power and producing too much heat that it becomes a problem, so yes they needed to lower the power consumption. But at the same time no, its a fail because they had a node advantage over AMD and this is the best they can come up with? Heck, they probably would have similar results if they made 14th Gen CPUs on the new node to reduce power consumption while maintaining the same performance. Efficiency is great if there is performance increase. This is the same problem with AMD, going for efficiency for little to no gains with Zen5. That's why people complained and AMD released a 105W option.
The problem with Intel is that it is using a newer node, had the advantage and still couldn't win. In fact, it is still going to consume more power than AMD counterparts. Not to mention X3D chips will retain the lead, and 9800X3D will pull further ahead when it comes out. Intel needed efficiency AND performance improvement in order to compete. Unfortunately it couldn't achieve both.
insane excuses...
I'm still happy with the performance of my AMD Ryzen 9 5950X CPU.
Banger CPU
Same even so I would only go to 7800x3d if I changed
that gem with an rx 7900 xtx gives me a good 100+ FPS in 4K w/ Escape from Tarkov. even streets of tarkov on a good day sticks around 90 tops, and 70s low. everything else, even new games run like butter in 4K. got zero reason to upgrade
Little reason not to be, honestly.
I managed to get mine working on x370 mobo. I'll keep this system forever
4:21 Some caveats to this, because in some cases you actually get a performance increase when disabling hyperthreading because the core is not split between running two tasks
I wonder how it'll compare to 14th gen with HT disabled
I'm actually really glad to see a focus on power efficiency. It takes concern away from the power budget for a power supply and graphics card and especially cooling methods. On one hand, I wouldn't be surprised if they kept on their path of more performance and more power draw, but there's no way that would've been sustainable, so on the other hand, I can completely understand how this makes sense and hopefully be their best step moving forward.
I’m convinced as technology gets more advanced, games get less and less optimized and devs go “oh well”
its been that way for a decade already
so i got a 7900xtx to tell them oh well lol
I did a 7800X3D build in August, I'm good for a few years
...done dirt cheap.
The song is a classic Jay. It doesn't date you
I'm riding the efficiency train. Same performance for less wattage is a great thing to see. Over the years we have seen the average wattage go up and up... I would really enjoy GPUs getting less power hungry now. 3-slot cooling? Seriously, that's hilarious.
Said this when the whole 13th/14th gen debacle broke. We're finally starting to see the limits of silicon as a processor medium IMO.
More a limit of a certain methodology of processors in the x86 architecture. Chiplet design will bring benefits that the previous methods didn't allow, but this iteration is going to make it seem like Intel is taking steps backward too.
I remember this being bleated out for what? 4 years during Pentium 4?
5+ years of Core i?
3d cache is coming to Intel right? That's going to open the floodgates for gaming performance. Zen5 has some latency issue in their design.
@@tim3172 Exactly. And at some point it will become a reality. You cannot deny clock speeds have practically stalled and there's only so many ways around that.
@@tim3172 I think the argument holds more weight this go-around, as it isn't just CPUs; GPU generations have significantly increased power consumption to maintain their generational improvements.
The problem I have is that they cancelled the refresh which would be released next, so this is a dead platform, along with a dead socket
AC/DC's "Dirty Deeds Done Dirt Cheap", a classic song.
First Gen Zen wasn't a chiplet design on the desktop. It was a single die, a die that was designed to be used in multi-die designs for servers and obviously Threadripper.
Most of the growing pains in Zen 1 were due to core complex design, each die had two 4-core CCXs which communicated via Infinity Fabric. In Zen 2, they doubled the L3 cache and moved to multi-die designs with a dedicated I/O die and separate Core die(s). Zen 3 then joined the two 4-core CCXs into a single 8-core CCX.
Since RGB gives you at least 30% performance boost I wonder if they will now make AI controlled RGB :D
dont give them ideas
I have a 8700k system that's about 6 years old at this point. Which still works just fine for productivity stuff. I've been really interested to see how Arrow Lake stacks up, but maybe I'll wait for the second generation of these chips.
Humans can’t bite down, they bite up.
But can they ByteDance?
As long as you keep your lower jaw in the same spot and move the rest of your head to it, you are biting down.
@@drdrenack Nah. Your lower jaw presses stuff against the upper jaw; that fact doesn't change even if you try to move with it.
@@AsheramK If the lower jaw is immobile, that would not be the case would it? The only reason the lower jaw presses is because it is free to move.
Fun fact
the only way any market share will shift is IF INTEL DITCHES THE DUMBASSERY THAT IS CHANGING SOCKET EVERY 2-3 GENERATIONS SO WE ARE FORCED TO BUY NEW MOTHERBOARDS AS WELL. My 12900KS is the last Intel CPU I will be using, buying, or even thinking of purchasing in the future.
Very few people care about that
@nostrum6410 quite a few people care about it, educate yourself.
@@Gimpy17 You still have 12th, so you don't seem to care either.
If you really have a 12900KS then you do not need to upgrade your PC in about a decade. CPU's last a very, very long time. Most people complaining about that fact either do not have a computer themselves or are extremely wealthy and stupid with their money.
I get what you're saying, but how many people bought a CPU with each release on AM4? If they did 2-3, then what's the difference?
To be fair I care about power efficiency more and more these days. For the longest time I didn't, but with the 4000 series cards, and now the 5000 series cards eating SO MUCH POWER, while everything, including power, is getting so disgustingly expensive, I kind of don't want my GPU to eat 600W. Sure, it's logical that the more powerful the cards get the more power they will use, but we're talking double of what a lot of people considered to be a lot of power draw. I'm just kind of stuck between wanting the best performance, but also crying over the electricity bill; it's good to see someone take the power efficiency approach.
600 watts is like $289 a year ($24 a month) if you're running it 12 hours a day at 100%, which is unlikely, but I wasn't sure how hardcore you are. I used $0.11/kWh as the average cost over that time, as that's what mine is where I live; there's a calculator online for this. Using 300 watts it's $144 for the year, or $12 per month. Moral of the story is it's really not a big difference.
@@toddblankenship7164 $0.20 per kilowatt hour here in my part of Oregon. At over $500 per year, IMO that's not a small amount of money. But that is an extreme case too. The benefit of not having a space heater running against my air conditioning during summer is another argument for me. I still have 10th gen, and recently I took off all overclocks; the experience is pretty similar and I'm using half the power with a way cooler room. What I'm getting at is I understand the efficiency argument, but I'm interested in seeing how far these will go with more power for sure.
@@toddblankenship7164 it depends on where you are. Where i am, the bill is 400 dollars running a refrigrator, a freezer, and a couple tv's. With occasional air conditioner use and occasional oven use etc. 4 years ago it was about 175. Moral of the story EVERYONES EXPERIENCES ARE DIFFERENT.
@@toddblankenship7164 In my area the price doubles between 2pm and 7pm, so your estimate is off for others. P.S. YouTube sucks, my app never lets me comment properly.
@@toddblankenship7164 It's not just about the energy costs though. It's more about having a quieter PC that doesn't heat up your room like a space heater.
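The running-cost math in this thread can be checked with a short script. The rates and hours below are the commenters' own assumptions (a 600 W load, 12 h/day, $0.11 or $0.20 per kWh), not measured figures:

```python
def annual_cost(watts, hours_per_day, price_per_kwh):
    """Annual electricity cost in dollars for a constant load."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 600 W for 12 h/day at $0.11/kWh -- about $289/year, the figure quoted above
print(round(annual_cost(600, 12, 0.11)))  # -> 289

# The same load at the $0.20/kWh Oregon rate mentioned in the replies
print(round(annual_cost(600, 12, 0.20)))  # -> 526
```

Halving the load to 300 W at $0.11/kWh likewise reproduces the "$144 for the year" figure above.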
The biggest concern I have with the new CPUs is the inclusion of the NPUs. When I bought my new laptop I purposely stayed away from any of the units with NPUs due to Microsoft's ridiculous implementation of Recall, the HUGE security concerns that come with it, and the fact that you cannot turn off Recall.
Yep, NPUs will be used to profile users.
Recall is off by default, tf you mean you can't turn it off?
Intel changes sockets too much. Not upgrading any time soon for sure.
It's a pain. I'm waiting for 9950x3d. I put a new socket in for 7900x. Hopefully it is OK.
The fact that the z790 is obsolete with the arrow lake cpus is crazy
I went from Intel to AMD when transitioning from a 9900K to newer hardware a few years ago when the 5950X was relatively new, didn't regret it then & seeing what's happened with the 13th & 14th Gen, and now how disappointing it looks like Arrow Lake will be, quite happy to keep my 7800X3D for a while longer, although I may end up getting a 9950X3D when it comes out as I'm doing workloads now that can actually use extra cores if I get them.
If you are actually doing work, then you might want the current CPUs from AMD. Yes, X3Ds are the best for games, but everywhere else they fall pretty short.
@@RealSugam I use the system for both, which is why a 9950X3D is the plan when they come out. Looking forward to it even though the 9000-series is kinda ass, I like the power efficiency since my system is already a power-sink regardless with all the crap I have in it (literally draws 150W from the wall at idle because of fans etc).
Jay was wrong, Zen 1 wasn't a chiplet design, and they aren't on their 5th generation of chiplet design. Zen 1 was monolithic. Chiplet design came with Zen 2.
That doesn't matter for Jay's target audience. If this is a problem for you, then HU and GN is what you should watch instead.
@@terribleatgames-rippedoff I follow those guys too.
@@terribleatgames-rippedoff I should watch the ignoramuses who whine when a $60 motherboard throttles with an $800 CPU running an artificial power virus in an artificial zero-airflow VRM situation?
Or I should watch the guy who calls anything that isn't the peak GaMeR value (while spouting off about dungeons and dragons) a "waste of sand"?
The 3600/5600? Perfect.
The 3700X? Trash.
3900X? Trash (because GaMiNG and multiple dies)
3950X? Trash.
Ditto all of the 5000 series parts.
He's doubly wrong. Not only was Zen 1/+ a single chip, it was not largely impacted by the CCD "problem" he referenced. Zen fixed its competitiveness with Intel by getting better clock speeds and better memory performance (alongside bigger cache amounts), but even after this happened, the 3300X, which had no CCD communication, couldn't match the performance of a 3600X, and both performed on the level of a 7700K in 4-core workloads.
@@s1mph0ny Zen 1/+/2 had CCX scheduling issues. As an 8 core chip was made of 2 CCXs of 4 cores.
Should've ditched E-cores, or at least made an 8 or 10 P-core-only gamer edition, maybe with some extra cache?
If they were smart, they would have a chip like that and market it like AMD does their X3D chips.
I can see e-cores on mobile chips. On desktop, no
??? E-cores literally help me with my needs; what a shame if you think processors are only for gamers.
@@HyperionZero Cores can always clock down to meet lower power requirements. But they could still make another SKU that's just for desktop high performance needs.
@@HyperionZero Having a model with only P-cores for those who want that wouldn't hurt those who like E-cores; just don't buy that model.
I'm so glad I bought 7800X3D back in July for $380.
Both AMD and Intel did not disappoint to disappoint
Yes for sure! I might upgrade to 9800X3D but we will see, I am currently using the 5800X3D so ;) Might be perfect to actually just skip 2 gen and go for the Zen 6!
nice to see intel has come around to gluing their cpu's as well.
I'm gonna wait. I just spent weeks looking at your videos and learning how to build a mid-tower PC. Found out about Micro Center because of you, and also learned they're only 10 minutes from where I live. Ended up getting an i7-14700K/motherboard/48GB RAM bundle for $510. Then got an RTX 3080 new on eBay for $445.
This will definitely hold me off until the new CPU and RTX 5090 come out. Gives them time to iron out the issues before I commit to my monster PC build I wanna really make.
I’m cautiously excited for the new Intel stuff. I’ve been an AMD guy for a long time, but this definitely smells like their Zen moment. As much as I like AMD, I also like competition so neither company gets complacent.
I want to point out on the multi-threading graph at 6:30 that they mention this was tested at 125W approximately. So the 14900k or 9950X may perform better at higher wattages. I am happy they are improving the performance per watt, but I would be surprised if the Ultra 9 285k took the multi-threading crown.
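The perf-per-watt framing in that graph works out as follows. The scores below are placeholder numbers I've made up to illustrate the arithmetic, not benchmark results; only the "same performance at half the power" claim comes from Intel's slide:

```python
def perf_per_watt(score, watts):
    """Benchmark score per watt of package power."""
    return score / watts

# Hypothetical: identical multi-threaded score at 125 W vs. 250 W.
# "Same performance at half the power" is exactly a 2x perf/watt gain.
ratio = perf_per_watt(40000, 125) / perf_per_watt(40000, 250)
print(ratio)  # -> 2.0
```

The catch noted above still applies: at higher power limits the 14900K or 9950X may scale to higher absolute scores, so the 125 W comparison says nothing about the multi-threading crown.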
I would say power reduction in Intels case is a good thing considering 14900k uses the power of a small sun
in the palm of your hand
13:20 - "But we can't 'get' anywhere in the future, if we don't 'start' somewhere. So if we don't start 'here', then we'll never get to 'there'." - Jay 2024
It seems both Intel and AMD have reached a point where performance can't go much higher, so they're competing to see who can come up with worse naming schemes. Intel seemed to be in the lead with the whole Core Ultra thing, but AMD is actually pulling ahead because they went with a 300 series for no reason other than it's a bigger number than Intel's 200 series.
I'm running a 10700k with a 3080ti - after all of the headache I'm seeing with the 13/14gen Intel processors, I'm heavily being pushed back into the AMD camp. The 7800x3D seems like a very reasonable upgrade. (Trying to keep things on the somewhat cheaper side while still getting the best performance I can)
If you're waiting for a possible cheaper option, I think the 9600X3D could be a potential cheaper alternative to the 7800X3D, assuming when it comes out that it's available to the general audience.
The new core series is a new architecture and does not suffer from the problems of 13/14
I just went AMD, I'm waiting for them to become villains before I turn again
They did that with the 7000 X3D launch. They got tens of thousands of gamers to jump on the almost-double-the-cost 7950X3D by making sure it was released a month earlier, and they knew the core parking/affinity feature wasn't really working for them yet. No company is the "good guy"; it's degrees of what they can get away with. AMD makes good stuff, but they still pull tricks whenever they think they can get away with it. They just haven't had as much chance to do it compared to Intel/Nvidia. Luckily for them, every time they do pull something that on its own would have people really pissed off at them, Nvidia or Intel comes along with something else that takes the heat off.
They totally blew the Zen 5 launch, made enemies with the tech reviewers, and lied about their performance gains. If they aren't already the bad actors, they're certainly trying to milk every last cent from you. When they can't, they drop the price a week later.
Calm down guys, obvious joke was meant to be obvious, not triggering
Hahaha. Maybe that's already happened when they decided they could follow nVidia into the ultra-high price GPU segment.
@@a5cent They literally announced they're not trying the high end anymore. Their top of the line is 1000 while Nvidia is literally double. Come on now.
Sigh. Jay I'm right with you there. I went with the AC/DC song. We've gotten old haven't we? I hope you're doing well.
Why were the fans spinning on the GPU behind Jay? 1:37
Pc is turned on (possibly rendering something) and it's probably configured to have fans completely off until a certain temp threshold
I like to see both AMD and Intel going the route of forgoing some performance for efficiency. We're at a point of diminishing returns with frame rates on modern hardware especially with DLSS, FSR, and XeSS. I mean 75fps in an RPG is pretty damn good and smooth and so many competitive shooters are so well optimized old hardware can get 144+fps on them. The returns for efficiency are much higher than the playable experience in my opinion. Pair that with the 5090 claiming to draw up to 600W, it makes perfect sense. A computer drawing 900W between the CPU and GPU is just insane and completely unnecessary when you consider the point of diminishing returns.
The issues with 13th/14th gen, the brand-new socket needed with a potentially short lifetime (as Intel announced skipping a gen to focus on a completely different SKU), and no real gen-to-gen gain? I'm currently slowly saving to get a desktop in 6 months as I'm finally settling somewhere, and I'm definitely leaning AM5.
Thanks for the video :)
The thing about Hyperthreading/Simultaneous Multithreading, the standard 2 thread CPU core has at least 4 pipelines (4 processing pipes). One pipeline is the primary thread. One pipeline is a secondary thread. Two pipelines are hardware acceleration ASIC threads that are loaded operations from the CPU firmware, for example the code for SSE instructions run on these extra pipelines. All of these pipelines share the ALUs and FPU(s) inside the core. In order to run 4 of these threads simultaneously, there needs to be enough execution units in the core to serve these pipelines. Ripping out hyperthreading allows the CPU developer to shrink/simplify the core to just 1 threaded pipeline and at least 1 hardware acceleration pipeline. This reduces the number of ALUs in each core, as you have fewer "customers per core" to serve.
The next factor to drive Intel to drop Hyperthreading, or at least kill their simultaneous multithreading method, is all the instruction execution security flaws and potentially looking at caching and memory access improvements.
If there are cons to Hyperthreading/SMT, why did CPU engineers incorporate it into the CPU design? The purpose of multiple threads simultaneously running per core is about performance per square millimeter of silicon. Often on a CPU, a single instruction or operation can only utilize 1 execution unit in the CPU in a clock cycle, leaving the rest of that silicon unused. Each CPU core has more than just ALUs and FPUs because there are other non-logic operations to call. Because each core can only execute 1 instruction at a time, there are unused resources. Thus the purpose of SMT/HT is merging 2 CPU cores to share underused resources and execution cores, reducing the landscape of having 2 single thread cores into 1 2-thread core.
The real problem of SMT/HT is something more specific to the x86 series of CPUs, backwards compatibility. In order to maintain backwards compatibility so the next generation of CPUs can run last year's operating systems and games, the same core instruction set and syntax must be maintained. The original 8086 CPU was designed to directly access RAM and didn't consider RISC pipeline staging or multithreading. The modern CPU has to translate the old system design onto newer, more sophisticated structures, and that translation step is overhead.
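The "unused execution resources" argument above can be illustrated with a toy issue model. This is a deliberate simplification (real cores track ports, dependencies, and stalls), and the widths and ILP figures are made-up examples, not Intel specs:

```python
def issued_per_cycle(issue_width, threads, ilp_per_thread):
    """Toy model: instructions issued per cycle when each thread can only
    find `ilp_per_thread` independent instructions to issue that cycle."""
    return min(issue_width, threads * ilp_per_thread)

# A 4-wide core running one low-ILP thread leaves half its slots idle...
print(issued_per_cycle(4, 1, 2))  # -> 2
# ...while SMT lets a second thread fill the unused issue slots.
print(issued_per_cycle(4, 2, 2))  # -> 4
```

This is also why ripping out SMT is attractive when it's gone: without a second architectural thread per core, the register file and issue logic can shrink, as described above.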
DIRTY DEEDS DONE DIRT CHEAP!
Yes, it's a Jojo reference.
kekw
@@alentanor It's an AC/DC song, not a Jojo reference. smh
@@Gimpy17 There is a stand in Jojo named after that song. It's the stand of Funny Valentine, the 23rd president of the USA in Steel Ball Run.
@@alentanor cool, still an AC/DC reference
A video from Steve and Jay on the same day about the same topic? We have truly been blessed this day.
Both Steves too!
Intel: Releases Ultra Core
AMD: Hold my EPYC 9005
I doubt that the NPU will do much. The limiting factor is memory bandwidth, similar to how graphics are always better on a dedicated gpu with high-bandwidth gddr6. What the NPU could do is reduce the power profile of low intensity neural processing, i.e. increase efficiency. But because neural processing isn't already power limited, it won't be much.
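The bandwidth-limit argument can be sketched with a simple roofline model: achievable throughput is the lesser of peak compute and what memory traffic can feed. All numbers below are illustrative placeholders, not Arrow Lake specs:

```python
def attainable_tops(peak_tops, bandwidth_gb_s, ops_per_byte):
    """Roofline model: throughput is capped by compute or by memory.
    GB/s * ops/byte = Gops/s, so divide by 1000 to convert to TOPS."""
    memory_bound_tops = bandwidth_gb_s * ops_per_byte / 1000
    return min(peak_tops, memory_bound_tops)

# Example: a 13-TOPS NPU fed from ~90 GB/s of shared system DDR5 is
# memory-bound for any workload under ~144 ops/byte of intensity.
print(attainable_tops(13, 90, 50))   # -> 4.5 (memory-bound)
print(attainable_tops(13, 90, 200))  # -> 13  (compute-bound)
```

A discrete GPU with GDDR6 raises the bandwidth term, which is the commenter's point about why the NPU's win is mainly efficiency at low intensity rather than raw throughput.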
Dang, gotta wait two more years for Intel to make real moves. I understand the concept of a foundational product to build on later, but two years feels like an eternity.
Also another new platform...
They just have to stop setting themselves on fire right now.
Gaming comparison vs. the 9950X (6:48)?
Content creation / Desktop performance comparison vs. the 7950X3D (7:12)?
It's clear Intel has little to offer this time.
On the other hand: This seems to be Intel's own Zen1-moment and we can look forward to what the future will bring.
If its like Zen.
Gen 2 will be awesome! Skip the growing pains. Wait a year.
The i5 and i7 replacement is what I am interested in the most. While the top is always interesting it is not what most buy. If a mid range chip can max out the appropriate GPU and for testing show good 4090 results we are in good times if the price is reasonable. This big change had to come and it’s interesting for the right reasons.
Intel need to do more to dig out of the hole they made for themselves with the previous gen disaster of inadvertently killing their own CPUs.
They're offering free RMA's and an extended warranty of 5 years. What else do you want them to do?
@jamescarter8311 I got an RMA done. But here is the kicker: for a 13900KF, they gave me a 14700KF and told me that's the best they can do. So I lose performance and money.
@@ryandraj🤣
@@jamescarter8311 That doesn't help if you have to replace the CPU every year and what happens after 5 years? Intel should have done a recall, something is fundamentally wrong with 13th and 14th gen
@@pixels_per_inch Literally not how anything works...
Something is fundamentally wrong with your understanding of the entire situation.
I need more power reduction for more itx cases.
I expect it to be slightly better than the 9950x but when the 9800x3d releases it will get crushed
I wouldn't count on it. AMD has a great history of underdelivering.
@@jamescarter8311 it's true that AMD has a history of underperforming, but so does Intel, so it's a pointless assertion. The crown goes back and forth between AMD and Intel, and has done so for the last 25 years.
I had a 13900K that is now in heaven and switched to a 7950X. After the Win11 fixes, my actual gaming perf is the same or close; those few fps Intel had caused obligatory degradation. I do not trust Intel at all, since the big problem is the ring bus design.
Personally, I'm thrilled with my current 7950x3d and plan on staying here for a while. My husband's 3900x, meanwhile, could use an upgrade....
@@BronzeDragon133 Same, I don't see myself upgrading for at least a 5 - 8 years. This thing is a beast.
BTW, a lot of people were salivating over the RTX 40 series compared to AMD because of "efficiency". But that went out the window when they prefer Intel, and now with Zen 5. People are just pea-brains.
Its gonna be another 1 gen socket. At least with amd's zen I went from 1700 to 2700x to 3900x to 5900x and finally the 5800x3d all on one motherboard.
It is so smart from AMD. The step to upgrading your Cpu only is so much easier than having to change your mobo aswell.
@@legendtoni1094 absolutely, makes me wish i went AMD sooner then this year.
@@Enzo187 Ye, no way Intel would have allowed generational leaps on the same mobo all the way up to the 5800X3D. The kind of leap AMD did is unheard of.
Intel has always made 2-gen sockets since Lynnfield/Bloomfield in 2008, excepting LGA 1700, which was 12/13/14.
AMD has released short-lived sockets like TRX40 and sTR4.
@@tim3172 OP sorta misspoke with "1-gen socket"; they're referring to how with Ryzen you can go from a 1700 to a 5700X and not need a new MB.
Intel still hasnt earned trust back since they fucked us all tying cpus to gpus. I want to go back to using whatever I want in my rigs!!
I hope the Ultra 9 or the 9950x will be my next CPU. I just don't want buyer's remorse.
I can't wait to see the reviewers' actual benchmarks on the Ultra 9.
We going full European vs American with this one - efficiency vs power - maybe both pls?
I'm curious to see what the performance of Intel's new cpu's will be like. But I'll still end sticking with a 9800X3D, I mainly game and watch YT so an X3D chip is the best choice.
Intel: "Am I joke to you?"
Gamers: "Yes."
Intel still stuck on 8P cores!
Really wish they had higher P-core counts and got rid of the garbage E-cores that do almost nothing for gaming.
Yup. E-cores were never much more than marketing; it's how Intel kept up with AMD's core counts. Of course plenty of dimwits buy based on numbers on paper, so it worked for Intel.
You know why that is? Power consumption. Increasing the P-cores and decreasing the E-cores might give you a minuscule improvement in gaming, but the power draw, especially in all-core workloads, increases. Personally, I think 8 P-cores are fine for now; very few games (maybe you could name one?) profit from more than 8 P-cores.
No game in the world is using more than 8 cores so why would you want more?
@@gizConAsh
The obvious answer is because games aren't the only thing people run on computers ;-)
@@a5cent then why complain about E-cores doing nothing for gaming?
Props to Intel for using graphs that don't seem fake rather than just claiming themselves the winner. Customers demand more transparency, so it is cool to see.
Hold on, 8700k, looks like we're waiting ANOTHER generation.
lol what people are willing to do to not switch to amd.
When you have to stop playing intensive games during the summer because it's too hot in your room, efficiency matters. How many times have you heard a streamer complain about how hot it is in the room? People don't make the connection for some reason, but it does matter.
They probably close their doors because they're working and it restricts the airflow.
Never rush out and get the first generation of new technology. Intel shat the bed on that one.
I initially thought I needed my white pants for this. Turns out I didn’t have to change at all
I just upgraded to a 14900K this year, having zero issues with it... I think I'll wait.
I upgraded to 14900KF last year and, like you, zero issues.
Hope you guys installed the latest BIOS versions... I kind of suspect that my 13600K has been affected by the premature degradation hitting Intel 13th- and 14th-gen CPUs, as I get less than a 20K score in Cinebench R23 multi-core and I've seen people getting around 23-24K.
I have the 14900K and have had zero issues. It's a beast with the 4090
@@BucifalulR they all have this is a ringbus p-e core design fault I had a 12900k it degraded too altough not as bad but still it's nowhere near day1 cbr23 and 3dmark benches
@@fredEVOIX Wait, what? I thought 12th gen is not affected. Where did you get the info that it is a p-e cores design fault?
Sorry Jay. 8:30 my brain can't get away from the GPU fans stopping behind you. 😂
Great video and thanks for the initial breakdown
I'm gonna wait as I only recently moved to a Ryzen 7700, so no upgrades for the next few years.
I got that same 7700 and it’s a beast for rendering video and gaming.
I'm on a 6600K and all I'm hearing is: let someone else test this one out, come back in 2 generations. If I buy a new CPU it's still more worthwhile to buy an AMD CPU or a 12th-gen Intel and wait another 10 years before going down this route. (I'd still use the 6600K, but Win 11 is kicking me off my PC.)
I'm in the process of a new build from my i5 9400. Looking at the last few gens, this information, and Intel's clear pivot to workloads, efficiency, and AI, I can't think of any reason I'd build another Intel rig rather than switching when the 9800X3D comes out, aside from possibly building a cheaper 12th-gen Intel setup and seeing what comes up in a couple years if the 9800X3D is somehow a disaster.
So as someone who predominantly games, AMD just makes more sense to me right now.
Intel are no stranger to 'gluing' CPUs together. Just check out the old 'Pentium D' designs they put out to compete with AMD when they launched their first dual-core CPU designs.
They started the gluing process with Pentium Pro.
It's almost like glue logic is the name for the circuitry connecting two disparate chips and isn't actually an insult, or something.
I think it kinda makes sense to get rid of hyper-threading now. HT was good when we were only able to get 2- and 4-core designs, but now that 8-core designs are kinda standard and 16+ core designs are becoming common, it starts to make less sense. Having a single thread per core also allows much more granular power control over the core. To my understanding, HT also added about 20% to the die area of each core that had it, which means that in the die area of a 16-core HT chip you could fit about a 20-core chip without it.
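The die-area claim above is easy to sanity-check with back-of-the-envelope math. A minimal sketch, assuming the commenter's ~20% HT area overhead figure (an estimate, not an official Intel number):

```python
# Rough sketch of the comment's arithmetic: if hyper-threading (HT)
# logic adds ~20% die area to each core, removing it frees enough
# area for a few extra single-threaded cores. The 0.20 figure is an
# assumption taken from the comment, not a measured value.

HT_AREA_OVERHEAD = 0.20  # assumed extra die area per HT-enabled core

def cores_without_ht(ht_cores: int, overhead: float = HT_AREA_OVERHEAD) -> int:
    """How many non-HT cores fit in the die area of `ht_cores` HT cores."""
    total_area = ht_cores * (1 + overhead)  # area units used by the HT design
    return int(total_area)                  # each non-HT core takes 1 unit

print(cores_without_ht(16))  # 16 * 1.2 = 19.2 -> 19, i.e. roughly a 20-core chip
```

Whether those extra cores beat the ~20-30% multi-threaded uplift HT typically provides is exactly the trade-off Intel seems to be making here.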
I'm still running a 9900K with a 2080 Super. Runs Minecraft just fine, thank you!
9900k/2080 ti here
That's what I got too
What about raytraced Minecraft though?
@@MarkHallG slide show
@@MarkHallG never tried it on Minecraft. I tried ray tracing on whatever CoD came out around then... looked good, but only got 30-50 fps if I recall.
"We can't get anywhere in the future if we don't start somewhere. If we don't start here then we'll never get to there" -J2C
I have an Intel Core i7 4790K with a GTX 1070. I was thinking about getting the Core Ultra 9 285K, but I will wait to see if the Ryzen 7 9800X3D beats the Ultra 285K in gaming. What's funny is that the 7800X3D is still the king so far.
For gaming only, price-to-performance per dollar is important; in content creation you can buy anything, as you're making your money back 😅
The 7800X3D sure beats it, so the 9000-series X3D will even more.
FPS will not matter; it is only a 5 to 10% difference.
7800X3D... buy it for $600? Not worth it.
In productivity, Intel is still the king.
Wait for gaming benchmarks before you upgrade the CPU. This is just my opinion, so take it or leave it, but if I were you I would upgrade to the best gaming CPU at the top of the charts once Intel's newest chip is reviewed. I would then buy a used 3090 or 4090 (find a good deal) once the 50 series is released, as they will populate the market when the enthusiasts buy the 50 series. I do this so the CPU and GPU upgrade paths are staggered: the new CPU should easily last 5-6 years, and the GPU gets upgraded in 2-3 years, at which point you go back to buying the newest GPU generation. If you have the cash, do it all at once every time, but I find this works great for me and for budgeting my spending. I also have three sons that game, so I'm buying parts and building their three gaming PCs like this as well.
@GangstalkingClips Just buy a good GPU like a 4090 and it is much better, if you have a 3090.
Graphics matter if you don't have a decent GPU.
Anyone else reminded of the old 386/486 chipsets? 20-plus years on and we're cycling back to old naming habits.
My very first PC was from iBuyPower. My dad and I went to Chinatown in LA to pick it up. This was in 2002.
Mine was a TI-99/4A in 1982. Then a Commodore 128 in 1985 and Amiga 500 in 1987. First proper PC was a Packard Bell Legend 232 486SX in 1992, followed by a Cyrix 686 based build in 1996. Pentium II in 1997, Celeron in 2001, AMD Athlon in 04, Phenom in 08 and 12, Skylake in 15, Ryzen in 19 and 23. Kinda geeked out there...
@@cmkeelDIM Does Ryzen suit you? What about plans for the new Intel?
@@sansanych6163 Either will do what I need. Next year is about the interval for my next build, but gaming is not high in that system's design. I'm actually considering Threadripper, as this new system will primarily be a video editing machine (NLE). Too early to tell; have to see how the new Core Ultra stuff performs.
I think Intel really needed this: dedicating a whole generation to efficiency. I rebuilt my computer with Intel's 1300k and I was shocked at how warm it ran. So now I keep my build on a very aggressive undervolt so it's not cooking my office.
This must be so fucking confusing for AMD, since everybody was shouting from the rooftops about how the efficiency of Ryzen chips was reason enough to pick them over Intel, because the cost savings over time would account for any performance deficits or a higher CPU price. Flash forward to the 9000 series: they go all-in on efficiency when they're already ahead on performance, and a lot of people are upset that the performance gains aren't big enough, even though the efficiency is amazing. This after years of people making fun of Intel for chasing performance with cores that run hot and suck up power. People are now demanding AMD focus on performance instead of efficiency, which has been AMD's main thing for years.
They're hypocrites. The internet has always been full of hypocrites.
First time I heard the name "Ultra 5 245K" I literally had no clue what it could be without looking it up.
I'm holding on to my 12700K for a while. I still feel like it has another 3 to 5 generations of good use. But it is interesting to see what improvements Intel and AMD are making to their products.
Yeah, I don't see a reason to upgrade my 13600K yet either. I am interested to see if the Ryzen 9 X3D chips turn out well though, especially if there is no more core-parking nonsense.
I just upgraded from an i7 4790K to an i5 14600K for $200; I'm holding on for another 10 years.
Upgrade to 14700K and your system will run so much faster. I don't recommend 14900K though as it's too hot and power hungry.
I was running a 9900K right up until a month or so ago, and until 2023/24 I'd have hardly known my CPU was 7/8 years old. You'll be good until 2026 at least.
@@tyra1870 4790K gang, rise up! I'm satisfied with what I'm getting, but I know a newer system would blow my mind.
Perhaps the most important thing about new Intel CPUs is the job security for tech review websites.
I was a die-hard Intel fan... till my 12700K. I'm happy with my 7800X3D. Good luck, Intel.
I'll never go Intel now. No matter what!
Why?
@@yikesmcgee-gz3lq Why not? Intel has a history of anti-competitive practices that continues till this day. A blatant refusal to recall 13th/14th gen despite a QA fail and manufacturing defects. I realize that Arrow/Lunar lake are made by TSMC, but I will never want to support anything Intel from now on.
@@Lionel212001 with how cheap the cpus are now what does it matter?
My 12700K was paired with an ASUS Z690; both had multiple driver/BIOS issues, and to top it off ASUS support was horrid... Fast forward to current Intel, same story... two companies I was a huge fanboy of have now lost my business.
Seriously, what do they mean by a) telling us on one hand to cut down power consumption across the board in order to 'save the planet' and b) producing on the other hand electronics that consume more and more power (CPUs, GPUs, larger TVs, ultra-fast chargers for EVs, smartphones, etc.)? What's up with all that?
more disappointment... and they couldn't even bump up the L3 cache huh?
Is it really so much to want something like a 12 or 16 physical core CPU with big cache, without having to rely on stupid core parking?
Gimme like 16 P-cores from Intel, with the X3D cache from AMD, on a single chip (no CCD parking nonsense) and no hyper threading.
Many techies are thinking that the upcoming X3D options will include at least one chip designed so we do not need to park cores.
@@Valthalin We'll see, the way I read it is that there will be 2 and/or 3 CCD chips, they're putting X3D on at least two CCDs. Until the performance is actually tested and latency is tested, I'll be holding my breath.
From a pure single-thread compute standpoint, IIRC Intel is still ahead by ~15% IPC, and in many, many games, especially the ones I'm concerned about, the limit is the render thread. In VR you realistically have two total render threads at best, frame pacing is more important than frame rate, and any judder causes absolute mayhem. In non-VR you can run many games at like 300 fps no problem, then throw on decent levels of graphics and all of a sudden 72 fps becomes "impossible".
Slightly off topic, but I thought I'd share the love: my Ryzen 7800X3D was hitting high temperatures and I tried many thermal pastes, but not liquid metal. The BSFF from Nuomi Chemical, a non-silicone, non-conductive thermal paste, is amazing stuff; the highly rated Noctua paste I was using ran 5-6°C higher. I don't know what it's made from, and they even claim it outperforms liquid metal. Jayz, it would be great if we could get some proper tests of BSFF.
For those who are reading, hope you have a nice day!
Have a nice day too! ♥️
Life is suffering
For a feature that they once segmented and charged a $100 premium for (Core i7s versus Core i5s), I find this quite ironic.
Yup, going AMD next. Bye Intel!
Which one? AMD's are also high-priced, low-uplift gen chips... like the 9000 series.
@@ShafayethossainShafin-e5r The new x3d that is upcoming.
@@warywhistle oh yes...right..x3d is the only option nowadays
They are advertising that this gen has the same performance as the previous one but costs more? WTH? I don't think another company has tried that one before.
IF I could upgrade before the end of the year...?
I'd get a 7800X3D; I only play WoW, so even my 5800X is probably already overkill, but I want to play 4K at high FPS. I still can't find a definitive answer to why I need to pair either CPU with a 4090/5090 to achieve it, so I'm holding fire till I do see why...
WoW and other MMOs love the X3D cache 😉
@@marcinkarpiuk7797 That's good to know; now if only I could get the 4K bit clarified - platform alone isn't enough justification to upgrade.
No reason to go crazy if it is just WoW you are playing; the top-end tech is entirely overkill for the game. Running anything at 120 fps on a 4K monitor will demand good hardware on the CPU and GPU side if that is what you want to achieve. I'd stick with your current CPU and try something around a 4070 Ti Super at 1440p; maybe look for a test video on YouTube and check out the graphics and the card(s) needed to get the performance you want.
@@Valthalin I can already do/get 1440p @ 100+fps (RX6800) & have had that for a good few years - I just feel it's time, for me, to want to play @ 4K with the bonus that other stuff I do/watch is available in 4K, as well.
What is crazy is that Copilot+ requires a 40 TOPS minimum to work, so what is 13 TOPS going to be able to do? The answer is pretty much nothing.
13 TOPS is Jack; Microsoft requires over 30 just to run the Windows AI features. Waste of silicon on a desktop product if it can't even meet the minimum spec for AI.
Who is the Jack in question? The beanstalk guy?
You realize that usage exists beyond Windows, no?
It's interesting.
From what I understand, for security reasons many had turned off their hyper-threading, so there wasn't a 'boost'. While this isn't important for gamers, it is if you're running Linux.
So now you just consider the physical core count, and the virtual cores go away.
Not sure how the NPU is going to play out.
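For anyone curious about the Linux angle, the kernel exposes a runtime switch for SMT, so you can check or turn off hyper-threading without touching the BIOS. A sketch, assuming a reasonably recent kernel (the sysfs interface landed in 4.19):

```shell
# Show whether SMT (hyper-threading) is currently active: 1 = on, 0 = off.
# The sysfs path may be absent on very old kernels or non-SMT hardware.
if [ -r /sys/devices/system/cpu/smt/active ]; then
  cat /sys/devices/system/cpu/smt/active
fi

# Disable SMT at runtime (needs root); part of the security
# mitigations the comment above refers to:
#   echo off | sudo tee /sys/devices/system/cpu/smt/control

# Compare logical CPUs vs unique physical cores:
nproc                                                        # logical CPUs
lscpu -p=CORE 2>/dev/null | grep -v '^#' | sort -u | wc -l   # physical cores
```

On a CPU without hyper-threading (or with SMT off), the two counts at the end match, which is exactly the "virtual cores go away" situation described above.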