@@latomz6551 AI use? It has the worst NPU, even worse than mobile phones. It has only 13 TOPS. The Ryzen HX 370 has 50 TOPS, and that's a laptop CPU. Also, just curious, how many Pixar movies have you rendered today with your 7700K? I'm sure you must be making a lot of money at home, in your room, doing such "work" tasks. Surely you aren't thinking you need 16 cores for a Teams call, Chrome, and Excel.
You don't say what resolution you're testing at. I'm getting this performance with my i9 14900K / 4090 at 4K resolution. Regardless, the difference between AMD and Intel seems academic and likely depends a lot on Windows 11 tuning, DRAM speed, and SSD speed.
Yes, Intel is a failure. They said they focused on energy efficiency in the new processors, but Ryzen still ends up both more efficient and faster!
Intel did state that the X3D CPU would be slightly faster in games. The new X3D CPU comes out on 7 November. Waiting to see the results of that CPU.
In The Witcher, the 285K at 105 W runs cooler than the 7800X3D at 65 W... not to mention 24 cores vs 8. I think this new platform is an improvement over the last gen.
@@hardcoregamningchannel That really isn't relevant, since both CPUs are below 70°C, which is perfectly fine for any CPU. I'd rather take a few degrees warmer for 40-100% less power and more performance. Stop trying to justify this trash Intel gen.
The real price of the Intel Core Ultra 9 is $400, not $600, and even wealthy people will turn to Ryzen. That's a shame... I hope Intel will step up their game and prove that they are truly passionate about CPU architecture.
No doubt the 7800X3D is the king, but I hope the 9800X3D improves the 1% lows. I don't get why the 7800X3D has bad 1% lows in some games. Frequency? Bad game optimization? Or, last but not least, Windows, since we have seen Microsoft screwing AMD over for years.
Barely any difference, just a few % here and there lol. People make it seem like one is double the performance of the other while the difference is single digits 🤣
@@nossy232323 LMAO not true, the 4090 is bottlenecked at 1080p no matter what CPU. A 5090 will not change things. CPUs are just too weak, whether Intel or AMD.
The Core Ultra 9 285K is only for work due to its good stability, while the 7800X3D is good for gaming, but its price is increasing because the 9800X3D is about to release.
Wonder if it would have been better with HT instead of removing it... or at least adding a few P-cores. It will be interesting in a few months: since they had to contract with TSMC to make these chips, they have to commit to the wafers, and few will buy them at retail price. So we'll see in a few months if prices drop. There have been fantastic sales on 12th gen and 13th gen last month.
The current generation of CPUs is weird - it seems significantly overpriced for most uses. The market for both the 285K and the 9950X seems to be professional users running heavily multithreaded apps. For those users, a 600 dollar CPU that is somewhat faster than its predecessor while using less power is a reasonable purchase. If you save someone making 100 dollars an hour 10 minutes a day, a new system pays for itself in 6 months. I'm not sure that any of the other Core Ultra series or AMD 9000 series (non X3D) make sense to anyone at current pricing. There is clearly no reason to upgrade from last gen, and last gen is priced better for new builds. We will see if the 9800X3D makes sense - Intel isn't even trying to compete in the gaming market.
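The payback claim above is easy to sanity-check with rough numbers. A minimal sketch, assuming the comment's $100/hour and 10 minutes/day figures, plus a hypothetical ~$2000 full system and ~21 workdays per month:

```python
# Back-of-envelope payback calculator for a faster workstation.
# $100/hr and 10 min/day are from the comment above; the $2000 system
# cost and 21 workdays/month are illustrative assumptions.
HOURLY_RATE = 100.0
MINUTES_SAVED_PER_DAY = 10
WORKDAYS_PER_MONTH = 21

savings_per_day = HOURLY_RATE * MINUTES_SAVED_PER_DAY / 60   # ~$16.67/day
savings_per_month = savings_per_day * WORKDAYS_PER_MONTH     # $350.00/month

def months_to_pay_off(system_cost: float) -> float:
    """Months until time savings cover the hardware cost."""
    return system_cost / savings_per_month

print(round(months_to_pay_off(2000), 1))  # ~5.7 months for a $2000 build
```

So under these assumptions a whole new $2000 system does land in the ~6-month range the comment describes.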
Haha. Pathetic. Games, always games. Intel moved past that era long ago. They'll still be dominant in the CPU market, and you guys can't do a damn thing about it. All you "gamers" can do is expect these new Intel CPUs to beat AMD's in gaming. Wrong.
Starfield is the only game in which Core Ultra 9 285K can beat the 7800X3D.... 1 out of 10 games... The new 9800X3D will definitely beat Intel's CPU even in that game...
@@OmnianMIU Thanks. I have an i9 14900K with 360mm cooling and all the BIOS updates, and I still can't get below 85°C in a game like Horizon Forbidden West.
Intel said it would lose in gaming but be more efficient. It's neither better in gaming nor better in efficiency; Intel committed hara-kiri. Now I want to see the 9800X3D finish the job.
Games :
Ghost of Tsushima - 0:00
Forza Horizon 5 - 0:57
Microsoft Flight Simulator - 1:59
Starfield - 2:59
Hogwarts Legacy - 3:59
CYBERPUNK 2077 - 4:59
Silent Hill 2 - 6:09
The Witcher 3 - 7:05
Red Dead Redemption 2 - 8:10
Star Wars Outlaws - 9:21
System:
Windows 11
Core Ultra 9 285K - bit.ly/4fisOO6
MSI MAG Z890 TOMAHAWK - bit.ly/3NFmC7a
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
RAM 32GB DDR5 6000MHz CL30 - bit.ly/4e3MqEG
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
Ryzen 7 7800X3D TOP
Bro, can you please do 7500F VS 8700F?
@@ПикаПика-ь2р The 7500F is 6 cores / 12 threads, the 8700F is 8 cores / 16 threads, about +10 fps.
@@vovapapi-sk2zg My friend is choosing a CPU right now and doesn't know whether to get the 7500F or the 8700F. What would you recommend?
@@ПикаПика-ь2р The 7500F is 6 cores / 12 threads, the 8700F is 8 cores / 16 threads. The 8700F is better, it has more cores, and the fps is about the same as the 7500F.
7800x3D is older, uses less power, and still faster. AMD is killing it!
...except if you open any software that isn't a game.
@@CyberneticArgumentCreator Still performs similar to a 7700x, it's more than adequate for most users even users that do video editing etc.
Yeah, but not the best as Best Buy engineers and scientists often laud.
@@CyberneticArgumentCreator ...on a video comparing gaming performance
Remember, the 7800X3D is a 2022-era product.
9800X3D vs 7800X3D - waiting
Bro, the 9800X3D is the next gen of the 7800X3D, it will be better.
@@Emanuele-j2e Max 10%
Rumors say it's gonna be just a single-digit % increase, even though that's still nice. We're gonna have a new king of gaming, probably better stability, and it might support higher RAM speeds. Just the price is gonna be too high at the start. Next year it will be the new best seller.
@@Berkelll They say 450, not much.
2-3% difference
7800x3d: waiting for a real challenge
only its successor is it's real challenger
8 core processor in 2024 LMFAO
@@omerthenut2568 i9 8 core too...
@@sorryforthatt can you read?
Ryzen 9 9800X3D: Hello son!
The 7800X3D is a monster that even AMD is scared of.
7800X3D is lagging real bad in 1% lows. This makes your game feel choppier despite higher fps.
@@club4ghz lol no, the 3d cache makes 1% low way better than anything else
the high latencies in the core ultra series and the slow tile design is the one having bad 1% lows
@@club4ghz Cope harder please. Even in 1% lows majority of wins belong to 7800X3D.
@@Atrumoris Learn some new vocab and stop using your incel language loser
@@club4ghz Have you even paid any attention to the results of the video?
$450 vs $600 -10% power and still +5-10% TGP 😂
You forgot about -30% single core performance in workloads and -50-60% in multi core tasks. -30% slower DDR5 memory also.
TGP mean?
@@AMD_7900 You forgot about the 7950X3D being faster and better and cheaper in literally everything, according to TechPowerUp and Hardware Unboxed 😁
@@Tongred6237 The 7800X3D is like an overclocked 7700X. Very good CPU, but it really doesn't make Core Ultra look bad at all.
The thing about this is that you'll have to spend $3000+ to find that out, so I'm here telling you they're worth the money. AM5 is great and cheaper overall.
7800X3D : Is that the best you can do???
The 7800X3D going from $350 to $470 is still not a good look 💀🙏
@@dummy1901 Wait, you'll get the 9800X3D for $349.
😎👌
"Ultra" stands for "Ultra bad"
Ultra-ass
😂
Ultra expensive
Not Ultra 285K. It’s Ultra -28.5% fps!😂
Ah come on now, lol!
It's just a bad day for Intel. Intel is getting buried by Ryzen 6ft deep...what a time to be alive.
7 9800x3d+6ft😂
Just remember that the home desktop CPU market isn't that important to either AMD or Intel. Server side is all they really care about just as the GPU market isn't that important to Nvidia anymore.
Desktop gamers get the crumbs from all the tech companies.
bad day? no bad year precisely.
@@Dr.D00p Yup totally concur but all the basement incels will still refuse to listen to this cold hard truth because gaming is everything in life to them.
Intel couldn't even surpass their own 14th gen in performance which is wild, they focused too much on reducing power consumption.
Let's take a minute and laugh at Starfield optimization
7800x3d 80% usage, 285k 70% "optimization"
7 more fps with the cost of 50W... 💀🗿
@@Jakiyyyyy And the U9 285K still runs much cooler 💀
@@OmnianMIU The 3D V-Cache is stacked on the chip, so it has to run hotter. Still an easy win though.
@igm1571 It has nothing to do with 3D V-Cache; all Zen 4 chips run hotter than any other chip relative to the power they draw.
This is AMD's Sandy Bridge moment.
Shine bright, you little underdog.
That was the 5800X3D. Everything since has been extremely minimal gains, just like Sandy Bridge.
You are 💀🧠 for picking a side to begin with
Man, they should have cooked a little longer. Using TSMC N3 and still consuming more power than the 7800X3D.
It's not 3nm, it's 7nm. Intel hasn't taken the fabrication advantage yet ; )
@@assassinx4874 The compute die, the one with the main cores, is built on TSMC N3; the other parts like the memory controller, graphics, and PCIe are built on 7nm. It's a chiplet-design CPU, not monolithic like previous generations.
But it also runs cooler than the 7800X3D.
@@Deathdemon65 It's not. Why they went to TSMC in the first place is already questionable.
True, anything aside from the cores may use an older node.
Lmk how amd performs in idle/light tasks 😂😂😂. Intel has always been more efficient overall.
Ultra for Intel Ultra fan boys
Amd fanboys ready to pay price hike on x3d. Prepare yourselves. amd can charge anything they want now.
@@oneanother1 at least we know the performance won't get worse lmao
The only really noticeable difference is on a 4090 at 1080p. Any GPU below that will run fine on Intel. The internet acts like everyone is running a 4090, while complaining about how expensive they are. Amd fanboys are weird. They choose nvidia over amd GPU almost always.
To Intel's credit, the 285K does run about 5-10° cooler. Besides that, higher power draw and lower performance make the 7800X3D a better deal. Not to mention the price difference, a new motherboard every other generation, and the 9800X3D on the way. Even if the 9800X3D has similar performance to the 7800X3D, like the non-X3D 9000 series, it'll most likely be more efficient like them as well. If that's the case, Intel's new chip will look even worse.
Not everyone can OC, and I respect that Ryzen provides simple one-click settings in the app for everyone to experience the best gaming possible without any tuning.
It's not that Core Ultra is bad without OC, they're on top by default; it's just a waste if you don't, and you might as well buy a 9700X.
1) Thermistor location is the only thing the reported temperature depends on. If the temperature reads lower, it's because they moved the thermistors. All chips use thermistors as a heuristic to keep the silicon under the peak temperature at hotspots, where it degrades. A thermistor slightly farther from a hotspot reads lower but does not change the actual temperature of the hotspot. See: AMD chips use the same material, silicon, but set their max thermistor temp to 95C instead of 100C. Why? Because at that reading, the hotspot is at the same peak allowable temperature as Intel's hotspot is when Intel's thermistor reads 100C.
2) Power draw in gaming is extremely similar. Performance outside of gaming is around 2X the 7800X3D.
3) 9000 series is going to be a sub-5% performance difference, it's a refresh.
4) Intel just did 3 generations on the same socket, stop parroting "new motherboard every generation", that hasn't been the case for a very long time.
Temperature alone doesn't mean anything. If it's using less power, it's also putting out less heat, just the specific thermistor is hotter.
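The sensor-placement point above can be sketched as a toy model (all numbers here, including the linear gradient and the 110°C hotspot limit, are hypothetical illustrations, not real silicon data): two sensors at different distances from the same hotspot report different temperatures while protecting the exact same peak.

```python
# Toy thermal model: reported temperature vs. actual hotspot temperature.
# Assumes a simple linear gradient away from the hotspot, which real
# dies do not have; this only illustrates the sensor-placement argument.
def sensor_reading(hotspot_c: float, gradient_c_per_mm: float, distance_mm: float) -> float:
    """Temperature a sensor reports, given its distance from the hotspot."""
    return hotspot_c - gradient_c_per_mm * distance_mm

HOTSPOT_LIMIT_C = 110.0  # hypothetical peak the silicon must stay under

near = sensor_reading(HOTSPOT_LIMIT_C, 5.0, 2.0)  # sensor 2 mm away reads 100.0
far  = sensor_reading(HOTSPOT_LIMIT_C, 5.0, 3.0)  # sensor 3 mm away reads 95.0

# Throttling at "100" on the near sensor and at "95" on the far sensor
# protects the SAME 110 C hotspot; the lower number is not a cooler chip.
print(near, far)
```

In other words, comparing one vendor's 95°C limit against another's 100°C says nothing by itself about which die is actually hotter.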
Honestly? Way better than I expected. I was expecting Ultra to be incredibly shitty because the popular techtubers keep saying it's the worst thing ever, but it's surprisingly good in this benchmark.
Keep in mind this is THE productivity-oriented chip. It will run laps in non-gaming performance.
And for Intel Ultra, things will only get better from here: stability and compatibility updates and so on. Though the sensible option would be to wait, because the 9000X3D series is coming out soon.
exactly
Well said.
For the most part, even in productivity uses 285K is getting shafted by 9950X... which is far more energy efficient... and cheaper.
Uh, for productivity, it trades blows with the 9950x and even the 14700k.
This channel consistently shows the worst performance for the AMD CPUs. They still haven't updated to Windows 24H2, which brought nearly refresh-level extra performance that was actually always there on Linux but not on Windows. There's a much bigger gap. Gamers Nexus and Hardware Unboxed found even the 5700X3D beating the 285K sometimes.
7800X3D Gold Medal again🥇
9800x3d: im coming wait a moment
You are not using the latest 24H2 Windows version. The 7800X3D is a lot quicker than what we see here... every other online benchmark proves this 🙋‍♂️
True, the 3D cache should improve over time, but so will Intel's E-cores.
Ryzen 7000 was notoriously buggy at release, and now with updates it performs rather impressively.
@@iikatinggangsengii2471 This was a Ryzen-wide bug, not just the 7000 or 9000 series; my 5800X3D also gained performance from the update.
@@Definedd Who asked you about your 5800?
@@Rebe-Caufman😂 lol
The only positive about the 'Ultra' is that it's cooler than the 7800X3D.
Who cares about a few degrees
Meh, that's just kinda little difference to me
Anyways 60°C are fine in terms of gaming lol
@@phantom0590 it matters bc u can oc it duh
@@stevensv4864 Imagine overclocking an Ultra 9 just to reach the performance of a stock 7800X3D.
Keep in mind that this comparison is irrelevant if you play in 4K. Both processors will give you the same FPS, and the Ultra 9 is much faster for any other productivity tasks.
😂😂😂 ahahahahaha
Guess what, it's also a shit-ton more expensive. And people buying the 7800X3D need that performance, otherwise they would buy the 7700. If you want productivity, just buy the 9950X, or if you want better everything for cheaper, buy the 7950X3D.
Very true, and the 285 isn't even running the Extreme profile from the displayed power draw. My 5.9GHz all core OC'd 14900KS can hit the same framerates in FS2020 over Central Park as this 'comparison' achieves in 1080p!
It's not irrelevant. Dragon's Dogma 2, Baldur's Gate 3 and Star Wars Jedi: Survivor are CPU-bottlenecked even at 4K native.
7800X3D is the 1080ti of the CPUs
The 5800X3D is the 1080 Ti of CPUs. You are wrong, m8.
@@NoSkillsBro. It is GSX-R 600 of CPUs
Intel Clown Ultra ™️
Looks like they will both run fine with my 4090, but how does the intel run so much cooler than the amd? I've already rma'd two 13900k's and one 13700f. Just hoping this one will not degrade and just work. Running cooler for basically same performance is fine with me.
AMD has that V-Cache stacked on top of the CPU and it's very heat-sensitive, so it runs a little hotter, but it's fine.
Who asked? nobody cares
9800x3d will be cooler.... The die is reversed
This channel does not specify the CPU cooler used for the 7800X3D. Nearly everyone runs their Intel CPUs with 360mm AIOs, while the 7800X3D is often paired with just a dual-tower air cooler, because that's enough for it not to throttle and lose performance.
intel is back! I was on the verge of converting to AMD but looks like intel has won my support again 😊
You sure are grasping at straws here, aren't you?
@@Atrumoris Its a sarcastic comment my friend.
Intel: releases self-destroying CPUs
Also Intel: doesn't take them off the market, so their "fix" is irrelevant, because you could still get a self-destroying one when buying
Also Intel: releases an even worse CPU than last gen, and worse than the competition
What is Intel doing...
I mean, please tell me there is a target group for these Ultra CPUs, because I really don't see it.
Haha yes, even if the price were the same as the 7800X3D I wouldn't want to buy it 😂 Poor Intel.
Also you have to buy a new motherboard.
Wait for Intel Updates and optimizations
You know when you look at a product and realize it's just ahead of its time?
Just look at the 7800X3D. It may not be the best cpu for applications on the market. But when it comes to cost-benefit, it delivers the best in games.
I own an R9 7950x3d and I recognize the beautiful work AMD did with the 7800X3D. 👏👏👏
Costs €500.
7800X3D: “the only one who can beat me is me”
what happen to intel 😢
14nm ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
greed
This is the first time Intel has used this tile architecture; there will be improvements for sure.
Man… Intel is really struggling rn
They need to do something ASAP
Theyre going down. 😂
Go down to 720p, because over half of the shown games are hitting the GPU limit area.
Cannot please everyone. The basement incels will be yelling 'hey, why this setup at low resolutions?' although the rest of us with brains can understand the test.
I'd say it's better to turn off ray tracing before going down to 720p just to test a cpu. I have no clue why they chose to run some of these games with ray tracing maxed out.
@@Eepy-Rose RT hits the CPU hard too.
@@ryanspencer6778 yeah but probably not as hard as the gpu. Generally other testers don't use ray tracing while testing the CPU
@@Eepy-Rose no, not as bad as the GPU, but it is a big performance reduction. Testing at 720p low doesn't give you the full story about CPU performance these days. What I would do is test at the lowest resolution, with the most aggressive upscaling or image scaling option, and the maximum settings.
The Intel 285K just shattered the world record in single-core performance, so I don't know what you people are talking about. It's faster than the new Ryzen.
Yeah, it does best in synthetic benchmarks, but mostly not in real world, and certainly not in games.
Those are synthetic benchmarks and not gaming results.
@@laszlozsurka8991You only care about gaming purposes?
This is a chiplet design; Intel has a lot to learn about eliminating latency as much as possible, and games aren't expecting hyper-threading to be removed 😅. It needs optimization from the software side too.
7800x3d😊👍🏻
Will there be improvements for Intel? Any BIOS update or anything to gain any kind of performance?
Well, both the Zen 5 and 200 series CPUs disappointed me. I will be waiting for the next generation to drop next year.
The wattage for the i9 is wrong, you can see this on reviews of the cpu. They're siphoning like 50w from the board so it won't show in this test.
Intel: "*chuckles* I'm in danger!"
Bro, the 7800X3D costs about $280-$300 in our country, wtf?
Only 300? It's 450 in our country, China.
This CPU probably wasn’t designed solely for Gaming
Do you have some "city builder" games like Anno 1800, Cities: Skylines, Planet Coaster? I have seen some benchmarks where, surprisingly, the new Intel CPUs seem to manage far better in power consumption there than the AMD CPUs. Could be interesting to have that tested here.
Because of the higher core count.
@@OmnianMIU Yeah, but I mean all the AMD CPUs still have hyper-threading. I'm surprised Intel's CPUs manage so much better without hyper-threading now!
@@Quast Hyper-threading doesn't mean more performance at all.
@@OmnianMIU Well, with 8-core and higher CPUs the leverage certainly seems to have diminished, or at least there has been very little software able to scale well with additional cores.
@@Quast exactly
Sorry, but why do you test DDR5 6000 instead of 8800?
The 285K can do around 10,000 MHz OC RAM.
@@club4ghz That's what I mean.
Well, the Intel isn't bad, but it's too expensive, needs a new motherboard, etc. If it were like $449 and on socket 1700, so one could upgrade from the 12th gen, then it would be a nice upgrade.
How is this right??? The new 285K should be much slower than the 7800X3D according to Gamers Nexus.
Steve is such a smug asshole, I can't even watch him anymore. Even before this, when he attacked Linus, that was low, and I don't even like Linus.
I will continue to use my 12400F processor for many more years
And to think the 7800X3D used to be $350 just over a month ago. In defense of the Core Ultra 9, it's a damn good processor for heavy workloads and content creation. The price is still too high.
….. 😢! Intel!
@TG, can you test the 7800X3D vs 285K at 4K in more games please?
Interesting. In MS Flight Simulator, despite delivering lower FPS, the 285K significantly lowers GPU load and power draw.
Man, it's been a year since I bought my 7800X3D, time flies... let's see what the 9800X3D can do.
Looks much better compared to previous reviewers... maybe it gets better with a new BIOS.
It's exactly as Intel showed on slides and will shake out that way within a couple of months of launch.
Within ~1% gaming performance of the 14900k (which is an absolute monster of a CPU) and even faster in fully-threaded uses. And uses considerably less power and has support for way faster RAM and has an NPU. It is EXACTLY what they said it was on the tin, they just rushed out the review samples and launch date before BIOS settings and Windows optimization were ready.
It's not a showstopper generation, it's an interesting hybrid stopgap that sets the stage for really really huge iterations for 5-6 years to come. TSMC process node with about half of the architectural changes they intend, like hyperthreading removal, and without pushing the power envelope to the maximum.
When they move to Intel foundry next gen and realize the rest of the changes, they will likely also juice the ring and core frequencies with the experience they've gained and throw a bit more cache on the cores, which will be exciting. You can already take a Core Ultra and overclock the ring and cores slightly, and stably, for 4-5% gains for free.
Kind of weird to see people with 3700X's and a 3070 blab in comments sections about how the new hardware from Intel isn't convincing them. Yeah, not a surprise dude. You don't have the money for a rig that any of this matters for. If you bought a 285k it would be bottlenecked by your GPU unless you got a 4080 Super, 7900XTX, or 4090. If you have a 4070Ti, it's the same frames in gaming as a 7800X3d.
Can we just pin the "it's already dead" meme on top for the 285k? Leave the poor stillborn thing alone :/
Hi mate, please include Counter-Strike 2 in your benchmarks.
It's played by 28 million players every month. IDK why you ignore that game.
When the Ryzen 9800X3D arrives, the 7800X3D will drop in price. The 7800X3D is still the best for games. Hardware Unboxed said that the Intel Core Ultra 285K is on the same level as a Ryzen 7 7700X and still costs a lot more. The 7700X costs US$275 and the Intel Core Ultra costs US$630. Buying the Intel Core Ultra is wasting money.
Why are you still testing XMP 6000? Waste of time not using a proper setup. It supports JEDEC 6400 out of the box already
6400 is only officially supported with CUDIMM modules. With normal UDIMM memory, Intel officially supports 5600. Also, JEDEC refers to the base spec of a memory type, DDR5 being 4800. It does not refer to a supported memory speed from a CPU manufacturer.
Were both CPUs used with the same cooler and at the same RPM???
Even using the name Ultra, it still can't beat this beast
Will there be a difference at 4K? Nobody plays at 1080p with these CPUs and a 4090.
Please do an 8C/8T competition: 285K vs 14900K vs 7800X3D. I'm really curious how eliminating E-cores, and potentially Windows scheduling issues, will affect gaming performance..
What garbage Intel has started making... Worse than even a non-flagship Ryzen while drawing more power with less performance... Intel has one plus: it runs cooler)
some people's little brains seem to think CPUs are only for gaming, lmao
Cry more about it little boy from India
That's what you were thinking back when Ryzen came out, idiot.
Fr, like the U9 is more for AI use and workloads and the X3D is for gaming, isn't that clear already? The new Z890 motherboard BIOS can go further too. These people think those CPUs are just for gaming lol
Even most tech reviewers have been small-brained, complaining about this CPU.
@@latomz6551 AI use? It has the worst NPU, even worse than mobile phones. It has only 13 TOPS. The Ryzen AI HX 370 has 50 TOPS, and that's a laptop CPU.
Also, just curious, how many Pixar movies have you rendered today with your 7700K? I'm sure you must be making a lot of money at home, in your room, doing such "work" tasks. Surely you aren't thinking you need 16 cores for a Teams call, Chrome and Excel.
You don't say what resolution you're testing at. I'm getting this performance with my i9 14900K / 4090 at 4K resolution. Regardless, the difference between AMD and Intel seems academic and likely depends a lot on Windows 11 tuning, DRAM speed and SSD speed.
Let's just acknowledge that your PC won't need its own power plant anymore.
Yes, Intel is a failure. They said they focused on energy efficiency in the new processors, but Ryzen still turns out to be both more efficient and more performant!
Hmmm Silent Hill 2 - 6:09 test seems like a big win for Arrow Lake.
Honestly, if the stats weren’t there, I wouldn’t be able to tell the difference lol
With specs like that why would you even bother playing at 1080p, you want minimum 1440p….
Because he wants to see the CPU bottleneck
Look at his temps 😮 His PC must be running outside at the North Pole 😂 In all of his videos the temperatures are too low. FAKE channel
Can you test MU Online? I know it sounds silly, but the Ryzens have problems with that game and I don't understand why.
Ultra processors, ULTRA PERFORMANC..... Wait
Intel did state that the X3D CPU would be slightly faster in games. On 7 November the new X3D CPU will come out. Waiting to see the results of that CPU.
In The Witcher, the 285K at 105W is cooler than the 7800X3D at 65W... not to mention 24 cores vs 8. I think this new platform is an improvement over the last gen.
In Cyberpunk the AMD CPU draws 70-75W, the Intel CPU 120-125W...
@@milkronin2488 Yes, and still lower temps
He was using a different CPU cooler on each because he didn't overclock the Ryzen, as far as I saw (also, current X3Ds don't support it)
@@hardcoregamningchannel That really isn't relevant, since both CPUs are below 70°C, which is perfectly fine for any CPU. I'd rather take a few degrees warmer for 40-100% less power and more performance. Stop trying to justify this trash Intel gen
And? Temperature doesn't matter, as long as they aren't near TJmax.
Why is Dragon's Dogma not there as a CPU test? Or any strategy game that actually requires the CPU?
The real price of the Intel Core Ultra 9 is $400, not $600, and even wealthy people will turn to Ryzen. That's a shame.. I hope Intel will step up their game and prove that they are truly passionate about CPU architecture.
I want to see non-X3D comparisons. We already know a CPU on gaming steroids will win.
We need i5 tests: 13600K and 14600K, stock and OC, versus the Ultra 5.
I have a request: can you do a recap at the end of the video and also show the overall FPS difference in percentage?
Buying an Intel Ultra plus a new motherboard? Nah, thanks. I don't want Intel to go bankrupt. AMD would monopolize the market if that happened.
Next Video Request,
*Test Arrow Lake Desktop "Alchemist" iGPU w/ 4Xe cores vs Zen 4 /5 RDNA 2 w/ 2 CU Cores.*
Why is the GPU power in Microsoft Flight Simulator so different?
Did you look at the FPS numbers and the percentage usage? 😲
Crazy how much more efficient the 7800X3D is
AMD side: less power consumption, more efficient
No doubt the 7800X3D is the king, but I hope the 9800X3D improves upon the 1% lows. I don't get why in some games the 7800X3D has bad 1% lows. Frequency? Bad game optimization? Or, last but not least, Windows, since as we have seen Microsoft was screwing AMD for years
Barely any difference, just a few % here and there lol. People make it seem like one is double the performance of the other while the difference is single digits 🤣
GPU limited. Next gen GPUs will show much bigger difference.
@@nossy232323 LMAO, not true. The 4090 is being bottlenecked at 1080p no matter what CPU. A 5090 will not change things. CPUs are just too weak, no matter Intel or AMD
@@AdiiS This was not on 24H2
The Intel is so much cooler, kinda cool ❄️
The Core Ultra 9 285K is only for work due to its good stability, while the 7800X3D is good for gaming, but its price is increasing because the 9800X3D is about to release
Can you test X3D mode on Gigabyte boards?
AMD: is that your latest, most powerful flagship?
Wonder if it would have been better keeping HT instead of removing it... or at least adding a few P-cores. It will be interesting in a few months: they had to contract with TSMC to make these chips, so they have to commit to the wafers, and few will buy them at retail price. So we'll see in a few months if prices drop. There have been fantastic sales on 12th gen and 13th gen this past month.
Why is the GPU at 58-80% instead of 99% in every game here?
More power, more price, new socket, less performance
The 9800X3D will be about 13% faster than the 7800X3D; that's a good step.
The current generation of CPUs is weird - seems significantly overpriced for most uses
The market for both the 285K and the 9950X seems to be professional users using heavily multithreaded apps
For those users a 600 dollar CPU that is somewhat faster than its predecessor while using less power is a reasonable purchase.
If you save someone making 100 dollars an hour 10 minutes a day, a new system pays for itself in 6 months.
I'm not sure that any of the other Core Ultra series or AMD 9000 series (non X3D) make sense to anyone with current pricing.
There is clearly no reason to upgrade from last gen and last gen is priced better for new builds.
We will see if the 9800X3D makes sense - Intel isn't even trying to compete in the gaming market.
That's why those "gamers" should stick to Intel 12th gen or AMD Ryzen 5, 7.
They should care about their GPU more.
I remember when Intel fanboys made fun of AMD back in the phenom, Phenom 2, or FX Era. AMD took that personally. Now we're here 😂
Haha.
Pathetic.
Games, always games.
Intel overcame that era long ago.
They'll still be dominant in the CPU market.
You guys can't do a damn thing about it.
All you "gamers" can do is expect these new Intel CPUs to beat AMD's in gaming.
Wrong.
Starfield is the only game in which the Core Ultra 9 285K can beat the 7800X3D... 1 out of 10 games... The new 9800X3D will definitely beat Intel's CPU even in that game...
5800X3D owners (even 5700X3D) are still waiting for a proper upgrade from the 1080 Ti of CPUs 😂
PLEASE ADD: F1 2024 and ACC, they show hugely different FPS!
It's weird to see Intel having less stable frametimes than Ryzen, damn.
Which processor is colder?
285k
@@OmnianMIU Thanks, but my question is what type of cooling they are running the CPUs with. A 240mm, a 320mm, or a custom loop?
@@matiasalvarez3459 It's reported in the first comment here: MAG CoreLiquid 360
@@OmnianMIU Thanks. I have an i9 14900K with 360mm cooling and all the BIOS updates, and I still can't get below 85°C in a game like Horizon Forbidden West.
@matiasalvarez3459 What kind of AIO do you have? Did you install the contact frame?
What version of Win 11 did you use? 😂
It should be tested in a 4K environment
Intel said it would lose in gaming but be more efficient. It's neither better in gaming nor better in efficiency; Intel committed hara-kiri. Now I want to see the 9800X3D finish the job.