@@angeluorteganieves2539 I already tested a tuned and optimized 9950X against a 14900K, so I could use those results as a comparison; however, I want to make sure every CPU is tested with the same Windows version and the same GPU drivers.
The 265K is $299 at Micro Center rn ... I'm leaning towards buying that because that's not a bad deal. I mostly use Lightroom/Premiere Pro and do some gaming at 4K, so I need an all-arounder. Would that be a wise choice for that price?
If you take a look at my 265K video: ruclips.net/video/bxCXe9MC3nU/видео.html A drop in price of 33% ($399 to $299) would make it much better value when compared with a 9900X and 14700K. And remember, I tested this chip before the latest round of updates that will improve its performance. So yes, I think it would be a good buy for your use case ... especially if you get one of the newer B860 motherboards, which are cheaper than the Z890 boards.
The irony is that even though Intel now uses TSMC's manufacturing process, it still remembers the high heat generation and frequency reductions caused by AVX-512. This means we will not be able to use the AVX-512 instruction set, which Intel itself invented, on the Intel platform for the next few years.
I partly agree, but the 9800X3D costs 750 euros in Europe, and it is much slower in work tasks across the board; as a gaming processor, though, it is good. The second point is that a motherboard alone runs $300-400.
I agree on the price of the CPU, it's too high, however you can get cheaper Z890 motherboards and run DDR5-8800 without much of a performance penalty. That said, AM5 is in a great place because you can get cheap 600 series motherboards.
Wooooow, incredible work! I have been waiting many days for this review, thank you so much. My motherboard is the same. I started playing Call of Duty, but after watching this video I will start tuning the processor. Thank you man, really very grateful.
@@blackbirdpctech I am trying to improve it: I have the ring bus at 4.2, the P-cores at 5.6, the E-cores at 5.0, the NGU at 3.6, and the die-to-die at 3.6. I'm doing the stability tests now; after an hour on the CPU-Z stress test the CPU package temp is 97C. My memory stayed at 8800 CL42, I can't go past that ... I don't think I have enough experience in RAM overclocking. I have everything on MSI Extreme mode (300 watts). The new BIOS gives me 8 ns more latency, so I'm thinking of going back to the old 0x113, which is better for me. I'm using Windows 10, and the performance with this tuning is 12% IN AVERAGE FPS and 26% in 1% lows in Crysis (2007), which is a single-core CPU game. I will soon install Windows 11 on another SSD and do the comparisons.
I reviewed the 265K recently: Is The Intel Core Ultra 7 265K Really That Bad? ruclips.net/video/bxCXe9MC3nU/видео.html I plan to review it again after the latest updates from Intel in mid January.
Great review, and I appreciate the performance tweaking tips. I still plan to use the 9800X3D once the CPU and mobo are back in stock, and will use that video to tweak. But this video is great for comparison and shows that maybe there is hope for Intel yet, at least when it comes to performance.
In terms of gaming performance it's pretty hard to beat a gaming CPU like the X3D. The X3D really saved AMD from a lot of headaches with the gaming community, and I am really happy they managed to do this.
I rarely comment, but thank you very much for the nuance and the *updated* information on these. I’m contemplating going for a high end build and using a 285k, but just unsure if a 9950x is the better choice. Quick sync is a great thing and missing on AMD (unless I get an Arc card as a secondary gpu)
I plan to compare the 285K against a 9950X after the new updates for the 285K are fully released … both are good options, the 9950X was a great chip to test.
Your willingness to actually enjoy yourself with these (a wrestling promo for the actual comparison), along with BOTHERING TO SHOW 1080p/1440p results on settings that aren't ultra maxed-out with ray tracing enabled? Happy to watch more of these. Noting some methodology issues with other day-one reviewers is nice too. They say they don't overclock or check BIOS settings in order to better emulate the average gamer's/viewer's experience, but then they only ever test at max settings, which few people do (especially on CPUs that aren't the ones in this video, which are the crown and peak of the CPU pile).
I'd stick with the 9800X3D; Intel still has issues going into this new gen of CPUs. We just have to wait till they become available to us mere mortals, since allocation is set aside for certain tech tubers initially. Once the 9950X3D comes out we may get our hands on lower SKUs if we're lucky.
9800x3d was available in Australia at least from all retailers but sold out after each batch arrived over a week or so. Just keep checking stock at your local retailer or ask when the next batch will arrive and place a backorder if need be.
@latinlowrider59 Where I live they don't allow backorders, and oddly they won't ship CPUs. They did this when the 5950X launched as well; the only way I got one back then was that my brother had to check the store daily in person, then use a carrier to ship it. If these are going to be that difficult to get, they're not worth my time. This is just another sign that it's time to give up on this hobby, as humans suck and ruin everything.
@latinlowrider59 Way up in the Arctic region of Canada. It's OK though; the difficulty of getting parts is killing the drive to even want them anymore. I'll still locate parts for friends who don't mind spending the $$$, but personally I'm losing interest in the hobby as more and more handicaps get added to obtaining parts. If it's not scalpers, it's no shipping on new products, insane markups, or paper launches ... it really makes ya reconsider the hobby. But I appreciate the offer.
One doubt: I have just bought a new Core Ultra 9, and I'm installing drivers. Would you recommend installing the Management Engine driver, or is it better to leave it out?
The ME should always be kept updated to the latest release, both firmware and driver, and I think Matt would agree. Before doing it, though, it is recommended to read about the firmware, learn about the fixes, any BIOS updates you may need to perform after the ME update, etc.
You should always install the ME driver (as stated above). Typically the notes for the latest BIOS state that you should also install the latest ME firmware.
Will you update your comparison using the more current X870E-chipset Aorus Master that is better matched for handling DDR5-8000+ speeds, or at the very least use the latest AGESA 1.2.0.2b update that addresses memory latency issues and has a demonstrable gaming performance uplift? This may sound nit-picky, but you made sure the Intel rig had all the latest advantages available: the latest Arrow Lake OS patches, BIOS, and DDR5-9000 (which is by no means a guarantee on 285Ks and requires some degree of binning).
The chipset for the X870E didn't change from the X670E; features like USB4 were added, but there should be no performance difference. That said, you really should take a look at my recent posts on the X870E Aorus Master, and my 265K review, where I show the performance difference between the X670E and X870E and explain why I chose to continue testing with the X670E Aorus Master. In summary, there is something wrong with the X870E Aorus Master ... I've now tried three different BIOS versions and the performance is still significantly WORSE than my X670E with the same components. It's so bad that I decided to change my X870E testbench to the Asus ROG Crosshair X870E Hero, and even that isn't able to match the X670E Aorus Master (although it's much closer). I provided charts to show this in multiple recent posts, including one that I posted earlier today. So not only did I not bias the results against the 9800X3D by using my X670E, I actually helped it achieve better performance. I know this doesn't align with the thought that everyone is "out to get you", but I can assure you I do my best to make sure that each CPU is given the same level of care when I test it. Also, getting DDR5-9000 to run stable has more to do with the motherboard you use than with the silicon quality of the CPU ... the memory controller in Arrow Lake is extremely good.
@@blackbirdpctech So I am in the unique position of having both the X670E and X870E Aorus Master boards on hand, and while I don't doubt your account of performance issues with your specific X870E board, I have not seen any anomalies like a noticeable performance drop on my end. I do try to follow other searchable conversations on the X870E and have not seen much chatter about your performance issues either. I will say that I spent significant time (4-5 months) working with Gigabyte support on the X670E Master BIOS builds: multiple BIOS releases (beta and release) were unable to properly disable the iGPU and HD audio when specified, and I worked with various support staff (including an engineer directly at Taiwan HQ). I have not benched the X670E with the latest AGESA, but when time permits I'd like to see if there is a noticeable performance delta, particularly in 1% lows at 1440p or higher resolutions. I do have observable quirks with my X870E: my G.Skill 8000 kit that failed to post on the X670E works perfectly stable on the X870E at CL40 and CL38 and at only 1.4V (both using a 9800X3D). HOWEVER, my seemingly very dependable G.Skill DDR5-6000 CL30 EXPO kit that ran fine at 6200 CL28 1.35V on the X670E will eventually get errors/crashes on the X870E running at default EXPO 6000 CL30 timings (and won't post at any higher speed, no matter the voltage boost, up to 1.5V). Very odd and annoying.
I reached out to Gigabyte and they confirmed an issue … not sure if you’ve carefully benchmarked both motherboards with the exact same components, but I have and the data for the X870E is not good. I tried multiple boards and got the same result on both. Even my Asus X870E Hero doesn’t perform as well as my X670E Aorus Master. I will be meeting with Gigabyte at CES to try to get some resolution because at this point, I’ve moved away from Gigabyte motherboards. My experience with the Z890 Aorus Master was even worse, and I tried two of those boards too.
@@blackbirdpctech What specific issue has Gigabyte confirmed? And is it firmware-fixable, or a hardware revision requiring an RMA? And for your performance drops, I am assuming these are most noticeable when benching at low resolutions like 1080p (something I don't generally do at home)?
They confirmed that they were able to replicate my performance issue and they were “looking in to it further”. It was at multiple resolutions and game settings. Take a look at my community posts … I show all of the data and discuss it in much greater detail. Unfortunately each bios update since then has made it worse. I can’t imagine many people test both boards with the same components, so it’s not surprising that people aren’t complaining because most wouldn’t even know it’s an issue.
No, not at the moment ... the cost and performance wouldn't make that a worthwhile upgrade. If Intel continues to improve the platform, then that might change.
I like your benches but I think medium is a mistake. In some games Ultra settings have higher CPU load as well. Due to more draw calls or objects on screen. Otherwise love em. Add STALKER 2 and Valheim :P and SC2 if possible
I was trying to capture the full range, from 1080p/low to 4K/ultra ... 1440p/Medium was a mid point, more of a balanced CPU/GPU load. Happy to add/replace games in my benchmark suite as long as I can test them in a repeatable manner ... that's why I typically go for games with built-in benchmarks ... CS2 is tough to bench consistently ... will take a look at the others.
It always was worth buying; only if you are purely a gamer was it not. The channels that said it was poor were gamer channels like Gamers Nexus and Level1Techs (who did not actually say it was poor), and non-gamer channels never said it was poor, because it was not. I welcome improvements and bug fixes; people who bought it, like me, have never had a complaint about it, and I have 20 of them. The next release will be vastly better, as this was a big change for Intel. Good video though ... at least you understood the technology, unlike most reviewers. The retail CPUs were nothing like the reviewer ones (I never had one BSOD on any of my 20 CPUs), though I don't care about games, and Windows did have a significant impact, like it did with AMD when it released its own new CPUs. Good vid.
Hopefully this data shows that it is actually quite good at gaming now too. My guess is that Intel wasn't planning to release it as early as they did but that their hand was forced by issues with Raptor Lake combined with the imminent release (at the time) of the 9000 series by AMD.
In the end it's still the pricing that is in AMD's favor. Price to performance cannot be beat at this point if you're simply just gaming, and you can still use the 9800X3D for work, since benchmarks mostly do not apply to real-world situations and usage. Not only is Intel just marginally faster in some games, you'd also need more expensive components along with making tweaks to improve performance. When building a PC you have to factor in the GPU too, which is why most gamers stick with the price-to-performance ratio. If you're rich then go for it, but the majority of gamers will always go for the cheaper option, especially when that cheaper option performs just as well as the more expensive one.
I had been waiting for Arrow Lake to build a new audio DAW; unfortunately, as usual, the vast majority of reviews based their rating only on gaming performance. If you couldn't care less about gaming and want a CPU for actually Doing Something, then the Ultra 9 285K seems a good choice. Still some problems, mainly RAM choice, price, and the NEED to have Windows 11 24H2, otherwise the latest updates don't work as they should.
The 9800X3D is still far better for gaming. The microcode update improved Intel, but not nearly enough. For productivity the Intel CPU is better, with similar IPC on the P-cores and the much-improved E-cores, giving 24 cores in total versus 8. But the tile-based latency Intel has is just horrible for gaming, and without 3D V-Cache Intel cannot mitigate it nearly enough to compete in gaming with the X3D CPUs.
@@blackbirdpctech Thanks. Do you know how it compares to the ASRock Z890 Taichi OCF or Asus Maximus Apex? I'm seeing a lot of overclockers using them on forums.
@@Tubezilla I have a Z890 Apex and I used an Apex Encore for Z790 ... I haven't tried the Taichi, but they should all be pretty close wrt memory OC'ing, given that Arrow Lake has such a good memory controller.
@@Tubezilla 2-dimm motherboards will always give you a better chance of getting higher speed memory stable. But if you ignore memory then there really isn't a big difference between these higher-end boards in terms of FPS ... it's your CPU and GPU that will have the largest impact.
actually, it did much better than i thought. there are a few games with 1% low issues. when i get my 4k 32" oled, it seems like i could go either way and be happy.
@@rozzbourn3653 I seriously considered that monitor after seeing it at CES last year (the picture quality is amazing) ... if the new GPUs use DP2.1 then it will be a great option. I ended up going for the Asus (OLED) and LG (WOLED) options because I wasn't using a GPU with DP2.1.
@@blackbirdpctech i plan on upgrading my 9900k 2080 super system in the next few months. im not in a hurry, so im going to take my time and get exactly what i want at the prices i feel good about.
My last 7 PC builds have been Intel and Nvidia combo. My next build will be AMD CPU and Nvidia GPU build. Either getting the 9950x or waiting for the 9900x3D. Intel has lost me as a repeat customer who would always buy their highest end i9s.
There's a lot of disarray in Intel's strategy right now. They could've stopped this with the Ultra series but decided not to, lol. It feels like they never tested these CPUs in real-world scenarios and released them with the corporate arrogance that people will buy them anyway. While I love the Ultra's architecture, it has so many flaws that it is very hard to buy one to use, maybe just as a collection piece to remind me years from now of how you can deliver a state-of-the-art architecture with the most unexpectedly poor functionality. Hectic!
Depends entirely on your settings and your games. If you play at 4K native (never any upscaling) and at max settings, the difference between processors should be very small at this point in time. Whatever CPU is faster in the 1080p tests will very likely be faster in the future in 4K too. It gives you an indication of how fast the CPU can run the game, and also how performance will differ in future more CPU intensive games. If you already have a CPU fast enough for your settings, there are generally no reason to upgrade. :)
When AM5 was introduced as the new platform to migrate to from AM4, the only thing preventing users from buying was the expensive DDR5 and mobo prices; users could still benefit from the new platform in terms of performance, less heat, and more power efficiency. On the other hand, Intel's new platform got a much worse start in this migration: gaming performance became an issue, even a regression, and while productivity is on par with the predecessor, the only significant benefit is lower heat and power consumption.
@@yr6sport418 Idk bro, I'm willing to trade my 9800X3D for a 285K plus the cash difference. So far I'm not impressed with this CPU. I don't game at 1080p, and I feel more dips in MMORPGs than with my old 13900K; the average is higher, but the 1% lows are really low in BDO and Throne and Liberty, even in Lost Ark when there are plenty of monsters on screen. Running on an X870E Hero with G.Skill 6000 CL30.
The performance of the 285K in gaming really surprised me ... it definitely shows the potential of these chips when configured with the right firmware ... the big questions now are if Intel can continue to make improvements and if they will drop the price.
@@blackbirdpctech It seems to me like this gen of Ultra was released in a hurry. This is also a big step into the unknown for Intel, its first CPU built on the most advanced TSMC node. They'll definitely do better next time; Intel loves to improve slowly, but now they don't have to worry about lithography slowdowns and can fully focus on performance.
Maybe in your country the 285K could be good value. Over here it's 200 euros more expensive than the 9800X3D, and the 265K is more in line with the 9800X3D, just a little cheaper but not by much. Everyone is getting the 9800X3D.
I didn’t say the 285K was good value, in fact I have a value chart in the video that shows the 9800X3D offering significantly better value. The good thing with videos is that it’s tough to argue against something that you can watch.
The problem is Intel failed big time: they went from a 10nm to a 3nm process node and came up with this trash of a CPU, and no amount of sugar coating can fix that. The main problem is that Intel should know by now that they need to come up with their own X3D-style stacked memory if they want to compete in gaming. They had all the time in the world and they didn't even bother. I mean, the 285K humiliates the 9800X3D in everything else except gaming, where it gets humiliated, so the answer is simple: they need to work on that, simple as that.
Stacked cache is not the answer; it's just a hardware hack to fuel fanboy wars. There are other ways to achieve high performance in games without limiting a CPU's functionality the way stacked cache does. The X3D CPUs are also very expensive for what they have to offer, and I'll stay away from them unless they go really cheap, as in under $250 cheap.
What's funny is that Pat G. should have been the right pick to yank Intel out of their rut. And when he allowed PR out dissing AMD for "gluing chips together" and other assorted BS while Intel continued just re-treading 14nm ... well, both are where they should be now. And Pat is potentially on the hook for a lot of BS they fed to investors, potentially to the tune of his entire $200M compensation during that time. Unfortunately, AMD seems to be sinking into the captain's chair of "let's give them just enough to satisfy" vs taking risks.
@@boots7859 Precisely, giving just enough to satisfy is a big mistake. An approach which works for the moment, but which will backfire sooner or later if nothing major changes.
The X3D humiliates Intel in gaming basically in the benchmarks only, because I doubt that people who buy an absolute top GPU would play at anything less than 1440p, which is GPU-bound and will be for a very long time. And if it's not an absolute top GPU, then you'll be GPU-bound even at 1080p anyway. So in the end it would seem that people have a mass obsession with benchmark numbers and ignore far more realistic scenarios. While in productivity and day-to-day tasks the X3D is, let's just say, not the best.
Yes, they are available from Newegg and Corsair however the higher speed kits (>8800) keep going out-of-stock as soon as Newegg updates them, so demand is greater than supply at the moment.
@blackbirdpctech Yeah, I think you should make a video on that topic once they release them officially, which you will most likely do anyway ... just wanted to see what your thoughts are on AI-focused GPUs vs raw performance.
@@BruceUZULUL if it works well and if you can't tell the difference then to me it's fine. I think it's important to remember that these are games, so whether AI generates the frame or the frame is drawn based on a developer defining it, as long as you can't tell the difference, then it really shouldn't matter. This is definitely something I plan to discuss in an upcoming video.
@@blackbirdpctech "Raw power" and "fake frames" are two terms which you should also clarify for users, explaining why they are wrong when used this way, especially when tensor and RT cores are clearly raw power and those "fake frames" are as real as it gets. Every frame which is visible on your display is REAL, people, and is the result of a calculation taking place on physical hardware. As an analogy, people were using the example of a turbo in cars, but it's not even like that. It is like having a second engine which delivers additional power with a new type of fuel.
It can be improved and I hope they'll do this with the entire U200 series. If they really care they'll improve it, especially now when the 285k is out of stock everywhere I look, and not because its gaming performance, but because (with all this series' flaws) the tech savvy users understood the revolutionary aspects happening under the lid. Intel did not release something like this in a long time.
Awesome video! I appreciate you putting objective metrics to what we had talked about during the 285k launch, that Intel is lightyears ahead on the memory controller and this would significantly impact performance. Now we know how much! Maybe AMDs memory controller will get a major performance bump with their next generation MB chips? The moment at 3:33 had me laugh out loud! You forgot the clown nose. :O)~ Come on Intel. Drop the price. Happy new year!
It's shocking to me that they haven't dropped the price already ... they really could make this a compelling option with an aggressive price cut. Happy New Year!
Hi, great review! Was BO6 tested with the correct render worker count in the config files? Higher-core-count CPUs usually perform worse in BO6 due to some render worker count shenanigans.
@@blackbirdpctech ruclips.net/video/vUokFhHUZoA/видео.html This video does a pretty good job of explaining it; do skip over every other tweak though, they're not required. Just wondering whether the render worker count has any effect on the new Arrow Lake CPUs.
Agreed, I plan to do a comparison of the 285K against the 9950X and 14900K after the next firmware update is released. I chose the 9800X3D for this comparison because I was interested to see what Intel’s best chip could do.
I think it's simply Intel being stubborn and not realizing their current position in the market ... that said the 285K has been out-of-stock since launch, so either there is strong demand or not much stock.
Here is the 265K video: ruclips.net/video/bxCXe9MC3nU/видео.html As explained in the video, many of these tweaks are silicon dependent, so I would caution you against simply copying my setup ... better to use it as guidance ... you may be able to push your chip further or need to back-off a little.
@@blackbirdpctech Alright, thanks man :) Yeah, 53x & 47x works fine. I could go higher, but my H150i LCD 360mm can't cool the CPU much more ... in Cinebench R23 it's hitting 93 up to 95C. Thanks again.
So what you're telling me is "Intel fine wine". I'm just gonna use that term to annoy the AMD guys who would say "AMD fine wine" because their products were so broken at launch that it took 1 to 2 years to get all the performance out of them xd
I have to laugh because the AMD vs Intel vs Nvidia thing is pretty funny. On this video I am getting crap from AMD fans because the 9800X3D didn't win by enough and Intel fans because the 285K is not a "gaming" CPU ... so I am going to do my best to stay clear of any battle and simply try to stick to the facts ... the 285K did much better than I expected based on earlier reviews 😉
LOL, was rooting for Pat when he came in, thought he'd bring back some of the old Intel. Wrong! All he did was insult Nvidia (lucky), AMD (rear view), Apple (lifestyle company) and more, just running his mouth. We'll see how well 18A turns out, and whether it's economical or even viable at scale. Keep laughing while you can with your quips; Intel will most likely keep failing until the US Govt. has to step in for NatSec reasons.
Well at least if anything good came out of this AMD is gaining market share, which I'm happy about in the CPU space. Hopefully in the GPU space Nvidia cards will stay on the shelf so Intel can gain market share there.
All I know is that I'm extremely happy with my 265k I picked up for $299. Paired with 48gb of DDR5 7200 and everything is so silky smooth. . . along with my 4070 super. Definitely a bang for your buck PC build. :)
I’m planning on re-testing the 265K after all of the new fixes are released in mid January … I’m very curious to see just how much extra performance I can extract from it.
For AMD the 9800X3D is faster in gaming than the 9950X. For Intel the 285K is the fastest gaming chip for the new Core Ultra series. That said I do plan to do a comparison with the 9950X in a future video.
@@blackbirdpctech Oh, appreciate it! If it is strictly for gaming, then I get you. AMD is about to release the Ryzen 9 9950X3D; that should be an interesting comparison to the Ultra 9 :) Personally I am deciding on the best CPU that will let me game but also handle a creative workflow.
Is it really only 6 months away? At this point it's tough to trust anything coming from Intel ... will Panther Lake be fully functional from day one? Will it put Intel back on top? There are a lot of questions and no answers ... in the meantime we can test what is available today and figure out how to maximize the performance of these platforms.
@@blackbirdpctech Yea, I am really not optimistic in Intel's future plans, they really have internal problems like never before. Hopefully, if they manage to produce the next generation in their own foundry, we'll see some decent CPUs or at least decent prices?
@@TheSkyLynx2 I know it's not a popular opinion but I was more confident in their path forward when Pat was CEO ... I do think that fabbing their own products will be a very important step to getting back to the top ... only time will tell.
@@blackbirdpctech Yea, who knows. Btw, Panther Lake is not a desktop CPU, and it looks like they've cancelled the Arrow Lake refresh, so the next Intel desktop chip must be Nova Lake.
I don't think it really matters tbh. The average gamer has a 3070 graphics card, for example. Only those who absolutely must have everything on ultra at 4K pay the top bucks. Heck, even I own a 4K monitor and I don't have issues playing anything. It's kinda funny that Americans are freaking out about the prices; us Europeans have paid those and much more since the 2000-series GPUs.
So you're saying, in order to get performance closer to the 9800X3D: get really expensive RAM, mess around with timings, overclock, and test and fiddle with voltages and other settings. Riiiiight! Yeah, I'll stick with a 9800X3D.
A couple of things: 1. The final tweaks for the 285K do not include any voltage modifications. 2. The 9800X3D in these tests is highly tuned and optimized, as explained and detailed in the video.
@@DavidAlfredoGuisado that's not true at all ... did you see the tweaks that I made to the 9800X3D? The tweaks for the 9800X3D resulted in an 8.3% increase in Cinebench score whereas the tweaks for the 285K resulted in an increase of 7.8%, which is similar but less than the 9800X3D. Furthermore, the -30 CO undervolt that I applied is heavily silicon dependent, as is the max CPU boost clock and ability to run DDR5-8000 stable.
@blackbirdpctech Who cares about a synthetic Cinebench score!? We are talking about real-world gaming here. You make it seem as if the folks are wrong about the 285K and it's better than reviews are saying. The gaming performance you get out of the 9800X3D STOCK is incredible. You are saying that to get close to it, or a little better in some cases, you should spend more money and tinker around in the BIOS.
@@Sirjojovii I love it when someone says "real world gaming" ... hopefully you can appreciate just how much of an oxymoron that is. The cinebench score was simply used to provide a measure for how beneficial the tweaks were. And yes, the early reviews do not match current performance ... that is the point of the video. That said, I do agree with you that a 9800X3D is a better gaming option ... I specifically say that in the video, but it's important to also point out that the 9800X3D in the video was highly tuned and optimized as well ... not sure how that offends you.
Great video! I appreciate you taking the time to work with Arrow Lake and not just throwing it away as soon as the 9800X3D released, like Jay and the other big PC tech channels. I'm an early adopter of the 285K coming from a 4th-gen i7-4790K, so I know I will experience a world of difference either way (just waiting on the 5090 to complete the build). And I will be doing workstation tasks like 3D rendering and video editing a bit more than gaming. My only concern is the temperature increase from the P- and E-core overclock. 91C seems a bit too much, so I may not be able to go the overclocking route. Does the 91C and high power draw occur in just short spikes, or is it more steady?
That's a great question ... you only get the temp increase when you use a power profile that is above the Intel default ... the one I used keeps PL1 at 250W but increases PL2 to 295W ... PL2 is a temporary increase, whereas PL1 is steady state, so I thought it was worth it ... if you are concerned I would recommend simply using the Intel default power profile ... you might have to backoff some of the tweaks a little, but then again you might not.
You are correct that DDR5-9000 will not work on most 4-dimm motherboards however DDR5-8000 will also not work on a large number of AM5 motherboards, so I think it's a somewhat balanced comparison from that perspective.
It scored much better than I thought. They may have redeemed themselves. Still glad I went with my AMD 9950X though; an excellent all-rounder and much cheaper for me.
Normally you shouldn't compare the 9800X3D and the 285K, because the 9800X3D is just AMD's upper-mid-tier CPU while the 285K is supposed to be the top high-end CPU (since Intel cancelled the 295K with 8 P-cores and 28 E-cores because the efficiency was not the same as for all the other ones ... OK, but honestly, what do you want with even more E-cores?). For a fair competition against the 285K you would rather compare at least the 9900X3D or even the 9950X3D. Btw, here in my country we use euros, and here the 9800X3D already costs 579€ while the 285K is currently on offer for just 550€ (tax included). It's rather a bad joke, but the oh-so user-friendly AMD suddenly is not that user-friendly anymore ... and both CPUs are currently nowhere to be found, except from scalpers at inflated prices. Well, most likely the 9900X3D will be the worst in pure gaming, because it only gets 6 cores with X3D cache, two fewer than the 9800, and we still don't know if AMD fixed the scheduling this time so games don't land on the non-X3D cores, which naturally causes FPS drops.
@@blackbirdpctech Yeah, purely aiming for gaming it is, I admit. But on paper, strictly speaking, it's not 100% fair. Though I don't even think the 9950X3D will be that much better than the 9800X3D purely in gaming ... probably a little bit, because of the slightly faster clock speed. But in high resolutions like 4K even L2 cache can play an important role, and the 9950 will have more of that too. So I'm still excited for the first gaming benchmarks between those two.
Yes, the DDR5-8000 32GB kit can be tuned with tighter timings than a DDR5-8000 48GB kit. Given that none of the games that I benchmarked came close to using 32GB of RAM (max system usage was around 20GB), and that both kits are single rank, the additional capacity of the 48GB kit provided no benefit. For the DDR5-9000 kit there isn't a 32GB option, so I had no choice but to use 24GB DIMMs. If you were to run an application that uses more than 32GB of system RAM then it would become an issue. My objective was to tune and optimize both systems as much as possible so I could compare best vs best configs.
95% of people interested in this stuff are gamers, plain and simple. Best bang for the buck is all that really matters. I left x86 recently, no more Windows or Linux, and just got a screaming Mac Mini M4 for $560+ that comes complete with a GPU that can play AAA titles at 1920/High at 60fps as a bonus. Maybe 50W, silent, and about equal in brute force to a Ryzen 9700 Microcenter deal. I'm thinking macOS is going to be somewhat less buggy than MS, with less of the occasional messing around of Linux, and its own set of quirks that I think will be fewer than what I've gotten used to. Started with an Abit BP-6, a couple of Celeron 300As and a pencil, and now I'm on a baby monster the size of 5-6 CDs stacked. And cheap as chips. Walk away.
I wanted to compare the best new chip from each company in gaming, and for Intel that's the 285K while for AMD that's the 9800X3D. The 265K is cheaper but the silicon quality isn't as good, so you can't overclock the chip as much. By the way, I assume you mean "multi-purpose" and not "multi-platform" and the 9800X3D and not the 9890X3D, which would be interesting.
You can still use the 9800X3D in productivity since it's still an 8 core 16 thread CPU that uses DDR5 and doesn't have the lower clock speeds of the previous X3D chips. Who says it cannot be used for productivity?
@@blackbirdpctech Yeah, it may not be the best at productivity but it would still do the job. Even past CPUs that aren't as powerful as this were used for productivity. The only difference is it gets things done a little slower compared to the Intel. That's like saying the Intel cannot game because it's more of a productivity CPU. Both can be used for gaming and productivity, but both have areas where they excel.
Its very simple, I also managed to get great results form my 285k by increasing the normal lotus panendermic parameter of the VRMs, and adjusting every seventh non-differential setting to its modial factor. I was also able to significantly improve system stability by adjusting the fourth order transform within the memory sub-inverse reactive timings to their max levels.
That's pretty funny but keep in mind that I also tuned and optimized the 9800X3D in this video. The reason I show step-by-step guides is to show just how easy it is to tweak these CPUs ... it may sound difficult but it really is quite easy, it just takes time.
With all the fixes to the Z790 & Z690, I would have liked to have seen it thrown into the mix. I mean, with the fixes, does the 14900K perform better or worse now than before the fixes?
I plan to test the 285K against the 9950X and 14900K in a future video. That said, I've tested the 14900K extensively before and after the latest microcode update and the performance impact is minimal if you compare using Intel recommended settings ... the problem is most motherboards didn't use these settings when Raptor Lake was first released, so there is a performance hit as a result.
A bit biased at minute 25:05. The last three points are redundant, and the recommendation isn't even an objective point to check off. Instead, there should be two points added: that the 285K performs faster at 4K, and that the software is still being improved, like the BIOS microcode updates coming on January 15th. It would be interesting to see similar RAM benchmarks but for the 265K.
I quite literally stated those exact points in the video, which is evidenced by me actually saying it in the video. Furthermore, what exactly is biased? Here is the final part of my recommendation: "With that said, these results do offer up some interesting choices. On the one hand, if you are building a new gaming focused system and you want the very best, then based on these results, I would recommend buying a 9800X3D, if you can find it. It’s a truly amazing chip that offers outstanding performance at an equally great price. However, if you do a lot of professional workloads in addition to gaming, and you believe that Intel will indeed improve gaming performance with their next firmware update, then the 285K might be an attractive option. It will unfortunately cost you more, and availability of the 285K is not much better than the 9800X3D, but it might finally offer Intel fans a compelling reason to upgrade. I was actually quite impressed with how competitive the 285K was in these tests, the only thing really letting it down is the price, which is something Intel CAN do something about. Hopefully Intel can come through with their new firmware update in January because as things currently stand, AMD is dominating PC gaming." The 0x114 microcode update was released early and there are numerous reports that the performance actually dropped ... I will need to check when I get back from CES but if true, would be extremely disappointing.
I think the positive day one reviews are why the 9800X3D has been sold out ... but why has the 285K been sold out? That to me is a more interesting question, given that the reviews were not good.
@@blackbirdpctech Nobody wants to admit they’ve bought it, people are afraid of shaming and backlash 😂 Paraphrasing a well known comedian, “never underestimate the power of fanboys in large groups”, rofl.
So you don't need to be a rocket scientist, but you do seem to need a computer engineering degree. I wanted to see if an Intel build was in my future, and no, I'm not interested in coaxing performance out of a consumer product. When I build a Ryzen system I buy one of the many X3D parts, set PBO in the BIOS, and you have the best experience without the hassle.
The tweaks that I made to both chips are relatively simple ... you certainly don't need a degree to perform them. I agree that X3D chips are a much better out-of-the-box experience however I would definitely encourage you to make some basic tweaks, like undervolting. Simply turning PBO on doesn't really do much.
That wouldn’t make sense. The DDR5-8000 32GB kit can be tuned with tighter timings than a DDR5-8000 48GB kit. Given that none of the games that I benchmarked came close to using 32GB of RAM (max system usage was around 20GB), and that both kits are single rank, the additional capacity of the 48GB kit provided no benefit. For the DDR5-9000 kit there isn't a 32GB option, so I had no choice but to use 24GB DIMMs. If you were to run an application that uses more than 32GB of system RAM then it would become an issue. My objective was to tune and optimize both systems as much as possible so I could compare best vs best configs.
@@blackbirdpctech The X3D only won by a very narrow margin, and this is not what the other tubers were saying ... the difference should be huge because the X3D is king and Intel is e-waste trash tech 😂. I had a feeling this was going to happen after this comparison. Oh gosh!!
Yes, you definitely called it … that said, my job is to show the data without bias, even when it goes against the expectations people have. Unfortunately it’s the larger tech YouTubers that are trash … not one of them followed up and retested the 285K after the recent round of updates. They did the exact same thing with the 9700X … they trashed it and left it for dead.
@@blackbirdpctech Totally true. I don’t know why they are doing that, though I can speculate. And yes, I noticed the silence regarding the 9700X, which is an awesome CPU that I'm really considering for a future upgrade.
@@blackbirdpctech Point taken. Although, there may be other bias in this test because the CPUs are, technically, meant to be different performance classes. Granted, AMD's Ryzen 9 X3D chips are, at least, a few weeks away from release.
Note: all these PC upgrades mean nothing if your internet is bad and laggy. At my old address I was on 100Mbps with a ping of 26; now at my nephew's new house I'm on WiFi at 50Mbps with a ping of 36. New fibre optic cable is coming to the rear unit I'm in come January 2025, so I hope that's OK. Today on 50Mbps WiFi I watched my BF5 ping go from 122 to 999; I hit Task Manager and restarted Explorer, and then it was back to 108-122 in BF5 on the Oceania/Asia servers, where aimbots play daily :)
Someone who will pay so much money doesn't want to do all those things to get maximum performance in gaming; they will choose the 9800X3D to get the best framerates out of the box.
The reason I provide step-by-step tweak guides is to show just how easy it is to extract max performance from your hardware. There is a misconception that this is difficult, it's not. Furthermore, the 9800X3D in this video was highly tuned and optimized as well, which is why the comparison is fair. Please don't assume that you know what others are thinking and/or are willing to do.
I don't like the testing components, but I might be uneducated. You're showing the difference between RAM timings on Intel without much explanation of why it happens. This also makes it seem like you're cherry-picking components for the Intel build while going for "standard" components for the AMD build. I'm not saying this is the case, I'm just saying it makes it seem that way. Perhaps some component would increase the AMD performance, perhaps not. But not explaining anything about it makes it seem sketchy. I'm not a brand fanboy so I don't care who wins, I just want the best FPS.
I assume that you must be new to my channel (which is not an issue of course) … you can take a look at some of my recent content on the 9800X3D and AM5 memory optimization to get a better understanding of the AMD platform and why I chose the components I did. The AM5 platform is quite different from Intel platforms and does not scale linearly with memory speed.
@@blackbirdpctech I do understand that they don't scale linearly. But it's not rocket science to realize that most of your viewers on new videos have not watched the pre-existing explanations of key parts of the video.
I can understand that it would be easier for you if all of the relevant content was contained in one video however it's important to keep the length of the video down, and given that this video is focused on the 285K, and is nearly 30min in length, I decided to not repeat information contained in my recent 9800X3D and AM5 memory videos. What part of my AMD build do you consider to be "standard" and how do you think the performance could be improved with different components? I used DDR5-8000 and I lowered the timings. I undervolted with a -30 all core CO. I overclocked the CPU by 200 MHz. I don't think you truly appreciate just how highly tuned and optimized this 9800X3D is ... it's not something most 9800X3D owners would be able to achieve.
You should read some of my recent posts on performance issues with X870E motherboards, in particular the X870E Aorus Master, which is the reason I have been using my X670E Aorus Master. The last 3 bios updates from Gigabyte have not fixed the issue. Regardless of that, there should be no performance difference between the motherboards since they use the exact same chipset.
They are mostly identical. The chipset is a straight rebrand, with the only difference being that the 800 series forces manufacturers to implement USB 4.0. Sure, there might be a few cases where the motherboard manufacturers decided to make some PCB improvements, but at most that could improve memory overclocking slightly. So far, we haven't really seen any performance difference outside the margin of error.
@@blackbirdpctech Yeah, it's perfectly valid to test with an X670E motherboard seeing as there is essentially no difference between them. It was really just a way for AMD to guarantee USB 4.0 support (at the cost of removing some freedom in PCI-E lane allocation from motherboard manufacturers) and for motherboard manufacturers to sell slightly tweaked revisions of their motherboards under a new name.
Did you watch the video? The 285K performed much better than I thought it would based on the reviews I had seen, so it looks like the recent microcode updates helped.
Empirical evidence says otherwise ... they work great even without the next microcode. These are really good CPUs with an advanced architecture and the latest TSMC tech.
@@blackbirdpctech Yes I did, and the microcode gives gains and losses, which confirms its unpredictability, and hence why I wouldn't recommend anyone purchase one.
Agreed, it will be interesting to see if Intel releases binned versions of these chips, such as a 290K … but they have to adjust their pricing before they do that.
Unless you've tested the most CPU intensive parts of the games (like what HUB does), you will get misleading results. They demonstrated how easy it was to get results showing the 285k matching the 9800X3D in GPU intensive parts of games. So, they made sure to avoid that. As for whether the 285k is worth it? For gaming? No. And now that the 9950X3D is about to be released. Well... the 285k can finally be laid to rest.
Are you seriously using HUB as an authority on testing?? They test CPUs at 1080P/Ultra, which doesn’t load the CPU. I showed how fundamentally flawed their testing methodology was in my 9700X video. So please use a better source, Steve from HUB is a clown.
@@blackbirdpctech Wow, I just noticed you messed around with the 9800X3D using CL36 8000 (high latency) when CL30 6000 or 6400 was the way to go??? Then you messed around with the F clocks, didn't use the newest chipset, and what BIOS? And you loaded the 285K with the latest and greatest and messed with the memory timings? Now I definitely can't take these benchmarks seriously, as they do not in any way represent a realistic scenario for most gamers. Had I seen that, I would not have wasted the time commenting on this video. And you call Steve a clown?
I don't know where to start. DDR5-8000 in 2:1 mode is faster than DDR5-6400/6000 in 1:1 mode ... probably a good idea to look at this video if you want to learn more about how AM5 based Ryzen CPU memory works: ruclips.net/video/JuUhnQaGG_I/видео.html You will also learn how to optimize your infinity fabric clocks ... for DDR5-8000 you are able to synchronize your clocks ... something you can't do for DDR5-6000/6400 RAM. With respect to the chipset, the X670E and X870E chipsets are the same ... the only difference is the features that were added, such as USB 4. The performance should be the same however X870E motherboards are currently performing worse than equivalent X670E boards. The Bios version, Nvidia drivers and Windows version are all listed on each benchmark chart and the Bios version for each motherboard is also in the CPU comparison table. So yes, Steve is still a clown and I'm glad that you commented, it provides an opportunity to correct some of the misinformation that HUB spreads on their channel.
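To make the 1:1 vs 2:1 discussion concrete, here is a rough sketch of the AM5 clock relationships. The ratios shown (MCLK as half the transfer rate, UCLK at MCLK or MCLK/2 depending on mode, FCLK around 2000 MHz) are assumptions based on commonly cited AM5 configurations, not values verified for every board or kit.

```python
def am5_clocks(ddr5_mts, uclk_mode="2:1", fclk_mhz=2000):
    """Rough AM5 clock relationships (illustrative sketch, not board-verified):
    MCLK (memory clock) is half the DDR transfer rate; the memory controller
    (UCLK) runs at MCLK in 1:1 mode or MCLK/2 in 2:1 mode; FCLK (Infinity
    Fabric) is set independently, commonly around 2000 MHz."""
    mclk = ddr5_mts / 2
    uclk = mclk if uclk_mode == "1:1" else mclk / 2
    return {"MCLK": mclk, "UCLK": uclk, "FCLK": fclk_mhz}

print(am5_clocks(6000, "1:1"))  # UCLK 3000, FCLK 2000: clocks not aligned
print(am5_clocks(8000, "2:1"))  # UCLK 2000 == FCLK 2000: clocks aligned
```

Under these assumptions, DDR5-8000 in 2:1 mode lands UCLK on the same 2000 MHz as FCLK, which is the synchronization point referenced above, while DDR5-6000/6400 in 1:1 mode leaves UCLK and FCLK at different speeds.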
@@blackbirdpctech Fair point on the chipset differences, however, regarding the memory, your own video stated it was pointless going to DDR5-8000 on AM5 as the gains are minimal and they are even more so when paired with X3D chips, one of which is the subject of this video. On that point, I agree. Your video is useful for those who prefer to tweak and tinker, not for the majority of the userbase. And because tweaking and tinkering can depend greatly on the 'silicon lottery', the results can be all over the place.
Yeah, this one was a long video ... take advantage of the chapters or come back and watch part of it later ... that said, I probably could have created a separate tuning and optimization guide video ... I might do that in the future.
I recommend looking at my recent "What's The Best Memory for AMD AM5 Ryzen CPUs?" video: ruclips.net/video/JuUhnQaGG_I/видео.html This should help you understand why it's not possible to run DDR5-9000 stable on AM5 systems.
In this case, unlike the 9700X which was poorly configured at a hardware level by AMD, I think the day one reviews were mostly correct ... it appears that Intel needed more time before launching these new CPUs. My guess is that the marketing geniuses at Intel forced the Engineering team to launch early. Unfortunately a lot of reputational damage was done with those early reviews and it's going to be tough to change the minds of the broader tech community. I'm hopeful that data like this will help.
I would gladly buy a Core Ultra 7 or a 9700X, but I refuse to buy the overpriced X3D, which should not cost more than $250. The small gain in FPS doesn’t justify the price either.
Yes, it would have reduced the 1% low performance of the 9800X3D by around 10%. I recommend that you watch this video: ruclips.net/video/JuUhnQaGG_I/видео.html to learn more about the best memory for AM5 Ryzen based CPUs. I purposely chose the best setup for the 9800X3D to extract max performance.
For the memory video I used the common EXPO config for each kit: 6000 CL30, 6400 CL32 and 8000 CL38. In this video I ran 8000 CL36. That said, my testing shows that a small reduction in CAS latency does not result in a meaningful difference in performance.
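For reference, the "true" first-word latency behind those CAS numbers can be computed directly, which illustrates why a small CL reduction barely moves the needle. This is standard DDR arithmetic (CL cycles at the memory clock, which is half the transfer rate); the kits listed are just the examples from this thread.

```python
def cas_latency_ns(cl, mt_per_s):
    """First-word latency in nanoseconds: CL clock cycles at the memory
    clock, which is half the transfer rate for DDR (double data rate).
    Equivalent to cl / (mt_per_s / 2) * 1000."""
    return cl * 2000.0 / mt_per_s

for cl, speed in [(30, 6000), (36, 8000), (38, 8000)]:
    print(f"DDR5-{speed} CL{cl}: {cas_latency_ns(cl, speed):.2f} ns")
# DDR5-6000 CL30: 10.00 ns
# DDR5-8000 CL36: 9.00 ns
# DDR5-8000 CL38: 9.50 ns
```

Note how dropping from CL38 to CL36 at DDR5-8000 only saves half a nanosecond, consistent with the small measured differences described above.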
Mostly because we really want these companies to have products with similar performance and decent overall prices, not what we see in the PC parts market today.
This looks fake. I just read that the new intel updates SLOWED the 285k by 18%. Looking at your graphs, this looks like the 9800x3d vs 7800x3d. What is going on here?
If you are going to make a claim like that then at the very least understand what it is that you are saying. The data for the 285K in this video was generated with an MSI Bios with microcode 0x112. The final update from Intel requires microcode 0x114, Intel CSME Firmware Kit 19.0.0.1854v2.2 and Windows 11 26100.2314 (or newer). The idiots at WCCFTech reported on a test that appears to have been performed with the 0x114 microcode but without the CSME Firmware kit, which is where you got your 18% performance drop from. It appears that many of the early Bios updates that included microcode 0x114 did not include the correct ME firmware version. I have not yet tested the latest version from MSI that was released a few days ago.
If it's not for gaming then why did Intel specifically show 285K gaming performance (against the 14900K and 9950X) in their launch deck for the 200S series?
@blackbirdpctech They will always show that regardless, as a sales tactic; the 14900K is still better at gaming than the new Intel CPUs. But very informative video.
It's actually significantly more important than the average FPS because it's a direct measure of how smooth your gameplay is. It refers to the lowest 1% of frame rates experienced during gameplay, essentially representing the absolute minimum FPS you might see during the most demanding moments of a game.
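For those wondering how 1% lows are actually derived from a benchmark run, here is a minimal sketch assuming the "average FPS of the slowest 1% of frames" convention; some tools instead report the 99th-percentile frame time converted to FPS, so numbers can differ slightly between capture tools.

```python
def one_percent_low(frame_times_ms):
    """'1% low' FPS from per-frame render times in milliseconds, using the
    average-of-the-slowest-1%-of-frames convention (tool conventions vary)."""
    n = max(1, len(frame_times_ms) // 100)        # slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:n]
    avg_worst_ms = sum(worst) / len(worst)
    return 1000.0 / avg_worst_ms                  # ms per frame -> FPS

# 1000 frames at ~8.3 ms (~120 FPS) with ten 25 ms stutters mixed in
times = [8.3] * 990 + [25.0] * 10
print(round(one_percent_low(times), 1))  # -> 40.0, despite a ~118 FPS average
```

This is why the metric matters: a handful of stutter frames barely dents the average FPS but drags the 1% low down to what the gameplay actually feels like in the worst moments.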
FPS in general is a metric that is only really useful for measuring the performance of games ... unfortunately it wouldn't be useful for editing or rendering.
@@blackbirdpctech The 285K is a headache and has been since launch; the best CPU is the 9800X3D. The only area where it's even somewhat competitive is select productivity apps. Buy the 9800X3D and forget about it (peace of mind).
I think it’s important to treat every component consistently when I test so that I don’t bias the results … by doing that you will sometimes find unexpected outcomes and this is one of those times … the 285K performed much better than I expected and was a great cpu to test. Intel needs to keep improving it, and they should lower the price, but I thought it would be terrible going in and it surprised me.
Great channel super detailed info hope it grows quickly
Appreciate that!
9950X vs 285K would be another interesting battle.
Yes, I think a comparison against the 9950X and 14900K would be good to see.
The 9950X already won if you tweak it the right way!
@@angeluorteganieves2539 I already tested a tuned and optimized 9950X against a 14900K, so I could use those results as a comparison however I want to make sure every CPU is tested with the same Windows version and same GPU drivers.
or when they drop the 9950x3d and 9900x3d chips.
The 265K is $299 at Microcenter right now ... I'm leaning towards buying it because that's not a bad deal. I mostly use Lightroom/Premiere Pro and do some gaming in 4K, so I need an all-rounder. Would that be a wise choice for that price?
If you take a look at my 265K video: ruclips.net/video/bxCXe9MC3nU/видео.html
A drop in price of 25% ($399 to $299) would make it much better value when compared with a 9900X and 14700K. And remember I tested this chip before the latest round of updates that will improve its performance. So yes, I think it would be a good buy for your use case ... especially if you get one of the newer B860 motherboards, which are cheaper than the Z890 boards.
Excellent review. Thank you for the work you put into these. Especially the detail you give regarding the settings changes.
You are very welcome, glad you liked it!
Awesome video! I appreciate your hard work
Glad you liked it!
Some good solid info here !
Glad you liked it!
Good job and appreciate your work. We need more videos like yours EDUCATING the public
Thanks, glad you liked it.
The irony is that even though Intel now uses TSMC's manufacturing process, it still remembers the high heat generation and frequency reduction caused by AVX-512. This means that we will not be able to use the AVX-512 instruction set, which Intel itself invented, on the Intel platform for the next few years.
If you need a $700 CPU with a $700 motherboard to compete with the 9800X3D, then we have a big problem somewhere...
I partly agree, but the 9800X3D costs 750 euros in Europe, and while it is much slower in work tasks across the board, as a gaming processor it is good. The second point is that a $300-400 board is enough.
I agree on the price of the CPU, it's too high, however you can get cheaper Z890 motherboards and run DDR5-8800 without much of a performance penalty. That said, AM5 is in a great place because you can get cheap 600 series motherboards.
@@blackbirdpctech I would buy 14700k and a z690 board - this is the most universal solution.😇
@@oleksiisl3806 you will be limited on memory speed with a Z690 ... ~DDR5-6800
I just got the 9800x3d for £518 in UK.
Very happy with it, it's a beast!
Wooooow, incredible work! I have been waiting many days for this review, thank you so much. My motherboard is the same. I started playing Call of Duty, but after watching this video I will start tuning the processor. Thank you man, really, very grateful.
You are very welcome ... let me know how much you are able to improve your performance.
@@blackbirdpctech I am trying to improve it. I have the ring bus at 4.2, the P-cores at 5.6, the E-cores at 5.0, the NGU at 3.6 and the die-to-die at 3.6, and I'm doing the stability tests; after an hour of the CPU-Z stress test the CPU package temp is 97C. My memory stayed at 8800 CL42, I can't go past that ... I don't think I have enough experience in RAM overclocking. I have everything on MSI extreme mode (300 watts). The new BIOS gives me 8ns more latency, so I'm thinking of changing back to the old 0x113, it's better for me. I'm using Windows 10, and the performance with this tuning is +12% in average FPS and +26% in 1% lows in Crysis (2007), which is a single core CPU game. I will soon install Windows 11 on another SSD and do the comparisons.
@@ignacioaltamirano9273 great job!
Awesome video
Great to see price is taken into consideration at the end. That Intel gear is so expensive...
That is the Achilles heel of the 285K and why I suggested Intel drop the price … at least the performance is getting better.
The Bruce Buffer bit was awesome. Subbed
He's a cool guy ... welcome to the Blackbird PC Tech community!
Great video as usual 😊
Thanks, glad you liked it
Thanks a lot for these comprehensive tests and analysis.
Glad you like them!
Good review, I'm more interested in a 265K.
I reviewed the 265K recently:
Is The Intel Core Ultra 7 265K Really That Bad?
ruclips.net/video/bxCXe9MC3nU/видео.html
I plan to review it again after the latest updates from Intel in mid January.
Thanks for sharing your knowledge
You're welcome, glad you found it helpful.
Great review and I appreciate the performance tweaking tips. I still plan to use the 9800X3D once CPU and mobo are back in stock and will use that video to tweak. But this video is great for comparison and shows that maybe there is hope for Intel yet, at least when it come to performance.
That is the choice I would make too … hopefully Intel drops the price of the CPU to be more competitive
In terms of gaming performance it’s pretty hard to beat a gaming CPU like the X3D. The X3D really saved AMD from a lot of headaches with the gaming community, and I am really happy they managed to do this.
Agreed.
I rarely comment, but thank you very much for the nuance and the *updated* information on these. I’m contemplating going for a high end build and using a 285K, but I'm just unsure if a 9950X is the better choice. Quick Sync is a great thing and is missing on AMD (unless I get an Arc card as a secondary GPU).
I plan to compare the 285K against a 9950X after the new updates for the 285K are fully released … both are good options, the 9950X was a great chip to test.
@@blackbirdpctech Hahaha that comment makes it harder! When are the updates slated to be released in full? Thanks very much.
Mid January … then there is the new 9950X3D to consider as well
You may wait until the 9950X3D gets released and then wait for new benches. Next month already the date.
@@KamichamaTechstarify I thought they said March?
Your willingness to actually enjoy yourself with these (a wrestling promo for the actual comparison), along with BOTHERING TO SHOW 1080/1440 results at settings that aren't Ultra maxed-out raytracing? Happy to watch more of these. Noting some methodology issues with other day-one reviewer methods is nice too. They say they don't overclock or check BIOS settings to better emulate the average gamer's experience, but then they only ever test at max settings, which few people use (especially for CPUs that aren't the ones in this video, which are the crown and peak of the CPU pile).
Really appreciate the feedback!
I'd stick with the 9800X3D; Intel still has issues going into this new gen of CPUs. You just have to wait until they become available to us mere mortals, since allocation initially goes to certain tech tubers. Once the 9950X3D comes out we may get our hands on lower SKUs if we're lucky.
The 9800X3D was available in Australia, at least, from all retailers, but sold out after each batch arrived over a week or so. Just keep checking stock at your local retailer, or ask when the next batch will arrive and place a backorder if need be.
@latinlowrider59 Where I live they don't allow backorders, and oddly they won't ship CPUs. They did this when the 5950X launched as well; the only way I got one back then was my brother checking the store daily in person, then using a carrier to ship it. If these are going to be that difficult to get, they're not worth my time. This is just another sign it's time to give up on this hobby, as humans suck and ruin everything.
@@shadowarez1337 Where are you located i may be able to help.
@latinlowrider59 Way up in the Arctic region of Canada. It's OK though; the difficulty of getting parts is killing the drive to even want them anymore. I'll still locate parts for friends who don't mind spending the $$$, but personally I'm losing interest in the hobby as more and more handicaps get added to obtaining parts.
If it's not scalpers, it's no shipping on new products, insane markups, or paper launches ... it really makes you reconsider the hobby.
But appreciate the offer.
One doubt: I have just bought a new Core Ultra 9, and I'm installing drivers. Would you recommend installing the Management Engine driver, or is it better to leave it out?
The ME should always be kept updated to the latest release, both firmware and driver, and I think Matt would agree.
Before doing it though, it is recommended to read about the firmware and learn about the fixes, any BIOS updates you may need to perform after the ME update, etc.
You should always install the ME driver (as stated above). Typically in the notes to the latest bios it states that you should also install the latest ME firmware.
Will you update your comparison on the more current X870E chipset aorus master that is better matched for handling 8000+ DDR5 speeds - or at the very least use the latest AGESA 1.2.0.2b update that addresses memory latency issues and has demonstrable gaming performance uplift? This may sound nit-picky but you made sure the intel rig had all the latest advantages available - latest arrow lake OS patches, BIOS, and DDR5-9000 (which is by no means a guarantee on 285Ks and requires some degree of binning)
The chipset for the X870E didn't change from the X670E, features like USB 4 were added but there should be no performance difference. That said, you really should take a look at my recent posts on the X870E Aorus Master, and my 265K review where I show the performance difference between the X670E and X870E, and explain why I chose to continue testing with the X670E Aorus Master. In summary, there is something wrong with the X870E Aorus Master ... I've now tried three different Bios versions and the performance is still significantly LESS than my X670E with the same components. It's so bad that I decided to change my X870E testbench to the Asus ROG Crosshair X870E Hero, and even that isn't able to match the X670E Aorus Master (although it's much closer). I provided charts to show this in multiple recent posts, including one that I posted earlier today. So not only did I not bias the results against the 9800X3D by using my X670E, I actually helped it achieve better performance. I know this doesn't align with the thought that everyone is "out to get you", but I can assure you I do my best to make sure that each CPU is given the same level of care when I test them.
Also, getting DDR5-9000 to run stable has more to do with the motherboard you use versus silicon quality for the CPU ... the memory controller for Arrow Lake is extremely good.
@@blackbirdpctech So I am in the unique position of having both the X670E and X870E Aorus Master boards on hand, and while I don't doubt your account of performance issues with your specific X870E board, I have not seen any anomalies like a noticeable performance drop on my end. I do try to follow other searchable conversations on the X870E and have not seen much chatter about your performance issues either. I will say that I spent significant time (4-5 months) working with Gigabyte support on the X670E Master BIOS builds; multiple BIOS releases (beta and release) were unable to properly disable the iGPU and HD audio when specified, and I worked with various support staff (including an engineer directly at the Taiwan HQ). I have not benched the X670E with the latest AGESA, but when time permits I'd like to see if there is a noticeable performance delta, particularly in 1% lows at 1440p or higher resolutions. I do have observable quirks with my X870E: my G.Skill 8000 kit that failed to POST on the X670E works perfectly stable on the X870E at CL40 and CL38, and at only 1.4v (both using a 9800X3D). HOWEVER, my seemingly very dependable G.Skill DDR5-6000 CL30 EXPO kit that ran fine at 6200 CL28 1.35v on the X670E will eventually get errors/crashes on the X870E running at default EXPO CL30 6000 timings (and won't POST at any higher speed, no matter the voltage boost, up to 1.5v). Very odd and annoying.
I reached out to Gigabyte and they confirmed an issue … not sure if you’ve carefully benchmarked both motherboards with the exact same components, but I have and the data for the X870E is not good. I tried multiple boards and got the same result on both. Even my Asus X870E Hero doesn’t perform as well as my X670E Aorus Master. I will be meeting with Gigabyte at CES to try to get some resolution because at this point, I’ve moved away from Gigabyte motherboards. My experience with the Z890 Aorus Master was even worse, and I tried two of those boards too.
@@blackbirdpctech what specific issue has Gigabyte confirmed? And is it firmware fixable or a hardware revision requiring a RMA? And for your performance drops, I am assuming these are most noticeable when benching low resolution 1080p (something I don't generally do at home)?
They confirmed that they were able to replicate my performance issue and that they were “looking into it further”. It was at multiple resolutions and game settings. Take a look at my community posts … I show all of the data and discuss it in much greater detail. Unfortunately each BIOS update since then has made it worse.
I can’t imagine many people test both boards with the same components, so it’s not surprising that people aren’t complaining because most wouldn’t even know it’s an issue.
Do you recommend the 285K as an upgrade from 14900K?
No, not at the moment ... the cost and performance wouldn't make that a worthwhile upgrade. If Intel continues to improve the platform, then that might change.
I like your benches but I think medium is a mistake.
In some games Ultra settings have a higher CPU load as well, due to more draw calls or objects on screen.
Otherwise love em. Add STALKER 2 and Valheim :P and SC2 if possible
I was trying to capture the full range, from 1080p/low to 4K/ultra ... 1440p/Medium was a mid point, more of a balanced CPU/GPU load. Happy to add/replace games in my benchmark suite as long as I can test them in a repeatable manner ... that's why I typically go for games with built-in benchmarks ... CS2 is tough to bench consistently ... will take a look at the others.
@@blackbirdpctech I meant Starcraft 2. But thanks !
@@CharcharoExplorer one of my favorite games … will look into how I can benchmark it.
It's top class for workstation PCs and reasonable for gaming
Take a look at the data, it will likely surprise you
@blackbirdpctech great content thanks 👍
You are very welcome!
Exactly. It's funny how blind people get when it's not at the top.
It was always worth buying; it was only a poor choice if you are purely a gamer. The channels that said it was poor were gaming channels like Gamers Nexus (Level1Techs did not say it was poor), and non-gaming channels never said it was poor, because it wasn't. I welcome improvements and bug fixes; people who bought it, like me, have never had a complaint about it, and I have 20 of them. The next release will be vastly better, as this was a big change for Intel. Good video though, at least you understood the technology, unlike most reviewers. The retail CPUs were nothing like the reviewer ones (I never had one BSOD on any of my 20 CPUs), though I don't care about games, and Windows did have a significant impact, just like it did with AMD when it released its own new CPUs. Good vid
Hopefully this data shows that it is actually quite good at gaming now too. My guess is that Intel wasn't planning to release it as early as they did but that their hand was forced by issues with Raptor Lake combined with the imminent release (at the time) of the 9000 series by AMD.
In the end it's still the pricing that is in AMD's favor. Price to performance cannot be beat at this point if you're simply gaming, and you can still use the 9800X3D for work, since benchmarks mostly do not reflect real-world situations and usage. Not only is Intel just marginally faster in some games, you'd need more expensive components along with tweaks to get that performance. When building a PC you have to factor in the GPU too, which is why most gamers stick with the price-to-performance ratio. If you're rich then go for it, but the majority of gamers will always go for the cheaper option, especially when that cheaper option performs just as well as the more expensive one.
I had been waiting for Arrow Lake to build a new Audio DAW, unfortunately, as usual, the vast majority of reviews only based their review rating on Gaming Performance. If you couldn't care less about gaming, and want a CPU for actually Doing Something, then the Ultra 9 285K seems a good choice.
Still some problems: mainly RAM choice, price, and the NEED to have Windows 11 24H2, otherwise the latest updates don't work as they should.
The 9800X3D is still far better for gaming. The microcode update improved Intel, but not nearly enough. For productivity the Intel CPU is better, with similar IPC on the P-cores and the much improved E-cores, giving 24 cores in total versus 8. But the tile-based latency Intel has is just horrible for gaming, and without 3D V-Cache Intel cannot mitigate it nearly enough to compete in gaming with the X3D CPUs.
Hi, would you prefer the Unify or the Ace board? Which would be the best Z890 board for overclocking?
The Unify … it’s a 2-dimm motherboard specifically designed for overclocking.
@@blackbirdpctech
Thanks.
Do you know how it compares to the ASRock Z890 Taichi OCF or Asus Maximus Apex?
I'm seeing a lot of overclockers using them on forums.
@@Tubezilla I have a Z890 Apex and I used an Apex Encore for Z790 ... I haven't tried the Taichi, but they should all be pretty close wrt memory OC'ing, given that Arrow Lake has such a good memory controller.
@@blackbirdpctech One last question, I promise: which board gives you the best FPS, Ace or Apex?
@@Tubezilla 2-dimm motherboards will always give you a better chance of getting higher speed memory stable. But if you ignore memory then there really isn't a big difference between these higher-end boards in terms of FPS ... it's your CPU and GPU that will have the largest impact.
Actually, it did much better than I thought. There are a few games with 1% low issues. When I get my 4K 32" OLED, it seems like I could go either way and be happy.
Which 4k 32” oled did you select?
@@blackbirdpctech I'm looking at the AORUS FO32U2 Pro with DP 2.1. I'm waiting for it to go on sale though, because $1200 is not sitting right with me.
@@rozzbourn3653 I seriously considered that monitor after seeing it at CES last year (the picture quality is amazing) ... if the new GPUs use DP2.1 then it will be a great option. I ended up going for the Asus (OLED) and LG (WOLED) options because I wasn't using a GPU with DP2.1.
@@blackbirdpctech I plan on upgrading my 9900K / 2080 Super system in the next few months. I'm not in a hurry, so I'm going to take my time and get exactly what I want at prices I feel good about.
Wait, but did they fix the 13th and 14th gen Intel crashes and stability issues? Or only the new Core Ultras? Oo
As long as we trust them as they say it’s not a physical problem, then it is “fixed”.
They did release a fix for Raptor Lake CPUs a while back ... simply update your motherboard Bios.
My last 7 PC builds have been Intel and Nvidia combo.
My next build will be AMD CPU and Nvidia GPU build. Either getting the 9950x or waiting for the 9900x3D.
Intel has lost me as a repeat customer who would always buy their highest end i9s.
They only have themselves to blame ...
There’s a lot of disarray in Intel's strategy right now. They could’ve stopped this with the Ultra series but decided not to, lol. It feels like they never tested these CPUs in real-world scenarios and released them with the corporate arrogance that people will buy them anyway.
While I love the Ultra’s architecture, it has so many flaws that it is very hard to buy one to use, maybe just as a collection piece to remind me years from now of how you can deliver a state-of-the-art architecture with the most unexpectedly poor functionality. Hectic!
It means that if you are going to play in 4K there is no reason to move to a 9800X3D, am I right?
From what processor?
Depends entirely on your settings and your games. If you play at 4K native (never any upscaling) and at max settings, the difference between processors should be very small at this point in time. Whatever CPU is faster in the 1080p tests will very likely be faster in the future in 4K too. It gives you an indication of how fast the CPU can run the game, and also how performance will differ in future more CPU intensive games.
If you already have a CPU fast enough for your settings, there is generally no reason to upgrade. :)
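The bottleneck reasoning above can be sketched as a toy model (all FPS caps below are made-up illustrative numbers, not benchmark data):

```python
# Toy model: delivered FPS is capped by whichever of CPU or GPU is slower
# for a given game/settings combo. Numbers are illustrative only.

def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """FPS you actually see is limited by the slower of the two caps."""
    return min(cpu_fps_cap, gpu_fps_cap)

# At 4K native/max settings the GPU cap dominates, so two different CPUs
# deliver the same frame rate:
print(delivered_fps(180, 90), delivered_fps(240, 90))   # both GPU-bound at 90

# Drop to 1080p/low and the GPU cap lifts, exposing the CPU difference:
print(delivered_fps(180, 300), delivered_fps(240, 300))  # 180 vs 240
```

This is why 1080p/low testing reveals the CPU headroom that future, more CPU-intensive games (or a faster GPU) will eventually tap into.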
When AM5 was introduced as a migration from AM4, the only thing preventing users from buying in was the expensive DDR5 and motherboard pricing. But users could benefit from the new platform in terms of performance, less heat, and more power efficiency.
On the other hand, Intel's new platform got the worst start in this migration. Gaming performance became an issue, even showing regression, and while productivity is on par with the predecessor, the only significant benefit is heat and power consumption.
The cost of the entire platform is definitely an issue for Intel … at least they are fixing the performance.
@@blackbirdpctech The biggest problem is the lack of future support. Why would anyone buy into LGA1851 when AM5 will still get another generation?
@@yr6sport418 Idk bro, I'd be willing to trade my 9800X3D for a 285K plus the cash difference. So far I'm not impressed with the CPU. I don't game at 1080p, and I feel more dips in MMORPGs than with my old 13900K. The average is higher, but the 1% lows drop really low in BDO and Throne and Liberty, and even in Lost Ark when there are plenty of monsters on screen. Running on an X870E Hero with G.Skill 6000 CL30.
@@adi6293 Intel has always supported 2 or 3 generations
If 1851 is only one gen then I agree
Very interesting for sure.
The performance of the 285K in gaming really surprised me ... it definitely shows the potential of these chips when configured with the right firmware ... the big questions now are if Intel can continue to make improvements and if they will drop the price.
@@blackbirdpctech It seems to me like this gen of Ultra was released in a hurry. This is also a big step into the unknown for Intel, their first CPU on the most advanced TSMC node. They’ll definitely do better next time. Intel loves to improve slowly, but now they don’t have to worry about lithography slowdowns and can fully focus on performance.
Nitty gritty is nice to see. For an all-around workstation, Intel is definitely a good argument.
Thanks for all the hard work.
Glad you liked it!
heavy work. thanks.
You are very welcome!
Maybe in your country the 285K could be good value. Over here it's 200 euros more expensive than the 9800X3D, and the 265K is priced in line with the 9800X3D, just a little cheaper but not by much. Everyone is getting the 9800X3D.
I didn’t say the 285K was good value, in fact I have a value chart in the video that shows the 9800X3D offering significantly better value. The good thing with videos is that it’s tough to argue against something that you can watch.
In my country the 285K is 150€ cheaper than the 9800X3D
@@AlekseiSljusarev wow ... either the 9800X3D is super expensive or the 285K is cheap.
The problem is Intel failed big time; they went from a 10nm to a 3nm process node and they came up with this trash of a CPU. No amount of sugar coating can fix this.
The main problem is that Intel should know by now that they need to come up with their own X3D-style stacked cache if they want to compete in gaming. They had all the time in the world and didn't even bother. I mean, the 285K humiliates the 9800X3D in everything else except gaming, where it gets humiliated itself, so the answer is simple: they need to work on that. Simple as that.
Stacked cache is not the answer, it's just a hardware hack to fuel fanboy wars. There are other ways to achieve high performance in games without limiting a CPU's functionality the way stacked cache does. The X3D CPUs are also very expensive for what they have to offer, and I'll stay away from them unless they go really cheap, such as under $250 cheap.
9800X3D is no productivity slouch using 50% less power for the same workload as the 14900KS.
What's funny is Pat G. should have been the right pick to yank Intel out of their rut. And when he allowed PR out dissing AMD for 'glueing chips together' and other assorted BS while Intel continued just re-treading 14nm... well, both are where they should be now.
And Pat is potentially on the hook for a lot of BS they fed to investors, potentially to the tune of his entire $200M compensation during that time.
Unfortunately, AMD seems to be sinking into the Captains chair of 'Lets give them just enough to satisfy' vs taking risks.
@@boots7859 precisely, giving just enough to satisfy is a big mistake. An approach which works for the moment, but which will backfire sooner or later if nothing major changes.
X3D humiliates Intel in gaming basically only in the benchmarks, because I doubt that people who buy an absolute top GPU would play at anything less than 1440p, which is GPU-bound and will be for a very long time. And if it's not an absolute top GPU, then you'll be GPU-bound even at 1080p anyway. So in the end it seems that people have a mass obsession with benchmark numbers and ignore far more realistic scenarios, while in productivity and day-to-day tasks X3D is, let's just say, not the best.
Are these new RAM kits already available? I'm still stuck with a 7200 kit and leaving performance on the table. I live in Europe, in Belgium.
Yes, they are available from Newegg and Corsair however the higher speed kits (>8800) keep going out-of-stock as soon as Newegg updates them, so demand is greater than supply at the moment.
Unfortunately in Europe we don't have newegg :(
@@miraveilyt you should be able to find them wherever G.Skill RAM is sold.
"Houston, we have a problem. Gamers are running out of money."
Wait until the prices are revealed for the new Nvidia 5000 series GPUs
@@blackbirdpctech I am 100 percent certain that I would need to sell a kidney in order to afford the new nvidia gpu's.
Hey Matt , what are your thoughts on the new AI nvidia gpus 😁
Do you mean the 50 series using AI to generate frames?
@blackbirdpctech Yeah, I think you should make a video on that topic once they release them officially, which you will most likely do anyway; just wanted to see your thoughts on AI-focused GPUs vs raw performance.
@@BruceUZULUL if it works well and if you can't tell the difference then to me it's fine. I think it's important to remember that these are games, so whether AI generates the frame or the frame is drawn based on a developer defining it, as long as you can't tell the difference, then it really shouldn't matter. This is definitely something I plan to discuss in an upcoming video.
@@blackbirdpctech ’Raw power’ and ‘fake frames’ are two terms you should also clarify for users, and explain why people are wrong when they use such phrases, especially when Tensor and RT cores are clearly raw power and those ‘fake frames’ are as real as it gets.
Every frame which is visible on your display is REAL, people, and is the result of a calculation taking place on physical hardware.
As an analogy, people were using the example of a turbo in cars, but it's not even like that. It is like having a second engine which delivers additional power with a new type of fuel.
@@TheSkyLynx2 agreed
good comparison
Glad you liked it!
perfect presentation
Glad you liked it!
What do you think guys , is there any chance that Intel improve Core Ultra 285k more than that ?
It can be improved, and I hope they'll do this with the entire U200 series. If they really care they'll improve it, especially now, when the 285K is out of stock everywhere I look, and not because of its gaming performance, but because (with all this series' flaws) tech-savvy users understood the revolutionary aspects happening under the lid. Intel has not released something like this in a long time.
Awesome video! I appreciate you putting objective metrics to what we had talked about during the 285k launch, that Intel is lightyears ahead on the memory controller and this would significantly impact performance. Now we know how much! Maybe AMDs memory controller will get a major performance bump with their next generation MB chips? The moment at 3:33 had me laugh out loud! You forgot the clown nose. :O)~ Come on Intel. Drop the price. Happy new year!
It's shocking to me that they haven't dropped the price already ... they really could make this a compelling option with an aggressive price cut. Happy New Year!
@@blackbirdpctech the 265k would tempt me at £300 but not £380
Hi, great review! Was BO6 tested with the correct render worker count in the config files? Higher core count CPUs usually perform worse in BO6 due to some render worker count shenanigans.
I ran the built-in benchmark without any mods … can you tell me more about the config file update required?
@@blackbirdpctech ruclips.net/video/vUokFhHUZoA/видео.html
This video does a pretty good job of explaining it; do skip every other tweak though, they're not required. Just wondering whether render worker count has any effect on the new Arrow Lake CPUs.
Thanks, will check it out.
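For anyone following along, the tweak under discussion usually looks something like this in the game's settings file. This is a hypothetical sketch: the key name is taken from earlier CoD titles, and the exact file name, key, and syntax for BO6 may differ, so verify against the video linked above before editing anything.

```ini
; Hypothetical sketch based on earlier CoD titles; confirm the actual
; key/file for BO6 first. The usual advice is to set the worker count
; to your physical P-core count instead of the auto-detected value,
; which on high core count CPUs can be set too high by the game.
RendererWorkerCount = "8"
```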
Imo it should be the 285K vs the 9950X, since those CPUs are meant to go against each other. Intel doesn't make any dedicated gaming CPUs.
Agreed, I plan to do a comparison of the 285K against the 9950X and 14900K after the next firmware update is released. I chose the 9800X3D for this comparison because I was interested to see what Intel’s best chip could do.
If you look at the Intel Core Ultra 200 lineup, all the CPUs are below $400 except the 285K. Right now the 285K is $620, so Intel needs to cut the price on it.
I think it's simply Intel being stubborn and not realizing their current position in the market ... that said the 285K has been out-of-stock since launch, so either there is strong demand or not much stock.
@@blackbirdpctech It's in stock now.
@@ployth9000 yes it is, it wasn’t a few days ago … it has actually increased in price by $20 to $619.99, which is crazy.
Underrated
The CPU or my channel?
@ both, great info and hats off to you. Tech channel done right.
Thanks!
If they drop the price to $300 then I will buy the 285K
That seems very unlikely ... at best I could see it dropping to $500.
Do you have a guide for the Intel 265K, so I can copy your settings and try them out? As I come from AM4, I don't know anything about Intel.
Matt has a vid featuring the 265K a bit down the video list, check it out!
Here is the 265K video: ruclips.net/video/bxCXe9MC3nU/видео.html
As explained in the video, many of these tweaks are silicon dependent, so I would caution you against simply copying my setup ... better to use it as guidance ... you may be able to push your chip further or need to back-off a little.
@@blackbirdpctech Alright, thanks man :) Yeah, 53x & 47x works fine. I could go higher but my H150i LCD 360mm can't cool the CPU much more; in Cinebench R23 it's hitting 93 to 95°C. Thanks again.
So what you're telling me is "Intel fine wine". Just gonna use that term to annoy the AMD guys who would say "AMD fine wine" because their products were so broken at launch it took 1 to 2 years to get all the performance out of them xd
I have to laugh because the AMD vs Intel vs Nvidia thing is pretty funny. On this video I am getting crap from AMD fans because the 9800X3D didn't win by enough and Intel fans because the 285K is not a "gaming" CPU ... so I am going to do my best to stay clear of any battle and simply try to stick to the facts ... the 285K did much better than I expected based on earlier reviews 😉
LOL, was rooting for Pat when he came in, thought he'd bring back some of the old Intel. Wrong! All he did was insult: Nvidia (lucky), AMD (rear view), Apple (lifestyle company) and more just running his mouth.
We'll see how well 18A turns out, and whether it's economical or even viable at scale. Keep laughing while you can with your quips; Intel will most likely keep failing until the US Govt. has to step in for NatSec reasons.
Well at least if anything good came out of this AMD is gaining market share, which I'm happy about in the CPU space. Hopefully in the GPU space Nvidia cards will stay on the shelf so Intel can gain market share there.
Hopefully history doesn't repeat itself and AMD becomes too dominant ... if it does then CPU prices will significantly increase.
All I know is that I'm extremely happy with the 265K I picked up for $299. Paired with 48GB of DDR5-7200, everything is so silky smooth... along with my 4070 Super. Definitely a bang-for-your-buck PC build. :)
I’m planning on re-testing the 265K after all of the new fixes are released in mid January … I’m very curious to see just how much extra performance I can extract from it.
@@blackbirdpctech Really hoping Intel redeems themselves with these chips but so far I’m happy and impressed. 20 cores is amazing haha
What made you buy it over AMD CPU's? 😅 And what were you upgrading from?
I don't get it, why compare an Ultra 9 to a Ryzen 7? Shouldn't it be compared to a Ryzen 9?
For AMD the 9800X3D is faster in gaming than the 9950X. For Intel the 285K is the fastest gaming chip for the new Core Ultra series. That said I do plan to do a comparison with the 9950X in a future video.
@@blackbirdpctech Oh, appreciated! If it is strictly for gaming then I get you. AMD is about to release the Ryzen 9 9950X3D, which should be an interesting comparison to the Ultra 9 :) Personally I am deciding on the best CPU that will let me game but also handle a creative workflow.
This is the most clear, concise, thorough, person I have ever seen doing this type of thing. GO BLUE! My jaw hit the floor, OMG.
Appreciate the great feedback!
Panther Lake is 6 months or so away, I will WAIT. Big changes coming.
Is it really only 6 months away? At this point it's tough to trust anything coming from Intel ... will Panther Lake be fully functional from day one? Will it put Intel back on top? There are a lot of questions and no answers ... in the meantime we can test what is available today and figure out how to maximize the performance of these platforms.
@@blackbirdpctech Yea, I am really not optimistic in Intel's future plans, they really have internal problems like never before. Hopefully, if they manage to produce the next generation in their own foundry, we'll see some decent CPUs or at least decent prices?
@@TheSkyLynx2 I know it's not a popular opinion but I was more confident in their path forward when Pat was CEO ... I do think that fabbing their own products will be a very important step to getting back to the top ... only time will tell.
@@blackbirdpctechyea, who knows.
Btw, Panther Lake is not a desktop CPU, and it looks like they’ve cancelled the Arrow Lake refresh, so the next Intel desktop part must be Nova Lake.
I don't think it really matters, tbh. The average gamer has a 3070-class graphics card, for example. Only those who absolutely must have everything on ultra at 4K pay top dollar. Heck, even I own a 4K monitor and I don't have issues playing anything. It's kinda funny that Americans are freaking out about the prices; us Europeans have paid those and much more since the 2000-series GPUs.
So you're saying that in order to get performance closer to the 9800X3D you have to: get really expensive RAM,
mess around with timings,
overclock,
and test and mess around with voltage and other settings.
Riiiiight! Yeah, I'll stick with a 9800X3D.
A couple of things:
1. The final tweaks for the 285K do not include any voltage modifications.
2. The 9800X3D in these tests is highly tuned and optimized, as explained and detailed in the video.
@@DavidAlfredoGuisado that's not true at all ... did you see the tweaks that I made to the 9800X3D? The tweaks for the 9800X3D resulted in an 8.3% increase in Cinebench score whereas the tweaks for the 285K resulted in an increase of 7.8%, which is similar but less than the 9800X3D. Furthermore, the -30 CO undervolt that I applied is heavily silicon dependent, as is the max CPU boost clock and ability to run DDR5-8000 stable.
@blackbirdpctech Who cares about a synthetic Cinebench score!? We are talking about real-world gaming here. You make it seem as if the folks are wrong about the 285K and it's better than reviews are saying. The gaming performance you get out of the 9800X3D STOCK is incredible. You are saying that to get close to it, or a little better in some cases, you have to spend more money and tinker around in the BIOS.
@@Sirjojovii I love it when someone says "real world gaming" ... hopefully you can appreciate just how much of an oxymoron that is. The cinebench score was simply used to provide a measure for how beneficial the tweaks were. And yes, the early reviews do not match current performance ... that is the point of the video. That said, I do agree with you that a 9800X3D is a better gaming option ... I specifically say that in the video, but it's important to also point out that the 9800X3D in the video was highly tuned and optimized as well ... not sure how that offends you.
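For clarity, the percentage gains quoted in this exchange are simple before/after ratios; a quick sketch (the scores below are made up to reproduce the quoted percentages, not actual Cinebench results):

```python
def pct_gain(before: float, after: float) -> float:
    """Percentage improvement of `after` over `before`."""
    return (after - before) / before * 100.0

# Illustrative scores chosen only to match the percentages in the thread:
print(round(pct_gain(1000, 1083), 1))  # 8.3  (tuned 9800X3D)
print(round(pct_gain(1000, 1078), 1))  # 7.8  (tuned 285K)
```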
Great video! I appreciate you taking the time to work with Arrow Lake and not just throwing it away as soon as the 9800X3D released, like Jay and the other big PC tech channels. I'm an early adopter of the 285K, coming from a 4th gen i7 4790K, so I know I will experience a world of difference either way (just waiting on the 5090 to complete it). And I will be doing workstation tasks like 3D rendering and video editing a bit more than gaming. My only concern is the P and E core overclock temperature increase; 91°C seems a bit too much, so I may not be able to go the overclocking route. Do the 91°C and high power draw occur in just short spikes, or is it more steady?
That's a great question ... you only get the temp increase when you use a power profile that is above the Intel default ... the one I used keeps PL1 at 250W but increases PL2 to 295W ... PL2 is a temporary increase, whereas PL1 is steady state, so I thought it was worth it ... if you are concerned I would recommend simply using the Intel default power profile ... you might have to backoff some of the tweaks a little, but then again you might not.
@@blackbirdpctech Thanks!
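To illustrate the PL1/PL2 distinction above: PL2 is the short-term boost limit and PL1 the sustained limit. A simplified sketch follows; the 56-second window is an assumption for illustration (real silicon uses a moving-average window rather than a hard cutoff), and the wattages are the profile discussed in this thread, not Intel defaults:

```python
def package_power_limit(seconds_under_load: float,
                        pl1_watts: float = 250.0,
                        pl2_watts: float = 295.0,
                        tau_seconds: float = 56.0) -> float:
    """Simplified model: PL2 applies early in a sustained load, then the
    limit falls back to PL1. The hard cutoff at tau_seconds is an
    approximation of the real exponentially weighted average behavior."""
    return pl2_watts if seconds_under_load < tau_seconds else pl1_watts

# Short spike (e.g. app launch): allowed up to PL2.
print(package_power_limit(10))    # 295.0
# Sustained all-core load (e.g. a long render): capped at PL1.
print(package_power_limit(300))   # 250.0
```

This is why the temperature increase shows up mainly during brief PL2 excursions, while long renders settle at the PL1 steady state.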
Thanks. 9000 memory is still prohibitively expensive, AND taking into account it won't work on a 4-DIMM mobo (which is 95% of them), the choice is obvious.
You are correct that DDR5-9000 will not work on most 4-dimm motherboards however DDR5-8000 will also not work on a large number of AM5 motherboards, so I think it's a somewhat balanced comparison from that perspective.
It scored much better than I thought.
They may have redeemed themselves. Still glad I went with my AMD 9950X though, an excellent all-rounder and much cheaper for me.
I would choose a 9950X too, but yes it really surprised me.
Normally you shouldn't compare the 9800X3D and the 285K, because the 9800X3D is just AMD's upper mid-tier CPU while the 285K is supposed to be the top high-end CPU. (Intel cancelled the 295K with 8 P-cores and 28 E-cores because the efficiency was not on par with all the other ones. OK, but honestly, what do you want with even more E-cores...)
For a fair competition with the 285K you would rather compare at least the 9900X3D, or even the 9950X3D.
Btw, here in my country we use the euro, and here the 9800X3D already costs 579€ while the 285K is currently on offer for just 550€ (tax included). It's rather a bad joke, but the oh-so-user-friendly AMD suddenly is not so user friendly anymore... And both CPUs are currently available nowhere, except overpriced from scalpers.
Most likely the 9900X3D will be the worst for pure gaming, because that one only gets 6 X3D-enabled cores, two fewer than on the 9800, and we still don't know if AMD has fixed the scheduling this time so games don't run on the non-X3D cores, which naturally causes FPS drops.
The product stacks for AMD and Intel are different, so when looking at gaming the best new gen chips currently are the 9800X3D and 285K.
@@blackbirdpctech Yea, purely aiming at gaming it is, I admit. But on paper, strictly speaking, it's not 100% fair.
Though I don't even think the 9950X3D will be much better than the 9800X3D purely for gaming; probably a little bit, because of slightly faster clock speeds.
But in high resolutions like 4K even L2 cache can play an important role, and the 9950 will have more of that too.
So I'm still excited for the first gaming benchmarks between those 2.
48gb ram vs 32gb ram???
Yes, the DDR5-8000 32GB kit can be tuned with tighter timings than a DDR5-8000 48GB kit. Given that none of the games I benchmarked came close to using 32GB of RAM (max system usage was around 20GB), and that both kits are single rank, the additional capacity of the 48GB kit provided no benefit. For the DDR5-9000 kit there isn't a 32GB option, so I had no choice but to use 24GB DIMMs. If you were to run an application that uses more than 32GB of system RAM then it would become an issue. My objective was to tune and optimize both systems as much as possible so I could compare best vs best configs.
Neither 285k or 9800x3d are available here.
That is the same situation in the US … both chips go out of stock immediately.
95% of people interested in this stuff are gamers, plain and simple.
Best bang for the buck is all that really matters.
I left x86 recently, no more Windows or Linux, and just got a screaming Mac Mini M4 for $560+ that's complete with a GPU that can play AAA titles at 1920/High at 60fps as a bonus.
Maybe 50W, silent, and about equal to a Ryzen 9700 Microcenter deal in brute force.
I'm thinking macOS is going to be somewhat less buggy than MS, with less of the occasional messing around of Linux, and its own set of quirks that I think will be fewer than what I've gotten used to.
Started with an Abit BP-6 and a couple of Celeron 300As and a pencil, and now I'm on a baby monster the size of 5-6 CDs stacked. And cheap as chips.
Walk away.
So you've chosen the "lifestyle" path 🤣
/ this was a joke
What I really want to know is who gives a shit about 1080p gaming with a 4090 and these CPUs? Or am I the crazy one that is out of the loop
I got this question so much I decided to explain it in this video: ruclips.net/video/Zi7LtyQVPdM/видео.html
@blackbirdpctech sweet I'll check it out, thank you
Why are you comparing these 2 the core ultra 9 is a multi platform cpu the 9890x3D is a strictly a gaming cpu.
I wanted to compare the best new chip from each company in gaming, and for Intel that's the 285K while for AMD that's the 9800X3D. The 265K is cheaper but the silicon quality isn't as good, so you can't overclock the chip as much. By the way, I assume you mean "multi-purpose" and not "multi-platform" and the 9800X3D and not the 9890X3D, which would be interesting.
You can still use the 9800X3D for productivity, since it's still an 8-core 16-thread CPU that uses DDR5 and doesn't have the lower clock speeds of the previous X3D chips. Who says it cannot be used for productivity?
@@dredgewalker agreed
@@blackbirdpctech Yeah, it may not be the best at productivity but it would still do the job. Even past CPUs that aren't as powerful as this one were used for productivity. The only difference is this gets things done a little slower compared to the Intel. That's like saying the Intel cannot game because it's more of a productivity CPU. Both can be used for gaming and productivity, but both have areas where they excel.
Its very simple, I also managed to get great results form my 285k by increasing the normal lotus panendermic parameter of the VRMs, and adjusting every seventh non-differential setting to its modial factor. I was also able to significantly improve system stability by adjusting the fourth order transform within the memory sub-inverse reactive timings to their max levels.
That's pretty funny but keep in mind that I also tuned and optimized the 9800X3D in this video. The reason I show step-by-step guides is to show just how easy it is to tweak these CPUs ... it may sound difficult but it really is quite easy, it just takes time.
With all the fixes to the Z790 & Z690, I would have liked to have seen the 14900K thrown into the mix. I mean, with the fixes, does the 14900K perform better or worse now than it did before them?
I plan to test the 285K against the 9950X and 14900K in a future video. That said, I've tested the 14900K extensively before and after the latest microcode update and the performance impact is minimal if you compare using Intel recommended settings ... the problem is most motherboards didn't use these settings when Raptor Lake was first released, so there is a performance hit as a result.
A bit biased at minute 25:05.
The last three points are redundant, and the recommendation isn't even an objective point to check off. Instead, there should be two points to add: the 285k performs faster at 4K and that the software is still being improved, like the BIOS microcode updates coming on January 15th.
It would be interesting to see similar ram benchmarks but for the 265k.
I quite literally stated those exact points in the video, which is evidenced by me actually saying it in the video. Furthermore, what exactly is biased? Here is the final part of my recommendation:
"With that said, these results do offer up some interesting choices. On the one hand, if you are building a new gaming focused system and you want the very best, then based on these results, I would recommend buying a 9800X3D, if you can find it. It’s a truly amazing chip that offers outstanding performance at an equally great price. However, if you do a lot of professional workloads in addition to gaming, and you believe that Intel will indeed improve gaming performance with their next firmware update, then the 285K might be an attractive option. It will unfortunately cost you more, and availability of the 285K is not much better than the 9800X3D, but it might finally offer Intel fans a compelling reason to upgrade. I was actually quite impressed with how competitive the 285K was in these tests, the only thing really letting it down is the price, which is something Intel CAN do something about. Hopefully Intel can come through with their new firmware update in January because as things currently stand, AMD is dominating PC gaming."
The 0x114 microcode update was released early and there are numerous reports that the performance actually dropped ... I will need to check when I get back from CES but if true, would be extremely disappointing.
ChatGPT gave me a link to this video
That's awesome!
An 800 euro CPU, with an 800 euro mobo, with 500-600 euros of memory + a tricky OC, to still be 10% behind... ok got it.
The memory is double in Europe? Which country? And yes, it’s a costly approach but the OC is not tricky, it’s quite simple.
Intel users all went out and bought the AMD chip, that's why it's sold out, but on social media they're still shilling the hopium
I think the positive day one reviews is why the 9800X3D has been sold out ... but why has the 285K been sold out? That to me is a more interesting question, given that the reviews were not good.
@@blackbirdpctech Nobody wants to admit they’ve bought it, people are afraid of shaming and backlash 😂 Paraphrasing a well known comedian, “never underestimate the power of fanboy people in large groups”, rofl.
So you don't need to be a rocket scientist, you seem to need a computer engineering degree.
Wanted to see if an intel build was in my future and no, I'm not interested in coaxing performance out of a consumer product.
When I build a Ryzen system I buy one of many X3D parts, set the PBO in bios and you have the best experience without the hassle.
The tweaks that I made to both chips are relatively simple ... you certainly don't need a degree to perform them. I agree that X3D chips are a much better out-of-the-box experience however I would definitely encourage you to make some basic tweaks, like undervolting. Simply turning PBO on doesn't really do much.
The Core Ultra 9 285K is a great CPU. Objectively.
I would agree, I think it's just priced too high ... hopefully Intel will address that soon.
@@blackbirdpctech Yes. Imagine if they lowered it by $120-130. It would be a best seller within 15 days.
Do people still zip files? And why? Storage is cheap.
I do use zip often, at least twice a week. When you locally backup folders with thousands of files in them it is a good idea to zip them.
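The "zip before backup" idea above can be sketched with nothing but the Python standard library; the folder names here are made up for the example, and a real backup script would point `src` at the actual library folder:

```python
# Rough sketch: collapse a folder of many small files into one
# archive before copying it to backup storage. Copying a single
# large file is much faster than thousands of tiny ones.
import pathlib
import shutil
import tempfile
import zipfile

# Stand-in for a folder with thousands of files to back up.
src = pathlib.Path(tempfile.mkdtemp(prefix="library_"))
for i in range(3):
    (src / f"file_{i:04d}.txt").write_text("example contents")

# make_archive appends ".zip" to base_name and returns the path.
dest = pathlib.Path(tempfile.mkdtemp(prefix="backup_"))
archive = shutil.make_archive(str(dest / "library"), "zip", root_dir=str(src))

# Sanity-check the archive before trusting it as a backup.
with zipfile.ZipFile(archive) as z:
    file_count = len(z.namelist())
print(file_count)  # -> 3
```

As a bonus, `make_archive` also accepts `"gztar"` or `"xztar"` if you prefer tarballs with better compression.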
You must redo this test after either increasing the AM5’s RAM to 48GB CL 42 or decreasing the LGA 1851’s RAM to 32GB CL 38 for true equivalence.
That wouldn’t make sense. The DDR5-8000 32GB kit can be tuned with tighter timings than a DDR5-8000 48GB kit. Given that none of the games that I benchmarked came close to using 32GB of RAM (max system usage was around 20GB), and that both kits are single rank, the additional capacity of the 48GB kit provided no benefit. For the DDR5-9000 kit there isn't a 32GB option, so I had no choice but to use 24GB DIMMs. If you were to run an application that uses more than 32GB of system RAM then it would become an issue. My objective was to tune and optimize both systems as much as possible so I could compare best vs best configs.
@@blackbirdpctech The X3D only won by a very narrow margin, and this is not what the other tubers were saying, the difference should be huge because the X3D is king and Intel is e-waste trash tech 😂.
I had a feeling this was going to happen after this comparison. Oh gosh!!
Yes, you definitely called it … that said, my job is to show the data without bias, even when it goes against the expectations people have. Unfortunately it’s the larger tech YouTubers that are trash … not one of them followed up and retested the 285K after the recent round of updates. They did the exact same thing with the 9700X … they trashed it and left it for dead.
@@blackbirdpctech Totally true. I don’t know why they are doing that; I can speculate though.
And yes, I noticed the silence regarding the 9700X, which is an awesome CPU that I'm really considering for a future upgrade.
@@blackbirdpctech Point taken. Although, there may be other bias in this test because the CPUs are, technically, meant to be different performance classes. Granted, AMD's Ryzen 9 X3D chips are, at least, a few weeks away from release.
Note: all these PC upgrades mean nothing if your internet is terrible and laggy.
At my old address I was getting 100 Mbps with a 26 ms ping; now at my nephew's new house I'm on Wi-Fi at 50 Mbps with a 36 ms ping. New fibre optics are coming in January 2025, with a cable to the rear unit I'm in ... I hope it's OK.
Today on my 50 Mbps Wi-Fi I watched my BF5 ping go from 122 to 999; I hit Task Manager and restarted Explorer, and then it was back to 108-122 ping. On the Oceania/Asia BF5 servers, aimbots play daily :)
Someone who will pay that much money doesn't want to do all those things to get maximum performance in gaming; they will choose the 9800X3D to get the best framerates out of the box.
Those people should just buy a console lol
The reason I provide step-by-step tweak guides is to show just how easy it is to extract max performance from your hardware. There is a misconception that this is difficult, it's not. Furthermore, the 9800X3D in this video was highly tuned and optimized as well, which is why the comparison is fair. Please don't assume that you know what others are thinking and/or are willing to do.
I don't like the testing components, but I might be uneducated. You're showing the difference between ram timings on Intel, without much explanation of why it happens. This also makes it seem like you're cherry-picking components for the intel build, while going for "standard" components for the AMD build. I'm not saying this is the case, I'm just saying it makes it seem that way. Perhaps some component would increase the AMD performance, perhaps not. But not explaining anything about it makes it seem sketchy.
I'm not a brand fanboy so I don't care who wins, I just want the best FPS.
I assume that you must be new to my channel (which is not an issue of course) … you can take a look at some of my recent content on the 9800X3D and AM5 memory optimization to get a better understanding of the AMD platform and why I chose the components I did. The AM5 platform is quite different from Intel platforms and does not scale linearly with memory speed.
@@blackbirdpctech I do understand that they don't scale linearly. But it's no secret that most viewers of a new video haven't watched the pre-existing explanations of key parts of that video.
I can understand that it would be easier for you if all of the relevant content was contained in one video however it's important to keep the length of the video down, and given that this video is focused on the 285K, and is nearly 30min in length, I decided to not repeat information contained in my recent 9800X3D and AM5 memory videos.
What part of my AMD build do you consider to be "standard" and how do you think the performance could be improved with different components? I used DDR5-8000 and I lowered the timings. I undervolted with a -30 all core CO. I overclocked the CPU by 200 MHz. I don't think you truly appreciate just how highly tuned and optimized this 9800X3D is ... it's not something most 9800X3D owners would be able to achieve.
Should have used an X870E.
You should read some of my recent posts on performance issues with X870E motherboards, in particular the X870E Aorus Master, which is the reason I have been using my X670E Aorus Master. The last 3 bios updates from Gigabyte have not fixed the issue. Regardless of that, there should be no performance difference between the motherboards since they use the exact same chipset.
They are mostly identical. The chipset is a straight rebrand with the only difference being that the 800 series force manufacturers to implement USB 4.0. Sure, there might be a few cases where the motherboard manufacturers decided to make some PCB improvements, but at most that could improve memory overclocking slightly. So far, we haven't really seen any performance difference outside the margin of error.
@@blackbirdpctech Yeah, it's perfectly valid to test with an X670E motherboard seeing as there is essentially no difference between them. It was really just a way for AMD to guarantee USB 4.0 support (at the cost of removing some freedom in PCI-E lane allocation from motherboard manufacturers) and for motherboard manufacturers to sell slightly tweaked revisions of their motherboards under a new name.
Ah okay, I've been using the X870E Godlike.
Even with the microcode updates, Intel CPUs are a mess right now and I couldn't recommend that anyone buy one.
Did you watch the video? The 285K performed much better than I thought it would based on the reviews I had seen, so it looks like the recent microcode updates helped.
I recommend watching the video.
Wrong. Arrow Lake is a pretty power efficient CPU that is particularly great for productivity and has reasonable gaming performance too.
Empirical evidence says otherwise; they work great even without the next microcode. These are really good CPUs with an advanced architecture and the latest TSMC tech.
@@blackbirdpctech Yes I did, and the microcode gives gains and losses, which also confirms its unpredictability, and hence why I wouldn't recommend anyone purchase one.
All that extra faffing for an already expensive intel CPU.
Agreed, it will be interesting to see if Intel releases binned versions of these chips, such as a 290K … but they have to adjust their pricing before they do that.
As expensive as the 9950X from AMD, which is its direct competitor from a desktop user perspective.
It’s an expensive very capable Intel CPU.
Unless you've tested the most CPU intensive parts of the games (like what HUB does), you will get misleading results. They demonstrated how easy it was to get results showing the 285k matching the 9800X3D in GPU intensive parts of games. So, they made sure to avoid that. As for whether the 285k is worth it? For gaming? No. And now that the 9950X3D is about to be released. Well... the 285k can finally be laid to rest.
Are you seriously using HUB as an authority on testing?? They test CPUs at 1080P/Ultra, which doesn’t load the CPU. I showed how fundamentally flawed their testing methodology was in my 9700X video. So please use a better source, Steve from HUB is a clown.
@@blackbirdpctech Wow, I just noticed you messed around with the 9800X3D using CL36 8000 (high latency) when CL30 6000 or 6400 was the way to go??? Then you messed around with the fabric clocks, didn't use the newest chipset, and what BIOS? And you loaded the 285K with the latest and greatest and messed with the mem timings?
Now I definitely can't take these benchmarks seriously, as they do not in any way represent a realistic scenario for most gamers. Had I seen that, I would not have wasted the time commenting on this video. And you call Steve a clown?
I don't know where to start. DDR5-8000 in 2:1 mode is faster than DDR5-6400/6000 in 1:1 mode ... probably a good idea to look at this video if you want to learn more about how AM5 based Ryzen CPU memory works: ruclips.net/video/JuUhnQaGG_I/видео.html
You will also learn how to optimize your infinity fabric clocks ... for DDR5-8000 you are able to synchronize your clocks ... something you can't do for DDR5-6000/6400 RAM. With respect to the chipset, the X670E and X870E chipsets are the same ... the only difference is the features that were added, such as USB 4. The performance should be the same however X870E motherboards are currently performing worse than equivalent X670E boards. The Bios version, Nvidia drivers and Windows version are all listed on each benchmark chart and the Bios version for each motherboard is also in the CPU comparison table.
So yes, Steve is still a clown and I'm glad that you commented, it provides an opportunity to correct some of the misinformation that HUB spreads on their channel.
@@blackbirdpctech How dare you? 😂😂
@@blackbirdpctech Fair point on the chipset differences, however, regarding the memory, your own video stated it was pointless going to DDR5-8000 on AM5 as the gains are minimal and they are even more so when paired with X3D chips, one of which is the subject of this video. On that point, I agree.
Your video is useful for those who prefer to tweak and tinker, not for the majority of the userbase. And because tweaking and tinkering can depend greatly on the 'silicon lottery', the results can be all over the place.
Bro, your videos are too long; make small videos, only like 6 to 7 minutes.
Yeah, this one was a long video ... take advantage of the chapters or come back and watch part of it later ... that said, I probably could have created a separate tuning and optimization guide video ... I might do that in the future.
You forgot to mention the power usage....
I have a CPU power efficiency chart in the video.
Show us how it outperforms AMD in multi media tasks, etc.
I think that would be better when I compare it against the 9950X.
Why doesn't AMD have 9000 MHz RAM? Bad test
Not understanding tech is even worse
I recommend looking at my recent "What's The Best Memory for AMD AM5 Ryzen CPUs?" video: ruclips.net/video/JuUhnQaGG_I/видео.html
This should help you understand why it's not possible to run DDR5-9000 stable on AM5 systems.
Seems like a lot of the poor reviews we see for the 285k are created by AMD fanboys
In this case, unlike the 9700X which was poorly configured at a hardware level by AMD, I think the day one reviews were mostly correct ... it appears that Intel needed more time before launching these new CPUs. My guess is that the marketing geniuses at Intel forced the Engineering team to launch early. Unfortunately a lot of reputational damage was done with those early reviews and it's going to be tough to change the minds of the broader tech community. I'm hopeful that data like this will help.
@@blackbirdpctech well that and their issues with 13/14 gen didn't help public trust. Two RMAs in on my 13900KS and second one is after bios updates
It's good to have Intel fans. They pay the money, they don't care what it costs. 🙂 I have to say a big thanks to them.
I would gladly buy a core ultra 7 or a 9700X but I refuse to buy the overpriced X3D which should not cost more than $250.
The small gain in FPS doesn't justify the price either.
We all know that AMD likes 6000/6400 MHz CL30 RAM; that would have made a big difference
Yes, it would have reduced the 1% low performance of the 9800X3D by around 10%. I recommend that you watch this video: ruclips.net/video/JuUhnQaGG_I/видео.html to learn more about the best memory for AM5 Ryzen based CPUs. I purposely chose the best setup for the 9800X3D to extract max performance.
I've found that 6400 MHz CL30 is the sweet spot; how come you used 6400 MHz CL32?
For the memory video I used the common EXPO config for each kit: 6000 CL30, 6400 CL32 and 8000 CL38. In this video I ran 8000 CL36. That said, my testing shows that a small reduction in CAS latency does not result in a meaningful difference in performance.
mb
Why is this comment section full of coping Intel Meatriders?
@@Camhampton55 that's what you choose to post? Classy man. It's a CPU, just chill.
Mostly because we really want these companies to have products with similar performances and decent overall prices, not what we see today in the pc parts market.
This looks fake. I just read that the new intel updates SLOWED the 285k by 18%. Looking at your graphs, this looks like the 9800x3d vs 7800x3d. What is going on here?
If you are going to make a claim like that then at the very least understand what it is that you are saying. The data for the 285K in this video was generated with an MSI Bios with microcode 0x112. The final update from Intel requires microcode 0x114, Intel CSME Firmware Kit 19.0.0.1854v2.2 and Windows 11 26100.2314 (or newer). The idiots at WCCFTech reported on a test that appears to have been performed with the 0x114 microcode but without the CSME Firmware kit, which is where you got your 18% performance drop from. It appears that many of the early Bios updates that included microcode 0x114 did not include the correct ME firmware version. I have not yet tested the latest version from MSI that was released a few days ago.
i love intel
I wouldn't recommend loving any corporation, unless perhaps you are the founder ...
285k is not for gaming at all
If it's not for gaming then why did Intel specifically show 285K gaming performance (against the 14900K and 9950X) in their launch deck for the 200S series?
@blackbirdpctech They will always show that regardless as a sales tactic; the 14900K is still better at gaming than the new Intel CPUs. But very informative video
What the hell is 1% lows? And should I give a damn as a non-gamer? I bet it's a made-up term ... or a useless statistic.
It's actually significantly more important than the average FPS because it's a direct measure of how smooth your gameplay is. It refers to the lowest 1% of frame rates experienced during gameplay, essentially representing the absolute minimum FPS you might see during the most demanding moments of a game.
@blackbirdpctech Ah ok. So, no use to me as a non-gamer ... but could it be used as a metric for video editing or rendering?
FPS in general is a metric that is only really useful for measuring the performance of games ... unfortunately it wouldn't be useful for editing or rendering.
@@blackbirdpctech ok thanks
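Exact definitions of "1% low" vary between reviewers, but the idea described above can be sketched in a few lines; this version (assuming frame times recorded in milliseconds) averages the slowest 1% of frames and converts that to FPS:

```python
# Sketch: compute "1% low" FPS from a benchmark's frame-time log.
# Note: some tools instead report the 99th-percentile frame time
# converted to FPS; this variant averages the slowest 1% of frames.

def one_percent_low(frame_times_ms):
    """Return the 1% low FPS for a list of per-frame times in ms."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    # Sort slowest-first; the slowest 1% of frames drive the metric.
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, len(slowest) // 100)      # always keep at least one frame
    worst = slowest[:n]
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms               # ms per frame -> FPS

# Example: mostly 10 ms frames (100 FPS) with two 25 ms stutters.
times = [10.0] * 198 + [25.0, 25.0]
print(round(one_percent_low(times), 1))  # -> 40.0
```

The example shows why the metric matters: the average FPS of that run is close to 100, but the 1% low of 40 FPS captures the stutters a player actually feels.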
Dumb question
The 285K did much better than I thought it would
@@blackbirdpctech The 285K is a headache and has been since launch; the best CPU is the 9800X3D. The only area where it's even somewhat competitive is select productivity apps. Buy the 9800X3D and forget about it (peace of mind)
I think it’s important to treat every component consistently when I test so that I don’t bias the results … by doing that you will sometimes find unexpected outcomes and this is one of those times … the 285K performed much better than I expected and was a great cpu to test. Intel needs to keep improving it, and they should lower the price, but I thought it would be terrible going in and it surprised me.