@@wallywest2360 FWIW, back in the day, these guys were the top line because of their support and build quality. Unless something changed, that's really what you pay for. Their sXit just flat worked every time, barring some random hardware failure.
@@larracis well depending on the country (Life is good in the Tumor (USA)) it can be 7500 easy. in Canada The RTX 4090 or 5090 is $2500-2600 by itself the 7800x3d is $850. 7500 isnt that far off
@@davidsmith4186 Nah hes right tho bro Nothing should be that expensive unless you are full-on getting a SERVER type WORKSTATION bro And the fact its just a 280 RAD is wild!
No custom loop, case looks like 100$ max, front panel is restricting air flow, looks cheap, fans look cheap, it's rather small for 600W of components inside, it's prob a hotbox. GN should get their hands on one of these.
@@ahoymateeez My guy, Google is your friend. Falcon NW has been top tier in every aspect for over a decade. That case is bespoke to them. They have thermals entirely under control. If this was iBuyPower or some SI that on that level you’d have a valid point, but they’re not. Someone seriously looking at Falcon NE is probably only cross shopping Puget Systems.
Hii, can you please explain the practical differences? Does it even matter the contrasts between these two for productivity/power-users (i.e., audio production, video production, old/basic gaming, etc.)?
Gaming doesn't really use more than 8c/16t configured CPUs. Considering that there is a thread deficit, the workload task results are quite impressive if you ask me.
@@tj_2701 Fr. Gamers can use pretty much any PC, it really matters not which specs nowadays for 90% of non-psychotically-demanding contemporary titles. Any lower mid end, middle mid end, or (if you really are in that 0.1% of extreme gamers) the upper mid end will do. High end? No point; you'll never see the ebenfit let alone actually use it. Meanwhile, musicians, programmers, audio engineers, and video production benefit massively from higher end PCs and often benefit way more from high clock speed than abundance of cores. Rather get a 5.8ghz CPU with only a few cores than 100 cores running at 3.2ghz....as a musician. Gamers _prefer_ cooler running, more efficient PCs; productivity users on the other hand NEED both of these aa givens, in addition to stability.
Even for productivity, of you take a look at code compile, compression/decompression the 285k is weak af. There are much better reviews than this one...as usual.
Some people want certain brands, stick with Chevy, vs Ford, etc. I'm on the fence with the cpu. Currently on a i9-9900k and it's handling my 4090 fully, might wait and see if it can use the 5090 fully and then upgrade next. Either 285k or 16c amd
I wouldn't know as I only watch Jay....but RUclips has a feature called scheduled release. You make a video and upload it to youtube and then push schedule. So if embargo is Oct24 they just click Oct24 midnight.
Falcon Nortwest is launching a new financing program, they'll actually have a guy remove one of your kidneys in your bathtub instead of traditional financing
@@jondonnelly3 it's good for productivity as well, if you care about price/performance/efficiency... Yeah sure, if you don't care about gaming _at all_ 3D makes less sense even within just the AMD lineup. But that's mostly because you pay extra for the 3D, not because it's it's not a solid performer for productivity.
im glad i upgraded in 2022 lol i remember walking into microcenter and asking for a 13900k, the guy helping quicky told me that they have a 12900k open box for $285 and i snagged it because great deal. thought i might upgrade later if i need. fast forward and i definitely dont need more compute power, 13th and 14th gen are riddled with microcode issues and 285k is only marginally better in gaming than a 12900k
Ooh nice, Have the 12900KF 4090, it's a solid gaming machine. Snagged last year on Fantastech 1 sale. We just got a Microcenter in Miami, might drive down for the next build.
I hear ya there, as in early 2023 I bought one of the Chinese Erying Mobos with an Intel Core i9 12900H CPU(yes laptop chip on a desktop MATX board) as I don't have Micro Center near me, and it does everything I need it to do under Linux, and then some, even with 32GB of DDR4 3200Mhz RAM, It's been doing me so well I'm honestly considering buying a 2nd board to replace the 2nd computer in my gameroom as whole setup would cost me less than what many of these new Intel, or AMD CPUs would.
@@JosephKarthic if you're going to buy new just go AMD. The 7500F is cheap and beats the snot out of the 12400f. if you must buy intel I'd recommend going the used route, I've found the 12900k for under 300, I'm sure you can find a 12600k or ideally a 12700k for pretty cheap. You might even get one with AVX-512, which helps with certain applications.
@@JosephKarthic It's okay with no e cores, my mother uses one. Would consider a 7500F aliexpress also for this price point, could pair with a night devil itx or get an asrock pro rs budget board. Have gamed on 12700KF & 12900KF, I shut off e cores anyway so p cores can push boost higher. Would go AM5 if it's not costing more has upgrade path, though DDR4 is an option on the 12400F.
Watch GN on the power draw stuff. Literally ignore everyone else. They didn't isolate/test accurately to show the actual true efficiency (or lack thereof).
And ASML is above all of them providing the machines, yet Intel is the only company that has bought the latest generation of lithography machine while tsmc is waiting. This is probably a temporary solution while Intel is preparing to up their fab game significantly.
Exactly - people complaining about prices (understandably) should realise that it's currently due to a monopoly of these high end process nodes by TSMC and ASML, who are charging what they want. Ironically, ASML's latest financials looked poor since everyone else is so far behind, they are having reduced sales and so not being able to afford as much of the latest lithograph...
I think TSMC said that the throughput on the latest ASML machines would increase per-chip production costs, so that they would never see a benefit in terms of profit. They make more sense for Intel, which have fallen too far behind and may be desperate.
@@fpadams i think that is true yes, the machine is still not fully refined. As long as they can keep making chips with the current gen they will keep doing it. They already proved they can stretch existing technologies A LOT, as we saw with 193nm machines during the intel stagnation days.
I've never been a fan of the big/little core design. It breaks virtualization, for starters, and I use virtualization. Second, without the OS scheduling things properly, things get put on the wrong cores, and it hurts performance. I'm also not a fan of the separate CCX construction AMD went with. That also hurts performance when things get put on the wrong cores. So, I've been a big fan of the AMD big cache,single CCD designs lately. (I have a 3900X Hyper-V host for self training, but I never have VMs with more than 3 cores on it.) If MS can properly update Windows, and the Linux community update Linux, to schedule the cores right and not break virtualization, then I may consider more complex designs.
Hold on what is supposed to be broken about vm? Cause i have a core i5 with p and e cores and vms seem to work fine on unraid. But i only use a vm to run Minecraft servers
@@jondonnelly3It is that bad when it comes to x3d chips with dual CCDs. Core parking is a bugger to get working properly. Single CCD on x3d chipe is the only way to go.
@@JayztwocentsBecause he has to stay calm while sharing an office with Phil. 😆. Honestly though...its because he is so calm and nice. We just want to see your guys have nice things because we like them.
Being one of those crazy Skyrim/Fallout modders, I have noticed that big cache makes a world of difference because of the older Creation Engine "I hate multicore" coding. I am really wondering how these will fare vs 13900/13700. Draw calls. Nothing but draw calls...
The people that are in charge of intel the last ten years and the people that were in charge of Mitsubishi in the 2000's should be barred from touching a business for the rest of their lives. Screwing a good situation up that bad almost seems intentional, its almost too bad to be real.
@@MadClowdz Yes Stellantis deciding to turn Chrysler group into a high margin luxury brand, great idea... and those guys in the late 90s and early 2000s put the "BM" in IBM with their outsourcing and selloff to lenovo..
@@MadClowdz ah man i worked at a dodge dealership when Fiat bought into them and it become FCA, THAT shit was a DISASTER. the whole Chrysler group is one big screw up over and over.
I just got my 14700K from RMA my 13700K. Running with the latest bios (intel default setting) and undervolt it a bit to be safe. Hopefully this will last a few more years, but I’m definitely switching to AMD for my next upgrade.
I have a 14700k and I don't game, just productivity. Quicksync is a godsend so I'm liking the news that this crushes in compute! But I will wait for at least 3 more gens for this to be perfected. Then I think I'll notice a big uplift. Overall very exciting with this all new title tech.
@@mariorepas9100 no I haven't had a blue screen stability has been perfect. Ironically I used to have a 5950 x and that would constantly give me blue screens. It's the reason why I upgraded.
I think HWInfo is correctly indentifying your Core as 13. It is displaying the P-cores as they are physically distributed. 2 at the front, 4 in the middle, 2 at the back. P-Core 13 is the Lucky Pierre in the mix, he gets E-core and P-core action from both sides. 0 P-core 1 P-core 2 E 3 E 4 E 5 E 6 E 7 E 8 E 9 E 10 P-core 11 P-core 12 P-core 13 P-core 14 E 15 E 16 E 17 E 18 E 19 E 20 E 21 E 22 P-core 23 P-core
Regardless of the sides not being on, it’s still not an “open bench” setup. The hardware still resides inside the case, so it does have an effect. That would be akin to putting the bare hardware in a box with no top. Although it changes thermals a bit……….it’s still not an open bench.
@@CantankerousDave That's true! I remember that! But I can't find anything on Google about it because Google is so damn bad these days. You remember which Pentium they did this for?
Yesterday I got a 9950x for a new workstation. If we incentivize something we get more of it. Last thing I want to reward is the focus on efficiency over performance. Will look at Intel again when it is time to replace this one. Who knows, the best CPU in 3 years could be by NVIDIA or some new player.
So far, I have seen nothing to suggest I should upgrade from my 13th Gen CPU. Unless it melts down in the meantime [unlikely, IMO], I'll be hanging on to it for a few more years.
@@LuccianoNova Good point. If my 13th Gen _has_ been damaged, as long as the new BIOS/microcode actually works, replacing it is far easier and cheaper than basically buying a whole new CPU and Mobo combo.
Same but 12900KF 4090 48gb DDR5 on Linux. I think if anything is to be upgraded there was some smoothness gain in ram kits lower CAS and adding more ram. But I don't really need more ram. Just would be one place people could spend without going to another platform. I'm not sure but maybe the damaged CPU would work at lower clock/ram speeds. Have 3 12th gens in my household did a lot of BIOS updates but would go AM5 if a few years down the road a new build is required, still feel burnt by the time spent tuning machines but it's stable. Can always replace with RMA/12th gen or core lock the new chip. I disable the e cores gaming and turbo boost, would crash games for me.
Over the last 5-7 years of being quite engaged with various creators reviewing pc hardware - Jay and team remain my go-to in now only a handful of channels I follow. Respekt.
you guys really need to update your 7950x3d and 7900x3d benchmarks if you're going to throw them in the list. A correctly setup 7950x3d matches or slightly beats out the 7800x3d in games. GN's benchmarking proved that out a while back. Heck your own video showing how to FIX the 7950x3d to get it working right is what helped me get mine working correctly!
You mention the temp drop at the end of the test as being an indication of the efficiency of the 285k..... yet in the next slide we see the 14900 doing the exact same thing. Test ends, temps drop almost steaight to 40c I think you're trying to find 285k praise points where there's just typical CPU behaviour when on a AIO and having only done a 10 minute workload.
TechPowerUp did review using onboard igpu on 285K vs other intel and amd chips. The RTX 1060 beat them all.. 🤔 Interesting: “The ASUS Z890 Hero motherboard feeds four of the CPU VRM phases from the 24-pin ATX connector, instead of the 8-pins, which makes it impossible to measure CPU-only power with dedicated test equipment (we're not trusting any CPU software power sensors). You either lose some power because only the two 8-pin CPU connectors are measured, or you end up including power for the GPU, chips, and storage when measuring the two 8-pin connectors along with the 24-pin ATX. For this reason, we used an MSI Z890 Carbon motherboard exclusively for the power consumption tests in this review.” (Techpowerup)
The ~35K temperature step, at a guessed 220W difference, just tells us that Rt_ja is 0.16K/W. If the thermal path is 300mm^2, copper is 8.3K/W/m. So Rt_ja equates to 19mm of copper thickness, which is not terribly impressive from a power electronics standpoint.
Technically ASML got that tech from a consortium led by the US DoE and a bunch of labs like Lawrence Livermore National labs. They licensed the tech to ASML with strict rules, which is why the US can stop ASML from selling to China. That said they were mostly worried about Japanese competition at the time and intentionally avoid licensing it to Japanese companies.
After Intel kidded that AMD "glued together" their chips, they had to fit the tiles together to make it LOOK Like a single piece of silicon. Shame! They even stuck in a "filler" tile. Could that possibly help thermals, or is it really just cosmetic?
Am on 12900KF, it's still solid for my gaming use case (might be fine a good 3-5 years) but likely going AM5 X3D when the time comes to leave. Runs really well paired with 4090 and 48gb DDR5, very smooth. Am sure there is better but it hasn't felt unpleasant to game on yet, raids like a champ.
If Intel will not fix some strange performance issues with chipset drivers/microcode/whatever - I will buy AMD....it will be my first AMD since X2 3000 chip from Jurassic era. I was on Intel all this time since Core 2 Duo E6400. Currently I am on i7 12700k and see no reason to get 285K/265K. Very sad. :(
Sadly AMD dominates the average informed user market, the average uninformed person will go with Intel just because it has so much more brand recognition.
Steam Hardware Survey proposes otherwise, though I'm considering switching over to AM5 after I've really squeezed out all I can on my current LGA 1200 build
@@hacoberthejacober3345 Sadly yes. Here in China, most of my friends still think Intel over amd in gaming despite me keep telling them the wind had changed direction when Zen 3 launched. They wouldn't watch any reviews and even if they do, some reviewers are also biased for some reason.
The 9950x price is up because, as of yesterday only Amz (stock in single digits) and New Egg had it. Unless you could drive to a microcenter, which had them for $650. Stock seems to have improved at Amz and price is “down” to $671
I checked Newegg and Micro Center on the 9950x and they had them in stock at $599 and $649 respectively and that was 1 hour after your video had posted... $599 isn't bad for production work that can game. But, waiting to see what the new x3d will do...
I really want to see how the 9800X3D releases, I really hope it's good, because that's most likely my next CPU, as long as it's at least better than the 7800X3D
Thanks Jay, good to see that there are actual reviewers looking into all aspects. Hope to see some Intel updates before, but im gonna go ahead with the purchase.
I am on AMD with MS 23H2. When will I get 24H2? Or do I need to force it? Reinstall MS? Seems 'criminal' that 10% performance uplift is held until a major MS update instead of the monthly cumulative fixes!
As my children would say: Just Google it. Turns out forcing is fairly easy, based on the 2 videos I watched, to force the update using the Group Policy Editor. There is also a registry edit method.
It's still being baked. Within Windows Update - Windows Insider beta channel has it. Sign and up and you can get it now and yes the whole OS feels snappier on AMD I have.
6000MHZ was too slow for last few gens of Intel CPU's, want to see it tested with 8000-10000 mhz. Love the Falcon machine. If I was going to have someone else build mine it would be them.
So, as far as i am concerned: i am impressed by the 285k, but i don´t use it for games but for running tons of VST´s in my DAW, pretty hefty computing task ... this processeor would be ideal. Nice
I have an i7 6800k, going to Arrow Lake would be like going from a Vespa scooter to an Audi RS5. I do more than gaming, actually a lot less gaming, and more AI related tasks. This is also the start of a new platform and has an upgrade path for years to come. Will I buy this right now, no, do not have the money right now, but maybe next year or so.
I have a 13600K that is now around a year old. While it suffices for what I need it for, I regret not going AMD. The extra e-cores are a disappointment overall, I would rather just have fast, efficient p-cores (which is what AMD offers). I don't see any reason for anyone to consider the 200 series even if they need a platform upgrade. AMD's current lineup is as fast, faster in gaming and a LOT more efficient. And cheaper, too, but that's rapidly changing as retailers see a grifting opportunity.
I bought the i9-13900k with a new motherboard, ram and GPU and then found all these videos about the 13th and 14th gen cpus dying early and now I'm scared lol. I did do the bios updates and have made sure my voltage and everything else is normal however im still scared.
The tables got flipped. When ryzen first came out, intel had single core performance, but amd had multicore performance Now, amd has the single core performance, and intel has the multicore performance
Goodness, the 7950X3D is such a beast. It's rarely talked about in comparison to the 7800X3D, but look at those charts. It's essentially a 7800X3D when gaming, but it is actually useful when you exit the gaming application. You get everything the 7800X3D offers for games while having one of the most powerful productivity processors outside of workstations. People who bought the 7800X3D and want to do more than gaming will be incentivized to buy the new 9800X3D because of the higher clock speeds and supposedly better performance with compute outside of gaming. Someone with a 7950X3D has almost no reason to upgrade unless the 9950X3D ends up putting the 3d V Cache on both CCDs.
Why would 7800x3d users need to upgrade if you are 1440p or 4k the 9800x3d is not going to give you negligible benefits at 1440p and probably zero at 4k. Also for most things normal people do on the pc besides gaming ie. Browsing, youtube etc . The 7800x3d is just fine and nobody would know any difference between it and a cpu that is better at "productivity".
@@IRaoulDukeNo one is attacking the 7800X3D. It is the gaming king, and the choice for most buyers. The defense force doesn't have to come out. My post is for the people it applies to. Not everyone uses their PC for just gaming and light browsing, and even AMD recognizes the 7800X3D is poor outside of gaming. That's why leaks suggest there are substantially larger increases to the gains of the 9800X3D in non gaming applications than in gaming.
Intel seems to have the e-core thing pretty well nailed. But its "performance" cores are under performing. I said this on some other channel a few days ago, but it's worth a discussion: Have we reached the limit of CPU performance on silicon? Are we stuck until some other completely different technology comes to the fore? The 12900K is a massive bargain right now for 1440p or 4K gaming. For 1080P as well unless you're competing.
Arrow Lake limited to 250W and 6500 memory speed. What we are not seeing is: 1. CUDIMM has not been taken in count 10000 speed that potentially would give 10-15% boost alone. 2. Extreme Wattage 295W to further boost the performance. 3. No Hyper-Threading Technology allowing huge performance boost for E Cores 1000MHz overclocking. 4. We don't see PCIe 5.0 performance lift via direct lane to CPU because AMD/Nvidia GPU architecture doesn't have any gains from REBAR/E Cores. 5. 8 pin additional power to the PCIe 5.0 (150W) as buffer between GPU/CPU pairing? Interested to see what this is for. Point is AMD doesn't support CUDIMM and AMD/Nvidia are very minimal CPU users. Only way you would find a bottleneck is to run 4090 at 1080p. Who does that? So all and all this test is not realistic and from the architecture of Arrow Lake and boosted memory with PCIe 5.0 direct access to CPU (REBAR on steroids) it is not hard to see this CPU and components are tailor made for INTEL ARC to pair the GPU and CPU the best way possible boosting the CPU/memory to lift the GPU even higher performance. Battlemage 250-275W + Arrow Lake 250-295W = 500-570W combined performance with added CUDIMM for possible extra 10-15% if not more. I predict 5080 beating performance. Rest of the CPU's compared to all are optimized and most likely motherboard boosted that has been the talking point. For my view 285K looks very good. Would I buy it? No because it is too expensive, we don't have Battlemage that is the obvious part missing for gaming, it needs to cook a bit more, more CUDIMM options. I have 13700k and A770 for 4k working as entry for 4k and seems to work OK if not limited to mid settings and the value for coin is there. But give it few months and Battlemage yes.
All I want for Christmas is a CPU with all Performance cores. I don't need a ton of them. I want them to be fast and game like a mother. Bring back the Extreme
Feel like he seriously went out of his way to put this in the best light he can while most other tech reviewers are calling it for what it is …. Overall bad.
@@goinpoztal They said that is bad because it's not beating AMD... It was expected for a brand new CPU... They cried about efficiency and now they are crying because the games, that they aren't gonna play, are 5% slower... Intel CPUs were never oriented towards gamers, for that they would need a new design just for that and it's not a priority.
@@jeanytpremium be that as it may the problem is this struggles against a 13900k it many cases especially in gaming and that’s a cpu from 2022…. That’s a problem. Now you couple that with the failed to or at least barely mentioned by Jay at launch stability issues most reviewers are experiencing it just gets worse. Intel needed a home run here after the disaster that is the 14900k and 13900k parts failing at high rates and they just didn’t get it done. Leave AMD out of it because Intel was largely competing with their own previous generation and overall lost to themselves. It’s sad to see really because Intel was so far ahead in the cpu market that the only ones that could have got in their way was themselves and well they succeeded.
Great vid, great breakdown. I would love to see Metro Exodus APO performance metrics versus 12th/13th/14th gen. In particular, the volga introduction whereas you are walking with your wife, off the train in the direction of the church. Before you step-off the train rails, and you overlook the horizon, looking down at the lake/train yard etc the game becomes mega CPU limited. On my 14700k with APO enabled, depending on how my camera is angled, I've seen my 4090 at DLSS Quality @ 4k res drop to 60-70 gpu usage. It's hard to get the gpu to drop that low in general in that area, depending on the viewing angle etc. Can we see this APO comparison between 285k and 14th/13th/12th gen, please?!?! NOTE: Oh, then there's chapter 3 forest area, Taiga where the wolves and god of the forest are running rabid... Hyper CPU limited area again. I haven't myself tested APO in that area yet however.
Yeah 3rd choice in gaming, and taking 270W in powergated/bios Mode. There is altough a 85W heat loss thats sums different in gaming. Extrem expensive too. Bauer mentioned you in his review of this cpu, its about the VR-Voltage regulator. I see how you try to put this product in a good light, but its time to shorten the stocks again :)
Honestly, I kinda struggle to find the purpose of these videos, crazy specs, crazy prices, crazy gpus, crazy unoptimized games that run crazy bad on the aforementioned crazy specs, and so on.
These scenarios only seem to make sense to reviewers. I have a 5950X with DDR4 3600 CL14 and considering an upgrade, what would I experience if I moved to this whole platform? I feel like that's a better question to answer than saying this processor "isn't great for gaming" because of some chip design nerd knob that could have been done slightly better. The other reality is that I have to think many people will be going from DDR4 to DDR5, is CUDIMM the way to go and how would that compare to the great DDR4 we have today? Who is upgrading from the latest gen to the latest latest gen? nuts
What is your use case? Productivity, gaming or bit of both? None of the testing I have seen has used the new memory format. It will be interesting to see how it changes these numbers.
What? When reviewed of course you will compare it to all other options currently available. The review is to help you choose what processor to upgrade to. And whenever something new comes out, people with the last gen version ask if its worth an upgrade. They show the benchmarks here and you can pretty easily benchmark your current system…
So.. 13900k is the best overall? Price, FPS and compute power. It is top 3 on ALL thoe graphs (both gaming and compute) without concessions and + it is one of the cheapest ones. Right?
I thought more of you Jay, you needed 450w to hit 42000 score with a 14900k, I hit just over 42000 with my middle of the pack 14900k, 1.33v, 320w 83c on a phantek 420mm aio running 45% fan speed
This looks really good... For the chip that replaces it next (or 2) generation(s) from this. The main thing intel needed to get under control was power consumption and they've made a huge step in that regard. Now if they can keep the power under control in the next version of this chip, they'll be in a very competitive place. P.S. you'd expect that with the better power handling, you should be able to overclock this thing for some huge improvements. But who knows.
Would you consider including Star Citizen in your game testing? Even though it isn't done yet, it is really demanding of hardware and might provide some decent data.
As I think about the difference in the gaming portion of the testing for the 285k vs the 7800x3d, I am betting if intel since they went away from the hyperthreading, if they simply add more P cores, would boost that result greatly to match and or exceed that of the AMD 7800x3d.. I have been wanting Intel to do more P cores.. and they have not wanted to do so, then they drop off the hyper threading and still do not add more P cores and I think that is a mistake by Intel.. Intel, add more P cores.. next go round 16P cores for the 385k and 12P cores for the 365k.. and then 8P cores for the 345k..
Hello, I have a question not related to the topic. Is that the laptop you repasted? How is it doing and what are the temps under load/games? Thank you in advance :)
Dinosaur era: IBM hires Intel for the CPUs, required a second supplier - AMD Today: single manufacturer, TSMC making CPU chips for both Intel and AMD. Either way TSMC is winning...
You should do better research. DLVR was developed for Raptor Lake but Intel only got it working on RPL-R. It’s purpose is to allow idle cores run at lower voltage than the cores under load and save power. Nothing to do with controlling board manufacturers, would also have been 5 years later as a response to that. It was Intel CPUs requesting too much voltage that killed them…
Measurements without power limits are absurd. Intel has clearly stated the PBP of this processor (125W). Testing with a load of 10 minutes at 250W is simply OC. You should provide results for “stock” or for “stock and OC”, but not only for “OC”.
7500 freedom dollars for 4000 of machinery and 3500 in cable management is fucking wild.
I think it’s dumb at this point in the GPU Cycle for someone to purchase that $7500 system. Also for $7500 I would expect a custom loop not AIO 😂
My custom loop, that I over-spent on, was about $4500 total. $7500 is nuts, even at today's prices.
7500 better be top of the line and personally customizable if I'm spending that. That's wild.
@@wallywest2360 FWIW, back in the day, these guys were the top line because of their support and build quality. Unless something changed, that's really what you pay for. Their sXit just flat worked every time, barring some random hardware failure.
@@larracis well depending on the country (Life is good in the Tumor (USA)) it can be 7500 easy. in Canada The RTX 4090 or 5090 is $2500-2600 by itself the 7800x3d is $850. 7500 isnt that far off
It also looks like a 1000$ system you would get at bestbuy 😄😄
Falcon charging that much for a non custom loop is a joke
2024 "Brokie Statement Of The Year" award goes to 👏🏻
@@davidsmith4186
Nah hes right tho bro
Nothing should be that expensive unless you are full-on getting a SERVER type WORKSTATION bro
And the fact its just a 280 RAD is wild!
No custom loop, case looks like 100$ max, front panel is restricting air flow, looks cheap, fans look cheap, it's rather small for 600W of components inside, it's prob a hotbox. GN should get their hands on one of these.
@@ahoymateeez My guy, Google is your friend. Falcon NW has been top tier in every aspect for over a decade. That case is bespoke to them. They have thermals entirely under control. If this was iBuyPower or some SI that on that level you’d have a valid point, but they’re not. Someone seriously looking at Falcon NE is probably only cross shopping Puget Systems.
I'd hope the people buying those can write them off as business expense.
Jay, it's a Thread deficit and actually has 8 more cores. The 9950X is only 16 Cores with 32 Threads, while the 285K has 24 Cores but only 24 Threads.
Yep, Poor look on Intel's part
Hii, can you please explain the practical differences? Does it even matter the contrasts between these two for productivity/power-users (i.e., audio production, video production, old/basic gaming, etc.)?
Gaming doesn't really use more than 8c/16t configured CPUs. Considering that there is a thread deficit, the workload task results are quite impressive if you ask me.
@@tj_2701 Fr. Gamers can use pretty much any PC, it really matters not which specs nowadays for 90% of non-psychotically-demanding contemporary titles. Any lower mid end, middle mid end, or (if you really are in that 0.1% of extreme gamers) the upper mid end will do. High end? No point; you'll never see the ebenfit let alone actually use it.
Meanwhile, musicians, programmers, audio engineers, and video production benefit massively from higher end PCs and often benefit way more from high clock speed than abundance of cores. Rather get a 5.8ghz CPU with only a few cores than 100 cores running at 3.2ghz....as a musician. Gamers _prefer_ cooler running, more efficient PCs; productivity users on the other hand NEED both of these aa givens, in addition to stability.
Even for productivity, of you take a look at code compile, compression/decompression the 285k is weak af. There are much better reviews than this one...as usual.
So basically, wait for 9800X3D as expected, gotcha.
Or buy a 7800X3D (if you can find a good deal.. prices exploded).
@@ArticSpy man i got my 7800x3d 2 months back for 309 usd im so happy
@@pwn3426 Same, we clutched out this cpu is goated
yeah.
Some people want certain brands, stick with Chevy, vs Ford, etc. I'm on the fence with the cpu. Currently on a i9-9900k and it's handling my 4090 fully, might wait and see if it can use the 5090 fully and then upgrade next. Either 285k or 16c amd
LTT, JTC and GN releasing videos at literally the same millisecond. Oh the embargos...
All my favorite tech tubers releasing at the same time 🤣 I'm here first though.
but first we watch Jay) ахаха
I wouldn't know as I only watch Jay....but RUclips has a feature called scheduled release. You make a video and upload it to youtube and then push schedule. So if embargo is Oct24 they just click Oct24 midnight.
they also do it 1 minute after lift.
Always Jay first :)
Falcon Nortwest is launching a new financing program, they'll actually have a guy remove one of your kidneys in your bathtub instead of traditional financing
The 5800X3D outperforming the 285K in almost every game is insane. Got one last Feb and put it in my 8 year old X370 board and performs brilliantly.
The 7800X3D went up in price again... It's now about 480 Euros. 2 Months ago it was only 330 E.
Good thing is, it’s still worth it!
welcome to amd monopoly. 10800xed will cost you 600+ euros
$850 in Canada on new egg
Paid £320 for mine last month
Yeah, and people think AMD is a good company and love us. If they could, they would make it cost 1k. 😂😂😂
that 3 doors down joke was my kryptonite
Sadly, Jay can't be your superman.
It is impressive how the new lab it almost looks like the old studio 👍
Iw as trying to keep is "similar" but new and designed with a flow and purpose
All I'm getting from these testing and new releases is that the AMD 3D CPUs are just insanely good
for gaming.
@@jondonnelly3 it's good for productivity as well, if you care about price/performance/efficiency... Yeah sure, if you don't care about gaming _at all_ 3D makes less sense even within just the AMD lineup. But that's mostly because you pay extra for the 3D, not because it's it's not a solid performer for productivity.
im glad i upgraded in 2022 lol i remember walking into microcenter and asking for a 13900k, the guy helping quicky told me that they have a 12900k open box for $285 and i snagged it because great deal. thought i might upgrade later if i need. fast forward and i definitely dont need more compute power, 13th and 14th gen are riddled with microcode issues and 285k is only marginally better in gaming than a 12900k
Ooh nice, Have the 12900KF 4090, it's a solid gaming machine. Snagged last year on Fantastech 1 sale. We just got a Microcenter in Miami, might drive down for the next build.
Hi is it wise to buy 12400f now skipping 13400f and 14400f? I'm a casual gamer and browse mostly, i don't want the processor with issues..
I hear ya there, as in early 2023 I bought one of the Chinese Erying Mobos with an Intel Core i9 12900H CPU(yes laptop chip on a desktop MATX board) as I don't have Micro Center near me, and it does everything I need it to do under Linux, and then some, even with 32GB of DDR4 3200Mhz RAM, It's been doing me so well I'm honestly considering buying a 2nd board to replace the 2nd computer in my gameroom as whole setup would cost me less than what many of these new Intel, or AMD CPUs would.
@@JosephKarthic if you're going to buy new just go AMD. The 7500F is cheap and beats the snot out of the 12400f.
if you must buy intel I'd recommend going the used route, I've found the 12900k for under 300, I'm sure you can find a 12600k or ideally a 12700k for pretty cheap. You might even get one with AVX-512, which helps with certain applications.
@@JosephKarthic It's okay with no e cores, my mother uses one. Would consider a 7500F aliexpress also for this price point, could pair with a night devil itx or get an asrock pro rs budget board. Have gamed on 12700KF & 12900KF, I shut off e cores anyway so p cores can push boost higher. Would go AM5 if it's not costing more has upgrade path, though DDR4 is an option on the 12400F.
Watch GN on the power draw stuff. Literally ignore everyone else. They didn't isolate/test accurately to show the actual true efficiency (or lack thereof).
And ASML is above all of them providing the machines, yet Intel is the only company that has bought the latest generation of lithography machine while tsmc is waiting. This is probably a temporary solution while Intel is preparing to up their fab game significantly.
Exactly - people complaining about prices (understandably) should realise that it's currently due to a monopoly of these high end process nodes by TSMC and ASML, who are charging what they want. Ironically, ASML's latest financials looked poor since everyone else is so far behind, they are having reduced sales and so not being able to afford as much of the latest lithograph...
I think TSMC said that the throughput on the latest ASML machines would increase per-chip production costs, so that they would never see a benefit in terms of profit. They make more sense for Intel, which have fallen too far behind and may be desperate.
@@fpadams i think that is true yes, the machine is still not fully refined. As long as they can keep making chips with the current gen they will keep doing it. They already proved they can stretch existing technologies A LOT, as we saw with 193nm machines during the intel stagnation days.
I've never been a fan of the big/little core design. It breaks virtualization, for starters, and I use virtualization. Second, without the OS scheduling things properly, things get put on the wrong cores, and it hurts performance. I'm also not a fan of the separate CCX construction AMD went with. That also hurts performance when things get put on the wrong cores. So, I've been a big fan of the AMD big cache,single CCD designs lately. (I have a 3900X Hyper-V host for self training, but I never have VMs with more than 3 cores on it.) If MS can properly update Windows, and the Linux community update Linux, to schedule the cores right and not break virtualization, then I may consider more complex designs.
"It breaks virtualization" - that is 100% intentional. intel wants you buying xeon processors for vms.
Yeah I bought a 9950x for that reason. The double CCX is really not that bad and it's here to stay.
Hold on what is supposed to be broken about vm? Cause i have a core i5 with p and e cores and vms seem to work fine on unraid. But i only use a vm to run Minecraft servers
@@jondonnelly3It is that bad when it comes to x3d chips with dual CCDs. Core parking is a bugger to get working properly. Single CCD on x3d chipe is the only way to go.
The ideal cpu for you is clearly the 7800X3D. I myself am getting a 12900k avx-512 enabled for cheap and disabling the E cores.
Nick needs a 5090 and his own mini fridge
Why?
and a 2010 toyota corolla, burgundy in color
@@Jayztwocents its nick's account!
@@wolveric0with manual windows and transmission.
@@JayztwocentsBecause he has to stay calm while sharing an office with Phil. 😆. Honestly though...its because he is so calm and nice. We just want to see your guys have nice things because we like them.
Being one of those crazy Skyrim/Fallout modders, I have noticed that big cache makes a world of difference because of the older Creation Engine "I hate multicore" coding. I am really wondering how these will fare vs 13900/13700. Draw calls. Nothing but draw calls...
Waiting for the AMD 9950 x3d stats to see how it performs.
The people that are in charge of intel the last ten years and the people that were in charge of Mitsubishi in the 2000's should be barred from touching a business for the rest of their lives. Screwing a good situation up that bad almost seems intentional, its almost too bad to be real.
You can add Stellantis of today in that list, as well.
@@MadClowdz indeed
@@MadClowdz IBM
@@MadClowdz Yes Stellantis deciding to turn Chrysler group into a high margin luxury brand, great idea... and those guys in the late 90s and early 2000s put the "BM" in IBM with their outsourcing and selloff to lenovo..
@@MadClowdz ah man i worked at a dodge dealership when Fiat bought into them and it become FCA, THAT shit was a DISASTER. the whole Chrysler group is one big screw up over and over.
Bro I’m 41 and even I barely got the 3 doors down reference
That’s your age group I’m 41 too
I caught it right away, but I am in my early 50s :)
Well, they suck, so I didn't catch it. year younger 😂
@@chillnspace777 kryptonite super man song was played on every radio station. So unless you lived under a rock
I just got my 14700K from RMA my 13700K. Running with the latest bios (intel default setting) and undervolt it a bit to be safe. Hopefully this will last a few more years, but I’m definitely switching to AMD for my next upgrade.
did intel suppply you with that 14700k?
I have a 14700k and I don't game, just productivity. Quicksync is a godsend so I'm liking the news that this crushes in compute! But I will wait for at least 3 more gens for this to be perfected. Then I think I'll notice a big uplift. Overall very exciting with this all new title tech.
am5 is the answer
just go ahead and get it. 14700k, board and ram easy to sell on.
How is the stability of your system? Do you ever get blue screens? I've heard mixed reviews
@@mariorepas9100 no I haven't had a blue screen stability has been perfect. Ironically I used to have a 5950 x and that would constantly give me blue screens. It's the reason why I upgraded.
I think HWInfo is correctly indentifying your Core as 13. It is displaying the P-cores as they are physically distributed. 2 at the front, 4 in the middle, 2 at the back. P-Core 13 is the Lucky Pierre in the mix, he gets E-core and P-core action from both sides.
0 P-core
1 P-core
2 E
3 E
4 E
5 E
6 E
7 E
8 E
9 E
10 P-core
11 P-core
12 P-core
13 P-core
14 E
15 E
16 E
17 E
18 E
19 E
20 E
21 E
22 P-core
23 P-core
Regardless of the sides not being on, it’s still not an “open bench” setup. The hardware still resides inside the case, so it does have an effect. That would be akin to putting the bare hardware in a box with no top. Although it changes thermals a bit……….it’s still not an open bench.
Thumbnail said everything 🤣
A year from now, Intel will be selling a hyperthreading "upgrade" download for these chips, just $250.
no, it will be a subscription
@@arizona_anime_fan Ha!
Ah.... Tarkov prices....
You joke, but they did have a hardware dongle on one of their many platforms a few years back that unlocked more threads or something.
@@CantankerousDave That's true! I remember that! But I can't find anything on Google about it because Google is so damn bad these days. You remember which Pentium they did this for?
Yesterday I got a 9950x for a new workstation. If we incentivize something we get more of it. Last thing I want to reward is the focus on efficiency over performance. Will look at Intel again when it is time to replace this one. Who knows, the best CPU in 3 years could be by NVIDIA or some new player.
Before Phenom there were Athlon 64 X2 CPUs... launch date was May 2005 just like Pentium D.
I had one of those back in the day, the 5200+. A decent cheap way to make Oblivion and CSS less laggy.
So far, I have seen nothing to suggest I should upgrade from my 13th Gen CPU. Unless it melts down in the meantime [unlikely, IMO], I'll be hanging on to it for a few more years.
Same here my 13700 is still an absolute beast compared to most cpus. Plus the new socket mobos cost alot
@@LuccianoNova Good point. If my 13th Gen _has_ been damaged, as long as the new BIOS/microcode actually works, replacing it is far easier and cheaper than basically buying a whole new CPU and Mobo combo.
Same but 12900KF 4090 48gb DDR5 on Linux. I think if anything is to be upgraded there was some smoothness gain in ram kits lower CAS and adding more ram. But I don't really need more ram. Just would be one place people could spend without going to another platform. I'm not sure but maybe the damaged CPU would work at lower clock/ram speeds. Have 3 12th gens in my household did a lot of BIOS updates but would go AM5 if a few years down the road a new build is required, still feel burnt by the time spent tuning machines but it's stable. Can always replace with RMA/12th gen or core lock the new chip. I disable the e cores gaming and turbo boost, would crash games for me.
Over the last 5-7 years of being quite engaged with various creators reviewing pc hardware - Jay and team remain my go-to in now only a handful of channels I follow. Respekt.
Jay & Team are good and I watch every video but for the real deal on hardware reviews you use gamers Nexus.
Hardware unboxed was a channel I used to go to. But they have a specific audience they cater to so I guess it's okay
you guys really need to update your 7950x3d and 7900x3d benchmarks if you're going to throw them in the list. A correctly setup 7950x3d matches or slightly beats out the 7800x3d in games. GN's benchmarking proved that out a while back. Heck your own video showing how to FIX the 7950x3d to get it working right is what helped me get mine working correctly!
The whole point of these comparisons is to show out-of-box performance. The same reason overclocking or CUDIMM DDR5 results are not included
Seems the 285K is really good at synthetic benchmark. If I was only doing that, it would be my cpu of choice.
You mention the temp drop at the end of the test as being an indication of the efficiency of the 285k..... yet in the next slide we see the 14900 doing the exact same thing. Test ends, temps drop almost steaight to 40c
I think you're trying to find 285k praise points where there's just typical CPU behaviour when on a AIO and having only done a 10 minute workload.
Great review, once again :) But bongers me thou is that "8 core deficit" you mention a couple times. It's a 8 thread difference to be precise.
Man, really feeling good about getting my 7800x3d for $300 last black Friday.
same her cept 2 months ago for 309 USD
TechPowerUp did review using onboard igpu on 285K vs other intel and amd chips. The RTX 1060 beat them all..
🤔 Interesting:
“The ASUS Z890 Hero motherboard feeds four of the CPU VRM phases from the 24-pin ATX connector, instead of the 8-pins, which makes it impossible to measure CPU-only power with dedicated test equipment (we're not trusting any CPU software power sensors). You either lose some power because only the two 8-pin CPU connectors are measured, or you end up including power for the GPU, chips, and storage when measuring the two 8-pin connectors along with the 24-pin ATX. For this reason, we used an MSI Z890 Carbon motherboard exclusively for the power consumption tests in this review.”
(Techpowerup)
Simpler solution than Gamer’s Nexus monster power monitoring. They saw some weird things in the power details by doing the monitoring, however.
The p core e core arrangement isn't random. It's based on how they're arranged on the die
Was going to post this same comment but you are correct
The ~35K temperature step, at a guessed 220W difference, just tells us that Rt_ja is 0.16K/W. If the thermal path is 300mm^2, copper is 8.3K/W/m. So Rt_ja equates to 19mm of copper thickness, which is not terribly impressive from a power electronics standpoint.
I'm just watching these videos to get warm and fuzzy feelings about my 7800X3D.
TSMC ???... ah you wanted to say WE here in Holland at ASML :P
The tech needed for TSMC to make their chips come from Japan and Holland
I think you mean the Netherlands? Holland is a province of the Netherlands.
Technically ASML got that tech from a consortium led by the US DoE and a bunch of labs like Lawrence Livermore National labs. They licensed the tech to ASML with strict rules, which is why the US can stop ASML from selling to China. That said they were mostly worried about Japanese competition at the time and intentionally avoid licensing it to Japanese companies.
Well. This makes me feel better about having 9950x. Although if I wanna get the 9800x3d Ina year or two guess I will, we see
After Intel kidded that AMD "glued together" their chips, they had to fit the tiles together to make it LOOK Like a single piece of silicon. Shame! They even stuck in a "filler" tile. Could that possibly help thermals, or is it really just cosmetic?
The reason P and E cores are mixed is because they are listed in the order they are physically on the die. 2P, 4+4E, 2+2P, 4+4E, 2P
I was waiting for 285k do replace my 12900ks, but i will keep it until next gen, Thanks a lot for this video
Am on 12900KF, it's still solid for my gaming use case (might be fine a good 3-5 years) but likely going AM5 X3D when the time comes to leave. Runs really well paired with 4090 and 48gb DDR5, very smooth. Am sure there is better but it hasn't felt unpleasant to game on yet, raids like a champ.
We need timestamps Jay!!!
why? 285k is no good, get 1700
@@uhohwhy no lol. Just buy a x3d CPU
@@fallenlegion1828 Yeah.....not spending $500 on a CPU.
If Intel will not fix some strange performance issues with chipset drivers/microcode/whatever - I will buy AMD....it will be my first AMD since X2 3000 chip from Jurassic era. I was on Intel all this time since Core 2 Duo E6400. Currently I am on i7 12700k and see no reason to get 285K/265K. Very sad. :(
Yeah 12700k is plenty, unless your gaming at 1080p on 4090 on 300Hz monitor or something crazy. X3d for that.
Haven't touched an Intel product since 3rd gen, 2012. Can't see that changing. AM4 has dominated the avg user market imo.
Sadly AMD dominates the average informed user market, the average uninformed person will go with Intel just because it has so much more brand recognition.
Steam Hardware Survey proposes otherwise, though I'm considering switching over to AM5 after I've really squeezed out all I can on my current LGA 1200 build
My best friend just went from an i7 3770 to a R7 7800x3d
@@hacoberthejacober3345 Sad to admit, but that's true.
@@hacoberthejacober3345 Sadly yes. Here in China, most of my friends still think Intel over amd in gaming despite me keep telling them the wind had changed direction when Zen 3 launched. They wouldn't watch any reviews and even if they do, some reviewers are also biased for some reason.
The 9950x price is up because, as of yesterday only Amz (stock in single digits) and New Egg had it. Unless you could drive to a microcenter, which had them for $650. Stock seems to have improved at Amz and price is “down” to $671
I've always loved falcon northwest. Back in the day, they had the best lanbox case (the fragbox). I loved mine.
I checked Newegg and Micro Center on the 9950x and they had them in stock at $599 and $649 respectively and that was 1 hour after your video had posted... $599 isn't bad for production work that can game. But, waiting to see what the new x3d will do...
ordered one myself. x3d is really only needed for 1080p gaming esports and I moved to 4k 3 years ago.
I really want to see how the 9800X3D releases, I really hope it's good, because that's most likely my next CPU, as long as it's at least better than the 7800X3D
I feel like if you have a 7800x3d there will be no point going to a 9800x3d for max of 10% inc.
Thanks Jay, good to see that there are actual reviewers looking into all aspects. Hope to see some Intel updates before, but im gonna go ahead with the purchase.
I am on AMD with MS 23H2. When will I get 24H2? Or do I need to force it? Reinstall MS? Seems 'criminal' that 10% performance uplift is held until a major MS update instead of the monthly cumulative fixes!
As my children would say: Just Google it. Turns out forcing is fairly easy, based on the 2 videos I watched, to force the update using the Group Policy Editor. There is also a registry edit method.
It's still being baked. Within Windows Update - Windows Insider beta channel has it. Sign and up and you can get it now and yes the whole OS feels snappier on AMD I have.
It's got issues, don't force it. Wait for it to be fixed
not for gamers AND not for professionals
What? Nooo. Other channels don't exist. I will not have this blasphemy. There is only JTC.
6000MHZ was too slow for last few gens of Intel CPU's, want to see it tested with 8000-10000 mhz. Love the Falcon machine. If I was going to have someone else build mine it would be them.
Love the new studio Jay. This new rakish desk angle is also 😚🤌🏼
So, as far as i am concerned: i am impressed by the 285k, but i don´t use it for games but for running tons of VST´s in my DAW, pretty hefty computing task ... this processeor would be ideal. Nice
28:15 Jay calling me out with my really old system with a 6600 CPU 😂
Reminds me of the 80's, if you want to crunch data you get a IBM if you want graphics you get a Apple. AMD for games and Intel for productivity.
I am interested in the side by side ram testing, good call leaving it out of this video
I have an i7 6800k, going to Arrow Lake would be like going from a Vespa scooter to an Audi RS5. I do more than gaming, actually a lot less gaming, and more AI related tasks. This is also the start of a new platform and has an upgrade path for years to come. Will I buy this right now, no, do not have the money right now, but maybe next year or so.
I have a 13600K that is now around a year old. While it suffices for what I need it for, I regret not going AMD. The extra e-cores are a disappointment overall, I would rather just have fast, efficient p-cores (which is what AMD offers). I don't see any reason for anyone to consider the 200 series even if they need a platform upgrade. AMD's current lineup is as fast, faster in gaming and a LOT more efficient. And cheaper, too, but that's rapidly changing as retailers see a grifting opportunity.
Excellent review.. balanced. Thanks for not doing the classic dunk on Intel move like everyone else looking for clickbait.
I bought the i9-13900k with a new motherboard, ram and GPU and then found all these videos about the 13th and 14th gen cpus dying early and now I'm scared lol. I did do the bios updates and have made sure my voltage and everything else is normal however im still scared.
The tables got flipped.
When ryzen first came out, intel had single core performance, but amd had multicore performance
Now, amd has the single core performance, and intel has the multicore performance
How the turntables...
Just an idea Jay. Do a video directly comparing U-DIMM to CU-DIMM. Show us the direct comparison of possible performance uplift.
If they are doing TSMC ditch E cores for a gaming chip
Goodness, the 7950X3D is such a beast.
It's rarely talked about in comparison to the 7800X3D, but look at those charts.
It's essentially a 7800X3D when gaming, but it is actually useful when you exit the gaming application.
You get everything the 7800X3D offers for games while having one of the most powerful productivity processors outside of workstations.
People who bought the 7800X3D and want to do more than gaming will be incentivized to buy the new 9800X3D because of the higher clock speeds and supposedly better performance with compute outside of gaming.
Someone with a 7950X3D has almost no reason to upgrade unless the 9950X3D ends up putting the 3d V Cache on both CCDs.
Why would 7800x3d users need to upgrade if you are 1440p or 4k the 9800x3d is not going to give you negligible benefits at 1440p and probably zero at 4k. Also for most things normal people do on the pc besides gaming ie. Browsing, youtube etc . The 7800x3d is just fine and nobody would know any difference between it and a cpu that is better at "productivity".
@@IRaoulDukeNo one is attacking the 7800X3D. It is the gaming king, and the choice for most buyers.
The defense force doesn't have to come out.
My post is for the people it applies to.
Not everyone uses their PC for just gaming and light browsing, and even AMD recognizes the 7800X3D is poor outside of gaming.
That's why leaks suggest there are substantially larger increases to the gains of the 9800X3D in non gaming applications than in gaming.
Intel seems to have the e-core thing pretty well nailed. But its "performance" cores are under performing. I said this on some other channel a few days ago, but it's worth a discussion: Have we reached the limit of CPU performance on silicon? Are we stuck until some other completely different technology comes to the fore?
The 12900K is a massive bargain right now for 1440p or 4K gaming. For 1080P as well unless you're competing.
Who thought it was smart to make the 285K processor have ECC memory compatibility, while the Z890 chipset is not?
(intel website)
So they could sell a W-chipset like the previous gen W680 (vs Z690)
Arrow Lake limited to 250W and 6500 memory speed. What we are not seeing is:
1. CUDIMM has not been taken in count 10000 speed that potentially would give 10-15% boost alone.
2. Extreme Wattage 295W to further boost the performance.
3. No Hyper-Threading Technology allowing huge performance boost for E Cores 1000MHz overclocking.
4. We don't see PCIe 5.0 performance lift via direct lane to CPU because AMD/Nvidia GPU architecture doesn't have any gains from REBAR/E Cores.
5. 8 pin additional power to the PCIe 5.0 (150W) as buffer between GPU/CPU pairing? Interested to see what this is for.
Point is AMD doesn't support CUDIMM and AMD/Nvidia are very minimal CPU users. Only way you would find a bottleneck is to run 4090 at 1080p. Who does that? So all and all this test is not realistic and from the architecture of Arrow Lake and boosted memory with PCIe 5.0 direct access to CPU (REBAR on steroids) it is not hard to see this CPU and components are tailor made for INTEL ARC to pair the GPU and CPU the best way possible boosting the CPU/memory to lift the GPU even higher performance. Battlemage 250-275W + Arrow Lake 250-295W = 500-570W combined performance with added CUDIMM for possible extra 10-15% if not more. I predict 5080 beating performance.
Rest of the CPU's compared to all are optimized and most likely motherboard boosted that has been the talking point. For my view 285K looks very good. Would I buy it? No because it is too expensive, we don't have Battlemage that is the obvious part missing for gaming, it needs to cook a bit more, more CUDIMM options. I have 13700k and A770 for 4k working as entry for 4k and seems to work OK if not limited to mid settings and the value for coin is there. But give it few months and Battlemage yes.
A $7500 system!!!! Holy Crap, that is 3 months of mortgage payments! Is that really a system for testing??
The 9950x is out of stock and the prices that you see for them now is from scalpers.
All I want for Christmas is a CPU with all Performance cores. I don't need a ton of them. I want them to be fast and game like a mother. Bring back the Extreme
The question is if there's a performance uplift for production w/ CUDIMM.
Lil bro, der8auer called you out and people are realizing you just don't know what are you talking about
Feel like he seriously went out of his way to put this in the best light he can while most other tech reviewers are calling it for what it is …. Overall bad.
@@goinpoztal They said that is bad because it's not beating AMD... It was expected for a brand new CPU... They cried about efficiency and now they are crying because the games, that they aren't gonna play, are 5% slower... Intel CPUs were never oriented towards gamers, for that they would need a new design just for that and it's not a priority.
@@jeanytpremium be that as it may the problem is this struggles against a 13900k it many cases especially in gaming and that’s a cpu from 2022…. That’s a problem. Now you couple that with the failed to or at least barely mentioned by Jay at launch stability issues most reviewers are experiencing it just gets worse. Intel needed a home run here after the disaster that is the 14900k and 13900k parts failing at high rates and they just didn’t get it done. Leave AMD out of it because Intel was largely competing with their own previous generation and overall lost to themselves. It’s sad to see really because Intel was so far ahead in the cpu market that the only ones that could have got in their way was themselves and well they succeeded.
It's amazing what happens when you have a manufacturer who's familiar with the manufacturing process
Great vid, great breakdown.
I would love to see Metro Exodus APO performance metrics versus 12th/13th/14th gen.
In particular, the volga introduction whereas you are walking with your wife, off the train in the direction of the church.
Before you step-off the train rails, and you overlook the horizon, looking down at the lake/train yard etc the game becomes mega CPU limited.
On my 14700k with APO enabled, depending on how my camera is angled, I've seen my 4090 at DLSS Quality @ 4k res drop to 60-70 gpu usage.
It's hard to get the gpu to drop that low in general in that area, depending on the viewing angle etc.
Can we see this APO comparison between 285k and 14th/13th/12th gen, please?!?!
NOTE: Oh, then there's chapter 3 forest area, Taiga where the wolves and god of the forest are running rabid... Hyper CPU limited area again. I haven't myself tested APO in that area yet however.
Yeah 3rd choice in gaming, and taking 270W in powergated/bios Mode. There is altough a 85W heat loss thats sums different in gaming. Extrem expensive too. Bauer mentioned you in his review of this cpu, its about the VR-Voltage regulator. I see how you try to put this product in a good light, but its time to shorten the stocks again :)
What I think its that optimization is needed for real as like their GPU's worked themselves up, well here the same stuff
Honestly, I kinda struggle to find the purpose of these videos, crazy specs, crazy prices, crazy gpus, crazy unoptimized games that run crazy bad on the aforementioned crazy specs, and so on.
thanks for keeping it real about the whole lab thing, getting tired of techtubers claiming they got a "lab"
These scenarios only seem to make sense to reviewers. I have a 5950X with DDR4 3600 CL14 and considering an upgrade, what would I experience if I moved to this whole platform? I feel like that's a better question to answer than saying this processor "isn't great for gaming" because of some chip design nerd knob that could have been done slightly better. The other reality is that I have to think many people will be going from DDR4 to DDR5, is CUDIMM the way to go and how would that compare to the great DDR4 we have today? Who is upgrading from the latest gen to the latest latest gen? nuts
What is your use case? Productivity, gaming or bit of both?
None of the testing I have seen has used the new memory format. It will be interesting to see how it changes these numbers.
What? When reviewed of course you will compare it to all other options currently available. The review is to help you choose what processor to upgrade to. And whenever something new comes out, people with the last gen version ask if its worth an upgrade. They show the benchmarks here and you can pretty easily benchmark your current system…
So.. 13900k is the best overall? Price, FPS and compute power. It is top 3 on ALL thoe graphs (both gaming and compute) without concessions and + it is one of the cheapest ones. Right?
yup
Maybe even the 12900K. Doesn’t always keep up entirely, but the price is a lot lower. It’s half the price of a 9950X and 30% cheaper than the 13900K
I thought more of you Jay, you needed 450w to hit 42000 score with a 14900k, I hit just over 42000 with my middle of the pack 14900k, 1.33v, 320w 83c on a phantek 420mm aio running 45% fan speed
With so much temperature headroom I wonder how these will handle overclocking. Seems somewhat limited out of the box compared to other offerings
This looks really good... For the chip that replaces it next (or 2) generation(s) from this. The main thing intel needed to get under control was power consumption and they've made a huge step in that regard. Now if they can keep the power under control in the next version of this chip, they'll be in a very competitive place.
P.S. you'd expect that with the better power handling, you should be able to overclock this thing for some huge improvements. But who knows.
still drawing x2 the power of ryzen
@@arizona_anime_fan yeah, they're really gonna need to keep on that as one of their top priorities
Would you consider including Star Citizen in your game testing? Even though it isn't done yet, it is really demanding of hardware and might provide some decent data.
No Flight Sim? 😢
I hope reviewers plan to include MSFS '24 when it comes out for us sim nerds.
New studio looks nice!
As I think about the difference in the gaming portion of the testing for the 285k vs the 7800x3d, I am betting if intel since they went away from the hyperthreading, if they simply add more P cores, would boost that result greatly to match and or exceed that of the AMD 7800x3d.. I have been wanting Intel to do more P cores.. and they have not wanted to do so, then they drop off the hyper threading and still do not add more P cores and I think that is a mistake by Intel.. Intel, add more P cores.. next go round 16P cores for the 385k and 12P cores for the 365k.. and then 8P cores for the 345k..
Hello, I have a question not related to the topic. Is that the laptop you repasted? How is it doing and what are the temps under load/games? Thank you in advance :)
suffering on reviewers face saying intel cpu are bad makes me happy .
Dinosaur era: IBM hires Intel for the CPUs, required a second supplier - AMD
Today: single manufacturer, TSMC making CPU chips for both Intel and AMD. Either way TSMC is winning...
You should do better research. DLVR was developed for Raptor Lake but Intel only got it working on RPL-R. It’s purpose is to allow idle cores run at lower voltage than the cores under load and save power. Nothing to do with controlling board manufacturers, would also have been 5 years later as a response to that. It was Intel CPUs requesting too much voltage that killed them…
Measurements without power limits are absurd. Intel has clearly stated the PBP of this processor (125W). Testing with a load of 10 minutes at 250W is simply OC. You should provide results for “stock” or for “stock and OC”, but not only for “OC”.
Great video Jay! Thanks for always being honest and keeping us informed. Keep up the great work!
Id like to see 5000 series for comparison in the future if u are willing. Long time viewer 😊Love the vids
winning the lawsuit in europe was a much bigger deal for intel than arrow lake ... can now live to cpu another day