► Big thanks to Micro Center for sponsoring today’s video: New Customers Exclusive - Get $25 off your purchase of any AMD and Intel Processor (limit one per customer): micro.center/nv7 Check out Micro Center's PC Builder: micro.center/a3b Submit your build to Micro Center's Build Showcase: micro.center/nmk Shop LG 27UP850N-B 27" 4K UHD: micro.center/icp
Hey, small detail. In the video you said that both CCDs run at the same frequency. This might not be true. The 3D v-cache CCD runs a bit lower, while the other CCD keeps the clocks up, or at least that is what I see others saying.
There are so many other tech tubers that have reviewed the 7950x3d and accurately capture the difference between the 3d and non 3d parts. You've obviously done something wrong here man. Check the countless other reviews online.
so i was looking up "7950x in 2024" to see what other people think of it who also have one. came across your video and even though its late, i have really good things to say about this cpu and its gonna be a good processor for a while (so consider this gem if you find it at a reasonable price!). Yea, its hard to cool, but most 360 AIO's are good enough for the job. other than that minor draw back, this chip just does anything you ask of it. and its multi tasking performance is super quick and snappy. its super efficient with an undervolt, most of these cpus can handle a -30 all core offset and typically run faster when you do this (tip: set custom temp limit to 89c, you'll cut allot of heat out and it will rarely ever reach this limit even on 100% load for either x or x3d variant). If you do engineering, 3d work, gaming, whatever this cpu crushes it all and it does it all well. It is a master jack of all trades. I see why this was labeled pretty much neck to neck with the 13900k at launch. Its simply put a great processor.
Dude, don’t believe all this shit about 7950X3D having issues with scheduling. I’ve ran tons of games on this new CPU and haven’t found a single issue. Furthermore, I’m getting around 20-35% more performance than my old 7950X depending on the games. I don’t know where this guy came up with only 1% and the GPU theory definitely has nothing to do with it.
@@MustafaGT would actually love to see a video showing the performance in games, I've seen a few reviews but you never know when a youtuber gets a binned chip or not
@@fragalot Don't compare your previous generation cpu witch has one CCD. Both 7900X3D and 7950X3D work differently: they have one CCD with higher boost clocks and one CCD with 3D V-cache and lower boost clocks. That's why I talked about a better scheduler. Linux performance may be better. Check what Wendell from L1techs said. Note: 7800X3D has one CCD, so it is similar to 5800X3D and thus it does not need a better scheduler. A scheduler pushes the processes to the available cores, but in the case of 7900X3D and 7950X3D the scheduler must push all the game processes to the suitable cores, in other words to the CCD cores with the 3D V-cache.
the 7800x3d will be the one that makes sense, the difference between the 7950x3d and the og 7950x is pretty small in most cases, especially once you run the game at settings that most people with a 1000$+ gpu would reasonably be targeting
@@olliec6267 its not microsofts fault AMD decided to drop a problem on them without having a solid software solution for them. People knew this was gonna be an issue and to wait for the 7800X3D not just cause it was cheaper but because simpler. Way less to screw up and vcache was only gonna be on one ccd.
That's what I believe as well, and I think it has something to do with the dual CCDs of the 7950x3D with only one of them having the 3D V-cache. The 7800x3D will only have one CCD. Tech Yes City simulated the 7800x3D with the 7950x3D and saw improvements by disabling the non-3d cache CCD.
What is required in my case (7900X3D) is to force the CCD priority in the actual BIOS with the Cache option. Not relying on the XBOX Game Bar in Windows (Windows 11 in my case) because even with just relying on that it doesn't actually run my games on the 3D vCache CCD0, by default it runs everything I do on CCD1 only. But going to the BIOS and forcing the priority for CCD0 there worked and everything I do now in Windows (gaming, etc) first runs on CCD0. However, it IS a problem. No one should have to force that through the BIOS and it should absolutely be a "set and forget" thing you do maybe once in Windows and that's it. It should just work out of the box. It DOES work for some people out there and they'd claim that there is no scheduler-related issues. That's normal... because they don't experience the problem, obviously. But there ARE cases where - for some reason - JUST relying on Game Bar just won't cut it. I'm one of those cases, and forcing it via the BIOS was my only solution.
I run a 7950x3d and set the bios to prefer frequency, then used process lasso to set all the processes in my games folder to run on cache. The OS, and all other tasks run on frequency while my games get the whole cache CCD. The performance is insane.
You may not see much of a difference in 1080p with fps in games where the 7950x was already pretty much hitting the engine limit. The 7800x3d is targeted specifically at gaming. The 7950x3d is for both gaming with the x3d enabled ccd and productivity apps the non x3d ccd with higher clocks. Another way to think of the 7950x3d is kind of like a higher clocked 7700x and a 7800x3d combined.
I'm looking at an X3D for my next system because of the huge advantages in Blender, but if I wanted a pure gaming machine then I probably wouldn't bother. We are in a strange position at the moment where a lot of the very best tech doesn't really make sense in a bang-for-buck way for people who just want to play games.
Went from a 5950x woth 4080 to a 7950x3D and it runs phenominally. Talking 30-40% fps gains, CPU bottlenecks are real. Well worth it, the whiny reviewers who say otherwise... meh, just negative titles for more views and comments.
I got one and love it.. my main focus was great gaming performance with lower power consumption. Some of the other chips use way more then I'm comfortable with and therefore throw off a ton more heat as well. I'm willing to take some minor loses to have a power efficient rig while gaming.
Have you updated the chipset drivers? I play heavily scripted and ai games and notice the first 16 threads kicking these scenarios. It's just insane. The vm in my game just doesn't get overburdened..and if only a microsecond. Really insane performance thru that cache. I wish I could try my x4 100 billion dollar galaxy save with it. My old pc did slide shows of 4fps on that. Sadly my save got corrupted in an engine upgrade.. bet this baby pulls that back up to 30+ fps. Updating a universe economy in every frame of the game simply was too much for my old cpu. And that's a badly threaded game - 4-6 threads or something like it. Actually use a game whose kernel needs that extra cache for the extra ai and scripts that need to run every frame. That's when you'll notice it.
@@paradoxicalcat7173 some of the fastest cars these days are more fuel efficient.. take an electric vehicle example 2023 Tesla Model S Plaid Top Speed: 200 MPH 0-60 Time: 2.1 seconds
@@paradoxicalcat7173well what if you wanna go fast and not have to burn through gas at the same time? Electricity doesn't grow on trees and a higher TDP => higher heat levels => higher AC usage => higher overall electricity expenses, especially in today's world.
I've got one myself, super happy, but I'm upgrading from a 6850k so almost anything would be better. What I'm really curious about is those games that normally can't ever seem to run well like MS Flight simulator, where things get pushed in weird ways. I also want to test it when running slower RAM. I plan on upgrading to 128GB pretty soon, and its a known issue that you have to down clock your memory (4800 is often all motherboards will list as compatible) And maybe the cache will help deal with that.
You should get the new 192GB kit heard it can run at 5200Mhz with the 7950x (7950x3D wasn't available at the time). It should be available in the next month or so for AMD.
You get better 1% and 0.1% lows with 5800X3d and 7xx0X3D which ultimately gives you smoother gameplay in 4K or VR, it's not so much about the 1-2% maximum FPS.
@@Vernas_R i don't have it and i barely play VR chat but i saw Tupper showing something around 4ms CPU frametime in front of a mirror full of unoptimized avatars in discord
I got myself the 7950X3D and I am really pleased with it. Though, I refused to go with AMD's scheduler and instead use Process Lasso to handle it, and it works wonders since I can run the games on the v-cache CCD and all other apps on the second CCD without them fighting over resources like they would if you just park the second CCD that AMD does currently. So if one uses Process Lasso, it is absolutely amazing for gaming + content creation like recording / streaming etc, the power efficiency on the X3D makes it really nice for workstation machine as well. Comparing it to my 5950X at 3440x1440, 4x AA and Ultra settings it boosted the FPS massively (+100-150fps) in many VR worlds on both ChilloutVR and VRChat, and also boosted the FPS very well in VR, especially minimums and reprojections, with a Pimax 8KX which is 2x 4K resolutions. So in my opinion who games on my workstation machine, the 7950X3D is far better than the 7950X with both performance and power efficiency in all my scenarios, especially if one uses Process Lasso over AMD's scheduler with XBox Game bar.
@@wahahah AMDs high end CPUs are dis-aggregated so instead of having all of the cores on a single chip, they split them between 2 chips. So rather than having 1 chip with 16 cores, you'll have 2 chips with 8 cores each. On the 7950x3D, one of those 8 core chips has extra v cache, and the other chip has higher clock speeds. So depending on the game, it might be beneficial to disable one CCD or the other to maximize performance. Using a program like process lasso can help with that.
@@MistyKathrine With Process Lasso you don't have to disable one CCD, you keep both CCDs active but restrict a process to only run on one of them at all time, thus actually utilizing the entire cpu instead of disabling half like AMD's scheduler does atm.
@@FroggoVR Well yeah, but trying to explain it in a way that makes it easy to understand. With process lasso you're forcing the program to only use cores on the CCD you want.
► Big thanks to Micro Center for sponsoring today’s video: New Customers Exclusive - Get $25 off your purchase of any AMD and Intel Processor (limit one per customer): micro.center/nv7 Shop LG 27UP850N-B 27" 4K UHD: micro.center/icp Check out Micro Center's PC Builder: micro.center/a3b Submit your build to Micro Center's Build Showcase: micro.center/nmk
Weird, I love mine. 3 months I've had it, zero regrets, its blazing fast for my rendering and compiling, while also being insanely good in games. Literally the best of both worlds.
@@Xilent1 I upgraded from an old 3700x so yeah everything is WAY faster. Everything. My partner has a 5800x3d in her PC and the difference between hers and mine is also quite pronounced. I'm not jealous of her anymore lets just say lol.
@@EloyBushida Thanks! That's what I have now a 3700x. I'm trying out GeForce Now which uses a 16-core CPU (Not Named). Made me consider skipping over the 5800x3D to a 5950x or even a 7950x3D for a new build. I'm thinking between the two from what I'm trying to achieve, the 5950x should work just as fine as the 7950x3D minus the gaming improvements which I'm fine with for now. Looking to make my PC more snappier if that makes sense and not just graphics this upgrade or new build.
@@Xilent1 I totally get what you mean, before the 3700x I had an old 4 core intel chip and everything felt so sluggish, I was just dying for some zip! I think whichever you go with you will notice a difference for sure, but I would recommend maybe the 7900x if price is a factor? It tends to be around the price of a 5950x but its faster in like 90% of use cases, and on a newer platform which is always nice. Get some fast 6000 ddr5 and an m2 ssd to go with it, and you're really cookin!
@@EloyBushida hear you and thanks for the advice. Its not much of a price difference so I'll just go 7950x3D over 7900x. Definitely will use your RAM advice. Going to wait and see if the 9x00X3D bring the 7x00 series down in price at Microcenter. 5080 might not release till next year, so I'll wait for Sales for everything available now.
What's crazy about PCs today is a Raspberry Pi4 is about as powerful as a PC 10-20 years ago. Think about that, more so when it uses the same power as a single one of those NZXT fans, and this system has 17 of those running. Makes you wonder, when will we reach the stage where fans to cool a PC/compter are a thing of the past.
Did you ever try disabling the non-3d v-cache cores? Other reviewers noted seeing better performance this way. The point still stands that due to the way the split design is handled, the 7900X3D and 7950X3D should not be purchased. It is possible this may change in the future, but you always buy on the performance of today instead of the promise of tomorrow. If you need productivity performance, go with the non 3D versions of what you need. If you need gaming performance, wait for the 7800X3D where the split design will not be a problem and it will be a lot cheaper.
All I know is I came from a 2700x and with this knew am5 build with the 7950x3d I'm blown away. Always to just once build a pc that I didn't really need and pretty happy I did. I look at it like does anyone really need a Lamborghini? Nah but people buy em.
Probably better to not use the AMD core scheduler (the one that parks half the CPU when gaming) by disabling the Game Bar for instance. Then use something like Process Lasso to set the processor affinity to the 3D cache enabled cores for games. Other processes will go to the other cores, which should result in a better experience when doing other stuff in the background.
@@Jisu1337 Still doesnt matter. For games you pretty much have 7800x3d at twise the price. This is the problem AMD hit because they could not make a stable low latency interconnect so only 1 CCD has 3d cache. For games , the sheduler just disables half the CPU. Then for productivity it becomes a lesser version if its "little" brother because most programs cant use 3d cache and use the standard but prioritize the higher clocked cores. If you are willing to fundle arround with Lasso to move cores arround like on 12-14 gen Intel go for x3d but for people that just want something good enaungh out of the box with no fondling stuff they dont understand - the standard X version is far better and cheaper option. Also if you go for vitualization - dont bother with X3d , it cant use the 3d cache and you can passthru only 1 CCD , same problem with Intel's idiotic E cores.
@@rockbut3039 It's not worth the extra money, just go with the regular Ryzen 7900 series, save you some cash and maybe use that extra cash to put it toward other upgrades like video card.
When I built my system around last year's black Friday sales I was really hung in what CPU to use , between the 7700x or 7600x . Knew the x3D chips where coming but didn't know when or how well they would perform . Went with the 7600x and figured I would upgrade to a x3D chip if it shown a respectable improvement over the 7600x I'm a gamer so gaming performance and cost per frame are king in my book. I have no interest in the 7950x3d , too much $ for the fps and headaches needed to get it to peak fps. Now the 7800x3d I'm really looking forward to seeing the benchmarks in mmorpgs and rts . Really wish there was a microcenter in the Seattle area , closest MC is in Los Angeles. Over 1100 miles away! For our European friends it's the equivalent distance from London to Rome .
@@JohnDoeC78 Newegg is a hoot ! But sometimes I like to go down and look at what's on display. Being able to look at a computer monitor in person means a lot to me
I did everything that was recommended to do(which anybody that knows about building a PC should do) upon installing the new 7950X3D and it made a huge difference in games. I really don’t know how you getting only 1% faster than the 7950X, but there’s something definitely wrong with your setup dude. First of all, you didn’t even show the resource monitor or AMD Ryzen master to see if vcache CCD is actually working. You could have avoided the benchmarks and just gone with that in the first place. I had no issues whatsoever with my 7950x3D. All I did was update bios and chipset drivers and it was ready. I swear I don’t understand all this crap being put out there about the 7950X3D being so complicated. IT’S NOT!! You either hating for whatever reason or you really don’t know what you are doing. At 1440p I’ve gone from 190/200 fps to 260/265 fps in Warzone from 7950X to now 7950X3D. On average I’ve gained about 25-35% more performance on games compared to my old 7950X. This shit is ridiculous dude how these youtubers have nothing better to do than just hate or don’t have a clue what they are doing. Now it seems it’s because of a slower GPU? 🤦♂️ there’s folks with 3080 GPU’s getting same 20-30% performance increase. Lol
Thanks for adding to the problem and misinformation….you are defiant because your experience doesn’t match others and you refuse to understand the real issue here. This should not be a thing. Every single one of these issues (they are real) can be avoided using any other AMD processor or waiting for the 7800x3d. 7950x is fantastic for gaming and productivity at a great price. A proper hardware scheduler wasn’t used here and the plug and play gaming SKU (7800x3d) was delayed on purpose. The marketing hype worked, and you are their cheerleader. Wonderful.😂
He used an AMD GPU which was probably the biggest issue. The x3D chips make a lot more sense for people using Nivdia GPUs. If you're using an AMD GPU, the x3D doesn't make a lot of sense. Ironic, I know.
What I saw on the set up, you did not mention If you upgraded your chipset drivers for the x3d processor, it needs to be done. If you did this then I can see where your coming from with you conclusion.
Awesome video and very interesting information on both of those CPU's, but I think i'll just stick it out with my 5950X paired up with the RX6900XT for now I mean I am happy and satisfied with the performance I get with that setup.
If you're running at over 1080P the 5950X is going to be great for a long time to come. Your GPU will be the bottle neck for at least a couple more gens.
@@ghostofdre well really the gpu isn't a bottleneck simply because what game out there is ever powering demanding to say such though. All the games I do play IE Destiny 2, COD mw2, gears 5, elden ring, horizon zero dawn, God of War I play at 4k and performance is phenomenal. So honestly upgrading to the latest and greatest is pointless in my opinion
I appreciate you addressing this. I unsubscribed from Hardware Unboxed yesterday because all of their benchmarks are done exclusively @1080p. He runs a 4090 @1080p. I just can't watch that insanity anymore. The information is misleading people who play at 1440p and 4k. It is very important that you check benchmarks that run at the resolution you play at. I play exclusively @4k. I paired a 4090 with a r7 7800x3d instead of a 7950x3d because the extra processing power would give me 0 extra performance. I saved 400$ on my build at the cost of 0 performance. 7800x3d and 7950x3d are within a 1% difference and the 5800x3d is essentially the same in 9/10 games @4k. Games are GPU bound @4k. You only need a CPU powerful enough not to bottleneck your GPU @4k. This is counterintuitive because you would think you would need to pair the fastes CPU with the fastest GPU in order not to bottleneck. That is true at 1080p but not at 4k and not with ATI cards. So if you play @4k do yourself a favor and pair your card with an appropriate CPU. You don't need the fastest possible CPU the processing power is completely wasted.
When testing CPUs, it's better to test at 1080p to remove any GPU bottlenecks. Resolutions don't affect CPU performance as much if not at all, if the GPU is capable of handling it. By testing at 1080p (or lower) we can get a better estimation on the CPU throughput, and get more comparable values since the GPU isn't holding it back at all. it's common sense to be honest.
I think a lot of gamers are simply going to keep waiting for the 7800x3d. These higher end versions don't really make a ton of sense unless you're a gamer who also needs better productivity app performance as well...and then I guess maybe?
I am more into waiting 7600x3d if that ever comes out as budget option....or 8xxx series that will hopefully support higher memory speeds that current ddr5s are offering.
Has there been any leaks or tumours about it? If not then I wouldn't keep your hopes up. AMD probably knows the 6 core v Cache option would be an absolute beast for gamers so much so that it would cannibalise their own entire product stack. Just look at the fact that they delayed the 7800x3d. They knew that everyone would just get the 7800x3d over the more expensive stuff. Its silly really. Hopefully they learn their lesson this generation and realise consumers aren't as stupid as they think and give us the 7600x3d or a 8600x3d in the future.
One really important thing was left out here (at least its important for some people): The power consumption. x3d will have the same performance at significantly lower power consumption. We are talking about figures like +100% for the non x3d in games and even worse if you compare it with intel CPUs...
Holy crap so many people coping hard in the comments: "Just use process lasso" Why would you? Consumer should NEVER EVER feel the need to do its hardware's work. If AMD cant make their hardware to work properly its on them, not on a consumer to do the scheduling. 7950x3d and 7900x3d are terrible products, which does not work properly. The only chip from new lineup which is going to be amazing is 7800x3d. As long as amd wont have its hardware scheduler, like intel's "thread director" no amount of software patches and updates will fix those dual CCD v cache cpus. It is also much harder to schedule since both CCD's can be faster than the other in certain applications. With intel P and E cores its easy: p cores = fast, e cores = slow. High priority task = p cores, low priority task = e cores. If application uses more cores than chip in question has p cores, assign and add more e cores to the process. With amd its much harder for scheduling to assume which CCD should be used for, not all games runs better on v cache CCD, not all software runs faster on non V cache die. How to properly assign them? This gamebar solution is extremely crude and inconsistent which clearly shows in this video. To me even 7950x looks more appealing than both 7900x3d/7950x3d because its much more consistent and reliable, just like any proper cpu should be. If you are thinking of zen 4 3d goodness, 7800x3d is your ONLY option.
I built a 7700x & 4080fe PC, I was going to use the 7700x CPU as a placeholder until the 7800x3D came out but now I'm not sure. We'll see how the benchmarks do.
depends on what resolution you play in. if you play in 1080(which i dont know why anyone would use a 4080 for 1080p) then the cpu upgrade would makes sense. 1440p would be questionable. 4k wouldnt make sense as that falls almost entirely on the gpu. unless its a game that gets well over 120fps in 4k. then cpu would matter
that's because you are suppose to disable CCD2 since that doesn't have the vcache. So basically if you have the 7900x3d, it's game mode reduces it to a 6 core cpu but it's clocks can go far higher multicore that it can using all 12.
i just bought the 7950x3d from microcenter but it was bundle with a x670e board and 32gb of ram at 6000mhz paid 650 i think or 670 so it wasn't a bad buy
Bought the same bundle, except I sold the ram and bought trident ram 6000mhz 32. Did you do the 3d cache process? To get the core parking to work? I did and it was way worth it. Make sure your getting all your performance
@@killerbsting1621 i did the exact same and gave the kit the bundle came with and gave it to my cuzin im having no issues with my 7950x3d no crashes everything is alot smoother and im coming from a 13700k yes all my cores are working normal so im happy with my buy
This is a great no nonsense video about gaming PCs. A must watch! That little 5800x3d, B450, and a overclocked/undervolt 6800xt from the clean used market today is a true winner.
I'm not sure did I understand all the technical details, but looks like Intel 13600K will be much cheaper and easier to setup as a gaming rig paired with 4090. Thanks!
I'm on the fence on which one to obtain. I am mostly into motion graphics. (ie:3D etc) Which one would be faster at rendering ? And which one would be the best bang for the buck? (Would probably use it with a 4090)
Just disable ccd1 and be done with it lol. No scheduler, no cross cc'd latency. Ccd0 runs 250mhz faster than the 7800x3d. If you need more cores just enable ccd1 in bios.
The multi CCD's X3D CPU bios and windows settings, and software requirement is a Chinese fire drill to get it all working correctly, and that's IF the game you want to play actually uses the V-Cache -not all games do. AMD is on record stating that the "best" gains are experienced with a $1600-$2300 RTX 4090. And the regular AMD and Intel's CPUs aren't lowered clocked like the X3D Cores are to mitigate the extra heat from those cores. Bottom line: a lot of money to gain the "best gaming CPUs" differences. I'd go with a 9950X, 14900K or upcoming Ultra 285Ks. Or if you ONLY game, a 7800X3D/9800X3D with single CCDs.
Really wish they had committed and put the vcash on both sides. That video card is a bottle neck, but 4090 is stupid impractical in price. Many games both chips are fast enough to hit their software limits as well, in that the x3d does some better future proofing as we should more aggressive games come out. Though in general the benefits are niche. Also the whole is my OS and bios etc actually routing to thw cache correctly is a huge unknown and potential bottle neck.
At 1440p mix of low settings with very high textures and view distance I average 600fps in Fortnite with the 7950X3D and RTX 4090. With my 7950X I averaged around 500fps. For me it's a pretty huge boost.
Reminder to everyone: "Realistic" settings are irrelevant. The whole point of CPU benchmarking is to determine the objective and absolute differences in performance. Benchmarks are not a buyer's guide. They have never been a buyer's guide. Do not treat benchmarks as a buyer's guide. The argument that CPU scaling does not present itself with x GPU and y resolution is irrelevant. People upgrade their GPUs more often than they upgrade their CPUs. Furthermore, you do not want a CPU bottleneck. I do not know why anyone would be excited about a CPU bottleneck. You want your GPU to be running at 100% utilisation during gaming, and if you have a lot of CPU performance overhead, then that means you don't need to upgrade your CPU for years to come.
That Video got out before the Chipset updates i think. The 7950x3d performs way better in 1080 and 1440p. Before that update, the cpu didnt utilize the bigger cash.
Your conclusion that you didn't see the same results as other RUclipsrs benchmarking this is because you used a AMD GPU instead of an Nvidia GPU makes no sense whatsoever. Having an AMD GPU vs Nvidia should make NO difference on a game being able to take advantage of the cpus Vcache of the X3D. Are you really saying people need an Nvidia GPU if they are going to use the X3D?!?!? There is no way this can be true, this honestly points to a problem somewhere in the setup. EDIT: I 100% bet that in all the games you tested, the games were not using the correct CCD. I bet that if you did like Hardware Unboxed did and disabled the CCD without the extra Vcache to "simulate" a 7800X3D and to make sure the games aren't utilizing the wrong CCD, that your results would show the X3D pull way more frames. I still can't believe though that you suggested it was because you used a Radeon GPU with why the X3D results weren't higher than the non-X3D version.
Thanks for talking through your thought processes during benchmarking! Let's be honest, its all about the 7800x3d. I'm looking forward to being upset about the price in the UK in a couple of weeks.
I'm confused, I just watched two other youtubers testing the 7950X vs the 7950X3d in similar scenarios as in this video, and the 7950X3d was always faster in gaming! How is this possible!?!
are you sure that you have the latest chipset drivers? you might like to also try using CPU affinity to restrict the game to the CCD that has the extra cache. you can di that using the commandline commant start and include a CPU affinity flag to create a shortcut.
May be is the motherboard issue, not optimized well yet with a bios update, i saw some MBs perform way worse because are not optimized good yet even after they new update, or win. update can be also the issue, or some missing bios setting, but if you have the chance and time, try with some other good motherboards. The difference between both cps with 4090 and 7900xtx was very small, like 3% if i remember it right, i mean 4090 is getting around 3% more fps on 1080p between both CPUs than with 70900xtx, if i remember it right, was very small the difference.
I may have missed it but did you update your chipset for 3DVcache on your system after you installed the new chip? Those numbers seem off just a bit. But also under the impression the 3dvcache chips are better for games that require large amount of simulation. Space engineers, Stellaris, Universe sandbox, Ultimate Epic Battle Simulator, etc. However if I'm wrong on that someone let me know. Thinking about grabbing a 7800x3d soon when it releases.
@@MrFRNTIK I'm sick and tired of seeing AMD fanboys like you saying this person thrashes AMD when most Tech RUclipsrs I followed tell it like it is and not trying to be a shill and Bret is using Ryzen with a Radeon GPU. I love Nvidia's GPU for my own reason but that doesn't mean I'm not gonna call them out whenever they do stupid stuff like hijacking prices then they know they are being greedy pigs
@DaVillain he does though. I have never seen him say one good thing about amd. And if you have an example of that I'm sure he was reluctant in his praise.
@@potatoes5829 what conspiracy theory? I didn't know stating the obvious was akin to conspiracy. Idc what you use in your system. I personally hate the pricing of the rx7900xt.
I've learned my lesson one too many times going AMD over intel/NVIDIA to save a dollar, and I've never had an issue with either one. Unfortunately AMD had always suffered in comparison.
You can try disabling 8 cores of the CPU to simulate 7800x3d like Hardware Unboxed did. According to them that's the best gaming CPU available right now
Looking like it's picking up the 0.1% as the cache is benefiting the ram fetching. Ironically as you have so much vram I'd imagine it's hardly tapping the ram . I suspect in an 8gb card it might even run those better. This explains the 5800x3d more as ddr4
Thank you for your due diligence and financial sacrifice buying and testing this chip. I don't normally subscribe to the tech channels I watch but you just saved me from a bad purchase, so you've more than earned it in my book.
I purchased the regular 7950x and was wondering if a made a bad choice and not waiting for a x3d. I'm glad I didn't wait and purchased this. Really happy my processor and performs exactly how I need it to.
I upgraded to 5800x3D this year from my 3900x. (paired with 3070). That was definitely worthwhile. I love being GPU bound and having way less defined hitches if there's a technical issue with a game.
well, i have the 5800x and another 3600XT and a 3600 non X(t) and one 2600 and i must say, from a thermal standpoint, i like the monolithic design much better. The Chiplet Design may bring advantages to the manufacturing process as well as the profits of AMD, but with this 3d V-Cache it is one more compromise on the table, which i am not willing to pay for, anymore! Not saying i go to the blue team either. no, i will stay with my main setup with the 5800X and wait what happens next with Zen 3 Refresh or Zen4 and enjoy the low DDR4 Prices in order to fill up my 16 ram banks (on 4 Boards) with cheap 3600MT/s Ram.
Hardware is designed with future performance demand in mind. Current software is not yet optimized for this chip design yet. The next generation of games will show a more substantial difference
This is a lot of help in perspective for me. I feel any system or parts I buy now will live beyond my primary system. My current system, when I hang it up, will be my next step to having a server. I love my i9 3900x and the product line as it will help me create a good server once I move up from it. I try to get 10+ years out of my hardware. I have a system my kids have now Intel 2700k that is gaming Destiny 2 at near competitiveness levels with an Nvidia 980 ti. That Ryzen 7950 non-3d is steller. Even the 3d version can still do the work, in the long term. After 5 years of gaming, it would be a spectacular game server host and more.
my girlfriend was curious why i wanted to upgrade her from the 5900x to a 7800x3d, but was super excited to hear I wanted to use it for a system to host game servers
It is the temperature and TDP differences that matter. The 7950X3D can do the same performance at lower power consumption and temperature. Thats enough.
Rofl bros how are so many people having these issues? It's a plug and play experience once you update the bios and install chipset. I don't even have Xbox game bar or my system on balanced and the Window's scheduler still knows what to do. The difference between the two is huge especially at 1080p and I'm using an xtx. The host needs to step his game up. How can you be a tech youtuber and have issues installing a cpu man!
Basically clueless.... Here's some pointers it'll help you out: 1.Install AMD x3d optimizer drivers 2. Install Windows game bar 3. Verify PBS in your bios says to auto 4. Verify Core park in task manager is actually parking 9 -16 cores while in games. After you do all this.... You will find a noticeable game.
I'm looking to build a PC with an RTX 4090 to game on 4K. I'm more interested in the future. Would future games run better on the 7950x3D thanks to its cache or would it be the same as the 7950x? I'm not planning on upgrading the PC for many years to come, so I want to get the best of the best and therefore be future-proof.
@@Technae The 5.7GHz clockspeed of the 7950X3D is on the CCD without v-cache only. The v-cache CCD runs at about 5GHz, so the 7800X3D will be about the same.
@@isakh8565 hahahaha no! Seems like you are one of those who doesnt understand that the 7950x3d vcache die boosts 300mhz higher and the 7800x3d is gimped…
This seems like a huge win for me. Who bought a 5800x3d at launch and decided to skip this generation. My numbers are better then yours in some cases using it and a 3080ti. ❤❤
That's what happens when only one chiplet gets the 3D vcache. AMD delayed the launch of 7800X3D on purpose because they know that that CPU will put the 7950X3D and 7900X3D to shame and will just end up to no one buying them hence less profits lol.
That has nothing to do with the performance. Those games wouldn't have utilised more than 8 cores anyway. It would not help. And if they did, it would only hurt performance given the latency between the CCDs, even if both had V-cache.
Closest Micro Center to me is over 450mi away. Never will and never have been to Micro Cent. Keeping to online stores and into computer stores for my computer stuff I guess
To use the X3D CPUs properly, you need to first do a BIOS update, then install the chipset drivers and any Windows Updates. I also highly recommend ASRock motherboards like the Steel Legend or Taichi and GSkill TridentZ5 Neo RAM kits that are EXPO certified.
I could understand some people not wanting to pull the trigger on this CPU just yet. I saw some of the reviews on the 7950X3D against its non-3D version (7950X) and Intel’s 13900K. While the 7950X3D gives those people who game an advantage with the 3D V-cache technology, the drawback is that you have to contend with a reduced core clock on the cores getting the Cache, which may hinder performance in applications requiring the higher clocks. In a way, AMD’s Ryzen 7000X3D series is a little of a double-edged sword. It is showing the capabilities of having the best of both worlds, but it is also revealing why AMD did not attempt to try this first with the 5000 series (AM4 socket) before on to the 7000 series (AM5 socket).
This Chip just needs manual tuning... when it juggles between the best core of ccd1 and ccd2 your screwed.... since 1 CCD has the 3dVcache the other doesn't... if you really want the most of it it is going to be a pain in the ass to set core affinity manually. so for the enthusiast who is willing to invest time and effort into this matter until the windows scheduler can properly work with this chip. even the non X3d's sometimes load the slower cores when it throws around the threads. The Reason of this being intel for example has P and E cores, those have a designated naming when you check them out with Hwinfo for example, the X3D only shows as "P" cores. My opinion is that if they would have named the cores differently on the CCD that holds the vcache the whole problem is resolved. Then Windows can see the difference between a "P" and an "E" core for example. to me this is a design flaw. if the CCD with the X3D goodies would be named P+ and the regulars P the system can tell which will be the faster cores through the sceduler.
question is... does the x3d variant age like fine wine as amd does, and future programs and games will be able to leverage the cache enough or soon enough before u would move on to the next gen anyways
I would like to see what this test would look like with the same setup and updated bios to the current versions. Hadn’t upgraded my motherboard since may of 2023 and the update from February fixed every original problem I ran into at the time of building my PC(built newest gen PC February/March of 2023.
► Big thanks to Micro Center for sponsoring today’s video: New Customers Exclusive - Get $25 off your purchase of any AMD and Intel Processor (limit one per customer): micro.center/nv7
Check out Micro Center's PC Builder: micro.center/a3b
Submit your build to Micro Center's Build Showcase: micro.center/nmk
Shop LG 27UP850N-B 27" 4K UHD: micro.center/icp
bra go in bios and select prefer cache and ure fine :)
Just let me buy it and you can chill with your 7950x.
@@SnooZeR_NoR 😆
Hey, small detail. In the video you said that both CCDs run at the same frequency. This might not be true. The 3D v-cache CCD runs a bit lower, while the other CCD keeps the clocks up, or at least that is what I see others saying.
@@entity-bl5og cache ccd max 5250mhz non cache ccd 5000mhz pbo off
Seems you regret everything. Regretted the 7800x3d, regretted the 7950x3d. Maybe just stop buying stuff altogether.
he should try milking a chair, works for me.
Yeah was about to say the same thing, I thought 'didn't he also make a video about regretting the 7800x3d?' and yeah he did lmao
nailed kkkk
Why would anyone ever regret a 7800X3D? Things a monster.
😂😂😂😂😂😂😂
There are so many other tech tubers that have reviewed the 7950x3d and accurately capture the difference between the 3d and non 3d parts. You've obviously done something wrong here man. Check the countless other reviews online.
Too much advertising to get to the point...I understand it is necessary but with limits!
so i was looking up "7950x in 2024" to see what other people think of it who also have one. came across your video and even though its late, i have really good things to say about this cpu and its gonna be a good processor for a while (so consider this gem if you find it at a reasonable price!). Yea, its hard to cool, but most 360 AIO's are good enough for the job. other than that minor draw back, this chip just does anything you ask of it. and its multi tasking performance is super quick and snappy. its super efficient with an undervolt, most of these cpus can handle a -30 all core offset and typically run faster when you do this (tip: set custom temp limit to 89c, you'll cut allot of heat out and it will rarely ever reach this limit even on 100% load for either x or x3d variant).
If you do engineering, 3d work, gaming, whatever this cpu crushes it all and it does it all well. It is a master jack of all trades. I see why this was labeled pretty much neck to neck with the 13900k at launch. Its simply put a great processor.
I love mine. Nice and smooth on everything you ask it to do.
Windows OS scheduler needs a big update, to be able to use the X3D chips efficiently.
Dude, don’t believe all this shit about 7950X3D having issues with scheduling. I’ve ran tons of games on this new CPU and haven’t found a single issue. Furthermore, I’m getting around 20-35% more performance than my old 7950X depending on the games. I don’t know where this guy came up with only 1% and the GPU theory definitely has nothing to do with it.
@@MustafaGT would actually love to see a video showing the performance in games, I've seen a few reviews but you never know when a youtuber gets a binned chip or not
@@equiknox Yes
Works well already with my 5800X3D.
@@fragalot Don't compare your previous generation cpu witch has one CCD. Both 7900X3D and 7950X3D work differently: they have one CCD with higher boost clocks and one CCD with 3D V-cache and lower boost clocks. That's why I talked about a better scheduler. Linux performance may be better. Check what Wendell from L1techs said. Note: 7800X3D has one CCD, so it is similar to 5800X3D and thus it does not need a better scheduler. A scheduler pushes the processes to the available cores, but in the case of 7900X3D and 7950X3D the scheduler must push all the game processes to the suitable cores, in other words to the CCD cores with the 3D V-cache.
the 7800x3d will be the one that makes sense, the difference between the 7950x3d and the og 7950x is pretty small in most cases, especially once you run the game at settings that most people with a 1000$+ gpu would reasonably be targeting
"This is the chip that AMD made everyone wait for..." - Brett; yeah nah, the 7800X3D is the chip that AMD is making everyone wait for 🤣
when i bought my 7950x for 390 it feels like robbery in comparison. knew i couldnt trust microsoft to not fuck up os scheduler
@@olliec6267 noone needs the scheduler just select prefer cache in bios and you are fine
@@olliec6267 its not microsofts fault AMD decided to drop a problem on them without having a solid software solution for them. People knew this was gonna be an issue and to wait for the 7800X3D not just cause it was cheaper but because simpler. Way less to screw up and vcache was only gonna be on one ccd.
That's what I believe as well, and I think it has something to do with the dual CCDs of the 7950x3D with only one of them having the 3D V-cache. The 7800x3D will only have one CCD. Tech Yes City simulated the 7800x3D with the 7950x3D and saw improvements by disabling the non-3d cache CCD.
Next time, when you compare CPUs, compare their TDPs. Cause 7950X3D takes almost half the energy that 7950X does. You won't need to cool as much etc.
What is required in my case (7900X3D) is to force the CCD priority in the actual BIOS with the Cache option. Not relying on the XBOX Game Bar in Windows (Windows 11 in my case) because even with just relying on that it doesn't actually run my games on the 3D vCache CCD0, by default it runs everything I do on CCD1 only. But going to the BIOS and forcing the priority for CCD0 there worked and everything I do now in Windows (gaming, etc) first runs on CCD0.
However, it IS a problem. No one should have to force that through the BIOS and it should absolutely be a "set and forget" thing you do maybe once in Windows and that's it. It should just work out of the box. It DOES work for some people out there and they'd claim that there is no scheduler-related issues. That's normal... because they don't experience the problem, obviously. But there ARE cases where - for some reason - JUST relying on Game Bar just won't cut it. I'm one of those cases, and forcing it via the BIOS was my only solution.
I run a 7950x3d and set the bios to prefer frequency, then used process lasso to set all the processes in my games folder to run on cache. The OS, and all other tasks run on frequency while my games get the whole cache CCD. The performance is insane.
😅😊😊😊
You may not see much of a difference in 1080p with fps in games where the 7950x was already pretty much hitting the engine limit. The 7800x3d is targeted specifically at gaming. The 7950x3d is for both gaming with the x3d enabled ccd and productivity apps the non x3d ccd with higher clocks. Another way to think of the 7950x3d is kind of like a higher clocked 7700x and a 7800x3d combined.
I'm looking at an X3D for my next system because of the huge advantages in Blender, but if I wanted a pure gaming machine then I probably wouldn't bother. We are in a strange position at the moment where a lot of the very best tech doesn't really make sense in a bang-for-buck way for people who just want to play games.
Went from a 5950x woth 4080 to a 7950x3D and it runs phenominally. Talking 30-40% fps gains, CPU bottlenecks are real. Well worth it, the whiny reviewers who say otherwise... meh, just negative titles for more views and comments.
I got one and love it.. my main focus was great gaming performance with lower power consumption. Some of the other chips use way more then I'm comfortable with and therefore throw off a ton more heat as well. I'm willing to take some minor loses to have a power efficient rig while gaming.
Have you updated the chipset drivers? I play heavily scripted and ai games and notice the first 16 threads kicking these scenarios. It's just insane. The vm in my game just doesn't get overburdened..and if only a microsecond. Really insane performance thru that cache. I wish I could try my x4 100 billion dollar galaxy save with it. My old pc did slide shows of 4fps on that. Sadly my save got corrupted in an engine upgrade.. bet this baby pulls that back up to 30+ fps. Updating a universe economy in every frame of the game simply was too much for my old cpu.
And that's a badly threaded game - 4-6 threads or something like it. Actually use a game whose kernel needs that extra cache for the extra ai and scripts that need to run every frame. That's when you'll notice it.
"Fuel efficient race car". Pick one.
@@paradoxicalcat7173 some of the fastest cars these days are more fuel efficient.. take an electric vehicle example 2023 Tesla Model S Plaid
Top Speed: 200 MPH
0-60 Time: 2.1 seconds
@@paradoxicalcat7173well what if you wanna go fast and not have to burn through gas at the same time? Electricity doesn't grow on trees and a higher TDP => higher heat levels => higher AC usage => higher overall electricity expenses, especially in today's world.
I've got one myself, super happy, but I'm upgrading from a 6850k so almost anything would be better. What I'm really curious about is those games that normally can't ever seem to run well like MS Flight simulator, where things get pushed in weird ways. I also want to test it when running slower RAM. I plan on upgrading to 128GB pretty soon, and its a known issue that you have to down clock your memory (4800 is often all motherboards will list as compatible) And maybe the cache will help deal with that.
why so much ram?.. are you putting the OS on the memory?
@@EnhancedCognition Ha, no. I run Blender and Unreal engine. I also play with some Ai stuff.
You should get the new 192GB kit heard it can run at 5200Mhz with the 7950x (7950x3D wasn't available at the time). It should be available in the next month or so for AMD.
I have a tuf gaming b650m plus Wi-Fi, with 128g currently running on 4600 (expo 3600 -> 4600)
@@Martin_Speed I'm super curious about those kits for sure.
You get better 1% and 0.1% lows with 5800X3d and 7xx0X3D which ultimately gives you smoother gameplay in 4K or VR, it's not so much about the 1-2% maximum FPS.
True
And a massive MASSIVE boost of fps in VR chat
@@michel333alfa-kun3 Oh, what are you getting?
@@Vernas_R i don't have it and i barely play VR chat but i saw Tupper showing something around 4ms CPU frametime in front of a mirror full of unoptimized avatars in discord
I'm still rocking with the 5950X.
Nobody cares
I got myself the 7950X3D and I am really pleased with it. Though, I refused to go with AMD's scheduler and instead use Process Lasso to handle it, and it works wonders since I can run the games on the v-cache CCD and all other apps on the second CCD without them fighting over resources like they would if you just park the second CCD that AMD does currently.
So if one uses Process Lasso, it is absolutely amazing for gaming + content creation like recording / streaming etc, the power efficiency on the X3D makes it really nice for workstation machine as well.
Comparing it to my 5950X at 3440x1440, 4x AA and Ultra settings it boosted the FPS massively (+100-150fps) in many VR worlds on both ChilloutVR and VRChat, and also boosted the FPS very well in VR, especially minimums and reprojections, with a Pimax 8KX which is 2x 4K resolutions.
So in my opinion who games on my workstation machine, the 7950X3D is far better than the 7950X with both performance and power efficiency in all my scenarios, especially if one uses Process Lasso over AMD's scheduler with XBox Game bar.
big brain time
What's CCD if I may ask
@@wahahah AMDs high end CPUs are dis-aggregated so instead of having all of the cores on a single chip, they split them between 2 chips. So rather than having 1 chip with 16 cores, you'll have 2 chips with 8 cores each. On the 7950x3D, one of those 8 core chips has extra v cache, and the other chip has higher clock speeds. So depending on the game, it might be beneficial to disable one CCD or the other to maximize performance. Using a program like process lasso can help with that.
@@MistyKathrine With Process Lasso you don't have to disable one CCD, you keep both CCDs active but restrict a process to only run on one of them at all time, thus actually utilizing the entire cpu instead of disabling half like AMD's scheduler does atm.
@@FroggoVR Well yeah, but trying to explain it in a way that makes it easy to understand. With process lasso you're forcing the program to only use cores on the CCD you want.
► Big thanks to Micro Center for sponsoring today’s video: New Customers Exclusive - Get $25 off your purchase of any AMD and Intel Processor (limit one per customer): micro.center/nv7
Shop LG 27UP850N-B 27" 4K UHD: micro.center/icp
Check out Micro Center's PC Builder: micro.center/a3b
Submit your build to Micro Center's Build Showcase: micro.center/nmk
Weird, I love mine. 3 months I've had it, zero regrets, its blazing fast for my rendering and compiling, while also being insanely good in games. Literally the best of both worlds.
Does it make your overall PC experience snappier? Do games load faster vs 8 cores?
@@Xilent1 I upgraded from an old 3700x so yeah everything is WAY faster. Everything. My partner has a 5800x3d in her PC and the difference between hers and mine is also quite pronounced. I'm not jealous of her anymore lets just say lol.
@@EloyBushida Thanks! That's what I have now a 3700x. I'm trying out GeForce Now which uses a 16-core CPU (Not Named). Made me consider skipping over the 5800x3D to a 5950x or even a 7950x3D for a new build. I'm thinking between the two from what I'm trying to achieve, the 5950x should work just as fine as the 7950x3D minus the gaming improvements which I'm fine with for now. Looking to make my PC more snappier if that makes sense and not just graphics this upgrade or new build.
@@Xilent1 I totally get what you mean, before the 3700x I had an old 4 core intel chip and everything felt so sluggish, I was just dying for some zip!
I think whichever you go with you will notice a difference for sure, but I would recommend maybe the 7900x if price is a factor? It tends to be around the price of a 5950x but its faster in like 90% of use cases, and on a newer platform which is always nice. Get some fast 6000 ddr5 and an m2 ssd to go with it, and you're really cookin!
@@EloyBushida hear you and thanks for the advice. Its not much of a price difference so I'll just go 7950x3D over 7900x. Definitely will use your RAM advice. Going to wait and see if the 9x00X3D bring the 7x00 series down in price at Microcenter. 5080 might not release till next year, so I'll wait for Sales for everything available now.
What's crazy about PCs today is a Raspberry Pi4 is about as powerful as a PC 10-20 years ago. Think about that, more so when it uses the same power as a single one of those NZXT fans, and this system has 17 of those running.
Makes you wonder, when will we reach the stage where fans to cool a PC/compter are a thing of the past.
Look at solid state cooling ;)
Did you ever try disabling the non-3d v-cache cores? Other reviewers noted seeing better performance this way. The point still stands that due to the way the split design is handled, the 7900X3D and 7950X3D should not be purchased. It is possible this may change in the future, but you always buy on the performance of today instead of the promise of tomorrow. If you need productivity performance, go with the non 3D versions of what you need. If you need gaming performance, wait for the 7800X3D where the split design will not be a problem and it will be a lot cheaper.
they released an updated chipset driver that fixes this. it makes enabling game bar turn on the CCX handler
All I know is I came from a 2700x and with this knew am5 build with the 7950x3d I'm blown away. Always to just once build a pc that I didn't really need and pretty happy I did. I look at it like does anyone really need a Lamborghini? Nah but people buy em.
Probably better to not use the AMD core scheduler (the one that parks half the CPU when gaming) by disabling the Game Bar for instance.
Then use something like Process Lasso to set the processor affinity to the 3D cache enabled cores for games.
Other processes will go to the other cores, which should result in a better experience when doing other stuff in the background.
The core scheduler didnt work but works now.
@@Jisu1337 Still doesnt matter. For games you pretty much have 7800x3d at twise the price. This is the problem AMD hit because they could not make a stable low latency interconnect so only 1 CCD has 3d cache. For games , the sheduler just disables half the CPU. Then for productivity it becomes a lesser version if its "little" brother because most programs cant use 3d cache and use the standard but prioritize the higher clocked cores.
If you are willing to fundle arround with Lasso to move cores arround like on 12-14 gen Intel go for x3d but for people that just want something good enaungh out of the box with no fondling stuff they dont understand - the standard X version is far better and cheaper option.
Also if you go for vitualization - dont bother with X3d , it cant use the 3d cache and you can passthru only 1 CCD , same problem with Intel's idiotic E cores.
have you tried something like process lasso for locking certain tasks to certain cores instead of relying on windows or gamebar
I went with the 7700X, it had the best price/performance at the time and wasn't that much slower than the 7950X...
I upgraded from that to the 7900x3d. Not much difference, probably 10% better if that
@@SwordnScale Well I didn't, went with the Ryzen 7700x.
@@brianwolfgaming1452 am planning on buying the 7900x3d...u think its worth
@@rockbut3039 It's not worth the extra money, just go with the regular Ryzen 7900 series, save you some cash and maybe use that extra cash to put it toward other upgrades like video card.
@@brianwolfgaming1452 wait cos I was gonna get the rysen 9 7950x3d, u think that not worth it? Or do u think I should get the normal rysen 9 7950?
Staying on my 5950x until it can't do what I want it to do anymore :)
5950x here aswell all this about 7950x3d blowing up ;(
@@frallorfrallor3410i held my credit card until i remembered to check benchmarks.. now im confused 😢
When I built my system around last year's black Friday sales I was really hung in what CPU to use , between the 7700x or 7600x .
Knew the x3D chips where coming but didn't know when or how well they would perform .
Went with the 7600x and figured I would upgrade to a x3D chip if it shown a respectable improvement over the 7600x
I'm a gamer so gaming performance and cost per frame are king in my book.
I have no interest in the 7950x3d , too much $ for the fps and headaches needed to get it to peak fps.
Now the 7800x3d I'm really looking forward to seeing the benchmarks in mmorpgs and rts .
Really wish there was a microcenter in the Seattle area , closest MC is in Los Angeles.
Over 1100 miles away!
For our European friends it's the equivalent distance from London to Rome .
But you can buy online and get it posted... Like everyone else? Lol.
@@JohnDoeC78 Newegg is a hoot !
But sometimes I like to go down and look at what's on display.
Being able to look at a computer monitor in person means a lot to me
I did everything that was recommended to do(which anybody that knows about building a PC should do) upon installing the new 7950X3D and it made a huge difference in games. I really don’t know how you getting only 1% faster than the 7950X, but there’s something definitely wrong with your setup dude. First of all, you didn’t even show the resource monitor or AMD Ryzen master to see if vcache CCD is actually working. You could have avoided the benchmarks and just gone with that in the first place. I had no issues whatsoever with my 7950x3D. All I did was update bios and chipset drivers and it was ready. I swear I don’t understand all this crap being put out there about the 7950X3D being so complicated. IT’S NOT!! You either hating for whatever reason or you really don’t know what you are doing. At 1440p I’ve gone from 190/200 fps to 260/265 fps in Warzone from 7950X to now 7950X3D. On average I’ve gained about 25-35% more performance on games compared to my old 7950X. This shit is ridiculous dude how these youtubers have nothing better to do than just hate or don’t have a clue what they are doing. Now it seems it’s because of a slower GPU? 🤦♂️ there’s folks with 3080 GPU’s getting same 20-30% performance increase. Lol
Thanks for adding to the problem and misinformation….you are defiant because your experience doesn’t match others and you refuse to understand the real issue here. This should not be a thing. Every single one of these issues (they are real) can be avoided using any other AMD processor or waiting for the 7800x3d.
7950x is fantastic for gaming and productivity at a great price. A proper hardware scheduler wasn’t used here and the plug and play gaming SKU (7800x3d) was delayed on purpose. The marketing hype worked, and you are their cheerleader. Wonderful.😂
11
What mobo and ram are you using?
It’s on average across his benchmarking suite of a number of games. Which I feel like he stated a bunch of times through out the video
He used an AMD GPU which was probably the biggest issue. The x3D chips make a lot more sense for people using Nivdia GPUs. If you're using an AMD GPU, the x3D doesn't make a lot of sense. Ironic, I know.
Are you ever going to do a follow up here? These are regarded as god-tier CPUs so I don't think the problem is the processor, it's the user..
I just went with the 7900, I think it's the best one for me and what I do, and I'm always happy with the induced cooler as I never overclock.
What I saw on the set up, you did not mention If you upgraded your chipset drivers for the x3d processor, it needs to be done. If you did this then I can see where your coming from with you conclusion.
Awesome video and very interesting information on both of those CPU's, but I think i'll just stick it out with my 5950X paired up with the RX6900XT for now I mean I am happy and satisfied with the performance I get with that setup.
Also, the 5950X is the current king of efficiency (in 3D rendering, specially, in frames/watt).
I have exact same setup as well. Still king in my book for what I use it for.
@@Darkk6969 yup and honestly there is no need to upgrade this setup is BEAST!
If you're running at over 1080P the 5950X is going to be great for a long time to come. Your GPU will be the bottle neck for at least a couple more gens.
@@ghostofdre well really the gpu isn't a bottleneck simply because what game out there is ever powering demanding to say such though. All the games I do play IE Destiny 2, COD mw2, gears 5, elden ring, horizon zero dawn, God of War I play at 4k and performance is phenomenal. So honestly upgrading to the latest and greatest is pointless in my opinion
I appreciate you addressing this. I unsubscribed from Hardware Unboxed yesterday because all of their benchmarks are done exclusively @1080p. He runs a 4090 @1080p. I just can't watch that insanity anymore. The information is misleading people who play at 1440p and 4k. It is very important that you check benchmarks that run at the resolution you play at. I play exclusively @4k. I paired a 4090 with a r7 7800x3d instead of a 7950x3d because the extra processing power would give me 0 extra performance. I saved 400$ on my build at the cost of 0 performance. 7800x3d and 7950x3d are within a 1% difference and the 5800x3d is essentially the same in 9/10 games @4k. Games are GPU bound @4k. You only need a CPU powerful enough not to bottleneck your GPU @4k. This is counterintuitive because you would think you would need to pair the fastes CPU with the fastest GPU in order not to bottleneck. That is true at 1080p but not at 4k and not with ATI cards. So if you play @4k do yourself a favor and pair your card with an appropriate CPU. You don't need the fastest possible CPU the processing power is completely wasted.
When testing CPUs, it's better to test at 1080p to remove any GPU bottlenecks. Resolutions don't affect CPU performance as much if not at all, if the GPU is capable of handling it. By testing at 1080p (or lower) we can get a better estimation on the CPU throughput, and get more comparable values since the GPU isn't holding it back at all. it's common sense to be honest.
@@Rose-ng2zp I am particularly interested in exactly where the bottleneck is.
Video starts at 5:30 btw
I envy people that actually live near a micro center. Closest one to me is near d.c. and I can't really justify a 6hr round trip there and back.
Did you install Ryzen Master and put the CPU in gaming mode (which disables one of the CCDs)? That made a MASSIVE difference for me.
thank u
best cpu i've ever had. process lasso + this beast is amazing a year later
I think a lot of gamers are simply going to keep waiting for the 7800x3d. These higher end versions don't really make a ton of sense unless you're a gamer who also needs better productivity app performance as well...and then I guess maybe?
Maybe the gaming content creators but most of them would probably prefer intel for quicksync.
Did he mention updating the chipset drivers? I think that's important.
Assuming he did as he flashed the AMD website recommending it at 5:55 and he says he checked of that list.
@@Mercstar With good reasons, master.
@@Jord8043 Good spot. I messed it cause I was multitasking. this is quite sad. im going with intel this year
I am more into waiting 7600x3d if that ever comes out as budget option....or 8xxx series that will hopefully support higher memory speeds that current ddr5s are offering.
Has there been any leaks or tumours about it? If not then I wouldn't keep your hopes up. AMD probably knows the 6 core v Cache option would be an absolute beast for gamers so much so that it would cannibalise their own entire product stack. Just look at the fact that they delayed the 7800x3d. They knew that everyone would just get the 7800x3d over the more expensive stuff. Its silly really. Hopefully they learn their lesson this generation and realise consumers aren't as stupid as they think and give us the 7600x3d or a 8600x3d in the future.
One really important thing was left out here (at least its important for some people): The power consumption. x3d will have the same performance at significantly lower power consumption. We are talking about figures like +100% for the non x3d in games and even worse if you compare it with intel CPUs...
@Brendan "intel doesn't use power" oh yeah ;D their chips are running on air? ;D
Holy crap so many people coping hard in the comments:
"Just use process lasso"
Why would you? Consumer should NEVER EVER feel the need to do its hardware's work. If AMD cant make their hardware to work properly its on them, not on a consumer to do the scheduling.
7950x3d and 7900x3d are terrible products, which does not work properly. The only chip from new lineup which is going to be amazing is 7800x3d. As long as amd wont have its hardware scheduler, like intel's "thread director" no amount of software patches and updates will fix those dual CCD v cache cpus. It is also much harder to schedule since both CCD's can be faster than the other in certain applications. With intel P and E cores its easy: p cores = fast, e cores = slow. High priority task = p cores, low priority task = e cores. If application uses more cores than chip in question has p cores, assign and add more e cores to the process.
With amd its much harder for scheduling to assume which CCD should be used for, not all games runs better on v cache CCD, not all software runs faster on non V cache die. How to properly assign them? This gamebar solution is extremely crude and inconsistent which clearly shows in this video.
To me even 7950x looks more appealing than both 7900x3d/7950x3d because its much more consistent and reliable, just like any proper cpu should be. If you are thinking of zen 4 3d goodness, 7800x3d is your ONLY option.
I built a 7700x & 4080fe PC, I was going to use the 7700x CPU as a placeholder until the 7800x3D came out but now I'm not sure. We'll see how the benchmarks do.
Same. Not so sure I'll bother with this generation's X3D.
depends on what resolution you play in. if you play in 1080(which i dont know why anyone would use a 4080 for 1080p) then the cpu upgrade would makes sense. 1440p would be questionable. 4k wouldnt make sense as that falls almost entirely on the gpu. unless its a game that gets well over 120fps in 4k. then cpu would matter
that's because you are suppose to disable CCD2 since that doesn't have the vcache. So basically if you have the 7900x3d, it's game mode reduces it to a 6 core cpu but it's clocks can go far higher multicore that it can using all 12.
i just bought the 7950x3d from microcenter but it was bundle with a x670e board and 32gb of ram at 6000mhz paid 650 i think or 670 so it wasn't a bad buy
Bought the same bundle, except I sold the ram and bought trident ram 6000mhz 32. Did you do the 3d cache process? To get the core parking to work? I did and it was way worth it. Make sure your getting all your performance
@@killerbsting1621 i did the exact same and gave the kit the bundle came with and gave it to my cuzin im having no issues with my 7950x3d no crashes everything is alot smoother and im coming from a 13700k yes all my cores are working normal so im happy with my buy
The problem is there is massive potential for the 7900X3D and 7950X3D but there not utilised yet.
This is a great no nonsense video about gaming PCs. A must watch! That little 5800x3d, B450, and a overclocked/undervolt 6800xt from the clean used market today is a true winner.
Love Microcenter!! Whenever I am Stateside, I make it a point to go there and stock up on computer parts before returning to VAT heavy Europe.
Returnal was probably the worst game I've ever played.
Edit: Yeah, AMD's GPUs are pretty crappy compared to Nvidia.
I'm not sure did I understand all the technical details, but looks like Intel 13600K will be much cheaper and easier to setup as a gaming rig paired with 4090.
Thanks!
The 7950X3D is selling for $799 for what I had seen online. I grab the I9 13900k for $549 a month ago.
I have the normal 7950X. Got it for almost exactly $600 out the door. I have no Regrets
And now... turn off CCD with no 3d.
And the diff will be a lot bigger.
"And now..."
I tried the 7700x and the 7900x, no regrets. I would keep any if them... I have the 2 of them on sale, the one that I don't sell will stay in my PC
My 7950x3d is so much better than my 7950x was.
I'm on the fence on which one to obtain. I am mostly into motion graphics. (ie:3D etc) Which one would be faster at rendering ? And which one would be the best bang for the buck? (Would probably use it with a 4090)
Just disable ccd1 and be done with it lol. No scheduler, no cross cc'd latency. Ccd0 runs 250mhz faster than the 7800x3d. If you need more cores just enable ccd1 in bios.
Maybe. We will see if the 7800X3D really only goes to 5 Ghz.
The multi CCD's X3D CPU bios and windows settings, and software requirement is a Chinese fire drill to get it all working correctly, and that's IF the game you want to play actually uses the V-Cache -not all games do. AMD is on record stating that the "best" gains are experienced with a $1600-$2300 RTX 4090. And the regular AMD and Intel's CPUs aren't lowered clocked like the X3D Cores are to mitigate the extra heat from those cores. Bottom line: a lot of money to gain the "best gaming CPUs" differences. I'd go with a 9950X, 14900K or upcoming Ultra 285Ks. Or if you ONLY game, a 7800X3D/9800X3D with single CCDs.
Really wish they had committed and put the vcash on both sides. That video card is a bottle neck, but 4090 is stupid impractical in price. Many games both chips are fast enough to hit their software limits as well, in that the x3d does some better future proofing as we should more aggressive games come out. Though in general the benefits are niche. Also the whole is my OS and bios etc actually routing to thw cache correctly is a huge unknown and potential bottle neck.
At 1440p mix of low settings with very high textures and view distance I average 600fps in Fortnite with the 7950X3D and RTX 4090. With my 7950X I averaged around 500fps. For me it's a pretty huge boost.
Reminder to everyone:
"Realistic" settings are irrelevant. The whole point of CPU benchmarking is to determine the objective and absolute differences in performance. Benchmarks are not a buyer's guide. They have never been a buyer's guide. Do not treat benchmarks as a buyer's guide.
The argument that CPU scaling does not present itself with x GPU and y resolution is irrelevant. People upgrade their GPUs more often than they upgrade their CPUs. Furthermore, you do not want a CPU bottleneck. I do not know why anyone would be excited about a CPU bottleneck. You want your GPU to be running at 100% utilisation during gaming, and if you have a lot of CPU performance overhead, then that means you don't need to upgrade your CPU for years to come.
I'm so glad I went with Intel i9 13900kf several months ago! No need to worry about this nonsense!
That Video got out before the Chipset updates i think. The 7950x3d performs way better in 1080 and 1440p. Before that update, the cpu didnt utilize the bigger cash.
Your conclusion that you didn't see the same results as other RUclipsrs benchmarking this is because you used a AMD GPU instead of an Nvidia GPU makes no sense whatsoever. Having an AMD GPU vs Nvidia should make NO difference on a game being able to take advantage of the cpus Vcache of the X3D. Are you really saying people need an Nvidia GPU if they are going to use the X3D?!?!? There is no way this can be true, this honestly points to a problem somewhere in the setup.
EDIT: I 100% bet that in all the games you tested, the games were not using the correct CCD. I bet that if you did like Hardware Unboxed did and disabled the CCD without the extra Vcache to "simulate" a 7800X3D and to make sure the games aren't utilizing the wrong CCD, that your results would show the X3D pull way more frames. I still can't believe though that you suggested it was because you used a Radeon GPU with why the X3D results weren't higher than the non-X3D version.
Thanks for talking through your thought processes during benchmarking! Let's be honest, its all about the 7800x3d. I'm looking forward to being upset about the price in the UK in a couple of weeks.
I'm confused, I just watched two other youtubers testing the 7950X vs the 7950X3d in similar scenarios as in this video, and the 7950X3d was always faster in gaming! How is this possible!?!
are you sure that you have the latest chipset drivers?
you might like to also try using CPU affinity to restrict the game to the CCD that has the extra cache. you can di that using the commandline commant start and include a CPU affinity flag to create a shortcut.
do we need those requirement for regular guy just wanted to play?
@@rolandlucmayon7480 just using the new chipset drivers and game bar, you still get performance that matches or beats 13900k
May be is the motherboard issue, not optimized well yet with a bios update, i saw some MBs perform way worse because are not optimized good yet even after they new update, or win. update can be also the issue, or some missing bios setting, but if you have the chance and time, try with some other good motherboards. The difference between both cps with 4090 and 7900xtx was very small, like 3% if i remember it right, i mean 4090 is getting around 3% more fps on 1080p between both CPUs than with 70900xtx, if i remember it right, was very small the difference.
Trash ass amd
Try it with a RTX 4090 and proper DDR5 6000 CL30 memory and you will regret not getting it.
I may have missed it but did you update your chipset for 3DVcache on your system after you installed the new chip? Those numbers seem off just a bit. But also under the impression the 3dvcache chips are better for games that require large amount of simulation. Space engineers, Stellaris, Universe sandbox, Ultimate Epic Battle Simulator, etc. However if I'm wrong on that someone let me know. Thinking about grabbing a 7800x3d soon when it releases.
This dude always trashes amd so this video isn't surprising lol
@@MrFRNTIK I'm sick and tired of seeing AMD fanboys like you saying this person thrashes AMD when most Tech RUclipsrs I followed tell it like it is and not trying to be a shill and Bret is using Ryzen with a Radeon GPU. I love Nvidia's GPU for my own reason but that doesn't mean I'm not gonna call them out whenever they do stupid stuff like hijacking prices then they know they are being greedy pigs
@DaVillain he does though. I have never seen him say one good thing about amd. And if you have an example of that I'm sure he was reluctant in his praise.
@@MrFRNTIK bruh, why are AMD fanboys pushing conspiracy theories. NVIDIA fanboys might be annoying, but atleast they aren't outright insane
@@potatoes5829 what conspiracy theory? I didn't know stating the obvious was akin to conspiracy. Idc what you use in your system. I personally hate the pricing of the rx7900xt.
I've learned my lesson one too many times going AMD over intel/NVIDIA to save a dollar, and I've never had an issue with either one. Unfortunately AMD had always suffered in comparison.
You can try disabling 8 cores of the CPU to simulate 7800x3d like Hardware Unboxed did. According to them that's the best gaming CPU available right now
Lol then jus wait for 7800x3d n hub is pro amd. Intel 13900k tuned is still better than anything amd has to offer
Ol’ boy Framechasers is gonna gobble this up! 😂
Looking like it's picking up the 0.1% as the cache is benefiting the ram fetching.
Ironically as you have so much vram I'd imagine it's hardly tapping the ram .
I suspect in an 8gb card it might even run those better.
This explains the 5800x3d more as ddr4
Thank you for your due diligence and financial sacrifice buying and testing this chip. I don't normally subscribe to the tech channels I watch but you just saved me from a bad purchase, so you've more than earned it in my book.
🤣 bro did you just say the 7900xtx performs worse at 1080p!
I purchased the regular 7950x and was wondering if a made a bad choice and not waiting for a x3d. I'm glad I didn't wait and purchased this. Really happy my processor and performs exactly how I need it to.
What mobo and GPU are you using ?
@@kylergeston gigabyte x670e and rtx 4080
I upgraded to 5800x3D this year from my 3900x. (paired with 3070). That was definitely worthwhile.
I love being GPU bound and having way less defined hitches if there's a technical issue with a game.
well, i have the 5800x and another 3600XT and a 3600 non X(t) and one 2600 and i must say, from a thermal standpoint, i like the monolithic design much better. The Chiplet Design may bring advantages to the manufacturing process as well as the profits of AMD, but with this 3d V-Cache it is one more compromise on the table, which i am not willing to pay for, anymore! Not saying i go to the blue team either. no, i will stay with my main setup with the 5800X and wait what happens next with Zen 3 Refresh or Zen4 and enjoy the low DDR4 Prices in order to fill up my 16 ram banks (on 4 Boards) with cheap 3600MT/s Ram.
Hardware is designed with future performance demand in mind. Current software is not yet optimized for this chip design yet. The next generation of games will show a more substantial difference
This is a lot of help in perspective for me. I feel any system or parts I buy now will live beyond my primary system. My current system, when I hang it up, will be my next step to having a server. I love my i9 3900x and the product line as it will help me create a good server once I move up from it. I try to get 10+ years out of my hardware. I have a system my kids have now Intel 2700k that is gaming Destiny 2 at near competitiveness levels with an Nvidia 980 ti. That Ryzen 7950 non-3d is steller. Even the 3d version can still do the work, in the long term. After 5 years of gaming, it would be a spectacular game server host and more.
my girlfriend was curious why i wanted to upgrade her from the 5900x to a 7800x3d, but was super excited to hear I wanted to use it for a system to host game servers
It is the temperature and TDP differences that matter. The 7950X3D can do the same performance at lower power consumption and temperature. Thats enough.
Rofl bros how are so many people having these issues? It's a plug and play experience once you update the bios and install chipset. I don't even have Xbox game bar or my system on balanced and the Window's scheduler still knows what to do. The difference between the two is huge especially at 1080p and I'm using an xtx. The host needs to step his game up. How can you be a tech youtuber and have issues installing a cpu man!
Basically clueless.... Here's some pointers it'll help you out:
1.Install AMD x3d optimizer drivers
2. Install Windows game bar
3. Verify PBS in your bios says to auto
4. Verify Core park in task manager is actually parking 9 -16 cores while in games.
After you do all this.... You will find a noticeable game.
I'm looking to build a PC with an RTX 4090 to game on 4K. I'm more interested in the future. Would future games run better on the 7950x3D thanks to its cache or would it be the same as the 7950x? I'm not planning on upgrading the PC for many years to come, so I want to get the best of the best and therefore be future-proof.
same but with the rtx 4070ti
You didn’t analyse performance per watt. The X3D absolutely crushes everything else in that very relevant criteria.
Video starts at 6:07.
Why did Brett get autotuned for a second at 4:42 🤣
Your best option for gaming is to disable the CCD without the v-cache to effectively make a 7800x3d.
Exactly!
Like Hardware Unboxed did
@@Technae The 5.7GHz clockspeed of the 7950X3D is on the CCD without v-cache only. The v-cache CCD runs at about 5GHz, so the 7800X3D will be about the same.
@@isakh8565 hahahaha no! Seems like you are one of those who doesnt understand that the 7950x3d vcache die boosts 300mhz higher and the 7800x3d is gimped…
@@reijhinru1474 LOL 🤡
This seems like a huge win for me. Who bought a 5800x3d at launch and decided to skip this generation. My numbers are better then yours in some cases using it and a 3080ti. ❤❤
5800x3d for the win.
That's what happens when only one chiplet gets the 3D vcache. AMD delayed the launch of 7800X3D on purpose because they know that that CPU will put the 7950X3D and 7900X3D to shame and will just end up to no one buying them hence less profits lol.
That has nothing to do with the performance. Those games wouldn't have utilised more than 8 cores anyway. It would not help. And if they did, it would only hurt performance given the latency between the CCDs, even if both had V-cache.
but they lowered the clock speeds significantly... so that it doesnt..
Try it on 8K/16K and you'll see X3D perform better than normal one...
Closest Micro Center to me is over 450mi away. Never will and never have been to Micro Cent. Keeping to online stores and into computer stores for my computer stuff I guess
To use the X3D CPUs properly, you need to first do a BIOS update, then install the chipset drivers and any Windows Updates.
I also highly recommend ASRock motherboards like the Steel Legend or Taichi and GSkill TridentZ5 Neo RAM kits that are EXPO certified.
Go Shane W. I love Microcenter and their sales associates are really knowledgeable.
To summarize; don’t touch AMD CPU and GPU with a stick. Intel scale with memory in monolitic die.
I could understand some people not wanting to pull the trigger on this CPU just yet. I saw some of the reviews on the 7950X3D against its non-3D version (7950X) and Intel’s 13900K.
While the 7950X3D gives those people who game an advantage with the 3D V-cache technology, the drawback is that you have to contend with a reduced core clock on the cores getting the Cache, which may hinder performance in applications requiring the higher clocks.
In a way, AMD’s Ryzen 7000X3D series is a little of a double-edged sword. It is showing the capabilities of having the best of both worlds, but it is also revealing why AMD did not attempt to try this first with the 5000 series (AM4 socket) before on to the 7000 series (AM5 socket).
This Chip just needs manual tuning... when it juggles between the best core of ccd1 and ccd2 your screwed.... since 1 CCD has the 3dVcache the other doesn't... if you really want the most of it it is going to be a pain in the ass to set core affinity manually. so for the enthusiast who is willing to invest time and effort into this matter until the windows scheduler can properly work with this chip. even the non X3d's sometimes load the slower cores when it throws around the threads. The Reason of this being intel for example has P and E cores, those have a designated naming when you check them out with Hwinfo for example, the X3D only shows as "P" cores. My opinion is that if they would have named the cores differently on the CCD that holds the vcache the whole problem is resolved. Then Windows can see the difference between a "P" and an "E" core for example. to me this is a design flaw. if the CCD with the X3D goodies would be named P+ and the regulars P the system can tell which will be the faster cores through the sceduler.
question is... does the x3d variant age like fine wine as amd does, and future programs and games will be able to leverage the cache enough or soon enough before u would move on to the next gen anyways
Is there a link of what you followed as a guild for tuning the X3D correctly for gaming?
The 7900 XTX doesn't perform 'poorly' at lower resolutions. It's just BETTER at higher resolutions. You just be saying ANYTHING sometimes, bruh. 🤷🏿♂️
hmmm it actually perfoms better in 1440p
in 1080p sometimes have stutters
The color of your cables is : ... SO Relevant to the review...
I would like to see what this test would look like with the same setup and updated bios to the current versions.
Hadn’t upgraded my motherboard since may of 2023 and the update from February fixed every original problem I ran into at the time of building my PC(built newest gen PC February/March of 2023.