I think it would be a nice touch if you could highlight the CPU in the benchmarks. It can be a little confusing with so many different CPUs at the same time and it took me a bit to actually find the 7800X3D. I know it might be a little thing, but for me at least, it would make it a bit easier to spot the CPU that the review is about. Otherwise great video!
Nah its not just you, the charts in this are all over the place. The orders of the CPUs change around and its not even ordered by fps in some of them (5:02). I don't really get it. Much easier if the CPUs stay in the same spot and we compare its size to other bars easily.
Yeah... the charts are pretty bad for this one. There doesn't seem to be and rhyme or reason for how the cpu's are listed from chart to chart. sometimes it looks like it's by performance, but other times it just seems random.
I’d like them to take it a step further and just separate AMD and Intel on the charts. So in the case of this video then have AMD on top and Intel on the bottom and vice versa when it comes to Intel. I don’t know if this would be better or worse but whenever I want to find a specific cpu I have to go through the whole list because everything just gets scrambled.
7800X3D will sell like hot cakes in Factorio community the same way the 5800X3D did - the performance boost is so massive that it allows you to scale to much much MUCH bigger factories. Also, memory latency really REALLY __REALLY__ matters to Factorio. A DDR4 2666 vs 3200 MHz made like 10-15% difference in performance provided similar CAS latency ratings. On DDR5 this is even more pronounced and could be a fascinating rabit hole for the LTT Lab.
LTT lab lol. That thing is a joke. Tell me how 1 dude from hardware unboxed has FAR more games tested, with far more settings than an entire fucking "lab" does. Sorry, but their content is pretty much a joke. Far too quick.
No joke, pairing this with a 4090 is going to lead to an incredibly cool and quiet PC as neither part is using much power at all. Efficiency monstrosities.
@@RicochetForce A 4090 not using much power? Are you high? Efficiency is decent but even that would be way better on a 4090 if they underclocked it a little.
@@AwesomePossum510 I assume if you cap the frames of a game, a 4090 would be more efficient reaching 144 frames than a 4070 would be if that makes sense. It's easier for a body builder to lift heavy dumbbells' than it would be for a teenager. Easier for a Lamborghini to reach 150 MPH than a typical car. This would be the logic but I haven't seen any comparison videos or stats.
As someone who uses my desktop PC for gaming and home office work, I highly appreciate the increased focus on energy efficiency. While I'm not sure of the impact on menial workloads that don't require full CPU usage, I do take it into account when buying parts (CPU as well as GPU etc.). Especially here in Europe, where energy costs have exploded since February 2022.
Quick improvement idea: I think if you keep the charts at 9:00 in a constant order instead of sorting them by value every time would make them much faster to read, especially with many of them in a row.
I like the "best performer" sort order, but definitely something can be changed. Others have mentioned fixing just the review product or bold/colouring the entry. If bold/colouring is used, it would also be nice to have a second entry for the nearest price competitor (at time of release)
Or maybe add a vertical Line at the End of the product talking about to quickly compare the other ones with it. Comparing for excample the top with the bottom one is really Hard without a grid in One way or another
why do these reviews never test simulation speed? like stellaris, civ6, hoi4, total war end of turn times (maybe even football manager?)... It's one of the main strengths of this cpu pretty much no one knows about. There's a test that exists where a 5900x is put up against a 5800x3d. The 5800x3d destroyed the 5900x in stellaris speed, simulating 25-30% more days in the same amount of time. If this happens with alot of similar games, then this CPU could be a gamechanger for alot of people, but no one seems to explore this further. I still see people recommending non x3d chips for these type of games, because of "higher single clocks"' and I think alot of people are still making this mistake. Hoping to see someone test this extensively someday, but here we are 1 year later, and still no one has. edit: please stop saying "no one plays those games" FM23 & civ6 alone make up of 110k players rn on steam. That's not counting any of the total war games or paradox games, or any other simulation/grand strategy game... (which is probably another 200k+ total)
Just on the graphs: it would be nice if you could highlight the reviewed CPU's result on every page. It jumps around up and down (since the graphs are ordered by performance) and because each graph has short screentime, it's really frustrating to try and quickly see the position on each one.
This! The graphs are always terrible, LTT has acknowledged they're terrible and said Labs will improve them, but why can't we have some QoL improvements while we wait for the step change?
One big thing to take from this is that the 5800X3D is still an elite CPU, holding in with this and there is no need to rush to AM5 and upgrade everything until prices settle and we're forced to.
Interesting. I'm currently looking into upgrade my PC for Starfield, and I have an Intel i5-7600K. I want to switch to AMD, in part, because I like how AMD doesn't force you to buy a new motherboard after ever CPU, so in that regard, I feel like I should go with AM5 because AM4 is on its last generation of CPUs. But I'm also on a tight budget (if you couldn't tell by the fact I still use a i5-7600K). But I'm also in a hurry to build a PC that can play Starfield at 60 FPS. Maybe my other option is to buy used AM4 parts, that way, when I'm ready to upgrade in the future, the difference in resell value isn't so large.
@@ex0stasis72 Digital Foundry did a video on Starfield that may interest you. They claim there probably will not be ray tracing because it would be basically impossible do to all the calculations in the game. They explain why but it will certainly be a CPU-limited game as are all bethesda games. So I imagine you really could build a computer under $1000 and run the game at fairly high settings especially if you use 1080p.
@@Legendaryplaya thanks! After giving it some thought, I'm probably going to go with a used AM4 motherboard and the 5800x3D. Buying a much cheaper used motherboard solves my concern about it being the last generation on AM4 because I can always just resell it used again for not that much less later.
@@Legendaryplaya Good choices. I already have a RTX 2070 Super, so I'm not as much in dire need to upgrade just yet. But I am often CPU bottlenecked in CPU bound games, which Starfield is very much be.
I personally really appreciated the performance per watt cost graphs since they give a good estimate of the proportional difference in cost of ownership. Thanks Linus, Team and Labs!
They just missed talking about idle power while watching videos and stuff, Ryzen CPU's are notoriously horrible at that while Intel has it figured out.
@@TheFPSPower It's literally like 5 - 10 watts, total system. If you bought a $500 best of the best gaming CPU with 8 cores that can do well at productivity to leave it sit Idle the whole time, that would be stupid, and you could just turn it off, or set it to sleep after inactivity. I don't know why people keep bringing this up, 1 hour of productivity or six hours of gaming will displace 2 weeks of the difference at idle. The only point this argument ever makes is people like to grasp at straws to promote their favourite brands.
it's not just electricity savings, a low demanding Watts cpu will also allow to run a cheaper cooler, with less temperature and less noise and longer longevity without the need to repaste often.
I really like the comparison of power usage and running costs between different CPUs, it's not something I've considered when building PCs in the past. Would be great to see this in GPUs and the full PC build videos as well.
@@FriarPop I save way more than that, but how would you know. You don't know how I use my pc. What my electric prizes are. You just pulled a number out your §§§.
@@handlealreadytaken I just don't like wasting money, when there's a better way for me, that's it. Nothing more. Why do you guys even care so much, what other people prioritise?
You miss an important bit with Factorio. It's a game about automation and scaling out. While it's capped at 60 tps, no matter how powerful your system, it *will* dip below 60tps once you get a complex enough map. The better scaling means running bigger maps before you hit "TPS death". Would be interesting to develop a benchmark that tests how complex you can get before that happens.
maybe they could ask chatGPT to write a script that automates the game in a predictable way to reliably ensure growth happens equally every run, then just test for when the TPS drops. Unless factorio has procedural generation (random maps, etc). I've never played it so not sure on that part
@@SurgStriker Yes, the maps are procedural, but actual gameplay is quite deterministic. A better test may be a single mega-base that strains even the latest gen hardware and see what the UPS (updates per second) falls to. I'm sure there are plenty of folks in the community who would love to share their multi-thousand SPM (science per minute) bases as a benchmark.
@@SurgStriker The maps are procedural, but that's easily solved using a fixed seed. No need for chat gpt, since the game logic is implemented in Lua. So it would be quite easy to write a "mod" that tiles out parallel construction segments at a set rate using the dev-mode infinite source / sink stuff till the game grinds to a halt. With the fixed rate expansion, that lets you plot performance as "time till TPS drop".
That power usage on the 7800x3D is mindblowing. It would have been enough for AMD to retake the gaming crown decisively, but they did it with a part that sips power. Freakin magic.
@@wasd____lmao, it's still a killer cpu and just looking at charts doesn't do it justice at all. Intel and AMD build good CPUs, the cash here is the selling point. It has like double the cash of similar CPUs, so 10% less GHz really isn't the point here.
As a UK viewer (and someone who generally tries not to waste power) the 7800X3D is clearly for me. I had a great experience building a GPU-less AM5 7700 video editing system for someone recently.
@@sophieedel6324 you only buy a console if you only want gaming and nothing else. It does not beat the usage of a computer. Also, i3s are not necessarily less power usage than these chips and why sacrifice massive amounts of performance for that anyway
@@sophieedel6324 then you wouldn’t get an i3 or console either. You’d get a laptop if you truly cared about power consumption. Efficiency matters a lot. The fact of the matter is that those i3s won’t be more efficient than a 7800X3D; you just lose so much in performance and power consumption is barely different. You could also get a 4080 and power limit it to kill consoles in efficiency.
@@sophieedel6324they simply don’t provide the performance necessary for everyday tasks and gaming in one machine. An i3 is great value, don’t get me wrong, but I think in power per watt, the 7800X3D is really quite amazing. Just unfortunate about the clock speeds.
Still happy 5800x3D being up there in the top representing AM4! It's insane we went from Ryzen 3 1200 to 5800x3D on the same socket... just freaking insane.
@@heliosaiden i went from 2600 to 5700x3d. It essentially doubles the gaming performance it's so crazy. Games that heavily uses the cpu runs buttery smooth, i'm seeing crazy gains in dyson sphere program, stellaris, and beyond all reason. all done on my 7 year old gigabyte ab350 gaming 3
Especially with such a big difference. If you're not playing the highest-fidelity games, you can still benefit from spending a bit more today on this thing than paying for it in energy for the next 3 years or so
That power consumption section is a little misleading though. The 13600k pulls more power in productivity workloads but it's faster. Also, they didn't include idle power consumption numbers. Your CPU spends the majority of the time at idle, and AMD CPUs pull more power because of the I/O die.
I think it's time for a energy consumption testing in different workloads, similar to how laptops are tested for their battery life. Idle, web browsing, 4k playback, gaming (this will be hard, need some sort of average load). It will be harder, cause components need to be tested more in isolation than in conjunction like in laptops, but now that they have the new lab... ;)
Hot take: a difference of maybe $100 in total energy cost between CPUs amortized over a 5 year presumed lifespan is not a consideration. $20/year, less than $2/month, is literally just maybe some random loose change from your pockets. It's nothing you'd ever even notice or think about. It doesn't matter.
I am so glad that you finally started adding performance from games like Factorio! As someone who plays a lot of automation and 4X games. Those are very important metrics to me and many other automation/4X enjoyers. It is also very difficult to find reviews that include them, they all just focus on frames per second in big AAA games.
Agreed. I don’t necessarily play factorio (but want to try it) but i do play games like rimworld, they are billions and recently was gifted satisfactory. And i do enjoy city builder/management games also so it is nice to have some mention outside aaa
The Ryzen 7000 parts are pushed to the limits, so naturally, when AMD has to reign in some of that power due to voltage limitations on 3D vcache, you get a very efficient chip. I just don't think any of us were expecting it to be THAT efficient. It's insane.
It's not efficient when it takes longer on productivity tasks and doesn't really idle any more efficiently. You lose all those savings and more in the extra time it takes the processor to get things done.
@Winston Deleon that is a moot point though, this chip is marketed as a high end gaming CPU that has the capabilities to perform productivity tasks with the vast majority of buyers focusing on the former and not the latter. For those type of tasks the 7900x or 7950x would be more feasible
i just picked up a 5800X3d, started with a 3060ti and a 3600, when i upgraded my middle monitor to 1440p it left just alittle to be desired, got a good deal on a 3090 but then the 3600 was a HUGE bottle neck. Like my FPS in games like farcry 6 straight up doubled switching to the 5800x3d.
I’m having that issue now but with Intel I have I7-8700K and had a 2080 base. Then I upgraded to a 3080 TI and now most games are running sluggish. Cyberpunk, SCP, Assetto Corsa
@@InertGamer your CPU is too weak to play modern heavy games like Cyberpunk. If you get a intel 12000 or Ryzen X3D chip, your gains from 2080 or 3080ti would be huge.
Is it possible for all the graphs representing the cpus and their relative performance, that there could be a highlight of sorts for the two CPUs being compared? So maybe a red "highlight" around the 7800X3D, and a "blue" highlight for the CPU that costs the same or is priced to compete so that the viewer has a more intuitive feeling for the difference in the charts? Otherwise it feels like just a wall of numbers, and I usually like watching these not because I am going to buy them, but for entertainment. So a simple visual indicator for where they are on the list would be cool!
I'm so happy about this focus in performance per watt. It's such an important metric that's been overlooked for far too long. I truly wish ltt keeps talking that much about it in future reviews. Not just about CPUs, but anything. I cannot help but think this absurdity of power consumption nvidia and intel been pulling lately as partially a result from the community and reviewers on average not caring much or at all about efficiency. But raw performance and to hell with the costs. There's only one more thing that would make me even happier: measuring and caring about idle consumption
I agree! But I would take it a step further because it skews even more drastically when you consider AC/cooling costs to keep your room at a comfortable temp, especially with summer coming up in NA. Very few like to game while sweating.
Performance per watt is pretty much the ultimate measure of who has the best CPU architecture. It's fine looking at the benchmarks, but if the results are very close while one company needs a ton of extra wattage and heat output to achieve that result, I'm obviously going to pick the more efficient and cooler running chip if the prices are anywhere near comparable.
@@marcuscook5145well one thing about it though is some architectures can't consume too much power before they ultimately will fail, like ryzen, and something like 13th gen intel was made to be able to consume very high amounts of power. All in all I wouldn't say it necesarily makes one architecture superior over the other, but I think it's always a better foundation for improvements if you have lower watts consumed to supply the cpu sufficiently overall.
@@DKTronics70 its just a socket .. the platform is the chipset and its evolved over the years . and ddr5 and pcie 5 are not worth the extra outlay at all
The 5800X3D gets beat by 20-30 fps in some titles to the 7800X3D. I was shocked, i haven't seen a generational performance increase like this in a while. ruclips.net/video/TVyiyGGCGhE/видео.html
@@spacepioneer4070 what he meant was the 5800x3D is about the same as the 7800x3D in 4k gaming. The 7800x3D according to the charts, and the video you linked, only have a noticeable gap over the 5800x3D in 1080p. Even at 1440p the gap shrinks tho. And in 4k the gap is negligible.
Glad you guys actually showed the cost savings related to the efficiency. I think that speaks volumes about how great the solution is and if AMD actually spent more time trying to make the top end line of chips behave the way we'd expect they should, then it will be very hard to recommend anything other than AMD solutions.
@@JBlNN I would argue that they've taken a major risk already by challenging Intel when they were so small, now they're much larger and growing fast, I've used both Intel and amd before, just got a new amd cpu after using Intel for 7 yrs amd I'm excited to see the differences
I just love when gamers have a simple solution on a CPU. 7800X3D is just great, doesn't use a lot Watts, is fast and decently cheap. Just a perfect choice without thinking to much.
What amazes me the most is that power draw. My 5800x3d draws ~55w which is really nice in my quite small and compromized matx.....they managed to cut out 15W! from that with more performance. Now I seriously concider a SSF build as now any low profile cooler will do very well and the options opens up a lot while still having a quiet mashine.
yea the analysis with performance per watt is very good. I think he should have mentioned how you also wouldn't need as beefy a cooler for the 7800X3d vs other top performing chips. That's means a smaller PC and/or more quite PC.
I don't get why they even show it in a review of gaming CPU, 15$ difference in a year is nothing. Is it not normal in USA to tip delivery driver 5-7$ on each order? So like don't tip 2 times a year and you are done.
@@GrzegorzMikos its certainly not nothing, every dollar counts, also they showed what I would actually consider to be very minimal use, if you were a heavy gamer or left your PC on for longer periods, you could be saving dollars in the triple digits with this chip as opposed to say a 13900k, which is already crazy when you consider that its cheaper than the 13900k and on similar or better performance for gaming. Also, it's just good to note all the strengths and weaknesses of any product you review, like how this particular chip isn't a great investment for pure productivity
I loved seeing the estimated power costs during ownership. Really made me think about it and realize that the extra cost of the 7800X could easily be justified (plus less heat during summers (or all year if you live in Florida like me)).
The difference is is even really small in the US.. i was shocked how extremely cheap your power still is. In Germany we went up to about 65 us cents per kWh. The difference over a few years can easily be hundreds of Euros.
@Duglum666 honestly never thought about it that way, like as of right now I still live with my parents (not a leach only 16) and I just get the best thing I can afford, totally didn't think power was a factor of cost, even though it's long term.
@@Duglum666 the 65c were during peak, shouldn't hit that again, and most places are coming down quite a bit, seeing contracts here in Austria for low 20c range again, it basically doesn't matter long-term for home users (also energy subsidies).
The extra 100W of cooling load on the Aircon will easily add up over a year. Not only you have to pay for the extra 100W every hour, now you have to pay for your AC to get it out of your room. The 3D chips are a no brainer, and a fantastic value for money.
This 3D cache advantage reminds me of how overpowered L3 cache was when it got introduced. I did a little test between an Athlon II x4 620 and a Phenom 720. Parked both of them down to two cores and same clock speed. The L3 cache gave me up to about a third more fps iirc. It would stand to reason that more of it would be better when we keep improving it.
Also that was then, L3 cache (or larger cache vs. smaller) has a bigger impact later in a products life cycle. That's why the phenoms stayed relevant way into the core-i era. That's (part of) why the intel X58 platform is kind of usable for gaming today. The other part being triple channel. these x3D parts will have some absurd longevity, not in maxFPS but in minFPS/frametimes.
@@andreewert6576 Indeed. the minimum FPS or 0.1% low or whatever they call it, is something I use as a general guide when it comes to capping framerate. My current gear for example handles Deep Rock Galactic at 1440p at roughly 120~ish fps, but dips down to 80-90 regularly. I decided to cap it at 100. 10-15 fps drop will not bother me, but 30-40 will be felt. Smooth gameplay is paragon for chill experience, in my personal opinion.
@@Hr1s7i in any games that has a lot of things going on on the screen all at once, its very desirable to keep FPS consistently high even if you have to cap it to avoid the frame dips, which creates way better smooth gameplay. with X3D CPU, the 1% lows are soo high like never seen before, which just makes gaming experience butter smooth, and if you can tune it to be almost locked at certain high FPS, that is golden...
I really love the energy price graphs! Would love to see this more often for gpus too, since as a brit it's defo becoming more of a concern when looking at hardware and nobody does real world data of actual power usage
Fellow Brit here and I completely disagree, the cost of electricity is beyond negligible. Even using their price of 42p a kw if £22 a year extra running costs is enough to convince you then you shouldn’t be gaming on a pc.
@@oxfordsparky sure if you take a single component by itself it might be a small figure, but even then if you're paying even £10 a year for a 2fps increase is there point in that extra cost? No. But without performance per watt info it's all speculation
was thinking this myself watching the video. no clue if the videogame heavy or productivity heavy CPUs work best for the videogames that are productivity heavy :( would love to see 4x benchmarks
@@badjackjack9473 With AMD you will get the same exact performance out of 7600x as out of 7950x. It will be very similar story with Intel's i5 and i9 if it's the K type CPU.
Another really cool thing about 7800X3D being so efficient is probably if they decide to put it into mobile platforms. I can imagine 20 watt desktop replacement for field work processing.
They would invariably have to stack a _monolithic_ die with 3D V-Cache first before it would be attractive to mobile use. Desktop Ryzen 7000 CPUs will consume 20+ watts in idle, and almost nothing of that is due to the cores (which might take 0.6 W).
But would you see the advantages since notebooks get lower power GPUs? You can't generally upgrade the GPU in a notebook, so you are more likely to be GPU bound and waste the CPU horsepower, right?
DESKTOP AND MOBILE CPU'S ARE NOT THE SAME. do you really think if they could make the desktop part have that much performance at 20w they would LOL!!!!! WTF!!? U S E YOUR BRAIN!!!! the people in this comment thread are the same people on the steam forums saying their games won't run. ...."buuut i has a I7" lol
Now all we need are consumer desktop dual AM5 socket motherboards, so that you can put your interactive tasks on the 7800X3D and everything else on the 7950X.
Live in the UK. Those cost of ownership metrics really frigging matter. Been waiting for this type of analysis for a while now and am so happy to finally see it. I would MUCH rather give up 5/10% performance for, in this case, a new component to save significantly on cost of ownership. :)
This is going to be huge for me, i live in Alaska and energy is actually pretty cheap cause of hydro, but nobody has A/C so my room gets toasty quickly in summer with a 300+ watt system running for long sessions. i was thinking of getting a 13900k, but that would literally have doubled my wattage output into my room. this new chip if its running at 100 watts which is the same or less then my ancient amd FX 8core chip i have now. holy cow im going to be going from 32nm to 5nm tech, this is going to be a massive jump. im holding off though for at least one reviewer to use it in Arma 3 and DCS and compare it to intel as those are my most played games. exshily in arma 3 intel has always had a huge lead over amd just because of optimization and single thread reasons. but that dosnt matter if it comes at 200+ watts more cost.
Definitely relevent these days. Features like Radeon Chill where it limits your FPS while you're idling in games can actually save you very noticible sums of money, as well as buying efficient hardware. I miss the days where you didn't have to consider these things but here we are..!
This is why i wait for an entire line of products to be released/reviewed before deciding on a purchase. I’m on a 5800X right now. No need to upgrade my PC right now other than a video card, but it’s nice to know where things are headed. I built it in 2021, so it has a lot of life left in it.
As a Portuguese speaker I paused the video and go back just to hear again hahaha Linus destroyed the pronunciation hahaha Of course not his fault... but it was funny! 🤣
I’ve been waiting years for the PC to finally release efficient CPU’s. The 7800x3d seems like the real deal. Nice job AMD! I just don’t understand how Intel has had a few years to update their CPU’s after Apple released their super efficient M chips, but have done nothing for efficiency.
Intel has a 62.8% market share so they dont need efficient chips. 62.8% of the world love Intel and keep buying Intel. AMD only have a 35.2% market share so they need to work much harder than Intel and produce better products than Intel if they want to grow bigger than 35.2%. The CPU company with the bigger market share will always be complacent.
Apple uses an entirely different cpu architecture (Apple is ARM, while AMD/Intel are x86); x86 chips will probably never come close to ARM. ARM has many of Its own issues relative to x86, so ARM releasing into the mainstream desktop market is not happening anytime soon. Comparing ARM efficiency to x86 desktop chips is not a fair comparison by any means.
Idk what is does, but Escape from Tarkov seems to heavily benefit from the increased cache in the 5800x3d, and the whole community went crazy over the massive performance increases over any other chip on the market, even with older 10 series cards. I can only imagine how big of a jump this is in games like that
AI computations benefit the most from having a bigger cache (see how massive the gains are on the graphs for Warhammer 3), I've never played Tarkov but I guess there is some heavy and possibly non optimized ai logic running in the background.
@@frale_2392 you can play tarkov on 4K with this chip and in good quality, i'm already running tarkov on 4K high, fsr on ultra quality with a 5800X3D and i have stable 130 fps. the new 7800x3D is gonna be a monster too, clueless, tarkov is dead because of the cheaters
In MSFS it doesn't really matter which so far released 40 -series GPU you run - but from x3D you get 10-20% more fps lol So instead of 50 you get 55-60 - which is a lot.
The performance of basically every mid to high end CPU is so good, the choice of CPU almost doesn't matter for most consumers. For the more hardcore audience, it's best to pick one that is more tailored to your preferred games. You really can't go wrong though.
The performance is good today for most of them indeed. But as Linus and others have said, buying a better CPU today helps you not have to upgrade it (along with mobo and potentially RAM) when you upgrade your GPU in the future, especially with higher-end GPUs.
@@GiorgosKoukoubagia yep I'd argue AM5 itself is a good investment, regardless of the CPU you choose. Should be supported for many years to come. That's why I bought a Godlike board and 7950X3D. Pricy, but I'm set for a long time.
running 7800x3d for now and in desktop it has just 35-38 Watts usage (and its a gaming PC) GPU takes around 26W its an 4070 also undervolted to max. So total around 61W power usage in total in desktop, that competes even with some gaming laptops now!!! PS: using Samsung pro 990 for 2 Tb and DDR5 5600 Mhz. Got this PC just 2 weeks ago for 1200$ used and I'm happy!
I think the reason the 7900X3D performed so badly with dual CCD in Factorio, is because game mode probably didn't kick in for the benchmark. And as such the CPU probably didn't recognize it as a game, which would have parked the non v-cache CCD.
No, it's because it has 2 CCD's, each with 6 cores on them. So When one CCD is shut down in favor of the cache, it essentially becomes a 6-core CPU with V-Cache.
@@PQED well, yes.. But to determine when it should switch to v-cache only ccd, it relies on windows game mode to determine if a game is running. If game mode doesn't trigger, it won't know which ccd to prioritize. And my guess is that the fsctorio benchmark doesn't trigger game mode as the benchmark doesn't seem to run like the game normally.
7:08 I love how Adam’s in this bit because back in the first AMD extreme tech upgrade video he really wanted a 3DVcache CPU but missed out on the timing
I really loved seeing the pricing for using the chips in California prices. I don't live there, but I super appreciate the graphs, as I find that incredibly important when choosing between the options and saying "well I have the money, so I might as well get the more expensive chip... right?"
Man, it's crazy how basically every graph was showing triple digit fps across the board... Doesn't seem that long ago when these types of graphs were in the 60-90 fps range. 😅
now consider how much less power it draws = how much less heat it puts out into your cooling system = how much more graphic card will have headroom (may even be able to get more from Video card due to better thermals!
That is of course gaming but in other operations of the PC the 13900k will do better but the reality is the difference in power draw and the real world performance over all for most everyone the amd is still the better overall.
I just upgraded to this from a i7 6800k and it's definitely significantly better. I paid around $530 for a CPU, motherboard and ram combo from microcenter.
The 7950X3D basically becomes a 7800X3D when you turn off half the cores in the BIOS. Hopefully they fix it with software updates so it's always like that without turning off half of it.
100% agree about the delayed launch of a better cheaper product. If there is a defence it’s this, the 7950x3d performing worse seems to be a software issue (firmware, drivers, or OS). It should be best of both worlds rather than worst of both worlds. I’m sure original expectation was best of both worlds, and they didn’t adjust their plan when it didn’t pan out that way. Maybe theirs firmware updates in future.
Yea really hoping they improve the schedular or the parking some how for the people who have the 7950x3d that wanna use it for production and also gaming.
@@achillesa5894 Yea currently with some games i play there are scheduling issues unfortunately where the cpu doesnt fully utilize the 3d cache by about 30-50% so it can be like a 20-40 fps loss which really sucks -.-
They delayed it to sell people more 7950X's as they new that the 7800X3D would be *THE CHIP* to go to for most people. The 7950X relies on Windows game bar to schedule itself so good luck...
Bought one for my dad this year, I'd say it's well worth it. Buy the screwdriver and a couple bit sets and you'll never need another screwdriver or another bit set in your life.
I finally jumped in when 5800x3D hit $319 and I had point on Prime so it only cost me points. I was already running 5900x and I popped in the 3D chip and gained 20 fps on the same game, same setting running at 1.16v and switch out for an air cooler. 3D is surely efficient and cool as long as you run PBO2.
It's a straight up beast. Exactly what I expected, really a BIS chip and worth the wait for AM5 platform, especially after some time for boards and memory to get cheaper too. Definitely amazing at certain games where it use extra cache for it's max potential like in WoW and other. Very nice!
I'm buying that for emulation and avoid CPU bottlenecks for years after I buy it. And also the upgrade path. Planning to keep this AM5 motherboard for years
I could be wrong, but I was always under the impression that emulation performance relied on high clock speeds as opposed to tons of L3 cache. If that's the case, x3D chips wouldn't be ideal for emulators since their clock speeds are significantly lower than their normal X version counterparts. Someone correct me if I'm wrong though!
@@Benefits: it's always about what the program is doing. A SNES game (plus the structures required for emulation) will probably fit easily into L3 cache, but a PS2 game won't.
I love my 5800X3D. I will be staying with it for quite a while. The thing that I feel when playing with it now is the reduction or removal of a lot of micro-stutters. Load hitching doesn't happen as it used to as well. I do feel it in Cyberpunk 2077, it's not in the framerate it's in the input response. Turning a corner on a vehicle at 100+MPH on my old CPU would feel sluggish and those little hitches as it tried to keep up with the GPU are gone. Yes, I am a 1080p player.
Well done in showing running costs. That has always been a factor in pricing my PCs. The excuse "rich people don't care about the bills" wears thin after a while (that excuse doesn't apply to many/most of us) so seeing those metrics is refreshing.
having the 5800X3D and now a 7900 XTX I'm set for a couple of years. I might end up skipping the entire AM5 generation and see when AM6 comes out what both AMD and Intel have at that point. Right now I dont need an upgrade. Not for the next 5-6 years. Still its astonishing the new implementations and innovations that AMD and Intel comes out with. I wonder when AMD will also have a big.little structure and add some efficiency cores, or if they just straight up go 3D V-Cache on all their future CPUs to make them as efficient as possible. They have already won that game.
They are planning out Big Little for Zen 5. Essentially their plan is to use Zen4 with avx512 and some other SMID instructions removed and call it zen 4 dense. This will shrink core size and interconnect size, enabling low power compute zen4 while leaving full fat zen 5 to do the high power high speed stuff.
They already tried a big.little chip. It's just for laptop/mobile though (2Perf+4Eff). That's where efficiency matters. For desktop, all the efficiency cores are parked and used only into production tasks.
Same setup as you. 5800x3d on a b550 Taichi board with a 7900xtx Taichi. The rig is amazing at 4k and not too insane on power draw so it's perfect for me for the time being. I don't see a need to upgrade for the coming years at all. I plan to use am5 for a home nas build though with a b650 itx board and a Jonsbo n1 case. Will use one of the non x CPUs because of how power efficient they are
I did the exact same upgrade last month. GPU is a 5600XT on a 1080p monitor so the gains were minimal. Bottleneck Calculator says the 5800x3d is a bottleneck for a 4070 Ti or higher... not sure what GPU i should upgrade to. Will get a 1440p monitor as well. Wait for the 4070?
I honestly think that long term review of these chips, future forecasting in terms of energy consumption and compatibility need to be an integral part of these reviews. Most people aren't building a new rig every year.
Just did my upgrade. 7800X3D, B650-Plus, 32GB DDR5 6000, and a 6950XT (should have made the jump to 7900XT but whatever). Now, that's component prices I wasn't prepared for, but coming from a i7-6700K and 1080, it was about time, although the old i7 and 1080 still hang.
@@razgarorIn general, not much as changed. Gaming, a leap forward, obviously, especially in newer titles like Dartide that gobble VRAM and CPU. Mainly because I use 3440x1440 34'' monitor, the 1080 was getting destroyed, then the CPU lagged bebind. I was getting 40%, 60% GPU utilisation with the 6950XT, so that needed to change. In some games, the old stuff worked perfectly well (Back For Blood, old titles...), with some visual reductions. I think if you game in 1080p, you can probably get away with a system as old as mine for a while. Less VRAM, easier on the GPU...A cheaper GPU, like a 6700, as long as you have decent amount of VRAM, imo.
Oh and I've switched to two NVME drives (!TB, 4TB). But really, I had a 500TB ssd, 2TB NVME, and a 2TB HDD, and yes, it's faster, but diminishing returns. All in all, I could have kept the old system. If you're short on cash, second hand will save you a lot because all that shit is expensive. And as I said, my old pc still works fine and could still game OK. Besides, not many exciting things on the horizon. It's a bad time for gamers.
I'm extremely happy with my 7900x. I initially wanted to wait for the 7900X3D, but I do a lot of recording and streaming while gaming, and I plan on keeping my 7900x instead of upgrading.
For game _developers_ the 3D variants don't make a whole lot of sense either. We're getting top-tier but only just shy of best performance in gaming but gain a massive improvement in the productivity department. Compiling and loading and editing take up a whole lot of time out of my daily schedule and I'd gladly give up even 15% of fps 5 years down the line for the massive gain in productivity.
@@gozutheDJ yeah. I was using the AV1 encoder with my 4080 but there isn’t a whole lot of support for AV1 encoded videos yet. So I use Nvidia NVENC H.264 but I’ve been wondering about trying out x264
The power consumption story is missing idle and web browsing or watching video values. If the 3D chips consume more power in idle or low performance tasks that might also be of interest for those that use their computer for extended periods of time with small bursts of power.
@@ShersGarage Sure but many only need the CPU Power for short bursts. And why not get a x3d cpu if you like to game on the same computer as well. I found this video which gives a better look at efficiency: ruclips.net/video/JHWxAdKK4Xg/видео.html
@@asldfjkalsdfjasdf doesn't look like there is that much difference in the power consumption. Maybe 5GBP difference? My take is, if you you plan on using the PC mostly for gaming, get the X3D. Otherwise it is a waste of money. In some cases non X3D CPUs are just as good or better than X3D counterpart.
Dude is a great seller. You stare at a graph that shows you barely any differences, and he still manages to use words like "win" and "lead" to shine some light on single products. No one can sell me, that you will notice 10 % gains at 200+ FPS.
Nice Work Team LTT! I especially appreciated the efficiency cost/KWh comparison. This is a significant factor in the northeast US where its over .42/KWh.
@@Hugosanches594 I have the "same" setup and at 4K 120hz there are very few games where the 5800X3D will bottleneck the 4090. I see some performance loss in Witcher 3 and in Hogwarts, mainly because of low CPU utilization/bad game optimization. These games would have benefitted from faster core speeds than the X3D chips have. In Hogwarts a quick .exe tweek mostly fixed it and in both games DLSS/FG made it pretty much a non issue. (Running around in Skellige in Witcher 3 I had to turn of Frame Generation for a while though, because of some bug.) In 1440p there are several games where you will get a CPU bottleneck, but that is true for all CPU's currently available when you have a 4090.
Yeah, I'm switching all my 3600's to 5600X's, and making sure I have at least B550 boards. I just got an ITX B550 last week to replace the ITX Fatal1ty 450, and the last 5600X arrived on Friday. I'll be able to recoup probably 75% of my expenditure.
For the performance graphs I would love for them to be color coded so like the product that is being reviewed have a different color to differentiate from the others
I'm all to holding companies feet to the fire. But the push back for AMD launching the 7950X3D and 7900X3D 5 weeks before the 7800X3D is just too much for me. It was announced at the same time and included the MSRP so everyone knew they only had to wait a little more than a month and how much they would save. Those who need the bleeding edge or can't control themselves really just need to look in the mirror if they have regrets today. All major review sites (at least the ones I saw) said they expected the same performance with the 7800X3D and to wait to buy.
Four months later it wouldn’t matter. I just bought a 7950x3d and got a free motherboard. It’s equivalent to 7800x3d plus a motherboard, only with 8 more cores.
Your inclusion of the energy efficiency over time makes me really happy that you invested in that lab. Power draw is one of the main reasons I refuse to get any current Gen NVIDIA product, Even though I really really really wanted a 3070 TI. Will most likely buy an arc a770 when that's on sale.
It would be an added value for this video, and I know, more time to test things, to see the difference in gaming performance between 7800X3D and 7950X3D while streaming at 1080p with a reasonable number of programs open for streaming (elgato stream deck, xlr, RGB controller, etc). I think it is a pretty common environment nowadays, and useful too see if the extra money for the 7950X3D is more convenient than building an entire PC dedicated to streaming the video output from the gaming PC (which, it most probably is), or if the difference between the CPUs is neglectable.
remember that the drivers effectively shut off the other 8 cores on the 7950x3d while gaming, so it would be purely clock speed making the difference and i cant imagine the extra few hundred mhz would be worth the extra cost for that application.
Honestly I would like to see benchmarks in CPU Intensive ports like Witcher 3's Updated version (with the comment that this was tested as of this point and CDPR may eventually fix it)
I'm still rocking a i7 5820k CPU and it was a 6 core processor that you guys hyped as one of the best choices long term even though there was faster chips that were still quad core.
Surprised to see how well my 5800X3D holds its own still next to these newer chips, and it most definitely isn't slowing down my 4080 so I guess I'll be holding onto it till the next CPU generation 😄
@@Petar98SRB my 5800x3d is super inconsistent. it idles around 39-44c and during gaming it likes to sit around 50-60, and jumps around 70c regularly before going back down to 60ish c. and cinebench makes it go to like 84-85. You may want to check your thermal paste application/mounting, or reexamine your current cooling solution.
@@Petar98SRB X3D chips do get somewhat warm yes, but 90 is definitely on the high end. Gaming temps average 60-65c for me, cinebench load caps around 80-83c usually. I would definitely double check your thermal paste application and cooler mounting pressure, or possibly your case airflow setup.
The problem with the 7900x3D and 7950x3D simply need microsoft to fix windows so that it can run the games on the 3D cache CCD, much like they had to do with intel's P cores vs E cores.
"Raises some big questions like- What the F***?" Exactly my reaction when I saw the 7800x3D bullying literally everything in most gaming scenarios. I've gone through numerous reviewers and seen the same results. Amazing.
If you heavily play a Unity game (say, VRChat or Escape From Tarkov) then the extra cache of these chips vs Intel's is gigantic. Unity hits cache a lot.
@@Dyils True! There's not been a huge amount of Unity-focussed testing of these. VRChat is a good general stress test of Unity on BIRP though given you can throw a very wide variety of content into it and really stress the engine out in high-load instances. The hypothesis is that it's a Unity quirk, something that has been mentioned by VRC devs, due to the reality of Unity games essentially being injected assets and scripts into a core engine that can only do limited build optimization compared to bespoke engines or Unreal. I think the same finding was made by Tarkov players: see here ruclips.net/video/Gi8X99f70D8/видео.html To clarify: it's good for lots of games, but I think Unity games in particular are friendly towards high levels of cache.
I think improving the 7950x3d is just a matter of software. All AMD need is a better scheduler. And looking at the numbers this kind of one die cache is going to be the future. So let's wait a few months and driver update can improve it even beyond 7800x3d
Nope. Infinity fabric latency and having to "guess" at which core to use in the multi-die setup is always going to be a weakness that keeps 7950X3D from being what it should. AMD played customers with a chip they knew was no better than a cheaper offering coming shortly after.
Strange how everybody cut some slack for Intel designing a CPU with different cores that needed a new OS to work, but AMD get flak because Windows can't allocate a RAM-heavy process to a core with lots of cache...
just use process lasso and make a personal CPU set, with PBO and its instantly faster than the 7800x3D. I've already tested this.. the issue is windows scheduler not being set up for gaming unless you use windows gamemode but that hinders some performance.
@@mitch075fr It struck me as odd as well that Intel got so much help from Microsoft to make their product work, but AMD has to rely on Game Bar of all things? It's not reliable in the slightest. AMD really does need to work on their own hardware scheduler though, but I don't see how it excuses the above.
Nah not true. It needs hardware thread scheduler like Intel has. AMD just cheaped out as is usual for AMD. Never touched their trash products cause they pull of cheap crap like this.
I have 2 monitors and do gaming/streaming and light 3d modeling. 13600k has had no issues and no lag, super quick. I feel like most modern cpus are all overkill for 95% of users.
PSA: it is possible to run your non x3d chips power limited to achieve pretty much the same watt per frame as the X3d variants. You will fall a little short in cache limited scenarios, but you will get pretty close. It barely even impacts performance on all core loads.
I really would like to see an improvement with your charts, especially if there are so many. It's hard to focus on them without pausing the video a lot. Sometimes low values are better, sometimes high values are better, nearly every time the order of the CPUs changes, the spacing between the lines is different each time. The transitions between the charts/graphs could be better I think. I don't know how you could achieve that because I am not a layout designer but you definitely could do something here. That would be great!
Would have loved to see the 7900x in the comparisons. As of writing you can get one for $430 (admittedly at heavy discount, but multiple retailers have the same price), the closest price comparison out there.
You can just take a look at the 7700x results. They perform similarly in gaming, and because games don't usually benefit from the extra cores, in some games the 7700x performs even better than the 7900x by a very small margin.
Mmm, I only recently built a new PC with a 5950x and a 3090ti... the thought of having to get a new motherboard and really rebuild the PC from near scratch next time around when my last PC lasted 10yrs... its pretty annoying and not something I want to do at all. I'll see how long I can hold on I guess... but with all the UE5 advances, I can't imagine my current PC'll keep pace in 5yrs time. Meh.
@@Quasiguambo ??? I got a Ryzen 1700x, built it at release, thats 6 yeras ago. I can just replace to a 5800x3d and the same system will work for years. Do you expect the 5950x to only last a couple??
Just bought a sealed in retail box 7800X3D for $340 from local ad about a month ago. Then found another 7800X3D listed for $300 new but with no box. I snagged it for $260 for my son. I was only shopping for a used 7600 but the extra $100 was worth it. First time my kid has the same PC spec as me CPU/MB/RAM wise at least.
I think it would be a nice touch if you could highlight the CPU in the benchmarks. It can be a little confusing with so many different CPUs at the same time and it took me a bit to actually find the 7800X3D. I know it might be a little thing, but for me at least, it would make it a bit easier to spot the CPU that the review is about. Otherwise great video!
Nah its not just you, the charts in this are all over the place. The orders of the CPUs change around and its not even ordered by fps in some of them (5:02). I don't really get it. Much easier if the CPUs stay in the same spot and we compare its size to other bars easily.
Yeah GN highlights the CPU in their videos, not sure why LMG doesn’t…
yeah, Gamers Nexus always highlight the ones that they're talking about
Yeah... the charts are pretty bad for this one. There doesn't seem to be and rhyme or reason for how the cpu's are listed from chart to chart. sometimes it looks like it's by performance, but other times it just seems random.
I’d like them to take it a step further and just separate AMD and Intel on the charts. So in the case of this video then have AMD on top and Intel on the bottom and vice versa when it comes to Intel.
I don’t know if this would be better or worse but whenever I want to find a specific cpu I have to go through the whole list because everything just gets scrambled.
7800X3D will sell like hot cakes in Factorio community the same way the 5800X3D did - the performance boost is so massive that it allows you to scale to much much MUCH bigger factories.
Also, memory latency really REALLY __REALLY__ matters to Factorio. A DDR4 2666 vs 3200 MHz made like 10-15% difference in performance provided similar CAS latency ratings. On DDR5 this is even more pronounced and could be a fascinating rabit hole for the LTT Lab.
CAS latency doesn't matter at all, it's more like tRRDS/L tFAW and tRFC for timings.
I just hope this cpu can get 60 fps on my modded cities skylines , although most of my mods are gameplay tweak, very few of the mod that is asset mod
in factorio the performance is totally nuts
LTT lab lol.
That thing is a joke. Tell me how 1 dude from hardware unboxed has FAR more games tested, with far more settings than an entire fucking "lab" does.
Sorry, but their content is pretty much a joke. Far too quick.
All of that community already bought the 7950X3D.
AMD’s performance is great in the last few years, but look at that efficiency! 85W while the 13900K is hitting over 250w. That’s insane.
No joke, pairing this with a 4090 is going to lead to an incredibly cool and quiet PC as neither part is using much power at all. Efficiency monstrosities.
@@RicochetForce A 4090 not using much power? Are you high? Efficiency is decent but even that would be way better on a 4090 if they underclocked it a little.
@@RicochetForce Forget to switch alts?
@@AwesomePossum510 I assume if you cap the frames of a game, a 4090 would be more efficient reaching 144 frames than a 4070 would be if that makes sense. It's easier for a body builder to lift heavy dumbbells' than it would be for a teenager. Easier for a Lamborghini to reach 150 MPH than a typical car. This would be the logic but I haven't seen any comparison videos or stats.
@@AwesomePossum510 have you heard of undervolting?
As someone who uses my desktop PC for gaming and home office work, I highly appreciate the increased focus on energy efficiency. While I'm not sure of the impact on menial workloads that don't require full CPU usage, I do take it into account when buying parts (CPU as well as GPU etc.). Especially here in Europe, where energy costs have exploded since February 2022.
Blame your government for not knowing their place beneath Russia's boot.
hmm what happened in february 2022 🤔 /s
@@TheRealSykx A certain nation decided to invade its neighbor who was not a threat in any way at all
@@techietisdead not a threat at all...but definitely put NATO troops at the border and didn't expect anything to happen...crazy. Asking for it.
@@Pt08020 ..........
Quick improvement idea:
I think if you keep the charts at 9:00 in a constant order instead of sorting them by value every time would make them much faster to read, especially with many of them in a row.
Agreed! Either that or 'Freeze' the tested product at the top and sort the rest by best to worst.
Bigly agree or even BOLD or HIGHLIGHT the main product being tested.
I like the "best performer" sort order, but definitely something can be changed. Others have mentioned fixing just the review product or bold/colouring the entry.
If bold/colouring is used, it would also be nice to have a second entry for the nearest price competitor (at time of release)
The charts all through the video are impossible to follow and therefore useless fluff. More data doesn't make it better when it's not easy to consume
Or maybe add a vertical Line at the End of the product talking about to quickly compare the other ones with it. Comparing for excample the top with the bottom one is really Hard without a grid in One way or another
why do these reviews never test simulation speed? like stellaris, civ6, hoi4, total war end of turn times (maybe even football manager?)... It's one of the main strengths of this cpu pretty much no one knows about.
There's a test that exists where a 5900x is put up against a 5800x3d. The 5800x3d destroyed the 5900x in stellaris speed, simulating 25-30% more days in the same amount of time. If this happens with alot of similar games, then this CPU could be a gamechanger for alot of people, but no one seems to explore this further.
I still see people recommending non x3d chips for these type of games, because of "higher single clocks"' and I think alot of people are still making this mistake. Hoping to see someone test this extensively someday, but here we are 1 year later, and still no one has.
edit: please stop saying "no one plays those games" FM23 & civ6 alone make up of 110k players rn on steam. That's not counting any of the total war games or paradox games, or any other simulation/grand strategy game... (which is probably another 200k+ total)
Fr The games i care about with these CPUS are never done
because most players are not playing it.
IDK
@@whateverfitshere might wanna take a look again at steamcharts. very wrong take
Definitely. My biggest CPU bound game right now is flipping turns on Civ 6 when the game gets late.
Just on the graphs: it would be nice if you could highlight the reviewed CPU's result on every page. It jumps around up and down (since the graphs are ordered by performance) and because each graph has short screentime, it's really frustrating to try and quickly see the position on each one.
Up
This. Also, don't shuffle them around. Keep the same order, then its much easier to compare even it the graphs are only shown for a short time.
This! The graphs are always terrible, LTT has acknowledged they're terrible and said Labs will improve them, but why can't we have some QoL improvements while we wait for the step change?
this is why you are part of a failed society, cant pause the video but want things highlighted for you. You are pure laziness exemplified and a brokie
they totally should!
One big thing to take from this is that the 5800X3D is still an elite CPU, holding in with this and there is no need to rush to AM5 and upgrade everything until prices settle and we're forced to.
Interesting. I'm currently looking into upgrade my PC for Starfield, and I have an Intel i5-7600K. I want to switch to AMD, in part, because I like how AMD doesn't force you to buy a new motherboard after ever CPU, so in that regard, I feel like I should go with AM5 because AM4 is on its last generation of CPUs. But I'm also on a tight budget (if you couldn't tell by the fact I still use a i5-7600K). But I'm also in a hurry to build a PC that can play Starfield at 60 FPS.
Maybe my other option is to buy used AM4 parts, that way, when I'm ready to upgrade in the future, the difference in resell value isn't so large.
@@ex0stasis72 Digital Foundry did a video on Starfield that may interest you. They claim there probably will not be ray tracing because it would be basically impossible do to all the calculations in the game. They explain why but it will certainly be a CPU-limited game as are all bethesda games. So I imagine you really could build a computer under $1000 and run the game at fairly high settings especially if you use 1080p.
@@Legendaryplaya thanks! After giving it some thought, I'm probably going to go with a used AM4 motherboard and the 5800x3D. Buying a much cheaper used motherboard solves my concern about it being the last generation on AM4 because I can always just resell it used again for not that much less later.
@ex0stasis72 nice. If it was my money, I'd grab a 3060 ti/4060 ti for DLSS, and it'd be a smooth 60 fps.
@@Legendaryplaya Good choices. I already have a RTX 2070 Super, so I'm not as much in dire need to upgrade just yet. But I am often CPU bottlenecked in CPU bound games, which Starfield is very much be.
I personally really appreciated the performance per watt cost graphs since they give a good estimate of the proportional difference in cost of ownership. Thanks Linus, Team and Labs!
it skews even more drastically when you add AC/cooling costs to keep your room at a confortable temp!
They just missed talking about idle power while watching videos and stuff, Ryzen CPU's are notoriously horrible at that while Intel has it figured out.
@@TheFPSPower stock yes, but one could always throttle the voltage a bit
@@TheFPSPower It's literally like 5 - 10 watts, total system. If you bought a $500 best of the best gaming CPU with 8 cores that can do well at productivity to leave it sit Idle the whole time, that would be stupid, and you could just turn it off, or set it to sleep after inactivity. I don't know why people keep bringing this up, 1 hour of productivity or six hours of gaming will displace 2 weeks of the difference at idle.
The only point this argument ever makes is people like to grasp at straws to promote their favourite brands.
it's not just electricity savings, a low demanding Watts cpu will also allow to run a cheaper cooler, with less temperature and less noise and longer longevity without the need to repaste often.
I really like the comparison of power usage and running costs between different CPUs, it's not something I've considered when building PCs in the past. Would be great to see this in GPUs and the full PC build videos as well.
nice avatar. Also, I always try to build a really power efficient pc.
If you are worried about $50 over 5 years, you have problems.
@@FriarPop I save way more than that, but how would you know. You don't know how I use my pc. What my electric prizes are. You just pulled a number out your §§§.
@@tshcktall even if it’s 10x, you have problems if that’s your financial undoing.
@@handlealreadytaken I just don't like wasting money, when there's a better way for me, that's it. Nothing more. Why do you guys even care so much, what other people prioritise?
You miss an important bit with Factorio. It's a game about automation and scaling out. While it's capped at 60 tps, no matter how powerful your system, it *will* dip below 60tps once you get a complex enough map. The better scaling means running bigger maps before you hit "TPS death". Would be interesting to develop a benchmark that tests how complex you can get before that happens.
maybe they could ask chatGPT to write a script that automates the game in a predictable way to reliably ensure growth happens equally every run, then just test for when the TPS drops.
Unless factorio has procedural generation (random maps, etc). I've never played it so not sure on that part
@@SurgStriker Yes, the maps are procedural, but actual gameplay is quite deterministic. A better test may be a single mega-base that strains even the latest gen hardware and see what the UPS (updates per second) falls to. I'm sure there are plenty of folks in the community who would love to share their multi-thousand SPM (science per minute) bases as a benchmark.
@@SurgStriker The maps are procedural, but that's easily solved using a fixed seed. No need for chat gpt, since the game logic is implemented in Lua. So it would be quite easy to write a "mod" that tiles out parallel construction segments at a set rate using the dev-mode infinite source / sink stuff till the game grinds to a halt. With the fixed rate expansion, that lets you plot performance as "time till TPS drop".
@@SurgStriker ChatGPT isn't some magic spell that can do everything for you.
@@NightKev anime girls aren't magically gonna take your virginity, but here you are with this profile pic
That power usage on the 7800x3D is mindblowing. It would have been enough for AMD to retake the gaming crown decisively, but they did it with a part that sips power. Freakin magic.
@@wasd____ you don’t buy a 7800x3D for production. You buy a 13900k or 7950x.
@@wasd____ buy a 7700x.
@@wasd____lmao, it's still a killer cpu and just looking at charts doesn't do it justice at all.
Intel and AMD build good CPUs, the cash here is the selling point. It has like double the cash of similar CPUs, so 10% less GHz really isn't the point here.
@@wasd____If you're losing money over 5% performance lost, upgrade you concern troll.
@@wasd____ well if a person couldn't afford a dedicated fast cpu then their "production" does not need one.
As a UK viewer (and someone who generally tries not to waste power) the 7800X3D is clearly for me. I had a great experience building a GPU-less AM5 7700 video editing system for someone recently.
Ah yes the 7700, I chose that for my build specifically because of its power efficiency (UK too)
if you cared about power consumption, you wouldn't buy a 7800X3D to begin with, you would get an i3 or a console
@@sophieedel6324 you only buy a console if you only want gaming and nothing else. It does not beat the usage of a computer. Also, i3s are not necessarily less power usage than these chips and why sacrifice massive amounts of performance for that anyway
@@sophieedel6324 then you wouldn’t get an i3 or console either. You’d get a laptop if you truly cared about power consumption. Efficiency matters a lot. The fact of the matter is that those i3s won’t be more efficient than a 7800X3D; you just lose so much in performance and power consumption is barely different. You could also get a 4080 and power limit it to kill consoles in efficiency.
@@sophieedel6324they simply don’t provide the performance necessary for everyday tasks and gaming in one machine. An i3 is great value, don’t get me wrong, but I think in power per watt, the 7800X3D is really quite amazing. Just unfortunate about the clock speeds.
This CPU with the extra cache is an absolute monster for World of Warcraft up to 30% extra frames
thats good to know.
Star Citizen too
I think MSFS also likes the huge L3 Cache
World of War craft? 😂
Yep... Just the older 5800X3D was performing insanely well with WoW. Miles ahead other higher core CPUs. This new 7800X3D will do even better.
I love how Linus never misses any opportunity to mention how AMD abandoned X399 lol
I'm not a person who remembers CPUs by its chipset names, do you mean the Threadripper non-pro lineup?
I'm mad too; 20grand down the drain damn near
@@SyRose901 Yes
@@SyRose901 you could just look it up considering you're on the internet
@@jakesnussbuster3565 True, didn't think of that.
Still happy 5800x3D being up there in the top representing AM4!
It's insane we went from Ryzen 3 1200 to 5800x3D on the same socket... just freaking insane.
What MOBO btw?
2 mobo (B350 -> X570) and 4 CPUs later, I still rock on AM4
@@heliosaiden i went from 2600 to 5700x3d. It essentially doubles the gaming performance it's so crazy. Games that heavily uses the cpu runs buttery smooth, i'm seeing crazy gains in dyson sphere program, stellaris, and beyond all reason. all done on my 7 year old gigabyte ab350 gaming 3
Finally! I'm so happy that you are starting to consider the energy usage of CPUs and the price they truly cost in a longer run.
This. A lot of people (me included) like to maintain their pc parts for a little more than a couple of years
Especially with such a big difference. If you're not playing the highest-fidelity games, you can still benefit from spending a bit more today on this thing than paying for it in energy for the next 3 years or so
That power consumption section is a little misleading though. The 13600k pulls more power in productivity workloads but it's faster. Also, they didn't include idle power consumption numbers. Your CPU spends the majority of the time at idle, and AMD CPUs pull more power because of the I/O die.
I think it's time for a energy consumption testing in different workloads, similar to how laptops are tested for their battery life. Idle, web browsing, 4k playback, gaming (this will be hard, need some sort of average load). It will be harder, cause components need to be tested more in isolation than in conjunction like in laptops, but now that they have the new lab... ;)
Hot take: a difference of maybe $100 in total energy cost between CPUs amortized over a 5 year presumed lifespan is not a consideration. $20/year, less than $2/month, is literally just maybe some random loose change from your pockets. It's nothing you'd ever even notice or think about. It doesn't matter.
I am so glad that you finally started adding performance from games like Factorio! As someone who plays a lot of automation and 4X games. Those are very important metrics to me and many other automation/4X enjoyers. It is also very difficult to find reviews that include them, they all just focus on frames per second in big AAA games.
Agreed. I don’t necessarily play factorio (but want to try it) but i do play games like rimworld, they are billions and recently was gifted satisfactory. And i do enjoy city builder/management games also so it is nice to have some mention outside aaa
Yep exactly, I'm looking to upgrade just to improve performance in games like Stellaris etc.
@@Austynrox You should try Oxygen not Included and Dyson Sphere both VERY good.
Also, heavily modded Minecraft tech modpacks. Those are very CPU intensive.
Maybe even civ6 turn time test could become a standard imo.
The power efficiency has been incredible on the 5/7800X3D chips. Truly ideal.
Agreed, i thought the days of air cooling was behind us on flagship gaming CPU's... I was wrong. I'm actually blown away!
The Ryzen 7000 parts are pushed to the limits, so naturally, when AMD has to reign in some of that power due to voltage limitations on 3D vcache, you get a very efficient chip. I just don't think any of us were expecting it to be THAT efficient. It's insane.
It's not efficient when it takes longer on productivity tasks and doesn't really idle any more efficiently. You lose all those savings and more in the extra time it takes the processor to get things done.
@Winston Deleon that is a moot point though, this chip is marketed as a high end gaming CPU that has the capabilities to perform productivity tasks with the vast majority of buyers focusing on the former and not the latter. For those type of tasks the 7900x or 7950x would be more feasible
@@spacepioneer4070 air cooling is far from behind, good air coolers have never been behind AIOs.
i just picked up a 5800X3d, started with a 3060ti and a 3600, when i upgraded my middle monitor to 1440p it left just alittle to be desired, got a good deal on a 3090 but then the 3600 was a HUGE bottle neck. Like my FPS in games like farcry 6 straight up doubled switching to the 5800x3d.
I’m having that issue now but with Intel I have I7-8700K and had a 2080 base. Then I upgraded to a 3080 TI and now most games are running sluggish. Cyberpunk, SCP, Assetto Corsa
@@InertGamer your CPU is too weak to play modern heavy games like Cyberpunk. If you get a intel 12000 or Ryzen X3D chip, your gains from 2080 or 3080ti would be huge.
Is it possible for all the graphs representing the cpus and their relative performance, that there could be a highlight of sorts for the two CPUs being compared? So maybe a red "highlight" around the 7800X3D, and a "blue" highlight for the CPU that costs the same or is priced to compete so that the viewer has a more intuitive feeling for the difference in the charts? Otherwise it feels like just a wall of numbers, and I usually like watching these not because I am going to buy them, but for entertainment. So a simple visual indicator for where they are on the list would be cool!
Totally agree. I like how they go through the graphs quickly but with a little indicator of what’s what I wouldn’t need to pause the video as much.
I'm so happy about this focus in performance per watt. It's such an important metric that's been overlooked for far too long. I truly wish ltt keeps talking that much about it in future reviews. Not just about CPUs, but anything.
I cannot help but think this absurdity of power consumption nvidia and intel been pulling lately as partially a result from the community and reviewers on average not caring much or at all about efficiency. But raw performance and to hell with the costs.
There's only one more thing that would make me even happier: measuring and caring about idle consumption
I agree! But I would take it a step further because it skews even more drastically when you consider AC/cooling costs to keep your room at a comfortable temp, especially with summer coming up in NA. Very few like to game while sweating.
Performance per watt is pretty much the ultimate measure of who has the best CPU architecture. It's fine looking at the benchmarks, but if the results are very close while one company needs a ton of extra wattage and heat output to achieve that result, I'm obviously going to pick the more efficient and cooler running chip if the prices are anywhere near comparable.
@@marcuscook5145well one thing about it though is some architectures can't consume too much power before they ultimately will fail, like ryzen, and something like 13th gen intel was made to be able to consume very high amounts of power.
All in all I wouldn't say it necesarily makes one architecture superior over the other, but I think it's always a better foundation for improvements if you have lower watts consumed to supply the cpu sufficiently overall.
Just look at how the old 5800X3D is keeping up with these new CPUs in games. What a chip!
@@dukejukem413 The PLATFORM (AM4) it runs on is very old - September 2016 - that is ancient in PC times.
@@DKTronics70 its just a socket .. the platform is the chipset and its evolved over the years . and ddr5 and pcie 5 are not worth the extra outlay at all
Yeah, and I’m still trying to get one without obliterating my bank account lol
@@Malc180s oh its older in computers years come on enough excuses
@@DKTronics70 the platform/socket yes. The 5800X3D was released in March 2022....so not old.
I placed an order for a custom built PC today and these two will be my GPU and CPU, couple with 32Gig DDR5, 6,000Mhz RAM. Can't wait to get it!
Super happy to see the 5800X3D still very close in performance, gotta thank AMD for blessing us AM4 peeps.
Bought mine yesterday because I didnt want to spend 300€ for a itx motherboard again
got mine a month ago on sale and happy i did. it should last me awhile. AM5 is still too much money
Just got one today but forgot to flash the bios lol. Anyways its great value cpu
The 5800X3D gets beat by 20-30 fps in some titles to the 7800X3D. I was shocked, i haven't seen a generational performance increase like this in a while. ruclips.net/video/TVyiyGGCGhE/видео.html
@@spacepioneer4070 what he meant was the 5800x3D is about the same as the 7800x3D in 4k gaming. The 7800x3D according to the charts, and the video you linked, only have a noticeable gap over the 5800x3D in 1080p. Even at 1440p the gap shrinks tho. And in 4k the gap is negligible.
Glad you guys actually showed the cost savings related to the efficiency. I think that speaks volumes about how great the solution is and if AMD actually spent more time trying to make the top end line of chips behave the way we'd expect they should, then it will be very hard to recommend anything other than AMD solutions.
Having the top CPU only matters because we consumers say so,
@enriqueamaya3883too true 😂
That's the thing AMD will never take risks or move ahead development thats why Intel/Nvidia will always be the best.
@@JBlNN I would argue that they've taken a major risk already by challenging Intel when they were so small, now they're much larger and growing fast, I've used both Intel and amd before, just got a new amd cpu after using Intel for 7 yrs amd I'm excited to see the differences
That breakdown of expected power consumption was a very nice metric. Please continue adding that into reviews!
I just love when gamers have a simple solution on a CPU.
7800X3D is just great, doesn't use a lot Watts, is fast and decently cheap.
Just a perfect choice without thinking to much.
What amazes me the most is that power draw. My 5800x3d draws ~55w which is really nice in my quite small and compromized matx.....they managed to cut out 15W! from that with more performance. Now I seriously concider a SSF build as now any low profile cooler will do very well and the options opens up a lot while still having a quiet mashine.
Thank you for mentioning performance per watt. With how performant these chips are it's now become a very important factor in my purchasing decisions.
yea the analysis with performance per watt is very good. I think he should have mentioned how you also wouldn't need as beefy a cooler for the 7800X3d vs other top performing chips. That's means a smaller PC and/or more quite PC.
Power companies. Continuously motivating us to make better choices. It's for the environment, m'kay?
I don't get why they even show it in a review of gaming CPU, 15$ difference in a year is nothing. Is it not normal in USA to tip delivery driver 5-7$ on each order? So like don't tip 2 times a year and you are done.
They still messed it up tho :/
No idle power consumption.
@@GrzegorzMikos its certainly not nothing, every dollar counts, also they showed what I would actually consider to be very minimal use, if you were a heavy gamer or left your PC on for longer periods, you could be saving dollars in the triple digits with this chip as opposed to say a 13900k, which is already crazy when you consider that its cheaper than the 13900k and on similar or better performance for gaming.
Also, it's just good to note all the strengths and weaknesses of any product you review, like how this particular chip isn't a great investment for pure productivity
I loved seeing the estimated power costs during ownership. Really made me think about it and realize that the extra cost of the 7800X could easily be justified (plus less heat during summers (or all year if you live in Florida like me)).
The difference is is even really small in the US.. i was shocked how extremely cheap your power still is. In Germany we went up to about 65 us cents per kWh. The difference over a few years can easily be hundreds of Euros.
@Duglum666 honestly never thought about it that way, like as of right now I still live with my parents (not a leach only 16) and I just get the best thing I can afford, totally didn't think power was a factor of cost, even though it's long term.
@@Duglum666 where in Germany, that used to be the case but now in Hesse it dropped back to about 38 cents per Kwh
@@Duglum666 the 65c were during peak, shouldn't hit that again, and most places are coming down quite a bit, seeing contracts here in Austria for low 20c range again, it basically doesn't matter long-term for home users (also energy subsidies).
The extra 100W of cooling load on the Aircon will easily add up over a year. Not only you have to pay for the extra 100W every hour, now you have to pay for your AC to get it out of your room. The 3D chips are a no brainer, and a fantastic value for money.
This 3D cache advantage reminds me of how overpowered L3 cache was when it got introduced. I did a little test between an Athlon II x4 620 and a Phenom 720. Parked both of them down to two cores and same clock speed. The L3 cache gave me up to about a third more fps iirc. It would stand to reason that more of it would be better when we keep improving it.
Also that was then, L3 cache (or larger cache vs. smaller) has a bigger impact later in a products life cycle. That's why the phenoms stayed relevant way into the core-i era. That's (part of) why the intel X58 platform is kind of usable for gaming today. The other part being triple channel.
these x3D parts will have some absurd longevity, not in maxFPS but in minFPS/frametimes.
@@andreewert6576 Indeed. the minimum FPS or 0.1% low or whatever they call it, is something I use as a general guide when it comes to capping framerate. My current gear for example handles Deep Rock Galactic at 1440p at roughly 120~ish fps, but dips down to 80-90 regularly. I decided to cap it at 100. 10-15 fps drop will not bother me, but 30-40 will be felt. Smooth gameplay is paragon for chill experience, in my personal opinion.
@@Hr1s7i in any games that has a lot of things going on on the screen all at once, its very desirable to keep FPS consistently high even if you have to cap it to avoid the frame dips, which creates way better smooth gameplay. with X3D CPU, the 1% lows are soo high like never seen before, which just makes gaming experience butter smooth, and if you can tune it to be almost locked at certain high FPS, that is golden...
I really love the energy price graphs! Would love to see this more often for gpus too, since as a brit it's defo becoming more of a concern when looking at hardware and nobody does real world data of actual power usage
I did hope to see the 13700K or 13900K there, would have been quite shocking :O
Hey you stole my username
@@Psythik 😯 Psythix's unite! The important question is has anyone called you "fish sticks" on voice chat because they don't know how to pronounce it?
Fellow Brit here and I completely disagree, the cost of electricity is beyond negligible. Even using their price of 42p a kw if £22 a year extra running costs is enough to convince you then you shouldn’t be gaming on a pc.
@@oxfordsparky sure if you take a single component by itself it might be a small figure, but even then if you're paying even £10 a year for a 2fps increase is there point in that extra cost? No. But without performance per watt info it's all speculation
You should also try benchmarking CPUs on 4x games like Civ, Stellaris or city builders where they matter the most.
we asked for this on the other 3d cpus review, I guess they didnt care or read it
was thinking this myself watching the video. no clue if the videogame heavy or productivity heavy CPUs work best for the videogames that are productivity heavy :( would love to see 4x benchmarks
4x games don't care about 3dcache, 13900K was faster than 5800X3D already
I just wanna see a high end CPU try to play stellaris late game on high speed.
@@badjackjack9473
With AMD you will get the same exact performance out of 7600x as out of 7950x. It will be very similar story with Intel's i5 and i9 if it's the K type CPU.
Another really cool thing about 7800X3D being so efficient is probably if they decide to put it into mobile platforms. I can imagine 20 watt desktop replacement for field work processing.
Can't wait for Ryzen HX3D series mobile processor (if AMD want to)
They would invariably have to stack a _monolithic_ die with 3D V-Cache first before it would be attractive to mobile use. Desktop Ryzen 7000 CPUs will consume 20+ watts in idle, and almost nothing of that is due to the cores (which might take 0.6 W).
7800x3d on mobile will be hella efficient to gaming but very inefficient when idling, so battery life will still be poor
But would you see the advantages since notebooks get lower power GPUs? You can't generally upgrade the GPU in a notebook, so you are more likely to be GPU bound and waste the CPU horsepower, right?
DESKTOP AND MOBILE CPU'S ARE NOT THE SAME.
do you really think if they could make the desktop part have that much performance at 20w they would LOL!!!!! WTF!!? U S E YOUR BRAIN!!!! the people in this comment thread are the same people on the steam forums saying their games won't run. ...."buuut i has a I7" lol
Now all we need are consumer desktop dual AM5 socket motherboards, so that you can put your interactive tasks on the 7800X3D and everything else on the 7950X.
Live in the UK. Those cost of ownership metrics really frigging matter. Been waiting for this type of analysis for a while now and am so happy to finally see it. I would MUCH rather give up 5/10% performance for, in this case, a new component to save significantly on cost of ownership. :)
This is going to be huge for me, i live in Alaska and energy is actually pretty cheap cause of hydro, but nobody has A/C so my room gets toasty quickly in summer with a 300+ watt system running for long sessions. i was thinking of getting a 13900k, but that would literally have doubled my wattage output into my room. this new chip if its running at 100 watts which is the same or less then my ancient amd FX 8core chip i have now. holy cow im going to be going from 32nm to 5nm tech, this is going to be a massive jump. im holding off though for at least one reviewer to use it in Arma 3 and DCS and compare it to intel as those are my most played games. exshily in arma 3 intel has always had a huge lead over amd just because of optimization and single thread reasons. but that dosnt matter if it comes at 200+ watts more cost.
Definitely relevent these days. Features like Radeon Chill where it limits your FPS while you're idling in games can actually save you very noticible sums of money, as well as buying efficient hardware. I miss the days where you didn't have to consider these things but here we are..!
7800X3D is a no-brainer in the UK. Price difference will pay for itself in lower energy costs.
This is why i wait for an entire line of products to be released/reviewed before deciding on a purchase. I’m on a 5800X right now. No need to upgrade my PC right now other than a video card, but it’s nice to know where things are headed. I built it in 2021, so it has a lot of life left in it.
The way Linus completely obliterated Paulo Coelho 🐰 was something to behold
indeed, I cringed a bit
As a Portuguese speaker I paused the video and go back just to hear again hahaha Linus destroyed the pronunciation hahaha
Of course not his fault... but it was funny! 🤣
@@1ZeeeN I'm Romanian, and even I cringed hearing his pronunciation.
indeed
@@1ZeeeN same xD
I’ve been waiting years for the PC to finally release efficient CPU’s. The 7800x3d seems like the real deal. Nice job AMD!
I just don’t understand how Intel has had a few years to update their CPU’s after Apple released their super efficient M chips, but have done nothing for efficiency.
Intel has a 62.8% market share so they dont need efficient chips. 62.8% of the world love Intel and keep buying Intel. AMD only have a 35.2% market share so they need to work much harder than Intel and produce better products than Intel if they want to grow bigger than 35.2%. The CPU company with the bigger market share will always be complacent.
Apple uses an entirely different cpu architecture (Apple is ARM, while AMD/Intel are x86); x86 chips will probably never come close to ARM. ARM has many of Its own issues relative to x86, so ARM releasing into the mainstream desktop market is not happening anytime soon. Comparing ARM efficiency to x86 desktop chips is not a fair comparison by any means.
Idk what is does, but Escape from Tarkov seems to heavily benefit from the increased cache in the 5800x3d, and the whole community went crazy over the massive performance increases over any other chip on the market, even with older 10 series cards. I can only imagine how big of a jump this is in games like that
Oh Boi we're not ready...
same thing for Star Citizen, it love X3D chips. glad AMD are coming through for us again!
AI computations benefit the most from having a bigger cache (see how massive the gains are on the graphs for Warhammer 3), I've never played Tarkov but I guess there is some heavy and possibly non optimized ai logic running in the background.
@@frale_2392 you can play tarkov on 4K with this chip and in good quality, i'm already running tarkov on 4K high, fsr on ultra quality with a 5800X3D and i have stable 130 fps. the new 7800x3D is gonna be a monster too,
clueless, tarkov is dead because of the cheaters
In MSFS it doesn't really matter which so far released 40 -series GPU you run - but from x3D you get 10-20% more fps lol So instead of 50 you get 55-60 - which is a lot.
The only thing I learned from this video today was that It's segue @0:53 And not Segway.
The performance of basically every mid to high end CPU is so good, the choice of CPU almost doesn't matter for most consumers. For the more hardcore audience, it's best to pick one that is more tailored to your preferred games. You really can't go wrong though.
As someone who really wants the stuff they buy for their pc to last, I’m always looking to get the absolute best I can for the price I get it at.
this is the real answer
it really depends what you are looking for.
The performance is good today for most of them indeed. But as Linus and others have said, buying a better CPU today helps you not have to upgrade it (along with mobo and potentially RAM) when you upgrade your GPU in the future, especially with higher-end GPUs.
@@GiorgosKoukoubagia yep I'd argue AM5 itself is a good investment, regardless of the CPU you choose. Should be supported for many years to come. That's why I bought a Godlike board and 7950X3D. Pricy, but I'm set for a long time.
running 7800x3d for now and in desktop it has just 35-38 Watts usage (and its a gaming PC) GPU takes around 26W its an 4070 also undervolted to max. So total around 61W power usage in total in desktop, that competes even with some gaming laptops now!!!
PS: using Samsung pro 990 for 2 Tb and DDR5 5600 Mhz.
Got this PC just 2 weeks ago for 1200$ used and I'm happy!
I think the reason the 7900X3D performed so badly with dual CCD in Factorio, is because game mode probably didn't kick in for the benchmark. And as such the CPU probably didn't recognize it as a game, which would have parked the non v-cache CCD.
No, it's because it has 2 CCD's, each with 6 cores on them. So When one CCD is shut down in favor of the cache, it essentially becomes a 6-core CPU with V-Cache.
@@PQED well, yes.. But to determine when it should switch to v-cache only ccd, it relies on windows game mode to determine if a game is running. If game mode doesn't trigger, it won't know which ccd to prioritize. And my guess is that the fsctorio benchmark doesn't trigger game mode as the benchmark doesn't seem to run like the game normally.
@@parasitex I'm aware of how unreliable (and incompatible) Game Bar is, and that's why it's ridiculous to employ it in this fashion.
7:08 I love how Adam’s in this bit because back in the first AMD extreme tech upgrade video he really wanted a 3DVcache CPU but missed out on the timing
I will be involuntarily dreaming of Rabid Adam for years to come.
I really loved seeing the pricing for using the chips in California prices. I don't live there, but I super appreciate the graphs, as I find that incredibly important when choosing between the options and saying "well I have the money, so I might as well get the more expensive chip... right?"
A 7950X3D costs around 300-350 millirentmonths.
The "Bar Nun" graphic/pun was so good I don't even need to watch the rest of the video.
Glad to see power efficiency and TCO making it into these calculations.
I put my hamster in a sock and slammed it against the furniture.
Man, it's crazy how basically every graph was showing triple digit fps across the board... Doesn't seem that long ago when these types of graphs were in the 60-90 fps range. 😅
I never thought I would see a 5800x3d out perform a 13900k in any game 😮
now consider how much less power it draws = how much less heat it puts out into your cooling system = how much more graphic card will have headroom (may even be able to get more from Video card due to better thermals!
@@airBornFpv I know. It's painful to watch.
That is of course gaming but in other operations of the PC the 13900k will do better but the reality is the difference in power draw and the real world performance over all for most everyone the amd is still the better overall.
@@jamesmackinlay4477 well yeah - hence “the best gaming cpu” 😄😘
Boutta upgrade from my old trusty Ryzen 5 3600 to this "little boy" of a nuke of a CPU. I am financially responsible I swear
did you do it? i'm in the same position with the 3600 he's getting a little worn out these days
@@giffya yes I did it, it is a wonderful beast, a significant upgrade. Sadly though, you need a whole new Mobo and ram :/
Might be worth it to wait until Ryzen 9000 and 7800X3D drops in price?
I just upgraded to this from a i7 6800k and it's definitely significantly better. I paid around $530 for a CPU, motherboard and ram combo from microcenter.
Great contextual content, keep calling out all companies in the space for their "smoke and mirrors" PR BS. Good and clean data and info ❤
would love to see some software improvements that would better assign the cores on the 7950X3D to appropriate workloads to close the gap
You can deactivate the cores manually but u still paid 200% more for the same Chip then
The 7950X3D basically becomes a 7800X3D when you turn off half the cores in the BIOS. Hopefully they fix it with software updates so it's always like that without turning off half of it.
You mean processor affinity..? Cause that's been a native windows feature for some time now.
Interesting
100% agree about the delayed launch of a better cheaper product.
If there is a defence it’s this, the 7950x3d performing worse seems to be a software issue (firmware, drivers, or OS). It should be best of both worlds rather than worst of both worlds.
I’m sure original expectation was best of both worlds, and they didn’t adjust their plan when it didn’t pan out that way. Maybe theirs firmware updates in future.
Yea really hoping they improve the schedular or the parking some how for the people who have the 7950x3d that wanna use it for production and also gaming.
Yeah if they get the scheduler perfect it will be the ideal CPU for streamers or people who game and work on the same pc
@@achillesa5894 Yea currently with some games i play there are scheduling issues unfortunately where the cpu doesnt fully utilize the 3d cache by about 30-50% so it can be like a 20-40 fps loss which really sucks -.-
They delayed it to sell people more 7950X's as they new that the 7800X3D would be *THE CHIP* to go to for most people.
The 7950X relies on Windows game bar to schedule itself so good luck...
@@griffin1366 i know how it works lol ive got one. A lot of people chose the 7950x3d for productivity and gaming purposes....
70 bucks for a screwdriver???
It's a really nice screwdriver
Still no one needs a 70 buck screw driver
@@codykyle511Whats the point of your comment then, besides giving something you think no one wants or need more attention?
@@codykyle511 he rich sooo
Bought one for my dad this year, I'd say it's well worth it. Buy the screwdriver and a couple bit sets and you'll never need another screwdriver or another bit set in your life.
I finally jumped in when 5800x3D hit $319 and I had point on Prime so it only cost me points. I was already running 5900x and I popped in the 3D chip and gained 20 fps on the same game, same setting running at 1.16v and switch out for an air cooler. 3D is surely efficient and cool as long as you run PBO2.
It's a straight up beast. Exactly what I expected, really a BIS chip and worth the wait for AM5 platform, especially after some time for boards and memory to get cheaper too.
Definitely amazing at certain games where it use extra cache for it's max potential like in WoW and other. Very nice!
I'm buying that for emulation and avoid CPU bottlenecks for years after I buy it. And also the upgrade path. Planning to keep this AM5 motherboard for years
I could be wrong, but I was always under the impression that emulation performance relied on high clock speeds as opposed to tons of L3 cache. If that's the case, x3D chips wouldn't be ideal for emulators since their clock speeds are significantly lower than their normal X version counterparts. Someone correct me if I'm wrong though!
@@Benefits: it's always about what the program is doing. A SNES game (plus the structures required for emulation) will probably fit easily into L3 cache, but a PS2 game won't.
I wish LTT would test VR performance. Typically, clock speed is really important.
I love my 5800X3D. I will be staying with it for quite a while. The thing that I feel when playing with it now is the reduction or removal of a lot of micro-stutters. Load hitching doesn't happen as it used to as well.
I do feel it in Cyberpunk 2077, it's not in the framerate it's in the input response. Turning a corner on a vehicle at 100+MPH on my old CPU would feel sluggish and those little hitches as it tried to keep up with the GPU are gone.
Yes, I am a 1080p player.
Isnt 1080p still the standard?
bruh, why 1080p? come on
Well done in showing running costs. That has always been a factor in pricing my PCs. The excuse "rich people don't care about the bills" wears thin after a while (that excuse doesn't apply to many/most of us) so seeing those metrics is refreshing.
having the 5800X3D and now a 7900 XTX I'm set for a couple of years. I might end up skipping the entire AM5 generation and see when AM6 comes out what both AMD and Intel have at that point. Right now I dont need an upgrade. Not for the next 5-6 years. Still its astonishing the new implementations and innovations that AMD and Intel comes out with. I wonder when AMD will also have a big.little structure and add some efficiency cores, or if they just straight up go 3D V-Cache on all their future CPUs to make them as efficient as possible. They have already won that game.
They are planning out Big Little for Zen 5. Essentially their plan is to use Zen4 with avx512 and some other SMID instructions removed and call it zen 4 dense. This will shrink core size and interconnect size, enabling low power compute zen4 while leaving full fat zen 5 to do the high power high speed stuff.
They already tried a big.little chip.
It's just for laptop/mobile though (2Perf+4Eff). That's where efficiency matters.
For desktop, all the efficiency cores are parked and used only into production tasks.
Also they are going to add 3D cache onto their GPU I hear.
Same setup as you. 5800x3d on a b550 Taichi board with a 7900xtx Taichi. The rig is amazing at 4k and not too insane on power draw so it's perfect for me for the time being. I don't see a need to upgrade for the coming years at all. I plan to use am5 for a home nas build though with a b650 itx board and a Jonsbo n1 case. Will use one of the non x CPUs because of how power efficient they are
Personally I hope this big.little fad goes away. It makes the schedulers too complex.
I just got my 7800x3d today and omg that’s an upgrade from my 7700
I upraded from my 2600x to a 5800x3d with a bios update for $299. Somtimes, the best medicine is to wait on the best of last gen to go on sale lol.
I did the exact same upgrade last month. GPU is a 5600XT on a 1080p monitor so the gains were minimal. Bottleneck Calculator says the 5800x3d is a bottleneck for a 4070 Ti or higher... not sure what GPU i should upgrade to. Will get a 1440p monitor as well. Wait for the 4070?
The cpu we were anticipating and waiting for to see the results after seeing the 7950X3D and know how the 5800X3D performs
I honestly think that long term review of these chips, future forecasting in terms of energy consumption and compatibility need to be an integral part of these reviews. Most people aren't building a new rig every year.
Just did my upgrade.
7800X3D, B650-Plus, 32GB DDR5 6000, and a 6950XT (should have made the jump to 7900XT but whatever). Now, that's component prices I wasn't prepared for, but coming from a i7-6700K and 1080, it was about time, although the old i7 and 1080 still hang.
Damn bro great upgrade
i got literally the same upgrade from my old system to the new one. besides the new GPU. how does it feel compared to your old rig ?
@@razgarorIn general, not much as changed. Gaming, a leap forward, obviously, especially in newer titles like Dartide that gobble VRAM and CPU.
Mainly because I use 3440x1440 34'' monitor, the 1080 was getting destroyed, then the CPU lagged bebind. I was getting 40%, 60% GPU utilisation with the 6950XT, so that needed to change.
In some games, the old stuff worked perfectly well (Back For Blood, old titles...), with some visual reductions. I think if you game in 1080p, you can probably get away with a system as old as mine for a while. Less VRAM, easier on the GPU...A cheaper GPU, like a 6700, as long as you have decent amount of VRAM, imo.
Oh and I've switched to two NVME drives (!TB, 4TB). But really, I had a 500TB ssd, 2TB NVME, and a 2TB HDD, and yes, it's faster, but diminishing returns.
All in all, I could have kept the old system. If you're short on cash, second hand will save you a lot because all that shit is expensive. And as I said, my old pc still works fine and could still game OK. Besides, not many exciting things on the horizon. It's a bad time for gamers.
@@0bzen22What kind of cooler are you running on the CPU now and what what are the temps like? Looking to get a 7800x3d myself!
I'm extremely happy with my 7900x. I initially wanted to wait for the 7900X3D, but I do a lot of recording and streaming while gaming, and I plan on keeping my 7900x instead of upgrading.
Good choice
For game _developers_ the 3D variants don't make a whole lot of sense either. We're getting top-tier but only just shy of best performance in gaming but gain a massive improvement in the productivity department. Compiling and loading and editing take up a whole lot of time out of my daily schedule and I'd gladly give up even 15% of fps 5 years down the line for the massive gain in productivity.
@@gozutheDJ yeah. I was using the AV1 encoder with my 4080 but there isn’t a whole lot of support for AV1 encoded videos yet. So I use Nvidia NVENC H.264 but I’ve been wondering about trying out x264
You would probably be lucky to get the 7800x3D in 6 months lol - so it is not necessarily a bad choice.
@@gozutheDJ is CRF similar to CQ level? I have CQ set to 14.
Had on my AM4 board 2700x, now rocking the 5800x3d. And 32GB 3200mhz RAM.
Thank you AMD. 🙏
bro whats your gpu atm?
@@omargosh6412 Its the 4070 TI.
The power consumption story is missing idle and web browsing or watching video values.
If the 3D chips consume more power in idle or low performance tasks that might also be of interest for those that use their computer for extended periods of time with small bursts of power.
How much more? 50w more? If primary use for the PC is browsing, x3d CPU is a wrong tool for the task.
@@ShersGarage Sure but many only need the CPU Power for short bursts. And why not get a x3d cpu if you like to game on the same computer as well.
I found this video which gives a better look at efficiency: ruclips.net/video/JHWxAdKK4Xg/видео.html
@@asldfjkalsdfjasdf doesn't look like there is that much difference in the power consumption. Maybe 5GBP difference? My take is, if you you plan on using the PC mostly for gaming, get the X3D. Otherwise it is a waste of money. In some cases non X3D CPUs are just as good or better than X3D counterpart.
Dude is a great seller. You stare at a graph that shows you barely any differences, and he still manages to use words like "win" and "lead" to shine some light on single products. No one can sell me, that you will notice 10 % gains at 200+ FPS.
True, but when you are buying a new cpu why not spend 50€ more for a much better cpu?
@@baboka2616 50? I got my 7600x for 195,00 and the 7800X3D was around 350,00.
Nice Work Team LTT! I especially appreciated the efficiency cost/KWh comparison. This is a significant factor in the northeast US where its over .42/KWh.
I'm running the 5800X3D with a 4090 @4K and I'm very happy with it! Definitely going to let me get a few more years out of my AM4 setup!
You have the 100% of gpu usage with dlss quality or have a little bottleneck like 1440p?
@@Hugosanches594 I have the "same" setup and at 4K 120hz there are very few games where the 5800X3D will bottleneck the 4090. I see some performance loss in Witcher 3 and in Hogwarts, mainly because of low CPU utilization/bad game optimization. These games would have benefitted from faster core speeds than the X3D chips have.
In Hogwarts a quick .exe tweek mostly fixed it and in both games DLSS/FG made it pretty much a non issue.
(Running around in Skellige in Witcher 3 I had to turn of Frame Generation for a while though, because of some bug.)
In 1440p there are several games where you will get a CPU bottleneck, but that is true for all CPU's currently available when you have a 4090.
@@randomguy- Hogwarts is pretty new, so it may get some optimization with updates.
I don't think anything for you can last a few years when you upgrade every 1-2 years.
Yeah, I'm switching all my 3600's to 5600X's, and making sure I have at least B550 boards. I just got an ITX B550 last week to replace the ITX Fatal1ty 450, and the last 5600X arrived on Friday. I'll be able to recoup probably 75% of my expenditure.
For the performance graphs I would love for them to be color coded so like the product that is being reviewed have a different color to differentiate from the others
I'd also like them to highlight the product currently under review to easily pick out its position.
I'm all to holding companies feet to the fire. But the push back for AMD launching the 7950X3D and 7900X3D 5 weeks before the 7800X3D is just too much for me. It was announced at the same time and included the MSRP so everyone knew they only had to wait a little more than a month and how much they would save. Those who need the bleeding edge or can't control themselves really just need to look in the mirror if they have regrets today. All major review sites (at least the ones I saw) said they expected the same performance with the 7800X3D and to wait to buy.
Four months later it wouldn’t matter. I just bought a 7950x3d and got a free motherboard. It’s equivalent to 7800x3d plus a motherboard, only with 8 more cores.
@@Cooe. with the latest chipset driver the non-vcache core parking is automatic.
Your inclusion of the energy efficiency over time makes me really happy that you invested in that lab. Power draw is one of the main reasons I refuse to get any current Gen NVIDIA product, Even though I really really really wanted a 3070 TI. Will most likely buy an arc a770 when that's on sale.
lmfaoo magine caring about an xtra couple pennies in ur bill this mf boycotts nvidia 😂😂😂😂 go hug a tree while i enjoy superior fps
Do you happen to know that Nvidias 40 series cards use less power than AMDs 7000 series cards?
nVidia 👑
The 4070 (released after your comment I realise) is very power efficient. I'm mid-life upgrading my current computer with it and a 5800x3D.
4070 sips power for how good it is
It would be an added value for this video, and I know, more time to test things, to see the difference in gaming performance between 7800X3D and 7950X3D while streaming at 1080p with a reasonable number of programs open for streaming (elgato stream deck, xlr, RGB controller, etc).
I think it is a pretty common environment nowadays, and useful too see if the extra money for the 7950X3D is more convenient than building an entire PC dedicated to streaming the video output from the gaming PC (which, it most probably is), or if the difference between the CPUs is neglectable.
remember that the drivers effectively shut off the other 8 cores on the 7950x3d while gaming, so it would be purely clock speed making the difference and i cant imagine the extra few hundred mhz would be worth the extra cost for that application.
The X in the name= expensive
Nah, it'll go back down after awhile.
Tru
Xdd
What about Xbox series X 0_0
@@thatguy3000 I doubt it this time. These are going to be paper CPUs with AMD putting the ccd’s in the Ryzen 9’s first
Literally just got the Ryzen 7 7800x3d from microcenter with a bundle deal for $267.12 ❤🔥
so lucky
Currently happy with my 5800x3d and 6800xt for 1440p. while this cpu is obviously faster, I think ill wait things out to the 8000 series.
Honestly I would like to see benchmarks in CPU Intensive ports like Witcher 3's Updated version (with the comment that this was tested as of this point and CDPR may eventually fix it)
I wish you'd include the 5900x or 5950x, just for some perspective for those of us running a fairly recent-ish AMD CPU.
5800x3d vs 5900x benchmark video on youtube. RUclips it brother
I'm still rocking a i7 5820k CPU and it was a 6 core processor that you guys hyped as one of the best choices long term even though there was faster chips that were still quad core.
Surprised to see how well my 5800X3D holds its own still next to these newer chips, and it most definitely isn't slowing down my 4080 so I guess I'll be holding onto it till the next CPU generation 😄
It's not even a year old, bud
What are yours temperatures,my goes to like 90 and i dont like that...
@@Forke13 its not current gen. no one thinks its old ffs
@@Petar98SRB my 5800x3d is super inconsistent. it idles around 39-44c and during gaming it likes to sit around 50-60, and jumps around 70c regularly before going back down to 60ish c.
and cinebench makes it go to like 84-85.
You may want to check your thermal paste application/mounting, or reexamine your current cooling solution.
@@Petar98SRB X3D chips do get somewhat warm yes, but 90 is definitely on the high end. Gaming temps average 60-65c for me, cinebench load caps around 80-83c usually. I would definitely double check your thermal paste application and cooler mounting pressure, or possibly your case airflow setup.
The main takeaway from this video is that the word is not spelled as Segway but as Segue
47 anos para ver o Linus citar Paulo Coelho com um sotaque '' perfeito'' hahaha, muito bom!
tava bebendo uma água e engasgei de rir kkk
Aonde ?
Paulo corrilo hahahha
@@leoblaidd 3:16
Acheeeei hahaha
The problem with the 7900x3D and 7950x3D simply need microsoft to fix windows so that it can run the games on the 3D cache CCD, much like they had to do with intel's P cores vs E cores.
Windows fix it's scheduler and memory management?
BAAHAHAHAHAHAHAAAAA😂
@@FLMKane Yeah, a really silly request 🤣
Im a 5950x owner, but don’t you just point microsoft xbox game bar to see a program as a game, then the games runs on the cached chiplet?
it's been 2 months and it still isn't fixed safe to say it will never get fixed.
"Raises some big questions like-
What the F***?"
Exactly my reaction when I saw the 7800x3D bullying literally everything in most gaming scenarios. I've gone through numerous reviewers and seen the same results. Amazing.
If you heavily play a Unity game (say, VRChat or Escape From Tarkov) then the extra cache of these chips vs Intel's is gigantic. Unity hits cache a lot.
What makes you think it's unity? A sample size of 2? No, cache is good for anything that isn't optimized well.
@@Dyils True! There's not been a huge amount of Unity-focussed testing of these. VRChat is a good general stress test of Unity on BIRP though given you can throw a very wide variety of content into it and really stress the engine out in high-load instances. The hypothesis is that it's a Unity quirk, something that has been mentioned by VRC devs, due to the reality of Unity games essentially being injected assets and scripts into a core engine that can only do limited build optimization compared to bespoke engines or Unreal. I think the same finding was made by Tarkov players: see here ruclips.net/video/Gi8X99f70D8/видео.html
To clarify: it's good for lots of games, but I think Unity games in particular are friendly towards high levels of cache.
Great to know as a VRchatter looking for a new system! Didn’t know that
@@Dyils Cache is good for anything that needs to access the same area of memory very frequently. It has nothing to do with poor optimization.
I have the 7950x3D and play vrchat. I’m not sure if it’s worth to downgrading to 7800x3d for the simplicity and being on one ccd, can someone help me?
I think improving the 7950x3d is just a matter of software.
All AMD need is a better scheduler. And looking at the numbers this kind of one die cache is going to be the future.
So let's wait a few months and driver update can improve it even beyond 7800x3d
Nope. Infinity fabric latency and having to "guess" at which core to use in the multi-die setup is always going to be a weakness that keeps 7950X3D from being what it should. AMD played customers with a chip they knew was no better than a cheaper offering coming shortly after.
Strange how everybody cut some slack for Intel designing a CPU with different cores that needed a new OS to work, but AMD get flak because Windows can't allocate a RAM-heavy process to a core with lots of cache...
just use process lasso and make a personal CPU set, with PBO and its instantly faster than the 7800x3D. I've already tested this.. the issue is windows scheduler not being set up for gaming unless you use windows gamemode but that hinders some performance.
@@mitch075fr It struck me as odd as well that Intel got so much help from Microsoft to make their product work, but AMD has to rely on Game Bar of all things? It's not reliable in the slightest.
AMD really does need to work on their own hardware scheduler though, but I don't see how it excuses the above.
Nah not true. It needs hardware thread scheduler like Intel has. AMD just cheaped out as is usual for AMD. Never touched their trash products cause they pull of cheap crap like this.
I have 2 monitors and do gaming/streaming and light 3d modeling. 13600k has had no issues and no lag, super quick. I feel like most modern cpus are all overkill for 95% of users.
PSA: it is possible to run your non x3d chips power limited to achieve pretty much the same watt per frame as the X3d variants. You will fall a little short in cache limited scenarios, but you will get pretty close. It barely even impacts performance on all core loads.
I second this on both of my ryzen CPUs I was able to get a solid undervolt whilst maintaining higher then stock performance
can't wait to get one in 4 years!
I really would like to see an improvement with your charts, especially if there are so many. It's hard to focus on them without pausing the video a lot. Sometimes low values are better, sometimes high values are better, nearly every time the order of the CPUs changes, the spacing between the lines is different each time. The transitions between the charts/graphs could be better I think. I don't know how you could achieve that because I am not a layout designer but you definitely could do something here. That would be great!
I got the 7950x3d since I game and work. Paired with 128gb ram and 4090. So it's a good balance for me.
Had it for about 9 months
Would have loved to see the 7900x in the comparisons. As of writing you can get one for $430 (admittedly at heavy discount, but multiple retailers have the same price), the closest price comparison out there.
You can just take a look at the 7700x results. They perform similarly in gaming, and because games don't usually benefit from the extra cores, in some games the 7700x performs even better than the 7900x by a very small margin.
the power efficiency of that 7800X3D CPU with that epic performance makes it an instant buy for me :)
because people buying $500 CPU care about power efficience, stupid shill
all I can say is again, thank you AMD for releasing the 5800X3D and keeping my X470 platform still up and running :)
Mmm, I only recently built a new PC with a 5950x and a 3090ti... the thought of having to get a new motherboard and really rebuild the PC from near scratch next time around when my last PC lasted 10yrs... its pretty annoying and not something I want to do at all.
I'll see how long I can hold on I guess... but with all the UE5 advances, I can't imagine my current PC'll keep pace in 5yrs time. Meh.
@@Quasiguambo ??? I got a Ryzen 1700x, built it at release, thats 6 yeras ago. I can just replace to a 5800x3d and the same system will work for years. Do you expect the 5950x to only last a couple??
Just bought a sealed in retail box 7800X3D for $340 from local ad about a month ago. Then found another 7800X3D listed for $300 new but with no box. I snagged it for $260 for my son. I was only shopping for a used 7600 but the extra $100 was worth it. First time my kid has the same PC spec as me CPU/MB/RAM wise at least.