Fun fact: Factorio devs mentioned in one of the blogs that they optimized game code so much that it's mostly bottlenecked by cpu cache rather than clock speed. Also ram speed boosts performance too and by a lot!
Labs are definitely showing their value, great review. I think you should also include idle power draw. It can be a big deal for some use cases and sometimes people may find that their systems are drawing more at idle than they should (had that happen recently and it turned out to be a driver issue), so it's useful to know where approximately they should be.
People are noticing high idle power draws because they install crap like animated wallpapers and 10 game launchers, or they have multimonitor setups with high refresh rate. Or because they enable the high performance power profile in Windows. Tldr: too many factors for idle power draw today, since there is never a real idle. What you see drawn from the CPU is not what you measure from the cord. I use a 15 min sleep with wake up via usb (key/mouse). So It uses a few watts to power the memory and that's it. And for the commodity to not have to push the button manually, by just pressing the mouse to power on.
Usually idle power draw is around 5W whatever platform you're on At least, mine draws the same with 3600 and 5800x3d. Note that the 3600 never puts to sleep its cores, while the 5800x3d will run just 2 cores/4 threads if the workload is light enough and the others are almost literally turned off
You can't measure the whole system 's powerdraw for a review tho: different ANY component will massively change your results, so the possible combinations are too many to test or even just list.
i'm interested in the idle power draw of the non X SKUs. I've seen a few people note 60-80w idle power draws on 7000 series compared to their intel systems.
Short Feedback on the new Performance Graphs: They are very nice and clean but it is a little confusing how they are ordered. You seem to randomly order them either for average or for 5% lows but never consistently. You have graphs where some CPU is in the Middle by it's average, directly behind you have one that has lower average, but higher 5% lows, after which you have one that has higher average, but lower 5% lows, which makes it hard to quickly see, which CPU is actually faster. Both ordering methods, average or 5% lows are totally viable, but you should choose one and stick to it to make the graphs more easily readable, especially since you show them for a limited amount of time. That means pausing the video constantly is a necessity to take in the information acuratly, which makes it harder to follow the script while those graphs are on screen. Other then that, Lab seems to really do a great job here!
The graphs are completely unreadable blurs on Mobile at 603p, Plus the tests seem irrelevant for anyone not playing those particular games. It was really easier back when everyone used synthetic tests that were designed to give realistic results across all CPU generations since the 1981 launch of the PC platform. Famous tests that would still be relevant include LINPACK, Whetstone and 3D-mark, each test may need updates to deal with new inventions by the 3 chip giants.
I was going to comment on the same thing. Maybe get some inspiration from Hardware Unboxed. Their graphs are also often packed with information, but theirs is a bit easier to digest.
@@johndododoe1411 if they made their graphs readable at [360?]p it would look like granny zoom to the majority of their audience. Change the quality and wait for the frame to buffer if the graphs are that important to you. Only using synthetic tests that were relevant on the first PCs would be more useless to the average viewer than testing even one modern indie game that nobody plays. Get some perspective
THANK YOU for finally including Factorio benchmarks on your CPU reviews. I think CPU benchmarking in industry reviews are way too focused on AAA GPU-bound action games instead of processing-intensive strategy/sim games. Titles like Factorio and Civilization 6 are very relevant for people shopping for the optimal gaming CPU.
We need more benchmarks for productivity as well. Everything focuses on AAA gaming at 8K with full RTX, paired with the best DDR5 and a 4090 overclocked to blowing up. That's not my use case and the data unhelpful.
Looks like they made the 7600 with the same idea that the 3600 played out to be. I really like it! I'm still using R5 3600 and don't have any intentions on upgrading soon. I'm still running at stock but when it starts to wear down, I'll simply need to OC a bit to match what ever GPU I have(currently a RX 6650 XT).
Actually, what they did is CPU binning 101. They just stockpiled "golden" 7000x CPUs for months and are now selling them for the 65w lineup. Genius move.
If you're amazed at this performance and power draws you should check the work they did with the Steam Deck's APU. I got a lot of hope for their future mobile end chips.
Upgraded from 2700X to the 5900X recently and even with its eco mode capped at 65W it's a massive improvement to me. So I will probably wait for the next huge efficiency jump and another increase in thread count.
@@Barrysteezy You should ponder these questions for a while, if you can confidently answer them then you’ll have an easier time making a choice: 1. What chipset is ur MOBO, and is it one that supports 5000 series? 2. Do you plan on upgrading to an entirely new system in the next 5 years 3. Do you plan on doing any creative work or CPU-intensive work such as live-streaming to twitch while gaming? These are all things you should consider before spending any moneys on upgrades
@@Barrysteezy I recently got an RX 6800 but had an RX 5700 before that. One thing helping a lot with the GPU was enabling resizable bar in the BIOS. Honestly though my setup is pretty screwed by PCIe limitations since I'm using 8 lanes for some SSD adapter card. Therefore my GPU is limited in bandwidth (only using 8 lanes) and I only have PCIe 3.0 with my motherboard. However it's still fine but I assume with PCIe 4.0/5.0 I could easily get 5~10% more performance in games. The reason for upgrading was mostly to get a newer feature set with mesh shaders and ray tracing pipelines (I wanted to do some graphics development with that). However the CPU upgrade still made a lot of sense to me because I'm compiling a lot of code during development. So this adds up easily when I need to draw less power now while it takes less time as well. The most reason why I didn't went for AM5 is the pricing while my old mainboard (x370 chipset) got a BIOS update which enabled Ryzen 5000 support (even with more than 8 cores). So the upgrade was much cheaper and the benefit of Ryzen 7000 over 5000 is not that huge to be honest. But if you consider it, I would wait for the reviews of the 3D cache chips, especially if you care most about gaming. There's still a chance that they won't perform as perfect during launch though since they rely on software optimizations to utilize their stacked cache most efficiently. So hopefully reviews will cover that part.
where exactly do you see improvement and did you really needed it? 2700x is still capable of any task, you just wasting money for higher numbers in benchmarks.
@@teamfishbowl1076 mine is going above 95c with aerocool verkho2 (2 pipe cooler) and there was R5 1600x (95w) never going above 60c with the same cooler. I don't understand that.
@@kliibapzyeah its a bit odd. prime95 on am5 7600 for an hour with prism cooler will easily hit 90c+. my ryzen 2600 with the dark rock pro4 can do 3-4 hours of torture tests on prime95 and never go above 65-70c. i want to test the dark rock pro on the am5 and see what happens, but either way the 7600 runs NO WHERE NEAR ICE COLD.
ICE cold? They run around 93c with stock cooler thats 200F enough to cool meats easily. Not sure I would use ice to describe something that can cook steaks.
If you are still on AM4 mobo, the 5800x3d is a great cap filler without the need of new mobo/ram/cpu and it seems to be staying on par with the next gen CPU's
Well, i dont wanna be the karen today but the 5800x3d is better for content creator or ai people but more cores could mean better performance even though a game only runs at 8
yeah, i found recently that my b450 mobo has a bios update for 5xxx series... and i'm running 1600 so i'll probably save a buck and just buy brand new r9, and i don't even have to buy other parts, yay
@@TheEpxMaster It's realistically the same since the AMD motherboards are twice the price of Intel's. The 7900 sounds like an pretty mediocre deal, when you're spending that much on the AM5 boards and DDR5, you're probably better off with a 7900X3D when that comes out, because chances are if you want a board for a 7900, you're not an average user and shouldn't be cheaping out
@@TheNicePIXEL I most certainly will continue to use the vastly superior cpu's. My task is to INFORM all the newbies of the LIES spread regarding the 5800X3D , I hate when newbies waste hard earned dollars
So the 7600 is the new budget king, the 7700 is DOA, and the 7900 is a power efficient mid/high-end monster? If only Radeon could deliver like Ryzen Edit: changed the 7900 from mid-range to mid/high end
Budget king will be the 13400F. Being BLCK OCing is becoming more common on boards at good prices. a 13100F overclocked will even give the 7600 a run for the money.
@@MiGujack3 I was thinking that too, AM4 is dead end but will still be relevant. Z690's are very affordable. B660's are plenty enough for a 13600k or better. Mortar Max+12400F is where it's at right now. 5.3GHZ and 13600k performance in gaming.
@@PDXCustomPCS 13400F is still based on alder lake cores so gaming performance will be far behind 7600 and 13600k and only about the same as Zen 3 5600x which you can get for $135. You can see at 7:10 the 7600 is a bit faster than even the 13600k in gaming, it's prettymuch impossible for the 13400f and 13100f to match 13600k and 7600, if you OC them you can also OC the 13600k or 7600.
@@halocubed6788 I built a 11 liter Dan A4 H2O case with a 3070ti and a 5800x3D about 2 weeks ago. It has 60 degree temps for the GPU and the CPU never goes higher than 70 degrees when gaming. It has been running great and I overall love it.
Hey just wanted to say thank you for the channel. I used to be a system builder back at the turn of the century and into the sli Era of gaming. I'm slowly catching up with the new tech. So glad to see some other enthusiasts make these videos.
All of the X version chips were available for $10 -$20 above these new chips between Thanksgiving and Christmas. I picked up a 7700X for $339 from Newegg. Currently, the new chips are a great value but early adopters of the X series did just fine.
I went with the 7700 and have no regrets. I got 32GB of free DDR5 from Microcenter. I did a direct comparison against a free mobo deal they were running on Intel 12 series, and made the decision to go with the new platform instead of the tail end of an old one. I don't feel bad about it. And I'm happy with the 6800xt I paired it with for $500 after seeing the latest gen video cards.
I paid $30 more for a 13700k, z690 for $150, and carried over ddr4 b-die ram I bought two generations ago. A more normie 32gb ram kit would've only been $70 anyway
The performance per watt is insanely impressive. I can imagine it being useful for say a university environment, where you might have 100's of machines working on students engineering projects or similar. You want the performance but also dont' want to use too much energy.
@@colinkirkpatrick5618 pression boost override. Its a setting in the bios that lets the cpu boost to its maximum capabilities until it hits 75C (for older ryzen) or 95C for AM5 ryzen.
I was planning on skipping the 7000 series, but price and TDP of the non-X chips definitely has me reconsidering. If only the 600 series boards and ddr5 ram weren't so overpriced, then it would be a really easy choice to upgrade.
@@NinjaForHire bad troll. Wasn't funny. Nobody but you laughed. Cus an integrated gpu beating my dedicated gpu is more of a 1 in 9 billion chance. Get back to me when integrated steps up thier game *mic drop*
Wow, those are impressive thermals for the 7900! Think it might be going into my next build (which will be SFF)! I want to see what the X3D line this time around does compared to the 7900 and the 5800c3D
7900 will be great for building a quiet entry-level workstation at a very reasonable budget. This is pretty exciting. I think productivity-wise for devs, this is a great chip.
@@youtubewatcher4603 every recent cpu released has increased ipc to the point where something like a second hand i7-8700k is actually a bad purchase rn, you really dont wanna go below 10th gen even if you gotta go entry level to stay on the newer platform unless you wanna deal with microstutters on older cpus.
DDR5 128 GB ram (4400 Mhz is the highest stable config) costs like $800 DDR4 3600 128 GB ram costs $450 and has the same performance 7900 sucks for virtualisation workstation builds or CAD simulation workstations
@@Haskellerz sucks for virtualisation? does that mean VMware machines? that's my use case. please respond with what I should go for. I know I don't upgrade a lot coming from a 4690k but I like the idea of having the latest and greatest. was almost sure to buy a 7900 but why would it suck for vmware?
This is quality reporting. Coverage like this helps real people filter the marketing hype and holds manufacturers to account. Nice job LMG 👌 And nice job AMD!
EDIT: What about idle power consumption? That's an important factor. Not sure if it's being considered. Apparently there is a big idle difference between the two brands? 13:10 I like building efficient, quiet, and low heat output air cooled systems. The Ryzen 9 7900 specifically is look'n reeeeeally good! 88w max power use for that type of creative multi-threaded performance. 😍
I'm still using my first CPU, R5 2600. Love the direction AMD have taken with value, which means I'll be looking at potentially going AM5 when I upgrade probably next year. (Gotta let those prices drop a bit more). I can only hope GPUs will follow at some point
Even if you could find a decent priced 5600 you'd still see a decent performance jump over the 2600. Then when it comes time to upgrading you would have a decent base system to sell as the 5000 is the last series of the AM4 platform. I had a 3700x and didn't need to upgrade but still did (5800x) and honestly did not regret doing so one bit.
What's the issue with the 2600? I have a 3700, I know it's zen2 and 8core but at least for many home workloads it seems pretty good. Games will probably not run much better unless you have a really great GPU like at least a 3070 or so, and the same goes for blender renders or whatever else you might be doing for fun. I would only upgrade if there's a clear advantage in your particular system combination and your use case, and then also only if you need it. So many people play on 1080p 60fps but want 4k120 for no reason. If you play battlefield, will you really be better at 4k120 or notice the visual difference enough to warrant spending $1500 on all that stuff? I upgraded my whole system a while ago (3700, rtx2070, 32gb RAM) and my secondary PC with a 1050ti & i7 4770k runs most things well enough to not really care during gameplay.
@@LuLeBe It's not an issue but it's just a less mature processor. AMD made huge improvements between 2000 and 3000 and even bigger up to the 5000 series. Memory optimisation was one of the biggest changes and in some scenarios you can see a massive uplift in performance as the way the infinity cache works relies heavily on memory speeds and latency. I upgraded my 3700x to a 5800x to take full advantage of my RTX 3070 and triple wide 5760x1080 setup. I get a solid 60 fps now in most games whereas before I did have quite a few frame time issues as well as 1% lows dipping. You have to remember that everyone has different uses for their computers and what may seem like a small upgrade can turn out to be a big one for some people. That, and AMD have made huge improvements between each generation of Ryzen. It's not like the old days where Intel only improved by a few percent each generation.
I always wanted an R5 2600! Of course, my upgrade got delayed substantially, and I got an R7 7700X for christmas. Best thing ever. Good luck with the upgrade! I'm still using a mid-tier GPU...
@@blunderingfool that's basically the upgrade I made. Depending on what you play and the price you could find the 3d chip at it's a bit equivocal. If you are happy with performance right now it's not worth it. If you play things like flight sim in vr or really logic heavy games like civ a lot though it'll be a great upgrade without needing a whole new pc, especially if you sell the 5600 and get a good deal
sir Linus good evening, I'm Thomas E. Berdon from the Philippines...my rig is already complete but except for the CPU...if you ever any spare to give-aways that you never use it anymore... a AMD athlon 3000G or AMD ryzen 3 3200G it would be awesome...this is my rig I built it on my own...an old micro atx computer case, aA320M-S2H gigabyte motherboard, a 2X4 kingston beast 2666Mhz cl16 ram, a walram 128Gb M.2 ssd for OS, a ramsta 128Gb 2.5" ssd for storage, a wd blue 500Gb hdd for storage and a DVD rom drive...that's it.....I always watch your videos on RUclips as you assemble it one by one until you done with it...I learned a lot from you sir...thank you very much for the time that you read this....
@@fynkozari9271 why? I just care about the power consumption. 1kwh is 1€ here so If I go for a lower end B660, I cannot safely assume it'll run the next gen of chip, meaning that the 50-70€ I save on not going x670 means I gotta redo half my build and spend another 150-200 bucks I'll go x670, the 50€ saved on a 700€ investment is not worth the compromise
@@roqeyt3566 Even A320 motherboards can run the 5800X3D. I don't think future compatibility is really an issue on B650. There's a reason B650 is so expensive. They come with good VRMs and I/O. Buy only what you need and want in a motherboard. No point having a X670E if it doesn't have the features you want.
@@dalzy25 you technically can, but how many mobo's from first gen got bios's for 5th gen On top, many mobo's show weird behavior (sluggishness, microstutters, performance loss) if they're too low end or old, as they were designed for their generation of chips and not the future ones. Not only that, power delivery, especially with a 200w+ socket, becomes much more important compared to the old limited AM4 socket. So while you could, it's a question of should you. It'll be a pain in the behind if history is anything to go by
@@NinjaForHire LOL. AMD is a corporation too. They don't give a flying F about your damn wallet. They only released these non X chips because their X sales and new platform sales were low. It's almost as if the average user didn't like the idea of spending all that money on a new everything for a chip that stupidly ran itself factory over clocked to run at 95c. Funny how as soon as AMD became the better choice for CPUs. The X series was all there was. And the prices went up! But in a fanboys mind. AMD AND IT'S SHAREHOLDERS ARE LOOKING OUT FOR MY WALLET!
@@NinjaForHire if AMD cared the 7900 XT wouldn’t cost so astronomically high. Don’t skimp on a company. Any company, let alone a billion dollar company
I'm actually super interested in the non x chips after watching this purely because of that incredible thermal performance, I live in Australia so almost the same performance with incredible thermals, I'm genuinely going to consider a new build
It would be great if you could include the idle power consumption (e.g. on Windows Desktop) in future tests as well. Like you said in the video, energy costs are higher then ever. When I'm using my pc I'm usually browsing the web or listening to music the same amount of time or even more than I'm acutally gaming.
Besides, I believe that it was AMD graphics cards that had unusually high numbers on idle. Including that in the tests puts pressure on the manufacturers to fix whatever issue they might have.
Intel have lower power usage on idle. The CCD chiplet design is inherently more power hungry on idle/very low loads About the AMD Gpu: the memory doesnt downclock properly in idle when using multi monitors (of a differing hz rate so ie 165hz main and 60hz 2nd like I have) and it consumes 30-40watt constant in idle. (in my case on rx6800)
@@joshjlmgproductions3313 If your GPU pulls 60W at "idle" then your GPU never actually goes to idle. You prob have your windows misconfigured, set the power setting to balanced instead of high performance, it doesnt have any real impact on game performance anyway, its just a waste of energy. A 2080Ti should pull around 10-15W at idle.
@CompilationHUB since 2017, yes they have, rx 7000 series isnt as efficient because its not as powerful as they expected, but its not a hardware flaw, they need to fix the drivers
I got the 7700 non-x because not only very little performance diff that will be noticable but that 65w tdp in an SFF build is a huge bonus for thermals
it's nice to see optimization to reduce power consumption and thermals instead of just throwing more watts at their chip like Intel and Nvidia have been doing for the past few years
To be fair, a lot of people forget that Intel is still on a 10 nm process node while AMD is on a 5 nm process node. This is what accounts for the current difference in power efficiency. We can look at the Intel roadmap to see that they plan to release 14th gen processors in the second half of this year. These are on the 7 nm node, which should greatly reduce power consumption. Similarly in 2024, they plan to release the 15th gen, which should use a 2 nm process. This is slated to compete with the AM5 release in 2024 which should be using a 3 nm process. So, we should expect the Intel 14th gen to be a large jump in efficiency even if the designs are just scaled, though there are also likely going to be some other improvements. This should help them compete with the non-X AMD cpus. Then, we are set up for another head-to-head in 2024: Intel 15th gen vs Zen5. It is great to see Intel potentially back in the game and helping to spur more competition in the CPU space. In recent times, Intel has been stagnant, but now, they are showing signs of moving back into the competition, which is great for us consumers.
@@YounesLayachi I know the current gen is 10 nm (Intel 7), I said 14th gen is 7 nm (Intel 4) and 15th gen is 2 nm (Intel 20A). Thanks for watching out, though.
Happily surprised to see Factorio there as a game to compare at; no idea how that got on your radar but I look forward to hearing more benchmarks with that game, maybe with the UPS/FPS on a 500+ hour mega base since that's where it can really matter :D
Actually the ram speed thing changed on Zen 4 there’s no longer a penalty for desyncing infinity fabric and ram. That said Zen 4’d memory controller generally can’t handle much faster than 6000 but the 6800 used on the Intel bench is way more expensive
Meanwhile I'm satisfied with my 5900x, but it does make good expectation for the next gen (8000) when I'm more likely to upgrade. I eagerly await seeing benchmarks for the x3D variants of these chips.
Got a 5900x too but I can't seem to get the temps under control, 85C when gaming no matter what the hell I do. I'm not used to having temps that high, feels uncomfortable.
3:09 Whenever I buy a CPU one of the first things I do is check the amount of cache and L2 cache available on the CPU. Having a little extra space for your computer to store stuff in the CPU cache can go a long way. The moment your PC needs to reach out to memory is the moment you start to see less performance, granted it's not usually noticeable.
6:24 This is one of the moments when having some more clear documentation and dedicated explanation regarding the Labs SOPs and testing methodology, not necessarily directly in the video but either in a stand-alone video or a link to a forums thread, would be really helpful. It’s no doubt hard to get a high quality testing lab put together, especially when there is a well known focus on developing testing automation. However, when there’s been issues with previous tests that don’t get caught before the video is published and then get at most a pinned comment, it makes me cautious when I start hearing about “strange data” that was discovered which for all I know could be human error but is immediately passed over without much explanation regarding how this result was validated to be accurate. Reputation is so important, and LTT as a brand has often made transparency a prominent feature, so please continue to apply that to the labs as they develop so that detail oriented viewers, the primary target of labs level content, won’t view the labs as having a reputation for letting errors slip by or having a murky methodology more focused on obtaining results quickly over accurately. To be clear, I love the content and don’t believe that there is any issue per se in this video, but hearing that line and no follow-up immediately made me remember other errors and think “well, they’ve been wrong in the past so I’ll just check GN instead” which is not the feeling I want to get out of labs related content because I’m really excited about what the labs have the potential to do. I am looking forward to seeing more content making use of the Labs capabilities and how your testing evolves in the future!
Great to hear. Still planning on waiting 2 years for my next upgrade but LOVE hearing there is a much cheaper option that is still quality. Here's hoping we get the same on gpu end as well
It's actually a better cpu then the 7600 or the 7600x lol and a much cheaper mb and ram so yeah great deal. I got the same cpu. It's a no brainer right now if you need gaming performance
@@sumedhtech1526 price to performance no, but there are more affordable options like 5600x and 5700x although if your building a new pc I'd go for 5800x3d and some 3600mhz RAM
Considering I bought a 7600X just last week I should probably feel some buyers remorse but since it was on sale for 230€/$ I am actually very happy that I got it a day before AMD announced the new chips since it's now back to the MSRP
I got the 7600x last week. It's 10 euros more expensive than the 7600. So a pretty negliable difference, just as negliable as the performance difference.
@@00vaag correct me, but i think 7600x doesn't come with a cooler. And 7600 does. So depending what cooler you got it should be a bit bigger difference.
As someone who was just about to upgrade their SFF build to a 5700X, the 7700 is definitely the way I'll be going (plus it's an excuse to get an all-new platform and shiny new things)
probably the right decision while more costly now, you aint stuck on a EoL socket with no further upgrades but rather a new one that just started shining. Plus new RAM that can only increase in quality which might be a real gamechanger in the future
Which is also currently extremely expensive. You'll need to buy new RAM anyways so why pay a lot for underwhelming new gen stuff when the old one works just fine
Just upgraded to the 5700x. Verry Happy with the performance gain and the undervolting and oc potential. Will last me a long time (still on x370) but shiny new new tech is shiny new tech
I am excited to upgrade to an R7 7700 for a hopefully cheap price in 2 years. My 3700X with it's low power consumption is great and I am hoping to get more performance for the same power consumption.
I'm loving the multiple approaches to computing , feels like the companies are specializing for different needs. It will be very difficult to tell who is moving in the correct direction. I commend AMD for squeezing raw cores as much as possible and Intel for going through a period of trying new things with cpu gpu and storage.
The MSRP for the 7900 is $449. I got my 7900X on Amazon during Christmas time when it was $449. So I'm still extremely excited about it!! got a good deal if you ask me. Especially when you consider that the 7900x went back up to $549
I find it weird they increased the prices back up. I thought they would keep it down since the x3d version are coming out, and ppl were saying the X version weren't selling well too.
@@Mewzyc I was surprised too tbh. I think they will drop the prices once the X3D chips actually come out. They're probably gonna milk every cent out of the MSRP of these that they can.
Holy shit, a factorio benchmark!? This is amazing! X3D appears to be amazing for Factorio, which will almost certainly factor in to my next CPU purchasing decision.
I'm glad to see these well-priced options showing up, but do keep in mind that the X CPUs are selling way below their MSRP right now. For the past month, the 7950X has been $550-$570 on both Amazon and Newegg.
I am so glad to hear about this. This makes me less reluctant to upgrade, because I don't just want my system to draw more power in order to be faster.
Alright, so while the big CPUs are the ones that normally draw a lot of attention, I have to say I'm very impressed with what the 7600 is putting up in performance. For gaming, in most titles it's putting up FPS within about 10-20% of the 7900X despite being $320 less. For productivity, it's within spitting distance of the 7600X for $70 less. If you're not building a high-end rig, it seems like the budget tier is fantastic this go around.
I was watching that chip the entire time too, super high clocks, low prices, low power draw, low temps. Low in all the right places and high where it matters
@@johnsherby9130 I'm hard-core interested with the low Temps. After all, above all else, temp kills. Anything that stays cool generally will not die by abuse. If it barely hits above 50 when going full swinging, then that means it'll be exceptionally resilient. Beat it, boost it, aim your hairdryer at it for the shiggles, but it'll still be artic cool compared what the rest of the big boys are pulling. I'm sold.
I very much agree. Just because it's not at the top of these charts, doesn't mean it's bad. It's still going to be great at productivity and gaming when compared to older CPUs, or having no PC at all right now.
Useful video! I like that AMD start to work on their power efficiency. I want to see some APUs with more CUs! A power efficient AMD APU with 8 or 12 RDNA2 CUs would be a true competitor to NVIDIA GPUs for some gamers. Btw, the part from 13:41 sounds like it's recorded with a phone. 😀
Picked up the 7600x for $240 while it was still available at that price when AMD announced the non x and x3d cpus. I'm hoping AM5 will be just as good as AM4 and adopting now also means ready for new features and getting the most life out of the socket 🤘🏼
Do you actually update during a sockets life tho? I bought Ryzen first gen and actually ended up swapping Motherboards, not CPUs, as I went from ATX to ITX.
If you think those temps are crazy, the E3-1240L v5 in my NAS generally sits at +2 degrees above ambient. Its hottest in summer was 33.33 degrees (32+ degrees ambient) and coldest during winter was 13.6 degrees (ambient often dropping to a low 10-11 degrees). Oh, while I'm shamelessly flexing, it as a SilverStone KR03 Heat pipe CPU cooler in it. Flex over, great video as always guys.
Loving the new layouts for your data! It made it very clear which data was what and made it easier to take a look at at a glance without pausing. Fantastic work by your graphics team!
I recently upgraded my CPU from a 1700x to a 5700x. Absolutely love the difference in performance and with the added benefit of it running at 65w aswell!
I got the 5800x like one month before the 5700x was released. Having CPUs with over 100w of TDP has been always a nightmare for me. Fortunately I was able to do some nice undervolting with it, so it never reaches 90w. Still, my biggest frustration isn't not having bought the 5700x, but the 5800x3D instead.
I wasn't planning on upgrading until I watched this video. Thank you for always being excited about value in the PC space. Helps a lot of us make informed decisions! -Wage Slave
Watts per FPS is kind of pointless though. The only thing that should matter is FPS. Would you really want to lose 20% of your FPS to save a few watts? I mean it makes sense for laptops or anything else that runs on batteries but it's useless for desktops.
@@rubiconnn It shows efficiency and helps people who aren't building super gaming computers make an informed decision. It's not the do-all and end-all of CPU choice but very helpful. My computer idles between 150 and 180 watts so I turn it off when I'm not using it because that's a lot of power to just throw away. If I were to do a PC build right now that 7600 would be on my short list.
Ok I'm confused. At 11:14 you show that the 7900 runs at only 40C when running Prime95, and you say the cores were running at 100%, and yet in the other graph (12:17) you show that the core speed was around its base speed. So why did the cores ramp up closer to their Max Boost speed? My expectation was that they would run as max boost speed and only start to thermally throttle when they reached about 95C. Or to put it another way, why didn't the cores run faster given that they had so much spare thermal headroom?
Does this not happen on every release? There's always a price cut on last gen, and non x skus have refined power usage. I've gotten the 1800x, 2600, 3700x all on good price cuts. I always buy when the price cuts happen. The 5800x dropped to like 300 and the 5800x3d to 350, have both, the 5800x3d does wonders for my mmos.
having a 5950x I wasn't looking to upgrade anytime soon, however as someone really into Star Citizen right now, I've heard that the x3D cache is really helpful for it, so the 7950x3D might be my next upgrade.
Same with tarkov. I just upgraded from 3700x to 5900x and everyone who has the 5800x3d gets better frames than me. So I either kinda downgrade but get better game performance or wait for next gen 3d chips
Fortunately I stopped worrying about my PC not being able to run Scam Citizen decently a long, long time ago. My latest build doesn't even reach 60 fps at 1080p on it. And now it's a 10 year old "game", it doesn't even look that good anymore.
@@sovo1212 well that would be because it’s still in an a alpha and is pretty unoptimized currently, I get 1440p 60fps+ so I’m not too worried about really :) but to each their own.
I'm still using my late-2019 3700X 1080p build & it is still chugging along great over 4 years later. I do want to build a whole new system this year (2024) though just for the fun of it, & go for a targeted 1440p lower-draw, no-overclock build using the Ryzen 9 7900 CPU as the heart of the system.
I went with a 5600 recently and am still happy. The base clocks on these make way more damn sense but the overall cost of the build would be pretty high. These come with heatsinks too which may just be because AMD didnt design a better one for the high tdp on the X ones
AMD has 3 tiers of heatsink/cooler for their ryzen range, all 3 are actually pretty decent like im still running my stock cooler for my ryzen 3700x and it doesnt thermal throttle unless being hit with cpuZ
To be fair if you're buying a top end chip, odds are that you would be buying your own decent cooling. Budget oriented chips like the non-x chips make much more sense to have a heatsink included and I actually like that AMD have made the decision to do it this way.
Very excited to see some coverage of the 7950x3d. Got to check it out at their booth at CES and from just playing with it for a few minutes, it seems like an absolute beast.
I'm counting the days! Though I'll probably just get the 7800X3D since I only game on my home PC these days. I really want to go AMD this time round, but the X3Ds have got to at least match Intels 13th gen in games to get my buy-in.
I'd still go AMD just for the upgrade path. With AM5 you will be able to get new processors unlike the 13th gen that use the same socket as the 12th gen meaning you're going to be stuck with the 13th gen processor forever. I already did this mistake back when Ryzen came out. Got a 7th gen i7 instead of a 1700x and now i'm stuck with it but if i would have picked an AM4 platform i would have upgraded that 1700x easily
While correct there is no evidence am5 support will be as long as am4. they only guarantuee till 2025 which is in 2 years already. Buying a 300 $ cpu and upgrading in 2 years time already is monkey business. Also this is assuming AMD is still competetive in 2-3 years... who knows about the future. Buy whats best now. Or matches your wallet. Which might also be am5 if board prices come down.
@@Daisudori Honestly i don't even know what to get. i'm looking to upgrade to either a 13th gen i7 or a 7700x. Board prices don't really matter to me cause for some reason in my country those AM5 boards cost just as much as a good LGA 1700 board so in my case it's just a matter of preference.
Hey linus when watching the graphs of the data, i feel it would be helpful when doing this style of comparison to color code the CPU names. For example with this one, the 7900 and 7900x could be blue and then the 7700 and 7700x could be green and then so on so forth. Just a suggestion :)
Listen, if you dove into AM5 at the prices they started at, you already knew you were paying way more than if you just waited a few months. So then you played yourself. Side note, the 7600 reminds me a lot of the old 2600, and why it was such a good deal compared to the 2600x, at least till the 3000 series dropped lol
this review doesn't really show that those people played themselves.. I mean, the X-series *is* performing better than the non-X. Productivity tasks is just clearly a win for X. But it seems like no matter what: Less power consumption, cheaper performance-ready CPUs, should make *everyone* happy.
That does also interest me. You can boost an 7900 up to an 7900x but I'm sure you can also turn down your 7900x. Kinda looks like those are the same chips lol. The 7900 is actually more expensive currently than the 7900x in Germany.
Are there any specs for idle power draw? If you want a small home server, 90% of the time it's going to be on and idle, so that's where the real power savings come in. Also, we'd need to compare the total power of the system at idle to team blue's offerings to get a fuller picture.
@@rustler08 I'm curious to know what prompted the aggressive tone in your reply? Your initial question was good, asking why I would consider this for a server application. From that we could have had a productive dialogue and possibly shared ideas and learned from one another. But then you shut it down by shooting down my proposal without any context. You have no knowledge of my use case, and therefore cannot make the assumption that it is overkill. I happen to be a software, site reliability, cloud, network, machine learning, and big data engineer with over a decades experience. I am the head of department for systems development at tma successful FinTech. And have a side business getting off the ground. I have multiple clustered home servers for fun, side project work, and learning, running kubernetes and deployed with config as code. Most of the time they run close to idle, but when I crank the big data stuff, or do large compiles, or render or transcode video (a hobby of mine, which I do in software if it's for long term archival) the cluster can run pegged at 100% for hours or days at a time. I am also off grid, so energy matters to me. So the idea of CPUs which are significantly better performance per watt is of interest to me, and I earn well, so cost isn't a significant factor. But I am conscious of my carbon footprint, and so what my devices draw while idle is of interest to me.
I just noticed that the graphs all have the LTT logo in a fading dot matrix. Really cool form of content protection. I feel like a lot of RUclipsrs could benefit from this kind of tech.
the 13600k's score is ronge by a lot and the 13900k is probably hitting a thermel trodelling I am not an intel fan boy but the testing latly is very light in gaming and in general not accert an cosistnt
Perks of being a very late adopter. I just got the 5000 series recently, and in a few years once the MoBo and RAM prices go down, I'm gonna grab one of these 65W beauties
With current CPU's, for the Factorio benchmark, might be good to use the Flame 30K save as opposed to the 10K. Regardless, very happy to see Factorio being used as a benchmark, it is great for single core bench marking.
Making the jump from an I5-6600k to an R7 7700. I can't wait to see what the difference will be like. Im looking forward to way better performance and way better energy savings. I just wish it was priced at $300 US and that the mainboards were more affordable.
For some reason, those much more efficient chips just feel so much more desirable/satisfactory than those that are run to toast.. even if theyre not reaching the absolute maximum performance. The leaps are insane anyways, seeing my just two years old 5600x (which was a massive leap from its predecessor) is already completely blown away by everything else.
@@frederickmiller5492 I need a guide to underclock my 5600. I did that from Ryzen Master and all it did was micro stutter my games because I have SAM on for my overclock undervolt RX 6800
I oced my 5600X at 4.52 ghz boost ( with a NH-U14S) and got 11400 from 10600 in cinebench multi core and 1360 from idk didn't try at stock behavior, peaks at 73 degrees celsius
Dumb question, but I only took one C class like 10 years ago. Can you use intel compilers with an AMD CPU? I remember that one being much better than the open source option at optimizing the run times
@@TennisGvy Yes you can, if you mean a regular C/C++ compiler. A program compiled for a specific platform (let's say Windows 11) will execute on any CPU.
Always love seeing competition bringing better, cost effective, and power efficient parts for people to enjoy doing the things they do on their computers!
This is why I chose AM5 The upgrade path is just awesome. Sure Intel may have the crown but its just a small margin. Once the X3D versions launches, it will take the charts back
@@Freestyle80 i started with the r5 2600 in 2018, then jumped to 3600 then 5600x while rocking a gigabyte b450 board. I may not be part of the 99% but I sure will keep upgrading in the future
Same here, built my computer around it, and even slapping a RX 6800 works perfectly fine... I'm just running into some weird USB controller issues what hopefully will be solved with this card I'm buying
It's the motherboard prices that kill the potential for AM5, not the CPUs imo. Even recently I've recommended someone to go AM4 - where the motherboards are still surprisingly expensive as well both used and new. I guess that's the downside to motherboards supporting many generations of CPU (Ryzen 1000, 2000,3000 and 5000 series), prices never go down all that much as everything stays relevant. It spells doom for bargains in the far future as well as In the end it's almost always motherboards that end up being the more pricey part. Over time motherboards will kick the bucket whilst CPUs are practically immortal, even the good old 486 CPUs are mostly still working as new, finding functional motherboards is a lot harder.
Let me tell you where the non-X CPUs are going to wind up: homelab servers. They're *perfect* for them what with the high core/thread counts, low TDP, and 64GB RAM support. A lot of us are already using AMD Ryzen 5600G CPUs in dedicated TrueNAS Scale servers just for that reason. It allows you to run 40-50 different moderate-load Docker containers without breaking a sweat. Hell, I'm running 30 containers in a VM on an ancient Core i7 Lenovo tiny PC running as an XCP-NG host and that VM is backed up to my NAS every night.
They (AMD) could have fumbled this so bad. We still have yet to see the longevity the AM5 platform will have, but this bodes /very/ well for the industry as a whole. This is the Dr. Lisa Su we came to love.
X = Xspensive
Watch this be top comment
Linus better pinned this comment
Top comment
That's a killer one
X sounds like a badass letter
Linus, we all know that we gonna buy this CPU like 3 years after it arrive with 50% of initial prices
People don't want to buy a 3 yo cpu
3 years? You mean 5 years?
Lol, like with the 5600...
@@txmits507 lmao speak for yourself
@@txmits507 what's the point eh haha
That thermals chart is insane! I'm so happy to see a return to awesome midrange CPUs, and am very excited for the budget ones.
I so fucking agree =)
@@fridaycaliforniaa236 TRUE
Mid range lol.
@@fynkozari9271 this. Midrange CPUs should cost no more than $170, but whatever.
The 7600 is goiing to become a legendary budget cpu like the 2600k.
Fun fact: Factorio devs mentioned in one of the blogs that they optimized game code so much that it's mostly bottlenecked by cpu cache rather than clock speed. Also ram speed boosts performance too and by a lot!
Feeling good about the 3DVCache CPU I just bought then
These suckers bought 7700x instead of 5800X3D with 96mb cache!!! MUAAHAHAH
@@petercollins797 i literally bought the 5800x3d for factorio lmfao
@@jaxrammus9165 7800x3d looking juicy as well for factorio, tarkov and other games .
@@jaxrammus9165 I did the same thing but for rust lmao
Labs are definitely showing their value, great review.
I think you should also include idle power draw. It can be a big deal for some use cases and sometimes people may find that their systems are drawing more at idle than they should (had that happen recently and it turned out to be a driver issue), so it's useful to know where approximately they should be.
People are noticing high idle power draws because they install crap like animated wallpapers and 10 game launchers, or they have multimonitor setups with high refresh rate. Or because they enable the high performance power profile in Windows.
Tldr: too many factors for idle power draw today, since there is never a real idle. What you see drawn from the CPU is not what you measure from the cord.
I use a 15 min sleep with wake up via usb (key/mouse). So It uses a few watts to power the memory and that's it. And for the commodity to not have to push the button manually, by just pressing the mouse to power on.
Usually idle power draw is around 5W whatever platform you're on
At least, mine draws the same with 3600 and 5800x3d. Note that the 3600 never puts to sleep its cores, while the 5800x3d will run just 2 cores/4 threads if the workload is light enough and the others are almost literally turned off
You can't measure the whole system 's powerdraw for a review tho: different ANY component will massively change your results, so the possible combinations are too many to test or even just list.
i'm interested in the idle power draw of the non X SKUs. I've seen a few people note 60-80w idle power draws on 7000 series compared to their intel systems.
This one aged well, whoever bought AMD still has a working pc 😂
Short Feedback on the new Performance Graphs: They are very nice and clean but it is a little confusing how they are ordered. You seem to randomly order them either for average or for 5% lows but never consistently. You have graphs where some CPU is in the Middle by it's average, directly behind you have one that has lower average, but higher 5% lows, after which you have one that has higher average, but lower 5% lows, which makes it hard to quickly see, which CPU is actually faster. Both ordering methods, average or 5% lows are totally viable, but you should choose one and stick to it to make the graphs more easily readable, especially since you show them for a limited amount of time. That means pausing the video constantly is a necessity to take in the information acuratly, which makes it harder to follow the script while those graphs are on screen. Other then that, Lab seems to really do a great job here!
Felt similarly confused as well
The graphs are completely unreadable blurs on Mobile at 603p,
Plus the tests seem irrelevant for anyone not playing those particular games. It was really easier back when everyone used synthetic tests that were designed to give realistic results across all CPU generations since the 1981 launch of the PC platform. Famous tests that would still be relevant include LINPACK, Whetstone and 3D-mark, each test may need updates to deal with new inventions by the 3 chip giants.
I was going to comment on the same thing. Maybe get some inspiration from Hardware Unboxed. Their graphs are also often packed with information, but theirs is a bit easier to digest.
Looks to me like a calculated score is assigned to each CPU based on all 3 categories in which they're measured
@@johndododoe1411 if they made their graphs readable at [360?]p it would look like granny zoom to the majority of their audience. Change the quality and wait for the frame to buffer if the graphs are that important to you.
Only using synthetic tests that were relevant on the first PCs would be more useless to the average viewer than testing even one modern indie game that nobody plays.
Get some perspective
THANK YOU for finally including Factorio benchmarks on your CPU reviews.
I think CPU benchmarking in industry reviews are way too focused on AAA GPU-bound action games instead of processing-intensive strategy/sim games. Titles like Factorio and Civilization 6 are very relevant for people shopping for the optimal gaming CPU.
We need more benchmarks for productivity as well. Everything focuses on AAA gaming at 8K with full RTX, paired with the best DDR5 and a 4090 overclocked to blowing up.
That's not my use case and the data unhelpful.
@@halfbakedproductions7887 they did the same number of productivity benchmarks as they did gaming....
I play games like factorio and timberborn and I dream about reaching 60 UPS when having a really large base. Triple A are GPU bound that's well known
Also games like Victoria 3 (seeing how far it can get as observer from start after x amount of time) or Dyson Sphere Program
Phoronix has a great benchmarking suite
Looks like they made the 7600 with the same idea that the 3600 played out to be. I really like it!
I'm still using R5 3600 and don't have any intentions on upgrading soon. I'm still running at stock but when it starts to wear down, I'll simply need to OC a bit to match what ever GPU I have(currently a RX 6650 XT).
Due to ass prices in a small market, I still often buy Ryzen 5 3600s for my office.
@@Postman00 5700x is like only 200€
@@Tupsuu He mentioned "a small market". That makes me think he's not in US, nor UK.
@@Winnetou17 Im in Finland
@@Tupsuu Ok, fair enough.
AMD being able to pull off these power draws just gives me a lot of faith in their engineers.
Actually, what they did is CPU binning 101. They just stockpiled "golden" 7000x CPUs for months and are now selling them for the 65w lineup. Genius move.
@@sovo1212 Not actually Golden. that 0.2GHz drop is enough for parts that were subpar at higher clocks work a lot better.
If you're amazed at this performance and power draws you should check the work they did with the Steam Deck's APU. I got a lot of hope for their future mobile end chips.
@@EuclidesGBM Grab any 7000x CPU, try to underclock that 0.2ghz and undervolt it to get to 65w TDP, it won't be stable.
@@sovo1212 considering I managed to overclock my 3600X to 4.3GHz All Cores whislt undervolting it to 58W... I think it is quite possible
Upgraded from 2700X to the 5900X recently and even with its eco mode capped at 65W it's a massive improvement to me. So I will probably wait for the next huge efficiency jump and another increase in thread count.
I'm curious. I also have a 2700x but with a rtx 2070. What gpu are you using and what is your workload?
@@Barrysteezy You should ponder these questions for a while, if you can confidently answer them then you’ll have an easier time making a choice:
1. What chipset is ur MOBO, and is it one that supports 5000 series?
2. Do you plan on upgrading to an entirely new system in the next 5 years
3. Do you plan on doing any creative work or CPU-intensive work such as live-streaming to twitch while gaming?
These are all things you should consider before spending any moneys on upgrades
@@Barrysteezy I recently got an RX 6800 but had an RX 5700 before that. One thing helping a lot with the GPU was enabling resizable bar in the BIOS.
Honestly though my setup is pretty screwed by PCIe limitations since I'm using 8 lanes for some SSD adapter card. Therefore my GPU is limited in bandwidth (only using 8 lanes) and I only have PCIe 3.0 with my motherboard. However it's still fine but I assume with PCIe 4.0/5.0 I could easily get 5~10% more performance in games. The reason for upgrading was mostly to get a newer feature set with mesh shaders and ray tracing pipelines (I wanted to do some graphics development with that).
However the CPU upgrade still made a lot of sense to me because I'm compiling a lot of code during development. So this adds up easily when I need to draw less power now while it takes less time as well.
The most reason why I didn't went for AM5 is the pricing while my old mainboard (x370 chipset) got a BIOS update which enabled Ryzen 5000 support (even with more than 8 cores). So the upgrade was much cheaper and the benefit of Ryzen 7000 over 5000 is not that huge to be honest.
But if you consider it, I would wait for the reviews of the 3D cache chips, especially if you care most about gaming. There's still a chance that they won't perform as perfect during launch though since they rely on software optimizations to utilize their stacked cache most efficiently. So hopefully reviews will cover that part.
where exactly do you see improvement and did you really needed it? 2700x is still capable of any task, you just wasting money for higher numbers in benchmarks.
why would you buy a 16 core chip just to cap it at 65w? 😂 I can almost guarantee you that you'd lose barely any performing with a 65w 7900x
7600 is ice cold, which is lovely. Can't wait for it to age and see more cheaper options and small factor options!
i got mine to 93c building shaders on the last of us lol
@@teamfishbowl1076 mine is going above 95c with aerocool verkho2 (2 pipe cooler) and there was R5 1600x (95w) never going above 60c with the same cooler. I don't understand that.
@@kliibapzyeah its a bit odd. prime95 on am5 7600 for an hour with prism cooler will easily hit 90c+.
my ryzen 2600 with the dark rock pro4 can do 3-4 hours of torture tests on prime95 and never go above 65-70c.
i want to test the dark rock pro on the am5 and see what happens, but either way the 7600 runs NO WHERE NEAR ICE COLD.
ICE cold? They run around 93c with stock cooler thats 200F enough to cool meats easily. Not sure I would use ice to describe something that can cook steaks.
@@Physics072I'd it much worse than the 7600x?
If you are still on AM4 mobo, the 5800x3d is a great cap filler without the need of new mobo/ram/cpu and it seems to be staying on par with the next gen CPU's
I'm not gonna upgrade from a 5800x to a 58003D xD
Well, i dont wanna be the karen today but the 5800x3d is better for content creator or ai people but more cores could mean better performance even though a game only runs at 8
yeah, i found recently that my b450 mobo has a bios update for 5xxx series... and i'm running 1600 so i'll probably save a buck and just buy brand new r9, and i don't even have to buy other parts, yay
@@Shadow-zo8xj It thought it was the other way around, 5800x3D for gaming mostly, but i'm no expert
For games yes.
Waiting for the 3D prices, but that 7900 is looking amazingly efficient for a nice price
It's not a nice price at all. AM5 motherboards are very expensive, so is the DDR5 memory you have to buy.
@@sophieedel6324 it’s better than buying a motherboard and CPU every 2 years with Intel
@@TheEpxMaster not really I upgrade after 5 years and by then every component is worth upgrading.
Ryzen 7900 76 mb cache. I would kill for that cache. 65 watt tdp is the best.
@@TheEpxMaster It's realistically the same since the AMD motherboards are twice the price of Intel's. The 7900 sounds like an pretty mediocre deal, when you're spending that much on the AM5 boards and DDR5, you're probably better off with a 7900X3D when that comes out, because chances are if you want a board for a 7900, you're not an average user and shouldn't be cheaping out
Very excited to see what the 3D V-Cache chips offer. Amazing how competitive the 5800X3D is a generation later
5800X3D ,, WAY WAY overpriced EXTREMELY overhyped
@@tilapiadave3234 wrong :)
@@TheNicePIXEL Indeed You are wrong ,, very wrong
@@tilapiadave3234 keep using your Intel CPU and let us use our OVERPRICED and OVERHYPED X3D's 😁
@@TheNicePIXEL I most certainly will continue to use the vastly superior cpu's.
My task is to INFORM all the newbies of the LIES spread regarding the 5800X3D , I hate when newbies waste hard earned dollars
So the 7600 is the new budget king, the 7700 is DOA, and the 7900 is a power efficient mid/high-end monster? If only Radeon could deliver like Ryzen
Edit: changed the 7900 from mid-range to mid/high end
7700 isn't exactly DOA because you can just turn on PBO in the BIOS and get pretty close to X performance for less $
Budget king will be the 13400F. Being BLCK OCing is becoming more common on boards at good prices. a 13100F overclocked will even give the 7600 a run for the money.
No chance of it being a budget king with those board prices.
@@MiGujack3 I was thinking that too, AM4 is dead end but will still be relevant. Z690's are very affordable. B660's are plenty enough for a 13600k or better. Mortar Max+12400F is where it's at right now. 5.3GHZ and 13600k performance in gaming.
@@PDXCustomPCS 13400F is still based on alder lake cores so gaming performance will be far behind 7600 and 13600k and only about the same as Zen 3 5600x which you can get for $135. You can see at 7:10 the 7600 is a bit faster than even the 13600k in gaming, it's prettymuch impossible for the 13400f and 13100f to match 13600k and 7600, if you OC them you can also OC the 13600k or 7600.
As an SFF user, those temps are impressive. I would definitely go this route once I move to am5.
how small? i want to go sff but i’m afraid to make the jump
@@halocubed6788 I built a 11 liter Dan A4 H2O case with a 3070ti and a 5800x3D about 2 weeks ago. It has 60 degree temps for the GPU and the CPU never goes higher than 70 degrees when gaming. It has been running great and I overall love it.
@@halocubed6788 try tecware fusion. its itx friendly
as a sff fan myself, i gotta keep the 5700x until am5 reaches its last gen
@@cuerex8580 nice, eol is going to pretty awesome on am5!
Love the frames per watt charts. Would love a real deep dive on this with eco modes on cpus and maybe even power limits / fps limits on gpus too!
Hey just wanted to say thank you for the channel. I used to be a system builder back at the turn of the century and into the sli Era of gaming. I'm slowly catching up with the new tech. So glad to see some other enthusiasts make these videos.
Same situation here =)
Yeah me too. Did my first modern build (first one in 12 years) late last year. Amazing how much faster everything has gotten for about the same money.
All of the X version chips were available for $10 -$20 above these new chips between Thanksgiving and Christmas. I picked up a 7700X for $339 from Newegg. Currently, the new chips are a great value but early adopters of the X series did just fine.
I went with the 7700 and have no regrets. I got 32GB of free DDR5 from Microcenter. I did a direct comparison against a free mobo deal they were running on Intel 12 series, and made the decision to go with the new platform instead of the tail end of an old one. I don't feel bad about it. And I'm happy with the 6800xt I paired it with for $500 after seeing the latest gen video cards.
I got $20 off the CPU and mobo combo, in addition to the discount on the CPU, so I kinda got it cheaper than than the non-x version. 😂
I don’t believe you
I paid $30 more for a 13700k, z690 for $150, and carried over ddr4 b-die ram I bought two generations ago. A more normie 32gb ram kit would've only been $70 anyway
@@SlimedogNumbaSixty9 please tell me total price?
The performance per watt is insanely impressive. I can imagine it being useful for say a university environment, where you might have 100's of machines working on students engineering projects or similar. You want the performance but also dont' want to use too much energy.
once you turn on PBO they consume the same exact wattage almost.
@@Manysdugjohnwhat is PBO?
@@colinkirkpatrick5618 pression boost override. Its a setting in the bios that lets the cpu boost to its maximum capabilities until it hits 75C (for older ryzen) or 95C for AM5 ryzen.
@@colinkirkpatrick5618 the built in automatic overclocking
@@colinkirkpatrick5618 peanut butter orange
I was planning on skipping the 7000 series, but price and TDP of the non-X chips definitely has me reconsidering. If only the 600 series boards and ddr5 ram weren't so overpriced, then it would be a really easy choice to upgrade.
DDR5 ram isn't that overpriced in my opinion, however those 600 series boards are extremely expensive.
@@officialteaincorporated243 Easy, just buy b560 mobo. How much DDR5 currently? DDR4 3200mhz in my country $58 current price before discounts.
@@fynkozari9271 i believe zen4 doesnt support ddr4 ram
@@reformierende_person b550 am4 b650 am5.
I'm hoping we see some Zen 3+ AM5 chips for the budget 6000 series. Imagine something like the 6980HS becoming the 6700G or something like that.
It's nice seeing a product being able to get a good review from time to time, not what we are getting with gpus
When iGPU integration starts shitting on a dedicated GPU.
@@NinjaForHire Have you seen the AMD laptop chips announced at CES? iGPUs are going to DEMOLISH low end dedicated GPUs.
@@Gabu_ lmao yeeaa suuuurre
@@NinjaForHire bad troll. Wasn't funny. Nobody but you laughed. Cus an integrated gpu beating my dedicated gpu is more of a 1 in 9 billion chance. Get back to me when integrated steps up thier game *mic drop*
@@cooleyYT who hurt you...
Wow, those are impressive thermals for the 7900! Think it might be going into my next build (which will be SFF)! I want to see what the X3D line this time around does compared to the 7900 and the 5800c3D
7900 will be great for building a quiet entry-level workstation at a very reasonable budget. This is pretty exciting. I think productivity-wise for devs, this is a great chip.
What about for gaming? Good as well?
@@bigdoublet CPUs haven’t mattered too much in gaming for a while. A well chosen, cheap, older CPU is usually the way to go.
@@youtubewatcher4603 every recent cpu released has increased ipc to the point where something like a second hand i7-8700k is actually a bad purchase rn, you really dont wanna go below 10th gen even if you gotta go entry level to stay on the newer platform unless you wanna deal with microstutters on older cpus.
DDR5 128 GB ram (4400 Mhz is the highest stable config) costs like $800
DDR4 3600 128 GB ram costs $450 and has the same performance
7900 sucks for virtualisation workstation builds or CAD simulation workstations
@@Haskellerz sucks for virtualisation? does that mean VMware machines? that's my use case. please respond with what I should go for. I know I don't upgrade a lot coming from a 4690k but I like the idea of having the latest and greatest. was almost sure to buy a 7900 but why would it suck for vmware?
This is quality reporting. Coverage like this helps real people filter the marketing hype and holds manufacturers to account. Nice job LMG 👌 And nice job AMD!
EDIT: What about idle power consumption? That's an important factor. Not sure if it's being considered. Apparently there is a big idle difference between the two brands?
13:10 I like building efficient, quiet, and low heat output air cooled systems. The Ryzen 9 7900 specifically is look'n reeeeeally good! 88w max power use for that type of creative multi-threaded performance. 😍
I'm still using my first CPU, R5 2600. Love the direction AMD have taken with value, which means I'll be looking at potentially going AM5 when I upgrade probably next year. (Gotta let those prices drop a bit more). I can only hope GPUs will follow at some point
you will see that in 10 years, when others companies came out.
Even if you could find a decent priced 5600 you'd still see a decent performance jump over the 2600. Then when it comes time to upgrading you would have a decent base system to sell as the 5000 is the last series of the AM4 platform. I had a 3700x and didn't need to upgrade but still did (5800x) and honestly did not regret doing so one bit.
What's the issue with the 2600? I have a 3700, I know it's zen2 and 8core but at least for many home workloads it seems pretty good. Games will probably not run much better unless you have a really great GPU like at least a 3070 or so, and the same goes for blender renders or whatever else you might be doing for fun.
I would only upgrade if there's a clear advantage in your particular system combination and your use case, and then also only if you need it. So many people play on 1080p 60fps but want 4k120 for no reason. If you play battlefield, will you really be better at 4k120 or notice the visual difference enough to warrant spending $1500 on all that stuff?
I upgraded my whole system a while ago (3700, rtx2070, 32gb RAM) and my secondary PC with a 1050ti & i7 4770k runs most things well enough to not really care during gameplay.
@@LuLeBe It's not an issue but it's just a less mature processor. AMD made huge improvements between 2000 and 3000 and even bigger up to the 5000 series. Memory optimisation was one of the biggest changes and in some scenarios you can see a massive uplift in performance as the way the infinity cache works relies heavily on memory speeds and latency. I upgraded my 3700x to a 5800x to take full advantage of my RTX 3070 and triple wide 5760x1080 setup. I get a solid 60 fps now in most games whereas before I did have quite a few frame time issues as well as 1% lows dipping. You have to remember that everyone has different uses for their computers and what may seem like a small upgrade can turn out to be a big one for some people. That, and AMD have made huge improvements between each generation of Ryzen. It's not like the old days where Intel only improved by a few percent each generation.
I always wanted an R5 2600! Of course, my upgrade got delayed substantially, and I got an R7 7700X for christmas. Best thing ever. Good luck with the upgrade! I'm still using a mid-tier GPU...
The fact that 5800X3D is still here for comparison still surprises me 👀
Why ? As a owner i'm more than happy tho haha.
@@TiPeteux I love how 5800X3D still performs, would've gotten one if I didn't have 3950X right now :P
@@leungoscar4126 I have a 5600, worth an upgrade?
@@blunderingfool that's basically the upgrade I made. Depending on what you play and the price you could find the 3d chip at it's a bit equivocal. If you are happy with performance right now it's not worth it. If you play things like flight sim in vr or really logic heavy games like civ a lot though it'll be a great upgrade without needing a whole new pc, especially if you sell the 5600 and get a good deal
It's a last gen chip. It's not that suprising.
sir Linus good evening, I'm Thomas E. Berdon from the Philippines...my rig is already complete but except for the CPU...if you ever any spare to give-aways that you never use it anymore... a AMD athlon 3000G or AMD ryzen 3 3200G it would be awesome...this is my rig I built it on my own...an old micro atx computer case, aA320M-S2H gigabyte motherboard, a 2X4 kingston beast 2666Mhz cl16 ram, a walram 128Gb M.2 ssd for OS, a ramsta 128Gb 2.5" ssd for storage, a wd blue 500Gb hdd for storage and a DVD rom drive...that's it.....I always watch your videos on RUclips as you assemble it one by one until you done with it...I learned a lot from you sir...thank you very much for the time that you read this....
That r9 7900 was pretty much what I was waiting for, I'll grab one in the summer when supply/demand settles on the entire platform
But x670 price. Are u gonna get b560?
@@fynkozari9271 why? I just care about the power consumption. 1kwh is 1€ here so
If I go for a lower end B660, I cannot safely assume it'll run the next gen of chip, meaning that the 50-70€ I save on not going x670 means I gotta redo half my build and spend another 150-200 bucks
I'll go x670, the 50€ saved on a 700€ investment is not worth the compromise
@@roqeyt3566 wow, your power is about 10x what I pay right now. The margin drops to 4x what I pay in summer though.
@@roqeyt3566 Even A320 motherboards can run the 5800X3D. I don't think future compatibility is really an issue on B650. There's a reason B650 is so expensive. They come with good VRMs and I/O. Buy only what you need and want in a motherboard. No point having a X670E if it doesn't have the features you want.
@@dalzy25 you technically can, but how many mobo's from first gen got bios's for 5th gen
On top, many mobo's show weird behavior (sluggishness, microstutters, performance loss) if they're too low end or old, as they were designed for their generation of chips and not the future ones. Not only that, power delivery, especially with a 200w+ socket, becomes much more important compared to the old limited AM4 socket.
So while you could, it's a question of should you. It'll be a pain in the behind if history is anything to go by
That's amazing. Love the lower TDPs and temps with an almost equal performance. It's definitely worth the small performance loss.
I'm with this guy save so much pwr to performance it's crazy makes Intel look like they don't care about your wallet in the long run.
@@NinjaForHire LOL.
AMD is a corporation too. They don't give a flying F about your damn wallet.
They only released these non X chips because their X sales and new platform sales were low.
It's almost as if the average user didn't like the idea of spending all that money on a new everything for a chip that stupidly ran itself factory over clocked to run at 95c.
Funny how as soon as AMD became the better choice for CPUs. The X series was all there was. And the prices went up!
But in a fanboys mind. AMD AND IT'S SHAREHOLDERS ARE LOOKING OUT FOR MY WALLET!
@@NinjaForHire if AMD cared the 7900 XT wouldn’t cost so astronomically high. Don’t skimp on a company. Any company, let alone a billion dollar company
I'm actually super interested in the non x chips after watching this purely because of that incredible thermal performance, I live in Australia so almost the same performance with incredible thermals, I'm genuinely going to consider a new build
It would be great if you could include the idle power consumption (e.g. on Windows Desktop) in future tests as well. Like you said in the video, energy costs are higher then ever. When I'm using my pc I'm usually browsing the web or listening to music the same amount of time or even more than I'm acutally gaming.
Besides, I believe that it was AMD graphics cards that had unusually high numbers on idle. Including that in the tests puts pressure on the manufacturers to fix whatever issue they might have.
Very good point, I always wish for the same thing.
Intel have lower power usage on idle. The CCD chiplet design is inherently more power hungry on idle/very low loads
About the AMD Gpu: the memory doesnt downclock properly in idle when using multi monitors (of a differing hz rate so ie 165hz main and 60hz 2nd like I have) and it consumes 30-40watt constant in idle. (in my case on rx6800)
@@Daisudori Your GPU only consumes 30W at idle? My 2080 Ti consumes 60 - 70W at idle, and I only have 1 1080p monitor.
@@joshjlmgproductions3313 If your GPU pulls 60W at "idle" then your GPU never actually goes to idle.
You prob have your windows misconfigured, set the power setting to balanced instead of high performance, it doesnt have any real impact on game performance anyway, its just a waste of energy.
A 2080Ti should pull around 10-15W at idle.
i'm very happy to see efficiency being targeted for once. now if only the GPU market would go this route.
AMD have always cared about efficiency though. It's intel who show no interest in it
@CompilationHUB since 2017, yes they have, rx 7000 series isnt as efficient because its not as powerful as they expected, but its not a hardware flaw, they need to fix the drivers
@@defnotatroll here we go, fangirls spreading misinformation just to shill for a corporate company
@CompilationHUB r9 290x....only the psu's can tell the horror stories from that card haha
@@defnotatroll Not until Zen.
I got the 7700 non-x because not only very little performance diff that will be noticable but that 65w tdp in an SFF build is a huge bonus for thermals
It has been 7 months. Have you enjoyed it so far?
Been over a year now, still enjoy it? Planning on getting it soon
it's nice to see optimization to reduce power consumption and thermals instead of just throwing more watts at their chip like Intel and Nvidia have been doing for the past few years
behold i9 14900K 500W
@@damara2268 more like 600+ xDD
To be fair, a lot of people forget that Intel is still on a 10 nm process node while AMD is on a 5 nm process node. This is what accounts for the current difference in power efficiency.
We can look at the Intel roadmap to see that they plan to release 14th gen processors in the second half of this year. These are on the 7 nm node, which should greatly reduce power consumption. Similarly in 2024, they plan to release the 15th gen, which should use a 2 nm process. This is slated to compete with the AM5 release in 2024 which should be using a 3 nm process.
So, we should expect the Intel 14th gen to be a large jump in efficiency even if the designs are just scaled, though there are also likely going to be some other improvements. This should help them compete with the non-X AMD cpus.
Then, we are set up for another head-to-head in 2024: Intel 15th gen vs Zen5. It is great to see Intel potentially back in the game and helping to spur more competition in the CPU space.
In recent times, Intel has been stagnant, but now, they are showing signs of moving back into the competition, which is great for us consumers.
@@withjoe1880 intel pulled a sneaky on ya and renamed their 10 nm ESF to "intel 7" , they are the same process, just a different marketing number
@@YounesLayachi I know the current gen is 10 nm (Intel 7), I said 14th gen is 7 nm (Intel 4) and 15th gen is 2 nm (Intel 20A).
Thanks for watching out, though.
Happily surprised to see Factorio there as a game to compare at; no idea how that got on your radar but I look forward to hearing more benchmarks with that game, maybe with the UPS/FPS on a 500+ hour mega base since that's where it can really matter :D
Factorio devs are magician or some sort. My mid size base would run happily on 12 years old i5 2410m. Just amazing.
Factorio is great for cache benching, because it just beats the living shit out of CPU and GPU cache.
People kept spam requesting it during the intel arc gaming stream so I guess it stuck in their heads afterwards lmao
Destiny no lifing the game maybe.
@@jammo7370 Probably Destiny streamer fans.
Actually the ram speed thing changed on Zen 4 there’s no longer a penalty for desyncing infinity fabric and ram. That said Zen 4’d memory controller generally can’t handle much faster than 6000 but the 6800 used on the Intel bench is way more expensive
Meanwhile I'm satisfied with my 5900x, but it does make good expectation for the next gen (8000) when I'm more likely to upgrade. I eagerly await seeing benchmarks for the x3D variants of these chips.
Same with my 5900x. I'll probably upgrade in 5 years right as AM5 socket support ends so I can get a mature and cheap CPU.
I'm happy having my 5950X running perfectly on an original x370 board! Longevity!
Same. I went 2700x then 5900x. Gonna stay a while
wouldnt next gen be 9000?
Got a 5900x too but I can't seem to get the temps under control, 85C when gaming no matter what the hell I do. I'm not used to having temps that high, feels uncomfortable.
I’m probably still gonna hold out for the 3D CPUs, but I’m glad for the better deal here.
🤑
3:09 Whenever I buy a CPU one of the first things I do is check the amount of cache and L2 cache available on the CPU. Having a little extra space for your computer to store stuff in the CPU cache can go a long way. The moment your PC needs to reach out to memory is the moment you start to see less performance, granted it's not usually noticeable.
Those temps and power consumption 👀 Very exciting to see this for small form factor pcs
6:24 This is one of the moments when having some more clear documentation and dedicated explanation regarding the Labs SOPs and testing methodology, not necessarily directly in the video but either in a stand-alone video or a link to a forums thread, would be really helpful. It’s no doubt hard to get a high quality testing lab put together, especially when there is a well known focus on developing testing automation. However, when there’s been issues with previous tests that don’t get caught before the video is published and then get at most a pinned comment, it makes me cautious when I start hearing about “strange data” that was discovered which for all I know could be human error but is immediately passed over without much explanation regarding how this result was validated to be accurate.
Reputation is so important, and LTT as a brand has often made transparency a prominent feature, so please continue to apply that to the labs as they develop so that detail oriented viewers, the primary target of labs level content, won’t view the labs as having a reputation for letting errors slip by or having a murky methodology more focused on obtaining results quickly over accurately.
To be clear, I love the content and don’t believe that there is any issue per se in this video, but hearing that line and no follow-up immediately made me remember other errors and think “well, they’ve been wrong in the past so I’ll just check GN instead” which is not the feeling I want to get out of labs related content because I’m really excited about what the labs have the potential to do. I am looking forward to seeing more content making use of the Labs capabilities and how your testing evolves in the future!
Great to hear. Still planning on waiting 2 years for my next upgrade but LOVE hearing there is a much cheaper option that is still quality. Here's hoping we get the same on gpu end as well
I just got the 5800X3D with a way more affordable motherboard + RAM. Got now a solid system thx to the cheap price of the AM4 platform :P
It's actually a better cpu then the 7600 or the 7600x lol and a much cheaper mb and ram so yeah great deal. I got the same cpu. It's a no brainer right now if you need gaming performance
@@timcesnovar978 how does 5800x3d beat 7600 or x variant? the new am5 platform supports ddr5 ram which massive increases the performance?
@@sumedhtech1526 3D Vcache. You do know how to use google right? And no the ram doesn't make a big difference at all.
@@timcesnovar978 if i want to build a pc purely for gaming is there a more affordable cpu than 5800x3d which has a good price to performance ratio?
@@sumedhtech1526 price to performance no, but there are more affordable options like 5600x and 5700x although if your building a new pc I'd go for 5800x3d and some 3600mhz RAM
Considering I bought a 7600X just last week I should probably feel some buyers remorse but since it was on sale for 230€/$ I am actually very happy that I got it a day before AMD announced the new chips since it's now back to the MSRP
I've bought an 7900x but for the same price as the 7900 currently. So I don't have buyers remorse (yet).
I got the 7600x last week. It's 10 euros more expensive than the 7600. So a pretty negliable difference, just as negliable as the performance difference.
@@00vaag correct me, but i think 7600x doesn't come with a cooler. And 7600 does. So depending what cooler you got it should be a bit bigger difference.
As someone who was just about to upgrade their SFF build to a 5700X, the 7700 is definitely the way I'll be going (plus it's an excuse to get an all-new platform and shiny new things)
probably the right decision while more costly now, you aint stuck on a EoL socket with no further upgrades but rather a new one that just started shining. Plus new RAM that can only increase in quality which might be a real gamechanger in the future
Which is also currently extremely expensive. You'll need to buy new RAM anyways so why pay a lot for underwhelming new gen stuff when the old one works just fine
Just upgraded to the 5700x. Verry Happy with the performance gain and the undervolting and oc potential. Will last me a long time (still on x370) but shiny new new tech is shiny new tech
I am excited to upgrade to an R7 7700 for a hopefully cheap price in 2 years.
My 3700X with it's low power consumption is great and I am hoping to get more performance for the same power consumption.
I'm loving the multiple approaches to computing , feels like the companies are specializing for different needs. It will be very difficult to tell who is moving in the correct direction. I commend AMD for squeezing raw cores as much as possible and Intel for going through a period of trying new things with cpu gpu and storage.
The MSRP for the 7900 is $449. I got my 7900X on Amazon during Christmas time when it was $449. So I'm still extremely excited about it!! got a good deal if you ask me. Especially when you consider that the 7900x went back up to $549
I find it weird they increased the prices back up. I thought they would keep it down since the x3d version are coming out, and ppl were saying the X version weren't selling well too.
@@Mewzyc I was surprised too tbh. I think they will drop the prices once the X3D chips actually come out. They're probably gonna milk every cent out of the MSRP of these that they can.
Holy shit, a factorio benchmark!? This is amazing! X3D appears to be amazing for Factorio, which will almost certainly factor in to my next CPU purchasing decision.
enjoyed seeing this aswell, but you have to be really far in Factorio that these differences get noticable :,)
As soon as I closed the replies he said "Perhaps the most interesting game we tested though is Factorio"
Wait 1 month and you'll have the 7800X3D available
Enjoy Factorio!!
I'm glad to see these well-priced options showing up, but do keep in mind that the X CPUs are selling way below their MSRP right now. For the past month, the 7950X has been $550-$570 on both Amazon and Newegg.
I am so glad to hear about this. This makes me less reluctant to upgrade, because I don't just want my system to draw more power in order to be faster.
Alright, so while the big CPUs are the ones that normally draw a lot of attention, I have to say I'm very impressed with what the 7600 is putting up in performance.
For gaming, in most titles it's putting up FPS within about 10-20% of the 7900X despite being $320 less. For productivity, it's within spitting distance of the 7600X for $70 less. If you're not building a high-end rig, it seems like the budget tier is fantastic this go around.
I was watching that chip the entire time too, super high clocks, low prices, low power draw, low temps. Low in all the right places and high where it matters
@@johnsherby9130 I'm hard-core interested with the low Temps. After all, above all else, temp kills. Anything that stays cool generally will not die by abuse. If it barely hits above 50 when going full swinging, then that means it'll be exceptionally resilient. Beat it, boost it, aim your hairdryer at it for the shiggles, but it'll still be artic cool compared what the rest of the big boys are pulling.
I'm sold.
I very much agree. Just because it's not at the top of these charts, doesn't mean it's bad. It's still going to be great at productivity and gaming when compared to older CPUs, or having no PC at all right now.
Love the Factorio benchmarks, the factory must grow and so must the UPS
Useful video! I like that AMD start to work on their power efficiency. I want to see some APUs with more CUs! A power efficient AMD APU with 8 or 12 RDNA2 CUs would be a true competitor to NVIDIA GPUs for some gamers.
Btw, the part from 13:41 sounds like it's recorded with a phone. 😀
Picked up the 7600x for $240 while it was still available at that price when AMD announced the non x and x3d cpus. I'm hoping AM5 will be just as good as AM4 and adopting now also means ready for new features and getting the most life out of the socket 🤘🏼
Same, I got my 7600x for 245 and I’m happy
Same here, in the process of building, just waiting for my mobo to arrive.
I got mine on black Friday, I get the same performance for only 10 dollars more, 2 months earlier.
Got my 7600x for 229 last week, saw the release of the non X's and laughed
Do you actually update during a sockets life tho?
I bought Ryzen first gen and actually ended up swapping Motherboards, not CPUs, as I went from ATX to ITX.
I love that factorio is now a benchmark game. Such a nice introduction
If you think those temps are crazy, the E3-1240L v5 in my NAS generally sits at +2 degrees above ambient. Its hottest in summer was 33.33 degrees (32+ degrees ambient) and coldest during winter was 13.6 degrees (ambient often dropping to a low 10-11 degrees). Oh, while I'm shamelessly flexing, it as a SilverStone KR03 Heat pipe CPU cooler in it.
Flex over, great video as always guys.
Loving the new layouts for your data! It made it very clear which data was what and made it easier to take a look at at a glance without pausing. Fantastic work by your graphics team!
I recently upgraded my CPU from a 1700x to a 5700x. Absolutely love the difference in performance and with the added benefit of it running at 65w aswell!
2600 to 5600x here. Performance difference was nuts. Never knew how much the 2600 - even overclocked - actually bottlenecked my 2060. lol
@@Seppe1106 yeah same story here, my 1700x was bottlenecking my 2070 super so much!
@@Seppe1106 That’s crazy because your 2060 is equivalent to a GTX 1070ti in rasterization performance.
AMD users can brag 65w to Intel users using core i7 241 watt, i9 350watt . Getting the same fps for a fraction of the energy.
I got the 5800x like one month before the 5700x was released. Having CPUs with over 100w of TDP has been always a nightmare for me. Fortunately I was able to do some nice undervolting with it, so it never reaches 90w. Still, my biggest frustration isn't not having bought the 5700x, but the 5800x3D instead.
I wasn't planning on upgrading until I watched this video. Thank you for always being excited about value in the PC space. Helps a lot of us make informed decisions! -Wage Slave
Thank you for including those Watts per FPS graphs, after all! That's where we can clearly see efficiency at gaming. It's very interesting.
Watts per FPS is kind of pointless though. The only thing that should matter is FPS. Would you really want to lose 20% of your FPS to save a few watts? I mean it makes sense for laptops or anything else that runs on batteries but it's useless for desktops.
@@rubiconnn It shows efficiency and helps people who aren't building super gaming computers make an informed decision. It's not the do-all and end-all of CPU choice but very helpful. My computer idles between 150 and 180 watts so I turn it off when I'm not using it because that's a lot of power to just throw away. If I were to do a PC build right now that 7600 would be on my short list.
I liked how the non x cpu’s were highlighted in the chart, good job
Ok I'm confused. At 11:14 you show that the 7900 runs at only 40C when running Prime95, and you say the cores were running at 100%, and yet in the other graph (12:17) you show that the core speed was around its base speed. So why did the cores ramp up closer to their Max Boost speed?
My expectation was that they would run as max boost speed and only start to thermally throttle when they reached about 95C.
Or to put it another way, why didn't the cores run faster given that they had so much spare thermal headroom?
Does this not happen on every release? There's always a price cut on last gen, and non x skus have refined power usage. I've gotten the 1800x, 2600, 3700x all on good price cuts. I always buy when the price cuts happen. The 5800x dropped to like 300 and the 5800x3d to 350, have both, the 5800x3d does wonders for my mmos.
having a 5950x I wasn't looking to upgrade anytime soon, however as someone really into Star Citizen right now, I've heard that the x3D cache is really helpful for it, so the 7950x3D might be my next upgrade.
Same with tarkov. I just upgraded from 3700x to 5900x and everyone who has the 5800x3d gets better frames than me. So I either kinda downgrade but get better game performance or wait for next gen 3d chips
Fortunately I stopped worrying about my PC not being able to run Scam Citizen decently a long, long time ago. My latest build doesn't even reach 60 fps at 1080p on it. And now it's a 10 year old "game", it doesn't even look that good anymore.
@@sovo1212 well that would be because it’s still in an a alpha and is pretty unoptimized currently, I get 1440p 60fps+ so I’m not too worried about really :) but to each their own.
I'm still using my late-2019 3700X 1080p build & it is still chugging along great over 4 years later. I do want to build a whole new system this year (2024) though just for the fun of it, & go for a targeted 1440p lower-draw, no-overclock build using the Ryzen 9 7900 CPU as the heart of the system.
I went with a 5600 recently and am still happy. The base clocks on these make way more damn sense but the overall cost of the build would be pretty high. These come with heatsinks too which may just be because AMD didnt design a better one for the high tdp on the X ones
AMD has 3 tiers of heatsink/cooler for their ryzen range, all 3 are actually pretty decent like im still running my stock cooler for my ryzen 3700x and it doesnt thermal throttle unless being hit with cpuZ
To be fair if you're buying a top end chip, odds are that you would be buying your own decent cooling. Budget oriented chips like the non-x chips make much more sense to have a heatsink included and I actually like that AMD have made the decision to do it this way.
Yeah 65watt cpu doesnt need special cooling. Maximum only 88 watt, temperature is low. The big highest end cpus high temp need big cooling.
Did you test the x variants on eco mode compared to the non X variants? I wonder what the temps would do if you limit the X versions to 65W.
My 7700x in Eco Mode runs as the 7700 in this video degree’s wise but faster thanks to higher clocks.
Very excited to see some coverage of the 7950x3d. Got to check it out at their booth at CES and from just playing with it for a few minutes, it seems like an absolute beast.
I'm counting the days! Though I'll probably just get the 7800X3D since I only game on my home PC these days. I really want to go AMD this time round, but the X3Ds have got to at least match Intels 13th gen in games to get my buy-in.
I'd still go AMD just for the upgrade path. With AM5 you will be able to get new processors unlike the 13th gen that use the same socket as the 12th gen meaning you're going to be stuck with the 13th gen processor forever. I already did this mistake back when Ryzen came out. Got a 7th gen i7 instead of a 1700x and now i'm stuck with it but if i would have picked an AM4 platform i would have upgraded that 1700x easily
While correct there is no evidence am5 support will be as long as am4. they only guarantuee till 2025 which is in 2 years already. Buying a 300 $ cpu and upgrading in 2 years time already is monkey business.
Also this is assuming AMD is still competetive in 2-3 years... who knows about the future. Buy whats best now. Or matches your wallet. Which might also be am5 if board prices come down.
@@Daisudori Honestly i don't even know what to get. i'm looking to upgrade to either a 13th gen i7 or a 7700x. Board prices don't really matter to me cause for some reason in my country those AM5 boards cost just as much as a good LGA 1700 board so in my case it's just a matter of preference.
I Appreciate the Factorio Benchmark :)
Hey linus when watching the graphs of the data, i feel it would be helpful when doing this style of comparison to color code the CPU names. For example with this one, the 7900 and 7900x could be blue and then the 7700 and 7700x could be green and then so on so forth. Just a suggestion :)
Listen, if you dove into AM5 at the prices they started at, you already knew you were paying way more than if you just waited a few months. So then you played yourself.
Side note, the 7600 reminds me a lot of the old 2600, and why it was such a good deal compared to the 2600x, at least till the 3000 series dropped lol
Yeah, AMD made the CPU market interesting to follow again. The Intel-only era was really boring.
I agree, but the price dropped in weeks, not months. Still stings.
If you want a successful review video for gamers, you need to cry about something.
the 5600 and 5600X are the same again. practically no difference in performance at all, but a much lower cost
this review doesn't really show that those people played themselves.. I mean, the X-series *is* performing better than the non-X.
Productivity tasks is just clearly a win for X.
But it seems like no matter what: Less power consumption, cheaper performance-ready CPUs, should make *everyone* happy.
Great price to performance here. Looking forward to what the x3d really offers to round out the lineup.
It would be interesting to know if there's any difference in performance between these new non-X chips and the old X chips running in 65W Eco Mode.
That does also interest me. You can boost an 7900 up to an 7900x but I'm sure you can also turn down your 7900x.
Kinda looks like those are the same chips lol.
The 7900 is actually more expensive currently than the 7900x in Germany.
The work the lab is doing is really starting to show, and it's awesome. It's a great investment.
Are there any specs for idle power draw? If you want a small home server, 90% of the time it's going to be on and idle, so that's where the real power savings come in. Also, we'd need to compare the total power of the system at idle to team blue's offerings to get a fuller picture.
@@rustler08 I'm curious to know what prompted the aggressive tone in your reply?
Your initial question was good, asking why I would consider this for a server application. From that we could have had a productive dialogue and possibly shared ideas and learned from one another.
But then you shut it down by shooting down my proposal without any context. You have no knowledge of my use case, and therefore cannot make the assumption that it is overkill.
I happen to be a software, site reliability, cloud, network, machine learning, and big data engineer with over a decades experience. I am the head of department for systems development at tma successful FinTech. And have a side business getting off the ground.
I have multiple clustered home servers for fun, side project work, and learning, running kubernetes and deployed with config as code. Most of the time they run close to idle, but when I crank the big data stuff, or do large compiles, or render or transcode video (a hobby of mine, which I do in software if it's for long term archival) the cluster can run pegged at 100% for hours or days at a time. I am also off grid, so energy matters to me.
So the idea of CPUs which are significantly better performance per watt is of interest to me, and I earn well, so cost isn't a significant factor. But I am conscious of my carbon footprint, and so what my devices draw while idle is of interest to me.
I just noticed that the graphs all have the LTT logo in a fading dot matrix. Really cool form of content protection. I feel like a lot of RUclipsrs could benefit from this kind of tech.
with those temps and power consumption its seems to be a great improvement!
correction - "Massive"
the 13600k's score is ronge by a lot and the 13900k is probably hitting a thermel trodelling I am not an intel fan boy but the testing latly is very light in gaming and in general not accert an cosistnt
Perks of being a very late adopter. I just got the 5000 series recently, and in a few years once the MoBo and RAM prices go down, I'm gonna grab one of these 65W beauties
With current CPU's, for the Factorio benchmark, might be good to use the Flame 30K save as opposed to the 10K. Regardless, very happy to see Factorio being used as a benchmark, it is great for single core bench marking.
Gotta love seeing the 5800x3D sneaking into the lineup 😎🔥
Fr it brings me a smile seeing my last gen cpu still topping the board in some games
Making the jump from an I5-6600k to an R7 7700. I can't wait to see what the difference will be like. Im looking forward to way better performance and way better energy savings. I just wish it was priced at $300 US and that the mainboards were more affordable.
The more pressure on competitors to provide lower prices, the better. People shouldn't need to sacrifice a kidney for a good system.
Oh joy, now we have tech support scams.
Also, yea, price improvements mean nothing whilst bloated governments make all of your dosh worthless.
For some reason, those much more efficient chips just feel so much more desirable/satisfactory than those that are run to toast.. even if theyre not reaching the absolute maximum performance. The leaps are insane anyways, seeing my just two years old 5600x (which was a massive leap from its predecessor) is already completely blown away by everything else.
just undervolt your cpu - I've been running my 7950x undervolted - it works great and uses less energy
Or oc it if you have sufficient cooling lol
@@frederickmiller5492 I need a guide to underclock my 5600. I did that from Ryzen Master and all it did was micro stutter my games because I have SAM on for my overclock undervolt RX 6800
I oced my 5600X at 4.52 ghz boost ( with a NH-U14S) and got 11400 from 10600 in cinebench multi core and 1360 from idk didn't try at stock behavior, peaks at 73 degrees celsius
And, for once, high performance per watt actually matters for casual consumers.
Just yesterday I ordered a Ryzen 7600 (not X), new MB and RAM. My old Ryzen 1700 CPU and MB took a dump.
how is it going so far? im also thinking abour ordering the 7600 (non X).
Built an all red PC for my partner, glad to see the 5800X3D was still the right choice. This thing is insane.
Very happy with my 7950x for triple-monitor programmer workstation. No dedicated GPU, great compile times and flawless Linux iGPU support.
Same use case here, 7700x, it's great. No regrets.
Dumb question, but I only took one C class like 10 years ago. Can you use intel compilers with an AMD CPU? I remember that one being much better than the open source option at optimizing the run times
Why no dGPU? That seems counterintuitive for workstations unless you only do compiling.
It's definitely compiling only, isn't it?
@@TennisGvy Yes you can, if you mean a regular C/C++ compiler. A program compiled for a specific platform (let's say Windows 11) will execute on any CPU.
Always love seeing competition bringing better, cost effective, and power efficient parts for people to enjoy doing the things they do on their computers!
This is why I chose AM5
The upgrade path is just awesome.
Sure Intel may have the crown but its just a small margin. Once the X3D versions launches, it will take the charts back
right, you bought something today because you want to upgrade 5 years later
99% of the people don't do this
@@Freestyle80 that doesn't go against the op's original statement
@@Freestyle80 i started with the r5 2600 in 2018, then jumped to 3600 then 5600x while rocking a gigabyte b450 board. I may not be part of the 99% but I sure will keep upgrading in the future
Nice to know what I'll be able to get with a 7700 in a few years, and not to waste money on x variants. For now I'm definitely sticking with my 3700x.
Same here, built my computer around it, and even slapping a RX 6800 works perfectly fine... I'm just running into some weird USB controller issues what hopefully will be solved with this card I'm buying
Nice, would have loved a more indepth section on the overclocking part. How does it affect the amazing efficiency?
It's so refreshing to see chip manufacturers not only care about raw performance!
It's the motherboard prices that kill the potential for AM5, not the CPUs imo.
Even recently I've recommended someone to go AM4 - where the motherboards are still surprisingly expensive as well both used and new. I guess that's the downside to motherboards supporting many generations of CPU (Ryzen 1000, 2000,3000 and 5000 series), prices never go down all that much as everything stays relevant.
It spells doom for bargains in the far future as well as In the end it's almost always motherboards that end up being the more pricey part. Over time motherboards will kick the bucket whilst CPUs are practically immortal, even the good old 486 CPUs are mostly still working as new, finding functional motherboards is a lot harder.
Let me tell you where the non-X CPUs are going to wind up: homelab servers. They're *perfect* for them what with the high core/thread counts, low TDP, and 64GB RAM support.
A lot of us are already using AMD Ryzen 5600G CPUs in dedicated TrueNAS Scale servers just for that reason. It allows you to run 40-50 different moderate-load Docker containers without breaking a sweat. Hell, I'm running 30 containers in a VM on an ancient Core i7 Lenovo tiny PC running as an XCP-NG host and that VM is backed up to my NAS every night.
They (AMD) could have fumbled this so bad. We still have yet to see the longevity the AM5 platform will have, but this bodes /very/ well for the industry as a whole. This is the Dr. Lisa Su we came to love.