Just ran the CB2024 benchmark on my 14700KF (slightly overvolted) - 380W peak power, per wattmeter readings. 24% less than shown here. Not surprised, honestly, since HUB is shilling for AMD.
@@dsdfasefes1883 So you believe that Steve, someone who's been reviewing hardware for 20 years, just randomly makes up numbers and puts them in a graph? That graph is total system usage, so different PSU, different motherboard (and all of them have different voltage and boosting profiles), different RAM and different GPU than those in your system.
@@dsdfasefes1883 Tell me you know nothing about tech without telling me. How about you test his 14900K with his board and settings? Oh, you can't, so what you say is worthless.
As a mere PC gamer, Threadripper is way beyond anything I need or can even afford. That said, I love seeing how insane CPUs can get, and there's something to be said for massively expanded I/O connectivity.
@@FullNietzsche. I think the point was that given their computing use case ("gamer"), it does so much more than they could use it for ("beyond anything need"), or even justify on budget ("even can afford"), yet they still enjoy this content ("I love seeing…"). The last part was the point of their comment, and at no point do they express how they felt about the gaming prowess or its usefulness in that field.
You think you won't need it? Try becoming an engineer who also has a home server. You will have fun running web services and simulations on such a compact thing. :)
The 7800X3D is so cheap and efficient that it rarely makes sense to look at anything else if you need a high-end gaming system. You can pretty much pick the crappiest-VRM motherboard, throw in a $35 Peerless Assassin, and you're under the cost of a 14900K alone, before even considering the need for a ~200W larger PSU.
That 7970x was definitely the star here in my opinion, so close to the previous 64core flagship with half the cores, some might even say it’s good value 👀
The 64-core version is meant for workloads that are extremely parallel, where you either don't need, or your budget can't stretch to, the I/O connectivity and bandwidth of the WRX90 platform/Threadripper PRO.
Surprising how the 7980X without PBO enabled uses roughly the same amount of power as a 14900K, given the number of cores it has. Sure, it's not efficient as a gaming CPU, but it was never made to be one anyway; it sure is efficient for the number of cores it packs.
Gaming is a GPU activity, and Threadripper is fine for gaming; the difference isn't relevant. High clocks and a lot of native L3 cache, plus they overclock better. Threadripper is also MUCH faster at everything that actually matters (video editing, compilation, general multi-tasking, VMs, generative AI, LLMs, encoding, game dev, streaming, etc.).
@@chiefjudge8456 Threadripper doesn't "overclock better". Threadripper still has Zen 4 cores, meaning you might still be able to hit 5.4GHz all-core with a decent liquid cooler. Zen 4 cores are good for up to 5.7GHz on air depending on the silicon lottery, but that's dealing with 6-core parts; 64 cores make a bit more heat, and as a result all-core clocks must come down.
That's what MSI claims. And even if accurate, it's low demand for MSI TR boards. When I've shopped for TR boards in the past, MSI never had attractive options, so it's not surprising they'd have lower demand.
@@TrueThanny really? the MSI X399 MEG Creation was the best MB to get for X399 and the TRX40 Creator successor just as good. I think MSI sees AMD's HEDT for what it is, a big waste of a investment. AMD showed their true colors with how they handled TR4 and TRX40, only a fool full of cash would pour their money on something that support is on shaky grounds.
@@vh9network Totally agree. AMD very deliberately killed off their entire non-Pro HEDT userbase with that insane TRX40 early EOL - AFTER taking all those customers' money. One eye-opening observation is in comparing TRX40 vs TRX50 motherboards - TRX50 is SEVERELY cut down in I/O and functionality compared to TRX40. The TRX40 Zenith II Extreme, or the Gigabyte equivalent, are particular eye-openers. AMD didn't have the guts to mention TRX40 owners in light of TR 7000, and there's obviously going to be zero compensation for them. Hell mend AMD.
Congrats on the million subs Steve, Tim, and the rest of the folks at HUB. A year ago I hardly knew anything about computers. I was overwhelmed with all the information out there while researching my first build, but luckily stumbled upon this gemstone of a channel. One video turned into 2, into a dozen, and soon I was watching videos purely for the entertainment and education. Fast forward to today I am obsessed with all things computer tech and have been loving every moment of it, and it's all thanks to you guys, your wisdom, and the passion you share with the community. Here's to the next million. Thanks for all you guys do ❤. Don't change a thing.
Congrats on hitting gold! You've been so great for so long at delivering the absolute best graphs on YT! 1M is way too low for the quality of content you guys put out! Well deserved!
Congrats! Threadripper 7000 is a MONSTER. Makes regular Ryzen and Intel Core look like cheap toys (and that's with the 7980X and 7970X being memory bandwidth bottlenecked in all of these tests).
The "Our Threadripper experience to date" part of the video was really insightful and enjoyable to listen to. The fact that "niche" but extremely expensive products have worse support than the mainstream commercial offerings makes sense, but at the same time it's somewhat shocking to hear.
I think Buildzoid has said the same about mainstream motherboards. Buying the Godlike, Apex or Xtreme boards could mean worse support compared to Tomahawk, Ace or Hero boards that sell in much bigger numbers. I have an X570 Master board that was quite expensive in 2019, costing over $400, but since it was also relatively popular (for example, Gamers Nexus used it as their board for AM4 reviews) it still receives BIOS updates to this day. The newest BIOS is from September this year, about 4 years after release.
@@Raivo_K In the meantime, the budget boards used by most system builders aren't that well supported though, only the middle gets support. This is expected on the lower end but completely unacceptable on the higher end, especially on HEDT. I'm confident this lack of support doesn't happen in the professional segment with brands like Dell or Supermicro.
Yeah, Threadripper not keeping board support was a real problem for first-time buyers. It's become such a different platform now from where it initially started. When they went upmarket during Zen 2, as they said they would, they really killed the original Threadripper project. AMD screwing themselves over again.
Great content again! Very glad you shared your experience with the 5995WX and the MSI board that wasn't put into mass production. It's important information to have. Even as a system administrator, these parts are outside of my needs but they are super impressive for those that need this type of multi-threaded performance.
I really like that you started highlighting the part you're talking about in a table by blurring out the rest. Makes it easier to focus and find the part :)
Damn... Steve I just have to say I've watched HUB for YEARS, I can remember the old HUB spinning intro like it was yesterday. I remember watching 7th gen Intel reviews (I think that's about when I found the HUB channel) and man you do a bang up job
In 10 years they'll be worthless though. These things draw as much wattage as an RTX 4090 and they're very, very niche in usage - they'll be outdated beyond belief in 10 years' time.
There'll be consumer CPUs that beat these in both single-core and multi-core in the next 3-5 years. These are commercial parts for those who need the performance NOW.
The motherboards will be the barrier to getting these, not the CPUs themselves. Authentic motherboards will be dying in 5-10 years, while unofficial ones from China will likely have the same issues as the older Xeon ones.
Your experience with Threadripper 5000 is definitely interesting to hear about. It goes to highlight why I'm glad AMD seems to have changed its approach with the 7000 series. The Pro and Enthusiast class Threadrippers can now share a platform, and they start at a somewhat reasonable price point, closer to the way Threadripper 1000 and 2000 worked. I really think those early hits were AMD's best effort; they were simply groundbreaking for their time. The broader the audience, the more the support, and the fewer problems in the long run. And those chips and boards hold their value pretty darn well on the used market.

I like the pricing for the 32-core: $2,500 is not bad for a chip like that, but it has to be said it's $500 more expensive than its 2990WX ancestor, with fewer PCI-E lanes. Inflation, I suppose, but I do feel it would be much more appealing at 2 grand rather than 2.5.

That 24-core, though, sheesh. With only 24 cores you're still basically competing with the 7950X in multicore performance, and that chip is only $600 without a bundle or a Black Friday sale. $1,500 for the entry level is rough; entry-level HEDT Threadripper used to be a measly $500. I really think AMD needs to rethink this. Drop that to a flat $1k and it will be within budget for a lot more people. Dropping over a grand on a CPU is just too hard a pill to swallow for too many people, especially when that was once the price of the original Threadripper flagship. It really goes to show how competition is good for consumers: AMD was much better about its pricing when they weren't dominating the server space and Intel actually had competing products in HEDT.

As for the Pro SKU pricing, let's not even start with that; that's a whole different ball game - "take out a long-term loan just to build a computer" level. 10 grand for the 96-core, yikes. Really only aimed at people who need insane performance for a truly insane workflow.
I'm still rocking my Threadripper 2950X / ASUS MEG X399 CREATION... It's certainly paid for itself over the last 5 years as my primary work machine. It's just astounding to see what you can get these days.
When your mainstream processor consumes more power than the best HEDT 32 and 64 core CPUs but it's nowhere near the performance of those chips, you know something's wrong.
In gaming I just limit my 13900K to 90W. The only reason I got an Intel i9 in my new build was because the seller messed up the price and basically gave me a free upgrade to an i9 from an i5, plus a £360 discount 😅😂....
You should do an engineering-based simulation benchmark. There are these cool channels that test projectiles hitting armor and see how it reacts - very intensive calculation stuff.
Love that pic on the battery at 2:40. It's like an adult holding the Threadripper out of reach of the kid... "nah nah nahhhh, this is for grown-ups, little Timmy" lol
At 6:30 the page lists Windows 11 21H2, but you say it's the latest Windows, so it would be 23H2. Just a minor typo. Appreciate the hard work in all of the testing and all of that troubleshooting mess with the 5995WX. This is the kind of thing we need to know when considering these platforms.
Windows 11 21H2 is quite old, and you said it was the latest Windows 11 build (which should be 23H2). Is it a mistake in the spreadsheet, or did you test on 21H2 on purpose?
I have seen some people complain about the cost of this CPU, saying they would want the whole system to cost $5k, and my thought was that if you are in the market for this CPU you will probably be in the $10-15k range. The motherboards are about $1,000, the RAM is about $1,000, and the GPUs you would want to pair with this are $1,000 plus. And why would you buy this if you didn't also need loads of PCIe lanes? So add money for all the peripherals and large/fast NVMe drives/SSDs, a huge ATX case, a huge power supply. This isn't a product for you, probably (advice to 98% of people out there).
People will pay £2,000 for a 4090 so they can play CoD at 1440p, but then complain about systems with groundbreaking performance that hasn't existed before, for less than what server hardware costs. My X99-E WS board isn't technically server grade and it still cost a fortune.
People who pay this kind of money for a system don't buy DIY. I suspect this is also the reason why support is very low / non-existent. Systems in that price range are mission critical; you can't have them out of service for two weeks because you need to RMA something. OEMs with 3-year on-site business support are the obvious choice for them.
Awesome, I wonder how much faster or more efficient the 96 core will be. It would be nice to see efficiency figures highlighted on more multi-threaded tests rather than just gaming. That cinebench result was pretty intriguing. GN did a good bit of that so it's good to be able to get slightly different information from all my favorite channels.
If there is one suggestion I would give AMD, it is this. AMD, rebalance the naming scheme so the x900 segment is not crowded. Maybe reserve all of x900 to Threadripper. Maybe go hexadecimal and create an xA00 and xB00 tier.
@11:34 Surprised that Cyberpunk was so low. What happened with "Cyberpunk 2077 will now leverage the full capability of higher-core-count CPUs." I expected better scaling for multicore CPUs.
Wow, what a beauty. How beautiful. I would love the 32 cores; I don't need it at all. I have the 5950X locked at 4.8GHz and I can't get it above 60 degrees even in August. I also cool the RAM (64GB B-die G.Skill), NVMe and GPU in semi-parallel. But what a beauty: 32 cores on 4 channels. These are crazy, monstrous numbers, and the integrity of every piece of data is checked. With a little tinkering it is almost certainly possible to run two games at a time, and playable. The RAM never ends. I have 64GB of B-die which works very well and is very fresh, but that is not so much. What beasts of machines. Thank you.
Your issue with the MSI WS WRX80 is really unfortunate. To hear that there were pretty much no BIOS updates because it never made it to production, with only samples released. This is telling, since Gamers Nexus had a motherboard roundup for TRX50 and MSI told them they're sitting this launch out; it's not something they're interested in. So much for "Pro" lol. I wonder if big studios/companies buying these products had any issues as well.
Any studios that might have had them were using prebuilts, which are a business tax write-off, so they were likely using Supermicro-developed boards, which are designed for stability first, performance second.
It would be great to test a compilation workload, which is very common for such workstation CPUs. For instance, Linus Torvalds at some point had a Threadripper CPU.
@@PainterVierax Wendell on the other hand... I think the first video I saw from him was with the 64-core Threadripper 3000 CPU, showing all the different things you could do with it (virtual machines and whatnot), and it was very informative. GN Steve goes to him for help with that stuff.
@@vigilant_1934 Yep, Wendell helped them set up some TR servers (IIRC he installed Unraid on them). But the GN team really are Windows guys (like HUB), and even with Wendell helping, they clearly aren't experienced enough with these systems to be sure the comparison isn't compromised. I'm not even sure those Windows reviewers are capable of using basic command lines in PowerShell, though Tim showed he was able to compile Chromium at least.
I had the same problem with mine, and as it turned out, I had to change the actual plug it was plugged into at the wall. It was not drawing enough power from the wall to keep the CPU stable.
I was wondering how memory bandwidth starved higher core count Ryzen CPUs might be if AMD did increase the core count with Zen 5 or Zen 6 but still left it at only dual channel memory. So comparing TR scores with dual and quad memory would be really interesting. Thanks Steve!
The reason very few people now buy HEDT is that, thanks to Ryzen, regular desktop CPUs have become super capable workhorses. The major difference we now get with HEDT and Pro CPUs vs regular desktop CPUs is the number of available PCIe lanes. I know SLI and CrossFire are basically dead in the water, but it would be really interesting to see if there would be any good gains if you put in 2x 4090s or 2x RX 7900 XTXs. Especially now, when you can have 2x 7900 XTX for less money than a single 4090, it's interesting to know if both running in an x16/x16 config with mGPU enabled can outperform a single 4090.
I love HUB, but... am I the only one a bit disappointed with the apps that were tested? I mean, at least a Linux kernel compilation test would've been nice...
Of course your 5900X is going to be better in gaming; it's a general-purpose CPU. These Threadrippers are workstation CPUs; they're not going to do well in gaming. They're going to be better than your 5900X in workstation tasks.
That's really good bang for the buck at that price. I had the 5950X until about two months ago, and it ripped through software compilation like there was no tomorrow. The thing was just that that pesky little 7950X3D kept telling me it was looking for an owner who could put it through its paces... and eventually I caved... You should look into undervolting. My 5950X with a -30mV offset and a +200MHz PBO limit consistently boosted to 5.05GHz, with an average of 4.8GHz during long compile jobs. And that was on air cooling.
Hardware today is already incredibly complex, and HEDT hardware even more so. I've been on the 2950X ever since release and I'm glad I stuck with it; it's obviously a chip that sold well and therefore has some decent community support.
The 7980X seems to be memory bandwidth constrained. I'm really interested in seeing how the Pro series compares. Considering how far from gaming CPUs they are, they are actually very impressive at doing it. There should be no issues with someone occasionally gaming on their workstation.
If you think it's a simple matter of "more effort", you are completely mistaken. Believe it or not, the Radeon group is extremely highly regarded and respected within AMD, because they are at an inherent disadvantage against Nvidia in various respects, and yet they keep making relatively competitive products anyway. The bad press that Radeon products often get is usually blown way out of proportion.
What on earth do you think they were doing pioneering MCM GPU designs? Not making simple, proven incremental improvements is risky though, and so far it sounds like that effort hasn't fully paid off.
@@syncmonism I'm using a 6700 XT as we speak. I just want them to knock it out of the park (like they have been with their CPUs). I don't think that's too much to ask! If anything, it's rooting for them to excel.
I don't have a use case for Threadripper, but a monster CPU that's stacked full of threads always seemed like something fun to try. Maybe one day I'll find a real use case for one in a home server or something. Or maybe slap Gentoo on it and compile chromium in four minutes.
I think it would be interesting to see the 7970X and 7980X vs the RTX 4090 and 7900 XTX (CUDA, OptiX, HIP and HIP-RT) in Blender (4, probably), including efficiency (total power needed for a fixed workload). I know the 7980X can have access to much more memory, and that you could easily make a render farm using cheap used PCs with RTX cards, but still, the best CPU vs the best GPU in the same workload would be fun. 😀
Don't forget that CPUs maintain industry-leading performance in CGI studios (like ILM or Weta Digital). GPUs are extremely limited in that kind of computational task. CPUs like these can access terabytes of fast RAM, while GPUs have a limited pool of GDDR (a few gigabytes). That has a huge impact when rendering complex animation, like a building collapsing, an explosion or a fluid simulation. It's totally different from our gaming perception, where we think the GPU can handle any kind of graphics task. I remember seeing a special shot inside DreamWorks headquarters, where they illustrated why they had to use AMD Opteron CPUs to handle perfect global illumination in a full CGI movie like "Shrek" instead of a batch of simple graphics cards. That was almost 20 years ago, and GPUs are still memory bound 😅
@@Squagliafrittata Well, I know I am asking whether a motorbike or a car is faster at delivering a can of soda, but I feel like your answer is that actually a cargo ship is faster because it can deliver way more soda. A video on a supercomputer render farm could also be fun, and I expect a modern render farm will contain quite a few GPUs, and the artists probably have an RTX 4090 or RTX 6000, because OptiX is just so powerful when making adjustments in previews.
Threadripper is the kind of tech that's so damn cool to see, it makes me, a mere gamer, want to learn to do some sort of work that can utilize this beastly tech, so I can have an excuse to buy one.
It's interesting to know that the NZXT Kraken AIO is not compatible with Threadripper. Steve, you should have busted out a Noctua NH-U14S TR4-SP3 and shown how all those cores can be cooled by air alone.
I do not understand how AMD can even classify these CPUs as HEDT, based solely on their price. These are workstation-class CPUs. Only a few years ago, Intel's HEDT parts cost slightly over $1,000, and anything more expensive was generally a Xeon, suited for high-end workstations rather than desktop/gaming rigs.
Hoping we get a 7960X review at some point... perhaps it'll fare better than the others in gaming... unlikely, but possible. I'm at least considering it, as I'm finding myself doing a lot of rendering workloads lately and less gaming. Not sure how much of an advantage the 7960X setup would have over what I have now, since a fair bit of the workload is done on the GPU. Of course, getting a Threadripper setup would allow me to slowly add more GPUs and thus exponentially increase my potential output/workload capability.
Threadripper 7000 is a MONSTER. Makes regular Ryzen and Intel Core look like cheap toys (and that's even with the 7980X being heavily memory bandwidth bottlenecked on TRX50 with 1 channel per 16 cores).
Well, you didn't use games in the benchmarks that actually benefit from cache. While this processor isn't meant for gaming, it would have been a great showcase to see whether titles that benefit from cache, like ACC or Microsoft Flight Simulator, benefit even more.
The extra cache does not matter that much on these CPUs, because it's not unified like on the 7800X3D but spread across multiple CCDs, so there is actually less cache per die. Accessing all the cache requires heavy use of the Infinity Fabric, which is a bottleneck.
Look again at the Assassin's Creed Mirage result. That is one of the instances where the 7800X3D outperforms the 7950X3D by a wide margin, because the game (accidentally) gets scheduled on the non-V-Cache CCD of the 7950X3D. Total cache of the CPU is only a meaningful distinction on Intel parts. Because the Infinity Fabric that connects AMD's CCDs is relatively slow, a cache hit on a CCD other than the one where the code is executing is almost as slow as a memory fetch, so it's basically only "cache per CCD" that makes a meaningful metric on AMD.

I have a 7950X3D and it's trivially easy to set up launch scripts that pin each game to the CCD it responds best to (but Steve tests out-of-box performance, as he should), and I can attest that completely blocking a game from accessing the cache of the other CCD has zero impact on performance. For gaming I pretty much treat my system as if I had a dual-socket board with two 8-core CPUs: one with V-Cache and one with higher clocks. Limiting every game to one CCD also has the handy benefit that I can compile code in the background without it wrecking my gaming experience.
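The CCD-pinning trick described above boils down to launching a game with a CPU affinity bitmask covering only one CCD. Here's a hypothetical Python sketch of building such a mask for Windows' `start /affinity` command; the layout (logical CPUs 0-15 on the V-Cache CCD, 16-31 on the frequency CCD, as on a 7950X3D with SMT) and the `game.exe` name are assumptions - check your own topology in Task Manager or HWiNFO first.

```python
# Sketch: build affinity bitmasks to pin a process to one CCD.
# Assumed layout (7950X3D-style, SMT on): logical CPUs 0-15 = CCD0
# (V-Cache), 16-31 = CCD1 (higher clocks). Verify before relying on it.

def ccd_affinity_mask(first_logical_cpu: int, logical_cpu_count: int) -> int:
    """Bitmask with one bit set per logical CPU the process may run on."""
    return ((1 << logical_cpu_count) - 1) << first_logical_cpu

vcache_mask = ccd_affinity_mask(0, 16)   # CCD0: logical CPUs 0-15
freq_mask   = ccd_affinity_mask(16, 16)  # CCD1: logical CPUs 16-31

# Windows' `start` takes the mask in hex, e.g. in a .bat launch script:
print(f"start /affinity {vcache_mask:X} game.exe")  # pin to V-Cache CCD
print(f"start /affinity {freq_mask:X} game.exe")    # pin to frequency CCD
```

A per-game .bat file with the appropriate `start /affinity` line is all the "launch script" amounts to; tools like Process Lasso can persist the same masks automatically.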
@@Bluth53 YW! AMD's layout is an absolutely brilliant stroke of genius from a cost-of-manufacture perspective, but it is not without tradeoffs on desktop... which is exactly why they launched V-Cache. And it settles it pretty hard that, no, you don't need more than 8 cores for gaming.
My WRX80 board from ASRock has had a few BIOS updates and is rock stable, so I think you were unlucky with the MSI board. I think it's based on a server board, since it has IPMI etc.; the server team knows how to make something stable.
IMO, Threadripper does not belong in the DIY market anymore. It is more a mix of HEDT/Epyc and therefore belongs to the OEMs. Demand is extremely low and currently at an all-time low.
Hmmm... So you used MSI boards? Didn't you think to try a Gigabyte board? I'm just asking; I'm interested in whether a Gigabyte board is better. Well, HEDT is in between a mainstream and a server platform, and we know that Gigabyte is very good in mainstream. ;) Thank you in advance for your answer!
Awesome CPUs, but that board is filthy as hell, especially inside that secondary PCI Express slot. There is a little bit of dust on the first one, but not as bad as the 2nd.
@@pirotehs The mainstream platforms. And AMD specifically makes the X3D variants for top-shelf gaming. You'll never get memory timings as tight on the 4/8/12 memory channels that Threadripper, Threadripper PRO and Epyc workstations have, and memory timings are everything in gaming.
I waited 9 months for the MSI WS WRX80 to use with my 5975WX, and finally gave up after they told me on Twitter the motherboard was never coming out. I've been using the ASRock WRX80 Creator ever since. Also having crashing problems; haven't had time to reinstall Windows, that's next, I think.
Seller: "Don't worry, we have more affordable... two and a half thousand dollar..." Me: "Yes. Thank you. Can you, uh... show me the clearance bin now... if there's any... for the... old... AMD chips I can browse?" Joking aside, thank you for the review, done as always. All these graphs are indeed helpful for people looking into what to buy and compare.
If I hadn't had massive firmware reliability issues with third-gen Threadripper, I could almost be excited about these new CPUs. I use my workstation PC to generate my income as an engineer, and I don't want to spend my work time troubleshooting issues which shouldn't even exist on a professional-grade platform. So I'm doing pretty okay with my inefficient and slower Xeon processors that deliver nearly 100% uptime. But honestly, I can't wait for dedicated RISC-based platforms to find their way back to professional workstation users.
I'm pretty sure it's an issue with the 4 or 5 consumer board manufacturers. Less tacky enterprise-grade hardware from Dell, Lenovo or Supermicro gets more support. As for a RISC-based platform, there were some relatively affordable POWER9 systems a few years ago. Real beasts, though definitely obsolete nowadays.
@@PainterVierax I've always had Asus' professional lineup since the X58 era. The Xeon-based systems all ran, and still run, pretty stable. Currently using the C422 SE with no issues in my trusty old 19" case. My TR mainboard was also an Asus. The issues I had were mostly related to USB, which is bad, because I use a lot of specialised peripherals and lab equipment. There was also one occurrence where my SSDs started freaking out. I looked online and a lot of users were sharing the same issues on different, and also more expensive, boards. I'm not an Intel fanboy (I dislike their current consumer platforms), but I never experienced something like this before, even on consumer hardware. I use an older IBM Power system as a project server/storage unit running BSD, but it isn't optimised as a desktop platform. I currently have high hopes for RISC-V in the future. They have already begun manufacturing HPC CPUs, and it's just a matter of time until there are workstation-grade derivatives. A lot of people in the engineering and visual space have a bit of a nostalgic feeling about old-school Unix workstations. A friend of mine even has a small collection.
@@hyperturbotechnomike I agree AMD has something to do with it, but the board manufacturers have to push the BIOS revisions as well. Don't expect too much of RISC-V in the HEDT/workstation segment. If Power and ARM left it, it's because x86 is very competitive and possesses the inertia of decades of software development.
Sounds just like their consumer platform. I've had lots of troubleshooting to do for clients, since their systems randomly decide they don't want to boot, or have all their BIOS settings wiped for no reason. Some systems became unstable at stock after a while, and some even had a dead CPU; the most common was the R5 3600 non-X.
@@PainterVierax You know what's great? Having something that works out of the box. That's how professional systems should be. My Xeon boards always worked OOB and did not require firmware updates to fix major issues. I think the reason is that Intel is focused on HPC/datacenter. A few generations ago they began using different architectures for Xeon and consumer desktop. There is also none of the barely functioning Win11 P&E-core BS on Xeon. AMD should treat its professional users more seriously, even if they take initial financial losses in R&D; they can win it back after gaining reputation. As I said, I'm not a hater. I'm a fan and user of AMD's consumer platform, since Intel's consumer products are so abysmal. The big.LITTLE architecture makes zero sense in a desktop PC; only battery-powered devices benefit from that technology.
The problems you started to have with the 5000 series are all motherboard-related and have nothing to do with the CPU. So saying you had "issues" with Threadripper CPUs is not really honest here.
As much as I'm pissed at AMD for screwing over everyone on the older TR platforms, I'm not surprised they went the route they did. The main thing, imo, was the lack of any serious competition. Why sell at a lower price than Enterprise/data center when your 'competition' is slower by half and costs more to boot?
It's no excuse to very deliberately kill off your entire non-pro HEDT userbase (and many pro users). For AMD to publicly promise that they'll support TRX40 "for the long term"...take everyone's money...then laugh loudly as they quickly EOL TRX40 is about as insane as you can get. I also invested in TRX40 and as you can tell, I'm pissed at AMD. I'd never have purchased this if I'd known what they would do.
@@ChrisM541 when you are not making any real money from a niche, super high end market segment, then yes that's entirely fair. It's a dick move and erodes trust, but from a business standpoint makes sense.
@@imglidinhere You have no idea how butthurt he is about that. He has pretty much spent his entire afternoon ranting about it up and down the comment stack. And he did the same for the news coverage of the announcement. Which suggests that he did the same for the TR 5000 announcements and reviews. It's actually rather funny, pathetic and sad at the same time. But to comment on your original statement: AMD has made no promise of future socket compatibility for TR 7000, so factor that in if you want to go that route. Fishhawk Falls Refresh is out soon, but that's also an EOL platform, so you're in the same boat either way. That said... I really don't get the anger (except the principle of promising support and not delivering) when we are talking systems that easily cost $10k fully spec'ed... what the hell is $700 for a motherboard?!?
@@andersjjensen People that buy TR systems are... not the type that upgrade every generation. If you bought a 3990X, you were probably not planning to upgrade for a loooong time. The vast majority of HEDT users built their systems and let them run until they die.
@@imglidinhere I concur. But apparently Chris here thinks differently. I'm looking forward to seeing him grinding his axe again for the Threadripper 9000 launch... :P
I wonder how well DaVinci Resolve would run with the newest generation of TR CPUs. I'll keep an eye on Puget Systems, which specializes in video production testing. Definitely not the best choice for gaming, but for certain productivity tasks - wow!
As the owner of a 3960X Threadripper system I can attest that the extra performance isn't worth the squeeze on your wallet. I really wanted to like the Threadripper platform, and I tried oh so hard to do so, but in the end I feel like I wasted my money.
...especially after they deliberately killed off TRX40 after only one(!!!) CPU release. So much for that infamous "long term support promise". So braindead for AMD to kill off its entire HEDT userbase.
At these prices and with such spec, I’m curious who are the buyers? System integrators? Because I can’t imagine building a workstation and not having any fast support option.
The PRO line-up always used to come out first OEM only and I would assume a huge chunk of actual pro users/companies that buy a lot of them would buy them from an OEM with support contracts and all.
@@HatchlingKifathis And do you know how expensive it would be to put enough V-Cache on the CPU to actually make it worth it? These are $2k to $4k CPUs; that would make them $4k to $8k CPUs. Eight thousand dollars for a 3-8% performance increase? That's not worth buying, and definitely not worth making.
Well, despite all that power, demand will also be very, very low for these new Threadripper parts. They are much too expensive. For my part, I got burnt with the TRX40 platform. I bought three of these for our CAM workshop, with the firm intention of upgrading to 5000-series Threadripper once it became available. The 3000-series chips we got are really sweet to work with, but since Threadripper 5000 was never released for TRX40, I'm stuck with these. My lesson from this is simple: never ever again Threadripper, especially at these insane prices for motherboard, RAM and CPU.
If you don't need it, don't buy it. If you do need it, you can't complain about the price, because you need it. "omg, the price of this Caterpillar 787B is so expensive"... do you own a mine?
100% agree. AMD royally - and deliberately - shafted every TRX40 owner. Taking their hard earned money off the back of that infamous "long term support promise".....then laughing at them as they very quickly EOL'd TRX40. Imagine deliberately killing off your entire non-pro/HEDT userbase...and many pro users too. Shocking, braindead way to treat your customers.
@@ChrisM541 This is why I returned my third-gen Threadripper; it had a lot of firmware issues. My Xeon-based workstation is less efficient and not as powerful, but it has almost 100% uptime and no random crashes or connectivity failures.
Congrats on that million.
I’m going to be honest, for some reason I thought they were way over 1M subs…
Same man, they definitely deserve more than they have
@@LEGnewTube because of their high-quality content
Steve definitely deserves it
@@50H3i1and Tim!
7:45 crazy to compare the 7980x power consumption vs the 14900k
Around 13% more power for 65.7% performance increase on a HEDT CPU is nuts.
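Taking those two figures at face value (they're the numbers quoted above, not re-measured), the perf-per-watt gap is easy to sanity-check:

```python
# Rough perf-per-watt check using the ratios quoted above:
# ~13% more power for ~65.7% more performance. These inputs are
# the comment's figures, not independent measurements.
power_ratio = 1.13    # relative system power draw
perf_ratio = 1.657    # relative multi-threaded score

efficiency_gain = perf_ratio / power_ratio
print(f"perf/W advantage: {efficiency_gain:.2f}x")  # ~1.47x
```

In other words, roughly 47% better performance per watt, which is what you'd hope for from a much wider, lower-clocked chip.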
Just ran CB2024 benchmark on 14700KF (slightly overvolted) - 380W peak power, wattmeter readings. 24% less. Not surprised, honestly, since HW is shilling for AMD.
@@dsdfasefes1883 So you believe that Steve, someone who's been reviewing hardware for 20 years, just randomly makes up numbers and puts them in a graph?
That graph is total system usage, so different PSU, different motherboard (and all of them have different voltage and boosting profiles), different RAM and different GPU than those in your system.
@@dsdfasefes1883 Tell me you know nothing about tech without telling me.
How about you test his 14900k with his board and settings, oh you can't so what you say is worthless.
@@dsdfasefes1883who hurt you as a child?😢
As a mere PC gamer, Threadripper is way beyond anything I need or could even afford. That said, I love seeing how insane CPUs can get, and there's something to be said for massively expanded I/O connectivity.
For a gamer it is beaten soundly in games by a much cheaper i5.
@@FullNietzsche. I think the point was that given their computing use case (“gamer”), it does so much more than they could use it for (“beyond anything I need”) or even justify in their budget (“could even afford”), yet they still enjoy this content (“I love seeing…”). The last part was the point of their comment, and at no point do they express how they felt about its gaming prowess or usefulness in that field.
You think you won't need it?
Try becoming an engineer who also has a home server. You will have fun running web services and simulations on such a compact thing. :)
The 7800X3D is so cheap and efficient that it rarely makes sense to look at anything else for a high-end gaming system. You can pretty much pick the crappiest-VRM motherboard, throw in a $35 Peerless Assassin, and you're under the cost of a 14900K alone, before even considering the need for a 200W larger PSU.
And then there are the dual-socket Epycs with extra 3D cache for some spice. Truly insane.
That 7970x was definitely the star here in my opinion, so close to the previous 64core flagship with half the cores, some might even say it’s good value 👀
It's the best choice also because it boosts higher than its bigger brother
not if you're gonna use it for its intended use, which is obviously multithreading
The 64-core version is meant for workloads that are extremely parallel, where you don't need, or your budget can't afford, the I/O connectivity or bandwidth of the WRX90 platform/Threadripper PRO.
The 7980X is definitely the star, but it is heavily memory bandwidth bottlenecked on TRX50 (only 1 channel per 16 cores).
The 7980X runs cooler, whereas the 7970X runs hotter at the same TDP.
Surprising that the 7980X without PBO enabled uses roughly the same amount of power as a 14900K, given how many cores it has. Sure, it's not efficient as a gaming CPU, but it was never made to be one; for the number of cores it packs, it sure is efficient.
Gaming is a GPU activity, but Threadripper is fine for gaming; the difference isn't relevant. High clocks and a lot of native L3 cache, plus they overclock well. Threadripper is also MUCH faster at everything that actually matters (video editing, compilation, general multitasking, VMs, generative AI, LLMs, encoding, game dev, streaming, etc.).
That's how far behind the 7800x3D Intel is. Forced to go all out to compete.
@@chiefjudge8456 Threadripper doesn't "overclock better". Threadripper still has Zen 4 cores, meaning you might still be able to hit 5.4 GHz all-core with a decent liquid cooler. Zen 4 cores are good for up to 5.7 GHz on air depending on the silicon lottery, but that's dealing with 8-core chiplets; 64 cores make a lot more heat, and as a result all-core clocks must come down.
that's insane that y'all got a motherboard that was never released because of low demand 😭
This kinda makes sense now, considering MSI has skipped TRX50 this gen
That's what MSI claims. And even if accurate, it's low demand for MSI TR boards. When I've shopped for TR boards in the past, MSI never had attractive options, so it's not surprising they'd have lower demand.
@@TrueThanny really? the MSI X399 MEG Creation was the best MB to get for X399 and the TRX40 Creator successor just as good.
I think MSI sees AMD's HEDT for what it is: a big waste of an investment.
AMD showed their true colors with how they handled TR4 and TRX40; only a fool full of cash would pour money into something whose support is on shaky ground.
@@vh9network Totally agree. AMD very deliberately killed off their entire non-pro HEDT userbase with that insane TRX40 early EOL, AFTER taking those customers' money. One eye-opening observation: comparing TRX40 vs TRX50 motherboards, TRX50 is SEVERELY cut down in I/O and functionality compared to TRX40. The TRX40 Zenith II Extreme, or the Gigabyte equivalent, are particular eye-openers.
AMD didn't have the guts to mention TRX40 owners in light of TR 7000, and there's obviously going to be zero compensation for them. Hell mend AMD.
@@TrueThanny I had an MSI X399 Carbon and it ran well with a 1950X at 4.1 GHz.
Congrats on the million subs Steve, Tim, and the rest of the folks at HUB. A year ago I hardly knew anything about computers. I was overwhelmed with all the information out there while researching my first build, but luckily stumbled upon this gemstone of a channel. One video turned into 2, into a dozen, and soon I was watching videos purely for the entertainment and education.
Fast forward to today I am obsessed with all things computer tech and have been loving every moment of it, and it's all thanks to you guys, your wisdom, and the passion you share with the community.
Here's to the next million. Thanks for all you guys do ❤. Don't change a thing.
Very well said! Happened to me, too :D
Congrats on hitting gold! You've been so great for so long at delivering the absolute best graphs on YT! 1M is way too low for the quality of content you guys put out! Well deserved!
Congrats! Threadripper 7000 is a MONSTER. Makes regular Ryzen and Intel Core look like cheap toys (and that's with the 7980X and 7970X being memory bandwidth bottlenecked in all of these tests).
The "Our Threadripper experience to date" part of the video was really insightful and enjoyable to listen to. The fact that "niche" but extremely expensive products have worse support than the mainstream commercial offerings. Makes sense but at the same time it's somewhat shocking to hear.
I think Buildzoid has said the same about mainstream motherboards. Buying Godlike, Apex or Xtreme boards could mean worse support compared to Tomahawk, Ace or Hero boards that sell in much bigger numbers.
I have an X570 Master board that was quite expensive in 2019, costing over $400, but since it was also relatively popular (for example, Gamers Nexus used it as their board for AM4 reviews) it still receives BIOS updates to this day. The newest BIOS is from September this year, about 4 years after release.
@@Raivo_K In the meantime, the budget boards used by most system builders aren't that well supported though, only the middle gets support. This is expected on the lower end but completely unacceptable on the higher end, especially on HEDT. I'm confident this lack of support doesn't happen in the professional segment with brands like Dell or Supermicro.
Yeah, Threadripper not keeping board support was a real problem for first-time buyers. It's become such a different platform from where it initially started. When they went "up, up" as they said during Zen 2, they really killed the original Threadripper project. AMD screwing themselves over again.
Great content again! Very glad you shared your experience with the 5995WX and the MSI board that wasn't put into mass production. It's important information to have. Even as a system administrator, these parts are outside of my needs but they are super impressive for those that need this type of multi-threaded performance.
I really like that you started highlighting the part you're talking about in a table by blurring out the rest. Makes it easier to focus and find the part :)
Damn... Steve I just have to say I've watched HUB for YEARS, I can remember the old HUB spinning intro like it was yesterday. I remember watching 7th gen Intel reviews (I think that's about when I found the HUB channel) and man you do a bang up job
Can't wait for them to become affordable in 10 years
In 10 years they'd be worthless though. These things are drawing as much wattage as an RTX 4090 and they're very very niche in usage - outdated beyond belief in 10 years time.
@@user-ds8rj2vc4v That's the point, bro. Plus, are you sure? Even $2 Xeons from AliExpress are still somewhat competitive
There'll be consumer CPUs that beat these in both single-core and multi-core in the next 3-5 years. These are commercial parts for whoever needs the performance NOW.
@@madgodzilla12465 Yeah, and they cost 20x as much. I can get an X99 Xeon with 22 cores for $30 that won't be far behind a 5950X.
The motherboards will be the barrier to getting these, not the CPUs themselves. Authentic motherboards will be dying in 5-10 years, while unofficial ones from China will likely have the same issues as the older Xeon ones.
Never in the history of mankind have so many W's, 9's, 5's, and X's been spoken in so little time by so few reviewers. Heroic effort.
Made my day!
Don't forget all the "Pro", "Premium" and "Plus"
Good to know that I wasn't the only one to experience 5000-series problems!
Your experience with Threadripper 5000 is definitely interesting to hear about. It goes to highlight why I'm glad AMD seems to have changed its approach with the 7000 series. The Pro and Enthusiast-class Threadrippers can now share a platform, and they start at an actually somewhat reasonable price point, closer to the way Threadripper 1000 and 2000 worked. I really think those early hits were AMD's best effort; they were simply groundbreaking for their time. The broader the audience, the more the support, and the fewer problems in the long run. And these chips and boards hold their value pretty darn well on the used market.
I like the pricing for the 32-core; $2,500 is not bad for a chip like that, but it has to be said it's $500 more expensive than its 2990WX ancestor, with fewer PCIe lanes. Inflation, I suppose, but I do feel it would be much more appealing at $2,000 rather than $2,500.
That 24 core, though, sheesh, with only 24 cores you're still basically competing with the 7950X in multicore performance, and that chip is only 600 dollars, without a bundle or a black friday sale. 1500 dollars for the entry level is rough. Entry level for HEDT threadripper used to be a measly 500 dollars. I really think AMD needs to rethink this. Drop that to a flat 1k and it will be within budget for a lot more people. Dropping over a grand on a CPU is just too hard a pill to swallow for too many people, especially when that was once the price of the original Threadripper flagship.
It really goes to show how competition is good for consumers. AMD was much better about its pricing when they weren't dominating the server space and Intel actually had competing products in HEDT.
As for the Pro sku pricing, yeah let's not even start with that, that's a whole different ball game, that's "take out a long term loan just to build a computer" level. 10 grand for the 96 core, yikes. Really only aimed at people who need insane performance for a truly insane workflow.
I'm still rocking my Threadripper 2950X / ASUS MEG X399 CREATION... It's certainly paid for itself over the last 5 years as my primary work machine. It's just astounding to see what you can get these days.
When your mainstream processor consumes more power than the best HEDT 32 and 64 core CPUs but it's nowhere near the performance of those chips, you know something's wrong.
Bigger Number Better
Intel sure went nuts with power just to get a small performance uplift...
Pumping more power was the only way to increase FPS against the 13k lineup.
In Watch Dogs Legion I have 470W wattmeter peak power consumption with stock settings. 11% less. 14700KF.
In gaming I just limit my 13900K at 90W. Only reason I got an Intel I9 in my new build was because the seller messed up the price and basically gave me a free upgrade to a i9 from a i5 and a £360 discount 😅😂....
I wonder when Intel will be able to compete in efficiency again. They are so many generations behind.
You should do an engineering-based simulation benchmark. There are these cool channels that test projectiles hitting armor and see how it reacts; very calculation-intensive stuff.
HEDT motherboards look so cool. They literally mean business lol.
Love that pic on the battery at 2:40. It's like an adult holding the Threadripper out of reach of the kid... "nah nah nahhhh, this is for grown-ups, little Timmy" lol
Good eye. I'm watching on my phone and had to zoom in to see that.
That's actually crazy how the Intel CPUs use more power than the Threadripper parts lmao
Add this to my list of "don't need it, would never fully utilize it, but damn I want it"!
At 6:30 the page lists Windows 11 21H2, but you say it's the latest Windows, so it would be 23H2. Just a minor typo.
Appreciate the hard work in all of the testing and all of that troubleshooting mess with the 5995. This is the kind of thing we need to know when considering these platforms.
congrats on the million guys, thanks for the quality content!
Windows 11 21H2 is quite old, and you said it was the latest Windows 11 build (which should be 23H2). Is it a mistake in the spreadsheet, or did you test on 21H2 on purpose?
It was a mistake on screen. I'll fix that moving forward.
I have seen some people complain about the cost of this CPU, saying they would want the whole system to cost $5k. My thought is that if you're in the market for this CPU, you're probably in the $10-15k range. The motherboards are about $1,000, RAM is about $1,000, and the GPUs you'd want to pair with it are $1,000-plus. And why would you buy this if you didn't also need loads of PCIe lanes? So add money for all the peripherals and large/fast NVMe SSDs, a huge ATX case, a huge power supply. This isn't a product for you, probably (advice to 98% of people out there).
People will pay £2000 for a 4090 so they can play cod at 1440p, but then complain about systems with ground breaking performance that hasn't existed before for less than what server stuff costs.
My x99e-WS board isn't technically server grade and it still cost a fortune.
People who pay this kind of money for a system don't buy DIY. I suspect this is also the reason why support is very low / non-existent. Systems in that price range are mission-critical; you can't have them out of service for two weeks because you need to RMA something. OEMs with 3-year on-site business support are the obvious choice for them.
Hearing about your issue, I guess that's why that Lenovo OEM type of stuff is actually not bad.
Awesome, I wonder how much faster or more efficient the 96 core will be. It would be nice to see efficiency figures highlighted on more multi-threaded tests rather than just gaming. That cinebench result was pretty intriguing. GN did a good bit of that so it's good to be able to get slightly different information from all my favorite channels.
Congratz on 1 milli guys been here since 2018
Not me waiting for 7995WX results like I'll ever own one lmao
I know it's wild
If there is one suggestion I would give AMD, it is this. AMD, rebalance the naming scheme so the x900 segment is not crowded. Maybe reserve all of x900 to Threadripper. Maybe go hexadecimal and create an xA00 and xB00 tier.
Congrats on 1 million subs, very much deserved!
Holy shit finally 1mil. I am so happy for you guys
@11:34
Surprised that Cyberpunk was so low.
What happened with "Cyberpunk 2077 will now leverage the full capability of higher-core-count CPUs."
I expected better scaling for multicore CPUs.
Wow, what a beauty. How beautiful. I would like the 32 cores, though I don't need it at all. I have the 5950X locked at 4.8 GHz and I can't get it above 60 degrees even in August. I also cool the RAM (64 GB B-die G.Skill), NVMe and GPU in semi-parallel. But what a beauty: 32 cores on 4 channels. These are crazy, monstrous numbers, and the integrity of every piece of data is checked. With a little work you could almost certainly run two games at a time, both playable. The RAM never runs out. My 64 GB of B-die works very well and stays very cool, but it isn't that much. What beasts these machines are. Thank you.
Congratulations for 1 million my dudes
Your issue with the MSI WS WRX80 is really unfortunate. To hear that there were pretty much no BIOS updates because it never made it to production and only samples were released... This is telling, since Gamers Nexus had a mobo roundup for TRX50 and MSI told them they're sitting this launch out; it's not something they're interested in. So much for "Pro" lol. I wonder if big studios/companies buying these products had issues as well.
Any studios that might have had them were using pre-builts, which are a business tax write-off, so they were likely using Supermicro-developed boards, which are designed for stability first, performance second.
@@sirmonkey1985 True, forgot there are other mobo vendors that are more suitable/reliable than MSI
1 million subscribers goes hard 🔥🔥🔥
It would be great to test compilation workloads, which are very common for such workstation CPUs. For instance, Linus Torvalds at some point had a Threadripper CPU.
GN did some of that and Level1Linux usually does that with Linux kernels. As does Phoronix.
@@walkir2662 I would not trust the former, as they're not experienced enough with Linux or even just with the act of compiling software.
@@PainterVierax Wendell, on the other hand... I think the first video I saw from him was with the 64-core Threadripper 3000 CPU, showing all the different things you could do with it (virtual machines and whatnot), and it was very informative. GN Steve goes to him for help with that stuff.
@@vigilant_1934 Yep, Wendell helped them set up some TR servers (IIRC he installed Unraid for them). But the GN team really are Windows guys (like HUB), and even with Wendell helping, they clearly aren't experienced enough with the system to be sure the comparison isn't compromised.
I'm not even sure those Windows reviewers are capable of using basic command lines in PowerShell, though Tim at least showed he could compile Chromium.
I had the same problem with mine, and as it turned out, I had to change the actual plug it was plugged into at the wall. It was not drawing enough power from the wall to keep the CPU stable.
I was wondering how memory bandwidth starved higher core count Ryzen CPUs might be if AMD did increase the core count with Zen 5 or Zen 6 but still left it at only dual channel memory. So comparing TR scores with dual and quad memory would be really interesting. Thanks Steve!
"Yes mom, I need this to do homework"
The reason why very few people now actually buy HEDTs is because thanks to Ryzen regular desktop CPUs have become super capable workhorses. The major difference we now get with the HEDT and Pro CPUs vs regular desktop CPUs is the number of available PCIe lanes. I know SLi and CrossFire are basically dead in the water, but it will be really interesting to see if there would be any good gains if you put 2x 4090s or 2x RX 7900XTXs. Especially now when you can have 2x 7900XTX for less money than a single 4090 it's interesting to know if both running in x16/x16 config with MGPU Enabled can outperform a single 4090.
I love HUB, but... am I the only one a bit disappointed with the apps that were tested? I mean, at least a Linux kernel compilation test would've been nice...
Just got a 5900X for $289.99. 12 cores / 24 threads is still good for a workstation and some gaming if you're still on the AM4 platform.
Of course your 5900X is going to be better in gaming; it's a general-purpose CPU. These Threadrippers are workstation CPUs, so they're not going to do well in gaming, but they'll be better than your 5900X in workstation tasks.
That's really good bang for the buck at that price. I had the 5950X until about two months ago, and it ripped through software compilation like there was no tomorrow. The thing was just that that pesky little 7950X3D kept telling me it was looking for an owner who could put it through its paces... and eventually I caved...
You should look into undervolting. My 5950X with a -30 mV undervolt and a +200 MHz PBO offset consistently boosted to 5.05 GHz, with an average of 4.8 GHz during long compile jobs. And that was on air cooling.
Hardware today is already incredibly complex, and HEDT hardware even moreso. I've been on the 2950X ever since release and glad I stuck with it, it's obviously a chip that sold well and therefore has some decent community/support.
Have you monitored the thread utilisation, temps and clock frequency during the times your rendering was several times longer than expected?
The 7980X seems to be memory bandwidth constrained. I'm really interested in seeing how the Pro series compares. Considering how far from gaming CPUs they are, they are actually very impressive at doing it. There should be no issues with someone occasionally gaming on their workstation.
1 RAM channel for 16 cores is pretty much a disaster.
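For context, here's a back-of-envelope per-core bandwidth estimate. The DDR5-5200 speed and the channel counts are assumptions for illustration, not figures from the review:

```python
# Per-core memory bandwidth, back of the envelope.
# DDR5 moves 8 bytes per transfer per channel; the DDR5-5200 speed
# and channel counts below are illustrative assumptions.
def gb_per_s_per_core(channels: int, mt_per_s: int, cores: int) -> float:
    per_channel = mt_per_s * 8 / 1000  # GB/s per channel
    return channels * per_channel / cores

print(round(gb_per_s_per_core(4, 5200, 64), 2))  # 7980X on TRX50 (4ch, 64c): 2.6
print(round(gb_per_s_per_core(2, 5200, 16), 2))  # 7950X on AM5 (2ch, 16c):   5.2
```

So under these assumptions, a 7980X actually has half the bandwidth per core of a desktop 7950X, which is why bandwidth-heavy workloads scale poorly on it.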
It's funny to see that in-game performance is really not bad at all
I want to see AMD put as much effort into their GPUs next!
If you think it's a simple matter of "more effort", you are completely mistaken. Believe it or not, the Radeon group is extremely highly regarded and respected within AMD, because they are at an inherent disadvantage against Nvidia in various respects, and yet they keep making relatively competitive products anyway. The bad press that Radeon products often gets is usually blown way out of proportion.
What on earth do you think they were doing pioneering MCM GPU designs?
Not making simple proven incremental improvements is risky though and so far it sounds like that effort hasn't fully paid off.
@@syncmonism I'm using a 6700xt as we speak. I just want them to knock it out of the park ( like they have been with their cpus). I don't think that's too much to ask! If anything it's routing for them to excel.
@@steveozone4910 For now Intel looks way more promising
@@GewelReal Hopefully Intel gets the power draw right soon; they have powerful hardware that isn't fully translated into gaming yet.
Holy moly, Antec. That's a name I haven't heard in 15 years. I thought they went bust like 10 years ago?
I don't have a use case for Threadripper, but a monster CPU that's stacked full of threads always seemed like something fun to try. Maybe one day I'll find a real use case for one in a home server or something. Or maybe slap Gentoo on it and compile chromium in four minutes.
I think it would be interesting to see the 7970X and 7980X vs the RTX 4090 and 7900 XTX (CUDA, Optix, HIP and HIP-RT) in blender (4 probably), including efficiency (total power need for a fixed workload).
I know the 7980X has access to much more memory and that you could easily make a render farm using cheap used PCs with RTX cards, but still, the best CPU vs the best GPU in the same workload would be fun. 😀
Don't forget that CPUs maintain industry-leading performance in CGI studios (like ILM or Weta Digital). GPUs are extremely limited in that kind of computational task. CPUs like these can access terabytes of fast RAM, while GPUs have a limited pool of GDDR (a few gigabytes). That has a huge impact when rendering complex animation, like a building collapsing, an explosion or a fluid simulation. It's totally different from our gaming perception, where we think the GPU can handle any kind of graphics task. I remember a special shot inside DreamWorks headquarters where they illustrated why they had to use AMD Opteron CPUs, instead of a batch of simple graphics cards, to handle perfect global illumination in a full-CGI movie like "Shrek". That was almost 20 years ago, and we still have GPUs that are memory-bound 😅
@@Squagliafrittata Well, I know I'm asking whether a motorbike or a car is faster at delivering a can of soda, and I feel like your answer is that actually a cargo ship is faster because it can deliver way more soda.
A video on a supercomputer render farm could also be fun. I expect a modern render farm will contain quite a few GPUs, and the artists probably have an RTX 4090 or RTX 6000, because OptiX is just so powerful when making adjustments in previews.
Someday Adobe will figure out threading. Someday.
Threadripper is the kind of tech that's so damn cool to see, it makes me, a mere gamer, want to learn to do some sort of work that can utilize this beastly tech so i can have an excuse to buy one.
It's interesting to know that the NZXT Kraken AIO is not compatible with Threadripper.
Steve you should have busted out a Noctua U14S-TR4-SP3 and showed how all those cores can be cooled by just air.
Amazing. Now let Pixar render and finish those movies. 🤣
Congratulations on 1 million
I do not understand how AMD can even classify these CPUs as HEDT, based solely on their price. These are workstation-class CPUs. Only a few years ago, Intel's HEDT parts cost slightly over $1,000, and anything more expensive was generally a Xeon, suited to high-end workstations rather than desktop/gaming rigs.
Hoping we get a 7960X review at some point... perhaps it'll fare better than the others in gaming; unlikely but possible. I'm at least considering it, as I'm finding myself doing a lot of rendering workloads lately and less gaming. I'm not sure how much of an advantage the 7960X setup would have over what I have now, since a fair bit of the workload is done on the GPU. Of course, getting a Threadripper setup would allow me to slowly add more GPUs and thus greatly increase my potential output/workload capability.
The platform wonkiness is unfortunate, but somewhat expected given the lack of users to report issues.
It's impressive to see the 7950x (16/32) battling head to head with the 3970x (32/64).
Grats on a million subs HUB!! :)
The Spiderman remastered results show incredible scaling.
I can see the guys from Antec got really inspired by the Torrent.
Well, you didn't benchmark games that actually benefit from cache. While this processor isn't meant for gaming, it would have been a great showcase to see whether titles that benefit from cache, like ACC or Microsoft Flight Simulator, benefit even more.
The extra cache does not matter that much on these CPUs because it's not unified like on 7800X3D but spread across multiple CCDs so there is actually less cache per die. Accessing all cache requires heavy use of infinity fabric which is a bottleneck.
Look again at the Assassins Creed Mirage result. That is one of the instances where the 7800X3D outperforms the 7950X3D by a wide margin because the game (accidentally) gets scheduled on the non-V-Cache CCD of the 7950X3D.
Total cache of the CPU is only a meaningful metric on Intel parts. Because the Infinity Fabric that connects AMD's CCDs is relatively slow, a cache hit on a CCD other than the one where the code is executing is almost as slow as a memory fetch. So it's basically only "cache per CCD" that makes a meaningful metric on AMD.
I have a 7950X3D and it's trivially easy to set up launch scripts that pin each game to the CCD it responds the best to, (but Steve tests out-of-box performance - as he should) and I can attest that completely blocking a game from accessing the cache of the other CCD has zero impact on performance. For gaming I pretty much treat my system as if I had a dual socket board with two 8 core CPUs. One with V-Cache and one with higher clocks. Limiting every game to one CCD also has the handy benefit that I can compile code in the background without it wrecking my gaming experience.
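For anyone curious what such pinning looks like in practice, here's a minimal sketch of building an affinity mask for one CCD. The 16-logical-CPUs-per-CCD layout and which die carries the V-Cache are assumptions; verify your own topology before relying on this:

```python
# Build a CPU-affinity bitmask covering one CCD of a 7950X3D.
# Assumes 8 cores + SMT = 16 logical CPUs per CCD, enumerated
# contiguously; check your own system's topology before use.
def ccd_mask(ccd_index: int, threads_per_ccd: int = 16) -> int:
    base = ccd_index * threads_per_ccd
    mask = 0
    for cpu in range(base, base + threads_per_ccd):
        mask |= 1 << cpu
    return mask

# Usage: on Windows, `start /affinity ffff game.exe`;
# on Linux, `taskset 0xffff ./game`.
print(hex(ccd_mask(0)))  # 0xffff
print(hex(ccd_mask(1)))  # 0xffff0000
```

The same mask trick works in reverse for background jobs: pin the compiler to the other CCD and the game never sees it.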
@@andersjjensen Thank you for the real world scenario daily driver feedback!
@@Bluth53 YW! AMD's layout is an absolute brilliant stroke of genius from a cost-of-manufacture perspective, but it is not without tradeoffs on Desktop.... Which is exactly why they launched V-Cache. And it settles it pretty hard that, no, you don't need more than 8 cores for gaming.
My WRX80 board from ASRock has had a few BIOS updates and is rock stable, so I think you were unlucky with the MSI board. I think it's based on a server board, since it has IPMI etc.; the server team knows how to make something stable.
IMO Threadripper does not belong in the DIY market anymore. It is more a mix of HEDT and Epyc and therefore belongs to the OEMs. Demand is extremely low and currently at an all-time low.
I wish you guys could benchmark Cities: Skylines 2 with a high-population save file. Thanks for the video btw ^_^
Hmmm... so you used MSI boards? Didn't you think to try a Gigabyte board? I'm just asking; I'm interested in whether a Gigabyte board is better. Well, HEDT sits between a mainstream and a server platform, and we know Gigabyte is very good at mainstream. ;)
Thank you in advance for your answer!
Awesome CPUs, but that board is filthy as hell, especially inside that secondary PCI Express slot.
There is a little bit of dust on that first one, but not as bad as that 2nd.
Can't wait to see 40 game benchmark video of this CPU against 14900K
It's not a gaming CPU.
@@thelaughingmanofficial Really? Which CPUs are gaming CPUs?
@@pirotehs The mainstream platforms. And AMD specifically makes the X3D variants for top-shelf gaming. You'll never get as tight memory timings on the 4/8/12 memory channels that Threadripper, Threadripper PRO and Epyc Workstation have, and memory timings are everything in gaming.
I waited 9 months for the MSI WS WRX to use with my 5975WX, and finally gave up after they told me on Twitter the motherboard was never coming out. Been using the ASRock WRX80 Creator ever since. Also having crashing problems; haven't had time to reinstall Windows, that's next I think.
Seller: "Don't worry, we have a more affordable... two-and-a-half-thousand-dollar..."
Me: "Yes. Thank you. Can you, uh... show me the clearance bin now... if there's any... for the... old... AMD chips I can browse?"
Joke aside, thank you for the review, done well as always. All these graphs are indeed helpful for people looking into what to buy and comparing options.
Thumbnail got me here. The size of that CPU, I could fry eggs on it.
Oh noes, Steve is telling one of those horror stories at the high-end, +$5K system builds you sometimes read about on the interwebs!...
Welcome back Antec :P
If I hadn't had massive firmware reliability issues with third-gen Threadripper, I could almost be excited about these new CPUs.
I use my workstation PC for generating my income as an engineer, and I don't want to spend my work time troubleshooting issues which shouldn't even exist on a professional-grade platform. So I'm doing pretty okay with my inefficient and slower Xeon processors that can deliver nearly 100% uptime. But honestly, I can't wait for dedicated RISC-based platforms to find their way back to professional workstation users.
I'm pretty sure it's an issue with the 4 or 5 consumer board manufacturers. Less tacky enterprise-grade hardware from Dell, Lenovo or Supermicro gets more support.
For a RISC-based platform, there were some relatively affordable Power9 systems a few years ago. Real beasts, though definitely obsolete nowadays.
@@PainterVierax I've always had Asus' professional lineup since the X58 era. The Xeon-based systems all ran, and still run, pretty stable. Currently using the C422 SE with no issues in my trusty old 19" case.
My TR mainboard was also an Asus. The issues I had were mostly related to USB, which is bad because I use a lot of specialised peripherals and lab equipment. I also had one occurrence where my SSDs started freaking out. I looked it up online and a lot of users were sharing the same issues on different and also more expensive boards.
I'm not an Intel fanboy (I dislike their current consumer platforms), but I never experienced something like this before, even on consumer hardware.
I use an older IBM Power system as a project server/storage unit running BSD, but it isn't optimised as a desktop platform.
I currently have high hopes for RISC-V in the future. They've already begun to manufacture HPC CPUs, and it's just a matter of time until there are workstation-grade derivatives. A lot of people in the engineering and visual space have a bit of a nostalgic feeling about old-school Unix workstations. A friend of mine even has a small collection.
@@hyperturbotechnomike I agree AMD has something to do with it, but the board manufacturers have to push the BIOS revisions as well.
Don't expect too much of RISC-V in the HEDT/workstation segment. If Power and ARM left it, it's because x86 is very competitive and possesses the inertia of decades of software development.
Sounds just like their consumer platform. Had lots of troubleshooting I had to do for clients since their systems randomly decide they don't want to boot or had all their BIOS settings wiped for no reason. Some systems became unstable at stock after a while and some even had a dead CPU, the most common was the R5 3600 non-X.
@@PainterVierax You know what's great? Having something that's tested out of the box. That's how professional systems should be. My Xeon boards always worked OOB and did not require firmware updates to fix major issues. I think the reason for that is that Intel is focused on HPC/datacenter. A few generations ago, they began using different architectures for Xeon and consumer desktop. There is also no barely-functioning Win11 P/E-core BS on Xeon.
AMD should treat its professional users more seriously, even if they take initial financial losses in R&D. They can win it back after gaining reputation.
As said, I'm not a hater. I'm a fan and user of AMD's consumer platform, since Intel's consumer products are so abysmal. big.LITTLE architecture makes zero sense in a desktop PC; only battery-powered devices benefit from this technology.
The problems you started to have with the 5000 series are all motherboard related and have nothing to do with the CPU. So saying you had "issues" with Threadripper CPUs is not really honest here.
@HardwareUnboxed - got any EPYC data to compare these processors to? That would be interesting. Thank you and keep up the great work, guys.
Me : Can't afford a Threadripper machine.
Also me : here watching this anyway.
Yay! 🎉 1 Million!
congrats on hitting 1M
I liked how far back the processor comparisons went, like the Intel Core i9-11900K, as I have the Intel Core i7-11700K with its 300MHz lower boost.
As much as I'm pissed at AMD for screwing over everyone on the older TR platforms, I'm not surprised they went the route they did. The main thing, imo, was the lack of any serious competition. Why sell at a lower price than Enterprise/data center when your 'competition' is slower by half and costs more to boot?
It's no excuse to very deliberately kill off your entire non-pro HEDT userbase (and many pro users). For AMD to publicly promise that they'll support TRX40 "for the long term"...take everyone's money...then laugh loudly as they quickly EOL TRX40 is about as insane as you can get. I also invested in TRX40 and as you can tell, I'm pissed at AMD. I'd never have purchased this if I'd known what they would do.
@@ChrisM541 when you are not making any real money from a niche, super high end market segment, then yes that's entirely fair. It's a dick move and erodes trust, but from a business standpoint makes sense.
@@imglidinhere You have no idea how butthurt he is about that. He has pretty much spent his entire afternoon ranting about it up and down the comment stack. And he did the same for the news coverage of the announcement. Which suggests that he did the same for the TR 5000 announcements and reviews. It's actually rather funny, pathetic and sad at the same time.
But to comment on your original statement: AMD has made no promise of future socket compatibility for TR 7000, so factor that in if you want to go that route. Fishhawk Falls Refresh is out soon, but that's also an EOL platform, so you're in the same boat either way.
That said... I really don't get the anger (except the principle of promising support and not delivering) when we are talking systems that easily cost $10k fully spec'ed... what the hell is $700 for a motherboard?!?
@@andersjjensen People that buy TR systems are... not the type that upgrade every generation. If you bought a 3990X, you were probably not planning to upgrade for a loooong time. The vast majority of HEDT users built their systems and let them run until they die.
@@imglidinhere I concur. But apparently Chris here thinks differently. I'm looking forward to seeing him grinding his axe again for the Threadripper 9000 launch... :P
I wonder how well DaVinci Resolve would run with the newest generation of TR CPUs. I'll keep an eye on Puget Systems, which specializes in video production testing. Definitely not the best choice for gaming, but for certain productivity tasks - wow!
Great for playing 124 instances of Tetris at the same time!
Hey Steve, was the test system specs meant to say "Windows 11 23H2"?
As the owner of a 3960X Threadripper system I can attest that the extra performance isn't worth the squeeze on your wallet. I really wanted to like the Threadripper platform, and I tried oh so hard to do so, but in the end I feel like I wasted my money.
...especially after they deliberately killed off TRX40 after only one(!!!) CPU release. So much for that infamous "long term support promise". So braindead for AMD to kill off its entire HEDT userbase.
Build livestream for the new editing workstation?? 👀
That 240mm Deepcool AIO is no slouch if it can cool down these beasts even with a PBO overclock where they consume well over 300W.
so will you be discussing the 5995wx issues in detail on the podcast?
At these prices and with such specs, I'm curious who the buyers are. System integrators? Because I can't imagine building a workstation and not having any fast support option.
The PRO line-up always used to come out OEM-only first, and I would assume a huge chunk of the actual pro users/companies that buy a lot of them would buy from an OEM with support contracts and all.
It would be fascinating if AMD were to launch the 7970X3D and 7980X3D.
Considering there is little to no benefit in production applications, I'd say putting 3D V-Cache on Threadrippers would make no sense.
@@HatchlingKifa
And do you know how expensive it would be to put enough V-Cache on the CPU to actually make it worth it?
These are $2k to $4k CPUs.
That would make them $4k to $8k CPUs.
Eight thousand dollars for a 3-8% performance increase?
That's not worth buying, and definitely not worth making.
You should do benchmarks on audio programs like Avid Pro Tools.
Shame we can’t see a comparison with Intel HEDT. I feel like it’d have been the most relevant comparison.
Well, despite all that power, demand will also be very, very low for these new Threadripper parts. They are much too expensive.
I for my part got burnt with the TRX40 platform. I bought three of these for our CAM workshop, with the firm intention of upgrading to 5000-series Threadripper once it became available. The 3000-series parts we got are really sweet to work with, but since Threadripper 5000 was never released for TRX40, I'm stuck with these. My lesson out of this is simply: never again Threadripper, especially at these insane prices for motherboard, RAM and CPU.
If you don't need it, don't buy it. If you do need it you can't complain about the price because you need it.
"Oh my, this Caterpillar 787B is so expensive"... do you own a mine?
100% agree.
AMD royally - and deliberately - shafted every TRX40 owner. Taking their hard earned money off the back of that infamous "long term support promise".....then laughing at them as they very quickly EOL'd TRX40. Imagine deliberately killing off your entire non-pro/HEDT userbase...and many pro users too. Shocking, braindead way to treat your customers.
For professionals who use them to generate their income, they aren't. They also don't get replaced as often as a consumer platform.
@@ChrisM541 This is why I returned my third-gen Threadripper. I had a lot of firmware issues. My Xeon-based workstation runs inefficiently and doesn't have as much power, but it has almost 100% uptime and no random crashes or failures with its connectivity.
7950x seems a better all round performer