i9-13900K Vs. Ryzen 9 7950X - What is the BEST Gaming and Workstation CPU?
- Published: 14 Jul 2024
- Is Intel's i9-13900K the BEST Gaming and Workstation CPU to get vs the Ryzen 9 7950X? Let's compare vs the Ryzen 7 5800X3D, i9-12900K and i5-12600K! I think this brings back good competition, and Intel is trading benchmark wins with AMD yet again. But although this chip is fast, it's not all flawless victories.
Check out the i9-13900K here: ebay.us/EskTAF
Check out the Z790 Asrock Steel Legend Motherboard: ebay.us/XhfmM8
DDR5 6800MHz CL34 memory: amzn.to/3CMIY0D
Timetable
00:00 Gaming Benchmarks 1080p Low and 4K ultra with RTX 4090, Shadow of the Tomb Raider, CS Go, Far Cry 6, Cyberpunk 2077, Horizon Zero Dawn and F1 22. Firestrike Physics
04:46 Undervolting with Cinebench R23, SoTTR and Apex Legends. Clockspeeds and CPU behavior, it's changed quite a bit since 12900K.
07:52 Productivity benchmarks, Vray, Cinebench R23, Premiere Pro, 7 Zip Compression, Decompression and C1.3. Temperatures and Power Consumption.
11:58 Instructions per cycle comparison vs i9-12900K and R9 7950X (IPC).
12:51 Conclusion for Gamers
14:10 Workstation CPU recommendation.
16:07 Question of the Day: does the Ryzen 7950X iGPU benefit Premiere Pro users?
❤️Become a Tech YES City member and get access to perks
/ @techyescity
⭐Consider Subscribing here bit.ly/3G20vC1
💯Merch - www.redbubble.com/shop/techyes...
❤️Support Directly - / techyescity
💻Discord Access - / discord
-------------------------------------------------------------------------------------------------
DISCLOSURES: Generally, all product links are Amazon, AliExpress or eBay affiliate links. This means that if you purchase a product, we earn a small sales commission, which costs you nothing extra. All sponsored content will contain the word "SPONSOR" if directly sponsored, or "AD." Any additional revenue stream will be disclosed with a similar disclosure.
Music Provided by either: epidemicsound, audio library or royaltyfreeplanet.
#CPU #Tech #Gaming - Hobbies
THANK YOU for doing true undervolt testing and comparison. Most "undervolting" content I've found is really just comparing Eco modes - not actually lowering the stock voltage. Liked and subbed.
My home office is the hottest room in the house due to sun exposure. I do office work, gaming, and huge Adobe Lightroom workloads, so I'm looking for the best all-around performance while dumping the least heat into my environment. Puget puts the 13900k in the lead for Lightroom in general, but given my all-around heat, performance, and efficiency concerns, it sounds like you might recommend the 7950x because it responds so well to undervolting?
I like that you're doing testing that makes sense for the product during each review. Great content, and it shows you understand the parts.
I really appreciate the undervolting results you've been including. I found UV to be of less utility on my 12700K and ran basically a permanent -0.025 V offset. This was good up to a 5.1 GHz OC, where I really had to start upping vcore to hit 5.2 GHz. I'm curious to see how much power I can cut on the 13900K by tuning the e-cores. There was a pretty significant saving from disabling the e-cores altogether, so I'm really curious to see how far this CPU can be optimized for efficiency when gaming. I get that most people wouldn't want to pop into the BIOS to enable/disable cores just to save power or increase boost headroom, but I kind of enjoy that tweaking aspect of Intel's architecture. On my Ryzen systems it got kind of boring: apply a -30 offset in Curve Optimizer and let PBO do the rest.
Intel is more efficient in gaming; der8auer did a video on it. At 60 W+ it does double the performance in gaming compared to AMD, which draws more while posting the same score. TL;DR: if you want to undervolt and save money and temps, then Intel primarily for games, AMD primarily for workloads. ruclips.net/video/H4Bm0Wr6OEQ/видео.html
@@Hito343 Yes, I saw his video, though I'm not sure that is necessarily the conclusion I came to. Intel does do a better job of scaling down power usage in lighter use cases. The IO die in Zen only goes so low in terms of power draw, so it's a constant (albeit relatively small) power load, as are the CCDs themselves even if under-utilized. So it's not so much that it is more efficient at gaming, just that gaming workloads are typically lighter and Intel can scale lower.
CPUs are so damn powerful nowadays. People forget that a modern i3 is essentially yesteryear's i7.
Competition sure is a blessing for consumers.
That's because Intel stood still for about 5 years, from 2015 to 2020... and as a result a phone chip (Apple's A series) is as fast as the i9 CPUs from 2020-2021.
@@vladmihai306 I'd say they basically stood still since 2012.
Sandy Bridge already gave AMD's Bulldozer a hard time, and Ivy Bridge was just a small improvement on a new node, but it was enough to surpass AMD's Piledriver without a sweat. Then Intel had basically no competition until Ryzen in 2017, but they weren't expecting that, and they had trouble with their 10 nm node, which stuck them on 14 nm with Skylake refreshes until 2020.
@@ismaelsoto9507 Haswell was a big improvement over Sandy Bridge... and Skylake was also quite a decent bump. Their strategy was to get better nodes and, with that, better efficiency and better experiences in laptops especially. That all stopped when their nodes didn't improve anymore, so from 2015-2016 they were stuck on 14 nm, and the only thing they could do was add more cores to the Skylake 6700K design. Kind of like what they are doing now with 13th gen, which is just 12th gen + some more cores (sure, and some more cache), the difference being that now their fabrication processes seem to be healthy.
@@vladmihai306 The efficiency did get better from 2011 to 2016 for laptop CPUs, but core counts were abysmal for the "U" models, and the performance difference from an i3 to an i7 on those dual cores wasn't that huge. It only got good in 2018, when AMD started offering quad cores for the low-power CPUs.
A very good review. Are you going to prepare a content where you make similar comparisons for 13700k?
13700K is a bloodbath vs Ryzen 5 and Ryzen 7. It's like the i9-11900K vs the Zen 3 Ryzen 9s.
Interesting seeing the big variance in testing between different youtubers. Hard to know which ones to trust!
You can't go wrong with the power figures... the 13900K is garbage, worse than Prescott.
All these top CPUs are so close that any variable can change the result more than normal, especially since 13900k will pull 300w+ if able to be cooled
Because the 13900k is throttling like crazy depending on cooling solution
@@evalangley3985 it actually gets more fps/watt in gaming than the 7950X
Many use the default settings of the motherboards - and those are in general insanely bad - disabling all limits and running the CPU at a very hard OC - which absolutely destroys all efficiency.
Goes to show just how good the R7 5800X3D is. That's got to win for bang for the buck.
That would probably be something like the $100 12100.
Here the 5800x3d costs the same as the 7700x and 13700k.
It only wins in a handful of games that benefit a lot from its cache. Some don't benefit much, and it would get destroyed in productivity even vs a 13600K.
@@alvar891 now factor in motherboards and RAM, and come back!
@@Deviantsoundz the cost of a 13600K rig. Also, the 5800X3D is not a production CPU; it's strictly for gaming, hence it was released as the fastest gaming CPU.
7:12 It's a feature called IA CEP (Intel Architecture Current Excursion Protection) that's existed since 12th gen. It effectively causes clock stretching to keep the cpu stable.
That's weird it behaves much differently this generation.
@@techyescity I've posted a reply in the comment section explaining my findings, Unfortunately I highly believe it's not 13900k's behavior but Asrock Steel Legend Z790 board's problem. currently I am contacting the Asrock tech team and I will update if the tech team get back to me; it would be nice if you can pin my comment.
@@SnowReborn Sure, I'll take a look when I get home. Just on a train right now.
@@techyescity In your honest opinion, which platform would you choose AMD or Intel in this generation based on what you've seen and tested. Also, the only thing that kept me from pulling the trigger on Intel was the heat on the CPU, but it doesn't seem like that is going to be an issue at least if I have a 360 AIO. I was just trying to make sure the CPU wasn't sitting at 80C or above for extended periods of time. Also, it doesn't seem as though AVX512 benefited much with AM5 which was shocking to me.
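On the IA CEP clock-stretching point above: a hedged, do-it-yourself way to spot it from user space is to score a fixed arithmetic loop before and after applying an offset. If a "stable" undervolt silently lowers the score while the reported clocks look normal, the CPU is likely stretching clocks. A minimal sketch (the 2% tolerance is an arbitrary assumption, not an Intel-documented threshold):

```python
import time

def spin_score(duration=0.25):
    """Iterations of a fixed floating-point loop completed in a time window."""
    deadline = time.perf_counter() + duration
    n, x = 0, 1.0001
    while time.perf_counter() < deadline:
        x = x * 1.0000001 % 2.0  # bounded arithmetic so the loop never overflows
        n += 1
    return n

def looks_like_clock_stretching(baseline, undervolted, tolerance=0.02):
    # No crash, but measurably lower throughput at the same reported clocks
    return undervolted < baseline * (1 - tolerance)
```

Usage would be: run `spin_score()` at stock, apply the offset in XTU, run it again, and compare with `looks_like_clock_stretching()`.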
I honestly think it's the larger L2 and L3 cache that is pushing the performance uplift
Will you be doing any videos on 1 PC streaming with the E Cores vs AV1 in the upcoming future?
Can you please advise which values to set in Intel's utility to safely undervolt the 13900K to the recommended 253 W PL1/PL2?
Great content brother,
The power consumption, though, is quite ridiculous; no wonder overclocking is becoming history and undervolting is almost a must.
Appreciate your honest reviews as always
Hi, very nice review. I have a Strix Z690-E from Asus, the one you reviewed, and I want to put the 13900K in it. Will it have the same performance as with a Z790 chipset, or do I need to change my board?
Could you maybe also include stock performance and power draw? You did slight undervolting, but still with OC settings.
WHY ?
@@tilapiadave3234 What why? Do you only want to see auto-OC values?
Just picked one up at microcenter for $569! Stoked
What high-end CPU has the best encoding times with slow 4k/8k encodes? These things overheat (and then throttle) like crazy, so I'm curious if longer encode times will show different numbers than the short Handbrake tests everyone is using for comparison.
Thanks for testing against the 5800X3D!
I'm hoping to find that bad boy used.
@@1992Latvian me too, but looking at the 13600K + a DDR4 motherboard could sway me, even though the Intel side is a dead socket for upgrades. I am waiting on the Hardware Unboxed review to decide. Currently I am running a 3700X on an AM4 motherboard with 16GB DDR4 3600 and a 6700 XT.
I'm sitting on B550 and a 3500X, so the 5800X3D would be an easy swap, but if the price and performance of the i5 is good enough, I might switch. AM5 still means a new mobo and DDR5. In the end, at least for me, it will boil down to how much I need to pay in total for the performance I want.
@@Mopantsu don't forget a useful bit of information: Raptor Lake will be held back a lot more by DDR4 memory speeds than AMD's V-Cache. You don't see much of a difference from faster memory on V-Cache, so save some cash and avoid the performance loss on AMD with 3600 CL16.
@@1992Latvian the i5 13600K is faster in both games and workloads, and also $120 more expensive where I live... that's almost the cost of a motherboard. It even beat the 5950X in many workloads... incredible for the price of the 13600K.
I have to say, this review and comparison looked much fairer than previous ones. Do you think you could review and compare the 13600K in a budget build: DDR4 3600 CL16 or so, a budget mobo and a cheap 360 mm AIO?
Will do a DDR4 comparison a bit later; got all the DDR5 results done for now. Been flat out testing and validating... so many Excel spreadsheets... lol
Thank you
Which cpu wins your Snappiness test in windows though? :)
hmmm seems like an undervolt solves the throttling issue others were mentioning? I guess I could do that.
Using Cinebench to compare the old generation at the same clocks is not great, as it does not stress the extra cache the new chips have, which could be great for gamers.
Thank you for testing higher resolution.
Bryan are those CB temps after a 20 min loop? I ask because your temps look lower than other reviewers, particularly HUB's where they show even the 13600k thermal throttling with a 360mm CLC. Maybe something isn't quite right with their temps, at least that's my hope.
From what I saw, he was running an MSI board that ran it at 300 watts... that's way too high. The Steel Legend capped it at ~250 watts, which the 420 mm H170i was handling fine. And yes, this was a long duration I tested for; I did the motherboard hotspot testing at the same time.
Will you do a dedicated undervolt video for the 13900K?
der8auer's latest video on power limiting the 13900K is very informative and shows the 13900K can be a good performer at low power. Power limits and undervolting are king.
Try telling the AMD fangirls... they just REFUSE to accept FACTS
how the world has changed...
Brian, how are you getting such low temps when gaming, when Hardware Unboxed was thermal throttling in gaming with a 480 mm AIO?
I used a Z790 Steel Legend; it remains within Intel's spec. I just saw his was going to 300 W; it must be the motherboard manufacturer's settings overvolting it.
The Cyberpunk benchmark seems to have missing numbers, Brian. Can you re-upload this one after fixing it?
Yeah I just noticed it now, will cut it out of this video and put the cyberpunk numbers in the 13600k review! Thanks, just been so busy lately!
Would like to see 1440p 👍 👌 results
What CPU do you recommend for a 4k 120hz gaming system? I am playing unoptimised indie titles like Squad, Tarkov etc. I have a 7700K so I need to upgrade. Likely be paired with a 4090 or 7900XT.
A 7700K at 4K will do fine, but if you don't want to spend too much, you can go with a 12600K or 5600, or newer versions of those.
I play those games; especially for Tarkov, get a 5800X3D or wait for the 7800X3D or 7950X3D.
Hello, I came here because of the exact same issue I encountered in your vid. I am using the same mobo, the ASRock Steel Legend Z790. It seems like there are a lot of bugs and kinks on this mobo at the moment:

1. XTU constantly bugs out, and most software reports an incorrect vcore voltage. (This was later confirmed by ASRock tech: "We check this case with power team at HQ. Core VIDs in HWiNFO and Core Voltage in XTU are the voltage requested by cores. The actual voltage which the motherboard provides to CPU is the readings from Super IO. You can monitor this reading by the Vcore Voltage under Motherboard section in HWiNFO.") Therefore, only the vcore in HWiNFO64's motherboard section is a correct reference.

2. I found the all-core ratio won't clear after it's been changed in the BIOS. For instance, if I change the P-cores from Auto to 53, the next boot shows all P-cores at 53, which is good. Going back into the BIOS, an F9 reset to defaults, or loading a user profile, shows the P-core ratio reset from 53 to Auto; but booting back into the system, where the expected behavior is that the P-cores return to factory settings (55 all-core, 58 on 2 cores), they are instead still stuck at 53.

None of the issues mentioned above happened on an Asus Prime Z790-A board, so I think it's the mobo's issue. The reason the system won't crash with -200 mV in XTU is just that XTU isn't working well with this motherboard (referring to the 6:30 mark of the video), NOT because the 13900K has some self-protection; it constantly changes my voltage and offset to "undefined." XTU works perfectly on the Asus board with the same OS and hardware. When you are undervolting in XTU, you are essentially lowering the core VID (reported by XTU as core voltage), but the actual vcore reported in HWiNFO64 is unchanged. I have vcore set to 1.23 V in the BIOS, but XTU reports my vcore (core VID) as high as 1.5 V; and if you attempt an aggressive undervolt like -500 mV in XTU, the system will still crash, most likely because the core VID is then actually lower than vcore. However, lowering the offset in XTU does lower the power consumption. Power consumption with undervolting on the ASRock Steel Legend Z790 is currently very bad, still significantly higher than on my other Asus Prime-A Z790 board: when I fixed 1.23 V on both boards, P-cores at 5.2 GHz and E-cores at 4.3 GHz, both get 39,300 points in Cinebench R23 (power limit unlocked); however, the ASRock reports a 1.5 V core VID and runs at a whopping 290 W with an average temp of 90 C (EVGA CLC 360), whereas the Asus with the same hardware, OS and settings runs at 190 W, averaging 75 C. Something is terribly wrong with the ASRock board, and I am currently emailing their tech team.

I'll keep you updated if ASRock tech gets back to me with more info.
ASRock finally got back to me, even though it took a couple of weeks:
"BIOS 3.07 fix issue#1 and issue#2, please update to this BIOS version and check if the problems can be solved.
You won’t need to uninstall XTU now, but please go to [XTU]=>[Settings]=>[Advanced Options]=>[Restore tuning after reboot] and disable this function to avoid XTU driver to affect BIOS settings.
Thank you.
Sincerely,
ASRock Tech Support"
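As a sanity check on the voltage numbers in the thread above: at fixed clocks, dynamic CPU power scales roughly with the square of the core voltage (P ≈ C·V²·f). A quick sketch shows the reported 1.23 V vs ~1.5 V difference alone is enough to explain most of the 190 W vs 290 W gap between the two boards (this is a first-order model that ignores leakage and uncore power, not a measurement):

```python
# Same CPU, same clocks (5.2 GHz P / 4.3 GHz E) on both boards, but ~1.5 V
# effective on the ASRock vs a fixed 1.23 V on the Asus. At constant
# frequency, dynamic power scales roughly with V^2.
def scale_power_by_voltage(p_watts, v_old, v_new):
    """Estimate power at a new core voltage, frequency held constant."""
    return p_watts * (v_new / v_old) ** 2

asus_watts = 190.0  # measured on the Asus board at 1.23 V (from the comment above)
predicted_asrock = scale_power_by_voltage(asus_watts, 1.23, 1.50)
print(f"predicted ASRock draw: {predicted_asrock:.0f} W")  # -> ~283 W vs ~290 W observed
```

The leftover ~7 W gap is plausibly static leakage and VRM losses, which the V² model doesn't capture.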
I presume "Quick Sync" doesn't work with the 13900KF?
Shame you didn't have a 13600K too, mate (sorry, just being impatient). I think that's going to be the best value if I upgrade soon. But wow, that 13900K is a freaking beast!! Amazing numbers.
Also, do you think we actually 'need' all this 13th gen power for some sims, PUBG and light editing? Not having any issues with my 10400 and 3060 Ti on DDR4, but the performance seems like too much of an improvement to ignore?
If you don't need the performance, then hold off, brother, although the 13600K will give a boost to all three of those you mentioned. Will release the review in about 8ish hours if I can.
👍👍 for using fast DDR5 and 4090
You put the CS:GO graphs up again where the Cyberpunk graphs were supposed to be.
What is your justification for recommending that people turn off core isolation? Benchmarks show that the performance difference is pretty negligible, and this is a security feature for Windows 11.
Great review though, always great to hear your perspective on things 👍
Performance can be around a 10% increase on both CPUs, sometimes around 15% for Intel.
Can’t decide between the 7950X and 13900K for gaming; I don’t like the high temperatures on those CPUs…
I think the temperatures on the 7950X in Eco mode are better than an undervolted 13900K.
💯
Excited to see the 7950X3D
Another OVER-priced OVER-hyped waste of sand
Wish there was a solid mATX offering for Z790; I would have gone 13900K.
Oh yeah!!
Now that AMD has seen Intel's counterpunch, wait for the new 3D Cache release from AMD.
Will 3d cache matter for productivity or only games?
@@carlo_oppermann games mostly
Perhaps we need terahertz systems for a performance uplift out of x86 😁 IPC uplift zero, frequency uplift 10x.
MT/s or MHz?
Wait, this thing has to run on liquid cooling?
Every channel says that the 13900K is faster and cheaper than the 7950X, except for Hardware Unboxed.
In their testing, I think they ran each test immediately after running a stress test for an hour, lol.
The real dilemma for me in comparing the 13900k and the 7950x has been in audio production, which sadly mainstream reviewers don't test.
For those curious, in audio, you want the maximum single core performance because your weakest core decides how complex your tracks can get, since a track usually gets processed on one core at a time, but more cores mean more simultaneous tracks, so if e-cores are good enough, they'd win. However, it's hard to tell if intel's better performing 8 P-cores + 16 e-cores will bring more benefit than the 16 regular ones on the 7950x.
I would love it if there was an easy-to-download audio benchmark that I could run. Every time I have had to benchmark audio, I have had to download a program, make a track, and literally get a stopwatch out to benchmark individual parts of it. If the audio community could get something streamlined, then I would be 'all ears' :P.
@@techyescity Thanks for the response. It's a shame considering how large the music hobbyist community is. If I can develop one one day that can be easily downloaded and executed with no hassle, I'll be sure to share it.
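In the spirit of the streamlined audio benchmark wished for above, here is a minimal, hypothetical sketch: it times a chain of biquad low-pass filters (RBJ Audio EQ Cookbook coefficients) over a block of test audio and estimates how many such "tracks" one core could render in real time. It illustrates the idea only; it is not a calibrated DAW benchmark, and the 4-stage chain standing in for a plugin is an arbitrary assumption:

```python
import math
import time

def biquad_lowpass_coeffs(fs, fc, q):
    # RBJ cookbook low-pass biquad coefficients, normalized by a0
    w0 = 2 * math.pi * fc / fs
    alpha = math.sin(w0) / (2 * q)
    cosw = math.cos(w0)
    b0 = (1 - cosw) / 2
    b1 = 1 - cosw
    b2 = (1 - cosw) / 2
    a0 = 1 + alpha
    a1 = -2 * cosw
    a2 = 1 - alpha
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)

def process(samples, coeffs, stages=4):
    # Each "track" is a 4-stage filter chain standing in for a plugin rack
    b0, b1, b2, a1, a2 = coeffs
    out = samples
    for _ in range(stages):
        x1 = x2 = y1 = y2 = 0.0
        res = []
        for x in out:
            y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
            x2, x1 = x1, x
            y2, y1 = y1, y
            res.append(y)
        out = res
    return out

def tracks_in_realtime(fs=48000, block_seconds=0.5):
    # Deterministic test tone, so no random number generation is needed
    samples = [math.sin(2 * math.pi * 997 * t / fs) for t in range(int(fs * block_seconds))]
    coeffs = biquad_lowpass_coeffs(fs, 1000.0, 0.707)
    t0 = time.perf_counter()
    process(samples, coeffs)
    per_track = time.perf_counter() - t0
    # How many such tracks a single core could render in real time
    return max(1, int(block_seconds / per_track))

if __name__ == "__main__":
    print(f"~{tracks_in_realtime()} filter-chain tracks per core in real time")
```

A real benchmark would add heavier DSP (convolution reverb, pitch shifting) and a multi-core mode, but even this toy captures the "weakest core sets track complexity" point from the comment above.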
How does the AMD workstation CPU compare to the Intel equivalent?
Does AMD use fewer checksums than Intel and if so, does this result in execution errors?
Is AMD more compatible with Linux than Intel?
wrong slide for cyberpunk
So corporate rates CPUs based on power efficiency, and you rank based on initial purchase price? I mean, if your CPU is going to use more power, doesn't that make it more expensive?
I rate it based on application productivity, just like the 7950X. These are end-user CPUs; corporate would be looking at Xeon and EPYC.
1:37 gotcha!😂
Raptor Lake sounds fast
But everyone knows a Meteor is faster than a Raptor. Meteor Lake it is then.
Meteor ain't coming any time soon
wait you lost subs?
or am i from an alternate reality where you had 1 million+ subs?
wtf, i think i remember you had like a million subs
Try this on a high-midrange air cooler and see how it dies from throttling.
High end air coolers are a dying breed, same with Zen4.
@@puciohenzap891 I would call your comment stupid, but considering how obvious it is, I don't think it is necessary to point it out.
@@sebasstein7014 it was rather ignorant
ah yes, $100 air coolers that lose to the Arctic Freezer 2...
Can’t wait to get a 4090 and a 13900k so I can replace my space heater with something more useful
Good one. Original.
Which CPU do you think is better for Handbrake, and which is going to run cooler and draw fewer watts? Reviews and benchmarks say different things, and Handbrake is not the heaviest workload, so neither CPU might get overly hot; still, I think an undervolted 7950X might be the better option here? Although Intel was always better in Handbrake, it's really hard to say. Amazing video.
Ayy im 9th here seen this vid 14mins after being uploaded
I got my i9-10900k sitting here just crying itself to sleep.
Why no 1440p? Where does the 13900K land at that setting? That's currently my preferred resolution.
Me too.
Resolution does not affect CPU performance significantly. If the CPU is capable of pushing 300 fps at 1080p, it can do quite close to that at 4K. The limiting factor is the GPU.
The CPU does not do any rasterising. It does drive the scene and position objects, but to a CPU, resolution doesn't mean anything. The GPU, which does the rasterising, on the other hand very much cares.
What WOULD affect CPU usage is field of view. Because of this, in some cases an ultrawide monitor would put more load on the CPU.
Jamil nailed it.
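The bottleneck explanation above can be sketched as a toy model: the displayed frame rate is the slower of the CPU's simulation rate (roughly resolution-independent) and the GPU's rasterisation rate (which falls as pixel count grows). All numbers below are invented for illustration:

```python
# Toy frame-rate model: you see the slower of what the CPU can simulate
# and what the GPU can rasterise at the chosen resolution.
def fps(cpu_fps_cap, gpu_fps_1080p, pixel_scale):
    gpu_fps = gpu_fps_1080p / pixel_scale  # GPU cost grows with pixel count
    return min(cpu_fps_cap, gpu_fps)       # CPU cost is ~resolution-independent

print(fps(300, 600, 1.0))  # 1080p: GPU has headroom -> CPU-bound at 300
print(fps(300, 600, 4.0))  # 4K is 4x the pixels: GPU-bound at 150, CPU choice barely matters
```

This is exactly why CPU reviews test at 1080p low: it pushes `pixel_scale` down until the CPU term is the one that binds.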
Hey Brian, I play 1080p high-refresh gaming in competitive shooters and AAA titles, and I have a 280 Hz monitor as well. Should I purchase the 13700K or the 13900K? Any help or input from you is highly appreciated; thanks in advance, sir. I know the 13700K specs are pretty much the same as a 12900K, the only difference being that the 13700K has more cache than 12th gen. If we're only talking 2-5 fps differences, then I would just save the extra money and go 13700K. Thank you, sir.
noti gang
YES
i9-13900K FTW!!! And it's $40 cheaper...
You forgot to mention the power consumption.
Isn't Intel 13th gen supposed to be competing with Ryzen 7000?
I appreciate the 4K fps results even when there's not much difference.
13900K vs 12900KS?
AMD needs to lower the prices of all Zen 4 CPUs by at least 20%, and the cheapest B650 mobos must be available for $100-120, because otherwise there's no point to them now.
"will be"... and Intel Z790 boards "WILL BE" available for under $5.00...
@@tilapiadave3234 AMD will lower prices pretty soon, when Zen 3 sales start to go down.
@@Mr11ESSE111 And Intel will do the same... it is the RIDICULOUS decision of AMD to ONLY offer DDR5 that is killing them; that, and the motherboard prices are OUTRAGEOUS. I don't find the pricing of AM5 CPUs to be too bad.
@@tilapiadave3234 they are badly priced, especially the 7600X and 7700X, which offer too few cores for the money!! Basically, AMD went from 8/16 c/t for $300-320 in 2017-18 to 6/12 c/t for the same money from 2020 until now.
@@Mr11ESSE111 I get your point... but the CPUs are $20 or $40 overpriced, while the mobos are $100 PLUS overpriced.
Does Intel CPU undervolting work on other motherboard series, for example B660, or only the Z series? Because I've only seen people undervolt on Z motherboards with K CPUs.
You don't even need to undervolt it; just don't run it with the bad auto-OC that most Z boards have.
"Bye" cut off again :(
You have to pay to see the whole video
are u shelton:),unbelievable
These comparisons are pointless.
performance & power consumption in eco mode.
CS:GO results are different from channel to channel, and there's a lot of variance in fps.
I pick the system that still runs Windows 7
Maybe I missed it: how were the temperatures under full load and in gaming?
At 11:40, but I don't think he mentioned the cooler used (same for Intel and Ryzen).
I really want the 7950x, but I have no justification to buy it. 😢
Remember you can save $100 more with the Intel 13900K rather than the AMD Ryzen 9 7950X.
Cyberpunk having the same graphs as CSGO 1:47
For the Intel 13900K, if you overclock it a bit it needs a nuclear reactor; it goes instantly to 100 degrees even with a €300 AIO, and to 70 degrees with a real water-cooling loop. WTF. Other than that, this is the last LGA 1700 CPU. That's not okay.
this is cope, AMD has the same slightly worse cooling.....
@all Ryzen 7 5800X3D. Stay with it; an upgrade is absolutely not necessary.
My 2667v2 Is the best. Just so you all know :D
Still, after seeing all the relevant reviews for Zen 4 and Raptor Lake, I can't help but wonder who these processors are for. The high-core-count CPUs really try to be a jack of all trades and cost accordingly. But who needs this? If you need a workstation desktop PC, sure, but that is a use case for maybe 1% of customers. And if it were strictly for gaming, you would assume by now that 4K will be easily achievable with all new GPUs in the future; but again, who runs into a CPU bottleneck at 4K? Owners of $2000 GPUs, perhaps, and that again is only relevant for around 1% of customers. It seems adopting the best desktop hardware is becoming a very elitist thing from now on.
One man's "massive increase" is another man's "meh".
In my honest opinion, Brian, your review is much better than Hardware Unboxed's; they were too negative about Intel in theirs. Good job with your review, keep it up 💪💪
HU is biased and way too negative; even when Intel clearly beat AMD, they want to bring Intel down. By the way, I've watched more than one review of these CPUs, and most of them are positive, unlike HU's.
@@hygarthwilliams7429 Ironically, other reviewers are TOO positive about Intel. E.g. they used wholesale pricing for Intel and consumer pricing for AMD, saying the 13900K will cost $600, which is BS; it's $660 for consumers. So they're wrongly saying Intel is $100 cheaper. Not to mention they gloss over the insane power consumption. If you live in Europe, the marginal price difference will be diminished by your electricity bill. Sure, you can tweak the CPU, but realistically only a small number of people will do that.
Ohh, so you will stick with the 13900K... hihihi. What about power draw compared to the 7950X? Why are you not planning to use Ryzen and undervolt it, as you said in your previous videos?
The 13900K is only 1-2% faster than a 12900K for gaming, so don't bother with it, tbh. Keep your 12900K or get a 13600K, or a 5800X3D on the AMD side, and save $100s. Don't go by the FPS highs; go by the 1% lows.
A 4090 and a 5800X3D, or wait for the 14900K or something else?
Wait for 7950x3D, I have a feeling it will change up the game.
The 7700X is overall faster than the 7950X in games.
I’m on a crusade. Reviewers need to add the most-played games: Fortnite, Apex, Warzone.
Apex is done; no point with a 300 FPS cap. I recently had it in my tests, but for CPUs now, all of them can hit that cap, and the 4090 is taking it over 220 FPS at 4K max too now, lol. I can look at Fortnite and Warzone, but they will get over 200 fps easily on practically all the CPUs featured in this review.
Undervolting 🙂
The best gaming and workstation CPU is the XEON!!!! Long live X58!
What about the fact that Z790 motherboards don’t feature an M.2 Gen 5 slot for your SSD, while an AMD motherboard does have one? That’s huge, in my opinion.
There is more to workstations than Premiere Pro and video editing.
I showed 5 productivity benchmarks; that should paint a good picture that they will trade blows depending on the app. Get what performs best for you.
@@techyescity What about some engineering/scientific workloads where the PC will crunch numbers for a few hours?
that is some thumbnail
This review is very interesting and undervolting the 7950x brings impressive results.
NEARLY as good as the 13900k
Game resolution 1080p + lowest available detail settings: we are going to extremes to show the difference between these CPU monsters. Basically, if you are a gamer with a midrange GPU, just buy a decent CPU and go play. If you are buying/building a midrange gaming PC, just buy the best GPU you can afford, then get a decent CPU (Core i3/i5/Ryzen 5) + RAM and enjoy gaming.
Exactly; this is why I like to show those 4K numbers. Even with a 4090, it's going to make very little difference at 4K.
@@techyescity At least you take the time to explain whether it's worth it or not, depending on each case. Thank you, and keep up the good work and content!
Btw, that undervolt wasn't enough; it can do 4.9 GHz at around 0.9 V, so you had something like -400+ mV of undervolt headroom. And the power consumption would've been like 130-140 W 😂
Pshh.. then Intel would win.. can't have that lol
What if it isn't stable and you lose performance?
@@skorpers this only happens with memory. If it isn't stable, it crashes; it doesn't lose performance. You'll either have 99.99% of the performance or a crash, and basically any time it's less than 100%, it'll crash.
@@christophermullins7163 Clock stretching is alive and well, has been for years.
@@skorpers nah
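For what it's worth, the undervolt claim in this thread can be sanity-checked with the first-order dynamic power model P ≈ k·f·V². The stock figures below (≈5.5 GHz all-core at ≈1.3 V inside the 253 W limit) are assumptions for illustration, not measurements, and the model ignores leakage and uncore power, which is why a real chip would land somewhat higher, plausibly near the 130-140 W the commenter guessed:

```python
# First-order dynamic power scaling: P ≈ k * f * V^2.
# Stock baseline (5.5 GHz at 1.30 V drawing 253 W) is an assumed figure
# for illustration only.
def scaled_power(p_stock, f_stock_ghz, v_stock, f_new_ghz, v_new):
    return p_stock * (f_new_ghz / f_stock_ghz) * (v_new / v_stock) ** 2

estimate = scaled_power(253.0, 5.5, 1.30, 4.9, 0.90)
print(f"estimated draw at 4.9 GHz / 0.90 V: {estimate:.0f} W")  # -> ~108 W
```

The pure f·V² term alone gives roughly 108 W; adding back static and uncore power closes the gap to the 130-140 W estimate.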
Meh, I have an 11900K and it's still more CPU than I need for 4K gaming on the 3090. By the time I need a new GPU, 16th gen Intel will be out.
1:18
Cinebench single-core is a useless statistic, since no one renders with one core. Try looping the test for a while; you will see Intel start to thermal throttle and thus lower the score.