@@TalonsTech Yeah, I think it's 3nm for the i7/i9 Arrow Lake parts and 2nm for the i5 Arrow Lake. Intel's 17th gen, Nova Lake, will use 1.4nm and comes out in 2026. I wish Intel released their GPUs faster though.
Capped my 14900K to 253 watts, downloaded the microcode update, all good here, no issues in gaming or productivity. Renders my 2-minute 3D projects in an hour.
@@joepearson6644 Nope. I got my 14900K a little over a month ago, when I already knew about all the problems with Intel, and I've never had a problem with it since day one. I switched from my 12900K to the 14900K with no crashes. After a week of having it I set the power limit to 253 watts, no issues; then I got the microcode patch, no issues.

When it comes to the internet, social media always blows things out of proportion. Yes, 13th and 14th gen have issues, but the majority were for people on 13th gen who overclocked their CPUs when the motherboard was already overclocking them out of the box. Add the high voltage requests on top of that, and that's how some CPUs degraded, especially 13th gen CPUs that are over 2 years old.

Why should I be scared of long-term damage when my CPU has never had an issue? I downloaded the patch, and I have a 5-year extended warranty if something does happen. So I have nothing to worry about. I stopped using AMD CPUs after the Phenom Black days back in 2008, went to Intel in 2010, had an i7-970 for 10 years without a single issue, upgraded to a 12900K in 2021, then sold that for a 14900K in July. Never had a problem since.
X3D is great for 1080p gaming... if you play at 1440p or 4K it's all GPU-bound... it's better to stick with the 9950X or 7950X. They have faster productivity performance as well.
@youtubeaccount7544 That is only true on like two models. The 7800X3D beat the 14900K in almost every possible game for cheaper, less heat, less power, and it doesn't kill itself.
He doesn't use 250 W for gaming when it isn't needed, and if you have a QHD or 4K monitor there is no reason to overvolt the CPU when it can sit perfectly fine at 150 W in gaming. Use 250 W if you want, but it isn't needed to beat AMD (non-X3D).

I also have an Intel CPU (i7-14700K) and no instability issues because I keep it in Eco mode (or normal mode). Maybe it'll happen to me, but for now I don't have instability issues. Normally I use 120 W at most for gaming, and sometimes I'm a bit drunk (not literally) and decide to Cinebench my CPU: it hits 250 W with a peak temperature of 70°C, using an Arctic Liquid Freezer III RGB (stock; I have other, more powerful fans, but for now I'm okay).

I don't know why these people buy overkill coolers for a CPU that uses a lot of wattage (AMD and Intel 7000, 9000, Alder Lake and Raptor Lake) and then run the most aggressive profile for an extra 10% FPS. Intel has the more power-hungry profile, but both have temperature issues. No hate to any brand (cuz I use both ☠️)
@@Igor-Bond Yes, I guess the CPU will be chilling at 4K. I believe it also gets stressed at 1440p, but that's mostly in Starfield and Cyberpunk.
The reason for this is that Starfield, like Fallout 4 and Skyrim, is full of draw calls, unlike other Bethesda-published games such as the Doom series. This lack of draw-call optimization in Bethesda's RPGs, which makes them always so heavy on the CPU, is also the reason it's so easy to make mods for these games, unlike games optimized around draw calls. So when you see someone saying that Bethesda always uses the same game engine and never learns from the past, you're looking at someone who either doesn't like mods or doesn't understand why they are always possible in Bethesda's RPGs. When the people at Bethesda decide to keep using the "same" game engine, believe me, they know why they do it.

And what does this have to do with AMD CPUs? Somewhere during the development of CPUs (I think it was when the first Core Duos launched), Intel doubled its CPUs' capacity to process draw calls per cycle, while AMD chose not to increase that processing capacity. From that moment on, AMD CPUs were condemned in terms of performance in games with a lot of draw calls, something that became very evident when a game called Skyrim was released in 2011. This is why, to this day, AMD's high-end CPUs lag behind in Bethesda's RPGs and even other games.

Because of the recent problems with the 13th and 14th generations of Intel processors, I came here to see the performance of this new high-end AMD processor in Starfield, to see if it was possible for me to go back to an AMD CPU (my last AMD processor was a Ryzen 5 1600X), but unfortunately I see that I will continue, for a while longer, with my old and faithful i7-12700K. Sad. 😒
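As a toy illustration of the draw-call point above (every number here is invented, and real engines batch in far more complicated ways), the per-frame CPU cost scales with the number of draw calls, not the amount of geometry:

```python
# Toy model: every draw call carries a fixed CPU submission overhead,
# so a scene split into thousands of tiny calls (easy to mod, as in
# Bethesda's RPGs) costs far more CPU time than the same geometry
# submitted in a few big batches. The 10 us figure is hypothetical.

CALL_OVERHEAD_US = 10.0  # assumed fixed CPU cost per draw call, microseconds

def cpu_submit_time_ms(num_draw_calls: int) -> float:
    """CPU time per frame spent just submitting draw calls, in milliseconds."""
    return num_draw_calls * CALL_OVERHEAD_US / 1000.0

heavy = cpu_submit_time_ms(10_000)  # unbatched, mod-friendly scene
light = cpu_submit_time_ms(100)     # same scene, aggressively batched
print(f"unbatched: {heavy:.0f} ms/frame, batched: {light:.0f} ms/frame")
```

At 10,000 calls the CPU alone would need 100 ms per frame (a 10 fps ceiling), which is why draw-call-heavy games stay CPU-bound no matter how fast the GPU is.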
Not being a fanboy here, but that's a 16-core processor, of course it will burn some fuel. That, and it doesn't die from the oxidation. I really want Intel to get back on track, so we as consumers can win.
Oh, now we're worried about power consumption again, I see. AMD in their infinite wisdom says the Xbox Game Bar needs to be used with the 9900X and 9950X when gaming. You know, the core-parking thing that needs to be used with the 7950X3D and 7900X3D. If I was gaming, I wouldn't even touch the 9950X. The cores on the second CCD run at a slower frequency than the ones on CCD 0. AMD has some things to sort out with Zen 5, mainly with scheduling. Don't expect any miracles though. At best the 9950X will be able to match the 13900K and 14900K in gaming.
@@awaiswasi354 Spreading blatant misinfo about the oxidization issue has killed any credibility you had left. Meanwhile, my 13900KF I bought back in Jan 2023 is still going strong at 5.8ghz and beating any 7800X3D in gaming, rendering, and strenuous workloads.
@@battlephenom8508 Misinformation, bruh? I had an i9 13th gen and it was giving me problems too. That isn't misinformation but the experience I had with Intel. Boyo, don't be a shill and just chill.
Well to be fair, (at least before the microcode patch) Intel's best CPUs were degrading and crashing, while the 7800X3D was not only faster, it also drew less power.
@@auritro3903 Not being a fanboy, but I think we should compare Intel CPUs with AMD's non-X3D CPUs, because the X3D CPUs are only meant for gaming while the 9950X and 14900K are meant for both gaming and productivity. Edit: Also, Windows is getting a new update (24H2), and according to them it should fix Ryzen 9000 series performance to match the benchmarks AMD shared with us before launch.
@@ProZog99Shorts HUB did some testing, and while the patch did improve performance, it is still nowhere near what AMD claimed. AMD claimed the 9700X was 10% faster than the 14700K in gaming, but actual benchmarks by HUB showed the 9700X on 24H2 was still 3% slower than the 14700K.
I was about to switch from LGA1700 to AM5, but now I think I should wait for Intel's 15th gen / Arrow Lake CPUs. I don't want any X3D CPUs, as my main tasks are both productivity and gaming.
Hello guys, could you help me? I'm a complete layman when it comes to hardware and I wanted to hear opinions from people who are familiar with it. I want to build my first PC but I have doubts. I want to play, but I want to edit videos and render some things in blender/maya. I heard that the R7 7800X3D is better for gaming and stuff because of the cache, something like that. However, the R9 7950X is better for multitasking, so I wanted to know from you, which one should I buy? If there is another AMD better than the ones I mentioned, you can tell me about it. I saw that the R9 90xx came out, but it seems that they are inferior to the ones I mentioned previously. I want to build my first PC so I want to make sure I have the best hardware we have so far. Remember, I want to use it to play games but I also want to use it to edit videos and render in Blender/Maya. Sorry if everything I said is confusing, I'm using the translator.
Hi brother, I have bad English, sorry. I don't know how much money you want to spend, but if you're rich, don't overthink what's worth buying: all the CPUs you mentioned (7950X3D, 7800X3D, 14900K, etc.) are great. I have to say the next-gen CPUs (Core Ultra and R9 9000) are like last gen but a fixed version of it, in power usage and temps and so on. The 7800X3D is the best of the best for gaming, with a few crashes in UE5 and Blender rendering. The 7950X3D isn't worth buying, it's so pricey (maybe my country's prices are just terrible); you can buy it and disable half the cores and you'll get about the same performance as a 7800X3D, maybe a little lower, like 8%. The 14900K is really, really good at gaming and rendering, but you need custom water cooling (open loop or closed loop) if you want to see its true power. You can hold the CPU even at 302 W with a ROG Ryuo III 360 at 89°C without any issue, like me. Just go watch @framechasers videos, maybe they'll help you. Anyway, be well brother, and always sorry for my bad English. God bless.
@@Rastin_In_Creation So bro, I want to buy an AMD and since it's my first PC I want to buy the best components they currently have so I don't have to upgrade after a while. I would even get the R7 7800X3D but I'm a little worried because I've seen a lot of people saying that it's only good for games. I want to play, but I also want to edit and use Blender, so I have doubts. Until a while ago I was almost 100% decided on getting the R9 7950X but then they launched these new 9000 series and I went back on my decision.
@@宗派 Pretty much everyone on here is focused on Gaming performance. For Blender, nothing beats the 9950x. The 9950x is about 2.5x times faster in Blender than the 7800x3d. However, the 7950x is also pretty good at running Blender. The 9950x is only 10% to 15% faster in Blender. Also, a lot of people use the GPU to render instead of the CPU. As far as the 7800x3d goes, you would need to pair it with a high end graphics card and play games on a high refresh rate monitor to get the most out of it. People are Hyper focused on framerate, which is fine, but how much good will that do you if you plan to game on a 1440p monitor with a 165hz refresh rate. Most of these CPU gaming performance benchmarks are done with an RTX 4090, at 1080p, because the workload is shifted to the CPU. Because of how the workload shifts based on settings, screen resolution and the Graphics card you're using, the difference between the 7800x3d, Ryzen 9900x, 9950x, and 7950x at 1440p with a graphics card like the RTX 4070ti Super will not be that big. If budget is a concern, you will be better off saving money on the 7950x and putting the saved money towards a better Graphics card.
@@m8x425 I was just thinking about buying a 4070 ti super, so instead of buying the R9 9950X I buy the R9 7950X and a 4080 super? 4090 is unfeasible for me at the moment, it is very expensive in my country.
Is this with the new Intel microcode? Also, is the Ryzen running under an administrator account? Windows 11 gimps 9000 series performance; a fix is underway. Waiting on the 9000X3D CPUs, to be announced at CES 2025.
More like when Intel uses dual ranked 7200 2x32gb or 8400 single ranked 2x24gb DDR5. Intel would absolutely smash this chip when not held back by that slow 6000 DDR5.
@@TalonsTech Take a 14900KS and overclock it. Use TVB to clock all core to 6ghz+ until it reaches 65-70c. Then make sure to tune voltages below 1.4v vcore (don't want to degrade) get DDR5 8200-8400XMP. Tune timings for lower latency. You now have the fastest gaming cpu on the planet.
It will be cool to watch a comparison again when AMD fixes the Admin account bug as well as with the 100w versions of these chips that are rumored to be coming out soon.
I don't understand how the hell you're maintaining a 14900K under 70°C. Even with the newest BIOS update and 360mm water cooling my i9 goes through the roof at 85°C. I have to undervolt it to keep it stable.
You need a really good AIO or custom cooling with multiple rads. I'm using Arctic Liquid Freezer III 420mm in push/pull (6x140mm fans) with Arctic MX-6. The other option is direct-die cooling or delid and use liquid metal. With my setup I can sustain 310-320w power draw in 21c ambient room without thermal throttle. In games it's much cooler.
@@chovekb It's the same as the 7950X in productivity, or slightly better. Not a good leap, and expensive compared to the 7000 series. As for efficiency, just buy a non-X 7000 series CPU; it's the same as 9000 while losing 5-10%. Watch Hardware Unboxed, not Linus Tech Tips.
@@Raghav-gd3fn I've watched all that, you watch it again! They are testing with the classic cpu testing tools. You watch the new AI tests and even with the classic ones 10% and some efficiency is still fine. Btw there's a windows admin and user account bug that restricts these cpus!
That is how Intel works. They have a higher 253 W or 320 W limit depending on which profile you use. So as long as you're under the power/current/temp limits, the CPU will boost to its all-core max and stay there 24/7. For gaming, it will usually be under those limits and sit at max clocks. For heavy all-core use like rendering, it will throttle back due to power/current/temps. AMD is not the same: it boosts and throttles earlier. Just different approaches.
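A minimal sketch of that limit-driven behavior (all numbers invented; real boosting also involves per-core turbo tables, current limits, and thermal velocity boost):

```python
# Toy model of limit-based boosting: run at max clock while package power
# fits under the limit, otherwise clock down just enough to fit. The linear
# power-vs-clock scaling here is a simplification; real scaling is worse
# than linear because voltage rises with frequency.

def effective_clock_ghz(watts_at_max_clock: float, power_limit_w: float,
                        max_clock_ghz: float = 5.7) -> float:
    if watts_at_max_clock <= power_limit_w:
        return max_clock_ghz  # gaming-style load: full boost, sustained
    # all-core render-style load: throttle back to respect the limit
    return max_clock_ghz * power_limit_w / watts_at_max_clock

print(effective_clock_ghz(180, 253))            # light load -> 5.7
print(round(effective_clock_ghz(400, 253), 2))  # heavy load -> 3.61
```

This is why the same chip can "sit at max clocks" in games yet clock well below its rated boost in Cinebench or Blender.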
The 9950X is not the best for gaming, but sure, it can game just fine too. I'd gladly switch my 14900K and 14900KF to 9950X systems. If you don't want to spend endless hours tuning your 14900K with power and voltage settings and stability tests, AMD is a no-brainer. But if you're a gamer, best to go for X3D.
I've never understood why these comparisons even exist in the first place. Ryzen 9 / i9 aren't meant for gaming. Not to mention that when someone has one of these CPUs, they likely have a high-res monitor anyway. Productivity is where they should be compared. Gaming comparisons at this tier are useless imo.
To test the power of the processors, 1080p is used, a resolution that doesn't load the graphics card, so that the results are determined by the processor's power and not the graphics card's. If 4K were tested, it would be testing the graphics card, not the processor. Streamers and YouTube content producers who shoot gaming videos want gaming performance, and those who want to play comfortably at 4K prefer these processors. Those who will use one for gaming look at gaming tests, while those who will use one for work look at synthetic tests.
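The resolution argument above can be sketched with a toy bottleneck model (every number here is hypothetical, purely for illustration):

```python
# A frame needs both CPU work (game logic, draw calls) and GPU work
# (rendering); whichever is slower sets the delivered frame rate.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU throughput at each resolution, in frames per second
gpu_fps_by_res = {"1080p": 240.0, "1440p": 150.0, "2160p": 75.0}
cpu_a, cpu_b = 200.0, 180.0  # two CPUs roughly 11% apart in game-logic speed

for res, gpu in gpu_fps_by_res.items():
    print(res, delivered_fps(cpu_a, gpu), delivered_fps(cpu_b, gpu))
# At 1080p the CPU gap is visible (200 vs 180); at 2160p both cap at 75,
# so the GPU hides the difference entirely.
```

This is why reviewers pair a top GPU with 1080p: it keeps `gpu_fps` high enough that the CPU is the term that wins the `min`.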
@KuRtOğLu_MuStAfA68 What I'm saying is it's dumb to buy these for gaming only, and when you're also doing productivity you likely don't sit there with a 1080p monitor anyway.
@@auritro3903 True. Even though I prefer Intel over AMD (I have an AMD GPU though), Intel still beat AMD with a refresh of a two-year-old CPU, when Zen 5 was supposedly going to clap Intel's old refreshed parts. AMD still gets nowhere near them without the X3D, and Intel doesn't even have anything like that. Disappointing from AMD.
@@indrauchiha1133 True, but the failing CPUs though. And we still have the 9000X3D coming up, which will be insane for gaming, and we have the 7000 series X3D chips as well.
How do you achieve such high FPS in Counter-Strike 2 with a 4090 and 14900K? I also have a 4090 and 14900K but only get about half the frames, around 300. I've tried QHD, FHD, and 1280x960.
@8ccv70 I think your PC has some issue, because any 14900K gets the same FPS as shown in this video. Did you tweak anything in the BIOS regarding CPU settings? What is your motherboard? Is the RAM 6400 CL32?
I'm in the process of upgrading from an Intel 5820K, which is starting to fail on me. I've locked in and decided it comes down to either the 14900KF, Ryzen 9950X, 7950X3D, or 9900X. I'm in research mode right now, looking at comparisons and trying to decide.
Go with the Ryzen 7800X3D if all you do is play games. It's a last-gen CPU but is actually faster in games than the two being benchmarked, and it gives you an upgrade path to the next X3D version of the Ryzen series.
@@davidcole2337 Thank you for the recommendation and reply. It wouldn't only be for gaming but also UE5 game development, rigging and 3D animation in Maya, rendering in UE5, digital audio workstation work in Ableton Live, occasional VR gaming and development, and video editing / live streaming. Would the Ryzen 7800X3D be okay for that sort of workload too? Again, thank you for the recommendation.
If gaming isn't the priority, go for more cores (in reality, just check some more benchmarks). I personally will go for a 9700X/9900X 1440p ITX build for mixed workloads.
@@eivisch Thanks for the help. From what I've seen, I think I'd need something with a lot of cores for my use case. If I go the Ryzen route it's between the 7800X3D, 7950X, and 9900X. Leaning more to the 9900X atm, or the Intel 14900KF if I go that route. I'll look into more comparisons. Thank you for the reply.
I think if you have time, wait this year, as Intel will likely announce its 15th gen Arrow Lake in October or November. Rumours say it will draw much less power than the previous 14th and 13th gen.
Wdym 4090 version? There is only one 4090 version(apart from the China exclusive 4090D). If you are talking about the model, it's listed in the top comment and description.
Good day. 500k subscribers, and you're misleading every one of them with your videos. I can see right away that you're not a specialist in this field. To measure a processor's performance in games or synthetics, you need to disable or remove the graphics card; if the processor has no integrated graphics, that's already a minus for the processor. Next, in the game settings you need to set everything to minimum at the lowest screen resolution, so the graphics card is loaded as little as possible and all the load goes to the processor. You also need to set the minimum allowable voltage for the graphics card so it doesn't go into turbo mode, only its working mode, so that it's merely in the background.
The game tests were run at 1080p at the highest settings, and no one uses a processor without a graphics card. You only wrote that comment because AMD gave bad performance.
Although Zen 5 is disappointing for gaming so far, the 9950X is not a gaming CPU, and you'd have to park some cores to play games. So this comparison is useless, especially when 13th/14th gen Intel are broken.
They are not broken. The microcode fixed any issues for 14th gen. For 13th gen, if the user has been overclocking it since release then it's a bust, but they can RMA it for a new processor.
It's funny how all you AMD fanboys keep saying the 9950X is not a gaming CPU when people have been buying it for gaming. The 14900K is "not a gaming CPU" either, is over a year old, and still beats the new chip in gaming and in power consumption in most of the games shown. Still can't find the efficiency AMD was on about 😂
This doesn't match what I am seeing. I have 5 live gaming/benchmark PCs at present. I've only had the 9950X a week and played around with it, but it is handily beating my 14900K in benchmarks. I wonder if these tests were done before the Windows bug affecting AMD results was fixed; I saw an increase of about 12% on the 7950X right away.

I have been having increasing issues with my 14900K staying stable with any overclock, though. All of my PCs use very elaborate water cooling. I am also testing a test BIOS on the 9950X which increases power draw by default; that hurts efficiency, but as I am mainly running benchmarks this is not an issue for me. The 14900K has been failing to boot even when I reset the BIOS to defaults with no OC at all. Hoping I haven't killed it.

I haven't really tested the 9950X much in games, as I tested the 7800X3D a lot, and I guess the new X3D will probably own those comparisons anyway; it's just not my focus. Generally I don't build for gaming only, and the whole Game Bar thing and the other hoops to jump through get on my nerves. I guess I will get one if they iron that out, though.

But in defence of AMD on this occasion, I like this chip a lot. There seems to be a lot of performance left on the table. I just need to get my hands on the super-fast EXPO memory now; I'm stuck with non-EXPO 6400 while waiting for new RAM to be available.
My 13700K (I know, not the same, but still) runs at 5327 MHz while gaming and never passes 60°C that way... the only exception was when compiling shaders for TLOU1: it went to 82°C at 100% load, but that took two minutes max.
I don't know why my 14900K sees 70°C with an RTX 4080 while playing WoW at 1080p. Is this happening because of the resolution? I have 10 fans in total.
Okay, but performance in Adobe programs like AE or PR, or in Stable Diffusion, OBS, etc. is also important. So maybe lower FPS (I'm not sure), but better workflow performance, I think.
These benchmarks at such low resolutions as 1080p do not reflect real-world usage, as most users use higher resolutions like 1440p and 2160p. At these resolutions, the difference between various high-end CPUs is minimal or almost nonexistent, because the GPU is the most important factor.
What you're describing is a graphics card test, but this is a processor test. All over the world, processors are tested at the lowest resolution and graphics cards at the highest settings. If you lack the technical knowledge, please stay silent instead of making wrong comments.
For ultimate gaming performance, get a 7800x3d. For cheap gaming performance , get a 5800x3d or 5700x3d. People forget that pc users are not just gamers, so for non gamers get the nonx3d CPUs , be it 9000 series (if upgrading from old platform) or the 7000 series. AMD had it all covered, without the lingering fear of instability.
Ryzen 9000 is worse than intel in performance per watt and performance in general 😂. If you have an intel 13th or 14th gen cpu, lock your cores and you won’t have an issue.
It'd be nice if you showed the proof of the builds and specs first. (Edited for Context) From what I've seen. The 9950x performs much worse than it's depicted in this video.
@TalonsTech No, I'm an Intel fan. I have a 14900KS in my main rig and a 12600K in my other build. I've just seen so many reviewers show the 9950X not getting anywhere near the 14900K in FPS comparisons, and this guy's tests show they're like 10 fps apart in every game. Kinda sus.
@@AustnTok He pairs the Intel rig with slow 6000 DDR5. This is a huge advantage to the AMD chip since it basically maxes it out memory OC wise. Intel can use far higher DDR5 to get max perf and most reviews don’t use 6000 for Intel.
@TalonsTech yeah, I get it, I'm running 7200mhz with my 14900ks atm, I had 7600mhz but one of the sticks died. They do it because they think it "levels the playing field" or something, but 6000 on AMD is basically the equivalent to 7200 on Intel. Although, using 6000 vs 7200 or even 8000, with Intel you're unlikely to get more than 5-10 more FPS with those faster speeds in most games. I've tested it myself. I have 5 different DDR5 kits
I think AMD achieved something so exceptional with the 9800X3D that they simply decided to go down another path, leaving the CPUs without V-Cache more efficient and a bit better for work, and creating a gamer-oriented X3D line. That way they have two kinds of products for two audiences.
@@syedawishah Yeah, the Intel Arrow Lake i5 series will use 2nm, confirmed. The i7/i9 Arrow Lake series will use 3nm. I want to get the i5 series. Intel's 17th gen, Nova Lake, will be using 1.4nm, which comes out in 2026.
Lol... where is that efficiency? And it was marketed with gaming benchmarks. All those manufactured excuses are just silly. The 14900K also is not a gaming CPU... 😂
The 7800sh1t3d is a solution to get 10% better FPS nowadays. Wait for games to utilize more threads and you will see how much worse that 7800sh1t3d gets. The other argument, that the "9950X is not a gaming CPU", is simply poor and useless.
Most people here don't even realize that neither of these CPUs, nor the GPU, was made for gaming in the first place; they were made for productivity work! Go ask their creators if you want, they'll tell you the same LOL
@@laszlozsurka8991 It's for rendering work, like all the 90-class models! Gaming on a 450W GPU is just DUMB. Ada is for AI work, don't argue with me! Your Chrome picture says it all LOL
@@chovekb Then why did NVIDIA market it as a gaming GPU? Don't argue with me when you don't know shit. NVIDIA marketed the 4090 as an enthusiast gaming GPU; get your facts straight before you spout BS.
It consumes the same level of power as the Intel 14900K, a processor built on 10nm, has the same temperatures, and still lags behind Intel. On top of that, AMD's price is $650 while Intel's is $545.
What a humiliation for AMD: it lost to Intel in every scenario and condition. It can't be that there were no improvements in this new generation of AMD processors. Quite the opposite, they got much worse and more expensive.
It looks like this is a Windows 11 issue. My friend is getting 190-200 fps in RDR2 with the 9950X on Windows 10, while the testing here gets 176.
Games:
Ghost of Tsushima - 0:09
CYBERPUNK 2077 - 1:05
Forza Horizon 5 - 2:10
Microsoft Flight Simulator 2020 - 3:15
Starfield - 4:15
The Witcher 3 - 5:18
CS2 - 6:28
Hogwarts Legacy - 7:28
Horizon Forbidden West - 8:23
Red Dead Redemption 2 - 9:27
System:
Windows 11
Core i9-14900K - bit.ly/3rTFhVy
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
Ryzen 9 9950X - bit.ly/3Av13mv
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz CL32
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
Day # (I don't remember anymore) of pointing out that you are testing CPUs on the highest graphics settings instead of the lowest.
bro add some PS3 emulation because it's hella CPU-demanding 😭
did you apply the new microcode update on 14900K?
Can you please redo the 7900X3D gaming benchmarks? After a year of BIOS updates, people are saying it's a lot better than it was!
The 9950X is a high-performance CPU, not a gaming-optimized CPU.
Keep in mind Intel is using 10nm while AMD is using 4nm. Intel's part is a refresh of Raptor Lake released nearly a year ago, while AMD's is a supposedly "ground-up" architecture that's claimed to have a 16% average IPC uplift and took two years to arrive after Ryzen 7000. Yet it's losing against Intel in performance while consuming only like 10-20 W less (in some cases similar power, and in a few cases even more power than the i9, while performing worse). AMD also said the 9950X would be upwards of 20% faster than the 14900K in gaming.
You know you have fucked up badly if your part consumes almost as much power as a 14900K while performing noticeably worse.
It's 10nm+++++... not bad though
I was thinking the same: all those marketing claims from AMD, so much BS. Intel is about to release Arrow Lake, which will be 3nm processors, so AMD really has to pull its finger out and ensure the 9000X3D series will at least be competitive with Arrow Lake. Otherwise Intel will make up for its huge fuck-up with the oxidation issues, and AMD will lose its chance to gain an edge in this market segment.
@@Hullbreachdetected Bud, Intel CPUs are not even stable lol
Couldn't be more wrong. The TDP is much lower, and this is not AMD's pinnacle chip, whereas it is for Intel. Wait for the 9800X3D/9950X3D to come out; they will wipe Intel off the map, as the 7800X3D currently does.
@@KiwiPepega yeah , you are right
AMD never fails to miss an opportunity to miss an opportunity. This has happened many times, and this time they had their greatest chance to increase their lead but instead decided to sit comfortably.
AMD = Accept Miserable Decisions
What the hell?
Arrow Lake will come soon; they are holding their X3D chips back. Somehow people don't realize it.
@@artorias89 The problem is that X3D is only for gaming, and obviously Intel doesn't have a similar CPU segment, so comparing it to X3D is silly, like stepping into the ring while ignoring the weight class.
@@chien.nguyen. I believe 8 cores with X3D only shine at gaming, whereas 12 or 16 cores with X3D can do productivity and gaming together.
Not a fanboy, but where are the "improvements" on AMD's side?
9000 series sucks, 7800x3d is still the goat. hopefully the 9800x3d delivers great performance gains
It's kinda disappointing. I was about to upgrade from AM4.
This CPU isn't really for gaming ideally; only from the X3D models should you expect high gaming perf.
I'm an amd fanboy and I'm telling you, there's none
On the productivity and efficiency sides. And the 9800X3D hasn't even been shown yet. Now, where's Intel's stability, durability, efficiency, or anything new at all? LOOL
My 14900K is still running great: locked the cores and undervolted. I will stick with it for a while.
did you lose some cores?
@@SUPERVINAO No I have all my cores.
@carlo1132 I would RMA it; who knows how much damage it could have. I mean, it might be OK now but could fail in weeks or months. Best get it checked 😮
@@joepearson6644 According to some who have tested the new 0x129 microcode, it seems to have stopped the transient voltage spikes, so it should technically stop further degradation.
@@mathesar As for the voltage aspect, yes. But now there's the copper corrosion issue. Intel admitted that all of their 13th and 14th gen CPUs have this problem, so even if the microcode is resolved, you still have a degrading chip.
For gaming, the Ryzen 7800X3D is still unbeatable
Yeah I’m considering whether I should get that or wait until the 9800X3D comes out
Heck, even the Ryzen 7 5800X3D can still beat these latest chips.
Lol bro, outside of gaming the 7800X3D sucks 😂
In video rendering and 3D animation, the i9 wins.
Hey, I've been using the Ryzen 7800X3D for a year now and it's amazing. I haven't been following PC news lately; is the 9800X3D worth upgrading to?
@@OneFreeMan17 Nah, just skip it and wait for Zen 6.
Where are the so-called "efficiency improvements" and "performance gains," AMD?
X3D
eco mode lol
Where is it ?
Well... In your dreams.
2 years, a new node, for nothing.
They went towards AVX2, AVX-512, AVX-VNNI, and other server side tasks.
@@get-in-2-get-out774 I see that its efficiency is as bad as Intel's latest i9... with a new process node plus a new architecture....
Whereas 13th-to-14th gen had no process node improvement and no architecture improvement, so it's understandable.... But AMD, huh?!
The Ryzen 9000 series is disappointing.
You are wrong. Wait for the 9800X3D, probably next year.
@@furieux6742 How can this person be wrong? If the AMD processor isn't as fast as the Intel processor but costs more, that's a fail.
Take this L, and have a seat way over there.
@@christophejergales7852 when your i9 13th gen and i9 14th gen will die on you, you will take and L and sit down in the corner crying
@@phantom0590 I don't want to be 13th and 14th gen and get burned, badum tiss🥁
😂🤣
For the people thinking I am saying this in favor of Intel 13th and 14th gen: I am not. The 9000 series is disappointing even when compared with the Ryzen 7000 series.
There's absolutely no reason to buy these processors.
Wait for the 9800X3D; if it fails to deliver, then Zen 5 will be the worst architecture of all time.
exactly
AMD Bulldozer nearly killed the company 😮
It's 8 cores. If you want to do anything else besides game, not great.
AMD using as much or more power than the Intel 14900K and getting fewer FPS.
Intel 10nm dominating TSMC 4nm. Biggg ooooffff.
Intel Arrow Lake is going to be using 2nm now. Arrow Lake will destroy AMD.
@@butonghit 2 and 3nm, but as you say, it's not going to be pretty. Arrow Lake dropping Oct 10. AMD is in some trouble this gen.
@@TalonsTech Yeah, I think 3nm for i7/i9 Arrow Lake, 2nm for i5 Arrow Lake.
Intel 17th gen Nova Lake will use 1.4nm, which comes out in 2026. I wish Intel released their GPUs faster though.
@@butonghit No, it's not really 2nm; that's just a name. If I had to guess, it's more like 4nm in reality, and "4nm" is really 7nm.
Dual CCD CPUs or even high core count CPUs are highly inefficient for gaming.
Capped my 14900K to 253 watts, downloaded the microcode update, all good here, no issues in gaming or productivity. It renders my 2-minute 3D projects in an hour.
Are you not worried about any long-term damage?
My condolences on your choice of processor; I wouldn't wish it on my enemy.
@@joepearson6644 Nope. I got my 14900K a little over a month ago, when I already knew all the problems with Intel, and I've never had a problem with it since day one. I switched from my 12900K to the 14900K, and no crashes... After a week of having it I set the power limit to 253 watts, no issues; then I got the microcode patch, no issues. When it comes to the internet, social media always blows things out of proportion. Yes, 13th and 14th gen have issues, but the majority were for people on 13th gen who overclocked their CPUs when the motherboard was already overclocking them out of the box. Now add the high voltage requests to that, and that's how some CPUs got degradation, especially 13th gen CPUs that are over 2 years old, etc.
Why should I be scared of long-term damage when my CPU never had an issue, I downloaded the patch, and I have a 5-year extended warranty if something does happen? So I have nothing to worry about. I stopped using AMD CPUs back in the Phenom Black days in 2008, went to Intel in 2010, had an i7-970 for 10 years without a single issue, upgraded to a 12900K in 2021, then sold that for a 14900K in July. Never had a problem since.
@@cfif_asd Thanks, it's a great CPU. I'm enjoying it; it renders my 3D projects fast.
@@joepearson6644 I replied to you but I'm not seeing it
Microcenter has the 7950x3D for $399 if I buy the combo deal...so tempting.
Do it!
That's a steal. You should get it.
X3D is great for 1080p gaming... if you play at 1440p or 4K it's all GPU-bound... it's better to stick with the 9950X or 7950X. They have faster productivity performance as well.
AMD missed the perfect opportunity to punch the air with the 9000 series. AMD feels like Intel now, 5% improvement every refresh 💀
Holy moly, the R9 costs more, draws more energy, and is slower? 😨
Not always more energy, but slower? Yeah, AMD was clearly sleeping for the last 2 years.
Man, it takes something special to beat the i9-14900K in power consumption.
I wonder if he used the XBox "Game Bar" that you're supposed to use with the 7950x3d and 7900x3d.
@@m8x425 yeah and create a local admin with UAC turned off so the parked cores are in the kernel garage
@youtubeaccount7544 That is only true on like two models. The 7800X3D beat the 14900K in almost every possible game for cheaper, less heat, less power, and it doesn't kill itself.
How in the hell did you keep the Intel i9's temperature in check?
What kind of CPU cooler do you use?
Read the video description
Msi mag coreliquid C360
He doesn't use 250W for gaming when it isn't needed, and if you have a QHD or 4K monitor there is no reason to overvolt the CPU when it can sit perfectly fine at 150W in gaming. Use 250W if you want, but it isn't needed to beat AMD (non-X3D). I also have an Intel CPU (i7-14700K) and no instability issues because I keep it in Eco mode (or normal mode). Maybe it'll happen to me eventually, but for now I don't have any instability issues.
Normally I use 120W max for gaming, and sometimes I get a bit carried away (not literally drunk) and decide to Cinebench my CPU; it hit 250W with a peak temperature of 70°C, using an Arctic Liquid Freezer III RGB (stock fans; I have other, more powerful ones, but for now I'm okay).
I don't know why these people buy crappy coolers for a CPU that uses a lot of wattage (AMD 7000/9000, Intel Alder Lake and Raptor Lake) and then run the most power-hungry profile for an extra 10% FPS. Intel has the more power-hungry profile, but both have temperature issues. No hate to either brand (cuz I use both ☠️).
Performs worse, costs more, and uses just as much power.
🤨
Starfield is so BRUTAL on the CPU... look at all that power being used by the i9 lol
I agree with you, 145 fps versus 127 fps, it's really cool, pure power.
@@Igor-Bond It's also a waste. You only need 80-100 fps to enjoy Starfield. Capping will probably fix the power/temps issue.
@@Ladioz And if you load the video card with 4K resolution, then you won't need any caps, and you could even get by with a weaker processor.
@@Igor-Bond Yes, I guess the CPU will be chilling out at 4K. I believe it also gets stressed at 1440p, but that's mostly in Starfield and Cyberpunk.
The reason for this is that Starfield, like Fallout 4 and Skyrim, is full of draw calls, unlike other Bethesda-published games such as the Doom series. This lack of draw-call optimization in Bethesda's RPGs, which makes them always so heavy on the CPU, is also the reason why it's easy to make so many mods for these games, unlike games that are optimized around draw calls. So when you see someone saying that Bethesda always uses the same game engine and never learns from the past, you are seeing someone who either doesn't like mods or doesn't understand why they are always possible in Bethesda's RPGs. When the people at Bethesda decide to keep using the "same" game engine, believe me, they know why they do it.
And what does this have to do with AMD CPUs?
Somewhere during the development of CPUs (I think it was when the first Core Duos were launched), Intel doubled its CPUs' capacity to process draw calls per cycle, while AMD chose not to increase this processing capacity. From that moment on, AMD CPUs were condemned performance-wise in games with a lot of draw calls, something that became very evident when a game called Skyrim was released in 2011. This is why, to this day, AMD's high-end CPUs lag behind in Bethesda's RPGs and even other games.
Because of the recent problems with the 13th and 14th generations of Intel processors, I came here to see the performance of this new high-end AMD processor in Starfield, to see if it was possible for me to go back to an AMD CPU (my last AMD processor was a Ryzen 5 1600X), but unfortunately I see that I will continue, for a while longer, with my old and faithful i7-12700K.
Sad. 😒
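The draw-call argument above can be sketched as a toy model (entirely my own illustration; the 20 µs per-call cost, the function name, and the numbers are assumptions, not measurements from the video or from any engine):

```python
# Toy model: each draw call has a fixed CPU submission cost, so a scene
# that issues one call per object is far heavier on the CPU than one
# that batches many objects into a single call.

CALL_OVERHEAD_US = 20.0  # assumed CPU cost per draw call, microseconds


def cpu_submit_time_us(num_objects: int, objects_per_batch: int) -> float:
    """CPU time spent just submitting draw calls for one frame."""
    draw_calls = -(-num_objects // objects_per_batch)  # ceiling division
    return draw_calls * CALL_OVERHEAD_US


# 10,000 scene objects: one call per object vs. 100 objects per batch
unbatched = cpu_submit_time_us(10_000, 1)    # 10,000 calls
batched = cpu_submit_time_us(10_000, 100)    # 100 calls
print(unbatched, batched)  # 200000.0 2000.0
```

The point is only the ratio: batching 100 objects per call cuts CPU submission time a hundredfold, which is why draw-call-heavy games lean so hard on per-core CPU performance.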
Now this time even the power consumption is higher. Fanboys are gonna be completely silent for this one.
Not being a fanboy here, but that's a 16-core processor; of course it will burn some fuel. That, and it doesn't die from the oxidation. I really want Intel to get back on track so we as consumers can win.
Oh, now we're worried about power consumption again, I see.
AMD, in their infinite wisdom, says the Xbox Game Bar needs to be used with the 9900X and 9950X when gaming. You know, the core parking thing that needs to be used with the 7950X3D and 7900X3D. If I were gaming, I wouldn't even touch the 9950X. The cores on the second CCD run at a slower frequency than the ones on CCD #0.
AMD has some things to sort out with Zen 5, mainly with scheduling. Don't expect any miracles though. At best the 9950x will be able to match the 13900k and 14900k in gaming.
@@awaiswasi354 Spreading blatant misinfo about the oxidation issue has killed any credibility you had left. Meanwhile, the 13900KF I bought back in Jan 2023 is still going strong at 5.8GHz and beating any 7800X3D in gaming, rendering, and strenuous workloads.
@@battlephenom8508 COPIUM!
@@battlephenom8508 Misinformation, bruh? I had an i9 13th gen and it was giving me problems too. That isn't misinformation; it's the experience I had with Intel. Boyo, don't be a shill and just chill.
9950x belongs in the official garbage tier list
😂😂😂😂😂😂😂😂😂
My friend was saying AMD has won and is better than Intel; he's evidently not done any research 😂😂😂
So basically the AMD Ryzen 9000 series is kind of like Intel's 14th gen: it came out with very little performance gain.
And people are saying AMD is better than Intel rn 😂
Well to be fair, (at least before the microcode patch) Intel's best CPUs were degrading and crashing, while the 7800X3D was not only faster, it also drew less power.
@@auritro3903 the X3D chips are indeed faster in most AAA games but not in CPU Clock speed-intensive games (Counter Strike 2 for example).
@@auritro3903 Not being any fanboy, but I think we should compare Intel CPUs with AMD's non-X3D CPUs, because the X3D CPUs are only meant for gaming, while the 9950X and 14900K are meant for both gaming and productivity.
Edit: Also, Windows is getting a new update (24H2), and according to them it should fix Ryzen 9000 series performance and match the benchmarks AMD shared with us before launch.
@@ProZog99Shorts HUB did some testing, and while the patch did improve performance, it is still nowhere near what AMD claimed.
AMD claimed the 9700X was 10% faster than the 14700K in gaming, but actual benchmarks by HUB showed the 9700X on 24H2 was still 3% slower than the 14700K.
Yeah, because Microsoft slows down AMD processors on purpose.
I'm downgrading to Windows 10; even that version is still better.
10nm vs 4nm. wth?
What BIOS version and performance profile is being used on Intel?
I was about to switch from LGA1700 to AM5, but now I think I should wait for Intel's 15th gen / Arrow Lake CPUs, and I don't want any X3D CPUs since my main tasks are both productivity and gaming.
Intel will have a new CPU for the 1700 platform by next year. I would wait too.
@@jhendra83 wait really?
@@ProZog99Shorts I think they are moving to a new socket.
Zen 5 X3D will do both, its fully unlocked for overclocking.
Or you could wait for the 12x P-Core Bartlett Lake processor.
Yeah, in WS tasks the 9950x is better, but not better enough to upgrade from a 13900k.
No reason to get Ryzen 9000. The 7800X3D, and the 13900K with voltage regulation, will still kick ass.
7800x3d is still the best gaming cpu
No bro, just in gaming. For app workloads and multitasking Intel is always king.
@@syedawishah That's literally what he said.
@@syedawishah Can you read?
@@syedawishah So you mean to tell me that the 7800X3D is bad for Photoshop, 3D programs and such?? Only good specifically for gaming??
@@archvile1313 Its not bad for that stuff, people are rather "slow" in the head.
This might be the biggest disappointment ever
Not really. We all know the X3D variants are the game changers. These CPUs are for productivity and for people who may still be using 3000 series CPUs.
These are mostly production CPUs.
@@AbysmalEnd The 14900K still decimates in production tasks.
14th gen was even worse. Not the worst tbh
@@XxApo70xX AMD missed the opportunity to crush Intel.
Hello guys, could you help me? I'm a complete layman when it comes to hardware and I wanted to hear opinions from people who are familiar with it.
I want to build my first PC but I have doubts.
I want to play, but I want to edit videos and render some things in blender/maya.
I heard that the R7 7800X3D is better for gaming and stuff because of the cache, something like that.
However, the R9 7950X is better for multitasking, so I wanted to know from you, which one should I buy?
If there is another AMD better than the ones I mentioned, you can tell me about it. I saw that the R9 90xx came out, but it seems that they are inferior to the ones I mentioned previously. I want to build my first PC so I want to make sure I have the best hardware we have so far.
Remember, I want to use it to play games but I also want to use it to edit videos and render in Blender/Maya.
Sorry if everything I said is confusing, I'm using the translator.
Hi brother, my English is bad, sorry. I don't know how much money you want to spend, but if you're rich, don't overthink what's worth buying; those CPUs you mentioned (7950X3D, 7800X3D, 14900K, etc.) are all great. I have to say the next-gen CPUs (Core Ultra and R9 9000) are like last gen, but a fixed version of last gen in power usage and temps and so on.
The 7800X3D is the best of the best for gaming, with the occasional crash in UE5 and Blender rendering.
The 7950X3D isn't worth buying; it's so pricey, but maybe that's just my country.
You can buy a 7950X3D and disable half the cores, and you'll get about the same performance as a 7800X3D, maybe a little lower, like 8%.
The 14900K is really, really good at gaming and rendering, but you need proper water cooling (open loop or closed loop) if you want to see the true power of the 14900K. You can even hold the CPU at 302 watts with a ROG Ryuo III 360 at 89°C without any issue, like me.
But go watch @framechasers videos; maybe they'll help you.
Anyway, be well, brother, always.
Sorry for my bad English.
GOD BLESSES
@@Rastin_In_Creation So bro, I want to buy an AMD, and since it's my first PC I want to buy the best components they currently have so I don't have to upgrade for a while. I would even get the R7 7800X3D, but I'm a little worried because I've seen a lot of people saying it's only good for games. I want to play, but I also want to edit and use Blender, so I have doubts. Until a while ago I was almost 100% decided on the R9 7950X, but then they launched this new 9000 series and I went back on my decision.
@@宗派 Pretty much everyone on here is focused on Gaming performance.
For Blender, nothing beats the 9950X. The 9950X is about 2.5x faster in Blender than the 7800X3D. However, the 7950X is also pretty good at running Blender; the 9950X is only 10% to 15% faster than it. Also, a lot of people render on the GPU instead of the CPU.
As far as the 7800X3D goes, you would need to pair it with a high-end graphics card and play games on a high-refresh-rate monitor to get the most out of it. People are hyper-focused on framerate, which is fine, but how much good will that do you if you plan to game on a 1440p monitor with a 165Hz refresh rate? Most of these CPU gaming benchmarks are done with an RTX 4090 at 1080p, because that shifts the workload to the CPU. Because the workload shifts based on settings, screen resolution, and the graphics card you're using, the difference between the 7800X3D, Ryzen 9900X, 9950X, and 7950X at 1440p with a card like the RTX 4070 Ti Super will not be that big.
If budget is a concern, you will be better off saving money on the 7950x and putting the saved money towards a better Graphics card.
@@m8x425 I was just thinking about buying a 4070 ti super, so instead of buying the R9 9950X I buy the R9 7950X and a 4080 super? 4090 is unfeasible for me at the moment, it is very expensive in my country.
Is this with the new Intel microcode? Also, is the Ryzen running under an administrator account? Also, Windows 11 gimps 9000 series performance; a fix is underway. Waiting on the 9000X3D CPUs, TBA at CES 25.
The i9-14900K is better; idk why people are agreeing with this.
Considering the i9-14900K can support up to 8000MHz RAM, it's still better than the 9950X at 6000MHz.
Even then, the 14900K had mercy on the 9950X and only used 6000MHz DDR5 RAM.
I wonder how big the gap is when the 14900K uses 7200MHz DDR5 RAM.
More like when Intel uses dual-rank 7200 2x32GB or single-rank 8400 2x24GB DDR5. Intel would absolutely smash this chip when not held back by that slow 6000 DDR5.
@@TalonsTech Take a 14900KS and overclock it. Use TVB to clock all cores to 6GHz+ until it reaches 65-70°C. Then make sure to tune voltages below 1.4V vcore (you don't want degradation), get DDR5 8200-8400 XMP, and tune the timings for lower latency. You now have the fastest gaming CPU on the planet.
AMD folded under no pressure ☠️
It will be cool to watch a comparison again when AMD fixes the Admin account bug as well as with the 100w versions of these chips that are rumored to be coming out soon.
I don't understand how the hell you are maintaining a 14900K under 70°C.
Even with the newest BIOS update and 360mm water cooling, my i9 is going through the roof at 85°C. I have to undervolt it to keep it stable.
Have you installed a contact frame? Are you experienced at installing coolers on CPUs? What thermal paste do you use?
You need a really good AIO or custom cooling with multiple rads. I'm using an Arctic Liquid Freezer III 420mm in push/pull (6x140mm fans) with Arctic MX-6. The other option is direct-die cooling, or delidding and using liquid metal.
With my setup I can sustain 310-320W power draw in a 21°C ambient room without thermal throttling. In games it runs much cooler.
What the hell, AMD? I know these aren't X3D... but... what happened?
A productivity and efficiency focus, that's what happened.
@@chovekb It's the same as the 7950X in productivity, or slightly better; not a good leap, and expensive compared to the 7000 series. And as for efficiency, just buy a non-X 7000 series CPU: it's the same as 9000 while losing 5%-10%. Watch Hardware Unboxed, not Linus Tech Tips.
@@Raghav-gd3fn I've watched all that; you watch it again! They are testing with the classic CPU testing tools. Watch the new AI tests, and even with the classic ones, 10% plus some efficiency is still fine. Btw, there's a Windows admin/user account bug that restricts these CPUs!
Zen 5 is geared towards improvements to Epyc
@@m8x425 Fair enough... but it was also marketed towards gamers, which makes it a bit of a shame that it's lackluster.
Just a question: why is the 14900K's CPU speed locked (in all tests) while the AMD speed changes all the time?
Thx
That is how Intel works. They have a higher 253W or 320W limit depending on which profile you use. So as long as you're under the power/current/temp limits, the CPU will boost to its all-core max and stay there 24/7. For gaming, it will usually stay under those limits and sit at max clocks. For heavy all-core use like rendering, it will throttle back due to power/current/temps. AMD is not the same; it boosts and throttles earlier. Just different approaches.
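A minimal sketch of the boost philosophy described in this reply (my own simplification — the function, the linear scaling rule, and the numbers are assumptions, not Intel's or AMD's actual algorithms):

```python
# Toy model: hold the all-core max clock until the power limit (PL) is
# exceeded, then scale clocks back to fit under the limit.

def intel_style_clock_mhz(load_watts: float, pl_watts: float = 253.0,
                          max_mhz: float = 5700.0) -> float:
    """Boost to the all-core max and hold it unless the power limit is hit."""
    if load_watts <= pl_watts:
        return max_mhz                           # gaming: under PL, flat max clock
    return max_mhz * (pl_watts / load_watts)     # all-core render: throttle back


print(intel_style_clock_mhz(180.0))  # typical gaming load -> 5700.0 ("locked")
print(intel_style_clock_mhz(320.0))  # heavy all-core load -> clocks drop
```

Under a typical gaming load the modeled chip never touches the limit, so the reported clock looks "locked"; only an all-core load pushes it over the limit and forces the clock down.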
The 9950X is not the best for gaming, but sure, it can game just fine too. I'd gladly switch my 14900K and 14900KF to 9950X systems. If you don't want to spend endless hours tuning your 14900K with power and voltage settings and stability tests, AMD is a no-brainer. But if you're a gamer, best to go for X3D.
I've never understood why these comparisons even exist in the first place. The Ryzen 9 / i9 aren't meant for gaming. Not to mention that when someone has one of these CPUs, they likely have a high-res monitor anyway.
Productivity is where they should be compared.
Gaming comparisons at this tier are useless imo.
To test the power of the processors, 1080p is used, a resolution that doesn't load the graphics card, so that the results are determined by processor power and not by the graphics card. If 4K were tested, it would be testing the graphics card, not the processor. Streamers and YouTube content producers who shoot gaming videos want gaming performance, and those who want to play comfortably at 4K prefer these processors. Those who will use it for gaming look at gaming tests, while those who will use it for work look at synthetic tests.
@KuRtOğLu_MuStAfA68 What I'm saying is that it's dumb to buy these for gaming only, and if you're also doing productivity you likely don't sit there with a 1080p monitor anyway.
i9s are good at both gaming and productivity. X3Ds are only good for games.
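The 1080p-testing rationale from this thread can be captured in a one-line model (illustrative only — the per-resolution FPS caps and the CPU numbers are made-up assumptions):

```python
# Toy model: delivered framerate is roughly capped by whichever
# component is slower, so high resolutions (where the GPU cap is low)
# hide differences between CPUs.

# Assumed GPU framerate caps per resolution (illustrative numbers only)
GPU_FPS_CAP = {"1080p": 240.0, "1440p": 160.0, "4K": 90.0}


def delivered_fps(cpu_fps: float, resolution: str) -> float:
    """Framerate is limited by the slower of CPU and GPU."""
    return min(cpu_fps, GPU_FPS_CAP[resolution])


# Two hypothetical CPUs about 15% apart:
for res in ("1080p", "4K"):
    print(res, delivered_fps(200.0, res), delivered_fps(170.0, res))
# at 1080p the gap shows (200 vs 170); at 4K both read 90 (GPU-bound)
```

This is why a CPU review at 1080p with an RTX 4090 maximizes the CPU-bound gap, while at 4K two CPUs 15% apart can deliver identical framerates.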
Big oof, Zen 5 delivering fewer FPS/watt than Raptor Lake... That says a lot.
Wish we had 7800X3D numbers here as a direct comparison.
Intel comeback???
Yes, I hope so, with 2nm. I am a big fan of Intel.
Intel doesn't need a comeback when AMD's latest parts are not able to beat a refresh of a 2-year-old part.
@@auritro3903 True. Even though I prefer Intel over AMD (I have an AMD GPU though), they still beat AMD with a refresh of a two-year-old CPU, when Zen 5 was supposedly clapping Intel's old refreshed CPU. AMD still gets nowhere near them without the X3D, which Intel doesn't even have. Disappointing from AMD.
@@auritro3903 They do when their latest chips are failing.
@@indrauchiha1133 True, the failing CPUs are a problem. But we still have the 9000X3D coming up, which will be insane for gaming, and we have the 7000 series X3D chips as well.
How do you achieve that high FPS in Counter-Strike 2 with a 4090 and 14900K? I also have a 4090 and 14900K but only get about half the frames, around 300. I've tried QHD, FHD, and 1280x960.
What is your RAM? Is XMP enabled in your BIOS? High FPS is very dependent on RAM speed.
@@OmnianMIU 32GB DDR5 6400MHz, and XMP is enabled.
@8ccv70 I think your PC has some issue, because any 14900K gets the same FPS as shown in this video. Did you tweak something in the BIOS regarding CPU settings? What is your motherboard? Is the RAM 6400 CL32?
I'm in the process of upgrading from an Intel 5820K which is starting to fail on me. I've locked in and decided it comes down to either the 14900KF, Ryzen 9950X, 7950X3D, or 9900X. I'm in research mode right now, looking at comparisons and trying to decide.
Go with the Ryzen 7800X3D if all you do is play games. It's a last-gen CPU but is actually faster in games than the two being benchmarked, and it leaves you an upgrade path to the next X3D version of the Ryzen series.
@@davidcole2337 Thank you for the recommendation and reply,
It wouldn't just be for gaming but also UE5 game development, rigging and 3D animation using Maya, rendering done in UE5, digital audio workstation work using Ableton Live, occasional VR gaming and development, and video editing / live streaming. Would the Ryzen 7800X3D be okay for this sort of workload as well?
Again thank you for the recommendation
If gaming isn't the priority, go for more cores (in reality, just check some more benchmarks). I personally would go for a 9700X/9900X 1440p ITX build for mixed workloads.
@@eivisch Thanks for the help. From what I've seen, I think I'd need something with a lot of cores for my use case.
If I go the Ryzen route, it's between the 7800X3D, 7950X, or 9900X.
Leaning more towards the 9900X atm, or the Intel 14900KF if I go that route. I'll look into more comparisons.
Thank you for the reply
I think if you have time, wait this year, as Intel will likely announce its 15th gen Arrow Lake in October or November. Rumours say it will draw much less power than the previous 14th and 13th gens.
Which RTX 4090 version is being used?
Wdym 4090 version? There is only one 4090 (apart from the China-exclusive 4090D). If you are talking about the model, it's listed in the top comment and the description.
Good afternoon.
500k subscribers, and you mislead every one of them with your videos.
I can see right away that you are not a specialist in this field.
To measure a processor's performance in games or synthetics, you need to disable or remove the graphics card; if the processor has no graphics core, that is already a minus for the processor.
Next, in the game settings you need to set everything to minimum with the lowest screen resolution, so that the graphics card is loaded as little as possible and all the load goes to the processor.
Also, you need to set the minimum allowed voltage for the graphics card so that it doesn't go into turbo mode, just working mode, so that it simply sits in the background.
The games were tested at 1080p at the highest settings, and no one uses these processors without a graphics card. You only wrote that comment because AMD performed badly.
Perfection: RTX 4090 and i9-14900K with a 1080p 540Hz monitor for my PUBG.
Let's check back in 8 weeks and see if things have changed; I both hope and suspect they will.
Although Zen 5 is disappointing for gaming so far, the 9950X is not a gaming CPU, and you'd have to park some cores in order to play games. So this comparison is useless, especially when 13th-14th gen Intel are broken.
They are not broken; the microcode fixed the issues for 14th gen. For 13th gen, if the user has been overclocking it since release then it's a bust, but they can RMA it for a new processor.
It's funny how all you AMD fanboys keep saying the 9950X is not a gaming CPU when people have been buying it for gaming. The 14900K is over a year old and still beats the new CPU in gaming, and in power consumption in most of the games shown... still can't find the efficiency AMD was on about 😂
@@AccuracyGaming The 14900K is not a year old yet, so I don't know how you get "over a year old".
The 9950X unexpectedly underperforms the 14900K.
This doesn't match what I am seeing. I have 5 live gaming/benchmark PCs at present. I've only had the 9950X a week and played around with it, but it is handily beating my 14900K in benchmarks. I wonder if these tests were done before the Windows bug affecting AMD results was fixed; I saw an increase of about 12% on the 7950X right away. Meanwhile I have been having increasing issues with my 14900K staying stable with any overclocks, even though all of my PCs use very elaborate water cooling. I am also testing a beta BIOS on the 9950X which increases power draw by default, which hurts the efficiency, but as I am mainly running benchmarks this is not an issue for me. The 14900K will sometimes fail to boot even when I default the BIOS with no OC at all; hoping I haven't killed it. I haven't really tested the 9950X much in games, as I tested the 7800X3D a lot, and I guess the new X3D will probably own those comparisons, but that's just not my focus. Generally I don't build for gaming only, and the whole Game Bar thing and the other hoops to jump through get on my nerves; I guess I will get one if they iron that out. But in defence of AMD on this occasion, I like this chip a lot. There seems to be a lot of performance left on the table. I just need to get my hands on the super-fast EXPO memory now; I'm stuck with non-EXPO 6400 while waiting for new RAM to be available.
How is your 14900K running that cool?? I mean, usually they're always in the 75-90°C range under game loads.
My 13700K (I know, not the same, but still) runs at 5327MHz while gaming and never passes 60°C that way... the only exception was when compiling shaders for TLOU1, when it went to 82°C at 100% load, but that took 2 minutes max.
I think it's underclocked (not sure btw).
Probably the latest microcode update.
@@rchamy94 Definitely not that, cause I have the latest microcode and mine still runs blazing hot with a Kraken Elite 360 AIO.
@@rchamy94 You can see that his 14900K is pretty much cold in other videos too.
1080p tests for these chips are kind of criminal.
I don't know why my 14900K is seeing 70°C with an RTX 4080 while playing WoW at 1080p. Is this happening because of the resolution? In my case I have 10 fans in total.
Yeah, cool; tune that RAM to 7800 MT/s on the 14900K and then we're talking performance.
Soo AMD just pulled an Intel.
People are saying the Intel CPUs will die over time like we ain't got more money to spend.
Okay, but performance in Adobe programs like AE or PR, or maybe Stable Diffusion and OBS, etc., is also important. So it maybe gets lower FPS (I'm not sure) but better workflow performance, I think.
These benchmarks at resolutions as low as 1080p do not reflect real-world usage, as most users play at higher resolutions like 1440p and 2160p. At those resolutions, the difference between various high-end CPUs is minimal or almost nonexistent, because the GPU is the most important factor.
What you're describing is a graphics card test, but this is a torture test. All over the world, processors are tested at the lowest resolution and graphics cards are tested at the highest resolution settings. If you lack the technical knowledge, please stay silent instead of making wrong comments.
13600K / 13700K and 7600X / 7700X / 7800X3D are still the best CPUs for gaming.
And the 12400F for $100.
With the current performance bug on Ryzen 9000, these comparisons don't make any sense, do they?
Whatever lets people cope. It is funny how all those excuses pop up, peddled by tech tubers. It is a bad product based on how it was marketed.
With Windows 24H2 you could add a 10% FPS increase to all Ryzen results.
Yes
The Ryzen 9 9950X3D is going to be better.
For ultimate gaming performance, get a 7800X3D. For cheap gaming performance, get a 5800X3D or 5700X3D. People forget that PC users are not just gamers, so non-gamers should get the non-X3D CPUs, be it the 9000 series (if upgrading from an old platform) or the 7000 series. AMD has it all covered, without the lingering fear of instability.
Well, this is surprising in a rather unexpected way.
Bruh, make the overlay counters a bright color and not purple; it's impossible to see at lower resolutions.
Win 11 is broken; 26% gains showing in Win 10 and Linux.
AMD is very slow on Windows 11. Microsoft is doing Intel a favor by purposely slowing down AMD processors.
Price drop when ?
What happened? Why no 9950X3D?
I don't really have much hope for the 9800X3D; better to get a 7800X3D right now, skip Zen 5, and wait for the next generation!
Pointless; we need faster CPUs for a 4090, especially for high-refresh-rate 1440p.
Even the current 7800X3D can struggle at times.
Are you using the hidden admin account for testing Ryzen?
Am I seeing this right? They have the same power draw and Intel has better clocks and temperature with higher FPS. Wtf, isn't this supposed to be a new CPU?
"Advanced" Micro Devices
I think the 9950X3D, or the X3D models in general, will be a better investment.
Zen 5 just reminds me of Rocket Lake... maybe that means Zen 6 will actually be great like Alder Lake was? We can hope 😂
Maybe, but I'm gonna be more cautious about Zen 6 now...
Is this on win 10 or 11 people say win 11 is broken with 9000 series ryzen
Ryzen 9000 is worse than intel in performance per watt and performance in general 😂. If you have an intel 13th or 14th gen cpu, lock your cores and you won’t have an issue.
Theres something strange with the witcher benchmark, the GPU is using more W and load% AT LOWER FPS on AMD side...
It'd be nice if you showed proof of the builds and specs first. (Edited for context.) From what I've seen, the 9950X performs much worse than it's depicted in this video.
😂😂😂😂
Uh oh now we have conspiracy theorists with AMD's 9000 sucking so hard.
@TalonsTech No, I'm an Intel fan. I have a 14900KS in my main rig and a 12600K in my other build. I've just seen so many reviewers show the 9950X not getting anywhere near the 14900K in FPS comparisons, and this guy's tests show they're like 10 FPS apart in every game. Kinda sus
@@AustnTok He pairs the Intel rig with slow 6000 DDR5. This is a huge advantage for the AMD chip, since 6000 basically maxes it out memory-OC wise.
Intel can run far higher DDR5 speeds to get max performance, and most reviews don't use 6000 for Intel.
@TalonsTech Yeah, I get it. I'm running 7200 MHz with my 14900KS at the moment; I had 7600 MHz, but one of the sticks died. Reviewers do it because they think it "levels the playing field" or something, and 6000 on AMD is basically the equivalent of 7200 on Intel. That said, going from 6000 to 7200 or even 8000 on Intel is unlikely to get you more than 5-10 extra FPS in most games. I've tested it myself; I have 5 different DDR5 kits.
Zen 5 non-X3D is a disgrace to AMD
I think AMD achieved something so exceptional with the 9800X3D that they simply decided to go down a different path: leaving the CPUs without V-Cache more efficient and a bit better for work, and creating a gamer X3D line. That way they have two types of products for two audiences.
Intel 15th gen 🥶 2nm 🥶🥶🥶🥶🥶🥶
10nm 14th-gen Intel is as efficient as 4nm from AMD. Imagine 2nm Arrow Lake, which will come out in October
@@butonghit Bro, is it real that Intel is making 2nm?
@@butonghit Hope Intel will still be king, because I'm an Intel fan
@@syedawishah Yeah, it's confirmed the Intel Arrow Lake i5 series will use 2nm. The i7/i9 Arrow Lake series will use 3nm.
I want to get the i5 series
Intel 17th-gen Nova Lake will be using 1.4nm, which comes out in 2026
@@butonghit Sounds good, bro, thanks
The 9950X isn't a gaming CPU; it's productivity- and efficiency-focused. The 7800X3D bodies both in gaming anyway.
Lol... where is that efficiency? And it was marketed with gaming benchmarks. All those manufactured excuses are just silly. The 14900K isn't a gaming CPU either... 😂
The 7800sh1t3d is a solution for 10% better FPS today. Wait for games to utilize more threads and you will see how much worse that 7800sh1t3d gets. The other argument, that the 9950X "is not a gaming CPU", is simply poor and useless.
Classic AMDefending. By that definition you guys shouldn't be comparing a 14900K to a 7800X3D, since it's 'meant for productivity' ☠
It’s hybrid for me while Threadripper is a pure workstation cpu
Look at the AMDips in Ghost of Tsushima: 20 FPS on the 0.1% lows.
With the new Windows update, even the 7700X is faster than the 14900F. Huge performance uplift
Very bad. Even the power consumption is about the same
Better transistor size
Better in promotion
Same power consumption
=> WORSE PERFORMANCE
What the hell is this? Not even Zen 5, more like Zen 0.5
Intel cheats along with Microsoft, that's why
The 14900K is both faster and less expensive.
This isn't a gaming CPU; it's for multithreaded workloads. Can't expect it to do well here
You'd better wait a bit if you want to buy a 9000-series CPU. AMD will drop their prices soon.
Wait for Ryzen 9000 X3D; the non-X3D Ryzen 9000 chips, unless we're talking EPYC Turin or Threadripper Shimada Peak, are uninteresting.
Comparing it with the 14900K is a mistake, because the 14900K literally breaks in a few months
Most people here don't even realize that neither of these CPUs, nor the GPU for that matter, was made for gaming in the first place; they were made for productivity work! Go ask their creators if you want, they'll tell you the same LOL
Wrong. RTX 6000 Ada Generation is for productivity work. 4090 is a gaming GPU.
@@laszlozsurka8991 It's for rendering work, like all the 90-class models! Gaming on a 450W GPU is just DUMB. Ada is for AI work, don't argue with me! Your Chrome picture says it all LOL
@@chovekb Who lied to you?
@@chovekb Then why did NVIDIA market it as a gaming GPU? Don't argue with me when you don't know shit.
NVIDIA marketed the 4090 as an enthusiast gaming GPU. Get your facts straight before you spout BS.
Wrong. AMD advertised these chips for productivity AND gaming "leadership". Try again.
Truth be told, I already expected disappointment from this series, so it's not really surprising to me.
WTF, 0 wins for AMD, 9 for Intel, and 1 draw. If it weren't for the stability issues, no way anyone would get the 9950X
Not worth $650. What's the opinion of my fellow PC enthusiasts?
It consumes the same amount of power as the Intel 14900K, which is on 10nm, runs at the same temperatures, and still lags behind Intel, while AMD's chip costs $650 to Intel's $545.
What a humiliation for AMD, losing to Intel in every scenario and condition. It's unbelievable; there were no improvements in this new generation of AMD processors. Quite the opposite, they got much worse and more expensive
If it weren't for 3D V-Cache, AMD would be left in the dust.
It looks like this is a Windows 11 issue. My friend is getting 190-200 FPS in RDR 2 with the 9950X on Windows 10, while the testing here is getting 176
Bro has good CPUs but even then he is playing in 1080p 😆
Latency is the issue here. This needs to be re-run now that it's been patched.
We didn't die just because we stumbled once. Don't worry, the king will return. Forever Intel and NVIDIA