Because it is the worst CPU Intel produced. The 10900K was cheaper, had 2 more cores (10 vs 8), a larger L3 cache (20MB vs 16MB), lower power draw, better OC potential... etc.
@@cks2020693 Child, please. First of all, the 10900K runs Gen 4 NVMe drives only at PCIe 3.0 speeds, which already makes it the worse option compared to the 11900K. Secondly, the 11900K is superior to the 10900K with its PCIe 4.0 capability, and having 2 extra cores on the older 10900K doesn't really mean anything.
I guess I should say thanks for showing how insignificant the CPU is in gaming performance. You could spend a boatload upgrading your processor for a 1-5% fps boost, or get 20-25% by spending less on faster RAM.
I don't think you know how this works. You can't just upgrade the RAM on a 10900K to the RAM of a 13900K; the faster RAM is not compatible with the older chipset.
@@CB-bi1be No, no you cannot. Still exactly my point, though. The difference in RAM speeds, going from DDR4 to DDR5, accounts for 95-99% of the performance gains. The DDR5 6000MHz RAM offers roughly 40% more frames while being clocked roughly 40% faster than the DDR4 3600MHz. The processor isn't offering any improvement itself other than support for a faster DDR revision being added between the 11th and 12th generation. The DDR generational improvements are worth upgrading for. The CPU generational improvements are not, at least not if your agenda is gaming.
Then would it be possible to somehow re-engineer the older CPU/motherboard to accept the newer RAM? Could it be as simple as re-soldering the new RAM interface onto the motherboard? @@danmckinney3589
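For what it's worth, the scaling argument in this thread is easy to put numbers on. A back-of-the-envelope sketch using the kits from the video's spec list (DDR4 3800 vs DDR5 6000); the 60% realization factor is purely an assumption for illustration, not a measured figure:

```python
# Back-of-the-envelope memory scaling: raw transfer-rate gain vs a
# (hypothetical) fraction of it showing up as FPS in memory-bound scenes.
ddr4_mts = 3800   # DDR4 kit from the video's spec list
ddr5_mts = 6000   # DDR5 kit from the video's spec list

rate_gain = ddr5_mts / ddr4_mts - 1      # ~58% more raw transfer rate
fps_gain = 0.6 * rate_gain               # assume ~60% is realized in games

print(f"raw rate gain: {rate_gain:.0%}, rough FPS gain: {fps_gain:.0%}")
```

FPS rarely scales 1:1 with transfer rate, since latency and cache hit rates also matter, which is why the realized gain is modeled as a fraction here.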
The 13900K is only 5% faster but needs 20% more energy. What a bad CPU generation. The 12900K is the best of these 4. Intel shows how bad their generations are.
There isn't any game here where the 13900K is only 5% faster while using 20% more power. Power efficiency regresses slightly in the first three games in the video, but in all the subsequent games the 13900K gives more performance for a smaller power increase.
I don't get it: in Hogwarts, usage on the 10900K is 20-25% and we get 47 FPS, while 13900K usage is 14% and we get 72 FPS. My point is, where is the rest of the CPU power? Clearly the 10900K and the others aren't being used to their full potential. Why am I forced to go to a 13900K for much better performance when my CPU isn't fully used? It's like buying a sandwich but only being allowed a small bite. I get that it's optimization, but it sucks hard. And not only Hogwarts, I'm not attacking only this game, but all games.
Bro, the 13900K has 12 more threads than the 10900K. That results in lower usage percentages. The thing is, programming a game to use many cores is not as easy as you think. That's why 6c/12t is still enough for gaming.
They all have different core counts, but not all cores are used for gaming; games will use at most 8 cores at the moment. What matters for gaming is single-threaded performance and cache, not core/thread count or usage.
10900K is 10c/20t
11900K is 8c/16t
12900K is 8P+8E/24t
13900K is 8P+16E/32t
The 12900K and 13900K have E cores as well as P cores. The E cores shouldn't really be used for gaming anyway, at least in conjunction with the P cores (Dawid Does Tech Stuff actually found that the E cores alone are very hit-and-miss for gaming: only slightly better than a 4th Gen i5, and in BFV single player basically unplayable; but if you are spending 12900K or 13900K money, you really shouldn't game on the E cores anyway). They only really exist for accelerating multi-core workloads.
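The usage-percentage point in this thread is just arithmetic over thread counts. A quick illustrative sketch (the 8-thread game load is an assumption for illustration, not a figure from the video):

```python
# Why the same 8-thread game load shows different overall "CPU usage"
# on CPUs with different thread counts (illustrative numbers only).
cpus = {
    "10900K (10c/20t)": 20,
    "11900K (8c/16t)": 16,
    "12900K (8P+8E/24t)": 24,
    "13900K (8P+16E/32t)": 32,
}
game_threads = 8  # assume the game fully loads 8 hardware threads

for name, total_threads in cpus.items():
    usage = 100 * game_threads / total_threads
    print(f"{name}: overall usage ~{usage:.0f}%")
```

So a 13900K sitting at ~25% overall can still be completely limited on the few threads the game actually uses.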
Am I the only one who noticed the difference between the two setups? The 10900K/11900K are using DDR4 3600 whereas the 12900K/13900K are using DDR5 6000, which would explain the massive difference coming from the generations before 12th Gen. If they were using roughly the same memory speed, I'm not sure the jump from previous gens would be as big. That being said, the 12th, 13th, and even the 10th Gen are better than the underwhelming 11th Gen i9. Sorry Intel, not sure what you were thinking on that one. You're supposed to leap ahead when it comes to technology, not backwards. ;)
Strange... I replicated the settings from the video in Witcher 3 (Steam version with all updates), and my 9900KF delivers 70 to 85 FPS depending on the location. Granted, my memory is 3800 with tuned timings, 4x8GB. It felt like I saw more in Cyberpunk too, but I'm too lazy to check; too many settings to change.
@@Masterprocessor I also doubt the objectivity of this test. To be fair, though, the GPU doesn't matter in this case: DLSS 3 is off and the resolution is set to 1080p so that the GPU isn't loaded to 100%.
That's exactly the point: "depending on the location." In the video the guy deliberately runs around Novigrad, since it's the most CPU-intensive zone and the difference between CPUs is more visible there. Outside Novigrad and the Oxenfurt market, all the Intel CPUs in this test would show 60+ FPS.
@@games-pc I'll repeat: this video stresses the CPUs themselves, their architecture, cache, clocks, and instructions, all combined. The fact that the CPUs aren't running at 99-100% says nothing by itself.
It's not a bottleneck; the 4090 doesn't need to be utilized much at 1080p or lower resolutions. It's a GPU catered towards 4K gaming. I also have the same combination of 13900K and 4090, and it runs at more than half utilization under a 4K load.
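That behaviour falls out of simple frame-time math. A sketch with made-up numbers (the 70 and 250 FPS limits are hypothetical, not from the video):

```python
# Under a CPU limit, GPU "usage" roughly tracks the ratio of GPU frame
# time to CPU frame time (illustrative figures only).
cpu_fps_limit = 70    # frames/s the CPU can prepare at these settings
gpu_fps_limit = 250   # frames/s a 4090 could render alone at 1080p

gpu_usage = 100 * cpu_fps_limit / gpu_fps_limit
print(f"expected GPU usage ~{gpu_usage:.0f}%")  # the GPU idles the rest
```

Raise the resolution to 4K and the GPU frame time grows while the CPU frame time barely moves, so reported GPU usage climbs back up.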
I don't understand this at all. The CPU usage of 10th and 11th gen is low; it's not like they're maxed out. I know these chips are capable but are being held back by Intel's shady SoC software, so that consumers will buy the newer chips and discard the older ones.
Well, AMD fanboys are right on that one: E-cores are useless in gaming. But Intel's P cores spank AMD's P cores. The performance jump each generation comes from the superior P cores, not the additional E-cores. Look at the jump from 12900K to 13900K: both have a similar P-core arch, but the 13900K's P cores are clocked 600MHz higher, have 6MB extra L3 cache, and maybe 3% better IPC; the 600MHz clock bump is almost all of the difference. And though P-core clock speeds from 10th to 12th Gen are almost the same, 12th Gen has a massive IPC uplift over 11th Gen and even more over 10th Gen. 10th Gen is still a little better than 11th Gen, because 11th Gen was backported to 14nm, which made latency awful and thus in many cases worse for gaming despite better IPC. 12th Gen is a 20% IPC uplift over 11th Gen; on the 10nm process it should have had, 11th Gen would have been good.
@@Jerry-jo6dn Well, the 7800X3D is a different animal in gaming. It trades blows with a well-tuned 13900K (E-cores off) in gaming, though the 13900K wins in everything else even with E-cores off. The extra cache really seems to help games. But yes, the 13900K uses a lot more power than the 7800X3D for similar gaming performance.
The E-Cores are not the reason why it's faster. If he turned the E-Cores off then the results would be the same. The 12900k and 13900k overclock better with the E-Cores off, so it would be even faster.
Games :
The Last of Us Part I - 0:21
Spider-Man - 1:36 - gvo.deals/TestingGamesSpiderManPC
CYBERPUNK 2077 - 2:42 - gvo.deals/TestingGamesCP2077
Hogwarts Legacy - 3:50 - gvo.deals/TG3HogwartsLegacy
Forza Horizon 5 - 5:15 - gvo.deals/TestingGamesForza5
Hitman 3 - 6:35 - gvo.deals/TestingGamesHitman3
Microsoft Flight Simulator - 7:33 - gvo.deals/TestingGamesMFS20
The Witcher 3 - 9:02 - gvo.deals/TestingGamesWitcher
System:
Windows 11
Intel i9 10900K - bit.ly/2YcPUkI
MSI MPG Z490 GAMING PLUS - bit.ly/3hsf9Hs
Core i9 11900K - bit.ly/3foHCiQ
ASUS ROG Z590 Maximus XIII Hero - bit.ly/3m1kZSE
Core i9-12900K - bit.ly/3H4waEv
Core i9-13900K - bit.ly/3SgY3xf
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
CPU Cooler - be quiet! Dark Rock Pro 4 - bit.ly/35G5atV
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
32GB RAM DDR4 3800MHz - bit.ly/35vyWko
32GB RAM DDR5 6000MHz CL38 - bit.ly/3XlBGdU
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
Where is the i9 9900K??
Are there any AMD comparisons?
3600MHz on 10th and 11th gen while 12th and 13th are using 6000MHz?
Were these run at stock settings? Seems like the 12900K could do at least 5.3.
What would give me higher performance: an ASRock 7900 XTX, or a 13900K with a new motherboard and memory alongside my 3080? Currently running a 3080, an 11900, an MSI MEG Z590 ACE, and Trident DDR4 memory. Which would be more budget friendly: a 3080 with a 13900K, or a 7900 XTX on the 11900?
The 12th gen was a two-generation leap over the 11th gen while being way more power efficient too.
32GB RAM DDR5 6000MHz CL38
@@FORWARDCATS hol up wtf?
@@jonathanng138 that's what gave the 12th and 13th a gap over the 10th and 11th
Intel screwed up the 11th Gen desktop processors with the 14nm backport. My 10nm i7-11800H with mediocre cooling and a lower clock speed could match a desktop i7-11700.
But it has like 3 times the latency and USB issues
dang that jump from 11th gen to 12th gen is MASSIVE!
Ddr5
@streamx3m In Spider-Man it's half the reason why it's so much faster. A DDR5 13900K is massively faster than a DDR4 13900K.
@@arduinoblokodtr3699 and meanwhile the 5800X3D with DDR4 still offers slightly better gaming performance than 12900k with DDR5 lol
@@arduinoblokodtr3699 Yeah, I'm a bit torn about it. On the one hand I'd prefer to see an apples-to-apples comparison of all four processors using the same DDR4 memory. On the other hand, a pure DDR4 comparison might mislead people, because at this point, realistically, very few people would buy a new LGA1700 rig without DDR5. (DDR5 is now finally cheap enough that DDR4 is no longer an attractive option if you're starting from scratch.) The youtuber probably chose the best option here.
Anyway, we know from release-era reviews that the 12900k was ~15% faster (in games) than the 11900k when both used DDR4. Clearly the memory is doing a lot of the heavy lifting in the video's benchmarks.
And it shows the jump from 10th to 11th was jank.
Buying the 11900k was the biggest mistake of my life
I even forgot that the 11th generation existed. Underwhelming cpu.
Not bigger than buying RTX 3090ti for 2k 🎉
@@EzBreezy750 OOF 💀
😂🤣😜
Why is that? If you play at 1440p or 4K it wouldn't matter.
RTX 4090 and 11900K below 50 fps @1080p. System running at below 50% all round. Hogwarts Legacy, what a game. Marvelous piece of coding.
I upgraded to a 3080 + 13600K, and exactly this game plus The Last of Us and Warzone 2 are the reason why I started locking games at 120 FPS at high details.
11900k just such a weak cpu...
ddr5 helps hogwart performance ...
@@xxovereyexx5019 Exorcism won't help a difficult game either.
@@notenjoying666 lol, that is the most retarded statement ever.
I would love to see 10700k vs 11700k vs 12700k vs 13700k
lol imagine having a 11700k lmao.
@@Zesmasimagine it coming with your pre built and you had no choice 😂
Geez, the difference between 12 gen and 11 gen is crazy
10900K is still an excellent CPU. Anybody who bought one definitely got their money's worth.
no
Agree!
I still like my 10900K. After my old CPU died about 2 years ago I needed a new motherboard + CPU. The options were either a Ryzen 7 3800XT/3900XT or the i9, and the i9 was a bit faster in gaming and I was familiar with its OC behavior and so on.
But the next platform could really be AMD. The new Ryzen 7000X3D processors are pretty nice.
@@nk130489 I can confirm they're great. I have a 7900X3D in my recently built rig. Definitely check them out if you're in the market for an upgrade to AM5.
@@ShishioYes
It's fine for anything below a 4090, at 1440p and above. This is an extreme-case scenario to show the generational leap of the CPUs.
Most people really don't know what's up with the 11900K vs the 10900K.
The biggest difference is that the 11900K supports PCIe 4.0, meaning you can make full use of all the PCIe lanes for both Gen 4 NVMe drives and GPUs like Nvidia's 40 series.
On the 10900K, Gen 4 NVMe drives and modern/faster GPUs will still run, but limited to PCIe 3.0 bandwidth. Will running under PCIe 3.0 affect your experience? I don't think so. But just know you are limiting the full power of your GPU.
So besides PCIe 4.0, this CPU is more energy efficient and, funnily, does basically the same amount of work as the 10900K with two fewer cores. In short, the 11900K is a revised 10900K with PCIe 4.0 capability, reduced power consumption, and the same throughput with fewer cores.
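The bandwidth gap mentioned above is easy to quantify; the per-lane figures below are the standard post-encoding rates for each generation:

```python
# Approximate usable throughput per PCIe lane after line encoding:
# Gen 3: 8 GT/s with 128b/130b -> ~0.985 GB/s per lane; Gen 4 doubles it.
per_lane_gbs = {"3.0": 0.985, "4.0": 1.969}

for gen, gbs in per_lane_gbs.items():
    print(f"PCIe {gen}: x16 GPU ~{gbs * 16:.1f} GB/s, "
          f"x4 NVMe ~{gbs * 4:.1f} GB/s")
```

So a Gen 4 drive on a 10900K platform still works, just capped at the roughly 3.9 GB/s Gen 3 x4 ceiling instead of roughly 7.9 GB/s.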
I have a 10900k and from this video I don't see a real reason to upgrade yet. All those results are more than playable. If anything, I can just wait a year or two then put a 14900k in for even better results.
At this time, I completely agree with you. I'm in the process of transferring my apps & data from my current ancient i7-3770K on a Z77 Sabertooth mobo w/2070 card to my new i9-12900K on a ROG Maximus II mobo w/3080 card. My current system has served me very well as a gamer & for work over the past 11 years, but it's time to move on! If I hadn't practically stolen my new system, I would have been content with an i9-10900K or i9-11900K. I'd enjoy what I had right now, put some coin aside, and wait for what's coming out in late 2024. More processing power & lower power requirements (cooler running) will be coming.
I'm on a 10600 and I'm going to be riding the line in a year or two
I would like to have 16 performance cores i9 14900 without those so called "efficiency cores" ;) That would be a monster... ;o)
16 big cores wouldn't fit on the socket, the E cores are 4 times smaller so you could trade the 16 E cores for 4 big cores at max.
The performance between the 10th and 13th is insane. Only 3 years difference. I am still with my i9 10900k and waiting for the next generation to finally upgrade and remove the bottleneck on the 4080.
Intel should adopt AMD's style and allow 3 generations of CPUs to be upgraded on the same board.
@@zacthegamer6145 Yes, it's really annoying that you need to change the motherboard after one generation if you want to upgrade.
@@vini-ix8yt you can get a 7800X3D right now and still have the option to upgrade just the CPU few years from now on the same motherboard
@@vini-ix8yt You should just buy a Ryzen 7800X3D; in games it will likely be on par with next-gen Intel.
@@vini-ix8yt meteor lake won’t be all too impressive, Zen 5 and Arrow Lake are your best bet at maximum performance, unless you want efficiency.
The 13th gen was able to extract more performance from the RTX card, so it got more FPS, right?
the reason why the 12th gen jump was so huge is because that's what 11th gen was supposed to be. 11th gen released in April and 12th gen released in November of 2021
Should've used DDR4 for all of them to make it less RAM-dependent
yep would've been less difference
nah
3733mhz or 4400mhz or 5333mhz ddr4 still a beast
@@McLeonVP ddr5 6000 with a low latency is much better
i9 users are most likely geting the best RAM they can, it makes no sense to get a 12900K or 13900K with DDR4 memory, if you have a tight budget just step down to an i5 and an i7
@@andrei007ps DDR4 4400mhz or 5333mhz why not?
The i9 13900K is an absolute beast! I can't believe the performance difference compared to the i9 10900K. i9s have come a long way.
its more power efficient with TRIPLE the cores and at higher clock speeds. 13th gen is really impressive
@@DemonSaine 20 threads vs 32 threads, most of those cores are baby cores
It didn't. Did you watch? 1080p, low settings, DLSS Balanced, with a 4090… this is simply dumb.
Will the same video arrive but in 4K?
No... no point in doing the same video at 4k. The results would be mostly the same except in a few games.
Massive difference between 10th and 13th gen
Why is no one testing the 13900K with 7000+ RAM? I only see 6000MHz. I mean, these days 7000+ is pretty cheap and the 13900K really benefits from it. Running mine at 7800MHz and it's fking awesome.
True. And most 6400MHz CL32 kits can easily be OC'd to 7200MHz CL34.
@@od1sseas663 True. And I mean, if you buy a 13900k you should be able to afford a board supporting at least 7000...
@@rueggly2367 Even the cheapest Z790 boards can handle 7200MHz. My Z790 PG Lightning easily does 7200MHz, no problem.
@@od1sseas663 Yeah, but lots of people run the 13900K on Z690 boards that only support low DDR5 speeds. Why ever do that...
@@rueggly2367 Z690 is the problem. It's trash for anything over 6400MHz.
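A quick way to sanity-check the kits discussed in this thread: first-word CAS latency in nanoseconds is 2000 × CL / data rate. (Note this is only the CAS component, not the full AIDA64 round-trip latency figures quoted elsewhere in the comments.)

```python
# First-word CAS latency (ns) from CAS latency (cycles) and data rate (MT/s).
def cas_ns(cl: int, mts: int) -> float:
    return 2000 * cl / mts

print(f"DDR5-6400 CL32: {cas_ns(32, 6400):.2f} ns")  # stock kit
print(f"DDR5-7200 CL34: {cas_ns(34, 7200):.2f} ns")  # the OC discussed above
print(f"DDR5-6000 CL38: {cas_ns(38, 6000):.2f} ns")  # the video's kit
```

So the 7200 CL34 overclock is slightly lower latency as well as noticeably higher bandwidth than the 6400 CL32 starting point.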
Are we sure both systems are running dual rank? It seems a bit inaccurate.
i9-11900K, 8 cores running at 5.3GHz (using OC TVB; LLC, temps, and voltage were tested and optimized manually) with DDR4 3733MHz at 42.4ns latency. My PC is still a performance beast in 2023.
Running 10900k with 4600 c16 and 34ns latency. Still a beast 😁
@@tobiaspabst5524 Maximus Apex?
@@elotkarloketh166 ofcourse
@@tobiaspabst5524 what're your CPU benchmark scores?
@@elotkarloketh166 what application?
How do you group all the p and e cores frequencies together like that in afterburner?
The 13900K is overclocked, so the difference is a bit exaggerated. I run my 10850K at 5.1GHz and my 12900K at 5.1GHz, with two cores at 5.2GHz.
How is that 10850k holding up? Mine has severely degraded to the point where it’s constantly crashing on default settings. Only works with at least 1.45V
@@s.d.2833 I've always kept the voltage under 1.39. It's running fine with 2 cores boosted to 5.2GHz and the other cores at 4.8GHz. It's also cooled by a Corsair H150i Elite 360mm AIO.
We're literally not gonna have a proper CPU to pair with the 4090 for like 4 years lmao
i3 16100f with 4 p cores, 4 e cores running at 7ghz all core boost within 50 watts usage maximum and 32mb cache.
I'd love to see a comparison with the same memory speed too; then it would be clear where such gains come from.
These tests look made up. Do you seriously believe systems like these get 50 FPS in The Witcher at 8:50? 😅
Why are the 10th & 11th gen using 3600MHz RAM while the others are using 6000MHz????
ddr4 vs ddr5
10th and 11th Gen support only DDR4, 12th and 13th support DDR5.
My 10900K is still going strong today. Running it with 4000MHz DDR4, it's still rolling along like it's nothing, paired with a 4070 Ti at 3440x1440.
what water cooling do you use ?
Can you please make a test between different GPU producers of the same GPU? What's the difference between RTX 4090 from Asus, MSI, Zotac, PNY and so on?
Performance will be within margin of error; the only differences are noise and longevity, but longevity is hard to test.
Great video. What software do you use to monitor system parameters?
In this case, with the 4090, the 10900K and the other processors are clearly creating a bottleneck.
But I wonder whether, with other GPUs such as the 3060, 3070, or 3080, there would be such a massive bottleneck. My theory is that there wouldn't be. If that's the case and you don't have the latest GPUs, you wouldn't see such a massive upgrade from the CPU.
Because he's benchmarking the CPUs, not GPUs.
You won't see a bottleneck like this with slower GPUs..... unless you turn the quality settings down.
Even 1440p would help.
yeah, but who plays at 1080p?
This video is about CPU not GPU
The guy on the video want to see any bottlenecking problem
@@ikram2512 yes i know. that's why testing if a cpu is good for gaming is weird.
Hello. Why isn't the GPU loaded?
Read up on CPU bottlenecks and you'll understand.
@@Сыр-с8ь Hello. I think they did this on purpose to promote the newer CPU. I have an 11900KF and a 4080 Vulcan; my card loads to 99%.
@@СергейГабриель Yeah, this looks like nonsense: 50 FPS in The Witcher at 5:50 😅 The author apparently doesn't understand how a computer works at all, so he publishes rubbish.
@@Сыр-с8ь I read all that long ago. The author is misleading people in order to push a pointless CPU.
@@games-pc Hello. Naturally. Notice that he didn't reply. This is just pushing supposedly cooler CPUs; they deliberately don't let the GPU work. I'm not interested in the little numbers, I need real performance. In the end I found honest reviews. With a 4080 specifically, like mine, the 11900KF trails the 14900KF by at most 10%, and far from in all games; usually 5-7%. And the platform price is catastrophic. Even considering that my board is the MSI Carbon, I won't sell my combo for more than 40 thousand, while a 14900K + Z790 Carbon costs about 100 thousand. Plus the RAM: I also won't get much for my Corsair 2x16GB DDR4, and decent DDR5 costs twice as much; I'm not going to install 12-15k junk. About 95% of these bloggers on YouTube are paid shills.
can u do it on 1440p and 4k too?
Then the bottleneck would shift and it would be more because of the 4090
You should go find a GPU benchmark video then. That would completely defeat the purpose of what the YouTuber is trying to show.
What a huge difference between the CPUs. I never realised 12th gen was that much faster than 11th gen. And tomorrow my AMD Ryzen 7 7800X3D is coming :) The FPS king at the moment... and 5% faster on average than the 13900K. Now I just need to obtain an RTX 4090... pff, so expensive.
now the 14900k is king at damn near the same price
Is the i9 9900K a good processor for 4K, and what percentage difference would you see compared to the i9 13900K?
Not a big difference because 4k is mostly GPU limited. But you are much better off with a 13th gen i7 or i5 than a 9th gen i9...
Can you please do; 9700k vs 10700k vs 11700k vs 12700k vs 13700k ?
9700, 10700, 11700 are same performance.
@@KelvinKMS 🤦🏼♂️
@@KelvinKMS 🤦
@@KelvinKMS 🤦
hi, what do you use to display the fps please?
@@ak20ayush thank you ;)
How is 1080p that low on Witcher 3? That looks like 4K performance. Are you using ray tracing?
The main issue is that ray tracing in this case limits the FPS due to a CPU bottleneck; that was a pretty CPU-demanding area.
I ordered an i7 11700K and the seller actually packed an i9 11900K. Now I don't know whether I should feel blessed or mad. After this benchmark, I'm even more confused.
Maybe they felt sorry for you and gave you an upgrade? 💀
toss that CPU in the trash
The big dogs in the room are Cyberpunk 2077 and MSFS 2020 at 4K. Would love to see a video comparing CPUs and those games at 4K.
In 4K the CPU is insignificant.
There you could use an 8-year-old CPU.
You have to cap the GPU usage to truly see the real difference. The 12th and 13th pull way more frames, but their GPU usage is way higher in all examples.
This was very interesting. I didn't realize the differences were that big to be honest. It would be great if you do this with more mid ranged CPUs that most people actually use (8 cores perhaps).
13600K is the best choice in my opinion.
The power of a 13900K with the consumption of a 12700K
i5-13600K is a beast
Well, the 13700KF's price dropped in my country; it's just 20 bucks different from the 13600K.
13700k is basically 12900k or slightly better. lol
@@xxovereyexx5019 Go with i7
13900k is way better than 13600k not even close lol
@@xTurtleOW Yup i9 3% better than i5 lol
I have a 10900K and an RTX 4080 and am thinking of upgrading to a 13900KS, but I couldn't find a good motherboard. Can I use 7800MHz RAM with the ROG Strix Z790-E Gaming WiFi?
Just get a 12700 or 12900k and save $300
If all you do is gaming, get a 7800X3D. The KS version is a rip-off anyway.
No guarantees for 7800MHz, you're gonna need an enthusiast 2-DIMM board for an OC that high
@@steph_on_yt I can't find Z790 Apex in this damn country named Turkey so I guess I should aim for lower frequency...
@@Adem.940 You should upgrade the CPU first, because that will definitely give you a lot more performance. Then, if you want, upgrade the RAM: at 7600MHz or a little lower there's almost no difference.
As Intel changes sockets, the performance gap gets huge 😮
If you're coming from something like an i5 9400F or Ryzen 2600X, going to the latest CPU will probably multiply your FPS by like 3x,
meaning your "I'm fine" 70-80 FPS gaming becomes damn 240Hz gaming.
I see Core Ultra's purpose as accommodating the 360Hz needs, which is indeed needed at the moment; Ryzen 9000X3D may catch up.
I got lucky. I grabbed a 3090 Ti off Amazon for $999.00; haven't seen prices like that since last year.
Now do let's see Paul Ryzen's leap
GPU scaling benchmarks please 5800 X 3D vs 7800x3d with 4090 at 4k
No 4k numbers?
this is a benchmark for CPUs, at 4K the fps difference will be in single digits if at all.
Raptor Lake is just Alder Lake rebranded, with more power draw and a few more frames. 12th gen is the sweet spot. That said, I do run manually tuned Hynix A-die DDR5 at 8400 MT/s on my 13900K; you can't get those speeds with a 12900K.
I'm still on my 10700. It's enough for me
I mean, with that architecture there is really no room for an upgrade unless you go for a new motherboard
I was on an i5-11400 and I bought an i9-13900K and I only got 10 more frames. What do I do?
Can't wait for 14th gen!
New motherboard, PSU, cooler, electricity bill.😂😂😂
With Intel you can burn your house down easily.
@@lelouchabrilvelda1794 You blind? The new CPUs used less power…
It's only in synthetic benchmarks that those new CPUs go insane…
@@lelouchabrilvelda1794 are u dyslexic?
Intel in 2015: TADAAA SKYLAKE ULTIMATE PERFORMANCE
Intel in 2016: SKYLAKE IS GOOD, SO HERE'S A RENAMED VERSION
2017: UMMMM, JUST ADD CORES
2018: JUST RESELL THIS SHIT, NOBODY WILL NOTICE
2020: SO JUST GIVE THEM HYPERTHREADING, F&@K OFF
People: Amd is cheaper, faster and energy-effective
2021: YEAAAHHH THIS IS OUR NEW ARCHITECTURE. 15% MORE PERFORMANCE AND 50% MORE TDP.
Wow, how do you get such good temps on the i9-13900K?
Hardline in a cool environment or open test bench
@@thetshadow999animates9 man, my i9-13900K runs super hot lol
@@donglee6599 If you're a gamer only, just buy a 7800X3D.
You can use a cheap air cooler on that beast of a CPU and beat the 13900.
I think Intel's next 14th gen CPUs will be a huge leap compared with 12th gen. I'm a 13600K user, but I'm also waiting for 14th gen 😊
Will 14th gen require a new socket, or will it stay LGA 1700?
@@swiza79 12 + 13 are on one and 14 + 15 will be on another
@@thetshadow999animates9 typical for Intel, but thanks 👍
@@thetshadow999animates9 that sucks; they change sockets every two gens, while AMD used AM4 for a very long time.
@@theoriginals9821 yes
I've got a 12900K with a 4090, and even at 4K some games need a faster CPU to get all the frames out of the 4090. I'm most likely getting the 14900K when it arrives.
Because the 4090 is very powerful, you need a 13900K for a 4090 even at 4K
@@Tuskabanana The 13900K isn't even enough for a 4090. Waste of money at this time; might as well wait for the next gen.
Hardly anyone plays at Full HD with an RTX 4090. We need tests at 1440p and 4K; there the difference won't be as big, and the 10900K still isn't obsolete :)
This test makes the 11900K look inefficient, consuming the most power for a low framerate.
because it is the worst CPU Intel produced; the 10900K was cheaper and has 2 more cores (10 vs 8), a larger L3 cache, lower power draw, better OC potential, etc.
@@cks2020693 Child, please. First of all, the 10900K can't run Gen 4 NVMe drives at their full speed, which already makes it the worse option compared to the 11900K. Secondly, the 11900K is superior to the 10900K thanks to its PCIe 4.0 capability, and the older 10900K having 2 extra cores doesn't really mean anything.
13th gen is 5% more powerful for 100% more power consumption.
This benchmark shows how much gain you can have with ddr5 compared to ddr4.
I guess I should say thanks for showing how insignificant the CPU is in gaming performance. You could spend a boatload upgrading your proc for a 1-5% fps boost, or get 20-25% by spending less on faster RAM.
I don't think you know how this works. You can't just upgrade the RAM on a 10900K to the RAM of a 13900K; the faster RAM is not compatible with the older platform.
@@CB-bi1be No, no you cannot. Still exactly my point, though. The difference in RAM speeds going from DDR4 to DDR5 accounts for 95-99% of the performance gains. The DDR5-6000 RAM offers roughly 40% more frames while running at a roughly 67% higher transfer rate than the DDR4-3600. The processor isn't offering any improvement itself other than support for a faster DDR revision being added between the 11th and 12th generations. The DDR generational improvements are worth upgrading for. The CPU generational improvements are not, at least not if your agenda is gaming.
Then would it be possible to somehow re-engineer the older CPU/motherboard to accept the newer RAM? Could it be as simple as re-soldering the new RAM interface onto the motherboard? @@danmckinney3589
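For anyone wanting to sanity-check the bandwidth arithmetic in this thread, here's a rough back-of-the-envelope sketch in Python (theoretical peak only, assuming dual-channel kits like the ones in the video; real-world gains are smaller because latency and timings also matter):

```python
# Theoretical peak bandwidth of a memory kit:
# mega-transfers/sec * 8 bytes per 64-bit channel * number of channels
def peak_bandwidth_gbs(mega_transfers: int, channels: int = 2) -> float:
    return mega_transfers * 8 * channels / 1000  # GB/s

ddr4 = peak_bandwidth_gbs(3600)  # DDR4-3600, as on the 10900K/11900K
ddr5 = peak_bandwidth_gbs(6000)  # DDR5-6000, as on the 12900K/13900K

print(f"DDR4-3600: {ddr4:.1f} GB/s")  # 57.6 GB/s
print(f"DDR5-6000: {ddr5:.1f} GB/s")  # 96.0 GB/s
print(f"raw uplift: {(ddr5 / ddr4 - 1) * 100:.0f}%")  # ~67%
```

So the raw transfer-rate gap between the two setups is about 67%, even though the measured fps gap is smaller, since games don't scale linearly with bandwidth.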
Glad I purchased the i5-13500, which is really good for everything, at least for me
The 13900K is only 5% faster but needs 20% more energy. What a bad CPU generation.
The 12900K is the best of these four.
Intel shows how bad their generations are.
There isn't any game here where the 13900K is only 5% faster while using 20% more power. Power efficiency is slightly regressed in the first three games in the video, but in all the subsequent games the 13900K gives more performance for a smaller power increase.
I don't get it: in Hogwarts the 10900K's usage is 20-25% and we get 47 fps, while the 13900K's usage is 14% and we get 72 fps. My point is, where is the rest of the CPU power? Clearly the 10900K and the others are not being used to their full potential. Why am I forced to go to a 13900K for much better performance when my CPU isn't fully used? It's like buying a sandwich but only being allowed a small bite of it. I get it, it's optimization, but it sucks hard. And it's not only Hogwarts, I'm not attacking just this game, but all games.
Bro, the 13900K has 12 more threads than the 10900K.
That results in lower overall usage percentages.
The thing is, programming a game to use many cores is not as easy as you think. That's why 6c/12t is still enough for gaming.
They all have different core counts, but not all cores are used for gaming; games will use at most about 8 cores at the moment. What matters for gaming is single-threaded performance and cache, not core/thread count or usage.
10900K is 10c/20t
11900K is 8c/16t
12900K is 8P+8E/24t
13900K is 8P+16E/32t
The 12900K and 13900K have E-cores as well as P-cores. The E-cores shouldn't really be used for gaming, at least not in conjunction with the P-cores. (Dawid Does Tech Stuff actually found that the E-cores alone are very hit-and-miss for gaming: they were only slightly better than a 4th gen i5, and BFV single player was basically unplayable. But if you're spending 12900K or 13900K money, you shouldn't be gaming on the E-cores anyway.) They really only exist to accelerate multi-core workloads.
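To illustrate the point about thread counts and usage percentages, a toy calculation (hypothetical scenario: a game that fully loads 8 hardware threads and leaves the rest idle):

```python
# Monitoring overlays average the load across ALL hardware threads,
# so the same 8-thread game load "shrinks" as the thread count grows.
def reported_usage(busy_threads: int, total_threads: int) -> float:
    return 100 * busy_threads / total_threads

# Thread counts from the list above
for name, threads in [("10900K", 20), ("11900K", 16),
                      ("12900K", 24), ("13900K", 32)]:
    print(f"{name}: {reported_usage(8, threads):.0f}% overall usage")
```

So a 13900K showing 25% overall can still have 8 threads fully pegged; a low overall usage number doesn't mean the game has untapped CPU headroom.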
The 12900K & 13900K have DDR5.
The 12900K & 13900K still win even with DDR4,
but DDR5 makes the gap even wider.
The i9-10900K was the king, and it's still rocking 🔥
ryzen X3D please too
I LOVE MY 10900K, 32GB @3600MHZ, 1 TB NVME AND 3 SSD, Z490 MSI GAMING EDGE WIFI, SEASONIC GX850 GOLD, CORSAIR 4000D AIRFLOW + 4 CORSAIR ML120 AND 4070
Have you tested it in 4K resolution to see what happens?🤔
It's not a CPU test at 4K; it's a GPU test
A 4K test would make the test GPU bound rather than CPU bound since 4K demands more power from the GPU than the CPUs.
At 4K the GPU would reach max or near max usage for all the CPUs. So the results would be very similar
Then this video would be pointless, as it would be 100% GPU-bound and a CPU comparison would be meaningless
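The point these replies make can be summed up with a crude min() bottleneck model (a sketch with made-up numbers, not real benchmark data):

```python
# The delivered frame rate is roughly capped by the slower of the two:
# what the CPU can simulate per second and what the GPU can render.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical: GPU renders ~300 fps at 1080p but only ~90 fps at 4K
print(delivered_fps(cpu_fps=150, gpu_fps=300))  # 1080p, slow CPU: 150
print(delivered_fps(cpu_fps=250, gpu_fps=300))  # 1080p, fast CPU: 250
print(delivered_fps(cpu_fps=150, gpu_fps=90))   # 4K, slow CPU: 90
print(delivered_fps(cpu_fps=250, gpu_fps=90))   # 4K, fast CPU: 90
```

At 1080p the CPUs separate; at 4K both land on the same GPU-limited number, which is why a CPU comparison has to run at low resolution.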
Imagine there were people upgrading from 10th gen to 11th gen back then
I could have gotten an 11900K, but the 10900K overclocked and RAM-tuned is faster.
And what about the i9-9900K? I have that CPU
Still good at 1080p and 1440p; depends on what games you play
@@ikram2512 GTA Fivem
People talk about CPU wattage, yet the 12900K and 13900K are overall lower 🤣🤣💀💀
DDR5 makes a huge difference indeed. On equal memory the difference would be minimal at best.
😳 It really makes a big difference.
Am I the only one who noticed the difference between the two setups? The 10900K/11900K are using DDR4-3600 whereas the 12900K/13900K are using DDR5-6000, which would explain the massive difference from the generations before 12th gen. If they were all using roughly the same speed memory, I'm not sure the jump would be as big. That said, 12th, 13th, and even 10th gen are all better than the underwhelming 11th gen i9. Sorry Intel, not sure what you were thinking on that one. You're supposed to leap ahead with technology, not backwards. ;)
Strange... I repeated the settings from the video in The Witcher 3 (the Steam version with all updates), and my 9900KF puts out 70 to 85 fps depending on the location. Granted, my memory is 3800 with tuned timings, 4x8GB. It felt like I saw more in Cyberpunk too, but I'm too lazy to verify; too many settings to change.
Yeah, I don't believe 10th/11th gen can't put out 60 fps. The test isn't right; 12th gen with a 4090 at Full HD should always fly past 100 fps
@@Masterprocessor I also doubt the objectivity of the test. To be fair, though, in this case the graphics card doesn't matter: DLSS 3 is off and the resolution is set to 1080p, so the GPU isn't loaded to 100%.
That's exactly it, "depending on the location". In the video the guy deliberately runs around Novigrad only, since it's the most CPU-intensive zone and the difference between processors is more visible there. Outside of Novigrad and the Oxenfurt market, all the Intel CPUs in the test will show 60+ fps.
Very strange test, no bottleneck on either the CPU or the GPU. How so? It turns out it's not about the processors but about the RAM.
Here the bottleneck is on all the processors; 99% CPU load is not required for the CPU to be the bottleneck :)
@@maslov_m5 How so? Well, unless the overlay is set to "average CPU load" rather than "real-time load", then yes.
@@games-pc I'll repeat:
in this video the bottleneck is the processors,
their architecture, cache, clocks, instructions, all combined.
The fact that the processors aren't running at 99-100% doesn't tell you anything.
Such a huge difference between the 11th and 12th generation. wow
They should have used DDR4 memory; otherwise you can't tell how much of the improvement comes from the RAM. Some motherboards still support DDR4.
Is the RAM holding them back or what? (10900K and 11900K)
Ddr4 vs ddr5
12th gen i9 seems good considering it's like a downclocked 13th gen i9
It isn't, though; only the lower-end 13th gen SKUs use Alder Lake, the rest use Raptor Lake
The 12900K has fewer E-cores than the 13900K; it's equivalent to a 13700K.
Sorry, but who's buying a $600 CPU to play at 1080p?
The fact that the RTX 4090 is still bottlenecked by the 13900K (most games don't even reach 80% GPU usage) is just amazing!
It's not a bottleneck; the 4090 doesn't need to be heavily utilized at 1080p or lower resolutions. It's a GPU catered towards 4K gaming. I also have the same combination of a 13900K and a 4090; it uses more than half of itself under a 4K load.
We can see you don't know how CPU tests work 😂😂. The 4090 is for 4K; 1080p is for CPU testing...
I don't understand this at all. The CPU usage of 10th and 11th gen is low; it's not like they are the bottleneck. I know these chips are capable but are being held back by Intel's shady software, so that consumers buy the newer chips and discard the older ones.
And AMD fanboys said that E-cores are useless in gaming. LOL!
Well, AMD fanboys are right on that one: E-cores are useless in gaming. But Intel's P-cores spank AMD's P-cores. The performance jump each generation comes from the superior P-cores, not the additional E-cores. Look at the jump from the 12900K to the 13900K: both have a similar P-core architecture, but the 13900K's P-cores are clocked 600 MHz higher, have 6 MB extra L3 cache, and about 3% better IPC; the 600 MHz higher P-core clock makes almost all the difference.
And though P-core clock speeds from 10th to 12th gen are almost the same, 12th gen has a massive IPC uplift over 11th gen, and even more so over 10th gen. 10th gen is still a little better than 11th gen, because 11th gen was backported to 14nm, which made latency awful and thus in many cases worse for gaming despite the better IPC. 12th gen is a roughly 20% IPC uplift over 11th gen; on the 10nm node it should have had, 11th gen would have been good.
@@Wolverine607 You mean way more power to fight against a Ryzen 7800X3D at 40W lol.
@@Jerry-jo6dn Well, the 7800X3D is a different animal in gaming. It trades blows with a well-tuned 13900K with E-cores off, though the 13900K wins in everything else, even with E-cores off. The extra cache really does seem good for games. But yes, the 13900K uses a lot more power than the 7800X3D for similar gaming performance.
@@Wolverine607 SIMILAR PERFORMANCE?
Did you see the results in DOTA, VALORANT, PUBG and more?
They destroy Intel so badly.
The E-cores are not the reason it's faster. If he turned the E-cores off, the results would be the same. The 12900K and 13900K overclock better with the E-cores off, so it would actually be even faster.
8600k vs 13600k ?
@ 1920x1080 though?? really????
Helpful video... too bad so many people fail to understand the point of it... "You should have done 4K!" 🤦♂🤦♂
The 10900K is top-tier :) honestly... but I have questions about the RAM :)
4:20 Same glitch :D
Intel 11th gen was a rip-off; sad for anyone who got it
@@myuvmyuv The i7-10700K was way cheaper
I upgraded from a 3770K and got my 11900K for a very small price off a friend, so I'm loving it 😊
@@SM-mk9kl If you got it for a nice price, it's definitely better!
@@SM-mk9kl If they're cheaper, then they're good
The CPU is a serious bottleneck for the RTX 4090; this GPU is most likely future-proof...
It is not correct to compare 3600 MT/s RAM vs 6000 MT/s RAM
There is a problem with your testing: the CPU usage is never at 100%, which means it's not reliable (and I'm not an 11900K user 😂)
When 15th gen arrives next year with a new motherboard platform, I will hopefully upgrade from my 10900K. Still a good CPU, FTW.
No point in changing yet; wait for the 15th or 16th gen, then it'll be worth switching :)
1st: 12900K, 2nd: 10900K. The 12900K has more performance and better efficiency.
They did well considering they were stuck with 14nm++++++++++++++.
lmao 4:22 on the right
Bugwarts Legacy