Games:
CYBERPUNK 2077 - 0:06
Call of Duty: Warzone - 1:05
Hogwarts Legacy - 2:01 - gvo.deals/TG3HogwartsLegacy
The Witcher 3 - 2:58 - gvo.deals/TestingGamesWitcher
Horizon Forbidden West - 3:59
Red Dead Redemption 2 - 5:01 - gvo.deals/TestingGamesRDR2
Avatar Frontiers of Pandora - 6:12
Microsoft Flight Simulator - 7:17
Starfield - 8:19
Forza Horizon 5 - 9:17 - gvo.deals/TestingGamesForza5
System:
Windows 11
Core i9-14900K - bit.ly/3rTFhVy
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz - bit.ly/3XlBGdU
CPU Cooler - MSI MAG CORELIQUID C360 - bit.ly/3mOVgiy
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR HX Series HX1200 1200W - bit.ly/3EZWtNj
AMD gives better performance at almost half the power draw lmao. Intel fanboys seething
also half the price
@@luixkjkk And that's before the more expensive supporting components needed by the 14900k.
there arent any intel fanboys here lmao
watch them come soon
Why would people become fanboys of a CPU manufacturer, wtf
Intel's "E cores" 😂
Funny - 70watt vs 140 watt, 450$ vs 550$
AMD with Ryzen made a fatality to Intel ; )
Guys celebrating 50W when you can run the 14900K at 75W with the same performance, yet they stay quiet watching the graphics card draw 350W. So much contradiction.
@@CarFlightGamesCFG exactly, I run an i9 13900K drawing 80W in Warzone and 44-65W in single-player games. It's all about tuning vcore and the baseline profile on turbo until you find the ideal voltage
@@warzonehighlights773 Who is going to bother with that? Most people don't want to tweak anything; they don't know how.
@@Angelo0chek that isn't my problem, and yes, that's them
Which one? 😅
Ryzen 7 7800X3D GOAT
Bro meatrides AMD products but also hates AMD products. This is what I would call the Schrödinger's consumer
@@xigaxhad3635 Looks to me like he just likes the better products and isn't a fanboy for one or the other.
@@greebuh check his comments, the guy just hates amd gpus
Yea
"That’s why he’s the MVP, that’s why he’s the GOAT!!😊"
the 3D V-Cache thing they did was a very smart move when it comes to gaming
it would be crazy if they put this tech in the playstation 6 and the next xbox
idk if you can cool it well in a console because 3D V-Cache runs hotter than non-3D V-Cache
How good is the Playstation 6 you got?
@@Der_Joghurt_Ohne_Ecke hotter? It's cooler? I'm cooling my 7800X3D with a $40 air cooler and it doesn't even exceed 60°C during gaming.
There is a lower thermal limit on the X3D chips than on the non-X3D chips because the V-Cache is heat and voltage sensitive, not because they run hot.
The thermal limit is 10°C lower than on the non-X3D chips lol.
Don't know where you got that info from, but it's the opposite; the 7800X3D at least is cool af
@@Der_Joghurt_Ohne_Ecke I have a $30 cooler on my 7800X3D and it maxes out at 80 degrees under full load, around 70 in normal gaming. No problem to cool whatsoever
@@Deathscythe91 Cyberpunk: 61°C 14900K, 63°C 7800X3D. 75W vs 150W. It gets slightly hotter and draws less than half the watts. That's all because of the 3D cache, so yes, it does get hotter because of the cache, but it's not so bad.
Crazy, I got 2x the performance on RPCS3 and Yuzu after I upgraded to 7800X3D from 10700F and it never went beyond 80W
wait what? how is that possible? is it really true?? I would possibly look into upgrading (I guess downgrading for other tasks), but I want to emulate games like The Last of Us. Is it possible to get 2x perf even on that? If possible please reply
@@PraveenYadav-q9l He was probably CPU bottlenecked. Tell me your setup
@@FreshMedlar i have a 7700k
@@FreshMedlar not really, emulators really prefer faster CPUs
@@PraveenYadav-q9l from a 7700K you'll probably get more than 2x for emulators. In PC games I get 30% to 50% more FPS. (In ARMA 3 YAAB I get almost 2x more than the 10700F)
Edit: and that's on 1440p
I'm glad that Intel is going through this; it means they will have to work on optimizing the architecture and on new architectures, and not just raise clock speeds to sell lies of "new products" to people.
The one thing they really have to improve on, and AMD excels at, is PR
Kinda wish this would happen to Nvidia and even the playing field a bit.
@@mattyb584 Nvidia is dominating right now, this can happen only if they make desperate attempts like Intel did.
@@JohnLeOpinion And PR is short for PeRformance per Watt.
7800x3d is the best gaming cpu ever made lol
sort of, imo not 'flagship' enough, their 9000 series are, and deserve high pricetags
@@iikatinggangsengii2471 Price doesn't determine how good something is. A new generation should be roughly the same price as the last generation, or we'd be spending millions of dollars on CPUs by now. That's not how it works. This isn't a matter of opinion.
in Horizon and MFS the difference is simply huge
also rdr
Yes, but in several very demanding games such as Hogwarts Legacy, the i9 pulled ahead in averages and, in some cases, 1% lows (see Starfield and Witcher 3). A processor is much more than the aggregate average FPS you get across multiple games. You have to consider:
1) Which processor has the better memory controller?
2) Which processor offers more system resources and can multi-task? Think several Chrome tabs plus Discord, etc.
3) Which processor offers better efficiency/performance? Performance in terms of system resources as well as FPS per watt
4) Which processor is right for YOU, and your workload? And more
Also, one doesn't have to use the baseline profile..... Unless of course you have zero clue how to configure/build a PC
@@LorentGuimaraes I mean, Starfield is almost fully utilizing the 7800X3D at above 85%. Honestly, it's truly a gaming CPU and the best one right now
@@LorentGuimaraes What about the price of the 14900?
All your points from 1 to 4 are useless.
@@artyrkarpov7463 What about it? It has 24 physical cores, so it is more expensive. Is there an argument somewhere??
ryzen 7 7800x3d is the best gaming CPU ever made.
lol
the best gaming ever made
@@fbiagentmiyakohoshino8223the ever made
@@old1101 the made
@@Gcity67kthe
Half the cost, half the TDP, minimum hardware requirements and at the same time the best 3D performance!
Like Pentium 4 and Athlon 64 in the past...
AMD can make an ultimate comeback!
Yet Intel can live off AMD x86 royalties... It's not fair.
AMD made an ultimate comeback with the release of Ryzen 1xxx in 2017
Don't forget that the 14900K's socket is effectively dead. No future CPU can use it.
Now let's look back at an AM4 socket X370 motherboard from December 2017, initially intended for the Ryzen 7 1800X, but users can still upgrade to a Ryzen 7 5800X3D just by doing a BIOS update.
Can an Intel socket from 2017 do that? Nope, and neither can LGA 1700.
AM5 today can run the 7800X3D, all the way up to a 10800X3D in the future
True, I have a 7800X3D and it's the best bang4buck, and you can upgrade in the future
A few years ago every single Intel CPU generation had its own socket, and it was communicated that upgrading a CPU on the same socket wasn't foreseen. The mobo and the CPU are one unit, and the socket should be optimized as much as possible for every change in a CPU. AMD bought Intel's first socket system and copied their socket approach again when they rose.
Came to update: the AM4 socket just got another batch of CPUs launched for it alongside the SECOND gen of AM5 CPUs. Intel could NEVER
AM5 is only supported till 2027.
@@kazuviking and AM4 was only planned for 2017-2022, yet even today new CPUs are still coming out for AM4, the 5600X3D for example.
So AM4 has effectively been supported from 2017 to 2024 and counting.
AM5 is planned for 2022-2027, but I can assure you it will still be receiving new CPUs until 2029.
5+ years of support for the same socket is something Intel has never done.
Intel note to self:
Never. *EVER* Mess with an AMD X3D CPU for gaming.
Here in Italy, the 7800X3D costs 350€, while the 14900K is 600€.
The 3D V-Cache works so well, and the AMD CPU seems to utilize the GPU way better
consumes less to give more at a cheaper price!
You do realize that these two CPUs are in completely different classes?
@@JohnLeOpinion which is why it's even funnier how bad 14th gen Intel is at games (or efficiency, or stability). There hasn't been a single consumer-grade CPU in recent history that outperformed its upper-tier competitors in gaming.
can someone explain to me why power consumption is that big of a deal? I'm legit asking because it's not something that matters to me. It equates to maybe a couple hundred dollars a year. I can understand if someone has a low-end system with like a 500-watt PSU and needs headroom. It seems more en vogue than anything
@@ryanwishart5452 because people don't like spending an additional couple hundred dollars if they don't need to
@@ryanwishart5452 HEAT. More Watts = Hotter room. Bigger coolers, faster Fans = More noise. All that at a higher cost, for essentially the same performance.
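For anyone who, like the commenter above, wants to put a number on the power argument, here is a rough back-of-the-envelope sketch. The ~80W gap is taken from the figures quoted in this thread; the hours per day and electricity price are assumptions for illustration, not numbers from the video:

```python
# Rough yearly cost of the CPU power-draw gap seen in this video.
# All inputs are assumptions -- adjust for your own usage and tariff.
watts_gap = 150 - 70     # ~80W difference between 14900K and 7800X3D in games
hours_per_day = 4        # assumed daily gaming time
price_per_kwh = 0.30     # assumed electricity price in $/kWh

kwh_per_year = watts_gap / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year -> ${kwh_per_year * price_per_kwh:.0f}/year")
# ~117 kWh/year -> ~$35/year. The bigger practical cost is the extra heat
# dumped into the room and the cooling/noise needed to remove it.
```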
I still remember the days when it was the other way around. Intel Core i7 CPUs were unmatched. It's interesting how things turned out.
yeah, FX was terrible because programs back then never used more than 4 cores, but nowadays those chips perform pretty well against the i7s of their time
The 4790k was pretty much unmatched by AMD and anything intel put out until like the 8th gen, that thing was just busted.
The 7800x3D is GOD at 70W and nothing more; for me the 14th generation of Intel has been a complete failure.
the Ryzen 7 7800X3D is the clear winner, but the 1% lows in many games are not as stable as on the i9 14900K
Damn! Starfield almost fully utilizing the 7800x3d 🙀
The X3D chips tend to have much higher CPU utilization than the non-3D variants, which explains how they are able to perform better despite lower CPU clock speeds.
Are you still playing it?
50% on a 24 core, 32 thread CPU during gaming is also absolutely crazy.
Ryzen 7 vs i9, genius
Also, the R7800X3D doesn't expire from overvoltage issues (14th gen) or oxidation issues (13th gen). There's a reason that Intel stock is (today, 7/31/24) cratering at $29 while AMD is riding high at $132.
Intel doesn’t have a baseline profile, they’ve said this themselves. They said it’s 253/253 and 400a in extreme mode and 320/320 400a for KS models in extreme mode. Both of which are considered stock and covered by warranty.
So how much current did you set? Because you’re nowhere near 253w yet your P cores are throttled back heavily.
Have you not been keeping up with the drama?
Several motherboard manufacturers have custom power profiles that are enabled by default, that provide basically unlimited power (4096 watts on the Asus boards).
Asus calls it Multicore Enhancement, can't remember what the other guys call it.
Intel has asked motherboard manufacturers to stop doing that and set the default to Intels spec instead.
This is also a thing on AMD motherboards. In PBO you can set the processor power to CPU/AMD or Motherboard limits.
Setting it to Motherboards allow significantly higher volts and amps when boosting.
@@AluminumHaste Yes and Intel has publicly commented. They have explicitly stated that i9 CPUs DO NOT use Intel baseline. They instead recommend Intel Performance or Extreme modes. They offer a table which states the power limits users can use. Extreme Mode is a default setting, it is 253/253 400a for i9 K/KF and 320/320 400a for i9 KS.
Unfortunately many outlets reported and tested before Intel made their statement against the rumors and "leaks".
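To make the profile numbers being thrown around here easier to compare, this small sketch just tabulates the limits as quoted in this thread (PL1/PL2 in watts, ICCMax in amps). Treat them as the commenters' figures, not an official Intel table:

```python
# Power profiles for 14th-gen i9 parts as quoted in the comments above.
# PL1/PL2 are sustained/boost power limits in watts, ICCMax is the current
# limit in amps. Verify against Intel's own documentation before relying on these.
profiles = {
    "i9 K/KF Extreme": {"PL1": 253, "PL2": 253, "ICCMax": 400},
    "i9 KS Extreme":   {"PL1": 320, "PL2": 320, "ICCMax": 400},
}
for name, p in profiles.items():
    print(f"{name}: PL1={p['PL1']}W PL2={p['PL2']}W ICCMax={p['ICCMax']}A")
```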
it would be embarrassing to see an 80W CPU competing against a 250W CPU
"bu bu bu if Intel would won if..." Nobody cares. Even if Intel got 2 fps more, it uses nearly twice the power draw costs like 70% more, has ZERO upgrade path and is slower on average in gaming. The numbers don't lie. How tf prople still buy Intel for gaming is beyond me.
dont forget that shit is failing now, imagine paying more money just to get errors, rofl
@@maoutted Intel released a fix / patch for it recently though. So they should mostly work now.
@@4nlimited3dition_4n3d that's prolly a temporary fix tho, they need to make their shit on 5nm
you do realise the Intel chips are not designed for gaming like the Ryzen chip? Go and compare video rendering and see what happens 😂 I would happily sacrifice 10% of my fps for twice the rendering performance. I'm so sick of this AMD bot fanboy crap now tbh. You act like the Ryzen chip saves you hundreds every month. I don't stare at fps monitors when I play and would not notice a damn difference in gaming performance, same as on my monthly electric bill. So stop slating people just because they don't like your precious AMD shite. And yes, because of people like you, I won't touch anything AMD. Their GPUs are trash. And their CPUs are poorly optimized for anything except gaming (the iGPU is also trash)
@@prussell890 This video isn't about productivity performance or video rendering, it's about gaming performance.... Also why are you talking about AMD GPUs in a video about AMD vs Intel CPU? By that logic Intel dGPUs are even worse. No one cares about integrated graphics btw, if you buy an iGPU instead of a dGPU, you most likely won't care about iGPU performance anyway.
Btw yes I do agree that Intel is better for productivity tasks since it got more cores. If you do more productivity tasks than gaming then you should always get Intel CPUs for best performance.
JayzTwoCents just put out a video today detailing how Asus's current Intel Baseline BIOS has an incorrect amperage setting (it's too low) and you have to manually increase the amps in the BIOS from 280A to 400A to comply with Intel's recently published baseline chart; he shows how 280A decreases the Cinebench score by over 3K points vs. 400A
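To put that misconfiguration in scale, a quick sanity check using only numbers quoted in this thread (the ~3K-point drop above and the ~41000 R23 score cited further down); these are the thread's figures, not independent benchmarks:

```python
# Scale of the 280A-vs-400A ICCMax misconfiguration described above,
# using Cinebench R23 scores quoted elsewhere in this thread.
full_score = 41000   # ~14900K R23 multi-core score, as quoted below
points_lost = 3000   # drop reported with the 280A cap
print(f"~{points_lost / full_score:.1%} multi-core performance lost")  # ~7.3%
```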
What is the Intel Baseline Profile? Does that mean running on Intel default settings?
Sounds like a very unimportant feature if you are overclocking and have a K series
It means Intel-recommended settings, as most mobo manufacturers are pushing the power limits far beyond their spec, so they are thermal throttling a lot quicker. @JayzTwoCents has made a video with an in-depth explanation of this.
7800x3D same FPS with HALF the power of 14900k 🤣
More in some cases.
With much better 1% lows
And half the price
Makes sense, it's 3x the CPU of the 7800X3D: 8 cores vs 24 cores. If they had managed to make a CPU with triple the physical core count consume less power than a smaller CPU on a smaller node, it would have been the biggest slap in the face to AMD: a one-gen-more-advanced node plus a smaller CPU producing more heat than an Intel CPU with triple the cores on a bigger node
@@D3nsity_ if you disable the 16 E-cores, how much FPS does it gain? And how much less power does it consume? The answer is: not much difference. Games usually don't need more than 6 cores, and if the Windows scheduler doesn't fail, the 14900K will always use the power-hungry P-cores.
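If you want to test the "games only need the P-cores" claim yourself rather than trusting the scheduler, one hedged way is to pin the game process to the P-core threads. This sketch uses psutil and assumes logical CPUs 0-15 are the 14900K's eight hyperthreaded P-cores (the usual Raptor Lake enumeration, but verify your own topology first):

```python
# Sketch: restrict a process to the 14900K's P-cores so none of its threads
# land on E-cores. Assumes logical CPUs 0-15 map to the 8 P-cores (2 threads
# each), which is the typical Raptor Lake layout -- check your system first.
import psutil

def pin_to_p_cores(pid: int, p_core_threads: int = 16) -> None:
    psutil.Process(pid).cpu_affinity(list(range(p_core_threads)))

# Example with a hypothetical game PID:
# pin_to_p_cores(12345)
```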
Apart from the obvious exceptions where the difference is massive, you're obviously going to have fun on either platform. But the issue I see is power consumption, and AMD absolutely stomps here
Yes, with the success of the Ryzen 5800X3D, you see 3D V-Cache integration in the Ryzen 7000 series and future CPUs. Users must thank the 5800X3D. 😊
Today: 15 Oct 2024
Ryzen 7800x3D costs 500 euro
Intel i9 14900 costs 460 euros
For a reason...
I got my 14900K for $250 and my 7800x3d today for $300 (new)
half power with that level of performance? no doubt 7800x3d the winner here
The warzone test 1% lows on the R7 are insane, thats what makes the chip so damn good.
Ryzen 9 7950X after hearing 13900K-14900K have issues with gaming and sometimes multitasking: “YES, My turn A$$Holes”
I think the 14900K is now evenly matched with the 7950X while costing less, but with worse power efficiency. (It was previously better but consumed a LOT more power)
@@auritro3903 so what is the move then? I currently have an i9 9900K; should I get the Ryzen 7 7800X3D or a 14th gen i9?
@@senorbonbon 7800x3d easily
@@senorbonbon even if you multitask a lot and you would like to go Intel, do not buy the 14th gen; go with the 13900K for that, since that isn't an unstable mess
@@senorbonbon7800x3d all day. You’ll be able to upgrade while using the same motherboard for years
AMD: more efficient, same performance, cheaper, long-lasting platform. Absolute W CPU. I use it daily and it fits all my needs. 8 cores is more than enough even for very heavy multitasking.
I wanna ask, are you using the same cooler for benchmarking?
I've seen a lot of BS in my years but this is up there with the later Pentium 4 processors, the GTX 970 really having 3.5GB of VRAM, and Bulldozer.
All the tests in Full HD? I'm interested in higher resolutions
Then it would be a GPU test
honestly insane how far this rivalry has come.
You've got to hand it to AMD. They made one helluva gaming CPU with the 7800x3D. Still loving my 5800x3D. These things are GOAT status.
All I see is 80W vs 150W
12700K / 13600k / 7700x / 7800x3D is now perfect gaming cpu for good price and power draw.
The only reason to buy Intel is if you're still a fanboy from when they were on top. Power consumption matters these days, especially in Australia.
For me it's funny that when people talk about graphics cards, power doesn't seem to be a problem
@@evan-du3vk Intels new CPUs are using more power than most GPUs. Thats the problem
@@kingiument4627 i have 13900k and in extreme settings is using less then 200 watts. So don't know witch good graphic takes less
@@evan-du3vk RTX 4070, RTX 4060, all other RTX cards below the 3060, RX 6600 XT, RX 7600, all GTX cards below the 1080 Ti. That's 90% of the GPU market right there
NASA uses intel so I’m using intel 🎉
Was expecting Intel's to be a lot less, but the 7800X3D is still the no-brainer for gaming.
If you have zero clue what you are doing it may be the CPU for you, the 14900K is not idiot proof
@@JohnLeOpinion the 14900K is a trash CPU. I had a 12700K, it's horrible. I'm so glad I upgraded to AM5
@@JusstyteN Ever had one? I have one and it's great. Overclocked, never a single problem.... I can run BF 2042 and upscale movies in the background, while on Discord with my friends, and even streaming if I want to.
@@JohnLeOpinion what do you mean idiot proof? you have to be smart to not run out of 24 cores of 14900k?
I always had this doubt:
In a real world where the player has discord, an open browser with several tabs, WhatsApp and perhaps a work app like Office, which of these processors would do better?
I would like to see a comparison like that.
Who tf uses discord while playing story games?
ok, pay +50% price for what? As if you can't use Discord and a browser on AMD, which add maybe 1% more load to the CPU?
It's a good idea tbh
or like 3 screens
You hit the nail on the head. The processor with the most physical cores can handle more and will last longer. With that said, the 14900K and the 7800x3d are different: one is more of a generalist and the other is a specialist. As a PC builder, it is my opinion that 8 cores are on the way out.... hyperthreading has proven not that useful for gaming anyway.
An actual somewhat equivalent comparison is 7950x3d vs 14900k on said multi tasking plus gaming workload.
Interesting video! I would like to know if we should expect a video showing the 4090 and 7800x3d test in the game Black Desert Online in 4K resolution?
When are the 4K results allowed to be released?
Let's be honest here. Either processor is going to give you a fantastic gaming experience. I like both of them.
You have a problem with LLC or something on the Intel platform; there is no way the P-cores throttle to 4600MHz. I just used your settings in Forbidden West and I get 232FPS with a 14900K on Intel defaults.
it's not just LLC; that and LLC really kick in when the CPU is under heavy stress.
Every tech person commenting on this issue has never set a 24/7 overclock, or if they have, they upgraded to next-gen parts before they degraded the CPU. These are not the guys who had an 8700K running for 5 years at 5.2GHz with 1.325V.
The LLC is feeding the CPU too much voltage on top of the voltage already being set too high.
Nobody notice the AMDip at 48secs?
nobody notice the intel Dip at 1:27?
IntelCope is the best cope
7800X3D, best gaming CPU in the world. Amazing
Does the burning problem still occur on AMD processors and motherboards?
no, they fixed that a long time ago. The cause was people running 6000MT/s RAM with EXPO, which pushed the EXPO voltage up to an unsafe level.
Also, this was worse with certain motherboards.
@@m8x425 They fixed it 1 or 2 months ago. He's probably talking about the SoC voltage one.
I literally don't understand intel fans.
Amd LITERALLY gives slightly higher performance at a much lower price and a much lower power usage. LITERALLY 0 REASON TO USE INTEL
excuse me. what's the monitoring software used?
MSI afterburner
For gaming AMD is the best, but what about work performance, meaning compilation/shaders/rendering? Is it still the better choice?
Thats why there are AMD Threadrippers.
@@Xawwis Sure, but not that monster 🤣 I mean a home workstation, 7800x3d vs 14900k, with gaming as an option
@@Boraboar you do realize that a Threadripper consumes exactly the same amount of power as a 14900KS? That's the true rival of the 14900K & KS for productivity & gaming at the same time
@@niezzayt3809 Yep. But I don't know if I can find Threadripper for less than 500 bucks and a cheap mobo. Also, threadripper is definitely not the Holy Grail for gaming.
@@CaneCorsobabs-vj5rk thanks, that's exactly what I mean
Dont worry guys the ultra i9 super turbo mega next gen is on the way and no the upcoming ones wont be failed refreshes.😮
10 P-Cores, 48 Trash-cores and 570W powerdraw?
correct me if I'm wrong, but does the 14900K use double-ish the power of the 7800X3D? Is that what it shows?
2-3 )))
lmao yes
Triple the cores and double the power consumption. 8 cores consuming half of 24 cores is more concerning
@@D3nsity_ There are usually only about eight cores working in games, not all of them. So it's 8 vs 8.
@@Xawwis Red Dead uses more than 8 cores. It lights up all 16 cores/24 threads of my 13700K. All the games shown here can use more than 8 cores.
For Warzone, why is the FPS the same but the GPU power of the two GPUs so different?
why is vram usage so high when playing warzone?
What software is that for PERFORMANCE DISPLAY?
Msi afterburner dude
The i9 suffers most in the 1% lows methinks, and uses much more power of course. Thanks for the comparison, mate! 😇🍑
7800X3D literally the king right now.
Intel Core is multipurpose, while Ryzen is focused on gaming. If you don't ever need to render stuff just go for Ryzen.
I9-14900K : ~41000 Cinebench R23
R7 7800X3D : ~18000 Cinebench R23
If you need rendering, go AMD Threadripper: 100,000 Cinebench R23
Threadrippers suck at gaming
@@jesusserrano3673 ohhh mimimimi
18000 also works 🤷🏽♂️ if you are primarily gaming and SOMETIMES video editing, just stick with the 7800x3d to save money and gain fps
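Taking the R23 numbers quoted in this sub-thread at face value, the generalist-vs-specialist trade-off is easy to quantify. A small sketch (the scores are the commenters' figures, not independent measurements):

```python
# Rendering-vs-gaming trade-off using the Cinebench R23 scores quoted above.
i9_mt, x3d_mt = 41000, 18000  # multi-core scores cited in this thread
print(f"14900K rendering advantage: {i9_mt / x3d_mt:.1f}x")  # ~2.3x
# Meanwhile the gaming FPS in this video is roughly a wash at half the power,
# which is exactly the "multipurpose vs gaming-focused" split described above.
```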
Intel is superior to AMD in every way. Unless you don't know how to tune your system. Get an AMD system for novice builders
You might wanna test these with the power plan set to high performance; those P-cores are all over the place. They should be able to do 5.7GHz 24/7 with a decent cooler.
How is your CPU not overheating?
TIPS please!?
It's interesting to note that on the AMD side the CPU has significantly higher utilization vs the Intel CPU, which seems to also correspond to higher utilization of the 4090. Yet the Intel build has the same or lower framerates, with higher CPU clock speeds and higher power consumption. Were any voltage settings set lower in the BIOS on the 14900K to help with the thermal issues 14th gen has been suffering as of late? It just seems like the 7800X3D is doing more with less compared to the 14900K, as if it's in "tryhard" mode. I am amazed, since ever since these CPUs came to market the 14900K was the performance king.. clearly that's wrong.. or things have changed since.
Very interesting how these processors will behave with the next-gen consoles in 2030+, when games will utilize about 28 threads
Those are not Intel's baselines! The people at the motherboard manufacturers who made those baselines are completely incompetent, especially Gigabyte. They used a 1.7 mΩ loadline, which leads to absolutely insane operating voltages. Even Intel has called on motherboard manufacturers (especially Gigabyte) to remove that crap from the BIOS. Gigabyte already removed the BIOS update that implements the Intel Baseline. ASUS also uses a baseline with some utterly insane settings.
Nice to see your CPU and GPU are running cool, they should last a long time.
tbh the mobile/laptop setup is what I use more, to create content etc; a bit sad if it'll undergo "maintenance" as well.
It's being developed so I can enjoy using it anywhere, just like the desktop, and then bam, it hangs
The problem with these tests is that they limit the processor to games only, when the focus for some is clearly work. I'd like to see comparisons not only across games but also in video rendering programs. With these configurations, I believe Intel would come out ahead on productivity.
Of course this isn't a criticism of the channel, since the focus is game testing. But I think they could open the tests up to programs beyond games, since there's demand for that kind of content on YouTube!
14900k much colder, lower temps. 👌 Also, if you undervolt the 14900k it will be 39°C in games. Tested it myself for over a month, works perfectly, 39-43°C. In Ghost of Tsushima at 3440x1440, ultra settings, 160-185 fps, power draw 27w 😊 The rest of the games are the same story with the same fps (GPU: RTX 4080). Good luck 😉
Not bad amd! I’ll change my 10900f to 10900x3d in the future
I have an i5-12400f with a Gigabyte H610M mobo, currently using a GT 730, and I'm thinking of upgrading to an RTX 3060 or Arc A750. Here the RTX 3060 is 323 USD while the Arc is 240 USD. What do I do?
Why would you buy rtx 3060 lol, go for 4060 or even better 6700xt.
A750 is a really good buy if you know what you’re doing. It’ll take more time to set it up correctly but once you do it’ll perform great. If you have a little more cash to spend go for the A770 as it has 16gb. With this amount of vram, it’ll hold up for a while.
of course, take the 3060, ARC is not a gaming graphics card
of course, take the 3060, ARC is not a gaming graphics card
@@dohamakarov2720 why did you send it twice lol
I've noticed that in Horizon the GPU is like 10% or more utilized on the 7800X3D side than on the Intel side. I'm not a computer whiz, but I'm wondering how that is
It's bottlenecking: the CPU limits the GPU from reaching its full capacity
can you do some test without raytracing and DLSS?
I need AMD to challenge Nvidia even higher on the top end so I can still buy Nvidia.
Their GPUs may be iffy but AMD is king right now when it comes to CPUs and APUs. The 7800x3d is a beautiful little piece of tech.
Can you test more competitive games? And with optimized settings, 1080p low settings
GAMING KING👑
Have you updated your BIOS? There's no way a 7800x3d runs at only 65-85W
Mine runs on 100-120
The BIOS gives way too much voltage for no reason. Apply a -30 Curve Optimizer and enable PBO.
@@kazuviking same with Intel CPUs; they supply too much voltage on 13th and 14th gen CPUs. I myself switched the profile to the Intel Baseline profile in the motherboard, and it never goes over 250W
@@xovgamer The power limit will never fix it. You never let the core voltage hit more than 1.4V, otherwise it degrades the CPU.
@@kazuviking yeah
People with rtx 4090 doing 1080p gaming
but how? How can 8 cores beat 24 cores? I still don't understand
How is the 7800x3D for emulation like pcsx2, rpcs3 and all that?
I finished most PS3 exclusives on RPCS3 locked at 30 FPS at 4K or 1440p; sometimes 60 FPS was possible, but not always. PCSX2 is easy to emulate
@@dankuk765 I see, thank you! I'm torn between the i5 14600k, i7 14700k and the 7800x3D, can't decide which to buy and i love playing on emulators and want the highest possible fps on them...
@@darkmessiah9566 if you have more than 60 fps then you're good.
I wish AMD GPUs were as good as this, then NVIDIA's high prices and greed would not be seen.
How do you know they would be cheaper? The 7900XTX draws 100 watts more than the 4080 Super and doesn't have DLSS or comparable RT. In my country AMD is 10% cheaper, but I get 30-40% less. Is 10% really that big a difference on a card that offers less?
Imagine being a fanboy and arguing over a computer part...
AMD's standards have truly Ryzen.
And the funny thing is that between the cheaper 7700 and 7800x3D there is no difference in 2K and 4K.
AMD is the king of gaming. And that chip is over 1 year old now.
Amd uses LESS power COSTS LESS, and still decently outperforms the intel 😭😭
is intel getting better fps on higher resolutions?
the king who always loses? How is that possible?
I'm happy with my 7800x3d. To me AMD has the best CPUs and Nvidia the best GPUs; that's the best combo atm imo
Why is there such a big difference in the 7800 x3d tests in this video and in your video 6 months ago
Core i7 14700K vs. Ryzen 7 7800X3D - 10-game test
Called driver optimization.
This comparison is wrong because the GPU or CPU usage percentage should be the same for each. If the i9 has lower FPS due to lower GPU utilization, the result would definitely change if they were the same.
9800x3D will kill em all :D
This age not quite well
Intel New Bulldozer 2011
I was around for Bulldozer, and it was depressing..... but at least Bulldozer worked
my 7800x3d is so good it can probably easily last another 5 years, especially if you play in UWQHD or 4K
Cant wait to see the 9800x3d. I think it will leave intel even more in the dust.
Should be a comparison view of DCS…in 4K VR…DCS maxed out will kill any CPU…
It's an interesting situation with those 13th and 14th gen i9s. The people buying them, in my opinion, are enthusiasts and know that i9s run hot. And they seriously didn't think of undervolting them?
This combo is the dream for pc gamers
As powerful and as fast as the 7800x3D is, I went with the 13700K. AMD is great at 1 thing and 1 thing ONLY, and that's gaming.
No disrespect to the 7800x3D in any way though.
let me say this: do you need your PC to do other stuff like work, editing and other heavy tasks? Then you need to consider all the options and choose what's best for you (like more cores, saving money, more speed, etc). But if you need a PC just to play... well, why choose a CPU that costs almost 2x the money and costs more on your bills too?
@1907IceFire I wanted a balanced PC that's great for gaming, a powerhouse in productivity, and good for everyday use.
These new AMD chips get really HOT from what I know. Intel's do too, but it's not as bad as AMD. I did some benchmarking with my 13700K + 4080 Super on some games, and I was impressed regardless.
And considering I got the 13700K for a solid $40 less than the 7800x3D, I would count that as an absolute win, regardless of compared gaming performance.
@@1907IceFire you know that realistically the difference between an X3D and an i7 is less than 5% at 1080p WITH a 4090, so at 1080p on the absolute lowest settings the difference is small, and at 1440p and above they're evenly matched, while the i7 keeps ~80% more performance in overall productivity than the X3D. So you basically trade nothing in gaming performance for more performance overall. Even if you get a high-end chip like that for 1080p, not many will pair it with a 4090 to see that 10 fps difference between 250 and 260. I got a 13700kf for 100 dollars less than a 7800x3d, and like you said, I get the 7800x3d's performance at a much lower price. No reason for me to get a much more expensive CPU that gives me no performance boost, or one so small it would be unnoticeable, especially since I don't have a 4090
@@KoItai1 that's for sure, and that's why I said you need to check your needs before buying. Higher isn't always better when you build a PC, because the CPU won't be the only part where you need to invest more (cooler, power supply, bills, a GPU that matches your CPU well, motherboard...). If you can find a similar CPU at a lower price and you're on a budget, why not? A 13700kf is good if you saved $100, like a 7800x3d is good if you can save more than $100 compared to a 14900k. Every budget has its best must-buy CPU in the end; people with unlimited budgets have no interest in watching these things anyway...
@@1907IceFire true
Can I get a benchmark with these specs on the game "Lego Indiana Jones Adventure"?