Gotta remember, if you don't have a 4090 there's no difference between these CPUs. In fact, the gain from even an i5 13600K or R5 7600X to these CPUs is minimal if you don't have a 4090.
CPU tests are better when you turn the graphics as low as possible to show how many frames the CPU can pump out. If you run max graphics, you turn it into a GPU bottleneck and the test is pointless.
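To put rough numbers on that rule of thumb (a simplification, not an exact law): displayed FPS ≈ min(CPU-limited FPS, GPU-limited FPS). At max settings the GPU term is almost always the smaller one, so the two CPUs being compared never get to show their difference; at low settings the CPU term dominates and the gap becomes visible.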
One thing many people here are missing when praising the 7800X3D is that it has worse 0.1% lows in nearly every game, and its CPU usage is about 15-25% higher most of the time. The 13900K is overall a better CPU in my opinion.
@@notenjoying666 The 99 vs 101 FPS difference you're talking about is actually the average lol. In many games there is about a 40-60FPS difference when it comes to 0.1% lows, and it also has higher CPU usage, which means this CPU won't last that long with newer GPUs. It's mostly a gaming CPU, and even then it's failing to provide smoother overall gaming performance than a 13900K lol (higher average FPS doesn't mean smoother gameplay, just in case you try to make that argument).
Half the watts, about 20°C cooler at full load, and over $100 cheaper. I don't see why I should get the 13900K over this. Edit: Hopefully they fix the melting problems of these chips fast.
You can't fix something that has been engineered by trash. Same as when people think a car computer update will fix their car lol; the car will still have problems, it's purely a hardware issue, not software. I'll see what the 2023 or 2024 boards bring, because these 2022 boards are the problem and no update will fix companies cheaping out on everything they can.
I'm more impressed by the power usage than anything. I do more than gaming and use Quick Sync quite a bit, so I'm happy with my 13700K, but if gaming was my main concern the 7800X3D would be a no-brainer.
The comments are just pure pain. People will literally refuse technological advancement, and instead of being happy about getting more for less, they defend a CPU that's clearly worse value (for gaming). Like, wake up guys. Anyways, I wish everyone good game time, regardless of your hardware. Remember, don't let the PC end up like the consoles!
One thing I see is that yes, the power is lower with the AMD chip, but look at the CPU usage % left over between the AMD and the Intel: the AMD doesn't have much headroom left, whereas the Intel has tons left to use. I call that future proofing.
The same cooler was used, and the fan speed is adjusted according to CPU temp, not power consumption. More likely the 7800X3D is so hot for its power consumption level due to the stupidly thick heat spreader on AM5, which acts as a major bottleneck in Zen 4 CPU cooling.
The real answer is actually that the cache sitting on top of the die holds back the heat. The heat transfer from the die to the heat spreader is poor. That's why any cooler will cool it.
Honestly, I'm still going to stick with the 13900K. The games the 7800 beats it in, I don't particularly play or care to play, and most importantly, while sure the avg FPS is high, the 1% and 0.1% lows on the 13900K are better in most situations. I cap my game FPS anyway, and what matters to me is not having to deal with BS stutters during gunfights in Apex and similar. This is also purely game performance... and I plan to do more than just game, so... oh well. End of the matter is, I've been on a 4790K all the way up till now. What an upgrade it'll be, eh?
@@lethanhtung4972 Yep. I live in Estonia and the price of electricity has gone wild. I have a fixed price, 200€/month, which isn't much compared to the wild stuff I've seen: in winter some guy paid 600€ for heating, 800€ total, for a small apartment. The thing here is we have an energy company that isn't owned by the government, so they just jacked the price up to the moon, to the extent that a farm had to pay nearly 100K for a year's worth of electricity, and it wasn't a big farm or anything, just a privately owned one. It's quite wild what's going on here; I really don't care since I have a fixed price.
I've been testing the 7800X3D for a few months now, with many updates and the best RAM possible. It's a nightmare to avoid bluescreens. I managed to keep it stable at 5.15GHz, but it's still not as smooth as many claim; the 1% lows are better even on the i7. And no, I don't like Intel and their high prices for something that should cost less. In the end I would recommend Intel, unless you want to wait for more stable BIOSes and chipset drivers. With time the 7800 will beat anything, even at 1% lows. For now I'm very skeptical of benchmarks and overall performance.
The 1 percent lows tell me everything I need to know. Also there's the fact that AMD has a way worse DDR5 memory controller, usually only getting to 6000, while the 13900K can get to 8000.
I don't think the memory speed really matters that much. I think it's more about memory latency and timings. 6000MHz vs 8000MHz can't be a direct comparison, y'know.
@@SeeFlow-bo1dl Yes, very true, but Intel can also run higher frequency with lower latency, which scales properly. I've seen a lot of people easily overclock to 7800 CL32 or 7600 CL30 on Intel, while on AMD they're lucky to even hit 7000 at all and can only get those CAS numbers in the 6000 range. Despite the memory frequency gap, AMD has been competing well, but if they brought their memory controller up to par with Intel's, they'd absolutely destroy Intel.
@@notenjoying666 Nope, you're obviously new to the computing world. I have owned a Ryzen system and everything about it related to software was buggy, and still is today. Some of my friends own the latest and greatest Ryzen systems and they also find AMD buggy, but they bought AMD because it's more affordable.
Holy shit, Hogwarts is such a poorly optimized game! 😱 And many of my friends were drooling to spend 60€ on that PoS 🤣🤣 Edit: Conclusion from the tests: AMD is still the best bang for the buck.
No reason to go with the X3D chip unless you want a pure gaming machine with a 4090. Pair it with any GPU lower than a 4090 and you'll have identical results. The Intel chips also destroy this chip in every other metric, like multicore workloads. Also, people like to talk about power consumption in games but fail to mention total power consumption. My 13600K idles at 4 watts, and the 13900K is similar; the Ryzen 7000 series is known to idle at 20-25W+. Also, a Ryzen 7800X3D with mobo and DDR5 RAM will run you around $850, whereas a 13600K with mobo and DDR5 will be around $500-$600. Edit: People are stupidly calling me an Intel fanboy, so let's disprove that quickly. Despite me owning an Intel chip, I recommend the 5800X3D over both these chips. Why? Because CPU, RAM, and mobo = $450. Yes, that's the price of a 7800X3D alone. Again, the 7800X3D with RAM and mobo will cost $850. For 10-15% more performance with an RTX 4090, you're gonna pay $400 more. If you own an RTX 4090 and want the fastest CPU possible, by all means get the 7800X3D. But I know that's not the case for everyone, so it doesn't make any sense to pay $400 more for a CPU that's only 5-15% faster. Hence why the 5800X3D is the best value gaming CPU.
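Napkin math on that value claim, using the prices quoted above (the commenter's figures, so treat them as assumptions): $850 vs $450 is roughly 1.9x the platform cost for maybe 1.10-1.15x the frames, which works out to the 5800X3D setup delivering around 1.6-1.7x the FPS per dollar. The exact ratio shifts with the game and GPU, but the shape of the argument holds.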
@@knhgv Because most 13600Ks can be overclocked to 5.5GHz, which is the stock all-core frequency of the i9. In fact my 13600K runs at 5.7GHz, so it will mostly run faster than the i9 shown in the video.
@@ProVishGaming LMFAO, such cringe nonsense lol. A 7800X3D can easily be OC'd to 5.4 on base clock without any issues, what are you gonna say to that? An OC'd 13600K never did better than a 13900K by any means anyway, that's fake.
@@notenjoying666 No, the 7800X3D can't???? Don't watch a SkatterBencher video and think it's possible. The 3D V-Cache on the die is sensitive to high voltages, and you can see it frequently parks cores whenever it can. You need to win the silicon lottery if you want 5.4GHz. Also, your assessment of my 13600K is completely wrong. Unless a game is able to utilize all 8 P-cores (which is rare), my overclocked i5 does perform better than an i9 in games.
Games:
Returnal - 0:06
CYBERPUNK 2077 - 0:59 - gvo.deals/TestingGamesCP2077
Hogwarts Legacy - 2:10 - gvo.deals/TG3HogwartsLegacy
Horizon Zero Dawn - 3:24 - gvo.deals/TestingGamesH0D
CS GO - 4:34
Spider-Man - 5:41 - gvo.deals/TestingGamesSpiderManPC
Forza Horizon 5 - 6:30 - gvo.deals/TestingGamesForza5
Microsoft Flight Simulator - 7:38 - gvo.deals/TestingGamesMFS20
Hitman 3 - 8:48 - gvo.deals/TestingGamesHitman3
A Plague Tale: Requiem - 9:47
System:
Windows 11
Ryzen 7 7800X3D - bit.ly/43e3VxW
Gigabyte X670E AORUS MASTER - bit.ly/3BRDIIG
Core i9-13900K - bit.ly/3SgY3xf
ASUS ROG Strix Z790-E Gaming - bit.ly/3scEZpc
CPU Cooler - be quiet! Dark Rock Pro 4 - bit.ly/35G5atV
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz CL32
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
Please test Cyberpunk at 1440p and 4k both with ray tracing and DLSS3 enabled. Would love to see the numbers. Highest quality settings.
Do you have the SMT patch enabled in Cyberpunk? It gives a decent boost on Ryzen CPUs with 8+ cores.
2:40😂
The power usage of 7800X3D is very impressive.
Almost half huh
what's idle gonna be? Intel is much more efficient overall.
@@ProVishGaming lmfaooooooooooooo no
@@megadeth8592 My 13600K idles at 5W. The 7800X3D idles at 25W+. Tech Notice made a video comparing the CPUs and found that in daily consumption the Intel chips are much more efficient. Unless you render videos or play games every hour you use your PC, the Intel chips are more efficient.
Even though Intel uses double the power, I'm still impressed how it keeps the thermals low...
Love this back and forth between AMD and Intel; CPUs are so fast now that even the low-end stuff is great.
Waiting to see the same happen for GPUs.
Honestly, for most gamers I feel like a 7600X or 13600K would suffice just fine. The 7800X3D is overkill in my opinion.
@@kotztotz3530 lmao braindead
@@kotztotz3530 Depends on the GPU; you don't want to bottleneck your 4090, and you've gotta consider future titles as well.
@@kotztotz3530 Bro's comparing a trash 6-core CPU to the best CPU in that range 💀
Half the watts and similar performance. That's impressive AMD. Good job.
Not similar, 7800x3d beats 13900k.
@VorT3x lol, say what? E-cores are unsuitable for gaming, so they will never be utilized for gaming. Your argument makes no sense whatsoever.
And even if games ever get to the point where they can actually make use of 12-16 or more threads, AMD offers CPUs with up to 16 real cores, which will always win against an e-core.
@VorT3x The e-cores were never meant for gaming
Just background tasks
@@really7187 My i7-12700K gets 740+fps averages on the CSGO ulletical benchmark, and the power draw of the 7800X3D matches my 12700K's power draw on a bot deathmatch. E-cores can't help in CSGO, but it's kinda embarrassing when my 12-core, 20-thread chip uses less power and gets higher fps than AMD's overhyped and gimmicky 8-core 3D chip.
Idle power draw is just 5 watts.
Obviously a mediocre CPU looks nice next to a disaster. Meanwhile tech people still ignore the 12700K for some reason, even though it's the best selling Alder Lake chip; there was a deal on Amazon for $250. Not even the Ryzen 7 1700 at launch looked that amazing.
@@fystry The e-cores can be used for anything, not only for background tasks. The only reason for their existence is for Intel to maximize performance on their monolithic chips (which obviously lowers power consumption for lightly threaded loads as well).
I managed to use them in Portal 2 for making test chambers.
The people who say that e-cores are useless for gaming may be:
- Ignorant.
- Defending badly optimized games like Hogwarts Legacy, which can't max out a 4-core, 8-thread i7-4770K from 10 years ago, while that chip gets almost twice the 1% lows in Cyberpunk 2077.
5800X3D : I'm so proud of you, son
My 5800x3D is weeping tears of joy. The future of AMD is bright.
My Core i7-12700K: "oh, finally some good competition for my power efficiency".
Note: the wattage of the 7800X3D on the CSGO ulletical benchmark is the same wattage as my 12700K on a bot deathmatch AND using the integrated graphics at the same time.
@@saricubra2867 The platform is not dead though. That's why I went with AMD.
@@djam7484 who asked
@@djam7484 Platform is a moot point if I have 12 cores and top-tier IPC. I'm only behind the Zen 4 Ryzen 9s, the 13th gen i7, and the 12th/13th gen i9s.
I'm really impressed with the power draw of the 7800X3D while still giving such performance.
@@ilovebanana177 You don't say?
You could easily run the full desktop chip in a laptop. How cool
I'm not impressed, because on the CSGO ulletical benchmark my 12700K gets 740+fps, and the power draw of the 7800X3D at far lower fps matches my 12700K on a bot deathmatch, which is far more CPU-intensive; and I have 12 cores, 5GHz max clocks, and higher IPC (on the big cores).
Intel made a mistake with the i7-12700K because it's so good: it holds up pretty well and makes current Intel AND AMD chips look stupid.
@@saricubra2867 CSGO is your argument? You get 740+fps? Do you have a monitor that runs at 740 hertz XD ???????? The Asus 1080p 460-hertz monitor is the highest-refresh screen there is now, and I'm sure you didn't pay around 1000€ for that thing. Even if you had it, anything above 460fps would be pointless.
@@Deathscythe91 It isn't about seeing 700fps, it's the lower INPUT LAG at 740fps vs 460fps...
A faster processor means a snappier gameplay feel.
7800x3D... the new king. Roughly the same performance (sometimes better) as the 13900k, while using significantly less power. Fantastic job, AMD!
The 13900K is a much better CPU overall though. For just $150 more it's a no-brainer, unless you're a pure gamer who never multitasks.
@@teemuvesala9575 nobody is disputing the greatness of the 13900k, but for gaming, I’d take the 7800x3D.
@@teemuvesala9575 Look at the 0.1% lows on AMD again; stutter-fest CPUs, just like their GPUs.
@@APN34Z Spotted the Intel fanboy in every comment section
@@-Ice_Cold- Don't forget the Nvidia shill too 😁😁
How efficient the 7800X3D is is really insane!
With all the BIOS issues and bugs, I'd save the headaches and go 13900K. Pretty much everyone I know goes with a 360 radiator and a 1000W minimum PSU on new builds nowadays anyway, so the power draw is no big deal.
@@993mike Nah, I'd rather burn my CPU than burn my house down. The 13900K is nice, but you can't do anything with it unless you have a 360mm AIO and a 1000W PSU. Plus AM5 will be supported for a long time, while the 13900K platform is a dead end. You can slap an air cooler on a 7800X3D or 7950X3D, be happy for the next 4 years, then upgrade to an 8000 or 9000 series with just a BIOS update.
@@Alp577 Also, Intel motherboards are rather limited.
@@993mike Those are only on Asus boards.
@@Alp577 wut, the video shows they operate around the same temp (though Intel does use more wattage when gaming).
The 7800X3D's efficiency under load is pretty insane
just due to process size
@@siniyden wrong
@@pedrof3232 right
Imagine having 7800X3D in a laptop without any downclocking, that would be revolutionary
Bro the battery
Yeah, except for the laws of thermodynamics, that's a great idea.
I think he meant that it is so low in wattage that it can almost compare to the current HX/HS CPUs
@@TravisAntonio AFAIK 7800X3D has lower wattage than 7945HX, so it is actually possible to put it in a laptop without downclock
Yeah, sounds great, but two main issues:
1. It's not a monolithic CPU such as a 6900HX, so the idle consumption is extreme; more than 4-5 hours on battery would not be possible.
2. The thermodynamics are quite a feat; due to the stacking of the cache it's quite hard to cool the CCX itself.
20% cheaper, as much as 50% reduced power consumption and virtually identical performance as that 13900K.
This truly is AMD's gift to gamers.
In games that use the cache it's much better. Like in Dota 2: 25% better.
Idk why, but the AMD one seems to push the GPU a little more, so more power consumption on the GPU. Which means, yeah, both will probably consume about the same amount of power. Just a word from a guy using an i5 12400F 😂
@@leanmeanbaguette
Several reviewers have measured total system draw and the Ryzen still pulls way less power
@@stebo5562 Nice! I would definitely love a 7800X3D. But sadly I chose an Intel mobo.
@@cotasa no never 25.
Lol the lady in hogwarts running in place at 2:39 🤣
I was looking for this comment
She is doing some workout stuff or something 🏃♀️🏃♀️😆😆😆
Unlike in October when Ryzen 7000 was announced, in today's climate where AM5 motherboard prices have dropped, and DDR5 prices as well, Ryzen is a no-brainer for games and still has an upgrade path for probably at least two more generations, unlike the 13900K, which is the end of the road.
The i9-13900K has 24 cores and 32 threads.
It's a pseudo-Threadripper; the upgrade path argument is so stupid for that kind of CPU.
@@saricubra2867 I specifically made my comment about gaming, not about productivity tasks where the 13900K is better, which is why I can conclude from reading your comment that you obviously did not read mine carefully. In gaming, the number of cores is not everything; remember just a few years ago, when the first Ryzens had 8 cores, Intel's 4-core i7s were better in games. So core count is not everything for gaming (if it were, Xeons and Threadrippers would be the best gaming CPUs, and they suck for games), and you are obviously not very tech-versed but still felt confident enough to call my argument stupid. You're just an Intel fanboy. That 8-core CPU is beating, or roughly matching, a 24-core CPU in games at a much better price and with drastically better power consumption. And calling the upgrade path stupid, LOL: imagine in 1-2 years, when you'll be able to just update the BIOS and put a new 9000X3D chip in the same computer, with no extra cost except the CPU, and save hundreds of dollars. You won't be able to do that with your 13900K, dude; you'll have to buy a whole new PC to have that top-of-the-line 15900K. If you still think the argument is stupid, you are beyond help.
@@saricubra2867 But wait, do you want to buy a platform (mobo + CPU) knowing that the same year there will be a new socket?
@@saricubra2867 I would wait for 14th gen if I really wanted to buy Intel, so as to at least have 15th gen support.
@@madmax1436 OK, but exactly what's the point of having future support for, let's say, 4-5 years? I don't mean whether it's a better policy, because it is. I'm asking how exactly it benefits the average user any differently from 2-year support, because most users wouldn't change their CPU for at least 5 years, and by that time the chipset would be completely obsolete for a newer CPU anyway, since a lot of features would be locked or delayed. With a 24-core CPU like the i9 it makes even less sense, because for the average user that would last like a decade, and by then any chip on the same mobo would be completely garbage.
The power usage difference is insane. I’d go 7800X3D.
just due to process size
I didn't feel like waiting for the 7800X3D to launch, or waiting for the 7950X3D to be in stock after launch, so I got a 13900K. I don't regret my purchase, but the 7800X3D is indeed a great value.
While performance is similar in games, in productivity the i9 would destroy the 7800X3D. So I am debating whether to go for the 7800X3D or just a 7950X. At 1440p or above the gaming difference won't be as pronounced, but in any kind of productivity it will be.
13900k🤤
@@irishRocker1 You are better off going for a 7900X, really. Do you really want the highest frames, or comfortable render times?
What cooler do you have to tame it below 100°C at high loads?
This CPU looks like the last one I will ever have to buy.
I am so bloody sick and tired of watercooling and 6 fans in my case and massive air coolers.
Small form factor winner right here.
Same. I had a full tower with hard tubing, a 3090 Ti and a 10900K, and that was a headache to maintain. Wasted so many hours for a little gain. Now I'm running a 7950X3D and a 4090 in a 13-liter case that runs circles around my old build. Best decision ever!
Lmao 😂😂😂😂
Yeah, but you don't make the same comment about an Nvidia GPU that uses 400 watts or so.
@@Alp577 Great combo
Yep, I'm with you. Last gen the 5700X was the too-late CPU for our DAN A4 4.1 build; the 7800X3D seems awesome for that rig.
It costs less, draws less power (resulting in lower upkeep cost and heat), and also gives you more fps in games. I almost feel bad for Intel.
Exactly. The 13600K, 13700K, and 13900K? For the first you're strongly recommended to use an AIO. For the latter two it's a nearly mandatory recommendation. That's a big chunk of change that could go to a GPU.
The Intel parts all use more power, which requires a more expensive power supply. Which is more money which could go to a GPU.
The Intel parts produce more heat, which requires more cooling to control (and thus more noise).
The 7800X3D is the perfect solution for a gaming PC in 2023. Unless you're utilizing Blender, Cinebench, VRay, etc... on a regular basis you do not need a 13700K or 13900K.
This CPU is only for gaming, and it does not cost less: when you buy the CPU you must buy a suitable mobo and RAM for it, and the mobo and DDR5 for the AMD CPU are more expensive than for Intel. The only thing I blame Intel for is the heat; I don't care much about 50W.
I don't feel bad for Intel at all... they held the industry hostage at 4 cores for nearly a decade, asked a hefty price, and got ridiculous profits out of it. They nearly bankrupted AMD quite a few times by keeping AMD in litigation fights (which Intel LOST, by the way...) to keep AMD incapable of actually investing and competing...
NOW we see WHY Intel did everything it could to STOP competition... when pushed, it can only compete by burning itself into the ground... more heat, more energy consumed... and it still lost by some margin in software that is often optimized for Intel FIRST....
We need MORE competition, HEALTHY competition in which companies do their best to compete. Not what Intel, MS and Nvidia do: throwing money at the problem and not doing the hard work.
@@lethanhtung4972 Bro, what? There are AMD mobos for 90 euros lol. And the RAM is literally the exact same price for Intel/AMD?
It definitely costs less. And yes, we are talking about gaming here. Where in my comment did I mention productivity? Also, it's a video only comparing them in gaming, so why assume that people care about productivity?
@@lethanhtung4972 How can the RAM cost be different?
Damn good efficiency...
its been great idling efficiency
I currently have a Ryzen 5 5600X, but I can already see my future upgrade, and that Ryzen 7 7800X3D is a very good option.
The price, efficiency, and thermals of the 7800X3D are just incredible 🔥💘
excellent work team red
@@CAT_CUTIE what are you talking about
Price and efficiency are top notch; thermals, though, suck big time! Same temperature as the Intel part while using half the watts.
@@CAT_CUTIE But this is the i9 13900K, we're not blind.
@@baiaioangeorge8946 who cares about temperature?
@@CAT_CUTIE In productivity? If you're talking about gaming, then you're a delusional Intel fanboy.
The 7800X3D seems like a beast! But in my experience, in actual gaming I find Intel to be a lot smoother.
I used to get odd stutters in games with the 5800X3D even at 240 fps; I recently switched to a 13900K and have no stuttering issues at all!
That's the statement I was looking for. Intel for CPU, Nvidia for GPU. I wish AMD were as good in real life as it is on paper. That's odd.
Your observations are not supported by the 100s of benchmarks available from various review sites with 1% and 0.1% lows figures. Also comparing 5800x3d against a 13900K?
@@happysickfreak8415 How bad? 0.1% bad? 😂 It runs circles around Intel in efficiency. Intel should be glad AMD is going for power efficiency, not brute power. Because once AMD ups the TDP and lets the Zen architecture loose, it's game over for Intel.
@@happysickfreak8415 Are you basing this only on this one guy's experience? Maybe he just has issues with his system.
The AMD CPUs do have stutter problems; it's something with Windows that hasn't been fixed. Intel is set up so simply that Windows has no problem distributing the workloads correctly.
I love how AMD is now taking the lead in gaming performance
If you say so lol
@iDTecKt They are 💀 a $400 chip is outperforming Intel's flagship.
@Lurzyy people buy i9 not for gaming, lol. There's i5 and i7 for this kind of stuff
@@LowRiderFuckYou No shit, but the 13900K is still Intel's best performing CPU in gaming, so it's still a valid argument.
AMD numbah wan! #1
My 5800X3D + 64GB RAM + 3x 2TB 980 Pro + RTX 4080 hits 400W max from the wall... under load.
It'd probably be 370W with the 7800X3D? Crazy performance for the amount of power.
What, that little power usage? How is that possible, dude? In that case a 550W PSU is enough for your components, right?
My 13900K at 6GHz + 4090 OC setup takes about 1000W. Not bad.
@gorege jones Yeah, I suppose, but remember GPUs can have power spikes for a few seconds... not exactly sure by how much.
Also, I'm using a 1000W platinum PSU. The 1000W is useless now LOL.
The GPU pulls 280W max during Metro Exodus Enhanced Edition with RT... obviously the CPU is not fully used. I tested with a power meter; some games are 370W... even 350W under full load from the wall :)
@Turtle Damn, that's crazy. I ordered a 4090 but cancelled because of how much I game; I would pay a fortune for electricity.
@@MrEditsCinema For me it was more that I mostly play older games and e-sports titles. I had a 4090 as well, but returned it and got a 4070 Ti instead. Plenty for my use. Got a 13600K paired with it.
Which program are you using to check stats?
Intel's 13900k is a silicon work of art but Ryzen 7800x3d is just perfect for gaming
Intel's fall from dominance since Ryzen is so hilarious to watch; that's what they get for being fucking complacent.
Bro, you have no idea what you're talking about; Intel's mid-range lineup, such as the 13600K, is killer value.
@@soupdragon151 Tell me what CPU competes with the 13600K's $300 price and performance in games and rendering? None, lol. Only the 7800X3D saved Ryzen this generation.
Like almost every reviewer and commenter on this video said, the watt-to-FPS ratio is REALLY impressive! Living in a country where electricity is REALLY expensive, just that would be reason enough for me to lean toward AMD instead of Intel, but I'm currently very satisfied with my 7700X with the -30 PBO curve applied!!
what is -30 PBO
@@s19hu Precision Boost Overdrive (PBO) lets the processor run beyond its defined power and voltage limits, up to the limits of the board, and boost harder for longer than default. The -30 is a Curve Optimizer offset, which undervolts the chip at any given clock; that lowers temps and usually lets it hold boost clocks longer.
@@HighIgnition okay ty!
@@HighIgnition now that's how an enthusiast explains 👏🏻
@@adityadivine9750 thanks!
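For scale (rough community numbers, not an official spec): each Curve Optimizer step is commonly reported as about 3-5mV, so a -30 offset amounts to something like a 100-150mV undervolt at a given point on the voltage/frequency curve. How far you can push it varies chip to chip, which is why people tune the offset per core.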
To be noted: Cyberpunk still needs the Ryzen SMT fix from CyberTweaks to fully leverage Ryzen CPUs.
AMD finally did it, and not just in price/performance or power efficiency, but simply on all fronts 😊
Be careful, because the AMD motherboard makers built them as cheaply as possible, which causes them to burn, as mentioned in the post above. It's purely the fault of the motherboard companies and not AMD; it's like buying a car with paper wheels and a paper engine... Wait for next year's motherboards, not this year's.
The 7800X3D is just sipping 45-65W for this level of performance. It's absolutely insane.
Purely gaming in mind: 7800X3D. Gaming and productivity in one: Intel. Don't mention wattage, because anyone buying either of these CPUs can obviously afford to keep the lights on.
At this power consumption it would be an absolute beast for laptops; low power = higher battery life + good thermals for laptop coolers. Very good job.
7800X3D is trading blows with the 13900k and uses a hell of a lot less power? Sign me up!
I've got the 5800X3D & the RTX 4090 and use them only for 4K gaming, capped at 115 FPS on a 120Hz LG OLED C1 65". So for the next 2-5 years I don't need any other upgrade.
It's amazing to see an 8-core chip trading blows with the absolute high end from both sides while using much less power. AMD really showing Intel that Innovation over raw power can win in the long run.
Well yeah, as Intel's P-cores are so much better in other workloads and have better IPC. The 3D cache really helps much lower clocks and slightly weaker cores trade blows. Though to be fair, both have only 8 P-cores, and few if any games benefit at all from more than 8 cores, so in reality it's 8 cores against 8 cores. Still impressive that the AMD 7800X3D trades blows with 8 much higher clocked Intel P-cores while also using way less power; it's just not like AMD is trading blows with a much higher core count, as we all know very few games benefit from more than 8 cores or from e-cores.
@@Wolverine607 Some games do actually benefit from the e-cores; there was a video made about it. They work together with the P-cores and improve FPS slightly, plus the 1% and 0.1% lows.
Hilarious how the 7800X3D is doing the same or a similar job at nearly HALF the power. But I think the i7 13700K makes more sense, since the i9 is usually the productivity line, like the 7950X.
AMD for gaming, Intel for both gaming and productivity workloads. I'm pretty sure the i7 is 88% of an i9 in rendering and compiling anyway, so the i9 is purely for enthusiasts.
Ryzen 7 is once again the king of gaming; a very good, very balanced processor. Now it just needs to come down a bit in price, and the motherboards too.
I grabbed it really cheap, and now prices are sky-high: 400 euros or more, and that's just the CPU.
The 7800X3D is certainly impressive.
But don't forget, the 13900K will be at least 2x faster in video processing.
I personally chose the 13900K. When I game, I disable the efficiency cores and consumption drops by about 30W; if I need to process something, I enable them.
From what I've read, everyone here judges and compares these two processors only by games.
True, and that's the thing: if you're buying purely for games, the red team is ideal. But I personally won't use it only for games, so that affects the choice. It also depends on what you render; in DaVinci I render through the GPU.
As I thought, Intel's 13th gen CPUs are good but not excellent; here come the AMD CPUs, and they are excellent 👌
I thought Intel would beat AMD in Hogwarts, I was wrong.
You can get more out of the 13900K if you pair it with faster memory; however, the power consumption of the 7800X3D will remain unmatched.
Yeah, there's much more potential in the 13900K chip; just looking at the overall CPU usage on both CPUs tells you that.
AMD released a new BIOS update and now the 7800X3D is even 30% faster. You must redo this test, since AMD themselves said the 3D cache was not working in current Windows and needs an updated scheduler.
75 fps in Hogwarts Legacy at 1080p with an RTX 4090. Jesus Christ.
It's because of the Ray Tracing, that game does NOT like RTX
@@BlackHayateMX I get that but holy moly. Imagine getting barely above 60 fps with a flagship current gen GPU…
@@ag8912 It gets better: NVIDIA itself admitted that Cyberpunk on the best raytracing tech they have gives you 16 fps with their flagship... unless you do the DLSS stuff, then it gives you 128 lol
@@BlackHayateMX I really want to be a PC gamer but man is it hard these days lol
Hogwarts Legacy is poorly optimized garbage, watch that NPC bug at 2:20 on the right, this joke of a game shouldn't be used as a benchmark until it's patched (like Cyberpunk 2077 now).
A really CPU-intensive game like Minecraft would be a nice addition for CPU testing.
2:38 Run, Forrest, run!
Magical treadmill
I'm guessing the 0.1% lows are gonna be more stable after some driver updates.
Wow, that 7800x3d is sick. Even better than the 7950x3d. Love It.
Noticed you are using a 6000MHz RAM kit on that 13900K. I bet the 13900K would eke out a couple more wins with a 7200MHz+ kit, although I guess you could argue that you don't need an expensive RAM kit for the 7800X3D. Just a thought 😅.
I feel like CSGO really hates Ryzen CPUs, but overall it's similar or even better performance in other games.
Heavily optimized for Intel (they've sponsored its main tournaments since inception...). Why, oh why, does Intel still win in some games even when the architecture is already so obviously surpassed, eh?
Look at that Cyberpunk... yeah... same with Nvidia... but things ARE changing...
@@TSEDLE333 Naah, that's a misconception; AMD CPUs aren't as good as they show, lmfao. And also you don't wanna compromise 50% better productivity performance for 5 percent better gaming performance 😂
Source engine does. Actually, all midrange AMD CPUs perform a lot better (like up to +150-200fps) in CS2 on Source 2, and the 5600X outperforms every Intel CPU at the same price there.
@@TSEDLE333 Just like Horizon Zero Dawn is an AMD title 😉
@@ImBrilliantz11 Wtf are you talking about? AMD CPUs have been banging Intel's CPUs for multiple gens now; you think AMD just got huge market share because they're so nice? Nah, their CPUs are great for the price, the 7800X3D is much cheaper as well, and most gamers don't care about a little less productivity. Most don't even use their PC for anything other than gaming and browsing. Just take the L, fanboy.
What monitor resolution are you running for the Microsoft Flight Simulator test?
Jesus! With half the power consumption of the 13900K, the 7800X3D still manages to win. And it is $200-220 cheaper at retail; moreover, the platform (mobos) is cheaper on the AMD side too. Holy bananas, AMD! Good job.
Eh, no it didn't? The 13900K won more games, and not to mention it's a much better CPU outside of gaming. The 7800X3D should be $350.
Results don't match up with other channels. Don't know which results to trust.
Insane how it's able to achieve the same or more fps at only half the wattage; it would be a bill changer for people who play games for long hours.
how? are you talking about the $5 a year of everyday use lol
@@kodakblack531 cant u read?
@@yinitialize4930 yes I can read that you just changed your comment.
@@yinitialize4930 I still don't think people will care about $5 a year if they're getting better perf.
Just FYI, you can get more fps than in this video by overclocking a 13600KF with $70 B-die DDR4, and that'll save a lot more than $5 a year.
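For anyone actually pricing the power gap out, some napkin math (assuming a ~75W average gap under gaming load and $0.15/kWh, both of which are assumptions): 75W x 4h/day x 365 days ≈ 110kWh/year ≈ $16/year. At European prices around $0.40/kWh that becomes ≈ $44/year, so whether it matters really depends on where you live and how long you play.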
Fluidity of motion and animation seems better on the 7800X3D? There seem to be micro stutters on the 13900K. Were both used with the same physical video card? Do you see it as well in the vid? Is it a matter of video editing/capturing/YT, or is it noticeable in real life as well when played on those machines?
Honestly? The only thing I can see is that the 7800X3D is about the same in gaming as the 13900K. Seriously, in some games the i9 wins, in others the 7800X3D wins, and a lot of them are basically a tie, or the fps is so high it doesn't really matter. Also, the argument that the 13900K runs too hot: that just happens in synthetic benchmarks; in gaming, where the load on the CPU is not high most of the time, the 13900K even runs cooler than the 7800X3D, as you can see in most cases. Also, if you game in 4K, it REALLY doesn't matter which of them you get.
Doesn't matter until you take the $150 into account.
With AMD you use way fewer watts, and you can upgrade the CPU to a new one in 2-3 years if you want.
With Intel you can't do that.
Intel fanboys really showing how upset they are over a fucking brand.....
I have a 13900k and I totally agree with this assessment. Both CPUs are fine and you can't go wrong with either one. Grab the one you want and go play some games.
2:40 look at this woman 😂
Thanks for this. For some reason, this is the only game benchmark channel that actually SHOWS wattage, thermals, and usage. Like, seriously, why is it so taboo to put them in a benchmark?
Because they don't know how to spend 15 mins customizing their Afterburner overlay.
I've just got the 13400F; it's enough to play at 4K@144Hz. You simply have to disable the power limits in the BIOS, set the FSB to 102.9MHz, and get 6000MHz RAM :)
That's so bad on so many levels.
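For context on that FSB/BCLK trick (napkin math; the 102.9MHz figure is the commenter's, not a recommendation): raising the base clock from 100 to 102.9MHz scales every clock derived from it by ~2.9%, so a 4.6GHz boost becomes roughly 4.73GHz. That's also why it's risky on locked chips: cache, memory and other domains tied to the base clock get dragged up by the same ratio.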
WOW AMD did it very well!!!
Thanks for the video!!!! What CPU cooler do you use on the 7800X3D? The be quiet! Dark Rock Pro 4, correct?
I am waiting for the Ryzen 7 9800X3D so this one gets cheaper 😅
You'll be able to get it for around $280 when Ryzen 8000 comes out.
8000 won't get an X3D release; they skip a generation and release every 2 years, alongside Nvidia GPUs.
It's weird how modern games don't look any better than games from ~10 years ago but still become more resource-hungry year after year. Now that's a perfect illustration of inflation, kids.
How AMD just put a little more cache on their CPU and destroyed Intel's top of the line is mind-blowing 🤣
It wouldn't be mind-blowing if you had a little knowledge about CPUs.
The X3D stuff is a bit odd when you think about it. In games where V-Cache helps, it's great. In other titles, the lower clock speeds needed to protect the 3D cache from heat mean lower FPS.
Basically, for anyone who plays a ton of different titles, it's hit and miss.
It's pretty much great for every game but old CS:GO and old Valve titles…
which will give you above 500fps anyway, and until 1000Hz monitors exist… that's pretty good
Excellent result, AMD delivers.
Power efficiency is important to me. Intel has some great performers but they need to upgrade their efficiency in the future for me to consider upgrading to them. If AM5 has close to the longevity of AM4, then it would be a great platform long term
Yup. Wish I'd known that the 65W-variant CPUs really went up to 200 watts when I was buying an i7 back in 2020. I don't bother gaming at all in the summer; even at 80°F it's disgustingly hot, like an oven.
@@slumy8195 Yeah, it sucks when some CPUs pull as much power as a mid-range GPU. That's the main reason I went for the 5700X over the 5800X: 65W TDP, but it doesn't pull more than 75 watts and the temps barely get my case warm. So any future CPU I get will have to be on AM5, be it a non-X chip or even an X3D chip that seems to be good on power and temps like the 7800X3D. Would love to see Intel follow in their footsteps for power efficiency
The problem with Ryzen is the 0.1% lows; compared to Intel processors in that regard, Intel is more stable, and you can tell from the fluidity of the games. On the other hand, Intel consumes too much power, making it the worst choice in terms of efficiency. Hard to tell which is best; it will depend on how strict you are with certain games.
Where are you finding poor 1% lows with the 7800X3D?
To everybody sitting here talking about how impressive the power draw is vs the 13900K: can we address the fact that it's also LITERALLY less than half the chip? It's an 8-core, 16-thread CPU vs a 24-core, 32-thread one; it's like comparing how much stealthier a silenced pistol is vs a light machine gun. Or throwing on Cinebench or some productivity tasks and bragging about how much the 13900K stomps it. It's a 5nm chip with 1/3 the cores and 1/2 the threads of a 10nm chip; I sure as hell hope it's a lot more power efficient
The 7800X3D using a FRACTION of the power the 13900K uses is a sight to see! The efficiency and performance are NUTS.
Intel has serious problems with their architecture if they plan on retaining it moving forward.
13th gen Intel can be undervolted more than usual, which greatly reduces heat & power usage. Either way, the 7800X3D is impressive.
When I bought my i7-12700K, the motherboard was giving me 240 watts because of 1.42 volts for some reason on Cinebench. Now it draws 150 watts and runs around 1.21 volts.
Intel motherboards have weird voltages out of the box.
The 7800X3D is an 8-core chip vs the 13900K, which has 24 cores.
My i7-12700K is as efficient as that 7800X3D for games.
While that is true, the CPU is only a fraction of the total power use of a PC.
Hmm, interesting comparison, but the 1% low and 0.1% low differences between AMD and Intel are quite huge in some games. Does it matter in real-world performance?
It does. You can feel a bad 0.1% low as a stutter; you can't feel an extra 20fps when you already have 200+.
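Since 1% and 0.1% lows come up constantly in this thread, here's a minimal sketch of one common way overlays derive them from a frametime log. Tools differ in the exact method (some take a single percentile frame instead of an average), so treat this as an illustration, not the video's exact procedure:

    # One common way to compute 1% / 0.1% "low" FPS from frametimes in ms:
    # average the slowest fraction of frames, then convert back to FPS.
    def percentile_low_fps(frametimes_ms, fraction):
        worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
        n = max(1, int(len(worst) * fraction))        # e.g. the worst 1% of frames
        avg_ms = sum(worst[:n]) / n
        return 1000.0 / avg_ms                        # ms per frame -> FPS

    frametimes = [6.9, 7.1, 7.0, 25.0, 7.2, 7.0, 6.8, 7.1]  # made-up sample data
    print(percentile_low_fps(frametimes, 0.01))   # 1% low
    print(percentile_low_fps(frametimes, 0.001))  # 0.1% low

This is why a single 25ms hitch tanks the 0.1% low while barely moving the average, and why it registers as a visible stutter.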
Nah mate, AMD CPUs are miles ahead of Intel for gaming
Draws less power than a 13600K
Costs like a 13700K
Performs the same or better than a 13900K
Such a legend.🥰
This is looking like a new legendary CPU in the making.
@@RicochetForce Damn, kids are kids. *productivity is a thing that exists* And for your kind info, the 13700K is far better in every aspect other than gaming compared to the 7800X3D, and my illiterate man is comparing the 13900K, which is 50%+ faster in productivity than your garbage 7800X3D, and calls it a legend LMFAO
Only in gaming. One-trick CPU lol
@@xTurtleOW The 13900K is only better in workloads. One-trick CPU lol.
UserBenchmark enjoyer -> avoided.
My particular 12700K also can draw less power than a 13600K, not impressive.
The power consumption of the 7800X3D is freaking impressive
Even a 37mm low-profile cooler can handle it with ease
No. You must limit PBO2 for air cooling on any X3D anyway. Actually, liquid is a must for all X3D chips. Low power consumption doesn't mean it runs cold.
@@notenjoying666 Is that so? disappointing 🥲
Just put a Noctua NH-D15 on it and you're fine
Gotta remember, if you don't have a 4090 there's no difference between these CPUs. In fact, the gain from even an i5 13600K or R5 7600X to these CPUs is minimal if you don't have a 4090.
Not true at all when the game is CPU bound
CPU tests are better when you turn the graphics as low as possible to show how many frames the CPU can pump out. If you run max graphics, you turn it into a GPU bottleneck and the test is pointless
People here praising the 7800X3D are missing a few things: first, it has worse 0.1% lows in nearly every game, and its CPU usage is about 15-25% higher most of the time. The 13900K is overall a better CPU in my opinion.
Such a big deal, 99 fps against 101 lol. "Huge difference."
Oh wait.
We didn't include the price factor here.
This CPU costs like a 13700K...
🤣🤣
@@notenjoying666 The 99 vs 101 FPS difference you're talking about is actually the average lol. In many games there's about a 40-60 FPS difference in the 0.1% lows, and it also has higher CPU usage, which means this CPU won't last as long with newer GPUs. It's mostly a gaming CPU, and even then it's failing to provide smoother overall gaming performance than a 13900K lol (higher average FPS doesn't mean smoother gameplay, just in case you try to make that argument)
Where did you see worse 1% and 0.1% lows than the i9 13900K? In every game except fps games like CS:GO, this CPU has higher 1% and 0.1% lows than the i9 13900K
The 13900K is an all-around CPU, not just for gaming.
Linus Tech Tips says this is just the beginning; the gap will get larger as games take more advantage of 3D cache over time.
Half the watts, about 20°C cooler at full load, and over $100 cheaper. I don't see why I should get the 13900K over this.
Edit: Hopefully they fix the melting problems of these chips fast.
You can't fix something that has been engineered by trash. Same as when people think a car's CPU upgrade will fix their car lol; the car will still have problems, it's purely a hardware issue, not software. I'll see what the 2023 or 2024 boards bring, because these 2022 boards are the problem, and no update will fix companies cheaping out on everything they possibly can.
7800x3d 👍
The blue team is really getting swallowed whole here. I hope they finally start dropping prices on their overpriced CPUs.
I'm more impressed by the power usage than anything. I do more than gaming and use Quick Sync quite a bit, so I'm happy with my 13700K, but if gaming were my main concern the 7800X3D would be a no-brainer.
The 13700K will also be better for multitasking, as you can run workloads in the background while gaming without a huge performance penalty.
Overclock your 13700K and it will be faster in some games :D
@@jondonnelly3 Not interested in overclocking. In fact I've undervolted and dropped power limits a bit to keep temps under control on air.
Normally the 13900K would cost $1000+, but because of AMD we have it at $600, and it's still expensive. AMD is keeping both prices honest; big win for AMD.
Very nice and efficient. It's a shame this came out so late after I had already done my build. :(
Perfect time for me to upgrade... still rocking the i7 4790k since Dec 2014.
The comments are just pure pain. People will literally try to refuse technological advancement, and instead of being happy about getting more for less, they defend a clearly worse-value (gaming) CPU.
Like, wake up guys.
Anyways, I wish good gametime to everyone, not based on your hardware. Remember, don't let the PC end like the consoles!
Better performance than 13900K while using half the power!
It's not better in all cases, and it gets smoked in productivity
The R7 7800X3D is an amazing processor
One thing I see is that, yes, the power is lower with the AMD chip, but look at the % of headroom left over between the AMD and the Intel, guys: the AMD doesn't have much room left, whereas the Intel has tons left to use. I call that future-proofing.
Why are the 0.1% lows so bad on Ryzen, at least in the first game tested?
The i9 has 24 cores
@@saricubra2867 and that means what lol
@@APN34Z More cores can improve the frametime consistency for a game like Cyberpunk 2077 that scales pretty well.
1080p? If these processors and this graphics card are meant for 4K, the test should be at 4K; it would be absurd to buy these components to play at 1080p.
How can the 7800X3D use like half the watts while CPU temps are almost the same?
lower fan speeds. It was probably on auto fan with a similar temperature target
The same cooler was used, and fan speed is adjusted according to CPU temp, not power consumption. More likely the 7800X3D is this hot for its power consumption level due to the stupidly thick heat spreader on AM5, which acts as a major bottleneck in Zen 4 CPU cooling.
The real answer is actually that the cache sitting on top of the die holds back the heat. The heat transfer from the die to the heat spreader is poor. That's why pretty much any cooler will do.
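That explanation is easy to sanity-check with the usual first-order thermal model, T_die = T_ambient + P x (R_internal + R_cooler). The resistance figures below are illustrative assumptions, not measured values; the point is only that a worse internal heat path can offset half the power:

    # First-order thermal model -- the resistance numbers are assumptions.
    ambient = 25.0                          # deg C
    r_cooler = 0.15                         # same cooler on both, deg C per watt
    chips = [("7800X3D", 80, 0.45),         # assumed worse die-to-IHS path
             ("13900K", 160, 0.20)]         # assumed better internal path
    for name, watts, r_internal in chips:
        t_die = ambient + watts * (r_internal + r_cooler)
        print(f"{name}: ~{t_die:.0f} C")    # ~73 C vs ~81 C

With numbers in that ballpark, the chip drawing half the power lands at nearly the same temperature, which matches what the video shows.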
2:36 bro fr said 'gotta go fast💀😭🙏'
Honestly, I'm still going to stick with the 13900K. The games the 7800X3D beats it in, I don't particularly play or care to play, and most importantly, while sure the avg fps is high, the 1% and 0.1% lows on the 13900K are better in most situations. I cap my game fps anyway, and what matters to me is not having to deal with BS stutters during gunfights in Apex and the like.
This is also purely gaming performance... and I plan to do more than just game, so... oh well.
End of the matter is: I've been on a 4790K all the way up till now. What an upgrade it'll be, eh?
I think you missed the part where it shows the power draw; in some countries, like the UK, electricity has become really expensive.
Get 64GB in your new system when you do.
"I plan to do more than just game." So why did you even look at this CPU? Why even bother with your shitty comment? Brand fanbois are the worst.
@@curedanxiety1679 Only in the EU, and you know why they have to bear such a cost.
@@lethanhtung4972 Yep. I live in Estonia and the price of electricity has gone wild. I have a fixed price of 200€/month, which isn't that much compared to the wild stuff I've seen: in winter some guy paid 600€ for heating, 800€ total, for a small apartment. The thing here is we have an energy company that isn't owned by the government, so they just jacked the price up to the moon, to the extent that a farm had to pay nearly 100K for a year's worth of electricity, and it wasn't a big farm or anything, just a privately owned one. It's quite wild what's going on here. I don't really care since I have a fixed price.
Are those temps on the Ryzen 7 7800X3D really from cooling it with a Dark Rock Pro 4? And is PBO2 on or off?
AMD has a problem with 0.1% lows in many games
I've been testing the 7800X3D for a few months now. Many updates and the best RAM possible.
It's a nightmare to avoid bluescreens.
I managed to keep it stable at 5.15GHz, but it's still not as smooth as many claim.
The 1% lows are better even on the i7, and no, I don't like Intel or the high prices they charge for something that should cost less.
In the end I would recommend Intel, unless you want to wait for more stable BIOS and chipset drivers.
With time the 7800X3D will beat anything, even in 1% lows. For now I'm very skeptical of the benchmarks and overall performance.
Lol, Intel uses 2x the power and still loses. Intel fanboys must be fuming
From the stutters, no they're not pmsfl
@@APN34Z It takes double the physical cores for Intel to compete with AMD, huh 😂
The 1% lows tell me everything I need to know. Also, there's the fact that AMD has a way worse DDR5 memory controller, usually only getting to 6000, while the 13900K can get to 8000
I don't think the memory speed really matters that much; it's more about memory latency and timings. 6000MHz vs 8000MHz can't be compared directly, y'know
@@SeeFlow-bo1dl Yes, very true, but Intel can also run higher frequencies with lower latency, which scales properly. I've seen a lot of people easily overclock to 7800 CL32 or 7600 CL30 on Intel, while on AMD they're lucky to even hit 7000 at all, and can only get those CAS numbers in the 6000 range. Despite the memory frequency gap AMD has been competing well, but if they brought their memory controller up to par with Intel's, they'd absolutely destroy Intel.
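On the frequency-vs-timings point: a useful rule of thumb is that first-word latency in nanoseconds is roughly CL x 2000 / (MT/s), since the transfer rate is double the memory clock. A quick sketch using the kits mentioned above (the pairings are the commenters' examples, not verified configs):

    # First-word latency rule of thumb: CL cycles * 2000 / transfer rate (MT/s).
    def latency_ns(cl, mts):
        return cl * 2000 / mts

    print(latency_ns(30, 6000))   # DDR5-6000 CL30 -> 10.0 ns (common AMD sweet spot)
    print(latency_ns(32, 7800))   # DDR5-7800 CL32 -> ~8.2 ns (the Intel OC mentioned)

So the faster kits can win on both bandwidth and latency when they hold those timings; the open question is how much of that shows up in games versus the big cache doing the same job.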
AMD: We Need Less Power
Intel: We Need More Power
I’m still an Intel fan due to the 1% and 0.1% lows
+ more power + more heat + more price
Yeah. Such a huge diff. Like 5% or smth🤣
There's a reason people go with Intel: AMD will always be buggy around software. For gaming, AMD is on par with Intel.
@@SiqFix People go for Intel because Intel spends more money on advertising than on making a good product.
Like, imagine a 10nm CPU in 2023, wtf...
@@notenjoying666 Nope, you're obviously new to the computing world. I have owned a Ryzen system, and everything about it related to software was buggy and still is today. Some of my friends own the latest and greatest Ryzen systems and find AMD buggy themselves, but they bought AMD because it's more affordable.
How are your temps so low on the 7800X3D? My idle CPU temp is 44 Celsius, and all the forums I read say idle is 40-45
Still within normal temps, don't worry
I'd worry if case/idle temps were over 55°C
Holy shit, Hogwarts is such a poorly optimized game! 😱 And many of my friends were drooling to spend 60€ on that PoS 🤣🤣
Edit: Conclusion from the tests: AMD is still the best bang for buck.
The Trash of Us took its place!
@@slumy8195 xD
This comment is not only for the algorithm but also a thank-you for your work and effort.
No reason to go with the X3D chip unless you want a pure gaming machine with a 4090. Pair it with any GPU lower than a 4090 and you'll have identical results. The Intel chips also destroy this chip in every other metric, like multicore workloads. Also, people like to talk about power consumption in games but fail to mention total power consumption. My 13600K idles at 4 watts, and the 13900K is similar. The Ryzen 7000 series is known to idle at 20-25W+. Also, a Ryzen 7800X3D with mobo and DDR5 RAM will run you around $850, whereas a 13600K with mobo and DDR5 will be around $500-$600.
Edit: People are stupidly calling me an Intel fanboy, so let's disprove that quickly. Despite owning an Intel chip, I recommend the 5800X3D over both of these chips. Why? Because CPU, RAM, and mobo = $450. Yes, that's the price of a 7800X3D alone. Again, a 7800X3D with RAM and mobo will cost $850. For 10-15% more performance with an RTX 4090, you're gonna pay $400 more. If you own an RTX 4090 and want the fastest CPU possible, by all means get the 7800X3D. But I know that's not the case for everyone, so it doesn't make sense to pay $400 more for a CPU that's only 5-15% faster. Hence the 5800X3D is the best-value gaming CPU.
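Putting this commenter's platform-cost-plus-idle-power argument into numbers (the prices and idle draws are their claims from above; the usage pattern and electricity price are assumptions on top):

    # Total-cost sketch built from the figures claimed above -- all assumptions.
    platform_cost = {"7800X3D": 850, "13600K": 550}   # CPU + mobo + DDR5, claimed USD
    idle_watts   = {"7800X3D": 22,  "13600K": 4}      # claimed idle draw
    hours_idle_per_day = 6                            # assumed light desktop use
    price_per_kwh = 0.15                              # assumed electricity price
    years = 4

    for chip in platform_cost:
        kwh = idle_watts[chip] / 1000 * hours_idle_per_day * 365 * years
        total = platform_cost[chip] + kwh * price_per_kwh
        print(f"{chip}: ~${total:.0f} over {years} years")   # ~$879 vs ~$555

Under these assumptions the idle-power gap adds only a few dollars a year; the platform price difference dominates, which is the stronger half of the argument.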
What are these combos? Why did you suddenly bring up the 13600K when the 7800X3D is a 13900K-level CPU lmao
@@knhgv Because most 13600Ks can be overclocked to 5.5GHz, the stock all-core frequency of the i9. In fact, my 13600K runs at 5.7GHz, so it will mostly run faster than the i9 shown in the video.
@@ProVishGaming LMFAO, SUCH CRINGE NONSENSE LOL. OK, the 7800X3D can easily be OC'ed to 5.4 via base clock without any issues, what are you gonna say to that?
An OC'ed 13600K never did better than a 13900K by any means anyway, that's fake.
found the intel fanboy lol
@@notenjoying666 No, the 7800X3D can't???? Don't watch a skatterbench video and think it's possible. The 3D V-Cache on the die is sensitive to high voltages, and it frequently parks cores whenever it can. You need to win the silicon lottery if you want 5.4GHz. Also, your assessment of my 13600K is completely wrong. Unless a game is able to utilize all 8 P-cores (which is rare), my overclocked i5 does perform better than an i9 in games.