Please give this video a thumbs up if you liked it and feel free to comment below or ask me anything. It will help me to get recognized by youtube's algorithm. Thanks! :)
Timecodes:
○ 0:00 - S.T.A.L.K.E.R. 2: Heart of Chornobyl
○ 1:08 - Cyberpunk 2077
○ 2:14 - Horizon Forbidden West
○ 3:19 - Warhammer 40,000: Space Marine 2
○ 4:21 - Ghost of Tsushima
○ 5:35 - Starfield
○ 6:34 - The Witcher 3 Next Gen
○ 7:41 - SILENT HILL 2 Remake
○ 8:50 - Black Myth Wukong
○ 9:53 - God of War Ragnarok
my i7 14700F making me proud out here 😊
people rarely upload footage of the F variant
What do you use to keep that chip cool?
@@johnwarren6762 Deepcool AK620
Crazy power draw. I'm using a 10700F in one PC and a 12400F in another. A 7700 or 7800X3D is way more efficient.
@@madarasoun3018 Luckily power is not an issue, nor the extra cents in cost.
@@LEGEND-rt9zc You're lucky indeed. In SEA, the more electricity we use, the higher the price per kWh.
Just upgraded mine from a 10700K to 9800x3d! Woo!
Would love a 4K comparison video too!
Congrats!
4K doesn't matter; when the GPU is the
system's bottleneck, the FPS will be equal on both.
@@Eduardoboninisinhoretti A few games need more CPU.
A few games need more GPU.
Plus, higher FPS needs a better CPU;
60 FPS vs 300+ FPS is sometimes a big difference.
But: I have a 14700K and at most I need 30-40% of my CPU with a 4090 at 4K (4096x2160) and 120 FPS
(99% RPGs).
So I think the 9800X3D is 1-2 FPS better at most (or the same).
The GPU is more important at 4K (as above).
A 9800X3D doesn't need 100% of its power at 4K; same for my 14700K
(at least in the games I've played).
How's the FPS difference?
This was a nice upgrade. Congrats.
Thanks!
Need another test with DLSS Quality (maybe + FG). I would never play at 1440p on a 4080 using DLSS Performance. The test is made specifically to show at least some difference between the CPUs, but in real-world conditions the difference will be almost unnoticeable.
thank you for actually having a 14700F review
I have the i7-10700F and an RTX 4070 Super. Runs great, and MSFS2024 runs at 75 FPS on high to ultra settings, which is all I'm interested in.
16 GB of RAM when STALKER eats like 17...
this video is very interesting great job and thanks!.
Happy to help!
I have an i7-14700F paired with a 4070 Super; very happy.
Can you test the 10700F vs 14700F with an RTX 4070 12 GB video card at 1080p resolution?
Thanks for your videos!
I'll look into it.
@@MxBenchmarkPC Respect! ❤
I will tell you right away... the 10700F will be too slow at 1080p, so your 4070 will not work at its full potential. Using frame generation will most likely fix that issue and you will be good.
Just upgraded to an i7-14700F from an i5-12400F and I'm super happy with the performance.
And what card do you have?
@@santiagofernandezcabal6286 4070s
I remember these guys saying the processor doesn't affect FPS lololo
Same for RAM clock speed. I remember a Fallout 4 benchmark from the DDR3 (?) era where 1866 MHz vs 3000 MHz (XMP) increased FPS by 80%.
It depends 100% on the game;
you can have the same FPS with a 10-year-old CPU and a new one when you're GPU-bound.
It doesn't if you're GPU-bound. Note that in most games the RTX 4080 is at 50-70% usage, which means the CPU is dragging it down. But in Wukong the GPU runs at 99% on both CPUs, so it's GPU-bound and there's no FPS difference between the CPUs. So yeah, if you have an RTX 4080 or better and don't play in 4K, the CPU will make a difference. But if you have something like a 4070 and/or play at 1080p, the CPU won't matter much.
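(Side note: a minimal Python sketch of the rule of thumb above, assuming you log per-second GPU utilization from an overlay like RTSS or HWiNFO into a plain list; the 90% cutoff is an assumed rule of thumb, not an official definition.)

# Rough sketch: classify a benchmark run as GPU-bound or CPU-bound
# from a list of per-second GPU utilization samples (0-100).
# The 90% cutoff is an assumption, not an exact rule.
def classify_bottleneck(gpu_usage_samples, threshold=90.0):
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return "GPU-bound" if avg >= threshold else "CPU-bound (GPU underutilized)"

# ~99% usage on both CPUs -> GPU-bound, a CPU upgrade adds little;
# ~50-70% usage -> the CPU is holding the GPU back.
print(classify_bottleneck([99, 98, 99, 97]))
print(classify_bottleneck([55, 62, 70, 58]))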
And they are right in some way... this GPU is too strong for that old i7, so it runs at 50%... testing this with a 3060 would make much more sense... the low FPS is due to the bottleneck.
@@JN-hs5xl You are wrong. Look at CPU benchmarks with a 4070/4080/4090 comparing all CPUs; you can gain about 20, 30, 40, 50 FPS depending on the CPU's age and the game. The CPU has ALWAYS made a nice difference, honestly for 10 years now. Every time I change my CPU I gain 20-30 FPS in the games I play.
That 10th gen i7 is my old CPU (I gave my old computer setup to my uncle). I have the 14th gen i7 now and have definitely seen a massive boost in my games.
Wow, happy I didn't keep the 8700K from my 1080 Ti setup when upgrading to the 4070 Super. Didn't think there'd be such a difference!
1:34
WOW!!!! 227 W and only 60 °C. Your cooler is awesome.
How much of an FPS difference does it make at 4K resolution?
3:38 Warhammer does have one of the best frame generation implementations I've ever seen.
Such a big difference in performance in Space Marine 2
Also in STALKER 2... both are highly CPU-intensive games... even on an Intel i9-14900K or an AMD 9800X3D, CPU usage in STALKER goes to 35-40% in town areas.
How do you run all cores at 5.1 GHz?
Is the i7 14700F paired with a 4070 super okay?
Sure.
What are PL1 and PL2? Undervolted too?
I mean the 14700F.
250w, no undervolt.
anything overclocked?
Nope.
It's not a big difference when you consider that you spent about $800 on the upgrade.
I didn't spend that much on this upgrade. Check the latest community post for more info.
Is it okay to pair an i7-14700F with a 3070 Ti?
Sure.
Your gpu likes it !
But... wtf, under 60 FPS with the 14700F + 4080 combo at 1440p + DLSS Performance in STALKER 2?!
STALKER 2 is an extremely CPU-heavy game.
Game is unoptimised trash atm😂
Not bad, congrats on the upgrade 👌
Thanks!
Please also test the 4070 with these CPUs
I'll look into it.
Hi. Do the same test, but in 4K. That would definitely be interesting.
5800X3D: hold my beer
DLSS performance 1440p?
:DDDD FAIL !!!!
That's how you test the CPU.
Basically 720p
9:53 :DDDD 1104x616 !!! NOOB !!!
My i7-14700K draws 100 W in games at 55 degrees (PL1/PL2 200 W, 5.5/4.3 GHz). It seems this test is with unlimited watts and a stock cooler.
Really? I got the same chip new and I’m worried that a 240 AIO won’t cool it
Thanks
The 10700F is bottlenecking the GPU and taking the load on itself. You have to set the settings high enough for the CPU to hand the load to the GPU, and when it does, the difference between the two CPUs with the same GPU will not even be noticeable.
It's not that the 14700KF is too powerful - NO - the 4080 is too powerful, so it forces a huge bottleneck with the 10700KF; that's the reason for the big FPS gap.
🗿 stay strong
You should just disable HT in the BIOS and re-run with a 4070 like one viewer said. 🤩 The RESULTS would be more representative. HT can add about 20%, but only when the cores are at 100%; right now it looks like the cores could add another 100% 🤣 - NOT - that's just how weirdly Windows CALCULATES core and overall CPU usage. ☠ Some games can even be slower with HT on.
Every game gets its max results with 6 cores. Spider-Man uses 3-5 cores (all cores on loading screens or during shader compilation). What did you do with the 14700's E-cores? Disabled them in the BIOS?
INSTALL Process Lasso, put the tested game on cores 1-8, set the CPU AFFINITY of the NVIDIA NV*.exe processes, MSI Afterburner and RTSS (SHIFT + mouse to select multiple) ALWAYS to the E-cores, and repeat the testing (NO GAME uses more than 6 cores anyway).
(Intel's HW Thread Director uses physical cores 1st and HT threads 2nd, once all cores are FULL; I don't know about Ryzen but it has to be similar.
In Windows Task Manager / CPU, cores are 0, 2, 4, 6... and HT threads are 1, 3, 5, 7...) And there is some connection between WINDOWS and the BIOS Intel HW Thread Director setting 😁
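(If you'd rather not install Process Lasso, here is a minimal sketch of the same affinity idea using Python's psutil on Windows; the process name "game.exe" and the core indices are hypothetical placeholders, not the uploader's actual setup, so map them to your own P-core/E-core layout.)

# Minimal sketch: pin a running game to a chosen set of logical CPUs.
# "game.exe" and the core list are hypothetical placeholders.
import psutil

GAME_EXE = "game.exe"          # placeholder process name
P_CORES = list(range(0, 16))   # assumed P-core logical CPUs 0-15

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(P_CORES)              # restrict the game to these CPUs
        print(proc.pid, "->", proc.cpu_affinity())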
Your 14700F is power limited; 174 W is too high for a game... Try undervolting, but I'm not sure if B760 boards support it...
Well, I don't know, strange test.
So upgrading only the GPU is horseshit, right? You always need to upgrade both.
More or less 50%.
A good amount of these tests are GPU-bound on the 14700F 😅
I picked a few less CPU demanding games on purpose. It's to showcase that not every game will receive a significant performance boost from upgrading your CPU.
@MxBenchmarkPC I see. The 50 series will show that major difference.
Good pezdong! thank you kindly
GG
Everything these days just stutters non-stop. We're pushing hardware beyond its limits thinking that more FPS equals more stability. I'd rather play a game at 60 FPS whose 1% and 0.1% lows are also 60 FPS than a 300 FPS game whose 1% and 0.1% lows are a 12 FPS stutter. Stutter ruins my gaming experience way more than a low framerate. Even in games that run smooth for a few seconds with amazing frametimes, the moment you turn the camera slightly, BOOM, there it is. I am so done with modern gaming.
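(For anyone wondering how those 1% and 0.1% lows are derived: a minimal sketch computed from a list of frametimes, assuming the common "average of the slowest N% of frames" convention; overlays like CapFrameX or RTSS may compute them slightly differently.)

# Sketch: average FPS and 1% / 0.1% low FPS from frametimes in milliseconds.
def fps_stats(frametimes_ms):
    fps = sorted(1000.0 / t for t in frametimes_ms)       # per-frame FPS, slowest first
    avg = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
    def low(pct):
        n = max(1, int(len(fps) * pct / 100.0))
        return sum(fps[:n]) / n                            # mean of the slowest frames
    return avg, low(1.0), low(0.1)

# Mostly 60 FPS with a few 80 ms hitches: the lows expose the stutter.
avg, low1, low01 = fps_stats([16.7] * 990 + [80.0] * 10)
print(f"avg {avg:.0f} FPS, 1% low {low1:.1f}, 0.1% low {low01:.1f}")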
The 4080 released in 2022, and in 2022 we already had the 12400F.
So if you want to show a difference, take a 3080. For my opinion and requirements there is no difference. I had a 1070 in the past and I still don't need better. With my card I would get the same FPS on both CPUs.
You should have overclocked the 10700 to the 14700's clocks and compared that!
No, he couldn't have. I looked at the motherboard and it's not great, to put it mildly: 2933 is the memory maximum, and overclocking the CPU is most likely not advisable.
@@СергейКостенко-г8щ Then the 14700's clocks should have been dropped to match!
@@tt-yw2ou Your 10700 is garbage, bro.