@@EarthIsFlat456 Honestly, given the massive price disparity between an entry-level system and this, I simply expected better performance, so of course I'm disappointed. I can't tell if it's poor optimization on the games' part or something with the hardware, but it's not nearly as nice as I'd hoped.
@@visiblydisturbed1688 The only games where the 4090 isn't impressive are the ones running native 4K with ray tracing, like Hogwarts and The Witcher 3. Learn how demanding it is to run 3-4 types of ray tracing at once before talking trash.
@@dhaumya23gango75 I wasn't even talking trash. I simply said I was disappointed with the performance given the exorbitant price of the hardware. I gave my opinion on how I felt, and it's not going to change; I'm sorry if that seems like trash talk to you.
Games :
Forza Horizon 5 l 1080p l - 0:06 - gvo.deals/TestingGamesForza5
Forza Horizon 5 l 1440p l - 0:48
Forza Horizon 5 l 4K l - 1:26
Hogwarts Legacy l 1080p l - 2:10 - gvo.deals/TG3HogwartsLegacy
Hogwarts Legacy l 1440p l - 2:45
Hogwarts Legacy l 4K l - 3:29
Red Dead Redemption 2 l 1080p l - 4:15 - gvo.deals/TestingGamesRDR2
Red Dead Redemption 2 l 1440p l - 5:05
Red Dead Redemption 2 l 4K l - 5:50
Atomic Heart l 1080p l - 6:34
Atomic Heart l 1440p l - 7:36
Atomic Heart l 4K l - 8:33
CYBERPUNK 2077 l 1080p l - 9:16 - gvo.deals/TestingGamesCP2077
CYBERPUNK 2077 l 1440p l - 10:02
CYBERPUNK 2077 l 4K l - 10:32
CYBERPUNK 2077 l 4K l DLSS 3 - 11:12
PUBG l 1080p l - 11:49 - gvo.deals/TestingGamesPUBG
PUBG l 1440p l - 12:40
PUBG l 4K l - 13:32
Resident Evil 4 l 1080p l - 14:17
Resident Evil 4 l 1440p l - 15:01
Resident Evil 4 l 4K l - 15:38
Spider-Man l 1080p l - 16:20 - gvo.deals/TestingGamesSpiderManPC
Spider-Man l 1440p l - 17:11
Spider-Man l 4K l - 17:55
Spider-Man l 4K l DLSS 3 - 18:34
Microsoft Flight Simulator l 1080p l - 19:24 - gvo.deals/TestingGamesMFS20
Microsoft Flight Simulator l 1440p l - 20:14
Microsoft Flight Simulator l 4K l - 20:54
The Witcher 3 l 1080p l - 21:33 - gvo.deals/TestingGamesWitcher
The Witcher 3 l 1440p l - 22:15
The Witcher 3 l 4K l - 22:50
The Witcher 3 l 4K l DLSS 3 - 23:21
System:
Windows 11
Ryzen 9 7950X3D - bit.ly/41wwPIV
Gigabyte X670E AORUS MASTER - bit.ly/3BRDIIG
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz CL38
CPU Cooler - be quiet! Dark Rock Pro 4 - bit.ly/35G5atV
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
I remember a few years ago I thought 4K gaming was just ridiculous; it would need hardware so advanced that I didn't think it would be possible for at least 7-8 years. And here we are, gaming at 4K ultra settings at high fps. Technology never ceases to amaze me.
yea but be ready to pay like 10k
@@exxxz1999 you weren't. Stop capping, the 1080 Ti is garbage
@@artuilech.75063*
and it only costs 2k$
@@artuilech.7506 10k? Where, lmfao? You can literally buy this rig from Corsair for 4,600. Google is free, you know
When the 4090 gets more FPS at 4K than what you get at 1080p:
Its like a Bugatti
Where
Some people have gf
Real life have infinite resolution and fps.
@@myemailvl actually, real life has 60 fps and 576 Megapixels, which is more than 8K :)
On these kinds of tests, you should also show the AVG FPS below, as you do in the comparisons. Just my opinion. Great work
It's not a comparison, so it's not really needed, I guess
Jesus Hogwarts nearly dropping below 100fps at 1080p... that game needs help
Yeah, it's bad. I remember playing it in the first few weeks after it launched, and I was getting drops to single-digit FPS way too often with a 5800X3D + 2080 Ti. Uninstalled it; figured I'd wait for six months of patches, then revisit.
Bro, ray tracing is on at every resolution and DLSS is off; getting 60 fps at native 4K ultra with ray tracing is impressive
Gonna have to wait for the 5090😢
@@aditech8432 We don't know if DLSS is on, but considering it's on in every other game in this video, it's safe to assume it's also on for HL. Nevertheless, the game is very, very poorly optimized.
It can ALMOST max out the GPU at 1080p. Incredible results. I remember when the 4090 first came out, the best CPUs were still bottlenecking it, even at 4K! These new 3D V-Cache chips are finally ready for next-gen cards! I'm happy with my recent 5800X3D purchase :) The jump to AM5 is too expensive for me at the moment, but maybe I'll upgrade for the 8800X3D :)
Indeed! I ended up buying a 5600 tho, it was just way too cheap.
I got an AM5 mobo and I didn't even know it only supported DDR5. I had to upgrade the RAM to DDR5 as well; I thought it was backward compatible.
Bateman is spitting facts
@@jokerbrothers3165 Safe to say you won't need any mobo or RAM upgrade for the coming 4-5 years now, unlike Intel
@@ht_ghauri true
I recently bought a 4090 and paired it with my 5900X. This combo is already a beast, running on 32 GB of DDR4 3600 MHz CL14 RAM and just PCIe 3 and 4 NVMes. While that combo isn't the ultimate, it's already a banger for my 165 Hz UWQHD ultrawide display 😊🔥
How much did all that cost you?
@@Bafana777 prolly like 3k
@@sql64 How the hell does that cost only 3k?
I paid 3k as well for the 4090 Trio X, an AM5 X670P mobo, 32 GB of DDR5 6000 MHz RAM, and a 1000 W power supply
@@Bafana777 everything
Incredible 1% and 0.1% lows in Forza lol
😂😂😂
Power of AMD CPUs 😎
What does 1% low mean??
@@jamshidjafari6462 The 1% low tells you the average fps over the worst 1% of frames. It sounds unimportant, but it affects how the game feels. For example, if your average is around 80 fps but your 1% low is well under 60 fps, say 30-40 fps, the game will feel laggy/sluggish during those dips, because anything below 60 fps stops feeling smooth.
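To make that concrete, here is a small sketch (my own toy numbers, not from the video) of one common way overlay tools derive the 1% and 0.1% lows from per-frame times; note that some tools use the 99th-percentile frame time instead, so conventions vary:

```python
def fps_stats(frame_times_ms):
    """Return (avg_fps, 1%_low_fps, 0.1%_low_fps) from frame times in ms."""
    # Sort slowest frames first: the "1% low" here is the average fps
    # computed over only the worst 1% of frames.
    slowest = sorted(frame_times_ms, reverse=True)
    n = len(slowest)
    worst_1pct = slowest[: max(1, n // 100)]
    worst_01pct = slowest[: max(1, n // 1000)]
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    pct1 = 1000.0 * len(worst_1pct) / sum(worst_1pct)
    pct01 = 1000.0 * len(worst_01pct) / sum(worst_01pct)
    return avg_fps, pct1, pct01

# 99 smooth frames at 12.5 ms (80 fps) plus one 33 ms hitch:
avg, low1, low01 = fps_stats([12.5] * 99 + [33.0])  # avg ~78.7, lows ~30.3
```

A single hitch barely moves the average but craters the 1% low, which is exactly why the metric tracks perceived smoothness better than average fps.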
50w cpu power damn AMD thats efficient
Getting my 7950x3D in September and currently on a 9900k with the 4090. Everything still runs amazing but seeing my GPU sitting at 50-60% in some games has made me want to upgrade faster.
Get a 7800X3D for gaming it’s better
@@T3x27let him get the 7950x3d if he has a 4090
@@Universeverse923 You won't notice a difference between the 7800X3D and 7950X3D in gaming, but for multi-core tasks you surely will, and judging by what he said, he just games on his 4090
@@Universeverse923 well it sounded like he just wanted to play games, also that’s like 5 months ago
Gpu works like that. It's normal
So is this the Fastest CPU/GPU Combination you can buy as of today? Should be at least Top 3.
The 13900K is the fastest, I think; the 7800X3D will smash both, though
The 13900KS is by far the fastest, and also very efficient with an undervolt
yes it is the fastest gaming wise.
@@lexavlogs7149 7950x3d is the best gaming processor...
@@marcelojogos6247 Yes, the 7800X3D will beat the 7950X3D because of how the 3D cache is distributed: the 7950X3D can't use it consistently because it's only on 8 of the 16 cores, while the 7800X3D has the cache on all of its cores. Proof of this is that if you disable the 7950X3D's non-cache cores, it runs faster in games
Damn, RE4 Remake and Spider-Man are the only games that eat up 8 GB of VRAM or more at 1080p and easily go above 12 GB at 4K.
My RX 6750 XT goes above 10 GB, sometimes even above 11 GB, of VRAM in Hogwarts Legacy at 1080p with everything maxed out but without ray tracing. The rest of the system is a 5600X and 3200 MHz DDR4 RAM. Interesting to see the results here.
Uhm, Hogwarts Legacy as well, and TW3 with ray tracing.
Yeah, this is what I've been trying to say as well on other low-spec gaming channels: the 4070 Ti is not gonna age well. But instead they all say 12 GB of VRAM is still okay, lmao. Look who's laughing now
In the near future the 4070 Ti is gonna have big problems. I'm happy with my 3090; 12 GB of VRAM is a joke in the year 2023.
Planned obsolescence... 12 GB will not be enough after a year or two. 16 GB will be the norm for 1440p and 20 GB for 4K, at least at very high/high gameplay settings. Ultra is for screenshots, so if you avoid it, 12 GB may last longer.
impressive how your 4090 stays at 47-49 degrees
He is using Russia and Canada's winter as a cooler
@@jojo-vi9wq lmao
How do you get those low temps? Incredibly good temperatures.
Water cooling system
Which one? I have a Deepcool LS720 360mm in a Cooler Master TD500 Mesh V2 with 7 fans, and my temps are 65°C on the 7800X3D and 67°C on the RTX 3090
@@joseignaciolopez3573 Well, that's a pretty demanding CPU (I have the same one), but my temps are usually below 45°C. Still, yours are pretty low too
I'm impressed by how the CPU and GPU temperatures stay restrained under 60 degrees
Well, not with the stock fan curve, for sure. It uses a custom, aggressive fan curve.
@@yosifvidelov Ehhh, probably not. At least the GPU temps are realistic for the high end coolers.
My 4090 often stays around 60-70°C in my 30°C ambient room, depending on the game. Forza Horizon 4, e.g., is a 60°C GPU title for me.
Silent Bios, aka 800-1200 rpm max.
@@Waldherz Well, the 4080 has the same huge cooler, and I get 60-65°C in most titles at 1440p with 22-23°C ambient at stock settings, while using considerably less power. So you tell me :)
@@yosifvidelov The 4080 has a much lower quality die and runs at a higher voltage.
Have a look at the power consumption and temperature difference between the Ryzen 9 5900X and 5950X. You will find that the 5950X runs a lot cooler while also using less power for higher clockrates at more cores.
@@Waldherz ruclips.net/video/uyMnMPLYk2w/видео.html The Gaming X Trio seems the worst, so the temps here are way off. So I'd say it takes 100% fan speed/noise to get them.
If I ever had a PC like this I might not ever go outside again
I have this but i go outside
Why does nobody talk about how incredibly well optimized Atomic Heart is?
Was any dlss used for Spider-Man in your testing for 1440p?
I don't know why but turning on frame-gen makes the gameplay look a lot choppier
@@faults8208 Reflex is always on when you enable FG.
They're artificially added frames; that's why no one uses frame gen or DLSS while doing game tests. They're just a marketing gimmick
@@vishvrajzala8017 Remnant 2:
@@vishvrajzala8017 I use frame gen with no DLSS in Cyberpunk, and it is definitely not a gimmick. There's no choppiness at all; the only difference is the input lag, but you get used to it, and you get a smooth game with no DLSS needed.
@@pedestrian_0 DLSS definitely degrades quality, there's no doubt about it; I care about resolution a lot, as I can easily tell the difference. Frame gen is just not best for shooters, where latency and delay matter; you can use it in story-mode games, for sure
What software do you use for your statistics overlay?
Msi Afterburner
Good performance. I'm waiting for the 7900X3D for an upgrade 😁👍
I wouldn't. Get either the 7800X3D or the 7950X3D. The 7900X3D is in a weird spot because of the way 3D caching works on Zen 4: only one CCD gets it. The 7800X3D is a single CCD anyway, so all 8 of its cores benefit. The 7950X3D, like the 7900X3D, has 2 CCDs, but the 7950X3D has 8 cores per CCD whereas the 7900X3D has 6. So with the 7900X3D, only 6 cores get access to the 3D cache; the other 6 don't.
This is a moot point for now, as multiplatform game engines (which most games are built on) are still from last gen with enhancements for next gen, so 4-6 cores is the sweet spot for most games. That will soon no longer be the case as the PS4/Xbone are phased out (we're seeing it already) and engines are built from the ground up for next gen only; the PS5/Series X both run 8-core Zen 2 chips.
When will you start testing EFT?
the witcher 3 is the new "but can it run it?"
which performance overlay are you using? is it afterburner?
Thanks for the review. Which CPU cooler are you using? What are your 7950X3D PBO settings? My 7950X3D runs hot. Please help me.
You have to check your motherboard and the BIOS it has; they have different voltage configurations there, and that lowers the temperature a lot
Hogwarts, Cyberpunk, The Witcher (?????), just to name a few, are running around 60 fps and can even dip below 60 fps,
ON A 4090/7950X3D.
PC games are just not optimised. Look at what consoles can do with just a fraction of the power. The 4090/7950X3D combo is the craziest thing you can get right now, and it still dips below 60 fps. WOW
It's because of native 4K with ray tracing in Hogwarts and The Witcher. Learn how intensive it is to run 3-4 types of ray tracing at native 4K. (DLSS is set to Quality, 1440p upscaled to 4K, in CP2077, though.) Consoles run triple-A games at 30 fps at native 4K with high settings (what they call "quality" mode) and ray tracing off; what they do is not exactly impressive. Cope harder. The Last of Us runs at 30 fps at native 4K high settings, while the 4090 pushes 90+ fps at native 4K ultra settings even with bad optimisation on PC. There is no comparison, and you can't afford the 4090 combo anyway.
You're right about Hogwarts and The Witcher, but Cyberpunk looks good enough to deserve the power it's taking.
What app do you use to check fps?
RivaTuner Statistics Server
@@DOLPH1NARU-JR But it looks different to mine
@@okranol He just changed some settings; idk exactly how he's doing that
The FPS in Witcher 3 @1440p are pretty strange. Running it with my 13900K + 4090 at ~130-140 FPS?
He's running the enhanced edition (high-res textures), I guess, or with RT
Ahh..yes..the combo we can never afford but still watching the vid
Btw nice work man
@Jack C damn man that's one beast of a pc u have
@@exxxz1999united states is not the whole world
Oh and here comes 7000 apus to the poor and needy
Great video. I liked the tests. You've earned another subscriber.
RE Engine has such good optimization.
Man, your temps are so low ;). How did you do that ? Did you undervolt? Or is it just the ryzen 7950x3D being a cooler cpu? What’s your AIO?
Gpu is probably water cooled
can you do one with dlss turned on at 4k?
Still not enough for a solid 4K 144 fps experience. Lighter, popular games, yes, but demanding games, no.
Demanding doesn't always mean good, of course. I would ignore the Hogwarts Legacy numbers like the plague.
It might be another 5-7 years before 4K high refresh rate is not just attainable across most titles, but also affordable.
I remember people losing their minds when the 2080 Ti came out and it was the first GPU that could hit a 4K 60 fps average on Hardware Unboxed. It was a minor blip of excitement, though, because by the time it became a thing, high refresh rate had become more important.
The 3090/6900 XT were incredible improvements, and the 4090 is a really strong push in the right direction.
I don't understand why Hogwarts Legacy looks so damn good in this video. I have the same settings (I guess), but only half of the ray-tracing effects, maybe only a third. What am I doing wrong?
Wonder how it performs with frame gen and dlss 3
He turned on frame generation in Cyberpunk.
@@winnb4968 what is framegeneration? what does it do?
@@alooy3331 It generates extra frames with AI, but those frames don't improve input latency, and they sometimes make movement look worse in practice.
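A toy model of that trade-off (my own simplification with made-up numbers, not measurements): interpolating one extra frame per rendered frame roughly doubles the displayed fps, while holding a frame back for interpolation adds about one render interval of latency.

```python
def with_frame_gen(rendered_fps, base_latency_ms):
    """Rough effect of 2x frame interpolation on displayed fps and latency."""
    displayed_fps = rendered_fps * 2           # one generated frame per real frame
    frame_interval_ms = 1000.0 / rendered_fps  # delay from buffering the next real frame
    return displayed_fps, base_latency_ms + frame_interval_ms

# 60 real fps and a 50 ms baseline: smoother motion, slightly worse latency.
fps, latency = with_frame_gen(60, 50.0)  # -> (120, ~66.7 ms)
```

The doubled fps is why it feels smooth in slow-paced games, and the unchanged-or-worse latency is why the shooter crowd above is skeptical.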
he used frame gen in Cyberpunk, Spider-Man and The Witcher 3
@@sirtonfy but not in 1440P and 1080P?
Can someone explain to me why the Bottleneck Calculator from PC-Builds says that anything below a Ryzen 7950X bottlenecks an RTX 4080, and that even the 7950X gets used at almost 100%? This video shows that even at 1080p the CPU is only around 15% used.
The overlay here shows multi-core usage in %, but most games lean on single-core performance as the main point :) Your CPU is almost never at 100% overall :)
@@Denniz69 Thank you Denis!
Thank you. This is THE video I wanted to see. It's amazing how you get higher fps with 1440p than with 1080p. Just shows that the 4090 is really what the 5090 would have been.
How do you explain that there are more fps at 1440p than at 1080p?
@ CPU bottleneck. The lower the resolution, the harder the CPU has to work to keep up with the GPU.
@@Komarikai I think it's more of a game-engine or API thing; it doesn't make sense that a CPU can generate more higher-res frames than lower-res ones
@ Margin of error. If the CPU is operating at its full potential, you won't get higher fps no matter the resolution.
The fps is the same in theory. If you measured it many times, the 1440p and 1080p results would level out to being (almost) exactly the same.
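The back-and-forth above boils down to a simple bottleneck model (a sketch with made-up caps, not data from the video): the measured fps is whichever is lower, a roughly resolution-independent CPU cap or a GPU cap that shrinks with pixel count.

```python
def expected_fps(cpu_cap_fps, gpu_fps_at_1080p, width, height):
    """min() of a CPU cap and a pixel-count-scaled GPU cap."""
    gpu_cap = gpu_fps_at_1080p * (1920 * 1080) / (width * height)
    return min(cpu_cap_fps, gpu_cap)

# With a 200 fps CPU cap, both resolutions land on the same number,
# so run-to-run noise alone decides which one "wins" in a benchmark:
fps_1080p = expected_fps(200, 400, 1920, 1080)  # -> 200 (CPU-bound)
fps_1440p = expected_fps(200, 400, 2560, 1440)  # -> 200 (GPU cap ~225)
```

Under this model, 1440p "beating" 1080p is just two CPU-bound runs plus measurement noise, which matches the margin-of-error explanation above.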
At 1:45, when entering the tunnel, there's an fps drop on screen, but on the graphs the fps rises. Is it a rendering issue, or is this faked?
Probably fake. Don’t trust easily what you see.
Not a huge computer guy, but wouldn't 16 GB of RAM bottleneck your performance at 4K?
true
What the hell is C.PWR? My 7950X3D sits at 65 W at Windows idle, doing nothing but watching this video.
Meanwhile, my Titan Xp hits 95°C just by entering a game, gives me 40 fps at low settings in 1440p, and doesn't support DLSS.
Don't know why, but my 4090 only runs at 50% at 2K in RE4R; as a result, it only gets around 140 fps. In Red Dead 2, I have the same fps as you @@.
In Norway: the i9-14900KS costs 850 USD, draws 170-180 W, and gets 5 more frames, versus the Ryzen 9 7950X3D at 680 USD and 80 W in Forza Horizon 5 😂
How did u drop your temperature
3:36 why is everything so shiny, has there just been a downpour in there?
That's the "super 4k real life GTA 6 mod" for you. Ray tracing is just being used to reflect everything possible, and it's being implemented horribly by the developers. That's why I never use it, since it also has a massive performance hit. The only games where I liked it were Metro Exodus Enhanced Edition and Control.
@@mihailcirlig8187 RT is not realistic; if you want a realistic, real-life experience, turn RT off.
Is he using DLSS Quality? At 1440p and 4K?
RX 7900 vs. 4070 Ti at 1440p without ray tracing, please!
I’m kind of upset that I got this CPU now. I was looking at UserBenchmark and they’re saying it’s just a marketing tactic from AMD and that I should’ve bought Intel’s chip.
I watching because maybe I see this PC in my dreams😂
Can u add CoD Warzone 1 and/or 2 and GtaV
Damn... perhaps I shouldn't use a 4K screen after all... the 4090 seems to be barely running 60 FPS in 4K.
I may never have one, but I can't stop admiring and wanting it...
Option : EPIC (DLSS) + 8800MT CL36 ?
AAA gaming at a steady 4K 120 Hz is now possible
Yes mom, this is good for Zoom and PowerPoint
hahaha
The Witcher 3, a game from 2015, dropping to 50 FPS at 4K on the best hardware you can get nowadays, lol
Did you forget he is using raytracing and ultra settings
witcher engine is so bad... unbelievable ... 30-55fps
The game is as shit as CDPR itself
Ryzen can't
It's a very old game
In Spider-Man, is the frametime graph showing the web swinging?)
6000+ dollars in PC components and it still stutters at 4K 60
What a time to be alive huh?
Here where I live it would be triple that amount
How do you rename the graphs like you do?
Hell yeah boy im finna buy an rtx 4090 with 7950X3D just to see the twins in atomic heart on 4k
daaaam. dream pc
GPU power consumption is too high. I expected lower power draw at lower resolutions, so there seems to be an optimization issue; the GPU will get cooked at this rate.
Ladies and gents you're looking at 2025 laptop performance.
Are these all DLSS off or on?
400 fps in Atomic Heart... like it's CS:GO. Now that's optimization! Take notes, all you Hogwartses and Cyberpunks!
Hogwarts is really struggling even with this CPU!! My 5800X gets the same or even slightly better frames.
No
On 4k mind you
People, I'm asking for opinions on the PC I've planned to build. Recommend something or tell me what you think of it; I need help xd
- Samsung 22" LED monitor, 100V/240V
- 2x T-Force Vulcan 8 GB RAM, 2666 MHz
- AMD Ryzen 3 4100 processor, without OEM cooler
- Gigabyte GA-A320M-H AM4 motherboard
- 1x ID-Cooling SE-224-XT Black V2 CPU cooler
- ASRock Radeon RX 550 2 GB GDDR5 Phantom Gaming video card
- Kingston NV2 500 GB M.2 NVMe PCIe 4x SSD, 3500 MB/s
- Thermaltake Smart 500W 80 Plus White power supply
- Antec NX292 Mesh ARGB case, tempered glass
it's a shit PC
Why not use an RDNA 2 GPU? It's better for the long term. For example, the RX 6500 XT or RX 6600.
@@jokojr100dl Ok, is an ASRock RX 570 8GB GDDR5 Phantom Gaming Elite video card good??
What's your budget? Can you trade the 4100 for a 5500, and the RX 550 for a 6600, or a 1660 at least? Also, if I'm not mistaken, you'd have to update the BIOS to make that mobo work with a 4100/5500
I don't know why I'm watching; I can't even afford a RAM stick lmao
😂
They ruined The Witcher 3. There's no reason why, at 4K on a 4090, the game gets 50 fps.
shouldn’t the power supply be a little more powerful?
yep
Kinda strange. Back in the day the 1080 was already "the 4K card". Nowadays you still need the top-tier card to play at 4K.
4K is just a resolution. Minecraft at 4K is not the same as Cyberpunk at 4K.
The 1080 card in reality never was a 4k card
The best of the best.
Bro, the CPU is only at 20% usage playing AAA games💀
Why so low temps
This is what console players think every pc player has for cpu and gpu
In my head, every average American can have a PC like this
Gtx 1660ti + ryzen 5600x 1080p ultra
1440 ultra
Wow HL is still a mess on pc
wow hogwarts 4k and just getting 60fps
Harry Potter makes 4090 looks not so impressive xD
what harry potter? its hogwarts legacy...totally different stuff
Hogwarts Legacy is truly the next Crysis. It'll take another decade before computers are able to brute force past its poor optimization.
remnant 2:
@@ghostbuster2339 Cyberpunk 2077 just reaching further and further levels of super saiyan with each new level of raytracing
Me watching on a broken 720p screen
💀
Wow not even completely saturating the bandwidth of the 4090...
what kinda bs comparison is this??? you dont even put it side by side to see the framerate differences smdh 🤦♂🤦♂
Top-tier build, but the price...
Yeah)
2 games already where this nearly $2k GPU can't hit 60 fps at 4K, and we aren't even in the generation of Unreal 5 games yet, which run insanely bad.... oh... oh no lol.
poor optimization
Could you try apex legends?
Who buys a 4090 for FHD???😅😅😅😅😂😅😂😅😅😂😅
Me 😂
running DLSS at 1080p with a 4090 smh
🤔 If you had used DLSS, which renders at a lower internal resolution instead of native... the CPU would work from that lower resolution as a base, which would result in better use of the processor and a bigger FPS advantage.
Not true
DLSS adds more strain on the CPU, and so do RT and higher settings.
DLSS is good if you are GPU bound, not CPU bound.
@@kodakblack531 The CPU takes the resolution the game is using as a base, and since DLSS runs at a much lower resolution than native, the CPU will work from that lower resolution, which would result in better CPU utilization. A pleasure to have taught you, son, since you seem quite lost on the subject. 😎✌
@@lorenzogonzalez4300 I guess you’re my son bc you’re still wrong.😎 Son, dlss won’t help cpu bottleneck it’ll help gpu bottleneck.
Why is Witcher 3 performing so poorly?
4K is the new standard for gaming. If you own a 1080p or 1440p monitor, you can't enjoy games at 100%.
@jhonsierra_1172 I bought everything with my money. 4k is worth it. Everyone can save money and buy what they dreamed of.
You can easily play in 1080p, this desire is just silly capitalist consumerism
The 1% lows are really bad, why is this?
Rdr2 looks amazing, i just currently did an rdr2 video in 4k with i9 13k and 3080TI on an ultrawide 4k oled monitor, feel free to check it out :)
Honestly this was very disappointing to see, I honestly expected more out of top of the line hardware
You just have unrealistic expectation.
@@EarthIsFlat456 Honestly, given the massive price disparity between an entry-level system and this one, I simply expected better performance, so of course I'm disappointed. I can't tell if it's just poor optimization on the games' part or something with the hardware, but it's not nearly as good as I would have hoped.
@@visiblydisturbed1688 In what games in particular do you find the performance disappointing?
@@visiblydisturbed1688 The only games where 4090 is not impressive is where native 4k with raytracing is used like hogwarts and witcher 3. Learn how demanding running 3-4 types of raytracing at the same time is before talking trash.
@@dhaumya23gango75 I wasn't even talking trash, I simply said that I was disappointed with the performance given the exorbitant price of the hardware, I simply gave my opinion on how I felt, and honestly it's not going to change, I'm sorry if that seems like trash talk to you.
Makes you wonder why Hogwarts Legacy feels so clunky on my PC... the game is fucking unfinished.
disable dlss and then do actual test of the gpu. DLSS is garbage.
holy shit .....
holy shit!! PUBG 349FPS