Not sure about either! I've been waiting for the Arc GPU to be available here for a month, and it still isn't... And I hope some mini PC brand sends something with those APUs, or maybe even AMD, but let's see :)
Proud 4090 owner since I built my new machine in June 2023. Knock on wood, no issues whatsoever. I plan on using this for years to come and most likely won't upgrade till the 7000 series if I can help it. Now if only games were more optimized :(
I got a 4090 last year and I’m very satisfied personally with the performance! I’m definitely skipping the 5090 and instead saving up for the 6090… There is no upcoming game that I want to play in which the 4090’s performance would be unacceptable to me. GTA 6 tho might be the one that will make me upgrade (it should be out around the 6090’s time).
As an RTX 4090 owner, I enjoy playing my games at High settings, not ultra, with Quality DLSS and some Ray Tracing (RT), but not extreme or Path Tracing (PT) due to the performance hit being absolutely ridiculous 😵😵💫🤕! I feel we (Nvidia and AMD) are a solid decade+ away from having decent RT or PT performance in games with only a minor to moderate hit to performance, and YES, Lumen in UE5 should have a toggle to turn it off until we're getting decent RT performance out of the box!
There is no need for a 5090 if you don't plan on playing with native RT/PT with everything on ultra at 4K. The 4090 got more than 60fps in all of these games at maximum settings without RT, PT, or DLSS/FG, and still around 55 in the two most difficult ones that have mesh shaders always on anyway, and DLSS Quality doesn't hurt the image at all, so you may as well use it on them. In most of these games it even got more than 60fps with RT on, without DLSS or FG. There is no need to upgrade at all.
Bought a 4090 from a friend of mine 3 months ago for a very good price, incl. 1 year of warranty left. And after seeing the insane prices of the new 5090, I'm glad I did 😅 I have zero use for 4x fake frames, as I can't even stand using 2x 😂
@frallorfrallor3410 The RTX 5090 will most likely be very good, don't worry, friend. Penguin has an RTX 4090 and still isn't going to replace it right away anyway; the RTX 4090 is more than fast enough for Penguin's work and gaming.
This is an insane benchmark showcasing the RTX 4090. Even though the Nvidia Showcase of the new RTX 50 series manages to overshadow this one, I still think this card will stand the test of time very well and maintain high frame rates. Well done, sir, for showing the true power.
@zWORMzGaming Sounds good 👍 It would be nice if you tested GPUs in livestreams so we could chat with you in real time. I'm pretty sure it would boost your YouTube channel even more ;)
Been watching your videos for about 2 years now, and I am about to actually build a cheap GTX 1660 Super build. I am only gonna end up spending around $150 bucks overall, due to the fact that my friend is giving me his old GPU (the GTX 1660 Super). Anyway, that part was kinda pointless to add, but keep going, you make AMAZING videos!
It is said that in the MSI Afterburner application, the GPU temperature and power consumption indicators can cause a significant drop in FPS and lead to frame drops. Disabling these features results in a performance boost. If anyone is unaware of this, they should disable this setting in the program, or alternatively, use another program. I hope the channel owner reads this message.
When testing games at 1080p and 1440p on a 4K display, the quality is a little lower and the fps is higher. Nobody plays at 1080p or 1440p on a 4K display with a PC built for 4K. Real-world testing means testing games at 4K on a 4K monitor with a PC built for 4K gaming, at 1440p on a 1440p monitor with a PC built for 1440p gaming, and at 1080p on a 1080p monitor with a PC built for 1080p gaming. If you want real-world testing, you need to build 3 PC setups and 3 monitors (1080p, 1440p, 4K).
I love watching your videos. I've been watching for years, and I’m finally going to buy my first GPU with the new RTX 5090 and build my first PC. Thank you so much for all of your hard work!! 🙏🏼
Oh thank you so much; because it's so fast, it's impossible to notice, now I understand. Thanks. Do you use G-Sync? I am new to those PC settings and stuff, so I would like to know. @@zWORMzGaming
Great video as always. The AWP gameplay was chef's kiss. Question: when you refer to GPU power utilization as it relates to X3D CPUs, what's that all about? Just built my rig with a 9800X3D/4090 and I'm curious, as this is the first I'm hearing of it. Might want to look into it myself.
Thank you! Some people found that having the GPU power usage displayed on the OSD makes ryzen x3d CPUs stutter more. I haven't found a game where that happens yet though. And it's more common with the 9800X3D apparently
Imo every game should perform like Hogwarts Legacy on a 4090, 100ish fps in AAA games, but they barely go above 60fps, so I think there might be something wrong with game developers.
Hey Krzzpy, really nice to see the Strix 4090 on your channel! Could you test Avatar: Frontiers of Pandora (with Unobtainium settings) on the 4090 as well? I would really like to see that. But as always, a great and entertaining video! 🙏👍 GOODBYEEEEE BOOOOB! 💥🔫
Hey Kryzzp, the 7900XTX with the latest drivers can trade blows with the 4090. I'm still thinking about getting a 7900XTX Nitro+ to upgrade from my 5700XT Nitro+, but I'm also considering the 9070XT if the price/perf is good (but it has only 16GB of VRAM, and I want to play at 4K)
@brasiliafc2138 No it isn't; it is barely 0 to 5% faster than the 4080S and at least 20 to 25% weaker than the 4090. Being 20% slower than a GPU is NOT trading blows with it; a GPU is trading blows with another if it is less than 10% weaker than that card, not 25.
I'm so hyped about the 5090 comparison. I believe that apart from the DLSS 4 performance, it's gonna be pretty much the same fps-per-watt performance. From what I saw it consumes about 500-530W, meaning around 15% higher than the 4090, but it's also "just" 15% more fps in most games. In conclusion it would be more future-proof, but it's a 1:1 upgrade, which isn't really that great. A big advantage is the dual-slot design though; the 4090 is a real brick, good thing they slimmed it down again for the 50 series.
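A quick back-of-the-envelope check of that fps-per-watt claim (a sketch using the commenter's assumed figures, not measured data: a ~450W baseline and +15% on both power and fps):

```python
# Hypothetical perf-per-watt comparison; the 450W baseline and the
# +15% figures are assumptions from the comment, not measurements.
fps_4090, watts_4090 = 100.0, 450.0              # normalized baseline
fps_5090, watts_5090 = 100.0 * 1.15, 450.0 * 1.15  # +15% fps, +15% power

eff_4090 = fps_4090 / watts_4090                 # fps per watt
eff_5090 = fps_5090 / watts_5090

# Equal percentage gains in fps and power cancel out exactly:
# efficiency is unchanged, hence the "1:1 upgrade" framing.
assert abs(eff_4090 - eff_5090) < 1e-12
```

So under those assumptions the efficiency really is identical; any real difference would have to come from measured power and fps deviating from the quoted 15%.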
The RTX 4090 will likely still be the 2nd most powerful GPU for gaming even after the RTX 50 series comes out. It was expensive, yes. Still expensive today. But it gave you an experience far above any other gaming card you could buy. It trivialises any game prior to the 2020s, and with frame gen, it makes even modern games a 4K high-refresh experience. For 4K path tracing, we are probably not getting that for at least another generation; a massive breakthrough or a new node would probably be required.
They would need a little more than double the core count of the 5090 to achieve 60 fps in Black Myth: Wukong with full path tracing, and also faster VRAM. I don't know when we will get something like that :/ In 6-8 years?
The 5080 only matches the 4090's memory bandwidth. The 4090 still has roughly 50% more CUDA cores and 8GB more VRAM. You'll get "more frames" generated with MFG on, but with 75% of the frames generated instead of rendered, it's going to be a smeary, artifacty, high-latency mess. And keep in mind, the RTX 40 series gets everything from DLSS 4 except MFG.
@lexustech48 Cores aren't really comparable across generations though. Also, the price difference is the main thing. The 5080 will probably be about the same as the 4090 in terms of raw performance. The extra VRAM on the 4090 may help in future games, but it won't really matter in most situations. I just don't see the 4090 falling to the same price as the 5080 or below in the current market. Then again, the 5080 will probably go for way above MSRP.
You should try reverting back to an older driver. I was having lots of stuttering issues with the newer Nvidia Drivers, but after reverting to the driver 556.12, almost every stutter was gone (except for the shader compilation ones of course).
Do you know how California wildfires happen? Because Nvidia engineers tried to overclock the RTX 5090 to 3GHz! That's why they had to lower the clocks for the 5090 this gen!!
Hey Kryzzp! Long time viewer here (I've seen almost all your modern videos and some old ones!) I wanted to ask: do you think mobile GPUs getting no more VRAM than the mobile GPUs from previous generations is acceptable? For example, the GTX 1060M had 6 GB of VRAM, and the RTX 3060M also has 6 GB of VRAM. The 4060M isn't much better with 8 GB of VRAM... How badly do you feel that affects performance? Just curious about your opinion. Thanks!
Hi! Thank you 😊🙏 I think we're not seeing the improvements in VRAM amounts that we should be seeing. It seems like the 5070 Laptop GPU will also come with 8GB which is just ridiculous. Especially with games using more and more VRAM... It all comes down to price points though, a cheap laptop shouldn't have more than 8GB of VRAM because it likely won't need it to play at 1080p medium settings for example. But more expensive laptops, especially on the 70 tier, should've had 12GB even on the 40 series. I'm actually going to make a 4070 Laptop GPU video for next Sunday, I'll see how much of a bottleneck the VRAM really is then. Thanks for the comment 🙂
Cyberpunk PL is still, apart from Bioshock, Dishonored and RDR2, one of my go-to games to showcase to friends in terms of raw fidelity AND Path Tracing
Hey kryzzp, please do a video just like this but for the 4060!!! It's probably the most-bought RTX card from the 40 series, so it would be very helpful for many people
How is this even a question? If the latest GPU is not a monster in the newest games then we have a serious problem. The 5000 series being released imminently doesn't all of a sudden make the 4000 series bad.
I can’t wait for a side by side comparison to the 5080… in my opinion the 5080 is going to be a direct competitor to a 4090 on the used market. I can imagine a market where both are priced comparably. Do I go with the 5080 or 4090 for my next build is my question?
Hey kryzzp, you could add War Thunder to the benchmark list; they added ray tracing recently, and from what I've seen there aren't a lot of benchmarks on it (as of writing this there is only RT support for NVIDIA cards, but they are working on implementing it for other cards as well)
When you have a 4090, you don't need the 50 series or the 60 series; you can just upgrade your CPU to reduce the bottlenecking even more! Imagine in a few years when an R7 11800X3D is powerful enough to fully utilize your 4090 at 1080p!
I've been reading some weird comments about the word revisiting in the title (lol):
I'm not saying the 4090 is old, or slow, or anything like that. I'm just revisiting it because the last big video I made on it was in 2022, and since then many new games with new engines have been released, so I'm testing those now in this condensed video format.
That's all. Have a nice day and be kind to each other 🙂✌️
I AGREE
The title is perfect considering the new line of GPUs is around the corner. Revisiting the last flagship just as the new ones arrive is perfect. Perfect title.
That's all good and all, but
16:42
The last of us part 1: Am I a joke to you?
Haha, I totally forgot about TLOU1 for some reason. Missed that one in this video unfortunately!
@@zWORMzGaming THE FINALS 😥
If the 4090 can't handle a game it is absolutely NOT the 4090s fault
The 4090 can run every game maxed out except path-traced ones (above 1080p), which the 5090 can't do either.
Simple math = keep the 4090 and enjoy the DLSS 4.0 buff.
DLSS 4 coming to 40 series too? Dang @Mrzeseb13
@Mrzeseb13 the 5090 has multi frame generation that the 40 series won't be getting; DLSS 4.0 is coming to 40 series though. But 40 series can just use Lossless Scaling (which sucks, idc what anyone says, I'll never use frame gen unless the latency is less than 10 ms and it looks indistinguishable)
It's the fault of developers not putting in the time and effort to actually optimize their games (reduce draw calls, texture compression, polygon culling, adjusting the amount of rays and de-noising, etc).
Developers need to go back to their roots (e.g. The Order: 1886, Red Dead Redemption 2, Jet Set Radio, Gran Turismo 4, Forza Horizon 5). Some of these games are from the early 2000s, and I included those because they ran at 60fps on original PS2/DC hardware while still looking good.
@arivelo The way multi frame gen works is that it waits for two frames to be rendered, then squeezes a generated frame in between. So it doesn't reduce frame latency. COMPROMISED EXPERIENCE.
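A minimal timing sketch of the point above (the half-frame hold-back is a simplification of how interpolation-based frame generation is commonly described, and the numbers are hypothetical):

```python
def present_times(render_fps: float, interpolate: bool):
    """Return (displayed fps, extra ms of latency on real frames)."""
    frame_ms = 1000.0 / render_fps
    if not interpolate:
        return render_fps, 0.0
    # The interpolator needs real frames A and B before it can build
    # the in-between frame, so B is shown ~half a frame later than it
    # would be without frame gen: more motion fluidity, worse latency.
    return render_fps * 2, frame_ms / 2

# 60 rendered fps -> "120 fps" on screen, but real frames arrive later
fps_shown, extra_ms = present_times(60.0, interpolate=True)
```

So the displayed fps doubles while the latency of the newest real frame gets worse, not better, which is exactly the complaint.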
4090 is not gonna lose that monster GPU status anytime soon
yea, i think its the only gpu that will have the 1080 ti reputation
5070 Ti wipes the floor with 4090
@d4nith3 keep glazing, even 5090 is only around 30% faster than the 4090 in raw performance.
LOL. Nice joke, fake frame lover boy @d4nith3
@d4nith3 Enjoy ur AI frames jimbo
The worst thing is that YouTube is reducing quality, so we can't see how games actually look with these settings
I mean you can't really blame them, uncompressed videos are very huge, and they have millions, maybe billions of vids
@Alsoldalssan122 This is one reason I record at 100 Mbps 😆
lmao u act like u would actually see the difference
@@hayvxyz yes you would
@hayvxyz 4K YouTube/Netflix is less detailed than 1080p Blu-ray in terms of bit rate.
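Rough bits-per-pixel math behind that claim (the bitrates below are ballpark public figures, not exact specs: roughly 45 Mbps for streamed 4K video and 35 Mbps for 1080p Blu-ray video):

```python
def bits_per_pixel(mbps: float, width: int, height: int, fps: float) -> float:
    """Average bits spent per pixel per frame at a given bitrate."""
    return mbps * 1e6 / (width * height * fps)

# Assumed typical bitrates, not exact specifications.
streamed_4k = bits_per_pixel(45.0, 3840, 2160, 60)   # ~0.09 bits/pixel
bluray_1080 = bits_per_pixel(35.0, 1920, 1080, 24)   # ~0.70 bits/pixel

# The disc spends several times more bits per pixel than the stream,
# which is why 1080p Blu-ray can look cleaner than 4K streaming.
```

The exact ratio shifts with codec efficiency (AV1 vs AVC), but the per-pixel bit budget gap is large enough that the comparison holds.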
If the 4090 can't handle UE5 it ain't a 4090 issue, its a UE5 issue
It's not either of their faults, it's the game devs'. UE5 has the features and ability to be very optimized; devs just use those features to take shortcuts for good visuals but bad performance, since they fall back on upscaling and FG. Not upscaling's or FG's fault either.
@kleo1307 nah, UE5 definitely deserves a lot of the blame. It's an engine made for a very specific target (destructible and dynamic environments like Fortnite) being advertised as a universal engine. Devs can turn Lumen and Nanite off, of course, but when most games made with the engine run so badly, eyebrows start to raise.
Don't forget how long it took for them to fix shader compilation stutter too.
I've heard its TAA is awful as well.
@Rafael57YT UE5 literally relies on TAA to function. Look up how the Matrix demo looks without TAA. Everything jitters there. Geometry. Shadows. Shaders. Jesus
@oxxylix504 TAA that is blurry
@kleo1307 It is a UE5 issue. Epic Games ignores developer feedback and provides inadequate documentation
4090 that thing is old news it's all about the 5090 now. Ever since the 5090 was announced my 4090 barely hits 3FPS at 720p
RIP to those who didn't think that was sarcasm. 😂
I know you're joking, but you have no idea how right you are with the upcoming games in 2025 and beyond. That 3 fps at 720p before any frame generation is gonna be a reality.
@olebrumme6356 I know, lazy ass devs and sweet nothings whispered in their ear by Unreal5 and Jensen
@olebrumme6356 Yeah, developers are going to be even lazier with optimization. DLSS was a mistake. Hopefully Nvidia implements something in a future software update where bigger cards cannot use DLSS and only mid-range/low-end graphics cards can utilize it to boost frames. That way, when lower-end cards get the same amount of fake fps as high-end cards, people that bought a 4090/5090 will be angry that a 5060 is getting the same fps; what's the point of buying a high-end card then? That would force developers to actually optimize their shitty games so all gamers get a good experience.
@adityavandari That might just be the dumbest idea I've ever heard; thank god you ain't a dev
Krzzpy what do you think about the RTX 5070M with 8GB VRAM 💀
GTX 1070M dropped a decade ago with 8GB VRAM.
Yeah... I hate it 😅
The RTX 5070 has 12GB VRAM
@@younesKamel-b1o M
@@younesKamel-b1o I think he meant the Laptop Version
@@younesKamel-b1o yes, but the 5070M has 8. M is their mobile lineup which is what you find inside laptops. The 5070M will be what you find in any laptop that says it has a 5070 and it will only have 8 GB VRAM
Can't wait for the RTX 5090 video. I really wanna know how the GPU performs without frame generation, because Nvidia is pushing this AI stuff and I want to see how much of an upgrade it truly is from the RTX 4090 without it
I want to see it against the 5080 as they would be priced comparably.
30-40% faster than 4090
@GewelReal with frame generation, surely, but how would the graphical comparison be? I'd imagine one looking softer than the other, among other differences of course.
I'm betting it's around a 14% increase
@@TransformersHoarder 25%-40% rasterization, based on their specifications. With frame gen and all the typical nvidia gimmicks, you're looking at about ~120%-150%
I know it's beating a dead horse at this point, but man, seeing you go from 100+ FPS on Ragnarok buttery smooth, to Stalker 2 with dips into the 50s is ridiculous. I know they have different engines/different games and all that, but Stalker 2 in no way looks better than Ragnarok and it performs so much worse. Thanks UE5!
GOW Ragnarok has baked lighting.
@@Funnky Double the framerate too. To be fair stalker 2’s budget was nowhere near Ragnarok’s so I’m sure that was a variable too
@@Artafact106 Baked lighting not really possible for Stalker 2 because it's an open world game with dynamic TOD.
god of war ragnarok is a ps4 game
It’s built on the same tech and engine and is pretty much the same as the 2018 one. There’s nothing intensive going on just look at cpu usage while it’s running lol.
38:00 I'm shocked bro. Your precision with the AWP is godlike!
Can we please talk about how great Starfield's VRAM usage is even at native 4K ultra? That's so refreshing to see.
And the textures actually look very good in that game, nice 🙂
@zWORMzGaming They really do. I never understood the people who claim SF is an ugly game. It's a beautiful game. The shadows and textures are top notch. I like the 1970s retro sci fi. And I gotta say, DLSS works great with SF.
They improved it a lot. Plus you also have many patches.
For sure. I do think the game deserves more credit than it gets for its performance optimization right now. Runs quite well actually, and dramatically better than launch
@ThunderTheBlackShadowKitty I should give it another try then, because 1 month after release the graphics looked dull to me. Good in closed spaces perhaps, but outside it was just average 2020 graphics, and some stuff like trees looked right out of a 2014 game. This was at 4K with a 6950XT; the poor thing could not even handle high settings despite being a powerhouse
The only monster I see here is El Kryzzpo. Damn dude, what a huge video👌
Can't wait for your 5090 benches👌
It's crazy to me that a game like Horizon can reach 90 fps with no gimmicks and be absolutely gorgeous, yet a game like Marvel Rivals can't break 70 without any gimmicks and it looks akin to Overwatch, which launched in 2016 and had its glorified graphics overhaul in 2022. Same thing with a game like Silent Hill 2 Remake, etc. etc.
Yeah, no. Never compare multiplayers, which are made to be run at 120+ fps by anyone with relevant hardware 2-3 years later with singleplayers, which target best possible visual output at 30 fps, and everything above is a luxury of newer, stronger hardware.
Also, Silent Hill 2 has gorgeous graphics in every aspect, even in plain raster, rt simply does smoother lighting, sharper reflections and shadows, effects, which Horizon could also benefit from. You're comparing art direction, not every game is supposed to be all bright and super colorful. Final Fantasy and Resident Evil are loved in equal measures.
@@БоднарРостислав Yes but in raw performance terms, there is no reason these games should be performing worse than games with more graphical fidelity. Especially when new gen games look like old games yet perform worse. Most modern generation hardware cannot break into comfortable FPS on games running on unreal engine 5, like marvel rivals from most recent personal experience. I'm not speaking solely on art direction but rather the geometry and textures that accompany a game like Horizon FW. Visuals don't mean anything if you can't comfortably play with them, especially when the pinnacle of graphical processors prior to the 50xx cannot bring out the best in these games.
@@sataniccereal I don't play rivals, but I remember R6Siege performing worse than BF4 in 2015, was it? Nobody cares about whether it did or did not. Everybody plays Warzone even with constant gradual performance drops.
You cannot adequately compare geometry between SH2 and HFW, it's not 1:1, SH2 should be compared to AW2 and RE4, and it holds its ground firm. But starfield can be compared, starfield has better textures, volumetrics, higher polycount on objects and models overall. Like AW2, it's been fixed now and totally playable. Doesn't use UE5 either. I agree, UE5 is either too complicated to configure correctly or has fundamental flaws and a lot of studios don't want a big R&D budget to devise their own engine. Jedi Survivor is running past Horizon too. But UE4 in it has troubles working with new algorithms put on it, which is why it's putting a heavy load on cpu and other studios, seeing this, did not entertain to follow the same path and moved to UE5 instead. I love Horizon as a franchise, but it's not a paragon of graphics. It's a solid 7. And it could use an upgraded 4k texture pack, and maybe developers entertained the idea, but in their render, it would need what? 16? 20gb? They probably decided against it after the controversy with Zero Dawn pc port and its infinite vram hunger.
Still, it's funny how people got angry with Star Wars Outlaws because of things like npcs don't react to your presence and actions, but you can literally jump on heads, swing your spear on npcs with no collision, fail jumps and bump into invisible walls in Horizon. Sure, Horizon has better story.
The 4090 will be a beast for like 5 years now, especially with DLSS 4 coming to it
Even a 2060 can use DLSS 4, just not the MFG. Not even the 4090 will support that.
What makes you think they will release DLSS 4 for 40 series cards? It's Ngreedia we're talking about
What? I thought 40 series isn’t getting DLSS4 but just a few minor features to increase performance a tad.
@tillicumbeach638 They have already said that DLSS 4 is coming to the 40 series; the only thing that's not coming to the 40 series is MFG, which isn't the whole of DLSS 4
@@isxl-iy2gk DLSS 4 is upscaling, not frame generation, but it does reduce latency from what I understood.
Good timing, I really wanted to watch an old vid but I see you posted a new one
Hope you enjoy it!
always nice to see your vids pop up in my notifs, it's a nice wholesome time for me and i just chill and watch you test stuff, for some weird reason it's very relaxing
Wow, I didn't expect a video like this! This thing will stay relevant for a very long time as it has a big performance margin. The 5090 will be faster and will have DP 2.1, but the 4090 is still a monster! Greetings.
I want to see how the 5090 will perform with path tracing
Since it apparently has twice the raytracing speed per core
I would really like to see some actual 5k monitors. 5k 240hz would saturate that dp 2.1 bandwidth.
I have exam tomorrow but this seems more important.
Good luck!!
@@zWORMzGaming Tnx Ill do my best!
@@leexy3395hii😊
Just saying: the heaviest part of Cyberpunk 2077 for the GPU is actually Reconciliation Park, in the middle of the tall trees, by a long shot, especially with ray tracing. Test this. You're probably going to be sad after that 😅
Is that the park in city centre ?
Wonder how the 5090 will compare.
It is estimated to be 25-40% faster or probably something in the middle. 60 series will probably be more exciting due to improved manufacturing process (3nm or less). I’m definitely not changing my 4090 for more fake frames :). Frame generation even on the current card doesn’t excite me. I switch it off for better image quality.
When will you be testing the arc b580? And more importantly will you be testing the new strix halo apus?
Not sure about either!
I've been waiting for the arc GPU to be available here for a month, and it still isn't...
And I hope some mini PC brand sends something with those APUs, or maybe even AMD, but let's see :)
@zWORMzGaming can't wait for you to review them
Proud 4090 owner since I built my new machine in June 2023. Knock on wood, no issues whatsoever. I plan on using this for years to come and most likely won't upgrade until the 7000 series if I can help it.
Now if only games were more optimized :(
Omg look at that power draw you're gonna blackout the whole house the second you plug it in💀
you are actually the greatest benchmarker on youtube, you leave timestamps and everything as well as specs, this is why you are the goat
I got a 4090 last year and I’m very satisfied personally with the performance!
I’m definitely skipping the 5090 and instead saving up for the 6090…
There is no upcoming game that I want to play in which the 4090’s performance would be unacceptable to me.
GTA 6 tho might be the one that will make me upgrade (it should be out around the 6090’s time).
As an RTX 4090 owner, I enjoy playing my games at High settings, not Ultra, with Quality DLSS and some Ray Tracing (RT), but not extreme RT or Path Tracing (PT) due to the performance hit being absolutely ridiculous 😵😵💫🤕! I feel we (Nvidia and AMD both) are a solid decade+ away from having decent RT or PT performance in games with only a minor to moderate hit to performance, and YES, Lumen in UE5 should have a toggle to turn it off until we're getting decent RT performance out of the box!
Path tracing is really beautiful in this game. Don't waste your 4090.
U should test the rtx 3090 and 3090 ti if u can
There is no need for a 5090 if you don't plan on playing with native RT/PT with everything on Ultra at 4K. The 4090 got more than 60fps in all of these games at maximum settings without RT, PT, or DLSS/FG, and still around 55 in the 2 most difficult ones, which have mesh shaders always on anyway, and DLSS Quality doesn't hurt the image at all, so you may as well use it on them. In most of these games it even got more than 60fps with RT on, without DLSS or FG. There is no need to upgrade at all
I'd sure hope it got more than 60fps in all of these games
@tiobiovr It literally did?
Bought a 4090 from a friend of mine 3 months ago, for a very good price, incl. 1 year of warranty left. And after seeing the insane prices of the new 5090 card, I'm glad I did 😅
I have zero use of 4x fake frames, as I can't even stand using 2x 😂
How much did you buy it for?
I bought one too from a friend, with 2 years of warranty, for €1100.
You're lucky, there are no more 4090s around so I have to get the 5090 :( Fingers crossed the 5090 will work at all
@@frallorfrallor3410 The RTX 5090 will most likely be very good, don't worry friend. Penguin has an RTX 4090 and still isn't going to replace it right away anyway; the RTX 4090 is more than fast enough for Penguin's work and gaming.
This is such a good rundown of the best of (soon to be) last gen. Looking forward to the 50 series cards to come
You should have a look at the rtx 2080 ti sometime, it's still a very good gpu
2080ti my 🐐
I cant wait for you to test the 50 series cards in a few weeks. This was another amazing video as always:)
This is an insane benchmark showcasing the RTX 4090. Even though the Nvidia Showcase of the new RTX 50 series manages to overshadow this one, I still think this card will stand the test of time very well and maintain high frame rates. Well done, sir, for showing the true power.
Glad you found it interesting! Thank you 🙂💪
love your videos man 😍😍😍😍
Do a video like this for the 4080 super please
Great vid! Any chance you could make a video giving your thoughts on Lossless Scaling from steam?
Maybe, I'll see if I can make it before the 50 series comes
Do you plan getting the RTX 5090 and 9950X3D?
With 16 cores you'll have a complete PC.. the 9950X3D all the way, it's the top
5090 and 9950x3d will be the first true workstation level performance for average people imo
@@ٴٴٴٴٴٴۥ Gpu 30% better and cpu 10% better. Not much..
@@user-wq9mw2xz3j That's fair, but even with that somewhat minimal boost there is very little it won't be able to handle
It might sound strange, but you could also do livestreams while testing a GPU? That would be a great idea, wouldn't it?
I'm moving to a new house in a few months and I'm planning on doing live streams there. It'll be fun 🙂
@zWORMzGaming Sounds good 👍 It would be nice if you tested GPUs in livestreams and we could chat with you in real time. I'm pretty sure it will boost your YouTube channel even more ;)
Please, could you revisit the RTX 4080/4080 SUPER? It was a while since you did a video with it.
The forty-ninety is massive.....JUST LIKE THE LOW TAPE-
😅 😐
Yo are you going to test the RTX 5090 once it releases?
what do you think?
If nvidia or some aib sends him one lol
We just ignoring how bro started to pop off in Bo6?
Been watching your videos for about 2 years now and I am about to actually build a cheap GTX 1660 Super build. I am only gonna end up spending around $150 overall due to the fact that my friend is giving me his old GPU (the GTX 1660 Super). Anyways, that part was kinda pointless to add, but keep going, you make AMAZING videos!
happy for you friend
Pretty cheap PC, very nice of your friend to give you a 1660S, still a relevant GPU.
Enjoy it 😁
It is said that in the MSI Afterburner application, the GPU temperature and power consumption indicators can cause a significant drop in FPS and lead to frame drops. Disabling these features results in a performance boost. If anyone is unaware of this, they should disable this setting in the program, or alternatively, use another program. I hope the channel owner reads this message.
Did you even watch the video ? he mentioned this like two times.
Revisiting RTX 3080Ti please
When doing the game tests, playing games at 1080p and 1440p resolution on a 4K display makes the quality a little lower and the FPS higher. No one plays games at 1080p or 1440p resolution on a 4K display with a PC made for 4K. Real-world testing means testing games only at 4K resolution, with a 4K monitor, on a PC made for 4K gaming; testing games at 1440p resolution, with a 1440p monitor, on a PC made for 1440p gaming; and testing games at 1080p resolution, with a 1080p monitor, on a PC made for 1080p gaming. If you want real-world testing, you need to build 3 PC setups with 3 monitors (1080p, 1440p, 4K)
I love watching your videos. I've been watching for years, and I’m finally going to buy my first GPU with the new RTX 5090 and build my first PC. Thank you so much for all of your hard work!! 🙏🏼
my day just got better because of a new kryzzp video
Your best bet to get a 4090 at a good price is to buy used in your local area.
There's people selling stripped 4090s as working though, and this has affected both Amazon and Ebay.
Waiting on 4090 vs 5090 benchmark.
@adityakshirsagar5383 I'm snagging a 5090 on launch day. I'm not upgrading I'm doing a new build.
That was some fancy shooting. You're pretty good.
Thanks!
Hello kryzzp how are you, when is the setup video coming out ?
Hey! Feeling better now :)
It'll come this week!
Bro I love you. You know I am the og Subscriber 🎉 we have talked so many times and thank you for the pc advice ❤
Any time! 😃🍻
Are you going to buy a 50 series card?
They will probably send him one to test, like they did with the 40 and 30 series
@@teamexegaming4244 sad that he did not get 3090
@@Gargentwitchongargee yeah but he got the 4090 so hope this time he gets the 5090
@@teamexegaming4244 I hope he gets 5090 because he is more popular.
Wow... your Counter Strike skills are outstanding.
can u test rtx 3080 ti?
He did already a while ago, I want to see him test the 3080 (10 GB).
How do u get 450 fps and not screen tearing in counter strike? U dont have 450hz monitor do you ?
There probably is screen tearing, but it's harder to notice at those FPS
Oh thank you so much. Because it's so fast, it's impossible to notice, now I understand. Thanks. Do you use G-Sync? I'm new to those PC settings and stuff so I would like to know. @@zWORMzGaming
1000000 times better than the fake 5070 with 12 gigs and only DLSS and AI.
And 10000000 times more expensive than the 5070 😅
not really, 2x better in best case, only 1.5x or 50% better in general.
Great video as always. The AWP gameplay was chef's kiss.
Question. When you are referring to gpu power utilization when it relates to x3d cpus, whats that all about? Just built my rig with a 9800x3d/4090 and curious as this is the first im hearing of it. Might want to look into it myself.
Thank you!
Some people found that having the GPU power usage displayed on the OSD makes ryzen x3d CPUs stutter more. I haven't found a game where that happens yet though. And it's more common with the 9800X3D apparently
@zWORMzGaming ty for info I'll have to look into it some as I own that cpu and use the osd.
The 3090 Ti is sadly the last proper non-FG GPU. No doubt the 4090 will be fairly similar in rasterization performance to the 5090.
The difference seems to be 20-25%.
Just bought a 4090 a few days ago. Seeing a new benchmark today is nice
Imo every game should perform like Hogwarts Legacy on a 4090, 100ish fps in AAA games, but they barely go above 60fps, so I think there might be something wrong with game developers.
Hi, what is the name of the testing overlay you use? Thanks!
hello kryzzp
Hi 👋
Hey Krzzpy, really nice to see the strix 4090 on your channel!
Could you test Avatar: Frontiers of Pandora (with Unobtainium settings) on the 4090 as well? I would really like to see that.
But as always, a great and entertaining video! 🙏👍
GOODBYEEEEE BOOOOB! 💥🔫
Hey Kryzzp, the 7900 XTX with the latest drivers can trade blows with the 4090. I'm still thinking about getting a 7900 XTX Nitro+ to upgrade from my 5700 XT Nitro+, but also thinking about the 9070 XT if the price/perf is good (but it has only 16GB of VRAM, and I want to play at 4K)
No it absolutely can't, the 7900 XTX is not even able to beat the 4080S in pure raster, and in RT or PT it is 40% weaker.
Throw in some AMD fanboys, all we need now are the Intel B580 fanboys. Competition is always good.
@@enricogiunchi7171 7900 xtx is 10 % faster than 4080 and 20 % weaker than 4090.
@@brasiliafc2138 No it isn't, it is barely 0 to 5% faster than the 4080S and at least 20 to 25% weaker than the 4090. Being 20% slower than a GPU is NOT trading blows with it; a GPU is trading blows with another if it is less than 10% weaker than that card, not 25.
A monster like this will be still relevant even in 5 years. Now let's see the new born, RTX 5090 😉
7+ years if we're talking about 1080p max settings 60 fps
@@hamzakhalil-gs7oc 7+ yes, 1080p max settings 60FPS+
My first time commenting on YouTube▶️
Welcome! I responded to the wrong comment previously 😅
Thank you😊
I'm so hyped about the 5090 comparison. I believe that apart from the DLSS 4 performance, it's gonna be pretty much the same FPS-per-watt performance.
From what I saw it consumes about 500-530W, meaning it's around 15% higher than the 4090, but it's also "just" 15% more fps in most games. In conclusion it would be more future proof, but it's a 1:1 upgrade, which isn't really that great.
Big advantage is the dual-slot design though, the 4090 is a real brick; good thing they downscaled it again for the 50 series.
The RTX 4090 will likely still be the 2nd most powerful GPU for gaming even after RTX 50 series comes out
It was expensive, yes. Still expensive today. But it gave you an experience far and above any other gaming card you could buy. It trivialises any game prior to the 2020s & with frame gen, it makes even modern games a 4K high refresh experience. For 4K path tracing, we are probably not getting that for at least another generation; a massive breakthrough or a new node would probably be required.
They would need a little more than double the core count of the 5090 to achieve 60 fps in Black Myth: Wukong with full path tracing, and also faster VRAM. I don't know when we will get something like that :/ In 6-8 years?
You're the best, thank you for all your videos :) I love your videos :)
Thank you so much! 😊🥰
❤
Unless scalpers make it unobtainable, the 5080 is going to absolutely kill the market for this card.
I think they will be priced comparably.
You will be able to get the 4090 second hand hopefully at good prices when the 5090 comes out.
The 5080 only matches the 4090's memory PROCESSING performance. The 4090 still has 75% more CUDA cores and 8GB more VRAM. You'll get "more frames" with MFG on, but with 75% of your frames generated instead of rendered, it's going to be a smeary, artifacty, high-latency mess. And keep in mind, the RTX 40 series gets everything from DLSS 4 except MFG.
@@lexustech48 Cores aren't really comparable across generations though. Also the price difference is the main thing. The 5080 will probably be about the same performance as the 4090 in terms of raw performance. The extra vram on the 4090 will help maybe in future games but it won't really matter in most situations. I just don't see the 4090 falling to the same price as the 5080 or below in the current market. Yet again, the 5080 will probably go for way above msrp.
@@lexustech48 that’s cool to know as we have two 5070 Super 12GB gaming PCs. But a 5080/9800X3D is easily in my near future.
You should try reverting back to an older driver.
I was having lots of stuttering issues with the newer Nvidia Drivers, but after reverting to the driver 556.12, almost every stutter was gone (except for the shader compilation ones of course).
Do you know how the California wildfires happened? Because Nvidia engineers tried to overclock the RTX 5090 to 3GHz! That's why this gen they had to lower the clocks on the 5090!!
Hey Kryzzp! Long time viewer here (I've seen almost all your modern videos and some old ones!) I wanted to ask:
Do you think mobile GPUs getting no more VRAM than the mobile GPUs from previous generations is acceptable? For example, the GTX 1060M had 6 GB of VRAM, and the RTX 3060M also has 6 GB of VRAM. The 4060M isn't much better with 8 GB of VRAM....
How badly do you feel that affects performance? Just curious about your opinion. Thanks!
Hi! Thank you 😊🙏
I think we're not seeing the improvements in VRAM amounts that we should be seeing. It seems like the 5070 Laptop GPU will also come with 8GB which is just ridiculous. Especially with games using more and more VRAM... It all comes down to price points though, a cheap laptop shouldn't have more than 8GB of VRAM because it likely won't need it to play at 1080p medium settings for example. But more expensive laptops, especially on the 70 tier, should've had 12GB even on the 40 series.
I'm actually going to make a 4070 Laptop GPU video for next Sunday, I'll see how much of a bottleneck the VRAM really is then.
Thanks for the comment 🙂
@zWORMzGaming Thank you for the reply!!! I'll be waiting for the next SUNNNNDAAAY VIDEO about that laptop! I'm very interested!
wtf do you mean by "revisiting" when it is the best GPU currently (until the 5090 actually launches)?
Because the last time I made a full video on it was in 2022
Well it was released back in 2022 and ZwormZ also tested this GPU at that time. That was a long time ago.
Can’t wait for the 50 series videos😊
RTX 4090 and 1080 Ti are the most legendary GPUs in history.
Damn Kryzzp you're good with the awp 😭🔥
Can't wait to beat it for $549 smh
Cyberpunk PL is still, apart from Bioshock, Dishonored and RDR2, one of my go-to games to showcase to friends in terms of raw fidelity AND path tracing
Hey kryzzp please do a video just like this but for the 4060!!! It's probably the most bought rtx from the 40 series so it would be very helpful for many people
How is this even a question? If the latest GPU is not a monster in the newest games then we have a serious problem. The 5000 series being released imminently doesn't all of a sudden make the 4000 series bad.
UR TESTING VIDEOS ARE UNIQUE
Come back in 5 years and you will say this was the best card, and you will laugh or cry at the performance.
I have a feeling the RTX 4090 might become the new GTX 1080TI. Meaning it will be able to run games for a LONGGGG time.
I can’t wait for a side by side comparison to the 5080… in my opinion the 5080 is going to be a direct competitor to a 4090 on the used market. I can imagine a market where both are priced comparably. Do I go with the 5080 or 4090 for my next build is my question?
This exact GPU is £2,300 at Overclockers UK and that’s on sale from like £2,600.
Mental prices.
hope you get to review the 5090 when you can 🙏 i wonder how much it actually will be better than the 4090 especially with that price of 2,000 😭
Fingers crossed! I hope I get one to review as well!
Hey kryzzp, you could add War Thunder to the benchmark list; they added ray tracing recently, and from what I've seen there aren't a lot of benchmarks on it (as of writing this there is only RT support for NVIDIA cards, but they are working on implementing it for other cards as well)
Are you gonna make a video about Lossless Scaling FG 3.0? Looks quite impressive
But can it run crysis?
U need a 6090ti super max to run crysis
4090 is good enough for me. Not jumping to any new GPUs for Years
ZwormZ should open his own gpu store, my man owns every gpu he's reviewed so far.
When will you upgrade to AM5?
I am on AM5
@@zWORMzGaming ah mb, i meant the 9000X3D series
Not sure, AMD said they'd send me a CPU but it was a month ago already, still waiting
🐀🐀 @@zWORMzGaming
when you have a 4090, you don't need the 50 series and 60 series, you can just upgrade your cpu to reduce the bottle-necking even more! imagine in a few years where a r7 11800x3d is powerful enough to fully utilize your 4090 at 1080p!
It's crazy that Starfield and the TLOU 1 port we were complaining about just a year ago feel like optimized titles compared to UE5 "masterpieces".
Good to see that "Slowfield" became Optimised
The Roundabout is not the most intensive area in the game. That would be inside Organitopia in Dogtown.
Will the RTX 50 series cards be tested when they come out?
In Cyberpunk, if you enter the game with frame generation active and then deactivate it, there is no need to restart the game when you reactivate it