It's the harry potter scar...
..from Wish!
I was wondering what that "N" on your forehead is
@Dorothy Gale I bet it's thunder tattoo. xD
I suppose it's the Harry Potter integration in this game
YouTube kills the quality
I thought it was made by a permanent marker. LOL!
The sad part is that as GPUs get more and more powerful, the devs do less and less optimization. We are doomed!
I wish they could utilize the CPU to its full potential :| You can buy a bigger GPU, but there's no CPU fast enough to brute-force through these RT issues.
They just think "meh, they'll turn on DLSS/FSR" and leave it at that. Lazy devs prefer to let upscalers do their job for them.
They no longer target 60 fps at native resolution; they now expect players to turn on DLSS/FSR just to get close to 60 fps... such a shame. DLSS was supposed to be the cherry on top of 60 fps, to get you to around 100 fps, not this abomination of optimisation.
And with piracy not very rampant these days, the devs just slap Denuvo on it. So pay our price, or you don't get to try our game. Not that I'm all for piracy, but I think it would give the devs a wake-up call if their games were being distributed for free.
The amazing thing is there's been only a slight improvement in game graphics since the days of RDR2 and Shadow of the Tomb Raider. But here we are now with hardware that has double to triple the power, yet we're still getting very similar-looking games, only with half the frame rate, plus a ton of bad stuttering on top!!! I swear devs just can't be bothered to do any optimization on PC these days. It's just a case of "Oh well, they can gain those lost frames back with the included upscaling methods. It also saves us a whole bunch of time, money and effort on trying to optimize it for PC."
Would love to see the 4070 Ti, I wonder if there is much difference.
Great content as always, very complete benchmarks here and very good tests
I'll test that tomorrow :)
Thank you!
I got one and it is basically the same as the 4080 but with lower FPS (ofc). Same CPU bottleneck :( At FHD the GPU isn't fully utilized, but it runs like crap in certain areas. Like Kryzzp mentioned, it was really good before arriving at Hogwarts. Without RT it is buttery smooth. Also, reflections indeed look kinda weird with RT. I hope they fix performance at least :)
@narkomancers5262 I think that's pretty self-explanatory: it's in the same class, it's just obviously not going to match the 4080 in raster performance. It can keep up, just not at the same level; if you want the actual numbers, just wait for Kryzzp's upload :)
@Narkomancers I guess he could argue that it has the same architecture and feature support; other than that it is worse in basically every measure except GPU clock.
@Narkomancers I meant except for the FPS numbers; you get all the stuttering because of the CPU bottleneck, and the GPU isn't fully utilized (just to reiterate: I'm talking about RT; of course it's not the same without RT or when there's no CPU bottleneck).
Just got a 4080 and the 13700k is bottlenecking it at high refresh gaming for a few games! Crazy how powerful the GPU is
I just ordered the 4080 and the 13700 too, it isn't built yet but man I'm excited and hope it runs well for years.
@@Mr450pro how did it go
@@Zenith_Nulls No problems, it runs perfectly and I'm very happy with it. I run it together with an LG Ultra 45" and in every game I play, I max out the graphics and it just runs with no problem.
Crazy how the VRAM usage jumps between ~7 to ~10 gigs depending on the area. People will have massive frame drops even on 3080s.
Developers these days probably think we all have 4090s 😂
Well, they surely optimize their game to barely run on a 4090
Games are made to run at good settings on PS5: either 60 fps at lower resolution, or somewhere between 1440p and 4K at better settings at 30 fps. If your PC is worse than a PS5 at those settings, resolutions or frame rates, then just buy a console and stop complaining. It's not the PS4 era anymore, thank goodness.
In case you don't know, for Alder Lake and Raptor Lake the 13700K has the right amount of cache, cores and clock speed to be the optimal pick. The 6-core parts don't have enough cache to work well with high-end GPUs.
Do the extra 6MB make that much of a difference?
This is still good though, it's 12900K performance at stock, so it's still faster than that when OCed. And I couldn't bottleneck the 4080 in BF2042 even at 1080p, so it's pretty good.
@@zWORMzGaming At around 14:50, for example, on a 13900K I'm averaging 130+; it seems the i5 dips way lower due to the lower core count/cache. It also helps a lot that mine runs 5.8 GHz all-core, but the higher-tier CPUs do help a lot in this game :)
Obviously you need 8 cores to pair with a top-end GPU, it's not even a question.
Have you done a 7900xtx review yet? I was looking the other day and didn't see any on your channel.
no
You can turn DLSS OFF and keep Frame Generation ON. You didn’t realise this when testing 1440p. The best way to play this game is 1440p Ultra, Raytracing + frame generation. 100-130 fps constant on a 4080 / 5800x3D w/32gb RAM
is it less noisy that way? Just ordered a 4080 :)
@@El_Deen it’s way better! DLSS kills quality and makes the game blurry, hate it tbh! Frame generation is perfectly clear
what do you use for the benchmarking? (the little windows to measure fps, temps and utilization)
msi afterburner prob
Bro I always enjoy your videos. I cannot sleep peacefully without watching your videos, much love
Hogwarts legacy stuttering issue is better than FORSBROKEN and STUTTERNITE 😁😁😁
Yes it is
It doesn't stutter whatsoever. If it does, it's your own hardware, not the game.
@@MaDDeX93 Did you even watch the video? And you can't say it's his hardware, as he's using a 4080 and a 13600K.
@@MaDDeX93 Are you paid to post comments like this? My god, what nonsense.
Don't forget CYBERBUG
I have been waiting for this video. Thank you!
Thanks for watching!
Just wondering, at 4K what's the general DLSS mode to use? Is it Quality, Balanced or Performance? People tend to jump straight to Quality, but... yeah, what do you think?
With DLSS 2.5.1, even Ultra Performance can look decent at 4K. Personally, I always start with Quality and only go down if changing some settings from Ultra to High didn't give enough FPS. I finished A Plague Tale: Requiem on my RTX 3080 with DLSS Balanced and it looked pretty good while keeping 60 FPS most of the time.
I have used 4K on a 3070 and now on a 4090, and let me tell you that at 4K, DLSS Performance still looks better than 1440p native while performing better. I use Quality as long as I'm over 60 fps all of the time, and I enable frame gen when available, as it hardly impacts visuals, has extremely little artifacting and gives you 30-110 extra fps depending on the game.
Performance. Always Performance. There is literally no visual gain from running it higher. At 1080p run Quality, 1440p Balanced, 4K Performance.
Not sure if this rule of thumb is from DF or somewhere else, but I do 1080p Quality, 1440p Balanced, 4K Performance, 8K Ultra Performance. And if you need DLSS Performance at 1080p, I'd just consider it unusable.
@@Jakiyyyyy I picked up that rule of thumb from Digital Foundry, in the Cyberpunk optimised settings video.
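For context on what those presets mean internally, here's a minimal Python sketch; the per-axis scale factors are the commonly cited DLSS defaults (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333) and should be treated as assumptions, since individual games can override them.

# Rough internal render resolutions behind the DLSS presets.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(f"4K {mode}: {internal_resolution(3840, 2160, mode)}")
# 4K Performance renders at roughly 1920x1080, which is why the
# "1080p Quality, 1440p Balanced, 4K Performance" rule works: the higher
# the output resolution, the more aggressive a preset you can get away with.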
What software do you use for frame data? Is this Afterburner?
I feel like a lot of newly released games aren't well optimized anymore. You'd think that with the newest and greatest GPU, you would see a lot more games perform like butter at 4K Ultra with RT on. Ray tracing only feels usable with DLSS on, IMO.
Why does it make no sense to buy a 4080 for 1080p? Or only for this game? I mean, I have a 1080p 165Hz gaming monitor, and I do think in a lot of games it feels better to have that 120+ fps smoothness than 60 fps with higher-quality visuals. Apart from that, if you play poorly optimized games (like Escape from Tarkov or Rust), you can end up below 100 fps even with a 4080 at 1080p :D
I don't have an RTX but I kinda enjoy watching your videos. Purpose of this comment is to appreciate what you're doing. I like to stay up-to-date on tech n gaming.
I bought a 3060 like 4-5 months ago.. With all the new games like this one out, should I upgrade for 1080p gaming?
Some people are saying there are memory leak issues when changing settings mid-game, which can peg your GPU at 99% with way lower performance. I've seen the pegging at 99% at half the previous FPS myself after changing settings. Interestingly, it's the same issue I've had in Cyberpunk and The Witcher 3 when using DLSS.
There's definitely a memory leak. The game uses 98% of my 16GB of RAM; it lowers to about 80% after I play for a few minutes, but it's still really annoying.
yeah my wife pegged me yesterday
Hey man, I have a question. I see you have a Z690-P as well. Didn't you have any issues with the video signal when installing the 40-series cards? My 3060 Ti broke and I got a 4070 Ti, and it didn't output video through HDMI, so I had to connect the HDMI to the motherboard to get into the BIOS and fix it. Now the GPU displays video, but it totally skips the BIOS splash screen and I cannot enter the BIOS anymore, even though I absolutely spam both F2 and Del at start-up. The GPU is good, I've tested RDR2 with it, I had 100-120 fps maxed out at 1440p with no DLSS, and 60-65 degree temps, so I'm not sure if it must be RMA'd. I've flashed the BIOS to the latest version and installed the latest drivers for the GPU.
Hi, I didn't have any issue, it has worked ever since I installed the 4080 for the first time, and I can get into BIOS with it as well.
It could be the monitor, the motherboard is compatible 👍
@@zWORMzGaming I don't think the monitor is at fault, as I tried with two different TVs; the GPU just doesn't have a signal when booting up. It is indeed strange. If I connect the HDMI to the motherboard, it works perfectly, but if I connect the HDMI to the GPU, it totally skips the BIOS, the input is not registered and it boots straight into Windows. The weirdest part: if I shut down the monitor and turn it back on, the signal is lost until I restart the PC.
Hey Kryzzp! I'm sure this has been asked before, but what capture card do you use?
hi, it's the elgato 4k60 pro mk2
@@zWORMzGaming thank you brudda 🙏🏿
So now developers are relying on upscaling tech so that they don’t have to optimize the game much on the GPU side of things and frame generation so that they don’t have to optimize much on the CPU side of things…great.
Very much that. They rely on DLSS/FSR to do their job for them, making these 'next-gen console only' ports far, FAR more demanding than they should be.
Even with Framegen it runs like crap. It's full of stutters.
@@upicked9598 Haven't experienced a single one on my 4090. 1440p, RT ON, Ultra settings, NO DLSS or FG. Zero stuttering, lagging, or high latency. Runs flawlessly at about 100 fps.
@@brandonsturdevant5415 Opposite experience here. 1440p on a 4080, I get lots of frame drops from 170 to 110 constantly.
Do you get stuttering when playing this game? Even at high FPS?
Are you sure FSR was off at 4K Ultra with RT on? Can you double-check? (Enable and then disable DLSS.)
Got a 4080 as well; in Hogwarts I'm having difficulty getting rid of the screen flickering when I move the camera...
So I have this on PS5, but I built a gaming PC with a 4080. Is this game worth rebuying on PC? Will it look that much better in 4K?
Hello Kryzzp, hope you are doing well. I just want to ask how you overclock your CPU and memory (RAM) every time. Please make a video tutorial on it.
The issues faced in this "pre-launch" release are still way better than what we faced with the Cyberpunk release.
I'm not getting results as bad as the ones here in the video on my setup. I'm on a 4080 with a 7600X, waiting for the 7900X3D to drop in a few weeks.
If I slow to walking speed before entering a door, I don't get the loading circle or stuttering. Patch updates and cache optimization would do this game wonders.
Also got the same, what settings are you running the game at?
@@mofarah3488 5120x1440, DLSS, with everything on Ultra. I recently turned RTX off because it wouldn't report my in-game FPS until I turned it off. My install is an upgrade from Windows 10 to 11.
I have also noticed that if I fast travel to the school I get massive stutter and pop-in just after, but if I broom or walk to the school I get no stutter at all. It's definitely an asset loading/cache issue. Still very playable because it never happens when I'm in battle, just exploring.
@@phillysupra Thanks man, yeah I've heard that a lot. I hope the devs will day-one patch it ASAP!
Is the stuttering better with medium settings?
I have a 7700X and a 3080 12GB and I'm hoping for a 1440p 100-120 fps experience with medium settings 😅
yeah, it still stutters but it is better on medium. I think you're spot on with that prediction!
@@zWORMzGaming Thanks!
But I think I'll wait another 6 months or so.
Most games aren't worth buying at release for full price these days.
I'm very sensitive to bad frame times and stuttering.
So I hope it will be fixed by then.
First, and love your vids as always 🐐👋
Best gaming YouTuber atm? I sure think so.
Hey man, can you also test how it runs on older but still capable cards, e.g. GTX 780, 950, AMD R9s, etc.?
Edit: If possible, also on old CPUs like 3rd gen i5/i7. Just curious about the level of support this game has for older hardware.
Yeah 😉
The GTX 780 is like a 1050 Ti with less VRAM. Capable 🤣, yeah no. Time to move on. 3GB of VRAM can't even display anything above N64 textures in this game.
@@theoldpcgamer77 Dude relax, I use a 580 8GB. I just want to see how the game performs on older hardware, as I saw someone playing it at 1080p with 30 fps on a 750 Ti and it raised my curiosity.
@@GlitchedGame0 Yeah, the 750 Ti has textures so bad it's 🤣, and at 540p upscaled with FSR for barely 30 fps I wouldn't call that playable by any means, and your eyes will turn to cancer.
@@theoldpcgamer77 It's that bad huh? 😂 Wasn't really paying attention to the details, just saw 30 fps. But is it really due to low VRAM, or does the game just look like crap on low texture settings?
Did you increase any voltage for the 13600K OC?
Left it at auto and the motherboard chose to set it to 1.42
My settings on a 4090 at 1440p: DLAA on, FG on, 144fps cap, everything Ultra, RT on but no shadows (they have issues), plus some additional RT tweaks in engine.ini:
[SystemSettings]
r.RayTracing.Reflections.ScreenPercentage=100
r.RayTracing.Reflections.SamplesPerPixel=1
r.RayTracing.Reflections.MaxRoughness=0.7
r.RayTracing.AmbientOcclusion.Intensity=1
If I play it at 4K on my G1 I would just switch from DLAA to DLSS quality (with newer DLL).
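If anyone wants to apply those lines, here's a minimal Python sketch; the config path is an assumption (the usual UE4 per-user location for Hogwarts Legacy), so check your own install and keep a backup.

import os
import shutil

# Assumed UE4 user config location for Hogwarts Legacy -- adjust if your install differs.
ini_path = os.path.expandvars(
    r"%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini"
)

tweaks = """
[SystemSettings]
r.RayTracing.Reflections.ScreenPercentage=100
r.RayTracing.Reflections.SamplesPerPixel=1
r.RayTracing.Reflections.MaxRoughness=0.7
r.RayTracing.AmbientOcclusion.Intensity=1
"""

shutil.copy(ini_path, ini_path + ".bak")  # keep a backup of the original file
with open(ini_path, "a", encoding="utf-8") as f:
    f.write(tweaks)  # append; merge by hand if a [SystemSettings] section already exists
print("RT tweaks appended to", ini_path)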
One of the best-looking games I've seen so far and one of the worst RT performances I've seen. Hope they optimise it soon! Great vid once again!
No. The game is ass cheeks.
How is your CPU temp so low? Mine gets to 95°C sometimes and sits around 85°C otherwise, wtf.
Hey guys, is anyone else experiencing really terrible ghosting in this game? Anyone find a fix?
Can you explain how you know you're CPU bottlenecked at times? Your overlay shows you're not using much CPU, so what is the bottleneck you're referencing?
The game isn't optimized to run on all of the i5's cores so it relies on a few fast cores alone.
If those cores were faster, we'd get more FPS and higher GPU usage, so it's CPU bottlenecked anyway.
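One way to see that kind of bottleneck for yourself, a minimal sketch assuming the psutil package is installed: total CPU usage can look low while one or two cores are pinned.

import psutil

# Sample per-core load for a couple of seconds while the game is running.
per_core = psutil.cpu_percent(interval=2.0, percpu=True)
average = sum(per_core) / len(per_core)
print(f"average CPU usage: {average:.0f}%")
print(f"busiest core:      {max(per_core):.0f}%")
# If the busiest core sits near 100% while the average is much lower, the
# game's main/render thread is the limit: FPS stops scaling and GPU usage
# drops below 99%, even though the overall "CPU usage" number looks harmless.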
Which drivers are good, Game Ready drivers or Studio drivers?
game ready drivers
I love how Raytracing barely looks any different but still consumes 50% of the frames.
Tell me one game, just one, that is built on Unreal Engine, doesn't stutter and uses an "appropriate" amount of memory. This game takes more than 16 GB to run! That's insane for a game that doesn't even look THAT good. Shadow of the Tomb Raider looks better than this.
That's what I said in his 3060ti video
This game doesn't stutter. If it does, it's your own hardware, not the game.
@@MaDDeX93 it stutters a lot after the first 1-2 hours of gameplay
No it doesn't haha, you're just mad that you got a shit PC
Tbh I expected more from the RTX 4080.
Nvidia cut this card down a lot from the 4090.
What FPS counter program do you use?
What frequency are the efficiency cores and the ring bus overclocked to? And the RAM should also be overclocked, the gains would be great.
This CPU doesn't have enough cache, so it will make zero difference. The 13700K is where the cache amount is good enough, and the 13900K will make no difference clock for clock over that.
hey, which software do you use to display the fps and temps pls?
MSI Afterburner
Please test a GTX 1660 Super. I don't know if I should buy it; I'm scared that my PC (the GPU) won't run it well..
My specs are:
GTX 1660 Super
16GB RAM
Intel i5 11th gen
It's a shame that games nowadays are so badly optimized.
As long as the money flows, they won't care that much.
@@sanji663 Yeah, I just miss the days when you could buy a game that was fully completed and when the game designers cared about the finished product. Now they put some unfinished crap on the market and they don't care about it.
@@Th3_Dark_Side Developers nowadays rely on powerful hardware, like all the RTX or RX cards, and DLSS (and frame generation for 4K) to make their game playable, instead of optimizing it even a little bit.
How are you using a Corsair RM750i 80+ Gold with a 4080 and a 13600K?
Thank you for testing Nvidia Reflex off.
By the way, I wouldn't pet a cat or any animal for that matter in Hogwarts. It might be a teacher, or worse, a creepy Animagus dude 😁😁😁
I have a question. Is Windows 11 really that bad?
The performance... maybe it's because of Denuvo?
The stuttering.. I think it's because of "Denuvo", the protection system in the game.
Isn't it because of Unreal Engine 4?
@@Jakiyyyyy Unreal Engine 4, bad optimization and Denuvo.
Love these videos (part 2)
Is 4080 super more of a 4k or 2k card?
4k and DLSS? just lower the resolution instead of using dlss?
Why? It looks like native res at 4K with DLSS lol
how do you get that status ui in the top left? what software is that
Msi afterburner with custom rivatuner statistics overlay
Omg that screen that game looks so beautiful
You are the benchmark G.O.A.T
9:50 Bro what the heck, just 250 watts? That's insane.
Keep in mind part of the "poor" performance is due to bad optimization on the port to PC. There is a day one patch that should improve some of this.
I hope so
I have only just noticed, but are you playing under the stairs? Haha. It looks like your head is hitting the ceiling. How fitting, lol.
I think it's finally time to consider buying a 4K monitor the next time I upgrade my GPU. I understand why these FPS numbers may not be what some expected, but damn, the beauty of this game at native 4K 60 fps 🥵
Any way we could possibly get the overlay file for RTSS that you use?
I just love the simple look of it!
@@taekwoncrawfish9418 Nah, it's his own custom RTSS overlay, bro.
@@taekwoncrawfish9418 Brother in Christ, the NVIDIA one is so obvious, maybe look it up before you so blatantly lie :)
I just bought a PC with an AMD 5900X CPU and an AMD RX 5700 XT reference GPU. How much would you say this is worth?
I don't know the other components but I guess a well equipped used PC with those specs is worth around 700-800€
200 for the GPU, about 300 for the CPU and everything else will probably cost 200-300
@@zWORMzGaming Not sure about the power supply or fans, but the case is a Corsair 4000D. Thanks though! I just wanted your opinion since you know a lot about PCs.
At 1440p Ultra with a 7950X and a 6950 XT I get considerably better performance, playing on Linux using Proton. EDIT: RT is for suckers, shiny gooder right? It ruins the game by turning everything into a mirror, and performance drops through the crapper, in EVERY game.
You are just saying this because you can't run RT🤣
@@ese_andris I can in fact run RT just fine. Got a 7900 XTX now, which has roughly the RT performance of a 3090, and for the record, RT is still highly overrated.
do you think it would be worth it to upgrade from a 5800x to a 13600k?
No, the 5800X3D is pretty much as good as this for gaming and you can keep the platform.
I'd wait for Ryzen 7000x3D though
@Od1sseas yeah I was saying that if he wants to upgrade to the i5 he should consider the 5800x3d and keep the AM4 platform
Just got my 4080 yesterday and I've been testing it on HL. After enabling FG with 1440p ultra I've never dropped below 130fps.
What a marvelous technology from Nvidia.
Great video as usual 🤩
Very nice
Holy shit, the word "optimization" doesn't register in developers' brains.
This is why I buy games after a couple of updates.
Kryzzp: Performance is horrible, maybe I should turn off ray tracing on a $1200 GPU 😂😂🤣
hi everyone ❣
1) Nice video as always, 2) which software is that for capturing CPU and GPU stats? Thanks all.
MSI Afterburner with RivaTuner
@@itsvlend ty man
At 4K you said it was dropping into the 50s fps, but I never saw it drop below 60 fps???
I'm honestly loving this game, and I've never really been a Harry Potter fan, but I think that's just because Daniel Radcliffe annoyed me.
I like the little lightning bolt on your forehead😂
Finally someone gets it!
Hello hope all is well,
Could you test a 6800XT in this game?
How many wands does Kryzzp have?
Why the Quality preset at 4K? Run Performance like any normal person, so it uses 1080p resources to render 4K. 1080p Quality, 1440p Balanced, 4K Performance. There is no visual difference between Quality and Performance at 4K, but a huge FPS difference.
Oh there is a big difference in quality.
Damn, I really need to get me an RTX card.
Join the club. I won't even be able to play it well with my 970.
thumbnail broooooooooooooo 🤣❤️
What's causing the stutters? What is the input latency with DLSS 3?
Probably not enough VRam.
can you test rx 6600 next please?
Can’t you play 1440p on a 4K screen with black bars to get a native 1440p experience? Or maybe windowed mode?
Lock up developers with only a laptop 3050 and force them to get 60 fps at 1440p or they can't release the game. That should be the baseline.
Or Nvidia could whitelist games after release and make them detect when they are being used for development and only present a quarter of the resources?
Best intro
You're a wizard zWORMz
😃
Kryzzp, did you get a weird tattoo on your head??
it's the harry potter scar
Is this GPU worth $1,200?
2 videos in one day, now we're talking.
DLSS won't work with my 3080 Ti, I get the same FPS native or with DLSS.
I just leave RT reflections on and Ultra settings on my 4080. I get about 80-120 fps at 4K with DLSS Quality and FG enabled. Some stuttering but not terrible... needs to be patched soon.
What CPU and RAM do you have? Is it on ssd or nvme?
@@HeroStrike 13700k ddr4 64gb 3600mhz Gen4 nvme ssd
Frame Generation gives me only about 10 extra FPS in Hitman 3 at 4K Ultra settings with RT on; I get 43 without and 56 with FG. Is that normal? I have a 4080 and a 5800X3D with 64GB of RAM.
Yes it is, in a few games it doesn't make a big difference
@@zWORMzGaming oh alright, thank you !! Really enjoy the videos and thanks for the answer 🙂
Yes. Because if GPU usage is already 99-100%, then DLSS FG doesn't give much FPS. For better FPS from DLSS FG, your GPU must be CPU bottlenecked. You can check it yourself: set 720p resolution with your current settings in Hitman 3, GPU usage will drop from 99% to let's say 70% and you will be CPU bottlenecked. Now enable FG and your FPS will be doubled, and GPU usage will rise as well.
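A toy model of that behaviour (the FPS numbers and the frame-generation overhead below are illustrative assumptions, not measured values): generated frames need no game-logic work, so FG doubles the rendered rate, but rendering itself slows down a bit because the GPU also has to build the interpolated frames.

# Presented FPS is capped by whichever side is slower; fg_overhead is an
# assumed fraction of extra GPU work per rendered frame when FG is on.
def presented_fps(cpu_fps, gpu_fps, frame_gen=False, fg_overhead=0.3):
    if not frame_gen:
        return min(cpu_fps, gpu_fps)
    rendered = min(cpu_fps, gpu_fps / (1 + fg_overhead))
    return 2 * rendered

print(presented_fps(150, 45))                  # GPU-bound, no FG -> 45
print(presented_fps(150, 45, frame_gen=True))  # GPU-bound, FG    -> ~69 (modest gain)
print(presented_fps(80, 160, frame_gen=True))  # CPU-bound, FG    -> 160 (roughly doubled)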
Enjoyed it :)
You can use frame gen without DLSS; you just have to select the "DLSS off" option instead of turning the whole feature off completely.
Literally got the same setup as yours, but somehow I can not activate Frame Generation, does anyone perhaps know why? Even reinstalled the drivers etc. didnt work tho...
You need DLSS enabled to turn it on in this game
@@zWORMzGaming It was turned on, quite confusing.. I just turned on "Hardware-accelerated GPU scheduling", restarted, checked and it was on automatically.... thanks anyway!
Benchmark that i5-13600K at 5.6 GHz paired with 6000 MHz RAM in more games, I'd really be interested in that.
A 10900K and a 4090 due to VR, and frame gen saved my ass in getting a locked 100+ at native res in HL.
DLSS 2 is nice when needed, but for RT in single-player games DLSS 3 also comes in pretty handy.
What do you think, would an i5-11400F bottleneck an RTX 4070 Ti at 1440p?
Yes it will!
Yes, if you buy any 4000-series GPU I advise you to upgrade to a 13th gen i5 or higher, or a Ryzen 5 7600, or the new X3D chips coming.
@@manuelmunguia616 naaah too expensive.
I am getting 100-120 fps with an RTX 4080 and i7-13700K at 4K Ultra settings without ray tracing. Wonder if they fixed something after this benchmark?
At 1440p Ultra with no ray tracing I'm almost locked at 165 fps, not dropping much, maybe to 150 at times.
Do you think an i7-13700F will be fine for an RTX 4080? Currently I have a 12400F with that GPU.
Dear brother, are you running an overclock here? Processor? Card memory? In 4K you simply get 3 fps more than my system: 7800X3D + RTX 4080 + DeepCool AK620 + 32GB RAM at 6000 MHz CL30.
Sorry, my mistake, I get 20 fps more than you at 4K native with RT, same exact settings, on a 4K screen: 70+ fps.
There's an .ini change you can make to fix the reflections. It is more demanding than the current RT settings though.