install the old-gen version (this alone will probably get you 60fps if you're on next-gen now, no joke, and the game looks better on low settings; next-gen got one mediocre quest, not worth it). Install it with the tweak mods (you can set the settings that matter higher and the ones that don't matter, like shadow quality, lower than low) and the dll tweak; both are on Nexus. There are a few other performance mods too; you can probably reach 1080p 40fps or 720p 60fps, and either setup will look better and run smoother.
If you really want to experience peak Witcher 3, I strongly suggest using The Witcher 3 HD Reworked Project mod. An absolute game changer in terms of visuals.
Honestly amazing just how much better it looks with the next-gen upgrade. More demanding than Cyberpunk 2077 with PT though due to RedEngine being so unequipped to handle the additional CPU demands of the next-gen update. The RT is also demanding and was not playable at 1440p for me before I upgraded from a 3080 to 4070 Ti Super. This is one of those cases where it's not that it's unoptimized, it's that it can't be further optimized in any meaningful way due to the limitations of the relatively ancient game engine (I'm CPU-limited to 40fps without frame-gen in Novigrad with a 5900x). I'm not that happy that they're moving to UE5 for future games, but they definitely had to do something.
Loved the video and love The Witcher 3! Please test some slower GPUs like RTX 4070 with optimised settings as well. Performance RTGI, High/Ultra settings instead of Ultra+, no hairworks.
For a 10-year-old game with a vast open world it's perfect. Hopefully, if they keep this up in Witcher 4, we can get trailer-level graphics on a 4090. On a 5090 for sure.
Witcher 3 is really interesting when it comes to ray tracing. In Cyberpunk 2077 on max settings with basic ray tracing, my 4070 gets around 75 fps in the city and around 100 outside the city at 1080p. Witcher 3, on the other hand, gives me about 55 fps, so somehow ray tracing in Witcher 3 is much more demanding than in Cyberpunk 🤨
I seriously don't get how people like RT that much man... It always tanks performance only to have some reflections in puddles and different shadows...
I'd like to see this done with Batman: Arkham Knight, the last good game Rocksteady developed. It still holds up to this day visually, even though the anti-aliasing leaves something to be desired.
When the GPU usage is only in the 80%s, does that mean the CPU is the bottleneck? I was under the impression that a CPU bottleneck means the CPU is at 100%, but I've been told by many people that if your GPU doesn't hit 100% then it's possibly a CPU bottleneck, even if the CPU is only at like 30% usage. Is that true?
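It is true: overall CPU usage averages across all cores, but games are often limited by a single main thread. A minimal sketch with hypothetical overlay readings (not real measurements from the video):

```python
# A game's main thread can pin one core while total CPU usage looks low.
# Hypothetical readings (percent) from an 8-core CPU:
gpu_usage = 82
per_core = [98, 45, 20, 10, 8, 5, 5, 4]

avg_cpu = sum(per_core) / len(per_core)
print(f"average CPU usage: {avg_cpu:.1f}%")   # ~24%: looks fine at a glance
print(f"busiest core: {max(per_core)}%")      # 98%: the actual frame-rate limiter

# Rough heuristic: GPU well under 100% while one core is pinned => CPU-bound
cpu_bound = gpu_usage < 97 and max(per_core) > 95
print("likely CPU-bound" if cpu_bound else "likely GPU-bound")
```

So watch per-core usage, not the total: a 30% average can hide one core at 98%.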
This game is so well optimized, and the graphics still hold up really well for a game that released in 2015. It's so sad to see CD Projekt Red abandoning the RED Engine and switching to UE5; there goes performance, I guess, for The Witcher 4. It's sad to see gaming studios abandon the amazing, well-optimized engines they created, and the RED Engine was one of them. Not to mention the engine encouraged the creation of a lot of mods and fully supported the modding community.
Unreal-based games are far more easily modded than proprietary ones lol, especially if the devs don't provide tools. Plus it very clearly looks old and doesn't hold up all that well; Rise of the Tomb Raider came out only a little later than this game and looks much more detailed (the vanilla version, not this next-gen one).
kryzzp, The Witcher 3 ships with old DLSS 3.1; if you manually update DLSS to 3.8 and DLSS FG to 3.7 it will lose less detail. Also idk what's going on with your FG + Reflex, but my 4070 Super doesn't drop to 80-90% GPU usage, and I think it's because of those old DLSS files.
great video like always ❤❤ but hey I know this ain't going to be the ideal test but would you test the all great RTX 4090 in cs2 ? just out of curiosity 🙃
9:15 I think you misunderstood something; The Witcher 4 is expected to release late 2026 or 2027. But fingers crossed🤞 And in a German podcast (the GameStar podcast) the head of quest design talked about the switch to UE5: it's mainly because they plan on doing an entire trilogy, so they want to save the resources they'd spend programming and updating their own engine between each game. And of course it's easier to find programmers who already know how to work with UE5.
I can't wait to see you benchmark The Blood of Dawnwalker. Being a title developed on UE5, I wanna see how beefy a card I'll need to get a smooth 60fps at at least 1440p.
UE5 is the future of AAA games; that's why a lot of studios are using it for newer games. The engine's graphics are absolutely stunning, but I do agree the performance is still not there yet.
It is optimized only for DX11, and it runs very well on DX11. They fucked up DX12 with the next-gen update because this game was built with DX11 in mind from the ground up; it's an adaptation rather than a fully native DX12 implementation.
Bro I just upgraded to the AORUS 4070 Super, fired up this game and set it to ultra and I'm getting a BEAUTIFUL 138fps (I use a 1080p 144hz g-sync monitor)
Witcher 4 will require rtx 6080/90 to run at 4k 120fps with full ray tracing. Thats why the game is not coming until a few years... but the graphics will be insane
I am playing this game in 4k on a 2070 super right now lol. With DLSS Quality, medium settings always above 50 fps. Still looks really good. Pretty happy with that
We have reached the point where the RTX 4090 is limited due to an insufficient amount of VRAM. Maybe we will see 32GB on the 5090? Quadro GPUs with 48GB already exist. At 8K without DLSS the game used 19GB of system RAM.
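Some rough framebuffer math shows why resolution alone balloons memory use. A back-of-the-envelope sketch; the 8 bytes per pixel (one RGBA16F target) is an assumption, and real engines keep many such targets plus textures, BVH data for RT, and frame-generation buffers on top:

```python
# Rough VRAM cost of a single full-resolution render target.
# bytes_per_pixel=8 assumes an RGBA16F (HDR) target; an assumption, not a
# measured figure from this game.
def target_mb(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}.items():
    print(f"{name}: {target_mb(w, h):6.1f} MB per target")
```

8K quadruples the per-target cost of 4K, which is why memory usage climbs so steeply at that resolution.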
No, it doesn't run well, and that's no surprise: it's a 2015 game on a 2013 engine with mods, a translation layer to DX12, and RT on top. Look how unstable this game is stock (1.31); that's why the video settings are so limited in both the old-gen and next-gen versions.
Friend, don't just try DSR 8K; also try DLDSR 2.25x (6K), because 2.25x has superior quality compared to DSR 8K. And if possible, test DLDSR 2.25x with DLSS Quality.
Thanks for watching!
📸 EMEET S800 Product Link (video sponsor) - bit.ly/3BoUK4X
9 year old game retrofitted with RT running like ass at native 4k on a 4090. Brilliant
its almost like ray tracing is graphically intensive 🤔
@@happygofishing Almost like RT isn't worth it in games 🤔
@@199772AVVI almost like you haven't tried it on a good GPU…
let me tell u about quake PT...
@@lawyerlawyer1215 yes, I did try it on a 4080... Was it pretty? Sometimes, yes... The global illumination is awesome but not 100% better than baked-in lighting, considering the FPS cost... And the reflections? They look good in a static scene but are noticeably slow in motion = not worth it
He's doing sponsors now so he can afford the 5090 when it comes out lol
Yes 👀
Just one more sponsor, Arthur one more sponsor 😂@@zWORMzGaming
I heard rumored EU price of €3000 (tax included)
@@UM_88-s2c Well, the RTX 4090 is still over €2000 right now, so...
@@cythose4059 If I didn't love all my GPUs, I would have considered selling my 4090 a month or so ago. I feel the 60 series will be worth the wait if one can hold out.
I have a request: The Witcher 3 next-gen got a huge texture remake from a modder named hulk hogan on YouTube.
It's called The Witcher 3 HD Reworked Project NextGen Edition,
on Nexus Mods.
I tried it: the game looks so detailed, and my 4070 Ti uses all 12GB of VRAM at 1440p with it.
I hope you can show it to the world.
The game looks like it has 2024-level texture detail with it.
I saw it! The game looks like The Witcher 4, and much better than what we're seeing in this video without it
He did a great job
@Jralex876 There are also more mods that will make it look better.
There's a mod for realistic clouds and a mod for realistic 4K textures for Geralt, including his equipment and swords.
Fun fact: the modder works for CDPR and also worked on next gen update for Witcher 3. Pretty cool guy
@frenchtoastenjoyer that's new information for me, and yes, he did amazing work. I really want to see people test the game with those mods; it looks way better than what we're seeing in this video.
tbh I've tested this mod, and what I can say is that it looks good but isn't worth it: huge VRAM consumption and much longer loading screens and texture loading, which leads to a stuttery mess. The mod also costs a huge amount of fps. Better to have fast loading, fewer stutters, less VRAM usage and more fps than just better textures.
here’s a quick tip for anyone looking to boost performance in this game: obviously, use dlss. you can swap rt ambient occlusion for "hbao+" and still maintain quality. rt shadows aren’t worth enabling and sometimes even look worse. use the performance mode for global illumination. 8x msaa doesn’t need to be enabled for hairworks because "taa" already handles it in this version. the "ultra" setting isn’t the same as in the older version; it’s a "new" ultra (not referring to ultra+). so, using ultra instead of ultra+ is essentially free fps with minimal quality loss.
RT ambient occlusion is the strongest aspect of this game in terms of looks. don't disable it guys, and yes, it's better than hbao+.
@@CGRADT and half the frames.. nope
RTAO complements RTGI very well in this game, and if you're enabling one of the RT effects, the cost of enabling the other is very limited thanks to the shared BVH. I would also add that reducing crowd density helps because of the CPU bottleneck.
I'm pleasantly surprised my 6750 XT is running well above 60fps on ultra without having to tweak the graphics settings (RT off, of course). It's an absolutely gorgeous game for being a decade old.
In the Digital Foundry video they couldn't tell the difference between the settings in DX12 vs DX11 with RT off. It's hard to get better performance because performance is already like 3x worse than DX11 even WITHOUT RT
Glad to see my man getting some sponsors, congrats! Great video as always!
9:15 I highly doubt The Witcher 4 is coming out in 2025. They just dropped the first trailer and announced they're just now going into full production. With The Witcher 3 they dropped the first trailer almost 2 years before release, and I'm not expecting it to be much different this time. Best case scenario for release is probably late 2026 imo.
Let's be honest, the trailer was pre-rendered in UE, but it's a real quest, so the game will/should look very close to the trailer, just with worse cutscenes. And there's no way the 5090, going by current leaks, will be powerful enough to run it like that.
They said it was planned for 2027 at the earliest
@@erisium6988 It could, lots of breakthroughs in graphics coming with AI.
I doubt Nvidia already has GPUs that will be released in 2027 or later.
Yeah, it's 100% not coming out in 2025. They announced two weeks before the trailer that they went into production, as in making models, implementing everything, building the actual game, etc. Before this they were in preproduction, e.g. planning, concept art, game design, tech prep, etc.
The very extreme hopium earliest we will see the game is end of 2026, however its more likely 2027/2028.
All that being said, I am looking forward to it too. Ciri is awesome and I feel CDPR are back in form after Phantom Liberty.
True that. They needed to make a trailer because of the Nvidia sponsorship, and it will take quite a while to develop the game, not to mention delays since, well, it's UE5. We'll probably see a new generation of graphics cards by the time Witcher 4 releases.
I remember getting FPS in the 40s back in the day with 4k and a GTX 1070. It’s insane that a 4090 gets the same now. The next gen version with ray tracing really is demanding.
Demanding? 😂 bless your heart ❤
*UNOPTIMIZED* is the word you were looking for ^_^
not really demanding at all for these graphics, just a half-baked DX12 implementation.
I put 400 hrs on this game when it came out. Bought the physical, installed the 5 discs. Played on a Nvidia 930m laptop at 720p lowest with 28-30 fps.
45 fps at 4k Ultra RT is nothing
@@dvornikovalexei 930m laptop: 28-30 fps.
$2k GPU: 45 fps.
What a lovely ABSENCE OF DIFFERENCE LMFAO 🤣😂😆🤦
@notenjoying666 Did you miss the difference between 720p lowest and 4k Native Ultra with Ultra Ray Tracing or is your brain fried?
Wow I didn't know this got a physical release on PC. I wish PCs with disc drives and physical game releases were more common since I like owning physical copies. Part of why I still use consoles sometimes (though I suspect physical games are going away in a generation or two.)
@@Eepy-Rose I have the Cyberpunk physical as well but sadly it doesn't have any installation discs. Only 2 OST discs, postcards, stickers, map and world compendium
Installed. I understand that my friend
5090 better be underwhelming because i got no kidney left.
So we can imagine how demanding that big open-world UE5 Witcher 4 will be, murdering current ultra-high-end hardware.
@@thepaintbrushonlymodeller9858 Definitely. Like any other heavily Nvidia-sponsored title.
Or in other words: a 1000% unoptimized game is about to be released yet again.
Just like Stalker 2, it's going to run like complete and utter shit. UE5 is garbage.
@@oneilc818 UE5 is great; the devs don't know what to do with it. There are so many useless features running that are NOT needed; they could disable Lumen and RT and the game would gain 50% from that alone.
@@cythose4059 UE5 is a double-edged sword: yes, it can look great with minimal effort, but the developers who used to optimize games for such visuals no longer need to and have become lazy. Add to that the fact that DLSS, FSR and TSR have become a cheap way to skimp on optimization.
been watching ur videos for months and finally subbed. one piece of advice though: never increase the quantity of videos released if it affects the quality; people watching benchmarks are very critical of the details. keep up the good work dude
Thanks for the tip and for subscribing :)
Hope you enjoy the upcoming videos!
Congrats on getting sponsors now
You HAD a CPU bottleneck, which is especially visible in the chapter called "4K RT ULTRA / DLSS Performance / FG". Whenever CPU utilization is below 40%, GPU utilization stays at 98%; as soon as CPU utilization goes to 40% or higher, GPU utilization immediately starts to drop.
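That pattern can be checked directly from an overlay log. A minimal sketch with made-up sample values (not data captured from the video):

```python
# Hypothetical overlay log samples as (cpu_util %, gpu_util %) pairs.
samples = [(32, 98), (35, 98), (38, 98), (42, 91), (47, 85), (51, 80)]

# Split GPU usage readings by whether the CPU was under or over 40% load.
gpu_low_cpu = [g for c, g in samples if c < 40]
gpu_high_cpu = [g for c, g in samples if c >= 40]

print(f"avg GPU usage while CPU < 40%:  {sum(gpu_low_cpu) / len(gpu_low_cpu):.0f}%")
print(f"avg GPU usage while CPU >= 40%: {sum(gpu_high_cpu) / len(gpu_high_cpu):.0f}%")
# GPU usage sagging exactly when CPU load rises is the bottleneck signature.
```

The same split on a real MSI Afterburner or PresentMon log would confirm (or refute) the bottleneck claim.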
The 7800x3d is bottlenecking at 4k, we are fucked
@@kerkertrandov459 My 5600X is screaming bloody murder.
Honestly. People using high AA solutions at 4K is crazy to me.
FXAA is the most you'll ever need at 4K or higher.
Highly depends on the size of the monitor you have. 48'' at 4k is the same ppi as a 24'' 1080p
i disable it
In most cases sure, but it can still help even at 4k depending on your monitor size. If it's a 32 inch 4k panel for example yeah you could turn AA off.
*"Honestly. People using high AA solutions at 4K is crazy to me.*
*FXAA is the most you'll ever need at 4K or higher."*
Honestly, the problem is that you don't know what you don't know. Whether or not AA is needed does not depend upon display resolution --- it depends upon _angular resolution._
Angular resolution is essentially how _large_ pixels appear to be. And this is a function of two things: Screen Size and View Distance.
At the same view distance, the pixels on a 24" 4K screen are going to look _much smaller_ than the pixels on a 55" 4K screen.
In short, I have a "High Immersion Gaming" setup. I normally sit 18" away from a 55" 4K OLED for a horizontal FoV of 106°. I can assure you that antialiasing is absolutely necessary with this configuration. The same thing goes for when I'm sitting 5 feet away from a 110" front projection screen.
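The angular-resolution argument above can be put in numbers. A quick sketch of pixels per degree (PPD) for a flat 16:9 panel viewed on-axis; the panel sizes and distances are the ones from the comment:

```python
import math

# Pixels per degree (PPD): how many pixels span one degree of visual field.
# diagonal_in and distance_in are in inches; h_pixels is horizontal resolution.
def pixels_per_degree(h_pixels, diagonal_in, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # 16:9 panel width
    h_fov = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / h_fov

print(f'24in 4K at 24in: {pixels_per_degree(3840, 24, 24):.0f} PPD')
print(f'55in 4K at 18in: {pixels_per_degree(3840, 55, 18):.0f} PPD')
# The 55in setup spreads the same 3840 pixels over a ~106 deg horizontal FoV
# (matching the figure quoted above), leaving far fewer pixels per degree -
# hence antialiasing stays necessary there.
```

Same 4K resolution, less than half the pixel density per degree: the display resolution alone tells you nothing about whether AA is needed.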
They moved to UE5 because it's easier to pick up talent, and many of the senior RED Engine engineers have left.
I just hope they don't do stupid things like forcing RT, Nanite and Lumen.
@@mrbobgamingmemes9558 It's definitely being built with those features being turned on. No doubt about it. Series S will have to be the floor of what they will be targeting.
@@Phil_529 please don't give them ideas. I don't want to play at 480p with DLSS Ultra Performance just because of forced RT. My GPU can still run Black Myth: Wukong without it being a stuttery mess, because they didn't force ray tracing. Remember, forced RT will make even RTX 4090 owners suffer if they refuse to use upscaling. Forced RT shouldn't exist before the most popular GPU on Steam can handle RT at 60 fps at 1080p.
@@mrbobgamingmemes9558 It will most likely have forced Software Lumen and Nanite, Stalker 2 and Silent Hill 2 are games that do it already
@@yancgc5098 "Just because you can doesn't mean you should", as Anthony from LTT put it. Remember, I don't think CDPR is targeting low sales for some reason. If you make a game this ambitious you want as many sales as possible, so forcing RT will cost a lot of sales.
9:14 The Witcher 4 is NOT coming out in 2025. It's going to take YEARS for it to be released. We just got the reveal trailer, better stay patient...
First time seeing you put a sponsor in your video
They will not drop W4 in 2025...2027 if we are lucky
Yeah, I have 0 faith after the CP2077 launch and their terrible ability to get in their own way.
i think 2025 is going to be the official release date
CDPR has said that they will only release a game trailer if they're going to launch the game within 2 years. They learned from the Cyberpunk disaster
@@shalahuddinumar3386 we'll see...if they learned.
Chris, remember that TAA-U stands for TAA Upscaling. 4K with TAAU is not rendering a native 4K. It's very likely to be 1728p at that resolution. If you want to render at native 4K, use FXAA and you'll see the true performance profile of the card for 4K ray tracing.
at 4k you use dlss performance
@@bigturkey1 Ummm, wut?.. usually you don’t “have” to use DLSS at all if you don’t want to… lol
@@brando3342 modern games since like 2020 are designed to use dlss performance at 4k
@@bigturkey1 That’s a serious problem, and shouldn’t be the case. It’s garbage.
@@brando3342 if dlss wasnt invented then games would still look like they did from 2017
1:08 Arthur getting shot 😂
honestly i wouldn't even notice it was RT on
"Cope"
It oftentimes looks worse. The exaggerated saturation effect it evokes, like here in the video, makes texture details harder to discern while not making the color scheme any more vibrant.
You have obviously not played this game. There is a huge difference. Whether you like it or not, that's another question
You mean huge difference in performance? There is minimal visual improvement unless you’re a RT fanboy.
Finally the Witcher 3 the best game of all time
Ehhhh, it's mediocre at best
I couldn’t even get past the first 5 minutes
Skyrim is. Not this poorly optimized 💩💩💩.
Overrated
@@notenjoying666 Cringe man, it's like comparing rat piss to wine......
that's a $4000 graphics card where i live, and it's only getting 40fps. that's crazy
This also happens in Cyberpunk and Starfield: when you activate Nvidia Reflex, GPU usage always drops to 80-85%. Sometimes it starts at 95-99% and then Reflex slowly lowers the video card's usage. I don't know why.
9:40 I believe it's worth mentioning that CDPR has already stated many times that they know what issues UE5 has, and if I recall correctly they've even held a conference about potential ways to tackle traversal stutter. It has also been mentioned many times in interviews that it's going to be more of a UE5 fork than the stock engine we're used to, which makes sense given their tech experience with RedEngine. Personally, given their experience with RT and PT in RedEngine, and considering they even backported some RT to Witcher 3 (a game developed some 5 years before RTX existed), I don't think they'll just say "scrap all that, turn on Lumen and forget it". Their recent trailer shows they're still in close collaboration with NVIDIA, which further strengthens the point that they'll be working hard on RT (doing it properly, not just flipping unoptimized UE5 toggles). So there's clearly potential; can't wait to see how they manage all that.
A $1500 card that can't do 4K 60 fps is the biggest ripoff in gaming. I know people hate the 4060, but at least I didn't get ripped off, spending $900 for the entire PC.
I wish it was 1500. new 2500 is the cheapest available in germany, even ebay its still 1800, minimum.
I gotta say I love my 4060 Ti; its 16GB is more than enough, and it's fast enough for the games I play. I have a 1080p monitor, so it's perfect. I actually wanted to get a 4090 with a 4K monitor, but it's no use since I'd need a 7800X3D and a new mainboard, and that would basically mean a new PC.
12:00 If you are using Reflex or frame generation, try switching HairWorks to Geralt-only mode; this should slightly increase GPU usage.
You should just disable HairWorks entirely.
Funny, at 1440p my 3080 Ti isn't far off your 4090's performance. Good stuff!
Well, 1440p is less than half the pixels of 4K, so that's not so strange.
Was waiting for this moment 😅
The reason is that full ray tracing requires faster, more numerous, and more efficient tensor cores. Nvidia knows this, and that's why we got DLSS 3 and the upcoming DLSS 4 to mitigate the performance cost until newer GPUs have enough tensor cores. It's like when tessellation came out: it took a couple of generations of GPUs before its performance cost became insignificant.
The only problem is the price: new GPUs are going to cost as much as a big TV or a used car. That's the biggest issue until GPU manufacturing gets more cost-efficient.
Something has to give. Consoles aren’t going to cost that much if they continue to exist and people at least are not going to upgrade from a prior pc at these prices very often. Games are not regularly going to come out with this technology mandatory when even the strongest hardware doesn’t run it well. If hardware progression slows studios will have to learn to optimize.
The next gen update introduced half-assed Ray Tracing implementation, bugs, crashes, CPU bottlenecking, graphical glitches, changed the weather system for a worse one, changed the water for a worse one, shadows now are terrible, and overall shit performance. They ruined a perfectly running game by adding mods which anyone could already just download, then they botched their implementation too. Thanks, CDPR.
The next-gen Witcher 3's medium settings are equal to the old-gen's ultra settings, which means the next-gen graphical upgrades are indeed very impressive.
Runs fine on my 4080 Super on RT Max. No issues @70fps with mods and texture packs plus a reshade.
And here is me playing Witcher 3 in 720p low settings with 35+ frames
At that point, you might actually be better off just playing the og Witcher 3 before it got the shitty performance-eating remaster
install oldgen (probably this will give you 60fps if you now use nextgen no joke and game looks better on low settings) new gen got one shit quest not worth it, install it with mod tweaks (you can set settings that matter higher and settings that dont matter like shadow quality lower than low) and instal dll tweak both are on nexus, there are few other mods for performance you can probably achive 1080p 40fps or 720p 60fps and both settings will look better and smoother.
If you really want to experience peak Witcher 3, I strongly suggest using The Witcher 3 HD Reworked Project mod. An absolute game changer in terms of visuals.
Honestly amazing just how much better it looks with the next-gen upgrade. More demanding than Cyberpunk 2077 with PT though due to RedEngine being so unequipped to handle the additional CPU demands of the next-gen update. The RT is also demanding and was not playable at 1440p for me before I upgraded from a 3080 to 4070 Ti Super.
This is one of those cases where it's not that it's unoptimized, it's that it can't be further optimized in any meaningful way due to the limitations of the relatively ancient game engine (I'm CPU-limited to 40fps without frame-gen in Novigrad with a 5900x). I'm not that happy that they're moving to UE5 for future games, but they definitely had to do something.
can you do a Sunday video of the RTX 2060 SUPER?
Witcher 3 at 4K with RT looks beautiful for an almost 10 year old game. Everything pops so much and everything has so much depth.
Thats how the kid became the next Bob 18:56
😅
Loved the video and love The Witcher 3! Please test some slower GPUs like RTX 4070 with optimised settings as well. Performance RTGI, High/Ultra settings instead of Ultra+, no hairworks.
For a 10-year-old game with a vast open world, it's perfect. Hopefully, if they keep this up in Witcher 4, we can get trailer-level graphics on a 4090. On a 5090 for sure.
Maybe at 1080p lol
I forgot how beautiful Slavic forest is in this game
I'd like to see a comparison of Witcher 3 with full RT vs. no RT, because I can barely see the difference in most games between RT on and RT off.
Witcher 3 is really interesting when it comes to Raytracing. In Cyberpunk 2077 on max settings with basic Raytracing on 4070 I have around 75 fps in the city and around 100 outside the city, 1080p resolution. Witcher 3 on the other hand gives me about 55 fps so somehow Raytracing in Witcher 3 is much more demanding than that in Cyberpunk 🤨
I seriously don't get how people like RT that much man... It always tanks performance only to have some reflections in puddles and different shadows...
I'd like to see this done with Batman: Arkham Knight, the last good game Rocksteady developed.
It still holds up to this day visually, even though the anti-aliasing leaves something to be desired.
hairworks with rt makes geralts shadow bald
You should try it with texture mods too.
One thing they should've given us is a sharpness slider in the next gen version instead of just "low" or "high" lol
When the GPU usage is only in the 80%s, does that mean the CPU is the bottleneck? I was under the impression that a CPU bottleneck is when the CPU is at 100%, but I've been told by many people that if your GPU doesn't hit 100%, then it's possibly a CPU bottleneck even if the CPU is only at like 30% usage. Is that true?
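It is true, and the arithmetic is easy to sketch. Aggregate CPU usage averages across all threads, so one saturated thread (typically the game's render or main thread) reports as "low usage" overall. A minimal illustration with hypothetical per-thread loads on a 16-thread CPU:

```python
# Hypothetical per-thread loads: the game's main thread is pegged at 100%
# (the real bottleneck), while the other 15 threads are lightly loaded.
per_thread = [100] + [20] * 15

# Task managers report the average across all threads.
aggregate = sum(per_thread) / len(per_thread)
print(aggregate)  # 25.0 -> "only 25% CPU usage", yet the GPU still starves
```

So a GPU stuck below ~95-99% with the CPU showing 30% overall can absolutely be a CPU bottleneck: the GPU can only render frames as fast as that one saturated thread feeds it.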
This game is so well optimized, and the graphics still hold up really well for a game that released in 2015. It's so sad to see CD Projekt Red abandoning the RED engine and switching to UE5. There goes performance, I guess, for The Witcher 4. It's fucking sad to see these gaming studios abandon the amazing, well-optimized engines they created; the RED engine was one of them, not to mention how it encouraged the creation of a lot of mods and fully supported the modding community.
Shit ain't optimized bruh
No, it is not.
The latest and apparently final version is riddled with graphical errors.
Unreal-based games are far more easily modded than proprietary-engine ones lol, especially if the devs don't provide tools. Plus it looks very clearly old and doesn't hold up all that well; Rise of the Tomb Raider came out only a little later than this game and looks much more detailed (the vanilla version, not this next-gen one).
@@0newithmisery Exactly these people forget that Red engine was also used to launch a trainwreck called Cyberpunk lol
@AmaanGhaffar Play on DX11
kryzzp, The Witcher 3 ships with the old DLSS 3.1; if you manually update DLSS to 3.8 and DLSS FG to 3.7, it loses less detail. Also, I don't know what's going on with your FG + Reflex, but my 4070 Super doesn't drop to 80-90% GPU usage, and I think it's because of those old DLSS files.
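The manual update the comment describes is just swapping DLL files in the game's binary folder. A minimal sketch using throwaway directories in place of the real install (the paths are stand-ins; the upscaler DLL really is named nvngx_dlss.dll, and frame generation uses nvngx_dlssg.dll — always back up the originals first):

```shell
# Stand-in for the game's binary folder, e.g. .../bin/x64_dx12
GAME_BIN="$(mktemp -d)"
echo "dlss 3.1" > "$GAME_BIN/nvngx_dlss.dll"   # pretend shipped DLL

# Stand-in for a freshly downloaded newer DLL
NEW_DLL="$(mktemp -d)/nvngx_dlss.dll"
echo "dlss 3.8" > "$NEW_DLL"

# Back up the original, then drop the new DLL in its place
cp "$GAME_BIN/nvngx_dlss.dll" "$GAME_BIN/nvngx_dlss.dll.bak"
cp "$NEW_DLL" "$GAME_BIN/nvngx_dlss.dll"

cat "$GAME_BIN/nvngx_dlss.dll"
```

The same copy-and-backup pattern applies to the frame generation DLL; the game picks up whichever version sits in that folder at launch.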
I honestly think these graphics settings, 4K ultra 60fps, were meant for the 5090. If not, I can't understand why the requirements are that high.
Witcher 4 gonna keep updating graphics even beyond RTX 90 series
The 4090 uses 400+ watts, almost like my slow cooker roasts.😅
My opinion is that you could also include optimized settings in all tests like this; it's the best way to play.
Well, at 4K 60fps on YouTube, the 30 FPS camera is noticeable. But okay!
Love this game ! Too bad PCVR has killed "flat gaming" for me, hopefully TW4 can work with UEVR, that would be magic 🙌😁
great video like always ❤❤
but hey I know this ain't going to be the ideal test
but would you test the all great RTX 4090 in cs2 ?
just out of curiosity 🙃
9:15 I think you misunderstood something; The Witcher 4 is expected to release in late 2026 or 2027. But fingers crossed🤞
And in a German podcast (the Gamestar podcast), the head developer for quest design talked about the switch to UE5: it's mainly because they plan on doing an entire trilogy, so they want to save the resources of programming and updating their own engine between each game. And of course, it's easier to find programmers who already know how to operate UE5.
I hope you do the same video with the 5090, friend!
I can't wait to see you benchmark The Blood of Dawnwalker. Being a title developed on UE5, I wanna see how beefy a card I'll need to get a smooth 60fps at at least 1440p.
First sponsor of the channel!
Hello, i love watching your videos :) Greetings from Germany 🙋
Hey! Thanks 😃
Greetings mate!
Dude i cant wait for you to test the 5090 in every game at 16K lol
18:41 I hate such Bugs.
Loved it, but why are you using TAAU and not the better one?
Sad that you forgot to check DLSS Balanced at 4K :-)
with DLSS this card can provide a playable experience at 4k with RT enabled. Pretty insane, I wonder what the 5090 will do.
@erenyaeger7662 Wow, insane, a 2015 game is playable on a 2022 card. What a joke 🤡.
It is crazy that you cannot max out the game at 4K with the top GPU without upscaling or frame gen.
I remember playing W3 with my 2 x 980ti in SLI
next video : Destroying the RTX 4090 in Counter Strike 1.6 with Ray Tracing 32K Ultra settings
UE5 is the future of AAA games; that's why a lot of studios are using it for newer games. The engine's graphics are absolutely stunning, but I do agree the performance is still not there yet.
It is optimized only for DX11, and it runs very well there. They fucked up DX12 with the next-gen update because this game was built with DX11 in mind from the ground up; the DX12 path is an adaptation rather than a fully native implementation.
It's not about being built for DX11. It's just a lazy update; they easily could have gotten it working better.
Legends are Watching video on 480p ❤
This is what i was looking for
That's how Nvidia Reflex works in every game: it needs some GPU headroom.
Playing this at max settings and raytracing at 140fps at 1440p on DLSS and frame gen on my 4080
Can you try it with Halk Hogan's mod? 😄
Maybe it's something to do with amdip/MSI Afterburner. Research this issue. I read that there is a problem related to power monitoring.
Looks better then Stalker 2 in most places
Bro I just upgraded to the AORUS 4070 Super, fired up this game and set it to ultra and I'm getting a BEAUTIFUL 138fps (I use a 1080p 144hz g-sync monitor)
4070 Super with a 1080p display?
My guy, what are you even doing?
@@MuppetsSh0w i mean in 2 years this might be native resolution by the looks of modern games who knows
Plz change to a decent 1440p monitor; they're not expensive now, and even 1440p DLSS Quality looks way better than native 1080p.
@@MuppetsSh0w he is getting a lot of fps in a really good resolution.
@@MuppetsSh0w nothing wrong with using a 1440p intended card by playing at 1080p.
My 4070 ti super runs this game very well 🤷🏻♂️
Cause this game is sooo gorgeous
I don't know why, but when you set the reflection quality to low, you gain up to 30 fps. Strange boost...
Man, the 4090 is going to get old fast with the next gen of ray-traced games. Maybe even faster with RTX 5000 AI shenanigans.
Nvidia Reflex isn't being "weird". It is known to cause fps drops, because it decreases GPU load to sync with the CPU and thus reduce latency.
Witcher 4 will require an RTX 6080/90 to run at 4K 120fps with full ray tracing. That's why the game isn't coming for a few years... but the graphics will be insane.
I am playing this game in 4k on a 2070 super right now lol. With DLSS Quality, medium settings always above 50 fps. Still looks really good. Pretty happy with that
idk man.
dlss does work but looks absolutely awful.
@CGRADT I don't really have a choice 😭 I do sit quite far from my TV so it looks good enough to me
Hey kryzzp, is the 5090 coming home early? (Or later, like the RTX 4090?)
Ahhh, nothing better than swimming in a filthy canal.
Imagine if rockstar updated rdr2 like this…
Brother when are you getting your B580? I want your benchmarks!
Waiting for a 5090 😅
You need the 9800x3d it’s amazing.
We have reached the point where the RTX 4090 is limited due to an insufficient amount of VRAM. Maybe we will see 32GB on the 5090? Quadro GPUs with 48GB already exist. At 8K without DLSS the game used 19GB of system RAM.
No, it doesn't run well: it's a 2015 game on a 2013 engine, with mods, a translation layer to DX12, and RT on top. Look how unstable the stock game (1.31) is; that's why the video settings are so limited in both the old-gen and next-gen versions.
Friend, don't just try 8K DSR; also try DLDSR 2.25x (6K), because 2.25x has superior quality compared to 8K DSR. And if possible, test DLDSR 2.25x with DLSS Quality.
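For reference, the "6K" of DLDSR 2.25x comes from the scale factor applying to total pixel count, so each axis scales by √2.25 = 1.5. A quick check, assuming a 3840×2160 (4K) display:

```python
# DLDSR/DSR factors multiply the pixel count, so the per-axis scale
# is the square root of the factor: sqrt(2.25) = 1.5.
w, h = 3840, 2160            # assumed 4K display
axis_scale = 2.25 ** 0.5     # 1.5x per axis
print(int(w * axis_scale), int(h * axis_scale))  # 5760 3240, i.e. "6K"
```

By the same math, 4x DSR on a 4K display renders at 7680×4320, which is the "8K" figure the comment compares against.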
It's awful how we go on about the PS5 not hitting 60, while on a $1.6k GPU, Witcher 3 maxed out can actually dip below 30.
Is there any reason why you use a 7950X3D and disable half the cores to make it like a 7800X3D in every video?
So that every active core utilises the 3D V-Cache, and you get more fps.