And if you think about it, you can get an amazing experience if you enable frame generation. Holy shit, what a surprise: a game being marketed as a showcase for the RTX 4000 series runs like crap on AMD and on Nvidia's older RTX GPUs, and all these problems are almost exclusive to the PC version. If that doesn't smell like shit, I don't know what else to think.
In addition to this, the game doesn't have any mind-blowing graphics or RT effects; the RT implementation is pretty average here. I don't know why this game is so demanding.
DLSS 3 Frame Generation on my RTX 4090 is horrible. It gave me 20-30 fps, stuttered like crazy during action, and dropped to 5-10 fps during cutscenes. Had to turn that off quickly.
Yeah, the only reason being that at native 4K, maxed settings with RT, the RTX 4090 couldn't even hit 60; I saw a video on that. Obviously frame generation and DLSS 3 save the day and let the 4090 get over 60 with RT on in this game.
To "fix" the crashing: when you start the game and are at the first menu, enable all the ray tracing options. Once you have loaded your save, change the ray tracing settings to your liking. I got this from someone on Steam and it works.
Lifesaver. Bought this a couple of days ago; I've wanted to play it for a long time, and the ray tracing update plus a sale pushed me to get it. Couldn't load back into my game in the first room; this fixed the instant crashes.
And I thought my 3080 was doing badly even with DLSS. I was somewhere in the marshes, hovering at about 52-55 fps at 1440p. I disabled only RT, kept everything else maxed out, and went straight to 120+ FPS. Easy choice for me. Yes, the lighting and the water looked amazing, but there's no way in hell I'd trade 120 fps for 50 or even lower, and I bet it would drop way lower in certain areas.
Yes, Nvidia Reflex really helps with this on my RTX 3060. I still think this new patch needs further work; it's not acceptable that it only runs well on an RTX 4080 or RTX 4090...
Agree! Wow, disabling Reflex and GPU utilization is back up to 100% almost everywhere for me. Input lag is worse, but smooth frametimes finally! (RTX 2060, no RT)
@@sgtbeercanyt that's because the 3080 Ti was overpriced trash. People assumed it was high quality based on price and lack of availability. The new iterations are no different.
I'm running 1440p, max settings, DLSS Quality, all RT on, with a 3080 Ti, a 9900K and 32GB of DDR4-3600. I'm getting 70-80 fps. Go to the NVIDIA Control Panel and enable Threaded Optimization; this fixes most performance issues on all CPUs. I've seen a 36.5-50% increase in fps across the board.
Thank you! This helped me reduce the stutters. Sadly I still have a bad frame rate. I have an i7 4790K, an RTX 3080 12GB and a 3840x1600 display, and in the first village after Woesong Bridge, with all RT and Quality DLSS, I get only 27 fps. On the road and the open map I get more than 45 fps. I'll need to upgrade my computer soon, I guess.
@@Teeb2023 Threaded Optimization turned on globally didn't help me; it somehow actually made things worse. I'll try applying it directly to the game profile instead this time around. I'll give you my exact settings in about 2 hours; the setup I've settled on still has 3 RT effects enabled, just with some Ultra+ settings dropped down, with no real noticeable difference in quality but a decent boost in FPS. Currently getting 42-50 fps now :)
@@goranbenak1201 I upgraded from a 4790K to an AMD 5600X and modern games ran better. Not a huge difference for the maximum framerate, but definitely better for the low framerate dips.
Also just did a video with an RTX 3080 yesterday, and my findings are pretty much the same. I did, however, have a few more issues: disabling and enabling RT in-game would cause my game to crash, and changing DLSS settings too quickly also crashed it. I tested in a different area than you and was even more CPU limited on a 12700K at 1440p. The game needs a serious patch!
This next-gen update is indeed terribly unoptimized even with RT disabled. I would rather play the older version with mods and have a good experience.
I may be wrong, but I believe someone explained that the reason for the bad performance and bottlenecks is that, for whatever reason, instead of porting The Witcher 3 natively over to DX12, CDPR used some kind of DX wrapper that is meant to easily port DX11 games over to DX12. I believe Microsoft came out and stated that the DX11-to-DX12 wrapper is only meant for simple 2D games, and why CDPR used it on this massive open-world game with hundreds of hours of content is beyond me.
This is correct, and I wish more people would talk about it instead of just saying "oh, it's just a really heavy RT game." If my 3060 Ti can handle infinite-bounce RTGI in Metro Exodus Enhanced in the open worlds, including Taiga, at 70-100 fps, and that's a graphically heavier game than The Witcher 3... yeah.
@@licensetoshred That's what I was thinking. Absolutely, Metro Exodus is a very very graphically demanding game, especially with the enhanced edition and raytracing. It all boils down to how much the devs want to optimize the game, and CDPR took the lazy route, and now the game runs horribly.
@@thecubanmissile4806 I didn't know this, thank you for the info! It's annoying that people make up stuff like that, because I've heard it from multiple places and it's been really misleading. CDPR should have just taken more time and optimized the update before pushing it out.
@@Amfneey Yeah, the way they actually did it was with Nvidia Remix, which is used to put ray tracing into DX9 games. I think they got help from Nvidia to get it working with a similar wrapper concept, but it was definitely not something meant for 2D games.
This is interesting, but I’d like to actually see the best optimized settings you can do. No one needs to play with settings all at Ultra+, etc. … RT and resolution/scaling aren’t the only buttons you can press to improve performance.
I'm feeling a bit heartbroken. I started a new run of The Witcher 3 expecting the next-gen update to run like butter on my 3080... and it has been just... heartbreaking.
I went from a 5600x to 5800x3d yesterday and it massively improved GPU utilization and fps with a 3090 at 4k. There are still some drops below 99% and there seems to be a memory leak as well as the exact same "fps dropping after opening the map" bug that Cyberpunk has had for years now. Whatever CDPR is doing, they ain't doing it right in terms of optimization, but the RT itself is fantastic like in CP.
@@mikem2253 yeah, it made a huge difference with all poorly optimized CPU titles, and there were more of those in the last 3, 4 months than in the previous year+ lol. Spiderman/Miles now run with RT on high at 85-110 range whereas before it was dropping to 50s and was inconsistent as hell due to massive drops. Plague Tale in the market sticks to 80+ similarly, and it was dropping to mid 40s with the 5600x, and Witcher 3 is nowhere near as erratic and stuttery and also sticks in the 50-60 range. It's really a bummer how shitty and overwhelming the Witcher's motion blur is because with CP2077, it's so good that even locking to 40 on a 120hz screen feels very good, but here it looks like a blurry mess and disabling it makes low frames feel awful as usual. This is all with DLSS quality and balanced only for Plague Tale.
@@omarcomming722 Annoying, lol. I just ordered a 7900 XT and run a Ryzen 5600. I refuse to spend more on my system now, but maybe I'll grab a 5800X3D in a few months.
The observation about Reflex smoothing out frametimes is very interesting. Seems odd as in some titles I've heard others suggest disabling CPU prerendering makes frametimes noticeably worse.
Yeah, this fix has been around for a few days now; I've tried to tell people about it. There is apparently also AMD Anti-Lag (only in the AMD software options), which might help. Honestly, for the frame-rate hit I just don't think RT is worth it in this title unless you have a 40-series card. Yes, some nice ambient occlusion, sometimes-okay reflections and better indoor lighting are nice, but do we really play games at 30fps on PC?
The RT hit on the GPU is actually quite reasonable; it's the CPU that is bottlenecking, and given that we have seen so many complex games like Cyberpunk work fine with the same CPUs, I'd say they could improve it a lot. It's not a good look for the devs.
@@Clutch4IceCream No it's not. It used RTGI as the main RT feature. Go look at my Metro Exodus Enhanced videos and check the framerate. That game uses infinite-bounce RTGI, which I have on ULTRA in those videos. The reason for the bad performance is that they used a quick D3D11-to-D3D12 wrapper to port the game to DX12, which Microsoft themselves said no one should use unless the game is really simple graphically. They didn't actually make it a proper DX12 game, hence why the CPU usage is so fucked, lmao. Also, I'm running a 3060 Ti in that game and still get 70-100 fps on average in the open-world sections.
I do not know if this adds to the issue in this particular game, but I have observed in the past that leaving Nvidia GPUs (and AMD too, although their algorithm makes this very hard) at their default clocking behaviour is bad in almost all scenarios. What I like to do is pick a point on the voltage curve (via Afterburner, for example) and lock that specific step in as the maximum. My 4090, for example, runs at 0.92V at 2750MHz constantly. It does not go past that, it does not drop below it, and it never reaches the TDP limit either, because that would cause the GPU to clock according to the power limitations, which is exactly what you want to avoid. Many games react with stutters when a clock change occurs, especially a significant one, like 25MHz or more at a time. This is because every scene in every game utilizes the GPU differently and thus puts a different load on it. While you can stay below the TDP at most FHD/WQHD settings in almost all games, especially with DLSS and FG involved, that is not the case at 4K. At 4K, most current games will fully load the GPU most of the time, but not all of the time, so you easily get clock fluctuations that contribute to further stutter.

Locking the card to a specific clock/voltage step also has other benefits. You get a more stable experience, especially when overclocking, because you do not have to worry about instability at different steps along the voltage curve, which makes it much more reliable. Then there is the benefit of consistent performance across the board, and usually more performance at that, because ideally you want to put the GPU in the spot where it is most efficient. The more voltage you allow, the more diminishing returns you get from all the excess heat and TDP limitations, which shrinks the OC potential with every step. Simply limiting the power target does not yield similar results; it just shifts them around.

It is always best practice with Nvidia GPUs to limit them to a voltage-curve setting at which they will never, ever reach the TDP limit to begin with. It saves you power, and the card will be much more stable, run cooler and be more efficient. Also, you lose no performance whatsoever. On average, an aftermarket 4090 will clock between 2650-2800MHz during 3DMark Time Spy Extreme; just locking it in place, even at stock voltage, at 2725MHz will yield the same or slightly better results. If you then tune it so that it reaches those 2725MHz with 0.05V less or even lower, that is when you truly win. Anyway, I can highly recommend this to improve system stability and to avoid added stuttering in games. It does not affect all games, but it does massively affect some. I remember when the COD Modern Warfare remake came out and Warzone was first introduced: there especially, on my then RTX 2080, it was borderline unplayable for me before I smoothed the voltage curve out.
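If you want to try pinning clocks without a GUI, the clock half of this can be sketched with nvidia-smi; note that locking the actual voltage point still requires Afterburner's curve editor, these commands need an Nvidia driver and admin rights, and the 2550MHz value below is only a placeholder example, not a recommendation for any particular card:

```
nvidia-smi -q -d CLOCK      # inspect current, default and max clocks
nvidia-smi -lgc 2550,2550   # lock the GPU core clock to a fixed value
nvidia-smi -rgc             # reset the GPU clock to default behaviour
```

Setting min and max of `-lgc` to the same value keeps the core from wandering between boost steps, which is the same "no clock fluctuation" goal described above, just without the per-point voltage control.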
I'm playing this game pretty much maxed out with RT on a 5800X3D + 3080 12GB, and yeah, at 1440p I'm mostly at 50-60 fps. It was hard for me not to give up on RT, because it changes the lighting and water so much. It's beautiful, and this game deserves to be viewed with everything turned up (draw-distance-wise). It's really a shame that there wasn't a native DX12 port; the performance of this amazing game really suffers for it. On one save file, going through all the DLC, I'm already at 220+ hours and not done with Blood and Wine yet.
There is a forest near this town where ray tracing is really taxing. I was getting 40+ with some settings tweaks in the city, but frames were dropping into the 20s in there. RT looks good, but I can't really play with it. *sigh*
I feel like lately devs are doing less and less to optimize their games. Darktide, A Plague Tale: Requiem, Gotham Knights, and now this might as well list a 4090 as recommended.
Not even a 4090 runs it buttery smooth as it should... and that's a major kick in the balls, because there should be no problems whatsoever with a fookin' $2,000 GPU...
I have an i9 9900K, a 3080 10GB and 32GB of DDR4 RAM. At 4K at your same settings (everything on Ultra+, all RT on, DLSS on Quality) I only get 28 fps (with rare spikes to 40 fps) when things don't bug out... When they do, my fps collapses under 10 until I exit and re-enter the game. The Witcher 3 is now as technically broken as CP2077. Nice work, CD Projekt Red...
They have to come out with an update to squeeze frames out of the settings with RT off. I understand that frames will tank with it ON, but with it OFF? It should be better.
I have a 13700K, 32GB DDR5-6000 CL32 and a 3080, and I get pretty much the same FPS: 55-65 in Toussaint and Novigrad. There are definitely some serious problems. They really need to optimize it; it feels like it's not optimized at all, like they just cranked the sliders to the max and called it a day.
Quick question, how did you figure out at 8:13 you were cpu limited? To me it looks like it’s only at 36% utilization. I’m asking because I think my rig is cpu bottlenecked and I want to find out.
CPUs have lots of cores and threads, but game programming isn't parallelized enough to use them all. If a single game thread is the limit, then the CPU is the limiting factor despite not all threads being at 100%. Yes, you usually spot it by seeing GPU usage drop.
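The arithmetic behind that is easy to see. A toy sketch (not a real profiler): overall CPU utilization can look low even when the CPU is the bottleneck, because one saturated game thread gets averaged against many mostly-idle hardware threads.

```python
# Toy illustration: aggregate CPU utilization hides a single pegged thread.

def overall_utilization(per_thread_busy):
    """Average busy fraction across all hardware threads."""
    return sum(per_thread_busy) / len(per_thread_busy)

# 16-thread CPU: the main render-submission thread is pegged at 100%,
# the other 15 threads tick along at ~10% each.
threads = [1.00] + [0.10] * 15
print(f"overall: {overall_utilization(threads):.1%}")  # overall: 15.6%
```

So a "36%" reading in an overlay can absolutely coexist with a hard single-thread CPU bottleneck; the drop in GPU usage is the more reliable tell.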
I can confirm that AMD Anti Lag with RTGI or all RT turned on will remove the frame pacing issue (stuttering). But yes, needs further optimization for sure. I can run CP2077 with RT (but reflection off) at 60 FPS with FSR2 Balanced.
If I don't undervolt my GPU, in the crowded common-folk area in Toussaint it's around 40-50 FPS. But yeah, the hotspot was at 100°C, so I'm just going to wait for further optimization.
This is kind of why I like console gaming. You don’t have to spend any time fretting over all this stuff. You just turn the thing on and play, and turn it off. So much simpler and way, way less expensive.
"It runs better than console, and looks better, with no issues at 60fps"? But that's because the console runs at sub-4K at a locked 30fps in a gimped RT mode (no reflections or shadows), and at less than Ultra+ for 60fps at 1800p (a mix of High and Ultra with some Ultra+ settings). You can do the same on PC if you're happy with console-quality visuals. But we're used to 120fps; even 60 doesn't cut it. My PS5 is mostly a blurry, chuggy mess, and it's connected to the same monitor I use for my PC. Your virgin eyes have not yet seen the light of the PC gods side by side next to console peasantry on a 144Hz 48-inch 4K OLED. Your cheap 65-inch TV makes everything look like virgin shit. Your fate is one of perpetual peasantry. Ignorance is bliss, as they say. Plow my fields, wretch. (just kidding lol)
I never had the crashing issues. But one thing I noticed: when I enabled all the ray tracing settings with everything set to Ultra+, the game seemed stable at 45-60 fps for the first 10 minutes after launch. After that, it dropped to 10-15 fps in every area. It's downright unplayable.
This is a big fault on the devs' side. Sure, they gave us a free update, but if we can't even run the game at 60 on high-end GPUs, then the update doesn't make sense. A 7-year-old game should not be this intensive, and if their RT implementation is normally this expensive, imagine what they are going to do with The Witcher 4. Holyyy.
The DX11 mode does not support HDR, and HDR is a huge improvement on supported monitors. Really, it's much more impactful than RT, and I'd advise anyone with an HDR monitor to run the game in DX12 with RT off if they don't have a 40-series card.
Hi Dan, could you by any chance test this with a monitor that has a built in refresh rate indicator? I've a 3090 (1440p) and the Nvidia overlay says my framerate is around 40 to 50, similar to your 1440p results here, but my monitor itself says the refresh rate is jumping around 70 to 80Hz/FPS, I commented in the prev 4090 video where the FPS indicators from overlays might be severely bugged
Honestly, RT doesn't look "better" in every sense or scene. Just "different." Reflections are better, for sure. But, does anyone else find some scenes in CP2077 & TW3 worse with RT enabled? It seems more distracting and sometimes causes an "uncanny valley" effect. I'll be disabling RT during my next playthrough.
It really depends. For the most part, I find RT in TW3 is like applying a ReShade preset instead of a real change, and even though some scenes look way more realistic, I feel like it crushes the art style a lot. Reflections are the most welcome and game-changing effect, but you need RTGI for them to work.
I remember people complaining about GTA mods that made the streets all shiny with wet puddles everywhere; well, that's what I saw in Cyberpunk with RT. Same thing, but because it's RT, people are like "bruh, that's so goooddd." Whatever.
Well, RT is more realistic. Whether that's better or worse is personal preference; some people may prefer the fake but more fantasy-like screen-space effects. As for the over-reflective surfaces, that's because games still have terrible particle effects and physics. If dust and dirt could be simulated properly, open-world games would look so much better.
Question! I have used RTSS and Afterburner since I moved to PC, but on every setup I've used, the OSD FPS average never works; it only shows the current FPS. How do I get the averages to show? Do you use HWiNFO to share data to RTSS?
The DX12 version of this game needs some serious patching, imo. DX12 has frame-time issues, with fps just tanking in certain scenarios, whereas in DX11 mode everything runs fine. Weirdly enough, in DX11 mode the game crashes when I OC my GPU, while in DX12 mode it runs fine with that same OC profile, no crashes whatsoever.
Interesting video, and thank you for including 4K DLSS Performance; I feel that setting is often overlooked. The takeaway here is that if you tone the settings down to Ultra or High, you can get pretty acceptable performance on a 4K TV without sacrificing too much image quality.
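For anyone wondering what "4K DLSS Performance" actually renders internally, here is a rough sketch using the commonly cited default per-axis scale factors for the DLSS 2 modes; these are typical published values, not numbers confirmed for this particular game:

```python
# Commonly cited DLSS 2 per-axis scale factors (assumed defaults; games may vary).
SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Internal render resolution before upscaling to the output resolution."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
```

Which is why 4K Performance mode often looks better than it sounds: the upscaler is working from a full 1080p image, not from the 720p-class inputs that 1440p Performance mode uses.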
Yes, it needs a 4000-series card to run maxed out with RT. My ROG Strix 4090 gets 100fps at DLSS Quality with frame gen, although it still suffers some slight stuttering! Looks brilliant, but hopefully we get some optimisation!
RT is just the latest idea GPU makers came up with to make you think you need an upgrade. But when you play, unless you stare at the screen instead of playing, the difference with and without RT is often quite subtle. Don't be fooled, dudes!
It would be great to see a sim racing title such as Assetto Corsa Competizione in your tests. We have 3M people on RaceDepartment who would love that.
Your 3080 would clearly perform better with an undervolt. I have 2 profiles, but I mostly run mine at 1900MHz; it sits around 280-300 watts and runs much cooler with a stable clock.
Yeah, I have a 3080ti with a 5800x3d and mine would crash when I try to adjust the graphics settings up or down. Had to switch back to DX11 in order to play the game without crashing
I prefer Reflex on (no boost) because I value the power saving of not running at max clocks and the highest power state all the time vs the imperceptible difference in latency. However, in CPU limited scenarios boost is most effective giving up to a 15% reduction in latency and given the Witcher 3 RTX is CPU limited it might make sense to use it in this case.
Thanks for another entertaining and useful video! I've been having similar issues with my 3070ti, I'll give some of your suggestions a try, not a big fan of raytracing in general but it would be nice to see the game with all the next gen graphics turned on :)
5:08 Totally unexpected and unacceptable CPU usage. DF said that DX12 requires dev time to efficiently manage hardware resources; guess they really just wanted to get this out for Christmas.
This update was not well optimized for ray tracing. It's funny, because I just finished Metro Exodus Gold Edition, where they went over the entire game and fully ray traced everything, literally everything, and I never went under 75 fps, usually around 80-85. That was on extreme maxed-out settings at 1440p. I can't even get 50 fps in Witcher 3 at max settings, running an RTX 3080 FE.
But TAAU does not equal native, right? 9:22 You talk about native but switch to TAAU, which irritates me. Surely I am missing something here, but it is still upscaled, is it not?
Nope, it's just TAA at native res. If you enable dynamic resolution, then it upscales. Using TAAU here decreases fps by 2-3, just like TAA; if it were upscaling, we should be getting more fps.
I've been using an RTX 3080, and I've gotta say, ray tracing is just not very efficient for rendering in terms of the quality you get for the GPU compute performance it takes to achieve it. This is why standard rasterized performance is still more important than ray-traced performance. Ray tracing is always the first thing you want to turn off in order to improve fps. Ray tracing is all about Nvidia leveraging the fear of missing out (a.k.a. "fomo") to scare people into buying their GPUs over AMD. Because there's so much bias toward Nvidia to begin with, it works very well. People look for a way to justify the opinions/ biases they currently hold, and Nvidia's ray tracing performance is very effective at this.
You buy a $1000+ video card and turn off its best feature? Man... try lowering the resolution instead and look at the game: ray tracing looks much, much better!
^ I'm with this dude. When I need frames, the first thing I do is turn down the resolution. Well-implemented RT makes a much bigger difference than resolution. Cyberpunk, and Metro Exodus EE vs. the non-RT standard version: those are immediately noticeable differences, whereas with resolution I have to squint my eyes and go "oh, yeah, I think that fence in the background is a little crisper at 4K."
Some games do it well, some not really. Metro Exodus EE is a great example, and that's what this Witcher update should have been... RT is a big advance in graphics, but it still needs some research effort on the developers' side.
The DX12 version has its own settings files. If you get more FPS in DX11, it's just because you have lower settings than in DX12. Open both versions and check your graphics settings carefully; you probably have higher settings in the DX12 version, like screen-space ambient occlusion and anti-aliasing.
Initially, launching the game through the CDPR launcher only let me use FSR for my 3080. Even on ultra performance in FSR, using raytracing just destroyed performance. The workaround came by manually launching the game from the bin -> x64_dx12 folder. That let me choose DLSS, which works decently with raytracing in performance mode.
To show CPU temps you need HWiNFO64, and you have to add the sensor in Afterburner to display it. To show the RTSS overlay in the game, you need to make an RTSS profile for The Witcher 3 and then add these two lines to the witcher3.exe.cfg file in the RivaTuner Statistics Server profiles folder:
[RendererDirect3D12]
D3D11on12=0
I tried "Witcher 3 Next Gen" yesterday and noticed stuttering on horseback (RT on, RTX 3070, 30fps, 1440p, DLSS Quality). Today I installed the new driver, 527.56; in part of a village it stuttered, GPU at 99% but power at only 70%, and the game even crashed. Finally, after a restart, it ran normally: RT on, RTX 3070, 1440p, DLSS Quality, GPU at 70-80%, power at 70-80%, and a steady 30fps.
I got this advice to solve performance problems with the next-gen update from the comments section of another video, and it helped me:
- go to Documents/The Witcher 3
- open dx12user.settings with a text editor
- change the key LimitFPS=? to half your monitor's Hz minus 10 (e.g. 120Hz → LimitFPS=50); this won't actually limit your fps to 50 (leave FPS in the game settings on "Unlimited")
- save and exit
- open the game; all stutters gone
I installed this next-gen update on Dec 16, 2022, via Steam, where I originally purchased the game, going from version v1.32 to the current v4.00. My gaming PC has an AMD 5600X CPU (3.7 GHz, boosting to 4.6 GHz), an AMD RX 6700 XT 12GB GPU and 16GB RAM (2x8GB DDR4-3200), with AMD Smart Access Memory (SAM) enabled in the BIOS. Based on the "half your monitor's Hz minus 10" advice, I set LimitFPS to 72.5 because my monitor is a 165Hz/0.8ms Fast IPS panel. The game now looks a bit better and the stutters are gone, with very satisfactory framerates of around 70 to 135 FPS, sometimes less or more depending on the in-game situation. I have also often heard the advice that MSI Afterburner and similar software should be deactivated.
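The "half your monitor's Hz minus 10" rule above is easy to sanity-check. This tiny helper just encodes the commenter's formula; it is a hypothetical illustration of their rule of thumb, not anything official from CDPR:

```python
def limit_fps(refresh_hz):
    """Commenter's rule of thumb for the LimitFPS value in dx12user.settings."""
    return refresh_hz / 2 - 10

print(limit_fps(120))  # 50.0 (the example given in the comment)
print(limit_fps(165))  # 72.5 (the commenter's own 165Hz monitor)
```

Both numbers match the values quoted in the comment, which suggests the 72.5 figure really was derived from the same formula rather than tuned by hand.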
I’ve been playing on a 7950X/60 gigs of ram and 3090. It does not run well and, frankly, the only way I can tell the difference is with side-by-side screenshots. The thing crashes all the time. Disappointed and now I get why DF only got videos a few weeks back.
RT will crash on most PCs if you toggle only the global illumination. If you enable every effect, you can turn it on and off fine; just don't use global illumination alone.
Very good video. It's a shame that the 30-series cards struggle so much with this. Even if you are one of the odd people who really enjoy staring at game lighting and are willing to sacrifice FPS, it's a bit disappointing with DLSS 3 being locked behind the 40-series cards and DX12 performing as badly as it does.
I get quite a bit worse performance with my 3080 + 3700X + DDR4 combo: average fps around 44 with everything maxed out (bar HairWorks) at 1440p. New technology does make a difference.
Strix RTX 3080, paired with a Ryzen 9 5950x on a rog crosshair vIII dark hero mobo and no matter what settings i have, as long as Ray Tracing is on, It stutters so much. So I think RT is just busted on this game unless you have the new 40 series cards, then It might work as intended.
It's becoming clear now that Nvidia is using Ray Tracing as a weapon to make low and mid range cards look highly unattractive so they can sell more of their (now overpriced) high-end ones.
Still has micro stutters due to bad CPU thread optimization. If you showed all CPU threads, you'd see how hard it hits just one thread at a time and hops around. Really bad.
Game crashes a lot on RX 7900 XT when I try to activate RT or even just FSR with high, ultra or ultra+ settings. It works on low with no anti aliasing.
The worst part of the next-gen update is the poor DX12 implementation; my PC can't even hold 30 fps at native res in it, while in DX11 I play at a stable 60-80 depending on the intensity of the scene.
In the original you got 100 (90-120) fps at 4K without DLSS. The patch brings the game to "next gen": the graphics and the detail of the trees and grass are amazing, and so are the new textures. It looks very nice. But the RT is too demanding. Why hasn't CD Projekt given us more RT quality steps?
I'm about 10 hours into a DX12 playthrough with ray tracing. Every so often I have to back out to the main menu because my frames drop from around 55 to 30 for no reason and stay there until I reload.
Hey Daniel, I enabled Radeon Anti-Lag on my AMD GPU and it's working: no more frametime spikes, so yeah, I confirm that it works. Still only 20-30fps on my Radeon 6700 XT, though, so I won't play it like that anyway. A 7-year-old game (remaster) that only runs well on 2 GPUs. Maybe it works on the new RDNA3 GPUs, idk, but anyway those are all $1000+ GPUs. Tone-deaf game devs.
@@anuzahyder7185 Of course I don't play with RT on; I only tested with RT to see if Anti-Lag gets rid of those frame-time spikes. The game runs okay without RT... but in DX11 mode I get double the framerate, so I guess they still need to optimize this. It's probably not going to happen with this DX11-to-DX12 wrapper they use... If this were a $60 game, I would ask for a refund instantly.
I found a weird thing with Reflex in this game: when I turn it on, the game drops almost 10 fps and my GPU usage falls to almost 75%; when I turn it off, I get the fps back and my GPU goes to 100%. And if you need DLSS or FSR, you have to use DX12, because you don't get those in DX11.
@@123TheCloop I know, bro, I'm just saying that for some reason they didn't include FSR and DLSS in the DX11 version of the next-gen update, and a lot of visual improvements besides RT, and even HDR, are tied to the DX12 version and don't exist in the DX11 version.
@@PixelShade Nvidia's drivers are actually even worse, lol. Look at the XTX launch: it went perfectly fine, not a single problem. The Nvidia launch? Burning adapters and driver issues. The fact that AMD does better despite having fewer people is crazy; this is the generation where AMD wins, just like AdoredTV (the saviour of AMD fans) predicted.
@@Drip7914 Yeah, it's kind of crazy. I have owned graphics cards, 3D accelerators and GPUs since the early 90s, and by now I've owned equal numbers of Nvidia and AMD/ATI cards; ultimately, neither one stood out as better. Until now, when I actually prefer both AMD's software and drivers in this current generation. I will have a really hard time going back to Nvidia's Windows 98-style control panel and their neglected GeForce Experience.
It's strange; where is the fps drop coming from? I play with RTX off and only get about 60-70 fps. Before the update I got about 120 fps at the same settings (including mods).
@@tiz6581 Yes they have, but that alone shouldn't drop the fps by that much. I played with improved-graphics mods before (the ones that are now included in the game) and I still got around 120 fps.
I can't even load into my save game without the game crashing unless I disable RT. And when I load into a game and try to enable it, the game crashes instantly. I'm running a 5900x and 3080ti. Next Gen update more like Witcherpunk 1277.
In some games, injecting RT into an old game causes issues, unlike games built from scratch with RT; that sure seems to be the case here.
Somehow, the performance loss with RTX is insane, while the uplift with DLSS/FSR is kinda nonexistent, especially considering how much better it is in cyberpunk (these days)
Wanted to ask the same thing. Using the GOG dx12 version of the game and found a hack for the RTSS overlay.. however, this forced the game into crashes. So would love to see a functioning workaround (and hopefully CDPR fixing this with the next patch)
If you need Frame Gen and DLSS to run this at 60, it's not even remotely optimized and the features shouldn't have been added. They could have stuck with the 4K textures and called it a day.
The most inefficient graphics settings are Ultra+. No idea why people haven't figured out that you can't see the difference beyond Ultra in motion; Ultra+ is for screenshots.
@@funtourhawk All I want is for CDPR to fix the fps drops. I don't even care as long as it runs at 40 fps, but those 5-fps slideshows every 20 minutes are unplayable.
For me RTX isn't working at all :D If I turn RTX on, my game crashes every time (3080). Just play this game in DX11 mode; it runs way better. You get all the visual upgrades; only DLSS and RTX are not available.
I thought I had a bug. I have an overclocked RTX 3090 and an i9 9900K and can't even reach 45 fps at 4K max with DLSS Quality. I thought I would have 100 fps easily. I love the new ray-traced look, but it seems I need to buy an RTX 4090 or play the game at 30 fps.
So Nvidia is pushing for RT but doesn't give us affordable GPUs to actually run it. I'm gonna be pissed if the prices don't drop after Ampere stock runs dry.
Nvidia is not really pushing for RT, it's just a gimmicky feature to sell more GPUs, like Nvidia PhysX particles were back in the day. Just keep it disabled if you don't want a garbage unplayable framerate. I mean, if the game had RT forced, then it would be another story. But it's not; it's just 1 click to remove it and you have super smooth gameplay again.
The fact that only two GPUs on the entire market can run this at 60fps with RT is insane
Publishers pushing games at the high end is how Nvidia advertises 4090s as "necessary"...
And if you think about it, you can get an amazing experience if you enable frame generation. Holy shit, what a surprise: a game being marketed as a showcase for the RTX 4000 series runs like crap on AMD and on Nvidia's older RTX GPUs, and all these problems are almost exclusive to the PC version. If that doesn't smell like shit, I don't know what else to think.
In addition to this, the game doesn't have any mind-blowing graphics or RT effects; the RT implementation is pretty average here. Idk why this game is so demanding.
DLSS 3 Frame Generation on my RTX 4090 is horrible. It gave me 20-30 fps, stuttered like crazy during action, and dropped to 5-10 fps during cutscenes. Had to turn that shit off quick.
Yeah, the only reason being that I saw that at native 4k maxed settings with RT, the RTX 4090 could not even get 60 (I saw a vid on that), and obviously frame gen and DLSS 3 save the day for the 4090 to get over 60 with RT on in this game.
To "fix" crashing, When you start the game and are in the first menu, enable all ray tracing. Once you have loaded your save, then change the ray tracing settings to your liking. I got this from someone in Steam and it works.
Life saver. Bought this a couple days ago. Have wanted to play it for a long time, but the ray tracing update and a sale pushed me to get it. Couldn't load back into my game in the first room; this fixed the insta-crashes.
One thing you didn't try that helps with the cpu capping is turning down crowd density from ultra+ to medium.
And I thought my 3080 was doing badly even with DLSS. I was somewhere in the marshes, hovering at about 52-55 fps @ 1440p. I disabled only RT, kept everything else maxed out, and went straight to 120+ FPS. Easy choice for me. Yes, the lighting and the water looked amazing, but there's no way in hell I'd trade 120 fps for 50 or even lower; I bet it would drop way lower in certain areas.
Yes, Nvidia Reflex really helps with this on my RTX 3060. I still think this new patch needs further work; it's not acceptable that it runs well only on an RTX 4080 or RTX 4090...
Ya I was forced to lower settings on my 3080 ti lol.
@@sgtbeercanyt Low spec gamer we need you back lol
Agree! Wow, with Reflex disabled, GPU utilization is back up to 100% almost everywhere for me. Input lag is worse, but smooth frametimes finally! (RTX 2060, no RT)
They meant it when they said that this is for the next gen hardware.
@@sgtbeercanyt that's because the 3080 Ti was overpriced trash. People assumed it was high quality based on price and lack of availability. The new iterations are no different.
I'm running 1440p, max settings, DLSS Quality, all RT on. 3080 Ti, 9900k, 32GB DDR4 3600MHz. I'm getting 70-80 fps. Go to the NVIDIA Control Panel and enable Threaded Optimization; this fixes most performance issues on all CPUs. I've seen a 36.5%-50% increase in fps across the board.
Thank you! This helped me reduce the stutters. Sadly I still have a bad frame rate. I have an i7 4790k, RTX 3080 12GB, and a 3840x1600 display, and in the first village after Woesong bridge, with all RT and Quality DLSS, only 27fps. On the road and open map I get more than 45fps. I'll need to upgrade my computer soon, I guess.
@@goranbenak1201 your cpu is heavily throttling your gpu here. Your cpu is really due for an update
Your PC spec is identical to mine, I'll give your idea a try tonight. Cheers. :)
@@Teeb2023 Threaded Optimization turned on globally didn't help me out; it actually made things worse somehow. I'll try applying it directly to the game instead this time around. In about 2 hours I'll give you the exact settings I've settled on, which still have 3 RT effects enabled, just dropping some Ultra+ settings down with no real noticeable difference in quality but a decent boost in FPS. Currently getting 42-50 fps now :)
@@goranbenak1201 I upgraded from a 4790k to AMD 5600x and modern games ran better. Not huge difference for max framerate but for low framerate dips definitely better
I found that adjusting the settings in-game is ineffective in DX12. I only adjust graphics settings on the main menu; that's done the trick.
Also just did a video with an RTX 3080 yesterday and my findings are pretty much the same. I did, however, have a few more issues: disabling and enabling RT in-game would cause my game to crash, and changing DLSS settings too quickly also caused it to crash. I tested in a different area than you and was even more CPU limited on a 12700K at 1440p. Game needs a serious patch!
Yeah, and I was very excited to try it. Hope they fix it.
This next-gen update is indeed terribly unoptimized even with RT disabled. I would rather play the older version with mods and have a good experience.
DX11 runs flawlessly on anything with around 8GB, as long as you don't turn on HairWorks.
I may be wrong, but I believe someone explained that the reason for the bad performance and bottlenecks is that, instead of porting The Witcher 3 natively over to DX12, CDPR used some kind of DX wrapper meant to easily port DX11 games over to DX12. I believe Microsoft came out and stated that the DX11-to-12 wrapper is meant only for 2D games, and why CDPR used it on this massive open world game with hundreds of hours of content is beyond me.
This is correct and I wish more people would talk about it instead of just saying. "Oh it's just a really heavy RT game" like, if my 3060Ti can handle infinite bounce RTGI in Metro Exodus Enhanced in the open worlds, including Taiga at 70-100fps, and that's a heavier game graphically than the Witcher 3... yeah.
@@licensetoshred That's what I was thinking. Absolutely, Metro Exodus is a very very graphically demanding game, especially with the enhanced edition and raytracing. It all boils down to how much the devs want to optimize the game, and CDPR took the lazy route, and now the game runs horribly.
@@thecubanmissile4806 I didn't know this, thank you for that info! It's annoying that people make up stuff like that, because I've heard it from multiple places and it's really misleading. CDPR should've just taken more time and optimized the update before pushing it out.
@@Amfneey Yeah, the real way they did it is using Nvidia Remix, which is used to put ray tracing in DX9 games. I think they got help from Nvidia to get it working with a similar wrapper concept, but it was definitely not meant for 2D games.
This is interesting, but I’d like to actually see the best optimized settings you can do. No one needs to play with settings all at Ultra+, etc. … RT and resolution/scaling aren’t the only buttons you can press to improve performance.
I'm feeling a bit heartbroken. I started a new run of Witcher 3 expecting the next-gen update to run like butter on my 3080... and it has been just... heartbreaking.
I went from a 5600x to 5800x3d yesterday and it massively improved GPU utilization and fps with a 3090 at 4k. There are still some drops below 99% and there seems to be a memory leak as well as the exact same "fps dropping after opening the map" bug that Cyberpunk has had for years now. Whatever CDPR is doing, they ain't doing it right in terms of optimization, but the RT itself is fantastic like in CP.
Even at 4k??
Please just say CP77; "CP" is very jarring to read LMAO
RT for 40XX cards, yes, well done. I have a 3080 12GB OC and I disable RT at 4K. Better performance.
@@mikem2253 yeah, it made a huge difference with all poorly optimized CPU titles, and there were more of those in the last 3, 4 months than in the previous year+ lol.
Spiderman/Miles now run with RT on high at 85-110 range whereas before it was dropping to 50s and was inconsistent as hell due to massive drops. Plague Tale in the market sticks to 80+ similarly, and it was dropping to mid 40s with the 5600x, and Witcher 3 is nowhere near as erratic and stuttery and also sticks in the 50-60 range.
It's really a bummer how shitty and overwhelming the Witcher's motion blur is because with CP2077, it's so good that even locking to 40 on a 120hz screen feels very good, but here it looks like a blurry mess and disabling it makes low frames feel awful as usual.
This is all with DLSS quality and balanced only for Plague Tale.
@@omarcomming722 Annoying lol, I just ordered a 7900 XT and run a Ryzen 5600. I refuse to spend more on my system now, but maybe I'll grab a 5800X3D in a few months.
I started up a game. Played for 20 mins on a 3080ti 5800x3d. Didn't like the performance. Shelved the game and am waiting for updates.
The observation about Reflex smoothing out frametimes is very interesting. Seems odd as in some titles I've heard others suggest disabling CPU prerendering makes frametimes noticeably worse.
Yeah this fix has been around for a few days now. I've tried to tell people about it but there is apparently AMD Anti-Lag also which might help. (only in AMD software options)
Honestly for the frame-rate hit I just don't think RT is worth it on this title unless you have a 40 series card.
Yes some nice ambient occlusion and sometimes ok reflections with better indoor lighting is nice, but do we really play games at 30fps on PC?
The RT hit on the GPU is actually quite reasonable; it's the CPU that is bottlenecking, and given we have seen many complex games like Cyberpunk work fine on the same CPUs, I'd say they could improve it a lot. It's not a good look for the devs.
It's kind of a joke that a 3080 can't handle 1080p 60fps. CDPR needs to fix this ASAP.
You do understand that this game is fully path traced? It's immensely impressive that it runs as well as it does.
@@Clutch4IceCream It is not fully path traced; there are only a handful of games that are, and this is not one of them.
Bro it’s not path traced😂
@@Clutch4IceCream No it's not. It uses RTGI as the main RT feature. Go look at my Metro Exodus Enhanced videos and check the framerate; that game uses infinite-bounce RTGI, which I have on ULTRA in those videos. The reason for the bad performance is they used a quick D3D11-to-D3D12 wrapper to port the game to DX12, which Microsoft themselves said nobody should use unless the game is really simple graphically. They didn't actually make it a proper DX12 game, hence why the CPU usage is so fucked lmao. Also, I'm running a 3060Ti in that game lmao, and still get 70-100 fps avg in the open world sections.
I do not know if this adds to the issue in this particular game, but I have observed in the past that leaving Nvidia GPUs (and AMD for that matter, although their algorithm makes this very hard) on their default clocking behaviour is bad in almost all scenarios. What I like to do is pick a voltage setting in the voltage curve (via Afterburner, for example) and lock that specific step in as the maximum. My 4090, for example, runs at 0.92V at 2750MHz constantly. It does not go past that, it does not drop below it, and it never reaches the TDP limit either, because that would cause the GPU to clock according to the power limitations. That is exactly what you want to avoid. There are many games that react with stutters if a clock change occurs, especially if it is significant, like 25MHz and above at a time. This is due to every scene in every game utilizing the GPU differently and thus also putting a different load on it. While you can stay below the TDP in most FHD/WQHD settings for almost all games, especially with DLSS and FG involved, that is not the case in 4k. In 4k, most current games will completely load the GPU most of the time. Most of it, but not all the time, and thus you easily get clock fluctuations that contribute to further stutter.
Locking the card into a specific clock/voltage step also has other benefits. You get a more stable experience, especially when OCing, because you do not have to worry about instability at different steps along the voltage curve, making it much more reliable. Then there is the benefit of consistent performance across the board, and usually more performance at that, because, ideally, you want to put the GPU in the spot where it is most efficient. The more voltage you allow, the more diminishing returns you will get because of all the excess heat and TDP limitations, shrinking the OC potential with every step. Simply limiting the power target does not yield similar results; it just shifts them around. It is always best practice for Nvidia GPUs to limit them to a voltage-curve setting at which they will never, ever reach the TDP limit to begin with. It saves you power, and the card will be more stable and run cooler and more efficiently. Also, you lose no performance whatsoever. On average, an aftermarket 4090 will clock between 2650-2800MHz during 3DMark's TimeSpy Extreme. Just locking it into place, even at the stock voltage, to 2725MHz will yield the same or slightly better results. If you then mod it so that it can reach those 2725MHz with 0.05V less or even lower than that, that is when you truly win.
Anyway, I can highly recommend this to improve system stability and to avoid added stuttering in games. It does not affect all games, but it does massively affect some. I remember when the COD Modern Warfare remake came out and Warzone was first introduced... there especially, in the beginning on my then RTX 2080, it was borderline unplayable for me before I smoothed the voltage curve out.
will try this with an AMD graphics card, see how this holds up with it, thanks for sharing
@@gara8142 any update with amd cards?
I held off on another run until this update. It runs fine, but I'll wait some more time until they hopefully optimize it.
Not just the fps performance but I'm also waiting for the LOD pop-in fixes.
I'm playing this game pretty much maxed out with RT on a 5800X3D + 3080 12GB, and yeah, at 1440p I'm mostly at 50-60 fps. It was hard for me not to give up the RT because it changes the lighting and water so much. It's beautiful, and this game deserves to be viewed with everything turned up (draw distance wise). It's really a shame there wasn't a native DX12 port; the performance of this amazing game really suffers for it. On one save file, going through all the DLC, I'm already at 220+ hours and not done with Blood and Wine yet.
There is a forest near this town where ray tracing is really taxing. I was getting 40+ with some settings tweaks in the city, but frames were dropping to the 20s in there. RT looks good, but I can't really play with it *sigh*
I feel like lately devs are doing less and less to optimize their games. Darktide, A Plague Tale: Requiem, Gotham Knights, and now this might as well have a 4090 as the recommended GPU.
Not even a 4090 runs it buttery smooth as it should... And that's a major kick in the balls. Because there should in no way be any problem with a fookin 2000 Dollars gpu...
What's the software used? I mean the osd part with the graph etc?
Thx for the info btw!
I have an i9 9900k, a 3080 10GB and 32GB of DDR4 RAM. At 4K at your same settings (everything on Ultra+ and all RT on, DLSS on Quality) I only get 28fps (with rare spikes of 40fps) when things don't bug out... When they do, my fps collapses under 10 until I exit and re-enter the game.
The Witcher 3 is now as broken technically as CP2077. Nice work, CD Projekt Red...
Reflex low latency will not work and breaks when you go to Novigrad, the place where there's a town hall with a banker and a blacksmith.
They have to come out with an update to squeeze frames out of the settings with RT off. I understand that with it on it will tank frames, but with it OFF? It should be better.
I have a 13700K/32GB DDR5-6000 CL32 + 3080 and I get pretty much the same FPS: 55-65 in Toussaint and Novigrad. There are definitely some serious problems. They really need to optimize it; it feels like it's not optimized at all. It's like they just pushed the sliders to the max and called it a day.
Next RT Egg after Portal RTX
Quick question, how did you figure out at 8:13 you were cpu limited? To me it looks like it’s only at 36% utilization. I’m asking because I think my rig is cpu bottlenecked and I want to find out.
CPUs have lots of cores and threads, and game code isn't parallelized enough to use them all. If a single game thread is the limit, then the CPU is the limiting factor despite not all threads being at 100%. Yes, you usually spot it by seeing GPU usage drop.
Thanks!
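To make that arithmetic concrete, here is a tiny sketch; the thread count and per-thread loads are made up for illustration, not measured from this game:

```python
# Hypothetical per-thread loads (%): one game thread pegged at 100,
# the rest doing light work. Numbers are illustrative only.
threads = 16
per_thread = [100] + [15] * (threads - 1)

# Aggregate utilization looks low even though the game is CPU-bound.
overall = sum(per_thread) / threads
print(f"overall CPU usage: {overall:.1f}%")  # ~20%

# The bottleneck test isn't total usage; it's whether a single
# critical thread sits at (or near) 100% while GPU usage drops.
cpu_limited = max(per_thread) >= 95
print("CPU-limited:", cpu_limited)
```

This is why a "36% CPU usage" readout can still mean a hard CPU bottleneck.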
I can confirm that AMD Anti Lag with RTGI or all RT turned on will remove the frame pacing issue (stuttering). But yes, needs further optimization for sure. I can run CP2077 with RT (but reflection off) at 60 FPS with FSR2 Balanced.
How many fps do you get with reflections?
If I don't undervolt my GPU, in the crowded common-folk area in Toussaint it's around 40-50 FPS.
But yeah, the hotspot was at 100C, so I'm just going to wait for further optimization.
Thanks for the update. I'm using an RX 6700 XT and can finally play this with 40+ fps without the judder and stuttering.
@Daniel Owen Is your case's side panel installed (left on) when you do these tests? Enjoy your content... Thanks.
Yes. Just running everything as a normal pc at stock settings.
@@danielowentech Thank you. Working on an airflow problem; saw your 340-350w watts at 71-72c and had to ask. Happy holidays.
This is kind of why I like console gaming. You don’t have to spend any time fretting over all this stuff. You just turn the thing on and play, and turn it off. So much simpler and way, way less expensive.
It runs better than console, and looks better, with no issues at 60fps. But that's because on console you're running at sub-4k at a locked 30fps in a gimped RT mode (no reflections or shadows), or at less than Ultra+ at 60fps 1800p (a mix of high and ultra with some Ultra+ settings). You can do the same on PC if you're happy with console-quality visuals. But we're used to 120fps; even 60 doesn't cut it. My PS5 is mostly a blurry, chuggy mess, and it's connected to the same monitor I use for my PC. Your virgin eyes have not yet seen the light of the PC gods side by side next to console peasantry on a 144hz 48-inch 4k OLED. Your cheap 65-inch TV makes everything look like virgin shit. Your fate is one of perpetual peasantry. Ignorance is bliss, as they say. Plow my fields, wretch. (just kidding lol)
Hi, any reasons to not use Reflex: "On + Boost" instead of only "On" ?
I never had the crashing issues. One thing I noticed though: when I enabled all the ray tracing settings with everything set to Ultra+, the game seemed stable at 45-60 fps for the first 10 minutes after launch, but after that it dropped to 10-15fps in any area. It's downright unplayable.
Thank you for being late for work, for our sake! We appreciate your efforts big dawg! 🍻😁
Here’s a conspiracy: CDPR and Nvidia did this on purpose so that people would wanna buy 4000 series gpus.
True
This is a big fault on the devs' side. Sure, they gave us a free update, but if we cannot even run the game at 60 on high-end GPUs, then any update does not make sense. A 7-year-old game should not be this intensive; if their RT implementation is normally this expensive, imagine what they are going to do with Witcher 4. Holyyy.
Hey Daniel, what software do you use for the stats display up in the left corner?
MSI Afterburner with RivaTuner.
Hi Daniel, do you think you could test this game with an RTX 3060 and include a section where HairWorks is off?
DX11 does not support HDR, and HDR is a huge improvement on HDR-capable monitors. Really, it's much more impactful than RT, and I advise anyone who has an HDR monitor to run the game in DX12 with RT off if they don't have a 40-series card.
Yes hdr is really good in this game. Playing with rt ultra+ on my 4090. Gonna finish the game again. It’s a different experience 😱
How are you getting Afterburner to work in DX12? I've tried a few "fixes" but none of them worked.
Hi Dan, could you by any chance test this with a monitor that has a built in refresh rate indicator?
I've a 3090 (1440p) and the Nvidia overlay says my framerate is around 40 to 50, similar to your 1440p results here,
but my monitor itself says the refresh rate is jumping around 70 to 80Hz/FPS,
I commented in the prev 4090 video where the FPS indicators from overlays might be severely bugged
More likely your monitor is in a low framerate compensation mode when you drop below its freesync/gsync range.
@@danielowentech
Oh didn't even realise that was a thing, thanks,
it actually does look smooth though so that's a plus
Yes, for example if your game is at 40fps, the monitor will go to 80Hz and show each frame twice.
@@danielowentech
Thanks so much for the replies! Cleared up a lot of confusion
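The frame-doubling described above is low framerate compensation; a minimal sketch of the idea follows. The VRR range numbers (48-144) are hypothetical, just to illustrate why 40 fps shows up as 80 Hz:

```python
def lfc_refresh(fps: float, vrr_min: float = 48, vrr_max: float = 144) -> float:
    """Refresh rate a VRR monitor might report for a given fps.

    Below the VRR floor, low framerate compensation repeats each frame
    n times so that fps * n lands back inside the supported range.
    """
    if fps >= vrr_min:
        return fps  # inside the range: refresh tracks fps directly
    for n in range(2, 12):
        if vrr_min <= fps * n <= vrr_max:
            return fps * n
    return vrr_min  # fallback; real firmware behavior varies

print(lfc_refresh(40))  # 40 fps -> 80 Hz, each frame shown twice
```

So an overlay reporting 40-50 fps and a monitor OSD reporting 70-80 Hz can both be correct at the same time.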
Honestly, RT doesn't look "better" in every sense or scene. Just "different." Reflections are better, for sure.
But, does anyone else find some scenes in CP2077 & TW3 worse with RT enabled? It seems more distracting and sometimes causes an "uncanny valley" effect.
I'll be disabling RT during my next playthrough.
It really depends. For the most part I find RT in TW3 is like applying a ReShade preset instead of a real change, and even though some scenes look way more realistic, I feel like it crushes the art style a lot. Reflections are the most welcome and game-changing effect, but you need RTGI for them to work.
Nah, I also find RT to be useless, and tbh it probably won't become a thing for another 10 years. So for now AMD GPUs are the better choice.
I remember people complaining about GTA mods that made the streets all shiny with wet puddles everywhere; well, that's what I've seen in Cyberpunk with RT. Same shit, but because it's RT, people are like "bruh that's so goooddd." Whatever.
Well, RT is more realistic. Whether that's better or worse is personal preference; some people may prefer the fake but more fantasy-like screen space effects. Regarding the over-reflective surfaces, that's because games still have terrible particle effects and physics. If dust and dirt could be simulated properly, open world games would look so much better.
@@rdmz135 agreed. Games just aren't there yet. Especially when almost every game is optimized for consoles.
Why is ray tracing heavier here than in Cyberpunk?
Question! I have used RTSS and Afterburner since I moved to PC, but on every setup I've used, the OSD FPS average never works; it only shows the actual FPS. How do I get the averages to show? Do you use HWiNFO to share to RTSS?
The DX12 version of this game needs some serious patching imo. DX12 has frame-time issues and fps just tanking in certain scenarios, whereas in DX11 mode everything just runs fine. Weirdly enough, in DX11 mode the game crashes when I OC my GPU, while in DX12 mode the game runs fine with that same OC profile, no crashes whatsoever.
Interesting video and thank you for including 4K DLSS Performance. I feel that this is a setting that's often overlooked. The giveaway here is that if you tone down the settings to Ultra or High you can get pretty acceptable performance on a 4K TV while not sacrificing too much image quality.
Yes, it needs a 4000 series to run maxed out with RT. My ROG Strix 4090 gets 100fps at DLSS Quality with frame gen, although it still suffers some slight stuttering! Looks brilliant, but hopefully we get some optimisation!
RT is just the latest idea GPU makers came up with to make you think you need an upgrade. But when you play, unless you stare at the screen instead of playing, the difference with/without RT is often quite subtle.
Don’t be fooled dudes!
This was expected with RTX and 4k trying to become a standard around the same time. That's why I stuck with 1440p on a 27-inch monitor for my 3080.
It would be great to see a sim racing title such as “Assetto Corsa Competizione” in your tests. We have 3M people on Race Department that would love that.
Your 3080 would clearly perform better with an undervolt. I have 2 profiles, but I mostly use mine at 1900MHz; it sits around 280-300 watts and runs much cooler with a stable clock.
+2 FPS?
@@SebbS82 2 fps at that fps is huge....
@@DanCollier1 but doesn't feel like it brings much
@@SebbS82 It gets him so close to 60, which feels better; it draws less power, which saves money; and it runs cooler. It's all positives in this instance.
I hate that publishers are tanking our FPS with shitty optimisation just to sell high-end GPU
Optimize your games for fuck sake
How did u get MSI Afterburner to work???
I have a 3080ti and a 12900kf. Every time I turn the game up to 4K resolution it crashes. I can run at Ultra+ with RT Ultra in 1440p.
Yeah, I have a 3080ti with a 5800x3d and mine would crash when I try to adjust the graphics settings up or down. Had to switch back to DX11 in order to play the game without crashing
oh wow, that reflex made it like it boosted the fps, true magic
Maybe it's just me, but I saw better performance when switching graphics settings, then going to the menu and reloading the save.
Is it typically better to use Reflex on or Reflex on + boost?
I prefer Reflex on (no boost) because I value the power saving of not running at max clocks and the highest power state all the time vs the imperceptible difference in latency.
However, in CPU limited scenarios boost is most effective giving up to a 15% reduction in latency and given the Witcher 3 RTX is CPU limited it might make sense to use it in this case.
@@playcloudpluspc thank you
Thanks for another entertaining and useful video! I've been having similar issues with my 3070ti, I'll give some of your suggestions a try, not a big fan of raytracing in general but it would be nice to see the game with all the next gen graphics turned on :)
5:08 totally unexpected and unexceptional cpu usage. DF said that dx12 requires dev time to efficiently manage hardware resources. Guess they really just wanted to get this out for Christmas
This update was not optimized well for ray tracing. It's funny, because I just got done playing Metro Exodus Gold Edition, where they went over the entire game and fully ray traced everything, literally everything, and I never went under 75 fps, usually around 80-85. That was on extreme maxed-out settings at 1440p. I can't even get 50 fps in Witcher 3 at max settings, running an RTX 3080 FE.
But TAAU does not equal native, right? At 9:22 you talk about native but switch to TAAU, which irritates me. Surely I am missing something here, but it is still upscaled, is it not?
Nope, it's just TAA at native res. If you enable dynamic res then it upscales. Using TAAU here decreases fps by 2-3, just like TAA; if it were upscaling we should be getting more fps.
@@ashank9856 aah thanks for the explanation. I find that way a bit misleading but now I am wiser ^^
Yes dude, even I was confused at first by the 'u' at the end.
I've been using an RTX 3080, and I've gotta say, ray tracing is just not very efficient for rendering in terms of the quality you get for the GPU compute performance it takes to achieve it. This is why standard rasterized performance is still more important than ray-traced performance. Ray tracing is always the first thing you want to turn off in order to improve fps.
Ray tracing is all about Nvidia leveraging the fear of missing out (a.k.a. "fomo") to scare people into buying their GPUs over AMD. Because there's so much bias toward Nvidia to begin with, it works very well. People look for a way to justify the opinions/ biases they currently hold, and Nvidia's ray tracing performance is very effective at this.
You buy a $1000+ video card and turn off the best feature? Man... try lowering the resolution instead and look at the game; ray tracing looks much, much better!
^ I'm with this dude. When I need frames the first thing I do is turn down resolution. Well implemented RT makes a much bigger difference than resolution. Cyberpunk, Metro Exodus: EE vs non RT standard version - these are immediately noticeable differences where as with resolution I have to squint my eyes and go "Oh, yeah. I think that fence in the background is a little more crisp at 4k".
Some games do it well, some games not really. Metro Exodus EE is a great example, and that is what this Witcher update should have been... RT is a big advance in graphics, but it still needs some research effort on the developers' side.
@@cristianspiridon The 3080 is not a $1000+ GPU...
@@Chocapic_13 True. But if you count, most games look much, much better with RT! It is more important to add it than going from medium to ultra!
What software do you use to overlay fps counter?
MSI Afterburner and Rivatuner.
You're testing a village / part of TW3 that was already giving performance issues before this patch.
The problem is the DX12 version of the game; try out DX11. I go from 60-70 to 90-100 fps (both without RTX).
Yes but you can’t use ray tracing in DX11 mode
The DX12 version has its own settings files. If you have more FPS in DX11, it's just because you have lower settings than in DX12. Open both versions and check your graphics settings carefully; you probably have higher settings in the DX12 version, like screen space ambient occlusion and anti-aliasing.
Initially, launching the game through the CDPR launcher only let me use FSR for my 3080. Even on ultra performance in FSR, using raytracing just destroyed performance.
The workaround came by manually launching the game from the bin -> x64_dx12 folder. That let me choose DLSS, which works decently with raytracing in performance mode.
How is MSI Afterburner working when it's known for not working with the DX12 version of this game?
I pinned a comment on how to do it in my 4090 test
Why doesn't msi afterburner show cpu temps? I have that issue at the moment. Is it a new update that no longer shows it?
To show CPU temps you need HWiNFO64 and to add it in Afterburner to display it. To show RTSS in the game you need to make an RTSS profile for The Witcher 3 and then add these 2 lines:
[RendererDirect3D12]
D3D11on12=0
in the file witcher3.exe.cfg that's in the rivatuner statistics server profile folder
@@austintse5719 Thank you, I'll be sure to do so right now
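The file edit above can also be scripted; a minimal sketch, assuming a default RTSS install location (the `PROFILE_DIR` path is my assumption — point it at wherever your install keeps its profiles):

```python
from pathlib import Path

# Assumed default install location -- adjust for your system.
PROFILE_DIR = Path(r"C:\Program Files (x86)\RivaTuner Statistics Server\Profiles")

def write_witcher_cfg(profile_dir: Path) -> Path:
    """Create witcher3.exe.cfg with the two lines from the comment above."""
    cfg = profile_dir / "witcher3.exe.cfg"
    cfg.write_text("[RendererDirect3D12]\nD3D11on12=0\n")
    return cfg
```

Run `write_witcher_cfg(PROFILE_DIR)` once, then restart RTSS so it picks up the new profile.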
I try "Witcher 3 Next Gen" yesterday have noticed stuttering on horse (RT ON, RTX 3070, 30fps, 1440p, DLSS Quality) Today I install new driver 527.56 part of village stuttering GPU on 99% power only 70% even crashed the game. Finally after restart run normal RT ON, RTX 3070, 1440p, DLSS Quality GPU on 70-80% power 70-80% and fix 30fps.
I don't know about playing capped 30 fps on a 3080.
I got this advice to solve performance problems with the next gen update from the comments section of another video and it helped me:
- go to documents/the witcher 3
- open dx12user.settings with a text editor
- change the key "LimitFPS" to half your monitor's Hz minus 10 (ex: 120Hz = 50); this won't actually limit your fps to 50 (leave FPS in the game settings on "Unlimited")
- LimitFPS=50 (in the example)
- save and exit
- open game
- all stutters gone
I installed this next gen update on Dec 16, 2022, via Steam, where I originally purchased the game, from version “v1.32” up to now version “v4.00”
My gaming pc includes an AMD 5600X CPU (3.7 GHz, OC 4.6 GHz boosted), an AMD RX 6700 XT 12GB GPU and 16GB RAM (2x8GB DDR4 3200 Mhz) and “AMD SmartAccess Memory - SAM" enabled via BIOS.
Based on this advice ("half your monitor's Hz minus 10") I set "LimitFPS" to 72.5 because my monitor is 165 Hz/0.8ms/Fast IPS.
The game now looks a bit better and any stutters are gone with for me very satisfactory FPS around 70 to 135 FPS, sometimes less or more depending on the in-game situation.
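If you'd rather script the edit than do it by hand, here's a minimal sketch. The file name and the "half your monitor's Hz minus 10" rule come from the steps above; the regex and backup behavior are my own assumptions:

```python
import re
from pathlib import Path

def fps_limit(monitor_hz: float) -> float:
    # Rule from the steps above: half the monitor's refresh rate, minus 10.
    return monitor_hz / 2 - 10

def patch_limit(settings_path: Path, monitor_hz: float) -> float:
    """Rewrite the LimitFPS= line in dx12user.settings, keeping a .bak copy."""
    limit = fps_limit(monitor_hz)
    text = settings_path.read_text()
    settings_path.with_suffix(".bak").write_text(text)  # backup first
    patched = re.sub(r"^LimitFPS=.*$", f"LimitFPS={limit:g}", text, flags=re.M)
    settings_path.write_text(patched)
    return limit

print(fps_limit(120))  # -> 50.0, matching the example in the steps
print(fps_limit(165))  # -> 72.5, the value used for a 165 Hz panel
```

Point `patch_limit` at `Documents/The Witcher 3/dx12user.settings` with your monitor's refresh rate.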
I have often heard the advice that MSI Afterburner and similar software should be deactivated.
I’ve been playing on a 7950X/60 gigs of ram and 3090. It does not run well and, frankly, the only way I can tell the difference is with side-by-side screenshots. The thing crashes all the time. Disappointed and now I get why DF only got videos a few weeks back.
The game is badly optimized for DX 12. Turning on and off raytracing will crash the game. I’m running this with a 5950X and a RTX 3090ti FE.
RT will crash on most PCs if you toggle only global illumination. If you enable every RT effect, you can turn it on and off fine; just don't use global illumination on its own.
The more I fiddle with graphics, the worse it gets. It’s unplayable and makes me nauseous.
@@allansolano5587 I have to turn it on otherwise I can’t turn on any other Ray tracing menu.
@@TheCrzyman0 lol as long as you leave antialiasing on your game will look good. When you turn it off it looks super weird.
Very good video, it's a shame that the 30xx cards struggle so much with this. Even if you are one of the odd people who really enjoy staring at game lighting and are willing to sacrifice FPS, it's a bit disappointing with DLSS 3 being locked behind the 40xx cards and DX12 performing as badly as it does.
I get noticeably worse performance with my 3080 + 3700X + DDR4 combo. Average fps around 44 with everything maxed out (bar HairWorks) at 1440p. New technology sure does make a difference.
Strix RTX 3080, paired with a Ryzen 9 5950X on a ROG Crosshair VIII Dark Hero mobo, and no matter what settings I use, as long as ray tracing is on it stutters constantly. So I think RT is just busted in this game unless you have the new 40-series cards; then it might work as intended.
It's becoming clear now that Nvidia is using Ray Tracing as a weapon to make low and mid range cards look highly unattractive so they can sell more of their (now overpriced) high-end ones.
Still has micro stutters due to bad CPU thread optimization. If you showed all CPU threads you'd see how hard it hits just one thread at a time and hops around. Really bad.
Game crashes a lot on RX 7900 XT when I try to activate RT or even just FSR with high, ultra or ultra+ settings.
It works on low with no anti aliasing.
I just wanted to acknowledge the immense work and time you take to make your videos. The description alone speaks volumes of your content. 😎😎
The worst part of the next gen update is the poor DX12 implementation; my PC can't even hit 30 fps at native res, while on DX11 I play at a stable 60-80 depending on the intensity of the scene.
Sad to see a 3080 having issues with a 6 years old game
In the original you get 100 (90-120) fps at 4K without DLSS. The patch brings the game to "Next Gen". The graphics and the detail of the trees and grass are amazing, and the new textures look very nice. But the RT is too heavy. Why hasn't CD Projekt given us more RT quality steps?
I'm about 10 hours into a DX12 playthrough with ray tracing. Every so often I have to back out to the main menu because my frames drop from ~55 to 30 for no reason and stay there until I reload.
Hey Daniel, I enabled Radeon Anti-Lag on my AMD GPU and it's working. No more frametime spikes. So yeah, I confirm that it works. 20-30 fps on my Radeon 6700 XT though, so I won't play it like that anyway. A 7-year-old game (remaster) that only runs well on 2 GPUs. Maybe it works on the new RDNA3 GPUs, idk, but anyway those are all $1000+ GPUs. Tone-deaf game devs.
It works. Why do you even need to turn on RT? The Radeon 6700 XT isn't a ray tracing card, tbh.
@@anuzahyder7185
Of course I don't play with RT on. I only tested with RT to see if anti-lag gets rid of those frame time spikes.
The game runs OK without RT... but in DX11 mode I get double the framerate, so I guess they still need to optimize this. That's probably not going to happen with this DX11-to-DX12 wrapper that they use... If this were a $60 game I would ask for a refund instantly.
I found a weird thing with Reflex in this game: when I turn it on, the game drops almost 10 fps and my GPU usage falls to about 75%; when I turn it off I get the fps back and my GPU goes to 100%. And if you want to use DLSS or FSR you have to use DX12, because you don't get them on DX11.
dlss/fsr are not exclusive to dx12..... games even without dx12 have these features
But nvidia has no bugs! You must be wrong! /#TheInternet
🤣
@@123TheCloop I know bro, I'm just saying that for some reason they didn't include FSR and DLSS in the DX11 version of the next gen update, and a lot of visual improvements besides RT, and even HDR, are tied to the DX12 version and don't exist in the DX11 version.
@@PixelShade Nvidia's drivers are actually even worse, lol. Look at the XTX launch: it went fine, not a single problem. The Nvidia launch? Burning adapters and driver issues. The fact that AMD is better despite having fewer people is crazy; this is the generation where AMD wins, just like AdoredTV (the saviour of AMD fans) predicted.
@@Drip7914 yeah it's kind of crazy. I have owned graphics cards, 3D accelerators and GPUs since the early 90s, and by now I've owned equal numbers of Nvidia and AMD/ATI cards, and ultimately neither one stood out as better. Until now, when I actually prefer both AMD's software and drivers in this current generation. I will have a really hard time going back to Nvidia's Windows 98-style control panel and their neglected GeForce Experience.
It's strange, where is the fps drop coming from? I play with RTX off and only get about 60-70 fps. Before the update I got about 120 fps on the same settings (including mods).
Didn't they improve the textures for every setting?
@@tiz6581 Yes they have, but that alone shouldn't drop the fps by that much. I played with improved graphics mods before (the ones that are included in the game now) and I still got around 120 fps.
@@OrbinHD it's because of the bad DX12 implementation. Run it in DX11 mode and it should be fine; RT is unusable anyway.
I can't even load into my save game without the game crashing unless I disable RT. And when I load into a game and try to enable it, the game crashes instantly. I'm running a 5900x and 3080ti. Next Gen update more like Witcherpunk 1277.
What did they do for 2+ years
In some games, injecting RT into an old engine causes issues that a game built from scratch with RT wouldn't have, and that sure seems to be the case here...
Somehow, the performance loss with RT is insane, while the uplift from DLSS/FSR is almost nonexistent, especially considering how much better both are in Cyberpunk (these days).
How did you get the msi afterburner overlay to work on dx12 mode?
Wanted to ask the same thing. Using the GOG DX12 version of the game and found a hack for the RTSS overlay... however, it made the game crash. So I'd love to see a functioning workaround (and hopefully CDPR fixing this in the next patch).
he's using dx11
I made a pinned comment on my 4090 test of this game explaining.
@@UniverseGd no he's not, he's using DX12 bro, because DX12 is the only API in this game that has RT???
@@coreypasternak9435 I guess so? I don't have a RT Nvidia card and quit the game after experiencing that buggy mess.
I can't even use DLSS, it's grayed out since I can only play the game in DX11; there's no option to switch to DX12. GOG version.
If you need Frame Gen and DLSS to run this at 60, it's not even remotely optimized and the features shouldn't have been added. They could have stuck with the 4K textures and called it a day.
The most inefficient graphics settings are the Ultra+ ones. No idea why people haven't figured out that you can't see the difference in motion; Ultra+ is for screenshots.
dropping the foliage distance/HairWorks will get you 10-15 fps back...
@@funtourhawk all I want is for CDPR to fix the fps drops. I don't even care as long as it runs at 40 fps, but those 5 fps slideshows every 20 minutes are unplayable.
Thank you so much Daniel! Hard to find in depth review and benchmarks for the 12GB version.
For me RTX isn't working at all :D
If I turn RTX on, my game crashes all the time. (3080)
Just play this game in DX11 and it runs way better. You get all the visual upgrades; only DLSS and RT are not available.
I thought I had a bug. I have an overclocked RTX 3090 and an i9 9900K and can't even reach 45 fps at 4K maxed with DLSS Quality. I thought I'd have 100 fps easily. I love the new ray tracing look, but it seems I need to buy an RTX 4090 or play the game at 30 fps.
So Nvidia is pushing for RT but doesn't give us affordable GPUs to actually run it. I'm gonna be pissed if the prices don't drop after Ampere stock runs dry.
Nvidia is not really pushing for RT, it's just a gimmicky feature to sell more GPUs, like Nvidia PhysX particles were back in the day. Just keep it disabled if you don't want a garbage unplayable framerate. I mean, if the game forced RT on, it would be another story. But it doesn't: it's just one click to remove it and you have super smooth gameplay again.
@@michaelzomsuv3631 people that call Ray tracing a gimmick are the true idiots
How do you pass that cpu limit?