Please give this video a thumbs up if you liked it, and feel free to comment below or ask me anything. It will help me get recognized by YouTube's algorithm. Thanks! :)
Timecodes:
○ 0:00 - RDR2 Ultra Settings, DLSS OFF
○ 1:07 - God of War Ultra Settings, DLSS OFF
○ 2:17 - The Witcher 3 Ultra Settings
○ 3:22 - GTA V Ultra Settings, FXAA/MSAA OFF
○ 4:45 - Cyberpunk 2077 Ultra Settings, RTX/DLSS OFF
○ 5:57 - Hitman 3 Ultra Settings
○ 7:12 - Assassin's Creed Valhalla Max Settings
○ 8:24 - Chernobylite Ultra Settings, DLSS OFF
Hi, does one have to use GeForce Experience to use DLDSR?
@temannopellaria No, you don't need to use GeForce Experience to use DLDSR. But you do need an RTX GPU; that's the only requirement.
@@MxBenchmarkPC I have an RTX 3080 TUF OC edition with a 1440p 240 Hz monitor.
I selected DLDSR (4K) in the NVIDIA Control Panel, but when I use the 4K resolution in any game, the performance is identical to 4K with DSR.
I have the latest drivers, of course.
@temannopellaria That's right, performance at 4K DLDSR and 4K DSR is identical, but the image quality is different. DLDSR 1620p is comparable to 4K DSR in terms of image quality, but with better performance. A better example in your case: your 4K DLDSR is comparable to 5K DSR in terms of image quality, but with the performance of 4K DSR.
@@MxBenchmarkPC Ah thank you, it's all clearer to me now. :)
This is an amazing technology.
I've been using 2.25x on my 1440p laptop, which makes it 4K DLDSR res. Playing games like God of War in 4K DLDSR with DLSS enabled produces some stunning quality with really high frames. I think DLDSR + DLSS may be an amazing option going forward. Really impressed.
Thanks for the awesome video!!!!!
Yeah, it is great that you can use DLDSR and DLSS at the same time.
At what values do you set the smoothness levels? Is 0% too sharp for 1440p?
@@Pecata_Gaming92 Hi. I've set mine to 0% and have not noticed any issues at all. Picture quality is incredible. I've only used it with Shadow of the Tomb Raider (4K DLDSR + DLSS Performance) and GoW (4K DLDSR + DLSS Quality), both at 0%.
@Deathbreeder In this video the smoothness levels were set to 15% in both cases. Yes, a 0% value will be too sharp for 1440p DLDSR.
PS, a great tip for GoW: in the accessibility settings, disable ambient camera sway. It makes everything shimmer, with or without DLSS. It was visible on console too, for example on foliage and snow. Disabling it keeps the third-person camera still (like in Tomb Raider or any other third-person game) and completely eliminates the distracting shimmer.
Was looking forward to this!
Glad to hear that!
Thanks mate! The video was hard to find, no doubt xD. I'll stay with DLDSR; when I get a better CPU I'll go for the 4K.
I have a 3060 Ti and a 1440p monitor. I tried the 4K DLDSR resolution at 30 percent smoothness and played Deathloop. I got an image comparable to native 1440p while standing still, with higher settings, and I also got a performance boost; DLSS was set to Quality. However, in motion the picture can become quite noisy, and some edges on window frames also appeared jagged compared to native. I think DLDSR still needs some development before it becomes commonly used.
I don't see that much difference in newer games. Then I tried it with BioShock 1 and, holy shit, it really feels like a huge improvement.
@@kaant21 Which version, the remaster or the original? I'll give it a go.
@@kaant21 It mostly helps with games that didn’t have TAA, and it’s much better than MSAA.
>and played deathloop
stopped reading there
Hey, I also have a 3060 Ti and a 1440p monitor, but when I go 4K via DLDSR my fps drops from 100 to 20 in SOTY (1440p to 4K).
You can't compare image quality on YouTube because it compresses the video. What you're really comparing is how well it compresses the different images/frames; whichever one is less noisy will look better on YouTube.
A good tech indeed. Last November I got a 240 Hz, 27-inch adaptive-sync monitor, locked at 1080p. I also got the RTX 4060, which is aimed at 1080p. I regretted a bit having bought such a large monitor for 1080p; the pixels get too far apart. When I discovered DLDSR, I was super impressed. I set it to 2.25x DL 1620p (I read somewhere that it scales better at 2.25x than at 1.78x, but I can't really be sure). I played Guardians of the Galaxy at full Ultra + Ultra RT + DLDSR 2.25x + DLSS, averaging 70 fps or so; an amazing combo, and the difference was apparent. The performance drop was negligible. I recommend it if you were stuck at 1080p like me.
At about a 10% cost compared to 4K DLSS Quality. Using the 1.78x option in games with DLSS (with the Performance mode instead of Quality) gets rid of all the anti-aliasing problems.
Even with Red Dead Redemption 2?
@@gevelegian You need 4K (the 1.78x option plus DLSS) and also a strong enough GPU (40 series) to get a consistent 60+ fps.
@@0253749depaula I've been experimenting with all sorts of options to bypass the TAA blur on a 1080p screen. So far, the DLSS Quality mode is alright, but it handles vegetation kind of weird. I'll try swapping in newer DLSS DLL files and experiment with that.
@@gevelegian with 1080p? Good luck. The only real solution is sitting further away from the screen
@@0253749depaula I have noticed, never thought I needed anything above 1080p. If they didn't use that weird dithering pattern on transparent objects it would have looked fine.
Don't know about 1920p DLDSR, but this 1620p DLDSR looks almost identical to 4K DSR (and at only a fraction of its performance hit). I think the latter looks ever so slightly sharper... but if you told me you had mixed up which one was which, I would believe that as well, so...
Marvellous job, NVIDIA!
Yeah, indeed.
Man, I hope all games would support DLDSR smoothly. I hate what happens when you alt-tab xD. So far CP77 has the best support.
DSR looks sharper and clearer.
Could you make another video with 1080p native vs 1620p DLDSR?
I'll look into it.
@bruno longhi 1080p native vs 1620p DLDSR video is ready and published. Feel free to watch it :)
@@MxBenchmarkPC nice one!
@bruno longhi thanks!
So for people on a 1080p monitor it's not possible to get 4K DLDSR?..
*that shit's stupid*
There's literally no reason for 4K DLDSR on a 1080p monitor. DLDSR is mainly there to help with DSR resolutions that can't be evenly divided into 1080p. Any resolution other than 4K on a 1080p screen will look awful when you use it with plain DSR; DLDSR helps by applying AI filtering to make it crisper.
TL;DR: since 4K is exactly double 1080p on each axis, it already downscales perfectly as it is. DLDSR on it would be pointless.
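To make the divisibility point concrete, here's a tiny sketch of my own (not from NVIDIA's docs) that checks which DSR targets map onto a 1080p panel with a clean integer per-axis ratio:

```python
# Which DSR target resolutions divide evenly into a 1080p panel?
# Plain DSR only looks clean when the per-axis ratio is an integer.
NATIVE_W, NATIVE_H = 1920, 1080

for w, h in [(2560, 1440), (2880, 1620), (3840, 2160)]:
    rw, rh = w / NATIVE_W, h / NATIVE_H
    clean = rw.is_integer() and rh.is_integer()
    verdict = "integer downscale, looks clean" if clean else "uneven downscale, needs DLDSR's AI filtering"
    print(f"{w}x{h}: {rw:.2f}x per axis -> {verdict}")
```

Only 3840x2160 comes out as a clean 2x per axis; 1440p and 1620p are the uneven cases DLDSR was built for.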
I go with 1440p/1620p DLDSR with DLSS Balanced. In most games it still looks much better than native 1080p TAA, and most of the time better than 1080p DLAA. I now also use DLSS 3.7 with the E or C preset, depending on the game. The only downside is the scaling of volumetric lights and reflections, which also depends on how the game implements them.
Thanks for the video!
Did you use a default smoothness factor of 33% in case of DLDSR?
The smoothness factor was set to 15% in both cases.
@@MxBenchmarkPC Got it, thanks.
@@MxBenchmarkPC For DSR 4x you should always use a smoothness of 0.
@@MxBenchmarkPC You should set the smoothness to 0% to get the sharpest image quality
What's your monitor resolution?
Grass looks better at 4K DSR. Maybe because it is a new game. Older games such as The Witcher 3 and Rise of the Tomb Raider demonstrate a lot of shimmering.
That's because it's 2160p DSR scaling vs 1620p DLDSR scaling; we just need a 2160p DLDSR option for actually better fps on 1080p monitors.
Nearly the same image quality but with 54% more performance on average is really nice to have.
What is used to record this? External capture, or software?
It doesn't really matter whether he's recording accurate video, since YouTube compresses it.
In my flight sim testing, DSR 4x legacy looks significantly better than DLDSR 2.25. I would love to be able to test a DLDSR 4x however, since DLDSR 2.25 certainly looked better than DSR 2.25 legacy.
Impressive temps on the 3080 btw... you have that at stock or curve edited?
Stock settings.
@@godlyiwnl ,
GPUs don't run any "cooler" now than years ago on average. The size of the GPU die doesn't necessarily mean anything in terms of temps. And there may be beefy coolers, but a lot of GPUs are creating so much HEAT ENERGY compared to a few years ago that these coolers are necessary... a cooler's capability pretty much just scales, on average, with the heat energy you're trying to dissipate.
Graphics card manufacturers generally just run the GPU close to its maximum capabilities in terms of temps/voltage/frequency. It's pretty common to see temps over 80 degC, as it has been for many years. The GTX 1080 came out about six years ago and 80+ degC is common.
(Running "cooler" might gain a small amount of performance because of how GPUs tend to scale, but it's a pretty minor difference. I'd rather set a fan profile that keeps the temps safe but HIGH, resulting in the lowest FAN NOISE... I did a Noctua fan mod on my GTX 1080 and can rarely hear the fans in my computer even when gaming with no speakers on. My GPU fans are about 300 RPM at idle, 600 RPM when gaming usually, and ramp up to around 1100 RPM in very demanding GPU workloads.)
Great comparison, wish my RX 590 could pull speeds like that.
Thanks! The RX 590 is not a bad GPU either, especially considering the current GPU market situation.
Made a test of DLDSR vs 4K, but with the in-game scaler lowered to 1440p.
How would this compare to just setting a higher resolution to begin with and only using DLSS? DLDSR increases your render resolution and then squeezes it down to fit your screen, so you could do the same thing with downsampling + DLSS and compare which looks better for you.
All games offer DLDSR, but only a few games offer DLSS. Also, a higher res than your monitor's native is only selectable with DLDSR enabled, so you can't do DLSS 1440p on a 1080p monitor.
@@ImperialDiecast You can if you set a custom Resolution in NVCP.
In motion DLDSR looks better, and in some stills DSR looks better, sharper.
What smoothness did you use?
15% in both cases.
@@MxBenchmarkPC Thanks bro
@Floating no problems, mate :)
@@MxBenchmarkPC Smoothness should be at 0% for 4x DSR; it's only meant to hide aliasing from uneven scaling. And the smoothness setting on DLDSR is actually a sharpening control: 0% is max sharpening and 100% is off.
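If the slider really is inverted sharpening on DLDSR, the mapping would look something like this minimal sketch; the linear curve is purely my assumption, since NVIDIA doesn't document the actual falloff:

```python
def dldsr_sharpen_strength(smoothness_percent: float) -> float:
    """Hypothetical linear model of the DLDSR smoothness slider:
    0% smoothness -> 1.0 (max sharpening), 100% -> 0.0 (sharpening off)."""
    clamped = max(0.0, min(100.0, smoothness_percent))
    return 1.0 - clamped / 100.0

# The values discussed in this thread:
for s in (0, 15, 33, 60, 100):
    print(f"{s:3d}% smoothness -> sharpening strength {dldsr_sharpen_strength(s):.2f}")
```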
4K DSR is better, but if performance is a factor, DLDSR is very good to have. 1.78x DLDSR with DLSS set to Ultra Quality makes for really good performance and good, stable image quality in Cyberpunk 2077.
Is it worth enabling on a 2K monitor? The GPU temp increases a lot, like +15 degrees.
If I use DLDSR 2.25x in-game, should I turn off anti-aliasing in the in-game options? (I use a native 1440p monitor)
If it's MSAA - yes, you can turn it off.
If it's TAA - no, leave it enabled.
@@MxBenchmarkPC How about the FXAA and MFAA settings in the NVIDIA Control Panel, should I have them on or off? :)
@SarjuGaming FXAA and MFAA are also fine, leave them enabled.
This might be a dumb question, but can DLSS and DLDSR be used together?
Yes, you can use them together.
What happens if you turn on DSR in the NVIDIA settings + DLSS in the in-game settings?
It will work as intended. You can also use DLDSR and DLSS at the same time.
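For a sense of what that combination renders internally: DLSS draws at a fraction of the DLDSR target resolution, so on a 1080p panel, 1620p DLDSR + DLSS Quality ends up rendering at roughly native 1080p. A quick sketch (the per-axis DLSS scale factors are the commonly cited ones, not measured from this video):

```python
# Internal render resolution when stacking DLSS on top of 1620p DLDSR.
# Commonly cited per-axis DLSS scale factors:
DLSS_MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
DLDSR_TARGET = (2880, 1620)  # 2.25x DLDSR on a 1080p panel

for mode, scale in DLSS_MODES.items():
    w, h = round(DLDSR_TARGET[0] * scale), round(DLDSR_TARGET[1] * scale)
    print(f"1620p DLDSR + DLSS {mode}: renders ~{w}x{h} internally")
```

Quality lands at roughly 1920x1080, which is why this combo tends to cost about the same as native while looking better.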
I wish 1620p was more common
I didn't look into this too much... but what's the difference between this and DLSS?
DLSS renders a lower resolution image and upscales it to a higher resolution. DSR renders a higher resolution image and then downscales it to your native resolution. DLDSR does the same thing as DSR, but with the help of AI it can produce high-resolution image quality with less of a performance hit in comparison to 4K resolution, for example. Basically it's another performance-boosting technology for those who are already familiar with DSR.
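A back-of-the-envelope on the "less performance hit" part, assuming render cost scales roughly with pixel count (a simplification, but it matches the gap seen in this video):

```python
# Pixels rendered per frame for a 1080p panel under each mode.
native = 1920 * 1080   # 2,073,600 px, what the monitor displays either way
dsr_4x = 3840 * 2160   # 8,294,400 px rendered for 4K DSR
dldsr  = 2880 * 1620   # 4,665,600 px rendered for 1620p DLDSR (2.25x)

print(f"4K DSR renders {dsr_4x / native:.2f}x the native pixel count")
print(f"1620p DLDSR renders {dldsr / native:.2f}x the native pixel count")
print(f"DLDSR draws {1 - dldsr / dsr_4x:.0%} fewer pixels than 4K DSR")
```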
@@MxBenchmarkPC Cool..thanks for explaining it!..👌
@Tank Connors glad to help!
Sorry, but DSR 4x is not 4K like you say in the title, or is it? 4K has a resolution of 3840×2160 pixels. DSR 4x gives a 5120×2880 pixel resolution. Am I wrong?
This test was done on 1080p monitor.
DSR 4.00x on 1080p monitor = 4K resolution (3840x2160)
DSR 4.00x on 1440p monitor = 5K resolution (5120x2880)
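The pattern behind those numbers: the DSR factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sketch:

```python
import math

def dsr_resolution(native_w: int, native_h: int, factor: float) -> tuple[int, int]:
    """Resolution a DSR/DLDSR factor produces: sqrt(factor) per axis."""
    axis = math.sqrt(factor)  # 4.00x -> 2.0 per axis, 2.25x -> 1.5
    return round(native_w * axis), round(native_h * axis)

print(dsr_resolution(1920, 1080, 4.00))  # (3840, 2160): 4K on a 1080p panel
print(dsr_resolution(2560, 1440, 4.00))  # (5120, 2880): 5K on a 1440p panel
print(dsr_resolution(1920, 1080, 2.25))  # (2880, 1620): the 1620p DLDSR mode
```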
@@MxBenchmarkPC Ah OK, sorry, I missed what resolution the test monitor starts at. Thanks!
@davide1555 no problems.
Hi benchmarkpc, does using DSR in the NVIDIA settings damage your GPU? Thanks
@@sadiemicoalvin8695 no
Dear bro, how much smoothness are you using for 4K DSR?
15%.
Hey MxBenchmarkPC, please give me your opinion. I have an RTX 4090 with an i7-13700K, paired with an ASUS ROG Swift G-Sync 1 ms 360 Hz monitor, 1080p only. Do you think using 1620p DLDSR will look way better than my native 1080p in games?
Yes.
@@MxBenchmarkPC Thanks, what smoothness would you add with that?
@SirSone 15%
@@MxBenchmarkPC Last question: do I need to change the desktop resolution too, or does it work if I change to the new DLDSR resolution in game only?
@@MxBenchmarkPC I noticed text in game is small, how do you fix that?
So I'm confused at what you were testing. FPS? Because of course there would be a quality difference and an fps difference. DLDSR has the same quality; it's even stated that the only difference between the two, DSR and DLDSR, is efficiency. And does smoothing even work in games? I know it works in browsers and on the desktop, but I never noticed it do anything in game; then again, I never really checked or took a deep dive into that.
Are you slow or something?
Is this at native 1080p resolution?
1080p monitor was used in this video.
I got an RTX 3080 card. Can I use DSR with that? I'm playing on a 4K TV.
Sure.
Okay, so after lots of research I still don't have a proper answer. Having a 3080 Ti with a 1080p 60 Hz 32-inch monitor, what is the best way to improve image quality: 1620p DLDSR, or 4K DSR? Performance is not an issue at all; in the games I play, both resolutions get well above 60 fps, which is all I need. So, putting the performance factor aside, which gives the best visual quality? Also, someone please explain the best smoothness %. I understand that DLDSR and DSR should be used with completely different smoothness values; what are the numbers, and why?
++++
In your case, definitely 4K DSR.
@@ImperialDiecast With what smoothness?
@@wolfythesunbro 0. Only if the image hurts your eyes with sharpness, or you still see jaggies, should you start increasing smoothness, preferably only up to 33%.
@@ImperialDiecast Alright, thanks. What about 1620p DLDSR smoothness?
1920p DLDSR looks noticeably better than DSR 4x, but I don't have a 1620p mode. It also depends on the native resolution of the monitor.
Yes, DLDSR fully depends on the native resolution of the monitor. On 1080p monitor you'll get 1440p and 1620p resolutions, on 1440p monitor you'll get 1920p and 4K resolutions.
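Those four resolutions fall out of the two DLDSR factors, whose square roots are the clean per-axis ratios 4/3 (for 1.78x) and 3/2 (for 2.25x). A small sketch; the odd 3413 width that 1440p panels get for their 1920p mode is simply how 2560 × 4/3 rounds:

```python
# The two DLDSR factors as per-axis ratios: 1.78x ~ (4/3)^2, 2.25x = (3/2)^2.
AXIS_RATIOS = {"1.78x": 4 / 3, "2.25x": 3 / 2}

for native_w, native_h in [(1920, 1080), (2560, 1440)]:
    for factor, ratio in AXIS_RATIOS.items():
        w, h = round(native_w * ratio), round(native_h * ratio)
        print(f"{native_h}p native, DLDSR {factor}: {w}x{h}")
```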
Are you high? 4K DSR looks better.
@@MxBenchmarkPC I have a 2K native monitor; do you think it's better if I use DLDSR, or should I leave it native? Fps will be higher at native, right?
@chrisenkil On a 1440p monitor you'll get two options for DLDSR: 1920p and 4K resolution. If you want higher image quality, then use 1920p DLDSR, but there will be a noticeable performance hit.
@@MxBenchmarkPC There are two options for DLDSR at 1440p: 1.78x (1920p) and 2.25x (4K). 5K is 4x DSR.
What is the smoothness setting? DLDSR is clearly darker than 4K. The percentage is applied inversely compared to DSR, so 33% DSR smoothness is similar to 50% or more on DLDSR. You shouldn't use the same percentage for both.
So what is the best smoothness for DLDSR and DSR?
@@wolfythesunbro 0
@@ImperialDiecast Holy shit, just NO... 0% smoothness with DLDSR = max sharpening. It looks disgusting. I personally use 60% on my 4K OLED and it looks perfect, with no oversharpening at all.
This is the world we live in now: paying 1200 bucks for a video card
to have a neural network give you 20% more fps while sacrificing your sanity
and the REASON WHY you bought a new video card in the first place: to have sharp-looking games.
This AI nonsense needs at least a good decade to develop into something we can actually use...
It's nice for people who can't afford "nice things", but the ones who CAN afford it... are basically being spat on.
No miracle happened. 4K DSR is clearly better. And the video doesn't even convey the full picture. A strange comparison; identical parameters should be compared.
Honestly, I don't get it. DLDSR looks noticeably worse, but it also, unsurprisingly, runs a lot faster than plain 4K DSR. Does it even look that much better than native 1080p? I think it would be best to compare these to native, as it's hard to judge if the trade-off is worth it. I never found downsampling to be anything other than a gimmick, to be honest. If your GPU is capable of higher resolutions, it's better to buy a higher resolution monitor and play at native, or use DLSS when the GPU isn't fast enough.
It doesn't work at the current stage; it's a gimmick so far. People are overhyped about this placebo feature.
Not a gimmick in the current GPU situation. You can play heavy games at native resolution on a budget and use DLDSR for old games to get better image quality.
Not all games have a render resolution scaling option, so this is a better alternative.
It is not a gimmick. Downscaling from 4K to 2K works wonders in Assassin's Creed Valhalla and other games that support a render resolution scale. Having more visual information per displayed pixel after downscaling is just simple math; it only depends on the algorithm used for downscaling to the lower pixel count, otherwise it may not work that well. In VR a higher render resolution scale is a must and looks sharp as fu%@! Play HL: Alyx with 150% render resolution vs 100%; the difference is Earth and Mars.
@@yosifvidelov For VR, sure, but for everything else it feels like a waste. Why not just buy a 4K monitor, instead?
@@VargVikernes1488 Because playing games at 4K requires a really good GPU? DLDSR is good because the performance impact is low in comparison. It's nice for games that don't have good anti-aliasing options, like Dark Souls 3.
Exactly what I needed for my newly acquired 4090. Time to run 4K DSR on my 1440p 240 Hz screen.
@@asr1el942 DSR 4x cannot be beat. DLDSR is great when you need slightly more performance in exchange for a bit less visual fidelity.
What would be the native equivalent of DLDSR 1620p in terms of image quality? 1080p, 1440p?
Watching this on 42 inches, I would say 1440p for sure.
Native 1080p looks quite blurry on most edges (even with anti-aliasing it looks good, but still a bit blurry).
1440p is a really good compromise. From what I see in this video, 4K DSR looks sharper, but 1620p DLDSR looks about the same as 1527p.
The real downside to me is motion.
Static image quality is really good, but when there's motion it looks blurrier than native 1440p on 42 inches.
I have a 43" 4K gsync, and I use 1440p resolution with DLDSR, same quality as native 4K but higher framerate ! the job is done, and with DLSS it's so good. NVIDIA destroys AMD :)
It's 4K equivalent.
What do you think: is DLDSR at 1620p better or worse than 4K DSR in terms of image quality?
Worse. 1620p DLDSR is the same quality as old 1620p DSR, with even less FPS.
@@abc-ni9lp Could it be the driver? Because I played RDR2 and enabled 1620p DLDSR, but it looked normal to me (same as the old DSR at that resolution) and hit the same fps. I think the driver is not showing much of DLDSR's benefit; wait for the next driver.
@DanteX1 It depends on the game, really. In some games it is hard to spot the difference, especially during gameplay, but in other games it might be behind 4K DSR.
I'm still a bit confused. Which one is better, or do they both look the same and 1620p just has better fps? Also, does 1620p with DLSS look better or worse than 1080p, in Cyberpunk for example?
That's the whole purpose of it. 1620p DLDSR looks like 4K, but with higher fps.
@@MxBenchmarkPC But what if you use DLSS? Would it still look better than 1080p?
@UDespair yes.
@@MxBenchmarkPC thanks for clearing it up!
@UDespair no problems.
I'm using a 1080p monitor. I run all my games on DLDSR 2.25x with a DSR smoothness of 0%, if that's correct.
As long as DLDSR doesn't support 4x, it's not really an option for me.
How do you activate DLDSR in God of War? It doesn't have fullscreen mode, so I cannot change the resolution.
In order to run God of War beyond your native resolution, you need to set your desktop resolution to a DLDSR resolution (1620p, for example), or any other resolution from DLDSR/DSR, and then launch the game.
Try not to use it like this yet. On NVIDIA's main page they recommend not setting it as the desktop resolution, because it's unstable and can bug out totally, needing a USB stick to get an image back via recovery mode.
@@CARLITOTESTIGO So it's only in games that you should change the resolution?
I can tell 1620p is a little blurry, e.g. at 0:48 look at the grass and some other objects, and likewise in many other places; or my eyes are playing tricks, dunno. But I've been searching for a lower multiplier because 4x takes a big hit on performance...
dldsr sucks tbh because it's blurry af and it leaves jagged edges. But if you want great performance, go for it.
You're wrong, DLDSR is better for temporal anti-aliasing than DSR; watch the Digital Foundry video.
Also, DLDSR is sharper than DSR, that's why we need more smoothness than with DSR.
@@SangokuSS54 DSR 4x is better than DLDSR 2.25x; in RDR you can see the quality difference, but performance is also lower.
Bro, I am a super noob, can you please elaborate on which resolution is better?
Depends on your GPU and preferable framerate, you need to be more specific.
@@MxBenchmarkPC I have a 3080 Ti, but I am a total noob at these things 😂
@Horror Flix Urdu If you seek maximum image quality, then 4K DSR is your choice. But if you seek high framerates with decent image quality, then 1620p DLDSR will be a better option.
@@MxBenchmarkPC My friend says that 1620p DLDSR is better than native 4K. Is it, in terms of image?
@Horror Flix Urdu it is comparable, but the final image quality differs from game to game. You should try both and then pick the right one for you.
You should compare DLDSR 2.25x to DSR 2.25x, not DSR 4.00x.
It's because the claim is that DLDSR 2.25x will look similar to 4x DSR while having better performance, of course.
@@venator9536 4K DSR looks far better though, right? At least in this video.
@@katakuri774 Yes, but only when zoomed in, and with a 20 fps decrease.
On a 3060 Ti and lower GPUs you can't really use DLDSR in games, because the FPS and frametimes become dogshit unplayable. Yes, you get 70 fps in Witcher 3 at 1440p DLDSR, but that 70 fps feels like 25 fps with it.
Man, I hope AMD comes up with tech like this at the driver level.
Competition is a great thing; I think AMD will come up with something similar in the future.
AMD already announced they are coming with driver-level FSR.
@@gianlukk That is upscaling technology... we want downsampling software like DSR/super resolution at the driver level.
@Gianluk FSR and DSR are completely different technologies. FSR is an upscaler, DSR is a downscaler. Driver-level FSR is comparable to driver-level NIS from NVIDIA.
Fundamentally impossible without hardware to accelerate tensor math operations.
Idk why, but for me 1080p looks much better than 1620p in my game; 1620p makes it blurry. Any ideas why?
Try tweaking the smoothness level in the NVIDIA Control Panel settings. It's located above the DSR/DLDSR options.
How do I enable it on an RX 560 4GB? The option has never shown up, ever, in years. The TV's native resolution is 1024x768. It used to work on an R9 270 until it was discontinued in a driver update.
DLDSR is available only on RTX GPUs.
DLDSR looks washed out as hell
TALK ABOUT BOTTLENECKING
But the big question is: can it run on non-RTX GPUs? Probably yes, lol, but NVIDIA still limits it.
If they allowed it to run on all GPUs, the end result would not have been much different from 1620p DSR, because of the missing tensor cores.
@@MxBenchmarkPC I'm pretty sure you don't need tensor cores to do literally anything; look at XeSS for example. NVIDIA is just making every product "need" something on RTX cards.
@@crispinotechgaming XeSS will supposedly work with anything _but_ at a performance hit. RT works on Pascal cards but the performance hit makes it not worth it. Not mentioning that is disingenuous.
@@crispinotechgaming AND will have a performance hit on non-Intel cards because it won’t be supported on a hardware level. Wait for actual independent testing after it arrives before jumping to conclusions.
1620p looks slightly worse than 2160p DSR 4x. Also, NVIDIA assuring us that it looks better is just pure marketing.