DLSS 2.0 is Magic 🎨
"Any sufficiently advanced technology is indistinguishable from magic"
Exceptional technology. I switched from AMD to an Nvidia 30-series card purely for DLSS 2. Basically a free 50% boost at 4K.
DLSS 2.0 is just AMAZING, but now we need most games to support it, and I can't understand why more games haven't been announced. It's easy to apply to games now, isn't it?
Galan17 if I'm not mistaken, 2.0 was only released a few weeks ago, so it's still early days. Also, implementation requires game devs to make some adjustments to the render path, so a game patch/update is required. Be patient, friend; they will come. :)
Great talk, wonderful technology... Makes me wonder what other roles neural networks will play in future graphics technologies!
Is there any chance DLSS 2.0 gets an ultra quality mode? This might be a lot better than running TAA on RTX cards, especially on, say, Ampere, right?
Almost like some form of good MSAA, but with only a slight performance loss.
I don't know, but if I had to guess, I think it's unlikely, though who knows. My reasons for thinking it's unlikely: 1) you get rapidly diminishing returns in image quality the higher you set the render resolution, and perhaps they've decided that Quality mode (i.e. 44% of the display resolution's pixel count) is the optimal cut-off; 2) the original purpose of the tech was to improve fps without sacrificing image quality, so they can market it as an essentially free fps booster (which is a competitive advantage over e.g. AMD, which doesn't have the tech... yet); and 3) (ties into 2) it makes RTX playable, thus making it more likely people will buy the new RTX cards.
That's happening now, they're calling it DLAA.
Thank you for this video. I'm wondering how this neural network would perform on movies without retraining it.
It's not mine, of course; I'm not the speaker in the video and I don't work for Nvidia, but I couldn't find a copy of it on RUclips, so I uploaded one. You're welcome, if that's what you were referring to. :)
How did you get such a high quality version? The one I see on Nvidia's website is 480p at best.
He used DLSS 2.0.
@@AboutOliver 😂😂😂😂😂😂😂😂😂😂
@Walkier I think they've done something on their end to lower the resolution because I took this from Nvidia's website 7 months ago.
Holy shit this is amazing
My only question is what if you want to run at 3440x1440p?
It works with any resolution, automatically providing you with three lower render resolutions that are then DLSS-upscaled to your native (or rather "target") resolution of choice. So for a native/target resolution of 3440x1440, DLSS 2.0 will provide you with 1720x720 ("Performance" mode), 1994x835 ("Balanced" mode) and 2293x960 ("Quality" mode) to choose from. They represent roughly 25%, 34% and 44% of the pixel count of the native/target resolution.
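If it helps, here's a minimal sketch (my own, not anything from Nvidia's SDK) that derives those render resolutions for an arbitrary target resolution. The per-axis scale factors are my assumption, back-calculated from the 25%/34%/44% pixel-count ratios, so the exact values may differ slightly from what the SDK actually uses:

```python
# Derive approximate DLSS 2.0 render resolutions for a given target resolution.
# The scale factors are assumptions inferred from the quoted pixel-count ratios.

TARGET = (3440, 1440)  # native/"target" resolution

# Per-axis scale factors: ~0.50, ~0.58 and ~0.667 give roughly 25%, 34% and
# 44% of the target pixel count respectively.
MODES = {"Performance": 0.50, "Balanced": 0.58, "Quality": 2 / 3}

for mode, scale in MODES.items():
    w, h = round(TARGET[0] * scale), round(TARGET[1] * scale)
    ratio = (w * h) / (TARGET[0] * TARGET[1])
    print(f"{mode:11s}: {w}x{h}  (~{ratio:.0%} of target pixels)")
```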
@@DrR1pper Wonderful, thanks for the answer. This is going to make me able to upgrade to that resolution.
DLSS 2.0 is the greatest miracle Nvidia ever performed.
AMD probably laughed their butts off with DLSS 1.0... look how much Nvidia wasted on dedicated Tensor cores.
AMD is no longer laughing.
I'm very much inclined to agree. I would personally rate it as the killer feature of RTX over real-time ray tracing; real-time RT is über cool too, but DLSS 2.0 is what makes RT viable. It's also just about the perfect anti-aliasing technique: combined with super-sampling-like image quality (despite actually being sub-sampling if you think about it, lol) and a significant performance boost, it has to be the winner.
AMD will have to follow suit if they want to remain competitive. I also predict AMD's version will make its way onto the pro versions of the next-gen AMD-powered consoles in a few years' time.
@@DrR1pper Oh yeah, guaranteed AMD is 100% dedicated to releasing something comparable ASAP. Even just a proof of concept for future console releases would make me feel better. Fortunately, it's always easier to do something after it's been proven possible :)
AI seems to be the gift that keeps on giving. Who knows, maybe animation & physics will be next to receive an unexpected boost.
I think a big winner could be Nintendo. They are already in bed with Nvidia, and with $20 more silicon they could 2x their performance in a nice Switch refresh :)
@@jewymchoser We're on the same wavelength about further potential applications to game engines in general, which could further improve performance for less computational cost. ;)
@@DrR1pper My biggest concern is now file size. Devs will need tools that take recorded test plays and AUTOMATICALLY identify assets that are rarely used or seen only at low quality and adjust compression accordingly.
I think the big takeaway lately is that manual optimization is over... thank g-d!
@@jewymchoser FSR is here now :)
Crazy
"from a non-playable experience to a playable experience"
how bold of you to assume 8fps is non-playable
It's magic. The only downside is the need for deep-learning pre-training on Nvidia supercomputers, so you can't use this technique for every game by default, and you need new information from Nvidia, which is delivered via the drivers.
We need a technique that is independent of Nvidia's DL information.
I would love to see some implementation of checkerboard rendering natively in the Nvidia driver, so we can turn it on for every game.
Fortunately, this downside no longer exists. The per-game deep learning training requirements were true for DLSS 1.0. This is not the case for DLSS 2.0 (and future iterations). The improvements made from 1.0 to 2.0 were achieved by shifting to a generalised solution.
@@DrR1pper Is there any chance we'll be able to turn on DLSS 2.0 for every game via the driver?
mal-avcisi Unfortunately, it won't ever be quite that easy (though it shouldn't be hard either). The game developer still needs to do some work to get DLSS working properly and well: e.g. insert DLSS at the right place in the game's specific rendering pipeline, and provide per-pixel object velocity (motion vector) data for DLSS to use as an input (which any game with a TAA option should already have). That part remains the same.
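For anyone curious what that looks like at a high level, here's a very rough sketch of a frame loop with a DLSS-style temporal upscaler slotted in. All the function names are hypothetical stand-ins (this is not the real NGX SDK API); the point is just the ordering: render the scene at the lower resolution with a sub-pixel jitter, feed colour, depth and motion vectors to the upscaler, then do post-processing and UI at the full target resolution:

```python
# Hypothetical sketch of where a DLSS-style upscaler sits in a frame loop.
# None of these names come from the real NGX SDK; the stubs stand in for
# engine-specific code.

def jitter_offset(frame_index):
    # Stand-in for a low-discrepancy sub-pixel jitter sequence.
    return ((frame_index * 0.5) % 1.0 - 0.5, (frame_index * 0.75) % 1.0 - 0.5)

def render_scene(scene, res, jitter):
    # Stand-in for the engine's low-resolution scene pass; a TAA-capable
    # engine already produces the motion vectors an upscaler needs.
    return {"color": None, "depth": None, "motion": None, "res": res}

def dlss_upscale(frame, jitter, target_res):
    # Stand-in for the actual upscale call: temporal reconstruction to target_res.
    return {"image": frame["color"], "res": target_res}

def post_process_and_ui(image, target_res):
    # Tone mapping, film grain, UI, etc. run at the full target resolution.
    return image

def present(image):
    print("presenting frame at", image["res"])

def render_frame(scene, frame_index, render_res, target_res):
    jitter = jitter_offset(frame_index)              # sub-pixel camera offset
    frame = render_scene(scene, render_res, jitter)  # low-res colour/depth/motion
    upscaled = dlss_upscale(frame, jitter, target_res)
    present(post_process_and_ui(upscaled, target_res))

render_frame(scene=None, frame_index=0, render_res=(2293, 960), target_res=(3440, 1440))
```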
@@DrR1pper thanks for the explanation.
AMAZING WHAT THE F!!!!
Sorry, but anything below 60 fps is not "playable." The fact that a 2080 Ti struggles with 4K + RT, barely hitting 60 FPS, shows how terrible the value of Turing was, and how Nvidia used everyone as a paid-in-full alpha test. Ampere had better not struggle with this, and had better not be overpriced to hell like the 20-series was, otherwise Sony, MS, and AMD will be thanking Nvidia for the extra sales due to the consoles promising 4K/60 and coming in around $500 for a "turnkey" gaming experience, vs paying over $1000 for one component to barely edge them out.
"The fact that a 2080Ti struggles with 4K + RT barely getting by 60FPS"
"Sony, MS, and AMD will be thanking Nvidia for the extra sales due to the consoles promising 4k/60"
4K + RT on PC vs 4k no RT on console isn't a fair comparison. RT takes a fuckton of resources. If you're going to compare them, make it fair, and not a dishonest comparison.
Not to mention, you're conveniently leaving out DLSS, which is the future of gaming. RTX 2060+ has it already, newer consoles do not. Gonna have to wait till the consoles this gen are upgraded, or wait till next gen altogether. At the moment, NVIDIA is comfortably in the drivers (no pun intended) seat when it comes to graphics. There is absolutely no question about that.
A guy like you isn't worth arguing with. "Anything below 60 fps isn't playable"? Are you stupid or what? What about consoles? And games that aren't competitive?
This guy talks like he has marbles stuffed in his mouth.