[4K] Nvidia RTX DLSS Analysis: AI Tech Boosts Performance by 40% - But What About Image Quality?

  • Published: 19 Oct 2024
  • Is this too good to be true? The tensor cores in the new Nvidia RTX cards allow 4K gaming to be accelerated by 40 per cent in terms of performance. How does it work and is there a catch? John, Rich and Alex share their thoughts.
    Subscribe for more Digital Foundry: bit.ly/DFSubscribe
    Join the DF Patreon, support the team and get pristine quality video downloads! www.digitalfou...

Comments • 994

  • @Kal360
    @Kal360 6 years ago +332

    I remember a simpler time when we were shifting from 720p to 1080p and people on forums claimed AA isn't needed at higher resolutions like 1080p...

    • @Michael.Virtus
      @Michael.Virtus 6 years ago +48

      You are right, but it's still a fact that aliasing becomes less and less of a problem the higher the resolution. I remember playing at 800x600 on my 15" monitor and it was so bad without AA. Now at 1440p I rarely turn AA on; it's not so bad without it. I prefer maximum fps (at 144Hz). AA will always be visually better, but the problem is the performance hit. Some people are okay playing at 60fps/Hz, but I am done with that standard. A high-refresh monitor is amazing smoothness for the eyes.

    • @JayJapanB
      @JayJapanB 6 years ago +17

      When people would just read 1080p on the back of their 360 game and be happy with their life...

    • @webinatic216
      @webinatic216 6 years ago +6

      I supersample 4K to 1080p just because the enemies are then more visible from a distance. It's good enough.

    • @nomercy8989
      @nomercy8989 6 years ago +14

      I remember the exact same thing when 4k became the big new thing. That we won't need AA any more...

    • @axolet
      @axolet 6 years ago +26

      You have to consider that games from that era were less detailed and more blocky. Aliasing is more prevalent with small, detailed objects such as blades of grass and tree leaves, and with long draw distances, etc.

  • @CreativGaming
    @CreativGaming 6 years ago +339

    What I like about Digital Foundry is that they make NO JUDGMENTS and don't tell people how to SPEND their money. They use objective reasoning when observing new technology. They pretty much just give you the entire picture and let you decide for yourself.

    • @TheLoucM
      @TheLoucM 6 years ago +33

      Exactly this, I hate when tech channels turn into buying guides.

    • @fcukugimmeausername
      @fcukugimmeausername 6 years ago +14

      Just like Toms Hardware!

    • @smokeydops
      @smokeydops 6 years ago +18

      Yeah, these guys are cool; not so much "reviewers", but analysts. I like it.

    • @madfinntech
      @madfinntech 6 years ago +3

      Exactly! The very reason I keep watching them, even after they say things like "smooth 30 fps".

    • @Flamevortex-ky6xh
      @Flamevortex-ky6xh 6 years ago +3

      30fps is smooth with good frame pacing and motion blur; that's why the new Spider-Man game makes my head hurt without motion blur.

  • @Faithful-Preservation
    @Faithful-Preservation 6 years ago +597

    I am more impressed with DLSS so far than I have been with RT.

    • @289kman
      @289kman 6 years ago +18

      That's because ray tracing is early and not ready yet. Forget about ray tracing for now.

    • @Luredreier
      @Luredreier 6 years ago +29

      +xGeoThumbs Likewise.
      Don't get me wrong, it's not going to make me buy Nvidia hardware anytime soon.
      But it's definitely impressive technology.
      And I hope AMD brings something similar to the table soon too. =)

    • @kable_U
      @kable_U 6 years ago +1

      DLSS could be impressive. There has been no evidence of this so far though, and there won't be until it gets implemented into actual games.

    • @Faithful-Preservation
      @Faithful-Preservation 6 years ago +13

      Noobish It seems to me that competition from a rival company is good for the consumer.

    • @HybOj
      @HybOj 6 years ago +7

      too bad DLSS is blurry as hell

  • @stewardappiagyei6982
    @stewardappiagyei6982 6 years ago +37

    Hello Richard, everyone from Digital Foundry here.

  • @Raph920
    @Raph920 6 years ago +139

    I hope to learn how much of an improvement DLSS is at 4K vs no AA at all.

    • @Luredreier
      @Luredreier 6 years ago +11

      Probably about 30-40%
      Essentially this is like comparing 1440p *with* high end AA vs 4k without high end AA in terms of GPU workload...

    • @BecomeMonketh
      @BecomeMonketh 6 years ago +1

      If it's 40% faster than TAA, then it will be faster than no AA by 50% ish.

    • @liamness
      @liamness 6 years ago +16

      Even with 4K you would probably want some kind of AA, if only to get rid of shimmer.

    • @icy1007
      @icy1007 6 years ago +4

      I'd rather have native 4K with no AA.

    • @Psychonaut-im3zz
      @Psychonaut-im3zz 6 years ago +1

      DLSS makes 4K relevant (it gives a superior, crisper image; a de facto Nvidia standard). Other AA solutions are in software; now you have an on-the-fly hardware AA solution. It is superior by definition. If anything, it is the secret sauce for ray tracing (aside from the rasterization speedup). Go a step further: NVLink (2x 2080/2080 Ti) is the secret sauce for ray tracing.

  • @machielste1
    @machielste1 6 years ago +7

    Finally a deep dive into what DLSS actually looks like and how it performs. You guys rock!

  • @DatGrunt
    @DatGrunt 6 years ago +126

    I'm more excited about DLSS than ray tracing, but the price... like god damn. AMD, please step up the competition.

    • @Luredreier
      @Luredreier 6 years ago +14

      +DatGrunt Yeah, I *really* hope that AMD implements some variation of DLSS for their own cards soon.
      Would be a great upgrade for me and my RX 480 =)

    • @paul1979uk2000
      @paul1979uk2000 6 years ago +2

      It wouldn't surprise me if AMD is already working on something like DLSS for the next gen of consoles and PC GPUs. Ray tracing is a different story altogether, and I really hope AMD works on that as well. The only real indicator that they might be is that Microsoft presentation a few months back where Microsoft talked quite a bit about ray tracing; it suggests the next Xbox console could have ray tracing in it, which bodes well for both PC and console gamers, because for this kind of tech to be adopted, consoles will need to support it as well as AMD hardware.

    • @DatGrunt
      @DatGrunt 6 years ago +2

      There's also Intel releasing their dedicated GPUs like in 2020. I hope they're not underwhelming.

    • @lawthirtyfour2953
      @lawthirtyfour2953 6 years ago +1

      Been waiting for raytracing for decades. I can look past the very first games taking performance hits and see how amazing it is.

    • @Doc_Valparaiso
      @Doc_Valparaiso 6 years ago

      Paul Aiello I definitely think AMD is working on their own versions of DLSS and RTX. They're gonna need as many techniques as possible to utilize in the next console gen.
      I think AMD's partnership with MS and Sony is a big motivator for them to innovate. We might not be seeing the fruits of that "motivation" with Vega atm. However, Sony are working directly with AMD on Navi. So, when you think about the Pro's heavy utilization of checkerboard rendering (some on the X as well), DLSS is the obvious next step in that evolution. ...I'm also curious to see if Nvidia's new tech is utilized in the rumored next iteration of the Switch.

  • @antraxbeta23
    @antraxbeta23 6 years ago +49

    Maybe my eyes are bad, but in a lot of DLSS samples the image is blurry.

    • @Mr.Beavis667
      @Mr.Beavis667 6 years ago +2

      I thought the same...

    • @mjc0961
      @mjc0961 6 years ago +8

      Your eyes are fine. Upscaled images are always blurry.

    • @gabrielst527
      @gabrielst527 6 years ago +2

      It's upscaling (Nvidia said it in the presentation PowerPoint) but using AI to make the lower-resolution image look better. I think this could work if you made a 1440p picture look 4K (since the difference is not that huge on a gaming monitor), but if you are using a really big screen I think the differences with DLSS would be noticeable (lower image quality, I mean).

    • @sacb0y
      @sacb0y 6 years ago +4

      Depends on the scene. Some things looked blurrier, mostly object edges; other stuff looked a lot clearer, like the backgrounds, and things like hair looked a lot cleaner.

    • @marverickbin
      @marverickbin 6 years ago

      They use the 4K rendered images to train the network to supersample the images. A better metric would be comparing the quality of the original 4K they use for training against the DLSS results.

  • @OxRashedxO
    @OxRashedxO 6 years ago +18

    Unreal 1 music in the background! Great choice.
    👍🏻👍🏻👍🏻👍🏻👍🏻👍🏻

  • @doublevendetta
    @doublevendetta 6 years ago +8

    I'm a fan of reconstruction techniques like this. I'm a fan of enabling DRS in more games. I'm a fan of having a NON-dynamic scaling slider for those who want it. What I'm NOT a fan of is the pricing of these cards, especially after seeing what is essentially complete confirmation of my suspicions about their typical raster performance over Pascal. The new tech is cool, but it's not widely implemented yet, and this very much feels like paying an "early adopter" tax on the order of several hundred dollars. Until we see something at the driver level along the lines of Nvidia's DSR algorithm, I can't see these as a good buy in terms of value per dollar for most anyone.
    Relying on developers to individually implement all of this means the big-budget ones will probably get things out at a decent clip. But AS an indie developer who's kept my eye on all of these things, I can tell you that given the current, very short list of dev software that even SUPPORTS this, we couldn't implement almost any of it into our workflow even if we wanted to. And the pricing scheme puts any of these higher-tier pieces of silicon outside of what I would call "sensible spending," even for our most optimistic budget projections.

  • @Xilefian
    @Xilefian 6 years ago +46

    It kind of makes sense that frame reconstruction handles transparency better than TAA, as it's a treatment on the final image (there's no concept of transparency), whereas TAA is applied earlier using information from the previous frame and current motion vectors, which are completely ruined by transparent surfaces (which frankly should write zero motion, as transparency really does screw up screen-space motion).
    What I'm curious about is how shader aliasing is handled by DLSS. Would it emphasise the issue or leave it alone?
    As for how this works, I imagine it's just training an AI to reconstruct the 64x supersampled image from a 1/64th-sized input image, like any other AI training set.
    I get criticised for mentioning this often, but I think the Waifu2x AI would be a good example to look at. That AI was trained by showing it a low-resolution piece of artwork and seeing if it could reconstruct the full-resolution source; once it gets all tests 100% perfect, the AI is assumed to be good at up-scaling.
    EDIT: 17:45 shows that shader aliasing is still an issue.
    EDIT2: Single frame was a guess at how the AI works; only NVIDIA knows what kind of data set the AI is trained with. It could be that it takes multiple frames of previous input, but that just seems too bandwidth-crazy to me. Single frame is good enough; I know this from doing single-frame up-scaling tests with Waifu2x in the past (with the idea of "this would be great for games in the future" - well, I guess the future is here...)
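
    A minimal PyTorch sketch of the single-frame training idea guessed at above. Every detail here (network shape, loss, data) is an assumption for illustration only, since NVIDIA hasn't published the real setup: pair aliased low-resolution crops with supersampled higher-resolution targets and train a small upscaler against them.

      import torch
      import torch.nn as nn

      # Toy single-frame upscaler: learns to map an aliased low-res crop to a
      # target at twice the resolution (stand-in for 1440p -> 4K 64xSSAA pairs).
      class ToyUpscaler(nn.Module):
          def __init__(self, scale: int = 2):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
                  nn.PixelShuffle(scale),  # rearranges channels into a 2x larger image
              )

          def forward(self, x):
              return self.net(x)

      model = ToyUpscaler()
      opt = torch.optim.Adam(model.parameters(), lr=1e-4)

      # Random tensors stand in for a real dataset of (aliased, supersampled) pairs.
      low_res = torch.rand(8, 3, 64, 64)
      target = torch.rand(8, 3, 128, 128)

      for step in range(100):
          opt.zero_grad()
          loss = nn.functional.l1_loss(model(low_res), target)
          loss.backward()
          opt.step()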

    • @Luredreier
      @Luredreier 6 years ago +5

      Yeah.
      The tech is impressive.
      Looking forward to AMD's implementation.
      I'll probably hold off upgrading my GPU till they get the same tech. ^^

    • @Xilefian
      @Xilefian 6 years ago +4

      It's just a shame that very little has been done with AMD's rather beefy compute hardware to allow space for things like AI and neural network training. The hardware is there - just that the API isn't. They bet on OpenCL which just wasn't as easy to use compared to CUDA - and now OpenCL is largely dropped.
      I think AMD have been awaiting a strong Vulkan compute effort, which currently is marred by SPIR-V compilers being a little too disconnected from already existing tools (the situation is improving every week, though - it's a slow community effort). After that it's a case of finding a model that best fits AMD's hardware.
      I've heard that AMD have had a lot of success with developing private tools that can compile a Caffe (Deep learning framework) CUDA application for a specialised AMD GPU compute workload and have excellent performance, but I'm yet to see any of AMD's efforts published as usable tools. It's likely that AI workloads don't fit current AMD GPU hardware and more development is needed.

    • @AkshayKumarX
      @AkshayKumarX 6 years ago +2

      I've been using the Waifu2x context-menu utility for a long time and it serves its purpose well. It's quite impressive that they've managed to make this new hardware that applies deep-learning super-sampling techniques to real-time high-resolution frames, and does it rather well in what seems to be its first iteration, I would say.
      I'm assuming that you already know about this, but if you don't, look up "Two Minute Papers" on YouTube and take a look at some of the recent videos if you want to know more about deep-learning-based super-resolution images and more. Great stuff :)

    • @giancarloriscovalentin1865
      @giancarloriscovalentin1865 6 years ago

      Amazing Deep Learning technique

    • @raytracemusic
      @raytracemusic 6 years ago +1

      woah - cool stuff imma check

  • @biglevian
    @biglevian 6 years ago +39

    Sounds promising. My next GPU upgrade will be in 2-3 years, so I hope ray tracing and DLSS are ready and available in midrange cards by then.

    • @TroublingStatue
      @TroublingStatue 6 years ago +4

      Yeah, the only _released_ game that has RTX right now is Shadow of the Tomb Raider. And even then it's not available at launch, it's coming out later on in an update.

    • @Luredreier
      @Luredreier 6 years ago +2

      +biglevian
      Yeah, and I hope AMD gets something equivalent to DLSS too by that time...

    • @TerryMartinART
      @TerryMartinART 6 years ago

      What's AMD making?

    • @Luredreier
      @Luredreier 6 years ago +2

      +Terry Martin We don't know exactly what they're making for 2-3 years into the future.
      Another version of Vega should come out this year (probably only for prosumers etc), and Navi is expected next year (2019); according to the rumors it's intended as a midrange option.
      So I'm hoping for a card like the 2080 or 2070 at lower prices as the top of the line from AMD, although I wouldn't be surprised by a card similar to a 1080 costing the same as a current-day 580.
      After Navi all we can see on their roadmap is "Next Gen" (roughly 2020).
      The rumors point towards something like the multi-die approach used for Ryzen.
      But no one really knows for sure what they have in mind...
      Machine learning has been a thing for a while and there are already some Vega cards with additional machine-learning instructions added.
      I wouldn't be surprised if this improved further with Navi, although I don't know if they'll improve the machine-learning capabilities enough to support something like DLSS already by Navi or if that is something we won't see till the "Next Gen" cards.

    • @TerryMartinART
      @TerryMartinART 6 years ago

      Thank you for that info.

  • @ScreamerRSA
    @ScreamerRSA 6 years ago +102

    2080 Ti: ~30% performance increase, ~70% price increase. Nvidia, the way it's meant to be played.

    • @madfinntech
      @madfinntech 6 years ago +9

      Someone ripped the Tensor and RT cores out of the one you bought? I didn't know the 1080 Ti had those.

    • @ronnymueller1918
      @ronnymueller1918 6 years ago +21

      The way it's meant to be paid

    • @kevinmlfc
      @kevinmlfc 6 years ago +4

      Yes, and it can't hit 4K@60 in a two-year-old game.

    • @mduckernz
      @mduckernz 6 years ago +4

      The way _you're_ meant to be played.

    • @lendial
      @lendial 6 years ago +1

      Prediction: 3080 Ti, ~15% performance increase at $2,000.

  • @davidcunningham6399
    @davidcunningham6399 6 years ago +1

    Hi DF. Love the videos, but I'm noticing the penchant for "For sure" has recently been replaced with "Heavy". Technically, could you be more specific about what element or load you are referring to? I can only assume DLSS or the other techniques in this particular video, but it could be memory, CPU, everything... or some appreciative slang term. Thanks. Do keep up the otherwise excellent work!

  • @Aveok
    @Aveok 6 years ago +9

    With the 1080 Ti the leap was so huge you couldn't even know where to start explaining the advantages. With the RTX cards, even DF has to search for some small benefits to justify the purchase.

    • @miguelpereira9859
      @miguelpereira9859 5 years ago

      That's the result of Moore's law going out the window, most likely.

  • @191desperado
    @191desperado 6 years ago +1

    NOBODY else does this kind of material. This is why I’m joining your Patreon ASAP!

  • @ardentlyenthused338
    @ardentlyenthused338 6 years ago +3

    @DigitalFoundy, Since you say DLSS is working off a 2K image and then "inferring" to 4K, I think it could be insightful to include a 2K benchmark. In this video you look at how much of a performance improvement you get relative to 4K, which is absolutely interesting, but it would also be informative to know how much of a performance penalty you get compared to 2K; again, since the ultimate image is 2K + DLSS ~ 4K. How much does that DLSS part cost? Just saying, if you're looking for more DLSS videos or investigation to do while you wait for ray tracing stuff to come out, there's another question you could delve into!

  • @thibautnoah2599
    @thibautnoah2599 6 years ago +3

    That was really interesting, especially the part where you talk about how DLSS could make RTX run without too much strain at 4K.

  • @clarknova1567
    @clarknova1567 6 years ago +131

    Way too expensive. The performance-per-dollar metric is really important here because you are getting less at the cost of more. The two features that define this series of cards aren't even ready at launch.

    • @jacobsmith1173
      @jacobsmith1173 6 years ago +39

      But you have just described the top-tier card of every generation ever. I mean this with respect: if your attitude towards these cards is about performance per dollar, I am not sure why you would go to flagship card videos within two minutes of upload, the day the card was released (supposed to release). Are you expecting the card to be the cheapest on the market? Or offer the best bang for buck?

    • @varungandhi8796
      @varungandhi8796 6 years ago +8

      Well, the performance per dollar will probably get better as new, more optimized drivers come out.

    • @purona2500
      @purona2500 6 years ago +6

      You VASTLY overestimate the performance increase that drivers give

    • @wile123456
      @wile123456 6 years ago +10

      Jacob Smith, you must suffer from amnesia then, because every new gen has offered one tier higher performance (970 = 780, 1070 = 980/Ti) but for the same price as that tier usually cost, or at least close to it. That's how it has been for years. Why is that? Well, because the previous gen will always have a cheaper price when the new gen launches, so in order for the new gen to be a good sell it has to at least match the old MSRP of the previous gen. And that has held at launch. The price of the new gen will drop as well, just like this one, but at least before there was value to the cards outside of 2 gimmick features only a handful of games will support (and don't right now).
      Now you are paying 800 dollars for a 2080 compared to the 500-dollar MSRP of the 1080, and 1200 dollars (the same as the overpriced Titans) for the 2080 Ti, compared to 700 dollars for the 1080 Ti. Then you factor in that the 1080 is 400 dollars now and the 1080 Ti is 600 dollars, and you have possibly the worst price-to-performance ratio of any generation of video cards EVER.
      Now if you used your brain instead of gut fanboyism you would be able to see that, but sadly Nvidia's brand name is too strong. We the consumers, everyone watching this video, lose, because we are paying more for less.

    • @jacobsmith1173
      @jacobsmith1173 6 years ago +10

      +Varun Gandhi But that is kind of missing the point. The 2080 Ti is just a re-branded Titan card. These cards aren't for people concerned with frames per dollar. They are marketed at people who want the maximum possible performance. The best thing about the re-branding is that consumers know they aren't going to release a re-branded XX80 Ti version down the line for $500 less. These are also for people wanting to get the most out of 4K; as recent years have shown, hitting 60fps has been just within reach. Now, we seem to blow right past it, giving plenty of headroom.

  • @OrangeRock
    @OrangeRock 6 years ago +3

    So... does DLSS work at 1080p?
    Just curious...
    Since at 4K it renders at 1440p,
    does 1080p DLSS render at 720p?
    Or does it only support 4K?

  • @huskuas643
    @huskuas643 6 years ago

    Extremely informative video, thanks a bunch for this one. And to your editor, I really commend them for using original Unreal BGMs for this video, many, many props to that.

  • @Aggrofool
    @Aggrofool 6 years ago +87

    Nvidia's pricing will turn me into a console gamer

    • @sebastianhoyos1961
      @sebastianhoyos1961 6 years ago +7

      I mean, I could get checkerboard 4K for much cheaper on a console...

    • @tomaszpapior2143
      @tomaszpapior2143 6 years ago +9

      As a PC player who bought a PS4 Pro a year ago, I now spend like 80% of my gaming time on that machine. Uncharted, Gran Turismo and Horizon Zero Dawn are so good.

    • @budthecyborg4575
      @budthecyborg4575 6 years ago

      Crypto did that to me last year.
      The Xbox X is the only hardware I have that can output a 4K 60Hz image over HDMI (my 980 Ti is basically equivalent to the Xbox, but only has HDMI 1.4), so the console is the only way I can use a 4K TV unless I buy a new GPU, which at this point isn't going to happen until AMD comes out with a chip faster than the 1080 Ti.
      Hopefully that's only a year away. In the meantime the Xbox X runs most games at the same fidelity as my PC would manage anyway.

    • @tomaszpapior2143
      @tomaszpapior2143 6 years ago +2

      @@budthecyborg4575 The X is a good choice as well. I got a PS4 Pro mostly because of games not available on PC. I was planning to buy a PC some months later, once the crypto madness died down, but I haven't decided yet, as there are still too many games for me to play on the PS4 and PC part prices are crazy right now.

    • @budthecyborg4575
      @budthecyborg4575 6 years ago +1

      I've been a big fan of Forza for the last decade. Xbox One has not disappointed.
      The Xbox X is still the only way to get Halo: MCC, and now that we have it running in 4K, that means co-op runs on dual 32:9 1080p screens! Compared with the price of those things on PC, my $300 4K TV split across the middle would be worth about $2,000!

  • @NateHancock
    @NateHancock 6 years ago +1

    I love the discussion and insights made in this video. Great job DigitalFoundry!

  • @madfinntech
    @madfinntech 6 years ago +11

    8:55 As a PC gamer and someone who prefers the actual resolution, whatever it might be, I have an open mind about reconstructed images IF it doesn't hinder the quality. All these other techniques have major flaws that make it obvious they're not the real deal. I'll take it IF it can fool me into thinking it's the real deal. It doesn't matter what the technique is; the results matter, and something like PS4 Pro's checkerboard technique doesn't pass as 4K to me.

    • @mjc0961
      @mjc0961 6 years ago +1

      "something like PS4 Pro's checkerboard technique doesn't pass as 4K to me"
      Of course not. It is objectively not 4K. It's a lower resolution image upscaled to 4K, which is not 4K no matter how many fancy techniques.

    • @Malus1531
      @Malus1531 6 years ago +1

      mjc0961 Isn't that true of DLSS too, just with a better upscaling technique?

    • @Hayreddin
      @Hayreddin 6 years ago

      @@Malus1531 Not really. DLSS uses a 1440p input to produce a 4K output, inferring the missing details; it's not simple denoising/upscaling. That's why it's able to produce sharp borders and text: it recognizes them as such and "knows" that borders and text should always look sharp and clean. There might be other cases where the AI isn't as sure and might produce artifacts instead, maybe putting fake details where there should be none or over-smoothing something that should look crisper. The good thing is that this is coming from an Nvidia supercomputer that's constantly being trained by them, and every time a new game is fed to the AI it gets a little better at doing its thing, so I guess game developers could provide better DLSS with patches over time, or it could be a thing Nvidia does via driver updates/GeForce Experience. I personally like the "painterly" look of DLSS and I'd like to see it in action in a real interactive game soon.

    • @Malus1531
      @Malus1531 6 years ago +1

      @@Hayreddin So how does that make anything I said false? It's still just a more advanced upscaling. All upscaling means is converting a lower resolution to a higher one, like from 1440p to 4K as you just said. And it's still inferior to native 4K, just like regular upscaling or checkerboard, only more advanced.

    • @Hayreddin
      @Hayreddin 6 years ago

      @@Malus1531 I never said it was superior or even equal to native 4K, but saying it's "just a more advanced upscaling" is not entirely correct either, because the image is not upscaled per se but "reimagined" in 4K. I can see it may just be a question of semantics though, and my intention wasn't to prove you wrong or start an argument.

  • @Raigavin
    @Raigavin 6 years ago

    Great analysis; appreciate the effort taken to share your thoughts.
    A small side note, though: the video warping effect between screen transitions hurts my eyes, and the transitions always seem to hit just as I'm focusing on the minute details in the video :(
    Again, thanks for the great video.

  • @Bloodred16ro
    @Bloodred16ro 6 years ago +8

    DLSS looks like a disappointment to me. By far the biggest attractions of 4K are the sharpness and the fine detail you can see, especially when close to a PC monitor. DLSS provides a blurred image where both of those qualities are essentially gone. Why would I use a 4K monitor in order to play a 1440p upscale of a game? Why not use a 1440p monitor at native resolution and use DLSS 2X for AA instead? What's the point of aiming for 4K if you're going to give up on the sharpness and detail anyway? Sure, it's better than a traditional upscaler and probably better than checkerboard techniques, but the FFXV screenshots I've seen make it very clear that a lot of detail is lost (even when compared to TAA, which in and of itself sometimes adds some blur; I'd love to see a comparison against a shot with no AA or one with real SSAA), as does the Infiltrator comparison in this video.
    The only real appeal I see to regular DLSS (not 2X) is the idea that it might be used as a sort of trade-off for ray tracing, as in you give up some sharpness and detail and get better lighting/AO/shadows in return at a resolution where ray tracing is otherwise much too slow to use.

  • @symol30872
    @symol30872 6 years ago

    Great video, very informative (Also love the sneaky Unreal music in the background 😉)

  • @Screamin_Demon
    @Screamin_Demon 6 years ago +7

    Some Dude: Hey James Cameron, what is AI going to be used for in the future? Maybe killer robots or house maids or something?
    James: Nah, it'll be used to make video games look better... (╯°□°)╯︵ ┻━┻

  • @xenoaltrax485
    @xenoaltrax485 6 years ago +1

    Is pixel counting of screen captures still valid considering that multi-resolution shading has been in use since Maxwell and now you have variable-rate shading with Turing? How can you tell if the region you're counting pixels in wasn't subjected to a different shader resolution?

  • @THEREALKEMMZ
    @THEREALKEMMZ 6 years ago +3

    Is DLSS only on the 2080-series cards? It would be nice to have it for the older cards as well.

    • @mduckernz
      @mduckernz 6 years ago +2

      Yes, because the operations used in performing DLSS will only run efficiently on Turing.
      You _could_ run DLSS on Pascal (or an AMD GPU, for that matter) - the algorithm is compatible, it's just math - but tensor operations (on n-dimensional matrices) are slow on older hardware, since it only natively supports vectors and matrices, not higher-rank tensors. You can emulate them perfectly well, but at significantly slower speed - so much slower that you would be better off just rendering at a higher resolution.
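
      A loose illustration of that point (not how DLSS actually ships): the tensor-core primitive is a small fused half-precision matrix multiply-accumulate, and any GPU or CPU can "emulate" it with ordinary matmul kernels, just without the dedicated silicon.

        import torch

        # Tensor cores accelerate a fused half-precision multiply-accumulate,
        # D = A @ B + C, on small tiles. The same math runs anywhere, just slower.
        A = torch.rand(16, 16, dtype=torch.half)
        B = torch.rand(16, 16, dtype=torch.half)
        C = torch.rand(16, 16, dtype=torch.half)

        # "Emulated" path: a generic matmul with fp32 accumulation, the way
        # older hardware would do it with plain vector/matrix units.
        D = (A.float() @ B.float() + C.float()).half()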

    • @steveballmersbaldspot2.095
      @steveballmersbaldspot2.095 6 years ago

      It wouldn't run well as it is designed for hardware with tensor cores.

  • @Khaotika
    @Khaotika 6 years ago +2

    When you're pointing out the blurriness of TAA, have you taken into account that the blur might be intentional?
    For example, at the beginning of the FFXV benchmark demo, when they roll out the car and there are rocks and bushes in the background: assuming they didn't completely alter how it should look compared to the public version, the blur is an intentional DoF effect that's present in the public version even without AA.

  • @ashenone3427
    @ashenone3427 6 years ago +4

    Will DLSS work with 1440p and 1080p resolutions? That's what I'm wondering.

    • @pottuvoi2
      @pottuvoi2 6 years ago +1

      Yes, it should work on lower resolutions as well.

    • @mduckernz
      @mduckernz 6 years ago

      Yes. It should work equally well, possibly even better, since the resolution delta is larger (I expect the algorithm is trained on a single very high-resolution data set, so the lower the input resolution, the more "detail" can be inferred into each set of pixels).

  • @AdnanAhmed
    @AdnanAhmed 6 years ago +1

    Having my tea and Richard talking about tech jargon - this is the life.

  • @janklaassen6404
    @janklaassen6404 6 years ago +33

    DLSS produces a painterly look where texture detail disappears. It's like the noise-reduction filter on your phone when shooting in low light.

    • @Luredreier
      @Luredreier 6 years ago +3

      +Abbablablah Bablhablah
      What frames are you talking about?
      Can you link the time stamp you're referring to?

    • @GraveUypo
      @GraveUypo 6 years ago +1

      Dunno, it looked good to me for the most part.
      But I'm skeptical. I want game image-quality comparisons, not just demos; with demos it's way too easy to make a very specific neural network that cleans them up almost perfectly.
      1:58 Oh, I see it now. Looks almost like anisotropic filtering was turned off.

    • @GrantZ90
      @GrantZ90 6 years ago +5

      Just take a look at 1:41 for example. Blurry as hell. All the detail in the textures gone.

    • @Darkmasta05
      @Darkmasta05 6 years ago +6

      @@GrantZ90 Hi, this timestamp is the one frame that is as yet untreated. The lost detail is attributable to the resolution difference (1440p vs 2160p), not DLSS.

    • @LisaSamaritan
      @LisaSamaritan 6 years ago +6

      It might go both ways. Look at the logo, on the table @ 9:49. Clearly more detail in DLSS.
      They show the same frame @ 10:57.

  • @palody_en-ja
    @palody_en-ja 6 years ago +1

    I loved the plug at the end. Vuvuzelas are BACK!

  • @l0lzor123
    @l0lzor123 6 years ago +5

    Couldn't really see the image-quality difference between DLSS and TAA until 9:37; damn, Noctis looks blurry there.

    • @anogrotter1985
      @anogrotter1985 6 years ago

      Keep in mind DLSS performs better because it's actually a lower resolution, so even if it looks the same it's really impressive.

    • @Luredreier
      @Luredreier 6 years ago

      +l0lzor123
      Yeah, impressive what they can do with lower resolution rendering with this tech.

    • @sheikhrayan9538
      @sheikhrayan9538 6 years ago

      TAA looks so much worse, omg lol

  • @themodfather9382
    @themodfather9382 5 years ago +2

    It's not 4k, it's 1440p upscaled. The information is simply not there. Remember upscaling DVD players? Same thing.
    I think it's a cool feature, but people are saying it looks the same as 1800p upscaled. Using a 4k monitor, we should be comparing: 4k, 1800p, 1440p, 1440p+DLSS upscale. People have been upscaling resolutions for over 20 years so let's not pretend this is a new thing.

    • @flarpman2233
      @flarpman2233 5 years ago +1

      Flarp here.
      Anyone who can't spot the difference between native 4K and 1440P upscaled is either making things up or blind. Furthermore, there is no 'performance boost' at 4K from doing this, you are gaining performance due to running at a resolution below 4K.

  • @rainZSlayer
    @rainZSlayer 6 years ago +3

    Lo and behold, "faux K" arrives to the PC master race. Such ground breaking technology!!

    • @HarryPearce7
      @HarryPearce7 6 years ago

      Console drones can't even perceive faults in their absolute parlor trick of a rendering method that is checkerboard

    • @rainZSlayer
      @rainZSlayer 6 years ago

      HarryPearce Harr harr... Faux K is the new revolution. Bow down to pcmr 😂

    • @e5m956
      @e5m956 4 years ago

      I agree with Chatterjee. I'm PC master race and don't like the idea of rendering at a lower resolution and upscaling! I pay for 4K hardware, I want full-fledged 4K! In Shadow of the Tomb Raider with no AA at 4K, the image looks much sharper than with DLSS, albeit with no AA you get some shimmering (which is also annoying), but AA blurs the nice sharp 4K image! :( Still, PC will have a much better framerate, and at least you have the option to use what you want (DLSS, no AA, TAA, this resolution, that resolution, turn this setting down, turn that setting up), whereas console graphics settings are locked and you're stuck with what you've got. So yeah, even with console-like upscaling, PC is still the master race hehe. :)

  • @JulianDanzerHAL9001
    @JulianDanzerHAL9001 5 years ago

    I don't like resolution scaling personally, but as an anti-aliasing method this seems pretty interesting.
    The disadvantage is that, of all graphics features, this is probably the hardest to predict: if you want to know how well it does, you really can't use benchmarks or even short test runs. You pretty much have to play through an entire game and all possible scenarios just to see how it behaves in all cases.

  • @msironen
    @msironen 6 years ago +4

    I expect these 4K purists will also refuse to listen to any kind of compressed audio. It's gotta be that 1.4 Mbps CD Audio from the 80s (or better yet SACD) just so you won't miss any of those least significant zeros in the bitstream.

    • @NoobGamingXXX
      @NoobGamingXXX 6 years ago +2

      They can wait 25 years for hardware that will support their purist ideology.

  • @jeffhampton6972
    @jeffhampton6972 6 years ago

    This was really informative, thank you. Also, as OxRashedxO would agree, the Unreal music was a fantastic surprise. Hell yes!

  • @kenshii_jim
    @kenshii_jim 6 years ago +20

    At 1080p, FFXV probably has one of the worst and most aggressive TAA implementations I have experienced (right beside MH World). I literally said "wtf is this oily layer on my screen" in the first 10 minutes of gameplay. Saying it's not the best is being nice.

    • @killermoon635
      @killermoon635 6 years ago +1

      TAA makes the game look better. So many jagged edges on the trees if you turn TAA off.

    • @zblurth855
      @zblurth855 6 years ago +1

      Oooh, that's why I have oil in MHW.

    • @gabrielst527
      @gabrielst527 6 years ago

      The problem is that Nvidia intentionally left TAA on opposite DLSS to make native 4K look bad; this test should be done with native 4K, no AA, vs 4K DLSS upscaling.

    • @kenshii_jim
      @kenshii_jim 6 years ago

      Yeah Gab Stu, you're right. TAA makes it less sharp, so we don't notice as much of a difference.

  • @nihren2406
    @nihren2406 6 years ago +2

    So what you're saying is that DLSS solves the hair aliasing problem in Final Fantasy XV, at least for the most part? That was something that always bugged me even on PC. In cutscenes there was always a lot of shimmering on the characters hair, like watching a bunch of razor blades go down each strand. It's worse on some characters, but still there regardless. Now granted I wasn't playing at 4K, but I had AA cranked to max and that only seemed to have a little effect on it.

    • @pottuvoi2
      @pottuvoi2 6 years ago

      It does seem like it in the video.
      Knowing at least partly how DLSS works, it's actually not surprising that it handles hair dithering so well.
      The AI receives a 64xSSAA image as part of its training, and dithering basically turns into a 64-value gradient in alpha.
      It's a very distinctive feature in the source images, so it certainly makes sense for the AI to learn how to help with it.

  • @Xenthorx
    @Xenthorx 6 years ago +4

    TAA looks like no AA at 1:50

  • @Theinvalidmusic
    @Theinvalidmusic 6 years ago

    Nice use of the Dusk Horizon theme from Unreal in the background. Still an absolute fave

  • @mhosni86
    @mhosni86 6 years ago +3

    Are you telling me that the technology that Deckard used in Blade Runner is already happening? XD

    • @mduckernz
      @mduckernz 6 years ago

      Yes, but it's been around for a long time. This is not new technology. I've been using neural network based image and video upscaling for over 5 years now... NVIDIA has just built hardware acceleration for it and packaged it into a convenient API. That's all

  • @allbrancereal_
    @allbrancereal_ 6 years ago

    At 17:45 you can see clear shimmering/missing texture at the center of the DLSS footage in the distance.

  • @buzhichun
    @buzhichun 6 years ago +4

    In 10 years games will only render 4 actual pixels, then upscale with deep learning to 12K or whatever resolution we're using then. You heard it here first.

    • @ketrub
      @ketrub 6 years ago

      you're not funny nor clever

    • @buzhichun
      @buzhichun 6 years ago +1

      jeez, who shat in your cornflakes this morning?

    • @ole7736
      @ole7736 6 years ago

      I found it funny and clever.

  • @RaufZero
    @RaufZero 6 years ago

    Guys, did you swap the descriptions by accident? DLSS on the left looks blurry compared to the picture on the right with TAA.
    Watch at 10:24.

  • @Rithysak101
    @Rithysak101 6 years ago +24

    The thumbnail misled me into thinking you're talking about Versus XIII :(

    • @batmangovno
      @batmangovno 6 years ago +2

      Yeah, I would've loved to see FFvXIII's darker artstyle realized with current-gen top tech.

    • @Rithysak101
      @Rithysak101 6 years ago +1

      @@batmangovno the more I think about it the more it saddens me :(

    • @KagatoAsuka
      @KagatoAsuka 6 years ago +1

      There are mods for FF15 that swap in some Versus 13 assets. I know it's not the same, but still pretty cool.

  • @AgUf007
    @AgUf007 6 years ago +1

    I love the Unreal music in the background.

  • @CharcharoExplorer
    @CharcharoExplorer 6 years ago +108

    The RTX 3000 series will be a much better product at 7nm, especially if gamers vote against these prices with their wallets and wait. But they are spineless and don't make long-term decisions, who am I kidding lol?

    • @wile123456
      @wile123456 6 years ago +8

      Yes sadly most gamers nowadays are just sheep buying brand names like it's an iphone or clothing. Most of them don't even watch these videos. Nvidia knows their market and they are abusing it to the fullest extent, making these insane price hikes while people still make them sell out. It's hard to keep faith in this industry when these buyers are screwing over everyone else

    • @ToothlessSnakeable
      @ToothlessSnakeable 6 years ago +1

      It won't be 7nm; apparently AMD is getting priority on that from TSMC.

    • @bitscorpion4687
      @bitscorpion4687 6 years ago

      10nm Samsung in 2020 is what you should expect...

    • @cremvursti
      @cremvursti 6 years ago +4

      At $1200 I'm pretty sure not many will pick one up. Nvidia isn't Apple and the RTX 2080 Ti isn't the iPhone X, so I'm pretty sure they won't sell like crazy. Nvidia probably already knows this; pricing the new cards so steep suggests they don't expect to sell as many, and this generation is more like a minor one, the middle step between two bigger generations. That means the bulk of sales will still come from the GTX 10xx generation, at least until AMD releases new cards as well, at which point we might see some changes in pricing policy.

    • @madfinntech
      @madfinntech 6 years ago

      You don't say! They aren't out now though.

  • @shawnchalfant1595
    @shawnchalfant1595 6 years ago +1

    At the end of the day, if I can run on high settings with FXAA @ 1080p, I'm perfectly happy. It's all about the gaming for me. I never understood why geeks are so obsessed with the numbers. The point of gaming is to enjoy it. Look at Breath of the Wild: not breaking any big numbers there, yet the game is so immersive. But you guys have to pump out videos and get the tech out there. Love this channel!!!

  • @madfinntech
    @madfinntech 6 years ago +4

    I really hope this DLSS and other AI features will be utilized in gaming in the future, and I'm not only saying this because I bought the 2080 Ti. I bought it for 4K AND 60fps gaming, but it would be cool if these other features took off as well.

    • @Ford-wt8rn
      @Ford-wt8rn 6 years ago +1

      Same here. It could increase the longevity of these cards too... if Cyberpunk 2077 uses DLSS in, say, 2020, the 2080 Ti would probably be able to play it at higher settings with great framerates, HDR, etc. Of course I'm speculating, but it really could be a game changer. I'll take a checkerboard-like image, maxed out, with high framerates, over lowering the settings, native 4K and lower frame rates for really demanding games. And we all know games will get more demanding these next couple of years...

  • @Engel990
    @Engel990 6 years ago +1

    Holy shit, the Unreal music in the background... just after you leave the ship. I fucking recognize that anytime, anywhere, no matter how softly it's played.

  • @dreammfyre
    @dreammfyre 6 years ago +28

    Quantum image reconstruction. Spooky pixels at a distance.

    • @Dictator93
      @Dictator93 6 years ago +3

      LOL - no joke, when looking over DLSS I actually said "spooky action at a distance" to describe some of the behaviour :D
      -Alex

    • @Luredreier
      @Luredreier 6 years ago

      +re hash
      Yeah, I noticed that.
      But I think it's well worth the cost for Nvidia users, given the performance benefits.
      An RTX 2060 running a game supporting DLSS will probably give AMD a real beating there.
      And one where Nvidia has actually earned the victory, unlike with all the GameWorks nonsense...

    • @franzusgutlus54
      @franzusgutlus54 6 years ago

      If you don't look at the monitor, the image is blurry and crisp at the same time!

    • @st0nedpenguin
      @st0nedpenguin 6 years ago +2

      If you don't look at the screen the game is both running and not running.

  • @ardentlyenthused338
    @ardentlyenthused338 6 years ago +1

    If this kind of performance boost is representative, and enough games support it, DLSS could be a game changer, and ultimately one of the key technologies that helps achieve acceptable performance with ray tracing enabled. For instance, if you take the average 30% performance boost of the 2080 Ti over the 1080 Ti (simply from being a faster architecture) and combine it with the 40% increase from DLSS, you're looking at over an 80% performance increase relative to the 1080 Ti (1.3 x 1.4 = 1.82, or about 82%). That's a HUGE gain, one that will certainly partly or perhaps entirely offset any performance penalty incurred by ray tracing.

  • @erdincylmaz4529
    @erdincylmaz4529 6 years ago +7

    It is much blurrier than TAA.

    • @killermoon635
      @killermoon635 6 years ago +2

      Look at 5:50: the background looks less blurry with DLSS.
      Also, the hair has no jagged edges with DLSS, unlike TAA.

  • @ItsCith
    @ItsCith 6 years ago

    Oh man, the background music brings back so many memories! I love Unreal 98!!

  • @mitthjarta5
    @mitthjarta5 6 years ago +3

    Called it.
    Also, "deep learning" here is more of a buzzword (not one I'm fond of). From my understanding it's applying a ruleset of filtering functions to clusters of pixels, based on relative properties between samples (a la how adaptive sharpening and many 'quick AA' methods work). Upon recognising a cluster's characteristics, it applies the most suitable filter pattern by traversing a LUT. (The filter-pattern LUT is the only "deep learning" aspect as far as I'm aware: a pattern set generated by comparative analysis on goliath datasets of 1440p vs 64x super-sampled frames, probably generated in part by the current demos. That's where the deep learning comes in; it's probably a dataset that gets updated in tandem with drivers, and thus improves over time IF it has any shortcomings.) What makes it vastly better than TAA is the amount of raw silicon dedicated to the task, and the ability to apply such a large set of cluster-specific filters. The "deep learning" aspects are actually quite brute-force if the data sets are generated by supercomputers, and require entire blocks of compute ICs to be applied in real time.
    Where TAA and most temporal-injection methods fall short in comparison is that they are very generic and holistic, and have far lower overheads. Even without the DL datasets, the same compute silicon could offer less complex adaptive filtering techniques that improve upon TSAA.
    It should be said it's not "new" per se; it's just a more brute-force and extensive application of techniques AMD and Nvidia have used in the past. EQAA (GCN) and CSAA (NV) sampling methods have been around for years and do something **similar**. But those are MSAA methods, fall apart at higher resolutions, and are fundamentally incompatible with deferred shading as I understand it.
    What I've noticed, especially in the stills NV themselves have promoted, is that it's clearly upsampled; it has telltale scaling footprints imo (a visible "smeary" loss in texel definition), which is how I called it ahead of time upon its reveal. It IS quite nice, especially with the performance implications, but I'm against any vendor-exclusive features when they are so fundamental regardless of vendor.
    Which is why I'm hoping it's not just relegated to RTX SKUs exclusively going forward. There is a reason 8xTSAA in DOOM (2016) with Vulkan performed better on AMD than any other filtering method: it used async compute to offload the AA to otherwise IDLE compute cores (in a very similar way to how this is being accomplished). Nvidia now also have the might of async compute at their disposal, which is great, but they're trying to squander its application so it won't also benefit AMD, and using marketing buzzwords and jargon to justify those exclusionary tactics. Yes, they put in a nice amount of RnD etc, and should be commended for that, but I'm ambivalent towards their clear anti-competitive leanings.
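
    A toy sketch of the LUT mental model described above, purely illustrative (nothing here is NVIDIA's actual pipeline, and the kernels and threshold are made up): classify each pixel neighbourhood by a crude "cluster characteristic", then apply a kernel pulled from a filter lookup table.

      import numpy as np

      # Hypothetical per-cluster filter LUT: flat regions get a smoothing
      # kernel, high-contrast (edge) regions get a sharpening one.
      FILTER_LUT = {
          "flat": np.full((3, 3), 1 / 9.0),                         # box blur
          "edge": np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]]),  # sharpen
      }

      def classify(patch: np.ndarray) -> str:
          """Crude 'cluster characteristic': a local contrast threshold."""
          return "edge" if patch.max() - patch.min() > 0.2 else "flat"

      def filter_image(img: np.ndarray) -> np.ndarray:
          out = img.copy()
          for y in range(1, img.shape[0] - 1):
              for x in range(1, img.shape[1] - 1):
                  patch = img[y - 1:y + 2, x - 1:x + 2]
                  kernel = FILTER_LUT[classify(patch)]
                  out[y, x] = np.clip((patch * kernel).sum(), 0.0, 1.0)
          return out

      gray = np.random.rand(64, 64)  # stand-in for a luminance channel
      print(filter_image(gray).shape)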

    • @giancarloriscovalentin1865
      @giancarloriscovalentin1865 6 years ago +1

      Agree

    • @mduckernz
      @mduckernz 6 years ago +2

      That's a very narrow view of DL.
      DL is a term for a cornucopia of different techniques. It is not any one thing, or even one process. It is true that it encompasses the things you mentioned, but it also encompasses far more than that.
      If I had to give a high-level summary, it would be "a deeply layered, sometimes recursive architecture of learning algorithms, usually with some form of reinforcement learning". Notice this says nothing about the kind of data it processes (nothing about pixels, for instance), nor anything about how that data is processed.
      The technique that NVIDIA seems to be using with DLSS looks like a neural-network-based approach. This is not new technology; it is rather old, in fact. I've been using AI NN-based upscaling for images and video for well over 5 years now (there are even plugins for MPlayer and I think VLC that do it; I forget the name, but if you search "neural network upscaling MPlayer" you will find it) and it works very well. What NVIDIA have done is simply accelerate it in hardware and package it into a nice API. That's it. Not a massive innovation, just clever use of old tech.
      However, with regard to what you said about how this will be used to lock out AMD: absolutely agreed. This was very likely _one of_ their primary goals. I can admire the technology but despise its ulterior motives at the same time.

    • @mitthjarta5
      @mitthjarta5 6 years ago +1

      I agree completely,
      but the terminology of "deep learning" is very one-sided marketing. The umbrella term "machine learning" already exists to bundle those things together; DL has primarily been pushed by marketing and PR, not so much by computer scientists. It's nowhere near as bad as "blockchain" as a recent trendy buzzword, but having followed machine learning for years (auxiliarily; I have a local friend in the field), the term "deep learning" never came up until around 2013, when it was 'invented', even though the field had been establishing itself for almost a decade at that point. The terms I heard commonly were machine learning, neural networks, self-learning algorithms, etc.
      "Machine learning" was the easy-to-understand catch-all term being used; if you look up all the established Wikipedia articles around the subject, that is the vernacular. Even "self-learning" was used heavily in the press. But Nvidia themselves are the ones that pushed "deep learning" into the vernacular, so heavily that it became somewhat pervasive, and if anyone heard a story about "deep learning", especially a few years back (when it was being associated with Tesla due to their partnership), the only results that came up when searching were predominantly Nvidia's own material. It was essentially a tactic of SEO for CEOs.
      I think it's a perfectly serviceable term, but I'm not fond of it because of the above. It's like how TensorFlow became an industry-standard toolchain for dataflow programming: Google even went and created a TPU, "Tensor Processing Unit", without discouraging others from making similar ASICs, and then Nvidia went and started using "tensor cores" to bank on its synonymity, even though 'tensor cores' are fundamentally just GPGPU cores (albeit with optimizations), not a true TPU ASIC design. It's not an isolated incident; they do this aggressive, marketing-driven hijacking of technology all the time.
      I'm glad you commented to clarify the extent of machine learning within DLSS, though; I did come across too dismissively, and ultimately I agree there is enough to justify its use. And I'm glad you stated your stance on the ethics of vendor lock-in. I'm genuinely fearful of a repeat of Microsoft in the hardware/'emerging technologies' space.

  • @user-dz3ph7dl4m
    @user-dz3ph7dl4m 6 years ago +1

    The reason Nvidia demonstrated DLSS vs TAA is that TAA introduces blurring/softening; the lower DLSS pixel count will be less noticeable side by side with TAA. When available, please compare against native 4K, Digital Foundry. While TAA might look nice at 1080p, applying it when rendering at 4K defeats the purpose, imo.

  • @varshoee
    @varshoee 6 years ago +29

    The only tech channel we need on YouTube for CPU and GPU analysis is Digital Foundry; always professional and informative content is presented here.

    • @evanvandenberg2260
      @evanvandenberg2260 6 years ago +10

      Nima V and Hardware Unboxed, and Gamers Nexus, tbh we are kinda spoiled. 😋

    • @EnduroTrailRiding
      @EnduroTrailRiding 6 years ago +6

      Where's their benchmarked review video then? There are far better channels out there that have already done that today and now prove more credible. I assume Digital Foundry is on Nvidia's payroll, choosing not to release a benchmark comparison video highlighting how crap these new cards truly are in terms of performance upgrade vs last gen.

    • @Luredreier
      @Luredreier 6 years ago +4

      +Nima V
      I very much like their in-depth look at frame pacing.
      But their quality comes at a cost in quantity.
      They can't really check as many titles or as many variables as other reviewers like Hardware Unboxed do.
      And at the end of the day, it's Hardware Unboxed, Gamers Nexus and Actually Hardcore Overclocking that I use the most for my purchasing decisions regarding motherboards, CPUs and GPUs.
      PSUs are a different story.
      But yes, Digital Foundry is definitely a great supplementary source of information.
      But "only tech channel we need"?
      Definitely not.

    • @MarceloTezza
      @MarceloTezza 6 years ago +6

      You are wrong on so many levels; Digital Foundry was once something, now it's just a bunch of blind guys.

    • @GraveUypo
      @GraveUypo 6 years ago +1

      Kinda have to agree with Marcelo Tezza here.
      Quality's dropped HARD from what it once was. Their analyses are getting shallower and shallower, not to mention biased. This video was pretty low quality; it was more a conversation than a technical analysis. I learned almost nothing I didn't already know, and I haven't even done much research into DLSS.

  • @WeeWeeJumbo
    @WeeWeeJumbo 6 years ago +1

    Always very grateful that this level of insight is available to the public

  • @darak2
    @darak2 6 years ago +3

    The technology seems cool, but the 1440p base resolution is very apparent in scenes with high texture details. No upscaling magic is going to recreate details that were never there to begin with. FF is not a good test case since its texture quality is actually pretty poor (and so is its post-processing stack), not to mention an on-rails demo is the best case scenario for an algorithm based on training.
    Until this is released in a real game and we're able to test its real performance implications and quality over really dynamic and interactive scenes, any review is pointless.

    • @marverickbin
      @marverickbin 6 years ago

      But in fact the details are there, because they use 4K images in the network training.
      Maybe this game is not the best example, and the training architecture has to improve, or more images have to be fed to the network. But the technology allows it to infer what is not there. It's like managing to draw something you've seen before from another angle: the information is there in your brain's network.

    • @darak2
      @darak2 6 лет назад +1

      DLSS will be able to produce a better blend of multiple frames of information in order to prevent aliasing, but it won't be able to add missing information that is not present in the frames generated at runtime. There are two reasons why this is guaranteed. The first is a dataset size issue: you would need to store, somewhere, all the missing pixels from the original 4K pictures for each and every frame, data which won't otherwise be available at runtime; the size would be prohibitively large (see the estimate below). The second is speed: the more individual cases (texture details) the algorithm must react to, the bigger the network grows and the slower it runs. Some existing superscaling neural networks are able to infer non-existing details, but those run extremely slowly, and unlike DLSS they are trained for extremely specific cases such as celebrity faces at a particular angle (and even then, the results are horrible 50% of the time).
      Neural networks are a great technology, but they are not magic. Even if you don't believe me, you have plenty of examples of the missing texture details in the existing DLSS captures.
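      To put rough numbers on "prohibitively large": a quick back-of-the-envelope sketch, assuming uncompressed 24-bit colour, 60 fps and a 40-hour game (all figures are illustrative, not anything Nvidia has published):

        # Cost of storing the "missing" 4K pixels for every frame of a game.
        pixels_4k   = 3840 * 2160                 # 8,294,400 pixels
        pixels_1440 = 2560 * 1440                 # 3,686,400 pixels
        missing     = pixels_4k - pixels_1440     # ~4.6 million pixels/frame
        bytes_per_frame = missing * 3             # 24-bit colour: ~13.8 MB/frame
        seconds = 40 * 3600                       # 40 hours of gameplay
        total_tb = bytes_per_frame * 60 * seconds / 1e12
        print(f"{total_tb:.0f} TB")               # roughly 119 TB

      Even under these generous assumptions, the dataset lands well over a hundred terabytes, which is why detail has to be inferred rather than stored.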

  • @DNAReader
    @DNAReader 6 лет назад

    @DigitalFoundry TAA is temporal super-sampling: it uses previous frames and player movement to provide anti-aliasing through interpolation. DLSS is a series of quasi-Gaussian filters (convolutional filters, like those of a neural network) learned from ground truth (4K MSAA 8x) compared against no-AA 1440p input. The filters are a learned filtering process, accelerated by Nvidia's ASIC, the tensor cores (hence why they say "inferencing").
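    A minimal sketch of that kind of supervised setup, in PyTorch: a toy network bilinearly upsamples a no-AA 1440p frame to 4K and learns a residual correction against a 4K MSAA 8x target. The real DLSS architecture, loss and data pipeline are proprietary; the layer sizes, the L1 loss and the training_pairs loader here are all assumptions for illustration.

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class ToyUpscaler(nn.Module):
          # Upsample 1440p -> 4K, then let learned conv filters add back
          # edge/anti-aliasing detail as a residual.
          def __init__(self):
              super().__init__()
              self.refine = nn.Sequential(
                  nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(32, 3, 3, padding=1),
              )

          def forward(self, lowres):              # (N, 3, 1440, 2560)
              up = F.interpolate(lowres, size=(2160, 3840),
                                 mode="bilinear", align_corners=False)
              return up + self.refine(up)

      model = ToyUpscaler()
      opt = torch.optim.Adam(model.parameters(), lr=1e-4)

      # hypothetical loader of (no-AA 1440p input, 4K MSAA 8x target) pairs;
      # in practice you would train on crops, not whole frames
      for lowres_noaa, target_4k_msaa in training_pairs:
          loss = F.l1_loss(model(lowres_noaa), target_4k_msaa)
          opt.zero_grad(); loss.backward(); opt.step()

    At runtime only the cheap forward pass runs, in reduced precision on the tensor cores, which is presumably where the performance headroom comes from.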

  • @desmond3245
    @desmond3245 6 лет назад +6

    There is no point in using TAA in the first place at 4K. So comparing TAA with DLSS at 4K to make DLSS seem better makes no sense. Why not compare DLSS with no AA at 4K?

  • @DELTAGAMlNG
    @DELTAGAMlNG 6 лет назад +1

    Is DLSS only for 4K, or will it work for people running 1440p monitors?

  • @eldarmusayev7653
    @eldarmusayev7653 6 лет назад +5

    @21:57 While using the tensor cores to upscale the ray-traced image seems like a good idea, those tensor cores are already busy doing something else: denoising the ray-traced image. Maybe it's possible to run both the upscale model and the denoise model in parallel and assign them half the tensor cores each, but I've never seen a single GPU able to run two models in parallel, and I would wager this would require a separate, unified model, one trained on an image that is both lower-res and noisy from RTX, with supersampled, cleaned-up ray-traced frames as the training target, introducing a lot more room for error in the end result.

    • @TheLoucM
      @TheLoucM 6 лет назад

      The Star Wars demo runs both at "the same time".

    • @JimBob1937
      @JimBob1937 6 лет назад +1

      Not all devs seem to be using the tensor cores for denoising. For those that are, I see no problem: the different workloads will simply be scheduled among the cores. Why do you feel cores can only be used by one thing? Even setting aside the ASIC cores we're talking about, CUDA cores are by their nature able to be scheduled for mixed workloads; that's kind of the whole point of the unified core architecture. With ASIC cores, you'll just be queuing the workload among them, and since the work is parallelisable it can shard out to however many cores are needed and available; it's highly unlikely the denoising will require many of those cores. So I don't think this is an accurate criticism.

    • @JayJapanB
      @JayJapanB 6 лет назад

      Dice weren't using the Tensor cores for anything.

    • @eldarmusayev7653
      @eldarmusayev7653 6 лет назад

      @jarrod1937 At least for people using the various machine learning libraries, you can't parallelise tasks and assign one to half a GPU. You just assign a task a GPU and off it goes.
      If denoising and supersampling are two separate models, they'd need to be run sequentially. That means extra slowdown. That's why.
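      For what it's worth, frameworks do let you issue two models onto one GPU via separate CUDA streams; whether the denoise and upscale kernels genuinely overlap is then up to the hardware scheduler, which is the real crux of the objection. A sketch, with all model and tensor names hypothetical:

        import torch

        denoise_stream = torch.cuda.Stream()
        upscale_stream = torch.cuda.Stream()

        with torch.cuda.stream(denoise_stream):
            denoised = denoiser(noisy_rt_frame)   # hypothetical model/tensor
        with torch.cuda.stream(upscale_stream):
            upscaled = upscaler(lowres_frame)     # hypothetical model/tensor

        torch.cuda.synchronize()  # wait for both before compositing the frame

      Issuing on two streams only removes the false serialisation; if either network saturates the tensor cores on its own, the two will still effectively run back to back.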

    • @eldarmusayev7653
      @eldarmusayev7653 6 лет назад

      @JayJapanB Dice were definitely using tensor cores for denoising the ray traced reflections. RTX includes denoising as a vital part of the process. Unless you've got a source where Dice devs explicitly say they were using ray tracing without tensor cores.

  • @rvalent9366
    @rvalent9366 6 лет назад +2

    I hope the GTX 2060 will have some tensor cores (no RT) so we can still have DLSS, and maybe Variable Rate Shading via AI?

  • @fatboymachinegun
    @fatboymachinegun 6 лет назад +190

    PC gamers:
    Fake 4K on consoles? Wow, what a joke.
    Fake 4K on PC? Wow, this technique is so advanced!

    • @RinkuHeroOfTime
      @RinkuHeroOfTime 6 лет назад +17

      Tbh I really don't care for 4K.

    • @GraveUypo
      @GraveUypo 6 лет назад +28

      Basically that.
      But honestly, I want fake 4K on both. I want to run games at native 1080p and have them not look smeary on a 4K screen; that way I can use low-end hardware and still get good performance.
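      The "smeary" look typically comes from the scaler interpolating between source pixels. Since 1080p to 4K is an exact 2x in each axis, integer (nearest-neighbour) scaling can instead duplicate each pixel into a crisp 2x2 block; a minimal numpy illustration, with random data standing in for a rendered frame:

        import numpy as np

        frame_1080 = np.random.rand(1080, 1920, 3)   # stand-in for a rendered frame

        # integer scaling: every source pixel becomes a sharp 2x2 block
        crisp_4k = frame_1080.repeat(2, axis=0).repeat(2, axis=1)
        assert crisp_4k.shape == (2160, 3840, 3)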

    • @Solidouz
      @Solidouz 6 лет назад +11

      Lol so true.

    • @EcchiRevenge
      @EcchiRevenge 6 лет назад +39

      It's advanced.
      Nobody said it's preferred over true 4K, etc.
      It's just another option that the PC Master Race has, sometimes for hitting that 144fps mark (which consoles don't even have the CPU to do).
      Nothing new there; it's not like PC games can't have dynamic resolution.

    • @QuietWookie
      @QuietWookie 6 лет назад +8

      it is advanced tho.

  • @zNoteck
    @zNoteck 6 лет назад +6

    The Unreal music in the background, though.

  • @UbiquitousDIY
    @UbiquitousDIY 6 лет назад +1

    Given that DLSS is constructing images to some extent, how does this affect competitive games (Siege/CSGO etc.), which need the most raw and immediate visual info/frames for the best possible gameplay?

  • @keyboard_g
    @keyboard_g 6 лет назад +7

    So the conclusion is that faux 4K is easier to do than real 4K. Groundbreaking stuff...

    • @glorious240fps6
      @glorious240fps6 5 лет назад

      You didn't understand a single thing in this video.

    • @themodfather9382
      @themodfather9382 5 лет назад

      @@glorious240fps6 Yeah, he did. No matter how you slice it, it's a 1440p image, just like the Xbox 360 used to upscale a 576p image to 1080p. I love the Xbox 360, but let's be honest about what it is.

    • @glorious240fps6
      @glorious240fps6 5 лет назад

      @@themodfather9382 It's a new technology, and it's getting better and better over time. Look at Metro Exodus's DLSS now vs. BF5's DLSS before.

  • @SaschaWagner
    @SaschaWagner 6 лет назад +1

    Unreal-Soundtrack = Goosebumps

    • @e5m956
      @e5m956 4 года назад

      I still remember coming from an N64 and GoldenEye on a TV to a PC with a Voodoo 3 that came with Unreal bundled in, on a 1024x768 PC monitor, and being blown away by the sharpness and graphics of Unreal's opening intro screen!

  • @zeldacuz
    @zeldacuz 6 лет назад +3

    Digital Foundry gives DLSS its stamp of approval? Nice.
    DLSS is the future, it seems.

  • @proesterchen
    @proesterchen 6 лет назад +1

    It's quite telling that the two comparisons Nvidia chose to present to the press are against other (arguably worse) blur filters, rather than against the 'real thing.'

  • @KeyToAnime
    @KeyToAnime 6 лет назад +5

    That Versus XIII thumbnail makes me very triggered. ._.

  • @Maschinenzimmer777
    @Maschinenzimmer777 6 лет назад

    What's the game in the second half of the vid? Looks amazing, and I'm loving the art style as well as the tech behind it, but I couldn't catch the name.

  • @Anita028
    @Anita028 6 лет назад +3

    That gorgeous Unreal soundtrack in the background is giving me a hard time trying to concentrate on what they're talking about... I just want to see more Unreal. What a soundtrack, for what a game.

  • @ishaanpitcon4572
    @ishaanpitcon4572 6 лет назад +1

    I would be interested to see a comparison with no AA as well.

  • @b01scout96
    @b01scout96 Год назад +3

    Today my RTX 4090 reached over 11,000 points in the FFXV benchmark (maxed, UHD, DLSS). Compared to the roughly 3,000 points in this video, that's astounding! 😀
    And DLSS 2.0 significantly improved image quality; a shame the game never received the new version, though.

  • @mathburn1
    @mathburn1 6 лет назад

    @5:53 What happened to Noctis's inner shirt? An artifact?
    There's also shimmering on the DLSS side @10:26 - @10:28, @11:55 - @11:57 and @17:33 - @17:52 (the bright spot in the middle of the screen), and I think it looks blurrier as well.
    But the performance gain really is something worth the sacrifice.

  • @tepkisiz
    @tepkisiz 6 лет назад +22

    Come on, it is obvious that DLSS is rendering at 2K; you can easily see the detail loss in the textures. DLSS is cool as a new AA method, but it is stupid to compare it to 4K TAA in terms of performance. Compare it to 2K TAA if you have to.

    • @madfinntech
      @madfinntech 6 лет назад +16

      1440p, not 2K.

    • @Luredreier
      @Luredreier 6 лет назад +1

      +Seko Us
      What frames were you looking at?
      The one where they paused just after a change of frame (the only one where I genuinely saw it being 2K instead of 4K, as the NN hadn't had time to catch up with the change of frame yet)?
      Because other than those frames, I didn't notice anything that looked any worse than real 4K.

    • @tepkisiz
      @tepkisiz 6 лет назад +2

      1:58, 16:30, and all the other static side-by-side comparisons. Check the ground and other static areas in the background, and watch at 4K; the lower resolution will be more obvious in native 4K, as YouTube is already compressing the image.

    • @Luredreier
      @Luredreier 6 лет назад +4

      +Seko Us
      Alright, I'll admit that 1:58 *does* look better with 4K TAA, but I doubt I'd notice that in a moving picture.
      16:30 looks about equal to me on the two.
      And in several other parts of the video, DLSS actually looks *better* to me.
      But part of the point here isn't whether a 2K picture turned into 4K with DLSS or a native 4K TAA picture looks better.
      It's whether it looks better than the 2K picture it started from, or close enough to 4K TAA to accept the disadvantages.
      The idea is that you can buy a 4K monitor and play games on it with a GPU designed for 2K resolution, or keep playing at 4K with a 4K-capable GPU long after the demands of games make native 4K impossible.
      So no, I don't think it's stupid to compare it with 4K TAA, as you *can* see several cases where it *does* look better.

    • @jakub7244
      @jakub7244 6 лет назад

      Check the car details in the comparison images: DLSS just washes everything out, as if you did it in Photoshop in your first year of college. You get sharp edges but textures that look like stretched 720p... If they can improve this it's going to be nice, but it's far from native 4K.

  • @ethangold2010
    @ethangold2010 6 лет назад

    At 6:07 you comment on how their TAA implementation leaves behind a lot of ghosting artifacts but DLSS doesn't. But actually, if you look very closely at the DLSS image, you can see very light ghosting. Compare the lower part of the tree before and after he swings his fishing rod: there is pretty obvious ghosting if you look closely. I'm not saying it will be visible in gameplay, but I think it is important to note.
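    The ghosting itself falls out of how temporal accumulation works: each output frame blends the current frame with an exponentially fading history, so a moving object leaves a decaying copy of itself behind. A toy illustration (the blend weight alpha is made up, and real TAA additionally reprojects with motion vectors and clamps the history, which is exactly what suppresses most of the trail):

      import numpy as np

      alpha = 0.1                       # weight given to the newest frame
      history = np.zeros((4, 8))        # toy 4x8 "screen"
      for x in range(5):                # a bright pixel sliding right
          current = np.zeros((4, 8))
          current[2, x] = 1.0
          history = alpha * current + (1 - alpha) * history

      print(np.round(history[2], 3))    # fading trail behind the pixel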

  • @Solidouz
    @Solidouz 6 лет назад +15

    Man, you guys are damage controlling so hard for the 2080. The 2080 is a bad product; there's no saving it. I cancelled my 2080 preorder and got a 1080 Ti for much less. Maybe the next iteration, or a 2080 Ti at MSRP, will be worth it.

    • @alonofisrael4802
      @alonofisrael4802 6 лет назад

      YES YES YES DAMAGE CONTROL

    • @youtuberobbedmeofmyname
      @youtuberobbedmeofmyname 5 лет назад

      @@alonofisrael4802 it is. lol this channel is bass ackwards

    • @glorious240fps6
      @glorious240fps6 5 лет назад

      @@alonofisrael4802 lol braindead AMD fanboy.

    • @glorious240fps6
      @glorious240fps6 5 лет назад

      @@youtuberobbedmeofmyname Brainless AMD fanboy crying and accusing a channel of being biased, lol.

    • @glorious240fps6
      @glorious240fps6 5 лет назад

      @Solidouz Snake lolz, so now telling the truth is damage controlling?? OK, brainwashed idiot. Not everything about the RTX cards is bad; the only bad thing is the price. Other than that, it's a kickass card with great features, and it's sad to see those features absent at launch. Only one game has ray tracing, and DLSS is MIA. But these features are actually good; AMD's CEO herself said so and praised Nvidia for being early adopters. AMD is also joining the party.
      You are clearly brainwashed by AMD shill channels that made you think the RTX 2080 is just bad and there's no saving it.

  • @Brotoles
    @Brotoles 6 лет назад

    About the Infiltrator demo: assuming the frame-time graph is completely synced, it's interesting to note that frame-time spikes happen at different frames when using DLSS compared to TAA... It seems that, because of the temporal element of TAA, its frame-time spikes happen one frame after the DLSS ones...

  • @DerMitWieWoOhneNamen
    @DerMitWieWoOhneNamen 6 лет назад +36

    1:58 The DLSS image is extremely blurry compared to the traditional method. That's not closer to 4K; it's further away from a higher resolution... Look at the hair, for example.

    • @wile123456
      @wile123456 6 лет назад +31

      Well, that's because they paused at a frame change, so it doesn't have a previous frame to use for reconstruction (it's also why TAA doesn't remove the edges on the right either).

    • @erdincylmaz4529
      @erdincylmaz4529 6 лет назад +7

      Bla bla bla, DLSS is shit; nice Nvidia ass-licking session in this video.

    • @bitscorpion4687
      @bitscorpion4687 6 лет назад +8

      5:47 seems pretty good to me

    • @Luredreier
      @Luredreier 6 лет назад +9

      +Monchi AwesomeSauce
      I'm an AMD fan and I'm probably *never* going to buy an Nvidia card.
      However, give credit where credit is due.
      DLSS is a *good* technology, and I'm looking forward to AMD implementing something like it themselves.
      Sure, it's not perfect; the first frame after a major change like the one they paused at is at a lower resolution, but quite frankly, without such a pause you're *never* going to notice that.
      And a 40% performance uplift compared to regular AA at a given resolution is a no-brainer.
      I'm going to recommend that friends of mine who *are* Nvidia users (and fans) buy something like this (or possibly the next generation) to benefit from this technology, as I think this is sort of Nvidia's equivalent of the first GCN processors on our side.
      An HD 7970 GHz Edition is still a good GPU today because of all its forward-looking technology; many of those features were not used until only a couple of years ago.
      But yes, a GPU from June 22nd, 2012 can *still* be used to game at 1080p in 2018.
      That's impressive.
      And I think this might be something equivalent on the Nvidia side, although it might be worth waiting for the second-gen card with this tech for Nvidia fans.

    • @hugh31098
      @hugh31098 6 лет назад +9

      Have you watched the whole video? The hair with TAA looks sharper because there are a lot more artifacts.
      DLSS definitely handles aliasing better, even compared to a technique as blurry as temporal anti-aliasing. And it also gives extra performance.

  • @deathrow989
    @deathrow989 6 лет назад

    Alex is such a credit to the Digital Foundry team; it seems like you guys have got the dream team.

  • @DJHeroMasta
    @DJHeroMasta 6 лет назад +5

    *This* is what I'm second most excited about with the RTX cards... *DLSS* 😍.

  • @marvelv212
    @marvelv212 6 лет назад

    Are you going to blow up images just to look at jaggies when playing games? Once AA is applied, I don't think it makes a huge difference which AA it is, as long as you have it on and the performance hit is smaller. Either way, it's good to see which AA looks better.

  • @jarekstorm6331
    @jarekstorm6331 6 лет назад +15

    Native 4K matters, due to the nature of modern displays requiring perfect pixel-to-pixel parity between source and display. With modern monitors the pixels are fixed, unlike CRTs, on which upscaling or downscaling could be done perfectly. Fixed-pixel displays require native resolution in order to avoid blurry images, and I'm honestly not sure why Nvidia is unable to produce a product at this late date which can achieve this at 3840x2160, 60Hz native.
    We've suffered through 1920x1080 for well over a decade and upgraded to 2560x1440 over the last 5 years, but we still cannot achieve 2x this resolution at a meager 60Hz? Checkerboarding, upscaling and DLSS are not a substitute for native resolution. I'd honestly rather run NO AA at all than sacrifice the perfect pixel-to-pixel rendering that modern displays require. Gimmicks are not what we want. We just want the power to drive native 4K images at a bare minimum of 60Hz, and I feel that this is not an unreasonable desire only a year away from 2020!
    You wanted a PC gamer's perspective; well, there you have it, from a 50-year-old man who's been gaming since Pong in 1976. Progress has stagnated due to Nvidia's monopoly of the PC GPU market.

    • @darak2
      @darak2 6 лет назад +8

      It's not as clear-cut as you may think. Modern rendering techniques use a lot of multi-resolution tricks: even when your game is configured for and delivering a 4K output, chances are that several of the buffers involved in the composition (AO, shadows, motion blur, lighting) will be lower resolution. In addition, techniques like TAA blur the image, effectively producing a lower-fidelity result in order to limit jaggies and other artifacts. So, while it is true that fixed-pixel displays require native resolution to avoid a blurry result, modern games are producing a blurry result on purpose, which makes it a moot point. In addition, resolution and fidelity are much less apparent in fast motion, so a lot of tricks involving time-based composition work better than the screenshots suggest (TAA is one of them, but any modern upscaler will be too).
      I was a skeptic, but after playing quite a few first-party PS4 games based on its piss-poor checkerboard technique, all of them looking far better than you'd think considering the hardware, I'm now pretty open to the idea.
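      A toy version of that mixed-resolution compositing, with numpy standing in for the GPU (the buffer names and the half-res choice are illustrative; real engines pick per-effect resolutions and use smarter upsamplers):

        import numpy as np

        h, w = 2160, 3840
        lit_full = np.random.rand(h, w)            # full-res lighting buffer
        ao_half  = np.random.rand(h // 2, w // 2)  # AO at a quarter of the pixels

        # crude upsample by pixel duplication, just for the sketch
        ao_full = ao_half.repeat(2, axis=0).repeat(2, axis=1)

        final = lit_full * ao_full                 # composited "4K" output

      The output buffer is 4K, but only some of the work that produced it was done at 4K, which is the sense in which "native" is already a fuzzy term.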

    • @MGsubbie
      @MGsubbie 6 лет назад +1

      Jarek Storm
      Let's be fair here and not put all the blame on Nvidia. Moore's law is dead; even if Nvidia put in more effort, the era of amazing leaps between generations is over.

    • @Astraldymensions
      @Astraldymensions 6 лет назад +1

      I see what you are saying, but 4K is just a large number of pixels to push. Yes, for the past 5 years we have been on 1440p, but 1440p is also about 69% of full 4K. That's still a lot of ground to make up if 1080p has been 50% of 4K the entire time. On top of that, the rate at which we are increasing pixel counts has gotten insane: going from 480p to 1080p is a way smaller jump than 1080p to 4K, and none of this is helped by game devs barely targeting 1080p native resolution on the console end.
      Perhaps we should have a card by now that manages it (technically, across most games, the 1080 Ti can run 4K 60fps), but with the GPU-heavy advances in technology that newer games are using, it's difficult to keep up with every game on the market. We're closing the gap; it's just bigger and more complex than before, with a lot more rendering techniques in the soup.

    • @nameless-user
      @nameless-user 6 лет назад +3

      This seems kinda stuck-up to me, to be honest. DLSS is not a replacement for native, but I'd take it any day over a 4K image that isn't locked to 60fps or higher. I want this kind of tech to go mainstream in the future; it would make budget gaming that much more attractive.

    • @Mogura87
      @Mogura87 6 лет назад +1

      Astraldymensions 1080p is only 25% of 4K, and 1440p is only 44%. 4K: 3840x2160 = 8,294,400 pixels; 1440p: 2560x1440 = 3,686,400 pixels; 1080p: 1920x1080 = 2,073,600. Your point about the insane increase in resolution still stands, and it doesn't stop with 4K either. 8K: 7680x4320 = 33,177,600, which is exactly 4 times the 4K resolution. 8K displays, VR headsets and cameras are already available to consumers.
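      The corrected figures are easy to check, assuming the standard 16:9 resolutions:

        res = {"1080p": 1920 * 1080, "1440p": 2560 * 1440,
               "4K": 3840 * 2160, "8K": 7680 * 4320}
        for name, px in res.items():
            print(name, px, f"{px / res['4K']:.0%} of 4K")
        # 1080p 2073600 25%, 1440p 3686400 44%, 4K 8294400 100%, 8K 33177600 400%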

  • @TheWarTube
    @TheWarTube 6 лет назад +2

    I'm actually impressed by the results, but I'd like to see how DLSS behaves with lower sample counts (like upscaling from 720p to 1080p), to see whether the quality level can keep up.

  • @lpsomething
    @lpsomething 6 лет назад +3

    I bought AMD stock at 10.30 USD and sold it at 32.00 USD to buy an RTX. God bless America...

    • @lpsomething
      @lpsomething 6 лет назад

      I don't invest in the Dutch stock market, and have received zero money from it as a result. Because of this, I don't go on the internet talking about Dutch stocks. I hope this helps. God bless!!

  • @ww2v2
    @ww2v2 6 лет назад

    17:32 There is evidence of artifacting when you look at the middle of the scene (the actual middle of each split screen): there is what looks like an archway, with light playing on the water seen through it. As the camera pans upward, you can see on the DLSS side some kind of chequer-like artifacting there (also on an area of light just to its right) that isn't present with TAA. It's hardly a big deal, but it goes to show that even in a canned benchmark the image is definitely not flawless. Performance is good, though.

  • @mrpositronia
    @mrpositronia 6 лет назад +17

    Are people really that bothered about minute pixellation these days?

    • @lightning2940
      @lightning2940 6 лет назад +10

      Well, that's the PC Master Race for you.

    • @mrpositronia
      @mrpositronia 6 лет назад +1

      @@lightning2940 I mean, I have a GTX 980 Ti and a 1440p monitor, and not once have I ever been upset about seeing pixels. Mind you, I grew up with ZX Spectrums, so maybe my opinion is out of date...

    • @lightning2940
      @lightning2940 6 лет назад

      @@mrpositronia I dunno about that, but most hardcore PC gamers are always too picky about the details in games, and they've got too much cash to throw at everything.

    • @ColpoRosso
      @ColpoRosso 6 лет назад +5

      I used to get mental about anti-aliasing ten years ago. Now, I couldn't care less. I just want more fps, and if the art is well done, you don't even need AA.

    • @nikobellic7455
      @nikobellic7455 6 лет назад

      But the DLSS one seems a tad softer than even TAA.

  • @Evab3vA
    @Evab3vA 6 лет назад

    Unreal music in the background!

  • @alexaka1
    @alexaka1 6 лет назад +14

    I can't wait to run these demos on my new $1,000 graphics card. So worth it, thanks Nvidia. /s

    • @rmendoza720
      @rmendoza720 6 лет назад

      Chicken... meet egg... or... wait... is it egg... meet chicken? Are game producers going to put these features in a game if the hardware isn't there? At least the hardware is there now... Be patient... it's coming.

    • @theripper121
      @theripper121 6 лет назад

      Gotta say, this release is a major misstep for Nvidia. The biggest player in the world in graphics card tech can't manage to get together with even a single game developer to have at least one playable game showing off what they're calling the greatest leap in rendering ever? The two most touted features of the product sit absolutely idle at release: not one game supports ray tracing, not one game has integrated DLSS, and Windows drivers for ray tracing won't even launch for another month. What was the rush? It's not like AMD is nipping at their heels right now. Add the single largest tier-to-tier price leap in video cards that I can recall (and I've been with the PC platform since the 386 days), and that isn't a way to launch a product. It wouldn't be that big of a deal if these features were just niceties added to a new card; instead they are being touted as the flagship features, and you have no way to show any of it, and won't for quite some time. That equals a botched launch, if you ask me. Nvidia keeps touting how easy it is to implement DLSS. If that's the case, where's at least one title, just one, that can use the second half of your shiny new 20x0 card?

  • @timothyweakly2496
    @timothyweakly2496 6 лет назад +1

    Man, I wish I had more understanding of PCs. That performance gain is crazy.

  • @AshtarSheran1
    @AshtarSheran1 6 лет назад +6

    The lowest point of Nvidia in twenty years.
    Crazy prices and ridiculous performance.

    • @JayJapanB
      @JayJapanB 6 лет назад

      They're the first company to implement real-time RT in games, so I'll give them that. That's a pretty big deal.
      Prices need to take a nosedive, though. Between price hikes and inflation in Australia, the 2080 is twice the price the 680 was. Feels bad, man.

    • @JimBob1937
      @JimBob1937 6 лет назад +1

      "the lowest point of nvidia in twenty years."
      I'm not sure if that is even remotely accurate, seems like a largely exaggerated claim merely because of the pricing. The tech they're pushing is probably one of the larger leaps in recent times from either AMD or Nvidia. As a developer, I'm excited, lots of new tools in the toolbox.

    • @Psydrre
      @Psydrre 6 лет назад +1

      It's PC gaming in a nutshell.

  • @BrainDesmo
    @BrainDesmo 6 лет назад +2

    Console gaming philosophy and tech, positively influencing PC gaming.