'Upscaling' framerates in games - Using RIFE to prepare for DLSS 3

  • Published: 7 Feb 2025

Comments • 438

  • @aaronmarko
    @aaronmarko 2 года назад +635

    Rendering technology is so wild. It's hard to believe that just 30 years ago we had games at a resolution of 160x200 and now computers are just inventing frames out of whole cloth.

    • @Tofuey
      @Tofuey 2 года назад +77

      30 years is an eternity in modern technology terms.
      Hell 30 years is a significant portion of a human lifetime.

    • @toafloast1883
      @toafloast1883 2 года назад +11

      Man, insane how we went from the pinhole camera back in 1021 to what we have today in 2022. Absolutely incredible

    • @prateekpanwar646
      @prateekpanwar646 2 года назад +16

      A Pentium from 30 years ago would have produced 10+ quadrillion cycles by now.
      Today's 8-core CPUs can do the same number of cycles in under 30 minutes. That's how quickly technology has improved and evolved.

    • @Henrix1998
      @Henrix1998 2 года назад +6

      30 years ago was 1992 and we had SNES with much higher resolution and only 4 years before N64 and SM64

    • @ragnarockerbunny
      @ragnarockerbunny 2 года назад +2

      As far as I'm aware only pre-8-bit consoles and handhelds ran at resolutions that low. Both the NES and SNES ran at 256x240, although for NTSC regions the image was actually rendered at 256x224. The N64 was capable of a 480p image, although the vast majority of games used the 320x240 mode. The original Game Boy and Game Boy Color both rendered at 160x144. Much older video formats also rendered at 144p.
      I mean, yes, the Atari 2600 had pixels you could count on screen and the PS5 and current PCs are rendering at 3840x2160 (which was always a pet peeve of mine: 4K suggests a four-times-larger resolution, but if 1080p is 1920x1080 then 4K only doubles each axis, with the "4K" coming from the horizontal axis rather than the vertical one, but that's neither here nor there), but time also moves forward.

  • @fuzzypotatoo
    @fuzzypotatoo 2 года назад +146

    One thing to note about 24fps movies is that each frame is filled with motion blur, which helps our brains basically create detail from what is essentially a blurry mess.
    A frame from a video game capped at 24fps would essentially be a moment frozen in time with no motion blur, thus looking less smooth.
    In fact, shutter angle is a consideration for DPs: they vary the length of the motion blur depending on whether it's an action scene (a shorter blur, making motion look almost superhuman) or they want to mimic intoxication or being drugged by making the blur extra long.
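
    As a rough Python sketch of that arithmetic (the 180° default and the other angles here are just illustrative assumptions):

        # Exposure time per frame from shutter angle: exposure = (angle / 360) / fps
        def exposure_time(shutter_angle_deg: float, fps: float) -> float:
            return (shutter_angle_deg / 360.0) / fps

        # At 24 fps, a 180-degree shutter (the usual default, i.e. shutter speed at
        # double the frame rate) exposes each frame for 1/48 s of motion blur.
        for angle in (90, 180, 360):   # narrower = crisper, wider = smearier
            print(angle, round(exposure_time(angle, 24) * 1000, 2), "ms")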

    • @battyflaps5410
      @battyflaps5410 2 года назад +1

      Usually the shutter speed is double the fps you capture at (the 180° rule), so you get smooth, more natural-looking motion blur

    • @float32
      @float32 2 года назад +2

      The rave scene in Blade being a good example of no blur, if I remember correctly

  • @TheChraz
    @TheChraz 2 года назад +506

    I don't know if you tried this already, but how would it look if you extracted all the AI-generated frames from the 30->60fps video and played them back without any real frames (and then AI-generated again based on that)? Might be a cool experiment to see how many times you can generate until it looks bad or non-playable.
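
    A rough sketch of that experiment, driving ffmpeg from Python (filenames are placeholders; it assumes a 2x interpolation that alternates real and generated frames, with the generated ones on odd indices):

        import subprocess

        def keep_generated_frames(src: str, dst: str, out_fps: int = 30) -> None:
            """Drop the even (real) frames and keep only the odd (generated) ones."""
            subprocess.run([
                "ffmpeg", "-y", "-i", src,
                "-vf", "select='mod(n,2)',setpts=N/FRAME_RATE/TB",
                "-r", str(out_fps), dst,
            ], check=True)

        # One round of the "copy of a copy" test; re-interpolate dst with RIFE/Flowframes
        # and call this again to iterate.
        keep_generated_frames("interpolated_60fps.mp4", "generated_only_30fps.mp4")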

    • @BallstinkBaron
      @BallstinkBaron 2 года назад +48

      I really like this idea! It wouldn't necessarily be in context, but it'd be interesting to watch at the very least

    • @gui18bif
      @gui18bif 2 года назад +5

      Cool idea

    • @Yorgos56
      @Yorgos56 2 года назад +47

      this has 3klik's written all over it, love it

    • @900bot2
      @900bot2 2 года назад +6

      Brilliant idea! Would make a perfect 3kliks video.

    • @verbatim7488
      @verbatim7488 2 года назад +1

      3KLIKS, 2KLIKS, KLIKS, ANYONE

  • @HoneyTwee
    @HoneyTwee 2 года назад +104

    Once again Philip is explaining this technology better than anybody else.
    Somehow this CSGO YouTuber is my go-to guy when telling people about AI technology.
    Great video man. Really shows you're willing to go the extra mile with the effort you put into this stuff. 💙

    • @lolroflmaoization
      @lolroflmaoization 2 года назад +5

      I enjoy his videos, but Digital Foundry is my go-to if you want in-depth analysis.

    • @HoneyTwee
      @HoneyTwee 2 года назад +7

      @@lolroflmaoization sure, I'd agree with some stuff. But with this very example of DLSS 3, Digital Foundry seems very restricted by Nvidia about what they can show, and didn't dedicate their video to exploring AI frame generation in its entirety, just showed some percentage increases when using DLSS 2 and DLSS 3 in like 2 Nvidia-approved games and scenarios.
      So at least for now, I think Philip has them beat in terms of a full, comprehensive look at this technology. So hats off to him, I say.

    • @WhizXGames
      @WhizXGames 2 года назад +6

      you must be thinking of 3kliksphillip. different guy. brother. father?

    • @HoneyTwee
      @HoneyTwee 2 года назад +7

      @@WhizXGames haha true, the CSGO mapper, the AI expert and the jelly baby eater. A true family.

    • @HoneyTwee
      @HoneyTwee 2 года назад +3

      @@eniff2925 ultimately DLSS 3 isn't out yet, so this video isn't trying to be a deep dive and review of DLSS 3 and whether you should turn it on or not. It's meant as an overview of what frame interpolation is, how it works, the limitations of the technology and what artifacts to look for, which frame rates the technology works better at, and the fact that you see much better results when interpolating smoother-looking motion to begin with (such as 24fps+). He explained why with simple, easy-to-understand graphs about the time between each real frame and AI-generated frame, and that tells us what kind of use cases DLSS 3 is going to be best at (starting from 60fps and up).
      Like, why so negative? This video is great for expanding on a basic understanding of the technology, and without early access from Nvidia it's the best we can hope for; it covers many things that weren't mentioned in Digital Foundry's video.
      And you're negative towards DLSS as well, saying it's a huge sacrifice in either latency or resolution, whilst ignoring the benefits: much smoother motion to drive monitors like 240Hz, 360Hz and 480Hz at 4K, or letting more demanding rendering like ray tracing become smooth-looking.
      The tech isn't out yet; we don't know how good it is. But even if you take Digital Foundry's numbers on latency at their worst, latency is the same as native when you have Reflex on and are rendering at 1/4 the resolution, or presumably a little higher when rendering at native res with frame generation and Reflex on. Is that really a massive deal? It's milliseconds more latency for a much, much smoother image to drive these new 240Hz+ monitors. If you only focus on the downside of a few ms higher latency it sounds terrible, but you're just ignoring the positives. Which doesn't really matter at the moment, since neither of us can draw a good conclusion about the tech yet. But it seems you are instinctively negative and biased against it for whatever reason, a sentiment I've seen a lot of due to the anger with Nvidia over pricing and whatnot.
      Just give the tech a chance, stop hating on a video for not being an in-depth look at DLSS 3's specific implementation of frame interpolation before the tech is even out, and stop being so negative. 👍

  • @Shellslime
    @Shellslime 2 года назад +18

    Just wanted to say that I really appreciated the random Oblivion OST towards the end of the video, absolutely wonderful

    • @gsedej_MB
      @gsedej_MB 2 года назад +2

      Same thought. I had to check if anyone else noticed.

    • @jeffbloomers
      @jeffbloomers 2 года назад +1

      @@gsedej_MB same

  • @albarnie1168
    @albarnie1168 2 года назад +32

    The biggest difference is that DLSS has access to color, normals, depth and motion vectors straight from the engine.
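
    Purely for illustration, the difference in inputs might look something like this (a hypothetical structure, not Nvidia's actual API):

        from dataclasses import dataclass
        import numpy as np

        @dataclass
        class EngineFrameInputs:
            """What an engine-integrated frame generator can read each frame."""
            color: np.ndarray           # HxWx3 final shaded image
            depth: np.ndarray           # HxW depth buffer
            motion_vectors: np.ndarray  # HxWx2 per-pixel screen-space motion
            # (plus normals, exposure, UI masks etc., depending on the integration)

        @dataclass
        class VideoFrameInputs:
            """All a video-only interpolator like RIFE gets: two finished images."""
            prev_color: np.ndarray
            next_color: np.ndarray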

    • @Sweenus987
      @Sweenus987 2 года назад +1

      I don't see why DLSS wouldn't have access to the vertex data too

  • @dXXPacmanXXb
    @dXXPacmanXXb 2 года назад +55

    Impressive stuff tho, I still remember when fading between frames to increase framerate was the state of the art
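
    That old approach is just a weighted cross-fade between neighbouring frames; a minimal numpy sketch:

        import numpy as np

        def blend_midframe(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
            """'Interpolate' by cross-fading: no motion estimation, just ghosting."""
            mix = frame_a.astype(np.float32) * (1.0 - t) + frame_b.astype(np.float32) * t
            return mix.astype(np.uint8)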

    • @emberytp
      @emberytp 2 года назад +5

      My old copy of Sony Vegas sure likes to remind me that resampling framerates was the future, when it makes even my hard cuts out of black blurry.

    • @Rainquack
      @Rainquack 2 года назад +1

      I've been pretty surprised that MPV seems to do that, while the Smooth Motion option in MPC-HC (both video players) doesn't seem to do anything at all.
      I can't believe that a PC with a 5600X and a 3060 Ti shouldn't be able to do better real-time interpolation in every single player than my 10-year-old TV does on some potato chip.
      RIFE 4.0 via Flowframes is great and shockingly fast compared to my previous experience 2 years ago with DAIN 0.36 and a GTX 960, but I'd still love to know about something that replicates that standard TV feature.
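
      For offline use, ffmpeg's built-in motion-compensated interpolation gets reasonably close to that TV feature; a sketch, assuming ffmpeg is on PATH and the filenames are placeholders:

          import subprocess

          def tv_style_smoothing(src: str, dst: str, target_fps: int = 60) -> None:
              """Motion-compensated interpolation via ffmpeg's minterpolate filter.
              Far slower than a TV's dedicated chip, but the same basic idea."""
              subprocess.run([
                  "ffmpeg", "-y", "-i", src,
                  "-vf", f"minterpolate=fps={target_fps}:mi_mode=mci:mc_mode=aobmc",
                  dst,
              ], check=True)

          tv_style_smoothing("clip_30fps.mp4", "clip_60fps_smooth.mp4")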

  • @nunayobusiness7521
    @nunayobusiness7521 2 года назад +64

    One thing to keep in mind is that RIFE can see all the frames while Nvidia can only see two, the present frame and the next one.
    Edit: I'M WRONG, RIFE only uses two frames despite having access to all of them. Thanks Quaterions

    • @baka2907
      @baka2907 2 года назад +2

      @@dan_loeb it's not only Nvidia, others upscale too. It's not a special thing to Nvidia

    • @krakow10
      @krakow10 2 года назад +9

      Unfortunately RIFE only uses two frames regardless of the fact it has access to all of them...

    • @w04h
      @w04h 2 года назад +2

      @@baka2907 What does this have to do with anything? This is not upscaling.

    • @circuit10
      @circuit10 2 года назад

      @@baka2907 ???

    • @jm27232
      @jm27232 2 года назад

      RIFE does not use more than 2 frames.

  • @thomasmatthews5593
    @thomasmatthews5593 2 года назад +29

    I really appreciate the amount of effort that has gone into this video. Straightforward and easy to understand, even for someone who isn't very tech savvy. Thanks for the all the hard work, very enjoyable!

  • @saphricpcgaming5182
    @saphricpcgaming5182 2 года назад +8

    0:00 You may not like it, but this is what the peak male form looks like.

  • @maxmustsleep
    @maxmustsleep 2 года назад +4

    incredible work with the explanations and demos! The filmed video is literally perfect

  • @fffrrraannkk
    @fffrrraannkk 2 года назад +4

    Any time I notice music from Oblivion in the background of a video I have to go back and watch parts again because the music is so good it's distracting.

  • @MFKitten
    @MFKitten 2 года назад +14

    I'm pretty sure the motion vectors in use will make a massive difference. The motion vectors are 3D, so it knows whether things are moving towards or away from you as well.
    I just wish games would also use the "time warp" style fps boost that VR uses. With VR, the drivers distort the previous frame, guided by the depth buffer and your headset's input.

    • @Rainquack
      @Rainquack 2 года назад +1

      So it allows you to move your head in a scene, while technically the action in the scene isn't newly rendered yet, for a low head-movement input latency?
      I never heard of this, cause I don't own a VR headset, but that's what I understand this as.

    • @MFKitten
      @MFKitten 2 года назад +3

      @@Rainquack yes! I encourage you to look it up on youtube. There are videos demonstrating it. The idea is to never let your headset's movement suffer from any framerate changes. When a frame isn't ready in time, it takes the previous frame and warps it according to the geometry of the scene (stored in the depth buffer), so you can always move around freely in the space in the full framerate, and if things are chugging along you will see items and characters and stuff get choppy, but your own movement in the space is always smooth. It's brilliant!
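
      A heavily simplified sketch of that reprojection idea (plain numpy, nearest-neighbour scatter, no hole filling; view matrices are assumed to be world-to-camera and the camera to look down +z):

          import numpy as np

          def reproject(color, depth, K, old_view, new_view):
              """Warp the last rendered frame to a new head pose using its depth buffer."""
              h, w = depth.shape
              u, v = np.meshgrid(np.arange(w), np.arange(h))
              pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3xN

              # Unproject to camera space with the old intrinsics/depth, then to world space.
              cam = np.linalg.inv(K) @ pix * depth.reshape(1, -1)
              world = np.linalg.inv(old_view) @ np.vstack([cam, np.ones((1, cam.shape[1]))])

              # Re-project into the new camera pose.
              cam_new = (new_view @ world)[:3]
              z = cam_new[2]
              xy = (K @ cam_new)[:2] / np.maximum(z, 1e-6)

              out = np.zeros_like(color)
              x, y = np.round(xy).astype(int)
              ok = (z > 0) & (x >= 0) & (x < w) & (y >= 0) & (y < h)
              out[y[ok], x[ok]] = color.reshape(-1, color.shape[-1])[ok]  # last write wins
              return out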

  • @neti_neti_
    @neti_neti_ 2 года назад

    In my view this is a very beautiful, useful and important technique for filmmaking. With it, special visual effects like the Matrix effect or time-slice (which is made using many cameras), where time slows down or appears to stand still, can be accomplished very simply and quickly.
    By the way, in this video you have done deep research, clear observation, insightful analysis and a very beautiful presentation.

  • @logisticallogic8673
    @logisticallogic8673 2 года назад +22

    Wow, those morphing videos created by RIFE remind me of some rather spiritual experiences

    • @HaxxorElite
      @HaxxorElite 2 года назад +1

      ?

    • @logisticallogic8673
      @logisticallogic8673 2 года назад +6

      @@HaxxorElite the RIFE merging looks really similar to an effect one can get while using psychedelics

    • @HaxxorElite
      @HaxxorElite 2 года назад +2

      @@logisticallogic8673 It has no connection

    • @logisticallogic8673
      @logisticallogic8673 2 года назад +8

      @@HaxxorElite ? I'm not quite sure what you mean, I'm only saying it looks similar to things I've personally seen before.

    • @oiltycoonbillionaire
      @oiltycoonbillionaire 5 месяцев назад

      Do you have autism @@HaxxorElite

  • @BigCheeseEvie
    @BigCheeseEvie 2 года назад +1

    You released the video just in time. Was rewatching your older ones

  • @pepp418
    @pepp418 2 года назад +5

    2:33 this effect could be used purposefully to make some amazing visual stylisation

  • @UOTCbassist
    @UOTCbassist 2 года назад +2

    DLSS 3 will be HUGE for older games that have physics tied to fps. A great example is Ocarina of Time. Trying to play at 20fps gives me a headache sometimes, but I power through it since it's one of my favorite games ever. If DLSS can properly "upscale" it to 60fps on Project64, I'd buy a 40 series card alone for that.

  • @scarpusgaming
    @scarpusgaming 2 года назад +3

    I would love to see a DLSS 3 example that shows the input video on the left, the "upscaled" result in the middle, and JUST the newly generated frames on the right. Would make for a really neat comparison.
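
    A rough sketch of building that comparison with ffmpeg from Python (paths are placeholders; it assumes the three clips are already prepared at the same resolution and frame rate, e.g. with the generated-only stream extracted as in the sketch further up):

        import subprocess

        def three_way_comparison(original, interpolated, generated_only, out):
            """Stack original | interpolated | generated-frames-only side by side."""
            subprocess.run([
                "ffmpeg", "-y",
                "-i", original, "-i", interpolated, "-i", generated_only,
                "-filter_complex", "[0:v][1:v][2:v]hstack=inputs=3[v]",
                "-map", "[v]", out,
            ], check=True)

        three_way_comparison("input_30fps.mp4", "interpolated_60fps.mp4",
                             "generated_only.mp4", "comparison.mp4")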

  • @Vartazian360
    @Vartazian360 2 года назад +14

    I see what you tried to do here, but all of these technologies only give the AI images without motion vectors, whereas DLSS 3 has all of the motion vectors in the engine to work with in addition to the visual data. Early comparisons with existing tech show that DLSS is far more advanced than off-the-shelf interpolation AI programs. However, what you said about it looking smoother without feeling smoother is probably spot on. It will be interesting to see how it actually feels versus just watching DLSS 3 footage of a game.

  • @pair_of_fins
    @pair_of_fins 2 года назад +3

    I think this is the most I have ever critically analyzed and learned from a clip of a half naked man

    • @pair_of_fins
      @pair_of_fins 2 года назад

      Other than maybe that unlisted 4kliksphillip video from a while back

  • @Harsh-up3ip
    @Harsh-up3ip 2 года назад

    your videos are dope. been binging everything since i paused watching you from a few yrs ago, good stuff.

  • @krakow10
    @krakow10 2 года назад +4

    Long time SVP user and RIFE user as of this year here. I would like to report that chain link fences are the bane of all things frame interpolation, and you should use it as a worst case scenario instead of grass!

  • @EVPointMaster
    @EVPointMaster 2 года назад +2

    When using RIFE with Flowframes, there is an option to detect scene changes, so it won't interpolate between those 2 frames.
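
    The idea behind that option is easy to sketch: if two consecutive frames differ too much, duplicate a real frame instead of interpolating across the cut (the threshold here is an arbitrary assumption):

        import numpy as np

        def is_scene_cut(frame_a: np.ndarray, frame_b: np.ndarray, threshold: float = 30.0) -> bool:
            """Crude cut detector: mean absolute pixel difference above a threshold."""
            diff = np.abs(frame_a.astype(np.float32) - frame_b.astype(np.float32))
            return float(diff.mean()) > threshold

        def safe_mid_frame(frame_a, frame_b, interpolate):
            # Across a cut, a duplicated real frame looks better than a morphing blend.
            return frame_a.copy() if is_scene_cut(frame_a, frame_b) else interpolate(frame_a, frame_b)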

  • @flufo
    @flufo 2 года назад +3

    I enjoyed this video thank you for making it

  • @TotalTonix
    @TotalTonix 2 года назад

    Thank you for your research! I've really enjoyed this type of content from you over the last few years!

  • @BRUXXUS
    @BRUXXUS 2 года назад +1

    Wait... is that Uncle Al? Oh god, he's broken into our physical dimension!
    Aside from that, I did find this whole experiment/demonstration super fascinating. I've always been mesmerized with some plugins and software that can take 60fps and attempt to convert them into slow motion. The weird bending and smeary distortion can look really neat.

  • @eX1st4132
    @eX1st4132 2 года назад

    I almost didn't click on this video, because the title sounded boring, but I am very glad that I changed my mind. This video is even better than a lot of your recent ones.

  • @offchristianamr
    @offchristianamr 2 года назад +6

    Great video. I've been waiting for real-time frame interpolation tech in games for years; it's very exciting to see it finally coming to life. Plus, the wise mystical music always gets me in a wise, mystical mood.

  • @-blade-9203
    @-blade-9203 2 года назад

    Seeing my own animation in your video caught me off guard. Very nice

  • @World_Theory
    @World_Theory 2 года назад +1

    Idea: The artifacting in the 60 to 120 FPS conversion becomes fast enough to trigger a kind of survival mechanism in the human brain, related to danger response and reflex. Like if you see something unexpectedly move in the corner of your vision, that may be a danger that you need to react to, so your brain distributes more resources to paying attention to it, and divides your attention between the flickering artifacts and whatever you're trying to concentrate on.

  • @ragnarockerbunny
    @ragnarockerbunny 2 года назад +8

    Games tend to benefit from higher framerates but film and animation don't. Especially since double the framerate means double the render time which can make animation a particular hell to work with, and AI interpolation on animation... Tends to be extremely hit or miss. More isn't strictly better. I'm softer on this than most, I've seen interpolations that do actually look incredible and benefit the animation, but I also know the vast majority of it looks bad and changes the "feel" of the animation. Something that used to be snappy and punchy can become overly smooth. Something that was meticulously workshopped to look natural can look artificial. It very much is a right tool for the right job situation.

    • @bungleleaders6823
      @bungleleaders6823 2 года назад +1

      > Games tend to benefit from higher framerates but film and animation don't.
      Hard disagree. It's choppy frame rates that look artificial. Natural-looking means ultra-high frame rates that get closer to life-like motion.

  • @xopsawa8720
    @xopsawa8720 2 года назад +11

    Frame upscaling and the artifacts that come along with it look really similar to asynchronous space warp (frame interpolation for vr headsets)

    • @theRPGmaster
      @theRPGmaster 2 года назад +1

      This is not upscaling though, it's interpolation

    • @AngryApple
      @AngryApple 2 года назад

      @@theRPGmaster you could argue that it's temporal upscaling 🤔

    • @theRPGmaster
      @theRPGmaster 2 года назад

      @@AngryApple You could argue it's temporal upscaling when plants grow

    • @MisterChief711
      @MisterChief711 9 месяцев назад

      @@AngryApple no you cant because then it would be called temporal upscaling

  • @thatcherfreeman
    @thatcherfreeman 2 года назад +10

    You made a great point at the end of the video. Past 60fps, traditionally I think the benefit of higher framerates has been the reduction in latency, not that it appears to move smoother. You can try looking over someone's shoulder as they play a video game and honestly anything above 50 looks reasonably smooth, but they can certainly feel that it's not perfect due to the latency. I'm really curious how DLSS 3 is going to feel on 30fps and 60fps games

    • @theemulationportal
      @theemulationportal 2 года назад +2

      Disagree. CRTs' motion resolution is equivalent to 1000fps; 60Hz on a sample-and-hold LCD or OLED screen looks like a smeary mess. In fact, resolution drops to 360 lines (equivalent to 360p) when panning the camera, which looks ultra terrible on modern 4K displays. Latency is only half the picture and only really relevant for e-sports titles / twitchy games. Nvidia and Intel have both done research papers/tests that suggest past a certain point people don't care about latency. Also, this testing methodology is useless, as DLSS 3 has access to the game's render pipeline and motion vectors, so all those artefacts he experiences in his CS:GO example are a limitation of RIFE and can't be compared 1:1 with DLSS.

    • @mechanicalmonk2020
      @mechanicalmonk2020 2 года назад +4

      @@theemulationportal what bargain bin OLED do you have that smears 60fps content?

    • @thatcherfreeman
      @thatcherfreeman 2 года назад

      @@theemulationportal You're right about the motion vectors, I expect DLSS to outperform the methodology shown in the video in terms of image quality.
      My argument was more that the increase in framerate is only really meaningful as long as you can feel improvements in latency. If you can't feel improvements in latency, you probably aren't benefitting anymore from high FPS from a smoothness standpoint either. I suppose we could actually test this empirically by having a 240hz screen and injecting input lag until it has the same latency as if you were playing at 60fps, might be an interesting test to do.

    • @bungleleaders6823
      @bungleleaders6823 2 года назад

      @@mechanicalmonk2020 For motion of 1000 pixels per second tracked by the eye, the tracked object is perceived to have a 16.6-pixel-wide symmetrical smear on a sample-and-hold 60Hz display. Being OLED doesn't change that. A theoretical instant pixel-transition technology would not change that either.
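
      The arithmetic behind that figure, assuming an ideal sample-and-hold display where each frame persists for the whole refresh interval:

          def perceived_smear_px(speed_px_per_s: float, refresh_hz: float) -> float:
              """Eye-tracked motion smears by one frame's worth of travel on sample-and-hold."""
              return speed_px_per_s / refresh_hz

          print(perceived_smear_px(1000, 60))    # ~16.7 px at 60 Hz
          print(perceived_smear_px(1000, 240))   # ~4.2 px at 240 Hz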

    • @bungleleaders6823
      @bungleleaders6823 2 года назад

      @@thatcherfreeman You are working on a false premise. Increasing the frame rate (independent of what happens to input lag) does yield improved motion clarity. And that effect is transformative up until ridiculously high frame rates/refresh rates orders of magnitude higher than what is available today. Life-like artifact-free motion portrayal requires tens of thousands frames per seconds at tens of thousands Hertz.

  • @yom35
    @yom35 2 года назад +8

    What looks REALLY good is already-high-fps footage upscaled even higher (say 240 to 2x or 3x that rate), then RESAMPLED back down to 60fps with Sony Vegas smart resample. It looks insanely smooth. Good video btw

  • @SquishEESpark
    @SquishEESpark 2 года назад +1

    8:25 - It’s impressive technology, but I feel like those camera “jumps” would drive me insane after a while. It’s similar to the later examples where you can’t exactly look for the issues, but you can FEEL their presence.

  • @ChristianBrugger
    @ChristianBrugger 2 года назад +7

    I think it could be interesting to just show the 60 AI frames for the 120fps example, to judge what it is doing.

  • @Tegridy_Gaming
    @Tegridy_Gaming 2 года назад +4

    Saving this to my "watch later" pile as I'm going to sleep. 😂

    • @quantum5661
      @quantum5661 2 года назад +6

      see you in the morning my dude

  • @Zolbat
    @Zolbat 2 года назад +1

    I'm fairly sure the gpu will utilise more data to interpolate muuuuuuuuch better. A gpu has a z-buffer (depth image), movement vectors for each pixel and more data, basically for free. This additional data can make the output almost perfect, even for the grass situation for example.

  • @hippiemcfake6364
    @hippiemcfake6364 2 года назад +18

    Cool video! Thanks for making it. For the 30->60fps examples, could you show the real skipped frame and the interpolated frame side by side or even the difference between them? I feel like that might be really cool to look at. Cheers.

  • @Beehj84
    @Beehj84 2 года назад

    The Oblivion score in the background was doing my head in trying to work out if I had another tab open with someone streaming gameplay lol.

  • @ink3988
    @ink3988 2 года назад

    I'm glad you sorted out the city packs for flight sim! Sorry, you were probably flooded with comments on the video on that...

  • @Borizz183
    @Borizz183 2 года назад +2

    It would be cool if there was an option that only activated the fps upscaling when your fps goes below a certain point.
    For example, you would normally play at 60 fps, but when the game sometimes dropped to 40 it would generate new frames to hit 60 fps again.
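
    Sketched as a render-loop toggle (entirely hypothetical; DLSS 3 doesn't currently expose a per-frame switch like this):

        class AdaptiveFrameGen:
            """Enable frame generation only while the real frame rate is below a target."""
            def __init__(self, target_fps: float = 60.0, smoothing: float = 0.9):
                self.target = target_fps
                self.smoothing = smoothing
                self.avg_frame_time = 1.0 / target_fps
                self.enabled = False

            def update(self, real_frame_time_s: float) -> bool:
                # Exponential moving average so a single slow frame doesn't flip the switch.
                self.avg_frame_time = (self.smoothing * self.avg_frame_time
                                       + (1.0 - self.smoothing) * real_frame_time_s)
                self.enabled = (1.0 / self.avg_frame_time) < self.target
                return self.enabled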

  • @justasmig
    @justasmig 2 года назад +1

    6:04 that "1" in "15 FPS" looks sus

  • @_Egitor
    @_Egitor 2 года назад +2

    Don't undersell yourself - you are always quite mesmerizing to watch ;)

  • @microsoftpowerpoint4731
    @microsoftpowerpoint4731 Год назад

    Ray tracing and DLSS are the technical computing miracle technologies of my time, and I now understand what a lot of older gamers talk about regarding what I take for granted. It's such an exciting time.

  • @zilliq
    @zilliq 2 года назад

    The way you blocked your teammate who was trying to reload hurt more and more the more you see it and the culmination was the 50% speed. You could really see the distress in his eyes as he progressively understands what's happening to him

  • @tritoner1221
    @tritoner1221 2 года назад +2

    love your work!

  • @ethyrice
    @ethyrice 2 года назад +1

    Doubling the replay speed to increase fps was the smartest thing I ever heard 😮

  • @NavJack27gaming
    @NavJack27gaming 2 года назад

    It's going to be quite interesting for sure. With post-processing you can't really test what would happen with a game. 5FPS decimated from a 60FPS video would be, as you presented it, an even decimation of the frame rate and thus the frame times. But with a game, as I'm sure you know, you have variable frame times. I'm most interested in the case where you are getting, by all means, a measurable "60fps" or "120fps" but your frame TIMES are wonky and uneven. THAT is where I'm sure DLSS 3 will shine.
    EDIT: and yeah, DLSS 3 will also have the motion vectors and depth buffer, among other things, that can act as hints to it. This is going to be VERY VERY interesting to see.

  • @lashingfanatic
    @lashingfanatic 2 года назад

    excellent video as always, always a pleasure

  • @gabboman92
    @gabboman92 2 года назад +1

    artifacts do not worry me. what worries me is the input latency

  • @Dapstart
    @Dapstart 2 года назад

    I still think the optimal solution in a game for this sort of tech would be to dynamically raise the "pure" framerate based on the frame-to-frame distance, essentially demanding more frames for the AI to use as reference based on the current speed of the camera. After animating for many years, I can tell you that you can make things appear very smooth even at a low fps, assuming the image isn't changing a ton frame to frame.

  • @phaphachboy
    @phaphachboy 2 года назад

    This may be great for VR where artifacts at the corner of your eyes may not be noticed

  • @ToniGAM3S
    @ToniGAM3S 2 года назад

    Yeah, the 5-to-60 CSGO scene broke my brain. I have never in my life experienced motion sickness, not even when I bought my VR set and instantly started with locomotive movement. But that scene broke me, jesus

  • @Gargantura
    @Gargantura 2 года назад +1

    that intro is gold.

  • @aussieknuckles
    @aussieknuckles 2 года назад +1

    I really didn't understand DLSS as well as I thought till I discovered your channel, and since then I've learnt a lot. This technology is going to get pretty wild as time goes on.

  • @Jakobschannel01
    @Jakobschannel01 Год назад

    12:06 “especially in applications like Blender” 12:10 proceeds to show footage from Maya

  • @desolunatic5045
    @desolunatic5045 2 года назад +1

    Well there is a game realm where something like this is already implemented - ASW in VR games, and it gives very convincing results.

  • @beardalaxy
    @beardalaxy 2 года назад +2

    I wonder if devs will be able to use dlss3 on specific elements instead of the whole screen. For example, games that have lower FPS things in the distance like particle effects or NPCs. Interpolation on just those distant elements could really help sell the scene.

  • @resurgam_b7
    @resurgam_b7 2 года назад +2

    In the 5 -> 60 flight sim, sure there was a lot of funny business going on over much of the screen, but there was also a lot of surprisingly smooth motion on the more distant objects. Definitely not how I'd want to game, but still super impressive that it can take 5 frames and turn them into 60 with any degree of success even if it's not exactly a good result overall.
    I am curious how this program does its thing without itself reducing performance. If the computer is not fast enough to produce 60 real frames, how is it fast enough to invent fake frames to make up the difference? I get that the fake frames aren't as complex to resolve as real ones, but they aren't free either, the computer has to do some calculating to produce them. Is there a point at which the computer is spending so much time inventing fake frames that the real frame rate starts to suffer?
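
    A back-of-the-envelope version of that question (the costs are made-up numbers, and real frame generation runs partly on separate hardware rather than strictly in series, so this only shows the shape of the trade-off):

        def output_fps(render_ms: float, interp_ms: float) -> float:
            """With one generated frame per rendered frame, each pair of output
            frames costs roughly (render + interp) milliseconds of work."""
            return 2000.0 / (render_ms + interp_ms)

        print(round(output_fps(16.7, 3.0), 1))    # ~101.5 fps out, not 120: generated frames aren't free
        print(round(output_fps(16.7, 16.7), 1))   # ~59.9 fps: if they cost as much as real ones, no gain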

  • @rubz1390
    @rubz1390 2 года назад

    Didn't expect to see David Bowie demonstrations today.

  • @metasamsara
    @metasamsara 2 года назад

    OOOOH, the bugs you mention about framerate reminded me of an old bug: when you ran the oldest Cossacks game on a 64-bit architecture, it couldn't help but speed the game up so much it was unplayable, like running a speedhack on the "very fast" setting. It was meant for a 32-bit architecture. There were also some gameplay differences between different screen resolutions, iirc.

  • @duudiiss
    @duudiiss 2 года назад +3

    Woah, what is the name of the last song used in the Conclusion chapter? It is so good; I've never heard it in another philip video before.

    • @ar-4775
      @ar-4775 2 года назад +1

      i also wanna know

  • @ividyon
    @ividyon 2 года назад +2

    I'm surprised that you, as an avid VR fan, made no comparison to the interpolation technology used in headsets! This kind of "Something is wrong, but you can't exactly tell what" happens ALL the time when you use interpolation to smooth out the framerates in VR. Things seem to "hang" on the edges of the screen sometimes and warp out of proportion. It is why I prefer to have a choppier framerate, as this kind of "wrong" feeling disappears, though I presume people who quickly get motion sick cannot enjoy the same luxury.

    • @ividyon
      @ividyon 2 года назад

      Specifically, I'm speaking of "Asynchronous Spacewarp", which adds frames when you move your head, to interpolate where you're about to look.

  • @floodtheinbox
    @floodtheinbox 2 года назад

    I feel like taking LSD turns your brain's DLSS into Rife 5->60fps

  • @dylnDOT
    @dylnDOT 2 года назад

    Something interesting is going to be if it's like minimum frame times, what if there's 10 frames between some of the "true" frames, after nothing but 1 frame between "true" frames. Something like a new form of stuttering...

  • @Kamil_O
    @Kamil_O 2 года назад

    this is how real gamers prepare for new GPU - running half naked in the park

  • @Jakewake52
    @Jakewake52 2 года назад

    Two things I would say:
    The low-frame-rate interpolation shimmering could be used for some really good music videos.
    Hypothetically, for the corner warping: if the area it takes up is consistent enough, would rendering the image at a slightly higher resolution and basically cropping to the desired resolution help?

  • @TheWeeky
    @TheWeeky 2 года назад +1

    I just hope that some form of it can be modded in or maybe even down the line officially supported on older cards, DLSS2 is great at what it does but fully raytraced cyberpunk still makes my gpu catch fire a bit even on Balanced

  • @loganwolv3393
    @loganwolv3393 2 года назад

    Seems like first person is the true Achilles' heel of frame interpolation. Explains why Nvidia used a regular third-person drive-through instead of panning the camera around in an interior like you're losing your mind, where you could possibly see some artifacts.

  • @GabrielWard2001
    @GabrielWard2001 2 года назад +4

    I just want to say i appreciated the music from Oblivion which kicked in at about 9:22.
    EDITED: TIMESTAMP IN THIS COMMENT WAS INCORRECT

    • @stianhoiland
      @stianhoiland 2 года назад

      I had to scroll this far to find the first comment mentioning this?! gg Gabriel (and it's 9:22)

    • @GabrielWard2001
      @GabrielWard2001 2 года назад

      @@stianhoiland Thanks for the correction. Have a nice day. :)

  • @Lagger01
    @Lagger01 2 года назад +1

    Have you seen RTX Remix, Philip? Who am I kidding, of course you have; it feels like a technology you'll love. I can't wait to see so many old DX8 and DX9 games re-rendered with new ray tracing capabilities; it'll be very neat to see them in a different light.

  • @sakurasfingernails7247
    @sakurasfingernails7247 2 года назад

    I certainly trust that DLSS3 will do a way better job with the grass than RIFE can. I imagine having access to actual depth buffer + motion vectors would fix 90% of everything that RIFE got wrong in that example.

  • @SO-Negative
    @SO-Negative 2 года назад

    Getting these extra frames from DLSS is like buying a vowel in Wheel of Fortune.

  • @johanavril1691
    @johanavril1691 2 года назад +2

    I doubt DLSS 3's artefacts will look anything like that; it works very differently

  • @metasamsara
    @metasamsara 2 года назад

    Dude, I can already see Warowl organizing a competitive tournament where everyone is frame-locked at 5fps and then it's upscaled to 60fps. That would be hilarious; how it glitches looks like a psychedelic trip lmao

  • @BloodWraith777
    @BloodWraith777 2 года назад

    4:32 They tried to recover images from dreams using this technology. This is the best they could do.
    2 and a half P. At 2 and a half FPS.

  • @MrDirt
    @MrDirt 2 года назад

    2:40 that’s the effect I got when playing half life Alyx on my old PC.

  • @Ularg7070
    @Ularg7070 2 года назад +1

    The 5fps example of you walking along some grass and its interpolation artifacts are basically the only thing I see in those "UHD 60FPS" upscales of anime OPs. They're so distracting and wrong that they ruin the entire video, and I'll never get why people are obsessed with what is obviously a low-effort conversion, wasting an otherwise interesting tool.

  • @nyande1828
    @nyande1828 2 года назад +1

    Most of the artifacts in this video are the result of naively estimating motion by finding the nearest patch of similar-looking pixels. This is why high-frequency detail like grass, rapidly moving objects, or repeating textures look the worst.
    DLSS 3 should have access to motion vectors for both frames, so if done properly, most of these artifacts should be avoidable. The only hard part is generating detail that is occluded in the input frames.
    We have the technology to do this convincingly, but I doubt it is optimized enough to run in real time yet, so it looks like DLSS 3 is just smudging surrounding colors to fill the gap. Maybe that's convincing enough in practice.
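
    A toy version of that naive nearest-similar-patch motion search, which is exactly where grass and repeating textures fall apart (two greyscale frames, exhaustive search over a small window):

        import numpy as np

        def block_motion(prev: np.ndarray, curr: np.ndarray, block: int = 16, radius: int = 8):
            """Exhaustive block matching: for each block in curr, find the best-matching
            block in prev within +/-radius pixels. Repeating texture means many near-equal
            matches, so the 'motion' it picks is often wrong."""
            h, w = curr.shape
            vectors = np.zeros((h // block, w // block, 2), dtype=int)
            for by in range(h // block):
                for bx in range(w // block):
                    y, x = by * block, bx * block
                    target = curr[y:y + block, x:x + block].astype(np.float32)
                    best, best_cost = (0, 0), np.inf
                    for dy in range(-radius, radius + 1):
                        for dx in range(-radius, radius + 1):
                            yy, xx = y + dy, x + dx
                            if 0 <= yy <= h - block and 0 <= xx <= w - block:
                                cand = prev[yy:yy + block, xx:xx + block].astype(np.float32)
                                cost = np.abs(cand - target).sum()   # sum of absolute differences
                                if cost < best_cost:
                                    best_cost, best = cost, (dy, dx)
                    vectors[by, bx] = best
            return vectors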

  • @hundvd_7
    @hundvd_7 2 года назад

    5:01 This came out really nice with the Vsauce-like music

  • @spectrobit5554
    @spectrobit5554 2 года назад

    The 5 real frames example gave me a headache. Really impressed with 24 though.

  • @ShagadelicBY
    @ShagadelicBY 2 года назад

    First 17 seconds of this video is the best news I've heard since the great war

  • @metasamsara
    @metasamsara 2 года назад

    Loool about the keeping the money for camera comment, imagine in 20 years when cryptography has become so powerful we can run quantum apps on old pentium 3 CPUs that would be hilarious. I always thought the race to better specs was a linear scaling through centralized means of production for profits, rather than a smart scaling through data compression and restructuring.

  • @kattenmusen10kofficial
    @kattenmusen10kofficial 2 года назад

    Very interesting test and comparison! But I wonder whether DLSS 3 creates its newly interpolated frames by interpolating between grainy DLSS-upscaled frames, or is this new frame interpolation an entirely separate tech that doesn't involve any upscaling at all? At the start of the video you mentioned that it doesn't allow for higher resolutions than before, so I assume it works like DLSS 2 plus interpolated extra frames. I am looking forward to a possible future video where you test it out to see how it looks and behaves in detail! Big love from Europe!

  • @metasamsara
    @metasamsara 2 года назад

    My Samsung smart TV already upscales frames; it's called the sharpness setting. Really, it's good as long as you don't go overboard with it (mine is set to 70/100). Especially during circular outer pans of the camera, that's when you notice all the gaps in spacetime it can't possibly fill :D

  • @Geddy135
    @Geddy135 2 года назад +2

    it'll be interesting to live in an age where you're playing a game and while the frame rate doesn't stutter anymore, you just _feel_ latency get worse for a moment.

  • @sunny113866
    @sunny113866 2 года назад

    Fricken great video!

  • @Great.Milenko
    @Great.Milenko 2 года назад

    the importance of motion vectors cannot be overstated.

  • @SekritJay
    @SekritJay 2 года назад

    I am distracted by Phil's exceedingly tight trousers

  • @xyoxus
    @xyoxus 2 года назад +1

    7:57 Thank you! At least Avatar is getting an adaptive refresh rate, where action scenes will have more than 24 fps (which every scene with fast-ish movement should have imo)

  • @motboy8
    @motboy8 2 года назад +4

    Even the 15 real FPS upscaled looked fairly impressive to me. I was thinking about how I played like 80 hours of Oblivion on my crap old PC at 15FPS, and then the Oblivion music kicked in around 9:30 and took me right back there 🕑🕒🕓

  • @Toby-Wan-Kenobi
    @Toby-Wan-Kenobi 2 года назад +3

    So Elden ring at 120 fps may be possible. That would be cool.

    • @mechanicalmonk2020
      @mechanicalmonk2020 2 года назад +1

      It's FROM software. They can't be arsed to make small fixes to their games let alone integrate dlss

  • @dreamywtf
    @dreamywtf 2 года назад

    this video reminds me of how much i hated using twixtor for editing haha, could never get the extreme slow motion looking right and now I find out why

  • @ZhenyaPav
    @ZhenyaPav 2 года назад +1

    I wonder if this would be adopted in VR, instead of the Asynchronous Space Warp

  • @SkyBorik
    @SkyBorik 2 года назад +1

    I dunno Philip, but the song at 6:45 is the worst of the new ones. Those horns are just so distracting from your speech.
    I thought I would get used to it, but no, it's just terrible

  • @skylx0812
    @skylx0812 2 года назад +1

    Why are you dressed like Capt. Kirk when he was wrasslin that weird kid CharleyX?

  • @EER0000
    @EER0000 2 года назад +1

    Since I don’t play multiplayer games anyway, I am mostly interested in how smooth something looks, CK3 with 120fps ftw

  • @PhilipLL
    @PhilipLL 2 года назад +1

    I find it really fascinating to see you break down all the new methods used in graphics processing, and your fascination with AI, content generation and upscaling. It's also a great source for learning about this stuff.
    But I wouldn't ignore the fact that AI has a terrible tendency to ruin artistic intent, both by warping visual art and by ruining classical animation. So AI upscaling isn't a silver bullet, and sometimes when you upscale something, there is also the aspect to consider that the product may be worse than the original.
    Lower framerates and resolutions aren't always something that should be considered undesirable in all cases. Though this is mainly applicable to artistic works / games & hand animation.
    I've seen countless examples of people using AI framerate upscaling on traditional animation and publishing the results as "superior" to the originals, ignoring the fact that the artefacts / smoothing contradict both the artistic intent of the creators and the core principles of the craft. Which makes me sad.
    This is not something I expect, or even think, you should cover. But I still think the question of IF we should use AI upscaling in the first place is as important, if not more important, than how good said upscaling is.
    Obviously no disrespect to your fascination with the scene. I've been watching you for years, and there are a lot of fascinating projects that stem from the field of AI generation.

    • @PhilipLL
      @PhilipLL 2 года назад +1

      @@2kliksphilip Hey, I don't want to start a discussion over this. But as a game developer myself, I really value artistic intent. And in artistic projects, AI upscaling is only going to act as a filter which can serve to distort those pieces with artistic intent. This is a general consensus within a large majority of artistic communities, not because of any sort of blanket hate; thinking of it as that is simply oversimplifying a complex issue.
      Yeah, a large amount of AI-modified content is guaranteed. And in most cases, artists have control over how our art is displayed. But if everything is filtered through a layer of AI upscaling, you are taking the artistic power away from those whose passion it is to make these games in the first place.
      Then again, AI upscaling for games can still be a great tool, especially for more realistic games, like the ones you've displayed as examples. For people using weaker computers and for games like these, this can be an awesome tool.
      But all I'm saying is that it's not just a straight-up improvement in every case; it's more of a very useful tool in certain cases, rather than a straight-up upgrade to anything it's applied to.

    • @AlphaGarg
      @AlphaGarg 2 года назад +2

      @@PhilipLL Did anyone ever say it was "a straight up improvement"? Far as I can tell, that's just a strawman. Nobody who's eager about frame interpolation and/or image/video upscaling is saying it'll improve anything you put on it - quite the contrary, we're looking at the places it makes mistakes in and waiting to see how they'll be resolved. And as for games, would you rather your player "enjoy" your "artistic intent" via 5, 10, 15 fps?
      Would you rather an AI upscale a rendered frame from your game, or someone either not play at all because of the low framerates or have to play at a lower resolution where your art is completely destroyed? Serious questions.. personally, as both a dev and a player, I'd much, MUCH rather have this sort of technology continue to evolve and am eager to see applications of it both in the now and in the future. Without this sort of stuff I wouldn't be able to play, say, Gmod in VR with my extremely underpowered MX110 and i5, or Arma 3 at native res.

    • @TVPInterpolation
      @TVPInterpolation 2 года назад +1

      It truly depends on how you look at "artistic intent", especially when you then use specific AIs for specific tasks.
      As an example: my video interpolation AI, TVP, is able to understand the 12 principles of animation, and although it's in beta, it performs pretty well!
      RIFE is a good test for live-action/real-life scenarios, but doesn't work so well for animation.