The future of upscaling?

  • Published: 31 Jan 2025
  • Games

Comments • 871

  • @johnfortnite1987
    @johnfortnite1987 25 days ago +145

    I've come back to this video time and time again, and now it's finally a reality with DLSS 4 and Reflex 2 Warp. You were spot on here!!

    • @veilmontTV
      @veilmontTV 12 days ago +1

      It really is impressive.

  • @MFKitten
    @MFKitten 2 years ago +1304

    YES! THIS IS THE THING I HAVE WANTED! Ever since Oculus demoed reprojection with a dev kit prototype years ago, I have been begging the universe to make developers realize the brilliance of this!

    • @dtrjones
      @dtrjones 2 years ago +35

      Exactly, this technology has been around for years in VR. DLSS 3, though, is hardware accelerated, which is nice, as there was always an initial penalty using Asynchronous Timewarp (Meta) or Asynchronous Reprojection (SteamVR) before the extra frames were noticeable, which would sometimes mean lowering graphical settings. But you know what, even hardware-accelerated reprojection predates DLSS 3: Jeri Ellsworth, CEO of Tilt 5, has hardware reprojection in the product of the same name, which allows the AR projectors in each pair of glasses to run at an incredible 180 Hz.

    • @Caynug
      @Caynug 2 years ago +2

      So funny seeing you here! I remember listening to your agile intrepid demo songs on sevenstring over 10 years ago hahah.

    • @hspxz
      @hspxz 2 years ago +10

      It's actually not easy to implement: you need to integrate it with the game's camera controller, with physics, etc.
      In VR it's easier because you already get all the player's movements from tracking, outside the game.

    • @sguploads9601
      @sguploads9601 2 years ago +7

      Reprojection is not in any way connected to framerates. You need to know how rendering works to understand it.

    • @abyssalczech6719
      @abyssalczech6719 2 years ago +4

      @@sguploads9601 isn't reprojection just some fancy algebra?

  • @DeSinc
    @DeSinc 2 years ago +234

    Why hadn't I thought of this? Reprojection was such a good idea, but I never even thought of putting it into normal games. Cool idea; it will undoubtedly be coming in the future as GPUs become more reliant on tricks and software to provide improvements, rather than ever-diminishing chip speed improvements.

    • @TheoHiggins
      @TheoHiggins 2 years ago +9

      And the cool thing is, rendering as we know it is just the culmination of software tricks upon software tricks to bodge our 3D world into something that can be reasonably calculated tens to hundreds of times a second.
      Down the line, looking back at tech like this once it's adopted, iterated upon, and eventually matured, it'll be hard to imagine a time when games didn't make use of it.

    • @jamesboyce7467
      @jamesboyce7467 2 years ago

      Why are wee sooo incrediable stuuoooopedd?

    • @mxusoleum
      @mxusoleum 2 years ago +3

      @@TheoHiggins You can simulate mouse input in a separate thread (Diabotical, GoldSrc) at a fixed frequency. I don't see how the same couldn't work for VR.

    • @DanielClear2
      @DanielClear2 24 days ago +3

      ding ding ding ding!

    • @imAgentR
      @imAgentR 14 days ago

      Games are already dependent on tricks and rendering techniques to make them plausible. It's just an evolution of the tools used; otherwise we still wouldn't be playing games as we do now.

  • @phazeF
    @phazeF 25 days ago +89

    UR WISH CAME TRUE. NVIDIA REFLEX 2 FRAME WARPING

  • @Bambiindistress
    @Bambiindistress 2 years ago +871

    The combination of all these technologies is the perfect mix for gaming

    • @Systox25
      @Systox25 2 years ago +7

      Casual gaming

    • @Dionyzos
      @Dionyzos 2 years ago +19

      @@Systox25 I think some of this can also benefit competitive games. If you have a 240Hz monitor and run a game at 144Hz, input lag could be reduced by polling your movements at 240Hz.

    • @brett20000000009
      @brett20000000009 2 years ago +14

      @@Dionyzos the whole "can't benefit comp games" idea is dumb. This is going to enable 1000fps@1000Hz (yes, they are actually working on 1080p@1000Hz monitors), which will give superior mouse feel that's perfectly consistent. It will be better for muscle memory with little to no sacrifice to base framerate.

    • @ShadowMKII
      @ShadowMKII 2 years ago

      I prefer it for baking.

    • @AlphaGarg
      @AlphaGarg 2 years ago

      @@Dionyzos Yep, I find CSGO utterly unplayable since it drops below 40 fps on my 60Hz PC every now and again. But it's not because the image is any more or less stuttery; my eyes can't perceive that difference, but my muscle memory can. Tracking in that game is near impossible for me without weird flicks interspersed through otherwise smooth movement. Meanwhile in Quake Champions, a game I can run at a solid 60fps, I have no issues with tracking. It's all down to how responsive the input is, and if the input matches what I see, boom, frame drops and hitches become a thing of the past.
      I yearn for the day something like DLSS or FSR 2.0 is implemented on a hardware level, and even more so for the day something like this spatial warping is implemented. Especially as a developer: not having to worry about optimisation as much would be great, so I can focus on things like gameplay, sound, UI, etc. Y'know, the actually fun stuff.

  • @Remyie
    @Remyie 2 years ago +332

    That's so clever. After testing out the demo, I can easily say that I cannot see or feel any difference between uncapped 750fps and 60fps capped with both tweaks. Even if I move the mouse super fast, it's not noticeable. If we added a slight motion blur to both, they would literally be the same. Also, the timewarp 3D screenshot is such a brilliant idea for interpolating between frames. Now I'm wondering why these weren't a thing all these years. Finally, I did try out 30fps; the input lag is definitely a lot better, but the ghosting is very visible and distracting.

    • @sawyergrimm287
      @sawyergrimm287 2 years ago +28

      It's noticeable, but it's up to each player to decide whether they want ghosting or laggy 30fps.

    • @Sergeeeek
      @Sergeeeek 2 years ago +19

      I'm sure this same technique could be improved even further to hide reprojection artifacts. This would be great to have for console games because they run at 30fps a lot of the time.

    • @StanleyKubick1
      @StanleyKubick1 2 years ago +1

      per-pixel mb

    • @AriinPHD
      @AriinPHD 2 years ago +10

      do not add motion blur to anything. ever.

    • @rdmz135
      @rdmz135 2 years ago +19

      @@AriinPHD motion blur is great for racing games

  • @Isaax
    @Isaax 2 years ago +161

    This is one of your most important videos in recent times. Thanks a lot for bringing this to attention! I remember how I HATED the idea of VR because of "the lag that would be there if you just moved your head", before I knew that it was already a solved issue. I was so shocked (positively) when I got my first Oculus in early 2019 to see how responsive it felt after all.

    • @fireaza
      @fireaza 2 years ago +2

      Yeah, this was one of the things they were laser-focused on when they were developing the Rift and Vive. They obviously knew from the VR experiments in the 90s that it was going to be ESSENTIAL to have zero lag when turning your head, otherwise motion sickness was going to be as rampant an issue as, well, people who have never tried VR think it is.

    • @Junya01
      @Junya01 2 years ago +1

      @@fireaza motion sickness still is an issue for many VR users. It’s why the Zuck is wrong and it’ll never be fully adopted by the general public

    • @Derpynewb
      @Derpynewb 2 years ago +3

      @@Junya01 nah, it's gonna be fully adopted. There are people addicted to this shit like people were addicted to social media. That's exactly why Meta, a social media company, is investing so heavily into this. Also, the tech has already grounded its use in industrial settings; it's already adopted by the military and some police forces because of how versatile it is. It's just in its infancy. With how the costs of everything are rising, it's easier to have pretend virtual things than real ones.
      I like VR, but at the same time I know for sure it's going to be hell. Imagine Ready Player One but without the good ending. I started realising all the little details after actually using VR. It's fucking annoying me how much I didn't notice in that movie. Smaller IRL houses since everyone lives online.

    • @Junya01
      @Junya01 2 years ago +1

      @@Derpynewb motion sickness dude. 3/4 of VR users already experience it in some form or another; you think that shit is gonna go away just cuz the tech is getting better? People aren't gonna pay extra to do shit in VR when they can do it cheaper, faster, and without feeling nauseous by just doing it IRL or through a screen.

    • @Derpynewb
      @Derpynewb 2 years ago +5

      @@Junya01 there are a few things you cannot replicate from VR on desktop. I've encountered some people that had horrible, and I mean horrible, motion sickness. They brute-forced their way through till they no longer had it. You can lose motion sickness. I initially got a bit motion sick and personally lost it too.
      The other thing is that the motion sickness is caused by a conflict between your visual senses and your biological accelerometers, and there's research being done on that too.
      VR does give you motion sickness, but you can get used to it. It's just that right now there's no compelling content to convince the masses to.
      FPS games give some people motion sickness and people got used to that too. However, the sample size isn't big enough to say if most people would be able to get over VR motion sickness, since there are some people who can't get used to desktop motion sickness.
      Everything I've said is anecdotal and the sample sizes aren't big enough to give a concrete conclusion. My personal opinion is that it will take off.

  • @darbyshiredanny97
    @darbyshiredanny97 18 days ago +5

    Absolutely clairvoyant video. Anytime someone challenges whether you know what you're talking about, show them this video.

  • @BallinLikeMike23
    @BallinLikeMike23 25 days ago +12

    Philip predicted Framewarp 2 years before NVIDIA. I rewatched this video after the 50 series announcement and it does a better job of explaining the "new" Reflex 2 technology.

    • @zachb1706
      @zachb1706 25 days ago +9

      Tbf it's based on an Nvidia paper

    • @BattousaiHBr
      @BattousaiHBr 25 days ago +2

      @@zachb1706 Reflex itself was inspired by a discovery (a bottlenecked GPU has a large impact on input lag) from a youtuber, albeit a different one.

  • @Astraben
    @Astraben 2 years ago +1320

    What I fear most is that companies will just target the same hardware requirements to run things at 60fps, for example, and use all these tricks to simply save themselves the bother of optimizing the game

    • @Derpynewb
      @Derpynewb 2 years ago +207

      With how things are going, making it run at at least 60fps is actually quite good. But I get your point. At the end of the day, what matters is the actual experience. Everything in 3D graphics is a lie. Most shadows are baked and are not real-time. If you take your clothes off in game, it's just an empty void.
      If lettuce tasted 100% like pizza, I'd be fine with lettuce.
      The only issue is if they make lettuce taste like wet pizza, and most people don't give a shit except me. THEN it becomes an issue.

    • @zarwil
      @zarwil 2 years ago +53

      You could say the same about every technology or hardware upgrade ever

    • @Astraben
      @Astraben 2 years ago +38

      @@zarwil And I do

    • @Derpynewb
      @Derpynewb 2 years ago +81

      @@zarwil On that note, actually, it's proof of Astraben's argument. Ever played an old game on modern hardware and seen how high the FPS flies (if it's not CPU-bound)?
      Old games run 10 times faster even though they don't look 10 times worse.

    • @TheMrLeoniasty
      @TheMrLeoniasty 2 years ago +2

      It's what they're doing now with DLSS and other stuff.

  • @ToonamiAftermath
    @ToonamiAftermath 2 years ago +12

    Great video! I'm hoping we don't have to wait too long for more advancements like this. I think a game engine could combine all of these technologies into one smooth, configurable experience that would advance gaming graphics and reduce the barrier to entry at the same time. Nice callout to foveated rendering too.

  • @earthling_parth
    @earthling_parth 2 years ago +26

    You have possibly inspired a real use-case for this, thanks to Comrade Singer and LTT pushing your vision, Philip.

  • @grggrgrgg
    @grggrgrgg 25 days ago +30

    Nvidia literally just saw this video and went "Let's take this technology John Carmack invented 12 years ago, and market it as an 'RTX' feature"

    • @hat1324
      @hat1324 9 days ago +2

      But sadly no one else did it in 12 years, so they get to do this.

    •  4 days ago

      It wasn't possible until now to do it in a way that wouldn't hinder image quality. Now it's possible with inpainting AI.

    • @grggrgrgg
      @grggrgrgg 4 days ago

      It works fine in VR without AI inpainting. There are many ways to tackle this issue.
      Modern games also provide you with a ton of info you can take advantage of to fill the gaps more intelligently, like the depth and velocity buffers, in addition to the rendered pixels. Imo even the simple techniques used in the Unity demo are not so noticeable if you go like 60->144fps in normal gameplay. I'd rather play that than base 60 fps.
      But it's also of course a trade-off in compute, as the async reprojection render has to be ultra fast. Shifting the weight to specialized hardware like Nvidia is doing is a smart move.

  • @Armi1P
    @Armi1P 2 years ago +21

    It's amazing how combining fairly simple tricks can give such an impressive result!

  • @luukvanoijen7082
    @luukvanoijen7082 2 years ago +317

    hey, the name for that "3D screenshot" thingy for frame smoothing is "temporal reprojection", I think. I used this technique in a raytracing shader for Minecraft once; it's actually really cool and not demanding at all, just a little bit of matrix math.

    • @TheMrKeksLp
      @TheMrKeksLp 2 years ago +10

      Correct!

    • @luukvanoijen7082
      @luukvanoijen7082 2 years ago +36

      @@bluescorpian lol it's actually not that much math to do this, but there's of course more to it than that, like discarding pixels so you don't accidentally try to reproject when you shouldn't (think of a pixel being occluded or something).

    • @miguelpereira9859
      @miguelpereira9859 2 years ago +3

      @@bluescorpian Matrix based calculations aren't that complex

    • @logitech4873
      @logitech4873 2 years ago +3

      It's also similar to the framerate smoothing done in Oculus VR.
      If a game happens to hang, you'll see the same kind of weird artifacts as seen at 6:25.

    • @Girugi
      @Girugi 2 years ago

      This is not correct. Temporal reprojection is to take an element from the current frame and transform it to the space of the last frame (using the inverse of the current projection and the last frame's projection). But this is all completely different. This is rather using something called ray marching, which can be expensive, but there are tricks to speed it up, and the closer the projections are, the larger the ray marching steps you can take and the cheaper it gets. Honestly, the implementation here looks a bit rough.
      If you want to compare this technique to something common in games, compare it to screen-space reflections. It's the same technique, but the rays are generated from the camera near plane in the projection direction. So all the tricks you use to get smoother results and not miss details in SSR can be used here too.
      But SSR is a relatively expensive technique; many games do it at quarter res to save performance. Of course, it will be cheaper here if we can take larger steps. But that is also why we see the big slices that break up, as you can see in the video.

  • @DarkSwordsman
    @DarkSwordsman 2 years ago +56

    This was actually quite eye-opening. It makes sense now why turning my head with my Rift headset still feels smooth despite a game's framerate sitting around 25-40 FPS, but when I move around it feels juddery and laggy with visual artifacts (because of Asynchronous Space Warp).

  • @RandomBruh
    @RandomBruh 2 years ago +40

    Incredible! I just tried it with a render target of 60 on a 280Hz (yeah, don't ask) monitor, and aside from a little bit of weird ghosting on objects close to the camera while moving the character, it's amazing.
    I would have little to no reason not to turn this tech on in literally every game, which, as others have pointed out, could be a problem when you think about the progress of computing power and game optimisation in general.

    • @RandomBruh
      @RandomBruh 2 years ago +2

      @@starlight_garden I did say don't ask.
      (It was available for around the same price as a 240hz one so why not.)

    • @keppycs
      @keppycs 1 year ago

      @@RandomBruh AW2723DF?

  • @l3monguy
    @l3monguy 24 days ago +3

    I can't believe how spot on you were about this after watching Reflex 2. Just wow.

  • @RazielXT
    @RazielXT 2 years ago +38

    It's like how RTS games draw the mouse cursor outside the game render loop to keep it responsive; now we want it for the whole screen :)

    • @Trikipum
      @Trikipum 2 years ago +6

      they have been using hardware cursors in this kind of game for eons... That is why they feel responsive, because the mouse input is direct. But this works with a cursor and nothing else; the game will still feel unresponsive if it is running at 10fps, especially if you move the camera around.

    • @RazielXT
      @RazielXT 2 years ago +2

      @@Trikipum Well of course with 10fps, but without a hw cursor even 30fps would be horrible.

  • @JSTM2000
    @JSTM2000 25 days ago +6

    I instantly thought of this video when I saw the NVIDIA Reflex 2 feature, and it looks like I wasn't the only one.
    It looks like it'll use AI as part of the equation to fix the warped view. I guess we'll have to see it in action to judge whether it's actually good, but my guess is that it's going to have its own quirks, just like every other AI technology.

  • @Xarius
    @Xarius 2 years ago +215

    The rhythm game "osu!" has this feature! Audio, graphics and input are all on separate threads, meaning drawing can run at 240Hz, while input is polled at 1000Hz

    • @AveryChow
      @AveryChow 2 years ago +22

      this is only true for the ‘lazer’ version I believe. I haven’t really noticed much of a difference as the game runs fast on my computer anyways (it’s not a very demanding game)

    • @Derpynewb
      @Derpynewb 2 years ago +7

      I don't think that actually does anything, because the screen displays an outdated position of your cursor, meaning you cannot adjust its movement path as fast as if it were actually native. It doesn't matter if the input is at 240Hz, because you can't see where anything accurately is, and hence you cannot react accordingly, IMO.

    • @Xarius
      @Xarius 2 years ago +10

      @@Derpynewb True! However, that means the actual cursor position is much more accurately mapped to real movement. For example, if your game runs too slow, one frame your cursor is outside the hit object, and the next it's out the other side, even though you did aim correctly :p

    • @Aesthesis92
      @Aesthesis92 2 years ago +9

      An older arena game called Reflex Arena also has this: hitreg is processed at 1000Hz.

    • @Aidiakapi
      @Aidiakapi 2 years ago +13

      @@Derpynewb Our own brains have significant processing latency; even the people with the fastest reaction times don't beat 100ms on visual stimuli. (We can respond faster to audio cues.)
      If you played a game like osu! by moving your cursor, observing the new position of the cursor, and then clicking, you'd be much too late.
      Similarly in shooters, you don't aim by moving your cursor, seeing whether it's on top of the enemy, and then clicking; even if your click-to-photon latency were 0ms, it'd be much too slow.
      Point is, it matters a lot.
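
The decoupling this thread describes, drawing at one rate while input is sampled on its own fast thread, can be sketched in a few lines. The input source below is simulated as a function of time; in a real game it would be the OS raw-input queue, and every name here is illustrative.

```python
# A toy sketch of decoupled input polling: one thread samples input at
# ~1000 Hz into a shared "latest sample" slot, while a slower render loop
# reads whatever sample is freshest when it draws.
import threading, time

latest = {"t": 0.0, "x": 0.0}      # most recent input sample
lock = threading.Lock()
stop = threading.Event()
samples = 0

def poll_input():
    """~1000 Hz sampler: record the cursor position the moment it happens."""
    global samples
    t0 = time.perf_counter()
    while not stop.is_set():
        now = time.perf_counter() - t0
        with lock:
            latest["t"], latest["x"] = now, now * 800.0  # fake cursor sweep
        samples += 1
        time.sleep(0.001)

poller = threading.Thread(target=poll_input)
poller.start()

frames = []
for _ in range(6):                  # "render" at ~30 fps
    time.sleep(1 / 30)
    with lock:
        frames.append(latest["x"])  # draw with the freshest input, not the
                                    # input from when the frame began
stop.set()
poller.join()

assert frames == sorted(frames)     # cursor positions advance monotonically
print(f"polled {samples} samples while rendering {len(frames)} frames")
```

The point of the pattern is that the timestamped samples are accurate even when the drawn cursor lags, which is exactly the distinction argued over in the replies above.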

  • @gamingmarcus
    @gamingmarcus 2 years ago +4

    It's amazing to see how your very unique perspective on tech topics gets appreciated more and more in the scene.

  • @YannYoel
    @YannYoel 2 years ago +13

    Congratulations on making it into and inspiring the newest LTT video with this one :)

  • @jeffmccloud905
    @jeffmccloud905 2 years ago +45

    The Unity game engine already has this feature ("frame independent input"), but only if you use the "New Input System" (Unity devs will know what that means). Unity introduced a new way to read user input in the last ~2 years that implements this feature. VR games made in Unity (which is most of them) use the new system.

  • @yoced
    @yoced 25 days ago +10

    Kliksphilips Huang CEO of graphics

  • @beastbum
    @beastbum 2 years ago +9

    Kliks empire expands to Canada

  • @cyjan3k823
    @cyjan3k823 2 years ago +5

    The beginning of this video was very confusing, but I'm glad I was there to see this demo. Really impressive stuff.

  • @Girugi
    @Girugi 2 years ago +131

    Just remember that everything shown here will only work reasonably well in an FPS where the camera just rotates around a fixed point when you move the mouse. Any third-person camera would need the last technique shown here, a lot, and would break much more easily.

    • @elin4364
      @elin4364 2 years ago +2

      YUP, still cool though

    • @truffles
      @truffles 2 years ago +12

      I hate games where your view orbits slightly off-center from your character and doesn't stay centered, and stuff that is very close to your view moves in a very strange way while you look around.

    • @Dionyzos
      @Dionyzos 2 years ago +3

      Developers can tweak the peripheral resolution and size of the image to a point where it's less noticeable, or even give you the option to tweak it yourself. This would of course come with a bigger performance impact, but as there is a net benefit with Frame Generation, it's an overall improvement. Let's say we go to the extreme without reduced peripheral resolution and render a 4K image but only display it on a 1440p display. This should be enough to compensate even for quick camera movements while still giving an overall framerate increase with Frame Generation, as 4K does not halve the framerate compared to 1440p while FG doubles it. Add foveated rendering to this method, render the peripheral pixels at DLSS Ultra Performance or something, and the results will be even better, as noticeability decreases with increased camera speed.

    • @Girugi
      @Girugi 2 years ago +2

      @@Dionyzos you do know that scaling resolution doesn't grow the pixel count linearly, right? 1440p is about 3.6 million pixels, 4K is about 8.3 million. So you will pay over double the cost for rasterization and all post effects. The gains you speak of will not be as big as you think. There is a reason why no game with SSR renders outside the frame to improve edge reflections: it is just not that cheap.
      And no matter your buffer sizes, the TPS camera issue will not be solved. Depth peeling might be stronger for that, like a low-resolution, wider-FOV depth peel, but I can see a lot of issues with that too. For a standard FPS that rotates around a fixed spot, though, this technique has interesting potential. One more thing to remember: with DLSS 3, no new depth buffer is generated as far as I know, so the depth reprojection mentioned here would not work without changing DLSS 3 significantly and making it more expensive. And we can easily see that all transparency will have big issues. Like, imagine a muzzle flash.
      So what would probably be best is to not render the weapon and weapon effects into the regular scene buffer, but render those parts at 100% frame rate and overlay them on top after reprojecting.
      That would solve other issues too, like making feedback from the weapon have consistent timing with your click, not depending on whether you clicked on a generated frame or not.

    • @Girugi
      @Girugi 2 years ago +2

      @@Dionyzos oh and also, claiming 4K would be enough buffer for 1440p is just bogus. Maybe for gamepad players, but a mouse can easily turn 90 or 180 degrees in one frame, especially when we talk about competitive players.
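
The pixel arithmetic in this exchange can be checked directly, along with how little rotational headroom the extra pixels actually buy. Illustrative numbers only: a 90° horizontal FOV is assumed, and the margin calculation treats the extra width as symmetric off-screen borders.

```python
# Checking the thread's cost argument: rendering 4K to warp into a 1440p
# view more than doubles the pixel work, yet buys only ~11° of yaw
# headroom per side at a 90-degree horizontal FOV.
import math

p1440 = 2560 * 1440
p4k = 3840 * 2160
print(p1440, p4k, p4k / p1440)    # 3686400 8294400 2.25

# Yaw headroom: the visible half-width spans tan(45 deg); the rendered
# half-width is 3840/2560 times wider, so the edge of the buffer sits at
# atan(1.5 * tan(45 deg)) from the view axis.
half_fov = math.radians(90) / 2
margin = math.degrees(math.atan((3840 / 2560) * math.tan(half_fov))) - 45
print(round(margin, 1))           # ~11.3 degrees of extra yaw per side
```

So the 2.25x raster cost buys a margin far smaller than the 90-180 degrees of per-frame mouse rotation mentioned above, which is the point being made.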

  • @gytis321s2
    @gytis321s2 8 months ago +9

    ONE YEAR HAS PASSED AND NO ONE EVEN MENTIONED THIS EVER AGAIN. Devs DON'T GIVE A SHIT

    • @GiveMoneyToRandoms
      @GiveMoneyToRandoms 8 months ago +2

      This could be the fix for everything; everyone would play games at 30 fps and full settings. But that's the thing: if they develop this tech, how can they sell $2,000 GPUs? No one would buy them anymore. You could argue for fps, but nearly 90% of esports shooters are well optimized, so you also wouldn't need a high-end GPU.

    • @skyscall
      @skyscall 12 hours ago

      It's now a feature on NVIDIA Reflex 2.

  • @gaminggregor4297
    @gaminggregor4297 16 days ago +1

    You were mentioned in the latest WAN Show by Linustechtips from 15.01.25. Seems like your work is more relevant than ever! 🙂

  • @Luukebaas
    @Luukebaas 2 months ago +2

    Man, why wasn't this recommended to me 2 years ago? I've watched all your vids for basically a decade and I missed this one, while it's such a good vid.

  • @bernegerrits8394
    @bernegerrits8394 2 years ago +66

    Stormworks: Build & Rescue implements something like this. When a big/laggy object or vehicle is loaded, the physics engine slows down, but the camera movement stays locked at 60fps. It is quite obvious when the physics are running at 15 fps, but it helps reduce camera stutter and makes the game easier to control.

    • @geemcspankinson
      @geemcspankinson 2 years ago

      It's really good

    • @512TheWolf512
      @512TheWolf512 2 years ago +2

      The Source engine has had this all this time. Physics in multiplayer run way slower than in single player, yet are independent of the scene rendering frame rate.

    • @hotmailcompany52
      @hotmailcompany52 2 years ago

      KSP also did this, but nowhere near the same level as Stormworks.

    • @Trikipum
      @Trikipum 2 years ago +4

      this has nothing to do with what the video talks about. What you are describing is called time dilation and has been done for decades in certain games: the engine simply goes into slow motion so it has time to process everything and keep a decent frame rate.

    • @shepardpolska
      @shepardpolska 2 years ago +3

      It's kind of not the same thing. By that logic you could say Minecraft does the same thing, but the real reason is that the world you play in runs on an internal server and is disconnected from the render you are seeing.

  • @Sprixitite
    @Sprixitite 2 years ago +3

    This feels like the kind of thing I go down a rabbit hole on when implementing mouse movement in a fresh project lol, good job.

  • @SirSicCrusader
    @SirSicCrusader 2 years ago

    That is fascinating, and that there are already examples of the idea in action is really cool.

  • @LeWhiske
    @LeWhiske 25 days ago +5

    bro is omnipotent and predicted Nvidia's next move XD

  • @splatlingsquid4595
    @splatlingsquid4595 2 years ago +7

    Congrats on being showcased and mentioned on LTT! Really love the new content and approach to tech you've been doing.

    • @treeinafield5022
      @treeinafield5022 2 years ago +1

      Which LTT video was it? I'd like to hear what they said.

  • @denisruskin348
    @denisruskin348 2 years ago +4

    Seeing DLSS 2.x and FSR 2.x simultaneously in a game always cheers me up. Being able to enjoy "4K" and a big boost to fps (especially for high refresh rate panels) is just amazing. Zero or minimal loss to image quality and the game feels much smoother. It really looks like magic at times.

  • @psycl0n3
    @psycl0n3 2 years ago +2

    Really like this kind of content, these advancements have their uses and make sense. Super interesting topics lately!

  • @Tony14-_-
    @Tony14-_- 2 years ago +10

    Congrats on the LTT mention, Philip! Glad two worlds collided.

  • @Blaxpoon
    @Blaxpoon 2 years ago +2

    I think we will always need these tricks no matter how good the hardware becomes, because artists always get more ambitious and push it to the limits.

  • @The_Viktor_Reznov
    @The_Viktor_Reznov 2 years ago +19

    Holy fuck, LTT straight up shouted out Philip. What a timeline.

  • @microsoftpowerpoint4731
      @microsoftpowerpoint4731 25 days ago +1

    thanks, this video makes me excited for the future of optimization, lots of cool and very creative tricks

    • @artizard
      @artizard 25 days ago

      Great timing with Reflex 2 being announced.

  • @a_blind_sniper
    @a_blind_sniper 2 years ago +6

    I think this is it. This video clearly states my exact frustration with low framerates and offers a clear solution that already works. I hope someone at Nvidia sees this because this is a fantastic idea.

    • @skyscall
      @skyscall 12 hours ago

      Good news; NVIDIA announced this feature with Reflex 2 on the new GPUs.

  • @ThreeCatProductions
    @ThreeCatProductions 2 years ago +1

    Fantastic video, really interesting stuff, perhaps because I studied real-time 3D technologies at university. This is really well researched and cleanly worded. Great work!

  • @stezzzza
    @stezzzza 2 years ago +1

    Love this video! Absolutely love VR and never even considered its uses for flatscreen gaming. Great job! ❤

  • @TheFirstObserver
    @TheFirstObserver 2 years ago +9

    Interesting idea. IIRC, the RTS Supreme Commander did something similar, decoupling input and graphics processing from simulation processing. This was done to let updates process at 0.1s intervals to prevent the simulation from lagging under load, but the high refresh rate still let the game feel smooth for the time and made inputs feel responsive (since clicks and UI still responded as fast as the engine could render).

    • @outlander234
      @outlander234 2 years ago +2

      What you are talking about has been done since the dawn of video games. What the guy in the video is talking about is something different.

    • @TheFirstObserver
      @TheFirstObserver 2 years ago

      @@outlander234 Well, I get that different systems are often split off from the main gameplay loop into their own threads and often update at their own frequencies, but I was under the impression this specific proposal was a decoupling of the engine's simulation from the graphics and UI specifically? So responses to player inputs are tied specifically to the graphics refresh rate?
      I guess I may be confused...?😅

    • @alex15095
      @alex15095 2 years ago +2

      @@TheFirstObserver What you're describing is doing game logic on a slower thread and interpolating between states on the render thread while rendering as fast as possible. The video's proposal is to add another thread that transforms the previously rendered frame to the new camera transform whenever the render thread hasn't produced a frame in time.
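
The first half of that distinction, a fixed-rate simulation with render-side interpolation between the last two states, can be sketched in a few lines. Time is driven manually so the loop is deterministic, and all names and rates are illustrative.

```python
# A minimal fixed-timestep loop: simulation ticks at 20 Hz, the "render"
# side runs at 60 fps and interpolates between the previous and current
# simulation states instead of showing the raw 20 Hz steps.
DT = 1 / 20                        # fixed simulation timestep

def step(state):
    """One physics tick: move right at 100 units/s."""
    return {"x": state["x"] + 100.0 * DT}

def frame_position(prev, curr, alpha):
    """Render-side blend between the last two simulation states."""
    return prev["x"] + (curr["x"] - prev["x"]) * alpha

prev = curr = {"x": 0.0}
accumulator, rendered = 0.0, []
for _ in range(12):                # twelve 60 fps frames
    accumulator += 1 / 60
    while accumulator >= DT:       # catch the simulation up
        prev, curr = curr, step(curr)
        accumulator -= DT
    rendered.append(frame_position(prev, curr, accumulator / DT))

print(rendered)
```

The rendered positions advance smoothly every frame even though the simulation only produced a new state every third frame, which is the decoupling being described; the reprojection idea in the video then replaces this interpolation with warping an already-rendered image.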

    • @TheFirstObserver
      @TheFirstObserver 2 years ago

      @@alex15095 Oh, OK. I think I get it now. Thanks for the clarification. 🙂

  • @mariuspuiu9555
    @mariuspuiu9555 2 years ago +106

    the problem is that regardless of how you try to decouple the mouse input from the "gameplay", input lag will be the same: you are still stuck waiting for your input to manifest in the game. UNLESS you want the game to calculate stuff without it appearing on screen, which introduces a whole lot of other issues: clicks won't register on screen at the correct visual timings. (It will just end up feeling like lag compensation, where you can get kills or get killed after moving behind a wall.)

    • @RiseClient
      @RiseClient 2 years ago

      Just make raycasting update on the same thread as the mouse movement; problem solved

    • @samnicholson9047
      @samnicholson9047 2 years ago +10

      There's no reason that a game can't process mouse clicks faster than the base framerate. There might be a delay in seeing the physical bullet, but the position on screen will be correct. It shouldn't be too difficult to run the game engine faster than the graphics, since we're already doing that with the interpolation anyways.

    • @luddeoland
      @luddeoland 2 years ago +5

      As he said, this should not be used for competitive games. The "click problem" would really only arise when you make an inhuman flick and don't even know whether you should have hit or not.

    • @nbt1254
      @nbt1254 2 years ago +5

      I think he is completely aware of that. He is only talking about the perceived mouse responsiveness because it affects the playing experience much more than the actual in-game input delay.
      Also, I think in most cases you won't be using it for 15-30 fps, the most common use case would probably be to get from 60 to 144 fps and you probably wouldn't notice the effect you described at 60+ fps, it's not like the input delay at 60 fps is insanely bad.

    • @justapleb7096
      @justapleb7096 2 years ago +6

      This is a non-issue, because literally ALL of the games where this problem would actually matter could run at 500 fps anyway.

  • @Caynug
    @Caynug 2 years ago

    Wow, I want this in all games. Truly revolutionary. I've thought about this stuff for years because VR makes it work so well.

  • @Invisibilitylock
    @Invisibilitylock 1 year ago

    Thanks dude, really helped me understand this concept. It wasn't like most videos where I can never keep up with what is being said.

  • @AmiciCherno
    @AmiciCherno 2 years ago

    This was amazing to witness even just as a demonstration. Thank you for posting this video; let's hope the future of budget gaming looks more like this.

  • @elliejohnson2786
    @elliejohnson2786 2 years ago +1

    Oh my god, this is genius! I actually didn't even realise the game was running at 15 FPS when you had 144 Hz mouse movement, and at a glance the edge-extended rendering Comrade Stinger's demo comes with is good enough. The only time I noticed was when you moved the character itself.

  • @AlucardDH
    @AlucardDH 25 days ago +5

    I'm from the future of 2025. This idea is used in Nvidia Reflex 2 with Frame Warp.
    It's not exactly the same implementation (the rendered frame seems to be updated with the warping technique just before display, to have a fresher camera position following the mouse movements), but this is close!
    I think Comrade Stinger inspired (if not more) Nvidia.

  • @Dribbleondo
    @Dribbleondo 2 years ago +4

    I think Stinger's onto something with that 3D projection tech.

  • @TrulySct
    @TrulySct 2 years ago

    I've been thinking about this as a possibility for a while now, but having someone with this much charisma explain it convinces me far more than I ever convinced myself :)

  • @fireaza
    @fireaza 2 years ago +1

    Ooooh, this has cleared up a few things about synthesized frames in VR for me. I always assumed it was like motion interpolation in TVs, where it's generating a "fake" frame, but it's actually just moving the viewpoint on the current frame? That would explain why head movement always feels smooth even when you're getting a bad framerate. It would also explain why, when the FPS gets really low, you can sometimes turn your head and see the edges of the image. This must be one of the tricks Valve came up with when they were developing the Vive and reached the point where none of their playtesters were getting motion sick anymore.

  • @hoverbike
    @hoverbike 25 days ago +1

    ITS REAL NOW PHILIP, YOU SAVED THE 50 SERIES!

  • @sashamakeev7547
    @sashamakeev7547 2 years ago +1

    For the first third of the video I was like "you can't do that...", but after watching the whole thing: these are great ideas and tricks that can make gaming better!

  • @SpidFightFR
    @SpidFightFR 2 years ago +2

    That looks nice! I look forward to seeing technologies like this in games!

  • @krollic
    @krollic 24 days ago +4

    you called it.

  • @elin4364
    @elin4364 2 years ago +8

    After seeing this I almost instantly wanted to implement it in the game I'm working on right now, but after thinking about it there are a few big issues with adding this to an existing project built in an existing game engine like Unity:
    1. Unity doesn't support async rendering, and I don't think most modern game engines do either. This means you can't really push multiple frames per frame like that in a good way.
    2. One possible workaround could be to split the work of one "real" frame into parts, i.e. if you're doing 4 "fake" frames per "real" frame, you'd maybe do the CPU calculations on the first 2 frames and then the actual rendering on the next 2 (you could even do this in parallel, i.e. while you're rendering frame #1 you're also doing the CPU work of frame #2). This would still probably be slower than proper async rendering, though.
    3. Even if you did that, most existing games rely on doing operations "per frame", all of which you'd have to redo for this new system, which is A LOT of work.
    So you'd really need to write your own engine to do this effectively, which you can 100% do, but it's suddenly a huge project, sadly. Sounds like a really fun side project and I'm 100% gonna try it out haha.
    (Also, the demo shown here is made in Unity, but it just renders the whole "real" frame in the last "fake" frame possible, meaning you don't really get anything extra out of it (except maybe lower GPU usage on more powerful GPUs, which is nice I guess). It certainly shows how cool the technology could be if engines properly supported it, though!)

    • @Qrzychu92
      @Qrzychu92 2 years ago +1

      Yeah, this should be supported at the engine level. The only thing you have to separate from the game logic is camera movement, so that the engine can take the "3D screenshot". Imagine the next Unreal adding a checkbox for this; then you just have to implement camera movement in this specific way, boom. Unlocked.
      It would be WILD, especially for consoles
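Engine-level support mostly comes down to scheduling: at every display refresh, present a freshly rendered frame if one finished in time, otherwise warp the previous one. Here is a hypothetical timing model (all names invented, threading details ignored, and rendering assumed to take exactly one render interval per frame):

```python
def frame_schedule(render_fps, display_fps, n_refreshes):
    """For each display refresh, report whether a freshly rendered frame
    is ready ('real') or the last frame must be reprojected ('warp').
    Idealised model: the renderer runs continuously and always takes
    exactly 1/render_fps seconds per frame."""
    schedule = []
    next_done = 1.0 / render_fps          # when the in-flight render finishes
    for i in range(1, n_refreshes + 1):
        t = i / display_fps               # time of this display refresh
        if t >= next_done:
            schedule.append('real')
            next_done += 1.0 / render_fps
        else:
            schedule.append('warp')
    return schedule
```

Under these assumptions, rendering at 15 FPS on a 60 Hz display yields three warped frames for every real one, which matches the ratio in the demo.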

  • @guest0191
    @guest0191 2 years ago +5

    Who would have thought this would be featured on LTT! I'm stoked this technology is getting the attention it deserves

  • @Mixa_Lv
    @Mixa_Lv 16 days ago +1

    Currently, if I'm not mistaken, the AI-generated frames are only based on the most recently rendered true frames, but I don't see why it couldn't also take player inputs and actions into account when generating frames.
    When it comes to looking around, this video already offers solutions, but it's also possible to just generate the new areas until they are properly rendered. There are programs that let you expand images, for example.
    For player actions, such as third-person movement, shooting, reloading, updating the UI and stuff like that, I take my idea from cloth simulations and reflection framerates. Sometimes in games, cloth physics and reflections are rendered at lower framerates, so what if we only rendered the UI and the player model/viewmodel at full framerate, while the world ran at half framerate and was then frame-generated to match the player framerate? This way, the player's actions would always be rendered with the latest information and at the highest image quality, which is the most important aspect to focus on anyway.

    • @2kliksphilip
      @2kliksphilip  16 days ago +1

      What they're doing is ideal. They get the latest frame rendered and adjust it for the latest mouse movement. In short: 10 FPS normally means your mouse movements are delayed by at least 100 ms. Imagine if, after 100 ms of rendering a frame, it can then be quickly adjusted for 100 ms of mouse movements and BOOM. Suddenly it feels 100 ms more responsive every frame :) Obviously you don't want just 10 fps- with all this MFG stuff going on it'll be more like, here have 40 FPS of high res fully pathtraced content, made to 160 fps with frame generation, with each frame adjusted for latest mouse movements
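The numbers Philip quotes can be put into a back-of-the-envelope latency model. This is only a sketch (the function and its simplifications are mine, not Nvidia's): without a late warp, camera motion lags by roughly one render interval; with the finished frame re-aimed just before scan-out, it lags by roughly one display refresh instead.

```python
def camera_motion_latency_ms(render_fps, display_fps, late_warp):
    """Rough perceived mouse-to-view latency. Without late warp the view
    trails the mouse by about one full render interval; with the last
    frame re-aimed just before scan-out, only by about one display
    refresh. Ignores input polling, compositing and scan-out time."""
    if late_warp:
        return 1000.0 / display_fps
    return 1000.0 / render_fps
```

With these assumptions, 10 FPS rendering feels like 100 ms of camera lag, but late-warping on a 144 Hz display cuts that to about 7 ms.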

  • @sirreoser5668
    @sirreoser5668 2 years ago +15

    CSGO major starting and a 2kliks video. Oh boy, we're spoiled

  • @mattdavisgames
    @mattdavisgames 2 years ago

    Great video, I had been thinking about reprojection outside VR and it's great to see that demo!

  • @Matt-nv2qg
    @Matt-nv2qg 2 years ago

    Can't believe you convinced me, but yeah, this is a great idea. Nicely done; the demo really did it.

  • @JustLukeMinecraft
    @JustLukeMinecraft 2 years ago

    This is a fantastic idea! Game designers please take notes

  • @exmerion
    @exmerion 25 days ago +4

    Phil, it looks like they might be using a subtle version of this for reflex 2.

  • @insu_na
    @insu_na 2 years ago +5

    Your video was mentioned on LTT (and properly credited). Cool stuff

  • @JacobLukasiewicz
    @JacobLukasiewicz 2 years ago

    This video should blow up and go viral so we can get some response from real AAA game developers. Great vid!

  • @58geniusplayer
    @58geniusplayer 2 years ago +2

    Wow the demo is really great

  • @MrAlaxUA
    @MrAlaxUA 2 years ago +3

    As a game developer, one important issue comes to mind: what happens to the game elements that are tied directly to the camera rotation, like your gun model? VR games can freely use warp because there is never anything of value attached to your face, and your arms' fluidity is not as important as the camera's. The problem with warp is that it's fairly simple, in the sense that it just takes the last frame as a plane with a texture and transforms it. That means your on-screen gun, which is usually tied directly to your camera movement, will feel rubbery and laggy. And I don't see a simple way of addressing that, except for trying to reduce the time before the next frame as much as possible by using frame interpolation.

    • @RazielXT
      @RazielXT 2 years ago +1

      With velocity vectors you would know not to move those pixels anywhere. Of course, it would have to be a "smart" warp
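In the simplest case, that "smart" warp could be a stencil mask: warp the world pixels, then stamp camera-locked pixels (the viewmodel) back in place. A 1D toy sketch (the function name and mask representation are invented for illustration):

```python
def warp_scanline(row, camera_locked, shift):
    """Warp one scanline of the last frame: world pixels slide with the
    camera; pixels flagged camera-locked (e.g. a first-person gun model)
    are restored at their original positions afterwards."""
    n = len(row)
    if shift >= 0:                            # camera turned right
        warped = row[shift:] + [row[-1]] * min(shift, n)
    else:                                     # camera turned left
        warped = [row[0]] * min(-shift, n) + row[:n + shift]
    warped = warped[:n]
    # stamp the camera-locked pixels back so the viewmodel doesn't lag
    return [row[i] if camera_locked[i] else warped[i] for i in range(n)]
```

A real implementation would use the engine's stencil or velocity buffer as the mask rather than a per-pixel boolean list.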

  • @JoshM7
    @JoshM7 2 years ago +3

    I have wanted something like this forever. For a very long time I've felt like something like this should have existed.
    It could be my mind remembering games as better than they were, but I feel like older games (can't put an exact date on it, but maybe around the PS1/PS2 era or earlier?) were more responsive at low frame rates than most games today. But I could be wrong. Either way, I really want this. It reminds me of playing at 120-240 fps on a 60 Hz screen: you can't see the extra frames, but the game feels drastically quicker/smoother because of the low frametimes.
    But if you can make inputs a separate thing polling at a higher rate, then you don't have to draw a lot more frames to feel this effect.

  • @raychua
    @raychua 2 years ago +2

    Linus Tech Tips redirected me here. This makes even more sense for cloud gaming, where images are rendered and streamed and your inputs clearly lag by the network round trip. There's something called GGPO rollback that dealt with this for fighting games; put together with this, it could improve things even further.

  • @existentialselkath1264
    @existentialselkath1264 2 years ago +9

    Overscan (the concept of rendering content slightly off screen) is already used occasionally as a solution to certain screen-space artifacts from SSR etc.
    It would certainly be a good idea for this kind of thing
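How much overscan is needed depends on how far the camera can turn during one frame. A quick way to size the margin (rough sketch; the function name is invented, and the linear pixels-per-degree mapping is only an approximation, since perspective projection is not linear across the FOV):

```python
import math

def overscan_columns(width_px, h_fov_deg, max_turn_deg_per_s, frame_time_s):
    """Extra pixel columns to render past each screen edge so a warp can
    absorb up to max_turn_deg_per_s of camera rotation during one frame
    without exposing blank edges. Assumes a linear pixels-per-degree
    mapping across the horizontal FOV."""
    px_per_deg = width_px / h_fov_deg
    worst_case_turn_deg = max_turn_deg_per_s * frame_time_s
    return math.ceil(worst_case_turn_deg * px_per_deg)
```

For example, a 1920-pixel-wide, 96-degree view turning at up to 180 degrees per second at 10 FPS would need roughly 360 extra columns per side under these assumptions.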

  • @Mr.Electricc
    @Mr.Electricc 2 years ago +2

    Hey glad people get to see the brother of 3clicksphillip

  • @k7ufo819
    @k7ufo819 2 years ago +5

    I wouldn't complain so much about 30fps console games if this were a thing.

  • @aaronlink127
    @aaronlink127 2 years ago +2

    This tangentially reminds me: Overwatch 2 has an interesting approach for mouse input. It lets you shoot in between frames and actually calculates where your camera *would* have been when you shot, even if it never spent a single frame there. It's very interesting stuff. That tech + this visual interpolation could help too.
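The sub-frame shot timing described above boils down to interpolating the camera pose at the click's timestamp instead of snapping it to a frame boundary. A minimal sketch (names are invented, and real engines interpolate the full orientation, not just yaw):

```python
def yaw_at_click(yaw_prev, yaw_next, t_prev, t_next, t_click):
    """Linearly interpolate where the camera *would* have pointed at the
    exact click timestamp, even though no frame was rendered there.
    t_prev/t_next are the timestamps of the frames around the click."""
    alpha = (t_click - t_prev) / (t_next - t_prev)
    return yaw_prev + alpha * (yaw_next - yaw_prev)
```

The hit raycast is then fired from this interpolated pose, so clicks land where the mouse actually was, regardless of framerate.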

  • @Blap7
    @Blap7 2 years ago +1

    Engines like Godot have two process loops: one for physics at a fixed rate, and one that runs every rendered frame.
    Things like mouse movement and visuals can go in the per-frame render loop, while player movement and other stuff goes in the physics process loop.
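The standard way engines keep a fixed-rate loop alongside a variable-rate render loop is a time accumulator: each rendered frame deposits its elapsed wall time, and the physics loop drains it in fixed steps. A sketch of that pattern (not Godot's actual source; names are mine):

```python
def count_physics_ticks(frame_times_s, physics_dt_s):
    """Fixed-timestep accumulator: physics steps at a fixed rate no
    matter how irregular the rendered frames are. frame_times_s is the
    wall-clock duration of each rendered frame."""
    accumulator = 0.0
    ticks = 0
    for dt in frame_times_s:
        accumulator += dt                  # render frame deposits its time
        while accumulator >= physics_dt_s:
            ticks += 1                     # one fixed-rate physics step
            accumulator -= physics_dt_s
    return ticks
```

Sixty rendered frames at 30 FPS with a 60 Hz physics step yields 120 physics ticks, two per rendered frame.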

  • @soulshinobi
    @soulshinobi 2 years ago +1

    This type of analysis of unique topics is so fascinating and well presented, my favorite thing about this channel.

  • @vtvitibttboybobtottbo
    @vtvitibttboybobtottbo 2 years ago +1

    Great video. I'd never heard of these approaches to smoothing out a 30 FPS experience, but now I'm very interested.

  • @atrombonist
    @atrombonist 2 years ago +1

    LTT didn't bring me here, but you getting a shoutout once more proves your in-depth videos are quality content!

  • @triplebog
    @triplebog 2 years ago

    This is a fascinating idea. Games already generally have multiple internal refreshrates.

  • @Snoozy96
    @Snoozy96 2 years ago +1

    One thing that I really want people to know is that on top of all this, there is AI-generated facial animation, AI-generated voices, and AI-generated text that can adhere to storylines and world rules. Combine all of this and we could get an amazing VR experience like none before. You could genuinely create an entire VR MMO with continuous questing that is random and follows the world's laws, while also being unique and generating animations for the specific body types of characters in the world.
    We are so close to achieving these types of games. I hope in the next decade something close to an alpha version that combines it all together to test these foundations will come out.

  • @changeagent228
    @changeagent228 1 year ago +1

    Very good. I have always enjoyed frame interpolation since first discovering it decades ago. I like the edge-stretching idea too, to free the mouse. Maybe render at a higher res than the desired display res to allow for stabilisation cropping without edge distortion.

    • @zachb1706
      @zachb1706 11 months ago

      It's not interpolation, it's reprojection. It actually renders new frames after the first has already been rendered, lowering input latency

  • @devlin1991
    @devlin1991 2 years ago +2

    When I wrote the renderer for JanusVR (which did have a 2D KBM driven mode) I went with a system that prioritized mouse/head movement latency over keyboard or mouse click latency. Rather than being pipelined like most renders it would try to always render to vsync and would re-render a previous frame's content at the new camera position if the CPU "frame builder" thread was unable to provide a new frame in time. Our scene content was user controlled so badly behaving scripts or poorly built avatars (per triangle unique materials......) was common and this kept motion-to-photon latency more stable. It shouldn't be too difficult to find or modify one of the time warp or spacewarp implementations to work in a purely mouse driven scene. I'd love to try out a prototype of that.

  • @Snoozie
    @Snoozie 2 years ago +4

    The best part is that this is the worst-case scenario, in what is essentially a tech demo.
    Imagine this stuff at 60 fps, or 72 fps.
    While the extra frame updates probably wouldn't give you a competitive advantage, with 144 Hz mouse movement this would, in a sense, completely bypass the reaction-time penalty of games running poorly.
    I hope this catches on

  • @scriblestingray5713
    @scriblestingray5713 2 years ago +1

    Absolutely the best way to do this. Input lag is the worst feeling when playing games, in my opinion. I think if major engines add this, we could see it become standard.

  • @MrCrazyhamster28
    @MrCrazyhamster28 2 years ago +4

    You got a shout out on Linus Tech Tips!

  • @paulwolff2121
    @paulwolff2121 2 years ago

    I was super excited when Unity released the feature to decouple input from rendering. I loved developing mobile apps with Unity, but the biggest drawback was always that they killed your battery suuuper quickly, even if you were only rendering the main menu. This feature let you significantly reduce the frame rate in menus if there was no input from the user, and only raise it again if the user was scrolling or anything was animating. I mean, it was possible before, but you always had to wait for the next frame for your user input to be detected, and thus had lag.
    Didn't think about FPS at the time. Great use of that feature!

    @blinded6502 2 years ago +1

    Actually, now that I recall it, I saw this reprojection thing years ago in some VR demo.
    You could pilot a Mars rover while basically being located in a dark room, with a small screen of what the rover sees floating in front of you. And you could use your controller to steer the camera, updating that view as if it were a flashlight of sorts.
    At least that's how I remember it

  • @jcquints3364
    @jcquints3364 2 years ago

    Holy shit, I'm pretty much unaware of VR tech since I'm not really interested in VR gaming, but holy, this is some next-level stuff lol. Game optimization is such a work of art

  • @Jarmahent
    @Jarmahent 2 years ago

    The bane of my (gaming) existence has been input lag, so I'm all in for this one!

  • @olivercuddeford8920
    @olivercuddeford8920 2 years ago +43

    This is similar to how a phone’s refresh rate can be 60Hz but its touch sample rate can be triple that, which makes 60Hz feel much smoother and more responsive than it would if touch detection was pinned to screen refresh rate

    • @Jona69
      @Jona69 2 years ago +22

      I don't think that's a good comparison. That would be similar to how mouse sampling rates are usually 500 or 1000 Hz.

    • @shanroxalot5354
      @shanroxalot5354 2 years ago

      Johnathan is correct. This is already how our inputs work.

  • @jakesdd5401
    @jakesdd5401 2 years ago +1

    This, this is what we need!

  • @joeykeilholz925
    @joeykeilholz925 22 days ago +1

    We've done it lads

  • @Kureiji-Desu
    @Kureiji-Desu 2 years ago

    In a perfect world, technologies like this would not exist, but this is not a perfect world.

  • @OllieWille
    @OllieWille 2 years ago +1

    I've been thinking a lot about this myself. I'm imagining how it could be used either to save power on portable units, or to boost framerates without making the fans spin like crazy. The thing is, if you were to use these two approaches AND motion vectors, you could even make the moving objects in the scene look much smoother!
    You could even do a high-quality render pass at 20 fps, then actually run the game at 60 fps but only "rendering" motion vectors, making it possible to warp the pixels into the right places.
    My fear is that developers would just throw optimization out the window when given this much headroom to play with.
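The motion-vector idea above amounts to a gather: render a cheap velocity buffer every warp frame, and for each output pixel fetch its colour from where that content was in the last fully shaded frame. A 1D toy sketch (names and conventions invented; real implementations work on 2D colour and depth buffers):

```python
def warp_with_motion_vectors(row, motion, steps):
    """Reproject a stale shaded scanline using a fresh per-pixel motion
    (velocity) buffer: output pixel i gathers its colour from where its
    content was `steps` warp-frames ago. Out-of-range samples fall back
    to the stale pixel (a disocclusion hole)."""
    n = len(row)
    out = []
    for i in range(n):
        src = i - motion[i] * steps        # where this pixel's content came from
        out.append(row[src] if 0 <= src < n else row[i])
    return out
```

So an object at pixel 1 with velocity +1 appears at pixel 2 on the next warp frame without re-shading anything; only the cheap velocity pass runs at full rate.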

  • @dominikrohel2546
    @dominikrohel2546 2 years ago +2

    This goes off the topic of upscaling, but I've recently found out about DXVK (DirectX over Vulkan), which as far as I know helps stabilise and/or increase framerate (mostly) in older games. I reckon people would be thankful to learn that they can finally run GTA 4 on max settings at a smooth framerate, which DXVK helped me achieve, so a video on this would be awesome.

  • @isaiahbriggs2093
    @isaiahbriggs2093 2 years ago +3

    This would be great for old games that have FPS-limited physics

  • @Greybell
    @Greybell 2 years ago +1

    When DLSS 3 was announced, I couldn't help but be reminded of how VR solves low frame rates by interpolating frames, so your viewpoint and controller tracking don't lag and make you feel motion sick. Flat screen would definitely benefit from this too, and I don't see why we can't have it.