The future of upscaling?

  • Published: 12 Sep 2024
  • DLSS 3's framerate doubling feature is all well and good, but that lag is painful. But... what if there was a way to reduce it? Welcome to the wonders of Framerate Independent Mouse Movement.
    Download Comrade Stinger's demo here: drive.google.c...
    It took him one evening to make, and another few hours to implement the position reprojection.

Comments • 838

  • @DeSinc
    @DeSinc 1 year ago +197

    Why hadn't I thought of this? Reprojection was such a good idea, but I never even thought of putting it into normal games. Cool idea; it will undoubtedly be coming in the future as GPUs become more reliant on tricks and software to provide improvements, rather than ever-diminishing chip-speed improvements.

    • @TheoHiggins
      @TheoHiggins 1 year ago +7

      And the cool thing is, rendering as we know it is just the culmination of software tricks upon software tricks to bodge our 3D world into something that can be reasonably calculated 10s-100s of times a second.
      Down the line, looking back at tech like this once it's adopted, iterated upon, and eventually matured, it'll be hard to imagine a time when games didn't make use of it.

    • @jamesboyce7467
      @jamesboyce7467 1 year ago

      Why are wee sooo incrediable stuuoooopedd?

    • @mxusoleum
      @mxusoleum 1 year ago +2

      @@TheoHiggins You can simulate mouse input in a separate thread (Diabotical, GoldSrc) at a fixed frequency. I don't see how the same couldn't work for VR.

  • @MFKitten
    @MFKitten 1 year ago +1266

    YES! THIS IS THE THING I HAVE WANTED! Ever since Oculus demoed reprojection with a dev kit prototype years ago, I have been begging the universe to make developers realize the brilliance of this!

    • @dtrjones
      @dtrjones 1 year ago +34

      Exactly; this technology has been around for years in VR. DLSS 3, though, is hardware accelerated, which is nice, as there was always an initial penalty using Asynchronous Timewarp (Meta) or Asynchronous Reprojection (SteamVR) before the extra frames were noticeable, which would sometimes mean lowering graphical settings. However, you know what, even hardware-accelerated reprojection has been around since before DLSS 3. Jeri Ellsworth, CEO of Tilt Five, has a product of the same name with hardware reprojection, which allows the AR projectors in each pair of glasses to run at an incredible 180 Hz.

    • @Caynug
      @Caynug 1 year ago +2

      So funny seeing you here! I remember listening to your agile intrepid demo songs on sevenstring over 10 years ago hahah.

    • @gazooc
      @gazooc 1 year ago +10

      It's actually not easy to implement; you need to integrate it with the camera controller in the game, with physics, etc.
      In VR it's easier because you already get all player movements from outside the game, via tracking.

    • @sguploads9601
      @sguploads9601 1 year ago +6

      Reprojection is not in any way connected to framerates. You need to know how rendering works to understand it.

    • @abyssalczech6719
      @abyssalczech6719 1 year ago +4

      @@sguploads9601 isn't reprojection just some fancy algebra?

  • @Remyie
    @Remyie 1 year ago +314

    That's so clever. After testing out the demo, I can easily say that I cannot see or feel any difference between uncapped 750fps and 60fps capped with both tweaks. Even if I move the mouse super fast, it's not noticeable. If we added a slight motion blur to both, it would literally be the same. Also, the timewarp 3D screenshot is such a brilliant idea to interpolate between frames. Now I'm wondering why these weren't a thing all these years. Finally, I did try out 30fps: input lag is definitely a lot better, but the ghosting is very visible and distracting.

    • @sawyergrimm287
      @sawyergrimm287 1 year ago +28

      It's noticeable, but it's up to each player to decide whether they want ghosting or laggy 30fps

    • @Sergeeeek
      @Sergeeeek 1 year ago +19

      I'm sure this same technique could be improved even further to hide reprojection artifacts. This would be great to have for console games because they run at 30fps a lot of the time.

    • @StanleyKubick1
      @StanleyKubick1 1 year ago +1

      per-pixel motion blur

    • @AriinPHD
      @AriinPHD 1 year ago +9

      do not add motion blur to anything. ever.

    • @rdmz135
      @rdmz135 1 year ago +19

      @@AriinPHD motion blur is great for racing games

  • @Astraben
    @Astraben 1 year ago +1260

    What I fear most is that companies will just target the same hardware requirements to run things at 60fps, for example, and use all these tricks to simply save themselves the bother of optimizing the game

    • @Derpynewb
      @Derpynewb 1 year ago +204

      With how shit's going, making it run at at least 60fps is actually quite good. But I get your point. At the end of the day, what matters is the actual experience. Everything in 3D graphics is a lie. Most shadows are baked and are not real time. If you take your clothes off in game, it's just an empty void.
      If lettuce tasted 100% like pizza, I'd be fine with lettuce.
      The only issue is if they make lettuce taste like wet pizza, and most people don't give a shit except me. THEN it becomes an issue.

    • @zarwil
      @zarwil 1 year ago +54

      You could say the same about every technology or hardware upgrade ever

    • @Astraben
      @Astraben 1 year ago +36

      @@zarwil And I do

    • @Derpynewb
      @Derpynewb 1 year ago +79

      @@zarwil On that note, actually, it's proof of Astra's argument. Ever play an old game on modern hardware and see how high the FPS flies (if it's not CPU bound)?
      Old games run 10 times faster even though they don't look 10 times worse.

    • @TheMrLeoniasty
      @TheMrLeoniasty 1 year ago +2

      It's what they are doing now with DLSS and other stuff.

  • @Isaax
    @Isaax 1 year ago +150

    This is one of your most important videos in recent times. Thanks a lot for bringing this to attention! I remember how I HATED the idea of VR because of "the lag that would be there if you just moved your head", which was before I knew that it was already a solved issue. I was so (positively) shocked when I got my first Oculus in early 2019 to see how responsive it felt after all.

    • @fireaza
      @fireaza 1 year ago +2

      Yeah, this was one of the things they were laser-focused on when they were developing the Rift and Vive. They obviously knew from the VR experiments in the 90s that it was going to be ESSENTIAL to have zero lag when turning your head, otherwise motion sickness was going to be as rampant an issue as, well, people who have never tried VR think it is.

    • @Junya01
      @Junya01 1 year ago +1

      @@fireaza motion sickness still is an issue for many VR users. It’s why the Zuck is wrong and it’ll never be fully adopted by the general public

    • @Derpynewb
      @Derpynewb 1 year ago +3

      @@Junya01 Nah, it's gonna be fully adopted. There are people addicted to this shit like people were addicted to social media. That's exactly why Meta, a social media company, is investing so heavily in this. Also, the tech has already grounded its use in industrial settings. It's already adopted by the military and some police forces because of how versatile it is. It's just in its infancy. With how the costs of everything are rising, it's easier to have pretend virtual things than real ones.
      I like VR but at the same time I know for sure it's going to be hell. Imagine Ready Player One but without the good ending. I started realising all the little details after actually using VR. It's fucking annoying me how I didn't notice so much shit in that movie. Smaller IRL houses since everyone lives online.

    • @Junya01
      @Junya01 1 year ago +1

      @@Derpynewb Motion sickness, dude. 3/4 of VR users already experience it in some form or another; u think that shit is gonna go away just cuz the tech is getting better? People aren't gonna pay extra to do shit in VR when they can do it cheaper, faster, and without feeling nauseous by just doing it IRL or through a screen.

    • @Derpynewb
      @Derpynewb 1 year ago +5

      @@Junya01 There are a few things you cannot replicate from VR on a desktop. I've encountered some people that had horrible, and I mean horrible, motion sickness. They bruteforced their way through until they no longer had it. You can lose motion sickness. I initially got a bit motion sick and personally lost it too.
      The other thing is that the motion sickness is caused by your visual senses and your biological accelerometers conflicting. And there's research being done on that too.
      VR does give you motion sickness, but you can get used to it. It's just that right now there's no compelling content to convince the masses to.
      FPS games give some people motion sickness and people got used to that too. However, the sample size isn't big enough to say if most people would be able to get over VR motion sickness, since there are some people who can't get used to desktop motion sickness.
      Everything I've said is anecdotal and the sample sizes aren't big enough to really give a concrete conclusion. My personal opinion is that it will take off.

  • @luukvanoijen7082
    @luukvanoijen7082 1 year ago +314

    Hey, the name for that "3D screenshot" thingy for frame smoothing is "temporal reprojection", I think. I used this technique in a raytracing shader for Minecraft once; it's actually really cool and not demanding at all either, just a little bit of matrix math.
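For readers curious what that "little bit of matrix math" looks like, here is a minimal, self-contained sketch: a point seen by the previous camera is re-projected through the new camera's transform to find where it lands on the new frame. The rotation matrix and toy pinhole projection are illustrative assumptions, not code from any real engine.

```python
# Minimal sketch of temporal reprojection's core arithmetic: transform a
# known world-space point with the new camera and re-project it. A real
# shader does this per pixel, reconstructing world position from depth.
import math

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def rotation_y(angle):
    """4x4 rotation about the Y axis (a stand-in for a camera yaw)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0, s, 0],
            [0, 1, 0, 0],
            [-s, 0, c, 0],
            [0, 0, 0, 1]]

def reproject(world_point, view):
    """Apply a view matrix (projection omitted for brevity) and do the
    perspective divide onto the z = -1 image plane."""
    x, y, z, w = mat_vec(view, world_point + [1.0])
    return (x / -z, y / -z)

# A point straight ahead of the camera; after the camera yaws slightly,
# reprojection tells us where that same point lands on the new frame.
p = [0.0, 0.0, -10.0]
old_xy = reproject(p, rotation_y(0.0))
new_xy = reproject(p, rotation_y(0.02))  # ~1.15 degree turn between frames
print(old_xy, new_xy)  # the point shifts horizontally on screen
```

A real implementation also needs the discarding step mentioned in the replies below: pixels that become disoccluded have no valid history to reproject from.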

    • @TheMrKeksLp
      @TheMrKeksLp 1 year ago +10

      Correct!

    • @luukvanoijen7082
      @luukvanoijen7082 1 year ago +36

      @@bluescorpian Lol, it's actually not that much math to do this, but there's of course more to it than that, like discarding pixels so you don't accidentally try to reproject when you shouldn't (think of a pixel being occluded or something)

    • @miguelpereira9859
      @miguelpereira9859 1 year ago +3

      @@bluescorpian Matrix-based calculations aren't that complex

    • @logitech4873
      @logitech4873 1 year ago +3

      It's also similar to the framerate smoothing done in Oculus VR.
      If a game happens to hang, you'll see the same kind of weird artifacts as seen at 6:25

    • @Girugi
      @Girugi 1 year ago

      This is not correct. Temporal reprojection is to take an element from the current frame and transform it into the space of the last frame (using the inverse current projection and the last frame's projection). But this is all completely different. This is rather using something called ray marching, which can be expensive, but there are tricks to speed it up, and the closer the projections are, the larger the ray-marching steps you can take, and the cheaper it will be. Honestly, the implementation here looks a bit rough.
      If you want to compare this technique to something common in games, compare it to screen-space reflections. It's the same technique, but the rays are generated from the camera near plane and in the projection direction. So all the tricks you use to get smoother results and not miss details in SSR can be used here too.
      But SSR is a relatively expensive technique. Many games do it at quarter res to save performance. Of course, it will be cheaper here if we can take larger steps. But that is also why we see the big slices that break up, as you can see in the video.

  • @Bambiindistress
    @Bambiindistress 1 year ago +866

    The combination of all these technologies is the perfect mix for gaming

    • @Systox25
      @Systox25 1 year ago +6

      Casual gaming

    • @Dionyzos
      @Dionyzos 1 year ago +19

      @@Systox25 I think some of this can also benefit competitive games. If you have a 240Hz monitor and run a game at 144Hz, input lag could be reduced by polling your movements at 240Hz.

    • @brett20000000009
      @brett20000000009 1 year ago +14

      @@Dionyzos The whole "can't benefit comp games" thing is dumb; this is going to enable 1000fps@1000Hz (yes, they are actually working on 1080p@1000Hz monitors), which will give superior mousefeel that's perfectly consistent. It will be better for muscle memory with little to no sacrifice to base framerate.

    • @ShadowMKII
      @ShadowMKII 1 year ago

      I prefer it for baking.

    • @AlphaGarg
      @AlphaGarg 1 year ago

      @@Dionyzos Yep, I find CSGO utterly unplayable since it drops below 40 fps on my 60Hz PC every now and again. But it's not because the image is any more or less stuttery (my eyes can't perceive that difference), but my muscle memory can. Tracking in that game is near impossible for me without weird flicks interspersed through otherwise smooth movement. Meanwhile in Quake Champions, a game I can run at a solid 60fps, I have no issues with tracking. It's all down to how responsive the input is, and if the input matches what I see, boom, framedrops and hitches become a thing of the past.
      I yearn for the day something like DLSS or FSR 2.0 is implemented on a hardware level. And even more so for the day something like this spatial warping is implemented. Especially as a developer. Not having to worry about optimisation as much would be great, so I can focus on things like gameplay, sound, UI, etc. Y'know, the actually fun stuff.

  • @earthling_parth
    @earthling_parth 1 year ago +23

    You have possibly inspired a real use-case for this, thanks to Comrade Stinger and LTT pushing your vision, Philip

  • @DarkSwordsman
    @DarkSwordsman 1 year ago +56

    This was actually quite eye-opening. It makes sense now why turning my head with my Rift headset still feels smooth despite a game's framerate sitting around 25-40 FPS, but when I move around it feels juddery and laggy with visual artifacts (because of Asynchronous Space Warp).

  • @Dionyzos
    @Dionyzos 1 year ago +24

    I remember being blown away by the concept of Foveated Rendering back when I saw the Oculus presentation. This is just as mindblowing to me right now. Just imagine the visual fidelity we can achieve and make playable with this.

  • @RazielXT
    @RazielXT 1 year ago +38

    It's like how RTS games draw the mouse cursor outside of the game render loop to keep it responsive; just now we want it for the whole screen :)

    • @Trikipum
      @Trikipum 1 year ago +5

      They have been using hardware cursors in these kinds of games for eons... That is why they feel responsive: the mouse input is direct. But this works for the cursor and nothing else; the game will still feel unresponsive if it is running at 10fps, especially if you move the camera around.

    • @RazielXT
      @RazielXT 1 year ago +2

      @@Trikipum Well, of course with 10fps, but without a HW cursor even 30fps would be horrible

  • @RandomBruh
    @RandomBruh 1 year ago +39

    Incredible! I just tried it with a render target of 60 on a 280Hz (yeah, don't ask) monitor, and aside from a little bit of weird ghosting on objects close to the camera while moving the character, it's amazing.
    I would have little-to-no reason not to turn this tech on in literally every game, which, as others have pointed out, could be a problem when you think about the progress of computing power and game optimisation in general.

    • @RandomBruh
      @RandomBruh 1 year ago +2

      @@starlight_garden I did say don't ask.
      (It was available for around the same price as a 240hz one so why not.)

    • @keppycs
      @keppycs 8 months ago

      @@RandomBruh AW2723DF?

  • @Armi1P
    @Armi1P 1 year ago +21

    It's amazing how combining fairly simple tricks can give such an impressive result!

  • @YannYoel
    @YannYoel 1 year ago +13

    Congratulations on making it into and inspiring the newest LTT video with this one :)

  • @ganja_hotdog
    @ganja_hotdog 1 year ago +11

    Big papa Linus feature

  • @insu_na
    @insu_na 1 year ago +5

    Your video was mentioned on LTT (and properly credited). Cool stuff

  • @Tony14-_-
    @Tony14-_- 1 year ago +10

    Congrats on the LTT mention, Philip! Glad two worlds collided

  • @existentialselkath1264
    @existentialselkath1264 1 year ago +9

    Overscan (the concept of rendering stuff slightly off screen) is already used occasionally as a solution to certain screen-space artifacts from SSR etc.
    It would certainly be a good idea for this kind of thing
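To put a rough number on the overscan idea above: widening the rendered FOV by the largest camera turn expected between real frames keeps reprojection inside the rendered region, at the cost of a wider buffer. A back-of-envelope sketch, where all the numbers are illustrative assumptions:

```python
# How many pixels wide must an overscanned buffer be to cover a camera
# turn of up to `max_turn_deg` between real frames? For a planar
# projection, pixel width scales with tan(half the horizontal FOV).
import math

def overscan_width(base_fov_deg, max_turn_deg, base_width_px):
    """Buffer width when the FOV is widened by max_turn on each side."""
    half_base = math.radians(base_fov_deg / 2)
    half_over = math.radians((base_fov_deg + 2 * max_turn_deg) / 2)
    return base_width_px * math.tan(half_over) / math.tan(half_base)

# 90 degree FOV at 1920 px, guarding against a 10 degree turn per frame:
print(round(overscan_width(90, 10, 1920)))  # -> 2742
```

Note how quickly the cost grows near wide FOVs (tan is nonlinear), which is one reason overscan alone cannot cover fast mouse flicks.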

  • @The_Viktor_Reznov
    @The_Viktor_Reznov 1 year ago +17

    Holy fuck LTT straight up shouted out Philip, what a timeline

  • @jeffmccloud905
    @jeffmccloud905 1 year ago +44

    The Unity game engine already has this feature ("frame independent input"), but only if you use the "New Input System" (Unity devs will know what that means). Unity introduced a new way to read user input in the last ~2 years that implements this feature. VR games made in Unity (which is most of them) use the new system.

  • @ToonamiAftermath
    @ToonamiAftermath 1 year ago +11

    Great video! I'm hoping that we don't have to wait too long for more advancements like this. I think a game engine could combine all of these technologies into one smooth and configurable experience that would advance gaming graphics and reduce the barrier to entry at the same time. Nice callout to foveated rendering too.

  • @beastbum
    @beastbum 1 year ago +9

    Kliks empire expands to Canada

  • @TheLifeFruit
    @TheLifeFruit 1 year ago +8

    You made it to LTT

  • @Girugi
    @Girugi 1 year ago +130

    Just remember that everything shown here will only work reasonably well in an FPS where the camera just rotates around a fixed point when you move the mouse. Any third-person camera would need the last technique shown here, a lot, and would break much more easily.

    • @elin4364
      @elin4364 1 year ago +2

      YUP, still cool though

    • @truffles
      @truffles 1 year ago +11

      I hate the games where your view orbits slightly off-center from your character and does not stay centered, and stuff that is very close to your view moves in a very strange way while you look around

    • @Dionyzos
      @Dionyzos 1 year ago +3

      Developers can tweak the peripheral resolution and the size of the image to a point where it's less noticeable, or even give you the option to tweak it yourself. This would of course come with a bigger performance impact, but as there is a net benefit with Frame Generation, it's an overall improvement. Let's say we go to the extreme without reduced peripheral resolution and render a 4K image but only display it on a 1440p display. This should be enough to even compensate for quick camera movements while still giving an overall framerate increase with Frame Generation, as 4K does not halve the framerate compared to 1440p while FG doubles it. Add foveated rendering to this method, render the peripheral pixels at DLSS Ultra Performance or something, and the results will be even better, as noticeability decreases with increased camera speed.

    • @Girugi
      @Girugi 1 year ago +2

      @@Dionyzos You do know that scaling resolution does not grow the pixel count linearly, right? 1440p is about 3.7 million pixels; 4K is about 8.3 million. So you will pay over double the cost for rasterization and all post effects. The gains you speak of will not be as big as you think. There is a reason why no game with SSR renders outside the frame to improve edge reflections: it is just not that cheap.
      And no matter your buffer sizes, the third-person camera issue will not be solved. Depth peeling might be stronger for that, like a low-resolution, wider-FOV depth peel, but I can see a lot of issues with that too. Still, for a standard FPS that rotates around a fixed spot, this technique has interesting potential. One more thing to remember: with DLSS 3, no new depth buffer is generated as far as I know, so the depth reprojection mentioned here would not work without changing DLSS 3 significantly and making it more expensive. And we can easily see that all transparency will have big issues. Like, imagine a muzzle flash.
      So what would probably be best is to not render the weapon and weapon effects into the regular scene buffer, but render those parts at 100% frame rate and overlay them on top after reprojecting.
      That would solve other issues too, like making feedback from the weapon have consistent timing with your click, not depending on whether you clicked on a generated frame or not.

    • @Girugi
      @Girugi 1 year ago +2

      @@Dionyzos Oh, and also: claiming 4K would be enough buffer for 1440p is just bogus. Maybe for gamepad players, but a mouse can easily turn 90 or 180 degrees in one frame, especially when we're talking about competitive players.

  • @Xarius
    @Xarius 1 year ago +212

    The rhythm game "osu!" has this feature! Audio, graphics, and input are all on separate threads, meaning drawing can run at 240Hz while input is polled at 1000Hz
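The decoupling this comment describes can be sketched in a few lines: an input thread polls at a fixed high rate and accumulates mouse deltas, while a slower "render loop" consumes everything gathered since its last frame, so no movement is quantized to the frame rate. `read_mouse_delta()` is a hypothetical stand-in for a real input API, and the rates are illustrative.

```python
# Sketch of framerate-independent input: a dedicated thread polls the
# mouse at ~1000Hz while the render loop runs far slower.
import threading
import time

accumulated = 0.0
lock = threading.Lock()
running = True

def read_mouse_delta():
    """Hypothetical stand-in: pretend the mouse moved 1 unit per poll."""
    return 1.0

def input_thread(hz=1000):
    """Poll input at `hz`, independent of the render loop."""
    global accumulated
    period = 1.0 / hz
    while running:
        delta = read_mouse_delta()
        with lock:
            accumulated += delta
        time.sleep(period)

t = threading.Thread(target=input_thread, daemon=True)
t.start()

# The "render loop": each frame consumes every delta gathered since the
# last frame instead of a single once-per-frame sample.
for frame in range(3):
    time.sleep(0.05)  # pretend a frame takes 50 ms (20 fps)
    with lock:
        delta, accumulated = accumulated, 0.0
    print(f"frame {frame}: applied delta {delta}")

running = False
t.join(timeout=1.0)
```

In a real engine the consumed delta would drive the camera (and, as a later comment suggests, possibly hit registration) rather than a print statement.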

    • @AveryChow
      @AveryChow 1 year ago +22

      This is only true for the 'lazer' version, I believe. I haven't really noticed much of a difference, as the game runs fast on my computer anyway (it's not a very demanding game)

    • @Derpynewb
      @Derpynewb 1 year ago +7

      I don't think that actually does anything, because the screen displays the incorrect position of your cursor, meaning you cannot adjust its movement path as fast as if it were actually native. It doesn't matter if the input is at 240Hz, because you can't see where anything accurately is, and hence you cannot react accordingly, IMO.

    • @Xarius
      @Xarius 1 year ago +10

      @@Derpynewb True! However, that means the actual cursor position is much more accurately mapped to real movement. For example, if your game runs too slow, one frame your cursor is outside the hit object, and the next it's out the other side, even though you did aim correctly :p

    • @Aesthesis92
      @Aesthesis92 1 year ago +8

      An older arena game called Reflex Arena also has this. Hitreg is processed at 1000Hz.

    • @Aidiakapi
      @Aidiakapi 1 year ago +13

      @@Derpynewb Our own brains have significant processing latency. Even the people with the fastest reaction times do not beat 100ms on visual stimuli. (We can respond faster to audio cues.)
      If you played a game like osu! by moving your cursor, observing the new position of the cursor, and then clicking, you'd be much too late.
      Similarly in shooters: you don't aim by moving your cursor, then seeing if it's on top of the enemy, and then clicking. Even if your click-to-photon latency were 0ms, it'd be much too slow.
      Point is, it matters a lot.

  • @guest0191
    @guest0191 1 year ago +5

    Who would have thought this would be featured on LTT? I'm stoked this technology is getting the attention it deserves

  • @sadago2690
    @sadago2690 1 year ago +4

    Someone got the Linus bump

  • @splatlingsquid4595
    @splatlingsquid4595 1 year ago +7

    Congrats on being showcased and mentioned on LTT! Really love the new content and approach to tech you've been doing.

    • @treeinafield5022
      @treeinafield5022 1 year ago +1

      Which LTT video was it? I'd like to hear what they said.

  • @cyjan3k823
    @cyjan3k823 1 year ago +5

    The beginning of this video was very confusing, but I am glad I was there to see this demo; really impressive stuff

  • @gamingmarcus
    @gamingmarcus 1 year ago +4

    It's amazing to see how your very unique perspective on tech topics gets appreciated more and more in the scene.

  • @user-ub1on7ui7t
    @user-ub1on7ui7t 1 year ago +3

    Congrats on being featured in the LTT video!

  • @bernegerrits8394
    @bernegerrits8394 1 year ago +64

    Stormworks: Build & Rescue implements something like this. When a big/laggy object or vehicle is loaded, the physics engine slows down, but the camera movement stays locked at 60fps. It is quite obvious when the physics are running at 15 fps, but it helps reduce camera stutters and makes the game easier to control

    • @geemcspankinson
      @geemcspankinson 1 year ago

      It's really good

    • @512TheWolf512
      @512TheWolf512 1 year ago +2

      The Source engine has had this all this time. Physics in multiplayer runs way slower than in singleplayer, yet it is independent of the scene rendering frame rate

    • @hotmailcompany52
      @hotmailcompany52 1 year ago

      KSP also did this, but nowhere near the same level as Stormworks.

    • @Trikipum
      @Trikipum 1 year ago +4

      This has nothing to do with what the video talks about. What you are talking about is called time dilation and has been done for decades in certain games. The engine simply goes into slow motion so it has time to process everything and keep a decent frame rate.

    • @shepardpolska
      @shepardpolska 1 year ago +3

      It's kind of not the same thing. By that logic you could say Minecraft does the same thing, but the real reason is that the world you play in runs on an internal server and is disconnected from the render you are seeing.

  • @TheFirstObserver
    @TheFirstObserver 1 year ago +9

    Interesting idea. IIRC, the RTS Supreme Commander did something similar, decoupling input and graphics processing from simulation processing. This was done to allow updates to process at 0.1s intervals to prevent the simulation from lagging under load, but the high refresh rate still let the game feel smooth for the time and inputs feel responsive (since clicks and UI still responded as fast as the engine could render).

    • @outlander234
      @outlander234 1 year ago +2

      What you are talking about has been done since the dawn of video games. What the guy in the video is talking about is something different.

    • @TheFirstObserver
      @TheFirstObserver 1 year ago

      @@outlander234 Well, I get that different systems are often split off from the main gameplay loop into their own threads, and often update at their own frequencies, but I was under the impression this specific proposal was a decoupling of the engine's simulation from the graphics and UI specifically? So responses to player inputs are tied specifically to the graphics refresh rate?
      I guess I may be confused...? 😅

    • @alex15095
      @alex15095 1 year ago +1

      ​@@TheFirstObserver What you're describing is doing game logic in a slower thread, and interpolating between states in the render thread while rendering as fast as possible. The video's proposal is to slap on another thread and transform the previously rendered frame to the new camera transform whenever the render thread hasn't produced a frame in time.
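That last distinction, a present loop that re-warps the last rendered frame whenever the renderer is late, fits in a short sketch. Here `warp()` is a hypothetical stand-in for the reprojection step (in reality a depth-aware image transform), and the tick counts are illustrative.

```python
# Sketch of a present loop running at display rate, falling back to
# warping the stale frame to the newest camera pose when no fresh frame
# has arrived from the (slower) render thread.
import queue

def render_loop_step(frames, frame_id, camera):
    """Slow renderer: occasionally delivers a real frame."""
    frames.put((frame_id, camera))

def warp(frame, camera):
    """Hypothetical reprojection of a frame to a newer camera pose."""
    frame_id, rendered_cam = frame
    return f"frame {frame_id} warped from cam {rendered_cam} to cam {camera}"

frames = queue.Queue()
last_frame = None
presented = []

# Simulate 6 display refreshes; the renderer only finishes every 3rd.
for tick in range(6):
    camera = tick  # stand-in for the freshest mouse-driven camera pose
    if tick % 3 == 0:
        render_loop_step(frames, frame_id=tick, camera=camera)
    try:
        last_frame = frames.get_nowait()  # a fresh frame arrived
    except queue.Empty:
        pass  # renderer still busy: reuse (warp) the previous frame
    presented.append(warp(last_frame, camera))

for line in presented:
    print(line)
```

Every refresh presents something aligned with the latest camera pose, which is exactly why mouse look stays responsive even when real frames are scarce.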

    • @TheFirstObserver
      @TheFirstObserver 1 year ago

      @@alex15095 Oh, OK. I think I get it now. Thanks for the clarification. 🙂

  • @elin4364
    @elin4364 1 year ago +7

    After seeing this I almost instantly wanted to try to implement it in the game I'm working on right now, but after thinking about it there are a few big issues with implementing this in an existing project made in an existing game engine like Unity:
    1. Unity doesn't support async rendering, and I don't think most modern game engines do either. This means that you can't really push multiple frames per frame like that in a good way.
    2. One possible approach around that could be to split the work of "1 real frame" into parts, i.e. if you're doing 4 "fake" frames per "real" frame, you'd maybe do the CPU calculations on the first 2 frames and then the actual rendering on the next 2 (you could even do this in parallel, i.e. while you're rendering frame #1 you're also doing the CPU work of frame #2). This would still probably be slower than if you could actually do async rendering properly, though.
    3. Even if you did that, most existing games rely on doing operations "per frame", all of which you'd have to redo for this new system, which is A LOT of work.
    So you'd really need to write your own engine to do this effectively, which I mean you can 100% do, but it is suddenly a huge project, sadly. Sounds like a really fun side project and I'm 100% gonna try it out haha.
    (Also, the demo you showed off is made in Unity, but they just rendered the whole "real" frame in the last "fake" frame possible, meaning you don't really get anything extra out of it, sadly (except maybe lower GPU usage on more powerful GPUs, which is nice I guess); it certainly shows off how cool the technology could be if engines properly supported it though!)

    • @Qrzychu92
      @Qrzychu92 1 year ago +1

      Yeah, this should be supported at the engine level. The only thing you have to separate from the game logic is camera movement, so that the engine can take the "3D screenshot". Imagine the next Unreal adding a checkbox for this; then you just have to implement camera movement in this specific way, and boom. Unlocked.
      It would be WILD, especially for consoles

  • @leongao5120
    @leongao5120 1 year ago +4

    You got credited on LTT

  • @atrombonist
    @atrombonist 1 year ago +1

    LTT didn't bring me here, but you getting a shoutout once more proves your in-depth videos are quality content!

  • @a_blind_sniper
    @a_blind_sniper 1 year ago +6

    I think this is it. This video clearly states my exact frustration with low framerates and offers a clear solution that already works. I hope someone at Nvidia sees this because this is a fantastic idea.

  • @denisruskin348
    @denisruskin348 1 year ago +4

    Seeing DLSS 2.x and FSR 2.x simultaneously in a game always cheers me up. Being able to enjoy "4K" and a big boost to fps (especially for high refresh rate panels) is just amazing. Zero or minimal loss to image quality and the game feels much smoother. It really looks like magic at times.

  • @Sprixitite
    @Sprixitite 1 year ago +3

    This feels like the kind of thing I go down a rabbit hole of when implementing mouse movement in a fresh project lol, good job

  • @mariuspuiu9555
    @mariuspuiu9555 1 year ago +105

    The problem is that regardless of how you try to decouple the mouse input from the "gameplay", input lag will be the same: you are still stuck waiting for your input to manifest in the game. UNLESS you want the game to calculate stuff without it appearing on screen, which introduces a whole lot of other issues; clicks won't register on screen at the correct visual timings. (It will just end up feeling like lag compensation, where you can get kills or get killed after moving behind a wall.)

    • @AGuy-vq9qp
      @AGuy-vq9qp 1 year ago +5

      It’ll be very small though, so not much of an issue.

    • @TestingInProgress1234
      @TestingInProgress1234 1 year ago

      Just make raycasting update on the same thread as the mouse movement; problem solved

    • @samnicholson9047
      @samnicholson9047 1 year ago +10

      There's no reason a game can't process mouse clicks faster than the base framerate. There might be a delay in seeing the physical bullet, but the position on screen will be correct. It shouldn't be too difficult to run the game engine faster than the graphics, since we're already doing that with the interpolation anyway.

    • @luddeoland
      @luddeoland 1 year ago +5

      As he said, this should not be used for competitive games. The "click problem" would nearly only appear when you make an inhuman flick and don't even know whether you should have hit or not.

  • @psycl0n3
    @psycl0n3 1 year ago +2

    Really like this kind of content; these advancements have their uses and make sense. Super interesting topics lately!

  • @Invisibilitylock
    @Invisibilitylock 8 months ago

    Thanks dude, really helped me understand this concept; it wasn't like most videos where I can never keep up with what is being said.

  • @sirreoser5668
    @sirreoser5668 1 year ago +15

    A CSGO major starting and a 2kliks video. Oh boy, we're spoiled

  • @SirSicCrusader
    @SirSicCrusader 1 year ago

    That is fascinating, and that there are already examples of the idea in action is really cool.

  • @Mr.Electricc
    @Mr.Electricc Год назад +2

    Hey glad people get to see the brother of 3clicksphillip

  • @morphious86.
    @morphious86. Год назад +17

    this works great in VR where there's no HUD or weapon glued to your face, but what if there is?
    say in an FPS, wouldn't this effect cause your weapon to lag a few frames behind where you're actually looking?

    • @marknator
      @marknator Год назад +8

      You can render those elements separately at all times so they aren't affected

    • @btarg1
      @btarg1 Год назад +2

      In a multiplayer scenario, yes, but DLSS and other tech like this isn't designed for online competitive gaming at all

    • @lanik8163
      @lanik8163 Год назад +3

      I mean in VR you have a literal second body strapped to you, so I don't see your point. Not to mention that there are quite a few VR games with huds.
      But to actually address your point, you could literally just render the weapon separately, like most games already do to avoid them clipping into walls.

    • @paulopatine
      @paulopatine Год назад +7

      In FPS games it's "easy": just render the first-person model in the same "thread" as the inputs. I think the problem will be third-person games; how do you do this without the player character looking jarring in the world?

    • @Crossfirev
      @Crossfirev Год назад +1

      Then maybe you could decouple the hud from the gameplay and update and render it at that same rate as the mouse movement.

  • @stezzzza
    @stezzzza Год назад +1

    Love this video! Absolutely love VR and never even considered its uses for flatscreen gaming. Great job! ❤

  • @raychua
    @raychua Год назад +2

    Linus Tech Tips redirected me here. This makes even more sense for cloud gaming, where images are rendered and streamed and your inputs clearly lag by the network round trip. There's something called GGPO rollback that dealt with this for fighting games; put together with this, it could improve things even further.

  • @Dribbleondo
    @Dribbleondo Год назад +4

    I think Stingers' onto something with that 3D Projection tech.

  • @Blaxpoon
    @Blaxpoon Год назад +2

    I think we will always need these tricks no matter how good the hardware becomes because the artists become more ambitious and push it to the limits always

  • @ThreeCatProductions
    @ThreeCatProductions Год назад +1

    Fantastic video, really interesting stuff, perhaps because I studied real-time 3D technologies at University. This is really well researched, and cleanly worded. Great work!

  • @JoshM7
    @JoshM7 Год назад +3

    I have wanted something like this forever. For a very long time I felt like something like this should have existed.
    It could be my mind remembering games as better than they were, but I feel like older games (can't put an exact date, maybe around the PS1/PS2 era or earlier?) were more responsive at low frame rates than most games today. But I could be wrong. Either way, I really want this. It reminds me of playing at 120-240 fps on a 60 Hz screen: you can't see the extra frames, but the game feels drastically quicker and smoother because of the low frametimes.
    But if you can make inputs a separate thing polling at a higher rate, then you don't have to draw a lot more frames to feel this effect.

  • @58geniusplayer
    @58geniusplayer Год назад +2

    Wow the demo is really great

  • @Havel_The_Rock
    @Havel_The_Rock Год назад +7

    Perhaps AI image out-painting could also be used to fix those black bars outside the image, if it can be done fast enough

  • @fireaza
    @fireaza Год назад +1

    Ooooh, this has cleared up a few things about synthesized frames in VR for me. I always assumed it was like motion interpolation in TVs, where it's generating a "fake" frame, but it's actually just moving the viewpoint on the current frame? That would explain why head movement always feels smooth, even when you're getting a bad framerate. This would also explain why when the FPS gets really low, that you can sometimes turn your head and see the edges of the image. This must be one of the tricks Valve came up with when they were developing the Vive and reached the point where none of their playtesters were getting motion sick anymore

  • @SpidFightFR
    @SpidFightFR Год назад +2

    That looks nice ! I look forward to see technologies like this in games !

  • @nogussy
    @nogussy Год назад +2

    What an awesome demo. I'm amazed this hasn't taken over gaming

  • @aaronlink127
    @aaronlink127 Год назад +2

    This kind of tangentially reminds me, Overwatch 2 has an interesting approach for mouse, they allow you to shoot in between frames and it actually calculates where your camera *would* have been when you shot, even if it never spent a single frame there, it's very interesting stuff. That tech + this visual interpolation could help too.
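    The between-frame shot resolution this comment describes can be sketched as simple interpolation over timestamped input samples. A minimal, hedged sketch (not Overwatch's actual implementation; all names are illustrative):

    ```python
    def camera_yaw_at(t, samples):
        """Yaw the camera *would* have had at click time t, linearly
        interpolated from timestamped (time, yaw) samples, even if no
        rendered frame ever showed that exact pose."""
        for (t0, y0), (t1, y1) in zip(samples, samples[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                return y0 + a * (y1 - y0)
        return samples[-1][1]  # clamp past the last sample

    # A click halfway between two frames still resolves to a precise pose:
    print(camera_yaw_at(0.5, [(0.0, 0.0), (1.0, 10.0)]))  # 5.0
    ```

    The point is that hit registration only needs the camera's pose at the click timestamp, not a rendered frame at that timestamp.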

  • @jakesdd5401
    @jakesdd5401 Год назад +1

    This, this is what we need!

  • @dominikrohel2546
    @dominikrohel2546 Год назад +2

    This goes off the topic of upscaling, but I've recently found out about DXVK (DirectX-to-Vulkan), which as far as I know helps stabilise and/or increase framerate, mostly in older games. I reckon people would be thankful to learn they can finally run GTA 4 at max settings with a smooth framerate, which DXVK helped me achieve, so a video on this would be awesome.

  • @MrAlaxUA
    @MrAlaxUA Год назад +3

    As a game developer, one important issue comes to mind: what happens to game elements that are tied directly to the camera rotation, like your gun model? VR games can freely use warp because there is never anything of value attached to your face, and your arms' fluidity is not as important as the camera's. The problem with warp is that it's fairly simple, in the sense that it just takes the last frame as a plane with a texture and transforms it. It means that your gun on the screen, which is usually tied directly to your camera movement, will feel rubbery and laggy. And I don't see a simple way of addressing that, except for trying to reduce the time before the next frame as much as possible by using frame interpolation.

    • @RazielXT
      @RazielXT Год назад +1

      With velocity vectors you would know not to move those pixels anywhere. Ofc it would have to be "smart" warp

    • @TerminalMontageClips.
      @TerminalMontageClips. Год назад

      Hurry my friend , thank you for watching , hit me up i have something for you

  • @devlin1991
    @devlin1991 Год назад +2

    When I wrote the renderer for JanusVR (which did have a 2D KBM-driven mode) I went with a system that prioritized mouse/head movement latency over keyboard or mouse click latency. Rather than being pipelined like most renderers, it would try to always render to vsync and would re-render a previous frame's content at the new camera position if the CPU "frame builder" thread was unable to provide a new frame in time. Our scene content was user-controlled, so badly behaving scripts or poorly built avatars (per-triangle unique materials......) were common, and this kept motion-to-photon latency more stable. It shouldn't be too difficult to find or modify one of the timewarp or spacewarp implementations to work in a purely mouse-driven scene. I'd love to try out a prototype of that.

  • @olivercuddeford8920
    @olivercuddeford8920 Год назад +43

    This is similar to how a phone’s refresh rate can be 60Hz but its touch sample rate can be triple that, which makes 60Hz feel much smoother and more responsive than it would if touch detection was pinned to screen refresh rate

    • @Jona69
      @Jona69 Год назад +22

      I don't think that's a good comparison. That would be similar to how mouse sampling rates are usually 500 or 1000 Hz.

    • @shanroxalot5354
      @shanroxalot5354 Год назад

      Johnathan is correct. This is already how our inputs work.

  • @Snoozie
    @Snoozie Год назад +4

    The best part is that this is a worst-case scenario in what is essentially a tech demo.
    Imagine this stuff at 60 fps, or 72 fps.
    While the frame updates probably wouldn't give you a competitive advantage, with 144 Hz mouse movement this would in a sense completely bypass the reaction-time cost of games running poorly.
    I hope this catches on

  • @AmiciCherno
    @AmiciCherno Год назад

    This was amazing to witness even just as a demonstration, thank you for posting this video, let's hope the future of budget options in gaming looks better like this.

  • @isaiahbriggs2093
    @isaiahbriggs2093 Год назад +3

    this would be great for old games that have fps limited physics

  • @tdreamgmail
    @tdreamgmail Год назад +2

    It's actually makes low framerate gaming much more appealing, like on laptops and old computers with old graphics cards.

  • @blinded6502
    @blinded6502 Год назад +1

    Actually, now that I recall it, I've seen this reprojection thing years ago in some VR demo
    You could basically pilot a Mars rover while being located in a dark room, with a small screen of what the rover sees floating in front of you. And you could use your controller to steer the camera, updating that view as if it were a flashlight of sorts
    At least that's how I remember it

  • @elliejohnson2786
    @elliejohnson2786 Год назад +1

    Oh my god, this is genius! I actually didn't even realise the game was running at 15 FPS when you had 144 mouse movements, and at a glance the extended rendering Comrade Stinger's demo comes with is good enough. The only time I realised was when you moved the character itself.

  • @WayStedYou
    @WayStedYou Год назад +5

    This video just got a shoutout on LinusTechTips channel. 13:50

  • @mattdavisgames
    @mattdavisgames Год назад

    Great video, I had been thinking about reprojection outside VR and it's great to see that demo!

  • @JDSPonYT
    @JDSPonYT Год назад +17

    This already exists with Special K and its Latent Sync + input offset.
    EDIT: The input not being tied to framerate for games that is

    • @brett20000000009
      @brett20000000009 Год назад

      err no specialk does not have async timewarp wtf 17 people upvoted this

    • @JDSPonYT
      @JDSPonYT Год назад

      @@brett20000000009 read the EDIT (that was made seconds after the comment was made)

  • @brett20000000009
    @brett20000000009 Год назад +2

    stinger released a new version with improved player reprojection and it works even better.

  • @Snoozy96
    @Snoozy96 Год назад +1

    One thing that I really want people to know is that on top of all this, there are AI-generated facial animations, AI-generated voices, and AI-generated text that can adhere to storylines and world rules. Combine all of this together and we will get an amazing VR experience like none before. You could genuinely create an entire VR MMO with continuous questing that is random and follows the world's laws, while also being unique and generating animations for specific body types of characters in the world as well.
    We are so close to achieving these types of games. I hope in the next decade something close to an alpha version that combines it all together to test these foundations will come out.

  • @EdJames0
    @EdJames0 Год назад +3

    This is a good solution for mouse movement, but it isn't a general solution, as what the mouse can "do" differs in every game (third person vs first person, driving vs shooter). Still, it's very good and better than nothing! The other issue is player movement and other forms of input: you would have a variable input delay between jumping and shooting, which is maybe why we haven't seen a general solution for something like this yet.

  • @jcquints3364
    @jcquints3364 Год назад

    holy shit, i'm pretty much unaware of vr tech since i'm not really interested in vr gaming. but holy this is some next level stuff lol. game optimization is such a work of art

  • @JacobLukasiewicz
    @JacobLukasiewicz Год назад

    That video should blow up and go viral so we can have some response from true AAA games developers. Great Vid!

  • @Blap7
    @Blap7 Год назад +1

    Engines like Godot have two process loops: one for physics at a fixed rate, and one that runs every frame.
    Things like mouse movement and visuals can go in the per-frame render loop, while player movement and other logic go in the fixed physics loop.

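    The decoupled-loop idea in this comment is the classic fixed-timestep pattern: physics advances in fixed increments from a time accumulator, while rendering (and input sampling) runs every display frame. A minimal sketch of one render frame's bookkeeping, with illustrative names:

    ```python
    def advance(acc, frame_dt, step_dt):
        """One render frame of a decoupled loop: add the elapsed real time
        frame_dt to the accumulator, run as many fixed-rate physics steps
        of length step_dt as fit, and return (accumulator, steps_run,
        alpha), where alpha is the blend factor for interpolating visuals
        between the last two physics states."""
        acc += frame_dt
        steps = 0
        while acc >= step_dt:
            acc -= step_dt
            steps += 1
        return acc, steps, acc / step_dt
    ```

    Per-frame input like mouse look is sampled outside the fixed loop, so the camera stays responsive even on frames where steps_run is 0.
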
  • @TrulySct
    @TrulySct Год назад

    I've been thinking about this as a possibility for a while now, but having someone with this much charisma explain it convinces me far more than I ever convinced myself :)

  • @OllieWille
    @OllieWille Год назад +1

    I've been thinking a lot about this myself. I'm imagining how this could be used either to save power on portable units, or to boost framerates without making the fans spin like crazy. The thing is, if you were to use these two approaches AND motion vectors, you could even make the moving objects in the scene look much smoother!
    You could even do a high quality render pass at 20fps, then actually run the game at 60fps but only "rendering" motion vectors, making it possible to warp the pixels into the right places.
    My fear of this is that developers would just throw optimization out the window when they get this much headroom to play with.

  • @Matthew-.-
    @Matthew-.- Год назад +1

    This stuff would almost certainly require some wizardry at the engine level to allow interaction between the player and the world if they're being rendered separately.

  • @TheTechAdmin
    @TheTechAdmin Год назад +1

    0:17 The reflection used on that purple flat bead curtain is that of the outside street, lol. Not of the inside of the building they're actually in.

  • @MarioRCarbonell
    @MarioRCarbonell Год назад +1

    To address some criticism: the scene and the user interface (the minimap, stats, etc.) should be rendered separately and composited in response to mouse input.
    The video card should render the 3D scene, modify it as shown in the demo, and then put the user interface on top. This would address a lot of the critical comments. The gun or hands in a first-person game could also be treated as part of the user interface so they don't end up in the wrong position.
    A further enhancement of the technique could be to render different scene planes at different fps:
    - Long-distance objects, like mountains, sky, etc., could be rendered at 5 fps.
    - Medium-distance objects, rendered at 15 fps.
    - Near objects, at 60 fps.
    The fps numbers are made up, just an example.
    Those planes would move according to the camera perspective immediately after the user input.
    The final image would be a combination of long distance + medium + near + user interface, each updated at a different frame rate and combined to give a new frame after the user input.
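    The layered-rate idea this comment proposes boils down to a per-frame schedule deciding which layers are due for a re-render (everything else would be reprojected from its last render and composited). A toy sketch using the comment's example numbers, with illustrative names:

    ```python
    def layers_due(frame, display_hz=60,
                   rates=(("far", 5), ("mid", 15), ("near", 60))):
        """Return which scene layers should be re-rendered on this display
        frame, given per-layer target rates that evenly divide display_hz.
        Layers not listed would be reprojected, not re-rendered."""
        due = []
        for name, hz in rates:
            interval = display_hz // hz  # display frames between re-renders
            if frame % interval == 0:
                due.append(name)
        return due

    print(layers_due(0))  # every layer renders on frame 0
    print(layers_due(1))  # only the 60 fps "near" layer
    ```

    The compositor would then stack far + mid + near + UI each frame, warping the stale layers to the current camera.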

  • @giserson2
    @giserson2 Год назад +3

    This would make it possible for native rendering to target 30fps while making it feel and look like 60 and it would really only cause issues if you moved the view around crazy fast, which could be obscured with motion blur.
    In the quest for photorealistic graphics this would be a very powerful tool.

    • @keppycs
      @keppycs 8 месяцев назад

      Whether it's looking around or moving around, objects are still panning across the screen at a fast pace. Because of this, higher framerates are still very much beneficial

  • @greenlemon9155
    @greenlemon9155 Год назад +6

    async between CPU and GPU was not enough, now they are unsyncing the CPU with the CPU

    • @sety5591
      @sety5591 Год назад

      intentional desync, lol. User input in the future while the game lag in the past. Click on the head, explode 300ms later... lag. Like a mouse that has its own CPU while the game lag behind on another CPU.

  • @Matt-nv2qg
    @Matt-nv2qg Год назад

    Can't believe you convinced me, but yeah, this is a great idea. Nicely done; the demo really did it.

  • @tristanwegner
    @tristanwegner 10 месяцев назад

    This works much better in first-person view (it was developed mainly for VR head rotation), since the perspective barely changes and just translating the old image is pretty close to what is expected. But in third-person view, as shown with the car at 1:40, the rotation is around something in the scene, so the movement reveals new information behind the car which cannot come from reprojection.
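    The reason pure rotation reprojects so cheaply can be made concrete: for a rotation-only camera change, old-frame pixels map to new-frame pixels by a single homography H = K R K⁻¹ (K being the camera intrinsics), with no depth buffer needed; translation, by contrast, needs per-pixel depth and reveals occluded geometry. A small self-contained sketch (plain 3×3 math, illustrative names):

    ```python
    import math

    def mat_mul(A, B):
        """3x3 matrix product."""
        return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def rotation_homography(f, cx, cy, R):
        """H = K R K^-1 for intrinsics K = [[f,0,cx],[0,f,cy],[0,0,1]].
        Warping the old frame by H reprojects it for a rotation-only
        camera change; no depth is required."""
        K = [[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]]
        K_inv = [[1/f, 0.0, -cx/f], [0.0, 1/f, -cy/f], [0.0, 0.0, 1.0]]
        return mat_mul(mat_mul(K, R), K_inv)

    def reproject(H, x, y):
        """Apply homography H to pixel (x, y)."""
        p = [H[i][0]*x + H[i][1]*y + H[i][2] for i in range(3)]
        return p[0] / p[2], p[1] / p[2]
    ```

    For a small yaw θ, the image center shifts by roughly f·tan θ pixels, matching the intuition that a mouse turn is mostly a sideways slide of the old frame.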

  • @prateekpanwar646
    @prateekpanwar646 Год назад +2

    It's equivalent to FixedUpdate in Unity, which runs at 50 fps regardless of the game's framerate (yeah, I tried 100, but it just messes up everything related to physics, like gravity and inertia).

  • @sashamakeev7547
    @sashamakeev7547 Год назад +1

    Well, for the first third of the video I was like "you can't do that...", but having watched the whole thing: these are great ideas and tricks that can make gaming better!

  • @danieltammeling1810
    @danieltammeling1810 Год назад +2

    Now we just need a program that can do this on any game. ReShade can already get the depth; all you need is a program running at the display refresh rate that copies the game window with reprojection, uses the depth buffer from ReShade for movement, and extends the game's edge colors to hide the black part.

  • @sety5591
    @sety5591 Год назад +1

    It will feel like lag in online gaming. The mouse can move around freely, movement appears instant, shooting appears instant, particle effects/gibs appear instant, but hitpoints only go down after a delay, or the game will "rubber-band" your character back to a correct position if the graphical warping/projection doesn't match the game physics. E.g.: click on a head and it might show some instant gibs (using a low-cost particle effect that doesn't lag), but the gibbed character only goes down with the correct VFX after the game catches up with your input.

  • @raqtty
    @raqtty Год назад +7

    Linus click tips

  • @paulwolff2121
    @paulwolff2121 Год назад

    I was super excited when Unity released the decoupled-input-from-rendering feature. I loved developing mobile apps with Unity, but the biggest drawback was always that they killed your battery suuuper quick, even if you were only rendering the main menu. This feature allowed you to significantly reduce the frame rate in menus if there was no input from the user, and only raise it again when the user was scrolling or anything was animating. I mean, it was possible before, but you always had to wait for the next frame for your user input to be detected, and thus had lag.
    Didn't think about FPS games at the time. Great use of that feature!

  • @amarynthos519
    @amarynthos519 Год назад

    I love videos like these.

  • @MrCrazyhamster28
    @MrCrazyhamster28 Год назад +4

    You got a shout out on Linus Tech Tips!

  • @DMitsukirules
    @DMitsukirules Год назад +1

    I tried Comrade Stinger's demo. I think this technology is amazing and should be used, and SHOULD also be used in competitive scenarios.
    What this introduces is consistency of mouse movement. Basically, if you don't use this, you get "movement artifacts": your movements don't always remain 1:1. If you do use it, your movements are always 1:1, but you get visual artifacts. Now, as a person who plays fighting games, I'm very familiar with this trade-off. Delay-based netcode is visually perfect but messes up input consistency. Rollback is input-perfect but occasionally messes up visuals. Rollback is better every time; it's not a case-by-case thing. It's always better to react to a visual artifact than to go through your consistent motion and have it just not work. You can adapt to one, and you can't adapt to the other.
    The other thing we are ignoring is the NORMAL case for this trade-off. In other words, what this technology would be doing normally is nothing. If you have a 165 Hz monitor and can maintain 90 fps in the worst case in Apex Legends, you essentially get a straight bonus package with no negatives. Yes, running at 30 fps upscaled to 60 is no good, but really, we all know that person isn't playing competitively; they don't have the tools to, unfortunately.
    I really hope this tech is adopted heavily in the future.

  • @попрпаррлпвча
    @попрпаррлпвча Год назад +24

    I commented this already on one of the previous videos about dlss 3 but I'll repeat myself because I think it's a cool idea and no one's seen it. Could you make a video that only shows generated frames? Basically play back every other frame. On the other hand the fps and the recording framerate may not sync perfectly. Use v-sync? And I think reshade has an extension that takes screenshots. Maybe it can be configured so that half of the frames are saved. It may not work with frame generation, though. I think it had trouble running with dlss or dldsr back when it was introduced

    • @homeyworkey
      @homeyworkey Год назад

      lol i got a video idea, not sure how you would get dlss to work with this case though, probably would need to use some other interpolation algorithm
      replace those 'core' screenshots with new frames based on the generated frames. then replace the old generated frames with newer frames based on the semi-old frames. see how long you can go before its cursed

  • @vtvitibttboybobtottbo
    @vtvitibttboybobtottbo Год назад +1

    Great video. I'd never heard of these approaches to smoothing out a 30 FPS experience, but now I'm very interested.

  • @JustLukeMinecraft
    @JustLukeMinecraft Год назад

    This is a fantastic idea! Game designers please take notes

  • @powermaxx11
    @powermaxx11 Год назад

    I need this in my veins right now