Am I misreading the patch notes or this latest one doesn't talk about X2? Only about X3 improvements "X3 mode has been updated to further reduce artifacts on patterned textures and in dark scenes."
If you don't like the latency, you can try the "Allow Tearing" option under Sync mode. It significantly reduces latency at the cost of potential screen tearing. However, after 60ish hours of using this option, I haven't experienced screen tearing at all. Obviously, it might not work as well on EVERY game and EVERY pc but its a pretty huge improvement, so its worth a try for anyone who has LS.
I used the program for a few months now already and from my experience as long as you aren't specifically looking for the artifacts and those little jittering pixels I have no issues and em very greatfull for this to exist
@@G0A7 maybe try with the G-sync now available turned on? It could be because sometimes they add the jittering on the top but with g sync this then gets rid of the problem
@@risingg3169 i could try but tbh 60fps is more than enough for chrono trigger, more often than not i just forget that i have It installed and just plain without it
In case no one's mentioned it here yet or tried it, you can also use Lossless Scaling for videos and it works surprisingly well. However, you do need to be on fullscreen and there should be no UI elements present such as captions or the scrub bar otherwise you will get FPS fluctuations. Also works alongside RTX VSR.
@@papiertoilette-pb9lf movies are made to be 24fps for reason. give it more than 60fps give its a cheap "soap effect" look. looks like robotic , or game like. but if u like that thats fine.
Never in my whole life do I expect a pixelated Duck software on Steam could double,triple or QUADRUPLE my FPS in games. TFS is legend for upgrading this for free constantly. Lossless Scaling is truly the GOAT when you wanted to get more FPS on games that didn't have the expensive exclusive DLSS or AMD's whatever FSR or FMF, this Duck just say "Let me allow every single game be blessed with tons of Frames and Resolution!" and then the frames actually did appear. It really saved my precious GTX 1060 6GB from early retirement and I believe it will benefit people who had crazy high Hz monitors or uses a high frame rate TV madlads to use this just for the sake of playing anything on 4k or 8k.
Unfortunately lossless scaling is snake oil. The upscaling is literally just worse dlss/fsr. The frame gen reduces ur native frames to put more fake frames. Fake frames are not real frames, they just create the illusion of smoothness. Ur input lag is still based on ur real native frames, which are now lower thanks to lossless scaling consuming gpu power and vram. The 3000 and 4000 series have special hardware in them that is used for dlss and frame gen. This hardware makes the performance cost of those nvidia features practically zero, which lossless scaling doesn't have because it's a software implementation rather than a hardware one. U're better off just saving money, buying a used 3060 if on a budget and using dlss, that will give u much better quality at a much smaller performance cost. The frame gen feature could be useful for games that dont have native frame gen, but unfortunately its gsync support is lackluster. And gsync/freesync aren't even that good. Special K's latent sync blows both of them out of the water. Even then u need at least a 144hz monitor to make it worth using. If u have a decent gpu like a 4070 or above, u shouldnt struggle getting 144 fps on non-demanding games, and the demanding ones usually have native frame gen included anyway. So pretty much the only time this program is worth using is if u have an older gpu with a 144hz monitor where u get around 75-80 fps stable. Lock the framerate to 72 and double it up to 144. And even then u miss out on latent sync and u have to use a shitty gsync implementation. Aint worth it unless ure really destitude.
@@kerkertrandov459 FPS unlockers aren't a option for most older games on a emulator where making the frame rate higher breaks the physics or makes the game run too fast or nerfs jumping etc.
Lossless scaling at 4x is great for 2D games with lots of scrolling. These games are often limited to 60 fps (such as Factorio and Terraria). These games combined with my 240 Hz monitor looks fantastic. All motion blur induced by tracking objects with your eye is almost eliminated.
I play Super Smash flash 2. This game is locked on 30fps. This game needs this app to achieve 60fps. Ain't no other option to fix this but to use Lossless Scaling. 😅
I downloaded more SSD space yesterday. The cool thing is i did not have enough space to download the extra space but the extra space gave me the option to download the extra space on the extra space itself. The problem is the download was so massive that at the end of the day i have the same amount of SSD space as before.
I downloaded more internet speed, but unfortunately the download of the speed itself takes up so much bandwidth that it eats into itself. Still, it's essentially free, so I think I'll keep downloading it just so I can tell people I have a faster connection.
I just bought and tested it. X2 is the best option. No need to use X4. 120/144 FPS is enough. X2 has a very good quality too. I didnt even notice that there are any artifactes. This shit is better than Frame Generation and FSMR LOL,
@@Jeannemarre yea, this app is great for capping to your max refresh rate too. Also for games which are hard capped at 60, like Tekken 8. I play Tekken 8 with smooth 120 fps and you do not notice any more input lag.
The thing is that 120/144 means you have Sync in Default and not in OFF (allow tearing), which means you have more input lag. Best option is 2x performance mode with allow tearing
@@ghostlu5462This Honestly Using the LS1 Upscaler + the X2, both on performance, have minimum impact on the image while feeling smooth. Each update silently boosts X2 Frame rate to the point is the best choice for a lot of gamers.
I tested it today on youtube and a movie. And - ITS WORKS! You can boost 4k 25 fps video to 100fps or 24fps movie to 96 fps even in fullscreen without graphics problems.
Hadn't gotten around to trying it like that but was hoping it might work. I tend to stream at 720p but have a 1440p 165 Hz monitor. I've only tried it on a few games. Didn't work with LA Noire but worked very well on FH4.
@@xShikari no, what should fix it is internal motion pacing Same in games The game could be running at 60fps then see a random item or character moving at 30fps or less Sometimes it can be cool Like into the spiderverse where Miles is moving at 12fps but the movie is 24fps But running a movie at 60fps is awful, the movement is very video game looking It's like those people who were upscaling JJK episodes from 24fps to 48fps and upscaled to 4K Way too much ghosting and flames looked very liquid You could see this with a satellite tv running on a Samsung or LG 4K60 display It upscales everything to run at 60fps Every single movie no matter how old it's got that motion smoothing feature on that basically doubles the frame rate and everything you watch is at 60fps
Turn the Performance toggle to On for better latency! Theres a bit more artifacting, but its less noticeable the more frames you feed it, and its less noticable than the input lag.
I've used it with 2x scaling and 2x FG to play FH4 at 1440p60 with a GTX 670 SC. Both on performance mode. Things got a bit wobbly when moving through my garage quickly but in play, latency was low and the image good. Good enough to be normally competitive in ranked racing. I've played the game at least 120 hours like that.
@@DragonOfTheMortalKombat Yes you can stack in game Fg and Lsfg, Tried it on Cyberpunk with Fsr fg mod resulted in 660fps with 1440p low preset Fsr balanced *And Spiderman Miles Morales's Fsr 3 Fg around the same fps due to Lsfg capped to monitor refresh rate for base Fps. I have 165hz monitor so x4 from Lsfg is 660fps.
Considering X2 is Quality and X4 is performance. I actually use the X2 to save power on my 3090. I limited my AW3423DWF refresh rate to 120 so I can use 10-bit. Then I cap the refresh rate in games to 60/90 and just let the X2 generate the rest.
yeh 2x is good enough to play with good image quality, also I'm using 2x and i have 88 fps on cyberpunk 2077, before 42 fps without using lossless scaling. i'm using gtx 1080 8gb GPU
You probably won't notice the artifacting while actively engaged in a fast paced game. It's when you go out of your way to find them using flick tests that they become more apparent. Something to note is that lowering the Mode to X2 and enabling Performance Mode (and setting Sync Mode to Disabled) significantly helps latency. While X4 is cool, it's unoptimized and without the performance settings will result in a bad time latency wise
Just like "lo-fi" music can "work" but it can also sound low-effort and cheap. If you want to make something good, the "they won't notice" excuse will only get in your way.
@@danielowentech While it is the best way, you can always record at the highest FPS your phone or camera can do and slow it down. It will not be the most accurate in terms of milliseconds but I think most people just want to know which one has the lowest latency. If they can see one is visually faster than the other, it should be enough. Also maybe an image quality comparison would be nice as in try to target a stable FPS (either 60 or 120FPS or both!) and record it. The reason for stable FPS is to guarantee that the recording is good and hopefully get a clean capture. I'm not sure the built in recording feature like from AMD Adrenalin can capture FG, but if it can, you can use that since we are not concerned about the absolute performance. If not, then obviously you can use capture card. Whatever works best for you.
I experimented with this on my 6600XT with a 170hz 1440p monitor with vrr. It works great and I don’t notice any ghosting or artifacts pop up, but I do want to avoid having them whenever possible, same story with the latency since I play with a controller most of the time. Works best when you maintain a consistent 60fps to begin with
Daniel, congrats! You have now become an educator and a youtuber. So that answers your dilemma that you were facing in one of your past videos. You offer a great easy to understand explanation that many people may not understand, the difference in latency, and the explanation of the values being presented. Appreciate it!
@@TheStrengthScholar And what kind of FPS jump did you obtain in it? Also what are the best settings? I'm seeing a lot of people say x2 and allow tearing?
Daniel, it would be great if you could try LSC with a second GPU for frame generation. From the LSC panel itself, you can select which GPU you want to use. This way, there is no frame loss on the GPU that is rendering the game. I haven't seen any channel doing that.
this thing is the best thing ever made in gaming history, it helps me with helldivers, where I have a 4060 ti gpu and a weak cpu for the game amd 5600x that causing me only reached around 40-55 fps max in that game (while wukong can grant me around 70-80 fps in cinematic) , was thought to upgrade my whole pc except the gpu, and then I found your video, and try it, it works just like magic, as you said frame gen doenst increase response time, but highers fps, makes my own natural response time better XD
Should specify that this application is the star of the show for emulation. Emulation, you're looking for a fixed frame rate 100% of the time. That goes great with this application that works best while at a stable FPS as well. VRR doesn't do anything for emulation and can cause slow motion at lower frame rates. Games that lock at 30 FPS aren't great but games that lock at 60FPS like many Mario 3D titles, feel much better with it on.
When games close at 30 frames, it seems as if they are at 60 frames, but this is a shame. The more the frames are cut, the more the control becomes heavier and heavier, and this is annoying.
@@thebigcj517 unless you can do run ahead frames like retroarch and combine that with the frame smoothing, since computing power isn't the limitation here
Have been using this for quite some time now and I am really happy with it. I personally do not really care about hud elements having errors and the artifacts are also barely noticeable, at least for me. I don't really pay too much attention to finer details. I have a 165hz monitor for almost 2 years now but I couldn't really utilize it. I was mostly stuck on 120-144hz because of performance, even with x2 frame gen of Nvidia (I have a 4080). Since x3 mode I can play pretty much every game at 165fps, so I can finally use the 165hz of my monitor. And yea like you said, with games like Mordhau (competitive game) I do not use it because of the latency. But in single player games that aren't crazy fast paced, 60fps baseline is perfectly fine latency wise. Only downside is: you can't really go back to something like 60fps. My god it feels like lag 🤣
EDF6 is another great option for LossLess scaling, since it's limited to 60fps, and the frame generation does help make the game a hell of a lot smoother.
First DLSS 2.0 dropped and I couldn't believe it wasn't magic. Then Nvidia released Frame Gen and it's almost as impressive. I can't wait for what they are cooking up next. They should definitely be criticized for their anti consumer behaviour but you can't deny the genius of their engineers.
It's just like an evolution of injectors back in the day, they just fiddle with settings to make games look better than they actually are. It isn't a replacement for native, and it certainly isn't magic. More like copium for devs inability to properly optimize PC games
Using words like magic and genius bring back bad memories of Apple's language. I dont agree with glorifying these big corporations. Unfortunately Nvidia is more and more Apple like.
@@krishnaprasad1570 No he has a point, a lot of people will say that LS doesnt work and it make the performance worse when on. This is due to them having a weaker gpu that is already at max utilization and max vram usage. So when they enable LS they lose performance, while the LS and the game are fighting for resource like vram causing even more performance issues.
I wish ths focused on quality of frame generation instead of quantity of frames. The x2 is amazing but still has flickering issues (especially with 3rd person games)
That is an odd statement. Increasing the number of frame inserted is by far the easy part. The entire task with frame Gen is to accurately predict the frame while not increasing latency. AMD could easily make an upscaler that look BETTER than dlss.. know why they don't no it? Because turning it on would DEcrease your frame rate instead of increasing it. Same here.. they could easily make the 2x be Far better quality but the hit to base performance would be so much that it is not worth using. There is a limited amount of resources that these programs have to work with as well as limited data to work with.. only the pixels on your screen. That is it and they do an excellent job with that limited info. You're saying they should be working on the really hard issue instead of offering the really easy stuff like 3x 4x. Doesn't really make sense. Everyone likes the option to do 4x even if just for the fun of it.
@@waltuhputurdawayThe only way to get better frames than Lossless is to use raw data that comes directly from the game engine even before the frame is rendered completely. Motion vectors is the big one people talk about but there is other useful data as well. Sure, Lossless with get better but it will only get a bit better. By time gpus are strong enough to be able to use a significant more resources to generation frame, there will be ai driven framegen in every game for every GPU. With Lossless.. there is a limit to how good it can get.
I remember long ago i used to skip frames on ps2 emulator and up to 2x it worked more or less playable. And now instead of skipping we're adding frames. What a plot twist. Btw in Valheim even AFMF2 feels perfect, with L.S. it would be probably even better.
Daniel can you please try using Lossless Scaling with the RX 7600? Its crashing allot for me and im not the only one. There's a whole thread on the official discord but so far nothing has changed with each update :(
@@MiGujack3 Its RX 7600 and RX 580 issue. My PC blacks out and shuts down while using stock settings without any overclocking or undervolting. The GPU works fine for gaming and there's no issue but when I use Lossless the PC crashes and shuts down. I'm not the only one as i've found posts relating to this issue on the steam page, their discord and reddit. Just hoping to see if Daniel can figure it out because i wanna use this thing.
@@cajampa it is not the same at all , you can test it even without numbers the diff is so obvious , also x4 will eats alot of base fps so you got that too
@@christophermullins7163 there is no coordination between real and generated frames if you use both of them at the same time it will be bloody mess my man it is not just math x2 x2 =4 , enjoy the mess lol
I recently started using LSFG in Helldivers 2. Works pretty darn well! For better latency, make sure to turn on ULLM/Reflex in your Nvidia control panel. With ULLM off the delay is pretty noticeable. With ULLM on, delay is barely there and easy to forget about. Baseline FPS on my rig is ~100 or so. With LSFG on it makes it 80-90 baseline with 160+ output.
It only works when you put --use-d3d11 in the steam launch parameter as driver level ULLM does nothing in DX12 (The reason why they came up with reflex and put it into games)
@@RyanKristoffJohnsonLatest version of RTSS also features Nvidia reflex as the frame limiter. I tried it and it works for games with no in-game reflex option. Try this instead of ULLM.
didnt watch the whole vid but if you want to demonstrate the smoothness cant you record at 240fps, and play it at 0.25 speed on youtube to display each frame. Obviously people who have 240hz know its smoother, its just seeing the quality of each frame and we can picture it back together at full speed.
Agreed. I think he should record it at 120hz and then slow it down to 1/2 and upload it as that. And then those with a 120hz screen can run it at 2x and they will get the real 120hz video.
@@spencer4hire81 Everyone can not watch his video in 2x and get the benefits from seeing it in 120hz. Only those with 120hz screens. I pretty much always watch in 2x anyway. Sure he gets a less watch time. But so what. The whole point is he is showing of high frame rate video. And this is a technique he could use to improve, what his audience will see compared to what he sees and not be limited to 60fps RUclips videos.
What I'd be curious about, is whether the negatives mentioned happen much if you are not trying to add SO MANY extra frames, but rather just close the gap to a nice round value (like 60 or 72 - capping it with MSIAB/RT or the like).
Really interesting video Daniel! However when stacking frame gens, you should absolutely use the performance mode, in your frame gen stack example your base FPS post DLSS FG dropped significantly, showing a big GPU bottleneck, please retry this (for yourself, not on video) and tell me if it improves things, it should! Also please lock your base FPS with the in game frame limiter to 59 FPS, so 59 -> 118 with DLSS FG + Reflex, then x4 performance 118 to 472 FPS to be in VRR range for minimal input lag :) With this it should feel, and look, amazing :)
Just tried it on Ghost of Tsushima where I was getting decent performance with the built in dlss and FSR frame gen. Went with DLAA and 4x lossless scaling in full screen (no fps counter) but you can feel the smoothness once applied. Insanity. I know it's not supposed to work full-screen but I definitely feel the smooth motion.
Regarding 10:10 when Daniel enables LSFG x4 on top of DLSS3 FG. He's attempting base 40, DLSS3 FG doubled to 80 and then quadrupled to 320. But he's not locking a perfect 80/320 at all times, he's doing 74/305 or smth. That's due to the GPU being maxed out. If he'd drop some settings so the fps would be 100% locked to 80 / 320, the input lag would be significantly better. Talking as someone with a 7900 XTX that played Cyberpunk Path Traced using base 40, FSR3 FG to 80 and then LSFG x3 to 240 fps.
@@chacharealsmooth941 well ya theres trade offs, otherwise no one would ever buy a new gpu, although maybe in the future fg will be so good that it wont matter anymore
He could have done something else instead. Use performance mode on LSFG, and enable DLSS on quality to increase the base frame rate to something that is much better input lag wise.
I tried Lossless Scaling X4 earlier on Trepang2 a super fast game and noticed no input latacy at all and i found i died less and played better with the fluidity absulutly amazing i would rarther use this than Dlss frame generation the end of the gun gargled slightly but on a faced paced game not notale what so ever.
I love this application and use it daily. Yes there are some artifacts, often small or dismissable for me. In shooters I won't use it and will perfer any built in game options. Reguardless, this is amazing. It's best used in non-esport type games like for me: Witcher 3, Spiderman, Path of Exile mapping, and best of all youtube/google chrome. Since many videos and movies are uploaded with 32 fps, this makes the videos really feel like 120, and makes me feel like I'm getting real use out of my monitor outside of just gaming. Really appreaciate the work they put into this and making it so simple to use.
A more granular 1.25 or 1.5 mode would be nice here. A perceived jump from 30 - 40fps is significant enough honestly. I'm thinking this would be a good application for power-limited devices or lower spec hardware... Just enough of a bump to meet that good playability range; but not as much of a hit in terms of visual anomalies since the frame gen would only occur every 4th or 3rd frame.
i managed to get solid 60fps with the Rx580 and 1440p monitor. mixed low medium high ultra setting and DLSS Enabler Mod + LSFG. you need to cap the actual fps using rivatuner tho.. 60/120 is smooth af with Freesync or Gsync.
@@damara2268 I have Lossless Scaling. And I have used FSR 3.0 mod on Cyberpunk 2077 for example. Being easier does not matter to me. What matters is there a mod or not. And the image quality, how much the base frame rate drops etc.
@@mikalappalainen2041FSR 3 much better, tested in TLOU . Lossless Scaling in this game works like shit and burn gpu a lot, Fsr decrease gpu usage, temp, give more stable fps and decrease latency much better. Only one reason to use lossless, game without Fsr3 generation
If the delay latency is too much for you, you can try set the "Low latency mode" to ultra. This setting can be found in Nvidia control panel and can be pretty useful for some game.
I randomly found Lossless Scaling by looking for an app that let me scale my old games using integer scaling, which it does beautifully for most games. Slowly they added more features and it's probably the best few dollars I ever spent on any windows app. The frame generation works so well, yes it's not perfect and requires certain settings like windowed mode, etc. But when it's all in action and activated it's almost like downloading more GPU. Feels good.
I've been using tool this for a year, and it really does work well! Wherever there's not support for AMD RSR or FSR, I'll use it. (Primarily Minecraft Java, tbh. Lol) As well as when streaming videos on youtube or elsewhere to bring them to 4K. (Just use a pop out picture in picture for your video, then use the shortcut key to make it scale it up. I use brave browser that has good support for PIP) Haven't tried frame generation, but I'm not a fan of fake frames tbh, and don't play games that would really require it where it wouldn't hurt me mechanically. Lol.
@@MiGujack3you will get tearing otherwise. Enable Vsync from NVCP. I had no severe latency issues with vsync + loseless for games that are not competitive. But I am using a controller which is recommend
Haven't played an AAA game from the past several years and my laptop is a budget one from 2019 with 1080p60hz display and a gtx1650, so this stuff is all very new and cool to me. You upgrading this year to a laptop with a 4060 and 3.2k 165hz display, so im stoked to see what it looks like.
2x is for 120hz gaming.. 3x is for 180hz and 4x is for 240hz. I have been enjoying dlss frame gen so so much on cyberpunk and the witcher. Going from 65 to 110 is absolutely worth the increase in latency that i cant seem to notice on a controller. I hope fsr and dlss offer a 3x or 4x soon. Amd needs to take some notes from lossless to better afmf. I lost my 6950 xt before i used the second version but i can say with condidence that the first iteration of afmf was absolute TRASH. Glad amd is making some moves.
As someone that loves this software and uses it in most games I do want to advise that it it does drop your base frame rate by about 10-20 fps so if your too far under 60fps it will induce fairly significant input latency, anywhere near or above 60fps and its a significant improvement to smoothness
In absolutely any game, input lag increases very much if your GPU is loaded at 95-100%, it doesn't matter how much fps you have, Nvidia reflex PARTIALLY solves this problem, but the best solution today is still lock fps. Cyberpunk and Elden Ring showed you this very clearly, it's not about the gamepad and mouse. LSFG works best if your GPU is never loaded at 100% and your base fps never drops below 60.
15:01 - it is interesting that we have technologically come to a point that we have created 'visual artifacting' in realtime - that term, I am sourcing from an expression that comes from video compression and rendering where the data around sharper edges gets 'lost' when utilizing a lossy codec (video COmpressor DECompressor) in order to save used bandwidth of the video data stream. the codec, during compression of the video, can allow a blur/loss of video data (detail) and the result around sharp edges is termed "ringing" or sometimes called "mosquito noise" (after the flying annoyances that hover over the skin). this loss of visual data results in a 'haze' or blurry motion around these sharp (and potentially moving) edges in the video output. the fact that these visual artifacts are being caused/rendered in Realtime for the human eye - as opposed to hours-long video stream rendering where interpolated frames are calculated in smaller rectangle shapes and put together as video frames in a single video file - blows my mind (hey as a person getting older, that used to take over-night sometimes lol). so neat to see it all happen with almost no delay whatsoever... fantastic coverage of this, btw. great vid
I think you're wrong about something, x4 right now does have image quality issues but x2 LFSG completely destroys FSR3 framegen in term of image quality, there's no comparison, the only advantage FSR3 has over LSFG x2 is that FSR3 skips the HUD entirely but that leads to other issues as well. FSR3 barely create an actually convincing/correct/stable interpolated image right now.
I think, you mentioned in earlier videos, that capping the base frame rate can limit latency, which you did not mention in this video. I used this with a RTX2060 to play Black Myth Wukong with 120 fps on 1440p high settings with dlss on 50% (40% in chapter 3) to get 30fps capped base frame rate and it played wonderfully. Not capping the framerat introduced latency spikes.
I'm surprised Nvidia and AMD didn't do this first. Maybe they were saving 3- and 4-frame interpolation to pretend their _next_ generation is worth what they'll be asking for it.
Good question is what if you use it for vr games, because vr is almost always 4k or more of resolution and you have to get 120 fps or more to really feel immersed.
u must pay lootboxes and get the legendary X5 mode with a drop rate of 0,000000000x10-^432149x=z²+x/69, after that you need the battlepass (only 1000$ right now it's almost free ngl) and only then you must have 100 000 hours on loseless scaling, and make a 2000$ dollars payment to finally get X6, but it's only avalaible to those who own a rtx 10090 ti super ultra (which cost around 15k$ in 2050)
Been using this on WoW. And omg...it is amazing. Set base fps to 60 in the game as that is the lowest i go down to in valdrakken. Then set to x3 with allow tearing. Perfect 180fps matches my monitor refresh rate, and doesn't skip a beat. So good.
Would be nice If CPU’s with a dedicated gpu could slightly boost frame generation of a discrete GPU, by using the extra RAM. Also I don’t know the exact process that Lossless Frame Generation uses, because I’m wondering if they use AV1 or H.265/HEVC when they plant the frames?
I just tried this on Helldivers 2 and finally hit 240fps on my 240hz 1080p monitor! Felt super smooth but the latency was definitely noticeable. I prefer x2 and x3 mode with perceived latency practically non existent with x2 mode as long as you have right settings switched on.
@@MrSuvvri but what's the point if the mouse isn't smooth? Idk, when I'd want to use it is when it sucks the most so this quite literally seems no good
Yeah I understand where you are coming from. Because I don't really find it useful in shooting games. Almost unplayable. I use it for livestream and videos tho, almost all the time. Movies, discord streaming. All this makes huge difference.
Been using it for about 2 months now, and I'm halfway through replaying God of War since it first came out on PS4, works flawlessly. Lossless scaling is a must buy gem. Definitely a must have for the chill non competitive games.
Losless scaling is legitimately great, but I've found the most use out of it on RUclips and with emulators. Real frames are always preferable, interpolating past frame caps is where it's at
@@BourbonBiscuit. what he is saying is that no matter if you are generating 1 or 2 or 3 frames the latency is only as long as the time it has to wait for a the second in game frame. so all 3 should have the exact same latency but from some reviews Ive seen the 3x has less latency than 2x (something having to do with frame pacing or idk I aint no tech wizard)
@@BourbonBiscuit. yeah but why would you do that if you can already hit the target frame-rate from 60 to 120 going 30 to 120 would be both very stupid and unnecessary but going from 60 to 240 should feel the exact same as 60 to 120 when it comes to latency
As a teacher you shouldn't compare Apples with Oranges, you know the scientific method. So if want to compare latency and image artifacts, you compare DLSS FG to LS 2x and not the 4x. You should only compare de LS 4x with the LS 3x and 2x, because it's only one changed variable the number of frames inserted on the same upscaler. So I have to vote negatively for your video, I could voted positively if you had compared Apples with Apples.
I tried Lossless Scaling but refunded because it just made every game screen tear like crazy. I played with the settings a lot (VSYNC, Framerate limits) etc but no luck.
Some games gave me more issues than others with LS. I wonder if by tearing, you're actually seeing artifacting. The X2 mode is least likely to artifact and tear and make sure you don't have in-game settings and LS settings fighting each other. I had to play with in-game settings, LS settings, and my driver settings to get an optimal experience but once I solved it, it works great.
What was your fps? Have you enabled vsync only from Nvidia panel? Are you using RTSS properly? What monitor refresh rate have you set? Or maybe the game you are trying has problems on its own that only the developers can fix or check the internet for stuttering fixes? I can help you with this. For me loseless scaling is working perfectly with all older games, emulators.
Agreed. Ignore the other comments. People acting like frame interpolation is “performance” are either lying to themselves or don’t understand the technology at all. This is a cool way to get “smoother” gameplay, but it introduces input lag and artificial frames to make it happen. You can’t call that performance. Imaging if your car’s speedometer doubled the speed it showed you, so while driving at 60MPH it showed you 120MPH. Is that performance? Of course not. This isn’t anything like adding a turbo to an engine- overclocking would be that equivalent.
@@nintendoconvert4045 But this is actually giving you more frames per second, it's not just SHOWING more FPS like your speedometer comparison. If it said 240fps while actually getting 60 fps this app would be getting entirely shit on by everyone, but it doesn't do that.
No works on 98 percent of all games steam origin uplay and any game installed on your system weather legit or others like abandonware there is only a couple that wont work when lossless scaling cant force a windowd game to full screen like if you use somthing like dgVoodoo to change the api from dx9 to dx11 but fantastic program.
It's worth mentioning that frame generation will use extra gpu power, and this will cause you to lose "real frames." So if you were getting 60 real fps, you might get 50 when FG is on. Personally I wouldn't bother with FG if you can't keep more than 60 real fps because the input delay will just be too much
I've encountered this problem today, after buying Crosshair X, it's a crosshair app which goes as a semi-overlay onto your Rust game adding a crosshair. I've tried scaling my game repeatedly with all types of settings from Lossless Scaling, even with older versions, modes, nothing helps. The scaling is being done on the Crosshair X instead of on Rust and this makes my Rust game run in 40fps because CrosshairX is scaled. I've also tried first scaling Rust, then adding and turning on the Crossshair X app so that it won't be scaled, doesn't work. I've also tried disabling fullscreen optimizations, so on in the properties of the apps. Any fixes? I am desperate... A fix I would see working perfectly would be Lossless Scaling being blocked to scale on Crosshair X and locked only to scale with Rust game, but I don't really know how to do that. Need help!!!
Don't forget that the 2x mode still exists and is getting better in terms of efficiency and image quality with each update
Yeah that's what i'm using in Burnout Paradise Remastered.
The 2X mode has best quality of course
Am I misreading the patch notes or this latest one doesn't talk about X2? Only about X3 improvements
"X3 mode has been updated to further reduce artifacts on patterned textures and in dark scenes."
x2 mode is the best in term of latency and visual quality. best use in fps capped games ofc like elden ring
Nvidia saved the industry from SLI and CROSSFIRE. Artificially doubled or tripled the GPUs you own.
You all owe Nvidia an apology.
If you don't like the latency, you can try the "Allow Tearing" option under Sync mode. It significantly reduces latency at the cost of potential screen tearing. However, after 60ish hours of using this option, I haven't experienced screen tearing at all. Obviously, it might not work as well on EVERY game and EVERY pc but its a pretty huge improvement, so its worth a try for anyone who has LS.
I find the Performance toggle does more for input lag for me
Yes, VRR technologies often handle the tearing by themselves, making VRR and Allow Tearing LSFG a win-win.
@@ashishgoyal8655lossless
@@briando8150 u mean it helps with latency or hurts it?
Woah this is huge help. Thank you
highly technical and accurate diagram is all we ask for :D
Congrats on the shoutout in the latest Hardware Unbox video Daniel, you're getting some well deserved recognition unc
can't believe i'm seeing the word 'unc' in a youtube comment
What is it about?
My goat made it
While you have a fair point as far as Daniels recognition goes, you are steppin out of line junior. Sit tf down boi.
derbauer was also there. :)
I used the program for a few months now already and from my experience as long as you aren't specifically looking for the artifacts and those little jittering pixels I have no issues and em very greatfull for this to exist
I used it on chrono trigger, the only problem i saw was the text showing up (more times if it was on top Idk why), pretty neat feature
@@G0A7 maybe try with the G-sync now available turned on? It could be because sometimes they add the jittering on the top but with g sync this then gets rid of the problem
@@risingg3169 i could try but tbh 60fps is more than enough for chrono trigger, more often than not i just forget that i have It installed and just plain without it
@@risingg3169can i use this with a weak cpu and a strong gpu ? 12400f and 7800xt.
🤮
In case no one's mentioned it here yet or tried it, you can also use Lossless Scaling for videos and it works surprisingly well. However, you do need to be on fullscreen and there should be no UI elements present such as captions or the scrub bar otherwise you will get FPS fluctuations. Also works alongside RTX VSR.
@@raven3696 wait... That is actually lit! I'll have to try this on RUclips videos and maybe even movies and see how they look
Videos don't work like games, 24fps will look better than 60fps
does it work for online games like cod or tarkov
@@prequall it won't for some people, i now watch film at around 90fps and it's much better than the laggy 24fps, can't stand it anymore...
@@papiertoilette-pb9lf movies are made to be 24fps for reason. give it more than 60fps give its a cheap "soap effect" look. looks like robotic , or game like. but if u like that thats fine.
Never in my whole life do I expect a pixelated Duck software on Steam could double,triple or QUADRUPLE my FPS in games. TFS is legend for upgrading this for free constantly. Lossless Scaling is truly the GOAT when you wanted to get more FPS on games that didn't have the expensive exclusive DLSS or AMD's whatever FSR or FMF, this Duck just say "Let me allow every single game be blessed with tons of Frames and Resolution!" and then the frames actually did appear. It really saved my precious GTX 1060 6GB from early retirement and I believe it will benefit people who had crazy high Hz monitors or uses a high frame rate TV madlads to use this just for the sake of playing anything on 4k or 8k.
Unfortunately lossless scaling is snake oil. The upscaling is literally just worse dlss/fsr. The frame gen reduces ur native frames to put more fake frames. Fake frames are not real frames, they just create the illusion of smoothness. Ur input lag is still based on ur real native frames, which are now lower thanks to lossless scaling consuming gpu power and vram. The 3000 and 4000 series have special hardware in them that is used for dlss and frame gen. This hardware makes the performance cost of those nvidia features practically zero, which lossless scaling doesn't have because it's a software implementation rather than a hardware one. U're better off just saving money, buying a used 3060 if on a budget and using dlss, that will give u much better quality at a much smaller performance cost. The frame gen feature could be useful for games that dont have native frame gen, but unfortunately its gsync support is lackluster. And gsync/freesync aren't even that good. Special K's latent sync blows both of them out of the water. Even then u need at least a 144hz monitor to make it worth using. If u have a decent gpu like a 4070 or above, u shouldnt struggle getting 144 fps on non-demanding games, and the demanding ones usually have native frame gen included anyway. So pretty much the only time this program is worth using is if u have an older gpu with a 144hz monitor where u get around 75-80 fps stable. Lock the framerate to 72 and double it up to 144. And even then u miss out on latent sync and u have to use a shitty gsync implementation. Aint worth it unless ure really destitude.
@@kerkertrandov459 It's not snake oil for emulators and games that are locked to 30fps or 60 fps
@@Spectrophia just use a fps unlocker then
@@kerkertrandov459 FPS unlockers aren't a option for most older games on a emulator where making the frame rate higher breaks the physics or makes the game run too fast or nerfs jumping etc.
@@Spectrophia bboomer gaming
Lossless scaling at 4x is great for 2D games with lots of scrolling. These games are often limited to 60 fps (such as Factorio and Terraria). These games combined with my 240 Hz monitor looks fantastic. All motion blur induced by tracking objects with your eye is almost eliminated.
the problem is 60fps is not enough base framerate
@@sudd3660 For what? For 1x 60 is perfectly fine. 2x too depending on game. For 3x i agree that base framerate should be closer to 90 or so.
I play Super Smash flash 2. This game is locked on 30fps. This game needs this app to achieve 60fps. Ain't no other option to fix this but to use Lossless Scaling. 😅
I figured loss of fine detail during motion would mostly negate any benefit of higher fps. Sounds like this works better with 2D games.
@@sudd3660 Many of these games utilize a hardware cursor which is unaffected by the frame rate or latency. This makes input lag way less noticeable.
I downloaded more SSD space yesterday. The cool thing is i did not have enough space to download the extra space but the extra space gave me the option to download the extra space on the extra space itself. The problem is the download was so massive that at the end of the day i have the same amount of SSD space as before.
What 😭
I downloaded more internet speed, but unfortunately the download of the speed itself takes up so much bandwidth that it eats into itself. Still, it's essentially free, so I think I'll keep downloading it just so I can tell people I have a faster connection.
You guys creating paradox now?😭
(Nice analogy lmao)
underrated
The SSD got bigger but it has the same memory space inside of it
I just bought and tested it. X2 is the best option. No need to use X4. 120/144 FPS is enough. X2 has a very good quality too. I didnt even notice that there are any artifactes. This shit is better than Frame Generation and FSMR LOL,
This app just saved my 3090 in black myth wukong lol.
@@Jeannemarre yea, this app is great for capping to your max refresh rate too. Also for games which are hard capped at 60, like Tekken 8. I play Tekken 8 with smooth 120 fps and you do not notice any more input lag.
@@Yakoo86 that’s dope !!
The thing is that 120/144 means you have Sync in Default and not in OFF (allow tearing), which means you have more input lag.
Best option is 2x performance mode with allow tearing
@@ghostlu5462This Honestly
Using the LS1 Upscaler + the X2, both on performance, have minimum impact on the image while feeling smooth.
Each update silently boosts X2 Frame rate to the point is the best choice for a lot of gamers.
Hardware Unboxed shouted you out in their last video, great work!
I tested it today on youtube and a movie. And - ITS WORKS! You can boost 4k 25 fps video to 100fps or 24fps movie to 96 fps even in fullscreen without graphics problems.
Hadn't gotten around to trying it like that but was hoping it might work. I tend to stream at 720p but have a 1440p 165 Hz monitor. I've only tried it on a few games. Didn't work with LA Noire but worked very well on FH4.
That's not advised, running movies at 60fps is already bad, the motion is broken and image quality is also broken
It looks smooth yes but it's a movie
Everything under solid 60 fps is trash for fg. 24fps video is already so jelly when you push it to 58 let alone 96
@@melxvee6850It depends on the movie. Panning shots look awful on most screens nowadays, this should fix that shouldn't it?
@@xShikari no, what should fix it is internal motion pacing
Same in games
The game could be running at 60fps then see a random item or character moving at 30fps or less
Sometimes it can be cool
Like into the spiderverse where Miles is moving at 12fps but the movie is 24fps
But running a movie at 60fps is awful, the movement is very video game looking
It's like those people who were upscaling JJK episodes from 24fps to 48fps and upscaled to 4K
Way too much ghosting and flames looked very liquid
You could see this with a satellite tv running on a Samsung or LG 4K60 display
It upscales everything to run at 60fps
Every single movie no matter how old it's got that motion smoothing feature on that basically doubles the frame rate and everything you watch is at 60fps
Great breakdown. I love the highly accurate and technical breakdown graphs.
Yuge update for high refresh rate panel enjoyers
Turn the Performance toggle to On for better latency! Theres a bit more artifacting, but its less noticeable the more frames you feed it, and its less noticable than the input lag.
I've used it with 2x scaling and 2x FG to play FH4 at 1440p60 with a GTX 670 SC. Both on performance mode. Things got a bit wobbly when moving through my garage quickly but in play, latency was low and the image good. Good enough to be normally competitive in ranked racing. I've played the game at least 120 hours like that.
@@Lurch-Botupgrade your GPU
@GewelReal my guy, if we could, would we be using lossless scaling?
@@GewelReal Yeah right why dont you donate me an RTX 6090 Ti?
@@dingickso4098 it hasn't released yet so no
Additionally, the image quality of 2x mode has improved a lot since the latest update, espacially in some darker scenes.
Now I'm waiting for someone to try AMD AFMF + FSR FG + LS. Complete madness
Tried it, unfortunately no.. Afmf and Lsfg will turned each other off lol.
Similar software conflicting.
@@WutTheHekk but fsr fg and lsfg can work right ?
@@DragonOfTheMortalKombat Yes you can stack in game Fg and Lsfg,
Tried it on Cyberpunk with Fsr fg mod resulted in 660fps with 1440p low preset Fsr balanced
*And Spiderman Miles Morales's Fsr 3 Fg around the same fps due to Lsfg capped to monitor refresh rate for base Fps.
I have 165hz monitor so x4 from Lsfg is 660fps.
나는 이미 그것을 시도했다. 특정 게임에서는 매우 잘 작동합니다.
I did that... the game rendered at negative frames and scaled them up to 144, as a neat little side effect my GPU has its own singularity
Considering X2 is Quality and X4 is performance. I actually use the X2 to save power on my 3090. I limited my AW3423DWF refresh rate to 120 so I can use 10-bit. Then I cap the refresh rate in games to 60/90 and just let the X2 generate the rest.
I have the same monitor, do you have a link to setup 10bit color?
Is 10 bit that much worth it to drop from 165hz?
yeh 2x is good enough to play with good image quality, also I'm using 2x and i have 88 fps on cyberpunk 2077,
before 42 fps without using lossless scaling. i'm using gtx 1080 8gb GPU
@@SLModernJukebox 60fps is the minimum tu use 2x
@@silvio3d Nope. It is 30fps. 60fps is the recommended.
You probably won't notice the artifacting while actively engaged in a fast paced game. It's when you go out of your way to find them using flick tests that they become more apparent. Something to note is that lowering the Mode to X2 and enabling Performance Mode (and setting Sync Mode to Disabled) significantly helps latency. While X4 is cool, it's unoptimized and without the performance settings will result in a bad time latency wise
Just like "lo-fi" music can "work" but it can also sound low-effort and cheap. If you want to make something good, the "they won't notice" excuse will only get in your way.
nah, its all i can see, cant even stand dlss, shits fuzzy as fuck, even the dlaa is questionable
Can you compare AFMF latency vs the latency in Lossless Scaling? Never seen that comparison.
The best way to test that would be with a click-to-photon measurement, which I don't own the hardware to do.
@@danielowentech While it is the best way, you can always record at the highest FPS your phone or camera can do and slow it down. It will not be the most accurate in terms of milliseconds but I think most people just want to know which one has the lowest latency. If they can see one is visually faster than the other, it should be enough. Also maybe an image quality comparison would be nice as in try to target a stable FPS (either 60 or 120FPS or both!) and record it. The reason for stable FPS is to guarantee that the recording is good and hopefully get a clean capture. I'm not sure the built in recording feature like from AMD Adrenalin can capture FG, but if it can, you can use that since we are not concerned about the absolute performance. If not, then obviously you can use capture card. Whatever works best for you.
Afmf2 has less latency lossless scaling has more quality
@@Nicx343xAFMF doesn't have less quality. FSR 2.1 has less quality. If you combine both. Yeah. The quality is slightly worse.
Unless you set manually your software + game. Adrenalin isn't newbie friendly.
oh my god.I never knew about this..got it straight away for my emulators and older pc games thank you so much x
I experimented with this on my 6600XT with a 170hz 1440p monitor with vrr. It works great and I don’t notice any ghosting or artifacts pop up, but I do want to avoid having them whenever possible, same story with the latency since I play with a controller most of the time. Works best when you maintain a consistent 60fps to begin with
Would this work with the VR mods for cyberpunk? I wonder...
Guys if your monitor has crosshair overlay in OSD you can use it instead and disable crosshair in the games 🙂
The monitor crosshairs are usually so bad though lol
@@afroize Yeah you are right they usually don't look great, but I think there is probably some software overlay available, like Crosshair V2 on steam.
Only use them for snipers
Daniel, congrats! You have now become an educator and a youtuber. So that answers your dilemma that you were facing in one of your past videos. You offer a great easy to understand explanation that many people may not understand, the difference in latency, and the explanation of the values being presented. Appreciate it!
This can make your VRAM run extremely hot, like 90C+, so be careful with it. Monitor your VRAM temps on your GPU.
Well it's obvious the higher the frame the hotter the GPU
This has nothing to do with LS, and everything to do with the gpu you have.
@@haelkerzael7589 This "can" make your VRAM run extremely hot.
I’m definitely downloading lossless for Elden Ring to get a better gaming experience!!!! Thank you!🤩
I literally cannot play Elden Ring without lossless scaling enabled. I have probably put 100 hours into lossless scale and through Elden ring alone
@@TheStrengthScholar And what kind of FPS jump did you obtain in it? Also what are the best settings? I'm seeing a lot of people say x2 and allow tearing?
9:44 - bro casually kicking out 3 headshots on pedestrians is wild 😂
Daniel, it would be great if you could try LSC with a second GPU for frame generation. From the LSC panel itself, you can select which GPU you want to use. This way, there is no frame loss on the GPU that is rendering the game. I haven't seen any channel doing that.
Finally, I can download frames. When can I download more ram?
Theoretically you could use Onedrive as ram.
proberly to young for that meme xd that goes back to the late 90´s early 2000´s
@@aeppikx i know the meme, i just wanted to be a smartass and say people have managed to use onedrive as "ram"
do you know why this program shows 130/250 but in fact fps 10 and a slideshow?
@@dizzy7376 cause your pc is shit.
this thing is the best thing ever made in gaming history, it helps me with helldivers, where I have a 4060 ti gpu and a weak cpu for the game amd 5600x that causing me only reached around 40-55 fps max in that game (while wukong can grant me around 70-80 fps in cinematic) , was thought to upgrade my whole pc except the gpu, and then I found your video, and try it, it works just like magic, as you said frame gen doenst increase response time, but highers fps, makes my own natural response time better XD
Should specify that this application is the star of the show for emulation. Emulation, you're looking for a fixed frame rate 100% of the time. That goes great with this application that works best while at a stable FPS as well. VRR doesn't do anything for emulation and can cause slow motion at lower frame rates. Games that lock at 30 FPS aren't great but games that lock at 60FPS like many Mario 3D titles, feel much better with it on.
Imagine wind waker at 120fps
When games close at 30 frames, it seems as if they are at 60 frames, but this is a shame. The more the frames are cut, the more the control becomes heavier and heavier, and this is annoying.
@@thebigcj517 unless you can do run ahead frames like retroarch and combine that with the frame smoothing, since computing power isn't the limitation here
Oh I gotta try this
Excellent video and explanations of how it all works, good job Daniel.
Have been using this for quite some time now and I am really happy with it. I personally do not really care about hud elements having errors and the artifacts are also barely noticeable, at least for me. I don't really pay too much attention to finer details. I have a 165hz monitor for almost 2 years now but I couldn't really utilize it. I was mostly stuck on 120-144hz because of performance, even with x2 frame gen of Nvidia (I have a 4080). Since x3 mode I can play pretty much every game at 165fps, so I can finally use the 165hz of my monitor. And yea like you said, with games like Mordhau (competitive game) I do not use it because of the latency. But in single player games that aren't crazy fast paced, 60fps baseline is perfectly fine latency wise.
Only downside is: you can't really go back to something like 60fps. My god it feels like lag 🤣
EDF6 is another great option for LossLess scaling, since it's limited to 60fps, and the frame generation does help make the game a hell of a lot smoother.
First DLSS 2.0 dropped and I couldn't believe it wasn't magic. Then Nvidia released Frame Gen and it's almost as impressive. I can't wait for what they are cooking up next. They should definitely be criticized for their anti consumer behaviour but you can't deny the genius of their engineers.
No one's talking about Nvidia and their walled garden here.
Also they've already released a new "magic", it's called ray reconstruction.
It's just like an evolution of injectors back in the day, they just fiddle with settings to make games look better than they actually are. It isn't a replacement for native, and it certainly isn't magic. More like copium for devs inability to properly optimize PC games
@@JABelms bingo.
@@JABelms agreed, the way people are labelling it as magic is insane
Using words like magic and genius bring back bad memories of Apple's language. I dont agree with glorifying these big corporations.
Unfortunately Nvidia is more and more Apple like.
Best video looking for info like this for a long time
In the end, it depends on your GPU.
If you're GPU is too weak, then it worsening your fps and input lag instead.
U r good if u have anything abovr a GTX 1660s
Eh, this allows me to get close to 60 fps on Hogwarts Legacy and Jedi Survivor, using a 1060....
@@krishnaprasad1570 No he has a point, a lot of people will say that LS doesnt work and it make the performance worse when on. This is due to them having a weaker gpu that is already at max utilization and max vram usage. So when they enable LS they lose performance, while the LS and the game are fighting for resource like vram causing even more performance issues.
Quality was great man, especially after RUclips compression and loss. Subbed
I wish ths focused on quality of frame generation instead of quantity of frames. The x2 is amazing but still has flickering issues (especially with 3rd person games)
That is an odd statement. Increasing the number of frame inserted is by far the easy part. The entire task with frame Gen is to accurately predict the frame while not increasing latency.
AMD could easily make an upscaler that look BETTER than dlss.. know why they don't no it? Because turning it on would DEcrease your frame rate instead of increasing it.
Same here.. they could easily make the 2x be Far better quality but the hit to base performance would be so much that it is not worth using. There is a limited amount of resources that these programs have to work with as well as limited data to work with.. only the pixels on your screen. That is it and they do an excellent job with that limited info. You're saying they should be working on the really hard issue instead of offering the really easy stuff like 3x 4x. Doesn't really make sense. Everyone likes the option to do 4x even if just for the fun of it.
@@ryujigames2509 that’s what Nvidia frame generation is
@@BourbonBiscuit.explain that?
@@waltuhputurdawayThe only way to get better frames than Lossless is to use raw data that comes directly from the game engine even before the frame is rendered completely. Motion vectors is the big one people talk about but there is other useful data as well. Sure, Lossless with get better but it will only get a bit better. By time gpus are strong enough to be able to use a significant more resources to generation frame, there will be ai driven framegen in every game for every GPU. With Lossless.. there is a limit to how good it can get.
@@waltuhputurdaway Nvidia FG image quality has come a long way, instead of just multiplying frames more
I remember long ago i used to skip frames on ps2 emulator and up to 2x it worked more or less playable. And now instead of skipping we're adding frames. What a plot twist. Btw in Valheim even AFMF2 feels perfect, with L.S. it would be probably even better.
Daniel can you please try using Lossless Scaling with the RX 7600? Its crashing allot for me and im not the only one. There's a whole thread on the official discord but so far nothing has changed with each update :(
i wish nvidia will make 2x 3x 4x framegen for 50-series. also enjoy amd as always 🗿
Its crashing on my 1660S as well
I run it with 6600 XT no issues.
@@MiGujack3 Its RX 7600 and RX 580 issue. My PC blacks out and shuts down while using stock settings without any overclocking or undervolting. The GPU works fine for gaming and there's no issue but when I use Lossless the PC crashes and shuts down. I'm not the only one as i've found posts relating to this issue on the steam page, their discord and reddit. Just hoping to see if Daniel can figure it out because i wanna use this thing.
@@amzgamingxtry upscale from 1440p to 4K this will fix your crashing maybe
You should have tried dlss 3 frame gen with LS x2 rather than x4 to see if the latency penalty is as heavy
bruuh it is not that simple , you cant use 2 fg together or it will give you bad frame pacing and quality and also bad latency
LS x2 and x4 latency should be the same. It is still a frame behind.
What I was thinking. I'ma try it in cyberpunk when I get home from work lol
@@cajampa it is not the same at all , you can test it even without numbers the diff is so obvious , also x4 will eats alot of base fps so you got that too
@@christophermullins7163 there is no coordination between real and generated frames if you use both of them at the same time it will be bloody mess my man it is not just math x2 x2 =4 , enjoy the mess lol
I recently started using LSFG in Helldivers 2. Works pretty darn well! For better latency, make sure to turn on ULLM/Reflex in your Nvidia control panel. With ULLM off the delay is pretty noticeable. With ULLM on, delay is barely there and easy to forget about.
Baseline FPS on my rig is ~100 or so. With LSFG on it makes it 80-90 baseline with 160+ output.
It only works when you put --use-d3d11 in the steam launch parameter as driver level ULLM does nothing in DX12 (The reason why they came up with reflex and put it into games)
@@bb5307 Interesting. I'm pretty sure I noticed a decent difference between Off vs Ultra. I can double check it though
@@RyanKristoffJohnsonLatest version of RTSS also features Nvidia reflex as the frame limiter. I tried it and it works for games with no in-game reflex option. Try this instead of ULLM.
Thanks for the info its something I heard about a few months ago but never looked into it
didnt watch the whole vid but if you want to demonstrate the smoothness cant you record at 240fps, and play it at 0.25 speed on youtube to display each frame. Obviously people who have 240hz know its smoother, its just seeing the quality of each frame and we can picture it back together at full speed.
Agreed. I think he should record it at 120hz and then slow it down to 1/2 and upload it as that. And then those with a 120hz screen can run it at 2x and they will get the real 120hz video.
Wouldn’t the algorithm crush his stats if everyone watched a large section of his video at 4x speed though?
@@spencer4hire81 Everyone can not watch his video in 2x and get the benefits from seeing it in 120hz. Only those with 120hz screens. I pretty much always watch in 2x anyway.
Sure he gets a less watch time.
But so what. The whole point is he is showing of high frame rate video.
And this is a technique he could use to improve, what his audience will see compared to what he sees and not be limited to 60fps RUclips videos.
What I'd be curious about, is whether the negatives mentioned happen much if you are not trying to add SO MANY extra frames, but rather just close the gap to a nice round value (like 60 or 72 - capping it with MSIAB/RT or the like).
Really interesting video Daniel! However when stacking frame gens, you should absolutely use the performance mode, in your frame gen stack example your base FPS post DLSS FG dropped significantly, showing a big GPU bottleneck, please retry this (for yourself, not on video) and tell me if it improves things, it should! Also please lock your base FPS with the in game frame limiter to 59 FPS, so 59 -> 118 with DLSS FG + Reflex, then x4 performance 118 to 472 FPS to be in VRR range for minimal input lag :)
With this it should feel, and look, amazing :)
Just tried it on Ghost of Tsushima where I was getting decent performance with the built in dlss and FSR frame gen. Went with DLAA and 4x lossless scaling in full screen (no fps counter) but you can feel the smoothness once applied. Insanity.
I know it's not supposed to work full-screen but I definitely feel the smooth motion.
Regarding 10:10 when Daniel enables LSFG x4 on top of DLSS3 FG.
He's attempting base 40, DLSS3 FG doubled to 80 and then quadrupled to 320. But he's not locking a perfect 80/320 at all times, he's doing 74/305 or smth. That's due to the GPU being maxed out. If he'd drop some settings so the fps would be 100% locked to 80 / 320, the input lag would be significantly better.
Talking as someone with a 7900 XTX that played Cyberpunk Path Traced using base 40, FSR3 FG to 80 and then LSFG x3 to 240 fps.
How can you use FG in Cyberpunk?
Do you mean afmf?
@@chacharealsmooth941 Either LukeFZ mod (Patreon) or DLSS Enabler (Nexus). I mean FSR3 FG, not AFMF.
@@raresmacovei8382 Gotcha. Used it myself and it works, though the quality is not perfect.
@@chacharealsmooth941 well ya theres trade offs, otherwise no one would ever buy a new gpu, although maybe in the future fg will be so good that it wont matter anymore
He could have done something else instead. Use performance mode on LSFG, and enable DLSS on quality to increase the base frame rate to something that is much better input lag wise.
Does this work with VR Titles as well ? Or just Pancake screen ? Thanks for the vid ! Im excited to try this :)
Vr no , can run anything windows rog ally etc steam deck no
I tried Lossless Scaling X4 earlier on Trepang2 a super fast game and noticed no input latacy at all and i found i died less and played better with the fluidity absulutly amazing i would rarther use this than Dlss frame generation the end of the gun gargled slightly but on a faced paced game not notale what so ever.
really liking this channel
I love this application and use it daily. Yes there are some artifacts, often small or dismissable for me. In shooters I won't use it and will perfer any built in game options. Reguardless, this is amazing. It's best used in non-esport type games like for me: Witcher 3, Spiderman, Path of Exile mapping, and best of all youtube/google chrome. Since many videos and movies are uploaded with 32 fps, this makes the videos really feel like 120, and makes me feel like I'm getting real use out of my monitor outside of just gaming. Really appreaciate the work they put into this and making it so simple to use.
A more granular 1.25 or 1.5 mode would be nice here.
A perceived jump from 30 - 40fps is significant enough honestly.
I'm thinking this would be a good application for power-limited devices or lower spec hardware... Just enough of a bump to meet that good playability range; but not as much of a hit in terms of visual anomalies since the frame gen would only occur every 4th or 3rd frame.
Bro had beef with that rabbit
i managed to get solid 60fps with the Rx580 and 1440p monitor.
mixed low medium high ultra setting and DLSS Enabler Mod + LSFG.
you need to cap the actual fps using rivatuner tho.. 60/120 is smooth af with Freesync or Gsync.
With my 3070, it's either this or FSR 3.0 modded frame generation. And at the moment, this seems better.
This is better cause it's much easier to use. Just keyboard shortcut and no need to install any mods that might as well glitch the game out
@@damara2268 I have Lossless Scaling. And I have used FSR 3.0 mod on Cyberpunk 2077 for example. Being easier does not matter to me. What matters is there a mod or not. And the image quality, how much the base frame rate drops etc.
@@mikalappalainen2041FSR 3 much better, tested in TLOU . Lossless Scaling in this game works like shit and burn gpu a lot, Fsr decrease gpu usage, temp, give more stable fps and decrease latency much better. Only one reason to use lossless, game without Fsr3 generation
@@mikalappalainen2041 lossless scaling is better than the FSR 3.0 mod? is it cuz of more fps or image quality?
If the delay latency is too much for you, you can try set the "Low latency mode" to ultra. This setting can be found in Nvidia control panel and can be pretty useful for some game.
I randomly found Lossless Scaling by looking for an app that let me scale my old games using integer scaling, which it does beautifully for most games. Slowly they added more features and it's probably the best few dollars I ever spent on any windows app. The frame generation works so well, yes it's not perfect and requires certain settings like windowed mode, etc. But when it's all in action and activated it's almost like downloading more GPU. Feels good.
I've been using tool this for a year, and it really does work well! Wherever there's not support for AMD RSR or FSR, I'll use it. (Primarily Minecraft Java, tbh. Lol) As well as when streaming videos on youtube or elsewhere to bring them to 4K. (Just use a pop out picture in picture for your video, then use the shortcut key to make it scale it up. I use brave browser that has good support for PIP)
Haven't tried frame generation, but I'm not a fan of fake frames tbh, and don't play games that would really require it where it wouldn't hurt me mechanically. Lol.
Mozilla pop out windows are pure garbage and won't work. I will try Brave though.
Does it work with Vsync ?
It didnt for me, id recommend turning it off
Do not use vsync with lossless, limit fps manually
@@MiGujack3you will get tearing otherwise. Enable Vsync from NVCP. I had no severe latency issues with vsync + loseless for games that are not competitive. But I am using a controller which is recommend
Haven't played an AAA game from the past several years and my laptop is a budget one from 2019 with 1080p60hz display and a gtx1650, so this stuff is all very new and cool to me. You upgrading this year to a laptop with a 4060 and 3.2k 165hz display, so im stoked to see what it looks like.
2x is for 120hz gaming.. 3x is for 180hz and 4x is for 240hz.
I have been enjoying dlss frame gen so so much on cyberpunk and the witcher. Going from 65 to 110 is absolutely worth the increase in latency that i cant seem to notice on a controller. I hope fsr and dlss offer a 3x or 4x soon. Amd needs to take some notes from lossless to better afmf. I lost my 6950 xt before i used the second version but i can say with condidence that the first iteration of afmf was absolute TRASH. Glad amd is making some moves.
As someone that loves this software and uses it in most games I do want to advise that it it does drop your base frame rate by about 10-20 fps so if your too far under 60fps it will induce fairly significant input latency, anywhere near or above 60fps and its a significant improvement to smoothness
In absolutely any game, input lag increases very much if your GPU is loaded at 95-100%, it doesn't matter how much fps you have, Nvidia reflex PARTIALLY solves this problem, but the best solution today is still lock fps. Cyberpunk and Elden Ring showed you this very clearly, it's not about the gamepad and mouse. LSFG works best if your GPU is never loaded at 100% and your base fps never drops below 60.
15:01 - it is interesting that we have technologically come to a point that we have created 'visual artifacting' in realtime - that term, I am sourcing from an expression that comes from video compression and rendering where the data around sharper edges gets 'lost' when utilizing a lossy codec (video COmpressor DECompressor) in order to save used bandwidth of the video data stream.
the codec, during compression of the video, can allow a blur/loss of video data (detail) and the result around sharp edges is termed "ringing" or sometimes called "mosquito noise" (after the flying annoyances that hover over the skin). this loss of visual data results in a 'haze' or blurry motion around these sharp (and potentially moving) edges in the video output.
the fact that these visual artifacts are being caused/rendered in Realtime for the human eye - as opposed to hours-long video stream rendering where interpolated frames are calculated in smaller rectangle shapes and put together as video frames in a single video file - blows my mind (hey as a person getting older, that used to take over-night sometimes lol). so neat to see it all happen with almost no delay whatsoever...
fantastic coverage of this, btw. great vid
I think you're wrong about something, x4 right now does have image quality issues but x2 LFSG completely destroys FSR3 framegen in term of image quality, there's no comparison, the only advantage FSR3 has over LSFG x2 is that FSR3 skips the HUD entirely but that leads to other issues as well. FSR3 barely create an actually convincing/correct/stable interpolated image right now.
I think, you mentioned in earlier videos, that capping the base frame rate can limit latency, which you did not mention in this video.
I used this with a RTX2060 to play Black Myth Wukong with 120 fps on 1440p high settings with dlss on 50% (40% in chapter 3) to get 30fps capped base frame rate and it played wonderfully. Not capping the framerat introduced latency spikes.
If only this worked in VR
Thanks for video.wouldnt known about this😊
I'm surprised Nvidia and AMD didn't do this first. Maybe they were saving 3- and 4-frame interpolation to pretend their _next_ generation is worth what they'll be asking for it.
3x or 4x in shit tecnology
Haven’t tested yet, but it sounds like you would have less latency if you turned off DLSS/FSR & just used 2x on this app?
You can use amd fsr without using a scale and 2x fg. 60 FPS cap. The quality is better than native… but most ok the users cant make use of ir
I can finally play my latest DooM at 1840 fps
But is that enough? 🤔
Do you also have a 1840hz monitor?
@@JanM2he is the first person to get the new neuro-vision from neurolink. There is a display port is his brain so yeah. He fidna own.
🤖 I'm singing the Doom Song!
🎵 Doom doom doom Doom! 🎶 Doom Doom Doom! 🎵
@@JanM2 I personally have a 1841 HZ monitor, but I'm locking it down to 1840 cause my eyes can't see past 1840fps.
Good question is what if you use it for vr games, because vr is almost always 4k or more of resolution and you have to get 120 fps or more to really feel immersed.
X6 avaible but only on rtx 😀
*Rtx 5000 series
@@Games_and_Tech best RTX 6090 😛
What a dumb comment. 😂 Love it
u must pay lootboxes and get the legendary X5 mode with a drop rate of 0,000000000x10-^432149x=z²+x/69, after that you need the battlepass (only 1000$ right now it's almost free ngl) and only then you must have 100 000 hours on loseless scaling, and make a 2000$ dollars payment to finally get X6, but it's only avalaible to those who own a rtx 10090 ti super ultra (which cost around 15k$ in 2050)
@@nawabifaissal9625 nah 3080 and good IQ :)
Been using this on WoW. And omg...it is amazing.
Set base fps to 60 in the game as that is the lowest i go down to in valdrakken.
Then set to x3 with allow tearing.
Perfect 180fps matches my monitor refresh rate, and doesn't skip a beat. So good.
Seen a lot of lossless scaling videos lately i wonder if its an ad
No it is not it is fantastic
Would be nice If CPU’s with a dedicated gpu could slightly boost frame generation of a discrete GPU, by using the extra RAM. Also I don’t know the exact process that Lossless Frame Generation uses, because I’m wondering if they use AV1 or H.265/HEVC when they plant the frames?
I just tried this on Helldivers 2 and finally hit 240fps on my 240hz 1080p monitor! Felt super smooth but the latency was definitely noticeable. I prefer x2 and x3 mode with perceived latency practically non existent with x2 mode as long as you have right settings switched on.
Great stuff 👍Thanx 🙏
This tech seems a bit pointless to me. Considering the main reason why I want more frames is for my mouse to feel smoother (lighter).
In fps games sure but for stuff like Elden ring, skyrim, basically anything that isn't a fast paced game it's fine
@@MrSuvvri but what's the point if the mouse isn't smooth?
Idk, when I'd want to use it is when it sucks the most so this quite literally seems no good
Yeah I understand where you are coming from. Because I don't really find it useful in shooting games. Almost unplayable.
I use it for livestream and videos tho, almost all the time. Movies, discord streaming. All this makes huge difference.
It will feel smoother.
it makes sense for games like cyberpunk that are hard to run but you want them to look nice and aiming doesn't matter too much
Been using it for about 2 months now, and I'm halfway through replaying God of War since it first came out on PS4, works flawlessly. Lossless scaling is a must buy gem. Definitely a must have for the chill non competitive games.
Cyberpunk citizens: Ooh sh!t it's frame generation tests time! 😅
G-sync support is super, but still this works only on windowed mode or fullscreen is allowed now?
Losless scaling is legitimately great, but I've found the most use out of it on RUclips and with emulators. Real frames are always preferable, interpolating past frame caps is where it's at
what a time to be alive
Nobody would use x3 because of artifacting+latency, doubt this will be different. G-sync support is massive though
Afaik latency on x3/x4 should be the similar to the x2 latency because it's still only running 1 frame behind
@@Game_Crunch but have you not got 3 injected frames for every 1 that that you do not have control of, therefore latency?
@@Game_Crunch and if your capping your frame rate at 4th or your target rather than half will that not introduce more latency?
@@BourbonBiscuit. what he is saying is that no matter if you are generating 1 or 2 or 3 frames the latency is only as long as the time it has to wait for a the second in game frame. so all 3 should have the exact same latency but from some reviews Ive seen the 3x has less latency than 2x (something having to do with frame pacing or idk I aint no tech wizard)
@@BourbonBiscuit. yeah but why would you do that if you can already hit the target frame-rate from 60 to 120
going 30 to 120 would be both very stupid and unnecessary
but going from 60 to 240 should feel the exact same as 60 to 120 when it comes to latency
It works great, im buying that , thank you
As a teacher you shouldn't compare Apples with Oranges, you know the scientific method. So if want to compare latency and image artifacts, you compare DLSS FG to LS 2x and not the 4x. You should only compare de LS 4x with the LS 3x and 2x, because it's only one changed variable the number of frames inserted on the same upscaler. So I have to vote negatively for your video, I could voted positively if you had compared Apples with Apples.
Wow. I have used this in other places like Age of Empires and RUclips, this is a very powerful program. Great recommendation.
I tried Lossless Scaling but refunded because it just made every game screen tear like crazy. I played with the settings a lot (VSYNC, Framerate limits) etc but no luck.
Some games gave me more issues than others with LS. I wonder if by tearing, you're actually seeing artifacting. The X2 mode is least likely to artifact and tear and make sure you don't have in-game settings and LS settings fighting each other. I had to play with in-game settings, LS settings, and my driver settings to get an optimal experience but once I solved it, it works great.
What was your fps? Have you enabled vsync only from Nvidia panel? Are you using RTSS properly? What monitor refresh rate have you set?
Or maybe the game you are trying has problems on its own that only the developers can fix or check the internet for stuttering fixes?
I can help you with this. For me loseless scaling is working perfectly with all older games, emulators.
Does G-Sync support mean it will only work with NVidia GPUs?
That'd be too bad.
Run all vrr types freesynce too so amd compatible
I miss raw Performance
This is awesome it's like the introduction of turbochargers in engines
This is the future now old man
Go back to Bronze Era
Agreed. Ignore the other comments. People acting like frame interpolation is “performance” are either lying to themselves or don’t understand the technology at all. This is a cool way to get “smoother” gameplay, but it introduces input lag and artificial frames to make it happen. You can’t call that performance. Imaging if your car’s speedometer doubled the speed it showed you, so while driving at 60MPH it showed you 120MPH. Is that performance? Of course not. This isn’t anything like adding a turbo to an engine- overclocking would be that equivalent.
@@nintendoconvert4045 But this is actually giving you more frames per second, it's not just SHOWING more FPS like your speedometer comparison. If it said 240fps while actually getting 60 fps this app would be getting entirely shit on by everyone, but it doesn't do that.
Will this only work with Steam games?
I've a HP 290 sff with an RTX 3050 shoehorned into it.
Would love to try this with Escape From Tarkov.
No works on 98 percent of all games steam origin uplay and any game installed on your system weather legit or others like abandonware there is only a couple that wont work when lossless scaling cant force a windowd game to full screen like if you use somthing like dgVoodoo to change the api from dx9 to dx11 but fantastic program.
You can use it on ANYTHING, even a RUclips video.
i did not like it at all
It's worth mentioning that frame generation will use extra gpu power, and this will cause you to lose "real frames." So if you were getting 60 real fps, you might get 50 when FG is on. Personally I wouldn't bother with FG if you can't keep more than 60 real fps because the input delay will just be too much
So basically it's a gimmick.
No.
If you consider DLSS a gimmick too then yes it is.
This is a game changer for pascal series gpus. I have an old pc with a 1070 and this will give it new life.
Been using AMD's frame generation on quite a few titles lately and it's also very good. Double the fps with a click of a button is crazy cool.
I've encountered this problem today, after buying Crosshair X, it's a crosshair app which goes as a semi-overlay onto your Rust game adding a crosshair. I've tried scaling my game repeatedly with all types of settings from Lossless Scaling, even with older versions, modes, nothing helps. The scaling is being done on the Crosshair X instead of on Rust and this makes my Rust game run in 40fps because CrosshairX is scaled. I've also tried first scaling Rust, then adding and turning on the Crossshair X app so that it won't be scaled, doesn't work. I've also tried disabling fullscreen optimizations, so on in the properties of the apps. Any fixes? I am desperate... A fix I would see working perfectly would be Lossless Scaling being blocked to scale on Crosshair X and locked only to scale with Rust game, but I don't really know how to do that. Need help!!!