I have tested x20 mode with a 60fps base in some games. In one of them I was averaging 60/1000fps (base/generated) with major ghosting and some bad but playable input delay. It felt incredibly smooth on my 280Hz monitor, and I think with a few more tweaks and updates it will be reasonably playable in the future.
I tried out Lossless Scaling on Final Fantasy XIV in X2 mode with LSFG 3 and get less than 15ms latency, even in an online game, when before the update I got obvious, unplayable input lag. I'd say I'm impressed.
I've tested LSFG3 x4 in the first Kingdom Come at high settings on my RX 6600 and I can confirm it's simply amazing: almost AFMF2 quality with double the fps. Now I can easily hit 160fps with the same settings and the game plays smooth as butter. It's incredible.
I finally got my first PC, which looks brand new, for €1350 from a guy who was about to travel to Australia, so he put a low price on it to sell it fast. It's equipped with a 14600KF and a PNY RTX 4070 Ti Super, and the other components are pretty premium too. Still haven't played on it yet, but I'm excited.
You are gonna have a fkn blast on that system, bro! Glad for you! Welcome to the top tier of gaming... It IS glorious here, ain't gonna sugar-coat it for the consoled.
@@JerryOIOOIIO I already had a 180Hz G-Sync monitor that I used with my laptop, and this GPU is a beast; I won't need to upgrade for a while. In Cyberpunk I get about 110 FPS at 1440p ultra with DLSS Quality, around 75 FPS with ray tracing ultra and DLSS Quality, and even with path tracing I can still get 45-55 FPS on DLSS Quality.
Thanks for the review. This makes me want to reinstall the software. Previously I wasn't convinced by what it could do, but now maybe we can give it another chance to do better. 🤷🏻♀️🥴
@aricrudd6579 That's not how that works. The software is designed with a different operating system in mind. FSR didn't run on Linux until games started implementing it, or until Steam implemented it manually in the UI of the Steam Deck. If they wanted to get it working they could, but it might be an issue of licensing, so it still might not happen. You're not wrong about that part, in all likelihood.
@@ClassicRoc87 FSR works through the graphics pipeline, if I’m not mistaken, but Lossless Scaling apparently uses a few Windows functions that can’t be handled by things like Wine and Proton. Making LS run on Linux is far more complicated than you’re implying.
@aricrudd6579 FSR 2.0 works through the pipeline; 1.0 is basically a fancy filter, which is kind of how the frame generation in this application works. Don't get me wrong, I'm not an expert, but I believe it could be implemented if they wanted to.
@@ClassicRoc87 Frame generation is more than a fancy filter, considering it has to actually interpolate frames to make extra frames. I think the devs of LS actually explained somewhere why they can’t/won’t make a Linux version. But considering Proton recently added support for frame generation, hopefully someone else can do something similar for Linux. I just don’t think LS itself will ever be Linux compatible.
In Tekken and other fighting games like MK and SF, combat is synced with the fps; using FG just worsens your chance of registering a hit before the opponent does, because of the additional input delay.
I recorded footage from different games with OBS on a 7800 XT, and I could only get rid of the frame-pacing bug by disabling HAGS; that is also something they are working on, because it's a bug in the latest OBS version. It works flawlessly with HAGS enabled and Adrenalin recording in the background, though. Lossless Scaling was the best investment I made in 2024 :) I even bought it three times for my nephews and showed them how to use it. Now they can kill bugs and Automatons at 90fps (base) / 180fps (x2) fixed on their 240Hz screens :D with GPU usage never going above 80%. A really awesome and easy-to-use tool if you are not playing competitively and don't need to squeeze every ms out of your input latency. I really like how you explain it in detail and make videos about it. Once you've tried it and made it work, you can never go back.
I really respect and appreciate you rectifying your mistake. I also recommend using VRR (G-Sync or FreeSync) with LS and setting sync mode to "allow tearing" for even lower latency; capping fps gives better latency too. If you have an old GPU and run a dual-GPU setup, it is just crazy broken OP. ALSO, IT IS HIGHLY RECOMMENDED TO LOWER THE RESOLUTION SCALE WHEN RUNNING AT 4K OR 1440P (NVIDIA does that too): 50% for 4K and 70-80% for 1440p.
I believe x6, x10 and x20 don't work correctly because you would need a much higher monitor frequency. Other than that, pretty good video! If I may make a recommendation: using 50% resolution scale ("optical flow scale" would be more accurate) still looks really good with a 4K target and runs much better; 100% is honestly only recommended at 1080p.
Actually, I believe it may not be working correctly because you are hitting 100% GPU usage. If there is not enough headroom for Lossless Scaling to do its work, it may drop frames, resulting in artifact and latency hell. When 6x, 10x and 20x were enabled, the framerate counter also indicated that it did not reach the target (generated) framerate, which means Lossless Scaling is dropping a lot of frames.
Hi there, love your videos. I wanted to mention that you can select the beta version from the Steam launch settings and Lossless Scaling will be on version 3.0. Not sure if there is any difference, but worth checking out!
@Savitarax It means no performance penalty from enabling Lossless Scaling. I do a before-and-after comparison here: ruclips.net/video/gH359ZNxvNk/видео.html
Hello there! One thing to mention... you can get the beta version of Lossless Scaling, which has the new UI, by going to your Steam library, opening Lossless Scaling's Properties and selecting the Beta program version.
That major artefacting you got at 10x and 20x was mostly caused by the application being limited to somewhere around 200fps, rather than by the frame generation quality itself. I've seen that exact 'slurring' artefact happen whenever LSFG attempts to render more than a software-imposed frame rate cap. Also, yeah, sometimes it'll randomly cap the fps at some number that bears no relation to the monitor's refresh rate or anything like that, like your LSFG being capped to something like 200fps, and I still don't know why.
Along with some of the mistakes in your tests, another one was saying that LS requires the game to be in borderless windowed mode. Actually, LS works fine in windowed mode as well; it doesn't have to be borderless. This might be relevant to people who want to change their resolution in game, since some games only allow changing the resolution in windowed mode, not borderless, for some reason.
What you should do is lower the resolution to reduce GPU load, because your GPU load is at 100 percent, which is why the GPU can't generate more AI frames. I played Fortnite capped at 60fps at 1440p; when I used 10x frame gen, my base fps decreased to 35 and I got 350fps. I reduced the resolution to bring GPU load down from 90% to 78%, and now I get 600fps.
@@alperen1383 Yeah, if I look at 120Hz for a while, especially on my OLED, it looks pretty decent. But I say it looks choppy sometimes because when I drop straight from 360Hz (my monitor) to 120 it looks laggy, so I can see why the other guy would want 240Hz smoothness.
@@alperen1383 I'm a competitive gamer and I really don't even feel the input latency. As long as you lock your frames like the video shows, it has the same input lag as DLSS. I would much rather play a fast-paced competitive game at 240Hz than 120Hz.
Tested this app on 2D games from my Contra Anniversary Collection. Managed to get them to run at x10 without artifacts on my 165Hz monitor (beyond that, my monitor's refresh rate can't keep up with so many extra frames). It locks at 165fps, but the animation is muuuuuuch smoother, and the edges of the sprites look a lot less jagged. The motion clarity adds a lot of antialiasing. Lovely app, excellent for any game and for future hybrid handhelds with high refresh displays.
You're running into the limit of what you can calculate with x6 mode and above. If you tried it on a 4090, it would probably go a lot higher. Something interesting you could try is using a second GPU (you don't need anything crazy) as your display-out and LSFG calculation card, and using the 7800 XT to render the game. That works, and it should let you use FG with little performance impact on the game and with the full performance of FG, without it having to compete with the game for resources. The more powerful the secondary card, the more intermediary frames it should be able to calculate. If my 4090's massive cooler didn't cover my second PCIe slot, I'd try it myself.
@@p_serdiuk Probably even something like a GTX 750 would work, because it's fine to have it at 100% usage as long as it's the one doing the calculation, but preferably something like a 1050 or higher.
It has been proven several times that depending on the game, the in-game fps lock can actually have more input lag. And not all games let you lock fps to a specific number
Wait really? I watched a video guide before for locking FPS, and it showed RTSS being the best one for frame time stability compared to in-game or gpu app fps lock.
RTSS has an option to override its default FPS limiter implementation and use NVIDIA Reflex instead. Not only that, the tool's description says this is the recommended option when limiting frames with frame generation, and it is even compatible with games that lack native Reflex support. This means that in addition to limiting FPS, this option also brings all the latency benefits of Reflex. I believe it's exclusive to NVIDIA cards, but still, you can't simply say that limiting FPS via RTSS is worse, because in these cases RTSS is simply the best option.
I think your GPU hitting 100% might be why you aren't getting more generated frames. It's great that the frames are being generated on the GPU instead of CPU.
3.0 is practically flawless on Tarkov. Literally no more gpu usage and no artifacting that I’ve noticed. Really great update from LS devs. Fantastic on helldivers too
I tested and found the same. I think the frame generation impact is pretty massive if you are close to max GPU output or using too high an FG multiplier. The best-case scenario remains a locked 60fps with X2 or X3. Using 60fps locked in Warhammer 40,000: Darktide, with X3 and my 144Hz cap, it works very well.
Just out of curiosity, are you using Lossless Scaling on a daily basis or just for videos? I can't live without it, to be fair. I was thinking about upgrading my rig for smoother 1080p gaming, but when you showed us Lossless Scaling I decided to give it a try, and my RX 6700 XT and R5 5600X are pushing over 140 pure frames on a 165Hz monitor in all games ON ULTRA; that saved me a lot of money and nerves. I also wanted to thank you for everything, your videos helped me a lot. Keep it up man, you are doing God's work.
@@AncientGameplays Ah, you're right, it might have something to do with the GPU usage then, because it seems like it starts messing up when the GPU is at full capacity.
Lossless Scaling works much worse for me than the built-in AFMF2. When using x3 or x4 mode in LS, performance does not increase at all, the picture is ruined by x10, and in x2 mode it makes 90 frames out of 60 at best, while AFMF2 makes a stable 120 frames and degrades the picture less. I use a 780M (7840HS).
@@gingerbread6967 Because Lossless Scaling works with any window. A single $7 app that works on nearly every GPU and in any game, versus DLSS, which requires a mid or high-end GPU and a supported game. FSR is ugly, imho. There is no magic, but great value from an amazing tool does feel like magic in today's economy of proprietary technologies. People seem to recognize that.
gahhdaamnit, that "Snoop Dogg WHO!?" always gets me 😂😂😂😂 Thanks, I got it today and have been using it. Not perfect by any means, but I'd rather have smooth play at the sacrifice of some artifacts here and there. Also, x10 and x20 did the exact same thing to me as they did to you on my first run; I then had to exit the game, and when I returned, x10 and x20 were working as they should, without that weird artifact effect, and I was getting the right amount of fps.
Just updated, can't wait to try! lol @ "little wet". I've been locking Diablo 4 at 4K 30 FPS using LS frame gen. It keeps my fans quieter and looks even better now too! Finally got to try the new version.
Btw, 30 or 40 fps can feel smooth in a game when you enable motion blur; the higher the intensity, the smoother the game will feel and look. Trust me, I used to hate motion blur, but now I love it since I discovered this trick. It's the console experience, after all.
I tried this version myself in Elden Ring and Dark Souls 3, since they are capped at 60FPS and using mods to unlock the framerate messes up the game mechanics (for example, running in slow motion). With Lossless Scaling I had a lot of artifacts even at 2x using version 2.1, but now with version 3.0 I can barely see any even at x4 (on a 240Hz screen), and the latency is reduced so much I don't even notice it; I can dodge any attack. It's truly a huge improvement.
NOW I am fixing the mistakes I made in the last video :D
Testing X10 and X20 Frame generation as well
So a little note, sir Fabio: if you wish to see the x5, x10 and x20 results, you need to increase the max frame latency. I'm not really an expert, but as I understand it, it raises the limit on how many frames can be rendered. On an AMD GPU, if you set it to 3, the allowed rendered frame rate is double your monitor's refresh rate.
Example: you have a 144Hz monitor and max frame latency is 3, so 288fps is the max that can be rendered. (On an NVIDIA GPU, a max frame latency of 2 gives double; I don't know why AMD needs 3 for double.) If max frame latency is 1, then only your monitor's refresh rate is rendered. It should be tripled if you set max frame latency to 3 (NVIDIA) or 4 (AMD), so 144 x 3 = 432fps.
So basically, you should try increasing max frame latency for x5, x10 and x20. Of course, this is just my understanding; if I've got the explanation or the facts wrong, do forgive me.
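The arithmetic being claimed above can be sketched as a tiny function. To be clear, this only models the commenter's own speculation: the vendor offsets and the refresh-rate multiples are their claim, not documented driver or Lossless Scaling behavior.

```python
def claimed_fps_cap(refresh_hz: int, max_frame_latency: int, vendor: str) -> int:
    """Max renderable fps under the commenter's claimed rule:
    NVIDIA hits 2x refresh at latency 2 and 3x at latency 3;
    AMD needs one step more (2x at latency 3, 3x at latency 4)."""
    offset = 0 if vendor == "nvidia" else 1   # AMD lags one step behind, per the claim
    multiplier = max(1, max_frame_latency - offset)
    return refresh_hz * multiplier

print(claimed_fps_cap(144, 3, "amd"))      # 288, the comment's example
print(claimed_fps_cap(144, 3, "nvidia"))   # 432, i.e. 144 x 3
```

Under this model, a latency of 1 always caps rendering at the bare refresh rate, which matches the comment's last point.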
You still made a few mistakes, but this is much better than the last video you recorded. You need to increase the max frame latency setting to higher values to raise your fps with x10 and x20 (or any frame gen multiplier where you don't get the fps you want even though the GPU isn't maxed out); that's why it wasn't going higher. Also, 4K is very demanding with LSFG, so you need to lower the resolution scale to much lower values, like 50 percent. Even then you won't see any problems in the image most of the time: it only lowers the optical flow resolution, it doesn't reduce the game's resolution. With a resolution scale of 50 or so, you'd lose much less fps and get a much better experience. I appreciate you trying again though; thanks for your time.
So now we can have a better experience in games that don't have built-in frame gen, like Jeraldo o feiticeiro.
@@joaocardoso9031 You can have a better experience even in games with built-in frame gen, since LSFG can go to much higher multipliers than other frame gen techs (up to x20). Other frame gen techs only go up to x2, or x4 with 5000-series NVIDIA GPUs (if the game supports it).
To generate the frames, your GPU still needs enough power left to run them, so turning the settings down to low for 10X and 20X would give you better results and way fewer artifacts. That's why it works better when you cap the frame rate to something the GPU can handle. Great video, hope you get better!
Now I get 480fps from 60 in Cyberpunk at 4K. I think my RX 7800 XT = 5090, by NVIDIA logic.
XD
Now my 3070 is a 4080 super
My god you've cracked their code
Now my 4080 is 8080
fr
now i get 600 fps in cyberpunk path tracing at 8k on my nokia 3310
therefore its better than 5090
@@jonas314ano therefore correct you are indeed
Correct
Impossible with AI
Love it 😂😂😂
can confirm. he did
Nvidia 2026: you only need 1 frame to run the entire game!
Nvidia 2027: we can directly generate frames in your brain. 2GBs of vram are enough.
Sounds like a win tbh
Nvidia:This is fake news, when we get access to the brain we won't need any vram
oh yeah!
I wouldn't complain getting 240hz raytracing on cyberpunk 2077 with my gtx 770 😂
what if only 500KB brain? pls help
I'm a fan of x20; now I don't need to do shrooms anymore to experience visual artifacts... haha
Shrooms are way more than visual artifacts.
Otherwise it would be enough to just look through a kaleidoscope.
@@cajampa It's just a joke, brother
@@99mage99 Of course it is. But people who don't know might think most of it is a visual effect, and nothing could be further from the truth. I'm only saying this for those who don't know but are curious.
LSD: it's less of an ass than DLSS.
I've never used anything in my life except for weed (for medical purposes, totally), but x20 is what I imagine acid trip would look like.
This new version of Lossless Scaling is absolutely nuts. I've tested it with some games that were problematic with previous versions, like Wind Waker (Dolphin emulator) and Berserk: Millennium Falcon (Dreamcast, RetroArch), and the artifacts are reduced by about 90%; Link's head no longer disappears, and the ground in Berserk, especially the tiled bricks, no longer produces the same nasty artifacts as before. Input latency is improved to the point where you might as well have LSFG3 active at all times, since the experience is so much better.
Great to hear!
Yeah based I also love using LS with emus
I use it on the old OG GTA SA, which is hard-locked to 30, and the X2 mode feels like magic: no artifacts whatsoever and super smooth gameplay. For emulators and old games this is a godsend. Download more FPS; if only I could tell my 12-year-old self.
What about zelda totk?
I've been playing TOTK at 240fps for months thanks to Lossless Scaling. I still sometimes have some stuttering, but that's just because I don't know how to eliminate little frame time spikes. I've tried locking the frame rate to 40 or 45fps using Ultracam, but it never resulted in a smoother experience, so I stick to 60fps and multiply the frames by 4. I can't go back to playing TOTK any other way.
20x mode fg looks like playing on mushrooms
Ahahaha 😂
The trails are real man.
This will transform your GT 130 to a RTX 5090 wow!
My gtx 1560 is now the Upcoming RTXDLSSOD 9090 TI :)
and it's giving around 10 Native fps :D BUT WITH DLSSOD 99 It will give 30k Fps :3
Not really, you need a high enough base framerate, like 60.
And surely there will be no difference at all between a 50-series doing its 4x FG and software that costs $6 on Steam. Who cares about quality anyway?
I scaled up from 2fps to 240fps, and from 360p to 4K. Everything's perfectly fine. I can now play Cyberpunk at 4K@240Hz with my 15-year-old PC. This is awesome.
Dude are you serious? 🤔@@mal-avcisi9783
The example of a hard worker: even though he's ill, he remade the video just for a simple mistake. Great video as always, get well soon brother.
Thank you for the kind words
20x looks like an AI dreaming the game
Checkmate jensen, you got 4x while we have 20x, so thats like four 5090s by your logic jensen
And we all have a Xbox now!
@@ShFred Is that a pun on Geforce Now!? 😂
I mean, technically not because the FPS doesn't go over 202 and then just gets more and more distorted.
@narrativeless404 shh. Don't tell Jensen this. We just draw random bars on a graph with no relevance to real life.. just like him.
@GFClocked *LMFAO* 🤣🤣🤣
Couldn't care less about the mistakes, just glad you made a video mentioning it because I wouldn't have even noticed Lossless Scaling updated. I don't use it all that much but I actually tried it on Nobody Wants to Die at 4k max settings capped at 30fps just to see how it looked and felt going from 30 to 60 fps and I ended up playing the entire game like that.
I played that game too haha
I do love these technologies. So now the name of the game is to get to 60fps +10%, lock to 60 and then double up to 120 (I have a 120Hz monitor). There's life in the old 6800XT yet!
Or if you have a crap GPU, the goal is to get 40 fps and lock to 30 + lossless scale to 4X
The old 6800XT is running Stalker 2 at 1440 getting 90-100fps with ultra settings. LSFG 3.0 is icing on the cake.
Old lol.. what I have to say with my 2070?
Afmf2?
6800XT old? lol
Lossless Scaling be pulling a Kaioken with its multipliers 😂
15:10 That's an awesome tip in the upper right! Thank you for sharing all of this!
This is how the GT 710 will be faster than the 4090 - using multi frame generation
my vega 8 too 😎
Every GPU is RTX 5090 now
But the 1080 Ti is better because of the lower TDP, and I have 11GB, nearly the same as the 12GB 5070 (which is now a 5090), but my card is a lot smaller!
For those who love modded Skyrim, this tool is a game changer.
How do you use it?
you can now go nuts with an enb
It doesn't work with Community Shaders for me, I get a black screen
@@fnfal113 please dont.
@@LeoEichdi That makes no sense, as LS only takes a live capture of your screen and injects frames or upscales. You would get a black screen with or without it.
This costs like 7 dollars; NVIDIA's new GPUs are 500+ for the same thing. Hmm... tough decision.
Not the same thing haha. NVIDIA's is much better; this is just an algorithm.
I bought it for $4 or is it depend on the country?
The way it is right now, Lossless Scaling can't compare at all to NVIDIA's frame gen technology. As Ancient said, this is an algorithm, while NVIDIA's frame gen uses in-engine data alongside machine learning to predict motion accurately.
Now, I'm not a fan of ANY frame gen, and I'd rather have a real 60 or 120 fps instead of invented fake frames, and I'd rather devs spend time actually building their engines and games to work properly on the mighty powerful hardware we all own today.
Just go play Days Gone and then play any modern game, and tell me it makes sense that Days Gone runs the way it does on a PS4 while modern games run like crap on much, much more powerful modern hardware.
its $3.5 USD for me
nvdia fg wayyy better
You can also use Lossless Scaling on YouTube videos and Twitch livestreams; setting FSR to max sharpness will make the image way less blurry, especially when watching video games.
Thanks for the tips!
Frame gen also works too, pretty nice if you are watching some fast-paced FPS game video, for example.
Stated that as well
Without Premium, videos are 60fps on YT, so this is really helpful.
My headcanon is that the x20 FG mode in Alan Wake is part of the lore.
It's Scratch changing the story
🤣🤣
Looking at the ingame effects before frame gen it makes sense lol
I love this app for watching movies with frame gen; in movies with CGI characters, it makes them blend into the scenes with the real actors more.
Small correction: more base frames is better because the FG has LESS to predict; it has more information to begin with, so it's easier.
It doesn't have less to predict, it's the same percentage-wise, no?
@@AncientGameplays Technically, if you render more native frames, the difference between each one is smaller, so it's easier to predict the fake ones.
It doesn't make any sense. With frame gen you are predicting what's between one frame and the next. Whether you've got 10 frames or 500 frames, you are still predicting between one frame and the next, not between one frame and 20 others. The results are the same; the only thing that differs is the movement. If you have a still image you get fewer artifacts, due to having little to no variation, while if you have a lot of movement you will get LSD.
The more frames, the closer the prediction is to both frames. The difference between them is smaller (in terms of object positions), so there is less room for error.
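A toy sketch of this point (my own illustration; this is not LSFG's actual algorithm, which uses optical flow): if a generated midpoint frame is a naive linear blend of two real frames of an object moving along a curved path, the worst-case position error shrinks roughly 4x every time the base framerate doubles, because the motion between neighboring frames becomes closer to linear.

```python
import math

def interpolated_error(base_fps: float) -> float:
    """Worst-case position error when a generated midpoint frame is a
    linear blend of two real frames, for an object moving on a unit
    circle at one revolution per second (a deliberately curved path)."""
    dt = 1.0 / base_fps                      # time between two real frames
    def pos(t: float) -> tuple[float, float]:
        return (math.cos(2 * math.pi * t), math.sin(2 * math.pi * t))
    x0, y0 = pos(0.0)                        # first real frame
    x1, y1 = pos(dt)                         # second real frame
    gx, gy = (x0 + x1) / 2, (y0 + y1) / 2    # naive interpolated midpoint
    tx, ty = pos(dt / 2)                     # where the object really is
    return math.hypot(gx - tx, gy - ty)

for fps in (30, 60, 120):
    print(f"{fps:>3} fps base -> midpoint error {interpolated_error(fps):.5f}")
```

In this toy model, doubling the base framerate does much more than halve the interpolation error, which matches the intuition that frame gen has an easier job at a higher base fps.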
Dude, back when I was doing YT stuff, I learned _very_ quickly that no matter how well I _thought_ I was pushing through the sick brain-fog, it was never enough. It ALWAYS showed in the final video. Get better! 😃
🙏💪
Get well soon!
Thank you! Considerably better now!
@@AncientGameplays Don't forget the Vitamin too!
Lossless is crazy now. In Wuthering Waves, the only ghosting I could see is when I stand still and start sprinting. Using LSFG 3.0 is a game changer in this game because it's locked at 60fps for everyone, except Ryzen 7 and Intel i7 owners.
When I played it, it ran fine at 120fps for me, it wasn't locked, and I don't remember installing an unlocker.
This "ghosting" is not due to Lossless, it's an effect of the game itself. Try disabling Lossless and test again. The ghosting will probably still happen.
Be quiet weeb
I recall the game having a 120fps option on release when I watched Tektone play it, and flamed other gacha games that didn't.
You can enable 120fps on unsupported hardware but yeah i think you should use lossless anyway
Honestly I only need x2. Now with this update I have fewer artifacts and my fans stop being noisy; resource usage drops greatly. I love this new LS 3.0🤩.
Great work broo.👍
My RX 6600 = RTX 4090
Let's go..... 🎉🎉🎉
Same, I was about to make a huge upgrade to the 9070 XT, but now I can stay on my RX 6600, because with x50 frame gen it has the performance of the AiTX 6090 that's not even released yet
Joke aside 😂. Is it good for rx 6600?
PLEASE NOTE: If you are locking the FPS globally in the Nvidia Control Panel, LS will not generate more frames than that, since it itself is also locked. To resolve this, just add a separate Nvidia Control Panel profile for Lossless Scaling and remove the lock for it only!
It's better to lock fps per game on Nvidia, or use RivaTuner.
@1LOW-s5u Yes, but some people (like me) like to have fps locked globally and then unlock it per game if required. That's why I wrote the original comment.
Appreciate your videos as always my man.
It's ok to take a break if you are sick brother. We can't be 100% all the time. You release alot of content as is. Rest up if you need
You are a hero, and that software too. Lossless scaling is cooking 🔥👑
Thanks for your great effort, hope you get better soon.
Everyone should watch this video with LSFG enabled. 🙂
Working with videos, emulators and old games (old games where the gameplay speed is tied to fps, which breaks things when forced within the game itself) is what makes this software so great.
Jensen Huang: "Write that down! Write that down!"
Glad to see a re-test! thank you very much 💗💗
Fabio, thank you so much for your hard work. In India, the first video came out in the morning, and now at 11:58pm at night another came out.
Machine haha
I have tested x20 mode with 60 fps on some games. One of the games I was averaging 60/1000fps with major ghosting and some bad but playable input delay. It felt incredibly smooth on my 280hz monitor and I think with a few more tweaks and updates it will be reasonably playable in the future.
I tried out lossless scaling on final fantasy xiv on X2 mode with LSFG 3 and get less than 15ms latency even with an online game when before the update I got obvious unplayable input lag, I'd say I'm impressed
i've tested lsfg3 x4 in first kingdom come @ high settings on my rx 6600 and i can confirm. It's simply amazing, almost afmf2 quality with double the fps, now i can easily hit 160fps with the same settings and the game plays just smooth as butter, it's incredible
I finally got my first PC, which looks brand new, for 1350€ from a guy who was about to travel to Australia, so he put a low price on it to sell it fast. It's equipped with a 14600KF and a PNY RTX 4070 Ti Super, and the other components are pretty premium too. I still haven't played on it yet but I'm excited.
You are gonna have a fkn blast on that system bro! Glad for you! Welcome to the top tier of gaming... It IS glorious here, ain't gunna sugar coat it for the console crowd.
@@emilgustavsson7310 thanks 🤣
If you haven't already, get a G-Sync monitor with a very high refresh rate, and yes, you're going to have a blast.
@@JerryOIOOIIO I already had a 180 Hz G-Sync monitor which I used with my laptop, and this GPU is a beast; I won't need to upgrade for a while. In Cyberpunk I get around 110 FPS at 1440p Ultra with DLSS Quality, around 75 FPS with ray tracing Ultra and DLSS Quality, and even with path tracing I can still get 45-55 FPS at DLSS Quality.
Same. Upgraded from a 3080 Ti laptop to a 7800X3D + 4080 Super for 1650 bucks. I think it's a good deal, hopefully, since it has 3 TB of SSD storage.
Thanks for sharing this man! Gonna try some multi GPU for some experiment with Lossless Scaling.
I thought you might make a revision. Good stuff my man you earned my sub
I've learned more about how to use Lossless Scaling than I ever have before.
TY 💯❤
Bro, this thing is wild. This video actually made me go buy it. 👍
As you should. =) The most value on Steam. By a mile or ten thousand...
Thanks for the review. This makes me want to reinstall the software again. Previously I wasn't so convinced on what it can do. But now maybe we can give it another chance to do better. 🤷🏻♀️🥴
The latest update 3.0 has completely fixed the ghosting that you'd face in games when playing at 40 or 30 FPS
Valve should add the frame gen portion as a built-in feature of the Steam Deck.
I’m not sure that’s possible. The program doesn’t even launch on Linux, no matter what compatibility thing is used. I wouldn’t hold my breath.
@aricrudd6579 That's not how that works. The software is designed with a different operating system in mind. FSR didn't run on Linux until games started implementing it, or until Steam implemented it manually in the UI of the Steam Deck.
If they wanted to get it working they could, but it could be an issue of licensing, so it still might not happen. You're not wrong about that part in all likelihood.
@@ClassicRoc87 FSR works through the graphics pipeline, if I’m not mistaken, but Lossless Scaling apparently uses a few Windows functions that can’t be handled by things like Wine and Proton. Making LS run on Linux is far more complicated than you’re implying.
@aricrudd6579 FSR 2.0 works through the pipeline; 1.0 is basically a fancy filter, which is kind of how the frame generation works in this application. Don't get me wrong, I'm not an expert, but I believe it could be implemented if they wanted it to be.
@@ClassicRoc87 Frame generation is more than a fancy filter, considering it has to actually interpolate frames to make extra frames. I think the devs of LS actually explained somewhere why they can’t/won’t make a Linux version.
But considering Proton recently added support for frame generation, hopefully someone else can do something similar for Linux. I just don’t think LS itself will ever be Linux compatible.
Man you were sick AF yesterday, you had to be motivated to make a video that sick... good that youre better today!
Slightly better yes
Great job, Fabio!
We all have our sick days, I get it! Personally, I'm SUPER dumb when I'm not feeling well. lol
Glad you're feeling better. Gerald.
🤣🤣🤣
Jerald you mean?
Tekken 8 in 60fps to 240fps, that smoothness is insane. Now i cant go back to 60 xD
🤣🤣🙏
Tekken and other fighting games like MK and SF have combat synced with fps; using FG just worsens your chance of registering a hit before the opponent does because of the additional input delay
@@melexxa true, but it does look sick for single player
Glad to see your health is improving, man.
I recorded footage from different games with OBS on a 7800 XT, and I could only get rid of the frame-pacing bug by "just" disabling HAGS. That is also something they are working on, because it's a bug in the latest OBS version. It works flawlessly with HAGS enabled and Adrenalin recording in the background, though.
Lossless Scaling was the best investment I made in 2024 :) I even bought it three more times for my nephews and showed them how to use it. Now they can kill bugs and Automatons at 90fps (base) / 180fps (x2) fixed on their 240Hz screens :D with GPU usage never going above 80%. Really awesome and easy-to-use tool if you are not playing competitive games where you need to squeeze every ms out of your input latency.
Really like how you explain it in detail and make videos about it. Once you've tried it and made it work, you can never go back.
Nvidia launches the RTX 5000 series: "Now we have x4 Frame Generation"
LS: "huh, rookie numbers"
I really respect and appreciate you for rectifying your mistakes. I also recommend using VRR (G-Sync or FreeSync) with LS and setting sync mode to "Allow tearing" for even lower latency. Capping fps gives better latency too, and if you have an old GPU and run a dual-GPU setup it is just crazy broken OP. ALSO, IT IS HIGHLY RECOMMENDED TO LOWER THE RESOLUTION SCALE WHEN RUNNING AT 4K OR 1440P (Nvidia does that too): 50% for 4K and 70-80% for 1440p.
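On the resolution-scale tip: the slider only lowers the resolution of the optical-flow pass, not the game itself. Assuming the setting scales each axis linearly (my reading of how the slider works, not something confirmed from the app's docs), the flow pass ends up working on a grid like this:

```python
def flow_resolution(width: int, height: int, scale_pct: float) -> tuple[int, int]:
    """Approximate resolution the optical-flow pass runs at; the game's
    own render resolution is unaffected by this setting."""
    s = scale_pct / 100.0
    return round(width * s), round(height * s)

# 50% at 4K and 75% at 1440p both land near a 1920x1080 flow input,
# which is why the quality loss is hard to notice at high output resolutions
print(flow_resolution(3840, 2160, 50))
print(flow_resolution(2560, 1440, 75))
```

That's the intuition behind the 50% (4K) and 70-80% (1440p) recommendations: the flow grid stays around 1080p-sized either way.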
Just tried it today.
The artifacts are significantly reduced.
Ghosting is still visible but only if you look hard enough.
And if you don't have enough base fps
Have u tried it in Alan Wake 2?
I had no idea about this software man.
Thank you for covering this
Thanks to LSFG 3.0 I was able to play Silent Hill 2 Remake at 60fps with my 1070, and the experience was impressive
Jerralt is tired of living in the shadows. He finally gets mentioned and you have the audacity to laugh. He will have his day.
I believe x6, x10 and x20 don't work correctly because you would need a much higher monitor frequency. Other than that, pretty good video! If I may recommend: a 50% resolution scale (optical flow scale is the more appropriate name) still looks really good with a 4K target and runs much better; 100% is only recommended at 1080p, honestly.
The monitor doesn't matter here, I was many times above my refresh rate across this video. I might be wrong though
X3 is enough for most people.
Actually, I believe it may not be working correctly because you are hitting 100% GPU usage. If there is not enough headroom for Lossless Scaling to do its work, then it may drop frames resulting in artifact and latency hell.
When 6x, 10x, and 20x were enabled, the framerate counter also indicated that it did not reach the target (generated) framerate. This means that Lossless Scaling was dropping a lot of frames.
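The arithmetic behind that observation, as a quick sketch (the function names are mine, not anything from the app):

```python
def fg_target(base_fps: float, multiplier: int) -> float:
    """Intended output rate: every real frame plus (multiplier - 1) generated ones."""
    return base_fps * multiplier

def dropped_share(base_fps: float, multiplier: int, measured_fps: float) -> float:
    """Fraction of the intended output frames that never got delivered."""
    target = fg_target(base_fps, multiplier)
    return max(0.0, 1.0 - measured_fps / target)

# 60 fps base at x20 should read 1200 fps on the counter; if it only
# shows ~200 fps, roughly 5 out of every 6 intended frames were dropped
print(dropped_share(60, 20, 200))
```

When that dropped share is large, the generated frames arrive unevenly, which is consistent with the artifacting and stutter seen at the high multipliers.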
Hi there, love your videos. I wanted to mention that you can select the beta version from the Steam launch settings and Lossless Scaling will be on version 3.0. Not sure if there is any difference, but worth checking out!
You really should check out Lossless Scaling with two GPUs; it's a massive game changer considering it can be done with a budget GPU
Why use 2 GPUs?
@Savitarax It means no performance penalty from enabling lossless scaling, I do a before and after comparing them here ruclips.net/video/gH359ZNxvNk/видео.html
This video actually convinced me to try Lossless Scaling!
I have a 165Hz monitor, and the highest I can go without feeling disgusted is 6x at 27.5 base fps for games and 7x at 24 for videos; I think that's pretty good.
Anything over x3 is too much for me
7x is crazy
@@AncientGameplays same
Love your content man, keep it up :D.
Hello there !
One thing to mention...
You can get the Beta version of Lossless Scaling, which has the new UI,
by going to your Steam Library, then Lossless Scaling's Properties, and selecting the Beta program version as "Beta"
I have a review version. Will test the beta
Glad you got better today!
That major artefacting you got at 10x and 20x was mostly caused by the application being limited to somewhere around 200fps rather than by the frame generation quality itself. I've seen that exact 'slurring' artefact happen whenever LSFG attempts to render more than a software-imposed frame rate cap. Also, yeah, sometimes it'll randomly cap the fps at some number that bears no relation to the monitor's refresh rate or anything like that, like what's happening with your LSFG being capped to something like 200fps, and I still don't know why.
Yup
The LSFG is capped, probably due to the "Sync mode" being set to "Default", instead of "Off (Allow tearing)" in Lossless Scaling.
great video! I was expecting something like "let's cap it at 3 fps so in 20x mode we'll get 60 fps" :DD
Along with some of the mistakes related to your tests, another one was that you said that LS requires the game to be in borderless windowed mode. Actually, LS works fine in windowed mode as well; it doesn't have to be borderless. This might be relevant to people who want to change their resolution in game, as some games don't allow changing the resolution in borderless mode, only in windowed mode, for some reason.
I think he meant you just can't do exclusive fullscreen; most players play in fullscreen or borderless.
I mean...nobody plays in windowed mode man...
@@AncientGameplays When you use windowed mode and LS, LS fullscreens the game for you.
Just when i needed it, thank you Fabio!
What you should do is lower the resolution to reduce GPU load, because your GPU load is at 100 percent, which is why the GPU can't generate more AI frames. I played Fortnite capped at 60 fps at 1440p; when I used 10x frame gen my fps decreased to 35 and I got 350 fps. I reduced the resolution to bring GPU load down from 90% to 78%, and now I have 600fps
the GPU being at 100% is not the issue lol. I can run the GPU at 100% and 50 FPS and then double or triple the frames and it will still work
Bro. I just got 1000 fps using Lossless Scaling. Insanely warbley but still. 1000 is 1000fps lol
@@AncientGameplays Hey... u still there?
How that honest and human intro just gained you more fans like you were using 10x Pisco-Scaling! 😂 keep it up Fabio; all the best to you! 💪
Lossless Scaling is a must-have. In Marvel Rivals I get 120ish frames on average, but with Lossless Scaling I get a solid 240.
Dude, isn't that an online game? Why would you want more latency in an online game? Isn't 120 hertz smooth enough?
@@alperen1383 120hz feels kinda meh compared to 240hz, but I wonder how the latency compares
@@alexmaldonado2804 dude it's all about your eyes getting adjusted. 120 hertz feels smooth as butter even 60 is good
@@alperen1383 Yeah, if I look at 120hz for a while, especially my OLED, it looks pretty decent. But I say it looks choppy sometimes because when I drop straight from 360hz (my monitor) to 120 it looks laggy so I can get why the other guy would want 240hz smoothness
@@alperen1383 I'm a competitive gamer and I really don't even feel the input latency. As long as you lock your frames like the video shows, it has the same input lag as DLSS. I would much rather play a fast-paced competitive game at 240hz than 120hz.
Tested this app on the 2D games from my Contra Anniversary Collection. Managed to get them to run at x10 without artifacts on my 165Hz monitor (beyond that, my monitor's refresh rate can't keep up with so many extra frames). It locks at 165fps, but the animation is muuuuuuch smoother and the edges of the sprites look a lot less jagged; the motion clarity adds a lot of antialiasing. Lovely app, excellent for any game and for future hybrid handhelds with high refresh displays.
Well this is going to get crazy
indeed haha
You're running into the limit of what you can calculate with the x6 mode and above. If you tried it on a 4090 it would probably go a lot higher.
Something interesting you could try is to use a second GPU (you don't need anything crazy) as your display out and LSFG calculation card and use the 7800xt to render the game. That will work and it should let you use FG both with little performance impact to the game and the full performance of FG without it having to compete with the game for resources. The more powerful the secondary card, the more intermediary frames it should be able to calculate. If my 4090's massive cooler didn't cover my second PCIe slot I'd try it myself.
What do you mean by not anything crazy? What are the options?..
@@p_serdiuk Probably even something like a GTX 750 would work, because it's fine to have it at 100% usage as long as it's the one doing the calculation, but preferably something like a 1050 or higher
Always following your channel, nice video Fabio!
Do not lock fps with RTSS or the driver app; it will increase input lag significantly. Lock FPS from the in-game engine instead.
It has been proven several times that depending on the game, the in-game fps lock can actually have more input lag. And not all games let you lock fps to a specific number
Wait really?
I watched a video guide before for locking FPS, and it showed RTSS being the best one for frame time stability compared to in-game or gpu app fps lock.
I use the amd adrenaline for mine
RTSS has an option to override the default FPS Limiter implementation and use Nvidia Reflex instead. Not only that, the tool description says that this is the recommended option when limiting frames using Frame Generation, and it is even compatible with games that do not have native Reflex support. Which means that in addition to limiting FPS, this option also brings all the latency benefits of using Reflex.
I believe this is an option exclusive to Nvidia cards, but still, you can't simply say that limiting FPS via RTSS is worse, because in these cases, RTSS is simply the best option.
@@fedrecedriz6868 frame rate stability ≠ lower frame time
Nice information, it helps a lot to save my old GPU instead of buying new stuff
I think your GPU hitting 100% might be why you aren't getting more generated frames. It's great that the frames are being generated on the GPU instead of CPU.
3.0 is practically flawless on Tarkov. Literally no more gpu usage and no artifacting that I’ve noticed. Really great update from LS devs. Fantastic on helldivers too
We don't even need 5090 for this. This is crazy.
I tested and found the same. I think the frame generation impact is pretty massive if you are close to the max GPU output or using too high an FG multiplier. The best-case scenario remains a locked 60fps with x2 or x3. Using 60 FPS locked in Warhammer 40,000: Darktide, with x3 and my cap being 144Hz, it works very well.
GPU makers hate him! One simple trick to keep your graphics h̶a̶r̶d̶ card!
This is so good man
It's extremely amazing
7:40 that was a very attractive laugh i understand how you pulled that baddy now
Hahaha
Just out of curiosity, are you using Lossless Scaling on a daily basis or just for video purposes? For example, I can't live without it, to be fair.
I was thinking about upgrading my rig for smoother 1080p gaming, but when you showed us Lossless Scaling I decided to give it a try, and my RX 6700 XT and R5 5600X are pushing over 140 real frames on a 165 Hz monitor in all games ON ULTRA. That saved me a lot of money and nerves.
And also I wanted to thank you for everything, your videos helped me a lot.
Keep it up man, you are doing Gods work.
You need to turn off the sync mode or else it caps fps to around the monitor's refresh rate
No it doesn't. My refresh rate is 138. It capped at 200-205
@@AncientGameplays ah, you're right. It might have something to do with the GPU usage then, because it seems like it starts messing up when the GPU is at full capacity
I have an RTX 4070S, and with this software I can finally play games at max graphics with more than 60 fps. ABSOLUTELY AWESOME
10:44 with 10x its more like "Lost In Scaling"
This is a lifesaver if you are running the Lorerim modlist
Thank you Fabio! Lossless scaling or AFMF2?
Ls generally
Yes, I'm also interested.
@@AncientGameplays why exactly?
For input latency, AMFM2.
For overall smoothness while in motion, Lossless Scaling.
Lossless Scaling works much worse for me than the built-in AFMF2. When using x3 or x4 mode in LS, performance does not increase at all, and the picture is ruined at x10. In x2 mode it makes 90 frames out of 60 at best, while AFMF2 makes a stable 120 frames and degrades the picture less. I use a 780M (7840HS).
Native DLSS and the latest FSR are better than Lossless Scaling, but if you don't have either, Lossless Scaling is a good crutch
in-engine frame generation is always better and always was. But LS is very much on par quality wise, just not on latency
The draw, if I'm not mistaken, is that Lossless Scaling's upscaling and frame generation can be used with any game and any DirectX version, while the others are limited.
@@JerryOIOOIIOyes
@@gingerbread6967 Because Lossless Scaling works with any window. A single $7 app that works on nearly any GPU and any game, versus DLSS, which requires a mid or high-end GPU and a supported game. FSR is ugly, imho.
There is no magic, but great value for an amazing tool does feel like magic in today's economy of proprietary technologies. People seem to recognize that.
In Stalker 2 the in-game FSR3 FG is awful for me; using LS I get fewer artifacts...
gahhdaamnit that Snoop Dogg WHO!? always gets me 😂😂😂😂
Thanks, I got it today and have been using it. Not perfect by any means, but I'd rather have smooth play at the sacrifice of some artifacts here and there. Also, the x10 and x20 did the exact same thing to me as they did to you on my first run. I then had to exit the game; when I returned, x10 and x20 were working as they should, without that weird artifact effect, and I was getting the right amount of fps
14:47 Fabio having an organism!
Just updated, can't wait to try! lol @ "little wet". I've been locking Diablo 4 at 4K 30 FPS using LS frame gen. Keeps my fans quieter, and it looks even better now too! Finally got to try the new version.
Would like to see some WGC vs DXGI capture research for this app
Btw, 30 or 40 fps can feel smooth in a game when you enable motion blur; the higher the intensity, the smoother the game will look and feel. Trust me, I used to hate motion blur, but now I love it since I discovered this trick. It's the console experience, after all.
What program are you using for monitoring your ping?
what ping?
@@AncientGameplays he means: what is the program that monitors your fps, clocks, GPU usage, frame latency and so on? I am asking that as well :D
MSI Afterburner guys
@@kyrb5927 Yeah i meant latency (ms) thank you
@@kyrb5927 MSI Afterburner doesn't work with all those specs; literally the 1% lows and frame graph never work
I tried this version myself in Elden Ring and Dark Souls 3, since they are capped at 60FPS and using mods to unlock the framerate will mess up the game mechanics (for example, running in slow motion). With Lossless Scaling I had a lot of artifacts even at x2 using version 2.1, but now with version 3.0 I can barely see any, even at x4 (using a 240Hz screen), and the latency is reduced so much I don't even notice it. I can dodge any attack; it's truly a huge improvement.