I wish you could get your maximum refresh rate on your monitor; wanting to show us while you being limited to 120hz/fps is a bit sad. Any chance you'll get this corrected and/or even as far as buying a 165hz+ monitor in a near future... maybe? 😄
i use this tool to watch YouTube vids at 120fps and it's insane bro (turn on vsync in frame gen because it's gonna lag like hell since it's trying to double your monitor's hz, and vsync fixes that)
9:08 Zelda BotW, or really any Nintendo console emulator, is a bad example, because if you're already emulating it, the emulator usually comes with framerate unlockers by default. But still a great video about a great technique to get more frames in games!
The only problem with Lossless Scaling is the latency. Other than that it is like black magic or something, especially since it's the first proper iteration from a very small developer. It's insane that it's better than AMD's Fluid Motion Frames.
You convinced me to try it on my GTX 1660, but after the last part, where it drops fps to generate more fps... well, I ain't so sure anymore. I was planning to play newer games once I get a better setup, and I think I might just do that. But on the other hand, I emulate lots of games, like Bomberman 64 at the moment. So I might need to try it out.
So they are actually inserting interpolated frames using machine learning. That's amazing, so much better than BFI, since with BFI you're losing 50% input fidelity, same with typical frame interpolation. The proof will be using it with my CRT monitor; if it can pass that test, then it's legit. I'll be honest though, I don't know why the hell Nvidia or AMD doesn't just produce a smooth rolling-scan mode. This would completely trump all these methods. A line-by-line scan mode would negate so many of the common frame issues, and best of all it would mean anything from as low as 30 FPS could be enjoyed with smooth motion and input fidelity. I mean, Sony managed it over 10 years ago with their OLED BVMs. I'm sure Nvidia could produce a modern raster-scan mode for OLED displays; if not raster-scan, which is the gold standard for frame modulation, then use focus-field drive as the basis for frame modulation.
So after seeing the bit about this tech needing resources and causing performance loss, I wonder if then instead of using the GPU's resources, it could use a separate machine-learning card, like an M.2 Coral module or something like that.
I've tried to use this with 7 Days to Die in multiple configurations, and most seem to give a higher frame rate, but all of them are absolutely unplayable, as everything feels a bit slower, to the point that jumping is almost a space jump of sorts.
We don’t even need to download RAM; almost every operating system automatically repurposes a portion of your SSD to use as spare RAM when your normal one is full.
@@DesocupadoXtremo Cyberpunk with frame gen working on my 7900 XT is a godsend. Shame on CD Projekt Red for not implementing it yet. Running tons of graphics mods with no problem.
Especially with 3 wives but only 2 houses, implying 2 out of those wives are living in the same house with the dude, either having been convinced it's okay or completely oblivious to not being the only wife in that household.
Another insane functionality of this app is its ability to frame-gen cloud games. I discovered it trying Sea of Thieves on Xcloud because I didn't want to wait for the download, and 1080p60 was too low, so I enabled it and, god, running a game on another machine, at another resolution, with another fps, felt like a native 1440p120 game. Technology really moves fast. Yesterday I read a comment from a guy who streams Bloodborne from his PS5 to his PC and runs it at 60fps with LS 😅 the man is a genius
As a 34 year old gamer who has been on the PC since the days of Doom, lemme tell you, messing with settings is the first thing i do, then i head over to the Nexus to start breaking shit
People don't hate frame generation, people hate Nvidia's frame generation, because they sold it as if it could only work on the RTX 4000 series, and as if their (significantly more expensive) cards were better than the RTX 3000 series and AMD's because the "fake frames" mean better performance.
@@GamingArtifice thankfully it's pretty easy once you know where to look:
1. Launch RivaTuner Statistics Server (RTSS for short, if anyone doesn't know) and click Add on the bottom left.
2. Find your game exe and add it to the list.
3. Select your game exe in the list and click Setup near the bottom middle.
4. In the General tab, scroll down to the section titled "Compatibility properties".
5. Tick "Enable framerate limiter" and in the dropdown menu select NVIDIA Reflex.
That's the issue. Instead of being used as a tool for low-spec gamers - it's used by developers as a crutch. Why bother with optimizations, when you can just fake it with framegen, right? That's the annoying part, but I guess it was to be expected. :/
for me it's a blessing in heavily CPU-bound games or mods (e.g. with a lot of NPCs, explosions, etc. in the scene at the same time). And, kind of unexpected, but I'm really glad it works with YouTube video. Boosting 25 or 30 fps video to double or triple is a blessing for the eyes :)
There are many games that need a fast CPU, especially in certain scenarios. I use a 5800X3D paired with 3080ti and, for example, in Serious Sam: Siberian Mayhem, I get 50-70% GPU usage (@100+ FPS) most of the time and even lower especially in heavy battles in NG+ mode, when the FPS drops even below 40 😢 and GPU usage drops even lower, because the CPU isn't fast enough to maintain a decent FPS (...let's say over 60). This happens even if I lower the CPU speed settings in the graphics/performance menu. Of course, not so badly, but still significant.
A theory on how Lossless Scaling works: when you start frame generation for a certain DXGI layered window, Lossless Scaling intentionally takes 18-22 frames off the game so that it can train on previous frames better; the frames that have been taken translate into Lossless Scaling having more priority to make frames in between 2 real frames. As for the input lag, the principle is just as it was said: Lossless Scaling again uses previous frames and trains on them to predict the next frame when the input gets sent. It could also be that the next frame, depending on the input, disables itself on the first contact with the key. Be it a left click or any button of that sort, as soon as you press it, the next frame will NOT be frame generated, and for a slight millisecond Lossless Scaling will disable itself to prioritize your input rather than generating a frame. After your input gets sent to the PC, then and only then would Lossless Scaling continue generating frames. That's my theory for why it's so good.
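LSFG's actual model is proprietary, so the above stays a theory, but the basic idea of "a frame in between 2 real frames" can be sketched with the most naive interpolator possible, a per-pixel blend. Everything here is purely illustrative, not LSFG's real algorithm:

```python
def blend_midframe(prev_frame, next_frame, t=0.5):
    # Naive interpolation: per-pixel weighted average of two real frames
    # (frames here are just rows of grayscale values). Real interpolators
    # estimate per-pixel motion instead of blending, which is how they
    # avoid the ghosting this approach produces on moving objects.
    return [
        [round((1 - t) * p + t * n) for p, n in zip(prow, nrow)]
        for prow, nrow in zip(prev_frame, next_frame)
    ]

frame_a = [[0, 0], [0, 0]]          # real frame N
frame_b = [[100, 100], [100, 100]]  # real frame N+1
print(blend_midframe(frame_a, frame_b))  # [[50, 50], [50, 50]]
```

The midpoint frame (t=0.5) is what an x2 mode would show between two real frames; x3 would show frames at t=1/3 and t=2/3.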
About LSFG performance, here are some tips: LSFG's frame gen works at the screen resolution of your monitor, so the fewer pixels your borderless fullscreen has, the less processing it takes to function. So if you use LSFG at 4K, be careful; I'd recommend using a scaling resolution instead (Image Scaling: On, for Nvidia drivers) or using LSFG's built-in image scaler on a smaller borderless window. If you use the game's resolution scale instead, or even DLSS/FSR Performance at 4K, LSFG won't recognize the lower resolution; for the app it's still a 4K image it has to interpolate, and that's too much work. So make sure to get your screen resolution itself, or at least the game window resolution itself, under control.
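Assuming interpolation cost scales roughly linearly with the pixel count LSFG sees (a simplification consistent with the tip above, not a measured figure), the savings from feeding it a smaller window are easy to estimate:

```python
def relative_fg_cost(width, height, ref=(1920, 1080)):
    # Rough relative cost of interpolating one frame, normalized to 1080p,
    # under the assumption that cost scales linearly with pixels processed.
    return (width * height) / (ref[0] * ref[1])

print(relative_fg_cost(3840, 2160))  # 4.0 -> native 4K is ~4x the work of 1080p
print(relative_fg_cost(2560, 1440))  # ~1.78x
```

This is why in-game DLSS/FSR doesn't help LSFG: the window it captures is still the full output resolution.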
Imo LSFG is for old GPUs. If you already use a 4K monitor, that means you must be using an RTX or RX series card. Then there's no point using LSFG rather than DLSS and FSR, because LSFG is far worse than DLSS and FSR, since LSFG just applies AI post-processing to your graphics instead of being baked into the rendering process itself.
Small developer, been around for years with image upscaling. But frame generation is a huge thing. They're doing such a great job, it's working mindblowingly well.
it works fine on an RX 590 and 60 Hz monitor. i can't get a fixed 60 fps in a lot of games, so i capped my fps to 30 and used 2x mode; the result is super good and smooth, i like it so much. i wanna test 3x on my 15 year old laptop, so yeah xD Edit: didn't work on my old laptop, maybe it's too old, idk
One of the things that ticks me off about the PC gaming community is the fact that AMD makes features free for gamers but the public turns their back on them. They smashed the 4-core barrier for us all, and people quickly forget. They introduced FreeSync for all, so we did not have to pay hundreds extra for a PROPRIETARY chip in monitors for G-Sync tech, forcing everyone onto an Nvidia GPU. Monitors got so much cheaper for variable refresh. Now they open source something for everyone again (as many times previously). We get frame gen for all. People will still pull up a chair to suckle at the teats of Nvidia. I hope AMD says F it and pulls out of GPUs, and Nvidia doubles their prices AGAIN. We all deserve it. LOL
You seem to be forgetting that it was Nvidia that brought all these technologies to mass market in the first place, and that their solutions are always better than AMD's 'free' versions. That's why people buy Nvidia: because they are first and better.
@@TransoceanicOutreach HUH? You seem to ignore all the shady business NVIDIA has conducted even during the GPU mining epidemic. Pay 3x the value, forcefully limiting gpus and so on. It's like all the people saying Apple is better than Samsung, absolutely hilarious brainwashed sheep buying into propaganda. lmaooooooooo
This is a bad take. You should always choose what is best for your uses. AMD is not a tech startup or small company, if they want people to buy their products they can make them better. And if NVIDIA wants people to buy their products they can make them cheaper. You seem to also forget that technologies like this allow people to use their gpus longer which reduces sales for these companies and reduces ewaste. These are both good things, corporations have significant headroom for improvement and we have both options and influence on the market. This is a win for gamers.
For the best results you should use X2 and cap your fps to half of your monitor's refresh rate. For example, if you have a 120 Hz monitor, just cap your game to 60, enable Vsync, then use LSFG 2.1 X2, and you won't believe how smooth the result is. If you use X3 and your game already gives you 60-80 fps on a 120 Hz monitor, then you will see artifacts in the game. It would take a long time to explain why because it's complicated, so just take my advice as an expert and try it yourself. I have a 60 Hz 1080p monitor, and in Cyberpunk 2077 at Ultra settings I get around 50fps. Without an fps cap I get 60 with FG, but it's not as smooth as when I cap the fps to 30 and use FG 2.1 X2; that's 99% like native 60, especially when using Vsync to get rid of tearing.
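The cap arithmetic in this tip generalizes: pick a base-fps cap so the generated output exactly matches the refresh rate. A minimal sketch (the multipliers just mirror LSFG's X2/X3 modes):

```python
def lsfg_cap(refresh_hz, multiplier):
    # Base-fps cap so that cap * multiplier exactly fills the display.
    # Overshooting the refresh rate with generated frames is what tends
    # to produce the artifacts described above.
    if refresh_hz % multiplier != 0:
        raise ValueError("refresh rate is not an even multiple of the mode")
    return refresh_hz // multiplier

print(lsfg_cap(120, 2))  # 60: x2 on a 120 Hz panel
print(lsfg_cap(60, 2))   # 30: x2 on a 60 Hz panel, as in the Cyberpunk example
print(lsfg_cap(165, 3))  # 55: x3 on a 165 Hz panel
```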
It is actually pretty good, but you need to mention that it comes with a performance cost: you need around 20% headroom of GPU usage, so you'd better find a setting where your GPU load is around 70-80% so Lossless Scaling doesn't run into a bottleneck. When it does, the experience is similar to 2006-era SLI microstutters.
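The ~20% figure above is a commenter's rule of thumb, not a spec, but the bottleneck check it implies is simple:

```python
def has_fg_headroom(gpu_load_pct, headroom_needed_pct=20):
    # True if the GPU has enough idle capacity left for LSFG's own work.
    # 20% is the rule-of-thumb headroom cited above, not a measured value.
    return (100 - gpu_load_pct) >= headroom_needed_pct

print(has_fg_headroom(75))  # True: 25% idle, LSFG has room to interpolate
print(has_fg_headroom(95))  # False: expect microstutter-like results
```

In practice you create that headroom by capping the base frame rate or lowering settings before enabling frame gen.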
It also depends on your monitor: if LS generates more frames than your refresh rate can show, it will not push them. I play Hogwarts Legacy at 80fps on a 165hz monitor; with frame gen it jumps to 150fps with 100% GPU utilization and no stutters.
Well, this is why you would scale in the Lossless Scaling app and then implement it, if you were really concerned about the headroom more so than cutting back settings.
this is especially useful for modding old games like skyrim where you can lock your fps at whatever you want with enb or display tweaks so you can get more mods in without worrying about fps loss
@@vextakes Yup, just scroll down to "Preferred GPU" and select the iGPU, it will run both upscaling and FG there. The only real issue is making sure your iGPU is strong enough to deliver all the frames, but otherwise I haven't noticed any issues.
@@vextakes Oh, also, if you allow it to autostart with your PC in the settings, Steam can't detect that it's running, allowing you to use it on multiple PCs simultaneously.
I've encountered this problem today after buying Crosshair X; it's a crosshair app that goes as a semi-overlay onto your Rust game, adding a crosshair. I've tried scaling my game repeatedly with all types of settings from Lossless Scaling, even with older versions and modes; nothing helps. The scaling is being done on Crosshair X instead of on Rust, and this makes my Rust game run at 40fps because Crosshair X is what gets scaled. I've also tried first scaling Rust, then adding and turning on the Crosshair X app so that it won't be scaled; doesn't work. I've also tried disabling fullscreen optimizations and so on in the properties of the apps. Any fixes? I am desperate... A fix I would see working perfectly would be Lossless Scaling being blocked from scaling Crosshair X and locked to only scale the Rust game, but I don't really know how to do that. Need help!!!
Lossless Scaling is finally getting attention! I've been using this for a while, not so much recently because I went from a Laptop with a GTX 1650 Max-Q to a desktop with a 4070Ti but when I was on that laptop, it worked wonders.
seeing a lot of comments saying that you need some gpu headroom to use this; I'm running DayZ from 80-110fps (uncapped) on my 4070 laptop, if I get this lossless framegen tool, and cap my fps to say, 60, that will lower the gpu usage meaning I can take advantage of framegen, yes?
You described everything so perfectly. I was trying to explain it to my roommates very poorly, but you helped me get a real understanding of the technology and the side effects of using it. Thanks a lot! I use it on my 2060 for a few games, but I only recommend it if you're already over 60 FPS.
Yikes, that Windows 10 1903 requirement's looking kinda bad. I mean, it's on Steam; of course they removed Windows 7 & 8.1 from support, by Google's demands for some reason. I want a Windows 7 & 8 alternative because the built-in GPU drivers for scaling suck and we need better options. I would love 2x integer scaling for Minecraft on my laptop.
Just got it because it's 7 bucks, and if you look at these YouTube sheets we pay like 2 bucks a frame, so if it gets us 3 frames it breaks even. On Warzone at 1440p I'm going from 120 to 190 FPS. I don't feel mouse lag/drag, it feels just as snappy. Not sure what the hotkey is or if it's working? Is that it? It's magic!?
There is actually slight input lag with it turned on. I refuse to use it in certain games with fast camera movement (with the mouse) because it is a bit unresponsive.
I tried the software yesterday. In its current version, support for G-Sync was added. I own a RTX 3060 and a Ryzen 5600X combo. I game on a 180 Hz G-Sync 1440p monitor. I tried the software to measure the gains of frame generation. The games I chose were Assassin's Creed Odyssey, Assassin's Creed Origins and Witcher 3. In these games the average FPS at ultra settings is anywhere between 55 and 70, with dips around 50 and spikes around 90 before applying the software. With FG X2 I observed that the initial FPS would first be cut by a third, dipping from a steady 60 to only 40 for instance. Then FG would kick in and double the frame rate to 80 in this example. The downside is noticeable latency, significant shimmer and minor tearing in some shapes during movement. The worst shimmer was in Witcher 3 where it looked as if Geralt was hallucinating! The upside is that I was able to apply all RT options and still game at 50 fps. Quite a technical prowess if you ask me! Finally my G-Sync monitor VRR does a fantastic job at adapting to the FPS and delivering a very smooth experience. That's where my setup really shines these days.
This was a MONSTER project, hope it helps :)
Link to the tool in the description, give it a shot!
Some other things to make note of:
1) Lossless Scaling is advertised to work in DirectX games, which is a lot of games, but not all. You might be able to get it to work but it’s not guaranteed. Maybe support will expand in the future, but can’t be sure
2) if you are getting black screens try borderless window instead of full screen. That is where it is most consistent. Remember this is an external program just interpolating that it sees, so you need to give it the proper thing to look at.
Just some things I didn’t mention in the video, hope that helps
By the way, you can get a DC barrel jack to Type-C adapter. Great in-depth video.
This also works with videos, watching 30 fps videos in 60 fps.
have you tried using LSFG with other frame gens on? what if you can sextuple your fps?
This technology isn't bad as long as the basic standard of a real 60fps is achieved first. It makes a lot of progress possible for display technologies like BFI and for persistence-blur-based issues, etc. But we recently exposed blatant abuse and consumer manipulation concerning optimization in recent games that might unfortunately end up putting this software on the more cursed side, as we've explained with upscalers. And it's going to be worst with UE5:
Nanite, for which our documentary referenced proof, has much worse performance than traditional optimization and causes WORSE visual quality thanks to subpixel problems.
Nanite makes lightmaps and basic interactions impossible, and Lumen will be lower quality than UE's volumetric lightmaps, cause unnecessary TAA dependency, and force performance problems.
With Nanite, that also means you're going to be forced to endure the performance penalty of virtual shadow maps.
Because Epic Games has failed to properly fix TAA (we stated how in our video), devs will be forced to use TSR to fix the Nanite subpixel issues, which we measured to be 11 times more costly than TAA.
Because all these horrible development choices have been made regarding proper optimization, you're going to have a 30fps experience for basic clarity on hardware with MUCH more potential.
We cover a lot more issues like this in our video exploring the manufactured issues with optimization in modern games. We show damning content and PROOF Digital Foundry should have brought to people's attention years ago. One of the developers of Lossless Scaling made a community post sharing the importance of our video for gamers.
@@SimplCup Just like you can use FSR3 FG with AFMF (x2x2), yes, you can use FSR3 FG with LSFG x3 (so x2x3).
You cannot combine FSR3 FG with AFMF and LSFG at the same time however
downloading fps, what a time to be alive
😂
I combined FSR with my CRT and created an OP combo for Counter Strike 2 on a 11 year old laptop.
@@saricubra2867 Can i do it for valorant?? can you give me the recipe please? whats CRT?
@@joen4287 CRT is an old timey glass picture box.
@@joen4287 Cathode Ray Tube monitors, way better looking and lighter than the old-school bulky TVs that actually have good sound.
Counter Strike 2 includes FSR, as for valorant, use lossless scaling.
10 kids 3 wives and 2 houses
Lil bro literally called me out
Damn!
Dude did not call you out.
Bro is lying hard
@@JV7486 come to iraq habibi
I've been using it to framegen some games locked at 25fps and it still works ok. It's absolutely wild.
Cute pfp.
@@alastor8091 bro is down baaaaaaaaad
how do u handle the latency
do you even have a gaming graphics card..
nah, 25fps is unplayable, it's so much latency, at least in most games
if you want to get the full 165 fps on your monitor, try using a display port instead of an hdmi port. For some odd reason, some monitors will only put out 120 fps on a 165 fps monitor with hdmi.
great tip, i'd repost this at the top for others if i were you
it's a 165hz monitor, not fps, and it's also because most monitors don't have an HDMI 2.1 port, so they are capped at 144 or 120
@@Cooler-wh4rg 1hz equals 1fps
So true!
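The rough bandwidth math behind this thread can be sketched as follows. The constants are approximate effective payload rates, not exact spec-table figures, and the ~20% blanking overhead is a generic estimate that varies by display timing:

```python
def uncompressed_gbps(width, height, hz, bpp=24, blanking=1.2):
    # Approximate uncompressed video data rate in Gbit/s, padding the
    # active pixels by ~20% for blanking intervals (a rough estimate).
    return width * height * hz * bpp * blanking / 1e9

HDMI_2_0 = 14.4  # ~effective payload after 8b/10b encoding overhead
DP_1_4 = 25.92   # ~effective payload, HBR3 across 4 lanes

rate = uncompressed_gbps(2560, 1440, 165)
print(f"1440p@165 needs ~{rate:.1f} Gbit/s")
print("fits HDMI 2.0:", rate <= HDMI_2_0)  # False: hence the 120/144 Hz cap
print("fits DP 1.4:  ", rate <= DP_1_4)    # True: DisplayPort runs full 165 Hz
```

So on an HDMI 2.0 port the monitor has to drop the refresh rate, while DisplayPort 1.4 (or HDMI 2.1) has the bandwidth to spare.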
Sadly it didn't work for me in cyberpunk but it worked surprisingly well in snowrunner, can't wait to try it out in Helldivers and see what it can do!
Make sure to choose borderless windowed, because this software doesn't work with fullscreen mode
Helldivers 2 works fine with my slow RX 6600 8GB at 1440p mixed settings... got 80fps
@@zxcv404 alright I'll try that
@@dyris2594 did it work?
you somehow cycle the same games that i play. impressive lol
Just found your channel. Like your stuff man. Keep up the good work.
Tip: you can use LSFG on any application, such as a YouTube video or media player. You can watch a movie at 48 fps; after a while, 24 fps will feel so choppy.
how ?
@@nonbshd just scale and go back to the app u want to FG
yeah but why would you ever watch stuff at high framerates that's so stupid
@@Zephhi Yeah, it's very stupid, but messing around with the settings is fun, and the generated movie frames look quite good.
My rtx 2080 super is now a 40 series 🎉🎉😂
yes lmao
What'd be nice is Nvidia properly implementing frame-gen settings into the Nvidia Control Panel or an app like this. They should be sued, because if the tech can do it, why isn't it turned on? It's them controlling it, like they did with right to repair for years. Never forget how much blame gets put on the consumer for faulty products. Also, the latency gain from this is ridiculous. Which is why I'm saying Nvidia should do their damn job rather than just make the CEO richer; they invest in AI, the thing that built them, yet they're ignorant of this. I hope another company comes along soon that's better than both of them and cheaper. And I hope the AI stocks pop so they're forced to make things cheaper.
It has an anime one because the upscaling and frame gen work on videos on your computer and in your browser as well
It's a pretty great project. My use case is usually 60fps-capped games that are not fast paced, at x2; the faster the gameplay is, the higher the base frame rate you'll want, but 60fps at x2 is good enough for games like Elden Ring, Witcher, Hogwarts. Racing games? I'd avoid it even at 60fps, because there are too many changes on screen for it to look decent. 30fps videos of game content are also ok; IRL content, not so much.
My favorite thing to do, in the rare chance it's optimal, is use x3 at 80fps to get 240FPS motion clarity (the max refresh of my monitor). The reason I say it is rare is that I'd still prefer real frames to fake frames. If I can do 100-120 (or higher, obviously) without LSFG, then I'd prefer that to 80fps x3 LSFG, because it looks good enough without giving up any responsiveness.
bro I can download FPS now :D
If GPU usage is already 90-100%, then Lossless Scaling isn't good either. This app also uses the GPU to draw frames, and if the GPU is fully utilized by the game, then this app doesn't make much of a difference.
Next is ram
@@ghostclone8293 can be downloaded from amazon
We finally have software based BFI, I'm surprised it hadn't been done 10 years ago.
I tried it in Elden Ring, Forza 5, it works amazingly I love it :)
5:57 it really depends. For games where input latency matters, I’d rather have 60 rasterized frames than 120 generated frames.
this made my R9 380 run games in 1080p 60 without a problem
we have same gpu
it works like a charm on 2D games, like GBA, Game Boy and SNES games; they run muuuuuuch smoother. And it works on videos too. Lossless Scaling + FG x3 + performance mode + framerate locked to 1/3 of your monitor's refresh rate = a godlike combo
That's just crazy. I didn't really believe it at first. I tested it in Elden Ring but didn't see much difference. Then I hopped into Fortnite. I locked fps at 60 and turned on x3 frame generation. And guess what? It actually feels like 180fps (my monitor is 180hz). I even tried going from 30 to 90 and it felt better too.
Edit: at least for me, latency is very noticeable. At first I was excited and didn't notice it, but after some time it became pretty noticeable.
Question is: where is the machine learning happening during these frame generation?
I use a FirePro V3900 to do the frame generation; it runs at about 75% load and keeps my 1650 able to hold a solid 40fps in Cyberpunk, which plays well with frame gen up to VSYNC. A 40 base fps works much better than 30.
Lol, glad I got Lossless Framegen for free. This channel is kinda months late on this review; me, like 1 month late on discovering this tech. A little warning to RTX 2060 owners though: it will raise your temps higher on newer "AAA" games with ultra graphics, even on high.
btw you can use it for YouTube videos as well! Watching game/movie trailers and music videos in 48fps instead of 24 is crazy fr
I just set my own settings in this program, and removed latency completely.
Was thinking of side grading from a 1660ti to rtx 2060. Not really worth it in this case. My new 32" 1440p 165 hz monitor was suffocating it lol. Got an 18" CRT monitor too and it's been fun playing games at 1024x768 100hz. So buttery without stutter. Also got a 40" led tv 1080p 60hz that'll let it breathe a bit. Ty for sharing
A bigger advantage than FPS is the drop in graphics card temperature and power consumption.
Let's just make one thing very clear: that app doesn't always work. There are situations where it should work but it doesn't, and times when it either makes the game worse or completely broken. But when it does what it's supposed to, it's unbelievably good.
Try running your games in windowed borderless, cap your frames, and use it in performance mode
@@Liquid208 I always do, but the results aren't always consistent. The AI model that's used needs to be trained to work effectively across the board.
The Last of Us is a single player game, so input lag is less of an issue. Won't be using frame gen on multiplayer games like CS2. I think increasing fps using upscaling is better.
Is the Lossless Scaling app open source? Can I extract it and use it anywhere? We need an open source version that we can tinker with.
With my old RTX 2060 capped at 50fps, it feels very smooth with x3 in performance mode and allow tearing; I really don't see the difference. I do sometimes enable and disable it. By the way, 2.10 beta is coming out with some improvements ;). I can finally play Cyberpunk with many cars/people, as my CPU is lacking, thanks to frame gen. Even if the FPS goes lower than my cap, it still holds up correctly. The best thing ever for my very old graphics card.
the doki doki literature club song at 4:48 killed me hahaha, good memories, aaaand good video btw
you forgot to mention that the developer continues to update the software and impresses us after each update.
for sure, he is amazing
I wish you could get your monitor's maximum refresh rate; wanting to show us while being limited to 120Hz/fps is a bit sad. Any chance you'll get this corrected, or even go as far as buying a 165Hz+ monitor in the near future... maybe? 😄
i use this tool to watch youtube vids at 120fps and it's insane bro (turn on vsync in frame gen because it's going to lag like hell trying to double your monitor's hz, and vsync fixes that)
That 30 year old dude you described is a mormon priest
30:01 RIP to the GTX 1070. It was a legend.
9:08 Zelda BotW, or any Nintendo console emulator really, is a bad example, because if you're already emulating it, the emulator usually comes with framerate unlockers by default. But still a great video about a great technique to get more frames in games!
I just wish it worked with G-Sync. Because I don't wanna miss that.
The only problem with lossless scaling is the latency. Other than that it is like black magic or something. Especially since it's the first proper iteration from a very small developer.
It's insane that it's better than AMD's Fluid Motion Frames
this works for videos too... when you're in fullscreen mode, you can double or triple video frames...
Play at 120 FPS with no upscaling at native resolution, max settings at 1440p
Just bought the app. It just works. It’s so good.
HUGE BRO THANK YOU
it works flawlessly on the GTX 1660 Ti. Hellblade 2 gave 10 fps in a few areas; LSFG saved it.
You convinced me to try it on my gtx1660, but in the last part where it drops fps to generate more fps... well, I ain't so sure anymore.
Was planning to play newer games once I get a better setup, and I think I might just do that. But on the other hand, I emulate lots of games, like Bomberman 64, at the moment. So I might need to try it out.
It just works. Coolest buy in a while
At this point I'm convinced Jensen himself is an AI bot fathering his own kind to surpass humans lol
Cyberpunk bottle literally having a seizure in DLSS XD
Hey I’m curious if this can also work for vr
The ideal use case is games that are capped at 60 fps with GPU usage below 80%
So they are actually inserting active frames using machine learning. That's amazing, and so much better than BFI, because with BFI you're losing 50% input fidelity, same with typical frame interpolation. The proof will be using it with my CRT monitor; if it can pass that test, then it's legit. I'll be honest though, I don't know why the hell Nvidia or AMD don't just produce a smooth rolling-scan mode. This would completely trump all these methods: a line-by-line scan mode would negate so many of the common frame issues, and best of all it would mean anything from as low as 30 FPS could be enjoyed with smooth motion and input fidelity. I mean, Sony managed it over 10 years ago with their OLED BVMs. I'm sure Nvidia could produce a modern raster-scan mode for OLED displays; if not raster-scan, which is the gold standard for frame modulation, then use focus-field-drive as the basis for frame modulation.
Do you know if this works on Nvidia cards? And Helldivers 2?
It does
Wow, the screen tearing is reaaal. LSFG seems not to be plagued by it
Yeah, now x4. I get 30 fps when I play GTA 5 with NaturalVision Evolved; now I have 120.
So after seeing the bit about this tech needing resources and causing performance loss, I wonder if then instead of using the GPU's resources, it could use a separate machine-learning card, like an M.2 Coral module or something like that.
I've tried using this with 7 Days to Die in multiple configurations, and most of them seem to give a higher frame rate, but all of them are absolutely unplayable: everything feels a bit slower, to the point that jumping is almost like a space jump of sorts.
Looks like the guy telling people to "download more FPS" was right all along and we were the idiots.
Download more RAM
@@RohanSanjith they tried that in the 90's. it may work now with AI Ram management.
ahead of his time
We don't even need to download RAM; almost every operating system automatically repurposes a portion of your SSD as spare RAM when your normal RAM is full
Pass.
Rx 580 has gained 3 more years xd
I'm still chilling with my RX 570 8gb. Works really well when overclocked.
@@mongeeman might as well call it an RX 580, since RX 580s are just OC'd 570s. Sounds awesome that someone's still rocking them
hey brothers I'm also rocking an RX 590, an OC'd RX 580 lmfao
😂
Will they ever die? I had 2 of them 😅
Man, can AMD just release FSR 3.1 already haha
I'm excited for it; now that FG mods got so much easier to install, it will help a lot
@@DesocupadoXtremo Cyberpunk with frame gen working on my 7900 XT is a godsend. Shame on CD Projekt Red for not implementing it yet. Running tons of graphics mods with no problem
@@НААТ they are bought out by Nvidia
@@НААТ They've officially announced no further updates to the game, so modded FSR3 will be the only option for the foreseeable future.
The end of this June or early July. Wait for it! 🤷🏻♀️
" 30 YR OLD dude with 10 kids, 3 wives, 4 jobs, 2 houses... " where are those dudes? because I want the recipe...
Especially with 3 wives but only 2 houses, implying 2 out of those wives are living in the same house with the dude, either having been convinced it's okay or completely oblivious to not being the only wife in that household.
Don't worry, those have no time to game anyways.
Yeh wtf, I came to the comments for this one. He thinks when he's 30 he'll have all that. Looooool.
look at your dad
come to middle east then lol
Another insane functionality of this app is its ability to frame-gen cloud games. I discovered it trying Sea of Thieves on Xcloud, because I didn't want to wait for the download and 1080p60 was too low, so I enabled it and, god, a game running on another machine, at another resolution with another fps, felt like a native 1440p120 game. Technology really moves fast. Yesterday I read a comment from a guy who streams Bloodborne from his PS5 to his PC and runs it at 60fps with LS 😅 the man is a genius
FG just inserts frames into whatever is sent to your monitor. You can do this to all types of animations. Even PowerPoint animations will look 2x smoother
Yep, YouTube videos and streams, even on Discord, can be watched with this
I play Zelda Tears of the Kingdom in the Nintendo Switch and Lossless Scaling by using a capture card
@@lightthelast5416 that's even better for latency, which capture card have you got?
@@lightthelast5416 damn, that never crossed my mind, imma tell my friends that play on PlayStation and use capture cards about that
Happy that lossless scaling is getting more attention rn
It’s because of their May update. Before that, their frame gen was unusable.
absolutely right
fr bro, I play 2-3 games with it and it's much better.
@@iireaper420ii6Don't say fr
It works on Asus ally go
Best purchase of my life. Essentially upgraded my 3060 to a 4060 for the price of a burger
At the cost of insane latency, which isn't mentioned at all lmao.
@@chuggynation8275 It's really not that insane at all, I've been using it since then
As a 34 year old gamer who has been on PC since the days of Doom, lemme tell you, messing with settings is the first thing I do, then I head over to the Nexus to start breaking shit
What ?
@@G_O_A_T__W.B hes the goat basically
@@MersonEdits ohh ok 🗿🙌🏻
but do you have 10 kids, 4 jobs, 3 wifes and 2 houses, tho
@@aboomination897 2nd wife. God knows how many kids.
People don't hate frame generation, people hate Nvidia's frame generation, because Nvidia sold it as if it could only work on the RTX 4000 series, and as if their (significantly more expensive) cards were better than the RTX 3000 series and AMD's because the "fake frames" mean better performance
No I just hate frame generation period, don't presume to speak for everyone.
it literally can only work on the 4000 series, are you high
@@electricant55 Because Nvidia chose to lock it to their latest gen only. There is no reason it could not work on older cards as proven by AMD.
@@Dark-qx8rk AMD uses a completely different technology and you can't even use their framegen without FSR upscaling
@@electricant55 we can.
You can also use RTSS to inject Reflex to minimize the input lag.
How tho?
@@GamingArtifice thankfully it's pretty easy once you know where to look
1. launch rivatuner statistics server, or RTSS for short if anyone doesn't know, and click add on the bottom left
2. find your game exe and add it to the list
3. select your game exe in the list and click setup near the bottom middle
4. in the General tab scroll down to the section titled "Compatibility properties"
5. tick off "Enable framerate limiter" and in the drop down menu select NVIDIA reflex
Into lossless scaling, game, or both?
I need a tutorial on this xD
5:39 lowkey I don't think anyone REALLY hates frame gen but a lot of us hate how games are built AROUND frame gen
I do. It adds stupid amounts of input lag and creates artifacts. Not worth it at all
@@iceangelx22 it is worth it.
@@Heartsbane055 no, it's not... it should be optional. Developers should target higher fps with optimization first.
That's the issue. Instead of being used as a tool for low-spec gamers - it's used by developers as a crutch. Why bother with optimizations, when you can just fake it with framegen, right? That's the annoying part, but I guess it was to be expected. :/
@@zultriova89 it's worth it, it's just that, yes, they should make the game playable first and then have this as an extra
for me it's a blessing in heavily CPU-bound games or mods (e.g. with a lot of NPCs, explosions, etc. in the scene at the same time).
And, kind of unexpectedly, I'm really glad it works with YouTube video. Boosting 25 or 30 fps video to double or triple is a blessing for the eyes :)
What games are CPU heavy? I've been looking at mine and it's always only like 5% used, while my GPU is always at 80 or 90%
There are many games that need a fast CPU, especially in certain scenarios.
I use a 5800X3D paired with a 3080 Ti and, for example, in Serious Sam: Siberian Mayhem I get 50-70% GPU usage (@100+ FPS) most of the time, and even lower in heavy battles in NG+ mode, when the FPS drops below 40 😢 and GPU usage drops even lower, because the CPU isn't fast enough to maintain a decent FPS (let's say over 60).
This happens even if I lower the CPU-related settings in the graphics/performance menu. Not as badly, of course, but still significantly.
A theory on how Lossless Scaling works:
When you start frame generation for a certain DXGI layered window, Lossless Scaling intentionally takes 18-22 frames off of the game so that it can train on previous frames better; the frames that are taken translate to Lossless Scaling having more priority to make frames in between 2 real frames.
As for the input lag, the principle is just as was said: Lossless Scaling again uses previous frames and trains on them to predict the next frame when the input gets sent.
It could also be that the next frame, depending on the input, disables itself on first contact with a key. As soon as you press left click or any button of that sort, the next frame will NOT be frame generated, and for a slight millisecond Lossless Scaling will disable itself to prioritize your input rather than generating a frame. After your input gets sent to the PC, then and only then will Lossless Scaling continue generating frames.
That's my theory on why it's so good.
I noticed the raw frames dropped about 10-20fps once it's on
Some tips about LSFG performance: LSFG's frame gen works at the screen resolution of your monitor, so the fewer pixels your borderless fullscreen window has, the less processing it takes. So if you use LSFG at 4K, be careful; I'd recommend using a scaling resolution instead (Image Scaling: On, for Nvidia drivers) or LSFG's built-in image scaler on a smaller borderless window. If you use the game's resolution scale, or even DLSS/FSR Performance at 4K, LSFG won't recognize the lower resolution; to the app it's still a 4K image it has to interpolate, and that's too much work. So make sure to manually get your screen resolution, or at least the game window resolution, under control.
But their upscaling looks horrible compared to DLSS or even FSR 2 so it's absolutely not worth it
@@electricant55 I second this. I play in 4K with no problem
Not really though. LS1 and NIS look great. Just zero out the sharpness slider. Have you tried it yourself, or are you just making assumptions? 🤷🏻♀️
Imo LSFG is for old GPUs. If you already use a 4K monitor, that means you're probably using an RTX or RX series card, and then there's no point using LSFG rather than DLSS and FSR, because LSFG is far worse than DLSS and FSR: LSFG just AI-processes your graphics as post-processing instead of being baked into the rendering process itself.
@@youravghuman5231 you know 4k monitors existed before RTX cards right? what a moronic statement
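The resolution tip in this thread comes down to simple pixel math. Here is a rough sketch of the idea (my own back-of-the-envelope helper, not anything measured from the app), assuming interpolation cost scales roughly linearly with the pixel count of the captured window:

```python
def pixels(width, height):
    # Total pixels the interpolator has to process per captured frame
    return width * height

def relative_cost(width, height, base=(1920, 1080)):
    # Rough cost of interpolating this window, relative to a 1080p window
    # (assumes cost is proportional to pixel count -- an assumption!)
    return pixels(width, height) / pixels(*base)

for res in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(res, round(relative_cost(*res), 2))
```

By this estimate a 4K window is exactly 4x the work of a 1080p one, which is why shrinking the actual window and letting LSFG's scaler fill the screen tends to be cheaper than interpolating a full 4K image.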
there's VRR support down in the legacy sub-menu and it works well for me, dont need to cap framerates or anything. it just works!
it says "this option no longer activates VRR and is deprecated" when hovering over it.
Small developer, been around for years with image upscaling. But frame generation is a huge thing. They're doing such a great job, it's working mindblowingly well.
My IGPU still has some hope😭
Time to download it from chrome for free boys🗣️🗣️💣💣
Is there any simple solution to use the app instead of that stupid 37 min of yapping
You can also use this on YouTube videos. 180FPS on a 60FPS video is just visually stunning.
up
genius
it works fine on an RX 590 and a 60Hz monitor.
i can't get a fixed 60 fps in a lot of games, so i capped my fps to 30 and used 2x mode. The result is super good and smooth; i like it so much.
i wanna test 3x on my 15 year old laptop, so yeah xD
Edit: didn't work on my old laptop, maybe it's too old, idk
Finally a reasonable video that all gamers need worldwide.
The best coverage of this on the entire internet, unironically.
For now... I think it's going to explode soon.
@@aaarrrooo hope so!
One of the things that ticks me off about the PC gaming community is that AMD makes features free for gamers but the public turns their back on them. They smashed the 4-core barrier for us all and people quickly forgot. They introduced FreeSync for all so we did not have to pay hundreds extra for PROPRIETARY chips in monitors for G-Sync tech, forcing everyone onto an Nvidia GPU; monitors got so much cheaper for variable refresh. Now they open source something for everyone again (as many times before): we get frame gen for all. People will still pull up a chair to suckle at the teats of Nvidia. I hope AMD says F it and pulls out of GPUs, and Nvidia doubles their prices AGAIN. We all deserve it. LOL
You seem to be forgetting that it was nVidia that brought all these technologies to massmarket in the first place, and that their solutions are always better than AMDs 'free' versions. That's why people buy nvidia, because they are first and better.
@@TransoceanicOutreach HUH? You seem to ignore all the shady business NVIDIA has conducted even during the GPU mining epidemic.
Pay 3x the value, forcefully limiting gpus and so on.
It's like all the people saying Apple is better than Samsung, absolutely hilarious brainwashed sheep buying into propaganda. lmaooooooooo
This is a bad take. You should always choose what is best for your uses. AMD is not a tech startup or small company, if they want people to buy their products they can make them better. And if NVIDIA wants people to buy their products they can make them cheaper. You seem to also forget that technologies like this allow people to use their gpus longer which reduces sales for these companies and reduces ewaste. These are both good things, corporations have significant headroom for improvement and we have both options and influence on the market. This is a win for gamers.
Imagine giving lossless access to games 🤯
Dying for some form of native steamOS implementation. Going from 40 to 60 on steam deck would be a game changer.
Lossless Scaling is becoming popular on handheld PCs
For the best results you should use x2 and cap your fps to half of your monitor's refresh rate. For example, if you have a 120Hz monitor, just cap your game to 60, enable Vsync, then use LSFG 2.1 x2, and you won't believe how smooth the result is. If you use x3 when your game already gives you 60-80 fps and your monitor is 120Hz, you will see artifacts in the game. It would take a long time to explain why, because it's complicated, so just take my advice and try it yourself.
I have a 60Hz 1080p monitor, and in Cyberpunk 2077 at ultra settings I get around 50fps. Without an fps cap I get 60 with FG, but it's not as smooth as when I cap the fps to 30 and use FG 2.1 x2; that is 99% like native 60, especially when using Vsync to get rid of tearing.
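The cap-to-half-refresh advice above is just division: pick the base cap so that base fps × multiplier lands exactly on the monitor's refresh rate. A tiny sketch of that rule of thumb (my own helper, nothing from the app itself):

```python
def base_fps_cap(refresh_hz, multiplier):
    # Cap the game so the generated output exactly matches the display
    # refresh; overshooting the refresh rate is what produces artifacts.
    return refresh_hz // multiplier

print(base_fps_cap(120, 2))  # 120 Hz monitor, x2 -> cap at 60
print(base_fps_cap(60, 2))   # 60 Hz monitor, x2  -> cap at 30
print(base_fps_cap(180, 3))  # 180 Hz monitor, x3 -> cap at 60
```

With an odd combination like 165 Hz and x2 you'd cap at 82, which is why people with such monitors often just round down and rely on VRR.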
It is actually pretty good, but you need to mention that it comes with a performance cost: you need around 20% headroom of GPU usage, so you'd better find a setting where your GPU load is around 70-80% so Lossless Scaling doesn't run into a bottleneck. Because when it does, the experience is similar to 2006 SLI microstutters.
This is covered at 30:41 and onwards.
Also depends on your monitor: if LS generates more frames than your refresh rate can show, it will not push them. I play Hogwarts Legacy at 80fps on a 165Hz monitor; with frame gen it jumps to 150fps with 100% GPU utilization and no stutters
@@DesocupadoXtremo fail
Well, this is why you would scale in the Lossless Scaling app and then implement it, if you were really concerned about the headroom more than cutting back settings
There is a performance mode, but it looks slightly worse.
If you have an extra GPU, you can run LSFG on it.
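The ~20% headroom rule discussed in this thread can be written as a quick check. This is only a sketch of the commenter's rule of thumb, not an official requirement of the app:

```python
def has_headroom(gpu_usage_percent, required_percent=20):
    # LSFG needs idle GPU capacity to draw its generated frames; if the
    # game already saturates the GPU, enabling FG causes stutter instead.
    return (100 - gpu_usage_percent) >= required_percent

print(has_headroom(75))  # True  -> safe to enable frame gen
print(has_headroom(95))  # False -> cap fps or lower settings first
```

Capping the game's fps is the usual way to free up that headroom, which is also why the "cap to half refresh, then x2" approach works so well.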
I am using this also for movies . Game changer .
Bro.. movie changer***
My CRT is the one to blame; since it's locked to 60Hz, anything below 60fps will destroy your eyes, especially movies.
why do you use this for movies im confused
@@el_gabron Because you can watch YouTube at 180 fps for 60 fps videos, or 90 fps for 30 fps videos. Same goes for movies. Because it's COOL
This would likely be even better for watching anime, since a lot of scenes are animated at around 10-15 frames per second!
this is especially useful for modding old games like Skyrim, where you can lock your fps at whatever you want with ENB or Display Tweaks, so you can get more mods in without worrying about fps loss
Another cool thing about this is that you can run it on the iGPU, giving your main GPU essentially a free performance boost.
Wait, u can do that?
The implications of that are massive
@@vextakes Yup, just scroll down to "Preferred GPU" and select the iGPU, it will run both upscaling and FG there. The only real issue is making sure your iGPU is strong enough to deliver all the frames, but otherwise I haven't noticed any issues.
@@vextakes Oh, also, if you allow it to autostart with your PC in the settings, Steam can't detect that it's running, allowing you to use it on multiple PCs simultaneously.
Yes, that's very true. I'm using the iGPU for LSFG, leaving 100% of the main GPU's usage to the game. Works like a champ
My 3060 thanks you
Mine as well 😜
MSFS runs great now on my 6950XT with settings almost maxed out on 1440p with FSR Balanced (FSR sharpness to Max). Capped the framerate to 40 with X3.
I've encountered this problem today after buying Crosshair X, a crosshair app that runs as a semi-overlay on top of your Rust game. I've tried scaling my game repeatedly with all kinds of settings in Lossless Scaling, even with older versions and modes; nothing helps. The scaling is being done on Crosshair X instead of on Rust, and this makes my Rust game run at 40fps because Crosshair X is what gets scaled. I've also tried scaling Rust first, then adding and turning on Crosshair X so it wouldn't be scaled; doesn't work. I've also tried disabling fullscreen optimizations and so on in the apps' properties. Any fixes? I am desperate... A fix I could see working perfectly would be blocking Lossless Scaling from scaling Crosshair X and locking it to scale only the Rust game, but I don't really know how to do that. Need help!!!
I see a lot of people in the steam forums and on their discord having issues getting it to work.
Yeah, it's beta software at best; I can't count the number of times it crashed my GPU driver
*This has revived so many old GPUs!*
This is very safe to use online as well, as it does not interact with the game at all or anything. It records it and streams it to you.
it works with the 1080 Ti
It works with every gpu
Lossless Scaling is finally getting attention! I've been using this for a while, not so much recently because I went from a Laptop with a GTX 1650 Max-Q to a desktop with a 4070Ti but when I was on that laptop, it worked wonders.
yessir 😀
seeing a lot of comments saying you need some GPU headroom to use this; I'm running DayZ at 80-110fps (uncapped) on my 4070 laptop. If I get this lossless framegen tool and cap my fps to, say, 60, that will lower the GPU usage, meaning I can take advantage of framegen, yes?
Yes
You described everything so perfectly, i was trying to explain it to my roommates very poorly, but you helped me get a real understanding of the technology and side effects of using it. Thanks a lot! I use it on my 2060 for a few games but i only recommend it if you're already over 60 FPS
Yikes, that Windows 10 1903 requirement's lookin kinda bad. I mean it's on Steam, of course they removed Windows 7&8.1 from support by Google demands for some reason. I want a Windows 7&8 alternative cause built in GPU drivers for scaling suck and we need better options. I would love for 2x Integer Scaling for Minecraft on my laptop.
Just got it, because it's 7 bucks, and if you look at these YouTube sheets we pay like 2 bucks a frame, so if it gets us 3 frames it's break-even. On Warzone at 1440p I'm going from 120 to 190 FPS. Don't feel mouse lag/drag; it feels just as snappy. Not sure what the hotkey is or if it's working. That's it, it's magic!?!?
yo, aint it bannable on wz?
There is actually slight input lag with it turned on. I refuse to use it in certain games with fast (mouse) camera movement, because it's a bit unresponsive
avoid all killing/stabbing war games, they will poison your brain too badly..
I think Lossless Scaling is a godsend, because it works on the hardest game to run to this day: Minecraft
I tried the software yesterday. In its current version, support for G-Sync was added.
I own a RTX 3060 and a Ryzen 5600X combo. I game on a 180 Hz G-Sync 1440p monitor.
I tried the software to measure the gains of frame generation. The games I chose were Assassin's Creed Odyssey, Assassin's Creed Origins and Witcher 3. In these games the average FPS at ultra settings is anywhere between 55 and 70, with dips around 50 and spikes around 90 before applying the software.
With FG X2 I observed that the initial FPS would first be cut by a third, dipping from a steady 60 to only 40 for instance. Then FG would kick in and double the frame rate to 80 in this example.
The downside is noticeable latency, significant shimmer and minor tearing in some shapes during movement.
The worst shimmer was in Witcher 3 where it looked as if Geralt was hallucinating!
The upside is that I was able to apply all RT options and still game at 50 fps. Quite a technical prowess if you ask me!
Finally my G-Sync monitor VRR does a fantastic job at adapting to the FPS and delivering a very smooth experience. That's where my setup really shines these days.
Is it the 12gb version?
@@rambledogs2012 yes it is.
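The arithmetic in the long comment above (a steady 60 fps dropping to 40 once FG kicks in, then doubling to 80) generalizes neatly. A hedged sketch using that commenter's observed ~1/3 overhead, which will of course vary by GPU, game, and LSFG version:

```python
def effective_fps(base_fps, fg_overhead_fraction, multiplier):
    # The real frame rate drops by the frame-gen overhead, then the
    # remaining real frames are multiplied by the generation factor.
    real = base_fps * (1 - fg_overhead_fraction)
    return real * multiplier

print(effective_fps(60, 1 / 3, 2))  # 60 -> 40 real -> 80.0 shown
```

So FG is only a net win when the multiplied output exceeds the original base fps, which is exactly why the video and several comments warn against enabling it on an already-saturated GPU.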