Thanks to channel patreons / youtube members:
• Paolo • Cecilia • martinez3010 • Kesnei • thethnikkaman • Z • Caseten • Francesco Brodini • Borja • xtremebuga#0 • Eulogio Gallo • iviilan#0 • Romek Dz • unsentsoul32#0 • tgrmstr.#0 • Philip Will • deshwitat#0
The 7 Best Dollars to Spend in Life
ahhh nice
I just tested it with the classic Ridge Racer series for the PS1 on DuckStation. The games run as smooth as butter now; those games were hard-locked at 30 fps.
I'm going to test with R4 (Ridge Racer Type 4) in the future. I don't have a GPU for now (only the iGPU on my Ryzen), but it's cool to know even retro emulators can use Lossless Scaling too.
I mean, I played IGI 1 at 144Hz with this.
Can you play in full screen?
@@1Saynex Of course; it always creates an overlay and tries to make it fullscreen, so I no longer need to use Borderless Gaming.
As someone who games on a mini PC (integrated graphics), this program did wonders. Thanks for the tip :)
So the latency jump is at DLSS 3 Frame Generation level now. There are a lot more artifacts than with DLSS 3, but this is looking so good that I would already start using it on emulators. Thanks for the video!
It'll always have more artifacts than DLSS 3/4. It doesn't have hardware acceleration or data like motion vectors / depth buffers to work with. Also, DLSS will likely see a further reduction in latency with the upcoming Reflex 2. Maybe in a future update.
Given how little LSFG has to work with, it is surprisingly good.
@DragonOfTheMortalKombat No, it doesn't have access to game engine data like motion vectors, but because of that, it works on every single game and emulator you want, and it doesn't need to be implemented per game, something you do need with DLSS FG (still, we have people like PureDark doing that for us). "Hardware acceleration" is less relevant now that, after the upcoming update, all DLSS FG will use is tensor cores rather than the hardware optical flow accelerator. What truly matters is how good the AI behind it is, something Lossless doesn't have, but it's doing a pretty good job and improving a lot with just machine learning.
Because of what Reflex 2 does, I highly doubt it would work WITH Frame Generation, let alone Multi-frame Generation. We'll have to wait and see, but Nvidia Reflex is good enough that it leaves you at a much lower input latency than what you would have with Reflex and Frame Generation off.
Yep, I use Lossless Scaling; bought it on Steam ages ago for just $3. For such a good program, what a steal.
Well well well, finally some good news for the beginning of the year!
I always wanted to play Twilight Princess and Wind Waker at 60 fps. And now I can.
Up until this 3.0 release I was playing TOTK at 60fps then using LS x4 to get 240fps, but there was a fair bit of stutter. Now I can leave it at 30fps and use 8x to play at a mostly perfect 240fps. It does make motion control arrow aiming more difficult, but it is so fucking beautiful! I also tried playing Wipeout 64 at 240fps, but the emulator keeps dropping frames like crazy. Can't wait to get Wipeout 64 looking as smooth as Wipeout Phantom Edition! Lossless Scaling is so damn good!
Yep, I play the game at 10K 60 fps, but I could also do 4K 240 fps.
I have a fear that some emulator devs might implement those techs instead of improving native performance.
What? No, there's no deadline.
There is no single developer.
This is passion work
Open source means anyone can implement whatever they want
@Ghost-pb4ts That's not what I meant. What I meant is that instead of optimizing the emulator itself, they might settle for a performance standard with those techs. Correct me if I'm wrong.
No emulator developer is going to use this; if someone has a decent PC, they don't need any of these gimmicks.
And that's the thing: you're trading one thing for another. There's going to be lag, and even though he said 30% less lag than LS2, he still never said how much lag.
If you have a potato computer, I can see this at least allowing you to play certain games, and that's a great thing, well worth the money.
I just don't understand why people are playing at 244Hz with the Zelda game. Why would you do that?
@RichFromQueensNY You are right, thanks for the info!
This isn't about native performance. Emulating something like PS2, Dreamcast, or PS3, native performance is 60fps. If you can ALREADY hit that in the games you play, you can now play them in 120fps or higher. You might not think that's something worth doing, but I can tell you that it is in a lot of situations. Real frames/fake frames is missing the point. It's about the smoothness of the experience, and the clarity of the image while there is heavy motion. It really does make a difference and is awesome in a lot of scenarios (but admittedly not all).
Played God of War 3 while using Lossless Scaling; it's really like magic happening.
It's a shame that in more advanced stages of the game, the performance drops even further, which will cause many graphical artifacts.
@@AlexwpiGame That's interesting, but I guess not everyone has the right setup or the expertise to configure things properly. I’m enjoying smooth gameplay without any issues, maybe you should look into optimizing your system or settings before jumping to conclusions. 😊
@@AlexwpiGame Hi, that's a good video. I have a question: if my monitor's refresh rate caps at 60 by default, is it OK to use Lossless Scaling with a 30 fps game (emulator or native PC) and x2 to go to 60? I mean, a maximum of x2.
shadPS4 will make it run much better, so we could have a stable 60fps and just increase it to 120fps. We just need to wait a few months.
It's all about intentions!
Nvidia is fooling their customers and is promoting DLSS and MFG to sell their new hardware.
LS is offering upscaling and frame generation for older hardware to keep them alive.
That's why Nvidia is getting hate while LS is getting praised for the same stuff.
And their cards are still using 8GB of VRAM. That's barely enough for 1080p these days unless you really enjoy old games.
Lol, if you think Nvidia is getting hate, you're delusional. Only a vocal minority is speaking up; the rest will happily consume their product. Don't believe me? Wait for the RTX 50 sales.
@@DragonOfTheMortalKombat That's what a monopoly looks like. People have no option. Supply is low and demand is high, while there's no competition for the RTX 5080 and RTX 5090.
DLSS is amazing, and the reason the new features only work on the new cards is that even if you got them working on older cards, those don't have the hardware for an enjoyable experience. DLSS and LS are different products with different use cases. Anywhere you can use DLSS, you should. Everywhere else, there's LS.
The 3.0 version makes 30-to-60 fps finally playable. I'm too lazy to install Cemu to play BOTW at 60, and my system can't consistently hit 60 on Ryujinx with the patch at 4K.
The previous version smudged the interface and Link's outline heavily and introduced noticeable input delay (but was still ahead of AMD's driver-level AFMF).
Right now, with the 3.0 framegen, the only thing that gets smudged is the hearts if you spin the camera rapidly; everything else is consistent. The input lag is also way lower: after a minute or two you forget it's the scaler and not just native 60 FPS performance. Extremely impressive.
It's especially useful for games that tie physics to fps. BOTW comes to mind, for instance; I remember there being some sections where the 60fps patch could make the physics go haywire.
Great video dude !!! THANKS !!!
If you use a second graphics card, you can get around the performance hit: run the game on one card and frame generation on the other. It doesn't need to be the same card model or even the same brand, but keep in mind that the intensity of the LSFG calculation scales with resolution and scale factor. Most integrated graphics should be enough for 1080p x2, but for 4K x20 you'll need a 4090.
was thinking about picking up an A380 for the AV1 encoding, so this is sick to hear
I have a Vega 7 + RTX 3050 system, and it works great most of the time, doing 1080p 45 fps x2. As long as the frame pacing from the game is decent, the iGPU will handle it well.
The Nvidia app has a latency OSD. Frame generation should be benchmarked with latency in mind, since high-latency results are no better than activating motion smoothing on your TV. Lossless Scaling is not free performance; it's duplicating frames. Good performance is the combination of low latency, smooth visuals, and the absence of stutter.
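That trade-off is easy to put rough numbers on. As a simplified model (my assumption for illustration, not how Lossless Scaling is actually implemented), 2x interpolation has to hold back one real frame before it can show the in-between frame, so it adds roughly one base frame-time of latency on top of whatever the game already has:

```python
# Simplified latency model for 2x frame interpolation: the interpolator
# must buffer one real frame before it can generate the in-between frame,
# adding roughly one base frame-time of input latency. This is a toy
# approximation, not a measurement of the actual software.

def added_latency_ms(base_fps: float) -> float:
    """Approximate extra latency from holding back one real frame."""
    return 1000.0 / base_fps

for fps in (30, 45, 60):
    print(f"{fps:>3} fps base -> ~{added_latency_ms(fps):.0f} ms added")
```

This is why a 30 fps base feels noticeably heavier than a 60 fps base even when both produce the same smoothed output rate.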
I use both Berserk for the Dreamcast and Wind Waker for the GameCube as benchmarks to test Lossless Scaling, AFMF2...
Lossless Scaling 3.0 got rid of 90% of the most annoying artifacts, especially the ground in Berserk and Link's head in Wind Waker. Input latency was also significantly reduced.
I really wish it would work on the Steam Deck.
I've been watching animated and live action shows with this (just to switch up now and again. I know animation and film purists would kill me for even saying this 😅 )
I love the new lossless scaling update.
3:53 You're right, AlexWpi, but it's still great if you want to save on power consumption by generating frames rather than rendering natively, even on a high-end GPU.
I hadn't thought about it that way; you have a point.
Where can I download Citron?
It seems the download button on their web page is not functioning. For weeks now...
I wonder if Lossless Scaling will work on Winlator 😅
Maybe if there were some GPU headroom when using Winlator, this could be possible, but there would also need to be improvements in Winlator's multitasking processes. Speaking of which, I sent you the setup for running RPCS3 on Android via Discord. Did you get a chance to check it out?
Cool tech. Passing on this because I'm on Linux and don't need another stack in proton, I don't even bother with ReShade. Hopefully something like this and reshade just get integrated into gamescope.
This made me think
Can Lossless Scaling be used in Winlator to enhance performance?
I don't think so, because you need to have some GPU headroom (an unused percentage of the GPU) to dedicate to the generated frames. Usually, Winlator utilizes the full power of the device, so all you'll generate are frames filled with artifacts.
This will be fun in MUGEN-type games.
I have spent ages trying different settings. No matter what I do, the latency is just too high. Feels like moving underwater. Unusable for me :(
Thanks for the video. Do you know if Lossless Scaling works well with VR games? Thank you again for the info; hope you have a great day.
thank you for the test~👍
I've got all my games on an HTPC with a 4K TV that only does 60Hz. Might check this out just to upgrade lower-fps games to the max. I can't imagine artifacts would be bad with that upper limit.
Are a 3090 ROG Strix and a 4090 Strix good enough?
Triple buffering at 45 fps also works fine for 144Hz monitors.
48x3
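The arithmetic behind suggestions like "48 x3": pick a base frame-rate cap whose integer multiple lands exactly on the monitor's refresh rate, so every frame (real or generated) gets an even slice of the refresh cycle. A tiny sketch:

```python
# Find the base fps cap that makes cap * factor land exactly on the
# monitor refresh rate, so frame pacing stays even.

def base_fps_for(refresh_hz: int, factor: int) -> float:
    """Base frame-rate cap such that cap * factor == refresh_hz."""
    return refresh_hz / factor

print(base_fps_for(144, 3))  # 48.0 -> cap at 48 fps, use x3
print(base_fps_for(144, 2))  # 72.0 -> cap at 72 fps, use x2
print(base_fps_for(240, 8))  # 30.0 -> the TOTK example: 30 fps x8
```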
I saw this upload floating around torrent sites; I dare not touch it because I'm not familiar with it. I humbly ask, will this work on laptops? Say, an Acer Nitro with a GTX 1050?
Yes, I've even tested it previously with a 2GB GTX 1050, but I recommend purchasing the app directly from Steam. The app is far too affordable to risk getting a virus over.
Looks really good if you don't know what input latency is.
It's absolutely a killer. I have tried many settings and the input latency is garbage. Can't use it.
Does anyone know if you can select the desired fps? I want to use this program for old DOS games, but I don't want them to run at 60fps, just a little less, like 40.
How do you capture videos with Lossless? With Nvidia it keeps saying there is a program preventing screen capture, and I can't capture anything with Lossless enabled.
You must use OBS.
Start LS, check the "Run as Administrator" option, close the application, and run it again. This time, a permission request will appear.
Run OBS as an administrator, add a game capture, and point it to LS.
Close all programs, and now run them in this order: LS > OBS > Start recording with OBS > Use the shortcut to enable scaling through LS.
If you did everything correctly, you will now have the LS recording with the generated frames.
For future sessions, always start LS first, followed by OBS.
@@AlexwpiGame Thank you, this is so detailed that I don't think I can fail following your instructions. Once again, thank you!
Think this will run on a ROG Ally?
Do I need a variable refresh rate mod for every single Switch game to use unlocked frame rates without fast-forwarded / sped-up gameplay?
Yes, but with a caveat: some games run with variable FPS, so you just need to press Control + U, and your FPS will automatically reach the maximum possible without speeding up the game. However, for other games with locked FPS, a patch is required (specific for each game, as it modifies the code).
Should I update to the beta version? Is it much better?
Yes, absolutely.
Bro, how do you get such low fps with a 4070? I get an average of 90-100 fps in Cemu running BOTW maxed out in 4K with all graphic enhancements, all this with a 3090 Ti.
I think he is running the switch version on citron
he has an i5
@@arvc21 Even so, it shouldn't be that low. He also has very low fps in RPCS3; actually, he has very low fps across the board. Very strange.
@edmundroth6337 I was playing TotK, not BotW. Moreover, in Nintendo Switch emulators, your GPU is practically idle; what really matters for performance is the CPU.
I chose very problematic games on RPCS3, which is why the FPS is so low and unstable.
How about latency? Input lag must feel horrible with all of these "generated frames".
That depends a lot on the games you play. Some games require less precision, so you won't feel as much input delay. However, as I mentioned in the video, I don't recommend it for competitive games.
With a gamepad and third-person games it's not bad; I played through 80% of Metal Wolf Chaos and the entirety of Parasite Eve 1 (on a very old version) and never had a problem. But with FPS games on mouse it's damn bad.
@ So the question is: if it's only unnoticeable in games that require low precision, is it even worth using to go from, let's say, 30fps to 60fps? Since higher fps is mostly beneficial in high-precision games like shooters... what's the point? It's like a catch-22 type of deal.
You are wrong about one thing, man: you don't need one of the best processors on the market to run Tears of the Kingdom at 60fps native.
I just bought a Ryzen 7 5700X a week ago, and it easily runs that game at 60fps 1440p with almost zero stutters or frame drops. I don't even need to use LSFG.
You can find one for $120-130 easily, or even cheaper on the used market.
Well, the Ryzen 7 5700X is one of the best processors on the market today, not counting server parts like Epyc.
@@edonizeti_ The 5700X is a decent processor, but it's far from the best on the market; it's already two generations old.
Today it's a mid-range CPU, but back when it launched it was the best in the business, basically a cheaper 5800X. The Intel 12900K at the time was faster, I believe, but people were already ditching Intel.
Do you pick this beta over the latest version?
The beta is currently the most recent version, and the only one with LSFG 3.
@ I'm saying, does it perform better than the latest stable version?
In my case, testing Yuzu, I noticed it used more GPU with 3.0, so I had to go back to 1.0.
I was playing Breath of the Wild at 10K with my 4090 laptop.
I have a problem with the new LSFG 3.0: when I enable it, my frame rate drops by at least 40% and it's not playable. LSFG 2.3 works fine.
Can someone help me?
Lower the resolution scale (for frame generation) when using x2/x3/x4; that can reduce your frame drops.
Drop the resolution scale to something like 75%
I just got a 4K 120Hz TV; I'm going to use your tips from the video to push some games that already run comfortably at 60 up to 120! That should be fine, right? In terms of not losing quality.
If you use a fixed 60 FPS and set 2x, you'll barely even notice graphical artifacts.
Hello
Do I need to upgrade my configuration (i7 9700K, RTX 3070) to play fluidly with the Ryujinx emulator?
With this configuration, you'll have a decent gameplay experience, around 50 FPS in TotK. For significant improvements, you'd need a 12th-gen Intel CPU or newer.
@@AlexwpiGame Ok. thank you for your reply.
Does this only help with GPU performance? Because if it helps with CPU performance, that would benefit me greatly, considering my CPU is from around 2011; it can run "some" Xbox 360 games at decent speeds. It's, I think, an i7-2600 non-K version, so it can't be overclocked.
If there's a CPU bottleneck with very low FPS, this software won't help you. You need a baseline of at least 30 FPS; 40 FPS is recommended.
I have the same CPU, and yes, it works really well.
As long as your CPU can give decent frame rates, yes.
Could you test Xenoblade 2? Even with a Ryzen 5600 and a 3060, it stutters a lot. It gets over 50 fps, and with 3x it goes over 120, but it still hitches somewhat.
That game is extremely hard to emulate; I'll wait for my new PC to arrive next week and test it.
Thanks for the reply!
Can this make some PS3 games playable on the Deck?
Only on PC, or can it be used on Android with Winlator?
Only PC.
Steam Deck?
The only way I would use it is if it were automatic.
Just keep the app open and press a hotkey.
The term lossless scaling doesn't make any sense.
It doesn't make much sense now that the software is mostly used for frame generation, but the original purpose of the software was integer scaling between 4K and 1080p and other compatible resolutions.
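For anyone curious what integer scaling actually does: each source pixel is duplicated into an exact NxN block, so a 1080p image maps onto a 4K panel (2x) with no interpolation and no blur. A minimal toy sketch of the idea (an illustration, not the app's actual implementation):

```python
# Integer ("lossless") scaling: duplicate each pixel into an exact
# factor x factor block. A 2x2 "image" becomes a 4x4 one at factor=2,
# with every source value preserved exactly.

def integer_scale(img, factor):
    out = []
    for row in img:
        # Repeat each pixel `factor` times horizontally...
        scaled_row = [px for px in row for _ in range(factor)]
        # ...then repeat the whole row `factor` times vertically.
        out.extend([scaled_row[:] for _ in range(factor)])
    return out

src = [[1, 2],
       [3, 4]]
print(integer_scale(src, 2))
# Each pixel becomes a 2x2 block: [[1,1,2,2],[1,1,2,2],[3,3,4,4],[3,3,4,4]]
```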
MANNN PLSSS UPGRADE YOUR CPU 🙏😭
its still good
I've already bought a Ryzen 7 9700X; I think it will arrive this weekend.
@@AlexwpiGame Finally thank god
I tested version 2.11.2 on an Intel HD 4000.
Frame generation doesn't work.
I mean, it makes your game slower, plus weird effects.
This is a 2011 integrated GPU... what do you expect? Magic?
@@singlewave805 Still, I expected it to work and give more fps.
I tested version 3 now.
It doesn't work.
@@madmaster3d Frame gen needs GPU power to work; you don't have any GPU power with that thing.
Fake frames don't count as real frames when you have a very beefy CPU. Games feel way better to play, which matters more than how they look (that's my take).
Very ignorant of you to think that games can always get real frames. There are many old games that can't go above 60FPS without breaking the physics or NPC AI, etc. Emulators also have FPS caps, as in the case of PCSX2. And not to mention that even the fastest gaming CPU struggles in RPCS3.
Once you get past 60FPS, real frames aren't even worth it for most single-player games.
@@DragonOfTheMortalKombat "And not to mention that even the fastest gaming CPU has troubles in RPCS3. "
An i9-13900K gets 80+ fps in Sonic Unleashed.
shame to see this channel promote fake frames
Real voice please
Some people don't like their own voice, I hated mine for a very very long time
@@HankChillin Nobody likes their voice. Using it in a video is a good way to get over it and be happy ;)
He mentioned he might do a Portuguese voiceover.
As already mentioned, there’s no issue with my voice; I just don’t speak English natively.
@AlexwpiGame OOOO lol that makes sense
Can you please make a Winlator guide?