Check out the full K/DA - Villain 60FPS here: ruclips.net/video/YZy6l2dBbR4/видео.html
Longer RIFE vs DAIN comparison compilation: ruclips.net/video/2BwC_utLhH8/видео.html
RIFE APP with Windows exe is available for free: nmkd.itch.io/flowframes
Hi 🌍💯🔭
Anime
0:19 Cowboy Bebop
0:43 One Punch Man
1:28 Akira
2:23 Konosuba
3:19 Attack on Titan
4:08 One Punch Man
4:27 Jojo's Bizarre Adventure
4:34 Jujutsu Kaisen
1:00 to 1:18 show name pls?
@@athul_c1375 It's not a show. It's a music video called K/DA - VILLAIN by Riot Games.
1:38 amogus
The movement looks sluggish sometimes
Ok, but Why
I have used this for a couple of days. The two main advantages are the fast speed and low GPU memory usage.
Are the results as good as DAIN?
@@ifanmorgan8070 It's faster than DAIN, but DAIN gives better results
@@faizyunus6618 Thanks. I think for my purposes (home movies) RIFE is good enough. I honestly could not see a difference between them
@@ifanmorgan8070 For movies RIFE is good enough, but for anime DAIN is better 😁
Hey, I need help. I'm trying to use RIFE, but when I interpolate videos it only gives me the frames as images, not as an mp4. Is this how it's supposed to work?
That's what I've seen with a lot of people just plugging interpolation into anime: they don't realize that because the original footage holds a single drawn pose for two or more frames, those drawn poses still last longer than the interpolated in-betweens, so the animation will still look jittery if the AI only creates an in-between every other frame. To get a better result, someone would have to first export the animation on ones, then adjust the timing of the frames after the AI does its job.
Finally, someone who understands interpolation AI. It must be used very intentionally and not as a crutch to save costs. It can hard-carry slow-mo action scenes and cloth physics, but 24 FPS impact frames should still be done by a key animator.
Just look at CGI: barely any studios understand why simply cutting frames down to 24 FPS doesn't make CGI models look better. CGI is a perfect model that warps and transforms perfectly to a perspective grid; it cannot exaggerate or hold a pose unless that is intentionally done.
This is also why scenes that intentionally run at 60 FPS look better when interpolated: the amount of detail the audience can perceive is far greater, giving them more time to process the whole scene in question.
You can choose to remove duplicate frames, you know.
True, but there is already a solution for that: you can "de-duplicate" the frames, which I think mainly just fixes up the frames on twos. It's part of the software, though, not the AI model itself.
flowframes has a feature to fix that lol
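For anyone wondering what that de-duplication step boils down to, here is a minimal sketch in Python with OpenCV, assuming a simple mean-pixel-difference test is enough to spot held poses; the threshold and file name are made up for illustration, and Flowframes' actual implementation may differ.

```python
import cv2
import numpy as np

def deduplicated_frames(path, diff_threshold=1.0):
    """Yield frames, skipping ones that are near-identical to the last kept frame.

    diff_threshold is a mean absolute pixel difference (0-255 scale); frames
    below it are treated as held poses (animation on twos/threes) and dropped.
    """
    cap = cv2.VideoCapture(path)
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if prev is not None and np.mean(cv2.absdiff(frame, prev)) < diff_threshold:
            continue  # duplicate of the previous drawing: skip it
        prev = frame
        yield frame
    cap.release()

if __name__ == "__main__":
    unique = list(deduplicated_frames("anime_clip.mp4"))  # hypothetical input file
    print(f"kept {len(unique)} unique drawings")
```

The interpolator then only ever sees distinct drawings, which is what the "frames on twos" fix is really about.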
Should note the creator is planning on making a model to interpolate anime, fingers crossed it comes out soon!
Gay
gay
Awesome.
@@123495734 how is that gay? It depends on the anime. Anime stands for animation; that's what Japanese people call cartoons. Japanese people watch anime... Japanese cartoons are basically anime and anime is Japanese cartoons. So are you making fun of another nation's word for cartoons or another nation's cartoons? There are many, many different types of anime.
@@Stealthful_ gay
Since it isn't exactly perfect as of yet, I think this has a lot of use for creative efforts rather than remastering efforts.
I've done a bit of animation work here and there, frame-by-frame rotoscoping being the most nightmare-ish type I've done, and the idea of possibly being able to skip out on drawing a lot of the frames in the less active scenes and yet keep the framerate consistent across the whole animation is pretty dang appealing.
I once did rotoscoping with morphing shapes rather than raster drawings, and that allowed me to skip frames, but the workload ended up being about the same: rather than drawing each shape, every keyframe I animated had to have the pre-existing shapes morphed manually into the shapes of the next keyframe. Sometimes the object moved too fast or at an inconsistent velocity, requiring a lot of correction, and new objects that popped up had to have a whole new shape drawn in and manually morphed, so every shape doubled the workload required for the next keyframes. Sure, I could animate at like 12 frames per second and interpolate it as high as I wanted, but the in-between frames got really ugly going past 24 fps.
The only problem with Flowframes is that you can't specify your output frame rate yourself; you need to choose one of the options proposed in the list.
For the time it takes, it handles higher resolutions more than okay. I will check this one out for sure. Thank you for the video.
you deserve more subscribers, brother
I've noticed that all of the tools I've used so far (DAIN, CAIN, and RIFE) still sometimes have trouble with things like text on top of a moving object. Sometimes they treat the text as being part of the object and warp it as the object moves.
Noted, thank you for your info sir
FOR ALL GPUUUU
DAMNNNN YEEESSSSSSSSSS
I had been using Hybrid, but this tool seems better and faster.
@@2BsWraith I wish I had a good PC
@@pops7249 If you can't run RIFE, then try Hybrid; I believe it's easier on the system and compatible with Intel and AMD cards as well.
@@2BsWraith can you send the link for it?
@@2BsWraith That's all i need to hear :D
DAIN eats a lot of VRAM and is slow, but I'm still impressed.
If you value your time, use RIFE
I still think DAIN is the absolute king; the only huge drawback is the processing time. My brain kind of sees an elite "soap opera effect" when looking at RIFE-processed videos
watching high-fps anime on the high-hz displays...
LETS GO
THIS! I NEED BUTTERY SMOOTH ANIMATION AND MOVIES/SHOWS
@@Unreal989 how to ruin animation 101
Wish I'd found this before starting a DAIN job. 7 hours for a 3248×2736, 67-frame, 15 to 60 FPS interpolation of an object rotating in stop motion. Granted I only have a 1050Ti, but whatever. Still preferable to doing the extra 300% of picture-taking work with my particular setup.
1:40 When the imposter is interpolated to 60fps 😳
Sus
among us is gay. overrated. never played it. never will.
@@intelgen7860 bruh you can't say a game is gay when you haven't played it. Just abouit discounts your opinion on it instantly
@@wilbofreeshan6340 Not really. I've seen enough to know it's transgender.
@@intelgen7860 Why do you keep calling it things like gay and transgender? Have your opinions on the game sure, but I'd be careful on the way you phrase it.
Well yeah, this is exactly what I saw with DAIN. For scenes with a lot of action and fast motion you get artifacts, and for scenes with little action it looks smoother but is unnecessary, because you don't really need more frames there. So basically it's a lose-lose situation.
fast frames dont give me artifacts tho?
Quick question, where do you learn/find the papers you talk about?
Hi, 2 years later, is RIFE the best available option for frame interpolation?
It still is the gold standard
It sounds like the creators of RIFE need to use more frames to deal with quick motion. They already have the core, so theoretically it shouldn't be difficult to do.
I don't think the technology will ever be perfect, but it has potential as a tool to speed up animation work: the AI creates the base image and then an animator goes in to patch up the artifacts the AI produces, rather than doing it frame by frame. I don't see it replacing parts of post-production, rather speeding them up.
The issue with, for example, AOT is that the framerate of the background does not match the framerate of the foreground.
This is usually a no-go in frame interpolation. To get an actually clean result you need to split those two and interpolate them separately. You also need to do some artifact corrections frame by frame.
To interpolate the finished composite and expect things to be 99% clean with no need for corrections is just... cringe.
I don't know if there will be a workaround for such things in the future, but DAIN/RIFE is imo not meant to be used at the end-consumer level, where you put 30 minutes of film in, click a button, and you're done, but within animation/film studios, where they are prototyping the scenes themselves.
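To make the split-and-composite idea above a bit more concrete, here is a rough Python/numpy sketch. It assumes you already have the background, the foreground, and a per-frame alpha matte as separate frame stacks (getting those is the hard part), and it uses a plain cross-fade as a stand-in for a real interpolator like RIFE; every name and helper here is hypothetical.

```python
import numpy as np

def interp_pair(a, b, t):
    """Stand-in for a real interpolator (RIFE etc.); here just a cross-fade."""
    return (1.0 - t) * a + t * b

def interpolate_layer(frames, factor=2):
    """Insert (factor - 1) in-betweens between each consecutive pair of frames."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        out.extend(interp_pair(a, b, i / factor) for i in range(1, factor))
    out.append(frames[-1])
    return out

def alpha_over(fg, bg, alpha):
    """Standard alpha-over composite of one foreground frame onto one background frame."""
    return fg * alpha + bg * (1.0 - alpha)

# Interpolate the background (drawn on ones) and the foreground (drawn on twos)
# separately, then composite - instead of interpolating the finished composite.
# bg, fg, alpha = load_layers("shot_042")  # hypothetical loader returning float arrays
# smooth = [alpha_over(f, b, a) for f, b, a in zip(
#     interpolate_layer(fg), interpolate_layer(bg), interpolate_layer(alpha))]
```

The point of the design is that each layer has a consistent cadence on its own, so the interpolator never has to reconcile a background on ones with characters on twos inside the same frame.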
So if using this for 3d animation rendering (as a time saver), would it be better to leave motion blur off?
Well, thank you for this quick overview. It worked for a smooth camera motion from a Blender animation. Finally I got this slow-mo working.
I wonder what the newest algorithm is right now.
So, how do I use it on Colab again?
Im not the smartest lol
Hope an alternative to SVP pops up really soon.
handbrake
Good thing is you can just use both, depending on whether you want to spend the time or not.
I wonder if, in the future, hand-drawn animation (which is drawn at 24 FPS or less) that is AI-interpolated to 60 FPS will be standard in the anime industry.
It won't, because it looks bad
@@Mqstodon I said "in the future"; the AI technology will be much better by then... and perhaps there will be specialized AI for hand-drawn animation
Oh yes, fuck timing, fuck intent, fuck the talent of an artist. Pls i WanT mORE fps 🤡
@Ekklipse What noodle video? Are you unironically using snowflake?
Also, it's fine for it to be an experiment, but animation being produced this way is bad for so many reasons.
@@guancho9142 Stop lying
Any update on further improvements to DAIN or RIFE (now packaged in Flowframes)? - Paul
Wait, so if I got it right, this is not interpolating in real time? Do I have to "upscale" to 60 fps before I watch a video file? Or does it work in real time?
Real time means the processing time is similar to the original length of the video, with a powerful GPU of course.
Question about frame rate capability. What is the lowest you can go? For example, if I wanted to convert 15 to 24, can it do it? I don't need anything above 24. What I'm trying to do is smooth out camera moves.
You can double it to 30 fps and then use the OpenShot editor to export it at 24 fps.
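If you'd rather skip the editor step, the same retiming can be scripted. A hedged sketch, assuming ffmpeg is installed and on your PATH; the file names are placeholders:

```python
import subprocess

def retime_to_24(src="interpolated_30fps.mp4", dst="smoothed_24fps.mp4"):
    """Resample a 30 fps file down to 24 fps using ffmpeg's fps filter."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-filter:v", "fps=24",  # drop frames as needed to hit a constant 24 fps
         "-c:a", "copy",         # pass the audio through untouched
         dst],
        check=True,
    )

if __name__ == "__main__":
    retime_to_24()
```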
In 3D motion graphic CGI, should you render the frames with or without motion blur if you want to boost the FPS? Which would yield better results? I haven't found any answers regarding motion blur on/off with DAIN/RIFE/CAIN.
You never really want motion blur, because it's really just an artifact-masking technique. More clear frames are a better masking technique. So if you have a 30fps video, interpolating it to 4x and watching it at 1x or 2x speed will be more visually appealing, because you get at least 2-4 frames between every original frame when viewing, which smooths out the video. If you just add blur to the frames, all motion will be blurry, especially when slowed down. The only time you would want blurring is if you have video with constant erratic movement that makes path prediction difficult to the point that it has tons of artifacts.
I don't have an Nvidia device and I have a low-end PC with a weak CPU. I tried Flowframes and it doesn't even render; it just throws a bunch of error warnings.
Wonder if you could use this to enhance the fps of games in real time, given a velocity map and the exact coordinates of objects in each native frame.
No you can't that's not how it works...
You can use this on game footage (but it's better to use it on live-action or camera-shot videos rather than Minecraft, for example).
Why not revise the interpolation algorithm into a plugin compatible with a variety of animation workflow platforms to be able to interpolate various animation sequences more efficiently?
The real value of this algorithm lies in what can be done with it right this very moment.
Let's teach AI to convert 3D anime models into 2D to fill the gaps in the frame interpolation job.
I wonder if this could be used in video games to boost fps, similar to how the RTX AI works
Or you can just use SVP, which has been around for 10+ years, and do *all that* in *real-time* (or as part of a video encode, via an extra filter chain), at *any* resolution or output frame rate, using pure CPU power (and add OpenCL on top of that, if it's supported by your GPU).
The end result (if you choose more complex algos) is more than pleasing.
Yeah, I tried to feed SVP a 3d animation stored in my computer and I was stuck. If I can't even load a file (no, I'm not computer illiterate), then the program is dead in the water.
@@mrlightwriter AviSynth scripts and command line, mate. =)
If you thought that there is a magic "Drag-n-drop here to make a 60fps video" button in SVP - I have bad news for you. For the exact same reason 3D modelling apps don't have a "Create a realistic dog model" or "Make me a futuristic scenery" buttons.
In short - "Zero IQ" apps _(drag, drop, press one button)_ are very handy, but severely limited, in both script/effect variance and possible ways to alter the source _(not to mention the output quality)._
For example, in the early days, I thought that the StaxRip app was bad and unusable, because I couldn't figure out a way to *properly* _(the way _*_I wanted_*_ - because there still was a "single button" option which I _*_did not_*_ want to use)_ convert a video using it. Guess how my perception of "usefulness" changed when I revisited that app a bit later on. =)
SVP and other similar stuff, and most apps in general, *still work* _(worked when Win7 was still around, work now on both Win10 20H2 and 21H1)_ - you just have to put a bit more effort into it.
If you don't see the quality change going from the "one button" apps to the professional ones - there's no reason to use an app like that at all. A common built-in _(in most TVs nowadays)_ "fluid motion" option would suffice for your needs then. It'll be a buggy output, glitchy in every way possible... yet provide 60fps.
@@Vitaliuz Within Flowframes Video Interpolator UI, I tried RIFE CUDA and RIFE NCNN, with combinations of versions 1.5, 1.6, 1.7, 1.8, 2.0 and 2.4, but to no avail.
You referred to AviSynth scripts and the command line. Could you point me in some direction?
@@mrlightwriter Ever got SVP to work? If not I could try helping.
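Not SVP or RIFE, but if you just want to see what a script-driven interpolation chain looks like before wrestling with AviSynth, ffmpeg's built-in minterpolate filter can be driven the same way. A sketch, assuming ffmpeg is installed; the filter settings are one reasonable choice rather than a tuned recipe, and the quality sits well below a proper SVP or RIFE pipeline:

```python
import subprocess

def motion_interpolate_to_60(src="input.mp4", dst="output_60fps.mp4"):
    """Motion-compensated interpolation to 60 fps with ffmpeg's minterpolate filter."""
    vf = "minterpolate=fps=60:mi_mode=mci:mc_mode=aobmc:me_mode=bidir"
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-vf", vf, "-c:a", "copy", dst],
        check=True,
    )

if __name__ == "__main__":
    motion_interpolate_to_60()  # expect this to be slow: it runs entirely on the CPU
```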
Give it 5 years and an ambitious studio that's willing to incorporate this tech into their design process, and anime will change forever.
anime should never go 60fps
@@entsu88 true.
@@entsu88 But it wouldn't hurt for anime to be a true 24fps (since they animate on twos)
@@mrlightwriter Do you know anything about anime? Anime is not constantly animated on twos; sometimes it is on ones, threes or even fours, and that's because the animator purposely decided to do so. More frames is not better animation
@@guancho9142 Yes, I know the difference between the different types of animation in anime. I was generalizing. As a longer answer: it wouldn't hurt to have the option of increasing the fps of anime (for those who want to see more fluid animation). Of course, animation on 1s, 2s, 3s or 4s is related to artistic choices and/or time and money constraints, and it should continue to be produced that way.
Trying to use the experimental interpolation in Flowframes, but 8 GB of VRAM isn't enough for 1080p, and it only supports Nvidia. What's the quality difference between normal interpolation and the experimental one?
2:36 This is often a problem, but RIFE handles it the best I've ever seen.
Is this a live application? Like instant motion interpolation, as on a TV? Will it work with browser streams?
yeah but there's literally an option on dain to turn off the depth for 3D media if you want
Is RIFE better than Hybrid? 🤔 I have been using Hybrid to upscale movies to 60fps
Do you think anything like this could apply to gaming in the future? I get that it's just for video interpolation and it needs a file to convert to 60fps, but maybe for things like game capture software it could smooth out the 30 fps games of last gen, since Sony and Xbox only made like 15 next-gen consoles.
Also, great video. It was recommended under a Digital Foundry video about why Destiny 2 can't run at 60 fps on PS4 Pro, which used examples of frame interpolation. I was also looking for any way to perhaps mod Spider-Man PS4 or Miles Morales to 60fps. Obviously no luck there, but I was hoping frame interpolation could be an option.
Frame interpolation can already be done in real time, but it looks a lot less refined. Something like this would not be good for smoothing out console frames, because it takes as much or more power than the game itself, so you'd need more than twice the graphics power to turn 30fps into 60fps, which means you might as well use that power to get more gaming fps instead. On top of that, since games are interactive and rely on input, if you ran a 30fps game at 60fps visually, you'd still have the input lag of 30fps. And since it's interactive, frame trajectory prediction wouldn't work without a second frame as a reference: these tools look at the differences between a first and a second frame, then interpolate a third frame in the middle. With no second frame to interpolate towards, there is no prediction, so not only would you have 30fps input lag, you'd also have about a 3-frame delay on top of that, or about 100ms of visual delay between actions, which would make the game even less playable than 30fps without interpolation.
@@peoplez129 "but Maybe for things like game capture software that could smooth out the 30 fps games"
He never talked about interpolating frames in real-time gameplay, he was talking about game recordings.
Something similar to DLSS but for frame interpolation could be cool. Imo the problem is that games are not just seen like videos, but also felt, so if the motion is wrong it would be a lot more jarring
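For reference, a back-of-the-envelope version of the latency arithmetic from the thread above; the number of buffered frames and the per-frame processing time are illustrative assumptions you can change, not measurements:

```python
def added_interpolation_latency(source_fps=30.0, buffered_frames=1, processing_ms=10.0):
    """Extra display latency added by interpolating between real frames.

    An in-between can only be generated once the *next* real frame exists,
    so the interpolator buffers at least one source frame, plus whatever
    time the model itself needs per output frame.
    """
    frame_time_ms = 1000.0 / source_fps  # 33.3 ms per frame at 30 fps
    return buffered_frames * frame_time_ms + processing_ms

print(added_interpolation_latency())                                    # ~43 ms with 1 buffered frame
print(added_interpolation_latency(buffered_frames=3, processing_ms=0))  # ~100 ms, the estimate above
```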
I can't figure out how to use this thing. It just kept downloading the model file unet.pkl over and over again for hours.
Gonna boost the brain of this person, cause a lot of the artistic vision is lost when you boost legit every video - some videos are rendered at 23 fps for a reason.
Can you combine all of these AI graphics tools into one system? Continuously enhancing graphics into finer and finer detail.
What a time to be alive
Let's say I render at 15 fps and also render a 30fps depth-map image sequence - can I use it to get a better result out of RIFE?
Underrated video, thanks mate!
I kept getting this error.
RuntimeError: unexpected EOF, expected 1826598 more bytes. The file might be corrupted.
Do I get it right: if I get an SSL/TLS channel error in Flowframes, does that mean I can't try DAIN or RIFE?
I used to have software that automatically converted all of my media to high frame rates in real time, but I forgot what it was called 😢
SPLASH 2.0 or SVP
When I used DAIN it took my CPU to 100% usage and I got a "warning cpu usage" message, while in the app it shows the device to use is the GPU! What's wrong with DAIN?
4:50 Is that a panda? ºwº
I want to see that comparison, where can I find it? :D
No, that's a cow, from the 1800s. It's black and white because there was no color back then.
RIFE artifact (side effect) list
0:40 (square) boundary ghost
2:38 (foot) cross motion broken (fast motion broken)
2:51 (ball) small object / fast motion problem
Is there any frame interpolation app for m1 mac gpu?
Using Flowframes I get the error "Your GPU ran out of VRAM!" after it tries to use like 1.8GB of GPU RAM... no idea why, but I do get this. I tried different settings limiting the max video input size, but it's still the same, pff.
you need more VRAM then
@@2kDVI That's not the issue, trust me
So does this allow me to produce a new file that's smoother, or is it just for viewing in realtime?
I must say that I prefer DAIN with its own blur settings, like the area and threshold config, for smoother movement of the background. Even though RIFE processes all the rendering much faster, it still doesn't look as expected, since there aren't very many options in RIFE compared to DAIN.
60 fps animation without having to draw it at 60 fps would be so much help for our animators.
24/30 fps is an art style. Would hate to see "60fps" get slapped on everything for trend points...
It's cool, but nothing can beat that 3D depth awareness in DAIN.
I tried RIFE out; though it is like 20x faster than DAIN, my video output would have a slight freeze on part of the video, which is, ah, quite weird...
if you are in the app disable duplicate removal
@@AnimeRandom1 thanks will try out!
Thanks a lot
Already tested it, works well!
I wonder if they could use this commercially for anime: have the anime drawn at the usual 24 frames, then use the AI, and then have redrawers fix whatever frames are deemed necessary. It seems possible to me at least.
OK, here's why this doesn't work. I assume you want to watch anime at 60 fps:
1: This AI interpolation software does not take into account the animation director's vision.
2: Most of the time the interpolated frames mess up the timing.
3: OK, let's say it's 60fps and you suggest they redo the problematic frames - how long do you think that'll take? I'm genuinely surprised at the ignorance it takes to suggest this.
Animators are already severely underpaid for their work, so why would they want to potentially double their workload?
@@kcisor sorry dude. It was just a simple thought I had.
@Ekklipse then wouldn't that just be 29/35 fps?
@Ekklipse No, it doesn't work like that. Most animation studios are booked for the entire year, meaning they have a set time and a very strict schedule to abide by. What you're suggesting isn't easier; you're not taking into account complex shows and detailed characters. The fact of the matter is that retakes and frame clean-ups already occur in the industry, and doubling the frame count is literally going to increase the time spent working on those.
@Ekklipse I think I get what you're saying. Wouldn't you be putting in 29/35 hand-drawn frames per second instead of 24/30? I'm not sure how that would work, but I get what you're saying. You think adding the extra frames (the in-betweener frames) would increase the accuracy, but the problem is that it would still be inaccurate unless the added 5 frames acted as true in-betweens. What I mean is, if the 5 frames were spaced out for the software to fill in the un-animated slots (the slots would be 4.8/6 frames, so 4.8/6 frames in between the hand-drawn in-betweens), that could work, but the software isn't programmed to do it that way. That's how I think it works anyway. I might be wrong, because I don't know too much about animation and I'm mostly going off game rendering knowledge (the game renderer is basically on-the-spot animation animated by the computer).
I don't know - I rendered a 30fps video to 60fps with subtitles, but the subtitles flash a little!!!
I have a question: how do I render with subtitles and no flashing?
I was about to buy an Nvidia card with CUDA v5+ cores - man, you saved me from unnecessary spending.
What about GPU requirements? DAIN is super demanding when it comes to VRAM.
Is there any PC game implementation for this?
There are so many PC cutscenes that I've wanted to see in 60fps for so long
Just record it and process it. There are probably a lot of good YT-to-MP4 converters if you need that too.
@@zyxwv In the 4 months since I posted this, I also found out some editing software does it itself, and does a pretty damn good job too.
I've used it quite a bit in some of my videos this year and it looks like true 60FPS
Me converting and watching maybe every Jujutsu Kaisen episode in 60fps: *"well hello there"*
Tried this on Windows 10; I got either a crash, every second frame black, or an out-of-memory error. Nothing worked, just amazing.
Please help me. It keeps saying "Download failed" during the interpolation.
what a time to be alive!
Seems like animators should use this in the creation of their work rather than as an afterthought - it could help with the in-betweening between key frames.
Oh yes, because more frames is better, right? The in-betweens from a machine are also so valuable. What an insult to animators.
@@guancho9142 Just take the insult or get with the times. There is a reason more and more movies are coming out at higher frame rates. Leave the shackles behind.
@@timonix2 1. These movies are FILMED in high FPS and not interpolated
2. The appeal of animation, especially 2D and stop motion, is its craftsmanship. This is a fuck you to those animators who have carefully decided the timing of each scene.
@@timonix2 Also, I only know of a couple of movies that have come out at higher FPS, and the reception was not exactly amazing.
@@guancho9142 If a program can help animators get the job done faster, that's more time they can spend on the next scene. That's also potentially more time with their families. Any aid is better than crunch time.
Wow, RIFE is so much faster. I tried using the other one, and as cool as it was, it would take forever to do anything useful with it.
RIFE is now in SVP for real-time interpolation.
Tutorial on how to use rife?
There's a reason people don't like interpolation on these types of things. Anime just doesn't work with it, because of the way drawings are held for more than one frame; it practically turns the once-normal animation into a weird, janky, frame-by-frame mess that looks terrible compared to the hand-animated project.
I fully agree - the way they animate the characters on twos while the background moves on ones is very bad for interpolation.
Don't you guys see that using this tool on TV shows and movies breaks the desired result?
Anime is anime because of how it looks in motion.
Movies are planned to be played at 24 frames per second.
If they were planned to be played another way, they would be made another way.
I can understand this for personal content or sports, but not for anything else.
Do you know if the frame rate can go higher than 60fps?
'Just a few more papers down the line' lol yeah 2 minute papers. GANG GANG.
One day there will be an AI so good at interpolating frames that you won't even need a video - just put in a photo; it can be a guy running, a face or a building being demolished, it doesn't matter... It will understand and create a random, coherent movement using just a single frame.
There is already an algorithm, presented by NVIDIA not more than a month ago, that does this.
Does anybody know about the stuff they use in smart TVs? It's also smooth af and has few artifacts.
and quite old already
Holy smokes, it took 2 min 4 sec to turn a 30fps video into 60fps. I ran DAIN on it before and it would have taken 6 hours(!) for the same video, but it crashed after 2 hours. Crazy.
The only problem I have is that the de-duplication analysis said there weren't any duplicate frames in the video, even though I know for a fact the original video has them. The output also has duplicate frames. Any way to solve this?
Please share some info - how long was that video, what resolution etc., and what is your GPU?
Do you know how to install the program?
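On the duplicate-frame problem a couple of comments up: if the built-in analysis misses them, one possible workaround is to strip duplicates yourself with ffmpeg's mpdecimate filter before interpolating. A sketch, assuming ffmpeg is on your PATH; the file names are placeholders and mpdecimate's default thresholds may need tuning for your footage:

```python
import subprocess

def strip_duplicate_frames(src="original_30fps.mp4", dst="deduped_vfr.mp4"):
    """Drop near-duplicate frames, keeping original timestamps (variable frame rate)."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src,
         "-vf", "mpdecimate",   # drop frames that barely differ from the previous one
         "-vsync", "vfr",       # keep the survivors' timestamps so the clip length is unchanged
         "-c:a", "copy",
         dst],
        check=True,
    )

if __name__ == "__main__":
    strip_duplicate_frames()
```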
Hey, nice video, thanks for the information. I've got 2 questions.
1. Should I wait for an update to this program so it will work for anime as well?
2. Do you think an Intel 630 graphics chip is enough for this masterpiece, or do I have to get a new PC/graphics card? :D
Have a nice day
Can I ask if RIFE can convert a 4K-resolution video to 60fps? I have an RTX 3070 laptop.
I use Shotcut and I just changed the GOP from 300 to 600 and it's actually smoother :/
Does this work with AMD GPUs??
works well on my 5700XT
CUDA won't work; it keeps saying it ran out of VRAM even though no other program is using my 3060.
A Vulkan error occurred during interpolation.
How do I fix this?
Is it good for fast Anime edits?
I know this vid is old, but isn't Flowframes malware?
no 💀💀
Seems like RIFE supports at least 3840x2160 input now... I tried this with RIFE 3.1
Can you tell me how to use this?
The RIFE Google Colab has an error, please help.
Does RIFE work well with motion blur? The DAIN app didn't.
You are wrong - RIFE can process whatever your computer can handle. I have used AI to upscale videos to 8K and then used RIFE to interpolate to 144 fps.
Where will the output file from Colab be?
Is this available for 32-bit processors?
*WHAT*
@@PizzaPowerXYZ 32-bit PCs, duhh
@@aesrae8828 32-bit... in 2021? Like, actually? No PC that officially runs 32-bit Windows will ever be able to use RIFE without setting itself on fire.
32-bit hasn't been the standard for 13 years.
So no, unless they implemented it.
@@PizzaPowerXYZ What do you call my PC then? Lol, is my Windows called cracked Windows?
@@PizzaPowerXYZ I bought it in 2019.