Interested in Runway and Pika?
Pika vs Runway: Lip Sync ruclips.net/video/Isw1rew8qrI/видео.html
Runway Multi Motion Brush: ruclips.net/video/P41bfQrsgDY/видео.html
Pika 1.0 Intro: ruclips.net/video/ctjXvyqalzo/видео.html
This comparison is really eye-opening! I love seeing how different tools like Pika and Runway stack up against each other.
Absolutely stunning how far this has evolved in such a short amount of time. Give it another year or two and it'll look like real footage, real actors, real motion.
Only 4 months later and Sora AI dropped lol
Nice job comparing the engines. Just a note: The big reason Gen-2 looks better in a lot of cases is the higher frame rate. 8 fps from Pika vs. 24 from Gen-2. Gen-2 resolution is also higher. If you run the Pika output into something like Topaz to upscale and interpolate you'll get much cleaner results. I'm pretty sure Pika will have upscale and interpolate options like Gen-2 has without having to use Topaz. Anyway, just wanted to mention that. Keep up the good work. I really enjoy all your videos.
Thank you! Good call on the frame rate and resolution differences. I may have to check out Topaz, that's the third time I've heard about it this week. I'm happy to wait for Pika and Fulljourney to add upscaling because when Gen-2 "improved" it took a lot of the fun out of it.
@@aivideoschool Can't say enough good things about Topaz. Besides the uprez and interpolation, it also gives you REALLY smooth slow motion. Very useful for extending the length of clips.
You can now manually set your frame rate with Pika Labs using the -fps parameter.
@@tyberwhite Thanks! I just tried it. Butter smooth!
wow, -fps 24 makes everything so much smoother! thanks for the heads up.
Cool vid!
Thanks for this!
My pleasure!
Is Pika Labs free or is Runway free? Answer me
Save for the quality, Pika is amazing. I realized that some simple short prompts, e.g. "zoom in", "people walking", "wind blowing", can bring out a very good video in Pika Labs. These 2 clips are made using Pika Labs and I was able to get some good horse riding shots. ruclips.net/video/29J-KL5GQyQ/видео.html
ruclips.net/user/shorts50gMpdhbcu4
Informative video
Thank you! That's what I aim for.
I think the bottom line is, neither of these tools is particularly usable yet - fantastic nonetheless but not really useful. The problem is, the second these tools do give you the quality and animation and realism, all the animation software and animators will be rendered obsolete...
There will be great economic problems as AI evolves and jobs are lost in all sectors. We've already seen bank tellers, copywriters, and marketers go first, with graphic designers and art directors next. Filmmakers, music composers, news anchors, draftsmen, architects, fashion designers, taxi and delivery drivers, etc. As robotics and AI evolve, there will be fewer and fewer jobs that aren't cheaper to do with technology. We will lose the ability to think and create as we rely on technology more. We will have more people without occupations who still need to be fed and housed and kept busy so they don't revolt. This leads to a dystopian society of walled-off communities of elites and ghettos of the masses. If society wants to survive, then there needs to be legislation to control the extent of AI. We will have media completely made up to create narratives that the ruling class will use to control the masses and keep them in their place. We are already seeing the chaos and societal breakdown occurring as media is being controlled. When it becomes manufactured, the dystopian future we see in sci-fi movies will be reality.
@@tonyr.4778 That was deep. Did you generate this text with ChatGPT?
@@tonyr.4778 Well, not everything is about money, and for now, even with AI, the human factor is still a major one. It's like people panicking when photography was invented. Just chill. It's only a tool.
Thank you for making an amazing video, very helpful.
I'm so glad it was helpful!
I like pika. Thanks for your video!
I like Pika too!
1.4k+... Thanks. That is a good review mate.
I'm glad you enjoyed it. Thank you for watching!
Great comparison. Both are good like you said
Nice comparison. I've been looking for something that can create consistent results and longer videos for my film. I don't think either of these is capable of that at the moment, but I think they soon will be🎉
I just saw FullJourney allows LoRAs now (haven't tried it yet). I think LoRAs will help with that consistency even more. Being able to create a custom face you can prompt with will be huge.
Thanks for replying. I think an advanced release will very soon cater to consistent characters, facial expressions, etc. @@aivideoschool
Thank you very much for this video. You helped me a lot.
Add negative prompts in Pika and your footage will stabilize a lot, especially the hair blowing
Thanks, I don't think Pika had negative prompts when I made this but I am very tempted to go back and try now to get her hair to stop blowing :)
The guy in the forest in Runway looks like he's having a stroke
Thanks for the comparison, very helpful! I've toyed with Pika and it works well; I think it has more creative freedom than Runway, but so far Runway has been really good in workflow and quality.
Totally agree. Someone else pointed out in the comments here, Pika now has a "-fps 24" control you can add to the end of the prompt. So much smoother! Runway's tools are great, and their non-generative AI tools (Magic Tools) are wildly underrated.
Tks.
Great Tutorial
Thank you!
You didn't actually use the magic brush in Runway. I'm not saying it would have worked, but that's why it exists. So if you had marked only the window with a magic brush and then animated it, it maybe would have worked. Theoretically.
Funny you say that... when I recorded this video Motion Brush didn't exist, but once it was released, I went back to test that exact bus ride clip you mentioned and it turned out great. I love Motion Brush. ruclips.net/video/soZbh9O25ig/видео.html
@@aivideoschool Oh thanks, I'm watching it. I see the proximity should be -10 for one-object movement; that was bugging me all this time.
You helped me a lot. I'm doing animated NFTs of ninja ducks rowing in boats; I couldn't get the boat and waves and clouds to all move but not their faces, even with the magic tool. But I never tried -10 proximity.
Great video! My experience is that the animations in Pika can be quite amazing, but I rarely see my text prompts impacting the final results. Maybe I am doing something wrong. RunwayML videos are often extremely cinematic and captivating; however, it certainly does not work with as many different images as Pika. I could just forget trying my own artwork with RunwayML, whereas Pika managed to animate that as well. RunwayML works great with photorealistic images and highly detailed, high-quality images. I hope Pika Labs manages to keep their unlimited generations, even on an eventual paid plan. That would make all the difference in my opinion.
Someone on the Pika Discord mentioned starting with a landscape and adding things to it via text prompt. I could mostly get that to work by telling it where to place the new thing "a tank/nomad/horse goes by in the foreground" which you can see at 05:12.
also, out of curiosity, is your own artwork landscape or portrait mode? Pika does both, Runway only seems to do landscape.
@@aivideoschool Cool! Never thought of doing it like that. I have had a subject in my image, and tried animating that subject via text. That doesn't seem to work
@@aivideoschool Never thought about that, however, my drawings were 1920x1080, so it should have worked. Seems like cartoon drawings are harder to animate in RunwayML, unless they depict humanoid subjects, while Pika on the other hand can easily handle things like cartoon ducks etc
thank u
I find the images coming from both Pika and Runway fuzzy, blurry, and not satisfying. That was OK back in the beta days, but now that they are both paid options, it is time to update their core capabilities.
It's a weird trade off: "you don't want it to be blurry? Lower the motion so you get an almost static shot." The original Gen-2 model had a lot more motion and none of the blurriness, but there were a lot of hallucinations. I think that's what all the models are overcompensating for.
It would be awesome to also have a comparison with Motion, from Kaiber AI
I was dragging my feet on Kaiber because they were one of the first to go from free to pay-for-credits, but everyone is going paid now. Kaiber seems focused on creators/creatives (rather than corporate/enterprise) which I love so I'm checking them out now.
@@aivideoschool Just paid only? Or a free tier plus a paid tier like Leonardo AI and others? Pika Labs is still free, free only in fact.
Excellent comparison video and matches my experience. Sooo... Pika Labs and RunwayML need to have a baby. :) With the guy walking in the forest attempt, did you try passing the "-motion 4" parameter with your prompt, to try and get more motion out of the result?
Thank you! I just went back and looked and I wasn't using the -motion parameter on those. (That might have been a choice to use "default mode" for both.) I'm loving the new -fps parameter in Pika. I've noticed when using -fps 24 and -motion 4 it sometimes creates too much noise/unexpected motion. Not always but sometimes.
@@aivideoschool What are you doing with the -fps parameter? I haven't played with that one. Just the -motion and -ns parameters.
I cannot get objects to move in runway, only the view. Anyone have tips?
Runway just released Motion Brush. That might be helpful, combined with a prompt.
How to use Runway's Motion Brush
ruclips.net/video/soZbh9O25ig/видео.html
Let me tell you a great tip on Pika: don't take the first result. Keep hitting retry and see what happens on the 3rd and 4th try and so on; you'll see great achievements the more you retry. I like Pika, but I'll try Runway, though it seems to blur and smudge so many details. Also, it's best to use a good AI to create an image and then bring it to life with other image-to-video sites such as those.
Pika is fun.
Whoa, speak for yourself.
Storyboarding will never be the same…
Optimistic to think that this will only be used in storyboarding.
At this rate in a couple of years, actors wont be needed, camera men, directors etc. So let's chill and take it more slowly
You can achieve much higher quality videos and better motion with Pika Labs than what's demonstrated in this video. Here is a raw sample clip, with no post-processing. ruclips.net/video/GMT1TVzDCCU/видео.html
Nice! Those shots look awesome.
Really great quality! What do you think of Leonardo AI being fed into Pika Labs (/animate)? The two green female android clips in this video were made this way: ruclips.net/video/IPLFHNcQY-c/видео.html
Just checked out your creations on your channel and they look amazing!! I'm still debating which platform to subscribe to lol
Hi, I want to start learning AI. Do I need any minimum requirements for my laptop specs?
You can do a lot of AI stuff on Discord (Morph Studio, FullJourney) or websites (Runway, Pika Labs, Leonardo AI). All of that can be done on a Chromebook or phone. It's only when you want your own computer/GPU to generate things locally that you'll need a powerful computer.
I'm preparing a short film and I want to know which AI is the most interesting for generating truly realistic videos, and which is even worth a subscription. I have a little weakness for Kaiber but I hesitate: Murf does not suit me, and Leonardo AI video is too imprecise, the same for the RunwayML and Pika videos.
Try Luma or Kling
@@aivideoschool Is it expensive??
You don't type 4K or 8K in the prompt?
How can we remove the watermark from Runway after animation?????
In Gen-2 there's a settings button when you generate (currently in the bottom left). If you open it, you can Remove Watermark. If you've already generated the video, try Runway's own Inpainting tool. It might work?
Can you remove the watermark in Pika?
With a paid plan you can
@@aivideoschool thanks bro🙏🏻
I've been working with Runway and Kaiber. They are hit and miss. I have some outstanding shots, but I have some weird ones. The same for Kaiber.
I feel like image to video actually gives better results with more motion than text to video.
What I see is, Runway is better for camera movement, and Pika for objects. It would have been sick to merge them somehow, but idk if it would turn out any good.
I've been having great fun and getting great quality using Leonardo AI's new Photorealistic gen mode, taking the generated image and using it with Pika Labs' "/animate" command. I believe the output is on par with RunwayML at that point. @aivideoschool
You say that it's possible you provide too many details and that's why the graphics don't come out, but the AI creators write that the more detail, the better.
I know many people go that route for image generation but in my experience, doing a short, concise prompt with a specific instruction works better for image-to-video. I'm genuinely curious if anyone has good tips involving longer i2v prompts.
I think Runway looks bad. It's impressive, but it definitely has that AI look that most people are turned off by. It's cartoony. Pika at least looks artistic and real, rather than CGI fakeness.
Runway updated their model recently and it looks more realistic than when I recorded this, but I totally agree. I hate the cartoonish AI look that isn't quite animated but isn't real looking either.
@@aivideoschool It's impressive, but still a long way off from me using either option.
@@ArcanePath360 Can you suggest a better option than these two?
Any idea how to reproduce this on free open source software? Stable Diffusion & Text-To-Video for example. I would be interested in a workflow.
Pika is free. Also, check out FullJourney, which is free and based on Zeroscope. Here's a tutorial ruclips.net/video/eaV9wzRDTMQ/видео.html
I tried to use Pika, registered on Discord, and I couldn't use it; instead I'm still on the waitlist.
We can mix chroma key and AI art to make something cool
Runway's AI rotoscope/green screen tool is pretty impressive
It was your prompts that didn't work, not the a.i.
Too bad Pika only generates 3 seconds. Maybe that will increase in the future.
Yeah, it would be great if they had the "extend" option like RunwayML. But you could always use a video editor, or automate the process using ffmpeg, grab the last frame, and resubmit it to Pika Labs with the "/animate" command with a text prompt telling it what to do next with that final frame.
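For anyone who wants to automate that last-frame trick, here's a minimal sketch in Python. It assumes ffmpeg is installed and on your PATH; the file names and prompt are just placeholders, and you'd still upload the saved frame to Pika's /animate command yourself.

```python
# Minimal sketch: pull the final frame out of a Pika clip so it can be
# fed back into /animate as the starting image for the next short clip.
# Assumes ffmpeg is installed and on the PATH; file names are placeholders.
import subprocess

def extract_last_frame(video_path: str, frame_path: str) -> None:
    """Save the last decoded frame of video_path as a still image."""
    subprocess.run(
        [
            "ffmpeg",
            "-y",             # overwrite the output file if it already exists
            "-sseof", "-1",   # start decoding roughly 1 second before the end
            "-i", video_path,
            "-update", "1",   # keep overwriting the same output image,
            frame_path,       # so the last decoded frame is what survives
        ],
        check=True,
    )

if __name__ == "__main__":
    extract_last_frame("pika_clip_01.mp4", "pika_clip_01_last.png")
    # Upload pika_clip_01_last.png to the Pika Labs Discord with /animate
    # and a prompt describing what should happen next in the scene.
```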
I put the same image in both, the Titanic wreckage. Pika made bubbles; Runway turned the image to make a whole new section of the boat. Pika is awful, like it has its own agenda and doesn't follow my prompts.
I know, I've had success and misses with both. One tip I heard that I found helpful was remembering the AI can see the picture too so I might say "a fish swims out of the hole in the ship" or whatever you're going for. And then keeping the action limited to one specific thing that can be shown in 3 seconds. It's still not perfect but then I fine tune from there or hit the refresh button. Good luck!
I agree. I used both for my 4 minute dark love story, "Law of Attraction" ruclips.net/video/ExVfCKkQtk0/видео.html . I found Pika Labs more useful even with a lesser video quality. Runway ML can be a bit frustrating at times.
Nice! Pika now has 24fps if you add this to your prompts: -fps 24
No offense meant to the author of this video, of course, but I almost find this video sarcastic because from what I see in a lot of ways Runway's results look waaay worse than Pika's...
I guess my ultimate point is - the quality of the animation doesn't only lie in the smoothness and the lack of noise... just because something is polished and its textures look smoother doesn't mean it's more realistic and better overall in all aspects.
Am I the only one thinking this?
No offense taken. I just want to clarify that the quality I referred to had to do with technical aspects like HD resolution and frame rate. 3 months ago when this was recorded, Pika was only like 12 fps and half the resolution that it is today. Pika has surpassed Runway and it's still free
@@aivideoschool thanks for the response! I understand your point totally and I'm glad you didn't interpret my comment in a bad way.
Both crap but I prefer Pika for some controls for now. Also it's faster and free beta. Thanks for the video.
Thanks, I agree. Runway had an early lead but the controls and quality from Pika have surpassed them in the past few months. Hopefully the competition means higher quality video and more creative options.
More mashed potatoes.
ai is good
"ai filmmakers" hahaha. find the error.
This is a shill video. Pika is paying people off to promote their sub par algorithm. Runway is WAY better.
FYI - I don't do promos for anyone, I just share what I find interesting and am enthusiastic about. If you watch the whole video, I say they're each better at different things.