Finally!!! Camera control!!
This is just the beginning.
It's a brand new world
Glad I came across this video to confirm that I am not crazy! Even before this camera control drop, I couldn't get Runway to do anything with even the most basic prompts! And this is also while using their own prompt guide. You tell it to make the character walk, and all you get is the camera moving around the character in slow motion. It's crazy! We need better prompt adherence in Runway. You just end up burning through a bunch of credits to get mostly unusable shots, like what's happening to your characters here. It is almost as bad as Luma Dream Machine, just with better aesthetics.
There's a lot to improve upon, that's for sure, but all these companies continue to impress me weekly.
@@gabemichael_ai True true. But we've got to keep calling this stuff out. The competition for our dollars is fierce in this space, with new players entering the arena just about every day. So we've got to force these companies to step up their game if they want to keep getting our money. Kling AI and Hailuo are the top players in my book for now, with Luma Dream Machine dead last. Runway seems the most promising since it uses actual cinematography concepts in its prompts, but they've got to seriously step their game up.
That BMW is sweet
I wouldn't mind driving it.
What we need in Runway, apart from adherence to the prompt AND the camera controls, is image-to-video WITH a video ControlNet (a video as an example of the movement we want to see in the scene), something like video-to-video but starting from an image. This is so important.
Nailed it. 🔨
Always something new
Seems like every week!
We all need to thank Minimax. It's been a long time since we've seen so many new features in Runway. 😅 I LOVE competition.
Good point. Competition will be great for us all!
Runway’s latest Gen-3 Alpha Turbo release includes Advanced Camera Control, enabling direct control of camera movement. To start, you must add an image manually, as text prompting alone won’t work. Using the mouse, you can drag across the image to simulate camera movements, which are automatically reflected on the sliders. It’s recommended to add a prompt describing the subject and scene to help the model interpret the camera action accurately.
Templates are available, allowing users to specify the subject and action for each camera move. Each prompt must be cleared before entering a new one; otherwise, prompts stack. For example, setting the camera to lower and tilt creates a low-angle shot of a monster walking down a hallway. Although the camera moves as specified, the subject remains static, which may not achieve cinematic movement.
Other moves, like the orbit shot, work well but may cause the character to morph, reminiscent of earlier Gen 2 issues. A futuristic BMW shot with a low camera angle looks impressive, although the model added rearview mirrors to the image. For another test, the camera orbits around a black demon while two characters kiss, keeping specific visual details like the demon’s horns intact.
However, achieving coherent scene interactions remains a challenge. In another attempt, a robot in a tilt-up move stayed static, not delivering the desired cinematic effect. This new control feature in Gen 3 shows promise, and improvements are anticipated in upcoming updates.
Great AI summary of my work
The zoom out camera motion is great
Yeah I love it.
Thank you for your efforts !!!
❤️
This AI is going to be huge. I tried the 1-month unlimited generation at $98. It's pretty good, actually. But ngl, there's a long way to go before you can call it "proper" at making the movement feel "alive".
🦾
"This AI is going to be huge"??? Runway has been on the top 3 chart since like a year ago, my dude. Only Kling and Minimax can compete. The new Mochi is also good, but since it only does text-to-video for now, it isn't a real contender.
@@gnoel5722 Yes, it will definitely replace 95% of filmmakers, and anyone can be the one. The potential of video generation AI is unbelievably high, far beyond its current capability. Current video generation quality can't even match a simple animation on YouTube. But surely, AI can spark a huge explosion of innovation. So I can't wait for that.
Slow movement can be fixed though in post by speeding up the clip.
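For anyone wanting to do that speed-up on the command line, ffmpeg works too. A minimal sketch, assuming ffmpeg is installed and using hypothetical filenames and a 2x speed; it builds and prints the command rather than running it, so you can check it first:

```shell
# Speed up a slow-motion clip in post with ffmpeg (assumed installed).
# setpts rescales video timestamps; atempo speeds up audio (each atempo
# filter accepts factors from 0.5 to 2.0).
SPEED=2   # 2x faster
PTS=$(awk -v s="$SPEED" 'BEGIN { printf "%.2f", 1/s }')

# Hypothetical filenames: slow_clip.mp4 -> fast_clip.mp4
echo "ffmpeg -i slow_clip.mp4 -filter:v \"setpts=${PTS}*PTS\" -filter:a \"atempo=${SPEED}\" fast_clip.mp4"
```

For speeds above 2x you'd chain atempo filters (e.g. `atempo=2.0,atempo=2.0` for 4x).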
Damn. How do you keep up with this?
Hard to do!
The problem I’ve always had with Runway is the lack of dynamic animation of the subjects. Like, the camera movements are cool, but the subjects just don’t do anything.
Check it again today. They just did an update to their model, and I've gotten a significant amount of movement even with Gen-3 Turbo's Advanced Camera Control.
@ I don’t see anything on their website about an update regarding that 😅
@@chrisjlee2013 Well, I've tested it extensively yesterday, and it's so much better than last week.
Good for landscape/static model shots
Sure is
Excellent, thanks! Very cool, yet Runway is still miles behind MiniMax in prompt accuracy and animation in general. Their new updates are cool, but they still have a big morphing issue and they censor too strongly. Logically, the most important thing would be for them to match MiniMax's prompt-following and its superior animation, which renders actions at real-time speed by default rather than in slow motion. Even Runway's "fast motion" usually looks like a faster slow motion, in most cases.
For me, the slow motion isn't that much of an issue because I speed it up in the editing software.
I might need to experiment with MiniMax more, since I haven't seen much of a difference between them. Runway's other tools just make it more flexible for me.
Same
🦾
@adarwinterdror7245 try to make a character walk without mistakes in Runway. After 10 failed attempts, go to MiniMax and nail it on the first attempt. Then we can continue the chat. 😮
🤖
But when I generate from images, why doesn't the subject move much? 😢 Please help.
What I've found is that more photorealistic humans get more movement than animated scenes or animated characters. Hopefully that changes soon.
They really need to put all this in portrait mode.
I believe you can or that it's coming soon.
You can. It’s a Turbo mode feature.
Nice music. Who is it?
More info on that soon.
So, another credit-burning feature by Runway? Great.
This is all headed somewhere great. But until then use it when needed for a specific purpose.
Is it possible to generate in 4K? Thanks.
Not yet. But you can upscale with Topaz and it looks great.
@@gabemichael_ai Topaz? Do you have the link, please? Is it possible to upscale HD video to 4K?
@@GINGAGEAR_PRODUCTIONS www.topazlabs.com/topaz-video-ai
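If you don't have Topaz, a plain (non-AI) upscale to 4K is also possible with ffmpeg's scale filter; it won't invent detail the way an AI upscaler does, but it gets you a 4K file. A minimal sketch with hypothetical filenames; it prints the command so you can run it yourself:

```shell
# Plain bicubic/Lanczos upscale of an HD clip to 4K UHD with ffmpeg.
# Unlike Topaz, no new detail is added; pixels are just resampled.
TARGET="3840:2160"   # 4K UHD resolution

# Hypothetical filenames: clip_1080p.mp4 -> clip_4k.mp4
CMD="ffmpeg -i clip_1080p.mp4 -vf \"scale=${TARGET}:flags=lanczos\" -c:a copy clip_4k.mp4"
echo "$CMD"
```

`-c:a copy` passes the audio through untouched, and `flags=lanczos` is a reasonable default resampler for upscaling.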
Every week EVERYTHING changes
Yep
It might work for the Matrix bullet-time effect… give it a try.
Good idea
I love it, BUT they seem to sacrifice character movement for camera movement. Needs work, but it will be amazing.
Nailed it.
I kind of feel Kling AI is miles ahead of Luma and RunwayML in terms of character movement and control. Camera movement is good, but not the most useful feature.
That's what happens when there is no such thing as copyright in your country. Hoping we have some declarative rulings soon on our end.
@@gabemichael_ai I don't understand 😅
Couldn't focus; you're talking from your throat instead of your mouth.
This is what happens when you get sick, but still want to be helpful to others. You should try it sometime.
@ Sorry