Generate ANY Video! │Runway Gen-3 Alpha Tutorial
- Published: 28 Dec 2024
- Instantly generate any video scene using AI! Runway's Gen-3 Alpha is here, and so far it is one of my favorite AI tools to play around with.
🔥 Easy AI Install Guide - Install, Run & Control AI Apps, Bots, & Generative Tools ⤵️
/ pinokio-easy-run-11256...
🔥 Black Mixture Discord:
/ discord
My Favorite Motion Graphics Assets, Plugins, Templates, and MORE!!
🔌⚡ AEJUICE: bit.ly/aejuice...
Before you get started creating, make sure you hit the subscribe button because I may go over helpful tips and tricks in a future tutorial. Or just hit the subscribe button because you liked the video and appreciate free videos😉.
If you like what we're doing and want to support:
/ @blackmixture
Follow on IG: @blackmixture
Runway Gen-3 is an upgrade with many outstanding features, thank you
Thank you for the real impressions!
Thanks for doing that to show what it’s like 👍
In the video of the Asian girl wearing the wig, how did they maintain character consistency? Any ideas?
Were I to guess, they used --cref as a parameter in Midjourney and then used those images for image to video on Runway.
@@FRandAI Hi, how do they do that exactly?
Thanks
@@Lucas-of9ek 1. Upload a picture of your character to the Internet
2. Go to Midjourney
3. Type out your prompt for desired image (don’t hit enter yet)
4. Type: --cref and then paste the link of your image uploaded to the Internet
5. Optional: type --cw followed by 1 to 100, 1 being vaguely adhering to the design of your character and 100 being strongly adhering to the design of your character
6. Type whatever other parameters you need, then hit enter/done/run/whatever it is
Note: it’s far from perfect
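The steps above can be sketched as a small helper that assembles the final Midjourney prompt string. This is only an illustration: the function name and the image URL are hypothetical, while `--cref` and `--cw` are the parameters described in the steps.

```python
def build_midjourney_prompt(prompt, cref_url, cw=None):
    """Append Midjourney character-reference parameters to a prompt.

    --cref points at a hosted image of your character; --cw (1 to 100)
    controls how strongly the result adheres to that character design.
    """
    parts = [prompt, f"--cref {cref_url}"]
    if cw is not None:
        if not 1 <= cw <= 100:
            raise ValueError("--cw must be between 1 and 100")
        parts.append(f"--cw {cw}")
    return " ".join(parts)

# Hypothetical example; the URL is a placeholder, not a real upload.
print(build_midjourney_prompt(
    "portrait of a woman in a red wig, studio lighting",
    "https://example.com/character.png",
    cw=80,
))
```

The resulting string is what you would paste into Midjourney before hitting enter; the generated images can then be fed into Runway's image-to-video mode as the reply above suggests.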
Is it possible to run this locally without being required to use the website?
Tell them to have the option (for more points or whatever) to generate a depth map video along with it. That way you can use it for compositing in some additional stuff.
Thanks for the tips!
yo thanks for the video mate, I don't know if it's only me but the yellow subtitles are super distracting for me.
Wow... I make my own videos. Can I add visual effects over my work with Runway? For example, adding a new character or visual FX around my character? Is that possible with this tool, or does it only generate without any interaction with real videos? Thanks
why don't you try a long prompt?
Thank you for your efforts !!!
Can you use your own image
Yes
Nice, but I need to be able to generate consistent characters, like I can with Midjourney faces, and then use those as input into video. Gen 3 Alpha has no way to make consistent characters, even when using the same seed.
it's okay, still A LOT to improve
Thanks 😊
I used it for a month and canceled. Definitely a bit janky as is. Looking forward to future versions though.
Just not practical yet unless you want horror content lol
What's up BM, yo I know this post is a month old, but peep the Gen 3 Alpha Turbo x Flux1 flow ruclips.net/video/1FUHjx8n0Is/видео.htmlsi=Nkfs1twvJHuM1uF3
the prompts were awful
OMG ! Is the OpenAI ROBOT now Dancing SALSA too ? 🎶 🎵 OpenAI my Sugar Papi,
by PEACHY da WHUUPi on RUclips, Insta,Spotify . HILARIOUS.
CAN YOU GUESS WHICH A.I. I used ?