How to Make Midjourney Consistent Characters Talk Using Runway & Pika Lip Sync (AI Video Tutorial)
- Published: 3 Aug 2024
- Midjourney can now create images with consistent characters, so I’m going to show you how to make a movie with those consistent characters using the new lip sync features in both Runway and Pika. CHAPTERS & LINKS ⬇️
🍿 Here's the Time Trouble movie created in this tutorial
• Time Travel Trouble - ...
🧐 My comparison of Runway & Pika Lip Sync
• Runway vs Pika Lip Syn...
🍿 And here's the original Gen-2 and Wav2Lip time travel movie from June 2023
• AI-Generated Time Trav...
📖 CHAPTERS
00:00 Introduction
01:22 Creating character reference images in Midjourney
02:32 Spreadsheet to keep track of characters, styles, script
03:38 Creating style reference images in Midjourney
04:00 Combining a character and style (cref & sref)
05:16 Inpaint the styled image to change wardrobe
06:05 Inpaint the character reference to change wardrobe
07:15 Workarounds for framing (close-up, wide shots)
08:14 Two different consistent characters in the same shot!
09:33 Pika Lip Sync
10:34 Longer Pika lip syncs using video (CapCut hack)
11:46 “Add 4s” doesn’t work with lip sync in Pika
12:11 Use other image-to-video generators to create longer videos and motion (Stable Video hack)
12:28 Runway Lip Sync & Motion Brush section
12:38 Runway Motion Brush
13:28 Add motion before lip sync
13:37 Motion Brush for gestures
14:08 Runway Lip Sync (Generative Audio) using Gen-2 assets
15:04 Runway Camera Control
16:02 Recap of steps: Character, Style, Motion, Speech, Lip Sync
16:26 Our time travel movie with Midjourney consistent characters talking with Runway/Pika lip sync
🛠️ TOOLS
Midjourney: www.midjourney.com/
Pika: pika.art/
Runway: runwayml.com/
I also used these for the final video:
Stable Video: www.stablevideo.com/
ElevenLabs: elevenlabs.io/
Suno: www.suno.ai/
CapCut: capcut.com/
#midjourney #runwayml #pika #pikalabs #lipsync #runwaylipsync #pikalipsync #talkingcharacters #ailipsync #aivideo #aivideocreation #aitutorial #aitutorialforbeginners #filmmaking #howtomakeaivideo #consistentcharacters #motionbrush #runwaymotionbrush
New stack:
- MJ
- CapCut or Canva to cut out character
- Animate background separately with PixVerse (better, more stable motion brush) or Stable Video
- or just camera move with...
- Lip Sync with Pika or Sync Labs
- Optional cleanup step with Topaz or some of these ComfyUI steps
Awesome. Clear, concise, and easy to follow. Excellent visuals and examples. Very helpful. Thanks!
This is quite easily the best tutorial for consistent characters. Well done.
Thank you!
Thanks for the tutorial. This is great.
Glad it was helpful!
Thanks, Mike. This is great.
Great video - thank you :)
Very nice tutorial and an amazing end. Really appreciate the video.
Thank you for watching the whole thing, I hope it was helpful!
Excellent Videos 😎💯
Thanks for watching!
17:09 LOL this is AWESOME!! Though I prefer Runway's lip sync because it has facial movement as well.
Good tutorial
Thank you, great video! However, Pika doesn't offer lip sync at the moment, and RunwayML didn't work with my animation videos; it also didn't work with faces in profile.
Pika does offer Lip Sync, I just tried it last night. You have to upload an image or video, then you'll see the Lip Sync button. It might not work with animated characters. They have to look human-like for it to recognize the face.
Can you please let me know how to lip sync when there are 2 characters in the same scene? Is it possible? Runway keeps detecting the wrong face. (I also have a robot in the scene which isn't getting detected at all.)
The only way I know of currently is to cover/paint over one face in the photo and lip sync the first character. Then paint over the second face and lip sync the other character. In your editing program, crop or mask the two clips so you only see the talking faces.
P.S. You need to add 0:00 at the top of your chapters list in the description so that the chapter markers will show up on the video's timeline. Otherwise they're just buried in your description area, hard to find and not helpful to viewers.
Thank you for this tip! I didn't realize that was the trick to get them to show up in the timeline.
@aivideoschool I figured that. My pleasure, my friend. By the way, in my hunt to find the best quality AI videos, I stumbled across your channel and saw you reviewing a lot of the platforms that are already somewhat known. But I just stumbled across Replikant. It's more of a 3D animation software, but I think it may be relevant to some of your followers, as it allows ultra-realistic storytelling videos to be created very easily without years of 3D animation experience. It's a new platform made by Epic Games, I believe.
Glad I showed up in your hunt. Replikant looks so cool, but I don't have an Nvidia GPU (I'm on a Mac M2). I wish they had a cloud version like Wonder Dynamics, but it looks a lot better than Wonder.
For those who can't afford Midjourney: is there a way to get consistent characters with the free DALL·E 3 inside Microsoft Bing? Thank you very much.
The best way I found prior to Midjourney's update was to uniquely name the character in the prompt and use a detailed physical description that includes age. Then you copy the seed number from an image you like. If you add onto the original prompt with the same seed, the result is fairly similar. I don't think DALL·E will give you a seed, but Leonardo AI will, and it's also free.
@aivideoschool Thanks, my friend. There's a new way to make AI movies that you should make a big video about: create a full-body shot of a character and his background, then make the character move with Viggle AI. This is much, much better than the slight motion we get with Runway, Pika, or Haiper.