Man, real-time compositor is so good for stuff like this, no more delays, just instant results. Amazing!
Indeed! All future 2D animated shows (cutout animation + frame-by-frame animation) will be instant to watch, effects included!
@@activemotionpictures So true! Omar is a true legend for developing it. I've been playing with it, not that much yet, but your demo really sells it, I think. I thought you had a clip from a real scene and just drew GP mouth shapes on top of it first haha :D
Top notch! So good that you make me want to drop my current work and investigate this field haha
You're free to ask questions here before switching. 😁
Thanks!
Amazing video!!! :) thanks for checking out the addon!!!!
I already DM'd you a video on Twitter with feedback on what could be done to make it even better.
Thank you very much for this!!!
Love to see it on a 3d character
I was thinking the same 😁
Will this export to Unity? At the very least the 2D mouth animations? Unity won't take Blender modifiers, drivers, and shaders, but I just want the shape key animations for the face to play in my Unity cutscene. I would rather use Rhubarb 2D Lipsync for Blender to create my scene in Blender and import it into Unity. Any help would be appreciated. Thanks
No, it will not export to Unity.
However, I'm going to record a new video using Coatools in 2025 to explicitly export to Unity.
Hello, nice tutorial! But it gives me an "Error!!! Json Decoder" with every file I tried. My files are .wav. Do you know how I can get rid of this error, please?
Try recording a WAV at 24 kHz, then at 32 kHz, to see which one works. It has to be WAV format, but I think you're saving the file above 32 kHz.
@@activemotionpictures Thanks for your answer! I just tested it and it doesn't change anything... I tried many WAV variants (with Audacity) and it doesn't work, still the same error...
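If it helps narrow this down: here's a quick stdlib sketch that writes a 24 kHz, 16-bit mono test WAV and reads its header back, so you can confirm what sample rate, bit depth, and channel count your file actually has (the filename and tone are just examples, not anything the addon requires):

```python
import math
import struct
import wave

RATE = 24000  # try 24 kHz first, then 32 kHz, as suggested above

# Write a half-second 440 Hz test tone as 16-bit mono PCM.
with wave.open("test_24k.wav", "wb") as w:
    w.setnchannels(1)       # mono
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(RATE)
    frames = b"".join(
        struct.pack("<h", int(32767 * 0.3 * math.sin(2 * math.pi * 440 * t / RATE)))
        for t in range(RATE // 2)
    )
    w.writeframes(frames)

# Read the header back: this is what any WAV-consuming tool will see.
with wave.open("test_24k.wav", "rb") as w:
    channels = w.getnchannels()
    bits = w.getsampwidth() * 8
    rate = w.getframerate()
    print(f"{channels} channel(s), {bits}-bit, {rate} Hz")
```

If your real file reports something unexpected here (e.g. 32-bit float or 48 kHz), re-exporting it from Audacity as 16-bit PCM at 24 kHz is a reasonable thing to try.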
You have 2 big problems here that neither you nor Tiny Nick elaborated on how to fix during an ACTUAL animation production.
1) Once the modifier is connected to Rhubarb, it's difficult to assign the mouth positions except from memory, because you won't be able to see the frames. So can you do that before you connect the modifier to Rhubarb?
2) Once the animation is generated, how can I remove the original frames I used to create the mouth shapes? It looks like Rhubarb just makes a sequence on top of those frames IN THE SAME LAYER. If I were you, I would have used the rest of the video to highlight those issues and how to work around them. This is a video about how to do lip-syncing, right?
Answers:
1. In any animation production pipeline, you have already decided mouth counts and mouth poses (1A, 2, 3open, etc.; any identifying code works). So this is not guesswork, and there's nothing for Nick or me to elaborate on there. If you need to, open a text editor in Blender and use the vowel and number codes to recall poses; this works regardless.
2. This video is about how to bake sound with Rhubarb. Any other feature requests, you may post on THEIR page.
How Rhubarb works for 2D is just an extension (not an optimization) of their plugin in Blender.
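The pose-code cheat sheet from answer 1 can even be scripted: a minimal sketch that maps Rhubarb's exported cues onto your own Grease Pencil pose frames. The JSON sample below follows Rhubarb Lip Sync's documented `mouthCues` export format (`rhubarb --exportFormat json voice.wav`); the pose-to-frame mapping and the 24 fps scene rate are hypothetical, stand-ins for your own production values:

```python
import json

FPS = 24  # scene frame rate (adjust to your project)

# Hypothetical mapping: the Grease Pencil frame where each mouth pose was drawn.
POSE_FRAME = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5, "F": 6, "X": 7}

# Trimmed sample of Rhubarb's JSON output (mouthCues: start/end in seconds,
# value = mouth shape code).
rhubarb_json = """
{"mouthCues": [
  {"start": 0.00, "end": 0.25, "value": "X"},
  {"start": 0.25, "end": 0.40, "value": "B"},
  {"start": 0.40, "end": 0.65, "value": "A"}
]}
"""

timeline = []
for cue in json.loads(rhubarb_json)["mouthCues"]:
    scene_frame = round(cue["start"] * FPS) + 1  # seconds -> 1-based scene frame
    timeline.append((scene_frame, cue["value"], POSE_FRAME[cue["value"]]))

for scene_frame, pose, gp_frame in timeline:
    print(f"frame {scene_frame}: pose {pose} (drawn on GP frame {gp_frame})")
```

A printout like this works as the "text editor in Blender" reference sheet: for each scene frame, it tells you which pose code fires and where you originally drew it.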
How can I move or adjust the mouth?
@@lovetotheworld001 In Grease Pencil Draw mode, each frame is a mouth shape. Select the frame, enter Edit Mode, move the vertices, then exit Edit Mode.