mocopi for Unreal Engine Tutorial: #3 Live Retargeting
- Published: 30 Sep 2024
- In this video we learn how to retarget mocopi motion data to the Unreal Engine 5 mannequin and the MetaHuman.
Download the mocopi receiver plugin: www.sony.net/m...
X (Twitter): /raynosbysony
For technical support and requests, please leave a comment in the mocopi FunLab Discord.
discord.com/invite/fX4CqTBJnr
Thanks for the tutorial. I am using VIRDYN gloves for the hands together with mocopi. How can I retarget the data from mocopi and VIRDYN at the same time onto one character?
Excellent tutorials. How would you live retarget a character that is a BP with hair grooms etc.? I can't just select the base skeletal mesh in that case.
How does it perform if you try to walk on stairs or any kind of obstacle that has height?
Can I record the motion and then edit the MetaHuman in the sequence editor (like the fingers)? And will the UE face capture work with it as well?
Yes, you can. There are videos out there that explain how to do it ;)
Hey brother! Did you figure out how to edit it? I still can't find a method to add animation to the fingers...
I figured it out! After you record your animation and retarget it to your MetaHuman -> go to the sequence -> add the MetaHuman blueprint there -> add that animation to it -> bake to the MetaHuman rig. That creates a rig with keyframes, which you can manually adjust later (including finger adjustments).
@buniakdmitrii9287 Wow, thanks man! Been trying to figure that out, you're awesome!
@btmanimations7079 No problem!
These tutorials are great. Thank you very much!