PLEASE READ:
- MetaHuman Animator REQUIRES an iPhone 12, NOT an iPhone 11
- If you want my Live Link Face recording and Move One raw animation to follow along with the tutorial, you can get the files on our Discord: discord.gg/thedarkestage
Something I thought of later: You could actually have two sequences of the same animation, one for wider shots, and one for closer shots. The closer shots would have heavier smoothing to eliminate any jitter showing up in closeups, while the wider shots would have minimal smoothing, so the body movement and placement in space would be more accurate.
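A rough sketch of that per-shot smoothing idea, in standalone Python (this is not Unreal's own filter, and the keyframe values are made up): the close-up pass uses a wider averaging window than the wide-shot pass, trading positional accuracy for less visible jitter.

```python
# Standalone illustration (not Unreal's filter): smooth sampled keyframe
# values with a centered moving average. A wider window means heavier
# smoothing -- useful for close-ups where jitter is visible -- while a
# narrow window keeps the motion closer to the raw capture for wide shots.

def smooth_keys(values, window):
    """Return a moving-average-smoothed copy of a list of keyframe values."""
    if window <= 1:
        return list(values)
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Hypothetical per-frame hip X positions with some mocap jitter.
hip_x = [0.0, 0.4, 0.3, 0.9, 0.7, 1.3, 1.1, 1.8, 1.6, 2.2]

wide_shot_keys = smooth_keys(hip_x, window=3)   # light smoothing
closeup_keys   = smooth_keys(hip_x, window=7)   # heavy smoothing
```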
I am using an 11 Pro without any problem
I cannot thank you enough for this. The quality of this video, the amount of editing and planning, and just the overall quality are insane. These two hours replace around 20 other videos I have been trying to Frankenstein together to get this workflow to work, but each one was on a different version, and some methods were outdated yet were the only tutorial I could find. This is a godsend and has most likely saved me months on the personal project I am working on. As a motion graphics/VFX artist coming from Houdini/C4D, trying to learn UE through tutorials has been a huge pain, but this is exactly the info I have been trying to find. 2k subs is criminally low for content this high quality; thank the YouTube algorithm gods for randomly suggesting this! You've got a sub, and someone who will direct everyone in my situation to this video. Cheers!
I'm really glad to hear this was so helpful! Thanks for watching :) I would love to see what you end up making, consider sharing on our discord!
@@NorthwoodsInteractive will do!
Literally, I have no words for you. I was looking for this kind of long, in-depth tutorial because I had many questions, and you answered all of them in this video. It is complete and enough for making a short or long film. Thanks a lot from me, and also from everyone in the audience struggling with a lack of good guidance ❤
THIS IS THE MOTHER OF ALL OTHER VIDEO TUTORIALS on these subjects. I still cannot believe how amazingly detailed and on point it is. Thank you - you are a saint!
Loved the hand pose section - great tips!
This has helped me more than I can express. You are professional, precise, and thorough. I can't believe you only have 3.54k subs.
Man I wish I found your channel sooner. You casually answered all my questions and doubts in one video and that’s crazy considering how many tutorials I’ve watched over the years. I’ll be your loyal subscriber from now on🔥😂
Incredible and comprehensive tutorial, thank you! As you point out, there are other (shorter) tutorials out there that cover bits and pieces of this info, but nothing as completely packaged and comprehensive as this one you just uploaded! And an excellent summary of the inexpensive body MoCap solutions for us indies ha ha, thanks a ton for doing this!
Really glad to hear it, I was trying to make what I thought people wanted
That's so useful for the community of filmmakers. You bring it all together. Many thx 🙂
I can't even begin to explain how much I have enjoyed every bit of this tutorial. It's actually one of the best Unreal MetaHuman tutorials I have ever had the pleasure of watching. Everything is laid out so perfectly that you have my utmost respect. I was afraid of Unreal Engine, but this tutorial alone made it possible for me to download it and start my ventures. Thank you so much, sir, and I promise you that once I am done with my project I will tag you.
Really well explained video. The information was presented in a concise and understandable way. Good job. Thank you.
I just wanted to thank you for doing this tutorial. Truly a staple of how tutorials should be done. I've used the concepts from this tutorial on other projects and recently decided to follow your project step by step for practice. I've finished, but had to go back and work on my lighting. I think lighting can make or break a scene/project and is one of the hardest things to get right. I hope to see more tutorials like this on your channel in the future. Thank you again and God bless.
I've been searching, looking, and struggling, just to see this video today
Thank you so so much
Glad I could help!
This... This is what we were waiting for. Massive thanks for guiding me through this maze.
Fantastic!! I think a tutorial like this is what the entire community has been waiting on! Thank you for this.
Awesome, thank you!
Sir, please make some virtual production tutorials
Dude this channel deserves million subscribers
I said the same thing
Really enjoying this. I'll have to watch it a couple of times, but it hits the nail on the head for me; exactly what I was missing ❤
This tutorial is insane timing… this has saved me so much stress with my shortfilm
Awesome video, mate! You're a hella talented guy; it's awesome to see the sick content you're churning out!
Awesome work. I enjoyed the whole 2-hour tutorial. Eye blinks are needed, but bravo!
Thanks for guiding us filmmakers who are just starting out! MetaHuman Animator and Move AI are indeed the beginner/indie solution
That's EXACTLY what I was looking for. Great video!
This is amazing, thank you! First one I have found that covers everything from start to finish for this topic and in perfect detail.
Wow, I definitely need more than one thumbs up! What a great session, really well explained and always on point! Thanks for sharing your workflow!
This is awesome, thank you so much! Can’t wait to try this out.
Wow this is truly comprehensive and in depth.
Amazing tutorial. This is how a proper tut should be. ;) Thanks a lot... I know the work this takes. Super useful, will be looking out for more of your videos.
The sphere method is what I do as well. The reason the ball goes off into a random space is because the transform it was using for world space becomes its local coordinate when you attach it. I haven't had much luck with keeping world space when attaching or detaching, but it's not much of an issue. Great overview of the process, thanks for the tutorial!
Ah yes, I have just learned this! Makes sense.
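For anyone who wants the intuition behind this thread, here is a minimal, translation-only Python sketch (made-up values, rotation ignored) of why the attached object jumps and what the "keep world" and "reset to zero" options amount to:

```python
# Translation-only sketch (rotation ignored, values made up) of what happens
# when you attach the focus sphere to the head bone.

head_world = (100.0, 250.0, 170.0)   # world position of the head bone
ball_world = (105.0, 250.0, 175.0)   # world position of the focus sphere

# Naive attach: the ball keeps its old numbers, but those numbers are now
# interpreted as a LOCAL offset from the head -- so it flies off to a point
# 105/250/175 units away from the head bone.
naive_local = ball_world

# "Keep world" attach: recompute the local offset so the ball stays put in
# world space after the parent changes.
keep_world_local = tuple(b - h for b, h in zip(ball_world, head_world))

# Reset-then-attach (what the comments above suggest): zero the transform
# first so the ball snaps exactly onto the head bone once attached.
reset_local = (0.0, 0.0, 0.0)
```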
I'm making a 3D animation movie using MetaHuman. This information is helpful. Thank you so much.
This is a fantastic video. I'm going to be trying this for my UE5 short film. Thank you!
That's awesome! Would love to see the final product, please consider posting it in the discord!
Head reattachment made easy… Just select the character and check and uncheck Actor Hidden In Game. Great tutorial!!
Brilliant! Thank you!
@@NorthwoodsInteractive Sadly, I think this only works when the head detaches during a body animation. I followed your steps for combining facial mocap and body mocap and was also forced to restart the engine in order to fix it. Please let me know if you find a fix and I'll do the same.
this is gold! thank you for that. will spread the word.
Thank you for this generous and passionate tutorial, the best step-by-step tutorial ever! 🥰
Really useful
Regarding the focus target (cube) that flies off when you attach it to the head bone: you should reset the transformation to (0,0,0) and then attach it.
Yep, that's the ticket!
Amazing job, and congratulations on the result. The only thing I missed was the animation of the clothes, which is not difficult to do with Chaos Cloth.
Yeah, I need to do some cloth sim
It's a really good guide to creating cinematics in UE for beginners, as far as I know :) Thank you
Glad it was helpful!
This was a great tutorial, thanks!
I just saw the mocap helmets, sorry for my earlier post lol
Absolutely perfect and helpful. Can you show how to integrate this within gameplay? I would really like to see a seamless integration of cutscenes exactly like in The Last of Us. Can you recreate a similar style?
Thank you very, very much for the clean tutorial. Every second is great...
Amazing tutorial that makes life easy for beginners. Thank you 👏
You're very welcome!
This was awesome. Thanks for this tutorial
Exactly what I was looking for
Awesome, glad to hear. Let me know how the format works for you
Great tutorial!
As far as I know, when you attach an object, its location is related to the attachment. So you basically need to set the attached object to its 0,0,0 if you want it to be at the same place of the attachment.
Oooh that makes so much sense, thank you!
Exactly what I was looking for! Thanks :)
I already have an iPhone. Can I ask which iPad model we need as a minimum requirement?
I am not sure, but their app page just says iOS 16.4 or later
Thx a lot 👍🏻
Amazing tutorial! Taking the cheap but effective approach to performance capture. Really Great stuff. Subscribed !
Thanks for the sub!
This is excellent! Thank you for a great video!👍
many thanks friend, amazing tutorial
You are very welcome, glad you like it!
Outstanding tutorial! I believe the heavy weight on the head and the 60-second limit are a problem for acting. I would rather record face and body separately: face first, then use the audio to guide the performance.
Yeah, it is definitely a limitation, and what you mentioned is a popular option. Just keep in mind there are a lot of subtle movements with the eyes that are hard to sync perfectly when doing the animations separately. It can lead to the uncanny valley.
Is there any chance you will be making a detailed tutorial like this to show how to make a customized Metahuman? This video is so well detailed and exactly how I need to learn, but so far the custom metahuman world is still trying to piece things together from a million vague 'tutorials' out there.
I know what you mean. I would really like to do a tutorial on that, I just need to make sure I can do it with software that isn't too expensive, like Blender or something. Most of that process requires multiple licenses for stuff like Maya, ZBrush, Substance, etc.
Thank you for such a comprehensive tutorial, very well done! I did notice a duplicated link in the description for the budget helmet rig, and was wondering if there was another part for the list we needed to snag to replicate your build 1:1? I see your rig has straight arms vs. the curved arms in your posted list, as well as a mount on the helmet that I can't seem to find. Thank you thank you again for any help, and can't wait to see more!
Hey, thank you for pointing this out! I fixed the link in the description, now it goes to the same two piece arm I have on my helmet, which also comes with the helmet mount. It ends up being more expensive than I remember when I first built mine, which was two years ago. Price now is closer to $70. Please let me know if you build it!
@@NorthwoodsInteractive You're the man, thank you so much! Will do!
I'm wondering if you've had a chance to test the new "Audio to Facial Animation" tool in UE 5.5. I wonder if we will get to the point where capturing anything will even be necessary.
I have not tried it yet. Totally possible we get to that point, but right now I am enjoying how relatively easy and cost effective it is to do performance capture
Thanks brooooo, I immediately subscribed to your channel
48 seconds, subscribed.
Great tutorial and breakdown of the entire cinematic. My only question is: for people who have an iPhone 11 Pro Max, what is the alternative to MetaHuman Animator?
Unfortunately you need iPhone 12 or newer, I misspoke in the video
Thank you so much for your work; you share such helpful stuff, and I'm so hyped by what I can do now
Superb work. Thanks a lot for all the details.
My pleasure
Amazing tutorial, immediate like and subscribe. Gonna get started on this right away.
Awesome to hear! Please consider sharing your work in our Discord!
I really learnt a lot......🤜🤛
Please make more videos like this✌✌
Hello there, this is an extremely useful tutorial! Thanks for showing us everything, from the beginning till the end!
I have a question, though: how does Move One handle body occlusion? For example, if I turn around and my arms are occluded by my body, I assume Move One loses track of them, right?
Every clip I have watched displays only full frontal movements, which usually isn't the case if someone wants to create lifelike animations.
Thanks for your reply!
As far as I know, it does not do anything special for occlusion; if it is out of sight, it won't know what to do with it and it will likely bug out. It seems to work ok in this animation when I turn away a little, as though it has a little bit of predictive ability. I have not stress tested it, and have always tried to give the camera as good of visibility as possible
@@NorthwoodsInteractive Thanks for replying. Unfortunately, that's what stops me from getting into Move One or other similar mocap systems that process video files: when creating lifelike motions, on many occasions the arms will be occluded by the body and the software will lose track of them. Only two or more cameras can solve this.
Totally agree. Move One is not an ideal solution, but the point of this video was to show it can still get some good results if some considerations are taken. I will probably try using it for some future work, just because it was so quick.
@@NorthwoodsInteractive Yes your point is valid and is proven through your video! Thanks again for showing us!
thank you
masterclass... thanks a lot man!
Sir, if I may ask: at 28:57 you deleted the MetaHuman control rig for the body, and at 1:21:32 it appears again when you are smoothing out jitters. How does it come back?
Right click on the body in the Sequencer and select Bake to Control Rig
@NorthwoodsInteractive thank you for your kind response
This is the first time I've felt like I understand the whole process. Thank you very much.
Not much detail on the Move One site, so some questions you may or may not be able to answer:
What is the real minimum computer setup? I'm running an M1 iMac with 16 GB of RAM.
The iMac has a good camera. Could that be used in place of the iPad?
Any idea when an android version of MoveOne might be released?
How much memory/storage space would a phone require for the captures?
Hey, I'm glad you found it useful! As far as I know, an iMac won't work; it needs to be an iPad or iPhone since it specifies an iOS version. The website has a waiting list for Android, not sure when it is coming out, unfortunately.
@@NorthwoodsInteractive Thanks again. It looks to me like using any form of Mac to do the processing is a non-starter. There is no Mesh to MetaHuman plugin. You might add that to your pinned note at the top. Bummer.
Just one word: cool
Thank you for all the knowledge! You're the best!
My pleasure!
wowww Amazing tutorial!!!
Thank you!
Hey, how do you not have a million subscribers? Unbelievable! Do you think that it would be possible to combine this workflow with Stretchsense gloves to get a better result with the hands? Thank you in advance!
Hey thanks! I am definitely interested in exploring this, and will look into it.
Amazing Tutorial!
Glad it was helpful!
Hopefully more phone manufacturers will start adding LIDAR to their phones as well. The iPhone exclusivity sucks 😬
Thank you so much! Crazy work!!! Respect, and thank you for showing all this! Do you always create a new MetaHuman Identity, or do you reuse an old one if it is the same head rig, for example?
Thank you!
That is a great question - so I actually create a new metahuman identity for every facial animation I use, even if it is the same headrig. Basically every time I roll the Live Link app, I do the face calibration gazes at the beginning, and set up a new identity and performance in UE. I think you might not need to do that, but every time I have tried to use the same identity for multiple performances, I get weird animation bugs.
@@NorthwoodsInteractive thank you!!!
Excellent
Thank you so much... 🎉❤ I've been looking for so long.
Enjoy!
Thanks! Good vid. Let's hope they don't screw up all that wonderful Quixel content too badly with the switch to Fab.
This is a great tutorial. Regarding the depth sensor, I previously thought the iPhone 10 was the minimum requirement; now it's the iPhone 11?
Thank you very much.
I just checked, and it looks like it actually requires iPhone 12!! I could have sworn it was 11. I will make an annotation to the video. Thanks for pointing that out!
@@NorthwoodsInteractive thank you, once again the detailed tutorial is extremely helpful! Glad to know the requirements :)
Great tutorial! Does the facial animator work with the iPhone XR at all?
No, it will only work with an iPhone 12 or newer
Hey, can you please make a video on how to fix the issue where the head gets detached from the body when we add facial animation? It works when I am doing one-take shots, but if we have multiple shots in a sequence and the MetaHuman has to be in different locations in different shots, the head remains at the origin.
Thank you in advance 😊
Hello! This is excellent, thank you! How long did the whole process take?
Thank you, glad it was what you were looking for! That's a great question, one I should have answered in the video... I did the whole process twice, once to see if I could actually get the results I wanted using Move One, and another doing the same thing for the tutorial. The first time, it took me about six hours to complete the entire scene. The second... much harder to say since I was making the tutorial at the same time.
just what I needed
Awesome, let me know what you think, and if you end up following the whole process!
Great Tutorial
Thank you!
Hey! Thanks very much for this, really appreciate it! I have one important question: does it matter whether you have an iPhone 13 Pro or a 16 Pro for Live Link facial mocap or for Move One body mocap? I'm interested in whether there are significant or relevant quality improvements in the capture. I'm really debating whether to invest in an iPhone 16 Pro for its better cameras, or would a 13 Pro do the job just fine? Thanks!
I do not know if there is a difference, that is a good question. I used an iPhone 13 and it seems to do pretty well. I am just guessing, but I imagine the quality improvements would be small, if any. It's still 30 fps, and I don't think the resolution of the camera makes too much of a difference. There might be a difference in depth sensor resolution, but I am not sure.
Great stuff! Cheers.
Great job!
Thanks!
I'm trying to replicate your DIY helmet, and I can't find the vertical elbow piece between the phone mount and the long helmet rod.
a.co/d/7Y4qkRv
Apparently I had the wrong link in the description
@NorthwoodsInteractive
So your Sequencer timeline is 24 fps, the body mocap is at 60 fps and the facial mocap is at 30 fps. Does it matter that we have different fps that don't match?
It's a great question, but I haven't seen any problem so far
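A small arithmetic sketch of why the mixed rates are harmless: every track maps back to time in seconds, and Sequencer resamples each clip onto the timeline rate. The frame numbers below are illustrative only.

```python
# Illustrative arithmetic only; the frame numbers are examples.

SEQUENCER_FPS = 24   # timeline rate
BODY_FPS      = 60   # Move One body mocap
FACE_FPS      = 30   # Live Link / MetaHuman Animator face mocap

def to_seconds(frame, fps):
    return frame / fps

def to_frame(seconds, fps):
    return round(seconds * fps)

t = 2.5  # 2.5 seconds into the performance
print(to_frame(t, SEQUENCER_FPS))  # 60  -> frame on the 24 fps timeline
print(to_frame(t, BODY_FPS))       # 150 -> frame in the 60 fps body take
print(to_frame(t, FACE_FPS))       # 75  -> frame in the 30 fps face take

# A 60 fps body clip simply has 60/24 = 2.5 source samples per timeline
# frame, and a 30 fps face clip has 30/24 = 1.25; the rates never need to
# match, they just get resampled onto the same time axis.
```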
Thankful!
Bro, is the character meant to look at the camera? If so, you can constrain CTRL_EYES to look at the camera to make it more realistic.
That is very good advice and something I will likely work into a future tutorial
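For reference, the math behind a look-at constraint like the one suggested above is roughly this. It is a hypothetical standalone sketch with made-up positions; in practice you would just add a look-at constraint on the eyes control in Sequencer rather than compute it by hand.

```python
# Hypothetical standalone sketch of look-at math: aim from the eye position
# toward the camera and derive yaw/pitch (Z-up, +X forward, degrees).
# Positions are made up; this is not the MetaHuman rig's internal solver.

import math

def look_at_yaw_pitch(eye_pos, target_pos):
    """Return (yaw_deg, pitch_deg) aiming +X from eye_pos at target_pos."""
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dz = target_pos[2] - eye_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

eye_pos = (0.0, 0.0, 165.0)        # roughly head height, example value
camera_pos = (120.0, 40.0, 160.0)  # example camera position
print(look_at_yaw_pitch(eye_pos, camera_pos))  # approx (18.4, -2.3)
```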
Hi friend, I have this problem: my MetaHuman plugin is active, but when I right-click to add a capture source, it doesn't appear. What could be wrong that keeps that option from showing?
very good job
Please make a video on the best way to transfer data from an iPhone to a PC
Not sure if you're joking, but that is in here. I felt a little silly adding it, but I actually had to figure that out since I don't normally transfer from an iPhone to a PC
@@NorthwoodsInteractive I'm using iCloud: sync, and then download when it's done.
Is there a way to use props during the recording or will it not read? Like let’s say my scene starts with someone sharpening a knife, will this process work for something like that?
You can use props, they just might occlude your hands. I think with this method though you are probably best just pantomiming it
awesome bro
Thanks ✌️
Hi. Please, what iPad are you using to run Move One? I recently got an iPad for this specific reason, but every time I boot up Move One it says the camera isn't compatible with Move One.
I am using an iPad Pro 4. What kind do you have? You need iOS version 8 or newer, I think
When it comes to recording full-body animation, maybe QuickMagic could be a better choice
I agree completely! I made this before I learned about QuickMagic. I just released a video on this channel featuring QuickMagic and showing how it stands out from the other single-camera solutions
@ I will try it on my own, but first I need a helmet with a holder
Is it possible to do face tracking on a MetaHuman that I created myself using a 3D mesh of my face?
Absolutely!
Amazing! Had to stop watching to make sure I don't run off buying iPhones, head rigs, etc. 😅
amazing
Does anybody know at what time he explained how to flip the Rokoko head rig holder for the phone? Apologies for my laziness, but 2 hours is a lot of time to look through
I don't show how to do it in the video. You need an Allen wrench, then all you need to do is unscrew the phone holder part and rotate it so it is facing away from your face, and screw it back together
@@NorthwoodsInteractive I really appreciate the quick response! Just tried exactly what you said and got it working. Keep up the wonderful work, it'll eventually pay off
Do you have any complete courses? I would like to study with you, but if you don't, can you please recommend a course for us to study?
I don't have any courses. I haven't done any full courses myself, but I know Bad Decisions Studio has a completely free beginner course on YouTube. For MetaHumans, check out Feeding_Wolves and JSFilmz