Unreal Engine 5.2 brings some awesome new features! I know what to do this weekend! 😁 What are your weekend plans?
Making blender animations! 😂
glad to see you got it working
You don't know a f'ing thing about it lol
Okay, sure @@peter486
2 minutes of tutorial and 7 mins of sponsorship 😇
Would like to see a detailed tutorial from start to finish, from a blank project to final render. Not just face mocap but also body animations. The number of steps and different interfaces required is often quite complex and confusing. Many of the “tutorials” don't explain properly and often skip steps or start with “I have already set up the…”. Also, the accuracy of the lip sync is often off, and letters like B and P are not done with closed lips, etc. Also, how to animate the eyes to be more focused on the camera when needed. I would ideally like a single tutorial instead of having to watch dozens of videos, each doing something different and not always working as expected.
That's the whole point. Don't you think it's strange that THEY ALL suffer from the same total arrogance, stupid jokes and half-info, plus sponsoring their mates' crap for you to buy and get stuck with, whilst they make dollars?
have you found one yet? looking for the same
@@LiterallyJord Me too, did you guys find anything?
Just a note - your MetaHuman has a bowling-ball-heavy lower face because you scanned yourself with a big beard. The solver doesn't know the difference between hair and skin, so it adds your beard as geometry, covers it in skin, and then you add your beard on top of that. You'd have a much better match if you went clean-shaven for the scan-to-MetaHuman/Animator creation, and then did your captures after that. From that point you should be able to use Animator while having your beard and then apply the animation data to your non-bulbous MetaHuman, just as you would apply it to anyone other than you.
I have a massive mustache and MetaHuman thinks I have an overbite. Will have to shave it, but it should grow back in one to two days.
My nba2k player always looked ridiculous because of that
@@44punk yeah, that’s how Freddie Mercury got those big front chompers. He had that mustache when his face was created.
"We don't need expensive gear"
Also him: "I'm using my iPhone right now"
🗿
Yeah... saying "expensive" depends on a person's perspective...
iPhones in general are expensive... And not very durable 😅
If you think an iPhone for facial mocap is expensive, it's best you don't go into facial mocap.
iPhones are fucking cheap nowadays, we aren't in the 2010 era anymore, just buy an old one lol...
@@Zinzinnovich You’d need one capable of facial recognition, so don’t buy one that's too old. Pretty much any model after and including the X should work.
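If it helps anyone picking a used model: assuming the capture app sits on ARKit face tracking (Live Link Face does), the hard requirement is a TrueDepth front camera, and ARKit exposes exactly that as a single flag. A minimal Swift sketch:

```swift
import ARKit

// TrueDepth-based face tracking is gated on this single flag.
// iPhone X and later (with a TrueDepth front camera) report true.
if ARFaceTrackingConfiguration.isSupported {
    print("This device can run TrueDepth face capture.")
} else {
    print("No TrueDepth camera - face-capture apps won't run here.")
}
```

MetaHuman Animator may set its own, newer cutoff on top of that, so check Epic's docs for the exact supported models.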
+100 for the neat folder structure and detailed naming of assets! :D Great tutorial!
Great tutorial man! This does skim over a lot of important details though. For example, what you showed here is only enough to animate the face and play it back, nothing else. Would've been nice if you talked about baking to the control rig to do some tweaks, or even how to animate the face with a performance (your final export showed body and face). Also, exporting the animation the way you did will only work on your specific MetaHuman, since you specified that mesh. If you exported to the MetaHuman skeleton instead, it could work on any MetaHuman.
This is correct. I am trying to figure out how to connect the body and the head together without the head floating away.
Exactly... instead of all the promoting of their friends' add-ons and soft/hardware, give us info that's realistic and complete... Spice Girl with a beard.
You guys are great, you show from start to finish for a tutorial at a followable pace.
How do I fix the floating head?
This is mind-blowing. Does it work with Android as well, as long as it has LiDAR?
YESS CINECOM UPLOADS ALWAYS MAKE MY DAY!
You don't need LiDAR, just the TrueDepth face cam on the iPhone.
😁 oh yeah. A depth cam can't map objects, it just gives the depth of whatever is in front of it. You can't get movement without LiDAR. Otherwise VFX films and games would just use a depth cam for full-body movement instead of a full-body suit with tracking points.
@@SggQpwpqpq-vq3ds Knew that, I was just referring to MetaHuman Animator and the mistake included in this particular video.
TrueDepth camera is lidar bruh
That’s just lidar
This is a game-changer for face mocap! 🙌
Hey Jordy, AKA MASTER ARTIST 😁 your Skillshare courses are awesome, man. Funny and straight to the point 😃 Thank you so much.
Don't forget, this is just the first step... in a few months or years this is gonna be even better.... amazing!
Clickbait. They want you to buy an "iphone".
I just had to
Like just every good MoCap
I never thought I needed to see jordy with shredded abs and chest 😂
*Is there any way to do this live with your phone?*
Is it possible to do on an Android phone?
Which link is for how to create a MetaHuman?
I don't get it. All these people promoting how amazing this feature is, and literally no one's talking about how it's completely unusable for complex animations because of huge bugs with head rotation, body disconnection, baking, etc.
Thanks for showing... I just wish it was even easier... so many menus to open, so many options to choose wrong, so many steps to remember.
I thought i was alone in this lost adventure 🤣🤣🤣🤣
I clicked on the face and then clicked on the asset in animation mode. And I clicked on the animation sequence (performance), but the video does not play. What's the problem?
Same problem mate
Thanks for the quick funny workflow. ❤😂🎉
Thank you. This was really helpful
I wonder if this would work with any other 3D model with enough blend shapes. This would be a really nice and easy way to animate not just the MetaHumans. BUTTTTT Jordie... I'm surprised you guys are only noticing this now?!?!?!? This Live Link has been a thing since UE 4.24 or 4.26, and MetaHumans since like the 5.0 beta lol. BUT you still made it simple and fast to ingest and start creating, thank you :)
Seems like Unreal Engine's native support for mocap is only available using an iPhone (besides the professional equipment), am I right?
I saw some webcam-based face mocap software, but it didn't seem smooth.
Buying a used iPhone is the only cheap way to do face mocap, I guess.
Very COOL! However, I have been looking for a way to do this with a character that is not a MetaHuman, a character that I have modeled in Cinema 4D... What is the best way to apply this face mocap to a character like that...?
Hi, can this method work on an Android phone?
Thank you
8:22 - Performance to existing MetaHuman downloaded from Quixel Bridge / Created from MetaHuman Creator / Custom MH. Thank you!!! I was looking for that everywhere.
Crazy! Could this be used for lip sync with just an image of a face?
So cool!
My MetaHuman doesn't have hair when I include it in my project.
Same problem for me too
Thanks for uploading video 😊
That was great, nice bruh. Love from Indonesia.
Are there significant improvements for the later editions of phones?
Still waiting for the Body Motion Capture 😅
Unreal looks almost real now!
Happy Friday!
Nice tut. Why did you mention that it requires the LiDAR sensor when you haven't used the LiDAR sensor in the video?
It's automatically used by the iPhone when he records on the iPhone with the MetaHuman app.... it gives all the data and not just a normal video.
I wouldn't call this high quality, but it's better than what we used to have.
Haven't you skipped the FIT TEETH part? Or is it alright to bypass that button?
Love the videos!
Is it possible only with an iPhone?
We don’t need expensive gear, just an iPhone 📲
😅😅 nice tutorial.. Thanks
So good, thanks!
I love the section where you explain how movie special effects were created.
And I have one quick question:
Is a VFX certificate required for employment at a company like MPC, Sunrise, Weta, etc.?
Absolutely not required. You "just" need to be good at what you're doing. They don't care about a degree. You can learn everything you need through YouTube, Google, online courses and a lot of trial and error on your own. The one advantage a school might bring you is maybe contacts and connections in the industry. But generally I wouldn't recommend studying VFX, unless it's at one of the top schools... and those are pretty expensive.
My only question is:
Can you do a full-body version for the process of creating a character's movement?
It was so unusual to see your neutral face since you are mostly very animated! Love your videos!!
Any iPhone 11 or up works?
How did you fix the floating head?
I cannot figure it out for the life of me.
Metahuman Animator does not support iPhone 11, only 12 and above.
Hi, can I use Blender or Unreal Engine on an iPad? If yes, then which is the best iPad for rendering and editing in Blender / Unreal Engine?
There was a company called Faceshift that Apple bought up. That software is embedded in all iPhones. That's why this works. It's not Unreal doing the heavy lifting.
Prior to that, Faceshift worked with a 50 USD Kinect.
What do you think about the possibility of using Apple's upcoming Vision Pro for virtual production? Could it make LED walls obsolete? Could actors and crew use this to work remotely?
Wait a sec, the iPhone 11 doesn't have a LiDAR sensor 🤔
Excuse me, what is the price of the suit and where can it be purchased? And what if you can only record with an Apple cell phone? Greetings from Mexico.
Author:
We don't need expensive gear.
Author also:
I'm using my iPhone.
God damn it, another iPhone tracking video. I'M NOT BUYING AN IPHONE JUST TO TRY SOMETHING FOR 5 SECONDS!
is this "live link" is on pc? using a phone is a little bit unprofessional and cheap.
Do you have one for combining body and facial capture like you did here?
How do I fix my separated head?
you guys are the best
Spoiler alert: Everyone (who is RICH) can do High Quality face Mocap now.
so many steps. I think this can still be made more efficient
That's how you always talk: you show a lot of great things, but I haven't seen a single video of yours with a proper explanation of how to get the desired effects.
How do you add clothes?
So fun!
Wait, I thought that an iPhone X or greater was all that was necessary for using Live Link?
Does this only work on MetaHumans, or can I use my own 3D model with this face mocap?
Did you ever make games?
Here's me hoping that one day there will be a way to utilise these types of technologies in a much simpler way.
Is there a way to animate a non-MetaHuman face?
It would have been cool if you had also mentioned the neck-floating problem at the end, etc.
Is this possible with Android? 😅
Any chance of getting this to work on Mac? The MetaHuman plugin is Windows-only 😢
Nightmare fuel!
Thank you, but how do I do shape keys in a fast way? Is there any program for that??
Dude, what are you talking about? This was possible with the Live Link app on iPhone years ago - but it was live, and you had to record the whole thing, put it together in Sequencer and bake an animation.... I made myself a lot of animations last year, same quality.
Live Link Face was available, yes, but the way the animation data is processed and the number of facial data points are far different. The use of the LiDAR is the key change, and the final product here is WAY better. I too animated lots of faces with Live Link Face, and there is simply no comparison.
"Don't need expensive gear", "I'm using my iPhone"
Last time I checked, iPhones (with LiDAR, like the iPhone 11) don't fall into the category "not expensive".
Is this Unreal Engine guide or a promo video?
Promo video with a course commercial at the end lmao
Nice ✨✨
And just like that……. I need to watch again. Lol
Does this work on an iPad Pro with the TrueDepth sensor too?
I like how 70% of this video is random ads.
Sorry, might be a noob question... Does this app's mocap data work only with MetaHuman characters? Or can we use it on our own characters created using Maya/ZBrush? Because everyone is just mentioning MetaHuman characters.
What app do you use with an Android? I have a Samsung Galaxy S23 Ultra.
So, explain to me why someone would even need an iPhone if Unreal Engine is doing the actual facial capture with the green tracking marks? I would also LOVE a video from you on the Unreal Engine and Character Creator 4 from Reallusion workflow for character animation. Both full body, facial, etc. It might actually be superior to Unreal Engine's, more universal, and easier to set up 🤔 but your opinion and a video would be awesome.
My guess is that the iPhone app is bundling the RGB video data with the depth from the iPhone's camera (and possibly its ARKit 3D model as well). If that's happening, then Unreal is likely matching its analysis of the RGB video with the extra information the app gave, using both together to give a better result.
I'd love to see how this compares with just using RGB video, since I don't have an iPhone.
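For what it's worth, that guess lines up with what ARKit exposes on TrueDepth devices: per-frame depth plus a tracked face mesh and around 52 named blendshape coefficients. A minimal Swift sketch of reading that data (an illustration of the ARKit API only; whether Epic's app forwards exactly this is an assumption):

```swift
import ARKit

// Illustrative only: roughly the per-frame face data ARKit exposes on a
// TrueDepth device, which a capture app could bundle with the RGB video.
final class FaceCapture: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Tracked face anchors: blendshape coefficients plus a 3D face mesh.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // ~52 named coefficients in 0...1; lip closure for B/P sounds
            // comes from signals like these rather than from RGB alone.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let mouthPucker = face.blendShapes[.mouthPucker]?.floatValue ?? 0
            // The tracked face mesh (the "ARKit 3D model" guessed above).
            let vertexCount = face.geometry.vertices.count
            print("jawOpen \(jawOpen), mouthPucker \(mouthPucker), verts \(vertexCount)")
        }
    }

    // Raw depth frames from the TrueDepth camera, separate from the anchors.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let depth = frame.capturedDepthData {
            let w = CVPixelBufferGetWidth(depth.depthDataMap)
            let h = CVPixelBufferGetHeight(depth.depthDataMap)
            print("depth map \(w)x\(h)")
        }
    }
}
```

A pure RGB solver has to infer all of this from pixels alone, which would explain why the depth-assisted result tends to look better.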
Thanks, very useful.
Sir, I have a sci-fi story. I want to make it into a short film, but I have no money and no software. Can you tell me how to do it? Please, any suggestions, sir?
Thanks for the great tutorial, but could anybody help? Why doesn't the animation work when I track it to the face mesh in Sequencer, after all the steps have been done? I can't find any info on how to solve it.
Everyone... with an iPhone.
Can't wait for a body mocap with the same 4D tech.
Used Live Link, but it's saying unsupported video format :/ I'm using a 15 Pro.
I would go with an open lip when picking an open frame.
I've got a problem: the "MetaHuman Plugin" is unavailable for me. Anyone else with this problem?
I posted a question about Unreal Engine 5.2 and whether it is possible to use imported animals, like a fully rigged gorilla made with 3ds Max. Maybe I should not post a link here. Sorry. This could change everything as a filmmaker - will buy your course.
Yes
@@liampugh Could you answer in more detail, please? I am totally new to and interested in Unreal Engine. I do have a film studio with a green screen and some good lights. Do you think it is realistic to set this up using the course? Is there support in the course?
ACTUAL GOLD
I have got a problem. Time: 8:28 - I don't see it in the drop-down tab; I cannot choose the MetaHuman Performance. And I do have it in the folder.
I can't import the Live Link files into UE5. When I try to drag them into the content folder, it gives an error. Any idea what's up?
Is there a way we can replicate this using the Pixel 4 XL's IR camera?
Hey, I actually would greatly appreciate the help. Can we create a MetaHuman from the Live Link video data? Like when we don't already have a MetaHuman of the person in the video, but want to create one with the help of the Live Link video..
Wow! I wanna know how to make a MetaHuman that looks so much like yourself!!! When I use MetaHuman Animator or MetaHuman Creator, the face doesn't look that much like me.
Can I use an HMC (head-mounted camera)?