ITS ON LINKEY DONKEYYYYY! Here is video of the animator app studio.ruclips.net/user/videoiXH79mrKADM/edit
First rap test ruclips.net/video/mAT1X4xbEdA/видео.html
Do you need the depth capture in Live Link or can you just use video?
Bruh! It is nuts how accurate this thing is.
Yes so glad to hear you’ll be doing more tests with this. Bro just realised I’ve been on this unreal journey with you since your channel was a baby. You’re literally the don in this field. Keep em coming
Knew this was coming the second I saw the Unreal post about this. Excited to try this out today.
Bro, you are a master!
And the way Unreal is improving this technology day by day is amazing.
thx dude!
Incredible! You're so ON IT. Thanks for the walk-thru. Very helpful!!!
Good and fast tutorial, nice man
tx
7:00 I'm listening to this video driving around, windows down, level 20 on Bluetooth and at a Bus Stop with little old ladies 🤦♂️
lol
Thanks for the tutorial, I can't wait to try it, although it will take hours since the shorts that I'm doing have a lot of dialogue, but it will look a lot more realistic. Best
You're amazing. Thank you for staying on the pulse!
thanks for being here!
BIG! I was wanting to explore this, thanks for the tutorials!
The accuracy looks insane
yea
This was jam-packed with so much information. I have to pause this video and slowly follow all of the steps. This is amazing! I'd like to purchase the headset for my iPhone. Did you make it yourself, or did you get it from a website?
home made bro with my sweat and blood its not $129 anymore though its $169 now
just done my first quick test following your tuts to guide me through the process, got to say wowzer animator is amazing. so much cleaner than i expected. well worth the iphone rental :}. keep the vids coming :}
oof someones bout to buy an iphone 😂
LUV'd your last vid with your little girl (all of your vids ❤ ) 🏆😁👍!
hahah thx man it got 20k views on twitter hahah
Some day i think you will make tutorials before release. haha. So quick!!!!!!!!!!!
hahaha yea they didnt select me for beta access unfortunately i know some did so im gonna have to catch up to them
4:43 "Iden-TITTIES" hahahahahaha
Great video man! Thank you for the content.
Awesome.. Randomly woke up at 4am knew there was a reason... thank you!
hahaha
You the man bro
Thanks Lord Helmet!!!
yeeeeaaaaaaa 🔥
hey J can i borrow 10k subs so i can hit 100k? thanks mang
Hi! Thanks for the cool video. Here's the question: will it work if you upload a regular video shot on a camera?) just no iphone))
dont think so iphones are precalibrated like i showed in the performance editor
No, the software requires depth information which a regular camera can’t provide. Borrow an iPhone 11 or newer.
...and off we go!
sooo quick , Dude ! you are fast as hell .... 😁
i woke up 5 am today haha
Super sick! Do you feel like the results are significantly better than the live stream app?
bro go watch my rap with it it will answer your question
Great video man! I'm getting an error when I hit the process button in the Metahuman Performance. It says "The Processing Pipeline failed with an error." Any ideas on how to fix this would be appreciated. Thanks!
Great job!
Do you think it's worth shelling out a bit more money for the iPhone mini 13 over 12?
im broke so ur askin wrong person
I may be a little late to the party, but im confused why you choose LiveLink Archives in CAPTURE SOURCE (which is uploading footage etc from the PC drive?) but then when you go to import, you do it from the iPhone in CAPTURE MANAGER?
i made tutorial both ways my usb transfers faster for bigger files
@@Jsfilmz oh sweet as no worries bro. Any tips on uploading from USB then using archives? Seems more complicated to setup the Performance that way...
A lost detail in Hideo Kojima's DS2 trailer was at the end, saying the performance capture is powered by MetaHuman.
That game is going to be insane levels of detail.
yea hideo was lurking around on my channel when ue5 first came out i was one of the first ones to cover it
hey bro! awesome!
I have a little issue, when I track my face with the animation sequence it's like nothing happened in my sequencer, but it exported correctly because when I open it alone the animation is fine. any idea?
Thank you!
i dont understand :( maybe join unreal discord and post pics and issue there
So, whenever I want to make a facial animation (of myself doing it with live link) I need to go through these steps? But they can be used on every metahuman?
epic insane, looks much better than faceware or livelink. like you say the curves look smooth and no jitter. and to think i spent 4 hours last night doing 30 sec of manual facial animation that looks rubbish. so if i get a friend who has an iphone they can send me clips?
yes
@@Jsfilmz awesome ,, is there any reason not to get the iphone 12 mini ?
@@kool-movies i havent tested longer takes with 12 mini yet but it used to overheat on me alot hahaha
@@Jsfilmz your rap video is a long take so hopefully it will be good, i just ordered/rented a cheap refurbished one. hopefully it arrives tomorrow :} the 3lateral video that just released is mind blowing. assume they used a stereo camera?
Thanks for the tutorial! Do you have a recorded take file I can use as a test? I dont have an iphone. Thanks
I get the "assertion failed" crash every time I promote the first frame... what should I do?
Hey J, for some reason the result of your test is not very good, compared to other tests I have seen online. Do you reckon there was something in the configuration/shooting conditions that interfered? Or perhaps the other tests used other types of cameras, such as stereo cams, rather than an iPhone? Thanks for the tut though.
i think the demo videos that came out were done with stereo cams not sure
@@Jsfilmz The rig used when they announced MHA wasn't a metahuman rig, in fact they only showcased how the new system works on metahuman rigs for like, 5" at the end of the presentation.
@@matteo.grossi hahaha thats cheating then right lol
This really is such a big step forward. Very exciting. The only problem is the iPhone only requirement. That makes no sense for something like this, but let's hope it changes sooner or later.
beats buying a real mocap system 😂
@@Jsfilmz True I suppose :)
Also hey I'd like one of those helmets, but I do have an iphone 11, will that work? It's bigger than the mini, of course.
i know some people whos tried 11 with MHA and they said it works i havent tested it myself
Gonna buy an iphone 12 mini like you have, can you tell me which mount you use so I can buy that and yr helmet? @@Jsfilmz
So how do we connect this to a body so we can add animations to it? 😅
This is awesome! Would I be able to take the metahuman/face mocap and export it into blender? I don't need any of the textures or materials. Thanks!
when I play the level in a new window (PIE)
the Metahuman character is moving his face with my motions really well, but when I click on any window other than the PIE window it starts to lag, and when I click again on the PIE window the character moves normally!
what should I do ?
I need to buy iPhone 12 + first hahaha
looks like 11 works
@@Jsfilmz In their docs they group the X, 11 and 12 together and then 13 and 14 together. Then they give a caveat about the X, saying that its not capable of capturing more than a few sec, but they dont say that about the 11, so yeah, I think the 11 is capable of capturing longer form content like the 12.
@@aaagaming2023 I tested the iPhone 11 Pro last night and it works fine, zero issues.
Thank you so much, its amazing
So strange, in Capture source whatever I put( live link or archive) it doesn't find it. It gets green in Capture manager but nothing shows up🤔
hey man im uploading another tutorial stand by maybe u missed something
@@Jsfilmz great, thank you so much
Amazing content as always!
Could you please make a video on troubleshooting these three issues:
- “Promote Frame” randomly jumping to a different frame than the selected one.
- Metahuman Identity Solve not accurate result.
- “Add Teeth Pose” breaking the Identity Solve even more.
Thanks a lot!!!!!
Awesome video! Thanks for the quick setup explanation! But I have to point out that MetaHuman Animator already uses AI for its solves.
When you hit the "Prepare for performance" button, it trains a model on your face to later mimic the way it moves so it can animate other metahuman characters to that likeness. That's why this step took 8-10 minutes : )
Hi, i asked this to a lot of people but couldn't get a proper answer. After creating an identity and performing with it, my Metahuman character's default lips and teeth change, because it takes on my identity. Is it possible to keep the animation exactly the same but with the character's default facial features?
So right now its only recording, but no livestream yet?
live? nah facegood for that
when doing facial mocap, what do you do for a mic? do you have a little one that you attach to your helmet? what kind of mic is good for that?
just my good ole Sennheiser G2
@@Jsfilmz do you attach it to the arm of your helmet?
Incredible! Is there a way to export Animation data to Maya to have more freedom for further tweaks?
that would be amazing but i dont think thats possible yet
This guy made a script to do it ruclips.net/video/ecYO5-5fL0U/видео.html
Hope my XS works. Don't want any more Apple products
Please try it & please report if it works, I have same iPhone XS.
I am not gonna have access to windows system for at least 15 days, but I am dying to test this feature.
has anyone got it working well with an iphone 11? the new epic post says it needs at least an iphone 12..
try it broski
Thank you for sharing a fun and essential tutorial!! Anyway, is there a way to use the neck animation recorded by Unreal Live Link facial capture? When I imported the facial anim with neck animation and applied it to my metahuman skeleton, the body and face were broken because of the neck anim. It's possible to use just the facial anim without neck movement, but is there a way to use the neck anim too? Or is the only solution mocap data (with neck anim) + facial capture anim with just face movement (without neck anim)?
I Hope that behind the scenes they are working on full body mocap
Great stuff!
Thanks!
If I'm not wrong, you should be turning off the neck solving on a head-mounted camera, and use it only for static cameras.
in my case ill use neck movements from my mocap, the exported sequence doesnt come with neck. check out the new rap video i uploaded
Jo do you have any clue why the metahuman i created in creator does not show up in bridge? Its so frustrating man i spent over an hour now trying to get it to work
I am mad as fu!& now😅
update bridge
Could you provide me with the information about your iPhone 12 model so that I can search for a similar one to buy?
12 mini broski
@@Jsfilmz Thanks!
Hello bro. I just went back to this video because for me, after updating to 5.2.1, it says Preparation for Performance Failed. Any tips on it bro?
You're absolutely smashing it man 👏🏽 Thank you so much for the awesome content!
I do have one slight issue though!..
For the life of me, I cannot get the Livelink Face mocap to work with separate body mocap. The head/chest just detaches itself and they are both independent. It's driving me insane. I've tried following some advice on the UE forums to no avail 😭
Have you experienced this yet? any tips?
Thank you
after updating the iphone app, it gives a warning in the animation part of the Live Link Face app saying "your device model is unsupported, you may continue but your results could be affected" for metahuman animator capture. and it works, but i wonder: my economic situation is not good and I will be developing games. How much performance difference does the iPhone 11 make?
The software requires an iPhone with a TrueDepth sensor (the front-facing depth camera, not the same thing as the rear LiDAR on Pro models) because it needs depth data to accurately track your face. You can always borrow an iPhone 11 (or newer) to do the capture and then transfer the files to your computer.
thank you for your answer. what is the warning for then, do you think there will be a noticeable quality difference?
@@UnrealEnginecode The quality I got was excellent so I'm not worried about it.
Did you see issues of floating head? Face animates and body remains still.. if neck rotation is disabled it's fixed but the neck rotation part of the capture is lost.. Do you know how to avoid that? Thanks! 🙂🙏🏻
Thank you so much for your tutorial, so timely. In addition, I am trying to import from a mesh body in UE5.2, and it seems the tracking marker step can no longer be carried out. Have you encountered this?
can it be used not with my face but with the one created in metahuman creator? and how to do it?
stay tuned
Great showcase! In case you plan to make more test videos, can you show expressions that are hard to do with Apple ARKit? I'm curious how it compares.
For example a sad face with hanging lower lip, asymmetric brow movement🤨, worried face 😟 or any interaction between teeth, lips and tongue.
man im a terrible actor but ill try
@@Jsfilmz Thanks a lot!
Great video - you mentioned about when MHA will later use AI. Just a heads up it is AI driven currently. Pixel tracking is only a small part of the foundation.
Great video!
wait like its doing ai pose estimations already?
@@Jsfilmz yeah there's a lot under the hood that's ML driven already. Facial tracking doesn't account for wrinkles or much else other than eye and mouth shapes; everything else is interpolated with AI. There's more coming, but the foundation is already utilizing AI in a lot of areas. The second-pass animation is still being improved on, so it'll continue to get better. But it's a training model based on a lot of human facial animation, to know what to do when cheeks are raised, nostrils flare, eyebrows wrinkle, etc.
Night and day different to something like Live Link Face or other tools which purely track eye and mouth shapes and don't leverage any AI to then interpolate wrinkles and pseudo face muscles into the animation
@@AllanMcKay oh wow hahaha crazy stuff, being 1.0 its not bad. oh btw for iphone it can only output 30 even when recording 60 right? Thanks, i love knowing about the tech
I wonder how long a given take can be. I have to make some EDU products with this, and they might need to be on the long-ish side.
Isn't it necessary to have an iPhone with this? I don't have an iphone sorry
JS can that be done with DAZ characters too?
I don't have an iPhone. Will it work with my 2021 iPad pro ?
Imported my video from my iPhone and it won’t show up in capture manager. I’m lost. Lol.
gotta import entire folder
@@Jsfilmz Ah ha! Thanks!
On the metahuman video they use 4 calibration images, is there a reason why you did not use side views?
i have the helmet on bro hahahaha, they on a tripod, if i move my head the camera will move too
@@Jsfilmz You could make the calibration video with a tripod and use the helmet for the capture. I don’t know if it will improve the calibration, the metahuman seems a little off especially the nose.
Hi! It's an amazing video. I tried to import my recorded video from an iPhone 11 but in the Capture Manager the video can't be read. Any suggestions?
firewall?
Having a huge problem when I try to add the animation to my metahuman in the sequencer it becomes detached from the body from around the shoulders area. Was it because I may have moved too much in the capture? Can’t seem to get the body and the head attached
BOSS!
Hey! I just wanted to say thank you for your videos! If it wasn't for you I couldn't have created animation for my metahumans. ❤ Thank you
That is awesome!
Omg yes! My video will never make it if it wasn't for you! I even mentioned you in appreciation in the description ❤️
@@HQLNOH thanks man, not many give credit back to me, i appreciate it
Did everything to the t and my animation sequence doesn't show up in the Face animation menu. Any tips?
Do you have to record the videos with the live link app?
Because for every android user this would be a real pain in the *ss and make this tool completely unusable and useless for non apple-device owners.
I tried using a normal mp4 video but it said that there is no footage in the folder.
tell android to get it together, i have an android as my main phone
my head keeps detaching from the body when i attach the animation to the face and play it in sequencer
bro mine keep crashin when loading trackers. Any suggestions?
check hardware specs? not sure mine just takes a min
@@Jsfilmz I fixed it. Apparently if you create a duplicate project from an older version of unreal engine, it won't work. Thanks though.
Do you think there will be a marketplace to buy mesh to meta human data scans eventually? For example I don't know any korean girls to run this on with an iphone like that company did but i'd be willing to pay people who know attractive people from every race if they did
can i move the mocap data to another software, like blender?
Great vid!! In the Showcase ,didnt they show a way you could use this app to generate textures for your metahuman? Will you be showing us how as well?
can you send me that video? i dont think i saw that
@@Jsfilmz I think I may be mistaken. I thought the HellBlade II showcase did it ,but I think they may have had a premade metahuman.
@@HellMunky What's your workflow? What do you use to generate the mesh/textures that you import into UE to use in the plug-in?
Were you able to figure out how to connect the iPhone to wirelessly be triggered to record from Unreal?
just the regular livelink way?
@@Jsfilmz With the new MetaHuman Animator version of LiveLinkFace, can't seem to connect. Trying to match my body mocap with the face mocap.
cant do it live animator is offline
@@Jsfilmz I wonder if I can use OSC to sync up the recording process on the phone with the mocap
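[Editor's note: Live Link Face does expose an OSC control interface (it can be switched on in the app's settings), so this is plausible. A minimal stdlib-only Python sketch of triggering a take over UDP, assuming the `/RecordStart` address (slate name + take number) and the default listen port 8000 described in Epic's OSC docs — verify both in the app on your device:]

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def build_osc_message(address: str, *args) -> bytes:
    # Minimal OSC 1.0 encoder supporting string and int32 arguments
    msg = osc_pad(address.encode())
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)  # int32, big-endian
        else:
            tags += "s"
            payload += osc_pad(str(a).encode())
    return msg + osc_pad(tags.encode()) + payload

def send_record_start(phone_ip: str, slate: str, take: int, port: int = 8000) -> None:
    # Fire a single UDP datagram at the Live Link Face OSC listener
    packet = build_osc_message("/RecordStart", slate, take)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (phone_ip, port))
```

Calling `send_record_start("192.168.1.50", "SceneA", 3)` with your phone's IP would start a take; a matching `/RecordStop` message can be sent the same way, which lets an external mocap rig start both recordings on the same trigger.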
I have a problem where the character shows more bottom teeth than top. I dont speak like that and live link doesnt do it.
I tried re-tracking face markers but still had the same problem.
is there a way to export the head movement? it looks unnatural to have just the facial expression with a still head.
yes watch my tripod tutorial i uploaded today
@@Jsfilmz it looks like if you use a Metahuman that you already had before the latest update, it won't export the head movement, but if you download a Metahuman now, it will. I guess they have updated all metahumans in the Bridge catalog. I still don't know how to attach it to the body.
@@Jsfilmz I watched it but you are not showing how to attach the body, you just say "I will import a body animation"
Wow, I have the same issue. How to attach the body with proper movement...? I think there is a way to solve it with BP... I checked the Unreal forum but I can't find a proper way.
Hold up! Did it just create a Metahuman face with your facial proportions just based on the iPhone calibration!? that's crazy!
yes
@@Jsfilmz that's awesome! Glad you were able to share this! Awesome video
@@Impaczus43 it made a fatter version of me which is effed up lol
@@Jsfilmz I mean honestly it does have your facial features, but do u think you can adjust that manually???
3:52 it says iPhone 11 here
But my phone is also an 11, and over there on live link for metahuman animator I get "your device model is not supported, you can continue to use but results won't be that good"
yea ive had people here try it
@@Jsfilmz I just went back to the site and it seems that iPhone X and 11 are removed
wtfff on the docs?
@@Jsfilmz yes
Have u done any vids how to take that head with animation and apply to another metahuman ?
Epic is great but they gotta give us some options with the markers, like maybe don't make the ones that go on teeth straight up yellow / green lol? Great video though.
it´s here!!!!! downloading plugin!!!!!
is there a way to create the captures from a webcam?
Maybe in a year or 2 Epic will give us a way to import our videos for mocap, I’ve tried so many apps and don’t like the results, only one left to try is supposedly the best one, move ai.
yea its good man
Hello, awesome tutorial, but I have a very basic problem, capture manager does not recognize the files, literally the step you make at 1:20 of your video shows no video files for me. I recorded with Iphone 13 mini (supports LiveLink, also realtime works), and I am working on Windows PC. Any ideas what can be the problem?
firewall?
@@Jsfilmz Im not so sure about that, because the real-time connection works, it's just that imported video is not recognized at all. I must mention that I am using a Windows PC, so might there be a problem while transferring files? I just imported the whole zipped folder from OneDrive and unpacked it, but is there maybe another way?
The Performance audio track shows 'Unresolved Binding' though I can hear the audio. When I export the animation no sound is exported......the internet has failed me. Anyone? UE 5.3 and 5.4
I receive software release and your notifications at the same time, how is it even possible :D
i woke up 5 am today to make this video
I was trying to make it work all night and it keeps crashing after first stage of tracking😭 I can make it work only when I select like 150 frames. I bought laptop in january and its already not enough😭
yea its cpu intensive bro
@@Jsfilmz I hope upgrading ram to 64gb would help
gl man
When the teeth are facing you but the face isn't, must be metahuman, or a horrible childhood accident you didn't wanna talk about
Was about to say the same thing about your teeth JS 😂
I thought when you're calibrating you need a minimum of 3: the front, left and right, and that the teeth only need one in front?
turn ur head with helmet on
@@Jsfilmz ah I see your point, but wouldn't it be better to do the calibration without the helmet to get the three angles, then do the depth camera recordings with the helmet on, to make it more accurate for things like creases on the face when making facial expressions?
@@ComanderJTCar as far as i know u gotta calibrate the same way ur gonna record
@@Jsfilmz ok makes sense I just figured it would read your face better if it had an initial depth analysis from multiple angles was my thinking