Seriously guys, every time Epic comes up with outstanding new features for Unreal Engine, I find myself adopting a rocket-scientist mentality. The team at Epic Games rocks!
This is a huge game changer for indie devs and animators. Thanks, Epic! 🖤
Why is there no MetaHuman plugin?
Do you know if this can be exported to Blender?
@@AlexisRivera3D Most animation data and 3D model data can be exported from Unreal Engine as FBX. You'll have to find a tutorial, but it should be possible; if it isn't, that would only be because Epic deliberately set something up to prevent it, and considering Unreal is now part of so many pipeline tools, the chances of that are slim to none.
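If you'd rather script that export than click through the Content Browser, here's a minimal Editor-Python sketch using Unreal's stock AssetExportTask; the asset path and output file below are placeholders, not real project paths:

```python
import unreal

# Minimal sketch: export a baked animation sequence to FBX for Blender.
# The asset path and output file are placeholders for illustration.
task = unreal.AssetExportTask()
task.object = unreal.load_asset('/Game/MetaHumans/MyHero/FaceAnim')  # placeholder asset
task.filename = 'C:/Exports/FaceAnim.fbx'                            # placeholder file
task.automated = True            # suppress dialogs
task.replace_identical = True    # overwrite if the file already exists
task.prompt = False
task.options = unreal.FbxExportOption()  # default FBX settings

if not unreal.Exporter.run_asset_export_task(task):
    unreal.log_error('FBX export failed')
```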
Anyone know how to solve the issue where the body becomes detached from the animation? My head is animating, and it separates around the shoulder area.
@edentheascended4952 I exported the MetaHuman to Blender, but it has some issues; for example, the hair is not compatible with Blender, so I had to add new hair as hair cards.
The way it creates a 3D model of my face from the video capture...
Just tested it on iPhone 12...
Wow.
You guys nailed it! Gamechanger. Holy moly :D
Do we need an iPhone, or does any phone work?
Does your Unreal 5.2 not crash when using MetaHuman Identities and processing the neutral pose? It crashes for me every time.
@@EnterMyDreams Hello, can we use another phone, or do we have to use an iPhone?
@@EnterMyDreams No issues here so far. Sorry to hear you got crashes
@@buraksozugecer Only iPhone 12 or better
4:59 You must select a body type before "Mesh to MetaHuman", otherwise you get an error.
Do that by selecting "Body" in the outliner at the bottom left.
As well as a bunch of other steps that this tutorial missed, heh. I mean, thank god for pop-up boxes, but sheesh.
Always unreal what you guys bring to the table!! This is so awesome, can't wait to try it!!
Thanks a lot
oh yea it makes your games epic!
Simple and straightforward process demo, Raff. Thanks & Cheers
It would be awesome if you added webcam or Android camera support.
Ain't gonna happen, unfortunately. I gave up my hopes on that. A lot of people, especially new solo devs on a tight budget trying hard to finally start profiting from the struggles of learning and experimenting with all they've got, are looking forward to finally seeing that happen... and while they wait, other people remain one or two steps ahead. They say we can use a head-mounted camera, but why not simply have the option of using the smartphones we already have? Evidently they don't agree, or don't find that business-worthy, and would rather look at us as dinosaurs... ready for extinction!
I love Apple, don't get me wrong, but their monopoly when it comes to some stuff is hugely annoying...
Honestly kind of confusing. Is there some technical reason for the lack of support outside iPhone? What self-respecting developer owns an iPhone anyway? It's a difficult-to-modify, locked-down system. There is a reason for the meme of graphic designers/artists using Apple products: they aren't generally the technically minded ones, and I say that as an artist.
@@CarbonI4 It requires depth data. Hence you have two options: stereo capture, or an iPhone (which uses its TrueDepth sensor).
Now the only thing we need in MetaHuman is proper age. It's difficult to make very young and very old physiques. Great work, thanks guys!
Yeah, the best thing I've found is manually scaling down bones. Guess they don't want any freaky children stuff in there, because I'm sure people would go that far for shock value...
Epic is just amazing
... Love how they open-source every advance they achieve ❤❤
Thanks a lot
Hot damn. I use Unity at my job, and Unreal at home. And switching from one to another feels like I'm traveling 10 years into the future. Completely ruined any chance for Unity to catch up in terms of graphical fidelity. And it's just as easy to use, if not more so in some instances. Insane.
I am speechless.. This will shake the industry! A HUGE thank you to EPIC for this.
I'm reminded of something the Siren devs did years ago; thinking of that old technique for creating a realistic face, compared to now, is so surreal.
Yeah, it'll really help indies; they could use Unreal for faces and probably still import to other places if need be.
Thanks, this will be a great help for my Unreal Engine music clips and trailers
Hihi. I've been waiting for this plugin since you announced it, and I've prepared 2 minutes of video clips for making a short film. So you published it at exactly the right point, and I want to thank you for making it possible for me to do this stuff for free. :)) Thank you very much.
I'm sure it's possible with a simple camera, so why require an iPhone? There are phones with two cameras that let you capture depth and volumes, and it's also enough to put dots on your face. I hope you add that in the very near future; if not, it sucks.
It's a real GAME CHANGER... Epic is gaming in the real world. Thanks to Epic Games and the Epic developers!!!!
Thank you so much!!! This has given indie devs a way to compete with AAA titles!
Really, really interesting. Great video making everything as simple as possible I'm sure, while still having enough detail. Lots of learning and experimenting to do, but will try to use this feature one way or another.
Thanks a lot
So finally it’s out? Nice! Thank you!
MetaHuman is an excellent way to spend three weeks just trying to make a floating talking head actually attach to an animated body correctly. Even when they're literally both MetaHuman, as of 5.3 there is still no official method.
Amazing tech, but not a good tutorial. I wish you had started by saying you'll need your initial Live Link Face capture to have a Front, Left, Right, and Teeth pose. You don't realize these prerequisites until halfway through the video. Also, what's the best way to get the data from your iPhone to your PC?
Amazing! The head detaches from the body when imported into the sequencer though.
Why is that?
I am also looking for a solution
Finally! Crazy work is coming! 😍
You just unlocked so many doors for me, you have no idea.
This tutorial is fantastic! I'm hoping that there's a follow-up about batch processing performances via the included python scripts. I have a project with around 300 takes, but I haven't been able to figure out the batch processing workflow yet!
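Until Epic documents those scripts, here's a rough idea of what a batch loop can look like in Editor Python. Everything MetaHuman-specific below (the 'identity' property name and the start_pipeline call) is an assumption from memory; check the Python scripts bundled with the MetaHuman plugin for the real entry points:

```python
import unreal

# Rough sketch of a batch loop over pre-created Performance assets.
# The 'identity' property and start_pipeline() are ASSUMED names; verify
# against the scripts that ship with the MetaHuman plugin.
IDENTITY_PATH = '/Game/MetaHumans/MyIdentity'  # placeholder path
PERF_FOLDER = '/Game/Performances'             # placeholder folder of takes

identity = unreal.load_asset(IDENTITY_PATH)

for asset_path in unreal.EditorAssetLibrary.list_assets(PERF_FOLDER, recursive=True):
    perf = unreal.load_asset(asset_path)
    perf.set_editor_property('identity', identity)  # assumed property name
    if hasattr(perf, 'start_pipeline'):             # assumed processing entry point
        perf.start_pipeline()
        unreal.log('Processed {}'.format(asset_path))
```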
At Shrapnel, we are thrilled to be leaning into MetaHuman and the flexibility it provides through facial blends, diversity, and age, plus the fidelity it gives you at the click of a button. This is a game changer, and we are excited to see what it gives us. - Jay, Shrapnel's Art Director 😎
Thank you very much for making this tutorial.
Question: does this require an iPhone (aside from the stereo camera option, of course), or can the footage be from any camera at this point? Just curious; I wasn't sure if it was using the iPhone LiDAR data or something like that.
I second that question.
@@lexastron I did find out that it uses the depth data from the iPhone camera, and that's why they use it. There are ways to use Android or a webcam; they're just not as accurate or articulate, unfortunately. However, you can get close and then hand-animate the rest to bring it a lot closer to the performance. It'll take a little more work, but I guess it's doable; it would just be a lot easier with an iPhone.
@@legacylee Got it. Thank you 🙏
@@lexastron np. I hope they start adding similar tech to Android phones; I'm not a big iPhone fan, but I gotta admit iPhones do have the tech we need lol
This opens a new world for people like me 🤩
Nice video! I wish I knew where I went wrong. My MetaHuman BP does not move with the animation...
Are there any guides coming for stereo capture? I made a capture source, selected Stereo Archive, and pointed it at my directory of stereo HMC footage. Went to the Capture Manager, selected the source and... nothing. No videos to select. What formats are supported? I tried mp4 and mov (does the codec matter?). I assume I'm missing some really simple step, but I can't find it.
I must be doing something wrong: when applying the exported face animation to my MetaHuman, it works, but the head is detached from the body…
GREAT TUTORIAL! Thanks Epic Games ❤
Can't wait to try it! Tomorrow will be a busy daayyyyy!
3:29 What's the iPhone app we should download to record videos to import into the engine? Could you show us the process with the iPhone (a real step-by-step)?
Live Link Face
@@saraiev_ oops! sorry!
A lot of people are new to UE because of MetaHuman. But for a beginner, this tutorial is wayyy too fast, and most clicks are not explained. Just some feedback for future work 🙂 Keep doing awesome stuff
Is this possible with Android? None of us have iPhones.
When Android?
When Android?
Never 🤑
Hardware limitations mean Android can't.
@@LauranceWake People already made Live Link Face analogs for Android. Also, Epic Games announced Live Link for Android quite a while ago.
When Android phones build in a TrueDepth camera.
Fantastic I was waiting for this.
In 5.3.2, it looks like once I add the teeth pose, set the frame, and click "Fit Teeth", the B view shows two head meshes overlaid and you don't see the teeth result. Then Prepare for Performance fails. ??? REALLY frustrating.
Please add support to iPad Pro 2022.
I used it with Live Link and it worked perfectly. Now I just updated the app, and when I open it, it shows two methods:
-The old Live Link
-The new Metahuman Animator, but it says in red color that my device is not supported.
How is an iPad Pro 11" 2022 (4th gen, M2 chip) not capable, when an iPhone 12 is?
I am wondering the same thing. Hopefully they will update the plugin to work with the iPad.
It is listed on the blog that you need to use an iPhone 12 or higher.
It is working; I tried it on the iPad Pro 2022 and it's fine.
@@3Dave666 Really? No inconsistencies? It gave me a warning that it can behave incorrectly, so I didn't try it.
@@3dart_es409 I tried with a 10-second video and it looks like it's working perfectly; it uses two captures (video and depth), and the iPad can do that.
What about Android people?
Maybe because Android manufacturers don't want to add the depth-camera components that iPhones have, and focus on other features instead.
Hey! I've managed to capture a performance but once I add it as a face animation to the Metahuman Blueprint, the head stays detached from the body. How can I merge them together and work on my body animation separate from the facial / head rotation?
Very helpful, thanks!!! Can't wait to try it out :)
When will it be available for other devices, like DSLRs, HD webcams, Android phones, etc.? (head-mounted cameras)
Thank you for your very clear tutorial.
I was wondering if it would be possible to batch-process 50 performances at once with the same MetaHuman Identity.
Is it possible, or do I need to set them up manually each time?
Metahuman is finally on Unreal Engine 5!
Tried again with 5.4 and it still fails. There was a new Identity start screen about system specs; how about more info on that? It just fails with an error message and no help as to why.
So I get it, you guys don't like Android, but what about other 3D-camera solutions like Intel RealSense or similar? Anyway, amazing piece of software, congrats.
New Robocop game needs this badly.
Cannot wait to start using this. Thanks. :)
Are these results much better than the prior LiveLink setup? I am excited to test out myself, but it does look like a more complex workflow than the offline CSV method that currently exists.
Excited to test out.
I keep crashing on the Prepare for Performance step; has anyone found a fix?
When I bake the Face animation to the Control Rig like in the video, the animation freezes. Anyone have an idea what to do?
Hi, what's the workaround for when a file can't be imported from the queue due to the audio file being too big? Thanks.
Unreal folks: Any recommendations for HMC rigs that work with Animator?
I'm running into issues with connecting head to body. Could there be a conflict between the two animations?
Great tutorial!
Also… P is for Plosive 🤓
WOW! Now I understand why actors and actresses worry about AI and Digital engines.
Excellent, can't wait to try it
Hi, thank you for this. Is there written step-by-step documentation for these processes? I feel parts of the video tutorial take things for granted, and a noob like myself has a hard time following along.
Oh, and BTW, something I do not understand: how come the Live Link app is available only for iPhone, but the MetaHuman plugin is not available for UE on macOS? Why is that?
Can the video be saved locally from the iPhone and then sent to the cloud for my colleagues overseas?
Sure! You send the zip file.
@@oimob3D great
We need Android support 😢
Tell Android manufacturers to start including front-facing depth sensors.
Is it necessary to have an iPhone, or can I just upload a video?
Coming!!! Change the World!
👏🏿👏🏿
Thank you guys
At 7:05, after Mesh to MetaHuman returns, the blendshapes all go to vertex count 0. Is there a way to preserve these after the solve? Standard Bridge MetaHuman downloads have both an embedded DNA and morph targets.
I only see MetaHuman Identity and Capture Data (Mesh) as an option. The menu itself is called MetaHuman instead of MetaHuman Animator. What did I do wrong?
Can I use video captured on any Android phone?
How do I fix the floating head?
Help!
I get an error when importing iPhone video. Either it says the audio clip could not be imported, or it could not verify the ingested file. In both instances I end up without a Capture Data asset, so I cannot move forward.
Help plz: at 1:56 I find my iPhone there, but it doesn't show any of my captures. Even when I press the start-capturing button it shows that it is capturing, but when I stop, it doesn't show anything.
So awesome! But I have a major issue with Bridge.
I created a MetaHuman with Creator (for 5.2), but it is not showing up in Bridge no matter what I do. Do you have a solution for that, please?
You need to log in multiple times if it doesn't show up, on both the Epic side and the Quixel side. It's a mess, but it works.
@@piorism I got it to work; I was just being very dumb. I had to update Bridge via the Epic Games launcher. After that, the MetaHuman from Creator showed up in my Quixel Bridge.
Thanks for your reply, man 😄
@@3rdDim3nsn3D Awesome :) Good luck. And yeah as said, if they don't show up at some point in the future, it'll likely be because of a login timeout.
@@piorism Alright, thx for the hint 👍😃
So I'm stuck at the part that is conveniently skipped in the tutorial. I have pulled the take files from my iPhone, placed them in a folder, and set that folder as the target for my capture source. But I don't see anything in the Capture Manager. :(
Do I have to do a new identity for every project, or can I use the same identity with side views and just add a new performance?
Thank you! Thank you! Thank you!
Why isn't the Metahuman Plugin available in my plugins? I am using the newest version (5.2.1)
Is there a way to export the head movement? It looks unnatural to have just the facial expression with a still head.
I'm curious about decoding with audio: can I add a separate audio track instead of using the one from the recording?
Is there a way to queue processes for MetaHuman Performance under MetaHuman Animator, and, while we're at it, for Export Animation too? This would free up the time spent waiting for each process to finish before moving on.
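There's no built-in queue that I know of, but the plugin's Python bindings can loop the export. A sketch below; the MetaHumanPerformanceExport* names are recalled from the plugin's bindings and should be treated as assumptions to verify for your engine version:

```python
import unreal

# Sketch of a sequential export queue. The MetaHumanPerformanceExport* names
# are ASSUMED from memory of the plugin's Python bindings -- verify them for
# your engine version before relying on this.
PERF_PATHS = [
    '/Game/Performances/Take01',  # placeholder asset paths
    '/Game/Performances/Take02',
]

for path in PERF_PATHS:
    perf = unreal.load_asset(path)
    settings = unreal.MetaHumanPerformanceExportAnimationSettings()  # assumed class
    settings.show_export_dialog = False  # assumed property: skip per-take dialogs
    unreal.MetaHumanPerformanceExportUtils.export_animation_sequence(perf, settings)  # assumed
    unreal.log('Exported animation for {}'.format(path))
```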
Why can't we use high-quality footage from, for example, high-end mirrorless cameras like the Sony A7 IV?
Boom! Epic Games just dropped the ultimate mic-drop moment!
Make a video with the head moving; I can't attach the head to the body when the neck is moving.
yeah I want to know this too
When I press Prepare for Performance, it says it needs 64 GB of RAM... it makes sense, I guess.
Well done, thanks.
Hi! I have a problem when I activate the plugin and open the project. It gives me this error: "The 'MetaHuman' plugin failed to load because the 'MetaHumanMeshTracker' module could not be loaded. This may be due to an operating system error or the module might not be properly set up." Then it closes the project. Any idea what the problem could be?
Thanks!
You need at least 64 GB of RAM to process the animation at the end. Will this drop to at least 32?
I hit "Prepare for Performance" button and received the 64gb ram warning, it's training on 32gb right now with a GTX1080, and my PC is burning!
My footage isn't showing up in the capture source. I'm using the Live Archive as you recommended, but nothing shows up. Can anyone advise?
Hi, I want to take the MetaHuman with the animation and bring it into Blender to add a couple of details and do the render there, but every time I try, the animation no longer works. What would be the proper way to achieve that?
Will any HD camera do?
Android? Streaming webcam?
You guys are suspiciously quiet in answering this question.
You would need 3D capture data. iPhones come with depth cameras, which is why Epic mentions them. You can get a standalone depth camera pretty cheap from Intel (cheaper than having to get a phone you don't want). I think Intel's flagship depth camera is called RealSense and they start at about $100.
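Worth noting that MetaHuman Animator itself only ingests iPhone/iPad or stereo-HMC footage right now, so a RealSense won't plug straight in; but grabbing synchronized color and depth from one is easy with the official pyrealsense2 package. A minimal capture sketch:

```python
import pyrealsense2 as rs

# Minimal RealSense capture: stream synchronized depth + color frames.
# Note: MetaHuman Animator does not ingest this directly today; the data
# would still need conversion into a supported capture format.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    for _ in range(300):  # ~10 seconds at 30 fps
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        color = frames.get_color_frame()
        if depth and color:
            # depth.get_distance(x, y) returns the range in meters at a pixel
            pass
finally:
    pipeline.stop()
```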
How about this for Android as well, not just garbage Apple products?
At 7:48 (Process) I have a problem... please help me. UE 5.2.1 gives an error and crashes...
Any luck solving the problem?
What cable do you use to connect the iPhone to the PC? I know it can work over Wi-Fi, but for speed a cable connection is used. What's the cable?
I really wish there was a version of this for people who aren't code heads. I'd love to do cool stuff with Metahumans, but Unreal Engine has to be the least intuitive, frustrating software I've ever attempted to learn. I've worked with Adobe Photoshop and Illustrator professionally since the 90s, and InDesign since it came out in the early 2000s, and I've also managed to learn tons of other software, but I guess I'm dumber than I thought. 🤷
I don't think so. I was able to wrap my head around Photoshop and Blender, but Unreal is very complex and impossible to learn.
I found it pretty easy; in like 3 hours I had a character that looks like me running around in third person, which I found pretty cool. The hardest part of learning, I find, is the Blueprints, but they're still much easier than C++, and you don't need much knowledge to create a character.
@@elitegold4805 I do love the Blueprints. They're extremely easy to use and it's cool what sort of functions you can come up with without pages of code.
Have you tried Unity? I know this seems odd to suggest on an Unreal video. But I did find that Unity is far more intuitive for beginners to make certain kind of projects. I find Unreal a lot more scalable for making complex 3D functions and interactions.
@@elitegold4805 so yeah, you kinda just proved my point. You know C++. The closest thing to a programming language I've ever used is HTML. 😆
And yes, using the MetaHuman Creator interface online is super easy and fun, and I think it works great. I've gone bananas making MetaHumans, I have over 50 of them at this point. But it's when it gets into Unreal Engine that the wheels come off for me. Unreal Engine was designed for people like you, coders, not people like me, graphic designers. So for me the learning curve is quite steep. I've never used node-based interfaces before, for instance. I'm having to learn all this new terminology, because I haven't done much in 3D before, so 'normals' and 'UV maps' and things like that are things I've had to learn about. You already have a lot of that fundamental knowledge, so for you it's more about just learning how Unreal Engine does things you want to do.
I kinda just wish there was a version of Unreal Engine that was designed for people like me, is what I'm saying. Nothing about Unreal Engine seems obvious or intuitive to me, and it relies on already knowing a lot of stuff I don't know yet. And I don't have any interest in making games, I only want to use Unreal to make cinematics and still images, so all of the game stuff just gets in the way for me. And yes I know _this is a game engine,_ so I have to learn about all that to use the software effectively. And I just...don't enjoy that aspect, I guess. Plus I'm Gen X, and I feel like I've been learning new software my whole life, so every time I have to start from literal scratch part of me is like 'not this again!' 😩
Oh, and I'm on a Mac. So. Yeah. 😔
I am glad, however, that the Unreal Engine interface doesn't try to hold your hand too much, because that's more annoying than not knowing stuff. I hate it when an interface peppers you with pop-up notifications while you're trying to concentrate on a task. Adobe has gotten really bad about that in recent years, and I get that those programs are probably just as dense for total beginners as Unreal Engine is for me, but there is a better way to teach people software than Clippy On Steroids. 😁
How do I combine body and head? My head keeps detaching from the body when I attach the animation to the face and play it in Sequencer. I tried baking to the control rig, but no change...
I am looking for this fix too; if I find it, I'll comment here. If you find it, answer me :)
@@elvismorellidigitalvisuala6211 I have found a very rough fix. It can only work if you have no body animation; I still don't know how to add body animation on top of this fix, but here it is:
Before doing this, just make sure you bake the animation to the control rig (just in case). Then right-click the face, go up to Rebind Component, and click on Body.
What this does is attach the body to the face/neck, so the whole body moves with the neck. It looks weird if you see it full-body, but if the camera is just in portrait, it kind of works. I haven't figured out a better fix for it yet.
@@elvismorellidigitalvisuala6211 That fix isn't a good one, but I hope other creators will figure out how to fix this. Or do we need a body animation for this, or a head-mounted camera to nullify any below-the-neck movements, you know... 😕
Can I animate the faces of characters created in Character Creator with Metahuman Animator?
Thank you!
Nice plugin; it would be nice if it wasn't just for iPhone though. What about us Android users?
We Android users have been left by the wayside; they are focusing on iOS because of the hardware features.
Very exciting!