How to Use MetaHuman Animator in Unreal Engine

  • Published: 21 Jan 2025

Comments • 324

  • @coder5382 · a year ago · +115

    Seriously guys, every time Epic comes up with new outstanding features in Unreal Engine, I have to adopt a rocket-scientist mentality. The team at Epic Games rocks!

  • @patrikbaboumian · a year ago · +125

    This is a huge game changer for indie devs and animators. Thanks, Epic! 🖤

    • @danielyeh1627 · a year ago · +6

      Why is there no MetaHuman plugin?

    • @AlexisRivera3D · a year ago · +2

      Do you know if this can be exported to Blender?

    • @edentheascended4952 · a year ago · +2

      @AlexisRivera3D Most animation data and 3D model data can be exported from Unreal Engine as FBX. You'll have to find a tutorial, but it should be possible; if not, it would be because Epic deliberately set something up to prevent it. Considering that Unreal has recently become part of many pipeline tools, the chances of that are slim to none.

    • @naytbreeze · a year ago

      Anyone know how to solve the issue where the body becomes detached from the animation? My head is animating, and it separates from the body around the shoulder area.

    • @AlexisRivera3D · a year ago · +1

      @edentheascended4952 I exported the MetaHuman to Blender, but it has some issues; for example, the hair is not compatible with Blender, so I had to add new hair as hair cards.

  • @Raztal · a year ago · +14

    The way it creates a 3D model of my face from the video capture...
    Just tested it on an iPhone 12...
    Wow.
    You guys nailed it! Game changer. Holy moly :D

    • @buraksozugecer · a year ago · +2

      Do we need an iPhone, or does any phone work?

    • @EnterMyDreams · a year ago · +4

      Does your Unreal 5.2 not crash when using MetaHuman Identities and processing the neutral pose? It crashes every time for me.

    • @buraksozugecer · a year ago

      @EnterMyDreams Hello, can we use another phone, or do we have to use an iPhone?

    • @Raztal · a year ago · +3

      @EnterMyDreams No issues here so far. Sorry to hear you're getting crashes.

    • @Raztal · a year ago · +3

      @buraksozugecer Only iPhone 12 or newer.

  • @HoudiniEnjoyer · a year ago · +13

    Always unreal what you guys bring to the table!! This is so awesome, can't wait to try it!!

    • @stephanedavi · 5 months ago

      Thanks a lot

    • @andreww4751 · 4 months ago

      Oh yeah, it makes your games epic!

  • @sheraixy · a year ago · +28

    4:59 You must select a body type before "Mesh to MetaHuman", otherwise you get an error.
    Do that by selecting "Body" in the outliner at the lower left.

    • @TheWillvoss · a year ago · +1

      As well as a bunch of other steps that this tutorial missed, heh. I mean, thank god for pop-up boxes, but sheesh.

  • @qaesarx · a year ago · +1

    Now the only thing we need in MetaHuman is proper age. It's difficult to make very young and very old physiques. Great work, thanks guys!

    • @williamtolliver749 · a year ago

      Yeah, the best thing I've found is manually scaling down bones. I guess they don't want freaky children stuff in there, because I'm sure people would go that far for shock value...

  • @nximusic · a year ago · +4

    Thanks, this will be a great help for my Unreal Engine music clips and trailers.

  • @behrampatel4872 · a year ago · +13

    Simple and straightforward process demo, Raff. Thanks & cheers.

  • @hamzaahmad9735 · a year ago · +4

    Epic is just amazing
    ... Love how they open-source every advance they make ❤❤

  • @warrenzechnas · a year ago · +3

    I am speechless... This will shake the industry! A HUGE thank you to Epic for this.

  • @marcusclark1339 · a year ago · +1

    I'm reminded of something the Siren devs did years ago; thinking of that old technique for creating a realistic face compared to now is surreal.
    Yeah, it'll really help indies: they could use Unreal for faces and probably still export to other tools if need be.

  • @NickHeinOfficialPage · a year ago

    When I bake the face animation to the Control Rig like in the video, the animation freezes. Anyone have an idea what to do?

  • @entropicvibe · a year ago · +19

    It would be awesome if you added webcam or Android camera support.

    • @davidedemurodominijanni9889 · a year ago · +6

      Ain't gonna happen, unfortunately. I've given up my hopes on that. A lot of people, especially new solo devs on a tight budget trying hard to finally start profiting from the struggle of learning and experimenting with everything they've got, are looking forward to seeing that finally happen... and while they wait, other people stay one or two steps ahead. They say we can use a head-mounted camera, but why not simply let us use the smartphones we already have? Evidently they don't agree, or don't find it business-worthy, and instead look at us as dinosaurs... ready for extinction!
      I love Apple, don't get me wrong, but their monopoly on some of this stuff is hugely annoying...

    • @CarbonI4 · a year ago · +8

      Honestly kind of confusing; is there some technical reason for the lack of support outside the iPhone? What self-respecting developer owns an iPhone anyway, a difficult-to-modify, locked-down system? There's a reason for the meme of graphic designers/artists using Apple products: they aren't generally the technically minded ones, and I say that as an artist.

    • @ed1726 · a year ago · +2

      @CarbonI4 It requires depth data, so you have two options: stereo capture or an iPhone (which uses its depth sensor).

  • @iamthenightshift · a year ago · +5

    Hot damn. I use Unity at my job and Unreal at home, and switching from one to the other feels like traveling 10 years into the future. This completely ruins any chance for Unity to catch up in graphical fidelity. And it's just as easy to use, if not more so in some instances. Insane.

  • @antoni-yt · a year ago · +1

    Hihi. I've been waiting for this plugin since you announced it, and I've prepared 2 minutes of video clips for making a short film. So you published it at exactly the right time, and I want to thank you for making it possible for me to do this stuff for free. :)) Thank you very much.

  • @Amelia_PC · a year ago · +5

    3:29 What's the iPhone app we should download to record videos to import into the engine? Could you show us the process with the iPhone (a real step-by-step)?

    • @SARAIEX · a year ago · +1

      Live Link Face

    • @Amelia_PC · a year ago · +2

      @SARAIEX Oops! Sorry!

  • @yasmeensalawy9601 · 9 months ago · +1

    Help please. 1:56 I find my iPhone there, but it doesn't show any of my captures. Even when I press the Start Capturing button it shows that it is capturing, but when I stop, it doesn't show anything.

  • @AzizSalokhiddinov · a year ago · +2

    At 7:48 (Process) I have a problem. Please help me? UE 5.2.1 gives an error and crashes.

    • @ruslan_3ddd · a year ago

      Any luck solving the problem?

  • @hyperspace-films · a year ago

    Thank you so much!!! This gives indie devs a way to compete with AAA titles!

  • @muviing5427 · a year ago · +2

    It's really, really a GAME CHANGER... Epic is gaming in the real world. Thanks to Epic Games and the Epic developers!!!!

  • @tds_FrankOfficiel · a year ago · +10

    I'm sure it's possible with a simple camera, so why the need for an iPhone? There are phones with two cameras, which allow you to capture depth and volume; it would also be enough to put dots on your face. I hope you add that in the very near future; if not, it sucks.

  • @PatrickCharpenet · a year ago · +3

    I must be doing something wrong: when applying the exported face animation to my MetaHuman, it works, but the head is detached from the body…

  • @MarisFreimanis · a year ago

    So it's finally out? Nice! Thank you!

  • @EthanCasner · 9 months ago · +1

    MetaHuman is an excellent way to spend three weeks just trying to make a floating talking head actually attach to an animated body correctly. Even when they're literally both MetaHumans, as of 5.3 there is still no official method.

  • @3dart_es409 · a year ago · +5

    Please add support for the iPad Pro 2022.
    I used it with Live Link and it worked perfectly. Now I just updated the app, and when I open it, it shows two methods:
    - The old Live Link
    - The new MetaHuman Animator, but it says in red that my device is not supported.
    How is an 11" iPad Pro 2022 (4th gen, M2 chip) not capable when an iPhone 12 is?

    • @CGeneralist92 · a year ago · +1

      I am wondering the same thing. Hopefully they will update the plugin to work with the iPad.

    • @AlexiosLair · a year ago

      The blog says you need to use an iPhone 12 or higher.

    • @3Dave666 · a year ago · +1

      It's working; I tried it on the iPad Pro 2022 and it's fine.

    • @3dart_es409 · a year ago

      @3Dave666 Really? No inconsistencies? It gave me a warning that it could behave incorrectly, so I didn't try it.

    • @3Dave666 · a year ago · +1

      @3dart_es409 I tried with a 10-second video and it looks like it works perfectly. It uses two captures (video and depth), and the iPad can do that.

  • @_rider_1063 · a year ago · +61

    When Android?

    • @Rokinso · a year ago · +2

      When Android?

    • @MakotoIchinose · a year ago · +13

      Never 🤑

    • @LauranceWake · a year ago · +10

      Hardware limitations mean Android can't.

    • @_rider_1063 · a year ago · +13

      @LauranceWake People have already made Live Link Face analogs for Android. Also, Epic Games announced Live Link for Android quite a while ago.

    • @sheraixy · a year ago · +7

      When Android builds in a TrueDepth camera.

  • @OfficialCloverPie · a year ago · +1

    How do I combine the body and head? My head keeps detaching from the body when I attach the animation to the face and play it in Sequencer. I tried baking to the Control Rig, but no change.

    • @elvismorellidigitalvisuala6211 · a year ago

      I am looking for this fix too. If I find it, I'll comment here; if you find it, answer me :)

    • @OfficialCloverPie · a year ago

      @elvismorellidigitalvisuala6211 I have found a very rough fix. It only works if you have no body animation; I still don't know how to add body animation on top of it, but here it is:
      Before doing this, make sure you bake the animation to the Control Rig (just in case). Then right-click the face, go up to Rebind Component, and click on Body.
      This attaches the body to the face/neck, so the whole body moves with the neck. It looks weird if you see the full body, but if the camera is just in portrait, it kind of works. I haven't figured out a better fix for now.

    • @OfficialCloverPie · a year ago · +1

      @elvismorellidigitalvisuala6211 That fix isn't a good one, but I hope other creators figure out how to fix this, or whether we need a body animation or a head-mounted camera to cancel out any movement below the neck, you know... 😕

  • @oldmatttv · a year ago · +3

    Really, really interesting. Great video, making everything as simple as possible, I'm sure, while still having enough detail. Lots of learning and experimenting to do, but I will try to use this feature one way or another.

  • @stylezmorales · a year ago · +6

    Question: does this require an iPhone (aside from the stereo camera, of course), or can the footage come from any camera at this point? Just curious; I wasn't sure if it was using the iPhone's LiDAR data or something like that.

    • @lexastron · a year ago · +1

      I second that question.

    • @stylezmorales · a year ago · +3

      @lexastron I found out that it does use the depth data from the iPhone camera, and that's why they use it. There are ways to use Android or a webcam; it's just not as accurate or articulate, unfortunately. However, you can get close and then hand-animate the rest to bring it much closer to the performance. It'll take a little more work, but it's doable; it would just be a lot easier with an iPhone.

    • @lexastron · a year ago · +1

      @stylezmorales Got it. Thank you 🙏

    • @stylezmorales · a year ago · +2

      @lexastron No problem. I hope they start adding similar tech to Android; I'm not a big iPhone fan, but I've got to admit iPhones do have the tech we need, lol.

  • @grmnt.studio · a year ago · +7

    Hey! I've managed to capture a performance, but once I add it as a face animation to the MetaHuman Blueprint, the head stays detached from the body. How can I merge them together and work on my body animation separately from the facial/head rotation?

  • @Riptwo · 11 months ago

    This tutorial is fantastic! I'm hoping there's a follow-up about batch-processing performances via the included Python scripts. I have a project with around 300 takes, but I haven't been able to figure out the batch-processing workflow yet!

  • @brothercuber3288 · a year ago · +2

    Is this possible with Android? None of us have iPhones.

  • @Zeno617 · a year ago · +2

    I keep crashing on the Prepare for Performance step. Has anyone found a fix?

  • @Kozlov_Production · a year ago · +1

    How do you fix the floating head?

  • @RichardBoisvert · a year ago · +3

    Amazing tech, but not a good tutorial. I wish you had started by saying you'll need your initial Live Link Face capture to have front, left, right, and teeth poses. You don't realize these prerequisites until halfway through the video. Also, what's the best way to get the data from your iPhone to your PC?

  • @ed1726 · a year ago · +3

    Are there any guides coming for stereo capture? I made a capture source, selected Stereo Archive, pointed it at my directory with stereo HMC footage, went to the Capture Manager, selected the source, and... nothing. No videos to select. What formats are supported? I tried MP4 and MOV (does the codec matter)? I assume I am missing some really simple step, but I can't find it.

  • @bingzhou7970 · a year ago

    Finally! Crazy work is coming! 😍

  • @TheNovaProspect · a year ago · +2

    You just unlocked so many doors for me, you have no idea.

  • @madedigital · a year ago · +1

    You need at least 64 GB of RAM to process the animation at the end. Will this requirement ever drop to at least 32?

    • @oimob3D · a year ago

      I hit the "Prepare for Performance" button and received the 64 GB RAM warning. It's training on 32 GB right now with a GTX 1080, and my PC is burning!

  • @wywarren · 5 months ago

    At 7:05, after Mesh to MetaHuman returns, the blendshapes all go to vertex count 0. Is there a way to preserve these after the solve? Standard Bridge MetaHuman downloads have both an embedded DNA and morph targets.

  • @the-secrettutorials · a year ago

    This opens a new world for people like me 🤩

  • @parralox · a year ago

    Great tutorial!
    Also… P is for plosive 🤓

  • @FPChris · 9 months ago

    In 5.3.2, it looks like once I add the teeth pose, set the frame, and click "Fit Teeth", the B view looks like two head meshes overlaid, and you don't see the teeth result. Then Prepare for Performance fails. ??? REALLY frustrating.

  • @asntech4014 · a year ago · +1

    Can the video be saved locally on the iPhone and then sent to the cloud for my colleagues overseas?

    • @oimob3D · a year ago

      Sure! You send the zip file.

    • @asntech4014 · a year ago

      @oimob3D Great

  • @TheNSproject · a year ago · +1

    Nice video! I wish I knew where I'm going wrong. My MetaHuman Blueprint does not move with the animation...

  • @Leomerya12 · a year ago · +2

    Will any HD camera do?
    Android? Streaming webcam?
    You guys are suspiciously quiet in answering this question.

    • @myth0genesis · a year ago

      You would need 3D capture data. iPhones come with depth cameras, which is why Epic mentions them. You can get a standalone depth camera pretty cheap from Intel (cheaper than having to buy a phone you don't want). I think Intel's flagship depth camera line is called RealSense, and they start at about $100.

  • @MR3DDev · a year ago

    Fantastic, I was waiting for this.

  • @Domm_Dynamite · a year ago · +1

    So I'm stuck at the part that is conveniently skipped in the tutorial. I have pulled the take files from my iPhone, placed them in a folder, and set that folder as the target for my capture source, but I don't see anything in the Capture Manager. :(

  • @AzizSalokhiddinov · a year ago

    @UnrealEngine At 7:48 (Process) I have a problem. Please help me? UE 5.2.1 gives an error and crashes.

    • @veith3dclub · a year ago

      Me too... were you able to fix it?

  • @Enchantaire · a year ago · +3

    Amazing! The head detaches from the body when imported into Sequencer, though.
    Why is that?

    • @martem000 · a year ago

      I am also looking for a solution.

    • @보리타작-x9s · 26 days ago

      You should delete each control rig in Sequencer.

  • @IvanHalimau · a year ago · +5

    GREAT TUTORIAL! Thanks Epic Games ❤

  • @FPChris · 9 months ago

    Tried again with 5.4 and it still fails. There was a new Identity start screen about system specs; how about more info on that? It just fails with an error message and no help as to why.

  • @RavenWT · a year ago

    Can't wait to try it! Tomorrow will be a busy daayyyyy!

  • @sahinerdem5496 · a year ago · +1

    Can I use video captured on any Android phone?

  • @RikkTheGaijin · a year ago · +1

    Is there a way to export the head movement? It looks unnatural to have just the facial expression with a still head.

  • @AlexiosLair · a year ago · +1

    I'm curious about decoding with audio: can I add a separate audio track instead of using the one from the recording?

  • @canndask · a year ago · +1

    Thank you very much for making this tutorial.

  • @lievenvanhove · a year ago · +3

    Thank you for your very clear tutorial.
    I was wondering: would it be possible to batch-process 50 performances at once with the same MetaHuman Identity?
    Is that possible, or do I need to set them up manually each time?

  • @castilho-art · a year ago · +1

    Please make a video with the head moving; I can't attach the head to the body when the neck is moving.

  • @BadlerStudios · a year ago · +3

    Are these results much better than the prior Live Link setup? I am excited to test it myself, but it does look like a more complex workflow than the offline CSV method that currently exists.
    Excited to test it out.

  • @보리타작-x9s · 26 days ago

    1:55 There are no files to see in my project... any advice?

    • @보리타작-x9s · 23 days ago

      I've solved the problem. Only data captured with MetaHuman Animator can be used with this tutorial, not Live Link data from the app.

  • @윤예진-z1v · a year ago

    02:19 I tried many times, but I cannot import the takes into the CaptureData folder... How can I solve this?

    • @윤예진-z1v · a year ago

      This is the error message: "LogCaptureManager: Ingest for take MySlate_4_iPhone failed: 'Failed to validate collected file.'"

  • @jasoncorganbrown · 5 months ago

    I'm running into issues connecting the head to the body. Could there be a conflict between the two animations?

  • @DRLarkin · a year ago · +2

    Unreal folks: any recommendations for HMC rigs that work with Animator?

  • @MarcusGrip-o1m · a year ago

    Help!
    I get an error when importing iPhone video. Either it says the audio clip could not be imported, or it could not verify the ingested file. In both cases I end up without a Capture Data file, so I cannot move forward.

  • @reinohvp · a year ago · +1

    Is it necessary to have an iPhone, or can I just upload a video?

  • @dmartinr1xxxx · 9 months ago

    Hi, what's the workaround when a file can't be imported from the queue because the audio file is too big? Thanks.

  • @wokeupina · a year ago · +1

    When I press Prepare for Performance, it says it needs 64 GB of RAM... it makes sense, I guess.

  • @Lukimuc · a year ago

    I only see MetaHuman Identity and Capture Data (Mesh) as options, and the menu itself is called MetaHuman instead of MetaHuman Animator. What did I do wrong?

  • @jigneshghelani3559 · a year ago · +8

    What about Android people?

    • @enrix00 · a year ago · +3

      Maybe because Android manufacturers don't want to add the depth-camera components that iPhones have, and focus on other features instead.

  • @blaszmogames · a year ago

    What cable do you use to connect the iPhone to the PC? I know it can work over Wi-Fi, but a cable connection is used for speed. What's the cable?

  • @binhvongola · a year ago

    Need help: after I create a sequencer and add the animation to the face, I press play but nothing happens. Can anyone help me, please?

  • @PrintThatThing · a year ago

    Very helpful, thanks!!! Can't wait to try it out :)

  • @oliverricardobisbalmompo2110 · a year ago

    Hi! I have a problem when I activate the plugin and open the project. It gives me this error: "The 'MetaHuman' plugin failed to load because the 'MetaHumanMeshTracker' module could not be loaded. This may be due to an operating system error or the module might not be properly set up." and then it closes the project. Any idea what the problem could be?
    Thanks!

  • @karlbovski1656 · a year ago · +1

    So I get it, you guys don't like Android, but what about other 3D-camera solutions like Intel RealSense or similar? Anyway, amazing piece of software, congrats.

  • @LightningProductions-mt8oj · a year ago

    MetaHuman is finally on Unreal Engine 5!

  • @ISknne3D · a year ago

    Cannot wait to start using this. Thanks. :)

  • @argonproductions · a year ago

    Hi, I want to take the MetaHuman with the animation and bring it into Blender to add a couple of details and do the render there, but every time I try, the animation no longer works. What would be the proper way to achieve that?

  • @DohaElwardaney · a year ago · +1

    WOW! Now I understand why actors and actresses worry about AI and digital engines.

  • @joshjen1129 · a year ago

    My footage isn't showing up in the capture source. I'm using the Live Archive as you recommended, but nothing shows up. Can anyone advise?

  • @coalescence · a year ago

    Excellent, can't wait to try it.

  • @3rdDim3nsn3D · a year ago · +2

    So awesome! But I have a major issue with Bridge:
    I created a MetaHuman with Creator (for 5.2), but it is not showing up in Bridge no matter what I do. Do you have a solution for that, please?

    • @piorism · a year ago · +1

      You need to log in multiple times if it doesn't show up, on both the Epic side and the Quixel side. It's a mess, but it works.

    • @3rdDim3nsn3D · a year ago · +1

      @piorism I got it to work. I was just being very dumb: I had to update Bridge via the Epic Games launcher. After that, the MetaHuman from Creator showed up in my Quixel Bridge.
      Thanks for your reply, man 😄

    • @piorism · a year ago

      @3rdDim3nsn3D Awesome :) Good luck. And yeah, as I said, if they don't show up at some point in the future, it'll likely be because of a login timeout.

    • @3rdDim3nsn3D · a year ago

      @piorism Alright, thanks for the hint 👍😃

  • @valentinnist · 11 months ago

    Can I animate the faces of characters created in Character Creator with MetaHuman Animator?

  • @habibmensch7368 · a year ago

    Why isn't the MetaHuman plugin available in my plugins? I am using the newest version (5.2.1).

  • @synapzproductions · a year ago

    HERE WE GO!

  • @grossimatte · a year ago · +1

    How do you export the face board keyframes to the face board in Maya?

    • @hiendam0 · a year ago

      I have the same question. Does anyone know?

    • @grossimatte · a year ago · +1

      @hiendam0 FYI, there's a third-party plugin that does exactly that, but I am not sure if it will work with this new version.

  • @MilutinMujovic · a year ago

    Why can't we use high-quality footage from, for example, high-end mirrorless cameras like the Sony A7 IV?

  • @kurkista · a year ago · +1

    Hi, thank you for this. Is there written step-by-step documentation for these processes? Parts of the video tutorial take things for granted, and a noob like me has a hard time following along.
    Oh, and by the way, something I don't understand: how come the Live Link app is available only for iPhone, but the MetaHuman plugin is not available for UE on macOS? Why is that?

  • @leallan69 · a year ago

    Where are the assets in the MetaHuman Animator submenu under Advanced Assets?

  • @CineTanner · a year ago

    Has anyone had an issue where, after making the identity and selecting footage, the playback view is just black?

  • @JohnnyRampant · a year ago

    Can I export this rig and facial animation to Maya? I would need to add a shot-sculpting pass. I'd appreciate it if anyone can help.

  • @Steve-lu7nu · a year ago · +1

    Can Android camera footage do facial animation?

  • @josevieira3367 · a year ago

    Hi, for those of us who don't have an iPhone, how do we use this? Can I record a video of my face with any cell phone, or can I only do it with an iPhone?

  • @zimingpeng8418 · a year ago

    Every step in Animator works fine, but does anyone know why the body and face are disconnected in Sequencer when I add the animation to the face? When I set "Additive Anim Class" to "Mesh Space", it also seems very rigid.

  • @Sedokun · a year ago · +7

    When will it be available for other devices, like DSLRs, HD webcams, Android phones, etc. (head-mounted cameras)?

  • @kurkista · a year ago

    Is there a way to queue Processes for MetaHuman Performance under MetaHuman Animator, and while we're at it, for Export Animation too? This would free up the time spent waiting for each process to finish before moving on.

  • @이현보-v4t · a year ago

    In the MetaHuman Performance stage, clicking Process causes a crash and I cannot proceed. Is there any way around this?

  • @pharhillanimated · 10 months ago

    Do I have to create a new identity for every project, or can I reuse the same identity with side views and just add a new performance?

  • @PradeepKumar-qw1ov · 10 months ago

    "No Body Type Preset is selected in the Body Part. Please select a Body Type Preset to continue." I am stuck on this error; can anyone help? It appears only while auto-rigging the MetaHuman mesh.

  • @Viscte · a year ago · +2

    How about this for Android as well, not just garbage Apple products?