Metahuman Animator Tutorial

  • Published: 17 Oct 2024
  • КиноКино

Comments • 295

  • @Jsfilmz · a year ago +7

    ITS ON LINKEY DONKEYYYYY! Here is a video of the animator app studio.ruclips.net/user/videoiXH79mrKADM/edit
    First rap test ruclips.net/video/mAT1X4xbEdA/видео.html

    • @jimmwagner · a year ago +1

      Do you need the depth capture in Live Link or can you just use video?

  • @MR3DDev · a year ago +9

    Bruh! It is nuts how accurate this thing is.

  • @lukewilliams7020 · a year ago +7

    Yes, so glad to hear you'll be doing more tests with this. Bro, just realised I've been on this unreal journey with you since your channel was a baby. You're literally the don in this field. Keep em coming

  • @jimmwagner · a year ago +1

    Knew this was coming the second I saw the Unreal post about this. Excited to try this out today.

  • @leonaraya2149 · a year ago +4

    Bro, you are a master!
    And the way that Unreal is improving this technology day by day is amazing.

  • @PrintThatThing · a year ago +3

    Incredible! You're so ON IT. Thanks for the walk-through. Very helpful!!!

  • @user-ct8my8rv9c · a year ago +3

    Good and fast tutorial, nice man

  • @DLVRYDRYVR · a year ago +3

    7:00 I'm listening to this video driving around, windows down, level 20 on Bluetooth and at a Bus Stop with little old ladies 🤦‍♂️

  • @MiguePizar · a year ago +2

    Thanks for the tutorial, I can't wait to try it, although it will take hours since the shorts that I'm doing have a lot of dialogue, but it will look a lot more realistic. Best

  • @ohheyvoid · a year ago +2

    You're amazing. Thank you for staying on the pulse!

    • @Jsfilmz · a year ago +1

      thanks for being here!

  • @MANIAKRA · a year ago +1

    BIG! I was wanting to explore this, thanks for the tutorials!

  • @lukewilliams7020 · a year ago +1

    The accuracy looks insane

  • @MichaelHurdleStudio · a year ago +1

    This was jam-packed with so much information. I have to pause this video and slowly follow all of the steps. This is amazing! I'd like to purchase the headset for my iPhone. Did you make it yourself, or did you get it from a website?

    • @Jsfilmz · a year ago

      home made bro, with my sweat and blood. its not $129 anymore though, its $169 now

  • @kool-movies · a year ago +1

    just done my first quick test following your tuts to guide me through the process, got to say wowzer animator is amazing. so much cleaner than i expected. well worth the iphone rental :}. keep the vids coming :}

    • @Jsfilmz · a year ago

      oof someones bout to buy an iphone 😂

  • @JoshPurple · a year ago +1

    LUV'd your last vid with your little girl (all of your vids ❤ ) 🏆😁👍!

    • @Jsfilmz · a year ago +1

      hahah thx man it got 20k views on twitter hahah

  • @sakibkhondaker · a year ago +4

    Some day i think you will make tutorials before release. haha. So quick!!!!!!!!!!!

    • @Jsfilmz · a year ago +3

      hahaha yea they didnt select me for beta access unfortunately. i know some did so im gonna have to catch up to them

  • @guilhermenunes7075 · a year ago

    4:43 "Iden-TITTIES" hahahahahaha
    Great video man! Thank you for the content.

  • @christiandebney1989 · a year ago +1

    Awesome.. Randomly woke up at 4am, knew there was a reason... thank you!

  • @moneybagvr2113 · a year ago +3

    You the man bro

  • @IDAVFX · a year ago +1

    Thanks Lord Helmet!!!

  • @artofjhill · a year ago +1

    yeeeeaaaaaaa 🔥

    • @Jsfilmz · a year ago

      hey J can i borrow 10k subs so i can hit 100k? thanks mang

  • @burov.a.4467 · a year ago +1

    Hi! Thanks for the cool video. A question: will it work if you upload a regular video shot on a camera?) just no iphone))

    • @Jsfilmz · a year ago

      dont think so, iphones are precalibrated like i showed in the performance editor

    • @andrewwelch5017 · a year ago

      No, the software requires depth information which a regular camera can't provide. Borrow an iPhone 11 or newer.

  • @Stonefieldmedia · a year ago +1

    ...and off we go!

  • @StyleMarshall · a year ago +2

    sooo quick, Dude! you are fast as hell .... 😁

    • @Jsfilmz · a year ago

      i woke up 5 am today haha

  • @emotionamusic · a year ago +1

    Super sick! Do you feel like the results are significantly better than the live stream app?

    • @Jsfilmz · a year ago +1

      bro go watch my rap with it, it will answer your question

  • @josiahruelas498 · a year ago +1

    Great video man! I'm getting an error when I hit the process button in the Metahuman Performance. It says "The Processing Pipeline failed with an error." Any ideas on how to fix this would be appreciated. Thanks!

  • @2beornot2bable · a year ago +1

    Great job!

  • @Griff_The_5th · a year ago +1

    Do you think it's worth shelling out a bit more money for the iPhone 13 mini over the 12?

    • @Jsfilmz · a year ago +1

      im broke so ur askin the wrong person

  • @Blairjones3d · a year ago +1

    I may be a little late to the party, but im confused why in CAPTURE SOURCE you choose LiveLink Archives (which is uploading footage etc from the PC drive?), then when you go to import you do it from the iPhone in CAPTURE MANAGER?

    • @Jsfilmz · a year ago +1

      i made the tutorial both ways, my usb transfers faster for bigger files

    • @Blairjones3d · a year ago

      @@Jsfilmz oh sweet as, no worries bro. Any tips on uploading from USB then using archives? Seems more complicated to set up the Performance that way...

  • @TetraVaalBioSecurity · a year ago +1

    A lost detail in Hideo Kojima's DS2 trailer was the ending saying the performance capture is powered by MetaHuman.
    That game is going to be insane levels of detail.

    • @Jsfilmz · a year ago

      yea hideo was lurking around on my channel when ue5 first came out, i was one of the first ones to cover it

  • @Va.bene. · a year ago +1

    hey bro! awesome!
    I have a little issue: when I track my face, with the animation sequence it's like nothing happened in my sequencer, but it exported correctly, because when I open it alone the animation is fine. any idea?
    Thank you!

    • @Jsfilmz · a year ago

      i dont understand :( maybe join the unreal discord and post pics and the issue there

  • @supermattis108 · a year ago

    So, whenever I want to make a facial animation (of myself doing it with live link) I need to go through these steps? But they can be used on every metahuman?

  • @kool-movies · a year ago +2

    epic insane, looks much better than faceware or livelink. like you say, the curves look smooth and no jitter. and to think i spent 4 hours last night doing 30 sec of manual facial animation that looks rubbish. so if i get a friend who has an iphone they can send me clips?

    • @Jsfilmz · a year ago

      yes

    • @kool-movies · a year ago +1

      @@Jsfilmz awesome, is there any reason not to get the iphone 12 mini?

    • @Jsfilmz · a year ago

      @@kool-movies i havent tested longer takes with the 12 mini yet but it used to overheat on me alot hahaha

    • @kool-movies · a year ago

      @@Jsfilmz your rap video is a long take so hopefully it will be good. i just ordered/rented a cheap refurbished one, hopefully arrives tomorrow :} the 3lateral video that just released is mind blowing. assume they used a stereo camera?

  • @alexdjulin · a year ago

    Thanks for the tutorial! Do you have a recorded take file I can use as a test? I dont have an iphone. Thanks

  • @TheChrisNong · a year ago +1

    I get the "assertion failed" crash every time I promote the first frame... what should I do?

  • @matteo.grossi · a year ago +1

    Hey J, for some reason the result of your test is not very good compared to other tests I have seen online. Do you reckon there was something in the configuration/shooting conditions that interfered? Or perhaps the other tests used other types of cameras, such as stereo cams, rather than iPhone? Thanks for the tut though.

    • @Jsfilmz · a year ago

      i think the demo videos that came out were done with stereo cams, not sure

    • @matteo.grossi · a year ago

      @@Jsfilmz The rig used when they announced MHA wasn't a metahuman rig, in fact they only showcased how the new system works on metahuman rigs for like 5 seconds at the end of the presentation.

    • @Jsfilmz · a year ago +1

      @@matteo.grossi hahaha thats cheating then right lol

  • @oldmatttv · a year ago +1

    This really is such a big step forward. Very exciting. The only problem is the iPhone-only requirement. That makes no sense for something like this, but let's hope it changes sooner or later.

    • @Jsfilmz · a year ago +1

      beats buying a real mocap system 😂

    • @oldmatttv · a year ago

      @@Jsfilmz True I suppose :)

  • @JoshuaLundquist · a year ago +1

    Also hey, I'd like one of those helmets, but I do have an iPhone 11, will that work? It's bigger than the mini, of course.

    • @Jsfilmz · a year ago +1

      i know some people whove tried the 11 with MHA and they said it works, i havent tested it myself

    • @JoshuaLundquist · a year ago

      Gonna buy an iphone 12 mini like you have, can you tell me which mount you use so I can buy that and your helmet? @@Jsfilmz

  • @JDRos · 10 months ago

    So how do we connect this to a body so we can add animations to it? 😅

  • @marcusjordan8036 · a year ago

    This is awesome! Would I be able to take the metahuman/face mocap and export it into blender? I don't need any of the textures or materials. Thanks!

  • @ArabicTechAILab · 7 months ago

    when I play the level in a new window (PIE)
    the Metahuman character is moving his face with my motions very well, but when I click on any window other than the PIE window it starts to lag. and when I click again on the PIE window, the character moves normally!
    what should I do?

  • @veith3dclub · a year ago +2

    I need to buy an iPhone 12+ first hahaha

    • @Jsfilmz · a year ago +3

      looks like the 11 works

    • @aaagaming2023 · a year ago +2

      @@Jsfilmz In their docs they group the X, 11 and 12 together and then the 13 and 14 together. Then they give a caveat about the X, saying that its not capable of capturing more than a few sec, but they dont say that about the 11, so yeah, I think the 11 is capable of capturing longer form content like the 12.

    • @andrewwelch5017 · a year ago

      @@aaagaming2023 I tested the iPhone 11 Pro last night and it works fine, zero issues.

  • @pokusalisobaki · 5 months ago

    Thank you so much, its amazing

  • @davidMRZ · a year ago +2

    So strange, in Capture Source whatever I put (Live Link or archive) it doesn't find it. It goes green in Capture Manager but nothing shows up🤔

    • @Jsfilmz · a year ago +1

      hey man im uploading another tutorial, stand by, maybe u missed something

    • @davidMRZ · a year ago

      @@Jsfilmz great, thank you so much

  • @natanaelgauna3600 · a year ago

    Amazing content as always!
    Could you please make a video on troubleshooting these three issues:
    - "Promote Frame" randomly jumping to a different frame than the selected one.
    - Metahuman Identity Solve giving an inaccurate result.
    - "Add Teeth Pose" breaking the Identity Solve even more.
    Thanks a lot!!!!!

  • @D3Z_animations · a year ago

    Awesome video! Thanks for the quick setup explanation! But I have to point out that MetaHuman Animator already uses AI for its solves.
    When you hit the "Prepare for Performance" button, it trains a model on your face to later mimic the way it moves so it can animate other metahuman characters to that likeness. Thats why this step took 8-10 minutes : )

  • @itsyigitersoy · 8 months ago

    Hi, i asked this to a lot of people but couldn't get an appropriate answer. After creating an identity and performing with it, my Metahuman character's default lips and teeth are changing, because it's getting my identity. Is it possible to keep the animation exactly the same but with the character's default facial features?

  • @jerrtinlsc · a year ago +1

    So right now its only recording, but no livestream yet?

    • @Jsfilmz · a year ago

      live? nah, facegood for that

  • @binyaminbass · a year ago +1

    when doing facial mocap, what do you do for a mic? do you have a little one that you attach to your helmet? what kind of mic is good for that?

    • @Jsfilmz · a year ago

      just my good ole Sennheiser G2

    • @binyaminbass · a year ago

      @@Jsfilmz do you attach it to the arm of your helmet?

  • @Persianprograph · a year ago +1

    Incredible! Is there a way to export the animation data to Maya to have more freedom for further tweaks?

    • @Jsfilmz · a year ago

      that would be amazing but i dont think thats possible yet

    • @Luke_wp · a year ago

      This guy made a script to do it ruclips.net/video/ecYO5-5fL0U/видео.html

  • @DLVRYDRYVR · a year ago +3

    Hope my XS works. Don't want any more Apple pradux

    • @IamSH1VA · a year ago +2

      Please try it & please report if it works, I have the same iPhone XS.
      I am not gonna have access to a windows system for at least 15 days, but I am dying to test this feature.

  • @Aragao95 · a year ago +3

    Has anyone got it working well with an iphone 11? the new epic post says it needs at least an iphone 12..

  • @muviing5427 · a year ago

    Thank you for sharing a fun and essential tutorial!! Anyway, is there a way to use the neck animation recorded by Unreal Live Link facial capture? When I imported the facial anim with neck animation and applied it to my metahuman skeleton, the body and face were broken because of the neck anim. It's possible to do just the facial anim without neck movement, but is there a way to use the neck anim...? Is there a solution using the mocap data (with neck anim) + facial capture anim with just face movement (without neck anim)...?

  • @Fotenks · a year ago +1

    I hope that behind the scenes they are working on full body mocap

  • @3DComparison · a year ago +1

    Great stuff!

  • @sibinaayagam1838 · a year ago

    If I'm not wrong, you should be turning off the neck solving on a head-mounted camera, and use it only for static cameras.

    • @Jsfilmz · a year ago +1

      in my case ill use neck movements from my mocap, the exported sequence doesnt come with neck. check out the new rap video i uploaded

  • @3rdDim3nsn3D · a year ago +1

    Yo, do you have any clue why the metahuman i created in creator does not show up in bridge? Its so frustrating man, i spent over an hour now trying to get it to work
    I am mad as fu!& now😅

  • @Amelia_PC · a year ago +1

    Could you provide me with the information about your iPhone 12 model so that I can search for a similar one to buy?

  • @JonCG · a year ago

    Hello bro. I just went back to this video, because for me when updating to 5.2.1 it says "Preparation for Performance Failed", any tips on it bro?

  • @terry9183 · a year ago

    You're absolutely smashing it man 👏🏽 Thank you so much for the awesome content!
    I do have one slight issue though!..
    For the life of me, I cannot get the Livelink Face mocap to work with separate body mocap. The head/chest just detaches itself and they are both independent. It's driving me insane. I've tried following some advice on the UE forums to no avail 😭
    Have you experienced this yet? any tips?
    Thank you

  • @UnrealEnginecode · a year ago +2

    after updating the iphone app, the Live Link Face app gives a warning for the animation part saying "your device model is unsupported, you may continue but your results could be affected" for metahuman animator capture. and it works, but i wonder: my economic situation is not good and I will develop games. How much performance difference does the iPhone 11 make?

    • @andrewwelch5017 · a year ago +1

      The software requires an iPhone with a "TrueDepth" sensor because it needs depth data to accurately track your face. You can always borrow an iPhone 11 (or newer) to do the tracking and then transfer the file to your computer.

    • @UnrealEnginecode · a year ago

      thank you for your answer, what are they giving this warning for? do you think there will be a noticeable quality difference?

    • @andrewwelch5017 · a year ago

      @@UnrealEnginecode The quality I got was excellent so I'm not worried about it.

  • @dmingod999 · a year ago

    Did you see issues of a floating head? The face animates and the body remains still.. if neck rotation is disabled it's fixed, but the neck rotation part of the capture is lost.. Do you know how to avoid that? Thanks! 🙂🙏🏻

  • @lili-ro7cw · a year ago

    Thank you so much for your tutorial, so timely. In addition, I am trying to import from the mesh body in UE5.2, and it seems that the tracking mark link can no longer be carried out. Have you encountered it?

  • @cubiclulz · a year ago +1

    can it be used not with my face but with one created in metahuman creator? and how to do it?

  • @Pauliotoshi · a year ago +2

    Great showcase! In case you plan to make more test videos, can you show expressions that are hard to do with Apple ARKit? I'm curious how it compares.
    For example a sad face with a hanging lower lip, asymmetric brow movement🤨, a worried face 😟 or any interaction between teeth, lips and tongue.

    • @Jsfilmz · a year ago +2

      man im a terrible actor but ill try

    • @Pauliotoshi · a year ago +2

      @@Jsfilmz Thanks a lot!

  • @AllanMcKay · a year ago +2

    Great video - you mentioned when MHA will later use AI. Just a heads up, it is AI driven currently. Pixel tracking is only a small part of the foundation.
    Great video!

    • @Jsfilmz · a year ago +1

      wait, like its doing ai pose estimations already?

    • @AllanMcKay · a year ago +2

      @@Jsfilmz yeah, there's a lot under the hood that's ML driven already. Facial tracking doesn't account for wrinkles or much else other than eye and mouth shapes; everything else is interpolated with AI. There's more coming, but the foundation is already utilizing AI in a lot of areas. The second pass animation is still being improved on, so it'll continue to get better. But it's a training model based on a lot of human facial animation, to know what to do when cheeks are raised, nostrils flared, eyebrows wrinkled etc.
      Night and day different to something like Live Face or other tools which purely track eye and mouth shapes and don't leverage any AI to then interpolate wrinkles and pseudo face muscles into the animation

    • @Jsfilmz · a year ago +1

      @@AllanMcKay oh wow hahaha crazy stuff, being 1.0 its not bad. oh btw, for iphone it can only output 30 even when recording 60 right? Thanks, i love knowing about the tech

  • @benblaumentalism6245 · a year ago

    I wonder how long a given take can be. I have to make some EDU products with this, and they might need to be on the long-ish side.

  • @humbertomoli99 · a year ago +1

    Isn't it necessary to have an iPhone for this? I don't have an iPhone, sorry

  • @KyanZero · 9 months ago

    JS, can that be done with DAZ characters too?

  • @prasoondhapola2875 · a year ago

    I don't have an iPhone. Will it work with my 2021 iPad Pro?

  • @memnok9980 · a year ago +2

    Imported my video from my iPhone and it won't show up in capture manager. I'm lost. Lol.

    • @Jsfilmz · a year ago +1

      gotta import the entire folder

    • @memnok9980 · a year ago

      @@Jsfilmz Ah ha! Thanks!

  • @chipcode5538 · a year ago +2

    On the metahuman video they use 4 calibration images, is there a reason why you did not use side views?

    • @Jsfilmz · a year ago +2

      i have a helmet on bro hahahaha, they're on a tripod. if i move my head the camera will move too

    • @chipcode5538 · a year ago

      @@Jsfilmz You could make the calibration video with a tripod and use the helmet for the capture. I don't know if it will improve the calibration; the metahuman seems a little off, especially the nose.

  • @dzeeriq · a year ago

    Hi! It's an amazing video. I tried to import my recorded video using an iPhone 11, but in the capture manager the video can't be read. Any suggestions?

  • @naytbreeze · a year ago

    Having a huge problem: when I try to add the animation to my metahuman in the sequencer, it becomes detached from the body from around the shoulders area. Was it because I may have moved too much in the capture? Can't seem to get the body and the head attached

  • @jaykeastle8804 · a year ago +1

    BOSS!

  • @HQLNOH · a year ago +2

    Hey! I just wanted to say thank you for your videos! If it wasn't for you I couldn't have created animation for my metahumans. ❤ Thank you

    • @Jsfilmz · a year ago

      That is awesome!

    • @HQLNOH · a year ago +1

      Omg yes! My video would never have made it if it wasn't for you! I even mentioned you in appreciation in the description ❤️

    • @Jsfilmz · a year ago

      @@HQLNOH thanks man, not many give credit back to me, i appreciate it

  • @sybexstudio · 8 months ago

    Did everything to the T and my animation sequence doesn't show up in the Face animation menu. Any tips?

  • @nmatthes2927 · a year ago

    Do you have to record the videos with the Live Link app?
    Because for every android user this would be a real pain in the *ss and make this tool completely unusable and useless for non apple-device owners.
    I tried using a normal mp4 video but it said that there is no footage in the folder.

    • @Jsfilmz · a year ago +1

      tell android to get it together, i have android as my main phone

  • @OfficialCloverPie · a year ago

    my head keeps detaching from the body when i attach the animation to the face and play it in sequencer

  • @RandallsDiversion · a year ago +1

    bro mine keeps crashin when loading trackers. Any suggestions?

    • @Jsfilmz · a year ago

      check hardware specs? not sure, mine just takes a min

    • @RandallsDiversion · a year ago

      @@Jsfilmz I fixed it. Apparently if you create a duplicate project from an older version of unreal engine, it won't work. Thanks though.

  • @prodev4012 · a year ago

    Do you think there will be a marketplace to buy mesh-to-metahuman data scans eventually? For example I don't know any korean girls to run this on with an iphone like that company did, but i'd be willing to pay people who know attractive people from every race if they did

  • @PecoraSpec · 7 months ago

    can i move the mocap data to another software, like blender?

  • @ielohim2423 · a year ago +1

    Great vid!! In the showcase, didnt they show a way you could use this app to generate textures for your metahuman? Will you be showing us how as well?

    • @Jsfilmz · a year ago +1

      can you send me that video? i dont think i saw that

    • @ielohim2423 · a year ago

      @@Jsfilmz I think I may be mistaken. I thought the Hellblade II showcase did it, but I think they may have had a premade metahuman.

    • @ielohim2423 · a year ago

      @@HellMunky What's your workflow? What do you use to generate the mesh/textures that you import into UE to use in the plug-in?

  • @tomhalpin8 · a year ago +1

    Were you able to figure out how to connect the iPhone to be wirelessly triggered to record from Unreal?

    • @Jsfilmz · a year ago

      just the regular livelink way?

    • @tomhalpin8 · a year ago

      @@Jsfilmz With the new MetaHuman Animator version of LiveLinkFace, I can't seem to connect. Trying to match my body mocap with the face mocap.

    • @Jsfilmz · a year ago

      cant do it live, animator is offline

    • @tomhalpin8 · a year ago

      @@Jsfilmz I wonder if I can use OSC to sync up the recording process on the phone with the mocap

  • @jonos138 · 8 months ago

    I have a problem where the character shows more bottom teeth than top. I dont speak like that and Live Link doesnt do it.
    I tried re-tracking the face markers but still had the same problem.

  • @RikkTheGaijin · a year ago +1

    is there a way to export the head movement? it looks unnatural to have just the facial expression with a still head.

    • @Jsfilmz · a year ago

      yes, watch my tripod tutorial i uploaded today

    • @RikkTheGaijin · a year ago +1

      @@Jsfilmz it looks like if you use a Metahuman that you already had before the latest update, it won't export the head movement, but if you download a Metahuman now, it will. I guess they have updated all metahumans in the Bridge catalog. I still don't know how to attach it to the body.

    • @RikkTheGaijin · a year ago

      @@Jsfilmz I watched it but you are not showing how to attach the body, you just say "I will import a body animation"

    • @muviing5427 · a year ago +1

      Wow. I have the same issue. How to attach the body with proper movement...? I think there is a way to solve it with BP... I checked the Unreal forum but I can't find the proper way.

  • @Impaczus43 · a year ago +3

    Hold up! Did it just create a Metahuman face with your facial proportions just based on the iPhone calibration!? that's crazy!

    • @Jsfilmz · a year ago +3

      yes

    • @Impaczus43 · a year ago +1

      @@Jsfilmz that's awesome! Glad you were able to share this! Awesome video

    • @Jsfilmz · a year ago

      @@Impaczus43 it made a fatter version of me which is effed up lol

    • @soncho.editz1 · a year ago

      @@Jsfilmz I mean honestly it does have your facial features, but do u think you can adjust that manually???

  • @reubencf · a year ago +2

    3:52 it says iPhone 11 here
    But my phone is also an 11, and over there in Live Link for metahuman animator I get "your device model is not supported, you can continue to use it but results won't be that good"

    • @Jsfilmz · a year ago

      yea ive had people here try it

    • @reubencf · a year ago +1

      @@Jsfilmz I just went back to the site and it seems that the iPhone X and 11 were removed

    • @Jsfilmz · a year ago

      wtfff on the docs?

    • @reubencf · a year ago

      @@Jsfilmz yes

  • @DariuszMakowski · a year ago

    Have u done any vids on how to take that head with animation and apply it to another metahuman?

  • @JoshuaLundquist · a year ago

    Epic is great but they gotta give us some options with the markers, like maybe don't make the ones that go on teeth straight up yellow/green lol? Great video though.

  • @juanmaliceras · a year ago +1

    it's here!!!!! downloading the plugin!!!!!

  • @michaelb1099 · 6 months ago

    is there a way to create the captures from a webcam?

  • @RemDovans · a year ago +1

    Maybe in a year or 2 Epic will give us a way to import our own videos for mocap. I've tried so many apps and don't like the results; the only one left to try is supposedly the best one, Move AI.

    • @Jsfilmz · a year ago +1

      yea its good man

  • @thehotpotatosquad · a year ago

    Hello, awesome tutorial, but I have a very basic problem: Capture Manager does not recognize the files. Literally the step you make at 1:20 of your video shows no video files for me. I recorded with an iPhone 13 mini (supports LiveLink, realtime also works), and I am working on a Windows PC. Any ideas what the problem can be?

    • @Jsfilmz · a year ago +1

      firewall?

    • @thehotpotatosquad · a year ago

      @@Jsfilmz Im not so sure about that, because the real-time connection works, it's just that the imported video is not recognized at all. I must mention that I am using a Windows PC, so might there be a problem while transferring files? I just imported the whole zipped folder from OneDrive and unpacked it, but is there maybe another way?

  • @coulterjb22 · 6 months ago

    The Performance audio track shows 'Unresolved Binding' though I can hear the audio. When I export the animation no sound is exported... the internet has failed me. Anyone? UE 5.3 and 5.4

  • @GiorgiBekurashvili · a year ago +1

    I receive the software release and your notifications at the same time, how is it even possible :D

    • @Jsfilmz · a year ago +2

      i woke up 5 am today to make this video

  • @privet20005 · a year ago +1

    I was trying to make it work all night and it keeps crashing after the first stage of tracking😭 I can make it work only when I select like 150 frames. I bought my laptop in january and its already not enough😭

    • @Jsfilmz · a year ago

      yea its cpu intensive bro

    • @privet20005 · a year ago +1

      @@Jsfilmz I hope upgrading ram to 64gb would help

    • @Jsfilmz · a year ago +1

      gl man

  • @DominicRyanOsborne · 3 months ago

    When the teeth are facing you but the face isn't, it must be metahuman, or a horrible childhood accident you didn't wanna talk about

  • @TheSabanrab · a year ago +1

    Was about to say the same thing about your teeth JS 😂

  • @ComanderJTC · a year ago +1

    I thought when you're calibrating you need a minimum of 3: the front, left and right, and that the teeth only need one in front?

    • @Jsfilmz · a year ago

      turn ur head with the helmet on

    • @ComanderJTC · a year ago

      @@Jsfilmz ah I see your point, but wouldn't it be better for calibration to do it without the helmet to get the three angles, then do the depth part of the camera recordings with the helmet on, to make it more accurate for certain things like creases on the face and stuff similar to that when making facial expressions?

    • @Jsfilmz · a year ago

      @@ComanderJTC far as i know u gotta calibrate like how ur gonna record

    • @ComanderJTC · a year ago

      @@Jsfilmz ok makes sense, I just figured it would read your face better if it had an initial depth analysis from multiple angles was my thinking