The Future of Facial Animation with AI | Unreal Engine Metahuman
- Published: 6 Aug 2024
- In this video, I share my experience with facial animation across different software and solutions, from ARKit to AI-powered tools. I break the process down into steps to help you understand the components of facial animation capture, so you can make an informed decision about which solution best fits your production needs.
--------------------
Resources in the video:
Weta's Avatar facial paper: www.fxguide.com/fxfeatured/ex...
Matrix Demo breakdown (must watch): • The Matrix Awakens: Cr...
Faceware portal: • Faceware Portal Webina...
Ziva face on Siggraph 2022: • SIGGRAPH 2022 Real-Tim...
DI4D: di4d.com/
--------------------
Twitter: / hanyang_vfx
Insta: / unreal_hany
Artstation: www.artstation.com/hanyang
00:00 Introduction
01:15 ARKit Face in depth
03:42 Use ARKit in Maya with MetaHuman
04:43 AI-driven facial
07:01 AI result review
10:06 What's the future?
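For context on the ARKit segments: the Live Link Face app can record takes as CSV files of per-frame ARKit blendshape weights. A minimal Python sketch, using a hypothetical three-frame excerpt (the values and file layout here are illustrative, not from the video), that parses such a file into per-shape curves:

```python
import csv
import io

# Hypothetical excerpt of a Live Link Face CSV take: each row is one
# frame of ARKit blendshape weights in the 0.0-1.0 range. Column names
# follow the ARKit coefficient convention (jawOpen, mouthSmileLeft, ...).
RAW = """Timecode,jawOpen,mouthSmileLeft,mouthSmileRight
00:00:00:00,0.05,0.10,0.11
00:00:00:01,0.42,0.12,0.12
00:00:00:02,0.71,0.08,0.09
"""

def load_curves(text):
    """Parse the CSV into {blendshape_name: [weight per frame]}."""
    rows = list(csv.DictReader(io.StringIO(text)))
    names = [k for k in rows[0] if k != "Timecode"]
    return {n: [float(r[n]) for r in rows] for n in names}

curves = load_curves(RAW)
# Find the frame where the jaw opens widest:
peak_frame = max(range(len(curves["jawOpen"])), key=curves["jawOpen"].__getitem__)
print(peak_frame)  # → 2
```

Once the capture is in this per-curve form, it can be cleaned up or keyed onto a MetaHuman face rig in a DCC like Maya.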
My 60s ARKit workflow here: ruclips.net/user/shortsXCImJLD9FA4?feature=share
Great stuff Unreal Han! Thank you for your suggestions!
🔥exciting series, looking forward to the next one!
Aye Cory!!!
Fascinating Han thank you so much for sharing! Brilliant deep dive 🙏
Awesome stuff! Very useful sum up for facial production.
Thank you for sharing your insights; looking forward to learning more from this site!
Awesome content Han! subscribed and liked!
Excellent, comprehensive, brilliant, thank you
superb video man
Awesome man, thank you so much, I'm waiting for more. 💕💕💕💕
Thanks for this comprehensive video, Han.
Very interesting. Thanks for sharing. Great video ❤😮😊
iClone AccuLips, with adjustments, layered in UE with ARKit passes, with adjustments, has gotten me the best results by far.
Do you have a workflow tutorial on that?
@@SaguinMedia I will probably do it after my next video release!
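Layering passes like the commenter describes usually comes down to adding a corrective curve on top of a base curve and clamping to the valid blendshape range. A minimal Python sketch of that idea with hypothetical per-frame weights (not tied to any actual iClone or Unreal API):

```python
def layer_additive(base, corrective, weight=1.0):
    """Add a corrective pass on top of a base curve, clamped to [0, 1]."""
    return [min(1.0, max(0.0, b + weight * c)) for b, c in zip(base, corrective)]

# Hypothetical per-frame weights for one mouth blendshape:
acculips_pass = [0.2, 0.6, 0.9]   # base lip-sync curve
arkit_pass    = [0.1, 0.3, 0.3]   # corrective capture layer

combined = layer_additive(acculips_pass, arkit_pass)
print(combined)  # last frame clamps to 1.0
```

The `weight` parameter lets you dial the corrective layer in or out per shot, which is roughly what animation layers in UE or a DCC do under the hood.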
Ahahah
Glad I discovered this channel
Really great content. Thanks for sharing.
Love your content! Keep it up!
this is great info.. thanks for sharing!
Thank you SO SO SO much for this research. I know it was a mountain. I definitely will stick with ARKit and additional post animation where needed.
What an awesome video!
Great stuff keep going
Amazing explanation and sharing of knowledge! Insta sub!
Really interesting!!! Thanks. Sticking to ARKit for now. 👍
You're the man, Han. Show us the ways!
Thanks for sharing, this is the most practical of practical knowledge 👍
Learn something new from master Han today✌🏻
May I ask, what did you learn from this video?
I feel like with this ai matching system, the animations look a lot more emotional and believable!
you deserve your name ! +1 sub
4:17 Super creepy 😎 Great effect to show a robot freaking out and starting to lose control.
Hey, have you seen Unreal Engine 5.2's Realtime Facial Tracking Animation Demo? Are you using the same tech/approach in the video?
Hi, good stuff! Which program do you use to do the machine learning from tracking data to retarget data? And how did you put it back for auto-retargeting?
Excellent breakdown! Thanks for putting it together! I’m glad to see Faceware Portal get a quick mention at the end.
As a long-time user of Faceware, I know the barrier to entry is a bit steep for indie devs and beginners, but I believe Portal eliminates a huge portion of that front end, especially when it comes to building a tracking model. Since Portal is a cloud-based neural-net solver, I expect it to continue to evolve and improve over time, and hopefully become more accessible to a wider audience.
Totally agree. It's not easy to get Faceware for sure. Love to hear your thoughts on it in terms of quality tho. Especially the new portal.
@@Unrealhan I only briefly touched on this subject in my last webinar, but I hope I’ll have some more substantial feedback to share in the near future. Here’s a clip to preview those tracking results on static cam footage:
ruclips.net/user/clipUgkxZbu8aJEZfmg7oBM01iaEBaBayFtcPRNO
Would love a tutorial on heavily editing a base MetaHuman (like turning the human into an orc or dwarf) in Blender or ZBrush, then importing it back to Unreal with all the facial rigging working, or even on how to edit the blend shapes of the newly edited character in Unreal.
It's coming for sure. Still needs a bit of time, but it's definitely on the list!
@Dahoony thank you!
Hi, thanks for this! Can I get your opinion on upcoming image generators? Specifically BlueWillow, as they are still in the beta testing phase? Are they too late?
Great video!
Link to FaceGood?
Is there any Android alternative to ARKit?
And now MetaHuman Animator is dropping soon. Exciting times!
What about combining and retargeting 3D-scanned per-frame facial animation with morph animation? Maybe it's not good and flexible for Unreal, but it should give better results.
11:09 Respect.
Can I ask you where you got the assets you used to spice up your scene in your last video? I thought they were assets scattered around the city already, but I can't find the movie theater entrance and storefront entrance anywhere...
kitbash3d.com/collections
@@Unrealhan awesome, thank you!
Does it run on an Apple M1 chip? I heard that it's complicated?
What do you think about the new animation in UE 5.2? They have greatly improved it. Please record a video!
New sub here, looking for good content on Unreal for cinematic purposes.
What software or tool do you use to track AI-generated facial capture?
Sir, can you make a full tutorial about that?
Do those MetaHumans' skins naturally deform based on collisions like a real human's? I'm really wondering if this is happening.
For me, a big drawback of ARKit is that certain facial expressions aren't recorded into the blendshapes at all. For example, you can hardly do a non-symmetric brow-up movement (🤨), a worried face (😟), or a sad face with a hanging lower lip. I assume this can be solved by software in the future, but so far I haven't seen much progress in ARKit facial tracking in the last few years, unfortunately.
Good point. The hardware capability is there; someone just needs to design the app specifically for us animators. I hope something is coming though.
@@Unrealhan Yeah, I hope so too. Although it is currently limited, it is very accessible both technically and financially compared to the alternatives.
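The symmetry limit this thread mentions is visible in ARKit's own blendshape list: most face regions come as Left/Right pairs, but a few shapes exist only as a single symmetric control. A minimal Python sketch over the brow-related ARKit coefficient names (the names are real ARKit blendshape keys; the pairing check itself is just for illustration):

```python
# Brow-related blendshapes from ARKit's 52-coefficient set.
# Note browInnerUp has no Left/Right variant, which is one reason
# asymmetric inner-brow poses are hard to capture with ARKit alone.
BROW_SHAPES = [
    "browDownLeft", "browDownRight",
    "browInnerUp",
    "browOuterUpLeft", "browOuterUpRight",
]

def has_pair(name, shapes):
    """True if a shape has a matching opposite-side counterpart."""
    if name.endswith("Left"):
        return name[: -len("Left")] + "Right" in shapes
    if name.endswith("Right"):
        return name[: -len("Right")] + "Left" in shapes
    return False

symmetric_only = [s for s in BROW_SHAPES if not has_pair(s, BROW_SHAPES)]
print(symmetric_only)  # → ['browInnerUp']
```

Any asymmetric expression that depends on a symmetric-only shape simply cannot be represented in the captured weights, no matter how good the tracking is.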
Does it work on a Mac M1 Max?
I have an Android, not an iPhone. But I'm working on getting an iPhone (not to use as a phone, but just for facial motion capture). Could you please put together a tutorial at some point on facial motion capture in Unreal Engine 5?
How about looking at the iClone-to-Unreal pipeline?
What is your opinion of the new Faceware Portal solution?
Hello, Mr. Yang. I'm a student learning UE. In your lip sync, how did you do the part where the finger smears the lips? Could you briefly point me in the right direction? Thank you very much.
I think having the Unreal Engine options completely work on an iOS or Android device is what being truly accessible means.
IMO I will be sticking to the Live Link Face app for a while and then manually correcting in Unreal.
Same here.
this just got a huge update
Yeah, crazy stuff, right? They cracked the code.
Facial? What about body tracking to convert the data onto an avatar?
top
I want to see AI-interpolated in-betweens for hand-drawn animation that use a timing chart you give it, so it understands which frames to favor, because right now the tech just looks mushy since they're straight in-betweens.
Really interesting breakdown, and it matches my experience exactly. Although, the more you feed the FG AI, the better it becomes, so the places where it misses become fewer and fewer.
hey i know you!
@@Jsfilmz me too
Now Unreal 5.2 is out. What do you think about MetaHuman Animator?
Game changer, great for creators and sucks for other companies. Can’t wait for it to come out.
@Unreal Han I really hope they start enabling the LiDAR in the rear camera soon. The demo appears to be using the front cam, which is structured-light based with low resolution.
What about faceware?
Should be a similar idea. I tested their lite version before; it's similar to ARKit. The new Portal looks like an AI-driven solution as well.
Me crying in 3ds Max, rip me. Honestly, I might as well start some sort of R&D for facial tracking for 3ds Max. I need courage!!!!
We really need an AI that analyses about 10 minutes of video of someone's face and can use that data to make a near-perfect copy of all of that person's emotions in MetaHumans.
The ghetto camera rig was like $4k lol!!
Please give an alternative, we don't have an iPhone.
Man... MetaHuman Animator is coming... it uses the iPhone depth sensor. Wow, that was fast.
I'm also looking forward to that day, when the full potential of ARKit or an ordinary camera is unleashed. The era of AIGC is surely coming!!
If anyone is looking for lip-sync animation for a virtual human's speech, here are some alternative AI animation solutions:
Speech Graphics is amazing
OVR is free
Metahuman SDK Plugin is free
🦒
That video of the robot girl and the girl kissing looks inappropriate honestly
I'd rather it didn't; it's bad enough without getting it to do that.
Dude, AI is useless. It's guessing plus using more power. And people growling isn't a sellable product in entertainment. Entertainment is a break from that.