- 169 videos
- 1,068,343 views
Pixel Prof
USA
Joined Aug 19, 2013
I teach Virtual Production and performance capture at Drexel University.
Before teaching, I worked in feature film VFX on matchmoving (SynthEyes, PFTrack, MatchMover Pro, etc.), compositing (Nuke, AE, Shake), pipeline programming & scripting (C, C++, Python), 3D (Maya, C4D), photogrammetry, and motion capture (Vicon, OptiTrack, MotionBuilder), but after Epic's Unreal Engine Fellowships in 2021 and 2023, most of what I do now involves Unreal Engine for in-camera VFX, nDisplay, live greenscreen VP and of course mocap/Metahumans. That doesn't mean that I've left my MotionBuilder / Maya / Nuke / Houdini roots behind. :-)
I'm particularly passionate about the power and potential these technologies have for communications and interaction in general, and I hope this channel helps others get up to speed with these tools more easily and quickly.
Have fun.
Quick Tip: Default Frame Rate for Motion Design Projects in Unreal Engine 5.5
This quick tip video shows where to find and change the default frame rate setting for Motion Design animation in UE 5.5.
164 views
Videos
Fixing "Auto" Retargeting Issues in Unreal Engine 5.5
297 views • 9 hours ago
This video shows a workflow for addressing issues that can be encountered in Unreal Engine when Auto Retargeting doesn't really do the "Auto" part so well. By generating retargeting assets and making logical adjustments to the source and target retargeting poses, what initially looks like a very problematic retarget can be made quite effective.
Creating a Motion Design Project in Unreal Engine 5.5
457 views • 12 hours ago
This video is part of a series for beginners showing how to create a new project in UE 5.5 configured for Motion Design work. In addition to setting up the blank project and plugins, this also shows how to add a free set of material assets that can be used when creating motion graphics elements.
Install Unreal Engine on a PC
100 views • 12 hours ago
Just a quick video showing how to install Unreal Engine. The screen captures are from a Mac, but the install process described is for a PC.
Broadcast Transition Logic in Unreal 5.5 (For Beginners)
345 views • 12 hours ago
This video, intended for beginners, is a step-by-step demonstration of how to set up a simple lower-thirds broadcast graphic template using the Transition Logic features of UE 5.5's Motion Design plugin.
Dynamic Materials in Unreal Engine 5.5
611 views • 21 hours ago
This quick-tip video shows how to use Dynamic Materials in Unreal 5.5 (it should also work in 5.4, and maybe in 5.3?). Dynamic Materials allow a single material instance to have lots of different looks in a single level, without the need to create duplicate instance assets in the Content Browser. This is particularly useful in Motion Design so template levels can be configured to use different ...
Animate Metahumans Using an Audio File in Unreal Engine 5.5
4.7K views • 1 day ago
One of the "under the radar" demos at Unreal Fest 2024 was the new ability to use nothing more than an audio dialog file to drive Metahuman lip sync animation. This video shows a pre-release walkthrough of how to use this functionality with the 5.5 Metahuman plugin. This video was recorded using the initial Pre-Release version of UE 5.5, so there are still a few things to be ironed out, and ...
NDI Alpha Channel with Motion Design in Unreal 5.5
781 views • 21 days ago
This video shows how Unreal Engine 5.5's native NDI Output plugin can be used with the Motion Design Rundown Playlist feature to output graphics with alpha channel that can be composited for live broadcasting and live streaming outside of Unreal Engine. This video is recorded using an early, pre-release version of UE5.5 downloaded from Epic's Github source repository, so features are subject to...
Native NDI Output in Unreal Engine 5.5
1.8K views • 21 days ago
Let's take a look at the built-in NDI Media output functionality in a pre-release version of Unreal Engine 5.5! This video is the first in a series taking a look at some new features and updates found in the source code release of UE5.5. Since this is built from source ahead of the official pre-release, anything shown could change between this recording and the official release, but it's always...
Installing Unreal Engine 5 on a Mac
2.6K views • 1 month ago
This video walks through the process of installing Unreal Engine onto an Apple Macintosh (in this case, an M2 MacBook Air). Included in the video is a demonstration of creating a first-time Film, Video & Live Events project, and adjusting some of the Engine's performance settings to improve the frames per second achieved. In this case, the FPS improved from 19-21fps to a more respectable ~30fps...
Triggering Different Sequences in Unreal Engine
330 views • 1 month ago
This video shows how Unreal Engine can be set up to play different Level Sequences in response to different events, such as triggering different collision boxes. This is shared in response to some questions that were posted to a previous tutorial explaining how to use a keystroke to play a single sequence. Hope this helps. Have fun.
Free Movella Mocap Assets: Retargeting in Unreal 5.4
454 views • 2 months ago
This video is a quick look at how to easily retarget free XSens motion capture animation assets downloaded from Movella.com so they can be used on Unreal Engine Manny or MetaHuman characters. The free mocap data can be downloaded from movella.com
Create Custom Metahuman Template Project in Unreal Engine
559 views • 2 months ago
This video shows how you can easily create your own custom template project that can be used when launching Unreal Engine to save time by having all the plugins and rendering settings you need ready to go any time you create a new project.
SIGGRAPH 2024 - Revopoint 3D Handheld Scanners
224 views • 2 months ago
In this video, I chat with Vivian Li from Revopoint 3D to learn about the MIRACO standalone 3D scanner. Vivian shares an "instant demo," scanning my head and face on the spot in their SIGGRAPH booth. You can learn more about the Revopoint 3D scanners here: www.revopoint3d.com/collections/3d-scanners
SIGGRAPH 2024 - Move4D Volumetric Capture System
411 views • 2 months ago
In this video, I chat with Matt Bennett at SIGGRAPH 2024 to learn more about the Move4D volumetric motion capture system. You can learn and see more about this system at www.move4d.net/
SIGGRAPH 2024 - Machine Learning & USD Tools in Nuke and Katana with The Foundry's Christy Anzelmo
226 views • 2 months ago
SIGGRAPH 2024 - ArcEye/Sony Volumetric Capture System
507 views • 2 months ago
SIGGRAPH 2024 - New LookingGlass "Holographic" Displays
420 views • 2 months ago
SIGGRAPH 2024 - Vicon Performance Capture
212 views • 2 months ago
SIGGRAPH 2024 - How Manus Mocap Gloves Work
277 views • 2 months ago
Virtual Production Camera Calibration with Vicon Shogun
651 views • 2 months ago
Quick Tip: Importing older "Incompatible" Assets into new versions of Unreal Engine
414 views • 3 months ago
Automating CSV Data Updates for Unreal Engine Motion Design
690 views • 3 months ago
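For readers wondering what "CSV driven" means in this series: the videos read rows from a spreadsheet export and push the values into Motion Design text elements inside Unreal. The Unreal-side setup is not reproduced here; the following is only a minimal, hypothetical Python sketch of the CSV-reading half (the column names and values are made up for illustration):

```python
import csv
import io

# Hypothetical CSV export, standing in for the spreadsheet data used in
# the videos (the real column names and values are not shown there).
CSV_TEXT = """headline,score
Opening Night,98
Second Round,87
"""

def load_rows(text):
    """Parse CSV text into one dict per row, keyed by the header line."""
    return list(csv.DictReader(io.StringIO(text)))

rows = load_rows(CSV_TEXT)
for row in rows:
    # In a Motion Design setup these values would be pushed into text
    # actors in the template level; here we just format them.
    print(f"{row['headline']}: {row['score']}")
```

An automation pass like the one in the video could re-run this parse whenever the exported file changes and hand the updated rows to the graphics template.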
CSV Driven Motion Graphics with Unreal Engine 5.4 - Video 3 Step-by-Step
686 views • 3 months ago
CSV Spreadsheet Driven Motion Graphics with Unreal Engine 5.4 - Video 2 Technical Overview
532 views • 4 months ago
Reality Capture Intro for Office Hours
301 views • 4 months ago
CSV Spreadsheet Driven Motion Graphics with Unreal Engine 5.4 - Video 1 Introduction
646 views • 4 months ago
UE 5.4 Motion Design Boolean Text Operation
891 views • 4 months ago
When I export it and bring it into my level sequence, my MetaHuman does not move. But when checking the animation sequence, the head does move.
Thank you Pixel Prof!
Hi, thanks for this! I have a similar issue, but mine is happening with a character that I've rigged in Blender using Auto-Rig Pro (so it's rigged to the SK_Mannequin skeleton). So I import it and select that as the skeleton. It looks fine, but when I bring in some animation, it distorts the character like it happens with your MH. I tried making your changes to the SK_Mannequin skeleton but it didn't do anything (potentially because it's IK rigged). Sorry for the long comment!
Great tutorial. Do you know if it is possible to trigger play/stop of scenes/pages in the broadcast module with MIDI or OSC messages? Thanks for all your videos.
Wonderful video, thanks for sharing. At 6:45 the character's source toe ends are dipped. I've seen this in other mocaps.
Good catch! Yeah, this data is from a "first time using the mocap system" class. I'll have to cover the Vicon process for correcting this toe issue next time the class is in the studio.😁
@@PixelProf Thanks for covering how to edit the source, tho. Really good info. Really useful for these cases: fingers, toe end. Great video!
Super cool feature… love it… I can't figure out how to trigger the sequence inside a particular overlap not by the overlap itself but by an action key… I can't connect the conditions (is overlapping + action key) to trigger the sequence attached to the overlap box. My head is spinning in BPs… C++ is so much easier to make it work. Any ideas?
Boss, can you tell me more about the 5.5 Material Designer? Almost no one talks about it. It's just that the bug problems of 5.4 have not been fixed yet...
Yup… more on that coming soon! 😁 There are some great updates to Material Designer in 5.5 that make some significant changes from the 5.4 version, but there are a couple glitches in the first 5.5 Prerelease. Looking forward to the updates coming in the next week or so.
I tried using this rundown system, but it had too many bugs. Textures randomly disappeared. Animations glitched, showing the wrong frame when taking in a new level. Even the playlist failed and played the wrong thing. Hope this part of Unreal gets fixed ASAP 🙂
UE 5.5 is working much better than the 5.4 version, but the first pre-release still has a couple of glitches in it that should be fixed before the full release.
Could you do a more comprehensive tutorial on MetaHuman lip sync? There are almost none out there.
Sure… are you looking for something on editing animation that already came from performance or audio capture (like this video shows), or "starting from scratch" using only keys & curves on the control board for animating?
@@PixelProf Starting from scratch is more accurate :) I'm somewhat familiar with UE, I do environments but haven't even tried metahumans. What I actually have in mind is creating an avatar or virtual character that would be the face of a youtube channel and do all the talking. I'm thinking of something more realistic than the one above, with more complex facial expressions, head movement and even gestures. I'm not sure how much effort this is. Though with all the AI rage out there, avatars will be on the rise, if they aren't already.
Awesome, how to use transition logic and layers
I tried to export to Blender... the mesh and armature are poles apart... so frustrating.
Erg...I don't use Blender (yet) so not sure what can be done there. Will have to dig into it sometime in the near future.
You saved me. Thank you!!!
Thank you for making such a quick and clear tutorial!!
Amazing video, thank you. Helped me solve this issue in UE5.4 and it worked perfectly!
Awesome. I recall trying something like this way back using Blender phonemes/visemes (mouth shapes)... A, O, E, W/R, T/S, L/N, U/Q, M/B/P, F/V... to aid in lip sync, where each phoneme was associated with specific lip and facial movements. Haven't tried it in years. Waiting to try this out once they release Unreal Engine 5.5. The eye movements (blinking) are what puzzle me.
runtime?
Not yet.
It's really not bad, but the facial expression still looks like someone who took an overdose of Botox.
Now how can I use the eyes? I can't figure it out.
I'll work on a follow-up video that shows how to apply adjustments and otherwise animate the face and eyes in conjunction with the viseme results from this tool, but the basic approach is to use Sequencer to bake this result onto a face control board and layer on additional animation inputs.
I am looking forward to seeing your next videos, thanks so much.
Great simple tutorial, so glad they've implemented this. Gave it a couple of tests and the results are very impressive, particularly when combined with the facial animations from city sample to replace the eye movements/blinking. Will still use the previous animator method for accurate facial, but this is a great quick method.
Glad it helped!
For sure this was my most favorite thing I saw and demoed at Unreal Fest. Thanks for making a video so quickly!
Hah! Thanks Tony!
Thank you for doing this. I've been using Iclone 8's Acculips to Unreal Metahumans via livelink. Combined with iclone's "Digital Soul" and other powerful animation tools this is a game changer.
Yap... then export it via FBX, import to 5.4 and smoke it.. works perfect for now :)
I haven't gotten lucky with this, any advice?
Thank you so much, I have been looking for this tutorial for weeks.
oh man this is awesome - thanks for sharing!
Is it possible to combine this with a performance capture?
Yup. Body motion can be applied independently and other facial performance can be layered on with Unreal’s Sequencer.
Very interesting, thanks. What can you do with the depth input?
Depth input is to process data captured with the LiveLink Face iOS app when it’s set to “MetaHuman Animator” mode.
It is already in the NVIDIA Omniverse app, with some expression faces!
Hoping to try this for real time streamed audio, but looks like it’s not there yet. If anyone has suggestions for that, please let me know 🙏
Yup. Fingers crossed that's a near-future thing, but this function (for now) is recorded audio/post-process only.
Pretty cool, but the lack of any emotion on the rest of the face (because it's only tracking audio and not a facial structure) makes me wonder what this could be used for practically.
Since Unreal has an extremely capable (and continuously improving) layered animation system, the synchronized viseme animation from this process can be readily combined with other motion capture, hand-keyed, or procedural animation sources. For example, a LiveLink Face performance capture can be quite effective for overall face expressions but lacks fidelity around the mouth, so something I'm hoping to experiment with soon is using this to process the audio from a LiveLink capture, and then art-direct the layering of results from both performance capture methods. Also, I imagine this is a step in the direction of realtime audio-to-animation, which could facilitate performance based on realtime voice generation systems (just speculation on my part for now).
@@PixelProf Would love to see that idea of mixing the LiveLink and facial cap, but I definitely see it going in that direction.
@@PixelProf OK, I've done this with the LiveLink Face app. You can move the head in real time to make it more human-like, but the eyes need to be able to blink more. You can run the animation and do the head movement at the same time.
I wanted to use MetaHumans to create digital avatars, but I see tools like HeyGen coming up and I really wonder if MetaHuman is actually worth the time... AI is moving too quickly.
Great tutorial. I'm trying to get additional animation on top of that audio-file-to-animation result. Normally I would bake mocap to the rig and add an additive layer, but with this facial mocap it breaks or reduces the lip sync to very small movements. How would you go about it?
Animate = Lipsync = CLICKBAIT
If it could be Blueprinted and fed a WAV to play back as a stream, it would be perfect.
I agree
Amazing. So quick and easy to follow.... Thanks
Appreciate the effort that went into this video and that Epic is rolling out the capability @PixelProf
This is amazing! Thanks so much for sharing that this is now available. After the last "free" plugin that allowed audio lip sync stopped letting us use it for free and started charging obscene prices, deeming it only for large studios, many of us indie devs were completely derailed in using the MetaHuman lip sync functionality. Epic is amazing for finally making this a part of the engine. No longer will we be stopped by a paywall for our projects. Hooray! I will be sharing your video :)
This is great, thank you.
Are you on Mac or Windows? Somehow I am not able to activate the BMD (8K Pro) on an MBP (M1 Max with expansion) in UE5 (with the BMD plugin enabled). Every other software is working fine with the 8K Pro.
Unreal can only output to Blackmagic Media Capture tools on Windows machines.
@@PixelProf Agreed, this Apple vs. Epic beef is making it very hard for content creators. Hope they find a solution soon.
Is there any way at all to package a project as a standalone .exe file so that the LiveLink-driven MetaHuman avatar (or one imported from Character Creator) still works? A VTuber was able to do this with v4.26 years ago, but it involved really unstable 3rd-party plugins, and most responders to that tutorial could not get it to work, myself included.
Exciting! Yes, my dream is to make a science/history video game based on Unreal Engine 5. Since my computer is a MacBook Pro, I was wondering if I could download Unreal Engine 5 onto it. Video fully viewed.
Thank you so much for figuring this out and sharing a video about it. Do you have a clue how I could make the VR spectator cam (from the UE 5.5 XR template) stream via NDI? As I am missing the media viewport capture (from the Virtual Production tools), I was not able to get it to work. (What I am also missing is a way to choose between HX3 and "uncompressed" NDI streams, or making it work on a Mac or iOS...) Have you figured out a way to make this work?
Really needed this playlist, great stuff, well explained.
Interesting, can't wait to get 5.5.
Hi, I have a question: if my SVG has animation information, how can I play it in the UE5 system?
UE's SVG import doesn't internalize SVG animation... it's really just intended for still logos and shapes.
Hi Nick! Tried importing this into Blender; the FBX came in looking like a monster. Boo! I'll try importing the FBX into MoBu. Fingers crossed!
Did MoBu/Maya work better? For me, these seem to import the files well. I’m not sure of what the “expectations” of Blender’s fbx importer are.
What about corrective joints? They are not animated properly with this solution, unfortunately.
Those should be driven by Blueprints in Unreal rather than animated by an external tool. External tools should just animate the primary joints and avoid changing anything on the leaf joints.
@@PixelProf If you export animation from Maya or MoBu without animated corrective joints, then in UE those joints won't be animated automatically. That is why MH in Maya are built with two constrained skeletons by default, I presume.
@@filipmasiulewicz5036 I'm at Unreal Fest this week, but will check/test when I'm back next week. I think the idea behind the driver skeleton in Maya is that you can get the proper deformations for cloth and other physics interactions there. As long as the MH Animation Blueprints and Post Process blueprints haven't been altered or deactivated, those should be driving the corrective joints and shapes in UE. If something external is overriding them, you could try bringing the animation into Sequencer, then baking to ControlRig, then back to a new Animation Sequence to get the appropriate corrections applied.
Thank you
Wow after I just purchased Offworld Live too
I don't know what NDI media output is, to be excited about it.
Being built in, it’s natively integrated into the Motion Design broadcast graphics Rundown/Playlist system, making it turnkey to connect with NDI-based broadcast and streaming setups without needing to configure render targets or even install any external plugins. To me, that’s awesome.🤓
NDI allows you to send out a key and fill output out over a network card instead of outputting to SDI or HDMI
@@dwovowb interesting
Can't wait to test it!
Have fun!😊
Oh wow. 💯👌❤️ #unrealengine
Alpha channel, working?
Yup… NDI Alpha Channel with Motion Design (an Unreal 5.5 "Sneak Peek") ruclips.net/video/oxuSC_fVEIE/видео.html
Hi Nicholas! I am currently trying to set up a little virtual production studio with an OptiTrack mocap system (Prime x22), a green screen, and a BMD Ultimatte 4K keyer, in combination with a PC workstation with a BMD 8K capture card, a 4090, and a BMD Ursa Mini Pro G2 camera. 1) I connected my timecode and the BMD genlock generator to the eSync box of the mocap system, but I'm not 100% sure how to get genlock and timecode simultaneously into the camera and the capture card of the PC. Do you have any advice for me? 2) Do you recommend two separate timecode generators (ext), one for the eSync box of the mocap system and one for the camera? Regards, Patrick
You can connect additional outputs from the BMD genlock generator to the Ref inputs on the Ursa, Ultimatte, and Decklink. That should cover genlock sync across the systems. The camera only "needs" timecode if you'll be recording on it; if the camera is only used live, you can ignore its timecode. As of UE 5.4, UE should be able to get timecode from the mocap system via LiveLink, so you can use that for UE timecode (along with the Decklink genlock signal for lock). Hope this helps.
@@PixelProf Great! That helps a lot! ...I thought the camera needed both for live streaming to Unreal :) ... thx 🙏