That's amazing. It's true, now there's no doubt about it, Nvidia has done it again.
Wow, I've been trying this for days! In 2022.3.3 some buttons are not the same, e.g. the red button on the left is missing, and there's no Pose Tracker but Pose Estimation instead. I was desperate, because it never worked.
BUT: wait a few seconds after clicking the yellow triangle. Just a few seconds more. And BOOM, it works! Wow! Thank you, sir!
I made a tutorial for the new version.
Very simply explained! Nice!
Is this a sponsored video? Good they got the pose back in finally.
I'm so excited!
Thanks bro, this is helpful. That's something I needed to speed up my work. If you have further advice, please let me know.
Great video bro. I didn't even know Machinima existed. Great tech these people over at Nvidia have, and this is free, son? Wow!
Yet another insanely useful video
Your videos are dope! Do you have one on the Nvidia face mocap?
It looks great. What's the oldest video card you can use with this and can you save the animations for use in other software such as Blender, iClone, C4D and others?
Requires an RTX card. You can export animations from Omniverse to FBX.
As visual tracking with standard and depth cameras gets better and better, I would love to see this coupled with some AI
to handle:
- foot sliding
- odd pose snapping
Basically, I can't see why you couldn't take the tracker output and match it to a database of human motions. Then do a reverse pass where the 3D animation, rendered in 2D, is mapped against the original (and other) 2D footage and iteratively adjusted until it matches.
All of this has open papers with solutions already, so I guess it's just a matter of time.
If this is then coupled with a retargetable rig (HumanIK style), with built-in balance and physics stabilization and adjustment, animation layer support, and (body) weight physics/mechanics, it will be great.
If the AI can also identify typical motions from the footage and mark those as sub-animations, or even better, split all of those into NLA tracks, then we can slide some movements back and forth, make them stronger or weaker, more stylised.
Finally, add command-driven adjustments, like "lean forward until time xxxx, then straighten up with a wider stride length until xxx".
Then we are getting a true interactive, physics- and behavior-driven animation system. This, together with tracking and normal pose-correction keying for total control, gives us an animation tool that will let novices produce good animations.
And it will let really good animators work so much faster, and make developing that motion understanding so much easier.
Salute for this comment
@@KindTom1 Glad you could understand it with all the spelling and construction errors all over the sentences :)
The best post-tweaking solutions I've seen are iClone 8 and Cascadeur. Really good at removing foot slide, re-tweaking, and trying to correct shaky joints.
Still not a perfect solution, but works well.
@@armondtanz I have played a bit with those, but for me the best mocap/human animation software is still MotionBuilder. Tbh I haven't done serious work in like 10 years, so maybe I'm just nostalgic.
@@litjellyfish Ah yes, I've seen a lot on MotionBuilder. I do potato-like animation, so I'm not very good at it. I have to rely on slight tweaks.
I'm liking iClone 8. It's helped more than I could imagine. Very useful program.
Unbelievable!
Awesome video, and I thought it was hilarious that those windows kept refusing to resize hahaha
I just LOVE YOU, YOU SAVED MY LIFE
This is all I needed ✨🔥 But I'm just disappointed I won't be able to join the contest because my country isn't listed on the NVIDIA website.
My version of Omniverse shows these annoying Up and Forward labels for the character. I don't know how to get rid of them.
Amazing, like all your videos! I have one question: can we export the animation to Unreal Engine to use there with a MetaHuman?
Not directly; you will probably have to do a round trip.
Can the animation .usd be used with Unreal Engine and MetaHuman? Are fingers tracked?
Is this possible with metahumans out of the box or does it require a specific rigging/bone system or anything of the sorts? Cheers for all the content!
You can't use MetaHumans inside Omniverse; Epic's license rules don't allow it.
@@Jsfilmz What? lol
I don't think Omniverse is challenging Unreal; rather, it's complementary to it. Basically, movie creators find Unreal somewhat intimidating regarding file import from various sources and retargeting. Except for Reallusion (through Auto Setup) and Autodesk Maya (through Datasmith), everybody feels annoyed with the file import and material handling in Unreal, which is fairly easy in Omniverse since it accepts USD so well. Blender has its own problems exporting USD the right way. In my opinion, Omniverse is far easier to learn and handle, and it has all the components of movie making (Audio2Face, Machinima, the pose tracker) through path tracing in a fraction of the time compared to Blender. The only flip side is you have to have an RTX card, which is now far cheaper than it was a few months back.
You'll also need a beefy CPU. My i5 can't run it.
@@Ecceptor It took my i7-11700K almost 5 minutes to start up Machinima 2022.1. If you have anything older than a 7th-gen Intel machine, you need to upgrade. And no, a high-end 4th-gen Xeon chip won't do it. I tried it. It's the instruction sets that are missing in the older chips.
@@furynotes I see. I was excited to try Omniverse Create, then it ran at 10 fps on my PC.
@@Ecceptor In real time, or path traced?
OMG... you get a subscribe today, bro... this is an amazing video. This to Unreal using a MetaHuman should be the next video.
The countries that can enter the Machinima contest are limited; I can't get in.
Hello, I'm going to buy a graphics card for UE5. Do you recommend a Founders Edition or a regular TUF?
Yeah, can't go wrong with the FE.
This is excellent
Can this be done for birds and animals too?
I would like to retarget a pigeon to a flying dragon 😂
When I start Machinima it shows "RTX loading". I have a GTX 1650 and I want to run Machinima with it. Please, please help me out.
Yeah bro, I'm not sure if your GPU is supported; I think it has to be RTX.
Hey bro, is there Live Link Face in Machinima or Create?
How does MetaHuman fare in Omniverse?
Is it compatible with UE4 mannequin skeletons?
Please make a video on the retargeting process if it's not compatible with UE skeletons.
Hi Jae. I'm trying to export the mocap data I generated with Machinima's Pose Tracker to Blender and retarget it to a rig to refine the animation (similar to what I would do with Mixamo mocap animations). I think right now the only tutorial video on this is yours. The file I got from following your tutorial is a USD, and when I try importing it into Blender or Maya nothing comes of it (no animated armature/rig). I don't know if I'm doing something wrong; I'm following your video instructions.
Same problem for me. Did you solve it?
First export it as USD, then drag and drop that USD file into the sequencer and check whether your character is animated; after that, export it as FBX.
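When the import comes up empty like this, it can help to first confirm the exported file actually contains skeletal animation before blaming the Blender/Maya importer. A minimal sketch, assuming the export is a text-format .usda file (the sample snippet and helper name here are hypothetical, not from the Pose Tracker itself): in USD, joint motion lives in SkelAnimation prims, so a Skeleton prim without any SkelAnimation means the rig exported but the animation did not.

```python
# Sanity check for a text-format (.usda) export: does it contain any
# UsdSkel animation data? Binary .usd/.usdc files can't be scanned as
# plain text and would need the pxr (OpenUSD) Python API instead.

def has_skel_animation(usda_text: str) -> bool:
    # "SkelAnimation" prims carry the joint transforms over time;
    # a "Skeleton" prim alone is just the static rig.
    return "def SkelAnimation" in usda_text

# Hypothetical snippet shaped like a skeletal export:
sample = '''
def Skeleton "Rig" {}
def SkelAnimation "MocapAnim" {
    uniform token[] joints = ["root", "hips"]
}
'''
print(has_skel_animation(sample))                    # animation present
print(has_skel_animation('def Skeleton "Rig" {}'))   # rig only, no anim
```

If the check comes back empty, the problem is on the export side in Machinima, not in the DCC import settings.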
Hi, did you manage to feed a live stream into Pose Tracker? I spent a couple of hours today trying to feed a BM capture card into Machinima with no positive result. I think it hasn't been developed yet.
The character seems to float on the floor. Is there a better way to make the skeleton move more realistically?
Thank you. If I stream live, which camera must I use? A webcam? Can you make a new video on this, and which camera can I use? A webcam or a specific camera? Thank you in advance.
How about if the video has two people in it, such as two people dancing tango together? Can it track two people, and apply those poses to both of the 3D models?
How do you fix foot sliding?
Is just one front-view camera OK?
yes
Why can't people from Italy enter the contest?
They probably have strict gambling rules.
Thanks. Is it possible to bring over a character from Maya, including a control rig for body and face created in Maya, and use it with this tool? And is XGen hair supported, including proper hair shading like with the Arnold hair shader? Thank you.
I don't have Maya right now, but Omniverse has a Maya connector, so I'm guessing yes.
Much better than wrnch
Agreed lol, I didn't even try wrnch hahaha.
Can you do this from the side view?
Wait, I could just turn the camera sideways afterwards…
Hey J, is there a control rig available in Machinima to clean up the animation afterwards?
No, I'd take it to iClone.
Do these have rig controllers?
Whatttt, any type of video? Can I screen record, say, Fortnite and use that for my video? I wonder if it works with flying videos.
All you can do is try.
@@Jsfilmz Nice tutorial, ty.
@@Rilict You're welcome.
Hey! Can this be retargeted to a UE5 or MetaHuman mannequin? Are there any restrictions on importing videos? I'm also wondering if this can be exported to Maya for retargeting. Thank you!
I don't have Maya right now, but Omniverse has a Maya connector, so I'm guessing yes.
@@Jsfilmz sounds good - thank you!
There is no Pose Tracker option under the Animation tab. I installed Machinima 2022.3.0. How do I enable this option?
It's now called "Pose Estimation"
I'm second... :)
Does this only work with RTX, or can it be used with a GTX?
Check the minimum specs; RTX, I believe.
OK, thank you very much 😃
So how can I export this to an FBX file?
How do you apply Machinima animation to a MetaHuman?
Hey, how do you convert the USD file we get into video?
Please answer.
Wish this had multicam support. As is, it's super janky.
what do u mean?
@@Jsfilmz Record using multiple synced cameras at different angles (front, sides, back, etc.) to give a better understanding of where the body is and to avoid obstructions.
Is there no way to export the animation in FBX or BVH format?
Yes, you can.
@@Jsfilmz Can you please make a tutorial? 🙏🏾🙏🏾
Now all I need is an RTX card 😢
R34 ?
I followed your video step by step but I can't get Sol to animate after dragging Sol and the tracking .usd to the sequencer. I'm on Omniverse Machinima 2022.2. 🥲
I installed a much lower version, 2022.1.3, and it works now. :D
The program is still indeed in beta. 😅