Do you think AI motion capture (move.ai) produces better quality than Xsens yet?
Can you do a comparison video?
@@soulstarcomics Hard to say... I haven't tried the Xsens yet, but the results are working for me (with cinemotion added and the cameras placed in the right spots) :D
@@soulstarcomics def planning to do one. So far I haven't gotten good results from AI yet, not to mention the capturing experience.
Yes, I have tried Move AI and it gave me better results than a Perception Neuron Studio suit ($8k). The main limitation is the room size. If you have a large room, then it is a very good and affordable solution.
Just a nice comment to push the algorithm! Thanks for making the video :) Very funny intro!! hahah
👍
Love your transparency and willingness to test things for us who are planning on investing in our own setups, man! Priceless resource.
and yes to testing Plask/Move, as well as Quest 2 tracking! Would love to see how much the Quest can do in comparison to an Xsens suit + gloves, and what combinations you would pair together (maybe even at different budget thresholds) to achieve a complete mocap setup.
Great video Han! Good to see someone reviewing these with the highest quality benchmark set for their work. Most people I see are somehow doing a ton of cleanup after paying $6k for tech like this, and I feel the same way as you. If you are paying that much, the tech needs to deliver a lot more. Looking forward to the AI solutions. Hopefully they are cheaper and accurate.
I've been on a similar journey from Xsens + Manus Prime 1, then Prime 2, and ended up selling all of that and replacing it with a small OptiTrack setup + Quantum gloves. Having the reliability and volumetric accuracy of the optical system means 'high fives' are possible and you'd have a much better grip on props etc. - see what Matt Workman does with his Vicon setup. I've had other thoughts about putting tracking points on the fingertips and using those as a post-capture corrective IK weight to help keep the gloves spatially precise... I'll see how that goes!
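(For anyone curious how that fingertip-marker correction could work, here's a minimal hypothetical sketch in Python - the data layout and the confidence-weighted blend are my own illustration, not OptiTrack's or Manus's API. The idea: pull the glove-solved fingertip toward the optical marker whenever the marker is visible, and use the result as a per-finger IK target in your retargeting tool.)

```python
# Hypothetical sketch: blend glove-solved fingertip positions toward optical
# marker positions (weighted by marker confidence) to build per-finger IK targets.
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def lerp(a: Vec3, b: Vec3, t: float) -> Vec3:
    # Linear blend between two positions, t in [0, 1].
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

@dataclass
class FingertipFrame:
    glove_pos: Vec3                 # fingertip position from the glove solve
    marker_pos: Optional[Vec3]      # optical marker, None if occluded this frame
    marker_confidence: float = 1.0  # 0..1, e.g. derived from residual/visibility

def corrected_fingertip(frame: FingertipFrame, max_weight: float = 0.8) -> Vec3:
    """IK target: the glove position, pulled toward the marker when it's visible."""
    if frame.marker_pos is None:
        return frame.glove_pos  # fall back to glove data during occlusion
    w = max_weight * max(0.0, min(1.0, frame.marker_confidence))
    return lerp(frame.glove_pos, frame.marker_pos, w)

# Example: the marker says the fingertip is 2 cm further forward than the glove thinks.
frame = FingertipFrame(glove_pos=(0.0, 0.0, 0.0), marker_pos=(0.0, 0.02, 0.0))
print(corrected_fingertip(frame))  # ~(0.0, 0.016, 0.0) with the default 0.8 weight
```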
Han - Thanks so much for testing this gear for us. This is tremendously helpful for us content creators.
~4:56 I can't tell if the one on the right is real footage or in-engine if I don't pay attention to the fingers! Not only is the mocap data so clean, but the absolutely amazing scene you've set up is on a whole other level.
Would the hand tracking of a Quest 3 be able to help with capturing/animating hand data?
I really need to get my own mocap suit. Also crazy that CoD keyframed that whole anim sequence. If we didn't have mocap, idk how we would've finished all those cinematics in time haha
I noticed at 5:00 you can previz the gun's mechanism motion without any tracking module - how did you achieve that?
Thanks for your perspective. The end-user experience matters, since a high cost doesn't necessarily produce all the value needed from a production standpoint.
Thanks for the very comprehensive review and honest opinion. I'm really curious about your opinion on Move AI.
Would love to see your tutorials as you do it all in UE etc…
If you ever do courses, I would gladly pay for them!
your videos just amaze me!
Great content Han! Really well made video.
Please, which toy gun or prop were you using for the motion capture at 4:00? I need one for my upcoming short film 📽️
it's an Airsoft blowback Glock
Hello, thank you for your video! But doesn't Xsens HD reprocessing automatically solve the mocap problems of the Xsens gloves?
awesome info, really appreciate it.
TY for making this tutorial really 🙏🏽 I love your content 👏🏼 Hope I can learn more in the next video, TY again
any updates on ur search for the best gloves or mocap suit? love the honesty of the vid, so really curious if ur opinion has changed
Could you tell me the name of the camera you use on your head while capturing the hand animation? I'd like to read more about this stuff and get into animating, so this is kinda important for me.
Han, I remember you used to have a course but I can't seem to find it. Is it still around?
Which motion capture solution do you currently prefer?
Hi, really interested in this content. So how do you manage hands now, given that you concluded it might not be worth purchasing any of the available gloves as of now if one holds high standards for the animation? What's your workflow for the hand part of the animation? Would really appreciate your response!
Fantastic review and overall conclusion. I also have the Xsens body and Manus 1 gloves system and am creating cinematic pre-vis and short film projects. I'm finding that the body motion capture with the Xsens suit is really very good and up to standard, but the finger and face capture (using various different solutions) always falls short, and you end up having to adjust and adapt the way you tell the story and shoot the scenes due to the weaknesses in these elements of the mocap pipeline. I'm hopeful that the new Faceware Portal is going to improve face capture, and I'm still looking for a good finger capture solution.
@@grossimatte I have the MVN Link suit and I can't recommend it enough. Compared to others it has pretty much no drifting, and the HD processing after capture does all the clean-up for you. The body motion always looks natural - I've always found it to be high quality whether it's quick movements or slow and subtle ones.
@@grossimatte I've not, no. I needed the more robust suit as I had to mocap a set of professional athletes for a project, and I figured the Awinda system wouldn't hold up when fast running or jumping was involved. I ended up buying 2 more suits on their own so I can swap out sensors and cover any body size from junior through to large adults.
@@grossimatte I work with the Awinda version. Honestly, you will end up using the Awinda rather than the other one, just because you're free to put the trackers on anyone with measurements, whereas the other solution is not as comfortable for the actor and not as flexible. They don't have the same refresh rate, 60 vs 120. It's mocap - you will still need to clean some stuff whatever suit you're using. That's what makes an AAA product.
Curious, what didn't you like about StretchSense? For me it's the most accurate capture solution when working with props so far.
This is a great video and I'm always interested in other mocap solutions. Thank you also for your honesty and your integrity. I really enjoy all your content and you are a wonderful indie director. Hope you are working on a new production :)
This is very entertaining!
Thanks for this - you returned the StretchSense gloves? Not good? Any pointers there?
Also, Move AI for hand motion? Pretty cool. Did you manage to try it yet?
StretchSense is too pricey for me for what it offers. It's a pose-based system, meaning you are essentially blending between poses while capturing, which increases stability and smoothness but compromises precision. As for Move AI, I doubt it will be accurate enough for fingers to reach a "no clean-up needed" level.
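(As a rough illustration of what "pose-based" means here - this is a hypothetical sketch, not StretchSense's actual solver - the output hand is a weighted blend of a small library of calibrated poses, which is inherently smooth and stable but can only approximate anything outside that library.)

```python
# Hypothetical illustration of a pose-based hand solver: the output is a
# weighted average of calibrated joint-angle poses, never a free per-joint fit.
import numpy as np

# Calibration poses: rows = poses (open, fist, pinch, ...), cols = joint angles.
pose_library = np.array([
    [0.0, 0.0, 0.0, 0.0],    # open hand
    [1.4, 1.5, 1.5, 1.3],    # fist
    [1.2, 0.2, 0.1, 0.0],    # index pinch
])

def solve_hand(pose_weights: np.ndarray) -> np.ndarray:
    """Blend calibrated poses. The weights come from the sensor model each frame."""
    w = np.clip(pose_weights, 0.0, None)
    w = w / (w.sum() + 1e-8)          # normalise -> smooth, stable output
    return w @ pose_library           # joint angles = convex combination of poses

# A half-closed fist reads smoothly...
print(solve_hand(np.array([0.5, 0.5, 0.0])))
# ...but any pose outside the span of the calibrated library (e.g. one finger
# hyper-extended) can only be approximated - that is the precision trade-off.
```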
@@Unrealhan very interesting! Thank you for that info. Have you had a chance to try their new gloves? I’m currently in talks with a sales rep in LA to test a pair. Not sure where you are but I can link you guys
@@AnnisNaeemOfficial StretchSense? Yeah, we bought the Fidelity Pro and were disappointed. You can check my previous short where I test them.
@@Unrealhan ok that’s really great to know! Thank you so much!
Hey, nice comprehensive ... what about Perception Neuron? PN Studio Pro ...
Haven't tried Perception Neuron tho, saw them at SIGGRAPH, looks alright.
I went all in on the Awinda, the Metagloves, and the Standard Deviation Helmet for iPhone facial capture. I am super impressed by Xsens! I agree with you on the gloves: good, accurate, but SUPER fragile and choppy if you obstruct the view. The facial mocap, however, is abysmal! Do you have any advice? I am using stylized cartoony characters and nothing comes even close to the range of expression that I need, not to mention that lip sync looks like garbage. I tried Faceware - meh. AccuLips in iClone...better, but it restricts the face...
Face is the trickiest. I'm planning a video about that as well. The short version is that for us indies the iPhone is the best bet, unfortunately. All other solutions are not budget friendly.
Faceware's new Portal looks better but I'm not 100% on board yet. It looked like another Facegood version. I want to talk about it more in-depth in a video too.
@@Unrealhan Portal looks like a couple of steps above the rest. I spoke with a rep at Faceware; they said that Portal is meant for studios and starts at $60k/year! I keep thinking that the face is where I need to sacrifice fast on the altar of good and actually keyframe it... just not there yet...
I'm talking to Faceware about Portal now and it seems their pricing is way up in the clouds: $60,000 for 12 hours, and that's the minimum. I used the old Faceware with Retargeter and Analyzer and it was not super easy to work with, nor did it have proper batch tools (more like Python scripts that didn't work out of the box).
Nvidia is doing some cool facial stuff that's worth checking out; it's based off audio curves, but the data looks better than FaceFX at least.
Hello man! What is your workflow for simulating the cloth? I'm having a hard time deciding what to learn to get the best quality! Thank you!
Great video, Han! Have you tried Perception Neuron? I would be very curious to see how they perform against Xsens and the other competitors. Cheers!
No I haven't yet. I saw them at SIGGRAPH last year - not bad looking, but they don't seem to offer anything new. Would love to try them out tho.
more animation content please!
Hi Mr. Han, can Unreal Engine 5 together with an RTX 4090 make something as good as Avatar 2?
Great video!!
Hello! My group and I are currently working on a “Star Wars Revan Malachor V” fan film and while looking around I stumbled upon your epic video. I was thinking some of the stuff you did here could be applied really well to our fan film. Would you be open to talking on discord or somewhere about working in some minor or major capacity on this project?
Please do review Move AI!!!
the man!
Hello sir, can you tell me what software you used for your first CGI scene?
Rendered in UE5
Great info! They definitely seem extremely expensive for what they are. Makes sense to invest more in the suits. I spotted you have both the Link and Awinda Xsens kits - can I ask, do you notice a difference in quality between them? I'm considering which one to purchase...
Yes, Link def works better, especially the range. I would go for Link if budget allows.
I use the Link at work and the Awinda at home, and they both have pros and cons. Link pros: faster to strap on, longer range, higher sample rate, on-body recording. Cons: expensive, especially if you need many sizes (1 suit = €1000), and more things to keep on your actor (cables, battery pack & transmitter).
Awinda pros: cheaper, includes 3 shirt sizes in the price (+ any extra size is €100), works better across multiple actor sizes for less money, less stuff to carry on you (and no cables), actors tell me it's more convenient to wear compared to the cabled Link, nicer for falling on your back in stunts since you don't carry a lot of stuff above your pelvis, and cheaper to replace parts since each sensor is separate (the Link sensors are in cabled chains).
Cons: each sensor has to be charged individually + battery life is shorter, the range is less and on-body recording is very short when losing connection, and a worse refresh rate (though I find 60 Hz works about as well as 240 Hz since you'll go down to 24-30 fps anyway - see the resampling sketch after this comment).
If you're on a tight budget I think the Awinda is the way to go. If you need range/on-body recording, the Link will be the way to go.
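(A quick sketch of that refresh-rate point, assuming evenly spaced samples - the helper below is hypothetical, nothing vendor-specific: resampling a 60 Hz channel down to a 24 fps delivery still leaves more than two source samples per output frame, which is why the Awinda's lower rate rarely shows up in the final animation.)

```python
# Hypothetical sketch: resample a 60 Hz joint-angle channel to a 24 fps delivery.
import numpy as np

def resample(values: np.ndarray, src_rate: float, dst_rate: float) -> np.ndarray:
    """Linear resampling of one mocap channel from src_rate to dst_rate."""
    duration = (len(values) - 1) / src_rate
    src_t = np.arange(len(values)) / src_rate
    dst_t = np.arange(0.0, duration + 1e-9, 1.0 / dst_rate)
    return np.interp(dst_t, src_t, values)

# 2 seconds of a 60 Hz elbow curve -> 24 fps: still plenty of samples per output frame.
elbow_60hz = np.sin(np.linspace(0, 2 * np.pi, 121))
elbow_24fps = resample(elbow_60hz, 60.0, 24.0)
print(len(elbow_60hz), "->", len(elbow_24fps))   # 121 -> 49 frames
```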
Thanks for making the vid. Here's my story: I bought 2 pairs of StretchSense. For slow, regular movement I found them OK, but when dancing - yikes! Crazy jitter, especially on sudden stops. For fast, accurate, complex movement they're just not there - inaccurate and drifting. Also, the MotionBuilder integration is an absolute nightmare, mostly because there is no way to change the start frame to zero, so the starting frame is set to the dawn of man. OK, not that long in the past, but literally in the millions of frames. Oh - and the annual subscription: just got a bill, $2,980 USD. (Anybody want a pair of inexpensive StretchSense gloves?!) One positive is they have great customer service. The Rokoko gloves actually look better in online demos - for 5 times less money.
On the Xsens body suit - when I first got mine it was pretty good, but after their last update we couldn't even get a calibration going (the one where they ask you to walk in a circle). Not sure if my sensors are old or if the new update was crappy. I've never had good range of more than a few steps. In one capture last year we actually walked near the talent with the Awinda Station. Anyway, wasted $5k CAD on a license that we used for six weeks and that was a constant fight to work with. Not happy.
Try walking in a figure eight; that solved a lot of my issues. I think Xsens calibrates best when you get all directions, and a circle will only get one side (I had a lot of issues walking in a circle).
If you get arm issues you could try a figure-eight calibration while holding the hands together (like a Lucia) after the N-pose; that also fixes some issues that can happen.
@@jingjingmannen Thank you for the suggestion. We can give that a try, but I think our problems are deeper. We have terrible WiFi issues; it doesn't really matter which channel we go on. Tried it in a totally clean environment - just no range. I think it is a hardware problem.
@@blairzettl3933 That sucks. And it's the Awinda you use? Did you try putting the receiver higher up (table/shelf)? Do you have the station or the USB stick as a receiver? I got the middle version of the Awinda and I could even record a person walking up like 5 floors and about 30 meters away indoors without problems.
@@jingjingmannen We have two of the station-type Awinda systems. I reached out to support about the signal issue and discussed it while still under warranty - no solution was ever forthcoming. We tried everything short of a ceiling mount.
Great video! Please do make videos about an AI workflow! I also tested a couple of gloves and got similar results: just not great. But AI might definitely be a solution. So yes, please do make videos about an AI workflow, thanks :)
Is your name Han?
Are you Korean?
StretchSense gloves are much better. The Manus gloves have way fewer sensors and have tension that doesn't let your hand rest naturally.
Haven't tried the new Manus gloves, but the old ones didn't impress much. Very jittery and they don't capture the fingertips well (like if you push your finger down on a table). Also heard they are prone to breaking, so for the price it can't be justified when you can get an Xsens Awinda for the same price as a pair of gloves.
🔥🔥🔥
I also thought that those gloves would really push Manus a step ahead of StretchSense... but oh boy, for $6k I expected much more than this.
Since I had multiple issues with the old gloves (they broke down after a few weeks), I'm super worried about the very flimsy cables from the fingertips to the hand, since I suppose that if they get caught on something they will rip out the electronics, but I hope I'm wrong.
Thanks for the video, I think you can invest that $6k in something else :D
MetaGlove should be ashamed. As a part-time hobbyist robotics engineer, I'm almost encouraged to design my own mocap gloves after seeing these in action, just to prove that it can be done for much less money and deliver sub-5mm precision, especially since the output requirements aren't even real-time, as they would be with finger tracking in VR.
$6000? That's a deal breaker if it doesn't work properly.
We also have these gloves and the precision is really bad; we have to rework all finger animations and often just delete the crappy data and do everything manually. Waste of money.