Which mocap are you using?
Perception Neuron, but I am going to give Deepmotion a shot.
@@MR3DDev Curious to know how they compare to Xsens or Rokoko.
@@Unrealhan It's an easier setup than Xsens (also no subscription fee), but the hand capture is not good. Rokoko was pretty janky, but I tested it in 2018, so it may be different now.
Awesome look at this! I was tempted to look into move, but I think I will stick with my Unreal Engine Vive Tracker solution for the time being.
Rokoko, but moving to Xsens soon.
Another issue I have is your testing method. You are wearing baggy clothing without much contrast, which will also reduce the tracker's accuracy. I use Move AI in a well-lit studio space with white walls, with the talent each wearing body-snug, brightly colored suits, and I haven't had any real issues. I think too many people treat this as a casual method for capturing mocap data: you need to treat the studio space like you would a $50,000 IR tracking system, not like you would an inertial-suit workflow.
Spot on. Exactly. It's a camera-based design. Also, Xsens requires a subscription too from what I can see, and if you want unlimited processing minutes it's $250 a month. Pretty crazy. Move AI also lets you use iPhone 8s as cheaper cameras, or you can use experimental mode with different cameras entirely. Someone in the Move AI Discord had $25 cameras that fit Move AI's specifications and got great results with four of them set up.
@@FinalGrade I'm going to give this a go with some cheap-ish 1080p 60fps action cams.
I typed Move AI with a period and YouTube auto-deleted my comment. Fantastic. Well, let's try again. First off, thank you for this video and the information! I have to disagree, however, on the following points:
1. You can spend way less on iPhone 8s, or use different cameras entirely. In experimental mode you can use any camera that meets the specs; one user got it working with $25 cameras. Move AI even started out with GoPros before iPhones.
2. The 30-minute cap seems bad until you realize that Xsens does the same thing unless you pay $250 a month for unlimited processing. A MONTH! And that's after buying the suit.
3. Want to add another person with Xsens? Now you need another expensive suit, or you have to record separately and hope the animation syncs.
4. Suiting up every single time with multiple actors can be a headache.
I still think the price bracket Xsens asks for is not comparable to Move AI; it is still much cheaper to go with Move AI (rough math sketched below). Unless there is a free way to process unlimited data with Xsens, I don't see how it is better when it has the same subscription issue and an even higher barrier to entry.
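For anyone weighing that argument, here is a rough first-year cost sketch. Every figure is taken from this comment thread (the $12,000 MVN Link suit and $250/month unlimited processing, Move AI's $365/year plan, and the ~$200 refurbished iPhone 11s and six-camera setup mentioned elsewhere in the comments), so treat them as assumptions rather than official pricing:

```python
# Rough first-year cost comparison. All prices are figures quoted in this
# comment thread, not official price lists; adjust them to whatever you find.

XSENS_SUIT = 12_000               # MVN Link price mentioned in the thread
XSENS_UNLIMITED_YEAR = 250 * 12   # $250/month for unlimited processing minutes

MOVEAI_PLAN_YEAR = 365            # the "$1 a day" multi-camera plan (30 min/month)
USED_IPHONE_11 = 200              # rough refurbished iPhone 11 price from the comments
NUM_CAMERAS = 6                   # camera count used in the video's setup

xsens_year_one = XSENS_SUIT + XSENS_UNLIMITED_YEAR
moveai_year_one = MOVEAI_PLAN_YEAR + USED_IPHONE_11 * NUM_CAMERAS

print(f"Xsens year one:   ${xsens_year_one:,}")    # -> $15,000
print(f"Move AI year one: ${moveai_year_one:,}")   # -> $1,565
```

Even with pricier phones or a bigger plan, the gap stays several times over under these assumptions, which is the point being made above.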
This was so well filmed, edited, narrated, and researched. Subscribed. Keep up the amazing work, you'll definitely take off, buddy!
Great video, Han. Really interesting to see a comparison of the 2 systems.
One of the keys to getting the most out of any mocap system is knowing exactly what it can and can’t do so you can plan round it, so this is really useful.
I always find the claim “you don’t need a suit” a little cheeky. While you don’t need to wear a Lycra suit, you do need clothing that contrasts with the surroundings and the different parts of your body to improve tracking.
Agreed. The tricky part is that it's AI, so we're all just guessing what will work best, including clothing.
@@Unrealhan yeah, it’s a shame the software can’t do a quick preview to give you some idea if it’s worked or not, a bit like raw marker/sensor data in other software. It’s the not knowing if it’s worked that’s a bit of a deal breaker for some.
It's funny that people complained about the pricing of this mocap back then (from what I understand, you're talking about the multi-cam option).
NOW it's not $365 a year but $5,000 a year,
and not 30 minutes a month but 20 minutes a month.
And the single-camera option is also still more expensive than that full multi-camera plan was.
I just paid $20 for one month of access with 3 minutes of processing time (see the per-minute math below).
I was so hyped back then to try it out, but I didn't have access to an iPhone... I finally got to borrow one or two, but now I'm screwed again, this time by the pricing policy :(
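To make that pricing shift concrete, here is the per-minute arithmetic using only the numbers quoted in this comment (these are the commenter's figures, not verified current Move AI pricing):

```python
# Cost per processing minute under the plans described in the comment above.
# Figures come from that comment, not from an official price list.

old_multi_cam = 365 / (30 * 12)    # old plan: $365/year, 30 min/month   -> ~$1.01/min
new_multi_cam = 5_000 / (20 * 12)  # new plan: $5,000/year, 20 min/month -> ~$20.83/min
single_cam_month = 20 / 3          # $20 for one month with 3 minutes    -> ~$6.67/min

for label, cost in [("old multi-cam", old_multi_cam),
                    ("new multi-cam", new_multi_cam),
                    ("single camera", single_cam_month)]:
    print(f"{label}: ${cost:.2f} per processing minute")
```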
0:32 It would be so cool if you could give us some tips on filming two characters fighting while having only one motion capture suit
Good point. Will put it on the list!
@@Unrealhan I second this! Would love to see your workflow on that!
Thank you for the fair comparison; it really shows the strong and weak points of both systems. This information will for sure help those checking out different options, and I would surely recommend they watch this video to get a good overview.
$4,000 for 6 iPhones? Why? You can get used iPhone 11s for like $200 a pop. Maybe $1,200 on the high end.
~$50 webcams also work well; you don't need to spend $200 on an iPhone.
This is awesome, thank you for the direct comparison!
Really nice comparison. Let me just add the fact that mostly... everyone has a phone capable of doing this. We tried it in our studio, and if you combine someone's phone with a tablet, you reach the number of devices needed for the capture pretty fast. In our test I just borrowed someone's iPhone, added my own, my friend's phone, and finally my iPad. We ended up with four devices at a total cost of... $0, which then cuts the cost down to "only" the tracking fees.
And on the accuracy and everything else in this video: thanks for comparing the exact same motion with two different techniques!
It's still super impressive for a tech that can be used by anyone!
Thanks for the comparison and a very fair review. It was certainly eye-opening when I got into mocap via the Xsens Awinda a few years ago. Although there's a single term for this ("mocap"), there are actually so many subtle differences that one ought to understand; some you spoke of, such as the volume, where inertial suits can run loose in the wild versus inside a capture volume. Some others I had to learn the hard way: Do you want live-streamed mocap or just offline recordings? Do you want really flawless mocap done live, or offline? Two or more people touching, with great registration between them? Props? Camera tracking? Visibility of the mocap suit to a camera that may be filming?
So many of these questions I didn't know I needed to ask but had to discover along the way. We've since gone with OptiTrack for great live/offline recording, multi-person registration, props, and camera tracking, although recently we were asked to record mocap of people wearing high-fashion clothing, so the Xsens came out of the cupboard because we could hide the sensors!
It's a wonderful thing but gee, was it expensive learning all these points!
Thank you for helping the community understand more about the topic!
An update would be welcome. There is now a monthly payment option and an experimental mode, which works with all sorts of footage from any camera.
Great summary and initial review - looking forward to hearing more about this in the future.
Han, I'm just addicted to your videos mate, I NEED A FULL COURSE OF YOURS SOON! Any plans on doing one? Maybe a filmmaking master course or something? Cheers, and thank you as always for the amazing content!
It's true that it's too expensive, but for amateur creators like me there is no alternative other than Move AI. I wish there were a monthly subscription plan.
It’s nice to see your channel going so well!
Very glad that somebody finally has a more critical view of that software solution. After two months of using it, I have to say they are definitely headed in the right direction, but the execution is far from perfect. The most frustrating thing for me is the restrictions of the app: 1. You can't upload individual files, only all or none. 2. They have deactivated the function to share the raw recorded videos from the iPhones to other devices. 3. That wouldn't be much of a problem, but unfortunately the app is prone to crashing when uploading a lot of takes (which is normal on long production days), and the older the iPhone, the worse the problem gets. So their USP of cost-efficient production is kind of invalid; ultimately you end up buying or renting expensive iPhone models.
Very good point. The user experience definitely leaves more to be desired. The app is literally just doing the recording; it should let us access the raw files.
@@Unrealhan You CAN ask support for individual access to this function. Though that doesn't solve the problem of the app constantly crashing on uploads, especially on older models. It has something to do with bad memory management, and the devs know about this problem. Yet they are still trying to push the narrative that you only need iPhone 8s in order to do unlimited mocap.
@@ArthurBaum We're working really hard on fixing all these problems, as you know. We are only three weeks in from launch, so it should all be fixed and these gremlins solved in the next couple of weeks. We also appreciate your feedback on the functionality while reviewing this.
@@ArthurBaum You can actually go cheaper than the iPhone 8. A guy in the Discord set up $25 cameras that fit within spec and ran four of them with great results. I don't know much about the connectivity issues you're having, but it seems like a pretty fantastic alternative to a $12,000 mocap suit and a $250-a-month motioncloud subscription.
Han thanks for the objective review.
We'd welcome your thoughts on what a more viable free trial period might look like alongside the monthly minute allocation. Also, if you want to do more tests, please do reach out and we can facilitate more credits for you to continue to explore. Totally hear you on the connectivity issues; two weeks out from launch we are still ironing out some gremlins in the system and making it much more robust.
Just to confirm, you are testing this against the MVN Link, which costs $12,000, plus software that costs $5,000-10,000 per year? If so, it's a thrill to see the data come out so closely comparable. Sorry you had to shoot in the rain, but loads of people are shooting in much tighter spaces indoors using the wide-angle lens on the iPhone 11 and up. The hardware for a system like this can run as little as $1,300 if you are buying refurbished iPhone 11s.
Yes, the benchmark is MVN Link; not a cheap suit for sure, and I understand not everyone has access to a system like that. Move AI does present an alternative for sure.
Good to know there's room for extra minutes for testing, etc. I think the trial period should be 10 minutes (I think it's 5 for beta users?). That should give people enough time to troubleshoot the equipment and run some actual tests in different settings and environments. They have probably already purchased the iPhones, so most likely they will commit.
In terms of the monthly allocation, personally I would prefer a monthly subscription instead of annual. The reason is I only use mocap for the first half of a project (shooting and previs), not every month. So the right quota may vary by use case, but I think a flat 30 minutes/month is not very flexible. I'm curious to know what the community thinks too, as it varies case by case.
I'd love to hear what your roadmap is in terms of data quality too; AI develops so fast, you guys are probably already working on some cool stuff 😉
If you can decouple the processing from the export, that would open up some options: allow users to test footage, recording conditions, variables, and workflow, but limit how much they can actually export.
@@PeteD Appreciate your feedback, thank you.
Fantastic elucidation of where these tools are at, thank you Han! 😎🥳🏆
Good review, super clear and honest.
Needed that review, Thank you!!
Nice video! I was thinking of doing this kind of video plus optical, but I can't seem to get to it. Thanks for including the floor rolling and hands-on-a-surface tests; great to see how it does in these situations.
30 minutes a month? Geez, I recorded hours of mocap data for my 10-minute short film. These monetization methods are not good.
It could be cheaper to rig and animate it yourself 🙂 or... have a look at iPi Recorder (ipisoft).
Thank you for sharing and explaining Han!
Thanks for your efforts, mate. Can you also make a comparison video of Xsens and Rokoko? That would be great.
Get Xsens if you can. Rokoko can't beat it, not even close.
Rokoko for me!
I love your content! Great breakdown, thank you so much
Body Occlusion with 6 cameras! lol Great video!
Hmm, I just checked all the options for this budget motion capture and I'm very disappointed (nothing against Move, though). My main question is: how are VR goggles so cheap, when they track the headset and several controllers with perfect precision? They can do it for less than 1,000 bucks, and you get a VR headset on top of that? I would really like some explanation... thanks.
What about fingers? Is the tracking good enough?
+1 sub! I have been looking for a great YouTuber who talks about retargeting in Maya. The most daunting thing as a solo indie dev is character rigging, animating, and retargeting; I have not gotten it working properly. Anyway, great video! I was looking at Rokoko and an iPhone system. I think Rokoko is the way to go.
This is super helpful. Thank you for an honest and comprehensive review with your OWN testing and data. Definitely a sub from me. I've got a Perception Neuron v32 which still works well for me. Markerless mocap at an affordable price with good fidelity...just doesn't quiiiite seem to be there ....yet
Is it better to use Vive trackers?
The data quality isn’t remotely comparable
@@anthonyganjou2485 in what way? Better or worse?
In my honest opinion after having tried it: nah.
ty
Good Review
Of course, it's raining on the day you booked everything out.
Cool video! I see you mostly do your animations in Unreal Engine, and I have a question: how many polygons does your rigged model have? From what I can see, it's done very well. I personally rig models in Blender, and although they look great at LOD0, Blender can't move or render them smoothly, even though I have a powerful computer. Yet it seems to be very smooth in Unreal Engine. Can you give any tips on what I need to know about a UE model beforehand?
I think if Wonder Studio turns out to be as good as they claim, these existing solutions won't stay relevant much longer, be it Move AI or mocap suits.
I think Wonder Studio uses similar tech, only single-camera based. I remain skeptical about how polished it is until someone tests it.
Uhm... so $365 for something you wouldn't use during the whole year, sold under the misleading "$1 a day" framing, and on top of that you only get 30 minutes per month...
Is that you? I even bought your course on 翼虎 (Yiihuu)!
Honestly, you failed to research proper clothing and contrast. You mean well, but it is a misleading video. You should try again with different clothing.
No matter what costume you wear, it's impossible to get better results than MVN Link, live or recorded, because Move AI's software lags behind MVN Animate. In particular, the problem of fast repetitive movements being recognized as noise is very serious, and the problem with rolling on the floor is unrelated to the costume.
jesus christ loves you all
Move AI is only for those who are learning about animation and mocap.
There are a few AAA studios who would disagree with you there. It's still being refined and improved, but it outperformed my Awinda Starter in my tests (without a $10K/year Xsens Pro subscription I ran into drifting issues with the Xsens that Move doesn't give me, so at least the cleanup is consistent).
Disagree. Get creative with your mocap and how you tell a story. Be very specific about what you need and when. Once you do that, it has a professional place to be used. It's just another tool. Can't afford crazy Xsens prices? Well, then you use Move AI. That simple.