Move AI vs Rokoko Mocap!
Files!
drive.google.com/drive/folders/1TCWEHnedeEQ3GJMzrIM8T7YpUnSG36a_
I don't know what circumstances this Rokoko recording was made in, but I've been testing the V1 Smartsuit recently and, as long as you have a high-bandwidth gaming router, I can confirm the results are miles, miles better than this. There's usually a moment or two you'll want to smooth in editing later, but from my experience I wouldn't call this Rokoko motion representative at all :)
Hey, just to confirm, this was done with an experienced Rokoko operator. Smartsuit 2 tests will be next week.
As a Rokoko Smartsuit V1 user, I can confirm I've received very similar results to the video above (with a dedicated router, the same setup Rokoko recommends) on a tennis court with no sources of magnetic interference. Rokoko requires a lot of cleanup and the results are not good enough.
@@anthonyganjou2485 No doubts about your mocap operator's experience, but something is definitely wrong with your Rokoko results; in your case they're completely unusable. I have the V1 suit, and even though it sometimes has small glitches with the legs (which should be better in V2, as they promise), it produces MUCH better and more reliable results than in your example. Your mocap data from Rokoko is simply bad. I don't know why this happened, but it is definitely not representative at all, as Sam said above. I hope Rokoko used a dedicated WiFi router (preferably 5 GHz) and that Move AI used its own router on a different band. Also, check that there's nothing metal or magnetic in the capture area that could interfere; the Smartsuit is pretty sensitive to that.
I can see that Move AI (as well as other vision-based systems like ipiSoft) can produce more interesting, higher-quality results if you have enough space, but it would be much more informative and fair for potential customers to see both systems at their best. Right now, what was captured with the Rokoko can't be used in the comparison. If their suit really produced mocap of such poor quality, their product would not exist at all.
@@anthonyganjou2485 Hello, did you perform that test and if so, where can we view the results?
I have the second-gen Rokoko, and the legs are the weakest part of this system. Also very important: does this floor have any metal parts? That can interfere with the sensors. I'd also like to know how accurately Move AI tracks small movements of the hands and other parts.
Wow, that's awesome. Incredibly smooth and clean movements; it looks very natural.
Thanks for the work you put in.
This was a comparison created by Move AI; of course they're going to make their own results look way better. You need an unbiased third party to run these comparison tests to get a real comparison.
This technology is amazing for indies and individuals who can't afford overpriced solutions like Rokoko. We have Rokoko and it really sucks...
Super useful. Was considering Rokoko… not now
This is V1 though, wait for V2, maybe it's better.
@@Jsfilmz fair point
Based on watching a Rokoko demo this week, I think these results are not filtered using their studio software, which seems to fix a lot of the leg problems. It's also not really fair to compare brand-new technology to 5-year-old technology.
@@hyperbolicfilms Yeah, because this comparison was done by Move AI themselves. Not exactly an unbiased test.
I think there's a place for both. You can use Xsens and then use AI alongside it for exact positioning.
Yes, I use both.
"So maybe that person goes and grabs... I don't know... some shrimp, or a plate of shrimp."
(I'm sorry, the actress that played Leila from Repo Man just moved in down the hall and I have Repo Man quotes going through my head all the time now.)
I have been field testing the Move multicam and the Move One app for years now, and while I'm impressed with the convenience, I'm depressed with the cleanup required. There is a lot of noise, no two ways about it. I am interested to see tests of Move multicam vs Move One vs Xsens done by neither Move nor Xsens. Impartial reviewer only. All these suits and markerless capture systems look impressive with moderately fast motion. Very fast motion usually reveals breakdowns, and slow, steady motions have breakdowns too. Try putting a hand on a hip, or on a table, and see if it remains locked. A seated conversation is a good mocap test. Can the characters keep their arms on tables without shaking their way above and below the tabletop? When they are stoic, are they in fact vibrating from solve noise? Try attaching a prop, like a gun or a coffee cup, to the hand bone: can it hold steady when the actor does, or is it shaking around like they've got Parkinson's? Conversely, dancing is the absolute worst test of a mocap system. Please, testers, stop doing that.
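If you want to put a number on that kind of "hold" test, here's a minimal sketch of the idea (just an illustration, not anything from the video): it assumes you've exported the hand bone's world position per frame to a CSV, and the file name and x/y/z column names below are made up. It reports how much the bone wobbles while it's supposed to be locked.

```python
import csv
import statistics

def hold_jitter(csv_path, axis_cols=("x", "y", "z")):
    """Per-axis standard deviation of a bone that should be perfectly still.

    Expects a CSV with one row per frame and columns for the bone's
    world-space position (column names here are an assumption).
    Units are whatever your export uses; more than a few millimetres of
    deviation will read as visible shake on a prop parented to that bone.
    """
    with open(csv_path, newline="") as f:
        frames = [tuple(float(row[c]) for c in axis_cols)
                  for row in csv.DictReader(f)]
    return {axis: statistics.pstdev(vals)
            for axis, vals in zip(axis_cols, zip(*frames))}

if __name__ == "__main__":
    # e.g. a few seconds of the actor holding a hand flat on a table
    print(hold_jitter("hand_on_table_hold.csv"))
```

Run it on the "hand on table" and "seated conversation" clips from different systems and you get a directly comparable noise figure instead of eyeballing the shake.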
It’s very interesting. Is there any pricing info yet?
Yeah, $365 for the year, I think.
@@ukmonk $365 at $1/day, but with a fair usage policy of 30 minutes a month, which feels kind of low to me. Mocap is a fairly lumpy exercise: you're going to record and capture a lot of video, and keep doing it until you know you've nailed the motion both as a performance and as resulting mocap. 30 minutes a month doesn't feel like it will be enough for such an endeavour. After 30 minutes you're into their overage charges.
Wow, not sure I would have bought the Xsens if I'd had this option. I still get drift with the Xsens, and as good as it is standing up, I've had pretty crummy results with crawling/ground work. Curious about your experience there? I'm finding I need to do a LOT of cleanup.
Xsens comparisons out Monday!
@@anthonyganjou2485 Great! Are you part of Move?
Yes, I'm the co-founder.
AMA
@@anthonyganjou2485 Amazing product! Looking forward to Monday!!
@@anthonyganjou2485 I've actually got a ton of Q's. I'm deep in the middle of a virtual production project and was very close to upgrading my XSens, but this could potentially change my plans. Is there a way to get in touch via email?
Man, I was bummed like you wouldn't believe at my Sony Mocopi not being capable of pulling off a close enough animation that I could just clean it up by hand. I'd lose my ever-loving MIND if I bought a Rokoko and wasn't unbelievably happy LOL. Sucks, but I really do think the only way you'll be genuinely happy with motion capture is if you spend some big bucks. Although the thing that I REALLY want to see is someone create something better than Cascadeur. THAT would probably be the absolute best thing for those of us who want to create action scenes.
In all motions the 2 iPhones beat the Rokoko.
Move AI would definitely be my way to go, but with only 30 minutes of credit per month it's an absolute no-go. And that feeling of having to get every shot perfect on the first try because of the credits... nope. Move AI could be so cool, but it's totally useless under these conditions.
Hey, love the tuts!
I am kind of struggling with something and perhaps you have some answers.
I have a character that I made in Blender with hair. I rigged it with an Auto-Rig Pro rig and retargeted it to some Rokoko mocap, and it all works well.
Then I exported an FBX for the animation and an Alembic for the groom. Now I am struggling to attach the hair to the FBX. The only way I found was via Blueprints. That worked, but I wonder whether that's the way to go.
Ideally it would be nice to have the rig/FBX file in Unreal and be able to replace it with other mocap data when needed,
without having to replace textures and with the groom kept intact.
Am I doing it right or is there an alternative?
Thanks in advance
Never owned Rokoko, mate, sorry.
Hmm, I love your videos, but I've got a Rokoko Smartsuit Pro II. I'll have to test the drift myself after your comments. Just remember the Smartsuit Pro II is a lot better than the first version, so I don't know if this is a fair representation of the Rokoko output. Otherwise, Move AI looks really accurate and cool. How difficult is it to set up, though, and how does that setup change between different recording locations/sets?
They are doing the version 2 test next week, I think. Here is the entire process on Move AI: ruclips.net/video/g0JEsjA4rqQ/видео.html
We are testing against the smart suit 2 next week
I am curious about Rokoko's single-camera Vision/AI versus Move AI's single-cam.
Keep in mind, Move AI is low-key biased here. That being said, the results are amazing.
WOW, that's amazing! I tried to register with Move AI but haven't got any response yet... Is there any way for normal people to try this amazing technology? Or do you have any clue when Move AI will be available to the public? Thanks.
Same here. That's the issue. Still no clue when this thing will be public
Exactly, at least tell us a projected product release date and perhaps a price point. It's frustrating to not have access to it.
Registered and have been waiting a while too. Would be good to at least get an update.
Would love to get access myself, also waiting
Hi all, I appreciate there is massive frustration that we haven't released yet. Public launch is coming very soon, as is pricing that makes it accessible to everyone.
Hey, man. Are you able to use GoPros with Move AI? Or are iPhones a requirement? Thanks a lot!
I've used GoPros with it on the channel, go check it out.
@@Jsfilmz Can you use an Android yet?
Is this something exclusive to iPhone? Since we are just processing a video to extract mocap from it.
Yes, iPhones; it's calibrated to iPhones.
@@Jsfilmz So no iPad... man, I'm not buying an iPhone just for this. Wait, what was that stuff about a host? What do you mean by that?
@@no_player_commentary I'm of the same mind, but I found out that this software still isn't open to the public yet... hope it launches soon!!!!
Nice, man. I've got one suit with gloves; it doesn't work so well.
Move AI and Rokoko should improve their hand and face tracking like DeepMotion.
I wonder if I could use my 8 PlayStation Eye cameras, which max out at 480p 60 fps, with Move AI.
What makes PlayStation Eye cameras so special? I have seen people talking about PS3 Eyes.
We could support them but the quality won’t be anywhere near as good
@@MODEST500 Well, they were super cheap a while back ($5 per camera) and were used for ipiSoft, a markerless motion capture software.
@@anthonyganjou2485 Oh hey, you're the real deal. Even with 8? That's a shame.
It’s more about technical time to onboard the cameras. I’ll speak to the tech team to see when/if we can get it on the road map
Is this limited to iPhones? It really sucks that the best mocap/facial tracking is stuck behind the Apple ecosystem.
Hey Chris, it is initially; Android will be next year. The rationale is that you can get a device anywhere on earth, and as we support down to the iPhone 8, they become pretty cheap as well. The device also has functionality that optimises how you connect the devices to create the rig, shoot and upload.
@@anthonyganjou2485 Understandable. Apple has been pushing facial tracking and better front cameras for these new fields. Android is lagging behind in the AR space for the same technologies now.
I've been an Android fan for a loooong time and it's sad to say they are slacking.
Still confused. I have been testing a lot of AI-based mocap but am not satisfied with the results.
Do you see the result in real time, or do you have to wait for the processing in the cloud?
This one is cloud. I think real-time is coming though.
Thank you, man. Great post. I wonder if we can do face and head mocap in real time in Unreal together with Move AI. Is it possible to stream full-body and face mocap in real time in Unreal simultaneously?
Not live, not yet, but eventually I'm sure it will come *wink*
@@Jsfilmz So what could the workaround be? I mean, let's say I made a mocap and I want to attach my face capture to it. Is it possible or not? Let's say I use the Live Link Face capture in UE, could that work?
Anyone ever told you, you look like Dante Basco :D
Lol, yeah, a lot. I dressed up as Rufio one Halloween.
@@Jsfilmz YES, the world needed it 🌍😂 Awesome man!
You can get away with 2 iPhones.
You can use an iPad as a host!!!
Which camera works best with Move AI? GoPro or an Apple phone?
They are pretty close imo.
I spent so much money on Rokoko. I guess it's time to sell it.
What's the verdict? Move AI with 4 cameras looked best to me.
Wow, Move AI can use a single iPhone? Can my one iPhone XS Max do this?
This is insane! It will destroy Accurig.
AccuRig is a rigging tool, not mocap, though.
Hello JS! Does Move AI support Live Drive?
Not yet, but I was told eventually it will.
Hi man, I'm from Brazil and I have an idea for a product to launch here, but sometimes information is really hard to find. Can you help me with this Move AI livestream question? I'm close to investing in some iPhones to use this; it looks perfect to me!
Cool, man. Tbh I can't take on any extra work, my full-time job is keeping me busy.
@@Jsfilmz No, no extra job, sorry for my English!!!!
What I mean is, for example, do you have a video that shows Move AI Live Link with Unreal?
Just that.
kkkkk
I saw that this app is still not open to the public???
@@DouglasBenvenuti Yeah, not yet, and no livestream yet, only offline import.
@@Jsfilmz Any idea when it opens?
Add me to the Rokoko owners who think it's garbage.
Hahahaha man, I tried to warn people, yo.
@@Jsfilmz the gloves do pretty good.
Does this work in real time, or does it turn a pre-recorded feed into animation?
Move AI is way better! BUT... I heard that Move AI is going to be crazy expensive... can you confirm?
Idk, it's free till March now, I think.
@@Jsfilmz Cool... is it limited in who can apply, or is it open to all? Apologies if you've already addressed this! You're just the go-to guy on mocap and Unreal, bro!!
I've had the Rokoko Smartsuit V1 since launch and now I use V2, and mine is awesome! This one here looks laggy as hell!! This is a false representation tbh!
You're comparing $200/month vs 1 free cam.
He compares phone cameras vs a $2k mocap suit.
Is Rokoko made with 1 camera? Why not use 2 cameras?
Rokoko is the motion capture suit, not Rokoko AI.
Does this Move AI only work with iPhone?
Hey!
I really want to know the pricing for this. I really wish they would release a beta to test it ASAP.
I hope it's priced so small indies can get it. Three plans would be great:
1) Under $15 for indies
2) Companies making over $100k pay business prices, $40+ a month.
3) If you make over $1 million, pay corpo prices (contact us).
I'm giving them price suggestions, we'll see.
@@Jsfilmz Super. I hope they listen to your suggestions. I'm sure you have a good grasp of what prices would be popular.
Pricing will make it accessible to everyone :)
How and when can we use it?
It's still in beta, mate, I'm beta testing it for them.
@@Jsfilmz We need this; suits cost an arm and a leg, and this technology still isn't released. I hope it comes soon.
Is that all iPhone 11 models?
I'm not sure, I just know it's 11 and up, since it's a tight space.
@@Jsfilmz thanks =)
You shouldn't wear black clothes when using Rokoko.
Daddy want.