The section where You "enact underwater calibration" is priceless :)
And thanks for everything else You do also :)
Correction: Move.ai can be calibrated with another actor than the one being captured, so Tom Cruise can chill while calibration is happening. Thanks move.ai for the clarification.
Hi Blender Bob! This is a super useful clip and it's great to see how these different technologies compare! Well done! We just wanted to reach out and correct some information here. When using our software, anyone can do the calibration of the cameras. The calibration is an important step in our process as it maps out the capture volume that will be used for the action sequence.
So if you've got Mr Cruise hanging around, he can chill in the back while anyone, i.e. an assistant, does the calibration of the cameras. We appreciate the mention nevertheless!
You guys are by far the best system for video motion capture. I obviously didn't know someone else could do the calibration. Sorry about that. I will mention it in a pinned comment and on social media. My bad. The point was that you need a calibration, yet in some of your demo clips you capture a soccer game. How does that work then? I'm a busy guy. Would it be possible to track underwater footage without calibration? I haven't tried your system going up stairs either.
@@moveai I'm already on your Discord server. Where can we continue this conversation? :-)
Bob, you are now my chief technical advisor & best friend. Thanks for bringing clarity & context to this technology. The times, they are a’changin. ✌️☕️🎩🎩🎩
It is just insane how fast and versatile the beta already is. Thanks for showing all the advantages over other tools.
Thanks a lot Bob for using my clip. Great video. Having a lot of fun in my spare time with all these new technologies, AI and traditional. It's a passion. ;) All the best, keep those videos coming.
Awesome!! I can’t wait to use it! I’m working on a fan film and this would really help make it possible for me since I’m a one man operation using an old laptop. Exciting times!
This is great! All crying saying it's BS don't understand how difficult this stuff is. Wonder Dynamics is a game changer
When I watched the Wonder Dynamics demo, my first thought was that it was 100% fake. I requested a test but they never came back to me, but it looks amazing.
And I'm 100% with you about the job situation. It's just a tool; we need to adapt and not be afraid of this new technology.
WHOA! That's awesome! I loved Spirits Within when it came out!
Very beautiful presentation... What I like most is the water calibration explanation 🤣😂😄 It shows your acting capability... Superb...
Your explanations are so awesome. Even for information that is very foreign to me, I can understand everything you explain. You always make me laugh as well. Thanks Blender Bob.
Thanks for showing this, definitely going to be useful for my lower-end stuff! Animation is so expensive; for easier shots I usually suffer through it. This will be massive, both in quality and in time!
1:20 - loved that movie. I remember the CG looked pretty outstanding to me at the time.
love u! nice video, thanks!!
Thank you very much for all your efforts.
Good overview (how does it fare when body parts are occluded, and can it use multiple cameras?). Cheers.
So far I didn't get any issues with occluded parts. Obviously anything outside the camera view will go a bit weird. You can't use multiple cameras at the moment.
Yes, this is of course a super-duper topic! Everything is changing fast; neural networks are a force!
Dead on point, as if Louis de Funès were explaining it!
Can you make a Wonder Dynamics .blend character with a custom metahuman?
Simple answer: no. MetaHumans are not compatible with anything.
Been testing the beta for a few weeks now and was sceptical but I am very impressed. You still need to know how to use software like Blender/Maya/C4D etc to tweak the animation and get the most out of it but it’s very, very powerful and I think the finished product will be awesome. Interested to see the pricing range etc
That ping-pong-ball suit instantly reminded me of the Mokap fighter in Mortal Kombat: Armageddon.
Hey there, I just received the closed beta for Wonder too! Super happy to be a part of it. Unfortunately I can't get past one of the steps you show: importing a custom character from Blender into Wonder Dynamics. Even with the character they supplied in their example in the Blender section, it fails and claims excess files. Could you please explain how you managed to upload your character? It wasn't shown in this video. Many thanks, you're amazing.
I haven’t been able to do it either and I didn’t get a chance to try again. May want to check on the Discord server.
An interesting thing about these new automations and AI imaging plugins is that they are often not designing them to create or output editable assets. It is awesome that these tools can help us make concepts and first passes, but how do you version up from something made by an AI? Your example with getting a body track from Wonder Dynamics was super interesting and looks like a great workflow. I will have to look into this one further, thanks for sharing these potential uses and your humor/outlook!
That's crazzzyyy oh my godddd
UV is powerful; saving time.
Wish you didn't have to be fancy business to use the beta. It's so cool
It’s not going to stay in beta forever
This is only in closed beta and it's already impressive. The question is: how much will it cost?
I have no idea
Good stuff!🍪
Didn't these guys use to be called Boston Dynamics?
You have a marked pedagogical talent!
I tried it and found it not totally usable until I understood you can get the whole file. Damn, this is crazy. I'd like it to just give me the camera track in an environment too.
I'm curious, I don't think I've seen footage with green screens. How does it hold up with that? Does it need a background with contrast and detail to work?
No green screen needed. It will try to rebuild the BG the best it can
@@BlenderBob I'm thinking for big productions where they have a lot of actors in front of a green screen. Where the background needs to be replaced.
@@kyleheilig255 Well, I would never use their system to clean the BG. I would just export the blend file and relight it and render it myself
Sorry, I'll write in French, it's easier.
Dear Blender Bob, I'm testing Wonder Studio and I must say I'm blown away.
But I have a small problem...
I create a character with CC4 and Wonder Studio doesn't recognize the FBX file.
I transfer that FBX into Blender and then export a new FBX.
Then Wonder Studio recognizes the file, but it tells me that "the file contains no textures"?!?
And there I'm stuck.
Do you have any idea of a procedure for importing your own characters?
Thank you.
I've never been able to import characters. I always get error messages, but this week they said they were working on a simpler way to import files from Blender and Maya without having to go through a multitude of constraints. To be continued.
@@BlenderBob Oh, perfect. Thanks for the reply. Glad to hear it's not just me. 😁
We'll stay tuned...
do you need to rig your character before you upload it?
Yes, and they have a long list of specs that you need to follow, especially for naming conventions. People have asked on the beta forum for them to support the most popular ones, like Mixamo, Rigify and Auto-Rig Pro.
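For anyone hitting that naming-convention wall, the kind of remapping involved can be sketched in a few lines. This is a hypothetical example: the `mixamorig:` prefix is Mixamo's real convention, but the target names here are invented, since Wonder Dynamics' exact spec isn't quoted in this thread.

```python
# Hypothetical sketch: remapping Mixamo-style bone names to a target
# naming convention before upload. Target names are made up for
# illustration; check the tool's own spec for the real list.
MIXAMO_TO_TARGET = {
    "mixamorig:Hips": "Hips",
    "mixamorig:Spine": "Spine",
    "mixamorig:LeftArm": "LeftArm",
}

def remap_bone(name: str, table: dict = MIXAMO_TO_TARGET) -> str:
    # Fall back to stripping the Mixamo prefix for unlisted bones.
    return table.get(name, name.replace("mixamorig:", ""))

print(remap_bone("mixamorig:Hips"))  # Hips
```

In Blender you would run a loop like this over the armature's bones before exporting the FBX; the mapping table is the part each tool's spec dictates.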
Oh oh, the price of the Rokoko suit is gonna drop WAAAAAAAY DOWN.
I've had no less than 3 VFX artists (that are literally working on the same films/shows I am) dismiss Wonder Dynamics completely. I showed them several tests I did, and all they could say was "I would never show this to clients, there's still so much work to do... bla bla". I literally gave them a disclaimer before I showed them the tests, to look at the technology... not just the final results. But all they could focus on was that the results weren't 100% in-theater deliverable-ready. Are we just surrounded by people that refuse to see the usefulness here? I mean, they have the exact same skills and knowledge as me... they're literally a part of the same studio pipelines... and still they just won't admit it has any merit. Sad.
They are the VHS tapes of the industry
@@BlenderBob Beta max 🤣
Someone from my generation, huh?
@@BlenderBob Yessir... you and I are both GenX'ers I think. And they're at the top of the Millennial cut-off.
@@johntnguyen1976 Gen X'ers are a resourceful bunch, unafraid of adaptation. That scares everyone else. :)
It will be in realtime renderings soon.
Yeah, just need to use unreal instead of Blender or Maya
I am adapted with blender "kind of :)"
Is this a standalone software thingy, or do you upload your footage into the cloud? Their cloud. So they train more AIs on your body features...
This is a web site so you don’t need an insane graphics card.
You can do calibration prior to the shoot (Not underwater 😂) and then you shoot :)
Nope, because underwater you have refraction and that will screw up everything.
@@BlenderBob I see, makes sense why James Cameron had to fill pools with black balls!
@@BartBarlow That was to stabilise the water and cut light coming from the top, I think, but essentially, yes.
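A toy Snell's-law computation illustrates why refraction wrecks underwater calibration: a camera behind a flat housing port sees rays bent at the water-air interface, so the straight-ray pinhole assumption that calibration relies on no longer holds. This is my own illustration, not anything from Move.ai; the only inputs are the standard refractive indices of water and air.

```python
import math

def refracted_angle_deg(theta_deg: float, n_in: float = 1.333,
                        n_out: float = 1.0) -> float:
    """Snell's law: n_in * sin(theta_in) = n_out * sin(theta_out)."""
    s = n_in * math.sin(math.radians(theta_deg)) / n_out
    return math.degrees(math.asin(s))

# A ray travelling 30 degrees off-axis in water bends to roughly 42
# degrees in the air inside the housing -- and the bend grows
# non-linearly with angle, so it can't be absorbed into a simple
# focal-length tweak during calibration.
print(round(refracted_angle_deg(30.0), 1))
```

An on-axis ray (0°) passes straight through, which is why distortion gets worse toward the edges of the frame.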
I registered and filled in the form for access almost a month ago, and no reply from them.
I guess there were enough beta testers.
This would be great for YouTube. Just create your character/avatar and most of the work is done.
Any examples of their output in 3D DCC/Game engine? Is it only look good from the camera perspective because of the lack of camera calibration information?
Yeah, it's only good from the camera POV. The systems cannot track what it doesn't see.
@@BlenderBob Thanks! Is there any chance that you can either record a video showing the result from multiple views or simply upload a .blend/.fbx somewhere?
@@garlic333 I added a link in the description
@@BlenderBob Thanks! It looks good, to be honest. I wonder how stable the underwater case is (7:55)? I'm asking because there is a moving camera there. Interesting to see how accurately it's solved. Thanks!
@@garlic333 It's shaky because the footage was crappy to begin with. Low res highly compressed MP4. And water bubbles passing in front of the camera, not good.
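Bob's point that the solve is only good from the camera's point of view comes down to depth ambiguity: a single camera projects away one dimension, so different 3D poses can produce the same image. A minimal pinhole-projection sketch (unit focal length, my own toy function, not Wonder Dynamics' code):

```python
def project(point, f: float = 1.0):
    """Pinhole projection of a 3D point (x, y, z) to image coordinates."""
    x, y, z = point
    return (f * x / z, f * y / z)

# Two different 3D points on the same viewing ray land on the same
# pixel, so a single camera can't tell them apart -- depth along the
# ray is unobserved, which is why the solve drifts off-axis.
print(project((1.0, 2.0, 4.0)) == project((2.0, 4.0, 8.0)))  # True
```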
It's just for the cheap mocap market, not AAA class like the Avatar movies, and not for Pixar-style animation. Pixar-style animators are safe.
yet... ;-)
@@BlenderBob yes I actually said safe for probably 10 years and then I deleted it. but yeah that's reality
@@Metarig There will always be a need for non humanoids but for cartoon, way less than 10 years at the speed AI develops.
@@BlenderBob It's got me thinking, in just a few short years, we could find ourselves living in a Matrix-style world where we're blissfully unaware that we're just a bunch of lines of code! But hey, at least nobody will be out of a job, right? Silver linings and all that.
@@Metarig Maybe you're right
Will AI replace VFX artists?
I don't think it will replace even biped character animation. Sometimes hand-animated has a style that live-action mocap can't do. Cartoons are more exaggerated than real humans, even if the mocap is perfect.
Yeah, that’s what I said. It doesn’t have the human touch. But for TV shows for young kids who don’t care about that, they will use it for sure.
In many cases mocap already has replaced biped characters… why wouldn’t this do the same in certain projects.
I meant some situations like Pixar, not Marvel
It is great to see you break down each mocap tech and technique and their limitations. I just saw this new device being developed by HTC and featured on Thrillseeker: ruclips.net/video/c6kB2HhImLo/видео.html. I'm quite excited by this potential and the new AI-driven mocap software. The clean plate tools in Wonder Dynamics are insane!
I see that the comments of the usual 3D master have stressed you out a bit.😂
It’s all about education in technology. :-)
Try saying "wave lenk".
People are too funny. Just mocap? How about rotoscoping, rigging a model and retargeting the animation onto it, creating the clean plate and cleaning up where the actor was removed from the scene, relighting the scene so that your subjects tie into it more realistically, compositing, etc.? You can use the blend file with the mocap data and camera tracking and act out your own cartoons or animated short films; you just have to build out the sets, add whatever character you want, then retarget the animation and boom, life made easy. Trust me, there are more ways to use this application and its data than the intended use. I've heard people complain about how long it takes to render out... like "it's cool, but it takes an hour per generation". They must not understand the hours of work Wonder Studio gives us back by doing all the tedious and painstaking tasks. I'm new to the game, and I guess I came in at a good time, hah. Thanks for the video Bob, def crushing that sub button and like, keep 'em coming bud! Here's wishing you all the best!
Amazing, all jobs in the mud
Not all. The boring ones mostly