Best AI Motion Capture 2021 - OpenPose vs DeepMotion

  • Published: 18 Oct 2024

Comments • 116

  • @corruptedsmurf260
    @corruptedsmurf260 3 years ago +43

    I don’t really care about the sponsored parts; I sub to this channel just to see what these AI people have thought of next. It’s like my news.

  • @gta12368
    @gta12368 3 years ago +57

    Even though it was sponsored, I think you showed their strengths and weaknesses fairly. Great vid!

  • @chaosfire321
    @chaosfire321 3 years ago +13

    Definitely a ways to go before they render mocap suits obsolete. Still, amazing progress in so little time!

    • @daton3630
      @daton3630 3 years ago +1

      There are still instances where mocap suits will be needed, so I don't think they will be going obsolete soon

    • @DeepMotionInc
      @DeepMotionInc 3 years ago +5

      Thanks for the feedback! We have been hard at work releasing quality updates and new features every few weeks. We don't expect to be able to replace tracking suits immediately, and our current goal is to fill a gap with more cost-effective solutions that don't require a big investment.

    • @LloydSummers
      @LloydSummers 3 years ago +1

      They have a ways to go before it is even really usable - I've found you'll use up your credits in any tier just trying to get an OK base animation to work on

    • @shawnarms16
      @shawnarms16 3 years ago +1

      @@LloydSummers Same here. I mean, even with the suit I have… I have to do a few takes to get it right. 25 minutes tops a month, with 60% of it unusable, just isn't feasible yet.
      I love DeepMotion and may use it if I can't get my Rokoko suit to an actor.

  • @jaeslow6347
    @jaeslow6347 3 years ago +5

    Creative processes are just going to be made so much easier with software like this; this is the time of indie games with small teams.

  • @RomiWadaKatsu
    @RomiWadaKatsu 2 years ago +2

    Given the jankiness of the results, that pricing is unreasonable

  • @RR-gx4ec
    @RR-gx4ec 3 years ago +11

    'Ad' should be in the title - clearly visible from homepage and search results.

  • @vikingzeppelin
    @vikingzeppelin 3 years ago +20

    Can you add multiple camera sources to increase accuracy in DeepMotion 3D? I missed it if you said so in the video

  • @razmoose213
    @razmoose213 3 years ago +15

    This would be every small-time VTuber's dream.

    • @canalactivado
      @canalactivado 3 years ago

      But don't the VTubers already do it? D:
      I see some of them move their heads and some others move their fingers. *Don't they already do it with AI? Or with DeepMotion?* D:

    • @viktorchernyk
      @viktorchernyk 2 years ago +1

      @@canalactivado probably Facerig

  • @LloydSummers
    @LloydSummers 3 years ago +12

    DeepMotion is not reliable; I found we needed to redo it half a dozen times, but by then you've used up your credits. In practice it is unmanageably expensive, as you need to upgrade to deal with their unreliable framework. It also has some really bad jitter, and doesn't do facial or hands. This has convinced me to just use OpenPose, contribute back to an open project, and maybe tie in some smoothing-specific ML.
    Of note: we also use Rokoko - much smoother but still needs cleaning. I'll see if I can't post a video on how our mocap compares. Edit: Oh wow, looks like OpenPose even has a Unity plugin.

    • @tabby842
      @tabby842 2 years ago

      Dunno if you're still using mocap - any change in your opinion of DeepMotion 7 months later?

    • @LloydSummers
      @LloydSummers 2 years ago +3

      @@tabby842 Yes, it's still quite unusable. I ended up cloning OpenPose (open source) and using that for a bit.
      We actually have Rokoko suits, depth sensors, and VR equipment - so we have a lot more options than most folx. I think the pricing is unreasonable, given that the number one reason for needing to reprocess captures is poor coding on the DeepMotion team's side.
      There are DeepMotion alternatives now, though, that are cheaper and better at processing, which you should check out. Just google for markerless motion capture solutions :).

    • @DB009
      @DB009 2 years ago

      @@LloydSummers I've been trying to get OpenPose to work with Unity for a while now; it's not working, Unity just keeps crashing.
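
A minimal sketch of the kind of smoothing mentioned in this thread - nothing from the video or DeepMotion's pipeline, just an illustration assuming the raw estimates are available as a (frames, joints, 3) NumPy array (e.g. parsed from OpenPose's per-frame JSON):

```python
# Simple per-joint exponential moving average to reduce frame-to-frame jitter
# in raw mocap keypoints before retargeting. Illustration only.
import numpy as np

def smooth_keypoints(keypoints: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """alpha near 0 = heavy (laggy) smoothing, near 1 = light smoothing."""
    smoothed = np.empty_like(keypoints)
    smoothed[0] = keypoints[0]
    for t in range(1, len(keypoints)):
        smoothed[t] = alpha * keypoints[t] + (1.0 - alpha) * smoothed[t - 1]
    return smoothed

# Example: 120 frames of 25 joints with synthetic jitter.
raw = np.random.default_rng(0).normal(size=(120, 25, 3))
clean = smooth_keypoints(raw, alpha=0.25)
```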

  • @David_Sirovsky
    @David_Sirovsky 1 year ago

    I think if you used this tool together with Cascadeur or Mixamo, the result would be better

  • @neknaksdn9539
    @neknaksdn9539 2 years ago

    DeepMotion: let's sponsor this guy so he can give us only a good review
    bycloud: na'ahhh

  • @LOCKSHADES
    @LOCKSHADES 2 years ago +4

    I think higher video FPS and quality might give a better result. However, that's just for a finer result. Some of the positioning will still be estimated with 0.7%-1.2% error, currently even on high-end imagery.

  • @sebastienletuan5647
    @sebastienletuan5647 2 years ago +2

    Excellent comparison. I am assuming that once a 3D animation file is generated, it is possible to modify it (i.e. make it do different things)? Or are you stuck with the animation that was captured initially?

    • @hi_its_jerry
      @hi_its_jerry 2 years ago +2

      Yeah, you can always clean it up and modify the keyframes
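
A minimal sketch of that kind of cleanup inside Blender, assuming the AI-mocap FBX has already been imported and the armature with its baked action is the active object (one possible pass, not the workflow from the video; run it from Blender's scripting tab):

```python
# Moving-average smoothing over every animation channel of the active object's
# action - a crude cleanup of jittery baked keyframes. Blender-only (bpy).
import bpy

obj = bpy.context.object
action = obj.animation_data.action  # baked keyframes from the imported take
window = 2                          # half-width of the averaging window

for fcurve in action.fcurves:
    points = fcurve.keyframe_points
    values = [kp.co[1] for kp in points]           # original channel values
    for i, kp in enumerate(points):
        lo = max(0, i - window)
        hi = min(len(values), i + window + 1)
        kp.co[1] = sum(values[lo:hi]) / (hi - lo)  # replace with local average
    fcurve.update()                                # refresh handles after editing
```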

  • @LOCKSHADES
    @LOCKSHADES 2 years ago +2

    On the other hand, as long as the estimation is off in just one dimension, you should expect at worst 1/3 of the error and at best 2/3 of the dimensions solved (it does better than that in real life most of the time). The denominator is the number of dimensions there are to account for.
    At worst, z depth isn't solved and you just have sparse data. At best, x, y and z are all figured out, but there's the handicap of just one perspective point (camera). There's no way to always cross-check.

  • @waters.create
    @waters.create 1 year ago +1

    I think you've just saved me a few thousand pounds here

  • @daydaykey
    @daydaykey 1 year ago

    Pleased to see Master Ma

  • @n1lknarf
    @n1lknarf 3 years ago +14

    The problem with AI-driven software for mocap is the lack of depth recognition, especially if recording with only 1 camera. Great for 2D animation but very bad for 3D, as it won't recognize bones moved in the Y axis (front and back)

    • @GS-tk1hk
      @GS-tk1hk 3 years ago +7

      But that's the thing about AI, it can estimate depth from a single image by inferring it from, for example, the way objects shrink with distance or the parallax effect. If you close one of your eyes, you still have pretty decent depth perception despite it being just one image.

    • @canalactivado
      @canalactivado 3 years ago

      @@GS-tk1hk I don't quite understand it. Before, mocap suits were used, and now AI is being used D:
      Will it be possible to make animations with the AI without a mocap suit?

    • @GS-tk1hk
      @GS-tk1hk 3 years ago +1

      @@canalactivado It already is, but it's not as good as mocap suits. I think a company called DeepMotion is working on it, among others. Basically the AI recognizes the human and how he/she moves in space, and translates that to virtual bones for animation

    • @bronationgamingultimate6760
      @bronationgamingultimate6760 2 years ago

      I'm not sure if it has support or not, but you could probably use the Xbox Kinect V2 with the depth-sensing tech it has.

    • @laupoke
      @laupoke 1 year ago

      Yeah it can, lol, wtf are you on about? The whole purpose of AI is to make extremely good educated guesses. If it was trained on data that moves on the Y axis (and it was), then it can estimate that. As a matter of fact, I'm pretty sure you would be able to tell if every point of every bone was in the same plane. This comment is dumb.
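
For anyone curious what "depth from one camera" looks like in practice, here is a minimal sketch using the openly released MiDaS model via torch.hub. This is only an illustration of monocular depth estimation, not what DeepMotion or the video actually runs, and it assumes torch, opencv-python, and internet access for the first torch.hub download:

```python
# Single-image relative depth estimation with MiDaS (small variant).
import cv2
import torch

midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transforms_repo = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms_repo.small_transform

img = cv2.cvtColor(cv2.imread("frame_from_video.png"), cv2.COLOR_BGR2RGB)
batch = transform(img)  # resize + normalize into a (1, 3, H', W') tensor

with torch.no_grad():
    pred = midas(batch)
    # Upsample the prediction back to the input resolution.
    depth = torch.nn.functional.interpolate(
        pred.unsqueeze(1), size=img.shape[:2], mode="bicubic", align_corners=False
    ).squeeze().cpu().numpy()

print(depth.shape)  # per-pixel relative (inverse) depth: larger = closer
```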

  • @mallhits
    @mallhits 2 years ago +1

    Dude. This info is fucking gold. I'm sold on DeepMotion. Thanks for the video

    • @tabby842
      @tabby842 2 years ago

      How do you like it so far?

  • @DemonOf4thGeneration
    @DemonOf4thGeneration 2 years ago +2

    How do you convert the file to PMD for MMD?

  • @sam_s3344
    @sam_s3344 1 year ago

    Thank you for the amazing video!
    Do we have a tutorial which shows us how to set up OpenPose? For someone with no coding experience.
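
Since OpenPose setup questions come up a few times in this thread: the usual no-code route is the prebuilt OpenPose demo binary. The sketch below simply shells out to it from Python and reads the per-frame JSON it writes; the binary path is a placeholder and will differ on your machine (on Windows the portable release ships bin\OpenPoseDemo.exe):

```python
# Run the OpenPose demo on a video and read back the 2D keypoints it writes.
import json
import pathlib
import subprocess

openpose_bin = "./build/examples/openpose/openpose.bin"  # placeholder path
out_dir = pathlib.Path("openpose_json")
out_dir.mkdir(exist_ok=True)

subprocess.run([
    openpose_bin,
    "--video", "my_take.mp4",
    "--write_json", str(out_dir),
    "--display", "0",        # no GUI window
    "--render_pose", "0",    # skip rendering, JSON only (faster)
    # "--hand",              # optionally also estimate hand/finger keypoints
], check=True)

# One JSON file per frame; pose_keypoints_2d is a flat [x, y, confidence, ...] list.
for frame_file in sorted(out_dir.glob("*_keypoints.json")):
    data = json.loads(frame_file.read_text())
    for person in data["people"]:
        x, y, conf = person["pose_keypoints_2d"][:3]
        print(frame_file.name, "first joint:", x, y, conf)
    break  # just peek at the first frame
```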

  • @sweetambitions2381
    @sweetambitions2381 1 year ago

    Is there something that could capture motions when the camera isn't fixed? Say, a close-up of walking with the camera moving 360° around it.

  • @MaxStudioCG2023
    @MaxStudioCG2023 1 year ago

    What about the DeepMotion copyrights? On the free version, what are we allowed to do with those 30-second free anims? Or what is the share we get from those?

  • @NeroForte_
    @NeroForte_ 2 years ago +1

    Can you compare DeepMotion to Rokoko?

  • @wensongliang1905
    @wensongliang1905 3 years ago +5

    A surprise appearance of the Lightning Five Consecutive Whips

  • @arifsultan2009
    @arifsultan2009 2 years ago

    Is there any AI motion capture software which can capture my finger movements? DeepMotion doesn't capture finger motion

  • @holdthetruthhostage
    @holdthetruthhostage 2 years ago

    Thanks for this

  • @cpsstudios
    @cpsstudios 1 year ago

    What version of OpenPose is this?

  • @grimsk
    @grimsk 2 years ago

    should know that it was the year for giant steps in motion capture..

  • @veesoho93
    @veesoho93 2 years ago

    Amazing, thank you!

  • @MrGTAmodsgerman
    @MrGTAmodsgerman 2 years ago +1

    Looks like they lowered the number of seconds you can upload for free accounts. It's now only 10 seconds!

  • @JadenIrons
    @JadenIrons 2 years ago

    2:11 is that the Chika dance?

  • @beefquiche
    @beefquiche 1 year ago

    I've been working with mocap for 18 years. Large studios won't adopt these video-based alternatives, as they can afford Vicon systems and a team of animators to beautify the input data.
    But this progress is exciting. Maybe by ~2030 the tech will be precise enough for AAA-budget games.

  • @mikealbert728
    @mikealbert728 3 years ago +3

    Can any of these free monocular solutions like OpenPose and VIBE actually get root motion? As much as I like them, without root motion, anything that has root motion is inherently going to be better.
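
One common workaround when a solver only returns per-frame poses is to treat the pelvis trajectory as the root motion and express everything else relative to it. A minimal sketch of that idea, assuming a (frames, joints, 3) array with the pelvis at index 0 - not necessarily how OpenPose, VIBE, or DeepMotion handle it internally:

```python
# Split a 3D pose sequence into root motion (pelvis trajectory) and local pose.
import numpy as np

def split_root_motion(pose3d: np.ndarray, root_index: int = 0):
    root = pose3d[:, root_index, :]                        # root position per frame
    local = pose3d - root[:, None, :]                      # joints relative to the root
    root_delta = np.diff(root, axis=0, prepend=root[:1])   # per-frame root displacement
    return root, root_delta, local

# Fake data: 90 frames of 24 joints drifting slowly through space.
pose3d = np.cumsum(np.random.default_rng(1).normal(scale=0.01, size=(90, 24, 3)), axis=0)
root, root_delta, local = split_root_motion(pose3d)
print(root.shape, root_delta.shape, local.shape)  # (90, 3) (90, 3) (90, 24, 3)
```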

  • @ATomCzech
    @ATomCzech 1 year ago

    Why are there just two extremes? A super expensive special suit, and then attempts to track motion in completely ordinary clothes. Why not use something in the middle, still at an extremely low price, like an ordinary yoga suit with some markers on it, which would make tracking much more accurate.
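
As a toy illustration of that "cheap markers on ordinary clothes" middle ground, a single brightly coloured marker can already be tracked per frame with plain OpenCV thresholding. The HSV range below is an assumption to tune for your own markers, and a real setup would need many markers, calibration, and ideally several cameras:

```python
# Track one colored marker by HSV thresholding and centroid per frame.
import cv2
import numpy as np

cap = cv2.VideoCapture("my_take.mp4")
lower = np.array([40, 80, 80])     # assumed HSV range for a green marker
upper = np.array([80, 255, 255])

trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)
    m = cv2.moments(mask)
    if m["m00"] > 0:               # marker visible in this frame
        trajectory.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
cap.release()
print(f"tracked the marker in {len(trajectory)} frames")
```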

  • @OhTaco77
    @OhTaco77 3 years ago +2

    Can I export it to VMD format?

  • @liverookie2145
    @liverookie2145 1 year ago

    Update: they give 60 free credits now and 30-second video lengths for free

  • @nicolaastanghe475
    @nicolaastanghe475 1 year ago

    Any multicam options?

  • @sdfrtyhfds
    @sdfrtyhfds 3 years ago +3

    I don't think you can compare a 2D solution to a 3D one. I don't know why you are taking this approach when you already stated there are open-source 3D pose estimation models. This comparison also doesn't make much sense, given that these advantages depend on your use cases.

    • @bycloudAI
      @bycloudAI 3 years ago +6

      Seems like I wasn't clear enough in my video.
      They might have slightly better or worse models to estimate 3D positions, but at the end of the day it relies on monocular 2D estimation first, which is then converted into 3D. The same goes for the commercial options too. Essentially, they both rely on 2D estimations; it's just that DeepMotion's visualization is able to be presented in 3D. That's all.
      A comparison video is literally about testing each of them in different use cases. This video is just to demonstrate the differences and similarities, so that everyone knows what they will get out of these technologies when they use them.
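
A shape-level sketch of the two-stage pipeline described in this reply: 2D keypoints are estimated per frame first, then a separate "lifting" step produces 3D positions. The linear layer here is an untrained stand-in just to show the data flow; real systems use trained lifting networks:

```python
# Illustrative two-stage flow: per-frame 2D keypoints -> lifted 3D pose.
import numpy as np

num_frames, num_joints = 60, 17
keypoints_2d = np.random.default_rng(2).uniform(0, 1, size=(num_frames, num_joints, 2))

rng = np.random.default_rng(3)
W = rng.normal(scale=0.1, size=(num_joints * 2, num_joints * 3))  # stand-in weights
b = np.zeros(num_joints * 3)

def lift_to_3d(kp2d: np.ndarray) -> np.ndarray:
    """Map (frames, joints, 2) -> (frames, joints, 3) with a per-frame linear 'model'."""
    flat = kp2d.reshape(len(kp2d), -1)    # (frames, joints*2)
    out = flat @ W + b                    # (frames, joints*3)
    return out.reshape(len(kp2d), num_joints, 3)

pose_3d = lift_to_3d(keypoints_2d)
print(pose_3d.shape)  # (60, 17, 3): the 3D estimate is derived from the 2D one
```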

  • @ambocc
    @ambocc 2 years ago

    Anyone compare this to iPi Soft?

  • @sadaqatalimeerani4807
    @sadaqatalimeerani4807 2 years ago

    Very helpful

  • @warfaceindiablackburnfire330
    @warfaceindiablackburnfire330 3 years ago +2

    Awesome 🔥

  • @danielyeh1627
    @danielyeh1627 1 year ago +1

    *Lightning Five Consecutive Whips, holy crap*

  • @joprecious4570
    @joprecious4570 2 years ago

    What about Brekel?

  • @monster.3ds
    @monster.3ds 1 year ago

    Best mocap is the Xbox 360 Kinect

  • @АлексейФарахов-л6б
    @АлексейФарахов-л6б 3 years ago +2

    Very cool

  • @dalemsilas8425
    @dalemsilas8425 3 years ago +4

    Their pricing rates are weird. How about they just straight up charge per minute. No "bundles".
    Like 1-min vid = $2, straightforward.

  • @sam_s3344
    @sam_s3344 1 year ago

    Anyone here who's tried DeepMotion?

  • @user-cw3nb8rc9e
    @user-cw3nb8rc9e 2 years ago

    Great study. It looks like both solutions suck, unfortunately.

  • @fuzzyhenry2048
    @fuzzyhenry2048 2 years ago +1

    The Lightning Five Consecutive Whips and Cai Xukun - I couldn't hold it together 🤣🤣🤣

  • @McHornet7
    @McHornet7 2 years ago

    A lot of bugs in the motions

  • @3dmax_Ani
    @3dmax_Ani 3 years ago

    Hi

  • @Tsero0v0
    @Tsero0v0 2 years ago

    Lightning Five Consecutive Whips 🤣

  • @remainder_
    @remainder_ 3 years ago +2

    How many coding languages do you know?

  • @keisaboru1155
    @keisaboru1155 2 years ago

    AHHHHH NVIDIA card t,t

  • @ThiagoBarbanti
    @ThiagoBarbanti 3 years ago +5

    No way; with a lot of drift, pose issues, and wrong rotations, we will not see these solutions in movies or games

    • @daton3630
      @daton3630 3 years ago +2

      yeah

    • @canalactivado
      @canalactivado 3 years ago

      In the future, will it be possible to make animations with AI without a mocap suit? D:

  • @themightyflog
    @themightyflog 3 years ago +9

    DeepMotion is too expensive and charges you even for bad captures.

    • @DeepMotionInc
      @DeepMotionInc 3 years ago

      We just added a new feature that automatically fails videos that do not meet a certain animation threshold. We are also adding another new feature in the coming weeks that allows you to rerun a video that passed with new settings.

    • @shawnarms16
      @shawnarms16 3 years ago

      @@DeepMotionInc That is good to know. But for that price and quality, I can buy a Rokoko. I think the pay options are limiting and should include more minutes. But that is just me. I'm doing a training course using DeepMotion. It's pretty cool so far.

    • @noedits5543
      @noedits5543 3 years ago

      Don't worry. Within a year someone will come up with an easy free solution. 😀 Then you will scratch your head for having paid folks for free stuff. Never pay anyone unless you're earning tons using it. Rather, use open source.

    • @DeepMotionInc
      @DeepMotionInc 3 years ago

      @@noedits5543 There are open source options out there right now and we welcome any and all comparisons to our service! We have a team that is continuously hard at work optimizing the animation quality and adding new features that provide added value, releasing a stream of frequent updates every few weeks.

  • @nikhil22bhardwaj
    @nikhil22bhardwaj 3 years ago +2

    Somewhere a Blender dev is watching this and...

    • @Rctdcttecededtef
      @Rctdcttecededtef 3 years ago +2

      …using it to create tasteful furry “videos”

  • @jiarunlu9050
    @jiarunlu9050 1 year ago

    Hahaha, the Lightning Five Consecutive Whips, I can't take it

  • @DrMattPhillips
    @DrMattPhillips 3 years ago +9

    I think a sponsored video should be more clearly disclosed. I watched the entire video, and yet it wasn't clearly confirmed to be sponsored by you directly until I read the description, and even then I had to click the 'show more' button, which, legally, within my location wouldn't satisfy digital advertising regulations. You can also tag a video as an advert, which this one very much felt like to me. I like your work, but I don't feel you were very ethically transparent at all about this being an advert; the only transparency came in the way of knowing the social cues of when somebody is trying to sell you something, but beyond that it'd be very easy to miss, and I feel you really dropped the ball there. Even a 'thank you for sponsoring this video' at the start and end would have felt more transparent when mentioning DeepMotion, as by sponsoring they are by definition being given preferential treatment, which needs to be clear.

    • @crescentsmile3463
      @crescentsmile3463 3 years ago +9

      He says it verbally at the beginning of the video, and there is also an overlay on the actual video saying it contains a paid promotion. Sooo... maybe you just missed those?

    • @DrMattPhillips
      @DrMattPhillips 3 years ago +2

      @@crescentsmile3463 The overlay wasn't there when I made this comment, he added that afterwards. Equally, the description was different too and wasn't as clear as it currently is. I find it odd that you didn't consider this as a possibility. As for the disclaimer in the video, I listened to the opening several times before making the comment, and I didn't find it clear enough. I never said it wasn't disclosed, just I felt it wasn't disclosed in a way that's transparent. However, I'd retract that now that he's made the two main and most important changes with the overlay and the description change. I feel these changes make it ethically transparent, and I'm glad he took such a perspective constructively rather than dismissing it or assuming bad faith like you did.

    • @crescentsmile3463
      @crescentsmile3463 3 years ago +6

      ​@@DrMattPhillips He literally says 'DeepMotion is today's sponsor' in the video... so...?

    • @DrMattPhillips
      @DrMattPhillips 3 years ago +2

      @@crescentsmile3463 There's something known as digital advertising standards; mentioning it 1 minute in isn't considered transparent, nor does it contradict what I stated. I have no idea why you are still pushing back against my fair constructive criticism when he clearly read it and made the correct changes I suggested, thus resolving my initial point and making the content better for it. There's literally nothing negative here beyond your inability to respect a fair comment, made in good faith, not even directed at you specifically. It's bizarre.

    • @crescentsmile3463
      @crescentsmile3463 3 years ago +6

      @@DrMattPhillips "I watched the entire video and yet it wasn't clearly confirmed to be sponsored by you directly until reading the description" - actually mentioning it in the video does contradict what you said in your original comment. You're trying to throw shade at a dude and I'm just giving the facts.

  • @bhuwannepathya
    @bhuwannepathya 2 years ago

    .

  • @酱花青年
    @酱花青年 2 years ago

    Not following martial ethics

  • @rusticagenerica
    @rusticagenerica 2 years ago

    Bodysuits suck.

  • @yimur5062
    @yimur5062 1 year ago +1

    All of these AI things are just bullshit..

    • @yimur5062
      @yimur5062 1 year ago

      ChatGPT: wait a sec... let me explain: Motion capture (sometimes referred to as mo-cap or mocap, for short) is the process of recording the movement of objects or people. It is used in military, entertainment, sports, medical applications, bla bla bla...

  • @nanaouzumaki6044
    @nanaouzumaki6044 3 years ago +2

    Blender users are done for..

  • @BOSSposes
    @BOSSposes 2 years ago

    Sure... pose the face, go ahead, make one shot of a project with that horrible rig

  • @radekoncar2404
    @radekoncar2404 3 years ago +6

    Label this as an ad in the title. Worthless.

    • @buisnessmahn6642
      @buisnessmahn6642 3 years ago +3

      I mean, did he not say it was sponsored in the video? When you watch a movie or a TV show, they don't have their sponsors in the title or up front. That just doesn't make sense. And worthless? You just weren't paying attention.

    • @hamsolo474
      @hamsolo474 3 years ago +1

      I mean how dare he talk about a service with a free offering and explain he's sponsored by them and then spend half the video talking about a free alternative to them.

  • @rafadananimasyon5274
    @rafadananimasyon5274 3 years ago +1

    Can you convert a 2D image to a 3D model for me for free?

    • @m.filmtrip
      @m.filmtrip 3 years ago +4

      Ha ha

    • @noedits5543
      @noedits5543 3 years ago

      I can convert a real human being into an animal for free.