3D Motion Capture using Normal Webcam | Computer Vision OpenCV

  • Published: 12 Sep 2024

Comments • 212

  • @murtazasworkshop
    @murtazasworkshop  2 years ago +49

    Would you like to see more Python and Unity tutorials? Share your ideas and I will create the most voted one.

    • @ameerhamza5542
      @ameerhamza5542 2 years ago +2

      Create a program to measure the size of a foot with a live webcam. Thank you!
      Big fan from Pakistan

    • @TriSutrisnowapu
      @TriSutrisnowapu 2 years ago

      Would you mind also sharing an Unreal Engine workflow?

    • @faisaltasleemft
      @faisaltasleemft 2 years ago

      Yes, I really like your content. Full of learning.
      Where do you work? Do you maintain any WhatsApp/Discord community for Python?

    • @studyinchina4139
      @studyinchina4139 2 years ago +2

      Please, what if I would like to track only specific landmarks? For example, in a video, what if I only want to track the upper-limb landmarks?

    • @anime_on_data7594
      @anime_on_data7594 2 years ago +3

      Can we replace the skeleton with a cartoon character?

  • @user-xm8bv6pd7e
    @user-xm8bv6pd7e 2 years ago +17

    26:30
    1. Lock the Inspector
    2. Select all the spheres in the Hierarchy with Shift+click
    3. Drag and drop them into the Inspector's Body list field

    • @holaholahee
      @holaholahee 2 years ago

      ok

    • @MonkeyShark
      @MonkeyShark 2 years ago

      I was going to comment this lol. I didn't use to realize that was a thing, and I was so mad about the hours I'd previously wasted when I found out.

    • @kanall103
      @kanall103 2 years ago +1

      How do I lock the Inspector?

    • @user-xm8bv6pd7e
      @user-xm8bv6pd7e 2 years ago

      Hi,
      At the top of the Inspector, you can see the lock icon.
      You just need to click it to lock or unlock your Inspector window.

    • @kanall103
      @kanall103 2 years ago

      Thank you

  • @tunaroll3957
    @tunaroll3957 2 years ago +6

    Exactly what I've been looking for! Mocap suits are so expensive! Thanks a lot Murtaza, for your awesome tutorials

  • @aek--
    @aek-- 2 years ago +27

    Great job, thank you. But how do we apply this movement to a mesh object?

    • @murtazasworkshop
      @murtazasworkshop  2 years ago +25

      Will create a tutorial on that soon.

    • @aek--
      @aek-- 2 years ago +1

      Thanks for the reply, looking forward to it

    • @pinguimaya
      @pinguimaya 2 years ago +6

      @@murtazasworkshop Good job. How can I use it for a 3D model in Unity3D, or how can I use it in Blender?

    • @ricetv942
      @ricetv942 2 years ago

      @@murtazasworkshop please

    • @sydneybeaa
      @sydneybeaa 2 years ago

      @@murtazasworkshop I'm so excited!

  • @FarmBoyTech
    @FarmBoyTech 4 months ago +1

    Man your videos are just out of this world.

  • @jpmccrossan6807
    @jpmccrossan6807 1 month ago

    An absolutely amazing video that is really clear, well explained, and uses a wonderful logic-driven process :)

  • @RockingScience
    @RockingScience 2 years ago +10

    I followed your earlier pose-tracking video and made something similar to this in Blender: I got the position data into a list, converted it into a binary file, and in Blender converted it back into a list and animated a rig with it. (Tbh I really learnt a lot from that video and hope I can learn something here too, to make my Blender version better.) A minimal export sketch is included after this thread.

    • @FreshCreamSoup
      @FreshCreamSoup 2 years ago +2

      Hi, can you share it? I would like to explore it. Thanks.

    • @spatil689
      @spatil689 1 year ago

      Please, can you share this?

    • @aaditya4998
      @aaditya4998 1 year ago

      Hi bro, can you tell me how you mapped that onto a humanoid 3D rig?

    • @aangigandhi
      @aangigandhi 5 months ago

      How do you animate the rig using the dimensions and outputs we get here?
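
    A minimal sketch of the export side of that workflow, in Python, assuming cvzone's PoseModule (as used in the video); the input path, the output filename, and the exact landmark layout returned by findPosition are assumptions and may differ between cvzone versions:

    # Sketch: export per-frame pose landmarks to a text file for later
    # playback (assumed format: one comma-separated x,y,z list per frame).
    import cv2
    from cvzone.PoseModule import PoseDetector

    cap = cv2.VideoCapture('Video.mp4')   # placeholder video path
    detector = PoseDetector()
    posList = []

    while True:
        success, img = cap.read()
        if not success:
            break
        img = detector.findPose(img)
        lmList, bboxInfo = detector.findPosition(img)
        if bboxInfo:
            lmString = ''
            for lm in lmList:
                # Each landmark entry ends with x, y, z; flip y so "up" is positive.
                x, y, z = lm[-3], img.shape[0] - lm[-2], lm[-1]
                lmString += f'{x},{y},{z},'
            posList.append(lmString)

    with open('AnimationFile.txt', 'w') as f:
        f.write('\n'.join(posList))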

  • @sultanrahil3380
    @sultanrahil3380 1 year ago +1

    Amazing video. Great job. Keep it up. 👍

  • @Equilibrier
    @Equilibrier 2 years ago +11

    This is just awesome, thanks a lot! Keep up the good work!
    PS: Can we find a good 3D model and a good rig to translate those points/spheres into rig articulation points, and make a complete human 3D model act the same as in the animation? In theory, it should be possible. If you are willing to try something like this I would be grateful. Thanks!

    • @FerisovGerman
      @FerisovGerman 7 months ago

      rigging

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar.

    • @Equilibrier
      @Equilibrier 5 месяцев назад

      ​@@aangigandhi Hope I'm not too late for you; I'm far from being proficient in Unity3D and haven't played with it for some years, but the process would rather be pretty simple, but on the implementation side you will have to make things work and requires some practice, of course, but it's certinaly very doable:
      1. you have cvzone which reads the poses as absolute points coordonate and you also have the image-map which shows you the pseudo-rig points those points translate to; it's easier you could look at an actual rig in Unity3D to see what points it has and so on so you can do the right mapping directly in python, before writing to the persisted-data file, if you ask me. Maybe these clips will help you understand the official unity3D rig: ruclips.net/video/Htl7ysv10Qs/видео.html and ruclips.net/video/Wx1s3CJ8NHw/видео.html
      2. In unity3D things should be as simple as it was for the author, since the code you will have to write is similar; you have to make a GameObject which defines the bones/rig-points (don't know which, I haven't played with it but you can) as a mere Vector of Transform or Point3D or something like that, write the logic that assigns those points from the ones stored in your file, place a delay like he did in the video, in the Update() method (correctly you would get the timings from the cvzone and store those in your file as delays or keyframes and this would make you do a perfectly synced 3D translation of the originar video movements but you shouldn't try this from the start, only after the first part works...) and then, in the inspector, you will have to manually associate those points (your GameObject Vector) with actual rig points/bones (similar to what he did, but this time from the rig itself, not from the created spheres);
      3. of course you should start with a project that has a 3D model aswell associated with the rig, and maybe somehow find an unity3d mechanism to easily switch 3d models
      PS: you should be careful with coordinate systems, you saw in the vide, he did a transform on the points (over the camera matrix, which you don't see because it's hidden in the transform function) to make them be in the local (camera) coordonate system, from the absolute (world) system he had them in in the first place (in the file); most probably you should do the very same, since I assume the rig will also have it's bones/points in the cam coordinate system.
      here is some code that you can take as reference but it's pretty much the same of his (just some sample codes to make you understand the idea):
      using System.Collections;
      using System.Collections.Generic;
      using UnityEngine;
      using System.IO;
      public class RigAnimator : MonoBehaviour
      {
      public Transform[] bones; // Oasele rig-ului
      private List frames; // Lista cu pozițiile pentru fiecare cadru
      private int currentFrame = 0;
      void Start()
      {
      LoadBoneData("path/to/your/data.json");
      }
      void Update()
      {
      if (frames.Count == 0) return;
      // Aplică pozițiile oaselor
      for (int i = 0; i < bones.Length; i++)
      {
      bones[i].localPosition = frames[currentFrame][i];
      }
      // Actualizează la următorul cadru
      currentFrame = (currentFrame + 1) % frames.Count;
      }
      void LoadBoneData(string filePath)
      {
      string jsonContent = File.ReadAllText(filePath);
      frames = JsonUtility.FromJson(jsonContent);
      }
      }
      (sorry about the romanian comments, forgot to write them in english, but you will understand so this would be animation code, loadBoneData will have to read the file based on the file format, but a json format is verry suitable you should go with it, since in python you have json.dumps for this as well)
      and for association (in the inspector is a possibility but maybe it works in the code, as well, don't know) - it should be something like this
      // another gameobject which runs only once (this is why the logic is in the Start method, not in the Update method)
      void Start()
      {
      LoadBoneData("path/to/your/data.json");
      bones = new Transform[5]; // Să presupunem că avem 5 oase importante
      bones[0] = GameObject.Find("Bip001 Pelvis").transform; // Exemplu de nume de os
      bones[1] = GameObject.Find("Bip001 Spine").transform;
      bones[2] = GameObject.Find("Bip001 L Thigh").transform;
      bones[3] = GameObject.Find("Bip001 R Thigh").transform;
      bones[4] = GameObject.Find("Bip001 Head").transform;
      // Adaugă restul oaselor conform structurii tale de rig
      }
      You should put these in the right order and modify what it is to modify and in theory it should work.
      Good luck with your work ! Hope it helps !

    • @Equilibrier
      @Equilibrier 5 months ago +1

      @@aangigandhi I don't know if you still need this; I saw the comment today and tried to write two useful comments for you, but neither was posted by YouTube. I don't know how to get in touch, maybe you can find a way, since YouTube removes my comments without explanation.

    • @minhdungdo4541
      @minhdungdo4541 1 month ago +1

      @@Equilibrier Hi Equilibrier, I hope you have a great day. Could you share those two useful comments with me? I believe it would help me a lot in my project.

  • @krzysztofjarzyna4225
    @krzysztofjarzyna4225 2 years ago +2

    Amazing video. So glad I found your channel. Do you think it might be possible to remove camera movement? I'm thinking yes, because if you can track it, it should also be possible to create a sort of "mask" to apply to the created data set to modify the coordinates in each frame, i.e. apply the "negative camera movement" for each move it makes.
    Once again, thank you for the great content. Top explanation, great video quality. Instant sub.

  • @quarkblue
    @quarkblue 9 months ago

    Very informative video 🙏👍.
    BTW, a better way than dragging and dropping the objects one by one here at 27:16 is to add a few extra lines to the code:
    public GameObject body;
    public GameObject[] bodyLM;

    private void Awake()
    {
        // Auto-fill the landmark array from the children of the "Body" parent object.
        bodyLM = new GameObject[body.transform.childCount];
        for (int i = 0; i < body.transform.childCount; i++)
        {
            bodyLM[i] = body.transform.GetChild(i).gameObject;
        }
    }
    After adding this function, in Unity just drag and drop the "Body" parent object, keep the bodyLM list as is, and when you run the game the script will do its work 😋

  • @yaat9532
    @yaat9532 2 years ago

    Why does no one know about this channel? Your videos are so good, I can understand them clearly.

  • @hexagonproductions5466
    @hexagonproductions5466 2 years ago +1

    Hey! Amazing is all I can think of saying; easy to understand. I've been looking for something that actually works for a while, and I think a lot of people are. Maybe adding more cameras would improve the quality, as you said. No idea how it would be done lol, but sounds cool. Great as always o/

  • @wangshj2319
    @wangshj2319 2 years ago +3

    Can you generate the skeleton in python and Unity using Webcam in real-time?

  • @Segmentsoflives
    @Segmentsoflives 2 years ago

    You are a Swiss Army knife of technologies, boss.

  • @Long_Dong_Chong
    @Long_Dong_Chong 2 years ago

    Learning so much from your tutorials. Thanks for everything you produce.

  • @Zarrar2802
    @Zarrar2802 2 years ago

    Oh wow, just when I started looking up ML models for vision-based mocap, YT recommends me this. And right on time too.

  • @youtobebug
    @youtobebug 2 years ago +3

    It's a great job, thank you for sharing this video. Would you make an introduction or a tutorial on MoCap and the SMPL model for learning this topic further?

  • @zzzzzzzzzzzzzzzzzzzzzzzzzzzzzz
    @zzzzzzzzzzzzzzzzzzzzzzzzzzzzzz 2 years ago +2

    Can we just appreciate the time he puts into his videos?

  • @edslab5383
    @edslab5383 2 years ago

    Congratulations! It is a really interesting project and it could have many applications. It is a great achievement since commercial motion capture systems are too expensive.

  • @anantchandak9574
    @anantchandak9574 2 years ago

    This man always surprises me with his content.

  • @TrySomeCG
    @TrySomeCG 5 months ago

    You don't have to grab the spheres manually and put them in the array one by one: delete all the elements from the array, select all the spheres, and drag them onto the array field; it will auto-fill the array.

  • @MotuDaaduBhai
    @MotuDaaduBhai 2 years ago +1

    Awesome stuff! Is there a way to feed the motion data directly into an Unreal Engine model rig so the motion capture can be done in real time?

  • @kayumiy
    @kayumiy 2 years ago +6

    Can you apply this animation to any other 3D object avatar?

    • @spatil689
      @spatil689 1 year ago

      Can we do this??

    • @spatil689
      @spatil689 1 year ago

      If yes, please tell me.

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar.

  • @sanniu4270
    @sanniu4270 2 years ago +1

    Great job! Could you explain how to get the depth data for human joints? Is it estimated by a well-trained neural network?

  • @jorgebueno1891
    @jorgebueno1891 2 years ago

    Hey, thanks for having so much patience when teaching this kind of content.

  • @truestbluu
    @truestbluu 2 years ago +1

    This would be fun for VR.

  • @kurushitanaka5150
    @kurushitanaka5150 2 years ago +1

    I want to ask, what software do you use?

  • @edslab5383
    @edslab5383 2 years ago +2

    Can you make a video about estimating the true size of objects in a scene using multiple cameras through triangulation?

  • @tallslimpr
    @tallslimpr 2 years ago

    Fantastic video, well presented!

  •  2 years ago

    Too amazing to be real! Thanks for sharing this.

  • @musicalbirds2928
    @musicalbirds2928 2 years ago

    Great video as always. Thanks for sharing.

  • @MirageDev_
    @MirageDev_ 2 years ago

    Thanks a lot bro! Keep it up!

  • @parthchotaliya3468
    @parthchotaliya3468 2 years ago

    good tutorial, keep pushing!!!

  • @AHabib1080
    @AHabib1080 1 year ago +2

    It would be great if you could take this project one step further.
    For example:
    recording motion in real time with cameras around a room and sending the data to Unreal Engine / Unity.
    I don't know how hard it is, but I was really hoping you could teach us how to do it.

    • @twenmod
      @twenmod 1 year ago +1

      It is really easy to make this real time:
      just make the Python script read from the webcam,
      then make the Unity script read the file in Update instead of Start (a minimal sketch follows this thread).

    • @AHabib1080
      @AHabib1080 1 year ago +2

      @@twenmod How can I separate the bones data?
      Each part of the movement, like hands, legs, spine, head, neck, etc.?
      Is it possible?

    • @twenmod
      @twenmod 1 year ago +3

      @@AHabib1080 I don't get what you mean; what I'm doing to get the movement onto a character is setting up the points as IK targets.
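
    A minimal sketch of the real-time capture loop described above, in Python, assuming cvzone's PoseModule and the default webcam (index 0); overwriting a shared text file each frame, which the Unity script then re-reads in Update(), is just one possible hand-off and is an assumption rather than the author's exact setup:

    # Sketch: read poses from the webcam and rewrite the shared file every
    # frame (assumes cvzone's PoseModule; file-based hand-off to Unity is
    # only one possible approach).
    import cv2
    from cvzone.PoseModule import PoseDetector

    cap = cv2.VideoCapture(0)   # default webcam instead of a video file
    detector = PoseDetector()

    while True:
        success, img = cap.read()
        if not success:
            break
        img = detector.findPose(img)
        lmList, bboxInfo = detector.findPosition(img)
        if bboxInfo:
            lmString = ''
            for lm in lmList:
                # Each landmark entry ends with x, y, z; flip y so "up" is positive.
                x, y, z = lm[-3], img.shape[0] - lm[-2], lm[-1]
                lmString += f'{x},{y},{z},'
            # Overwrite with the latest frame; Unity reads this file in Update().
            with open('LiveAnimationFile.txt', 'w') as f:
                f.write(lmString)
        cv2.imshow("Image", img)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break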

  • @geralt9036
    @geralt9036 2 years ago

    thanks! you are a blessing

  • @DissyWorld
    @DissyWorld 2 years ago +1

    Can this project be made with VS Code or a Jupyter notebook?

  • @baveraby
    @baveraby 2 years ago

    Imagine VR full-body with this technology!

  • @pulinduvinnath1812
    @pulinduvinnath1812 2 years ago

    Great video. Thank you 👌

  • @FREEKER18
    @FREEKER18 2 years ago

    Very cool. I would like to see it with two cameras: processing from two cameras positioned at 90 degrees to each other.

  • @earnmoney2289
    @earnmoney2289 2 years ago

    thank you, great job.

  • @gurudaki
    @gurudaki 1 year ago

    Excellent tutorial!! I have a question, please...
    How can we insert an open-source avatar to replace the lines connecting the landmarks? I mean, have the landmarks on an avatar instead of lines.

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar.

  • @FreshCreamSoup
    @FreshCreamSoup 2 years ago +2

    Very helpful. How do we apply this movement to a 3D player model?

    • @swastikkarwa2507
      @swastikkarwa2507 2 years ago

      Did you have any luck with this?
      I need it very urgently.

    • @spatil689
      @spatil689 1 year ago

      Yeah, me too.

    • @FreshCreamSoup
      @FreshCreamSoup 1 year ago

      @@swastikkarwa2507 @Sneha Patil Yes, I asked ChatGPT to implement this feature; it's not perfect but still manageable.

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar.

    • @FreshCreamSoup
      @FreshCreamSoup 5 months ago

      @@aangigandhi Are you stuck on creating the avatar or on applying the animation data to the avatar?

  • @MarkTreeNewBee
    @MarkTreeNewBee 2 years ago +1

    Also, I want to know: can I get rotation parameters from cvzone and MediaPipe?

  • @CyroCh.
    @CyroCh. 2 years ago

    Hi, have you ever tried a project with the Xbox Kinect?

  • @flioink
    @flioink 2 years ago +1

    Alright, all done. Pretty cool, thanks.
    I'd like a hint on how to replicate this in Unreal Engine.
    I'm guessing the code from the second part won't be much different, but I'm not sure how to set it up inside the engine.

    • @TriSutrisnowapu
      @TriSutrisnowapu 2 years ago +1

      I wonder if there is a workflow for Unreal too. BTW, Unreal has support for Python scripting, so I assume there should be a way.

    • @flioink
      @flioink 2 years ago

      @@TriSutrisnowapu 100% there's a way, but I don't know which is the right class to use for such a thing.
      I also bet there's a way to influence the character's skeleton joints with those points.

  • @dukezacharia7881
    @dukezacharia7881 1 year ago

    Wow, nice work. Can I ask, can this be applied to 3D animal motion capture?

  • @moses5407
    @moses5407 1 year ago

    Will this work for motion analysis (not the animation portion) in real time, by changing the video source to an available camera instead of a stored video?

  • @joaocamiloulhoa4878
    @joaocamiloulhoa4878 2 years ago

    Amazing YouTube channel!!

  • @akhibali8405
    @akhibali8405 2 years ago

    Hi, great video!
    Can you please upload a video on taking body measurements using OpenCV,
    finding shoulder size, waist, etc.?

  • @nuriman432
    @nuriman432 9 months ago

    Is this the same as DeepMotion?

  • @werachaisrisupinanont3353
    @werachaisrisupinanont3353 2 years ago

    Amazing man

  • @mia_in_lab
    @mia_in_lab 1 year ago

    Thanks very much!! That's what I want to do!!

  • @llIIllIlIIllX_XIillIIllIIllIll
    @llIIllIlIIllX_XIillIIllIIllIll 2 years ago

    Hello, is it possible to improve the system to track facial and hand movements as well?

  • @madstudios1688
    @madstudios1688 2 years ago +1

    How can I make my own Python 3D camera tracker?
    I want to learn how to extract camera motion data from an image sequence and use it in a 3D program.

  • @葉沛成
    @葉沛成 2 years ago +1

    Hello, I want to know: if I apply this with real-time tracking, how can I collect the data into a stored txt file? How do I run that code? Because I think if I just swap the video link for the webcam in cv2.VideoCapture, I only get the immediate data, so I don't know how to do it. Another question: I can't wait to see you put this onto a 3D model, e.g. a VTuber, ha ha.

  • @rithika9442
    @rithika9442 2 years ago +1

    How do I do this in Unreal Engine 4?

  • @alvarobyrne
    @alvarobyrne 2 years ago

    Nice. Division is a costly operation; prefer multiplying by 0.1 to dividing by 10. You will only notice the difference on bigger data, but it's better to get used to it in all cases.

  • @gamer-tf2pe
    @gamer-tf2pe 2 years ago

    Thanks a lot man

  • @samsularefinsafi3448
    @samsularefinsafi3448 1 year ago

    This video is so amazing. Thank you, sir.
    Sir, I have a question here. Like your hand-tracking game, you did that in real time. I have a project on working out a stick model in real time, but I can't understand where I should change the Python code. I use Spyder as the Python interpreter. Can you please help me complete my project?

  • @antonolivaresjuanjose2992
    @antonolivaresjuanjose2992 2 years ago

    You are The BEST

  • @tahirullah4923
    @tahirullah4923 2 years ago

    Great, sir. Sir, can you make a video on distance estimation from the webcam to any reference point or image? Please make a video on this, thanks.
    I have done face and hand distance measurements.

  • @ewwkl7279
    @ewwkl7279 2 years ago

    Great tutorial. Thank you so much.
    I have tried to find the angle of the elbow but I got an error.
    angle = detector.findAngle(img, 11, 13, 15, draw=False)
    The error was:
    "Finds angle between three points. Inputs index values of landmarks
    instead of the actual points."
    How do I input the index values?

  • @vipinsou3170
    @vipinsou3170 2 years ago

    Which processor are you using, bro?

  • @MarcoSilvaJesus
    @MarcoSilvaJesus 1 year ago

    Can you give me the link to the motion capture program you used in this video?

  • @MarkTreeNewBee
    @MarkTreeNewBee 2 years ago

    Thank you for your video explanation, it helps me a lot. Just one more question bothers me: how can I map these red spheres onto a character to sync the motion in Unity3D?

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar. Please help.

  • @say_hon3y
    @say_hon3y 1 year ago

    Murtaza Sir,
    How can this be achieved with a humanoid 3D model? If you can make a short video on that, it would be helpful, as lots of people are searching for simple ways to do motion capture, and it would be great if we could do it with just a few lines of code like in this video.
    It was very helpful.
    Thank you

    • @aangigandhi
      @aangigandhi 5 months ago

      Did you find a solution for that? If yes, please guide me, it's an urgent requirement for my project. I successfully completed the webcam-feed-to-animation (made of spheres and lines) conversion but am stuck at creating the avatar.

  • @caroscheler
    @caroscheler 2 years ago

    Great tutorial! I'm wondering, is it possible to combine this with the video you created on hand tracking? That is, can I get 3D data on hands to bring into Unity? Or is this only possible with a video of a full body? Thanks!

    • @samsularefinsafi3448
      @samsularefinsafi3448 1 year ago

      Sir, did you work out how to detect the full body in Unity in real time? I am facing the same problem.

    • @caroscheler
      @caroscheler 1 year ago +1

      @@samsularefinsafi3448 Sorry, I needed only hands, so I haven't done any full-body tracking. I tracked them with the other tutorial and then used a plugin to record the movement of my hand in Unity and export it as FBX for use in Maya.

    • @matearkossy7894
      @matearkossy7894 10 months ago

      Hello! I'm currently trying to achieve that as well! Do you have any help with that maybe? @@samsularefinsafi3448

  • @pixelstate26
    @pixelstate26 2 years ago

    Does this work with any video? Sounds exciting.

  • @welsonfy5246
    @welsonfy5246 1 year ago

    Good job. Is it possible to capture multiple players?

  • @humayunkabir7866
    @humayunkabir7866 2 years ago

    Can we expect something for a Blender workflow?

  • @mohakbajaj4235
    @mohakbajaj4235 2 years ago

    loved it

  • @actionkey8042
    @actionkey8042 2 years ago

    AWESOME ))

  • @whitelee9630
    @whitelee9630 10 months ago

    There is a problem:
    INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
    Traceback (most recent call last):
      File "/Users/mac/PycharmProjects/MotionCapture/Data/MotionCap.py", line 9, in
        lmList, bboxInfo = detector.findPose(img)
    ValueError: too many values to unpack (expected 2)
    Process finished with exit code 1

  • @shagull9159
    @shagull9159 1 year ago

    Is it possible for me to record my video in sign language and have it capture the hand gesture motion as well?

  • @user-cw3nb8rc9e
    @user-cw3nb8rc9e 2 years ago

    Does your code link lead to the full program with a GUI?

  • @naiyraelkady8204
    @naiyraelkady8204 2 years ago

    How would you apply the annotations to a 3D model with an armature?

  • @NaveenSword
    @NaveenSword 2 years ago

    Can this be used to detect multiple people?

  • @wgalloPT
    @wgalloPT 2 years ago

    QUESTION: What would I have to do if I want a numerical value of, let's say, the waist dots' "Y" coordinate to be displayed at all times?

    • @wgalloPT
      @wgalloPT 2 years ago

      Never mind the question above, I figured it out. Thank you.

  • @temyraverdana6421
    @temyraverdana6421 2 years ago

    Wow. Thanks for sharing.

  • @ezhil182020
    @ezhil182020 2 years ago

    Can you please let me know what all the applications of this content are?

  • @priyanshubh
    @priyanshubh 2 years ago

    Please bring more such OpenCV-with-Unity videos; it's really helpful.

  • @vaishnavikothoori9250
    @vaishnavikothoori9250 1 month ago

    Which domain does this project come under?

  • @AiPhile
    @AiPhile 2 years ago +2

    Sir, I am working on the same project, but in Blender.

  • @zishventure4280
    @zishventure4280 2 years ago

    Vector3 is not being recognized, and I am getting the error: Index was outside the bounds of the array.

  • @mslmanni
    @mslmanni 1 year ago

    I can't register at the site properly. It's not sending the email for confirmation or password reset.

  • @infozy
    @infozy 2 years ago

    Which software is it?

  • @manavarora3686
    @manavarora3686 2 years ago

    Getting a NullReferenceException in the animation code at line 22:
    string[] points = lines[counter].Split(',');
    What should I do?

  • @BloxxerBoi_m134
    @BloxxerBoi_m134 2 years ago

    Does this work for Roblox animation? Like importing the file?

  • @mimARDev
    @mimARDev 1 year ago

    Sir, can I run this on Android and iOS devices?

  • @infinity2creation551
    @infinity2creation551 2 years ago

    You are awesome, bro 😍😍😍🥺🥺

  • @mrneonlightning132
    @mrneonlightning132 2 years ago

    Can we import this into Unreal Engine too?

  • @bharatbhaishah8406
    @bharatbhaishah8406 2 years ago

    41:55 I am getting an UnassignedReferenceException: the variable body of the animation code has not been assigned.
    How can I solve this error? Please tell me.

  • @t-raystudios
    @t-raystudios 2 years ago

    I like this, is there a tutorial where you can convert this to a .bvh?

  • @cyruslegg
    @cyruslegg 2 years ago

    I'm wondering how to view it in real time with matplotlib?

  • @minhdungdo4541
    @minhdungdo4541 1 month ago

    Hi Murtaza, I am a big fan. When I tried the code from the video, result = detector.findPose(img) only gives an array of 3 values [id, x, y] instead of [id, x, y, z] like what you got in the video. Do you know why this happens, or did I do something wrong?

  • @HuyNguyen-vp7eb
    @HuyNguyen-vp7eb 2 years ago

    Can it work with multiple people?

  • @minisreejith21
    @minisreejith21 2 years ago

    Which country do you live in? And where did you graduate? You are superb.

  • @haithinhtran5108
    @haithinhtran5108 1 year ago

    How do you make the skeleton work with a 3D model?

  • @davidzhong
    @davidzhong 1 year ago

    What about the sounds?