LSTM - Detect hand actions

  • Published: 26 Aug 2024
  • This video is a demonstration of an LSTM network detecting whether my hand has grasped an object or not, in this case a Rubik's cube. The network was trained on 4 different datasets (3 grasp techniques + 1 neutral), with a total of 4,000 samples.
    If you're interested, check out the GitHub repo of the work: github.com/thi...
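
    The description doesn't show the model itself, so here is a minimal sketch of this kind of sequence classifier in Keras. The 30-frame window, the 21-landmark (x, y, z) feature layout, and all hyperparameters are assumptions for illustration, not the repo's actual code:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    SEQ_LEN = 30         # frames per sample (assumed window size)
    N_FEATURES = 21 * 3  # 21 hand landmarks, (x, y, z) each
    N_CLASSES = 4        # neutral, rest, gripping, carrying (per the comments)

    model = Sequential([
        LSTM(64, return_sequences=True, input_shape=(SEQ_LEN, N_FEATURES)),
        LSTM(32),
        Dense(32, activation="relu"),
        Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # X: (n_samples, SEQ_LEN, N_FEATURES) landmark sequences, y: class ids 0..3
    X = np.random.rand(8, SEQ_LEN, N_FEATURES).astype("float32")  # dummy data
    y = np.random.randint(0, N_CLASSES, size=8)
    model.fit(X, y, epochs=1, verbose=0)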

Comments • 8

  • @lexx-dat • 2 months ago

    I recommend using color differentiation between the fingers, which allows for more precise model predictions :)
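
    One way to read this suggestion: render each finger's landmarks in its own color, so per-finger tracking errors are easy to spot. A minimal sketch; the index groups below are MediaPipe's standard 21-point hand layout, while the colors and the function name are arbitrary:

    import cv2

    # Landmark index groups per finger in MediaPipe's 21-point hand model,
    # paired with an arbitrary BGR color for each finger.
    FINGERS = {
        "thumb":  ([1, 2, 3, 4],     (255, 0, 0)),
        "index":  ([5, 6, 7, 8],     (0, 255, 0)),
        "middle": ([9, 10, 11, 12],  (0, 0, 255)),
        "ring":   ([13, 14, 15, 16], (255, 255, 0)),
        "pinky":  ([17, 18, 19, 20], (255, 0, 255)),
    }

    def draw_colored_fingers(frame_bgr, hand_landmarks):
        """Draw each finger's landmarks on the frame in that finger's color."""
        h, w = frame_bgr.shape[:2]
        for idxs, color in FINGERS.values():
            for i in idxs:
                p = hand_landmarks.landmark[i]
                cv2.circle(frame_bgr, (int(p.x * w), int(p.y * h)), 4, color, -1)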

  • @Rehan0011-dc6ec • 2 months ago

    You have used 2 classes, like grasped and not grasped. Can we use the same code for multiple classes with moving hands?

    • @thieulong0612 • 2 months ago

      I actually have 4 classes: neutral, rest, gripping, carrying, but in the results I labelled them down to only 2 classes: grasped (carrying, gripping) and not grasped (neutral, rest). I recommend training a custom dataset for your application, because this approach very likely depends on the camera angle and point of view. That way the setup stays fully customizable, and you can detect 2 hands as well; my demo only detects 1 hand at a time, which may limit your goal. But yes, the same code, with a few modifications based on your total number of classes and how you label them, can be used for multiple classes with moving hands (a sketch of the label mapping follows below).
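
      A sketch of what that relabelling could look like; the class names come from the reply above, everything else is an assumption rather than the repo's code:

      CLASS_NAMES = ["neutral", "rest", "gripping", "carrying"]
      GRASPED = {"gripping", "carrying"}

      def display_label(class_id: int) -> str:
          """Map a predicted class id to the 2-label scheme shown in the video."""
          name = CLASS_NAMES[class_id]
          return "grasped" if name in GRASPED else "not grasped"

      # For a custom multi-class setup, change CLASS_NAMES and size the final
      # softmax layer to match: Dense(len(CLASS_NAMES), activation="softmax")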

    • @Rehan0011-dc6ec • 2 months ago

      @thieulong0612 OK, thanks a lot

  • @avivlohiet9839 • 3 months ago

    Did you use MediaPipe as the ground truth for the classification model?

    • @thieulong0612 • 3 months ago +1

      Yes, I used the landmarks from MediaPipe.
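
      For reference, a minimal sketch of pulling those landmarks per frame with MediaPipe Hands; the single-hand limit matches the demo, but the webcam loop and the flattening into a 63-value feature vector are assumptions:

      import cv2
      import numpy as np
      import mediapipe as mp

      def landmarks_from_frame(frame_bgr, hands):
          """Return a flat (63,) array of x, y, z per landmark, or zeros if no hand."""
          results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
          if results.multi_hand_landmarks:
              lm = results.multi_hand_landmarks[0]  # demo tracks one hand
              return np.array([[p.x, p.y, p.z] for p in lm.landmark],
                              dtype="float32").flatten()
          return np.zeros(21 * 3, dtype="float32")

      cap = cv2.VideoCapture(0)
      with mp.solutions.hands.Hands(max_num_hands=1,
                                    min_detection_confidence=0.5) as hands:
          ok, frame = cap.read()
          if ok:
              print(landmarks_from_frame(frame, hands).shape)  # (63,)
      cap.release()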

    • @avivlohiet9839 • 3 months ago

      @thieulong0612 well done

    • @thieulong0612 • 3 months ago

      @avivlohiet9839 thank you 🥰