Unity XR Hands Shapes & Poses - Create Custom Gestures with Gesture Building Blocks

  • Published: Feb 4, 2025

Comments • 24

  • @osmanDemir104 · 1 year ago · +3

    Thanks for the video.
    Can we detect the hand pose that is meant to hold a specific object? For example, detecting whether the user is holding a pen correctly. Or detecting whether the user is holding a drill correctly.

    • @blackwhalestudio · 1 year ago · +1

      That could definitely be possible! However, it would probably require very accurate finger shapes and lower tolerance values.
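
      A minimal C# sketch of how a detected "pen grip" shape could drive feedback, assuming the detected gesture raises performed/ended UnityEvents that can be wired to these methods in the Inspector; the class, method, and field names here are illustrative, not part of the XR Hands API.

      using UnityEngine;

      // Hypothetical listener for a strict "pen grip" hand shape.
      // Wire the gesture's performed/ended UnityEvents to
      // OnPenGripPerformed / OnPenGripEnded in the Inspector.
      public class PenGripFeedback : MonoBehaviour
      {
          // Illustrative feedback object, e.g. a highlight or UI hint.
          [SerializeField] GameObject correctGripIndicator;

          // Called when the tightened hand shape matches.
          public void OnPenGripPerformed()
          {
              if (correctGripIndicator != null)
                  correctGripIndicator.SetActive(true);
          }

          // Called when the hand drifts outside the shape's tolerances.
          public void OnPenGripEnded()
          {
              if (correctGripIndicator != null)
                  correctGripIndicator.SetActive(false);
          }
      }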

  • @gagagu01 · 3 months ago

    Hi, thanks for this tutorial. I've got it working on my Apple Vision Pro: the gesture is detected, but I don't know how to grab something with that gesture. Is there any tutorial available for this?

  • @Besttechnology · 1 year ago · +1

    Hi sir, thanks for the video.

  • @MuhammadRafay-gg6gi · 11 months ago · +1

    Great explanation. How can I grab an interactable object with my new custom pose/shape? I want the object to only be grabbable with the custom pose. Can you provide a link to a video if you have already covered this? I have watched the previous setup video, and although it mentions the input action map, I couldn't extract the needed information from it as I'm quite unfamiliar with it. Can you help me out in this regard? Thanks

    • @gagagu01 · 3 months ago

      Hi, have you found a solution for this? I'm looking for the same info.
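
      A hedged sketch of one way to gate grabbing on a custom pose, assuming the object uses an XR Interaction Toolkit XRGrabInteractable (XRI 2.x namespace) and that the custom gesture's performed/ended UnityEvents are wired to the two public methods below in the Inspector; the names are illustrative, not an official recipe.

      using UnityEngine;
      using UnityEngine.XR.Interaction.Toolkit;   // XRI 2.x namespace assumed

      // Hypothetical sketch: only allow grabbing while the custom pose is held.
      public class PoseGatedGrab : MonoBehaviour
      {
          [SerializeField] XRGrabInteractable grabInteractable;

          void Awake()
          {
              // Start disabled so the default pinch/grip cannot grab the object.
              grabInteractable.enabled = false;
          }

          // Wire to the custom gesture's "performed" event.
          public void OnCustomPosePerformed() => grabInteractable.enabled = true;

          // Wire to the custom gesture's "ended" event (also releases the object).
          public void OnCustomPoseEnded() => grabInteractable.enabled = false;
      }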

  • @yavalang · 1 year ago · +1

    Thanks for the info. One question: how do you perform the Shake pose? I tried waving with my fingers, palm, and other movements, but got no feedback on the shake button... BTW, I'm using a Quest 3.

    • @blackwhalestudio · 1 year ago

      We can only design static gestures at this point, unfortunately! So for shaking, you would need a kind of half-closed shape on each hand and then maybe check for collision between the two hands, or come up with a different solution.

    • @poolguymsj · 1 year ago · +2

      I too read it as "shake" at first... but it actually says Shaka: thumb and pinky out.

    • @yavalang · 1 year ago

      @poolguymsj Ohhhh, well done, you're right, thank you, it works well!!!
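
      For anyone who does want a dynamic two-hand gesture (an actual shake rather than the static Shaka), here is a rough sketch of the collision-based workaround suggested above: a static half-closed pose on each hand plus a trigger check when the hands meet. It assumes a trigger collider on each hand, a kinematic Rigidbody on at least one of them so triggers fire, and a "Hand" tag; all of that is illustrative, not part of XR Hands.

      using UnityEngine;
      using UnityEngine.Events;

      // Rough sketch: raise an event when the two (posed) hands touch.
      [RequireComponent(typeof(Collider))]
      public class TwoHandContactGesture : MonoBehaviour
      {
          [SerializeField] string otherHandTag = "Hand";  // assumed tag on the other hand's collider
          public UnityEvent onHandsTouched;               // e.g. treat this as the "shake" trigger

          void OnTriggerEnter(Collider other)
          {
              if (other.CompareTag(otherHandTag))
                  onHandsTouched?.Invoke();
          }
      }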

  • @funboys1000 · 1 year ago · +1

    When you redirect us to a different video of yours (like you did at 2:17), please add the video link in the description.

    • @blackwhalestudio · 1 year ago

      Hi! I did, the link is always in the top right corner!
      That's the video: ruclips.net/video/uGq10Tcl3Ns/видео.html&t

    • @funboys1000 · 1 year ago

      Thank you

  • @zzw6299 · 7 months ago

    I wanted to run it on HoloLens, but encountered a different configuration issue.

  • @giuseppelauria8288 · 1 year ago · +1

    Very good video.
    Can I use XR Hands in my MR project with the Meta Quest 3?

  • @funboys1000 · 1 year ago

    Why are the GUI elements only visible in one eye?! :'(

  • @AbelMicroobjects · 8 months ago

    Can we do this on the Vision Pro?

    • @blackwhalestudio · 8 months ago

      yes!

    • @vikneshkamal · 4 months ago

      @blackwhalestudio I tried using a custom hand gesture to activate teleportation on the Apple Vision Pro, but the gestures are not detected. Can you point me to any resource that could help achieve this?

  • @Cool浩 · 1 year ago

    Can Pico be used?

    • @blackwhalestudio · 1 year ago

      I'm not sure if Pico supports hand tracking as well as OpenXR, but if so, this should work, yes!