visionOS Tutorial: Hand Tracking, Scene Reconstruction and Real World Interactions with ARKit

  • Published: Sep 4, 2024
  • We are going to build an app for Apple Vision Pro that allows you to place cubes on real-world objects, using the position of your left fingertip as a pointer and your right hand as a trigger. This is possible using ARKit in a visionOS Immersive Space.
    ➡️ Tutorial Files: / visionos-hand-100990554
    👏 Support me on Patreon: / brianadvent
    ➡️ Web: www.brianadvent...
    ✉️ COMMENTS ✉️
    If you have questions about the video or Cocoa programming, please comment below.
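The setup the description outlines (an ARKit session with hand tracking running inside an Immersive Space) can be sketched roughly as follows. This is a minimal sketch, not the tutorial's actual code; the function name `startHandTracking` and the gesture comments are illustrative.

```swift
import ARKit

// Sketch: run hand tracking in a visionOS Immersive Space.
// Requires the hand-tracking usage description in Info.plist,
// and only delivers anchors while an Immersive Space is open.
let session = ARKitSession()
let handTracking = HandTrackingProvider()

func startHandTracking() async {
    do {
        try await session.run([handTracking])
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            switch anchor.chirality {
            case .left:
                // e.g. use the left index fingertip as the pointer
                break
            case .right:
                // e.g. interpret a right-hand gesture as the trigger
                break
            }
        }
    } catch {
        print("Failed to start ARKit session: \(error)")
    }
}
```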

Comments • 23

  • @vettorazi
@vettorazi 5 months ago +2

    Basically everything I wanted to know about ARKit and Vision Pro in one single video!

  • @swapnilxi
@swapnilxi 2 months ago +1

    Please make more such videos, interaction of hands and object. Absolutely loved this ❤

  • @paulikhane
@paulikhane 5 months ago +2

It's been a while. You were among the YouTubers who inspired me when I was starting out with Swift and iOS many years ago. Good to hear your voice again.

    • @BrianAdvent
@BrianAdvent  5 months ago +1

      Thank you so much for sharing this!

    • @whiteruski
@whiteruski 4 months ago

      Same here

  • @samentyevtriboy4731
@samentyevtriboy4731 5 months ago +1

    A very educational lesson.
    Thanks for your efforts.
Please continue the topic about visionOS.

  • @gfunkadelicAus
@gfunkadelicAus A month ago

    Love this, really hope you're just busy experimenting with Vision Pro and we see more from you again soon Brian. Would love to see building on some of these concepts with being able to drag, rotate and resize the objects and being able to pick from a range of shapes or objects and load them into the same scene.
    Also keen to see if real world occlusion works where if the cube goes under the table then you can't see it until you look under the table.
    All things I can research and hopefully build onto this great starter in the meantime. Thanks again and can't wait to see more!

  • @jaimemartinez-yt2iv
@jaimemartinez-yt2iv 2 months ago

Great tutorial, looking forward to more of them.

  • @jokosalsa
@jokosalsa 4 months ago +1

    Fantastic video tutorial!

  • @davidthomas5562
@davidthomas5562 5 months ago

    I’m glad you’re back.

  • @user-ko5cq8tq9e
@user-ko5cq8tq9e 4 months ago

It's fantastic, and would you like to upload one about object detection in visionOS? It would be great :)

  • @jy-sk4vv
@jy-sk4vv 3 months ago

Fantastic video. I am looking into how to wear an ornament/watch around the wrist. Is there any way?

  • @stancartmankenny
@stancartmankenny 2 months ago

    17:48 - you say "once we reached this next line of code, we know the fingerTip is tracked" but all you know at that point is that it is non-nil. Should it actually be tracked, or is it enough to just be non-nil?
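The distinction the commenter raises is real: a non-nil joint is not the same as a tracked one. In visionOS ARKit, both `HandAnchor` and each `HandSkeleton.Joint` expose an `isTracked` flag, so a stricter check might look like this (a sketch, with the function name assumed, not the tutorial's code):

```swift
import ARKit

// Sketch: only treat the fingertip as valid when the hand anchor
// and the joint itself both report being tracked, not merely non-nil.
func fingertipTransform(from anchor: HandAnchor) -> simd_float4x4? {
    guard anchor.isTracked,
          let skeleton = anchor.handSkeleton else { return nil }
    let joint = skeleton.joint(.indexFingerTip)
    guard joint.isTracked else { return nil }
    // Convert the joint's hand-relative transform into world space.
    return anchor.originFromAnchorTransform * joint.anchorFromJointTransform
}
```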

  • @itzcreepyog
@itzcreepyog 4 months ago

    Hello, how do I get the RealityKitContent package?

  • @hygisonbrandao9251
@hygisonbrandao9251 4 months ago

Can you give me the link for that code? It is somewhere on Apple's website, right?

  • @QiOS283
@QiOS283 3 months ago

    Created a ModelEntity with a USDZ file and placed it on the wrist. The ModelEntity is so large that it cannot be scrolled, scaled, or moved. How can I set the size or frame of the ModelEntity?

    • @namtranhhgames
@namtranhhgames 3 months ago

      I guess you can scale it down using something like this `entity.scale = SIMD3(x: 0.01, y: 0.01, z: 0.01)`
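Setting `scale` directly works when the factor is known. If the USDZ model's native size is unknown, one option is to derive the factor from the entity's bounds; `fit(_:to:)` below is a hypothetical helper, not API from the video:

```swift
import RealityKit

// Hypothetical helper: uniformly scale an entity so that its largest
// dimension equals targetSize (in meters).
func fit(_ entity: Entity, to targetSize: Float) {
    let bounds = entity.visualBounds(relativeTo: nil)
    let maxExtent = max(bounds.extents.x, bounds.extents.y, bounds.extents.z)
    guard maxExtent > 0 else { return }
    entity.scale *= SIMD3<Float>(repeating: targetSize / maxExtent)
}

// Usage: shrink a watch model to roughly 5 cm before attaching it.
// fit(watchEntity, to: 0.05)
```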

  • @tianyuanzhang7111
@tianyuanzhang7111 4 months ago

Hello, I built this on my Vision Pro. I am able to click 'Start Tracking', but there is no blue dot on my hands. I am wondering whether I missed something in the Vision Pro setup or anything else? I used the files from you. 🥺

    • @BrianAdvent
@BrianAdvent  4 months ago

It is possible that your hand occludes the blue dots. Try adding .upperLimbVisibility(.hidden) as a modifier to the RealityView

  • @Nealcar30489
@Nealcar30489 5 months ago

    Are we allowed to use Apple's DragRotationModifier for our own apps that we plan on selling?

    • @BrianAdvent
@BrianAdvent  5 months ago +2

      You can use it in commercial products as long as you include the licence text Apple provided: "The Apple Software is provided by Apple on an "AS IS" basis. APPLE
      MAKES NO WARRANTIES, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION
      THE IMPLIED WARRANTIES OF NON-INFRINGEMENT, MERCHANTABILITY AND FITNESS
      FOR A PARTICULAR PURPOSE, REGARDING THE APPLE SOFTWARE OR ITS USE AND
      OPERATION ALONE OR IN COMBINATION WITH YOUR PRODUCTS.
      IN NO EVENT SHALL APPLE BE LIABLE FOR ANY SPECIAL, INDIRECT, INCIDENTAL
      OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
      SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
      INTERRUPTION) ARISING IN ANY WAY OUT OF THE USE, REPRODUCTION,
      MODIFICATION AND/OR DISTRIBUTION OF THE APPLE SOFTWARE, HOWEVER CAUSED
      AND WHETHER UNDER THEORY OF CONTRACT, TORT (INCLUDING NEGLIGENCE),
      STRICT LIABILITY OR OTHERWISE, EVEN IF APPLE HAS BEEN ADVISED OF THE
      POSSIBILITY OF SUCH DAMAGE."

  • @arthuralvarez7253
@arthuralvarez7253 5 months ago

    Is there any way to test these features on the Vision Pro Simulator?

    • @BrianAdvent
@BrianAdvent  5 months ago

Unfortunately, hand tracking can only be tested on the device itself at the moment.