Digging into the Meta Depth API

  • Published: 26 Dec 2024

Comments • 20

  • @LucasRizzotto 11 months ago

    Oh hey Jared! Awesome seeing you here, thanks for the video!

  • @poolguymsj 11 months ago

    I'm loving videos like this!! Especially how you go deeper into how it works and what that means for how it's used.
    I've got a hand-tracking game currently on OpenXR, and I might have to switch to Meta XR for passthrough (because of a URP bug), so depth might be something I can add.

  • @haraldgundersen7303 1 year ago

    Is Shader Graph able to support depth functionality?

    • @jbienz 1 year ago +1

      Theoretically yes, since the depth is exposed as a RenderTexture.
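As a rough illustration, a Custom Function node body for sampling that depth could look like the sketch below. The global texture name `_EnvironmentDepthTexture` and its layout as a per-eye Texture2DArray are assumptions based on the Unity-DepthAPI samples, so verify them against your package version:

```hlsl
// Custom Function node body (HLSL) for Shader Graph.
// ASSUMPTION: Meta's Depth API binds the environment depth globally as
// _EnvironmentDepthTexture, a Texture2DArray with one slice per eye.
Texture2DArray _EnvironmentDepthTexture;
SamplerState sampler_EnvironmentDepthTexture;

void SampleEnvironmentDepth_float(float2 uv, float eyeIndex, out float depth)
{
    // Raw, non-linear depth in [0, 1]; linearize before comparing it
    // against scene depth for occlusion.
    depth = _EnvironmentDepthTexture.SampleLevel(
        sampler_EnvironmentDepthTexture, float3(uv, eyeIndex), 0).r;
}
```

In Shader Graph, `uv` would typically come from a Screen Position node and `eyeIndex` from the stereo eye index (`unity_StereoEyeIndex` in stereo rendering), but the exact wiring depends on your render pipeline setup.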

  • @SimonDarksideJ 1 year ago

    Great stuff and a fantastic video, Jared. Any chance of sharing the repo with your sample included and the preferred project setup?

    • @jbienz 1 year ago +3

      Hey Simon. Sorry it took me so long to get back to you. Unfortunately, I didn't fork the upstream project before making my changes, and by the time I got around to it they had made some changes of their own. But I have now updated everything on my own fork, and you can try these changes by pulling this branch:
      github.com/SolerSoft/Unity-DepthAPI/tree/Feature/DepthView

    • @jbienz 1 year ago +1

      P.S. All of my modifications are in a root folder called 'Custom'. I also made my own scene so I didn't break the original. Note that the cameras in OVRCameraRig had to be modified to render the depth map; Unity doesn't do this by default.

  • @CaedesCZ 11 months ago

    Hello,
    I want to ask: I'm using your example code, but I can't get the Depth API to work in the Editor. I have experimental features enabled and passthrough via Quest Link enabled, but I keep getting a message that the depth texture is not available. It doesn't even show up in the Unity passthrough texture.
    I don't know what I could have missed; do you have any tips?

    • @jbienz 11 months ago

      Hey CaedesCZ, sorry, I was at MIT Reality Hack. Unfortunately, last I checked the depth texture isn't available in Unity. I don't even think it's available with the Meta simulator running. I believe the only way to test is to build and deploy to device. This is a preview feature for now, so the ability to test in editor may be added later.

    • @guybrush3000 9 months ago

      @jbienz Oooh, I was racking my brain over this. I've had no problem putting the occlusion into my shaders, but I couldn't get the depth texture in Unity. So it should work with a build? OK, I'll try that now...

  • @fungameplayer5005 6 months ago

    What exactly is the shader doing on the materials that show the depth texture? Is that a custom shader or something built-in? I can't seem to find it.

    • @jbienz 6 months ago

      How the depth shaders work is explained at 10:56 in this video. Did you have a specific question?

  • @BogomilPetrov 1 year ago

    Would there be any possible way to somehow convert the depth texture data to a mesh?

    • @G3David 1 year ago

      I had a similar idea. We already have hand tracking, so before we do soft occlusion we should start by putting a passthrough shader on the mesh generated by the player's hand model; when we get full upper-body tracking, do the same thing, then tweak it from there.
      Conceivably, we might be able to do that method (at least for hands) now, without the Depth API.

    • @jbienz 1 year ago +1

      Theoretically you could generate a quad with the equivalent of a height map. This could also be done with a displacement shader. But that would be a rectangular surface either "bumped up" or "dug in" by the depth map. I'm curious what your use case would be?
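As a rough illustration of the quad-plus-height-map idea, a vertex stage could displace a finely subdivided quad by the depth sampled at each vertex. The property names `_DepthTex` and `_DisplaceScale` below are hypothetical placeholders for this sketch, not part of any Meta API:

```hlsl
// Hypothetical sketch: displace the vertices of a tessellated quad
// along its normal by the depth sampled at each vertex's UV.
// _DepthTex and _DisplaceScale are placeholder material properties.
sampler2D _DepthTex;
float _DisplaceScale;

void DisplaceByDepth_float(float3 positionOS, float2 uv, float3 normalOS,
                           out float3 displacedOS)
{
    // Vertex stage, so sample an explicit mip level with tex2Dlod.
    float d = tex2Dlod(_DepthTex, float4(uv, 0, 0)).r;
    // This yields a relief surface "bumped up" or "dug in" by the depth
    // map, as described above; it is not a true reconstructed mesh.
    displacedOS = positionOS + normalOS * d * _DisplaceScale;
}
```

Since the result is still a single rectangular surface, anything that needs real geometry (collisions, occlusion from behind) would require actual mesh reconstruction, which is what the Scene API provides instead.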

    • @BogomilPetrov 1 year ago

      @G3David That is a pretty standard solution; Meta has had samples doing this for quite a while.
      Their basic World Beyond demo uses this.
      If you have the Meta SDK in Unity, there is a hands passthrough shader, plus materials and examples.
      It's not a very good system... the tracking often doesn't match up perfectly, and it lags behind the camera feed.

    • @BogomilPetrov 1 year ago +1

      @jbienz Yeah, that makes complete sense.
      I get why they went with the Scene API method and mesh.
      You would have depth in the mesh, but nothing could go behind the objects.
      I guess we'll never get Depth API collisions because of that: anything occluding would prevent things from passing behind it.
      If they at least enabled the depth sensor for the initial mesh generation, it would be a lot less "coarse" than what they have at the moment with just optical flow.
      I'm surprised they only use stereo optical flow when scanning the scene and not the depth sensor at all... I guess that comes soon.

  • @fire17102 1 year ago

    Very nice, but you still have to do proper background removal after getting the ROI of the hands.

    • @jbienz 1 year ago

      Thank you for your comment! Can you explain a bit more what you mean? I'm not sure I understand the acronym ROI here?

  • @samrun0 1 year ago

    Very cool 👍