Oh hey Jared! Awesome seeing you here, thanks for the video!
I'm loving videos like this!! Especially how you go deeper into how it works and what that means for how it's used.
I've got a hand tracking game currently in OpenXR, and I might have to switch to Meta XR for passthrough (because of a URP bug), so depth might be something I can add.
Is Shader Graph able to support depth functionality?
Theoretically yes, since depth is a RenderTexture.
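For example, here's a minimal sketch of pushing a depth RenderTexture into a Shader Graph material from C#. The `_DepthTex` property name and the `depthTexture` field are illustrative, not part of the Depth API:

```csharp
using UnityEngine;

// Hypothetical sketch: bind a depth RenderTexture to a Shader Graph material.
public class DepthToShaderGraph : MonoBehaviour
{
    [SerializeField] private RenderTexture depthTexture;    // assumed depth source
    [SerializeField] private Material shaderGraphMaterial;  // material built from a graph

    void Update()
    {
        // A Texture2D property in the graph whose Reference is "_DepthTex"
        // can then be sampled like any other texture.
        shaderGraphMaterial.SetTexture("_DepthTex", depthTexture);
    }
}
```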
Great stuff and a fantastic video, Jared. Any chance of sharing the repo with your sample included and the preferred project setup?
Hey Simon. Sorry it took me so long to get back to you. Unfortunately I didn't fork the upstream project before making my changes, and by the time I got around to it they had made some changes. But I have now updated everything on my own fork and you can try these changes by pulling this branch:
github.com/SolerSoft/Unity-DepthAPI/tree/Feature/DepthView
P.S. All of my modifications are in a root folder called 'Custom'. I also made my own scene so I didn't break the original. Note that the cameras in OVRCameraRig had to be modified to render the Depth Map. Unity doesn't do this by default.
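In case it's useful, the gist of that camera change is just opting the rig's cameras into generating a depth texture. A minimal sketch (Built-in pipeline shown; in URP the Depth Texture option on the pipeline asset or camera plays the same role), assuming this component sits on the rig:

```csharp
using UnityEngine;

// Sketch of the kind of change described above: ask every camera under the
// rig to also render a depth texture, which Unity doesn't do by default.
public class EnableRigDepth : MonoBehaviour
{
    void Start()
    {
        foreach (var cam in GetComponentsInChildren<Camera>())
        {
            cam.depthTextureMode |= DepthTextureMode.Depth;
        }
    }
}
```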
Hello,
I want to ask: I'm using your example code, but I can't get the Depth API to work in the Editor. I have experimental features enabled and passthrough via Quest Link enabled, but I keep getting a message that the depth texture is not available. It doesn't even show up in the Unity passthrough texture.
I don't know what I could have missed. Do you have any tips?
Hey CaedesCZ, sorry, I was at MIT Reality Hack. Unfortunately, last I checked, the depth texture isn't available in the Unity Editor. I don't think it's available even with the Meta simulator running. I believe the only way to test is to build and deploy to the device. This is a preview feature for now, so the ability to test in the Editor may be added later.
@jbienz Oooh, I was racking my brain over this. I've had no problem putting the occlusion into my shaders, but I couldn't get the depth texture in Unity. So it should work in a build? OK, I'll try that now...
What exactly is the shader doing on the materials that show the depth texture? Is that a custom shader or something built-in? I can't seem to find it.
How the depth shaders work is explained at 10:56 in this video. Did you have a specific question?
Would there be any possible way to somehow convert the depth texture data to a mesh?
I had a similar idea. We already have hand tracking, so before we do soft occlusion we should start by putting a passthrough shader on the mesh generated by the player's hand model, and when we get full upper-body tracking, the same thing, then tweak it from there.
Conceivably, we might be able to use that method (at least for hands) now, without the Depth API.
Theoretically you could generate a quad with the equivalent of a height map. This could also be done with a displacement shader. But that would be a rectangular surface either "bumped up" or "dug in" by the depth map. I'm curious what your use case would be?
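As a rough sketch of that height-map idea (assuming you have a CPU-readable Texture2D copy of the depth map; a GPU-only RenderTexture would first need a ReadPixels or compute pass):

```csharp
using UnityEngine;

// Illustrative sketch: build a grid mesh and push each vertex along Z by the
// depth value sampled at that point, turning the depth map into a heightmap.
[RequireComponent(typeof(MeshFilter))]
public class DepthHeightmapMesh : MonoBehaviour
{
    [SerializeField] private Texture2D depthMap;   // assumed CPU-readable copy
    [SerializeField] private int resolution = 64;  // grid vertices per side
    [SerializeField] private float depthScale = 1f;

    void Start()
    {
        var verts = new Vector3[resolution * resolution];
        var uvs = new Vector2[verts.Length];

        for (int y = 0; y < resolution; y++)
        for (int x = 0; x < resolution; x++)
        {
            float u = x / (float)(resolution - 1);
            float v = y / (float)(resolution - 1);
            // Sample depth (assumed to be in the red channel) and displace.
            float d = depthMap.GetPixelBilinear(u, v).r;
            verts[y * resolution + x] = new Vector3(u - 0.5f, v - 0.5f, d * depthScale);
            uvs[y * resolution + x] = new Vector2(u, v);
        }

        // Two triangles per grid cell.
        var tris = new int[(resolution - 1) * (resolution - 1) * 6];
        int t = 0;
        for (int y = 0; y < resolution - 1; y++)
        for (int x = 0; x < resolution - 1; x++)
        {
            int i = y * resolution + x;
            tris[t++] = i; tris[t++] = i + resolution; tris[t++] = i + 1;
            tris[t++] = i + 1; tris[t++] = i + resolution; tris[t++] = i + resolution + 1;
        }

        var mesh = new Mesh { vertices = verts, uv = uvs, triangles = tris };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

The same displacement could run per frame in a vertex shader instead, which would avoid the CPU readback entirely.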
@G3David That is a pretty standard solution. Meta has had samples doing this for quite a while.
Their World Beyond demo uses this.
If you have the Meta SDK in Unity, there is a hands passthrough shader along with materials and examples.
It's not a very good system... the tracking often doesn't match up perfectly, and it lags behind the camera feed.
@jbienz Yeah, that makes complete sense.
I get why they went with the Scene API method and mesh.
You would have depth in the mesh, but nothing could go behind the objects.
I guess we'll never get Depth API collisions because of that: anything occluding would keep things from passing behind it.
If they at least enabled the depth sensor for the initial mesh generation, it would be a lot less "coarse" than what they have ATM with just optical flow.
I'm surprised they use only stereo optical flow ATM when scanning the scene and not the depth sensor at all... I guess that comes soon.
Very nice, but you still have to do proper background removal after getting the ROI of the hands.
Thank you for your comment! Can you explain a bit more what you mean? I'm not sure I understand the acronym ROI here.
Very cool 👍