Using the ARKit plugin from the Unity Asset Store, I was able to see the face anchor output by my iPhone X depth camera. This blog was very helpful: blogs.unity3d....
Hi, nice video. Out of curiosity, is it possible to get and isolate the iPhone depth image with this method? Any nudge in the right direction would help. Just trying to find where to start, thanks!
Well, much like the old Xbox Kinect, there is a projector that sends out an infrared dot pattern onto your face. The dots are much finer than the Kinect's. The camera then reads depth from that point cloud, converts that information into a list of numbers and locations in a fraction of a second, and converts that list into the face mesh. Fascinating process, all in the blink of an eye.
There is an app called “Face Cap” for the iPhone. It might have what you need.
Face Scan is another one too.
@@tinkerking6785 very cool, thanks. I was really interested in Unity/iPhone integration and what can be accessed. I am working on a point-cloud engine and was interested in real-time capture to my format, so your video sparked my interest. I will look into "Face Cap", thanks. Maybe it's time to get into the iOS API, but if it's already accessible in Unity, that would save some time, I guess.
@@tinkerking6785 cool, thanks!
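Not from the video itself, but a pointer for anyone starting on the depth-image question above: in an ARKit face-tracking session, each ARFrame exposes capturedDepthData (an AVDepthData from the TrueDepth camera). Once you pull the raw Float32 depth values (in meters) out of its pixel buffer, "isolating" the depth image is mostly a normalization step. A minimal sketch in plain Swift, with the depth range values assumed for illustration:

```swift
import Foundation

// Sketch: normalize raw TrueDepth values (meters) to 8-bit grayscale so
// the depth buffer can be viewed or saved as an image. In a real app the
// Float32 values come from ARFrame.capturedDepthData's CVPixelBuffer
// (kCVPixelFormatType_DepthFloat32); a plain array is used here so the
// logic runs anywhere. The near/far range (0.1 m – 1.0 m) is an assumed
// working range for a face in front of the phone, not an ARKit constant.
func depthToGrayscale(_ depth: [Float], near: Float = 0.1, far: Float = 1.0) -> [UInt8] {
    depth.map { d in
        // Clamp to the expected range, then map near -> 255 (bright), far -> 0.
        let clamped = min(max(d, near), far)
        let t = (far - clamped) / (far - near)
        return UInt8((t * 255).rounded())
    }
}

// Example: a tiny 2x2 "depth image" in meters.
let sample: [Float] = [0.1, 0.55, 1.0, 2.0]
let gray = depthToGrayscale(sample)
// gray == [255, 128, 0, 0] (nearest pixel brightest; beyond-range clamped)
```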
Very cool. Can’t wait to try 👍🏼
I have been searching and searching for a way to get the values out and apply them to my 3D characters in Unity3D. I can get the anchor vertices and the blend-shape coefficients out to a text file, but I am stuck right there.
Amazing! Do you know of any resource where I can learn how to apply this to different 3D models?
I need some kind of converter that will translate the data into a Unity .anim file.
Absolutely; unfortunately, it is real-time only.
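For anyone stuck at the same text-file step: on the iOS side, ARKit's ARFaceAnchor.blendShapes hands you a dictionary of named coefficients (floats in 0…1) every frame, and the part that usually trips people up is writing them out in a fixed column order so the rows line up for later conversion. A minimal sketch in plain Swift (the column names are real ARKit blend-shape location names, but the CSV format is just an illustration, not the video's actual pipeline):

```swift
import Foundation

// Sketch: serialize one frame of blend-shape coefficients to a CSV row.
// In a real app the values come from ARFaceAnchor.blendShapes
// ([ARFaceAnchor.BlendShapeLocation: NSNumber]); a plain [String: Float]
// is used here so the logic runs anywhere.
func csvRow(frame: Int, shapes: [String: Float], columns: [String]) -> String {
    // Fixed column order keeps every row aligned; missing keys default to 0.
    let values = columns.map { String(format: "%.4f", shapes[$0] ?? 0) }
    return ([String(frame)] + values).joined(separator: ",")
}

// jawOpen, eyeBlinkLeft, eyeBlinkRight are actual ARKit blend-shape names.
let columns = ["jawOpen", "eyeBlinkLeft", "eyeBlinkRight"]
let frameData: [String: Float] = ["jawOpen": 0.42, "eyeBlinkLeft": 0.1]
let row = csvRow(frame: 3, shapes: frameData, columns: columns)
// row == "3,0.4200,0.1000,0.0000"
```

From a file of rows like this, a small editor script could bake the curves into an AnimationClip on the Unity side, but that half is left out here.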
Would I be able to set this up too if I got an iPhone X?
Hi, really great job. I can't see any lag. I also use this, but I always get lag. Could you tell me how you fixed this, please? Thank you.
Master AnimationPlus I’m having issues with lag as well - did you figure that out?