Best tutorial out there. Most of them only cover the hand demo scene, but you took it a step further. Well done!
The best XR tutorial to date. Seriously.
I need to see the other videos you are talking about... so intriguing.
Best tutorial +1. I like the simplification part the most. Always great to know the absolute essentials and then develop from there.
Great update! Thanks for sharing.
Started off with the Oculus framework in Unity, but they recently changed everything, so none of the tutorials on RUclips work any more. This is far better, also because it is cross-platform, so it is possible to port to other headsets as well. However, I have a hard time keeping up, as you skip very fast over some steps. I cannot get the hands, the plane, or anything else to show; it's as if I am positioned somewhere else in the world than the start point.
Hi, I still cannot grab items; you skipped the phase of assigning things on the input.
I have followed this video to the letter, as well as several others. I have never once had the hands even show up when I play the scene. It's very frustrating that I cannot get this to work. I am using a Quest 2 and Unity 2022.3.18.
Any troubleshooting tips?
At 7:44, when I search for XR Origin Hands, I get XR Origin (XR Rig) and not XR Origin Hands as you do. My XR Hands package is version 1.3.0.
Have you imported the samples in from the package manager?
@@blackwhalestudio yes
Any idea how to change it from pinching to grabbing with your full hand?
You should be able to change interactions in your input action map!
@@blackwhalestudio Can you explain this further? I'm trying to change it from a pinch to a full grab, but I can't find any tutorials online for this.
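Not the asker, but if the input action map route doesn't pan out, one approach that gets suggested is detecting a full grab yourself from the hand joint poses that a hand-tracking API gives you each frame (e.g. fingertip and palm positions). A rough sketch of just that logic, in Python for illustration only; the joint layout and the 5 cm threshold are assumptions, not part of any Unity API:

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points (tuples of metres)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_full_grab(fingertips, palm, threshold=0.05):
    """Treat the hand as making a fist/grab when every fingertip is
    within `threshold` metres of the palm centre."""
    return all(distance(tip, palm) < threshold for tip in fingertips)

# Hypothetical joint positions, as you might read from a tracking API.
palm = (0.0, 0.0, 0.0)
# Open hand: fingertips far from the palm -> no grab.
open_hand = [(0.0, 0.10, 0.0), (0.02, 0.11, 0.0),
             (0.04, 0.10, 0.0), (0.06, 0.09, 0.0)]
# Closed fist: fingertips curled near the palm -> grab.
fist = [(0.0, 0.02, 0.01), (0.01, 0.03, 0.01),
        (0.02, 0.02, 0.01), (0.03, 0.02, 0.01)]

print(is_full_grab(open_hand, palm))  # False
print(is_full_grab(fist, palm))       # True
```

In Unity you'd feed this from the XR Hands joint data instead of hard-coded tuples, and drive your grab interactable from the boolean.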
I was exploring Spatial for a collaborative setup. If I create a scene with XR Hands and import that into Spatial, will I be able to interact the same way?
For some reason, the hands get disabled on start.
Please complete the series!
I will resume the series very soon, please excuse the delay! We both have full-time jobs and sadly didn't have enough time for RUclips in the past weeks :(
@@blackwhalestudio all support ♥️
Do the hands only appear when I install the APK on the Oculus? Because here they don't show up when I play through the editor.
No, it should work in the editor. Make sure to enable the hand tracking subsystem for Windows as well.
My hands are not visible in my project, please help!
Make sure to import the Hand Visualizer from the samples in the Unity Package Manager, under the XR Hands package!
How do I use it on the Pico 4?
Hi, it shows that my hand tracking system doesn't work and that I need to change some sort of backend. Can you help me with it, please?
Hi, can you let me know the exact error message please?
Is this only working for the Meta VR headsets? I'm currently working with the Vive Focus 3, and everything works apart from grabbing objects. Any advice?
This should work with any device that supports OpenXR and has cameras for hand tracking! Unfortunately I am not too familiar with Vive Focus, though I've had my fair share of problems with its predecessors before. Sorry for not being able to help!
@@blackwhalestudio thanks for your answer anyway!
@@blackwhalestudio It does support OpenXR, and it's only when I try to grab something that the hand tracking stops for a short time and then comes back. Maybe this is a Unity problem and not a device problem, that's what I thought...?
Does anyone else have a problem with objects just falling out of hands constantly when using hand tracking, compared to controllers? It's so frustrating, and there is literally no information out there... It's happening in every project, no matter which version of the XR toolkit I use...
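One workaround that sometimes helps with drops caused by brief tracking flickers is to hold the grab state through a short grace period instead of releasing the instant the grab signal is lost. A minimal sketch of that debounce logic (in Python just to show the idea; the 0.2 s grace period and the frame times are made up):

```python
class GrabDebouncer:
    """Keeps an object 'held' through short gaps in the grab signal,
    releasing only after the signal has been lost for `grace` seconds."""

    def __init__(self, grace=0.2):
        self.grace = grace
        self.held = False
        self.lost_since = None  # timestamp when the signal first dropped

    def update(self, grab_detected, now):
        """Call once per frame with the raw grab signal and current time;
        returns the smoothed held/released state."""
        if grab_detected:
            self.held = True
            self.lost_since = None
        elif self.held:
            if self.lost_since is None:
                self.lost_since = now  # start timing the gap
            elif now - self.lost_since >= self.grace:
                self.held = False  # gap too long: actually release
        return self.held

deb = GrabDebouncer(grace=0.2)
print(deb.update(True, 0.00))   # True  (grab starts)
print(deb.update(False, 0.05))  # True  (brief flicker, still held)
print(deb.update(False, 0.30))  # False (lost for >= 0.2 s, released)
```

In Unity you'd run this per frame with `Time.time` and the raw pinch/grab signal, and only detach the object when the smoothed state goes false.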
Does anyone know why I don't have "Samples"?
Make sure you import them separately from the package manager
Hello man, I must say that I have a lot of respect for what you have achieved! If I understood correctly, you were able to create something in Unity without using any controllers or headsets. Is that correct? Also, could you explain why an Android device was necessary for this project? Did you use it as a third device for some aspect of the creation process?
I used my Meta Quest 2. You definitely need a headset for this tutorial that supports OpenXR and has hand-tracking capabilities!
Hello, I'm using the Pico 4 and followed your tutorial. When I load the app on the headset, the hands are detected but are unrendered (bright pink), I can't pinch, and the ray doesn't work, although I can press the buttons by moving the hand. Do you know what could be causing the problem?
Edit: I recently found out that the XR default input actions don't contain the indexPressed and pinchStrength bindings needed for the Pico 4. Therefore, at the moment you can only make this work by importing the Pico SDK instead of the Pico OpenXR SDK, which is outdated. Hope they add them soon. You can follow the rest of the tutorial, as it will work; only the bindings need to be changed, and you can't use OpenXR.
Thanks a lot for your feedback!!