There is also a Building Blocks script on each Building Blocks object that lets you find all the other Building Blocks elements, and if you want to remove one, you can just delete it and the change is reflected in the Building Blocks scripts and the Building Blocks window automatically. Very clean!
Thanks for the tips!
Would be great to have a series on the building blocks! This is great Valem
I've been teaching a unit of VR dev in my high school game design course, and this will save us a week or two of learning to set up VR. Wow
Honestly, not just for beginners, this is really helpful for rapid prototyping projects! This came at the perfect time for me, actually, as I was getting passthrough working manually with the new SDK just last month and need to make a few different projects to test some things.
Due to my terrible C# skills this shall be very useful
Thanks meta for this time
At 1:42 we see "Something went wrong. Please try again later". Overlooking such messages does not inspire confidence. However, having watched more of your tutorial, things still worked! Thanks for a very informative show.
this has to be a dream come true, to make a game in 15 clicks
Did u get it to work?
Woah, the click and drag stuff at a per-scene level is pretty great. I'm trynna do a workshop on XR stuff, and this'll really simplify that process.
Has anyone had problems with the new v62.0 update of the Meta XR All-in-One (26 February)?
You can also add any Building Block by clicking the plus (+) button at the bottom right, instead of dragging and dropping.
Wonderful tutorial!!
great video! I love your tutorials
Tysm for making this it helps a lot!
I hope they make something similar in Godot. They need more projects like this to get devs quickly into actually building the game instead of reinventing the wheel every time.
Would love Valve releasing the Alyx SDK to be usable.
Seems much easier than fiddling with OpenXR and getting everything to work there, but it also makes your project compatible only with Meta headsets.
Hi, I'm wondering whether the Hand Pose Grab from the Oculus Interaction SDK can be done with XR Hands and the XR Interaction Toolkit?
I read that it uses a different plug-in/provider, so you can't use XR Hands and the Oculus Interaction SDK at the same time because they are incompatible (I was looking for a way to use XR Hands and Passthrough from the Oculus SDK).
When I tried to use the controller building block, Unity crashed. Also, could you make a tutorial on how to combine the Building Blocks with other VR/MR things?
Love it!!
I have followed your tutorial, but then I got stuck trying to grab with the controllers. Is there a building block that allows me to do that?
Have you found a solution?
If you download the old Oculus Integration instead of Meta XR, there are multiple example scenes, one of which has objects that can be grabbed with hands or controllers, along with the setup for them.
I might have to come back to XR dev with this!
Fantastic video! Most videos are old; yours is the most up to date!
Please cover setting up teleport and custom interactions, like adding new items.
❤❤❤
How on earth do you see things in the headset? Nothing works for me, with Quest Link on or not. It only works when I use the "Build and Run" button.
With the Quest 3 and this SDK, are we able to use the Quest 3 room-scan feature and create a mesh collider from that, rather than manually placing all the boundaries using the furniture function?
Thank you, Valem. How can we connect the Meta Quest 3 to Unity for real-time testing like you did here in the tutorial? I'm using a Mac, so I can't use Meta Link. Please help me.
Hi, I have a question. Why doesn't hand tracking work in the Unity editor? I have already enabled hand tracking on my Oculus Quest 2, and it works perfectly fine on the device. Can someone please help me?
Check whether, in your Oculus desktop app, under Settings > Beta, you have the runtime features for developers activated. Also check, on the Camera Rig under Quest Features > General, whether Hand Tracking Support is set to "Controllers and Hands".
How can I integrate hand and glove models sold on the Unity Asset Store with VR hand tracking? I couldn't find any resources on this.
Has anyone had problems with the Poke Interactable? In my project it works in Play mode in Unity, but it doesn't work in the build. Thank you!
Great video!! I didn't know this feature existed when I switched to the new SDK.
Is there any way to show the hands when I am using controllers?
I still find it confusing, though, that Unity builds its own VR structures while Meta does the same. It would be really great if Unity could remain the universal base, for example through OpenXR, and Meta then delivered its feature set on top of that. But they build entirely parallel systems, like having their own interaction and grabbable system. I always wonder whether the Unity Meta OpenXR package lets you use these new Building Blocks on other platforms such as the Pico.
It's normal; Meta's headsets are not the only ones in the world. There are actually dozens, if not more, headsets that most people have never even heard of. I happened to see a list of maybe all of them on a site. Unity is more universal and works both for Meta headsets and, for example, phones.
Thank you very much! Does Passthrough still only show up when running standalone in the headset, and not in Oculus Link mode?
Also, I've been trying to learn how to set up an XR scene synced with the shape of my whole house. It seems like there IS more support for this now with the remembered location mapping (is this relevant to the Room Model Building Block? another term I saw was "Scene Understanding"?), but it keeps reverting to the constrained, single-plane small boundary with the unnecessary guardian system, which is not what I want. More about how to do this would be great, if it isn't covered somewhere already. Cheers!
Also, these synthetic hands float stuck in space at their last known location if the headset loses sight of them, which is pretty odd. Is this a bug, or is there a way to make them not do that, if anyone knows? :D
I had a quick question: whenever I build my project and run it in VR, the camera just offsets itself by some amount in the Y direction! I don't know why, or how to control that?
Try, in Player Settings under both the PC and Android tabs, setting Stereo Rendering Mode to Multiview or Multi Pass.
This is cool, but I can't get the virtual keyboard to work. Is there a tutorial about it out there?
Amazing video, thanks. Is it possible to change or replace the cubes with other objects? I would like to use my own objects. Thank you.
I assume you'd replace the Cube's mesh and colliders with the ones for your custom objects. You'll have to adjust the Rigidbody settings to better reflect the mass (effectively the weight) of the object you're swapping in. Or you can copy the relevant scripts from the cubes over to your objects.
@@SetsuneW thank you. I'm gonna try.
Yes, you have to add the Grabbable, Hand Grab Interactable, and Physics Grabbable scripts to any object.
@@altairbarahona2306 thank you
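A minimal sketch of the setup described in the replies above, assuming the Meta XR Interaction SDK is installed. The component and namespace names (`Grabbable`, `HandGrabInteractable`, `PhysicsGrabbable` under `Oculus.Interaction`) may differ between SDK versions, and the components usually still need their references wired up in the Inspector:

```csharp
// Hedged sketch: make a custom object grabbable by adding the components
// the reply above mentions. Assumes the Meta XR Interaction SDK;
// verify component names against your installed SDK version.
using UnityEngine;
using Oculus.Interaction;
using Oculus.Interaction.HandGrab;

public static class MakeGrabbable
{
    public static void Setup(GameObject obj)
    {
        // A Rigidbody and a collider are required for physics-based grabbing.
        Rigidbody rb = obj.GetComponent<Rigidbody>();
        if (rb == null) rb = obj.AddComponent<Rigidbody>();
        if (obj.GetComponent<Collider>() == null)
            obj.AddComponent<BoxCollider>();

        // The three scripts from the reply above.
        obj.AddComponent<Grabbable>();
        obj.AddComponent<HandGrabInteractable>();
        obj.AddComponent<PhysicsGrabbable>();
        // Note: cross-references between these components (e.g. which
        // Rigidbody/Grabbable each one targets) typically still need to
        // be assigned in the Inspector.
    }
}
```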
Which render pipeline do you recommend for the project?...URP?
The Built-in (Standard) pipeline seems a bit easier with the Meta SDK, but URP seems better for the Unity XR Interaction Toolkit.
Will this work with the Quest 2, or is it only meant for the Quest 3?
it should work
Is there anyone else who sees the right controller in both the left and right hand with the default configuration? I don't know why, but my left controller always shows the right "Meta Quest Touch Plus Right" object.
Anyone else having issues when adding hand tracking, where the hand materials are pink? It seems to be an issue with the OculusSampleAlphaHandOutline shader that's used by DefaultHandMaterial, SkeletonBoneMaterial, SystemGestureHandMaterial, and SystemGestureSkeletonBoneMaterial, located under BuildingBlocks\BlockData\HandTracking\Materials.
Yes, the problem is that you created your project with the URP template. Try creating a new Unity project with the default 3D template (not the 3D URP template), or go to Project Settings > Graphics and make sure Scriptable Render Pipeline Settings is None; do the same in Quality, under Render Pipeline Asset. If there is something other than None, it means you are using URP as the render pipeline, and that could be the issue.
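If you want to confirm the diagnosis from the reply above in code rather than clicking through settings, a small check like the following (a sketch using Unity's `GraphicsSettings` API) reports whether a Scriptable Render Pipeline such as URP is currently active:

```csharp
// Hedged sketch: detect whether an SRP (e.g. URP) is active, which the
// reply above suggests is the cause of pink hand materials when using
// the Meta SDK's Built-in-pipeline shaders.
using UnityEngine;
using UnityEngine.Rendering;

public static class PipelineCheck
{
    public static bool UsingScriptableRenderPipeline()
    {
        // Non-null means an SRP asset (URP/HDRP) is assigned in
        // Graphics or Quality settings; null means Built-in pipeline.
        return GraphicsSettings.currentRenderPipeline != null;
    }
}
```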
Would it work on a Quest 2?
How can I implement a swimming mechanic using the Meta Building Blocks?
I think it's more like a script that compares the velocity of the controllers. If the velocity is greater than some threshold, move your Camera Rig along the X/Y/Z axes, or apply a force in impulse mode to your player Rigidbody.
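The idea in the reply above could be sketched roughly like this, assuming the Meta XR SDK's `OVRInput` API is available. The field names and tuning values here are illustrative, not part of any Building Block:

```csharp
// Hedged sketch: velocity-based swimming locomotion. When the combined
// controller velocity exceeds a threshold, push the player Rigidbody
// opposite to the stroke direction, like swimming. Tune the threshold
// and strength values to taste.
using UnityEngine;

public class SwimLocomotion : MonoBehaviour
{
    public Rigidbody playerBody;           // Rigidbody on the camera rig root
    public float velocityThreshold = 1.5f; // m/s before a stroke counts
    public float strokeStrength = 0.5f;    // impulse scale per stroke

    void FixedUpdate()
    {
        // Controller velocities in tracking space (Meta XR SDK call).
        Vector3 left  = OVRInput.GetLocalControllerVelocity(OVRInput.Controller.LTouch);
        Vector3 right = OVRInput.GetLocalControllerVelocity(OVRInput.Controller.RTouch);
        Vector3 stroke = left + right;

        if (stroke.magnitude > velocityThreshold)
        {
            // Convert from tracking space to world space, then push the
            // player opposite to the hand movement.
            Vector3 worldDir = transform.TransformDirection(-stroke.normalized);
            playerBody.AddForce(worldDir * strokeStrength, ForceMode.Impulse);
        }
    }
}
```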
Does anybody know why some building blocks are greyed out and can't be added?
It seems to be because you have already added them to the scene.
Looks like the controllers can't grab things; you have to use hand tracking.
If you download the old Oculus Integration instead of Meta XR, there are multiple example scenes, one of which has objects that can be grabbed with hands or controllers, along with the setup for them.
Hi. I could not create the APK; I received several errors about the SDK. I already reinstalled it with Android Studio, but the problem is not solved.
Thanks
What errors? You could click Development Build when making the APK with the "Build and Run" option, making sure the selected export platform is Android and that the device shown under "Run Device" is your Quest.
It could also be that when you connected the Quest to the PC you didn't allow USB debugging, or your Quest doesn't have enough storage.
Or it could be that you followed the order of the video and added the synthetic hands at the end, which causes a problem because a reference is missing; you have to drag a synthetic hands component where it's missing (you can find it by clicking the error, which takes you to the GameObject where it occurs, and it's fairly intuitive where something from the synthetic hands seems to be needed but isn't assigned).
@@altairbarahona2306 Thanks for your reply. I solved it; it was a problem with the SDK versions (and path). Thanks.
Do you think you can make another video (series) on the recently updated SteamVR Plugin?
Why can't things ever just work...
Getting the error "Error building Player: 2 errors" with no other explanation?
Wtf is this? Anyone know how to solve it? I tried switching Unity Editor versions, no difference :(
@ValemTutorials Are you French? You have the accent.
omg this is a bit easier! lol
wow
hi
You all still use Unity?🤯
I am currently one of your Patreon subscribers. I am about to cancel my subscription because I have been trying to reach out to you many times without any success. As you know, there have been many changes between Oculus XR and OpenXR. I wanted to follow your "how to draw in VR" tutorial but was having problems running it because it is outdated. You set up a community, but you are not responding to its members while reaping the benefits. Not cool.
What are you trying to get to? I just finished the whole thing today without many issues.