📌 Here is a great guide with additional details which could be helpful as you integrate Interaction SDK:
developer.oculus.com/documentation/unity/unity-isdk-getting-started/
I have seen a grab locomotion setup under Unity's XR Toolkit Locomotion parent object in the XR Rig. Have you tried how this locomotion type feels? I am just curious, as I could imagine that pulling the world toward you / grabbing it could also be a very intuitive way to navigate with hands only.
Unity XR Toolkit locomotion is very specific; it reminds me of games such as Gorilla Tag where you grab onto the floor and pull yourself around to move. I'd say it could work in some experiences, but to me it's not very intuitive. I personally prefer teleportation, as shown in this video with hands or controllers.
Good question and comment David!
That looks surprisingly smooth :o
Thanks for your comment, and yes, most new features are very smooth; it looks like each iteration brings performance improvements. I remember when this was first released and how much it has improved since then!
Glad you like it!
The resizable canvas is very cool and smooth. When I tried to use it for my project and customize it for my game, I have no idea how it's put together. It seems very convoluted and I couldn't fully reverse engineer it in order to use it. Do you know of any resource that explains how these demos are put together?
That’s a great question, I will go through their docs to see if there are additional resources, I didn’t see any when I first looked but I will double check. Also, I may cover this in a future video 😉
Thanks for your feedback.
Nice coverage. The weapon is called a slingshot. :)
Thanks my friend, I couldn’t remember what it was called 😅 thank you!
Hey Dilmer, do you have any suggestions for the HandGrabPoseTool in the new Interaction SDK? It doesn't seem to work for recording Hand Grab Poses with Unity Editor 2022.
Hey, how are you! Hmm, which part of it is not working? Are you getting any errors? Let me know and I can try to test it this weekend.
@@dilmerv Unity 2022.3.32 + Meta XR Interaction SDK OVR Samples v65 + HandGrabPoseTool scene + Meta Quest 3 (dev mode) connected via Oculus Link.
The Unity Editor in Play mode cannot "record" grab poses; or rather, no hands are recognized at all.
Do you know if it is possible to mix parts of this SDK with an existing meta-openxr + ar foundation project? It would be nice to be able to have the UI experience shown here.
That's a great question. I will do more testing, but the last time I tried it wasn't possible. Unity's current OpenXR implementation is not compatible with Meta's custom OpenXR backend, but let's not rule it out; I will ask around as well, since I know a few people who worked on this.
@@dilmerv Thanks! Would really love to know the answer :)
I'd love to have a comprehensive playlist about the new sdk if possible! I'm just a beginner at this so I'd love to understand how this works in detail, not only interaction but also locomotion and UI too! Thanks for the great video
I appreciate your feedback thank you! I am working on a new playlist with some of the new Meta XR packages, I will be sure to include this one as well.
@@dilmerv I also have one question: why aren't Meta's Building Blocks used for these tutorials?
Nice! Is that working well on Meta Quest 2?
Yes it works well with both 😉 good question!
Wow this looks much better than what I used! I really need to get my own Quest 3!
Great to know and a Quest 3 is a great device and investment, thanks for your comment!
Dilmer please expand on the new UI rig and how we can modify it to use it. Maybe a breakdown on how it works would be great! Thank you
Hey thanks for your feedback! I will look into it more and make an extended video.
When I load the locomotion sample project and preview in headset, the camera is locked to my head, (like there is a main camera but there is only interactionOVR Rig) any solution?
Have you tried a simple VR/MR project with a camera rig + controllers? Does that work, or does it behave the same way?
I tried to install the asset from the Asset Store, but I get errors, and the main issue is authentication errors. The Unity Editor doesn't allow me to install the sample packages because they are licensed, and it tells me I don't have the license. What should I do? I even made another account, but the error still appears.
Thanks for the information. Could you send me the exact error that shows when you try to install the sample packages? Also, could you try a brand new project? Perhaps a different Unity project would be a good test.
Thanks
@@dilmerv do you think it has something to do with developer meta (Facebook) account?
Did you change back to the BOBOVR strap from the Elite Strap?
I did, I couldn't resist trying it, and it turns out to be a lot more comfortable.
@@dilmerv thanks that’s great to know
Curious if you know how to create custom hand/body gestures since the update? I would love a tutorial, since I know the previous examples are no longer relevant.
I haven’t done a tutorial yet but yes I did some work with custom hand gestures, not so much with body gestures but sounds interesting. Let me look into it for a future video, thanks for your feedback.
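For anyone experimenting with custom hand gestures in the meantime, here is a minimal, hypothetical sketch of how a script might react to a custom pose via the Interaction SDK's `IActiveState` interface (the same interface `ShapeRecognizerActiveState` exposes, as seen in the stack trace elsewhere in this thread). The component name `CustomGestureListener` and the field names are my own illustration, not part of the SDK; this assumes you wire up a pose-detection active state in the Inspector:

```csharp
using UnityEngine;
using Oculus.Interaction;

// Hypothetical sketch: logs when a custom hand pose becomes active.
// Assumes an IActiveState implementation (e.g. a configured
// ShapeRecognizerActiveState) is assigned in the Inspector.
public class CustomGestureListener : MonoBehaviour
{
    [SerializeField, Interface(typeof(IActiveState))]
    private Object _poseActiveState;

    private IActiveState PoseActiveState => _poseActiveState as IActiveState;
    private bool _wasActive;

    private void Update()
    {
        // Poll the active state and fire only on the rising edge.
        bool isActive = PoseActiveState != null && PoseActiveState.Active;
        if (isActive && !_wasActive)
        {
            Debug.Log("Custom gesture detected!");
        }
        _wasActive = isActive;
    }
}
```

Treat this as a starting point under those assumptions; the SDK's own pose-detection samples are the authoritative reference for how the recognizer assets are configured.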
Would this interaction SDK also work on a Pico 3 or does it have to be a Meta Quest?
This is specific for Meta Quest devices, I heard of ways to make it work with other devices but haven’t seen it done yet.
The samples package has been deprecated. Is there any other source?
Fantastic tutorial (as always)! Is anyone else getting a load of errors thrown from the comprehensive scene? From a brand new project I am getting:
Converting invalid MinMaxAABB
UnityEngine.Mesh:SetSubMesh (int,UnityEngine.Rendering.SubMeshDescriptor,UnityEngine.Rendering.MeshUpdateFlags)
EntryPointNotFoundException: isdk_NativeComponent_Activate assembly: type: member:(null)
Oculus.Interaction.PoseDetection.ShapeRecognizerActiveState.get_Active () (at ./Library/PackageCache/com.meta.xr.sdk.interaction@63.0.0/Runtime/Scripts/PoseDetection/ShapeRecognizerActiveState.cs:129)
Oculus.Interaction.ActiveStateSelector.Update () (at ./Library/PackageCache/com.meta.xr.sdk.interaction@63.0.0/Runtime/Scripts/Interaction/Core/ActiveStateSelector.cs:58)
:( Anything to be concerned about?
Sounds like it didn't find the Interaction SDK's native DLLs, as indicated by the "EntryPointNotFoundException". Did you install the all-in-one package or just the Interaction SDK package?
@@dilmerv I actually attempted each of them separately, and both together, which still resulted in the various errors above.
Meta needs to add image tracking.
I agree, I wouldn’t be surprised if it happens sooner than we think!
netcode pls
I will cover it in a future video, thanks for your feedback!
The UI doesn't seem very good to me, it looks like buttons are accidentally pressed when you want to scroll or just hover over them.
And conversely, when trying to press them, they sometimes aren't selected in the video.
Depending on what you're doing, this can be a disaster. When you look beyond how impressive it seems at first glance, it's somewhat disappointing.
I am not sure I follow your comment; can you tell me, with a timestamp, what you are referring to? Thanks