Just wanted to thank you for this - this is really easy to understand and I really like that you're not using some kind of background music or timelapses that would make it confusing. Great work :-)
I just found your channel a minute ago and liked and subscribed. I'm an aspiring AR developer/creator. I'd love to see a tutorial on how to make an interactive infographic that can be overlaid on top of an object. Keep up the great work.
Thank you so much! I had an issue with the floor where objects were falling through it. Your video tutorial helped me a lot!
This is the most comprehensive tutorial on XR development yet! Thank you so much!
Really impressive and amazing tutorial! I just got my Quest 3 and am watching tutorials on its development in Unity, and so far I really love the way you explain stuff: simple and easy to follow for beginners! You deserve more views and subscribers. Kudos, fellow developer!
Super happy to hear that :D
Great intro and clear steps, nice to see something so detailed. Curious if you have tried this in Unity 6? Would the same method work?
I will be taking these courses. Thanks
That was impressive and awesome, congrats on the explanation, work and demo!! Is the final project available somewhere? Thanks again.
Thanks a lot :D
Nope, the final project is not available, but it's very simple to build with the base project and assets, for which you can find the link in the description.
Incredible tutorial, Thank you
Nice tutorial, thanks 🙌
Hi! When I try to get my own room scanned, it always ends up using a random room instead of scanning mine. Any suggestions on what to do?
@@nieverbeek9505 You mean via Link? This could be happening because your room data is not being shared over Link. Try reconnecting.
My EffectMesh material is showing as opaque in the Game view, but in the Editor and in the build it's transparent :/ Any ideas why this might be?
Not sure why this could be happening.
This video made me believe that I can also build an XR game. Thanks!
@@ankk98 happy to hear that :D
Please add a follow up to add sound effects and scoring
Amazing tutorial!
I'm having some problems with Quest Link getting the data of my room setup, so it always defaults to the room backup prefabs. Has anyone experienced this?
I believe you can only get the room data from a build and not over the link, so just build and run a single time and then you will have it over Link. You also need to enable some beta options in the Quest Link desktop app
You can get the data over Link; that's what I have done in the video as well. In your Meta Quest Link app on your PC, go to Settings > Beta, scroll down, and enable "Spatial data over Meta Quest link".
@@immersiveinsiders Yes! It seems I had to re-enable that setting; after I did that, it just works!
Thanks so much for an awesome tutorial! Helped tremendously 🙏
Do you know if the Meta Effect Mesh building block allows for highlighting on hover?
Nope, what you could do is use the "Anchor Prefab Spawner" component to spawn a quad on walls and then you can change the quad material to show highlight.
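As a rough sketch of that idea (assuming you've used the Anchor Prefab Spawner to put a quad prefab on each wall; `WallHighlight`, the color values, and how you call `SetHighlighted` from your hover/raycast logic are all placeholders, not part of the Meta SDK):

```csharp
using UnityEngine;

// Hypothetical highlight script for a wall quad spawned by the Anchor Prefab Spawner.
// Attach it to the quad prefab and call SetHighlighted from your hover/raycast code.
public class WallHighlight : MonoBehaviour
{
    [SerializeField] private Color normalColor = new Color(1f, 1f, 1f, 0f);      // fully transparent
    [SerializeField] private Color highlightColor = new Color(0f, 1f, 1f, 0.3f); // translucent cyan

    private Material material;

    private void Awake()
    {
        // Use .material (not .sharedMaterial) so each wall quad gets its own instance.
        material = GetComponent<Renderer>().material;
        material.color = normalColor;
    }

    public void SetHighlighted(bool highlighted)
    {
        material.color = highlighted ? highlightColor : normalColor;
    }
}
```

Note the quad's material needs to use a transparent shader for the alpha values to take effect.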
Great tutorial, thank you! I'm still trying to understand how scene understanding works. So, if you map your room first and you have a mesh table, two windows, and a bed in your room (in the example scene you're presenting in the video), would another user encounter the same mesh (a table, two windows, and a bed) when loading the MR app example you've developed, or would they see their own room's mesh after they map it?
Yes you are right! I had already mapped my room.
It’s user specific, they see their own room’s mesh. If they don’t have a room mapped it will ask you to map your room.
@@immersiveinsiders Thank you for the clarification! Do you know what happens if a user doesn't map their own room? Will the user not be able to play the game if they don't map it?
@@anmaavr8315 Generally, as a developer, you need to have a check in place to make sure the user sets up their room. If not, you would not allow the user to play the game. Or you can load a default room, but the problem is that the walls will not match the user's environment and will break the immersion.
@@anmaavr8315 Generally, the user won't be able to.
@@immersiveinsiders What do you mean by "the user won't be able to"? What can't they do?
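The kind of room-setup check described a few replies up could be sketched roughly like this (assuming MRUK's `MRUK.Instance` singleton and `GetCurrentRoom()`; verify both against the MRUK version you're on, and note MRUK loads scene data asynchronously, so in practice you'd hook its scene-loaded callback rather than check once in Start. `ShowRoomSetupPrompt` and `StartGame` are placeholder methods):

```csharp
using UnityEngine;
using Meta.XR.MRUtilityKit; // assumed MRUK namespace; check your package version

public class RoomSetupCheck : MonoBehaviour
{
    private void Start()
    {
        // GetCurrentRoom() is assumed to return null when the user has no room mapped.
        if (MRUK.Instance == null || MRUK.Instance.GetCurrentRoom() == null)
        {
            ShowRoomSetupPrompt(); // placeholder: ask the user to run room setup
        }
        else
        {
            StartGame();           // placeholder: room data exists, start the game
        }
    }

    private void ShowRoomSetupPrompt() { /* e.g. enable a UI panel */ }
    private void StartGame()           { /* e.g. load the gameplay scene */ }
}
```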
Hey, is it possible to develop with a Quest and a MacBook for mixed reality? I see MRTK3 is used (unavailable for Mac?). Is it different from Meta's Building Blocks? I just want to know if I can invest in a Quest 3 for XR development/XR design using a Mac in Unity.
Yes, you can use a Mac for VR/MR development. MRTK is a different SDK. You can use the Meta All-in-One SDK to develop XR apps for Meta devices.
Love this - One quick question on the final Solution. Should it work if I take the build into a different room? I tried this and poor Oppy fell through the floor.
Yes it should work as long as you have mapped the room.
My hand tracking is not working. I'm using Unity 2022.3.21f1 and Meta All-in-One SDK v64. Hand tracking works when I use the Quest 3 normally, such as in the browser and other native apps, but it doesn't work in Unity. I can see the Quest thumb button but no hands; I tried to interact, but nothing happens.
Seems like the hand visuals are not getting enabled.
Have you made sure to follow all the steps mentioned in the video?
Any way to do this on an Oculus Quest 1? The device link is no longer supported... Some developers do not yet have the money for a Q3 model. ☹
Unfortunately, passthrough is very limited on the Q1 in comparison to the Q2 and the Q3.
@@kennethramirez7450 Really? I thought the Q1 and Q2 were similar in terms of passthrough? That is a shame... I'll have to wait quite a while for the Q3 to come down in price... Thanks so much for your help with this!
Ah, this is so good! Thank you for this. I'm trying to further understand the NavMesh on the user's scene mesh; is there any in-depth resource for this? I'm trying to get the mummy to walk on walls instead of the floor. How would I do this?
Hey, while developing this, I did try to make the mummy walk on the wall but it's not easy. By default, NavMesh works on an X-Z plane, so you'll have to rotate that plane and generate the nav mesh. This gives you NavMesh on one wall. You need to do this x number of times based on the number of walls the user has.
Then comes other problems like, what happens when mummies come close to the adjacent wall? What if there is a plant, window, or table? It gets really complex.
Please provide a FULL course for AR development 👀👀 Please, I want to learn it!
Do you mean mobile AR or using a VR headset with a passthrough?
Can you please make a video about how to set up a custom body pose? The SDK example doesn't help and I just want to be able to make my own poses to make a simple game. No one has any videos on how to do it!
How do I add collision detection or any other script only to the walls? Like, when the wall planes are instantiated, I want a script to be added to the wall plane.
You can do something like this:
public void Start()
{
    var walls = MRUKRoom.Instance.GetWallAnchors();
    foreach (var wall in walls)
    {
        // Attach your own component (YourWallScript is a placeholder name) to each wall
        wall.gameObject.AddComponent<YourWallScript>();
    }
}
@@immersiveinsiders and this should be attached to?
Any ideas on how we can set the spawn Rotation, using the Find Spawn Positions script?
Right now in the script we have lines of code that make it spawn facing the user. You can replace those lines of code with any rotation value you choose.
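For example, the facing-the-user rotation (or a fixed/random one) could look like this in plain Unity code (a sketch, not the actual Find Spawn Positions implementation; `spawnedObject` and `userHead` are whatever transforms your spawner and camera rig provide):

```csharp
using UnityEngine;

public static class SpawnRotationUtil
{
    // Face the spawned object toward the user, rotating only around the Y axis.
    public static void FaceUser(Transform spawnedObject, Transform userHead)
    {
        Vector3 toUser = userHead.position - spawnedObject.position;
        toUser.y = 0f; // ignore height so the object stays upright
        if (toUser.sqrMagnitude > 0.0001f)
            spawnedObject.rotation = Quaternion.LookRotation(toUser);
    }

    // Alternative: give the spawned object a random yaw instead.
    public static void RandomYaw(Transform spawnedObject)
    {
        spawnedObject.rotation = Quaternion.Euler(0f, Random.Range(0f, 360f), 0f);
    }
}
```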
What solutions do you have if you are unable to use Play mode through your Quest 3? This issue has happened a lot, and my Unity has even crashed while attempting to complete this video. Your team is fantastic!
Do you get any errors when you press play?
@@immersiveinsiders I am getting an error about passthrough prerequisites even though they were met. I had to delete everything and start all over. Unity has a mind of its own.
Thanks for your nice tutorial, but I can control my Oppy with my controllers, and the coffin and the mummy monster didn't show up. Am I missing something? I use Unity 2022.3.24f.
Unity version could be the issue but I am not really sure, maybe you can try using the same version as mine.
@@immersiveinsiders Thanks for your reply. Your tutorials are the best; you deserve more subscribers. Thanks again!
What shader should we select if we are using URP when adding the occlusion?
This should help: github.com/oculus-samples/Unity-DepthAPI?tab=readme-ov-file#for-urp
Can I use the Oculus Quest 2 for AR development?
Yes
I transferred the APK onto the device but am not able to run the game. I usually create in Unreal; this is my first time using Unity, so I am a bit confused about this part.
If you click build and run it should automatically open in your headset
Base project link is dead?
Here's the updated version: github.com/immersive-insiders/StarterProject
How do I set Quest 1 as the target device? I don't see any option.
😂 Bro, use at least a Quest 2, as the Quest 1 is on the verge of disappearing from the market. I think it's already disappeared; only some people who bought it on launch day still have one, and probably some university labs.
Is there any possibility of making it location specific?
What exactly do you mean?
How did you get "Meta" and "Oculus" at the top?
I assume you mean in the menu. You need to add the Meta SDK to your project.
@@IlanPerez Yes, thank you!
Can we run this on an Android device?
Nope, it works only with Quest 2 or Quest 3.
Please make a roadmap for AR VR developer
We already have a free roadmap course on our website :)
immersive-insiders.com/course/xr-roadmap
"to make your life easier we have already setup with the latest sdk"... me right now - "And how do you do that?" :D okie.. off to some other video
Aha, we have covered this in my previous video: ruclips.net/video/5D4UWqg1t-4/видео.html