- Videos: 62
- Views: 334,301
Black Whale Studio - XR AI Tutorials
Switzerland
Joined 25 Apr 2021
Hi XR Developers, I'm Robi and welcome to the Black Whale Studio YouTube channel. I'm a highly motivated individual with a passion for XR and the goal of creating useful products that make developers' lives easier. On this channel, I share my expertise and passion for XR development through comprehensive tutorials. Whether you're a budding developer or a seasoned pro, there's always something new to learn here.
Interact with us:
📌 Twitter: xrdevrob
📌 Discord: discord.gg/v8fVjweUPB
Support me:
🎋 Patreon: www.patreon.com/blackwhalestudio
#XR #Unity #Metaverse #Spatialweb #educationalvideo
Colocation with Meta’s Shared Spatial Anchors & the new Colocation Discovery API
Hi XR Developers! For a long time, arguably the most difficult part of Meta's XR SDK was dealing with Spatial Anchors, and especially sharing them among all players in a multiplayer experience to create what we call a colocated experience. As of v71, the whole setup has become much easier, and most tutorials out there are unfortunately no longer accurate. I will show you how easy it has become to set up a colocated experience, keeping this video as concise as possible.
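To give you an idea of how compact the v71 flow is, here is a minimal sketch based on the Colocation Discovery and Shared Spatial Anchors docs linked below. The `OVRColocationSession` and group-based sharing calls come from those docs, but treat the exact signatures as assumptions that may shift between SDK versions; error handling is trimmed.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using UnityEngine;

public class ColocationBootstrap : MonoBehaviour
{
    // Host: advertise a colocation session. The returned UUID doubles as the
    // group ID that shared anchors are published to.
    public async void StartHost()
    {
        var advertisement = await OVRColocationSession.StartAdvertisementAsync(null);
        if (!advertisement.Success) { Debug.LogError("Advertisement failed"); return; }
        Guid groupId = advertisement.Value;

        // Create an alignment anchor at the host's position and share it with the group.
        var anchor = new GameObject("AlignmentAnchor").AddComponent<OVRSpatialAnchor>();
        while (!anchor.Created) await Task.Yield();
        await OVRSpatialAnchor.ShareAsync(new List<OVRSpatialAnchor> { anchor }, groupId);
    }

    // Client: discover a nearby session, then load the anchors shared with that group.
    public void StartClient()
    {
        OVRColocationSession.ColocationSessionDiscovered += OnSessionDiscovered;
        _ = OVRColocationSession.StartDiscoveryAsync();
    }

    private async void OnSessionDiscovered(OVRColocationSession.Data session)
    {
        var unboundAnchors = new List<OVRSpatialAnchor.UnboundAnchor>();
        await OVRSpatialAnchor.LoadUnboundSharedAnchorsAsync(session.AdvertisementUuid, unboundAnchors);
        foreach (var unbound in unboundAnchors)
        {
            if (!await unbound.LocalizeAsync()) continue;
            // Bind the localized anchor to a GameObject and align your rig/world to it.
            unbound.BindTo(new GameObject("AlignmentAnchor").AddComponent<OVRSpatialAnchor>());
        }
    }
}
```

Compared with the old user-ID-based sharing, there is no separate publishing step and no exchange of anchor UUIDs over the network, which is what made the earlier setups so painful.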
Resources:
🔗 Colocation Discovery: developers.meta.com/horizon/documentation/unity/unity-colocation-discovery
🔗 Shared Spatial Anchors: developers.meta.com/horizon/documentation/unity/unity-shared-spatial-anchors
Find t...
Views: 4,001
Videos
Destructible Global Mesh, Instant Content Placement and MRUK Trackables - Meta's MRUK v71 update
Views 1.9K · 2 months ago
Hi XR Developers! As I'm sure you know by now, Meta's MR Utility Kit provides a rich set of utilities and tools on top of the Scene API to perform common operations when building spatially-aware apps. This makes it easier to program against the physical world and allows developers to focus on what makes their app unique. We can easily ray-cast against scene objects, find valid spawn locations i...
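For a taste of the API, here is a minimal sketch of both operations mentioned above, assuming the `Meta.XR.MRUtilityKit` namespace; the `Raycast` and `GenerateRandomPositionInRoom` helpers are from the MRUK docs, though their signatures may differ slightly between versions.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

public class MrukQueries : MonoBehaviour
{
    void Start()
    {
        // Query the scene model only after MRUK has finished loading it.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();

        // Ray-cast against the room's scene anchors (walls, tables, ...).
        var ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
        if (room.Raycast(ray, 10f, out RaycastHit hit, out MRUKAnchor anchor))
            Debug.Log($"Hit {anchor.name} at {hit.point}");

        // Ask MRUK for a free position inside the room, away from furniture.
        Vector3? spawn = room.GenerateRandomPositionInRoom(0.5f, true);
        if (spawn.HasValue)
            Debug.Log($"Valid spawn location: {spawn.Value}");
    }
}
```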
Create an MR Drawing app using Logitech’s MX Ink Stylus for Meta Quest
Views 1.7K · 4 months ago
Hi XR Developers! If you have ever used apps like ShapesXR, PaintingVR or Vermillion, you will know how great it feels to create your own paintings or sketch up your own designs in your headset. This experience just got much better with the Logitech MX-Ink pen, which is built specifically for Meta Quest and is publicly available starting this month! Today we will look at how to set up the pen, ...
Use Meta’s Llama 3 Model in Unity with Amazon Bedrock & other partner services
Views 2K · 5 months ago
Hi XR Developers! In the last video we learnt how to record a voice command using Meta's Voice SDK and turn our speech into text. In this video we can finally take this text and send it as a prompt to different Large Language Models, or LLMs for short. We can then get a response back and use Meta's Voice SDK to speak the response out loud. This allows you to, for example, create natural conversati...
Use Meta's Voice SDK for Wake Word Detection & Speech To Text (STT)
Views 2.9K · 6 months ago
Hi XR Developers! In this video we will look at how to use Meta's Voice SDK to define and understand intents. Furthermore, we will use these intents to create a wake word, like the ones we know from assistants such as Google Assistant and Siri, as well as create some logic to capture the transcript of what we said after the wake word was heard by our system! Resources: 🔗 Meta XR Core SDK: assetstore.unity.com/pack...
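The video wires this up through Wit.ai intents; as a simplified standalone illustration, here is a hedged sketch that approximates a wake word using only the Voice SDK's transcription events. It assumes `AppVoiceExperience` from the `Oculus.Voice` namespace, and the wake word string is made up.

```csharp
using Oculus.Voice;
using UnityEngine;

public class WakeWordListener : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience voice;
    [SerializeField] private string wakeWord = "hey whale"; // hypothetical wake word

    void OnEnable()
    {
        voice.VoiceEvents.OnFullTranscription.AddListener(OnTranscription);
        voice.Activate(); // start listening for the first utterance
    }

    private void OnTranscription(string transcript)
    {
        var text = transcript.ToLowerInvariant();
        if (text.StartsWith(wakeWord))
        {
            // Everything after the wake word is the actual command.
            string command = text.Substring(wakeWord.Length).Trim();
            Debug.Log($"Command captured: {command}");
        }
        voice.Activate(); // re-arm for the next utterance
    }
}
```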
Apple Vision Pro UI Interactions with Unity PolySpatial | Keyboard and Voice Inputs
Views 2.7K · 9 months ago
Hi XR Developers! In this video we will look at how to use the visionOS system keyboard for our Unity UI and how to achieve speech to text input using AI (Whisper from OpenAI)! Resources: 🔗 Unity PolySpatial 1.1 Documentation: docs.unity3d.com/Packages/com.unity.polyspatial.visionos@1.1/manual/index.html 🔗 Play to Device: docs.unity3d.com/Packages/com.unity.polyspatial.visionos@1.1/manual/PlayT...
Apple Vision Pro Input & Object Manipulation with Unity PolySpatial | Connect to Unity and Xcode
Views 6K · 9 months ago
Hi XR Developers! In this video I will teach you how to use the 3D touch input for visionOS devices. I prepared a bunch of scripts that let us manipulate game objects in different ways! There is also skeletal hand tracking, which is available to us through the XR Hands package from Unity. We will look into that in a future video! Resources: 🔗 Unity PolySpatial 1.1 Documentation: docs.unity3d.com/...
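For orientation, here is a minimal sketch of the 3D touch polling loop, modeled on Unity's PolySpatial input sample; it assumes `EnhancedSpatialPointerSupport` from `Unity.PolySpatial.InputDevices`, and field names may vary between PolySpatial versions.

```csharp
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class SpatialTouchMover : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // PolySpatial exposes visionOS gaze + pinch data as a spatial pointer state.
            var pointer = EnhancedSpatialPointerSupport.GetPointerState(touch);
            if (pointer.Kind == SpatialPointerKind.IndirectPinch && pointer.targetObject != null)
            {
                // Drag the pinched object along with the pinch position.
                pointer.targetObject.transform.position = pointer.interactionPosition;
            }
        }
    }
}
```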
Create and Manage SwiftUI windows with Unity PolySpatial for Apple visionOS
Views 3.3K · 10 months ago
Hi XR Developers! In this video we will look at the new PolySpatial 1.1 update and learn how to create and manage native SwiftUI windows to interact with our Unity scene! In a previous video we already looked at the basic setup and the sample scenes. That video also explains the most important concepts of Windows, Volumes and Spaces, so definitely check it out first! Resources...
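The bridge between C# and SwiftUI in the PolySpatial sample boils down to a pair of native entry points; this is a sketch of that pattern. The `OpenSwiftUIWindow`/`CloseSwiftUIWindow` names are assumptions and must be matched by `@_cdecl` functions in a Swift plugin, plus a `WindowGroup` with the corresponding ID.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class SwiftUIWindowDriver : MonoBehaviour
{
#if UNITY_VISIONOS && !UNITY_EDITOR
    // Implemented on the Swift side as @_cdecl("OpenSwiftUIWindow") etc.
    [DllImport("__Internal")] static extern void OpenSwiftUIWindow(string name);
    [DllImport("__Internal")] static extern void CloseSwiftUIWindow(string name);
#else
    // Editor stubs so the scene still runs during Play to Device iteration.
    static void OpenSwiftUIWindow(string name) => Debug.Log($"Open window: {name}");
    static void CloseSwiftUIWindow(string name) => Debug.Log($"Close window: {name}");
#endif

    public void ShowHelloWindow() => OpenSwiftUIWindow("HelloWorld");
    public void HideHelloWindow() => CloseSwiftUIWindow("HelloWorld");
}
```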
Virtual Windows in Passthrough Mode | Selective Passthrough Shader & Stencil Shader
Views 6K · 10 months ago
Hi XR Developers! In this video we will learn how you can easily create these amazing virtual windows inside your room, as seen in many viral videos lately, like Ocean Rift! If you want to take this experience one step further and smash realistic holes into your walls and ceiling, check out my exclusive tutorial on Patreon on how to create your own “First Encounters” Meta Quest ex...
Advanced Hand Tracking & Lightning Fast Interaction Setup | Meta’s huge v62 Update
Views 6K · 11 months ago
Hi XR Developers! In this quick video we are going to look at the new improvements that version 62 of Meta’s XR SDK brought to our hand tracking and the interaction SDK. We are going to look at Multimodal, Wide Motion Mode, or WMM for short, and Cap-sense. Furthermore, we will take a quick look at Meta’s new comprehensive interaction sample and how to set up interactions in seconds! Resources: ...
Meta XR Simulator | Synthetic Environments, Scene Recording & Multiplayer Testing
Views 8K · 11 months ago
Hi XR Developers! In today’s video we will take a look at Meta’s XR Simulator, its own XR runtime within Unity that allows us to test features without an actual Meta Quest device! Keep in mind that the Simulator only works on Windows machines! Furthermore, much like the Mixed Reality Utility Kit, the XR Simulator lets us simulate multiple synthetic environments, import and crea...
Meta Quest Spatial Anchors | Create, Save, Erase & Load spatial anchors
Views 14K · 11 months ago
Hi XR Developers! In previous videos we learned how to use mixed reality features such as passthrough, scene understanding and environment occlusion. We also looked at several ways to place objects in our room. Now, what if we would like to place our objects consistently in our scene and be able to load them again in the exact same position? This is where we need Meta’s spatial anchors, w...
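As a quick reference, here is a minimal sketch of the save/load/erase cycle. The anchor methods have been renamed across SDK versions, so these names follow the current `OVRSpatialAnchor` API rather than necessarily the exact ones shown in the video.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

public class AnchorPersistence : MonoBehaviour
{
    public async void SaveAnchor(OVRSpatialAnchor anchor)
    {
        var result = await anchor.SaveAnchorAsync();
        if (result.Success)
            PlayerPrefs.SetString("savedAnchorUuid", anchor.Uuid.ToString()); // remember for next session
    }

    public async void LoadAnchor()
    {
        var uuid = Guid.Parse(PlayerPrefs.GetString("savedAnchorUuid"));
        var unboundAnchors = new List<OVRSpatialAnchor.UnboundAnchor>();
        await OVRSpatialAnchor.LoadUnboundAnchorsAsync(new[] { uuid }, unboundAnchors);
        foreach (var unbound in unboundAnchors)
        {
            if (!await unbound.LocalizeAsync()) continue;
            // Re-create the anchored GameObject in the exact same real-world position.
            unbound.BindTo(new GameObject("RestoredAnchor").AddComponent<OVRSpatialAnchor>());
        }
    }

    public async void EraseAnchor(OVRSpatialAnchor anchor)
    {
        await anchor.EraseAnchorAsync();
        PlayerPrefs.DeleteKey("savedAnchorUuid");
    }
}
```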
Meta Quest Scene API | Scene Understanding & Content Placement with OVR Scene Manager
Views 11K · 1 year ago
Hi XR Developers! We have covered many basics of the Meta Presence Platform on this channel already; however, one very important feature that we haven’t really talked about is the Scene API! In a previous video we looked at how to scan our room and use the scene models to create mixed reality experiences with the Mixed Reality Utility Kit, or MRUK for short. MRUK is a replacement for the OVR Scene Man...
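For context, here is a hedged sketch of the classic OVR Scene Manager flow the video describes, using the `OVRSceneManager` events as documented before its deprecation in favor of MRUK.

```csharp
using UnityEngine;

public class SceneBootstrap : MonoBehaviour
{
    [SerializeField] private OVRSceneManager sceneManager;

    void Start()
    {
        // Fired once the user's scanned room (walls, desk, couch, ...) has been instantiated.
        sceneManager.SceneModelLoadedSuccessfully += OnSceneLoaded;
        // If the user never scanned their room, send them to room capture first.
        sceneManager.NoSceneModelToLoad += () => sceneManager.RequestSceneCapture();
    }

    void OnSceneLoaded()
    {
        // Scene anchors are now in the scene; e.g. list them to pick a placement surface.
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
            Debug.Log($"Scene anchor: {anchor.name}");
    }
}
```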
Meta Quest Depth API and Occlusion Shaders for Environment Occlusion in Mixed Reality
Views 8K · 1 year ago
Meta Quest Passthrough API | How to make virtual and passthrough windows
Views 16K · 1 year ago
Unity XR Hands Shapes & Poses - Create Custom Gestures with Gesture Building Blocks
Views 6K · 1 year ago
How to use Lights & Shadows in Passthrough Mode | Passthrough Relighting w/ OVR Scene Manager & MRUK
Views 5K · 1 year ago
Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK
Views 14K · 1 year ago
Unity Sentis: Create AI-driven XR experiences
Views 6K · 1 year ago
Meta OVR Hands | Pointer Data & Pinch Gesture for next generation interactions
Views 7K · 1 year ago
Meta’s Building Blocks | Develop Mixed Reality Apps lightning fast
Views 22K · 1 year ago
Create Believable Haptic Feedback with Meta’s Haptics Studio & Haptics SDK
Views 2.4K · 1 year ago
Develop for Apple Vision Pro with Unity’s PolySpatial | Play to Device, XR Simulator & visionOS
Views 16K · 1 year ago
Meta Quest Controller Input & Animations
Views 8K · 1 year ago
Cross-platform Mixed Reality Development for Meta Quest, HoloLens 2 & Magic Leap 2 with Unity
Views 3.8K · 1 year ago
Get Started with Meta Quest Development in Unity | Meta's new XR SDK release!
Views 30K · 1 year ago
Lynx R1 Setup Guide - Using Handtracking & Enabling AR Mode
Views 1.1K · 1 year ago
Unity VR Multiplayer Development with Oculus Integration & Photon Fusion (using Meta Avatars SDK)
Views 2.9K · 1 year ago
XR Hands: Installation and Setup - How to create a Hand Tracking App using Unity's XRI
Views 16K · 1 year ago
Apple Vision Pro - How to become an Apple XR Developer with Unity for visionOS
Views 13K · 1 year ago
Will this work on an AR project?
Wondering if it's possible to enable occlusion from the Depth API for the stencil shader? Meaning that e.g. my real hand can occlude / cover parts of the virtual window.
Yes, this should be possible nowadays using depth masking: developers.meta.com/horizon/documentation/unity/unity-depthapi-occlusions-advanced-usage#using-the-depth-mask-feature
Can these anchors be dynamic? Like tracking a real-life moving ball?
Unfortunately no
Very nice work! Thanks a lot. Meta should add this to their documentation, as I was really struggling with how to set up shared anchors given their incomplete documentation on this topic. One thing which is still not working is persisting the anchors throughout a session: if I turn off the headset for one player and put it on again, the anchor is shifted and the positions of the other players and network objects aren't correct anymore. Do you already have a solution for this issue? Thanks a lot again, nice work!
You need to make sure that every client that joins discovers a session, which means that a client also needs to be advertising. So if the first player (who was advertising) leaves and joins again, another client has to start advertising.
Not working.. too many errors.. Sentis is now 2.1.1
Hey man! Very nice improvement! I remember doing shared spatial anchors, and the publishing part to be able to share was a nightmare. Awesome! Also, very nice to see you! Always enjoy your content
Thanks man, hope to see you this year at another hack! :)
Hi! Could you help with getting IMU data from Meta Quest Headset?
Hey, thank you for the tutorial! When I start the app on 2 headsets, the players and the cube are synced. However, further interaction is kind of mirrored: when I move the cube to the left, it moves in the other direction on the other headset. Do you have an idea why?
Very Informative, Thanks for this!
Which editor version are you using? I'm having some issues with Unity 6 - Meta's colocation/matchmaking blocks on Unity 6 don't seem to be updated with the new multiplayer packages. Also, is there a GitHub repo available to view the colocation and alignment manager code?
Unity 6 does work with Shared Spatial Anchors and Colocation Discovery. However, I am not sure about the building blocks, and also most Oculus Samples have not been updated to v71 or to use group-based anchors yet! I would recommend setting it up like I did in the video. I tested it in Unity 6 as well without issues. Let me know if you have any issues!
I was able to replicate the approach in Unity 6, but you have to downgrade Netcode to version 1.11.0. With Netcode 1.12.0 the authorization handling didn't work properly on my side.
Very accurate and new! Big thumbs up!!!
Your tutorials are awesome and really helpful! I'm about to start an internship in Augmented Reality Development and this is helping me to prepare for that quite a lot!
Wow!! Thank you so much mate! This video literally saved my life! God bless you and your family 🙏
Is there by any chance a sprinting mechanic?
How does the scene get built? Is it all runtime or do you need to manually configure the room each time?
Great tutorial... How can we add anchors to custom objects, track the distance from them, and interact with them? For example, I want to track my monitor screen, get the distance from the screen, and trigger a response when my hand touches the screen. Is this possible with Meta Quest 3?
love you man 😭
love you too bro
What a mess... from what I read in the comments, it doesn't work for anyone :/
For me it works fine; the only problem I have at the moment is the inaccuracy of spatial anchors when you work with them at a large scale, like on a big field
I gotta say, getting it to work was the hardest thing I've ever tackled in my Unity development path, holy mackerel!
But what about spawning different prefabs? After all, who wants to spawn the same prefab over and over? Great tutorial anyway
One approach is to save a reference between each anchor and its corresponding prefab (e.g. by giving each prefab an ID) using the anchor’s UUID, which you can access to uniquely identify it. For example, you could store this mapping in PlayerPrefs. When the game starts, you can retrieve these references to determine which prefab belongs to each anchor and instantiate them accordingly. This is what I did.
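To make that suggestion concrete, here is a small sketch of the mapping; the names are illustrative, and it assumes an integer prefab ID stored in PlayerPrefs under the anchor's UUID.

```csharp
using UnityEngine;

public class AnchorPrefabMap : MonoBehaviour
{
    [SerializeField] private GameObject[] prefabs; // index doubles as the prefab ID

    // Call when creating an anchor: remember which prefab it represents.
    public void Register(OVRSpatialAnchor anchor, int prefabId) =>
        PlayerPrefs.SetInt(anchor.Uuid.ToString(), prefabId);

    // Call after an anchor has been loaded and bound: restore its prefab.
    public void Restore(OVRSpatialAnchor anchor)
    {
        int prefabId = PlayerPrefs.GetInt(anchor.Uuid.ToString(), -1);
        if (prefabId >= 0)
            Instantiate(prefabs[prefabId], anchor.transform.position, anchor.transform.rotation, anchor.transform);
    }
}
```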
Thank you this was very helpful!!!
Wow, you got that out quick. I'm still trying to figure out why the update broke passthrough and hand tracking in my game. Thanks, very informative!
Looks like I'm gearing up mentally to jump back into this :), I'm glad to see how far along v71 has come and what all was added. Thanks for the demonstrations and walkthrough!
This is great!! Some of the features most developers were looking forward to. Onto the next one 💪🏼 Mesh Destruction ✅ API to access Camera ⏳ Climb Interaction ⏳
The wait is almost over
Guys make something together :)
Thanks! Very few people are actually making updated videos on Meta's XR SDKs. Can you please show how to properly implement the virtual keyboard with their UI set? I have spent a lot of time trying different things from the building blocks and reading the documentation, but it is outdated and some components it references aren't there anymore. No one seems to have made a video on using Meta's virtual keyboard properly.
Hi! Thanks for the video. Is Unity 6's incompatibility with the Meta SDK a problem that only happens to me? The Asset Store is also still only updated up to v69.
Hey, what issues exactly are you running into?
Hi, great tutorial! When I play this from Unity it works perfectly, but when I build it and install it on my Quest it no longer has passthrough. Any ideas?
same
how to create scripts about rotation
did you find anything
@@xXGeorgLPXx yes
@@elegantroc8877 yoo could you send it to me ?
I want to do exactly this in Unreal Engine. Help!
Whenever I try to run it, it lags like crazy, even though I have a 7700X and 6800 XT. When I compile it, it lags just as much, and my hardware is nowhere near full usage, only around 20% GPU usage. Any clue how to solve this?
workingAnchor.created never comes true!
Can the virtual window work if I create a whole 3D scene for it instead of skybox?
Yes, definitely
Hey, I love the tutorial! So clear and informative! Question: I've finished it and I'm viewing through Quest Link, and while I can view the MX Ink stylus through both eyes, I only have the drawing image through my left eye. Any help? Thank you!
Great video.. I am trying to demo a scene on my Vision Pro through Play to Device/PolySpatial.. Gaze works, but my hand gestures and pinch to grab (or any hand interaction for that matter) do not work..
I am able to load the scene on my Apple Vision Pro through Unity. The gaze function seems to work but the hand interactions don't. I pinch but nothing happens. I tried to manipulate the cube but I am unable to do so. Any idea what is happening? I would really appreciate it
Amazon made changes, and now it no longer works the way it was set up; please help
So can I completely use the interface written in SwiftUI to drive the entire Unity scene?
Hi there, I can't seem to find the Room Model building block. Should I use the Scene Mesh building block instead?
Should work the same :)
To those who have a problem with the simulator window not popping up. The simulator defaults to using integrated graphics instead of your dedicated graphics card, which the simulator can't use. You'll have to disable the integrated graphics device manually in the device manager to make it work. I've noticed the same problem when trying to get Quest link to work.
Other solution, which worked for me: "Tick the option for "Oculus" in Edit > Project Settings > XR Plug-in Management > Windows, Mac, Linux"
Is it possible to see the camera passthrough on a monitor with a Link cable setup? I can see everything in passthrough in my Quest 3, but on my monitor all passthrough content is black. On top of that, every time I want to see a skybox, Unity shows me the passthrough instead, even if I only use plane passthrough. Any ideas? :/
Love your content but the background 'music' is SO distracting when trying to follow. It's like attending a lecture in an elevator!
I tried to load anchors in the same session without quitting the app (my use case is to visually clear the anchors after saving and loading them on the user's command) - the resultant anchors-array from LoadingUnboundAnchors is NOT null but its length=0. Once I quit and reopen the app, the anchors are loading successfully. I want to load them before quitting the app (I mean in the same session). Any ideas on why this is not happening?? Or any suggestions on how to implement this?
I can't seem to find the room model building block?
It's been removed. For objects to be able to physically interact with your room, you can use the Scene Mesh building block instead.
Meta XR simulator does not appear even after activating it. Nothing.
Hi, another great video. I was thinking maybe you could create a tutorial on how to use Unity Sentis to create AI NPCs that generate dialogue based on a knowledge database? I saw a video for Inworld AI (ruclips.net/video/gUCahPSAut4/видео.html), but right now it doesn't exist in the Asset Store, and I don't want to integrate ChatGPT, so I was wondering if I could create an AI with similar abilities using Unity Sentis instead. What do you think about it? Also, is there a tutorial on how to create models for Unity Sentis?
I know this is an old video, but any help would be appreciated. How do I make it so that when the player moves in real life, they don't go through walls or objects? I've tried some things, but they always feel harsh.
Hi, thanks for this tutorial. I've got it working on my Apple Vision Pro. The gesture gets detected, but I don't know how to grab something with that gesture. Is there any tutorial available for this?