- Videos: 178
- Views: 334,985
Black Whale - XR AI Tutorials
Switzerland
Joined: 25 Apr 2021
Hi XR Developers, I'm Robi and welcome to the Black Whale Studio YouTube channel. We are a game development studio specializing in the futuristic realms of Extended Reality (XR), which includes both Virtual Reality (VR) and Augmented Reality (AR). We bring you immersive multiplayer experiences that are not just games, but virtual realities you can step into. On this channel, we share our expertise and passion for XR game development through comprehensive tutorials. Whether you're a budding developer or a seasoned pro, there's always something new to learn with us.
Interact with us:
📌Twitter: xrdevrob
📌 Discord: discord.gg/v8fVjweUPB
Support me:
🎋 Patreon: www.patreon.com/blackwhalestudio
Welcome to this channel and let's create an open and accessible online space that can be experienced with continuity!
#XR #Unity #Metaverse #Spatialweb #educationalvideo
Destructible Global Mesh, Instant Content Placement and MRUK Trackables - Meta's MRUK v71 update
Hi XR Developers! As I’m sure you know by now, Meta’s MR Utility Kit provides a rich set of utilities and tools on top of the Scene API to perform common operations when building spatially-aware apps. This makes it easier to program against the physical world and allows developers to focus on what makes their app unique. We can easily ray-cast against scene objects, find valid spawn locations inside our rooms, find the most suitable surface for placing objects, or even achieve some neat lighting effects against our furniture. Version 71 of MRUK is a big update that adds even more useful features, which we will quickly cover in this video.
Resources:
🔗 MRUK Overview: developers.meta.com/hori...
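The description above mentions finding valid spawn locations inside a room. MRUK exposes this in C# through its own utilities; purely as a language-agnostic illustration of the underlying idea, here is a hedged Python sketch of rejection sampling over a room's floor rectangle while avoiding furniture footprints. All names, parameters, and the clearance value are illustrative, not MRUK's actual API.

```python
import random

def find_spawn_positions(room_min, room_max, obstacles,
                         count=5, clearance=0.2, max_tries=1000):
    """Rejection-sample 2D floor positions inside a room, avoiding obstacles.

    room_min / room_max: (x, z) corners of the room's floor rectangle.
    obstacles: list of ((min_x, min_z), (max_x, max_z)) furniture footprints.
    clearance: minimum distance to keep from any obstacle edge, in meters.
    """
    def blocked(point):
        x, z = point
        for (ox0, oz0), (ox1, oz1) in obstacles:
            # Expand the obstacle's box by the clearance margin.
            if (ox0 - clearance <= x <= ox1 + clearance and
                    oz0 - clearance <= z <= oz1 + clearance):
                return True
        return False

    spawns = []
    for _ in range(max_tries):
        if len(spawns) >= count:
            break
        candidate = (random.uniform(room_min[0], room_max[0]),
                     random.uniform(room_min[1], room_max[1]))
        if not blocked(candidate):
            spawns.append(candidate)
    return spawns
```

In a real MRUK app the room rectangle and furniture boxes would come from the Scene API's anchors rather than hard-coded values.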
Views: 1,416
Videos
Create an MR Drawing app using Logitech’s MX Ink Stylus for Meta Quest
Views: 1.5K · 3 months ago
Hi XR Developers! If you have ever used apps like ShapesXR, PaintingVR or Vermillion, you will know how great it feels to create your own paintings or sketch up your own designs in your headset. This experience just got much better with the Logitech MX-Ink pen, which is built specifically for Meta Quest and is publicly available starting this month! Today we will look at how to set up the pen, ...
Use Meta’s Llama 3 Model in Unity with Amazon Bedrock & other partner services
Views: 1.7K · 3 months ago
Hi XR Developers! In the last video we learnt how to record a voice command using Meta’s Voice SDK and turn our speech into text. In this video we can finally take this text and send it as a prompt to different Large Language Models or LLMs for short. We can then get a response back and use Meta’s Voice SDK to speak the response out loud. This allows you to for example create natural conversati...
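The workflow in the video is C# inside Unity; as a rough illustration of what "sending the text as a prompt to an LLM" looks like on the Bedrock side, here is a small Python sketch that builds an InvokeModel request body for a Meta Llama model. The chat template and field names follow Bedrock's documented Llama format as I understand it; treat them as assumptions and verify against the current docs before relying on them.

```python
import json

# Llama 3 instruct-style chat template (assumed; check current Bedrock docs).
LLAMA3_TEMPLATE = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "{message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
)

def build_llama3_request(message, max_gen_len=256, temperature=0.5, top_p=0.9):
    """Wrap a transcribed voice command in the chat template and return
    the JSON body for Bedrock's InvokeModel call."""
    body = {
        "prompt": LLAMA3_TEMPLATE.format(message=message),
        "max_gen_len": max_gen_len,
        "temperature": temperature,
        "top_p": top_p,
    }
    return json.dumps(body)

# With boto3 this body would be sent roughly like (not executed here):
#   client = boto3.client("bedrock-runtime")
#   client.invoke_model(modelId="meta.llama3-8b-instruct-v1:0",
#                       body=build_llama3_request("Hello!"))
```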
Use Meta's Voice SDK for Wake Word Detection & Speech To Text (STT)
Views: 2.3K · 4 months ago
Hi XR Developers! In this video we will look at how to use Meta’s Voice SDK to define and understand intents. Furthermore, we will use these intents to create a wake word that we know from assistants such as Google and Siri, as well as create some logic to capture the transcript of what we said after the wake word was heard by our system! Resources: 🔗 Meta XR Core SDK: assetstore.unity.com/pack...
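The video implements wake-word detection with Meta's Voice SDK in C#, but the gating logic itself is simple enough to sketch generically. Below is an illustrative Python version (all names are made up): it ignores transcripts until the wake word appears, captures a command from the same utterance if one follows the wake word, and otherwise captures the next utterance.

```python
class WakeWordGate:
    """Minimal sketch of the wake-word logic described above."""

    def __init__(self, wake_word):
        self.wake_word = wake_word.lower()
        self.listening = False  # True once a bare wake word was heard

    def on_transcript(self, text):
        """Feed one transcription; return a captured command or None."""
        lowered = text.lower()
        if not self.listening:
            idx = lowered.find(self.wake_word)
            if idx == -1:
                return None  # no wake word: ignore this utterance
            remainder = text[idx + len(self.wake_word):].strip(" ,.!?")
            if remainder:
                return remainder  # command spoken in the same utterance
            self.listening = True  # bare wake word: capture the next one
            return None
        self.listening = False
        command = text.strip()
        return command or None
```

In the Voice SDK version, the transcript events delivered by the AppVoiceExperience component would play the role of `on_transcript` calls here.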
Apple Vision Pro UI Interactions with Unity PolySpatial | Keyboard and Voice Inputs
Views: 2.6K · 8 months ago
Hi XR Developers! In this video we will look at how to use the visionOS system keyboard for our Unity UI and how to achieve speech to text input using AI (Whisper from OpenAI)! Resources: 🔗 Unity PolySpatial 1.1 Documentation: docs.unity3d.com/Packages/com.unity.polyspatial.visionos@1.1/manual/index.html 🔗 Play to Device: docs.unity3d.com/Packages/com.unity.polyspatial.visionos@1.1/manual/PlayT...
Apple Vision Pro Input & Object Manipulation with Unity PolySpatial | Connect to Unity and Xcode
Views: 6K · 8 months ago
Hi XR Developers! In this video I will teach you how to use the 3D touch input for visionOS devices. I prepared a bunch of scripts that let us manipulate game objects in different ways! There is also Skeletal Hand Tracking, which is available to us through the XR Hands package from Unity. We will look into that in a future video! Resources: 🔗 Unity PolySpatial 1.1 Documentation: docs.unity3d.com/...
Create and Manage SwiftUI windows with Unity PolySpatial for Apple visionOS
Views: 3.2K · 9 months ago
Hi XR Developers! In this video we will look at the new PolySpatial 1.1 update and learn how to create and manage native SwiftUI windows to interact with our Unity scene! In a previous video we already looked at the basic setup and the sample scenes. That video also explains the most important concepts of Windows, Volumes and Spaces, so definitely check it out first! Resources...
Virtual Windows in Passthrough Mode | Selective Passthrough Shader & Stencil Shader
Views: 5K · 9 months ago
Advanced Hand Tracking & Lightning Fast Interaction Setup | Meta’s huge v62 Update
Views: 6K · 9 months ago
Meta XR Simulator | Synthetic Environments, Scene Recording & Multiplayer Testing
Views: 8K · 10 months ago
Meta Quest Spatial Anchors | Create, Save, Erase & Load spatial anchors
Views: 12K · 10 months ago
Meta Quest Scene API | Scene Understanding & Content Placement with OVR Scene Manager
Views: 11K · 10 months ago
Meta Quest Depth API and Occlusion Shaders for Environment Occlusion in Mixed Reality
Views: 8K · 10 months ago
Meta Quest Passthrough API | How to make virtual and passthrough windows
Views: 14K · 11 months ago
Unity XR Hands Shapes & Poses - Create Custom Gestures with Gesture Building Blocks
Views: 6K · 11 months ago
How to use Lights & Shadows in Passthrough Mode | Passthrough Relighting w/ OVR Scene Manager & MRUK
Views: 4.7K · 11 months ago
Mixed Reality Utility Kit: Build spatially-aware apps with Meta XR SDK
Views: 13K · 11 months ago
Unity Sentis: Create AI-driven XR experiences
Views: 6K · 1 year ago
Meta OVR Hands | Pointer Data & Pinch Gesture for next generation interactions
Views: 6K · 1 year ago
Meta’s Building Blocks | Develop Mixed Reality Apps lightning fast
Views: 21K · 1 year ago
Create Believable Haptic Feedback with Meta’s Haptics Studio & Haptics SDK
Views: 2.3K · 1 year ago
Develop for Apple Vision Pro with Unity’s PolySpatial | Play to Device, XR Simulator & visionOS
Views: 15K · 1 year ago
Meta Quest Controller Input & Animations
Views: 8K · 1 year ago
Cross-platform Mixed Reality Development for Meta Quest, HoloLens 2 & Magic Leap 2 with Unity
Views: 3.7K · 1 year ago
Get Started with Meta Quest Development in Unity | Meta's new XR SDK release!
Views: 29K · 1 year ago
Lynx R1 Setup Guide - Using Handtracking & Enabling AR Mode
Views: 1.1K · 1 year ago
Unity VR Multiplayer Development with Oculus Integration & Photon Fusion (using Meta Avatars SDK)
Views: 2.8K · 1 year ago
XR Hands: Installation and Setup - How to create a Hand Tracking App using Unity's XRI
Views: 15K · 1 year ago
Apple Vision Pro - How to become an Apple XR Developer with Unity for visionOS
Views: 12K · 1 year ago
Jump in VR using the Unity Character Controller - Advanced VR Tutorials
Views: 3.1K · 1 year ago
Wow!! Thank you so much mate! This video literally saved my life! God bless you and your family 🙏
Is there by any chance a sprinting mechanic?
How does the scene get built? Is it all runtime or do you need to manually configure the room each time?
Great tutorial... How can we add anchors to custom objects, track distance from it, and interact with it. For example, I want to track my monitor screen, get distance from the screen and trigger a response when my hand touches the screen. Is this possible with meta quest 3?
love you man 😭
love you too bro
What a mess... from what I read in the comments, it doesn't work for anyone :/
For me it works fine; the only problem I have at the moment is inaccuracy of spatial anchors when you work with them at a large scale, like on a big field
I gotta say, getting it to work was the hardest thing I've ever tackled in my Unity development path, holy mackerel!
But what about spawning different prefabs? After all, who wants to spawn the same prefab over and over? Great tutorial anyway
Thank you this was very helpful!!!
Wow, you got that out quick. I'm still trying to figure out why the update broke passthrough and hand tracking in my game. Thanks, very informative!
Looks like I'm gearing up mentally to jump back into this :), I'm glad to see how far along v71 has come and what all was added. Thanks for the demonstrations and walkthrough!
This is great!! Some of the features most developers were looking forward to. Onto the next one 💪🏼 Mesh Destruction ✅ API to access Camera ⏳ Climb Interaction ⏳
The wait is almost over
Guys make something together :)
Thanks! Very few people are actually making updated videos on Meta's XR SDKs. Can you please show how to properly implement the virtual keyboard with their UI set? I have spent a lot of time trying different stuff from building blocks and reading the documentation but it is outdated and some components aren't there anymore as guided in the documentation. No one seems to have made a video on using meta's virtual keyboard properly.
hi! Thanks for the video. Is Unity 6's incompatibility with the Meta SDK a problem that only happens to me? The Asset Store is also still only updated up to v69.
Hey, what issues exactly are you running into?
Hi, Great tutorial! When I play this from unity it works perfectly but when I build it and install on my quest it no longer has passthrough. Any ideas?
same
how to create scripts about rotation
did you find anything
i want to do this exactly in Unreal Engine. Help!
Whenever I try to run it lags like crazy, even though I have a 7700x and 6800 xt. When I compile it it lags just as much, and my hardware is nowhere near full usage, only around 20% gpu usage. Any clue how to solve this?
workingAnchor.created never comes true!
Can the virtual window work if I create a whole 3D scene for it instead of skybox?
Yes, definitely
Hey I love the tutorial! So clear and informative! Question: I've finished it and I'm viewing through Quest Link, and while I can view the MX Ink stylus through both eyes, I only have the drawing image through my left eye. Any help? Thank you!
Great video.. I am trying to demo a scene on my Vision Pro through play to device/polyspatial.. Gaze works but my hand gestures and pinch to grab (or any hand interaction for that matter) does not work..
I am able to load the scene in my Apple Vision Pro through Unity. The gaze function seems to work but the hand interactions don't. I pinch but nothing happens. I tried to manipulate the cube but I am unable to do so. Any ideas on what is happening? I would really appreciate it
Amazon made changes and now it doesn't work anymore the way it was set up, please help
So can I completely use the interface written in SwiftUI to drive the entire Unity scene?
Hi there, I can't seem to find the Room Model building block. Should I use the Scene Mesh building block instead?
Hi, Great tutorial! When I play this from unity it works perfectly but when I build it and install on my quest it no longer has passthrough. Any ideas?
To those who have a problem with the simulator window not popping up. The simulator defaults to using integrated graphics instead of your dedicated graphics card, which the simulator can't use. You'll have to disable the integrated graphics device manually in the device manager to make it work. I've noticed the same problem when trying to get Quest link to work.
Other solution, which worked for me: "Tick the option for "Oculus" in Edit > Project Settings > XR Plug-in Management > Windows, Mac, Linux"
Is it possible to see the camera passthrough on a monitor with a Link cable setup? I can see everything in passthrough in my Quest 3, but on my monitor all passthrough content is black. On top of that, every time I want to see a skybox, Unity shows me the passthrough instead, even if I only use plane passthrough. Any ideas? :/
Love your content but the background 'music' is SO distracting when trying to follow. It's like attending a lecture in an elevator!
I tried to load anchors in the same session without quitting the app (my use case is to visually clear the anchors after saving and loading them on the user's command) - the resultant anchors-array from LoadingUnboundAnchors is NOT null but its length=0. Once I quit and reopen the app, the anchors are loading successfully. I want to load them before quitting the app (I mean in the same session). Any ideas on why this is not happening?? Or any suggestions on how to implement this?
I can't seem to find the room model building block?
Meta XR simulator does not appear even after activating it. Nothing.
Hi, another great video. I was thinking maybe you could create a tutorial on how to use Unity Sentis to create AI NPCs that generate dialogue based on a knowledge database? I saw a video for Inworld AI ruclips.net/video/gUCahPSAut4/видео.html but right now it doesn't exist in the Asset Store, and I don't want to integrate ChatGPT, so I was wondering if I could create an AI with similar abilities using Unity Sentis instead. What do you think about it? Also, is there a tutorial on how to create models for Unity Sentis?
I know this is an old video but any help would be appreciated. How do I make it so that when the player moves in real life, they don't go through walls or objects? I've tried some things, but they always feel harsh.
Hi, thanks for this tutorial. I've got that working on my Apple Vision Pro. the gesture will be detected but I don't know how to grab something with that gesture. Is there any tutorial available for this?
How do you have these anchors persist in the unity editor while not in play mode? A use case, I want to reskin my room with a 3d model I created and textured, how would I get the room scale 3d model to match the physical room?
Great video. The XR Simulator window didn't pop up for me though. Must be missing something...
Hello, I am finding the voice recognition to be terribly inaccurate. For instance, I can say "bloople", and although my wake word is "hey matt", "bloople" will activate the speech to text. Additionally, after giving a transcription, and hearing the noise that confirms it received my transcription, the app voice experience does not stop listening, and will continue to take more prompts, even past 20 seconds. Any tips?
You could try a different model
Hey, what a great tutorial. How can I keep my skybox texture when the passthrough is toggled off? I mean I don't really like it to be completely black.
You need to switch the camera clear flags and background color. So the background needs to be switched back to Skybox.
CAT
@@VtuberHotelVKing Lmaoo. Didn't expect any hololive fans here tho. Yeah, she's my oshi. My cutest kitten🥰🥰🥰
ZAMN
Guys, please help. When I install the Meta XR Core package I can't find any elements inside; the files are empty.
is it possible to get the user view? I want to get the user view to do image recognition
I followed along with this tutorial, but the appVoiceExperience doesn't start on awake. It will not respond to my wake word on play. Instead I have to manually press Activate on the App Voice Experience component. However, when I do this, the application stops listening for my wake word and will instead always trigger the wake-word-detected event and the complete-transcription event within ms of each other. Do you have any advice?
same issue here.
Did you find a solution to this?!?!
Great tutorial, I am having an issue: when I click Play, the synthetic environment window opens up, but there's no Meta XR Simulator window. Can you help me with it, like what could be the issue?
i'm having this issue too in unity 2022.3.28f1 with v68 of the simulator. not sure if something broke recently?
@@Fizzyhex I'm using the same Unity version. For testing purposes I reimported the Meta SDK in a new separate project using URP and there it's working completely fine. I think the issue must've happened while setting up the SDK, as I also activated the XR plugins in Unity before importing the SDK, which might've been causing the issue
same issue here in unity 2021.3.36f1
Solution which worked for me: "Tick the option for "Oculus" in Edit > Project Settings > XR Plug-in Management > Windows, Mac, Linux"
Hi, I've followed the exact steps but the script gives me errors like TensorFloat and IWorker couldn't be found. And I am sure that I've downloaded Sentis
how did u fix the error? I've encountered the same errors
@@firefly5033 Well, if I remember right, be sure that you have downloaded the older version of this Unity package, like the experimental one. You can do that by looking up and downloading the exact version number. But after I used it, it didn't work well, so I gave up on this project
How can I add more sign recognitions, like 4, +, -, /, *? And is it possible to do this with a 3D drawing pen?
This is awesome, thanks. You've inspired me to make some XR content! Keep it up!
bro! when u publish ur video about hand gestures in visionOS???
Tell them to get some GPS receivers in the next version. Else, if there's a seamless way to use it from a connection to your cell phone, I'd like to know.