Patents are a big deal, and Apple tends to file for every little thing they can possibly dream up to prevent other companies like Meta from launching technologies that use something Apple holds a patent on.
I have a feeling that Meta is addressing their fuzzy room mapping with their redevelopment of Augments. I hope to see it soon.
Thanks for a great episode as always Dan! Nailed it! ❤
Good to see you again! About this raw mesh of the room: although it's been a while since I've played some of these MR games, I'm pretty sure Figmin XR used that mesh. I could roll a ball off my couch, albeit a low-poly couch, and it rolled behind the table, which occluded the 3D object. But it's never perfect, and the mesh itself can shift in position. Probably why Meta is taking forever to implement Augments.
Some games use hand occlusion, not like the AVP's, but a transparent 3D model mask of a hand. Piano Vision, Toy Monsters, and Pencil are games/apps that use this hand occlusion. It's especially cool in Toy Monsters, IMO. I really hope we get it in the UI at some point. Meta is clearly going after AVP with their v67 update, but not having eye tracking to foveate the render resolution while in the main UI probably limits the Quest 3's capabilities greatly.
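That mask trick boils down to a per-pixel depth test: the hand model is drawn invisibly, but its depth still beats any virtual pixel behind it, so passthrough shows through there. A toy sketch of that test (NumPy, made-up frame data; not Meta's or any engine's actual pipeline):

```python
import numpy as np

def composite_with_occlusion(passthrough_rgb, virtual_rgb, hand_depth, virtual_depth):
    """Wherever the (invisible) hand mask is closer to the camera than the
    virtual object, show the passthrough pixel instead of the virtual one.
    Depths are in metres; np.inf means 'nothing there'."""
    hand_in_front = hand_depth < virtual_depth        # boolean mask, HxW
    mask = hand_in_front[..., np.newaxis]             # broadcast over RGB channels
    return np.where(mask, passthrough_rgb, virtual_rgb)

# Toy 2x2 frame: hand at 0.4 m covers the left column; virtual cube at 0.6 m.
passthrough = np.full((2, 2, 3), 10, dtype=np.uint8)   # dark "camera" pixels
virtual     = np.full((2, 2, 3), 200, dtype=np.uint8)  # bright cube pixels
hand_depth  = np.array([[0.4, np.inf],
                        [0.4, np.inf]])
cube_depth  = np.full((2, 2), 0.6)

frame = composite_with_occlusion(passthrough, virtual, hand_depth, cube_depth)
# Left column keeps the passthrough (hand occludes the cube); right column shows the cube.
```

The same comparison is what a depth-only render pass does on the GPU, just without ever writing the hand's colour.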
I don't think AR is a gimmick, as some experiences use it to wonderful effect, but like you say, occlusion is an issue. Bomber drone is great, but sometimes it places enemies behind a wall, outside the room, etc. The thing that bothers me, though, is that even if you use cloud storage for the point cloud, you frequently have to rescan the room; for some reason the Quest doesn't seem to want to remember the room layout and defaults to a 1-metre circle.
Welcome back!
Welcome back! Did you see that Inseye Lumi will produce lenses that will bring eye tracking to the Quest 3?
I did, thanks man. Looks very cool.
Yeah I agree. I don't think we are there yet. Still very gimmicky for me.
Meta clearly adds more cameras and scanners to better spy on you, not to make better couch models for your game. :)
Cause that's what I was talking about.
Whatever, I only use my Quest 3 for sim racing games and couldn't care less whether AR works correctly. I would assume Meta will make the mesh thing better in the future; at least now they have the passthrough looking better. Eye tracking is not a necessity for my gaming (I don't do social apps or avatars).
VR is a personal experience and everyone is looking for different things. I just feel like passthrough quality, headset tracking, controller tracking, and lens/display quality should still be the priority. Eye tracking has a short list of benefits and is overhyped. I have three headsets with eye tracking, and all it does is set the IPD automatically.
@ChristephenWood agreed. I just think they should abandon the use cases they're so bent on focusing on.
@ChristephenWood Ah, thanks for all your information. I agree with you on passthrough. It has come a long way since the first Oculus Rift and its very crude, fuzzy black-and-white passthrough. Take care.
@bringhurstvr I own a Varjo XR-3 and a Varjo VR-3. The eye-tracking support on the game side just isn't there. On the OpenXR side there is support, but game devs seem not to implement Varjo support.
You just gained a follower. I'm glad you're not here shilling for half-assed games. I just got the Meta Quest 3 512.
Lol, you raise good points and insight as a consumer, but you also called me poor for having a smaller house.
I want AR to not be gaming-focused. I want to replace walls with my email inbox and windows with the weather I want to see, and to change the colour of my space as I feel. Most of all, I want real-time translation, so when I look at a sign in another language it gets translated, plus map markers for a destination so I can navigate using whatever mapping software is on my phone.
So far the game side has been okay for me, though I'm much more immersed in full VR games than in AR, but that's likely due to the lack of AR use in the current game market. Virtual Desktop and whatever options we get in the future would be nice, even something as simple as an HDMI dongle that streams the output of any device into your VR home space so you can hang monitors in AR.
Agreed. But my place is small too. It's a townhome that's maybe 55 feet across.
If I understand correctly, developers have access to the simplified/labeled “Scene Anchors” as well as the detailed “Scene Mesh,” so the latter isn’t being discarded. However, it’s much more resource intensive to calculate collisions with complicated shapes than with simple primitives (like boxes), so some developers may rely on the former to improve performance. Meta is working on an AI room-scanning model, called “SceneScript,” which should provide a nice middle ground between the simple anchors and detailed mesh. Regarding your suggestion to limit the size of the play space that can be scanned, I suspect the appropriate limit would vary by game. For example, a very simple application may function fine in a large space, while a more complex game could quickly run out of resources. Admittedly, I’m speaking as an extreme amateur who clumsily dabbles in coding from time to time, so others may be able to offer more detailed and accurate insight.
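The cost gap between primitives and the full mesh can be pictured with a toy example: testing a ball against a couch approximated as one box is a single constant-time check, while testing it against the couch's mesh scales with the triangle count. This is my own simplification (NumPy; the vertex-only mesh test stands in for the point-to-triangle distance and spatial partitioning a real engine would use; none of these names are Meta's API):

```python
import numpy as np

def sphere_hits_aabb(center, radius, box_min, box_max):
    """Sphere vs. axis-aligned box: one clamp plus one distance check, O(1)."""
    closest = np.clip(center, box_min, box_max)       # nearest point on the box
    return np.sum((center - closest) ** 2) <= radius ** 2

def sphere_hits_mesh(center, radius, triangles):
    """Naive sphere vs. mesh: check the distance to every vertex of every
    triangle, O(number of triangles). A real engine tests point-to-triangle
    distance and prunes with a spatial structure, but the scaling is the point."""
    for tri in triangles:
        for vertex in tri:
            if np.sum((center - vertex) ** 2) <= radius ** 2:
                return True
    return False

# A ball near a couch approximated as a single box: one cheap test answers it.
couch_min = np.array([0.0, 0.0, 0.0])
couch_max = np.array([2.0, 0.8, 0.9])
ball = np.array([2.3, 0.4, 0.4])
touching = sphere_hits_aabb(ball, 0.5, couch_min, couch_max)
```

With the detailed Scene Mesh, that same query would loop over every triangle of the low-poly couch, which is why a physics-heavy game may prefer the box anchors.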
This was a very insightful reply.
🤨
In Figmin you can swap between mesh and cubes
@bluescreentobsen Yeah, I've discovered they give you a little more access.
What a welcome surprise! Nice to see ya Mr Bringhurst... Please stop by again soon! 😊