The Quest 3 is Leaving a LOT of AR Scan Data Unused

  • Published: 29 Sep 2024

Comments • 23

  • @andreburger1981
    @andreburger1981 3 months ago +2

    Patents are a big deal, and Apple tends to register every little thing it can possibly dream up to prevent other companies like Meta from shipping useful features that Apple owns the patent for.
    I have a feeling that Meta is addressing its fuzzy room mapping with the redevelopment of Augments. I hope to see it.
    Thanks for a great episode as always, Dan! Nailed it! ❤

  • @S1MONSAYSVR
    @S1MONSAYSVR 3 months ago +3

    Good to see you again! About that raw mesh of the room: although it's been a while since I've played some of these MR games, I'm pretty sure Figmin XR used that mesh. I could roll a ball off my couch (albeit a low-poly couch), and it rolled behind the table, which occluded the 3D object. But it's never perfect, and the mesh itself can shift in position, which is probably why Meta is taking forever to implement Augments.
    Some games also use hand occlusion, not like the AVP, but via a transparent 3D model mask of a hand. Piano Vision, Toy Monsters and Pencil are games/apps that use this kind of hand occlusion. Especially cool in Toy Monsters imo. I really hope we get it in the UI at some point. Meta is clearly going after the AVP with their v67 update, but not having eye tracking to foveate the render resolution in the main UI probably limits the Quest 3's capabilities greatly.
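
    The "transparent 3D model mask" trick described above boils down to a depth test: an invisible mesh of the hand (or the room) writes depth but no colour, so any virtual pixel that falls behind it is discarded and the passthrough shows through. Below is a minimal, purely illustrative CPU-side sketch in Python/NumPy; the function and variable names are invented for the example, and real engines do this per pixel on the GPU via the depth buffer.

    ```python
    import numpy as np

    def composite_with_occlusion(passthrough_rgb, virtual_rgb, virtual_depth, mesh_depth):
        """Show a virtual pixel only where it is closer than the real-world mesh surface."""
        visible = virtual_depth < mesh_depth        # virtual content is in front of the real surface
        out = passthrough_rgb.copy()
        out[visible] = virtual_rgb[visible]         # elsewhere the passthrough (couch, hand) wins
        return out

    # Toy frame: a scanned couch surface 1.0 m away, and a virtual ball that is
    # 0.8 m away on the left half of the image and 1.2 m (behind the couch) on the right.
    h, w = 4, 4
    passthrough = np.zeros((h, w, 3), dtype=np.uint8)     # stand-in for the camera feed
    ball = np.full((h, w, 3), 255, dtype=np.uint8)        # stand-in for the rendered ball
    ball_depth = np.full((h, w), 1.2)
    ball_depth[:, : w // 2] = 0.8
    couch_depth = np.full((h, w), 1.0)                    # depth written by the invisible scene mesh
    frame = composite_with_occlusion(passthrough, ball, ball_depth, couch_depth)
    # Left half shows the ball; right half stays passthrough because the couch occludes it.
    ```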

  • @c6jones720
    @c6jones720 3 months ago +1

    I don't think AR is a gimmick, as some experiences use it to wonderful effect, but like you say, occlusion is an issue. Bomber Drone is great, but sometimes it places enemies behind a wall, outside the room, etc. The thing that bothers me, though, is that even if you use point cloud storage, you frequently have to rescan the room, because for some reason the Quest doesn't seem to want to remember the room layout and defaults to a 1-metre circle.

  • @Eagledocstew
    @Eagledocstew 3 months ago +2

    Welcome back!

  • @Eagledocstew
    @Eagledocstew 3 months ago +1

    Welcome back! Did you see that Inseye Lumi will produce lenses that will bring eye tracking to the Quest 3?

    • @bringhurstvr
      @bringhurstvr  3 months ago

      I did, thanks man. Looks very cool.

  • @BlakeBlackstone
    @BlakeBlackstone 3 months ago

    Yeah I agree. I don't think we are there yet. Still very gimmicky for me.

  • @MrTechcat
    @MrTechcat 3 months ago

    Meta clearly adds more cameras and scanners to better spy on you... not to make better couch models for your game. :)

    • @bringhurstvr
      @bringhurstvr  3 months ago +1

      Cause that's what I was talking about.

  • @dathyr1
    @dathyr1 3 months ago +1

    Whatever, I only use my Quest 3 for sim racing games and couldn't care less whether AR works correctly. I would assume Meta will make the mesh thing better in the future; at least now they have the passthrough looking better. Eye tracking is not a necessity for my gaming (I don't do social apps or avatars).

    • @ChristephenWood
      @ChristephenWood 3 months ago +1

      VR is a personal experience and everyone is looking for different things. I just feel like passthrough quality, headset tracking, controller tracking, and lens/display quality should still be the priority. Eye tracking has a short list of benefits and is overhyped. I have three headsets with eye tracking and all it does is set the IPD automatically.

    • @bringhurstvr
      @bringhurstvr  3 months ago +1

      @ChristephenWood agreed. I just think they should abandon the use cases they're so bent on focusing on.

    • @dathyr1
      @dathyr1 3 months ago

      @ChristephenWood Ah, thanks for all your information. I agree with you on passthrough. It has come a long way since the first Oculus Rift and its very crude, fuzzy black-and-white passthrough. Take care.

    • @ChristephenWood
      @ChristephenWood 3 months ago

      @bringhurstvr I own a Varjo XR-3 and a Varjo VR-3. The eye tracking support on the game side just isn't there. There is support on the OpenXR side, but game devs don't seem to implement Varjo support.

  • @AriestheLegendRVA
    @AriestheLegendRVA 3 months ago

    You just gained a follower. I am glad you are not here shilling for half-assed games. I just got the Meta Quest 3 512.

  • @snowSecurityneeded
    @snowSecurityneeded 3 months ago

    Lol, you raise good points and insights as a consumer, but you also called me poor for having a smaller house.
    I want AR to not be gaming focused. I want to replace walls with my email inbox, replace windows with whatever weather I want to see, and change the colour of my space as I feel. Most of all I want real-time translation, so when I look at a sign in another language it gets translated, and map markers for a destination so I can follow them using whatever mapping software is on my phone.
    So far the game side has been okay for me; I'm much more immersed in full VR games than in AR, but that's likely due to the lack of AR use in the current game market. Virtual desktops and whatever options we get in the future would be nice, even something as simple as an HDMI dongle that streams the output of any device into your VR home space so you can hang monitors in AR.

    • @bringhurstvr
      @bringhurstvr  3 months ago

      Agreed. But my place is small too. It's a townhome that's maybe 55 feet across.

  • @muridsilat
    @muridsilat 3 months ago

    If I understand correctly, developers have access to the simplified/labeled “Scene Anchors” as well as the detailed “Scene Mesh,” so the latter isn’t being discarded. However, it’s much more resource intensive to calculate collisions with complicated shapes than with simple primitives (like boxes), so some developers may rely on the former to improve performance. Meta is working on an AI room-scanning model, called “SceneScript,” which should provide a nice middle ground between the simple anchors and detailed mesh. Regarding your suggestion to limit the size of the play space that can be scanned, I suspect the appropriate limit would vary by game. For example, a very simple application may function fine in a large space, while a more complex game could quickly run out of resources. Admittedly, I’m speaking as an extreme amateur who clumsily dabbles in coding from time to time, so others may be able to offer more detailed and accurate insight.
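
    To put that cost gap in concrete terms: a collision test against one box primitive is a handful of comparisons, while a naive test against the raw scan scales with the number of vertices or triangles in the mesh. The sketch below is a rough Python/NumPy illustration, not Meta SDK code; the function names and the 50,000-point "scan" are invented for the example.

    ```python
    import numpy as np

    def sphere_vs_aabb(center, radius, box_min, box_max):
        """Constant-time test against one box primitive (a Scene-Anchor-style collider)."""
        closest = np.clip(center, box_min, box_max)     # nearest point on the box to the sphere centre
        return bool(np.sum((closest - center) ** 2) <= radius ** 2)

    def sphere_vs_mesh_vertices(center, radius, vertices):
        """Naive test against every vertex of a scanned mesh: cost grows with mesh detail."""
        d2 = np.sum((vertices - center) ** 2, axis=1)   # one squared distance per vertex
        return bool(np.any(d2 <= radius ** 2))

    ball = np.array([0.0, 0.5, 0.0])
    r = 0.1
    # One box for the couch vs. tens of thousands of points from the raw scan.
    hit_box = sphere_vs_aabb(ball, r, np.array([-1.0, 0.0, -1.0]), np.array([1.0, 0.45, 1.0]))
    scan = np.random.default_rng(0).uniform(-1.0, 1.0, size=(50_000, 3))  # stand-in for a scene mesh
    hit_scan = sphere_vs_mesh_vertices(ball, r, scan)
    ```

    Real physics engines accelerate mesh queries with structures like BVHs, but the per-query cost still dwarfs a single box test, which is one reason developers fall back on the simplified anchors.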

    • @bringhurstvr
      @bringhurstvr  3 months ago +1

      This was a very insightful reply.

  • @poulinski74
    @poulinski74 3 months ago

    🤨

  • @bluescreentobsen
    @bluescreentobsen 2 months ago

    In Figmin you can swap between the mesh and cubes.

    • @bringhurstvr
      @bringhurstvr  2 months ago +1

      @bluescreentobsen Yeah, I've discovered they give you a little more access.

  • @andreburger1981
    @andreburger1981 3 months ago +1

    What a welcome surprise! Nice to see ya Mr Bringhurst... Please stop by again soon! 😊