U-ARE-ME: Uncertainty-Aware Rotation Estimation in Manhattan Environments

  • Published: 29 Sep 2024
  • Accompanying video for U-ARE-ME: Uncertainty-Aware Rotation Estimation in Manhattan Environments
    Aalok Patwardhan*, Callum Rhodes*, Gwangbin Bae, and Andrew J. Davison.
    * denotes equal contribution
    Project page: callum-rhodes....
    Abstract:
    Camera rotation estimation from a single image is a challenging task, often requiring depth data and/or camera intrinsics, which are generally not available for in-the-wild videos. Although external sensors such as inertial measurement units (IMUs) can help, they often suffer from drift and are not applicable in non-inertial reference frames. We present U-ARE-ME, an algorithm that estimates camera rotation along with uncertainty from uncalibrated RGB images. Using a Manhattan World assumption, our method leverages the per-pixel geometric priors encoded in single-image surface normal predictions and performs optimisation over the SO(3) manifold. Given a sequence of images, we can use the per-frame rotation estimates and their uncertainty to perform multi-frame optimisation, achieving robustness and temporal consistency. Our experiments demonstrate that U-ARE-ME performs comparably to RGB-D methods and is more robust than sparse feature-based SLAM methods. We encourage the viewer to visit our project page at callum-rhodes.....
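    The core per-frame step described in the abstract, recovering the camera rotation by aligning predicted per-pixel surface normals with a Manhattan frame via optimisation over SO(3), can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cost function (one minus each rotated normal's largest axis component), the numerical Jacobian, and the damping constant are all assumptions made for the sketch, and the paper's actual optimisation and uncertainty model will differ.

    ```python
    # Minimal sketch of Manhattan-frame rotation estimation from predicted
    # surface normals (NOT the U-ARE-ME implementation). Assumes `normals`
    # is an (N, 3) array of unit normals, e.g. from a surface-normal network.
    import numpy as np

    def exp_so3(w):
        """Rodrigues' formula: axis-angle vector -> rotation matrix."""
        theta = np.linalg.norm(w)
        if theta < 1e-12:
            return np.eye(3)
        k = w / theta
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

    def manhattan_cost(R, normals):
        """Residual per normal: 1 minus the largest |component| after rotation.
        Zero when every normal is exactly aligned with a canonical axis."""
        aligned = normals @ R.T
        return 1.0 - np.abs(aligned).max(axis=1)

    def estimate_rotation(normals, iters=100, eps=1e-4):
        """Damped Gauss-Newton over SO(3), using numerical Jacobians of the
        residuals w.r.t. a local axis-angle perturbation of R."""
        R = np.eye(3)
        for _ in range(iters):
            r0 = manhattan_cost(R, normals)
            J = np.stack(
                [(manhattan_cost(exp_so3(eps * e) @ R, normals) - r0) / eps
                 for e in np.eye(3)], axis=1)
            H = J.T @ J + 1e-6 * np.eye(3)  # damped normal equations
            step = -np.linalg.solve(H, J.T @ r0)
            R = exp_so3(step) @ R
            if np.linalg.norm(step) < 1e-8:
                break
        return R, manhattan_cost(R, normals)

    # Toy usage: noisy normals near the three axes of a known rotation.
    rng = np.random.default_rng(0)
    R_true = exp_so3(np.array([0.2, -0.1, 0.3]))
    axes = np.eye(3)[rng.integers(0, 3, size=500)] * rng.choice([-1, 1], size=(500, 1))
    noisy = axes @ R_true + 0.02 * rng.standard_normal((500, 3))
    normals = noisy / np.linalg.norm(noisy, axis=1, keepdims=True)
    R_est, residuals = estimate_rotation(normals)
    ```

    The rotation is recoverable only up to the Manhattan frame's 90-degree symmetries. Note also that the curvature of the cost at convergence (approximated above by H) is one plausible source of the per-frame uncertainty the abstract mentions for multi-frame fusion; that fusion step is not shown here.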
