Robust Depth-Aided RGBD-Inertial Odometry for Indoor Localization

  • Harbin Institute of Technology

Research output: Contribution to journal › Article › peer-review

Abstract

RGB-D cameras such as the RealSense and Structure Sensor are widely used in robotics systems. This paper presents a system for estimating the trajectory of an RGB-D camera and an IMU in indoor environments. For initialization, the system uses a novel relative pose estimation method that combines depth measurements with epipolar constraints. An adaptive depth estimation method is also proposed, which fuses a depth uncertainty model with multi-view triangulation. In the backend, a sliding-window framework optimizes the system state by minimizing the residuals of IMU pre-integration, re-projection of 3D features, and epipolar constraints on 2D features. The system's effectiveness is evaluated on publicly available datasets with ground-truth trajectories.
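The adaptive depth estimation described above fuses a sensor depth reading (with its uncertainty) and a multi-view triangulated depth. A minimal sketch of one plausible fusion rule is inverse-variance (maximum-likelihood) weighting; the function name, inputs, and the specific fusion rule here are illustrative assumptions, not the paper's exact uncertainty model:

```python
import numpy as np

def fuse_depth(d_sensor, var_sensor, d_tri, var_tri):
    """Fuse a depth-sensor measurement with a triangulated depth.

    Illustrative inverse-variance fusion: each estimate is weighted
    by the reciprocal of its variance, so the more certain source
    dominates. The paper's actual uncertainty model may differ.
    """
    w_s = 1.0 / var_sensor   # weight of the depth-sensor reading
    w_t = 1.0 / var_tri      # weight of the multi-view triangulation
    d_fused = (w_s * d_sensor + w_t * d_tri) / (w_s + w_t)
    var_fused = 1.0 / (w_s + w_t)  # fused estimate is more certain than either input
    return d_fused, var_fused

# Example: equal uncertainties average the two depths and halve the variance.
d, v = fuse_depth(2.0, 0.04, 2.2, 0.04)
print(d, v)  # → 2.1 0.02
```

With equal variances this reduces to a plain average; as the triangulation baseline grows and its variance shrinks, the fused depth shifts toward the triangulated value, which matches the motivation for combining the two sources.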

Original language: English
Article number: 112487
Journal: Measurement: Journal of the International Measurement Confederation
Volume: 209
State: Published - 15 Mar 2023

Keywords

  • Epipolar constraints
  • RGB-D camera
  • Sensor fusion
  • Visual-inertial odometry
