
Visual and inertial sensor fusion for mobile X-ray detector tracking: Demo abstract

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Robust 3D pose tracking of an object is a critical technique for various mobile sensing applications. Computer vision-based pose tracking provides a cost-effective solution, but it is sensitive to occlusion and illumination changes. In this work, we propose a novel visual-inertial sensor fusion framework and demonstrate a real-time implementation of a tightly-coupled sensor fusion algorithm: the inertial perspective-n-point (IPNP) algorithm. With measurements from an inertial measurement unit (IMU), the prototype system needs to detect only two keypoints to track all six degrees of freedom of a planar object, e.g., a mobile X-ray detector, which is a 50% reduction in the number of keypoints required by the vision-only perspective-n-point algorithm.
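The core idea behind this tightly-coupled fusion can be illustrated with a minimal sketch: once the object's rotation is fixed by the IMU, each detected keypoint contributes a linear constraint on the translation and that keypoint's depth, so two keypoints already over-determine the remaining unknowns. The sketch below is an assumption-laden reconstruction of that idea, not the authors' implementation; all names, the synthetic intrinsics, and the keypoint values are illustrative.

```python
import numpy as np

# Hypothetical camera intrinsics (illustrative values, not from the paper)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Object rotation, assumed known from the IMU (identity for this synthetic case)
R = np.eye(3)

# Two 3D keypoints on the planar object, in the object frame (metres)
P = np.array([[-0.1, -0.1, 0.0],
              [0.1, 0.1, 0.0]])

# Ground-truth translation used only to synthesize observations
t_true = np.array([0.05, -0.02, 1.5])

# Pinhole projection: s_i * [u_i, v_i, 1]^T = K @ (R @ P_i + t)
cam = (R @ P.T).T + t_true
uvw = (K @ cam.T).T
uv = uvw[:, :2] / uvw[:, 2:3]

def inertial_pnp_translation(K, R, P, uv):
    """Recover translation t given an IMU-supplied rotation R and >= 2 keypoints.

    With R known, each keypoint i satisfies  s_i * q_i = R @ P_i + t,
    where q_i = K^-1 [u_i, v_i, 1]^T. The unknowns are t (3) and the
    per-point depths s_i, which yields a linear least-squares problem:
    two points give 6 equations in 5 unknowns.
    """
    n = len(P)
    q = (np.linalg.inv(K) @ np.hstack([uv, np.ones((n, 1))]).T).T
    A = np.zeros((3 * n, 3 + n))
    b = np.zeros(3 * n)
    for i in range(n):
        A[3 * i:3 * i + 3, 0:3] = -np.eye(3)   # contributes -t
        A[3 * i:3 * i + 3, 3 + i] = q[i]       # contributes +s_i * q_i
        b[3 * i:3 * i + 3] = R @ P[i]
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3]  # estimated translation

t_est = inertial_pnp_translation(K, R, P, uv)
```

In this noise-free synthetic setup the least-squares solve recovers the translation exactly; the point of the sketch is the counting argument behind the abstract's claim, since a vision-only PnP solver must estimate the rotation as well and therefore needs at least four keypoints for a planar target.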

Original language: English
Title of host publication: SenSys 2020 - Proceedings of the 2020 18th ACM Conference on Embedded Networked Sensor Systems
Publisher: Association for Computing Machinery, Inc
Pages: 643-644
Number of pages: 2
ISBN (Electronic): 9781450375900
DOIs
State: Published - 16 Nov 2020
Externally published: Yes
Event: 18th ACM Conference on Embedded Networked Sensor Systems, SenSys 2020 - Virtual, Online, Japan
Duration: 16 Nov 2020 – 19 Nov 2020

Publication series

Name: SenSys 2020 - Proceedings of the 2020 18th ACM Conference on Embedded Networked Sensor Systems

Conference

Conference: 18th ACM Conference on Embedded Networked Sensor Systems, SenSys 2020
Country/Territory: Japan
City: Virtual, Online
Period: 16/11/20 – 19/11/20

Keywords

  • mobile sensing
  • pose estimation
  • sensor fusion
