Complementary Perception for Handheld SLAM

Thomas Lowe, Soohwan Kim, Mark Cox

Research output: Contribution to journal › Article › peer-review

13 Scopus citations


We present a novel method for mapping general three-dimensional environments, where sufficient geometric or visual information is not guaranteed everywhere and where the device motion is unconstrained, as with handheld systems. The continuous-time simultaneous localization and mapping algorithm integrates a lidar, a camera, and an inertial measurement unit in a complementary fashion, whereby all sensors contribute constraints to the optimization. The proposed algorithm is designed to expand the domain of mappable environments and thereby increase the reliability and utility of general-purpose mobile mapping. A key component of the proposed algorithm is the incorporation of depth uncertainty into visual features, which is effective for noisy surfaces and allows features with and without depth estimates to be modeled in a unified manner. Results demonstrate a wider mappable domain on challenging environments compared to state-of-the-art lidar- or vision-based localization and mapping algorithms.
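The unified treatment of features with and without depth described in the abstract can be illustrated with a small sketch. This is a hypothetical illustration, not the authors' implementation: it assumes an inverse-depth parameterization in which a feature lacking a depth measurement simply carries a near-infinite variance, so its depth residual is down-weighted rather than discarded. The class name, prior values, and noise model below are all assumptions for illustration.

```python
class VisualFeature:
    """Hypothetical sketch: a visual feature with an inverse-depth
    estimate and its variance, so features with and without lidar
    depth share one representation (not the paper's actual code)."""

    def __init__(self, inv_depth, inv_depth_var):
        self.inv_depth = inv_depth          # estimate of 1/depth (1/m)
        self.inv_depth_var = inv_depth_var  # variance of that estimate

    @classmethod
    def with_lidar_depth(cls, depth_m, sigma_m=0.03):
        # First-order propagation of depth noise to inverse depth:
        # var(1/d) ~= sigma^2 / d^4
        return cls(1.0 / depth_m, sigma_m**2 / depth_m**4)

    @classmethod
    def without_depth(cls, prior_inv_depth=0.1, prior_var=1e6):
        # No depth observation: a near-uninformative prior, so the
        # feature still fits the same model but barely constrains depth.
        return cls(prior_inv_depth, prior_var)

    def depth_residual_weight(self):
        # Information (inverse variance) weighting the depth residual
        # in a least-squares optimization.
        return 1.0 / self.inv_depth_var


f_near = VisualFeature.with_lidar_depth(2.0)  # well-observed surface
f_far = VisualFeature.without_depth()         # e.g. sky or out of range

# The measured feature constrains depth strongly; the other barely at all,
# yet both contribute bearing constraints in the same formulation.
print(f_near.depth_residual_weight() > f_far.depth_residual_weight())
```

Under this kind of parameterization, noisy surfaces are handled by inflating the variance rather than by a separate code path, which matches the unified modeling the abstract describes.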

Original language: English
Article number: 8263549
Pages (from-to): 1104-1111
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Issue number: 2
State: Published - Apr 2018


  • SLAM
  • field robots
  • localization
  • mapping
  • sensor fusion


