A sensor fusion methodology for obstacle avoidance robot

Md Sayedul Aman, Md Anam Mahmud, Haowen Jiang, Ahmed Abdelgawad, Kumar Yelamarthi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations

Abstract

Obstacle detection and navigation in dynamic environments are key challenges in mobile robotics. To address these challenges, this paper presents an efficient sensor fusion methodology that detects the size and location of obstacles and navigates the mobile robot with high accuracy. It leverages the complementary accuracy of an ultrasonic sensor in the near field and a Kinect sensor in the far field. Further, an efficient Kalman filter is implemented to reduce the systematic errors in encoder data, track the robot's pose in real time, and reach the destination with high accuracy. Implemented on a differential-drive mobile robot, the proposed system has been validated to detect obstacles with high efficiency and reach its destination with an accuracy of 5 cm.
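The abstract describes correcting encoder dead reckoning with a Kalman filter to track the robot's pose. As an illustration only (not the paper's implementation; the wheel base, noise variances, and function names below are assumptions), a minimal sketch of differential-drive odometry combined with a scalar Kalman update might look like:

```python
import math

def kalman_update(x, p, z, r, q):
    """One scalar Kalman step: predict (process noise q inflates the
    variance p), then correct the estimate x with measurement z of
    variance r."""
    p = p + q                      # predict: uncertainty grows
    k = p / (p + r)                # Kalman gain
    x = x + k * (z - x)            # correct estimate toward measurement
    p = (1.0 - k) * p              # uncertainty shrinks after update
    return x, p

def odometry_step(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning pose update for a differential-drive robot from
    left/right wheel encoder increments (distances in meters)."""
    d = (d_left + d_right) / 2.0               # center-point travel
    dtheta = (d_right - d_left) / wheel_base   # heading change
    x += d * math.cos(theta + dtheta / 2.0)    # midpoint integration
    y += d * math.sin(theta + dtheta / 2.0)
    theta += dtheta
    return x, y, theta
```

In a full pipeline, the odometry prediction would be fused with range observations (e.g. from the ultrasonic or Kinect sensor) via the Kalman update to bound the drift that accumulates in raw encoder integration.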

Original language: English
Title of host publication: 2016 IEEE International Conference on Electro Information Technology, EIT 2016
Publisher: IEEE Computer Society
Pages: 458-463
Number of pages: 6
ISBN (Electronic): 9781467399852
DOIs
State: Published - Aug 5 2016
Event: 2016 IEEE International Conference on Electro Information Technology, EIT 2016 - Grand Forks, United States
Duration: May 19 2016 – May 21 2016

Publication series

Name: IEEE International Conference on Electro Information Technology
Volume: 2016-August
ISSN (Print): 2154-0357
ISSN (Electronic): 2154-0373

Conference

Conference: 2016 IEEE International Conference on Electro Information Technology, EIT 2016
Country/Territory: United States
City: Grand Forks
Period: 05/19/16 – 05/21/16

Keywords

  • Kalman filter
  • Kinect
  • mobile robot
  • sensor fusion
  • ultrasonic sensor

