A Kinect based vibrotactile feedback system to assist the visually impaired

Kumar Yelamarthi, Brian P. Dejong, Kevin Laubhan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Scopus citations

Abstract

This paper presents a Microsoft Kinect-based vibrotactile feedback system that aids navigation for the visually impaired. The lightweight, wearable system interprets the visual scene and conveys obstacle distance and characteristic information to the user. The Kinect converts the scene into a distance map, which is then processed and interpreted by an Intel Next Unit of Computing (NUC). A microcontroller converts that information into vibrotactile feedback, delivered to the user through two four-by-four vibration motor arrays woven into gloves. The system is shown to successfully identify, track, and present the closest objects, the closest human, and multiple humans, and to perform distance measurements.
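The pipeline described above — a depth map downsampled into a four-by-four grid of vibration intensities, with closer obstacles producing stronger vibration — can be illustrated with a minimal sketch. This is not the authors' implementation; the depth range, intensity scale, and function name are illustrative assumptions.

```python
# Hypothetical sketch of the abstract's depth-to-tactile mapping:
# a 2D Kinect depth frame (in mm) is reduced to a 4x4 grid of
# vibration intensities (0-255, e.g. PWM duty for each motor).
# Range limits and scaling are assumptions, not from the paper.

MIN_MM, MAX_MM = 500, 4000   # assumed usable Kinect depth range

def depth_to_vibration(depth, rows=4, cols=4):
    """Downsample a 2D depth map into a rows x cols intensity grid.

    Each cell is driven by the *nearest* valid pixel in its region,
    so a close obstacle dominates the tactile cue for that cell.
    """
    h, w = len(depth), len(depth[0])
    grid = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            region = [depth[y][x]
                      for y in range(r * h // rows, (r + 1) * h // rows)
                      for x in range(c * w // cols, (c + 1) * w // cols)
                      if depth[y][x] > 0]          # 0 = no depth reading
            if not region:
                continue                           # leave motor off
            d = max(MIN_MM, min(MAX_MM, min(region)))
            # Linear map: MIN_MM -> 255 (strong), MAX_MM -> 0 (off)
            grid[r][c] = round(255 * (MAX_MM - d) / (MAX_MM - MIN_MM))
    return grid

# Example: an 8x8 frame, far background with one near obstacle top-left
frame = [[4000] * 8 for _ in range(8)]
frame[0][0] = 600
cells = depth_to_vibration(frame)   # strong in cell (0,0), off elsewhere
```

Driving each cell by the nearest pixel in its region (rather than the mean) matches the system's goal of highlighting the closest obstacle to the user.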

Original language: English
Title of host publication: 2014 IEEE 57th International Midwest Symposium on Circuits and Systems, MWSCAS 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 635-638
Number of pages: 4
ISBN (Electronic): 9781479941346
DOIs
State: Published - Sep 23 2014
Event: 2014 IEEE 57th International Midwest Symposium on Circuits and Systems, MWSCAS 2014 - College Station, United States
Duration: Aug 3, 2014 to Aug 6, 2014

Publication series

Name: Midwest Symposium on Circuits and Systems
ISSN (Print): 1548-3746

Conference

Conference: 2014 IEEE 57th International Midwest Symposium on Circuits and Systems, MWSCAS 2014
Country/Territory: United States
City: College Station
Period: 08/3/14 to 08/6/14

Keywords

  • blind
  • kinect sensor
  • navigation assistance
  • tactile feedback
  • visually impaired
