TY - GEN
T1 - OVPC Mesh: 3D Free-space Representation for Local Ground Vehicle Navigation
AU - Ruetz, Fabio
AU - Hernández, Emili
AU - Pfeiffer, Mark
AU - Oleynikova, Helen
AU - Cox, Mark
AU - Lowe, Thomas
AU - Borges, Paulo
N1 - Funding Information:
This research was partially funded by the NATO Science for Peace and Security Programme through the IUFCV project (Ref. 985079).
Publisher Copyright:
© 2019 IEEE.
PY - 2019/5
Y1 - 2019/5
N2 - This paper presents a novel approach for local 3D environment representation for autonomous unmanned ground vehicle (UGV) navigation called On Visible Point Clouds Mesh (OVPC Mesh). Our approach represents the surroundings of the robot as a watertight 3D mesh generated from local point cloud data in order to represent the free space surrounding the robot. It is a conservative estimate of the free space and provides a desirable trade-off between representation precision and computational efficiency, without having to discretize the environment into a fixed grid size. Our experiments analyze the usability of the approach for UGV navigation in rough terrain, both in simulation and in a fully integrated real-world system. Additionally, we compare our approach to well-known state-of-the-art solutions, such as Octomap and Elevation Mapping, and show that OVPC Mesh can provide reliable 3D information for trajectory planning while fulfilling real-time constraints.
AB - This paper presents a novel approach for local 3D environment representation for autonomous unmanned ground vehicle (UGV) navigation called On Visible Point Clouds Mesh (OVPC Mesh). Our approach represents the surroundings of the robot as a watertight 3D mesh generated from local point cloud data in order to represent the free space surrounding the robot. It is a conservative estimate of the free space and provides a desirable trade-off between representation precision and computational efficiency, without having to discretize the environment into a fixed grid size. Our experiments analyze the usability of the approach for UGV navigation in rough terrain, both in simulation and in a fully integrated real-world system. Additionally, we compare our approach to well-known state-of-the-art solutions, such as Octomap and Elevation Mapping, and show that OVPC Mesh can provide reliable 3D information for trajectory planning while fulfilling real-time constraints.
UR - http://www.scopus.com/inward/record.url?scp=85071494151&partnerID=8YFLogxK
U2 - 10.1109/ICRA.2019.8793503
DO - 10.1109/ICRA.2019.8793503
M3 - Conference contribution
AN - SCOPUS:85071494151
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 8648
EP - 8654
BT - 2019 International Conference on Robotics and Automation, ICRA 2019
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 20 May 2019 through 24 May 2019
ER -