Sensor fused three-dimensional localization using IMU, camera and LiDAR

Document Type

Conference Proceeding

Publication Date

January 5, 2017

Abstract

© 2016 IEEE. Estimating the position and orientation (pose) of a moving platform in a three-dimensional (3D) environment is of significant importance in many areas, such as robotics and sensing. To perform this task, one can employ a single sensor or multiple sensors; multi-sensor fusion has been used to improve estimation accuracy and to compensate for individual sensor deficiencies. Unlike previous work in this area, which relies on sensors capable of 3D localization to estimate the full pose of a platform (such as an unmanned aerial vehicle or drone), in this work we employ data from a 2D light detection and ranging (LiDAR) sensor, which can estimate the pose only within a 2D plane. We fuse its measurements in an extended Kalman filter with camera and inertial sensor data, showing that, despite the incomplete estimate from the 2D LiDAR, the overall estimated 3D pose is improved. We also compare this scenario with the case where the 2D LiDAR is replaced by a 3D LiDAR with similar characteristics but capable of complete 3D pose estimation.
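
To illustrate the fusion idea the abstract describes, the sketch below shows how a partial (planar) measurement such as the 2D LiDAR's can still tighten a full 3D pose estimate inside an extended Kalman filter: the measurement matrix simply selects the pose components the sensor observes. This is a minimal illustrative sketch, not the authors' implementation; the state layout, the constant-rate motion model, the class and variable names, and all noise values are assumptions introduced here.

```python
import numpy as np

# State: [x, y, z, roll, pitch, yaw] (6-DoF pose).
# IMU drives the prediction; camera (full 6-DoF) and 2D LiDAR
# (x, y, yaw only) provide updates. All noise values and the
# simple integration model are illustrative assumptions.

class PoseEKF:
    def __init__(self):
        self.x = np.zeros(6)          # pose estimate
        self.P = np.eye(6) * 0.1      # estimate covariance

    def predict(self, imu_rates, dt):
        # Simplified propagation: integrate IMU-derived linear and
        # angular rates (a 6-vector) over dt; F = I for this model.
        self.x = self.x + imu_rates * dt
        Q = np.eye(6) * 0.01 * dt     # assumed process noise
        self.P = self.P + Q

    def update(self, z, H, R):
        # Generic Kalman update; H selects which pose components the
        # sensor observes, so partial (2D) measurements fit naturally.
        y = z - H @ self.x                      # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

# Camera: observes the full 6-DoF pose.
H_cam = np.eye(6)
R_cam = np.eye(6) * 0.05

# 2D LiDAR: observes only x, y, and yaw (a planar pose).
H_lidar2d = np.zeros((3, 6))
H_lidar2d[0, 0] = H_lidar2d[1, 1] = H_lidar2d[2, 5] = 1.0
R_lidar2d = np.eye(3) * 0.02

ekf = PoseEKF()
ekf.predict(imu_rates=np.array([0.1, 0, 0, 0, 0, 0.01]), dt=0.01)
ekf.update(z=np.array([0.001, 0, 0, 0, 0, 0.0001]), H=H_cam, R=R_cam)
ekf.update(z=np.array([0.001, 0.0, 0.0001]), H=H_lidar2d, R=R_lidar2d)
print(ekf.x)
```

The design point this sketch highlights is the 3x6 LiDAR measurement matrix: because the filter's covariance couples all six states, correcting x, y, and yaw with the planar sensor also indirectly refines the correlated out-of-plane estimates, which is consistent with the paper's claim that an incomplete 2D estimate can still improve the overall 3D pose.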

Publication Title

Proceedings of IEEE Sensors
