Omni-vision mobile robot vSLAM based on spherical camera model
Document Type
Conference Proceeding
Publication Date
12-1-2011
Abstract
This paper presents a Visual Simultaneous Localization and Mapping (vSLAM) algorithm based on the spherical camera model, a novel approach to simultaneous localization and mapping (SLAM). We map the wide-angle image to a spherical image because wide-angle images exhibit significant distortion, for which existing scale-space detectors such as the scale-invariant feature transform (SIFT) are inappropriate. The algorithm adopts omni-vision odometry based on the spherical camera model and enables low-cost navigation in cluttered and populated environments. No initial map is required, the method satisfactorily handles dynamic changes in the environment, and it associates newly detected features with previously detected ones. The results of the offline experiments indicate the feasibility of the proposed method. © 2011 IEEE.
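The abstract does not give implementation details for the wide-angle-to-spherical mapping. As a rough illustration only, the minimal sketch below (a guess, not the authors' method) assumes an equidistant fisheye model (r = f·θ) with hypothetical intrinsics f, cx, cy and uses OpenCV to resample the wide-angle frame onto an equirectangular spherical image before running a standard scale-space detector such as SIFT.

```python
import numpy as np
import cv2  # OpenCV, used only for remapping and SIFT


def fisheye_to_spherical(img, f, cx, cy, out_w=1024, out_h=512):
    """Resample a fisheye image (assumed equidistant model, r = f * theta)
    onto an equirectangular 'spherical' image so a standard scale-space
    detector can be applied with less radial distortion."""
    # Longitude/latitude grid of the output spherical image.
    lon = (np.arange(out_w) / out_w) * 2.0 * np.pi - np.pi        # [-pi, pi)
    lat = (np.arange(out_h) / out_h) * np.pi - np.pi / 2.0        # [-pi/2, pi/2)
    lon, lat = np.meshgrid(lon, lat)

    # Unit-sphere direction for every output pixel (z = optical axis).
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant projection: image radius is proportional to the angle
    # between the viewing ray and the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    r = f * theta
    phi = np.arctan2(y, x)
    map_x = (cx + r * np.cos(phi)).astype(np.float32)
    map_y = (cy + r * np.sin(phi)).astype(np.float32)

    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT)


if __name__ == "__main__":
    # Hypothetical input frame and intrinsics for illustration only.
    frame = cv2.imread("fisheye_frame.png")
    sphere_img = fisheye_to_spherical(frame, f=320.0, cx=640.0, cy=480.0)
    gray = cv2.cvtColor(sphere_img, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = cv2.SIFT_create().detectAndCompute(gray, None)
```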
Publication Title
2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011
Recommended Citation
Tong, G., Wu, Z., & Tan, J. (2011). Omni-vision mobile robot vSLAM based on spherical camera model. 2011 IEEE International Conference on Robotics and Biomimetics, ROBIO 2011, 829-834. http://doi.org/10.1109/ROBIO.2011.6181390
Retrieved from: https://digitalcommons.mtu.edu/michigantech-p/10904