Paper
19 February 2018
A novel visual-inertial monocular SLAM
Xiaofeng Yue, Wenjuan Zhang, Li Xu, JiangGuo Liu
Proceedings Volume 10608, MIPPR 2017: Automatic Target Recognition and Navigation; 106080M (2018) https://doi.org/10.1117/12.2288360
Event: Tenth International Symposium on Multispectral Image Processing and Pattern Recognition (MIPPR2017), 2017, Xiangyang, China
Abstract
With advances in sensors and the computer vision research community, cameras, which are accurate, compact, well understood and, most importantly, cheap and ubiquitous today, have gradually moved to the center of robot localization. Visual simultaneous localization and mapping (SLAM) is a technique that recovers motion information from image acquisition equipment and reconstructs the structure of an unknown environment. We provide an analysis of bio-inspired flight in insects, employing a novel SLAM-based technique, and combine visual and inertial measurements to obtain high accuracy and robustness. We present a novel tightly-coupled visual-inertial simultaneous localization and mapping system that makes a new attempt to address two challenges: the initialization problem and the calibration problem. Experimental results and analysis show that the proposed approach gives a more accurate quantitative simulation of insect navigation and can reach centimeter-level positioning accuracy.
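As general background (not the specific cost function of this paper), tightly-coupled visual-inertial SLAM systems typically estimate the state by jointly minimizing visual reprojection residuals and inertial (IMU preintegration) residuals over a common set of variables; a generic sketch of such an objective, with symbols chosen here purely for illustration, is

\min_{\mathcal{X}} \; \sum_{(i,j)} \left\| r_{\mathrm{proj}}\!\left(z_{ij}, \mathcal{X}\right) \right\|^{2}_{\Sigma_{ij}} \;+\; \sum_{k} \left\| r_{\mathrm{IMU}}\!\left(z_{k,k+1}, \mathcal{X}\right) \right\|^{2}_{\Sigma_{k}},

where \mathcal{X} collects camera poses, velocities, IMU biases, and landmark positions, z_{ij} denotes the observation of landmark j in frame i, z_{k,k+1} the preintegrated IMU measurement between consecutive keyframes, and the \Sigma terms are measurement covariances.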
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Xiaofeng Yue, Wenjuan Zhang, Li Xu, and JiangGuo Liu "A novel visual-inertial monocular SLAM", Proc. SPIE 10608, MIPPR 2017: Automatic Target Recognition and Navigation, 106080M (19 February 2018); https://doi.org/10.1117/12.2288360
KEYWORDS
Biological research
Navigation systems