This article is from Bulletin of Surveying and Mapping, No. 9, 2024; map review number: GS Jing (2024) No. 1659
Vision/Inertial/Ultra-Wideband Dataset Based on Unmanned Platforms in Complex Scenes
LUO Haolong1,2, YANG Zidi1, LI Xueqiang1,3, ZOU Danping2, LI Jiansheng1, LI Guangyun1
1. School of Geospatial Information, Information Engineering University, Zhengzhou 450001, China; 2. Shanghai Key Laboratory of BeiDou Navigation and Location Service, Shanghai Jiao Tong University, Shanghai 200240, China; 3. Unit 61618, Beijing 100080, China
Funding: National Natural Science Foundation of China (42071454); State Key Laboratory of Geographic Information Engineering (SKLGIE2023-M-2-2)
Keywords: multi-source sensor fusion, dataset, complex scene, ultra-wideband, unmanned platform
Citation: LUO Haolong, YANG Zidi, LI Xueqiang, ZOU Danping, LI Jiansheng, LI Guangyun. Vision/Inertial/Ultra-Wideband Dataset Based on Unmanned Platforms in Complex Scenes[J]. Bulletin of Surveying and Mapping, 2024, 0(9): 87-95. DOI: 10.13474/j.cnki.11-2246.2024.0916
Abstract: Navigation and SLAM technology based on multi-source sensor fusion is a mainstream direction of current development, and its research and application in complex scenarios have attracted increasing attention. However, multi-source sensor datasets for complex scenarios remain relatively scarce, especially those containing ultra-wideband (UWB) measurements. To enable users to test and verify multi-source sensor fusion algorithms in complex scenarios, and to explore the shortcomings and potential development directions of multi-source sensor fusion and SLAM technology, this paper first uses unmanned ground vehicle and UAV platforms to collect a vision/inertial/ultra-wideband dataset in seven complex scenarios, including dynamic, non-line-of-sight, and large-scale scenes. A high-precision optical motion capture system then provides ground-truth six-degree-of-freedom position and attitude for the dataset. Finally, four state-of-the-art open-source algorithms, VINS-Mono, VINS-Fusion, VIR-SLAM, and ORB-SLAM3, are used to verify and analyze all scene sequences. The experimental results show that the data of all scene sequences are valid and usable.
About the author: LUO Haolong (1996—), male, Ph.D. candidate; his main research interests are multi-source sensor fusion navigation and SLAM. E-mail: [email protected]. Corresponding author: LI Guangyun. E-mail: [email protected]
First review: Ji Yinxiao; second review: Song Qifan; final review: Jin Jun