TY  - JOUR
ID  - zuniga2020ijrr
T1  - The UMA-VI dataset: Visual–inertial odometry in low-textured and dynamic illumination environments
A1  - Zuñiga-Noël, David
A1  - Jaenal, Alberto
A1  - Gomez-Ojeda, Ruben
A1  - Gonzalez-Jimenez, Javier
JA  - The International Journal of Robotics Research
Y1  - 2020
UR  - https://journals.sagepub.com/doi/10.1177/0278364920938439
DO  - 10.1177/0278364920938439
N2  - This article presents a visual–inertial dataset gathered in indoor and outdoor scenarios with a handheld custom sensor rig, for over 80 min in total. The dataset contains hardware-synchronized data from a commercial stereo camera (Bumblebee®2), a custom stereo rig, and an inertial measurement unit. The most distinctive feature of this dataset is the strong presence of low-textured environments and scenes with dynamic illumination, which are recurrent corner cases of visual odometry and simultaneous localization and mapping (SLAM) methods. The dataset comprises 32 sequences and is provided with ground-truth poses at the beginning and the end of each of the sequences, thus allowing the accumulated drift to be measured in each case. We provide a trial evaluation of five existing state-of-the-art visual and visual–inertial methods on a subset of the dataset. We also make available open-source tools for evaluation purposes, as well as the intrinsic and extrinsic calibration parameters of all sensors in the rig. The dataset is available for download at http://mapir.uma.es/work/uma-visual-inertial-dataset
ER  -