NTU VIRAL: A Visual-Inertial-Ranging-Lidar Dataset, From an Aerial Vehicle Viewpoint

02/01/2022
by Thien-Minh Nguyen, et al.

In recent years, autonomous robots have become ubiquitous in research and daily life. Among many factors, public datasets play an important role in the progress of this field, as they spare researchers the considerable initial investment in hardware and manpower. However, for research on autonomous aerial systems, there is a relative lack of public datasets on par with those available for autonomous driving and ground robots. To fill this gap, we conduct a data collection exercise on an aerial platform equipped with an extensive and unique set of sensors: two 3D lidars, two hardware-synchronized global-shutter cameras, multiple Inertial Measurement Units (IMUs), and, notably, multiple Ultra-wideband (UWB) ranging units. This comprehensive sensor suite resembles that of an autonomous driving car, yet the data exhibit the distinct and challenging characteristics of aerial operation. We record multiple sequences under several challenging indoor and outdoor conditions. Calibration results and ground truth from a high-accuracy laser tracker are also included in each package. All resources can be accessed via our webpage: https://ntu-aris.github.io/ntu_viral_dataset.
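The sequences are distributed as ROS bag files, so a quick way to preview a package is to list its topics and inspect a few messages. The snippet below is a minimal Python sketch assuming the rosbag package from a standard ROS installation; the file name eee_01.bag and the topic /imu/imu are illustrative placeholders and may not match the actual release, so the topic list on the dataset webpage should be consulted.

import rosbag

BAG_PATH = "eee_01.bag"    # placeholder sequence name (see the dataset webpage)
IMU_TOPIC = "/imu/imu"     # placeholder IMU topic name

bag = rosbag.Bag(BAG_PATH, "r")

# List every topic in the bag with its message type and message count.
info = bag.get_type_and_topic_info()
for topic, meta in sorted(info.topics.items()):
    print("%-40s %-30s %8d msgs" % (topic, meta.msg_type, meta.message_count))

# Sample the first IMU message (sensor_msgs/Imu) to check timestamps and units.
for _, msg, t in bag.read_messages(topics=[IMU_TOPIC]):
    print("t = %.6f s, gyro_x = %.4f rad/s, acc_z = %.4f m/s^2"
          % (t.to_sec(), msg.angular_velocity.x, msg.linear_acceleration.z))
    break

bag.close()

The same pattern extends to the lidar, camera, and UWB topics once their actual names are read from the bag's topic list.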

Related research

04/04/2023
USTC FLICAR: A Multisensor Fusion Dataset of LiDAR-Inertial-Camera for Heavy-duty Autonomous Aerial Work Robots
In this paper, we present the USTC FLICAR Dataset, which is dedicated to...

01/11/2019
A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors
Nowadays, more and more sensors are equipped on robots to increase robus...

12/19/2021
M2DGR: A Multi-sensor and Multi-scenario SLAM Dataset for Ground Robots
We introduce M2DGR: a novel large-scale dataset collected by a ground ro...

01/18/2023
SensorX2car: Sensors-to-car calibration for autonomous driving in road scenarios
The performance of sensors in the autonomous driving system is fundament...

10/23/2020
VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach
In recent years, Onboard Self Localization (OSL) methods based on camera...

10/17/2022
INSANE: Cross-Domain UAV Data Sets with Increased Number of Sensors for developing Advanced and Novel Estimators
For real-world applications, autonomous mobile robotic platforms must be...

01/25/2021
The GRIFFIN Perception Dataset: Bridging the Gap Between Flapping-Wing Flight and Robotic Perception
The development of automatic perception systems and techniques for bio-i...
