Learning Agile, Vision-based Drone Flight: from Simulation to Reality

04/09/2023
by Davide Scaramuzza, et al.

We present our latest research in learning deep sensorimotor policies for agile, vision-based quadrotor flight. We show methodologies for the successful transfer of such policies from simulation to the real world. In addition, we discuss the open research questions that still need to be answered to improve the agility and robustness of autonomous drones toward human-pilot performance.
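To make "deep sensorimotor policy" concrete, the sketch below shows one common way such a policy is structured for vision-based flight: a small CNN encodes a camera frame, the visual features are fused with a proprioceptive state estimate, and an MLP head outputs a collective thrust and three body rates. This is an illustrative assumption, not the architecture, dimensions, or training code from the paper; the class name, input sizes, and action parameterization are all hypothetical.

```python
# Minimal, hypothetical sketch of a vision-based sensorimotor policy.
# Assumes a 1x64x64 grayscale camera frame, a 9-D state estimate, and a
# 4-D action (collective thrust + body rates); none of this is taken
# from the paper, it only illustrates the general policy structure.
import torch
import torch.nn as nn

class SensorimotorPolicy(nn.Module):
    def __init__(self, state_dim: int = 9, action_dim: int = 4):
        super().__init__()
        # Small CNN encoder for the camera image.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP head fuses visual features with the state estimate
        # (e.g., attitude and velocity) and outputs the action.
        self.head = nn.Sequential(
            nn.Linear(64 + state_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim),
        )

    def forward(self, image: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        features = self.encoder(image)
        return self.head(torch.cat([features, state], dim=-1))

# Example forward pass with dummy (simulation-like) inputs.
policy = SensorimotorPolicy()
action = policy(torch.zeros(1, 1, 64, 64), torch.zeros(1, 9))
print(action.shape)  # torch.Size([1, 4])
```

Such a policy would typically be trained entirely in simulation and then transferred to the real drone, which is the sim-to-real setting the abstract refers to.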


Related research

Deep Drone Racing: Learning Agile Flight in Dynamic Environments (06/22/2018)
Autonomous agile flight brings up fundamental challenges in robotics, su...

Contrastive Learning for Enhancing Robust Scene Transfer in Vision-based Agile Flight (09/18/2023)
Scene transfer for vision-based mobile robotics applications is a highly...

Agilicious: Open-Source and Open-Hardware Agile Quadrotor for Vision-Based Flight (07/12/2023)
Autonomous, agile quadrotor flight raises fundamental challenges for rob...

Learning Deep Sensorimotor Policies for Vision-based Autonomous Drone Racing (10/26/2022)
Autonomous drones can operate in remote and unstructured environments, e...

Visual Attention Prediction Improves Performance of Autonomous Drone Racing Agents (01/07/2022)
Humans race drones faster than neural networks trained for end-to-end au...

Cooperative Flight Control Using Visual-Attention – Air-Guardian (12/21/2022)
The cooperation of a human pilot with an autonomous agent during flight ...

Fly Out The Window: Exploiting Discrete-Time Flatness for Fast Vision-Based Multirotor Flight (09/30/2021)
Current control design for fast vision-based flight tends to rely on hig...
