This paper has been accepted for publication at the IEEE International Conference on Robotics and Automation (ICRA), Montreal, 2019. ©IEEE
Are We Ready for Autonomous Drone Racing? The UZH-FPV Drone Racing Dataset
Jeffrey Delmerico, Titus Cieslewski, Henri Rebecq, Matthias Faessler, and Davide Scaramuzza
Abstract—Despite impressive results in visual-inertial state estimation in recent years, high-speed trajectories with six-degree-of-freedom motion remain challenging for existing estimation algorithms. Aggressive trajectories feature large accelerations and rapid rotational motions, and when they pass close to objects in the environment, this induces large apparent motions in the vision sensors, all of which increase the difficulty of estimation. Existing benchmark datasets do not address these types of trajectories, instead focusing on slow-speed or constrained trajectories, targeting other tasks such as inspection or driving. We introduce the UZH-FPV Drone Racing dataset, consisting of over 27 sequences, with more than 10 km of flight distance, captured on a first-person-view (FPV) racing quadrotor flown by an expert pilot. The dataset features camera images, inertial measurements, event-camera data, and precise ground truth poses. These sequences are faster and more challenging, in terms of apparent scene motion, than any existing dataset. Our goal is to enable advancement of the state of the art in aggressive motion estimation by providing a dataset that is beyond the capabilities of existing state estimation algorithms.
SUPPLEMENTARY MATERIAL
The dataset is available at http://rpg.ifi.uzh.ch/uzh-fpv
I. INTRODUCTION
High-quality, large-scale, and task-driven benchmarks are key to pushing the research community forward. A well-known and compelling example can be found in autonomous driving, where the introduction of multiple datasets and benchmarks ([1], [2], [3], [4]) triggered drastic improvements of various low-level algorithms (visual odometry, stereo, optical flow), leading to impressive results on these benchmarks. These improvements ended up finding applications not only in autonomous driving, but also in many other tasks, benefiting the vision community as a whole. Yet, can we conclude that low-level vision is solved? Our opinion is that the constraints of autonomous driving—which have driven the design of the current benchmarks—do not set the bar high enough anymore: cars exhibit mostly planar motion with limited accelerations, and can afford a high payload and compute.
This work was supported by the National Centre of Competence in Research Robotics (NCCR) through the Swiss National Science Foundation, the SNSF-ERC Starting Grant, and the DARPA Fast Lightweight Autonomy Program.
The authors are with the Robotics and Perception Group, Dept. of Informatics, University of Zurich, and Dept. of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland—http://rpg.ifi.uzh.ch. Jeffrey Delmerico is now with Microsoft Mixed Reality and AI Lab, Zurich, Switzerland. Matthias Faessler is now with Verity Studios, Zurich, Switzerland.
Fig. 1: We present a drone racing dataset containing synchronized IMU, camera, and event camera data recorded in indoor and outdoor environments. The dataset exhibits the largest optical flow magnitudes (pixel displacement per second) among all visual-inertial datasets to date. Figs. 1a, 1d: preview images from the high-quality onboard fisheye camera. Figs. 1b, 1e: visualization of the asynchronous event stream in the outdoor (resp. indoor) sequence, obtained by integrating events over a temporal window of ∆t = 30 ms (resp. ∆t = 10 ms) (blue: positive events, red: negative events). Figs. 1c, 1f: color-coded magnitude of the optical flow (blue is small, red is large).
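As a concrete illustration of the event visualization described in the caption above, the following sketch integrates all events inside a temporal window of length ∆t into a single image, coloring positive events blue and negative events red. This is a minimal sketch only: the (t, x, y, polarity) array layout, the RGB channel order, and the 346 x 260 sensor resolution in the usage comment are illustrative assumptions, not the dataset's documented on-disk format.

import numpy as np

def accumulate_events(events, height, width, t0, dt):
    """Integrate events in the window [t0, t0 + dt) into an RGB image.

    `events` is assumed to be an (N, 4) float array with rows
    (t, x, y, polarity), polarity in {-1, +1} -- an assumption for
    illustration, not the dataset's actual format.
    """
    # Keep only the events that fall inside the temporal window.
    t = events[:, 0]
    window = events[(t >= t0) & (t < t0 + dt)]

    xs = window[:, 1].astype(int)
    ys = window[:, 2].astype(int)
    pol = window[:, 3]

    # Start from a white canvas and paint each event by polarity,
    # matching the color convention of Figs. 1b and 1e.
    img = np.full((height, width, 3), 255, dtype=np.uint8)
    img[ys[pol > 0], xs[pol > 0]] = (0, 0, 255)  # positive events: blue
    img[ys[pol < 0], xs[pol < 0]] = (255, 0, 0)  # negative events: red
    return img

# Usage, e.g. a 30 ms window starting at t0 = 1.0 s on a hypothetical
# 346 x 260 event sensor:
# img = accumulate_events(events, height=260, width=346, t0=1.0, dt=0.030)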
So, what is the next challenging problem? We posit that drone racing represents a scenario in which low-level vision is not yet solved. The fast, six-degree-of-freedom trajectories that occur in drone races, with high accelerations and rapid rotations, are beyond the capabilities of the current state of the art in state estimation. The purpose of this dataset is to spur innovation toward solutions to state estimation under these challenging conditions.
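To see why such trajectories stress vision-based estimation, it helps to recall the standard motion-field equations of a calibrated pinhole camera (textbook geometry, not a result of this paper). For a scene point at depth Z, observed at normalized image coordinates (x, y), with camera linear velocity v = (v_x, v_y, v_z) and angular velocity ω = (ω_x, ω_y, ω_z), the apparent image velocity is

\begin{align}
\dot{x} &= \frac{x v_z - v_x}{Z} + x y\,\omega_x - (1 + x^2)\,\omega_y + y\,\omega_z, \\
\dot{y} &= \frac{y v_z - v_y}{Z} + (1 + y^2)\,\omega_x - x y\,\omega_y - x\,\omega_z.
\end{align}

The translational term scales with speed over depth, so fast flight close to obstacles produces large apparent motion, while the rotational term is depth-independent and grows directly with the rotation rate. FPV racing maximizes both at once, which is precisely what makes these sequences hard for existing estimators.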
Very recently, there has been tremendous enthusiasm for autonomous drone racing in the research community due to the research challenges posed by agile flight, including the now annual IROS Autonomous Drone Race [5], [6]. More generally, high-speed robot navigation in cluttered, unknown environments is currently a very active research area [7], [8], [9], [10], [11], [12], [13] and funding of over 50 million US dollars has been made available through the DARPA Fast Lightweight Autonomy Program (2015-2018) and the DARPA Subterranean Challenge (2018-2021).
However, drone racing did not begin in research or industry…
