Autonomous drone racing

In this video, we showcase aggressive autonomous drone races.

The drone is equipped with a pair of stereo cameras and a DJI N3 flight controller. All computing during the flight is done onboard. Our system consists of visual-inertial SLAM with loop closure, global mapping, local mapping, global trajectory optimization, local re-planning, and human-drone interaction interfaces.
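The pipeline above can be pictured as a perception-planning loop: estimate the state, update the local map, and re-plan locally only when a newly mapped obstacle blocks the globally optimized trajectory. The sketch below is illustrative only; the function names and the point-based obstacle/trajectory representations are hypothetical and do not reflect the actual system's code.

```python
import math

def blocks(obstacle, segment, clearance=0.5):
    """Hypothetical collision check: does an obstacle at (x, y) come within
    `clearance` meters of any waypoint on a trajectory segment?"""
    return any(math.dist(obstacle, p) < clearance for p in segment)

def plan_step(state, global_segment, obstacles, replan):
    """One loop iteration: follow the globally optimized trajectory unless a
    newly mapped obstacle blocks it, in which case re-plan locally."""
    if any(blocks(ob, global_segment) for ob in obstacles):
        return replan(state, obstacles)  # local re-planning around the obstacle
    return global_segment                # stick to the global plan
```

For example, with a straight segment through (1, 0) and an obstacle at (1, 0.2), `plan_step` falls back to the supplied local re-planner; with no obstacles it returns the global segment unchanged.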

We aim for performance beyond that of human pilots in challenging drone racing scenarios. Four video clips showcase the system in indoor and outdoor, static and dynamic environments:
1. Indoor autonomous drone racing in a static environment
2. Indoor autonomous drone racing in an environment with unknown obstacles
3. Outdoor autonomous drone racing, trial 1
4. Outdoor autonomous drone racing, trial 2

Authors: Fei Gao, Luqi Wang, Boyu Zhou, Luxin Han, and Shaojie Shen

Code for VINS-Fusion is now available on GitHub

VINS-Fusion is an optimization-based multi-sensor state estimator that achieves accurate self-localization for autonomous applications (drones, cars, and AR/VR). It extends VINS-Mono to support multiple visual-inertial sensor types (mono camera + IMU, stereo cameras + IMU, or even stereo cameras alone). We also show a toy example of fusing VINS with GPS. Features:

  • multiple sensors support (stereo cameras / mono camera+IMU / stereo cameras+IMU)
  • online spatial calibration (transformation between camera and IMU)
  • online temporal calibration (time offset between camera and IMU)
  • visual loop closure

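Online temporal calibration compensates for the time offset between the camera and IMU clocks. A minimal sketch of the first-order idea, shifting each feature observation along its image-plane velocity so visual and inertial measurements refer to a common clock (illustrative only; the function name and interface are hypothetical, not VINS-Fusion's API):

```python
import numpy as np

def shift_to_imu_time(uv, uv_velocity, td):
    """First-order compensation for a camera-IMU time offset.

    uv          -- observed feature position in pixels at the camera timestamp
    uv_velocity -- feature velocity on the image plane, in pixels per second
    td          -- estimated time offset in seconds

    Returns the approximate feature position at the IMU timestamp.
    """
    return np.asarray(uv, dtype=float) + td * np.asarray(uv_velocity, dtype=float)

# A feature at (320, 240) px moving 50 px/s to the right, with a 10 ms offset,
# shifts by half a pixel to [320.5, 240.0].
```

In the real estimator, the offset itself is a variable refined online inside the optimization rather than a fixed input.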
As of 12 Jan. 2019, VINS-Fusion is the top-ranked open-source stereo algorithm on the KITTI Odometry Benchmark.

Authors: Tong Qin, Shaozu Cao, Jie Pan, Peiliang Li and Shaojie Shen

Code: https://github.com/HKUST-Aerial-Robotics/VINS-Fusion

HKUST Aerial Robotics Group

Welcome to the HKUST Aerial Robotics Group led by Prof. Shaojie Shen. Our group is part of the HKUST Robotics Institute.

We develop fundamental technologies to enable aerial robots (UAVs, drones, etc.) to operate autonomously in complex environments. Our research spans the full stack of aerial robotic systems, with a focus on state estimation, mapping, trajectory planning, multi-robot coordination, and testbed development using low-cost sensing and computation components.
