Event-based Stereo Visual Odometry
Check out our new work, "Event-based Stereo Visual Odometry", in which we dive into the relatively unexplored topic of stereo SLAM with event cameras and propose a real-time solution.
Authors: Yi Zhou, Guillermo Gallego and Shaojie Shen
Quadrotor fast flight in complex unknown environments
We presented RAPTOR, a Robust And Perception-aware TrajectOry Replanning framework to enable fast and safe flight in complex unknown environments. Its main features are:
(a) finding feasible and high-quality trajectories in very limited computation time, and
(b) introducing a perception-aware strategy to actively observe and avoid unknown obstacles.
Specifically, a path-guided optimization (PGO) approach that incorporates multiple topological paths is devised to search the solution space efficiently and thoroughly. Trajectories are further refined to have higher visibility and sufficient reaction distance to unknown dangerous regions, while the yaw angle is planned to actively explore the surrounding space relevant for safe navigation.
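The idea of guiding a local optimization with multiple topological paths can be illustrated with a small sketch. This is not the RAPTOR implementation — the weights, the 2-D setting, and the simple gradient-descent update are all assumptions for illustration — but it shows the core pattern: each topological guide path seeds its own optimization (the attraction term keeps the candidate in that path's homotopy class), and the lowest-cost result is kept.

```python
def path_cost(traj):
    """Length plus a bending penalty, used to rank candidate trajectories."""
    length = sum(((traj[i + 1][0] - traj[i][0]) ** 2 +
                  (traj[i + 1][1] - traj[i][1]) ** 2) ** 0.5
                 for i in range(len(traj) - 1))
    bend = sum((traj[i - 1][0] - 2 * traj[i][0] + traj[i + 1][0]) ** 2 +
               (traj[i - 1][1] - 2 * traj[i][1] + traj[i + 1][1]) ** 2
               for i in range(1, len(traj) - 1))
    return length + bend

def optimize_along_guide(guide, iters=100, step=0.2, w_guide=0.3):
    """Smooth a trajectory seeded on one topological guide path."""
    traj = [list(p) for p in guide]
    for _ in range(iters):
        for i in range(1, len(traj) - 1):          # endpoints stay fixed
            for d in range(2):                     # x and y
                lap = traj[i - 1][d] - 2 * traj[i][d] + traj[i + 1][d]
                pull = guide[i][d] - traj[i][d]    # stay near this guide path
                traj[i][d] += step * (lap + w_guide * pull)
    return traj

def plan(guides):
    """Optimize one candidate per topological guide, return the best."""
    candidates = [optimize_along_guide(g) for g in guides]
    return min(candidates, key=path_cost)
```

Seeding one optimization per guide path is what lets the planner search the solution space thoroughly: a single local optimization would stay trapped in whichever homotopy class it started in.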
Authors: Boyu Zhou, Jie Pan, Fei Gao and Shaojie Shen
Code for Autonomous Drone Race is now available on GitHub
We released Teach-Repeat-Replan, a complete and robust system that enables autonomous drone racing.
Teach-Repeat-Replan can be applied to situations where the user has a preferred rough route but is unable to pilot the drone ideally, such as drone racing. With our system, the human pilot can virtually control the drone with his/her naive operations; our system then automatically generates a highly efficient repeating trajectory and executes it autonomously. During the flight, unexpected collisions are avoided by onboard sensing and replanning. Teach-Repeat-Replan can also be used for normal autonomous navigation, where a drone flies autonomously in complex environments using only onboard sensing and planning.
Major components are:
- Planning: flight corridor generation, global spatial-temporal planning, local online re-planning
- Perception: global deformable surfel mapping, local online ESDF mapping
- Localization: global pose graph optimization, local visual-inertial fusion
- Controlling: geometric controller on SE(3)
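To give a flavor of the last component, here is a minimal sketch of the thrust part of a geometric controller on SE(3). It is not the released controller code — the gains `kx`, `kv` and the plain-Python vector handling are assumptions for illustration — but it shows the standard structure: a desired world-frame force from PD feedback plus gravity compensation and acceleration feedforward, projected onto the current body z-axis.

```python
G = 9.81  # gravitational acceleration, m/s^2

def se3_thrust(pos_err, vel_err, acc_des, r_body_z, mass, kx=4.0, kv=2.5):
    """Collective thrust f = F_des . (R e3) for a quadrotor.

    pos_err, vel_err : world-frame position/velocity errors (3-vectors)
    acc_des          : feedforward acceleration from the planned trajectory
    r_body_z         : current body z-axis in the world frame (3rd column of R)
    """
    # desired world-frame force: PD feedback + gravity + feedforward
    f_des = [-kx * pos_err[i] - kv * vel_err[i]
             + mass * (acc_des[i] + (G if i == 2 else 0.0))
             for i in range(3)]
    # project onto the body z-axis, along which the rotors can produce thrust
    return sum(f_des[i] * r_body_z[i] for i in range(3))
```

At hover with zero errors and a level attitude this reduces to thrust = mass * G, as expected; a full controller would additionally compute the desired attitude from `f_des` and track it with moments on SO(3).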
Authors: Fei Gao, Boyu Zhou, and Shaojie Shen
Videos: Video1, Video2
Code: https://github.com/HKUST-Aerial-Robotics/Teach-Repeat-Replan
Autonomous drone racing
Code for VINS-Fusion is now available on GitHub
VINS-Fusion is an optimization-based multi-sensor state estimator that achieves accurate self-localization for autonomous applications (drones, cars, and AR/VR). It extends VINS-Mono to support multiple visual-inertial sensor types (mono camera + IMU, stereo cameras + IMU, or even stereo cameras only). We also show a toy example of fusing VINS with GPS. Features:
- multiple sensors support (stereo cameras / mono camera+IMU / stereo cameras+IMU)
- online spatial calibration (transformation between camera and IMU)
- online temporal calibration (time offset between camera and IMU)
- visual loop closure
As of 12 January 2019, VINS-Fusion is the top-ranked open-source stereo algorithm on the KITTI Odometry Benchmark.
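The spirit of the VINS + GPS toy example can be sketched in one dimension: drifting relative odometry increments (standing in for VIO) are fused with sparse absolute position fixes (standing in for GPS) by minimizing a weighted least-squares cost over all poses. The weights, step size, and gradient-descent solver here are assumptions for illustration, not the project's actual Ceres-style back end.

```python
def fuse(odo, gps, w_odo=1.0, w_gps=10.0, iters=500, step=0.05):
    """Fuse relative and absolute 1-D measurements by least squares.

    odo : list where odo[i] measures x[i+1] - x[i] (drifting odometry)
    gps : dict {pose index: absolute position fix}
    """
    n = len(odo) + 1
    x = [0.0] * n
    for i in range(len(odo)):               # dead-reckoning initial guess
        x[i + 1] = x[i] + odo[i]
    for _ in range(iters):                  # gradient descent on the cost
        grad = [0.0] * n
        for i, m in enumerate(odo):         # relative (odometry) factors
            r = x[i + 1] - x[i] - m
            grad[i + 1] += 2 * w_odo * r
            grad[i] -= 2 * w_odo * r
        for j, z in gps.items():            # absolute (GPS-like) factors
            grad[j] += 2 * w_gps * (x[j] - z)
        for i in range(n):
            x[i] -= step * grad[i]
    return x
```

With odometry that overestimates each step, dead reckoning drifts away from the truth, while the fused estimate is pulled back by the absolute fixes — the same mechanism, in miniature, that makes global measurements valuable alongside VIO.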
HKUST Aerial Robotics Group
Welcome to the HKUST Aerial Robotics Group led by Prof. Shaojie Shen. Our group is part of the HKUST Robotics Institute.
We develop fundamental technologies to enable aerial robots (or UAVs, drones, etc.) to autonomously operate in complex environments. Our research spans the full stack of aerial robotic systems, with a focus on state estimation, mapping, trajectory planning, multi-robot coordination, and testbed development using low-cost sensing and computation components.
Video Highlights
News
A paper is accepted by RA-L
December 14th, 2020
A paper is accepted by T-RO
September 15th, 2020
A paper is accepted by PAD2020
August 5th, 2020
A paper is accepted by ECCV 2020
July 3rd, 2020
A paper is accepted by IROS 2020
July 1st, 2020
A paper is accepted by FPL 2020
May 20th, 2020
A paper is accepted by T-RO
May 4th, 2020
Prof. Shen receives the AI 2000 Most Influential Scholars Honorable Mention
April 14th, 2020
A paper is accepted by CVPR 2020
February 26th, 2020
A paper is accepted by JFR
January 27th, 2020
Six papers are accepted by ICRA 2020
January 22nd, 2020
A paper is accepted by ISRR 2019
August 1st, 2019
Code for Autonomous Drone Race is now available on GitHub
June 30th, 2019
Four papers are accepted by IROS 2019 and one of them will be published in RA-L
June 27th, 2019
A paper is accepted by RA-L
June 5th, 2019
A paper is accepted by T-RO
June 5th, 2019
A paper is accepted by CASE 2019
May 16th, 2019
A paper is accepted by UAVision 2019
April 8th, 2019
A paper receives Honorable Mention status for the 2018 T-RO Best Paper award
April 3rd, 2019
A paper is accepted by T-RO
March 28th, 2019
A paper is accepted by CVPR 2019
February 25th, 2019
A paper is accepted by JFR
February 25th, 2019
Four papers are accepted by ICRA 2019
February 11th, 2019
Prof. Shen gave a talk at CMU RI
February 11th, 2019
Code for VINS-Fusion is now available on GitHub
January 14th, 2019
A paper is accepted by RA-L
January 10th, 2019
A paper is accepted by ROBIO 2018
October 18th, 2018
A paper is accepted by JFR
October 11th, 2018
A paper is accepted by IJMAV
October 9th, 2018
Tong Qin wins the IROS 2018 Best Student Paper Award
October 4th, 2018
A paper is accepted by 3DV 2018
July 21st, 2018
A paper is accepted by T-RO
July 20th, 2018
A paper is accepted by ECCV 2018
July 3rd, 2018
Seven papers are accepted by IROS 2018
June 29th, 2018
A paper is accepted by ISER 2018
June 28th, 2018
A paper is accepted by T-RO
May 19th, 2018
A paper is accepted by ETRA 2018
April 26th, 2018
A paper is accepted by IJCAI-ECAI 2018
April 17th, 2018
Two papers are accepted by ICUAS 2018
April 17th, 2018
Four papers are accepted by ICRA 2018
January 12th, 2018