The researchers found that both types of cameras perform well in normal light conditions. “When illumination decreases, however, standard cameras begin to experience motion blur that ultimately disorients the drone and crashes it, whereas event cameras also work well in very low light,” said Sihao Sun, a researcher in Scaramuzza’s lab.
“State-of-the-art flight controllers can stabilize and control a quadrotor even when subjected to the complete loss of a rotor. However, these methods rely on external sensors, such as GPS or motion capture systems, for state estimation. To the best of our knowledge, this has not yet been achieved with onboard sensors,” he said.
The primary challenge stems from the inevitable high-speed yaw rotation (over 20 rad/s) after a rotor failure, which causes motion blur on standard cameras, a serious problem for visual inertial odometry (VIO). The event camera's high dynamic range and high temporal resolution make it robust to this blur.
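A rough back-of-the-envelope calculation shows why this matters. A minimal sketch, using a small-angle approximation and assumed illustrative values for focal length and exposure time (neither is given in the article), estimates the image-plane blur from a 20 rad/s yaw rotation:

```python
def blur_px(omega_rad_s, exposure_s, focal_px):
    """Approximate image-plane motion blur (in pixels) for a pure yaw
    rotation: displacement ~ focal length * angle swept during exposure.
    Small-angle approximation; parameter values are illustrative."""
    return focal_px * omega_rad_s * exposure_s

OMEGA = 20.0   # yaw rate from the article, rad/s
FOCAL = 300.0  # assumed focal length in pixels

frame_blur = blur_px(OMEGA, 5e-3, FOCAL)  # assumed ~5 ms frame exposure
event_blur = blur_px(OMEGA, 1e-6, FOCAL)  # assumed ~1 us event timestamping

print(f"standard camera blur: {frame_blur:.1f} px")   # tens of pixels
print(f"event camera blur:    {event_blur:.4f} px")   # far below one pixel
```

Under these assumptions a standard frame smears features across roughly 30 pixels per exposure, while the microsecond-scale timing of an event camera keeps apparent motion well below a pixel, which is why VIO can keep tracking during the spin.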
“Experimental validations show that our approach is able to accurately control the position of a quadrotor during a motor failure scenario. We believe our approach will render autonomous quadrotors safer in both GPS-denied and GPS-degraded environments,” said Sun.
The team plans to release both the controller and the VIO algorithm as fully open source.