Event driven image sensor boosts drone safety

Technology News | By Nick Flaherty

Researchers in Switzerland and the Netherlands have used event-driven image sensors to improve the safety of a multi-rotor autonomous drone.

Once a motor fails, a drone can rotate uncontrollably, making navigation and stabilisation very difficult. Larger drones use GNSS satellite receivers to try to regain control of the system, but this is not practical for smaller aircraft.

“When one rotor fails, the drone begins to spin on itself like a ballerina,” said Davide Scaramuzza, head of the Robotics and Perception Group at UZH and of the Rescue Robotics Grand Challenge at NCCR Robotics, which funded the research. “This high-speed rotational motion causes standard controllers to fail unless the drone has access to very accurate position measurements.”

Instead, the technique developed by the researchers combines data from a standard camera with an event-driven camera sensor that responds only to changes in an image. These event-driven, neuromorphic or spiking neural network sensors are being commercialised by companies such as Prophesee in France and Opteran in the UK.
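
For readers unfamiliar with how an event camera differs from a frame camera, the short Python sketch below (an illustration only, not the researchers' code) models the standard event-generation rule: each pixel fires an event whenever its log-intensity changes by more than a contrast threshold, so only the changing parts of the scene produce data.

import numpy as np

def events_from_frames(prev_log, curr_log, threshold=0.2):
    # Illustrative event-camera model: a pixel emits an event when its
    # log-intensity changes by more than the contrast threshold.
    diff = curr_log - prev_log
    fired = np.abs(diff) >= threshold
    rows, cols = np.nonzero(fired)
    polarity = np.sign(diff[fired]).astype(int)  # +1 brighter, -1 darker
    return rows, cols, polarity

# Example: a small local brightness step fires events only where it occurs
prev = np.log(np.full((4, 4), 100.0))
curr = prev.copy()
curr[1:3, 1:3] += 0.5   # brightness increase on a 2x2 patch
print(events_from_frames(prev, curr))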

This is increasingly important as more autonomous drones are used for deliveries and must operate safely in the event of a failure.

The research team developed algorithms that combine information from the two sensors and use it to track the quadrotor’s position relative to its surroundings. This enables the onboard computer to control the drone as it flies – and spins – with only three rotors.
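
The group's actual estimator is an event-based visual inertial odometry pipeline; as a generic illustration of why fusing two sources helps, the toy one-dimensional complementary filter below blends high-rate dead reckoning (fast but drifting) with intermittent absolute position fixes from a camera (slow but accurate). All names and parameter values here are assumptions for the sketch, not the researchers' algorithm.

def complementary_update(pos_est, vel, dt, cam_pos=None, alpha=0.98):
    # Toy 1-D fusion: dead-reckon at high rate, then pull the estimate
    # toward an absolute camera fix whenever one arrives.
    pos_est = pos_est + vel * dt          # high-rate prediction (drifts)
    if cam_pos is not None:               # low-rate absolute correction
        pos_est = alpha * pos_est + (1.0 - alpha) * cam_pos
    return pos_est

# Example: 100 Hz predictions, with a camera fix every 10th step
pos = 0.0
for step in range(100):
    fix = 1.0 if step % 10 == 0 else None   # camera reports 1.0 m
    pos = complementary_update(pos, vel=0.05, dt=0.01, cam_pos=fix)
print(round(pos, 3))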

The researchers found that both types of cameras perform well in normal light conditions. “When illumination decreases, however, standard cameras begin to experience motion blur that ultimately disorients the drone and crashes it, whereas event cameras also work well in very low light,” said Sihao Sun, a researcher in Scaramuzza’s lab.

“State-of-the-art flight controllers can stabilize and control a quadrotor even when subjected to the complete loss of a rotor. However, these methods rely on external sensors, such as GPS or motion capture systems, for state estimation. To the best of our knowledge, this has not yet been achieved with onboard sensors,” he said.

The primary challenge stems from the drone’s inevitable high-speed yaw rotation of over 20 rad/s, which causes motion blur in standard cameras and undermines visual inertial odometry (VIO). The high dynamic range and high temporal resolution of the event-based camera overcome this limitation.
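
To see why a 20 rad/s spin is so punishing for a frame-based camera, a back-of-the-envelope calculation of the image smear during a single exposure is instructive (the exposure time and focal length below are assumed values, not figures from the paper):

# Approximate motion blur during one exposure (assumed camera values)
yaw_rate = 20.0       # rad/s, figure quoted by the researchers
exposure = 0.005      # s, assumed 5 ms exposure time
focal_px = 400.0      # assumed focal length in pixels

# Small-angle approximation: image displacement ~ focal length * rotation
blur_px = yaw_rate * exposure * focal_px
print(f"~{blur_px:.0f} px of smear per frame")   # ~40 px

Tens of pixels of smear per frame is far beyond what typical feature trackers tolerate, whereas event pixels respond with microsecond latency and remain sharp at these rotation rates.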

“Experimental validations show that our approach is able to accurately control the position of a quadrotor during a motor failure scenario. We believe our approach will render autonomous quadrotors safer in both GPS-denied and degraded environments,” said Sun.

The team plans to release both the controller and the VIO algorithm as fully open source.

nccr-robotics.ch
