The findings, published in the journal Science Robotics under the title "Dynamic Obstacle Avoidance for Quadrotors with Event Cameras", report an overall detection-and-reaction latency of only 3.5 milliseconds to initiate an avoidance manoeuvre, low enough for the drone to dodge even fast-moving obstacles such as a basketball thrown at it.
Event cameras are bio-inspired sensors that asynchronously measure per-pixel brightness changes, producing a stream of events at microsecond resolution rather than generating full image frames as conventional cameras do.
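The difference in output format can be made concrete with a small sketch. The field names below are illustrative, not taken from any specific event-camera SDK:

```python
from dataclasses import dataclass

# A single event: pixel location, microsecond timestamp, and the sign of
# the brightness change (+1 brighter, -1 darker). Names are illustrative.
@dataclass
class Event:
    x: int
    y: int
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 or -1

# A conventional camera delivers dense frames at a fixed rate; an event
# camera delivers a sparse, asynchronous stream like this instead:
stream = [
    Event(x=120, y=64, t_us=1_000_003, polarity=+1),
    Event(x=121, y=64, t_us=1_000_017, polarity=+1),
    Event(x=45,  y=90, t_us=1_000_020, polarity=-1),
]

# Events fire only where brightness changes, so a static scene under
# constant lighting ideally produces no data at all.
for ev in stream:
    print(ev.x, ev.y, ev.t_us, ev.polarity)
```

Note the microsecond-scale timestamps: two events at adjacent pixels can be separated by a few tens of microseconds, which is the temporal information the detection algorithm exploits.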
But standard vision algorithms, which analyse changes across full image frames, cannot be applied to a stream of asynchronous events, so the researchers had to develop novel algorithms that exploit the temporal information in the event stream to distinguish static from dynamic objects. Their moving-obstacle detection algorithm collects events over a short sliding time window and then compensates for the drone's own motion (ego-motion) during that window. Tightly integrated with data from the drone's inertial measurement unit, this ego-motion compensation lets even a moving sensor discard the pixel events generated by the static parts of the environment, leaving dynamic objects easy to identify.
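The windowing-and-compensation idea can be sketched in a few lines of NumPy. This is a deliberately simplified model, not the authors' implementation: it assumes pure rotation about the optical axis (a real quadrotor has full 3-DoF rotation from the IMU), and it flags dynamic pixels by how far their mean event timestamp deviates from the window average after compensation:

```python
import numpy as np

def detect_moving_pixels(events, omega, dt_ref, shape=(180, 240), thresh=0.3):
    """Hedged sketch of ego-motion compensation for an event camera.

    events : array of rows (x, y, t), t in seconds relative to window start
    omega  : angular velocity (rad/s) about the optical axis, e.g. from an
             IMU gyro -- a simplification of the full 3-DoF case
    Returns a boolean image marking pixels whose mean event timestamp
    deviates strongly from the window mean: candidate dynamic objects.
    """
    h, w = shape
    cx, cy = w / 2.0, h / 2.0
    x, y, t = events[:, 0], events[:, 1], events[:, 2]

    # Warp each event back to the start of the window by undoing the
    # rotation the camera performed during [0, t].
    ang = -omega * t
    xr = cx + (x - cx) * np.cos(ang) - (y - cy) * np.sin(ang)
    yr = cy + (x - cx) * np.sin(ang) + (y - cy) * np.cos(ang)
    xi = np.clip(np.round(xr).astype(int), 0, w - 1)
    yi = np.clip(np.round(yr).astype(int), 0, h - 1)

    # Per-pixel mean timestamp of the compensated events.
    count = np.zeros(shape)
    tsum = np.zeros(shape)
    np.add.at(count, (yi, xi), 1.0)
    np.add.at(tsum, (yi, xi), t)
    mean_t = np.where(count > 0, tsum / np.maximum(count, 1.0), 0.0)

    # After compensation, events from static structure collapse onto sharp
    # edges with timestamps spread evenly across the window; pixels whose
    # mean timestamp deviates by more than `thresh` of the reference window
    # length are flagged as belonging to independently moving objects.
    deviation = np.abs(mean_t - t.mean())
    return (count > 0) & (deviation > thresh * dt_ref)
```

A static edge fires events throughout the window, so after compensation its per-pixel mean timestamp sits near the window mean; a fast-moving object deposits events at any given pixel only during a brief part of the window, which is what the deviation test picks up.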
With an overall latency of only 3.5 ms, the algorithm is fast enough to run in real time on a small onboard computer. Object detection was successfully tested both indoors and outdoors, as demonstrated in a video in which a quadcopter executes various collision-avoidance manoeuvres to evade a ball thrown at it.