The idea stems from a proposal that autonomous drones able to “bob and weave through trees” could help find hikers lost in forests. This would offer an alternative to helicopters and drones flying above the trees, which cannot see through the tree canopy. GPS would be of little help in guiding drones under the canopy, however, as the signals there would be unreliable or even nonexistent.

The researchers instead propose a solution where the drones use only onboard computation and wireless communication to collaboratively search under dense forest canopies. They describe a system where each autonomous quadrotor drone would be equipped with laser-range finders for position estimation, localization, and path planning. As each drone flies around, it creates an individual 3D map of the terrain.

The drone uses algorithms to help it recognize unexplored and already-searched spots, so it knows when it has fully mapped an area. An off-board ground station combines individual maps from multiple drones into a global 3D map that can be monitored by human rescuers.

In a real-world implementation, say the researchers, the drones would also be equipped with object detection to identify a missing hiker. When located, the drone would tag the hiker’s location on the global map, and human searchers could then use this information to plan a rescue mission.

“Essentially,” says Yulun Tian, a graduate student in the Department of Aeronautics and Astronautics (AeroAstro) and first author of a paper on the system, “we’re replacing humans with a fleet of drones to make the search part of the search-and-rescue process more efficient.”

To test their idea, the researchers ran multiple drones in simulations of randomly generated forests, and flew two drones in a forested area within NASA’s Langley Research Center. In both experiments, each drone mapped a roughly 20-square-meter area in about two to five minutes, and the drones collaboratively fused their maps in real time.

The drones also performed well across several metrics, say the researchers, including overall speed and time to complete the mission, detection of forest features, and accurate merging of maps.

Each drone was equipped with a LIDAR system, which creates a 2D scan of the surrounding obstacles by shooting laser beams and measuring the reflected pulses. While this can be used to detect trees, say the researchers, individual trees appear similar to one another, so they programmed their drones to instead identify the orientations of multiple trees relative to one another, which is far more distinctive.

With this method, when the LIDAR signal returns a cluster of trees, an algorithm calculates the angles and distances between trees to identify that cluster. The drones can then use that as a unique signature to tell if they’ve visited the area before or if it’s a new area.
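The article doesn’t give the exact formulation the researchers use, but a minimal sketch of such a cluster signature – summarizing a group of detected trees by its sorted pairwise distances and inter-tree angles, both of which are invariant to the drone’s own position and heading – might look like this (function names and the matching tolerance are illustrative assumptions):

```python
import math
from itertools import combinations

def cluster_signature(trees):
    """Summarize a cluster of tree positions (x, y) by sorted pairwise
    distances and the angle at each tree between its two nearest
    neighbours -- both invariant to the drone's position and heading."""
    dists = sorted(math.dist(a, b) for a, b in combinations(trees, 2))
    angles = []
    for i, t in enumerate(trees):
        # Two nearest neighbouring trees of t.
        others = sorted((math.dist(t, o), o) for j, o in enumerate(trees) if j != i)
        (_, p), (_, q) = others[0], others[1]
        v1 = (p[0] - t[0], p[1] - t[1])
        v2 = (q[0] - t[0], q[1] - t[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        angles.append(math.acos(max(-1.0, min(1.0, dot / norm))))
    return dists, sorted(angles)

def same_cluster(sig_a, sig_b, tol=0.1):
    """Two signatures match if every distance and angle agrees within tol."""
    da, aa = sig_a
    db, ab = sig_b
    if len(da) != len(db):
        return False
    return (all(abs(x - y) < tol for x, y in zip(da, db)) and
            all(abs(x - y) < tol for x, y in zip(aa, ab)))
```

Because the signature depends only on the trees’ geometry relative to one another, the same cluster produces the same signature regardless of the direction from which a drone approaches it.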

The feature-detection technique, say the researchers, helps the ground station – which continuously monitors the drone scans – accurately merge maps. The drones generally explore an area in loops, producing scans as they go. When two drones loop around to the same cluster of trees, the ground station merges the maps by calculating the relative transformation between the drones, and then merging the individual maps to maintain consistent orientations.

“Calculating that relative transformation tells you how you should align the two maps so it corresponds to exactly how the forest looks,” says Tian.
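A standard way to compute such a relative transformation from matched tree positions is a least-squares rigid fit (a 2D Kabsch/Procrustes-style alignment). The sketch below assumes the tree correspondences between the two maps are already known; it illustrates the idea rather than the authors’ actual implementation:

```python
import math

def relative_transform(src, dst):
    """Estimate the 2D rotation theta and translation (tx, ty) that map
    points `src` (tree positions in one drone's map) onto corresponding
    points `dst` (the same trees in another drone's map)."""
    n = len(src)
    cx_s = sum(x for x, _ in src) / n
    cy_s = sum(y for _, y in src) / n
    cx_d = sum(x for x, _ in dst) / n
    cy_d = sum(y for _, y in dst) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cx_s; ys -= cy_s; xd -= cx_d; yd -= cy_d
        sxx += xs * xd; sxy += xs * yd
        syx += ys * xd; syy += ys * yd
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)
    return theta, (tx, ty)

def apply_transform(theta, t, pt):
    """Rotate a point by theta, then translate by t."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * pt[0] - s * pt[1] + t[0], s * pt[0] + c * pt[1] + t[1])
```

Applying the recovered transform to every point in one drone’s map brings it into the other drone’s frame, so the two maps can be overlaid with consistent orientations.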

In the ground station, robotic navigation software called “simultaneous localization and mapping” (SLAM) — which both maps an unknown area and keeps track of an agent inside it — uses the LIDAR input to localize and track the positions of the drones. This helps it fuse the maps together accurately, resulting in a map with 3D terrain features – trees appear as blocks shaded from blue to green, depending on height, while unexplored areas are dark but turn gray as they’re mapped.

On-board path-planning software tells a drone to always explore these dark unexplored areas as it flies around. Producing a 3D map is more reliable than simply attaching a camera to a drone and monitoring the video feed, say the researchers. Transmitting video to a central station, for instance, requires a lot of bandwidth that may not be available in forested areas.

The researchers say they also employed a search strategy that lets the drones explore an area more efficiently. In a traditional approach, a drone would always search the closest possible unknown area; however, that could lie in any direction from the drone’s current position. “That doesn’t respect dynamics of drone [movement],” says Tian. “It has to stop and turn, so that means it’s very inefficient in terms of time and energy, and you can’t really pick up speed.”

So instead, the researchers have their drones explore the closest possible area, while considering their current direction – an approach they believe can help the drones maintain a more consistent velocity. This strategy – where the drone tends to travel in a spiral pattern – covers a search area much faster, which can be a critical factor in search and rescue missions.
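One simple way to encode “closest area, but respecting the current direction” is to add a turning penalty to the usual distance cost when choosing the next unexplored frontier. The cost function and weighting below are illustrative assumptions, not the paper’s actual planner:

```python
import math

def pick_frontier(pos, heading, frontiers, turn_weight=2.0):
    """Choose the next unexplored frontier point, penalizing candidates
    that would force a sharp turn away from the current heading.
    `heading` is in radians; `turn_weight` (metres per radian of turn)
    is a tuning knob, assumed here for illustration."""
    def cost(f):
        dist = math.dist(pos, f)
        bearing = math.atan2(f[1] - pos[1], f[0] - pos[0])
        # Smallest absolute angle between current heading and the bearing.
        turn = abs(math.atan2(math.sin(bearing - heading),
                              math.cos(bearing - heading)))
        return dist + turn_weight * turn
    return min(frontiers, key=cost)
```

With this cost, a slightly farther frontier straight ahead beats a nearer one behind the drone, so the drone keeps its speed up instead of repeatedly stopping to turn around.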

The researchers are presenting their paper describing their autonomous system at the International Symposium on Experimental Robotics (ISER) conference.
