Using RFID tags that can be applied to any object, the system, called TurboTrack, enables robots to locate tagged objects within 7.5 milliseconds on average and with an error of less than a centimeter. Such a system, say the researchers, could enable greater collaboration and precision by robots working on packaging and assembly, and by swarms of drones carrying out search-and-rescue missions.

With the system, a standard RF reader sends a wireless signal that reflects off the RFID tag and other nearby objects, and rebounds to the reader. An algorithm sifts through all the reflected signals to find the RFID tag’s response. Final computations then leverage the RFID tag’s movement – even though this usually decreases precision – to improve its localization accuracy.
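As a rough illustration of this pipeline, the sketch below (Python with NumPy and SciPy) shows one generic way to localize a backscatter tag: convert time-of-flight measurements at several reader antennas into ranges, solve for the tag position by least-squares multilateration, then fit a motion model across successive snapshots so the tag's movement tightens rather than degrades the final fix. The antenna layout, noise levels, and constant-velocity motion model are illustrative assumptions; this is not the TurboTrack algorithm itself.

```python
# Minimal sketch (not the TurboTrack algorithm): localize a backscatter tag by
# multilateration from range estimates at several reader antennas, then fuse
# successive snapshots of the moving tag to tighten the position estimate.
# All positions, noise levels, and the motion model are illustrative.
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # speed of light, m/s

# Known reader-antenna positions (metres) -- an assumed layout.
antennas = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])

def ranges_from_tof(tof):
    """Convert round-trip time-of-flight (s) to one-way range (m)."""
    return 0.5 * C * tof

def localize(range_meas):
    """Find the tag position that best explains the measured ranges."""
    def residual(p):
        return np.linalg.norm(antennas - p, axis=1) - range_meas
    return least_squares(residual, x0=np.array([0.5, 0.5, 0.5])).x

rng = np.random.default_rng(0)
true_start = np.array([0.40, 0.30, 0.20])      # metres
velocity   = np.array([0.05, 0.00, 0.00])      # assumed constant velocity, m/s
dt = 0.01                                      # snapshot spacing, s

# Simulate noisy snapshots as the tag moves and localize each one.
estimates, times = [], []
for k in range(20):
    true_pos = true_start + k * dt * velocity
    tof = 2.0 * np.linalg.norm(antennas - true_pos, axis=1) / C
    tof += rng.normal(0.0, 3e-11, size=len(antennas))   # timing noise
    estimates.append(localize(ranges_from_tof(tof)))
    times.append(k * dt)

estimates, times = np.array(estimates), np.array(times)

# Fit p(t) = p0 + v * t to all snapshots jointly: the motion constraint
# averages out per-snapshot noise, so the fused fix beats any single one.
A = np.hstack([np.ones((len(times), 1)), times[:, None]])
coef, *_ = np.linalg.lstsq(A, estimates, rcond=None)
p0_hat, v_hat = coef[0], coef[1]

print("single-snapshot error (m):", np.linalg.norm(estimates[0] - true_start))
print("motion-fused error (m):   ", np.linalg.norm(p0_hat - true_start))
```

Running the sketch typically shows the motion-fused estimate landing noticeably closer to the true starting position than any single snapshot, which is the general intuition behind exploiting a moving tag rather than treating its motion as a nuisance.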

The system, say the researchers, could replace computer vision for some robotic tasks. Unlike computer vision, which is limited by what it can see, RF signals can identify targets without visualization, within clutter and through walls.

To test the system, the researchers attached one RFID tag to a cap and another to a bottle (see image above). A robotic arm located the cap and placed it onto the bottle, which was held by another robotic arm.

In another demonstration, the researchers tracked RFID-equipped nanodrones during docking, maneuvering, and flying. In both tasks, say the researchers, the system was as accurate and fast as traditional computer vision systems, while working in scenarios where computer vision fails.

“If you use RF signals for tasks typically done using computer vision,” says Fadel Adib, an assistant professor and principal investigator in the MIT Media Lab, and founding director of the Signal Kinetics Research Group, “not only do you enable robots to do human things, but you can also enable them to do superhuman things. And you can do it in a scalable way, because these RFID tags are only three cents each.”

In manufacturing, the system could enable robot arms to be more precise and versatile in, for example, picking up, assembling, and packaging items along an assembly line. Another promising application, say the researchers, is using handheld “nanodrones” for search and rescue missions.

Such nanodrones currently rely on computer vision, stitching together captured images for localization. However, these drones often get confused in chaotic areas, lose each other behind walls, and can’t uniquely identify each other, limiting their ability to, for example, spread out over an area and collaborate to search for a missing person.

Using TurboTrack, say the researchers, nanodrones in swarms could better locate each other, for greater control and collaboration.

“You could enable a swarm of nanodrones to form in certain ways, fly into cluttered environments, and even environments hidden from sight, with great precision,” says Zhihong Luo, a graduate student in the Signal Kinetics Research Group and first author of a paper on the research.

For more, see “3D Backscatter Localization for Fine-Grained Robotics.”

Related articles:
Drone fleet can search under dense forests without GPS
3D through-wall imaging achieved with drones and Wi-Fi
MIT system lets Wi-Fi locate users within tens of centimeters
Drones relay RFID signals for inventory tracking
