DIY smart cane is self-navigating
Using tools from autonomous vehicles, the researchers built the augmented cane, which helps people detect and identify obstacles, move easily around those objects, and follow routes both indoors and out. Unlike previous sensor canes, which are either heavy and expensive or limited in their sensing capabilities, the augmented cane incorporates cutting-edge sensors, weighs only three pounds, can be built at home from off-the-shelf parts and open-source software, and costs $400.
The device, say the researchers, will hopefully be an affordable and useful option for the more than 250 million people with impaired vision worldwide.
“We wanted something more user-friendly than just a white cane with sensors,” says Patrick Slade, a graduate research assistant in the Stanford Intelligent Systems Laboratory and first author of a paper describing the augmented cane. “Something that can not only tell you there’s an object in your way, but tell you what that object is and then help you navigate around it.”
The augmented cane is equipped with a LIDAR sensor – the laser-based technology used in some self-driving cars and aircraft that measures the distance to nearby obstacles. The cane has additional sensors including GPS, accelerometers, magnetometers, and gyroscopes, like those on a smartphone, that monitor the user’s position, speed, direction, and so forth.
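Fusing those smartphone-style sensors into a usable heading estimate is a standard problem in robotics. As one illustrative sketch (not the authors' implementation), a simple complementary filter can blend a gyroscope, which integrates smoothly but drifts, with a magnetometer, which is noisy but drift-free:

```python
def complementary_filter(prev_heading, gyro_rate, mag_heading, dt, alpha=0.98):
    """Fuse gyroscope and magnetometer readings into one heading estimate.

    prev_heading: previous heading estimate (radians)
    gyro_rate:    angular rate from the gyroscope (radians/second)
    mag_heading:  absolute heading from the magnetometer (radians)
    dt:           time since the last update (seconds)
    alpha:        weight on the gyro path; the small remainder on the
                  magnetometer slowly corrects long-term drift
    """
    # Integrate the angular rate to propagate the heading forward
    gyro_heading = prev_heading + gyro_rate * dt
    # Blend: trust the gyro short-term, the magnetometer long-term
    return alpha * gyro_heading + (1 - alpha) * mag_heading

# One 0.1 s update: turning at 0.1 rad/s, magnetometer reads due "north" (0 rad)
heading = complementary_filter(0.0, 0.1, 0.0, 0.1)
```

The filter coefficient and update structure here are generic textbook choices, not values taken from the cane's software.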
The cane makes decisions using artificial intelligence-based wayfinding and robotics algorithms like simultaneous localization and mapping (SLAM) and visual servoing – steering the user toward an object in an image.
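The core of visual servoing can be sketched as a proportional controller: measure how far the target sits from the center of the camera image and steer to reduce that error. The function below is a minimal illustration under a pinhole-camera assumption, not code from the project:

```python
import math

def visual_servoing_step(target_px, image_width, fov_deg, gain=0.8):
    """Return a steering command (radians of turn) that centers a target.

    target_px:   horizontal pixel position of the target in the image
    image_width: image width in pixels
    fov_deg:     horizontal field of view of the camera, in degrees
    gain:        proportional gain on the angular error
    """
    # Horizontal offset of the target from the image center, in pixels
    offset = target_px - image_width / 2
    # Approximate angular error (radians), assuming a pinhole camera
    error = offset / image_width * math.radians(fov_deg)
    # Proportional control: steer in proportion to the error
    return gain * error

# Target 80 px right of center in a 640 px-wide, 60-degree-FOV image:
cmd = visual_servoing_step(400, 640, 60)  # positive = steer right
```

Real visual-servoing controllers add calibration, smoothing, and limits, but the proportional structure is the essential idea.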
“Our lab is based out of the Department of Aeronautics and Astronautics,” says Mykel Kochenderfer, senior author on the study and an associate professor of aeronautics and astronautics and an expert in aircraft collision-avoidance systems, “and it has been thrilling to take some of the concepts we have been exploring and apply them to assist people with blindness.”
Mounted at the tip of the cane is a motorized, omnidirectional wheel that maintains contact with the ground. This wheel leads the user with impaired vision by gently tugging and nudging, left and right, around impediments. Equipped with built-in GPS and mapping capabilities, the augmented cane can even guide its user to precise locations – like a favorite store in the mall or a local coffee shop.
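One way such nudging logic could work – a sketch under assumed names and units, not the authors' method – is to blend goal-seeking with obstacle avoidance: steer toward the goal bearing by default, and bias away from any LIDAR return closer than a safety margin:

```python
def steering_nudge(goal_bearing, lidar_scan, safe_dist=1.0, avoid_gain=0.5):
    """Compute a left/right nudge for the motorized guide wheel.

    goal_bearing: bearing to the goal relative to the user (radians,
                  positive = right)
    lidar_scan:   list of (bearing_rad, distance_m) obstacle returns
    safe_dist:    distance inside which obstacles start to repel
    avoid_gain:   strength of the repulsive correction
    Returns a nudge in radians: negative = tug left, positive = tug right.
    """
    nudge = goal_bearing  # steer toward the goal by default
    for bearing, dist in lidar_scan:
        if dist < safe_dist:
            # Push away from a close obstacle, more strongly when nearer
            nudge -= avoid_gain * bearing * (safe_dist - dist) / safe_dist
    return nudge

# Goal straight ahead, one obstacle half a meter away on the right:
nudge = steering_nudge(0.0, [(0.5, 0.5)])  # negative: tug the user left
```

This potential-field-style blend is a common robotics pattern; the actual cane uses SLAM and visual servoing as described above, so this sketch only illustrates the tug-around-obstacles behavior.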
In real-world tests with volunteers recruited through the Palo Alto Vista Center for the Blind and Visually Impaired, the researchers put the augmented cane in the hands of people with visual impairments as well as sighted people who were blindfolded. Participants were then asked to complete everyday navigation challenges: walking hallways, avoiding obstacles, and traversing outdoor waypoints.
“We want the humans to be in control,” says Kochenderfer, “but provide them with the right level of gentle guidance to get them where they want to go as safely and efficiently as possible.”
The augmented cane increased walking speed for participants with impaired vision by roughly 20 percent over the white cane alone. For sighted people wearing blindfolds, the gains were even larger: speeds rose by more than a third. Because walking speed is linked to quality of life, the researchers hope the device will deliver a similar benefit for its users.
The researchers are open-sourcing every aspect of the project.
“We wanted to optimize this project for ease of replication and cost,” says Kochenderfer. “Anyone can go and download all the code, bill of materials, and electronic schematics, all for free.”
“Solder it up at home. Run our code,” says Slade. “It’s pretty cool.”
The researchers note that the cane is still a research prototype and that significant further engineering and testing are needed before it is ready for everyday use. They add that they would welcome industry partners who could streamline the design and scale up production to make the augmented cane even more affordable.
Looking ahead, the researchers plan to refine their prototype and develop a version that uses an everyday smartphone as the processor – an advance that could improve functionality, broaden access to the technology, and further drive down costs.
For more, including a downloadable parts list and DIY solder-at-home instructions, see “Multimodal sensing and intuitive steering assistance improve navigation and mobility for people with impaired vision.”
Related articles:
Miniaturized obstacle-detection technology is aim of new research project
“Proximity Hat” makes the blind see
Wearable radar sensor assists the visually impaired
Vibrating car window gives blind people views of the landscape