Byron Spice | Thursday, October 15, 2015
Scientists from IBM Research and Carnegie Mellon University have announced the first open source platform designed to support the creation of smartphone apps that enable the blind to better navigate their surroundings.
The IBM and CMU researchers used the platform to create a pilot app, called NavCog, that draws on existing sensors and cognitive technologies to inform blind people on the CMU campus about their surroundings by "whispering" into their ears through earbuds or by creating vibrations on smartphones.
The app analyzes signals from Bluetooth beacons located along walkways and from smartphone sensors to enable users to move without human assistance, whether inside campus buildings or outdoors. Initial beacon routes on the CMU campus connect the Gates and Hillman Centers with the Cohon University Center via the Pausch Bridge, Newell-Simon Hall with the Gates and Hillman Centers, and Wean Hall with Newell-Simon Hall.
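For readers curious how beacon-based positioning of this general kind can work, here is a minimal sketch in Python; it is not NavCog's actual code, and the beacon IDs, calibration constants, and path-loss exponent are invented placeholders. It converts a Bluetooth beacon's received signal strength (RSSI) into an approximate distance using a standard log-distance path-loss model, then picks the nearest beacon.

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Estimate distance (m) from RSSI with a log-distance path-loss model.

    tx_power_dbm is the calibrated RSSI at 1 m; both constants must be
    measured per deployment and are placeholder values here.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def nearest_beacon(readings: dict) -> tuple:
    """Given {beacon_id: rssi_dbm}, return the closest beacon and its
    estimated distance in meters."""
    best = min(readings, key=lambda b: rssi_to_distance(readings[b]))
    return best, rssi_to_distance(readings[best])

# A user standing nearest beacon-002 (about 1.4 m away at -62 dBm).
print(nearest_beacon({"beacon-001": -78.0, "beacon-002": -62.0,
                      "beacon-003": -81.0}))
```

A deployed system would smooth RSSI over time and fuse it with other sensors, since raw Bluetooth signal strength fluctuates heavily indoors.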
Researchers are exploring additional capabilities for future versions of the app, such as detecting who is approaching and what mood they are in. NavCog will soon be available free of charge on the Apple App Store.
The first set of cognitive assistance tools for developers is now available via the cloud through IBM Bluemix. The open toolkit consists of an app for navigation, a map editing tool and localization algorithms that allow the blind to know in real time where they are, which direction they are facing and additional information about their surroundings. The computer-vision navigation tool turns smartphone images of the surrounding environment into a 3-D space model to help improve localization and navigation for the visually impaired.
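As a rough illustration of one simple indoor-localization technique, the sketch below computes a weighted centroid over known beacon positions: each beacon is weighted by its received power, so stronger signals pull the position estimate toward that beacon. The beacon map and readings are invented for the example, and the toolkit's own algorithms may well be more sophisticated.

```python
# Hypothetical beacon map; IDs and (x, y) coordinates in meters are invented.
BEACONS = {"b1": (0.0, 0.0), "b2": (10.0, 0.0), "b3": (5.0, 8.0)}

def weighted_centroid(readings: dict) -> tuple:
    """Estimate (x, y) as a weighted average of known beacon positions.

    readings maps beacon_id -> RSSI in dBm. Each beacon is weighted by
    its received power on a linear scale, so stronger (less negative)
    signals dominate the estimate.
    """
    weights = {b: 10 ** (rssi / 10.0) for b, rssi in readings.items()}
    total = sum(weights.values())
    x = sum(BEACONS[b][0] * w for b, w in weights.items()) / total
    y = sum(BEACONS[b][1] * w for b, w in weights.items()) / total
    return x, y

# Strongest signal is from b2, so the estimate lands near (10, 0).
print(weighted_centroid({"b1": -78.0, "b2": -62.0, "b3": -81.0}))
```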
"While visually impaired people like myself have become independent online, we are still challenged in the real world. To gain further independence and help improve the quality of life, ubiquitous connectivity across indoor and outdoor environments is necessary," said IBM Fellow Chieko Asakawa, a visiting faculty member in the Robotics Institute.
"I'm excited that this open platform will help accelerate the advancement of cognitive assistance research by giving developers opportunities to build various applications and test non-traditional technologies such as ultrasonic and advanced inertial sensors to assist navigation," she said.
The combination of these multiple technologies is known as "cognitive assistance," a research field dedicated to helping the blind gain information by augmenting missing or weakened abilities.
Kris Kitani, a systems scientist in the Robotics Institute, is working with Asakawa on the NavCog project with support from an IBM Open Collaborative Research award.
The researchers plan to add various localization technologies, including sensor fusion, which integrates data from multiple environmental sensors for highly sophisticated cognitive functioning, such as facial recognition in public places like hospitals and corporate campuses. They also are exploring the use of computer vision to characterize the activities of people in the vicinity and ultrasound technology to identify locations more accurately.
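To make "sensor fusion" concrete: a common textbook approach to fusing heading sensors is a complementary filter, which integrates a gyroscope for smooth short-term heading and blends in a magnetometer (compass) reading to correct long-term drift. The sketch below is a generic illustration of that idea, not the researchers' method; the blend factor alpha is a placeholder.

```python
def fuse_heading(prev_heading_deg: float, gyro_rate_dps: float,
                 dt_s: float, compass_heading_deg: float,
                 alpha: float = 0.98) -> float:
    """Complementary filter: integrate the gyro for smooth short-term
    heading and blend in the compass to correct drift over time.
    alpha close to 1 trusts the gyro more; 0.98 is a typical placeholder.
    """
    gyro_heading = (prev_heading_deg + gyro_rate_dps * dt_s) % 360.0
    # Blend along the shortest angular arc so 359 deg and 1 deg average near 0.
    error = ((compass_heading_deg - gyro_heading + 180.0) % 360.0) - 180.0
    return (gyro_heading + (1.0 - alpha) * error) % 360.0

# Example: one 20 ms step, turning right at 30 deg/s, compass reads 47 deg.
print(round(fuse_heading(45.0, 30.0, 0.02, 47.0), 2))  # ~45.63
```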
"From localization information to understanding of objects, we have been creating technologies to make the real-world environment more accessible for everyone," said Martial Hebert, director of the Robotics Institute.
"With our long history of developing technologies for humans and robots that will complement human's missing abilities to sense the surrounding world, this open platform will help expand the horizon for collaboration around the world to open up the new real-world accessibility era for the blind in the near future," he said.
Byron Spice | 412-268-9068 | bspice@cs.cmu.edu