Wearables Help the Blind Walk
For people who are blind or visually impaired, options for walking and traveling the streets are beginning to change. Canes and seeing-eye dogs are not going away, but new technology drawn from robotics and from the sensors used in autonomous vehicles is poised to help the visually impaired move about with greater ease. In one development, sensing systems using LIDAR, echolocation, and infrared technology translate environmental data into tactile information delivered as vibrations. One New York City startup is fitting this technology into clothing, where embedded sensors pulse and vibrate to inform users of obstacles in their path.
Called “Eyeronman,” the system consists of two wearable items: an external vest with embedded sensors that read the user’s environment, and an internal belt fitted with vibrating elements that translate the sensor data into tactile information. Based on where the vibrations are felt and how fast they pulse, a user can work out the direction of an obstacle and the speed at which it is approaching.
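As a rough illustration of that encoding, the sketch below maps an obstacle’s bearing to a belt zone and raises the pulse rate as the obstacle nears and closes faster. This is a sketch under assumed conventions, not Eyeronman’s actual design; the zone boundaries and constants are hypothetical.

```python
# Hypothetical sketch of an obstacle-to-vibration encoding, loosely modeled
# on the article's description: where the vibration is felt gives direction,
# and how fast it pulses signals proximity and closing speed. The zone
# layout, ranges, and formula are assumptions, not Eyeronman's actual design.

def select_zone(bearing_deg: float) -> str:
    """Map a horizontal bearing (degrees, 0 = straight ahead) to a belt zone."""
    if bearing_deg < -20:
        return "left"
    if bearing_deg > 20:
        return "right"
    return "center"

def pulse_rate_hz(distance_ft: float, closing_speed_fps: float) -> float:
    """Pulse faster as the obstacle gets closer and approaches more quickly."""
    base = 1.0                                          # idle pulse rate
    proximity = max(0.0, (18.0 - distance_ft) / 18.0)   # 0 = far, 1 = touching
    approach = max(0.0, closing_speed_fps) / 10.0       # crude normalization
    return base + 6.0 * proximity + 2.0 * approach

# Example: an obstacle 8 ft away, slightly to the right, closing at 3 ft/s.
print(select_zone(25.0), round(pulse_rate_hz(8.0, 3.0), 2))
```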
“We’re now building a functional prototype wearable that can sense 120 degrees of the horizontal field and 120 degrees of the vertical with a 10- to 18-foot range,” says Dr. J.R. Rizzo, founder and chief medical advisor of Tactile Navigation Tools, and assistant professor of rehabilitation medicine and director of the Visuomotor Integration Laboratory and the Technology Translation in Medicine Lab at New York University’s Langone Department of Physical Medicine and Rehabilitation. “It is similar to bumper sensors. The chirp scales up [when the object gets closer].”
Rizzo, who developed degenerative vision loss at an early age, conceived the idea while in medical school studying multi-sensory integration. “I was looking at sensor fusion, using sonar buttressed by RADAR or LIDAR, and integrating it into something meaningful,” he says.
Getting to this point has taken a few years, a partner, and an education in building a business. One of his first actions was signing up a computational neuroscientist, Todd Hudson, a research scientist at NYU’s Center for Neural Science, as technology advisor. They then took advantage of NYU’s Technology Transfer Office within the Stern School of Business, along with its W.R. Berkley Center for Innovation.
Through the school, they received help developing their business plan and were introduced to business advisors and potential investors. They learned how to pitch their business through competitions, such as the center’s $200,000 Entrepreneurs Challenge. “We were not a finalist, but we did receive some grants, and put in our own money,” says Rizzo.
The introductions produced not only seed money but professional advice and staffing. Money for research and development, as well as salaries, is scarce at a startup, so the team offered equity stakes in the firm to attract and compensate senior expertise in engineering, business management, and medicine. Some work piecemeal, or as needed. In addition, Tactile Navigation Tools works with two other firms as independent contractors.
“We run pretty light,” says Rizzo. “In one month we may have ten people working on a project. The next month, after those tasks are finished, maybe we have only two. This allows us to direct cash to research and development.”
Down the road, Rizzo envisions taking the Eyeronman platform to a “fully connected” 4G, Wi-Fi system with a headset or form of auditory communications. “The goal is to have a vest and belt fitted with a complete multi-modal system,” he says.
The Eyeronman system is now fitted with LIDAR, the same technology autonomously controlled vehicles use to recognize obstacles, “like a bat or dolphin with echolocation,” says Rizzo. “Sensors then convert that information into something meaningful.” He complains that the sightless have been left behind, still using canes developed in the early 20th century, while medical technology provides sophisticated prosthetics and other products to those who have lost limbs or the use of their limbs.
For instance, the system would detect an obstacle on the user’s upper right and convert that into vibrations in the upper right portion of a T-shirt or vest made of electro-active polymers. Studies have shown that in visually impaired people, parts of the brain used to process visual information are also used to process auditory information. That plasticity allows a blind person to train themselves to recognize shapes; a blind person could walk past a car and feel it through vibration, forming the beginning of a vibratory library.
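A minimal sketch of that direction-to-actuator mapping might look like the following, assuming a 3x3 actuator grid spread across the stated 120-degree horizontal and vertical fields of view. The grid size and frame conventions are hypothetical, chosen only to make the idea concrete.

```python
# Hypothetical mapping of a sensed obstacle's direction onto a grid of
# vibratory actuators in a vest, as the article describes (an obstacle to
# the upper right triggers the upper-right actuator). The 120-degree fields
# of view come from the article; the 3x3 layout is an assumption.

FOV_DEG = 120.0   # horizontal and vertical field of view
GRID = 3          # 3x3 actuator grid: rows top-to-bottom, cols left-to-right

def actuator_for(azimuth_deg: float, elevation_deg: float) -> tuple[int, int]:
    """Return (row, col) of the actuator covering the given direction.

    Azimuth and elevation are measured from the center of the field of
    view, so both range over [-60, +60] degrees.
    """
    def bucket(angle: float) -> int:
        # Shift from [-60, 60] to [0, 120], then split into GRID bands.
        idx = int((angle + FOV_DEG / 2) / (FOV_DEG / GRID))
        return min(max(idx, 0), GRID - 1)

    col = bucket(azimuth_deg)               # left .. right
    row = GRID - 1 - bucket(elevation_deg)  # high elevation = top row 0
    return row, col

# An obstacle up and to the right activates the upper-right actuator (0, 2).
print(actuator_for(azimuth_deg=40.0, elevation_deg=35.0))
```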
Rizzo says the system can also be adapted for use by police or firefighters working at night or in situations where vision is impaired, such as a burning building. It also holds potential for soldiers in combat. The system can be fitted with fire-retardant materials or a bullet-proof vest, he adds.
Eyeronman is powered by a lithium-ion battery, and Rizzo says the design has evolved substantially since the company was formed in 2013; provisional patents were filed a year earlier. Early designs used LED bulbs in place of vibratory actuators, and designers have experimented with the number, arrangement, and position of sensors.
“We have to deal with efficiency from a temporal processing standpoint,” Rizzo notes. “We have a foundation that can be modified for each user, but we still need to create a stable platform for the sensory scheme, a scaffolding that you can put into a wearable [article of clothing]. You also have to account for pitch and yaw, and realize that someone can turn their head.”
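One way to picture that pitch-and-yaw accounting is a frame transform like the sketch below, which rotates a direction measured in the sensor’s own frame into the wearer’s body frame so a turned head does not misplace the vibration. The conventions and function names are assumptions for illustration, not the company’s implementation.

```python
import math

# Hypothetical sketch of the pitch-and-yaw compensation Rizzo mentions:
# rotate an obstacle direction from the sensor's frame into the wearer's
# body frame. Frame conventions and all names here are assumptions.

def sensor_to_body(azimuth_deg: float, elevation_deg: float,
                   sensor_yaw_deg: float, sensor_pitch_deg: float):
    """Convert a direction seen by the sensor into the body frame."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    # Unit direction vector in the sensor frame: x forward, y left, z up.
    x = math.cos(el) * math.cos(az)
    y = math.cos(el) * math.sin(az)
    z = math.sin(el)

    # Apply the sensor's pitch (rotation about y), then yaw (about z).
    p, w = math.radians(sensor_pitch_deg), math.radians(sensor_yaw_deg)
    x, z = x * math.cos(p) + z * math.sin(p), -x * math.sin(p) + z * math.cos(p)
    x, y = x * math.cos(w) - y * math.sin(w), x * math.sin(w) + y * math.cos(w)

    return math.degrees(math.atan2(y, x)), math.degrees(math.asin(z))

# A head-mounted sensor turned 30 degrees to one side: an obstacle dead
# ahead of the sensor is reported 30 degrees off-center in the body frame.
print(sensor_to_body(0.0, 0.0, sensor_yaw_deg=30.0, sensor_pitch_deg=0.0))
```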
Rizzo says the system now works when someone fitted with it walks at a slow pace. The goal is to have it work at a normal walking pace, but some technical difficulties need to be overcome. “Ultrasound operates at the speed of sound, but you still need to wait for the echo of the chirp,” he says, adding that they have had to work to minimize crosstalk between sensors and interference from outside noise.
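A back-of-the-envelope calculation shows why that round-trip wait matters at walking speed. The sketch below assumes, hypothetically, that sensors fire in staggered time slots so they never hear each other’s chirps; the sensor count is illustrative, and only the 18-foot range comes from the article.

```python
# Hypothetical back-of-the-envelope on why ultrasound limits update rate:
# each sensor must wait for its chirp's echo before it can ping again, and
# firing sensors in staggered time slots is one common way to avoid the
# crosstalk Rizzo mentions. The sensor count here is illustrative.

SPEED_OF_SOUND_FPS = 1125.0   # speed of sound in air, feet/second (approx.)
MAX_RANGE_FT = 18.0           # the prototype's stated maximum range

# Worst-case round trip: out to maximum range and back.
round_trip_s = 2 * MAX_RANGE_FT / SPEED_OF_SOUND_FPS
print(f"echo round trip at {MAX_RANGE_FT:.0f} ft: {round_trip_s * 1000:.0f} ms")

# If N sensors fire one at a time to avoid crosstalk, a full sweep of the
# field of view takes N round trips.
n_sensors = 8
sweep_s = n_sensors * round_trip_s
print(f"{n_sensors} staggered sensors: {sweep_s * 1000:.0f} ms per sweep, "
      f"about {1 / sweep_s:.0f} full updates per second")
```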
Still, “When people put this vest on, nothing is average,” he claims. “You turn the system on and have people walk an obstacle course, and they understand; they pick this up almost instantly. You can now walk in one direction and put your torso in another direction” and still recognize objects.
Rizzo’s goal is to start commercial production in 18 to 24 months. After that, he hopes to add cloud-based software to deliver voice messages through a multi-modal system, a development that would bring technology for the visually impaired into the 21st century.