Assistive devices can use intrinsic neural connections to help blind people to detect their environment without hundreds of hours of training or intense concentration, say researchers at the California Institute of Technology (Caltech). The research was announced on Oct. 26, 2015, and was published in the October 22, 2015, issue of Scientific Reports.
The human brain has a network of sensory neurons that interact and are integrated to process sounds, smells, visual information, and the feel and taste of food. Aids called sensory substitution devices help people without sight achieve a new sensory functionality similar to vision. The researchers conducted trials on a device called the vOICe that translates images into sound. Its components are a small computer connected to a camera attached to darkened glasses, allowing the device to “see” the same way a human eye would.
Each camera image is scanned from left to right by a computer algorithm. An associated sound is generated with a volume and frequency based on the brightness and vertical location of the pixels. A cluster of bright pixels at the top of a column would be converted into a loud, high-frequency sound, while a cluster of dark pixels lower in the column would generate a quieter, lower-pitched sound. A blind person wearing these special glasses can then associate the sounds with the characteristics of their environment.
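The mapping described above can be sketched in a few lines of code. This is not the vOICe's actual implementation (its parameters and scan details are not given in the article); it is a minimal illustration of the stated principle, assuming a grayscale image with brightness values in [0, 1], a left-to-right column scan, pixel height mapped to tone frequency, and brightness mapped to tone amplitude. The function name and all parameter values are hypothetical.

```python
import numpy as np

def sonify_image(image, f_min=200.0, f_max=4000.0,
                 col_duration=0.05, sample_rate=8000):
    """Illustrative image-to-sound mapping (not the actual vOICe code).

    image: 2D array of brightness in [0, 1]; row 0 is the top of the image.
    Each column becomes one short audio frame: every pixel contributes a
    sine tone whose frequency rises with height in the image and whose
    loudness rises with brightness.
    """
    rows, cols = image.shape
    # Pixels near the top (small row index) get the highest frequencies.
    freqs = np.linspace(f_max, f_min, rows)
    t = np.arange(int(col_duration * sample_rate)) / sample_rate
    frames = []
    for c in range(cols):  # scan the image left to right
        brightness = image[:, c]
        # One sine per pixel, amplitude weighted by that pixel's brightness.
        tones = brightness[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        frame = tones.sum(axis=0)
        peak = np.abs(frame).max()
        if peak > 0:
            frame /= peak  # normalize each frame to avoid clipping
        frames.append(frame)
    return np.concatenate(frames)
```

With this sketch, a bright patch at the top of the image produces a loud, high-pitched segment during the corresponding portion of the scan, matching the example in the text.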
“Many neuroscience textbooks really only devote a few pages to multisensory interaction, but 99 percent of our daily life depends on multisensory—also called multimodal—processing,” says Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology and principal investigator on the study. An example of this process is talking to a person you know well who is crying and visualizing the person’s face in tears while hearing the sound of sobbing. “This is an example of the way sensory causality is not unidirectional—vision can influence sound, and sound can influence vision,” Shimojo said.
Shimojo and postdoctoral scholar Noelle Stiles used crossmodal mappings to stimulate the visual cortex with sound that encodes information about the environment. The mappings include matching noisy sounds to bright lights and high pitches to elevated locations in space. The trials were conducted with both blind and sighted people using the vOICe device. Blind participants were asked to feel textures and match them to sounds, while sighted people, without instruction or training, were asked to match images to sounds. Both groups demonstrated an intuitive ability to identify images and textures from their associated sounds.
“Auditory regions are activated upon hearing sound, as are the visual regions, which we think will process the sound for its spatial qualities and elements,” said Stiles. “The visual part of the brain, when processing images, maps objects to spatial location, fitting them together like a puzzle piece.”
“Is seeing what happens when you open your eyes?” Shimojo said. “No, because opening your eyes is not enough if the retina, the light-sensitive layer of tissue in the eye, is damaged. Is it when your visual cortex is activated? But our research has shown that the visual cortex can be activated by sound, indicating that we don’t really need our eyes to see. It’s very profound—we’re trying to give blind people a visual experience through other senses.”