Could a hearing aid read minds to filter out the noise?
People with hearing loss often struggle in noisy environments like restaurants or parties, where picking out one voice from the crowd is key. Current hearing aids can suppress background noise, but they cannot separate one conversation from many.
Now, Columbia researchers at the Zuckerman Institute are designing a hearing aid that can home in on a single conversation. Nima Mesgarani, an associate professor of electrical engineering, and his team have developed a system that relies both on a new type of microphone and on the brain’s own ability to filter what it hears.
The microphone receives auditory input and uses deep neural network models, a form of artificial intelligence, to separate speakers from a mixture of voices. Then, the device compares these separated speakers to the listener’s neural signals to determine which voice he or she is attending to, and amplifies that voice. The entire process takes under 10 seconds.
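The decoding step described above can be illustrated with a toy sketch. The code below is not the team's actual system; it is a minimal, simplified illustration of attention decoding by "stimulus reconstruction," a common approach in this research area. It assumes the separation stage has already produced each talker's speech envelope, simulates a neural signal that tracks the attended talker, picks the talker whose envelope best correlates with that signal, and boosts that voice in the output mix. All variable names and the correlation-based decoder are illustrative assumptions, not details from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # 10 s of envelope samples at a hypothetical 100 Hz rate

def smooth(x, k=9):
    """Moving-average smoothing, a crude stand-in for a speech envelope."""
    return np.convolve(x, np.ones(k) / k, mode="same")

# Stand-in for the DNN separation stage: assume it has already recovered
# each talker's speech envelope from the microphone mixture.
envelopes = [smooth(np.abs(rng.standard_normal(n))) for _ in range(2)]

# Simulated listener neural signal: in this toy model it tracks the
# attended talker's envelope (talker 1) plus measurement noise.
attended = 1
neural = envelopes[attended] + 0.3 * rng.standard_normal(n)

def pearson(a, b):
    """Pearson correlation between two 1-D signals."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Decode attention: the talker whose envelope best matches the neural
# signal is assumed to be the one the listener is focusing on.
scores = [pearson(neural, env) for env in envelopes]
decoded = int(np.argmax(scores))

# Amplify the decoded talker relative to the others in the output mix.
gains = [4.0 if i == decoded else 1.0 for i in range(len(envelopes))]
output = sum(g * env for g, env in zip(gains, envelopes))

print(f"decoded talker: {decoded}, correlation scores: {scores}")
```

In this toy run the decoder recovers the attended talker because the neural signal correlates far more strongly with that talker's envelope than with the competitor's. The real system would perform this comparison continuously, which is consistent with the under-10-second switching time the article reports.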
“Our system demonstrates a significant improvement in both subjective and objective speech quality measures—almost all of our subjects said they wanted to continue to use it,” Mesgarani says. He and his team are hoping to turn this cognitive hearing aid into a real product within the next five years.