New brain implant may translate thought to speech: study

Deesha Bondre | Apr 25, 2019, 16:21 IST
Scientists recently revealed a revolutionary implant that is able to decode words directly from a person’s thoughts
With a growing population of deaf and mute people, and sign language known to only a few, communication is an everyday hassle for these patients. But if a person’s inability to speak stems from brain damage, those days of miscommunication may soon be behind them. Scientists recently revealed a revolutionary implant that is able to decode words directly from a person’s thoughts.
Deaf and mute patients usually rely on sign language, communication devices or lip reading to communicate. This implant may be incredibly helpful in easing these everyday difficulties.
The research behind this project came from the University of California, San Francisco. The team said that they had successfully reconstructed "synthetic" speech using an implant to scan the brain signals of volunteers as they read several hundred sentences aloud. The team stresses that the technology is still at a preliminary stage, but it shows great promise for transposing the thoughts of mute patients into speech in real time. The findings were published in the journal Nature, in which the team said that instead of trying to translate the electrical activity directly into speech, they adopted a three-stage approach.
For this, they asked participants to read out sentences while an implant on the surface of the brain monitored their neural activity and the acoustic sound of the words was recorded.
The signals were then transformed into representations of the physical movements of the jaw, mouth and tongue required for speech, and these articulatory movements were in turn converted into synthetic sentences. The final step was asking volunteers to identify words and sentences from the computerised speech. The recordings were strange, a little fuzzy, but the simulated sentences mimicked those spoken by the volunteers so closely that most words could be clearly understood.
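The pipeline described above boils down to two learned mappings: neural activity to articulator movements, then articulator movements to acoustic features that a synthesiser turns into audio. The sketch below is purely illustrative and uses randomly generated stand-in data and simple ridge regressions; the electrode counts, feature sizes and models are assumptions for the sake of the example, not the study's actual recurrent-network implementation.

```python
# Minimal, illustrative sketch of the two-step decoding idea described above:
# neural signals -> articulatory movements -> acoustic features.
# All data here is randomly generated stand-in data; the study itself used
# high-density brain recordings, recurrent neural networks and a speech
# synthesiser, none of which are reproduced here.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_samples = 2000        # time points
n_electrodes = 64       # hypothetical number of cortical electrodes
n_articulators = 6      # hypothetical jaw/lip/tongue movement traces
n_acoustic = 32         # hypothetical acoustic (spectral) features

# Stand-in recordings: neural activity, articulator movements, acoustics.
neural = rng.standard_normal((n_samples, n_electrodes))
articulation = neural @ rng.standard_normal((n_electrodes, n_articulators))
acoustics = articulation @ rng.standard_normal((n_articulators, n_acoustic))

# Stage 1: map neural activity to articulatory movements.
stage1 = Ridge(alpha=1.0).fit(neural, articulation)

# Stage 2: map decoded articulatory movements to acoustic features,
# which a vocoder would then turn into audible speech.
stage2 = Ridge(alpha=1.0).fit(stage1.predict(neural), acoustics)

# Decoding a new stretch of neural data end to end.
new_neural = rng.standard_normal((10, n_electrodes))
decoded_articulation = stage1.predict(new_neural)
decoded_acoustics = stage2.predict(decoded_articulation)
print(decoded_acoustics.shape)  # (10, 32): acoustic features ready for synthesis
```

The intermediate articulatory step is the design choice the researchers highlight: rather than decoding sound directly, the system first recovers how the vocal tract is moving, and only then maps those movements to speech.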
The experiment was conducted only with people who could speak, but the approach could also be used to synthesise speech from participants even when they only mimed the sentences.

"Very few of us have any idea of what’s going in our mouths when we speak," said Edward Chang, the lead study author.
"The brain translates those thoughts into movements of the vocal tract and that’s what we’re trying to decode."
This could potentially open the way for an implant that can translate into words the brain activity of patients who know how to speak but have lost the ability to do so.
'Those thieves stole jewels'
The sentences used in the study were simple, declarative statements, including: "Shipbuilding is a most fascinating process", and "Those thieves stole thirty jewels."
Gopala Anumanchipalli, the co-author of the study, told AFP that the words used would add to a database that could eventually allow users to discern more complicated statements.
"We used sentences that are particularly geared towards covering all of the phonetic contexts of the English language," he said. "But they are only learned so they can be generalised from."
The researchers identified a type of "shared" neural code among participants, suggesting that the parts of the brain triggered by trying to articulate a word or phrase are the same in everyone.
