A breakthrough combining a brain implant with a digital avatar allowed a stroke survivor to regain her voice and speak with facial expressions after 18 years.

AI and Brain Implant Help Paralyzed Woman Speak Through Avatar 18 Years After Stroke
(Photo: Pixabay/anaterate)

A Story of Survival

Ann was working as a high school math teacher in Canada when she suffered a brainstem stroke in 2005, at age 30. The stroke left her severely paralyzed, costing her control of virtually all the muscles in her body.

The stroke came on suddenly, and doctors could not explain its cause. Ann was diagnosed with locked-in syndrome (LIS), a condition in which a person retains full awareness and sensation but is trapped inside a body whose voluntary muscles no longer respond.

Ann went through years of physical therapy, which allowed her to breathe independently and move her neck. She also learned to move her facial muscles enough to smile, laugh, wink, or cry. However, the muscles needed for speech remained immobile.

In 2021, Ann learned about the work of Edward Chang, a neurosurgeon at the University of California, San Francisco, who had been developing brain-computer interface (BCI) technology. She discovered the research after reading about a paralyzed man named Pancho.

Pancho had also experienced a brainstem stroke many years earlier, and it was not clear whether his brain could still issue the commands for speech. Chang and his colleagues had worked to translate Pancho's brain signals into text as he attempted to speak. With the study's success, Pancho became the first person with paralysis to demonstrate that brain signals intended for speech could be decoded into full words.

READ ALSO: Brain-computer Interface Accurately Translates Brainwaves Into Letters, Helping Paralyzed Patients to Spell Words

Decoding the Speech Signals

In Ann's case, the researchers attempted something more ambitious: decoding her brain signals into audible speech along with the facial movements that accompany normal conversation. According to Chang, their goal was to restore a full, embodied form of communication, which is the most natural way for humans to talk with others.

To make this possible, a paper-thin rectangle of 253 electrodes was implanted onto the surface of Ann's brain. The electrodes were connected to a bank of computers by a cable plugged into a port fixed to Ann's head.

For weeks, Ann and the researchers worked together to train the system's AI algorithm to recognize her unique brain signals intended for speech. Instead of training the AI to recognize whole words, Chang and his team developed a system that decodes words from smaller units called phonemes. With this approach, the computer only needs to learn 39 phonemes to decode any word in English. This not only made the process three times faster, but it also enhanced the system's accuracy.
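To illustrate why phoneme-level decoding is so efficient, here is a minimal, hypothetical Python sketch. It is not the UCSF team's actual pipeline: the 39-symbol phoneme inventory is the standard ARPAbet set for American English, but the tiny lexicon, the greedy matcher, and the example phoneme stream are assumptions made purely for demonstration. In the real system, a trained neural network predicts phonemes from 253-channel brain recordings before they are assembled into words.

    # Toy sketch of phoneme-based word decoding (illustrative only).
    # The ~39 phonemes of General American English, in ARPAbet notation:
    PHONEMES = [
        "AA", "AE", "AH", "AO", "AW", "AY", "B", "CH", "D", "DH",
        "EH", "ER", "EY", "F", "G", "HH", "IH", "IY", "JH", "K",
        "L", "M", "N", "NG", "OW", "OY", "P", "R", "S", "SH",
        "T", "TH", "UH", "UW", "V", "W", "Y", "Z", "ZH",
    ]

    # A tiny, made-up pronunciation lexicon mapping phoneme sequences to
    # words. Real systems use large lexicons plus a language model.
    LEXICON = {
        ("HH", "AH", "L", "OW"): "hello",
        ("W", "ER", "L", "D"): "world",
        ("S", "P", "IY", "CH"): "speech",
    }
    # Sanity check: every lexicon entry uses only the 39 phonemes above.
    assert all(p in PHONEMES for seq in LEXICON for p in seq)

    def decode_words(phoneme_stream):
        """Greedily match the longest known phoneme sequence at each position."""
        words, i = [], 0
        while i < len(phoneme_stream):
            for j in range(len(phoneme_stream), i, -1):
                candidate = tuple(phoneme_stream[i:j])
                if candidate in LEXICON:
                    words.append(LEXICON[candidate])
                    i = j
                    break
            else:
                i += 1  # skip a phoneme with no matching word
        return words

    # Suppose the classifier decoded this phoneme sequence from brain signals:
    decoded = ["HH", "AH", "L", "OW", "W", "ER", "L", "D"]
    print(decode_words(decoded))  # ['hello', 'world']

The design point is the small, fixed alphabet: a classifier that distinguishes 39 phonemes can, in principle, spell out any English word, whereas a whole-word classifier would need a separate category for every word in the vocabulary.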

The team also devised a speech-synthesis algorithm to personalize the system's voice, using a recording of Ann speaking at her wedding so that the output would sound like her voice before the injury.

Meanwhile, Ann's avatar was animated with software developed by Speech Graphics, which simulates the muscle movements of the face. In the future, the team aims to create a wireless version of the system so that Ann would no longer need to be physically connected to the brain-computer interface.

RELATED ARTICLE: Brain Implant Powered by Breathing May Help Improve Lives of People With Neurological Disorders

Check out more news and information on Brain Implant in Science Times.