In a first, brain implant lets man with complete paralysis spell out thoughts: ‘I love my cool son.’

In its final stages, the neurological disease amyotrophic lateral sclerosis (ALS) can bring extreme isolation. People lose control of their muscles, and communication may become impossible. But with the help of an implanted device that reads his brain signals, a man in this “complete” locked-in state could select letters and form sentences, researchers report this week.

“People have really doubted whether this was even feasible,” says Mariska Vansteensel, a brain-computer interface researcher at the University Medical Center Utrecht who was not involved in the study, published in Nature Communications. If the new spelling system proves reliable for all people who are completely locked in—and if it can be made more efficient and affordable—it might allow thousands of people to reconnect to their families and care teams, says Reinhold Scherer, a neural engineer at the University of Essex.

ALS destroys the nerves that control movement, and most patients die within 5 years of diagnosis. When a person with ALS can no longer speak, they can use an eye-tracking camera to select letters on a screen. Later in the disease’s progression, they can answer yes-or-no questions with subtle eye movements. But if a person chooses to prolong their life with a ventilator, they may spend months or years able to hear but not communicate.

In 2016, Vansteensel’s team reported that a woman with ALS could spell out sentences with a brain implant that detected attempts to move her hand. But this person still had minimal control of some eye and mouth muscles. It wasn’t clear whether a brain that has lost all control over the body can signal intended movements consistently enough to allow meaningful communication.

The participant in the new study, a man with ALS who is now 36, started to work with a research team at the University of Tübingen in 2018, when he could still move his eyes. He told the team he wanted an invasive implant to try to maintain communication with his family, including his young son. His wife and sister provided written consent for the surgery.

Consent for this type of study comes with ethical challenges, says Eran Klein, a neurologist and neuroethicist at the University of Washington, Seattle. This man wouldn’t have been able to change his mind or opt out during the period after his last eye-movement communication.

Researchers inserted two square electrode arrays, 3.2 millimeters wide, into a part of the brain that controls movement. When they asked the man to try to move his hands, feet, head, and eyes, the neural signals weren’t consistent enough to answer yes-or-no questions, says Ujwal Chaudhary, a biomedical engineer and neurotechnologist at the German nonprofit ALS Voice.

After nearly 3 months of unsuccessful efforts, the team tried neurofeedback, in which a person attempts to modify their brain signals while getting a real-time measure of whether they are succeeding. An audible tone got higher in pitch as the electrical firing of neurons near the implant sped up, lower as it slowed. Researchers asked the participant to change that pitch using any strategy. On the first day, he could move the tone, and by day 12, he could match it to a target pitch. “It was like music to the ear,” Chaudhary recalls. The researchers tuned the system by searching for the most responsive neurons and determining how each changed with the participant’s efforts.
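To make that feedback loop concrete, here is a minimal Python sketch, assuming a simple linear mapping from an estimated firing rate near the implant to an audible pitch. The rate and pitch ranges, the tolerance, and all names here are illustrative assumptions, not details reported by the team.

```python
# Minimal sketch of auditory neurofeedback: map an estimated neural
# firing rate onto a tone's pitch, then check whether the participant
# is holding the tone close to a target. All ranges are assumptions.

RATE_MIN, RATE_MAX = 5.0, 50.0       # assumed firing-rate range (spikes/s)
PITCH_MIN, PITCH_MAX = 120.0, 480.0  # assumed feedback-tone range (Hz)

def rate_to_pitch(firing_rate: float) -> float:
    """Linearly map an estimated firing rate to the feedback tone's pitch."""
    clamped = max(RATE_MIN, min(RATE_MAX, firing_rate))
    frac = (clamped - RATE_MIN) / (RATE_MAX - RATE_MIN)
    return PITCH_MIN + frac * (PITCH_MAX - PITCH_MIN)

def hit_target(firing_rate: float, target_pitch: float, tol: float = 30.0) -> bool:
    """True if the participant holds the tone close enough to the target."""
    return abs(rate_to_pitch(firing_rate) - target_pitch) <= tol

# Toy trial: speeding up neural firing sweeps the tone toward a high target.
target = 400.0
for rate in (10.0, 20.0, 30.0, 40.0, 45.0):
    print(f"rate={rate:5.1f}/s -> tone={rate_to_pitch(rate):6.1f} Hz, "
          f"on target: {hit_target(rate, target)}")
```

In this toy version, reliably landing the tone inside the tolerance band plays the role of the day-12 milestone described above; the real system additionally required daily tuning around the most responsive neurons.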

By holding the tone high or low, the man could then indicate “yes” and “no” to groups of letters, and then individual letters. After about 3 weeks with the system, he produced an intelligible sentence: a request for caregivers to reposition him. In the year that followed, he made dozens of sentences at a painstaking rate of about one character per minute: “Goulash soup and sweet pea soup.” “I would like to listen to the album by Tool loud.” “I love my cool son.”
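The selection scheme can be thought of as a search over the alphabet driven by one-bit answers. Below is a minimal sketch, assuming a repeated halving of the symbol set; the study's actual letter groupings and auditory presentation differed.

```python
# Sketch of spelling with only yes/no answers: each answer discards
# half of the remaining symbols until a single one is left. The
# 27-symbol alphabet and helper names are illustrative assumptions.

import string

ALPHABET = list(string.ascii_lowercase + " ")  # assumed 27-symbol set

def select_symbol(answer_yes) -> str:
    """Narrow ALPHABET to one symbol using yes/no answers.

    `answer_yes(group)` stands in for the participant: it returns True
    (a held high tone) when the intended symbol is in `group`.
    """
    candidates = ALPHABET[:]
    while len(candidates) > 1:
        mid = len(candidates) // 2
        first_half = candidates[:mid]
        candidates = first_half if answer_yes(first_half) else candidates[mid:]
    return candidates[0]

def spell(sentence: str) -> str:
    """Simulate an error-free participant spelling a sentence."""
    return "".join(
        select_symbol(lambda group: intended in group) for intended in sentence
    )

print(spell("i love my cool son"))  # -> i love my cool son
```

With 27 symbols, at most five yes/no answers pin down each character, which is why even a slow but reliable binary signal can support spelling, albeit at roughly one character per minute in practice.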

He eventually explained to the team that he modulated the tone by trying to move his eyes. But he did not always succeed. Only on 107 of 135 days reported in the study could he match a series of target tones with 80% accuracy, and only on 44 of those 107 could he produce an intelligible sentence.

“We can only speculate” about what happened on the other days, Vansteensel says. The participant may have been asleep or simply not in the mood. Maybe the brain signal was too weak or variable to optimally set the computer’s decoding system, which required daily calibration. Relevant neurons may have drifted in and out of range of the electrodes, notes co-author Jonas Zimmermann, a neuroscientist at the Wyss Center for Bio and Neuroengineering.

Still, the study shows it’s possible to maintain communication with a person as they become locked in by adapting an interface to their abilities, says Melanie Fried-Oken, who studies brain-computer interfaces at Oregon Health & Science University. “It’s so cool.” But hundreds of hours went into designing, testing, and maintaining the personalized system, she notes. “We’re nowhere near getting this into an assistive technology state that could be purchased by a family.”

The demonstration also raises ethical questions, Klein says. Discussing end-of-life care preferences is difficult enough for people who can speak, he notes. “Can you have one of those really complicated conversations with one of these devices that only allows you to say three sentences a day? You certainly don’t want to misinterpret a word here or a word there.” Zimmermann says the research team stipulated the participant’s medical care shouldn’t depend on the interface. “If the speller output were, ‘unplug my ventilator,’ we wouldn’t.” But, he adds, it’s up to family members to interpret a patient’s wishes as they see fit.

Chaudhary’s foundation is seeking funding to give similar implants to several more people with ALS. He estimates the system would cost close to $500,000 over the first 2 years. Zimmermann and colleagues, meanwhile, are developing a signal processing device that attaches to the head via magnets rather than anchoring through the skin, which carries a risk of infection.

So far, devices that read signals from outside the skull haven’t allowed spelling. In 2017, a team reported that it could classify yes-or-no answers from the brain of a completely locked-in participant with 70% accuracy, using a noninvasive technology called functional near-infrared spectroscopy (fNIRS). Two co-authors on the new study, Chaudhary and University of Tübingen neuroscientist Niels Birbaumer, were part of that team. But other researchers have voiced concerns about the study’s statistical analysis. Two investigations found misconduct in 2019, and two papers were retracted. The authors sued to challenge the misconduct findings, Chaudhary says. Scherer, who was skeptical of the fNIRS study, says the results with the invasive device are “definitely sounder.”

Wyss Center researchers continue to work with this study participant, but his ability to spell has decreased, and he now mostly answers yes-or-no questions, Zimmermann says. Scar tissue around the implant is partly to blame because it obscures neural signals, he says. Cognitive factors could play a role, too: The participant’s brain may be losing the ability to control the device after years of being unable to affect its environment. But the research team has committed to maintaining the device as long as he continues to use it, Zimmermann says. “There’s this huge responsibility. We’re quite aware of that.”