A team of engineers, neuroscientists, and neurosurgeons at UC Davis and UC Davis Health has demonstrated that a brain-computer interface (BCI) used for translating brain signals into speech can also enable its user to control a computer cursor. The researchers describe this capability as a first for the field.
The research, published in the Journal of Neural Engineering, was supported by the National Science Foundation and the National Institutes of Health. The findings suggest that BCIs could become more feature-rich, offering new ways to restore function and autonomy to people with paralysis.
“Future steps in multimodal BCIs could include gesture decoding for all sorts of different things, enriching the types of interactions someone with paralysis can have with their environment beyond speech,” said Tyler Singer-Clark, a biomedical engineering Ph.D. student at UC Davis and first author on the paper.
Singer-Clark works in the UC Davis Neuroprosthetics Lab, co-directed by neuroscientist Sergey Stavisky and neurosurgeon David Brandman. The lab had previously developed what it described as the most accurate BCI for speech ever recorded, which formed the basis for this new project.
The team’s BCI is implanted in the speech motor cortex, where it records electrical activity as the user attempts to speak and decodes that activity into words displayed on a computer. During their work, the researchers noticed that the brain area they were analyzing for speech also showed potential for supporting cursor control, a motor function typically attributed to another part of the brain.
Singer-Clark developed software to enable cursor control through the speech BCI, drawing on previous research into BCIs for motor functions and using existing code from their speech BCI system. “We didn’t have to reinvent the pre-processing of the neural data,” he said. “For cursor control, it’s actually the same pre-processing steps the speech BCI uses to get the neural features that are going to be useful for decoding the intention of the participant.”
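The paper does not publish the lab’s code, but the idea of reusing one pre-processing front end for two decoders can be sketched as follows. This is a minimal illustration, assuming binned neural features (e.g., threshold crossings per electrode) that are z-scored once and then fed to separate linear read-outs; the function names and shapes are hypothetical, not the lab’s actual implementation.

```python
import numpy as np

def preprocess(raw_counts, mean, std):
    """Shared pre-processing: z-score binned neural features
    (e.g., spike counts per electrode per 20 ms bin).
    Both the speech decoder and the cursor decoder would
    consume the same normalized feature vector."""
    return (raw_counts - mean) / std

def decode_cursor_velocity(features, W, b):
    """Hypothetical linear cursor read-out: maps a feature
    vector of length F to a 2-D velocity command (vx, vy)
    via weights W (F x 2) and bias b (2,)."""
    return features @ W + b
```

The point of the sketch is the division of labor: the normalization step is decoder-agnostic, so adding cursor control only requires training a new read-out on top of features the speech system already computes.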
To individualize the decoding software, Singer-Clark worked with a participant from the original speech BCI research, who is unable to move due to amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig’s disease. The participant observed and imagined cursor movements while Singer-Clark mapped the corresponding neural activity.
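This kind of individualization step, pairing observed cursor motion with recorded neural features, is commonly implemented as a regression fit. The sketch below shows one standard approach (ridge regression from features to velocities); it is an illustration of the general technique under assumed data shapes, not the specific method reported in the paper.

```python
import numpy as np

def fit_velocity_decoder(X, V, lam=1.0):
    """Fit a ridge-regression map from neural features X (T x F)
    to cursor velocities V (T x 2) gathered while the participant
    watches or imagines cursor movements. Returns weights W (F x 2)
    minimizing ||X W - V||^2 + lam * ||W||^2."""
    F = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(F), X.T @ V)
```

Once fitted, W turns each new feature vector into a velocity command, which is why the participant could begin steering the cursor almost immediately after the software was switched on.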
Once the software was activated, the participant adapted to controlling the cursor with his thoughts in less than 40 seconds. He was able to move the cursor, click on applications, and open links independently.
Singer-Clark noted that the BCI does not translate abstract thoughts into movement but instead relies on a process described by the participant as “intuition.” Singer-Clark explained, “I’ll say, ‘What motor imagery are you using?’ And he says, ‘Intuition.’”
David Brandman, co-director of the UC Davis Neuroprosthetics Lab, said, “Singer-Clark’s work is incredibly important for the field. His work has not only empowered our BrainGate2 participant to use a computer cursor with his thoughts but has also led the way for multiple companies in this space to design their clinical trials.”
Singer-Clark emphasized both the scientific and personal significance of the project. He noted that it supports a growing view that different body parts and their movements are represented across multiple areas of the motor cortex rather than being isolated to specific regions.
He also highlighted the impact on quality of life: “There’s a man with ALS who can control his computer independently without someone else helping him for hours and hours every day. It’s like this great event, and we might not have tried if we didn’t have that prior research encouraging us to do that.”
The researchers at UC Davis aim to continue developing BCIs to help people with paralysis achieve greater autonomy.