Schematic of the cursor BCI. Credit: Tyler Singer-Clark et al.
University of
California, Davis researchers have developed a brain-computer interface (BCI)
that enables computer cursor control and clicking, using neural signals from
the speech motor cortex. One participant with amyotrophic lateral sclerosis
(ALS) used the interface for daily life activities, including independent
control of a personal desktop computer and text entry.
Neurological diseases such as stroke or ALS can
interrupt the pathway from the brain to the muscles, causing a loss of movement
and communication. ALS progressively destroys upper and lower motor neuron
pathways, leaving cognition intact but causing paralysis in all four limbs and
significant speech impairment.
Brain-computer interfaces are implanted intracortical
devices that bypass this disruption by reading neural signals directly from the
brain and producing output on the user's behalf. Many BCIs have relied on neural activity from the dorsal motor cortex, a brain region associated with hand
and arm movements. When these signals are decoded, users can move a cursor by
attempting or imagining limb motion.
In contrast, speech BCIs rely on the ventral
precentral gyrus, where neural signals are linked to facial movements and
speech articulation. Decoding neural signals from this region enables fast, speech-based communication but has
not been shown to support general computer navigation or motion control.
Implanting arrays in both dorsal and ventral areas would
be ideal, yet doing so is generally considered surgically impractical. As a result, users
and clinicians must choose between cursor control and speech decoding.
In the
study, "Speech motor cortex enables BCI cursor control and click," published in the Journal of Neural Engineering, researchers
conducted a single-participant case study to test whether neural activity from
the speech motor cortex could support both cursor control and speech decoding
with a single implant site.
One
participant with ALS, a 45-year-old man with paralysis in all four limbs and
difficulty speaking clearly, took part in the research. All sessions were run
at the participant's home.
Four
64-electrode arrays were surgically implanted in the ventral precentral gyrus
of the participant. Electrode targeting was guided by preoperative MRI and
cortical alignment with the Human Connectome Project.
Neural
signals were acquired at a sampling rate of 30 kHz and bandpass filtered
between 250 and 5,000 Hz. Threshold crossings and spike band power were
calculated every millisecond from each electrode. These features were then
grouped into 10-millisecond bins, producing a stream of 512-dimensional feature
vectors that served as input to the decoding systems.
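As a rough illustration of that pipeline, the sketch below (Python with NumPy and SciPy) filters raw voltage traces from the 256 electrodes, detects threshold crossings, and accumulates crossing counts and spike band power into 10-millisecond bins. The -4.5 × RMS spike threshold and the direct 10 ms binning (skipping the 1-millisecond intermediate step) are illustrative assumptions, not the study's exact implementation.

import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 30_000           # sampling rate, Hz
N_ELECTRODES = 256    # four 64-electrode arrays
BIN_MS = 10           # bin width reported in the study

def bandpass_250_5000(raw):
    """Band-pass filter raw traces (electrodes x samples) to the 250-5,000 Hz spike band."""
    sos = butter(4, [250, 5000], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, raw, axis=1)

def bin_features(raw, thresh_mult=-4.5):
    """Return (n_bins, 512) feature vectors: threshold crossings and spike band power
    per electrode in 10 ms bins. The -4.5 x RMS threshold is an assumption; the
    article does not state the exact rule used."""
    filtered = bandpass_250_5000(raw)
    thresholds = thresh_mult * np.sqrt(np.mean(filtered**2, axis=1, keepdims=True))
    samples_per_bin = FS * BIN_MS // 1000                     # 300 samples per 10 ms
    n_bins = filtered.shape[1] // samples_per_bin
    binned = filtered[:, : n_bins * samples_per_bin].reshape(
        N_ELECTRODES, n_bins, samples_per_bin)
    below = binned < thresholds[:, :, None]
    # Threshold crossings: count downward transitions (above -> below threshold) per bin.
    crossings = (np.diff(below.astype(np.int8), axis=2) == 1).sum(axis=2)
    # Spike band power: mean squared amplitude of the filtered signal in each bin.
    power = (binned**2).mean(axis=2)
    return np.concatenate([crossings, power], axis=0).T       # (n_bins, 512)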
Three task
paradigms were used to evaluate system
performance: Radial8 Calibration, Grid Evaluation, and Simultaneous
Speech and Cursor. A linear velocity decoder controlled cursor movement, while
a separate linear classifier decoded click events.
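In broad strokes, such a decoder maps each 10-millisecond, 512-dimensional feature vector to a two-dimensional cursor velocity and a click probability. The minimal sketch below assumes simple linear weights and exponential velocity smoothing; the weight shapes, smoothing constant, and click threshold are illustrative, not the authors' implementation.

import numpy as np

class CursorDecoder:
    """Sketch of the two decoders described in the article: a linear map from neural
    features to 2D cursor velocity, plus a logistic-regression click classifier."""

    def __init__(self, n_features=512, alpha=0.9):
        self.W_vel = np.zeros((2, n_features))   # maps features -> (vx, vy)
        self.b_vel = np.zeros(2)
        self.w_click = np.zeros(n_features)      # logistic weights for click vs. no click
        self.b_click = 0.0
        self.alpha = alpha                       # exponential smoothing of velocity
        self._v = np.zeros(2)

    def step(self, features, click_threshold=0.5):
        """Consume one 10 ms feature vector; return (smoothed velocity, click flag)."""
        v_raw = self.W_vel @ features + self.b_vel
        self._v = self.alpha * self._v + (1 - self.alpha) * v_raw
        p_click = 1.0 / (1.0 + np.exp(-(self.w_click @ features + self.b_click)))
        return self._v, p_click > click_threshold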
Decoder
parameters were continuously recalibrated using linear regression for velocity
and logistic regression for click classification, with weights updated every few seconds during active control.
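Continuing the sketch above, a recalibration step of this kind might refit the velocity weights by regressing recent feature vectors against a presumed intended velocity (the unit vector from cursor to target) and refit the click weights by logistic regression. The labeling heuristic and ridge regularization below are assumptions; the article specifies only the regression types and the update cadence.

import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression

def recalibrate(decoder, feats, cursor_pos, target_pos, clicked):
    """Refit decoder weights from a recent window of closed-loop data.

    feats      : (n_bins, 512) neural feature vectors
    cursor_pos : (n_bins, 2)   cursor position at each bin
    target_pos : (n_bins, 2)   position of the cued target
    clicked    : (n_bins,)     1 where a click was cued, else 0
    """
    # Assume the participant intends to move straight toward the cued target.
    to_target = target_pos - cursor_pos
    intended_v = to_target / (np.linalg.norm(to_target, axis=1, keepdims=True) + 1e-9)

    vel_model = Ridge(alpha=1.0).fit(feats, intended_v)
    decoder.W_vel, decoder.b_vel = vel_model.coef_, vel_model.intercept_

    # Logistic regression needs examples of both classes in the window.
    if clicked.any() and (~clicked.astype(bool)).any():
        clf = LogisticRegression(max_iter=1000).fit(feats, clicked)
        decoder.w_click, decoder.b_click = clf.coef_[0], clf.intercept_[0]
    return decoder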
Calibration was quick: the participant acquired his first target under neural control within 40 seconds of initiating the system.
During
later sessions with optimized settings, the participant used the system to
control the cursor with high efficiency, averaging 2.90 bits per second.
Earlier sessions showed lower performance, averaging 1.67 bits per second. The
highest rate recorded in any single session was 3.16 bits per second. One bit
per second corresponds to the ability to make several accurate choices per
minute, with higher values indicating faster and more precise control.
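To put those throughput numbers in perspective, a back-of-envelope calculation shows how many selections per minute a given bit rate supports if each selection among n equally likely targets conveys log2(n) bits. The grid sizes below are hypothetical, not the study's task layout.

import math

def selections_per_minute(bits_per_second, n_targets):
    # Each selection among n equally likely targets conveys log2(n) bits.
    return 60 * bits_per_second / math.log2(n_targets)

for rate in (1.67, 2.90, 3.16):
    print(f"{rate} bits/s: {selections_per_minute(rate, 8):.0f}/min on 8 targets, "
          f"{selections_per_minute(rate, 64):.0f}/min on 64 targets")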
Across
1,263 total trials, 1,175 targets were correctly selected, corresponding to 93%
accuracy. Eighty-eight incorrect selections occurred, and no trials ended due
to timeout. Six clicks were registered on temporarily disabled targets, and 23
clicks occurred outside of any target boundary.
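The reported accuracy follows directly from those trial counts:

correct, total = 1175, 1263
print(f"{correct / total:.1%} of trials ended in a correct selection")   # ~93.0%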
Click
classification performance exceeded chance across all four electrode arrays.
One well-placed array contributed the most to cursor decoding and closely
matched the performance of the full-array decoder.
In
sessions involving simultaneous speech and cursor control, median target
acquisition time increased to 4.51 seconds. Conditions without speech ranged
from 3.37 to 3.51 seconds, indicating that speech production interfered with the
participant's cursor control, although it did not delay actions performed
sequentially. Improvements in decoder design could mitigate this interference
and enhance future usability.
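For scale, comparing the reported medians gives the relative slowdown while speaking; this is only a back-of-envelope comparison of the published numbers.

with_speech = 4.51                     # median acquisition time while speaking, seconds
without_speech = (3.37, 3.51)          # range of medians without speech
print([f"{with_speech / t - 1:.0%}" for t in without_speech])   # roughly 28-34% longer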
A single
implant site supported both communication and computing functions in an
independent home setting, providing a proof of concept for the feasibility of
multi-modal BCI systems.
For patients who are cognitively intact but unable to move their limbs or speak, a neural interface that provides both computer-cursor control and speech decoding can restore crucial channels of communication and independence and substantially improve quality of life.
by Justin Jackson, Medical Xpress
Source: Brain interface allows speech decoding and computer control in ALS patient
