February 13, 2024
How the brain produces speech
At a Glance
- Researchers identified how neurons in the human brain encode various elements of speech.
- The findings might be used to help develop treatments for speech and language disorders.
Speech and language depend on our ability to produce a wide variety of sounds in a specific order. How neurons in the human brain work together to plan and produce speech remains poorly understood.
To begin to address this question, an NIH-funded team of researchers, led by Drs. Ziv Williams and Sydney Cash at Massachusetts General Hospital, recorded neuron activity during natural speech in five native English speakers. The experiments were done while participants were having electrodes implanted for deep brain stimulation. The researchers recorded neurons in a prefrontal brain region known to be involved in word planning and sentence construction. They used high-density arrays of tiny electrodes that could record signals from many individual neurons at once. Their results appeared in Nature on January 31, 2024.
The scientists found that the activity of almost half the neurons depended on the particular sounds, or phonemes, in the word about to be said. Some neurons, for instance, became more active ahead of speaking the sounds for "p" or "b", which involve stopping airflow at the lips. Others did so ahead of speaking "k" or "g" sounds, which are formed by the tongue against the soft palate. Moreover, certain neurons seemed to reflect the specific combination of phonemes in the upcoming word. The team found that they could predict the phonemes that made up the word about to be spoken based on the activity of these neurons.
For about a quarter of the neurons, activity further reflected specific syllables, or ordered sequences of phonemes that may be all or part of a word. The team could predict the syllables in the upcoming word using the activity from these neurons. These neurons did not respond to the individual phonemes in the syllable by themselves, nor to the same phonemes out of order or split across different syllables.
A minority of neurons responded to the presence of prefixes or suffixes. These are examples of morphemes, or groups of sounds that carry specific meanings. The presence of morphemes in the upcoming word could be predicted from these neurons' activity.
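The article does not describe how these predictions were made. As a rough illustration only, the sketch below shows what "predicting a feature of the upcoming word from neuron activity" can look like in practice: a standard classifier trained on per-neuron spike counts from a pre-speech window. The synthetic data, the two phoneme classes, and the choice of logistic regression are assumptions for illustration, not the study's actual pipeline.

```python
# Illustrative sketch: decoding an upcoming phoneme class from single-neuron
# spike counts. Synthetic data and a generic classifier are assumptions for
# illustration; this is not the published study's method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_neurons = 300, 40          # hypothetical recording: 40 neurons, 300 planned words
phoneme_classes = ["labial", "velar"]  # e.g., "p"/"b" vs. "k"/"g" place of articulation

# Simulate labels and spike counts in a pre-speech window, with a subset of
# neurons weakly tuned to the upcoming phoneme class.
y = rng.integers(0, 2, size=n_trials)
tuning = np.zeros(n_neurons)
tuning[:15] = rng.uniform(0.5, 2.0, size=15)   # some neurons carry phoneme information
X = rng.poisson(lam=3.0 + np.outer(y, tuning), size=(n_trials, n_neurons)).astype(float)

# Cross-validated decoding accuracy: above-chance accuracy is the kind of
# evidence behind statements like "the upcoming phonemes could be predicted
# from the activity of these neurons."
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```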
The team also found that different sets of neurons activated in a specific order. The morpheme neurons activated first, around 400 milliseconds (ms) before the utterance. Phoneme neurons activated next, around 200 ms before the utterance. Syllable neurons activated last, around 70 ms before the utterance. Most neurons responded to the same feature (phoneme, syllable, or morpheme) both before and during the utterance, but their activity patterns during the utterance differed from those before it. Finally, the team found that neurons that responded to speech sounds during speaking differed from those that responded to the same sounds during listening.
In an accompanying paper in the same issue of Nature, another research team used the same technique to examine how neurons in another area of the brain respond while listening to speech. They similarly found that single neurons encoded different speech sound cues.
The findings suggest how various elements of speech are encoded in the brain, and how the brain combines these elements to form spoken words. This information might aid in developing brain-machine interfaces that can synthesize speech. Such devices could help a range of patients with conditions that impair speech.
"Disruptions in the speech and language networks are observed in a wide variety of neurological disorders, including stroke, traumatic brain injury, tumors, neurodegenerative disorders, neurodevelopmental disorders, and more," says co-author Dr. Arjun Khanna. "Our hope is that a better understanding of the basic neural circuitry that enables speech and language will pave the way for the development of treatments for these disorders."
by Brian Doctrow, Ph.D.
Related Links
- Scientists Translate Brain Activity into Music
- Brain Decoder Turns a Person鈥檚 Brain Activity into Words
- Understanding How the Brain Tracks Social Status and Competition
- Study Reveals Brain Networks Critical for Conversation
- Device Allows Paralyzed Man to Communicate with Words
- How the Human Brain Tracks Location
- Memories Involve Replay of Neural Firing Patterns
- Scientists Create Speech Using Brain Signals
- How The Brain Keeps Track of Time
References: Khanna AR, Muñoz W, Kim YJ, Kfir Y, Paulk AC, Jamali M, Cai J, Mustroph ML, Caprara I, Hardstone R, Mejdell M, Meszéna D, Zuckerman A, Schweitzer J, Cash S, Williams ZM. Nature. 2024 Jan 31. doi: 10.1038/s41586-023-06982-w. Online ahead of print. PMID: 38297120.
Funding: NIH's National Institute of Neurological Disorders and Stroke (NINDS), National Institute of Mental Health (NIMH), and National Institute on Deafness and Other Communication Disorders (NIDCD); Canadian Institutes of Health Research; Foundations of Human Behavior Initiative; Tiny Blue Dot Foundation; American Association of University Women.