A new brain-scan study of college-age speakers of English, Spanish, Hebrew and Chinese shows that the same speech regions of the brain are activated when they read, regardless of the language.
We’re not born with the brain circuitry we need for reading. We have to be taught.
Specifically, beginning readers learn to sound out words by matching what they see to what they say.
As reading becomes more effortless, the brain forms efficient circuitry linking the visual system to the speech centers in the left hemisphere.
That process is well established for English and other languages that use alphabets, but scientists disagree about whether reading works the same way for Chinese, which offers far fewer clues about how to pronounce the words depicted in its written form.
That’s why I was interested in a new study showing that reading in English and Chinese — and Spanish and Hebrew — activates the same speech regions of the brain.
“The principles by which reading is built into the brain are far more similar than dissimilar across languages and that has clear implications for how you teach reading and how you remediate disorders of reading,” said Kenneth Pugh, a co-author on the paper.
I’m fascinated by the idea that reading — this cultural invention that sprang up in wonderfully varied written forms around the world just a few thousand years ago — gets “built into the brain” in predictable ways governed by the much older speech systems all humans share.
Pugh works with the study’s lead author, Jay Rueckl, at Haskins Laboratories, a nonprofit research institute affiliated with Yale University and the University of Connecticut that focuses on the biology of language.
Their study, published earlier this week in Proceedings of the National Academy of Sciences, is believed to be the first to examine four very different languages at the same time with the same procedures.
Researchers working in labs in the United States, Israel, Taiwan and Spain asked 21 college-age speakers of each language to read or listen to a series of simple words such as “window” and “woman” and say whether the word referred to something living or not (to ensure they knew the meaning of the word).
Using a scanning technique called functional magnetic resonance imaging (fMRI), researchers measured differences in blood flow to infer which regions of the brain were working harder during the tasks and needed to be replenished.
Then they compared the blood-flow patterns across the four languages, written and spoken, and found they overlapped the most in regions of the left hemisphere associated with understanding and producing speech.
The study did note some differences in how the brain responded to different languages.
But those differences were far outweighed by the overlaps, suggesting that all written languages are funneled into a universal circuitry for reading that is shaped by the speech centers of the brain.
“These differences – the linguistic properties and even the visual properties – have some importance, but we don’t want to miss the forest for the trees,” said Stephen Frost, another co-author at Haskins. “There’s a lot they have in common.”