Oral language is for hearing. Written language is for reading. They’re not the same thing. (PART 2)
This is at the heart of our most important differences.
The view from “under the hood”
Finally, consider what’s least obvious and hardest to study, but just as relevant: literal confirmation of what Tierney and Pearson suggest in their monograph:
[L]earning to read may not be specifically wired in the same way we have come to accept the specificity of the wiring for learning one’s oral language.
Learning to read is certainly not wired in the brain in the same way as learning one’s oral language is. The evidence comes from neuroscience.
Full disclosure: I am not a neuroscientist, nor do I claim any such expertise. I rely on the research and publications (those I can actually understand) of researchers who over the past several decades have illuminated what goes on “under the hood,” so to speak. For those interested, I strongly recommend this article by Kenneth Pugh on the “Neuroscience of Language and Literacy,” particularly as it applies to bilingual and multilingual learners, and Reading in the Brain by Stanislas Dehaene.
Another article I recently discovered explores what happens in the brain that enables and reflects literacy acquisition and development. Note, however, that the author falls into the familiar and unfortunate trap of calling oral language natural and written language not only unnatural but artificial. Nonetheless, the article has useful and interesting information, some of it quite detailed and technical. The author also covers some of the information I’ve discussed here.
Neuroscience has become one of the contemporary fronts in the reading wars because of claims by some that it supports particular approaches to teaching reading. It seems clear to me, as neuroscientists themselves acknowledge (see, e.g., Pugh’s article, above; Seidenberg and others; and Dehaene’s book), that neuroscientific research itself cannot prescribe reading curriculum and instruction. However, discoveries in neuroscience have deeply informed our understanding of the neuronal causes and consequences of individuals’ acquisition of reading. For my purposes here, I will focus on one aspect: the “reading circuitry” in the brain that must be constructed, enabling and reflecting reading acquisition.
The brain circuitry existing at birth enables the start of oral language acquisition and its development. In contrast, the brain circuitry necessary for reading, or literacy more broadly, must be constructed through experience with print, including instruction in how print works and its relationship to oral language. That circuitry connects the portions of the brain that process what we see to those that process what we hear and then to spoken (oral) language. A diagram from one of Dehaene’s presentations helps us understand.
Note the green areas; these are involved when an individual—new, young, or old—is processing or using oral language.
One of the red areas is where the language’s sounds, known as “phonemes,” are processed (“phonemic representation”). Note that it is located within one of the regions involved in spoken language. This is because spoken language involves distinguishing among the different sounds in the language. If we couldn’t do that, we wouldn’t be able to tell apart the words we hear.
Then note that another red area, way to the right, in the back of the brain, is the visual cortex. This is where visual information taken in by our eyes goes. It is not within the language regions because spoken language does not require use of the visual system.
But reading most certainly requires the visual system, hence the development of the “visual word form area” (VWFA), sometimes called “the brain’s letterbox.” This is the area of the brain, just to the left of the visual cortex in the diagram, that is dedicated to the visual processing of letters and words.
Finally, the connection is made to the oral language system (see the arrows), and the “reading (or literacy) circuit” is complete, connecting the sounds of the language (“phonemes”) to their visual representations (letters, or “graphemes”) to how the language makes and communicates meaning (“semantics”).
It’s like a three-legged stool: sound-symbol-meaning. This is the essential core of reading, according to my non-neuroscientific understanding.
Here’s how Dehaene describes it:
A vast brain circuit is transformed when we learn to read. All of the regions shown in red increase their activation and specialization during the acquisition of literacy. Furthermore, a massive bundle of connections link … visual areas … with … regions involved in phonological coding…. As a result, we gain the ability to access the spoken language system through vision. (Dehaene, 2013; emphasis added)
Think about the implications of that last sentence: “we gain the ability to access the spoken language system through vision.” Here is a succinct definition of the fundamental difference between spoken and written language. Written language must be perceived through our eyes (for the visually impaired, it must be perceived through their fingers; for the hearing impaired, alternatives have been developed, depending on communication mode). Once perceived, it must be linked to our oral language in order to make sense and to be worth all the effort that goes into learning to read.
There is no single, guaranteed method or approach that will help all individuals achieve this circuitry. However, the preponderance of the classroom research, independently and in coordination with neuroscientific research, lends great credence to the idea that this circuitry is more likely—more likely, not guaranteed—to be constructed, and to be robust, when foundational literacy skills (what we call “phonics” and “decoding”) are taught systematically and explicitly enough to help students make clear connections between the sounds in words and how those sounds are represented in print. (I won’t even attempt to review the classroom research that supports this statement. Extensive documentation is available at the Evidence Advocacy Center, or EAC.)
In short, the neuroscience converges with the instructional research and provides an underlying rationale that helps explain instructional research findings, but it does not endorse any one method or program.
AND AT THE SAME TIME, as this instruction is proceeding (and students will vary enormously in how much of it they need), language development, comprehension skills, knowledge, and other skills and attributes necessary for becoming successful readers must be attended to. These must not wait until students have “the decoding part down,” as educators and others sometimes mistakenly say. Students must certainly get “the decoding part down.” But so too must they develop the other portion of the reading circuit—language and its many attributes and tributaries—without which literacy has little value.
Accomplishing the wiring of the reading circuitry is a fairly straightforward task for some children, a considerable challenge for many if not most, and an extreme one for some. Whether we regard reading as natural or unnatural is ultimately unimportant. What’s important is that we provide all students with what they need to become proficient, confident, motivated, and engaged readers.