University at Buffalo study: Phonics is a useful tool in learning

30 Jan

PBS Parents has a very good primer on phonics:

What is phonics?
Phonics is simply the system of relationships between letters and sounds in a language. When your kindergartener learns that the letter B has the sound of /b/ and your second-grader learns that “tion” sounds like /shun/, they are learning phonics.

Why is phonics important?
Learning phonics will help your children learn to read and spell. Written language can be compared to a code, so knowing the sounds of letters and letter combinations will help your child decode words as he reads. Knowing phonics will also help your child know which letters to use as he writes words.

When is phonics usually taught?
Your child will probably learn phonics in kindergarten through second grade. In kindergarten, children usually learn the sounds of the consonant letters (all letters except the vowels a, e, i, o, and u). First- and second-graders typically learn all the sounds of letters, letter combinations, and word parts (such as “ing” and “ed”). They practice reading and spelling words containing those letters and patterns. Second-graders typically review and practice the phonics skills they have learned to make spelling and reading smooth and automatic….
http://www.pbs.org/parents/education/reading-language/reading-tips/phonics-basics/

See Phonics Instruction http://www.readingrockets.org/article/phonics-instruction and Understanding Phonics http://www.scholastic.com/teachers/article/understand-phonics

Science Daily reported in Concentrating on word sounds helps reading instruction and intervention:

A neuroimaging study by a University at Buffalo psychologist suggests that phonics, a method of learning to read using knowledge of word sounds, shouldn’t be overlooked in favor of a whole-language technique that focuses on visually memorizing word patterns, a finding that could help improve treatment and diagnosis of common reading disorders such as dyslexia.

“Phonological information is critical for helping identify words as they’re being read,” says Chris McNorgan, PhD, assistant professor of psychology, whose study, “Skill dependent audiovisual integration in the fusiform induces repetition suppression,” used MRI scans to observe how parts of the brain responded to audio and visual word cues. The results are published in the most recent edition of Brain & Language.

A better reader is someone whose visual processing is more sensitive to audio information, according to the study’s results.

“There are applications here not just for reading disorders, but also for how children are taught to read in the classroom,” he says.

Barring injury, McNorgan says, all parts of the brain are working at all times, contrary to the myth that it functions at only a fraction of its capacity. However, different parts of the brain are specialized for different types of activities that trigger some regions to work harder than others.

With reading, the Visual Word Form Area (VWFA) is excited when it encounters familiar letter combinations. But most activities require communication between different brain regions and coordination with sensory systems, like an outfielder watching a baseball while the brain programs the motor system to catch it….
Concentrating on word sounds helps reading instruction and intervention
http://www.sciencedaily.com/releases/2015/01/150128141425.htm

Citation:

Concentrating on word sounds helps reading instruction and intervention
Date: January 28, 2015

Source: University at Buffalo
Summary:
A neuroimaging study by a University at Buffalo psychologist suggests that phonics shouldn’t be overlooked in favor of a whole-language technique, a finding that could help improve treatment and diagnosis of common reading disorders.
Brain Lang. 2015 Feb;141:110-23. doi: 10.1016/j.bandl.2014.12.002. Epub 2015 Jan 9.
Skill dependent audiovisual integration in the fusiform induces repetition suppression.
McNorgan C1, Booth JR2.
Author information
Abstract
Learning to read entails mapping existing phonological representations to novel orthographic representations and is thus an ideal context for investigating experience-driven audiovisual integration. Because two dominant brain-based theories of reading development hinge on the sensitivity of the visual-object processing stream to phonological information, we were interested in how reading skill relates to audiovisual integration in this area. Thirty-two children between 8 and 13 years of age spanning a range of reading skill participated in a functional magnetic resonance imaging experiment. Participants completed a rhyme judgment task to word pairs presented unimodally (auditory- or visual-only) and cross-modally (auditory followed by visual). Skill-dependent sub-additive audiovisual modulation was found in left fusiform gyrus, extending into the putative visual word form area, and was correlated with behavioral orthographic priming. These results suggest learning to read promotes facilitatory audiovisual integration in the ventral visual-object processing stream and may optimize this region for orthographic processing.
Copyright © 2014 Elsevier Inc. All rights reserved.

Here is the press release from the University at Buffalo:

Press Release
Concentrating on word sounds helps reading instruction and intervention

UB researcher’s findings point to the value of word sounds over visual processing during reading instruction or when diagnosing and treating reading disorders
By Bert Gambini
Release Date: January 26, 2015

BUFFALO, N.Y. – A neuroimaging study by a University at Buffalo psychologist suggests that phonics, a method of learning to read using knowledge of word sounds, shouldn’t be overlooked in favor of a whole-language technique that focuses on visually memorizing word patterns, a finding that could help improve treatment and diagnosis of common reading disorders such as dyslexia.

“Phonological information is critical for helping identify words as they’re being read,” says Chris McNorgan, PhD, assistant professor of psychology, whose study, “Skill dependent audiovisual integration in the fusiform induces repetition suppression,” used MRI scans to observe how parts of the brain responded to audio and visual word cues. The results are published in the most recent edition of Brain & Language.

A better reader is someone whose visual processing is more sensitive to audio information, according to the study’s results.

“There are applications here not just for reading disorders, but also for how children are taught to read in the classroom,” he says.

Barring injury, McNorgan says, all parts of the brain are working at all times, contrary to the myth that it functions at only a fraction of its capacity. However, different parts of the brain are specialized for different types of activities that trigger some regions to work harder than others.

With reading, the Visual Word Form Area (VWFA) is excited when it encounters familiar letter combinations. But most activities require communication between different brain regions and coordination with sensory systems, like an outfielder watching a baseball while the brain programs the motor system to catch it.

How this communication happens while reading – which requires visual and auditory knowledge – and to what extent is less clear. So McNorgan’s study looked for what’s known as top-down influence of auditory knowledge in the VWFA.

Think of a bottom-up process as a flow of information that begins with the visual system feeding neurons that detect basic features in words such as line orientation that eventually leads to word recognition. A top-down process implies that some other information enters that flow of visual recognition – information like the knowledge of the word sounds.

“This auditory knowledge can be used to help rule out some letter combinations. For example, many words end in ISK or ASK. For a few milliseconds there may be some ambiguity among the neurons trying to figure out whether that last letter is a K or an X,” said McNorgan. “Since you don’t have any words ending in ISX in your verbal repertoire, this helps rule out the possibility that you read the word DISX and instead read the word as DISK.”
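McNorgan’s DISK/DISX example amounts to a simple lexical constraint: a candidate spelling whose ending never occurs in the reader’s vocabulary can be ruled out before visual processing finishes. Here is a minimal toy sketch of that idea; the mini-lexicon and function name are purely illustrative, not part of the study:

```python
# Toy illustration of top-down phonological filtering: an ambiguous
# final letter is resolved by checking which candidate ending
# actually occurs in a (tiny, made-up) lexicon.

LEXICON = {"disk", "risk", "ask", "task", "mask"}
VALID_ENDINGS = {word[-3:] for word in LEXICON}  # {"isk", "ask"}

def resolve(prefix: str, candidates: list[str]) -> list[str]:
    """Keep only the candidate final letters that form an ending seen in the lexicon."""
    return [c for c in candidates if (prefix + c)[-3:] in VALID_ENDINGS]

# The visual system momentarily can't tell K from X after "dis";
# the lexicon rules out "disx" because no word ends in "isx":
print(resolve("dis", ["k", "x"]))  # → ['k']
```

The real brain process is parallel and probabilistic rather than a lookup table, but the filtering logic is the same: knowledge of word sounds narrows the field of plausible letter combinations.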

To find evidence of this top-down input, researchers presented word pairs to subjects between the ages of 8 and 13 with a wide range of reading abilities. The subjects had to determine whether the words rhymed while an MRI scanner monitored their brain activity.

The experiment used three conditions when presenting the word pairs: subjects first read the word pairs (visual-only); then heard the word pairs (auditory-only); and lastly, a combination of sight and sound, hearing the first word but reading the second (audio-visual). The MRI scanner determined which parts of the brain were most active during each condition by displaying a three-dimensional representation of the brain, made up of what look like a series of cubes, called voxels.

“Think of the voxels as LEGOS assembled together to make a 3D model of the brain. Each cube has a measurement of activation strength that allows us to understand what’s happening in each area under all three of the conditions,” said McNorgan.

The resulting images, he said, comprise something like a movie reel, with approximately one frame passing every two seconds. Signal strength is then measured in each voxel under all the conditions across all the snapshots in time.

“Looking at the voxels in a particular brain area, if the signal strengths associated with two different conditions differ, then you have some evidence that the brain area processes information about the two conditions differently,” says McNorgan.

To make sense of the results across all the conditions, researchers take the sum of the auditory-only and visual-only signals and compare that to the strength of the audio-visual condition. This helps them distinguish between multisensory neurons, which become excited by audio-visual information, and collections of heterogeneous unisensory neurons, a mix of visual-only and auditory-only neurons that each respond to one modality or the other.

“If the audio-visual response is greater than the sum of the auditory-only and the visual-only, this suggests that getting both types of inputs causes these neurons to fire for longer periods of time. This is a superadditive effect,” says McNorgan. “An audio-visual response less than that sum suggests that getting both types of inputs causes these neurons to fire for less time. This is a subadditive effect.”

This subadditivity is associated with higher reading scores and faster responses to similarly spelled words, the reading equivalent of a head start in a race.
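The comparison McNorgan describes reduces to simple arithmetic per voxel: the audio-visual response is classified against the sum of the two unimodal responses. This hedged sketch shows the bare logic; the variable names are illustrative, and the actual analysis involves statistical modeling of the fMRI signal rather than a direct three-way comparison:

```python
def classify_voxel(auditory: float, visual: float, audiovisual: float) -> str:
    """Compare the audio-visual response to the sum of the unimodal responses."""
    unimodal_sum = auditory + visual
    if audiovisual > unimodal_sum:
        return "superadditive"  # AV > A + V: both inputs make neurons fire longer
    if audiovisual < unimodal_sum:
        return "subadditive"    # AV < A + V: both inputs make neurons fire less
    return "additive"

# Per the study, subadditive responses in the fusiform tracked better reading skill:
print(classify_voxel(auditory=1.0, visual=1.2, audiovisual=1.8))  # → subadditive
```

In the study it is the subadditive pattern, the more efficient response, that goes with stronger reading.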

“As you learn how to read, your brain starts to make more use of top-down information about the sounds of letter combinations in order to recognize them as parts of words,” says McNorgan. “This information gives your word-recognition system a leg-up, allowing it to respond more quickly. The multisensory neurons are getting the job done sooner, so they don’t need to fire for as long. Better readers seem to have more of these neurons taking advantage of auditory information to help the visual word recognition system along.”

Early intervention and basic instruction would, counterintuitively, involve this auditory information: “thinking more about the sounds of different words instead of concentrating on recognizing words,” says McNorgan.

Media Contact Information
Bert Gambini
News Content Manager, Economics, Media Study and Psychology
Tel: 716-645-5334
gambini@buffalo.edu
Concentrating on word sounds helps reading instruction and intervention – University at Buffalo
http://www.buffalo.edu/news/releases/2015/01/028.html

This study shows that there are many things to be learned about how to effectively teach reading skills to those who are struggling.

Related:

The importance of the skill of handwriting in the school curriculum

https://drwilda.com/2012/01/24/the-importance-of-the-skill-of-handwriting-in-the-school-curriculum/

The slow reading movement

https://drwilda.com/2012/01/31/the-slow-reading-movement/

Why libraries in K-12 schools are important

https://drwilda.com/2012/12/26/why-libraries-in-k-12-schools-are-important/

Where information leads to Hope. © Dr. Wilda.com

Dr. Wilda says this about that ©

Blogs by Dr. Wilda:

COMMENTS FROM AN OLD FART©

http://drwildaoldfart.wordpress.com/

Dr. Wilda Reviews ©

http://drwildareviews.wordpress.com/

Dr. Wilda ©

https://drwilda.com/

For exclusive content: THE OLD BLACK FART
Subscribe at http://beta.tidbitts.com/dr-wilda-the-old-black-fart/the-old-black-fart
