The pendulum has swung again
Whole Word or Phonics? The pendulum swings every decade.
With Science of Reading (SoR), the pendulum has fully swung back to the phonics side. And, as it did with Whole Word reading, it has swung too far. Talk to many SoR “evangelists” and they’ll tell you that every word can be decoded if you just learn enough phonics.
And, yes, technically, they’re right.
However, that misses a big issue. There are 26 letters in the alphabet, but there are 460-ish GPCs (grapheme-to-phoneme correspondences) in English. A grapheme, remember, is a single letter or combination of letters (like “a” or “ough”). A phoneme is a single sound (there are 43 or 44, depending on your region). A GPC is the relationship between the two: for example, that “a” makes the sound “aaaa,” or that “ough” sounds like the long O in a word like “though.”
The issue here is that the same grapheme can map to multiple phonemes. The easiest (and most obvious) example is that vowels can be both long and short. But what about the words hear and bear? The “ea” maps to a different sound in each (long e in hear, short e in bear).
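To make that one-to-many mapping concrete, here is a minimal sketch in Python. The phoneme labels are simplified stand-ins I chose for illustration, not IPA or any standard notation:

```python
# A GPC table: each grapheme maps to one or more phonemes.
# Phoneme labels are simplified stand-ins, not IPA.
GPC_TABLE: dict[str, list[str]] = {
    "a":    ["short_a", "long_a"],   # "cat" vs "table"
    "ea":   ["long_e", "short_e"],   # "hear" vs "bear"
    "ough": ["long_o"],              # "though" (one of several sounds)
}

def phonemes_for(grapheme: str) -> list[str]:
    """Return every phoneme a grapheme can map to."""
    return GPC_TABLE.get(grapheme, [])

print(phonemes_for("ea"))  # ['long_e', 'short_e'] — same letters, two sounds
```

The point of the structure is right there in the types: the value is a list, not a single phoneme, because one grapheme rarely has just one sound.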
Some diehard phonics people will likely argue that you can teach all sorts of rules around word origin, or around how the other letters in a word affect its sounds (because hear and heart also use different sounds). But, let’s be honest, you’re never going to be able to read anything if, before you can learn a new word, you first have to know everything about its origin.
Side note: One of my favourite examples of how crazy phonics can get is the pair colonel and kernel. They are pronounced the same, even though the first half of each word is spelled so incredibly differently!
While phonics is incredibly helpful and needed (I am not arguing against phonics, I’m just arguing against phonics being the be-all/end-all), reading well means you have built mappings in your brain between words and their sounds/meanings. You are not decoding each word you come across one by one. You only decode words the first few times you encounter them, until you’ve seen them enough for that mapping to form.
Now, that could sound like support for Whole Word teaching (it’s not). It’s an acknowledgement that teaching phonics will only get you so far. Research has recognized that you don’t need to learn every single GPC. Instead, it has shown that you should aim to know the 64 most common (at minimum) [1]. Once you have those, you can and will learn the rest as you encounter words that use them. And because so many graphemes can map to multiple phonemes, the best approach is to focus on the most common or frequent form of each and teach those first.
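As a sketch of what “teach the most frequent form first” could look like in code: order the grapheme–phoneme pairings by how often each one actually occurs in text. The counts below are made-up placeholders, not corpus data; a real ordering would come from a frequency analysis like Solity & Vousden’s:

```python
# Hypothetical per-GPC frequency counts: (grapheme, phoneme) -> occurrences.
# These numbers are illustrative placeholders, not corpus data.
gpc_counts = {
    ("a", "short_a"):  9000,
    ("a", "long_a"):   3000,
    ("ea", "long_e"):  1200,  # "hear", "eat"
    ("ea", "short_e"):  400,  # "bear", "bread"
    ("ough", "long_o"):  50,  # "though"
}

# Teaching order: most frequent pairings first, so early lessons cover
# the form of each grapheme a reader will actually meet most often.
teaching_order = sorted(gpc_counts, key=gpc_counts.get, reverse=True)

for grapheme, phoneme in teaching_order:
    print(f"{grapheme!r} -> {phoneme}")
```

Note that short a comes before long a, and the long-e “ea” before the short-e “ea”: the less common form of a grapheme simply waits its turn rather than being skipped.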
I feel like this is something most of us intuitively know. There’s a long-running meme/joke about realizing your pronunciation of a word is way off because you learned it from reading. That’s a clear sign you used the phonics knowledge you had and came up with a reasonable “guess” at the word. And you know what? It doesn’t matter if you didn’t pronounce it properly the first time (or even the first million times) in your head. When you do find out the correct pronunciation, you correct yourself and move on.
I’ve been thinking about all of this a lot as I try to figure out what order of GPCs to use in my little program. I had heard many swear by the UFLI curriculum, but when I looked into it, it didn’t really match what I expected. Many of the high-frequency GPCs I expected came quite late, and some that I know have low frequency came much earlier. I looked at some other versions as well, and nothing really “clicked.” I think I’m going to go back to these 64 and start from there. And the 100 high-frequency words.
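One thing a small script can do for that kind of planning is check which words become decodable as each GPC is introduced. This is only a sketch of the idea, not my actual program: the word list is a toy placeholder, and the greedy longest-match decoder is deliberately naive (it doesn’t backtrack):

```python
# Toy decodability check: a word counts as decodable once all of it can be
# covered by graphemes taught so far. Greedy longest-match is naive but
# enough to illustrate the idea; the lists are placeholders.
TAUGHT = ["th", "ough", "s", "a", "t", "ea", "r", "h", "b"]
WORDS = ["sat", "hear", "bear", "though", "colonel"]

def decodable(word: str, graphemes: list[str]) -> bool:
    """Greedily split the word into taught graphemes, longest first."""
    i = 0
    while i < len(word):
        match = next(
            (g for g in sorted(graphemes, key=len, reverse=True)
             if word.startswith(g, i)),
            None,
        )
        if match is None:
            return False
        i += len(match)
    return True

for w in WORDS:
    print(w, decodable(w, TAUGHT))  # colonel fails, fittingly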
[1] Solity, J., & Vousden, J. I. (2009). Real books vs. reading schemes: A new perspective from instructional psychology. Educational Psychology, 29, 469–511.


