1a. Whole Word / Look and Say
1b. Whole Language / ''A psycholinguistic guessing game'' /
Literature-based Approach / Holistic Approach / Real Books (UK) /
Discovery Method / Emergent Literacy / Language Experience /
Apprenticeship Approach / ''Acquisition of reading in authentic contexts through a progressive and invisible pedagogy'' (Goouch/Lambirth p110,
italics added) / ''A hydra-headed beast'' (Turner 1990 p2)
The very first whole word programme, the Quadrille programme, was invented in France
by Abbe Bertaud in 1744. Derivatives
of this programme spread through continental Europe - endorsed
by the King of Prussia, used by Basedow and Gedike in Germany
and Jacotot in Belgium. Back in France, Abbe de l'Epee (circa
1760) was inspired by the Quadrille programme to produce his
own whole word programme, which he used with deaf-mutes.
By 1826, whole word books were being promoted on both sides of the Atlantic,
with Abbe de l'Epee's programme being used by Thomas Gallaudet
with deaf-mutes in America. Gallaudet also produced a
beginners' reading book (1836) for hearing children, The Mother's Primer.
This taught reading by the whole word method, with all mention of the deaf-mute connection erased (Rodgers).
In 1886, the psychologist James McKeen Cattell found through eye-movement experiments that short words could be named as fast as isolated letters. He also discovered the word superiority effect: a letter is identified more accurately and rapidly in the context of a real word than in the context of a nonword or a random letter string. These results led him to believe that individual written words must be processed by the brain as whole units. His experiments were repeated by modern-day eye-movement researchers. They reproduced Cattell's results, but interpreted them very differently: ''(T)he word-superiority effect demonstrates that skilled readers process all of the letters when identifying a word'' (Ashby/Rayner p58, italics added).
Cattell's discoveries reinforced the opinions and beliefs of a procession
of self-appointed education experts working in the new teacher-training
colleges. They ordered teachers to use the whole word, look-say method using
flash-cards. The look-say method metamorphosed into the whole-sentence 'meaning' method using reading schemes (basal readers in the USA), written using high frequency words.
Modern eye-movement research. Part 1. The Eyes Have It.
Part 2. The Eyes to the Write (in English orthography).
Already established as the main method to teach reading throughout England by the 1920s, whole word / look and say used reading scheme books with repetitive text. Children were expected to memorise the words as whole shapes
through look-and-say flash cards and the constant repetition of those words in a particular scheme's books. Some ''intelligent guessing'' (The Practical Infant Teacher.1930) was also recommended.
One of the continuing stream of education 'experts', Dr. Russell of the University of California, produced a book in 1949 that included the following
strategies, in order of importance, to aid recognition of new words:
1. The general pattern, or configuration, of the word.
2. Special characteristics of the appearance of the word.
3. Similarity to known words.
4. Recognition of familiar parts in longer words.
5. The use of picture clues.
6. The use of context clues.
7. Phonetic and structural analysis of the word.
(Russell p55)
These strategies (1-6 are forms of guessing) are almost identical to those advocated by some reading experts today.
The look and say reading schemes
were very dull and repetitive, introducing new words at a very slow rate in order to aid memorisation, but they proved
lucrative for the newly emerging educational publishers. Dick and Jane were the main characters in a hugely popular look-say scheme used to teach children to read in the United States from the 1930s through to the 1970s. The Ladybird Peter and Jane Key Word reading scheme was written by British educationalist William Murray and first published in 1964. The Ladybird Key Word books were used in 80% of schools until the 1970s.
During the look and say era, Dr. Joyce Morris undertook research, published as Reading in the Primary School (1959), collecting and analysing data from seven-year-olds at a large number of Kent's primary schools. She found that reading standards in Kent at that time ''were above the national average. Nevertheless, 19.2 percent of the 3,022 survey seven-year-olds could be classed as 'non-readers', and a further 26.4 percent had some mastery of reading mechanics but not sufficient for them to be independent readers of simple information and story books''.
Nearly 40 years later, researchers Masterson, Dixon and Stuart carried out an experiment to see how easy it was for five-year-old beginning readers to
remember words by sight from repeated shared reading of the same whole word texts. It turned out to be much harder than they expected: they were 'shocked' to discover that 36 repetitions were not enough to guarantee that children would remember a word. ''When we tested children's ability to read words they'd experienced more than 20 times in their school reading, on average they could read only one word correctly'' (Stuart pp26-27 in Lewis/Ellis, Phonics; italics added).
According to Diane McGuinness, an in-depth examination of writing
systems, both ancient and modern, reveals that the average person can only
retain about 2,000 whole word shapes in their visual-memory (D.McGuinness GRB p214). Victor Mair, Professor of Chinese Language
and Literature, agrees with McGuinness. He says, ''There is a
natural upper limit to the number of unique forms that can
be tolerated in a functioning script. For most individuals,
this amount seems to lie in the range of approximately 2,000-2,500''
(Daniels. The World's Writing Systems p200)
Definition of writing: ''A system of more or less
permanent marks used to represent an utterance in such a way that it can
be recovered more or less exactly without the intervention of the utterer''
(Daniels. The World's Writing Systems p21)
Contrary to the myth that they are logographic or ideographic
writing systems (conveying meaning without regard to sound),
apart from around 1850 Kanji 'picture' symbols, Japanese
writing consists of sequences of different consonant-vowel
pairs (diphones). Chinese writing is based on monosyllabic-morphemic units fused with category symbols (DeFrancis). The misconception that Chinese is a logographic writing system arises because nearly half of its words are only one syllable long - a whole word and a syllable unit of sound at the same time. ''The Chinese language has around 1,200 syllables; English has about 60,000. This is why Chinese is written as a syllabary and English is not'' (D.McGuinness SRS intro)
See Diane McGuinness's book 'Why Children Can't Read' pp47-52
and Share's 'Anglocentricities' paper p588.
''Writing systems are solutions to the problem of representing spoken
language in visuo-graphic form. It’s what they all do, despite their
superficial differences. Writing systems that represent words as visual
patterns independent of their pronunciations do not exist. Such systems
are inadequate for representing large numbers of words: Too hard to
learn or use'' (Mark Seidenberg. Blog.)
''The fact is that NO writing system ever exceeded 2000 symbols. This is because that is the absolute limit (lifetime learning limit) of a human's ability to remember which abstract symbol (or sequence of symbols) stands for which word. Think about how hard even this would be! It takes Japanese children from first form to the end of secondary school to memorize 1850 Kanji symbols and which word they go with. The bulk of their writing system is written with sound symbols, not word symbols. The beauty of using sounds (syllables, diphones, phonemes) is that this drastically reduces the memory load'' (D. McGuinness)
The whole word method leads some children to believe that they must memorise each word as a random string of letters. This makes learning to read exactly like trying to memorise the telephone directory. ''Like printed letter strings, telephone numbers contain a small set of symbols … Unless all numbers are dialled correctly and in the right order the connection will fail … Unfortunately, there are no systematic or predictable relationships between these strings and their corresponding entries; so each of the many thousands of such associations must be painstakingly committed to memory. There may exist a few rare individuals who are capable of memorizing entire telephone directories, but for the average child about to learn to read, the absurdity of this task should be obvious'' (Share. Cognition 55/2.1995. quoted in Goswami)
There was a revival of interest in phonics amongst some teachers as a result of the publication of Rudolf Flesch's
polemic Why Johnny Can't Read (1955), but look-say continued as the central method despite the evidence that it failed to teach a significant percentage of children how to read. Whole language became the dominant approach in the 1980s and lasted until the introduction of the National Literacy Strategy (NLS) in 1998, which brought in mixed methods.
The Whole Language Movement (called 'Real Books' in the UK), founded by Kenneth Goodman and Frank Smith (an ex-journalist who, by his own admission, had never taught anyone to read), appeared circa 1970 as a reaction to the dreary look-say schemes.
Phonics was completely sidelined. "Matching letters with sounds is a flat-earth view of the world" Goodman declared in his 1986 book, What's Whole in Whole Language.
Whole language is a 'philosophy' rather than a method to teach reading; only
'real' books / authentic texts are provided and children's near guesses at words are accepted if they preserve meaning, e.g. 'pony' rather than 'horse' - see the whole language definition of reading below. Children are expected to 'discover'
the alphabet code for themselves and reading will 'emerge'.
''(I)n 1991 three Whole Language professors wrote a book, Whole Language: What’s the Difference?, in which they defined what they meant by reading. They wrote: From a whole language perspective, reading (and language use in general) is a process of generating hypotheses in a meaning-making transaction in a sociohistorical context. As a transactional process reading is not a matter of “getting the meaning” from text, as if that meaning were in the text waiting to be decoded by the reader. Rather, reading is a matter of readers using the cues print provide and the knowledge they bring with them to construct a unique interpretation.…This view of reading implies that there is no single “correct” meaning for a given text, only plausible meanings'' (Blumenfeld. Why Johnny STILL can't read. 11/02/11)
Whole language purists were hostile to the look-say method
but even more so to phonics. Children were given 'authentic
texts' (commercial story books) to
read from the very start. No decoding instruction was given as it was decided
that children could learn to read as easily as they learnt to talk and walk,
simply by having unfettered access to plenty of lovely story books
with a helpful adult on hand; the teacher stopped teaching and became
'a guide on the side'.
There is no scientific evidence to support the whole language method. The research cited by whole language advocates consists almost entirely of collections of anecdotes or 'kidwatching'.
Professor Steven Pinker, a leading cognitive scientist, who wrote the
foreword to Diane McGuinness's book Why Children Can't Read, said,
''In the dominant technique, called 'whole language', the
insight that language is a naturally developing human instinct
has been garbled into the evolutionarily improbable claim
that reading is a naturally developing human instinct. Old-fashioned
practice at connecting letters to sounds is replaced by immersion
in a text-rich social environment and the children don't learn
to read'' (Pinker p342).
In other words, although speech and language are 'hard wired' into our brains, reading, which is a relatively recent cultural phenomenon, cannot possibly be innate in the same way. ''We were never born to read'' (Wolf p3).
Reading scores hit the floor in LEAs that took on the whole
language fad with unquestioning enthusiasm; in his 1990 paper,
Sponsored Reading Failure, the late Martin Turner wrote that about 25% of pupils arriving at south London comprehensive schools
regularly had a reading age below 9 years, 10% below 8 years,
and approximately 50% of pupils arriving at east London comprehensives had a reading age below 9 years. (Turner p10)
The statistics didn't improve much with the addition of a small
amount of (analytic) phonics to the NLS mixture; in December 2010,
twelve years after its introduction, the BBC reported that, ''One in 11 boys in England - one in seven in some areas - starts secondary school with, at best, the reading skills of an average seven-year-old'' http://www.bbc.co.uk/news/education-12000886
''So to the other achievements of the 'real books' movement may be added that of creating dyslexia'' (Martin Turner p19)
American Professor Martin Kozloff wrote, ''In fact, the revolutionary whole language conception of reading as a "psycholinguistic guessing game" is a bizarre fantasy--a fantasy that managed to catch on (and make many thousands of children illiterate) because students in schools of education naively trusted their "literacy" professors--who were more interested in getting tenure, making a reputation, and selling themselves as innovators and self-inflating champions of social justice than they were at making sure new teachers (1) are guided by scientific research (which does not support whole language) and (2) know exactly how to teach reading effectively. In some fields (medicine, law, engineering) this combination of self-aggrandizement, immorality, and ineptitude is called malpractice, fraud, and criminal negligence. In education, it is called "philosophical differences" and "academic freedom." Apparently, school children and new teachers are supposed to pay for the academic freedom of education professors'' (http://people.uncw.edu/kozloffm/wlquotes.html) The general public pay for this academic freedom too; if professionals aren't capable of accurately decoding a word grapheme-phoneme correspondence (GPC) by GPC, all through the word, accidents will happen.
For example, the New York Times (NYT 3/6/99) reported how pharmacists are increasingly giving out incorrect prescriptions. In one incident, chlorpromazine, an anti-psychotic, was wrongly dispensed in place of chlorpropamide, a drug which lowers blood sugar, with fatal results.
In their guidance document on 'Dyslexia, literacy and psychological assessment', the British Psychological Society dismissed whole language: ''The whole language model of reading conceives word reading as a ‘psycho-linguistic guessing game’. It is argued that, driven by a search for meaning, the fluent reader makes educated guesses on the basis of the text already read. A crucial assumption is that most words can be ‘read’ as wholes, visually. The evidence against such an account of reading behaviour is by now incontrovertible. Accurate and fluent word decoding is a pre-requisite for efficient reading for interest and information'' (BPS 2005 .p26)
A pure whole language approach is rarely used in England's primary schools nowadays
but, despite the official introduction of synthetic phonics in 2006, a whole
language mutation, the NLS mixed method (1998), continues in
most classrooms, according to NFER's 2013 survey of 583 literacy coordinators.
Dr. Morag Stuart, a contributor to the Rose Report (2006), gave evidence to the Education and Skills Committee's inquiry on Teaching Children to Read (2004). She told the committee, ''I work at the Institute of Education and I go in there every day. However, I work in the School of Psychology and Human Development and I teach on Master's courses for already qualified teachers and the continuing professional development programme. I moved to the Institute of Education because I recognised that I now knew an awful lot about reading and my knowledge was useful to teachers. However, I have never been invited to give so much as a single lecture on the initial teacher training course which runs in my own institution. That is the extent of my failure to make a difference.'' (Ev 25)
In his article, The Education White Paper: a CPS Postnatum (2010), Tom Burkard wrote that, ''(T)eacher training was first identified as the major obstacle to the implementation of effective practices in the 1996 report,
Reading Fever. In an unpublished CPS report that was sent to Nick Gibb just prior to the general election, we suggested that new arrangements were needed to train teachers to use synthetic phonics effectively. We included a survey of reading lists for 46 initial teacher training (ITT) courses, which revealed an overwhelming hostility to this method, and indeed a profound disagreement with the coalition’s overall vision of educational reform''.
In 2012 the then Coalition Government made it clear that proficiency in synthetic phonics was the expectation for all teachers and those training to teach. This expectation is now reflected in the Teachers’ Standards. In order to meet the standard, trainee teachers should, by the end of their training:
• know and understand the recommendations of the Rose Review (2006) and the Simple View of Reading, and be able to apply this understanding to their teaching of reading and writing;
• know and understand the alphabetic code;
• know and understand the Criteria for assuring high quality phonic work (DfE, 2011) and be able to recognise how they are met in a range of phonic programmes;
• be able to apply their knowledge and understanding of the Criteria to the teaching and assessment of phonics using a school's phonic programme;
• be able to identify, and provide targeted support for, children making progress both beyond and below the expected level.
As can be seen above, university teacher training departments are obliged,
nowadays, to provide trainees with information on systematic synthetic phonics (SSP).
According to anecdote, some teacher trainers believe that they can fulfil this obligation simply by handing trainees a copy of the government's now archived SSP programme, Letters and Sounds.
The fact is that many teacher trainers remain ideologically wedded to the NLS mixture of methods
and are extremely reluctant to train students to teach decoding through
synthetic phonics alone. They continue to provide trainees with a
subversive subtext and false balance to ensure that any SSP course content is undermined. An ITE lecturer described approvingly how this was being done: ''Due to the very nature of what it means to be a professional there can be no doubt that for some there will be subversion at work - the creation of guerrilla campaigns against the imposition of SSP...For example, an organised, strategic resistance may be through the philosophy promoted within a faculty'' (Hewitt p88)
Prof. Morag Stuart: Government imposition of synthetic phonics is “damaging able readers” really?
''My uni was very anti the Rose report, and one of our assignments was to do a presentation about how poor the data were, and why the whole lot of it should be taken with a pinch of salt.'' (Mumsnet Primary Education forum 2011)
X All student teachers would benefit from
reading this open letter from Prof. Pamela Snow:
David Bunker is an English teacher who graduated with a PGCE from the University of Bristol in 2011. He recently wrote a blog post with the title Things I wish they'd told me. One of the things he wished they'd told him on his PGCE course was that ''Teaching Phonics isn't evil''. He went on to say, ''Despite being an English teacher, I consider my knowledge of phonics teaching to be incredibly poor. We didn't actually learn about it at Uni, we were just made aware that teaching reading through phonics completely undermines the nature of 'meaning'. We read a lot about this as well - a lot of Literature supporting the same point. Yes, we knew a lot about why Synthetic Phonics was wrong, without really being told what it was or how it is used in Primary Schools''
A senior ITE lecturer wrote a paper where she asserted that, ''A lecturer with integrity and a good understanding of how children read will ensure that students, who are learning to teach reading, understand that the sole use of SSP is not an effective way to teach reading, but that for many children a variety of
[decoding] approaches is required'' (Hewitt p88). She failed to provide even one piece of scientific evidence to support this view. In the same paper she stated that reading researcher Prof. Stanovich and whole language founder Frank Smith both ''endorse the belief that children learn to read through a whole word approach to reading'' (Hewitt p82). In actual fact, Stanovich says that he and his colleague Richard West were at first very taken with Frank Smith's theories about context effects and expected their own research to confirm them. However, their experiments led them to very different conclusions.
see - Extracts taken from 'Romance and Reality' by Prof. Keith Stanovich.
“That direct instruction in alphabetic coding facilitates early reading acquisition is one of the most well established conclusions in all of behavioral science” (Stanovich p415)
An easy way to see the anti-synthetic phonics bias present in many
universities' teacher education departments is to look at their Primary English
reading lists. There is often a very visible imbalance of texts, with those
promoting anti-synthetic phonics views and misinformation greatly outnumbering
those which provide reliable information about
teaching synthetic phonics. For example, a university which listed a couple of
*synthetic phonics texts as 'required reading' had an additional and extensive
'recommended reading' book list which consisted entirely of texts written or
edited by academics who are known to be anti-synthetic phonics, in the case of
Goouch and Lambirth virulently so. *Use of the words
'(synthetic) phonics' in an academic text's title does not necessarily mean that
the author supports, or is even
particularly knowledgeable about, (synthetic) phonics teaching.
Emeritus professor of education and whole language proponent, Henrietta Dombey, described the 2006 Rose Report as 'amateurish' and said that it, ''takes the profession along a dangerous path, not supported by sound research evidence, into some dangerous territory'' (Wyse/Styles. Editorial)
Student teachers, see Resources and X's throughout the website for RECOMMENDED books, papers and articles on teaching reading.
''Those who have an opposing view [of
synthetic phonics] have yet to produce any data showing
that their favoured approach produces greater long-term benefits'' (R.Johnston. www.publicservice.co.uk issue 20. p82).
The very peculiar case of Goodman, Smith and Clay (or why the whole
language approach just won't die)
Is Reading about 'Getting Meaning from Print'?
'Reading by Apprenticeship?' Paper by Roger Beard and Jane Oakhill: a
comprehensive critique of Liz Waterland's 1985 'Read With Me' which
promoted whole language & 'real' books
David Hargreaves 1996 TTA lecture: 'Teaching as a research-based profession: possibilities and prospects'.
p7: According to Caroline Cox, there are four principal grounds on which teachers justify their practices. They are: 'tradition (how it has always been done); prejudice (how I like it done); dogma (this is the 'right' way to do it) and ideology (as required by the current orthodoxy)
Stanovich, P. J., & Stanovich, K. E. (2003).
Using research and reason in education: How teachers can use scientifically based research to make curricular and instructional decisions
Burkard 2010: 46 ITT reading lists.
2007. Read it and weep. Charlotte Allen.
What are the problems with whole language and why doesn't it
A whole language catalogue of the grotesque
Whole language and Kenneth Goodman's 'psycholinguistic guessing game'
A personal essay: Thank you Whole Language
If learning to read is a 'linguistic task', what's wrong with Whole Language?
Joyce Morris: PhonicsPhobia: a fascinating personal history of teaching reading, 1940s-1970s, UK.
The ideographic myth.
Chinese written language - morphosyllabic.
Internet meme: Aoccdrnig to rscheearch at Cmabrigde Uinervtisy...
Also see Seidenberg's book 'Language at the Speed of Sight' pp94-96