Partly because the use of school attainment as a proxy index for literacy is dubious, literacy tests have been developed over the last 20 years, relying on increasingly sophisticated "simulations" of everyday uses of literacy. A literacy test sponsored by Southam News in 1987 identified 4.5 million Canadian adults as "functionally illiterate."2 The 1989 Survey of Literacy Skills Used in Daily Activities, from Statistics Canada, aims to portray several levels of reading, writing and numeracy "information processing ability," and not to draw a single line separating the literate from the illiterate.3

For the Southam test, a range of "everyday tasks" involving literacy was simulated. To assess literacy levels, a subset of relatively simple tasks was used, primarily involving administrative documents: finding the expiry date on a driver's license, making out a cheque, filling out a job application. Cut-off points were defined that allowed some of those tested to be classified as basic illiterates (8%), and these together with others to be classified as functional illiterates (24%). The Southam test has been widely criticized for its limited sample size, unrealistic or poorly formulated test items, an uncertain correspondence between English and French items, the reporting of immigrants not fluent in either official language as illiterate, and arbitrary criterion levels.4 In spite of these criticisms, statistics from the census and the Southam survey have been widely repeated in descriptions of the literacy problem by journalists, politicians, civil servants and literacy advocates.

There are also problems with the usual statistics from the viewpoint of literacy work. People's school attainment shows how much opportunity and stamina for schooling they had, not how at ease they are with literacy. Tests of literacy ability, especially when designed to produce a count of illiterates, are often wide of the mark, missing the wide variety of people's actual difficulties with reading, writing and arithmetic. Even to come close to the actual experience of literacy work, literacy statistics need to say something about the kinds and levels of difficulties with reading and writing that students and teachers may identify. Fortunately, some sources are available for beginning a statistical account that makes sense from the standpoint of practice. Some census and labour force survey data can be used. Individual items from the Southam test, taken on their own rather than bundled together to classify people, can be used. The Statistics Canada survey is very useful.


2 Creative Research Group, Literacy in Canada: A Research Report, prepared for Southam News, Ottawa, 1987.
3 Statistics Canada, Adult Literacy in Canada: Results of a National Study, Ottawa, Minister of Industry, Science and Technology, 1991 (Cat. 89-525E).
4 One review of criticisms of the Southam report is provided in K. Kelly, S. Murray and A. Satin, "A National Literacy Skill Assessment: Planning Report," Statistics Canada Special Surveys Group, April 1988.