In a special pilot issue of The College Quarterly in the spring of 1993, Michael Rock began with the premise that “our people are our strength” and built from it the argument that “in an increasingly competitive international economy we must make people our advantage. But schools do not seem to be graduating the knowledge workers whose skills are essential for the future prosperity of our country.” Rock’s article was entitled “Training the Future,” and his concern was that, as Canada shifted from a resource economy to a knowledge-based economy, it would be necessary to overcome the legacy of a high illiteracy rate (he cites 40%). The piece was used only as an editing and formatting test, and just the first page was actually published; the remainder appears to have been lost.

Others writing at that time, however, shared this concern. Kelly, Montigny, O’Neill & Sharpe (1992), for example, reported that literacy rates in the workforce were higher than general literacy rates (partly because the oldest cohorts were not in the workforce). Yet they concluded:

Of the approximately 1.2 million adults who believe that their skills are inadequate, only 9% are taking training to improve their performance, although a further 52% allow that they might enrol some time in the future. Enrolment in literacy programs has traditionally been low, and the results reported above suggest that motivating Canadians to register for them may continue to pose a challenge. (p. 7)

In this 20th year of publication of The College Quarterly, it seems fitting to take a look at our progress—how have we done in addressing the concerns Rock raised in that special issue 20 years ago?

Literacy would be a good place to start. It is a shaky and uncertain starting point, however, since definitions vary across space and time (e.g., de Castell, Luke & MacLennan, 1981). Before the 1990s, concerns about literacy and its relation to the economy rested on a variety of different measures, so international comparisons would have been difficult at best. These concerns about literacy and “training the future” were based on beliefs about the relationship between literacy and economic achievement. For example, using Statistics Canada data from the 1989 Survey of Literacy Skills Used in Daily Activities (LSUDA), Finnie and Meng (2007) examined the belief that “literacy and numeracy skills influence labour market performance and income in specific ways other than educational attainment” (p. 5). They looked for relationships between measures of literacy and numeracy and the “employment outcomes” of school dropouts. They summarized their results regarding the relationship between literacy and employment for dropouts as follows:

Among high school dropouts of both sexes, functional literacy had a significant, positive impact on being employed at the time of the survey, having been employed at any time in the previous 12 months, and having worked mostly full time when employed (Table 4). In contrast, the formal education variable (years of education) was not at all significant for male dropouts and significant only for female dropouts who worked mostly full time. (Finnie and Meng, 2007, p. 8)

That is, school dropouts with better literacy skills did better in the workplace than dropouts without those skills. It is not so clear exactly how literacy differences affected the employment outcomes of more highly educated people; yet, it is at the high end that “knowledge-economy” demands for information skills are greatest. This was the gist of Rock’s concern as well, but things were about to change.

The year after Rock’s article appeared in The College Quarterly special issue, Canada and a group of international partners established the International Adult Literacy Survey, a programme that developed some common understandings and a common measurement scheme. Using the data from this survey for adults aged 26 to 65 during the period 1994-1998, Tuijnman prepared a report for Statistics Canada that showed Canada to be well above average for the group of 21 nations studied. As he observed, “In only three countries are the test scores higher than in Canada and the United States: Finland, Norway, and Sweden” (Tuijnman, 2001, p. 19).

It seems, then, that some of the concern expressed 20 years ago was either overstated or quickly remedied. Earlier in the 20th century, as in its earliest days, Canada struggled with comparisons to the achievements of Europe and the United States, but those comparisons were not based on controlled studies or standardized measures. With better data, it seems that international rankings have been changing in more recent times. The level of secondary school completion in Canada has grown steadily and now ranks marginally ahead of the United States, but it is still behind many high-achieving European and Asian nations (OECD, 2011, p. 44). Beyond that, the Organisation for Economic Co-operation and Development (OECD) also reported that Canada was ahead of all but Korea in the percentage of 25- to 34-year-olds who had attained tertiary education (p. 30). Even more recently, Saute & Hess (2012) reported that Canada has the highest level of postsecondary completion at 51%.

The international comparisons of overall school achievement, therefore, would suggest that Canada has been doing well on what Rock called “training the future,” and these achievements might well be regarded as more noteworthy given the challenges posed by Canada’s diversity, the isolation of many Aboriginal people, and the need to provide second language education to many new immigrants. Rock’s specific focus, though, was on literacy, and I would like to deal with that in two ways: international comparisons and “new literacies.”

International comparisons

The most recent international comparisons were released by the OECD in November 2013. The Programme for International Student Assessment (PISA) is an OECD program that tests 15-year-old students in over 70 countries on measures of skills and knowledge. The most recent PISA report (OECD, 2013) involved about half a million students in the 65 countries that participated in the testing in 2012.¹ PISA seeks to differ from other international education tests by focusing on the students’ ability to apply or use information rather than simply to remember it. This focus makes PISA results particularly relevant for the issue of “training the future.”

While Canadian students performed slightly less well on the 2012 PISA tests of reading, mathematics and science than their counterparts did in the 2009 testing, they still performed significantly above the international average in each of these areas. In the case of reading (since our focus is literacy), it is worth noting that the decrease was not statistically significant and that the Canadian average was 27 points above the OECD average. Among the 65 participating nations, Canada ranked 13th in mathematics, 10th in science (tied with Liechtenstein) and 7th in reading (tied with Finland and Chinese Taipei). This is substantially better performance than that of either of the two nations to which Canadians most often make comparisons, the United Kingdom and the United States.

The 2009 cohort has now graduated and many of them are in colleges or universities. The 2012 cohort will be entering postsecondary programs in a little over two years’ time. At least insofar as the skills measured in PISA testing are concerned, it would appear that Canada is meeting the challenge of “training the future”; but, as I will argue below, this is a moving target.

What of the skills of current adults? The pattern we have seen above, in which increasing concern about international standing in literacy led to the development of standard international measures, occurred among adults as well:

Going back in history it is worth noting that by the mid-1980s policy makers in North America had become dissatisfied with the use of educational attainment as a proxy measure of what workers and students knew and could do (Niece and Adset, 1992). This dissatisfaction manifested itself in a desire to measure foundation skills such as reading literacy more directly, through the administration of actual proficiency tests. Canada conducted its first literacy survey in 1987.

This survey, entitled ‘Broken Words’, was conducted by Southam Inc. It discovered that there were more than five million functionally illiterate adults in Canada, or 24 per cent of the adult population…. It was the first time a study had gone to such lengths in illustrating the presence of functional illiteracy as a ‘hidden problem’ in Canada. (OECD, Statistics Canada, 2011, pp. 23-24)

Consequently, in the early 1990s, Canada collaborated with several other nations in the development of the International Adult Literacy Survey (IALS). Early administrations of the test found high levels of adult functional illiteracy in many countries (24% in Canada). This led to national policy changes and program development in many countries and to the development of PISA by the OECD. In the course of time, the IALS was modified and replaced by the Adult Literacy and Life Skills Survey (ALL), but it appears that further development of improved measures is ongoing (OECD, Statistics Canada, 2011).

The ALL was administered to 16- to 65-year-olds in participating countries. To date, the number of countries has been relatively small, and the participants have come primarily from Europe, North America and Oceania. The test examines prose literacy and document literacy as well as numeracy and problem solving; however, in this paper only prose literacy (focused on reading, understanding and using texts like job applications, newspapers and instructional manuals) and document literacy (finding and using information in diverse document formats like tables, schedules and graphs) will be considered. Performance in all of these areas was reported in terms of five broad levels of achievement, with the lowest level (Level 1) requiring primarily skills of recognition and recall and Level 5 requiring the abilities needed to find appropriate information in highly complex representations and to make inferences or draw conclusions based on that information (OECD, Statistics Canada, 2011, p. 15).

The ALL country comparisons were reported for assessments done in 2003 and in 2006-08. On the prose literacy measure, Canada ranked third behind Norway and Bermuda (though not by a statistically significant margin) and significantly higher than all other participants.² On the document literacy measure, Canada again placed third, significantly below Norway and the Netherlands and significantly ahead of all others. In both cases, Canada’s average score fell in Level 3, which is described as follows:

Prose literacy³

Tasks in this level tend to require respondents to make literal or synonymous matches between the text and information given in the task, or to make matches that require low-level inferences. Other tasks ask respondents to integrate information from dense or lengthy text that contains no organizational aids such as headings. Respondents may also be asked to generate a response based on information that can be easily identified in the text. Distracting information is present, but is not located near the correct information.

Document literacy

Some tasks in this level require the respondent to integrate multiple pieces of information from one or more documents. Others ask respondents to cycle through rather complex tables or graphs which contain information that is irrelevant or inappropriate to the task.

It is also worth observing that Canada had among the highest percentages scoring at the top levels: for prose literacy, 58% scored at Levels 3, 4 and 5, with 20% at Levels 4 and 5; for document literacy, 57% scored at Levels 3, 4 and 5, with 21% at Levels 4 and 5.

The authors of the ALL report provide extensive information about the measures used (including sample items), the sampling methodology and the analyses conducted. They also provide detailed and complex comparisons of country scores and of the relation of these scores to one another and to social and economic measures. Most of this is outside the scope of this article, but some of their conclusions are pertinent here:

Significant proportions of the adult population display poor levels of proficiency in one or more of the skill domains assessed and many perform poorly in all domains;

The differences in the level and distribution of literacy, numeracy and problem solving skills are associated with large differences in economic and social outcomes;

With some exceptions, the population mean scores in prose and document literacy changed little over the period between the administration of IALS and ALL;

In most countries the mean proficiency scores in prose and document literacy are lower for older age groups than for younger ones, with the steepest decline occurring for the oldest age group. (OECD, Statistics Canada, 2011, p. 306)

Together, these conclusions suggest that we (in all of the countries tested) are not really doing enough to prepare adults for the knowledge skills required for modern economies, but that the problem is less severe among younger adults. That is, though Canada is doing well in the international comparisons, Rock’s concern about “training the future” ought still to be a concern, and not just for Canada. They also suggest that improving our performance in these skills areas might lead to improvements in the economic and social conditions of a nation, but it is important to note that the ALL research was descriptive and correlational, not experimental, so the emphasis must be on “suggest” and “might.”

Lastly, the ALL researchers also concluded that “ALL offers some insight into the relationship between foundation skills and familiarity with ICTs. In particular it demonstrates the existence of a significant relationship between literacy proficiency and familiarity with and use of ICTs” (OECD, Statistics Canada, 2011, p. 308). That relationship is part of the focus of the next section.

New literacies

Again, at about the same time Rock mused about literacy issues in “training the future,” others were beginning to introduce new complexities into what literacy involved. The idea of “multiliteracies” was developed by a group of language scholars who met in New London, Connecticut in 1994 (New London Group, 1996). Cope & Kalantzis (2000) indicated that the term emerged from the group’s 1994 discussions as they struggled with what must be done to address disparities in the outcomes of language education programmes. The meeting resulted in a “pedagogy of multiliteracies” manifesto. Cope and Kalantzis summarized the group’s reasoning as resting on two main arguments: 1) increasingly, “meaning-making” involves audio, video, and other non-textual sources of information in some kind of multimedia or hypermedia format (multi-modal literacy), and 2) increasing diversity in many places, combined with growing global commercial activity in the context of an Internet-connected world, has resulted in greater awareness of the importance of both language diversity and the complex, diverse versions of English coming to be known as “world Englishes.”

These two developments have the potential to transform both the substance and the pedagogy of literacy teaching not only in English but also in other languages around the world. No longer do the old pedagogies of a formal, standard, written national language have the utility they once possessed. In contrast, the Multiliteracies argument suggests the necessity of an open-ended and flexible functional grammar which assists language learners to describe language differences (cultural, subcultural, regional/national, technical, context-specific, and so on) and the multimodal channels of meaning now so important in communication.

Clearly these are fundamental issues concerning our future. In addressing these issues, literacy educators and students must see themselves as active participants in social change; as learners and students who can be active designers—makers—of social futures. (Cope & Kalantzis, 2000, pp. 6-7)

Since that time a substantial body of research has grown up around the concept of multiliteracies, most of it in the form of descriptive studies of projects that apply the concepts. For example:

Students in this study developed media literacy skills through student-produced digital video. The students were able to benefit from understanding and following the news making process. The findings indicate that students were able to perform a cursory analysis of the media. Student analysis of media messages was limited to a superficial, lower-order level. There did not appear to be evidence of deeper, more critical evaluation of the messages inherent in media stories. For example, responses to the Crossing Media activity involved more descriptive, fact-based answers rather than a higher-level interpretative analysis of the stories.

Once students, however, had the opportunity to undertake the news making process it was evident that they started to develop [a] more critical approach to the media. (Cooper, Lockyer & Brown, 2013, p. 103)

While it is not yet clear what the long-term impact of the multiliteracies concept will be, it is clear that the idea has captured the interest of many researchers, who have established research centres like those at Brock University (http://www.brocku.ca/education/departmentsandcentres/multiliteracies) and York University (http://www.multiliteracies4kidz.ca/).

Conclusion

Rock’s assertion that “our people are our strength” remains an appropriate way to think about Canada’s place in the world, and international comparisons are a useful way of getting some measure of that strength. However, a nation’s relative standing is not really an indication of what its citizens can do. With the knowledge economy has come a dawning realization that the traditional understanding of literacy may not capture enough about the communication and learning skills of populations that are engulfed in images on web pages, televised programmes and advertisements, online videos, myriad social media mechanisms, mobile devices that put the world’s knowledge in one’s hands, and a constantly emerging digital environment. For the foreseeable future, “training the future” will be a moving target, and skills of media literacy and online communication will also be potent determinants of what citizens know and what they can do with that knowledge.

This is an enormous challenge for all educational institutions and a clarion call for continuous professional development in all sectors of the knowledge society.

² In rank order: The Netherlands, Australia, New Zealand, Switzerland, Hungary, the United States, and Italy.

³ Definitions taken from OECD, Statistics Canada, 2011, p. 15.

Bill Hunter is a member of the College Quarterly editorial board and a professor in the Faculty of Education at the University of Ontario Institute of Technology, where he was the founding dean of the faculty. He is a past editor of the Canadian Journal of Education. He recently published Online learning and community cohesion: School links with Roger Austin of the University of Ulster, Northern Ireland. It is available from Routledge.