In my previous post about the ESEE, I wrote about how the use of a threshold score to indicate one’s employability, along with a normal range (200-300), is unsubstantiated, bewildering and damaging to adult learners. The problematic use of threshold scores is not limited to the ESEE. Test developers may simply have carried on a tradition established by individuals involved in the international literacy testing project, who designated Level 3 as a “suitable minimum” for participation in today’s society. While the actual thresholds differ (Level 3 in international literacy tests compared to a score of 250 in the ESEE), the use of unsupported and false claims is similar. Where did these statements come from, and why do they endure?
In a report produced for the first international testing project, the International Adult Literacy Survey or IALS, Level 3 was established as an achievement standard without empirical evidence or even a rationale. The statement in the report simply read: “Level 3 is considered by experts as a suitable minimum for coping with the complex information produced in the knowledge society.”1 Absent from this and all subsequent reports is a discussion of who the experts are and how the designation of a minimal proficiency was determined.
Later IALS reports (see page xi in this one) contained evaluative details related to people’s abilities to manage everyday literacy demands, to take tests, and to learn new job skills (or anything new for that matter). Related judgemental statements also insinuated that people could be a threat to the well-being of their children.
Level 1 indicates persons with very poor skills, where the individual may, for example, be unable to determine the correct amount of medicine to give a child from information printed on the package.
Level 2 respondents can deal only with material that is simple, clearly laid out, and in which the tasks involved are not too complex. It denotes a weak level of skill, but more hidden than Level 1. It identifies people who can read, but test poorly. They may have developed coping skills to manage everyday literacy demands, but their low level of proficiency makes it difficult for them to face novel demands, such as learning new job skills.
Level 3 is considered a suitable minimum for coping with the demands of everyday life and work in a complex, advanced society. It denotes roughly the skill level required for successful secondary school completion and college entry.
The development and use of the statements seem to be a Canadian initiative. Although the testing methodology was primarily developed and refined in a series of national assessment projects in the US, related reports do not contain the economic and social implications that later appear in Statistics Canada and OECD reports. In addition, early US-based reports contradict the notion of a threshold level. Then, once Statistics Canada was no longer a partner in the international testing initiative, the statements were dropped from OECD reports. The statements were also publicly repudiated in 2013 by the current manager of the OECD’s adult literacy testing initiative (i.e. PIAAC).
Criticism and denunciation
One of the most strident critics of the notion of a suitable minimum (and of other issues with the international testing methods) is Thomas Sticht, a US-based cognitive psychologist, reading researcher and former professor at Harvard University. In a 2011 critical commentary he stated that the Level 3 suitable minimum is unfounded and potentially damaging. It is one of several issues with international adult literacy testing that have led to “defamation and gross misrepresentation of adult literacy competence.” He termed the collective transgressions “a maliteracy practice.”
A decade earlier, in an academic article published in a Canadian journal, he pointed out to his Canadian readership that US researchers and test developers refused to entertain the idea that there is a suitable minimum proficiency standard. In addition, he demonstrated how US researchers directly involved in their own national and later international testing initiatives bluntly stated that the data cannot be used to answer questions about an individual’s performance at work, ability to increase productivity, or potential contributions to a nation’s economic competitiveness. As cited by Sticht, the US researchers stated the following:
Because it is impossible to say precisely what literacy skills are essential for individuals to succeed in this or any other society, the results of the National Adult Literacy Survey provide no firm answers to such questions.
The same year that Thomas Sticht wrote his academic critique, Irwin Kirsch (the lead test designer) and his associates, also based in the US, emphatically stated the following in a technical report (see page 9):
[T]hese data do not reveal the types of literacy demands that are associated with particular contexts in this pluralistic society. That is, they do not enable us to say what specific level of prose, document, or quantitative skill is required to obtain, hold, or advance in a particular occupation, to manage a household, or to obtain legal or community services, for example.
Despite the denunciations, the use of a Level 3 threshold was maintained in the next round of testing (the Adult Literacy and Life Skills Survey or ALLS) overseen by Statistics Canada and the OECD.
The notion that nearly half the population do not have adequate literacy skills was quickly picked up by the media (perhaps that was the intent). The suitable minimum and evaluative statements were also commonly used by adult literacy advocates and support organizations to draw attention to their aims and generate policy support and funding. At the same time, some provincial governments and the federal government developed policy directives and policy projects around the Level 3 threshold, leading to commitments to raise literacy levels (see IALS and Essential Skills in Canadian Literacy Policy and Practice for an overview). Ontario developed the most comprehensive policy projects around the idea that Level 3 (or high school completion) is a suitable minimum, including mandatory literacy screening for those applying for social assistance (Ontario Works); mandatory participation in an education program for those who fail the screening test; and mandatory assessments intended to show literacy progress and skill gains towards Level 3 for those participating in Ontario’s adult literacy programs.
Twice in 2013, at the Centre for Literacy’s Summer and Fall Institutes in Montreal, William Thorn, who oversees the Programme for the International Assessment of Adult Competencies or PIAAC, repudiated the use of a Level 3 threshold and any statements that make connections between results and people’s performance and abilities at home, at school, and at work. When responding to audience questions, he bluntly stated that the use of a Level 3 threshold and related statements are “manifestly false.” (A few months later, in a video presentation I heard, he added that the statements are merely “an interpretation” of the international testing results.)
In the initial presentation, he emphasized there would be no statements from the OECD in any PIAAC related reports to indicate that Level 3 is needed for an individual “to function in a modern society and economy” since the evidence does not support it. The levels are not “normative,” he said, and do not represent any sort of standard. The statements about a suitable minimum or threshold level are merely…
…a supplementary interpretation which has been put on these levels, which I don’t think is justified. And I think that’s a view that is shared by many other people as well.
It seems however that policy advocates and persuaders in Canada do not share this view.
(To hear William Thorn’s comments go to 22:30.)
Legacy in Canada
Despite the repudiation, denunciation and criticism of the use of a Level 3 threshold, its use endures. In a recently published report, Smarten Up: It’s Time to Build Essential Skills, authors Janet Lane and T. Scott Murray (the former manager of the IALS and ALLS at Statistics Canada) write the following (see page 29):
It is estimated that fully 49 per cent of the adult Canadian population aged 16 and older have only Level 1 and 2 skills. Half of our adult population lacks the literacy skills to compete fully and fairly in the emerging knowledge-intense global economy.
The statements have also been taken up by the Ontario Ministry of Advanced Education and Skills Development and integrated into their policy vision for Ontario’s workforce Building the Workforce of Tomorrow: A Shared Responsibility. In the document, the topic of adult literacy is barely mentioned, but when it does appear, PIAAC scores are used to make what has been called an unjustified and false interpretation of the capabilities of Ontarians.
47% of Ontarians have literacy scores and 53% have numeracy scores below Level 3 (identified by ABC Life Literacy as the level required to succeed in a technology-rich environment).
According to the note accompanying the statement above, ABC Life Literacy based it on a webinar by T. Scott Murray and Janet Lane, the authors of the Smarten Up report. Because the statements have been around for a couple of decades, unhinging from them will take some time. There will be no media headlines proclaiming “Oops, we were wrong: Level 3 doesn’t predict what people can actually do in their lives.” But gradually, as more people find out there is no basis for such claims, perhaps these statements will be removed from websites, information fact sheets, and assessments like the ESEE. Confusing the issue for test developers, policymakers and literacy support organizations are ongoing efforts to connect the results with predictions of performance and economic productivity.
Why perpetuate the repudiated, false and damaging statements?
(1) Page 131 in the 1997 report Literacy Skills for the Knowledge Society: Further Results from the International Adult Literacy Survey, produced by the OECD and Statistics Canada.
2 thoughts on “The “manifestly false” Level 3 threshold and its legacy in Canada”
What an eye-opener the Q&A with William Thorn at the Centre for Literacy’s Summer Institute was for me. I had always sensed that there was something not quite right about the Level 3 cut-off, and it was good to hear from someone with the OECD that the levels were merely a supplementary interpretation that is not shared by some within the organisation. This was around the same time CMEC (Council of Ministers of Education Canada) adapted the OECD’s Education and Skills Online version of PIAAC for individual assessments to include Canadian content, under the name of PIAAC Online. In Ontario, literacy learners were asked to participate in the trial to validate the new test items (without being told that the piloted version was not designed to produce accurate testing results and that it was not an adaptive test that would take participants’ responses into account, a highly frustrating experience for many). The same invitational memo also mentioned that PIAAC Online was being tested for use as a pre- and post-assessment, a.k.a. learner gains. I recall talking to a StatsCan and CMEC representative at the Summer Institute and learning that the pilot of PIAAC Online as a learner gains tool was an Ontario initiative. For reasons unknown to me at least, PIAAC Online did not pass the test, and the next iteration of a learner gains assessment is the ESEE.
Thanks for providing these insights about the many challenges of using the IALS scale and scoring system in an education system, no matter what particular tool is used. Although the ministry decided not to use PIAAC Online/Education and Skills Online for LBS, it is embarking on a large project using Education and Skills Online to test the literacy and numeracy gains of postsecondary students. You can read more here about the project, coordinated by the Higher Education Quality Council of Ontario (HEQCO) and called the Essential Adult Skills Initiative. I wonder whether the HEQCO folks are aware of the challenges facing LBS when attempting to measure literacy/numeracy gains in an education system using the IALS methodology?