This blog is about more than critiquing a couple of tests. It is also about more than Ontario’s LBS system, although that system will be examined regularly. Fundamentally, the blog is an effort to articulate for all readers how an international adult literacy testing initiative has shaped the day-to-day teaching and learning of adult literacy. By examining how this has happened, we can gain some important insights into the tremendous influence that assessments have on learning, and into the subsequent impacts on both learners and educators. What is happening in Ontario is unique: its adult literacy learning system has done more than any other jurisdiction, to this point, to integrate the international literacy testing methods. Tracing the efforts to make this happen serves as a case study and a cautionary tale. Hopefully, it also serves as a learning opportunity.
In this post, I will provide an overview of the terrain I will attempt to examine. This mapping also provides some context for what is happening in Ontario and for my concern about tests like the ESEE.
The following graphic is a timeline of the international testing initiatives that have had such significant impacts on literacy learning in LBS. It’s important to note that all of these initiatives use the same methodological framework when conducting the actual test (the initiatives have also involved other tools, like surveys). In addition, most test items have remained the same since the early 1990s. (The test items are the individual tasks and questions that make up the complete test.)
Methods and messages are developed and refined
During the initial phase of test development, from 1985 to 1992, the methods were refined. After Irwin Kirsch and various associates at Educational Testing Service (ETS) completed the initial design, the test tasks were used in a handful of US projects and one Canadian project. The Canadian project, called Literacy Skills Used in Daily Activities (LSUDA), was important because it demonstrated that the test tasks could be translated, a necessary step in bringing the test to an international stage. In addition, this phase produced some key messages linking test results to one’s ability to “function” and “contribute” to the economic well-being of the country.
The graphic below shows the methods and messages that are of interest. They appeared in various reports published after the first round of international testing, from 1994 to 1998. This was the International Adult Literacy Survey (IALS), a partnership between the Organisation for Economic Co-operation and Development (OECD), Statistics Canada and Educational Testing Service (ETS) in the US.
I plan to look at each of the five methods and messages in future posts to show you how they are lifted out of the context of international, large-scale testing and carried into various educational projects.
Methods and messages are carried into other projects by first generation experts
The first people to carry specific methods and messages into other projects were those who gained expertise working on the international projects: expertise in test development, statistical analysis, psychometrics (the complex analytical work done to score the test tasks) and policy persuasion.
Notable Canadian experts include Stan Jones, who led the development of the Essential Skills framework, and Scott Murray, who has been a prominent policy advocate. In addition, ETS has maintained an important role in test item development and scoring. It maintains the bank of test tasks used in the international projects, which are now overseen exclusively by the OECD. ETS also developed the very first spin-off test, PDQ, likely drawing on the expertise of test developers and psychometricians who worked on the IALS. A more recently developed spin-off, Education and Skills Online, is administered and overseen by ETS. It is also featured prominently on the PIAAC website, indicating its importance to the overall testing initiative. This first generation of individual and organizational expertise remains highly active.
A proliferation of projects by second generation experts
In addition, a second generation of expertise took hold in Canada as a result of the Essential Skills project. Throughout the 2000s, and continuing to the present, there has been a proliferation of educational projects, including test development, the development of curriculum frameworks and standards (also called benchmarks, outcomes and competencies) and learning materials—all of which draw on the methods and messages developed as part of the international literacy assessment projects of the OECD.
The Essential Skills project, an initiative of the federal government, was instrumental in supporting the development of second generation expertise, as it provided the following elements:
- The first articulation of a curriculum framework and learning standards using the five IALS levels and the accompanying level descriptions for Reading, Document Use and Numeracy
- The application of the framework in what is called profiling, an effort that led to the development of parsed descriptions of hundreds of jobs using the hierarchy and categories in the ES framework
- Training in the use of test development methods, often involving adult literacy instructors
- And, very importantly, funding—first from the National Literacy Secretariat, and then from the Office of Literacy and Essential Skills (OLES). During the decade that OLES provided funding from 2005 until 2014 or so, it limited the types of projects and organizations that could be funded in order to focus only on the enhancement and use of the ES framework and related international testing methods and messages.
The two green arrows in the above graphic point to the projects that were developed for use by MTCU (Ontario’s Ministry of Training, Colleges and Universities). Both will be examined closely, since they are or will be used to provide accountability measures. Other tools will only be referred to in general ways.
Ontario is the only province in Canada to have incorporated the tests and standards from international literacy testing into accountability frameworks. Alberta has developed similar mechanisms—Read Forward and the Alberta Reading Benchmarks (ARB)—but provincial program funders there have not (yet?) mandated their use.