Since 2007 the policy folks who oversee the literacy program at Ontario’s Ministry of Training, Colleges and Universities (MTCU) have been attempting to use an international literacy spin-off test to produce a measure of what they refer to as learner gains. What they are after are test results showing an improved score as a result of participation in Ontario’s Literacy and Basic Skills (LBS) programs.
The initial project, called Learner Skills Attainment (LSA), involved representatives and consultants from Ontario’s literacy support organizations. Based on the LSA report, written by a consultant working with the College Sector Committee, MTCU staff made it clear that any assessment chosen for the LBS system must be based on the international literacy test, incorporating its fundamental testing methods and scoring system. There is no clear explanation of why they thought this test would be preferable to other standardized tools, nor did this initial report analyze whether the methods used in international population testing should, or even could, be used for a range of adult literacy programs. Decisions, or at least the evidence of those decisions that is accessible to the public, seem to come from nowhere.
By 2007 a few spin-off tests had been developed for MTCU policy staff and their consultants to consider. What I call spin-offs are reformulations of the original international literacy test that incorporate its fundamental test development methods in rigorous ways, including psychometric analysis and sometimes the work of psychometricians who worked on the original. The methods were originally devised in the 1980s by Irwin Kirsch and various associates working with Educational Testing Service (ETS).
Soon after the first round of international literacy testing was completed in the early and mid-1990s, ETS developed the first spin-off, called Prose, Document, Quantitative (PDQ). It continues to be available online for program and individual use.
Canada produced its own spin-off, the Test of Workplace Essential Skills (TOWES), in the early 2000s. You can find some sample test questions here. TOWES was developed primarily with funding from the federal government, and a network of community college distributors across Canada controls access to the test. The TOWES administration team is affiliated with Bow Valley College in Calgary.
A more recent spin-off was developed by the OECD in partnership with the European Commission. This one is called the Education and Skills Online Assessment, and access to the test, including purchasing, is administered through ETS. You can take a demo test, but you must use Firefox to access the online demo.
In their search for a learner gains tool, the ministry considered each of these spin-off tests, starting in 2007-2008. After field trials and pilots, none was deemed appropriate for use in LBS. I don’t have a report or documentation that states this; it is based simply on my own conversations with a few people who were involved in the testing. Field trials and pilots came and went, yet no decision about a learner gains test was made. When I asked those involved about the experiences of students, most said the tests were simply too difficult and inappropriate.
There are sound methodological reasons why they are too difficult and inappropriate for use in an educational system focused on the development of literacy and language.
- The average reading level of the text in test items developed for the international literacy test is Grade 8. Any spin-off rigorously aligned with the original test would have to have a similar average reading level, since reading level is an element used to determine test item difficulty and scoring. Nearly half (44%) of adults enrolled in LBS in 2014-2015 had less than a high school education. It is highly likely that those who left school early, struggled in school, have a learning challenge, or do not speak English or French as a first language would find a Grade 8 level of text challenging; that’s why they are in an LBS program.
- Not only is the text itself difficult, but the way the test questions are formulated adds to the difficulty. Rather than using the reading comprehension model and approach used in K-12 education, the test developers created their own model of reading, called information processing, which entails scanning the text for particular bits of information. Scanning is a high-level reading skill that fluent, competent readers use to move quickly through familiar texts and find bits of information for a specific purpose. Spin-off tests force most LBS test-takers to scan before they have learned to fully read and comprehend texts, before they are fluent readers, and before they understand a wide range of text formats. In addition, the tests force them to read unfamiliar texts.
- The original literacy test and its spin-offs are intentionally designed not to evaluate any sort of language skill development. That’s right: they do not evaluate the typical elements of language and literacy that programs teach, such as vocabulary, grammatical conventions, content knowledge, the organization of texts, and various formats, genres, and styles. That means the test does not, and never will, align with the actual curriculum in use in LBS programs.
And now the ministry is attempting yet again to make an international literacy testing spin-off work for the LBS system. They are currently in the middle of what is called the Learner Gains Research Project, involving 33 programs and 45 sites. The spin-off in use this time is called Essential Skills for Employment and Education (ESEE): same fundamental methods of test development, different group of developers. This time, though, the ministry is funding the redevelopment of some test items at the lowest levels. The ESEE is one of three assessments developed in partnership between The Essential Skills Group and Ontario’s College Sector Committee, an LBS-funded support organization.
I will be talking to people involved in this most recent effort, and I encourage anyone involved to post comments, which you can do anonymously or by contacting me directly. Some initial feedback indicates that, despite the funding of additional test items, the test is again too difficult for most LBS learners. Take the demo and see for yourself whether the test is appropriate.
So here we have it: a decade of effort to make an unworkable assessment work for the LBS system.