School Board Program Experiences with the LGRP

This presentation highlights the issues and challenges associated with a ministry attempt to field test a standardized assessment for use in the Literacy and Basic Skills (LBS) accountability framework. A long-term aim of the ministry is to use test results to make funding decisions.

The test, Essential Skills for Employment and Education (ESEE), incorporates basic methodological principles and methods from the Organization for Economic Cooperation and Development’s (OECD) international literacy testing initiative, including levels, a scoring system, statements of literacy standards and test item design. Some of these methods and principles have been modified. The LBS Committee of CESBA, which represents the interests and concerns of managers in school board adult literacy programs in Ontario, decided to pursue a study after hearing about the challenges experienced by their colleagues participating in the Learner Gains Research Project (LGRP).

The impacts of the use of the ESEE as part of the LGRP are serious and damaging, not only to adult learners, but also to individual programs and the LBS system as a whole.

  • The ESEE test produces failure: messages of failure are built into it, and it sets students up to fail by using texts with a Grade 11-12 reading level.
  • The use of cut-off scores and normal ranges is unsubstantiated.
  • The test demoralizes many students and may directly impact student enrollment.
  • The results are meaningless to learners and individual programs since they are not aligned with the OALCF or with the curricula in use in the three academic paths, including secondary credit and traditional adult literacy development curricula.
  • Use of the ESEE or any international literacy spin-off marginalizes and disconnects programs from the predominant approach to teaching literacy and the curriculum in use to support further education goals.
  • The results will likely prove meaningless for the ministry as well; it is highly unlikely that they will show statistically significant increases between pre- and post-testing.
  • Learners have been subjected to an unfair test that was not adequately validated for use in LBS, even for piloting purposes.
  • The ministry must seriously consider whether it is worth pursuing the use of a learner gains tool that aligns with international literacy levels and the 500 point scoring system after nearly a decade of work and the consideration of four different tools.

The full report is available here.

Assessment Challenges, Contradictions and Inequities: An Analysis of the Use of Digital Technology and OALCF Milestones

Since the first year of the OALCF, the Use Digital Technology Milestones have been used far more than other Milestones, as a previous study showed. In Performance Management System training sessions it was suggested that LBS programs were not reporting Milestone data consistently. Three previous studies described the following challenges with the Milestones: they are too difficult for some learners and too easy for others, their content and pedagogy are not related to learner goals, they contain confusing instructions, they cannot be modified to better accommodate learners, and the results may be unreliable. Our own earlier research into the theory and methods used to develop the Milestones also indicated potential challenges. This study examined how assessors and instructors use and understand the Milestones. We focused first on the digital technology Milestones, and then looked more broadly at Milestone use in general.

We analysed OALCF documents and Milestones, surveyed 181 assessors, interviewed 26 coordinators, assessors and practitioners from six different programs across the province in all streams, and analysed data from EOIS-CaMS (2013-2014). We reached the following conclusions based on our analysis:

  • The Milestones and their guidelines interfere with the ministry’s objective to “improve service delivery, learner experiences and learner outcomes” by introducing a series of challenges, contradictions and inequities into many programs.
  • Their content and overall design are disconnected from learner goals and commonly used pedagogical approaches, making them too challenging for some learners yet too easy for others, and making it difficult for instructors to predict readiness and successful completion.
  • They are primarily used for compliance and not to demonstrate progress.
  • Programs rely heavily on the digital technology Milestones, which are far more predictable.
  • Guidelines make the Milestones administrable but disregard teaching and learning.
  • Programs are affected in different ways depending on their resources (e.g. size, ability to work together, existing curricular supports, administrative supports) and the education levels and goals of their learners; programs also respond in different ways and have developed various strategies to ensure they meet reporting requirements while remaining learner-centred.

For additional details, a slide presentation is available. We also developed an educator/assessor research brief on the use and appeal of the DT Milestones, along with a brief describing the various ways programs have responded to the Milestones and established different assessment practices.

From an International Adult Literacy Assessment to the Classroom: How Test Development Methods are Transposed into Pedagogy

This book chapter is part of an international collection analysing the politics and practices of international literacy assessment (ILA) organized by the OECD. The book, Literacy as Numbers: Researching the Politics and Practices of International Literacy Assessment, explores “how internationally comparable numbers, now so heavily relied on in national policy are produced, and how they are shaping our understanding of the meanings and purposes of literacy.”

Here is an excerpt from the introduction of the chapter I wrote. In this chapter I explicate how aspects of the OECD’s adult literacy assessment regime are put to use to co-ordinate adult literacy learning in Canada. Guided by institutional ethnography, the analysis reveals how the testing technology used by the OECD, along with its operational and support devices, is transposed into the context of adult literacy education and vocational training, carrying the ideological interests of the assessment regime, some of its methodological procedures and associated literacy practices. Methods used to develop the testing technology and its related operational and support devices have been reformulated, and then integrated into curriculum frameworks, programme assessments and even instructional materials. This has been primarily a Canadian project, which began in the early 1990s, and has involved mostly Canadian experts who were and continue to be directly involved in the international assessment initiative. Transposition is a process of institutionalising and codifying aspects of the OECD assessment regime within adult literacy education policy, pedagogy and teaching and learning practice. It is accomplished when the test’s operational and support devices are incorporated into a national occupational skills development framework. It is also accomplished when the assessment regime’s testing technology is reformulated as spin-off tests for individual and programme use, a facet of the transposition that has recently moved beyond North America, and has been taken up by the European Union.

The Coordination of Adult Literacy Policy and Pedagogy to Ensure Productivity in a Knowledge Economy

Guided by institutional ethnography, this analysis uncovers sequences of textual coordination between an international literacy testing (ILT) initiative and day-to-day teaching and learning in adult literacy education programs and projects. The analysis reveals how ideological concerns related to literacy, and its assumed potential to increase productivity in a ‘knowledge economy’, are first actualized to manage and monitor the literacy resources of national populations and sub-groups, and then extended and expanded in Canada – using the Essential Skills project at the federal level and a curriculum reform initiative in Ontario – to ensure individual productivity. Extending the project even further is a group of policy entrepreneurs who have created profiles of the potentially most productive Canadians, using the ILT as their basis. Findings demonstrate how the use of assessments, curricular materials and overall policy directives derived from the ILT plays out in contradictory, unfair and ineffective ways, resulting in restricted learning opportunities, educational inequalities and a perverse literacy pedagogy.

Essentializing the Experiences and Expertise of Adult Literacy Educators

Adult literacy educator expertise is being subsumed by the Essential Skills framework and IALS testing methodology as both are packaged as adult literacy pedagogy. Preliminary findings from an institutional ethnography illustrate how educators are becoming increasingly immersed in the discursive relations of the literacy regime as they: 1) get hooked into the discourse of the regime; 2) establish a direct link with assessments and accountability requirements; and 3) are taught to change the way they teach, discounting both research- and practice-based knowledge of literacy and adult learning.

Tensions Between Policy, Practice and Theory: International Perspectives on Adult Literacy

This paper highlights topics and issues discussed during an international symposium in which I was a participant with Tannis Atkinson, Richard Darville, Audrey Gardner and Mary Hamilton.

Nearly three decades of empirical research and theoretical discussion have helped us to understand literacy as not merely a discrete and autonomous skill, but as situated, social, multiple, complex—as ‘literacies’. During this same period policies that inform literacy education have become increasingly narrow, aligning to the OECD’s large-scale international literacy surveys that offer ‘evidence’ of a nation’s capacity to compete. This symposium considers the range of responses from policy-makers, researchers and practitioners in Scotland, England and Canada to this environment, highlighting complexities and tensions but also offering hope for alternative perspectives and strategies that could reinvigorate the field.

Far From Perfect But Full of Promise: Literacy and Numeracy Policy in Scotland
Audrey Gardner
From Margin to Mainstream: Lessons from England’s Skills for Life Strategy
Mary Hamilton
The Essential Skills Framework in Canada: Mediating Literacy Measures and Learning
Christine Pinsent-Johnson
Developing Literacy Work?
Richard Darville
Governmentality: A Promising Theoretical Frame
Tannis Atkinson