ESEE Update and a Study of School Board Participants in the LGRP

The Learner Gains Research Project (LGRP) wrapped up on November 30, 2016. As far as I can tell, those coordinating the project at MAESD have not communicated directly with the field or with the programs that participated in the project to share their findings.

During a meeting I attended in late January, a MAESD representative said a decision about the use of the test as a learner gains measure in the accountability framework has been delayed. Perhaps decisions about the learner gains tool will be part of a broader discussion of problems identified in the LBS system in the LBS Evaluation that was just released. I will look more closely at the evaluation report in upcoming posts, but I thought I had better do some catching up first. Also, the evaluation report doesn’t delve into the details of the LGRP, likely because the project was still underway during the evaluation period. Hopefully, the findings from this study will be included in the broader conversation.

The school board study

I oversaw a small research project last summer that involved school board participants in the LGRP. The study was organized by one of the participants in the LGRP who wanted to find out whether other programs were experiencing similar challenges and, very importantly, to systematically document experiences and concerns. It would have been ideal to connect with all participants in the LGRP, but funds and time were limited. School boards are ideal programs to examine because they hug the middle, so to speak: their learner profile in terms of age, education level, and goals is more closely aligned with the provincial profile than that of community-based programs and colleges. A snapshot of school board programs likely provides a very good indication of what is happening across the province.

I interviewed five program managers and an instructor who were overseeing the administration of the Essential Skills for Employment and Education (ESEE) test in their programs. Altogether, they were to administer the test to 525 learners, or about 30% of the proposed sample.

All were very critical of the test and the testing experience. Mostly, they were worried about the proposed use of the test and its implications for their learners and programs.

Study participants raised many concerns about the difficulty and length of the test.

  • The test is simply too long and difficult, they explained; many learners took more than twice the recommended time of one hour to complete it, and some took four or five times longer.
  • One manager said some learners were simply opting to discontinue the program rather than take the test.
  • All participants mentioned technical challenges with the online format, including the need to teach learners how to navigate the test environment. One program created a test preparation and support course to assist a group who had a particularly negative experience with initial testing.
  • There was tremendous confusion, as well as both concern and apathy, among learners regarding the use and relevance of results; the results are not related to secondary credit, PLAR, or completing the GED.
  • Learners interpreted statements built into the test as messages of failure. (I previously commented on this aspect of the test.) The use of “normal” scoring ranges and cut-off scores related to employability and education access has been neither validated nor empirically demonstrated. Learners with past negative schooling experiences were further marginalized and demoralized.

Study participants also described a series of LBS system misalignments and contradictions.

  • Participants were surprised to discover that the OALCF levels are not aligned with the ESEE levels. They thought students would be prepared to do the ESEE if they were at OALCF level 2, but these students had results at ESEE level 1, if they were able to complete the test at all.
  • Learners must take all three sections of the ESEE regardless of their program goals, which may focus only on numeracy.
  • Some post-tests had been completed by the time of the interviews, and participants saw either no change or even score decreases. They said some learners were very reluctant to complete the post-test and simply moved through the questions quickly, clicking randomly on responses.
  • Participants encountered pedagogical confusions and contradictions. Instructors were put in awkward, contradictory positions: they encouraged learners to complete the test, then had to tell learners the test was irrelevant when poor results came back.
  • A couple of managers expressed grave concerns that results would be used to draw negative conclusions about their programs.

Managers and an instructor described several administration challenges.

  • The pre-test over-burdened an already time-consuming intake process and increased staff workloads.
  • Managers had a difficult time recruiting learners to participate in the project, as test results have no relevance to their learning and program goals. Even more challenging was convincing learners to take the post-test, since most had had negative experiences with the pre-test.
  • Running the project over the summer was a challenge as programs usually close down. Many learners who did the pre-test in the spring likely would not return in September, adding to recruitment challenges.
  • Managers felt isolated and frustrated. They wanted to communicate their challenges directly to the coordinators of the LGRP, but were not given the means to do so. They were not even aware of who the other project participants were.

Why do learners struggle with the test?

Just what makes the test so difficult? I previously looked at the issues related to readability and test design. In general, though, international literacy testing and its design were never intended to measure individual literacy development and skill gains. When the methods are used within the context of education to do just that, a series of problems, contradictions, and disconnects arise. The most serious problem in this particular reformulation of the methods is the difficulty of the test items. A readability analysis of all ESEE reading items, including the six pre-test items, revealed that the texts are written at a senior high school reading level (Grades 11-12) on average. The average grade level of most texts that people encounter in their daily lives (e.g., newspapers, websites, fiction, self-help books) is Grades 7-10. The average grade level of texts used in international literacy testing is Grade 8. Half the learners enrolled in LBS have not completed high school, and about one-third completed Grade 10 or less. It is not clear why the ESEE contains such difficult test items.
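For readers curious about what a grade-level readability estimate actually involves: the study does not specify which formula was used, but the widely used Flesch-Kincaid grade level can be computed from sentence length and syllables per word alone. The Python sketch below is a minimal illustration with a crude syllable-counting heuristic; it is not the tool used in the analysis above.

```python
import re

def count_syllables(word):
    """Rough syllable count: groups of consecutive vowels, minus a
    silent trailing 'e'. Real readability tools use dictionaries or
    better heuristics; this is only an approximation."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / max(len(sentences), 1))
            + 11.8 * (syllables / max(len(words), 1))
            - 15.59)

# Very simple text scores low (even below zero); dense prose with
# long sentences and multi-syllable words climbs into the high
# school grade levels reported for the ESEE items.
print(round(flesch_kincaid_grade("The cat sat on the mat."), 1))
```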

A unique pedagogical approach called information-processing is built into the ESEE and other spin-offs, including the OALCF Milestones. The decision to align the LBS system with international literacy levels, scores, and standards statements (i.e., descriptions of complexity) means adults are learning a unique and perverse pedagogy of reading, one that contradicts literacy learning in the secondary credit, postsecondary, and apprenticeship paths; Ontario’s K-12 system; and predominant approaches to teaching literacy.

The costs of obtaining a score are high, and it is the learners who take on the burden, explained one manager:

We seem to be down-loading a lot on learners and they don’t need it. We’re in it for the learners, but for some reason we seem to be bogged down in the data. [MAESD] has no idea of the impact of down-loading.

The impacts of the use of the ESEE and the LGRP were serious and damaging, not only to adult learners, but also to individual programs and the ministry’s LBS program as a whole.

I’ll share the slide presentation based on the study. It is available here as a PDF, but the following version also includes my notes: School Board Experiences with the Learner Gains Research. A full report is also available upon request.
