Internal Contradictions and the Displacement of Learner-Centred Programming
A close reading of the LBS logic model reveals internal contradictions about who the system is designed to support. It also reveals statements that displace the traditional philosophical approach of LBS: learner-centred programming.
First, the contradiction. One of the main findings from the LBS evaluation—a lack of vision and inconsistent messaging about who the program is intended to support—can be directly linked to an internal contradiction in the logic model. The stated purpose of the LBS system is
to help adults in Ontario […] including learners who may have a range of barriers to learning.
This appears to be an inclusive and equitable statement. But it doesn’t stand alone. In the target statements that follow, the OECD’s international testing levels (used in the IALS, IALSS and PIAAC) and the OALCF levels are used to define who can access LBS:
Adult learners, Ontario residents, who have their literacy and basic skills assessed at intake as less than the end of Level 3 of the International Adult Literacy and Skills Survey (IALSS) or the Ontario Adult Literacy Curriculum Framework (OALCF)…
Several problems arise as a result of the target statements:
- They don’t connect directly to the purpose statement or help identify “a range of barriers to learning” and who may face them. Instead, adult learners become those Ontarians who achieve a certain level of assessed proficiency, which also suggests that the assessments function as entry requirements.
- The two Level 3 standards differ: Level 3 on the IALSS is not the same as Level 3 on the OALCF. Having two distinct measures, neither of which supports the purpose, is inconsistent and contradictory.
- In addition, IALSS levels are mostly irrelevant for the LBS system. Level 3 is irrelevant because test-takers at the beginning of Level 3 have completed college, and those in the middle of Level 3 have completed university; the LBS system does not provide support for those in the postsecondary system (with the exception of preparatory programs). At the other end, the IALSS scale can’t be used for those with abilities aligned with an elementary education level: test-takers don’t place on the scale unless they have completed 8–10 years of formal education.
- The OALCF scale operates differently from the IALSS. Although it shares some elements with international testing methods, it doesn’t share enough of them to produce comparable results.
- The OALCF scale also requires LBS participants to have some literacy knowledge to be placed at Level 1, potentially blocking their participation.
While the purpose statement appears somewhat equitable and inclusive, directing program and policy staff to support those with “learning barriers,” the accompanying target statements undermine and confuse that purpose. They can also be used to create additional learning barriers, exacerbating existing inequalities.
Of course policy staff and program staff are confused, wondering who is eligible to participate and what the main intent of participation in LBS is. Is the program’s main aim to demonstrate a level improvement or to mediate “learning barriers”? If one can’t complete an IALSS spin-off test or an OALCF assessment, is one ineligible?
The chosen assessment methods and the decision to use an assessment as a condition for participation are barriers to inclusive and equitable participation. This is one example of inequality by design, one that prevents equitable and inclusive access to LBS for those Ontarians who have no other supported learning opportunities. The targets also suggest that those who aren’t able to draw on their skills to pass a test cannot access the program.
From Learner-centred to Employment-centred
The logic model provides a rationale and justification for pressuring learners to pursue employment goals over personal goals. It also provides a rationale for valuing and encouraging short-term employment outcomes over the achievement of education related aims and associated credentials. One of the stated immediate outcomes in the logic model is the following:
Training matches/addresses participants’ and/or provincial/territorial needs/priorities.
Here we see the tradition of learner-centred programming compromised, particularly if the learner’s reasons for participating in a program don’t align with provincial “needs/priorities.” The ministry gives itself the right to persuade and pressure adults (and the educators who work with them) to learn what it deems a priority.
The ministry’s right to control training—that is what is learned and how it is learned—is asserted again in a long-term impact statement:
Employment and training services [are] aligned with labour market development needs and priorities.
This time, the learner is completely absent from the statement. Learner-centred programming is turned into employment-centred programming. The logic model provides a justification for policy staff to override and disregard a participant’s personal aims, goals and concerns.
How this plays out in local programs will of course vary. Some may indeed “turn away” from supporting personal learning aspirations, while others strive to maintain learner-centred programming. But to do so, they have to spend time and effort working against the LBS system and for learners. This is a second example of an inequality by design, one that doesn’t respect the right of individuals to decide what kinds of literacy learning would be most useful in their lives, for their families and communities.
Further advancing the employment-centred focus is an intermediate outcome statement that qualifies the value of educational goals. As mentioned, the first intermediate outcome is that
Learners successfully transition to employment, postsecondary, apprenticeship, secondary school, and increased independence.
The second intermediate outcome statement is a qualifier (my emphasis):
Learners move to further education/training *that represents progression toward employment*.
It’s possible that the intent of the statement is to include educational activities left unnamed in the goal paths, such as completing a GED test or entering a vocational training program. At the same time (particularly considering the employment-centred focus), the statement undermines the value of education as a goal that can be pursued for its own inherent value and for a credential. Education is valued only if it leads to a job. The credential, of great value to the learner (and to employers and other educational institutions), is not mentioned.
We can see the devaluation of education play out in exit and follow-up reporting, which requires extensive tracking of employment and wages but no tracking of learners’ educational achievements. What happens to LBS participants once they complete an ACE program (postsecondary path), achieve PLAR credits for mature students (secondary path), or pass the GED? What opportunities can they now pursue? What barriers have been mitigated?
A related problem is the exclusion of the GED as a named goal. Many remote programs rely on the GED as a means to support Indigenous learners. Not naming the GED delegitimizes it and the students who pursue it. We don’t even know how the goal is categorized by programs and are unable to track it.
Here are two additional examples of inequality by design: 1) the overall devaluation of educational achievements, even though the majority of program participants hold these goals, and 2) the exclusion of the GED as a named goal for many Indigenous learners.
A Hierarchy of Outcomes and Impact Statements
It is the final section of the LBS logic model, the long-term impacts, that is the most powerful and the most crucial to understand. Measures of long-term impacts are the most important to develop in order to demonstrate overall system effectiveness. While the immediate outcomes are somewhat useful, and were the ones examined by the LBS evaluators, it is the long-term impacts that run the system.
In the next post, I will go beyond the text of the logic model to examine the assumptions and measurement methods carried into it to demonstrate those long-term impacts.