A mapping of the “administrative burden” in LBS programs

This will be the first of a few posts focused on particular findings, conclusions and recommendations in the recent Literacy and Basic Skills (LBS) evaluation report. I hope to highlight and synthesize identified issues, providing both policy analysts and LBS support organizations with the in-depth knowledge and understanding needed to make important decisions moving forward. The topics I will address in the posts are:

  • The onerous reporting and accountability process
  • The policy and pedagogical distortions that have been introduced by aligning the mandate of LBS, along with various curriculum and assessment mechanisms, with the bogus Level 3 international literacy standard
  • The role of LBS, its learner profile, overall program purpose and fit within the Employment Ontario (EO) system
  • The challenges in developing a fair and equitable pay-for-performance system for a highly dissimilar, complex and diverse learning program.

Although it may appear the discussion is limited to Ontario programs, the topics—increased accountability demands, policy and pedagogical distortions accompanying the OECD’s adult literacy testing project, the insistence that funding for adult literacy development serve only the labour market, and the inequitable impacts of pay-for-performance managerial systems used in education—may be familiar to readers outside of Ontario.

My soundbite takeaway from the evaluation report is this:

LBS supports the development and achievement of an array of outcomes related to well-being, goal achievement, social engagement, the mastery of a variety of new literacy, numeracy and technology-related practices, and transition into further education, training and employment, all for a very reasonable cost, despite the numerous administrative, curricular, testing, funding, communication and visioning issues identified.

On the ground, people manage to make things work, but at what costs to learners, program staff and policy aims?

One of the key issues identified in the recent evaluation report is the “administrative burden.” A participant from one program estimated they spend 30–50% of their time on administration. Another evaluation participant, working with about 100 students, stated they previously spent about three hours per month on ministry-required administration (before LBS was integrated into Employment Ontario) and now spend 70 hours per month on administration (see page 120 in the report)—an astounding increase that equates to two weeks of full-time work for one person. (We definitely need more research to fully document the percentage of a program’s budget, including both direct expenses for support staff and educators’ time, that is allotted to administrative and accountability work. We also need to know how much of a learner’s time is spent on these activities compared to actual learning.)

Based on my reading of the report, there are three main contributors to the administrative burden.

  1. A substantial increase in the number of reports and documentation mandated by the ministry.
  2. The use of a complex and technically unstable database system, EOIS-CaMS, which until this point has operated without help-desk support.
  3. Redundant data collection and entry, both within the ministry-mandated processes and between those processes and host organizations (community organizations, colleges, school boards). Program staff may have to enter the same information about learners, levels or spending in two or more different systems and reports. Also, many host organizations require their own reports and documentation from LBS programs, and programs use their own databases, documentation and reporting processes because the ministry systems do not supply information needed by programs, such as learner lists, class lists and attendance records.

Evaluators are quite pointed in their assessment of these impacts and write the following:

The Ministry may currently be doing more to decrease the efficiency and effectiveness of the LBS program than to increase it. Burdensome data entry and reporting requirements are the most commonly cited cause of inefficiency in the LBS system (p. 125).

They also describe some counter-productive practices that impact learners, programs and the LBS system, including:

  • cancelling scheduled program support time in order to catch-up on reporting, data collection and entry;
  • not entering learners in the EOIS-CaMS system, once a program has met its learner targets, because it’s too time-consuming; and
  • not entering learners into the system if they don’t help a program meet its suitability targets.

I wanted to better identify and map the many elements of the reporting and documentation demands. The project got more complicated and took far more time than anticipated. I fell down the rabbit hole of managerial mechanisms, frameworks, reports and data collection tools. Since I don’t currently work in a program, I referred to the LBS evaluation report, program guidelines and various training presentations that I have collected over the years. The mapping, which appears below, is a working draft. There may be some adjustments after I receive feedback. More importantly, you may want to make your own changes to better reflect your context and understandings. Here is the PowerPoint slide used to create the mapping, which you are welcome to modify: LBSManagerialMapping.


Starting from the right side, the illustration shows that there are 13 or more different data and information collection tools and processes, in which learners are asked to supply personal information, take tests, sign forms, and respond to questionnaires. None of these processes involve the completion of program learning activities. Most are mandated by the ministry and a couple may be required by the organization that hosts the program.

Much (all?) of this information is then parsed, re-categorized and re-presented in some way as it makes its way into various mandated reports and reporting systems. For example, information collected on entry using the mandated learner registration form (a PDF fillable form) is then re-entered into a service plan in EOIS-CaMS, and some may be re-entered again into a program database.

I have identified 23 managerial mechanisms (blue boxes, with green boxes indicating mechanisms used to judge performance) and four different organizational frameworks (orange boxes) used to categorize the data and information collected. Each of the 23 managerial mechanisms is a method devised to compile information that is then reformulated in various mandated reports. Six of the seven reporting processes are mandated by the ministry to monitor operations and evaluate performance. The seventh, program-specific reports, usually isn’t looked at by the ministry directly, but may be required by the host or to support program operations.

The red text (SQRs, suitability, Milestones, CTs and accompanying testing processes) marks aspects of the administrative burden that are a focus of recommendations in the LBS evaluation report. Placing these within the overall accountability and reporting system indicates that the recommendations may address some issues, but they certainly fall short in addressing the system as a whole.

The managerial mechanisms are concepts people had to learn to use. None are immediately apparent. What’s a plan item, a task, a competency or a sub-goal? What other information in the system defines what it is? How is it put to use? A great deal of time and effort is devoted to mastering the use of the concepts and ensuring their proper documentation.

One reason the accountability system has become so complex is likely the way the system was developed. The four frameworks represent four different phases and priorities. The first, simply labelled LBS, shows five pre-existing mechanisms that were in use before the integration into EO. Mechanisms like skill-based learning (a reading comprehension approach), goal paths and transitions, and the LBS levels (loosely aligned with grade levels) continue to organize program delivery and field-based understandings. Other mechanisms, such as the five functions of LBS (information and referral, assessment, learner plan development, training, and exit and follow-up), along with goal paths and transitions, have a more direct role in the EO system. In addition, pre-existing reporting processes, such as business plans and compliance visits, were maintained.

Then, three separate initiatives led to the development of 18 new managerial mechanisms and new reporting processes, namely a series of Service Quality Reports (SQRs) and the Service Plan.

  1. Integration into the EO network of services using a case management system and accompanying database (EOIS-CaMS), built on categories and structures devised for programs providing employment-related services and supports (not education providers like LBS).
  2. Development of a pay-for-performance system (the Performance Management Framework) that emphasized learner outcomes and performance measured through testing over learner participation, actual program accomplishments and actual transitions.
  3. The development of a performance monitoring framework and accompanying testing system, the OALCF, that would be used to supply key performance indicators in the PMF, constituting 50% of the performance evaluation (i.e., the Milestones, CTs and the TBD Learner Gains). A decision was also made to align the OALCF with international literacy testing levels and general testing methods, in the mistaken belief that the international testing levels are standards. The intent was to show that the LBS system could produce performance measures aligned with international testing results, thereby demonstrating that participation in LBS could have a direct impact on international literacy testing results for Ontarians.

This was primarily a systems design effort overseen by separate work groups and entities within the ministry, with little or no consideration of existing program operations. I say this based on my own short-lived experience working on the OALCF. The overall integration work was primarily a codified and procedural design project, with little accounting or regard for the actual day-to-day work, existing organizational structures, procedures and demands on people in programs, including the learners.

Hopefully, the mapping can be used to help people articulate the issues and their concerns, and provide a shared basis of understanding to facilitate upcoming stakeholder discussions.
