Skills versus tasks: A false debate that obscures a perverse reading pedagogy (part 3 of 4)

We can see the difference between the information-processing technique (carried into the Essential Skills framework, the OALCF, and spin-off tests like the ESEE and the OALCF Milestones) and reading comprehension play out in test questions. While the text to be read could be quite similar in both approaches, adherence to either an information-processing model or a reading comprehension model means the test questions and expected responses are very different.

The example test task below came from Employment and Social Development Canada’s (ESDC) Essential Skills Indicator. It is a Level 1 reading test task. The test, similar to the ESEE, is designed to be used for pre- and post-testing, signaling it is to be used within a teaching and learning context. The preamble is directed at potential learners and not educators. The stated aim is for people to use the indicator to recognize their “skill strengths and areas that may require improvement.” (Who would bother unless directed to do so?)

It appears ESDC is taking a more direct approach to reaching adults deemed to be in need of “improvement” after plans for the development and funding of a pan-Canadian adult literacy and learning network were dropped in 2014, and funding was pulled from provincial and national support and development organizations. (Read more here, here and here.)

Before examining how the information-processing method is used to develop test questions, I will look at the text to be read to point out how design considerations made for international population testing are carried into an educational context.

[Image: ESDC Level 1 Reading test task]

Text to be read

Design elements and considerations put in place for international testing, as part of an OECD initiative to spur international competitiveness and the development of human capital, are carried into the spin-off tests and the texts to be read. These design elements include the following:

  1. Texts without familiar logos, symbols, place names, and organization names. These are important contextual cues that could help make the texts more relevant and meaningful in education programs, but they were removed so that test items could be used in different countries. This design element is carried into the OALCF Milestones, which were produced for use in one province but do not include any references to living, working or attending school in the province.
  2. Content that attempts to represent various settings and activities in daily life while completely disregarding the circumstances of particular people. (How many learners in adult literacy programs are concerned with booking hotels with an ocean view for their vacation?)
  3. A predominance of formats aimed at getting things done, being efficient, and following rules and procedures, with no formats aimed at supporting knowledge, personal development, social and cultural understandings, being critical and asking questions, etc.
  4. Texts re-formatted for testing standardization purposes so they initially appear familiar but don’t always contain expected structures and information. (Test-takers are duped into thinking they can respond using their experience with similar texts but quickly discover this won’t work.)
  5. Activities that may not reflect what one would actually do in a similar situation. (Who sends a fax to book a hotel room?)

These design elements, along with no apparent standardization of the reading level of the text (as demonstrated in the ESEE), are enough to make the texts impenetrable, as well as demeaning and irrelevant, for many learners who seek out literacy programs for support.

Questions to be answered

The question in the example is the following: On which day of the week is the guest planning to arrive? Here is what it looks like online.

[Image: ESDC Reading 1 Question]

You may first notice the awkward wording of the test question. This is carefully done to set up the matching process that directs the test-taker to scan the text looking for congruence between a weekday and an indication of arrival. To answer the information-processing question the reader must do the following:

  • Recognize that the question is asking about both a day of the week and an arrival date
  • Search the text to find all references to a day of the week
  • Go through a process of elimination to discern which day is associated with the arrival date
| Day of the week | Arrival or other purpose? |
| --- | --- |
| Wednesday, January 25, 2016 | Date that the fax was sent |
| Tuesday, June 13, 2016 | Arrival date |
| Thursday, June 22, 2016 | Departure date |

This question is considered a straightforward matching process: it doesn’t require multiple scans; it refers to concrete and recognizable information such as dates; and there are no apparent distractors (although some readers could mix up Tuesday and Thursday). At this level, it is the text to be read, more than the questions, that could present a barrier to learners working on reading at around an elementary school level. In 2014-2015, 5,300 students indicated they had 0-8 years of education. This text scored at Grade 7 and Grade 5 on two readability analyses. Compounding the reading difficulty are the five design elements mentioned above, potentially making the text indecipherable for thousands of learners in LBS.
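The post doesn’t name which two readability analyses produced the Grade 7 and Grade 5 scores. As an illustration only, here is a minimal sketch of how one widely used grade-level formula, the Flesch-Kincaid Grade Level, is computed: 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. The syllable counter below is a naive vowel-group heuristic, not the dictionary-based counting a real readability tool would use.

```python
import re

def count_syllables(word):
    # Naive heuristic: count groups of consecutive vowels (treating y as a vowel).
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    # Split into sentences on terminal punctuation, and into words on letters/apostrophes.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Flesch-Kincaid Grade Level formula.
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

simple = "The cat sat on the mat. It was happy."
dense = "Comprehensive institutional accountability frameworks necessitate standardized evaluation methodologies."
print(flesch_kincaid_grade(simple))  # low (even negative) grade: short, simple sentences
print(flesch_kincaid_grade(dense))   # far above a high-school grade: long, polysyllabic words
```

Even this crude sketch shows why a text scoring at Grade 5-7 can be a barrier for learners reading at an elementary level: sentence length and word complexity compound quickly.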

You may be much more familiar with the reading comprehension approach. It is the predominant approach to teaching literacy in K-12 and in the vast majority of adult literacy and basic skills programs, particularly those that assist students in gaining a secondary credential (i.e., transitioning into an adult high school to earn a diploma, or earning a GED or secondary equivalency at a college). Before the OALCF, the approach was directly integrated into the LBS curriculum framework (referred to as an outcomes framework or the matrix) and all assessment materials in what were called demonstrations.

If one were to develop reading comprehension questions for this text, they could include the following:

  1. Who is the message for?
  2. Who sent the message?
  3. When does the writer of the message want to be at the hotel?
  4. What kind of room does she want?
  5. If the manager wrote her back and no rooms were available, what would he say?

The main aim of questions in a reading comprehension approach is to connect the reader with the text in order to derive meaning from the text. Who, what, when, why, where questions are commonly used to help the reader orient to the text. Questions could also have readers draw on linguistic knowledge, such as vocabulary (e.g. What is another word for request?). Readers may be asked to take on the role of the writer or other person referenced in the text in order to develop a response. Importantly, a reading comprehension approach usually integrates writing. Test-takers are often directed to develop full sentence (or paragraph) responses based on the text.

This comparison is not being made to suggest that reading comprehension is an ideal approach in all situations. It must be learned if an adult is preparing to enter or re-enter the K-12 system. However, it is of limited use when supporting the development of literacy practices for a myriad of personal and work-related uses. This limitation does connect to the skills versus tasks debate and is important to discuss. However, the potential for having these conversations disappeared when the information-processing technique was wrongly assumed to offer an alternative. Both approaches are constructions devised for test development, and both approaches disregard how people actually read and respond to texts in their lives.

With the introduction of the OALCF, the LBS funder developed and then mandated the use of a curricular and accountability system that upended the previous system. It now holds programs accountable for something that they don’t actually do. Until the OALCF was introduced in 2011, programs did not teach reading using information-processing. Now many do in order to prepare their learners to take mandated tests. This perverse pedagogy has led to a series of consequences with a range of impacts that affect some programs and learners more than others. I will look at these in the final post of the series.
