Development

In 2011, the Read Forward project team set out to create a reading assessment that was referenced to IALS and easy to use in a variety of instructional settings. To accomplish this goal, the project team worked in consultation with a variety of Bow Valley College staff members and with reading experts, literacy specialists, testing professionals, statistics experts, and IALS specialists from across Canada.

The wide span of reading skills on the IALS measurement scale made it difficult to use the IALS levels to demonstrate the more incremental improvement in reading customarily seen among learners in time-limited programs. Therefore, Read Forward took the first three levels of IALS and divided them into six smaller levels, based on an analysis of IALS data from native English speakers (Grenier et al., 2008). This structure allowed progress in reading skills to be shown across shorter distances than the IALS levels permit, making changes in reading skills more visible to both learners and practitioners.

After the development of Read Forward’s six levels, several item writers were hired to write the content of the tests, that is, the texts and accompanying questions. It was important that the content of the test items cover the “everyday” of the adult world, including items from home, community, and work; therefore, the prose and document texts were designed to replicate the types of texts learners would come in contact with in their day-to-day lives. Consideration was also given to the layout and accompanying graphics so that they supported the reading.

The project included two piloting phases to test out the material.

 

Phase 1: Local Pilot – Bow Valley College

We approached the pilot in two ways: 1) We worked with one instructor to embed the tests into the curriculum; 2) We administered the tests one at a time with individual classes. A total of 236 tests were taken by learners.

We returned to each of the classes that wrote a test with individual results feedback forms for all the learners, and then conducted a group debriefing. This gave us the opportunity to ask the learners about their experiences with the test and the results feedback forms, and it gave them the chance to express any concerns we might not have thought to ask about directly.

The feedback from both learners and instructors was overwhelmingly positive. Learners immediately saw the connection between the texts and questions in the tests and their day-to-day lives. Instructors commented on their learners’ level of interest and engagement.

 

Phase 2: National Pilot

For piloting partners, we looked for a variety of organizations and a wide geographic distribution in order to ensure that the resource was appropriate for use in a variety of contexts, including community programs, workplace programs, and colleges, and that it represented the everyday reading of learners across Canada. Eight organizations in cities and towns in seven provinces and territories were selected. Each partner organization had at least one person attend a training session to learn about the project, how to use the resource, and how to gather and submit feedback for the pilot.

Across the organizations, each of the piloted tests was written by at least 10 learners. In total, 330 tests were taken. Feedback was collected from the learners who wrote the tests and from the instructors and program coordinators who administered them.

 

Analysis

There were two different forms of analysis:

  • item analysis, which looked at the construction of the test questions and texts and how that construction reflected the level of difficulty
  • response analysis, which looked at whether learners answered correctly or not and how response patterns revealed problematic texts and questions

Data from the item analysis and response analysis were merged, and the results were used to determine the final alignment of the questions and texts in the tests for each of the six reading levels.
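The project materials do not describe the tooling behind these analyses, but the response-analysis step can be illustrated with a minimal sketch. The example below assumes pilot responses were gathered one row per learner per question in a hypothetical file named pilot_responses.csv with question_id and correct columns; the flagging thresholds are likewise illustrative, not the project’s actual cut-offs.

    import csv
    from collections import defaultdict

    def response_analysis(rows, flag_below=0.30, flag_above=0.95):
        """Summarize how often each question was answered correctly.

        rows: iterable of dicts with a 'question_id' key and a 'correct' key
        whose value is '1' (correct) or '0' (incorrect).
        """
        totals = defaultdict(lambda: [0, 0])  # question_id -> [correct count, attempts]
        for row in rows:
            stats = totals[row["question_id"]]
            stats[0] += int(row["correct"])
            stats[1] += 1

        report = {}
        for qid, (correct, attempts) in totals.items():
            p = correct / attempts  # proportion of learners answering correctly
            if p < flag_below:
                flag = "review: harder than expected"
            elif p > flag_above:
                flag = "review: easier than expected"
            else:
                flag = "ok"
            report[qid] = (round(p, 2), attempts, flag)
        return report

    if __name__ == "__main__":
        # 'pilot_responses.csv' is a hypothetical file with one row per learner per question.
        with open("pilot_responses.csv", newline="") as f:
            for qid, (p, n, flag) in sorted(response_analysis(csv.DictReader(f)).items()):
                print(f"{qid}: p={p} (n={n}) {flag}")

In practice, a summary of this kind would be read alongside the item-analysis judgments about how each question and text was constructed before the final level assignments were made.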

 

The Advisory Committee

Throughout the development process, project staff worked closely with an advisory team of adult learning experts from across Canada. Their input was sought at crucial junctures so that their observations and expertise could inform the overarching considerations of the project and ultimately ensure the usability of the final resource in the field.

Reference
Grenier, S., Jones, S., Strucker, J., Murray, T. S., Gervais, G., and Brink, S. (2008). Learning Literacy in Canada: Evidence from the International Survey of Reading Skills. Ottawa: Statistics Canada, HRSDC, Government of Canada.