Posts

Welcome to Reading RACES!

Reading RACES is a multidisciplinary research project, a collaboration between experts in reading instruction and experts in computer science and engineering at The Ohio State University.

The Grant

The research is currently supported by the U.S. Department of Education, Institute of Education Sciences (Grant No. R324A120103).

Project Goal

The overall three-year goal is to develop a reading fluency intervention that uses a repeated reading format with culturally relevant passages, delivered through computer software. The computer-based intervention is designed to foster relatively independent use and practice by first- and second-grade struggling readers in urban settings.

In Year 1 (2012-2013), the researchers:

  • Wrote and equated 30 culturally relevant passages for both first and second grade
    • Content for the passages was based on interviews of students, parents, and teachers and on researcher observations
    • Passages were equated using standard psychometric procedures
  • Studied the effects of culturally relevant vs. non-culturally relevant passages
  • Developed MAZE comprehension measures
  • Began to design the interactive computer program

Year 2 (2013-2014)

During the second project year, the researchers integrated the repeated reading intervention, including the stories developed in Year 1, into the computer software. The stories were based on the students’ backgrounds and were considered culturally relevant (CR). Our first-year research indicated that the students read the CR passages more fluently than non-CR passages and that they preferred stories with which they could personally identify. The students also showed preferences for stories that reflected altruism (i.e., doing the right thing or helping others).

To test whether our intervention worked, we obtained parent permissions and conducted a small-scale study with 7 second-grade intervention students and 3 comparison students in two Columbus City Schools. Each of the 7 targeted intervention students read 25 CR stories, practicing each story until reaching a designated goal, and completed a comprehension maze for each story. To determine whether the improved reading generalized to non-trained passages, students were also assessed on AIMSweb passages and mazes. During instruction the students completed the following steps:

  • Listened to a computer-delivered script introducing the instructional procedure
  • Listened to a human voice model, delivered via the computer, read the targeted practice passage
  • Read the passage along with the human voice model
  • Read the passage independently up to 3 times, with corrective feedback given by the experimenters
  • Read the practice passage for a one-minute timing
  • If the student reached his/her predetermined goal, took a three-minute maze comprehension assessment

When a student met his/her goal and completed the maze for the CR passage, the student then read an AIMSweb passage and completed the corresponding comprehension maze. While the 7 target students received the intervention, the comparison students were only assessed on reading fluency. The comparison students then received 4 weeks of intervention and completed between 15 and 19 stories on an updated version of the software that allowed for greater student independence. Using a multiple baseline design, the researchers closely monitored the students’ responses to study the training effects.
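The session flow described above can be sketched as a simple decision function. This is a hypothetical simplification for illustration only; the function name, inputs, and goal-checking logic are our own assumptions, not the project's actual software:

```python
# Hypothetical sketch of one intervention session's control flow.
# Names and thresholds are illustrative, not from the actual software.

def run_session(timed_cwpm: int, goal_cwpm: int) -> dict:
    """Decide which post-practice steps a student completes.

    timed_cwpm: correct words per minute on the one-minute timing
    goal_cwpm:  the student's predetermined fluency goal
    """
    steps = [
        "listen to scripted introduction",
        "listen to human voice model read the passage",
        "read the passage along with the model",
        "read the passage independently (up to 3 times, with feedback)",
        "one-minute timed reading",
    ]
    met_goal = timed_cwpm >= goal_cwpm
    if met_goal:
        # Goal met: maze comprehension check, then a generalization probe
        steps += ["three-minute maze assessment",
                  "AIMSweb passage and corresponding maze"]
    return {"met_goal": met_goal, "steps": steps}
```

A student who misses the goal would simply repeat the practice cycle on the same passage, which is why the follow-up assessments only append when the goal is met.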

We also replicated this study with 4 first-grade students at one of the schools, using a more up-to-date version of the software that allowed for even greater student independence; the first-grade students nevertheless required more adult assistance. Each first-grade student entered intervention at a different point in time, received between 3 and 7 weeks of intervention, and read between 7 and 15 stories. The first graders also received minimal error correction from the experimenters.

Some of our main findings are as follows:

  • Six of our seven initial second-grade intervention students made substantial gains in fluency and comprehension during intervention and maintained most of those skills at 2-week and 1-month maintenance follow-ups (see data).
  • Percentage growth for targeted students ranged from 33% to 199% in fluency and from 56% to 225% on maze comprehension.
  • Before intervention, by comparison, the comparison peers’ percentage growth ranged from -28% to 20% in fluency and from 13% to 67% in maze performance.
  • Fluency performance increased substantially for comparison peers once they began intervention (see data).
  • First graders responded more quickly to the intervention as evidenced by immediate transfer of fluency skills to novel AIMSweb passages (see data).
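The percentage-growth figures above can be read as percent change from a baseline score to a post-intervention score. The formula below is our assumption about how such growth is typically computed, not a description of the project's exact analysis:

```python
def percentage_growth(pre: float, post: float) -> float:
    """Percent change from a baseline score to a later score."""
    if pre <= 0:
        raise ValueError("baseline score must be positive")
    return (post - pre) / pre * 100.0
```

Under this formula, a student moving from 40 to 80 correct words per minute shows 100% growth, while a peer dropping from 50 to 36 shows -28% growth.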

Year 3 (2014-2015)

This year we are further testing our intervention by focusing on both first and second graders and assessing the extent to which each group can use the intervention independently. To do this, we are conducting additional studies with more children. Another major focus is involving teachers in using the intervention with the children. We also continue to refine the software to make sure it performs as originally intended. Specifically, we are:

  • Refining the software to calculate correct words per minute (CWPM) and provide error correction to students without adult supervision
  • Validating the software with first graders from 2 different schools; one group consists of English learners or students from homes where English is a second language
  • Refining procedures for teacher implementation in second-grade classrooms
  • Finalizing the user’s manual
  • Securing bridge funding to keep the project going until we qualify for a U.S. Department of Education efficacy grant

What’s Next?

We are looking for schools and districts that would like to partner with us to scale up this software program. In particular, these schools and/or districts would be key players in helping us secure bridge funding by writing letters of support/interest in our project to specified stakeholders and the Ohio State Institutional Review Board (templates can be provided). Most of this bridge funding would cover project support staff needed to:

  • Work cooperatively with building/district IT personnel
  • Provide professional development training and coaching to teachers, staff, and/or school volunteers (e.g., parents or reading tutors)

  • Collect, analyze, and report student data

For more information or if you have questions, please contact us!