The Relationship Between Oral Reading Fluency and Reading Proficiency



Walden University
COLLEGE OF EDUCATION

This is to certify that the doctoral study by Kathy Jones has been found to be complete and satisfactory in all respects, and that any and all revisions required by the review committee have been made.

Review Committee
Dr. Thomas Schnick, Committee Chairperson, Education Faculty
Dr. Barbara Bailey, Committee Member, Education Faculty
Dr. Brett Welch, University Reviewer, Education Faculty

Chief Academic Officer
David Clinefelter, Ph.D.

Walden University
2010
The Relationship Between Oral Reading Fluency and Reading Proficiency

by

Kathy S. Jones

MA, Abilene Christian University, 2001
BS, Abilene Christian University, 1997

Dissertation Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Education Administration

Walden University
August 2010
Abstract

Students who struggle with reading in Grade 3 often fall behind their peers in reading proficiency. Failure to meet the minimum reading proficiencies of state-mandated tests can negatively affect children's success in subsequent grades. Educators have used oral reading fluency tests as reliable indicators of students' later reading proficiency, and studies in 7 states found that oral reading fluency predicted performance on state reading assessments. One aspect of the automaticity theory was used to explain why struggling readers have insufficient attention to devote to reading proficiently. This nonexperimental, quantitative study investigated whether a statistically significant relationship existed between Grade 3 students' oral reading fluency rates and their reading proficiency as assessed by the state-mandated assessment. A Pearson correlation was used to compare the middle-of-year oral reading fluency rates, measured by the Dynamic Indicators of Basic Early Literacy Skills oral reading fluency measure, with reading proficiency, measured by the scale score of the Grade 3 Reading Texas Assessment of Knowledge and Skills, for 155 Grade 3 students in a school district for the 2008-2009 school year. The results indicated a relationship between oral reading fluency and reading proficiency. Study results may help elementary school administrators, teachers, and reading specialists identify at-risk readers and implement interventions that enable students to gain greater reading proficiency and improve their performance on state-mandated assessments.
UMI Number: 3423184

All rights reserved

INFORMATION TO ALL USERS
The quality of this reproduction is dependent upon the quality of the copy submitted. In the unlikely event that the author did not send a complete manuscript and there are missing pages, these will be noted. Also, if material had to be removed, a note will indicate the deletion.

UMI 3423184
Copyright 2010 by ProQuest LLC. All rights reserved. This edition of the work is protected against unauthorized copying under Title 17, United States Code.

ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106-1346
Dedication

This dissertation is dedicated to God. Throughout my life, He has prepared me for what was coming next. At times, He prepared me for something before I even dared to dream about it. My doctoral journey has been no exception. God first planted the dream in me of pursuing a doctorate 15 years ago. I did not even have a bachelor's degree at the time, and the dream seemed impossible. Every time it was in danger of fading, God would use someone to nudge me forward and place me back on track. He has been with me through all the twists and turns of these past 5 years that I have been enrolled in the doctoral program. He has molded me and made me into the person I have become. I dedicate this honor to Him and trust that He will guide me as He continues to use me to serve others with the talents with which He has blessed me.
Acknowledgments

God has used many people to encourage and inspire me on this doctoral journey. I am thankful for the support of my family. My husband, Roy, has stood by my side through all the ups and downs. He celebrated with me as I conquered hurdles and encouraged me when I was discouraged. My adult children, Rachel, Benjamin, Timothy, and Daniel, have showered me with comments, and I have especially been bolstered and encouraged with their cries of "Way to go, Mom!" My parents, Bill and Anita Love, planted the vision of setting goals and working hard to achieve them. My father has gone on, but I treasure the memory of the crack in his voice as he proudly told others I was working on my doctorate.

People in the field of academia have guided me on my journey as well. I appreciate the support of my chair, Dr. Thomas L. Schnick. He has promptly reviewed my work each time it was submitted and provided me with constructive criticism that improved the quality of my dissertation. Dr. Barbara Bailey served as my methodologist. I thank her for helping me see how the problem, purpose, and research questions provided the foundation for the dissertation. I am especially appreciative of my editor, Dr. Noelle Sterne, who coached me from where I was to where I needed to be. She had a clear vision of what my dissertation had to be and how to get me there. In addition to editorial changes, she encouraged me to clearly describe my thoughts, to tell her more about what I was writing, and to find additional sources to support my statements. This coaching was invaluable in my desire to deepen and expand the substance of this dissertation.
I am grateful too for the bonds I have forged with fellow students. Because Walden is an online university, I have not had the pleasure of meeting many of my peers. Nonetheless, I was amazed at how deeply the bond of friendship developed. I especially appreciate the friendship of Susan Myers. We met in our first online classes, roomed together at two residencies, exchanged many emails, and talked on the phone. I am confident that our professional relationship will continue even beyond graduation.

I appreciate and thank Keith Richardson, the superintendent of the district involved in this study, for his cooperation. He supported me in signing the data use agreement and letter of cooperation so that I could use the district archival data for the study. I appreciate also the continued support and encouragement of Doug Doege, the principal of the school for the study. He began working with me when the logistics of the study were still a dream to me. As the details developed, he helped make them a reality. I am also grateful for the cooperation of the Grade 3 teachers and students. Finally, I give special thanks to Dynamic Measurement Group for giving me permission to use the DIBELS ORF benchmark in the study and to the Texas Education Agency for granting copyright permission for the use of the Grade 3 Reading TAKS in the study.
Table of Contents

List of Tables

Section 1: Introduction to the Study
  Problem Statement
  Nature of the Study
  Research Question and Hypothesis
  Purpose of the Study
  Theoretical Framework
  Operational Definitions
  Assumptions, Limitations, Scope, and Delimitations
  Significance of the Study
  Transition Statement

Section 2: Literature Review
  Introduction
  Automaticity Theory
  Oral Reading Fluency
    Definition of Oral Reading Fluency
    The Role of Oral Reading Fluency in the Context of Learning to Read
    Fluency Instruction
  Reading Proficiency
    Reading Proficiency and Automaticity Theory
    Definition of Reading Proficiency
    Vocabulary and Matthew Effects
    Comprehension
  Relationship Between Oral Reading Fluency and Reading Proficiency in State-Mandated Assessments
    Studies of Grade 3 Only
    Studies of Reading First Schools
    Studies of Grade 3 and Other Grades
    Limitations of State Studies
  Summary

Section 3: Research Design
  Introduction
  Research Design
  Setting and Sample
    Setting
    Characteristics of the Sample
    Sampling Method
    Sample Size
    Eligibility Criteria for Study Participants
  Instrumentation and Materials
    DIBELS ORF
    TAKS
  Data Collection and Analysis
  Data Collection
  Data Analysis
  Researcher's Role
  Protection of Participants' Rights
  Summary

Section 4: Results of the Study
  Introduction
  Research Question and Hypothesis
  Research Tools
  Data Analysis
  Summary

Section 5: Discussion, Conclusions, and Recommendations
  Overview
  Interpretation of Findings
  Recommendations for Action
  Recommendations for Further Study
    Quantitative Studies
    Qualitative Studies
    Mixed-Method Studies
  Conclusion

References

Appendix A: Permission to Collect Data
Appendix B: Data Use Agreement
Appendix C: Permission to Use DIBELS
Appendix D: Permission to Use TAKS

Curriculum Vitae

List of Tables

Table 1. Characteristics of Grade 3 Students in Texas Overall and the Research Site
Table 2. Correlation of DIBELS ORF Rates and TAKS Scores for Grade 3 Students
Table 3. Ranges and Categorization Cut Points of DIBELS ORF and Grade 3 TAKS
Table 4. Comparison of Students' Performance by Categories for DIBELS ORF and Grade 3 Reading TAKS
Table 5. Studies Correlating State-Mandated Assessments to the DIBELS ORF
Section 1: Introduction to the Study

It is important for students to learn to read proficiently by the end of Grade 3. From kindergarten through Grade 2, students focus on learning to read (Ehri, 2005). The process of learning to read is complex but systematic. Many children learn to recognize letters and words even before beginning formal school (Ehri, 2005). By the end of Grade 3, students are expected not only to decode the words in a text, but also to understand what they read (National Institute of Child Health and Human Development [NICHHD], 2000). Students who struggle with reading at the end of Grade 3 often continue to fall farther behind their peers in reading proficiency (Morgan, Farkas, & Hibel, 2008). The failure to meet minimum reading proficiencies at the end of Grade 3 can have negative consequences for children's success in subsequent grades (Jimerson, Anderson, & Whipple, 2002; Katsiyannis, Zhang, Ryan, & Jones, 2007). After Grade 3, students transition from learning to read to reading to learn in other subject areas (Musti-Rao, Hawkins, & Barkley, 2009).

As a part of implementing the No Child Left Behind legislation (NCLB, 2002), the federal government commissioned the National Accessible Reading Assessment Project (NARAP, 2006) to define reading proficiency. The NARAP definition described reading as a process in which readers decode words in order to make meaning; readers understand text by using a variety of reading strategies to determine the purpose of a passage and understand the context and nature of the text. Once the NARAP established its working definition of reading proficiency, states used it to write and assess curriculum objectives for reading. In section
2, I review literature that discusses reading in the context of students' learning to read and reading proficiency.

In compliance with NCLB (2002), federal legislators mandated that school districts monitor students' progress by identifying academic needs early in school and providing scientifically based interventions. As NCLB requires, states have designed curriculum standards and annual assessments to measure the number of students who read proficiently. For example, Texas has used performance on high-stakes tests in Grade 3 to determine promotion or retention of students (Texas Education Agency [TEA], 2008). Given the legal mandates and the commitment of NCLB to leave no child behind academically, it is important that local school districts identify nonproficient and struggling readers before they fail in reading and in later grades (Jenkins, Hudson, & Johnson, 2007).

To ascertain students' reading proficiency in the early grades, researchers have established that oral reading fluency is a reliable predictor of reading proficiency (Fuchs, Fuchs, Hosp, & Jenkins, 2001; Simmons et al., 2008). Section 2 contains additional reports of researchers who found that oral reading fluency was a predictor of performance on seven state-mandated reading assessments (Baker et al., 2008; Barger, 2003; Roehrig, Petscher, Nettles, Hudson, & Torgesen, 2008; Shapiro, Solari, & Petscher, 2008; Vander Meer, Lentz, & Stollar, 2005; Wilson, 2005; Wood, 2006). Section 2 also contains reviews of literature in which reading experts disagree on the definition of fluency (Kame'enui & Simmons, 2001; Samuels, 2007). States define reading proficiency by using curriculum standards rather than the verbal definitions of
NARAP (2006), and the curriculum standards may differ from state to state (Colorado Department of Education, 2009; Florida Department of Education, 2005; TEA, 2004). Researchers have established positive correlations between oral reading fluency rates and reading proficiency (Hosp & Fuchs, 2005; Simmons et al., 2008). Additional research is needed to confirm the specific relationship between oral reading fluency assessments and other state-mandated tests of reading proficiency (Roehrig et al., 2008). The purpose of this research project was to determine whether a relationship existed between the oral reading fluency of Grade 3 students in a West Texas school district and their performance on the state-mandated assessment in Texas in the 2008-2009 school year.

Problem Statement

In Texas, the site of the current research, 33,462 students in 2009 (approximately 15% of all Grade 3 students) were unable to read proficiently enough to pass the Grade 3 Reading Texas Assessment of Knowledge and Skills (TEA, 2009c). According to the Nation's Report Card in Reading (U.S. Department of Education, 2007), 67% of the fourth-grade students who took the National Assessment of Educational Progress (NAEP) in 2007 scored at the basic level or above. However, 33% of the tested students did not read well enough to score at the basic level. The majority of students learn to read proficiently (Chard et al., 2008; Ehri, 2005). Nevertheless, many students do not reach an acceptable level of reading by Grade 3 and are considered struggling readers (Applegate, Applegate, McGeehan, Pinto, & Kong, 2009; Pressley, Gaskins, & Fingeret, 2006; Torgesen & Hudson, 2006).
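As a rough sanity check of the figures quoted above, the reported 33,462 nonpassing students being about 15% of all Grade 3 students implies a total Grade 3 test-taking population of roughly 223,000. The implied total is a back-of-the-envelope estimate, not a figure reported by TEA:

```python
# Back-of-the-envelope check of the Problem Statement figures:
# 33,462 students were about 15% of all Grade 3 students in Texas (2009),
# so the implied Grade 3 population is 33,462 / 0.15. This is only an
# estimate derived from the quoted percentage, not a TEA-reported count.
failed = 33_462
share = 0.15
total = failed / share
print(f"Implied Grade 3 population: about {round(total, -3):,.0f}")
```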
Educators have used oral reading fluency as a reliable indicator of students' progress toward overall reading proficiency (Jenkins et al., 2007). The following researchers have found significant correlations between oral reading fluency rates and reading proficiency in seven states' mandated reading assessments: Arizona (Wilson, 2005), Colorado (Shaw & Shaw, 2002; Wood, 2006), Florida (Buck & Torgesen, 2003; Roehrig et al., 2008), North Carolina (Barger, 2003), Ohio (Vander Meer et al., 2005), Oregon (Baker et al., 2008), and Pennsylvania (Shapiro et al., 2008). However, it is unknown whether a similar connection exists in the Texas school system (Roehrig et al., 2008). An investigation of this possible relationship is important for providing evidence to help elementary school administrators, teachers, and reading specialists identify at-risk readers. Early identification of struggling readers, combined with research-driven interventions, can close the gap between these readers and their proficient peers before the end of Grade 3 (Simmons et al., 2008).

Nature of the Study

To determine if a statistically significant (p < .05) relationship existed between students' oral reading fluency rates and their reading proficiency, this nonexperimental quantitative study compared the middle-of-year oral reading fluency rates and reading proficiency of 155 Grade 3 students for the 2008-2009 school year in a West Texas school district. The students resided in a small West Texas town with a population of 6,821, in which all Grade 3 students attended the same elementary school. The demographics of the school district are similar to those of all Grade 3 students in Texas
(TEA, 2009c). To obtain as wide a range of scores as possible, all of the Grade 3 students in this district comprised the nonprobability convenience sample.

The Dynamic Indicators of Basic Early Literacy Skills Oral Reading Fluency (DIBELS ORF) served as the independent variable, measuring students' oral reading fluency rates. The developers of DIBELS ORF found the assessments reliable and valid (Good & Kaminski, 2002a). Additionally, other researchers have found DIBELS ORF rates reliable (Baker et al., 2008; Buck & Torgesen, 2003; Roehrig et al., 2008; Shaw & Shaw, 2002; Vander Meer et al., 2005; Wilson, 2005; Wood, 2006). However, some researchers have disagreed with the definition of fluency used by the developers of DIBELS (Hudson, Lane, & Pullen, 2005; Young & Rasinski, 2009) and with the reliability and validity of the assessment (Samuels, 2007). Section 2 contains a review of the literature on DIBELS ORF, including this controversy.

The Grade 3 Reading Texas Assessment of Knowledge and Skills (TAKS) scale scores (TEA, 2009b) served as the dependent variable, measuring students' reading proficiency. The TEA, in conjunction with Pearson Education, established the reliability and validity of the TAKS (TEA & Pearson, 2008). Staff members of these organizations have worked regularly with teachers in Texas and national testing experts to ensure that the TAKS is a quality assessment. Further information describing the reliability and validity of the TAKS is included in section 3.

To address the study problem, I used the middle-of-year 2008-2009 DIBELS ORF benchmark rates of the Grade 3 sample from January 2009 and the scale scores of the 2009 Grade 3 Reading TAKS. I applied the Statistical Package for the Social Sciences
(SPSS, 2009) software program, version 17.0, and conducted a Pearson correlation analysis to determine if there was a statistically significant relationship (p < .05) between oral reading fluency rates as measured by DIBELS ORF and reading proficiency as measured by the scale score of the Grade 3 Reading TAKS.

Research Question and Hypothesis

The following research question guided this study: Is there a statistically significant relationship between Grade 3 students' oral reading fluency rates and their reading proficiency? From this question, the following null and alternative hypotheses were formulated:

H0: There is no statistically significant relationship between students' oral reading fluency rates, as measured by the students' middle-of-year DIBELS ORF rates, and their reading proficiency, as measured by their scale scores on the Grade 3 Reading TAKS for the 2008-2009 school year.

H1: There is a statistically significant relationship between students' oral reading fluency rates, as measured by the students' middle-of-year DIBELS ORF rates, and their reading proficiency, as measured by their scale scores on the Grade 3 Reading TAKS for the 2008-2009 school year.

Section 3 contains additional information regarding the research question, hypotheses, methodology, and measurements.

Purpose of the Study

The purpose of this nonexperimental, quantitative study was to determine whether a statistically significant relationship (p < .05) existed between Grade 3 students' oral
reading fluency rates and their reading proficiency. With a nonprobability convenience sample of 155 Grade 3 students in a West Texas school district, I sought to determine if a statistically significant relationship existed between oral reading fluency rates and students' reading proficiency on the summative, high-stakes reading assessment for the 2008-2009 school year. The independent variable was oral reading fluency, and the dependent variable was reading proficiency. The outcome of the study identified additional ways in which elementary teachers can screen for struggling readers so that interventions can take place to help these students increase their reading ability.

Theoretical Framework

The automaticity theory developed by LaBerge and Samuels (1974) and expanded upon by Samuels (2006) formed the theoretical basis of this study. The automaticity theory explains the relationship between fluency and comprehension. When initially proposing the automaticity theory, LaBerge and Samuels found that accurately reading words alone was not sufficient for reading proficiency. They posited that for students to devote attention to other aspects of reading, they must first read words accurately and fluently. In 2006, Samuels identified four components of the reading process: decoding, comprehension, metacognition, and attention; the last he viewed as limited. When students spend too much time trying to sound out words, they cannot comprehend what they have read by the end of a line or a page. Unless students are able to fluently decode the words in a text, they do not have sufficient attention to devote to comprehending what is read or to using metacognitive strategies to improve their comprehension. This theory
explains why slow readers who may accurately identify all or most of the words in a passage may still not read proficiently enough to pass state-mandated reading assessments.

Several researchers have tested the automaticity theory and found a relationship between oral reading fluency and reading proficiency in general (Baker et al., 2008; Daane, Campbell, Grigg, Goodman, & Oranje, 2005; Kuhn, 2005; Morgan & Sideridis, 2006; Riedel, 2007). However, scholars disagree on the definition of oral reading fluency. Some define oral reading fluency as rate and accuracy alone (Kame'enui & Simmons, 2001; Riedel, 2007), and others include prosody and comprehension (Allington, 2009; Samuels, 2006). I will further discuss the automaticity theory, including controversies in the literature, in section 2.

Operational Definitions

The following terms as defined were used throughout this study.

Automaticity theory: The four-component process by which proficient readers decode words fluently, enabling them to focus attention on higher comprehension skills of the passages read (Samuels, 2006).

Dynamic Indicators of Basic Early Literacy Skills (DIBELS): Short, 1-minute assessments that educators use to measure the development of early literacy skills. These assessments include oral reading fluency, retell fluency, and nonsense word fluency (Good & Kaminski, 2002b).

Matthew effects: The gap between proficient readers and struggling readers. Matthew effects are based on the Biblical concept found in Matthew 25:29, a verse which refers to
the rich becoming richer and the poor becoming poorer. In the context of reading, this term indicates that students who read consistently increase their knowledge base and vocabulary, whereas readers who read less will learn less and continue to be struggling readers. The gap between proficient readers and struggling readers widens as they progress through school (Morgan et al., 2008; Stanovich, 1998).

Oral reading fluency (ORF): The ability to read words accurately and quickly (Fuchs et al., 2001; Roehrig et al., 2008) with proper expression (Eldredge, 2005; Hudson et al., 2005) and comprehension (Marcell, 2007; Samuels, 2007). In the current study, Grade 3 students' DIBELS ORF rates were used as the independent variable (Good & Kaminski, 2002b).

Proficiency level: A scale score of 2100 or more on the Grade 3 Reading TAKS (TEA, 2009b). Of the 36 questions, proficient readers answer at least 24 correctly to demonstrate proficiency.

Reading proficiency: The ability of a reader to decode the words in a passage and comprehend what it says. Proficient readers employ a variety of strategies to comprehend a passage, such as determining the purpose for reading, using context clues and the nature of the passage, and using background knowledge (Cline, Johnstone, & King, 2006). Proficient readers demonstrate a basic understanding of reading, apply knowledge of literary elements, use a variety of strategies to analyze a passage of text, and apply critical-thinking skills to analyze a passage by scoring 2100 or more on the Grade 3 Reading TAKS (TEA, 2004). In the current study, I used students' scale scores from the Grade 3 Reading TAKS as the dependent variable to measure reading proficiency.
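The analysis described in this section, a Pearson correlation between DIBELS ORF rates and TAKS scale scores tested at p < .05, was run in SPSS 17.0. As an illustration only, the same coefficient can be sketched in plain Python; the ten score pairs below are invented placeholders, not the study's 155-student data set:

```python
# Illustrative sketch of the study's statistical technique: a Pearson
# correlation between oral reading fluency rates (words correct per
# minute) and TAKS scale scores. The data are invented placeholders,
# NOT the study sample; the actual analysis was conducted in SPSS 17.0.
import math

orf_rates = [52, 71, 88, 95, 104, 112, 120, 133, 141, 158]          # hypothetical DIBELS ORF rates
taks_scores = [1980, 2040, 2110, 2150, 2190, 2230, 2260, 2310, 2370, 2440]  # hypothetical TAKS scale scores

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(orf_rates, taks_scores)
# For n = 10 pairs (df = 8), the two-tailed critical value of r at
# p = .05 is approximately .632 (standard critical-values table).
print(f"r = {r:.3f}; significant at p < .05 (n = 10): {abs(r) > 0.632}")
```

In practice, a library routine such as SciPy's `pearsonr` would also return the exact two-tailed p-value rather than relying on a critical-values table.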
Scale score: In 2009, the TEA measured performance on the TAKS using a scale score. A raw score of zero corresponded to a scale score of 1399, and a perfect raw score of 36 corresponded to a scale score of 2630. Students who achieved a scale score of 2100 or above were considered proficient; students who achieved a scale score of 2400 or above received commended performance (TEA, 2009b).

Struggling readers: Struggling readers have at least average intelligence, but they read more slowly than other readers their age and may be at risk for long-term reading difficulties (Pressley et al., 2006).

Texas Assessment of Knowledge and Skills (TAKS): The TAKS is a state-mandated, summative, high-stakes assessment in Texas that assesses specific proficiencies in Grades 3 to 11 in accordance with the state curriculum (TEA, 2009a). The TAKS is scored with a scale score, defined above. For this study, I used the scale score of the Grade 3 Reading TAKS to measure the dependent variable of reading proficiency (TEA, 2006, 2009e).

Assumptions, Limitations, Scope, and Delimitations

In this study, I made four assumptions. First, the DIBELS ORF was a reliable measure of oral reading fluency. Second, the Grade 3 Reading TAKS was a reliable measure of reading proficiency. Third, the Grade 3 students' DIBELS ORF rates and Grade 3 Reading TAKS scale scores reported for the target West Texas school district were accurate and reliable. Finally, the statistical method chosen for this study was appropriate.
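The 2009 scale-score cut points defined above (a range of 1399 to 2630, with 2100 for proficient and 2400 for commended performance) can be expressed as a small helper function. This is an illustrative sketch of the categorization rule only, not TEA's scoring software:

```python
# Illustrative helper based on the 2009 Grade 3 Reading TAKS scale-score
# cut points described above: 1399 (raw score of 0) to 2630 (perfect raw
# score of 36), with 2100 = proficient and 2400 = commended performance.
# A sketch for clarity, not TEA scoring code.
def taks_category(scale_score: int) -> str:
    if not 1399 <= scale_score <= 2630:
        raise ValueError("Grade 3 Reading TAKS scale scores range from 1399 to 2630")
    if scale_score >= 2400:
        return "commended"
    if scale_score >= 2100:
        return "proficient"
    return "not proficient"

print(taks_category(2150))  # a score of 2150 meets the 2100 proficiency cut point
```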
24. 24. 11 I acknowledged four limitations for this study. The first was the use of a nonprobability convenience sample, and the second was the limitation of the study population to students in Grade 3. Because of these factors, the results may not generalize to students in other elementary grades. Third, the study took place in a single geographical location, a small town in West Texas. Consequently, the results may not generalize to other locations in Texas or other states. Finally, quantitative correlational analysis does not prove causal relationships between variables. In correlational studies, a statistical procedure may establish a relationship between two variables, but neither the method nor the results prove that one variable causes another (Gravetter & Wallnau, 2005). Thus, study results cannot prove that low fluency rates cause poor reading comprehension; rather, the results indicate whether a statistically significant relationship exists between the two variables. The scope of the study focused on the oral reading fluency rates and reading proficiency of Grade 3 students, and I limited the study to 155 Grade 3 students in a West Texas elementary school district. I investigated only the two variables of oral reading fluency and reading proficiency for a single school year, 2008-2009. Each variable was operationalized by a single assessment: the DIBELS ORF for oral reading fluency and the state-mandated Grade 3 Reading TAKS for reading proficiency. Significance of the Study This study was significant in several ways. With regard to the literature, the study filled a gap that existed regarding students’ oral reading fluency and reading proficiency.
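The correlational analysis described above can be illustrated with a short sketch of the Pearson product-moment correlation coefficient. The data below are invented solely for illustration; the actual study analyzed 155 students' paired scores. The function implements the standard Pearson r formula.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists.
    r = cov(x, y) / (sd_x * sd_y); the 1/n factors cancel, so raw sums
    of cross-products and squared deviations are used directly."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical (invented) data: middle-of-year DIBELS ORF rates in words
# correct per minute, paired with end-of-year TAKS scale scores.
orf = [45, 60, 72, 88, 95, 110, 120]
taks = [1900, 2050, 2150, 2200, 2250, 2380, 2450]

r = pearson_r(orf, taks)  # a strong positive correlation for this toy data
```

A significance test (e.g., against p < .05, as in the study) would additionally require the sampling distribution of r, which is omitted here for brevity.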
25. 25. 12 Researchers in seven states (Baker et al., 2008; Barger, 2003; Roehrig et al., 2008; Shapiro et al., 2008; Vander Meer et al., 2005; Wilson, 2005; Wood, 2006) used DIBELS ORF rates to determine if there was a significant relationship between oral reading fluency rates and reading proficiency as measured by their states’ mandated reading assessment. Although each of these studies found that oral reading fluency rates correlated with reading proficiency, additional studies were needed for other states because the curriculum standards that define reading proficiency differ from state to state (Roehrig et al., 2008). This study filled a gap in the literature on the assessment of struggling readers by determining if such a relationship exists with Grade 3 students in Texas. In addition, with regard to professional application of the study, the results can help teachers in Texas. NCLB (2002) mandates that school districts must make adequate yearly progress in students’ reading scores. When a school or district has not met adequate yearly progress for 2 consecutive years, state education agencies are required to impose corrective actions, such as removing the principal, replacing the staff, bringing in outside experts for advice, or requiring the implementation of an entirely new curriculum. The study results can help Texas educators identify struggling readers before the administration of the Grade 3 Reading TAKS. Educators can then provide interventions designed to improve basic reading skills, such as decoding, fluency, vocabulary, and comprehension skills. Such interventions may improve struggling readers’ chances of scoring in the proficient range on the state-mandated assessment (Simmons et al., 2008).
26. 26. 13 In turn, the school districts may increase the likelihood of meeting adequate yearly progress. Study results may also contribute to positive social change. Elementary educators may more easily identify struggling readers so that interventions can take place before students take the Grade 3 Reading TAKS. Such interventions would target basic literacy skills and include greater oral reading fluency and decoding strategies to improve struggling readers’ reading proficiency (Jenkins et al., 2007). Diagnosing struggling readers and implementing skills interventions help reduce the risk of students failing the state tests and increase their likelihood of academic success (Ehri, 2005; Shapiro et al., 2008). Struggling readers who are identified and remediated before the end of Grade 3 are more likely to improve their reading skills, thus helping to close the academic gap with more proficient peers (Simmons et al., 2008). When struggling readers experience success in reading, they are more likely to continue academic achievement through the grades and graduate from high school (Houge, Peyton, Geier, & Petrie, 2007; Rumberger & Palardy, 2005). Graduation from high school with proficient reading skills increases an individual’s opportunities for employment and contribution to society (Katsiyannis et al., 2007). Transition Statement It is important for students to learn to read proficiently by the end of Grade 3 (Ehri, 2005; NCLB, 2002). Nationwide, most students learn to read before the fourth grade (U.S. Department of Education, 2007); however, between 15% (TEA, 2009c) and 33% (U.S. Department of Education, 2007) of students do not read proficiently.
27. 27. 14 Researchers in Arizona (Wilson, 2005), Colorado (Wood, 2006), Florida (Roehrig et al., 2008), North Carolina (Barger, 2003), Ohio (Vander Meer et al., 2005), Oregon (Baker et al., 2008), and Pennsylvania (Shapiro et al., 2008) have found that oral reading fluency rates correlated with reading proficiency as assessed on their states’ mandated reading assessment. The two variables have been studied only in these seven states. This nonexperimental quantitative study, based on the automaticity theory (LaBerge & Samuels, 1974) and using Pearson correlation, filled a gap in the literature by determining, with a sample of 155 Grade 3 students in Texas, whether a statistically significant relationship (p < .05) existed between their oral reading fluency rates (middle-of-year DIBELS ORF) and their reading proficiency (end-of-year TAKS). Study findings may result in positive social change by providing Texas educators with a valuable tool for identifying struggling readers in Grade 3 (Simmons et al., 2008). Once struggling readers are identified, educators can provide scientifically based reading interventions to strengthen phonemic awareness, phonics, fluency, vocabulary, and comprehension skills (Chard et al., 2008). The identification and assistance of struggling readers can help them improve academically and increase their success later in life (Katsiyannis et al., 2007; Shaw & Berg, 2009). In Section 2, I review pertinent literature on the automaticity theory, oral reading fluency, reading proficiency, and related concepts. In Section 3, I describe the study methodology in greater detail. Aspects include the sample, instruments, protection of participants’ rights, data collection, and data analysis. In Section 4, I report the study findings. In Section 5, I analyze the findings and discuss data from the study in light of
28. 28. 15 the previous studies that compared DIBELS ORF rates with student performance on state-mandated assessments. Additionally, in Section 5 I make recommendations regarding future actions and ideas for future studies.
29. 29. 16 Section 2: Literature Review Introduction In this section, I review literature pertinent to this study in terms of the theory and studies relevant to the variables. The section is organized as follows: (a) automaticity theory; (b) oral reading fluency, including definitions, the role of oral reading fluency in learning to read, and fluency instruction; (c) reading proficiency, including reading proficiency and automaticity theory, definitions of reading proficiency, vocabulary and Matthew effects, and comprehension; and (d) the relationship between oral reading fluency and reading proficiency in state-mandated assessments, including studies of Grade 3 only, of Reading First schools, and of Grade 3 and other grades, as well as limitations of these studies. The section concludes with a summary. To investigate the literature for this study, I searched electronic databases, such as Academic Search Premier, EBSCO, Dissertations and Theses, and ERIC, using the keywords oral reading fluency/ORF, phonemic awareness, phonics, reading comprehension, reading proficiency, and vocabulary. I also searched the bibliographies of relevant articles for additional peer-reviewed literature. In addition, I read several recently published books on the subject of oral reading fluency. Automaticity Theory LaBerge and Samuels (1974) used the automaticity theory to explain the relationship between fluency and reading proficiency. They acknowledged that reading is a complex skill that takes years to develop. Fluent readers are able to process the necessary information in less than a second. However, some readers never reach a level at
30. 30. 17 which they fluently read a text, even though they are able to adequately communicate orally. Reading is a complex skill with many subskills, in which readers must recognize letters, spelling patterns, and words as well as attach meanings to words in context. According to this theory, attention is a limited faculty, and when students are beginning to learn to read, their attention is focused on factors other than meanings, such as shapes of letters (LaBerge & Samuels, 1974). In prekindergarten and kindergarten, students must initially focus on the shapes within a letter, the length of lines, and the direction letters are facing. As beginning readers, students have great difficulty distinguishing between letters. In addition, at first, students focus their attention on recognizing and naming letters and do not have enough attention remaining to focus on the sounds. In addition to letter naming, LaBerge and Samuels (1974) found that a certain amount of attention devoted to other subskills is also necessary for the development of fluent reading. Once beginning readers automatically recognize letters, they must learn to associate sounds with the letters. Then they can focus their attention on reading parts of words and entire words. To test their theory of automaticity, LaBerge and Samuels (1974) designed an experiment with eight college students. They measured the time the students took to identify patterns of familiar and unfamiliar letters. The students were able to accurately identify patterns of familiar letters more quickly than those of unfamiliar letters. Next, the students received training on the unfamiliar letters. At the end of a 20-day period, they were able to recognize the unfamiliar letters more accurately although still not as fast as
31. 31. 18 the familiar letters. LaBerge and Samuels concluded that some degree of attention was still needed for the students to make the association for unfamiliar letters. From these results, LaBerge and Samuels (1974) expressed concerns regarding their observations of reading instruction. They had found that teachers often directly teach letter names until students are able to correctly identify the letters. Then teachers move to direct instruction in other aspects of reading, such as letter sounds and blending. However, students may still not be able to fluently identify letters. Because their attention is focused on this task, they may not have adequate attention to devote to learning sounds and how to blend them together to make words. Thus, LaBerge and Samuels recommended that teachers continue teaching and testing letter naming until students reach a level of automaticity. The researchers maintained that only when students can automatically recognize letters can they focus a significant amount of attention on the sounds the letters make. In 1974, when LaBerge and Samuels first proposed the automaticity theory, they focused on automaticity at the word level. Their study showed how readers who had not learned to decode words automatically would have trouble comprehending. At that time, LaBerge and Samuels did not conceptualize that automaticity in other areas was also important. However, as researchers continued to develop the automaticity theory, they recognized that metacognition also played an important role in comprehension and that aspects of metacognition could be automaticized (Samuels, Ediger, Willcutt, & Palumbo, 2005). Metacognition is the readers’ awareness of what is being read and whether or not
32. 32. 19 comprehension is taking place. Readers who use metacognition monitor whether they are understanding what the passage says. Factors such as readers’ motivation to read, their attitudes and beliefs about their reading ability, their interest in the topic, and the amount of attention they devote to reading can affect how much they comprehend. Distractions, such as the environment, noises, or other thoughts, can also affect comprehension. Metacognition also involves the insight readers bring to a text derived from background knowledge. When readers read something with which they are highly familiar, gaining new insights and adding to their cognitive repertoire of knowledge seems easy (Samuels et al., 2005). However, when readers know very little about the topic, they may have to work harder to comprehend the passage. Good readers learn to recognize when cognitive aspects interfere with their ability to comprehend and to make adjustments. Samuels et al. recommended that teachers instruct readers to be aware of such factors by modeling thinking strategies and measuring students’ use of them through rubrics until students automatically use metacognition strategies when reading. According to automaticity theory, students who have low fluency rates struggle with reading proficiency (Goffreda, DiPerna, & Pedersen, 2009). In order to read proficiently, students must be able to recognize words in context and use a variety of strategies to comprehend a passage. Samuels (2006) summarized the automaticity theory by identifying four components as elements of the relationship between oral reading fluency and reading proficiency: decoding, comprehension, metacognition, and attention.
33. 33. 20 First, a reader must decode the words in a text (LaBerge & Samuels, 1974). Beginning or struggling readers may devote so much attention to decoding words that comprehension is difficult, if not impossible (Samuels, 2006). However, proficient readers are able to decode words and quickly attach meaning to them (Shaywitz, 2003). Skilled readers can read a word, and within 150 milliseconds, less time than a single heartbeat, the brain has attached meaning to the word. However, beginning and struggling readers process reading differently. Whereas skilled readers see whole words, beginning and struggling readers, such as dyslexics, may see only a few letters or sounds at a time (Samuels, 2006). In 1974, LaBerge and Samuels could only presume that oral reading fluency would affect comprehension. By 2006, Samuels had identified comprehension as another component of the reading process. Fluent readers spend less attention on decoding words and therefore can devote more attention to comprehending. Proficient readers combine the information in a text with their background knowledge and critically analyze the text. Samuels believed that readers must decode automatically before they can devote sufficient attention to comprehension. Samuels’ (2006) next component of the reading process is metacognition, or one’s awareness of one’s own thought processes. Proficient readers use metacognition when they do not understand a text and make adaptations so they can comprehend it. Walczyk and Griffith-Ross (2007) found that readers can use metacognition to recognize when they do not comprehend and adapt by using reading strategies such as slowing down, reading out loud, rereading, or sounding out difficult words. Some readers use
34. 34. 21 metacognition to improve their use of reading strategies until the strategies become automatic. For example, readers can learn the skill of making inferences and use this skill repeatedly until they automatically infer meaning while reading a passage. Proficient readers use metacognition as they read and analyze text (Samuels, 2006). Samuels’ (2006) final component of the reading process is attention. This is an outcome of the previous three components: decoding, comprehension, and metacognition. Beginning readers use excessive attention to decode words and have insufficient attention remaining for comprehending or thinking about reading strategies they could use. As readers become more proficient, they decode words automatically and devote more attention to comprehension and metacognition. According to the automaticity theory, readers are not proficient until they can apply their attention to decode, comprehend, and use metacognition at the same time (Samuels, 2006). Oral Reading Fluency Originally, LaBerge and Samuels (1974) used the automaticity theory to explain why low reading fluency rates affect reading proficiency. The degree of expression can indicate students’ understanding of the passage (Samuels, 2006). Most researchers studying oral reading fluency and reading proficiency agree that the two are related (Daane et al., 2005; Deeney, 2010; Miller & Schwanenflugel, 2006). Researchers disagree, however, on how fluency is defined and on the validity of the assessments used to measure it. To measure oral reading fluency, researchers have developed assessments such as AIMSweb (Edformation, 2004), the DIBELS (Good & Kaminski, 2002a), the Reading Fluency Monitor (Read Naturally, 2002), and the
35. 35. 22 Texas Primary Reading Inventory (University of Houston, 1999). In particular, Reading First schools have widely used the DIBELS with more than 1,800,000 children (Allington, 2009; Baker et al., 2008; Glenn, 2007; Manzo, 2005). Some researchers have claimed that political reasons may have motivated the widespread use of the DIBELS (Allington, 2009; Goodman, 2006; Manzo, 2005; Pearson, 2006; Pressley, Hilden, & Shankland, 2005; Riedel, 2007). One of the developers of the DIBELS, Good of the University of Oregon, served on national committees that developed Reading First. Critics (e.g., Glenn, 2007) have alleged that Good personally benefited when applications for Reading First grants were denied and schools felt pressured by federal officials and consultants to include the DIBELS in their grant applications (Manzo, 2005). However, despite such claims, researchers recognize the DIBELS as a respected way of measuring oral reading fluency (Baker et al., 2008; Riedel et al., 2007). Definition of Oral Reading Fluency Many researchers have focused on the definition of fluency used by the developers of the DIBELS, although definitions vary (Hudson et al., 2005; Samuels, 2006). A fluent reader can read text with speed, accuracy, and proper expression (NICHHD, 2000). Worthy and Broaddus (2002) defined fluency as “not only rate, accuracy, and automaticity, but also of phrasing, smoothness, and expressiveness” (p. 334). Effective readers do more than just read the words; they understand what they read (Marcell, 2007). Hudson et al. included accuracy, rate, and prosody in their definition of fluency. Samuels (2007) and others (Allington, 2009; Miller & Schwanenflugel, 2008;
36. 36. 23 Rasinski, 2006) contended that educators should consider prosody, or the expression with which students read a passage. Fuchs et al. (2001) found that prosody was difficult to measure, so they chose to focus on rate and accuracy. The developers of the DIBELS defined fluency as rate and accuracy (Good & Kaminski, 2002a). Various researchers have asserted that the definition of fluency should include comprehension (Kuhn, 2005; Marcell, 2007; Pikulski & Chard, 2005; Samuels, 2007). Pikulski and Chard (2005) defined fluency as follows: Reading fluency refers to efficient, effective word-recognition skills that permit a reader to construct the meaning of text. Fluency is manifested in accurate, rapid, expressive oral reading and is applied during, and makes possible, silent reading comprehension. (p. 510) Samuels (2006) not only agreed that the definition of fluency should include comprehension but also stated that measures of fluency should assess reading and comprehension at the same time. Samuels noted that beginning readers focus first on decoding, and once they are able to decode automatically, they focus on comprehension. Samuels emphasized that the automaticity theory, which he and LaBerge developed (LaBerge & Samuels, 1974), requires students to decode words automatically so that they can comprehend. He pointed out that educators who assess fluency first and comprehension later, or with a different assessment, miss the point of automaticity. Deeney (2010) included endurance in her definition of fluency. She agreed that the definition of fluency consists of four components: accuracy, rate or speed, prosody, and comprehension. However, in her work with students she saw the need to consider
37. 37. 24 endurance. In her opinion, 1-minute fluency probes provide useful information for identifying which students are struggling. However, Deeney pointed out that these probes did not provide adequate information for determining why students struggle and what can be done to address their academic needs. She agreed with Pikulski and Chard (2005), who called for a deeper view of fluency. Deeney believed that such a deeper view includes rate, accuracy, prosody, comprehension, and endurance. The developers of the DIBELS (Kame’enui & Simmons, 2001) and others (Francis et al., 2005; Harn, Stoolmiller, & Chard, 2008; Hasbrouck & Tindal, 2006; Jenkins et al., 2007; Katzir et al., 2006) agreed with the definition by Roehrig et al. (2008) of oral reading fluency as “accuracy and rate in connected text, or correct words per minute” (p. 345). Reading is a complex skill (Fuchs et al., 2001) that includes various components. Kame’enui and Simmons discussed the simplicity and complexity of oral reading fluency. They cited Watson (1968), the codiscoverer of the structure of DNA, who stated, “The idea is so simple it had to be true” (as cited in Kame’enui & Simmons, p. 203). Like DNA, fluency is a complex skill with many components, including phonemic awareness, alphabetic principle, word reading, expression, and comprehension. Fluency is also easily recognized. For example, a person listening to struggling readers decoding words sound by sound can easily recognize that they are not reading fluently. Kame’enui and Simmons (2001) emphasized that even though educators have debated the exact definition of oral reading fluency for decades, definition is not the major point. Rather, as LaBerge and Samuels (1974) recognized when they developed the automaticity theory, reading proficiency is a complex skill with many different aspects.
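The working definition cited above, “accuracy and rate in connected text, or correct words per minute” (Roehrig et al., 2008), reduces to a simple computation. The sketch below is an illustration only; it simplifies actual DIBELS scoring conventions (e.g., rules for hesitations and self-corrections are omitted) and assumes the typical 60-second probe.

```python
def words_correct_per_minute(words_attempted: int, errors: int,
                             seconds: float = 60.0) -> float:
    """Correct words per minute: (words attempted - errors), scaled to
    one minute. Fluency probes of this kind are typically timed at 60 s."""
    if errors > words_attempted:
        raise ValueError("errors cannot exceed words attempted")
    return (words_attempted - errors) * 60.0 / seconds
```

For example, a student who attempts 98 words with 6 errors during a 1-minute probe reads 92 words correct per minute; that rate could then be compared against grade-level norms such as those of Hasbrouck and Tindal (2006).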
38. 38. 25 Although LaBerge and Samuels focused only on rate and accuracy and the subskills of letter and word recognition in 1974, they recognized that other aspects are involved. When educators use only rate and accuracy to define fluency, they are measuring only a subskill of fluency. Different assessments can be used to measure other subskills of reading proficiency. Skills such as phonemic awareness, phonics, and word reading are necessary components of fluency that enable students to read passages expressively and with understanding. However, by defining fluency as accuracy and rate, researchers gain a measurable, workable definition to analyze the progress of beginning readers toward the goal of adequately comprehending what they read. The National Reading Panel (NICHHD, 2000) identified fluency as one of the main components of early reading instruction. Hasbrouck and Tindal (2006) analyzed thousands of students’ oral reading fluency rates in order to develop norms for Grades 1 through 8. Educators can listen to a student read for as little as 1 minute and compare the fluency rate with those of thousands of other students at the same grade level at the beginning, middle, or end of the school year. At the conclusion of their research, Hasbrouck and Tindal recognized the complexity of reading and recommended that educators consider fluency in the context of the other components necessary for mastery. In the present study, I defined oral reading fluency as the accuracy and rate at which students read a grade-level text (Francis et al., 2008; Harn et al., 2008; Hasbrouck & Tindal, 2006; Jenkins et al., 2007; Katzir et al., 2006). This definition provided a specific way to measure oral reading fluency so that I could determine if a relationship
39. 39. 26 existed between oral reading fluency and reading proficiency. Although adopting the limited definition of accuracy and rate (Roehrig et al., 2008), I also acknowledged that prosody plays an important role in reading and fluency. However, because prosody is difficult to measure (Fuchs et al., 2001; Schwanenflugel et al., 2006), I chose to use the more measurable definition of oral reading fluency. I also acknowledged that oral reading fluency is more than reading quickly. Proficient readers are also able to comprehend what they read, as reflected in the research question of whether a relationship exists between oral reading fluency and reading proficiency, as measured by Grade 3 students’ DIBELS ORF and the Grade 3 Reading TAKS. The Role of Oral Reading Fluency in the Context of Learning to Read As readers begin to read, fluency has a significant role (Harn et al., 2008). Fluency for educators may be compared to temperature readings for physicians (Hasbrouck & Tindal, 2006). A physician considers a patient’s temperature, and if it is above normal, the physician looks for possible causes and then determines an appropriate treatment. In a similar way, educators examine oral reading fluency. If a student’s oral reading fluency rate is below average, educators look for possible causes and then determine appropriate interventions (Hasbrouck & Tindal). Ehri (2005) studied the process by which students learn to read words. She identified five phases: (a) the prealphabetic phase, (b) the partial-alphabetic phase, (c) the full-alphabetic phase, (d) the consolidated-alphabetic phase, and (e) the automatic-
40. 40. 27 alphabetic phase. These phases describe the process that beginning readers engage in as they develop the skill of reading. Phonemic awareness is one of the first skills beginning readers must learn. Ehri (2005) discussed phonemic awareness in her first two phases: the prealphabetic and partial-alphabetic phases. Beginning readers must learn to recognize that words they use in their spoken vocabulary are constructed of individual sounds (Ehri & McCormick, 1998). Katzir et al. (2006) demonstrated the role of phonemic awareness in fluency. They concluded that readers need phonemic awareness the most when decoding words they have not seen before. After readers have developed a certain level of fluency, the brain focuses less on phonemic awareness and more on other components of fluency, such as the speed at which readers are able to recognize letter patterns or word units. Researchers have also found phonemic awareness in kindergarten to be a predictor of oral reading fluency in later years (Katzir et al.). As students learn phonemes, they begin to learn phonics, associating the phonemes with letters, or graphemes (Ehri, 2005; Ehri & McCormick, 1998). Although beginning readers focus on phonemic awareness, they deal only with the sounds of language. When they begin to understand that letters represent sounds, they use phonics. In Ehri’s second phase, the partial-alphabetic phase, children use the first and last letters when identifying words. Children then learn to master the ability to sound out words. Approximately halfway through first grade, most readers enter what Ehri described as her third phase, the full-alphabetic phase. During the first two phases, readers laboriously sound out words. As they move into the full-alphabetic phase, they begin to
41. 41. 28 recognize that certain letters work together to make certain sounds. Recognizing more of these letter patterns, students are able to read faster. Over time, students learn to recognize more letter patterns and words, allowing them to read more words correctly each minute (Ehri & McCormick). Researchers have established a relationship between the knowledge of phonetic elements and fluency (Chard et al., 2008; Eldredge, 2005; Harn et al., 2008). Chard et al. documented that students who had demonstrated mastery of the alphabetic principle by the end of Grade 1 were more fluent readers than their struggling peers at the end of Grades 2 and 3. The readers who struggled with the alphabetic principle in Grade 1 did not progress at the same rate as their peers who had mastered the alphabetic principle. Chard et al. emphasized that it is critical for teachers to ensure that students have a good grasp of phonetic knowledge by the end of Grade 1, because phonetic knowledge serves as a predictor of how fluently they will read in later years. Mastery at the full-alphabetic phase is essential before moving to the next two stages, the consolidated-alphabetic and the automatic-alphabetic phases. Usually during Grade 2, students progress to Ehri’s (Ehri & McCormick, 1998) fourth phase, the consolidated-alphabetic phase. Readers begin to recognize more combinations of letters within words (Harn et al., 2008). They begin to read words syllable by syllable as well as to recognize prefixes and suffixes in words. This ability to identify groups of letters helps to facilitate fluent reading as readers recognize more words by sight (Ehri, 2005). However, readers may not completely develop the concepts of the full-alphabetic phase until Grade 8 (Ehri & McCormick). In fact, some students
42. 42. 29 learn to read and progress well until approximately Grade 4, when they begin to read words with four or more syllables. As Ehri pointed out, educators may need to provide further instruction to help these students see the syllabic units in larger words. Fluency is the final phase, the automatic-alphabetic phase, in Ehri’s (Ehri & McCormick, 1998) developmental sequence to early reading proficiency. During this phase, students are able to read most words quickly and efficiently. When they encounter a word they do not know, they use one or more of several methods they have developed to ascertain meaning. Speece and Ritchey (2005) established that oral reading fluency predicts reading comprehension. In their study with students in Grade 1 and Grade 2, a significant gap was found between the oral reading fluency rates of students at risk for reading difficulty and their peers who were not at risk. The researchers concluded that growth in oral reading fluency in Grade 1 uniquely predicted students’ Grade 2 gains in reading mastery and state assessment achievement. Fluency Instruction Oral reading fluency has been correlated with overall reading competence (Fuchs et al., 2001), and studies have confirmed several strategies to improve reading fluency. When students repeatedly read a passage, their fluency increases (Begeny, Daly, & Valleley, 2006; Hiebert, 2005; Martens et al., 2007; Therrien, Gormley, & Kubina, 2006; Therrien & Hughes, 2008). The repeated reading strategies can be implemented in a variety of ways, including readers’ theater, in which students repeatedly read a script as they prepare to perform the reading for an audience (Corcoran & Davis, 2005; Rasinski,
43. 43. 30 2006). Although studies have found repeated readings effective, Kuhn (2005) found that wide reading is just as effective, and Morgan and Sideridis (2006) found motivational strategies were more effective than repeated readings. In addition, other strategies have improved fluency. Nes Ferrera (2005) found that pairing struggling readers with more experienced readers helped the struggling readers to read more fluently. Rasinski and Stevenson (2005) found parents an effective resource when they helped their at-risk students by working daily at home with them on reading. Reading Proficiency Reading proficiency involves more than mechanical reading of a passage. Based on their review of the literature, the National Reading Panel (NICHHD, 2000) captured the essence of reading by identifying its five most important components: phonemic awareness, phonics, fluency, vocabulary, and reading comprehension. Reading Proficiency and Automaticity Theory Several researchers (Baker et al., 2008; Kuhn, 2005; Morgan & Sideridis, 2006; Therrien & Hughes, 2008) have confirmed a relationship between reading proficiency and automaticity. Baker et al. found that the DIBELS ORF rates of 4,696 Grade 3 students in Oregon significantly predicted their performance on the state-mandated assessment. Students who read fluently also comprehended well enough to attain scores of proficient on the state assessment. Kuhn (2005) also confirmed the automaticity theory in a study of 24 Grade 2 students in a southeastern United States city with four student groups. The first group participated in the repeated reading condition, in which students read the same story
repeatedly over a 3-day period using strategies such as modeling, repetition, choral reading, and pair reading. Over the 6-week period, this procedure was followed with six books. The second group participated in the nonrepetitive reading condition, in which students read each book one time. The students read the same six stories as the ones used in the repeated reading condition, plus an additional 12, so that a new story was available at each session. The third group participated in the listening-only condition, in which the researcher expressively read the books aloud to the students. Over the 6-week period, these students listened to the same 18 stories that the students in the nonrepetitive group read. The fourth group was the control group, in which students received no interventions outside of the normal classroom instruction (Kuhn, 2005).

Kuhn (2005) found that the repeated reading and nonrepetitive reading interventions helped students decode words more effectively than simply listening or receiving no intervention. Thus, Kuhn's work confirmed the automaticity theory. The students in the repeated reading and nonrepetitive reading groups automatized their knowledge of words and sounds better than the students in the listening-only and control groups. Students in the first two groups demonstrated greater gains both in their ability to read real and nonsense words and in their oral reading fluency rates.

Therrien and Hughes (2008) studied the effects of repeated reading and question generation on students' reading fluency and comprehension. During a 2-week period, students who read at Grade 2 or 3 instructional levels were randomly divided into two
groups. One group received the repeated reading intervention. Students were asked to read a passage repeatedly until they reached the desired fluency level. On average, students reached the fluency goal after 2.42 readings. The other group received the question-generation intervention. Students read the passage once and received questions to cue them to better comprehend the narrative passage, such as the following: (a) Who is the main character?, (b) Where and when did the story take place?, and (c) What did the main character do? After the intervention was completed, tutors asked both groups factual and inferential questions.

Results documented that repeated reading improves fluency: students in the repeated reading group read 22.5 more correct words per minute than students in the question-generation group, who read the passage only once. Additionally, the students in the repeated reading group answered more factual questions correctly than the students in the question-generation group. There was no significant difference between the two groups when they answered inferential questions (Therrien & Hughes, 2008).

Therrien and Hughes (2008) concluded that repeated reading improved both fluency and comprehension. However, they recommended that additional research be conducted to determine the effects of text difficulty. Oral reading fluency rates are important to consider when students are reading passages in which they cannot read a significant percentage of the words. For example, in this study the researchers considered instructional level to be one at which students correctly read 85% to 95% of the words. Therefore, the students in the question-generation group may not have been able to
read 5% to 15% of the words in the text. When students do not know 5% to 15% of the words in a passage, their comprehension can be affected. The situation can be further compounded when students read at their frustration level, correctly reading fewer than 85% of the words in a passage. Therrien and Hughes (2008) recommended that other research studies focus on how comprehension is affected by readers' levels of difficulty. The researchers also recommended that studies with longer intervention times be conducted to indicate whether more than 2 weeks of intervention would enable the students in the question-generation group to use cued questions to answer the inferential questions. This study demonstrated that repeated reading improves both fluency and comprehension.

In contrast to these and other studies that found repeated reading to be an effective intervention (Begeny et al., 2006; Hiebert, 2005; Hudson et al., 2005; Rasinski, 2006; Therrien et al., 2006; Therrien & Kubina, 2007), Morgan and Sideridis (2006) found that repeated reading was not as effective as other strategies. They conducted a meta-analysis of 30 studies that used single-subject research designs to determine “the effectiveness of different types of interventions on fluency for students with or at risk for learning disabilities” (p. 200). The researchers categorized the interventions by the following strategies: (a) keywords and previewing, (b) listening and repeated readings, (c) goal setting plus performance feedback, (d) contingent reinforcement, (e) goal setting plus feedback and reinforcement, (f) word recognition, and (g) tutoring.

According to Morgan and Sideridis' (2006) findings, the most effective interventions were reinforcement, goal setting plus feedback, and goal setting plus
feedback and reinforcement. When the researchers analyzed the students' improvements over time, the goal-setting interventions showed significant growth, and the listening and repeated readings interventions did not. Although Morgan and Sideridis' findings did not concur with other researchers' findings regarding repeated readings, their results nevertheless substantiated the automaticity theory. They found that when students were motivated to set goals and were positively reinforced, they were able to improve their automaticity, as demonstrated by increases in their oral reading fluency rates.

Definition of Reading Proficiency

After the authorization of NCLB (2002), the U.S. Department of Education formed the NARAP (2006) to define reading and reading proficiency. Because NCLB mandated that all students be proficient in reading by 2013-2014, states and test developers needed a definition of reading proficiency to create appropriate measurement assessments. This task was not a simple one. The definition had to capture the essence of reading and encompass the developmental attributes of reading across grade levels (NARAP, 2006). Each of the 50 states defines reading proficiency at each grade level in terms of its curriculum expectations, as measured by the state's reading assessment.

NARAP's (2006) definition also had to include students with disabilities, who access texts differently from nondisabled students. For example, blind students access texts through braille. Hearing-impaired students may not decode the words of a text in the same manner as nonhearing-impaired students. Furthermore, students with learning disabilities
such as dyslexia may require accommodations, including extra time, in order to read a passage proficiently. Thus, the national definition of reading proficiency had to allow for differences in states' definitions, curriculum differences, students' ages and grades, and conditions of disability.

After convening a panel of experts and discussing working definitions with focus groups, NARAP (2006) formulated the following definition of reading proficiency:

Reading is the process of deriving meaning from text. For the majority of readers, this process involves decoding written text. Some individuals require adaptations such as braille or auditorization to support the decoding process. Understanding text is determined by the purposes for reading, the context, the nature of the text, and the readers' strategies and knowledge. (Cline et al., 2006, para. 8)

States were then able to use this definition as a guide to examine their curriculum expectations at each grade level and to design assessments that measure reading proficiency. Under NCLB (2002), all 50 states assess reading proficiency at Grades 3, 5, 8, and exit level. Consequently, as many as 200 definitions of reading proficiency could exist (four grades times 50 states). Many states also assess reading proficiency in additional grades. In Texas, for example, assessments are given in eight grades, Grades 3 through 10 (TEA, 2009d).

Definitions vary among states. TEA, for example, requires proficient readers in Grade 3 to have a basic understanding of reading, apply knowledge of literary elements, use a variety of strategies to analyze text, and apply critical-thinking skills to analyze the
passage (TEA, 2004). Florida's state assessment that measures reading proficiency contains four areas of assessment: (a) word phrases in context; (b) main idea, plot, and purpose; (c) comparison and cause/effect; and (d) reference and research (Florida Department of Education, 2005). In Colorado, the definition of reading proficiency includes requiring students to “locate, select, and make use of relevant information from a variety of media, reference, and technological sources” and to “read and recognize literature as a record of human experience” (Colorado Department of Education, 2010, para. 7). Even though NARAP (2006) provided states with a general definition of reading proficiency, as these examples show, each state assesses reading proficiency with its own definition, based on its own curriculum standards.

As the definition of reading proficiency in Texas shows, reading proficiency is a complex skill. Proficient readers, as noted earlier, must attach meaning to the words they read to understand what the author is communicating (NARAP, 2006). Proficient readers not only read the words on a page but also use strategies to read between the lines and beyond them. Thus, a review of literature relating to vocabulary and comprehension is warranted.

Vocabulary and Matthew Effects

Vocabulary is one of the National Reading Panel's five essential components of effective reading programs (NICHHD, 2000). Vocabulary consists of the words an individual knows (Rosenthal & Ehri, 2008). People may identify words by the way they sound, the way they are used in context, or the way they are spelled. New words are encountered either through listening or through reading. Thus, readers can read more
proficiently when they decode unknown words and attempt to determine their meaning (NICHHD, 2000). Ouellette (2006) found that readers are more likely to decode an unknown word if it exists in their spoken vocabulary. When readers know 90% to 95% of the words in a text, they are able to figure out the meaning of the remaining unknown words (Hirsch, 2003).

Struggling readers have difficulty decoding words. Because reading is difficult for them, they become frustrated (Ehri & McCormick, 1998). They must work hard to read and often find numerous ways to avoid reading. Consequently, they do not read widely, and their vocabulary does not increase. Conversely, proficient readers have a larger vocabulary than struggling readers because they have read more. Proficient readers are able to use their vocabulary to decode and determine the meaning of even more unknown words. The more they read, the more they know. Stanovich (1998) referred to this gap between proficient readers and their struggling peers as Matthew effects—a reference to the Biblical passage, Matthew 25:29, which describes the rich becoming richer and the poor becoming poorer (p. 10). For struggling readers, the gap widens as they progress through the grades (Morgan et al., 2008; Stanovich, 1998).

Several studies support Matthew effects. Katz, Stone, Carlisle, Corey, and Zeng (2008) studied the difference in growth of reading skills between Grade 1 proficient readers and struggling readers. The researchers reported that not only did the struggling readers have lower oral reading fluency rates than the proficient readers, but they also improved at a slower rate. A study with students in kindergarten to Grade 3 in Oregon and Texas documented the presence of Matthew effects for students in all four grades
(Chard et al., 2008). Rosenthal and Ehri (2008) studied the acquisition of new vocabulary by students in Grades 2 and 5. For students at both grade levels, the researchers documented the gap between struggling readers and proficient readers and its effect on students' learning of new words. Consequently, struggling readers' vocabulary is limited compared to that of their proficient peers (Rosenthal & Ehri, 2008). Struggling readers' limited vocabulary hinders them from learning as many new words as their proficient classmates learn. Struggling readers who have a limited vocabulary have difficulty understanding the texts they read, decoding and learning new words, and developing the skills that allow them to become proficient readers.

Comprehension

The final component of effective reading programs identified by the National Reading Panel is comprehension (NICHHD, 2000). Comprehension and oral reading fluency rates are related, and researchers continue to examine the relationship from various perspectives. These include questions such as the following: Are oral reading fluency rates and comprehension related only in the beginning stages of reading? Do all readers with low reading fluency rates struggle with comprehension? Are there readers with high oral reading fluency rates who are unable to comprehend? Researchers have addressed these questions in the studies that follow.

Fuchs et al. (2001) were among the first to establish oral reading fluency as an accurate and reliable measure that served as a predictor of general reading comprehension. Other researchers (Baker et al., 2008; Chard et al., 2008; Katz et al.,
2008; Simmons et al., 2008) have since correlated oral reading fluency rates and comprehension at different grade levels and on a variety of assessments.

As beginning readers progress through the various phases of learning to read (Ehri, 2005), teachers design instructional strategies to meet their needs and help them improve their reading skills. By the middle of first grade, a connection is established between how many words per minute students read and how much they comprehend (Hosp & Fuchs, 2005). For students in Grades 1 through 4, researchers (Hosp & Fuchs, 2005; Simmons et al., 2008) have found that oral reading fluency correlated with reading performance on the Woodcock Reading Mastery Test-Revised (Woodcock, 1973). For these same grades, other researchers have studied the relationship between oral reading fluency and comprehension (Daane et al., 2005). Baker et al. (2008) and Chard et al. (2008) found a relationship between oral reading fluency rates and performance on the SAT-10. The results of Pressley et al. (2005) confirmed a relationship between oral reading fluency and reading performance on the TerraNova. Speece and Ritchey (2005) also demonstrated a relationship between oral reading fluency and both the Comprehensive Test of Phonological Processing (CTOPP) and the Test of Word Reading Efficiency (TOWRE).

Researchers have also studied the relationship between oral reading fluency and comprehension in older students. With students above Grade 4, Therrien and Hughes (2008) as well as Therrien and Kubina (2007) found that oral reading fluency and comprehension were significantly related. In work with middle and junior high school students, Fuchs et al. (2001) established a correlation between oral reading fluency and reading proficiency as measured by the Stanford Achievement Test. However, for
students at various grade levels, additional research should be conducted to examine whether a statistically significant relationship exists between oral reading fluency and reading proficiency as defined by specific state reading assessments, such as the TAKS. The current study filled this gap for Grade 3 students in Texas.

Older struggling readers with low oral reading fluency rates can learn to use certain learning strategies to improve their comprehension (Walczyk & Griffith-Ross, 2007). Walczyk and Griffith-Ross worked with students in Grades 3, 5, and 7 to determine whether struggling readers at these grade levels could develop compensatory strategies to help them read text. The researchers found that struggling readers sometimes slowed down in order to facilitate their comprehension. For example, when they could not decode a word, struggling readers would sound it out or simply skip it. When the students realized they were not comprehending, they might pause, look back, or reread the text.

In addition, factors other than familiarity with or understanding of words can affect comprehension. These factors include the degree of readers' motivation and feelings of time pressure. Walczyk and Griffith-Ross (2007) found that students who had low motivation read slowly, but those who were interested in the text could comprehend by using compensation strategies. Stage of maturation also seemed to play a role in the effect of time pressure on comprehension. Students in Grade 3 comprehended less when under time constraints than students in Grade 7. However, students in Grade 5 performed at generally the same level with and without time constraints.
Walczyk and Griffith-Ross (2007) conjectured that the readers' engagement made the difference. The students in Grade 3 seemed to be engaged in the text as they tried to decode the words, and time pressure negatively affected their comprehension. In contrast, by Grade 7 many of the students had become less engaged with routine reading assignments, but their engagement increased when they were timed. The students in Grade 5 appeared to be in transition between the two modes of reading text. Thus, Walczyk and Griffith-Ross demonstrated that students with low fluency rates can comprehend text by using compensatory metacognitive strategies, such as sounding out words, and by engaging with greater motivation when experiencing time pressure.

Researchers indicate that oral reading fluency rates predict comprehension (Chard et al., 2008; Daane et al., 2005) not because readers can read a passage quickly but because fluent readers are able to spend mental energy on comprehending rather than on struggling with decoding (Fuchs et al., 2001; Samuels, 2006). Kuhn et al. (2005) indicated that when the readers in their study focused on how fast they read a passage, their comprehension was weaker than that of readers who focused on the passage itself. Samuels (2007) reiterated that when readers focus too intensely on how fast they are reading, their ability to comprehend suffers.

Fluency is only one component of reading (NICHHD, 2000). When Hasbrouck and Tindal (2006) developed the oral reading fluency norms, they observed that once readers reached the 50th percentile, teachers should not encourage them to read faster. The researchers recommended that teachers provide intervention only for those students who read 10 words per minute below the 50th percentile. This advice was based on the
principle that the goal of fluency intervention is to read fluently enough to focus on comprehension, not to race to read the most words in a minute.

Some readers read fluently but have trouble comprehending (Harrison, 2008; Samuels, 2006). If readers are not actively engaged in the reading process, they can mechanically read a passage but not understand what the author is saying (Schnick & Knickelbine, 2000). Readers who are intrinsically motivated to read can understand what the author is saying because they are curious and want to read more. In contrast, readers who are not motivated to read are often not engaged and fail to make the necessary connections for comprehension. Marcell (2007) reported on a student who was able to read a passage fluently at 120 words correct per minute but could not retell what he had read. This student needed further instruction in comprehension strategies in order to become a proficient reader. Samuels (2006) pointed out that students whose second language is English may successfully decode passages although they have very little comprehension. His work with students in St. Paul, Minnesota, revealed that about 20% could read fluently yet could not comprehend what they read. Furthermore, some high-comprehending readers have low fluency rates (Samuels, 2006).

The goal of reading is to comprehend or understand text (Applegate et al., 2009; NICHHD, 2000), and the studies in this section indicate a relationship between students' oral reading fluency rates and comprehension at various stages of reading and grade levels. Oral reading fluency rates are important not because of how fast students read but
because fluency demands little conscious attention and enables students to focus on comprehension (Samuels, 2006).

Relationship Between Oral Reading Fluency and Reading Proficiency in State-Mandated Assessments

Since LaBerge and Samuels' (1974) development of the automaticity theory, many studies (Daane et al., 2005; Kuhn, 2005; Riedel, 2007) have confirmed a relationship between how fluently students read and how proficiently they comprehend text at various grade levels. However, fewer studies have considered oral reading fluency rates measured by the DIBELS ORF together with reading proficiency as measured by state-mandated assessments. Many school districts use the DIBELS ORF assessment to identify students at risk for failing (Cusumano, 2007; Fletcher, Francis, Morris, & Lyon, 2005; Gersten & Dimino, 2006; Jenkins et al., 2007; Katz et al., 2008; Stecker, Lembke, & Foegen, 2008; Wood, Hill, Meyer, & Flowers, 2005). Studies in only seven states have found positive correlations between students' oral reading fluency rates, measured by the DIBELS ORF, and reading proficiency, measured by those states' mandated reading assessments. These states are as follows: Arizona (Wilson, 2005), Colorado (Shaw & Shaw, 2002; Wood, 2006), Florida (Buck & Torgesen, 2003; Roehrig et al., 2008), North Carolina (Barger, 2003), Ohio (Vander Meer et al., 2005), Oregon (Baker et al., 2008), and Pennsylvania (Shapiro et al., 2008).
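The state studies listed above, like the present study, report Pearson product-moment correlations between oral reading fluency rates and state assessment scores. As a minimal illustration of the statistic itself (the eight students and their scores below are invented for this sketch and are not drawn from any study cited here), the coefficient can be computed as follows:

```python
import math

# Hypothetical data for illustration only: middle-of-year ORF rates
# (words correct per minute) and reading assessment scale scores for
# eight imaginary Grade 3 students. The reviewed studies used real
# samples ranging from 38 to 35,207 students.
orf = [42, 55, 61, 70, 88, 95, 110, 124]
scale_score = [1890, 2010, 2055, 2100, 2205, 2190, 2310, 2400]

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation factors
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sy = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(orf, scale_score)
print(f"r = {r:.2f}")  # strong positive correlation for this toy data
```

With samples of the sizes used in the reviewed studies, a coefficient of this kind would be tested for statistical significance before interpretation; the sketch merely makes concrete the quantity behind the correlations of .58 to .80 reported above.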
Studies of Grade 3 Only

Researchers have conducted two studies with Grade 3 students only. In Colorado, Shaw and Shaw (2002) conducted research using 52 Grade 3 students from a Colorado elementary school to determine if the DIBELS ORF benchmarks were a predictor of success on the Colorado State Assessment Program. The researchers found a strong correlation, .80, the highest of the studies reviewed here. For Florida Grade 3 students, Buck and Torgesen (2003) compared the DIBELS ORF rates of 1,102 students in 13 elementary schools to their performance on the Florida Comprehensive Assessment Test. The researchers found a significant correlation, .70. They thus concluded that for this sample the DIBELS ORF rate was a predictor of success on the Florida Comprehensive Assessment Test.

Studies of Reading First Schools

For studies in three states, the researchers (Baker et al., 2008; Roehrig et al., 2008; Wilson, 2005) used the large databases from Reading First schools to determine if the DIBELS ORF correlated with state-mandated assessments. To identify and monitor the progress of struggling readers, over 90% of the Reading First schools used DIBELS oral reading fluency (Glenn, 2007; U.S. Department of Education, 2010). With an Arizona Reading First school, Wilson conducted a study with 241 Grade 3 students. He found a moderately large correlation (.74) between the DIBELS ORF and the Arizona Instrument to Measure Standards. Roehrig et al. conducted a similar study with Florida students, in which the DIBELS ORF rates of 35,207 Grade 3 students were compared to their performance on the Florida Comprehensive Assessment Test. Correlations were also
moderately large, ranging from .70 to .71. In Oregon, Baker et al. used 34 randomly selected Oregon Reading First schools located in 16 different school districts. The researchers compared the DIBELS ORF rates of 4,696 Grade 3 students with their performance on the Oregon State Reading Assessment. The correlations ranged from .58 to .68, somewhat lower than in the Arizona and Florida studies.

Pressley et al. (2005) suggested that studies on the DIBELS should be conducted by researchers not associated with Reading First. One such study was Barger's (2003), which compared the DIBELS ORF rates of 38 students to the North Carolina End of Grade 3 Test. Barger also found a moderately large correlation (.73), similar to those found by Wilson (2005) and Roehrig et al. (2008). Together, these researchers documented the positive relationship between the DIBELS ORF and four different state-mandated reading assessments for Grade 3 students.

Studies of Grade 3 and Other Grades

Fuchs et al. (2001) suggested that the correlation between oral reading fluency and comprehension may be stronger in Grade 3 than in other grades, whose assessments measure higher levels of comprehension. Three groups of researchers (Shapiro et al., 2008; Vander Meer et al., 2005; Wood, 2006) studied this issue with students in Grades 3, 4, and 5. For Colorado schools, Wood (2006) used 82 Grade 3 students, 101 Grade 4 students, and 98 Grade 5 students to determine if the DIBELS ORF consistently predicted performance on the Colorado Student Assessment Program across grade levels. Wood found significant correlations at all three levels: for Grade 3 at .70,
Grade 4 at .67, and Grade 5 at .75. These values were similar to those found by Wilson (2005) and Roehrig et al. (2008) for Reading First schools and by Barger (2003) in North Carolina.

For two grade levels with students in Ohio, Vander Meer et al. (2005) correlated the end-of-year Grade 3 DIBELS ORF with performance on the reading portion of the Ohio Proficiency Test. A total of 318 Grade 4 students from three elementary schools comprised the sample. Vander Meer et al. (2005) found a significant correlation, .65, between the end-of-year Grade 3 ORF scores and the Grade 4 Ohio Proficiency Test scores. Because this study followed the same students across two successive grades, the researchers concluded that oral reading fluency was an accurate predictor of performance on the reading portion of the Ohio Proficiency Test.

With students in Pennsylvania across grade levels, Shapiro et al. (2008) compared the DIBELS ORF rates of 401 Grade 3 students, 394 Grade 4 students, and 205 Grade 5 students with their performance on the Pennsylvania System of School Assessment. The researchers found significant correlations for all three grades: Grade 3 at .67, Grade 4 at .64, and Grade 5 at .73. Again, these results were similar to those of the previous studies. The findings of Wood (2006) and Shapiro et al. support the use of the DIBELS ORF as a consistent predictor of performance on state-mandated reading assessments not only for Grade 3 but also across grade levels.

Thus, for seven of the 50 states that require state-mandated assessments of reading proficiency, studies have found a significant positive relationship between the DIBELS ORF and the state measures. However, because definitions of reading proficiency differ across states and grade levels, the need existed for researchers to
determine if the DIBELS ORF correlates significantly with state-mandated assessments in other states and at other grade levels (Roehrig et al., 2008). The present study focused on Grade 3 students in Texas. No similar study had been conducted to determine if a significant relationship existed for Grade 3 students between DIBELS ORF rates (the independent variable) and reading proficiency (the dependent variable), as measured by scores on the Grade 3 Reading TAKS.

Limitations of State Studies

In the studies reviewed above, the researchers all reported significant relationships between Grade 3 students' DIBELS ORF rates and reading proficiency as assessed on state-mandated assessments. However, the researchers also recognized limitations of their studies. One limitation was sample composition and size. For example, the Colorado sample (Wood, 2006) consisted primarily of White students. The North Carolina sample (Barger, 2003) had only 38 students. On the other hand, the Oregon sample (Baker et al., 2008) and one of the Florida samples (Roehrig et al., 2008) were large statewide samples of students in Reading First schools. One of the criteria for being a Reading First school was a high population of students from low socioeconomic backgrounds, and thus these samples were not representative of the general Grade 3 population. In all these studies, the researchers recommended future research that would include wider cross-sections of students as well as non-Reading First schools for greater generalizability of results to other school settings.

Another limitation of these studies was the recognition of other variables that might affect the relationship between oral reading fluency and reading proficiency. In the
Oregon study, Baker et al. (2008) singled out diversity as a factor. The Reading First sample schools had high poverty rates and low reading achievement, and Baker et al. recommended additional research focused on specific ethnic backgrounds as well as subpopulations, such as special education students. In Arizona, Wilson (2005) recommended future studies after Arizona implemented the 2005 Arizona Instrument to Measure Standards Dual Purpose Assessment (AIMS DPA) measurement scale and performance levels. Wilson posited that the relationships between scores would be different under the new standards. When Wilson published his results, it was not clear whether the expected performance levels would be higher or lower than the ones used in his study.

For the Pennsylvania study, Shapiro et al. (2008) used data from the 4Sight assessments in addition to the DIBELS. They used the 4Sight assessments to provide further information on students' reading comprehension to predict reading proficiency. The 4Sight assessments are short tests, similar in format and other aspects to state assessments, that are administered throughout the school year to produce overall scores that may predict students' scores on state assessments. Many states use the 4Sight assessments, including California, Pennsylvania, Tennessee, and Texas (Success for All Foundation, 2010).

In the Pennsylvania study (Shapiro et al., 2008), the 4Sight benchmark assessments were administered to students as a group rather than individually. Whole-group administration enabled researchers to assess many students in approximately 1 hour. However, some reading skills, such as oral reading fluency, are best tested when
administered individually so educators can monitor the rate and accuracy with which a student reads.

Shapiro et al. (2008) recommended future studies in other states to determine if their findings could be replicated. Additional studies over time and in various locations would help generalize the findings to larger populations. Shapiro et al. also noted that the schools in their samples represented varying demographic characteristics but were similar in their levels of poverty. The researchers recommended additional studies in other schools with different poverty levels to eliminate poverty as a factor and to allow greater generalization.

As a result of these limitations and recommendations, researchers have indicated the need for additional studies on the DIBELS ORF and state-mandated assessments, in Grade 3 and other grades. Roehrig et al. (2008) pointed out this need especially for other states. I responded to this need with the present study, which determined whether a statistically significant relationship existed between the DIBELS ORF and the TAKS for Grade 3 students in Texas.

Summary

In section 2, literature regarding the automaticity theory, oral reading fluency, and reading proficiency was reviewed. LaBerge and Samuels (1974) developed the automaticity theory to explain why readers who struggle with low oral reading fluency rates also struggle with comprehension. LaBerge and Samuels and other researchers who followed (Baker et al., 2008; Kuhn, 2005; Morgan & Sideridis, 2006) demonstrated that readers have a limited amount of attention to devote to reading. When too much attention
