Validity ppt1

Transcript

  • 1. DEVELOPMENTAL READING ASSESSMENT (DRA). Published by Pearson. Reviewed by: Carolyn Kick
  • 2. AUTHORS. Joetta Beaver: With a bachelor of science in elementary education and a master's degree in reading from The Ohio State University, Joetta Beaver worked as an elementary teacher (K-5) for 30 years, as well as a K-5 Language Arts/Assessment coordinator and an Early Education teacher-leader. She is the primary author of DRA2 K-3, co-author of DRA2 4-8 and the Developing Writers Assessment (DWA), a consultant, and a speaker. Mark Carter, PhD: With assessment the focus of much of his professional work, Mark Carter served as coordinator of assessment for Upper Arlington Schools (where he currently teaches fifth grade), conducted numerous seminars, and co-authored DRA2 4-8, the DWA, and Portfolio Assessment in the Reading Classroom. He received his doctorate from The Ohio State University, where he also taught graduate courses in education as an adjunct professor.
  • 3. OVERVIEW OF THE DRA. The Developmental Reading Assessment is a set of individually administered, criterion-referenced assessments for grades K-8. Purpose: identify students' reading level based on accuracy, fluency, and comprehension. Other purposes: identify students' strengths and weaknesses at their independent reading level, plan instruction, monitor reading growth, and prepare for testing expectations. The assessment is administered one-on-one and requires students to read specifically selected leveled assessment texts that increase in difficulty. It is administered, scored, and interpreted by classroom teachers.
  • 4. DRA HISTORY & REVISIONS. 1988-1997: DRA is researched and developed by Joetta Beaver and the Upper Arlington School District. 1997: DRA K-3 is published by Pearson. 1999: Evaluation of the Development of Reading (EDL). 2002: DRA 4-8. 2004: DRA Word Analysis. 2005: DRA Second Edition (DRA2), K-3 & 4-8. 2006: Evaluation of the Development of Reading. 2007: More than 250,000 classrooms use DRA and EDL. 2008: Pearson partners with Liberty Source on the DRA2 Handheld Tango Edition. 2009: DRA2 Handheld Tango wins a CODiE Award.
  • 5. DRA READING ASSESSMENT CRITERIA: oral reading fluency ability, word analysis, and comprehension, beginning in grade K.
  • 6. ORAL READING AND FLUENCY. The total number of oral reading errors is converted to an accuracy score, and reading rate is measured in words correct per minute (WCPM). Expression, phrasing, rate, and accuracy are each rated on a 4-point scale. This begins at level 14, the transitional level (grades 1 and 2).
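To make that conversion concrete: the slide does not spell out the formula, so the following is a minimal sketch assuming the standard conventions (accuracy as the percentage of words read correctly, WCPM as words correct per minute); the function name and the numbers are made up for illustration.

    def oral_reading_scores(total_words: int, errors: int, seconds: float):
        """Turn an oral reading record into an accuracy percentage and WCPM.

        Assumes the standard conventions: accuracy is the share of words
        read correctly, and WCPM is words correct per minute.
        """
        words_correct = total_words - errors
        accuracy = 100.0 * words_correct / total_words
        wcpm = words_correct / (seconds / 60.0)
        return accuracy, wcpm

    # Hypothetical example: a 250-word passage read in 3 minutes with 6 errors.
    accuracy, wcpm = oral_reading_scores(250, 6, 180)
    print(f"accuracy: {accuracy:.1f}%  WCPM: {wcpm:.1f}")  # accuracy: 97.6%  WCPM: 81.3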
  • 7. COMPREHENSION. At levels 3-16, once the oral reading is over, the student takes the book and reads it again silently. This gives the student another opportunity to check comprehension before retelling. The student then retells what happens in the story. Underline the information the student gives unaided; mark information the student can give only with prompting with a TP (teacher prompt). Follow-up questions come after the retelling and, if used, are tallied to the left. The number of prompts needed to elicit more information is calculated as part of the comprehension score.
  • 8. WORD ANALYSISAssesses phonological awareness, metalanguage, letter/word recognition, phonics, and structural analysis in grades K-3. DRA Word Analysis is included in the new second edition of DRA K-3.
  • 9. READING LEVELS: Emergent, Levels A-3 (Kindergarten); Early, Levels 4-12 (Grade 1); Transitional, Levels 14-24 (Grades 1 & 2); Extending, Levels 28-38 (Grades 2 & 3); Intermediate/Middle School, Levels 40-80 (Grades 4-8).
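Read as data, those bands amount to a simple lookup from DRA text level to reading stage. The helper below is purely illustrative (it is not part of the DRA materials), and it maps level "A" to 0 by assumption so the bands can be compared numerically.

    # DRA level bands as listed on the slide; levels in the gaps between
    # bands (e.g. 13 or 25-27) fall outside every range and are reported as such.
    STAGES = [
        (0, 3, "Emergent (Kindergarten)"),
        (4, 12, "Early (Grade 1)"),
        (14, 24, "Transitional (Grades 1 & 2)"),
        (28, 38, "Extending (Grades 2 & 3)"),
        (40, 80, "Intermediate/Middle School (Grades 4-8)"),
    ]

    def stage_for_level(level) -> str:
        """Return the reading stage for a DRA text level ('A' or an integer)."""
        numeric = 0 if level == "A" else int(level)
        for low, high, name in STAGES:
            if low <= numeric <= high:
                return name
        return "between bands"

    print(stage_for_level("A"))  # Emergent (Kindergarten)
    print(stage_for_level(16))   # Transitional (Grades 1 & 2)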
  • 10. INFORMATION ON DRA
  • 11. VALIDITY
  • 12. KORETZ ON VALIDITY. “…Validity, which is the single most important criterion for evaluating achievement testing… but tests themselves are not valid or invalid. Rather, it is an inference based on test scores that is valid or invalid… Validity is also a continuum: inferences are rarely perfect. The question to ask is how well supported the conclusion is” (Koretz, 2008, p. 31).
  • 13. VALIDITY CONT. Messick (1994) would argue that construct validity refers to the inferences that are drawn about score meaning, specifically the score interpretation and the implications for test use (quantitative). This theoretical framework is subject to empirical challenge, yielding a unified approach to validity. What is the test measuring? Can it measure what it intends to measure?
  • 14. FOUR TYPES OF VALIDATION: predictive and concurrent (together, criterion-oriented validity), content, and construct.
  • 15. CRITERION VALIDITYPredictive validity is were we draw an inference from test scores to performance.Concurrent Validity- studied when a test is proposed a substitute for another, or a test is shown to correlate with some contemporary criterion (Cronbach & Meehl, 1955).
  • 16. CONTENT VALIDITY. According to Yu, content validity is when we draw inferences from test scores to a larger domain of items similar to those on the test (a sample of the item population). This selection of content is usually done by experts, but experts may lack experience in the field, and the approach assumes that all of them are truly expert.
  • 17. CONSTRUCT VALIDITY. According to Hunter and Schmidt (1990), construct validity is a quantitative question rather than a qualitative distinction such as "valid" or "invalid"; it is a matter of degree. Construct validity can be measured by the correlation between the intended independent variable (the construct) and the proxy independent variable (the indicator, or sign) that is actually used. (Yu)
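Both this definition and the concurrent-validity evidence cited in the next slides come down to correlating two sets of paired scores. Here is a self-contained Pearson correlation sketch; the score lists are invented for illustration and are not actual DRA data.

    from statistics import mean, stdev

    def pearson_r(xs, ys):
        """Sample Pearson correlation between two paired lists of scores."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
        return cov / (stdev(xs) * stdev(ys))

    # Hypothetical paired scores for eight students: DRA text level versus
    # a score on some other reading measure (illustrative values only).
    dra_levels = [4, 8, 12, 16, 18, 24, 28, 34]
    other_test = [41, 55, 60, 72, 70, 85, 88, 97]
    print(f"r = {pearson_r(dra_levels, other_test):.2f}")  # a high positive r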
  • 18. PEARSON EDUCATION ON DRA VALIDITY. Pearson notes that when one refers “…to validity of an assessment, one looks at the extent to which the assessment actually measures what it is supposed to measure.” Questions to ask when examining validity include: Does this assessment truly measure reading ability? Can teachers make accurate inferences about the true reading ability of a student based upon DRA2 assessment results?
  • 19. PEARSON EDUCATION ON CONTENT-RELATED VALIDITY OF THE DRA. The content validity of a test relates to the adequacy with which the content is covered in the test. According to its “Theoretical Framework and Research,” the DRA2 incorporates reading domains drawn from research on good readers, developed in review with consultants and educators. Content validity was built into the DRA and DRA2 assessments during the development process.
  • 20. PEARSON CRITERION-RELATED VALIDITY ON THE DRA. Criterion-related validity refers to the extent to which a measure predicts performance on some other significant measure (called a criterion) other than the test itself. Criterion validity may be broken down into two components: concurrent and predictive. Concurrent validity studies correlate the DRA with other reading tests: the Gray Oral Reading Test, 4th Edition (GORT-4; Wiederholt & Bryant, 2001); the DIBELS Oral Reading Fluency Test, 6th Edition; and correlations between DRA2 and teacher ratings.
  • 21. DRA REVIEW, NATALIE RATHVON, PH.D. The following evidence of validation is based upon the review of the DRA completed by Natalie Rathvon, Ph.D., Assistant Clinical Professor, George Washington University, Washington, DC; private-practice psychologist and school consultant, Bethesda, MD (August 2006).
  • 22. DRA CONTENT VALIDITY. In a review by Natalie Rathvon, Ph.D.: The oral fluency running record is derived solely from Clay's Observational Survey (Clay, 1993). Teacher surveys (return rates were 46%; ns of 80 to 175) revealed that the DRA provided teachers with information describing reading behaviors and identifying instructional goals. There were also concerns about the adequacy and accuracy of the comprehension assessment, and about the accuracy of text leveling prior to 2003, before the Lexile Framework was used to evaluate the readability of the DRA texts. There are concerns about who developed and reviewed the assessment: there is no evidence that external reviewers participated in the development, revision, or validation process. Rathvon states, “Means, standard deviations, and standard errors of measurement should be presented for accuracy, rate, and comprehension scores for field test students reading adjacent text levels to document level-to-level progression.”
  • 23. CONSTRUCT VALIDITY EVIDENCE. Results from Louisiana statewide DRA administrations for spring 2000 through 2002 for students in grades 1 through 3 (ns = 4,162 to 74,761) show an increase in DRA levels across grades, as well as changes in DRA level for a matched sample of students (n = 32,739) over a three-year period. This indicates that the skills being measured are developmental and is evidence that the DRA can detect changes in reading levels. Two studies evaluating the relationship between Lexile Scale measures and the DRA also suggest that the running-record format is a valid method of assessing reading comprehension.
  • 24. SUMMARY OF WHAT THE DRA IS: An attractive reading battery, modeled after an informal reading inventory and based on Clay's Observational Survey (Clay, 1993). Authentic texts. Instructionally relevant measures of fluency and comprehension. Provides meaningful results for classroom teachers, parents, and other stakeholders. Provides encouraging evidence that use of the DRA predicts future reading achievement for primary-grade students.
  • 25. DRA CRITERION-RELATED VALIDITY. No concurrent validity evidence is presented documenting the relationship between the DRA and standardized or criterion-referenced tests of reading, vocabulary, language, or other relevant domains for students in kindergarten or grades 4 through 8. Studies examining the extent to which individual students obtain identical performance levels on the DRA and validated reading measures are especially needed. No information is provided to document the relationship between the DRA Word Analysis and any criterion measure, and no concurrent validity evidence is presented for any of the DRA assessments in terms of the relationship between DRA performance and contextually relevant performance measures, such as teacher ratings of student achievement or classroom grades.
  • 26. SUMMARY OF WHAT THE DRA IS: Responsive to intervention for primary-grade students. An assessment model that has raised teacher awareness of student reading levels and matched students with appropriate texts. Teacher-reviewed and survey-based on classroom practice (return rates were 46%; ns of 80 to 175) (Rathvon, 2006). Supported by evidence that the Lexile Scale measures and the DRA running-record format constitute a valid method of assessing reading comprehension.
  • 27. SUMMARY OF WHAT THE DRA IS NOT: Like other informal reading inventories, it lacks reliability and validity evidence (Invernizzi et al.; Spector, 2005). It does not: provide evidence of text equivalence within levels; provide evidence for overall reading level for half the grade levels; have a consistent process of text selection, scoring, and administration (it is vulnerable to teacher inconsistencies and judgments, though improved since the Lexile model); provide enough evidence of criterion-related validity for older students; provide concurrent validity evidence documenting the relationship between the DRA and standardized or criterion-referenced tests of reading, vocabulary, or language in kindergarten and grades 4-8; or document the relationship between the DRA Word Analysis and any criterion measure.
  • 28. SUMMARY OF WHAT THE DRA IS NOT (cont.): It also does not: provide sufficient evidence that teachers can select texts aligned with students' actual reading level (or achieve acceptable levels of scorer consistency and accuracy); provide evidence across demographic groups; include external reviewers in the development, revision, and validation of any DRA series; provide complete field-testing reporting; provide a theoretical rationale or empirical data supporting the omission of a standard task to estimate student reading level; or provide means, standard deviations, and standard errors of measurement ensuring accuracy.
  • 29. WHAT DOES ALL OF THIS MEAN? Learning about the validity of the Developmental Reading Assessment was difficult. I have yet to administer one, but would like to go through the process. There is no empirical evidence that consistently supports the validity of the DRA; there are far too many variables and opportunities for human behavior to alter results and affect the variability. However, the way teachers approach diagnosing students' reading levels, and their awareness of getting leveled texts into the classroom, has changed dramatically over the past few years. Changes based on DRA results (though the assessment's validity is not well established) have reshaped reading instruction in our district.
  • 30. RESOURCES: http://mypearsontraining.com/pdfs/TG_DRA2_ProgramComponents.pdf ; DRA K-3: PMDBSUBCATEGORYID=&PMDBSITEID=2781&PMDBSUBSOLUTIONID=&PMDBSOLUTIONID=&PMDBSUBJECTAREAID=&PMDBCATEGORYID=&PMDbProgramID=23662