PVAAS Growth Standard Methodology


  • This presentation will focus on the methodology that is utilized by the Pennsylvania Value-Added Assessment System in its full state-wide implementation. This methodology is complex and is represented conceptually throughout this presentation. Participants will receive maximum benefit from this presentation after participating in a PVAAS Overview presentation of the Growth Standard Methodology or viewing these materials on the PDE website. [CLICK]

    1. 1. Pennsylvania Value-Added Assessment System (PVAAS) PVAAS Growth Standard Methodology: Statewide Implementation Questions about these materials and the Growth Standard Methodology can be directed to the PVAAS Statewide Core Team. See contact info on Slide 31.
    2. 2. Important Questions for Value-added Calculations <ul><li>What is a Growth Standard and how is it set? </li></ul><ul><li>How can we compare scores across different years? </li></ul><ul><li>How do we estimate a student’s true level of achievement? </li></ul>
    3. 3. The Growth Standard: Key Metric in PVAAS <ul><li>The Growth Standard specifies the minimal designated academic gain from grade to grade for a cohort of students. </li></ul><ul><li>The use of a Growth Standard creates the possibility that ALL schools can demonstrate appropriate growth. </li></ul>
    4. 4. An Analogy
    5. 5. An Analogy <ul><li>Doctors plot a child’s length/height over time. </li></ul><ul><li>Each child may have a unique growth curve. </li></ul>
    6. 6. When is growth “acceptable”? <ul><li>The length/height measurement is increasing over time. </li></ul><ul><li>The length/height measurement maintains its approximate position in the length/height distributions as the child grows. </li></ul><ul><li>The child’s length/height continues to increase in a consistent manner. </li></ul>A significant deviation from the growth pattern, or a change outside the “typical” range of values, is an indication that further investigation is required.
    7. 7. What is the Growth Standard for a child’s length/height? <ul><li>The standard is that the child maintain approximately the same position in each of the increasing distributions of length/height as the child grows. </li></ul><ul><li>A significant deviation from that pattern indicates a need for further investigation. </li></ul>
    8. 8. Growth Standard Charts for Academic Achievement <ul><li>Let us build an Academic Achievement Growth Chart. </li></ul><ul><li>Collect the average performances of a large sample of students using a uniform assessment during each year of their career through school. </li></ul><ul><li>Plot curves to represent appropriate percentile patterns. </li></ul><ul><li>An example: Suppose the following table represents the means and SDs of a group of students on the PSSA beginning in 3rd grade and continuing through 8th grade and ultimately 11th grade. </li></ul>
    9. 11. In an ideal world… <ul><li>We would have a large body of longitudinal data from many cohorts to construct our growth charts. </li></ul><ul><li>Since we do not, we will use the distributions from a base year to create the growth curves. </li></ul><ul><li>The base year distributions are approximations of the achievement distributions of a cohort from grade 3 to grade 8 and grade 11. </li></ul>
    10. 13. Using the Base Year 2006 <ul><li>Suppose the distributions from 2006 are given by the table below. </li></ul>Conversion to NCE scores will use the Base Year distributions in their calculations.

    Grade:  3     4     5     6
    Mean:   1270  1290  1300  1285
    SD:     250   310   255   276
    11. 14. Suppose the means of a cohort in two consecutive years are 2007 (3rd grade): 1390 and 2008 (4th grade): 1450. <ul><li>NCE scores are calculated for both using the 2006 means and SDs. </li></ul>All future PSSA scaled scores will be converted to NCE scores using the 2006 Base Year parameters so that the mean gain of a cohort of students can be calculated.

    Grade:  3     4
    Mean:   1270  1290
    SD:     250   310
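The conversion described on this slide can be sketched in Python. The base-year means and SDs come from the table above; the resulting NCE values are computed here, not taken from the slides, and the function name is ours.

```python
# 2006 Base Year parameters (mean, SD) by grade, from the table above.
BASE_2006 = {3: (1270, 250), 4: (1290, 310), 5: (1300, 255), 6: (1285, 276)}

def to_nce(score, grade):
    """Convert a PSSA scaled score to an NCE score using the 2006 Base Year."""
    mean, sd = BASE_2006[grade]
    return 50 + 21.06 * (score - mean) / sd

nce_2007 = to_nce(1390, 3)      # cohort mean as 3rd graders in 2007
nce_2008 = to_nce(1450, 4)      # same cohort as 4th graders in 2008
mean_gain = nce_2008 - nce_2007 # gain measured on the common NCE scale
```

Because every year's scores pass through the same 2006 parameters, the difference of the two NCE values is a meaningful gain even though the raw scales differ.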
    12. 15. The NCE Growth Curves
    13. 16. Some Thoughts… <ul><li>This Growth Standard concept demonstrates the need for longitudinal data when considering academic growth, since each student has his/her own academic growth curve. </li></ul><ul><li>But… </li></ul><ul><li>The example also exhibits the two remaining issues for PVAAS value-added methods: </li></ul><ul><ul><li>Comparing scores from year to year </li></ul></ul><ul><ul><li>Estimating the “true” level of achievement for input into the growth curve. </li></ul></ul>
    14. 17. Calculation of Gain from Year to Year <ul><li>Student growth is measured by the difference in performance in consecutive years. </li></ul>But there is a problem with this! These scores are not comparable!

    Grade:  3     4     5     6
    Score:  1290  1310  1330  1365
    Gain:   —     20    20    35
    15. 18. Comparing scaled scores on the PSSA from different years <ul><li>PSSA tests have different means and standard deviations at each grade and for different years. For example, in 8th grade: </li></ul>

    Year   Math Mean  Math SD  Reading Mean  Reading SD
    2005   1370       222.2    1360          274.3
    2004   1350       208.1    1370          239.7
    16. 19. A Solution: Conversion to NCE Scores <ul><li>NCE scores indicate the position of a scaled score on a reference scale (mean = 50, SD = 21.06) so that scaled scores from different distributions with different scales can be compared. </li></ul><ul><li>The use of NCE scores does not impose a normal distribution on the data, nor does the use of NCE scores have any relationship to norm-referenced tests. </li></ul><ul><li>NCEs are excellent for looking at scores over time. </li></ul><ul><li>— Victoria L. Bernhardt, Using Data to Improve Student Learning in High Schools </li></ul>
    17. 20. NCE Scores Are About Position <ul><li>To calculate an NCE score: </li></ul><ul><li>Calculate the z-score of the data value of interest, that is, the number of standard deviations the data value is from the mean of its distribution: z = (score − mean) / SD </li></ul><ul><li>The NCE score is then calculated using the following formula: NCE = 50 + 21.06 × z </li></ul>
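As a minimal sketch, the two-step conversion on this slide can be written directly (the function name is ours):

```python
def nce(score, mean, sd):
    """NCE score: the position of a value on the common scale (mean 50, SD 21.06)."""
    z = (score - mean) / sd   # standard deviations from the mean of its distribution
    return 50 + 21.06 * z
```

A score exactly at its distribution's mean always maps to 50, and a score one SD above its mean always maps to 71.06, whatever the original scale.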
    18. 21. A Question… <ul><li>George scores a 655 on the SAT mathematics exam. </li></ul><ul><li>George also scores a 28 on the ACT mathematics exam. </li></ul>Which score should he report to his colleges if he wants to provide the “better” score?
    19. 22. A Matter of Comparison <ul><li>How do we compare George’s scores? </li></ul>The nature of each distribution is irrelevant to the question of interest:

    Test   Mean   SD    George’s score
    SAT    520    110   655
    ACT    20.7   5.0   28
    20. 23. A Solution <ul><li>Conversion of both scores to NCE scores allows for the identification of the position of each score on the same scale. </li></ul><ul><li>This identification of position provides the capability of comparison since the converted scores will be based on the same distribution parameters. </li></ul>
    21. 24. Which Score Should George Choose to Report? Using an NCE scale with mean 50 and standard deviation 21.06… SAT score of 655 → NCE score 75.85; ACT score of 28 → NCE score 80.74. Clearly, he should report his ACT score!
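George's comparison can be checked directly. The scores, means, and SDs are from the slides; the NCE values are recomputed here, so they may differ from the slide's figures by a rounding hair.

```python
# (George's score, test mean, test SD) for each exam, from the slide's table.
scores = {"SAT": (655, 520, 110), "ACT": (28, 20.7, 5.0)}

# Convert each score to its position on the common NCE scale.
nces = {test: 50 + 21.06 * (s - m) / sd for test, (s, m, sd) in scores.items()}

best = max(nces, key=nces.get)   # the exam where George's relative position is highest
```

Even though 655 looks far larger than 28, the conversion shows the ACT score sits higher in its own distribution.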
    22. 25. Consider Another Hypothetical Scenario… In 2005, Wilma was in 3rd grade and scored as follows on the 3rd grade PSSA: Mean for 3rd Grade (2005) = 1356.75; Standard Deviation for 3rd Grade (2005) = 126.20; Wilma’s scaled score = 1425. In 2006, Wilma was in 4th grade and scored as follows on the 4th grade PSSA: Mean for 4th Grade (2006) = 1303.24; Standard Deviation for 4th Grade (2006) = 164.20; Wilma’s scaled score = 1425. Do these scores indicate that Wilma progressed during 4th grade?
    23. 26. Let’s Look at it Graphically… Even though Wilma’s scaled scores were the same (both 1425), since the distributions were different, we really can’t compare the two scores… [Chart: Wilma’s score of 1425 marked within each year’s distribution]
    24. 27. A Tentative Solution: Conversion to Percentiles In our example, Wilma’s score of 1425 was at the 66th percentile for 2005 but at the 76th percentile for 2006. These percentiles focus on Wilma’s position in each distribution.
    25. 28. But… <ul><li>We cannot calculate Wilma’s gain – the difference of percentiles does not make sense… </li></ul><ul><li>Percentiles are not meaningful for calculating means for different years, gains, etc., since they are calculated from different distributions. </li></ul>
    26. 29. The Complete Solution: Conversion to NCE Scores <ul><li>To establish a basis of comparison for different distributions from different schools in different years, we convert the scaled scores to units on the SAME scale. </li></ul><ul><li>The scale we will use is the NCE distribution with mean 50 and standard deviation approximately equal to 21.06. </li></ul>
    27. 30. The NCE Distribution and Wilma <ul><li>Wilma’s NCE score for 2005 (3rd grade) is 61, </li></ul><ul><li>while her score for 2006 (4th grade) is 66. </li></ul>
    28. 31. Wilma’s gain… <ul><li>Wilma’s gain = 2006 NCE score – 2005 NCE score </li></ul><ul><li>(4th Grade) (3rd Grade) </li></ul><ul><li> = 66 – 61 </li></ul><ul><li> = +5 </li></ul><ul><li>The mean gain of all of the students in Wilma’s cohort can now be compared to the Growth Standard for Wilma’s cohort. </li></ul>
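Wilma's numbers from the preceding slides can be reproduced with the same NCE formula (the means and SDs are from the slides; the calculation itself is ours):

```python
def nce(score, mean, sd):
    """Position of a scaled score on the common NCE scale (mean 50, SD 21.06)."""
    return 50 + 21.06 * (score - mean) / sd

nce_2005 = nce(1425, 1356.75, 126.20)    # 3rd grade, 2005: rounds to 61
nce_2006 = nce(1425, 1303.24, 164.20)    # 4th grade, 2006: rounds to 66
gain = round(nce_2006) - round(nce_2005) # +5, matching the slide
```

The identical raw score of 1425 produces two different NCE scores because Wilma's position within each year's distribution changed, and that positional change is exactly what the gain measures.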
    29. 32. PVAAS Statewide Methodology [Diagram: Student A’s Test Score (2009) is converted to an NCE score using the Base Year (2006) parameters and contributes to the 2009 Observed School Mean NCE Scores.]
    30. 33. The Problem with the Mean of the Observed Scores <ul><li>The mean of the observed NCE scores is, at best, a single snapshot in time of student achievement on the PSSA Anchors… </li></ul><ul><li>Is it the most comprehensive assessment of the school’s TRUE level of achievement? </li></ul><ul><li>What about the “Bad Day” syndrome? </li></ul>
    31. 34. Observed vs. Composite Estimate… Which is better? <ul><li>What if we combined the new, observed data with all of the prior PSSA assessment information that we have for this cohort of students? </li></ul><ul><li>Would not a longitudinal view of the cohort’s performance yield a more precise and reliable estimate of the true level of achievement? </li></ul><ul><li>This is the essence and power of the </li></ul><ul><li>PVAAS methodology! </li></ul>
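PVAAS itself uses a longitudinal modeling approach whose details are beyond these slides, but the core intuition — a composite of prior information and new data is more precise than either alone — can be illustrated with a simple inverse-variance weighted average. All numbers below are hypothetical.

```python
def combine(prior_mean, prior_var, obs_mean, obs_var):
    """Precision-weighted average of a prior estimate and a new observation.

    The combined variance is always smaller than either input variance,
    which is why pooling longitudinal data sharpens the estimate.
    """
    w = (1 / prior_var) / (1 / prior_var + 1 / obs_var)  # weight on the prior
    mean = w * prior_mean + (1 - w) * obs_mean
    var = 1 / (1 / prior_var + 1 / obs_var)
    return mean, var

# Hypothetical: prior estimated school mean NCE of 50 meets a new observed mean of 58,
# each with the same uncertainty, so the composite lands halfway with half the variance.
mean, var = combine(50.0, 4.0, 58.0, 4.0)
```

This is an illustration of the principle only, not the PVAAS computation; the actual methodology updates all prior estimates jointly as each new year of data arrives.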
    32. 35. Consider an Example… <ul><li>Determine the percentage of candies that are blue… </li></ul>If you were to open only one bag and find that 13% of the candies are blue, how much confidence would you have in your estimate of the true percentage of blue candies for all candies?
    33. 36. Only One Sample? A Bit Risky… <ul><li>Let’s open 50 bags and look at the distribution of the percentages of blue candies… </li></ul>Looking at these 50 bags, what would you estimate the “true” percentage of blue candies to be for all candies?
    34. 37. What If? <ul><li>Let’s open 50 more bags and add them to the 50 selected earlier… </li></ul>[Charts: distribution with n = 50 vs. distribution with n = 100] With this additional data, we can make a better estimate of the true percentage of blue candies!
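The candy example is the standard error of a proportion shrinking as the sample grows. A quick check, using the slide's hypothetical 13% as the true share of blue candies:

```python
import math

def se_proportion(p, n):
    """Standard error of an observed proportion from a sample of size n."""
    return math.sqrt(p * (1 - p) / n)

p = 0.13  # hypothetical true share of blue candies, from the slide
se_1, se_50, se_100 = (se_proportion(p, n) for n in (1, 50, 100))
# Doubling the sample from 50 to 100 bags shrinks the uncertainty by a factor of √2.
```

The same logic drives the PVAAS composite: each additional year of assessment data narrows the uncertainty around the school's true level of achievement.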
    35. 39. PVAAS Statewide Methodology [Diagram: the 2009 Observed School Mean NCE Scores are combined with the 2006, 2007, and 2008 Estimated School Mean NCE Scores to produce the 2009 Estimated School Mean NCE Scores. Gain = 2009 Estimate – 2008 Estimate; the gain is compared to the Growth Standard to produce the School Rating.]
    36. 40. How to Measure Growth of a School? <ul><li>Using a Growth Standard </li></ul><ul><li>Student scaled scores are converted to NCE scores (2006 parameters). </li></ul><ul><li>The mean NCE score for each school is calculated. </li></ul><ul><li>PVAAS revises all earlier estimates based on the addition of the current data. </li></ul><ul><li>PVAAS calculates an estimated NCE mean score. </li></ul><ul><li>Estimated Mean NCE Gain </li></ul><ul><li>= Current Estimated NCE mean – Previous Estimated NCE mean </li></ul><ul><li>Gain is compared to Growth Standard for School Effect Rating. </li></ul>
    37. 41. Here is the Fall 2006 PVAAS District/School Report
    38. 42. Gain Ratings Mean NCE Gain for a cohort in a given year represents the progress of students in that cohort relative to the Growth Standard of 0. Color ratings: Green – mean gain greater than or equal to the Growth Standard → favorable indicator. Yellow – mean gain less than one SE below the Growth Standard → warning sign. Light Red – mean gain between one and two SEs below the Growth Standard → stronger caution. Red – mean gain more than two SEs below the Growth Standard → most serious warning.
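The four-color rule on this slide can be sketched as a small function. The thresholds come from the slide; how to classify a gain sitting exactly at 1 or 2 SEs below the standard is our assumption.

```python
def gain_rating(mean_gain, se, growth_standard=0.0):
    """Map a cohort's mean NCE gain and its standard error to a PVAAS color rating."""
    if mean_gain >= growth_standard:
        return "Green"                              # at or above the Growth Standard
    deficit_in_ses = (growth_standard - mean_gain) / se
    if deficit_in_ses < 1:
        return "Yellow"                             # slight evidence of lack of progress
    if deficit_in_ses <= 2:
        return "Light Red"                          # greater evidence of lack of progress
    return "Red"                                    # significant evidence of lack of progress
```

Note that the rating depends on the gain measured in standard errors, not raw NCE points: a small negative gain with a large SE is weaker evidence of a problem than the same gain with a small SE.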
    39. 43. Level of Evidence – The Role of Standard Error <ul><li>The color-coded ratings on the mean gain of cohorts are based on the level of confidence we have that the gain of the cohort is truly below the Growth Standard… </li></ul>THE GOAL: at or above the Growth Standard. Slight Evidence of Lack of Progress: less than 1 SE below the Growth Standard. Greater Evidence of Lack of Progress: between 1 and 2 SEs below the Growth Standard. Significant Evidence of Lack of Progress: more than 2 SEs below the Growth Standard.
    40. 44. The Power of PVAAS <ul><li>The power of this methodology is that it produces: </li></ul><ul><ul><li>Accurate estimates of the true level of achievement of the students in this school. </li></ul></ul><ul><ul><li>Updated estimates of all prior mean performance estimates simultaneously as new data is input into the longitudinal data structure. </li></ul></ul><ul><ul><li>Over time, more accurate and reliable estimates of the true level of understanding of the students in this grade or school. </li></ul></ul>
    41. 45. Questions? <ul><li>For more information, contact: </li></ul><ul><li>[email_address] </li></ul><ul><li>717-606-1911 </li></ul>
    42. 46. Gerald L. Zahorchak, D.Ed. Secretary of Education Commonwealth of Pennsylvania www.pde.state.pa.us 333 Market Street Harrisburg, PA 17126