State Assessment Data Meeting for Admin

In this presentation, principals will engage in a process to make sense of assessment data, then lead the process with their staffs.

Learning Target: I can lead analysis of data using a data-driven dialog protocol.
Success Criteria:
* Summarize best practices for data analysis
* Predict what we may see
* Make literal observations
* Draw inferences and ask questions
* Identify possible next steps

Learning Target: I can explain AMOs to my staff.
Success Criteria:
* Describe what AMOs are and how they are calculated
* Interpret AMO calculations

Transcript

  • 1. Preliminary Data Learning Meeting, August 2012. Orting School District. Marci Shepard
  • 2. “Be Present”
  • 3.  I can lead analysis of data using a data-driven dialog protocol.
    • Summarize best practices for data analysis
    • Predict what we may see
    • Make literal observations
    • Draw inferences and ask questions
    • Identify possible next steps
     I can explain AMOs to my staff.
    • Describe what AMOs are and how they are calculated
    • Interpret AMO calculations
  • 4.  I can lead analysis of data using a data-driven dialog protocol.
    • Summarize best practices for data analysis
    • Predict what we may see
    • Make literal observations
    • Draw inferences and ask questions
    • Identify possible next steps
  • 7. Update from the State: Online versus Paper and Pencil
  • 8. More technology difficulties this year
    • Technology glitches are reported as irregularities and may have had a negative impact on scores.
      o We have been analyzing differences between students with reported irregularities and those in schools that did not have irregularities, and cannot detect a negative effect.
      o When students were unable to enter an answer or had another technology failure that precluded measuring their skill, a modified scoring table will be applied.
    • There are already many plans in place to fix the technology difficulties.
      o Districts will have more time with the test engine so that students are familiar with the tools and functionality.
      o The test vendor (DRC) will develop a mechanism for verifying that each district has the proper setup for online testing a month prior to testing.
  • 9. Online Testing – Mode Comparability
    • Equating, which compares performance on items common to last year’s test, shows that the raw scores needed to reach Level 2, Level 3, and Level 4 are the same in each mode. But when we apply those cut scores, the percent of students meeting standard on the paper tests is higher than the percent meeting standard on the online tests in nearly all grades and all content areas.
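A fixed raw cut score says nothing about how many students clear it, so the same cut applied to two score distributions can produce very different passing rates. A minimal sketch of that mechanic, using hypothetical raw scores and a hypothetical Level 3 cut (none of these numbers are actual state data):

    # Same Level 3 cut score, two hypothetical score distributions:
    # the lower-scoring group posts a lower percent meeting standard
    # even though the bar itself is identical.

    def percent_meeting_standard(raw_scores, cut_score):
        """Share of students at or above the cut, as a percentage."""
        meeting = sum(1 for s in raw_scores if s >= cut_score)
        return 100.0 * meeting / len(raw_scores)

    LEVEL_3_CUT = 24  # hypothetical raw score for Level 3, same in both modes

    paper_scores  = [18, 22, 24, 25, 27, 29, 30, 31, 33, 35]  # hypothetical
    online_scores = [15, 19, 21, 22, 24, 26, 27, 29, 30, 32]  # hypothetical

    print(percent_meeting_standard(paper_scores, LEVEL_3_CUT))   # 80.0
    print(percent_meeting_standard(online_scores, LEVEL_3_CUT))  # 60.0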
  • 10. Online Testing – Mode Comparability
    • Districts had concerns in the past two years about online being harder, but the psychometrics showed no difference.
    • If we gave identical paper tests, or identical online tests, to two groups of people, one group might do better than the other, and we would conclude that the groups had different abilities (maybe one had more high-performing students). That is what we attributed the minor mode differences to in previous years.
    • But this year brought larger differences, all in favor of paper/pencil tests…
  • 11. Online Testing – Mode Comparability
    The table below shows the 2012 percent meeting standard (based only on equating samples):

    Grade  % Testing  Math            Reading         Science
           Online     Online  Paper   Online  Paper   Online  Paper
    3      ~15%       64.5    65.2    57.8    67.8    -       -
    4      ~25%       57.9    58.7    64.7    71.8    -       -
    5      ~35%       62.8    63.6    67.6    71.7    59.2    67.0
    6      ~50%       61.6    62.3    63.0    70.6    -       -
    7      ~50%       55.0    58.4    64.9    71.0    -       -
    8      ~50%       52.9    58.7    64.6    68.9    61.5    70.9
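The paper-versus-online gaps implicit in this table can be computed directly from its numbers. A small sketch using the math and reading columns above; the output shows reading gaps of roughly 4 to 10 points against math gaps of under a point in grades 3 through 6:

    # Percent meeting standard, 2012 equating samples, from the table above.
    # Each entry: grade -> (online %, paper %).
    math_2012    = {3: (64.5, 65.2), 4: (57.9, 58.7), 5: (62.8, 63.6),
                    6: (61.6, 62.3), 7: (55.0, 58.4), 8: (52.9, 58.7)}
    reading_2012 = {3: (57.8, 67.8), 4: (64.7, 71.8), 5: (67.6, 71.7),
                    6: (63.0, 70.6), 7: (64.9, 71.0), 8: (64.6, 68.9)}

    for subject, table in [("Math", math_2012), ("Reading", reading_2012)]:
        for grade, (online, paper) in sorted(table.items()):
            # A positive gap means paper outperformed online.
            print(f"{subject} grade {grade}: paper - online = {paper - online:+.1f}")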
  • 12. Online Testing – Mode Comparability
    • This year the differences between modes are bigger than in the first two years of online testing.
    • The biggest differences are in the text-based subjects, where students read passages online (reading and science).
    • The differences tend to be smaller in the upper grades, but not always.
    • Technology irregularities did not explain the differences.
  • 13. Online Testing – Mode Comparability
    • In consultation with our assessment vendors, psychometric experts, and national technical advisory committee, we made an adjustment to online scores as part of our equating process.
  • 14. Online Testing – Mode Comparability
    The table below shows the 2012 adjusted percent meeting standard (based only on equating samples):

    Grade  % Testing  Math            Reading         Science
           Online     Online  Paper   Online  Paper   Online  Paper
    3      ~15%       69.0    65.2    71.8    67.8    -       -
    4      ~25%       65.6    58.7    69.2    71.8    -       -
    5      ~35%       66.2    63.6    72.0    71.7    68.6    67.0
    6      ~50%       61.6    62.3    67.3    70.6    -       -
    7      ~50%       61.5    58.4    72.7    71.0    -       -
    8      ~50%       56.8    58.7    68.7    68.9    65.4    70.9
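Setting the reading columns of the two tables side by side shows what the adjustment accomplished: gaps that uniformly favored paper by 4 to 10 points now scatter within a few points of zero. A small sketch using those columns:

    # Reading, 2012 equating samples: (online %, paper %) before and after
    # the equating adjustment, taken from the two tables above.
    before = {3: (57.8, 67.8), 4: (64.7, 71.8), 5: (67.6, 71.7),
              6: (63.0, 70.6), 7: (64.9, 71.0), 8: (64.6, 68.9)}
    after  = {3: (71.8, 67.8), 4: (69.2, 71.8), 5: (72.0, 71.7),
              6: (67.3, 70.6), 7: (72.7, 71.0), 8: (68.7, 68.9)}

    for grade in sorted(before):
        gap_before = before[grade][1] - before[grade][0]
        gap_after = after[grade][1] - after[grade][0]
        print(f"Grade {grade}: paper - online went from "
              f"{gap_before:+.1f} to {gap_after:+.1f}")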
  • 15. This means…
    • Any systematic differences in difficulty between modes have already been adjusted in the scores reported to districts.
    • OSPI will continue to examine mode effects during equating to determine whether an adjustment is warranted in future years.
  • 16.  I can lead analysis of data using a data-driven dialog protocol.
    • Summarize best practices for data analysis
    • Predict what we may see
    • Make literal observations
    • Draw inferences and ask questions
    • Identify possible next steps
  • 20.  I can explain AMOs to my staff.
    • Describe what AMOs are and how they are calculated
    • Interpret AMO calculations
  • 22. Annual Measurable Objectives
  • 23. Testing in the ESEA Flexibility Waiver
    • AYP rules and procedures are replaced by Annual Measurable Objectives.
    • The lowest-performing schools in reading and math must revise their school improvement plans using up to 20% of district Title I monies.
    • Participation in assessments and the performance of subgroups (including English language learners, special education, and poverty) are still key.
  • 24. State-developed differentiated recognition, accountability, and support
    o Reward Schools
      • Highest-performing schools
      • High-progress schools
    o Priority Schools
      • The 5% lowest-performing Title I and Title I-eligible schools with less than a 60% graduation rate
    o Focus Schools
      • The 10% of Title I schools with the highest proficiency gaps
    o Emerging Schools
      • The next lowest 10% of schools on the Focus list, and the next 5% of schools on the Priority list
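To make the selection rules concrete, here is a heavily simplified sketch of how the Focus and Emerging lists could be drawn from a ranking of Title I schools by proficiency gap. The school names and gap figures are hypothetical, and the state's actual identification procedure involves more criteria than this:

    # Rank hypothetical Title I schools by proficiency gap (widest first),
    # then take the top 10% as Focus and the next 10% as Emerging.
    gaps = [31, 28, 26, 24, 22, 21, 19, 17, 15, 14,
            13, 12, 11, 10, 9, 8, 7, 6, 5, 4]  # hypothetical gap sizes
    schools = [(f"School {i:02d}", gap) for i, gap in enumerate(gaps, start=1)]

    ranked = sorted(schools, key=lambda s: s[1], reverse=True)
    tenth = len(ranked) // 10
    focus = ranked[:tenth]                  # the 10% with the highest gaps
    emerging = ranked[tenth:tenth * 2]      # the next 10% on the Focus side

    print("Focus:", [name for name, _ in focus])
    print("Emerging:", [name for name, _ in emerging])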
  • 25. Accountability Evolution with ESEA Waiver

    Up to 2011-12:
    o AYP determinations
    o Sanctions
    o Set-asides
    o School improvement: AYP calculations identify schools and districts in a step of improvement (Title I) and generate the list of Persistently Lowest Achieving Schools
    o SBE/OSPI Achievement Index: used to identify Award Schools

    2012-13 and 2013-14:
    o AMO calculations
    o No sanctions (letters, transportation, etc.)
    o Up to 20% set-asides for Priority, Focus, and Emerging Schools
    o ESEA Waiver Application Accountability System: used to identify Reward, Focus, Priority, and Emerging schools

    2014-15 and beyond:
    o AMO calculations
    o No sanctions (letters, transportation, etc.)
    o Up to 20% set-asides for Priority, Focus, and Emerging Schools
    o ESEA New Accountability System: used to identify Reward, Focus, Priority, and Emerging schools
  • 26. Accountability Evolution with ESEA Waiver

    Up to 2011-12 (AYP determinations):
    o Determinations based on the current status of % meeting standard compared to the Uniform Bar (100% by 2014)
    o AYP determinations reported on the Report Card
    o Not making AYP results in sanctions for Title I schools
    o $$$ set-asides

    2012-13 and 2013-14, and 2014-15 and beyond (AMO calculations):
    o Annual targets to close proficiency gaps by half by 2017; uses 2011 as the baseline and adds equal annual increments (1/6 of the proficiency gap) to reach the 2017 target; each subgroup, school, district, and state has unique annual targets
    o Calculations reported on the Report Card
    o No sanctions
    o Up to 20% set-asides for Priority, Focus, and Emerging Schools
  • 27. State-developed differentiated recognition, accountability, and support
    o Annual Measurable Objectives
      • Using 2011 as a baseline, OSPI set benchmarks that will cut proficiency gaps in half by 2017 for every WA school.
      • No sanctions are required, but the expectation is that SIPs will include strategies to close gaps.
      • N size = 20
  • 28. Annual Measurable Objectives (AMOs)
    WA has opted to set AMOs as equal annual increments toward the goal of reducing by half the percent of students who are not proficient, in all AYP subcategories, by fall 2017 (within six years).
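Since the last two slides define AMOs arithmetically, the yearly targets can be computed directly. A minimal sketch, reading "proficiency gap" as the distance from the 2011 baseline to the 2017 goal of halving the percent not proficient; the 62.0 percent baseline is a hypothetical figure, not an actual OSD number:

    # AMO targets: halve the percent NOT proficient between the 2011
    # baseline and 2017, in six equal annual increments.
    def amo_targets(baseline_pct, baseline_year=2011, goal_year=2017):
        gap = 100.0 - baseline_pct                        # percent not proficient
        step = (gap / 2.0) / (goal_year - baseline_year)  # 1/6 of half the gap
        return {year: round(baseline_pct + step * (year - baseline_year), 1)
                for year in range(baseline_year + 1, goal_year + 1)}

    print(amo_targets(62.0))  # hypothetical baseline
    # {2012: 65.2, 2013: 68.3, 2014: 71.5, 2015: 74.7, 2016: 77.8, 2017: 81.0}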
  • 29. 2012–13 Waiver Tasks
    • The Office of Student and School Success will work with Priority, Focus, and Emerging schools to address gaps.
    • The State Board of Education (SBE) and OSPI are required to submit a revised accountability system request, which is likely to include growth data.
    • The Legislature must pass a law requiring ‘focused evaluations’ to use student growth as a significant factor.
    • The state must establish rules regarding the use of student growth as a significant factor in teacher and principal evaluation and support systems.
  • 30.  I can explain AMOs to my staff.
    • Describe what AMOs are and how they are calculated
    • Interpret AMO calculations
  • 31. CDs with Resources to Use with Your Staff
    • This PowerPoint
    • Preliminary data from the state
    • OSD data comparisons
    • How to calculate AMOs
    • OSD AMO calculations
    • Data-driven dialog protocol
  • 32. Questions?