Evaluation Practices from Top to Bottom

Data sharing and data-driven decision making are critical components of successful collaborations that drive toward student achievement. In this session, we will discuss best practices for developing a data-driven, results-based organization, learning from Higher Achievement's experience successfully undergoing a third-party evaluation, customizing a management information system for in-house use, and regularly using internal and external data to make strategic and programmatic decisions.

  1. Evaluation from Top to Bottom
     Rachel Gwaltney, Chief of Programs, Higher Achievement
     www.higherachievement.org
  2. AGENDA
     • Program overview
     • Third-party research study findings
     • Internal data analysis
       – Participant outcomes
       – Program quality
       – Staff
     • Interactive discussion
       – Benefits and best practices for data and systems
       – Challenges of data and systems
     • Resources
  3. Higher Achievement's Theory of Change
     (Diagram: Increased Academic Opportunities, Increased Academic Interest, Increased Academic Effort, Increased Academic Achievement)
     • After school and summer program offering middle school students 650 hours of extended learning beyond 900 hours of school
     • Preparing scholars for college and career readiness
     • Combined culture and content model
     • Founded 35 years ago in DC
     • Started national expansion in 2008
  4. Who Are Our Scholars?
     Scholars commit to 650 hours per year, beyond the 1,000 hours in school.
     • 5th – 8th grade
     • Starting GPA: 2.5
     • 99% minority
     • 81% FARM-eligible
     • 79% will be first-generation college graduates
     • Most are recommended by teachers
  5. Culture
     – high expectations
     – praise for effort
     – student voice and choice
     – learning is fun!
     http://www.youtube.com/watch?v=EOEZZvI2aKU&feature=related
  6. Results and Impact
     Annual outcomes:
     • Significant improvements in grades, test scores, and school attendance
     • 8th grade graduates (2010):
       – Improved their average GPA from 2.2 to 3.2
       – 95% were placed in a top high school program
       – 85% improved or maintained an A or B in math and reading
     Third-party research:
     • The intensive year-round program had a significant impact on youths' standardized reading and math test scores.
     • 64% of parents of children attending the program confirmed at their first-year follow-up that they spoke to Higher Achievement staff about their child's progress at least once a month.
  7. Research Partners
     • Principal Investigators
       – Carla Herrera (Public/Private Ventures)
       – Jean Baldwin Grossman (P/PV, Princeton)
       – Leigh L. Linden (The University of Texas at Austin)
     • Funders of published work to date
       – The Atlantic Philanthropies
       – The William T. Grant Foundation
       – The Wallace Foundation
     • Data Collection
       – Survey Research Management
  8. Research design
     • Overview
       – Evaluation of the Higher Achievement program (after-school and summer program)
       – One-year, two-year, and summer findings
       – Four-year evaluation in progress
     • Recruitment and randomization
       – 951 students applied to Higher Achievement
       – More students applied than Higher Achievement could serve
       – Randomly chose students to offer admission to Higher Achievement
       – Remainder became a control group
     • Advantages of design
       – Gold standard evaluation strategy
       – Sample comprises the "types" of children served by Higher Achievement
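As a rough illustration of the admissions lottery described on this slide, the sketch below randomly assigns applicants to treatment and control groups. It is a minimal sketch assuming a single, unstratified lottery with a made-up number of available slots; the study's actual randomization procedure (for example, stratification by site, grade, or cohort, and the true treatment/control split) is not specified here.

```python
import random

def run_lottery(applicants, slots, seed=None):
    """Randomly offer admission to `slots` applicants; the rest form the control group.

    Simplified, unstratified lottery for illustration only -- the actual study
    may have randomized within sites, grades, or cohorts.
    """
    rng = random.Random(seed)
    pool = list(applicants)
    rng.shuffle(pool)            # random order determines who is offered a slot
    treatment = pool[:slots]     # offered admission to Higher Achievement
    control = pool[slots:]       # not offered admission; followed as the comparison group
    return treatment, control

# Hypothetical example: 951 applicants (as on the slide), with an assumed
# number of slots -- the real split is not given in the presentation.
applicants = [f"applicant_{i}" for i in range(951)]
treatment, control = run_lottery(applicants, slots=476, seed=42)
print(len(treatment), len(control))
```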
  9. Outcomes Measured
     • Key outcomes and variables of interest:
       – Standardized test scores
         • Abbreviated SAT 10 Problem Solving
         • Abbreviated SAT 10 Reading Comprehension
       – Behavior
       – Academic attitudes
       – Perceptions of peer and adult support
       – Participation in Higher Achievement and other OST programs
       – Activities related to high school application
     • Analyzed separately:
       – Parent and child assessments of OST programs
       – Mentor and teacher surveys within Higher Achievement
       – Qualitative data on Higher Achievement
  10. Timing of Data Collection
      • Cohort 1 (N=276): Baseline, Spring 2006 (entering 5th/6th grade); FU1, Spring 2007 (6th/7th); FU2, Spring 2008 (7th/8th); FU4, Spring 2010 (9th/10th)
      • Cohort 2 (N=276): Baseline, Spring 2007 (entering 5th/6th grade); FU1, Spring 2008 (6th/7th); FU2, Spring 2009 (7th/8th); FUSp, Spring 2010 (8th); FUFa, Fall 2010 (8th); FU4, Spring 2011 (9th/10th)
      • Cohort 3 (N=399): Baseline, Spring 2008 (entering 5th/6th grade); FU1, Spring 2009 (6th/7th); FU2/FUSp, Spring 2010 (7th/8th); FUFa, Fall 2010 (7th/8th); FU4, Spring 2012 (9th/10th)
      Note: FU1 = One-Year Follow-Up; FU2 = Two-Year Follow-Up; FU4 = Four-Year Follow-Up; FUSp = Spring FU for the Summer Study; FUFa = Fall FU for the Summer Study
  11. Standardized Test Scores
      • Significant effects after two years
        – Problem Solving: 0.12 standard deviations
        – Reading Comprehension: 0.09 standard deviations
      • Effect sizes are larger than those reported for other OST programs evaluated by large-scale RCT studies.
      • No effects after one year
      • No difference during summer 2010
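For readers less familiar with effects reported in standard deviation units, the sketch below shows one common way a standardized effect size (a Cohen's d-style estimate) is computed: the difference between treatment and control group means divided by the pooled standard deviation. This is illustrative only; the study's published impacts were most likely estimated with regression-adjusted models rather than this simple formula, and the scores below are made up.

```python
from statistics import mean, stdev

def standardized_effect(treatment_scores, control_scores):
    """Difference in group means divided by the pooled standard deviation.

    Simple Cohen's d-style calculation for illustration; the published study
    likely used regression-adjusted impact estimates instead.
    """
    n_t, n_c = len(treatment_scores), len(control_scores)
    sd_t, sd_c = stdev(treatment_scores), stdev(control_scores)
    pooled_sd = (((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment_scores) - mean(control_scores)) / pooled_sd

# Hypothetical test scores -- not data from the evaluation
treatment = [612, 598, 635, 620, 607, 641, 615, 603]
control = [601, 590, 628, 611, 599, 630, 604, 595]
print(round(standardized_effect(treatment, control), 2))
```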
  12. Behavior
      • Asked youth about their engagement in several negative behaviors
        – In-school: e.g., principal's office, tardies, skipping
        – Out-of-school: e.g., taking or breaking something, hitting
      • At both the one- and two-year follow-ups, treatment students were more likely to report engaging in some of these behaviors.
  13. Academic Attitudes
      • Six measures
        – Industry and Persistence
        – Creativity
        – Self-Perceptions of Academic Abilities
        – Enjoyment of Learning
        – Curiosity
        – Ability to Change the Future through Effort
      • Overall, treatment students have more negative attitudes than control students after the first year.
      • No overall differences at the second year.
      • Effects vary by the grade at which youth enter HA.
      • Gains in Enjoyment of Learning during summer 2010
  14. Program Participation
      • Higher Achievement provides opportunities that scholars would not otherwise have.
        – Without access to HA, 35 percent attend an academic OST program.
        – Access to HA increases this by 52 percentage points.
        – Treatment students average more time in academic OST programs:
          • 10.3 hours more a week during the academic year
          • 19.8 hours more a week during the summer
  15. Activity Participation
      • Treatment youth were more likely to report engaging in a wide range of activities. For example:
        – Visiting a college campus
        – Speaking to a group about the youth's ideas or work
        – Speaking to an adult about high school, college, and jobs
        – Going to events outside the youth's neighborhood
        – Writing poems, stories, etc. not for school
        – Going to events outside of school
  16. High School Application Activities
      • Only tested in summer 2010 and at the four-year follow-up
      • Students were more likely to report engaging in various preparatory activities. For example:
        – 14 percentage point difference in visiting high schools
        – 15 percentage point difference in getting application information on a school
      • Significant increase in students wanting to attend competitive high schools
        – A relative increase of 16 percentage points
  17. How are these outcomes achieved?
      • This study cannot rigorously answer this question.
      • But Higher Achievement has several characteristics that make it stand out as a strong program:
        – Long-term and intensive
        – Broad range of academic and enrichment activities
        – Guided by grade-level curricular standards
        – Staff are well trained and supported
        – Strives to involve parents
        – Focus on small-group instruction
        – Opportunities for leadership
  18. Conclusions
      • Participation in well-structured, long-term, academically focused out-of-school-time programs can boost student achievement.
      • Gains take time, emerging only after two years of access to the program.
      • Gains coincided with increased reports of negative behavior and without an improvement in academic attitudes.
        – Requires further investigation
      • Engagement in activities related to the high school application process is promising.
        – Fourth-year data collection to be completed this summer and published the following year.
      • Lack of test score differences in summer does not mean that the summer program is not an important component of the program.
  19. Implications:
      • Organizations
      • Sector
      • Policy
  20. Lessons Learned
      • Addressing the moral question of denying access to the program for research purposes
      • Ensuring staff capacity to recruit and support
      • Feedback from researchers was invaluable for program improvement
      • Retention is critical
      • Plan for communicating findings
      • Work with the research team to secure investment
  21. Internal Data Practices – Scholar Data
      • Daily: Individual feedback on session participation and progress
      • Quarterly: Report card data (grades, attendance); Scholar Action Plan
      • Biannually (twice/year): Attitudes and behavior (360 survey)
      • Annually: Scholar outcomes (standardized test scores, GPA, school attendance, high school placement)
  22. Internal Data Analysis – Program
      • Weekly: Feedback from volunteer mentors
      • Biweekly (every two weeks): Dashboard data
      • Periodically: Program quality observations
        – Internal tool correlated to core program elements
        – YPQA external tool
      • Triannually (three times/year): Quality assurance reports compiled from observations
      • Annually: Scholar outcomes
  23. Internal Data Analysis – Staff
      • First 90 days: Completion of orientation goals
      • Quarterly: Progress toward workplan goals
      • Annually: Evaluation against workplan goals and organizational culture; scholar outcomes
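The session description mentions customizing a management information system and rolling scholar data up into annual outcomes. As a purely hypothetical illustration of that kind of roll-up (none of these field names come from Higher Achievement's actual system), the sketch below stores quarterly report card GPAs per scholar and computes the average change from the first to the most recent quarter.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class ScholarRecord:
    """Hypothetical scholar record; field names are illustrative, not an actual MIS schema."""
    scholar_id: str
    quarterly_gpas: list = field(default_factory=list)  # report card GPA, appended each quarter

    def gpa_change(self):
        """GPA change from the first recorded quarter to the most recent one."""
        return self.quarterly_gpas[-1] - self.quarterly_gpas[0]

def average_gpa_change(records):
    """Annual roll-up: average GPA change across scholars with at least two quarters of data."""
    changes = [r.gpa_change() for r in records if len(r.quarterly_gpas) >= 2]
    return mean(changes)

# Made-up example data
records = [
    ScholarRecord("S001", [2.4, 2.7, 2.9, 3.1]),
    ScholarRecord("S002", [2.1, 2.3, 2.6, 2.8]),
]
print(round(average_gpa_change(records), 2))  # 0.7
```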
  24. Lessons Learned
      • Invest in the right systems
      • Set up and enforce strong systems for data collection
      • Train staff to report on, understand, and act on analysis of data
      • Make data-driven improvement part of organizational culture
  25. Discussion
  26. Resources
      • Harvard Family Research Project: www.hfrp.org
        – Afterschool Evaluation 101
      • ChildTrends: www.childtrends.org
        – Data-Driven Decision-Making in Out-of-School Time Programs
      • Forum for Youth Investment: www.forumfyi.org
        – From Soft Skills to Hard Data
      • Wallace Foundation: www.wallacefoundation.org
        – Hours of Opportunity
        – The Cost of Quality OST Programs
      • Public/Private Ventures: www.ppv.org
  27. Thank You!
