College Success Academy: Launching a New Program with Research and Evaluation Partners


Presented at the 2013 NPEA conference by Brigham Nahas Research Associates, The Steppingstone Foundation, and the Kingsbury Center at NWEA

  • Speaker note: Why TSF chose MAP over other assessments: it is adaptive and covers a continuous spectrum, asking more questions at each student's level and reflecting growth in knowledge with greater precision than a "single-form test." Single-form tests (e.g., TerraNova, Iowa, ACT) use a fixed group of questions targeted at a grade rather than at specific students: roughly 50 questions versus thousands to draw from, and timed versus untimed (which also helps gauge engagement). Reliability/validity: MAP has a smaller standard error of measurement.
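The adaptive mechanic described in the note above can be sketched in a few lines. This is a toy illustration, not NWEA's actual algorithm: the item pool, step sizes, and the logistic response model are all invented for the example.

```python
import random

def adaptive_test(pool, true_ability, num_items=10, start=200.0, seed=0):
    """Toy adaptive test: draw each item near the current ability estimate,
    then move the estimate up or down by a shrinking step depending on
    whether the (simulated) answer was correct."""
    rng = random.Random(seed)
    pool = sorted(pool)
    estimate, step = start, 16.0
    for _ in range(num_items):
        # Pick the unused item whose difficulty is closest to the estimate.
        item = min(pool, key=lambda d: abs(d - estimate))
        pool.remove(item)
        # Toy logistic response model: items are answered correctly
        # less often as their difficulty exceeds the true ability.
        p_correct = 1.0 / (1.0 + 2.0 ** ((item - true_ability) / 10.0))
        correct = rng.random() < p_correct
        estimate += step if correct else -step
        step = max(step / 2.0, 1.0)
    return estimate

# With thousands of items to draw from, the test homes in on the
# student's level instead of asking one fixed set of grade-level items.
estimate = adaptive_test(list(range(150, 261)), true_ability=232)
print(round(estimate))
```

This is why an adaptive test can report growth with more precision than a fixed form: every question carries information about the student's actual level rather than about a grade-level average.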

    1. Launching a new program with research and evaluation partners
       NPEA Conference, April 11, 2013
    2. Agenda
       • Welcome and introductions
       • How we got here
       • Program design: the scorecard
       • Planning evaluation activities
       • What we learned
       • The next chapter: measuring non-cognitive skills
    3. Who we are
       • Robert Theaker, Senior Research Associate, The Kingsbury Center, Northwest Evaluation Association
       • Roblyn Brigham, PhD, Managing Partner, Brigham Nahas Research Associates
       • Yully Cha, Chief Program Officer, The Steppingstone Foundation
    4. What we believe about our work
       • Evaluate early and often
       • Culture matters
       • A broad view of what we mean by "data"
       • Findings inspire action
       • Students front and center
    5. How we got here
       • Environmental context
         – Summer learning loss and the achievement gap
         – College access to persistence/graduation
       • The Steppingstone Foundation’s 2009 strategic plan
         – The public school venture
    6. Program design
       1. Academic achievement
          Center for Higher Education Studies, UC Berkeley; Cliff Adelman, U.S. Dept. of Ed.; National Center for Educational Accountability
       2. Socio-emotional competency
          Malecki et al., "Measuring Perceived Social Support: Development of the Child and Adolescent Social Support Scale"
          • Adult relationships: Learning First Alliance, "Every Child Learning: Safe and Supportive Schools"
          • Self-efficacy: Robbins et al., "Do Psychosocial and Study Skill Factors Predict College Outcomes? A Meta-Analysis"
       3. Positive behavior
          Balfanz et al., "Preventing Student Disengagement and Keeping Students on the Graduation Path in Urban Middle Grades Schools"
          • Attendance
          • School disciplinary action
       4. College awareness
          The Bridgespan Group, "Reclaiming the American Dream"; Southern Regional Education Board, "Middle Grades to HS: Mending a Weak Link"; Choy, U.S. Dept. of Ed., "Students Whose Parents Did Not Go to College: Postsecondary Access, Enrollment and Persistence"
    7. Academic achievement
       • The search for the right tool
         – Summative
         – Formative
         – Measure summer learning
         – And we want to compare against national norms
       • The answer: Measures of Academic Progress (MAP)
    8. National Map
       • Partners in 50 states
       • Over 5,000 partner districts
       • Over 6 million students assessed
       • Partners in 100 foreign countries
    9. NWEA uses a RIT scale (Rasch Unit)
       • Equal-interval achievement scale
       • Linked to curriculum
       • Cross-graded: shows growth
       • Greater score precision
       • Like an academic ruler
       [Figure: RIT scale from Beginning Reading to Adult Reading, with sample students Daniel (RIT 184), Devon (207), and Grace (232)]
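The "academic ruler" idea can be made concrete: because the RIT scale is equal-interval, growth is a plain difference, and the same gain represents the same amount of growth anywhere on the scale. The helper below is just an illustration using the slide's sample student scores.

```python
def rit_growth(start_score, end_score):
    """On an equal-interval scale, growth is a simple difference;
    a 10-point gain means the same thing at RIT 184 as at RIT 232."""
    return end_score - start_score

# Sample scores from the slide: Daniel (184), Devon (207), Grace (232).
print(rit_growth(184, 207))  # 23
print(rit_growth(207, 232))  # 25
```

This is what makes cross-graded comparisons and summer-to-summer growth measurement meaningful: differences can be compared directly across students and grades.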
    10. High-performing example
        [Figure: Grace’s test pattern]
    11. Planning with an external evaluator
        • Logic model: articulate "the model"
        • Define outcomes and sequence
        • Build consensus around data
        • Implement observations, focus groups, and interviews
        • Internal team debrief sessions
        • Build SPSS database
        • Mid-year observations and report
        • Continue evaluation activities
        • Three-year analysis and report
    12. Data collection plan
        • Qualitative data
          – Observations, interviews, focus groups
          – Program staff, teachers, parents, tutors, and students
        • Quantitative data
          – Scorecard and MAP
          – Non-cognitive measures
          – Surveys to capture the perspective of parents
        • Process of sharing what we learn
    13. What we learned: qualitative
        Strengths
        • Program administration is strong
        • The program "culture" is taking root
        • Scholars’ enthusiasm for the academics
        Challenges
        • Demanding job for teachers/staff
        • Culture needs to deepen to transform
        • Defining who to serve/who can best benefit from the program
        Our response
        • Change schedule and temperature
        • Admission info and interview sessions
        • IEP info collected in admission; improve faculty orientation
    14. What we learned: quantitative

        Retention (%)              Class 1        Class 2
        Summer 2011                93%            n/a
        Academic Year 2011-2012    87%            n/a
        Summer 2012                92%            94%

        Retention (count)          Class 1        Class 2
        Summer 2011                42/46          n/a
        Academic Year 2011-2012    35/42          n/a
        Summer 2012                29/35 = 63%    26/46 = 78%

                                   English Language Arts      Math
                                   Score   SGP*   A/P**       Score   SGP*   A/P**
        Grade 4 (2011)             230     n/a    13          230     n/a    11
        Grade 5 (2012)             232     55     18          238     65     19

                                   Average GPA    Average days missed from school
        Grade 4 (2011)             2.7            8
        Grade 5 (2012)             2.8            6
    15. What we learned: measuring academic impact
        • Year one
          – Correcting mistakes in how the test is administered
          – Learning how to read and report the results
        • Year two
          – Summer learning impact and school-year effects
          – English Language Learners
          – Boys
    16. Scores and percentiles
    17. How we create a Virtual Comparison Group (VCG)
        • Identify all matching students from the GRD
        • Match on grade, subject, starting achievement, school income, urban vs. rural classification, etc.
        • Randomly select the comparison group
    18. Virtual Comparison Groups (VCGs)
        Compare your student’s growth to similar students in similar schools
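The VCG construction on the previous two slides can be sketched as a filter-then-sample: keep candidates who match the target student on grade, subject, and school profile and whose starting score falls in a narrow band, then randomly draw the comparison group. The field names, band width, and group size below are hypothetical, not NWEA's actual matching rules.

```python
import random

def build_vcg(target, candidates, group_size=51, score_band=3, seed=0):
    """Sketch of VCG matching: filter on exact attributes and a
    starting-achievement band, then randomly sample the group."""
    matches = [
        c for c in candidates
        if c["grade"] == target["grade"]
        and c["subject"] == target["subject"]
        and c["school_income"] == target["school_income"]
        and c["locale"] == target["locale"]  # urban vs. rural, etc.
        and abs(c["start_score"] - target["start_score"]) <= score_band
    ]
    rng = random.Random(seed)
    return rng.sample(matches, min(group_size, len(matches)))

target = {"grade": 5, "subject": "math", "school_income": "low",
          "locale": "urban", "start_score": 230}
# Hypothetical candidate records drawn from a large research database.
candidates = [dict(target, start_score=225 + i % 11) for i in range(200)]
vcg = build_vcg(target, candidates)
print(len(vcg))
```

The random draw is what makes the comparison fair: every matching student is equally likely to be selected, so the group's average growth estimates how similar students in similar schools typically grow.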
    19. Next chapter: measuring non-cognitive skills
        • Virtual Comparison Group (VCG) analysis
        • External evaluator year-three report
        • Non-cognitive assessments
          – Holistic Student Assessment
          – Survey of After-School Youth Outcomes (SAYO)