CES 2013: Negotiating evaluation design


Concurrent session #6 (Tuesday June 11th, 15:45-17:15)
Negotiating evaluation design in developmental evaluation: an emerging framework for shared decision-making

by Heather Smith Fowler, Dominique Leonard, & Neil Price


  1. Negotiating evaluation design in developmental evaluation
     Heather Smith Fowler, Dominique Leonard, Neil Price
     CES 2013
  2. Who we are: Who is SRDC?
     • Social Research and Demonstration Corporation (SRDC)
     • Canadian non-profit research firm
     • Why was SRDC selected to lead the evaluation?
       – 20 years’ experience running demonstration projects
       – 10+ years testing new educational programs
       – Methodological + content expertise
       – Presence across Canada
  3. Raising the Grade: Description
     • 35 Boys and Girls Clubs across Canada
     • Funded by Rogers Youth Fund
     • Started in Oct 2012
     • To date, just under 300 youth enrolled
     • National program BUT focus on local flexibility to adapt
     • Academics + tinkering with tech!
     • http://www.raisingthegrade.ca/
  4. Raising the Grade: major program components
     1. Academic support
        – 1:1 mentorship
        – Personal development plans (MyBlueprint), e-portfolios
        – Engaging projects and activities (driven by youth interests)
        – Access to curated, high-quality academic resources
     2. Rogers Tech Centres
     3. Scholarship program
     4. Evaluation and continuous improvement
  5. RTG evaluation
     • Formative, then summative
     • Logic model development
     • Theory-driven?
     • Collaboration on program development
     • Program monitoring architecture
     • What were the clues that this was really DE…?
  6. Hierarchy of evidence
  7. Developmental evaluation characteristics
     • Focused on and responsive to social innovation (dynamic, flexible, context-specific, unique, interested in differences, timely)
     • Rooted in positive, trusting relationships built over time (not always typical of evaluation)
     • Timely engagement & rapid feedback
     • Evaluator as a critical friend
     • Flexible methods to fit research questions
     • Still interested in process and quality, but NOT interested in a linear relationship between goals and outcomes
  8. Evaluation questions
     • Is the program effective at…? Does it work?
     • How is it being implemented across 25-35 Clubs? i.e., what are they doing?
     • Is the program a breeding ground for innovation? (What’s new? How did it come about? What are the contextual factors that contribute to innovation?)
     • What are the critical ingredients of the program in reality?
     • How are youth responding to the program? (What do they like? What do they think needs to change? What are they doing?)
     • How are Ed Managers and mentors responding to the program?
  9. Selection criteria for methods and tools
     • Permit timely feedback
     • Inclusive
     • Engaging and fun
     • Support digital & connected learning principles
     • Empowering
     • Accessible, youth-friendly and authentic (face validity)
     • Available at low cost
     • Comparison data
     • Support capacity-building goals
     • … + valid and reliable
  10. Criteria for evaluation designs
      • Permit timely feedback
      • Match overarching evaluation questions
      • Inform decision-making at different stages
      • Support program principles – equitable, social, participatory; engagement, empowerment, discovery
      • Authentic and meaningful
      • Low-cost
      • Comparison data
      • … + rigorous
  11. Insights from our first year
      • DE can be relevant for certain aspects of a program, while others lend themselves more to tried-and-true methods. We need to discuss with the client which program components are suitable for which types of methods.
      • DE has helped us be creative in imagining new ways to collect program information and reflect it back to program staff & participants, including about their experiences with the program (e.g., participant self-portrait – how to make it evolving and forward-looking?)
      • Research questions are time-sensitive and relative to the stage of the program
  12. Challenges
      • Being sufficiently critical – it doesn’t come naturally to us (too Canadian!), especially during the stage of building close relationships
      • Trying to plan workloads and stay on budget when the program is evolving and tasks are changing in response to emerging findings and practice
      • How can we show our added value as evaluators without concrete, rigorous methods that make definitive statements about the program?
      • How to determine if social innovation is effective? How much can the vision change?
  13. More challenges
      • How not to slide into old habits/traditional methods and ways of working?
      • Developing, learning, and using innovative evaluation methods
      • How to mirror back personal experiences/information about changes to youth without violating confidentiality?
      • Keeping a long-term perspective
  14. Next steps
      • Separating out implementation-level questions from overarching questions
      • What are the priority questions right now (and down the road), and what are the best designs to address those?
      • Defining the inquiry framework – Appreciative Inquiry, Actual vs Ideal Comparison
      • What? So what? Now what?