Research, Policy & Evaluation: If I Knew Then What I Know Now: Building Successful Evaluation

This workshop focused on evaluation tips and tools, lessons learned, and mistakes to avoid. It was designed for those charged with leading evaluation at their organizations.

Published in: Education
Speaker notes
  • Evaluation helps organizations succeed in gaining funding, delivering services, and improving internal processes. Yet conducting rigorous evaluation is a challenge when resources are limited, or when you are in charge of evaluation without having had evaluation training. In this workshop, four evaluators of college-access organizations with different levels of experience identify key real-world stumbling blocks, such as: 1) confusing evaluation for external use with evaluation for internal use; 2) finding that too much data paralyzes organizational decisions; 3) prioritizing data collection over data analysis; 4) finding your audience's eyes glazing over when you talk about evaluation; and 5) over- or underestimating the resources you will need for your project. This session focuses on evaluation tips and tools, lessons learned, and, most importantly, mistakes to avoid. It is designed for those charged with leading evaluation for their organizations (even if they have little evaluation experience); it is not for those completely new to evaluation. Introduction of panelists: Roblyn Brigham, Brigham Nahas Research Associates; Andy Hoge, New Jersey SEEDS; Janet Smith, The Steppingstone Foundation; Danielle Stein Eisenberg, KIPP Foundation.
  • Overview slide for Janet/Danielle's section: briefly describe our organizations (mission, size) and the two things most important to how we do evaluation. KIPP is a national, multi-site, autonomous model: the Foundation does not run or operate the KIPP schools; rather, it provides support, training, and economies of scale. Currently 82 schools in 20 states and DC, serving 21,000 students. We do internal and external evaluations; I'm going to focus on our internal program evaluation work today.
  • Evaluation is not a one-size-fits-all approach. Through experience we've learned that the following factors matter greatly:
    – Size and structure of organization: Who on your staff does program evaluation? No one, everyone, or one person? Does the evaluator wear many hats?
    – Culture of organization: Is evaluation already part of your organization's culture, or will this work be entirely new? What systems and processes need to be put in place to create a data-driven culture? What time and resources are dedicated to all phases?
    – Age of organization: affects what can and should be evaluated.
    – Nature of the program offering(s): direct service, information only, preparing now for future outcomes, multiple sites? Implementing a model vs. responding to a specific program context.
    Two important things about how each organization has decided to do evaluation:
    DSE (KIPP): 1. Recently, we built a culture around making program evaluation a critical component of a program's lifecycle, and engaged program managers in managing their own evaluations. 2. We work hard to first define program goals, participant outcomes, and even process goals, and then connect them to the right evaluation tools and processes so that the information is useful and actionable. (Show the slide with the tool used to define evaluation goals, outcomes, and tools.)
    JS (Steppingstone): 1. Learning organization: make decisions based on research and data (but it is easy to leave evaluation thinking until the end); move beyond "satisfaction surveys" that ask "rate your level of satisfaction." 2. Interconnected teams: shared database; teams own their evaluations and link them; mixed methods; Showcase: why does this matter to you?
  • Evaluable questions: examples of using proxies; be realistic in making claims. Lesson learned: if you are not going to act on evaluation findings, do not collect the data yet (painful decision-making, but necessary).
  • Data collection activities:
    – Articulating "evaluable" questions: What does success look like? (Theory of change.)
    – Deciding what data to collect: data collection is an iterative process of identifying what data needs to be collected, choosing ways of collecting it, getting people on board, and identifying who is involved in collection.
    – Commitment to using results: be clear upfront about how you will use the results of data collection and analysis, with a process for making sure the data are utilized for decision-making and program improvement. How do the data link back to the mission and theory of change? What can you take on now? Decide this as part of a larger evaluation plan over time.
    – What to evaluate (what data to collect)? Guidelines: take incremental steps based on the age of the organization. New organizations beginning evaluation: implementation and staff training. After the first year: focus on knowledge and behavior outcomes of those you serve. After a few years: focus on measuring program impact (with an external evaluator?) at the theory-of-change level.
    Things we've learned from experience:
    DSE: 1. When we were a younger organization, there were no systems in place for doing real program evaluation; program managers were each responsible for doing it on their own, with no support, no guidelines, and no expertise in this area. There was a lot of survey monkeying. We learned that some centralization of effort was needed to ensure quality evaluation was happening, that we were actually learning from our experiences, that we were retaining information during staff transitions, and that the data we collected were actually used. 2. Now that we are a bit older and have some processes in place, the big question becomes "how much is too much": survey/interview request fatigue, what data are really necessary, what needs to be collected every year and what doesn't. 3. Who is involved also matters: having the analyst involved in survey development is critical to knowing what data will actually be usable.
    JS: Steppingstone: consider how to send a "message in a bottle": survey punch-card, team planning tools, comments in Excel.
  • Data analysis: Who is involved in analyzing the data? A team is best, with non-evaluation staff included in later stages to help interpret. Ask yourself: What is surprising, and why? What is worrisome, and why? What is missing? Share the results of analysis: the parts of the organization are linked via a data-driven cycle; activity: each team presents its data. Prioritize the action to be taken in response to the analysis: formative? summative?
    Things we've learned from experience:
    DSE: 1. KIPP: articulate the link to your vision/mission. The "delivering against vision" slide demonstrates that we set the vision first, used a variety of data collection methods to track progress toward the vision (or goals), and then presented the information in ways appropriate for the audience. 2. Example: take action based on results; the KSLP team uses nightly surveys to make immediate adjustments to courses the following day. Other teams make more subtle changes, but the bottom line is that data should be used.
    JS: 1. Steppingstone: Quarterly Showcase, where teams share their own data analysis and other teams point out how those data affect them and where data-sharing will be key (attention to cross-team needs is key: JS's role, establishing a data calendar). 2. Example: Transition Study, exploratory research that included the points of view of a broad range of stakeholders. We analyzed themes with a cross-team analysis group, categorized themes per team, and left each team to do the final interpretation and take action; this informed support services activities and the socio-emotional curriculum.
  • Knowing your audience: something we will all respond to. Who wants headlines? Who will want to test your interpretation of the data? The analyst is not always the best person to judge what works for the audience.
    Examples of what we've learned through experience:
    JS: 1. Visuals count: double-check automated processes (e.g., an Excel chart with the wrong Y axis for percentile data versus one with the correct Y axis; an overly busy, data-heavy graphic in the style of USA Today).
    DSE: 1. KIPP: slide decks on various benchmarks vary, differentiated according to audience. 2. What I learned: I started out writing lots of "reports" and "white papers"; the truth is that PowerPoint often works best for making succinct, easy-to-digest points, and it is easily shared.
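    A minimal sketch, assuming matplotlib and entirely hypothetical percentile numbers, of the "wrong Y axis" pitfall mentioned in the note above: an auto-scaled axis can exaggerate small differences, while pinning the axis to the full 0-100 percentile range keeps the picture honest.

        import matplotlib.pyplot as plt

        cohorts = ["2007", "2008", "2009"]
        median_percentile = [48, 52, 55]   # hypothetical median percentile scores

        fig, (ax_auto, ax_fixed) = plt.subplots(1, 2, figsize=(8, 3))

        # Left: matplotlib auto-scales the Y axis to the data,
        # so a 7-point change fills the whole chart.
        ax_auto.plot(cohorts, median_percentile, marker="o")
        ax_auto.set_title("Auto-scaled axis (exaggerates change)")

        # Right: pin the axis to the 0-100 percentile scale before sharing the chart.
        ax_fixed.plot(cohorts, median_percentile, marker="o")
        ax_fixed.set_ylim(0, 100)
        ax_fixed.set_ylabel("Median percentile")
        ax_fixed.set_title("Axis pinned to 0-100")

        plt.tight_layout()
        plt.show()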
  • Ensure participants feel the collective power and impact of our broader Team and Family and the overall movement by providing a powerful introduction and reconnection to KIPP.
    – Provide an opportunity for big KIPPsters to network and make connections with others by bringing communities together to share, reflect, and learn.
    – Provide an opportunity for personal learning and growth.
    – Kick off the 2009-2010 school year with high energy, building momentum toward the belief of what is possible for our kids and renewing our collective commitment to realizing these possibilities in our KIPP communities across the country.
  • The percentage of doctors decreased by about 15%, but the size of the doctor image (the change in the area of the image) decreased by over 75%: the problem of using area, a two-dimensional quantity, to represent one-dimensional data.
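    A minimal sketch of Tufte's Lie Factor, computed from the figures cited in the note above (the function name is illustrative): the Lie Factor is the size of the effect shown in the graphic divided by the size of the effect in the data, so a value far above 1 signals a distorted chart.

        def lie_factor(effect_shown_in_graphic: float, effect_in_data: float) -> float:
            """Tufte's Lie Factor: ratio of the visual effect to the data effect."""
            return effect_shown_in_graphic / effect_in_data

        data_change = 0.15     # ~15% decline in doctors (from the note above)
        visual_change = 0.75   # >75% shrinkage in the image's area

        print(f"Lie Factor = {lie_factor(visual_change, data_change):.1f}")   # 5.0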
  • Transcript of "Research, Policy & Evaluation: If I Knew Then What I Know Now: Building Successful Evaluation"

    1. If I Knew Then What I Know Now: Building a Successful Evaluation. Roblyn Brigham, Brigham Nahas Research Associates; Andy Hoge, New Jersey SEEDS; Janet Smith, The Steppingstone Foundation; Danielle Stein Eisenberg, KIPP Foundation. April 8, 2010
    2. Overview and Introduction
       • The workshop focus: if I knew then, what I know now…
       • Presentation Outline
       • Introduction of Panelists
    3. Internal and External Evaluation
       • The What, Why, and Who
       • New Jersey SEEDS Alumni Follow-up Study: the story from inside and out
    4. Evaluation Planning: Factors to Consider
       • Organizational Characteristics
       • Data Collection Capacity
       • Data Analysis
    5. Organizational Characteristics and Evaluation Design
       • Size and Structure of Organization
       • Culture of Organization
       • Age of Organization
       • Nature of the Program Offering(s)
    6. Designing Evaluation: Non-negotiables
       • Identify programmatic or evaluation goals upfront during the program design stage
       • Involve key stakeholders at all phases, including analysis
       • Articulate "Evaluable" Questions
       • Articulate an Action Plan for Using Results
         – Short-term & Long-term Evaluation Plan
    7. Data Collection: Capacity and Commitment
       • What Skills for What Aspects of Collection?
       • Standardize Terminology: e.g., enrollment, placement
       • Monitor Data Integrity and Accuracy
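    A minimal, hypothetical sketch of the "standardize terminology / monitor data integrity" point above, assuming a list of student records with an enrollment_status field (the field and status names are illustrative, not from the presentation): records whose status falls outside the shared vocabulary are flagged before they feed into analysis.

        ALLOWED_ENROLLMENT_STATUSES = {"enrolled", "placed", "waitlisted", "withdrawn"}

        def find_invalid_statuses(records):
            """Return (row_index, value) pairs whose status is not in the shared vocabulary."""
            return [
                (i, rec.get("enrollment_status"))
                for i, rec in enumerate(records)
                if str(rec.get("enrollment_status", "")).strip().lower()
                not in ALLOWED_ENROLLMENT_STATUSES
            ]

        sample = [
            {"student_id": 1, "enrollment_status": "Enrolled"},
            {"student_id": 2, "enrollment_status": "accepted"},  # outside the agreed vocabulary
        ]
        print(find_invalid_statuses(sample))  # [(1, 'accepted')]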
    8. Data Analysis: Capacity and Action
       • Who is Involved in Analyzing the Data?
         – Skills
         – Key Points of View
         – What jumps out? What is missing?
       • Prioritize Action to be Taken in Response to Analysis
    9. Presenting Results: Know Your Audience
       • The Presenter and the Audience
       • Lessons learned:
         – Making Claims, Issues of Accuracy
         – Multiple Audiences: Most Effective Format
         – Audience Response
    10. Test Results Over Time
    11. Test Results Over Time
    13. Additional Slides
       • Evaluation Design Tool (KIPP)
       • Vision Mapping (KIPP)
    14. The KSS core team articulated specific goals, objectives, and metrics for the event (which mapped back to the overall vision). Strand leads did the same. The Board Strand's Evaluation Planning Tool (Strand: Boards):
       • Goal: Board members should be inspired by KIPP's mission and energized to contribute as Board members. Objective: Board members will feel inspired to continue their work with KIPP. Metric: 95% of board members will indicate that they feel somewhat or very inspired to continue their work with KIPP. Evaluation Tool: Strand Survey.
       • Goal: Board members should feel part of a network-wide Board community and national reform movement, rather than just a supporter of a local KIPP effort. Objective: Board members will feel like part of a national network. Metric: 90% of board members will indicate that they feel somewhat or very connected to a national community. Evaluation Tool: Strand Survey.
       • Goal: Board members should learn practical skills and/or obtain tools that will enhance their Board's effectiveness. Objective: Board members should leave KSS with at least one tool or practical skill they can immediately put to use. Metric: Can name 1 tool or skill they used. Evaluation Tool: Strand Survey and Follow-Up Survey.
       • Goal: Board members should learn about KIPP initiatives that are meaningful to their Board service, e.g., KIPP Share. Objective: Board members will leave KSS knowing about national initiatives. Metric: Can name 2 KIPP initiatives that are relevant to their region or school. Evaluation Tool: Strand Survey.
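    A minimal, hypothetical sketch of how one row of the planning tool above could be captured as a data structure (the class and field names are illustrative, not KIPP's actual tool): each row ties a strand's goal to an objective, a measurable metric, and the instrument used to collect the evidence.

        from dataclasses import dataclass

        @dataclass
        class EvaluationPlanRow:
            strand: str
            goal: str
            objective: str
            metric: str            # the measurable target, e.g. "95% somewhat or very inspired"
            evaluation_tool: str   # the instrument used to collect the evidence

        boards_row = EvaluationPlanRow(
            strand="Boards",
            goal="Board members are inspired by KIPP's mission and energized to contribute",
            objective="Board members will feel inspired to continue their work with KIPP",
            metric="95% indicate they feel somewhat or very inspired",
            evaluation_tool="Strand Survey",
        )
        print(boards_row)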
    15. Executive summary: KSS 2009 successfully delivered against our vision; per-participant costs were at their lowest level ever. (Please see the appendix for the data supporting the points below.)
       • Collective power; introduction/reconnection: 1,810 people attended KSS 2009, up 16% from last year and the 7th consecutive year of record attendance. 94% of respondents strongly (70%) or somewhat (24%) agree that KSS enhances their sense of belonging to the KIPP community.
       • Network; share, reflect, and learn: 91% of attendees strongly (50%) or somewhat (41%) agree that KSS provides opportunities to learn helpful strategies from colleagues in other schools or organizations. The top two reasons teacher respondents attend KSS are "I value the Professional Development opportunities KSS provides" and "I came to learn new instructional strategies."
       • Personal learning: 90% of respondents strongly (59%) or somewhat (31%) agree that they learned new ideas and strategies at KSS that they could directly apply to their… 94% of all overall session ratings were either "excellent" (34%) or "good" (60%).
       • Kick off the school year with high energy; renew collective commitment: 94% of respondents strongly (70%) or somewhat (24%) agree that KSS "renewed my sense of purpose in my work." KSS was fun!
       • Per-participant costs were at their lowest level ever: KSS 2009 costs were 11% higher than originally budgeted, due primarily to attendance being 13% higher than projected, resulting in KSS 2009 per-participant costs being our lowest ever, down 5.5% from the previous low in 2007.
    16. The Lie Factor (The Visual Display of Quantitative Information, 2nd Ed., by E.R. Tufte, 2001). Los Angeles Times, Aug. 5, 1979, p. 3.