Evaluation Pal: Program monitoring and evaluation technology

In this session, Dr. Cugelman will discuss his work to develop an automated program monitoring and evaluation technology called Evaluation Pal. He launched Evaluation Pal in 2011 and, in 2012, pilot tested it in an evaluation of the Green Infrastructure Ontario Coalition that was submitted to the Ontario Trillium Foundation. Soon after, MaRS' Social Innovation Generation accepted it into its incubator program.

In this session, Dr. Cugelman will provide a tour of the tool, and use the Green Infrastructure Ontario case study to demonstrate how automated data collection can be used in the program evaluation process. This presentation will also provide an opportunity to discuss the challenges and opportunities of using technology to aid program evaluation.

  1. Evaluation Pal: Program monitoring and evaluation technology. Canadian Evaluation Society Conference (2013). Brian Cugelman, PhD. www.evaluationpal.com
  2. AGENDA: 1. Monitoring and evaluation, plus elephants 2. The vision of real-time feedback 3. Introducing Evaluation Pal 4. Project history
  3. © Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp. MONITORING AND EVALUATION, PLUS ELEPHANTS
  4. M & E: THE ELEPHANT IN THE ROOM. We say it's about: Decision making: Making better decisions based on evidence. Performance improvement: Learning what works and improving performance. Risk mitigation: Identifying risks early, to avoid potential crises. But it's often about: Accountability: Satisfying donor requirements.
  5. M & E FOR MANY ORGANIZATIONS • Requires expensive consultants • The process takes up too much staff time • Valuable information often comes too late • Few people read big reports • Evaluators sometimes scare people
  6. © Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp. THE VISION OF REAL-TIME FEEDBACK
  7. DESIGNING & IMPLEMENTING. A two-by-two of design against execution: evidence-informed design with good execution is a promising intervention well executed; evidence-informed design with bad execution is a promising intervention poorly executed; a design that is not evidence informed with good execution is an unlikely intervention well executed; and with bad execution, an unlikely intervention poorly executed. • Research is only part of the equation • Execution is just as important
  8. FEEDBACK AND PERFORMANCE (diagram linking goals, feedback on performance, and improving performance)
  9. WHAT SUCCESS NORMALLY LOOKS LIKE • Both Internet marketing and public mobilization seem to follow power laws • Growth can be logarithmic with peaks and valleys between campaigns (chart: impact over time, years 1-10)
  10. WITHOUT FEEDBACK, ORGANIZATIONS CAN'T judge which activities are most or least efficient. Feedback is essential to success, for people or organizations.
  11. TREND TOWARDS ITERATIVE LEARNING AND IMPROVING. A repeating cycle: 1. Deploy: Implementing the latest iteration 2. Assess: Measuring and learning 3. Revise: Rethinking and adapting. Unknown cousins: • Developmental evaluation • Lean start-up
  12. © Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp. INTRODUCING EVALUATION PAL
  13. SOLUTION - EVALUATIONPAL.COM: A tool that helps organizations monitor their progress and improve their performance.
  14. (screenshot)
  15. A TOOL FOR LEARNING CULTURES. 1. Describe your organization 2. Ask for the feedback that you need 3. Collect feedback from informants and add hard evidence 4. Learn from the findings 5. Improve your performance
  16. CASE STUDY OF OUR EVALUATION OF GIO
  17. 1. DESCRIBE
  18. DESCRIBE YOUR ORGANIZATION – LOGIC MODEL. Inputs: Steering Committee members; Coalition staff; Expert peer review committee (volunteers); Consultants; Workshop partners; Volunteers; Intern; Funding from Trillium; Funding from Steering Committee members; In-kind donations. Activities: Conducting outreach & education; Implementing 5 workshops; Building the Coalition; Filing an Environmental Bill of Rights application to change the definition of infrastructure; Sharing best practices; Producing the Green Infrastructure Ontario Report; Carrying out the launch event; Posting & distributing content through the website; Producing & sending the e-update; Operating the Coalition Steering Committee; Meeting ministers & government staff. Outcomes (short-, mid- and long-term): Increase awareness & support for green infrastructure among non-profit organizations; Increase awareness & support for green infrastructure among government staff; Increase coverage of green infrastructure issues in the media; Increase awareness & support for green infrastructure among decision makers; Increase political support & priorities for green infrastructure; Increase support & priorities for green infrastructure among the public; Increase green infrastructure funding mechanisms; Increase green infrastructure policy & legislation. Ultimate goal: Increase the implementation of green infrastructure in Ontario.
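Slide 18's logic model is, at heart, structured data: inputs feed activities, which lead to tiered outcomes and an ultimate goal. A minimal Python sketch of one way such a model could be represented (the class name, fields, and outcome groupings below are illustrative assumptions, not Evaluation Pal's actual schema):

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class LogicModel:
        # Inputs feed activities, which lead to short-, mid- and long-term
        # outcomes and, ultimately, the program's end goal.
        inputs: List[str] = field(default_factory=list)
        activities: List[str] = field(default_factory=list)
        outcomes_short: List[str] = field(default_factory=list)
        outcomes_mid: List[str] = field(default_factory=list)
        outcomes_long: List[str] = field(default_factory=list)
        ultimate_goal: str = ""

    # Illustrative subset of the GIO logic model above.
    gio = LogicModel(
        inputs=["Coalition staff", "Funding from Trillium"],
        activities=["Implementing 5 workshops", "Building the Coalition"],
        outcomes_short=["Increase awareness & support among non-profit organizations"],
        outcomes_mid=["Increase political support & priorities for green infrastructure"],
        outcomes_long=["Increase green infrastructure policy & legislation"],
        ultimate_goal="Increase the implementation of green infrastructure in Ontario",
    )
    print(gio.ultimate_goal)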
  19. DESCRIBE YOUR ORGANIZATION – THREE LOGIC MODELS: 1. Non-profit organization 2. Social enterprise 3. For-profit organization. There are also logic models for people, focused on personal and professional development.
  20. (screenshot)
  21. (screenshot)
  22. (screenshot)
  23. 2. ASK
  24. Over 40 base, extrapolated, and customer insight metrics and measures, covering: constituent volunteering and donating; implementation quality and efficiency; brand health & reputation; stakeholder satisfaction; impact; likelihood of reaching goals; investments; market, strategy, foresight; stakeholder demographics & psychographics; personal development.
  25. METRIC CATEGORIES. Demographics and psychographics. Base-metrics: • Investments • Implementation quality • Progress towards goals • Stakeholder & customer engagement • Reputation and brand health • Advice for success • Market attractiveness • Equitable office. Extrapolated metrics: • Value for money • Effective prioritizing • Effectiveness engagement (Power Analysis) • Contribution of activities to goals • SWOT • PEST • Source credibility • Program implementation fidelity • Most significant change • Product and service attractiveness
  26. 3. COLLECT
  27. ADD INFORMANTS (internal, partner, and external): Staff / Peers; Managers; Board members; Highly involved volunteers; Volunteers; Donors / Funders; Partner organizations; Consultants and experts; Customers; Constituents; Beneficiaries; Peer organizations
  28. (screenshot)
  29. TRADITIONAL SAMPLING VERSUS PANEL SURVEYS: traditional surveys (all in one go) versus Evaluation Pal panels (randomly divided across a year)
  30. TRADITIONAL END OF PROGRAM ASSESSMENTS (all in one go) • Too much engagement • Too much information • Too late to act on insight
  31. EVALUATION PAL PANELS (randomly divided across a year) • Only engage a small sample at a time • Random sampling offers confident findings • Insight available throughout the year • Randomization within key informant groups • Near real time feedback
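Slide 31's key mechanic is randomization within key informant groups, spread across panels over the year. A minimal sketch of that sampling step, assuming six panels and pre-grouped informants (function and variable names are illustrative, not the tool's actual code):

    import random
    from collections import defaultdict

    def assign_panels(informants, n_panels=6, seed=1):
        """Shuffle informants within each key informant group, then deal them
        round-robin into panels so every panel mixes the groups evenly."""
        rng = random.Random(seed)
        by_group = defaultdict(list)
        for name, group in informants:
            by_group[group].append(name)
        panels = defaultdict(list)
        for group, names in sorted(by_group.items()):
            rng.shuffle(names)
            for i, name in enumerate(names):
                panels[i % n_panels + 1].append((name, group))
        return dict(panels)

    informants = [("Ana", "staff"), ("Ben", "board"), ("Cam", "volunteer"),
                  ("Dee", "staff"), ("Eli", "partner"), ("Fay", "volunteer")]
    print(assign_panels(informants, n_panels=3))

Each panel is then surveyed in its own window, which is what keeps insight flowing throughout the year rather than arriving all at once.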
  32. (screenshot)
  33. Sequence of respectful timed messages: 28 days, 14 days, 7 days, and 1 day before.
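Slide 33's reminder cadence (28, 14, 7, and 1 day before) is a simple date calculation. A small sketch, where only the offsets come from the slide and everything else is illustrative:

    from datetime import date, timedelta

    def reminder_dates(survey_date, offsets=(28, 14, 7, 1)):
        # Send a respectful, timed message this many days before the survey date.
        return [survey_date - timedelta(days=d) for d in offsets]

    for send_on in reminder_dates(date(2013, 6, 15)):
        print(send_on.isoformat())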
  34. (screenshot)
  35. (screenshot)
  36. 4. LEARN
  37. (screenshot: the living logic model, showing the GIO logic model from slide 18 with each element flagged as On track, At risk, Mixed, or Not assessed)
  38. (screenshot: contribution of activities towards goals)
  39. (screenshot: effective focus quadrants, labelled Concentrate here, Keep up the good work, Low priority, and Possible overkill)
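Slide 39's effective-focus chart uses classic importance-performance quadrants. A minimal sketch of placing an activity in a quadrant, assuming 1-5 ratings with the scale midpoint as the cutoff (an illustrative choice, not the tool's documented rule):

    def effective_focus(importance, performance, cutoff=3.0):
        # Classify an activity by stakeholder-rated importance vs. current performance.
        if importance >= cutoff:
            return "Keep up the good work" if performance >= cutoff else "Concentrate here"
        return "Possible overkill" if performance >= cutoff else "Low priority"

    print(effective_focus(importance=4.5, performance=2.0))  # -> Concentrate here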
  40. (screenshot: efficiency chart, ranking activities from least efficient to most efficient)
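Slide 40 ranks activities from least to most efficient. A simplified sketch that treats efficiency as impact per unit of investment, in the spirit of the Basic Efficiency Resource framework cited on slide 49 (the scores and the ratio itself are illustrative assumptions):

    def rank_by_efficiency(activities):
        # Higher impact for less investment counts as more efficient (illustrative ratio).
        return sorted(activities, key=lambda a: a["impact"] / a["investment"], reverse=True)

    activities = [
        {"name": "Workshops", "impact": 8, "investment": 4},
        {"name": "E-update", "impact": 5, "investment": 1},
        {"name": "Launch event", "impact": 6, "investment": 6},
    ]
    for a in rank_by_efficiency(activities):
        print(a["name"], round(a["impact"] / a["investment"], 2))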
  41. (screenshot: contribution of activities to goals)
  42. (screenshot: contribution of activities to goals)
  43. CROWD SOURCED SWOT. Strengths: • Active, influential and diverse coalition 15 • Commitment, motivation and vision 10 • Communications, outreach and online activities 10 • Green infrastructure is an important topic 6 • Credibility 5 • Networking 5 • Branding and design 3 • Evidence based 3 • Expertise and experience 3 • Timing 3 • Ethics and values 2 • Focus on realistic goals 2 • Inclusive process 2 • Sharing best practices 2 • Workshops and their output 2. Weaknesses: • Public engagement and awareness 12 • Political engagement and support 11 • Setting coalition goals and focusing on green infrastructure topics 7 • Funding 6 • Making a persuasive case for green infrastructure 5 • Media interest 3 • Member commitment, engagement and collaboration 3 • Steering Committee coherence, contributions and leadership 3 • Not enough engagement with stakeholders 3 • Achieving concrete outcomes 2 • Capacity 2 • Reach outside current network 2 • Too much on green roofs 2
  44. CROWD SOURCED SWOT (continued). Opportunities: • Expand the Coalition and network 13 • Highlight economic opportunities and savings 9 • Raise public awareness and support 5 • Make links to climate change and green energy 4 • Align with government and municipal priorities 4 • Improve government relations and shape policy 3 • Highlight benefits 3 • Raise awareness through education and events 2 • Better use the Coalition 2 • Access funding 1 • Build local capacity 1 • Coalition's capacity 1 • Election commitments 1 • Audit green infrastructure and report progress 1 • Design school curriculum 1. Threats: • Budget limits or perceptions that green infrastructure is not economical 17 • Public awareness, apathy and competing issues 13 • GI is not understood or valued, or is seen as a fringe idea 10 • Persuading implementers that green infrastructure is comparable to grey infrastructure (can't make a strong case) 7 • Lack of political relations, awareness and support 5 • Coalition governance and vision 4 • Lack of a clear message 1 • Lack of Canadian case studies 1 • Lack of media interest 1 • Not enough coordination among key actors 1 • Poor existing policy 1 • Scope of network too small 1 • Slow reaction time 1
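The numbers beside each SWOT theme on slides 43-44 read as counts of informants raising that theme. A small sketch of how coded open-ended responses could be tallied into such a ranked list (assuming responses have already been coded to themes; the data below is made up):

    from collections import Counter

    def tally_themes(coded_responses):
        # Count how many informants mentioned each coded theme, most frequent first.
        return Counter(coded_responses).most_common()

    strengths = [
        "Active, influential and diverse coalition",
        "Commitment, motivation and vision",
        "Active, influential and diverse coalition",
        "Credibility",
    ]
    for theme, count in tally_themes(strengths):
        print(count, theme)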
  45. REGULAR AND SPECIAL REPORTS. Regular reports (in every report): • All core performance and impact measures • Most significant change • Demographics and psychographics. Special reports (once per year): 1. Gender and equity audit 2. Stakeholder satisfaction 3. Performance barriers and solutions 4. SWOT 5. Staff peer appraisals 6. PEST
  46. 5. IMPROVE
  47. A FEEDBACK TOOL FOR AN ENTIRE ORGANIZATION. Program evaluators: • Save time collecting data • Focus on learning, rather than harassing staff to collect data • Support developmental evaluation and lean start-up • Obtain evidence over time, for the end of program evaluation. Management: • Gain a top level overview of a program's performance • Obtain a tool to build a learning organization • Identify potential threats to the organization or its programs. Staff: • Marketing and communications: Better understanding of the key people who help their organization thrive • Volunteer coordinator: Understand volunteer needs, barriers and satisfaction • Fundraising: Gain insight into constituents and their donating habits over time
  48. © Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp. PROJECT HISTORY
  49. PROJECT TIMELINE: Analysis models (2009); BETA 1: invented & launched (2011); 1st pilot study (2011-2012); BETA 2: redesigned & expanded (2012); 2nd pilot study (2012); MaRS SIG (2012); Market testing with numerous NGOs & evaluators (2012-2013); YLC project (2013). BER citations: • Emm, A., Ozlem, E., Maja, K., Ilan, R., & Florian, S. (2011). Value for Money: Current Approaches and Evolving Debates. London, UK: London School of Economics. • Cugelman, B., & Otero, E. (2010). Basic Efficiency Resource: A framework for measuring the relative performance of multi-unit programs. Leitmotiv and AlterSpark. • Cugelman, B., & Otero, E. (2010). Evaluation of Oxfam GB's Climate Change Campaign. Leitmotiv, AlterSpark, Oxfam GB. • Eurodiaconia (2012). Measuring Social Value. Brussels, Belgium.
  50. Want to learn more? Brian Cugelman, PhD | (416) 921-2055 | brian@alterspark.com | www.evaluationpal.com
