www.evaluationpal.com
Evaluation Pal
Program monitoring and evaluation technology
Canadian Evaluation Society Conference (2013)
Brian Cugelman, PhD
AGENDA
1. Monitoring and evaluation, plus elephants
2. The vision of real-time feedback
3. Introducing Evaluation Pal
4. Project history
MONITORING AND EVALUATION,
PLUS ELEPHANTS
M & E: THE ELEPHANT IN THE ROOM
We say it's about:
• Decision making: Making better decisions based on evidence
• Performance improvement: Learning what works and improving performance
• Risk mitigation: Identifying risks early, to avoid potential crises
But it's often about:
• Accountability: Satisfying donor requirements
M & E FOR MANY ORGANIZATIONS
• Requires expensive consultants
• The process takes up too much staff time
• Valuable information often comes too late
• Few people read big reports
• Evaluators sometimes scare people
THE VISION OF REAL-TIME FEEDBACK
DESIGNING & IMPLEMENTING
Design (rows) × Execution (columns):
                              (-) Bad execution                         (+) Good execution
(+) Evidence informed         Promising intervention, poorly executed   Promising intervention, well executed
(-) Not evidence informed     Unlikely intervention, poorly executed    Unlikely intervention, well executed
• Research is only part of the equation
• Execution is just as important
FEEDBACK AND PERFORMANCE
(Cycle diagram: Goals → Feedback on performance → Improving performance)
WHAT SUCCESS NORMALLY LOOKS LIKE
• Both Internet marketing and public mobilization seem to follow power laws
• Growth can be logarithmic, with peaks and valleys between campaigns
(Chart: impact versus time, years 1-10)
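To make that shape concrete, here is a toy curve, not Evaluation Pal output and with arbitrary constants, combining a logarithmic long-run trend with a yearly campaign cycle:

```python
import math

def impact(t, boost=0.3):
    """Toy success curve: logarithmic growth plus campaign peaks and valleys."""
    trend = math.log(1 + t)                    # long-run logarithmic growth
    wave = boost * math.sin(2 * math.pi * t)   # one campaign cycle per year
    return max(trend + wave, 0.0)

for t in [y / 4 for y in range(1, 41)]:        # quarterly points over 10 years
    print(f"{t:5.2f} {impact(t):.2f}")
```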
WITHOUT FEEDBACK, ORGANIZATIONS CAN'T...
judge which activities are most or least efficient.
Feedback is essential to success, for people and organizations.
TREND TOWARDS ITERATIVE LEARNING AND IMPROVING
(Cycle: 1. Deploy → 2. Assess → 3. Revise)
1. Deploy: Implementing the latest iteration
2. Assess: Measuring and learning
3. Revise: Rethinking and adapting
Unknown cousins:
• Developmental evaluation
• Lean start-up
INTRODUCING EVALUATION PAL
SOLUTION - EVALUATIONPAL.COM
A tool that helps organizations monitor their progress and improve their performance.
A TOOL FOR LEARNING CULTURES
1. Describe your organization
2. Ask for the feedback that you need
3. Collect feedback from informants and add hard evidence
4. Learn from your reports
5. Improve your performance
CASE STUDY OF OUR EVALUATION OF GIO (GREEN INFRASTRUCTURE ONTARIO)
1. DESCRIBE
DESCRIBE YOUR ORGANIZATION – LOGIC MODEL
Inputs:
• Steering Committee members
• Coalition staff
• Expert peer review committee (volunteers)
• Consultants
• Workshop partners
• Volunteers
• Intern
• Funding from Trillium
• Funding from Steering Committee members
• In-kind donations
Activities:
• Conducting outreach & education
• Implementing 5 workshops
• Building the Coalition
• Filing an Environmental Bill of Rights application to change the definition of infrastructure
• Sharing best practices
• Producing the Green Infrastructure Ontario Report
• Carrying out the launch event
• Posting & distributing content through the website
• Producing & sending the e-update
• Operating the Coalition Steering Committee
• Meeting ministers & government staff
Outcomes (short-term):
• Increase awareness & support for green infrastructure among non-profit organizations
• Increase awareness & support for green infrastructure among government staff
• Increase coverage of green infrastructure issues in the media
Outcomes (mid-term):
• Increase awareness & support for green infrastructure among decision makers
• Increase political support & priorities for green infrastructure
• Increase support & priorities for green infrastructure among the public
Outcomes (long-term):
• Increase green infrastructure funding mechanisms
• Increase green infrastructure policy & legislation
Ultimate goal:
• Increase the implementation of green infrastructure in Ontario
DESCRIBE YOUR ORGANIZATION – THREE LOGIC MODELS
There are also logic models for people, focused on personal and professional development.
1. Non-profit organization
2. Social enterprise
3. For-profit organization
2. ASK
Over 40 base, extrapolated, and customer insight metrics and measures, covering:
• Investments
• Implementation quality and efficiency
• Impact
• Likelihood of reaching goals
• Stakeholder satisfaction
• Stakeholder demographics & psychographics
• Constituent volunteering and donating
• Brand health & reputation
• Market, strategy, foresight
• Personal development
METRIC CATEGORIES
Demographics and psychographics
Base metrics:
• Investments
• Implementation quality
• Progress towards goals
• Stakeholder & customer engagement
• Reputation and brand health
• Advice for success
• Market attractiveness
• Equitable office
Extrapolated metrics:
• Value for money
• Effective prioritizing
• Effectiveness engagement (Power Analysis)
• Contribution of activities to goals
• SWOT
• PEST
• Source credibility
• Program implementation fidelity
• Most significant change
• Product and service attractiveness
3. COLLECT
ADD INFORMANTS
Internal:
• Staff / Peers
• Managers
• Board members
• Highly involved volunteers
Partner:
• Volunteers
• Donors / Funders
• Partner organizations
• Consultants and experts
External:
• Customers
• Constituents
• Beneficiaries
• Peer organizations
TRADITIONAL SAMPLING VERSUS PANEL SURVEYS
Traditional surveys engage everyone in one go; Evaluation Pal panels are randomly divided across a year.
TRADITIONAL END-OF-PROGRAM ASSESSMENTS (all in one go)
(Timeline: Jan-Dec, with the whole survey landing in a single month)
• Too much engagement
• Too much information
• Too late to act on insight
EVALUATION PAL PANELS (randomly divided across a year)
(Timeline: Jan-Dec, with a small panel surveyed each month)
• Only engage a small sample at a time
• Random sampling offers confident findings
• Insight available throughout the year
• Randomization within key informant groups
• Near real-time feedback
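The deck doesn't specify the assignment algorithm, so here is a rough sketch of the idea only: informants are shuffled within each key informant group, then dealt round-robin into monthly panels. The field names and the round-robin scheme are assumptions, not Evaluation Pal's actual implementation.

```python
import random
from collections import defaultdict

def assign_panels(informants, months=12, seed=None):
    """Randomly spread informants across monthly survey panels,
    randomizing within each key informant group."""
    rng = random.Random(seed)
    panels = defaultdict(list)            # month index (0-11) -> informants
    by_group = defaultdict(list)
    for person in informants:
        by_group[person["group"]].append(person)
    for members in by_group.values():
        rng.shuffle(members)              # randomize within the group
        start = rng.randrange(months)     # stagger so small groups spread out
        for i, person in enumerate(members):
            panels[(start + i) % months].append(person)
    return panels

informants = [
    {"name": "A. Lee", "group": "staff"},
    {"name": "B. Roy", "group": "volunteers"},
    {"name": "C. Kim", "group": "volunteers"},
    {"name": "D. Das", "group": "funders"},
]
for month, panel in sorted(assign_panels(informants, seed=1).items()):
    print(month + 1, [p["name"] for p in panel])
```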
Sequence of respectful, timed messages:
• 28 days before
• 14 days before
• 7 days before
• 1 day before
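The sequence reduces to a fixed set of offsets from each informant's survey date. A minimal sketch (only the offsets come from the deck; everything else is illustrative):

```python
from datetime import date, timedelta

OFFSETS_DAYS = (28, 14, 7, 1)  # days before each informant's survey date

def reminder_dates(survey_date):
    """Send dates for the timed message sequence, earliest first."""
    return [survey_date - timedelta(days=d) for d in OFFSETS_DAYS]

for send_on in reminder_dates(date(2013, 6, 15)):
    print(send_on.isoformat())
```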
4. LEARN
Living logic model (dashboard screenshot): the GIO logic model from the Describe step, with every input, activity, outcome, and the ultimate goal colour-coded by status.
Status legend: On track | Mixed | At risk | Not assessed
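A living logic model is, at bottom, the step-1 logic model with a status attached to each element. A minimal sketch of such a data structure, assuming field names and status codes based only on the legend above, not Evaluation Pal's schema:

```python
from dataclasses import dataclass
from collections import Counter

STATUSES = ("on_track", "mixed", "at_risk", "not_assessed")

@dataclass
class Element:
    """One box in the logic model."""
    name: str
    column: str   # "input", "activity", "outcome", or "ultimate goal"
    status: str = "not_assessed"

def status_summary(elements):
    """Roll up element statuses for a dashboard-style overview."""
    counts = Counter(e.status for e in elements)
    return {s: counts.get(s, 0) for s in STATUSES}

model = [
    Element("Implementing 5 workshops", "activity", "on_track"),
    Element("Increase coverage of green infrastructure issues in the media",
            "outcome", "at_risk"),
]
print(status_summary(model))
```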
Contribution of activities towards goals (dashboard screenshot)
Effective focus (dashboard screenshot): a grid placing each activity in one of four quadrants: Concentrate here, Keep up the good work, Low priority, Possible overkill.
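These quadrant labels match a classic importance-performance grid, so placement can be sketched as a simple rule over two scores. Illustrative only; the 1-5 scales and the cutoff are assumptions:

```python
def effective_focus(importance, performance, cutoff=3.0):
    """Map an activity's scores (assumed 1-5 scales) to a quadrant."""
    if performance < cutoff:
        return "Concentrate here" if importance >= cutoff else "Low priority"
    return "Keep up the good work" if importance >= cutoff else "Possible overkill"

print(effective_focus(importance=4.2, performance=2.1))  # Concentrate here
```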
Efficiency (dashboard screenshot): activities ranked from least efficient to most efficient.
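This view relates results to effort, in the spirit of the Basic Efficiency Resource framework cited at the end of the deck. A toy sketch that reduces the idea to a single output-per-input ratio; the activity names and numbers are made up:

```python
def efficiency_ranking(activities):
    """Rank activities by output achieved per unit of input."""
    return sorted(activities,
                  key=lambda a: a["output"] / a["input"],
                  reverse=True)

activities = [
    {"name": "Workshops", "input": 20.0, "output": 55.0},
    {"name": "E-update", "input": 5.0, "output": 30.0},
    {"name": "Launch event", "input": 15.0, "output": 20.0},
]
for a in efficiency_ranking(activities):
    print(f'{a["name"]}: {a["output"] / a["input"]:.2f}')
```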
Crowd-sourced SWOT (dashboard screenshot; counts show how many informants raised each theme)
Strengths
•Active, influential and diverse coalition 15
•Commitment, motivation and vision 10
•Communications, outreach and online activities 10
•Green infrastructure is an important topic 6
•Credibility 5
•Networking 5
•Branding and design 3
•Evidence based 3
•Expertise and experience 3
•Timing 3
•Ethics and values 2
•Focus on realistic goals 2
•Inclusive process 2
•Sharing best practices 2
•Workshops and their output 2
Weaknesses
•Public engagement and awareness 12
•Political engagement and support 11
•Setting coalition goals and focusing on green infrastructure topics 7
•Funding 6
•Making a persuasive case for green infrastructure 5
•Media interest 3
•Member commitment, engagement and collaboration 3
•Steering Committee coherence, contributions and leadership 3
•Not enough engagement with stakeholders 3
•Achieving concrete outcomes 2
•Capacity 2
•Reach outside current network 2
•Too much on green roofs 2
Opportunities
Expand the Coalition and network 13
Highlight economic opportunities and savings 9
Raise public awareness and support 5
Make links to climate change and green energy 4
Align with government and municipal priorities 4
Improve government relations and shape policy 3
Highlight benefits 3
Raise awareness through education and events 2
Better use the Coalition 2
Access funding 1
Build local capacity 1
Coalition’s capacity 1
Election commitments 1
Audit green infrastructure and report progress 1
Design school curriculum 1
Threats
Budget limits or perceptions that green infrastructure is not economical 17
Public awareness, apathy and competing issues 13
GI is not understood or valued, or is seen as a fringe idea 10
Persuading implementers that green infrastructure is comparable to grey infrastructure (can't make a strong case) 7
Lack of political relations, awareness and support 5
Coalition governance and vision 4
Lack of a clear message 1
Lack of Canadian case studies 1
Lack of media interest 1
Not enough coordination among key actors 1
Poor existing policy 1
Scope of network too small 1
Slow reaction time 1
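The theme counts above come from coding informants' free-text answers and tallying the codes. A minimal sketch of the tallying step; the example themes are taken from the Strengths list, and the coding itself is assumed to happen upstream:

```python
from collections import Counter

def tally_themes(coded_responses):
    """Count how many informants raised each coded theme, most common first."""
    return Counter(coded_responses).most_common()

strengths = [
    "Active, influential and diverse coalition",
    "Credibility",
    "Active, influential and diverse coalition",
    "Networking",
]
for theme, count in tally_themes(strengths):
    print(count, theme)
```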
REGULAR AND SPECIAL REPORTS
Regular reports (in every report):
• All core performance and impact measures
• Most significant change
• Demographics and psychographics
Special reports (once per year):
1. Gender and equity audit
2. Stakeholder satisfaction
3. Performance barriers and solutions
4. SWOT
5. Staff peer appraisals
6. PEST
5. IMPROVE
A FEEDBACK TOOL FOR AN ENTIRE ORGANIZATION
Program evaluators:
• Save time collecting data
• Focus on learning, rather than harassing staff to collect data
• Support developmental evaluation and lean start-up
• Obtain evidence over time, for the end-of-program evaluation
Management:
• Gain a top-level overview of a program's performance
• Obtain a tool to build a learning organization
• Identify potential threats to the organization or its programs
Staff:
• Marketing and communications: better understanding of the key people who help their organization thrive
• Volunteer coordinators: understand volunteer needs, barriers and satisfaction
• Fundraising: gain insight into constituents and their donating habits over time
PROJECT HISTORY
PROJECT TIMELINE
Analysis models (2009)
BETA 1: invented & launched (2011)
1st pilot study (2011-2012)
BETA 2: redesigned & expanded (2012)
2nd pilot study (2012)
MaRS SIG (2012)
Market testing with numerous NGOs & evaluators (2012-2013)
YLC project (2013)
BER citations:
• Emm, A., Ozlem, E., Maja, K., Ilan, R., & Florian, S. (2011). Value for Money: Current Approaches and Evolving Debates. London, UK: London School of Economics.
• Cugelman, B., & Otero, E. (2010). Basic Efficiency Resource: A framework for measuring the relative performance of multi-unit programs. Leitmotiv and AlterSpark.
• Cugelman, B., & Otero, E. (2010). Evaluation of Oxfam GB's Climate Change Campaign. Leitmotiv, AlterSpark, Oxfam GB.
• Eurodiaconia (2012). Measuring Social Value. Brussels, Belgium.
Want to learn more?
Brian Cugelman, PhD
(416) 921-2055
brian@alterspark.com
www.evaluationpal.com
