Performance Indicators Within and Across Community Settings
Reed Early, MA, CE (Credentialed Evaluator), rearly@telus.net, 250 748 0550
CES National Conference, Toronto, 2013
Agenda
3:15 pm – criteria for good performance indicators, and examples
3:30 pm – styles of indicator selection
3:45 pm – lessons learned and traps to avoid when selecting indicators
Questions welcome anytime
Mon 3:15 – 4:45pm
Main Mezzanine, Tudor 8 Room
Handout and PowerPoint at http://www3.telus.net/reedspace/shared/
Session overview
Performance Indicators – some everyday examples:
Light bulb – e.g. brightness compared to candles or watts
Paint and shingles – e.g. durability over time
Business – e.g. 3rd quarter earnings compared to last year
Behavior change class – e.g. did they quit smoking
Grade 12 graduates – e.g. employed or in university
Styles of Indicator Selection
Management (goals and objectives)
Strategic (strategies and targets)
Consultative (focus group or survey data collection and analysis)
Best Practice (borrow from the best, adapt, evaluate and choose)
Performance Measurement Systems
A human- and computer-based system to measure indicators, e.g. HOMES
Client-based indicators, e.g. user satisfaction surveys, exit interviews
Community results-based indicators, e.g. Tools for Action Series: A Resource Guide for Designing a Community Indicator Project, SPARC BC, April 2008
Community Objective and Potential Indicators, United Way of Greater Milwaukee, Planning, Allocation & Monitoring Division
Lessons learned
Try to get good indicators
Make sure the indicator is not driving performance
Don’t sacrifice Meaning for Measurability (or Relevance for Rigor)
Remember - indicators may need revision
Use 2+ indicators for anything important
Enrich indicators by discussing limitations
Traps to avoid in Success Indicators (Scriven)
Teaching to the test (indicator abuse)
The indicator becomes the mission
The indicator displaces the problem
Watch for letting IT drive the choice of indicators (putting the cart before the horse)
Doing it because it's popular
Quick and dirty evaluation
Take Away Notes
a) Performance indicators relate to logic models
b) Indicator selection requires care, time and conscious thought
c) Different styles are: management, strategic, consultative, best practice
d) Indicators should be selected and evaluated against criteria
Handout and PowerPoint at http://www3.telus.net/reedspace/shared/
Participants: Community agency? Federal? Provincial? Municipal? Academic?
…the organizational dilemma is: at the top one works on the right problems but with the wrong information, and at the bottom one works with the right information but on the wrong problems…
Arnold J. Meltsner
Question - think of an example of the above
1) Common fallacies of selecting performance indicators
a) Leave it to management - just ask the people at the top
b) It's easy – anyone can do it (or you're not a good manager)
c) It's obvious – just go find some (that will make us look good)
d) It's quick – form a subcommittee (and tell us tomorrow)
Activity 1 - Obstacles to selecting indicators (3-4 min) (Chart)
> Close your eyes and write your name on a card (no visual feedback)
> Turn it over and write your place of birth using your non-dominant hand (new task)
> Swap cards with the person sitting next to you and verbally instruct them to write your age in
months without telling them what it actually is (unfamiliar units, indirect communication).
> Optional – provide lengthy written instructions via a Policy and Procedure Manual on how to measure your success on the card (awkward, obfuscated instructions)
2) Beliefs and values that help get buy-in. Convince people that:
Information is a good thing – accurate information is even better
Performance information empowers them to be more efficient and effective
Information and experience combined make knowledge
Knowledge is to be shared - knowledge is not “power over”
Knowledge is to drive action - or it is meaningless
3) Performance Indicators – some everyday examples:
a) Light bulb – e.g. brightness compared to candles or watts
b) Paint and shingles – e.g. durability over time
c) Business – e.g. 3rd quarter earnings compared to last year
d) Behavior change class – e.g. did they quit smoking
e) Grade 12 graduates – e.g. employed or in university
4) PI should (in aggregate):
a) Measure results, short-range (outcomes) as well as long-range (impacts)
Obsession with outcomes… if all you have is a hammer, every problem becomes a nail
b) Measure outputs, process, delivery (to know what caused the indicator to shift)
c) Measure elements of a “theory of change” i.e. the determinants of success
d) Assess agency resources (capacity is necessary but not sufficient for success)
5) Logic Models (Chart)
a) The best tool to start with
b) Include at least activities, outputs, and outcomes (see the sketch below)
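To make the link between logic models and indicators concrete, here is a minimal Python sketch (not from the handout; the program and indicator names are illustrative): each logic-model stage carries its own indicators, so any indicator can be traced back to an activity, output, or outcome, and a stage with no indicator stands out.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    """One logic-model stage (activity, output, or outcome) and its indicators."""
    name: str
    kind: str                      # "activity", "output", or "outcome"
    indicators: list = field(default_factory=list)

# Hypothetical program: a behavior-change (smoking cessation) class
model = [
    Stage("Deliver behavior-change classes", "activity",
          ["# of sessions held"]),
    Stage("Participants complete the course", "output",
          ["# of completions", "participant satisfaction"]),
    Stage("Participants quit smoking", "outcome",
          ["% abstinent at 6 months (self-report)",
           "% abstinent at 6 months (biochemical test)"]),  # 2+ indicators
]

for stage in model:
    for ind in stage.indicators:
        print(f"{stage.kind:<8} | {stage.name:<35} | {ind}")
```

Printing the model as a table makes coverage gaps visible at a glance, e.g. an outcome with no indicator attached.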
6) Success Measures Proximity (Chart)
a) Field of influence – include measures outside the walls of the organization, but within the limits of measurability
b) Timing – include measures during and immediately after the program
c) Consider long-term measures
7) Styles of Indicator Selection (Chart)
a) Management (goals and objectives)
b) Strategic (strategies and targets)
c) Consultative (focus group or survey data collection and analysis)
d) Best Practice (borrow from the best, adapt, evaluate and choose)
8) Management (Chart)
a) Indicators reflect goals and objectives, i.e. related to logic-model outcomes
b) Useful in service- or prevention-oriented agencies
c) Often mandated via legislation, e.g. service plans
d) Senior managers may use this style to choose the indicators independently
9) Strategic (Chart)
a) Indicators reflect strategies and targets, i.e. activities, outputs, and near-term outcomes
b) Useful for units that produce quantifiable or tangible products
c) Often adopted as a result of an accountability or accreditation initiative
d) May be used by a performance team in a larger agency
10) Consultative (Chart)
a) Indicators based on focus groups or surveys of employees (or clients)
b) Useful for decentralized, flat organizations
c) Requires time, effort, and original data collection
d) May be used by smaller agencies practicing more democratic management
11) Best Practice (Chart)
a) Selected from literature/Internet, from similar leading agencies, and adapted
b) Useful to any agency with resources to search and research
c) Requires time, effort, and openness to experimentation
d) May provide benchmark comparability
Our best effort should be spent on finding out what funders, clients and other
stakeholders define as success.
Guy Leclerc
The priest, the cabbie and Saint Peter
12) Health examples (Chart)
a) Immunization rate
b) Infant mortality rate
c) Hospital-acquired infections
d) Inpatient mortality rate
e) Cost per bed-day
13) Social Services examples (Chart)
a) Child Behavior Checklist
b) Early unmarried childbearing
c) School attendance and graduation rate
d) Children in families below the poverty line
e) Youth unemployment
f) Cost per child in care
14) Industry Strategic Indicators (Chart and example)
a) Sales
b) Sales of new products
c) Value added to raw material consumed
d) Cost savings to industry, e.g. reduced training and downtime
e) Increased market share (%)
f) Increased geographic penetration
15) Continuing Education (Montague) (Chart)
a) Resources (staff, funding)
b) Reach (target market, consumers)
c) Relevance (meaningful)
d) Results - Educational (meets the mandate)
e) Results - Financial (affordable)
16) Performance Measurement Systems
a) A human- and computer-based system to measure indicators, e.g. HOMES
b) Client-based indicators, e.g. user satisfaction surveys, exit interviews
c) Community results-based indicators, e.g. Tools for Action Series: A Resource Guide for Designing a Community Indicator Project, SPARC BC, April 2008
(the above should not be confused with forms of accreditation and audit, such as ISO 9000, CARF, COA, etc.)
d) Community Objective and Potential Indicators, United Way of Greater Milwaukee, Planning, Allocation & Monitoring Division
17) How do YOU develop Performance Indicators?
18) Conference performance indicators of success
a) Participant satisfaction
b) Citations to the conference
c) Publications from the conference
d) Networking and outside connections
e) Sleeper effects (year-later citations?)
f) Diffusion of benefits (client benefits?)
g) ……
19) Criteria for a performance indicator:
A. Authoritative – commonly agreed to be true (e.g. a speedometer)
B. Economical – only what's needed (not all the instruments of a 747)
C. Ethical – (e.g. urgent child safety issues cannot merely be "monitored")
D. Feasible – possible to measure (e.g. the difficulty of assessing safe-sex practices)
E. Logical – outputs vs outcomes (e.g. # pamphlets given does not equal # pamphlets read)
F. Manageable – suggest 10 at a time and no more than 40 overall
G. Measurable – qualitative or quantitative (e.g. trust level on a scale of 1-10)
H. Reliable – accurate (e.g. don't rely on self-reported literacy from homeless clients) (use split-half or test-retest checks; see the sketch after this list)
I. Specific – at the right level of precision (e.g. don't ask satisfaction on a 100-point scale)
J. Visible/accessible – like a car dashboard (early cars had the gas gauge on the tank)
K. Timely – (e.g. an instant readout of fuel economy versus computed mpg and conversions)
L. True – a measure of success (i.e. logically measures the end goal – face validity)
M. Valid – an exact or close proxy (e.g. not # of rings to answer the phone) (construct validity)
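Criterion H names split-half and test-retest checks. Below is a minimal split-half sketch in Python (not part of the handout; the survey data are invented): the items of one instrument are split into odd and even halves, the two half-scores are correlated across respondents, and the Spearman-Brown correction estimates the reliability of the full instrument.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation of two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """item_scores: one list of item scores per respondent.
    Returns the Spearman-Brown corrected split-half coefficient."""
    odd = [sum(row[0::2]) for row in item_scores]    # items 1, 3, 5, ...
    even = [sum(row[1::2]) for row in item_scores]   # items 2, 4, 6, ...
    r = pearson(odd, even)
    return 2 * r / (1 + r)                           # Spearman-Brown correction

# Invented 10-item satisfaction survey, 5 respondents
responses = [
    [4, 5, 4, 4, 5, 4, 5, 4, 4, 5],
    [2, 3, 2, 2, 3, 2, 2, 3, 2, 2],
    [5, 5, 4, 5, 5, 5, 4, 5, 5, 4],
    [3, 3, 3, 2, 3, 3, 3, 3, 2, 3],
    [1, 2, 1, 1, 2, 1, 1, 2, 1, 1],
]
print(f"split-half reliability: {split_half_reliability(responses):.2f}")
```

A coefficient near 1 suggests the items measure the same thing consistently; a low one warns that the indicator may not meet criterion H.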
20) Types of indicators (Charts)
a) Analog and continuous
b) Digital and logical
c) Qualitative and spatial
d) Ratios and rates (see the example after this list)
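The ratio/rate distinction in item d) can be shown with the health examples from section 12 (figures invented for illustration): a ratio divides two quantities directly, while a rate normalizes a count to a population base.

```python
def ratio(numerator, denominator):
    """Plain ratio, e.g. total cost divided by bed-days."""
    return numerator / denominator

def rate_per(count, population, base=1000):
    """Count normalized to a population base, e.g. per 1,000 live births."""
    return count / population * base

# Invented figures for illustration only
print(f"cost per bed-day: ${ratio(1_250_000, 9_400):.2f}")
print(f"infant mortality rate: {rate_per(42, 11_800):.1f} per 1,000 live births")
```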
21) When NOT to do Performance Measures
a) Low dosage – program too weak
b) Immature – program continuously evolving
c) Amorphous – no explicit or credible logic/theory
d) The good cause – program with no goals
e) Impact is already well known
f) Poor delivery model
g) Unethical
h) Nothing to compare to
i) A negative finding cannot be accepted
j) A ridiculous waste of time
22) Lessons learned
a) Try to get good indicators
b) Make sure the indicator is not driving performance
c) Don't sacrifice Meaning for Measurability (or Relevance for Rigor)
d) Remember – indicators may need revision
e) Use 2+ indicators for anything important (see the sketch after this list)
f) Enrich indicators by discussing their limitations
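One way to act on item e), use 2+ indicators for anything important, is to compare them automatically; a minimal sketch with invented names and thresholds: when two indicators of the same outcome diverge beyond a tolerance, the result is flagged for a discussion of limitations rather than reported as a single number.

```python
def triangulate(name, a, b, tolerance=0.10):
    """Compare two indicators (as proportions) of the same outcome.
    Returns a reporting note rather than a single unqualified figure."""
    gap = abs(a - b)
    if gap <= tolerance:
        return f"{name}: ~{(a + b) / 2:.0%} (indicators agree)"
    return (f"{name}: {a:.0%} vs {b:.0%} (gap of {gap:.0%}); "
            f"discuss limitations before reporting a single figure")

# Invented smoking-cessation outcome: self-report vs biochemical test
print(triangulate("quit at 6 months", a=0.38, b=0.21))
```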
23) Traps to avoid in Success Indicators (Scriven)
a) Teaching to the test (indicator abuse)
b) The indicator becomes the mission
c) The indicator displaces the problem
d) Watch for letting IT drive the choice of indicators (putting the cart before the horse)
e) Doing it because it's popular
f) Quick and dirty evaluation
24) Take Away Notes
a) Performance indicators relate to logic models
b) Indicator selection requires care, time, and conscious thought
c) Different styles are: management, strategic, consultative, best practice
d) Indicators should be selected and evaluated against criteria
In government, performance indicators cover the range of activities:
financial performance: appropriation mechanism, source and application of funds, prudence, diligence, probity, integrity, and financial accounting and reporting;
legal compliance: fairness, equity, and probity; the extent to which the agency has met its legislative requirements and its standards of conduct (such as human rights, employment equity, and conflict-of-interest guidelines);
operational performance: achievement of output targets; delivery systems for the goods and services produced in an economical, efficient, and cost-effective manner;
organizational performance: the overall capability of the organization and the interactions among strategic planning, management structures and processes, and human, material, and financial resources, all in relation to the mission and goals and the demands of the external environment; management direction, working environment, appropriate control systems, and monitoring and reporting systems (on inputs and outputs);
program performance: information on policy intent and on the continued relevance, appropriateness, and responsiveness of programs to the policy (clear objectives, clear goals, outputs, acceptance, intended and unintended outcomes, results, impacts); cost-effectiveness;
institutional performance: the ability of the organization to reach its purposes and fulfill its mission; in effect, to have succeeded.
Separate is the performance of individuals in the organization, including employee, board, executive committee, management, administrative, and team performance; this should not be confused with program/service performance indicators.
Guy Leclerc: Accountability, Performance Reporting, Comprehensive Audit
