Indicators Workshop, CES 2013

Reed Early's presentation on Choosing Performance Indicators from June 10; see the matched PPTX and DOCX files.



Slide 1

Performance Indicators Within and Across Community Settings
Reed Early, MA CE, Credentialed Evaluator – rearly@telus.net, 250 748 0550
CES National Conference, Toronto, 2013
Mon 3:15 – 4:45 pm, Main Mezzanine, Tudor 8 Room

Agenda
  • 3:15 pm – Criteria for good performance indicators, and examples
  • 3:30 – Styles of indicator selection
  • 3:45 – Lessons learned and traps to avoid when selecting indicators
Questions welcome anytime.

Handout and PowerPoint at http://www3.telus.net/reedspace/shared/PerformanceIndicatorswithinandacrossCommunitysettings
Slide 2

Performance Indicators (PI) Within and Across Community Settings

Performance indicators – some everyday examples:
  • Light bulb – i.e. brightness compared to candles or watts
  • Paint and shingles – i.e. durability over time
  • Business – i.e. 3rd-quarter earnings compared to last year
  • Behavior change class – i.e. did they quit smoking
  • Grade 12 graduates – i.e. employed or in university

Styles of indicator selection:
  • Management (goals and objectives)
  • Strategic (strategies and targets)
  • Consultative (focus group or survey data collection and analysis)
  • Best Practice (borrow from the best, adapt, evaluate and choose)

Performance measurement systems:
  • a human- and computer-based system to measure indicators, i.e. HOMES
  • client-based indicators, i.e. user satisfaction surveys, exit interviews
  • community results-based indicators, i.e. Tools For Action Series: A resource guide for designing a community indicator project, a report by SPARC BC, April 2008
  • Community Objective and Potential Indicators, United Way of Greater Milwaukee, Planning, Allocation & Monitoring Division

Lessons learned:
  • Try to get good indicators
  • Make sure the indicator is not driving performance
  • Don't sacrifice meaning for measurability (or relevance for rigor)
  • Remember – indicators may need revision
  • Use 2+ indicators for anything important
  • Enrich indicators by discussing limitations

Traps to avoid in success indicators (Scriven):
  • Teaching the test (indicator abuse)
  • Indicator becomes the mission
  • Indicator displaces the problem
  • Watch for letting IT drive the choice of indicators (putting the cart before the horse)
  • Doing it because it's popular
  • Quick and dirty evaluation

Take-away notes:
  a) Performance indicators relate to logic models
  b) Indicator selection requires care, time and conscious thought
  c) Different styles are: management, strategic, consultative, best practice
  d) Indicators should be selected and evaluated against criteria

Handout and PowerPoint at http://www3.telus.net/reedspace/shared/
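The "business" everyday example above, comparing 3rd-quarter earnings to last year's, is the simplest kind of comparative indicator. A minimal sketch of the calculation (all figures below are hypothetical, not from the workshop):

```python
# Minimal sketch of a comparative indicator: year-over-year percent change,
# as in the "3rd quarter earnings compared to last year" example.
# All figures below are hypothetical.

def year_over_year_change(current: float, prior: float) -> float:
    """Percent change of the current period relative to the same period last year."""
    return (current - prior) / prior * 100.0

q3_last_year = 250_000.0   # hypothetical Q3 earnings, prior year
q3_this_year = 280_000.0   # hypothetical Q3 earnings, current year

print(f"{year_over_year_change(q3_this_year, q3_last_year):.1f}%")  # 12.0%
```

The same comparison works for any repeated measure (graduation rates, infection counts), as long as the prior-period baseline is measured the same way.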
Slide 3

Performance Indicators (PI) Within and Across Community Settings

Participants: Community agency? Federal? Provincial? Municipal? Academic?

"...the organizational dilemma is: at the top one works on the right problems but with the wrong information, and at the bottom one works with the right information but on the wrong problems..." – Arnold J. Meltsner

Question – think of an example of the above.

1) Common fallacies of selecting performance indicators:
  a) Leave it to management – just ask the people at the top
  b) It's easy – anyone can do it (or you're not a good manager)
  c) It's obvious – just go find some (that will make us look good)
  d) It's quick – form a subcommittee (and tell us tomorrow)

Activity 1 – Obstacles to selecting indicators (3-4 min) (Chart)
  > Close your eyes and write your name on a card (no visual feedback)
  > Turn it over and write your place of birth using your non-dominant hand (new task)
  > Swap cards with the person sitting next to you and verbally instruct them to write your age in months without telling them what it actually is (unfamiliar units, indirect communication)
  > Optional – provide lengthy written instructions via a Policy and Procedure Manual on how to measure your success on the card (awkward, obfuscated instructions)

2) Beliefs and values that help get buy-in. Convince people that:
  • Information is a good thing – accurate information is even better
  • Performance information will empower them to be more efficient and effective
  • Information and experience combined make knowledge
  • Knowledge is to be shared – knowledge is not "power over"
  • Knowledge is to drive action – or it is meaningless

3) Performance indicators – some everyday examples:
  a) Light bulb – i.e. brightness compared to candles or watts
  b) Paint and shingles – i.e. durability over time
  c) Business – i.e. 3rd-quarter earnings compared to last year
  d) Behavior change class – i.e. did they quit smoking
  e) Grade 12 graduates – i.e. employed or in university

4) PI should (in aggregate):
  a) Measure results, short range (outcomes) as well as long range (impacts)
Slide 4

Obsession with outcomes: ...if all you have is a hammer, every problem becomes a nail.

  b) Measure outputs, process, delivery (to know what caused the indicator to shift)
  c) Measure elements of a "theory of change", i.e. the determinants of success
  d) Assess agency resources (capacity is necessary but not sufficient for success)

5) Logic models (Chart):
  a) The best tool to start with
  b) Include at least activities, outputs, and outcomes

6) Success measures proximity (Chart):
  a) Field of influence – include measures outside the walls of the organization, but within the limits of measurability
  b) Timing – include measures during and immediately after the program
  c) Consider long-term measures

7) Styles of indicator selection (Chart):
  a) Management (goals and objectives)
  b) Strategic (strategies and targets)
  c) Consultative (focus group or survey data collection and analysis)
  d) Best Practice (borrow from the best, adapt, evaluate and choose)

8) Management (Chart):
  a) Indicators reflect goals and objectives – i.e. related to logic model outcomes
  b) Useful in service- or prevention-oriented agencies
  c) Often mandated via legislation, i.e. service plans
  d) This style may be used by senior managers to independently choose the indicators

9) Strategic (Chart):
  a) Indicators reflect strategies and targets – i.e. activities, outputs and near outcomes
  b) Useful for units that produce a quantifiable or tangible product
  c) Often adopted as a result of an accountability or accreditation initiative
  d) May be used by a performance team in a larger agency

10) Consultative (Chart):
  a) Indicators based on a focus group or survey of employees (or clients)
  b) Useful for decentralized, flat organizations
  c) Requires time, effort and original data collection
  d) May be used by smaller agencies practicing more democratic management

11) Best Practice (Chart):
  a) Selected from the literature/Internet, from similar leading agencies, and adapted
  b) Useful to any agency with resources to search and research
  c) Requires time, effort, and openness to experimentation
  d) May provide benchmark comparability
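Item 5 recommends starting from a logic model with at least activities, outputs, and outcomes. One way to picture that is to keep each stage and its candidate indicator together in a single record. A hypothetical sketch only (the program and indicator text are invented, echoing the quit-smoking example from slide 2):

```python
# Hypothetical sketch: keeping logic-model stages (activity -> output -> outcome)
# and their candidate indicators together, per item 5. The program name and
# indicator wording below are invented illustrations, not workshop content.
from dataclasses import dataclass, field

@dataclass
class LogicModelRow:
    activity: str
    output: str
    outcome: str
    indicators: dict = field(default_factory=dict)  # stage name -> candidate indicator

row = LogicModelRow(
    activity="Deliver smoking-cessation classes",
    output="Sessions held; participants completing",
    outcome="Participants quit smoking",
    indicators={
        "output": "% of enrollees completing all sessions",
        "outcome": "% reporting quit status at follow-up",
    },
)

# An indicator is read off the stage it is meant to measure:
print(row.indicators["outcome"])
```

Structuring it this way makes criterion checks (e.g. "does each outcome have at least one indicator?") a simple lookup rather than a document search.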
Slide 5

"Our best effort should be spent on finding out what funders, clients and other stakeholders define as success." – Guy Leclerc

The priest, the cabbie and Saint Peter

12) Health examples (Chart):
  a) Immunization rate
  b) Infant mortality rate
  c) Hospital-acquired infections
  d) Inpatient mortality rate
  e) Cost per bed/day

13) Social services examples (Chart):
  a) Child Behavior Checklist
  b) Early unmarried childbearing
  c) School attendance and graduation rate
  d) Children in families below the poverty line
  e) Youth unemployment
  f) Cost per child in care

14) Industry strategic indicators (Chart and example):
  a) Sales
  b) Sales of new products
  c) Value added to raw material consumed
  d) Cost savings to industry, i.e. reduced training and down time
  e) Increased market share %
  f) Increased geographic penetration

15) Continuing education (Montague) (Chart):
  a) Resources (staff, funding)
  b) Reach (target market, consumers)
  c) Relevance (meaningful)
  d) Results – educational (meets the mandate)
  e) Results – financial (affordable)

16) Performance measurement systems:
  a) a human- and computer-based system to measure indicators, i.e. HOMES
  b) client-based indicators, i.e. user satisfaction surveys, exit interviews
  c) community results-based indicators, i.e. Tools For Action Series: A resource guide for designing a community indicator project, a report by SPARC BC, April 2008
  (the above should not be confused with forms of accreditation and audit – such as ISO 9000, CARF, COA, etc.)
  d) Community Objective and Potential Indicators, United Way of Greater Milwaukee, Planning, Allocation & Monitoring Division

17) How do YOU develop performance indicators?

18) Conference performance indicators of success:
Slide 6

  e) Participant satisfaction
  f) Citations to conference
  g) Publications from conference
  h) Networking and outside connections
  i) Sleeper effects (year-later citations?)
  j) Diffusion of benefits (client benefits?)
  k) ……

19) Criteria for a performance indicator:
  A. Authoritative – commonly agreed to be true (i.e. speedometer)
  B. Economical – only what's needed (not all the instruments of a 747)
  C. Ethical – (i.e. urgent child safety issues cannot be "monitored")
  D. Feasible – possible to measure (i.e. difficulty of assessing safe-sex practices)
  E. Logical – outputs vs outcomes (i.e. # pamphlets given does not equal # pamphlets read)
  F. Manageable – suggest 10 at a time and no more than 40 overall
  G. Measurable – qualitative or quantitative (i.e. trust level on a scale of 1-10)
  H. Reliable – accurate (i.e. don't ask literacy of the homeless) (use split-half, test-retest)
  I. Specific – at the right level of precision (i.e. don't ask satisfaction on a 100-point scale)
  J. Visible/accessible, like a car dashboard (i.e. original cars had the gas gauge on the tank)
  K. Timely – (i.e. instant readout of gas economy versus computed mpg, conversions, etc.)
  L. True – measure of success (i.e. logically measures the end goal – face validity)
  M. Valid – exact or close proxy (i.e. not # rings to answer the phone) (construct validity)

20) Types of indicators (Charts):
  l) Analog and continuous
  m) Digital and logical
  n) Qualitative and spatial
  o) Ratio and rates

21) When NOT to do performance measures:
  p) Low dosage – program too weak
  q) Immature – program continuously evolving
  r) Amorphous – no explicit or credible logic/theory
  s) The good cause – program with no goals
  t) Impact is already well known
  u) Poor delivery model
  v) Unethical
  w) Nothing to compare to
  x) A negative finding cannot be accepted
  y) A ridiculous waste of time

22) Lessons learned:
  z) Try to get good indicators
  aa) Make sure the indicator is not driving performance
  bb) Don't sacrifice meaning for measurability (or relevance for rigor)
  cc) Remember – indicators may need revision
  dd) Use 2+ indicators for anything important
Slide 7

  ee) Enrich indicators by discussing limitations

23) Traps to avoid in success indicators (Scriven):
  ff) Teaching the test (indicator abuse)
  gg) Indicator becomes the mission
  hh) Indicator displaces the problem
  ii) Watch for letting IT drive the choice of indicators (putting the cart before the horse)
  jj) Doing it because it's popular
  kk) Quick and dirty evaluation

24) Take-away notes:
  ll) Performance indicators relate to logic models
  mm) Indicator selection requires care, time and conscious thought
  nn) Different styles are: management, strategic, consultative, best practice
  oo) Indicators should be selected and evaluated against criteria

In government, performance indicators cover the range of activities:
  • Financial performance: appropriation mechanism, source and application of funds, prudence, diligence, probity, integrity, and financial accounting and reporting.
  • Legal compliance: fairness, equity and probity – the extent to which the agency has met its legislative requirements and its standards of conduct (such as human rights, employment equity, and conflict-of-interest guidelines).
  • Operational performance: achievement of output targets; delivery systems for the goods and services produced in an economical, efficient and cost-effective manner.
  • Organizational performance: overall capability of the organization and the interactions among strategic planning, management structures and processes, and human, material and financial resources, all in relation to the mission and goal and the demands of the external environment. Management direction, working environment, appropriate control systems, monitoring and reporting systems (on inputs and outputs).
  • Program performance: information on policy intent, and on the continued relevance, appropriateness and responsiveness of programs to the policy (clear objectives, clear goals, outputs, acceptance, intended and unintended outcomes, results, impacts); cost-effectiveness.
  • Institutional performance: the ability of the organization to have reached its purposes, to have fulfilled its mission – to have succeeded, in effect.

Separate is the performance of individuals in the organization, including employee, board, executive committee, management, administrative, and team performance – not to be confused with program/service performance indicators.

Guy Leclerc: Accountability, Performance Reporting, Comprehensive Audit
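Criterion H above names split-half and test-retest as checks on indicator reliability. A minimal sketch of the split-half approach, assuming a multi-item survey indicator: score the odd and even items separately for each respondent, correlate the two half-scores, and apply the Spearman-Brown correction for full test length. The survey data below are invented for illustration.

```python
# Illustrative sketch (not workshop code): split-half reliability with the
# Spearman-Brown correction, one way to check criterion H ("Reliable").
# The survey responses below are hypothetical.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(responses):
    """Score odd and even items separately, correlate the half-scores,
    then apply the Spearman-Brown correction for full test length."""
    odd = [sum(r[0::2]) for r in responses]
    even = [sum(r[1::2]) for r in responses]
    r_half = pearson(odd, even)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown prophecy formula

# Hypothetical survey: 5 respondents x 4 items, each scored 1-5
responses = [
    [4, 4, 5, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
]
print(round(split_half_reliability(responses), 2))
```

A coefficient near 1 suggests the items measure one consistent thing; a low value is a warning that the indicator may be too noisy to rely on for anything important.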
