Transcript

  • 1. C-8 Moving Toward Excellence: Integrating Performance and Outcome Measurements. A Data-driven Organizational Quality and Performance System. Jill Pfitzenmayer, Ph.D., Child & Family Services, Newport, RI; David Robinson, Ed.D., Center for Evaluation and Research with Children and Adolescents, Massachusetts Society for the Prevention of Cruelty to Children, Boston, MA
  • 2. Presentation Objectives
    1. Participants will learn how to identify program goals, objectives and specific program activities toward the accomplishment of program mission, vision and values.
    2. Participants will obtain information about establishing an organizational structure that supports critical analysis of program performance and quality improvement.
    3. Participants will identify specific outcome indicators, measurement tools and data collection methods relevant to their program plans.
  • 3. Presentation Objectives
    4. Participants will learn how to link staff training, employee retention, program performance and outcome measurement efforts.
    5. Participants will be able to understand when internal outcome assessment can be enhanced by the introduction of an outside evaluator as consultant or primary investigator.
  • 4. Agenda
    - Conceptual framework
    - Description of the identified needs and problems
    - Discussion of the methodology and approaches used to address the systems issues
    - Review of specific agency initiatives and reporting structures developed to enhance organizational quality
    - Discussion by an outside evaluator familiar with the agency who was brought in for consultation
  • 5. Definitions
    - Performance Measurement: "Ongoing monitoring and reporting of program accomplishments, particularly progress towards preestablished goals."*
      - type or level of program activities (process)
      - direct products and services delivered (outputs)
      - results of products and services (outcomes)
    - Program Evaluation: "Individual systematic studies conducted periodically or on an ad hoc basis to assess how well a program is working."*
      - internal or external experts and program managers
      - learn benefits of program or how to improve it
    * GAO, April 1998
  • 6. Integrating Performance Measurement and Program Evaluation: A Conceptual Framework
    - Linked to institutional mission, goals, objectives, strategic plan, process and outcome evaluation
    - SMART: Simple, Minimum paper, Actively used, Revised, and Transferable to new programs
    - Best Practices: continuously updated, incorporating new findings, technology and resources
    - Comprehensive: Quantity, Quality, Effort and Effect
  • 7. Program Evaluation
  • 8. Performance Measurement System 1: Gilbert Performance Engineering Approach
    - Develop a MODEL of the process, department, or program
    - Define INDICATORS of valued results and expectations
    - Take MEASURES to see how well the process is working
    - Implement METHODS of improvement and evaluate results
  • 9. Performance Measurement System 2: Friedman Four Quadrant Approach (quadrants formed by input/output and quantity/quality)
    - Quantity of input: How much service did we deliver?
    - Quality of input: How well did we deliver service?
    - Quantity of output: How much effect/change did we produce?
    - Quality of output: What quality of effect/change did we produce?
  • 10. Effective Organizational Performance Measurement Systems
    - Commitment: leadership team committed to measurement
    - Clarity: degree of clear strategies and metrics
    - Metrics: indicators measure valuable processes and expectations
    - Alignment: performance measures aligned with key people processes and structures
    - Involvement: key stakeholders involved in defining, tracking, and ongoing review of the quality performance improvement system
  • 11. Information about CFS
    - Multi-service organization over 136 years old. CEO has been in office for 22 years; powerful Board of Directors.
    - Primary services:
      - Nine child & adolescent residential sites
      - Two child care centers
      - Family counseling program
      - Home-based program
      - School-based programs
      - Transitional housing program
  • 12. Implementation of Continuous Quality Improvement (CQI) Program
    Minimal COA requirements:
    - Chart review
    - Customer satisfaction
    - Utilization review in select areas
    - Individual outcome data
    - Indication of systematic review of agency
  • 13. Implementation Model: Challenges
    - Problem 1: disconnect between accrediting body and agency leadership perceptions of need
    - Problem 2: global discomfort with, and suspicion of, "science" and "data"
    - Problem 3: Continuous Quality Improvement is supposed to involve all levels of staff, not be a top-down process
  • 14. Steps Required to Develop an Outcome System, Year 1:
    1. Assemble a committee representing key constituents of the agency:
    - Agency leadership
    - Information systems
    - Human Resources
    - Business Office
    - Program staff
    - Parent
    - Consumer
  • 15. Steps Required to Develop an Outcome System, Year 1:
    - Develop a conceptual model to help staff review program activities and outcomes (e.g., logic models)
    - Map out organizational structure and evaluation activities (e.g., department teams)
    - Identify a strategy that makes sense to staff to collect, analyze and use information
  • 16. Steps Required to Develop an Outcome System, Year 1:
    - Develop protocol and process for all levels of staff participation
    - Develop tools for reporting program activities at least quarterly, and identify survey instruments
    - Train staff on the recommended model and process
  • 17. Identify a Solution-Focused Process (Plan-Do-Check-Act)
    PLAN
    - Decide what the problem is
    - Brainstorm solutions
    DO
    - Enact solutions
    - Collect data on how solutions are working
    CHECK
    - Analyze data
    - Discuss findings
    ACT
    - Revise action plan if needed
    - Discontinue plan
    - Continue to monitor
  • 18. CQI Activities Related to COA Standards
    - C&FS develops long-term strategic plan (4 yrs) (G2.3)
    - Identified stakeholders participate in all appropriate activities (G2.2)
    - Each dept. develops annual plan of goals and objectives (G2.4)
    - Internal & external audits & reviews (G2.10, G2.5)
    - Chart reviews (G2.6, G2.1)
    - Customer satisfaction surveys (G2.8, G2.10)
    - Outcomes measures (G2.7, G2.10)
    - Quarterly review of each dept.'s achievements by agency CQI Team (G2.9)
    - Annual reports of year-end progress (G2.9)
    - Reassessment, as necessary (G2.11)
  • 19. Flow of CQI Activities
    - Annual plan
    - Quarterly monitoring summary
    - Summaries of client/referral satisfaction forms
    - Summaries of incidents that occurred over the quarter
    - Summary of external review activities
    - Review of department-specific and aggregated findings
  • 20. Year 2: Challenges to CQI System
    - Many staff did not understand logic models
    - Program Directors didn't understand basic concepts (e.g., the difference between benchmark and baseline data)
    - Challenges existed both in grasping the conceptual material and in managing the task of putting information on the computer
  • 21. Year 2: Challenges to CQI System
    - Some programs never tracked outcome data at all
    - State-funded programs required their own outcome tools that were either cumbersome or lacked validity
    - Staff lacked buy-in, experienced turnover, or couldn't make use of findings
    - Several programs had a hard time identifying individual outcome indicators
  • 22. Year 2: Challenges to an Outcome System
    - Mechanically tracking data and outcomes became a challenge; there was no standardized database in the agency
    - Changes implemented as a result of outcome data were often not documented
    - Staff were "busy" but couldn't document real client change (i.e., conflating outputs and outcomes)
  • 23. Year 2: Small Successes for an Outcome System
    Some programs used the process to study and implement real change:
    - Residential Network: system for getting paperwork accomplished
    - Family Counseling: saw change in patient population via use of the SCL-90-R
    - School-based: outcomes supported a pilot program around prevention education
  • 24. Creating Organizational Change
    - Year 2: Agency/Board leadership in strategic planning meetings, which required a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis of programs
    - Lack of careful investigation of program outcomes forced leadership to rely more on fiscal and output information than on outcome data
    - Frustration → Innovation
  • 25. Creating Organizational Change
    Continuous Quality Improvement:
    - Reactive
    - Minimal Board involvement
    - Driven by COA requirements
    - Voluntary management participation
    Organizational Quality & Performance:
    - Proactive
    - Board subcommittee
    - Driven by strategic planning process
    - Sr. VP assigned to job
  • 26. Coordination of OQP Activities (under the Agency Strategic Plan)
    - Organizational Quality & Performance: assess performance and outcomes to align agency goals and program activities
    - Learning Institute: link staff needs with training and learning activities
    - Employer of Choice Initiatives: maintain focus on performance to attract and retain the highest quality staff
  • 27. Moving Toward Excellence: Changes in Culture and Systems
    - Staff are now required to identify steps in developing programs reflecting excellence and best practices
    - Managers are held accountable for process and program outcomes
    - Systematic data collection and analysis requires a new MIS system
    - Agency-wide committees and initiatives brought into the oversight process
  • 28. Moving Toward Excellence: Changes in Culture and Systems
    - New OQP committee members include Corporate Communications and Development in order to ensure inter-departmental coordination
    - Methodology for tracking process and outcomes is in place
    - OQP SVP provides technical assistance in program plan development
  • 29. Moving Toward Excellence: Goals of OQP Process
    - Department-specific process and outcome data will be collected, analyzed and acted upon
    - Inter-departmental coordination will improve:
      - Corporate Communications, MIS & Development included in program planning
      - Programs work together on mutual or complementary projects
      - Inter-program conflicts will be quickly identified and resolved
  • 30. Moving Toward Excellence: Goals of OQP Process
    - Communication across departments, and up and down the chain of command, will improve
    - Agency leadership will have improved and speedier methods of determining where there are problems and strengths in programs
    - Agency and program leaders have tools to continuously plan for the future
  • 31. CFS Program Performance Plan (four-column worksheet)
    Sample plan entries:
    - Interdepartmental staff meetings should be held with regularity to ensure smooth operations and mutual problem-solving.
    - Use of critical thinking skills should be modeled by supervisors.
    - Staff meetings are held regularly to communicate to and train staff on current policies and practices.
    - Staff are encouraged by supervisors to voice opinions about policies and procedures.
    - Staff are taught and encouraged to employ the ability to deconstruct complex tasks while maintaining awareness of the connection to a larger whole.
    - CFS has developed a framework for enabling staff to see patterns and interrelationships, and we are able to learn from our own experiences by reviewing successes and failures systematically.
    Column key:
    - (1) Initial Program Goals: This is how we define the components of Initial Program Goals.
    - (2) Accomplishments: In order to accomplish our Program and Performance Goals, we need to have the following in place.
    - (3) Current Status of Activities: These are the program activities we are currently engaged in or have completed to reflect our intended Accomplishments.
    - (4) Additional Activity Required: These are the program activities that must be completed to reflect our intended Accomplishments, with expected date(s) of completion.
  • 32. CFS Program Outcome Plan
    - (5) Outcomes (the observable, measurable and quantifiable changes we can expect if we meet our Program Goals): Staff will be skilled at acquiring knowledge.
    - (6) Outcomes Quantified (how we will quantify our expected program outcomes): 75% of staff will report that they participated in at least one activity in which they acquired knowledge.
    - (7) Data Sources (where we will obtain information to measure Program Outcomes): Staff self-reports.
    - (8) Measures (instruments or measurement tools we will use to assess Outcomes): Learning self-report.
    - (9) Baseline/Benchmark Data (information to help assess where we are starting (baseline) or comparative markers (benchmark)): None to report.
  • 33. CFS Quarterly Program Performance Report
    - (10) Initial Program Goals (as stated in the Program Performance Plan, column 1): CFS has developed a framework for enabling staff to see patterns and interrelationships, and we are able to learn from our own experiences by reviewing successes and failures systematically.
    - (11) Additional Activity Required (as stated in the Program Performance Plan, column 4): Interdepartmental staff meetings should be held with regularity to ensure smooth operations and mutual problem-solving; use of critical thinking skills should be modeled by supervisors.
    - (12) Progress Made on Additional Activities: Five interdepartmental staff meetings have been held since Sept 03; supervisors received critical thinking training in Aug 03 and have started modeling skills.
    - (13) Comment/Analysis on Progress (note any difficulties or challenges interfering with progress): Supervisors find challenges integrating multidisciplinary staff issues in meetings; additional training required and will be completed by 1/1/04.
  • 34. CFS Quarterly Program Outcome Report
    - (14) Expected Outcomes (as stated in the Program Outcome Plan, column 5): 75% of staff will report that they participated in at least one activity in which they acquired knowledge.
    - (15) Baseline or Benchmark Data (as stated in the Program Outcome Plan, column 9): None to report.
    - (16) Results (sample size and data results from measures and data sources): 100 staff surveyed; 66% reported participation.
    - (17) Analysis/Conclusions (discussion of findings; comparison of this quarter's findings to baseline/benchmarks): Some progress made toward staff participation in knowledge acquisition; staff report challenges in finding time to learn outside of job.
    - (18) Future Actions (what further action will be taken, by whom, and expected date of completion): By Director: provide two hours/month for staff development activities. Implement no later than 12/03.
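The quarterly outcome row above reduces to a simple target comparison: 100 staff surveyed, 66% observed participation against a 75% target, so the outcome was not met this quarter. A minimal sketch of that check (the `outcome_status` helper is hypothetical, not part of the CFS reporting system):

```python
# Minimal sketch: evaluate one quarterly outcome row against its target.
# Function and variable names are illustrative only.

def outcome_status(n_surveyed: int, n_reporting: int, target_pct: float):
    """Return (observed percentage, whether the target was met)."""
    observed_pct = 100.0 * n_reporting / n_surveyed
    return observed_pct, observed_pct >= target_pct

# Slide 34 figures: 100 surveyed, 66 reporting participation, 75% target.
observed, met = outcome_status(100, 66, 75)
print(f"Observed: {observed:.0f}%  Target met: {met}")  # Observed: 66%  Target met: False
```

Reporting both the observed percentage and the pass/fail flag mirrors the report's split between Results (16) and Analysis/Conclusions (17).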
  • 35. Moving Toward Excellence: How Do Program Staff Use the OQP System?
    - Program Directors use the process to bring their vision into program planning
    - Staff are oriented to OQP performance and outcome plans upon hire as a learning tool
    - Program leaders use plans to maintain an eye on program improvement activities
  • 36. Moving Toward Excellence: Lessons Learned
    - Appreciate what you bring to the process from your own discipline and theoretical orientation, and how they inform your outcome assessment work
    - Ask for help. Use volunteers, experts, etc.
    - Get organized and don't panic
    - Don't underestimate the value of communication
    - Maintain a stance of being inquisitive, not judgmental
    - Remind yourself and your staff that the process is developmental
    - Aim for direction, not perfection
  • 37. Genuine Unsolicited Testimony from Staff:
    "I wanted to share with you my excitement of having completed the baseline ECERS and ITERS for child care today!…With this data we can now create our baseline and use it to establish our movement forward to improved overall quality in our Centers for the children, families and our staff! This is so exciting. Good things are in our future, along with much hard work…I am confident that I will lead the way to our success!"
    Patty Shelley, Child Care Program Director
  • 38. ASSESSMENT: Case Study of CQI & Performance Measurement System
    Rating scale: 1 = A Little; 2 = Just Enough; 3 = Success. SCORE =
    - Commitment: leadership team committed to measurement
    - Clarity: degree of clear strategies and metrics
    - Metrics: indicators measure valuable processes and expectations
    - Alignment: performance measures aligned with key people processes and structures
    - Involvement: key stakeholders involved in defining, tracking, and ongoing review of the quality performance improvement system
  • 39. QUESTIONS? COMMENTS? FEEDBACK? ADVICE?
  • 40. CONTACT INFORMATION
    Primary Presenter: Jill Pfitzenmayer, Ph.D.
    Title: Senior Vice President, Organizational Quality and Performance
    Agency: Child & Family Services
    Address: 24 School Street, Newport, RI 02840
    Phone: 401-848-4186
    Fax: 401-841-8841
    Email: [email_address]
    Co-Presenter: David Robinson, Ed.D.
    Title: Director for the Center for Evaluation and Research with Children and Adolescents (CERCA)
    Agency: Massachusetts Society for the Prevention of Cruelty to Children
    Address: 399 Boylston Street, Boston, MA 02116
    Phone: 617-587-1594
    Fax: 617-587-1582
    Email: [email_address] or [email_address]
    Web: www.mspcc.org
