ICMI Quality Presentation




  1. Today’s Discussion
      What’s Wrong with Quality?
     Measuring and Monitoring the Quality Program
     Quality vs. Customer Satisfaction
     Building Value
  2. The Challenge: What’s Wrong with Quality?
      Failure to measure, analyze and drive an ROI
      Little or no linkage between quality measurements and customers’ perception of a quality encounter
      An emerging understanding of more strategic program applications, such as experience monitoring and voice-of-customer programs, that dramatically enhance the value of the contact center to the organization
  3. Quick Practitioner Poll
     Is your Quality Management Program currently optimized in these areas?
      Financial/ROI
      Customer Satisfaction
      Adding Value Beyond the Contact Center
  4. Today’s Discussion
     What’s Wrong with Quality?
      Measuring and Monitoring the Quality Program
     Quality vs. Customer Satisfaction
     Building Value
  5. How to Measure Quality Program Financials: Cost of Quality Analysis
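The deck names a Cost of Quality analysis without defining it; the classic COQ model sums conformance costs (prevention, appraisal) and non-conformance costs (internal and external failure). A minimal sketch, with all dollar figures hypothetical:

```python
# Classic Cost of Quality (COQ) breakdown. All figures are hypothetical
# monthly amounts for illustration only.
costs = {
    "prevention": 12_000,        # e.g. agent training, process design
    "appraisal": 18_000,         # e.g. call monitoring, calibration sessions
    "internal_failure": 7_500,   # e.g. rework caught before the customer
    "external_failure": 22_500,  # e.g. repeat contacts, credits, churn
}

coq_total = sum(costs.values())
conformance = costs["prevention"] + costs["appraisal"]
non_conformance = costs["internal_failure"] + costs["external_failure"]

# COQ as a share of a hypothetical operating budget
operating_budget = 400_000
coq_pct = 100 * coq_total / operating_budget

print(f"Total COQ: ${coq_total:,} ({coq_pct:.1f}% of budget)")
print(f"Conformance: ${conformance:,}  Non-conformance: ${non_conformance:,}")
```

Tracking the conformance/non-conformance split over time shows whether spending on prevention and appraisal is actually driving failure costs down.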
  6. How to Measure Quality Program Financials: Return on Quality
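Return on Quality is a standard ROI ratio applied to the quality program: attributable benefit minus program cost, divided by program cost. A sketch with hypothetical annualized figures:

```python
# Return on Quality as (benefit - cost) / cost. Benefit line items are
# hypothetical annualized savings attributed to the quality program.
benefits = {
    "reduced_repeat_contacts": 90_000,  # fewer callbacks after FCR gains
    "reduced_escalations": 25_000,
    "retained_revenue": 60_000,         # churn avoided via better CSAT
}
program_cost = 110_000  # monitoring staff, tools, calibration time

annual_benefit = sum(benefits.values())
roq = (annual_benefit - program_cost) / program_cost

print(f"Annual benefit: ${annual_benefit:,}")
print(f"Return on Quality: {roq:.0%}")
```

The hard part in practice is defending the benefit attributions, which is why the deck stresses measuring and analyzing ROI rather than asserting it.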
  7. How to Monitor the Monitoring Program
     Monitor the program for consistent execution. Sound programs will include monthly or quarterly reporting of:
     – Program cost
     – Audit cost
     – Calibration Variance
     – Auditor Effectiveness
     – Program Constraints
     – Margin of Error
     – Impact of Process/Policy Changes
     Annual reporting and analysis of program ROI
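Two of the metrics above are directly computable. Calibration variance is the spread of auditor scores on the same monitored contact; margin of error follows from the audit sample size. A sketch using hypothetical scores and the normal approximation for a proportion:

```python
import math
import statistics

# Calibration variance: spread of scores different auditors give the
# same monitored call. Scores below are hypothetical (0-100 scale).
auditor_scores = [82, 78, 85, 80, 79]
calibration_variance = statistics.pvariance(auditor_scores)
calibration_spread = max(auditor_scores) - min(auditor_scores)

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an observed pass rate p over n audits
    (normal approximation for a proportion)."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: 85% of 200 audited calls met the quality standard.
moe = margin_of_error(p=0.85, n=200)

print(f"Calibration variance: {calibration_variance:.2f} "
      f"(spread {calibration_spread} pts)")
print(f"Margin of error at n=200: \u00b1{moe:.1%}")
```

A rising calibration variance signals the auditors need a recalibration session; the margin of error tells you whether the audit sample is large enough to support the conclusions drawn from it.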
  8. Today’s Discussion
     What’s Wrong with Quality?
     Measuring and Monitoring the Quality Program
      Quality vs. Customer Satisfaction
     Building Value
  9. Quality vs. Customer Satisfaction: Let the Customer be the Judge
  10. Every Service Encounter = Moment of Truth
  11. Does Quality Monitoring = Quality Interactions?
      Six steps to improve the linkage of quality processes with customer experiences and loyalty:
      1. Correlate monitoring criteria with customer satisfaction measures
      2. Correlate monitoring scores with post-contact customer satisfaction results
      3. Utilize a customer-oriented approach to monitoring and scoring interactions
      4. Ensure that monitoring results are analyzed to identify performance patterns and trends
      5. Measure contact satisfaction at the agent level
      6. Invest in technologies that optimize auditing sample sizes for greater statistical relevance
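The correlation steps above can be sketched directly: pair each agent's average QA monitoring score with that agent's average post-contact CSAT and compute Pearson's r. The scores below are hypothetical:

```python
# Agent-level pairs: average QA monitoring score (0-100) and average
# post-contact CSAT (1-5). Both series are hypothetical.
qa_scores   = [72, 85, 90, 65, 88, 78, 95, 70]
csat_scores = [3.4, 4.1, 4.5, 3.0, 4.2, 3.8, 4.7, 3.3]

def pearson(xs, ys):
    """Pearson correlation coefficient for two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(qa_scores, csat_scores)
print(f"QA/CSAT correlation: r = {r:.2f}")
# A weak r suggests the monitoring form rewards behaviors customers
# don't notice -- exactly the linkage gap the six steps address.
```

In practice you would also check the sample size behind each agent average before acting on the correlation (step 6).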
  12. Today’s Discussion
      What’s Wrong with Quality?
      Measuring and Monitoring the Quality Program
      Quality vs. Customer Satisfaction
       Building Value
  13. Quality: Building Value Through Collaboration
      Contact centers must increase collaboration with other business units (e.g. HR, Product R&D, IT, Marketing, the Field Workforce and Sales) in order to:
       Share the story
       Educate stakeholders on process
       Seek out input and potential value enhancements
  14. Quality: Building Value Through Intelligence
      Create closed-loop processes that link quality data with:
       Hiring and recruiting process improvements
       Training needs assessment (new hire/ongoing)
       Coaching effectiveness measures and coach development plans
  15. Quick Practitioner Poll
      Does your center measure agent quality metric improvement as a key indicator of Supervisor/Coach effectiveness?
  16. Quality: Building Value Through Voice of the Customer
      Quality Monitoring is a Key Enabler. A sound VOC program should:
       Have a defined value stream for categories and types of customer feedback
       Recognize and account for all customer intelligence stakeholders, with defined routing logic
       Recognize intelligence time sensitivity
       Provide a closed-loop reporting mechanism to ensure lessons learned are leveraged for continued organization improvement
       Provide on-going insight into customer wants and needs
       Increase awareness of customer preferences
       Enable assessment of customer perceptions
       Provide unbiased reporting of feedback
       Mine solicited and unsolicited feedback from internal and external sources, e.g.:
        o Customer buzz monitoring using speech analytics (internal)
        o Social media monitoring (external)
  17. ICMI’s Quality Self Assessment (QSA) Survey Results Review
  18. Overview: Assessment Categories
      1. Enterprise
      2. Quality Assurance Program Structure
      3. Quality Assurance Monitoring Form
      4. Reporting
      5. Calibration
      6. Monitoring and Coaching
      7. Hiring and Training
  19. Positive
      The category with the most positive outlook is…
       Enterprise
  20. Definition of Enterprise Section
      Enterprise: “This area is about the level of support your center’s quality assurance program has within your organization and from executive leadership. It is also about how well your program is aligned with the Enterprise’s mission, vision and customer expectations.”
  21. Enterprise Outlook
       Executive management support for the program is in place.
       An appropriate organizational structure is in place to support the contact center’s quality assurance program and process.
       Quality performance monitoring standards are linked with customers’ expectations and measure both foundation (basic required skills) and finesse (soft skills) standards appropriate for the organization.
  22. Opportunities
      The two categories with opportunities for the most improvement…
       Quality Assurance Program Structure
       Reporting
  23. Definition of Program Structure Section
      Quality Assurance Program Structure: “This area is about your quality assurance program having all essential processes documented, with critical technologies in place to provide a holistic, 360-degree view of the response quality provided on all interaction channels. It is also about understanding the financial payback for all quality improvements and spending.”
  24. Program Structure – Challenges
       Lacking a defined purpose and objectives
       Limited post-interaction surveying
       Not incorporating customer feedback into the process
       Depending only on the quality monitoring process to determine customer expectations
       Limited or under-utilized technology
       Not all customer access channels are monitored
       FCR not measured, or measured with poor methodology
       Not measuring Return on Investment (ROI)
  25. Program Structure Enrichment
      High-performing centers understand that a quality contact is defined by their customers…
       Develop their program objectives and purpose based on customers’ expectations.
       Understand their customers better by using post-transaction automated CSAT surveying to obtain direct feedback.
       Incorporate direct customer feedback into agent quality scoring and coaching feedback.
       Measure FCR by asking the customer via post-transaction automated surveying, and validate with other methods.
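The FCR bullet above pairs two measurements: the customer's own answer from a post-transaction survey, and an operational validation check. A minimal sketch, with both data sets hypothetical:

```python
# FCR measured as the slide recommends: ask the customer directly
# ("Was your issue resolved on this contact?") in a post-transaction
# survey. Responses below are hypothetical.
survey_responses = ["yes", "yes", "no", "yes", "yes",
                    "no", "yes", "yes", "yes", "no"]
fcr_survey = survey_responses.count("yes") / len(survey_responses)

# Validation via another method: share of contacts with no repeat
# contact from the same customer within 7 days (flags hypothetical).
repeat_within_7d = [False, False, True, False, False,
                    True, False, False, True, False]
fcr_operational = repeat_within_7d.count(False) / len(repeat_within_7d)

print(f"Survey FCR: {fcr_survey:.0%}  Operational FCR: {fcr_operational:.0%}")
```

When the two rates diverge sharply, either the survey sample is biased or the repeat-contact window is mis-tuned; agreement is the validation signal.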
  26. Program Structure Enrichment (cont’d)
      High-performing centers understand the value and impact of consistently delivering quality…
       Quality is a priority, and as such, the necessary time and resources are provided.
       Technology is considered a crucial enabler.
       All channels are monitored to ensure response consistency.
       ROI is gained when the cost of quality monitoring is understood and monitoring processes are aligned with organizational strategic objectives.
  27. Definition of Reporting Section
      Reporting: “This area is about whether or not your reporting is thorough and contributes not only to the improvement of the quality of the interactions but also contributes to overall process improvement within the center and Enterprise.”
  28. Reporting – Common Challenges
       Limited actionable reporting.
       Not tracking, trending or analyzing results by category to identify individual and/or center-wide performance results and process improvement opportunities.
       Limited use of quality statistical tools.
       No formal process in place to share key customer data, findings, and feedback outside the center.
  29. Reporting Enrichment
       Effective and meaningful reporting of quality results is critical to high-performing organizations.
       Track and trend results using a variety of analytical/statistical tools to achieve improvement.
       Use automated reporting and/or develop appropriate databases for use in manipulating quality data.
       Develop relationships with other departments in the organization and actively share data obtained from the quality program.
       Use scorecards and dashboards to report/monitor quality results on a daily, weekly, and monthly basis.
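The track-and-trend and scorecard bullets above amount to rolling individual audits up by category and period. A minimal sketch of a weekly scorecard, with all audit records hypothetical:

```python
from collections import defaultdict
from statistics import mean

# Roll individual audit scores up into a weekly scorecard by
# quality-form category. Records below are hypothetical.
audits = [
    {"week": "2024-W01", "category": "greeting",   "score": 90},
    {"week": "2024-W01", "category": "resolution", "score": 70},
    {"week": "2024-W02", "category": "greeting",   "score": 95},
    {"week": "2024-W02", "category": "resolution", "score": 65},
]

buckets = defaultdict(list)
for a in audits:
    buckets[(a["week"], a["category"])].append(a["score"])

weekly = {key: mean(scores) for key, scores in buckets.items()}

# A category trending down across weeks flags a center-wide coaching
# or process opportunity, not just an individual agent issue.
for (week, category), avg in sorted(weekly.items()):
    print(f"{week}  {category:<11} {avg:5.1f}")
```

The same roll-up at daily or monthly granularity feeds the dashboards the slide describes; a database or BI tool replaces the in-memory dict at real volumes.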
  30. Helpful Resources
      ICMI Quality Scorecard: The Real-Time Quality Self-Assessment
       icmi.com/qualityscorecard
      ICMI Quality Whitepaper: Discover Why Contact Center Quality Doesn’t Measure Up And What You Can Do About It
       icmi.com/qualitywhitepaper
      ICMI Quality Advisor: 3 days onsite support, 3 months continuous advisory support
       icmi.com/qualityadvisor