Service Levels - the Star Rating

How to express service levels in a way that customers can understand.


Speaker notes
  • Helping asset managers is my passion – see the phone number! Acknowledgements: the client, for commissioning the work this is based on; Dean Taylor (Opus), for introducing us to the concept; Kathy Dever-Todd (NAMS), for consultation suggestions; Ross Waugh (Waugh Infrastructure Ltd), for my involvement.
  • Other references: NAMS PQS guideline; NZS/WASSA for toilets; DWSNZ for water supply.
  • Apologise for the image quality.
  • Background: community interest in attending council-organised LoS meetings was low in 2005. The crucial 2008 strategy was “we will go to them”. Some groups were obviously not representative of the community with respect to their area of interest; explain how minority-group inputs were normalised.
  • The bars indicate the percentage of total respondents who rated each service component high, medium, or low, or gave no rating. Staff expertise and location were highly important, as were the range of books, off-street parking and disabled access.
  • Benefit for councillors: they could see what ratepayers were prepared to pay for. Yes: toilets at a heavily used soccer park. No: more playground equipment. With a good indication of service level preferences, we could set about designing a star rating system that is meaningful to ratepayers.
  • You could apply weights to each of the service level components, but I decided to keep the scoring system simple: it works without weighting, and once used, a weighting could not be changed again, because continuity is needed across future AMPs. Consider who will be doing the site surveys and keep the definitions of the service components simple as well (see the scoring sketch after these notes).
  • Each component covers asset provision or operation. Components are often categorised as ‘availability’, ‘reliability’, ‘accessibility’, and so on.
  • Initial calibration was based on experience and verbal descriptions: very low, low-average, and so on.
  • Further calibration was based on the draft star rating: without a significant gap between the rating bands, the result is too sensitive to individual service components.
  • The total score for a building can be thought of as a ‘property quality score’ (NAMS Property manual).
  • Final calibration ensures that expenditure makes an appropriate difference to the rating. If funds are limited, improving operations components might be achieved by re-balancing resources: e.g. open shorter hours but also on Saturday; cleaned the same number of times but rescheduled to suit peak times; or a quicker fault-response target time, because performance shows it is already being achieved at the current cost.
  • Refer to the pyramid of service measures: the detailed PQS system supports the star rating.
  • Again: the star rating is an objective, asset-management-based replacement for a very subjective satisfaction measure. This does not imply that managers can ignore customer satisfaction results; they reflect the mood of the community, but not necessarily the level of service provided.
  • Consultation is continuous (reporting is part of it). The concept and the future reporting are in terms customers can understand.
  • Helping asset managers is my passion – see the phone number! Acknowledgements: our client, for commissioning the work this is based on; Dean Taylor (Opus), for introducing us to the concept; Kathy Dever-Todd/NAMS, for the PQS concept and consultation suggestions; Ross Waugh (Waugh Infrastructure Ltd), for my involvement.
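
To make the notes on unweighted scoring and band calibration concrete, here is a minimal sketch in Python. The component names, the 1-5 scale and the star-band thresholds are illustrative assumptions, not the calibrated values from the actual AMPs; the point is only that every component counts equally and the star bands leave a significant gap between ratings.

```python
# Illustrative sketch only: the component names, the 1-5 scale and the
# star bands below are assumed values, not the calibrated figures from
# the actual AMPs.

# Unweighted scores (1 = very low ... 5 = excellent) for one facility.
# Every component counts equally, so the scheme needs no weighting and
# stays comparable across future AMPs.
component_scores = {
    "availability": 4,    # e.g. opening hours
    "reliability": 3,     # e.g. fault response performance
    "accessibility": 2,   # e.g. ramps, disabled access
    "cleanliness": 4,
}

# Star bands over the possible total (4-20 here). Wide bands mean a
# one-point change in a single component rarely changes the star
# rating - the 'significant gap' from the calibration notes.
STAR_BANDS = [(18, 5), (14, 4), (10, 3), (6, 2), (0, 1)]

def star_rating(scores):
    """Return (total points, star rating) for one facility."""
    total = sum(scores.values())
    for threshold, stars in STAR_BANDS:
        if total >= threshold:
            return total, stars

total, stars = star_rating(component_scores)
print(f"{total} points -> {stars}-star facility")  # 13 points -> 3-star facility
```

Because the scheme is unweighted, the same survey sheet and bands can be reused across future AMPs without breaking continuity.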

Transcript

    1. The ‘Star Rating’ – Making Technical Standards Real to Customers. Colin Symonds, IMCS Ltd. Phone 021 HELP-AM (021 4357 26).
    2. Outline:
       • What is a “Star Rating”?
       • Why introduce a new service measure?
       • Consultation plan and results
       • Scoring system and calibrating
       • Service level targets linked to spending
    3. What is a ‘Star Rating’? A customer service level measure: more stars = higher service level. A common concept, like the ‘Qualmark’ motel/hotel rating:
       • 5-star hotel – $250
       • 3-star motel – $120
       • 1-star (backpackers) – $50
    4. Why ‘Star Rating’?
       • Simple for customers to understand.
       • Demonstrates the quality and cost trade-off.
       • If set up properly: directly connected to technical standard measures, and influenced by both capital and operational spend.
       • Moves away from ‘satisfaction’ (very subjective) to an auditable, structured system.
       • Addresses the OAG’s recommendation for AMPs.
    5. OAG’s Recommendations
    6. AMP Service Level Consultation:
       • Integral to LTCCP consultation
       • “We go to them” – improve participation
       • Keep councillors informed
       • Focus groups A: LoS components (current costs; what’s important to you?)
       • Focus groups B: cost (cost options for important components)
    7. Results – Importance to Customers. Libraries: LoS components rated high, medium, low, or no opinion.
    8. Results – Costs/Options
    9. Scoring System Outline:
       • LoS components identified
       • Define how LoS components are measured
       • Establish a scoring scale
       • Test LoS component scores and totals over a range from poor to excellent LoS
       • Total score converted to star rating
       • Survey the facilities and score them
       • Check the effect of budgets on the star rating
    10. Asset Provision & Operations: capex + related opex; opex only.
    11. Calibration – 1
    12. Calibration – 2
    13. Scoring – 1
    14. Scoring – 2
    15. Calibration – 3:
       • Does spending/saving change the total points and star rating as expected… for capex? … for opex?
       • Evaluate spend in terms of its effect on the star rating (a worked sketch follows this transcript).
       • Collate the 10-year programme in terms of the planned changes in service level over time.
       • Longer time: –$? Add ramps: +$??
    16. Results & Targets – to LTCCP
    17. Please Note. The star rating as presented here:
       • is not a benchmark between authorities;
       • is an internal quality measure linked to costs;
       • is repeatable, auditable, not subjective.
       The target should not be “5-star service levels”: most communities cannot afford 5-star quality, and all communities need “appropriate quality for the available funding”.
    18. Currently Available:
       • Parks (in five categories, plus cemeteries)
       • Pensioner housing
       • Community halls
       • Public libraries
       • Public toilets
    19. Conclusion. How effective is this?
       • Consultation success! (300 responses/19 groups)
       • Clear results provided to the LTCCP process
       • Simple long-term LoS measures identified
       We can demonstrate to customers: we listened to what you said; you get what you pay for; and targets and progress reports will be easy for customers to understand.
    20. Thank You. Colin Symonds, IMCS Ltd. Phone 021 HELP-AM (021 4357 26). Acknowledgements: SWDC; Dean Taylor (Opus); NAMS; Ross Waugh (Waugh Infrastructure Management Ltd).
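
The calibration question on slide 15 (does spending or saving change the total points and star rating as expected?) can be checked mechanically. Below is a minimal sketch, reusing the same illustrative scores and bands as the earlier sketch; the option names and score deltas are hypothetical.

```python
# Hypothetical spend options and the component-score changes they buy;
# baseline scores and star bands repeat the assumptions of the earlier sketch.
component_scores = {"availability": 4, "reliability": 3,
                    "accessibility": 2, "cleanliness": 4}
STAR_BANDS = [(18, 5), (14, 4), (10, 3), (6, 2), (0, 1)]

def star_rating(scores):
    total = sum(scores.values())
    return next((total, s) for t, s in STAR_BANDS if total >= t)

options = {
    "add ramps (capex)": {"accessibility": +2},
    "shorter opening hours (opex saving)": {"availability": -1},
}

for name, deltas in options.items():
    trial = dict(component_scores)
    for component, delta in deltas.items():
        # Clamp to the assumed 1-5 scoring scale.
        trial[component] = min(5, max(1, trial[component] + delta))
    total, stars = star_rating(trial)
    print(f"{name}: {total} points -> {stars}-star")
```

In this toy example the capex option lifts the facility from 3 to 4 stars, while the opex saving keeps it at 3 stars: exactly the kind of effect the final calibration step is meant to expose before options go into the 10-year programme.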
