This session will help you understand what you should be reporting and how to better communicate the value of what you do. Join John Custy, a service management and metrics authority, as he explains how to identify the various types of metrics you need to report, their value and purpose, and how to use them.
13. PURPOSE OF METRICS
How to use metrics
• Inform your stakeholders
• Report measurements so that stakeholders can understand activities and results
• Promote the value of the organization
• Determine the best way to communicate the information to the stakeholders
• Perform better stakeholder analysis to facilitate stakeholder buy-in
• Improve performance - people do what is measured
15. PURPOSE OF METRICS
What are we trying to accomplish?
ENSURE ALIGNMENT
• Account for IT processes and deliverables
• Inform stakeholders
• Understand IT performance
COMPLIANCE
• Achieve certifications: ISO/IEC 20000, COBIT
• Measure progress toward goals/objectives
OPERATIONAL EXCELLENCE
• Measure IT performance
• Control IT processes
• Maximize IT productivity (people)
• Report costs
• Demonstrate the value of the IT organization
19. PURPOSE OF METRICS
Sharing Accomplishments
What should you report?
• Key performance indicators
• Critical success factors
• Variances to baseline
• Progress towards targets
• Annotate milestones and abnormalities
• Service improvement projects
21. DIFFERENT TYPES OF METRICS
Metrics & Characteristics
Quantitative
• How much or how many
• Ex. The number of times customers contact the service desk
22. DIFFERENT TYPES OF METRICS
Metrics & Characteristics
METRICS
• Performance indicators (PI)
• Key performance indicators (KPI)
• Key results indicators (KRI)
CHARACTERISTICS
• Efficiency vs. effectiveness
• Leading vs. lagging
24. DIFFERENT TYPES OF METRICS
Efficiency vs. Effectiveness
EFFICIENCY
• How fast?
• How much? How many?
• Transactional cost
• Incident/Request/Access Management
• Departmental goals
EFFECTIVENESS
• Accuracy
• Customer satisfaction
• Total organizational cost
• Problem Management
• Enterprise objectives
26. DIFFERENT TYPES OF METRICS
Quantitative vs. Qualitative
QUANTITATIVE
• How much or how many
• Ex. The number of times customers contact the service desk
QUALITATIVE
• How well something or someone is performing
• Ex. Customer satisfaction, employee satisfaction, stock price
28. End-to-End Performance
• Time to process an order
• Time to check an inventory item
• Time to send/receive an e-mail
• Time to …
@ITSMNinja
29. End-to-End Performance
• Cost per transaction
• Cost per user
@ITSMNinja
30. Service Availability
• Uptime, compared to …
• Downtime per service
• Frequency and total amount of downtime
• Number of incidents (by type/category)
• Number of recurring incidents
• Time per incident
@ITSMNinja
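The availability measures above reduce to a simple calculation over recorded outages. A minimal sketch follows; the agreed service-time window and the outage durations are invented for illustration.

```python
from datetime import timedelta

def availability(agreed_service_time: timedelta, downtime: timedelta) -> float:
    """Classic availability formula: (AST - downtime) / AST * 100."""
    return (agreed_service_time - downtime) / agreed_service_time * 100

# One month of agreed service time and two recorded outages for one service
ast = timedelta(days=30)
outages = [timedelta(hours=4), timedelta(minutes=30)]
total_downtime = sum(outages, timedelta())  # frequency = len(outages)

pct = availability(ast, total_downtime)
print(f"Downtime events: {len(outages)}, total downtime: {total_downtime}")
print(f"Availability: {pct:.2f}%")
```

The same loop over outage records also yields the frequency and total-downtime figures the slide calls for.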
31. Service Availability
• Problems identified per service
• # incidents per problem
• Lost time per problem
• Changes per service
• # successful changes (on time, on budget)
• Lost time due to changes (incidents and requests)
• # service requests due to changes
• # problems due to changes (IT and business lost time)
• % improvements due to changes
@ITSMNinja
32. BALANCE NEEDED
Operational metrics allow you to understand where to improve.
Service metrics report on the overall performance of the service.
34. What Type of Metrics Are Reported?
• IT internal metrics
• Senior management
• Service management
• Business unit metrics
• Regulatory/compliance
35. Four Types of Process Metrics
• PROGRESS: in process maturity
• EFFICIENCY: use of resources
• EFFECTIVENESS: correct and complete the first time
• COMPLIANCE: to process and regulatory requirements
37. @ITSMNinja Typical Operational Measurements
• Response
• % connected immediately (Real-Time)
• Abandon Rate
• Wait (hold/queue) Time
• Average Speed to Answer (ASA)
• Response Time service level XX% in YY seconds
• Call-Back Time
• Desktop (PC)
38. @ITSMNinja Typical Operational Measurements
• Resolution
• Resolved First Contact
• Resolved X hours, Y hours, Z hours
• Cases re-opened, Repeats
• Requests resolved without assistance (self-help)
• Calls/Cases avoided due to self-help
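A minimal sketch of how a few of the response and resolution measures above could be computed from raw call records. The records, their field layout, and the 30-second service-level target are all invented for illustration.

```python
# Toy call records: (wait_seconds, abandoned, resolved_on_first_contact)
calls = [
    (12, False, True),
    (45, False, False),
    (90, True,  False),  # caller hung up before the call was answered
    (8,  False, True),
    (30, False, True),
]

answered = [c for c in calls if not c[1]]

# Abandon rate: share of all calls abandoned before answer
abandon_rate = sum(1 for c in calls if c[1]) / len(calls) * 100
# Average Speed to Answer (ASA), over answered calls only
asa = sum(c[0] for c in answered) / len(answered)
# Resolved on first contact, as a share of answered calls
fcr = sum(1 for c in answered if c[2]) / len(answered) * 100
# Response-time service level: % answered within 30 seconds (assumed target)
within_sl = sum(1 for c in answered if c[0] <= 30) / len(answered) * 100

print(f"Abandon rate: {abandon_rate:.0f}%  ASA: {asa:.1f}s  "
      f"FCR: {fcr:.0f}%  Service level: {within_sl:.0f}%")
```

Note the design choice: ASA, first-contact resolution, and service level are computed over answered calls only, while the abandon rate is computed over all offered calls.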
43. @ITSMNinja Typical Operational Measurements
• Knowledge Base Usage
• Accesses/Searches per contact
• # solutions per search
• # solutions searched/opened/viewed
• Time spent reviewing solutions
• Ease of finding solutions
• Quality of solutions (ability to use solutions)
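Two of the knowledge-base usage measures above (searches per contact, solutions viewed per search) can be sketched from a search log; the log tuples and their layout are invented for illustration.

```python
# Toy search log: (contact_id, solutions_viewed, led_to_resolution)
searches = [
    ("C1", 2, True),
    ("C1", 1, False),
    ("C2", 3, True),
    ("C3", 0, False),
]

contacts = {s[0] for s in searches}
searches_per_contact = len(searches) / len(contacts)
solutions_per_search = sum(s[1] for s in searches) / len(searches)

print(f"Searches per contact: {searches_per_contact:.2f}")
print(f"Solutions viewed per search: {solutions_per_search:.2f}")
```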
44. @ITSMNinja Typical Operational Measurements
• Service Asset and Configuration Management
• Errors in CMDB
• Resource improvements from utilizing the CMDB
• Change Management
• Number of incidents/requests due to the change
• Additional (reduction) workload due to changes
• Release Management
• Number of incidents/requests due to the release
• Additional (reduction) in workload due to releases
46-49. SERVICE DESK METRICS
WORKLOAD
• Volumes
• Calls/cases per customer per month
• Number of registered users / total number of users
• Time spent contacting users
• Time spent on change-related incidents/requests
INDIVIDUALS
• Number of calls taken
• Average Handle Time (AHT)
• Availability
• Occupancy
• Number of incidents/requests closed on first contact
• Customer satisfaction
• Contribution to knowledge base
CUSTOMERS
• Customer satisfaction
• Frequency of surveying, number not responding
• Volumes
• Calls/cases
RESPONSE
• Average Speed to Answer (ASA)
• % calls answered live vs. queued
• Call-back time
• Abandon rate (ABA)
• Responses within service level and outside service level
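Two of the per-individual measures above, AHT and occupancy, can be sketched as below. The shift length and per-contact handle times are invented for illustration; occupancy is taken here as time on contacts divided by available (staffed) time.

```python
# Invented per-contact handle times (minutes) for one agent's shift
handle_times_min = [6.5, 4.0, 12.0, 3.5, 9.0] * 9   # 45 handled contacts
available_min = 420                                  # 7 staffed hours

# Average Handle Time: mean time spent per handled contact
aht = sum(handle_times_min) / len(handle_times_min)
# Occupancy: share of available time spent handling contacts
occupancy = sum(handle_times_min) / available_min * 100

print(f"Contacts handled: {len(handle_times_min)}")
print(f"AHT: {aht:.1f} min  Occupancy: {occupancy:.1f}%")
```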
51-54. INCIDENT, REQUEST AND ACCESS MANAGEMENT
RESOLUTION
• Incident closure (from time of submission)
• Mean Time for Service Restoration (MTSR) for Levels 1, 2, & 3
• Incidents matched (KE)
• Incidents re-opened
• Closed first contact
• Escalations for resolution
• Remote tool utilization
• Desk-side visits
• Incidents closed via self-help
VOLUME
• Total number of incidents/requests (by priority & category)
• Security-related incidents
RESPONSE TIME
• Service Desk performance
• Level 2/3: same as Service Desk metrics
ESCALATION
• Time to escalate
• % escalated to correct group
• Technical & hierarchical
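The MTSR-by-level measure above is a grouped average. A minimal sketch, with invented incident records of the form (resolution level, hours to restore service):

```python
from collections import defaultdict

# Invented incidents: (level that resolved it, hours to restore service)
incidents = [
    (1, 0.5), (1, 1.0), (1, 0.75),  # resolved at Level 1
    (2, 4.0), (2, 6.0),             # resolved at Level 2
    (3, 24.0),                      # resolved at Level 3
]

# Group restoration times by resolving level
by_level = defaultdict(list)
for level, hours in incidents:
    by_level[level].append(hours)

# Mean Time for Service Restoration per level
mtsr = {level: sum(hs) / len(hs) for level, hs in sorted(by_level.items())}
for level, hours in mtsr.items():
    print(f"MTSR Level {level}: {hours:.2f} h")
```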
56. INCIDENT, REQUEST AND ACCESS MANAGEMENT
CUSTOMER SATISFACTION
• Incident, Request & Access Management processes
SELF-SERVICE
• Number of unique users
• Average time per user
• # pages viewed
60. Factors to Consider When Reporting
• Who are the stakeholders?
• How does what you report impact the stakeholders?
• Reports must be easy to read and understand, so develop them with the stakeholder in mind.
• Reports need to show how the support center contributes to the goals of each stakeholder and the business.
• Reports must use the appropriate channels to communicate with each stakeholder.
66. Scorecards Drive Transformation
[Chart: standards and frameworks (ISO/IEC 17799, ITIL/ISO/IEC 20000, COBIT, Six Sigma, ISO/IEC 9000, Malcolm Baldrige Award, scorecards) plotted from general to specific and by relevance to IT, spanning process improvement to business transformation]
Standards help us to ensure that IT is aligned to meet business objectives.
67. Support Scorecard
• Simple indicator
• Reference base
• Measures the key issues
• Reports on progress to goals
@ITSMNinja
68-69. Scorecard Criteria: Operational Performance
• Overall performance
• Response time
• Resolution time
• Closed first call
• Abandon time
• Wait time
• Status time
• Backlog aging
70. Scorecard Criteria: Operational Performance
SERVICE SCORECARD
The Acme Support Center Scorecard provides a weekly report of performance on our Support Center service-level commitments, by key service area, goal, and actual.
Response Time (goals)
• Call Pick-Up Time: all incoming calls are answered by a support consultant (80%)
• Call Waiting Time: average is less than 3 minutes (3 minutes)
• Non-Accepted Call-Back Time: all customers not responded to on the initial call by a support consultant will be called back within 30 minutes (90%)
Resolve Time (goals)
• Resolved on First Contact: 30% resolved first call; 4-month goal is 50% (30%)
• Resolved Same Day: 40% resolved within 1 business day; 4-month goal is 60% (40%)
• Resolved Same Week: 85% resolved within 5 business days (85%)
Status (goals)
• Priority 1 Issues: customer provided status every 4 hours until resolved (80%)
• Priority 2 Issues: customer provided status every 24 hours until resolved or workaround provided (80%)
• Call Aging: manage backlog so that no more than 20% is over 2 weeks old and 5% over 30 days (80%)
Backlog
• Average age of open items: goal 3 days, actual 10 days
Event Survey
• Overall satisfaction rating on a 1-5 scale: 4.1
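A scorecard row of the kind shown above boils down to comparing an actual result against a goal. The sketch below uses invented numbers and assumes "higher is better" for every measure, which would not hold for targets like backlog age.

```python
# Invented scorecard rows: (key service area, goal, actual)
rows = [
    ("Resolved on First Contact (%)", 30.0, 34.0),
    ("Resolved Same Day (%)",         40.0, 38.0),
    ("Resolved Same Week (%)",        85.0, 88.0),
]

def status(goal: float, actual: float) -> str:
    """Flag a row as met or missed, assuming higher is better."""
    return "MET" if actual >= goal else "MISSED"

for name, goal, actual in rows:
    print(f"{name:<32} goal {goal:>5.1f}  actual {actual:>5.1f}  "
          f"{status(goal, actual)}")
```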
71. Service Desk Scorecard
Operational performance
• Overall performance
• Response time
• Resolution time
• Closed first call
• Abandon time
• Wait time
• Status time
• Backlog aging
Customer sat/quality
• Overall satisfaction
• Response satisfaction
• Resolution satisfaction
• Status satisfaction
• Improvement goals
• Alignment to business
73. Balanced Scorecard
Business Goals
Financial Perspective
• Provide a good return on investment on IT-enabled business investments
• Manage IT-related business risks
• Improve corporate governance and transparency
Customer Perspective
• Improve customer orientation and service
• Offer competitive products and services
• Establish service continuity and availability
• Create agility in responding to changing business requirements
• Achieve the cost optimization of service delivery
• Obtain reliable and useful information for strategic decision making
Internal Perspective
• Improve and maintain business process functionality
• Lower process costs
• Provide compliance with external laws, regulations and contracts
• Provide compliance with internal policies
• Manage business change
• Improve and maintain operational and staff productivity
Learning and Growth Perspective
• Manage product and business innovation
• Acquire and maintain skilled and motivated personnel
76. You cannot manage what you cannot CONTROL
You cannot control what you cannot MEASURE
You cannot measure what you cannot DEFINE
77. What do you need to measure?
What should you do with the metrics you produce?
78. Thank you!
Join me tomorrow @2PM to learn
Knowledge Centered Support (KCS) – The Methodology that Really Works
John Custy • Managing Consultant • JPC Group • @ITSMNinja