A business data analysis of e-Solution's quality of service delivered from 2016 to 2017. The results are presented in our prescribed service quality matrix, which is based on industry standards for measuring service quality.
5. Service Quality Matrix
Metrics, each with a scale and an importance & weight (detailed on slide 26):
Customer Satisfaction Rate
Interaction per Resolution
Time to Resolution
First Contact Resolution
First Level Resolution
Repeat Calls
7. CUSTOMER SATISFACTION
A measure of how the service is meeting (or exceeding) customer expectations.
Assessed through surveys of customers via telephone call, email or post [Joshi et al., 2011].
8. Analysis Results
2016: 99% of the tickets were accepted by customers; 17 tickets were not accepted.
2017: 99% of the tickets were accepted by customers; 41 tickets were not accepted.
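The acceptance figures above reduce to a simple ratio. A minimal sketch, assuming hypothetical ticket totals (the slide reports only the percentages and the rejected counts of 17 and 41):

```python
def satisfaction_rate(total_tickets: int, rejected: int) -> float:
    """Share of tickets accepted by customers, as a percentage."""
    accepted = total_tickets - rejected
    return 100.0 * accepted / total_tickets

# Hypothetical totals chosen for illustration only.
print(round(satisfaction_rate(1700, 17)))   # 99
print(round(satisfaction_rate(4100, 41)))   # 99
```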
9. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the customer satisfaction rate is at 86% [Firuta, 2017].
Against Internal Benchmark: the customer satisfaction KPI is set at 85%; 99% was maintained across the two years.
10. INTERACTION per RESOLUTION
Measures the occurrence of incorrectly assigned tickets.
Misassignment can delay the resolution process and decrease customer satisfaction [Navvia, 2014].
11. Analysis Results
Interaction per Resolution, 2016 (ticket count by number of reassignments):
0: 24462 | 1: 4580 | 2: 782 | 3: 282 | 4: 91 | 5: 44 | 6: 15 | 7: 8 | 8: 2 | 9: 3 | 10: 2 | 12: 3
81% were resolved without reassignment; 15% had been reassigned once.
Interaction per Resolution, 2017 (ticket count by number of reassignments):
0: 35586 | 1: 5055 | 2: 922 | 3: 348 | 4: 137 | 5: 42 | 6: 18 | 7: 9 | 8: 7 | 9: 4 | 10: 3 | 11: 2 | 13: 1
84% were resolved without reassignment; 12% had been reassigned once.
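The quoted shares follow directly from the histogram counts. A sketch, using the 2016 counts read off the chart:

```python
# Tickets by number of reassignments, 2016 (from the chart above).
reassignments_2016 = {0: 24462, 1: 4580, 2: 782, 3: 282, 4: 91, 5: 44,
                      6: 15, 7: 8, 8: 2, 9: 3, 10: 2, 12: 3}

total = sum(reassignments_2016.values())
share = {n: round(100 * count / total) for n, count in reassignments_2016.items()}

print(share[0], share[1])  # 81 15 -- matches the 81% / 15% on the slide
```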
12. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the Interaction per Resolution global standard rate is at 86% [Leggett, 2017].
Against Internal Benchmark: no Interaction per Resolution KPI was set; year-on-year improvement.
13. TIME TO RESOLUTION
Measures the average time it takes to resolve a problem.
Though it may differ depending on complexity and urgency, the aim is to shorten the time [Ivarsson, 2013].
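The editor's notes describe computing resolution time as the difference between opening and closing dates, broken down into days, hours and minutes. A sketch of that breakdown using Python's datetime (the example timestamps are illustrative):

```python
from datetime import datetime

def resolution_breakdown(opened: datetime, closed: datetime):
    """Split the open-to-close interval into whole days, hours and minutes."""
    delta = closed - opened
    hours, remainder = divmod(delta.seconds, 3600)
    minutes = remainder // 60
    return delta.days, hours, minutes

# Illustrative ticket: opened 1 Mar 09:00, closed 3 Mar 11:30.
print(resolution_breakdown(datetime(2016, 3, 1, 9, 0),
                           datetime(2016, 3, 3, 11, 30)))  # (2, 2, 30)
```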
15. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the Time to Resolution global standard is 66% resolved within 3 days of reporting [Sharkasi, 2009].
Against Internal Benchmark: no Time to Resolution KPI was set; year-on-year improvement.
16. FIRST CONTACT RESOLUTION
Measures the number or percentage of problems resolved during the first customer call.
An indicator of the service's effectiveness in handling issues the first time [Rumburg, 2011].
17. Analysis Results
2016: 27% of the tickets were resolved at first contact from January to June; the FCR rate for January to June 2016 is 37%.
2017: 25% of the tickets were resolved at first contact from January to June; the FCR rate for January to June 2017 is 34%.
Tickets Resolved in First Contact, 2016 (monthly FCR): Jan 31%, Feb 36%, Mar 45%, Apr 41%, May 34%, Jun 33%.
Tickets Resolved in First Contact, 2017 (monthly FCR): Jan 30%, Feb 29%, Mar 41%, Apr 32%, May 34%, Jun 29%.
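The editor's notes list the FCR filters as: closed tickets, solved within 24 hours, not a repeat call, and issue resolved by the service desk. A sketch of that filter; the ticket dictionaries and field names are hypothetical:

```python
def is_first_contact_resolution(ticket: dict) -> bool:
    """Apply the FCR filters from the editor's notes: closed, solved
    within 24 hours, not a repeat call, resolved by the service desk."""
    return (ticket["status"] == "closed"
            and ticket["hours_to_solve"] <= 24
            and not ticket["repeat_call"]
            and ticket["resolved_by"] == "service desk")

# Hypothetical sample tickets.
tickets = [
    {"status": "closed", "hours_to_solve": 3,  "repeat_call": False, "resolved_by": "service desk"},
    {"status": "closed", "hours_to_solve": 40, "repeat_call": False, "resolved_by": "service desk"},
    {"status": "closed", "hours_to_solve": 5,  "repeat_call": True,  "resolved_by": "service desk"},
]
fcr_rate = 100 * sum(map(is_first_contact_resolution, tickets)) / len(tickets)
print(round(fcr_rate))  # 33
```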
18. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the First Contact Resolution global standard is at 85% [Narayanan, 2008].
Against Internal Benchmark: the First Contact Resolution KPI is set at 64%; year-on-year decline.
19. FIRST LEVEL RESOLUTION
Measures the rate of issues resolved quickly by the service desk without escalation.
It can be misleading on its own, however, so analyse it conjointly with repeat calls received [Atkinson, 2013].
20. Analysis Results
First Level Resolution, 2016 (total tickets received vs resolved at first level, Jan-Jun): 76% were resolved within first level support.
First Level Resolution, 2017 (total tickets received vs resolved at first level, Jan-Jun): 82% were resolved within first level support.
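The yearly FLR rate is total first-level resolutions over total tickets received. A sketch aggregating monthly (received, resolved-at-first-level) pairs; the monthly figures are hypothetical placeholders, since the slides report only the yearly rates:

```python
# Hypothetical monthly (received, resolved at first level) pairs, Jan-Jun.
monthly = [(7000, 5320), (6500, 4940), (7200, 5472),
           (6800, 5168), (7100, 5396), (6900, 5244)]

received = sum(r for r, _ in monthly)
first_level = sum(f for _, f in monthly)
flr_rate = 100 * first_level / received

print(round(flr_rate))  # 76
```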
21. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the First Level Resolution global standard rate is at 74% [Rumburg, 2011].
Against Internal Benchmark: the FLR KPI is set at 70%.
22. REPEAT CALLS
Measures the percentage of tickets representing recurring issues for the same customer.
An indicator of a recurring flaw in the system or service [Atkinson, 2013] that may warrant further root cause analysis.
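Per the editor's notes, repeat calls were identified by caller ID together with category and subcategory. A sketch of that grouping; the tickets and field names are hypothetical:

```python
from collections import Counter

def repeat_call_rate(tickets: list) -> float:
    """Share of tickets whose (caller, category, subcategory) group
    occurs more than once, as a percentage."""
    groups = Counter((t["caller_id"], t["category"], t["subcategory"])
                     for t in tickets)
    repeats = sum(count for count in groups.values() if count > 1)
    return 100.0 * repeats / len(tickets)

# Hypothetical tickets: the two u1 Software/Email tickets are repeats.
tickets = [
    {"caller_id": "u1", "category": "Software", "subcategory": "Email"},
    {"caller_id": "u1", "category": "Software", "subcategory": "Email"},
    {"caller_id": "u2", "category": "Network",  "subcategory": "VPN"},
    {"caller_id": "u3", "category": "Hardware", "subcategory": "Printer"},
]
print(round(repeat_call_rate(tickets)))  # 50
```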
23. Analysis Results
2016: 39% of the total calls received were repeat calls; the top 3 categories of repeat calls are issues relating to configuration.
Top 5 categories of tickets with repeat calls, 2016: User Access 25%, Network 23%, Software 22%, Application 18%, Teaching Spaces 12%.
2017: 46% of the total calls received were repeat calls; all except application tickets have configuration issues.
Top 5 categories of tickets with repeat calls, 2017: Software 50%, Application 19%, User Access 18%, Network 7%, Hardware 6%.
24. HOW ARE YOU FARING? [Harmon et al., 2006]
Against the Industry Global Standards: the acceptable repeat call rate is at 10% [Rumburg, 2011].
Against Internal Benchmark: no repeat call KPI was set; year-on-year decline, with repeat calls increasing.
26. Service Quality Matrix
Metric | Scale | Importance | Weight
Customer Satisfaction Rate | Percentage | Medium | 10%
Interaction per Resolution | Percentage | Medium | 10%
Time to Resolution | Days | Medium | 10%
First Contact Resolution | Percentage | High | 30%
First Level Resolution | Percentage | High | 20%
Repeat Calls | Percentage | High | 20%
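The matrix weights sum to 100%, so an overall quality index can be computed as a weighted average. A sketch; the per-metric scores below are hypothetical placeholders, not the deck's actual results:

```python
# Weights from the Service Quality Matrix above (sum to 1.0).
weights = {
    "customer_satisfaction":      0.10,
    "interaction_per_resolution": 0.10,
    "time_to_resolution":         0.10,
    "first_contact_resolution":   0.30,
    "first_level_resolution":     0.20,
    "repeat_calls":               0.20,
}

# Hypothetical normalised scores (0-100) for each metric.
scores = {
    "customer_satisfaction":      99,
    "interaction_per_resolution": 84,
    "time_to_resolution":         70,
    "first_contact_resolution":   34,
    "first_level_resolution":     82,
    "repeat_calls":               54,
}

overall = sum(weights[m] * scores[m] for m in weights)
print(round(overall, 1))
```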
29. ROOT-CAUSE ANALYSIS METHODOLOGY
Implement an optimisation model to correlate incident tickets based on 3 criteria:
* Category-based correlation
* Correlating configuration items
* Augmenting scheduled resource data collection
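Category-based correlation of incident tickets can be prototyped by grouping incidents on shared attributes. A sketch; the `config_item` field and incident records are assumed names, not the deck's actual data model:

```python
from collections import defaultdict

def correlate_by_category(incidents: list) -> dict:
    """Group incident IDs that share a category and configuration item,
    so recurring clusters stand out for root-cause analysis."""
    clusters = defaultdict(list)
    for inc in incidents:
        clusters[(inc["category"], inc["config_item"])].append(inc["id"])
    # Keep only clusters with more than one correlated incident.
    return {key: ids for key, ids in clusters.items() if len(ids) > 1}

# Hypothetical incidents: the two vpn-gateway tickets correlate.
incidents = [
    {"id": 1, "category": "Network",  "config_item": "vpn-gateway"},
    {"id": 2, "category": "Network",  "config_item": "vpn-gateway"},
    {"id": 3, "category": "Software", "config_item": "mail-server"},
]
print(correlate_by_category(incidents))  # {('Network', 'vpn-gateway'): [1, 2]}
```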
30. IMPROVE DATA QUALITY
Data quality must be considered as part of the business strategy.
Key points to consider:
Use statistical analysis to scrutinize data
Devise a staff training and awareness program
Minimize missing data
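The editor's notes mention 7 missing closed dates in 2016 and 174 in 2017, which skew the time-to-resolution figures. A minimal missing-data audit might look like this; the field names are illustrative:

```python
def count_missing(tickets: list, field: str) -> int:
    """Count tickets where the given field is absent or empty."""
    return sum(1 for t in tickets if not t.get(field))

# Hypothetical tickets with two kinds of missing closed dates.
tickets = [
    {"id": 1, "closed_date": "2016-03-04"},
    {"id": 2, "closed_date": None},   # value missing -- skews time to resolution
    {"id": 3},                        # field absent entirely
]
print(count_missing(tickets, "closed_date"))  # 2
```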
32. REFERENCES
Atkinson, R, 2013, ‘Metrics for the New World of Support’, UBM LLC, retrieved 23
September 2017, <http://www.thinkhdi.com/~/media/HDICorp/Files/White-
Papers/whtppr-1213-metrics-new-world-support.pdf>.
Firuta, J, 2017, ‘Customer Service Report 2017’, Livechat, weblog post, June 10, retrieved
10 September 2017, <https://www.livechatinc.com/livechat-resources/customer-service-
report-2017/>.
Harmon, E.P., Hensel, S.C. and Lukes, T.E., 2006. Measuring performance in services.
McKinsey Quarterly, 1, p.30.
Ivarsson, J, 2013, 'Quality Management for IT Support Services', Master of Science Thesis
in Quality and Operations Management, Chalmers University of Technology, Sweden.
Navvia, 2014, 'Incident Management for Stanford University', Consulting Portal Inc.,
retrieved 23 September 2017,
<https://uit.stanford.edu/sites/default/files/2015/12/09/Incident_Management%20Docume
nt%20-%20Version%201.pdf>.
Kumbakara, N, 2008, 'Managed IT services: the role of IT standards', Information
Management & Computer Security, vol. 16, no. 4, pp. 336-359, <https://doi-org.ezproxy-
b.deakin.edu.au/10.1108/09685220810908778>.
33. REFERENCES
Rumburg, J, 2011, 'Metric of the Month: First Level Resolution', MetricNet, retrieved 23
September 2017, <https://www.thinkhdi.com/~/media/HDICorp/Files/Library-
Archive/Insider%20Articles/First%20Level%20Resolution.pdf>.
Soluch, T, 2012, Helpdesk - taka a good advice, get the right device [ONLINE], retrieved
27 September 2017, <https://dribbble.com/shots/676690-Helpdesk-taka-a-good-advice-get-the-
right-device>.
Trienekens, J.J., Bouman, J.J. and Van Der Zwan, M., 2004. Specification of service level
agreements: Problems, principles and practices. Software Quality Journal, 12(1), pp.43-57.
Editor's Notes
General service management literature proposes that quality should be defined as “meeting and/or exceeding customers’ expectations”
Implementing a measurement system is the first step to reducing variance and improving the productivity of services.
From the tickets that were not accepted, these are the business services against which the clients had raised the issues.
[Harmon, Hensel and Lukes of McKinsey Quarterly]
Services can be measured and variance controlled by following 3 principles: benchmark internally, measure the drivers of cost, and make metrics accurate enough to identify relevant costs.
Being bounced around and making multiple calls to the helpdesk will negatively affect user satisfaction.
Days were calculated from the opening and closing dates using the INT function, taking the difference of the two and breaking it down into days, hours and minutes.
2016 had 7 missing closed dates, mostly Enterprise Application and Computer on Campus; 2017 had 174 missing closed dates, across Enterprise, Computer on Campus and Software.
MIT and Stanford
A service desk's ability to resolve at first contact is a function of many factors: the complexity and type of issues received, the experience of the agents, the quality of their training and the tools at their disposal.
Filters used were: Closed tickets, Solved within 24 hours, Not a Repeat Call, and Issue resolved by Service Desk
Filters used were: Closed tickets, Solved within 24 hours, and Issue resolved by Service Desk
Caller ID with unique category, subcategory & team that resolved the issue
Average Resolution Time is given the lowest weight in the overall helpdesk performance index in the academic environment, because users will tolerate longer resolution times if they have confidence that the agent who answers the call will be able to resolve their issues or deliver the call to the appropriate agent.