Services Marketing: Service Quality


Defining Service Quality

Evaluating Quality

Technical & Functional Quality

Researching Service Quality

The SERVQUAL instrument

Transcript

  • 1. Services Marketing: Service Quality. Tom Chapman, www.marketing101.co.uk, Twitter @idlehans
  • 2. Introduction  Defining Service Quality  Evaluating Quality  Technical & Functional Quality  Researching Service Quality  The SERVQUAL instrument
  • 3. What do you think?  Define Quality  Why is Quality important?  How do you evaluate it?
  • 4. Defining Quality  quality is an ambiguous term  “although we cannot define quality, we know what quality is” (Pirsig, 1987)  “quality is fitness for use, the extent to which the product successfully serves the purpose of the user during usage” (Juran, 1974)  “quality is zero defects - doing it right the first time” (Parasuraman, Zeithaml and Berry, 1985)  “quality is exceeding what customers expect from the service” (Zeithaml, Parasuraman and Berry, 1990)
  • 5. Service Quality - early writings  ‘service quality results from a comparison of what customers feel a service provider should offer (i.e. their expectations) with the provider’s actual performance’ (Parasuraman, 1996: 145)  ‘Service quality is a measure of how well the service level delivered matches customer expectations. Delivering quality service means conforming to customer expectations on a consistent basis’ (Lewis and Booms, 1983)
  • 6. Why is Quality Important?  Superior product/service quality relative to competitors is the single most important factor affecting profitability (PIMS study)  Premium prices  Customer preference  Customer retention  Market expansion/market share  Other benefits:  productivity, advertising, distribution/access
  • 7. Changing management focus (timeline diagram): 1970s productivity; 1980s and 1990s quality; 2000+ creating better value for customers and the organisation
  • 8. Service Quality - shifting focus  in the past, industry focused particularly on defining and meeting internal quality or technical standards  today the focus has shifted to quantifying customers’ assessments of services and products (external measurement) and then translating these into specific internal standards  delivering quality service is fundamental to corporate success because research shows it is closely linked to profits
  • 9. Service Quality – a major business concern  quality is an elusive concept, not easily articulated by consumers  it can lead to better market share, profitability, lower costs and improved productivity  services are performances, not objects, and may vary; quality evaluations are made not solely on the service outcome but also on the service process
  • 10. Service Quality – profits/costs  increased profits found to be due particularly to:  fewer customer defections  stronger customer loyalty  more cross-selling of products and services  higher margins (due to service enhancements of core products)  improved service quality cuts costs  fewer customers to replace  less corrective work to do  fewer inquiries and complaints to handle  lower staff turnover and dissatisfaction
  • 11. Enhancing service value (diagram)
  • 12. What is Quality?  Conformance quality  producing the product/service according to specification every time, with no correction required  Quality-in-use  customer judgements about quality received and resultant level of customer satisfaction  Technological quality  superior performance features of product/service derived from advanced new technologies
  • 13. Service Quality (model diagram): total quality comprises technical quality of the outcome (WHAT is offered/received), functional quality of the process (HOW the service is delivered) and relational quality (by WHOM the service is delivered), all filtered through image (corporate/local)
  • 14. Evaluating Quality  access (physical approachability of the service location, ease of finding one’s way around the service environment and route clarity)  aesthetics (extent to which service package components are agreeable or pleasing to the customer, including the appearance and ambience of the service environment and the appearance and presentation of service facilities, goods and staff)  attentiveness/helpfulness (extent to which the service, especially contact staff, helps the customer, shows interest in them and a willingness to serve)  availability (of service facilities, staff and goods to the customer)
  • 15. Evaluating Quality  care (concern, consideration, sympathy and patience shown to the customer, including putting them at ease and helping them feel emotionally comfortable)  cleanliness/tidiness (of the tangible components of the service package)  comfort (physical comfort of the service environment and facilities)  commitment (staff’s apparent commitment to their work, including pride and satisfaction, diligence and thoroughness)  communication (ability of the service provider to communicate in a way the customer will understand; ability of staff to listen to and understand the customer)
  • 16. Evaluating Quality  competence (skill, expertise, professionalism with which service is executed; correct procedures, execution of customer instructions, product knowledge displayed by staff, giving sound advice)  courtesy (politeness, respect, propriety shown by the service - usually staff)  flexibility (willingness and ability to amend/alter the service to meet customer needs)  friendliness (warmth and personal approachability of service providers, especially contact staff)
  • 17. Evaluating Quality  functionality (fitness for purpose)  integrity (honesty, justice, fairness, trust in treating customers)  reliability (and consistency of performance of service facilities, goods and staff; keeping agreements)  responsiveness (speed and timeliness of service delivery, responding promptly to customer requests, minimal waiting/queuing time)  security (personal safety of customers and possessions while participating in the service process)
  • 18. Zeithaml, Parasuraman and Berry: from ten dimensions to five  tangibles → tangibles  reliability → reliability  responsiveness → responsiveness  competence, courtesy, credibility, security → assurance  access, communication, understanding the customer → empathy
  • 19. Expectations  little is known about what determines expectations and how they are formed  individualistic  own norms, values, wishes, needs  changing over time  changes in aspiration  changes in need  do customers know what is expected of them?
  • 20. Expectations  expectations can be formulated in terms of “what should be done” and “what will be done”  four different performance standards distinguished:  deserved or equitable performance  ideal or desirable performance  expected performance  minimal tolerable performance  the difference between the desired service level and adequate service level is the …………
  • 21. Perceptions  “perception is defined as the process by which an individual selects, organizes and interprets stimuli into a meaningful and coherent picture of the world” (Schiffman and Kanuk, 1987)  subjective and selective  resulting attitudes about a particular service provider may change over time (long-term attitudes may be more stable than immediate attitudes)
  • 22. Satisfiers and Dissatisfiers  critical incidents  courtesy  behaviour  understanding  responsiveness  communication  negative experiences  competence  reliability
  • 23. Satisfiers and Dissatisfiers  greater perceived control by the customer may decrease the sources of customer dissatisfaction  consumers check whether their expectations are in line with actual experiences of the service and service delivery  looking for gaps between expectations and perceptions is important in detecting what needs to be improved  satisfaction emerges when actual service meets expectations or when it exceeds expectations (positive disconfirmation)  dissatisfaction occurs when actual service is below expected level (negative disconfirmation)
  • 24. Customer Perceptions of Quality  Critical incidents  events throughout service delivery impact on perceived quality  Evaluation  customers check whether their expectations are in line with actual experiences of the service  Satisfaction  actual service meets or exceeds expectations (positive disconfirmation)  Dissatisfaction  actual service is below expected level (negative disconfirmation)  Gap analysis  looking for gaps between expectations and perceptions is important in guiding quality improvement
  • 25. Dimensions of Service Quality  Reliability  ability to perform the promised service dependably and accurately – delivering what is promised  Responsiveness  willingness to help customers and provide prompt service  adapting the service to customer needs  Assurance  employees’ knowledge and courtesy  ability to inspire trust and confidence (Parasuraman, Zeithaml & Berry, 1988)
  • 26. Dimensions of Service Quality  Empathy  caring, individualised attention  customers are unique and special  customers are understood and valued  Tangibles  appearance of physical facilities, equipment, personnel and communication materials  continuity  perceived quality (Parasuraman, Zeithaml & Berry, 1988)
  • 27. Gaps Model of Service Quality (diagram): the customer gap, underpinned by four company-side gaps, Gap 1 to Gap 4 (Parasuraman, Zeithaml & Berry, 1985)
  • 28. Service Quality Gaps Gap 1 – Not knowing what customers expect  Inadequate market research  Poor market segmentation  Lack of upward communication (contact employees to managers)  Insufficient customer relationship focus  Inadequate service recovery
  • 29. Service Quality Gaps Gap 2 – Incorrect service design & standards  Inability to translate customer expectations into clear quality specifications  Lack of management commitment to service quality  Customer expectations thought to be unreasonable or unfeasible  Absence of a formal quality programme (guidelines, standards)  Poor service design
  • 30. Service Quality Gaps Gap 3 – Not delivering to service standards  Employees unwilling or unable to perform the service at the desired level  Poor internal organisation  ineffective recruitment, inadequate teamwork, employees not motivated, role conflict, role ambiguity, poor supervision  Poor employee-technology job fit (appropriate tools to perform roles)  Failure to match supply and demand  Customers unaware of roles and responsibilities  Problems with service intermediaries
  • 31. Service Quality Gaps Gap 4 – Promises do not match performance  Over-promising in advertising, personal selling or physical evidence cues  Management wants to show services offered in best possible light  Poorly-integrated marketing communications  Insufficient communication between marketing/sales & operations  Ineffective management of customer expectations
  • 32. Service Quality - attributes  in 1988 PZB operationalised the construct (of perceptions and expectations differences) as the difference measured between two 7-point rating scales:  one scale measuring customers’ expectations about service companies in general within the service sector/category being investigated  the other scale measuring customers’ perceptions about a particular company whose service quality is being assessed  PZB measured the extent to which customers felt companies should possess a specified service attribute and the extent to which customers felt a given company did possess the attribute
  • 33. Service Quality - expectations and perceptions statements  attributes were put as statements, with which customers were asked to express the degree of agreement/disagreement on a 7-point scale  expectations statements:  e.g. the physical facilities at hotels should be visually appealing  the behaviour of hotel employees should instil confidence in customers  hotels should give customers individual attention  corresponding perceptions statements:  the physical facilities at ABC Hotel are visually appealing  the behaviour of ABC Hotel employees instils confidence in customers  ABC Hotel gives customers individual attention
  • 34. SERVQUAL construction  PZB thus developed a comprehensive set of statements to represent facets of the 10 service quality dimensions  this yielded 97 statements (approx. 10 per dimension)  a two-part instrument was developed - part 1 consisted of 97 expectations statements, part 2 of 97 perceptions statements  roughly half the statements were worded negatively  the instrument was piloted on a sample of 200 customers, resulting in a reduced 34-item instrument with 7 rather than 10 dimensions (PZB, 1988)
  • 35. SERVQUAL five dimensions  reliability and validity of the reduced instrument was assessed further - data were collected from customers of 4 US service companies, with samples of 200 customers of each - this produced consistent results  further elimination of items created a 22-item instrument, grouping the 22 items into just 5 general dimensions  3 of the original 10 dimensions remained intact in the final 5 dimensions (tangibles, reliability and responsiveness), and the remaining 7 original dimensions clustered into 2 broader dimensions:  (1) assurance (knowledge and courtesy of employees and their ability to inspire trust and confidence), basically a combination of the original dimensions of competence, courtesy, credibility and security
  • 36. Service Quality - SERVQUAL refinements  (2) empathy (caring, individualised attention the firm provides its customers) represents access, communication and understanding the customer  “SERVQUAL is most valuable when it is used periodically to track service quality trends, and when it is used in conjunction with other forms of service quality measurement” (PZB, 1988: 31)  in 1991 PZB further refined SERVQUAL:  three types of services and 5 companies  data collected through mail surveys of independent samples of customers of each company, giving a combined sample size of 1,936  the distribution of expectations ratings obtained was highly skewed toward the upper end of the 7-point scale
  • 37. SERVQUAL refinements  the statements were revised to capture what customers will expect from companies delivering excellent service, e.g. the original expectations statement “hotels should give customers individual attention” was revised to read “excellent hotels will give customers individual attention”  the negatively worded statements in the original SERVQUAL instrument were problematic - they were awkward, could have confused respondents and may have lowered the reliabilities for dimensions containing them - so they were changed to a positive format  finally, 2 original items (one under tangibles and one under assurance) were replaced with 2 new items, to capture the dimensions more fully
  • 38. SERVQUAL usage  despite refinements, reliability always emerges as the most critical dimension and tangibles the least critical  SERVQUAL can be used:  to determine the average gap score (between customers’ perceptions and expectations) for each service attribute  to assess a company’s SQ along each of the 5 SERVQUAL dimensions  to compute a company’s overall weighted SERVQUAL score, which takes account of the SQ gap on each dimension and the relative importance of the dimension
  • 39. SERVQUAL usage  used to track customers’ expectations and perceptions on individual service attributes and SERVQUAL dimensions over time  to compare a company’s SERVQUAL scores against those of competitors  to identify and examine customer segments that significantly differ in their assessments of a company’s service performance  to assess internal service quality - i.e. quality of service provided by one dept/division to others within the company
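The gap-score and weighted-score arithmetic described in the two usage slides above can be sketched in a few lines of Python. This is illustrative only: the item labels, dimension groupings and importance weights below are assumptions made for the example, not the published 22-item instrument, and ratings are taken to be on 7-point scales.

```python
# Minimal sketch of the SERVQUAL gap-score arithmetic (P - E), dimension
# averages and the weighted overall score. Item labels, dimension groupings
# and weights are illustrative placeholders, not the published instrument.

DIMENSIONS = {
    "tangibles":      ["t1", "t2"],
    "reliability":    ["r1", "r2"],
    "responsiveness": ["rs1", "rs2"],
    "assurance":      ["a1", "a2"],
    "empathy":        ["e1", "e2"],
}

def gap_scores(perceptions, expectations):
    """Per-item gap score: perception rating minus expectation rating (P - E)."""
    return {item: perceptions[item] - expectations[item] for item in perceptions}

def dimension_scores(gaps):
    """Average the item gap scores within each of the five dimensions."""
    return {dim: sum(gaps[i] for i in items) / len(items)
            for dim, items in DIMENSIONS.items()}

def weighted_servqual(dim_scores, weights):
    """Overall score: dimension gaps weighted by stated relative importance (weights sum to 1)."""
    return sum(dim_scores[d] * weights[d] for d in dim_scores)

# Hypothetical single-respondent data on 7-point scales:
expectations = {"t1": 6, "t2": 7, "r1": 7, "r2": 7, "rs1": 6, "rs2": 6,
                "a1": 7, "a2": 6, "e1": 5, "e2": 6}
perceptions  = {"t1": 5, "t2": 6, "r1": 6, "r2": 5, "rs1": 6, "rs2": 5,
                "a1": 6, "a2": 6, "e1": 5, "e2": 4}
weights = {"tangibles": 0.10, "reliability": 0.30, "responsiveness": 0.25,
           "assurance": 0.20, "empathy": 0.15}

gaps = gap_scores(perceptions, expectations)
dims = dimension_scores(gaps)
print(dims)                              # negative values indicate shortfalls
print(weighted_servqual(dims, weights))  # overall weighted SERVQUAL score
```

In practice the same arithmetic would be run over many respondents and the gaps averaged per attribute and per dimension before any comparison across companies or over time.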
  • 40. SERVQUAL concerns  questions raised about SERVQUAL’s expectations component (Babakus & Mangold, 1992; Cronin & Taylor, 1992)  the interpretation and operationalisation of expectations (Teas, 1993)  the reliability and validity of SERVQUAL’s difference score formulation (Babakus & Mangold, 1992; Brown, Churchill & Peter, 1993)  SERVQUAL’s dimensionality (Carman, 1990; Finn & Lamb, 1991)  but counter-arguments by PZB (1991, 1993, 1994) and Parasuraman (1996)
  • 41. SERVQUAL concerns  is it necessary to measure expectations? studies show that scores on the perceptions-only component of SERVQUAL explain significantly more variance in customers’ overall evaluations of a company’s SQ (measured on a single-item overall perceptions rating scale) than do perception-expectation difference scores; PZB argue that measuring expectations has diagnostic value (i.e. it pinpoints SQ shortfalls)  how should the expectations construct be operationalised? there are multiple ways the term “expectations” can be interpreted - SQ researchers have generally viewed expectations as normative standards (customer beliefs about what a service provider should offer) but customer satisfaction/dissatisfaction researchers have typically considered expectations to be predictive standards (what customers feel a service provider will offer)
  • 42. SERVQUAL operationalisation  both “should” and “will” expectations have been used in measuring SQ, although ZBP in 1993 went on to develop a conceptual model of expectations  can we operationalise SQ as a difference score? operationalising any construct as a difference between 2 other constructs is questioned on psychometric grounds; critics suggest that direct measures (i.e. non-difference scores) of the expectations-perceptions gap may be psychometrically superior - but this issue is not resolved  does SERVQUAL have 5 distinct dimensions that cross different contexts? replication studies have not been able to reproduce the clean 5-dimensional factor structure of the original PZB 1988 study - differences may be due to data collection and analysis procedures
  • 43. Further SERVQUAL criticisms (see Buttle, 1996)  SERVQUAL is based on a disconfirmation paradigm rather than an attitudinal paradigm  little evidence that customers assess SQ in terms of P-E gaps  it is process-oriented rather than focused on service encounter outcomes  SERVQUAL’s five dimensions are not universal, and there is high intercorrelation between the 5 RATER dimensions (reliability, assurance, tangibles, empathy and responsiveness)  consumers may use standards other than expectations to evaluate SQ, yet it fails to measure absolute SQ expectations  4 or 5 items cannot capture the variability within each SQ dimension
  • 44. Further considerations  customer assessments of SQ may vary from “moment of truth” to “moment of truth”  using a 7-point Likert scale is flawed  reversing the polarity of items in the scale causes respondent error  Cronin & Taylor (1992, 1994) say SERVQUAL is flawed, arguing that perceived quality is best thought of as an attitude  PZB describe satisfaction as more situation or encounter specific and quality as more holistic, being developed over a longer time period  it is argued that PZB are inductive, and take no account of the literature in economics, psychology and statistics  there are arguments about whether the marginal revenue of SQ improvements always exceeds the marginal cost
  • 45. Dynamics  interdependencies among the dimensions of quality are difficult to describe  also, is the customer value of improvements a linear or non-linear function?  SERVQUAL fails to capture the dynamics of changing expectations (customers learn from experiences); indeed, Gronroos (1993) says we need to know more about how expectations are formed and change over time  from the customer’s viewpoint, failure to meet expectations is often more significant than success in meeting or exceeding expectations  while SERVQUAL focuses on the process of service delivery, it is argued that outcome quality is already contained within reliability, competence and security
  • 46. Service Quality - other models  Richard & Allaway (1993) tested an augmented SERVQUAL model which incorporates both process and outcome components - they concluded that process and outcome together are a better predictor of consumer choice than process or outcome alone  the number of SQ dimensions may be dependent on the particular service being offered (Babakus & Boller, 1992)  Teas (1993b) believes respondents may be using one of six interpretations of expectations:  service attribute importance (customers may respond by rating the expectations statements according to the importance of each attribute)
  • 47. Performance specification  forecasted performance (customers may respond by using the scale to predict the performance they would expect)  ideal performance (the optimal performance, what performance “can be”)  deserved performance (the level customers feel performance should be)  equitable performance (the level of performance customers feel they ought to receive given a perceived set of costs)  minimum tolerable performance (what performance “must be”)
  • 48. Standards  Iacobucci et al. (1994) would drop the word “expectations” and prefer the word “standards”; they believe several standards may operate simultaneously, among them “ideals”, industry standards etc.  Gronroos (1993) refers to the bad service paradox - a customer may have low expectations based on previous experience with the service provider - if these expectations are met, there is no gap and SQ is deemed satisfactory  so, do customers always evaluate SQ in terms of expectations and perceptions, or are there other forms of SQ evaluation?  what form do customer expectations take, how best (if at all) can they be measured, and are expectations common across a class of service providers?
  • 49. Attitudes  do attitude-based measures of SQ perform better than the disconfirmation model, and which attitudinal measure is most useful?  can we integrate outcome evaluations into SQ measurement and how can this be done?  is the predictive validity of perception measures of SQ better than that of P-E measures?  what are the relationships between SQ, customer satisfaction, behavioural intention, purchase behaviour, market share, word-of-mouth and customer retention?  what is the role of context in determining E and P evaluations? what context markers do consumers employ?
  • 50. Evaluation  are analytical context markers (such as tangibility and consumer involvement) useful in advancing SQ theory?  do evaluative criteria in intangible-dominant services (e.g. consulting) differ from those in tangible-dominant services (e.g. hotels)?  how does customer involvement influence the evaluation of SQ?  how do customers integrate transaction-specific or moment of truth (MOT) specific evaluations of SQ? To what extent are some MOTs more influential in final evaluation than others?  what are the relationships between the five RATER factors? How stable are these relationships across contexts?  what is the most appropriate scale format for collecting valid and reliable SQ data? and to what extent can customers correctly classify items into their a priori dimensions?
  • 51. SERVQUAL additions  ZBP (1993) conceptual model of expectations - customers have 2 different service levels that serve as comparison standards in assessing SQ:  Desired Service (a level of service representing a blend of what customers believe “can be” and “should be” provided)  Adequate Service (the minimum level of service customers are willing to accept)  separating these 2 levels is a Zone of Tolerance that represents the range of service performance a customer would consider satisfactory  because the SERVQUAL expectations component measures normative expectations, the construct represented by it reflects the desired service construct
  • 52. SERVQUAL additions  the SERVQUAL structure did not capture the adequate service construct, so PZB (1994b) augmented and refined SERVQUAL to capture:  not only the discrepancy between perceived service and desired service - called a measure of service superiority - but also  the discrepancy between perceived service and adequate service, labelled a measure of service adequacy  PZB therefore rated desired, adequate and perceived service, and went on to label “adequate service” as minimum service
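As a rough illustration of the two discrepancy measures named in the slide above, the sketch below computes service superiority (perceived minus desired) and service adequacy (perceived minus adequate, i.e. minimum, service) for a single attribute rated on a 7-point scale. The function names and sample ratings are assumptions made for the example, not part of the published instrument.

```python
# Illustrative sketch of the service-superiority and service-adequacy measures
# for one attribute rated on a 7-point scale. Names and ratings are assumptions.

def service_superiority(perceived, desired):
    """Measure of service superiority: perceived service minus desired service (typically <= 0)."""
    return perceived - desired

def service_adequacy(perceived, adequate):
    """Measure of service adequacy: perceived service minus adequate (minimum) service."""
    return perceived - adequate

def within_zone_of_tolerance(perceived, adequate, desired):
    """True when performance falls in the zone of tolerance between adequate and desired levels."""
    return adequate <= perceived <= desired

# Hypothetical ratings for a single attribute:
perceived, adequate, desired = 5, 4, 7
print(service_superiority(perceived, desired))                 # -2: short of service superiority
print(service_adequacy(perceived, adequate))                   #  1: above the minimum acceptable level
print(within_zone_of_tolerance(perceived, adequate, desired))  # True: within the zone of tolerance
```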
  • 53. Diagnostic value  tests have shown that measuring perceptions alone should suffice if the sole purpose of SQ measurement on individual attributes is to maximise the explained variance in overall service ratings, but  from a practical viewpoint, it is important to pinpoint SQ shortfalls and take appropriate corrective actions (therefore, there is diagnostic value in measuring perceptions against expectations)  clearly, operationalising customer expectations as a zone or range of service levels is feasible empirically and diagnostically  using the zone of tolerance as a comparison standard in evaluating service performance can help companies understand how well they are at least meeting customers’ minimum requirements and how much improvement is needed before they achieve service superiority
  • 54. Measuring Service Quality  SERVQUAL:  One scale measuring customer expectations about service companies in general within the relevant service sector  One scale measuring customer perceptions about a particular company  Based on five dimensions of service quality  Compare expectation scores with perceived quality achieved  Used for internal performance management, benchmarking versus competitors, customer segmentation, tracking expectations/perceptions over time
  • 55. Measuring Service Quality  SERVQUAL criticisms:  Doubts over conceptual foundation & methodology  Only measures technical (outcome) & functional (process) service quality  Results not reproducible over time (lacks stability)  Risks in assessing customer satisfaction relative to prior expectations (if expectations are low, even “poor” service might seem good)  Only valid for services with high search or experience characteristics – problems with credence characteristics  Better to use questions about performance (= perception) only (Cronin and Taylor, 1992 and 1994 - SERVPERF) - higher predictive validity  Measuring expectations has only diagnostic value (pinpointing service quality shortfalls)
