A Service Quality Model for Web-Services Evaluation in Cultural Heritage Management


Chiabai Aline - Basque Centre for Climate Change (BC3), Bilbao, Spain
Rocca Lorena - Department of Geography, University of Padova
Chiarullo Livio - Fondazione Eni Enrico Mattei (FEEM), Italy

Published in: Technology, Design
  • Innovative methodology, the “Blended Focus Groups”, integrating in-person meetings with online discussion and e-learning modules.
  • This tool has been built to improve CH management and valorize the tangible and intangible resources of the territory.
  • For web-services, evaluation is even more complicated, as these are intangible, not perishable, heterogeneous, and created on demand, and thus simultaneously produced and consumed.
  • In our case study we analyse gap 5, as this is the reason why the service fails. Customers’ expectations are driven by their personal needs, their past experiences, word of mouth and marketing communications from providers. Only the last factor is controlled by service providers.
  • Steps to reach high-quality services.
  • The SERVQUAL test was applied in this case study for exploratory purposes, to test specific aspects of the geo-referenced website. As the object under evaluation is a website made of a set of web-services, it is difficult to identify the evaluation criteria adopted by customers, which can vary from one user to another. It is therefore important to first identify the main service dimensions, i.e. which aspects of service quality we want to evaluate.
  • This makes it possible to calculate a weighted SERVQUAL score, by weighting each gap, as calculated above, with the score assigned to each service dimension.
  • Each respondent reported the relative importance that each service dimension has for them, by allocating a total of 100 points among the stated dimensions.
  • Average expectation for each feature.
  • When weighted means are used, the gap increases in most cases. Values change most significantly for the sub-services included in the content and usability dimensions, as these carry the highest weights. It is interesting to note that the e-governance area is the one with positive gaps, even though it was given the lowest expectations.
  • Although web-services are increasingly oriented towards participatory networks, users’ expectations are still primarily focused on the search for secure, clear and up-to-date information, as well as improved layout and usability.
  • The use of the web to promote public participation and specific actions concerning the territory is still not sufficiently recognized by citizens.

    1. A Quality Model for Web-Services Evaluation in Cultural Heritage Management
       ICCSA 2011, Santander, 20-23 June 2011. Aline Chiabai
    2. Outline
       • Background and motivations
       • Case study
       • Theoretical framework
       • Methods
       • Survey results
       • Conclusions and further steps
    3. ISAAC project
       • ISAAC “Integrated e-Services for Advanced Access to Heritage in Cultural Tourist Destinations” (FP6)
       • User-relevant integrated platform and e-services in urban cultural destinations, to valorize the use of EU heritage taking into account stakeholders’ perspectives
       • Case study: e-governance model for cultural heritage management in the city of Genoa (Italy)
    4. Integrated approach
       • First phase:
         - Construction of a user-friendly geo-referenced Web system (www.isaac-genovaculture.eu) as a tool to facilitate communication and participation among the different stakeholders (for CH management)
       • Second phase:
         - Activation of the participatory process (Blended Focus Groups): bottom-up approach
         - Stakeholders: residents, tourists and service providers
         - Fine-tuning of the system according to users’ expectations
         - SERVQUAL analysis
    5. Stakeholders’ involvement: recursive “planning-action-revision” cycle
       [Diagram: an e-participation website prototype is applied through blended focus groups with residents, tourists, and local and external service providers; users’ expectations are evaluated and satisfaction is analysed through the SERVQUAL test]
    6. Stakeholders
       • Residents: living and working in the local territory, or working there but living elsewhere
       • Tourists: using the territory in their free time (for amusement, sport, visiting monuments, participating in events…), but not living in the territory
       • Service providers: qualified experts in services related to CH (to improve its access and valorise it)
         - Local service providers, with specific knowledge of the territory and its resources: private actors (local cultural associations, tourist agencies, museums, etc.) or public actors (municipality, department of cultural heritage, public transport, etc.)
         - External service providers, with wider, nation-wide competence, operating at a higher level than local associations (e.g. associations for the conservation of historical and cultural heritage, and IT providers)
    7. http://www.isaac-genovaculture.eu/
       • Information: inform and sensitise people about cultural resources
       • Communication: use new technologies as scaffolding to favour communication and critical reflection between citizens about cultural tourism and cultural resources
       • Participation: design scenarios for sustainable urban destinations through participatory tools
    9. Rationale of the study
       • Generally, the management of CH does not account for users’ preferences
       • Traditional top-down approaches are often not effective, and even in contrast with real users’ needs
       • Innovative forms of CH management have to be built on users’ expectations and satisfaction, for service optimization and diversification
    10. SERVQUAL analysis
       • SERVQUAL original version developed by marketing experts (Zeithaml, Parasuraman and Berry, 1991)
       • Objective: provide enterprises with a tool to evaluate service quality, i.e. customers’ opinions about the products/services supplied
       • Service quality: the degree of discrepancy between the expectations or desires of the customers and their perceptions
       • Service quality (and specifically e-service quality) is more difficult to evaluate than product quality
    11. The SERVQUAL model
    12. The quality service life cycle
    13. Survey design
       • Object to evaluate: a website comprising a set of web-services
       • The following “service quality dimensions”, characterising the web-services, have been identified:
         - design, layout and usability (navigation, visualization)
         - content and information updating
         - e-participation
         - self-construction topics (interaction with the web design staff)
       • Within each group, a number of more specific features (sub-services) have been identified
    14. Service quality dimensions
    15. Questionnaire structure
       • Section 1: socio-demographic questions (gender, age, education) and internet behaviour (frequency and location of use)
       • Section 2: users’ expectations with respect to a generic, hypothetical website delivering services similar to the one under evaluation
       • Section 3: users’ perceptions (satisfaction) of the specific website under evaluation
    16. Expectation analysis: example
       • For each of the 4 macro areas, a set of statements was proposed to the respondents
       • Respondents were asked to assign a score to each specific feature on a 7-point Likert scale (1 = completely disagree, 4 = neither agree nor disagree, 7 = completely agree)
       Example statements (each scored from 1 to 7):
       1. An excellent web portal about cultural goods should be graphically attractive and have a pleasant aspect, even if visualisation is slower.
       2. An excellent web portal about cultural goods provides specific areas for thematic discussion reserved to user groups.
    17. Perception analysis: example
       Example statements (each scored from 1 to 7):
       1. www.isaac-genovaculture.eu is graphically attractive and has a pleasant aspect, even if visualisation is a little slow.
       2. www.isaac-genovaculture.eu provides specific areas for thematic discussion reserved to user groups.
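Before computing gaps, the Likert responses are aggregated per feature (the speaker notes mention the average expectation for each feature). A minimal sketch of that aggregation, where the respondent IDs, feature names and scores are all hypothetical:

```python
# Sketch: averaging 7-point Likert expectation scores per feature across
# respondents. Respondent IDs, feature names and scores are hypothetical.
from statistics import mean

# respondent -> {feature -> expectation score on the 1-7 Likert scale}
expectation_responses = {
    "r1": {"layout": 6, "thematic discussion areas": 4},
    "r2": {"layout": 7, "thematic discussion areas": 3},
    "r3": {"layout": 5, "thematic discussion areas": 5},
}

features = next(iter(expectation_responses.values())).keys()
avg_expectation = {
    f: mean(r[f] for r in expectation_responses.values()) for f in features
}
```

The same aggregation would be applied to the perception scores collected in Section 3 of the questionnaire.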
    18. Calculation of the SERVQUAL score
       • (1) GAP = P - E
         - P = perception score (perceived quality)
         - E = expectation score (expected quality)
       • Scores are assigned based on the responses to the questionnaire (7-point Likert scale)
       • The largest negative gaps identify the service features where intervention for improvement should be prioritized
       • A positive gap arises if expectations have been exceeded, which is not necessarily positive (over-supplying)
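The simple gap above can be sketched per feature as follows; the feature names and average scores are hypothetical, not taken from the survey:

```python
# Sketch of the simple SERVQUAL gap, GAP = P - E, computed per feature.
# Feature names and average scores below are hypothetical.

def servqual_gaps(perceptions, expectations):
    """Return the P - E gap for each service feature."""
    return {f: perceptions[f] - expectations[f] for f in expectations}

expectations = {"layout": 6.0, "content": 6.5, "e-participation": 4.0}
perceptions = {"layout": 5.0, "content": 5.5, "e-participation": 4.5}

gaps = servqual_gaps(perceptions, expectations)
# The most negative gaps (here layout and content) mark the features to
# prioritize; the positive e-participation gap means expectations were
# exceeded, which may indicate over-supplying.
```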
    19. The weighted SERVQUAL score
       • (2) GAP_w = (P - E) x (D + 100) / 100
         - P = perception score
         - E = expectation score
         - D = score assigned to the service dimension (out of a total of 100 points)
       • Each respondent is asked to allocate a total of 100 points among the 4 service dimensions, according to the relative importance assigned to them: (a) layout/usability, (b) content, (c) e-participation, (d) self-construction topics
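A sketch of the weighted score as defined on this slide; the 100-point allocation and the scores below are hypothetical:

```python
# Sketch of the weighted SERVQUAL score as defined on the slide:
# GAP_w = (P - E) * (D + 100) / 100, where D is the number of points
# (out of 100) the respondent allocated to the feature's dimension.

def weighted_gap(p, e, d):
    """Weighted gap for one feature; d is the dimension's 0-100 score."""
    return (p - e) * (d + 100) / 100

# Hypothetical 100-point allocation across the four service dimensions
weights = {"layout/usability": 35, "content": 40,
           "e-participation": 15, "self-construction": 10}
assert sum(weights.values()) == 100

# A gap of -2 on a content feature is amplified by the dimension's weight:
gap_w = weighted_gap(p=5, e=7, d=weights["content"])  # (-2) * 1.40 = -2.8
```

Since D is never negative, the weight (D + 100)/100 is always at least 1, so weighting can only preserve or amplify a gap; this is consistent with the observation in the notes that gaps increase in most cases when weighted means are used.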
    20. Survey administration
       • Timing: February-April 2008
       • Place: Genoa
       • Recruitment: respondents randomly selected from a list of users subscribed to the FEEM Culture Factory of Genoa
       • Sample: 89 completed questionnaires
       • Quota sampling
    21. Results: users’ preferences for the service quality dimensions
    22. Results for the broad service dimensions
    23. Analysis of users’ expectations about specific Web features
    24. Results at sub-service level
    25. Results: SERVQUAL simple gap between expectations and perceptions
    26. Results: SERVQUAL simple and weighted gaps
    27. Conclusions (1)
       • The most important features for a CH website (those with the highest expectations) are layout, design, usability and content; in particular:
         - information that is easy to find and supported by images, clear, and frequently updated
       • The inclusion of e-participation services in web-based cultural city services is less in demand from citizens than other attributes
    28. Conclusions (2)
       • The CENSIS report (2010) examines the relationship of Italians with ICT and concludes that only 43% of web users fully trust this communication means, which might explain the low desirability of e-participation
       • Interacting with the public administration to improve cultural services is currently seen as insignificant, confirming that the Web has yet to be taken up by citizens as a true instrument of participation in collective decision-making
    29. Limitations and further research
       • The small sample size does not allow for differentiation among stakeholder types
       • Study the relationship between respondents’ answers and their socio-demographic profile (age, gender, education, …)
       • Include the analysis of service providers and compare it with the expectations/perceptions of users
    30. Thank you. [email_address], Basque Centre for Climate Change (BC3), Bilbao, Spain