Maureen Brookes is currently Undergraduate Programme Director and Senior Lecturer in Marketing
in the Department of Hospitality, Leisure & Tourism Management. Maureen joined the department in
1995 after a long career in the hospitality industry. In 1998 she implemented a longitudinal survey to
assess hospitality student perceptions of their entire academic experience.
Journal of Hospitality, Leisure, Sport and Tourism Education
Vol. 2, No. 1. ISSN: 1473-8376
www.hlst.ltsn.ac.uk/johlste

Evaluating the ‘Student Experience’: An Approach to Managing and Enhancing Quality in Higher Education

Maureen Brookes (meabrookes@brookes.ac.uk)
Oxford Brookes University, Gipsy Lane, Headington, Oxford, OX3 0BP, UK.
DOI:10.3794/johlste.21.27
Abstract
This paper presents an argument for student evaluation of entire hospitality programmes or courses in
higher education. It reports on the approach undertaken within one hotel school to monitor the total
‘student experience’ and demonstrates the potential benefits of using this approach to aid quality
management and enhancement.
Keywords: student, evaluation, quality management and enhancement
Introduction
Student evaluation of teaching quality in higher education is a well-recognised practice and research
on the subject has been conducted for over seventy years (O’Neil, 1997). The merits of student
evaluation have also been well debated, with some academics arguing that students are not suitably
qualified to judge quality of teaching (see for example, Wallace, 1999) and others offering strong
support for the use of student evaluation for quality assurance purposes (see for example, Oldfield and
Baron, 2000; Murray, 1997). Within hospitality and tourism programmes, much of the recent
literature relates to the use of student evaluations of individual modules or units of study for faculty
and administrative purposes (see for example, Mount and Sciarini, 1999; Knutson et al., 1997).
However, where student feedback is used as a mechanism for quality assurance, there is also support
for student evaluation of entire courses or programmes of study in order to facilitate a more
comprehensive assessment (Wilson et al., 1997). This paper reports on such an approach taken within
the School of Hotel and Restaurant Management (since renamed the Department of Hospitality,
Leisure and Tourism Management) at Oxford Brookes University, England. It begins by discussing
the forces driving quality management processes in higher education and the methods employed by
the School to evaluate undergraduate programmes. The paper reports on a number of benefits of
incorporating student views on their broader educational experiences and concludes that this is an
essential part of a quality management and enhancement process.
Brookes, M. (2003) Evaluating the ‘Student Experience’: An Approach to Managing and Enhancing Quality in Higher Education. Journal of Hospitality, Leisure, Sport and Tourism Education 2(1), 17-26.
Managing quality in higher education
The massive expansion of student numbers and changes in government funding have put the issue of
quality firmly on the agenda of higher education institutions (Oldfield and Baron, 1998). With the
introduction of tuition fees in 1998, students began to view themselves as paying customers,
demanding value for money and the right to be heard (Spira, 1996). As in many other parts of the
world, the general public began to demand greater accountability and called for valid, reliable and
comparable performance data on teaching quality in higher education (Wilson et al., 1997).
In response to these forces, the Quality Assurance Agency (QAA) for Higher Education was
established to ensure all government-funded education is of approved quality, to encourage
improvements in the quality of education and to provide public information on the quality of
individual higher education programmes. Quality is assessed at subject level by peer review against
six aspects of provision: curriculum design, content and organisation; teaching, learning and
assessment; student progression and achievement; student support and guidance; learning resources;
and quality management and enhancement (QAA, 1997). As the results of these quality audits are
published, the QAA system provides a comparative indicator of the quality of higher education
provision that is necessary in a climate of greater accountability. The results from the most recent
audit for hospitality and tourism have recently been published. Given the increasingly competitive
environment in hospitality and tourism education, with both increased provision and declining student
numbers, these audit scores are very important to individual institutions.
The increasingly competitive environment has also led a number of higher education institutions to
monitor levels of student satisfaction (King et al., 1999). Measuring student satisfaction as an
indicator of quality is consistent with a total quality management approach (TQM). Wiklund and
Wiklund (1999) report that several universities are now adopting TQM and as a result, a customer
focus has become a core value for many. While the precept that students are customers is not
universally accepted (see for example, Wallace, 1999), there has been growing support for the use of
student satisfaction surveys as an indicator of teaching quality (Aldridge and Rowley, 1998).
Furthermore, Murray (1997) reports that the use of these surveys has led to measurable improvements
in teaching quality. As such, student feedback can be used as an effective tool for quality
enhancement. Harvey (1995) also advises that student satisfaction goes hand in hand with the
development of a culture of continuous quality improvement.
It has been argued that any quality management tool must serve two functions: one of accountability
and one of enhancement (Jackson, 1996). While the QAA approach serves the accountability
function, additional internal mechanisms are required to best serve the quality enhancement function.
Jackson (1996) argues that the function of enhancement is fulfilled when institutions are better able to
understand the strengths and weaknesses in their policies, practices and procedures. Soliciting
feedback from students on their entire learning experience enables this understanding to be achieved.
Furthermore, if used appropriately, it enables student views to be integrated into quality enhancement
decisions (Aldridge and Rowley, 1998).
Designing the ‘student experience’ survey
Established over fifty years ago, the School is one of the oldest providers of both undergraduate and
postgraduate hospitality and tourism education in the UK. The School has long been concerned with
quality management and enhancement and a number of sound, established mechanisms are in place.
Until 1997, however, there was little formal input from students on the evaluation of their broader
educational experience. With 25 per cent of the School’s student population from outside of the UK,
and more students entering the programmes with different educational backgrounds and experiences,
it became clear that we had to monitor our ability to meet the needs of this diverse student population
on a broader basis. By actively soliciting student opinions on their overall experience, the voice of
another stakeholder could be incorporated into a process of continuous quality improvement. A
decision was therefore taken to launch the ‘student experience’ survey, in which students would be asked to assess the quality of their educational experience.
A review of the current literature on student evaluation and feedback enabled the author to make a
number of initial decisions regarding the methodological approach to be adopted. An investigation
into current ‘best practice’ followed, via interviews with other Schools within and external to the
University. A recent ‘themed audit’ conducted by the University into the use of student feedback was
beneficial at this stage.
From this early investigative work, it became clear that a survey method using self-completion
questionnaires would enable data to be collected from as large a sample of the student population as
possible, in a cost effective way (Finn et al., 2000). Harvey et al.’s (1997) Student Satisfaction
Manual and Ramsden’s Course Experience Questionnaire (1991) were used as guidance at this stage
to determine the style of the questions. However, quality criteria are related to specific situations
(Richardson, 1998) and therefore must be identified by students themselves (Oldfield and Baron,
2000; Aldridge and Rowley, 1998). Therefore, a series of focus groups were held with students to
help determine the content of the questionnaire. The decision was taken that one section of the
questionnaire would determine student perceptions of curriculum design, organisation and content,
and a second section would assess student perceptions of the quality of teaching and learning; student
support and guidance; and learning resources and facilities provided within the School and University.
These sections contained a series of statements identified as important to students, in a manner similar
to other student satisfaction surveys (see for instance, Aldridge and Rowley’s (1998) review, although
a much shortened version is used). This design also enabled us to bring our internal process more in
line with the external quality control requirements of the QAA. As the questionnaire is measuring
student attitudes or perceptions, a quantitative Likert-type scale was selected as appropriate (Clark et
al., 1998). Student perceptions and feelings are recognised as valid criteria for student feedback
(Fraser, 1991). In order to monitor how the needs of particular student bodies are met, questionnaires
are designed to include demographic data such as age, gender and country of origin, as well as
programme and mode of study.
King et al. (1999) argue that student feedback only provides a snapshot of student opinion and,
therefore, the real value of student feedback lies in its use in longitudinal studies (Wilson et al., 1997).
Given its purpose as a tool for quality management and enhancement, a longitudinal approach
(Oppermann, 1997) was adopted in order to provide comparability and benchmark performance across
different cohorts of students and over time.
However, in order to be an effective quality enhancement tool (Jackson, 1996), the questionnaire also
had to provide richer data to facilitate decision-making on quality enhancement. For this reason,
questionnaires include another section comprised of a series of open-ended questions. Students are
asked to provide feedback on different aspects of their experience and how they believe improvement
could be achieved. This section is also used to investigate any current student issues identified by
student representatives and to obtain feedback on actions taken as a result of previous surveys.
It was next necessary to determine the best time to administer the survey. This issue is particularly
important for our School due to the fact that students join programmes at various entry points.
Sciarini et al. suggest that formative feedback is more in line with a continuous quality improvement
process which seeks to ‘add value to student learning experience’ (1997:37). However, the value of
summative feedback in quality assessment is also recognised (O’Neil, 1997). It was decided
therefore, that these approaches could be combined effectively in the design of the ‘student
experience’. As a result, three different self-completion questionnaires were developed that could be
administered to different cohorts of students at different times of the academic year and as they
progress through their programmes of study. First year students are surveyed in the first term of their
programme to determine their initial perceptions of their experience and to help identify ways in
which we can improve their induction and integration into the School and University. Students who
are studying at advanced level, but not in their final year of study, are surveyed in the second term,
about halfway through their programme. Graduating students are surveyed in the final academic term
of the year to assess their perceptions of their entire experience of studying within the School and at
Oxford Brookes University. All three surveys are administered each academic year.
A pilot study was conducted and, after further consultation with student representatives, the questionnaires were revised to allow some variability across the three instruments while retaining enough consistency across cohorts to enable benchmarking. Table 1 below gives an example of the first section of the questionnaire for first year students.
On a scale of 1 to 5, where 1 means ‘definitely disagree’ and 5 means ‘definitely agree’, circle your responses to
the following statements.
Section 1: Effectiveness of Curriculum Design, Organisation & Content [scale: definitely disagree (1) - definitely agree (5)]
The criteria for acceptance on the programme were made clear to me. 1 2 3 4 5
The aims and objectives of the programme are clear to me. 1 2 3 4 5
The standard of the work expected of students is clear to me. 1 2 3 4 5
The amount of effort required by students was made clear from the start. 1 2 3 4 5
The majority of modules so far are reasonably challenging. 1 2 3 4 5
There is a wide choice of modules available to choose from. 1 2 3 4 5
I understand the role of my personal tutor. 1 2 3 4 5
I plan to make effective use of my personal tutor during my programme of study. 1 2 3 4 5
Staff from the School were very helpful during induction week. 1 2 3 4 5
I feel that I made the right choice of programme to study. 1 2 3 4 5
I understand the learning outcomes for each module I am currently taking. 1 2 3 4 5
It has been easy to meet other students on the programme. 1 2 3 4 5
Table 1: Survey of First Year Students (Stage 1)
The second section of the questionnaire, however, is standardised across all questionnaires, as indicated in Table 2 below.
Please rank on a scale of 1 to 5, with 1 being the lowest and 5 the highest, the quality of the following:
Section 2: Perceived Quality of Experience [rank 1(low) - 5 (high)]
Teaching & Learning 1 2 3 4 5
Accommodation 1 2 3 4 5
Teaching rooms 1 2 3 4 5
Administrative Support - School 1 2 3 4 5
Administrative Support - University 1 2 3 4 5
Personal Tutor System 1 2 3 4 5
Catering Facilities 1 2 3 4 5
IT facilities 1 2 3 4 5
Library Resources 1 2 3 4 5
Sports/Leisure Facilities 1 2 3 4 5
Table 2: Survey to all Cohorts of Students
The final qualitative section, by its very nature and purpose, alters across different surveys. Examples
of the types of questions asked of graduating students are included in Table 3.
1. How would you describe the atmosphere in the School?
2. In your opinion, are there any ways that support to students could be improved,
a) within the School?
b) within the University?
3. If the School received additional funding, where do you think the money should be spent?
4. Are there any additional modules you feel should be added to the programme?
5. Were there any times when you felt under excessive pressure? If yes, could you describe the cause of the pressure?
6. Would you recommend the course to a friend?
7. How would you describe the ‘student experience’ in the School?
Table 3: Survey of Graduating Students: Qualitative Feedback
Assessing the ‘student experience’: the process
Harvey et al. (1997) recommend following a clear set of procedures in order to ensure student
participation. All surveys are conducted during the fourth teaching week of each term. Compulsory
modules or programme-wide student meetings are used in order to capture the views of a greater
percentage of the student population. Non-teaching staff administer the survey and discuss its
importance for quality improvement with students on each occasion. In order to facilitate cross-
tabulation of results against different student bodies and longitudinally, both quantitative and
qualitative data are analysed through descriptive statistics and cross tabulation using SPSS.
Qualitative answers are coded using a post-defined (Miles and Huberman, 1994) method that calls for
one researcher to manually develop a number of numerically coded categories based on frequency of
occurrence. Recognising the potential bias in this system, check-coding takes place; a process that
also provides a good reliability check (Miles and Huberman, 1994).
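As an illustration of the analysis described above: the School used SPSS, but the same descriptive statistics and cross-tabulations could be produced with any statistical tool. The sketch below uses Python's pandas library; all column names and ratings are invented for illustration and do not reflect the survey's actual data.

```python
# Illustrative sketch only: the School used SPSS, but equivalent descriptive
# statistics and cross-tabulations can be produced with pandas.
# All column names and ratings below are hypothetical.
import pandas as pd

# Each row is one questionnaire: demographic fields plus a Likert rating (1-5).
responses = pd.DataFrame({
    "cohort": ["first_year", "first_year", "continuing", "graduating",
               "graduating", "continuing", "first_year", "graduating"],
    "origin": ["UK", "non-UK", "UK", "UK", "non-UK", "non-UK", "UK", "UK"],
    "teaching_quality": [4, 5, 3, 4, 2, 4, 5, 3],  # Section 2 item, rated 1-5
})

# Descriptive statistics per cohort: mean rating and the percentage
# rating the item as above average (3 or higher).
by_cohort = responses.groupby("cohort")["teaching_quality"]
summary = by_cohort.agg(
    mean="mean",
    above_average=lambda s: 100 * (s >= 3).mean(),
)

# Cross-tabulation of ratings against country of origin, supporting the
# monitoring of how well different student bodies' needs are met.
xtab = pd.crosstab(responses["origin"], responses["teaching_quality"])
print(summary)
print(xtab)
```

Cross-tabulating Section 2 ratings against demographic fields such as country of origin is what allows the needs of particular student bodies to be monitored across cohorts and over time.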
Findings are presented at the Undergraduate Programme Committee meeting in week 6 of each term,
where student representatives provide further insight and increase the reliability of the findings.
Appropriate recommendations are discussed in consultation with student representatives and, where
further investigation is deemed necessary, it is usually done through the use of focus groups facilitated
by a researcher employed outside of the School. A full written report with recommendations is
presented for approval the following week at School Board of Studies, the committee responsible for
all programme decisions. Figure 1 depicts the process followed for the ‘student experience’ and how
the findings are used to inform decision-making.
Areas of concern outside the School remit are passed on through appropriate university-wide
committees. The full report is then made available to students before the end of the term in order to
‘close the feedback loop’. Harvey et al. (1997) suggest that this is a vital process for successfully
involving students in quality management and we believe it demonstrates our commitment to taking
student feedback seriously. As such, the report is posted electronically on the school intranet and
manually on the student notice boards. In addition, student representatives are asked to further
disseminate the results of the survey and the resulting actions undertaken.
Assessing the ‘student experience’: the findings
The first ‘student experience’ survey of 104 graduating students was conducted in June 1998. Forty-
four students completed the questionnaire, representing 42 per cent of the graduate student population.
The quantitative results from the first survey indicated a generally positive perception of the quality of
their experience. For instance, over 90 per cent of respondents rated the quality of teaching and
learning, library resources, and sports and leisure facilities as above average (3 or higher). In
addition, over 85 per cent of respondents felt they had achieved the aims and objectives of the
programme and were adequately prepared for a career in the hospitality industry. Over 90 per cent felt
that the modules studied were reasonably challenging. The qualitative section of the survey indicated
that 86 per cent of the respondents would recommend the course to a friend and the most frequent
responses given to describe their experience within the School were “friendly”, “welcoming”,
“interesting” and “challenging”.
Although one survey only provides a snapshot of student perception (King et al., 1999), it enabled us
to pinpoint a number of areas for improvement. By comparing student ratings for the various criteria,
it was clear to see where students felt there was need for improvement. For example, only 44 per cent
of respondents considered that there was a reasonably wide choice of modules available to choose
from. Module choice was subsequently improved by increasing the availability of resource-based
modules to all three terms of study in the academic year. Subsequent ratings improved and the latest
survey indicates that 80 per cent of respondents now feel there is a wide choice of modules available.
Other enhancements of provision for students as a result of the survey include changes to the induction programme, a re-induction programme for returning intern students and increased availability of associate lecturers.

Figure 1: Implementing the ‘Student Experience’ (survey undertaken → analysis of findings → report to Undergraduate Programme Committee → actions agreed at Board of Studies → feedback to students → actions implemented → results monitored, with findings and analysis feeding any further investigation)

To date, 13 ‘student experience’ surveys have been conducted: four with first year students, four with continuing students and five with graduating students. The number of surveys conducted to date, and the way they are monitored across different cohorts, precludes a detailed presentation of all findings within this article.
However, it is still possible to demonstrate how the longitudinal design of the ‘student experience’ has
further enabled us to enhance areas of our provision. One such area is that of the personal tutor
system. Personal tutors offer guidance and support to a set number of students assigned to them on
academic, personal and pastoral matters. The overall ratings for the quality of the personal tutor
system were generally high in early surveys. Mean ratings of quality were between 3.2 and 3.5 (out of
5) and the mode achieved either 4 or 5. Over the first five surveys, the percentage of respondents
ranking the personal tutor system as above average varied between 68 per cent and 85 per cent.
However, further analysis revealed there was some disparity in the student rankings. In one survey
almost 30 per cent of respondents ranked the personal tutor system as 2 or below, despite the
acceptable mean achieved, and this was further supported by the qualitative data. After reporting these
findings and discussing them with student representatives, a decision was taken to investigate personal
tutoring further. Subsequent focus groups revealed that while some students took full advantage of
the system on offer from the start of their academic careers, developed a good relationship with their
personal tutor, and thus ranked it highly, others failed to make use of the system for a variety of
reasons and therefore rated it poorly. Many students simply wanted to see more of their personal
tutors, but perceived them to have limited availability. This finding is consistent with other research
reports on personal tutoring (see for example, Rose, 1996).
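The kind of check that revealed this disparity can be sketched as follows, using only Python's standard library; the ratings shown are invented for illustration, not the survey's actual data. The point is that an acceptable mean can mask a sizeable dissatisfied subgroup, which is why the share of low ratings was examined alongside the mean and mode.

```python
# Sketch of the summary checks described above, with invented ratings:
# an acceptable mean can coexist with a dissatisfied subgroup, so the
# share of low ratings (2 or below) is examined alongside mean and mode.
from statistics import mean, mode

ratings = [5, 5, 4, 4, 5, 1, 2, 4, 2, 1, 5, 4]  # hypothetical 1-5 ratings

avg = mean(ratings)        # central tendency: 3.5, within the "acceptable" range
typical = mode(ratings)    # most frequent rating
low_share = 100 * sum(r <= 2 for r in ratings) / len(ratings)  # % rating 2 or below

print(f"mean={avg:.2f}, mode={typical}, rated 2 or below: {low_share:.0f}%")
```

Here a third of respondents rate the item at 2 or below despite a mean of 3.5, which is exactly the pattern that prompted the follow-up focus groups on personal tutoring.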
As a result of these focus groups, a more proactive approach to personal tutoring has been
implemented within the School. In order to try and get students into the habit of seeing them
regularly, personal tutors initiate contact on a regular basis from the beginning of a student’s academic
career, in conjunction with the development of personal progress files. Students are encouraged to
meet their personal tutors to discuss their academic performance and investigate ways to improve their
approach to learning. In a meeting with personal tutors during induction, the role of the personal tutor
is clearly explained and students are given a progress file that contains a schedule of meetings with
their tutor for the entire year. They are also advised that they can request further sessions as they see
fit. A clearer system of office hours has also been developed, with tutors available for five hours each week at varying times and days, and schedules posted up to three weeks in advance. This
system is explained to students during induction and in the programme handbook. The results of the
subsequent ‘student experience’ surveys suggest that the new approach is perceived as an improvement (with the mean rating up slightly, ranging from 3.6 to 4, and a mode of 5 regularly achieved). Over the last eight surveys administered, between 82 per cent and 86 per cent have ranked
the personal tutor system as above average and the percentage of students rating the system as 2 or
below has fallen to under 10 per cent for all but one of the surveys. The qualitative data also supports
this improved perception of the quality. The longitudinal nature of the ‘student experience’ will
enable us to continue to monitor the effectiveness of the new system. This example highlights the
importance of using qualitative research methods to validate questionnaire responses (Lucas, 1999).
In addition to its purpose as a quality enhancement tool, a number of other benefits have emerged
from the approach taken to evaluating the ‘student experience’ that have a broader, yet still positive
impact on the School. One of these is the development of closer working relationships with the
central providers of other university support services. For instance, comments made by students on
library provision have resulted in a more proactive approach to students by the Subject Librarian and
greater collaboration with academic staff on recommended student reading and secondary research
training for students. Computer Services have also taken note and have recently requested that a copy
of each survey be forwarded to their department for their own quality management purposes. This
closer co-operation can only improve the overall experience of students.
The process undertaken for disseminating the survey results to students has also proved beneficial. As
students progress through their programme of study, they can see the difference their feedback has
made to their educational provision, further supporting the arguments for formative feedback in
quality improvement. The response rates continue to be high and have grown from the first survey at
42 per cent of the graduate population to a usual figure of around 70 per cent of the student cohort.
Students also make the effort to complete all sections of the questionnaire fully, rather than suffering
from the often reported ‘questionnaire fatigue’. Furthermore, by taking the surveys seriously,
students’ critical evaluation skills - key skills required of graduates - are enhanced. Richardson
(1998) also argues that increased analytical ability is a benefit of student evaluation. The process
also encourages all faculty and staff members to become more reflective practitioners. Overall, the
findings further support Harvey et al.’s (1997) argument to close the ‘feedback loop’.
The final benefit relates directly to the ability of the School to compete effectively in a crowded
marketplace. Students consistently describe the School as “friendly”, “helpful”, “supportive” and
“rewarding”. These positive comments made by our students, as well as the percentage of students
who would recommend the programme to a friend are regularly posted on our website for all potential
students to view and are used in the School’s promotional literature. In essence, they are word of
mouth recommendations by those deemed to be in a trusted and authoritative position: current students.
The ‘student experience’: a tool for quality enhancement
The ‘student experience’ survey is now firmly embedded within the School’s academic calendar.
Student response rates continue to be high and the benefits derived from the process have proved to be
wider than anticipated. Despite predominantly positive feedback and high ratings by the students,
these evaluations have enabled us to fine tune aspects of policies, procedures and practices, and
thereby enhance the quality of our provision. While quality assurance procedures should continue to
draw on a range of processes and expertise, students are clearly key stakeholders who can provide
valuable and reliable data to inform quality improvement decisions. Using student evaluations to
regularly monitor the comprehensive ‘student experience’ has proved to be an effective and essential
component of the quality management and enhancement process in our School and the positive impact
has been far reaching.
However, there are arguably improvements that could be made to further enhance the process. For
instance, it would be helpful to solicit graduate views of their experience after taking up employment
in a manner similar to that of Purcell and Quinn’s (1995) graduate survey. This research, also
undertaken within our School, sought to clarify good practice in achieving the fit between hospitality
management education and industry requirements. Such an approach would enable respondents to
look back and evaluate the whole of their programme in relation to the industry skills and knowledge
required in their current positions. More sophisticated statistical analysis might also prove beneficial,
especially if the survey is expanded.
Despite these limitations, the ‘student experience’ has been seen to make a positive contribution to
our School and has therefore been rolled out to postgraduate programmes of study. Whether it will be
rolled out further across the University remains to be seen. What is interesting to consider is whether
the survey should be rolled out across other hospitality programmes and institutions to provide a
vehicle for external benchmarking.
References
Aldridge, S. and Rowley, J. (1998) Measuring Customer Satisfaction in Higher Education. Quality
Assurance in Education 6(4), 197-204.
Clark, M., Riley, M., Wilkie, E. and Wood, R. C. (1998) Researching and Writing Dissertations in
Hospitality and Tourism. London: International Thompson Business Press.
Finn, M., Elliott-White, M. and Walton, M. (2000) Tourism & Leisure Research Methods. Harlow:
Pearson Education.
Fraser, M. (1991) Quality Assurance in Higher Education. London: Falmer Press.
Gill, J. and Johnson, P. (1997) Research Methods for Managers. London: Paul Chapman Publishing
Ltd.
Harvey, L., Plimmer, L., Moon S. and Geall V. (1997) Student Satisfaction Manual. Buckingham:
Open University Press.
Harvey, L. (1995) Student Satisfaction. The New Review of Academic Librarianship 1(1), 161-173.
Jackson, N. (1996) Internal Academic Quality Audit in UK Higher Education: Part 1 - current practice
and conceptual frameworks. Quality Assurance in Education 4(4), 37-46.
King, M., Morison, I., Reed, G., and Stachow, G. (1999) Student Feedback Systems in the Business
School: A Departmental Model. Quality Assurance in Education 7(2), 90-100.
Knutson, B., Schmidgall, R. and Sciarini, M. (1997) Teaching Evaluations in CHRIE Member
Schools: Perceptions of the Students. Journal of Hospitality & Tourism Education 9(1), 30-32.
Lucas, R. (1999) Survey Research. In B. Brotherton (ed.) The Handbook of Contemporary Hospitality
Management Research. Chichester, UK: John Wiley & Sons Ltd, 77-96.
Miles, M. B. and Huberman, A. M. (1994) Qualitative Data Analysis. California: Sage Publications.
Mount, D. J. and Sciarini, M. (1999) IPI and DSE: Enhancing the Usefulness of Student Evaluation of
Teaching Data. Journal of Hospitality & Tourism Education 10(4), 8-13.
Murray, H. G. (1997) Does Evaluation of Teaching Lead to Improvement of Teaching? International
Journal for Academic Development 2(1), 8-23.
Oldfield, B. and Baron, S. (2000) Student Perceptions of Service Quality in a UK University Business and Management Faculty. Quality Assurance in Education 8(2), 85-95.
Oldfield, B. and Baron, S. (1998) Is the Servicescape Important to Student Perceptions of Service Quality? Research Paper, Manchester Metropolitan University.
O’Neil, C. (1997) Student Ratings at Dalhousie. Focus 6(5), Halifax: Dalhousie University, 1-8.
Oppermann, M. (1997) Longitudinal Studies - A Methodological Clarification. Journal of Hospitality
& Leisure Marketing 4(4), 71-74.
Powell, A., Hunt, A. and Irving, A. (1997) Evaluation of Courses by Whole Student Cohorts: A Case
Study. Assessment & Evaluation in Higher Education 22(4), 397-404.
Purcell, K. and Quinn, J. (1995) Careers and Choices in Hospitality Management: An International
Survey. Oxford: Oxford Brookes University.
Quality Assurance Agency (1997) Subject Review Handbook October 1998 to September 2000,
December.
Ramsden, P. (1991) A Performance Indicator of Teaching Quality in Higher Education: The Course
Experience Questionnaire. Studies in Higher Education 16(2), 129-150.
Richardson, K. E. (1998) Quantifiable Feedback: Can it Really Measure Quality? Quality Assurance in Education 6(4), 212-219.
Rose, C. (1996) The Greenwich Experience. Teaching News 42(Spring), Oxford: Oxford Brookes
University, 5-7.
Sciarini, M., Gross, L. and Woods, R. (1997) Outside Input: The Risks and Returns of Evaluating for
Instructional Improvement. Journal of Hospitality & Tourism Education 9(1), 37-40.
Spira, L. (1996) What do we mean by quality? Teaching News 43(Summer), Oxford: Oxford Brookes
University 5-6.
Veal, A. J. (1997) Research Methods for Leisure and Tourism: A Practical Guide, 2nd edn. London: Pitman Publishing.
Wallace, J. (1999) The Case for Students as Customers. Quality Progress 32(2), 47-51.
Wiklund, P. S. and Wiklund, H. (1999) Student Focused Design and Improvement of University
Courses. Managing Service Quality 9(6), 434-443.
Wilson, K., Lizzio, A. and Ramsden, P. (1997) The Development, Validation and Application of the
Course Experience Questionnaire. Studies in Higher Education 22(1), 33-53.

for student evaluation of entire courses or programmes of study, in order to facilitate a more comprehensive assessment (Wilson et al., 1997). This paper reports on such an approach taken within the School of Hotel and Restaurant Management (since renamed the Department of Hospitality, Leisure and Tourism Management) at Oxford Brookes University, England. It begins by discussing the forces driving quality management processes in higher education and the methods employed by the School to evaluate undergraduate programmes. The paper then reports a number of benefits of incorporating student views on their broader educational experiences, and concludes that doing so is an essential part of a quality management and enhancement process.
Managing quality in higher education

The massive expansion of student numbers and changes in government funding have put the issue of quality firmly on the agenda of higher education institutions (Oldfield and Baron, 1998). With the introduction of tuition fees in 1998, students began to view themselves as paying customers, demanding value for money and the right to be heard (Spira, 1996). As in many other parts of the world, the general public began to demand greater accountability and called for valid, reliable and comparable performance data on teaching quality in higher education (Wilson et al., 1997).

In response to these forces, the Quality Assurance Agency (QAA) for Higher Education was established to ensure that all government-funded education is of approved quality, to encourage improvements in the quality of education, and to provide public information on the quality of individual higher education programmes. Quality is assessed at subject level by peer review against six aspects of provision: curriculum design, content and organisation; teaching, learning and assessment; student progression and achievement; student support and guidance; learning resources; and quality management and enhancement (QAA, 1997). As the results of these quality audits are published, the QAA system provides the comparative indicator of the quality of higher education provision that is necessary in a climate of greater accountability. The results from the most recent audit for hospitality and tourism have recently been published. Given the increasingly competitive environment in hospitality and tourism education, with both increased provision and declining student numbers, these audit scores are very important to individual institutions.
The increasingly competitive environment has also led a number of higher education institutions to monitor levels of student satisfaction (King et al., 1999). Measuring student satisfaction as an indicator of quality is consistent with a total quality management (TQM) approach. Wiklund and Wiklund (1999) report that several universities are now adopting TQM and, as a result, a customer focus has become a core value for many. While the premise that students are customers is not universally accepted (see for example, Wallace, 1999), there has been growing support for the use of student satisfaction surveys as an indicator of teaching quality (Aldridge and Rowley, 1998). Furthermore, Murray (1997) reports that the use of these surveys has led to measurable improvements in teaching quality. As such, student feedback can be used as an effective tool for quality enhancement. Harvey (1995) also advises that student satisfaction goes hand in hand with the development of a culture of continuous quality improvement. It has been argued that any quality management tool must serve two functions: one of accountability and one of enhancement (Jackson, 1996). While the QAA approach serves the accountability function, additional internal mechanisms are required to best serve the quality enhancement function. Jackson (1996) argues that the function of enhancement is fulfilled when institutions are better able to understand the strengths and weaknesses in their policies, practices and procedures. Soliciting feedback from students on their entire learning experience enables this understanding to be achieved. Furthermore, if used appropriately, it enables student views to be integrated into quality enhancement decisions (Aldridge and Rowley, 1998).

Designing the 'student experience' survey

Established over fifty years ago, the School is one of the oldest providers of both undergraduate and postgraduate hospitality and tourism education in the UK.
The School has long been concerned with quality management and enhancement and a number of sound, established mechanisms are in place. Until 1997, however, there was little formal input from students on their evaluation of their broader educational experience. With 25 per cent of the School's student population from outside the UK, and more students entering the programmes with different educational backgrounds and experiences, it became clear that we had to monitor our ability to meet the needs of this diverse student population on a broader basis. By actively soliciting student opinions on their overall experience, the voice of another stakeholder could be incorporated into a process of continuous quality improvement. A decision was therefore taken to launch the 'student experience' survey, in which students would be asked to assess the quality of their educational experience. A review of the current literature on student evaluation and feedback enabled the author to make a number of initial decisions regarding the methodological approach to be adopted. An investigation into current 'best practice' followed, via interviews with other Schools within and external to the University. A recent 'themed audit' conducted by the University into the use of student feedback was beneficial at this stage. From this early investigative work, it became clear that a survey method using self-completion questionnaires would enable data to be collected from as large a sample of the student population as possible, in a cost-effective way (Finn et al., 2000). Harvey et al.'s (1997) Student Satisfaction Manual and Ramsden's (1991) Course Experience Questionnaire were used as guidance at this stage to determine the style of the questions. However, quality criteria are related to specific situations (Richardson, 1998) and therefore must be identified by students themselves (Oldfield and Baron, 2000; Aldridge and Rowley, 1998). Therefore, a series of focus groups was held with students to help determine the content of the questionnaire. The decision was taken that one section of the questionnaire would determine student perceptions of curriculum design, organisation and content, and a second section would assess student perceptions of the quality of teaching and learning; student support and guidance; and learning resources and facilities provided within the School and University.
These sections contained a series of statements identified as important to students, in a manner similar to other student satisfaction surveys (see for instance Aldridge and Rowley's (1998) review), although a much shortened version is used. This design also enabled us to bring our internal process more in line with the external quality control requirements of the QAA. As the questionnaire measures student attitudes or perceptions, a quantitative Likert-type scale was selected as appropriate (Clark et al., 1998); student perceptions and feelings are recognised as valid criteria for student feedback (Fraser, 1991). In order to monitor how the needs of particular student bodies are met, the questionnaires are designed to include demographic data such as age, gender and country of origin, as well as programme and mode of study. King et al. (1999) argue that student feedback only provides a snapshot of student opinion and, therefore, the real value of student feedback lies in its use in longitudinal studies (Wilson et al., 1997). Given its purpose as a tool for quality management and enhancement, a longitudinal approach (Oppermann, 1997) was adopted in order to provide comparability and to benchmark performance across different cohorts of students and over time. However, in order to be an effective quality enhancement tool (Jackson, 1996), the questionnaire also had to provide richer data to facilitate decision-making on quality enhancement. For this reason, the questionnaires include a further section comprising a series of open-ended questions. Students are asked to provide feedback on different aspects of their experience and on how they believe improvement could be achieved. This section is also used to investigate any current student issues identified by student representatives and to obtain feedback on actions taken as a result of previous surveys. It was next necessary to determine the best time to administer the survey.
This issue is particularly important for our School because students join programmes at various entry points. Sciarini et al. suggest that formative feedback is more in line with a continuous quality improvement process which seeks to 'add value to student learning experience' (1997:37). However, the value of summative feedback in quality assessment is also recognised (O'Neil, 1997). It was decided, therefore, that these approaches could be combined effectively in the design of the 'student experience'. As a result, three different self-completion questionnaires were developed that could be administered to different cohorts of students at different times of the academic year and as they progress through their programmes of study. First year students are surveyed in the first term of their programme to determine their initial perceptions of their experience and to help identify ways in which we can improve their induction and integration into the School and University. Students who are studying at advanced level, but not in their final year of study, are surveyed in the second term, about halfway through their programme. Graduating students are surveyed in the final academic term of the year to assess their perceptions of their entire experience of studying within the School and at Oxford Brookes University. All three surveys are administered each academic year. A pilot study was conducted and, as a result and after further consultation with student representatives, the questionnaires were altered to allow for some variability across the three questionnaires, but with consistency across the different cohorts surveyed in order to enable benchmarking. Table 1 below gives an example of the first section of the questionnaire for first year students.

On a scale of 1 to 5, where 1 means 'definitely disagree' and 5 means 'definitely agree', circle your responses to the following statements.

Section 1: Effectiveness of Curriculum Design, Organisation & Content [extent disagree (1) - agree (5)]
The criteria for acceptance on the programme were made clear to me.
The aims and objectives of the programme are clear to me.
The standard of the work expected of students is clear to me.
The amount of effort required by students was made clear from the start.
The majority of modules so far are reasonably challenging.
There is a wide choice of modules available to choose from.
I understand the role of my personal tutor.
I plan to make effective use of my personal tutor during my programme of study.
Staff from the School were very helpful during induction week.
I feel that I made the right choice of programme to study.
I understand the learning outcomes for each module I am currently taking.
It has been easy to meet other students on the programme.

Table 1: Survey of First Year Students (Stage 1)

The second section of the questionnaire, however, is standardised across all questionnaires, as indicated in Table 2 below.

Please rank on a scale of 1 to 5, with 1 being the lowest and 5 the highest, the quality of the following:

Section 2: Perceived Quality of Experience [rank 1 (low) - 5 (high)]
Teaching & Learning
Accommodation
Teaching rooms
Administrative Support - School
Administrative Support - University
Personal Tutor System
Catering Facilities
IT facilities
Library Resources
Sports/Leisure Facilities

Table 2: Survey to all Cohorts of Students

The final qualitative section, by its very nature and purpose, alters across the different surveys. Examples of the types of questions asked of graduating students are included in Table 3.

1. How would you describe the atmosphere in the School?
2. In your opinion, are there any ways that support to students could be improved, a) within the School? b) within the University?
3. If the School received additional funding, where do you think the money should be spent?
4. Are there any additional modules you feel should be added to the programme?
5. Were there any times that you have felt under excess pressure? If yes, could you describe the cause of the pressure?
6. Would you recommend the course to a friend?
7. How would you describe the 'student experience' in the School?

Table 3: Survey of Graduating Students: Qualitative Feedback

Assessing the 'student experience': the process

Harvey et al. (1997) recommend following a clear set of procedures in order to ensure student participation. All surveys are conducted during the fourth teaching week of each term. Compulsory modules or programme-wide student meetings are used in order to capture the views of a greater percentage of the student population. Non-teaching staff administer the survey and discuss its importance for quality improvement with students on each occasion. In order to facilitate cross-tabulation of results against different student bodies and longitudinally, both quantitative and qualitative data are analysed through descriptive statistics and cross-tabulation using SPSS.
Qualitative answers are coded using a post-defined method (Miles and Huberman, 1994) that calls for one researcher to manually develop a number of numerically coded categories based on frequency of occurrence. Recognising the potential bias in this system, check-coding takes place, a process that also provides a good reliability check (Miles and Huberman, 1994). Findings are presented at the Undergraduate Programme Committee meeting in week 6 of each term, where student representatives provide further insight and increase the reliability of the findings. Appropriate recommendations are discussed in consultation with student representatives and, where further investigation is deemed necessary, it is usually undertaken through focus groups facilitated by a researcher employed outside the School. A full written report with recommendations is presented for approval the following week at the School Board of Studies, the committee responsible for all programme decisions. Figure 1 depicts the process followed for the 'student experience' and how the findings are used to inform decision-making. Areas of concern outside the School's remit are passed on through appropriate university-wide committees. The full report is then made available to students before the end of the term in order to 'close the feedback loop'. Harvey et al. (1997) suggest that this is a vital process for successfully involving students in quality management and we believe it demonstrates our commitment to taking student feedback seriously. As such, the report is posted electronically on the School intranet and in hard copy on the student notice boards. In addition, student representatives are asked to further disseminate the results of the survey and the resulting actions undertaken.
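As an illustrative sketch only (the School's actual analysis was carried out in SPSS), the descriptive statistics and cross-tabulation described above could equally be produced with Python and pandas. All column names and response values here are hypothetical, invented for the example:

```python
# Illustrative sketch only: the School's analysis used SPSS. This pandas
# equivalent shows descriptive statistics per cohort and a cross-tabulation
# of one Likert item against a demographic variable. All data are hypothetical.
import pandas as pd

# Each row represents one completed questionnaire.
responses = pd.DataFrame({
    "cohort": ["first_year", "first_year", "continuing", "graduating",
               "graduating", "continuing", "first_year", "graduating"],
    "origin": ["UK", "non-UK", "UK", "UK", "non-UK", "non-UK", "UK", "UK"],
    # 1 = definitely disagree ... 5 = definitely agree
    "module_choice": [4, 2, 3, 5, 2, 4, 5, 3],
})

# Descriptive statistics per cohort: mean rating, and the share of
# respondents rating the item 'above average' (3 or higher, as in the paper).
summary = responses.groupby("cohort")["module_choice"].agg(
    mean="mean",
    pct_above_average=lambda s: 100 * (s >= 3).mean(),
)
print(summary)

# Cross-tabulate ratings against country of origin to check whether
# particular student bodies perceive provision differently.
print(pd.crosstab(responses["origin"], responses["module_choice"]))
```

The same groupby/crosstab pattern extends naturally to the other demographic variables collected (age, gender, programme and mode of study).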
Assessing the 'student experience': the findings

The first 'student experience' survey of 104 graduating students was conducted in June 1998. Forty-four students completed the questionnaire, representing 42 per cent of the graduating student population. The quantitative results from the first survey indicated a generally positive perception of the quality of their experience. For instance, over 90 per cent of respondents rated the quality of teaching and learning, library resources, and sports and leisure facilities as above average (3 or higher). In addition, over 85 per cent of respondents felt they had achieved the aims and objectives of the programme and were adequately prepared for a career in the hospitality industry. Over 90 per cent felt that the modules studied were reasonably challenging. The qualitative section of the survey indicated that 86 per cent of the respondents would recommend the course to a friend, and the most frequent responses given to describe their experience within the School were "friendly", "welcoming", "interesting" and "challenging". Although one survey only provides a snapshot of student perception (King et al., 1999), it enabled us to pinpoint a number of areas for improvement. By comparing student ratings for the various criteria, it was clear to see where students felt there was need for improvement. For example, only 44 per cent of respondents considered that there was a reasonably wide choice of modules available to choose from. Module choice was subsequently improved by increasing the availability of resource-based modules to all three terms of study in the academic year. Subsequent ratings improved and the latest survey indicates that 80 per cent of respondents now feel there is a wide choice of modules available.
Other enhancements of provision for students as a result of the survey include changes to the induction programme, a re-induction programme for returning intern students and increased availability of associate lecturers.

Figure 1: Implementing the 'Student Experience' (survey undertaken → analysis of findings → report to Undergraduate Programme Committee, with further investigation where necessary → actions agreed at Board of Studies → feedback to students → actions implemented → results monitored)

To date, 13 'student experience' surveys have been conducted: four with first year students, four with continuing students and five with graduating students. The sheer number of surveys conducted to date, and the way they are monitored across different cohorts, prevents the detailed presentation of all findings within the context of this article. However, it is still possible to demonstrate how the longitudinal design of the 'student experience' has further enabled us to enhance areas of our provision. One such area is the personal tutor system. Personal tutors offer guidance and support on academic, personal and pastoral matters to a set number of students assigned to them. The overall ratings for the quality of the personal tutor system were generally high in early surveys. Mean ratings of quality were between 3.2 and 3.5 (out of 5) and the mode achieved was either 4 or 5. Over the first five surveys, the percentage of respondents ranking the personal tutor system as above average varied between 68 per cent and 85 per cent. However, further analysis revealed some disparity in the student rankings. In one survey almost 30 per cent of respondents ranked the personal tutor system as 2 or below, despite the acceptable mean achieved, and this was further supported by the qualitative data. After reporting these findings and discussing them with student representatives, a decision was taken to investigate personal tutoring further.
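The longitudinal benchmarking described here, tracking the mean, the mode and the proportion of low ratings across successive survey waves, can be sketched as follows. This is purely illustrative: the figures are invented, not the School's data, and the real analysis was performed in SPSS:

```python
# Illustrative sketch only: how personal tutor ratings might be benchmarked
# across successive survey waves. The data below are hypothetical.
import pandas as pd

# One row per respondent; 'wave' identifies the survey administration.
ratings = pd.DataFrame({
    "wave":  [1, 1, 1, 1, 1, 2, 2, 2, 2, 2],
    "tutor": [4, 2, 5, 1, 4, 4, 5, 4, 3, 5],  # rated 1 (low) to 5 (high)
})

# For each wave: mean rating, modal rating, and the percentage of
# respondents rating the system 2 or below (the warning signal the
# paper used to trigger further investigation via focus groups).
benchmark = ratings.groupby("wave")["tutor"].agg(
    mean="mean",
    mode=lambda s: s.mode().iat[0],
    pct_two_or_below=lambda s: 100 * (s <= 2).mean(),
)
print(benchmark)
```

Comparing rows of such a table across waves is what allows a survey programme to distinguish a genuinely improving service from a stable mean that hides a dissatisfied minority.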
Subsequent focus groups revealed that while some students took full advantage of the system on offer from the start of their academic careers, developed a good relationship with their personal tutor and thus ranked it highly, others failed to make use of the system for a variety of reasons and therefore rated it poorly. Many students simply wanted to see more of their personal tutors, but perceived them to have limited availability. This finding is consistent with other research reports on personal tutoring (see for example, Rose, 1996). As a result of these focus groups, a more proactive approach to personal tutoring has been implemented within the School. In order to get students into the habit of seeing their tutors regularly, personal tutors initiate contact on a regular basis from the beginning of a student's academic career, in conjunction with the development of personal progress files. Students are encouraged to meet their personal tutors to discuss their academic performance and to investigate ways to improve their approach to learning. In a meeting with personal tutors during induction, the role of the personal tutor is clearly explained and students are given a progress file that contains a schedule of meetings with their tutor for the entire year. They are also advised that they can request further sessions as they see fit. A clearer system of office hours has also been developed, with tutors available for five hours each week at different times and on different days of the week; these hours are posted up to three weeks in advance. This system is explained to students during induction and in the programme handbook. The results of the subsequent 'student experience' surveys suggest that the new approach is perceived as an improvement (with the mean rating up slightly, ranging from 3.6 to 4, and the mode of 5 regularly achieved).
Over the last eight surveys administered, between 82 per cent and 86 per cent of respondents have ranked the personal tutor system as above average, and the percentage of students rating the system as 2 or below has fallen to under 10 per cent for all but one of the surveys. The qualitative data also support this improved perception of quality. The longitudinal nature of the 'student experience' will enable us to continue to monitor the effectiveness of the new system. This example highlights the importance of using qualitative research methods to validate questionnaire responses (Lucas, 1999). In addition to its purpose as a quality enhancement tool, a number of other benefits have emerged from the approach taken to evaluating the 'student experience' that have a broader, yet still positive, impact on the School. One of these is the development of closer working relationships with the central providers of other university support services. For instance, comments made by students on library provision have resulted in a more proactive approach to students by the Subject Librarian, and greater collaboration with academic staff on recommended student reading and secondary research training for students. Computer Services have also taken note and have recently requested that a copy of each survey be forwarded to their department for their own quality management purposes. This closer co-operation can only improve the overall experience of students. The process undertaken for disseminating the survey results to students has also proved beneficial. As students progress through their programme of study, they can see the difference their feedback has made to their educational provision, further supporting the arguments for formative feedback in quality improvement. Response rates continue to be high and have grown from 42 per cent of the graduating population in the first survey to a usual figure of around 70 per cent of the student cohort. Students also make the effort to complete all sections of the questionnaire fully, rather than suffering from the often-reported 'questionnaire fatigue'. Furthermore, by taking the surveys seriously, students' critical evaluation skills - key skills required of graduates - are enhanced. Richardson (1998) also argues that increased analytical ability is a benefit of student evaluation. The process also encourages all faculty and staff members to become more reflective practitioners. Overall, the findings further support Harvey et al.'s (1997) argument for closing the 'feedback loop'. The final benefit relates directly to the ability of the School to compete effectively in a crowded marketplace. Students consistently describe the School as "friendly", "helpful", "supportive" and "rewarding". These positive comments made by our students, as well as the percentage of students who would recommend the programme to a friend, are regularly posted on our website for all potential students to view and are used in the School's promotional literature. In essence, they are word-of-mouth recommendations by those deemed to be in a trusted and authoritative position: current students.

The 'student experience': a tool for quality enhancement

The 'student experience' survey is now firmly embedded within the School's academic calendar. Student response rates continue to be high and the benefits derived from the process have proved to be wider than anticipated.
Despite predominantly positive feedback and high ratings by the students, these evaluations have enabled us to fine-tune aspects of policies, procedures and practices, and thereby enhance the quality of our provision. While quality assurance procedures should continue to draw on a range of processes and expertise, students are clearly key stakeholders who can provide valuable and reliable data to inform quality improvement decisions. Using student evaluations to regularly monitor the comprehensive 'student experience' has proved to be an effective and essential component of the quality management and enhancement process in our School, and the positive impact has been far-reaching. However, there are arguably improvements that could be made to further enhance the process. For instance, it would be helpful to solicit graduates' views of their experience after they have taken up employment, in a manner similar to that of Purcell and Quinn's (1995) graduate survey. This research, also undertaken within our School, sought to clarify good practice in achieving the fit between hospitality management education and industry requirements. Such an approach would enable respondents to look back and evaluate the whole of their programme in relation to the industry skills and knowledge required in their current positions. More sophisticated statistical analysis might also prove beneficial, especially if the survey is expanded. Despite these limitations, the 'student experience' has been seen to make a positive contribution to our School and has therefore been rolled out to postgraduate programmes of study. Whether it will be rolled out further across the University remains to be seen. What is interesting to consider is whether the survey should be rolled out across other hospitality programmes and institutions to provide a vehicle for external benchmarking.
References

Aldridge, S. and Rowley, J. (1998) Measuring Customer Satisfaction in Higher Education. Quality Assurance in Education 6(4), 197-204.
Clark, M., Riley, M., Wilkie, E. and Wood, R. C. (1998) Researching and Writing Dissertations in Hospitality and Tourism. London: International Thomson Business Press.
Finn, M., Elliott-White, M. and Walton, M. (2000) Tourism & Leisure Research Methods. Harlow: Pearson Education.
Fraser, M. (1991) Quality Assurance in Higher Education. London: Falmer Press.
Gill, J. and Johnson, P. (1997) Research Methods for Managers. London: Paul Chapman Publishing Ltd.
Harvey, L., Plimmer, L., Moon, S. and Geall, V. (1997) Student Satisfaction Manual. Buckingham: Open University Press.
Harvey, L. (1995) Student Satisfaction. The New Review of Academic Librarianship 1(1), 161-173.
Jackson, N. (1996) Internal Academic Quality Audit in UK Higher Education: Part 1 - Current Practice and Conceptual Frameworks. Quality Assurance in Education 4(4), 37-46.
King, M., Morison, I., Reed, G. and Stachow, G. (1999) Student Feedback Systems in the Business School: A Departmental Model. Quality Assurance in Education 7(2), 90-100.
Knutson, B., Schmidgall, R. and Sciarini, M. (1997) Teaching Evaluations in CHRIE Member Schools: Perceptions of the Students. Journal of Hospitality & Tourism Education 9(1), 30-32.
Lucas, R. (1999) Survey Research. In B. Brotherton (ed.) The Handbook of Contemporary Hospitality Management Research. Chichester, UK: John Wiley & Sons Ltd, 77-96.
Miles, M. B. and Huberman, A. M. (1994) Qualitative Data Analysis. California: Sage Publications.
Mount, D. J. and Sciarini, M. (1999) IPI and DSE: Enhancing the Usefulness of Student Evaluation of Teaching Data. Journal of Hospitality & Tourism Education 10(4), 8-13.
Murray, H. G. (1997) Does Evaluation of Teaching Lead to Improvement of Teaching? International Journal for Academic Development 2(1), 8-23.
Oldfield, B. and Baron, S. (2000) Student Perceptions of Service Quality in a UK University Business and Management Faculty. Quality Assurance in Education 8(2), 85-95.
Oldfield, B. and Baron, S. (1998) Is the Servicescape Important to Student Perceptions of Service Quality? Research Paper, Manchester Metropolitan University.
O'Neil, C. (1997) Student Ratings at Dalhousie. Focus 6(5), Halifax: Dalhousie University, 1-8.
Oppermann, M. (1997) Longitudinal Studies - A Methodological Clarification. Journal of Hospitality & Leisure Marketing 4(4), 71-74.
Powell, A., Hunt, A. and Irving, A. (1997) Evaluation of Courses by Whole Student Cohorts: A Case Study. Assessment & Evaluation in Higher Education 22(4), 397-404.
Purcell, K. and Quinn, J. (1995) Careers and Choices in Hospitality Management: An International Survey. Oxford: Oxford Brookes University.
Quality Assurance Agency (1997) Subject Review Handbook October 1998 to September 2000, December.
Ramsden, P. (1991) A Performance Indicator of Teaching Quality in Higher Education: The Course Experience Questionnaire. Studies in Higher Education 16(2), 129-150.
Richardson, K. E. (1998) Quantifiable Feedback: Can It Really Measure Quality? Quality Assurance in Education 6(4), 212-219.
Rose, C. (1996) The Greenwich Experience. Teaching News 42(Spring), Oxford: Oxford Brookes University, 5-7.
Sciarini, M., Gross, L. and Woods, R. (1997) Outside Input: The Risks and Returns of Evaluating for Instructional Improvement. Journal of Hospitality & Tourism Education 9(1), 37-40.
Spira, L. (1996) What Do We Mean by Quality? Teaching News 43(Summer), Oxford: Oxford Brookes University, 5-6.
Veal, A. J. (1997) Research Methods for Leisure and Tourism: A Practical Guide, 2nd edn. London: Pitman Publishing.
Wallace, J. (1999) The Case for Students as Customers. Quality Progress 32(2), 47-51.
Wiklund, P. S. and Wiklund, H. (1999) Student Focused Design and Improvement of University Courses. Managing Service Quality 9(6), 434-443.
Wilson, K., Lizzio, A. and Ramsden, P. (1997) The Development, Validation and Application of the Course Experience Questionnaire. Studies in Higher Education 22(1), 33-53.