Transcript of "Sera conference 2012 student evaluations"
Lawry Price, Roehampton University

SERA CONFERENCE 2012 - 21st-23rd November, Ayr Campus, University of the West of Scotland

"Student evaluations - making them work"

Summary

Any university's success and reputation is dependent to a very large extent on its ability to deliver a quality student experience. This paper reports on the main findings from a university's first standardised, institution-wide internal undergraduate module evaluation survey. The survey was developed to build up a comprehensive picture of students' satisfaction with their undergraduate module experience. The original and first report related to autumn 2011 modules only, while subsequent spring and year-long modules were analysed separately at the conclusion of the academic year 2011/12.

This report therefore focuses on all the university-wide questions that used the five-point rating scale, for autumn modules only. It also provides an overview of the process that took place to conduct this survey and makes some suggestions for improvements to subsequent surveys.

Overall, the results from 4488 responses (a 60% response rate) relating to 218 modules were highly positive. Mean ratings¹ ranged between 4.1 and 4.4 for the seven main question sections (table 1 below) and between 3.7 and 4.6 for the individual questions. Students were particularly positive about the teaching, the supervision of their work and the academic support on offer during module delivery, with an overall mean rating for these areas of 4.4 out of 5. Students were less satisfied with assessment and feedback and with learning resources (both with a mean score of 4.1).
The question 'I completed all the suggested reading' was by far the lowest rated question in the survey, with only 46% of students agreeing with this statement.

Table 1: Mean ratings for the seven main scaled question sections

                                                Institutional mean rating
  Academic support during the module                      4.4
  Quality of teaching and supervision                     4.4
  Module organisation and management                      4.3
  Overall satisfaction with the module                    4.2
  Assessment and feedback                                 4.1
  Learning resources relating to the module               4.1
  Me and my module                                        4.1

____________________________________
¹The mean scores referred to throughout this report relate to un-weighted means. Un-weighted scores are the mean ratings of modules as a whole, and so do not take account of module size.
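The footnote's distinction between un-weighted and size-weighted means can be illustrated with a short sketch; the module names and figures below are hypothetical, not taken from the survey:

```python
# Hypothetical module results: (mean rating, number of respondents).
modules = {
    "Module A": (4.5, 120),  # large module, high rating
    "Module B": (3.8, 15),   # small module, lower rating
    "Module C": (4.1, 40),
}

# Un-weighted mean: each module counts equally, regardless of size.
unweighted = sum(rating for rating, _ in modules.values()) / len(modules)

# Weighted mean: each module contributes in proportion to its respondents.
total_respondents = sum(n for _, n in modules.values())
weighted = sum(rating * n for rating, n in modules.values()) / total_respondents

print(round(unweighted, 2))  # 4.13
print(round(weighted, 2))    # 4.35
```

As the sketch shows, the two figures can differ noticeably when large and small modules are rated differently; the report's institution-wide figures treat every module equally.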
The process of surveying the students worked generally well, given that it was the first time of implementation, but areas were identified where improvements could be made, and proposals were made for subsequent surveys. Survey data was made available to a range of stakeholders (including Heads of Department, module convenors, programme leaders and individual lecturers) at the appropriate level of disaggregation, for self-evaluation purposes.

Organisation of the Survey

The survey software chosen for hosting the module evaluation survey was Evasys, a system maintained by Electric Paper, a company who are experts in student-related surveys. 32 questions were asked, covering the following areas:

1. Me and my module
2. Quality of teaching and supervision
3. Assessment and feedback
4. Academic support during the module
5. The way the module was organised and managed
6. Learning resources relating to the module
7. Departmental specific questions
8. Overall satisfaction with the module
9. A free text section asking students what was good about the module and what could be improved

Most questions gave the students the opportunity to agree/disagree with statements on a scale from "Strongly agree" to "Strongly disagree". There was the additional opportunity for individual departmental questions, as well as a section for students to comment more generally. In order to avoid the low responses typically associated with online questionnaires, it was decided that the survey would take place on paper during class time. A summary of the process for conducting the survey is shown at appendix 1.
A review of the overall process of taking the survey took place in June 2012, involving key stakeholders, with the aim of improving the efficiency of the process for 2012/13. The university's Planning Department, in liaison with its Academic Office, identified the following issues emerging out of this first institution-wide survey:

• Awareness among academic members of staff about the process could have been greater.
• Concerns were expressed by some academic staff about what the survey data would be used for and how secure the data was.
• Gaps in data on modules (for example, a lack of up-to-date lecturer information) meant that preparing data was time consuming.
• Academic departments would benefit from clearer communication on what reports are provided, to whom, at what level of detail and when.

Key proposals emerging out of this review:
• A need for review of the overall process (which took place in June 2012).
• A timeline of activities and a broad outline of the process for 2012/13 would be sent out in the summer period to all department contacts. By establishing time frames, informing departments about the basic requirements and reiterating the benefits of the process, an increased sense of ownership of the project could be generated.
• A series of road shows would be offered to academic departments with the aim of raising awareness of the process as well as giving opportunities for questions. The university's Planning Department would consult on this with the established review group to see what departments would like covered, and would then design the sessions accordingly in conjunction with staff from the Academic Office and the Learning and Teaching Enhancement Unit of the University.
• The privacy policies would be highlighted at the road shows, and the document (appendix 3) would be distributed before the start of the process.
• A review of how to maximise the accuracy of the module data would take place as part of the overall project review.
• A timeline of who received which reports, and at what level of detail, would be discussed at the project review and subsequently communicated to departments.

Institutional Summary

Overall, the results from 4488 responses relating to 218 modules were highly positive. Mean ratings ranged between 3.7 and 4.6 for individual questions. Students were particularly positive about teaching and learning and academic support. 'There were sufficient opportunities to participate and ask questions' was the highest rated question, with a mean rating of 4.6 and 90% of respondents agreeing with this question.

The question 'I have completed the suggested reading' was the lowest rated question of the survey, with only 46% of students agreeing with this.
Only 58% of respondents agreed that 'the library resources relating to this module have been good enough for my needs', and 57.5% agreed with the statement 'I have submitted a lot of my coursework online'. 'Understanding the marking criteria before I completed the assessment' and 'understanding how to improve my work from the feedback I received' both received mean ratings of 4, with 67.5% and 64.7% of respondents agreeing with these statements respectively.

Key findings by Question & Department

The following summarises headline responses to individual questions, by questionnaire theme.

Me and my module – the lowest mean rated question of the survey was 'I have completed all the suggested reading', with all but one department having this as their lowest mean score. Students responded positively across all departments to attending associated sessions and tutorials, with 83% agreeing with this question.
Overall satisfaction with the module – responses on overall satisfaction were favourable across all departments. 'Overall I am satisfied with the quality of the module' had the highest mean departmental ratings in this section, ranging between 4 and 4.5. A module's applicability to the workplace was the lowest rated question, with mean values between 3.8 and 4.

Quality of teaching and supervision – scores were highly positive in response to this question. Students in all departments rated the question on 'sufficient opportunities to participate and ask questions' the highest, with departmental mean ratings ranging between 4.3 and 4.7. Levels of satisfaction were high for all questions in this section, with only one mean rating falling below 4.

Assessment and feedback – this section contained some of the lowest scores of the survey. The areas of most concern for students were 'I understood the marking criteria before completing the assessment', with departmental mean scores ranging from 3.6 to 4.2, and 'I understood how to improve my work from the feedback I received', with departmental mean scores ranging from 3.5 to 4.2. The responses on online submission of coursework needed careful interpretation, as some coursework is submitted on paper as well as online.

Academic support during the module – students responded very positively to this question, expressing high levels of agreement with being able to discuss matters with lecturers. Mean scores ranged between 3.9 and 4.5.

The way the module was organised and managed – students expressed high levels of satisfaction with module organisation and management. Scores for the two questions in this section appear closely related, with mean scores ranging from 3.9 to 4.6.

Learning resources relating to the module – this question had the second lowest mean score, after 'I have completed my suggested reading'.
Out of this it was suggested that there would be value in investigating whether the two responses were interrelated at module level. Library staff responded by arranging to use the Evasys system to evaluate these findings. Mean ratings ranged more widely in this section, indicating wider differentials in satisfaction between departments. Mean ratings for 'the library resources relating to this module have been good enough for my needs' ranged from 3.2 to 4.1, and for 'the moodle site has been good enough for my needs' from 3.6 to 4.3.

The remaining two questions were driven by individual Departments and were specifically designed to elicit key messages related to the student experience. The Departmental specific questions produced results that were highly positive and provided key insights and feedback from students which, when linked to the free text section (asking students what was good about the module and what could be improved), were informative tools useful in the context of informing review, planning forward and prompting potential change to module content and delivery (with further comment regarding resources to support learning included).

Separate evaluations and summary reports were produced for Heads of Department and those responsible for Learning and Teaching. These focused on summaries of the module evaluations and response rates, and gave indicators as to how individual Departments compared to the university results as a whole. It was re-emphasised here that this was first and foremost an "evaluating the
module" process, and certainly not an evaluation of the individual lecturer/module leader overseeing delivery of the particular module.

Review and changes for follow-on processes

A full review process followed the completion of this first round of autumn module evaluations. Some key changes in operational matters were put in place for the spring period, as well as a confirmed commitment and accepted view that the same module evaluation template would be used, to maintain consistency for the academic year in question.

A renewed quest for even better response rates, and therefore for more detailed and accurate data to emerge from the exercise, was a key objective. To achieve this, a further concerted awareness-raising campaign to reiterate and communicate the purpose, value and worth of the activity was put in place for students and staff alike – "buy-in" was deemed to be crucial for on-going success for all stakeholders in the process. There was recognition too that the very careful planning invested in the project, over a long lead-in period (including a contained pilot exercise preceding full university-wide implementation), had played its part in the initial success. The need finally to go with what was in place for an autumn module evaluation (to also meet the timescale originally planned) was a key decision, but one that also acknowledged potential shortfalls where specific data was lacking.

The outcome was that individual departments did indeed benefit from receiving data-rich feedback on a scale not available nor experienced before, and module leaders and designers were placed in a position to fully utilise this accrued information and detail, both for review purposes and for future planning. Ultimately, the drivers behind student satisfaction were made more pronounced, open and transparent – the key aim of the project in the first place.
Rather than wait for the results of the NSS (National Student Survey) and other related barometers to gauge student response about their experiences, the university was now able to monitor this across the particular academic year in question. The beginnings of a culture of module evaluation had therefore been established, on which to further build.
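The headline statistics quoted throughout the report – a mean rating on the five-point scale and a percentage of respondents agreeing – can both be derived from the raw response counts for a question. The following sketch uses hypothetical counts, not figures from the survey:

```python
# Hypothetical counts of responses on the five-point scale, where
# 1 = "Strongly disagree" and 5 = "Strongly agree".
counts = {1: 10, 2: 25, 3: 60, 4: 150, 5: 120}

total = sum(counts.values())

# Mean rating: sum of (scale point x count), divided by total responses.
mean_rating = sum(point * n for point, n in counts.items()) / total

# Percentage agreement: respondents choosing "Agree" (4) or "Strongly agree" (5).
pct_agree = 100 * (counts[4] + counts[5]) / total

print(round(mean_rating, 1))  # 3.9
print(round(pct_agree, 1))    # 74.0
```

This also illustrates why the report quotes both figures: a question can attract a respectable mean while a sizeable minority still sit at the neutral or disagree end of the scale.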
Appendix 1

UNDERGRADUATE MODULE EVALUATION PROCESS FLOW CHART

1. LTEU and Planning design/agree the standard questionnaire.
2. Planning extract module data from SRS; Timetabling provide additional details on modules; Academic Office provide details of lecturers teaching modules.
3. Planning reconcile module, lecturer and timetabling information and send to programme convenors to fill in gaps.
4. Planning upload module information into Evasys and generate coversheets for Academic Office.
5. Academic Office arrange for printing of questionnaires and cover sheets (a different cover sheet per module).
6. Academic Office make up packs for each module: collate questionnaires and coversheets and put them in A4 envelopes.
7. Academic Office distribute packs to departmental offices.
8. Departmental offices distribute questionnaires to lecturers.
9. Students complete the questionnaire.
10. Lecturers return completed questionnaires to departmental offices.
11. Departmental offices return completed questionnaires to Academic Office.
12. Academic Office compile completed questionnaires and cover sheets and send for scanning.
13. Electric Paper scan coversheets/forms into ScanStation software, including checks.
14. Electric Paper upload module responses into Evasys.
15. Planning send individual module reports out to lecturers.
16. Planning send summary reports to Heads of Department and LTAG chairs.