Thanks for the invitation; we hope that sharing our practice and lessons learnt will be of help to others. The session will include activities to enable participants to consider some of these in relation to their own practice.
The University’s programme has enabled us to learn a lot, not only about students but about how departments operate – useful to get a “closer lens” on local practice – and about how institutional processes could be enhanced to support SRS. Working at department level has enabled us to empower local teams to match initiatives to their particular students (e.g., activities appropriate for large or medium cohorts) and to identify wider issues to share across disciplines. This might not have happened if we had invested entirely in University-wide initiatives.

In 2010-11 we produced retention data for all the University’s programmes by POLAR category and identified the five departments with the lowest retention for POLAR 1 and 2 students, plus all other retention (64–82% overall, varying between programmes). We visited all of these departments, discussed the “What Works?” findings and invited programme teams to identify strategies that would benefit students, using the principles of creating a strong sense of belonging and using the academic sphere as the most important site for nurturing participation. We used part of our Access Agreement (AA) funding for new activities, with a steering group, regular reporting (student and staff engagement with activities, plus budget spend) and presentations at the staff conference.

AA funding has to benefit POLAR 1 and 2 students, but obviously we cannot discriminate between these and other groups. The challenge is to build local practice so that (a) departments identify successful interventions and (b) related funding is embedded into their local budgets. Possible question: how does the University of Chester spend the rest of its AA funding? Thought to keep in mind for later: how does your institution spend its AA funding?
Can we learn something from the nature of the programmes concerned? We suggest that for these disciplines there could be a significant difference between student expectations and the reality of their learning experience. This could be both a programme and an institutional issue.

Staff views of why retention is problematic: low average entry points; A level vs BTEC; poor attendance; commuting students; timetable and catering arrangements (i.e., external factors).

Were institutional policies implemented? – attendance monitoring; adherence to the PAT scheme; individual follow-up for non-submissions.

Most departments were not making use of, or were not confident with, institutional data; local data sets needed improving. Most departments thought they were “listening to students”, but once this was formalised with focus groups etc., they realised that they had misconceptions or misunderstandings, e.g., about the pre-HE experience.

Local leadership is critical – our evidence suggests staff teams need to have a sense of belonging and engagement before students can. Compare with our successful departments – higher entry points, but also better use of data and more cohesive staff teams. An extended history of high attrition rates causes gloom – sometimes a fresh approach is needed to turn things round.
Psychology and Criminology; CS had the sixth-lowest retention in the original analysis. We used knowledge from Year 1 of the University programme, plus “What Works?”, to suggest interventions. There are workload implications – a local leader is needed, but the HoD must see that the work is recognised and shared. Staff continue to find monitoring and evaluation challenging – this needs to be planned in advance, alongside planning the interventions (like data collection and analysis in a research study!). Links between the core team and the HEA/Paul Hamlyn Foundation are valuable – additional support and opportunities to share with other HEIs. We also have access to data collected about the institution by Mantz Yorke on behalf of the funders.
For a student, the experience of being in HE is like a patchwork. Monitoring and evaluation of interventions is essential, but separating out the impact of individual activities is hard (and not necessarily appropriate) – it is about their contribution to the “patchwork” of experience and the combined effect of these for each student.

Our data show that one year on, only one department improved its retention data, BUT it was only after the first year that we appreciated how the departments were working and which local and institutional-level practices needed development. The successful department in Year 1 (Criminology) improved from 74.2% through 79.6% (11-12 was anomalous – higher entry grades) to 82.1% – new leadership and close attention to data; individual follow-up on non-submission; socially based academic activities. Another department where retention declined slightly had improved NSS results – it moved into the top few for student satisfaction – the impact of an improved environment? Other departments learnt a lot and expect to see improvements in the second year (for reasons such as improved attendance monitoring, follow-up of non-submissions and leadership). So – an important lesson – even if retention does not improve initially, these activities can improve overall satisfaction and student progression (achievement).

Target setting: 2% is only two students, but £18K – multiplied over several programmes this has a significant effect, and interventions may improve the experience of all students. Many initiatives can be evaluated through existing or slightly modified mechanisms, e.g., module evaluations; student experience surveys; attendance and assessment data (from support departments also). “New” methods include focus groups; post-event questionnaires; anecdotal evidence of engagement – is this latter form valid? It helps if staff believe students are more committed to their learning (attitude change); feedback about peer support is valuable too. Reporting to the steering group/core team is important, as is reporting to the SMT.
Sharing at institutional events; business planning; Faculty reviews; programme monitoring, etc. More focus needed on pre-HE experience.
Visits can be exclusive rather than inclusive. PAS (either mentoring or PAL) takes several years to become embedded in student culture. Workshops may only be attended by the “worried well”, not the students who need them.
Prompts to inform discussion.

Point 1: a team approach with all staff engaged; a sense of department identity; a culture of staff being present outside contact hours; a sense of location (which may be related to physical space or to how the virtual community is constructed); appropriate use of social media pre-entry and to support induction; admin and support staff recognised as just as important; students as partners in learning, e.g., department societies, web pages and newsletters; student and staff engagement with evaluation processes.

Point 2: a forum for key service and academic departments to work together – marketing and admissions, registry, MIS, student services and the SU. Data exploration is a new skill for many academics; similarly, data experts often need to work directly with colleagues to learn which aspects of their data sets are difficult to understand. A reporting structure up to and including the SMT – engagement of a member of the SMT/the VC. A team who work on SRS to promote and share their work.
Dr Kate Irving, University of Chester - Developing your approach to student retention and success
Student retention and success
- finding out “What Works”
Dr Kate Irving
Director of Learning and Teaching
What Chester did - the University’s SRS programme
• Initial identification of students/programmes – use of data;
• Engagement with staff – using “What Works?”;
• Support – Access Agreement funding and steering group;
• Evaluation and reporting mechanisms.
Learning from Year 1
• Discipline related? Media; Criminology; Sports & Exercise Science; Art & Design; Marketing, Tourism and Events Management;
• Staff perceptions of determinants of retention;
• Adherence to institutional policies;
• Data – availability, accessibility and usage;
• Engagement and leadership of programme teams.
Activity 1 – which programmes and why?
• Thinking about retention and success at your institution, are there any patterns or links between programmes where this needs to be addressed?
• How could you/have you found out about why there is lower retention and success in these programmes?
• What strategies have been adopted to improve retention and success so far? “What Works” for you?
Year 2 - joining the “What Works?” programme
• Opportunity to extend existing programme and explore Single/Combined Honours retention;
• Three programmes: Criminology; Psychology and CS;
• Focus on monitoring and evaluation valuable;
• Incorporated learning from Year 1 of the University programme to improve impact of initiative;
• External support – improved engagement.
Activity 2 – monitoring and evaluation
Thinking of one or two initiatives that you are planning at your own institution…
• What will be the opportunities and challenges of putting these activities into practice?
• How might each be monitored and evaluated?
• Decision to “stay” is complex;
• Improvements may take several years…
• Set realistic targets;
• Match monitoring and evaluation to activity/intervention – utilise existing mechanisms when appropriate;
• Support projects through local and institutional leadership.
Activity 3A – monitoring and evaluation
What might the opportunities and challenges be of the following initiatives to support retention and success? How might each be monitored and evaluated?
• A curriculum-related half-day visit for students;
• Introducing peer assisted support for students;
• Additional workshops for students on “troublesome concepts” in the discipline.
Activity 3B - local and institutional leadership
• In your experience, what are the leadership characteristics of successful departments?
• List ways in which academic, service departments and institutional-level teams can work together to enhance retention and success.
Activity 4 - next steps…
• Share a list of actions for you to take back to your institution;
• Please ask if we can help by discussing your ideas:
• Thank you very much for your contributions!