Bel conference league tables july 2013

Speaker notes
  • Definitely used in recruitment of overseas students, and overseas partners and potential partners have an interest in how we perform or are viewed
  • Final point – subject level tables are not consistent, as the level of granularity of data used is not reflective of what actually happens
  • Retention is not a measure in The Guardian league table. Other league tables use retention measures based on a HESA performance indicator, which does not directly correlate with our own internal progression data.
  • Under future performance measures you could perhaps mention the REF. The way we played it last time has had a negative impact on our league table position in every other league table (not the Guardian) every year since 2008.
  • Transcript

    • 1. Students as Partners in creating our public reputation
    • 2. Contents • What are league tables for? • Student input to league tables. • How league tables are constructed. • What do they say about us? • How should we respond?
    • 3. Where do you think we should be in a university league table?
    • 4. Why are they important? • Used in recruitment decisions – maybe. • Reputation – definitely • How our students see us • Our own self-worth
    • 5. What are the problems with league tables? • League tables use the most recent data sets available at the time of publication. • There is a time-lag when using published data sets and time series data. • Different statistical analysis, methodology and weightings are used in the different tables. • There can be changes to criteria and methodology. • Data sets can change. • It is difficult to make comparisons between subjects and between competitors at this lower level.
    • 6. More problems… • Categorisation of courses and subjects differs, and the tables only concentrate on FT UG provision – this University is about so much more than that. • The editorial that appears alongside each University entry is not always available to the University to update or influence prior to publication. • Information is often not the same as we might use internally. • Newspapers love to trumpet obvious but wildly misleading headlines.
    • 7. However – we are stuck with them… so let’s make sure we understand what they say about us, and try to make them work for us.
    • 8. Over-riding importance – Improving Student Outcomes • We need to allow students to reach their full potential – Celebrating individual success – Maximising individual rewards – Maximising contribution to society – How do we do this in areas of low aspiration? • Students are increasingly consumers – Access price comparisons – is there price sensitivity? – Access to performance comparisons
    • 9. Impact of student outcomes on the University • KIS • Unistats • National Student Survey • League Tables • Which? University Guide • College and University websites • The Complete University Guide • ……etc etc
    • 10. Student Outcomes in Public Information • Inputs – Spend per student – Staff student ratios – Entry standards – Research ratings – Cost of living – Spend on services – Faculty spend • Outputs – Number of “good” degrees – National Student Survey results – Retention rates – Employability
    • 11. Student Input to Public Information • National Student Survey – Measure of final year students – Questions on satisfaction with course, feedback, teaching • Used in: – League tables/KIS (external) – Portfolio performance review (internal) – Award annual monitoring (internal)
    • 12. Ones to watch • Complete University Guide (May) • Guardian University Guide (June)* • Times/Sunday Times Good University Guide (August) • * In the University strategic plan: aim to be in the top 50.
    • 13. What do they say about us now?
         Year   Complete University Guide   Guardian   Times   Sunday Times
         2011   80                          69         77      96
         2012   99                          77         89      105
         2013   108                         96         100     107
         2014   113                         92         ???     ???
    • 14. SU positions in the Complete University Guide for each factor
         Entry Standards           111/124
         Student Satisfaction      54/124  <- good!!
         Research Assessment       111/124
         Graduate Prospects        116/124
         Staff:student ratio       87/124
         Academic Services Spend   94/124
         Facilities Spend          62/124
         Good Honours              103/124
         Degree Completion         108/124
         Overall                   113th
    • 15. To be successful in the CUG league table : • increase the number of good degrees awarded – how? • recruit better qualified students who are more likely to get good degrees – WP? Value added? • with better degrees, more graduates will get graduate entry jobs • increase research assessment scores – limit number submitted to REF
    • 16. Guardian league table factors (a minimal worked sketch of how these weights combine appears after the transcript)
         Factor                            Source                                                              Weighting
         % Satisfied with Teaching         NSS 2012                                                            10%
         % Satisfied overall with course   NSS 2012                                                            5%
         Expenditure per student (FTE)     HESA data for 2010–11 and 2011–12                                   15%
         Student:staff ratio               HESA data for 2010–11 and 2011–12                                   15%
         Career prospects                  HESA/DLHE data for 2010–11                                          15%
         Value added score (/10)           HESA data (i.e. the cohort who graduated in 2012)                   15%
         Average entry tariff              Typical UCAS scores of students aged 20 or under on entry (HESA)    15%
         % Satisfied with Assessment       NSS 2012                                                            10%
    • 17. [Chart] Average entry tariff by year: 2010: 249.4, 2011: 235.8, 2012: 238.5, 2013: 245.5, 2014: 254
    • 18. [Chart] Career prospects (%) by year: 2010: 65.2, 2011: 63.2, 2012: 56.4, 2013: 55.1, 2014: 48
    • 19. Are we trying to improve position or performance? • Clearly, we can try to play the game of moving our league table position. • What we really want to do is improve our performance in each of the key areas to make sure there is a sustainable and genuine change.
    • 20. What steps can we take? Guardian criteria and suggested actions:
         Entry standards: Review all current standard offers – are we pitching ourselves properly against competitors? Average A level scores have gone up; have our offers?
         Student/staff ratio: Reviewing more thoroughly the data we submit to HESA; developing better models of SSR to identify where investment is most needed.
         Spend per student: Reviewing more thoroughly the data we submit to HESA; identifying capital spend needed; increased recent spend on libraries and IT will have an impact.
    • 21. Guardian criteria and suggested actions (continued):
         Value added: Increase the number of “good” degrees awarded; reviewing all level modules with low pass rates and average marks; reviewing degree classification rules as part of the change to % calculation; identifying through portfolio review awards with consistently poor progression and attainment.
         NSS teaching, assessment and feedback, and overall satisfaction: Faculty action plans and award level plans; increased student engagement with the survey; seven principles of feedback; online assessment and feedback project.
         Employment: Encouraging more students to complete DLHE; Staffordshire Graduate – improving our students’ chances of success.
    • 22. Keeping track of how we are doing • Portfolio Performance Measurement – Provides an internal review mechanism for award performance – Records market attractiveness – Retention, progression and “good” degrees – National Student Survey, DLHE • Future performance measures – Value added (difficult to get the raw data) – SSR at School and subject level – REF
    • 23. Conclusions • League tables are a necessary evil and a part of the HE landscape • One of our KPIs is ‘to be amongst the top 50 institutions in The Guardian league table’ • We all have a part to play in explaining to students and parents what they really measure • Central work is ongoing on data returns and strategy • Schools working in partnership with our students on supporting their experience and attainment
    • 24. @mikehamlyn blogs.staffs.ac.uk/mgh1/
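
Slide 16 lists the Guardian factor weightings but not how they are combined into a single score. As a rough illustration only, here is a minimal Python sketch of a weighted composite. This is not the Guardian's actual method (which standardises each factor against the whole sector before weighting); the normalisation bounds, example values and function names below are invented for demonstration.

```python
# Illustrative sketch only: the weights come from slide 16, everything else
# (normalisation bounds, example values) is assumed for demonstration.

# Guardian factor weightings from slide 16 (they sum to 1.0).
WEIGHTS = {
    "nss_teaching": 0.10,
    "nss_overall": 0.05,
    "spend_per_student": 0.15,
    "student_staff_ratio": 0.15,   # lower is better, so it is inverted below
    "career_prospects": 0.15,
    "value_added": 0.15,
    "entry_tariff": 0.15,
    "nss_assessment": 0.10,
}

def normalise(value, low, high, invert=False):
    """Map a raw factor value onto a 0-1 scale between assumed sector bounds."""
    score = (value - low) / (high - low)
    score = min(max(score, 0.0), 1.0)
    return 1.0 - score if invert else score

def composite_score(factors):
    """Weighted sum of normalised factor scores, scaled to 0-100."""
    total = 0.0
    for name, (value, low, high, invert) in factors.items():
        total += WEIGHTS[name] * normalise(value, low, high, invert)
    return 100.0 * total

# Hypothetical institution: (raw value, assumed sector low, assumed sector high, invert flag).
example = {
    "nss_teaching": (82.0, 60.0, 95.0, False),        # % satisfied with teaching
    "nss_overall": (80.0, 60.0, 95.0, False),         # % satisfied overall with course
    "spend_per_student": (5.0, 2.0, 10.0, False),     # indexed expenditure per student
    "student_staff_ratio": (20.0, 10.0, 30.0, True),  # students per member of staff
    "career_prospects": (55.1, 40.0, 90.0, False),    # % in graduate jobs or further study
    "value_added": (5.0, 1.0, 10.0, False),           # value added score out of 10
    "entry_tariff": (245.5, 150.0, 600.0, False),     # average UCAS tariff points
    "nss_assessment": (75.0, 55.0, 90.0, False),      # % satisfied with assessment
}

print(f"Composite score: {composite_score(example):.1f} / 100")
```

Because the weights sum to 1.0, the five factors weighted at 15% (spend, staff:student ratio, career prospects, value added and entry tariff) move the composite far more than the 5% overall-satisfaction question, which is why the suggested actions on slides 20 and 21 concentrate on those areas.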
