Keynote (7th eSTEeM Annual Conference): Critical discussion of Student Evaluation scores and academic performance at the OU

Satisfaction surveys are increasingly used as a proxy for student learning in higher education, for example in the UK's Teaching Excellence Framework. In this keynote I will critically examine this practice using OU data on 111,256 students across 151 modules. Student satisfaction was significantly higher in modules where students received large amounts of learning materials and worked through them individually than in modules where students had to collaborate and work together. However, the best predictor of whether students actually passed a module was the presence of collaborative learning activities, such as discussion forums and online tuition sessions. In fact, no relation was found between student satisfaction scores and academic performance in these modules. During the keynote I will therefore discuss whether we should listen to students' feedback at all and, if so, which students' voices we should heed.


  1. 7th eSTEeM Annual Conference: Critical discussion of Student Evaluation scores and academic performance at the OU. If you want to vote and share, log in at https://pollev.com/bartrienties552 @DrBartRienties
  2. Background of QAA Study 2015
     Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series 2015-16. Gloucester: Quality Assurance Agency.
     • HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014).
     • Measuring student satisfaction is important to pinpoint strengths and identify areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012).
     • The potential benefits and drawbacks of student evaluations are well documented in the literature (see for example Bennett & De Bellis, 2010; Crews & Curtis, 2011):
       o Recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Moskal et al., 2015; Rienties, 2014).
       o Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback.
     • An emerging body of literature questions the appropriateness of student satisfaction for measuring teacher effectiveness (Marsh, 2007; Li et al., 2016; Uttl et al., 2017).
  3. Key Questions of the Project
     1. To what extent are institutions using insights from the NSS and institutional surveys to transform their students' experience?
     2. What are the key enablers of and barriers to integrating student satisfaction data with QA and QE?
     3. How are student experiences influencing quality enhancements?
        a) What influences students' perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of the learning experience?
        b) Is the student cohort homogeneous when considering the key drivers of satisfaction? For example, are there systematic differences depending on the level or programme of study?
     Rienties, B., Li, N., & Marsh, V. (2015). Modeling and managing student satisfaction: use of student feedback to enhance learning experience. Subscriber Research Series 2015-16. Gloucester: Quality Assurance Agency.
  4. Methodology (Logistic Regression) & Validation
     Step 1: A descriptive analysis was conducted to discount variables that were unsuitable for satisfaction modelling; Step 1 also identified highly correlated predictors and methodically selected the most appropriate. [Variable subsets: module, presentation, student, concurrency, study history; outcome: overall satisfaction (SEaM).]
     Step 2: UG new, UG continuing, PG new and PG continuing students were modelled separately. Each subset of variables was modelled in groups; the variables that were statistically significant within each subset were then combined and modelled to identify the final list of key drivers.
     Step 3 (Validation): All models were verified using subsets of the whole data to ensure the solutions are robust, and a variety of model-fit statistics were used to identify the optimum solutions. At Step 3 we found that the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale for UG students; the solution without the KPIs included was much easier to use for identifying clear priorities for action. (A code sketch of the stepwise selection follows below.)
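To make the grouped, stepwise selection of Steps 2 and 3 concrete, here is a minimal sketch using Python and statsmodels. It is not the study's actual pipeline: the dataframe, column names and variable groupings are hypothetical placeholders, and a real analysis would also handle categorical coding and missing data.

```python
import statsmodels.api as sm

# Hypothetical variable groupings mirroring the slide's subsets
# (module, presentation, student, concurrency, study history).
VARIABLE_GROUPS = {
    "module":        ["module_level", "module_size"],
    "student":       ["age", "is_new_student"],
    "study_history": ["previous_pass_rate", "credits_completed"],
}

def significant_in_group(df, predictors, outcome="satisfied", alpha=0.05):
    """Step 2: fit one logistic model per variable group and keep only
    the predictors that reach statistical significance."""
    X = sm.add_constant(df[predictors].astype(float))
    fit = sm.Logit(df[outcome].astype(float), X).fit(disp=False)
    return [p for p in predictors if fit.pvalues[p] < alpha]

def key_drivers(df, outcome="satisfied"):
    """End of Step 2: pool the per-group survivors into one combined
    model to identify the final list of key drivers."""
    survivors = [v for group in VARIABLE_GROUPS.values()
                 for v in significant_in_group(df, group, outcome)]
    X = sm.add_constant(df[survivors].astype(float))
    return sm.Logit(df[outcome].astype(float), X).fit(disp=False)

def validate(df, outcome="satisfied", n_rounds=5, seed=0):
    """Step 3: refit on random 80% subsets and compare fit statistics
    across rounds to check the solution is robust."""
    for i in range(n_rounds):
        sample = df.sample(frac=0.8, random_state=seed + i)
        fit = key_drivers(sample, outcome)
        print(f"round {i}: pseudo R2 = {fit.prsquared:.3f}")
```

In this sketch, UG new, UG continuing, PG new and PG continuing students would each be passed in as a separate dataframe, matching the slide's separate models.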
  5. Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4), 657-672. Impact factor: 1.243.
     According to 111,000+ students, what distinguishes excellent from good to not-so-good modules?
     1) Good advice from teachers
     2) Links well to professional practice
     3) Links well to qualifications
     4) Quality of teaching materials
     5) Quality of tutors
  6. Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education, 42(4), 657-672. Impact factor: 1.243.
  7. How does student satisfaction relate to module performance? [Chart: satisfaction plotted against students who successfully completed the module.]
  8. Ullmann, T., Lay, S., & Rienties, B. (2017). Data wranglers' key metric report. IET Data Wranglers, Open University.
  9. Is satisfaction related to students' behaviour and performance?
     • Learning design data (>300 modules mapped)
     • VLE data: >140 modules with weekly aggregated individual data; >37 modules with fine-grained daily individual data
     • Student feedback data (>140 modules)
     • Academic performance (>140 modules)
     • Predictive analytics data (>40 modules)
     • Data sets merged and cleaned; 111,256 students undertook these modules (see the merge sketch below)
     Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
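A minimal sketch, assuming hypothetical per-module CSV extracts, of how the learning design, VLE, feedback and performance sources could be merged on a shared module identifier. File names and column names are illustrative, not the OU's actual schema.

```python
import pandas as pd

design      = pd.read_csv("learning_design.csv")    # one row per module
vle         = pd.read_csv("vle_weekly.csv")         # weekly engagement per module
feedback    = pd.read_csv("student_feedback.csv")   # SEaM satisfaction scores
performance = pd.read_csv("performance.csv")        # completion / pass rates

# Aggregate weekly VLE activity to one engagement figure per module.
vle_per_module = (vle.groupby("module_code", as_index=False)
                     .agg(mean_weekly_engagement=("minutes_online", "mean")))

# Inner-join on module code so only modules present in every source remain.
merged = (design
          .merge(vle_per_module, on="module_code")
          .merge(feedback, on="module_code")
          .merge(performance, on="module_code"))

# Basic cleaning: drop duplicates and rows with missing key measures.
merged = (merged.drop_duplicates("module_code")
                .dropna(subset=["satisfaction", "pass_rate"]))
print(f"{len(merged)} modules in the combined data set")
```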
  10. Table 4. Regression model of learner satisfaction predicted by institutional and learning design analytics

      Predictor               Model 1    Model 2    Model 3
      Level0                   .284**     .304**     .351**
      Level1                   .259       .243       .265
      Level2                  -.211      -.197      -.212
      Level3                  -.035      -.029      -.018
      Year of implementation   .028      -.071      -.059
      Faculty 1                .149       .188       .213*
      Faculty 2               -.039       .029       .045
      Faculty 3                .090       .188       .236*
      Faculty other            .046       .077       .051
      Size of module           .016      -.049      -.071
      Finding information                -.270**    -.294**
      Communication                       .005       .050
      Productive                         -.243**    -.274**
      Experiential                       -.111      -.105
      Interactive                         .173*      .221*
      Assessment                         -.208*     -.221*
      LMS engagement                                 .117
      R-sq (adj.)              20%        30%        31%

      n = 150 (Models 1-2), 140 (Model 3); * p < .05, ** p < .01

      • Level of study predicts satisfaction
      • Learning design (finding information, productive, assessment) negatively predicts satisfaction
      • Assimilative learning design (the benchmark) and interactive learning design positively predict satisfaction
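A self-contained sketch, not the paper's exact specification, of the kind of hierarchical regression behind Table 4: satisfaction regressed on institutional variables (Model 1), then learning design (Model 2), then LMS engagement (Model 3). The file `modules.csv` and its columns are hypothetical; z-scoring all numeric variables yields standardized coefficients comparable to the betas on the slide.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("modules.csv")  # hypothetical module-level extract

# z-score numeric columns so coefficients are standardized betas.
num = df.select_dtypes("number")
df[num.columns] = (num - num.mean()) / num.std()

model1 = "satisfaction ~ level0 + level1 + level2 + level3 + year + module_size"
model2 = model1 + (" + finding_info + communication + productive"
                   " + experiential + interactive + assessment")
model3 = model2 + " + lms_engagement"

for name, formula in [("Model 1", model1), ("Model 2", model2), ("Model 3", model3)]:
    fit = smf.ols(formula, data=df).fit()
    print(name, f"adj. R-squared = {fit.rsquared_adj:.2f}")
    print(fit.params.round(3))
```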
  11. Table 5. Regression model of learning performance predicted by institutional, satisfaction and learning design analytics

      Predictor                    Model 1    Model 2    Model 3
      Level0                       -.142      -.147       .005
      Level1                       -.227      -.236       .017
      Level2                       -.134      -.170      -.004
      Level3                        .059      -.059       .215
      Year of implementation       -.191**    -.152*     -.151*
      Faculty 1                     .355**     .374**     .360**
      Faculty 2                    -.033      -.032      -.189*
      Faculty 3                     .095       .113       .069
      Faculty other                 .129       .156       .034
      Size of module               -.298**    -.285**    -.239**
      Learner satisfaction (SEaM)              -.082      -.058
      LMS engagement                           -.070      -.190*
      Finding information                                 -.154
      Communication                                        .500**
      Productive                                           .133
      Experiential                                         .008
      Interactive                                         -.049
      Assessment                                           .063
      R-sq (adj.)                   30%        30%        36%

      n = 150 (Models 1-2), 140 (Model 3); * p < .05, ** p < .01

      • Size of module and discipline predict completion
      • Satisfaction is unrelated to pass rates
      • Learning design (communication) predicts completion
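The headline finding that satisfaction is unrelated to pass rates can be checked with a simple correlation across modules. A minimal sketch, again assuming the hypothetical module-level extract used above:

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("modules.csv")  # hypothetical module-level extract
r, p = stats.pearsonr(df["satisfaction"], df["pass_rate"])
print(f"satisfaction vs pass rate: r = {r:.3f}, p = {p:.3f}")
```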
  12. [Path model across 150+ modules: communication, constructivist, assessment, productive and socio-constructivist learning design predicting weekly VLE engagement (Week 1, Week 2, ... Week 30+), student satisfaction and student retention.]
      Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.
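The slide's path model (design -> VLE engagement -> satisfaction/retention) can be roughly approximated, under the same hypothetical data assumptions as the earlier sketches, by a pair of regressions rather than a full path analysis. This is a plain stand-in, not the paper's actual method:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("modules.csv")  # hypothetical module-level extract

design = ("communication + constructivist + assessment"
          " + productive + socio_constructivist")

# Stage 1: learning design types predicting average VLE engagement.
engagement = smf.ols(f"vle_engagement ~ {design}", data=df).fit()

# Stage 2: design plus engagement predicting satisfaction and retention.
satisfaction = smf.ols(f"satisfaction ~ vle_engagement + {design}", data=df).fit()
retention = smf.ols(f"retention ~ vle_engagement + {design}", data=df).fit()

for name, fit in [("engagement", engagement),
                  ("satisfaction", satisfaction),
                  ("retention", retention)]:
    print(name, f"adj. R-squared = {fit.rsquared_adj:.2f}")
```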
  13. Conclusions (Part I)
      1. Student satisfaction is important for enhancing teaching and learning practice, but has limited relation to learning outcomes.
      2. Learning design strongly influences student engagement, satisfaction and performance.
  14. Conclusions (Part II)
      1. How can we improve our understanding of students?
         1. Talk to them (e.g., OU Live, discussion forum)
         2. Ask for frequent feedback (e.g., online post box, discussion forum)
      2. How should we interpret student evaluation findings?
         1. Use them as a developmental tool for your own teaching and learning
         2. Ask what other teachers have learned
  15. Critical discussion of Student Evaluation scores and academic performance at the OU @DrBartRienties
