The value of students developing the capacity to make accurate judgements about the quality of their own work and that of others has been widely studied and recognised in the higher education literature. To date, much of the research and commentary on evaluative judgement has been theoretical in nature, focusing on perceived benefits and proposing strategies seen to hold the potential to foster evaluative judgement; the efficacy of these strategies remains largely untested. The rise of educational tools and technologies that generate data on learning activities at an unprecedented scale, alongside insights from the learning analytics and educational data mining communities, provides new opportunities for fostering and supporting empirical research on evaluative judgement. Accordingly, this paper offers a conceptual framework, and an instantiation of the framework in the form of an educational tool called RiPPLE, for data-driven approaches to investigating the enhancement of evaluative judgement. Two case studies demonstrating how RiPPLE can foster and support empirical research on evaluative judgement are presented.
Development of educational tools that enable large-scale ethical empirical research on evaluative judgement
1. Development of educational tools that enable large-scale ethical empirical research on evaluative judgement
Dr Hassan Khosravi
The University of Queensland
Brisbane, QLD, Australia
h.khosravi@uq.edu.au
George Gyamfi
The University of Queensland
Brisbane, QLD, Australia
g.gyamfi@uq.net.au
Dr Barbara Hanna
The University of Queensland
Brisbane, QLD, Australia
b.hanna@uq.edu.au
Dr Jason Lodge
The University of Queensland
Brisbane, QLD, Australia
jason.lodge@uq.edu.au
2. Overview
• Evaluative Judgement
• Educational Tools and Educational Research
• Conceptual Model & a Referenced Implementation
• Case Studies and Data-Driven Reflections
3. Overview
Best short paper nomination
Invited submission to the Journal of Learning Analytics (under review)
4. What is Evaluative Judgement?
Evaluative judgement (EJ) is the capability to make decisions about the quality of one’s own work and that of others.
• Recognised as the skill that empowers students to:
1. Use feedback effectively
2. Develop expertise in their field
3. Develop the autonomy to think critically
4. Become reflexive and lifelong learners with knowledge of their evaluative potential
5. Strategies to Foster Evaluative Judgement
Rubrics · Exemplars · Self-assessment · Peer-assessment · Feedback · Reflection
• Current research on EJ’s potential has been largely theoretical.
• Empirical work to verify the effect size of the proposed strategies is emerging.
6. Overview
• Evaluative Judgement
• Educational Tools and Educational Research
• Conceptual Model & a Referenced Implementation
• Case Studies and Data-Driven Reflections
7. Facilitating Evaluative Judgement via Educational Technologies
Peer grading and evaluation systems
• Most educational technologies are built without the aim of supporting research.
• They do not enable data harvesting or conducting controlled experiments.
8. Two Success Stories
PeerWise (Associate Professor Paul Denny)
• Data from PeerWise has been used by 80 publications.
• Research supported: the impact of gamification and the ability of students to develop high-quality learning resources.
ASSISTments (Professor Neil Heffernan)
• Data from ASSISTments has enabled 27 publications.
• Research supported: adaptive learning and the personalisation of feedback.
9. Aim
Present a conceptual framework and an instantiation of the framework in the form of an educational tool called RiPPLE for data-driven approaches to investigate the enhancement of evaluative judgement.
10. Overview
• Evaluative Judgement
• Educational Tools and Educational Research
• Conceptual Model & a Referenced Implementation
• Case Studies and Data-Driven Reflections
11. The Conceptual Model
• The proposed framework consists of seven main considerations for the implementation of educational tools that promote both the development of evaluative judgement and research into it.
12. A Referenced Implementation
• An overview of the referenced implementation, a tool called RiPPLE, based on the proposed conceptual model.
13. The RiPPLE Platform
Content creation · Content evaluation · Adaptive practice · Peer study recommendations · Clicker-based in-class activities (see ripplelearning.org)
• 50 course offerings
• Over 12,000 users
• Over 15,000 resources created
• Over 1,000,000 resources viewed
14. Content Creation
Multiple-choice and multiple-answer questions · Worked examples · Notes
15. Can Students Create High-Quality Resources?
• “60% of all explanations classified as being of high or outstanding quality. Overall, 75% of questions met combined quality criteria.”
• “People with greater expertise tend to make assumptions about student learning that turn out to be in conflict with students’ actual performance and developmental propensities.”
• “Crowdsourcing can efficiently yield high-quality assessment items that meet rigorous judgmental and statistical criteria.”
• Overall, 86% of the examples classified in this multiple-cohort investigation were found to be a ‘High quality question’ by being coherent, correct, requiring more than just simple factual recall, and possessing a valid solution and reasonable distractors.
16. Self and Peer Assessment
17. Assessment Feedback
18. Delivering an Adaptive Learning Experience
19. Overview
• Evaluative Judgement
• Educational Tools and Educational Research
• Conceptual Model & a Referenced Implementation
• Case Studies and Data-Driven Reflections
20. Case Study – Observational Study
• Aim: provide an example of how RiPPLE may be used to conduct empirical research on evaluative judgement.
• Research questions:
– RQ1. How do students’ subjective evaluations of learning resources compare with those of domain experts?
– RQ2. What is the impact of practice over time on students’ ability to judge the quality of learning resources?
• Location: The University of Queensland (UQ)
• Course: Introduction to Information Systems
• Number of students: 512
• Number of experts: 6
• Number of resources: 2,355
• Number of ratings: 31,143
• Duration: 13 weeks
21. Results – RQ1
• RQ1. How do students’ subjective evaluations of learning resources compare with those of domain experts?
• This analysis reveals a strong, positive correlation (r = 0.832, p < 0.05) between ratings from the two groups.
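A comparison like RQ1’s can be sketched in a few lines: compute the Pearson correlation between each resource’s mean student rating and its expert rating. The data below are made up for illustration; RiPPLE’s actual rating schema is not shown here.

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-resource ratings on a 1-5 scale:
# mean of student ratings vs a single expert rating per resource.
student_means = [4.2, 3.1, 2.5, 4.8, 3.9]
expert_ratings = [4.0, 3.0, 2.0, 5.0, 4.0]
r = pearson_r(student_means, expert_ratings)
```

In practice one would also report a p-value (e.g. via `scipy.stats.pearsonr`), as the slide does.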
22. Results – RQ2
• RQ2. What is the impact of practice over time on students’ ability to judge the quality of learning resources?
• Domain experts’ ratings were used as the gold standard to compute the Root Mean Squared Error (RMSE) of student ratings.
• There was a statistically significant association between week and weekly RMSE (p = 0.0226), with a mild inverse correlation (r = -0.602), indicating a reduction in student error in quality ratings over time.
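The per-week error tracking behind RQ2 can be sketched as follows: group squared rating errors by teaching week and take the root of each group’s mean. The sample triples are invented for illustration.

```python
from collections import defaultdict
from math import sqrt

def weekly_rmse(observations):
    """observations: (week, student_rating, expert_rating) triples.
    Returns {week: RMSE of student ratings against the expert gold standard}."""
    sq_errors = defaultdict(list)
    for week, student, expert in observations:
        sq_errors[week].append((student - expert) ** 2)
    return {w: sqrt(sum(errs) / len(errs)) for w, errs in sq_errors.items()}

# Hypothetical data: student error shrinking across three weeks of practice.
sample = [(1, 2, 4), (1, 5, 4), (2, 3, 4), (2, 5, 4), (3, 4, 4), (3, 5, 4)]
rmse_by_week = weekly_rmse(sample)
```

Correlating the resulting weekly RMSE values against the week index then gives the trend the slide reports.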
23. Data-Driven Reflections – Instructors vs Students
Data
• #Courses = 6
• #Expert spot checks = 1,011
• #Learner ratings on the spot checks = 3,464
Results
• When instructors provide a rating of 3, 4 or 5, there is 92% agreement and 8% disagreement.
• When instructors provide a rating of 1 or 2, there is 19% agreement and 81% disagreement.
• Identifying low-quality resources based solely on students’ judgements may be challenging.
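An agreement analysis of this shape could be computed as below. Note that “agreement” here uses an assumed definition (instructor and learner ratings falling in the same low/high band), not necessarily the metric used in the study; the spot-check pairs are invented.

```python
def band(rating):
    """Collapse a 1-5 rating into 'low' (1-2) or 'high' (3-5)."""
    return "low" if rating <= 2 else "high"

def agreement_by_band(spot_checks):
    """spot_checks: (instructor_rating, learner_rating) pairs.
    Returns the share of learner ratings agreeing with the instructor,
    split by the instructor's band."""
    tallies = {"low": [0, 0], "high": [0, 0]}  # band -> [agreements, total]
    for instructor, learner in spot_checks:
        b = band(instructor)
        tallies[b][1] += 1
        tallies[b][0] += int(band(learner) == b)
    return {b: agree / total for b, (agree, total) in tallies.items() if total}

# Hypothetical spot checks: learners agree readily on high ratings,
# far less on low ones - the pattern reported on the slide.
pairs = [(5, 4), (4, 5), (3, 3), (1, 4), (2, 5), (1, 1)]
rates = agreement_by_band(pairs)
```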
24. Case Study – Controlled Experiment
• On the effects of rubrics on evaluative judgement: a randomised controlled experiment measuring time on task, confidence and outcome.
• Control group vs experiment group.
• Presented in the following talk.
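Randomised controlled experiments like these begin with random assignment of students to conditions. A minimal sketch under stated assumptions (the condition names and balancing scheme are illustrative, not RiPPLE’s actual allocation code):

```python
import random

def assign_conditions(student_ids, conditions=("control", "rubric"), seed=42):
    """Shuffle students with a fixed seed for reproducibility, then deal
    them round-robin across conditions so group sizes stay balanced."""
    ids = list(student_ids)
    random.Random(seed).shuffle(ids)
    return {sid: conditions[i % len(conditions)] for i, sid in enumerate(ids)}

groups = assign_conditions(range(10))
```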
25. Experiments Under Investigation
• On the effects of self-assessment on evaluative judgement: a randomised controlled experiment measuring time on task, confidence and outcome.
• Control group vs experiment group.
26. Experiments Under Investigation
• On the effects of feedback on evaluative judgement: a randomised controlled experiment measuring time on task, confidence and outcome.
• Control group vs experiment group.
27. Data-Driven Reflections – Rubric Criteria
• Criteria: Alignment, Correctness, Clarity
• Criteria: Alignment, Correctness + Clarity, Difficulty, Critical thinking
28. Data-Driven Reflections – Rating vs Confidence
Data
• #Courses = 19
• #Learners = 2,410
• #Experts = 28
• #Resources = 14,559
• #Learner moderations = 64,044
Results
• Students are most likely to provide a high rating with a high level of confidence.
• The Dunning-Kruger effect, in which low-performing students tend to overestimate their abilities, may be one of the factors contributing to the large number of students who self-assess their confidence as high.
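The rating-versus-confidence pattern above is essentially a contingency table over learner moderations. A small sketch, with invented moderation pairs (the 1-5 scales are assumed):

```python
from collections import Counter

def rating_confidence_table(moderations):
    """moderations: (rating, confidence) pairs, each assumed on a 1-5 scale.
    Returns a Counter keyed by (rating, confidence) - a contingency table
    for spotting patterns such as the cluster of high ratings given with
    high confidence."""
    return Counter(moderations)

# Hypothetical learner moderations, skewed towards confident high ratings.
sample = [(5, 5), (5, 5), (4, 5), (2, 5), (5, 4), (1, 2)]
table = rating_confidence_table(sample)
```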
29. Summary
• Evaluative Judgement
• Educational Tools and Educational Research
• Conceptual Model & a Referenced Implementation
• Case Studies and Data-Driven Reflections
30. Development of educational tools that enable large-scale ethical empirical research on evaluative judgement
Dr Hassan Khosravi
h.khosravi@uq.edu.au
George Gyamfi
g.gyamfi@uq.net.au
Dr Barbara Hanna
b.hanna@uq.edu.au
Dr Jason Lodge
jason.lodge@uq.edu.au