VII Jornadas eMadrid "Education in exponential times". Mikel Villamañe: "RaMon, a Rating Monitoring System for Educational Environments". 05/07/2017.

  1. RaMon, a Rating Monitoring System for Educational Environments
     Mikel Villamañe, Mikel Larrañaga, Ainhoa Álvarez, University of the Basque Country UPV/EHU
     LASI 2017 – 5th July 2017
  2. Agenda
     • Introduction
     • Rater effects
     • Our proposal: RaMon
     • Conclusions & future work
  3. Introduction
     • Evaluation should be equal for all students
       – Objective
       – Impartial
     • But it is influenced by the rater's:
       – Thinking processes
       – Knowledge level
       – Personal preferences
     RATER EFFECTS
  4. Introduction
     • Rater effects can remarkably affect the evaluation when multiple raters are involved
       – Final Year Project evaluation boards
       – Doctoral Thesis evaluation boards
       – Scientific paper reviewers
       – Peer-review scenarios
     • Rater effects should be detected in order to:
       – Take remediation actions
       – Help form an evaluation board
  5. Rater effects
     • Leniency / Severity
       – Tendency to give significantly higher or lower scores
     • Differential Leniency / Severity
       – Biasing the scores of a particular group
     • Central tendency
       – Tendency to give only central scores
     • Randomness
       – Giving scores inconsistently with other raters
     • Halo / Horn
       – Rating the student based on previous knowledge
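To make the first of these effects concrete, here is a minimal sketch (not part of RaMon; the toy score matrix and the interpretation notes are illustrative) showing how leniency/severity and central tendency surface as simple row statistics of a raters × projects score matrix:

```python
import pandas as pd

# Toy raters x projects score matrix (0-10 scale); one row per rater.
scores = pd.DataFrame(
    [[9, 8, 9, 9],   # consistently high scores: possible leniency
     [5, 5, 6, 5],   # scores clustered in the middle: possible central tendency
     [3, 9, 2, 8]],  # large swings: worth checking against the other raters
    index=["Rater A", "Rater B", "Rater C"],
    columns=["P1", "P2", "P3", "P4"],
)

# Leniency / severity: offset of each rater's mean from the overall mean.
mean_offset = scores.mean(axis=1) - scores.values.mean()

# Central tendency: unusually small spread around the rater's own mean.
score_spread = scores.std(axis=1)

print(pd.DataFrame({"mean_offset": mean_offset, "score_spread": score_spread}))
```

A large positive mean offset suggests leniency, a large negative one severity, and an unusually small spread hints at central tendency.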
  6. Rater effects
     • Traditional approach
       – Mean score
       – Discriminability
     • Lenient rater or good works? More information is needed
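A hedged sketch of how these two traditional indicators could be computed per rater. The slides do not give an exact formula for discriminability, so it is approximated here (an assumption) as the correlation between a rater's scores and the leave-one-out consensus of the remaining raters:

```python
import pandas as pd

def traditional_indicators(scores: pd.DataFrame) -> pd.DataFrame:
    """scores: raters x projects matrix of numeric grades (one row per rater)."""
    rows = []
    for rater in scores.index:
        own = scores.loc[rater]
        # Leave-one-out consensus: mean score of all the other raters per project.
        consensus = scores.drop(index=rater).mean(axis=0)
        rows.append({
            "rater": rater,
            "mean_score": own.mean(),
            # Approximation of discriminability: how well the rater's scores
            # follow the consensus ordering of the projects.
            "discriminability": own.corr(consensus),
        })
    return pd.DataFrame(rows).set_index("rater")
```

A high mean score on its own cannot distinguish a lenient rater from genuinely good works, and a low correlation can mean either poor discriminability or near-random scoring, which is exactly why the slide notes that more information is needed.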
  7. Our proposal
     • RaMon: Rating Monitoring system
       – Using visual learning analytics to detect and measure rater effects
       – It also allows detecting controversial evaluations
       – Tested in a real scenario
         • Final Year Project course
         • More than 100 projects
         • 15 raters
  8. Our proposal
     • Leniency / Severity
     • Incorporating information into the plot
       – Highlighting raters with few evaluations
  9. Our proposal
     • Leniency / Severity
     • Incorporating information into the plot
       – High agreement score: Rater 4 is not lenient
       – Problematic raters?
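The kind of enriched plot described on slides 8 and 9 could be sketched with matplotlib as follows. This only illustrates the idea, not RaMon's implementation, and the column names mean_score, agreement and n_evaluations are assumptions:

```python
import matplotlib.pyplot as plt

def leniency_overview(indicators):
    """indicators: DataFrame with columns mean_score, agreement, n_evaluations
    (one row per rater; the column names are illustrative)."""
    fig, ax = plt.subplots()
    ax.scatter(
        indicators["mean_score"],
        indicators["agreement"],
        s=indicators["n_evaluations"] * 20,  # point size encodes number of evaluations
        alpha=0.6,
    )
    for rater, row in indicators.iterrows():
        ax.annotate(str(rater), (row["mean_score"], row["agreement"]))
    ax.set_xlabel("Mean score given")
    ax.set_ylabel("Agreement with the other raters")
    ax.set_title("Leniency / severity overview")
    return fig
```

Reading such a plot: a rater with a high mean score but also high agreement (like Rater 4 on slide 9) is probably grading good works rather than being lenient, while a high mean with low agreement or few evaluations deserves a closer look.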
  10. Our proposal
     • Leniency / Severity
     • Comparing different evaluable elements (Final Report vs. Oral Defense)
       – Raters 9, 10 and 15 are lenient in the final report but severe in the oral defense
  11. Our proposal
     • Differential Leniency / Severity
     • Comparing evaluations by role
       – Rater 9 tends to give higher scores to the students under his/her supervision
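One way to surface this differential effect is to split each rater's scores by role. The sketch below assumes a long-format table with columns rater, score and is_supervisor, which is an illustrative schema rather than RaMon's data model:

```python
import pandas as pd

def differential_leniency(evaluations: pd.DataFrame) -> pd.DataFrame:
    """evaluations: one row per (rater, project) with columns
    'rater', 'score' and a boolean 'is_supervisor' (illustrative schema)."""
    by_role = (
        evaluations
        .groupby(["rater", "is_supervisor"])["score"].mean()
        .unstack("is_supervisor")
        .rename(columns={True: "own_students", False: "other_students"})
    )
    # Positive gap: the rater scores the students under his/her supervision higher.
    by_role["gap"] = by_role["own_students"] - by_role["other_students"]
    return by_role.sort_values("gap", ascending=False)
```

A clearly positive gap, as reported for Rater 9, indicates higher scores for the rater's own students.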
  12. Our proposal
     • Differential Leniency / Severity
     • Comparing evaluations by role for a particular evaluable element (Final Report)
       – Raters 3 and 9 are severe with final reports by students not under their supervision
  13. Our proposal
     • Controversial evaluations
     • Analyzing rater agreement
       – Very low agreement: Project 67 should be revised to ensure a fair evaluation
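Controversial evaluations such as Project 67 can be flagged by looking at the per-project score spread across raters. The threshold and column names below are illustrative assumptions, not RaMon's actual criterion:

```python
import pandas as pd

def controversial_projects(evaluations: pd.DataFrame, max_spread: float = 2.0) -> pd.DataFrame:
    """evaluations: one row per (rater, project) with columns 'project' and 'score'.
    Flags projects whose score range across raters exceeds max_spread
    (the threshold is purely illustrative)."""
    per_project = evaluations.groupby("project")["score"].agg(["min", "max", "count"])
    per_project["spread"] = per_project["max"] - per_project["min"]
    return (
        per_project[per_project["spread"] > max_spread]
        .sort_values("spread", ascending=False)
    )
```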
  14. Our proposal
     • RaMon provides enriched visualizations when evaluation rubrics are used
  15. Our proposal
     • Rubric-based view of the Oral Defense (rubric dimensions × performance levels, per rater)
       – Rater 8 uses higher levels when evaluating his/her students
  16. Our proposal
     • Detailed heatmap (Project 67's Final Report: rubric dimensions × performance levels)
       – Rater 7 is being lenient
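A rubric-level heatmap like the ones on slides 15 and 16 could be sketched as follows. The function and the input layout (a raters × rubric-dimension table holding the selected performance level) are assumptions for illustration, not RaMon's code:

```python
import matplotlib.pyplot as plt
import pandas as pd

def level_usage_heatmap(rubric_levels: pd.DataFrame, levels=("A", "B", "C", "D", "E")):
    """rubric_levels: raters x rubric-dimension table holding the performance
    level each rater selected (layout and level names are illustrative)."""
    counts = (
        rubric_levels.stack()          # (rater, dimension) -> selected level
        .groupby(level=0)              # regroup by rater
        .value_counts()
        .unstack(fill_value=0)
        .reindex(columns=list(levels), fill_value=0)
    )
    fig, ax = plt.subplots()
    im = ax.imshow(counts.values, cmap="YlOrRd")
    ax.set_xticks(range(len(levels)))
    ax.set_xticklabels(levels)
    ax.set_yticks(range(len(counts.index)))
    ax.set_yticklabels(counts.index)
    ax.set_xlabel("Performance level")
    ax.set_ylabel("Rater")
    fig.colorbar(im, ax=ax, label="Times the level was used")
    return fig
```

Rows dominated by the highest levels (as noted for Rater 7 on slide 16) stand out visually as potential leniency.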
  17. Conclusions & future work
     • Conclusions
       – RaMon helps detect rater effects and controversial evaluations
       – It has been applied in a real scenario
       – It can be used to:
         • Form evaluation boards in order to get fair evaluations
         • Reflect on the evaluation process and take remediation actions
  18. Conclusions & future work
     • Future work
       – Apply RaMon in more courses
       – Apply RaMon in peer-review scenarios
       – Add new visualization options
  19. RaMon, a Rating Monitoring System for Educational Environments
     THANK YOU
     Contact: mikel.v@ehu.eus
     http://galan.ehu.eus
