
Some experiences from evaluating and stress testing digital examination systems

A talk given at Inspera 2018 in Oslo about digital examination from a university teacher's perspective.

Published in: Education

  1. Some experiences from evaluating and stress-testing digital examination systems. Per Olof Arnäs, Chalmers University of Technology
  2. I do not like exams. Really.
  3. Why do we give exams?
  4. Why do we give exams? To teach? To pass verdict?
  5. Why do we give exams? To pass verdict? Yes, in a sense. We are obligated to evaluate knowledge.
  6. Why do we give exams? To teach? Yes! We are also obligated to transfer knowledge.
  7. Me: Per Olof Arnäs. I love the 21st century. Senior lecturer in logistics at Chalmers University of Technology. Co-produced two MOOCs in logistics. Research on the digitalization of the transportation industry.
  8. My beliefs: Society does not want test-takers. Understanding is much more important than knowledge of facts. Written exams are often a bad way to ensure knowledge and understanding. My job is to make my students understand the subject better and faster than I did as a student myself. Digital development is key to human evolution.
  9. Active learning / Blended learning
     • Students are involved in more than listening
     • Less emphasis is placed on transmitting information and more on developing students' skills
     • Students are involved in higher-order thinking (analysis, synthesis, evaluation)
     • Students are engaged in activities (e.g. reading, discussing, writing)
     • Greater emphasis is placed on students' exploration of their own attitudes and values (Bonwell and Eison 1991)
     Blended learning combines face-to-face learning with asynchronous content (on the internet) and has a large transformative potential (Garrison and Kanuka 2004).
     Bonwell, C. C. and J. A. Eison (1991). Active Learning: Creating Excitement in the Classroom. 1991 ASHE-ERIC Higher Education Reports, ERIC.
     Garrison, D. R. and H. Kanuka (2004). "Blended learning: Uncovering its transformative potential in higher education." The Internet and Higher Education 7(2): 95-105.
  10. Why do we give exams? To teach? This means that the exam should be part of the learning process of the student!
  11. The short version: It's a definite upgrade from analog
  12. The ultra-short version: The Devil is in the details
  13. The cost of things
  14. Economy-of-scale
  15. Learning curve for the teacher
  16. Fixed costs are going down
  17. Administration cost is lowered (a lot)
  18. Economy-of-scale incl. administration
  19. Evaluated systems: Ping Pong, DigiExam, Inspera
  20. Three processes: Student, Teacher, Admin
  21. Three phases: Before, During, After
  22. Students before exam: Preparations. Download software. Do a test exam. Hardware OK? Borrow a computer?
  23. Students before exam: + Ability to test the system. - Learning curve; some planning needed.
  24. Students during exam: User interface. Onboarding. Internet access. Login issues. Math. Images.
  25. Students during exam: + Faster than analog; no handwriting; copy/paste; higher quality of answers. - Some prefer analog; connectivity issues; compatibility issues; math; images.
  26. Students after exam: Feedback. Learning! Results accessible.
  27. Students after exam: + Rich feedback; swift feedback; learning opportunity. - ?
  28. Teachers before exam: Create exam. Questions. Feedback texts. Solutions. Prepare students. Hardware? Software? Question types.
  29. Teachers before exam: + Re-use questions; multiple teachers. - Must prepare students; some question types still missing; must enter the solution when entering the question.
  30. Teachers during exam: Stand by. Answer questions. Assist in onboarding.
  31. Teachers during exam: + Message to all students via the system. - ?
  32. Teachers after exam: Grading. Multiple graders. Rich feedback. Pre-written feedback.
  33. Teachers after exam: + Multiple graders; automatic grading; fast grading of essays; rich feedback! - Need for a mouse; want to add the same feedback texts to all students; PDF comments (sorry, but this sucks!).
  34. Admin before exam: Room(s). People. Extra hardware. Outlets. Wifi. Training. Support. Management.
  35. Admin before exam: + Centralised management. - Special rooms; trained personnel; computer lending.
  36. Admin during exam: Security. Support. Wifi security. Visible screens. Onboarding. Lending of computers. Offline support.
  37. Admin during exam: + Digital overview; secure. - New type of security needed; connectivity issues; power failures.
  38. Admin after exam: Register results. LMS integration. Archive?
  39. Admin after exam: + A fraction of the workload compared to analog. - Manual transfer of results to the LMS; infinite archive.
  40. Some ideas. Creating: new, non-additive question types; solution entered when creating; low-friction UI. Grading: library of feedback texts; show total score; "smart" support.
  41. Question type: "Correct path". Check all correct alternatives (none, one, or more). 3 points if all are correct, -1 point per error (min 0 points). A. Alternative 1, B. Alternative 2, C. Alternative 3, D. Alternative 4, E. Alternative 5.
     • The student needs to find the "correct path" and gets penalties for each error down to 0
     • Tests many things at once
     • Should be automatically graded (not possible today)
     • Differentiates students
     • Hard to guess
  42. Question type: "Correct path", true/false variant. Same scoring rule, but each of statements A-E is marked TRUE or FALSE.
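The scoring rule on slides 41-42 (full points for an exact match, one point deducted per error, floored at zero) can be sketched in a few lines. This is a minimal illustration of my reading of the rule; the function name and representation of answers as boolean lists are my own assumptions, not part of any of the evaluated systems:

```python
def correct_path_score(selected, correct, max_points=3):
    """Score a 'correct path' question: full points if the student's
    selections match the answer key exactly, minus one point per error
    (a wrongly checked or wrongly unchecked alternative), floored at 0.
    Hypothetical helper -- not an API of any evaluated system."""
    errors = sum(1 for s, c in zip(selected, correct) if s != c)
    return max(max_points - errors, 0)

# Five true/false statements, one mistake on statement E -> 2 points
print(correct_path_score([True, False, True, False, True],
                         [True, False, True, False, False]))  # 2
```

Because every alternative can independently be right or wrong, guessing is penalised quickly: five random answers expect several errors, which the floor at zero turns into a zero score.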
  43. Question type: Mean Squared Error. A statistical risk function where an estimator (= the student) is tested. For each question, the error (between 0 and 1) is squared; the mean of all the squared errors forms the Mean Squared Error (MSE). [Chart: Error vs Error^2, both axes 0 to 1: risky behaviour does not pay.]
  44. Usage of results. Their results: errors per question. My results: per-question MSE and rank.

                  Question 1   Question 2   Question 3   MSE
     Student 1    0            0            0.6          0.12
     Student 2    1            1            1            1
     Student 3    0            0.8          0.4          0.27
     MSE          0.33         0.55         0.51
     Rank         3            1            2
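The MSE computation behind the table above is straightforward: square each per-question error and average. A minimal sketch (function name and data layout are mine) that reproduces the per-student column of the table:

```python
def mse(errors):
    """Mean squared error over a list of per-question errors in [0, 1].
    Hypothetical helper illustrating the MSE question type."""
    return sum(e * e for e in errors) / len(errors)

# Per-student errors from the table on slide 44 (assumed layout)
students = {
    "Student 1": [0, 0, 0.6],
    "Student 2": [1, 1, 1],
    "Student 3": [0, 0.8, 0.4],
}
for name, errs in students.items():
    print(name, round(mse(errs), 2))
# Student 1 0.12 / Student 2 1.0 / Student 3 0.27
```

Averaging down the columns instead of across the rows gives the per-question MSE (0.33, 0.55, 0.51), which is what ranks the questions by difficulty.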
  45. Feedback films: Second loop. Statistics. Answers.
  46. Mean Squared Error, goals: Double loop. Ensure understanding. Low workload for me. Scalable. Honesty. Reflection. Think, don't guess. Quick feedback. Should be digital!
  47. Grading, some ideas: Word cloud from essays. "Flagged" words. Highlight in answers. Canned feedback. "Learn" during grading. Verbal/video commenting. Second opinion. Send a question to a colleague.
  48. Grading, some ideas: See statistics per question. Diagrams. See totals for anonymous students.
  49. Insights: digital exams… + Save money; free up time; are scalable; can be used to increase learning. - Need physical infrastructure; need new competences; difficult to go back…
  50. Why do we give exams? To teach! This means that the exam must be part of the learning process of the student!
  51. Some experiences from evaluating and stress-testing digital examination systems. Per Olof Arnäs, Chalmers University of Technology. per-olof.arnas@chalmers.se. Thank you!
