
The Future of Online Testing with MOOCs: An Exploratory Analysis of Current Practice

  1. The Future of Online Testing with MOOCs: An Exploratory Analysis of Current Practice
     Eamon Costello (National Institute for Digital Learning, Dublin City University)
     Jane Holland (Royal College of Surgeons in Ireland)
     Mark Brown (National Institute for Digital Learning, Dublin City University)
  2. Background
     • NIDL - DCU and funded MOOC research
       – MOOCs and building regional capacities: SCORE2020 Project
       – HOME Project
     • MOOCs and the Media
     • Institutional Drivers
  3. Possible MOOC Futures
  4. Possible MOOC Futures
  5. Possible MOOC Futures
  6. Brightest MOOC Futures
     “Sooner or later, online tests will be as reliable or even more reliable than on campus exams”
     Dillenbourg, P. (2015). Proposal for a Digital Education Strategy for Flanders Universities. “Thinkers in Residence” Programme, KVAB (Koninklijke Vlaamse Academie van België voor Wetenschappen en Kunsten). Available from: http://www.kvab.be/denkersprogramma/files/DP_BlendedLearning_No-time-to-lose.pdf
  7. Current State of Play
     • The future is taken care of
     • But what about the present?
       – How mature are (x)MOOCs?
       – How reliable and valid are MCQ-type tests in MOOCs?
  8. Multiple Choice Question (MCQ) Tests (Single Best Answer)
     – Reliability: if we repeated this, would we get the same result?
     – Validity: are we measuring what we think we are?
     Don’t get “fooled by randomness”
     Taleb, N. (2004). Fooled by randomness: The hidden role of chance in life and in the markets. Random House.
  9. Best Practice
     Case, S. M., & Swanson, D. B. (2003). Constructing written test questions for the basic and clinical sciences (3rd ed.). Philadelphia, PA: National Board of Medical Examiners.
  10. Best Practice
      • Ambiguous or unclear information
      • Negatively worded stem (not, incorrect, except)
      • Implausible distracters
      • Gratuitous information in stem
      • More than one or no correct answer
      • Longest option is correct
      • Logical cues in stem
      • Word repeats in stem and correct answer
      • Unfocused stem
      • True/false question
      • Use of “all of the above”
      • Vague terms (sometimes, frequently)
      • Absolute terms (never, always)
      • Use of “none of the above”
      • Fill-in-the-blank
      • Complex or K-type
      • Grammatical cues in sentence completion
      • Convergence cues
      Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education Today, 26(8), 662-671.
  11. Best Practice
      • Ambiguous or unclear information
      • Negatively worded stem (not, incorrect, except)
      • Implausible distracters
      • Gratuitous information in stem
      • More than one or no correct answer
      • Longest option is correct
      • Logical cues in stem
      • Word repeats in stem and correct answer
      • Unfocused stem
      • True/false question
      • Use of “all of the above”
      • Vague terms (sometimes, frequently)
      • Absolute terms (never, always)
      • Use of “none of the above”
      • Fill-in-the-blank
      • Complex or K-type
      • Grammatical cues in sentence completion
      • Convergence cues
      • Position of correct option
      Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse Education Today, 26(8), 662-671.
  12. Methodology
      • Use the Tarrant et al. (2006) diagnostic tool to systematically analyse MCQs in MOOCs (see the sketch below)
      • Look for item writing flaws
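The deck does not show how items were coded, but a few of the flaws on the Tarrant et al. checklist are mechanical enough to be flagged in code. The following is a minimal, hypothetical sketch rather than the authors' instrument: the MCQ structure and field names are assumptions, and most flaws (ambiguous wording, implausible distracters, and so on) still require human judgement.

```python
# Hypothetical sketch only: the MCQ structure and field names are assumed,
# not taken from the study. Most checklist flaws still need a human coder.
from dataclasses import dataclass


@dataclass
class MCQ:
    stem: str
    options: list   # option texts, in the order presented
    correct: list   # 0-based indices of the keyed correct option(s)


def flag_mechanical_flaws(item: MCQ) -> list:
    """Flag the handful of item writing flaws that can be detected mechanically."""
    flaws = []
    lowered = [opt.strip().lower() for opt in item.options]
    if any("all of the above" in opt for opt in lowered):
        flaws.append("use of all of the above")
    if any("none of the above" in opt for opt in lowered):
        flaws.append("use of none of the above")
    if len(item.correct) != 1:
        flaws.append("more than one or no correct answer")
    longest = max(range(len(item.options)), key=lambda i: len(item.options[i]))
    if item.correct == [longest]:
        flaws.append("longest option is correct")
    return flaws


# Illustrative usage with the kernel item quoted later in the deck.
# The keyed answer (option B) is assumed here purely for illustration.
example = MCQ(
    stem="The kernel is defined as:",
    options=[
        "The graphical user interface on top of the operating system",
        "The glue between hardware and software applications",
        "the software libraries need to run the system",
        "all of the above",
    ],
    correct=[1],
)
print(flag_mechanical_flaws(example))   # ['use of all of the above']
```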
  13. The Data
      • 12 courses
        – from six MOOC platforms
        – from 12 universities/providers
      • Total MCQs: 115
  14. None/All of the Above
      • None of the above: 1 (0.87%)
      • All of the above: 2 (1.74%)
      Example: The kernel is defined as:
      A. The graphical user interface on top of the operating system
      B. The glue between hardware and software applications
      C. the software libraries need to run the system
      D. all of the above
      Holsgrove, G., & Elzubeir, M. (1998). Imprecise terms in UK medical multiple-choice questions: what examiners think they mean. Medical Education, 32(4), 343-350.
  15. Number of Correct Options per Question
      • Greater than 1 correct: 10 (8.7%)
      • Average: 1.39
      Example: Check the words that are used synonymously in a JMeter test plan
      A. end users
      B. virtual users
      C. concurrent users
      D. threads
  16. Number of Options per Question
      • Greater than 4 options: 9 (8%)
      • Average: 3.77
      Distribution: 1 option: 2%; 2 options: 6%; 3 options: 17%; 4 options: 68%; 5 options: 6%; 7 options: 2%
  17. Correct Option is the Longest
      • In 47 of the 115 MCQs the correct option is the longest (40.87%)
  18. Position of Correct Option (105 MCQs)
      1st: 21%; 2nd: 30%; 3rd: 29%; 4th: 19%; 5th: 1%
  19. Position of Correct Option in 73 Four-Option MCQs
      1st: 15%; 2nd: 27%; 3rd: 32%; 4th: 26%
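The distributions on slides 18 and 19 can be tallied directly once the position of each keyed answer has been coded. A self-contained sketch, assuming only that each item's correct-option position is recorded as an integer (the sample list below is invented for illustration):

```python
# Sketch only: the positions list is invented. In the study, each entry would
# be the position (1 = first option) of the keyed answer for one MCQ.
from collections import Counter


def position_distribution(positions):
    """Return {position: percentage} for the keyed-answer positions."""
    counts = Counter(positions)
    total = len(positions)
    return {pos: round(100 * counts[pos] / total, 1) for pos in sorted(counts)}


sample_positions = [1, 2, 3, 3, 4, 2, 3, 1, 4, 2]   # invented data
print(position_distribution(sample_positions))
# {1: 20.0, 2: 30.0, 3: 30.0, 4: 20.0}
```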
  20. Overall
      • 17 (14.78%) of all the questions contain a defined item writing flaw, counting only four item writing flaws
      • Two more item writing flaws are apparent in characteristics that appear more often than would be expected by chance (small sample)
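The claim that some characteristics appear more often than chance would predict can be checked with standard goodness-of-fit and binomial tests. The sketch below is not the authors' analysis and rests on two assumptions: the four-option position counts are back-calculated from the percentages on slide 19 (roughly 11, 20, 23 and 19 of 73 items), and a flat 25% baseline is used as a rough null for the "longest option is correct" flaw. The slide's "(small sample)" caveat applies to both tests.

```python
# Sketch only: counts are back-calculated from the slide percentages, and the
# 25% null for "longest option is correct" is a rough approximation.
from scipy.stats import chisquare, binomtest

# Position of the keyed answer in the 73 four-option MCQs (approximate counts).
observed = [11, 20, 23, 19]              # 1st, 2nd, 3rd, 4th
position_test = chisquare(observed)      # null: each position equally likely (25%)
print(f"position test: chi2={position_test.statistic:.2f}, p={position_test.pvalue:.3f}")

# "Longest option is correct" in 47 of 115 items, tested against a rough 25% null.
longest_test = binomtest(47, 115, p=0.25, alternative="greater")
print(f"longest-option test: p={longest_test.pvalue:.4f}")
```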
  21. Further work
      Add qualitative analysis using the Tarrant et al. (2006) evaluation tool:
      • Ambiguous or unclear information
      • Negatively worded stem (not, incorrect, except)
      • Implausible distracters
      • Gratuitous information in stem
      • Logical cues in stem
      • Word repeats in stem and correct answer
      • Unfocused stem
      • Vague terms (sometimes, frequently)
      • Absolute terms (never, always)
      • Fill-in-the-blank
      • Complex or K-type
      • Grammatical cues in sentence completion
      • Convergence cues
  22. Implications
      • Validity and reliability of MOOC testing
      • Replicating unsound pedagogies
      • MOOC teachers/developers need evidence-led teaching
  23. Questions?
      Me: eamon.costello@dcu.ie
