Opening up multiple choice - assessing with confidence


This presentation introduces a new online question style, Open CBM (Certainty/Confidence-Based Marking).

This achieves an open style of question (similar to a free-text or numeric question), in which the student does not pick from a list of possible answers, yet retains the robust and easy implementation of a multiple-choice question (MCQ).

It achieves this by appropriating the technique of certainty/confidence-based marking (CBM). In CBM, a student selects both an answer and a level of confidence in it: they score full marks for knowing that they know the correct answer, some credit for a tentative correct answer, but are penalised if they believe they know the answer and get it wrong.
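As a concrete illustration, the CBM payoff used later in this deck (Gardner-Medwin's scheme: 1/0 for low confidence, 2/−2 for medium, 3/−6 for high) can be written as a small scoring function. This is an illustrative sketch, not the project's actual OpenMark implementation:

```python
# Illustrative CBM scoring using Gardner-Medwin's payoff matrix
# (low: 1/0, medium: 2/-2, high: 3/-6). Not the OpenMark code.

CBM_SCORES = {
    # confidence: (score if correct, score if wrong)
    "low":    (1,  0),
    "medium": (2, -2),
    "high":   (3, -6),
}

def cbm_score(confidence: str, correct: bool) -> int:
    """Return the CBM mark for one answer."""
    right, wrong = CBM_SCORES[confidence]
    return right if correct else wrong

print(cbm_score("high", True))    # confidently correct -> 3
print(cbm_score("high", False))   # cocksure and wrong -> -6
print(cbm_score("low", True))     # tentative and correct -> 1
```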

An Open CBM question is presented in two stages. Initially, the question is shown with no answer options visible; instead, the student must set their confidence level that they know the answer. Only then are the possible answers revealed, and the student answers as a normal MCQ. The marking scheme follows standard CBM practice. Mechanically the question remains a simple MCQ: answer matching is trivial and robust, questions are easy to implement, and existing question banks can be reused. To the student, however, the question is effectively transformed from a closed MCQ into an open question. They need to formulate an answer before they can decide their confidence in it, so they must decide their answer in the absence of any positive or negative clues, reducing the chance of acquiring misconceptions or of working backwards from the options.
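The two-stage flow just described can be sketched in code. The function name and the callback-based I/O here are illustrative inventions (the trial used OpenMark, not this code); the point is the ordering: confidence is committed before the options are revealed.

```python
# Hedged sketch of the two-stage Open CBM flow. The callbacks
# stand in for whatever UI collects the student's input.

CBM_SCORES = {"low": (1, 0), "medium": (2, -2), "high": (3, -6)}

def ask_open_cbm(stem, options, correct_index, get_confidence, get_choice):
    # Stage 1: show only the question stem; the student commits to a
    # confidence level before seeing any answer options.
    confidence = get_confidence(stem)

    # Stage 2: reveal the options and take a normal MCQ response.
    choice = get_choice(stem, options)

    # Mark using the standard CBM payoff matrix.
    right, wrong = CBM_SCORES[confidence]
    return right if choice == correct_index else wrong

# Example: a student who is highly confident and picks the right option.
score = ask_open_cbm(
    "What is 2 + 2?", ["3", "4", "5"], correct_index=1,
    get_confidence=lambda stem: "high",
    get_choice=lambda stem, opts: 1,
)
print(score)  # -> 3
```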

Published in: Education, Technology
Notes for slides:
  • This presentation is based on an eSTEeM project. I'll need to unpick the terms used in the title.
  • This is an open question – ask a question, give no clues, wait for an answer
  • This is a typical closed question – ask a question, but give a choice of answers; the student just picks one. There are subtle differences in the meanings of 'open' and 'closed'. Here I mean especially that 'open' questions don't constrain the student to answer in a particular way; conversely, 'closed' questions amount to picking the right answer and inevitably offer clues. 'Open' can also have the sense that more than one answer is possible, i.e. the sense might be different and not just the way in which it is expressed.
  • MCQs have advantages, especially for computer implementation, and so are heavily used. But they also have disadvantages: if distractors (wrong answers) are trivially wrong, they are easily discounted, leaving an easy guess at the right answer; if distractors are plausible because they show common errors, there is a danger that students might remember the wrong answer. Open questions supposedly test deeper understanding, and can find misconceptions without trapping students into errors. Numeric answers are easy to deal with on a computer. Unconstrained text is possible to deal with by computer but usually difficult to get right without testing – not feasible for summative assessment. It is even harder if answers could be diagrams or other media.
  • This is another technique – confidence or certainty based marking. Ask a question in the conventional manner, but also ask the student how certain/confident they are of the answer (low, medium, high). Students really need to know how certain their answer is, because of the marking…
  • If the student isn't certain, then they get a low mark if they take a guess and get it right, but no marks if it is wrong. They only get full marks if they are right and are confident of their answer. The worst case is that they think they know the answer but get it wrong – then they get a mark penalty. The payoff matrix is designed to ensure that students get the best overall score if they are honest about their certainty. If they are underconfident, they don't score well. If they are overconfident, they will score badly.
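The claim that honesty is optimal can be checked directly. For a probability p of answering correctly, the expected mark at each confidence level is p·(correct score) + (1 − p)·(wrong score); under the 1/0, 2/−2, 3/−6 matrix the best choice switches from low to medium at p = 2/3 and from medium to high at p = 0.8. A quick illustrative check (not code from the project):

```python
# Expected CBM score at each confidence level, given probability p
# of answering correctly, under the 1/0, 2/-2, 3/-6 payoff matrix.

CBM_SCORES = {"low": (1, 0), "medium": (2, -2), "high": (3, -6)}

def expected_score(confidence: str, p: float) -> float:
    right, wrong = CBM_SCORES[confidence]
    return p * right + (1 - p) * wrong

def best_confidence(p: float) -> str:
    """The confidence level that maximises the expected mark."""
    return max(CBM_SCORES, key=lambda c: expected_score(c, p))

for p in (0.5, 0.7, 0.9):
    print(p, best_confidence(p))
# low is best below p = 2/3, medium between 2/3 and 0.8, high above 0.8
```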
  • The claimed advantages are that the marking scheme focuses the mind: students need to be much more self-aware and reflective. Possible downsides might be that the scheme is confusing, and biased towards different character types, but Tony Gardner-Medwin claims not, as long as students are given training on how the marking works and the opportunity to practise.
  • This is my twist on CBM. The question is posed, but the student is first asked how certain they are of the answer. Only when they set their confidence level are they able to answer. Here the multiple choices are only revealed at this stage, but it could be an input box for a numeric answer, or labels to drag and drop onto a diagram, depending on the question type. Once the question is answered, feedback is given in the normal way, marked using the CBM scheme.
  • The advantages are inherited from the standard closed question, e.g. an MCQ is trivial to implement and able to use existing question banks. But it is effectively transformed into an open question, since the student has at least to formulate the answer in order to decide their certainty. No clues. No tempting misconceptions. I.e. they have to answer the question as set.
  • Open CBM inherits all the advantages of MCQ but not the disadvantages. Open CBM also has the advantages of open questioning, and encourages reflection as CBM does. It may have some disadvantages: it is not always possible to phrase a question in an open way, especially if trying to reuse existing question banks; the style of questioning is possibly intimidating, and the way people react might depend on personality (but Tony Gardner-Medwin claims no strong effect given a good introduction and practice with the style of questions).
  • The benefits accrue to students, but also to staff, since it is possible to get 'better' questions with less effort.
  • Trial with students. The Open CBM question type was implemented in OpenMark (much more flexible than the Moodle quiz). Questions were taken from an existing question bank and offered as revision quizzes at the end of the course, not as summative assessment. Identical questions could be delivered in one of two styles – Open CBM or traditional MCQ. A student got one set all in the same style, but could retake the quiz for a block and get another set of similar questions in the other style. Some issues: the question bank turned out to be a draft, not final, so it included errors; and it wasn't possible to change OpenMark reporting, so although students' confidence scores as well as marks were recorded, these couldn't be reported back to the student at the end of the quiz.
  • It is not easy to compare results across real questions because of the random selection of questions – not the best experimental design! This just shows results for the test set which students used to learn how the marking worked. Students may have been playing rather than giving serious answers, but it is indicative none the less. Note that students did pretty well (but not perfectly!) on the simple question – what is 2 + 2? They did badly on Heisenberg's uncertainty principle – but since they expected this to be difficult, their score was around zero. Interesting is the simple calculus question – clearly some students were saying they were confident and then getting it wrong, so the mean score was below zero. I think that this could be a point of learning for a student – if they find they don't know something they thought they did, that is a good reason to think about it and learn. Just getting a question wrong isn't sufficient – it is too easy for a student to think 'I got that wrong, but the question was so difficult that no one could be expected to get it right' (e.g. the Heisenberg question).
  • Please get in touch if you want to know more or would like to try this out on students!

    1. Opening up multiple choice: Assessing with confidence. Jon Rosewell. Developments in scholarship of teaching and assessment in MCT, eSTEeM workshop, The Open University, 17th January 2014
    2. Open question
    3. Multiple-choice question
    4. MCQ vs open question
       • MCQ pros: objective marking; reliable marking; easy to implement
       • MCQ cons: distractors trivial; distractors engender misconceptions; working backwards
       • Open question pros: tests deeper learning; can find misconceptions; numeric easy to mark
       • Open question cons: free text very difficult to mark reliably; figures, audio, etc. very, very difficult to mark
    5. Confidence-Based Marking (CBM) / Certainty-Based Marking
    6. CBM scoring

       Confidence   Correct   Wrong
       Low              1        0
       Medium           2       -2
       High             3       -6

       (Tentative & correct scores 1; confidently correct scores 3; cocksure – and wrong! scores -6.)

       Gardner-Medwin & Curtin (2007) Certainty-Based Marking (CBM) for Reflective Learning and Proper Knowledge Assessment
    7. CBM motivations
       • Rewards care and effort
       • Greater engagement
       • Encourages reflective learning
       • Encourages self-assessment
    8. Open CBM
    9. Open CBM advantages
       • Mechanically, the question remains a simple MCQ:
         – answer matching is trivial
         – easy to implement
         – possible to reuse existing question banks
       • But transformed from closed to open question:
         – student must formulate an answer to decide confidence
         – must decide answer without +ve or –ve clues
         – cannot work backwards
         – will not be led into misconceptions
       ⇒ must answer the question as set
    10. MCQ vs Open CBM
       • MCQ pros: objective marking; reliable marking; easy to implement
       • MCQ cons: distractors trivial; distractors engender misconceptions; working backwards
       • Open CBM pros: open question; reflection; MCQ
       • Open CBM cons: not always applicable; intimidating?; personality dependent?
    11. Possible benefits
       • Students: better engagement; improved learning; more reflective learners
       • Staff: easier question setting
    12. Trial: T216 Cisco networking
       • Open CBM implemented in OpenMark
       • Questions from existing bank
       • Offered as revision quizzes
       • One set for each of four blocks
       • 25 questions drawn randomly from bank of ~100
       • Randomly offered as either open CBM or MCQ format
       • Issues: some questions bad!; OpenMark report doesn't show confidence
    13. Demo questions – scores

        Question                             Taken by   Mean score
        What is 2 + 2?                          238        2.51     Easy
        What is derivative of x³?               223       -0.47     Tricky!
        Who painted the 'Mona Lisa'?            212        2.14
        Who is the 'Mona Lisa'?                 208        0.21
        Uncertainty principle -- whose?         207        0.03     Difficult
        Uncertainty principle -- formula?       218        0.01
    14. Jon Rosewell
    15. MCQ vs Open CBM
       • MCQ pros: objective marking; reliable marking; easy to implement
       • MCQ cons: distractors trivial; distractors engender misconceptions; working backwards
       • Open CBM pros: open question; reflection; MCQ
       • Open CBM cons: not always applicable; intimidating?; personality dependent?
    16. How (not) to do MCQ…
       • Give the name of a supervolcano: Yellowstone / Redstone / Bluestone
       • Which is a plate boundary: Conservative / Destructive / Liberal / Labour
       • What policy did the Chinese introduce to limit population: One child policy / Two child policy / The child catcher
       Reported from BBC Bitesize
    17. eSTEeM project plan
       • Implement open CBM question type – in OpenMark
       • Trial in course assessment – controlled experimental design; pilot (formative) on T216 Cisco networking
       • Log measures – assignment scores; time spent on task
       • Structured interviews to probe attitudinal aspects
       • User-lab trials
    18. Uptake

        Quiz      Finished   Unfinished
        Demo         162        178
        CCNA 1       213        262
        CCNA 2        86        130
        CCNA 3        69         51
        CCNA 4        64         67
    19. Demo questions – confidence

        Question                             High   Medium   Low
        What is 2 + 2?                        210       5     23
        What is derivative of x³?              96      46     81
        Who painted the 'Mona Lisa'?          165      21     26
        Who is the 'Mona Lisa'?                56      41    111
        Uncertainty principle -- whose?        62      31    114
        Uncertainty principle -- formula?      39      21    158