E-assessment: Bob Rotheram
Published in: Education, Technology
  • When thinking about CAA, most people immediately think of multiple-choice questions (MCQs), but there are many other types of question one could computerise. For example, [run through and explain the above, which are the types available via the web version of QMP]. There are still other types as well, e.g. [ranking, ‘hot spot’, drag and drop…]
  • You know the importance of feedback for effective learning. Computers can provide it very well, if there’s a thoughtful human being behind the machinery. Practice wisdom says CAA feedback should contain: a statement of whether the chosen option is correct; (if appropriate) the correct answer, with a sentence or so on why it is correct; a few words to clear up possible misunderstanding about the wrong choice; the source of the information on the correct answer (to help the student find out more and, importantly, to give staff a starting point if they wish to amend a question, e.g. by supplying more up-to-date data); and language which is clear and not discouraging. Going beyond the minimum, you might also include a related question for students to think about, or a web link to further information. If you add a link, choose one likely to have a reasonable ‘shelf life’ (i.e. unlikely to disappear quickly). Sometimes it may be better to link to the home page of a website (e.g. HM Prison Service) than to the precise, possibly short-lived, page containing a particular statistic. For more, see Chapter 4 of Bull & McKenna’s Blueprint for Computer-Assisted Assessment.
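  • The recommended feedback components above could be held as structured fields, so every question carries them and they can be rendered consistently. A minimal sketch in Python; the names (`OptionFeedback`, `render_feedback`) are illustrative only and not part of QMP or SPQR:

```python
from dataclasses import dataclass

@dataclass
class OptionFeedback:
    """Hypothetical container for the feedback elements recommended above."""
    is_correct: bool        # statement: was the chosen option correct?
    explanation: str        # the correct answer and why it is correct
    clears_up: str          # words to clear up the likely misunderstanding
    source: str             # source of the information (aids later updating)
    further_link: str = ""  # optional: a web link with a reasonable shelf life

def render_feedback(fb: OptionFeedback) -> str:
    """Assemble the feedback text in the recommended order, in clear,
    non-discouraging language."""
    parts = ["Correct." if fb.is_correct else "Not quite."]
    if not fb.is_correct:
        parts.append(fb.explanation)
        parts.append(fb.clears_up)
    parts.append(f"Source: {fb.source}")
    if fb.further_link:
        parts.append(f"More: {fb.further_link}")
    return " ".join(parts)
```

Keeping the source as its own field means staff can locate and refresh a question’s underlying data without rewriting the whole feedback text.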
  • If you’re interested in setting ‘objective’ questions for your students (via CAA or not), a crucial issue is the intellectual level at which they should be pitched. The general expectation is that greater intellectual demands are placed on students as they move through their programmes, e.g. from displaying knowledge and understanding at the beginning to using critical and evaluative skills by the time they obtain their degrees. Very often the underpinning educational theory is Bloom’s Taxonomy. Bloom (1956) postulated six types of educational objectives: knowledge, comprehension, application, analysis, synthesis and evaluation, and argued that the further one goes down the list, the more difficult the objectives are to attain. It therefore follows that it is unreasonable to require all Level 1 students to show competence in synthesis and evaluation; conversely, it would not be demanding enough if final-year Honours students (Level 3) were only required to show knowledge and comprehension.
  • Here are some question formats for testing knowledge.
  • And some for testing comprehension (provided the material hasn’t been taught or discussed in class, in which case they would simply be knowledge recall questions).
  • SPQR could be used in lots of ways; some are highlighted here. Some of the ideas are mine, others come from the Editorial Board, from students who have tried SPQR, and from the literature on CAA and feedback. Left column: using collections of questions. Most people’s first thought is summative assessment (to save marking time, etc.), but they may reject that for various reasons (including it being too high-stakes) and then start thinking about other uses for collections of questions. Right column: using individual questions. This idea came to me gradually at first, but was boosted by reading Littlejohn (2003) on reusing online resources. Littlejohn makes the telling point that academics are very resistant to using existing resources (e.g. books, videos) ‘as is’ (related to the ‘not invented here’ problem); they prefer to use bits, woven into their own, personal ‘lesson plan’. The same is true for electronic resources: it is much more likely that social policy teachers will use small items from elsewhere, customised and perhaps adapted. Littlejohn argues strongly that for electronic learning resources to have much chance of being reused by others, they should consist of small, ‘granular’ ‘learning objects’. SPQR items (question, answer, feedback notes) fit the bill well, and it is easy to include or exclude the feedback notes. So here are some ideas for using items individually. Any more suggestions? DISCUSSION
  • Take-up of CAA has been highly variable across disciplines: readier in maths, science, engineering and medicine than in some other areas, e.g. the social sciences. Discuss in small groups (10 minutes): use of CAA in your discipline (that you know of); potential/advantages, if any; limitations/disadvantages that you see. Work with the group to come up with two lists (get a volunteer scribe). Advantages include: saving staff time on marking; rapid feedback (especially formative feedback); consistency with teaching modes (use of computers). Objections include: risk of system failure; security; measurement issues (student familiarity, speed of working); ‘It’s Mickey Mouse’ (‘objective testing’ is allegedly good only for very low-level testing of useless information; variation: ‘Tests using QuestionMark have no educational value’); ‘It’s rote learning’ (‘there’s no point in testing empty facts’; note the pejorative ‘empty’, and the assumption that facts have no place in HE); the discipline is discursive and contested (the assumption being that this is entirely true); ‘not invented here’ (a syndrome which is quite powerful in British HE); good questions are hard to create (e.g., for multiple-choice, a good ‘key’ and plausible distractors; it is tough, too, to write good, helpful feedback); reluctance to share questions. Does anyone think there is no potential at all for CAA in their discipline?
  • A qualitative study, but here are a few numbers. Also note the wide range of subjects.
  • Simon to do this.
  • End with a joke: “To err is human, but to really screw things up requires a computer.”

    1. Assessment and feedback: technology to the rescue? Bob Rotheram, National Teaching Fellow
    2. Outline <ul><li>‘Objective testing’ </li></ul><ul><ul><li>‘Social Policy Question Resource’ (SPQR) </li></ul></ul><ul><li>Digital audio for assessment feedback </li></ul><ul><ul><li>‘Sounds Good’ </li></ul></ul><ul><li>With each: </li></ul><ul><ul><li>What to use technology for </li></ul></ul><ul><ul><li>How to do it </li></ul></ul><ul><ul><li>Limitations and obstacles </li></ul></ul>
    3. Objective testing <ul><li>“Objective tests require a user to choose or provide a response to a question whose answer is pre-determined … the marking of the responses is completely non-subjective because no judgement has to be made on the correctness or otherwise of an answer at the time of marking. … An objective test is only as objective as the test designer makes it.” (Bull & McKenna 2004) </li></ul>
    4. Some question types <ul><li>multiple-choice (MCQ) </li></ul><ul><li>multiple response </li></ul><ul><li>text match response/fill in the blank </li></ul><ul><li>true/false, yes/no </li></ul><ul><li>Likert scale </li></ul><ul><li>numeric </li></ul><ul><li>essay </li></ul>
    5. Objective testing: objections? <ul><li>Well? </li></ul>
    6. MCQ: easy?
    7. MCQ: feedback
    8. Feedback: recommendations <ul><li>statement on whether chosen option correct or not </li></ul><ul><li>(if incorrect) statement giving correct answer and why correct </li></ul><ul><li>clear up possible misunderstanding about wrong choice </li></ul><ul><li>source of info on correct answer </li></ul><ul><li>language clear and not discouraging </li></ul>
    9. Bloom’s Taxonomy (1956) <ul><li>knowledge </li></ul><ul><li>comprehension </li></ul><ul><li>application </li></ul><ul><li>analysis </li></ul><ul><li>synthesis </li></ul><ul><li>evaluation </li></ul>
    10. Knowledge questions <ul><li>What word means the same as…? </li></ul><ul><li>What is the most important difference between…? </li></ul><ul><li>Which one of the following sequences shows the correct order of…? </li></ul><ul><li>What are the major classifications of…? </li></ul><ul><li>Which method is the most useful for…? </li></ul><ul><li>What evidence best supports the theory of…? </li></ul>
    11. Comprehension questions <ul><li>Which one of the following is closest in meaning to the term…? </li></ul><ul><li>The statement ‘…’ means that … </li></ul><ul><li>[Various facts are presented.] Which of the reasons listed below best explains this? </li></ul>
    12. Objective testing: some uses <ul><li>Summative assessment </li></ul><ul><li>Diagnostic tests </li></ul><ul><li>Formative tests </li></ul><ul><li>Informal quizzes </li></ul><ul><li>Revision aid </li></ul><ul><li>Subject intro </li></ul><ul><li>Individual questions v. flexible, ‘granular’, ‘learning objects’: </li></ul><ul><ul><li>insert into lectures </li></ul></ul><ul><ul><li>gather further info via web links </li></ul></ul><ul><ul><li>preparation for classes </li></ul></ul><ul><ul><li>… </li></ul></ul>
    13. Objective testing: limitations, obstacles <ul><li>Other assessment also needed </li></ul><ul><li>Up-front effort required </li></ul><ul><li>Question-checking </li></ul><ul><li>Maintenance, shelf-life </li></ul><ul><li>Collaboration </li></ul><ul><li>Data portability </li></ul>
    14. Writing questions <ul><li>Bull & McKenna (2004), Blueprint for Computer-Assisted Assessment </li></ul><ul><li>PASS-IT (2004), Good Practice Guide in Question and Test Design </li></ul>
    15. In your discipline? <ul><li>Use? </li></ul><ul><li>Potential/advantages? </li></ul><ul><li>Limitations/disadvantages? </li></ul>
    16. Sounds Good <ul><li>‘Quicker, better assessment using audio feedback’ </li></ul><ul><li>Coursework: </li></ul><ul><ul><li>formative, summative, individual, group </li></ul></ul><ul><li>Leeds Met (2008) </li></ul><ul><li>Leeds Met, Newman, Northampton, York St John (2008-9) </li></ul><ul><li>[Subject centres: Engineering, GEES] </li></ul>
    17. Sounds Good: some numbers <ul><li>£35k + £15k (JISC) </li></ul><ul><li>38 lecturers </li></ul><ul><li>1200+ students </li></ul><ul><li>Cohort sizes: 3 to 151 </li></ul><ul><li>Levels: 1 to doctoral </li></ul><ul><li>37 presentations! </li></ul>
    18. Marian Tuck’s coursework <ul><li>Book review </li></ul><ul><ul><li>length, tips </li></ul></ul><ul><ul><li>assessment criteria </li></ul></ul><ul><li>Read it </li></ul><ul><li>Audio feedback </li></ul><ul><ul><li>listen (as Marian Tuck) </li></ul></ul><ul><ul><li>how did it feel? </li></ul></ul><ul><ul><li>other comments? </li></ul></ul>
    19. Tools, techniques <ul><li>Digital audio feedback </li></ul><ul><ul><li>MP3 recorders </li></ul></ul><ul><ul><li>‘Audacity’ software </li></ul></ul><ul><ul><li>‘WIMBA’ voice tools </li></ul></ul><ul><li>Audio files via: </li></ul><ul><ul><li>Email </li></ul></ul><ul><ul><li>VLE </li></ul></ul>
    20. Early comments <ul><li>(Student) “Very helpful. It felt like the tutor was able to expand more… Often when you read feedback, things can get misunderstood or meant in a different way. I felt this way was very clear.” </li></ul><ul><li>(Staff) “I think with practice this will get quicker as I get more used to things.” </li></ul>
    21. More staff opinions <ul><li>Staff like audio feedback: </li></ul><ul><ul><li>Quality, quantity… </li></ul></ul><ul><ul><li>“I was able to give … more detailed and pertinent feedback … [It] became almost an online tutorial.” </li></ul></ul><ul><ul><li>“An ideal medium to assist in the development of skills and confidence of students.” </li></ul></ul>
    22. Student opinions <ul><li>Students like audio feedback! </li></ul><ul><ul><li>personal, detail, careful consideration </li></ul></ul><ul><li>“Very helpful. It felt like the tutor was able to expand more… Often when you read feedback, things can get misunderstood or meant in a different way. I felt this way was very clear.” </li></ul>
    23. On the other hand… <ul><li>Some students prefer written feedback </li></ul><ul><ul><li>Should staff oblige? </li></ul></ul><ul><ul><ul><li>Effort to produce, student ‘skimming’… </li></ul></ul></ul><ul><li>Some want audio + written </li></ul><ul><ul><li>Whether/how to do it? </li></ul></ul>
    24. Audio feedback: Practice guidelines <ul><li>Handheld recorder more convenient? </li></ul><ul><ul><li>record direct to MP3; USB port </li></ul></ul><ul><li>Keep files short (<5 mins?) </li></ul><ul><li>‘Good enough’ sound quality (32kbps mono?) </li></ul><ul><li>Get approval for audio use </li></ul><ul><li>See www.soundsgood.org.uk </li></ul>
    25. Time saving? <ul><li>Don’t expect to save time immediately </li></ul><ul><li>Time to send audio files </li></ul><ul><ul><li>Problem if many students </li></ul></ul><ul><li>Best chance of saving staff time if: </li></ul><ul><ul><li>give lots of feedback </li></ul></ul><ul><ul><li>write slowly but record speech quickly </li></ul></ul><ul><ul><li>comfortable with technology </li></ul></ul><ul><ul><li>easy to send audio feedback </li></ul></ul>
    26. Is audio feedback worth it? <ul><li>Experienced practitioner </li></ul><ul><li>Better, but may not be quicker </li></ul><ul><li>Give an extended trial </li></ul><ul><li>Worth it for some assessment, esp. </li></ul><ul><ul><li>if detail, personal touch valuable </li></ul></ul><ul><ul><li>to groups </li></ul></ul><ul><li>Other uses of audio! </li></ul>
    27. Technology to the rescue? <ul><li>Not a magic bullet </li></ul><ul><li>Use appropriately </li></ul><ul><ul><li>In favourable circumstances </li></ul></ul><ul><ul><li>Make informed choices on methods </li></ul></ul><ul><li>Technology can bite! </li></ul><ul><li>“To err is human, but to really screw things up requires a computer.” (Anon) </li></ul>
    28. Sources (1) <ul><li>Bull J & McKenna C (2004), Blueprint for Computer-Assisted Assessment, RoutledgeFalmer, London </li></ul><ul><li>PASS-IT (2004), Good Practice Guide in Question and Test Design, http://staffnet.kingston.ac.uk/~ku36708/RRR/docs/goodpracticeguide.pdf </li></ul>
    29. Sources (2) <ul><li>Rotheram B (2009), Practice tips on using digital audio for assessment feedback, http://sites.google.com/site/soundsgooduk/downloads/Audio_feedback_tips_3.pdf </li></ul>