Introduction to e-Assessment
  • We will focus on designing e-Assessments and look at some practical examples. The presentation will highlight Bloom's Taxonomy and consider how assessments might be written at various levels of the taxonomy scale.
  • What is effective assessment?
Before we start thinking about e-Assessment, it is perhaps useful to consider what we mean by effective assessment practice. This is what will underpin and inform any decisions we make regarding the approach we take to assessment and why. As with any tool, it is key that we use the right technologies in the right way and in the right context for the assessment to be effective. The JISC publication Effective Assessment in a Digital Age (JISC, 2010) explores effective practice in e-Assessment in Further and Higher Education in the UK. The publication contains a number of case studies and refers to principles of effective assessment practice, using examples of principles developed by the Re-Engineering Assessment Practices (REAP) project as a basis for analysing the underlying educational value of the case studies highlighted. It sets out a definition for effective assessment practice as: "Effective assessment and feedback can be defined as practice that equips learners to study and perform to their best advantage in the complex disciplinary fields of their choice, and to progress with confidence and skill as lifelong learners, without adding to the assessment burden on academic staff."
NICOL, D. 2011. Re-Engineering Assessment Practices (REAP). Accessed 13 June 2011. Available online at: http://www.reap.ac.uk
JISC, 2010. Effective Assessment in a Digital Age. Accessed 13 June 2011. Available online at: http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassass_eada.pdf
  • What is e-Assessment?
Another JISC publication, Effective Practice with eAssessment (JISC, 2007), provides a definition for e-Assessment, which we may wish to consider when thinking about the use of technologies for assessment. "eAssessment is the end-to-end electronic assessment processes where ICT is used for the presentation of assessment activity, and the recording of responses. This includes the end-to-end assessment process from the perspective of learners, tutors, learning establishments, awarding bodies and regulators, and the general public."
JISC, 2007. Effective Practice with eAssessment. Accessed 13 June 2011. Available online at: http://www.jisc.ac.uk/publications/programmerelated/2007/pub_eassesspracticeguide.aspx
  • Categories of Assessment
Broadly speaking, there are three categories of assessment.
Diagnostic: Diagnostic assessment can be used to diagnose the current level of learning or skills achieved by the student. It might take place before learning has occurred in the course, unit or session and can help the institution or teacher to set appropriate learning activities at the appropriate level for the student. Diagnostic assessment might also be used during classes to expose the level of knowledge, perhaps through the use of Electronic Voting Systems or an online quiz with adaptive testing.
Formative: Formative assessment gives students the opportunity to practise and test their learning during a course. It can give teachers information on areas of misunderstanding, which can then be addressed through Just In Time Teaching, for example (tailoring what is taught based on awareness of understanding). Black & Wiliam (1998, p. 2) say: "... assessment becomes 'formative assessment' when the evidence is actually used to adapt the teaching to meet the needs." Ensuring effective feedback practice is employed can greatly enhance the usefulness of formative assessment and help the student understand areas requiring improvement. A range of technologies can be used to support formative assessment.
Summative: Summative assessment normally takes place at the end of a course, module or period of learning, and is used to indicate how a student has performed against the standards set for the assessment task. Summative assessment is high stakes, so consideration must be given to areas such as security, authentication, verification and invigilation. An institutional, strategic approach is advisable when using technologies, to ensure these issues are addressed.
BLACK, P. and WILIAM, D. 1998. Inside the Black Box. London: GL Assessment.
  • Assessment for learning involves using assessment in the classroom to raise learners’ achievement. It is based on the idea that pupils will improve most if they understand the aim of their learning, where they are in relation to this aim and how they can achieve the aim (or close the gap in their knowledge). Assessment for Learning means using evidence and feedback to identify where learners are in their learning, what they need to do next and how best to achieve this. In practice, this means obtaining clear evidence about how to drive up individual attainment, understanding between teachers and pupils on what they need to improve, and agreement on the steps needed to promote sound learning and progress.
  • Bloom’s taxonomy is often used as a tool for exploring what we may wish to assess in learners. It is based on a continuum from convergent (one right answer) to divergent (no one right answer). It could be argued that we should always be aiming to assess the more divergent (sometimes referred to as higher level thinking) aspects of Bloom’s taxonomy. The convergent types focus on memory recall and the ability to classify, for example, whereas the divergent types focus on the ability to create, develop and evaluate.
BLOOM, B. S., 1956. Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.
  • A student of Bloom, Lorin Anderson, with David Krathwohl, developed a new version of the taxonomy in 2001: Bloom’s Revised Taxonomy. It uses verbs rather than nouns to describe key terms and sub-categories, focusing on the actions of each stage from lower to higher order thinking skills. A digital version of the taxonomy, Bloom’s Digital Taxonomy, can be accessed at the Educational Origami site. This shows Bloom’s Revised Taxonomy with suggested digital tools and approaches that might be used at each stage of the taxonomy scale. Other resources include a spreadsheet of Google tools to support Bloom’s revised taxonomy, developed by Kathy Schrock.
ANDERSON, L.W. and KRATHWOHL, D., 2001. A Taxonomy for Learning, Teaching and Assessing: a Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
Educational Origami, 2011. Accessed 13 June 2011. Available online at: http://edorigami.wikispaces.com/Bloom%27s+Digital+Taxonomy#Bloom%27s%20Digital%20Taxonomy
SCHROCK, K. 2011. Google tools to support Bloom’s revised taxonomy. Accessed 13 June 2011. Available online at: http://kathyschrock.net/googleblooms/
  • To counter the idea that e-Assessment is only multiple-choice questions, it might be worth looking at quizzes in a bit more depth to consider the range of question types and interactions possible between student and tutor, student and student, and student and self. Common question types associated with e-Assessment include:
Multiple choice
Multiple response
Matching / drag & drop
Sequencing
Fill in the blank
Likert scale
Assertion / reason
The question types illustrated here are often found in VLE tools and other software such as Wimba Create, Hot Potatoes and the Xerte Online Toolkit. These can be automatically marked and, with some of the tools (e.g. the VLE), the results can be stored. The other important advantage of these tools is that they can enable quite detailed feedback. Online quizzes can be marked, and feedback provided or generated, in a range of ways. It is possible to set up quizzes with front-loaded answers and feedback, which will be automatically marked. This type of quiz lends itself well to self-testing, as it can be accessed time and time again by students as they test their knowledge and do retrieval practice. Feedback might also incorporate links to further information or resources a student could refer to if they have got a question wrong. Writing such quizzes can be time consuming up front if feedback or marks are pre-set, but this can lead to less time being required at the marking stage. Quizzes might also include free text entries, however, which have to be marked by a human. Feedback might be provided for each of these at the time of marking. Feedback or marking does not have to come from the tutor, however, as peer review could also be employed. Educational theorists such as David Nicol and David Boud suggest that if a student is reviewing and evaluating the work of a peer, they need to really engage with the criteria to judge the work against and consider the value of the work.
It is argued that this leads to students becoming better at making judgements about their own work, as well as developing other skills they will need in their professional lives. Confidence or certainty based marking gives students the opportunity to state their confidence in their answer. They receive marks on a sliding scale, so that if the answer is correct but they indicated a low level of certainty, they receive fewer marks than if they are correct and indicate a high level of confidence. Conversely, if their answer is wrong but they indicate a high level of confidence, they have more marks deducted than if they are wrong but indicate a low level of certainty. The thinking behind this approach is that it encourages students to really consider their answer and the reasons why they believe it is correct. It has been argued that it promotes the questioning of ideas and reflection on what one thinks one knows. (See Tony Gardner-Medwin’s presentation on confidence-based marking for more information on this approach.)
NICOL, D. 2010. Peer Evaluation in Education Review (PEER). Accessed 3 April 2012. Available online at: http://www.reap.ac.uk/reap/resourcesPrinciples.html
BOUD, D. 2011. Assessment Futures. Accessed 3 April 2012. Available online at: http://www.assessmentfutures.com
GARDNER-MEDWIN, T. 2005. Confidence-based marking. Accessed 3 April 2012. Available online at: http://www.bioscience.heacademy.ac.uk/ftp/events/repforum03/gardnermedwin.pdf
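The sliding-scale logic described above can be sketched as a small lookup function. The specific mark values below are assumptions chosen to illustrate the principle; they are not Gardner-Medwin's actual scoring scheme.

```python
# Illustrative confidence-based marking scheme. The mark values are
# invented for demonstration; real schemes vary. The key idea is that
# higher confidence amplifies both the reward and the penalty.
MARKS = {
    # (is_correct, confidence): marks awarded
    (True, "high"): 3,
    (True, "medium"): 2,
    (True, "low"): 1,
    (False, "high"): -6,
    (False, "medium"): -2,
    (False, "low"): 0,
}

def score(is_correct: bool, confidence: str) -> int:
    """Return the marks for one answer under the sliding scale."""
    return MARKS[(is_correct, confidence)]

# A confident correct answer earns more than a hesitant correct one,
# while a confident wrong answer is penalised more heavily.
```

The asymmetry (a confident wrong answer losing more than a hesitant one) is what discourages guessing and rewards genuine self-assessment.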
  • Student Response Systems
Also known as Electronic Voting Systems (EVS), Interactive Response Systems or Audience Response Systems (or "clickers").
Popular software includes Turning Point (http://www.turningtechnologies.co.uk/) and Quizdom (http://www.quizdom.co.uk/).
Often used as an add-on to MS PowerPoint, but mobile and web versions are now becoming more popular, so there is no need to distribute handsets; students can use a mobile phone, mobile device or PC.
Uses include providing feedback and encouraging interaction and peer learning.
  • This slide is for display to the audience to show them how they will vote on your polls in your presentation. You can remove this slide if you like, or if the audience is already comfortable with texting and/or voting with Poll Everywhere. This is just a standard-rate text message, so it may be free for you, but you may be charged if you do not have a text messaging plan. Your phone number can’t be seen and you’ll never receive follow-up text messages outside this presentation.
  • This slide is for display to the audience to show them how they will vote on your polls in your presentation. You can remove this slide if you like, or if the audience is already comfortable with texting and/or voting with Poll Everywhere. Sample oral instructions: You can participate by submitting an answer at Poll4.com on your laptop or a mobile phone. The service we are using is serious about privacy. I cannot see who you are or who voted.
  • This slide is for display to the audience to show them how they will vote on your polls in your presentation. You can remove this slide if you like, or if the audience is already comfortable with texting and/or voting with Poll Everywhere. Sample oral instructions: The way you will be able to participate is by tweeting a response to @poll. Your followers won’t see this message.
  • Assessing higher order skills: scenario MC question
This scenario MC question uses more complex wording. A scenario can test application and synthesis of knowledge.
Answer C: a transcript should be made (the practical solution; in an ideal world a video should have subtitles).
  • Answer: true.
E.g. a student may be under stress and offered more time to do something but prefer to do the same as others. A student may also have had a negative experience associated with declaring a disability and not want it to be disclosed; therefore individual lecturers may not know.
  • Assessing higher order skills: assertion/reason question
Use of an MC test to assess higher order skills. Quite a complex stem, but still possible to mark automatically.
Answer: D.
  • When testing students, it is important that they are being tested on the subject and not on their ability to interpret the question, to guess the answer from implausible distractors, or to use other questions in the quiz to help them answer. These suggestions for creating valid assessment items have been adapted from the SQA Guidelines on Online Assessment for Further Education.
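A few of these validity concerns can even be checked mechanically before an item is published. The checks below are illustrative assumptions in the spirit of such guidelines, not the SQA's actual checklist; the example item is invented.

```python
# Illustrative sanity checks for a multiple-choice item. The specific
# rules are assumptions for demonstration, not the SQA's checklist.
def check_item(stem: str, options: list[str], key: str) -> list[str]:
    """Return a list of problems found with one MC item."""
    problems = []
    if key not in options:
        problems.append("key is not among the options")
    if len(options) < 4:
        problems.append("fewer than three distractors")
    if len(set(options)) != len(options):
        problems.append("duplicate options")
    # A key that is conspicuously longer than the distractors is a
    # common cue that lets students guess without subject knowledge.
    distractors = [o for o in options if o != key]
    if distractors and len(key) > 2 * max(len(o) for o in distractors):
        problems.append("key is much longer than the distractors")
    return problems

issues = check_item(
    stem="Which category of assessment is typically high stakes?",
    options=["Summative", "Formative", "Diagnostic", "Ipsative"],
    key="Summative",
)
# A well-formed item returns an empty list of problems.
```

Checks like these cannot judge whether distractors are plausible, of course; that still requires a subject expert reviewing the item.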
  • Some assessment theorists
There are a number of educational theorists in the field of assessment. Currently four appear to be particularly influential, working at the intersection of assessment theory and how technology can be used to enhance or underpin principles of effective practice. These are the four Davids: David Nicol, David Boud, (David) Royce Sadler and David Carless! All of them promote the idea that assessment is about more than marking and grading; it should be very much about the learning, assessment for learning as well as assessment of learning. They have slightly different approaches but share the same key concern: effective assessment practice should result in self-regulating learners who are equipped with skills for lifelong learning.
David Nicol is the Director of the influential Re-Engineering Assessment Practices (REAP) and Peer Evaluation in Education Review (PEER) projects. His work has mainly focused on principles of good assessment and feedback design. A set of 12 principles of good assessment design was developed during the REAP project, based on the 7 principles of good feedback practice proposed by Nicol and Macfarlane-Dick (2006), extended to incorporate some principles around institutional practices to support good assessment design:
Help clarify what good performance is (goals, criteria, standards)
Encourage ‘time and effort’ on challenging learning tasks
Deliver high quality feedback information that helps learners self-correct
Provide opportunities to act on feedback (to close any gap between current and desired performance)
Ensure that summative assessment has a positive impact on learning
Encourage interaction and dialogue around learning (peer and teacher-student)
Facilitate the development of self-assessment and reflection in learning
Give choice in the topic, method, criteria, weighting or timing of assessments
Involve students in decision-making about assessment policy and practice
Support the development of learning communities
Encourage positive motivational beliefs and self-esteem
Provide information to teachers that can be used to help shape the teaching
David Boud has done a lot of work around developing students’ own skills in assessment, self-evaluation, authentic assessment and assessment in the workplace. He too has developed a set of principles, which can be accessed on the Assessment Futures website.
(David) Royce Sadler is interested in the transition from receiving feedback to self-monitoring and developing independence in learning. He has developed a set of essential elements for effective formative assessment, which focus on the student understanding what is required of them, the ability to judge how their work matches that, and how to ‘close the gap’ between their current work and what is required.
David Carless has conducted research into sustainable feedback practices and is interested in feedback timing. He focuses on what he terms learning-oriented assessment (LOA), which he believes consists of: appropriate assessment task design; student involvement in assessment through peer- and self-evaluation; and dialogic feedback.
NICOL, D. 2004. Re-Engineering Assessment Practices (REAP). Accessed 13 June 2011. Available online at: http://www.reap.ac.uk/reap/resourcesPrinciples.html
NICOL, D.J. and MACFARLANE-DICK, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
BOUD, D. 2011. Assessment Futures. Accessed 13 June 2011. Available online at: http://www.assessmentfutures.com
SADLER, R. 2011. Accessed 13 June 2011. Available online at: http://www.griffith.edu.au/professional-page/emeritus-professor-royce-sadler
SADLER, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-144.
CARLESS, D. 2011. Accessed 13 June 2011. Available online at: http://web.edu.hku.hk/academic_staff.php?staffId=dcarless
CARLESS, D. 2007. Learning-oriented assessment: conceptual bases and practical implications. Innovations in Education and Teaching International, 44(1), 57-66.
  • The 12 principles developed by the Re-Engineering Assessment Practices (REAP) project were based on the 7 principles of good feedback practice to support formative assessment and self-regulated learning, developed by David Nicol and Debra Macfarlane-Dick in 2006, and were also informed by Graham Gibbs and Claire Simpson’s set of “Conditions Under Which Assessment Supports Students’ Learning” (2004):
Help clarify what good performance is (goals, criteria, standards)
Encourage ‘time and effort’ on challenging learning tasks
Deliver high quality feedback information that helps learners self-correct
Provide opportunities to act on feedback (to close any gap between current and desired performance)
Ensure that summative assessment has a positive impact on learning
Encourage interaction and dialogue around learning (peer and teacher-student)
Facilitate the development of self-assessment and reflection in learning
Give choice in the topic, method, criteria, weighting or timing of assessments
Involve students in decision-making about assessment policy and practice
Support the development of learning communities
Encourage positive motivational beliefs and self-esteem
Provide information to teachers that can be used to help shape the teaching
A key point about the principles is that they use verbs and promote what to do to help embed assessment for learning practices in the curriculum. Some of the principles may be harder to implement than others, although this will perhaps depend on the context for the assessment. To take an example, in a set, formal course it might be challenging to give choice in the topic, method, criteria, weighting or timing of assessments; however, the work the SQA has done to chunk competencies, develop levels of learning and allow choice in areas such as the method used to complete an assessment does provide some flexibility.
This approach might also become more feasible if we move towards a more agile, open approach to assessment in general, such as might be possible through an open badges system, research around which has been encouraged by the Digital Media and Learning Competition, 2011, on ‘Badges for Lifelong Learning’, supported by the Mozilla Foundation, MacArthur Foundation and HASTAC in the USA (we will look at the open badges concept in more detail later on).
NICOL, D. 2004. Re-Engineering Assessment Practices (REAP). Accessed 13 June 2011. Available online at: http://www.reap.ac.uk/reap/resourcesPrinciples.html
NICOL, D.J. and MACFARLANE-DICK, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
GIBBS, G. and SIMPSON, C. (2004). Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1, 3-31.
  • The assessment matrix used in the case study was mapped to the seven principles of good feedback outlined by Nicol & Macfarlane-Dick (2006). Good feedback practice:
helps clarify what good performance is (goals, criteria, expected standards)
facilitates the development of self-assessment (reflection) in learning
delivers high quality information to students about their learning
encourages teacher and peer dialogue around learning
encourages positive motivational beliefs and self-esteem
provides opportunities to close the gap between current and desired performance
provides information to teachers that can be used to help shape the teaching
NICOL, D.J. and MACFARLANE-DICK, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
  • McPherson and Simpson map this as follows: Students developed a clear understanding of the assessment criteria, goals and standards via the formative assessment process. Feedback provided by the assessment system and from peers encouraged students to develop strategies for self-assessment. Automated feedback, and the potential to retake the assessment, empowered students to self-correct any misunderstanding. Through the synchronous discussion of results of assessments in the classroom, the seminar tutor was able to focus feedback dialogue in the areas where student understanding was limited, opening up the space for both tutor-student and student-student dialogue. Through the reaffirmation of understanding and progress, identifiable in increased performance, ABC fostered positive self-esteem and stimulated motivation through the rewards of engagement and the identification of progress. This potential was extended by the possibility for students to retake assessments at their convenience and to act on the feedback received. This encouraged continual engagement with feedback, with the reward lying in improvement of performance. And, significantly, the feedback matrix produced by the use of ABC on the module provided a rich vein of feedback for the teaching staff responsible for constructing assessments and formal feedback. Through its use in the classroom, and through the monitoring of student performance, ABC provides teaching staff with tools that can actively be mobilised to close the gap between staff and student understanding of student comprehension of module material, feedback and assessment. This goes a long way to making the pedagogical underpinnings
  • Assessment for learning
All of the theorists mentioned above focus on assessment for learning and argue that assessment tasks should support the development of self-regulating learners. Elements which could be said to support this include teachers and the curriculum being flexible and responsive to learners. Areas of misunderstanding might be identified through diagnostic and formative assessment, resulting in Just In Time Teaching to address any areas requiring more support. Assessment tasks should be an integrated part of the curriculum, with formative assessments allowing students to build towards summative assessment. If particular technologies will be used during summative assessment, students should also be given the opportunity to build up experience with the technology, so they feel comfortable with it. Linking assessments to criteria will help students focus on what is required of them and understand what they are aiming for. Assessments should also support high academic standards. For example, formative assessment can be used to help students learn about what is expected of them regarding accurate referencing, structure, writing style and so on. Timing of feedback is important so that students have time to integrate the learning from the feedback into their future submissions: a feedback loop. Feedback does not only have to come from the teacher. The educationalists mentioned above have conducted a lot of research into the benefits of peer review and how this can help develop skills in self-evaluation. If a student is reviewing and evaluating the work of a peer, they need to really engage with the criteria to judge the work against and consider the value of the work. It is argued that this leads to students becoming better at making judgements about their own work.
All of the above should help support learners to direct their own learning better as they increasingly understand what it is they are aiming for and what they need to do to achieve that. A range of technologies can be used to support the above in a number of ways, including enabling fast turnaround of feedback, quick diagnostic tests, automated marking, platforms for supporting peer review etc.
  • A common misperception of e-Assessment is that it is only about online multiple-choice quizzes and that these can only be used for testing things like memory recall. This does not have to be the case: MCQs, if written well with adequate distractors, can test more than this, and there are also many more ways in which technologies can be used to test a wide range of thinking, as outlined in Bloom’s taxonomy. The list of approaches outlined here is not exhaustive and will continue to evolve. It can be challenging to group types of e-Assessment approach, as many of them overlap. To consider a range of possibilities with e-Assessment, the following list contains category-, output/outcome-, method- and review/marking-driven approaches. Each of these might be the starting point to meet a particular teaching need.
Diagnostic: e.g. EVS, mobile polling software, Twitter, quizzes
Showcasing material and influences: e.g. e-Portfolios incorporating digital content, blogs, files, feeds; wikis
Demonstrating competency / meeting standards: e.g. e-Portfolios, blog posts tagged against standards, virtual worlds
Blogging / journalling: e.g. blogs (written, audio, video)
Digital storytelling: e.g. video, audio submissions, wikis
Quizzes with automated marking: e.g. VLEs, online/mobile quizzes, games
Human marking: e.g. VLE assignment submission, virtual worlds
Peer review: e.g. via e-Portfolios, wikis, peer review software, discussion forums
Let’s consider some examples for the above. A number of technologies can be used to demonstrate competency or meeting standards. For example, students might use an e-Portfolio to upload and submit evidence against specific pre-defined standards or skills. Virtual worlds can be used to provide simulations and an authentic setting for assessment, where learners use avatars or perhaps augmented reality to interact with the environment and respond to stimulus.
Complex or simple games can also be used to help learners demonstrate ability and comprehension by completing a task before they can move on to the next level. Games do not need to be complicated for the learning to be valuable; a simple game with a sound underlying design might be just as useful and interesting for learners as an expensive one set in an immersive world. Key aspects such as challenge, feedback and reward can provide the necessary impetus for learners to engage with the game and the inherent learning. Digital storytelling, where images, text and audio can be pulled together to create an online story, might be used to demonstrate the ability to produce original content or to re-use existing resources, mash up and assemble. Video, audio submission and wikis might be used for the same purposes. To enable students to showcase material and compare or argue why content has been created in a certain way, e-Portfolios, which allow learners to pull together documents, blogs, feeds, images, video and so on in a central place, could be used. Wikis might also be an appropriate tool for presenting content, developing a knowledge base and so on. Blogging and journalling can be effective tools for helping students to reflect and analyse thinking. Blogs might be used to demonstrate competency or understanding if the blog posts are tagged with the appropriate standard or competency. The ability to articulate concepts and summarise effectively can be supported through micro-blogging tools such as Twitter, which might also be used in diagnostic testing for polling. Dedicated online polling software or handheld clickers might also be used in class to gain a quick overview of understanding and to help a teacher shape what they will focus on in their teaching. To support ongoing feedback, automatically marked quizzes delivered online or via mobile devices could be used.
Quizzes and self-tests could be set up by the teacher and made available throughout a course to help students see how they are progressing at any given point and to help them become more self-directed in their learning. Assignments submitted online might be marked by a human via a VLE or other online assessment engines. Peer review can be supported through dedicated peer review software, but also through tools which allow students to open up access to, or communicate with, their peers, such as e-Portfolios, wikis and discussion forums.
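The automatically marked quizzes described above can be sketched in a few lines. This is a minimal illustration, not taken from any of the tools or case studies mentioned; the question data and feedback strings are invented for the example.

```python
# A minimal sketch of an automatically marked quiz with instant feedback.
# Each question stores the index of the correct option plus a feedback
# message shown when the learner answers incorrectly.
QUESTIONS = [
    {"prompt": "Which tag is the root of an HTML document?",
     "options": ["<body>", "<html>", "<head>"],
     "answer": 1,
     "feedback": "The <html> element is the document root."},
    {"prompt": "2 + 2 * 3 equals?",
     "options": ["12", "8", "10"],
     "answer": 1,
     "feedback": "Multiplication binds tighter than addition."},
]

def mark_quiz(responses):
    """Mark a list of chosen option indices; return (score, feedback list)."""
    score, feedback = 0, []
    for question, choice in zip(QUESTIONS, responses):
        if choice == question["answer"]:
            score += 1
            feedback.append("Correct.")
        else:
            feedback.append("Incorrect: " + question["feedback"])
    return score, feedback

score, notes = mark_quiz([1, 0])  # first answer right, second wrong
```

Because marking is just a lookup, the same quiz can be re-taken at any point in the course, supporting the self-testing use described above.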
  • A number of approaches to e-Assessment and considerations around effective practice have been highlighted so far. We’ll go on to explore some of these approaches and considerations in a bit more depth and demonstrate how they have been applied by reviewing some case study examples.
  • Polling can be used online or in the classroom. Systems developed specifically for classroom use are known by a number of different names, including Student Response Systems, Electronic Voting Systems (EVS) and Clickers. The technology enables students to indicate their responses to a series of questions, usually via a handset. There are also browser-, text- or Twitter-based systems, which might be used via personal mobile devices in the classroom. Student responses to a question can be displayed immediately on a whiteboard and/or downloaded to gather evidence. This encourages student interaction during a class, and feedback can be given immediately by the tutor and peers. Popular polling software includes Poll Everywhere, Turning Point and Quizdom.

Case Study: Using Web 2.0 Tools to Develop and Support a Multi-Campus Class @ UWS (a Case Study with Daniel Livingstone)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - MCC

This case study highlights the use of PollEverywhere, a web-based system which is free to use (for up to 30 students). The example involved undergraduate Computing Systems students from the University of the West of Scotland. Multiple choice and free text response questions were created using the PollEverywhere software, and students were asked to respond by texting from their mobiles, via Twitter, or via the PollEverywhere web site. This supported active learning in class, as the questions were discussed in small groups prior to responding via the PollEverywhere tool. The lecturer considered one major advantage of using this software to be that it cut out the need to organise and maintain handsets.
The lecturer also found that making use of polling software tools (along with other tools) helped student engagement, especially with the larger student numbers on the course.

Case Study: Using Moodle easyVoter Student Response System at Cumbernauld College (a Case Study with Colleen Hurren)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - cumbernauld1

This case study highlights the use of EasyVoter, an add-on to the Moodle Virtual Learning Environment that enables quick and easy creation of interactive questions without the need for handheld devices or third-party software. Voting takes place within the teaching environment and is led by the tutor/lecturer, with students responding via the Moodle interface on a PC, laptop or mobile device. Cumbernauld College had not invested in a commercial voting system, and the case study outlines the pilot phase of the project, with the software installation, set-up and training of a small number of staff done within a short period of time. The quiz questions were created quickly with a clear and easy-to-use interface. Five question types are available (fill in the content, multiple choice, true/false, numeric or text), and quiz results can be shown as a graph that can be downloaded as a CSV file.
  • Peer assessment is an area of assessment which is attracting a lot of interest and research (see the PEER project led by David Nicol at the University of Strathclyde). There are a number of benefits associated with using peer assessment with students: it engages students, as they are required to actively participate in the learning process; it encourages deeper learning and understanding; and it gives students practice in constructing feedback. Students need a good understanding of the assessment criteria to give constructive feedback, and they are encouraged to reflect and develop self-assessment techniques.

David Nicol highlights some examples of peer assessment during an Effective Assessment & Feedback webinar delivered as part of the RSCtv series. Nicol provides an example of how peer review might be structured:
• Students write an essay on one topic from a choice of three.
• Each student anonymously provides feedback on three essays in another topic, using a rubric.
• The rubric: write a short summary of the essay; comment on and rate (four-point scale) the structure, arguments, evidence and writing; suggest ways of improving the essay.
• Students receive the peer reviews of their own essays.
• They then comment on and rate their own essay using the same rubric.
• Finally, students rate three reviews (of others' work) and comment on how useful they think each would be to the author.
• Grading: for participating in the task, for their own essay and for their review of it.

Case study: Using Web 2.0 Tools to Develop and Support a Multi-Campus Class (Case Study with Daniel Livingstone)
Use of the PeerWise peer assessment tool
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - MCC

This example highlights a peer assessment tool being used with 1st year Computing Systems students at the University of the West of Scotland. Students were asked to compose their own multiple choice questions, then to answer questions composed by other students. These questions were then ranked using browser-based software called PeerWise, a free online repository of multiple choice questions created by students. Some student questions were then used as part of the module's summative assessments.

RSCtv: Effective Assessment & Feedback, presented by David Nicol, 29th March 2011 - http://www.rsc-sw-scotland.ac.uk/rsctv/RSCtv.htm#archive
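Nicol's four-criterion rubric lends itself to a simple data structure. The sketch below is illustrative only; the field names and the aggregation helper are assumptions, not part of any tool mentioned above.

```python
# A sketch of the peer-review rubric described above: each review carries a
# short summary, a four-point rating per criterion, and suggested improvements.
RUBRIC_CRITERIA = ["structure", "arguments", "evidence", "writing"]

def make_review(summary, ratings, improvements):
    """Validate and package one peer review of an essay."""
    assert set(ratings) == set(RUBRIC_CRITERIA), "must rate every criterion"
    assert all(1 <= r <= 4 for r in ratings.values()), "four-point scale"
    return {"summary": summary, "ratings": ratings,
            "improvements": improvements}

def average_ratings(reviews):
    """Mean rating per criterion across all peer reviews of one essay."""
    return {c: sum(r["ratings"][c] for r in reviews) / len(reviews)
            for c in RUBRIC_CRITERIA}
```

With three reviewers per essay, as in Nicol's example, `average_ratings` gives the author a quick per-criterion picture alongside the written comments.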
  • Embedding e-Assessment in module delivery using Assessment21's ABC at UWS (case study by Neil McPherson and Alan Simpson)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - uwsembed

This example from the University of the West of Scotland demonstrates the use of an assessment engine (Assessment 21's Assess By Computer) to deliver a CertHE Social Studies programme. The assessment engine was used formatively to deliver a series of multiple choice and fill-in-the-blank questions with automated feedback. The tests were delivered at the start of seminar sessions, with the outcomes used as a basis for discussion to clarify student understanding. Students were encouraged to discuss questions with peers to build informal feedback into the process, while formal feedback directed students to core text readings. Tests were then made available outside seminar settings, and students were asked to re-take them and re-assess their performance, encouraging self-assessment and self-regulation.

The assessment engine was also used for summative purposes, and students were given the opportunity to take a mock test under exam conditions to gain experience of using the system. The summative assessment consisted of 40 multiple choice questions and 10 slotted questions and was delivered to 87 students across six computer labs. The system's invigilation tool was used, enabling online synchronous communication between each invigilator (based in each of the six labs) and a central invigilator. The completed assessments were automatically marked via the system's marking tool, exported to a spreadsheet, then entered into the UWS student information system.
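The final step of the workflow above, exporting automatically marked results to a spreadsheet before they are entered into the student information system, can be sketched generically. This is not the Assess By Computer export; the column names and record format are assumptions for illustration.

```python
# A sketch of exporting marked results to spreadsheet-friendly CSV text,
# as in the "marked -> spreadsheet -> student record system" workflow above.
import csv
import io

def export_results(results):
    """results: list of (student_id, score) pairs -> CSV text with a header."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["student_id", "score"])  # header row for the spreadsheet
    writer.writerows(results)
    return buffer.getvalue()

csv_text = export_results([("S001", 38), ("S002", 45)])
```

The resulting text can be saved as a `.csv` file and opened directly in a spreadsheet or imported into a records system.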
  • Using Turnitin as a teaching tool and for summative assessment - University of the West of Scotland (Jo Oliver Vodcast)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - turnitin_uws
http://rsce-assessment.blogspot.com/2011/05/using-turnitin-to-teach-academic.html (RSC e-Assessment Blog, Grainne Hamilton)

Turnitin features Originality Check, which checks for improper citation or plagiarism by comparing submitted work against text comparison databases. Turnitin was used with nursing students (post-registration and adult returners to education) at the University of the West of Scotland to support the development of academic writing skills. By using the Originality Check tool, students were encouraged to submit their work and check it themselves. This encouraged students to learn how to analyse their own work, consider how to integrate research into their writing and ensure they referenced correctly.

An Information Literacy and Plagiarism Pilot Project at the City of Glasgow College (Case Study with Jen Fuller & Nicola Buddo)
http://www.rsc-sw-scotland.ac.uk/case_studies/docs/CityofGlasgowColl_Safe%20Assign.pdf

Staff at City of Glasgow College were concerned at students' lack of information literacy and accurate referencing skills, so the college chose to run a pilot project involving 1st year HNC/D students likely to enter a degree course during its 2nd or 3rd year. An Information Literacy Skills tutorial was developed, covering how to search for information and evaluate resources, and how to correctly reference different types of information. Students were given paper-based notes addressing referencing and information skills, and were also pointed to the Intute Internet Detective tutorial (an online information skills tutorial). This was followed by a tutorial on how to submit work online using the plagiarism detection tool, Blackboard Safe Assign. An informal evaluation of the pilot indicated that it is vital to include guidance on information literacy when using plagiarism detection software, as this can help students to contextualise plagiarism and learn about the wider picture of copyright, IPR and the re-use of resources.
  • Using Turnitin as a teaching tool and for summative assessment - University of the West of Scotland (Jo Oliver Vodcast)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - turnitin_uws
http://rsce-assessment.blogspot.com/2011/05/using-turnitin-to-teach-academic.html (RSC e-Assessment Blog, Grainne Hamilton)

The GradeMark feature of Turnitin is a feedback mechanism that enables the tutor to highlight sections of work and to write custom comments that can be dragged onto the assessment. All marking can be done online, and the tool can track student performance by analysing submissions and identifying areas of concern. It has been used to mark the assessments of nursing students (post-registration and adult returners to education) at the University of the West of Scotland. The tool can store all assignments in one place, which enabled the tutor to move easily from one assignment to the next, resulting in a quicker marking process. Comments can be saved and re-used on other papers, and the tutor could use pre-set comments, so spent less time typing or writing. Higher consistency of marking was achieved by making use of the marking grid (rubric), which involved the tutor deciding on marking criteria. The tool was also found to benefit group marking, as every marker used the same grid and online access could be given to the moderator. Students would see comments on their work when they accessed their Gradebook, and they found the quick turnaround of timely feedback useful as they could use the comments to improve future work.
  • Immersive Use of Mahara for Creative Industry Courses at Cumbernauld College (Case study with Alan Moffat)
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm - cumbmahara
http://rsce-assessment.blogspot.com/2011/05/immersive-use-of-mahara-for-learning.html (RSC e-Assessment Blog, Grainne Hamilton)

The ePortfolio system Mahara was used with HNC Creative Industries students at Cumbernauld College for both formative and summative assessment. The lecturer found it a useful tool for bringing all the different elements of the students' practice together in one place. The system could incorporate media (including audio and video), allowing students to view their progress as the course evolved. Use of the platform allowed for a mixture of written work, including theory, which could be put in context using media such as photography and audio (important for sound production students). As well as highlighting the students' progress, they also had an archive of their work that could be used for showcasing purposes. The lecturer found it useful to see the students' progress in terms of self-development and also in relation to the learning outcomes of the course, which were mapped to a timetable to ensure students knew exactly what they needed to do, and when, to complete the outcomes.

The system enabled transparent feedback that could be made available to other course lecturers and to the students, helping to enable a conversation to take place. The feedback could also be opened to other students to incorporate peer feedback. Part of the learning process within the creative industries field is learning not to take feedback personally: students learned to engage in mature debate about their work and not to shy away from criticism. Students also learned to reflect on their work (making use of the blog tool in Mahara) and to provide constructive feedback to others.
Using e-Portfolios to create a learning community and to support assessment and reflective practice - at the Royal Conservatoire of Scotland (formerly the RSAMD) (Case study with Aaron Shorr, Andy Dougan, Silviya Mihaylova and Alicia Shaw)
http://www.rsc-scotland.ac.uk/case_studies/CaseStudies.htm#rcsmahara
http://rsce-assessment.blogspot.co.uk/2011/09/using-e-portfolios-to-create-learning.html (RSC e-Assessment Blog, Grainne Hamilton)

Staff and students at the Royal Conservatoire of Scotland have been making use of the opportunities to incorporate rich multimedia, blog, upload videos of performances and more, in order to reflect on practice, assess in more effective ways and foster inter-disciplinary engagement across the institution. The School of Music and the Digital Film and TV department in the School of Drama had identified a number of areas they felt the use of an e-Portfolio system would help them to address. These included: creating archives of performances and projects by students, and of masterclasses and concerts by national and international artists; enabling students to reflect on their own performances and projects; distance learning and support; and peer review and feedback. They also wanted to support learning through the media and environments students are familiar with. Gordon McLeod, the Learning Technologist for the Royal Conservatoire of Scotland, developed a YouTube-style environment for the School of Music to upload students' performances to, which enabled students to review their performances and receive feedback from peers. Staff were impressed with the quality of feedback, which might include annotated manuscripts and historical references. Students responded positively to the videos, as they allowed them to see their performance from an audience perspective and identify areas that required work. One student commented that this also helped build their confidence in their ability.
The Digital Film and TV department stipulated that students use a blog to reflect on their practice and to document assessed collaborative research projects. The blogging tool in Mahara allowed students to work with the rich digital media they were used to and to easily incorporate visual, multimedia elements to express themselves and reflect on their practice in creative ways. A student pointed out that blogging enabled a peer conversation to develop around each other's work, where peers might suggest a resource another student could refer to, give feedback on another student's project, learn about each other's interests and so on. The faculty found that this peer conversation helped students develop critical thinking skills, and that the anytime access to the online resources helped foster greater continuity in the learning process. Both Schools found that use of the ePortfolio helped to create a learning community, due to the opportunities afforded for sharing material across year groups and disciplines, enabling reflection and review by peers, and using online material to prompt ongoing discussion both on- and offline. Another benefit was that staff achieved efficiencies in assessing and feeding back to students: Mahara and the students' content could be accessed online via a computer or a mobile device, which meant that feedback could be provided at times that suited the tutor and fitted into their own schedule.
  • In this webinar we will look at some further considerations around e-Assessment, including ensuring assessment practices are inclusive, quality assurance, and specific issues for summative assessment. We will also look at emerging trends and some examples of new approaches to assessment in use.
  • Inclusive Assessment
Ensuring assessment practices are inclusive means catering not only for learners with additional support needs; technology can also be used to cater for different learning styles and user preferences. Technology can be a valuable tool in allowing learners to access information and materials in ways that suit them, with minimum additional effort on the part of the teacher. Any digital object should be developed in line with good practice in relation to accessibility, and many platforms and services are now constructed in such a way as to cater for this. The World Wide Web Consortium (W3C) develops guidelines on designing for accessibility and is a valuable resource for ensuring digital content is as accessible as possible.

JISC TechDis has created a leaflet explaining key points of the Equality Act 2010, JISC TechDis Single Equality Duty: Improve your 3 Rs: Recruitment-Retention-Results. It includes a model of Accessibility Maturity: Diminishing Level of Risk, which charts how to develop inclusive practice that supports learners to access learning in a way which works for them, while at the same time helping an institution to meet its legal obligations. The aim of the model is to highlight how institutions should move from a tokenistic approach to catering for the needs of all learners towards a partnership approach, where the institution involves learners in developing good practice and embedding inclusion in the curriculum.

Any development in the curriculum should have inclusive practice at its core. The Equality Act 2010 sets out the new law regarding obligations on the part of public sector bodies and local authorities (colleges come under local authorities) with regard to how they should uphold inclusive practice. The following table highlights some of the pertinent points of the act in relation to how the Equality Act will affect further and higher education.
This has been adapted from guidance produced by Scotland's Colleges in Legal Obligations on Colleges under the Equality Act 2010.

Equality Act 2010. Accessed 13 June 2011. Available online at: http://www.legislation.gov.uk/ukpga/2010/15/contents
Brymer, S. 2010. Legal Obligations on Colleges under the Equality Act 2010. Stirling: Scotland's Colleges. Accessed 13 June 2011. Available online at: http://www.scotlandscolleges.ac.uk/component/option,com_docman/Itemid,78/gid,2998/task,doc_download
W3C, 2011. Accessed 13 June 2011. Available online at: http://www.w3.org/
JISC TechDis, 2010. Single Equality Duty: Improve your 3 Rs: Recruitment-Retention-Results. Accessed 13 June 2011. Available online at: http://www.jisctechdis.ac.uk/techdis/resources/detail/investinyou/JISC_TechDis_SED_Leaflet
  • Quality Assurance
As with any assessment process, it is important to build in quality assurance from the start of developments. Given some of the challenges associated with summative assessment, it is important to ensure safeguards are in place for the security and integrity of the assessment process. When considering QA in relation to e-Assessment, it is advisable to refer to institutional quality assurance policies and procedures. It is quite likely that a separate procedures document will be required for assessment / e-Assessment, covering key points to consider when planning and delivering e-Assessments (particularly summative ones). Further information can be obtained from awarding bodies, and their guidelines must be followed if using e-Assessment for qualifications they will award. You may wish to refer to the Qualifications & Curriculum Authority 2007 publication, Regulatory principles for e-assessment, for an example of such guidelines.

Qualifications & Curriculum Authority, 2007. Regulatory principles for e-assessment. London: QCA. Accessed 13 June 2011. Available online at: http://www.sqa.org.uk/files_ccc/RegulatoryPrinciplesforE-assessment.pdf
  • Embedding e-Assessment in module delivery using Assessment21's ABC at UWS (case study by Neil McPherson and Alan Simpson)
http://www.rsc-scotland.ac.uk/case_studies/CaseStudies.htm - uwsembed

This example from the University of the West of Scotland demonstrates the use of an assessment engine (Assessment 21's Assess By Computer) to deliver a CertHE Social Studies programme summative exam. The assessment engine was used for summative purposes, and students were given the opportunity to take a mock test under exam conditions to gain experience of using the system. The summative assessment consisted of 40 multiple choice questions and 10 slotted questions and was delivered to 87 students across six computer labs. Invigilators were located in each of the six labs, and the Assessment21 invigilation tool was used to enable online synchronous communication between each invigilator and a central invigilator. The system's monitoring tool allowed all active students and their network connections to be viewed, including when they moved away from the system. The completed assessments were automatically marked via the system's marking tool, exported to a spreadsheet, then entered into the UWS student information system.

Implementing e-Assessment and Building an e-Assessment Centre at Edinburgh's Telford College (case study by Gavin Lang)
http://www.rsc-scotland.ac.uk/case_studies/CaseStudies.htm#eassess_centre_telford

At Edinburgh's Telford College, staff were interested in using more e-Assessment but had concerns regarding security. To address these concerns, the college built an e-Assessment centre, which contains software for a range of assessment systems and is set up to enable secure web-based assessment. To enable multiple sittings of the one exam, there is a holding room for students, where students are checked in and can store phones and other personal items in boxes. The room is staffed at all times.
The students are assigned a numbered PC and can view a map of the assessment room to find where to sit.
  • Case study on the University of Dundee’s use of iPads for OSCEs is available on the Questionmark blog at: http://blog.questionmark.com/a-new-possibility-delivering-dental-and-medical-observational-assessments-osces-on-an-ipad
  • The use of tablet PCs for e-assessment - at South Lanarkshire College
http://www.rsc-sw-scotland.ac.uk/case_studies/CaseStudies.htm#SLanark1

Tablet PCs are used for a wide variety of learning and teaching activities within South Lanarkshire College, including e-Assessment. A strategic decision was taken to provide staff with tablet PCs that they could use to support learners. Assessment templates have been created and are used by learners and lecturers to record all assessment activity. Tablet PCs are mobile, portable and robust, and practical for workshop environments with Construction students. Learners take photos of their work before and after to record progress on the activity. Learners are encouraged to self-assess their progress and gauge how well they have understood and performed the task, with the support of their lecturer. Comments are recorded on the template, and at the end of the assessment process a digital copy is instantly available, which can be downloaded or delivered electronically. All learners have their own personal digital folder on the tablet where all their assessment materials are stored.

Benefits reported by both staff and students include:
• Reduction in admin time and paperwork
• Immediate feedback for learners and access to their own portfolio of work
• Assessment readily and easily available to external awarding bodies

Deployment of tablet PCs has provided a consistent and coherent approach to e-Assessment across the construction faculty.
  • Case study on the Open University’s use of spoken language assessment is available on the Learnosity website at: http://www.learnosity.com/case-studies/by-project/ou
  • In 2011, a White Paper (working document) by the Mozilla Foundation and MacArthur Foundation set out how the concept of open badges and a Badge System Framework could contribute to how we learn and assess. This framework informed the Digital Media and Learning "Badges for Lifelong Learning" competition, run in 2011-12 by the MacArthur Foundation and HASTAC, which was created to support the identification and trialling of potential uses for badges.

The Mozilla Badge System Framework
The Badge System Framework consists of three components:

The badges
A badge, as outlined in the framework, would be an image file with metadata containing information about what a learner has attained, such as the skills, qualities, status and level of achievement, and the issuer of the badge. The badge might contain a link back to evidence supporting its attainment, which could be verified by a third party, so a badge could carry an independent "stamp of approval" from a recognised authority. The badge and corresponding information would be portable and could be posted to a range of sites such as Facebook, LinkedIn, blogs, personal pages on recruitment sites etc.

Assessment
Assessments would be used to demonstrate how someone earned a badge. These assessments will differ depending on the type or level of skills being demonstrated. A single issuer might use assessments to award a badge in the same way as qualifications are awarded currently, but there is also the opportunity for multiple assessors, e.g. a number of votes from "gurus" in a relevant community, to determine whether someone should gain a badge. Badges could also be self-awarded, for example for users to demonstrate that they have completed an open course.

Infrastructure
The ability to create badges, issue them, verify that they are authentic and display them in various settings requires an infrastructure, which the Mozilla Open Badges Infrastructure aims to provide.
Users will be able to create a "badge backpack", essentially a portable portfolio of badges, hosted by Mozilla, although the badges will be fully portable, so learners can take them with them, with all the associated metadata in place, and host them wherever they wish.

Pilot of the Mozilla Badge System Framework
The School of Webcraft has developed a pilot in line with the Mozilla Badge System Framework, through the P2PU (Peer 2 Peer University), for learning about web development. A number of open web development challenges have been set and corresponding assessments created so that learners can take the challenges when they want and submit material for assessment. The learner's evidence is generally assessed by multiple web developer "gurus" from the web development community. Learners can gain skills badges (e.g. Javascript, PHP), value badges (e.g. Accessibility) and peer-to-peer badges (e.g. Good Teammate, Peer Mentor, etc.). Directions have been provided for the assessments linked to each badge, including clear instructions on the assessment process, how submissions will be assessed (e.g. who will do the assessing, whether there will be one or multiple assessors, and the rubric that will be used), and the requirements needed to complete the assessment.

Mozilla, 2011. Open Badges Working Paper. Accessed online 7 May. Available from: https://wiki.mozilla.org/File:OpenBadges-Working-Paper_092011.pdf
Mozilla, 2011. About Badges. Accessed online 7 May. Available from: https://wiki.mozilla.org/Badges/About
Mozilla, 2011. Open Badges Infrastructure Technical Documents. Accessed online 7 May. Available from: https://wiki.mozilla.org/Badges/infrastructure-tech-docs
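The badge metadata described above, attainment details, issuer and a link back to evidence, can be sketched as a small serialisable record. The field names below are loose illustrations of the white paper's description, not the actual Mozilla Open Badges schema; the URL and values are invented for the example.

```python
# A sketch of portable badge metadata: serialised as JSON so it can travel
# with the badge image between sites (backpack, social networks, blogs).
import json

badge = {
    "name": "JavaScript Basics",          # what was attained
    "issuer": "School of Webcraft",       # who awarded it
    "skills": ["javascript"],             # skills the badge represents
    "evidence": "https://example.org/challenges/js-basics/submission/1",
    "issued_on": "2011-09-15",
}

serialised = json.dumps(badge)   # text form, embeddable alongside the image
restored = json.loads(serialised)
```

Because the record round-trips through plain JSON, a third party could read the `evidence` link and independently verify the claim, which is the "stamp of approval" idea described above.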

Introduction to e-Assessment Presentation Transcript

  • Introduction to e-Assessment Designing e-Assessments City of Glasgow College 23rd November 2012
  • Overview
• Designing e-Assessments
• Effective Practice in e-Assessment
• Coffee break
• Activity – SOLAR
• Additional Considerations and Emerging Trends
  • What is effective assessment?
"Effective assessment and feedback can be defined as practice that equips learners to study and perform to their best advantage in the complex disciplinary fields of their choice, and to progress with confidence and skill as lifelong learners, without adding to the assessment burden on academic staff."
Effective Assessment in a Digital Age, JISC 2010
  • What is e-Assessment?
"eAssessment is the end-to-end electronic assessment processes where ICT is used for the presentation of assessment activity, and the recording of responses. This includes the end-to-end assessment process from the perspective of learners, tutors, learning establishments, awarding bodies and regulators, and the general public."
Effective Practice with eAssessment, JISC 2007
  • Categories of assessment

Diagnostic
• Assessment of skill / level / understanding
• Used to identify the student's current knowledge and skill level
• Allows learning activities to match student requirements
• Might be taken before a course has commenced

Formative
• Tests knowledge / ability during the course
• Can be used to provide feedback to individual students at critical points in the learning process
• Gives teachers an opportunity to review class results and address gaps in learning

Summative
• Used to grade and judge the student's level of understanding and skill development for progression or certification
• Usually takes place at the end of a course or module
  • Assessment for Learning vs Assessment of Learning [diagram]
Assessment for Learning prompts learner questions such as: How am I doing? What is really making me think? Where do I have to focus my revision? What are my strengths and weaknesses? Is this the best way for me to learn? What are my targets?
Assessment of Learning follows an assessor-driven path: an archive of evidence plus evaluation; the assessor determines and evaluates evidence; work is submitted for manual marking or to an assessment management system; results in accreditation.
  • Bloom's Taxonomy (Convergent: Knowledge, Comprehension, Application; Divergent: Analysis, Synthesis, Evaluation)

Knowledge: Arrange, Define, Duplicate, List, Memorise, Name, Recall, Recognise, Repeat, State
Comprehension: Classify, Describe, Discuss, Explain, Identify, Indicate, Report, Select, Translate
Application: Apply, Choose, Demonstrate, Employ, Illustrate, Interpret, Practice, Solve, Use
Analysis: Analyse, Appraise, Compare, Contrast, Critique, Differentiate, Distinguish, Examine, Test
Synthesis: Arrange, Assemble, Compose, Construct, Create, Design, Develop, Formulate, Organise, Plan, Prepare
Evaluation: Argue, Assess, Choose, Compare, Defend, Judge, Predict, Rate, Select, Evaluate
  • Bloom's Revised Taxonomy

HOTS (Higher Order Thinking Skills)
• Creating: designing, constructing, planning, producing, inventing, devising, making
• Evaluating: checking, hypothesising, critiquing, experimenting, judging, testing, detecting, monitoring
• Analysing: comparing, organising, deconstructing, attributing, outlining, finding, structuring, integrating
• Applying: implementing, carrying out, using, executing
• Understanding: interpreting, summarising, inferring, paraphrasing, classifying, comparing, explaining, exemplifying
• Remembering: recognising, listing, describing, identifying, retrieving, naming, locating, finding
LOTS (Lower Order Thinking Skills)
  • Question types
    • Convergent (LOTS)
      • Multiple Choice
      • Multiple Response
      • Matching / Drag & Drop
      • Sequencing
      • Fill in the blank
      • Likert scale
      • Assertion / Reason
    • Divergent (HOTS)
  • Marking types and feedback
    • Automated marking and tutor feedback
    • Human based marking and tutor feedback
    • Peer review / assessment
    • Confidence based marking with tutor feedback
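The "automated marking and tutor feedback" combination above can be sketched as a small routine that scores a convergent question and returns a canned, per-option feedback message. This is an illustrative Python sketch; the question data, field names and feedback wording are invented for the example and are not drawn from any system mentioned in this presentation.

```python
# Illustrative sketch: automated marking of a multiple-choice question
# with pre-written tutor feedback attached to every option.

def mark_multiple_choice(question, response):
    """Return (score, feedback) for a single multiple-choice response."""
    score = 1 if response == question["answer"] else 0
    feedback = question["feedback"][response]  # canned per-option feedback
    return score, feedback

question = {
    "stem": "Which assessment type is usually taken before a course commences?",
    "options": ["Diagnostic", "Formative", "Summative"],
    "answer": "Diagnostic",
    "feedback": {
        "Diagnostic": "Correct: diagnostic tests establish prior knowledge.",
        "Formative": "Not quite: formative assessment happens during the course.",
        "Summative": "Not quite: summative assessment comes at the end.",
    },
}

print(mark_multiple_choice(question, "Formative"))
```

Because every distractor carries its own feedback, a wrong answer still produces a targeted teaching point rather than a bare "incorrect".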
  • Polling activity
    • Text (CODE)
      • Standard texting rates
      • We have no access to your phone number
      • Capitalization doesn’t matter, but spaces and spelling do
    • Browser (CODE)
      • Capitalization doesn’t matter, but spaces and spelling do
    • Twitter (CODE)
      • Capitalization doesn’t matter, but spaces and spelling do
      • Since @poll is the first word, your followers will not receive this tweet
  • How To Vote via Texting Text to: +447624806527 CODE
  • How To Vote via Pollev.com Submit to: http://PollEv.com CODE
  • How To Vote via Twitter @poll CODE
  • Multiple choice example
    A colleague is thinking about using video when teaching, but realises she should ensure the content is accessible to all. What would you advise?
      A. Don’t use video; find an alternative teaching strategy.
      B. Provide a transcript for individual students who might need it.
      C. Make a transcript of the video available to all students.
      D. Only use video that has subtitles.
    Poll Everywhere
  • True / False example
    If a request for confidentiality is made by a disabled student, it may mean reasonable adjustments are compromised because the reasonable adjustments are less important than the confidentiality. Under these circumstances no reasonable adjustment needs to be made.
      A. True
      B. False
    Poll Everywhere
  • Assertion / Reason example
    When formatting digital documents you should structure the text with coloured inbuilt heading styles because it is important that a range of devices or assistive technologies can interpret the structure of digital content.
      A. Assertion and reason are true and the reason explains the assertion.
      B. Assertion and reason are true but the reason does not explain the assertion.
      C. Assertion is true but the reason is false.
      D. Assertion is false but the reason is true.
      E. Assertion is false and the reason is false.
    Poll Everywhere
  • Tips: constructing quiz questions
    • Use straightforward language
    • Avoid using double negatives if possible
    • Consider the amount of text and graphics
    • The stem should contain an unambiguous question or statement
    • Distractors should be plausible
    • Distractors should be significantly different from each other
    • 3 or 4 distractor options are sufficient
    • Don’t give the answer to one question in the wording of another one
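A few of these tips are mechanical enough to automate as simple lint checks when building a question bank. The Python sketch below is a rough illustration with invented rules and thresholds (a crude double-negative heuristic, a distractor count, a duplicate-option check); it is not part of any tool discussed in this presentation.

```python
# Rough heuristic "linter" for draft quiz questions, based on the tips
# above. The rules and messages are illustrative only.

def lint_question(stem, options):
    """Return a list of issues found in a draft question (empty if none)."""
    issues = []
    # Very crude double-negative check: two or more "not"s in the stem.
    if stem.lower().count(" not ") >= 2:
        issues.append("possible double negative in stem")
    # One option is the key; the rest are distractors.
    distractors = len(options) - 1
    if not 3 <= distractors <= 4:
        issues.append("aim for 3 or 4 distractors")
    # Distractors should differ from each other.
    if len(set(o.strip().lower() for o in options)) < len(options):
        issues.append("options should differ from each other")
    return issues

print(lint_question("Which tag is not used for headings?", ["<h1>", "<p>"]))
```

Checks like "distractors should be plausible" still need human judgement, but catching the mechanical problems early frees reviewers to focus on them.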
  • Question types
    • Convergent (LOTS)
      • Multiple Choice
      • Multiple Response
      • Matching / Drag & Drop
      • Sequencing
      • Fill in the blank
      • Likert scale
      • Assertion / Reason
    • Divergent (HOTS)
  • Marking types and feedback
    • Automated marking and tutor feedback
    • Human based marking and tutor feedback
    • Peer review / assessment
    • Confidence based marking with tutor feedback
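Confidence based marking, listed among the marking types, asks learners to state how certain they are and scales the reward (and penalty) accordingly, so guessing at high confidence is risky. A minimal sketch, assuming a three-level tariff of 1/2/3 marks when correct and 0/-2/-6 when wrong; this tariff is illustrative, and actual schemes vary by system.

```python
# Illustrative confidence-based marking tariff. The numbers are an
# assumption for this sketch, not the scheme of any specific system.

SCORES = {
    # confidence level: (marks if correct, marks if wrong)
    "low":    (1,  0),
    "medium": (2, -2),
    "high":   (3, -6),
}

def confidence_score(correct, confidence):
    """Return the marks awarded for one answer at a stated confidence."""
    gain, loss = SCORES[confidence]
    return gain if correct else loss

print(confidence_score(True, "high"))   # confident and right: full reward
print(confidence_score(False, "high"))  # confident but wrong: heavy penalty
```

The asymmetry is the point: a learner maximises expected marks only by reporting confidence honestly, which makes the result useful as tutor feedback on calibration as well as knowledge.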
  • Online Assessment - Effective Practice
  • Some theorists involved in assessment & feedback
    • David Nicol: Director of the Re-Engineering Assessment Practices (REAP) and Peer Evaluation in Education Review (PEER) projects; focuses on principles of good assessment and feedback design. http://www.reap.ac.uk
    • David Boud: focuses on developing students’ own skills in assessment, self-evaluation, authentic assessment and assessment in the workplace. http://www.assessmentfutures.com
    • (David) Royce Sadler: focuses on the transition from feedback to self-monitoring and developing independence in learning. http://www.griffith.edu.au/professional-page/emeritus-professor-royce-sadler
    • David Carless: focuses on learning oriented assessment (LOA), sustainable feedback practices and feedback timing. http://web.edu.hku.hk/academic_staff.php?staffId=dcarless
  • Good feedback practice:
    • Helps clarify what good performance is (goals, criteria, expected standards)
    • Facilitates the development of self-assessment (reflection) in learning
    • Delivers high quality information to students about their learning
    • Encourages teacher and peer dialogue around learning
    • Encourages positive motivational beliefs and self-esteem
    • Provides opportunities to close the gap between current and desired performance
    • Provides information to teachers that can be used to help shape the teaching
    Nicol, D.J. & Macfarlane-Dick, D. (2004). Rethinking formative assessment in HE: a theoretical model and seven principles of good feedback practice. Higher Education Academy.
  • Assessment should support:
    • Flexible, responsive delivery / just-in-time teaching (JITT)
    • Self-evaluation
    • Being part of the curriculum
    • Autonomous, self-regulated learners
    • Peer review
    • Linking to criteria
    • Providing a feedback loop
    • Ensuring academic standards
  • Approaches
    • Diagnostic: EVS, mobile polling software, Twitter feeds, quizzes
    • Showcasing material and influences: e-Portfolios incorporating digital content, blogs and files; wikis
    • Demonstrating competency / meeting standards: e-Portfolios; blog posts tagged against standards; virtual worlds
    • Blogging / micro-blogging: written, audio and video blog posts; Twitter
    • Digital storytelling: video / audio / text / animation; digital storytelling software
    • Automated marking: online / mobile quizzes; games
    • Human marking: online / mobile quizzes; free text assignment submission
    • Peer review: e-Portfolios; wikis; peer review software; virtual worlds
  • Polling
    • Used with students at the University of the West of Scotland and Cumbernauld College
    • Using Poll Everywhere and Moodle EasyVoter, tutors created questions and students responded via browser, text or Twitter in class
    • Supported active learning in class, as the questions were discussed in small groups prior to responding
    • Helped student engagement with larger groups
    JISC RSC Scotland case study: Using Web 2.0 Tools to Develop and Support Multi-Campus Class, with UWS
    JISC RSC Scotland case study: Using Moodle easyVoter Student Response System at Cumbernauld College
  • Integrating formative, peer and summative assessment
    • 1st year Computing Systems students at the University of the West of Scotland
    • Using the PeerWise system, students composed their own MC tests, creating a large bank of tests
    • Students rated other students’ tests
    • Some student questions were used as part of the summative assessment of the module
    JISC RSC Scotland case study: Using Web 2.0 Tools to Develop and Support Multi-Campus Class, with UWS
  • Improving academic standards
    • Issues highlighted at UWS and City of Glasgow College around poor academic standards and information literacy
    • Plagiarism detection software (Turnitin) used with Nursing students at UWS (adult returners and post-registration)
    • The OriginalityCheck tool was used to:
      • Enable students to learn how to analyse their own work
      • Consider how to integrate research into their writing
      • Ensure they referenced correctly
    • It is useful to include guidance on information literacy when introducing plagiarism detection, as this contextualises plagiarism
    JISC RSC Scotland case study: Using Turnitin as a teaching tool and for summative assessment with UWS
    JISC RSC Scotland case study: An Information Literacy and Plagiarism Pilot Project with City of Glasgow College
  • Efficiencies in marking
    • Turnitin used with Nursing students at UWS (adult returners and post-registration)
    • The GradeMark tool was used to store assignments, speed up the marking process and apply pre-set comments
    • Marking consistency improved by using a marking grid (rubric) to set marking criteria
    • Quick turnaround of feedback via the gradebook gave students time to use comments to improve future work
    JISC RSC Scotland case study: Using Turnitin as a teaching tool and for summative assessment with UWS
  • Closing the feedback loop / creating a learning community
    • HNC Creative Industries students at Cumbernauld College and the Royal Conservatoire of Scotland used an ePortfolio for formative & summative assessments
    • A mixture of written work (theory) put in context with student-generated content such as videos, blogs and images
    • Provides a showcase and archive of student work, and enables students to view their progress during the course
    • Feedback is transparent and linked to learning outcomes; feedback conversations run tutor to student, student to tutor and student to student
    JISC RSC Scotland case study: Immersive Use of Mahara for Creative Industry Courses, with Cumbernauld College
  • Activity
    • Access OpenAssess on the SOLAR website
    • Choose some tests and work through them: http://www.sqasolar.org.uk/
  • Online Assessment - Additional Considerations and Emerging Trends
  • Overview
    • Inclusive assessment
    • Quality assurance
    • Considerations for summative assessment - case studies
    • Developing trends in e-Assessment - examples
  • Model of Accessibility Maturity: Diminishing Level of Risk JISC TechDis Single Equality Duty guidance booklet
  • Quality assurance
    • Ensure appropriate safeguards are in place
    • Adhere to institutional quality assurance policies and procedures, and guidelines from awarding bodies
    • Feed lessons learnt into future design and delivery
  • Considerations for summative assessment - case studies
    • Consider:
      • Providing mock tests using the system to be used for the summative exam
      • Invigilation procedures and communication between invigilators in different geographic locations
      • Security and authentication procedures, and links between assessment systems and student information systems
      • Procedures for multiple sittings of the one exam
    JISC RSC Scotland case study: Embedding e-Assessment in module delivery using Assessment21’s ABC at UWS
    JISC RSC Scotland case study: Implementing e-Assessment and Building an e-Assessment Centre at Edinburgh’s Telford College
  • Developing trends in e-Assessment - examples
    • Mobile assessment
      • Monitoring and documenting in situ assessment
      • Language learning
      • QR codes
    • Games based assessment
      • xGames
      • GamesSpace
    • Augmented reality
    • A framework for holistic assessment
      • Open Badges
  • Mobile assessment - observational assessments using tablet PCs
    • Tablet PCs used at South Lanarkshire College with Construction students
    • Assessment templates created and used to record assessment activity
    • Comments recorded on the template and a digital copy made available to students
    • Immediate feedback for learners, and access to their own portfolio of work
    • Assessment readily and easily available to external awarding bodies
    JISC RSC Scotland case study: The use of tablet PCs for e-assessment, at South Lanarkshire College
  • Mobile assessment - using QR codes in assessment
    • Students scan QR codes using a smartphone
    • Launches resources such as case studies, YouTube videos, quizzes and other formative assessments
    • Activities created using SoftChalk software and hosted on the web
    • A QR code generator provides the image, containing access to the resources
    • Benefits:
      • Students have flexible access via their phones to content which they can use at any time
      • Has promoted discussion and reflection on activities
      • No need to pre-book computer labs
      • Provides spontaneity around the learning experience
      • Engages students in activities outwith class, promoting self-directed learning
    JISC RSC Scotland case study: Using QR Codes with Hairdressing Students, at Perth College
  • Games based assessment - xGames
    • JISC funded project which aimed to use collaborative games to improve attainment levels of vocational learners
    • Project outputs included games templates and a bank of example questions (for use with the Xbox)
    • Project outcomes included:
      • Increased student motivation
      • Increased student to student interaction
      • Increased learner engagement and participation
      • Increased use of ICT for learning and teaching
      • Reduced barriers to using ICT for learning and teaching
      • Evidence of the pedagogical advantages of using games in an educational context
      • Prompted institutional decision makers and teaching staff to think about alternative methods of formative assessment
    JISC RSC Scotland case study: xGames: Using Educational Computer Games for Group Learning, at Reid Kerr College
  • Games based assessment - SQA GamesSpace
    • SQA games based assessment platform
    • Provides games based assessments (NABs) for selected Skills for Work courses
    • Simulated 3D environment
    • Learners interact with characters and objects, perform tasks and carry out assessments
    • Results of the process are currently human marked
    Further information on the GamesSpace page on the SQA website
  • Framework for holistic assessment - Open Badges
  • Introducing e-Assessment
    • Start with low stakes formative assessment
    • Use it as an opportunity to innovate
    • Consider using devices and tools students are already using
    • Make assessment part of the learning process
  • References & useful links
    • Principles of effective assessment and feedback practice: David Nicol; Gibbs and Simpson, 2004; David Boud; Royce Sadler; D. Carless; McDowell et al., 2006; ESCAPE project, 2010
    • JISC RSC Scotland local case studies: http://www.rsc-scotland.ac.uk/case_studies/CaseStudies.htm
    • JISC RSC Scotland e-Assessment blog: http://rsce-assessment.blogspot.com/
    • JISC RSC Scotland e-Assessment magazine: http://www.scoop.it/t/e-assessment-in-fe-and-he
    • Effective Assessment in a Digital Age, JISC 2010: http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessment/digiassess.aspx
    • SQA eAssessment Resource: http://www.sqa.org.uk/files_ccc/eAssessmentResource.html
    • JISC TechDis e-Assessment Staff Pack: http://staffpacks.jisctechdis.ac.uk/Staff%20Packs/E-Assessment/index.xml
    Icons by http://dryicons.com