E Assessment Presentation Ver2 June 2008

This is a presentation to start the process of exploration and development of e-assessment.

Published in: Technology, Education

  1. Assessment has always had an ‘e’ in it, or: a quick look at (e-)assessment
  2. In the beginning we had… www.adelaide.edu.au
  3. http://www.outsider.co-uk.com/
  4. Interactive resources for musicians. This is a Palatine-funded project, with the support of City College Manchester, Vortex WebHosting and RealStrings.com.
     - Due to a server upgrade (to PHP 5), many of the old interactive tests on this site will no longer mark your answers correctly.
     - Pop music theory: help files and online tests supporting the LCM Popular Music Theory syllabus.
     - Comment on and discuss these resources.
     - KEYBOARD: a new project in development, with music-language exercises on a keyboard, auto-marking and data collection.
     - Study Songs: a Flash-based e-learning project to develop aural skills and musical analysis through a song.
     - Pop Music Theory exam, History section questions: your suggestions for essay content.
  5. Then we added self-assessment…
     - Diagnostic: to identify student needs and to determine prior or existing knowledge, as well as strengths and weaknesses. Diagnostic assessments often occur prior to another learning experience.
     - Self-assessment: to assist individual students to review their own performance, to assess their own learning, and to obtain feedback which can support future learning and reflection.
     - Formative assessment: to assist the student's learning by providing feedback during the learning process.
     - Summative assessment: designed to evaluate a student's overall knowledge and understanding of the material presented in the unit. These give a quantitative grade and may be set as assessed coursework or as part of an end-of-unit examination.
  6. http://ims.ode.state.oh.us/ODE/IMS/images/classroom_assessment_types.gif
  7. Guidelines for e-assessment. These guidelines apply to summative forms of e-assessment, although a department may also wish to consider the principles below as good practice for formative forms of e-assessment.
     Operational guidance:
     - Students should have had access to, and experience of, the exam format and the technology prior to the summative exam. Good practice would be to give students prior exposure to a formative exam with feedback.
     - The examination should be rehearsed in the same technical environment (although not necessarily with the same group of students) as the actual examination, to ensure that it is robust.
     - Invigilation will be appropriate to the type of e-assessment being used and will require technical invigilation where the server which delivers the assessment is located. This may require prior liaison and agreement with Information Services staff. Invigilators should be fully briefed prior to the assessment.
     - The summative exam should only be accessible by secure password, and performance should be recorded by university-approved secure management tools suited to the purpose.
     - Computers used for summative exams should, wherever possible, have both internet and communication tools disabled, except as needed for the purpose of the assessment.
     - Reasonable and appropriate adjustments should be made for students whose disability would put them at a disadvantage due to the format of the exam. Students who have made a case for special arrangements must be offered an alternative to the computer-based exam, e.g. a paper-based assessment.
     - The use of a large pool of exam questions, from which a randomised sub-sample is generated to produce individual student exams, is acceptable so long as the pool covers all aspects of the examinable material and the sub-sample is representative. The appropriateness of using a pool of questions depends on the subject-specific content of the questions and how they were designed.
     www.ltss.bris.ac.uk
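The last guideline above, drawing a randomised but representative sub-sample from a question pool, can be sketched in a few lines. The pool contents, topic tags and function name below are hypothetical illustrations, not part of any real university exam system:

```python
import random

# Hypothetical question pool: each question is tagged with the topic it examines.
question_pool = [
    {"id": 1, "topic": "formative", "text": "Define formative assessment."},
    {"id": 2, "topic": "formative", "text": "Give one benefit of immediate feedback."},
    {"id": 3, "topic": "summative", "text": "When is a summative exam taken?"},
    {"id": 4, "topic": "summative", "text": "Name one security requirement for e-exams."},
    {"id": 5, "topic": "diagnostic", "text": "What is diagnostic assessment for?"},
    {"id": 6, "topic": "diagnostic", "text": "When does diagnostic assessment occur?"},
]

def build_exam(pool, per_topic, seed=None):
    """Draw a randomised sub-sample with the same number of questions per
    topic, so every generated exam covers all the examinable material."""
    rng = random.Random(seed)
    exam = []
    for topic in sorted({q["topic"] for q in pool}):
        candidates = [q for q in pool if q["topic"] == topic]
        exam.extend(rng.sample(candidates, per_topic))
    rng.shuffle(exam)  # vary question order between students too
    return exam

exam = build_exam(question_pool, per_topic=1, seed=42)
```

Sampling per topic rather than from the whole pool at once is what keeps each generated exam representative, as the guideline requires.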
  8. Title: Feasibility study of implementing e-assessment: lessons to be learned. Authors: John Andresen (School of Chemical & Environmental Engineering), Dragos Axinte (School of Mechanical, Materials & Manufacturing Engineering and Management), David Hann (School of Mechanical, Materials & Manufacturing Engineering and Management), Mark Haw (School of Chemical & Environmental Engineering), Karen Steel (School of Chemical & Environmental Engineering), Katy Voisey (School of Mechanical, Materials & Manufacturing Engineering and Management). Keyword: Assessment. The project evaluates the suitability of existing e-teaching environments to fulfil students’ learning needs, academics’ requirements for versatile teaching and assessment tools, and the “sine qua non” constraints of teaching engineering subjects. Because they deal with precise quantities, engineering subjects are, in theory, well suited to e-assessment and e-learning methods. E-assessment automates marking, reducing the marking load on academics. Moreover, e-assessment enables monitoring of large numbers of students while providing academics with the feedback needed to evaluate students’ progress and what they need in order to understand a module better.
  9. What are the advantages of e-assessment?
     - Richer assessment experience: questions can be made clearer and more detailed through the use of text, sound and video, which can aid motivation. For example, e-portfolios allow digital video, animations, presentations and so on to be submitted electronically for assessment, which is impossible in a ‘paper world’!
     - Increased flexibility: assessment can be provided at a greater range of locations, so assessment on demand becomes achievable. This allows learners to formally demonstrate their understanding at a time and place that is convenient for them. For example, e-tests can be taken (under test conditions) in locations as diverse as a community hall or a formal e-testing centre.
     - Instant feedback: results are often available within minutes of taking an e-test, along with diagnostic information on a learner's performance highlighting areas that can be improved.
     - Reduced administration burden: fewer paper forms to complete, no posting of test papers, no printing and posting of candidates' portfolios, and no storage space needed!
     - It's what candidates want!
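As a minimal sketch of the "instant feedback" point above, the following marks an objective e-test and returns a diagnostic report the moment the learner submits. The answer key, topic tags and report format are invented for illustration, not taken from any real e-testing product:

```python
# Hypothetical answer key for a four-question objective test,
# with each question tagged by the topic it assesses.
answer_key = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
topics = {"q1": "theory", "q2": "theory", "q3": "aural", "q4": "aural"}

def mark(responses):
    """Mark the test and return an instant diagnostic report:
    total score plus the topics the learner should improve on."""
    per_topic = {}
    for q, correct in answer_key.items():
        topic = topics[q]
        right, total = per_topic.get(topic, (0, 0))
        got_it = responses.get(q) == correct
        per_topic[topic] = (right + int(got_it), total + 1)
    score = sum(right for right, _ in per_topic.values())
    weak = sorted(t for t, (right, n) in per_topic.items() if right < n)
    return {"score": score, "out_of": len(answer_key), "improve": weak}

# One wrong answer on a 'theory' question flags that topic for revision.
report = mark({"q1": "b", "q2": "a", "q3": "a", "q4": "c"})
# report["score"] == 3 and report["improve"] == ["theory"]
```

The per-topic breakdown is what turns a bare score into the diagnostic feedback the slide describes.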
  10. Embedding e-Assessment and e-Portfolios to support Learning and Teaching. Amanda Black, Stuart Jones
  11. e-Portfolio components:
     - e-Portfolio management tool: organise your own learning and collaborate with others.
     - Showcase: present evidence to others.
     - Personal learning space: repository/storage.
     - Virtual learning environment: learning platform providing content and support.
     - Transcript file: ROA/LAR.
  12. Why e-assessment for learning?
     - Why ‘e’? Assessment embedded into resources, immediate feedback, and the ability to record progress and ‘distance travelled’ in a format accessible to more than just the DfES, LEA or teacher. Peer-group assessment. ICT enables learners to engage in, and assess progress on, authentic tasks.
     - The importance of the e-portfolio process: a clear progression of work supporting reflection, analysis and learning transactions, and the portability of elements of work and progress across a lifetime, not just a school year.
  13. http://eassessmentinwales.jiscinvolve.org/about/
     - The University of Wales Institute, Cardiff (UWIC) has long held strong partnerships with colleges in the South East Wales region. As on many courses nationwide across the FE and HE sectors, retention has been identified as a key issue, as has the requirement for effective and timely student feedback. Work has begun to identify possible critical moments when some sort of intervention might make a large positive difference to the student experience. One such intervention might well be e-assessment in the form of formative feedback, for instance.
     - The HE Academy has recognised the degree to which e-assessment could help support universities and colleges in addressing these issues, recognising also the need for effective curriculum design if assessment of any sort is to remain a beneficial part of the learning and teaching process.
  14. And finally…
  15. Computers now marking free-text responses better than humans
     - March 6, 2008 at 4:34 pm · Filed under OU VLE, eAssessment
     - Sally Jordan gave a workshop today on how to use the Intelligent Assessment Technologies system we’ve got plugged into our VLE to develop short text-response questions. One example she gave (reworded slightly):
     - A raindrop falls vertically with constant speed. What can you tell from this about the forces acting on the raindrop?
     - The answer, which I could just about recall from my Higher Physics, is that they’re equal and opposite. You can enter this in all sorts of ways, with misspellings, synonyms and a variety of grammatical structures. Merely enter ‘equal’ and you’ll be given another chance, with feedback saying you’ve only got it partially correct. Sally’s trials show that students are marked accurately by the computer 97% of the time.
     - In a study to ascertain the effectiveness of this technology, responses from students on our introductory science course (S103) to seven questions were marked by the Intelligent Assessment system and by six tutors. There were variations among the tutors in the marking of four of these questions, and some tutors disagreed with the question author as to the correct response.
     - It was no surprise that the computer marked more consistently than the tutors overall, and more in line with the question author. What was surprising was the number of misunderstandings, slips and inconsistencies among the human markers.
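To make the partial-credit behaviour described above concrete, here is a deliberately naive keyword matcher for the raindrop question. It is an illustration only; the Intelligent Assessment Technologies system uses far more sophisticated linguistic matching, and the synonym sets below are my own guesses:

```python
import re

# Assumed synonym sets for the two concepts the answer must contain.
EQUAL = {"equal", "equivalent", "balanced", "balance", "eqal"}  # tolerate a misspelling
OPPOSITE = {"opposite", "opposing", "opposed"}

def mark_free_text(answer):
    """Return a (verdict, feedback) pair: 'correct', 'partial' or 'incorrect'."""
    words = set(re.findall(r"[a-z]+", answer.lower()))
    has_equal = bool(words & EQUAL)
    has_opposite = bool(words & OPPOSITE)
    if has_equal and has_opposite:
        return "correct", "The forces are equal and opposite, so they cancel."
    if has_equal or has_opposite:
        # Only one concept present: partial credit plus another chance.
        return "partial", "Say how the forces compare in both size and direction."
    return "incorrect", "Think about the net force on something moving at constant speed."
```

Answering "the forces are equal and opposite" is marked correct, while answering only "they are equal" gets the partial-credit feedback, mirroring the second-chance behaviour the post describes.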
