Overview of HeLF Electronic Management of Assessment Survey Results 2013

Transcript

  • 1. Dr Barbara Newland, Brighton; Lindsey Martin, Edge Hill; Alice Bird, Liverpool John Moores
  • 2.  Electronic Management of Assessment (EMA) includes a wide range of activities, so the following definitions were used:
         eSubmission: electronic submission of an assignment
         eMarking: electronic marking (including offline marking, e.g. in Word)
         eFeedback: electronic feedback (i.e. text, audio, video but not hard copy)
         eReturn: electronic return of marks
  • 3.  To identify current practice with regard to Electronic Management of Assessment (EMA) in UK HE
      To gain a snapshot of the strategic overview, identifying key issues relating to strategic change, policies and practices
      To reflect on longitudinal developments from the findings of the 2011 and 2012 surveys.
  • 4.  A network of senior staff in institutions engaged in promoting, supporting and developing technology enhanced learning
      Over 138 nominated Heads from UK Higher Education institutions
      A regular programme of well-attended events
      Represents the interests of its members to various national bodies and agencies, including the Higher Education Academy and JISC
     www.helf.ac.uk
  • 5.  The survey was available to HeLF members, who were asked to respond with regard to their knowledge of their own institution
      The survey was available in March/April 2013 and took about 10 minutes to complete
      The questions were a mixture of closed multiple-choice and multiple-selection questions as well as open responses
      Participants were assured that all data collected in the survey would be held anonymously and securely
      No personal data was asked for or retained unless the participant indicated a willingness to participate in the follow-up activity
      The results are being analysed using quantitative and qualitative methods
      52 responses from HeLF members – a 38% response rate (see the arithmetic note below)
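
As an aside, the 38% figure is simple arithmetic; the minimal Python sketch below reproduces it, assuming the roughly 138 nominated Heads mentioned on slide 4 as the denominator (the exact membership count at the time of the survey is not stated).

    # Response-rate arithmetic for the figures quoted on slide 5.
    # Assumption: denominator taken as the ~138 nominated Heads cited on slide 4.
    responses = 52
    members = 138
    rate = responses / members
    print(f"Response rate: {rate:.0%}")  # prints "Response rate: 38%"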
  • 6.  More positive attitudes towards EMA and its normalisation within institutions
      Challenges in relation to buy-in, take-up and roll-out processes, functionality, service disruption and standardisation, and whether the latter is desirable and achievable.
  • 7. [Chart: eSubmission, eMarking and eFeedback – Yes/No responses]
  • 8. [Chart: eSubmission, eMarking and eFeedback – Yes/No/Don't know responses]
  • 9. [Chart: University-wide / Department-wide / Neither]
  • 10. [Chart: Turnitin (stand-alone), Turnitin (integrated into VLE), VLE, home grown, other]
  • 11. [Chart, by university-wide / some department-wide / individual academics: eSubmission is the only form of submission; eSubmission and hard copy printed by student; eSubmission and hard copy printed by department; eSubmission and hard copy printed by individual academic; eFeedback and student can choose to print hard copy of feedback; hard copy is the only form of submission]
  • 12. [Chart: eSubmission, eMarking and eFeedback – academic staff vs students]
  • 13. [Chart: Yes / No / Don't know]
  • 14.  Technical issues
        ◦ service interruptions and outages
        ◦ limited functionality and bugs with integration tools
       Institutional circumstances
        ◦ lack of university standardisation
        ◦ lack of institutional policy
        ◦ lack of consistency across Schools, which meant that the existing technology could not meet all the requirements of Schools
       Skills – staff skills gap
  • 15.  eSubmission is a high-stakes activity
       Concerns about the service being hosted by a third party, and also about quality standards
       Other challenges:
        ◦ unsatisfactory problem solving
        ◦ unannounced upgrades and maintenance downtimes during assignment submission periods
        ◦ service not being able to cope with large numbers of students
        ◦ lost assignments, although these were eventually recovered
  • 16.  Where Turnitin unavailability coincided with submission deadlines, one institutional representative reported that they had to issue advice to academics to extend deadlines.
       For another institution, individual staff made their own contingencies, for example paper submission and paper feedback.
       “Caused a lot of stress!”
  • 17. [Chart: Yes / No / Under consideration]
  • 18. [Chart: anonymous marking, double marking and external examiners – Yes/No/Don't know]
  • 19. [Chart: Positive / Negative / Neutral / Don't know – academic staff, administrative staff, students]
  • 20. [Chart: Positive / Negative / Neutral / Don't know – academic staff, administrative staff, students]
  • 21. [Chart: Positive / Negative / Neutral / Don't know]
  • 22.  ‘There has been a change in attitude towards eFeedback, with a number of members of staff recognising that they already do this in some form.’
       It is regarded as an urgent sector-wide agenda and a common student expectation, and for many institutions it is increasingly becoming embedded in departmental practice
  • 23.  Students
       Staff
       Senior management
       National agendas
       ‘Student use. Getting positive stories from staff and student users.’
       ‘Positive student feedback through module evaluations and staff, student consultative committees.’
  • 24.  Consultation with all stakeholders
       Leadership by senior management – buy-in and support
       Effective communication strategy – visible campaigns, online awareness-raising and support resources
       Departmental champions
       Digital literacy – the provision of training
       Support from central/departmental learning technologists
       Technical – robust, easy-to-use systems integrated with other central systems.
  • 25.  While mandatory policy has been a key driver in some institutions, a number of respondents suggest that inclusive policy and flexible processes work well
       ‘It depends very much on the culture of your organisation. Here a partnership model between central services, and School/faculty admin and academics tends to work. Top down imposition of systems is less successful. In other institutional cultures this may be different.’
  • 26.  ‘e-Submission is quite straight forward. However, we could have managed the process of e-marking and feedback better. Academic staff need a lot of time to come round to the idea if they are changing years of established practice.’
       ‘Never underestimate the effort involved with winning hearts and minds of colleagues.’
  • 27.  Assessment is mission critical – it determines whether students achieve, progress and gain awards (and at which level)
       All stakeholders in the assessment process are wary of making errors and need to be convinced of the merits of changing long-established practices
       Change in practice supported by technology is only worthwhile if it effectively enhances stakeholders’ experiences and/or improves the overall process workflow
  • 28.  ‘There is a general acceptance of e-submission policy, with fewer complaints, and anecdotal evidence points to greater academic participation in emarking and efeedback.’
       Usage has moved from individual early adopters to more widespread, formalised use. There is an increased appetite for standardisation across faculties. However, a very small proportion of respondents report no, little or slow-moving change in attitudes.
  • 29.  Dr Barbara Newland, b.a.newland@brighton.ac.uk
       Lindsey Martin, Lindsey.Martin@edgehill.ac.uk
       Alice Bird, A.Bird@ljmu.ac.uk
       Acknowledgement: Dr Rachel Masika, University of Brighton
