The perils and joys of using video for assessment feedback
1. The perils and joys of using video for assessment feedback
Mandy Honeyman
m.honeyman@open.ac.uk
2. The reason
Would you like it if 92%* of your students
engaged with your feedback?
(*from Turner & West, 2013)
3. The possibilities?
Would you also like it if your students “strongly
agreed that your feedback was personal,
supportive, clear, constructive and prompted
reflection”?
(Henderson & Phillips, 2015)
4. Background and context
Trying it out on my own students
- Level 1 & 2 technology students
- Majority only taught at a distance
- Differing educational backgrounds (OU)
6. Putting it into action
Choosing some software to use:
- Jing. Pros: evaluated by several studies, easy to use. Cons: no video, SWF format only.
- Windows Movie Maker. Pros: evaluated by one study, easy to use. Cons: no screencasting.
- Collaaj. Pros: very easy, video and screencast together. Cons: cost (free only up to 2 minutes).
- YouTube. Pros: quite easy (record via Hangouts), well known, video or screencast. Cons: perceptions of privacy.
- Others. Cons: difficult to use, prone to crashing, or commercial.
7. First steps
Based on JISC ASSET study
- generalised feedback
- one video for entire group
- created own video platform
(Crook et al, 2012)
10. Result
- ongoing
- however, anything that helps students
engage with feedback is useful
- generic feedback is easy to produce
- individual feedback is time-consuming
11. References
Crook, A., Mauchline, A., Maw, S., Lawson, C., Drinkwater, R., Lundqvist, K., Orsmond, P., Gomez, S. and Park, J. (2012) The use of video technology for providing feedback to students: Can it enhance the feedback experience for staff and students?, Computers & Education, 58(1), pp. 386–396.
Henderson, M. and Phillips, M. (2015) Video-based feedback on student assessment: scarily personal, Australasian Journal of Educational Technology, 31(1), pp. 51–66.
Turner, W. and West, J. (2013) Assessment for ‘Digital First Language’ Speakers: Online Video Assessment and Feedback in Higher Education, International Journal of Teaching and Learning in Higher Education, 25(3), pp. 288–296.
For further information, discussion and updates please see blog.edtechs.info
Editor's Notes
This is an ongoing action research project; it is unfunded, it is not part of any larger project or effort, and it is self-motivated.
Will Turner and John West at Edith Cowan University elicited a total of 90 responses to two questionnaires given to students participating in their screencasting and video (SCV) feedback study, which used Camtasia software. These students were enrolled on a Bachelor of Education course, and I think it is very possible that their engagement with education itself could explain their overwhelmingly positive response to SCV feedback. Nevertheless, 75% reported that they spent more time on review, 92% believed it would enhance future work, 92% believed it was more valuable than written feedback, and 87% preferred video feedback.
I thought that this was a pretty compelling set of figures and worth exploring further.
Michael Henderson and Michael Phillips at Monash University found this response from a group of students they worked with over a year. This study produced a sensible and useful framework for individual video feedback.
Again, I thought they were really onto something. Although the number of students they were dealing with was quite small, it was still representative of many assessment group sizes. The biggest issue I have with their research is that their students, again, were education faculty students.
Last year I wrote a paper for my masters called “An evaluation of the use of screencasting and video for assessment feedback in online higher education”. While writing it, I thought that there was scope for trying things out with my students. I wanted to see if using video for individual feedback was a realistic prospect within my own context. I particularly wanted to see if ordinary students, ones who are not educators, would be as responsive.
Having decided that this seemed a good idea, I set up some rules:
I knew that I didn’t want to spend hours recording videos; I didn’t want to re-record them, edit them, or write a script. I just wanted to be able to record almost off-the-cuff feedback on a specific aspect of a student’s work after having already gone through their paper.
If the idea of doing this was to take off, I thought it was pretty important not to make it difficult.
For monitoring purposes I continued to write comments, but, more importantly, I wanted the video to focus in on a target for the student: something I felt would help that individual student really make progress.
But the main thing I had discovered was that the papers I had read were based on either screencasting or on video; they didn’t use both, and I thought that was an omission. So I looked at the different options available to me, i.e. those that were free! I am writing up this particular part of the project in my blog.
When looking at this subject it is very difficult to avoid seeing references to this study from Reading, Leeds and Plymouth universities. They found that 80% of students liked getting the video feedback, but because the study was also piloting its own distribution platform, it perhaps inevitably came up against technical problems.
Of course, privacy is a real issue, but with the multiple channels already available I didn’t think that building a bespoke platform was necessary; to me, this study demonstrated that concentrating on creating a technology to solve a problem causes its own unnecessary problems when the technology already exists.
I did first try out generic feedback, and I ended up using Google Hangouts: easy to record, and the result can be used as a preparation tutorial for the next cohort.
The next challenge was to choose who to give this type of feedback to: who would benefit from this type of delivery, and who might pay more attention to a video as well as to typed feedback.
I could only guess, because there are so few studies to inform us:
So I chose students who were really struggling; it didn’t matter what they were struggling with (programming perhaps or study skills), they just had to be struggling.
I created a video for one student concentrating on a piece of writing that just didn’t make sense, and I explained my thought process as I went through one paragraph on the screen. This student responded very well; he has continued the dialogue via email.
I also created some short videos that demonstrated and talked through programming problems. Other than writing problems, this is the most common issue I have to help students with.
In one case, I used a student’s own work, but because I didn’t name her in the video itself, I was able to reuse the video later with other students having the same problem.
I decided that the privacy issue on Collaaj was pretty much dispensed with because the filenames were computer generated nonsense and the videos were not displayed anywhere other than in my own account listing.
However, the main problem with Collaaj was that although I sent out the link with the student’s feedback forms and in an email, I had no way of knowing whether or not anyone had watched it (without asking them later on). I asked via email but, as usual, getting responses out of my students was like getting blood out of a stone. Those students who replied were also those who came to tutorials, who would start a dialogue by email after an assignment and who, to be honest, were not my target group. So I still didn’t know anything useful. My intuition was telling me that it was useful, but that was all I had to go on.
This became frustrating, because I was spending enough time on this project to need some feedback myself.
The reason individual feedback is more time-consuming is the decision-making process one has to go through first. The production itself, provided one sticks to one’s own rules, is not so difficult, but simply making the decision to do it takes time.
On that basis it would be interesting to see a study like this taking place over a larger group of teachers, with a much larger group of students across different faculties.
Is this suitable for ongoing action research? Yes, absolutely, but those of us doing this kind of self-motivated research need more support, encouragement and, dare I say it, recognition for our efforts.