Improving Learner Performance through Time-Sensitive Feedback: Design-Based Research of Benefits and Issues of Video Annotation Software

E-Learn 2015 Presentation

  1. Improving Learner Performance through Time-Sensitive Feedback: Design-Based Research of Benefits and Issues of Video Annotation Software
     Michael C. Johnson, Ph.D., BYU Center for Teaching & Learning, mc_johnson@byu.edu
     Barbara Smith, Associate Clinical Professor, Department of Counseling Psychology and Special Education, BYU
  2. What Is Video Annotation?
  3. Definition
     A tool that allows you to annotate (i.e., add notes, comments, explanations, or telestrations to) a video-recorded performance, with annotations tied to specific moments in the performance.
  4. Features
     • Synchronous and asynchronous commenting/critiquing
     • Timeline-specific/linked annotations*
     • Text
     • Audio
     • Video
     • Drawing
     • Replies to comments
     • Tagging (or marking)
     • Classifying comments
     • Counting specific types of behaviors, etc.
     • Ratings
     • Video overlays
     • Links
     • Emphasizing elements on screen
     • Drawing (telestrator, a la John Madden)
     • Rubrics
     • Prompt video
     • Facilitate a timely workflow (organizing recordings, due dates, notifications, communications, etc.)
     • Roles and user management
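Most of these features hinge on the same underlying construct: an annotation anchored to a specific moment (or span) in the recording's timeline. Purely as a hypothetical sketch of what such a record might hold (this is not the data model of any particular tool; all names here are made up for illustration):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Annotation:
    """One timeline-linked annotation on a recorded performance (hypothetical schema)."""
    video_id: str                  # which recording the note belongs to
    start_seconds: float           # moment in the timeline the note refers to
    end_seconds: Optional[float]   # optional end of a time span (None = single point)
    author: str                    # instructor, peer, or the performer (self-review)
    kind: str                      # e.g., "text", "audio", "drawing", "rating"
    body: str = ""                 # comment text, or a pointer to an audio/video reply
    tags: list[str] = field(default_factory=list)       # for classifying/counting behaviors
    rating: Optional[int] = None   # optional rubric score
    replies: list["Annotation"] = field(default_factory=list)  # threaded follow-up comments

# Example: a peer flags the moment 90 seconds into a recorded mini-lesson
note = Annotation(
    video_id="mini-lesson-04",
    start_seconds=90.0,
    end_seconds=95.0,
    author="peer-12",
    kind="text",
    body="Nice transition; try holding eye contact with the back row here.",
    tags=["delivery", "eye contact"],
)
```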
  5. Example Tools
  6. VideoAnt
  7. Videonot.es
  8. Open Video Annotation Project
  9. VATIC
  10. YouTube
  11. GoReact
  12. VoiceThread (image from: https://ic.arc.losrios.edu/~itc/institutes/OTI/Session7/voicethread.htm)
  13. Viddler
  14. Coach’s Eye (image from coachseye.com)
  15. Cases
      • Supervision of Student Teachers
      • Business Presentation Skills
      • Interviewing Skills
      • Teaching (Mini-lessons)
      • Negotiation and Counseling
      • Dance
  16. Methods
      • Qualitative Methods
      • Phenomenology (van Manen, 1990, 2002)
      • Phenomenography (Micari et al., 2007)
      • Design Research (Middleton et al., 2008)
  17. Reported Benefits
      • Facilitates contextualized feedback
      • Students can see themselves (and others) from a new perspective
      • Facilitates higher levels of Bloom’s taxonomy
      • Became more detailed as raters/reviewers
        • Increased self-reflection
        • As reviewers, they saw what did and didn’t work for others
        • Improved critical analysis
  18. Reported Benefits
      • Facilitated higher levels of performance
        • Encouraged students to practice/prepare more
        • Really tried to reach the audience
        • Cared more
      • Convenience and flexibility (especially in the asynchronous student-teaching cases, but also simply in having students record their performances outside of class)
  19. Lessons Learned
  20. Lessons Learned #1
      • Instructors gave feedback without any feed-forward (coaching)
      • Remind people/build it into the process
      • Teach them how to use the tool to do the coaching part!
      • Or pull the coaching part out of the tool
  21. Lessons Learned #2
      • Learners needed opportunities to implement feedback; it needs to be formative (although it can be used as a tool for summative assessment, too)
      • Give timely feedback so students can act on it
      • Allow for resubmission so they can improve
      • Allow new opportunities to use the skills being learned
  22. Lessons Learned #3
      • Some peer feedback was of varying quality (overly critical, contradictory, generally unhelpful; anonymity allowed some students to be hurtful). Self-assessment could also stand to improve
      • Teach learners how to evaluate
      • Provide a rubric or other clear criteria
      • Demonstrate
      • Practice in low-stakes situations
      • Hold common judgment sessions
  23. Lessons Learned #4
      • Created some pressure for students to be recorded...
      • Give learners opportunities to practice in low-stakes contexts (this can be within a course or across a program)
      • Or allow do-overs
  24. Lessons Learned #5
      • With live critiquing, both the audience and the presenter (performer) can be distracted (in fact, trying to rate and comment at the same time is too much for most)
      • Assign roles: critic, rater, participant, etc.
  25. Lessons Learned #6
      • Workflow can be difficult (especially in asynchronous implementations)
      • If the tool doesn’t facilitate it, you’ve got to plan for it!
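One low-tech way to do that planning, offered here only as an illustration (the stages, dates, and roles are invented), is to write the review cycle down as explicit stages with due dates and notification targets, whether in a spreadsheet or a few lines of code:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Stage:
    name: str          # what happens at this step of the review cycle
    due: date          # deadline, so feedback stays timely and actionable
    notify: list[str]  # who should get a reminder at this step

# Hypothetical asynchronous review cycle for one recorded performance
workflow = [
    Stage("Record and upload the performance", date(2015, 10, 5), ["student"]),
    Stage("Self-review with annotations", date(2015, 10, 7), ["student"]),
    Stage("Peer and instructor annotations", date(2015, 10, 10), ["peers", "instructor"]),
    Stage("Revise and resubmit", date(2015, 10, 17), ["student"]),
]

for stage in workflow:
    print(f"{stage.due:%b %d}: {stage.name} (notify: {', '.join(stage.notify)})")
```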
  26. Lessons Learned #7
      • Technology can get in the way…
      • Plan
      • Train
      • Provide resources (e.g., we provided checkout of iPads)
  27. Lessons Learned #8
      • Technological infrastructure can also be an issue
      • Investigate
      • Test! Test! Test!
  28. Questions?
  29. References
      Micari, M., Light, G., Calkin, S., & Streitwieser, B. (2007). Assessment beyond evaluation: Phenomenography in educational evaluation. American Journal of Evaluation, 28(4), 458-476.
      Middleton, J., Gorard, S., Taylor, C., & Bannan-Ritland, B. (2008). The “Compleat” Design Experiment. In A. E. Kelly, R. A. Lesh, & J. Y. Baek (Eds.), Handbook of Design Research Methods in Education (pp. 21-46). New York: Routledge.
      van Manen, M. (1990). Researching lived experience: A human science for action sensitive pedagogy. Albany, NY: State University of New York Press.
      van Manen, M. (2002). Phenomenology online. Retrieved December 23, 2006, from http://www.phenomenologyonline.com/
