Perceived Value of Peer and Instructor Performance Assessment Using Video Annotation: AECT 2011

A report on research into the use of a new video annotation/rating tool in business communication courses to help assess presentation and interviewing skills. The tool supports peer and instructor/TA feedback on a student's performance, including continuous rating during the performance. I describe the perceived effects of using this technology on the students' learning experiences.

Slide notes:
  • The primary source of data was a series of in-depth interviews; the instructor of the course and fourteen students who completed it were the primary data sources. Where possible, we also used the course materials (including the assigned readings) and the students' assignments as secondary sources of data, and we conducted brief follow-up interviews through e-mail conversations. We conducted several thematic analyses of the data, using a combination of holistic, selective, and detailed approaches (van Manen, 1990; 2002). The secondary forms of data were used to obtain a richer view of the themes arising from the interview data, for negative case analysis, and to elicit discussion in follow-up interviews. As part of the analysis we wrote brief summaries of the themes that emerged from each individual participant's data as well as of collective themes, and we conducted the conversations about the themes primarily through written correspondence. We also discussed the findings with other researchers and instructional designers, in what van Manen (1990) terms collaborative analysis.
  • In this case, students developed a passion for what they learned, beyond just memorizing material for a grade and moving on.
  • Rather than just pleasing the teacher.
Transcript:

    1. PERCEIVED VALUE OF PEER AND INSTRUCTOR PERFORMANCE ASSESSMENT USING VIDEO ANNOTATION. Michael C. Johnson, BYU Center for Teaching & Learning (4103.C0.b)
    2. Background: The Technology. Used the online video annotation tool REACT. Live commenting and rating of peer/student performance: raters rate throughout the performance and at the end, and provide timeline-specific comments plus general comments at the end. Feedback is immediately available. (A hypothetical sketch of this kind of feedback data model appears after the transcript.)
    3. Background: The Case. An undergraduate business communications course; a group presentation (end of semester); the only time REACT was used in the semester; 250+ students; 15+ instructors.
    4. Background: The Case. The tool was in beta, so some technical difficulties were experienced. The classes used the tool only for this one assignment. Comments from peers were anonymous (instructor comments were identified).
    5. Methods. Classroom observations of the presentations and of the rating/commenting; a survey of students (100+ responses); a survey of instructors (12 responses); thematic analysis (van Manen, 1990; 2002) taking a phenomenographic approach (Micari et al., 2007).
    6. Issues/Questions. How did students and instructors perceive: the effects of the commenting/rating process on students as presenters; the effects of the commenting/rating process on students as raters; and the usefulness of the comments/ratings students received from their peers and instructors?
    7. Students as presenters
    8. Perceived Effects on Presenters. No real difference: "It was normal, I didn't really pay any attention to the fact that people were rating us." "I didn't think much about it. I actually forgot they were rating me. It was a good experience."
    9. Perceived Effects on Presenters. Encouraged them to practice/prepare more: "We might have been a little nervous being filmed but I think it helped us be prepared." [emphasis added]
    10. Perceived Effects on Presenters. Changed the way they presented, trying to really reach their audience: "I think it improved our presentation. We tried to make our presentation relate to the people watching as much as possible so they would enjoy themselves."
    11. Perceived Effects on Presenters. Tried harder/cared more: "I wanted to do better because I knew class could rate every small thing I did." "Being rated made me care more about giving a good presentation."
    12. Perceived Effects on Presenters. The audience was distracted, which made it harder to keep their attention: "I could tell some people were typing instead of listening."
    13. Perceived Effects on Presenters. As presenters, they were distracted by the audience's laptops and rating activities: "It was a tad distracting because everyone was typing things while we were talking."
    14. Perceived Effects on Presenters. Created additional pressure: "Nerve raking" [sic], "Stressful," "Intimidating."
    15. Effects on Presenters (Instructor Perspective). Raised the level of expectations for their performance; students liked receiving feedback; being rated and recorded caused some student anxiety.
    16. Students as raters
    17. Perceived Effects on Raters. Paid closer attention, watched for details: "I learned more because I was paying more attention and being critical."
    18. Perceived Effects on Raters. Distracted them: "Rating others I do feel like took away from watching the presentation."
    19. Perceived Effects on Raters. They saw examples of what good and bad performance looked like: "I liked rating others because I was able to see what did and did not work for other groups and then apply that to my own presentation." "It allowed me to compare what good presenters did or didn't do as opposed to poor presenters."
    20. Perceived Effects on Raters. Critically analyzed performance: "By rating others, I looked at what they did well, and in return thought about how I would act if I was the one presenting. It helped me learn how to better engage the audience by using, or not using the techniques used by those I critiqued."
    21. Perceived Effects on Raters. They became overly critical: "I saw bad posture, I saw lots of 'ums,' and I also saw some very effective use of PowerPoint and other presenting skills. By looking for them, I found them." "I WAS SUPER CRITICAL AND LOOKED FOR ERRORS."
    22. Perceived Effects on Raters. Didn't make a difference: "I don't think it makes that much difference, because I am an amateur so I only know like 4 or 5 things to look for, so I focus on those."
    23. Perceived Effects on Raters (Instructor Perspective). Helped students focus on the elements of good presentations; students liked giving feedback; students (as the audience) were distracted by having to rate.
    24. Usefulness of comments
    25. Usefulness of Comments/Ratings. Helpful to see what is going well and what needs to be improved: "They picked up things that I would have never noticed with my experience doing it alone. They said nice things and constructive things which was also appreciated."
    26. Usefulness of Comments/Ratings. Helpful to see what you were doing at the moment a comment refers to (contextualized, specific comments): "VERY USEFUL. The comments were more specific than if they had just written comments at the end of the presentation. I can see what I was doing when they made a certain comment."
    27. Usefulness of Comments/Ratings. Instructor comments perceived as more valuable by some students: "My teacher's comments were the most helpful - some of the other comments were unclear or contradictory." (However, other students felt that the peer comments added more.)
    28. Usefulness of Comments/Ratings. Student comments perceived as biased/overly critical/based on personal opinion: "There were still several comments that were clearly personal opinions, but that will always happen with peer reviews." "BIASED AND OVER ANALYTICAL."
    29. Usefulness of Comments/Ratings. Student comments perceived by some as unclear: "It was very useful and good to know, but the comments were at times vague."
    30. Usefulness of Comments/Ratings. Comments were at times contradictory: "I think it was useful, but not so much when contradicting information was given. For example, one person could say that a transition was great, while someone else can say it was not good."
    31. Anonymity of Peer Comments. Some students felt this helped them and/or their peers be more open and honest; others felt that it allowed their peers to be more critical and mean-spirited, or even to purposefully bring their grades down. Note: comments, average ratings, and overall ratings were not anonymous to the instructor.
    32. Comment Type Preference. The vast majority of students preferred the timeline comments (specific, actionable); a few preferred the general comments at the end (an overall feel for how things went); a few others just liked reviewing the video of their performance; and a couple liked the ratings better than the comments.
    33. Use of Comments. Students and faculty report that students have used the feedback to improve subsequent individual presentations, and that students intend to use the feedback they received.
    34. Some Implications for Practice? Give students clear information about the criteria for the performance; give students more low-stakes practice in front of cameras and peers, as needed; train students how to rate (based on the criteria) and how to write clearly; provide more low-stakes practice as raters; hold "common judgment" sessions.
    35. Some Implications for Practice? Help students with the analysis and application of the comments they receive; provide opportunities to perform again and to use feedback to improve.
    36. References
       Micari, M., Light, G., Calkins, S., & Streitwieser, B. (2007). Assessment beyond performance: Phenomenography in educational evaluation. American Journal of Evaluation, 28(4), 458-476.
       van Manen, M. (1990). Researching lived experience: A human science for action sensitive pedagogy. Albany, NY: State University of New York Press.
       van Manen, M. (2002). Phenomenology Online. Retrieved December 23, 2006, from http://www.phenomenologyonline.com/
    37. Contact Information. Michael C. Johnson: email mc_johnson@byu.edu, Twitter @michaelcjohnson. BYU Center for Teaching & Learning: website http://ctl.byu.edu, Facebook http://www.facebook.com/byuctl, Twitter @byuctl.
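
Slide 2 describes REACT's feedback mechanics only at a high level: continuous ratings during a performance, timeline-anchored comments, and general comments and an overall rating at the end. The tool's actual schema and API are not described in the slides, so the following is a minimal, hypothetical TypeScript sketch of what one rater's feedback record could look like; every name here (PerformanceFeedback, RatingSample, averageScore, the 1-5 scale) is an assumption for illustration, not REACT's real interface.

    interface TimelineComment {
      timeSec: number;    // point in the recording the comment refers to
      text: string;
      anonymous: boolean; // peer comments were anonymous; instructor comments were identified
    }

    interface RatingSample {
      timeSec: number; // raters could rate throughout the performance
      score: number;   // e.g., on a 1-5 scale (the scale is an assumption)
    }

    interface PerformanceFeedback {
      raterId: string;                     // visible to the instructor, hidden from peers
      samples: RatingSample[];             // continuous ratings given during the performance
      finalScore: number;                  // overall rating given at the end
      timelineComments: TimelineComment[]; // comments anchored to the video timeline
      generalComment: string;              // summary comment given at the end
    }

    // One plausible way to derive an "average" rating from the continuous samples;
    // falls back to the end-of-performance rating when no samples were given.
    function averageScore(fb: PerformanceFeedback): number {
      if (fb.samples.length === 0) return fb.finalScore;
      const total = fb.samples.reduce((sum, s) => sum + s.score, 0);
      return total / fb.samples.length;
    }

Averaging the continuous samples is only one plausible design: slide 31 mentions both "average" and "overall" ratings without specifying how the two relate, so a real implementation might weight samples by duration or keep the two scores entirely separate.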
