
Badging Study - within a Graduate Course


This presentation addresses the findings of an action research study on the use of badging within a graduate course. The course itself studied the theory behind, and the educational use of, emerging technologies. Here you will see how the students responded to reviewing the work of their peers (in an anonymous manner).

Published in: Education, Technology

  1. Badging
     Fall Academic Conference – 2013, Empire State College
     Amy McQuigge (amy.mcquigge@esc.edu)
     Eileen O'Connor (eileen.oconnor@esc.edu)
  2. For the research section of the presentation: EXPLORING BADGING FOR PEER REVIEW, EXTENDED LEARNING AND EVALUATION, AND REFLECTIVE/CRITICAL FEEDBACK WITHIN AN ONLINE GRADUATE COURSE
  3. Research questions
     • Can the badging process itself extend the learning in lateral ways, that is, engage students beyond the specific learning outcomes intended within the course?
     • Can the use of peer review and badges within the course itself serve as an example of an emerging conceptual framework for learning, evaluation, and motivation becoming available through the advances possible with web-based interfaces?
     • Can the peer review process strengthen student connections?
     • Will the process of peer review (towards the generation of badges) be used by the students in a diligent and thoughtful manner?
  4. Methodology
     • Students created 4 web-based projects in 4 modules of the course.
     • In each module, students reviewed each web-based artifact against the criteria.
     • They voted anonymously (although with their names identified to the instructor).
     • Votes were collected through a Google Form and analyzed via an Excel pivot table.
     • Data on the voting outcome, using the detailed categories, was sent to the students at the end of each module.
     • Aggregated data (averaging ratings over the course on all 4 artifacts) was used to assign a final badge that was delivered via a virtual meeting at the end of the semester.
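The slide describes the pipeline only at a high level; the authors' actual tooling was a Google Form plus an Excel pivot table. As a minimal illustrative sketch, the same aggregation could be scripted in Python. The CSV filename and the column names "author" and "score" are hypothetical, and rounding the course-long average to the nearest badge level is only one plausible reading of how the final badge was assigned:

```python
# A minimal sketch of the aggregation step, assuming the Google Form
# responses were exported to CSV with hypothetical columns "author"
# (the student whose artifact was reviewed) and "score" (1-4).
import pandas as pd

BADGES = {1: "No Go", 2: "Pewter", 3: "Silver", 4: "Gold"}

def assign_final_badges(csv_path: str) -> pd.DataFrame:
    votes = pd.read_csv(csv_path)
    # Mirror the Excel pivot table: average every vote each student
    # received across all four artifacts over the semester.
    averages = votes.groupby("author")["score"].mean()
    # Assumption: round the course-long average onto the four badge levels.
    final = averages.round().astype(int).map(BADGES)
    return final.rename("final_badge").reset_index()

print(assign_final_badges("badge_votes.csv"))  # hypothetical export file
```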
  5. Different rubrics & evaluations, aligned with the different purposes

     Criterion | % of grade | Points*
     Creates an attractive YouTube that communicates your desired intention in a clear, non-rambling way | 40 | 10
     Has a clear central purpose that is evident in the emphasis within the video and is shown in a bulleted way within the video at some point | 15 | 10
     Uses basic editing skills with cutting, transitions, and titles/captions perhaps | 25 | 10
     Posts the YouTube using either a public or unlisted link (NOT a private link) | 5 | 10
     Puts a link to, or embeds in, your website (optional) | NA | NA
     Reviews and comments on several colleagues' YouTubes | 5 | 10
     Posts within the required time | 10 | 10
     Total points possible | 100 | 100

     *Points scale: 0 = no evidence, 1 = little evidence . . . 10 = excellent. A comments column records the actual points earned.

     Badge levels:
     • No Go (1) – Won't even make the grade for the assignment's minimum criteria
     • Pewter (2) – Minimally acceptable for the assignment but nothing noteworthy in this aspect
     • Silver (3) – Interesting & useful; solid display of expertise on this criterion
     • Gold (4) – Wow, I am learning and taking notes here; a great job; I'll have my friends visit here too
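The slide leaves implicit how the "% of grade" and "Points" columns combine. One plausible reading, offered purely as an assumption, is that each criterion is rated 0-10 and then weighted by its share of the grade, so a perfect submission totals 100. The criterion labels below are abbreviated:

```python
# A sketch of one plausible scoring scheme for the rubric above; the
# slide does not state how the two columns combine, so this weighting
# is an assumption, not the authors' documented method.
WEIGHTS = {
    "attractive, clear YouTube": 40,
    "clear central purpose": 15,
    "basic editing skills": 25,
    "public or unlisted link": 5,
    "reviews colleagues' YouTubes": 5,
    "posts within required time": 10,
}

def weighted_score(ratings: dict[str, int]) -> float:
    """ratings: criterion -> 0-10 rating (0 = no evidence ... 10 = excellent)."""
    return sum(WEIGHTS[c] * ratings[c] / 10 for c in WEIGHTS)

# Example: a straight 8 on every criterion earns 80 of 100 points.
print(weighted_score({c: 8 for c in WEIGHTS}))
```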
  6. How do the evaluations by instructor & peer compare?

     Student | Instructor ranking | Peer ranking
     1 | 5 (lowest performer) | 4
     2 | 2 | 1 (highest performer)
     3 | 2 | 3
     4 | 4 | 6
     5 | 3 | 1
     6 | 2 | 7 (lowest performer A)
     7 | 4 | 3
     8 | 1 (highest performer) | 3
     9 | 3 | 7 (lowest performer B)

     Instructor ranking reflects the final grade for each student; peer ranking is the average over all the evaluation criteria.
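The slide presents the two rankings side by side without a summary statistic. One way to quantify their (dis)agreement, offered here only as an illustration and not as part of the study's own analysis, is a Spearman rank correlation over the two columns:

```python
# Spearman rank correlation between the two rankings shown above
# (1 = highest performer in both columns). Illustrative only; the
# study itself reports no correlation statistic.
from scipy.stats import spearmanr

instructor = [5, 2, 2, 4, 3, 2, 4, 1, 3]  # final-grade ranks, Students 1-9
peer = [4, 1, 3, 6, 1, 7, 3, 3, 7]        # average peer-evaluation ranks

rho, p_value = spearmanr(instructor, peer)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
```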
  7. Total votes cast by badge category

     Badge category | Number of votes | % in this category
     1 – No Go | 16 | 4%
     2 – Pewter | 43 | 16%
     3 – Silver | 190 | 40%
     4 – Gold | 181 | 42%

     Skewed towards the high end, but still some differentiation & discrimination.
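A tally like the one above is straightforward to reproduce from raw vote records. This sketch assumes the votes are available as a plain list of 1-4 scores, which the slide does not specify:

```python
# Tally raw 1-4 vote scores by badge category and report each
# category's share; the input format is an assumption.
from collections import Counter

LEVELS = [(1, "No Go"), (2, "Pewter"), (3, "Silver"), (4, "Gold")]

def tally_by_badge(scores: list[int]) -> None:
    counts = Counter(scores)
    total = len(scores)
    for level, name in LEVELS:
        n = counts.get(level, 0)
        print(f"{level} - {name}: {n} votes ({n / total:.0%})")

tally_by_badge([4, 3, 3, 2, 4, 4, 1, 3])  # toy data, not the study's votes
```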
  8. Student rating of peers – a different data representation

     Student | Total votes received | Average vote score | Lowest vote received | Highest vote received | Std dev
     Std1 | 57 | 3.2 | 1 | 4 | 0.9
     Std2 | 43 | 3.6 | 2 | 4 | 0.6
     Std3 | 62 | 3.3 | 2 | 4 | 0.7
     Std4 | 44 | 3.0 | 1 | 4 | 0.9
     Std5 | 34 | 3.6 | 3 | 4 | 0.5
     Std6 | 59 | 3.2 | 1 | 4 | 0.8
     Std7 | 53 | 3.5 | 3 | 4 | 0.5
     Std8 | 43 | 3.3 | 1 | 4 | 0.7
     Std9 | 35 | 2.5 | 1 | 4 | 0.9
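The per-student summary above can likewise be derived directly from the raw votes. A sketch, assuming votes are grouped by recipient and using the sample standard deviation (the slide does not say which variant was reported):

```python
# Per-student descriptive statistics for received votes, mirroring the
# columns in the table above; the input structure is assumed.
from statistics import mean, stdev

def summarize(votes_by_student: dict[str, list[int]]) -> None:
    for student, votes in votes_by_student.items():
        print(f"{student}: n={len(votes)} avg={mean(votes):.1f} "
              f"min={min(votes)} max={max(votes)} sd={stdev(votes):.1f}")

summarize({"Std1": [3, 4, 2, 4, 3], "Std2": [4, 4, 3]})  # toy data
```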
  9. Tone of the student comments
     The students' comments were generally specific, sincere, and helpful (whether positive or negative in tone). A sample of a few comments illustrates the general tenor:
     • In a YouTube comment: "Amazing! Visual, visual, visual! No matter what type of learner, there was something in the video for them. Made me want to go out and learn an instrument. [Name of student removed] is a natural speaker too. Her voice was soothing, relaxed and real."
     • "I enjoyed the last part and how it tied all the concepts together. Seeing how learning is fun and the effect of learning a second language is positive."
     • In a Facebook comment, useful and kindly written: "A little text editing, I wouldn't start the About section with 'This is a Page about.' I would start, 'Exploring emerging.'"
     • Not all comments were positive, but they were all supportive: "A bit long and somewhat repetitive, but would certainly appeal to its intended audience." Or, "I liked the 'woman on the street' segments. The sound was a little off, I liked the concepts!"
  10. Sampling of some positive comments
     Students also often specifically indicated what they had learned that encouraged them to expand their own work in the future.
     • "Nice use of video plug in - I did it too after her example"
     • "Where in SL did you find the keyboard? The address or a way to find it would be useful for your viewers who might want to try it out."
     • "I loved the pics, pet education links, and the therapy dog link. I needed this info for my dog."
     • "I learned something from viewing it about MY presentation! Totally clear what the environment is."
     • "I also liked the wallpaper. (I tried to find that and couldn't)"
     • "Made me want to go out and learn an instrument"
     • "I will try and follow the instructions that were detailed here. Thanks"
     The range, specificity, and expanded learning suggested by the comments indicate a dedicated, invigorated group of students who are willing to support their colleagues.
  11. Could participating in the badging process itself extend the learning in these lateral ways?
     • The very different rankings in the overall web-artifact evaluations suggest that different dimensions were considered;
     • On average, 20% more votes than required were cast; overall, a wide range of different artifacts was considered;
     • The optional comments were rich and often indicated direct learning from colleagues.
     Together, these suggest that learning was happening beyond the course constraints.
  12. Giving evidence of connectedness
     • There were many forms of connection in the course (discussion boards, virtual synchronous meetings, virtual field trips, presentations in Second Life, shared video work via YouTube), so it is difficult to ascribe connectedness to the badge reviews alone, but . . .
     • Ongoing concern for colleagues seemed evident:
       • Students remembered colleagues' audiences, giving specific references: "I enjoyed the last part and how it tied all the concepts together. You see how learning is fun and the effect of learning a second language is positive."
       • Students took the extra time to suggest specific improvements; 15 comments, about 20% of the comments offered, focused on concrete and specific recommendations for improvements;
       • More than half of the voting scenarios included specific, useful, critical yet supportive comments (which were shared anonymously with students).
  13. Scaffolding a conceptual framework for an emerging concept
     • As Finklestein (2012) reported during a webinar on how instructors could "leverage digital badges to build ongoing relationships with learners, foster engagement, and acknowledge learning that often goes unrewarded or unrecognized," the process of engagement itself within the course modeled the application of badging.
     • Students did not simply read about badging; they reviewed colleagues' work, voted on different criteria, and extended recommendations and comments. Furthermore, they observed who received the awards and on what dimensions the awards were received.
  14. Commitment & diligence
     • 20% additional voting represented an extra time commitment;
     • Despite the high-end voting, the comments show more discrimination;
     • Votes were reasonably close to colleagues' votes on the same student.
  15. Implications & cautions
     • The process of piloting badges as a peer recognition of achievement that goes beyond course objectives appears to be of sufficient value to continue to improve, refine, and re-evaluate effective ways to embed badging in future courses.
     • However, embedding a non-instructor evaluation within a course could risk the safety, security, and sense of fairness that students develop within the course.
     • Instructors must be very clear that the peer review is detached from any grading consequences, should that be the case; or, if the peer review is to factor into the course evaluation, they must explain the role of the badges and peer review within the intended course schema.
