Beef up your backchat: using audience response systems to assess student learning


Presentation at WILU 2014 at Western University. Describes use of web-based audience response systems for formative assessment during information literacy sessions.

  • So this is Mentimeter, an example of an audience-response system.
  • A couple more definitions before we get into the good stuff.
    As of 2009, there were 26 different labels in use for this technology.
  • Audience response systems grew out of the idea that there are always conversations between audience members – conversations that have grown exponentially via social media. An ARS can channel that sideways conversation and make it a three-way conversation: between the speaker and the audience, and between audience members.
  • One final definition: what do we mean by assessment?
    For ARS, which is “in the moment”, we’re talking about formative assessment: a check-in that allows instructors to see how students are doing with the material
    Formative assessment can be done throughout an instruction session or at the end
    In contrast, summative assessment concludes a course of study or instruction sessions and tests cumulative knowledge

    So now let’s look at some examples – in and out of libraries
  • Poll Everywhere: NUSC 1P10 > Questions/comments about Library stuff so far
  • See “For WILU” poll
  • PEKN 1P93 Spring – What are some features you can use to focus your search?
  • PEKN 1P93 – Class 1, How confident do you feel … No. 2
  • TopHat grew out of the UWaterloo Accelerator Centre in 2010 > home base Toronto > now has offices in San Francisco & Chicago
    -more than 100,000 students @ 350 universities (Harvard, UWaterloo, UofT)
    -Used with SPMA 2P21
  • I would love to use this tool again: it was worth marks, so students paid attention; has sophisticated features but is fairly easy to use + it was already integrated into the class
    A 2013 article from Queen’s about TopHat in an engineering class with 1000+ students:
    MTCU says “must always be a free alternative for any mandatory course component” so had to give opt-out students paper form each lecture
    But: students didn’t like paying to participate in lectures
  • Students get tired and bored > ARS promotes engagement and can alleviate fatigue and boredom
    Good active learning activity for large, lecture-style classrooms that are impersonal and don’t facilitate groupwork
    Immediate feedback is more effective for student learning than delayed feedback; ARS enhances interaction as well as affective learning (e.g. confidence building) – which is usually rated low in large classes (Lin)
    Anonymity: Eliminates intimidation by peers and fear of exposure (Hoyt)
    Technology: Aligns with millennials’ preferred method of communication (Chan)
  • Technology:
    students may not have web-enabled devices or may not want to bring laptops to class (Hoyt);
    Students may be uncomfortable with the technology
    or the software may not work
    Wi-Fi access can be problematic
    Security – what data is collected and where is it stored?

    Learning outcomes: Some studies indicated enhanced performance by students using the technology (Schackow et al., 2004; Pradhan et al., 2005; Holmes et al., 2006; Caldwell, 2007; Mayer et al., 2009), while other investigators found no change (Duggan et al., 2007; Martyn, 2007).
    The University of the Pacific (Chan & Knight), comparing clicker-based to paper-based assessment, found that students actually achieved better learning outcomes in library instruction sessions that didn’t use clickers:
    Paper assessment allows students to self-regulate and pace themselves, lets them see all questions from start to finish, and lets them review and correct their answers – more control over time regulation
    A study of students using an in-house ARS for anatomy courses found mixed results – did not dramatically enhance exam performance overall but did enhance exam performance for students in lower percentile of class (Hoyt)

    Inappropriate behavior: can be an inadvertent forum for nasty comments (SPMA)

    Question design: can be challenging and adds another layer to class prep
  • “The move from classroom response systems to classroom engagement systems to systems that bring the evaluation and learning outside the classroom is a radical change … that we are excitedly but cautiously exploring”. (Bazylak)

    1. Beef up your backchat: Using audience response systems to assess student learning. Elizabeth Yates, Liaison/Scholarly Communication Librarian, Brock University. WILU 2014, May 22, 2014
    2. Imagine IL classes like this! Image source: Giphy
    3. Instead of this. Image source: Giphy
    4. Learning outcomes. Participants will recall: • characteristics of audience response systems and how they are used in PSE for formative assessment • “best practice” strategies for incorporating audience response systems into library instruction sessions
    5. Poll time! Have you used audience response systems in library instruction? Hands up OR please go to: and enter 35 51 92 to vote
    6. Audience response systems • Think “clickers without the hardware” • Instructors ask questions and students respond using web-based software, which collects and displays their answers • Can be used with desktop computers & mobile devices • Some allow texting. Also called: ((open-ended OR student OR classroom OR personal) AND response systems) OR web-based polling OR audience response technology … etc.
    7. What’s a backchannel? • “… the unofficial channel for the class, consisting of interactions among the audience, or perhaps with those outside the class.” (Aagard, Bowen & Olesova, 2010) • “… the ongoing, co-constructed, meta-content discussion that can accompany live demonstrations of nearly any type.” (Higdon, Reyerson & McFadden, 2011)
    8. Formative assessment: • Provides immediate, ongoing feedback • Allows instructors to improve their teaching • Allows students to identify strengths and weaknesses and target areas that need work (Carnegie Mellon Eberly Centre, 2013)
    9. Top ARS tools
       • Poll Everywhere – Cost: free (paid plans for larger audiences; texting charges may apply). Platform: browser, texting, Twitter. Question types: multiple choice, short answer. User limit: 40. Download results: yes.
       • Mentimeter – Cost: free. Platform: browser. Question types: multiple choice. User limit: none. Download results: only with premium.
       • Socrative – Cost: free (paid plans for larger audiences). Platform: browser or app. Question types: single answer or quizzes (multiple choice, true/false, short answer). User limit: 50. Download results: yes.
    10. Some Blooming* examples. Knowledge = remembering: • Start session by asking students to recall material covered previously • Mid-lesson check-in • End session by asking students to recall info covered that day. *Based on Bloom’s taxonomy of learning
    11. Comprehension. Understanding facts: • Use text polls to discuss a question, e.g. is this a credible source? (small class) • Use multiple choice to classify, e.g. what are acceptable scholarly info sources • What’s still unclear?
    12. Application • Ask students to discover features of a database and share via poll • Ask text-based questions > students can collaborate, write a paragraph and then post via poll; students can see & discuss each other’s work
    13. Analysis • Ask students to identify database search filters and answer via multiple choice or share findings via text-based answer • Compare two websites and vote for the most credible source
    14. Synthesis • Small class, text answers: – Ask students individually to create search strategies with keywords and search operators & post via ARS – As a group, evaluate the search strategies
    15. Evaluate • Evaluation for instruction sessions or student self-evaluation > multiple choice or text-based • Quiz comparing info resources, e.g. Google Scholar vs. SuperSearch
    16. A little different: • Free for profs; students pay $20/semester or $38/5 years • Create questions or discussions • More question types, e.g. matching, sorting, word answer • Assignments and quizzes • Includes gradebook • Some LMS integration
    17. Top Hat interface
    18. Question menu
    19. We’re in the home stretch. Any burning questions?
    20. Pros: Interactive • Fun • Engages students • Multiple question types • Anonymous • Immediate • Increases focus • Boosts participation • Incorporates technology
    21. Cons: Students dislike monitoring • Tricky to craft questions
    22. Best practices • Be clear: explain how the tool works, why you are using it, and what they need to do (Aagard) • Ensure it is used constructively • Ensure everyone has access • Align ARS with instructional design > don’t just throw it in for “fun” (Dennis)
    23. Think+pair+share 1. Think of how you could use ARS in your instruction sessions (1 min) 2. Pair up (1 min) 3. Share your ideas!
    24. Think, pair & share feedback • Must be really comfortable getting students on the system – text or verbal • Where are the students? Use ARS to help understand the audience • Icebreakers • Interactivity in a really big class • Always have a non-electronic option • Inappropriate or silly answers – strategy: spin this in your favour
    25. Question design. “Ideal questions for ARS are challenging enough that students need to carefully select their response, but also accessible enough that a student can select a response within a few minutes.” – Abate, Gomes & Linton (2011)
    26. Question design, part 2. Effective questions: • Address a specific learning goal • Uncover misconceptions • Explore ideas in a new context • Elicit a wide range of responses – Kay & LeSage, 2009
    27. Tips, tricks & next steps • Ask questions at 20-min intervals • Be sure to hide answers until you’re ready for the whole class to view • Test, test & triple-test. Watch for: ARS with social media, multimedia, gamification, e.g. Course Peer (classroom response > classroom engagement)
    28. Summing up • ARS are great for student engagement, active learning, formative assessment – can align with Bloom’s taxonomy • Care needed to craft questions • Important to clearly define use/rules • Mixed evidence on learning outcomes. Questions or comments?
    29. References
        Aagard, H., Bowen, K., & Olesova, L. (2010). Hotseat: Opening the backchannel in large lectures. Educause Quarterly, 33(3), 2.
        Abate, L. E., Gomes, A., & Linton, A. (2011). Engaging students in active learning: Use of a blog and audience response system. Medical Reference Services Quarterly, 30(1), 12–18. doi:10.1080/02763869.2011.540206
        Bazylak, J., McCahan, S., Weiss, P. E., & Anderson, P. (2013). Take out your cell phones – class is starting – revisited. Proceedings of the Canadian Engineering Education Association.
        Carnegie Mellon Eberly Centre. (2013). What is the difference between formative and summative assessment? Retrieved April 22, 2013.
        Chan, E. K., & Knight, L. A. (2010). Clicking with your audience. Communications in Information Literacy, 4(2), 192–201.
        Connor, E. (2011). Using cases and clickers in library instruction: Designed for science undergraduates. Science & Technology Libraries, 30(3), 244–253. doi:10.1080/0194262X.2011.592787
        Deleo, P. A., Eichenholtz, S., & Sosin, A. A. (2009). Bridging the information literacy gap with clickers. Journal of Academic Librarianship, 35(5), 438–444.
    30. References (continued)
        Dennis, M. R., Murphey, R. M., & Rogers, K. (2011). Assessing information literacy comprehension in first-year students. Practical Academic Librarianship: The International Journal of the SLA, 1(1), 1–15.
        EDUCAUSE Learning Initiative. (2011). 7 things you should know about open-ended response systems. Retrieved April 4, 2012.
        Eva, N., & Nicholson, H. (2011). DO get technical! Using technology in library instruction. WILU 2011, Regina, SK. Partnership: The Canadian Journal of Library & Information Practice & Research, 6(2), 1–9.
        Gewirtz, S. (2012). Make your library instruction interactive with Poll Everywhere: An alternative to audience response systems. College & Research Libraries News, 73(7), 400–403.
        Higdon, J., Reyerson, K., & McFadden, C. (2011). Twitter, Wordle, and ChimeIn as student response pedagogies. EDUCAUSE Quarterly. Retrieved April 17, 2013.
        Hoppenfeld, J. (2012). Keeping students engaged with web-based polling in the library instruction session. Library Hi Tech, 30(2), 235–252.
        Hoyt, A., McNulty, J. A., Gruener, G., Chandrasekhar, A., Espiritu, B., Ensminger, D., … Naheedy, R. (2010). An audience response system may influence student performance on anatomy examination questions. Anatomical Sciences Education, 3(6), 295–299.
        Kay, R., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education, 53(3), 819–827.
        Lin, J., & Rivera-Sanchez, M. (2012). Testing the information technology continuance model on a mandatory SMS-based student response system. Communication Education, 61(2), 89–110.
        Liu, F. C., Gettig, J. P., & Fjortoft, N. (2010). Impact of a student response system on short- and long-term learning in a drug literature evaluation course. American Journal of Pharmaceutical Education, 74(1).
        Trew, J. L., & Nelsen, J. L. (2012). Getting the most out of audience response systems: Predicting student reactions. Learning, Media and Technology, 37(4), 379–394. doi:10.1080/17439884.2011.621957