
Evolution of an Assessment Program


Presented at AABIG, June 10, 2016. Designing an instruction program is a big task, but one of the most important, and often neglected, components is a robust assessment plan. In this short presentation, I will share how librarians at Jack Tarver Library, Mercer University, designed an assessment cycle that includes multiple methods to collect feedback from faculty and students alike, and which covers both session-level and program-level assessment. I’ll take a look back at where we were four years ago, share how we got to our current plan, and speculate on what might happen in the future. Along the way, attendees will learn about some of our mistakes and successes, and why we think an intentional assessment plan is an essential part of any instruction program.

Published in: Education

  1. Amy Gratz, Instructional Services Librarian, Mercer University, Jack Tarver Library
  2. Spring/Summer 2012 • Looked back at previous assessment tools • Student Feedback Form • Peer Observations
  3. Spring/Summer 2012  Created new student form  Continued peer observations
     Your Class Level (circle one): First-year / Sophomore / Junior / Senior / Graduate Student / Other
     1. Please rate the amount of hands-on work time allotted in this session (circle one): Too Much / About Right / Too Little
     2. Please rate your confidence with using the tools demonstrated today: Confident / Somewhat Confident / Not Very Confident
     3. How helpful will the skills and concepts discussed today be for your assignment (check one)?
        □ Absolutely essential – I could not complete the assignment without them
        □ Useful – they will make the assignment easier
        □ Somewhat helpful – some of the skills/concepts will be helpful, but not all
        □ Barely helpful – most of the skills/concepts I already knew or won’t use
        □ Useless – I already knew everything, or the skills/concepts won’t help with my assignment
     4. What concept from today’s class are you still working to understand?
     5. What skills/concepts discussed today were already familiar to you?
     6. What didn’t work well about today’s session, and how can we improve? Other comments?
  4. Fall 2012  Began article about assessment at Tarver Library [1]  Delved into past practices at Tarver  Started in 2002  Primarily student feedback  Learned about best practices from the literature. [1] For a full overview and references to specific articles and other literature: Gratz, A. and Olson, L. T. (2014). doi: 10.1080/10691316.2013.829371
  5. Best Practices for Assessment  Must be tied to Library and University goals (Gilchrist & Oakleaf, 2012. An Essential Partner: The Librarian’s Role in Student Learning Assessment.)  Should be done on multiple levels (Radcliff, et al., 2007. A Practical Guide to Information Literacy Assessment for Academic Librarians. Westport, Conn.: Libraries Unlimited.)  Should use multiple methods (Tancheva, Andrews, & Steinhart, 2007. Library instruction assessment in academic libraries. Public Services Quarterly, 3(1/2), 29-56.) Also recommended: Instruction & Program Design Through Assessment by Gilchrist and Zald, 2008
  6. Problem with Attitude Surveys: “At most, it provides information about how the student perceives the librarian’s presentation… What [it] has not provided is any indication of whether the student participants have actually learned anything.” Colborn, N. W. & Cordell, R. M. (1998). Moving from subjective to objective assessments of your instruction program. Reference Services Review, 26(3/4), 125-137. doi: 10.1108/00907329810307821
  7. 2013-2014: Designing the Program. Mission Statement: The Tarver Library Instruction Program supports the mission and curricula of Mercer University by teaching the information literacy skills essential for creating well-researched papers, presentations, and other projects, empowering all of our community members in their academic, professional, and personal lives.
  8. 2013-2014: Designing the Program  Student Learning Outcomes (SLOs)  Upon degree completion students will:  Determine the nature and extent of the information needed.  Access needed information effectively and efficiently.  Evaluate information and its sources critically and investigate differing viewpoints.  Understand various economic and social issues surrounding the use of information, and access and use information ethically.
  9. 2014-2015: Creating SLOs  Focused on our most commonly taught courses  Decided not to develop any for subject-area classes
  10. Scaffolded SLOs
  11. Selected Finalized SLOs  INT 101 Instruction  Students identify appropriate academic sources  Students generate an individual list of applicable key search terms  Students access and use multidisciplinary resources to locate information
  12. Summer 2015  Needed to get colleagues more interested in assessment  Adapted an instruction activity for use in a meeting
  13. Instruction Program Assessment. “Ultimately, the goal of all instruction and assessment efforts is to engage in reflective practice” (Oakleaf). How frequently should we use each assessment method?  Assign each method you’ve been given to one of the different cycles posted  Color-coded: Green = students; Yellow = faculty; Black = internal  Add your own if desired!
  14. Instruction Program Assessment Cycle, 4-Year Rotation
     2015-2016. Full Year: Student in-class survey; Faculty in-class survey; Pilot pre/post tests. Fall Only: Student end-of-semester survey; Faculty end-of-semester survey. Spring Only: Preceptor focus groups.
     2016-2017. Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students). Fall Only: Student end-of-semester survey; Faculty end-of-semester survey. Spring Only: Faculty interviews/focus group.
     2017-2018. Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students); HEDS Annual Survey; Peer observations. Fall Only: Faculty end-of-semester survey.
     2018-2019. Full Year: Student in-class survey; Faculty in-class survey; Pre/post tests (students). Fall Only: Student end-of-semester survey; Faculty end-of-semester survey. Spring Only: Student focus groups.
  15. 2015-2016 Successes  Student in-class survey  Faculty end-of-semester survey  Focus group with preceptors
  16. 2015-2016 Challenges  Faculty in-class survey  Pre/post tests  Student end-of-semester survey
  17. Closing the Loop  Internal feedback  Informal sharing with faculty colleagues
  18. Images Used
     Slides 1 and 20: Mills. (2014). Tree of life.
     Slide 2: Mibby23. (2013). Looking back.
     Slide 3: Cornwall, N. (2013). New life.
     Slide 4: Accheri, C. (2014). Ta Prohm.
     Slide 5: Mennerich, D. (2013). Bikaner IND – cenotaphs devikund sagar 04.
     Slide 6: Jutte, T. (2013). National Geographic, Ter Apel Monastery, Groningen, Netherlands – 1551.
     Slide 7: Tazewell, C. (2007). Spiral.
     Slide 8: Delp, J. (1999). Mount Katahdin, Maine.
     Slide 9: kc ma. (2016). Seedling.
     Slide 10: Holzman, L. Ladder.
     Slide 11: bambe1964. (2011). Scaffold.
     Slide 12: Kleinfield, A. (2008). Girl runs up San Francisco’s 16th Avenue tiled steps.
     Slide 13: JogiBaer2. (2011). Post-It.
     Slide 14: chaseiv57. (2012). 6172004275_b8dcba694d_b-1.
     Slide 15: Cheng, B. (2014). Light tunnel.
     Slide 16: Scott, G. (2007). Positive thoughts, Mr. Glen.
     Slide 17: Eric. (2005). Frustration.
     Slide 18: when_night_falls. (2013). Loop.
     Slide 19: Clark, T. (2013). The open road.