Going Beyond Anecdotes: Assessing Student Learning During Reference Transactions

Bonnie Swoger, Science and Technology Librarian at Milne Library, SUNY Geneseo
Jun. 15, 2012


Editor's Notes

  1. Kim: Pose the question – when students come to the reference desk, do they learn how to use Boolean operators? How do you know that? "I taught it to them," "I saw them use it," etc. Most of this is anecdotal evidence. You have to prove it.
  2. Bonnie: But we can't trust our gut. Your abdomen doesn't know anything about what students are actually learning; only the students know that. Anecdotes do not equal evidence – ask your Provost. So we need to collect evidence.
  3. Bonnie: For decades, data about reference questions have been gathered as tick marks. We recorded the time of the question and perhaps how long it took to answer. The tick marks could (possibly) tell us when to staff the reference desk, but not much more.
  4. Bonnie: In recent years, some libraries have switched to digital data collection – LibStats, LibAnalytics, Gimlet, Google Forms, etc. These methods give us much more information about the patron and the question (including the answer the librarian gave). If your library has made the switch, think back on that discussion – was it an easy one? Many were concerned that we wouldn't be able to record all of this information: if we can't remember to put a check mark down, will we be able to record the whole question? Some staff may have been resistant to the switch, concerned about workload and other issues.
  5. Kim: But it's still just a one-way conversation. All of our data come from the librarian's point of view; we don't have any feedback from the students. So when we asked you how you know what your students are learning, your responses came from information produced by librarians (LibStats) or gut feelings based on your own experiences. LibStats can tell us how often a database was used, how often Boolean (or other search techniques) were taught, and how often you helped a student understand peer review. But it can't tell you whether the student actually learned any of those things.
  6. Kim: The information collected by LibStats and similar programs doesn't really help us get at the factors that libraries are now being asked to evaluate. The recent report on "The Value of Academic Libraries" (if you haven't read it, go do that) suggests that libraries need to collect data that better aligns library services with campus goals, including the list up here.
  7. Kim: Ask the audience. Hopefully they will say student learning or student success.
  8. Kim: Our study is just one small (but useful) step toward getting at the students' point of view of the reference desk. There are some challenges (which we'll talk about), but we think this is a useful step toward assessing student learning at the reference desk.
  9. Bonnie: We asked students what they learned. Simple response forms were handed out after each reference transaction for two weeks – one in October and one in November – for all of our walk-in reference questions. We also handed out the forms at each scheduled reference consultation. We had boxes in the library lobby (near the reference desk and librarian offices) to collect forms, and staff at the circulation desk also collected them. We decided against an online form: we thought we'd get a better response rate if students could give us their immediate feedback, rather than going back to their desk, opening up their computer, etc.
  10. Bonnie: The response rate wasn't bad, but not spectacular, for multiple reasons. Librarians didn't always remember to hand out the form, so there might be some bias toward responses from students who talked to librarians who were better at remembering. We didn't ask students to fill out the form at the desk (we didn't want to appear to hover, giving them a bit of privacy), so we probably lost some there. We didn't annoy anyone by asking – even if students never filled out the form, asking them to do so didn't appear to affect our overall rapport with them (anecdotally). We don't have a count of how many forms were handed out vs. how many were filled in. We didn't run the survey with our IM reference questions, which now represent about 25–30% of our non-scheduled reference questions.
  11. Bonnie: [Picture of one of the responses.] Sometimes the results were really good – students were specific about what they learned and mentioned more than one item. Sometimes the results weren't that good – students said the transaction was helpful but didn't go into any detail. We typed it all into a Google spreadsheet. I thought the data entry would take longer than it did – it took me about an hour. (We had thought about getting students to do it, but that never worked out, so I just dove in.)
  12. Bonnie: History dominates (both course and major). As you can tell, our history librarian is pretty busy.
  13. Bonnie: Students by major: 60% of the time they were asking questions about courses in their major.
  14. For a different project (related to information literacy), we had worked on a list of the topics and skills we regularly teach. We refined that list into a neat controlled vocabulary and were then able to apply it here. We think there are longer-term benefits to using the same categories for evaluating learning at the reference desk and in the classroom. Although we haven't yet done so, it may be interesting to compare the skills we typically teach in the classroom to what students learn at the reference desk. In the future, we hope to use this list to categorize reference transactions (walk-ins and appointments) and classroom instruction. Responses could be assigned more than one category. One challenge: we haven't yet looked at our results with the purpose of revising this list.
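Applying a controlled vocabulary to free-text responses, with multiple categories allowed per response, can be sketched with simple keyword matching. The category names and keywords below are illustrative only – they are not the actual vocabulary the study used, which was coded by hand.

```python
# Hypothetical sketch of multi-category tagging against a small
# controlled vocabulary. Categories and keywords are made up for
# illustration; real coding of survey responses was done manually.
VOCAB = {
    "choosing appropriate databases": ["which database", "database to use"],
    "teaching a specific database": ["georef", "jstor"],
    "search terms": ["keyword", "boolean", "synonym"],
}

def tag_response(text):
    """Return every vocabulary category whose keywords appear in the text."""
    lowered = text.lower()
    return [category for category, keywords in VOCAB.items()
            if any(k in lowered for k in keywords)]

print(tag_response("I learned which database to use, like GeoRef."))
```

A response can match several categories at once, which mirrors the note above that responses could carry more than one category; in practice, hand-coding catches paraphrases that simple substring matching would miss.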
  15. Bonnie: The big takeaway for students is database choice. This came out when students said "I learned which database to use for my project" (choosing appropriate databases) and "I learned to use GeoRef for geology research" (teaching a specific database). Picking a database is a very big deal, and this differs from our original perceptions of how we spend our time at the reference desk. I tend to think I spend most of my time talking about search terms – narrower and broader terms, synonyms, etc. – but students aren't taking that away with them as much. Perhaps choosing a database is more concrete (at least for the single question they have) and therefore easier to remember (and learn) in a reference encounter. As for the types of things students say they are learning how to find, there are no big surprises: journal articles, books, primary sources (25% of our responses were related to history classes), and news articles to a lesser extent. No responses in our sample mentioned finding data, maps, government documents, etc. Use of IDS was one of the big things that came up in the "Other" category.
  16. Kim: Tone of student responses: (1) Very appreciative, sometimes to the point of not answering the question; several librarians were mentioned by name (e.g., Sue Ann). "Extremely helpful." "Immensely helpful." "Very helpful." "Huge help." (2) How easy research is now. (3) Categories of information that students thought were difficult to find (foreign-language materials, case studies, primary sources, etc.).
  17. Kim: Even though we were trying to get at student learning, the students couldn't help themselves, and many added comments about how helpful the librarian was ("5 stars!"). Rapport and the experience matter – we can see how appreciative students are. "Special thanks to the librarian for her excellent advise [sic] & patience assisting me in my search. Extremely helpful. Thank you." "I worked with a very patient librarian who appears to be very professional and well-experienced at what she does." "The librarian showed me plenty of useful resources online for my sociology project. Very helpful and informative."
  18. Bonnie: We would like to compare what students say they learned in a reference transaction with what the librarians think happened. While we have questions and answers for our reference questions throughout the semester, we can't connect these with individual student responses (we'd like to do that if we run this again). We could tag the questions and answers in LibStats, but we weren't asking the same question (what did the student learn?), and we're not sure whether looking at the data in aggregate will give us meaningful results – we are still exploring this. We didn't ask the librarians to report what they taught during each reference encounter, so we can't compare what the librarian thought they taught with what the student thought they learned, except in aggregate.
  19. We didn't actually test students' ability to do any of this. A student who said "research is easy now" might still have problems 24 hours later. The small number of survey responses makes this a largely qualitative study. If we do something similar in the future, we need to get better at handing out surveys and find a way to track how many students actually turned them in.
  20. Bonnie: Closing the loop. The delay in analyzing the data (for this presentation) means that many of our plans for closing the loop are still in progress. We haven't thought much about learning outcomes for our reference services: does the reference interview seek to establish learning outcomes for each transaction? Can you establish outcomes for the service as a whole? Broad goals could be almost meaningless; specific goals are difficult to assess. We could reach out to faculty to get them to include specific databases in project descriptions – and follow up with faculty who include databases we don't have or that aren't appropriate (like JSTOR?). A "database picker" – just the primary databases in a simple two-click web form – is just an idea at this point. Do we tend to gloss over picking a database in one-shots and library instruction? Can we end a class with reinforcing questions about which database to pick and why to choose it? (In library instruction and at the reference desk.) Library instruction could include a bit more practice in selecting keywords, which is much harder to teach. The data can be passed on to administrators: students self-report that they are in fact learning through reference services, which goes back to the initiatives asking libraries to prove their value ("The Value of Academic Libraries"). Those anecdotes are still useful – but now you can tell a story and have the data to back it up.