Going Beyond Anecdotes: Assessing Student Learning During Reference Transactions

By Bonnie Swoger and Kim Hoffman
At SUNY Geneseo, we wanted to know what students learned during reference transactions, beyond counts of reference questions or user satisfaction surveys. Building on library instruction assessment techniques, we gave students a survey after each reference transaction that simply asked, “What did you learn today from your meeting with the librarian?”

Usage Rights

CC Attribution-NonCommercial-ShareAlike License

Speaker Notes

  • Kim: Pose the question: when students come to the reference desk, do they learn how to use Boolean operators? How do you know that? “I taught it to them,” “I saw them use it,” and so on. Most of this is anecdotal evidence. You have to prove it.
  • Bonnie: But we can’t trust our gut. Your abdomen doesn’t know anything about what students are actually learning; only the students know that. Anecdotes do not equal evidence (ask your provost), so we need to collect evidence.
  • Bonnie: For decades, data about reference questions were gathered as tick marks. We recorded the time of the question, and perhaps how long it took to answer. The tick marks could (possibly) tell us when to staff the reference desk, but not much more.
  • Bonnie: In recent years some libraries have switched to digital data collection: LibStats, LibAnalytics, Gimlet, Google Forms, etc. These tools give us much more information about the patron and the question, including the answer the librarian gave. If your library has made the switch, think back on that discussion. Was it an easy one? Many were concerned that we wouldn’t be able to record all of this information: if we can’t remember to put a check mark down, will we be able to record the whole question? Some staff may have resisted the switch, concerned about workload and other issues.
  • Kim: But it’s still just a one-way conversation. All of our data come from the librarian’s point of view; we don’t have any feedback from the students. So when we asked how you know what your students are learning, your responses came from information produced by librarians (LibStats) or from gut feelings based on your own experiences. LibStats can tell us how often a database was used, how often Boolean (or other search techniques) were taught, or how often you helped a student understand peer review. But it can’t tell you whether the student actually learned any of those things.
  • Kim: The information collected by LibStats and similar programs doesn’t really get at the factors libraries are now being asked to evaluate. The recent report “The Value of Academic Libraries” (if you haven’t read it, go do that) suggests that libraries need to collect data that better align library services with campus goals, including the list up here.
  • Kim: Ask the audience. Hopefully they will say student learning or student success.
  • Kim: Our study is just one small (but useful) step toward getting at the students’ point of view of the reference desk. There are some challenges (which we’ll talk about), but we think it is a useful step toward assessing student learning at the reference desk.
  • Bonnie: We asked students what they learned. Simple response forms were handed out after each reference transaction during two weeks, one in October and one in November, for all of our walk-in reference questions. We also handed out the forms at each scheduled reference consultation. Boxes in the library lobby (near the reference desk and librarian offices) collected the forms, and folks at the circulation desk collected them as well. We decided against an online form: we thought we’d get a better response rate if students could give us immediate feedback, rather than going back to their desks, opening up their computers, etc.
  • Bonnie: The response rate wasn’t bad, but it wasn’t spectacular. There are multiple reasons. Librarians didn’t always remember to hand out the form, so there may be some bias toward responses from students who talked to librarians who were better at remembering. We didn’t ask students to fill out the form at the desk (we didn’t want to appear to hover, giving them a bit of privacy), so we probably lost some there. Asking didn’t seem to annoy anyone: even for students who never filled out a form, being asked didn’t appear to affect overall rapport (anecdotally). We don’t have a count of how many forms were handed out versus how many were filled in. And we didn’t run the survey with our IM reference questions, which now represent about 25-30% of our non-scheduled reference questions.
  • Bonnie: [Picture of one of the response forms.] Sometimes the results were really good: students were specific about what they learned and mentioned more than one item. Sometimes the results weren’t that good: students said the meeting was helpful but didn’t go into any detail. We typed it all into a Google spreadsheet. I thought it would take longer to type in than it did; it took me about an hour. (We had thought about getting students to do it, but that never worked out, so I just dove in.)
  • Bonnie: History dominates (both by course and by major). As you can tell, our history librarian is pretty busy.
  • Bonnie: Students by major: 60% of the time they were asking questions about courses in their major.
  • For a different project (related to information literacy), we came up with a list of topics we regularly taught and skills we covered. We refined that list into a neat controlled vocabulary and were then able to apply it here. We think there are longer-term benefits to using the same categories for evaluating learning at the reference desk and in the classroom. Although we haven’t yet done so, it may be interesting to compare the skills we typically teach in the classroom to what students learn at the reference desk. In the future, we hope to use this list to categorize reference transactions (walk-in and appointments) as well as classroom instruction. Responses could have more than one category (a sketch of this multi-label tallying appears after the presentation transcript below). One challenge: we haven’t yet looked at our results with the purpose of revising this list.
  • Bonnie: The big takeaway for students is database choice. This came out when students said things like “I learned which database to use for my project” (choosing appropriate databases) and “I learned to use GeoRef for geology research” (teaching a specific database). Picking a database is a very big deal, and this differs from our original perceptions of how we spend our time at the reference desk. I tend to think I spend most of my time talking about search terms (narrower and broader terms, synonyms, etc.), but students aren’t taking that away with them as much. Perhaps choosing a database is more concrete (at least for the single question they have) and therefore easier to remember (and learn) in a reference encounter. The types of things students say they are learning how to find hold no big surprises: journal articles, books, primary sources (25% of our responses were related to history classes), and, to a lesser extent, news articles. No responses in our sample mentioned finding data, maps, government documents, etc. Use of IDS was one of the big things that came up in the “Other” category.
  • Kim: The tone of student responses: (1) Very appreciative, sometimes to the point of not answering the question; several librarians were mentioned by name (Sue Ann, …). “Extremely helpful. Immensely helpful. Very helpful. Huge help.” (2) How easy research is now. (3) Categories of information students thought were difficult to find (foreign-language materials, case studies, primary sources, etc.).
  • Kim: Even though we were trying to get at student learning, the students couldn’t help themselves, and many added comments about how helpful the librarian was (“5 stars!”). Rapport and the experience matter; we can see how appreciative students are. “Special thanks to the librarian for her excellent advise [sic] & patience assisting me in my search. Extremely helpful. Thank you.” “I worked with a very patient librarian who appears to be very professional and well-experienced at what she does.” “The librarian showed me plenty of useful resources online for my sociology project. Very helpful and informative.”
  • Bonnie: We would like to compare what students say they learned in a reference transaction with what the librarians think happened. While we have questions and answers for our reference questions throughout the semester, we can’t connect these with individual student responses (we’d like to do that if we run this again). We could tag the questions and answers in LibStats, but we weren’t asking the same question (what did the student learn?), and we’re not sure whether looking at the data in aggregate will give us meaningful results; we are still exploring this. We didn’t ask the librarians to report what they taught during each reference encounter, so we can’t compare what the librarian thought they taught with what the student thought they learned, except in aggregate.
  • We didn’t actually test students’ ability to do any of this. A student who said “research is easy now” might still have problems 24 hours later. The small number of survey responses makes this a largely qualitative study. If we do something like this in the future, we need to get better at handing out the surveys and find a way to track how many students actually turned them in.
  • Bonnie: Closing the loop. The delay in analyzing the data (for this presentation) means that many of our plans for closing the loop are still in progress. We haven’t thought much about learning outcomes for our reference services: does the reference interview seek to establish learning outcomes for each transaction? Can you establish outcomes for the service as a whole? Broad goals could be almost meaningless; specific goals are difficult to assess. We want to reach out to faculty to get them to include specific databases in project descriptions, and to follow up with faculty who include databases we don’t have or that aren’t appropriate (like JSTOR?). A database picker, just the primary databases in a simple two-click web form, is another idea at this point (see the sketch just below these notes). Do we tend to gloss over picking a database in one-shots and library instruction? Can we end a class with reinforcing questions about which database to pick and why, both in library instruction and at the reference desk? Library instruction could include a bit more practice in selecting keywords, which is much harder to teach. The data can be passed on to administrators: students self-report that they are in fact learning through reference services, which speaks to the initiatives asking libraries to prove their value (The Value of Academic Libraries). Those anecdotes are still useful, but now you can tell a story and have the data to back it up.
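As a minimal sketch of that database-picker idea (which is only an idea at this point), the two-click flow reduces to a subject-to-database lookup. All subject-to-database pairings here are illustrative assumptions except GeoRef and JSTOR, which are the only databases named in the talk:

```python
# Minimal sketch of the proposed "database picker": click one picks a
# subject, click two follows a database link. The pairings below are
# hypothetical examples, not SUNY Geneseo's actual primary databases.

PRIMARY_DATABASES = {
    "Geology": ["GeoRef"],        # GeoRef is named in the talk
    "History": ["JSTOR"],         # assumed pairing, for illustration only
    "Sociology": ["SocINDEX"],    # assumed pairing, for illustration only
}

def pick_databases(subject: str) -> list[str]:
    """Return the primary databases for a subject; unknown subjects
    fall back to a referral rather than an empty list."""
    return PRIMARY_DATABASES.get(subject, ["Ask a librarian"])

print(pick_databases("Geology"))  # ['GeoRef']
```

A real picker would render this mapping as a simple web form, but the point is the deliberately small scope: primary databases only, two clicks.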

Presentation Transcript

  • Going Beyond Anecdotes: Assessing Student Learning During Reference Transactions. Bonnie Swoger, Kim Hoffman. SUNYLA 2012
  • What does this tell us? Not much
  • Student enrollment • Student learning outcomes • Student engagement • Student success • Graduation Rate • Retention/Efficiencies (getting students through faster) • Faculty teaching • External funding • Faculty research
  • In which of these areas do reference services have the most impact? 1. Student enrollment 2. Student learning outcomes 3. Student engagement 4. Student success 5. Graduation Rate 6. Retention/Efficiencies (getting students through faster) 7. Faculty teaching 8. External funding 9. Faculty research
  • A small step…
  • [Chart: Response rate. Survey responses vs. total questions for walk-in questions, scheduled appointments, and the combined total (y-axis 0-300); the three response rates shown are 20.08%, 20.36%, and 19.44%, all roughly 20%.]
  • [Chart: Responses by course (0-14 per course). History received by far the most responses; the other courses were Anthropology, English, Political Science, Psychology, Sociology, Communication, Education, French, Music, Biology, Geography, Humanities, Management, Physics, Spanish, and Special Education.]
  • [Chart: Responses by major (0-9 per major). History and English top the list; the other majors were Anthropology, Political Science, Business, International Relations, Biology, Communication, Psychology, Spanish, Special Education, Biochemistry, Chemistry, Economics, Education, French, Literacy, Math, Physics, graduate students, and undecided.]
  • Evaluation of sources (General)
    – Types of scholarly articles (including different types of research studies: qualitative/quantitative)
    – Peer review / Scholarly vs. Popular
    – Evaluating websites
    – Reading level
  • Mechanics of Searching (General)
    – Boolean and truncation
    – Synonyms (brainstorming)
    – Subject headings vs. keywords
    – Narrower and broader search
    – Limiting results (e.g., date-sensitive, in a foreign language)
    – Choosing appropriate databases
    – Choosing/narrowing a research topic
    – Citation tracking
    – Teaching a specific database [name database in notes]
  • Finding Specific Material Type (General)
    – Books
    – Government docs
    – Journal articles
    – News articles
    – Company/business info
    – Primary sources (historical)
    – Data
    – Maps
    – Finding (physically) a particular item
  • Citing Information (General)
    – Citation style
    – Plagiarism/paraphrasing
  • Presenting information (General)
  • Other (General)
  • [Chart: “What did you learn?” Percentage of responses in each category, 0-60%. Choosing appropriate databases tops the chart, followed by Journal Articles, Other (General), Teaching a specific database, Mechanics of Searching (General), Boolean and truncation, Narrower and broader search, Citation tracking, Primary sources (historical), Books, Evaluation of Sources (General), Collaborative Tools (Google Docs), Citation Style, Synonyms (brainstorming), News Articles, and Finding Specific Material Types (General).]
  • “I will find my information much more quickly now.” “How to research news articles from other countries.” “Extremely helpful. Immensely helpful. Very helpful. Huge help.”
  • “Special thanks to the librarian for her excellent advise [sic] & patience assisting me in my search. Extremely helpful. Thank you.” “I worked with a very patient librarian who appears to be very professional and well-experienced at what she does.” “The librarian showed me plenty of useful resources online for my sociology project. Very helpful and informative.”
  • Aligning the data?
  • Closing the loop, as a cycle: Establish learning outcomes → Gather evidence → Interpret evidence → Implement change.
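Because each survey response could be coded with more than one controlled-vocabulary category, the percentages in the “What did you learn?” chart are per-category shares of all responses and can sum to more than 100%. Here is a minimal sketch of that multi-label tallying; the sample responses are invented for illustration, while the category names come from the vocabulary slide above:

```python
from collections import Counter

# Invented sample data: each response carries one or more categories
# from the controlled vocabulary (multi-label coding).
coded_responses = [
    {"Choosing appropriate databases", "Journal Articles"},
    {"Teaching a specific database"},
    {"Choosing appropriate databases", "Boolean and truncation"},
]

def category_percentages(responses):
    """Percentage of responses mentioning each category. A response can
    count toward several categories, so the totals may exceed 100%."""
    counts = Counter(cat for tags in responses for cat in tags)
    return {cat: 100 * n / len(responses) for cat, n in counts.most_common()}

for cat, pct in category_percentages(coded_responses).items():
    print(f"{cat}: {pct:.1f}%")
```

Tallying from a typed-up Google spreadsheet export (one row per response, categories in a delimited column) would feed the same function after splitting on the delimiter.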