Listening to the customer: Assessment that makes a difference. Student learning
LLAMA LOMS Program: Listening to the Customer: Using Assessment Results to Make a Difference
ALA Annual (Chicago), Sunday, July 11, 2009
Catherine Haras (California State University, Los Angeles)

  • Increased emphasis on assessment in higher education—chance to help the university or college.
  • System-wide support builds a local program; lots of dialog between the CSUs.
  • Indicators are often outside the library
Transcript of "Listening to the customer: Assessment that makes a difference. Student learning"

1. Listening to the customer: Assessment that makes a difference. Student learning
   Catherine Haras, Information Literacy Coordinator
   California State University, Los Angeles
   ALA Annual, Chicago 2009
2. Library Facts
   - Total volumes: 1,205,256
   - Total number of teaching librarians: 12
   - Students attending Library instruction, 2007-2008: 17,343 (684 sessions)
   - Robust information literacy program
3. Our customers
   - First-generation college students
   - Transfer/commuter population
   - Latino
   - Graduates of the LAUSD, where information literacy is not mandated
4. Why assess?
   - To increase the quality of the Library's instruction program
   - To ensure compliance/instruction across the Colleges
   - Accreditation and WASC reviews
   - Do we know what our students know?
5. In place
   - System-wide CSU Information Competence Initiative
   - Information Literacy Coordinator
   - Liaison model of IL: faculty and librarian cooperation
   - Library SLOs adapted from ACRL
   - Participants in iSkills beta testing to assess ICT literacy
6. What did we do?
   - We used several assessments based on our constituents:
     - Homegrown and standardized (iSkills/IC3)
     - Direct and indirect
     - Qualitative and quantitative
   - We assessed librarians, faculty, and students.
   - We took advantage of CSU participation in the ETS iSkills project.
   - We were prepared to learn from our mistakes.
7. Two homegrown examples
   - We assessed students and faculty:
     - Students, via quiz
     - Faculty, via focus groups and an indirect survey
8. Assessment of the students: homegrown (direct)
   - Tested student research skill levels
   - Created questions based on the ACRL Standards outcomes
   - Targeted a gateway freshman-experience course that all incoming freshmen/transfers must take
   - Created a 27-item quiz in WebCT/Blackboard
   - Administered the quiz for 5 consecutive quarters, Fall 05-Fall 06, whether faculty wanted it or not
9. Sample question
   Campbell, S. (2006). Perceptions of mobile phones in college classrooms: Ringing, cheating, and classroom policies. Communication Education, 55, 280-294.
10. Direct assessment results
    - N = 2,934
    - Mean score = 71.5%, or a C average
    - 2-point difference between freshmen and transfers
    - Colleges performed equally poorly
    - Questions a student was most likely to get wrong:
      - Reading citations
      - Topic formulation
      - Database search logic
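A summary like the one above (overall mean, subgroup gap, most-missed topics) is straightforward to tabulate from exported quiz data. The sketch below is purely illustrative: the record layout, field names, and sample values are hypothetical, not the actual WebCT/Blackboard export.

```python
# Sketch: summarizing quiz results overall, by group, and by missed item.
# Record structure and values are hypothetical stand-ins for a real export.
from statistics import mean

records = [
    {"group": "freshman", "score": 70.0, "missed": ["citations"]},
    {"group": "transfer", "score": 72.0, "missed": ["topic formulation"]},
    {"group": "freshman", "score": 68.0, "missed": ["citations", "search logic"]},
]

# Overall mean score across all test-takers
overall = mean(r["score"] for r in records)

# Mean score per group (e.g. freshmen vs. transfers)
by_group = {}
for r in records:
    by_group.setdefault(r["group"], []).append(r["score"])
group_means = {g: mean(scores) for g, scores in by_group.items()}

# Which question topics were missed most often
miss_counts = {}
for r in records:
    for item in r["missed"]:
        miss_counts[item] = miss_counts.get(item, 0) + 1
hardest = sorted(miss_counts, key=miss_counts.get, reverse=True)
```

With real data, `hardest` would surface the weak spots reported on the slide (citations, topic formulation, search logic) without any manual counting.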
11. Assessment results
    - Students found the pretest reflective
    - They are gamers
    - They are reading-averse
    - They are affective learners
12. Assessment of the faculty: homegrown (indirect)
    - Held a series of faculty focus groups
    - Created an information literacy advisory group of 18 key faculty
    - The advisory group created a 20-question survey
    - Surveyed the entire campus by email on students' research habits, reaching a generalizable 30% of tenured faculty (N = 235)
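One way to sanity-check that a response of N = 235 supports generalizable claims is a standard margin-of-error calculation. The confidence level and worst-case variance below are assumptions for illustration; only N = 235 comes from the slide.

```python
# Sketch: margin of error for a survey sample of n respondents.
# z = 1.96 (95% confidence) and p = 0.5 (worst-case variance) are
# illustrative assumptions, not figures from the talk.
import math

n = 235           # respondents, from the slide
z = 1.96          # 95% confidence level (assumed)
p = 0.5           # worst-case proportion (assumed)

moe = z * math.sqrt(p * (1 - p) / n)   # roughly +/- 6.4 points

# Note: with ~30% of the population responding, a finite-population
# correction would shrink this estimate further.
```

A margin of error in the single digits is consistent with treating the survey percentages on the next slide as reasonably representative of tenured faculty opinion.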
13. (Indirect) Assessment results

    My students can:                           Strongly disagree  Disagree  Agree      Strongly agree  Don't know
    a. Narrow or focus a research topic        3% (6)             14% (28)  62% (125)  11% (23)        9% (19)
    b. Formulate a search query                3% (6)             15% (30)  57% (114)  10% (20)        15% (31)
    f. Read or trace a bibliographic citation  3% (6)             17% (34)  53% (107)  8% (16)         19% (38)
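Percentage-with-count cells like those in the table can be tallied from raw Likert responses in a few lines. This sketch uses made-up responses, not the actual survey data; the scale labels match the table.

```python
# Sketch: turning raw Likert responses for one survey item into the
# "pct% (n)" cells shown in the table. Responses here are illustrative.
from collections import Counter

SCALE = ["Strongly disagree", "Disagree", "Agree",
         "Strongly agree", "Don't know"]

def summarize(responses):
    """Return {category: "pct% (n)"} for one survey item."""
    counts = Counter(responses)
    total = len(responses)
    return {cat: f"{round(100 * counts[cat] / total)}% ({counts[cat]})"
            for cat in SCALE}

# Hypothetical responses for a single item (10 respondents)
responses = (["Agree"] * 6 + ["Disagree"] * 2 +
             ["Strongly agree"] * 1 + ["Don't know"] * 1)
row = summarize(responses)
```

Running `summarize` once per item, then per-item over the real N = 235 responses, would reproduce the table row by row.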
14. Changes based on (student) assessment
    - Based on the student scores, the campus FYE curriculum was changed:
      - A new IHE 101 model with a strong IL emphasis was piloted and adopted by the campus colleges.
      - The Library created an information Jeopardy game with virtual assessment.
15. Indicators of success
    - Information literacy is now assessed at program review
    - Increase in type and kind of library session
    - Increased collaboration: consultation on programmatic IL and assignment design
    - CSULA IL program commended by WASC
    - GE overhaul; campus considering a mandated IL course
16. Continuous improvement
    - Approval of new IHE 101 pilot
    - Program Review self-study, 2006-2007
    - WASC accreditation, 2006-2010:
      - Institutional proposal, Fall 2006
      - Capacity and preparatory review, Fall 2008
      - Educational Effectiveness review, Spring 2010
17. Learning takes place in context. Be prepared to assess more than the student.
18. Work with your culture
    - Accept legacy issues particular to your Library and campus:
      - Take advantage of the administration you have
      - Grow your program locally
      - The process may not look formal
    - Find influential faculty who can advocate for you
19. Develop your culture
    - Cater to any unique constituencies:
      - Understand (G 1.5) learners and adjust your teaching
      - Recognize the reality of part-time instructors
    - Listen to the needs of instructors and work with them, but help guide them
    - Develop the pedagogical skills of librarians
20. Take-away for public libraries
    - Cater to your unique constituencies:
      - Find the gatekeepers for your particular communities and work with them to develop outreach
      - Partner with K-12 schools—their students are using your library
      - The literature on Millennials is helpful
    - Develop the pedagogical skills of your librarians:
      - Reference librarianship is teaching, and yours is a teaching library
21. Take-away for public libraries
    - Decide as a library how much you can or want to change:
      - Hold focus groups for your librarians first
      - Dialog with your influential librarians; allow everyone who wants to become part of the process
      - Query the community at large and find out what your community needs and wants from the library
22. Catherine Haras
    CSU, Los Angeles
    [email_address]