Engaged! Innovative engagement and outreach and its assessment


  1. 2011 SAA Annual Meeting — Genya O’Gara, Special Collections Research Center. Engaged! Innovative Engagement and Outreach and Its Assessment
  2. BACKGROUND — SPEC Kit 317: Special Collections Engagement examines activities meant to engage students, faculty, and other scholars and researchers with special collections. It investigates who coordinates these activities, where they are held, how they are promoted, and how they are evaluated.
  3. BACKGROUND — Special Collections in action
  4. BACKGROUND — The team
  5. RESULTS — overall picture
     - Everyone (over 95%) is participating!
     - Survey themes include:
       - Subject knowledge, not job title, determines responsibility
       - Lack of formal plans or documentation
       - Unstructured assessment
       - Integration with the broader community
       - Embracing new technologies
  6. RESULTS — Exhibits
     - 99% of respondents are creating exhibits
     - Only 19% have a dedicated person/position
     - EVERYONE (who responded) is doing both physical and virtual exhibits
     - 50% attempt to evaluate success, primarily through counts
  7. RESULTS — Events
     - 96% of respondents are organizing events (lectures, symposia, open houses, etc.)
     - 56% note that primary responsibility for an event varies depending on subject expertise
     - The majority measure success through attendance, followed closely by anecdotal feedback
       - ~33% have no evaluation at all
     - The most commonly reported promotion method for events is one-on-one contact (92%), but only 24% consider it the most effective...
  8. RESULTS — Curricular Engagement
     - Everybody does it!
     - What is “it”?
       - working with faculty to develop courses
       - individual consultation
       - in-person class instruction
       - group consultation
       - course-related web pages
       - instructional videos
     - Rarely is there designated staff responsible for curricular engagement
     - Evaluation generally consists of use counts of materials
  9. RESULTS — Curricular Engagement themes
     - A desire for methods, other than counts, to assess outcomes more effectively
     - Demand for instruction is growing as staffing shrinks
     - Lack of a designated coordinator
  10. RESULTS — Most common barriers
     - 67% of respondents report barriers to providing effective engagement:
       - insufficient staffing
       - lack of dedicated staff
     - Funding, limited hours, and space were grouped together as barriers
     - These are beyond our immediate control
  11. RESULTS — Survey conclusions
     - Everybody is actively participating in instructional outreach, exhibits, and public programming
     - Responsibility for outreach is often distributed among many people, making it difficult to approach outreach cohesively
     - Respondents want evaluation methods that move beyond patron use counts and anecdotal feedback
     - Despite roadblocks, there is widespread enthusiasm for, and understanding of the importance of, engagement activities within archives at every level
  12. All is not lost (the survey also showed…)
     - multiple levels of collaboration
     - outside-the-box engagement
     - a desire for better evaluation
     - enhancement of traditional outreach through technology
  13. Engagement ideas — Collaboration and early involvement seem to be key. Examples include:
     - Early contact with graduate student instructors
     - Working with subject and instruction librarians and incorporating special collections into their classes
     - Awarding undergraduate research projects that make extensive use of collections
     - Creating virtual and physical exhibits with students to highlight materials
     - Involving students, through jobs, internships, and practicums, in all aspects of department work
     - Cross-departmental collaboration
  14. Assessment examples
     - Reviewing citations in student papers to evaluate the effectiveness of instruction
     - Using student focus groups to evaluate video tutorials
     - Monitoring books and articles published, performances given, and theses written
     - Tracking the number and value of grants received
     - Examining web server statistics
     - Collecting feedback forms and surveys
     - Monitoring the number of graduate and practicum students using the collections
     - Soliciting and compiling one-on-one feedback from professors and students
  15. Assessing our efforts — How are evaluation and assessment being articulated and understood in the context of special collections? Special collections must clearly state engagement goals in order for any type of evaluation to be meaningful. You don’t have to reinvent the wheel; there are tools and examples out there:
     - Archival Metrics
     - China Missions Project
     - Talk to library instruction colleagues and faculty
  16. Targeted assessment is as important as targeted engagement. There is no one-size-fits-all solution...
  17. The End — Questions or feedback? Genya O’Gara, Special Collections Research Center, (919) 513-2605, [email_address]
