Phase I: Overview

Transcript

  • 1. A Standardized Approach to Assessment: Results from a Pilot Study of Library Instruction Sessions at San José State University Shannon M. Staley Social Sciences Librarian
  • 2. Purpose of Study
    • To develop and test an assessment tool that:
    • Facilitates the development of pre- and post-achievement tests
    • Links survey questions to learning outcomes and ACRL standards
    • Delivers real-time reports and statistical analysis
  • 3. Timeline: An Overview
    • Review of the literature – Summer 2008
    • Create assessment tool – Fall 2008
    • Garner input from library faculty and refine tool – December 2008
    • Develop multiple-choice questions – December 2008
    • Standardize questions – January 2009
    • Pilot-test assessment tool – January 2009
    • Conduct full study – Spring 2009
  • 4. Methodology: Develop Multiple-Choice Questions
    • Investigated questions from existing tutorials and survey instruments for ideas
    • Created 20 multiple-choice questions based on social sciences 100W learning outcomes
    • Each question presented four answers, only one of which was correct
    • Each question had an added “not sure” option (see the sketch after this slide)
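
    As an illustration of the question format above, here is a minimal Python sketch; the field names and the sample question are hypothetical, not drawn from the actual survey instrument:

        from dataclasses import dataclass

        @dataclass
        class MultipleChoiceQuestion:
            prompt: str           # the question text
            choices: list[str]    # the four candidate answers
            correct_index: int    # position of the single correct answer
            outcome: str          # the 100W learning outcome the question maps to

        NOT_SURE = "Not sure"     # appended to every question as a fifth option

        q = MultipleChoiceQuestion(
            prompt="Which of these source types is peer reviewed?",
            choices=["Newspaper editorial", "Scholarly journal article",
                     "Blog post", "Magazine feature"],
            correct_index=1,
            outcome="Distinguish scholarly from popular sources",
        )
        options = q.choices + [NOT_SURE]   # the five options shown to the student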
  • 5. Methodology: Standardization of Survey Multiple-Choice Questions
    • Multiple-choice questions were emphasized to allow for immediate computation of results
    • The co-occurrence of LibQUAL+ during Spring 2009 provided complementary qualitative data
    • Questions were reviewed by library faculty to ensure they reflected learning outcomes and ACRL standards
    • Questions and learning outcomes were emailed to all teaching faculty in the Psychology department for review
    • Questions were reviewed by experts at the Center for Assessment under the Office of Institutional Research
    • Cognitive interviews with 5 students further addressed the clarity of survey questions
  • 6.–10. Methodology: Pilot Study
    • Conducted with students in a PSYC 139 course to determine:
    • Average length of time to complete the surveys – 5 minutes each (10 minutes total for pre- and post-survey completion)
    • Overall impact on what could be covered in a single instruction session – no major impact, given tighter planning
    • Any technical glitches with the survey interface, login, usability, etc. – no major problems, though post-test logins could not contain typos (a mistyped login would not match its pre-test record)
    • Whether the survey questions were clear and comprehensible to students – no confusion was reported
  • 11. Methodology: Planning and Instruction
    • Coordinated with the instructor beforehand to encourage students to show up early
    • Scripted instructions were read to students before they took the pre-test survey
    • Lecture with embedded exercises; the last 15 minutes were reserved for getting started on course assignments
    • Course outline and handouts were distributed to students after completion of their post-test surveys
  • 12. Methodology: Actual Study – Spring 2009
    • Conducted with students in 5 PSYC 100W courses (83 students total)
    • Survey participation was voluntary
    • Participants who completed the pre-test survey but failed to complete the post-test survey were automatically eliminated from the data pool (see the pairing sketch after this slide)
    • 11 multiple-choice survey questions covering 3 ACRL standards and 8 learning objectives
    • 6 background questions appeared on the pre-test survey only; they were not repeated on the post-test
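
    The elimination rule above amounts to pairing each pre-test record with its post-test record and discarding unmatched participants. A short Python sketch, with invented logins and scores:

        # Pre- and post-test scores keyed by student login (invented data).
        pre_scores = {"login01": 6, "login02": 8, "login03": 5}
        post_scores = {"login01": 9, "login03": 8}   # login02 skipped the post-test

        # Keep only participants with both records; login02 is eliminated
        # from the data pool, per the study design.
        paired = {login: (pre, post_scores[login])
                  for login, pre in pre_scores.items()
                  if login in post_scores}
        print(paired)   # {'login01': (6, 9), 'login03': (5, 8)}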
  • 13. Methodology: Pre-Test Survey Background Questions
  • 14. Methodology: PSYC 100W Multiple-Choice Survey Questions
  • 15. Data Analysis: A Bird’s-Eye View
  • 16. Data Analysis: A Bird’s-Eye View
  • 17. Data Analysis: A Bird’s-Eye View
  • 18. Data Analysis: Statistical Significance
    • Paired t-test of student mean scores
    • An Excel formula computes statistical significance within the assessment tool’s report-generation feature (see the sketch after this slide)
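
    In Excel, a paired two-tailed t-test is TTEST(pre_range, post_range, 2, 1), where the final argument 1 selects the paired test; the actual ranges used in the tool’s report spreadsheet are not documented here. A minimal Python equivalent using SciPy, with invented scores:

        from scipy import stats

        # One pre- and one post-test score per student (invented numbers).
        pre = [5, 6, 4, 7, 5, 6, 3, 8]
        post = [8, 9, 6, 9, 7, 8, 6, 10]

        t_stat, p_value = stats.ttest_rel(pre, post)   # paired t-test
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
        # p < 0.05 would indicate a statistically significant change in
        # mean scores from pre-test to post-test.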
  • 19. Data Analysis: Assessment Tool Sample Report
  • 20. Data Analysis: Future Enhancements
    • Master spreadsheet that incorporates all reports
    • More granular t-test analysis of individual questions (see the sketch after this slide)
    • Statistical analysis of how background factors (class level, research experience, instruction experience) affect scores
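
    One way the more granular per-question analysis could work is to pair each student’s 0/1 correctness on a given question across the pre- and post-tests, then run the same paired t-test question by question; the sketch below uses invented data:

        from scipy import stats

        # rows = students, columns = survey questions; 1 = correct, 0 = incorrect
        # (invented data: five students, three questions).
        pre = [[1, 0, 0], [0, 0, 1], [1, 0, 0], [0, 1, 0], [1, 0, 1]]
        post = [[1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 1, 0], [1, 0, 1]]

        for q in range(len(pre[0])):
            pre_q = [row[q] for row in pre]     # each student's pre-test result on question q
            post_q = [row[q] for row in post]   # the same students' post-test results
            t, p = stats.ttest_rel(pre_q, post_q)
            print(f"Question {q + 1}: t = {t:.2f}, p = {p:.3f}")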
  • 21. Benefits
    • Assessment data will inform pedagogical directions and enhance the educational impact on SJSU students
    • Data can be shared with department faculty and campus administrators to demonstrate instructional efficacy
    • Data can also be used to bolster documentation during the accreditation process
    • Librarians can publish findings from their own sessions, particularly with a more rigorous statistical model
    • Librarians can include assessment reports in their dossiers
