Betty Rogers Presentation Evaluation

Transcript

  • 1. Team Presentation Evaluation Team Member Betty Rogers September 20, 2010
  • 2. Overview
    • How evaluation techniques can be applied to library programs and services
    • Define evaluation
    • Purpose of evaluation
    • Evaluation methods
    • Evaluation Strategies libraries can use
    • Evaluation Types
    • Evaluation planning process
  • 3. What is being measured? What does it indicate? What is not being measured?
    • For what purpose is the evaluation?
    • Who are the audiences for the information from the evaluation?
    • From what sources should the information be collected?
    • When is the information needed?
    • “If you don’t know where you are going, how are you gonna know when you get there?” (Yogi Berra)
  • 4. Definition
    • First, evaluation is viewed as a systematic process
    • Second, evaluation involves collecting data
    • Third, evaluation is a process for enhancing knowledge and decision making
    • Fourth, evaluation is a process of determining the success, impact, results, costs, and outcomes
    • SOURCE: Adapted from Russ-Eft and Preskill (2001)
  • 5. Purpose of Evaluation
    • Provide “useful feedback”
    • Planning/efficiency
    • Accountability
    • Implementation
    • Institutional strengthening
  • 6. Evaluation Methods
    • Input measurement
    • Output/Performance Measurement
    • Impact/Outcomes Assessment
    • Service Quality
  • 7. Evaluation Methods
    • Input Measurement
      • Money
      • Facilities
      • Customers/clients
      • Program staff
      • Volunteers
      • External partners
      • Time
      • Equipment
      • Technology
  • 8. Evaluation Methods
    • Output measurement
    • Activity
      • Events
      • Products
      • Workshops
      • Trainings
      • Exhibits
      • Number of customers served
  • 9. Evaluation Methods
    • Impact/Outcomes Measures
    • Indicative of impacts or outcomes from the service, program, or library activity on those who are receiving the service
    • Outcome / Impact
    • Knowledge
    • Attitudes
    • Awareness
    • Opinions
    • Skills
    • Behavior
    • Educational
    • Environmental quality
  • 10. Evaluation Methods
    • Service Quality
    • Difference between what customers expect and their perceptions of the service performance.
    • Encompasses the interactive relationship between the library and the people whom it is supposed to serve
    • Does the service meet organizational or user expectations?
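The expectation-versus-perception gap described on this slide can be sketched as a small SERVQUAL-style calculation. The service items and 1–7 ratings below are illustrative assumptions, not data from the presentation:

```python
# SERVQUAL-style gap score sketch: perception minus expectation per item.
# Items and ratings (1-7 scale) are invented for illustration.
expectations = {"hours": 6.5, "staff help": 6.0, "collection": 5.5}
perceptions  = {"hours": 5.0, "staff help": 6.2, "collection": 5.5}

gaps = {item: perceptions[item] - expectations[item] for item in expectations}

# Negative gap: service falls short of expectation; positive: exceeds it.
for item, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{item}: {gap:+.1f}")
```

Sorting by gap puts the largest shortfalls first, which is where improvement effort would go.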
  • 11. Evaluation Strategies
    • Major Model Groups
      • Scientific-experimental models
      • Management-oriented systems models
        • PERT (Program Evaluation and Review Technique)
        • CPM (Critical Path Method)
      • Qualitative Models
      • Participant-oriented Models
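Of the management-oriented models above, CPM lends itself to a short sketch: the project length is the longest path through the task dependency graph. The task names and durations below are hypothetical examples, not from the presentation:

```python
# Critical Path Method (CPM) sketch: the project length is the longest
# (duration-weighted) path through the dependency graph.
from functools import lru_cache

# task: (duration in days, prerequisite tasks) -- hypothetical values
tasks = {
    "plan":    (3, []),
    "survey":  (5, ["plan"]),
    "analyze": (4, ["survey"]),
    "report":  (2, ["analyze"]),
    "train":   (6, ["plan"]),
}

@lru_cache(maxsize=None)
def earliest_finish(name):
    duration, prereqs = tasks[name]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

project_length = max(earliest_finish(t) for t in tasks)
print(project_length)  # 14 (critical path: plan -> survey -> analyze -> report)
```

Tasks off the critical path (here, "train") have slack: delaying them slightly does not delay the project.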
  • 12. ROI Model
    • Solution Matrix: cost-benefit analysis. http://www.solutionmatix.com/return-on-investment.html
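As a hedged illustration of the ROI model, the standard formula is ROI = (benefits - costs) / costs. The dollar figures below are made up, not taken from the Solution Matrix site:

```python
# Simple ROI calculation for a library program.
# Dollar figures are hypothetical assumptions for illustration only.
costs = 12_000.0      # staff time, materials, equipment
benefits = 18_000.0   # estimated dollar value of outcomes

roi = (benefits - costs) / costs
print(f"ROI = {roi:.0%}")  # ROI = 50%
```

In practice the hard part is not the arithmetic but assigning credible dollar values to library outcomes such as time saved or skills gained.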
  • 13. Types of Evaluation
    • Formative
    • Summative
    • Robert Stake likened the two stages of evaluation to making soup: when the chef tastes the soup, it’s formative; when the diners (or a food critic) taste the soup, it’s summative.
  • 14. Types of Evaluation
    • Formative Evaluation
          • Needs Assessment
          • Evaluability assessment
          • Structured conceptualization
          • Implementation evaluation
          • Process evaluation
      • Research Methods Knowledge Base
  • 15. Types of Evaluation
    • Summative Evaluation
          • Outcome evaluations
          • Impact evaluation
          • Cost-effectiveness and cost-benefit analysis
          • Secondary analysis
          • Meta-analysis
      • Research Methods Knowledge Base
  • 16. Types of Evaluation
  • 17. Planning Evaluation Cycle
    • Evaluation Phase
      • Formulation
      • Conceptualization
      • Detailing
      • Evaluation
      • Implementation
  • 18. Planning Evaluation Cycle
    • Planning Phase
      • Formulation
      • Conceptualization
      • Design
      • Detailing
      • Analysis
      • Utilization
  • 19. Planning-Evaluation Cycle
      • Research Methods Knowledge Base
  • 20. Typical activity indicators to track
    • Amount of products, services delivered
    • #/type of customers/clients served
    • Timeliness of service provision
    • Accessibility and convenience of service
      • Location; hours of operation; staff availability
    • Accuracy, adequacy, relevance of assistance
    • Courteousness
    • Customer satisfaction
    • Source: University of Wisconsin-Extension, Program Development and Evaluation
    • E.g.: # of clients served, # of consultations, # of workshops held, # of attendees, # of referrals, quality of service
  • 21. Data Collection Techniques
    • Tests
    • Assessments
      • Questionnaires
      • Interviews
      • Observation
      • Evaluation
      • Use record
      • Analysis
      • Content analysis
      • Surveys
  • 22. Techniques Used to Evaluate My Library’s Services / FVSU H.A. Hunt Memorial Library
    • Formative
      • Pretest (students)
    • Summative
      • Survey (faculty)
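A formative pretest like the one mentioned above might be summarized by comparing mean scores before and after instruction. The scores below are invented examples, not FVSU data:

```python
# Sketch of summarizing a pretest/posttest comparison for a formative evaluation.
# Scores are made-up illustrations, not real student data.
pretest  = [55, 60, 48, 72, 66]
posttest = [70, 75, 62, 80, 74]

def mean(xs):
    return sum(xs) / len(xs)

gain = mean(posttest) - mean(pretest)
print(f"average gain: {gain:.1f} points")  # average gain: 12.0 points
```

A real evaluation would pair each student's scores and consider a significance test, but even a mean gain gives quick formative feedback.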
  • 23. Conclusion/Summary
    • In essence, evaluations are important to library programs and services. The evaluation experience is likely to be more positive, and its results more useful, if you build evaluation in from the start and make it an ongoing activity.
  • 24. References
    • University of Wisconsin-Extension, Program Development and Evaluation: Logic Model. Retrieved September 1, 2010, from http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
    • The forms of evaluation are derived from the work of Owen, published in Program Evaluation: Forms and Approaches (2006). Retrieved September 1, 2010, from http://www.murdoch.edu.au/teach/carrick_evaluation/purpose_scope.html
    • Haycock, K. & Sheldon, B. E. (2008). The Portable MLIS: Insights from the Experts. Westport, CT: Libraries Unlimited.
    • Research Methods Knowledge Base. Retrieved September 12, 2010, from http://www.socialresearchmethods.net