Betty Rogers Presentation Evaluation
Presentation Transcript

  • 1. Team Presentation Evaluation. Team member: Betty Rogers. September 20, 2010
  • 2. Overview
    • How evaluation techniques can be applied to library programs and services
    • Define evaluation
    • Purpose of evaluation
    • Evaluation Method
    • Evaluation Strategies libraries can use
    • Evaluation Types
    • Evaluation planning process
  • 3. What is being measured? What does it indicate? What is not being measured?
    • For what purpose is the evaluation?
    • Who are the audiences for the information from the evaluation?
    • From what sources should the information be collected?
    • When is the information needed?
    • “If you don’t know where you are going, how are you gonna know when you get there?” (Yogi Berra)
  • 4. Definition
    • First, evaluation is viewed as a systematic process
    • Second, evaluation involves collecting data
    • Third, evaluation is a process for enhancing knowledge and decision making
    • Fourth, evaluation is a process of determining the success, impact, results, costs, outcome
    • SOURCE: Adapted from Russ-Eft and Preskill (2001)
  • 5. Purpose of Evaluation
    • Provide “useful feedback”
    • Planning/efficiency
    • Accountability
    • Implementation
    • Institutional strengthening
  • 6. Evaluation Methods
    • Input measurement
    • Output/Performance Measurement
    • Impact/Outcomes Assessment
    • Service Quality
  • 7. Evaluation Methods
    • Input Measurement
      • Money
      • Facilities
      • Customers
      • Clients
      • Program staff
      • Volunteers
      • External partners
      • Time
      • Equipment
      • Technology
  • 8. Evaluation Methods
    • Output measurement
    • Activity
      • Events
      • Products
      • Workshops
      • Trainings
      • Exhibits
      • Number of customers served
  • 9. Evaluation Methods
    • Impact/Outcomes Measures
    • Indicative of impacts or outcomes from the service, program, or library activity on those who are receiving the service
    • Outcome / Impact
    • Knowledge
    • Attitudes
    • Awareness
    • Opinions
    • Skills
    • Behavior
    • Educational
    • Environmental quality
  • 10. Evaluation Methods
    • Service Quality
    • Difference between what customers expect and their perceptions of the service performance.
    • - Encompasses the interactive relationship between the library and the people whom it is supposed to serve
    • Does the service meet organizational or user expectations?
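The expectation-versus-perception difference described above can be sketched numerically as a gap score (a SERVQUAL-style calculation; the slides do not prescribe this method, and all ratings below are invented for illustration):

```python
# Sketch of a service-quality gap score: perception minus expectation,
# averaged across respondents. All ratings are hypothetical examples.
expectations = [5, 4, 5, 4]  # what patrons expected (1-5 scale)
perceptions = [4, 4, 3, 5]   # what patrons perceived (1-5 scale)

gaps = [p - e for p, e in zip(perceptions, expectations)]
mean_gap = sum(gaps) / len(gaps)
print(mean_gap)  # a negative score means service fell short of expectations
```

A mean gap near zero suggests the service is meeting expectations; item-level gaps show where it falls short.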
  • 11. Evaluation Strategies
    • Major Model Groups
      • Scientific-experimental models
      • Management-oriented systems models
        • PERT (Program Evaluation and Review Technique)
        • CPM (Critical Path Method)
      • Qualitative Models
      • Participant-oriented Models
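As one illustration of the CPM idea named above, a project's length is the longest path through its task graph. The task names, durations, and dependencies below are invented for a hypothetical evaluation project, not taken from the presentation:

```python
# Hypothetical evaluation-project tasks with durations (in weeks) and
# dependencies; CPM's project length is the longest path to completion.
durations = {"define": 2, "survey": 5, "interviews": 3, "analyze": 4, "report": 2}
predecessors = {
    "define": [],
    "survey": ["define"],
    "interviews": ["define"],
    "analyze": ["survey", "interviews"],
    "report": ["analyze"],
}

def earliest_finish(task: str) -> int:
    """Earliest finish time: latest predecessor finish plus own duration."""
    start = max((earliest_finish(p) for p in predecessors[task]), default=0)
    return start + durations[task]

project_length = max(earliest_finish(t) for t in durations)
print(project_length)  # 13 weeks: define -> survey -> analyze -> report
```

Tasks on that longest (critical) path cannot slip without delaying the whole project, which is what CPM is used to identify.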
  • 12. ROI Model
    • Solution Matrix: Cost-Benefit Analysis. http://www.solutionmatix.com/return-on-investment.html
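The cost-benefit arithmetic behind an ROI model reduces to one formula: net benefit divided by cost. The dollar figures below are purely hypothetical, not drawn from the Solution Matrix source:

```python
# ROI as net benefit divided by cost; all figures are invented examples.
def roi(total_benefits: float, total_costs: float) -> float:
    """Return on investment as a fraction (0.30 == 30%)."""
    return (total_benefits - total_costs) / total_costs

program_cost = 50_000.0       # assumed annual cost of a library program
monetized_benefit = 65_000.0  # assumed monetized value of outcomes
print(f"ROI = {roi(monetized_benefit, program_cost):.0%}")  # ROI = 30%
```

The hard part for libraries is not this division but monetizing the benefits; the formula only summarizes whatever valuation the evaluator can defend.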
  • 13. Type of Evaluation
    • Formative
    • Summative
    • Robert Stake likened the two stages of evaluation to making soup: when the chef tastes the soup, it’s formative; when the diners (or a food critic) taste the soup, it’s summative.
  • 14. Types of Evaluation
    • Formative Evaluation
          • Needs Assessment
          • Evaluability assessment
          • Structured conceptualization
          • Implementation evaluation
          • Process evaluation
      • Research Methods Knowledge Base
  • 15. Types of Evaluation
    • Summative Evaluation
          • Outcome evaluations
          • Impact evaluation
          • Cost-effectiveness and cost-benefit analysis
          • Secondary analysis
          • Meta-analysis
      • Research Methods Knowledge Base
  • 16. Types of Evaluation
  • 17. Planning Evaluation Cycle
    • Evaluation Phase
      • Formulation
      • Conceptualization
      • Detailing
      • Evaluation
      • Implementation
  • 18. Planning Evaluation Cycle
    • Planning Phase
      • Formulation
      • Conceptualization
      • Design
      • Detailing
      • Analysis
      • Utilization
  • 19. Planning-Evaluation Cycle
      • Research Methods Knowledge Base
  • 20. Typical activity indicators to track
    • Amount of products, services delivered
    • #/type of customers/clients served
    • Timeliness of service provision
    • Accessibility and convenience of service
      • Location; hours of operation; staff availability
    • Accuracy, adequacy, relevance of assistance
    • Courteousness
    • Customer satisfaction
    E.g.: # of clients served, # of consultations, # of workshops held, # of attendees, # of referrals, quality of service. (University of Wisconsin-Extension, Program Development and Evaluation)
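Tallying indicators like those above is a simple counting exercise over a service log. The log entries here are mock data for illustration, not FVSU statistics:

```python
# Tally activity indicators from a mock service log with a Counter.
from collections import Counter

service_log = [
    "consultation", "workshop", "consultation", "referral",
    "consultation", "workshop", "attendee", "attendee",
]
counts = Counter(service_log)
print(counts["consultation"])  # 3
print(counts["workshop"])      # 2
```

In practice the log would come from circulation, reference-desk, or event-registration records, but the aggregation step is the same.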
  • 21. Data Collection Techniques
    • Tests
    • Assessments
      • Questionnaires
      • Interviews
      • Observation
      • Use record analysis
      • Content analysis
      • Surveys
  • 22. Techniques Used to Evaluate My Library’s Services: FVSU H.A. Hunt Memorial Library
    • Formative
      • Pretest (students)
    • Summative
      • Survey (faculty)
  • 23. Conclusion/Summary
    • In essence, evaluations are important to library programs and services. The evaluation experience is likely to be more positive, and its results more useful, if you build evaluation in from the start and make it an ongoing activity.
  • 24. References
    • University of Wisconsin-Extension, Program Development and Evaluation: Logic Model. Retrieved September 1, 2010, from http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
    • These forms of evaluation are derived from the work of Owen, published in Program Evaluation: Forms and Approaches (2006). Retrieved September 1, 2010, from http://www.murdoch.edu.au/teach/carrick_evaluation/purpose_scope.html
    • Haycock, K., & Sheldon, B. E. (2008). The Portable MLIS: Insights from the Experts. Westport, CT: Libraries Unlimited.
    • Research Methods Knowledge Base. Retrieved September 12, 2010, from http://www.socialresearchmethods.net