Evaluate: 2003 version
Speaker Notes

  • Activity #1: OPENING ACTIVITY. Ask how many in the audience have designed a program evaluation (show of hands). Ask various people to share the focus of their evaluation: was it a total program evaluation, an evaluation of the collection, a facility or services evaluation, etc.? Most groups should respond with some of these topics. Make the point that most of the topics shared will fit within the American Association of School Librarians (AASL) Rubric categories of Teaching and Learning; Information Access and Delivery; and Program Administration.
    The presenter asks participants to turn to the rubric on page 35 of A Planning Guide for Information Power: Building Partnerships for Learning. Divide the participants into small groups; if the room is set up with tables, each table can be a group. Give each group three index cards or post-it notes. Somewhere in the room (wall, chalkboard, flip charts), post the three headings from the rubric: Teaching and Learning; Information Access and Delivery; Program Administration.
    Give the following directions: Each group has three votes. Take ten minutes to review the rubric and identify the three target indicators you believe represent the most important elements to evaluate in a school library media program to determine whether it is positioned to influence student achievement. Post your votes under the appropriate heading.
    Highlight the target indicators in each category. Ask the group for some feedback on why the elements were selected. Summarize with a statement such as, “The majority of us believe the target indicators under Access are most important, with particular emphasis on collection,” or “Most groups seem to think the target indicators under Teaching and Learning are most important.”
    (Note to Presenter: Keep the flipchart/wall/chalkboard results. There will be an ENDING ACTIVITY that builds on this OPENING ACTIVITY. At the end of the workshop, participants will return to the same groups. You will be asking if anyone wants to change his or her vote and why. More instructions are under the last slide.)
  • Presenter: Everyone in Pennsylvania is aware of the impact of PSSA scores. The move to greater accountability has placed a new emphasis on assessment and evaluation based on educational research. Studies and research are important because all federal grant programs now require research-based justifications for the use of funds.
  • Presenter: These findings are from the Keith Curry Lance Colorado study. [Read slide.] Lance conducted a number of other studies as well, in Alaska, Pennsylvania, Colorado, Oregon, New Mexico, Iowa, and Michigan. In addition, Esther G. Smith of EGS Research and Consulting conducted a study in Texas: “Texas School Libraries: Standards, Resources, Services, and Students’ Performance.” All these studies have similar findings. Of course we all know about Pennsylvania’s study, “Measuring Up to Standards,” by Keith Curry Lance. The Pennsylvania study showed that full-time librarians with aides made a difference. Please note, however, that the research shows a full-time school librarian isn’t enough: the school librarian must be collaborating with teachers. The presence of an aide enables the school librarian to collaborate with teachers.
  • Presenter: The studies have shown that a number of other attributes also contribute to quality school library programs. [Read the bullets.]
  • Presenter: [Show first bullet.] In this workshop, we’ll investigate ways to collect baseline data about your program. [Show second bullet.] For example, emergent literacy programs in schools are requiring classroom collections. Rather than fighting classroom collections, librarians should provide leadership in selecting, developing, and circulating quality classroom collections. [Show third bullet.] When opportunities for better support present themselves, it’s better to have the data than to have to collect it on the spot. For example, when grant opportunities arise, you can use the evaluative data as your needs assessment in the grant application. Remember, you may be building a case for better support that doesn’t involve funding proposals. You may need administrative support to change a school practice. For example, to extend school library hours after school, you may need administrative support to relieve the school librarian of bus duty. Another example would be needing administrative support to increase collaboration with teachers by bringing reluctant teachers into collaborative planning.
  • Presenter: Nancy Everhart is the author of the book Evaluating the School Library Media Center. She’s a professor at St. John’s University, NY, and was previously a school librarian in Tamaqua, Pennsylvania. Her point is that an evaluation provides the data you need to determine success in all these areas. [Read the bullets.]
  • Presenter: The remainder of the workshop will focus on three examples of evaluation of components identified as critical by the research. This is a good time to take questions concerning the background information on the topic of evaluation.
  • [Show the bulleted words one by one. As you do, read the following DEFINITIONS.]
    Formal: Typically this kind of evaluation is highly structured; the data collection instruments are carefully vetted; the result is a detailed, public report including recommendations. Formal evaluations are more likely to be summative, but they often include both quantitative and qualitative data. Formal evaluations are more likely to be evaluations of the entire program.
    Informal: Typically an informal evaluation is less structured. The data collection instruments will provide useful information, but may not be so carefully vetted and therefore may have a lower level of accuracy than a research study or a formal evaluation. Informal evaluations are more likely to be formative, although they also use both quantitative and qualitative data.
    External: An evaluation that includes feedback and assessment from individuals external to the school district. An outside consultant might be involved. If an advisory committee is involved, it will include members external to the administration, teaching staff, and librarians of the school district.
    Internal: An evaluation that solicits data from sources within the school, such as administrators, teaching staff, and librarians of the school or district. If an advisory committee is involved, it will include only members of the faculty or staff of the school.
    Summative: An evaluation designed to document the current status. This evaluation would “sum up” the current state of the library.
    Formative: An evaluation designed to determine the current status for the purpose of planning improvement. This workshop emphasizes formative evaluation because we advocate continuous improvement.
  • [First, read the definitions below before clicking to start the list.]
    Presenter: Quantitative: Numerical data, such as circulation statistics, collection and equipment counts, number of classes using the school library media center, number of lessons planned with teachers, etc.
    Qualitative: Data that involve perceptions, opinions, or judgments, such as could be collected through surveying or interviewing users about satisfaction with the services, collections, and facilities of the school library media center.
    [Go through the slide bullet by bullet. First show the quantitative measure and ask the participants what a corresponding qualitative measure would be. Then show the bullet for the qualitative measure on the slide. Continue down the list.]
    [The following chart, from Nancy Everhart’s book Evaluating the School Library Media Center, provides more examples, if needed, to illustrate the point:]
      Quantitative Measure / Qualitative Measure:
      • Circulation of fiction books / Students’ success rate in finding a desired fiction book
      • # of periodical titles / % of library’s titles cited in student research papers
      • Library attendance / Students’ satisfaction with library hours
  • Presenter: We are now going to demonstrate the use of an evaluation model focused on this research finding: “Students whose school librarian plays an instructional role tend to achieve higher than average test scores. This is also dependent on collaboration between school librarians and teachers and the inclusion of the library materials in the curriculum.” This finding comes from the Lance studies mentioned earlier, which show a positive correlation between student achievement and strong library programs in which the school librarian plays an instructional role and collaborates with teachers.
  • Presenter: Here is the evaluation model. [Read the bullets as they appear on the screen, one by one.]
  • Presenter: The first step in our model is to define the question. In this example, we are defining the question based on the research finding.
  • [Read the slide because this is an important point:] Presenter: “According to Information Power, the instructional role is determined by collaboration between school librarians and teachers that results in the inclusion of library materials in the curriculum.”
  • Presenter: These are the target indicators in the Rubric. [Refer participants to pages 35-36 of A Planning Guide for Information Power.]
  • Presenter: The next step in the evaluation model is to collect data. Our choice in this example is going to be a questionnaire based on the Teaching and Learning rubric.
  • Presenter: While a group’s perception may not be accurate, that information is good data for the school librarian because perception drives action. For example, if a teacher believes no information literacy standards document exists, even though one does, the teacher obviously will not use it in planning or consult the librarian in planning. An easy corrective action is to meet with the teacher and use the standards document to suggest ways to incorporate library materials in content-area instruction, as well as to offer to help with that work.
    One way to collect data to determine whether or not the school librarian plays an effective instructional role in the school is to use the Information Power Teaching and Learning Rubric as a questionnaire administered to the following groups:
      · School Library Staff
      · Administrative Staff
      · Teaching Staff
    Summarize each group’s responses separately to determine where perceptions differ and where they agree.
  • Presenter: These data are qualitative because the responses indicate attitudes about the role the school librarian plays. For example, the principal puts the librarian on the weekly faculty meeting agenda to encourage faculty to use the school library and to promote new materials purchased to support the curriculum. The librarian spends considerable time preparing weekly presentations. The math teacher does not see an easy connection to his curriculum and tunes the presentations out, often correcting papers during this time. When responding to the second Target Indicator, he checks none of the boxes, which reflects his attitude that he does not have discussions with the librarian about math lessons and curriculum. The librarian and the principal check Exemplary because they believe the faculty meeting provides common planning time. The conflicting data identify an area where the program can be improved.
  • Presenter: The third step in our evaluation model is to analyze the data. How can the data collected in the questionnaires be analyzed?
    Identify areas of agreement and disagreement among the groups that responded to the questionnaire.
    Use the questionnaire responses to identify areas of strength and areas that need improvement.
    Use the questionnaire responses to identify where more data are needed.
    The slides that follow will demonstrate how to analyze data from a questionnaire based on the rubric.
  • Presenter: The good news here is that the school community is in 96% agreement* that the Information Literacy Standards exist and provide a strong base for curricular integration. The bad news is that this target indicator is absolutely essential to the remaining seven target indicators in the rubric and should be at the Exemplary level. So, even though it is an agreement area, it is also an area that warrants improvement. [*96% is the average of the three groups’ scores in the Proficient category: (100% + 100% + 88%) / 3 = 96%.]
  • Presenter: Those who provide the service and those who use it are pretty much in agreement that curriculum development is functioning at a basic level, while the administrative staff believes it is functioning at a proficient level. Going back to the rubric, we find that proficient requires school policies that enable the librarian to participate in building and district meetings. These data indicate that a review of school policy, as well as practice, is appropriate.
  • Presenter: Having 100% agreement in the school that collaborative planning is occurring at a proficient level is a strong statement, given the research. However, the proficient rubric descriptor says only some teachers collaboratively plan and teach. Therefore, even strength areas provide opportunities for continuous growth.
  • Presenter: For this example, let’s say the library staff consistently sponsors high-interest contests to bring students into the library media center to see new acquisitions, with particular emphasis on fiction, which does not have a high circulation. The staff spends time preparing the displays and planning the contests; the students respond positively. The library staff’s perception is that the programming meets the proficient criteria. However, administrators and teachers see the programming as promotion, not motivation generated by special literary events. Given the overall emphasis on reading and information literacy in evaluating the effectiveness of schools, and the research on the correlation between effective school library programs and reading achievement, the responses indicate this is a weak area that warrants improvement. The responses from the administrators and teachers also suggest librarians may want to divert some of the time spent on contests to planning author visits and working with teachers to develop reading and writing assignments based on the author’s work.
  • Presenter: The good news is that the total school community believes that the library staff models and promotes effective teaching. The interesting news is that the descriptor for exemplary performance specifies authentic assessment. The bad news is that the total school community, including the library staff, believes the librarians do not assess student work. Obviously, more data are needed to determine what improvement is needed.
  • Presenter: After Step 3, analyzing the data, it may become necessary to return to Step 2 and collect more data. To help resolve any contradictory data collected from the questionnaires, collect more data by convening focus groups of teachers. Tape-record the sessions and analyze the tapes for themes. Ask:
    How are student products and performances assessed in units that involve use of the library media center and resources? [Look for: use of rubrics; involvement of library staff in assessment; use of panels (peer, adult, specialist).]
    How do you determine the products and performances students will complete to demonstrate mastery of content in units that involve use of the library media center and resources? [Look for: library staff involvement in determining; affirmation of authentic assessments.]
    What is the role of reflection in student work that involves the use of the library media center and its resources? [Look for: affirmation of a role in all instruction; focus on evaluation of the search strategy used in library-media-based units.]
    [If time permits, these questions can be posed to the group to elicit responses on what to look for in the focus group’s discussion.]
    When analyzing the responses to the questions, determine whether or not authentic assessment and reflection are used consistently in teaching. If yes, determine whether that use does or does not carry over into the library-media-based units. The data may suggest a school-wide need for improvement that includes the library media program, rather than one specifically focused on the library media program.
  • Presenter: We are now moving to Step 4 of the evaluation model. When the data have been collected and analyzed, it may be time to involve an advisory committee. An advisory committee is made up of stakeholders. In the case of this example, which uses the Teaching and Learning Rubric, the stakeholders are internal: teachers, administrators, and library staff. It’s the technical nature of this rubric that keeps the stakeholder group more limited than it would normally be. The advisory committee can help the librarian review the data analysis and formulate recommendations. For example, they can review the research data, the responses to the rubric questionnaire, and the analysis of the focus groups. They see from the research that there is a strong correlation between the school librarian’s instructional role and student achievement. They see from the responses to the rubric-based questionnaire on the school librarian’s instructional role that their school is only at a basic or proficient level. They gain further understanding from the focus group analysis. The advisory committee decides to set a goal that over the next three years the school will reach the exemplary level in all target indicators in the Teaching and Learning Rubric.
  • Presenter: Here is an example of recommendations the advisory committee might make for the first year, based on the research, the data collected in the rubric-based questionnaire, and the data collected in the focus groups.
  • More examples of recommendations.
  • More examples of recommendations.
  • Presenter: We are now at the last step in the evaluation process. When developing an action plan, it is important to get investment from the stakeholders, because many of the changes that need to occur go beyond the school library and affect the whole school. For example, the common planning time issue will affect teacher prep periods to some extent. Any modification to teacher prep periods requires effective administrative leadership.
  • [There is a handout of this slide for participants to use. Go over the components of an action plan. Note: Participants will also get a blank action plan for use later when they complete one.]
  • Presenter: This example of an evaluation was formal because it generated a written summary that was to be made public and resulted in an action plan developed by an advisory committee. It was internal because feedback was solicited from within the organization and the advisory committee was drawn from within the organization. It was formative because the purpose of the evaluation was to identify areas for continuous improvement. The data used in the evaluation were both quantitative and qualitative. The research provided quantitative data, that is, numerical data, including student scores on standardized tests, size of library collections, number and work hours of staff, library hours open, number of lessons planned with teachers, etc. The rubric-based questionnaire and the focus groups both yielded qualitative data, that is, data that involve perceptions, opinions, or judgments.

Presentation Transcript

  • Evaluate! Evaluating School Media Services. LIB 620 Library Management, Fall 2009
  • Preemptive Action: The busy librarian’s guide to program evaluation. Based in part on a workshop developed by the Pennsylvania School Librarians Association Professional Development Committee (no longer available online); a marker on a slide indicates that the slide was originally from that presentation.
  • Evaluate ?
    • What do we mean by evaluate?
      • verb (used with object), -at·ed, -at·ing.
      • to determine or set the value or amount of; appraise: to evaluate property.
      • to judge or determine the significance, worth, or quality of; assess: to evaluate the results of an experiment.
      • Mathematics. to ascertain the numerical value of (a function, relation, etc.).
        • The American Heritage® Dictionary of the English Language, Fourth Edition © 2006 by Houghton Mifflin Company.
  • What is evaluation? [ 1 ]
    • As defined by the American Evaluation Association :
      • evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness.
      • Evaluation is the systematic collection and analysis of data needed to make decisions
        • Evaluation Definition: What is Evaluation?
  • What about school library media center evaluation?
    • Evaluation of school library media centers:
      • When you evaluate something or someone, you examine the subject and make a judgment about the quality, significance, or condition of whatever is to be evaluated.
        • Emanuel T. Prostano and Joyce S. Prostano, The School Library Media Center, 5th ed. (Englewood, CO: Libraries Unlimited, 1999), 44.
  • Purposes of evaluation
    • To find out what is right
    • To find out what is wrong
        • Summary of definition in Blanche Woolls, The School Library Media Manager, 2nd ed. (Westport, CT: Libraries Unlimited, 1999).
  • What can you evaluate?
    • Everything!
      • Collection
      • Personnel
      • Library Layout
      • Programs
      • Curriculum collaboration
      • Reference service/collection:
        • Interview skills
        • Quality of the answers provided
        • Satisfaction of your patrons
          • Students or teachers—or other school staff
  • What should you evaluate?
    • That depends . . .
      • On your priorities
      • On your economic needs and situation
      • On the policies and procedures established by your school, school district and/or state
      • On how much time you have left over from your other duties to devote to planning and executing an evaluation project
  • Why evaluate?
    • Because you have to
      • Continuous assessment of services can be an ongoing requirement for accreditation/funding, etc.
    • Because you want to
      • You want to find out how well you’re doing
      • You want to find out how well others think you’re doing
      • You want to find ways to improve your service
  • Why Evaluate?
    • Schools are being evaluated by student academic achievement in reading, writing and math.
    • Recent studies show student achievement correlates positively with effective school library programs.
  • Why Evaluate?
    • Studies show the positive correlation with student achievement occurs when there is an effective school library program [and]:
      • School librarians are full time and
      • Librarians collaborate with teachers on instructional planning.
  • Why Evaluate?
    • Studies also show an effective school library program includes the following attributes:
      • Quality collections;
      • Increased hours of access beyond school day;
      • Professional development for teachers and librarians;
      • Student access to technology; and
      • Collaboration with other types of libraries.
  • Why Evaluate? Because. . .
    • To improve, the librarian must have baseline data about the program.
    • To be relevant, the librarian must know if the library services and resources are aligned with the school’s goals for student achievement.
    • To build a case for better support, the librarian needs data that demonstrates the value of the program.
  • What Can an Evaluation Do?
    • An evaluation enables you to:
      • Determine success in attaining program goals.
      • Determine students’ and teachers’ needs so they can be incorporated into the program.
      • Provide a basis for resource allocation.
      • Recognize strengths and accomplishments.
      • Examine the impact of the program on student learning.
    • (Nancy Everhart, Evaluating the School Library Media Center, 1998)
  • It makes sense—
    • Focus data collecting and evaluation on those components of a school library program that the research shows correlate with student achievement.
  • Types of Evaluation
    • Formal
    • Informal
    • External
    • Internal
    • Formative
      • During a program
    • Summative
      • After the program is completed
  • Evaluation requires standards
    • How do you know if the media center is “good”/ “not good” or even “good enough”?
      • “ Good enough” for what? For whom?
    • Guidelines, rubrics, or objectives for collections and performance can be found in:
      • Beyond Proficiency: Achieving a Distinguished Library Media Program. Kentucky Department of Education, August 2001.
      • Administering the Library Media Program
        • The library media specialist:
          • Evaluates the Library Media Program through regular surveys for the purpose of enhancing services
  • Types of Data
    • Quantitative
    • Qualitative
    • Examples (quantitative / qualitative):
      • Number of OPAC searches / Students’ success rate in locating appropriate resources in OPAC searches
      • Size of collection / Collection supports the curriculum
      • Number of lessons planned with teachers / Comprehensive and collaborative planning is in place
  • Standards require interpretation
    • Quantitative: relatively easy
      • Library Media Staffing (Beyond Proficiency)
      ENROLLMENT      PROFICIENT (LMS* / Clerk**)    DISTINGUISHED (LMS* / Clerk**)
      Under 200       1 / 0                          1 / .5
      200 - 500       1 / .5                         1 / 1
      501 - 800       1 / 1                          1 / 1.5
      801 - 1200      1 / 1.5                        1.5 / 1.5
      1201 - 1600     1.5 / 1.5                      2 / 2
      1601 - 2000     2 / 2                          2 / 2
      2001 and Up     2.5 / 2                        3 / 2
      * Certified Library Media Specialist   ** Library Clerk (Classified)
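    [A minimal sketch, not part of the original workshop, of how the staffing table above might be encoded for a quick lookup by enrollment. The language (Python), the function name, and the data-structure layout are illustrative assumptions, not a published tool.]

```python
# Hypothetical lookup of the Beyond Proficiency staffing table by enrollment.
# Each row: (upper enrollment bound, (LMS, clerk) at Proficient, (LMS, clerk) at Distinguished).
STAFFING_TABLE = [
    (199,  (1.0, 0.0), (1.0, 0.5)),   # Under 200
    (500,  (1.0, 0.5), (1.0, 1.0)),   # 200 - 500
    (800,  (1.0, 1.0), (1.0, 1.5)),   # 501 - 800
    (1200, (1.0, 1.5), (1.5, 1.5)),   # 801 - 1200
    (1600, (1.5, 1.5), (2.0, 2.0)),   # 1201 - 1600
    (2000, (2.0, 2.0), (2.0, 2.0)),   # 1601 - 2000
]
TOP_ROW = ((2.5, 2.0), (3.0, 2.0))    # 2001 and Up

def staffing_for(enrollment):
    """Return ((LMS, clerk) Proficient, (LMS, clerk) Distinguished) for a given enrollment."""
    for upper_bound, proficient, distinguished in STAFFING_TABLE:
        if enrollment <= upper_bound:
            return proficient, distinguished
    return TOP_ROW

# Example: a school of 950 students
print(staffing_for(950))  # ((1.0, 1.5), (1.5, 1.5))
```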
  • Standards require interpretation
    • Qualitative: more difficult
      • Program Evaluation Rubric, “ Beyond Proficiency ,” pp. 20-26.
      • Standard 2:
        • “ The Library Media Program promotes and supports student learning and achievement through its policies, programs and collection.”
        • Distinguished:
          • “ Students are empowered to use the media center to access information and reading for pleasure.”
  • Interpretation means operationalization
    • Operationalize:
      • To define a concept in a way that can be measured. In evaluation research, to translate program inputs, outputs, objectives, and goals into specific measurable variables.
        • Program Evaluation Glossary http://www.epa.gov/evaluate/glossary/o-esd.htm
  • Interpretation means creating goals and objectives
    • The purpose and study goals should determine the types of methods and measures you use to conduct the evaluation
    • Objectives will define your standard of excellence--the minimum level of appropriate service for your particular clientele
      • Jo Bell Whitlatch, Evaluating Reference Services
  • Methods of Evaluation
    • Obtrusive
      • People are aware of the evaluation
        • Self-evaluation
        • Surveys
        • Observation
    • Unobtrusive
      • People are unaware of the evaluation
        • Unobtrusive measures of physical facilities
        • Use of proxies—“mystery patrons”
  • Obtrusive reference evaluations in a school library context
    • Questionnaires or interviews of students or teachers
    • Numbers gathering:
      • Reference question counts
      • Numbers/types of reference books used
      • Circulation statistics
    • Observation
      • By external observer
      • Self-observation: Journal
  • For example. . .
    • Research finding:
      • Students whose school librarian plays an instructional role tend to achieve higher than average test scores. This is also dependent on collaboration between school librarians and teachers and the inclusion of the library materials in the curriculum.
  • Evaluation Model
    • Define the question.
    • Collect data--determine needed data and method of collection.
    • Analyze the data.
    • Formulate recommendations.
    • Develop an action plan.
  • Evaluation Model Step 1. Define the Question
    • Research finding: Students whose school librarian plays an instructional role tend to achieve higher than average test scores.
    • The question: What data are needed to determine whether or not the school librarian plays an instructional role in the school?
  • Look at the rubric
    • Beyond Proficiency:
      • 3. The Library Media program supports collaborative planning with the staff for the enhancement of instruction and support of student achievement.
        • Distinguished: The LMS teams with teachers in the formal planning of student-centered authentic learning and project-based teaching and is a teaching partner.
  • According to Information Power--
    • Instructional role is determined by:
    Collaboration between school librarians and teachers that results in the inclusion of library materials in the curriculum.
  • According to Information Power--
    • Instructional role is defined by:
      • Information literacy standards
      • Collaborative planning
      • Effective teaching
        • Differentiated learning options
        • Inquiry
        • Assessment
      • Student Engagement
  • According to Empowering Learners
    • Guideline 1:
      • The school library media program promotes collaboration among members of the learning community and encourages learners to be independent, lifelong users and producers of information.
    • Guideline 3:
      • The school library media program provides instruction that addresses multiple literacies, including information literacy, media literacy, visual literacy, and technology literacy.
  • Evaluation Model Step 2. Collect data
    • Determine the type needed.
      • Quantitative/Qualitative
    • Determine the method of collection.
      • Existing statistic
      • Survey, questionnaire, focus group, observation
  • One method to collect data--
    • Use Information Power’s Teaching and Learning Rubric as a questionnaire.
    • Administer it to
      • School Library staff,
      • Administrative staff,
      • Teaching staff.
    • Summarize each group; determine where perceptions differ and agree.
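    [A minimal sketch, not part of the original workshop, of how the rubric-questionnaire responses might be tallied separately for each respondent group. The language (Python), the sample responses, and the function name are illustrative assumptions.]

```python
# Illustrative only: tally rubric-questionnaire responses separately by respondent group.
from collections import Counter, defaultdict

RATINGS = ("Basic", "Proficient", "Exemplary")

# Invented sample responses: (respondent group, target indicator, rating checked)
responses = [
    ("Library Staff", "Information Literacy Standards integrated", "Proficient"),
    ("Administrative Staff", "Information Literacy Standards integrated", "Proficient"),
    ("Teaching Staff", "Information Literacy Standards integrated", "Basic"),
    ("Teaching Staff", "Information Literacy Standards integrated", "Proficient"),
]

def summarize_by_group(responses):
    """Return {(indicator, group): {rating: percent of that group's responses}}."""
    counts = defaultdict(Counter)
    for group, indicator, rating in responses:
        counts[(indicator, group)][rating] += 1
    summary = {}
    for key, tally in counts.items():
        total = sum(tally.values())
        summary[key] = {r: round(100 * tally[r] / total) for r in RATINGS}
    return summary

for (indicator, group), percents in sorted(summarize_by_group(responses).items()):
    print(f"{indicator} | {group}: {percents}")
```

    The per-group percentages produced this way feed directly into the agreement/disagreement comparison in Step 3.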
  • Data collected from the Teaching and Learning Rubric are qualitative--
    • Example: The principal puts the librarian on the weekly faculty meeting agenda to encourage library use and promote curriculum-oriented library materials. The math teacher does not see an easy connection and tunes out.
    • When completing the questionnaire--
      • School librarian & principal: check Exemplary!
      • Math teacher: checks nothing!
  • Evaluation Model Step 3. Analyze the data
      • Identify areas of agreement and disagreement among groups.
      • Identify areas of strength and areas that need improvement.
      • Identify areas where more data is needed.
    • How can the data be analyzed to determine whether or not the librarian plays an effective instructional role in the school?
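    [Continuing the sketch above, and equally hypothetical: one way to flag, for each target indicator, whether the groups' predominant ratings agree or disagree. The percentages used here are taken from the example slides that follow.]

```python
# Illustrative only: compare per-group rating distributions for each target indicator.
summaries = {
    "Information Literacy Standards are integrated into content learning": {
        "Library Staff":        {"Basic": 0,   "Proficient": 100, "Exemplary": 0},
        "Administrative Staff": {"Basic": 0,   "Proficient": 100, "Exemplary": 0},
        "Teaching Staff":       {"Basic": 12,  "Proficient": 88,  "Exemplary": 0},
    },
    "Curriculum development is modeled and promoted": {
        "Library Staff":        {"Basic": 100, "Proficient": 0,   "Exemplary": 0},
        "Administrative Staff": {"Basic": 0,   "Proficient": 100, "Exemplary": 0},
        "Teaching Staff":       {"Basic": 82,  "Proficient": 18,  "Exemplary": 0},
    },
}

for indicator, groups in summaries.items():
    # Predominant (modal) rating chosen by each group.
    modal = {group: max(percents, key=percents.get) for group, percents in groups.items()}
    verdict = "agreement" if len(set(modal.values())) == 1 else "disagreement"
    print(f"{indicator}: {verdict} -- {modal}")
```

    Indicators flagged as disagreement, or as agreement at a level below Exemplary, become candidates for more data collection or for improvement goals.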
  • Agreement Example--
    • Target Indicator: Information Literacy Standards are integrated into content learning .
    • Library Staff Response
    • ____Basic 100% Proficient ____ Exemplary
    • Administrative Response
    • ____Basic 100% Proficient ____ Exemplary
    • Teaching Staff Response
    • 12% Basic 88% Proficient ____ Exemplary
  • Disagreement Example--
    • Target Indicator: Curriculum development is modeled and promoted.
    • Library Staff Response
    • 100% Basic _____ Proficient ____ Exemplary
    • Administrative Response
    • ____Basic 100% Proficient ____ Exemplary
    • Teaching Staff Response
    • 82% Basic 18% Proficient ____ Exemplary
  • Strength Area Example--
    • Target Indicator: Collaborative planning is modeled and promoted.
    • Library Staff Response
    • ____Basic 100% Proficient ____ Exemplary
    • Administrative Response
    • ____Basic 100% Proficient ____ Exemplary
    • Teaching Staff Response
    • ____Basic 100% Proficient ____ Exemplary
  • Weakness Area Example--
    • Target Indicator: Students are engaged in reading, writing, speaking, viewing & listening for enjoyment, enrichment, & understanding.
    • Library Staff Response
    • ____Basic 100% Proficient ____ Exemplary
    • Administrative Response
    • 100% Basic ____ Proficient ____ Exemplary
    • Teaching Staff Response
    • 82% Basic 18% Proficient ____ Exemplary
  • Contradictory Data Example--
    • Target Indicator: Effective teaching is modeled & promoted.
    • Library ____ Basic ____ Prof. 100% Exemplary
    • Admin. ____ Basic ____ Prof. 100% Exemplary
    • Teacher ____ Basic 60% Prof. 40% Exemplary
    • Target Indicator: Student achievement is assessed.
    • Library 100% Basic ____ Prof. ____ Exemplary
    • Admin. 100% Basic ____ Prof. ____ Exemplary
    • Teacher 90% Basic 10% Prof. ____ Exemplary
  • Back to Step 2 Collecting More Data – Focus Groups
    • How are student products and performances assessed in units involving use of the library?
    • How do you determine the products and performances to demonstrate mastery of content in units involving use of the library?
    • What is the role of reflection in student work involving use of the library?
  • Evaluation Model Step 4. Formulate recommendations
    • Role of an Advisory Committee of stakeholders--
      • Review the data, formulate recommendations.
      • For example: Because of the strong correlation between the school librarian’s instructional role and student achievement, the Advisory Committee establishes a three-year goal of reaching the Exemplary level in all target indicators in the Teaching and Learning Rubric.
  • Example: Recommendations for First Year
    • Standards for the 21st Century Learner
      • Review the Standards for the 21st Century Learner to ensure they align with content standards and set student expectations for analysis, evaluation, and inquiry.
      • Meet with teaching teams to solicit recommended revisions.
      • Convene a workshop for new teachers to review the standards.
  • Example: Recommendations for First Year
    • Collaborative Planning
      • Establish regular common planning time for teachers with the library media staff.
    • Curriculum Development
      • Review school policies to remove any barriers that prevent librarians from participating in building and district curriculum sessions.
  • Example: Recommendations for First Year
    • Reading, Writing, Speaking, Viewing
      • Provide a series of author workshops and develop plans with teachers for student reading, writing, speaking, and viewing responses to the author workshops.
      • Involve Parents.
    • Effective Teaching
      • Clarify conflicting feedback through focus groups targeting use of assessment, differentiation, & inquiry in instruction.
      • Develop recommendations based on new data.
  • Evaluation Model Step 5 . Develop an action plan
    • Library staff develop an action plan for each recommendation.
    • The Advisory Committee of Stakeholders reviews and approves the action plans.
  • Action Plan Example: Target Indicator: Curriculum development is modeled and promoted
    • Objective: Review barriers preventing librarians from participating in curriculum sessions.
      • Activity: Review policies | Documentation: School Council minutes | Participants: School Council & admin. | New Resources: None projected | Completion Date: Sept. 1
      • Activity: Secure schedule | Documentation: Post schedule in lib. & faculty rm. | Participants: Principal | New Resources: None | Completion Date: Sept. 10
      • Activity: Participate in mtgs. | Documentation: Meeting minutes | Participants: Librarians | New Resources: None projected | Completion Date: June 1
      • Activity: Report back | Documentation: Faculty mtg. minutes | Participants: Librarians | New Resources: None | Completion Date: Monthly
      • Activity: Written material | Documentation: In professional library | Participants: Librarians | New Resources: None | Completion Date: Within 5 days of mtg.
  • This example of an evaluation was…
    • Type (which one?)
      • Formal
      • Internal
      • Formative
    • Data collection methods:
      • Research
      • Rubric-based questionnaire
      • Focus Group
    • Data type:
      • Quantitative
      • Qualitative
  • Unobtrusive evaluation in schools 1
    • Unobtrusive measures of physical facilities
      • “ The basic premise . . . is that you can learn a great deal . . . by looking at how things wear (‘erosion’), how things are left in the building (‘traces’) and how things are rearranged (‘adaptations for use’).”
        • Nancy Everhart, Evaluating the School Library Media Center : Analysis Techniques and Research Practices. Libraries Unlimited, 1998 .
  • Unobtrusive evaluation in schools 2
    • Use of proxies—“mystery patrons”
      • Technique used more often in academic libraries and government documents reference services
        • Half-right reference: the 55% rule
        • 5-minute rule
          • Peter Hernon and Charles R. McClure, “ Unobtrusive Reference Testing: The 55 Percent Rule ” Library Journal April 15, 1986, 37-41.
        • “ It’s not true, and now we know why . . . the so-called “55% rule” has never been tested against a truly representative field sample.”
          • John V. Richardson, Jr., “ Reference Is Better Than We Thought ,” Library Journal April 15, 2002, 41-42.
  • Importance of a sophisticated model
    • Richardson:
      • “ The reference service performance model [that led to the 55% rule] was overly simplistic, samples were way too small, and the test questions were not representative of real-world reference questions.”
      • “ Students of reference service should learn about the existence of multiple performance outcomes (i.e., accuracy, utility, and satisfaction) and to recognize that each outcome is driven by different factors.”
        • “ Reference Is Better Than We Thought .”
  • Characteristics of good models
    • Measures of reference service must be
      • Valid
        • They “accurately reflect the concept being studied.”
      • Reliable
        • They “are stable and dependable, and provide consistent results with each repeated use.”
      • Practical
        • They “require that data be relatively easy to collect.”
      • Useful
        • They “provide information that can be used to improve reference services.”
          • Whitlatch, Evaluating Reference Services.
  • Problems of Evaluation in School Context
    • Often the school library media specialist is the only one working in the library
    • An evaluation project can be time-consuming to plan and to put into action
  • Solutions to evaluation problems
    • Depends on your local situation
      • Your resources:
        • Time, money, available warm bodies
      • You and your creativity
    • Make evaluation part of your routine
      • As you plan/prepare your program(s), include an evaluation component
        • Appropriate, affordable, accessible, accountable