Program Evaluation: Simple Tools for Transition Educators and Service Providers
June Gothberg, Western Michigan University
Why is evaluation important?
What gets measured gets done
If you don’t measure results, you can’t tell success from failure
If you can’t see success, you can’t reward it
If you can’t reward success, you’re probably rewarding failure…
Why is evaluation important?
If you can’t see success, you can’t learn from it
If you can’t recognize failure, you can’t correct it
If you can demonstrate results, you can win public support
From: Osborne & Gaebler, 1992, Chapter 5, “Results-Oriented Government”
How are schools doing?
“There has been substantial progress at the state, district, and school levels to respond to increased calls for accountability as a mechanism for improving student outcomes… only by accessing data on accountability indicators will districts and schools have the necessary information to improve the performance of students in their school system.”
From Abt Associates Inc. (2006, April). Marking the Progress of IDEA Implementation and Volume I: The Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA) Sourcebook Report (1999-2000, 2002-2003, 2003-2004, and 2004-2005 School Years). Study reports, data tables, and technical documentation are available at http://abt.sliidea.org.
[Data slides] From Abt Associates Inc. (2006, March). The Study of State and Local Implementation and Impact of the Individuals with Disabilities Education Act (SLIIDEA). Study reports and data tables are available at http://abt.sliidea.org.
Demands for Data
State mandates
NCLB, OSEP focused monitoring
SPP/APR: State Performance Plan and Annual Performance Reports
Program planning and improvement
Justification for funding
Community/taxpayer accountability
Job security!
Analysis of Local Plans
If we don’t anticipate our outcomes, we can’t tell if we’ve achieved them
If we don’t plan our evaluation, it’s not likely to happen
Criteria for Analysis
Goals: specific, measurable, realistic, achievable
Activities: action oriented, theoretically based, do-able
Outputs: a product (something produced), moves toward goal attainment, do-able with current resources
Outcomes: specific, measurable, meaningful
Criteria for Analysis
Indicators: specific, both short and long term, possible to do with available resources
Data sources: instruments needed and persons responsible; are data available?
Timeframe: specific
Person responsible: specific
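To make these criteria concrete, the sketch below shows how a single local plan entry could be recorded and checked for completeness before review. The field names and example values are hypothetical illustrations, not taken from the NSTTAC materials.

    # Minimal sketch: represent one local plan entry and flag missing elements.
    # All field names and values are hypothetical examples.
    REQUIRED_FIELDS = [
        "goal", "activities", "outputs", "outcomes",
        "indicators", "data_sources", "timeframe", "person_responsible",
    ]

    plan_entry = {
        "goal": "Increase the percentage of seniors with a measurable postsecondary goal",
        "activities": "Monthly IEP team reviews of transition plans",
        "outputs": "Updated transition plan for each senior",
        "outcomes": "90% of seniors have a measurable postsecondary goal by May",
        "indicators": "Percent of plans with measurable goals (short and long term)",
        "data_sources": "IEP records; transition coordinator",
        "timeframe": "September through May",
        "person_responsible": "Transition coordinator",
    }

    missing = [field for field in REQUIRED_FIELDS if not plan_entry.get(field)]
    print("Missing or empty elements:", missing if missing else "none")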
Evaluation Tools
NSTTAC Evaluation Toolkit
A tool for “data-based” decision-making
Provides “real-life” examples of evaluation instruments from various states
Samples for your use
Student Development
Title: Job Readiness Workshop
Evaluation example: Pretest and posttest
Context for use: One-day workshop for high school students
Protocol for use: Identify key learning objectives. Create a pretest to assess present knowledge and give it at the beginning of the workshop. Create a posttest with the same questions plus general questions about the workshop, and have participants complete it at the end of the workshop.
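A minimal sketch of how pretest and posttest scores from such a workshop might be summarized follows; the participant IDs and scores are invented for illustration.

    # Minimal sketch: summarize pretest/posttest results for a job readiness workshop.
    # Participant IDs and scores are hypothetical.
    pretest = {"P01": 4, "P02": 6, "P03": 3}    # items correct before the workshop
    posttest = {"P01": 8, "P02": 9, "P03": 7}   # items correct after the workshop

    gains = {pid: posttest[pid] - pretest[pid] for pid in pretest}
    mean_gain = sum(gains.values()) / len(gains)

    print("Per-participant gains:", gains)
    print(f"Mean gain: {mean_gain:.1f} items")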
Interagency Collaboration
Title: Sample Transition Services Database
Evaluation example: Organizational tool
Context for use: Teachers and service providers use the database for tracking students’ service needs, agency referrals, and services provided to students; these data are helpful for determining met and unmet service needs.
Protocol for use: This tool can be used to track students’ needs identified in their IEPs, agency referrals, and service provision. When used to project service needs, these data are useful in strategic planning.
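The NSTTAC sample database itself is not reproduced here; as one possible illustration, the sketch below shows how such a tracking database could be structured with Python’s built-in sqlite3 module. The table and column names are assumptions made for the example.

    # Minimal sketch: tracking service needs, referrals, and services provided.
    # Uses Python's built-in sqlite3 module; the schema is illustrative only.
    import sqlite3

    conn = sqlite3.connect("transition_services.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS students (
        student_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    );
    CREATE TABLE IF NOT EXISTS service_records (
        record_id   INTEGER PRIMARY KEY,
        student_id  INTEGER REFERENCES students(student_id),
        need        TEXT NOT NULL,   -- service need identified in the IEP
        agency      TEXT,            -- agency the student was referred to
        referred_on TEXT,            -- referral date
        provided_on TEXT             -- NULL means the need is still unmet
    );
    """)

    # Unmet-needs report: needs with no recorded service date.
    unmet = conn.execute(
        "SELECT s.name, r.need FROM service_records r "
        "JOIN students s ON s.student_id = r.student_id "
        "WHERE r.provided_on IS NULL"
    ).fetchall()
    print("Unmet service needs:", unmet)
    conn.close()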
Family Involvement
Title: CIMP Parent Focus Group Script
Evaluation example: Script used to conduct a focus group
Context for use: Used by the facilitator(s) to conduct and manage the flow of a focus group discussion
Protocol for use: This script is used by the focus group facilitator to provide a structure for the discussion. A script helps ensure all questions are asked and provides consistency across groups.
Family Involvement
Title: CIMP Parent Questionnaire
Evaluation example: Questionnaire used to gather information from parents
Context for use: Used to gather information from parents prior to the focus group discussion
Protocol for use: This questionnaire is provided onsite to a group of parents before they participate in the focus group to gather demographic and other information about their experiences.
Family Involvement
Title: Informal Family Forum
Evaluation example: Discussion questions posed to parents/guardians
Context for use: Informal family forum held in conjunction with a transition cadre meeting
Protocol for use: Transition cadre meetings can provide opportunities to gather information from students and families in the geographical area where the meeting is held. These questions can be used to foster discussion about students’ preparation for their post-school lives; this information is useful for those planning and implementing transition education and services.
Program Structures
Title: Self-Assessment: Ability to Implement Professional Development
Evaluation example: Self-assessment
Context for use: The assessment can be used with a variety of educational professionals, particularly those responsible for providing transition-related professional development
Protocol for use: The assessment should be used to help plan professional development, as a measure of strengths and of potential issues that should be addressed.
Program Structures
Title: Self-Assessment: Knowledge of Transition Practices Content
Evaluation example: Pretest and posttest
Context for use: The assessment can be used with a variety of educational professionals
Protocol for use: Participants complete the test before and after the content session.
Team Table Work
Identify what you have accomplished
Identify barriers you have encountered
Address evaluation issues
Consider how you are extending, or can extend, your evaluation focus to the local level and local impact!
Identify Barriers Encountered
Has everything worked as planned? If not, why not?
Did you have the resources you needed?
Did you have the administrative support you needed?
Did you get the response you anticipated?
Did you have the impact you anticipated?
Record the Barriers
Across teams at your table, record barriers for sharing
Idea Sharing
What are the barriers?
How do you address them?
Thank you!!
June Gothberg, Western Michigan University
