Matt Keene (PPT)

Notes for slides
  • Introductions. Activity: Evaluation in Our Lives – give personal context for integrating evaluation and how we can think about it within our programs. Evaluation and its evolution at EPA – ask who has experience with performance management; present the tools we are going to use to help integrate evaluation; show how evaluation within EPA is evolving to meet the more sophisticated needs of clients and stakeholders. Case study – provide the practical example to which we will apply the tools; demonstrate along the way the evolution of the program and our thinking on how to evaluate it. Exercise – integrate evaluation into the design of the MN project (the design phase); this will be as practical as we can be, but we will still be in the conceptual realm, and I'll fill in the reality at the end of the exercise. Wrap-up – revisit the process and make sure the connections between each step are clear; short discussion about challenges and opportunities within participants' programs; take-home messages – what can you do to take the practice of evaluating environmental programs to the next level in understanding effectiveness and success and creating evidence that can inform our decisions?
  • We'll all be working together pretty closely through the session, so we really need everyone's input. Introduction: introduce someone at your table to the rest of the room and tell us one thing they want for the holiday season. Who is not familiar with…
  • This is new for ESD – our first time. You are helping ESD and the MN demonstration project, and we hope we can help you (and yours). This is not about working evaluation results into your program but about building your program so that it is evaluable.
  • Your life and the life of a program. Purpose of activity: demonstrate that because we have consciously and unconsciously integrated evaluation into our lives, we already know the questions and the process of integrating evaluation into programs – we are just attempting to provide a structure. Shift the mindset from an informal but effective way of integrating evaluation into our lives to a more formal, practical application of evaluation principles and concepts. Background: it began before birth! At first it was external – for others to evaluate our status (this happens throughout our lives) – and then we learned to internalize an evaluative mentality and apply it to our own lives. We learned to do this because we think about the future, and we each have a unique set of values that guides our decisions about what to evaluate and how rigorously. Evaluation in your life: grades, school, sport, family, faith – for others to evaluate your progress, for you, or both. We often reach a point in maturity where what was originally external becomes internalized, and we start to create our own goals, measures, and activities so that we can establish success, understand how effective we are, and understand progress. And we're still accountable to others, aren't we – spouse, kids, church, neighbors (cut grass, rake leaves) – to meet perceptions of right and wrong, to put it simply; of course each particular case is much more complicated, and they are all likely to be intertwined in one way or another. How do we do this? We work them into our lives naturally over time. Evaluation in a program is not terribly different – these are similar questions that we ask when designing an evaluation. Why do we? Because we determine something is worth the attention. How do we? It's instinctual and learned: we have evolved to look toward the future and prepare in order to survive, and we are taught evaluation from all sides from the time we can learn. Activity: give my juggling example and present the steps along the way – audience (really me), juggling techniques. Instruct each individual to jot down their own example. Debrief at the table and look to each table's ESD rep for an example to share. Participants are welcome to use their New Year's resolution.
  • What are we talking about? Conceptually this is a component of performance management and of adaptive management, a term commonly used in natural resource management and biodiversity conservation. It's difficult to integrate evaluation into environmental programs without engaging in these larger processes. We're talking about shifting our view at the beginning of a program toward the end, and also asking those in mid-stream to question assumptions made at the beginning, in the design and planning stages. This is about mindset, and it's often a mentality that is contrary to the culture into which environmental disciplines and organizations were born. This is about using your understanding of what's necessary to understand the effectiveness and success of your program and figuring out how to integrate evaluation into your work (program or project). We can provide structure, but of course context is the governing force that takes over once the process is set into motion within your organization. Components of context may be your agency, office, governance system, clients, social culture, geography, affecting policies, etc. To develop a program where we can manage performance best, we must by definition integrate evaluation into the program, or else introduce the potential for significant inefficiencies in improving performance and informing decisions. But is this ever really happening? Not often. ESD trainings are not explicitly set up to address this issue because it is rare (although becoming more common all the time) that a program comes to us with the capacity or desire to integrate evaluation into their work. Evaluate with people versus to people. What gets measured and used gets done. Why do we integrate evaluation into our lives? How do we do it?
  • Evaluation Support Division mission: evaluate and build capacity. Performance management (framework): we are breaking down the performance management framework so that we can thoroughly think about and integrate evaluative concepts into each step.
  • An approach to checking our assumptions about program theory and adjusting program practice in order to understand progress and achieve goals more efficiently. Explain the definition and purpose of the PM framework.
  • We teach how to use the performance management tools in the various steps to completing an evaluation. Keep in mind that today we are talking about integrating evaluation (a component of the performance management approach) into program design. We are going to briefly review the tools, because understanding the performance management tools and their use is critical to understanding how to integrate evaluation into program design.
  • Explain that in order to integrate evaluation into our program's design, we must understand the tools that will be important in doing so (the PM tools) and we must understand what we are preparing to do (the steps to completing an evaluation). From this follows our approach to integrating evaluation. Sphere of influence.
  • Give 30 seconds to review
  • Accountability (external audience): What is my level of performance? Learning and program improvement (internal/external audiences): What is my level of performance? What outcomes have been achieved and why? What aspects of my program led to these outcomes? What role did context play in my outcomes?
  • Impact evaluation methods: BACI (before-after control-impact), counterfactual, sampling strategies/techniques, propensity score matching. (A hedged numeric sketch of the BACI comparison logic follows these notes.)
  • On all fronts we are witnessing the emergence and retooling of programs, organizations, and disciplines/fields. What types of programs are we typically working with, and set in what type of culture? What does this mean? Mainly logic models, not starting from scratch with goals, objectives, and mission – but maybe we should. What are we teaching and why? What changes are we seeing among our clients at EPA, outside EPA, and in the environmental community? Only now are we clearly seeing the opportunities to make it happen – not that we haven't known, but what we know and what we do are different things; all stakeholders were not on board and clients just were not ready. They are looking more ready, and we are seeing it happen in some places – we'll get to that. What role and responsibility do we have? Managing performance is iterative and requires integrating evaluation. If we evaluate so that we have results and learning to use to improve and adapt, then we must view evaluation as iterative, or else we risk not knowing whether our "improvement" really was one. To know this we must integrate evaluation into the life of the program, so that it is not viewed as a discrete activity or product but as an integral part of program function.
  • We are going to use one of our evaluations as an example
  • Paint: estimated gallons/year leftover in the U.S. – 64 million (10% of all paint sold). Estimated average cost to manage – $8/gallon. Estimated total cost to manage leftover paint – $512 million. 30-40% of oil-based paint is being returned at HHW facilities. Latex paint is a resource issue like any other recyclable. This is not sustainable! So the PPSI formed, and that is… 1st MOU in '05, 2nd this month = consensus = collaboration; a 4+ year process and counting. Raised $900,000 for 9 projects; 3-year continuation of the partnership; formation of an industry-run organization, manufacturer/consumer funded. Goals of paint management: reduced paint waste; efficient collection, reuse, and recycling of leftover paint; increased markets for recycled paint; and a sustainable financing system to cover end-of-life management costs. Demonstration project in MN – paint management costs $6M/yr. Roll out to other states: January 2009 – OR, WA, VT; July 2009 – CA; January 2010 – IA, FL, NC, IL. (A back-of-the-envelope check of these cost figures appears after these notes.)
  • As ideal an evaluation as we can expect to get, and an opportunity to work with. What will we evaluate? Intentions are noble and on target, and the thinking, understandably, had been thorough but ad hoc. In other words, measurement and goals were not explicitly linked, putting the utility of results and learning at risk. From the MOU: establish baseline data regarding the amount of post-consumer paint collected, reused, recycled, fuel blended and landfilled, and the amount of virgin and recycled paint sold; program costs; and participation rates. Establish evaluation and measurement criteria prior to implementation of the Demonstration Project; these are to include program costs, participation rates, and amounts collected, reused, recycled, fuel blended, and disposed. Develop a Final Report that evaluates the effectiveness of the Demonstration Project and assists in determining the best options for rolling out a national paint management system; the Report is to include recommendations on how the methods used in the Demonstration Project should be modified to be successfully implemented in other areas of the country. Why are we evaluating the program? They knew they wanted to but didn't know what they were getting into. Transferability (locations and products). Learning. It is perceived across the stakeholders that it is better to voluntarily manage paint rather than allow legislation.
  • Leadership: from MN and those wanting the MN demonstration to happen, and from EPA, recognizing the opportunity to integrate evaluation into the MN project.
  • What do we know? What is our vision for evaluation of the MN project? When will the evaluators arrive? Before the management plan is complete, and recognizing some issues with designing the program and getting to the answers they want to ask. Created an evaluation workgroup that consists of… How will it all work? There is an evaluation workgroup but many others as well. Many projects are going on simultaneously, and designing the evaluation to address each will be a challenge. Because the groups are so diverse, it is difficult to communicate consistently. Money – EPA. Staff (for the evaluation group) – federal, state, local, recyclers, manufacturers, retailers. Time – stalling at financing and agreements between reps within workgroups. Can we integrate evaluation into this project? What do we have going for us? Collaborations – teams. MOU – workplan. Goals and objectives – goals that match with the evaluation design. Evaluation expertise and willingness of all stakeholders. Money. A new program. Reminder – the vision for performance management: generate results and learning relevant to understanding success and our effectiveness, as well as create a body of evidence that informs decisions.
  • Explain that in order to integrate evaluation into our program's design, we must understand the tools that will be important in doing so (the PM tools) and we must understand what we are preparing to do (the steps to completing an evaluation). From this follows our approach to integrating evaluation.
  • BREAK – review MN program literature during the break.
  • Early in the process, so we're hoping that you come up with some ideas that will guide the integration process. Overall explanation of the exercise we are going to complete over the next 2 hours: all of what we are about to do takes place during the planning phase of the project. Have them brainstorm about each step on a separate piece of paper/flipchart. Each group prioritizes down to 1-3 and reports out on each step from the table. ESD can type their reporting into the projected templates.
  • Encourage participants to start thinking about the MN Demo project while I'm introducing each step of "integrating." Questions about this process/structure?
  • Team: staff or non-staff; diversity – make the most of this at your table; expertise; peer/independent review of methods – where will this come in? Mission: understanding the situation or context in which the project or program will take place – the drivers of integrating performance management. Could include the situation analysis: threats, needs, opportunities, risks, stakeholders, audiences, partnerships. Clearly define the issue you are trying to address and consider the scope of the area or theme. Goal: goals are often long-term outcomes, and objectives often serve as short- and mid-term outcomes. Each group will receive a goal for the MN demonstration project; this goal will serve as a long-term outcome and the point from which your group will develop a single line of logic, including mid-term outcomes, short-term outcomes, customers/audiences, outputs, and activities. Describe the program: discuss program and team selection briefly. Hand out the mission, goals, and objectives; explain and give time to look them over. Key questions – or let them decide on one for each table? Also hand out a large worksheet for each table to track the progress of integration?
  • Logic model: we will combine the lines of logic from each group to form a complete logic model. Each group will choose a component of the logic model, preferably an outcome. Clearly articulate core assumptions by showing the causal links. Identify resources available to the project. The logic model keeps team members on the same page, helps isolate specific tasks for each activity, and helps with roles and responsibilities for team members. Hand out a logic model template to be filled in by each table; each template will have the goals/long-term outcomes on it and the headings at the top of each column. Goals from the MN project represent long-term outcomes, and each table fills in the other components of the logic model. We'll put the lines of logic together to form a complete-ish model and view/edit/complete the model during the break.
  • Brainstorm questions within the group, then prioritize the questions and choose the #1 most important question for understanding whether the program accomplishes a given component of the logic model. Develop a question: each group will craft an evaluation question that speaks to the component of the logic model they have chosen. Good evaluation questions fit with your intended purpose for conducting the evaluation. Evaluations that seek to improve a program may ask questions about how well the program is being implemented, whether it is delivering its intended outputs, and to what extent the program is achieving short- and intermediate-term outcomes. Evaluations that seek to demonstrate whether a program worked as intended typically focus on outcomes and impacts. Good evaluation questions are framed in ways that will yield more than yes or no answers, and they consider the concerns of multiple stakeholders. Consider the following examples: To what extent were the planned activities completed? Why were they or weren't they completed? To what extent are we achieving our outcomes? How close are we to achieving them? Why did or didn't we achieve them? How well are we managing our program? What additional staff and resources are needed to meet our objectives? How can we ensure that the program is replicable, cost-effective, and/or sustainable? Context (historic, current, emergent): explore and brainstorm the contextual factors surrounding this question. What social, economic, and political factors influence the evaluation? What is the significance of the evaluation? What are the values surrounding the evaluation? For instance, will culture, governance, politics, society, economics, or the surrounding environment affect the evaluation of this question? Consider appropriate geographical and social scales. Audience: who are the potential audiences for the question? Program managers? Staff? Current or potential funders? Government agencies? Teachers? School administrators? What kind of data will you need to collect to meet the needs of these different stakeholders? What information will they find most credible and easy to understand? Prioritize audiences. Communication: what are the best ways to communicate with each audience? What message does each audience need, what is the best way to deliver it, and what products will communicate best? Prioritize products. Get clear management recommendations to the right people and provide the necessary details for interpretation. Time must be planned for this for staff and other audiences. Measure how well communication works – feedback on products, framed for each audience. Use: what are the potential ways that each audience might use the results and learning generated from answering the question? Characterize what you will do when you learn things – how to react, positive and negative: making changes, adjusting, adapting goals, objectives, activities, etc. Evaluation should serve priority audience needs and cost-effectiveness: assess strategies, improve programs.
  • Context may affect how and what you measure. Why are you planning the evaluation? Is it for accountability, to document the program's results to an organization or funder? Is it to learn if the program is on the right track, to assess the program's accomplishments, to improve the program, or something else? The answers to these questions should influence the methods you use to carry out your evaluation. If you decide, for example, that your goal is to generate evidence of your program's success, you may want to focus on an outcome you are confident the program is achieving and select methods that will allow you to generalize results to all program participants. Alternatively, if your goal is to improve specific aspects of the program, you may want to focus on these by obtaining in-depth recommendations from participants and staff.
  • Tie back to my intro example of a performance measure for juggling to get them started. Identify the measure: what can you measure to answer the question, and is it feasible? Data sources: where is the information that is relevant to this measure? Information collected as a part of daily management is best. Consider primary and secondary data sources – for instance, is it with stakeholders, in the water, in existing monitoring databases, do reports and articles contain the information, or will economic or ecological models generate it? Collection methods and strategy: what method(s) can we use to mine the data source – for instance, survey/questionnaire, ecological monitoring, focus groups, literature review, document review, environmental databases, economic modeling, market research, etc.? Generate information appropriate for the intended use and audience; ask what we need to learn about methods (practices, principles); consider the resources necessary and available for each method. What strategy (sampling techniques and approaches) will you use to implement each data collection method? For example, given that a survey is a collection method, the collection strategy might address questions of target population, sampling frame, statistical criteria, and nonresponse issues. Can you establish comparison groups or a counterfactual? Is before-and-after observation or sampling possible? Is random sampling possible or desirable? What are the timelines for collection? Isolate and assess the effects of your intervention. Design to meet, not exceed, the required sophistication – don't collect merely what's interesting, only what you need. Analytical methods: for instance, qualitative assessment, statistical hypothesis tests, propensity score matching, data tabulation (a hedged matching sketch follows these notes). Choose models that are more efficient in isolating program effects from other influences. When possible, use rigorous analytical designs or statistical analyses, and interpret with caution – not all methods lead to causality. Analysis should not happen at only one point but should be consistently checked; we want an ongoing understanding so timely change is an option. Data collection: implementation of the data collection strategy. What are the logistics governing data collection – for instance, who will be responsible, what are the milestones and timelines, and what resources will be required for each method? Define timing and frequency (different for different methods), plan to calibrate and test, clearly define roles and responsibilities, keep the investment consistent with the action, and consider existing and required skills. Data management: who will manage the data, how (entry, format, coding, quality checks/feedback), and where will it be stored? Where, who, when, and how will you manage the information that is collected for each measure?
  • Evaluation methodology – for a program or project – links/connects each decision and component, from the description of the program to the interpretations of success. M&E methodology, evaluation method, performance management method. Evaluability assessment: this process guides us to the information necessary to do an evaluability assessment – is the program ripe? Program design, access to data, comparability. Performance management policy – continue to use MN as an example: an overarching office/division strategy/policy for evaluation and performance management. Topics covered could include culture, institutionalizing PM, priorities, steps for implementing the PM policy and method, peer/independent review of methods and results, internal program improvement and external accountability, the purpose of performance management, funding, timelines, consultants, adapting methods and policy, and capacity building. Implementation: we could list every principle and task from the planning with slight modifications for implementation, but that would be redundant.
  • Display the process they've created on the overhead and give them 5 minutes to look again and note observations and obstacles, and then take 5 minutes to report on what they observe. Revisit the process: look at the decisions we've made along the way, look for the flow and the breaks, and show that this is the first cycle of integrating performance management.
  • Recount the reality of integrating evaluation into the MN demo program. Post the mission, goals, and objectives… and the logic model if there is one.
  • 5-10 minutes. Practice and theory – a dynamic mix of learning and doing. When to do it? What are the obstacles? Are there solutions? Are there opportunities to improve evaluations in your shop? Here are a few thoughts about ramping up the quality of your evaluations – opportunities to merge theory and practice: policy, leadership, new programs, capacity building efforts like this one.
  • We can only go forward from here. Attempts to integrate evaluation… how do you see it?
  • Documenting your unique performance management approach in a living program policy. Document the performance management policy: an overarching office/division strategy for evaluation and performance management. Topics covered could include culture, institutionalizing PM, steps for implementing the PM policy and method, internal program improvement and external accountability, the purpose of performance management, funding, timelines, consultants, adapting methods and policy, and capacity building. Evaluation methodology: documenting all components of the program's approach to managing performance; it links/connects each decision and component, from the description of the program to the interpretations of success.
  • SCRIPT: Logic model – a tool/framework that helps identify relationships among program elements and the problem to be solved. Performance measurement – the ongoing monitoring and reporting of program progress and accomplishments using pre-selected performance measures; tells you what level of performance your program has achieved. Program evaluation – a systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why; tells you why you see that level of performance.
  • Evaluation is an iterative process; the process of integrating evaluation is not. Operations into which evaluation/performance management is integrated will, by definition, enter into continuous evaluation aimed at improvements to the program as well as to evaluation/performance management as a whole.
  • Program needs: what/why/who drives the need for performance management? Measures: develop and refine.
  • A slightly more sophisticated way to look at how evaluation fits into the program cycle, as well as the types of evaluation that you can use within your program. To realize this vision of how a program runs and how evaluation can inform parts of the management cycle, we can integrate evaluation from the beginning or work to integrate it into our more mature programs – which will require a close look at the assumptions we made at the beginning and throughout the evolution of the program.
  • Program needs: what/why/who drives the need for performance management? Measures: develop and refine.
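The impact-methods note above lists BACI (before-after control-impact) designs and counterfactuals. A minimal sketch of the difference-in-differences logic behind such a comparison, with made-up numbers purely for illustration; it is not part of the MN evaluation design.

```python
# Difference-in-differences estimate for a simple BACI (before-after control-impact) design.
# The four means below are hypothetical placeholders, not program data.
impact_before, impact_after = 100.0, 140.0    # e.g., gallons collected at sites with the program
control_before, control_after = 100.0, 110.0  # comparable sites without the program

program_change = impact_after - impact_before        # change where the program ran
background_change = control_after - control_before   # change expected anyway (counterfactual trend)

effect = program_change - background_change
print(f"Estimated program effect: {effect}")  # 40 - 10 = 30
```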
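The cost figures in the paint note above can be sanity-checked with simple arithmetic. A minimal sketch using only the estimates quoted in the note (64 million leftover gallons per year, roughly 10% of sales, at about $8/gallon to manage); the variable names are illustrative only.

```python
# Back-of-the-envelope check of the leftover-paint cost estimates quoted in the notes.
leftover_gallons_per_year = 64_000_000   # ~10% of all paint sold in the U.S.
cost_per_gallon = 8                      # estimated average cost ($) to manage one gallon

total_cost = leftover_gallons_per_year * cost_per_gallon
print(f"Estimated national management cost: ${total_cost:,.0f}/year")   # ~$512,000,000

# Implied total paint sold, given leftovers are ~10% of sales.
print(f"Implied paint sold: {leftover_gallons_per_year / 0.10:,.0f} gallons/year")
```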
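The measures note above mentions comparison groups and propensity score matching among the analytical methods. A minimal matching sketch, assuming NumPy and scikit-learn are available; the data are randomly generated placeholders, not MN program data, and the covariate names are hypothetical.

```python
# Minimal propensity-score-matching sketch: match each program participant to the
# non-participant with the closest estimated probability of participation, then
# compare outcomes across matched pairs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
covariates = rng.normal(size=(n, 3))          # e.g., household size, income, distance to drop-off
treated = rng.integers(0, 2, size=n)          # 1 = participated in the program (synthetic)
outcome = covariates[:, 0] + 2.0 * treated + rng.normal(size=n)  # gallons returned (synthetic)

# 1. Estimate propensity scores: P(participation | covariates).
scores = LogisticRegression().fit(covariates, treated).predict_proba(covariates)[:, 1]

# 2. For each treated unit, find the comparison unit with the nearest score.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
matches = control_idx[np.argmin(np.abs(scores[treated_idx, None] - scores[None, control_idx]), axis=1)]

# 3. The average outcome difference across matched pairs approximates the effect on participants.
att = np.mean(outcome[treated_idx] - outcome[matches])
print(f"Matched estimate of the effect on participants: {att:.2f}")
```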

    1. Integrating Evaluation into the Design of Your Innovative Program – Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, US Environmental Protection Agency. Innovation Symposium, Chapel Hill, NC, Thursday, January 10, 2008
    2. Workshop Outline
       • Introductions
       • Activity – Evaluation in Our Lives
       • Evaluation and its Evolution at EPA
       • Case Study – Product Stewardship in MN
       • Exercise – Integrating Evaluation in MN
       • Opportunities to Integrate Evaluation
    3. Introductions
       • This will be an interactive workshop… so let's interact!
       • Get to know someone at your table
       • Tell us who they are, who they work with, and their New Year's resolution
    4. Purpose of the Workshop
       • Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.
    5. Evaluation In Our Lives
       • Activity: name something in your life that you or someone else decided was worth measuring and evaluating.
       • What was the context? Was there a target or goal… what was it? Who was the audience?
       • How did you measure progress or success? How did you use what you learned?
    6. Evaluation In Our Programs
       • What can we take from evaluation in our lives and apply to addressing environmental challenges? Measure what matters; evaluate for others and for ourselves.
       • Integrating evaluation into program design: equal parts art and skill; performance management and quality evaluation are inseparable.
    7. Evaluation In The EPA
       • Evaluation Support Division. ESD's mission: evaluate innovations and build EPA's capacity to evaluate.
       • Performance management: an approach to accomplishing EPA goals and ESD's mission.
    8. Performance Management
       • Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement, and program evaluation.
       • Logic model: a tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.
       • Performance measurement: helps you understand what level of performance is achieved by the program/project.
       • Program evaluation: helps you understand and explain why you're seeing the program/project results.
    9. Steps to Completing an Evaluation
       • I. Selecting a Program for Evaluation; II. Identify Team/Develop Evaluation Plan; III. Describe the Program; IV. Develop Evaluation Questions; V. Identify/Develop Measures; VI. Design the Evaluation; VII. Collect Information; VIII. Analyze and Interpret Information; IX. Develop the Report
    10. Logic Model (diagram): Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM); read from HOW to WHY, distinguishing the program from results from the program, with external conditions influencing performance (+/-). Juggling example labels: Me, Juggling Regimen, Snodgrass, Training, Commitment, Victory.
    11. Performance Measurement
       • Definition: the ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
       • Measures are designed to check the assumptions illustrated in the logic model.
    12. Measures Across the Logic Model Spectrum (element – definition – example measures)
       • Outputs – measure of products and services provided as a direct result of program activities – e.g., # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
       • Customer satisfaction – measure of satisfaction with outputs – e.g., % of customers dissatisfied with training; % of customers "very satisfied" with assistance received.
       • Customers reached – measure of target population receiving outputs – e.g., % of target population trained; # of target population receiving technical assistance.
       • Outcomes – accomplishment of program goals and objectives (short-term and intermediate outcomes, long-term outcomes/impacts) – e.g., % increase in industry's understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.
       • Activities – measure of work performed that directly produces the core products and services – e.g., # of training classes offered as designed; hours of technical assistance training for staff.
       • Resources/Inputs – measure of resources consumed by the organization – e.g., amount of funds, # of FTE, materials, equipment, supplies, etc.
    13. Program Evaluation
       • Definition: a systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
       • Orientation/approaches to evaluation: accountability (external audience); learning and program improvement (internal/external audiences).
    14. Types of Evaluation (diagram): design, process, outcome, and impact evaluation shown against the logic model components (Resources/Inputs, Activities, Outputs, Customers, short-term, intermediate, and longer-term outcomes; HOW → WHY).
    15. Questions, Comments and Clarifications
       • Are there any questions or comments about what we have covered so far?
    16. Environmental Evaluation: Evolving Theory and Practice
       • ESD is witnessing the shift from awareness to action.
       • We are adapting to the increasing sophistication of our clients and demands from stakeholders: capacity building and evaluations.
       • Managing performance requires integrating evaluation into program design.
    17. Our Case Study
       • Our case study is representative of a trend toward more sophisticated evaluations of environmental programs.
       • ESD is applying learning and adding to it as we take on more sophisticated projects.
       • From here on, you are receiving information necessary to complete the exercises: you are responsible for integrating evaluation into the program, so ask questions and take notes!
    18. Case Study: Paint Product Stewardship Initiative
       • Background on…
       • Current status and goals of PPSI
       • Minnesota Demonstration Program
    19. Evaluating the Demonstration Program
       • What will we evaluate? Paint; management systems; education; markets; cooperation? financing system?
    20. Regional Draft Infrastructure
       • Why are we evaluating? Leadership; legislation; learning; transfer.
    21. Evaluating the Demonstration Program
       • What will we evaluate? Paint, management systems, education, markets.
       • Why are we evaluating the program? Leadership, legislation, learning, transfer.
       • Can we integrate evaluation into this project? We need a framework to follow… and we are building it as we go. Initially, integrating evaluation into your program is a design and planning activity.
    22. Integrating Evaluation into Program Design
    23. Questions, Comments and Clarifications
       • Take a few minutes to familiarize yourself with the mission, goals and objectives of the MN demonstration program.
    24. Exercise: Integrating Evaluation
       • Minnesota Demonstration Project and performance management: we will introduce a process for integrating evaluation into the MN program and use it to, step by step, integrate evaluation into the design of the MN program.
       • Logistics: your table is your group for the rest of the workshop; after brief instruction, each team will complete each step of the process and report the results.
    25. Integrating Evaluation into Program Design (framework)
       • Program: 1. Team; 2. Mission; 3. Goals & Objectives; 4. Logic Model
       • Questions: 1. Context; 2. Audience; 3. Communication; 4. Use
       • Measures: 1. Data Sources; 2. Collection Methods & Strategy; 3. Analysis Tools; 4. Data Collection; 5. Data Management
       • Documentation: 1. Performance Management Policy; 2. Evaluation Methodology
    26. Select and Describe the Program
       • … is our program. Your table is the team that will build evaluation into the MN program.
       • Describing the MN program: mission; goals and objectives; logic model (we are going to make one!)
       • (Sidebar: Program – 1. Team; 2. Mission; 3. Goals & Objectives; 4. Logic Model)
    27. Describe the Program: Logic Model (juggling example labels: Me, Juggling Regimen, Snodgrass, Training, Commitment, Victory)
       • Instructions: each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g., activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic. A data-structure sketch of a line of logic appears after the slide list.
       • Columns: Resources, Activities, Outputs, Customers, Outcomes (Short Term, Intermediate, Long Term)
    28. Evaluation Questions
       • What are the critical questions to understanding the success of the MN program?
       • Use an outcome from your logic model to create your evaluation question.
    29. Evaluation Questions
       • What contextual factors may influence the answers to each question?
       • Who are the audiences for each question? What's the best way to communicate with each audience? How might each audience use the answer to each question?
       • (Sidebar: 1. Context; 2. Audience; 3. Communication; 4. Use)
    30. Evaluation Questions
       • What are the critical questions to understanding the success of the MN program? Use an outcome from your logic model to create your evaluation question.
       • What contextual factors may influence the answers to each question?
       • Who are the audiences for each question? What's the best way to communicate with each audience? How might each audience use the answer to each question?
    31. Performance Measures
       • What can we measure to answer each question? Where can we find the information for each measure?
       • How can we collect the information? Given our questions and the information to be collected, what will be an effective collection strategy?
       • (Sidebar: 1. Data Sources; 2. Collection Methods & Strategy; 3. Analysis Tools; 4. Data Collection; 5. Data Management. A sketch of a per-measure planning record appears after the slide list.)
    32. Performance Measures
       • What analytical tools will give us the most useful information? How will we implement the collection strategy? How will we manage the data?
    33. Performance Measures
       • What can we measure to answer each question? What methods are best suited for each measure? What analytical tools will give us the most useful information?
       • Given our questions and the information to be collected, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?
    34. Documentation: Methodology & Policy
       • Evaluation methodology: the process of integrating evaluation generates a framework for a methodology and an evaluability assessment.
       • Performance management policy: applies across office programs and projects; guides strategy and planning.
       • (Sidebar: 1. Evaluation Methodology; 2. Performance Management Policy)
    35. Check the Logic
       • Revisit the process and the decisions made.
       • Look for the flow in the process and identify potential breaks.
       • Identify potential obstacles to our approach to managing the performance of the MN demonstration program.
       • The 1st cycle is integrating – the next cycle begins implementation.
    36. What is happening today with the PPSI?
       • MOU
       • Workgroups/committees
       • Minnesota demonstration project planning
       • Integrating evaluation into project design
    37. Recap and Next Steps
       • Practice : Theory – an inconsistent ratio
       • Movement in the environmental community toward evidence, effectiveness, and evaluation
       • Opportunities to merge theory and practice: policy, leadership, new programs, and capacity building efforts like this one
    38. Thank You!
       • Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, U.S. Environmental Protection Agency
       • Matt Keene, (202) 566-2240, [email_address], www.epa.gov/evaluate
    39. Adaptive Management Cycle
    40. Evaluation… In the Life of a Program
       • When to do it? What are the obstacles? Are there solutions? Are there opportunities to improve evaluations in your shop?
    41. Evaluation Questions
       • What are the critical questions to understanding the success of the MN program? Link your questions to a component in your line of the logic model.
       • What contextual factors may influence the answers to each question?
       • Who are the audiences for each question? What's the best way to communicate with each audience? How might each audience use the answer to each question?
    42. Document Evaluation Policy and Methodology
       • Evaluation policy
       • Evaluation methodology
    43. Performance Measures
       • What can we measure to answer each question? What methods are best suited for each measure?
       • What analytical techniques could we use to maximize the rigor of our analysis?
       • Given the level of rigor desired, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?
    44. Materials
       • Presentation, flip charts, markers, projector, laptop, tape for flipchart paper, Post-its
    45. Supporting documents from PPSI, etc.
       • MN MOU; MN goals, objectives, and tasks; workplan; logic model
    46. Performance Management Cycle – needs adaptive management components like "implement" (diagram components: logic model – conceptual framework; performance measurement – helps you understand what; program evaluation – helps you understand and explain why; program mission; planning; aggregate/analysis; adapt/learn/transfer)
    47. Steps to Integrating Evaluation into Program Design (diagram): select a program; identify a team; describe the program (needs, mission, goals & objectives, logic model, context); develop questions (audiences, use, communication); identify measures (methods, collection strategy, collection, analysis, data management); document (policy, methodology)
    48. Integrating Evaluation into Program Design (diagram)
       • Team
       • Program: needs & mission, goals & objectives, logic model
       • Questions: audience, context, communication, use
       • Measures: methods, analysis, strategy, collection, data management
       • Documentation: performance management policy, evaluation methodology
    49. Program Management Cycle
    50. Needs, Mission and Goals and Objectives
       • Mission
       • What drives the need for performance management?
       • Goals and objectives
    51. Logic Model
       • Each table gets a logic model template.
       • Goals from the MN project represent long-term outcomes.
       • Each table fills in the other components of the logic model.
       • We'll put the lines of logic together to form a complete-ish model.
    52. Integrating Evaluation into Program Design: Program, Questions, Measures, Documentation
    53. Integrating Evaluation into Program Design (diagram)
       • Program: needs & mission, goals & objectives, logic model, team
       • Questions: context, audience, communication, use
       • Measures: data sources, methods & strategy, analysis techniques, collection, data management
       • Documentation: performance management policy, evaluation methodology
    54. Program: 1. Team; 2. Mission; 3. Goals & Objectives; 4. Logic Model. Questions: 1. Audience; 2. Context; 3. Communication; 4. Use. Measures: 1. Data Sources; 2. Collection Methods & Strategy; 3. Analysis Tools; 4. Data Collection; 5. Data Management. Documentation: 1. Performance Management Policy; 2. Evaluation Methodology.
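Slide 27 asks each table to build a single line of logic from resources through a long-term outcome. A minimal sketch of how such a line could be captured as a plain data structure so that questions and measures can later be attached to specific components; the field names and example values are illustrative only (loosely in the spirit of the juggling example), not the MN project's actual logic model.

```python
# One "line of logic" through the logic model, kept as a simple record.
from dataclasses import dataclass

@dataclass
class LineOfLogic:
    resources: list[str]
    activities: list[str]
    outputs: list[str]
    customers: list[str]
    short_term_outcomes: list[str]
    intermediate_outcomes: list[str]
    long_term_outcome: str  # e.g., the MN goal assigned to the table

# Illustrative placeholder values only.
juggling = LineOfLogic(
    resources=["Me"],
    activities=["Juggling regimen"],
    outputs=["Practice sessions completed"],
    customers=["Snodgrass"],
    short_term_outcomes=["Training"],
    intermediate_outcomes=["Commitment"],
    long_term_outcome="Victory",
)
print(juggling.long_term_outcome)
```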
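The Performance Measures slides (31-33) walk through data sources, collection methods and strategy, analysis tools, data collection, and data management for each measure. A minimal sketch of one way to record those decisions per measure so nothing in the chain is left implicit; the example entries are hypothetical, not the MN evaluation's actual measures.

```python
# A per-measure planning record mirroring the headings on the Performance Measures slides.
# The example content is hypothetical and only shows how the pieces connect.
measure_plan = {
    "question": "To what extent is leftover latex paint being collected and recycled?",
    "measure": "Gallons of latex paint collected and recycled per quarter",
    "data_sources": ["Collection-site logs", "Recycler processing reports"],
    "collection_method": "Document review of site logs plus a survey of participating sites",
    "collection_strategy": "All sites report quarterly; survey a sample of non-reporting sites",
    "analysis_tools": ["Data tabulation", "Trend comparison against baseline"],
    "data_collection": {"responsible": "Evaluation workgroup", "frequency": "Quarterly"},
    "data_management": {"storage": "Shared workbook", "quality_checks": "Spot-check 10% of entries"},
}

for heading, decision in measure_plan.items():
    print(f"{heading}: {decision}")
```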
