Monitoring Evaluation


  1. Monitoring Evaluation
  2. Introduction
     • Monitoring evaluation relates to Program management within an organization.
     • Evaluative objective: provides information to ensure that programs are working and that they contribute to success.
     • Monitoring form: associated with the allocation of resources.
     • M&E is part of the total quality management and quality assurance thrusts.
     • Quality assurance is motivated by the need for governments to be seen to deliver high-quality services.
  3. Summary of Monitoring Evaluation (Form D)
     • Orientation: assessing Program processes and outcomes, for fine-tuning and to account for Program resources.
     • Typical issues:
       • Is the program reaching the target population?
       • Is implementation meeting program objectives and benchmarks?
       • How is implementation going between sites? ...compared with a month ago?
       • How can we fine-tune this Program to make it more efficient? ...to make it more effective?
       • Is there a Program site which needs more attention to ensure more effective delivery?
     • State of program: settled; a Program plan is in place.
     • Major focus: delivery and outcomes.
     • Timing (vis-à-vis Program delivery): during delivery.
     • Key approaches: Component Analysis; Devolved performance assessment; Systems Analysis.
     • Assembly of evidence: meaningful use of valid performance measures (quantitative indicators, MIS).
  4. Key Approaches to Monitoring Evaluation
     • Component Analysis: senior management select a component of the Program for systematic analysis and review.
     • Devolved performance assessment: senior management encourage all components of a Program to assess their performance on a regular basis.
     • Systems Analysis: applies to a program which is centrally specified and disseminated for implementation to a large number of sites.
  5. Component Analysis
     • Senior management select a component of the Program.
     • Assess that component in terms of its own objectives and the overall goals of the Program.
     • The selection of the component is made on the grounds of concern.
     Key assumptions: senior management
     • has sufficient overview of the organisation;
     • has the power to direct the evaluation unit to address the issue;
     • is a major audience for the evaluation findings.
  6. Devolved performance assessment
     • Senior management encourages all components of a Program to assess their performance on a regular basis.
     • Senior management receives these reports and, using appropriate criteria, makes judgments on the contribution of each component.
     • Senior management provides guidelines, resources, and principles for judging.
  7. Systems Analysis
     • Applies to a program which is centrally specified and disseminated for implementation to a large number of sites.
     • The Program specification includes important goals.
     • Guidelines are provided for field staff; field staff have little or no say in Program specification or implementation plans.
     Evaluation scenario:
     • a set of important outcomes to be defined and made operational;
     • using a centralized evaluation unit;
     • relating differences in attainment of the outcomes.
  8. Key Evaluation Questions
     • Is the program reaching the target population?
     • Is it being implemented in the ways specified?
     • Is it effective?
     • How much does it cost?
     • What are the costs relative to its effectiveness? (see the sketch below)
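To make the last two questions concrete, the following is a minimal Python sketch, assuming hypothetical figures (the participant counts, costs and outcome numbers are illustrative and not taken from the slides), of how reach and cost-effectiveness might be quantified for a single reporting period.

    def reach_rate(participants_served: int, target_population: int) -> float:
        """Share of the target population actually reached by the program."""
        return participants_served / target_population

    def cost_effectiveness(total_cost: float, successful_outcomes: int) -> float:
        """Cost per successful outcome achieved (lower is better)."""
        return total_cost / successful_outcomes

    # Hypothetical monitoring data for one reporting period.
    print(f"Reach: {reach_rate(1_150, 2_000):.1%}")                        # 57.5%
    print(f"Cost per outcome: ${cost_effectiveness(460_000, 920):,.2f}")   # $500.00

Comparable figures from earlier periods or from other sites would turn these one-off numbers into the trend comparisons that monitoring evaluation relies on.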
  9. Monitoring Evaluation: Trends and Case Examples
     Evaluands in program monitoring:
     • The focus is evaluation within big "P" Programs which are ongoing.
     • Private sector example: the training and development Program of a large regional bank.
     • Public sector example: an Intellectual Disabilities Services Program.
     Common to these Programs:
     • They contain mission statements.
     • They are designed to translate aspects of policy into tangible outcomes.
     • They are centrally planned or financed.
     • They are the prime responsibility of senior management.
     • They are ongoing and subject to modification.
  10. Elements of Program Management
     • A strategic plan for the implementation of relevant aspects of government. Strategic planning is the process by which an organisation creates a vision of its future and develops the necessary structure, resources, procedures and operations to achieve that future.
     • A program structure.
     • Management arrangements.
     • The use of the MIS:
       • MIS used as a basis for decisions;
       • MIS used when considering fundamental issues.
  11. Large P Program evaluation
     Unique characteristics:
     1. There is a strong emphasis on outcomes.
     2. Programs are ongoing and there is a need for evaluative information over time.
     3. Many Programs are designed to provide goods or services rather than promote changes in behaviour.
     4. Evaluative data are often processed and reported in simple but logical ways.
     5. Senior management, in particular, may require gross or aggregated information.
  12. Assembling Evidence for Monitoring
     • It is essential to use the full range of data collection and analysis techniques.
     • Indicators need to be at least part of the data collection.
  13. Indicators as Evidence
     • A key feature of indicators is that they are used continually to inform decisions designed to alter the state of the social system affecting them.
     • They can be used as statements about the effectiveness of organisations.
     • They must be used in:
       • comparing Program trends at different points in time (monitoring);
       • comparing the performance of a Program to an acceptable set of standards or goals;
       • comparing the implementation of the same Program at different sites or locations.
  14. Types of indicators
     • Appropriateness: the match between current community and government priorities and Program objectives.
     • Efficiency: the relative cost of achieving positive impacts via the program under consideration.
     • Effectiveness: the match between Program outcomes and Program objectives.
     (See the sketch after this slide for an illustrative calculation of efficiency and effectiveness.)
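The efficiency and effectiveness indicators lend themselves to simple quantitative expression. The following is a minimal Python sketch under stated assumptions: the site names, costs and outcome counts are hypothetical, and the formulas (cost per positive impact, and achieved versus targeted outcomes) are one plausible reading of the definitions on the slide above.

    # Hypothetical site-level monitoring data for one reporting period.
    sites = {
        "Site A": {"cost": 120_000, "achieved": 240, "targeted": 300},
        "Site B": {"cost": 150_000, "achieved": 250, "targeted": 250},
    }

    for name, d in sites.items():
        efficiency = d["cost"] / d["achieved"]         # cost per positive impact
        effectiveness = d["achieved"] / d["targeted"]  # outcomes vs. objectives
        print(f"{name}: efficiency = ${efficiency:,.0f} per outcome, "
              f"effectiveness = {effectiveness:.0%}")

Run over successive reporting periods, the same calculation supports the cross-site and over-time comparisons described on slide 13; appropriateness, by contrast, is a judgement about priorities and is not reduced to a single number here.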
  15. Education for Sustainable Development: An Expert Review of Processes and Learning
     Prof. Daniella Tilbury (University of Gloucestershire, United Kingdom) is the author of this publication, commissioned by UNESCO.
     © UNESCO 2011. Section for Education for Sustainable Development, Division of Education for Peace and Sustainable Development, UNESCO, 7 Place de Fontenoy, 75352 Paris 07 SP, France. Designed and printed at UNESCO, Paris, France.
  16. Introduction
     • Monitoring and evaluation (M&E) is an integral part of education programme planning and implementation.
     • The United Nations Decade of Education for Sustainable Development (DESD, 2005-2014) is an endeavor that aims to reorient education policy, practice and investment to address sustainability.
     • This publication endeavors to identify which commonly accepted learning processes are aligned with ESD and should be promoted through ESD-related programmes and activities. It also seeks to examine which learning opportunities contribute to sustainable development.
  17. Summary
     • The United Nations Decade of Education for Sustainable Development (DESD, 2005-2014) is a global movement which seeks to transform education policy, investment and practice. If it is successful, the DESD could change not only education but also the quality of life for many people across the globe.
     • Key objectives:
       i) Which commonly accepted learning processes are aligned with ESD and should be promoted through ESD activities?
       ii) What are ESD and related learning opportunities contributing to sustainable development?
  18. ESD learning frameworks and processes
     The review has identified that certain key processes underpin ESD frameworks and practices. These include:
     • processes of collaboration and dialogue (including multi-stakeholder and intercultural dialogue);
     • processes which engage the 'whole system';
     • processes which innovate curriculum as well as teaching and learning experiences; and,
     • processes of active and participatory learning.
  19. Learning for ESD defined
     'Learning' for ESD refers to what has been learnt and is learned by those engaged in ESD, including learners, facilitators, coordinators as well as funders. Often learning is interpreted as the gaining of knowledge, values and theories related to sustainable development but, as this review indicates, ESD learning also refers to:
     • learning to ask critical questions;
     • learning to clarify one's own values;
     • learning to envision more positive and sustainable futures;
     • learning to think systemically;
     • learning to respond through applied learning; and,
     • learning to explore the dialectic between tradition and innovation.
  20. Critical lessons from the review
     • It is difficult to access data on ESD processes and learning opportunities, as these are rarely documented.
     • There is a noticeable lack of data to show how these objectives and outcomes are achieved.
     • This relatively new field is only at the very earliest stages of generating the type of comparative and evaluative overview that provides a picture of effective processes and approaches.
     • The study recommends that during Phase II:
       i) data collection processes focus on actual experiences rather than reviews of the literature; and
       ii) data collection tools are based on tightly focused questions that will capture greater detail about learning processes.
  21. Critical Question
     What is the extent and the depth of connection between the choice of processes in ESD initiatives and actual contributions to sustainable development?
     • The level of evaluative assessment within the literature is in its infancy.
     • The outcomes themselves are varied and feature at multiple levels.
     • External review of case study findings, anecdotal evidence from individual programme evaluations and the reflections of programme leaders suggest that there are links that should be explored in more detail.
  22. Contribution to sustainable development
     • This review presents a timely opportunity to consider the areas in which change is emerging.
     • The case studies reviewed in this document suggest that it is possible to map a wide range of contributions through ESD.
     • The review unpacks and categorizes the range of potential contributions and some of the themes and priorities that are apparent across these key initiatives.
     • It has developed a template which could be adapted to serve as a data collation tool.
     • ESD remains poorly researched and weakly evidenced. This means there is not sufficient evidence to provide conclusive responses to the core questions that drive the present review and other similar investigations into the value of ESD as a field of research and practice.
     • These challenges will also confront the Phase II monitoring and evaluation report as it attempts to provide robust and meaningful evidence of the impact of the DESD initiative as a whole.
