ALMM monitoring and evaluation tools

  • Process monitoring – looks at the use of resources, the progress of activities, and
    the way these are carried out.
  • Impact monitoring – tracks progress in delivering the expected end-results of the
    programme, using the previously developed Performance Indicators (PIs), and the
    impact the programme is having on target groups. (A minimal indicator
    computation is sketched after this list.)
  • Evaluation is partly a statistical exercise, but statistics (such as the number of
    people enrolled on a pilot training programme) do not in themselves provide a full
    assessment. A good evaluation involves evidence-based interpretation, and usually
    an element of qualitative analysis, such as learner surveys or discussions with
    beneficiaries and trainers.
  • Two elements of the programme should therefore be evaluated: the outcomes and
    the process. The outcome questions are, in essence: did the programme have an
    impact on participants; did the impacts yield net social gains; and was this the
    best outcome for the money spent (see slide 8). According to one experienced
    commentator, ALMP impact evaluations done in most OECD countries typically
    answer only the first question; some consider the second partially, and none
    consider the third.[8]
  • Both types of evaluation (self and external) can be applied to the same programme, though they may be carried out at different times, or address different questions or issues.
  • The starting point for the evaluation will be some basic questions, such as:
    – What were the expected outcomes of this programme, what are the actual
      outcomes, and how do they compare?
    – What critical success factors can be associated with these outcomes?
    – What processes could be used to measure progress against these factors?
    – Who will carry out these processes (external or internal personnel)?
    – What will be the outcome of the process (e.g. a report, a presentation)?
  • Often, an important part of an ALMM programme is to get successful activities,
    outcomes, products and lessons taken up on a wider basis. This is termed
    'mainstreaming'. It involves policy-makers and practitioners at local, regional or
    national level using your programme's outcomes to inform their actions.
    Dissemination activities will play a large part in enabling this to take place. Your
    evaluation report may contain important lessons (both positive and negative) for
    future policy and practice far beyond the domain of your project itself (e.g. for
    higher-level strategic managers, policy analysts, and counterparts in other regions
    of Serbia or in neighbouring countries). Drawing out the lessons from programme
    'failure' can be useful in informing future developments, so that pitfalls can be
    avoided and success maximised. In any case, whether a programme performs well
    or badly, a frank and independent evaluation of its processes and products is vital.
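
The Performance Indicators mentioned above are concrete statistics of the kind listed on slide 4: completion rate, qualifications obtained, employment status after completion. As a minimal sketch of how such output indicators could be computed from participant start and completion details, here is an illustrative Python example; the record layout and field names are hypothetical, not the National Employment Service data model:

    from dataclasses import dataclass

    @dataclass
    class Participant:
        # Hypothetical record layout, for illustration only;
        # not the NES/EUNES participant data model.
        started: bool
        completed: bool
        qualified: bool       # qualification obtained through participation
        employed_after: bool  # employment status after completion

    def output_indicators(records):
        """Simple output indicators of the kind a quarterly monitoring
        report tracks: completion, qualification and employment rates."""
        starters = [p for p in records if p.started]
        completers = [p for p in starters if p.completed]
        if not starters or not completers:
            return {"completion_rate": 0.0, "qualification_rate": 0.0,
                    "employment_rate": 0.0}
        return {
            "completion_rate": len(completers) / len(starters),
            "qualification_rate": sum(p.qualified for p in completers) / len(completers),
            "employment_rate": sum(p.employed_after for p in completers) / len(completers),
        }

    # Example: 4 starters, 3 completers, 2 qualified and employed afterwards.
    records = [Participant(True, True, True, True),
               Participant(True, True, True, True),
               Participant(True, True, False, False),
               Participant(True, False, False, False)]
    print(output_indicators(records))
    # {'completion_rate': 0.75, 'qualification_rate': 0.666..., 'employment_rate': 0.666...}

Figures like these answer the monitoring question of what happened; as noted above, they still need evidence-based and qualitative interpretation before they amount to an evaluation.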
Transcript: ALMM monitoring and evaluation tools

    1. ALMM monitoring and evaluation tools. Alberto Cerda, Key Expert on ALMM monitoring and evaluation. First meeting of Working Group 2, June 24, 2010. EUNES IPA Project: Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service, Europeaid/128079/C/SER/RS.
    2. Monitoring. The types of information necessary:
       – programme inputs
       – progress against objectives and the Implementation Plan
       – results of activities and outputs
       – impact on the target group
       – the way the programme is managed and the style of work
       The means of gathering information: site visits, interviews, observation, and analysis of activity reports and financial documents.
    3. Performance Monitoring System – levels: process monitoring and impact monitoring.
    4. Performance Monitoring System. It provides the basis for the kind of management information system that is essential for programme operations, especially where implementation is delegated or decentralised to the local level. Key documents of the monitoring system for an ALMM-funded project:
       – quarterly monitoring report
       – quarterly financial report
       – participant start and completion details
       Focus on: target group(s) analysis, costs, completion rate, qualifications obtained as a result of participation, and employment status after completion.
    5. Monitoring vs. Evaluation.
       Monitoring:
       – assesses the success or failure of the programme
       – provides rapid information
       – serves quality control and procedural purposes
       – measures the effectiveness of the ALMPs, with secondary effects taken into account
       Evaluation:
       – provides explanations
       – is a longer-term process
       – determines whether and why a programme is successful
       – can take place at all stages of the programme, assessing implementation
       – makes use of monitoring data
    6. Evaluation Objectives.
       To find out: progress towards objectives; who benefits from the intervention; the impact on the beneficiaries; changes to the target group.
       To make recommendations about: programme improvements; revision of the aims and objectives; future work evaluation; cost-effectiveness.
       To assess: the cause of impact; the aims and objectives; the efficiency of work and resources; changes in the needs of the target group.
    7. Programme Evaluation.
       – Partly a statistical exercise
       – Statistics do not provide a full assessment
       – Good evaluation involves evidence-based interpretation
       – Elements of qualitative analysis
    8. Evaluation Elements: Outcomes and Process.
       Outcomes – what was achieved and with what results? Estimated impacts of the programme on the individual:
       – whether the impacts yield net social gains
       – the outcome obtained for the money spent
       – the feasibility of replicating the programme's outcomes
       Process – how the outputs were achieved and how the programme was managed:
       – programme design and methodology
       – programme management
       – service delivery mechanisms
       – the quality of the co-operation with partner organisations
       – innovation
       (An illustrative net-impact calculation is sketched after this transcript.)
    9. Evaluation: How and Who?
       The 'how':
       – Who should undertake the evaluation?
       – When should evaluation take place?
       – What should be evaluated?
       – How is an evaluation conducted?
       Who should undertake the evaluation: self-evaluation or external evaluation.
    10. Self-Evaluation:
       – probably conducted by most programmes
       – staff independent from its management
       – time and other resources
       – available data
       – research methods and data analysis
       – progress of the programme
       – interpretation of the information
       External Evaluation:
       – expert services
       – cost-effectiveness
       – access to honest information from the staff
       – other organisations take the results seriously
       – evaluation within a wider context
       – decisions: the Terms of Reference
    11. When Should Evaluation Take Place? Two intervals:
       – at an interim stage
       – at the end of the approved period for the programme
       The monitoring data you collect are a key source of information for both interim and final evaluations.
    12. How Is an Evaluation Conducted? Plan → analyse indicators → analyse data → report.
    13. Planning the evaluation – key points:
       – focus of the evaluation
       – programme objectives
       – programme products
       – programme processes
       – subject, purpose and audience targeted
    14. Analyse the Performance Indicators.
       – Output indicators – quantitative measures, based on your programme's key targets and objectives
       – Process indicators – quantitative or qualitative
    15. Gather and analyse the data. Questions to ask are:
       – Is the information already available?
       – Do you need to establish a baseline?
       – What gaps are there in the data?
       – What do you need to collect in addition to what exists?
       – How will the information be gathered and recorded?
       – Are these procedures feasible?
       Methods used to gather the data: record-keeping, observation of participants, self-administered questionnaires, individual interviews, and group discussions / focus groups.
    16. Reporting of the findings.
       An interim evaluation report covers: programme background; achievements and problems; trends and issues; and action needed.
       The final evaluation report covers: the background to the programme; the methods used to gather and analyse the information; the results/findings, evidence and lessons; and recommendations.
    17. Dissemination. The evaluation will be of interest to a wide audience, including: beneficiary groups; local, regional and national social partner organisations; and policy-makers and representatives from intermediary organisations. Take into account:
       – tailor the contents to the audience
       – include only necessary information and evidence
       – focus on key findings
       – transparency
       – reference all external sources
       – include a summary
    18. Thank you for your attention! Alberto Cerda, Key Expert on ALMM monitoring and evaluation, acm.eunes@gmail.com. EUNES IPA Project: Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service, Europeaid/128079/C/SER/RS.
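
Slide 8's outcome questions – individual impact, net social gains, and value for the money spent – are typically approached by comparing participants with a comparison group. The following is a minimal illustrative sketch under that assumption, not the project's actual method; all figures are hypothetical:

    def net_impact(participants_employed, participants_total,
                   comparison_employed, comparison_total):
        """Raw net-impact estimate: the difference in post-programme
        employment rates between participants and a comparison group.
        A real ALMP evaluation would also correct for selection bias,
        e.g. by matching participants to similar non-participants."""
        return (participants_employed / participants_total
                - comparison_employed / comparison_total)

    # Hypothetical figures, for illustration only.
    impact = net_impact(60, 100, 45, 100)          # 0.60 - 0.45 = 0.15
    jobs_attributable = impact * 100               # about 15 jobs attributable to the programme
    cost_per_net_job = 90_000 / jobs_attributable  # e.g. a EUR 90,000 budget -> EUR 6,000 per net job
    print(f"net impact: {impact:.0%}; cost per net job: EUR {cost_per_net_job:,.0f}")

A positive raw difference is only suggestive: as the monitoring-versus-evaluation slide notes, evaluation must explain why the difference arises before a net social gain can be claimed.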
