  1. ALMM monitoring and evaluation tools
     <Event title, date, presenter's name>
     EUNES IPA Project
     Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service
     Europeaid/128079/C/SER/RS
  2. Monitoring – Definition
     Monitoring aims:
      to highlight strengths and weaknesses in implementation;
      to enable responsible personnel to deal with problems, improve performance, build on success, and adapt to changing circumstances;
      to provide the mechanism by which relevant information is channeled to the right people at the right time.
  3. Monitoring
     The types of information necessary:
      programme inputs;
      progress against objectives and against the Implementation Plan;
      results of activities and outputs achieved;
      impact on the target group;
      the way the programme is managed and style of work.
     The means of gathering information:
      site visits to local offices and projects;
      interviews with staff, project personnel and beneficiary groups;
      observation of project activities;
      analysis of activity reports, statistical reports and other documents;
      analysis of financial documents.
  4. Performance Monitoring System – levels
     Process monitoring:
      reviewing and planning work on a regular basis;
      assessing whether activities are carried out as planned;
      identifying and dealing with problems;
      building on strengths; and
      assessing whether the style of work is the best way to achieve the programme objectives.
     Impact monitoring:
      progress towards objectives is measured continuously;
      implementation is modified in response to changing circumstances without losing sight of overall objectives and aims;
      the need to change objectives (if necessary) can be identified;
      the need for further research can be identified;
      assumptions can be verified.
  5. Performance Monitoring System
     ...provides the basis for the kind of management information system which is essential for programme operations, especially where implementation is delegated or decentralised to the local level.
     Steps:
      establishing / confirming programme goals;
      developing performance indicators corresponding to programme goals;
      collecting data on the indicators;
      analysing the data;
      presenting the information appropriately;
      using the findings to improve activities.
  6. Performance Monitoring System
     The monitoring system for an ALMM-funded project – key documents:
      QUARTERLY MONITORING REPORT (covering staffing levels, activities engaged in over the reporting period, achievements / products, seminars and events, etc.);
      QUARTERLY FINANCIAL REPORT;
      PARTICIPANT START AND COMPLETION DETAILS (broken down by gender, age, target group, etc.).
     Monitoring of an ALMM would focus on:
      the number of participants (broken down by gender, age, etc.) from the target group(s) within a specified period;
      the cost of the programme over the same period;
      the completion rate;
      qualifications obtained as a result of participation (if applicable);
      employment status (in the short run) immediately after completion.
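The participant-level focus above can be turned into computed indicators. A minimal Python sketch, assuming a hypothetical record layout and invented figures (the actual ALMM reporting forms define their own fields):

```python
# Illustrative only: hypothetical participant records for one quarter.
# Field names are assumptions, not the actual ALMM reporting format.
participants = [
    {"gender": "F", "age": 24, "completed": True,  "employed_after": True},
    {"gender": "M", "age": 31, "completed": True,  "employed_after": False},
    {"gender": "F", "age": 45, "completed": False, "employed_after": False},
    {"gender": "M", "age": 22, "completed": True,  "employed_after": True},
]
programme_cost = 12000.0  # total cost over the same period (invented figure)

starts = len(participants)
completions = sum(p["completed"] for p in participants)

# The three core quantitative indicators from the slide above.
completion_rate = completions / starts
cost_per_participant = programme_cost / starts
employment_rate = sum(  # short-run employment status among completers
    p["employed_after"] for p in participants if p["completed"]
) / completions

print(f"completion rate: {completion_rate:.0%}")
print(f"cost per participant: {cost_per_participant:.0f}")
print(f"employment rate (completers): {employment_rate:.1%}")
```

In practice the same counts would also be broken down by gender, age and target group, as the reporting documents require.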
  7. Monitoring vs. Evaluation
     Monitoring:
      assesses the success or failure of the programme;
      provides rapid information about the programme;
      programmes are expected to monitor for quality control and procedural purposes;
      in monitoring the effectiveness of ALMPs, secondary effects should be taken into account.
     Evaluation:
      provides explanations;
      is a longer-term process.
  8. Evaluation
      determines whether and why a programme is successful;
      assesses implementation and outcomes from a ‘wider angle’ viewpoint;
      can take place at all stages of the programme;
      makes use of monitoring data.
  9. Programme Evaluation
      individual systematic studies;
      assess how well a project / programme has worked and what lessons can be learned;
      conducted by external experts;
      a programme evaluation examines achievement of programme objectives in the wider context;
      based on the statistics collected OR addressing important questions of a more qualitative nature;
      evaluation as a learning process.
  10. Evaluation Objectives: find out – assess – recommend
      To find out...
       whether the programme is making progress towards achieving its objectives;
       who has benefited from the intervention;
       what the impact has been on the beneficiaries;
       whether there have been changes to the target group due to external factors.
      To assess...
       whether the impact, if there is one, is due to the programme or to other factors;
       whether the aims and objectives of the programme are still relevant, or whether there is a better way of achieving them;
       whether the work is being carried out efficiently and what major problems and constraints have arisen;
       whether the resources allocated were used efficiently and effectively;
       how changes in the needs of the target group affect future programmes.
  11. Evaluation Objectives: find out – assess – recommend
      To make recommendations about:
       how the programme could be improved;
       how the aims and objectives should be modified or revised;
       how the work can be monitored and evaluated in the future;
       how the work could be made more cost-effective.
  12. Programme Evaluation
       partly a statistical exercise;
       statistics do not provide a full assessment;
       good evaluation involves evidence-based interpretation and elements of qualitative analysis.
  13. Evaluation Elements: Outcomes and Process
      Outcomes – what was achieved and with what results?
      The impact evaluation process has three steps:
       What are the estimated impacts of the programme on the individual?
       Are the impacts large enough to yield net social gains?
       Is this the best outcome that could have been achieved for the money spent? (effectiveness)
      The feasibility of replicating the programme’s outcomes might also arise under this heading.
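As a rough illustration of the first two steps above, impact is often approximated as the difference in outcomes between participants and a comparison group, then weighed against cost. The sketch below is a deliberate simplification with invented figures; a real impact evaluation must ensure the comparison group is genuinely comparable (selection effects, deadweight and displacement all matter):

```python
# Hypothetical short-run employment rates; illustration only.
participants_employed = 0.55   # outcome rate for the treated group
comparison_employed = 0.40     # outcome rate for a comparison group

# Step 1: estimated net impact (naive difference; assumes comparable groups).
net_impact = participants_employed - comparison_employed

# Step 2: are impacts large enough to yield net social gains?
# A crude per-participant check against assumed figures.
value_per_job = 8000.0         # assumed social value of one extra job
cost_per_participant = 900.0   # assumed programme cost per participant
net_gain = net_impact * value_per_job - cost_per_participant

print(f"estimated net impact: {net_impact:.0%}")
print(f"net social gain per participant: {net_gain:.0f}")
```

Step 3 (is this the best outcome for the money spent?) would compare this figure against alternative uses of the same budget, which no single calculation can settle.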
  14. Evaluation Elements: Outcomes and Process
      Process – how the outputs were achieved and how the programme was managed:
       programme design and methodology;
       programme management;
       service delivery mechanisms;
       the quality of co-operation with partner organisations;
       innovation (if any).
  15. Relevance of Evaluation
      The results of an evaluation exercise will:
       identify what worked well and what worked less well (outputs and processes);
       assist in the planning of current and future programmes;
       help to build on success, develop good practice, and avoid repeating mistakes;
       assist in the monitoring of the programme’s future phase;
       help to shape the dissemination and mainstreaming strategy.
      Qualitative analysis helps in judging the outcomes of the approach – i.e. the learning and process ‘successes’ of a programme that are not necessarily captured by Labour Market Information System (LMIS) statistics alone, but require complementary feedback from beneficiaries, employers and other stakeholders, gathered through interviews, focus group sessions, questionnaires, etc.
      Evaluating the multiple contexts of a project may also point to situations that limit a project’s ability to achieve anticipated outcomes, or lead to the realization that specific interventions and their intended outcomes may be difficult to measure or to attribute to the project itself.
  16. Evaluation: Who and How?
      Four key issues for planning and undertaking an evaluation are:
       Who should undertake the evaluation?
       When should evaluation take place?
       What should be evaluated?
       How is an evaluation conducted?
      Who should undertake the evaluation?
       self-evaluation – an evaluation exercise conducted by the programme sponsor or any other (partner) organisation involved; and
       external evaluation – an evaluation undertaken by an individual or organisation from outside the programme.
  17. Self-Evaluation
       Most programmes will probably conduct one.
       It is important to ensure that it is done properly.
      Requirements:
       skilled staff, independent from programme management;
       time and other resources;
       available data;
       an understanding of research methods and data analysis;
       the ability to reflect on the progress of the programme against its stated objectives;
       the skills to interpret this information and report it in a clear and useful manner.
  18. External Evaluation
       The external evaluator can offer expert services; the levels of expertise and resources required for a thorough evaluation are likely to be greater.
       An expert can often provide a more cost-effective solution than self-evaluation.
       Objective evaluation: the external evaluator may also elicit more honest information from the staff and the beneficiaries of your programme.
       The perceived independence of the external evaluator can help to ensure that other organisations take the results more seriously.
       They may also (depending on your programme) be able to evaluate the programme within a wider context. This may help you to address evaluation questions relating to mainstreaming and multiplier effects.
      Decision – Terms of Reference.
  19. When Should Evaluation Take Place?
      Two intervals:
       at an interim stage;
       at the end of the approved period for the programme.
      The monitoring data you collect are a key source of information for both interim and final evaluations.
  20. Aims of Interim and Final Evaluation
      An interim evaluation addresses whether the programme:
       has achieved its objectives by the dates set out in the work plan; and
       is on track to achieve its objectives by the end of the programme.
      A final evaluation will:
       draw conclusions on the design, implementation and degree of success of your programme in the light of your objectives and indicators;
       inform funding bodies and other stakeholders of your results, and the actual and potential impact of your programme;
       stimulate support for transfer and mainstreaming of your innovation;
       form the basis of the final report and other publications; and
       stimulate new ideas for innovation.
  21. How Is An Evaluation Conducted?
      PLAN → ANALYSE INDICATORS → ANALYSE DATA → REPORT
  22. Planning the evaluation – key points
       Focus of the evaluation: a clear, focused brief should go some way towards ensuring that you get the information you need from the exercise.
       Programme objectives.
       Programme products.
       Programme processes: processes related to partnership arrangements and decision-making, programme management and monitoring systems could be evaluated.
       Subject, purpose and audience targeted.
  23. Analyse the Performance Indicators
      Performance indicators:
       output indicators – quantitative measures, based on your programme's key targets and objectives;
       process indicators – quantitative or qualitative.
      When defining the indicators, consider whether they will provide information concerning:
       your programme’s effectiveness in meeting your objectives;
       your programme’s efficiency in meeting your objectives;
       the relevance of your programme activities to the needs identified;
       the impact of your innovation;
       the impact of activity at different levels (individuals, groups, systems); and
       the value added to the programme as a result of partnership activities.
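Output indicators of this kind are often tracked as target-versus-actual achievement rates. A minimal sketch, with indicator names, targets and the review threshold all invented for illustration:

```python
# Hypothetical output indicators: (target, actual) pairs for a reporting period.
indicators = {
    "participants trained": (200, 170),
    "qualifications awarded": (150, 120),
    "placed in employment": (100, 95),
}

# Achievement rate per indicator: actual divided by target.
achievement = {
    name: actual / target for name, (target, actual) in indicators.items()
}

for name, rate in achievement.items():
    status = "on track" if rate >= 0.85 else "review"  # arbitrary threshold
    print(f"{name}: {rate:.0%} ({status})")
```

Process indicators (quality of delivery, partnership working) resist this treatment and usually need qualitative evidence alongside the numbers.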
  24. Gather and Analyse the Data
      Questions to ask are:
       Is the information already available?
       Do you need to establish a baseline – i.e. to have some information in place at the start of your programme?
       What gaps are there in the data? What do you need to collect in addition to what exists?
       How will the information be gathered and recorded, and by whom?
       Are these procedures feasible?
      Methods used to gather the data:
       record-keeping;
       observation of participants;
       self-administered questionnaires;
       individual personal interviews (face to face or by telephone);
       group discussions / focus groups.
  25. Reporting of the Findings
      An interim evaluation report should cover:
       programme background (context, rationale, objectives);
       achievements to date and problems encountered;
       trends and issues; and
       action needed.
      The final evaluation report needs to encompass:
       the background to the programme (context, rationale, objectives);
       how the evaluation exercise was undertaken and the methods used to gather and analyse the information, identifying problems encountered and how they were addressed;
       the results / findings emerging from the evaluation, supported by evidence, and lessons for this programme and others;
       recommendations, highlighting the main lessons.
  26. Dissemination
      The evaluation will be of interest to a wide audience, including:
       beneficiary groups;
       local, regional and national social partner organisations;
       policy-makers and representatives from intermediary organisations (training and employment, equal opportunities, information technology in training, etc.).
      Take into account:
       tailor the contents of your evaluation report to your audience;
       only include the information and evidence necessary to ‘tell the story’, and focus on key findings when making the case for recommendations;
       be transparent;
       reference all external sources; and
       provide a summary, even if the report itself is quite brief.
  27. Thank you for your attention!
      <full contact details>
      EUNES IPA Project
      Technical Assistance to enhance forecasting and evaluation capacity of the National Employment Service
      Europeaid/128079/C/SER/RS