Can we demonstrate the difference
that Norwegian Aid makes?
Evaluation of results measurement and how this can be
improved
Date: 4th April 2014
In association with the Chr. Michelsen Institute
Overview
Purpose of the assignment
Methodology
Key findings
Recommendations
Purpose of the assignment
• Norad’s Evaluation Department (EVAL) found ‘none of the
evaluations and studies commissioned by EVAL and
finalised in 2011 could report sufficiently on results at the
level of outcomes or impact’
• Why? What are the factors and dynamics behind this?
• This report:
– contributes to learning
– provides evidence-based recommendations for improvements
– responds to the government's demand for a strengthened focus on
results
Structure of the results measurement system
The internal grant management system
• Grants are managed by desk officers within Norad, the MFA and
embassies, with optional technical guidance from Norad and external
advisers
• The grant management system generates the bulk of results data,
through grant progress reports, grant final reports, and grant-level
reviews and evaluations
The Evaluation Department (EVAL)
• External evaluations are commissioned and managed by EVAL using
independent consultants and through collaboration with peer agencies
• EVAL provides oversight of results, and external oversight of the
overall aid administration, through system-wide, thematic and country
evaluations, synthesis studies and joint evaluations
Hypotheses – testing the system
1. Internal policies, systems and procedures … provide appropriate and
comprehensive guidance
2. Staff receive appropriate training and technical advice/support….
3. The policies, systems and procedures … are being correctly and
adequately implemented
4. The planning, commissioning and quality assurance of evaluations
places an emphasis on measuring results
5. Evaluators have adequate competencies to effectively measure results
and find/use evidence
Methodology
Assessment of grant management system
• Desk-based document assessment
• Assessment of courses, training and capacity building of staff
• Review of grant management processes including quality assurance
• Comparative review of approaches at the World Bank, DFID and Danida
• Review of current practice through a sample of recent grants
• Interviews and a survey of staff to explore current practice and opinions
Assessment of EVAL planning, commissioning and management of
evaluations
• An assessment of the results-focus in a sample of recent evaluation
reports
• An assessment of evaluators’ competencies in the sample
Analytical framework
Policies &
systems
Organisational
culture and
structures
Staff
Capacities
Results
measurement
Rules and
Regulations
Guidelines and
checklists
Standard
procedures
Knowledge and
skills
Training
Attitudes
Incentives
Sanctions
Leadership
KEY FINDINGS
Hypothesis 1 - Internal policies, systems and procedures to ensure evaluability
and results documentation in the grant management process provide
appropriate and comprehensive guidance
• There are gaps in the minimum standards required of grant
applicants for how they plan to measure results, which makes it
difficult for programme officers to make an informed judgement on
the quality of their approach and to ensure evaluability
• Development of grant scheme rules and a new Grant Management
Manual could have improved consistency and coherence around
results measurement, but because use of the templates is not
mandatory there are no clear standards
• There is no clear strategic approach to, or guidance on, reviews
and evaluations, which means their use within the grant
management cycle is ad hoc and their quality variable
Hypothesis 2 - Staff receive appropriate training and technical advice/support
to effectively ensure evaluability and results documentation as part of the
grant management process
• Training on results measurement is generally of a good quality, with
high staff satisfaction. However:
– Gaps in content
– Short duration courses
– Low attendance levels
– Not a useful career move
– Little other practical guidance
• The institutional arrangements for technical support and QA on
results measurement are not set up in a way that ensures consistent
quality across the system
– Technical advice is mandatory in only a minority of grant schemes and is little used
– Sector specialists often lack skills in results measurement
Hypothesis 3 - The policies, systems and procedures that are in place (to ensure
interventions are evaluable and robust results data are being collected) are
being correctly and adequately implemented
• The current rules and guidance on results measurement in grant
management are not being adequately implemented
• Staff are under pressure and in many instances find themselves
unable to devote the attention necessary to improve specification
of results for a number of reasons:
– Lack of time
– Lack of prioritisation by leadership
– Lack of incentives and sanctions
• Concern to follow the post-Paris harmonisation agenda and use
partners' own systems appears to have held back constructive dialogue
about results measurement
Hypothesis 4 - The planning, commissioning and quality assurance of
evaluations places an emphasis on measuring results
• Evaluations designed by EVAL often do not put sufficient emphasis
on gathering evidence about results, particularly during the
planning and commissioning phase
• EVAL does not systematically manage evaluations to ensure that
consultants remain focused on measuring results
• Evaluations directed towards outcomes and impacts do not have
the necessary design features to ensure an outcome/impact
assessment is delivered
Hypothesis 5 - Evaluators have adequate competencies to effectively measure
results and find/use evidence
• Necessary competencies are not expressed clearly in the ToRs
• A majority of the consultants have substantial experience with
evaluation, have formal qualifications in the discipline and have a
solid foundation in the application of core evaluation approaches
and tools. However there are large gaps between competency and
application
• Few evaluators have adequate competencies to measure results at
the outcome and impact levels, but evaluators have been assessed
as competent for the ToRs as tendered
– The evaluators lack skills and expertise in the application of more sophisticated evaluation
methodologies. Contracts are awarded through competitive tendering, and the
competencies offered were accepted as sufficient.
Conclusions for each hypothesis
Hypothesis 1 - Internal policies, systems and procedures to ensure
evaluability and results documentation in the grant management
process provide appropriate and comprehensive guidance
Rejected
Hypothesis 2 - Staff receive appropriate training and technical
advice/support to effectively ensure evaluability and results
documentation as part of the grant management process
Rejected
Hypothesis 3 - The policies, systems and procedures that are in place
(to ensure interventions are evaluable and robust results data are being
collected) are being correctly and adequately implemented
Rejected
Hypothesis 4 - The planning, commissioning and quality assurance of
evaluations places an emphasis on measuring results
Rejected
Hypothesis 5 - Evaluators have adequate competencies to effectively
measure results and find/use evidence
Unproved
KEY RECOMMENDATIONS
Two types of recommendations for improving results
measurement and evaluability in the aid administration
• Technical – to help resolve the shortcomings and
gaps in grant policies and procedures
– Target: Grant Management Unit in the Ministry of
Foreign Affairs, Norad's Department for Quality
Assurance, EVAL
• Structural – to address the more structural challenges
raised in the evaluation
– Target: Norad and MFA senior management
Technical recommendations
Strengthen the quality of results measurement in the grant
management process
• Require partners to outline in greater detail how they plan to measure results
• Develop guidance for staff on how to put results measurement into practice and support
partners in developing effective measurement systems
• Develop standard Quality Assurance checklists for staff to use
• Require partners to use the standard templates that have been developed
Strengthen the quality and use of reviews and evaluations at
grant level
• More strategic approach to the use of evaluations and reviews when planning grants that is
grounded in review of evidence (and its quality)
Strengthen the training support provided to staff on results
measurement
• Develop a more comprehensive training programme to support staff capacity in results
measurement
• An online resource hub should be developed that provides staff access to examples of good
practice in results measurement and pools sector-specific resources
EVAL recommendations
Designing an evaluation
• Tighten the design specifications for evaluations
• Keep evaluation questions focused
• Require evaluators to clearly describe the programme logic of the intervention
being evaluated
• Be more specific in ToRs about the required consultants’ skills
Managing an evaluation
• Monitor the progress of evaluations more closely
Conducting impact evaluations
• Develop a clear process for deciding and managing impact evaluations
• Ensure the appropriate competencies exist both among the EVAL staff managing
the evaluation and the consultants
• Ensure the specification of methodologies, data requirements and competencies
of evaluators are in line
Technical changes are not enough
• Two contrasting strategies are suggested to tackle
structural problems:
– by concentrating expertise or
– by broadening it
• The two approaches represent different strategies but
are not necessarily mutually exclusive. Both (or
elements of both) could be taken forward in tandem
• Cutting across each approach are a number of common
recommendations. They provide the foundations of
both approaches and we deal with them first.
Leadership : Incentives : Consistency
• More visible support at senior management level for
results measurement.
• Improve staff incentives for measuring results and
ensuring evaluability.
• The requirements on technical assistance and Quality
Assurance should be harmonised across all grant
scheme rules.
• Establish a consistent basis, such as grant size, for
determining when technical assistance and Quality Assurance of
partners' results frameworks, reporting and evaluation
plans are mandatory.
Structural recommendations –
Approach 1 (concentrating expertise)
• Resource AMOR to provide more comprehensive support
to all eligible grants on their results measurement
frameworks, and evaluation plans and reports at approval
and completion
– Need for additional resources
– Could be contracted out or managed as a task force
– Could provide information to feed into the Annual Results
Report
• This approach concentrates actions to improve quality. It
would bring a high degree of consistency, but has
substantial implications for staffing in the short time span
necessary to process grants
Structural recommendations –
Approach 2 (broadening expertise)
• Build a cadre of staff specialised in results measurement and
evaluation
– Small number of staff receive intensive training in evaluation and
results measurement
– Would bring advice and training closer to the ground
– More flexible and informal
– Sector specialisation possible
– Incentivise with career rewards
• This approach is designed to work within current practices and to
build through progressive rather than radical change. It broadens
capability by developing staff capacity and working more closely
with partners, which fits with local ownership and capacity
building of partners' systems
‘Can we demonstrate the difference that
Norwegian aid makes?’
There are some elements of good foundations for
better results measurement, but current arrangements
lack the strength of leadership, depth of guidance and
coherence of procedures necessary for effective
evaluation of Norwegian aid
Thank you for listening
Any Questions?

Deep Generative Learning for All - The Gen AI Hype (Spring 2024)Deep Generative Learning for All - The Gen AI Hype (Spring 2024)
Deep Generative Learning for All - The Gen AI Hype (Spring 2024)
 
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
9711147426✨Call In girls Gurgaon Sector 31. SCO 25 escort service
 
Heart Disease Classification Report: A Data Analysis Project
Heart Disease Classification Report: A Data Analysis ProjectHeart Disease Classification Report: A Data Analysis Project
Heart Disease Classification Report: A Data Analysis Project
 
RadioAdProWritingCinderellabyButleri.pdf
RadioAdProWritingCinderellabyButleri.pdfRadioAdProWritingCinderellabyButleri.pdf
RadioAdProWritingCinderellabyButleri.pdf
 

Can We Demonstrate the Difference that Norwegian Aid makes? - Evaluation of the Norwegian Aid Administrations approach to Results Measurement

  • 1. Can we demonstrate the difference that Norwegian Aid makes?
    Evaluation of results measurement and how this can be improved
    Date: 4th April 2014
    In association with the Chr. Michelsen Institute
  • 2. Overview
    – Purpose of the assignment
    – Methodology
    – Key findings
    – Recommendations
  • 3. Purpose of the assignment
    • Norad’s Evaluation Department (EVAL) found that ‘none of the evaluations and studies commissioned by EVAL and finalised in 2011 could report sufficiently on results at the level of outcomes or impact’
    • Why? What are the factors and dynamics?
    • This report:
      – contributes to learning
      – provides evidence-based recommendations for improvements
      – responds to the government’s demand for a strengthened focus on results
  • 4. Structure of the results measurement system
    • The Evaluation Department (EVAL): external evaluations commissioned and managed by EVAL using independent consultants and through collaboration with peer agencies. EVAL provides external oversight of the overall aid administration. Outputs: system-wide, thematic and country evaluations; synthesis studies; joint evaluations
    • Internal grant management system: grants managed by desk officers within Norad, the MFA and embassies, with optional technical guidance from Norad and external advisers. The grant management system generates the bulk of results data. Outputs: grant progress reports; grant final reports; grant reviews and evaluations
    • Results data generated through the grant management system feed into EVAL, which provides oversight on results
  • 5. Hypotheses – testing the system
    1. Internal policies, systems and procedures … provide appropriate and comprehensive guidance
    2. Staff receive appropriate training and technical advice/support …
    3. The policies, systems and procedures … are being correctly and adequately implemented
    4. The planning, commissioning and quality assurance of evaluations places an emphasis on measuring results
    5. Evaluators have adequate competencies to effectively measure results and find/use evidence
  • 6. Methodology
    Assessment of grant management system
    • Desk-based document assessment
    • Assessment of courses, training and capacity building of staff
    • Review of grant management processes, including quality assurance
    • Comparative review of approaches at the World Bank, DFID and Danida
    • Review of current practice through a sample of recent grants
    • Interviews and a survey of staff to explore current practice and opinions
    Assessment of EVAL planning, commissioning and management of evaluations
    • An assessment of the results focus in a sample of recent evaluation reports
    • An assessment of evaluators’ competencies in the sample
  • 7. Analytical framework
    • Policies & systems: rules and regulations; guidelines and checklists; standard procedures
    • Staff capacities: knowledge and skills; training
    • Organisational culture and structures: attitudes; incentives; sanctions; leadership
    • These three domains together shape results measurement
  • 9. Hypothesis 1 – Internal policies, systems and procedures to ensure evaluability and results documentation in the grant management process provide appropriate and comprehensive guidance
    • There are gaps in the minimum standards required of grant applicants for how they plan to measure results, which make it difficult for programme officers to make an informed judgement on the quality of their approach and to ensure evaluability
    • Development of grant scheme rules and a new Grant Management Manual could have improved consistency and coherence around results measurement, but the non-mandatory use of templates means there are no clear standards
    • There is no clear strategic approach to, or guidance on, reviews and evaluations, which means their use within the grant management cycle is ad hoc and their quality variable
  • 10. Hypothesis 2 – Staff receive appropriate training and technical advice/support to effectively ensure evaluability and results documentation as part of the grant management process
    • Training on results measurement is generally of good quality, with high staff satisfaction. However:
      – gaps in content
      – short course durations
      – low attendance levels
      – not seen as a useful career move
      – little other practical guidance
    • The institutional arrangements for technical support and QA on results measurement are not set up in a way that ensures consistent quality across the system:
      – technical advice is mandatory in only a minority of grant schemes and is little used
      – sector specialists often lack skills in results measurement
  • 11. Hypothesis 3 – The policies, systems and procedures that are in place (to ensure interventions are evaluable and robust results data are being collected) are being correctly and adequately implemented
    • The current rules and guidance on results measurement in grant management are not being adequately implemented
    • Staff are under pressure and in many instances are unable to devote the attention necessary to improve the specification of results, for a number of reasons:
      – lack of time
      – lack of prioritisation by leadership
      – lack of incentives and sanctions
    • Concern to follow the post-Paris harmonisation agenda and use partners’ own systems appears to have held back constructive dialogue about results measurement
  • 12. Hypothesis 4 – The planning, commissioning and quality assurance of evaluations places an emphasis on measuring results
    • Evaluations designed by EVAL often do not put sufficient emphasis on gathering evidence about results, particularly during the planning and commissioning phase
    • EVAL does not systematically and actively manage evaluations to ensure consultants remain focused on measuring results
    • Evaluations directed towards outcomes and impacts lack the design features necessary to ensure an outcome/impact assessment is delivered
  • 13. Hypothesis 5 – Evaluators have adequate competencies to effectively measure results and find/use evidence
    • The necessary competencies are not expressed clearly in the ToRs
    • A majority of the consultants have substantial experience with evaluation, formal qualifications in the discipline and a solid foundation in the application of core evaluation approaches and tools. However, there are large gaps between competency and application
    • Few evaluators have adequate competencies to measure results at the outcome and impact levels, yet evaluators have been assessed as competent for the ToRs as tendered:
      – evaluators lack skills and expertise in applying more sophisticated evaluation methodologies; contracts are awarded through a competitive tendering process in which this level of competency was accepted
  • 14. Conclusions for each hypothesis
    – Hypothesis 1 – Internal policies, systems and procedures to ensure evaluability and results documentation in the grant management process provide appropriate and comprehensive guidance: Rejected
    – Hypothesis 2 – Staff receive appropriate training and technical advice/support to effectively ensure evaluability and results documentation as part of the grant management process: Rejected
    – Hypothesis 3 – The policies, systems and procedures that are in place (to ensure interventions are evaluable and robust results data are being collected) are being correctly and adequately implemented: Rejected
    – Hypothesis 4 – The planning, commissioning and quality assurance of evaluations places an emphasis on measuring results: Rejected
    – Hypothesis 5 – Evaluators have adequate competencies to effectively measure results and find/use evidence: Unproved
  • 16. Two types of recommendations for improving results measurement and evaluability in the aid administration
    • Technical – to help resolve the shortcomings and gaps in grant policies and procedures
      – Target: Grant Management Unit in the Ministry of Foreign Affairs, Norad’s Department for Quality Assurance, EVAL
    • Structural – to address the more structural challenges raised in the evaluation
      – Target: Norad and MFA senior management
  • 17. Technical recommendations
    Strengthen the quality of results measurement in the grant management process
    • Require partners to outline in greater detail how they plan to measure results
    • Develop guidance for staff on how to put results into practice and how to support partners in developing effective measurement systems
    • Develop standard quality assurance checklists for staff to use
    • Require partners to use the standard templates that have been developed
    Strengthen the quality and use of reviews and evaluations at grant level
    • Take a more strategic approach to the use of evaluations and reviews when planning grants, grounded in a review of the evidence (and its quality)
    Strengthen the training support provided to staff on results measurement
    • Develop a more comprehensive training programme to support staff capacity in results measurement
    • Develop an online resource hub giving staff access to examples of good practice in results measurement and pooling sector-specific resources
  • 18. EVAL recommendations
    Designing an evaluation
    • Tighten the design specifications for evaluations
    • Keep evaluation questions focused
    • Require evaluators to clearly describe the programme logic of the intervention being evaluated
    • Be more specific in ToRs about the skills required of consultants
    Managing an evaluation
    • Monitor the progress of evaluations more closely
    Conducting impact evaluations
    • Develop a clear process for deciding on and managing impact evaluations
    • Ensure the appropriate competencies exist both among the EVAL staff managing the evaluation and among the consultants
    • Ensure the specification of methodologies, data requirements and evaluator competencies are aligned
  • 19. Technical changes are not enough
    • Two contrasting strategies are suggested to tackle the structural problems:
      – concentrating expertise, or
      – broadening it
    • The two approaches represent different strategies but are not mutually exclusive: both (or elements of both) could be taken forward in tandem
    • Cutting across each approach are a number of common recommendations. These provide the foundations of both approaches, and we deal with them first
  • 20. Leadership : Incentives : Consistency
    • Provide more visible support at senior management level for results measurement
    • Improve staff incentives for measuring results and ensuring evaluability
    • Harmonise the requirements on technical assistance and quality assurance across all grant scheme rules
    • Establish a consistent basis, such as grant size, for when technical assistance and quality assurance of partners’ results frameworks, reporting and evaluation plans are mandatory
  • 21. Structural recommendations – Approach 1 (concentrating expertise)
    • Resource AMOR to provide more comprehensive support to all eligible grants on their results measurement frameworks, evaluation plans and reports, at approval and completion
      – Need for additional resources
      – Could be contracted out or managed as a task force
      – Could provide information to feed into the Annual Results Report
    • This approach concentrates actions to improve quality. It would bring a high degree of consistency, but has substantial implications for staffing within the short time span needed to process grants
  • 22. Structural recommendations – Approach 2 (broadening expertise)
    • Build a cadre of staff specialised in results measurement and evaluation
      – A small number of staff receive intensive training in evaluation and results measurement
      – Would bring advice and training closer to the ground
      – More flexible and informal
      – Sector specialisation possible
      – Incentivise with career rewards
    • This approach is designed to work within current practices and to build through progressive rather than radical change. It broadens expertise by developing staff capacity and working more closely with partners, which fits with local ownership and capacity building of partners’ systems
  • 23. ‘Can we demonstrate the difference that Norwegian aid makes?’
    Some good foundations for better results measurement are in place, but current arrangements lack the strength of leadership, depth of guidance and coherence of procedures necessary for effective evaluation of Norwegian aid
  • 24. Thank you for listening. Any questions?