CGIAR Evaluation Community of Practice 
Workshop 
The CGIAR evaluation function: Progress so far and 
where does this feed into? 
September 
2014
CGIAR Evaluation Strategic Objective 
Ensuring that the evaluation function is a key and 
effective instrument of accountability and 
learning, fully contributing to the shaping and 
vision of the future CGIAR
IEA = Who We Are 
• CGIAR system entity: small unit based in FAO 
• Reports to Fund Council through biannual meetings and 
Evaluation and Impact Assessment Committee (EIAC) 
• Operates in close consultation with Consortium Office 
• Works closely with Centers/CRPs 
• Collaborates with ISPC, including SPIA
How IEA fits into CGIAR
IEA – Reviews & Evaluations 
• 1 CRP evaluation completed + management response: Forests, Trees & Agroforestry 
• 2 reviews completed + management response: Generation Challenge Program; CRP Governance & Management 
• 4 ongoing CRP evaluations (TORs/IRs): PIM, AAS, WHEAT, MAIZE 
• 5 CRP evaluations in planning and preparatory work (TORs): CCAFS, GRISP, L&F, RTB, WLE
CRP-Commissioned Evaluations 
• IEA structured support on quality assurance to 5 CRP-commissioned evaluations (requested by the FC): 
– Dryland Cereals, Dryland Systems, Grain Legumes, 
A4NH, and HumidTropics 
• IEA support and guidance to CCEEs: 
– On-demand basis (planning, selection of 
consultants, TORs, evaluation governance) 
– Extensive use of IEA standards and guidance notes
IEA Evaluations – timeline 
[Timeline chart spanning 2013–2016, covering:] 
• Forests, Trees and Agroforestry 
• Policies, Institutions & Markets 
• Maize 
• Wheat 
• Aquatic Agricultural Systems 
• Livestock and Fish 
• Roots, Tubers, and Bananas 
• Water, Land and Ecosystems 
• Climate Change, Agriculture & Food Security 
• Global Rice Science Partnership 
• QA support to 5 CRPs: Dryland Systems, Dryland Cereals, Grain Legumes, A4NH, and Humidtropics
For Whom / What Purpose? 
CRP Management 
• Learning for strategic management and program adjustment 
• Preparation for the 2nd cycle of CRP proposals 
System-wide 
• System-wide strategic management (MTR, SRF, 2nd call for funding) 
• Resource for system-wide evaluation 
Donors 
• Accountability on program performance 
• Building trust and transparency 
• Strategic and informed decisions on portfolio and funding allocations 
Partners 
• Building trust and transparency 
• Accountability on partnership and program performance 
• Refinement of partnerships
System-level reviews – mid-term review 
Examine the progress of the CGIAR reforms, and the resulting 
appropriateness, effectiveness, and efficiency of the overall system 
and make recommendations for course correction and improvements 
where necessary. 
Commissioned by: Fund Council 
Evaluators: Independent Panel 
Timeline: Results of the review and final recommendations will be 
discussed and adopted at the forthcoming Fund Council meeting in 2014. 
Links: TORs and Inception report (April 2014)
CRP extensions and 2nd call for proposals 
Consortium Board and Fund Council decision: 
• Extend (and refresh) the current CRPs to the end of 2016 (with 
Extension proposals for 2015-16 in 2014); 
• Develop proposals for the second stage with the benefit of the 
conclusions and recommendations of both the Mid-term Review of 
the Reform, and forms of external evaluation for all CRPs.
CRP 2nd call for proposals: process 
• March 2015: CRPs submit Pre-Proposals online 
• July – September 2015: Review by CO and ISPC; CRPs submit revised proposals 
• November 2015: CB and FC determine which proposals, and/or key components, 
are to be developed into full proposals 
• March 2016: invited CRPs submit Full Proposals 
• April – September 2016: six months for ISPC, CO, and FO to review the proposals 
• September 2016: A full set of (a) CRP proposals, (b) ISPC reviews, and (c) CO, FO, and IEA 
evaluations available for review and feedback from CB and FC members 
• November 2016: CB and FC complete funding/approval decisions
Scope of CRP Evaluation 
Broad scope: relevance, quality of science (QoS), effectiveness, 
impacts, sustainability, efficiency 
Summative 
– Results of past research that continues in the CRP 
Formative 
– Programmatic approach to enhancing the relevance and 
efficiency of the CRP 
– Likelihood of contributing effectively to the SRF 
vision, SLOs, and outcomes
Monitoring and impact assessment feed into 
evaluation 
[Diagram: the results chain runs from current research (inputs, progress) through outputs and outcomes (uptake, influence, adoption) to impact over horizons of roughly 4–8, 5–10, and 10–20 years; CRP monitoring, CRP evaluation, and SPIA impact assessment each feed into different parts of this chain.]
Some challenges 
• The evaluand is a moving target: several iterations 
of structural change since the initial proposal 
• Monitoring and evaluation capacity and systems are 
uneven across CGIAR – the “building blocks” are 
not there 
• No clear accountability and learning framework at 
CRP and system levels
The role of the Evaluation Community 
of Practice in meeting our strategic 
objective
• Strengthen the culture of evaluation across 
CGIAR 
• Network and share information 
• Build capacity and strengthen coordinated 
planning and cooperation 
• Strengthen the collective voice of evaluation in 
the system
At CRP level: Evaluation building blocks 
[Diagram: a CRP is organized into themes and cross-cutting areas, each with sub-themes and components (Sub) and research-level projects (P), underpinned by monitoring of inputs and indicators. Blue elements denote internal evaluation building blocks; green elements denote external ones.]

Editor's Notes

  • Mid-term review slide – TOR: https://www.cgiarfund.org/sites/cgiarfund.org/files/Documents/PDF/10thCouncil/ToR%20for%20the%20CGIAR%20Reform%20%20Mid-Term%20Review%20(Sep%2017,%202013).pdf Inception Report: https://library.cgiar.org/bitstream/handle/10947/3023/2-MTR%20Inception%20report%207%20apr%202014.pdf?sequence=1
  • 2nd call for proposals slide – 31 March 2016: CRPs submit Full Proposals, maximum 40 pages, in the format provided by online templates (see the appendices); proposals will be submitted online. April – September 2016: six months for ISPC, CO, and FO to review the proposals. The process during this period should be further developed with ISPC and others to enable ISPC to avoid peak workloads. It is likely that during this period there may be feedback (or questions) to the CRPs, coordinated through the CO, and that CRPs may be asked to provide answers to questions or additional information to back up or improve their submissions. 30 September 2016: A full set of (a) CRP proposals, (b) ISPC reviews, and (c) CO, FO, and IEA recommendations is available for review and feedback from CB and FC members. 15 November 2016: CB and FC complete funding/approval decisions.
  • Building blocks slide – Monitoring, evaluative studies, and, at a higher level, CCEEs form the building blocks of CRP evaluation in such a way that, in the future, meta-evaluations could become an important part of CRP evaluation.