Monitoring and Evaluation Open Discussion

GREG TRAXLER is an Economist and Senior Lecturer in Public Policy at the University of Washington and a former Senior Program Officer at the Bill & Melinda Gates Foundation. His research background includes studies of the impact of agricultural technologies and agricultural policies. Greg has been at CIAT for the past month interacting with the Impact and M&E teams and with other researchers.

Greg will lead an open discussion on Monitoring and Evaluation and Impact at CIAT based on his observations over the past month. He is inviting all interested staff to hear his observations and discuss ideas for tweaking CIAT M&E to enhance CIAT’s capacity to report on its progress and successes.


  1. Greg Traxler, Senior Lecturer, University of Washington. July 23, 2015
  2. I. Some Definitions and Greg’s Observations (25 minutes)
     II. Reactions from Perla and Ricardo (5 minutes)
     III. Open Discussion
  3. Joe: In preparation for CIAT’s 50th Anniversary, please prepare me a report on 50 years of Agrobiodiversity achievements and successes. Do not include anything that is not backed by solid evidence (data) of research outputs and outcomes. Andy, Deborah: same request.
     Elcio: Please prepare me a report on 50 years of CIAT achievements and successes in Latin America. Do not include anything that is not backed by solid evidence (data) of research outputs and outcomes. Robin, Dindo: same request.
  4. 1. Monitoring (M): Continuous process that tracks inputs, activities, outputs, and outcomes of a project during the implementation period. Requires a Theory of Change. Outputs: the products of research. Outcomes: the external use, adoption, or influence of outputs.
     2. Evaluation (E): Assessing project results and impacts; includes a huge variety of evaluation objectives, approaches and formats. Examines the health and performance of a program. Makes recommendations based on monitoring information.
     3. Impact Assessment (IA): A type of evaluation that assesses the longer-range social, environmental and economic benefits of research.
     These are 3 distinct functions, united by the need for consistent effort to collect and store key output and outcome data.
     Key questions for today’s discussion: How are M&E data generated, curated and used at CIAT? How is evaluation done at the Program and Institute level?
  5. • To prioritize and seek funds for those research activities that will have the greatest impact
     • Must evaluate past performance to identify and attract funding for the high-impact research activities
     • Evaluation strategy starts with a clear baseline and a well-defined Impact Pathway
     • A clear measurement plan is essential
     • Evaluation and Impact Assessments are only credible when evidence-based
     • Quality and usefulness of the system greatly enhanced when done in close partnership among M&E, Impact Assessment and research program staff
  6. Variety of understandings among staff; may include Monitoring (M), Evaluation (E) and Impact Assessment (IA).
     Who Does M&E at CIAT?
     • Researchers
     • Program Coordinators/Science Officers/Research Assistants
     • M&E specialists
     • Impact Assessment researchers
     Uses of M&E at CIAT
     • Reporting to donors
     • Project/Program management (e.g. Results Based Management)
     • Monitoring progress at the CIAT program and institute level
     • Generating impact cases that demonstrate CIAT’s successes to external audiences (Donors) – telling CIAT’s story and enhancing CIAT’s reputation
  7. Impact Assessment and M&E are complementary but distinct
     • IA are deep studies that measure the long-term impacts of CIAT research on CGIAR system-level outcomes:
       • Reduced poverty
       • Improved food and nutrition security
       • Improved natural resource systems and ecosystem services
     IA is similar to other research areas
     • IA researchers have the mandate to develop a research program, same as all other scientists
     • IA research studies require data and funding
     • Like other researchers, IA researchers are evaluated on their research output and scientific reputation as well as their contribution to the CIAT mission
     • M&E should ensure that the data are available to perform IA
  8. • Monitoring is not research; IA is
     • M&E staff mandate is to enhance the ability of CIAT to understand and direct its work, and to tell its story to internal and external audiences
     • M&E staff mandate is to guide the definition and collection of Output and Outcome Data
     • IA uses Output and Outcome data collected over time to measure development Impact
     • Impact assessment studies are only possible when M&E information has been captured through time
     • M&E data are the baselines for IA
  9. 1. Fragmented Reporting
        1. M&E is project and donor driven
        2. Project reporting formats, definitions and software not harmonized or consistent through time or across donors
        3. Individual project progress reports are single-use; they do not present the full picture of CIAT research achievements unless processed further
     2. Limited institutional memory about past outcomes
        1. Missing baseline data on research outputs and outcomes make credible impact analysis difficult
        2. Difficult to aggregate data across individual projects
        3. Past project reports are not easily accessible
     3. M&E fatigue
        1. Relentless effort needed to collect M&E data from researchers
        2. Lots of M&E reporting activity, yet many gaps in CIAT institute-level information on research and development progress
     4. Funding
        1. M&E reporting is part of all projects, but strong temptation to underfund at all levels
        2. Can be funded through grant direct costs or can come out of overhead
  10. • M&E staff
         • Skilled and capable
         • Dispersion across locations and programs makes coordination difficult
      • Researcher and staff awareness of M&E importance
         • The discussion has shifted from Whether to do M&E to How to make M&E more efficient and How to right-size M&E to CIAT needs
      • Goodwill of researchers/scientists/project officers/M&E staff
         • General appreciation of the importance of M&E
         • Strong will to collaborate among staff
      • Data management capacity
         • New Data & Information Manager with experience in designing accessible and user-friendly data systems
      • Strong interest in achieving and showing impact
      • General
  11. • Of course – CIAT is not a collection of projects
      • Currently very difficult to talk about comprehensive Institute or even Program successes or to document CIAT prominence
      • Limited accessibility of data on research outputs and outcomes beyond the project document life cycle
  12. Minor adjustments to the CIAT M&E approach will increase CIAT’s ability to report on Program and Institute level progress through a) improved coordination of existing M&E activities, and b) better capture of existing M&E output and outcome data.
  13. 1. Establish an institutional memory of research progress
         1. Use existing project/CRP reporting efforts to create an output and outcome database (a minimal illustrative sketch follows this transcript)
      2. Improve coordination and increase the efficiency of CIAT M&E
         1. Increased support to researchers in interpreting project logframes and results frameworks
  14. 1. Clearer picture of CIAT successes. By capturing M&E data over time, CIAT leadership will be able to report on aggregate research progress.
      2. More support for researchers. Researchers will have the support of M&E specialists as they wrestle with M&E reporting.
      3. Redistributed M&E burden. CIAT M&E refinements must be done in a way that reduces the overall burden on researchers.
      4. Better Impact Evaluation. The new database will make impact evaluation possible.
      5. Better Donor Relations. Credible evaluation evidence improves future funding prospects.
  15. • Suggestions
      • Feedback
      • Main challenges
      • Ideas
      • Opinions
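
The recommendation on slide 13, turning existing project/CRP reporting into an output and outcome database, is the most concrete technical step in the deck. The Python sketch below is only an illustration of what a minimal record store and program-level roll-up could look like under that recommendation; the record fields, file name and example record are hypothetical assumptions made for the example, not an existing CIAT schema or system.

    # Minimal, illustrative sketch of an output/outcome record store.
    # All names here (ResearchRecord, outputs_outcomes.csv, the example record)
    # are hypothetical, not part of any existing CIAT system.
    from dataclasses import dataclass, asdict
    from datetime import date
    from collections import Counter
    import csv

    @dataclass
    class ResearchRecord:
        project_id: str    # donor or CRP project identifier
        program: str       # CIAT research program or area
        record_type: str   # "output" or "outcome"
        description: str   # what was produced, used, adopted or influenced
        evidence: str      # link or citation backing the claim
        reported_on: date  # when the record was captured

    def save_records(records, path):
        """Append records to a CSV file so they persist beyond a single project report."""
        fields = list(ResearchRecord.__dataclass_fields__)
        with open(path, "a", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fields)
            if f.tell() == 0:  # write a header only for a new, empty file
                writer.writeheader()
            for record in records:
                writer.writerow(asdict(record))

    def summarize_by_program(records):
        """Count outputs and outcomes per program: the aggregate view slide 12 asks for."""
        return Counter((r.program, r.record_type) for r in records)

    # Example usage with one hypothetical outcome record
    records = [
        ResearchRecord("P-001", "Agrobiodiversity", "outcome",
                       "Improved bean variety released by a national partner",
                       "https://example.org/release-notice", date(2015, 6, 30)),
    ]
    save_records(records, "outputs_outcomes.csv")
    print(summarize_by_program(records))

An append-only store of this kind keeps records beyond the life of any single project report, which is the institutional-memory property the recommendation emphasizes; in practice such data would more likely live in the institutional data systems handled by the new Data & Information Manager mentioned on slide 10.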
