Monitoring, data management, and impact assessment in Africa RISING
Presented by Beliyou Haile (IFPRI), Arkadeep Bandyopadhyay (IFPRI) and Carlo Azzarri (IFPRI) at the Africa RISING Program Learning Event, Lilongwe, Malawi, 5-8 February 2019
Monitoring, data management, and impact assessment in Africa RISING
Beliyou Haile [IFPRI], Arkadeep Bandyopadhyay [IFPRI], and Carlo Azzarri [IFPRI]
Africa RISING Program Learning Event
5-8 February 2019
Lilongwe, Malawi
Project monitoring tools

| # | Data type | Tool | Timing of data collection | Collection/Aggregation responsibility |
| 1 | FtF Indicators | PMMT | Once a year | AR researchers, Data managers/M&E team |
| 2 | Direct beneficiaries and technologies | BTTT.xlsx | After each growing season or as necessary | AR researchers, Data managers/M&E team |
| 3 | Indirect beneficiaries and technologies | Exposure.xlsx | After every incidence of "exposure" | AR researchers, Data managers/M&E team |
| 4 | Beneficiaries of scaling up/out | Scaling.xlsx | Quarterly…or bi-annually? | AR researchers, Data managers, development partners/M&E team |
| 5 | Agronomic/socioeconomic data | Various | Per the SIAF; per evaluation design | AR researchers |
| 6 | Scaling-up process evaluation | TBD | Yearly? | Data managers |
• Offline (confidential) data management with encryption (Dropbox) – see the sketch below
• Online (non-confidential) data management – Dataverse
• Why upload data on Dataverse?
  • Avoid potential losses (mandatory and necessary backup of data)
  • Ensure research integrity and validation of results
  • Increase research efficiency and impact
  • Facilitate data security and minimize risk of data loss
  • Enable research continuity through secondary data use
  • Ensure compliance with donor requirements
  • Register datasets with USAID DDL once they become open
Data management tools
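The slides do not prescribe a specific encryption tool for the offline (Dropbox) workflow, so the sketch below is purely illustrative: it encrypts a confidential file before it is placed in a synced folder, using the `cryptography` package's Fernet scheme. The file names, folder paths, and choice of scheme are assumptions, not part of the AR guidance.

```python
# Illustrative sketch only: encrypt a confidential data file before placing it
# in a synced (e.g., Dropbox) folder. Paths, file names, and the use of the
# Fernet scheme are hypothetical choices, not AR policy.
from pathlib import Path

from cryptography.fernet import Fernet

PLAIN = Path("household_survey_confidential.csv")                    # hypothetical input file
ENCRYPTED = Path("Dropbox/AR_confidential/household_survey.csv.enc") # synced target
KEY_FILE = Path("fernet.key")                                        # keep OUTSIDE the synced folder

# Generate a key once and reuse it for later encrypt/decrypt cycles.
if not KEY_FILE.exists():
    KEY_FILE.write_bytes(Fernet.generate_key())

fernet = Fernet(KEY_FILE.read_bytes())
ENCRYPTED.parent.mkdir(parents=True, exist_ok=True)
ENCRYPTED.write_bytes(fernet.encrypt(PLAIN.read_bytes()))

# To recover the plain file later: fernet.decrypt(ENCRYPTED.read_bytes())
```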
• All de-identified data (for which AR funds have been used, even partially)
must be uploaded at least every year, whether they are part of a multiyear
experiment or not
• Datasets that are not part of a multiyear experiment shall be made open
data within 12 months of completion of the data collection (embargo
period)
• The embargo period for datasets that are part of a multiyear experiment extends up to 12 months after the completion of the experiment, when complete datasets are available
Data upload
Steps for uploading datasets on Dataverse
• Step 1: Researchers complete the Dataverse metadata template – crucial for proper tagging and discoverability
• Step 2: Researchers submit the completed metadata, de-identified data files, documentation, and codebook to the IFPRI M&E team
• Step 3: The M&E team and the Dataverse administrator review the submitted documents and data and upload them (interoperability) – a hedged sketch of such an upload follows below
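Step 3 is carried out by the M&E team and the Dataverse administrator; purely as an illustration of what a programmatic upload could look like, the sketch below uses the standard Dataverse native API through the `requests` library. The server URL, API token, collection alias, metadata values, and file name are placeholders, and the actual AR procedure may differ.

```python
# Hedged sketch of a programmatic Dataverse upload via the native API.
# Server URL, API token, collection alias, metadata values, and file name are
# placeholders; actual AR uploads are handled by the M&E team and the
# Dataverse administrator.
import json

import requests

BASE = "https://dataverse.example.org"      # placeholder installation URL
TOKEN = "xxxxxxxx-xxxx-xxxx"                # placeholder API token
COLLECTION = "africa-rising"                # placeholder collection alias
HEADERS = {"X-Dataverse-key": TOKEN}

# Minimal citation metadata drawn from the completed metadata template.
dataset_json = {
    "datasetVersion": {
        "metadataBlocks": {
            "citation": {
                "displayName": "Citation Metadata",
                "fields": [
                    {"typeName": "title", "multiple": False,
                     "typeClass": "primitive",
                     "value": "Example AR household survey (placeholder)"},
                    # ...author, datasetContact, dsDescription, and subject
                    # fields are also required by Dataverse and would go here.
                ],
            }
        }
    }
}

# Step A: create the dataset record and capture its persistent identifier.
r = requests.post(f"{BASE}/api/dataverses/{COLLECTION}/datasets",
                  headers=HEADERS, json=dataset_json)
r.raise_for_status()
pid = r.json()["data"]["persistentId"]

# Step B: attach the de-identified data file with a short description.
with open("deidentified_data.csv", "rb") as fh:
    r = requests.post(f"{BASE}/api/datasets/:persistentId/add",
                      params={"persistentId": pid},
                      headers=HEADERS,
                      files={"file": fh},
                      data={"jsonData": json.dumps(
                          {"description": "De-identified data file"})})
r.raise_for_status()
```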
Dataverse dataset requests and approval
• Up to three requests per Google form
• Existing datasets clustered by country
• Submitted requests are compiled in a Google sheet
  • …where data providers can search for their name or email address
  • …and grant or deny access (giving a reason for the latter)
• Data providers will be sent reminder emails about pending requests
Dataverse dataset requests – test Google sheet
The progress bar indicates that the Google sheet is loading. Click Dismiss.
• It is important to let the sheet load completely.
• Kindly refrain from doing anything while the sheet is loading; it may seem slow, but that is normal.
• Give it 20-30 seconds before doing anything.
• You might hear the fans on your computer speed up – again, this is normal.
• Once the progress bar has completed, you can start working on the sheet.
Dataverse dataset requests – Google sheet
• Sheet is protected – data providers can only edit columns K and L
• Column K: Enter Yes or No to grant/deny permission
• Column L: Enter remarks (e.g., reason for denials)
Dataverse dataset requests – Google sheet
• No need to save edits in a Google sheet – it auto-saves.
• Step 1: Select column I or J, then click the filter button.
• Step 2: Select "Create new temporary filter view".
• Step 3: Choose the desired filter element.
Dataverse dataset requests – Google sheet
• The filter can also be applied to the dataset provider's name.
• Filtering lets you quickly see all the datasets associated with you (requested and owned).
• It also helps you find additional requests more quickly.
• A filter view you create is for your individual use only – it does not change the default view for other users (a programmatic alternative is sketched below).
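For data providers who prefer to check their pending requests programmatically rather than through the browser filter, a minimal sketch using the `gspread` library is shown below. The sheet key, worksheet, and the assumption that column I holds the provider's name are hypothetical; only the use of columns K and L follows the layout described above.

```python
# Hedged sketch: list a data provider's pending requests from the Google sheet
# with gspread. The sheet key, worksheet, and the assumption that column I
# holds the provider's name are hypothetical; columns K (grant/deny) and
# L (remarks) follow the layout described in the slides.
import gspread

PROVIDER_NAME = "Jane Doe"      # hypothetical provider name
SHEET_KEY = "1AbC..."           # placeholder Google sheet key

gc = gspread.service_account()  # authenticates with a local service-account JSON
ws = gc.open_by_key(SHEET_KEY).sheet1

rows = ws.get_all_values()      # header row plus one row per request
header, body = rows[0], rows[1:]

# 0-based indices: column I = 8 (provider, assumed), K = 10 (permission), L = 11 (remarks).
pending = [row for row in body
           if row[8].strip() == PROVIDER_NAME and not row[10].strip()]

for row in pending:
    print("Pending request:", row[:8])   # request details in columns A-H
```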
Topics for breakout sessions
1. What are the three most important tasks you would like the M&E team to
assist you with?
2. Which M&E and data management activities and tools should be changed,
and how?
3. What are the biggest challenges you face with collection and monitoring of
data on:
• FTF indicators?
• Innovations you and your team have been testing?
• Beneficiary farmers/households directly engaged in testing
innovations?
• Monitoring of different beneficiaries of scaling up?
| Data field | Description of data field |
| Dataset title | Full title by which the dataset is known. Please choose a concise, self-explanatory title; avoid abbreviations and long titles. |
| Related Publication | Publications that use the data from this dataset. If available, please include URLs to relevant publications and reports based on these data. |
| Description | A summary describing the purpose, nature, and scope of the dataset (no word limit, although we suggest keeping it to a maximum of two short paragraphs). |
| Contributor | The organization(s) or person(s) responsible for collecting, managing, or otherwise contributing in some form to the development of the resource. |
| Related Datasets | Any datasets that are related to this dataset, such as previous research on this subject. |
| Production Date | Date when the data collection or other materials were produced (not distributed, published, or archived). |
| Producer | Person(s) or organization(s) with the financial or administrative responsibility over the dataset. |
| Collaborative organizations | Organizations involved in the data production. |
| Funding organizations | Grant number and related acknowledgements (if available). |
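Purely as an illustration of how a completed template might be captured before the M&E team maps it to Dataverse's citation metadata, a record could be organized as in the sketch below; every key and value is a placeholder rather than an official template.

```python
# Hypothetical example of a completed metadata-template record; every value is
# a placeholder and the keys simply mirror the data fields in the table above.
metadata_record = {
    "dataset_title": "Example AR agronomic trial dataset, 2018 season",
    "related_publication": ["https://example.org/ar-report-2018"],   # URLs if available
    "description": "Purpose, nature, and scope of the dataset, in at most two short paragraphs.",
    "contributor": ["IFPRI", "National partner institute"],
    "related_datasets": ["Baseline household survey (placeholder)"],
    "production_date": "2018-11-30",    # when the data were produced, not published
    "producer": "Africa RISING / IFPRI",
    "collaborative_organizations": ["ILRI", "ICRAF"],
    "funding_organizations": "USAID, grant no. XXXX (placeholder)",
}
```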
Summary of AR data in Dataverse (as of 10/2/2018)

| Region | Country | (1) Africa RISING datasets in Dataverse | (2) Metadata linked to ICRAF page in Dataverse | (3) Metadata linked to ILRI's CKAN | (4) Metadata only |
| West Africa | Ghana | 14 | | | 1 |
| West Africa | Mali | 14 | | | 0 |
| West Africa | Sub-total | 28 | | | 1 |
| East Africa | Tanzania | 30 | | | 2 |
| East Africa | Malawi | 11 | | | 0 |
| East Africa | Zambia | 3 | | | 0 |
| East Africa | Sub-total | 44 | | | 2 |
| Ethiopia | Ethiopia | 22 | 4 | 7 | 3 |
| Ethiopia | Sub-total | 22 | | | |
| | Researchers-Total | 94 | | | |
| | IFPRI-Total | 5 | | | |
| | WUR-Total | 5 | | | |
| | All | 104 | 4 | 7 | 9 |
Offline monitoring tools/1
• Beneficiary and Technology Tracking Tool (BTTT)
• Direct beneficiary households
• With unique household identifiers
• Basic socioeconomic characteristics and location identifiers
• AR innovations mapped to direct beneficiaries
• Data managers: responsible for completing/updating the BTTT
• Researchers: responsible for providing data managers with required
details to feed into the BTTT
• IFPRI: responsible for updating/customizing the tool as necessary, providing training, aggregating data, and generating de-identified reports (a hedged aggregation sketch follows below)
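The de-identified reporting mentioned in the last bullet could, for instance, boil down to an aggregation like the hedged pandas sketch below; the file name and column names are hypothetical and may differ from the actual BTTT template.

```python
# Hedged sketch: build a de-identified summary from the BTTT with pandas.
# The file name and column names (hh_id, innovation, hh_head_sex) are
# hypothetical; the actual BTTT template may use different labels.
import pandas as pd

bttt = pd.read_excel("BTTT.xlsx")   # direct-beneficiary tracking sheet

# Count unique direct-beneficiary households per AR innovation and sex of
# household head, without carrying any identifying fields into the report.
summary = (bttt.groupby(["innovation", "hh_head_sex"])["hh_id"]
               .nunique()
               .reset_index(name="direct_beneficiary_households"))

summary.to_csv("bttt_deidentified_summary.csv", index=False)
```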
Offline monitoring tools/2
• Exposure Tool
• Minimal data (number and type) about farmers exposed to AR
innovations (e.g., recent field day in Mali)
• Scaling Tool – see the tabulation sketch below
• Minimal data about scaling beneficiaries
• Disaggregated by:
• AR innovation
• Development partner
• Period
• Other tools you are using?
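Similarly, the Scaling Tool's disaggregation by innovation, development partner, and period could be tabulated as in the hedged pandas sketch below; the column names and the quarterly reporting assumption are placeholders.

```python
# Hedged sketch: tabulate scaling beneficiaries by AR innovation, development
# partner, and quarter. Column names ("innovation", "development_partner",
# "period", "beneficiaries") and quarterly reporting are assumptions.
import pandas as pd

scaling = pd.read_excel("Scaling.xlsx")
scaling["quarter"] = pd.to_datetime(scaling["period"]).dt.to_period("Q")

by_group = (scaling.groupby(["innovation", "development_partner", "quarter"])
                   ["beneficiaries"].sum()
                   .reset_index())
print(by_group)
```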
Conclusions/1
• Compliance with the program data management plan is mandatory
• We are expected/required to collect and manage different types of data to
monitor progress and validate our research
• Researchers need to actively involve their respective data managers during the planning and implementation of their research/field activities
• Data managers should proactively support research activities by all teams
in their mega site
• Researchers shall inform their respective scaling partners of the expected reporting requirements and templates
• FTF indicator data must be complete, adequately disaggregated, and
consistent
Conclusions/2
• All de-identified data (for which AR funds have been used, even partially)
must be uploaded at least every year, whether they are part of a
multiyear experiment or not
• Datasets that are not part of a multiyear experiment shall be made open
data within 12 months of completion of the data collection (embargo
period)
• The embargo period for datasets that are part of a multiyear experiment extends up to 12 months after the completion of the experiment, when complete datasets are available
Data sharing among AR partners
• Partners expected to share confidential and non-confidential data within the
program
• For within-program confidential data sharing, a Data User Agreement shall be signed between the owner and the requestor
• Partners with IRB offices shall make within-program data sharing explicit when
submitting their protocols
• All data shall be properly cited; collaborative research is encouraged
• Data managers responsible for compiling a list (“universe”) of datasets:
• Collected thus far
• To be collected in FY 2019 and beyond
• Along with info about experiment type and duration
• …by reviewing work plans and progress reports
• …against which the completeness of (current and future) datasets on Dataverse can be assessed (a hedged sketch follows below)
• Chief Scientists responsible for ensuring:
• Data collection plan is clearly identified in workplans
• Data have been collected and uploaded annually or on an appropriately
regular basis
• Research teams are supported in identifying the appropriate timeline for open data
Tracking Dataverse data uploads/2
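One hedged way to assess completeness against the planned "universe" of datasets is the Dataverse search API; in the sketch below the server URL, collection alias, and the planned titles are placeholders, and without an API token only published datasets are returned.

```python
# Hedged sketch: compare the planned "universe" of datasets against what is
# already published in the AR collection, using the Dataverse search API.
# The server URL, collection alias, and planned titles are placeholders.
import requests

BASE = "https://dataverse.example.org"   # placeholder installation URL
COLLECTION = "africa-rising"             # placeholder collection alias

planned_titles = {                       # compiled from work plans (placeholders)
    "Example Malawi agronomic trial, 2018",
    "Example Tanzania household survey, 2019",
}

uploaded_titles, start = set(), 0
while True:
    r = requests.get(f"{BASE}/api/search",
                     params={"q": "*", "type": "dataset",
                             "subtree": COLLECTION,
                             "per_page": 100, "start": start})
    r.raise_for_status()
    data = r.json()["data"]
    uploaded_titles.update(item["name"] for item in data["items"])
    start += 100
    if start >= data["total_count"]:
        break

print("Planned but not yet on Dataverse:", sorted(planned_titles - uploaded_titles))
```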
Africa Research in Sustainable Intensification for the Next Generation
africa-rising.net
This presentation is licensed for use under the Creative Commons Attribution 4.0 International Licence.
Thank You
Editor's Notes
Inform the audience that this is what the Google sheet will look like the first time they open it.
To the audience: “Once this sheet opens, it is important to let the sheet load completely. Kindly do not do anything with the sheet because it will appear laggy. Give it 20-30 seconds before doing anything. You might hear the fans on your computer starting to run – again, this is normal.”