Listening to the community. Using ICTs for program monitoring
Presentation about a paper submitted to 3rd International Social Innovation Research Conference, London 2011.

Presentation Transcript

  • Listening to the community. Using information and communication technologies for program monitoring
    Paper submitted to the International Social Innovation Research Conference, London, 2011
    Zoltán Ferenczi
    betterplace lab gAG
    Berlin, 10405 Germany
    Susanna Krüger
    betterplace lab gAG
    Berlin, 10405 Germany
  • 1. Introduction
    Not enough timely data; limited reporting capacities at district levels
    What is “effective” is contested; “multiple realities” of different stakeholders
    • Need for more direct, real-time, community-level data about social services in developing countries (DCs)
    Developing countries face information-related challenges in
    social program monitoring (healthcare, water supply, etc.)
    • 72 per cent of all mobile subscribers are located in DCs (ITU 2010)
    • “Leapfrog development” in telecommunications; no alternative to mobile phones in DCs
    1. Introduction (2)
    Relevance of mobile phones for the developing world
  • 1. Introduction (3)
    - Case studies: one pilot for healthcare in Kenya, and two pilots for water services in Guatemala and Tanzania
    • To what extent has the application of ICTs enabled service providers to remotely gather timely feedback data from program beneficiaries?
    • Were providers able to actually make use of the resulting information to improve their services?
    - Not framed as rigorous research: empirical data sets are as yet problematic
    The aim of the paper: to explore the potential of ICTs,
    especially mobile phones
  • 2. Literature
    Sourcing of information for monitoring purposes (“remote sensing”, “geographical mapping”, “crowdsourcing”)
    Existing mechanisms of citizen reporting (Citizen Report Cards)
    Participatory Monitoring & Evaluation (e.g. Guba and Lincoln 1989)
    Real-time monitoring and evaluation
    Research about mobile phones and data collection
    Five core themes
  • 3. Framework
    Indicator | Description | Measure
    ICT used | Description of the ICT used in the case (for example, simple mobile phone with SMS capabilities) | Descriptive indicator
    Remoteness | Whether feedback data can be gathered remotely | “Yes” / “No”
    Degree of involvement | Description of people involved in reporting for monitoring (e.g., citizens, community health workers, local civil society members, trained employees of NGOs/IOs, researchers, external) | “Traditional information sourcing” / “Key informants” (bounded crowdsourcing) / “Crowdsourcing”
    Immediacy | Frequency of data transmission (for example, 3 times a day, daily, weekly, monthly, etc.) | “Non real time” / “Moderate” / “Real time”
    “Size of feedback data set” | Number of transmitted reports in relation to … | Quantitative measure
    Error rate | Number of error entries relative to all entries | Quantitative measure
    “Evidence-based action on data” | Existence of policy document or rhetoric demonstrating use of data | “None” / “Service provision” / “Strategic use” / “Both”
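    The framework's indicators can be sketched as a small record type for comparing cases side by side. This is a minimal sketch: the field names, string codes, and the example values below are illustrative assumptions, not taken from the paper.

    ```python
    from dataclasses import dataclass

    # Hypothetical encoding of the paper's monitoring framework.
    # Field names and codes are illustrative, not the authors' own schema.
    @dataclass
    class CaseAssessment:
        ict_used: str           # e.g. "simple mobile phone with SMS"
        remote: bool            # can feedback be gathered remotely?
        reporters: str          # "traditional" / "key informants" / "crowdsourcing"
        immediacy: str          # "non real time" / "moderate" / "real time"
        feedback_set_size: int  # number of transmitted reports
        error_rate: float       # share of rejected or erroneous entries
        action_on_data: str     # "none" / "service provision" / "strategic use" / "both"

    # Example values loosely based on the ChildCount case described later.
    childcount = CaseAssessment(
        ict_used="simple mobile phone with SMS",
        remote=True,
        reporters="key informants",
        immediacy="real time",
        feedback_set_size=20000,
        error_rate=0.10,
        action_on_data="both",
    )
    ```

    Encoding each case this way makes the cross-case comparison in the analysis section a matter of iterating over a list of records.
    
    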
  • 4. Data
    • ChildCount is an integrative technological platform
    • Use of SMS text messages from mobile phones by 108 community health workers (CHWs) for patient registration and health reporting
    • Central web-based interface containing aggregated health data from the community
    Case 1: Healthcare pilot in Kenya - Involvement, remoteness, immediacy
  • 4. Data (2)
    • Over 20,000 SMS-based health reports reaching the web-based interface (patient registrations, nutrition screening reports, vaccination registration)
    • Daily data supply
    • 10 per cent of all messages supplied in an improperly structured format, resulting in their rejection by the system
    • Analysis of health-related macrodata of the community such as birth rates, nutritional trends and seasonal variability of malaria rates; performance monitoring of CHWs
    Case 1: Healthcare pilot in Kenya - Resulting data set and
    action on data
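    The 10 per cent rejection rate above stems from improperly structured SMS. A minimal sketch of how such structured-message validation could work; the `+REG` format and the field names are hypothetical illustrations, not ChildCount's actual message protocol:

    ```python
    import re

    # Hypothetical SMS report format: "+REG <patient_id> <age_months> <muac_mm>".
    # Illustrative only; not the real ChildCount protocol.
    REPORT_RE = re.compile(r"^\+REG\s+(\w+)\s+(\d{1,3})\s+(\d{2,3})$")

    def parse_report(sms: str):
        """Return a parsed report dict, or None if the message is malformed."""
        m = REPORT_RE.match(sms.strip())
        if not m:
            return None
        patient_id, age, muac = m.groups()
        return {"patient_id": patient_id, "age_months": int(age), "muac_mm": int(muac)}

    msgs = ["+REG P1023 18 115", "REG P1023 18", "+REG P2001 7 132"]
    parsed = [parse_report(m) for m in msgs]
    rejected = sum(p is None for p in parsed)  # malformed messages are rejected
    ```

    A real deployment would log rejected messages and reply to the sender with a correction hint, since silently dropped reports are lost feedback.
    
    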
  • 4. Data (3)
    • Field Level Operations Watch (FLOW), a cloud-based data collection system to replace paper-based surveys
    • Use of electronic surveys installed on Android phones by Water For People staff, field workers of local partner NGOs and community members
    • Web-based system for mapping, data analysis and aggregation
    Case 2: Water point monitoring pilot in Guatemala
    - Involvement, remoteness, immediacy
  • 4. Data (4)
    • Over 1000 data points: 116 water points, 112 school water facilities, 1000 household surveys
    • Data on one water point is provided at least once a year (or more frequently)
    • Zero errors at data entry
    • Data provides the basis for regional meetings among Water For People staff, local partner organisations and the local government; data is regularly included into reports
    Case 2: Water point monitoring pilot in Guatemala
    - Resulting data set and action on data
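    Since each water point is surveyed at least once a year, a monitoring backend could flag points whose data has gone stale. A minimal sketch with made-up water-point IDs and dates:

    ```python
    from datetime import date

    # Illustrative data; water-point IDs and survey dates are made up.
    last_survey = {
        "wp-001": date(2011, 3, 2),
        "wp-002": date(2009, 11, 20),
    }

    def stale_points(surveys, today, max_age_days=365):
        """Return IDs of water points not surveyed within the last year."""
        return [wp for wp, d in surveys.items() if (today - d).days > max_age_days]

    overdue = stale_points(last_survey, today=date(2011, 6, 1))
    ```

    Such a staleness report could feed directly into the regional meetings the slide mentions, turning the "at least once a year" rule into an actionable checklist.
    
    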
  • 4. Data (5)
    • Approach to empower rural people to become change agents themselves and demand accountability
    • Use of text messages and simple mobile phones by citizens
    • Central web-based interface
    Case 3: Water point monitoring pilot in Tanzania
    - Involvement, remoteness, immediacy
  • 4. Data (6)
    • 800 text messages from beneficiaries
    • Approx. 2.5 reports a day
    • All reports were fed into the system; however, many messages were inappropriately formatted or contained incomplete information
    • Use of data for political purposes; 200 messages were passed on to the local government in order to start repairing broken water points
    Case 3: Water point monitoring pilot in Tanzania
    - Resulting data set and action on data
  • 5. Analysis and summary
    Case | Involvement | “Size of feedback data set” | Errors | “Evidence-based action on data”
    ChildCount, Kenya | Local CHWs as key informants (bounded crowdsourcing) | Over 20,000 SMS by CHWs (3 reports by CHWs daily on average) | 10% of all messages were rejected | Both service provision (treatments) and strategic use (health macrodata)
    Water for People, Guatemala | Local volunteers and community members (bounded crowdsourcing) | Ca. 200 data points, each updated several times = 1000+ feedback set (every water point regularly updated, at least once a year) | Zero rejection | Both service provision (repairing water points) and strategic use
    Maji Matone, Tanzania | Citizens | 800 SMS by beneficiaries (approx. 2.5 reports a day during pilot) | Zero rejection; many messages improperly formatted or incomplete | Both service provision (at least 12 water points repaired) and strategic use
  • 6. Discussion
    • Treat results with caution; success may be context-specific
    • Under specific conditions: great potential in the field of international development
    • However; important to establish the “specific conditions” under which ICT pilots are successful
    • Need for quantitative studies with large-N comparisons
    • Implementing NGOs should be aware of the “dark side” of mobile phones; conduct thorough multidisciplinary analysis of the target area prior to pilot launch
    More research is needed!
  • 7. Suggestions for future research
    - Outreach and promotional activities
    - Setting the right incentives
    - Data validation
    Areas of institutional design that may influence outcome:
  • 9. References
    Amin, S., J. Das, et al. (2007). Are You Being Served? New Tools for Measuring Service Delivery. Washington, DC: The World Bank.
    Berg, M., J. Wariero, et al. (2009). Every Child Counts - The use of SMS in Kenya to support the community based management of acute malnutrition and malaria in children under five, Millennium Villages Project, Earth Institute at Columbia University.
    Bonbright, D. (2006). "Not learning from beneficiaries." Alliance, 11. (2).
    Cars, M. (2006). Project Evaluation in Development Cooperation: A Meta-Evaluative Case Study in Tanzania, Stockholm University, Institute of International Education
    Caseley, J. (2003). Blocked Drains and Open Minds: Multiple Accountability Relationships and Improved Service Delivery Performance in an Indian City. IDS Working Paper. Brighton, UK, Institute of Development Studies.
    Chambers, R. (1994). "Participatory rural appraisal (PRA): Analysis of experience." World Development, 22. (7).
    Chambers, R. (1997). Whose reality counts? Putting the first last. London, Intermediate Technology.
    Cornwall, A. and V. S. P. Coelho (2007). Spaces for Change? The Politics of Participation in New Democratic Arenas.
    Cornwall, A., V. Schattan, et al. (2004). New Democratic Spaces?
    Coyle, D. and P. Meier (2009). New Technologies in Emergencies and Conflicts: The Role of Information and Social Networks. Washington, D.C. and London, UK, UN Foundation-Vodafone Foundation Partnership.
    Cracknell, B. E. (2000). Evaluating Development Aid. Issues, Problems and Solutions. New Delhi, Thousand Oaks, London, SAGE Publications.
    Daraja (2009a). Raising the Water Pressure - Programme Strategy Paper. Harnessing citizens’ agency to promote accountability, equity and sustainability in rural water supply.
    Daraja (2009b). Raising the Water Pressure. A Concept Note. Harnessing new technology, the power of information and citizens’ agency to promote equity and functionality in rural water supply.
    Donner, J., K. Verclas, et al. (2008). Reflections on MobileActive08 and the M4D Landscape. In Perspective. Proceedings of 1st International Conference on M4D 2008.
    Eagle, N. and A. S. Pentland (2009). "Eigenbehaviors: identifying structure in routine." Behavioral Ecology and Sociobiology, (63): 1057–1066.
    Fals-Borda, O. and M. A. Rahman, Eds. (1991). Action and Knowledge: Breaking the Monopoly with Participatory Action-Research. New York, Apex Press.
  • 9. References
    Forss, K. and J. Carlsson (1997). "The Quest for Quality - Or Can Evaluation Findings Be Trusted." Evaluation, (3).
    Freire, P. (1972). Pedagogy of the Oppressed. New York, Herder & Herder.
    Goetz, A.-M. and J. Gaventa (2001). Bringing citizen voice and client focus into service delivery. IDS Working Paper 138. Brighton, Institute of Development Studies.
    Guba, E. G. and Y. S. Lincoln (1989). Fourth Generation Evaluation. London, Sage Publications.
    Hellström, J. (2008). Mobile phones for good governance – challenges and way forward.
    Holland, J. and J. E. Blackburn, Eds. (1998). Whose Voice? Participatory Research and Policy Change. London, Intermediate Technology Publications.
    Howe, J. (2006). "The Rise of Crowdsourcing." Wired, (14.06.).
    ITU (2010). World Telecommunication/ICT Development Report 2010. Monitoring the WSIS targets. A mid-term review.
    Kothari, U. (2001). Power, Knowledge and Social Control in Participatory Development. Participation. The New Tyranny? B. Cooke and U. Kothari. London, Zed Books: 139-152.
    Krüger, S. and S. Teggemann (2008). Institutional Leadership in a Multistakeholder International Development Setting. Leadership as a Vocation. Houben/Rusche, NomosVerlag.
    Lundberg, M. (2008). Client Satisfaction and the Perceived Quality of Primary Health Care in Uganda Mattias Lundberg. Are you being served? New Tools for Measuring Service Delivery: 323.
    Martin, C. (2009). Put up a billboard and ask the community: Using mobile tech for program monitoring and evaluation. October 31, 2009. Accessed 15.03.2011.
    McGee, R. and J. Gaventa (2010). Review of Impact and Effectiveness of Transparency and Accountability Initiatives. Synthesis report. Transparency and Accountability Initiative Workshop, Institute of Development Studies.
    Mohan, G. (2001). Beyond Participation: Strategies for Deeper Empowerment. Participation. The New Tyranny? B. Cooke and U. Kothari. London, Zed Books: 163-167.
    Munyua, A. W. and M. Mureithi (2008). "Harnessing the power of the cell phone by women entrepreneurs: new frontiers in the gender equation in Kenya. GRACE project research report."
  • 9. References
    Norheim-Hagtun, I. and P. Meier (2010). "Crowdsourcing for Crisis Mapping in Haiti." Innovations: Technology, Governance, Globalization 5 (4): 81-89.
    Patnaik, S., E. Brunskill, et al. (2008). Evaluating the Accuracy of Data Collection on Mobile Phones: A Study of Forms, SMS, and Voice.
    Patton, M. (2011). Developmental Evaluation. Applying Complexity Concepts to Enhance Innovation and Use. New York, London, The Guilford Press.
    Paulos, E., R. J. Honicky, et al. (2009). Citizen Science: Enabling Participatory Urbanism. Handbook of Research on Urban Informatics: The Practice and Promise of the Real-Time City. M. Foth. Hershey, New York, Information Science Reference, IGI Global.
    Power, M. (1997). The Audit Society, Rituals of Verification. New York, Oxford University Press.
    Ravindra (2004). An Assessment of the Impact of Bangalore Citizen Report Cards on the Performance of Public Agencies, ECD Working Paper Series
    Rebien, C. (1996). Evaluating Development Assistance in Theory and Practice. Avebury, Aldershot
    Rudqvist, A. and P. Woodford-Berger (1996). Evaluation and Participation. Sida Studies in Evaluation 96/1. Stockholm, Department for Evaluation and Internal Audit, Sida
    Schuster, C. and C. Perez-Brito (2011). "Cutting costs, boosting quality and collecting data real-time. Lessons from a Cell Phone-Based Beneficiary Survey to Strengthen Guatemala’s Conditional Cash Transfer Program." En Breve, World Bank LAC, (166).
    Sutton, P., D. Roberts, et al. (1997). "A Comparison of Nighttime Satellite Imagery and Population Density for the Continental United States." Photogrammetric Engineering and Remote Sensing, (63): 1303-1313.
    Turner, M. (2011). Mobilizing Development. Washington, D.C., Berkshire, UK, The UN Foundation and Vodafone Foundation Technology Partnership.
    Ulbricht (2011). 06.06.2011
    UNDP (1997). Who Are the Question-Makers? A Participatory Evaluation Handbook.
    Water for People (2007). Water for People - Guatemala Country Strategy.
    Weiss, C. (1997). "How can theory-based evaluation make greater headway?" Evaluation, 21. 501-524.
    WHO (2006). The World Health Report 2006 - Working Together for Health. Geneva, World Health Organization
    WHO (2008). Safer Water, Better Health.
    World Bank (2004). World Development Report. Making Services Work for Poor People.
  • 9. Interviews
    - Interview with Ben Taylor, Executive Director of Daraja, Tanzania
    - Interview with Keri Kugler, programmatic data manager at Water for People, Denver
    - Interview with Dr. James O. Wariero, ChildCount, Kenya, Nairobi