Ia Clinic Penang

529 views
Impact Assessment 1
Upload Details

Uploaded as Microsoft PowerPoint

Usage Rights

© All Rights Reserved

Ia Clinic Penang: Presentation Transcript

  • Improving ICT4D Impact Assessment – Pan-all Partners’ Conference, Penang, Malaysia, June 13, 2009
  • What is Impact Assessment?
    • “Impact assessment is the systematic analysis of lasting or significant changes – positive or negative, intended or not – in people’s lives brought about by a given action or series of actions” – Chris Roche (1999)
    • IA is concerned with lasting changes in people’s lives
    • The changes take place in a context where many factors can contribute to them
    • The changes can be both intended and unintended
    • Citations:
    • Roche, Chris. 1999. Impact Assessment for Development Agencies: Learning to Value Change. London: Oxfam
  • Impact Assessment versus Project Monitoring & Evaluation
    • Monitoring is a systematic and continuous assessment of progress over time (Oakley, Pratt and Clayton, 1998)
    • Evaluation is a periodic assessment of the relevance of performance, efficiency, and outputs against stated objectives (Bakewell, 2003)
    • Citations:
    • Oakley, P., Pratt, B. and Clayton, A. (1998) Outcomes and Impact: Evaluating Change in Social Development, Oxford: INTRAC
    • Bakewell, O. (2003) Sharpening the Development Process: A Practical Guide to Monitoring and Evaluation. Oxford: INTRAC.
  • Impact Assessment versus Project Monitoring & Evaluation
    • Monitoring – primary timing: during implementation; measuring: ongoing activities
    • Evaluation – primary timing: at the end of implementation; measuring: performance against objectives
    • Impact Assessment – primary timing: after implementation; measuring: lasting and sustainable change
    • Source: Souter, D. 2008. BCO Impact Assessment Study: Final Report. http://www.bcoalliance.org/system/files/BCO_FinalReport.pdf
  • The ICT4D Value Chain
    • Precursors (READINESS): data systems, legal, institutional, human, technological, leadership, drivers/demand
    • Inputs: money, labour, technology, values and motivations, political support, targets
    • Intermediates / Deliverables (AVAILABILITY): telecentres, libraries, shared telephony, other public access systems
    • Outputs (UPTAKE): new communication patterns, new information & decisions, new actions & transactions
    • Outcomes: financial & other quantitative benefits, qualitative benefits, disbenefits
    • Development Impacts (IMPACT): public goals (e.g. MDGs)
    • The chain moves through strategy, implementation, adoption, and use, shaped by exogenous factors, sustainability, and scalability
    • Source: Heeks, R. and A. Molla. 2008. Compendium on Impact Assessment of ICT-for-Development Projects. http://ict4dblog.files.wordpress.com/2009/06/idrc-ia-for-ict4d-compendium1.doc
  • What Makes Impact Assessment Difficult?
  • Challenges Related to Impact Measurement
    • Complexity of change
    • Contextual challenge
    • Challenge of baseline
    • Challenge of attribution
    • Challenge of aggregation and disaggregation
    • Challenge of non-users
    • Challenge resulting from the unexpected
    • Challenge of perspective/perception
    • The “longitudinal” problem
    • Source: Souter, D. 2008. BCO Impact Assessment Study: Final Report. http://www.bcoalliance.org/system/files/BCO_FinalReport.pdf
  • Complexity of Change
    • Change often occurs within a complex system – not linear and straightforward
    • The change sought by a development intervention can be susceptible to unexpected externalities
    • Change often needs to be measured against a moving baseline
    • The velocity of change can fluctuate due to various factors – some of which are unpredictable
    • Important considerations:
    • In-depth understanding of context where the intervention takes place is necessary
    • In-depth understanding of baseline in which change can be measured
    • Continuous monitoring of data is needed rather than finding indicators only at the beginning and end of a particular intervention
  • Contextual Challenges
    • Contextual understanding requires the knowledge of social, cultural, gender, structural, economic, political, and environmental factors.
    • The complexity and influence of context increases along the ICT4D value chain, from output to development impact
    • In most cases, those who understand context best are those living within the communities concerned
    • Important Considerations:
    • Important to account for all target beneficiaries and stakeholders
    • Participation of project/program users in the design of impact assessment studies
    • Don’t assume replicability of impact before understanding context
  • The Challenge of Baseline
    • IA concerns the measurement of change, so it needs a starting point from which change is to be measured
    • Ideal baseline data include: the broad context of the intervention or phenomenon, the most up-to-date data, and data obtained from qualitative sources (to complement quantitative data)
    • In some instances, baseline data continue to shift rapidly (e.g. the number of cell phones owned per 100 households)
    • Important considerations:
    • Integrate IA into the project/research design and its monitoring plan
    • Building a discipline for data collection is important in any intervention
    • Consider using trends rather than static measures in light of a moving baseline
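The contrast between a static baseline-vs-endline measure and a trend fitted to all monitoring points can be sketched as follows. This is an illustrative example, not from the presentation: the data, function names, and indicator are hypothetical.

```python
# Illustrative sketch: a static endline-minus-baseline delta can be
# misleading when the baseline is moving; a slope fitted to periodic
# monitoring data uses every observation instead of just two.

def static_change(series):
    """Change measured only as endline minus baseline."""
    return series[-1] - series[0]

def trend_slope(series):
    """Least-squares slope per monitoring period, using all points."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical quarterly counts of, say, telecentre users.
observations = [100, 120, 150, 145, 180]

print(static_change(observations))          # 80
print(round(trend_slope(observations), 1))  # 18.5
```

The slope summarises the whole monitoring series, so a temporary dip (the fourth quarter here) does not dominate the result the way it could in a two-point comparison.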
  • The Challenge of Attribution
    • While attributing immediate results (i.e. outputs) and even intermediate results (i.e. outcomes) may be possible, longer-term results (i.e. impact) pose a greater challenge
    • Allocating responsibility of a particular result to a particular cause (or causes) is substantially more difficult within a complex system.
    • Attributing impact becomes more difficult as the size of the intervention decreases
    • Important considerations:
    • Consider approaches that contribute to the reduction of uncertainty rather than trying to “prove” attribution in every IA scenario
    • Do not sacrifice the richness of data for statistical rigor – IA is not only about precision but also about understanding and improving
    • Understand the nature and character of the intervention in order to determine the limits of randomized/experimental approaches
  • Challenge of Aggregation and Disaggregation
    • Aggregation challenge – in a complex system, different activities often influence one another, so changes are attributed to a collection of interacting interventions
    • Disaggregation challenge – impact measures need to be disaggregated into various categories in order to be properly understood
    • Important considerations:
    • Aggregation challenge: the same considerations apply as for the attribution challenge – know the nature of the intervention before conducting IA, and complement “conventional” IA approaches with other methods
    • Disaggregation challenge: build the capacity to monitor and assess impact at the lowest level of disaggregation into the project design
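As an illustration of why disaggregation matters (hypothetical data and field names, not from the presentation), an impact indicator can be grouped by a category such as gender or district so that differences hidden in the aggregate become visible:

```python
# Illustrative sketch: disaggregating a mean impact indicator
# (here, income change) by a categorical field. All records and
# field names are hypothetical.
from collections import defaultdict

def disaggregate(records, key):
    """Mean income change per value of the given category key."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["income_change"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

survey = [
    {"gender": "F", "district": "North", "income_change": 12.0},
    {"gender": "M", "district": "North", "income_change": 4.0},
    {"gender": "F", "district": "South", "income_change": 8.0},
    {"gender": "M", "district": "South", "income_change": 0.0},
]

print(disaggregate(survey, "gender"))  # {'F': 10.0, 'M': 2.0}
```

The overall mean here is 6.0, which hides that one group benefits five times more than the other – exactly the kind of pattern disaggregated monitoring is meant to surface.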
  • Challenge of the Non-Users
    • To get a holistic picture of what is happening following an intervention, impact assessment needs to include a wide range of stakeholders, more so than short-term evaluation
    • Those not intended to receive benefits from the intervention (i.e. non-users) may still be affected by the intervention, because of possible interactions with the beneficiaries.
    • Important Considerations:
    • Since non-users can confound IA results, they need to have the same chance as users of being selected into the study sample
    • Consider visually mapping important stakeholders within the area where the intervention takes place when designing IA studies
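Giving non-users the same chance of selection amounts to drawing a simple random sample from a single frame that lists both groups. The sketch below is illustrative only; the frame sizes and identifiers are hypothetical.

```python
# Illustrative sketch: one sampling frame covering users and
# non-users of an intervention, sampled without replacement so
# both groups have an equal chance of selection.
import random

def draw_sample(users, non_users, n, seed=42):
    """Simple random sample of n entries from the combined frame."""
    frame = [(p, "user") for p in users] + \
            [(p, "non-user") for p in non_users]
    rng = random.Random(seed)  # fixed seed for reproducibility
    return rng.sample(frame, n)

# Hypothetical frame: 30 user households, 70 non-user households.
households = draw_sample([f"U{i}" for i in range(30)],
                         [f"N{i}" for i in range(70)], n=20)
print(len(households))
```

Because the draw is over the combined frame, the expected share of non-users in the sample mirrors their share in the community rather than being zero, as it would be if only beneficiaries were listed.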
  • Challenge of the Unexpected
    • It is unlikely we will be able to anticipate every possible outcome of an intervention at the beginning
    • Unexpected outcomes can be both positive and negative, and sometimes they can be more significant than the intended outcomes
    • Important Considerations:
    • Always expect the “unexpected” and be honest about the results – there can be important lessons from the negative experiences
    • Use of participatory methods in the design of impact indicators can help mitigate the unexpected
  • Challenge of Perception
    • Non-beneficiaries and target beneficiaries of an intervention are likely to perceive change differently from one another
    • Different stakeholders also have different perceptions of impact (e.g. some beneficiaries may perceive impact at the household level whereas the project manager may perceive it at the level of a social group or community)
    • Important Considerations:
    • Diversity of experience and perceptions should be captured around an intervention
    • Use of participatory methods for capturing diverse stakeholder views in the design of IA
  • The “Longitudinal” Problem
    • Impact assessment is often undertaken immediately after the completion of a project – but how can IA offer insights into “lasting and sustainable change” before it has actually occurred? How do we know whether it has occurred?
    • Important Considerations:
    • Using proxy indicators may be useful for identifying future impact (e.g. using attitude surveys)
    • Consider designing IA as a longitudinal study (or tracker study), undertaken some time after the intervention has been completed