Participatory performance monitoring of WASH services at scale in BRAC WASH Programme


By Mahjabeen Ahmed, BRAC, Bangladesh. Prepared for the Monitoring sustainable WASH service delivery symposium, Addis Ababa, Ethiopia, 9-11 April 2013.

Slide notes
  • With annual QIS, everyone can learn to manage change and improve their performance scores: from households to VWCs, schools and Rural Sanitation enterprises, to programme managers at Upazila, regional and country level.
  • Likert scales are the well-known scales in which participants score on a range of, for example, 1 to 5, where 1 is the lowest score and 5 the highest.
  • This is an example of a QIS scale. It is for the first performance indicator: the quality of performance of the VWC. The scores are first agreed in separate subgroups: the male VWC members sit and score with the male member of the monitoring team, and the female VWC members sit and score with the female monitoring team member. This is called triangulation (the same data collected from two sources) and gives more reliable data. Because the QIS is a learning tool, the two sub-groups then meet, compare their scores and agree on where they are now as a VWC, where they can improve, and how they will do so. Each scale also comes with two rows for qualitative information: why is your score low or high? And: what action will you take? Thus, monitoring also becomes a learning and planning instrument. Other performance indicators for VWCs for which there are separate scales (not shown here) are: gender equality, or the degree of integration of women VWC members into WASH management, and the quality of cooperation between the VWC and the LGI.
  • The male team members came from ….. The female team members were ….
  • The male team members came from ….. The female team members were ….
  • The result is a very good geographic spread thanks to the sampling frame and method.
  • The NA category currently includes both 'no reply' (a few cases) and 'not applicable', without specifying the reason for non-applicability. This must be improved in round 2.
  • The male team members came from ….. The female team members were ….

Transcript

  • 1. Participatory performance monitoring of WASH services at scale in BRAC WASH Programme. Monitoring Sustainable WASH Service Delivery, Addis Ababa, Ethiopia, 9-11 April 2013. Mahjabeen Ahmed, BRAC
  • 2. An Introduction to BRAC. BRAC coverage in Bangladesh: Districts 64, Field Offices 2,661, Population covered 113 million. BRAC's presence in the world
  • 3. BRAC WASH Programme. The Programme started in 2006 with 150 sub-districts, and has reached 248 sub-districts till date.
    Component | Target (million people) | Achievement till Dec 2012 (million people)
    Water | 2.5 | 1.97
    Sanitation | 35 | 27.81
    Hygiene | 51 | 42
  • 4. Key Programme Strategies
    • Creating demand
    • Establishing Village WASH Committees
    • Strong interpersonal communication component to change behaviours
    • Tailored support to ensure that hardcore poor (grants) and poor (soft loan) are reached
    • To meet the demand, supporting Rural Sanitation Centres (loans & orientation)
    • Stimulate innovation through action research programme
  • 5. Monitoring Methodology
    • WASH I had an MIS to monitor inputs and outputs
      - Inputs: e.g., number of visits, trainings
      - Outputs: e.g., no. of VWCs established, no. of toilets built, of different types
    • WASH II also needed performance monitoring: that is, how well toilets are used; how well VWCs continue to perform; to what extent women are integrated in planning and management; etc.
    • Now: MIS + QIS (Quantified Qualitative Information System)
  • 6. History & Rationale
    • Quantified qualitative assessment methodology
    • First developed by IRC and WSP in 1998
    • Aim: to replace surveys, because they are extractive and inform only central management and donors, not the users, the VWCs and field workers
    • QIS:
      1. visualises to all participants where they perform well and where they can improve ('climb the ladder')
      2. produces statistics that inform management and donors on progress of the whole programme
      3. allows comparison of results over time and between locations
  • 7. Scoring methodology
    • QIS uses Likert scales: participants score on a scale from 1-5, in which 1 is lowest and 5 is highest.
    • Two differences:
      - Each scale consists of "progressive mini-scenarios"
      - Each scale starts at 0, not at 1
    • Participants can see their level and can climb from 0 ("nothing to show") to 4 ("the ideal of 4 key measurable criteria"). The scores can be analysed statistically.
    • A scale consists of "no x, x+1, x+1+1, x+1+1+1 and x+1+1+1+1", in which each 1 is a criterion for the indicator.
  • 8. Example of a QIS scale
    Indicator 1: PERFORMANCE OF VWC | Score
    IDEAL: (1) Committee (male and female members) meets every 2 months + (2) maintains list of decisions and meeting minutes + (3) identifies gaps and takes action + (4) mobilizes ADP funds for hard core poor | 4
    (1) Committee (male and female members) meets every 2 months + (2) maintains list of decisions and meeting minutes + (3) identifies gaps and takes action | 3
    BENCHMARK: (1) Committee (male and female members) meets every 2 months + (2) maintains list of decisions and meeting minutes | 2
    (1) Committee (male and female members) meets every 2 months | 1
    No full VWC, OR, VWC exists but does not meet | 0
    Reason for high/low score:
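    The cumulative logic of slides 7-8 (each level adds one criterion, and a unit sits at the highest level for which all criteria from the first onwards are met) can be sketched in a few lines of Python. This is an illustration only, not BRAC's or IRC's tooling; the function and variable names are hypothetical.

```python
# The four criteria from the VWC scale on slide 8, in ladder order.
VWC_CRITERIA = [
    "Committee (male and female members) meets every 2 months",
    "Maintains list of decisions and meeting minutes",
    "Identifies gaps and takes action",
    "Mobilizes ADP funds for hard core poor",
]

def qis_score(criteria_met):
    """QIS level = number of consecutive criteria met, counted from the
    first one; 0 means 'nothing to show', 4 is the ideal."""
    score = 0
    for met in criteria_met:
        if not met:
            break
        score += 1
    return score

# A VWC that meets every 2 months and keeps minutes, but does not yet
# identify gaps, sits at the benchmark even if it mobilizes ADP funds:
observed = dict(zip(VWC_CRITERIA, [True, True, False, True]))
print(qis_score(list(observed.values())))  # -> 2 (BENCHMARK)
```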
  • 9. QIS development process (1)
    • Workshop 1 (Jan 2012, 1 week) with WASH staff from HQ, all regional programme managers (20) & IRC advisors:
      - Set indicators
      - Formulate & review scales
      - Define terms
      - Design scoring process
      - Finalize scales & work plan
    • Workshop 2 (March 2012, 1 week)
      - First pilot training & field testing of QIS
  • 10. QIS development process (2)
    • Adjustment of scales, process and manual (April 2012)
    • Full-scale pilot (August-Sept 2012) with 432 households (144 UP/P/NP), 36 VWCs, 12 schools and 12 RSCs in 4 upazilas in 4 geographical zones
    • Workshop 3 (Oct 2012, 1 week):
      - Analysis of data & experiences and report
      - Adjustment of scales and manual
      - Selection of independent monitoring teams
      - Sample design: sample frame, size, sampling methods
  • 11. QIS development process (3)
    • Training of 30 independent monitoring teams (Dec 2012): 1 male and 1 female in each team, 3 batches for 6 days
    • 3-stage sample (see the sketch below):
      (1) Random choice of 50 sub-districts from WASH I and 50 new unions from WASH II, with probability proportional to size
      (2) Minimally 3 VWCs per sub-district
      (3) 27 HHs per VWC (3x9 HH for UP/P/NP)
    • Sample size: 8,100 HHs, 400 schools, 300 VWCs, 300 RSCs
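    The 3-stage design above can be illustrated with a short sketch. The dict-based sampling frame, the with-replacement PPS draw and the names used here are simplifying assumptions for illustration, not the frame or software BRAC actually used.

```python
import random

def three_stage_sample(unit_sizes, vwcs_by_unit, hhs_by_vwc,
                       n_units=50, n_vwc=3, n_hh=27, seed=1):
    """Stage 1: PPS draw of primary units (sub-districts or unions);
    stage 2: random VWCs per unit; stage 3: random households per VWC."""
    rng = random.Random(seed)
    names = list(unit_sizes)
    # Stage 1: probability proportional to size, with replacement for
    # simplicity (a real design would typically use systematic PPS).
    chosen = rng.choices(names, weights=[unit_sizes[u] for u in names], k=n_units)
    households = []
    for unit in chosen:
        for vwc in rng.sample(vwcs_by_unit[unit], n_vwc):          # stage 2
            households.extend(rng.sample(hhs_by_vwc[vwc], n_hh))   # stage 3
    return households
```

Run once for the 50 WASH I sub-districts and once for the 50 WASH II unions, this gives 100 primary units, 300 VWCs and 27 x 300 = 8,100 households, matching the totals on the slide.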
  • 12. The QIS indicators for BRAC WASH
    Village WASH Committee:
      1. Safe and protected drinking water source
      2. Performance of VWC
      3. Women's participation
    Households:
      4. Safe and protected main drinking water source
      5. Drinking water management from source to cup
      6. Sanitary and hygienic household latrine
      7. Who uses the latrine
      8. When are latrines used
      9. Hand-washing provisions after defecation
      10. Sludge management when latrine pit is full
    Schools:
      11. Sanitary and hygienic school toilets
      12. Student brigade
      13. Menstrual hygiene management
      14. Performance of school WASH committee
    Rural sanitation centres:
      15. Performance of RSCs
  • 13. Data collection and analysis
    • 1 male & 1 female staff collected data
    • Data directly entered into a smart phone
    • Data sent by smart phones to the QIS database
    • No data entry persons needed in Dhaka
    • ICT and WASH staff cleaned the database
    • BRAC's QIS manager and IRC's QIS advisors did a first analysis of the data using Epi Info open source software
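    Slide 13 says the cleaned database was first analysed with Epi Info. A first-pass tabulation of the same kind (share of units at each level per indicator, plus the share at or above the benchmark) could look like the pandas sketch below; the column names 'indicator' and 'score' are assumptions about the cleaned data, not the real schema.

```python
import pandas as pd

def score_distribution(df, benchmark=2):
    """Percent of units at each QIS level (0-4) per indicator, plus the
    share at or above the benchmark level."""
    counts = df.groupby(["indicator", "score"]).size().unstack(fill_value=0)
    pct = counts.div(counts.sum(axis=1), axis=0).mul(100)
    pct["at_or_above_benchmark"] = pct.loc[:, pct.columns >= benchmark].sum(axis=1)
    return pct.round(1)

# Dummy example with four VWC records scoring 4, 3, 2 and 2:
records = pd.DataFrame({"indicator": ["Performance of VWC"] * 4,
                        "score": [4, 3, 2, 2]})
print(score_distribution(records))
```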
  • 14. Sample locations of: 8,100 HHs, 400 schools, 300 VWCs, 300 RSCs
  • 15. Quality Assurance
    • Scientific sampling methods
    • Unique bar code avoids duplication when data is submitted more than once, and allows re-visits if needed
    • Greater reliability through:
      - No manual data entry (high error rate)
      - Double data entry, in smart phone and on paper
      - Team compared scores on internal consistency before sending/uploading
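    Purely as an illustration of how the bar-code and score checks implied on this slide could be automated before upload (the record layout is assumed, not BRAC's actual format):

```python
VALID_SCORES = {0, 1, 2, 3, 4, None}   # None = not applicable / no reply

def check_batch(records):
    """Flag duplicate bar codes and out-of-range scores in a batch."""
    problems, seen = [], set()
    for rec in records:
        code = rec["barcode"]
        if code in seen:
            problems.append(f"duplicate submission for bar code {code}")
        seen.add(code)
        if rec["score"] not in VALID_SCORES:
            problems.append(f"score {rec['score']} out of range for {code}")
    return problems

print(check_batch([
    {"barcode": "HH-0001", "score": 3},
    {"barcode": "HH-0001", "score": 3},   # duplicate -> flagged
    {"barcode": "HH-0002", "score": 7},   # invalid score -> flagged
]))
```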
  • 16. Implementation and support
    • The teams started the data collection just after the training
    • Each team could always call Managers or ICT to solve any problems with data collection or smart phones
    • Real-time data
    • The hard copies were checked rigorously to detect errors & for improvements next time
    • One person was called back for retraining when consistent data errors were detected
    • For gender equality, team members alternated the roles of process facilitator and data entry
  • 17. Summary of preliminary findings
  • 18. QIS findings on VWC performance
    • All but 1 VWC are functional and meet regularly
    • 99% of VWCs are at or above the benchmark
    • Almost 1 in 3 VWCs meet all 4 criteria (meet 1x/2 months, record decisions and minutes, identify gaps and take action, mobilise govt. funds for the ultra poor)
    BUT
    • Almost 70% can still improve performance further by mobilizing government funds meant for the ultra poor
    • 26% can also improve by noting gaps and taking actions
    PERFORMANCE OF VWC | %
    IDEAL: (1) M+F members meet every 2 months + (2) record decisions & minutes + (3) identify gaps & take action + (4) mobilize ADP funds for HCP | 31
    (1) M+F members meet every 2 months + (2) record decisions & minutes + (3) identify gaps & take action | 42
    BENCHMARK: (1) M+F members meet every 2 months + (2) record decisions and minutes | 26
    (1) M+F members meet every 2 months | 0
    No full VWC, OR, VWC exists but does not meet | 1
    TOTAL | 100
  • 19. Safe management of drinking water in homes
    • Above the benchmark, P and NP perform better than HCP (P 43%, NP 52%, HCP 35%)
    • 22% HCP, 27% P and 33% NP meet all 4 criteria on safe drinking water from source to cup
    • Others (67% HCP, 67% P and 64% NP) still show 1-4 risky DW management practices
    [Stacked bar chart: distribution of scores (levels 0-4 and NA) for HCP, Poor and Non-poor households, with the benchmark level marked]
  • 20. Hand-washing provisions at toilet
    IDEAL: (1) Enough water to wash hands carried or available in or near latrine + (2) soap/soap solution in plastic bottle at latrine + (3) water for hand-washing is from safe source + (4) there is a special hand-washing station | 4
    (1) Enough water to wash hands carried or available in or near latrine + (2) soap/soap solution in plastic bottle at latrine + (3) water for hand-washing is from safe source | 3
    BENCHMARK: (1) Enough water to wash hands carried or available in or near latrine + (2) soap/soap solution in plastic bottle at latrine | 2
    (1) Enough water to wash hands carried or available in or near latrine | 1
    No provisions for hand-washing carried or available in or near latrine | 0
    • Presence of a hand-washing facility is used as a proxy for hand-washing behaviour.
    • The majority of scores are at or above the benchmark. Hand-washing provision at toilets is still a focus area: 10% are still at score 0.
  • 21. QIS findings from the schools
    | Toilets for girls | Student brigade | Menstrual hygiene management
    Above BM | 80% | 68% | 68%
    At BM | 12% | 20% | 5%
    Below BM | 8% | 12% | 26%
    Total | 100% | 100% | 100%
  • 22. Experiences
    • Performance monitoring concept was new: VWCs, householders and field staff initially experienced it as evaluation/judgemental
    • Once understood, the 'climbing the ladder' approach is much appreciated by all
    • Small details could be easily captured
    • Comparing written and phoned data showed the need for some corrections and is good learning for the next round (Dec. 2013)
  • 23. Thank you