PRISM - A Composite Score Model by Bongs Lainjo

Slide note

  • Testing Internal and External Logic: If the OUTPUTS are delivered through planned ACTIVITIES using relevant INPUTS, and the corresponding ASSUMPTIONS at the OUTPUT, OUTCOME and IMPACT levels remain valid, then the desired OUTCOME will materialise, leading to the intended IMPACT.

Transcript

  • 1. Program Indicator Screening Matrix (PRISM): A Composite Score Framework. Bongs Lainjo, MASc Engineering, RBM Systems Consultant and Former UN Senior Program Advisor, Montreal, Canada. Email: bsuiru@bell.net. Canadian Evaluation Society (CES) Conference, Toronto, Canada, June 9–12, 2013. Prepared by Bongs Lainjo, 06/12/13.
  • 2. Presentation Outline: Introduction; Objectives; Relevance; Target Audience; Evaluation Life Cycle (ELC); Program Design Framework (PDF); Themes; PRISM: A Composite Score Framework; Lessons Learned.
  • 3. PRISM: Introduction. Evaluation: demand-driven; Participatory; Inclusive; Bottom-Up Strategy (BUS); Consensus-Based; Random Thematic Sub-Groups; Intra-Thematic-Group Concordance; Inter-Thematic-Group Concordance; Bar (Gold Standard vs. Effective); Binary Outcome; Delphi Methodology; Mapping; Scope: Africa, Asia and Pacific Island Countries.
  • 4. OBJECTIVES: General. To strengthen the knowledge of IPs, PMs and other key stakeholders, emphasizing sustainable engagement in program management and implementation. This is an attempt to address existing nuances, highlighting the synergies that exist among the different result levels of the SFW and hence facilitating common ground between potential evaluators and the different interested parties.
  • 5. OBJECTIVES: Specific. Streamline by improving indicator causal links at all result levels; Mitigate duplication of indicators; Establish authentic contributions between different result levels; Establish meaningful synergies among different result levels: no lower-level result can contribute to more than one upper-level result;
  • 6. OBJECTIVES: Specific (Cont'd). Strengthen the program design; Promote a common understanding among key actors; and Minimize cost and optimize the number of indicators included in the program.
  • 7. PRISM: Relevance. Improves intended and unintended intervention results and makes foreign aid more focused, with evidence-based results; Establishes more effective, continuous and sustainable synergies among frontline forces, IPs, Funding Agencies, Stakeholders and Beneficiaries.
  • 8. PRISM: Target Audience. Funding Agencies; IPs; Program Managers; Relevant Stakeholders; Evaluators; Development Partners.
  • 9. Evaluation Life Cycle. Demand Recognition; Evaluation Team Identified; Inception Report Developed; Evaluation Process Implemented; Draft Report Developed and Presented; Final Report Developed and Submitted.
  • 10. Program Design Frameworks:
    Type: Logic Framework (Logframe). Results Levels: Impact, Outcome, Output. Agencies: UN, CIDA, EU, AusAID, DfID, WB.
    Type: Strategic Objective. Results Levels: Strategic Objective, Program Objectives, Program Sub-objectives. Agencies: USAID.
  • 11. LOGFRAME (diagram): an IF/THEN results chain. IF INPUTS, THEN ACTIVITIES; IF ACTIVITIES, THEN OUTPUT (where PRISM applies); IF OUTPUT, THEN OUTCOME, leading to the GOAL.
  • 12. PRISM: Definition. An R-by-C matrix, where R = number of thematic indicators and C = six screening criteria. Each indicator is cross-tabulated with each criterion; the intersecting cell is filled with either a "1" or a "0": the former if the indicator satisfies the criterion, the latter if it doesn't. The exercise continues until ALL indicators are screened. A corresponding final score (%) per indicator is established for each row; these scores are used in establishing Group Concordance. The Thematic Group and Sub-Groups agree on an effective %. Each Sub-Group is made up of a Moderator, a Rapporteur and a Team.
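The matrix definition above lends itself to a small worked example. The sketch below is illustrative only: the indicator names and the 0/1 cell values are invented, not taken from the presentation. It builds one row of the R-by-C matrix per indicator and derives each indicator's composite score as the percentage of the six criteria satisfied.

```python
# Illustrative sketch of the PRISM matrix described above.
# Indicator names and 0/1 scores are invented for demonstration.

CRITERIA = ["Specificity", "Reliability", "Sensitivity",
            "Simplicity", "Utility", "Affordability"]

def composite_score(row):
    """Percent of the six criteria an indicator satisfies (each cell is 0 or 1)."""
    return 100.0 * sum(row) / len(CRITERIA)

# R-by-C matrix: one row of six 0/1 cells per thematic indicator.
matrix = {
    "Hypothetical indicator A": [1, 1, 1, 1, 1, 0],
    "Hypothetical indicator B": [1, 0, 1, 1, 0, 1],
}

for indicator, row in matrix.items():
    print(f"{indicator}: {composite_score(row):.0f}%")
```

Each member of a sub-group would fill such a row independently, and the row percentages then feed the concordance step described later in the deck.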
  • 13. PRISM: Themes. Health; Education; Environment; Governance; Poverty; Judiciary; Agriculture; Social Security and Protection.
  • 14. PRISM: Criteria. Specificity; Reliability; Sensitivity; Simplicity; Utility; Affordability.
  • 15. PRISM: Criteria Definition. Specificity: the likelihood that the indicator measures the relevant result. In other words, is there a possibility that the result the indicator represents is not exactly what we are looking for?
  • 16. PRISM: Criteria Definition. Reliability: this criterion is synonymous with replication. That is, does the indicator consistently produce the same result when measured over a certain period of time? For example, if two or more people calculated this indicator independently, would they come up with the same result? If the answer is yes, the indicator has satisfied the condition and a "1" is entered in that cell; a "0" is entered otherwise.
  • 17. PRISM: Criteria Definition. Sensitivity: a test that assesses the stability of an indicator. For example, does the indicator continue to deliver the same result with a small variation in either the numerator or the denominator? How does the result change when assumptions are modified? Does the indicator actually contribute to the next higher level? For example, if the same indicator accounts for two or more higher result levels simultaneously, it is not stable.
  • 18. PRISM: Criteria Definition. Simplicity: a convoluted indicator presents challenges at many levels. Here we are looking for an indicator that is easy to collect, analyze and disseminate. Any indicator that satisfies these conditions automatically qualifies for inclusion.
  • 19. PRISM: Criteria Definition. Utility: the degree to which the information generated by the indicator will be used. The objective of this criterion is to help streamline an indicator so that decision-makers can make informed decisions, either during the planning process or during the re-alignment process, the latter representing occasions when organizations are evaluating the current status of their mandate.
  • 20. PRISM: Criteria Definition. Affordability: simply a cost-effectiveness perspective on the indicator in question. Can the program/project afford to collect and report on the indicator? In general, it takes at least two comparable indicators to establish the more efficient and cost-effective one; the one that qualifies is included at that criterion level.
  • 21. INDICATOR SCREENING: Programme Indicator Screening Matrix (PRISM). Thematic Area: RH, PDS, GDR, Other. Results Level: Goal, Outcome, Output.
    Columns per INDICATOR: (1) Specificity; (2) Reliability; (3) Sensitivity; (4) Simplicity; (5) Utility; (6) Affordability; (7) Total Yes; % Score; Implemented Yes/No.
    Specificity: does it measure the result and contribute to ONLY one higher-level indicator? Reliability: is it a consistent measure over time? Sensitivity: when the result changes, will it be sensitive to those changes? Simplicity: will it be easy to collect and analyze the data? Utility: will the information be useful for decision-making and learning? Affordability: can the program/project afford to collect the data?
  • 22. PRISM: Implementation. Theme Identification; Thematic Group Selection; Random Thematic Sub-Group Selection; Selection of Sub-Group Moderator and Rapporteur; Individual Thematic Sub-Group Member Scoring; Establish Intra-Thematic-Sub-Group Concordance; Establish Inter-Thematic-Group Concordance; Conduct Thematic Group Plenary; Establish Group Consensus; Select Final Set of Indicators.
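The presentation names intra-sub-group concordance as a step but does not define the statistic behind it. One plausible reading, sketched below purely as an assumption, is simple percent agreement over the members' binary ratings: for each criterion cell, the fraction of members giving the majority rating, averaged across all cells. The member ratings here are invented.

```python
# Hypothetical concordance measure (simple percent agreement).
# PRISM names concordance but does not define the statistic,
# so this is an assumed, illustrative reading, not the author's method.

def percent_agreement(ratings):
    """ratings: list of per-member 0/1 vectors over the same criteria."""
    n_members = len(ratings)
    n_cells = len(ratings[0])
    agree = 0.0
    for j in range(n_cells):
        ones = sum(member[j] for member in ratings)
        # Fraction of members siding with the majority rating for this cell.
        agree += max(ones, n_members - ones) / n_members
    return 100.0 * agree / n_cells

# Three sub-group members scoring one indicator on six criteria (invented):
ratings = [
    [1, 1, 0, 1, 1, 0],
    [1, 1, 1, 1, 1, 0],
    [1, 0, 0, 1, 1, 0],
]
print(f"{percent_agreement(ratings):.0f}% agreement")
```

A sub-group whose agreement falls below the agreed effective percentage would revisit the disputed cells in plenary before moving to inter-group concordance.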
  • 23. PRISM: Algorithm (flowchart): Select Indicator → Screen Indicator → Is Composite Score >= Bar? If yes, Accept Indicator; if no, Drop Indicator. Last Indicator? If no, select the next indicator; if yes, establish Concordance and End Process.
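Read as pseudocode, the flowchart is a single pass over the indicator list against an agreed cut-off (the "bar"). A minimal sketch follows; the 80% bar and the indicator names and scores are invented for illustration.

```python
# Minimal sketch of the screening loop in the flowchart above:
# keep each indicator whose composite score meets the agreed bar,
# drop the rest, then proceed to concordance. Bar and data are invented.

BAR = 80.0  # effective cut-off agreed by the thematic group (assumed value)

def screen(indicators):
    accepted, dropped = [], []
    for name, score in indicators:   # "Select Indicator" / "Screen Indicator"
        if score >= BAR:             # "Comp. Score >= Bar?"
            accepted.append(name)    # "Accept Indicator"
        else:
            dropped.append(name)     # "Drop Indicator"
    return accepted, dropped         # loop ends: establish Concordance

accepted, dropped = screen([("Indicator A", 83.0), ("Indicator B", 67.0)])
print(accepted, dropped)
```

The flowchart's "Last Indicator?" decision is simply the loop's termination condition here; the accepted list is what the group carries into the concordance and plenary steps.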
  • 24. PRISM: Lessons Learned. Team Composition Homogeneity Important; Consensus Building Required; Not more than ten members per thematic Sub-Group; Solid knowledge of theme essential; Time Management important; Framework useful pre-program implementation; Also essential during Mid-Term Review (MTR); Active involvement of top Management critical; Feedback provided to all active teams required; Useful initial contact tool for evaluation team and relevant program key players.
  • 25. Thank You