UCB i190 Spring 2014 ICTD in Practice_Lect22_21Apr14

  1. i190 Spring 2014: Information and Communications Technology for Development (ICTD) in Practice
     University of California, Berkeley, School of Information
     Lecture 22: 21 Apr 2014
     Instructor: San Ng (www.sanng.com)
     Class Website: i190spring2014.sanng.com
  2. i190 Framework
     Conceptual:
     * Week 1: Introduction to Course
     * W2: What is Development?
     * W3: What is ICTD?
     * W4: Who Does What in Practice? Mapping the ICTD Landscape
     i190 ICTD in Practice: Core Skills
     Technical (eApplications):
     * W5: Overarching Issues of eApplications
     * W6: Infrastructure, Telecenters, Agriculture
     * W7: Revisiting Agriculture
     * W8: e-Health, Education
     * W9: eGovernance, Microfinance
     Management:
     * W10: Break
     * W11: Intro to Project Management, Planning and Assessment
     * W12: Initiating: Design, Scheduling, Budgeting, HR
     * W13: Implementation
     * W14: Monitoring and Evaluation / Next Cycle
     * W15: Final Projects & Wrap Up
  3. Introduction to Project Management
     Planning → Initiation → Implementation → Monitoring & Evaluation → Next Phase? Transformation?
  4. Implementation - Best Practices
     Case: ITC e-Choupal: What are the needs/problems that this ICTD project is trying to address?
  5. Implementation - Best Practices
     Case: ITC e-Choupal: What made implementation successful?
     * Trust: choice
     * Meets needs: clear value
     * Appropriate tech: simplicity of technology, new and old tech
     * Local structures and systems
     * Incremental roll-out
     * Mission-based
  6. Implementation - IT Requirements / User Design
     Case: ITC e-Choupal
  7. Implementation - IT Requirements / User Design
     Case: ITC e-Choupal
  8. Implementation - Complex Environments
     Case: Competing for Development (A)
     * If you were Ghazialam, would you go ahead with the $65,000 investment?
     * What are the key tradeoffs? What would an 'ideal' outcome look like?
  9. Implementation - Complex Environments
     Case: Competing for Development (B1-6)
     Role-play exercise in 6 groups: make the case within your role for the $65,000 investment.
 10. Introduction to Project Management
     Planning → Initiation → Implementation → Monitoring & Evaluation → Next Phase? Transformation?
 11. Different Types of Evaluation and Performance Measurement
     * Program level
     * Organization level
     * Community and societal level
     Wikipedia Case
 12. Evaluation Purpose
     * Measuring program effectiveness
     * Determining if a program meets its objectives
     Types:
     * Baselines
     * Formative: ongoing; feedback; changing the program
     * Summative: look at final outcomes; impacts; cut or keep
 13. Other Purposes
     * Compliance
     * Legitimacy
     * Certification
     * Lessons learned
     * Check for unintended consequences
     * Benchmarking
     * More money
     * Whitewash and eyewash
     * Kill a project
     * Political attack
     * New opportunities
     * Protection and self-interest
     * Meltdown indicators
 14. Evaluation Checklist
     When is evaluation worth doing?
     * Who wants this, and what decision do they want to make? (lessons learned)
     * Are the impediments manageable? (resources, objectives, agreement, special issues)
     * Is there political support? (general support)
     The regular components of the evaluation process:
     * Purpose and objectives
     * Indicators
     * Design
     * Data and utilization
     * Problems
 15. Evaluation Actor Mapping (for teaching purposes)
     * Board or legislators
     * Top management
     * Middle management
     * Program staff
     * Other staff or departments
     * Program participants
     * Donors
     * Press or community
     * Outsiders
 16. Range of Indicators
     * Feelings: do you trust these?
     * Inputs
     * People/Process: involvement/coordination
     * Outcomes: (intermediate) increased income; (final) level of contamination
     * Efficiency and productivity: lbs. of fish/$; $/lbs. of fish
     * Span and scope of coverage: % of target population served
     * Satisfaction: customer satisfaction (commercial)
     * Impacts (sustainable): measurable change; broader; longer term (program-caused outcome)
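The efficiency, coverage, and outcome indicators on this slide reduce to simple ratios. As a minimal sketch of how they are computed (all figures are hypothetical, not taken from any case in the course):

```python
# Hypothetical fishery program figures (illustrative only; not from the case).
spend_usd = 20_000.0          # program spend
fish_lbs = 50_000.0           # output produced
served = 1_800                # participants reached
target_population = 2_400     # intended beneficiaries
income_before, income_after = 120.0, 150.0  # avg monthly income (USD)

# Efficiency and productivity: output per dollar, and cost per unit of output.
lbs_per_dollar = fish_lbs / spend_usd          # lbs of fish / $
dollars_per_lb = spend_usd / fish_lbs          # $ / lb of fish

# Span and scope of coverage: share of the target population served.
coverage_pct = 100 * served / target_population

# Intermediate outcome: change in income among participants.
income_gain_pct = 100 * (income_after - income_before) / income_before

print(lbs_per_dollar, dollars_per_lb, coverage_pct, income_gain_pct)
```

Note that the same raw numbers yield several different indicators; which one to report depends on who wants the evaluation and what decision they want to make.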
 17. How would you assure that your results were valid and reliable?
     * Reliability: do you get the same result time after time?
     * Validity: unbiased, compared to a standard
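One common way to check these two properties numerically is to correlate repeated measurements (reliability) and compare the mean error against a trusted standard (validity). A minimal sketch, with entirely hypothetical numbers:

```python
import math
import statistics

# Two repeated measurements of the same six sites (hypothetical numbers).
trial_1 = [10.0, 12.0, 9.0, 15.0, 11.0, 13.0]
trial_2 = [10.5, 11.5, 9.0, 15.5, 11.0, 12.5]
gold_standard = [9.8, 12.1, 9.2, 14.9, 11.2, 12.8]  # trusted reference values

def pearson(xs, ys):
    """Pearson correlation: near 1.0 means repeated runs agree."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Reliability: do you get the same result time after time?
reliability = pearson(trial_1, trial_2)

# Validity: is the instrument unbiased compared to a standard?
# A mean error near zero suggests no systematic over- or under-measurement.
bias = statistics.mean(a - b for a, b in zip(trial_1, gold_standard))

print(round(reliability, 3), round(bias, 3))
```

A measure can be highly reliable (consistent) while still invalid (consistently biased), which is why both checks matter.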
 18. What is the purpose of a research design?
     * Tailored to each problem
     * To answer very specific questions
     * Weigh benefits, costs, and resources
     * Can allow cross-comparison
     * Was it the program that made the difference?
 19. Using Evaluation Results: Style Differences
     Academic Style:
     * Slow
     * Scientific method
     * Clear objectives
     * Careful study
     * Written communication
     * Precision
     * Academic reference group
     Managerial Style:
     * Pressure to decide
     * Many simultaneous + fragmented tasks
     * Competing objectives
     * Action
     * Verbal communication
     * Incomplete data
     * Managerial reference group
     Problems: time; very rigorous; irrelevant; format; not rigorous; communication
 20. Design Types (O = observation, X = treatment, R = random assignment; each line of notation is one group)
     Pre-Experimental (PE):
     * Goals versus Performance
     * Before and After: O1 X O2
     Quasi-Experimental (QE):
     * Time Series: O1 O2 O3 O4
     * Non-Equivalent Control Group: O1 X O2 / O3 O4
     * Multiple Time Series: O1 O2 X O3 O4 / O5 O6 O7 O8
     Experimental:
     * Two-Group Pre- and Post-Test (R): O1 X O2 / O3 O4
     * Post-Test Only (R): X O1 / O2
     * Solomon Four (R): O1 X O2 / O3 O4 / X O5 / O6
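To make the notation concrete, here is a minimal sketch (with illustrative scores, not from any real study) of how a pre-experimental before-and-after estimate differs from a quasi-experimental non-equivalent control group estimate:

```python
# Illustrative pretest/posttest scores. "O X O" reads: observe, treat,
# observe again. All numbers are hypothetical.
treated_pre, treated_post = 40.0, 55.0   # group that got the program (X)
control_pre, control_post = 40.0, 45.0   # comparison group, no program

# Pre-experimental "before and after" (O X O): credits ALL change to the program.
before_after = treated_post - treated_pre

# Quasi-experimental non-equivalent control group (O X O / O O):
# subtract the change the control group saw anyway (difference-in-differences).
background_trend = control_post - control_pre
program_effect = before_after - background_trend

print(before_after, background_trend, program_effect)
```

The gap between the two estimates is exactly the "was it the program that made the difference?" question from the research-design slide: a stronger design removes change that would have happened without the program.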
 21. Some of the More Common Methods
     * Balanced scorecards and other overall general assessments
     * Goals vs. performance, and also cost and efficiency
     * Outcome assessment
     * Benchmarking
     * Best practice
     * Rapid assessment tools (quicker and dirtier rather than deeper)
     * More based on sampling than 100% study
 22. Data Collection Methods (ordered from less intrusive to more intrusive):
     * General statistical analysis
     * Cost-benefit / rates of return
     * Simulations
     * Content analysis
     * Record reviews
     * Unobtrusive measures
     * Group observation
     * Surveys and testing
     * Personal interviews
     * Participant observation
     * Case studies
     The Eternal Triangle: Precision, Cost, Complexity
 23. Wikipedia Case: Logical Framework Exercise
     Instructions: Everyone was pleasantly shocked by these successful results. However, the founder Jimbo Wales intuitively knows that the number of articles per se does not measure Wikipedia's success completely, especially since Wikipedia began with a completely different set of goals/activities and became 'successful' only organically. He wants to hire you to determine a sound methodology to evaluate Wikipedia's success. He wants you to design a Logical Framework for Wikipedia (based on what we already learned in class), with indicators he can measure to determine Wikipedia's progress and success. He has given us a sample template that we will discuss and brainstorm in class:
     Template columns: Measurable Indicators | Types of Data Needed | Data Collection Methods/Frequency for M&E
     Template rows: Overriding Goal; Objectives (at least 4)
