UCB i190 Spring 2014 ICTD in Practice Lect 21_16apr14

Transcript

  • 1. i190 Spring 2014: Information and Communications Technology for Development (ICTD) in Practice University of California Berkeley, School of Information LECTURE 22: 21 Apr 2014 Instructor: San Ng (www.sanng.com) Class Website: i190spring2014.sanng.com
  • 2. i190 Framework. Conceptual: Week 1: Introduction to Course W2: What is Development? W3: What is ICTD? W4: Who Does What in Practice? Mapping the ICTD Landscape. i190 ICTD in Practice: Core Skills. Technical (eApplications): W5: Overarching Issues of eApplications W6: Infrastructure, Telecenters, Agriculture W7: Revisiting Agriculture W8: e-Health, Education W9: eGovernance, Microfinance. Management, Planning and Assessment: W10: Break W11: Intro to Project Management W12: Initiating: Design, Scheduling, Budgeting, HR W13: Implementation W14: Monitoring and Evaluation / Next Cycle W15: Final Projects & Wrap Up
  • 3. Introduction to Project Management: Planning, Initiation, Implementation, Monitoring & Evaluation, Next Phase? Transformation?
  • 4. Implementation - Best Practices. Case: ITC e-Choupal. What are the needs/problems that this ICTD project is trying to address?
  • 5. Implementation - Best Practices. Case: ITC e-Choupal. What made implementation successful? • Trust: choice • Meets Needs: Clear Value • Appropriate Tech: simplicity of technology, new and old tech • Local structures and systems • Incremental roll-out • Mission-based
  • 6. Implementation - IT Requirements / User Design. Case: ITC e-Choupal
  • 7. Implementation - IT Requirements / User Design. Case: ITC e-Choupal
  • 8. Implementation - Complex Environments. Case: Competing for Development (A) • If you were Ghazialam, would you go ahead with the $65,000 investment? • What are the key tradeoffs? What would an 'ideal' outcome look like?
  • 9. Implementation - Complex Environments. Case: Competing for Development (B1-6). Six groups; role-play exercise: make the case within your role for the $65,000 investment.
  • 10. Introduction to Project Management: Planning, Initiation, Implementation, Monitoring & Evaluation, Next Phase? Transformation?
  • 11. Different Types of Evaluation and Performance Measurement: Program Level, Organization Level, Community and Societal Level. Wikipedia Case.
  • 12. Evaluation Purpose: measuring program effectiveness; determining if a program meets its objectives. TYPES: * baselines * Formative: ongoing, feedback, changing the program * Summative: look at final outcomes, impacts, cut or keep
  • 13. OTHER PURPOSES: * compliance * legitimacy * certification * lessons learned * check for unintended consequences * benchmarking * more money * whitewash and eyewash * kill a project * political attack * new opportunities * protection and self-interest * meltdown indicators
  • 14. EVALUATION CHECKLIST. WHEN IS EVALUATION WORTH DOING? * Who Wants This and What Decision Do They Want to Make? (lessons learned) * Are the Impediments Manageable? (resources, objectives, agreement, special issues) * Is there Political Support? (general support) THE REGULAR COMPONENTS OF THE EVALUATION PROCESS: * Purpose and Objectives * Indicators * Design * Data and Utilization * Problems
  • 15. EVALUATION ACTOR MAPPING (actor-map diagram, for teaching purposes): BOARD OR LEGISLATORS, OTHER STAFF OR DEPTS., TOP MGMT, PROGRAM PARTICIPANTS, PROGRAM STAFF, PRESS OR COMMUNITY, DONORS, MIDDLE MGMT., OUTSIDERS
  • 16. RANGE OF INDICATORS: FEELINGS (do you trust these?); INPUTS; PEOPLE / PROCESS (involvement/coordination); OUTCOMES (intermediate: increased income; final: X level of contamination); EFFICIENCY AND PRODUCTIVITY (lbs. of fish/$, $/lbs. of fish); SPAN AND SCOPE OF COVERAGE (% target population served); SATISFACTION (customer satisfaction/commercial); IMPACTS (program-caused outcome: sustainable, measurable change, broader, longer term)
  • 17. * HOW WOULD YOU ASSURE THAT YOUR RESULTS WERE VALID AND RELIABLE? RELIABILITY -- DO YOU GET THE SAME RESULT TIME AFTER TIME? VALIDITY -- UNBIASED COMPARED TO A STANDARD
  • 18. What is the purpose of a research design? * tailored to each problem * to answer very specific questions * weigh benefits and costs and resources * can allow cross comparison * was it the program that made the difference?
  • 19. Using Evaluation Results -- Style Differences. Academic Style: * Slow * Scientific Method * Clear Objectives * Careful Study * Written Communication * Precision * Academic Reference Group. Managerial Style: * Pressure to Decide * Many Simultaneous + Fragmented Tasks * Competing Objectives * Action * Verbal Communication * Incomplete Data * Managerial Reference Group. PROBLEMS: * TIME * VERY RIGOROUS * IRRELEVANT * FORMAT * NOT RIGOROUS * COMMUNICATION
  • 20. DESIGN TYPES. PRE-EXPERIMENTAL (PE): * Goals versus Performance * Before and After (O X O). QUASI-EXPERIMENTAL (QE): * Time Series (O1 O2 O3 O4) * Non-Equivalent Control Group (O1 X O2 / O3 O4) * Multiple Time Series (O1 O2 X O3 O4 / O5 O6 O7 O8). EXPERIMENTAL: * Two-Group Pre- and Post-Test (R: O1 X O2 / O3 O4) * Post-Test Only (R: X O1 / O2) * Solomon Four (R: O1 X O2 / O3 O4 / X O5 / O6). (O = observation, X = intervention, R = random assignment; a small numerical sketch of the quasi-experimental logic follows the transcript.)
  • 21. Some of the More Common Methods * Balanced Score Cards and Other Overall General Assessments * Goals vs. Performance and also Cost and Efficiency * Outcome Assessment * Benchmarking * Best Practice * Rapid Assessment Tools (quicker and dirtier rather than deeper) * More based on sampling than 100% study
  • 22. DATA COLLECTION METHODS (ordered from less intrusive to more intrusive): * General Statistical Analysis * Cost-Benefit / Rates of Return * Simulations * Content Analysis * Record Reviews * Unobtrusive Measures * Group Observation * Surveys and Testing * Personal Interviews * Participant Observation * Case Studies. THE ETERNAL TRIANGLE: Precision, Cost, Complexity. (A short cost-benefit sketch follows the transcript.)
  • 23. Sample template -- columns: Measurable Indicators | Types of Data Needed | Data Collection Methods/Frequency for M&E; rows: Overriding Goal; Objectives (at least 4). Instructions: Everyone was pleasantly shocked by these successful results. However, the founder Jimbo Wales intuitively knows that the number of articles per se does not measure Wikipedia's success completely, especially since Wikipedia began with a completely different set of goals/activities and became 'successful' only organically. He wants to hire you to determine a sound methodology to evaluate Wikipedia's success. He wants you to design a Logical Framework for Wikipedia (based on what we already learned in class), with indicators he can measure to determine Wikipedia's progress and success. He has given us a sample template that we will discuss and brainstorm in class (an illustrative sketch of one possible row structure follows the transcript).
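
The design notation on slide 20 can be made concrete with a small numerical example. The following is a minimal Python sketch, with entirely invented numbers (not taken from any case in the course), of how a non-equivalent control group design (O1 X O2 / O3 O4) is commonly analyzed: compare the before-after change in the group that received the intervention against the change in the comparison group, a difference-in-differences estimate.

```python
# Illustrative sketch of a non-equivalent control group design (O1 X O2 / O3 O4).
# All figures are hypothetical, e.g. average weekly income for villages with and
# without an ICTD intervention; they are not drawn from any course material.

treated_before = [120, 95, 110, 130]   # O1: pre-test in villages that get the intervention (X)
treated_after  = [150, 118, 140, 160]  # O2: post-test in the same villages
control_before = [115, 100, 105, 125]  # O3: pre-test in comparison villages (no X)
control_after  = [122, 108, 111, 131]  # O4: post-test in comparison villages

def mean(values):
    return sum(values) / len(values)

# Change observed within each group
treated_change = mean(treated_after) - mean(treated_before)
control_change = mean(control_after) - mean(control_before)

# Difference-in-differences: the part of the treated group's change that is
# not explained by the trend also seen in the comparison group.
did_estimate = treated_change - control_change

print(f"Treated group change:           {treated_change:+.2f}")
print(f"Comparison group change:        {control_change:+.2f}")
print(f"Estimated program effect (DiD): {did_estimate:+.2f}")
```

A pre-experimental before-and-after design would report only the treated group's own change and attribute it all to the program; the comparison group is what lets the quasi-experimental design answer slide 18's question, "was it the program that made the difference?"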
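
Slide 22 lists cost-benefit and rates-of-return analysis among the less intrusive methods. Below is a minimal illustrative Python sketch; the cash flows and the 10% discount rate are assumptions invented for the example (only the $65,000 up-front figure echoes the investment discussed in slides 8-9), not figures from the course.

```python
# Hypothetical cost-benefit sketch for an ICTD investment (all figures invented;
# only the 65,000 up-front cost echoes the case discussed in class).
# Year 0 is the investment; years 1-4 are assumed net benefits.

cash_flows = [-65_000, 18_000, 22_000, 25_000, 25_000]
discount_rate = 0.10  # assumed discount rate

def npv(rate, flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(flows))

# Present value of benefits vs. the up-front cost (the only cost in this sketch)
pv_benefits = npv(discount_rate, [0] + cash_flows[1:])
pv_costs = -cash_flows[0]
benefit_cost_ratio = pv_benefits / pv_costs

print(f"NPV at {discount_rate:.0%}: {npv(discount_rate, cash_flows):,.0f}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```

A positive NPV and a benefit-cost ratio above 1 would suggest the hypothetical project covers its costs at the assumed discount rate; in practice the hard part is the evaluation work above, i.e. estimating the benefit stream credibly.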
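
For the Wikipedia exercise on slide 23, one way to see the shape of the template is as one row per objective. The goal, objectives, indicators, and collection methods below are hypothetical illustrations invented for this sketch (and it shows only two objectives, whereas the exercise asks for at least four); they are not Jimbo Wales's actual criteria.

```python
# Illustrative logical-framework structure for the Wikipedia exercise.
# The goal, objectives, indicators, and methods are invented examples meant only
# to show the shape of the template (the real exercise asks for at least 4 objectives).

logframe = {
    "overriding_goal": "Free access to reliable encyclopedic knowledge for everyone",
    "objectives": [
        {
            "objective": "Improve article quality, not just article count",
            "measurable_indicators": [
                "% of articles meeting 'good article' criteria",
                "median number of citations per article",
            ],
            "types_of_data_needed": "article metadata, reviewer ratings",
            "collection_methods_frequency": "automated dump analysis, quarterly",
        },
        {
            "objective": "Broaden the contributor base",
            "measurable_indicators": [
                "active editors per month",
                "share of edits made by first-year editors",
            ],
            "types_of_data_needed": "edit logs",
            "collection_methods_frequency": "server log analysis, monthly",
        },
    ],
}

# Print the framework as a simple listing, one block per objective
print("Overriding goal:", logframe["overriding_goal"])
for obj in logframe["objectives"]:
    print("\nObjective:", obj["objective"])
    print("  Indicators:  ", "; ".join(obj["measurable_indicators"]))
    print("  Data needed: ", obj["types_of_data_needed"])
    print("  Methods/freq:", obj["collection_methods_frequency"])
```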