Misra, D.C.(2009): E-government Monitoring and Evaluation_MDI-12.2.2009
A comprehensive presentation on e-government monitoring and evaluation.
Presentation Transcript

    • E-government Monitoring and Evaluation: Implementing the E-business Plan by Dr D.C. Misra
    • Thursday, February 12, 2009, 11.15 a.m. to 1.15 p.m.
      • 3rd Post Graduate Diploma in Public Policy and Management Programme
      • (2007-09)
      • School of Public Policy and Governance
      • Management Development Institute
      • P.B.60, Mehrauli Road, Sukhrali,
      • Gurgaon 122 001
    • E-government Monitoring and Evaluation: A Presentation
      • by
      • Dr D.C.Misra, I.A.S. (Retd.)
      • E-government Researcher and Consultant
      • New Delhi, India
      • Email: dc_misra@hotmail.com
      • Web: http://in.geocities.com/drdcmisra
      • Think Tank: http://tech.groups.yahoo.com/group/cyber_quiz
      • Tel: 91-11-2245 2431
      • Fax: 91-11-4244 5183
    • CONTENTS: E-government Monitoring and Evaluation
      • Part A: Monitoring
      • Historical Background of Monitoring
      • Ministry of Statistics and Programme Implementation
      • Reasons for Delay in Project Implementation
      • E-government Project Failures - An Indian Example
      • Causes of E-government Project Failures
      • What can monitoring do?
      • What is not monitoring?
      • What, then, is e-government monitoring?
    • Part A: Monitoring
      • What is an E-government Project?
      • E-government Project Life Cycle-Five Models
      • Logical Framework Approach (LFA)
      • Who will do e-government monitoring?
      • The e-government monitoring unit
      • E-government monitoring methodology
      • Guiding Principles of E-government Monitoring
    • Contents Part B: Evaluation
      • Historical Background of Evaluation
      • Programme Evaluation Organisation (PEO)
      • Functions of PEO
      • Four Generations of Evaluation
      • What is not Evaluation?
      • What, then, is Evaluation?
      • Four Senses of Term Evaluation
      • What is E-government Evaluation?
    • Part B: Evaluation
      • Types of Evaluation
      • Evaluation Timing and Diffusion-Adoption Curves
      • What is to be evaluated?
      • How are E-government Domains evaluated?
      • Approaches to Evaluation
      • Who will evaluate?
      • Which Type of Evaluation is Most Suited to E-government?
    • Part C: E-government M&E Framework
      • E-government Monitoring and Evaluation (M&E)
      • E-government Monitoring and Evaluation (M&E) Unit
      • Relative Weights to Monitoring and Evaluation
      • E-government M&E Framework: Components
    • E-government M&E Framework
      • (a) E-government Management
      • (b) E-government M&E Unit
      • (c) Information Needs Matrix
      • (d) E-government M&E Cycle
      • (e) Citizens
      • A Framework for E-government Monitoring and Evaluation
    • Part D and Part E
      • Part D :
      • My Questions
      • End of Presentation
      • Thank you
      • Part E :
      • Your Questions Now
    • Part A: Monitoring
      • I. Historical Background of Monitoring
      • Monitoring in a loose form has always been part of Indian administration
      • Centralised monitoring is a recent phenomenon
      • It came into existence at the Centre in 1985, when the Ministry of Programme Implementation (MOPI) was formed
      • MOPI is now part of the Ministry of Statistics and Programme Implementation (MOSPI)
    • II. Ministry of Statistics and Programme Implementation
      • MOSPI tracks the implementation of Central Sector Projects > Rs 20 Crore
      • The report for January-March 2007 covers 882 projects:
      • -- Mega (greater than Rs 1,000 Crore): 69
      • -- Major (between Rs 100 Crore and Rs 1,000 Crore): 432
      • -- Medium (between Rs 20 Crore and Rs 100 Crore): 381
      • Cost over-run: 287 projects (33%)
      • Time over-run of 1 to 196 months: 301 projects (35%)
    • III. Reasons for Delay in Project Implementation
      • Fund Constraints
      • Land Acquisition
      • Environment Clearance
      • Slow Progress
      • Delay in Supply of Equipment
      • Law and Order
      • Others (technology selection and agreement, award of contract, delay in civil works and government clearance, geo-mining, court cases, inadequate infrastructure and bad weather)
      • (Source: MOSPI)
    • IV E-government Project Failure: An Indian Example
      • Reinventing EPF (Employees’ Provident Fund) India
      • The largest reform project in terms of complexity
      • Touches 40 million citizens
      • Rs 250 million already spent
      • Time over-run: 2001-06: 66 months against a target of 22 months
      • Scrapped in January 2008, to be started all over again
      • Contractor: Siemens Information Systems Limited (SISL)
      • Source: Dhoot (2008)
    • V Causes of E-government Project Failure
      • Complexity
      • Commitment Failure
      • Planning Failure
      • Vision Failure
    • Causes of E-government Project Failure
      • Inappropriate Methods
      • Short Time Horizon
      • Turbulent Environments
      • Failure to Support End Users
      • Source: Chiles (2001)
    • VI What can Monitoring do?
      • It can prevent E-government Project Failures
      • It can prevent cost and time over-runs of E-government Projects
      • It can keep track of the progress of E-government Project implementation
      • It can ensure that resources are expended as planned
      • Above all, it can ensure that the benefits of e-government project reach the target group, that is, citizens and non-citizens
    • VII What is not Monitoring?
      • Reporting ≠ Monitoring
      • Inspection ≠ Monitoring
      • Supervision ≠ Monitoring
      • Audit* ≠ Monitoring
      • Surveillance ≠ Monitoring
      • Review ≠ Monitoring
      • * Audit: 1. Financial, 2. Performance, 3. Development, 4. Social, 5. Citizen (Through RTI Act)
    • What is not Monitoring?
      SN  Tool          Focus
      1   Report        Routine reporting
      2   Inspection    Fault-finding in detail
      3   Supervision   Overseeing implementation
      4   Audit         Examination of accounts
      5   Surveillance  Scanning of environment
      6   Review        Comprehensive feedback
    • What is not Monitoring? (Types of Audit)
      SN  Tool                 Focus
      1   Financial Audit      Conformance to financial rules
      2   Performance Audit    Project experience
      3   Development Audit    Quality and durability of assets (IRDP)
      4   Social Audit         Scrutiny from the social point of view
      5   Citizen/Media Audit  RTI Act
    • VIII. What, then, is E-government Monitoring?
      • E-government Monitoring is a specialised, systematic, dynamic, and semi-autonomous management tool to ensure that the E-government Project serves the target group - Citizens and Non-Citizens - in accordance with the e-business plan, taking into account the interests of various stakeholders and the emerging challenges being faced by E-government
    • Elements of the Definition
      1. Specialisation       6. E-government Project
      2. Systematic           7. Service to Citizens
      3. Dynamic              8. E-business Plan
      4. Semi-autonomous      9. Stakeholders
      5. Management Tool     10. Emerging Challenges
    • IX. What is an E-government Project?
      • An E-government Project is a development project which aims to transform an inward-looking government into a citizen-centric government, making the best use of information and communication technologies (ICTs) through a carefully designed e-business plan
    • X. E-government Project Lifecycles
      • Five Models
      • Generic Model (Tasmania, Australia)
      • Technocratic Model (NIC, India)
      • Audit Model (Lea’s Model)
      • Systems Model (Heeks’ Model)
      • E-government Project Model
    • 1. Generic Model (Government of Tasmania, Australia) (Source: Government of Tasmania 2002)
    • 2. E-government Technocratic Model (National Informatics Centre, New Delhi) (Source: Mishra 2005)
      A. Project Initiation and Planning
      B. Software Development
      C. ICT Infrastructure Creation
      D. Service Provision
      E. System Integration and Testing
      F. Project Commissioning
      G. Project Completion and Sign-Off
      H. Maintenance
      I. Retirement
    • 3. Lea's E-government Project Lifecycle (Audit Model) (Source: Lea 2003): Initiation → Planning and Implementation → Monitoring → Operations
    • 4. Heeks' System Lifecycle (Source: Heeks 2006): 1. Project assessment → 2. Analysis of current reality → 3. Design of the proposed new system → 4. System construction → 5. Implementation and beyond
    • 5. E-government Project Cycle: 1. Prepare E-business Plan → 2. Implement E-business Plan → 3. Monitor E-business Plan → 4. Evaluate E-business Plan → 5. Review E-business Plan, with the M&E Unit feeding back modifications at each stage
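      A minimal sketch in Python (illustrative only: the five stage names come from the slide above; everything else is assumed) of the project cycle as a loop, with the M&E unit feeding modifications back into the plan each round:

        # Illustrative sketch of the e-government project cycle as a loop.
        STAGES = ["prepare", "implement", "monitor", "evaluate", "review"]

        def run_cycle(plan: dict, rounds: int = 3) -> dict:
            """Run the e-business plan through repeated M&E cycles."""
            for r in range(1, rounds + 1):
                for stage in STAGES:
                    # A real M&E unit would produce findings at each stage;
                    # here we only record that the stage was completed.
                    plan.setdefault("history", []).append((r, stage))
                # Hypothetical feedback step: the plan is revised each round.
                plan["version"] = plan.get("version", 0) + 1
            return plan

        plan = run_cycle({"name": "citizen-services portal"})
        print(plan["version"])  # 3 revisions after 3 cycles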
    • XI. Logical Framework Approach (LFA)
      • Developed by Leon J. Rosenberg for USAID in 1969
      • It logically connects project activities to results
      • The logic is: if x is done (input), y will follow (output) under assumptions z (outcome)
      • It presents a concise picture of the project in a page or two
    • Logical Framework Approach (LFA)
      • It is an analytical tool
      • LFA (the approach) should not be confused with the logframe (the document)
      • It is a 4x4 matrix (that is, a matrix of four rows and four columns), giving rise to 16 cells
    • Logical Framework Approach (LFA): the 4x4 logframe matrix
      Rows x Columns                      Structure (Type of Information) | Indicators (of Progress) | Measurement (Means of Verification) | Assumptions and Risks (Principal Methods)
      Goal (Wider objectives)             C11 | C12 | C13 | C14 (Hypothesis)
      Objectives (Short-term Objectives)  C21 | C22 | C23 | C24 (Project Assumptions)
      Outputs (Activities)                C31 | C32 | C33 | C34 (Implementation Assumptions)
      Inputs (Resources)                  C41 | C42 | C43 | C44 (Critical Conditions)
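      A minimal sketch in Python (the row and column labels come from the matrix above; the sample cell entries are hypothetical) of the 4x4 logframe as a data structure:

        # Illustrative sketch: the logframe as a nested dict (16 cells).
        ROWS = ["goal", "objectives", "outputs", "inputs"]
        COLS = ["structure", "indicators", "measurement", "assumptions"]

        logframe = {row: {col: None for col in COLS} for row in ROWS}

        # Hypothetical entries for an e-government service project:
        logframe["inputs"]["structure"] = "Rs 50 lakh, 5 staff, portal software"
        logframe["outputs"]["structure"] = "3 services online"
        logframe["outputs"]["indicators"] = "services live per quarter"
        logframe["outputs"]["measurement"] = "portal logs, quarterly report"
        logframe["outputs"]["assumptions"] = "district connectivity holds"

        # The LFA logic: if inputs are applied, outputs follow under assumptions.
        print(logframe["outputs"])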
    • XII. Who will do E-government Monitoring?
      • The E-government Monitoring Unit
      • Set up a monitoring unit
      • It will be part of the organisation but function independently
      • It will report directly to Top Management
      • It will be a specialised unit
    • XIII. The E-government Monitoring Unit
      • It will consist of:
      • Head of Monitoring Unit
      • Database Administrator/System Analyst
      • Statistician
      • Economist
      • Sociologist/Political Scientist
    • XIV. E-government Monitoring Methodology
      • I. Automated
      • 1. Online Survey (SurveyMonkey)
      • 2. Virtual Focus Groups (E-groups)
      • 3. E-mail Surveys
      • 4. Blog (Comments)
      • 5. Wiki (Comments)
      • 6. Online Feedback, etc.
      • II. Manual
      • 1. Desk Research
      • 2. Sample Survey
      • 3. Focus Groups
      • 4. Case Studies
      • 5. Individual/Group Interviews/Discussions
      • 6. Participatory Appraisals, etc.
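      A minimal sketch in Python (the CSV layout, column names and file path are assumptions, not from the deck) of the kind of automated tally a monitoring unit might run over collected online feedback:

        import csv
        from collections import Counter

        def summarise_feedback(path: str) -> dict:
            """Tally 1-5 satisfaction ratings from a feedback CSV.

            Assumes rows like: date,service,rating,comment
            """
            ratings = Counter()
            with open(path, newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    ratings[int(row["rating"])] += 1
            total = sum(ratings.values())
            if not total:
                return {"responses": 0, "satisfied_pct": 0.0, "distribution": {}}
            return {
                "responses": total,
                # share of respondents rating the service 4 or 5
                "satisfied_pct": 100 * (ratings[4] + ratings[5]) / total,
                "distribution": dict(sorted(ratings.items())),
            }

        # e.g. summarise_feedback("feedback_2009_01.csv")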
    • XV. Guiding Principles of E-government Monitoring
      • E-government monitoring must be
      • Action-oriented
      • Web-based
      • Top management-oriented
      • Specialised
      • Citizen-centric
      • Simple
      • Timely
      • Relevant
      • Accurate
      • Flexible
    • Part B: Evaluation
      • I. Historical Background of Evaluation
      • Systematic evaluation is older than centralised monitoring
      • Evaluation came into being in 1952, when the Programme Evaluation Organisation (PEO), an independent organisation, was set up in the Planning Commission to evaluate India's Community Development (CD) Programme
      • PEO survives to date (2008)
    • Evaluation
      • II. Programme Evaluation Organisation (PEO)
      • Headed by Adviser (Evaluation), it has a 3-tier structure:
      • 1. Headquarters - Planning Commission
      • 2. Regional Evaluation Offices (7) (Kolkata, Chandigarh, Chennai, Hyderabad, Jaipur, Lucknow and Mumbai)
      • 3. Project Evaluation Offices (8) (State Capitals - Guwahati, Bhubaneswar, Shimla, Bangalore, Bhopal, Patna, Thiruvananthapuram and Ahmedabad)
    • III. Functions of PEO
      • Undertakes evaluation of selected programmes/schemes under implementation
      • Evaluation studies assess:
      • the performance,
      • the process of implementation,
      • the effectiveness of the delivery systems, and
      • the impact of programmes.
    • IV. Four Generations of Evaluation (Source: Guba and Lincoln 1989)
      SN  Generation         Name                     Focus                                                Role of Evaluator
      1   First Generation   Measurement              Measuring instruments                                Technical
      2   Second Generation  Description              Strengths and weaknesses                             Describer
      3   Third Generation   Judgment                 Reaching judgements                                  Judge
      4   Fourth Generation  An Alternative Approach  Response (interaction) & construction (methodology)  Mediator (among conflicting stakeholders)
    • V What is not Evaluation?
      • Analysis ≠ Evaluation
      • Measurement ≠ Evaluation
      • Assessment ≠ Evaluation
      • Appraisal ≠ Evaluation
      • Audit ≠ Evaluation
      • Monitoring ≠ Evaluation
    • What is not Evaluation?
      SN  Tool         Focus
      1   Analysis     Breaking into parts
      2   Measurement  Metrics
      3   Assessment   Cost and benefits
      4   Appraisal    Investment
      5   Audit        Rules and regulations
      6   Monitoring   Implementation
    • VI. Four Senses of the Term Evaluation
      Sense 1: Merit, value or worth of something
      Sense 2: An autonomous discipline: the study and application of procedures for doing objective and systematic evaluation
      Sense 3: Work done by professional evaluators
      Sense 4: Calculation of the value of an expression (in mathematics)
    • VII. What, then, is Evaluation?
      • “… a robust arena of activity directed at collecting, analyzing, and interpreting information on the need for, implementation of, and effectiveness and efficiency of intervention efforts to better the lot of humankind.”
      • --- Rossi and Freeman (1989, p. 13)
    • VIII. What is E-government Evaluation?
      • E-government evaluation is a systematic, objective, planned and participatory exercise, undertaken during the design and implementation of a project and after its completion, for determining the worth of e-government to citizens against pre-set objectives and criteria, so as to improve e-government services to citizens.
    • IX. Types of Evaluation
      • Ad hoc evaluation
      • Evaluation in vivo
      • Pilot project evaluation
      • Terminal evaluation
      • Formative and summative evaluation
      • Meta evaluation (evaluation of evaluations) (Stufflebeam 1981/ Scriven 1991)
      • Formal and informal evaluation
      • Insider and outsider evaluation
      • Compliance, effectiveness, significance and efficiency evaluation
      • On-going, process or concurrent evaluation
      • Ex ante and ex post evaluation
    • Types of Evaluation
      • Self-checking evaluation (ibid.)
      • Do-it-yourself evaluation (ibid.)
      • Systematic evaluation (Rossi and Freeman)
      • Fourth generation evaluation (Guba and Lincoln 1989)
      • Scientific evaluation
      • Auto or self evaluation (ACCORD 1993)
      • Thematic evaluation
      • Individual, group and institutional evaluation
      • Casual everyday evaluation (Frutchey 1959)
    • Types of Evaluation
      • 21. Adversary evaluation
      • 22. Empowerment evaluation (Fetterman et al. (eds.) 1996)
      • 23. Utilisation-focused evaluation (Patton 1997)
      • 24. Transparency evaluation
      • 25. Citizen evaluation
      • 26. Right to Information evaluation (being undertaken by Adam Smith International/Administrative Staff College of India) (2007)
      • 27. Evaluation research
      • 28. Online and offline evaluation
      • 29. E-government evaluation
      • 30. E-government special studies, etc.
    • X. Evaluation Timing and Diffusion-Adoption Curves
      • Hypothetical Curves
      [Figure: three hypothetical diffusion-adoption curves, (a), (b) and (c), plotting adoption from 0 to 100 per cent against time in months/years]
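      A minimal sketch in Python (the logistic form and all parameters are assumptions; the slide only labels three hypothetical curves) generating S-shaped adoption curves like (a), (b) and (c):

        import math

        def adoption(t: float, k: float, t0: float) -> float:
            """Percentage of the target group that has adopted by time t."""
            return 100 / (1 + math.exp(-k * (t - t0)))

        # Three hypothetical services adopted at different speeds:
        for label, k, t0 in [("(a)", 1.5, 2.0), ("(b)", 1.0, 3.0), ("(c)", 0.6, 4.0)]:
            curve = [round(adoption(t, k, t0)) for t in range(7)]  # months 0..6
            print(label, curve)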
    • XI. What is to be evaluated? Domains of E-government Evaluation: I. External Environment, II. Organisation, III. Service Delivery
    • XII. How are e-government domains evaluated?
      • Indicators
      • 1. Input indicators
      • 2. Output indicators
      • 3. Usage/Intensity indicators
      • 4. Impact/Effect indicators
      • 5. Environmental/Readiness indicators
      • (Source: Jenssen 2005)
    • Indicators
      • 1. Input Indicators
      • – Amount of financial resources devoted to eGovernment. Absolute figures, per capita figures.
      • – IT/e-Government spending as % of GDP.
      • – Amount of resources devoted to Research and Development.
      • – Amount of public resources devoted to internet infrastructure.
    • Indicators
      • 2. Output Indicators
      • – Number of online services for citizens;
      • – Number of online services for businesses;
      • – Percentage of government departments that have a website;
      • – Percentage of government websites that offer electronic services.
    • Indicators
      • 3. Usage Indicators
      • – Number of individuals that have made use of electronic services offered;
      • – Number of businesses that have made use of electronic services offered;
      • – Percentage of citizens that have visited government websites to search for information;
      • – Number of businesses that have made payments online;
      • – Percentage of internet traffic that pertains to electronic service delivery.
    • Indicators
      • 4. Impact Indicators
      • – reduction of waiting time at government counter x by y %;
      • – decrease in case processing time at government organisation x by y %;
      • – citizen/business satisfaction levels concerning eGovernment;
      • – survey-type questions, e.g.: ‘do you feel more positive to your government, now that you can contact it by email?’ ‘has your government become more efficient, now that you can perform services online?’
    • Indicators
      • 5. Environment Indicators
      • – ICT penetration rates (PC, internet, mobile phone) in private households, at work and in schools;
      • – Indicator that measures ‘fear of invasion of privacy’;
      • – Online shopping rates as an indicator of trust in online environments;
      • – Indicator that measures ‘quality of legislation concerning the information society’;
      • – Telephone tariffs, GSM tariffs, Internet access tariffs.
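      A minimal sketch in Python (all figures are hypothetical, not from Jenssen 2005) computing a few of the indicator types above from basic counts a monitoring unit could collect:

        departments = 120
        departments_with_website = 84
        online_services = 45
        citizens_visiting_gov_sites = 2_100_000
        internet_users = 6_000_000
        egov_spend = 1.2e9   # currency units, hypothetical
        gdp = 9.0e11

        indicators = {
            # Input indicator: e-government spending as a share of GDP
            "egov_spend_pct_gdp": 100 * egov_spend / gdp,
            # Output indicator: departments with a website
            "depts_with_website_pct": 100 * departments_with_website / departments,
            # Output indicator: number of online services
            "online_services": online_services,
            # Usage indicator: internet users visiting government websites
            "citizens_visiting_pct": 100 * citizens_visiting_gov_sites / internet_users,
        }
        for name, value in indicators.items():
            print(f"{name}: {value:.2f}")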
    • XIII. Approaches to Evaluation Studies
      • 1. Cross-Sectional Studies (comparison of a group with treatment against another group without treatment)
      • 2. Longitudinal Studies (comparison of a group before and after treatment)
      • 3. Benchmarking Studies (comparison with best practices)
      • 4. "Value Addition" Studies (Accenture)
    • Approaches to Evaluation Studies
      • 5. Citizen Satisfaction Studies (American Consumer Satisfaction Index - ACSI) / Citizen's Report Card - Public Affairs Centre (PAC), Bangalore
      • 6. Department of Information Technology (DIT)'s Evaluation Assessment Framework (EAF)
      • 7. U.S. Office of Management and Budget (OMB)'s Program Assessment Rating Tool (PART)
    • 1. DIT’s Evaluation Assessment Framework (EAF)
      • EAF Version 2.0 (2004)
      • Service Orientation
      • Technology
      • Sustainability
      • Cost Effectiveness
      • Replicability
    • Weights for Attributes [figure: the relative weights assigned to the five EAF attributes; not recoverable from the transcript]
    • Rating of E-government Projects in India 2005-06 (Source: DIT)
      SN  Rating               Number of Projects  Percentage
      1   Extremely Good (EG)  16                  43
      2   Good (G)             17                  46
      3   Satisfactory (S)     1                   3
      4   Poor (P)             3                   8
          Total                37                  100
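      A minimal sketch in Python (the attribute weights, score scale and rating cut-offs are assumptions; DIT's actual EAF weights are not recoverable from this transcript) of a weighted roll-up of the five EAF attributes into an EG/G/S/P rating:

        WEIGHTS = {  # hypothetical weights; must sum to 1.0
            "service_orientation": 0.30,
            "technology": 0.20,
            "sustainability": 0.20,
            "cost_effectiveness": 0.20,
            "replicability": 0.10,
        }

        def rate(scores: dict) -> str:
            """Map attribute scores (0-100) to an EG/G/S/P rating."""
            total = sum(WEIGHTS[a] * s for a, s in scores.items())
            if total >= 80:
                return "Extremely Good (EG)"
            if total >= 60:
                return "Good (G)"
            if total >= 40:
                return "Satisfactory (S)"
            return "Poor (P)"

        print(rate({"service_orientation": 85, "technology": 70,
                    "sustainability": 65, "cost_effectiveness": 75,
                    "replicability": 60}))  # -> Good (G)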
    • 3. U.S. Program Assessment Rating Tool (PART)
      • Clarity of Purpose and Sound Design
      • Strategic Planning (valid annual and long-term goals)
      • Management (program, financial oversight and program improvement efforts)
      • Results (accuracy, consistency)
      • (Source: ExpectMore.gov)
    • Assessment of U.S. Federal Programs by PART (2008): Distribution of Program Ratings (Source: ExpectMore.gov)
      Number of Programs Assessed: 1,004
      Effective: 18% | Moderately Effective: 31% | Adequate: 29% | Ineffective: 3% | Results Not Demonstrated: 19%
    • XIV. Who will evaluate?
      • First decide whether evaluation will be done in-house or by an outside agency
      • In-house evaluation is preferable as it builds evaluation capability in-house
      • No separate in-house evaluation unit is required
      • Entrust the evaluation function to the in-house monitoring unit suggested earlier
      • And call it the monitoring and evaluation (M&E) unit
      • Let outside agencies also undertake evaluation and special studies after a gap of 3/5 years
    • XV. Which Type of Evaluation is Most Suited to E-government?
      • The type of evaluation will depend upon the specific requirements of an e-government project.
      • There are two key stakeholders in e-government: 1. E-government Management, and 2. Citizens
      • There are four standards for evaluation: 1. Utility 2. Feasibility 3. Propriety 4. Accuracy
      • E-government evaluation must meet the following two criteria: (a) Utility (to stakeholders - Management and Citizens) and (b) Actual Use, both geared to serve citizens
    • Which Type of Evaluation is Most Suited to E-government?
      • Utilisation-focused Evaluation (Patton 1997) meets our criteria for selecting the type of evaluation
      • Mere provision of government services online is not e-government
      • The online services must be utilised by the target group - the citizens
      • The issue of the impact of e-government arises only when the following chain is satisfied:
      • PROVISION (of e-gov services) → UTILISATION (by citizens) → IMPACT (on general well-being of citizens)
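      A minimal sketch in Python (the thresholds are hypothetical) of the chain above as a gating check: impact is only worth measuring once provision and utilisation both hold:

        def impact_measurable(services_online: int, active_users: int,
                              min_services: int = 1, min_users: int = 100) -> bool:
            """True only if services are provided AND actually utilised."""
            provided = services_online >= min_services
            utilised = active_users >= min_users
            return provided and utilised

        print(impact_measurable(services_online=5, active_users=0))     # False
        print(impact_measurable(services_online=5, active_users=2500))  # True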
    • Part C: E-government M&E
      • I. E-government Monitoring and Evaluation (M&E)
      • M&E is an under-developed aspect of e-government
      • It has so far not found any systematic application in e-government project implementation
      • M&E findings, where available, are not widely diffused
      • Its neglect hampers e-government development
      • M&E is a tool for the development of e-government
    • II. E-government Monitoring and Evaluation (M&E) Unit
      • The E-government M&E unit will have the same staff as the monitoring unit
      • A part of the organisation, it will report directly to top management
      • It will undertake regular monthly monitoring of the e-government project, and evaluation six-monthly or annually
      • It will give 80% weightage to monitoring and 20% to evaluation
    • III. Relative Weights to Monitoring and Evaluation
      • Relative Importance of M&E
      [Figure: at the initial stage monitoring dominates (M) and evaluation is minor (e); after 4/5 years evaluation dominates (E) and monitoring recedes (m)]
    • IV. E-government M&E Framework: Components
      • The E-government M&E Framework consists of five components:
      • (a) E-government Management
      • (b) E-government M&E Unit
      • (c) Information Needs Matrix
      • (d) E-government M&E Cycle
      • (e) Citizens
    • IV(a). E-government Management
      [Figure: the management hierarchy - Top Management, Middle Management and Supporting Staff - together with the Information Technology (IT) Department]
    • IV(b). E-government M&E Unit
      [Figure: the M&E Unit reports to Top Management and comprises the Head of M&E Unit, a Database Administrator/System Analyst, a Statistician, an Economist, and a Political Scientist/Sociologist]
    • IV(c). Information Needs Matrix
      • Information Needs of Management
      • Information needs differ across the three levels of the management hierarchy: 1. Top Management 2. Middle Management 3. Supporting Staff
      • Undertake an information needs analysis of the various levels of the management hierarchy
      • Prepare an Information Needs Matrix
    • Information Needs Matrix
      Information Needs (rows): Citizen Needs Information, Organisational Needs Information, ICT Infrastructure Information, Citizen Access Information, Utilisation Information, Review Information
      Users (columns): Top Management, Middle Management, Supporting Staff, IT Department
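      A minimal sketch in Python (which cell is flagged for which user level is hypothetical) of the Information Needs Matrix as a lookup table:

        NEEDS = ["citizen_needs", "organisational_needs", "ict_infrastructure",
                 "citizen_access", "utilisation", "review"]
        USERS = ["top_management", "middle_management", "supporting_staff",
                 "it_department"]

        # Start with an empty matrix and fill in a few hypothetical cells.
        matrix = {need: {user: False for user in USERS} for need in NEEDS}
        matrix["review"]["top_management"] = True          # reviews go to the top
        matrix["utilisation"]["middle_management"] = True  # usage tracked mid-level
        matrix["ict_infrastructure"]["it_department"] = True

        def needs_of(user: str) -> list[str]:
            """All information categories flagged for a given user level."""
            return [need for need in NEEDS if matrix[need][user]]

        print(needs_of("top_management"))  # ['review']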
    • IV(d). E-government M&E Cycle: 1. Prepare E-business Plan → 2. Implement it → 3. Monitor it → 4. Evaluate it → 5. Review it
    • IV(e). Citizens
      • Citizens interact with government in the following five ways, as:
      • (a) Information Seekers (of government activities)
      • (b) Service Users (of public services)
    • Citizens
      • (c) Beneficiaries (of public programmes like NREGP)
      • (d) Compliers (with laws, rules and regulations, like payment of taxes)
      • (e) Stakeholders (in public policies and programmes)
      • Their needs in these capacities have to be identified and met by E-government M&E
    • V. A Framework for E-government Monitoring and Evaluation
      • We are now in a position to link these components and present
      • A Framework for E-government Monitoring and Evaluation
      • The Framework conceptualises the complex reality of e-government and provides a roadmap for the E-government M&E Unit
      • Here then is the Framework.
    • A Framework for E-government Monitoring and Evaluation
      [Figure: the Framework links (a) E-government Management (Top Management, Middle Management, Supporting Staff and the IT Department), (b) the M&E Unit, (c) the Information Needs Matrix, and (d) the M&E Cycle (Prepare E-business Plan → Implement → Monitor → Evaluate → Review) with Citizens, drawing on formal sources of information (census & surveys, audit, ICT indicators, forecasting, legislature, RTI Act) and informal sources (discussion groups, blogs, wikis, social sites), in support of e-business plan implementation and sustainability]
    • Contribution of M&E to Good Governance
      • Monitoring information and evaluation findings can contribute to sound governance in a number of ways:
      • Evidence-based policy making (including budget decision making),
      • Policy development, management, and accountability.
      • Many governments around the world have realized much of this potential, including most OECD countries and a small but growing number of developing countries. (Source: Mackay 2007)
    • Contribution of M&E to Good Governance
      • Examples of well-functioning government M&E systems:
      • (Source: Mackay 2007)
      • 1. Australia (by 1994, almost 80 percent of new spending proposals relied on evaluation findings)
      • 2. Colombia (which has about 500 performance indicators)
      • 3. United Kingdom (Public Sector Agreements between the Treasury and each of the 18 main departments)
      • 4. U.S.A. (PART (Program Assessment Rating Tool), created in 2002, rates all 1,000 government programmes), and
      • 5. Chile (whose Finance Ministry collects 1,550 performance indicators).
    • Part D: My Questions
      • With this I end my presentation, but ask the following questions:
      • What is monitoring? How does it differ from other sources of management information?
      • What is evaluation? Describe different types of evaluation. Which type of evaluation is most suited to e-government? Give reasons for your answer.
    • My Questions
      • What role can an e-government monitoring and evaluation (M&E) unit play in the successful implementation of an e-government project?
      • Do you agree that the development of a framework for e-government monitoring and evaluation (M&E) can provide a useful roadmap for the M&E unit? Give reasons for your answer.
    • Your questions now!
      • Thank you for your attention.
      Have a nice day. -- Dr D.C. Misra