
From policy to impact - Serious Social Investing 2013

Learn about how the South African government uses monitoring and evaluation to assess its performance.

Dr Ian Goldman, from the Department of Performance Monitoring and Evaluation: The Presidency, speaks at the Tshikululu Social Investments Serious Social Investing 2013 workshop.


From policy to impact - Serious Social Investing 2013

  1. The Presidency: Department of Performance Monitoring and Evaluation. Why is M&E important? What is government doing? Presentation to the Tshikululu CSI Conference, 13 March 2013. Dr Ian Goldman, Head of Evaluation and Research.
  2. Summary. Approach – M&E as a system, not an ad-hoc donor approach. Why is M&E important? Evidence for policy- and decision-making; helping to get a results culture – transforming the public service. The problem. How is government approaching it? Challenges.
  3. What is evidence-based policy? Helping policy makers to make better decisions and achieve better outcomes; providing better services (public and private). By using: existing evidence more effectively; new research/evaluation to fill the gaps in the evidence base. And: integrating sound evidence with decision makers' knowledge, skills, experience, expertise and judgement. Source: Oxford Evidentia.
  4. [Diagram (Source: Oxford Evidentia): harnessing existing evidence for policy and implementation. Policy questions – What is already known about the problem/policy? How is the policy to work? What are the ethical implications? What is the nature, size and dynamics of the problem? What are the costs and benefits of the policy? What has been shown to work elsewhere? How do we make the policy work? – are mapped to evidence types: logic models and theories of change, ethical research and evidence synthesis, descriptive statistics and surveys, cost-benefit/effectiveness/utility analysis, economic and econometric evidence, experiential evidence, qualitative research (case studies, interviews, focus groups, ethnography), experimental and quasi-experimental evidence, and operations research.]
  5. Views of senior managers. One view: evidence is scientific and objective, enabling reliable prediction based on facts that speak for themselves, collected by objective and independent specialists, derived through replicable methods and constituting objectively verifiable proof. Another view: evidence is probabilistic, emergent and contested – an iterative search for explanations and understanding of how to achieve politically derived values, in which the choice of facts and sources is influenced by existing ideas, ideology, mind-set, values and interests, and is subject to specific and changing contextual factors. A third group straddled these views, indicating that the choice should be dictated by the type of policy to be developed and the type of research methodology appropriate to that type of policy decision.
  6. Use of M&E as a change strategy – WPTPS: a mission statement for service delivery, together with service guarantees; the services to be provided, to which groups, and at which service charges, in line with RDP priorities, the principle of affordability, and the principle of redirecting resources to areas and groups previously under-resourced; service standards, defined outputs and targets, and performance indicators, benchmarked against comparable international standards; monitoring and evaluation mechanisms and structures, designed to measure progress and introduce corrective action where appropriate; plans for staffing, human resource development and organisational capacity building, tailored to service delivery needs; the redirection of human and other resources from administrative tasks to service provision, particularly for disadvantaged groups and areas; financial plans that link budgets directly to service needs and personnel plans; potential partnerships with the private sector, NGOs and community organisations to provide more effective forms of service delivery.
  7. Measuring results: having a clear direction (you know what you want to achieve); having targets; having a theory of change – a logical link between what you do and what you achieve; linking resources to plans and monitoring progress against plans. A challenge of the target approach – does a good headmaster do what she does because of targets? Be careful of taking private sector models too far.
  8. 8. So How can we strengthen and formalise the use of evidence How can we formalise the need for effective theories of change  Through strengthening planning  Through the evaluation process How can we use M&E as part of an organisational change strategy The Presidency: Department of Performance Monitoring and Evaluation 10
  9. But we have a problem…
  10. 1.3 Performance Area: Monitoring and Evaluation. 1.3.1 Indicator name: Use of monitoring and evaluation outputs. Indicator definition: extent to which the department uses monitoring and evaluation information. Secondary data: AGSA findings on predetermined objectives – reported information not reliable. Question: Which set of statements best reflects the department's use of M&E outputs?
      Level 1 – The department does not have an M&E policy/framework or does not have the capacity to generate information. Evidence: not required.
      Level 2 – Monitoring reports are available but are not used regularly by top management and programme managers to track progress and inform improvement. Evidence: quarterly monitoring reports; minutes of top management or programme meetings to assess use of reports.
      Level 3 – Monitoring reports are regularly used by top management and programme managers to track progress and inform improvement. Evidence: as for Level 2.
      Level 4 – All of Level 3, plus evaluations of major programmes are conducted periodically and the results are used to inform changes to programme plans, business processes, the APP and the strategic plan. Evidence: all of Level 3 plus evaluation reports and changes to programmes and plans.
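      The rubric above is effectively a small decision rule. Below is a minimal sketch of how it could be expressed in code, assuming hypothetical input flags – none of these names come from MPAT or any DPME system:

          # Hypothetical sketch of the MPAT 1.3.1 rubric above; the function and
          # parameter names are illustrative, not an official MPAT interface.
          def me_outputs_level(has_me_framework: bool,
                               reports_available: bool,
                               reports_used_regularly: bool,
                               evaluations_inform_plans: bool) -> int:
              """Return the performance level (1-4) for 'Use of M&E outputs'."""
              if not has_me_framework:
                  return 1  # no M&E policy/framework or capacity to generate information
              if reports_used_regularly and evaluations_inform_plans:
                  return 4  # Level 3 plus periodic evaluations feeding programme plans
              if reports_used_regularly:
                  return 3  # reports regularly used to track progress and inform improvement
              if reports_available:
                  return 2  # reports exist but are not used regularly by top management
              return 1  # no monitoring information being generated or used

          # Example: a department whose quarterly reports sit unread scores Level 2.
          print(me_outputs_level(True, True, False, False))  # -> 2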
  11. Score in M&E (based on self-assessments by 103 national and provincial departments).
  12. Problem: Evidence and analysis not used sufficiently in decision-making, planning or budgeting, particularly of programmes. 44% of national and provincial departments are not regularly using monitoring reports to improve performance. Monitoring is undertaken as compliance, not as part of a culture of continuous improvement. Evaluation is applied sporadically and does not inform planning, policy-making and budgeting sufficiently – missing the opportunity to improve government's effectiveness, efficiency, impact and sustainability. Parliament is relatively weak compared to the executive, so oversight is limited (on this trip to the US and Canada, partly to improve understanding of the committee that oversees DPME).
  13. Government's approach.
  14. Roles and responsibilities for planning and M&E in SA:
      Auditor General – independent monitoring of compliance and reporting; auditing of performance information; reporting to Parliament.
      National Treasury – regulates departmental 5-year and annual plans; receives quarterly performance information; expenditure reviews.
      Presidency – the National Planning Commission (NPC) produces the long-term plan (20 years); the Department of Performance Monitoring and Evaluation (DPME) produces government-wide M&E frameworks, facilitates production of whole-of-government 5-year plans for priorities, and monitors and evaluates plans for priorities as well as the performance of individual departments and municipalities.
      Public Service Commission – independent monitoring and evaluation of the public service; focus on adherence to public service principles in the Constitution; reporting to Parliament.
      Cooperative Governance Dept (DCOG) – regulates local government planning; monitors performance of local government; intervention powers over local government.
      Public Service Dept (DPSA) – monitors the national and provincial public service; regulates service delivery improvement.
      (The original diagram distinguishes constitutional, legal and executive powers.)
  15. Focus of DPME to date:
      M&E of national priorities – plans for the 12 priority outcomes (delivery agreements); monitoring (i.e. tracking) progress against the plans; evaluating to see how to improve programmes, policies and plans (2012-13: 8 evaluations, then 15, then 20).
      Management performance M&E – assessing the quality of management practices in individual departments (MPAT) at national/state level; moderated self-assessment and continuous improvement.
      M&E of front-line service delivery – monitoring the experience of citizens when obtaining services (joint with states); Presidential Hotline – analysing responses and follow-up.
      Government-Wide M&E System – M&E platforms across government, nationally and provincially; data quality issues; structures of M&E units and capacity development; emerging focus on (implementation) programmes; National Evaluation System (initially NEP-focused).
  16. Why evaluate? Improving policy or programme performance (evaluation for continuous improvement): this aims to provide feedback to programme managers. Evaluation for improving accountability: where is public spending going? Is this spending making a difference? Improving decision-making: should the intervention be continued? Should how it is implemented be changed? Should increased budget be allocated? Evaluation for generating knowledge (for learning): increasing knowledge about what works and what does not with regard to a public policy, programme, function or organisation.
  17. Different types of evaluations, related to questions around the outcome model:
      Diagnostic – what is the underlying situation, and what are the root causes of the problem?
      Design evaluation – does the theory of change seem strong?
      Implementation evaluation – what is happening, and why?
      Impact evaluation – has the intervention had impact at outcome and impact level, and why?
      Economic evaluation – what are the cost-benefits?
  18. Following up on the evaluations. Evaluation report – 1-page policy summary, 3-page executive summary, 25-page report. Management response – each department responds formally, and the response is also put on the website. Improvement plan – developed with the departments involved after the report is approved, and then monitored. Communication – development of customised communication materials for different audiences; the evaluation report, management response and improvement plan are put on the department and DPME websites.
  19. Challenges emerging. Overall the system is working, but some challenges are emerging. These include: poor communication channels, with some DGs and programme managers often not aware of the possibility; some senior managers wary, not seeing it as an opportunity to improve their performance; not getting the right people to briefing sessions, so senior managers don't understand the system and haven't bought in; making sure the evaluations proposed are the strategic ones; departments sometimes not budgeting for evaluations and expecting DPME to provide all the money; departments not planning ahead – very important for impact evaluations in particular, where one needs to plan 3+ years ahead; some avoidance strategies – e.g. parallel evaluations, or not providing information to evaluators.
  20. So we are developing a corpus of evaluations: 7 underway; 16 being scoped; 93 from 2006 that will go on the website in May; 15 for 2014/15… We are on the journey.
  21. Background slides.
  22. 8 evaluations in National Evaluation Plan 2012-13 (1): 1. Impact evaluation of the National School Nutrition Programme (NSNP) (DBE). 2. Impact evaluation of Grade R (DBE). 3. Implementation evaluation of the Integrated Nutrition Programme (Health). 4. Implementation evaluation of the Land Reform Recapitalisation and Development Programme (Department of Rural Development and Land Reform). 5. Implementation evaluation of the Comprehensive Rural Development Programme (Department of Rural Development and Land Reform). 6. Implementation/design evaluation of the Business Process Services Incentives Scheme (Department of Trade and Industry). 7. Implementation evaluation of the Integrated Residential Development Programme (IRDP) (Department of Human Settlements). 8. Implementation evaluation of the Urban Settlements Development Grant (USDG) (Department of Human Settlements).
  23. Evaluations recommended for 2013/14: 1. Evaluation of the Export Marketing Investment Assistance incentive programme (DTI). 2. Evaluation of the Support Programme for Industrial Innovation (DTI). 3. Impact evaluation of the Technology and Human Resources for Industry programme (DTI). 4. Evaluation of the Military Veterans Economic Empowerment Programme (Military Veterans). 5. Impact evaluation of the Tax Compliance Cost of Small Businesses (SARS). 6. Impact evaluation of the Comprehensive Agriculture Support Programme (DAFF). 7. Evaluation of the Socio-Economic Impact of Restitution programme (DRDLR). 8. Evaluation of the Quality of the Senior Certificate (DBE).
  24. 2013/14 continued: 9. Setting the baseline for impact evaluation of the informal settlements targeted for upgrading (DHS). 10. Evaluating interventions by the Department of Human Settlements to facilitate access to the city (DHS). 11. Provision of state-subsidised housing and asset poverty for households and local municipalities (DHS). 12. Impact evaluation of the Community Works Programme (DCOG). 13. Evaluation of the National Advanced Manufacturing Technology Strategy (DST). 14. Impact evaluation of the Outcomes Approach (DPME). 15. Impact/implementation evaluation of national coordination structures, including the cluster system (Presidency).
  25. 10 Steps.
