Use of PEFA assessments. ICGFM Conference, Miami, May 21, 2009. PEFA Secretariat
Workshop Session II: Public Expenditure Financial Accountability (PEFA) Assessment

Frans Ronsholt, Head, PEFA Secretariat, and Franck Bessette, PFM Expert, PEFA Secretariat

The second session looks at the potential for using PFM assessments based on the PEFA Framework for reform formulation, country comparison, and monitoring of reform results over time. Each part concludes with a small case study for participants to work through.

    1. Use of PEFA assessments. ICGFM Conference, Miami, May 21, 2009. PEFA Secretariat
    2. What can countries use the PEFA Framework for?
       • Harmonize information needs
         - Common information pool
       • Inform PFM reform formulation and priorities
       • Compare to and learn from peers
       • Monitor results of reform efforts
    3. What can countries use the PEFA Framework for?
       • Harmonize information needs
       • Inform PFM reform formulation and priorities
       • Compare to and learn from peers
       • Monitor results of reform efforts
    4. PEFA reports for reform formulation (1)
       • The PEFA report is one input out of several
         - Identification of main strengths and weaknesses, and their potential impact on budgetary outcomes
         - Other factors: political economy, institutional and cultural context, constitutional/legal framework, resources, capacity at entry
       • Ownership means government decisions on priorities
         - Government to consider all factors in deciding priorities
         - Reform dialogue with donors to allow ample space
       • Do not use indicator scores simplistically
         - A low score is not sufficient justification for a high-priority reform
         - Other factors: relative importance of the subject, complexity/timeframe for improvement, interdependence with other elements, weakest links
    5. Coverage of the PFM Performance Report in the PFM Reform Cycle
       [Diagram: the PFM reform cycle runs from a high-level performance overview, through identifying main PFM weaknesses, investigating underlying causes, recommending PFM reform measures, and formulating the PFM reform program, to implementing PFM reforms; the PFM-PR covers the performance overview and the identification of main weaknesses.]
    6. PEFA reports for reform formulation (2)
       • Complementary analysis to PEFA required
         - Detailed analysis of underlying causes needed for formulation of a detailed action plan
         - Limit such analysis to priority areas
         - Drill-down tools: some exist, others under development
       • PEFA is preparing guidance on the use of PEFA reports in reform formulation
         - Stock-taking of conceptual approaches used
         - Country case studies illustrating good practice
    7. Purposes of Standard Diagnostic Tools
    8. Country case: Norway
       • Findings of the Norad-managed self-assessment presented to OECD-DAC in December 2007
       • The assessment showed low scores for seven areas of PFM system performance
       • Ministry of Finance reaction:
         - Weaknesses in procurement practices and in follow-up to external audit findings need to be addressed.
         - Three low-scoring areas not considered a priority at present (multi-year program/sector budgeting, limited extent of internal audit, no consolidated overview of risks from autonomous agencies and public corporations)
         - Two indicators scored low but concern municipal responsibilities; central government will not get involved.
    9. What can countries use the PEFA Framework for?
       • Harmonize information needs
       • Inform PFM reform formulation and priorities
       • Compare to and learn from peers
       • Monitor results of reform efforts
    10. PEFA reports for country comparison (1)
        • PEFA Framework developed for in-country use
          - The 'Summary assessment' provides a nuanced overview of strengths and weaknesses as a basis for reform prioritization
          - No method given for arriving at a single measure of 'overall performance level'
        • Wide interest in country comparison using aggregation
          - Researchers: learning about determining factors
          - Donors: aid allocations
          - Governments: peer learning
        • Aggregation requires three decisions
          - Conversion from an ordinal to a numerical scale
          - Weighting of indicators (generally and by country)
          - Weighting of countries (for country cluster analysis)
    11. PEFA reports for country comparison (2)
        • No scientifically correct or superior basis exists for deciding conversion and weights
          - Each user makes those decisions based on individual judgment
          - The PEFA program will not endorse any particular method
        • Preferred method of country comparison:
          - A nuanced comparison of two assessment reports
          - Consider country context; ensure comparison of like with like
        • In case aggregation is desired:
          - Be transparent about the aggregation methods used
          - Discuss the reasons for the choice
          - Use sensitivity analysis to illustrate the impact on findings
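The three aggregation decisions discussed above, and why sensitivity analysis matters, can be illustrated with a minimal sketch. The ordinal-to-numeric conversion scale, the indicator names, and the weights below are arbitrary illustrative assumptions, not a PEFA-endorsed method:

```python
# Sketch of aggregating PEFA ordinal scores into one number. The
# conversion scale and the weights are arbitrary illustrative choices.
CONVERSION = {"A": 4.0, "B+": 3.5, "B": 3.0, "C+": 2.5,
              "C": 2.0, "D+": 1.5, "D": 1.0}

def aggregate(scores, weights):
    """Weighted mean of indicator scores after ordinal-to-numeric conversion."""
    total = sum(weights[ind] for ind in scores)
    return sum(CONVERSION[s] * weights[ind] for ind, s in scores.items()) / total

# Hypothetical country with three indicators
scores = {"PI-1": "B", "PI-4": "C", "PI-21": "D+"}
equal_weights = {ind: 1.0 for ind in scores}
skewed_weights = {"PI-1": 3.0, "PI-4": 1.0, "PI-21": 1.0}

# Sensitivity analysis: identical scores, different weights, different result
print(round(aggregate(scores, equal_weights), 2))   # 2.17
print(round(aggregate(scores, skewed_weights), 2))  # 2.5
```

The two print lines show why the slide asks for transparency: the same set of scores yields a different aggregate as soon as the (unavoidably subjective) weights change.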
    12. What can countries use the PEFA Framework for?
        • Harmonize information needs
        • Inform PFM reform formulation and priorities
        • Compare to and learn from peers
        • Monitor results of reform efforts
    13. Repeat Assessments
        • As of March 2009, thirteen 'real' repeat assessments had been undertaken
        • Expected to become common from 2008/2009 onward, i.e., three years after the first series of baseline assessments
    14. What do we want to determine?
        • Specific changes in system performance
          - What has changed?
          - How much?
    15. Comparison of Indicator Scores
        • Indicator scores will provide a crude overview of changes over time, but ...
          - Dimensions may change differently
          - Performance may not always change enough to change the score (use of an arrow)
        • So a more detailed explanation is required
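A minimal sketch of such a crude indicator-by-indicator comparison follows; the indicator names and scores are illustrative, not taken from any real assessment:

```python
# Sketch: crude comparison of two assessments, indicator by indicator.
ORDER = ["D", "D+", "C", "C+", "B", "B+", "A"]  # ordinal scale, worst to best

def compare(baseline, repeat):
    """Return one summary line per indicator present in the baseline."""
    lines = []
    for ind, base in baseline.items():
        new = repeat[ind]
        delta = ORDER.index(new) - ORDER.index(base)
        if delta > 0:
            lines.append(f"{ind}: {base} -> {new} (score improved)")
        elif delta < 0:
            lines.append(f"{ind}: {base} -> {new} (score worsened)")
        else:
            # An unchanged score can hide movement within the band; the
            # slide notes that an arrow is used to signal such progress.
            lines.append(f"{ind}: {base} unchanged (check dimensions, consider arrow)")
    return lines

for line in compare({"PI-1": "C", "PI-4": "A"}, {"PI-1": "B", "PI-4": "C"}):
    print(line)
```

Even this mechanical diff only flags where to look; as the slide says, the detailed explanation of what actually changed still has to be written by the assessors.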
    16. Other Possible Reasons for Change in Scores
        • Changes in definitions
        • Improved availability of, or access to, information
        • Different information sampling and aggregation
        • Scoring methodology mistakes in the previous assessment
    17. Reporting on Progress Made
        • Explain, indicator by indicator, all factors that affect a change in rating
        • Identify the performance change
        • Ensure that any reader can track the change from the previous assessment: what performance change led to the change in rating
    18. Reporting on Progress Made
        • PI-1: score 2005: C; score 2007: B
          - Performance change: performance appears improved, based on deviations for the last three years (2005 assessment: 6%, 11%, 18%; 2007 assessment: 5%, 11%, 6%)
          - Other factors: not clear whether all external project funds were excluded from the data for the 2005 assessment, but probably an insignificant issue
        • PI-4 (i): score 2005: A; score 2007: C
          - Performance change: uncertain, despite reported arrears increasing from 1% in 2005 to 6% in 2007
          - Other factors: the 2005 assessment used data on pending payment orders only, not overdue invoices
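The PI-1 movement reported above can be checked against the indicator's calibration. The sketch below assumes the 2005-framework PI-1 rule as commonly summarized (the aggregate expenditure out-turn may exceed the threshold in at most one of the last three years); the exact wording is simplified here, so treat the thresholds as an assumption:

```python
def pi1_score(deviations):
    """PI-1 rating from three years of aggregate expenditure out-turn
    deviations (%, absolute). Assumed calibration: at most one of the
    three years may deviate by more than the score's threshold."""
    for score, threshold in (("A", 5), ("B", 10), ("C", 15)):
        if sum(d > threshold for d in deviations) <= 1:
            return score
    return "D"

# Figures from the slide above
print(pi1_score([6, 11, 18]))  # C: two years exceed 10%, only one exceeds 15%
print(pi1_score([5, 11, 6]))   # B: only one year exceeds 10%
```

Under these assumed thresholds the deviation data reproduce the reported C-to-B movement, which is exactly the kind of traceability the slide asks repeat assessments to provide.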
    19. Organizational Requirements
        • Performance tracking to be clearly reflected in the TOR
        • Use of the same assessment team desirable but rarely possible
        • The lead agency of the previous assessment, if different, should assist with access to the previous assessors' notes ...
        • ... and be part of the reference group for the repeat exercise
    20. Country case: Mozambique
        • PEFA incorporated into the PFM reform monitoring system
          - Baseline 2005, repeat assessment 2007, planned follow-up 2010
        • Important performance improvements 2005-2007 in budget execution
          - Revenue administration, cash management, internal controls
        • Improvements resulted from:
          - Reforms already well under way in 2005 (e.g. IFMIS/SISTAFE and revenue administration)
          - Small managerial/administrative changes (including quick wins identified on the basis of the 2005 assessment)
          - New reform initiatives in payroll control (identified from the 2005 assessment as an important neglected area of reform)
    21. Thank you for your attention