Embedding and assessing
ex post evaluation
Gary Banks
Presentation to OECD’s 7th Expert Meeting on Measuring
Regulatory Performance
June 2015
Reykjavik, Iceland
Ex post evaluation: why bother?
• All regulations are ‘experiments’
• Many will not have been done well
• Others will have passed their ‘use by date’
• The ‘stock’ of regulation is much larger than
the flow
• The net gains from reforming it are potentially
very large (e.g. a 5+ percent gain in Australia's
GDP from its 1980s-90s reforms)
Multiple costs of regulation

(Original slide: a diagram arraying the cost categories below from costs to government, through costs to business, to costs to the wider community, with the benefits needed to justify regulation rising as the costs accumulate.)

Administration costs to regulators
Administrative costs to business
Ø paperwork time
Ø reporting time
Substantive compliance costs
Ø investments in systems and training
Ø higher cost of investment
Fees and charges
Economic distortions
Ø deadweight losses
Ø lower investment
Ø lower innovation
Benefits foregone if regulation is ineffective
Ø other perverse effects
Ø other ‘non-market’ distortions

(In the original figure these categories are also bracketed as compliance costs and ‘distortion’ costs, and as costs to government, costs to business and costs to the community.)
‘Closing’ the regulatory cycle

Stage I − Decision
Ø RIS triage
o Identify need for embedded reviews
o Set sunset flags
Ø Stock-flow linkage rules

Stage II − Establishment
Ø Design to include embedded reviews
Ø Development of regulator management strategies

Stage III − Administration
Ø Regulator management strategies
Ø Monitoring review requirements

Stage IV − Review
Ø Programmed reviews
o Sunsetting
o Embedded
o PIRs (post-implementation reviews)
Ø Ad hoc reviews
o In-depth
o Specific benchmarking
o Public stocktakes

(The cycle is ‘closed’ by feedback loops: lessons from ex post evaluation, and lessons from regulators on what works, feed back into decision and design.)
Four key questions for ex post evaluation
• Appropriate? (Valid rationale for regulating?)
• Effective? (Realised the intended outcome?)
• Efficient? (Any unnecessary costs or impacts?)
• Is there a better alternative?
Three broad approaches to ex post evaluation
Programmed reviews
• Sunsetting
• Embedded in statute
• Post-implementation reviews
– process failure
– catch-all

Ad hoc reviews
• Public stocktakes
– economy-wide
– sectoral
• ‘Principles-based’ reviews
• Benchmarking
• ‘In-depth’ reviews

Ongoing ‘management’
• Regulator strategies
• Stock-flow linkages (see the sketch after this table)
– Budgets
– ‘In-Out’ / ‘Offsets’
– RIA-based consideration
• Red tape reduction targets
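
To make the stock-flow linkage concrete, here is a minimal sketch (not from the presentation) of how an ‘in-out’/offsets rule might be operationalised; the function name and all cost figures are hypothetical assumptions.

# Illustrative sketch only: one reading of an 'in-out' / offsets rule,
# under which the estimated annual compliance cost of a new regulation
# must be matched by repealing regulations of at least equal cost.
# All names and figures are hypothetical.

def offsets_rule_satisfied(new_cost_m: float, repealed_costs_m: list[float]) -> bool:
    """True if proposed repeals fully offset the new regulation's cost.

    Costs are annual compliance costs in $ millions.
    """
    return sum(repealed_costs_m) >= new_cost_m

# A new rule costing $2.0m/yr, offset by repeals worth $1.5m/yr and $0.8m/yr.
print(offsets_rule_satisfied(2.0, [1.5, 0.8]))  # True: 2.3 >= 2.0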
Why evaluate the ‘system’?
• System design is experimental too
− hard to know in advance which approaches will work (best)
− and even after the event this may not be apparent without
monitoring, consultation and analysis
• Evaluations must be seen to ‘pay their way’
− They involve time, effort and skills in short supply
− Some will oppose them for what they may expose
Is the system well designed in principle?
• Does it ‘cover the bases’? (No gaps?)
• Is it ‘proportionate’ in the application of
different approaches/tools?
• Are the tools themselves well designed?
Effort versus impact: an APC assessment (2011)

(Original slide: a matrix arraying these tools by potential return, with the vertical axis running from low to high effort.)

Potentially low return
Ø Sunsetting
Ø Regulator stock management
Ø Red tape targets
Ø RIS stock-flow link
Ø Broad red tape cost estimation
Ø Regulatory budgets and one-in one-out
Ø Frequent stocktakes

Potentially high return
Ø Known high-cost areas and known solutions from past reviews
Ø Regulator management strategies where weak in the past
Ø Periodic stocktakes
Ø In-depth reviews
Ø Embedded statutory reviews
Ø Benchmarking
Ø Packaged sunset reviews
Is the system being (properly) utilised in practice?
• Have evaluations actually been conducted
when they should?
• Were they adequately resourced?
• Were the reviewers appropriate?
• Was due process (incl. consultation) followed?
DATA: ‘tick a box’ monitoring (and central
storage of data)
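
As an illustration only (not from the slides), a ‘tick a box’ monitoring record held in central storage might capture one field per question above; every name below is a hypothetical assumption.

from dataclasses import dataclass

# Hypothetical record for central storage of review-monitoring data:
# one 'tick' per question on this slide.
@dataclass
class ReviewMonitoringRecord:
    regulation_id: str
    review_conducted_on_time: bool  # conducted when it should have been?
    adequately_resourced: bool      # adequately resourced?
    reviewers_appropriate: bool     # were the reviewers appropriate?
    due_process_followed: bool      # due process, incl. consultation, followed?

# Example entry (invented identifier and values).
record = ReviewMonitoringRecord("REG-001", True, True, False, True)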
What impacts have evaluations had?
• Changes in volume of regulation?
−Yes or no (tick a box), and simple arithmetic (see the sketch below)
• Changes in regulation’s benefits and/or costs?
−Evidence collected from individual evaluations
(may include qualitative information)
−Economy-wide analysis or surveys
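
A hypothetical illustration of that simple arithmetic (the figures are invented, not drawn from any actual review):

# Net change in the stock = instruments added - instruments repealed.
# Figures invented for illustration only.
added, repealed = 40, 55
net_change = added - repealed
print(net_change)  # -15: the stock shrank by 15 instruments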
Some lessons from Australia
• Value of a ‘portfolio’ of evaluation approaches
and tools, but need screening & prioritization
• Key role of institutions: (i) to monitor usage,
(ii) to collect and store data, and (iii) to
undertake reviews
• However, simpler systems work best (KISS)!