Kittens are evil: Heresies in Public Policy
by Toby Lowe, Newcastle University Business School

Transcript of "Kittens are evil: Heresies in Public Policy"

  1. Kittens are Evil: Heresies in Public Policy #KittensAreEvil
  2. Language
     Outcomes-Based Performance Management
     “Outcomes Based Evaluation”
     “Outcomes/Results Based Accountability”™
     “Results Based Management”
     “Payment by Results”
     = measure performance by the impact a person/team/organisation/project has in the world
     #KittensAreEvil
  3. Research Findings
     • Measurement Problem: Outcomes don’t measure impact in people’s lives
     • Attribution Problem: Outcomes aren’t delivered by an organisation
     • OBPM distorts organisations’ priorities
     • OBPM undermines good frontline practice
     #KittensAreEvil
  4. Measurement Problem: Outcomes Don’t Measure Impact #KittensAreEvil
  5. “One clear and compelling answer to the question of ‘Why measure outcomes?’ is: To see if programs really make a difference in the lives of people.” United Way of America #KittensAreEvil
  6. The outcomes bible… “Outcomes Based Evaluation”, by Robert Schalock, 1995 #KittensAreEvil
  7. How to measure an outcome
     • Delivery group/control group
     • In-depth qualitative research
     • Large scale quantitative research – designed by participants
     #KittensAreEvil
  8. How to measure an outcome: Minimum post-programme research time? 18 months #KittensAreEvil
  9. What does get measured? Netten, A., Beadle-Brown, J., Caiels, J., Forder, J., Malley, J., Smith, N., Trukeschitz, B., Towers, A., Welch, E. and Windle, K. (2011) Adult Social Care Outcomes Toolkit v2.1: Main guidance, PSSRU Discussion Paper 2716/3, Personal Social Services Research Unit, University of Kent, Canterbury
  10. What does get measured? Netten, A., Beadle-Brown, J., Caiels, J., Forder, J., Malley, J., Smith, N., Trukeschitz, B., Towers, A., Welch, E. and Windle, K. (2011) Adult Social Care Outcomes Toolkit v2.1: Main guidance, PSSRU Discussion Paper 2716/3, Personal Social Services Research Unit, University of Kent, Canterbury
  11. ASCOT
      Control group? - No
      Qualitative techniques? - No
      Quantitative techniques? - Yes
      Designed by service users? - No
      18 month follow up? - Maybe
      #KittensAreEvil
  12. Focusing Illusion: “Nothing in life is quite as important as it seems to be while you’re thinking about it” Daniel Kahneman, Professor of Psychology and Public Affairs, Princeton University #KittensAreEvil
  13. What does get measured? Quality Dimension / Accountability Dimension #KittensAreEvil
  14. #KittensAreEvil
  15. #KittensAreEvil
  16. The attribution problem: Outcomes aren’t delivered by organisations
  17. What is an outcome?
  18. Programme Logic Model. Robert Schalock & Gordon Bonham “Measuring outcomes and managing for results”, Evaluation and Program Planning, 2003
  19. Programme Logic Model. Robert Schalock & Gordon Bonham “Measuring outcomes and managing for results”, Evaluation and Program Planning, 2003
  20. Programme Logic Model. Robert Schalock & Gordon Bonham “Measuring outcomes and managing for results”, Evaluation and Program Planning, 2003
  21. What else is missing? Program participants? ? ?
  22. “Outcomes are by definition results over which organizations do not have complete control” John Mayne, “Challenges and Lessons in Implementing Results-Based Management”, Evaluation, 2007
  23. Theoretical problems:
      • Measurement problem: Outcomes don’t measure impact in people’s lives
      • Attribution problem: Outcomes aren’t delivered by an organisation
      The uncertainty principle in action?
  24. What happens when people implement OBPM? What’s the evidence?
  25. OBPM creates:
      • “Goal displacement”
      • “Creaming”
      • “Making the numbers”
      Burt Perrin “Effective Use and Misuse of Performance Measurement”, American Journal of Evaluation, 1998
  26. Targets for results “frequently distort the direction of programs, diverting attention away from, rather than towards, what the program should be doing.” Burt Perrin, 1998
  27. “Unintended consequences”:
      • focusing on those who are easiest to help
      • “difficult” clients are skipped in favor of the “easy” ones
      S van Thiel and F. L. Leeuw “The Performance Paradox in the Public Sector”, Public Performance and Management Review, 2002
  28. “Ossification, a lack of innovation, tunnel vision and suboptimization” S van Thiel and F. L. Leeuw “The Performance Paradox in the Public Sector”, Public Performance and Management Review, 2002
  29. “Target based performance management always creates ‘gaming’” (my emphasis) Bevan, G. and Hood, C. “What’s measured is what matters: targets and gaming in the English public health care system”, Public Administration, 2006
  30. Triage “parks” disabled people on Work Programme – Independent, Monday 28th January, 2013
      “Work advisers 'pushing jobless into self-employment'” – BBC, 3rd February, 2013
      “Private health contractor's staff told to cut 999 calls to meet targets” – Guardian, Wednesday 23 January 2013
      “NHS targets 'may have led to 1,200 deaths' in Mid-Staffordshire” – Daily Telegraph, March 2009
  31. “A4e employee forged signatures to boost job placement numbers” – The Guardian, 6th March, 2012
      “Serco gave NHS false data about its GP service 252 times” – Guardian, Thursday 20 September 2012
  32. Implementing outcomes approaches
      “Always results in gaming”:
      • Creaming/cherry picking (helping the easiest to help)
      • Targeting resources to produce data (teaching to the test)
      • Reclassifying results (pretending)
      • Making things up
  33. Impact on frontline practice
  34. Outcomes Based Accountability
      • Makes “it more difficult to engage with and build relationships with homeless and at risk young people”
      • Has significant impacts on the daily practice of workers
      • Reduces the time available to create a sense of belonging
      • Reduces the time to “develop young people’s life skills”
      Lynn Keevers (et al) “Made to Measure: Taming Practices with Results-based Accountability”, Organization Studies, 2012
  35. What Social Workers do…
      • 86 per cent of time is system driven - filling in forms for accountability and discussing them with colleagues.
      • The 14 per cent of time spent face to face with a family member is not developmental.
      Hilary Cottam, Relational Welfare, 2011
  36. Frontline practice = Reversal of relationship between worker/client
      From: how can I help you achieve your goals?
      To: how can you help me achieve my targets?
  37. Models of Outcomes Based Performance Management
  38. Payment by Results: Managing for Genuine Impact?
      1. Accept you can’t do it
      2. Define an outcome you can live with
      3. Set targets for performance
      4. Providers plan activity to meet outcome targets
      5. Providers deliver activity
      6. Gather outcomes data
      7. Realise limits of influence
      8. Game:
         • Cherry-pick
         • Teach to the test
         • Reclassify
         • Change practice to focus on data
         • Make up data
  39. OBPM: Managing Performance for Genuine Attributability
      1. Define performance you are responsible for
      2. Set targets for performance
      3. Providers plan activity to meet performance targets
      4. Deliver activity
      5. Gather performance data
      6. Act so as to improve performance data
  40. Summary
      • Outcomes don’t measure impact in people’s lives
      • Outcomes aren’t delivered by an organisation
      • OBPM distorts organisations’ priorities
      • OBPM undermines good frontline practice
  41. If not outcomes, then what?
      • Bottom up is key – start from actual people’s needs
      • Deal with complexity: Put human judgement on the frontline
      • Trust & Transparency
      • Use social theory: measure change in social context and identity
  42. Thanks for listening
      Toby Lowe
      E: tobylowe@hotmail.co.uk
      Twitter: @tobyjlowe