Performance Management and Measurement
Best Practices and Recent Initiatives: Part II
[Diagram labels: policy framework & program objectives; inputs (resources, people, management); activities; services & satisfaction; outcomes & impacts]
Performance Measurement and Management
1. Strategic performance
   1. Policy and inputs
   2. Policy and outcomes
   3. People, activities, outcomes
2. Expenditure performance
   1. Inputs and activities (productivity)
   2. Inputs and outcomes (effectiveness)
3. Management performance
   1. Managing links (HRM)
   2. Reaching objectives
   3. Meeting standards
4. Service or program performance
   1. Activities and outcomes
   2. Services and standards
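To make the expenditure-performance distinction concrete, here is a minimal illustrative calculation in Python (all figures and the program itself are hypothetical): productivity relates inputs to activities or outputs, while effectiveness relates inputs to outcomes.

```python
# Hypothetical figures for a single training program, used only to illustrate the
# difference between productivity (inputs vs. activities) and effectiveness
# (inputs vs. outcomes).
budget = 2_000_000            # inputs, in dollars
training_sessions = 400       # activities delivered
participants_employed = 150   # outcome: participants employed six months later

cost_per_session = budget / training_sessions          # productivity measure
cost_per_employment = budget / participants_employed   # effectiveness measure

print(f"Cost per session delivered: ${cost_per_session:,.0f}")      # $5,000
print(f"Cost per employment outcome: ${cost_per_employment:,.0f}")  # $13,333
```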
Preview
Evaluation in the Government of Canada
Background
◦ First centralized evaluation policy introduced in 1978
◦ Focus on smaller programs, with regular review, but no coherent reporting lines to Treasury Board
◦ Evaluation function was routinized, regularized, required, but not linked to strategic policy development (feeding into allocative and priority decisions)
◦ Liberal governments in the 1990s-2000s with a focus on results (NPM), reporting systems, the Management Accountability Framework (MAF), reporting and planning cycles, and the Program Activity Architecture (PAA)
◦ New evaluation policy in 2001: evaluation as a “management tool”
“approximately half the studies which attempted to
measure the effectiveness of programs were unable to
adequately attribute outcomes to activities” (OAG,
1983).
“Program evaluations frequently are not timely or
relevant. Many large-expenditure programs have
not been evaluated under the policy. Despite policy
requirements for evaluating all regulatory
measures, half have been evaluated …” (OAG, 1993)
“The function has not lived up to the original policy
expectations set out in 1997 … In fact evaluations
have resulted largely in the operational
improvements to and monitoring of programs,
rather than more fundamental changes.” (TBS,
2004)
Federal Policy on Evaluation (2009)
Key Features
◦ Policy is pursuant to the Financial Administration Act
◦ All programs (including internal services) in each department reviewed every five years
◦ Not applicable to Officers of Parliament or to some smaller departments
◦ Annual evaluation plans submitted to Treasury Board; priorities based on risk analysis
◦ Deputy Head (secretary) responsible for the evaluation function; appoints a Director of Evaluation who reports to the Deputy Head
◦ Departmental Evaluation Committee (in INAC, this is the Evaluation and Performance Measurement and Review Committee)
Coverage Requirements
6.1.8 ensure that the following evaluation coverage requirements are met and reflected in the departmental evaluation plan:
a. all direct program spending, excluding grants and contributions, is evaluated every five years;
b. all ongoing programs of grants and contributions are evaluated every five years, as required by section 42.1 of the Financial Administration Act;
c. the administrative aspect of major statutory spending is evaluated every five years;
d. programs that are set to terminate automatically over a specified period of time, if requested by the Secretary of the Treasury Board following consultation with the affected deputy head;
e. specific evaluations, if requested by the Secretary of the Treasury Board following consultation with the affected deputy head.
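In operational terms, the five-year coverage rule is a rolling schedule. As a rough sketch only (the program names, field names, and dates below are hypothetical, not taken from the policy), a departmental evaluation unit might flag programs that fall outside the window when drafting its annual evaluation plan:

```python
from datetime import date

# Hypothetical program records; names, fields, and dates are illustrative only.
programs = [
    {"name": "Program A", "last_evaluated": date(2010, 6, 1)},
    {"name": "Program B", "last_evaluated": date(2014, 3, 15)},
    {"name": "Program C", "last_evaluated": None},  # never evaluated
]

def due_for_evaluation(program, as_of, cycle_years=5):
    """Flag a program whose last evaluation falls outside the coverage window."""
    last = program["last_evaluated"]
    if last is None:
        return True
    try:
        window_start = as_of.replace(year=as_of.year - cycle_years)
    except ValueError:  # as_of is Feb 29 and the target year is not a leap year
        window_start = as_of.replace(year=as_of.year - cycle_years, day=28)
    return last < window_start

as_of = date(2016, 7, 1)
print([p["name"] for p in programs if due_for_evaluation(p, as_of)])
# ['Program A', 'Program C']
```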
Evaluation Criteria
Relevance
Issue 1: Continued need for the program
Issue 2: Alignment with government priorities: Assessment of the linkages between
program objectives and (a) federal government priorities and (b) departmental strategic
objectives.
Issue 3: Alignment with federal roles and responsibilities
Performance
Issue 4: Achievement of expected outcomes: Assessment of progress toward specified
outcomes (including immediate, intermediate, and ultimate outcomes)
Issue 5: Demonstration of efficiency and economy: Assessment of resource utilization in
relation to the production of outputs and progress toward expected outcomes.
Compliance Oversight
7.2 Deputy heads are responsible for addressing issues that arise regarding compliance with this policy, and with its associated directive and standard, and ensuring that appropriate remedial actions are taken to address these issues.
7.3 The Secretary of the Treasury Board will monitor compliance with this policy through ongoing monitoring of evaluations and departmental evaluation plans, including evaluation coverage and quality of evaluations …
Problems with the 2009 Policy
1. Departments over-burdened with producing evaluation reports on every program, whether large, small, high-priority or low-priority
2. Five-year departmental rolling evaluation plans not always aligned with government-wide priorities or needs
3. Evaluation results and information are rarely aggregated to support strategic decisions, either within a department or across government
   1. Criteria from the Policy on Evaluation often constrain sharper assessments of programs (e.g., “continuing need”)
   2. “Comparing apples and oranges through a banana lens.”
4. Disconnect between departmental evaluation reporting, annual (departmental) Reports on Plans and Priorities, annual Departmental Performance Reports, and demonstration of results for the whole of government
5. Wide variation across departments, some supportive, some less so
New Government: Management Focus
[Timeline: Mandate Letters → New Clerk: Michael Wernick → PCO Unit: Results and Delivery (Matthew Mendelsohn, Deputy Secretary) → February 2016: new “Results Framework” → July 2016: new Policy on Results]
INAC Program Activity Structure (PAA)
[PAA Architecture diagram: Strategic Outcome I → Program A and Program B → Sub-Programs → Sub-Sub-Programs (SSP)]
[Same PAA hierarchy annotated with indicator levels: Strategic Outcome → very high level indicators; Program → high level indicators; Sub-Program and SSP → more specific indicators]
Old PAA versus New “Results Framework”
[Side-by-side diagram. PAA Architecture: Strategic Outcome I → Programs → Sub-Programs → SSPs, with indicators attached at each level. New Results Framework: Core Responsibilities, each with its own Programs.]
New Architecture
• The results framework for each department consists of core responsibilities, results, and a range of indicators (some drawn from the older sub-program levels)
• Program inventory initially approved by TBS; subsequent changes are up to the Deputy Head and Minister (with notification)
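To make the structural shift concrete, here is a minimal, purely illustrative sketch in Python (the names, fields, and indicator texts are invented for this example, not taken from any actual departmental framework). The old PAA was a deep tree with indicators attached at every level; the new framework is flatter, pairing core responsibilities with a program inventory.

```python
# Hypothetical sketch of the two structures; all names are invented for illustration.

# Old PAA: strategic outcome -> program -> sub-program -> sub-sub-program,
# with indicators attached at every level of the tree.
old_paa = {
    "name": "Strategic Outcome I",
    "indicators": ["very high level indicator"],
    "children": [
        {
            "name": "Program A",
            "indicators": ["high level indicator"],
            "children": [
                {
                    "name": "Sub-Program A1",
                    "indicators": ["more specific indicator"],
                    "children": [
                        {"name": "SSP A1.1", "indicators": ["more specific indicator"], "children": []},
                    ],
                },
            ],
        },
    ],
}

# New results framework: core responsibilities (with results and indicators)
# plus a flat program inventory; indicators from any old PAA level can be reused.
new_framework = {
    "core_responsibilities": [
        {
            "name": "Core Responsibility 1",
            "results": ["result statement"],
            "indicators": ["indicator carried over from an old sub-program level"],
        },
    ],
    "program_inventory": ["Program A", "Program B"],
}

def depth(node):
    """Number of nested reporting levels in the old PAA tree."""
    return 1 + max((depth(child) for child in node["children"]), default=0)

print(depth(old_paa))                               # 4 reporting levels in the old PAA
print(len(new_framework["core_responsibilities"]))  # flatter: core responsibilities plus inventory
```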
New Results Architecture
1. Core responsibilities may be similar to the old PAA Strategic Outcomes in some cases, but in others will not be
2. There should be a cascading relationship between government priorities, departmental (ministerial) mandates, core responsibilities, results, and reporting
3. Program Inventory: should provide a more aggregated view of departmental activities (“programs” are “distinct groupings” of programs, services, activities, or combinations thereof)
4. Designation of a “Chief Delivery Officer” to establish and implement the Departmental Outcomes Framework and Program Inventory – the “key liaison” with TBS
5. “Tag” approach: each program in the inventory will be “tagged” to government priority, horizontal results, client groups, type of program, etc., allowing a more synoptic view (see the sketch after this list)
6. Should provide more flexibility for re-allocations at the program level, informed by results information
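The “tag” idea can be illustrated with a tiny sketch (the tags and program names below are hypothetical): once each inventory entry carries tags, the inventory can be sliced along any dimension to answer cross-cutting questions.

```python
# Hypothetical tagged program inventory; tags and program names are invented
# for illustration only.
program_inventory = [
    {"program": "Program X", "priority": "priority_1", "client_group": "communities", "type": "contribution"},
    {"program": "Program Y", "priority": "priority_2", "client_group": "individuals", "type": "grant"},
    {"program": "Program Z", "priority": "priority_1", "client_group": "communities", "type": "service"},
]

def synoptic_view(inventory, **tags):
    """Filter the inventory by any combination of tags (priority, client group, type, ...)."""
    return [
        entry["program"]
        for entry in inventory
        if all(entry.get(key) == value for key, value in tags.items())
    ]

# Cross-cutting question: which programs are tagged to the same government priority?
print(synoptic_view(program_inventory, priority="priority_1"))
# ['Program X', 'Program Z']
```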
New Evaluation Function
1. Indicators from any level of the old PAA can be used (linked) to the new core responsibilities
2. All programs will continue to be reviewed, but departments will not have to submit all performance reviews to TBS
3. Departments can prioritize evaluation coverage
4. Evaluation plans and evaluation results will have to be published
5. Each program in the Program Inventory should have a designated official responsible for establishing and maintaining a “performance information profile” and ensuring data collection – this is in addition to the “Chief Delivery Officer” (a sketch follows this list)
6. Implementation timeline
◦ Transition from PAA to new Results Framework: April – June 2016
◦ New “Policy on Results and Resource Alignment”: July 1, 2016
◦ Program inventories to be developed January – August 2017, approved by December 2017
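As a minimal sketch of what a “performance information profile” record might hold (the field names below are assumptions made for illustration; the policy does not prescribe this structure):

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical structure for a performance information profile; the field names
# are illustrative assumptions, not prescribed by the Policy on Results.
@dataclass
class PerformanceInformationProfile:
    program: str
    responsible_official: str                       # designated official for the program
    expected_results: List[str] = field(default_factory=list)
    indicators: List[str] = field(default_factory=list)
    data_sources: List[str] = field(default_factory=list)
    collection_frequency: str = "annual"

profile = PerformanceInformationProfile(
    program="Program X",
    responsible_official="Director, Program X",
    expected_results=["Intermediate outcome statement"],
    indicators=["% of clients served within the service standard"],
    data_sources=["administrative data", "client survey"],
)
print(profile.program, "-", len(profile.indicators), "indicator(s) tracked")
```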
Five Perennial Evaluation/Performance Challenges/Trade-offs
1. Evaluation and resource allocation: Linking evaluation to resource allocation – making it more than just a management function (pruning here and there)
2. Evaluation and “whole of government” strategies: Program evaluation tends towards the micro; whole-of-government priorities are macro
3. Evaluation and delivery/implementation: Measuring performance should inform implementation; it should be a way of assessing implementation success
4. Evaluation and control/guidance: Linking the center to departments and establishing reporting, while giving departments enough scope to make their own decisions
5. Evaluation and accountability: Coherently communicating a “results story” – beyond just numbers and logic models – to Parliament and the public
Canada’s new experiment is one of a long line of attempts to get the “right balance”
Australia: Capability Review Program
Background
◦ Launched in 2011
◦ Objectives:
◦ Agency Capability Assessment: Independent review
◦ Agency Capability Improvement: Work with agencies to improve
(they have to submit Action Plans that are monitored)
◦ Australian Public Service (APS)-wide Capability Building: Lessons
learned
◦ 25 reviews done to date
◦ Common model used for assessment, allowing a
common language of improvement across the APS
◦ Reviews are conducted on the basis of 39 questions across 10 areas
◦ External senior reviewers have included former
federal and state government agency heads and
private sector executives.
Group Discussion
Example: Guidance Questions on “Set Direction” (Leadership)
If you were developing Guidance Questions for Develop People (Leadership), Outcome Focused Strategy (Strategy), or Innovative Delivery (Delivery), what would they be?
Develop four questions in your group.
We will compare with the actual ones used in the Australian Capability Reviews.
Culture isn’t just one aspect of the game – it is the game.
In the end, an organisation is nothing more than the collective
capacity of its people to create value.
Lou Gerstner, IBM
APS Optimizing Employee Performance
Shift from Process to Conversations (Selections)
Enable Conversations and Measure Impact (Selections)
Thank You!
Prof. Leslie A. Pal
School of Public Policy and Administration
Carleton University
Ottawa, Canada
leslie.pal@carleton.ca
