Strategies for tackling differences:
Learning from evaluability assessments of horizontal initiatives
to prepare for evaluations of program activity architecture components

CES 2012 Conference, Halifax
Monday, May 14, 2012 | 15:15–16:45
Outline & Objective
 Objective
           Sharing lessons learned from conducting evaluability assessments of horizontal initiatives
           With a view to informing the conduct of evaluations of program activity architecture (PAA) components
 Context
 Overview – evaluation assessment projects: 
         Horizontal  – Genomics Research and Development Initiative (GRDI) 
         Horizontal  – Major Project Management Office Initiative (MPMOI)
         PAA‐level – Minerals & Metals, Markets, Investment & Innovation (MMMII)
 Key issues: 
   #1: Multiple stakeholders
   #2: Evaluation program theory

 Strategies and remaining challenges
 Questions?

Context
 The Treasury Board of Canada requires full evaluation coverage
 Federal evaluation plans increasingly include evaluations of program 
  activity architecture (PAA) components 
 PAA comprises  a wide range of activities beyond the individual 
  program level (sub, sub‐sub and sub‐sub‐sub levels)
 Departmental Performance Measurement Frameworks (PMFs) set 
  out the expected results and the performance measures to be 
  reported for all programs identified in the PAA components
 The indicators in the departmental PMFs are limited in number and 
  focus on supporting departmental monitoring and reporting
 PMFs are developed with a view to supporting effective evaluations
 Foreseeable opportunities for evaluators and for organizations
Context
 However, PAA-level evaluations pose significant challenges during evaluation design (e.g., linking outcomes across activities, initiatives, programs and organizations)
 Absence of pre‐existing program theory and integrated performance 
   measurement frameworks
 Evaluation program theory needs to go beyond linking RMAFs to PAA components
 Many PAA boxes have never been evaluated
 Evaluators need to spend more time discovering what's "inside the boxes"…
 Differences need to be identified, quantified and qualified across varied and often heterogeneous stakeholder groups
 Implications for evaluability assessment projects
Overview – evaluation assessment projects

Horizontal – GRDI
Goal:
• Build and maintain genomics research capacity in government departments.
• Protect and improve human health, develop new treatments for chronic and infectious diseases, protect the environment, and manage agricultural and natural resources in a sustainable manner.
• Support evidence-based decision-making, policy/standards/regulations development, as well as facilitate the development of Canadian commercial enterprises.

Horizontal – MPMOI
Goal:
• Address federal regulatory systemic issues and capacity deficiencies for major resource projects and provide interim capacity funding for Aboriginal consultation.
• Via the Major Project Management Office (housed at NRCan):
  i) provide overarching project coordination, management and accountability; and
  ii) undertake research and identify options that drive further performance improvements to the federal regulatory system.
Overview – evaluation assessment projects

Horizontal – GRDI
Departments & agencies:
• Agriculture and Agri-food
• Environment Canada
• Fisheries and Oceans
• Health Canada
• Public Health Agency
• National Research Council
• Natural Resources
Budget: $20 million/year
Management: Mgmt committees (ADM and working groups)
Main stakeholders:
• Mainly federal scientists
• Academic researchers
• Private sector
• Others

Horizontal – MPMOI
Departments & agencies:
• Canadian Environmental Assessment Agency
• Environment Canada
• Fisheries and Oceans
• Aboriginal Affairs and Northern Development Canada
• Transport Canada
• National Energy Board
• Canadian Nuclear Safety Commission
Budget: $30 million/year
Management: Mgmt committees (DM, ADM, DG, departmental committees and working groups)
Main stakeholders:
• Natural resource industries
• Aboriginal groups
• Environmental groups
• Federal organizations
• Provincial organizations
• Others
Overview – evaluation assessment projects

Horizontal – GRDI and Horizontal – MPMOI
Evaluation committee:
• Interdepartmental Evaluation Advisory Committee (IEAC)
  • Program staff + evaluation staff
  • Observers/stakeholders
Approach/methods:
• Data availability and quality assessment
• Document, file and data review
• Program rationale and profile
• Revision/adjustment of the logic model (MPMOI)
• Development of evaluation questions
• Individual consultations with working groups from each organization
• Half-day roundtable workshop
• Development of the Data Collection Matrix (DCM)
• Development of evaluation methodology and options
• Characterization of key information needed for the evaluation
• Final workshop for validation
Overview – evaluation assessment projects

PAA-level – MMMII
Goal: Minerals & Metals, Markets, Investment & Innovation (MMMII)
• Mining Scientific Research and Innovations (1.1.1.1)
• Socio-economic Minerals and Metals Research and Knowledge for Investments and Access to Global Markets (1.1.1.2)
Components:
• 2 entire Branches (7 divisions):
  • Minerals, Metals and Materials Knowledge Branch (MMMKB)
  • Minerals, Metals and Materials Policy Branch (MMMPB)
• Various programs/groups in 2 additional Branches:
  • CANMET Mining and Mineral Sciences Laboratories (MMSL)
  • CANMET Materials Technology Laboratory (MTL)
Budget: $20 million/year + specific funding for the relocation of MTL to Hamilton
Management: ADM-level overall and DG-level by Branch
Main stakeholders:
• NRCan & other federal organizations
• Provincial organizations
• International organizations
• Academic/research institutions
• Industries/private sector
• Canadian embassies/commissions
• Aboriginal groups and remote communities
• Non-governmental organizations and other interest groups
PAA – MMMII PAA Overview




[Figure: MMMII PAA overview diagram, highlighting the MTL relocation in Hamilton]
PAA – MMMII PAA Overview




[Figure: MMMII PAA components – MTL relocation in Hamilton; emerging materials/eco-material research projects; mineral extraction and processing research projects; domestic and int'l policy advice and events, sector planning; statistics and economic analysis products]
Main difference between horizontal and PAA-level projects?

Evaluation Advisory Committees

GRDI and MPMOI:
• Evaluation Advisory Committee (EAC)
  • Mixed-team (internal and external) evaluators
  • Key program management staff
• Interdepartmental Evaluation Advisory Committee (IEAC)
  • Mixed-team (internal and external) evaluators
  • 1 internal evaluator from each participating department/agency
  • 1–2 program staff from each participating department/agency

MMMII:
• No committee
  • Ad hoc targeted consultations for information requests and feedback/input
Issue #1 – Multiple stakeholders

Having multiple stakeholders poses significant challenges:
 “Patchy” internal and external stakeholder identification for:
    •   Risk‐based planning and design of the evaluation
    •   Consultation during the evaluation assessment/planning stage
    •   Data collection during the evaluation project (interviews, survey)
   “Dispersed” key information and data needed for the evaluation
    •   Different Departments/Branches have their own information
    •   Little data at the level of the PAA sub‐activities (PAA specific)
    •   Difficult to roll‐up data at the program/PAA level (e.g. financial)
 Variable level of expectation/understanding of the evaluation process
 Variable level of engagement
     •   Less engagement when the evaluation scope covers entire branches/divisions/projects, including activities not tied to funding renewal
     •   The salience of the process is lower and input is mainly voluntary (potentially affecting the quality and timeliness of the evaluation)
Issue #2 – Evaluation program theory at PAA level

The design of PAA‐level evaluations poses significant challenges:
 Lack of coherent program theory at PAA‐level
     •   Program theory is the backbone of the evaluation
      •   PAA theory is mainly at the strategic level, as defined in the PMF
     •   Frequent absence of PAA‐level RMAF (or RMAF used differently by 
         different units)
     •   Absence of Logic Model and underlying program theory
   Lack of shared understanding and vision of PAA logic
      • Each unit has its own vision of its contribution to the PAA strategic objectives
      • Lack of awareness and different levels of understanding of the contribution of other units
      • Potential tension when it comes to defining the individual and collective contribution to the PAA strategic objectives
Strategy #1 – Well-structured and engaged evaluation committee

A committee created at the planning stage has proven effective to:
 Bring together two communities of practice (program and evaluation)
 Facilitate the identification of internal and external stakeholders
 Ensure that all stakeholders agree on the evaluation process
 Manage expectations and consider individual needs with respect to:
   • How findings will be presented in the evaluation report
   • Who will be accountable for responding to recommendations
   • Other concurrent activities (funding renewals, audits, etc.)
 Determine the data requirements for the evaluation
 Facilitate buy-in on the development of evaluation method options
 Increase the level of awareness in participating organizations, which facilitates consultation during the evaluation
Strategy #2 – Participatory logic model and program theory development and review

In close collaboration with the committee (and consultation of individual groups/units):
 Revisit the PAA logic at the strategic level and translate it into program theory that works at the operational level
 Translate the theory into a PAA logic model and validate it
 Discuss up front how relevance and efficiency/economy will be evaluated
 Design specific evaluation questions and validate them
 Design a preliminary evaluation approach and indicators (data collection matrix) and validate them
 Test methods and indicators against available data and other factors
 Reconvene all committee members at the end to fully validate the selected approach (e.g., half-day roundtable workshop)
 Ensure that the outcome of the consultation is captured/disseminated
Remaining challenges




[Figure: diagram of PAA-level evaluation challenges, including instability]
Remaining challenges

 PAA‐level heterogeneity 
   • Mixed bag of organizations, branches, divisions, programs, projects and 
     initiatives with highly diverse:
       • Values and culture
       • Business processes/settings 
       • Stakeholder groups
    • Outlier components are always present
       • Inconsistent rationale for inclusion/exclusion of PAA components 
       • Included on the basis of funding/special allocations and/or politics
       • Not always aligned with PAA strategic objectives/PAA logic
 PAA‐level instability 
   • Changes during the evaluation project and evaluation period (and 
     beyond)
    • Multiplies the challenges of assessing relevance and efficiency/economy
Remaining challenges

 PAA‐level connectivity
  • Artificial/conceptual connections across PAA components
  • Underestimated connections with other PAA components/contribution of 
       other PAAs
 Other challenges 
   •    PAA‐level recommendations
        • Acceptability (what is in it for me?)
        • Accountability (who is accountable?)
         • Actionability (how will they be put into action?)

   •    PAA‐level stakeholder fatigue
        • Key staff responsible for components under several PAAs are involved in 
          multiple, concurrent evaluations




Take‐home message

When evaluating at the PAA level, evaluators often spend too much time discovering what's "inside the boxes"…


                   … please consider using an 
              evaluation advisory committee!




Thank you for your time and feedback

CONTACT INFO

Frédéric Bertrand, MSc CE
Vice-President, Evaluation | Science-Metrix
514-495-6505 x117
frederic.bertrand@science-metrix.com

Michelle Picard-Aitken, MSc
Senior Research Analyst | Science-Metrix
514-495-6505 x125
michelle.picard-aitken@science-metrix.com

Andrea Ventimiglia, BSc MJ
Research Analyst | Science-Metrix
514-495-6505 x124
andrea.ventimiglia@science-metrix.com

ACKNOWLEDGEMENT
Julie Caruso, MLIS
Research Analyst | Science-Metrix

Science-Metrix
1335, Mont-Royal Est
Montreal, Quebec H2J 1Y6
Telephone: 514-495-6505
Fax: 514-495-6523
Email: info@science-metrix.com

WEB SITE
www.science-metrix.com

Questions?
