CES 2013 Conference, Toronto
Tuesday, June 11, 2013 | 14:00-15:30
Let’s compare!
Practical perspectives on the use of an international
comparative design
Outline
 Background and objective
 Value of international comparative analysis in different
contexts
 Program evaluation
 Performance measurement and other studies
 Challenges and planning/design considerations
 Scope & time
 Risks
 International comparisons in the real world
 Practical tips and tools
 Questions and discussion
Background and objective
 International comparative analysis is an approach that seeks to
enhance the results of an evaluation or performance measurement
study by examining different program models and alternatives
 Can take the form of reviews, case studies, benchmarking, etc.
 Data collection via both document/data reviews and consultations
 Can be used to address evaluation questions of relevance and
performance, or to inform study/program design
 Requires a common frame of reference or grounds for
comparison, but must consider the local context
 Objective: discuss the rationale and key practical considerations to
ensure the successful implementation of this approach
 Based on experience in several projects across a variety of contexts
Value of international comparative analysis in
different contexts
Context 1 – Program Evaluation
 To address Relevance issues
 Helps position questions of ongoing need in the global context, and
examines the responsiveness of other countries
 Provides perspective on the appropriateness of the federal
role, especially if there are no comparable or relevant local options
 To address Performance issues
 Highlights variations in program design, implementation and practices, and relates them to the achievement of outcomes
 Benchmark/Reference point for achievement of outcomes
 Benchmark/Reference point to demonstrate efficiency and economy
 Can probe the reasons for differences
 Can validate similar results
Value of international comparative analysis in
different contexts
Context 2 – Performance measurement and other studies
 To inform best practices
 What works? What doesn’t?
 What’s transferable across organizations?
 To aid with benchmarking exercises
 Identify common indicators, measurement or reporting strategies
 To help build linkages, networks, community of practice
 Establish consensus on definitions, standards, terminology
 Develop basis for future collaboration and sharing of data/practices
 Increase feasibility of future, more in-depth benchmarking
Challenges and planning considerations
International comparative analysis always presents a scope and time
challenge!
 Avoid “inflation” (too many
countries, organizations, indicators, etc.)
 Clearly define the scope for your international comparison upfront
 Create a focused data collection template – know what you want to collect and why (see the sketch after this list)
 Identify key individuals/roles to contact – know who you want to speak
to and why
 Leave yourself enough time to initiate contact, establish
relationships, get approvals, etc. – expect the unexpected
 Allow 3 months minimum for fieldwork
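To make the "focused data collection template" point concrete, here is a minimal sketch (not from the original presentation) assuming the template is kept as a small Python structure so every comparator country is captured with identical fields; all field names are illustrative only.

    # Illustrative sketch only: a data collection template defined once and
    # reused for every comparator country so the evidence stays comparable.
    # All field names are hypothetical examples, not the presenters' template.

    TEMPLATE_FIELDS = [
        "country",
        "organization",
        "program_model",          # brief description of design/delivery model
        "annual_budget",          # note currency and fiscal year
        "key_outcomes_reported",  # as reported by the comparator
        "data_sources",           # documents, databases, interviews
        "context_caveats",        # local/legislative context to keep in mind
    ]

    def new_record(country):
        """Return an empty record so each country is captured the same way."""
        record = {field: None for field in TEMPLATE_FIELDS}
        record["country"] = country
        return record

    records = [new_record(c) for c in ("Canada", "Australia", "United Kingdom")]

Keeping the field list in one place makes gaps per country easy to spot and discourages "inflation" of indicators partway through fieldwork.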
Challenges and planning considerations
International comparative analysis always comes with risks!
 Data risks
 Confidentiality issues
 Language barriers
 No quality control over what you get
 Minimize data risks
 Include confidentiality statements/practices and implement procedures
to prevent release of confidential information
 Communicate data needs early
 Use data collection templates to ensure comparable data is obtained
across countries
Challenges and planning considerations
International comparative analysis always comes with risks!
 Ability to compare (usefulness)
 Especially for economy/efficiency questions, comparable data may not be found
 Halfway through data collection, you may find a better comparator
 Other limitations
 Evaluators generally have a deeper understanding of the local (Canadian) context and program – it is difficult to attain the same level for the comparator(s)
 Be wary of asking too much of the participants!
International comparisons in the real world
Project 1 | Project 2 | Project 3
Type/context: Program evaluation | Program evaluation | Performance measurement
Subject matter: Public health | Public safety | Technology development
Community of practice: Public health officials (policy & data specialists) | Government regulators and scientists | Performance measurement specialists
# of countries (continents): 5 (3) | 2 (2) | 6 (4)
Key issues examined: Relevance | Performance – achievement of outcomes and efficiency | Performance
Timeframe: 5 months | 5 months | 3 months
International comparisons in the real world
Project 1 | Project 2 | Project 3
What worked – scope/time: Focused on relevance (less need for quantitative data), sufficient time | Well-defined scope, sufficient time | Sustained client communication to address issues
What worked – engagement: Program staff helped identify contacts in some countries | Pre-existing community of practice, easier to identify/engage contacts | Counterparts trying to address similar challenges, interested in ongoing interaction
What worked – comparability: Several common needs/features, national role in other federated countries | Similar activities/outputs, standard data on regulatory aspects | Common needs and basic processes
International comparisons in the real world
Project 1 | Project 2 | Project 3
What didn’t work – scope/time: Scope creep, collected more information than necessary | – | Too many criteria & countries, shift in objectives, insufficient time
What didn’t work – complexity: Several types of activities, multiple organizations per country, technical elements (data sharing, IT systems) | 2 different types of activities (regulatory & scientific) | Wide variety of processes and practices, prohibitive level of detail needed to define performance indicators
What didn’t work – comparability: Key differences in context (historical and legislated roles) | Scientific role and activities not comparable (lack of data) | Scope and level of complexity only allowed approximate results
International comparisons in the real world
Project 1 | Project 2 | Project 3
Key results: Systematic description of common features; identified best practices; developed conceptual role framework | Quantitative assessment of efficiency (regulatory side) | Identified models and practices of value/interest; collected novel insights, non-public data/documents
Actual use: Modest use of results as line of evidence in evaluation report, potential use of shared material by counterparts | Key line of evidence in evaluation report, supported robust conclusions | Next steps not pursued due to organizational changes
Practical tips and tools
Promote engagement
 Leverage existing program contacts and relationships
 Establish contact via senior program representatives (pre-contact letter)
 Develop initial relationship through one key contact in each
country/organization – they will help open doors and get access to data
 Don’t forget “WIFM” (what’s in it for me) – if possible, offer to share
results of analysis
 WARNING! This will require additional work, so encourage program
to take this on or make sure to include time for this
Be flexible for consultations and know your time zones!
 Expect early mornings (Europe) and late nights (Australia/Asia)
 Use meeting planner tools to navigate time zones (timeanddate.com) – see the sketch after this list
 Cross-boundary realities apply – expect cultural differences
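As a small illustration of the time-zone point above, here is a minimal sketch (not part of the original slides) assuming Python 3.9+ and its standard-library zoneinfo module; the cities and the proposed call time are examples only.

    # Illustrative sketch only: show what a proposed call time looks like in
    # each comparator's time zone before sending out invitations.
    from datetime import datetime
    from zoneinfo import ZoneInfo  # standard library in Python 3.9+

    zones = {
        "Montreal": ZoneInfo("America/Montreal"),
        "London": ZoneInfo("Europe/London"),
        "Sydney": ZoneInfo("Australia/Sydney"),
    }

    # Example: a 07:00 call in Montreal is midday in London and evening in Sydney.
    call = datetime(2013, 6, 11, 7, 0, tzinfo=zones["Montreal"])

    for city, tz in zones.items():
        print(f"{city}: {call.astimezone(tz):%a %H:%M}")

The same check works for any candidate time slot, which helps avoid scheduling a seemingly reasonable meeting that lands at 03:00 for one of the counterparts.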
Questions & Discussion
Please share your experiences and practical tips!
Thank you for your time and feedback
CONTACT INFO
Michelle Picard-Aitken, MSc
Senior Research Analyst | Science-Metrix
514-495-6505 x125
m.picard-aitken@science-metrix.com
Andréa Ventimiglia, BSc MJ
Research Analyst | Science-Metrix
514-495-6505 x124
andrea.ventimiglia@science-metrix.com
Frédéric Bertrand, MSc CE
Vice-President, Evaluation | Science-Metrix
514-495-6505 x117
frederic.bertrand@science-metrix.com
Science-Metrix
1335, Mont-Royal E.
Montreal, Quebec H2J 1Y6
Telephone: 514-495-6505
Fax: 514-495-6523
E-mail: info@science-metrix.com
WEB SITE
www.science-metrix.com
Questions?