Let’s compare! Practical perspectives on the use of an international comparative design
 


Used appropriately and carefully, international comparisons (reviews, case studies, etc.) can inform the design of your evaluation or performance measurement study, engage a broad range of stakeholders, and greatly add value to your findings and recommendations.
Drawing on experience with several such approaches in evaluations covering public safety, health surveillance, environmental assessment, and technology development, this presentation will discuss the rationale and key practical considerations to ensure the successful implementation of an international comparative design.
Specifically, the presentation will review when to use these methods (advantages/disadvantages), and provide concrete tools and tips to overcome common challenges. It will also discuss how to facilitate engagement and collaboration for both the subject matter community and the evaluation and performance management community, within Canada and across borders.

Presentation Transcript

    • Slide 1 – Let’s compare! Practical perspectives on the use of an international comparative design
      CES 2013 Conference, Toronto
      Tuesday, June 11, 2013 | 14:00–15:30
    • Slide 2 – Outline
      - Background and objective
      - Value of international comparative analysis in different contexts
        - Program evaluation
        - Performance measurement and other studies
      - Challenges and planning/design considerations
        - Scope & time
        - Risks
      - International comparisons in the real world
      - Practical tips and tools
      - Questions and discussion
    • Slide 3 – Background and objective
      - International comparative analysis is an approach that seeks to enhance the results of an evaluation or performance measurement study by examining different program models and alternatives
        - Can take the form of reviews, case studies, benchmarking, etc.
        - Data collection via both document/data reviews and consultations
        - Can be used to address evaluation questions of relevance and performance, or to inform study/program design
        - Requires a common frame of reference or grounds for comparison, but must consider the local context
      - Objective: discuss the rationale and key practical considerations to ensure the successful implementation of this approach
        - Based on experience in several projects across a variety of contexts
    • Slide 4 – Value of international comparative analysis in different contexts
      Context 1 – Program evaluation
      - To address relevance issues
        - Helps position questions of ongoing need in the global context, and examines the responsiveness of other countries
        - Provides perspective on the appropriateness of the federal role, especially if there are no comparable or relevant local options
      - To address performance issues
        - Highlights variations in program design, implementation, and practices, and relates them to the achievement of outcomes
        - Provides a benchmark/reference point for the achievement of outcomes
        - Provides a benchmark/reference point to demonstrate efficiency and economy
        - Can probe the reasons for differences
        - Can validate similar results
    • Slide 5 – Value of international comparative analysis in different contexts
      Context 2 – Performance measurement and other studies
      - To inform best practices
        - What works? What doesn’t? What’s transferable across organizations?
      - To aid with benchmarking exercises
        - Identify common indicators, measurement or reporting strategies
      - To help build linkages, networks, and a community of practice
        - Establish consensus on definitions, standards, and terminology
        - Develop a basis for future collaboration and sharing of data/practices
        - Increase the feasibility of future, more in-depth benchmarking
    • Slide 6 – Challenges and planning considerations
      International comparative analysis always presents a scope and time challenge!
      - Avoid “inflation” (too many countries, organizations, indicators, etc.)
      - Clearly define the scope of your international comparison up front
      - Create a focused data collection template – know what you want to collect and why (a sketch follows this slide)
      - Identify key individuals/roles to contact – know who you want to speak to and why
      - Leave yourself enough time to initiate contact, establish relationships, get approvals, etc. – expect the unexpected
        - Allow 3 months minimum for fieldwork
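    The deck does not prescribe a format for the data collection template, so the following is only an illustrative sketch: the field names, file name, and sample record are hypothetical, not taken from the presentation. The idea it demonstrates is fixing a single schema up front so that every country’s data arrives in the same structure.

    ```python
    import csv
    from dataclasses import dataclass, asdict, fields

    @dataclass
    class ComparisonRecord:
        """One row of a focused data collection template (hypothetical fields)."""
        country: str        # comparator country
        organization: str   # program or agency examined
        indicator: str      # exactly what is being collected, and why it is in scope
        value: str          # reported value or qualitative finding
        source: str         # document, dataset, or interviewee
        collected_on: str   # ISO date, to track fieldwork progress
        notes: str = ""     # local-context caveats that affect comparability

    def write_template(path: str, records: list[ComparisonRecord]) -> None:
        """Write records to CSV so every country is captured in the same structure."""
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(
                f, fieldnames=[fld.name for fld in fields(ComparisonRecord)]
            )
            writer.writeheader()
            writer.writerows(asdict(r) for r in records)

    # One record per country/indicator pair keeps the scope visible at a glance
    # and makes "inflation" (extra countries or indicators) easy to spot.
    write_template("comparison_template.csv", [
        ComparisonRecord("Australia", "Example Health Agency", "annual program budget",
                         "pending", "2012 annual report", "2013-04-02"),
    ])
    ```

    Committing to the schema before fieldwork starts forces the “know what you want to collect and why” decision early, and it supports the comparability across countries that the next slide flags as a risk.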
    • Slide 7 – Challenges and planning considerations
      International comparative analysis always comes with risks!
      - Data risks
        - Confidentiality issues
        - Language barriers
        - No quality control over what you get
      - Minimizing data risks
        - Include confidentiality statements/practices and implement procedures to prevent the release of confidential information
        - Communicate data needs early
        - Use data collection templates to ensure comparable data is obtained across countries
    • Slide 8 – Challenges and planning considerations
      International comparative analysis always comes with risks!
      - Ability to compare (usefulness)
        - Especially for economy/efficiency questions, comparable data may not be found
        - Halfway through data collection, you may find a better comparator
      - Other limitations
        - Evaluators generally have a higher comprehension of the local (Canadian) context and program – it is difficult to attain the same level for the comparator(s)
        - Be wary of asking too much of the participants!
    • Slide 9 – International comparisons in the real world
      Project 1
      - Type/context: program evaluation
      - Subject matter: public health
      - Community of practice: public health officials (policy & data specialists)
      - # of countries (continents): 5 (3)
      - Key issues examined: relevance
      - Timeframe: 5 months
      Project 2
      - Type/context: program evaluation
      - Subject matter: public safety
      - Community of practice: government regulators and scientists
      - # of countries (continents): 2 (2)
      - Key issues examined: performance – achievement of outcomes and efficiency
      - Timeframe: 5 months
      Project 3
      - Type/context: performance measurement
      - Subject matter: technology development
      - Community of practice: performance measurement specialists
      - # of countries (continents): 6 (4)
      - Key issues examined: performance
      - Timeframe: 3 months
    • Slide 10 – International comparisons in the real world (what worked)
      Project 1
      - Scope/time: focused on relevance (less need for quantitative data), sufficient time
      - Engagement: program staff helped identify contacts in some countries
      - Comparability: several common needs/features, national role in other federated countries
      Project 2
      - Scope/time: well-defined scope, sufficient time
      - Engagement: pre-existing community of practice, easier to identify/engage contacts
      - Comparability: similar activities/outputs, standard data on regulatory aspects
      Project 3
      - Scope/time: sustained client communication to address issues
      - Engagement: counterparts trying to address similar challenges, interested in ongoing interaction
      - Comparability: common needs and basic processes
    • Slide 11 – International comparisons in the real world (what didn’t work)
      Project 1
      - Scope/time: scope creep, collected more information than necessary
      - Complexity: several types of activities, multiple organizations per country, technical elements (data sharing, IT systems)
      - Comparability: key differences in context (historical and legislated roles)
      Project 2
      - Scope/time: —
      - Complexity: 2 different types of activities (regulatory & scientific)
      - Comparability: scientific role and activities not comparable (lack of data)
      Project 3
      - Scope/time: too many criteria & countries, shift in objectives, insufficient time
      - Complexity: wide variety of processes and practices, prohibitive level of detail needed to define performance indicators
      - Comparability: scope and level of complexity only allowed approximate results
    • Slide 12 – International comparisons in the real world (key results and actual use)
      Project 1
      - Key results: systematic description of common features; identified best practices; developed conceptual role framework
      - Actual use: modest use of results as a line of evidence in the evaluation report; potential use of shared material by counterparts
      Project 2
      - Key results: quantitative assessment of efficiency (regulatory side)
      - Actual use: key line of evidence in the evaluation report, supported robust conclusions
      Project 3
      - Key results: identified models and practices of value/interest; collected novel insights, non-public data/documents
      - Actual use: next steps not pursued due to organizational changes
    • Slide 13 – Practical tips and tools
      Promote engagement
      - Leverage existing program contacts and relationships
      - Establish contact via senior program representatives (pre-contact letter)
      - Develop the initial relationship through one key contact in each country/organization – they will help open doors and get access to data
      - Don’t forget “WIIFM” (what’s in it for me) – if possible, offer to share the results of the analysis
        - WARNING! This will require additional work, so encourage the program to take this on, or make sure to include time for it
      Be flexible for consultations and know your time zones!
      - Expect early mornings (Europe) and late nights (Australia/Asia)
      - Use meeting planner tools to navigate time zones (timeanddate.com); a scripted alternative is sketched after this slide
      - Cross-boundary realities apply – expect cultural differences
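    The slide points to meeting planner tools such as timeanddate.com. For those who prefer to script the same check, here is a minimal sketch using Python’s standard zoneinfo module; the cities and the proposed meeting time are examples only, not taken from the presentation.

    ```python
    from datetime import datetime
    from zoneinfo import ZoneInfo  # standard library, Python 3.9+

    # Example counterpart locations; swap in the zones for your comparator countries.
    counterparts = {
        "Ottawa (home)": "America/Toronto",
        "London": "Europe/London",
        "Paris": "Europe/Paris",
        "Canberra": "Australia/Sydney",
    }

    # A proposed meeting time, expressed in the home time zone.
    meeting = datetime(2013, 6, 11, 14, 0, tzinfo=ZoneInfo("America/Toronto"))

    # Print each participant's local wall-clock time, to spot the early mornings
    # (Europe) and late nights (Australia/Asia) before sending the invitation.
    for city, zone in counterparts.items():
        local = meeting.astimezone(ZoneInfo(zone))
        print(f"{city:14} {local:%a %H:%M}")
    ```

    A 14:00 meeting in Ottawa, for instance, lands in the evening in western Europe and in the very early morning in eastern Australia – exactly the trade-off the slide warns about.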
    • Slide 14 – Questions & discussion
      Please share your experiences and practical tips!
    • Slide 15 – Thank you for your time and feedback
      Contact info:
      Michelle Picard-Aitken, MSc
      Senior Research Analyst | Science-Metrix
      514-495-6505 x125 | m.picard-aitken@science-metrix.com

      Andréa Ventimiglia, BSc MJ
      Research Analyst | Science-Metrix
      514-495-6505 x124 | andrea.ventimiglia@science-metrix.com

      Frédéric Bertrand, MSc CE
      Vice-President, Evaluation | Science-Metrix
      514-495-6505 x117 | frederic.bertrand@science-metrix.com

      Science-Metrix
      1335 Mont-Royal E., Montreal, Quebec H2J 1Y6
      Telephone: 514-495-6505 | Fax: 514-495-6523
      E-mail: info@science-metrix.com
      Web site: www.science-metrix.com

      Questions?