Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework
Presentation for CES Toronto 2013 Evaluation Conference by Dr. Nancy Carter & Rob Chatwin, Nova Scotia Health Research Foundation

Presentation Transcript

  • Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework Dr. Nancy Carter & Rob Chatwin Nova Scotia Health Research Foundation CES National Conference June 9, 2013 Toronto, Ontario Canada
  • Acknowledgements
  • Workshop Objectives: understanding of the basic concepts of impact evaluation; understanding of the CAHS framework; opportunity to apply the framework
  • Why Evaluate?
  • Different Evaluation Needs: Evaluation for Accountability; Evaluation for Advocacy; Evaluation for Learning; Comprehensive Evaluation = Our Target. External audience; mission linked; identify ‘best’
  • Impact evaluation assesses the changes, both intended and unintended, that can be attributed to a particular intervention. Impact evaluations are structured to answer the question: How would outcomes such as participants’ well-being have changed if the intervention had not been undertaken? They seek to answer cause-and-effect questions and look for the changes in outcomes that are directly attributable to a program / intervention / policy
  • What contextual factors matter when evaluating your program? How do you define impact in your context?
  • CONTRIBUTION
  • COUNTERFACTUAL
  • Making an Impact A Preferred Framework and Indicators to Measure Returns on Investment in Health Research Full Report available at http://www.cahs-acss.ca/e/assessments/completedprojects.php Canadian Academy of Health Sciences Académie canadienne des sciences de la santé
  • Why ROI in Health Research?
  • The Challenge: Is there a “best way” (method) to evaluate the impacts of health research in Canada, and are there ‘best metrics’ for assessing those impacts (or improving them)? Any approach should be useful to a full range of funders / research types; compatible with what is already in place in Canada; transferable to international comparisons; and able to identify the full spectrum of potential impacts
  • The Impact Framework
  • Results Chain Logic Framework CAHS Model Logic Framework
  • Impact Logic Frame: Activities produce results, that influence decision making in…, that affect something…, that contribute to something…
  • Indicators
  • Impact Categories (CAHS Framework): Advancing Knowledge; Research Capacity Building; Informing Decision Making; Health Impact; Broad Economic & Social Impacts
  • USING THE CAHS FRAMEWORK AND INDICATORS
  • [Diagram: the CAHS impact framework. Canadian health research (biomedical, clinical, health services, population and public health, cross-pillar) draws on the global knowledge pool and produces research results that flow through advancing knowledge, capacity building, and informing decision making (government, health industry, other industries, the public and public groups, the research agenda) to prevention and treatment, healthcare appropriateness and access, determinants of health, and ultimately improvements in health and well-being, health status, function, and economic and social prosperity. Research capacity impacts feed back into inputs for future research.]
  • Appropriate Use of the Framework: understanding the logic model and impact categories. The framework traces Health R&D → Primary outputs / dissemination → Secondary outputs → Adoption → Final outcomes across the impact categories:
Advancing Knowledge — new molecular technique developed; publication of research results in a journal.
Research Capacity — research PhD gained by team member; further research in industry.
Informing Decision Making — discussions between researchers and pharma define direction of pharma research; pharma company initiates research program to develop a drug; drug developed and passed for use by the health system.
Health Impacts — adoption by the health system causes increased cost of drugs; decreased re-admission for condition; reduced condition burden in the population.
Broad Economic and Social Impacts — employment in the pharma company; sales of drugs by pharma; improved workplace productivity; social and economic benefit of “wellness”.
  • Indicators vs Metrics
  • FABRIC Focused Appropriate Balanced Robust Integrated Cost-effective
  • AND INDICATOR
  • Application of the Framework
  • [Diagram: CAHS / NSHRF Impact Evaluation Framework. Nova Scotia health research (bio-medical; policy, services, outcomes; partnership programs; student programs) produces results — funded research, grants and awards; partnerships and collaborations; workshops, presentations and events; learning opportunities; information products and reports; public awareness — that influence decision making in the Nova Scotia Government (DHW), District Health Authorities (DHAs), universities, and the public; that affect healthcare, health risk factors, and other health determinants; and that contribute to changing health, well-being, and economic and social prosperity: improved health of Nova Scotians, societal and economic improvements, a vibrant research community, health research excellence, and a strong health research enterprise. Impacts link back into inputs for future research. External influences: interests, traditions, technical limitations, political dynamics.]
  • Exercise
  • Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework PARTICIPANT WORKBOOK Prepared by Dr. Nancy Carter and Mr. Rob Chatwin in collaboration with the National Alliance of Provincial Health Research Organizations – Impact Analysis Group (NAPHRO-IAG)
  • Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework CES Conference – June 9, 2013 INTRODUCTION This workbook is intended to help you apply your learning to your context. It is aligned to the slide presentation, and we encourage you to jot down notes and thoughts as we go through the workshop. This workshop would not have been possible without the support of members of the NAPHRO-IAG, including representatives from: Newfoundland and Labrador, Manitoba, Nova Scotia, Saskatchewan, Quebec, Alberta, Ontario, and British Columbia. WORKSHOP OBJECTIVES The purpose of this workshop is to share our learning in demonstrating impact in a health research environment; with contextualizing, the framework can be used across sectors. Objectives: understanding of the basic concepts of impact evaluation; understanding of the Canadian Academy of Health Sciences (CAHS) framework; opportunity to apply the framework to your context. The workshop links to the CES Competencies for Canadian Evaluation Practice: Technical Practice and Situational Practice. Notes:
  • Why Evaluate Impact? There are many reasons to evaluate. Whether you are working in a program that is being evaluated or evaluating policy on the environment, it is important to know why you are evaluating. Question: Think about your context. If you were to conduct an evaluation, why would you be doing it? Write down your thoughts. Programs, services and policies are evaluated for many reasons; each has its own reason for being, and the reasons to evaluate vary accordingly. Generally, reasons for evaluating can be summarized into three categories: Learning, Advocacy, Accountability
  • Example: NAPHRO-IAG is focused on assessing the impact of funded health research provincially so that, nationally, NAPHRO can: advocate for the health research enterprise in Canada and the provinces / territories; account for our services; and learn the best ways to achieve our missions. Question: Think about your organization – What is/are the evaluation need(s)? Who owns those needs? Write down your thoughts… Types of questions that can be asked related to the need for evaluation are listed below. Accountability: Are we having the impact we said we would have? Learning: Can our resources be used for greater impact? Advocacy: Why is our impact important? With clarity on why you are evaluating, thought must be given to the different ways / methods that can be used to evaluate the outcomes or impact of the activities. Evaluation Methods / Approaches / Tools Econometric approaches and methods apply mathematics, statistical methods and, more recently, computer science to economic data; econometrics is the branch of economics that aims to give empirical content to economic relations. Introductory economics textbooks have described econometrics as allowing economists to sift through mountains of data to extract simple relationships. Econometrics is the unification of economics, mathematics, and statistics, and this unification produces more than the sum of its parts: it adds empirical content to economic theory, allowing theories to be tested and used for forecasting and policy evaluation.
  • Bibliometric approaches and methods quantitatively analyze scientific and technological literature. The term was coined by Alan Pritchard in a 1969 paper titled Statistical Bibliography or Bibliometrics?, in which he defined it as "the application of mathematics and statistical methods to books and other media of communication". Citation analysis and content analysis are commonly used bibliometric methods. While bibliometric methods are most often used in the field of library and information science, they have wide applications in other areas; in fact, many research fields use bibliometric methods to explore the impact of their field, of a set of researchers, or of a particular paper. Bibliometrics are used to quantify research impact. Performance measurement approaches and methods cover the process of collecting, analyzing and/or reporting information regarding the performance of an individual, group, organization, system or component. This can involve studying processes / strategies within organizations, or studying engineering processes / parameters / phenomena, to see whether outputs are in line with what was intended or should have been achieved. Question: What performance measures are used in your context? Logic model approaches and methods — a logic model, also known as a logical framework, theory of change, or program matrix, is a tool used most often by managers and evaluators of programs to describe the theory underlying a program. Logic models are usually a graphical depiction of the logical relationships between the resources, activities, outputs and outcomes of a program. While logic models can be presented in many ways, the underlying purpose of constructing one is to assess the "if-then" (causal) relationships between the elements of the program.
For example: if the resources are available for a program, then the activities can be implemented; if the activities are implemented successfully, then certain outputs and outcomes can be expected.
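The "if-then" reading above can be sketched in a few lines of code. A minimal illustration, assuming a hypothetical training program; the element names are invented for the example, not drawn from any particular framework:

```python
# A program logic model as an ordered chain of elements. Reading
# adjacent pairs gives the "if-then" (causal) links the text describes.
# All element names below are hypothetical.

LOGIC_MODEL = [
    ("resources",  "Funding and staff are available"),
    ("activities", "Workshops are delivered"),
    ("outputs",    "Participants are trained"),
    ("outcomes",   "Practice changes in participating organizations"),
]

def narrate(chain):
    """Render the causal 'if-then' reading of a logic model chain."""
    lines = []
    for (_, cause), (_, effect) in zip(chain, chain[1:]):
        lines.append(f"IF {cause}, THEN {effect}.")
    return lines

for line in narrate(LOGIC_MODEL):
    print(line)
```

Each link in the chain is also where an evaluation question can attach: did the "then" actually follow from the "if"?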
  • Theory of Change defines all the building blocks required to bring about a given long-term goal. This set of connected building blocks – interchangeably referred to as outcomes, results, accomplishments, or preconditions – is depicted on a map known as a pathway of change / change framework, which is a graphic representation of the change process. A Theory of Change would not be complete without an articulation of the assumptions that stakeholders use to explain the change process represented by the change framework. Assumptions explain both the connections between early, intermediate and long-term outcomes and the expectations about how and why proposed interventions will bring them about. Often, assumptions are supported by research, strengthening the case to be made about the plausibility of the theory and the likelihood that stated goals will be accomplished. Stakeholders value theories of change as part of program planning and evaluation because they create a commonly understood vision of the long-term goals, how they will be reached, and what will be used to measure progress along the way. A Theory of Change is a specific and measurable description of a social change initiative that forms the basis for strategic planning, ongoing decision-making and evaluation. The methodology used to create a Theory of Change is also usually referred to as Theory of Change, or the Theory of Change approach or method; so when you hear or say “Theory of Change”, you may mean either the process or the result. Like any good planning and evaluation method for social change, it requires participants to be clear on long-term goals, identify measurable indicators of success, and formulate actions to achieve goals. A Theory of Change provides a roadmap to get you from here to there.
If it is good and complete, your roadmap can be read by others and shows that you know how to chart your course. This is helpful with constituents, staff, partner organizations and funders. More importantly, if it is good and complete, you have the best chance of making the change in the world you set out to make and of demonstrating your successes and your lessons along the way. A theory of change should do the following: set out the underlying logic of the intervention, specifying each link in the theoretically predicted causal chain; outline the planned program inputs, activities, expected outputs and desired intermediate and final outcomes; include possible spill-over effects, both positive and negative; and list the potential program participants and all other affected persons, along with the timelines involved and any indicators being used to monitor change. Question: What is your program’s theory of change?
  • Implementation evaluation approaches and methods are a form of evaluation that focuses on what happens in a program as it is delivered and documents the extent to which interventions are being implemented as intended. Balanced scorecards The balanced scorecard (BSC) is a strategy performance management tool – a semi-standard structured report that programs can use to keep track of staff activities and to monitor the consequences arising from those actions. The BSC concept, as put forth by Drs. Robert S. Kaplan and David P. Norton, is now seen as helping organizations articulate strategy in actionable terms. It provides a road map for strategy execution, for mobilizing and aligning executives and employees, and for making strategy a continual process. Case studies are a descriptive or explanatory analysis of a person, group or event. An explanatory case study is used to explore causation in order to find underlying principles. Case studies may be prospective (in which criteria are established and cases fitting the criteria are included as they become available) or retrospective (in which criteria are established for selecting cases from historical records for inclusion in the study). Another suggestion is that the case study should be defined as a research strategy: an empirical inquiry that investigates a phenomenon within its real-life context. Case study research can mean single and multiple case studies, can include quantitative evidence, relies on multiple sources of evidence, and benefits from the prior development of theoretical propositions. Question: What case studies come to mind for demonstrating impact in your program? Summary: It is vitally important that evaluators understand the importance of context and the reason an evaluation is taking place.
In health research, the environment is changing, and there is a need for Provincial Health Research Organizations to understand (learning), communicate (advocacy), and report (accountability) in order to continue to support the health research enterprise provincially and nationally. The reason NAPHRO organizations are evaluating is to assess the impact our funding has had / is having on the socio-economic wellbeing of our provincial populations – Canadians. We are using multi-methods in our work; the methods outlined above are some of the ways we are assessing impact.
  • Impact Evaluation Concepts Assessing the impact of social programs and policies is vitally important. We need a way to demonstrate and report on the relationships between inputs (resources / $’s), the outputs (the goods and services provided / produced), and the outcomes or impacts the program / policy / service is having. There are multiple stakeholders with interest in these relationships. Impact evaluation is about how the program / policy / intervention affects the target population and the intended outcomes, objectives and purpose. It helps us to answer key questions for evidence-based policy making: what works, what doesn’t, where, why and for how much? It has received increasing attention in policy making in recent years in both Western and developing country contexts. Originally oriented more towards evaluation of social sector programs in developing countries, notably conditional cash transfers, impact evaluation is now being increasingly applied in other areas, such as government departments of agriculture, energy and transport, and health. In contrast to outcome monitoring, which examines whether targets have been achieved, impact analysis involves a counterfactual analysis – that is, a comparison between what actually happened and what would have happened in the absence of the intervention / program / service. In other words, impact evaluations look for the changes in outcome that are directly attributable to a program / service / policy. Impact Evaluation Impact evaluation assesses the changes, both intended and unintended, that can be attributed to a particular intervention. Impact evaluations are structured to answer the question: How would outcomes such as participants’ well-being have changed if the intervention had not been undertaken?
Impact evaluations seek to answer cause-and-effect questions and look for the changes in outcomes that are directly attributable to a program / intervention / policy. Key concepts here are: Context, Intention, Attribution, Contribution, Counterfactual
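As a toy illustration of the counterfactual idea, here is a minimal sketch with made-up outcome scores for a funded group and a comparable unfunded group; the comparison group's mean stands in for "what would have happened without the intervention", which real evaluations must construct far more carefully:

```python
# Counterfactual comparison in miniature: the impact estimate is the
# difference between observed outcomes and the (proxy) counterfactual.
# All numbers below are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

funded     = [7.2, 6.8, 7.5, 7.0]   # observed outcomes with the program
comparison = [6.1, 5.9, 6.4, 6.0]   # proxy for what would have happened anyway

impact_estimate = mean(funded) - mean(comparison)
print(round(impact_estimate, 2))
```

Outcome monitoring would only ask whether the funded group hit its target; the subtraction is what makes this an impact question.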
  • Context When doing evaluation it’s important to understand your context – the environment you are working in; the program you are part of. Question: What contextual factors matter when evaluating your program? Question: How do you define impact in your context?
  • Intention relates to what the program intends to bring about, usually the program outcomes. Program outcomes are planned, yet sometimes there are unintended consequences (unanticipated or unforeseen outcomes) that were not intended. Question: Describe the intention of your program. Attribution aims to assess the amount of change that can really be attributed to the program. Useful questions to consider include the following: What is the observed change in outcomes of interest? To what extent can the observed outcomes be attributed to the intervention in question? What contextual factors or external influences, such as the social or cultural setting, political or economic trends, and parallel interventions or other stakeholder actions, are likely to have influenced outcomes? If causes or assumptions vary, what alternative causal hypotheses might there be for observed outcomes? Questions: What are the challenges for attribution in your context? What are the confounding factors that have to be considered when making attributions about your program?
  • Contribution is the degree to which an intervention / program / policy is one of the causes of an observed change. Contribution analysis comprises the following successive steps: 1) Set out the cause–effect question(s) which must be addressed. 2) Draw up a carefully reasoned theory of change, identifying potential influencing factors and outlining the different links in the theory of change and the risks and assumptions associated with them. 3) Gather existing evidence on the theory of change (i) for observed results, (ii) for each of the links in the results chain, and (iii) for the other influencing factors. 4) Assemble and assess the contribution story, outlining whether the intervention was implemented as planned, what the role of external factors was, and whether the predicted theory of change and expected results occurred. 5) Seek out additional evidence to reinforce the credibility of the contribution story. 6) Revise and strengthen the contribution story. 7) In complex settings, assemble and assess the complex contribution story. Counterfactual is a comparison between what actually happened and what would have happened in the absence of the intervention. Through developing the attribution and contribution story, you have developed the counterfactual comparison. Example: As provincial health organizations we have extensive documentation and data to build the attribution and contribution stories. We know that we contribute to the health research enterprise, and we strive to communicate this to our stakeholders. We use evaluation methods to zero in on the attribution question: “To what extent can observed / intended outcomes (changes in the health research enterprise) be attributed to the intervention of funding provincial health research?” To help NAPHRO members work together to assess impact, the CAHS model was adopted.
This allows us to work provincially yet evaluate in a consistent way, yielding information at a national level and allowing our organizations to advocate for the benefits of health research. Question: Can your organization / policy / program develop the impact story?
  • Question: What are the challenges you face in demonstrating the impact of your program?
  • Making an Impact: A Preferred Framework and Indicators to Measure Returns on Investment in Health Research The framework was created in 2004 by the Canadian Academy of Health Sciences (CAHS), a non-profit charitable organization providing science-based information to governments. CAHS uses a unique collaboration of six health disciplines and the full spectrum of academic health sciences to provide science-based information; it is a collaborative, multidisciplinary body and not an advocacy group. NAPHRO members participated in the development of the model. For more information on CAHS go to: http://www.cahs-acss.ca/ The full report is available at: http://www.cahs-acss.ca/e/assessments/completedprojects.php The CAHS Framework was developed for health research due to: lack of public understanding of the value of research and its applicability to current issues in health; concern about accessible, affordable, high-quality health care in a publicly funded system; the need to adequately measure and meaningfully convey the benefits of health research to policy-makers and the public; the increasingly common view that health care / health research is a cost-driver consuming an ever greater share of resources at the expense of other sectors; concern about expenditure accountability in both the public and private sectors in Canada and abroad; the changing and evolving research environment in Canada; lack of consensus on how and when to best evaluate return on research expenditures; questions from policy makers about tangible results attributable to recent increases in public investment in health research (e.g. CIHR, CFI, CRC programs); and uncertainty about the appropriateness of Canada’s health research expenditures versus analogous contributions in other industrialized countries.
There is also a need to acquire appropriate evidence to strike the right funding balance between investigator-initiated “discovery” and targeted “strategic” health research. The challenge for CAHS was finding the best way to evaluate the impacts of health research in Canada that would be useful to a full range of stakeholders, compatible with what was already in place in Canada, transferable to international comparisons, and able to identify a full spectrum of potential impacts. Some of the complex issues that CAHS faced in developing the framework relate to attribution and the time lags between research activity and impact.
  • The CAHS Framework (refer to Appendix 1 for a larger version) The framework tells the story of health research and what the intention of health research is. It is based on a logic frame or results chain and is accompanied by a set of impact indicator categories. The Logic Model The traditional logic model of inputs, activities, outputs and outcomes can be mapped to the CAHS framework. Question: Does this kind of framework fit your context? Why or why not?
  • Indicators Having a framework is only one part of the puzzle; indicators are needed to facilitate measurement. An indicator can be defined as something that helps us to understand where we are, where we are going and how far we are from the goal. It can therefore be a sign, a number, a graphic and so on, but it must be a clue, a symptom, a pointer to something that is changing. Indicators are presentations of measurements: bits of information that summarize the characteristics of systems or highlight what is happening in a system. Question: Think about your program and identify indicators of success. Impact Categories The CAHS Framework has identified five impact categories for use with the framework. For each category, indicators have been grouped into sub-categories.
  • Impact Category Definitions Advancing knowledge indicators and metrics include measures of research quality, activity, outreach and structure. We have also identified some aspirational indicators of knowledge impacts using data that are highly desirable but currently difficult to collect and/or analyze (such as an expanded relative‐citation impact that covers a greater range of publications, including book‐to‐book citations and relative download‐rates per publication compared to a discipline benchmark). Research capacity‐building indicators and metrics fall into subgroups that represent personnel (including aspirational indicators for improving receptor and absorptive capacity), additional research‐activity funding and infrastructure. Informing decision‐making indicators and metrics represent the pathways from research to its outcomes in health, wealth and well‐being. They fall into health‐related decision‐making (where health is broadly defined to include health care, public health, social care, and other health‐related decisions such as environmental health); research decision‐making (how future health research is directed); health‐products industry decision‐making; and general public decision‐making. We also provide two aspirational indicators for this category (media citation analysis and citation in public policy documents). Health‐impact indicators and metrics include those on health status, determinants of health and health‐system changes, and they include quality of life as an important component of improved health. Determinants‐of‐health indicators can be further classified into three major subcategories: modifiable risk factors, environmental determinants, and modifiable social determinants.
Broad economic and social impacts are classified into activity, commercialization, health benefit (specific costs of implementing research findings in the broad health system), well-being, and social-benefit indicators (socio-economic benefits). Notes:
  • CAHS Model Example Indicators
Advancing knowledge
Quality — Relative citation impact: compares the citation rate with the relevant world average.
Activity — Publication counts: simple counting of outputs; useful for new researchers who have no publication record to allow for citation analysis.
Outreach — Co-author analysis: determining the proportion of publications that are co-authored internationally, nationally, with industry and with other disciplines.
Contextual / Structural — Relative Activity Index: determining the fields of research in which a unit is most strongly focused.
Aspirational — Research diffusion: based on end-of-grant reports on uptake of research.
Research capacity-building
Personnel — Numbers of research and research-related staff in Canada: generally broken down into researchers, research assistants and other staff.
Funding — Levels of additional research funding: funding from external sources that can be attributed to the capacity built in an organization.
Infrastructure — Grants ($): the amount of infrastructure funding pulled in by a research project.
Aspirational — Receptor capacity: the ability of those in policy and administrative positions to take research findings and use them in program and policy development.
Informing decision-making
Health related — Use of research in guidelines: analyzing citations to research in clinical and service guidelines.
Research related — Requests for research to support policy: requests for jurisdictional reviews, literature reviews, best practices.
Health products industry — Consulting to industry: number of researchers approached.
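The "relative citation impact" indicator above is simple arithmetic: a unit's citations per paper divided by the world average for the field. A sketch with invented numbers:

```python
# Relative citation impact (RCI): citations per paper for a research
# unit, normalized by the world-average citations per paper for the
# relevant field. Values above 1.0 mean the unit is cited more than
# the field average. The figures below are hypothetical.

def relative_citation_impact(citations, papers, world_avg_per_paper):
    return (citations / papers) / world_avg_per_paper

rci = relative_citation_impact(citations=360, papers=120, world_avg_per_paper=2.5)
print(rci)  # 3 citations/paper against a world average of 2.5 -> 1.2
```

Note that this is a metric ("numeric indicator") in the sense used later in the workbook: it puts a number on impact rather than merely indicating it.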
  • Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework CES Conference – June 9, 2013 18 Industry General Public Public Lectures given Lectures given to public audiences Aspirational Media citation analysis Analyzing mentions of research in newspapers Health‐impact Categories Indicator Description Health Status Morbidity Mortality Quality-adjusted mortality Prevalence & Incidence PYLL – Potential Years Life Lost QALYs – Quality Adjusted Life Years Determinants of Health Modifiable risk factors Social determinants Environmental determinants Smoking, driving habits Education levels, social cohesion Air pollution levels Determinants of Health Services Acceptability Accessibility Appropriateness Competence Continuity Effectiveness Efficiency Safety Self-reported patient satisfaction Wait times Adherence of clinical guidelines Civil law suits against the health system Self-reported continuity Admission / discharge rates Actual vs. expected length of stay Hospital acquired infections Broad economic and social impacts Categories Indicator Description Activity Economic rent The economic benefits (in $) of employing people in health research rather than in another capacity. Commercialization Licensing returns ($) Dollars spent on licensing patents held by organizations / individuals. Health Benefit Health benefits in QALYs per health care dollar Improvements in health measured through QALYs gained and divided by the cost of achieving that health gain. Well-being Happiness As measured using established survey techniques for happiness – depression. Social Benefits Socio-economic status Identifying the socio-economic status of individuals in Canada.
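Several of the example metrics above are simple ratios or sums. The sketch below computes three of them with invented numbers; the function names, the reference age of 75 used for PYLL, and all figures are illustrative assumptions, not part of the CAHS framework itself.

```python
# Hypothetical worked examples of three metric-style indicators from the
# tables above; all input numbers are invented for illustration.

def relative_citation_impact(cites_per_pub: float, world_avg: float) -> float:
    """Citation rate of a research unit relative to the world average (1.0 = average)."""
    return cites_per_pub / world_avg

def potential_years_of_life_lost(ages_at_death: list, reference_age: float = 75.0) -> float:
    """PYLL: total years lost before a reference age, summed over deaths.

    The reference age of 75 is a common convention, assumed here.
    """
    return sum(max(0.0, reference_age - age) for age in ages_at_death)

def qalys_per_dollar(qalys_gained: float, cost_dollars: float) -> float:
    """Health benefit expressed as QALYs gained per health-care dollar spent."""
    return qalys_gained / cost_dollars

# A unit averaging 6 citations per paper in a field averaging 4:
print(relative_citation_impact(6, 4))              # prints 1.5

# Deaths at ages 60, 70 and 80 (the last loses no years before 75):
print(potential_years_of_life_lost([60, 70, 80]))  # prints 20.0

# An intervention costing $200,000 that gains 8 QALYs:
print(qalys_per_dollar(8, 200_000))                # prints 4e-05
```

Real bibliometric and health-economic calculations add field normalization, discounting and uncertainty; the point here is only that a metric, unlike a qualitative indicator, reduces to a number.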
  • Using the CAHS Framework and Indicators (refer to Appendices 2, 3 and 4)

1. Define and prioritize specific evaluation question(s).
2. Use the framework to determine where to look for impacts. Based on the question(s), choose the impact categories (and subcategories) of interest: advancing knowledge, capacity building, informing decision making, health impacts, and broad economic and social impacts. Be as specific as possible about where impacts are expected to occur and at what level (individual, group, institution; provincial, federal, international).
3. Choose (or develop) attractive and feasible indicators and metrics from the appropriate categories of interest that will address the evaluation questions at the right level. Choose sets of indicators that are appropriate.

Indicators vs. Metrics
Indicators 'indicate' impact; they do not attempt to quantify that impact. Metrics are numeric indicators; they allow putting some numbers on impact. A combination of indicators and metrics is recommended.

FABRIC: use the FABRIC acronym to assess your indicators. Indicators should be:
- Focused on the objectives of the organization that will use them
- Appropriate for the stakeholders who are likely to use the information
- Balanced, covering all significant areas of work performed by the organization
- Robust enough to cope with organizational changes (such as staff changes)
- Integrated into management processes
- Cost-effective, balancing the benefits of the information against collection costs

Attractiveness and Feasibility

Attractiveness:
- Validity – does the indicator or metric reasonably reflect the underlying concept or construct it is intended to measure?
- Relevance – does the indicator or metric relate directly to a critical aspect of the research?
- Behavioural impact – does the indicator or metric drive behaviour in a particular direction? Is it likely to result in any negative, unintended consequences? Does it create "perverse incentives"?
- Transparency – are the methodology, and the strengths and weaknesses of the indicator or metric, readily apparent?
- Coverage – does the indicator or metric cover a large proportion of the research output to be assessed?
- Recency – do the data relate to current research performance, or look over a longer timescale?
- Methodological soundness – is the calculation of the metric sound and statistically robust?
- Replicability – can others reproduce the indicator or metric, and can it be used year on year in a comparable fashion?
- Comparability – do other organizations collect comparable information or have targets to benchmark against?

Feasibility:
- Data availability – do the data required to derive indicators or metrics exist, and do both the analysts and those being assessed have access to them?
- Cost of data – how expensive is it to purchase the data outright or obtain them on license?
- Compliance costs – how labour-intensive is it to extract or obtain the data?
- Timeliness – can the data be obtained or provided relatively quickly?
- Attribution – can the data be discretely ascribed to the unit being assessed? Direct attribution is ideal but unlikely given current knowledge and methods; attribution still matters as a concept, because it links the impact seen to the research.
- Avoids gamesmanship – does the indicator or metric give special interest groups or individuals scope to game the system?
- Interpretation – are the data open to misinterpretation or misuse by commentators and/or actors using the evaluation findings?
- Well-defined – does the metric have a clear, unambiguous definition, so that data will be collected consistently and the measure is easy to understand and use?

Notes:
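One way to apply the attractiveness and feasibility screen in practice is to score each candidate indicator against a handful of criteria and rank the totals. Everything below – the criteria subset, the candidate indicators, and the 1–5 scores – is a hypothetical illustration, not scoring endorsed by CAHS or NSHRF.

```python
# Hypothetical screening of candidate indicators against four of the
# criteria discussed above, each scored 1 (poor) to 5 (strong).

CRITERIA = ("validity", "relevance", "data availability", "cost of data")

candidates = {
    "publication counts":       {"validity": 3, "relevance": 3, "data availability": 5, "cost of data": 5},
    "relative citation impact": {"validity": 4, "relevance": 4, "data availability": 3, "cost of data": 2},
    "receptor capacity":        {"validity": 4, "relevance": 5, "data availability": 2, "cost of data": 2},
}

def total_score(scores: dict) -> int:
    """Unweighted sum of one candidate's scores across the chosen criteria."""
    return sum(scores[c] for c in CRITERIA)

# Rank candidates from strongest to weakest overall score.
ranked = sorted(candidates, key=lambda name: total_score(candidates[name]), reverse=True)
for name in ranked:
    print(f"{total_score(candidates[name]):2d}  {name}")
```

An unweighted sum is the simplest choice; in practice one might weight criteria (for example, weighting data availability more heavily in a small foundation) or treat the feasibility criteria as pass/fail gates before scoring attractiveness at all.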
  • Some of the resources used to develop this workshop:

Canadian Academy of Health Sciences (2009). Making an Impact: A Preferred Framework and Indicators to Measure Returns on Investment in Health Research. Report of the Panel on Investments in Health Research.

Scriven, M. (2008). A Summative Evaluation of RCT Methodology: & An Alternative Approach to Causal Research. Journal of MultiDisciplinary Evaluation, 5(9), 11–24. Available at http://survey.ate.wmich.edu/jmde/index.php/jmde_1/article/view/160/186

Mayne, J. (ed.) (2012). Contribution Analysis: Coming of Age? Evaluation, Special Issue, 18(3), 270–280.

Mayne, J. (2008). Contribution Analysis: An Approach to Exploring Cause and Effect. ILAC Brief 16. Available at http://www.cgiar-ilac.org/files/publications/briefs/ILAC_Brief16_Contribution_Analysis.pdf

Mayne, J. (2012). Making Causal Claims. ILAC Brief 26, The Institutional Learning and Change Initiative. Available at http://www.cgiar-ilac.org/files/publications/mayne_making_causal_claims_ilac_brief_26.pdf

Mayne, J. (2011). Contribution Analysis: Addressing Cause and Effect. In R. Schwartz, K. Forss & M. Marra (Eds.), Evaluating the Complex. Transaction Publishers, 53–96.

Wikipedia and Google searches for impact evaluation concepts and theories.

If you have questions or comments and want to get in touch with either Nancy or Rob, please contact via email:
Nancy.Carter@gov.ns.ca
Robert.Chatwin@gov.ns.ca
  • Workbook Appendices
  • Appendix 1
  • [Figure (Appendix 1): the CAHS impact framework diagram. Canadian health research (biomedical, clinical, health services, population and public health, cross-pillar) and global research feed a knowledge pool of research results; through interactions and feedback these reach the health industry, other industries, government, and the public and public groups, influencing health care (appropriateness, access, prevention and treatment) and the determinants of health, and contributing to improvements in health and well-being and to economic and social prosperity (health status, function, well-being, economic conditions). The five impact categories run across the diagram: advancing knowledge, capacity building, informing decision making, health benefits, economic benefits. Impacts feed back into inputs for future research, including research capacity and the government research agenda.]

  • Appendix 2
  • Appropriate Use of the Framework: understanding the logic model and impact categories

[Figure: the framework maps each impact category across the stages Health R&D → Primary outputs/dissemination → Secondary outputs → Adoption → Final outcomes, illustrated with a drug-development example:
- Advancing knowledge: a new molecular technique is developed; research results are published in a journal.
- Research capacity: a team member gains a research PhD; further research in industry.
- Informing decision making: discussions between researchers and pharma define the direction of pharma research; the pharma company initiates a research program to develop a drug; the drug developed is passed for use by the health system.
- Health impacts: adoption by the health system increases the cost of drugs; re-admission for the condition decreases; the condition's burden in the population is reduced.
- Broad economic and social impacts: employment in the pharma company; sales of drugs by pharma; improved workplace productivity; the social and economic benefit of "wellness".]

  • Appendix 3

[Figure: the generic logic-model chain – Activities → Outputs → Outcomes → Impact.]
  • Appendix 4: Using the Framework – understanding the logic model and impact categories

[Figure: the framework chain mapped against the impact categories – activities produce results, that influence decision making in…, that affects something, that contributes to something…]
  • Appendix 5: CAHS / NSHRF Impact Evaluation Framework

[Figure: the NSHRF adaptation of the CAHS framework, organized along the chain: research activity → that produces results → that influence decision making in… → that affect healthcare, health risk factors and other health determinants → that contribute to changing health, well-being, and economic and social prosperity. Nova Scotia health research (bio-medical; policy, services, outcomes; partnership programs; student programs) and Nova Scotia research capacity (REDI; knowledge; evaluation) produce results – funded research, grants and awards; partnerships and collaborations; learning opportunities; information products and reports; workshops, presentations and events; consultation and collaboration; public awareness – that influence decision making in the Nova Scotia Government (DHW), District Health Authorities (DHAs), universities and the public. These affect Nova Scotia's health status and function, well-being and economic conditions, contributing to improved health of Nova Scotians, societal and economic improvements, a strong health research enterprise, a vibrant research community, health research excellence and a foundation for informed decisions. Impacts link back into inputs for future research. External influences: interests, traditions, technical limitations, political dynamics. An inset shows the Canadian (CAHS) impact framework: Canadian health research and research capacity influencing the health industry, other industries, government, research decision making, and the public and public groups; health care, prevention and treatment, and determinants of health; and improvements in health and well-being (disease prevalence and burden) and economic and social prosperity.]
  • Appendix 6: Framework Worksheet

[Worksheet columns: Context; Activity; That produces results; That influence decision making in…; That affects…; That contributes to…]
  • Appendix 7: Framework Worksheet

[Worksheet columns: Activity; That produces results; That influence decision making in…; That affect…; That contribute to… Annotations from the framework diagram: Initiation and diffusion of…; Impacts link back into inputs for…; External influences: interests, traditions, technical limitations, political dynamics; Impact Framework.]