
Evaluating research impact: From a specific case to general guidelines.

Principal at Knowledge to Action Consulting
Jun. 27, 2016


  1. Evaluating research impact: From a specific case to general guidelines Anne Bergen, PhD and Elizabeth Shantz, MSc
  2. Attribution This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. You are free: • to Share — copy and redistribute the material in any medium or format • to Remix — remix, transform, and build upon the material for any purpose, even commercially. Under the following conditions: • Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. • Share Alike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original. Attribute this work as: Bergen, A., and Shantz, E. (2016). Evaluating research impact: From a specific case to general guidelines. Workshop presentation to the Canadian Knowledge Mobilization Forum. Toronto, ON.
  3. About Us • Knowledge to Action Consulting Inc. offers services in evaluation, applied research, and knowledge mobilization. We like helping people collect meaningful data, tell stories about research, and build relationships.
  4. Canada’s leading water research design and management organization • Blue Cities • Energy and Resources • Small and Aboriginal Communities • Agriculture and Water
  5. Our Approach 1. Create Opportunities 2. Extract Knowledge, Provide Insight 3. Accelerate Success
  6. Get in touch Anne Bergen, PhD anne@knowledgetoaction.ca @anne_bergen Elizabeth Shantz, MSc eshantz@cwn-rce.ca @elizabethshantz
  7. Why research impact evaluation?
  8. Research → Impact? What kind of impact? For whom? Under what conditions? When? Based on what evidence?
  9. What are your evaluation goals?
  10. Evaluation Goals •What aspects of your research or KT do you want to evaluate? •What are you going to do with the resulting information?
  11. Evaluation Continuum How will you use the evaluation results? – What? – So What? – Now What? Outputs → Engagement → Uptake → Use → Impact → Causal Attributions
  12. CWN Evaluation Goals Evaluate • Identify characteristics of an impactful research project Compare • Compare different styles of research programs Query • Develop an accessible database for easy querying Communicate • Identify and tell our stories of success to inform communications
  13. key questions for research impact (KTE) evaluation 1. What research knowledge was transferred? 2. To whom was research knowledge transferred? 3. By whom was research knowledge transferred? 4. How was research knowledge transferred? 5. With what effect was research knowledge transferred? Adapted from: Lavis, J. N., Robertson, D., Woodside, J. M., McLeod, C. B., & Abelson, J. (2003). How can research organizations more effectively transfer research knowledge to decision makers? Milbank Quarterly, 81(2), 221-248.
  14. CWN Evaluation Questions 1. What research knowledge was transferred? 2. To whom was research knowledge transferred? 3. By whom was research knowledge transferred? 4. How was research knowledge transferred? 5. With what effect was research knowledge transferred?
  15. What is your evaluation context?
  16. How engaged are your stakeholders in your research & KTE activities? Inform → Consult → Involve → Collaborate → Empower (more impact, but less control and a slower process). Adapted from Arnstein’s (1969) Ladder of Public Participation and the IAP2 Spectrum of Public Participation
  17. Research Funding Organization • Program of Research A: Project A1 (Products A1a, A1b); Project A2 (Products A2a, A2b, A2c); Project A3 (Products A3a, A3b) • Program of Research B: Project B1 (Products B1a, B1b); Project B2 (Products B2a, B2b)
  18. Research Funding Organization (same hierarchy of programs, projects, and products)
  19. Context Research Funding Organization (same hierarchy of programs, projects, and products)
  20. Canadian Water Network (same hierarchy of programs, projects, and products)
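The funder → program → project → product hierarchy above can be captured in a small nested structure, which makes later roll-ups (counting outputs per program, per project) straightforward. A minimal sketch; the class names and `count_products` helper are illustrative, not from the slides:

```python
from dataclasses import dataclass, field


@dataclass
class Project:
    name: str
    products: list[str] = field(default_factory=list)


@dataclass
class Program:
    name: str
    projects: list[Project] = field(default_factory=list)


# Hierarchy from the slide: two programs, five projects, eleven products.
programs = [
    Program("Program of Research A", [
        Project("A1", ["A1a", "A1b"]),
        Project("A2", ["A2a", "A2b", "A2c"]),
        Project("A3", ["A3a", "A3b"]),
    ]),
    Program("Program of Research B", [
        Project("B1", ["B1a", "B1b"]),
        Project("B2", ["B2a", "B2b"]),
    ]),
]


def count_products(programs: list[Program]) -> int:
    """Total research products across all programs and projects."""
    return sum(len(p.products) for prog in programs for p in prog.projects)


print(count_products(programs))  # 11
```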
  21. Evaluation Context (macro & micro) • Micro - (time, money, resources, intended use, intended users, reporting requirements, etc…) • Macro – (culture, leadership, evaluation, external systems, political climate, etc….)
  22. What is your logic model for research impact? (logical links between activities, outputs, & desired outcomes)
  23. Logic Model for Research Impact Stakeholders/ end users • Who is/ are the audience(s)/target(s) of change of your KTE activities? Activities & outputs • What are your research and KT activities that might impact your stakeholders? (what are you creating in person/ online/ on paper)
  24. Logic Model for Research Impact Outcomes & impacts • What are the expected outcomes, and impacts of your research and KT activities? What are the changes in people’s knowledge, attitudes, skills, and actions? What are the community and systems level impacts? Assumptions • What are the assumptions between activities & outcomes? What context is necessary for the KTE activities to have impact?
  25. CWN logic model Adapted from the University of Wisconsin-Extension evaluation logic model template http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
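One lightweight way to make the logical links explicit is to record each activity together with its outputs, outcomes, and assumptions, so the evaluation plan can check that no link is left implicit. A sketch with invented entries; this is not CWN's actual logic model:

```python
# Each entry links one activity to its outputs, outcomes, and the
# assumptions that must hold for the activity to produce those outcomes.
logic_model = [
    {
        "activity": "researcher-practitioner workshops",  # hypothetical example
        "outputs": ["workshop reports", "contact lists"],
        "outcomes": ["new collaborations", "changed practice"],
        "assumptions": ["practitioners have time to attend"],
    },
    {
        "activity": "plain-language research summaries",
        "outputs": ["summary documents"],
        "outcomes": ["increased end-user awareness"],
        "assumptions": ["summaries reach the intended audience"],
    },
]

# Flag any activity whose assumptions were never stated.
missing = [e["activity"] for e in logic_model if not e["assumptions"]]
print(missing)  # [] -> every activity has at least one stated assumption
```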
  26. What approach to evaluation will you take?
  27. What Kind of Evaluation? Evaluation research vs. evaluation: How will it be used? Is it to be published? How is it funded? Practical & theoretical considerations
  28. Who are your end users/ targets of change?
  29. End users of research / targets of change ☐ patients ☐ family and caregivers ☐ researchers ☐ practitioners ☐ general public ☐ policymakers ☐ subsets of the general public (e.g., youth) ☐
  30. What measures and indicators will you use? (metrics to stories)
  31. Outputs vs. Outcomes Outputs Engagement Uptake Use Impact Causal Attributions measurement time & complexity
  32. Indicators for: • Number/type of research outputs/ KT products • Ease of use/ user experience • Timing/ relevance (meets end user needs) • Awareness/ attitudes/ beliefs/ knowledge • Self-reported intentions/ behaviour • Networks/ relationships/ collaborations • Systems/ policies/ organizational culture
  33. Social Process, Social Impact • Quality & quantity of relationships • 1:1 relationships; organizational relationships • Meetings • Requests & referrals • Co-produced products • Social network analysis
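Relationship indicators like these can be tallied from a simple edge list of who worked with whom, even before a full social network analysis. A minimal sketch using only the standard library; the actor names are invented:

```python
from collections import Counter

# Each pair records one collaboration (a co-produced product,
# a meeting, a request or referral between two actors).
edges = [
    ("Researcher A", "Utility X"),
    ("Researcher A", "Municipality Y"),
    ("Researcher B", "Utility X"),
    ("Researcher B", "Researcher A"),
]

# Degree = number of collaborations each actor appears in; a crude
# quantity-of-relationships indicator and a starting point for SNA.
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

print(degree.most_common(1))  # [('Researcher A', 3)]
```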
  34. Phase 1: Document Analysis Identify Projects • Parameters for inclusion Collect Information • Reporting to CWN; other outputs Choose Indicators • Inputs, outputs, outcomes, impacts Analyze Reports • Create searchable database of results 76 Projects 8 Months CWN Impact Evaluation
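A "searchable database of results" can be as simple as one SQLite table of projects and coded indicators. A sketch with invented rows and schema (the slides do not describe CWN's actual database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE outcomes (
        project   TEXT,
        indicator TEXT,  -- e.g. output, engagement, uptake, use, impact
        detail    TEXT
    )
""")
conn.executemany(
    "INSERT INTO outcomes VALUES (?, ?, ?)",
    [
        ("A1", "output", "peer-reviewed paper on wastewater treatment"),
        ("A1", "uptake", "guideline cited by municipal partner"),
        ("B2", "impact", "policy change on arsenic monitoring"),
    ],
)

# Easy querying: every project with evidence of uptake or impact.
rows = conn.execute(
    "SELECT project, detail FROM outcomes WHERE indicator IN ('uptake', 'impact')"
).fetchall()
print(rows)
```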
  35. Phase 2: Researcher Interviews 56 Projects 9 Months Phase 1 Information Researcher Interviews Verify key outputs Update outcomes Nominate end users CWN Impact Evaluation
  36. Phase 3: End User Interviews 78 Interviews 5 Months Short and medium term outcomes Theory of attribution Long term impacts Forward tracking impacts CWN Impact Evaluation
  37. Success Stories Providing landowners with cost- and space-saving rural wastewater treatment CWN Impact Evaluation Reducing arsenic contamination from a nearby coal mine
  38. Lessons Learned • Choose indicators wisely –Partner metrics • Good question framing –In interviews –In project reporting • Get the right people involved –Leverage your connections –Flexible evaluation expertise • Persistence! CWN Impact Evaluation
  39. Promising Practices • Body of work of prominent researchers • Methods, models and technologies • Emerging areas of research • Forward tracking over time CWN Impact Evaluation
  40. Promising practices and tensions • partner-level inquiry • forward and backward tracking • contribution analysis • success stories vs. quantified impact • timing of impact
  41. A Few Resources
  • Better Evaluation (2016). [Website]. http://betterevaluation.org/ An international collaboration to improve evaluation practice and theory by sharing and generating information about options (methods or processes) and approaches.
  • Economic and Social Research Council (2011). Branching Out: New Directions in Impact Evaluation from the ESRC’s Evaluation Committee. Appendix 1 – Conceptual Framework for Impact Evaluation. Retrieved from http://www.esrc.ac.uk/files/research/evaluation-and-impact/branching-out-new-directions-in-impact-evaluation-from-the-esrc-s-evaluation-committee/
  • Morton (2015). Progressing research impact assessment: A ‘contributions’ approach. Research Evaluation, 24(4), 405-419. http://rev.oxfordjournals.org/content/24/4/405
  • National Collaborating Centre for Methods and Tools (2012). Evaluating knowledge translation interventions: A systematic review. Hamilton, ON: McMaster University. Retrieved from http://www.nccmt.ca/resources/search/114.