This research-in-progress paper describes the initial results of a long-term, large-scale analysis of how the evaluation of the societal impact of research is operationalized. Results from the first stage of qualitative interviews are used to illustrate the strength of the study's methodological design.
The prospective contribution and the global and integrated approach to pha... (Catherine Frade)
Contribution of foresight and an integrated global approach to the new challenges of the pharmaceutical industry (Dr. Catherine Frade, 8-10 March 2010, 22nd annual European DIA meeting, Monaco)
What happens to your grant once it gets to a study section?
In this presentation, Dr. Paul Martin leverages his experience as a seasoned National Institutes of Health grant reviewer, including his tenure as Chair of the Cancer Immunopathology and Immunotherapy Study Section, to provide insight into the workings of NIH study sections.
Learn how to:
- Identify the fundamentals of grant review, including an overview of study sections and grant scoring;
- Determine differences between "impact" and "significance";
- Recognize effective strategies in writing and how to avoid frequent mistakes.
Health Evidence hosted a 60 minute webinar examining the effectiveness of school-based interventions for preventing HIV, sexually transmitted infections and pregnancy in adolescents. Click here for access to the audio recording for this webinar: https://youtu.be/yCeIEQ4OTCc
Amanda Mason-Jones, Senior Lecturer in Global Public Health, Faculty of Science, University of York led the session and presented findings from her recent Cochrane review:
Mason-Jones A, Sinclair D, Mathews C, Kagee A, Hillman A, & Lombard C. (2016). School-based interventions for preventing HIV, sexually transmitted infections, and pregnancy in adolescents. Cochrane Database of Systematic Reviews, 2016(11), CD006417.
http://healthevidence.org/view-article.aspx?a=school-based-interventions-preventing-hiv-sexually-transmitted-infections-29881
Sexually active adolescents are at risk of contracting HIV and STIs, and unintended pregnancy can have a detrimental impact on young people's lives. This review examines the impact of school sexual education programs on the number of young people who contract STIs and the number of adolescent pregnancies. Eight cluster randomized controlled trials, including 55,157 participants, are included in this review. Findings suggest there is little evidence that school programs alone are effective in improving sexual and reproductive health outcomes for adolescents. This webinar examined the effectiveness and components of interventions that prevent HIV, STIs and adolescent pregnancy.
Rapid qualitative analysis vs the 'traditional approach': early findings and ... (NIHR CLAHRC West Midlands)
Dr Beck Taylor of Theme 1, Maternity and Child Health, presented her latest project, comparing a rapid approach to synthesising evidence from qualitative research with traditional research methods, at the CLAHRC WM Programme Steering Committee meeting, 22nd October 2015.
Alliance for CME 2009 Presentation, Wake me Up Before it’s Over: Bringing out... (Wendy999)
2009 ACME Presentation, co-presented with Marissa Seligman, that tackles strategies to bring innovation to live continuing medical education activities.
Reducing sitting time at work: What's the evidence? (Health Evidence™)
Health Evidence hosted a 60 minute webinar examining the effectiveness of workplace interventions for reducing sitting at work. Click here for access to the audio recording for this webinar: https://youtu.be/psmac6jkbMM
Dr. Nipun Shrestha, MBBS, MPH, Postgraduate Student at Victoria University led the session and presented findings from his recent Cochrane review:
Shrestha N, Kukkonen-Harjula KT, Verbeek JH, Ijaz S, Hermans V, & Bhaumik S. (2016). Workplace interventions for reducing sitting at work. Cochrane Database of Systematic Reviews, 2016(3), Art. No.: CD010912.
http://healthevidence.org/view-article.aspx?a=workplace-interventions-reducing-sitting-work-28404
Office work has become sedentary in nature, and increased sitting has been linked to increases in cardiovascular disease, obesity and overall mortality. This review examines the impact of workplace interventions to reduce sitting at work. Two cross-over randomized controlled trials, 11 cluster randomized trials and 4 controlled before-and-after studies, including 2,180 participants, are included in this review. Findings suggest that sit-stand desks may decrease workplace sitting. This webinar examined the effectiveness and components of interventions that reduce sitting at work.
The randomised controlled trial (RCT) is a trial in which subjects are randomly assigned to one of two groups: one (the experimental group) receiving the intervention being tested, and the other (the comparison or control group) receiving an alternative (conventional) treatment.
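As a minimal sketch of the random-assignment step this definition describes (the function name, the fixed seed, and the 1:1 allocation below are illustrative choices, not part of any named trial protocol):

```python
import random

def randomize(subjects, seed=42):
    """Shuffle subjects and split them 1:1 into experimental and control arms."""
    rng = random.Random(seed)          # fixed seed only so the example is reproducible
    shuffled = subjects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"experimental": shuffled[:half], "control": shuffled[half:]}

arms = randomize([f"subject-{i}" for i in range(10)])
print(len(arms["experimental"]), len(arms["control"]))  # 5 5
```

Real trials typically use concealed, often stratified or blocked, allocation schedules; the point here is only that assignment is by chance, not by choice.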
Using Metrics to Determine Research Impact (Julia Gross)
Julia Gross. Presentation on Bibliometrics for ECU Research Week, September 2012
ABSTRACT: Publishing your research is an important part of promoting your work and ensuring that your research finds a wide audience. Bibliometrics is one technique for measuring the impact of research using publication citation counts. You can do a research impact analysis yourself if you know which tools to use. This seminar introduces a range of bibliometric tools such as Scopus, Web of Science and Publish or Perish.
This is a presentation I created for our group’s keynote at our Global Fujitsu Forum event held in May 2010 in Tokyo. Following a brief intro, the main body describes our group’s vision, thinking, and mantra regarding the essential ingredients to become a "Tomorrow Company" today. These slides were designed to be presented live, led by the presenter’s strong narrative, rather than to serve as a handout document, so they are best viewed when presented live.
A review of Eysenbach, G., 2011. Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact. Journal of Medical Internet Research, 13(4), p.e12
The in-vitro approach: Qualitative methodology to explore panel based peer re... (Gemma Derrick)
Presentation given by Gemma Derrick and Gabby Samuel at the Workshop exploring Qualitative and Mixed Methods in Research Evaluation and Policy 2015 (QMM2015)
Critical appraisal is the process of carefully and systematically analyzing a research paper to judge its trustworthiness, and its value and relevance in a particular context (Amanda Burls, 2009).
A critical review must identify the strengths and limitations of a research paper, and this should be carried out in a systematic manner.
Critical appraisal helps develop the skills needed to make sense of scientific evidence on the basis of validity, results and relevance.
Gender differences in societal orientation and output of individual scientists (Inge van der Weijden)
Presentation at the STI 2014 conference.
Authors: Inge van der Weijden, Zohreh Zahedi, Ulle Must and Ingeborg Meijer
The Interprofessional Team Immersion (IPTI) offers students across 13 health professions opportunities to apply their skills in cross-professional communication, teamness, and patient-centered engagement. The experience is characterized by high-stakes cases carefully designed to cultivate an atmosphere conducive to rapid teambuilding and compassionate patient care. Within a safe learning environment, faculty and students acquire an understanding of roles and responsibilities as well as skills to manage complex cases. This presentation will describe and demonstrate the rationale, design, and implementation of IPTI over a three-year period. Findings suggest a significant increase in IPTI students’ perceptions of cooperation, resource sharing, and communication skills for team-based practice. Programmatic evaluation substantiates the value students place on practicing interprofessional clinical skills before and during their clinical-community rotations. Debriefing sessions with standardized patients enhanced students’ knowledge and appreciation of patient engagement and shared decision-making, culminating for some in scholarly products. In total, the findings provide beneficial insight for other interprofessional educational and collaborative practice initiatives taking place at the University and in the community. Learn more about IPEC at University of New England ipec(at)une(dot)edu or follow us on Twitter @UNEIPE
Electronic cigarettes for smoking cessation: What's the evidence? (Health Evidence™)
Health Evidence hosted a 90 minute webinar examining the effectiveness of electronic cigarettes for smoking cessation.
Muhannad Malas and Robert Schwartz led the session and presented findings from their recent review:
Malas M, van der Tempel J, Schwartz R, Minichiello A, Lightfoot C, Noormohamed A, et al. (2016). Electronic cigarettes for smoking cessation: A systematic review. Nicotine & Tobacco Research, 18(10), 1926-1936.
http://healthevidence.org/view-article.aspx?a=electronic-cigarettes-smoking-cessation-systematic-review-29830
Cigarette smoking is among the top causes of preventable death and disease. Electronic cigarettes have been increasing in popularity among smokers, who report using them to quit or reduce smoking. This review examines the effectiveness of electronic cigarettes as cessation aids. Sixty-two articles, including RCTs and experimental, longitudinal and cross-sectional studies, are included in this review. Findings suggest the evidence is inconclusive due to the low quality of the research. This webinar provides a comprehensive overview of the current literature examining the effectiveness of electronic cigarettes for smoking cessation.
PowerPoint slides used in a seminar held in the University of Calcutta to familiarise the members of Parthib Basu's Ecological Research Unit with the Centre for Pollination Studies Planning, Monitoring and Evaluation System.
Day 2 keynote: Sanjeev Sridharan, University of Toronto: “Research and evaluation in global health policy processes”
Workshop on Approaches and Methods for Policy Process Research, co-sponsored by the CGIAR Research Programs on Policies, Institutions and Markets (PIM) and Agriculture for Nutrition and Health (A4NH) at IFPRI-Washington DC, November 18-20, 2013.
We live in an age of research measurement. In this session we consider the current form of the REF and how it affects both a university’s relationship with research and the developing careers of early-career researchers. The session will also consider what you can do to make sure you are best equipped and ‘in the know’ for the demands of the REF once you apply for and start an academic job.
Discussion session hosted by Leonie van Drooge at the Workshop exploring Qualitative and Mixed Methods in Research Evaluation and Policy 2015 (QMM2015)
Multiplying method: Ethnography and the reconceptualization of evaluation stu... (Gemma Derrick)
Discussion session hosted by Paul Wouters and Sarah de Rijcke at the Workshop exploring Qualitative and Mixed Methods in Research Evaluation and Policy 2015 (QMM2015)
Focus! A discussion about the use of focus groups as a method (Gemma Derrick)
Discussion session hosted by Leonie van Drooge at the Workshop exploring Qualitative and Mixed Methods in Research Evaluation and Policy 2015 (QMM2015)
Rethinking the 'international' in the governance of science (Gemma Derrick)
Presentation given by Tereza Stockelova and Sarah de Rijcke at the Workshop exploring Qualitative and Mixed Methods in Research Evaluation and Policy 2015 (QMM2015)
Levelwise PageRank with Loop-Based Dead End Handling Strategy: SHORT REPORT ... (Subhajit Sahu)
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables ranks to be calculated in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It does, however, come with the precondition that the input graph contains no dead ends. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by the submission of a large number of small workloads, and is expected to be a non-issue when the computation is performed on massive graphs.
Adjusting primitives for graph: SHORT REPORT / NOTES (Subhajit Sahu)
Graph algorithms, like PageRank, often operate on Compressed Sparse Row (CSR), an adjacency-list based graph representation.
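As a minimal sketch of the CSR layout just mentioned (function and variable names are illustrative): the adjacency lists are packed into one flat array, with an offsets array marking where each vertex's neighbours begin.

```python
def to_csr(num_vertices, edges):
    """Build (offsets, targets) CSR arrays from a list of (source, target) edges."""
    degree = [0] * num_vertices
    for u, _ in edges:
        degree[u] += 1
    offsets = [0] * (num_vertices + 1)      # prefix sums of out-degrees
    for v in range(num_vertices):
        offsets[v + 1] = offsets[v] + degree[v]
    targets = [0] * len(edges)
    cursor = offsets[:-1][:]                # next free slot per vertex
    for u, v in edges:
        targets[cursor[u]] = v
        cursor[u] += 1
    return offsets, targets

offsets, targets = to_csr(3, [(0, 1), (0, 2), (2, 0)])
# neighbours of vertex v are targets[offsets[v]:offsets[v + 1]]
```

Two contiguous arrays instead of per-vertex lists is what makes CSR cache- and GPU-friendly.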
Multiply with different modes (map)
1. Performance of sequential execution based vs OpenMP based vector multiply.
2. Comparing various launch configs for CUDA based vector multiply.
Sum with different storage types (reduce)
1. Performance of vector element sum using float vs bfloat16 as the storage type.
Sum with different modes (reduce)
1. Performance of sequential execution based vs OpenMP based vector element sum.
2. Performance of memcpy vs in-place based CUDA based vector element sum.
3. Comparing various launch configs for CUDA based vector element sum (memcpy).
4. Comparing various launch configs for CUDA based vector element sum (in-place).
Sum with in-place strategies of CUDA mode (reduce)
1. Comparing various launch configs for CUDA based vector element sum (in-place).
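The storage-type comparison in the list above hinges on accumulation error. The effect can be demonstrated without CUDA by round-tripping an accumulator through 32-bit storage via `struct` (bfloat16, with its even shorter mantissa, only makes the effect more pronounced); the values below are made up for illustration:

```python
import struct

def to_f32(x):
    """Round a Python double to the nearest 32-bit float, as if stored in float32."""
    return struct.unpack("f", struct.pack("f", x))[0]

values = [1e8] + [1.0] * 1000
exact = sum(values)          # double-precision accumulation: 100001000.0
acc = 0.0
for v in values:
    acc = to_f32(acc + v)    # float32 storage after every addition
# acc stays at 100000000.0: each +1.0 is smaller than the float32
# spacing (8.0) at this magnitude, so it is rounded away.
```

This is why the experiments separate the storage type from the accumulation mode: a wider accumulator (or pairwise/Kahan summation) recovers the lost contributions even when elements are stored narrowly.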
Explore our comprehensive data analysis project presentation on predicting product ad campaign performance. Learn how data-driven insights can optimize your marketing strategies and enhance campaign effectiveness. Perfect for professionals and students looking to understand the power of data analysis in advertising. For more details, visit: https://bostoninstituteofanalytics.org/data-science-and-artificial-intelligence/
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Techniques to optimize the PageRank algorithm usually fall into two categories. One tries to reduce the work per iteration, and the other tries to reduce the number of iterations. These goals are often at odds with one another. Skipping computation on vertices which have already converged has the potential to save iteration time. Skipping in-identical vertices, which share the same in-links, helps reduce duplicate computations and thus could also reduce iteration time. Road networks often have chains which can be short-circuited before PageRank computation to improve performance, since the final ranks of chain nodes can be easily calculated; this could reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order. This could help reduce the iteration time and the number of iterations, and also enable multi-iteration concurrency in PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
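One of the techniques above, skipping already-converged vertices, can be sketched as a small variant of power iteration. This is an illustrative simplification, not STICD itself: names are made up, and pruned vertices are never re-activated here, which is an approximation (a converged vertex's in-neighbours may still be moving).

```python
def pagerank_skip_converged(in_edges, out_degree, n, damping=0.85, tol=1e-10):
    """Power iteration that drops a vertex from the work set once its
    rank change falls below tol. in_edges[v]: list of in-neighbours of v."""
    rank = [1.0 / n] * n
    active = set(range(n))
    while active:
        new = list(rank)
        for v in list(active):
            s = sum(rank[u] / out_degree[u] for u in in_edges.get(v, []))
            new[v] = (1 - damping) / n + damping * s
            if abs(new[v] - rank[v]) < tol:
                active.discard(v)   # converged: skipped in later iterations
        rank = new
    return rank

# 3-cycle 0 -> 1 -> 2 -> 0: every vertex should settle at rank 1/3.
ranks = pagerank_skip_converged({0: [2], 1: [0], 2: [1]}, [1, 1, 1], 3)
```

The saving is that late iterations touch only the shrinking active set instead of every vertex, which is exactly the per-iteration work reduction the paragraph describes.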
Opendatabay - Open Data Marketplace.pptx (Opendatabay)
Opendatabay.com unlocks the power of data for everyone. Open Data Marketplace fosters a collaborative hub for data enthusiasts to explore, share, and contribute to a vast collection of datasets.
It is the first open hub for data enthusiasts to collaborate and innovate: a platform to explore, share, and contribute to a vast collection of datasets. Through robust quality control and innovative technologies like blockchain verification, Opendatabay ensures the authenticity and reliability of datasets, empowering users to make data-driven decisions with confidence. It leverages cutting-edge AI technologies to enhance the data exploration, analysis, and discovery experience.
From intelligent search and recommendations to automated data productisation and quotation, Opendatabay's AI-driven features streamline the data workflow. Finding the data you need shouldn't be complex: Opendatabay simplifies the data acquisition process with an intuitive interface and robust search tools. Effortlessly explore, discover, and access the data you need, allowing you to focus on extracting valuable insights. Opendatabay breaks new ground with dedicated, AI-generated synthetic datasets.
Leverage these privacy-preserving datasets for training and testing AI models without compromising sensitive information. Opendatabay prioritizes transparency by providing detailed metadata, provenance information, and usage guidelines for each dataset, ensuring users have a comprehensive understanding of the data they're working with. By leveraging a powerful combination of distributed ledger technology and rigorous third-party audits, Opendatabay ensures the authenticity and reliability of every dataset. Security is at the core of Opendatabay: the marketplace implements stringent security measures, including encryption, access controls, and regular vulnerability assessments, to safeguard your data and protect your privacy.
Intentions and strategies for evaluating the societal impact of research: Insights from REF2014 evaluators
1. Intentions and strategies for evaluating the societal impact of research: Insights from REF2014 evaluators
Derrick, G.E.
Health Economics Research Group (HERG)
Brunel University London
2. Introduction
• UK REF2014 – national evaluation process of universities to distribute over £1.952 billion of government funding for research.
• Criteria: 65% Outputs (peer review of publications), 15% Environment (esteem) and 20% Impact.
• Impact is defined as research that has had “…an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia.” (HEFCE, 2011)
• Peer review only accepted as legitimate IF the results of outcomes are perceived to have been
• Evaluation of each criterion conducted by 4 Main Panels divided into “Units of Assessment”, i.e. disciplines.
• For “Impact”, evaluators assess 4-page case studies “in session” – ex-post impact evaluation
• 3 groups of REF evaluators:
1. Output-only evaluators (n interviewed = 8)
2. Impact-only evaluators (n interviewed = 9); AND
3. Output and Impact evaluators (n interviewed = 47)
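The 65/15/20 weighting above determines how sub-profile scores combine into an overall result. As a simplified arithmetic sketch (REF actually combines full quality profiles rather than single scores, and the sub-profile values below are made up for the example):

```python
# REF2014 weights as quoted on this slide: 65% Outputs, 15% Environment, 20% Impact.
WEIGHTS = {"outputs": 0.65, "environment": 0.15, "impact": 0.20}

def overall_score(scores):
    """Combine sub-profile scores (on the 0-4 star scale) using the REF weights."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

print(overall_score({"outputs": 3.0, "environment": 2.0, "impact": 4.0}))
# 0.65*3.0 + 0.15*2.0 + 0.20*4.0 = 3.05
```

The 20% weight is why a strong or weak Impact case study moves the overall outcome noticeably, which is what makes the evaluators' strategies studied here consequential.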
23 September 2014
3. The UK REF2014 evaluation process
MAIN PANEL A
Sub-panel 1 – Clinical Medicine
Sub-panel 2 – Public Health, Health Services and Primary Care
Sub-panel 3 – Allied Health Professions, Dentistry, Nursing and Pharmacy
Sub-panel 4 – Psychology, Psychiatry and Neuroscience
Sub-panel 5 – Biological Sciences
Sub-panel 6 – Agriculture, Veterinary and Food Science
Evaluation items: Outputs – 50,317; Impact case studies – 1,621
4. [Quadrant diagram: case studies plotted on two axes, Significance and Reach]
Quadrant 1 – High significance, Low reach (2* or 3*?)
Quadrant 2 – High significance, High reach (4*)
Quadrant 3 – Low significance, Low reach (0-1*?)
Quadrant 4 – Low significance, High reach (2* or 3*?)
5. Methodology – The Treatment
• Semi-structured pre-evaluation interviews with Main Panel A and its sub-panel evaluators.
• Analysed using cognitive-based grounded theory.
• Questions asked about the evaluators' background (most proud), opinion of impact (definition, what is important), and strategies for overcoming barriers in evaluation (panel roles, attribution, definition differences, etc.)
EVALUATION PROCESS
Pre-evaluation interviews (Jan-Mar 2014): (1) Definitions, (2) Opinions, (3) Strategies, (4) Intentions, (5) Biases
Post-evaluation interviews (Dec 2014 – Feb 2015): (1) Re-test of (1)-(5), (2) Process, (3) Conflicts, (4) Power roles
6. Main Panel A and its sub-panels
Sub-panel name | No. of sub-panel members | No. of academic evaluators (AEs) (% AEs / sub-panel) | No. of research user evaluators (UEs) (% UEs / sub-panel) | No. of participants (% / sub-panel)
Main Panel | 19 | 14 (73.7%) | 5 (26.3%) | 8 (42.1%)
Sub-panel 1 – Clinical Medicine | 39 | 32 (82.0%) | 7 (18.0%) | 10 (25.6%)
Sub-panel 2 – Public Health, Health Services & Primary Care | 27 | 23 (85.1%) | 4 (14.9%) | 13 (48.1%)
Sub-panel 3 – Allied Health Professions, Dentistry, Nursing & Pharmacy | 51 | 42 (82.3%) | 9 (17.7%) | 14 (27.5%)
Sub-panel 4 – Psychology, Psychiatry & Neuroscience | 35 | 28 (80.0%) | 7 (20.0%) | 9 (25.7%)
Sub-panel 5 – Biological Sciences | 35 | 30 (85.7%) | 5 (14.3%) | 6 (17.1%)
Sub-panel 6 – Agriculture, Veterinary and Food Science | 29 | 16 (55.1%) | 13 (44.9%) | 4 (13.8%)
TOTAL | 235 | 185 (78.7%) | 50 (23.2%) | 62 (28.8%)
7. Results
Pre-treatment – evaluation without precedent
• Uncertainty of Impact evaluation
LACK OF DEFINITION
• "I'm still not convinced everybody shares exactly the same definition of what constitutes impact or where they place the weight or if it's impact or isn't." P3Imp1
LACK OF EXPERIENCE
• "I'm very happy to describe the quality of the research [but] the valuing of impact is something I have no idea about." P0P2 Out-Imp1
• Resort to evaluation "comfort zone" ("what we cut our teeth on")
USE OF TRADITIONAL TOOLS FOR RESEARCH EVALUATION RATHER THAN "impact stuff"
"And I don't believe that we know how to do it - you have to contrast this with the assessment of outputs which is really just reviewing, which is bread and butter stuff for an academic. That's what we cut our teeth on, that's what we do every day and so there may be an awful lot of it… but it is just what we do. Whereas this impact stuff we just don't know. So I feel a little bit nervous about it." (P0 P2 OutImp1)
8. Results
Experience with ex-ante impact evaluation
• Previous experience with RCUK "pathways to impact" statements – potential future impact
NOT FORMAL, A "tick box" CRITERION
"The research council introduced this criteria, it's just a tick box. But it's changing, it's a slow process, you can't instantly get scientists to change their view. So they got this box, you may just tick it. We will tell them why this will have amazing impact on humanity for the rest of eternity, and everybody ticks that and then the REF comes along…" P2 OutImp 5
REGARDED AS A "DEAD WEIGHT"
"But that sometimes becomes such a dead weight around the necks of the people making the decisions that it outweighs everything else, including those other sciences that could help." P2Imp1
EX-POST EVALUATION FOR REF2014 ONE "big experiment"
"I think it's all new territory for all of us, and none of us know – we are going to learn on the job I think." P4OutImp6
9. The balance scale of impact evaluation
[Diagram: a balance scale with evaluators' tendencies/decisions tipping between a quality-focused and an impact-focused approach]
10. The “societal impact” focused evaluator
• Impact was regarded as independent of the research underpinning it.
“The quality of the research has zero role at all in ensuring the impact of the research’”
P6 OutImp 2
“The research that is at application is not necessarily getting in these journals but it
could be very important to get something into the marketplace.” P0 OutImp3
• Not pre-occupied with the underpinning research
“What maybe a product or an end result of research has different criteria associated with it because what you’re looking for here is a societal change… whereas the research… it’s quite different… [it] is all around rigor and methodology and the quality of the idea and making sure that the methods and the quality of the idea match” P2 OutImp7
11. Tendencies of the societal-impact focused evaluator
• DECISION 1 – Impact is serendipitous / uncontrollable / messy
• “It can often be that coffee you have with somebody at the right moment and the information passing that way” P1 Imp1
• “That kind of linear approach is very, very rare indeed. The way it is instead is that findings accumulate over a period of time and either the weight of evidence in the end wins the day or a moment arrives when the politicians have made up their minds that they want to go in a particular direction, they are looking around for the evidence to support their decision” P2 Imp1
• DECISION 2 – Value of push factors in achieving impact
• “You can’t assume that it will happen through happenstance, there needs to be some mechanism
in place.” P1Imp1
• “Getting from the university stage of research out to the end impact has so many steps in it, not
all of which are easy. They require a little effort and somebody championing them from one end or
the other” P1 OutImp 4
12. Tendencies of the societal-impact focused evaluator (cont.)
• DECISION 3 – Hard vs soft impacts (Impact outcome vs impact journey)
• “If you think of impact as a verb rather than a noun, I think it’s a lot easier to analyse. Impact is
the relationships you build. It is the dialog that you have that makes you ask research questions
that are subtly different from the ones you would have asked if you hadn’t linked with - whether
it's policymakers, whether it's citizens, whether it's industry at the beginning. So impact is not
something that you have right at the end. Impact is a relationship and that attitude of mind that
you have throughout the research process” P0 OutImp4
• “Levering it to the next stage”: “[Whatever] gets the research being taken up and moving it forward, that has to be considered valuable. Maybe the question we should be asking is whether enough effort has gone into that in the past and levering research into its next stage” P0 OutImp6
• DECISION 4 – Can it be measured?
• “And it’s important not to ignore them because you can’t measure them because sometimes you
throw out the most important things because you can’t measure them properly.” P1 OutImp2
• “These [soft] are unquantifiable, so therefore it will be difficult to assess them as impact. And
again, it’s subject to imagination, you don’t know how you’ve affected anything until you see the
results. So the only time that you know there is an impact is when there is a result. So therefore
just talking to people is not an impact." P6 Out2
13. The “quality” focused evaluator
• Preoccupation with considering the underpinning research as a proxy for impact
• “I think research will only have an impact if it’s of high quality. I think that quality is the sine qua
non of impact” P0 OutImp2
• “So you can get big impacts from very, very bad quality of research, and so that's -- if that's the
way you are going to measure impact then you're going to go completely the wrong way.” P0
OutImp2
• Belief in the mantra of excellent impact being dependent on excellent research
• “I think that certainly the quality of the research is an important part. It’s a critical part. You have
to have the highest quality research in order for it to be believable and repeatable.” P0 OutImp5
• Bias towards “applied research” as “easy impact”
• “[Her applied work] is not the research I've done that's the most prestigious in terms of, if you like, the judgment of academia, for I've done other research that has been associated with bigger grants, research council grants and, if you like, higher-impact academic products. But [this applied work] is the easiest to demonstrate a real impact.” P3 OutImp2
• “And that the impact case studies might be based on, you know, research is good enough, the
equivalent of a two-star, but it won't be four-star research, and but it has an impact because it was
applied research.” P2 OutImp 2
14. Tendencies of the quality-focused evaluator
• DECISION 1 – Research excellence and impact
DANGER OF IMPACT OF BAD RESEARCH
• “It can’t be actually used or believed until it’s repeated and proven.” P0 OutImp5
• “I think that poor quality research can only have negative impacts.” P4 OutImp8
HOPE THAT ASSUMPTION OF VALUE OF EXCELLENT RESEARCH IS CORRECT
• “…excellent impact should depend on excellent research” P1 OutImp6
• “…you would hope they were synonymous, wouldn’t you?” P3 OutImp 5
• DECISION 2 – Linearity of impact realisation
RATIONALISATION OF IMPACT REALISATION
• “Research was done, showed the benefits of [the evidence], got into the clinical guidelines, and
over time you can track the proportion of the relevant professionals who are implementing the
better evidence. It’s quite straightforward in fact” P3 OutImp8
• “Impact requires that you generate the evidence and then that you, in turn, get into guidelines and the people start using that information to change their practice” P4 Out1
15. Tendencies of the quality-focused evaluator (cont.)
• DECISION 3 – Pull / picked-up factors
SOMEONE ELSE’S JOB – RESEARCHERS JUST DISSEMINATE
“I mean, I presented this work at the International Embryo Transfer Society meeting, and then
afterwards, some people talked to me and what should we do. And I said, well, you’ve just got to
purify these hormones better and you have to do this and that and they did it. Then it turned out to be
very important for a company called Bioniche because they now have something like 80, 90 percent
of the market for the hormones used in superovulation.” P0 OutImp1
• DECISION 4 – Eventually all excellent research has impact
LINK BETWEEN RESEARCH QUALITY AND IMPACT AS WELL AS “ALL IN GOOD TIME”
“….my feeling is that eventually it will [have impact] but it may take a long time.”
• DECISION 5 – Assessing the “right impact” / Lesson from MMR
DANGER OF EVALUATING IMPACT TOO SOON
“The example…is the MMR story, that was poor science. It’s had huge impact, negative impact. It’s resulted in lots of morbidity amongst children plus women – people didn’t get their children vaccinated. The quality of the science was poor, but it had a huge impact in a negative way.” P2 OutImp9
16. Discussion – what this means for impact evaluation
• Is there a bias? It depends on the definition of bias; perhaps a conservative bias.
• REF2014 evaluators showed a variety of values about what constitutes impact and what will be assessed as part of impact.
• Some of these went beyond the HEFCE or REF2014 definition of impact, e.g. the impact journey was valued, as was the role of public engagement.
• Concern that evaluators’ lack of experience in impact evaluation might do one of two things:
1. Force them to use traditional measures as proxies for impact (quality-focused evaluator), but also any type of quantitative indicator, e.g. QALYs, deaths/lives saved, % GDP, $$$$
2. Make haphazard judgements about impact being absolute (e.g. give everything 4* because impact is impact) – more the societal-impact focused evaluator.
• What will tip the scales will be worked out during the evaluation process “…learn on
the job.”
• Post-evaluation interviews will serve to investigate this further.