
Research metrics and indicators


The metric tide – Stephen Curry, Imperial College London, and Ben Johnson, HEFCE
Open infrastructures - Clifford Tatum, Leiden University
Open citation – Cameron Neylon, Curtin University

Jisc and CNI conference, 6 July 2016



  1. Research metrics and indicators Chair: Anne Trefethen, University of Oxford 14/07/2016
  2. Introduction Chair: Anne Trefethen, University of Oxford
  3. The metric tide Stephen Curry, Imperial College London - Ben Johnson, HEFCE
  4.–19. [Slides 4–19 contain no text; the footer placeholder was never filled in]
  20. Responsible research metrics Ben Johnson, Research Policy Adviser, HEFCE, 6 July 2016
  21. About HEFCE
      • We invest in the teaching, research and knowledge exchange activities of English higher education institutions: £4bn / year
      • We regulate and oversee English HEIs (quality and financial sustainability)
      • We operate the UK-wide Research Excellence Framework (REF): www.ref.ac.uk
  22. Measurement matters If we can tell the good research from the bad:
      • Researchers can build on high-quality work and pursue promising directions
      • HEIs can appoint and promote good researchers, and support good departments
      • Funders and governments can benchmark performance and fund high-quality research
      …and if we can put numbers on it, we can be more objective in our decisions!
  23. For researchers, quality = publishing
  24. Metrics everywhere!
  25. “I have asked HEFCE to undertake a review of the role of metrics in research assessment and management. The review will consider the robustness of metrics across different disciplines and assess their potential contribution to the development of research excellence and impact…” David Willetts, Minister for Universities & Science, speech to Universities UK, 3 April 2014
  26. Steering group The review was chaired by James Wilsdon, Professor of Science and Democracy at the Science Policy Research Unit (SPRU), University of Sussex. He was supported by an independent steering group and a secretariat from HEFCE’s Research Policy Team:
      • Dr Liz Allen (Head of Evaluation, Wellcome Trust)
      • Dr Eleonora Belfiore (Associate Professor of Cultural Policy, University of Warwick)
      • Sir Philip Campbell (Editor-in-Chief, Nature)
      • Professor Stephen Curry (Department of Life Sciences, Imperial College London)
      • Dr Steven Hill (Head of Research Policy, HEFCE)
      • Professor Richard Jones FRS (Pro-Vice-Chancellor for Research and Innovation, University of Sheffield) – representing the Royal Society
      • Professor Roger Kain FBA (Dean and Chief Executive, School of Advanced Study, University of London) – representing the British Academy
      • Dr Simon Kerridge (Director of Research Services, University of Kent) – representing the Association of Research Managers and Administrators
      • Professor Mike Thelwall (Statistical Cybermetrics Research Group, University of Wolverhampton)
      • Jane Tinkler (Parliamentary Office of Science & Technology)
      • Dr Ian Viney (Head of Evaluation, Medical Research Council) – representing RCUK
      • Professor Paul Wouters (Centre for Science & Technology Studies, Leiden University)
  27. Approach and evidence sources
      • Steering group: diverse expertise and extensive involvement
      • Broad terms of reference: opening up, rather than closing down, questions
      • Transparent: publishing minutes and evidence in real time
      • Formal call for evidence, May to June 2014: 153 responses (44% HEIs, 27% individuals, 18% learned societies, 7% providers, 2% mission groups, 2% other)
      • Stakeholder engagement: 30+ events, including 6 review workshops (among them sessions on equality & diversity and on arts & humanities); invited the fiercest critics; ongoing consultation and use of social media, e.g. #hefcemetrics
      • In-depth literature review
      • Quantitative correlation exercise relating REF outcomes to indicators of research
      • Linkage to HEFCE’s evaluations of REF projects
  28. http://www.hefce.ac.uk/rsrch/metrics/ http://www.responsiblemetrics.org
  29. The Metric Tide Headline findings (a selection)
  30. Across the research community, the description, production and consumption of ‘metrics’ remains contested and open to misunderstandings.
  31. Peer review, despite its flaws and limitations, continues to command widespread support across disciplines. Metrics should support, not supplant, expert judgement.
  32. Inappropriate indicators create perverse incentives. There is legitimate concern that some quantitative indicators can be gamed, or can lead to unintended consequences.
  33. Indicators can only meet their potential if they are underpinned by an open and interoperable data infrastructure.
  34. Our correlation analysis of the REF2014 results at output-by-author level has shown that individual metrics cannot provide a like-for-like replacement for REF peer review.
  35. Within the REF, it is not currently feasible to assess the quality of research outputs using quantitative indicators alone, or to replace narrative impact case studies and templates.
  36. Responsible metrics Responsible metrics can be understood in terms of:
      • Robustness: basing metrics on the best possible data in terms of accuracy and scope;
      • Humility: recognizing that quantitative evaluation should support – but not supplant – qualitative, expert assessment;
      • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
      • Diversity: accounting for variation by field, using a variety of indicators to reflect and support a plurality of research and researcher career paths;
      • Reflexivity: recognizing the potential and systemic effects of indicators and updating them in response.
  37. The Metric Tide Recommendations (a selection)
  38. The research community should develop a more sophisticated and nuanced approach to the contribution and limitations of quantitative indicators.
  39. At an institutional level, HEI leaders should develop a clear statement of principles on their approach to research management and assessment, including the role of indicators.
  40. Research managers and administrators should champion these principles and the use of responsible metrics within their institutions.
  41. HR managers and recruitment or promotion panels in HEIs should be explicit about the criteria used for academic appointment and promotion decisions.
  42. Individual researchers should be mindful of the limitations of particular indicators in the way they present their own CVs and evaluate the work of colleagues.
  43. Like HEIs, research funders should develop their own context-specific principles for the use of quantitative indicators in research assessment and management.
  44. Data providers, analysts and producers of university rankings and league tables should strive for greater transparency and interoperability between different measurement systems.
  45. Publishers should reduce emphasis on journal impact factors as a promotional tool, and only use them in the context of a variety of journal-based metrics that provide a richer view of performance.
  46. There is a need for greater transparency and openness in research data infrastructure. Principles should be developed to support open, trustworthy research information management.
  47. The UK research system should take full advantage of ORCID as its preferred system of unique identifiers. ORCID iDs should be mandatory for all researchers in the next REF.
  48. The use of digital object identifiers (DOIs) should be extended to cover all research outputs.
  49. HEFCE, funders, HEIs and Jisc should explore how to leverage data held in existing platforms to support the REF process, and vice versa.
  50. In assessing impact, we recommend that HEFCE builds on the analysis of the impact case studies from REF2014 to develop clear guidelines for the use of quantitative indicators in future impact case studies.
  51. In assessing the research environment in the REF, we recommend that the role of quantitative data is enhanced.
  52. The community needs a mechanism to carry forward this agenda. We propose a Forum for Responsible Metrics, to bring together key players to work on data standards, openness, interoperability and transparency.
  53. Other developments
      • HE and Research Bill
      • Stern review: expected July 2016; extensive referencing of The Metric Tide in sector responses
      • EC expert group on altmetrics
  54. Open citation Cameron Neylon, Curtin University
  55.–139. »AWAITING PRESENTATION
  140. Open infrastructures Clifford Tatum, Leiden University
  141.–169. »AWAITING PRESENTATION
  170. Research metrics and indicators
