
Supporting the REF


Slide presentation by Kate Bradbury: Research Counts

Published in: Education


  1. Research Counts: helping our researchers prepare for the next REF. Kate Bradbury, Research Support, Cardiff University Library.
  2. Main elements: 1. Open Access; 2. Bibliometrics, including Altmetrics; 3. ORCID and other IDs. Preparing for the next REF submission will have to be a continuous process, not a huge effort a year or so before the deadline.
  3. 1. Open Access. HEFCE requirements: Gold or Green Open Access for journal articles and conference proceedings, deposited in a repository within 3 months of acceptance; any embargo must be a maximum of 12 months for panels A & B or 24 months for panels C & D.
  4. HEFCE: “Our analysis of a sample of journal articles and conference proceedings submitted to the current REF shows that authors could have achieved 96 per cent compliance with the access requirements in this policy, had the policy been in place for REF2014. The remaining 4 per cent of outputs would have remained eligible for submission to the REF as exceptions.” Based on a sample of 413 journal articles and conference papers.
  5. Sherpa Romeo database and Scopus: coverage. Sherpa Romeo lists 1,743 publishers, of which 68% allow archiving of post-prints (source: Sherpa Romeo statistics page, accessed 26/12/2014). Scopus covers 21K+ active titles from 5K+ publishers in 100+ countries; OA titles are about 12% (source: Scopus content overview and title list, 26/12/2014). Titles by country: United States 27%, United Kingdom 21%, Netherlands 9%, Germany 7%, China 3%. By publisher: Elsevier 10%, Springer 8%, Wiley-Blackwell 5%, Taylor & Francis 5%, SAGE 2%.
  6. Preliminary analysis of 2 institutional RAE2008 returns. • About 2.4 papers per journal title. • Over half of the journal titles had only one paper. • For the titles that could be matched to publishers from Scopus (about 80%), the top 5 publishers (Elsevier, Wiley-Blackwell, Taylor & Francis, Springer and Sage) accounted for over 50% of the journal titles and over 45% of the papers. • For both institutions, there were over 200 publishers and over 1,000 journal titles. Source of publisher information: Scopus title list. Source of RAE2008 data:
  7. Implications for libraries. • A high percentage of papers are with the major publishers. • However, there is a long “tail” of publishers with only a few papers, or one. • REF outputs are only a selection: institutional repositories will have to capture all publications within the three-month deadline, as they will not know in advance what will be submitted for the REF. • In a recent report by Research Consulting, “Counting the Costs of Open Access”, OA costs were put at £81 per paper for Gold OA and £33 for Green OA. • This soon adds up!
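As a rough illustration of how those per-paper figures scale, here is a minimal sketch. The £81 (Gold) and £33 (Green) administrative costs come from the Research Consulting report cited above; the annual output volume and the Gold/Green split in the example are hypothetical:

```python
# Per-paper administrative costs (GBP) from the Research Consulting report.
GOLD_COST = 81
GREEN_COST = 33

def annual_admin_cost(n_papers: int, gold_share: float) -> int:
    """Estimated annual OA admin cost for n_papers, with a given
    fraction handled via Gold OA and the rest via Green OA."""
    n_gold = round(n_papers * gold_share)
    n_green = n_papers - n_gold
    return n_gold * GOLD_COST + n_green * GREEN_COST

# A hypothetical institution producing 3,000 papers a year, 20% via Gold:
print(annual_admin_cost(3000, 0.20))  # 600*81 + 2400*33 = 127800
```

Even with the cheaper Green route dominating, the administrative cost alone runs to six figures per year at this (assumed) volume.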
  8. Scams & “predatory publishers”. • See Jeffrey Beall’s “Scholarly Open Access” site; note this is his opinion only and publishers may disagree! His blog highlights new additions. • Scams, e.g. hijacked sites. • Predatory publishers, e.g. publishers who take a fee for OA but do little more than put the article up on a website. • Role of the Directory of Open Access Journals: DOAJ started to implement its new, stricter criteria for inclusion in March 2014. It currently lists 10K+ journals, and all existing journals will have to reapply to stay indexed.
  9. What can researchers do to avoid “predatory publishers”? A selection of Jeffrey Beall’s advice (see his full list of criteria): • Analyse the site and publications and look for characteristics of predatory publishers. • Check the editors/editorial board, e.g. academic qualifications, duplication across titles. • Look at business management practices, e.g. long-term preservation, transparency. • Integrity: look for, e.g., false claims of impact factors and/or of inclusion in Abstracting & Indexing databases. • Other signs, e.g. minimal copy editing, publication of pseudo-science, quick turnaround for peer review.
  10. Summary: what do researchers need to do? • Check the OA policies of their publisher before submitting an article. • Verify the quality of any OA journal publisher; be aware that there are plenty of scams and predatory publishers. • If the publisher’s policy is not compliant, try negotiation. • If the article is accepted, submit the post-print to the institutional repository immediately, together with any information on embargoes.
  11. Some national initiatives & information. • JISC Monitor Project: see, e.g., the HEFCE Open Access Policy Workshop outcomes for a good summary of the many issues, considerations and suggestions around complying with HEFCE’s OA policy. • HEFCE Policy Guide on OA research: the FAQs, for example, are useful. • CASRAI-UK Open Access Working Group: working on a standard open access metadata specification incorporating the REF. • RIOXX Metadata Profile and Guidelines: being developed to help repositories fulfil RCUK and HEFCE REF requirements. • Research Consulting, “Counting the Costs of Open Access”, Nov 2014: in addition to analysing costs, this provides a useful outline of the tasks involved and the staff requirements in meeting HEFCE’s OA policy.
  12. 2. Bibliometrics. • In the last REF, for Panel A and some of Panel B, citation counts for outputs were included, with contextual data for benchmarking. • Earlier this year, HEFCE issued a metrics call-for-evidence document. • Comments received by HEFCE are now online. Respondents were asked to focus on four key issues: identifying useful metrics for research assessment; how metrics should be used in research assessment; ‘gaming’ and strategic use of metrics; and international perspectives.
  13. Metrics consultation 2014: quotes from the Summary of Responses. • “57 % of the [153] responses expressed overall scepticism about the further introduction of metrics into research assessment.” • “The most frequently cited argument was that the use of metrics could unfairly disadvantage some disciplines, particularly in the arts, humanities and social sciences.” • “Just under a fifth of responses supported the increased use of metrics in assessing research … Many supportive or ambivalent responses expressed the view that, in context, robust metrics could enhance the research assessment process.” • “a common theme that emerged was that peer review should be retained as the primary mechanism for evaluating research quality.” • “Several responses, both sceptical and supportive, stated that they would welcome a thorough review of the role of metrics in the REF.”
  14. Implications. • Responses indicate that there is still no agreement that the REF should be assessed fully by bibliometrics. • There is increased interest in bibliometrics data, not just for the REF but also to monitor league table performance and for strategic decision-making. • Many universities are subscribing to InCites or SciVal, and many more are considering them. These databases supply, among other indicators, benchmarking data that cannot easily be compiled using Scopus or Web of Science. • The databases are often supported by bibliometrician posts.
  15. Altmetrics. • Altmetrics were mentioned by some of the respondents to the REF consultation, and some suggest that they could be used as a research measurement tool. • They are useful in bringing together responses to research, and could help with finding case studies for impact. • We have added altmetrics to our repository [example], in addition to download data, search engine data and the search terms used.
  16. Altmetrics: what should our researchers consider? Why use altmetrics? • Receiving feedback. • Facilitating contacts and collaborations. • Demonstrating public engagement. • Providing additional information and generating discussion about a piece of work. • Showcasing recent work. • Demonstrating the impact of scholarly work other than journal articles, e.g. data sets. • May be an indicator of future citations. Some concerns: • Gaming. • Selectivity: not all researchers use social media. • A measure of quantity, not quality. • Misinterpretation and misuse. • Normalisation of data. • Merging and de-duplicating information from the different places where the same publication is listed. • Altmetrics should be used together with other measures and evaluation, not in isolation; they should not replace evaluation of impact. See, for example, LSE’s Impact Blog for further information and discussion about altmetrics.
  17. Summary: bibliometrics questions from researchers. • How to find data. • Subject differences. • Coverage of publications: differences between databases and the reasons why publications may not be covered. • Use and misuse of data. • Interpretation and understanding of data. • “Best” source of citation data. • “Best” indicators to use.
  18. 3. ORCID. What is an ORCID? • A unique, persistent author ID that remains the same despite name changes or changes of institution. • The author includes a complete list of publications. • Can be linked with ResearcherID and a Scopus author profile for consistency and accuracy. Why register for an ORCID, particularly if already registered for other IDs? • Avoids confusion with work by other authors. • Ensures visibility of, and ease of access to, a complete list of publications. • International, and not linked to one company or database. • Publishers will use ORCID to link a publication to the author from the start to the end of the publication process.
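Incidentally, the final character of an ORCID iD is a checksum (ISO 7064 MOD 11-2), which is part of what makes the identifier robust: a mistyped iD can be caught before any lookup. A minimal sketch in Python, using ORCID’s published example iD:

```python
def orcid_check_digit(base_digits: str) -> str:
    """ISO 7064 MOD 11-2 check digit for the first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate an ORCID iD of the form 0000-0000-0000-0000
    (the final character may be X)."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-1825-0097"))  # True (ORCID's example iD)
```

A repository or CRIS can run this check on entry, so that only a genuinely mistyped iD ever needs a manual follow-up.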
  19. 3. ORCID: some projects. • Jisc-ARMA ORCID Pilot Project: “The aim of the pilot project is to streamline the ORCID implementation process at universities and to develop the best value approach for a potential UK wide adoption of ORCID in higher education.” • ORCID Adoption and Integration Programme: “provides external funding for universities and science and social science professional associations to integrate ORCID identifiers, supports the collaborative elicitation and documentation of use cases and open source code, and establishes a collaborative venue for disseminating best practices.” Why is this useful for the REF? • Ensures accuracy and helps promote publications. • Will help with the interoperability of systems. • A comprehensive publications list demonstrates how current work is underpinned by previous research.
  20. Conclusion. • I have focussed on Open Access, metrics and ORCID here. • Some other areas that impact on the REF, and where librarians are developing services, include: publishing initiatives; research data management; and the use of social media to promote research and help with impact and engagement. • Progress in these areas should result in closer working relationships across the institution, between researchers, librarians and other professional services. Thank you!