3. University rankings
Academic Ranking of World Universities (Shanghai)
QS World University Rankings
Times Higher Education World University Rankings
Webometrics
… various others
4. REF 2014
36 Units of Assessment
Environment: 15%
Impact: 20%
Research Outputs: 65%
5. Funders, eg RCUK
RCUK’s Research Outcomes System (ROS)
An evidence base: reports to Government, the public and other organisations.
publications
Other… (eg) new materials, exhibitions and websites
staff development
collaborations and partnerships
communication and dissemination activities
summaries of impact.
6. Institution / Head of Department
recruitment stage / performance review
Looking for:
1. Number of recent (quality) publications
2. Grant income
3. PhD students supervised
4. Things measured by funders & important rankings
Snowballmetrics project
Showcasing their researchers
7. Researchers for themselves / as peers
1. What to put on CV/in performance
review documents.
2. What should they do to reach their target audience? To enhance citations?
3. Peer review for journals (and on REF panels)
4. Editors of journals review the reviewers!
9. Journal title: short-hand for quality?
Cachet, because of:
History.
Rigorous peer review & editing.
Attract many submissions & choose the best.
Circulation/audience.
Investment in a high profile.
Good indexes/discovery tools.
10. Journal info sources & rankings
Ulrich’s Periodicals Directory
DOAJ
JCR – 2/5 year IF, Immediacy Index, Cited half-life, Eigenfactor.
SJR & SNIP
Google Scholar’s “Metrics”.
European Reference Index for the Humanities
Harzing.com
11. Article level measures
Sorting of results by citations.
Display of the number of visitors/downloads/citations per article.
Reviews and comments by readers: on the journal site, on blogs, SSRN, Twitter, etc.
Likes & bookmarks.
Scores or ratings by readers?
News/media coverage.
12.
13.
14. Librarians as...
Repository/CRIS managers: a source of info for others
Guides to Library subscription sources: citation data, publishing patterns & trends
Expert advisers on alternative metrics and how they are best used
15. The future?
An Article level economy.
Availability of altmetrics will support interest in kinds of outputs other than journal articles.
My title could just as easily be, “Why are librarians relevant to research evaluation?”
LIBRARIANS: They use citation data from sources such as Thomson Reuters’ Web of Knowledge (WoK) and Elsevier’s Scopus. These sources are bought and provided by the Library. Management might ask Librarians how to make the most of these tools, or indeed be interested to hear from Librarians about how the tools can be used to analyse and predict performance in elements of such rankings.
Becoming more sophisticated: there is already talk about what will happen in REF 2020!

1. Research Outputs (65%): assessed for ‘originality, significance and rigour’ with reference to international research quality standards. Panels will also recognise the significance of outputs beyond academia wherever appropriate.
2. Impact (20%): the reach and significance of research on the economy, society and/or culture. This specifically excludes impact on, or advancement of, academic knowledge within the HE sector.
3. Environment (15%): the ‘vitality and sustainability’ of the research environment, including its contribution to the wider discipline or research base.

ERA 2012 in Australia.

LIBRARIANS: Research outputs reporting often comes from the institutional repository or CRIS, which is sometimes run by the Library on behalf of the institution. Library provision is also relevant to the Environment statements. Researchers are looking for advice about “impact”, as this is new: an opportunity for the Library? Especially for those who understand social media and teach its use to students: they can adapt this advice for researchers.
- The US National Science Foundation asks for “products” rather than “articles” in the biographical-sketch section of grant applications (see the article in Nature: http://www.nature.com/nature/journal/v495/n7442/full/495437a.html).

Also, further funding? Impact: recognition for their work, and exploitation of it by others.

LIBRARIANS: Repository/CRIS links with the research office and reporting to ROS. Likely to be using metadata expertise in recording & sharing data. Similar to the role in support of REF reporting. (One source of data on collaborations and partnerships is co-authorship, and many Library-subscribed products will help to track such partnerships and thus be of use in such reporting.)
Snowball Metrics: Elsevier are involved, along with 8 UK universities. A recipe book of methodologies: e.g. No. 2 might be about the number of applications made, or the number of awards, or the level of income. They have chosen both the number and the value of awards, by year and by quarter, and both flat and per FTE at the institution, as a metric that would allow benchmarking between participating institutions.

LIBRARIANS: No. 1: the source of data might be the repository or CRIS, managed by the Library. The quality of the publications, and how to measure it, is a big topic! (No. 2: we host workshops…) (No. 3: librarians support PhD students.) No. 4: one source of data on collaborations and partnerships is co-authorship, and many Library-subscribed products will help to track such partnerships. Look for internal partnerships, international collaborations, disciplinary collaborations, etc.: identify good practice, and grow such practice.
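The per-FTE normalisation mentioned above can be sketched in a few lines; the figures in the example are illustrative only, not drawn from the Snowball Metrics recipe book:

```python
def awards_per_fte(award_count, award_value, fte):
    """Normalise award metrics by institutional size, so that
    institutions of different sizes can be benchmarked fairly.

    award_count -- number of awards in the period
    award_value -- total value of those awards
    fte         -- full-time-equivalent research staff
    """
    return award_count / fte, award_value / fte

# Hypothetical institution: 120 awards worth 6,000,000 across 300 FTE.
count_per_fte, value_per_fte = awards_per_fte(120, 6000000, 300)
print(count_per_fte, value_per_fte)  # 0.4 20000.0
```

The flat counts remain available alongside the normalised figures, so a benchmark report can show both, as the project proposes.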
LIBRARIANS: 1) Offer training sessions on the h-index and its ilk, and how to look it up. Might need to consider whether to display metrics on the articles in the institutional repository. How to tell a good story about your research! 2) Help them to think about a “publication strategy” and which elements are important to them: impact factors and where to find them, along with the other measures available at journal-title level, and other characteristics of a journal that might be of interest to them. Librarians know journals and have been evaluating them for years, albeit for a different purpose! 3) REF panel members in the sciences will get citation data about the outputs: understanding citation data is therefore an important skill for a researcher.
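As a reminder of what such a training session covers: the h-index is the largest h such that the researcher has h papers with at least h citations each. A minimal sketch, with made-up citation counts:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each
    (Hirsch's definition)."""
    h = 0
    # Rank papers from most to least cited; h is the last rank at
    # which the citation count still reaches the rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers with 10, 8, 5, 4 and 3 citations: four papers have at
# least 4 citations each, but there are not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Being able to explain a small worked example like this is often more useful in a training session than pointing at the number in a database.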
Most of those relevant to Librarians relate to measuring the quality of journals or outputs.
REF panels: do they read every output? Are they predisposed to like certain articles because of the significant journal each appears in? Journal brands: “Nature, Science, The Lancet”… The best research was not in the article in Nature but somewhere less well known, though academically prestigious: but which was put forward for the RAE/REF?

Audience: English language? Open Access? Just because people can read it on the web doesn’t mean that they will!

A recent blog article by a researcher who published in PLoS ONE, talking through his decision not to publish there again for a while, indicates that researchers do care about the brand and reputation of the journal, but that one journal title might mean different things to different researchers of different generations.

LIBRARIANS: Explaining the different types of peer review and what acceptance/rejection rates mean. Which journal indexing sites should the journal appear on? Not every journal has such a clearly recognisable brand, so how do you differentiate?
LIBRARIANS: Classic territory! Find quality info sources and help researchers to use them! Not all journal titles have an IF. How to look up Impact Factors and other measures on these tools. The subject categories make a difference. “Data should tell a story”: are the journals increasing their IF, year on year? Understanding the strengths and weaknesses of citation measurement is important to using such tools.
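To make the two-year Impact Factor concrete for such a session: the IF for year Y is the number of citations received in Y to items the journal published in Y−1 and Y−2, divided by the number of citable items it published in those two years. A minimal sketch, with invented figures:

```python
def two_year_impact_factor(citations_in_year, citable_items_prev_two_years):
    """Citations received in year Y to items published in Y-1 and Y-2,
    divided by the citable items published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Hypothetical journal: 600 citations in 2013 to its 2011-12 content,
# which comprised 200 citable items.
print(two_year_impact_factor(600, 200))  # 3.0
```

The five-year IF is the same ratio over a five-year publication window; which window is more appropriate depends on how quickly the field accumulates citations.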
SSRN = Social Science Research Network! These things are otherwise known as “altmetrics” – alternatives to the citation metrics are available: “measures of scholarly impact mined from activity in online tools and environments” (Jason Priem).

Mendeley scores, RG scores: proprietary, and not much better than citation measurement?! Open to manipulation; not verifiable; lack of clarity over how the measures are produced; lack of an authoritative source. But is citation measurement any better?!

Potential: a role in impact assessment. Good if you want to measure an open, collaborative approach to research? As well as, not instead of, other measures. (IRUS project to collate article download stats.) Not only for articles: relevant for scholarly blogs/tweets and other kinds of output. Downloads for internal views & analysis only: the basis of awards/privileges, eg Springer OA. Shared with authors only, eg repository stats?? A role in post-publication peer review systems? National Information Standards Organization (NISO) involvement… long term.

LIBRARIANS can recommend such sorting to researchers when searching. Journals, repositories and authors can all display metrics at this level.
Displays his author IDs (more to come!). More than one h-index. Microsoft Academic Search. The left-hand side shows his networks of collaboration, grants & other information.
PLoS ONE and other PLoS titles are often given as an example of this. The Altmetric tool provides the data for this; it is described as a measure of the “social impact of scholarly literature”. Other tools can be found on the altmetrics.org website.
How are Librarians relevant to Researcher Performance measurement?
This really answers the question, “How is researcher performance measurement relevant to Librarians?”

Article level, in an OA mega-journal? Readers will need to evaluate the articles, and to understand the metrics displayed about them: LIBRARIANS as guides. Other kinds of outputs: how are these presented, collated/acquired, discovered & used? LIBRARIANS as collectors and as guides. The publishing landscape is changing, and researchers do get worked up. Librarians with technical expertise and answers are reassuring to them.