Final delasalle for uksg


  • Why the interest in citations? Studies have shown correlation with the outcomes of RAE panel decisions, and the European Commission and US national scientific bodies are also interested: interest in bibliometrics is international! There is also the RCUK perspective on impact, and university management measures research more broadly than just citations. Google Scholar data is used as part of the Webometrics World ranking of universities on the web (Warwick is no. 112). Citation bibliometrics are quantitative, and therefore summaries of large-scale activity can be produced: this is complementary to the “local view” of peer review. Bias (e.g. over where a paper was written, or previous reputations) is reduced when large data collections are viewed by external people.
  • There is a potential role for librarians in understanding these measures and explaining them to researchers: there is constant change, with new measures being introduced all the time. There is a variety of publishing routes these days, and researchers in different contexts will value different output types: journal articles, monographs, blogs and tweets. Among those who value journal articles, there are measures of the quality of the journal and therefore of the research described in it. Just a paper count is not good enough! Webometrics could include: geographic coverage of visitors, academic networks of visitors, and people linking to your article or site.
  • Scopus data is not so good at recording author affiliations, as it has only recorded them for all authors since 2003, and it only covers articles published since 1995.
  • See slide no. 38: it took 100 hours with WoS to gather citation data for a department; to do the same thing with Google Scholar took 3,000 hours.
  • Mixture of good and bad motivations!
  • E.g. many articles in Physics have many authors, when compared to History. Within a discipline more concerned with articles, citation numbers can also be higher: there are more articles to make citations! E.g. Mathematics might not attract so many citations immediately, but will continue to be cited years after the end of a Physics article’s peak. Average numbers of citations vary across disciplines, whether mean or median, and in some disciplines the highly cited articles skew the averages more than in others.
  • Second-generation counts are like Google PageRank: far more difficult to calculate, but indicative of long-term impact. It is the number of citations of the papers which cite yours. Percentile indicators are calculated by taking the year, journal and subject category of a paper, creating a citation frequency index of all papers matching those criteria, and determining the percentage of papers at each level of citation. This places a paper relative to others in its field.
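  The percentile calculation described in this note can be sketched in a few lines of Python. This is an illustrative reading of the method, not any provider’s exact algorithm: real implementations differ on tie-handling and on how the reference set is matched, so treat the `citation_percentile` function and its sample data as assumptions.

```python
def citation_percentile(paper_cites, matched_cites):
    """Percentile of one paper within a reference set of papers
    matched on year, journal and subject category: here, the share
    of matched papers with strictly fewer citations.
    (Tie-handling conventions vary between providers.)"""
    below = sum(1 for c in matched_cites if c < paper_cites)
    return 100.0 * below / len(matched_cites)

# Hypothetical reference set of 7 matched papers:
field = [0, 0, 1, 2, 5, 11, 20]
print(citation_percentile(11, field))  # 5 of the 7 papers are cited less
```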
  • (NB, need to define “high”: top 10 for your subject?)
  • This ranking was mentioned by the VC in his end-of-academic-year message 2009/10. Purple text = uses Thomson Reuters’ data. We are joint no. 152–200! We score 21.8 on Highly Cited, and there are institutions who score higher overall but perform worse on this measure. Harvard (no. 1) scores 100 at this, though! Cambridge is 4th in the world overall, scoring 53.8 on highly cited researchers. The highest-placed university with a score in the 20s for this measure is Rockefeller, with 29 for HiCi and an overall place of 32nd. Warwick’s Output score is 37.7 for publications indexed in the TR citation indices; Harvard got 100 and Cambridge 65.4.
  • The Library’s subscriptions to citation data sources; data in library-run institutional repositories. If yes, describe the role as you see it (with a particular institution in mind?). Advice to whom, about what, in what form? What advice do/would you give? If no, who does have this role (if anyone!)? Should librarians be taught bibliometrics as part of an MA in Librarianship?

    1. Role of the library in research evaluation
       Jenny Delasalle, Academic Support Manager (Research), University of Warwick
    2. Today’s session will cover…
       • Overview of research evaluation in the UK context.
       • Lots of stuff about bibliometrics!
       • How librarians’ expertise is relevant.
       • What can a library do, and what should a library do, with regard to research evaluation?
       • Discussion!
    3. Who is interested in research evaluation?
       • HEFCE, through REF 2014: funding and reputation.
       • University management: central, department heads.
         – Indication of staff performance
         – Recruitment and “head hunting”!
         – Targeting of support
         – Demonstrating capabilities/accomplishments
         – University rankings
       • Researchers themselves: collaborations.
         – Peer reviewers on REF panels, etc.
    4. Possible measures…
       • “Bibliometrics”
         – Number of outputs (in quality publications)
         – Number of citations / calculations based on this
         – Write on a card some appropriate measures
       • Involvement as a peer reviewer
       • Journal editorships
       • Research grant applications
       • Research income
       • Prestigious awards
       • PhD supervision load
    5. About REF 2014
       • “Led by expert review, informed by metrics.”
       • They are looking for “impact”: citations are just one measure.
       • 65% outputs, 20% impact, 15% environment.
       • Adapting to disciplinary differences: 36 UoAs.
       • Panel criteria published in January 2012.
       • Not every researcher is eligible for submission.
    6. Output-based measures (bibliometrics, webometrics, altmetrics)
       – Paper counts
       – Journal impact factors
       – The h-index
       – Citation scores at article level
       – Visitor numbers (or other info) for online articles
       – And many others… e.g. blog entries, tags, etc.
       Role for librarians, advising on use?
    7. Main sources of citations data
       • Web of Science – Thomson Reuters (used for THE rankings)
       • Scopus – Elsevier (used for REF 2014, in sciences)
       • Google Scholar… various tools available. (Used for REF 2014, computer science only)
    8. About WoS & Scopus citation data

       Scopus                           WoS
       Approx. 17,000 journals          Approx. 11,000 journals
       Citations in journals            Relatively poor coverage
       since 1996                       of conferences?
       Broadest subject coverage        Science & social sciences origins
       SCImago Journal Rank             JCR Impact Factor
    9. Google Scholar
       • Beware:
         – We’ve no idea what’s included or excluded, or how it works (how far back does the data go?)
         – Data is inconsistent and there are no efforts at standardisation
         – Data includes multiple entries, false hits, reading lists, etc.
         – It lacks the sophisticated search functionality of Scopus & WoS
       • Benefits:
         – It includes citation data: external analysis tools.
         – Best arts and humanities coverage.
         – It identifies material which is not yet indexed by WoS.
         – Google search options are easy to learn / already familiar.
         – It has an ‘advanced’ search option.
         – It’s fast at bringing you search results.
    10. Be careful of citation measurement: motivations for citations
       • Paying homage to experts
         – Especially those likely to be peer reviewers!
         – Lends weight to your own claims
       • Credit to peers whose work you have built upon
       • Providing background reading
       • Criticising/correcting previous work
       • Sign-posting under-noticed work
         – (your own paper, which would affect your h-index!)
       • Self-citations!
    11. Citation patterns
       • Most publications have few or no citations.
       • Variety across the disciplines.
       • Therefore comparisons within a discipline are most useful.
       • Percentages against a world average within each discipline are more useful than basic numbers.
    12. About the h-index
       • Invented by Jorge E. Hirsch, a physicist, in 2005.
       • An algorithm to calculate the quality and sustainability of research output.
       • Calculated using the number of publications and the number of citations per output.
       • A researcher with an index of h has published h papers, each of which has been cited by others at least h times.
       • E.g. an h-index of 20 means there are 20 published papers, each with at least 20 citations.
    13. Example h-index
       E.g. Professor X has a total of 10 publications:
       Publication 1: 20 cites
       Publication 2: 18 cites
       Publication 3: 11 cites
       Publication 4: 7 cites
       ------------------------------------------- h-index: 4
       Publication 5: 4 cites
       Publication 6: 3 cites
       Publications 7, 8, 9, 10: 0 cites
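       The definition on slide 12 translates directly into code. A minimal sketch in Python (the sort-then-scan approach is a standard way to compute it; the function name is my own):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # this paper still supports a larger h
        else:
            break     # citations are sorted, so no later paper can
    return h

# Professor X's ten publications from the slide:
print(h_index([20, 18, 11, 7, 4, 3, 0, 0, 0, 0]))  # prints 4
```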
    14. Other than the h-index
       • M-index = h/n, where n is the number of years since the first published paper.
       • C-index: accounts for the quality of the citations.
       • G-index: gives more weight to highly cited articles.
       • H-1-index: how far away from gaining one more point on the h-index.
       • E-index: surplus citations in the h set!
       • Contemporary h-index: recent activity.
       • Google’s i10-index: number of papers with at least 10 citations.
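       Two of the variants listed here are simple enough to sketch alongside the h-index: the g-index (largest g such that the top g papers together have at least g² citations) and Google’s i10-index. These are the commonly quoted definitions; treat the code as an illustrative sketch rather than any database’s official calculation.

```python
def g_index(citations):
    """Largest g such that the g most-cited papers together
    have at least g*g citations (rewards highly cited papers)."""
    g, running_total = 0, 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        running_total += cites
        if running_total >= rank * rank:
            g = rank
    return g

def i10_index(citations):
    """Google Scholar's i10-index: papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Professor X's publications again:
cites = [20, 18, 11, 7, 4, 3, 0, 0, 0, 0]
print(g_index(cites), i10_index(cites))  # prints 7 3
```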
    15. The author’s perspective
       • A record of what you have published will be useful for:
         – your CV.
         – providing information to university data-gathering exercises (department level or REF).
         – web pages that describe your work.
       • Keeping an eye on who is citing your work helps you to:
         – identify potential future collaborators.
         – maintain awareness of other research in your field and interpretations of your work.
         – become aware of which articles are influencing your research profile the most.
    16. Advice to researchers: tell a good story…
       • List your articles and the number of citations for each.
       • Average citation number for papers over 2 years old?
       • Is it high or low for your discipline?
       • Compare your article’s citation count to the average for the journal your article appears in, for the year of publication.
       • Add context: who has cited your work? Anyone particularly impressive?!
    17. Other things to measure…
       • Number of articles with no citations at all?
       • Number of joint articles, and who the co-authors are, to indicate collegiality and interdisciplinarity.
       • Number of articles published in a quality (high impact factor?) journal.
       • (Not only articles, or even outputs!)
    18. PLoS example (1)
    19. PLoS example (2)
    20. WRAP example
    21. Gaining visitors to your paper
       • Boost your Google juice!
         – Put a link to your paper everywhere you can: Wikipedia? And other profile sites.
         – Get someone else to cite your paper, even in a draft paper online: Google Scholar will pick it up, and having Google Scholar citations seems to help rankings.
         – Get your papers into an OA repository as quickly as possible: date-sensitive.
    22. Shanghai Academic Ranking of World Universities
       • Quality of education: alumni of an institution winning Nobel Prizes and Fields Medals - 10%
       • Quality of faculty:
         – Staff of an institution winning Nobel Prizes and Fields Medals - 20%
         – Highly cited researchers in 21 broad subject categories - 20%
       • Research output:
         – Papers published in Nature and Science* - 20%
         – Papers indexed in Science Citation Index-Expanded and Social Science Citation Index - 20%
       • Per capita performance: per capita academic performance of an institution - 10%
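       The weights on this slide sum to 100%, so an institution’s overall ARWU score is essentially a weighted sum of its six indicator scores (each scaled 0–100, with the top performer on each indicator scoring 100). A sketch under that assumption; the indicator values below are hypothetical, and the real ranking applies further normalisation to the raw counts:

```python
# ARWU indicator weights, as given on the slide.
WEIGHTS = {
    "alumni":  0.10,  # Alumni winning Nobel Prizes / Fields Medals
    "awards":  0.20,  # Staff winning Nobel Prizes / Fields Medals
    "hici":    0.20,  # Highly cited researchers (21 subject categories)
    "n_and_s": 0.20,  # Papers published in Nature and Science
    "pub":     0.20,  # Papers indexed in SCIE / SSCI
    "pcp":     0.10,  # Per capita academic performance
}

def arwu_score(indicator_scores):
    """Weighted sum of the six indicator scores (each 0-100)."""
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS)

# Hypothetical institution:
scores = {"alumni": 10, "awards": 0, "hici": 21.8,
          "n_and_s": 30, "pub": 37.7, "pcp": 25}
print(arwu_score(scores))
```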
    23. Discussion topics!
       • Who to:
         – Researchers, university administrators, HoDs, etc.
       • About:
         – The data sources (Library subscriptions!)
         – How to calculate an h-index
         – Other measures available: which to use when
         – Characteristics of highly cited articles… “career tips”!
       • How:
         – Partnership arrangements/meetings
         – Online guides
         – Training sessions: in the Library, advertised, or as invited to departments
         – Enquiries/consultations
    24. Reading list!
       • Auckland, M. (2012) RLUK: Re-skilling for Research. Accessed 20 March 2012.
       • My blog:
       • JISCMAIL lis-bibliometrics list