What's in the research librarian's tool shed?

Presents an overview of the basic tools, indicators and skills used to support researchers in evaluating, managing and improving their research performance.

  1. What's in the research librarian's tool shed? 26 Aug 2014. Lucia Schoombee, Customer Consultant – Research Intelligence, Elsevier
  2. Contents
     1. Introduction
     2. Nuts and bolts
     3. High-power tools
     4. Hammer
     5. Comprehensive, automated tools
     6. New gadgets
     7. Skill & know-how
     8. Safety & caution
  3. Introduction
     • Researchers want us to "simplify things and help us to be less busy"
     • Evidence of productivity and impact is needed for performance appraisal, promotion, funding applications, ratings and curricula vitae
     • There is a profusion of indicators and tools
     • All of these tools provide quantitative means of articulating a researcher's productivity and impact
     • NB: they are supplementary to peer review
  4. Nuts and bolts
     • Publication counts: mostly books, articles, book chapters and conference proceedings. "Work not published is work not done."
     • Citations: broadly, a citation is a reference to a published or unpublished source (not always the original source). It attributes scientific work and gives credit to the original source of an idea, theory or concept.
     • Citation analysis: measuring the impact of an author or a publication by counting the number of times that author or publication has been cited by other works (see the sketch below)
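At its simplest, citation analysis is just counting which works appear in other works' reference lists. A minimal sketch in Python, using a purely hypothetical set of paper IDs and reference lists invented for this example:

```python
from collections import Counter

# Hypothetical toy dataset: each paper ID maps to the IDs it cites.
references = {
    "paper_A": ["paper_B", "paper_C"],
    "paper_B": ["paper_C"],
    "paper_C": [],
    "paper_D": ["paper_A", "paper_C"],
}

# Citation analysis in its most basic form: count how often each paper
# appears in the reference lists of the other papers.
citation_counts = Counter(
    cited for refs in references.values() for cited in refs
)

for paper, count in citation_counts.most_common():
    print(f"{paper}: cited {count} time(s)")
# paper_C: cited 3 time(s)
# paper_B: cited 1 time(s)
# paper_A: cited 1 time(s)
```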
  5. Indicators
     • An indicator is a composite numeric expression that indicates the state or level of something
     • The nuts and bolts, publication counts and citations, are used to produce indicators of research performance
     • The most common indicators used in science are the h-index, g-index, m-index and journal impact factor
  6. h-index
     • Developed by Prof Jorge Hirsch in 2005
     • The h-index is based on the number of publications and the number of citations per publication
     • The h-index is now recognized as an industry standard that gives useful information about the performance of researchers and research areas in some situations
  7. h-index formula
     • A scientist has an h-index of 9 if their top 9 most-cited publications have each received at least 9 citations; an h-index of 13 if their top 13 most-cited publications have each received at least 13 citations; and so on
     • Formally: a scientist has index h if h of their Np papers have at least h citations each, and the other (Np − h) papers have no more than h citations each (a minimal sketch of the calculation follows below)
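A minimal Python sketch of the calculation implied by this definition; the citation counts passed in at the end are invented for illustration:

```python
def h_index(citations):
    """Largest h such that at least h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)   # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank        # the top `rank` papers all have >= rank citations
        else:
            break
    return h

# Eleven hypothetical papers; the top 9 each have at least 9 citations.
print(h_index([25, 19, 15, 12, 11, 10, 9, 9, 9, 3, 1]))  # -> 9
```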
  8. From the horse's mouth: Prof Dermot Diamond, National Centre for Sensor Research, Dublin City University, talks about bibliometrics for the individual
  9. [Slide figure: 45° line illustration of the h-index]
  10. Advantages of the h-index
     • Combines quantity (publications) and impact (citations)
     • An objective measure of performance
     • Insensitive to lowly cited papers
     • Better than other single-number criteria: impact factor, total number of documents, total number of citations, citations-per-paper rate and number of highly cited papers
     • Easy to obtain
     • Easy to understand
  11. Limitations of the h-index
     • Publication and citation patterns vary between disciplines
     • Not time sensitive
     • The extra citations of highly cited papers are not reflected in the h-index
     • Because it is easy to obtain, there is a risk of indiscriminate use and over-reliance
     • May change the behaviour of scientists (e.g. self-citation)
     • There are also technical limitations: the difficulty of obtaining a scientist's complete output, and deciding whether self-citations should be removed
  12. g-index
     • Given a set of articles ranked in decreasing order of the number of citations they received, the g-index is the (unique) largest number such that the top g articles together received at least g² citations
     • Source: https://who.rocq.inria.fr/Jean-Charles.Gilbert/publications/indices.jpg
  13. g-index calculation
     • Method: rank all the documents of the unit in decreasing order of citations; the g-index is the highest rank position whose square does not exceed the accumulated number of citations (a sketch of this calculation follows below)
     • Source: Costas, R., & Bordons, M. (2008). Is g-index better than h-index? An exploratory study at the individual level. Scientometrics, 77(2), 267-288.
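Following the same definition, a sketch of the g-index calculation; it reuses the invented citation counts from the h-index sketch above to show that g is at least as large as h:

```python
def g_index(citations):
    """Largest g such that the g most-cited papers together have >= g**2 citations."""
    ranked = sorted(citations, reverse=True)   # most-cited first
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites                         # accumulated citations of top `rank` papers
        if total >= rank ** 2:
            g = rank
    return g

citations = [25, 19, 15, 12, 11, 10, 9, 9, 9, 3, 1]
print(g_index(citations))  # -> 11, versus an h-index of 9 for the same set
```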
  14. m-index
     • The h-index depends on the length of each scientist's career, because the pool of publications and citations increases over time
     • To compare scientists at different stages of their careers, Hirsch proposed the "m parameter", which divides h by the scientific age of the scientist (number of years since the author's first publication)
     • The m-index is thus defined as h/n, where n is the number of years since the scientist's first published paper (see the sketch below)
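The m-index is then a one-line derivation from the h-index; the figures in the example call are invented:

```python
def m_index(h, years_since_first_publication):
    """Hirsch's m parameter: h-index divided by scientific age in years."""
    return h / years_since_first_publication

# E.g. an h-index of 9 reached 12 years after the first published paper:
print(round(m_index(9, 12), 2))  # -> 0.75
```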
  15. Comparing the h-, g- and m-indices
  16. High-power tools
     • The most comprehensive and robust sources of publication and citation data are citation indexes
     • E.g. Scopus, Web of Science, Google Scholar
  17. Scopus at a glance
     • The largest abstract and citation database of peer-reviewed research literature from around the world; owned by Elsevier
     • More than 21,900 titles from more than 5,000 international publishers in 105 different countries
     • Over 54 million records, plus 23 million patents from 5 patent offices worldwide
     • All content is evaluated by an independent, 15-person, international board of experts, the Content Selection and Advisory Board (CSAB)
  18. Content
  19. What content does Scopus include?
     • Journals: 21,912 peer-reviewed journals and 367 trade journals; full metadata, abstracts and cited references (pre-1996); more than 2,800 fully open access titles; coverage going back to 1823; funding data from acknowledgements
     • Journal titles by subject area: Physical Sciences 6,600; Health Sciences 6,300; Social Sciences 6,350; Life Sciences 4,050
     • Conferences: 17k events, 5.5M records (10% of the database); conference expansion: 1,000 conferences, 6,000 conference events, 400k conference papers, 5M citations, mainly in engineering and the physical sciences
     • Books: 421 book series (28K volumes, 925K items) and 29,917 books (311K items); books expansion: 75K books by 2015, with a focus on the social sciences and arts & humanities
     • Patents: 24M patents from 5 major patent offices
  20. References
     • 33 million records date back to 1996, 85% of them including cited references
     • 21 million records date back to 1823, without cited references
     • 3 million records are added per year (5,500 records added per day)
  21. Scopus Author Search
  22. Scopus Author Identifier
  23. Scopus Citation Overview
  24. Scopus Author Evaluator
  25. Web of Science
     • ~12,000 journal titles
     • Databases: Science Citation Index Expanded (1900-present), Social Sciences Citation Index (1900-present), Arts & Humanities Citation Index (1975-present), Book Citation Index (2005-present), Conference Proceedings Citation Index (1993-present)
     • 54 million records from 3,300 publishers; "100 years of abstracts"
     • 6.5 million conference records; more than 760 million cited references
     • Updated weekly; since 1955
  26. Google Scholar Citations
  27. Google Scholar Citations
  28. Google Scholar limitations
     • Evasive about exactly which resources it indexes
     • Update schedule unknown
     • Its definition of "scholarly" includes grey literature, such as article preprints, conference proceedings and other materials that are not peer-reviewed
     • BUT Google Scholar is "freely" available and easy to use
  29. Hammer: Journal Impact Factor
     • The most widely used, and most notorious, journal metric
     • Represents the impact a journal has relative to other journals in a specific field
     • Measures the frequency with which the average article in a journal has been cited in a particular year (a worked example follows below)
     • Despite criticism, it is widely used to articulate an author's reputation
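As a worked example, assuming the common two-year definition (citations received in year Y to items the journal published in years Y-1 and Y-2, divided by the number of citable items it published in those two years); the journal figures below are invented:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year journal impact factor for year Y (common definition, assumed here)."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: its 2012-2013 output of 200 citable items
# received 480 citations during 2014.
print(impact_factor(480, 200))  # -> 2.4
```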
  30. Comprehensive, automated tools
     • Multi-dimensional, flexible and efficient tools that handle complex queries and simulations, and can compare entities
     • E.g. SciVal and InCites
  31. SciVal in a nutshell
     • SciVal offers access to the research performance of 220 countries and 4,600 research institutions worldwide, as well as groups of institutions
     • Visualize research performance: ready-made, at-a-glance snapshots of any selected entity
     • Benchmark your progress: flexibility to create and compare any research groups
     • Develop collaborative partnerships: identify and analyze existing and potential collaboration opportunities
  32. The layers of SciVal
     • Using advanced data analytics and supercomputer technology, SciVal allows you to instantly process an enormous amount of data and generate powerful data visualizations on demand, in seconds
     • Scopus data only; 1996 onwards; weekly updates
     • Query around 75 trillion metric values
  33. SciVal
  34. SciVal
  35. Gadgets
     • In a tool shed one often also finds new, unopened gadgets that seem great but are difficult to figure out. Altmetrics fall into this category: they show a lot of promise but are not yet well understood or trusted
     • They cover a variety of research outputs: datasets, software, posters, slides, videos, websites and articles
     • Impact is measured by the number of tweets, bookmarks, Facebook likes, blog posts, media mentions, etc.
     • Altmetrics track how many times outputs have been viewed, downloaded, cited, reused or adapted, shared, bookmarked or commented upon
  36. Altmetrics tools
     A few examples of the rapidly growing number of tools for measuring the impact of these kinds of research outputs:
     • Altmetric: analyses articles and datasets
     • ImpactStory: analyses articles, datasets, blog posts, software, etc.
     • Topsy: an index of the public social web
     • SlideShare: analytics for videos, presentations, slides and documents posted to its website
     • Plum Analytics and CitedIn
  37. Using Altmetric.com
     • Go to http://www.altmetric.com/
     • Librarians and researchers can request a free trial account for the Explorer using the form at the bottom of the page
     • Go to Product > Explorer and sign in
     • Pick an article by keyword, journal, etc.
     • N.B. Altmetric scores are article-level metrics (a sketch of querying the public Altmetric API follows below)
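For programmatic access, Altmetric also exposes a free, rate-limited per-DOI API. The sketch below is an assumption based on that public API as commonly documented; the endpoint URL, example DOI and response field names are not taken from the slides and may differ or change:

```python
import json
import urllib.request

# Assumed public Altmetric endpoint: one JSON record per DOI.
# Returns HTTP 404 if Altmetric has no data for the DOI.
doi = "10.1371/journal.pone.0048753"  # example DOI; substitute any DOI of interest
url = f"https://api.altmetric.com/v1/doi/{doi}"

with urllib.request.urlopen(url) as response:
    record = json.loads(response.read().decode("utf-8"))

# Field names below ("title", "score", "cited_by_tweeters_count") are
# assumptions about the response schema, not guaranteed.
print(record.get("title"))
print("Altmetric score:", record.get("score"))
print("Tweets:", record.get("cited_by_tweeters_count"))
```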
  38. Altmetric donut
     • The badge can be embedded in web pages
     • Add the "Altmetric it" bookmarklet to your browser toolbar
  39. Altmetrics in Scopus
     • Appear in the sidebar of Scopus article pages
     • Shown only when data is available for the article being viewed
     • E.g. Priem, J., Groth, P., & Taraborelli, D. (2012). The altmetrics collection. PLoS ONE, 7(11), e48753
  40. ImpactStory
     • Go to www.impactstory.org
     • Create a profile
     • ImpactStory collects your products from SlideShare, ORCID, Figshare, GitHub
  41. Skill and know-how
     • Ale Ebrahim, N., et al. (2013). Effective strategies for increasing citation frequency. International Education Studies, 6(11), 93-99
     • Visibility is the key to higher citation counts
     • By reviewing the relevant literature, this paper extracted 33 different ways to increase the chances of being cited
  42. Skill and know-how (2)
     • Use a unique name consistently
     • Use a standardised institutional affiliation and address, with no abbreviations
     • Assign keyword terms to the manuscript
     • Publish in a journal with a high impact factor
     • Self-archive your articles
     • Team-authored articles get cited more
  43. Skill and know-how (3)
     • Write review articles
     • Contribute to Wikipedia
     • Join academic social networking sites
     • Share detailed research data
     • Publish your work in the journal with the highest number of abstracting and indexing services
     • Create an online CV
     • Make a podcast about your research
  44. Safety
     • Note the limitations of citation analysis
     • Papers are cited for a variety of reasons, some of which are unrelated to the value of the research
     • Citation analysis relies heavily on journal content, yet many subject areas prefer to publish books and are therefore not well represented in citation indexes
     • The major indexes have an STM focus
  45. Caution
     • Indicators should be a means to an end, not an end in themselves
     • Use indicators in combination; never rely on a single indicator
     • Peer review should still be regarded as the primary judge of science
     • Avoid perverting the scientific process: use metrics responsibly
     • Guard against "publication obesity"
  46. Thank you!
