SCHOLARLY METRICS BOOTCAMP
Rachel Borchardt and Andrea Michalek
USAIN Preconference
May 4, 2014
An introduction to bibliometrics for USAIN attendees.
Transcript of "Scholarly Metrics Bootcamp USAIN 2014 Pre-conference workshop"

1. SCHOLARLY METRICS BOOTCAMP (Rachel Borchardt and Andrea Michalek, USAIN Preconference, May 4, 2014)

2. What will we cover?
• The what and whys of metrics
• Bibliometrics overview and discussion
• Hands-on training with bibliometrics tools
• Tips for bringing it back home
• Break
• Altmetrics overview and discussion
• Hands-on training with altmetrics tools
• Bringing it all together

3. But first...
• What are your objectives for the workshop?

4. What are metrics?
• Synonyms/narrower terms: bibliometrics, scientometrics, impact measures, research metrics, altmetrics
• "... a set of methods to quantitatively analyze academic literature." (Wikipedia)

5. Why metrics?
• We want to identify high-quality scholarship, but no one can read everything!
• Bibliometrics → Impact → Quality
• Used to evaluate publications, scholars, labs, departments, and more

6. Basic metric categories
• Article level: Times cited
• Journal level: Impact Factor; SJR, SNIP; Google metrics
• Author level: H-index
• (Note: this is not a comprehensive list, but it covers the most frequently used metrics and what we'll be talking about today)

7. Now, let's dive into these metrics!

8. Article-level metrics

9. Article-level metric: Times Cited
• The most ubiquitous metric
• The "basic building block" of bibliometrics
• Captured by many databases, including Web of Science, Scopus, and Google Scholar
• No one source is definitive, and results can vary

10. Example: Elizabeth Ainsworth
• Web of Science: 45 documents
• Scopus: 58 documents
• Google Scholar: "About 17,300 results"
• Database/resource coverage will determine times cited as well as number of publications
• Lesson: an accurate times-cited count often requires an extensive search/comparison

11. We will explore these resources a bit later!
• Web of Science
• Scopus
• Google Scholar

12. Journal-level metrics

13. Journal-level metric: Impact Factor
• Proposed by Eugene Garfield in 1955
• The most widely known and used metric
• Based on Web of Science citations
• Distributed exclusively through Journal Citation Reports (also owned by Thomson Reuters)
• Comes in 2-year and 5-year variants

14. (2-Year) Impact Factor Formula
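The formula itself appears on the slide as an image; the standard two-year definition it depicts, for a hypothetical year 2014, is:

```latex
\mathrm{IF}_{2014} \;=\;
\frac{\text{citations received in 2014 by items published in 2012 and 2013}}
     {\text{number of citable items published in 2012 and 2013}}
```

So a journal that published 200 citable items in 2012–2013, which together received 500 citations in 2014, would have a 2014 impact factor of 2.5.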
15. Putting Impact Factor in context
• Preset disciplines can be chosen and sorted by impact factor to find the highest-impact-factor journals

16. Some high impact factors by discipline
• Agricultural Engineering: 4.75, Bioresource Technology
• Agriculture, Dairy and Animal Science: 3.494, Genetics Selection Evolution
• Agriculture, Multidisciplinary: 2.906, Journal of Agricultural and Food Chemistry
• Agricultural Economics and Policy: 2.212, Food Policy
• Multidisciplinary Sciences: 38.597, Nature
• Environmental Studies: 14.472, Nature Climate Change
• Lesson: impact factor varies. Greatly.

17. So, what's a 'good' impact factor?
• An often-asked question with no great answer
• Context can help determine the answer
• Reviewers, particularly those outside one's discipline, should always have context to avoid misunderstanding impact factor
• Discipline specificity remains a major problem with context
• Remember: no metric is perfect or absolute!

18. Let's explore Journal Citation Reports!
• http://library.uvm.edu/research/
• Search by Title
• J, Journal Citation Reports
• View a group of journals by subject category, or search for a specific journal

19. Journal-level alternatives to Impact Factor
• SJR/SNIP
• Google Metrics
• (Many others have been proposed but, in my view, are not widely used)

20. SJR/SNIP
• SJR = SCImago Journal Rank
• SNIP = Source Normalized Impact per Paper
• Both based on Scopus citations
• SJR uses an algorithm similar to Google's to weight citations (but is otherwise similar to impact factor)
• SNIP attempts to 'normalize' scores between disciplines
• Both available at scimagojr.com or via Scopus (Elsevier)

21. SJR/SNIP rankings
• As with Impact Factor, discipline context helps when comparing SJR/SNIP values
• Journal rankings are only available on scimagojr.com

22. Let's explore SCImago!
• http://www.scimagojr.com
• A freely available resource! (great for libraries/schools with small budgets)
• Note that multiple journals can be compared directly in a chart
• Select "Compare", then "Journals"
• Search for up to 4 journals, hit "Compare", then select the metric for comparison

23. Google Scholar Metrics
• H5-index and H5-median metrics
• H5-index = the largest number X of articles in the journal that have each been cited at least X times in the past 5 years
• An H5-index of 10 means that 10 articles in the past 5 years have been cited 10+ times each
• (Remember this: we'll come back to the H-index concept!)
• H5-median is the median number of citations for the articles that fit the H5-index criteria
• An H5-median of 14 means those 10 articles were cited a median of 14 times
• (median = middle number)
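The H5-index and H5-median definitions above can be sketched in a few lines of code. This is a minimal illustration with made-up citation counts, not anything Google Scholar exposes programmatically:

```python
from statistics import median

def h5_index(citations):
    """H5-index: largest h such that h articles from the past
    5 years have each been cited at least h times."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this article still clears the bar
        else:
            break     # counts are descending, so no later one can
    return h

def h5_median(citations):
    """H5-median: median citation count of the h articles
    that make up the H5-index core."""
    counts = sorted(citations, reverse=True)
    h = h5_index(counts)
    return median(counts[:h]) if h else 0

# Hypothetical journal: ten articles cited 10+ times each,
# so the H5-index is 10, as in the slide's example.
cites = [30, 25, 20, 18, 15, 14, 13, 12, 11, 10, 4, 2]
print(h5_index(cites))   # 10
print(h5_median(cites))  # 14.5
```

The same loop computes an author-level H-index when fed an author's per-paper citation counts instead of a journal's, which is exactly the connection the slide promises to return to.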
24. Google Scholar rankings
• Yet again, different discipline categories!
• Life Sciences and Earth Sciences

25. Let's explore Google Scholar Metrics!
• http://scholar.google.com
• Select "Metrics" along the top
• Note that a custom list of journals can be created with a keyword search
• Try searching for 'dairy' or 'soil'

26. Author-level metrics

27. Author-level metric: H-index
• Remember H5-index?
• H-index measures an author's output over time
• Again, the largest number X of publications that have each been cited X or more times
• H-index is highly dependent on the citation rate of the author's discipline(s)
• Just like other measures, context is key
• Also like other measures, the metric is only as good as its source

28. Where to get an H-index?
• Web of Science, Scopus, and Google Scholar all have author profiles with an H-index
• However, incomplete citation/times-cited data means the H-index is also based on incomplete information

29. Let's practice: Sarah Taylor Lovell
• Can you find the H-index for Sarah Taylor Lovell in Web of Science, Scopus, and Google Scholar?

30. Web of Science
• To access Web of Science:
  • http://library.uvm.edu/research/
  • Search by Title
  • W, Web of Science
• To find Sarah Taylor Lovell:
  • Author Search in dropdown
  • Lovell, ST
  • Look for a matching article; her name is hyperlinked!
  • Clicking on the name will pull up a (mostly) accurate list of publications
  • Create Citation Report

31. Scopus
• To access Scopus:
  • http://www.scopus.com
• To find Sarah Taylor Lovell:
  • Author Search
  • Lovell, Sarah Taylor
  • Select all relevant results
  • View Citation Overview

32. Google Scholar
• To access Google Scholar:
  • http://scholar.google.com
• To find Sarah Taylor Lovell:
  • Search for "Sarah Taylor Lovell"
  • Click on the user profile at the top
• Note: an H-index is only available for authors with a Google Scholar profile
• Lesson: like times cited, an accurate H-index requires extensive search/comparison

33. Calculating H-index
• If citations are incomplete, how do we get an accurate H-index?
• First, if the numbers are close, it may not be worth sweating the details...
• Large gaps in coverage between databases may justify:
  • Downloading records from individual databases
  • Compiling records (usually with Excel)
  • Publish or Perish software
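The compile-then-recalculate step can be done in any tool; here is one possible sketch in Python rather than Excel. The paper titles and citation counts are invented for illustration, and real record merging would ideally match on DOIs rather than normalized titles:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(counts, start=1) if cites >= rank)

def merge_records(*sources):
    """Combine per-paper citation counts from several databases,
    keyed by a crudely normalized title. Where databases disagree,
    keep the higher count (each may miss some citing works)."""
    merged = {}
    for source in sources:
        for title, cites in source.items():
            key = title.lower().strip()
            merged[key] = max(merged.get(key, 0), cites)
    return merged

# Hypothetical counts for the same author in two databases:
# each database misses a paper the other has.
wos    = {"Paper A": 40, "Paper B": 12, "Paper C": 3}
scopus = {"Paper A": 44, "Paper B": 10, "Paper D": 5}

combined = merge_records(wos, scopus)
print(h_index(combined.values()))  # 3
```

Here the merged list (44, 12, 5, 3) yields an H-index of 3, which either database alone could have understated; this is the "large gaps of coverage" problem the slide describes.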
34. Publish or Perish
• Free software program
• Downloadable at www.harzing.com/pop.htm
• Calculates H-index and more
• Based on Google Scholar data
• We won't explore PoP, but it's a great tool for downloading/analyzing Google Scholar citations

35. Bringing metrics back to your institution

36. What bibliometrics should I recommend to my scholars? Factors to consider
• Institutional culture
  • What's widely used at the institution?
  • What's the popular perception of metrics?
• What do you own / have access to?
• The resource's discipline coverage
  • Social-science-oriented scholars may be better served by Scopus; many scientists use Web of Science
• Comprehensive results require more time, energy, and skill

37. Ways you can help scholars: big picture
• Get to know metrics needs at your institution
• Keep an eye on trends in academia and at your institution
• Be aware of new products, and assess what you have to ensure it continues to meet user needs
• (So, be a librarian!)
• Remember that no metric is perfect; context is key

38. Ways you can help: concrete activities
• Workshops
• One-on-one consultations
• Department/lab/admin/etc. presentations
• Repository of past works / templates
• Online research guide (many exist, so feel free to borrow good ideas!)

39. Personal plug: look for my book, a bibliometrics and altmetrics handbook for librarians, coming this fall from ACRL Press!
• Questions / discussion / hands-on practice