Scholarly Metrics Bootcamp USAIN 2014 Pre-conference workshop

An introduction to bibliometrics for USAIN attendees

Transcript

  • 1. SCHOLARLY METRICS BOOTCAMP Rachel Borchardt and Andrea Michalek USAIN Preconference May 4, 2014
  • 2. What will we cover?
      – The whats and whys of metrics
      – Bibliometrics overview and discussion
      – Hands-on training with bibliometrics tools
      – Tips for bringing it back home
      – Break
      – Altmetrics overview and discussion
      – Hands-on training with altmetrics tools
      – Bringing it all together
  • 3. But first…
      – What are your objectives for the workshop?
  • 4. What are metrics?
      – Synonyms/narrower terms:
          – Bibliometrics
          – Scientometrics
          – Impact measures
          – Research metrics
          – Altmetrics
      – “… a set of methods to quantitatively analyze academic literature.” (Wikipedia)
  • 5. Why metrics?
      – We want to identify high-quality scholarship, but no one can read everything!
      – Bibliometrics
          – Impact
          – Quality
      – Used to evaluate publications, scholars, labs, departments, and more
  • 6. Basic metric categories
      – Article level
          – Times cited
      – Journal level
          – Impact Factor
          – SJR, SNIP
          – Google metrics
      – Author level
          – H-index
      – (Note: this is not a comprehensive list, but it covers the most frequently used metrics, and what we’ll be talking about today.)
  • 7. Now, let’s dive into these metrics!
  • 8. Article-level metrics
  • 9. Article-level metric: Times Cited
      – Most ubiquitous metric
      – “Basic building block” of bibliometrics
      – Captured by many databases, including:
          – Web of Science
          – Scopus
          – Google Scholar
      – No one source is definitive and results can vary
  • 10. Example: Elizabeth Ainsworth
      – Web of Science: 45 documents
      – Scopus: 58 documents
      – Google Scholar: ‘About 17,300 results’
      – Database/resource coverage will determine times cited as well as number of publications
      – Lesson: an accurate times-cited count often requires an extensive search/comparison
  • 11. We will explore these resources a bit later!
      – Web of Science
      – Scopus
      – Google Scholar
  • 12. Journal-level metrics
  • 13. Journal-level metric: Impact Factor
      – Developed by Eugene Garfield in 1955
      – Most widely known and used metric
      – Based on Web of Science citations
      – Distributed exclusively by Journal Citation Reports (also owned by Thomson Reuters)
      – 2-year and 5-year impact factor metrics
  • 14. (2-year) Impact Factor Formula
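The formula itself appeared on this slide as an image; reconstructed here from the standard Journal Citation Reports definition:

\[
\mathrm{IF}_{2013} \;=\; \frac{\text{citations received in 2013 by items published in 2011 and 2012}}{\text{number of citable items published in 2011 and 2012}}
\]

For example, a journal whose 250 citable items from 2011–2012 received 500 citations in 2013 would have a 2013 impact factor of 500 / 250 = 2.0.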
  • 15. Putting Impact Factor in context
      – Preset discipline categories can be selected and ranked by impact factor to identify the highest-impact-factor journals
  • 16. Some high impact factors by discipline
      – Agricultural Engineering: 4.75, Bioresource Technology
      – Agriculture, Dairy and Animal Science: 3.494, Genetics Selection Evolution
      – Agriculture, Multidisciplinary: 2.906, Journal of Agricultural and Food Chemistry
      – Agricultural Economics and Policy: 2.212, Food Policy
      – Multidisciplinary Sciences: 38.597, Nature
      – Environmental Studies: 14.472, Nature Climate Change
      – Lesson: impact factor varies. Greatly.
  • 17. So, what’s a ‘good’ impact factor?
      – An often-asked question with no great answer
      – Context can help determine the answer
      – Reviewers, particularly those outside one’s discipline, should always be given context to avoid misunderstanding impact factor
      – Discipline specificity remains a major problem when providing that context
      – Remember: no metric is perfect or absolute!
  • 18. Let’s explore Journal Citation Reports!
      – http://library.uvm.edu/research/
      – Search by Title
      – J, Journal Citation Reports
      – View a group of journals by subject category, or search for a specific journal
  • 19. Journal-level metric alternatives to Impact Factor
      – SJR/SNIP
      – Google Scholar Metrics
      – (Many others have been proposed but, in my view, are not widely used.)
  • 20. SJR/SNIP
      – SJR = SCImago Journal Rank
      – SNIP = Source Normalized Impact per Paper
      – Both are based on Scopus citations
      – SJR uses an algorithm similar to Google’s PageRank to weight citations, but is otherwise similar to impact factor (a rough illustration follows this slide)
      – SNIP attempts to ‘normalize’ scores between disciplines
      – Both available at scimagojr.com or via Scopus (Elsevier)
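As a rough illustration of the PageRank-style idea mentioned above (citations from prestigious journals count for more), here is a minimal sketch. This is not the real SJR algorithm, which adds refinements such as self-citation caps and size normalization, and the three-journal citation matrix is invented for illustration.

```python
# Toy PageRank-style citation weighting; NOT the actual SJR computation.
import numpy as np

# C[i][j] = hypothetical citations from journal i to journal j
C = np.array([
    [0, 3, 1],
    [2, 0, 4],
    [1, 5, 0],
], dtype=float)

d = 0.85                        # damping factor, as in PageRank
n = C.shape[0]
P = C / C.sum(axis=1)[:, None]  # row-normalize: each journal's citation "shares"

prestige = np.full(n, 1.0 / n)  # start every journal with equal prestige
for _ in range(100):            # power iteration until it (approximately) converges
    prestige = (1 - d) / n + d * (P.T @ prestige)

print(prestige)  # higher value = cited more by prestigious journals
```

The key design point is that a citation is weighted by the prestige of the journal it comes from, so two journals with identical raw citation counts can end up with different SJR-style scores.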
  • 21. SJR/SNIP Rankings
      – As with Impact Factor, discipline context helps in interpreting SJR/SNIP comparisons
      – Journal rankings are only available on scimagojr.com
  • 22. Let’s explore SCImago!
      – http://www.scimagojr.com
      – This is a freely available resource! (great for libraries/schools with small budgets)
      – Note that multiple journals can be directly compared in a chart:
          – Select “Compare”, then “Journals”
          – Search for up to 4 journals, hit “Compare”, then select the metric for comparison
  • 23. Google Scholar Metrics
      – H5-index and H5-median metrics
      – H5-index = the largest number X such that X articles in the journal have been cited at least X times in the past 5 years
          – An H5-index of 10 means that 10 articles from the past 5 years have been cited 10+ times each
          – (Remember this – we’ll come back to the H-index concept!)
      – H5-median = the median number of citations for the articles that fit the H5-index criteria
          – An H5-median of 14 means that those 10 articles were cited a median of 14 times
          – (median = middle number)
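Since the same h-type calculation reappears at the author level on slide 27, a small worked sketch may help. This is a minimal illustration rather than any database’s official implementation; the function names and sample citation counts are invented.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

def h_median(citations):
    """Median citation count among the papers in the h-core (e.g., H5-median)."""
    counts = sorted(citations, reverse=True)
    h = h_index(citations)
    if h == 0:
        return 0.0
    core = counts[:h]
    mid = h // 2
    # median = middle value (average of the two middle values when h is even)
    return core[mid] if h % 2 else (core[mid - 1] + core[mid]) / 2

# Hypothetical citation counts for one journal's articles from the past 5 years:
cites = [30, 24, 20, 17, 14, 14, 13, 12, 11, 10, 3, 2]
print(h_index(cites))   # 10   -> 10 articles cited 10+ times each
print(h_median(cites))  # 14.0 -> those 10 articles were cited a median of 14 times
```

Run over an author’s full publication record instead of a journal’s five-year window, the same h_index routine yields the author-level H-index discussed on slide 27.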
  • 24. Google Scholar rankings
      – Yet again, different discipline categories!
      – Life Sciences and Earth Sciences
  • 25. Let’s explore Google Scholar Metrics!
      – http://scholar.google.com
      – Select “Metrics” along the top
      – Note that a custom list of journals can be created with a keyword search
          – Try searching for ‘dairy’ or ‘soil’
  • 26. Author-level metrics
  • 27. Author-level metric: H-index
      – Remember H5-index?
      – The H-index is used to measure an author’s output over time
      – Again: the largest X such that X publications have been cited X or more times
      – The H-index is highly dependent on the citation rate of the author’s discipline(s)
      – Just like other measures, context is key
      – Also like other measures, the metric is only as good as its source
  • 28. Where to get H-index?
      – Web of Science, Scopus, and Google Scholar all have author profiles with an H-index.
      – However, incomplete citation/times-cited coverage means the H-index is also based on incomplete information.
  • 29. Let’s practice – Sarah Taylor Lovell
      – Can you find the H-index for Sarah Taylor Lovell in Web of Science, Scopus, and Google Scholar?
  • 30. Web of Science
      – To access Web of Science:
          – http://library.uvm.edu/research/
          – Search by Title
          – W, Web of Science
      – To find Sarah Taylor Lovell:
          – Author Search in dropdown
          – Lovell, ST
          – Look for matching article – her name is hyperlinked!
          – Clicking on name will pull up (mostly) accurate list of publications
          – Create Citation Report
  • 31. Scopus
      – To access Scopus:
          – http://www.scopus.com
      – To find Sarah Taylor Lovell:
          – Author Search
          – Lovell, Sarah Taylor
          – Select all relevant results
          – View Citation Overview
  • 32. Google Scholar
      – To access Google Scholar:
          – http://scholar.google.com
      – To find Sarah Taylor Lovell:
          – Search for “Sarah Taylor Lovell”
          – Click on user profile at top
      – Note: H-index only available for authors with a Google Scholar profile
      – Lesson: like times cited, an accurate H-index requires extensive search/comparison
  • 33. Calculating H-index
      – If citations are incomplete, how do we get an accurate H-index?
      – First, if the numbers are close, it may not be worth sweating the details.
      – Large gaps in coverage between databases may justify:
          – Downloading records from individual databases
          – Compiling records (usually with Excel; see the sketch after this slide)
          – Publish or Perish software
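The slide leaves the compilation step abstract, so here is a minimal sketch of one common approach: merge per-paper citation counts from several databases by keeping the highest count seen for each paper, then compute the H-index over the merged record. The dictionaries, titles, and counts are hypothetical, and a real compilation must first deduplicate papers whose titles differ slightly between databases.

```python
# Hypothetical per-paper citation counts exported from each database,
# keyed by a normalized paper title (real exports need deduplication first).
wos      = {"paper a": 12, "paper b": 7, "paper c": 3}
scopus   = {"paper a": 15, "paper b": 6, "paper d": 9}
gscholar = {"paper a": 21, "paper c": 5, "paper d": 11, "paper e": 2}

# Merge: for each paper, keep the highest count reported by any source.
merged = {}
for source in (wos, scopus, gscholar):
    for title, count in source.items():
        merged[title] = max(merged.get(title, 0), count)

# H-index over the merged record (same calculation as the earlier sketch):
counts = sorted(merged.values(), reverse=True)
h = sum(1 for i, c in enumerate(counts, start=1) if c >= i)
print(merged)  # {'paper a': 21, 'paper b': 7, 'paper c': 5, 'paper d': 11, 'paper e': 2}
print(h)       # 4
```

Taking the maximum treats each database as an undercount of the true citations; an equally defensible choice is to prefer a single curated source per paper. Either way, the merge rule should be stated whenever the resulting H-index is reported.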
  • 34. Publish or Perish
      – Free software program
      – Downloadable at www.harzing.com/pop.htm
      – Calculates H-index and more
      – Based on Google Scholar data
      – We won’t explore PoP, but it’s a great tool for downloading/analyzing Google Scholar citations
  • 35. Bringing metrics back to your institution
  • 36. What bibliometrics should I recommend to my scholars? Factors to consider:
      – Institutional culture
          – What’s widely used at the institution?
          – What’s the popular perception of metrics?
      – What do you own / have access to?
          – A resource’s discipline coverage
          – Social science-oriented scholars may be better served by Scopus; many scientists use Web of Science
      – Comprehensive results require more:
          – Time
          – Energy
          – Skill
  • 37. Ways you can help scholars: big picture
      – Get to know the metrics needs at your institution
      – Keep an eye on trends in academia and at your institution
      – Be aware of new products, and assess what you have to ensure it continues to meet user needs
      – (So, be a librarian!)
      – Remember that no metric is perfect; context is key
  • 38. Ways you can help: concrete activities
      – Workshops
      – One-on-one consultations
      – Department/lab/admin/etc. presentations
      – Repository of past works / templates
      – Online research guide (many exist, so feel free to borrow good ideas!)
  • 39. Personal plug: look for my book, coming this fall from ACRL Press: Bibliometrics and Altmetrics Handbook for Librarians
      – Questions / discussion / hands-on practice
