
Computers in Libraries 2018 Workshop on Scholarly Metrics

An overview of scholarly metrics tools, both proprietary and free, prepared for workshop participants at Computers in Libraries 2018.


  1. METRICS TOOLS: BIBLIOMETRICS & ALTMETRICS & RELEVANCE COMPUTERS IN LIBRARIES APRIL 15, 2018 Elaine Lasda elasda@albany.edu Associate Librarian @ElaineLibrarian University at Albany https://www.slideshare.net/librarian68/
  2. AGENDA Photo credit: Phil Hallenbeck https://www.flickr.com/photos/phrenologist/731269430/
  3. MEASURING IMPACT WHY?
  4. MEASURING IMPACT WHAT?
  5. MEASURING IMPACT HOW?
  6. HOW CAN SCHOLARLY METRICS BE UTILIZED? • Quantitative Yet Objective • Evaluative • Assess Researcher Performance • Grant Evaluation • Strategic Planning • Hot Topics in/across Fields • Institutional/Aspirational Peers • Allocation of Resources
  7. “REPUTATION” APPROACH
  8. QUESTION, FOR YOU! What are some examples of reputational indicators of quality/impact?
  9. REPUTATION APPROACH: BRIEFLY • Colleagues – Word of Mouth • Colleagues – Published Surveys • Professional Associations • Ulrich’s – Database Indexing • Editorial Board Composition
  10. CAVEATS • Subjective • Overlooks New Journals • Relies Heavily on the JIF Metric • Surveys Not Available for Every Discipline • Can Be Outdated
  11. CITATION-BASED METRICS
  12. BIBLIOMETRICS • Objective • Fast • Cheap • Reproducible
  13. SOME KEY RESOURCES • Web of Science • Scopus • Google Scholar (PoP)
  14. JOURNAL LEVEL METRICS • Journal Impact Factor • Eigenfactor
  15. JOURNAL OF HYPOTHETICAL EXAMPLES
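
Slide 15 works through a hypothetical journal; as a companion for this transcript, here is a minimal sketch of the standard two-year Journal Impact Factor calculation. The function name and the counts are made up for illustration, not taken from the deck.

```python
# Minimal sketch of the two-year Journal Impact Factor (JIF) calculation,
# using made-up counts for a hypothetical journal.

def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical: 210 citations in 2017 to articles published in 2015-2016,
# which together comprised 120 citable items.
print(impact_factor(210, 120))  # 1.75
```
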
  16. EIGENFACTOR
  17. QUESTIONS, FOR YOU! • What Are the Strengths of the Journal Impact Factor? • What Are the Limitations of JIF? • What JIF Limitations Are Addressed by Eigenfactor? • What JIF Limitations Are Not Addressed by Eigenfactor?
  18. SIZE MATTERS. An Eigenfactor score will always be higher if a journal is larger, i.e., publishes more articles.
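
The published Eigenfactor algorithm uses a five-year citation window, teleportation damping, and article-count weighting; a stripped-down sketch of its core idea, a PageRank-style stationary vector over the journal citation network with self-citations removed, might look like the following. The citation matrix is invented for illustration.

```python
# Stripped-down sketch of the Eigenfactor idea: influence is the stationary
# vector of a journal-to-journal citation matrix with self-citations zeroed
# out. Because scores sum over a journal's whole output, bigger journals
# accumulate more total influence ("size matters"). The real algorithm adds
# a five-year window, damping, and article-count weighting.
import numpy as np

# citations[i, j] = citations from journal i to journal j (invented data)
citations = np.array([
    [0.0, 30.0, 10.0],
    [20.0, 0.0, 40.0],
    [5.0, 15.0, 0.0],
])

np.fill_diagonal(citations, 0.0)                      # drop self-citations
M = citations / citations.sum(axis=1, keepdims=True)  # row-normalize

score = np.ones(len(M)) / len(M)  # start from a uniform distribution
for _ in range(100):              # power iteration to the stationary vector
    score = score @ M
    score /= score.sum()

print(score)  # relative influence of each journal
```
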
  19. BIGGER CONCERNS WITH JIF http://www.theguardian.com/science/political-science/2013/may/17/science-policy
  20. IMPACT FACTOR SCAMS http://scholarlyoa.com/2013/08/06/bogus-impact-factor-companies/
  21. OTHER JOURNAL LEVEL METRICS • Source Normalized Impact per Paper (SNIP) • SCImago Journal Rank (SJR)
  22. CITESCORE
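
CiteScore, as defined at its launch (Scopus has since revised the methodology), divides citations received in a year by the documents published in the three prior years. A one-line sketch with hypothetical counts:

```python
# CiteScore as originally defined: citations in year Y to documents
# published in Y-1 through Y-3, divided by the count of those documents.
# (Scopus has since revised the window and the document types counted.)
def citescore(citations_in_year: int, docs_prior_three_years: int) -> float:
    return citations_in_year / docs_prior_three_years

print(citescore(630, 300))  # hypothetical counts -> 2.1
```
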
  23. JOURNALMETRICS.COM
  24. GOOGLE SCHOLAR RANKINGS
  25. AUTHOR LEVEL METRICS
  26. COMMON METRICS • Citation Count • Self-Citations • H-index • Variations
  27. H-INDEX EXAMPLE [chart: Number of Citations vs. Article Number for Scholar A and Scholar B] Citations per article – Scholar A: 10, 10, 9, 8, 7, 6, 6 (56 citations, h-index 6); Scholar B: 27, 12, 5, 4, 4, 2, 2 (56 citations, h-index 4)
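
To make the rule behind the example explicit: an author's h-index is the largest h such that h of their papers have at least h citations each. A short sketch that reproduces the numbers from the slide above:

```python
# h-index: the largest h such that the author has h papers with at least
# h citations each. Reproduces the two scholars from the slide above.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    # With counts sorted high-to-low, count the ranks where citations >= rank.
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

scholar_a = [10, 10, 9, 8, 7, 6, 6]  # 56 citations total
scholar_b = [27, 12, 5, 4, 4, 2, 2]  # 56 citations total

print(h_index(scholar_a))  # 6
print(h_index(scholar_b))  # 4
```

Same citation total, very different h-index: Scholar B's citations are concentrated in two heavily cited papers, which the h-index discounts.
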
  28. AUTHOR PROFILES • Google Scholar • WoS • Scopus • Altmetric/Author Identifier Sources
  29. QUESTIONS, FOR YOU! • What is a “good” citation count? • How informative is a citation count? • Does h-index “fix” the limitations of citation count?
  30. AUTHOR IDENTIFIERS • ORCID • ResearcherID • Disciplinary IDs (PubMed, arXiv, etc.)
  31. NAME DISAMBIGUATION • Elaine M. Lasda Bergman • Elaine M. Lasda • Elaine L. Bergman • ORCID iD 0000-0002-9498-7074
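
One practical payoff of an ORCID iD is that it can be queried programmatically. A minimal sketch against ORCID's public v3.0 API, assuming the third-party `requests` package; the JSON layout shown reflects that API at the time of writing.

```python
# Sketch: list work titles for an ORCID iD via ORCID's public API.
# Assumes the third-party `requests` package (pip install requests).
import requests

orcid_id = "0000-0002-9498-7074"  # the ORCID iD from the slide
resp = requests.get(
    f"https://pub.orcid.org/v3.0/{orcid_id}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

# Works are grouped by external identifier; print one title per work.
for group in resp.json().get("group", []):
    summary = group["work-summary"][0]
    print(summary["title"]["title"]["value"])
```
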
  32. ARTICLE LEVEL METRICS
  33. ARTICLE LEVEL METRICS • Citation Metrics http://www.sparc.arl.org/resource/sparc-article-level-metrics-primer
  34. CAVEATS • Self-Citations • “Cartels” • Dataset Variations • Rigid Disciplinary Categories/Subject Headings
  35. USAGE METRICS • COUNTER • Publisher Websites • Usage Consolidation Reports (e.g., EBSCO)
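
Usage consolidation often comes down to summing COUNTER report rows across platforms. A sketch under stated assumptions: the file name and column headers (`usage.csv`, `Title`, `Total_Item_Requests`) are hypothetical, since real COUNTER exports vary by release (R4 JR1 vs. R5 TR_J1) and by vendor.

```python
# Sketch: consolidate journal usage across platforms from a COUNTER-style
# CSV export. File and column names here are hypothetical; real reports
# differ by COUNTER release and vendor.
import csv
from collections import defaultdict

totals: dict[str, int] = defaultdict(int)
with open("usage.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        totals[row["Title"]] += int(row["Total_Item_Requests"])

# Print heaviest-used titles first.
for title, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{total:>8}  {title}")
```
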
  36. CAVEATS • Multi-Platform Access Points • Institution Based • Print Usage (?) • Disciplinary Variations • What Is Considered “Usage”?
  37. (SMALL “A”) ALTMETRICS?
  38. ALTMETRIC • Measurement: “Attention Score” • How: Weights different types of “attention” (e.g., news, blogs, tweets, citation managers). Factors in the attention giver’s posting patterns. Always an integer. • Result: Shows where scholarly output is getting traction
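
Altmetric's exact weights and curation rules are proprietary, so the following is illustrative only: a weighted count in the spirit of the Attention Score, with invented weights, showing why the result is reported as a whole number.

```python
# Illustrative only: a weighted attention score in the spirit of the
# Altmetric Attention Score. These weights are invented; Altmetric's real
# scoring also adjusts for each poster's reach and posting patterns.
WEIGHTS = {"news": 8, "blog": 5, "tweet": 1, "citation_manager": 0}

mentions = {"news": 2, "blog": 1, "tweet": 30, "citation_manager": 110}

raw = sum(WEIGHTS[src] * count for src, count in mentions.items())
score = round(raw)  # the public score is always shown as an integer

print(score)  # 2*8 + 1*5 + 30*1 + 110*0 = 51
```
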
  39. ALTMETRIC BOOKMARKLET
  40. PLUMX • Measurements: Usage, Captures, Mentions, Social Media, Citations • How: Robust sourcing, separate counts for each interaction type • Result: Composite, holistic scoring
  41. PLUM PRINT
  42. IMPACTSTORY PROFILES • Measurement: Broad characteristics of a scholar’s oeuvre • How: Data from Altmetric, BASE, ORCID, Twitter, CrossRef • Result: “Badging” rewards based on percentile levels among ImpactStory users
  43. IMPACTSTORY EXAMPLE
  44. ALTMETRIC CAVEATS • Not standardized • Not always transparent • Where are the scholars in your discipline? • Who else is looking? • Proprietary academic social media • Bots
  45. LIBRARIAN/INFO-PRO ROLES • “Impact Literacy” • Context • Strengths/Limitations • Responsible Use
  46. LEIDEN MANIFESTO https://vimeo.com/133683418
  47. BEST PRACTICES FOR USING SCHOLARLY METRICS • Use Metrics with Transparent Methodologies • Use Altmetrics as a Supplement to Other Metrics • Weigh Reputation Indicators vs. Objective Indicators • Contextualize the Metric
  48. HELPING YOUR CONSTITUENCY MAXIMIZE IMPACT
  49. HOW WILL YOU HELP THIS USER? “I want a list of every single publication that cites me.”
  50. QUESTION, FOR YOU! What is a METRIC?
  51. HELPING OUR USERS BRAINSTORM
  52. HELPING OUR USERS • Instruction/Consultations • Reports/Dashboards/Analysis/Benchmarking • Peer/Aspirational Peer Comparisons • Identify Decision-Making/KM Opportunities
  53. QUICK HITS FOR YOUR USERS! • ORCID/Profiles • Open Access Scholarship • Altmetrics • Bibliometric Freebies
  54. SCHOLARLY IMPACT COMPETENCIES FOR INFO-PROS/LIBRARIANS • Definitions • Best Practices • Publication and Indexing Processes
  55. GO INSTITUTION-WIDE! (HEEEEEEERE’S RICHARD!)
