3. Well-known Problems with Citation Analysis
• Self-citations
• Citations added because of pressure by editors
• Negative citations
• Only about 30% of influences are cited*
• “Informal” influence isn’t measured through citations
• Secondary sources
  – Review articles “take” citations away from reviewed articles
• Matthew effect – the rich get richer
* MacRoberts and MacRoberts, 2010
4. Legacy Metrics
• Journal Impact Factor
  – Journal-based measures only
• Citation Counts
  – Designed in the 1960s
  – Lagging indicator
Plum Analytics Confidential
8. Our Approach to Altmetrics
• Usage
• Captures
• Mentions
• Social Media
• Citations
9. Different Categories of Impact
• Usage
  – Downloads, views, book holdings, ILL, document delivery
• Captures
  – Favorites, bookmarks, saves, readers, groups, watchers
• Mentions
  – Blog posts, news stories, Wikipedia articles, comments, reviews
• Social Media
  – Tweets, Google +1s, likes, shares, ratings
• Citations
  – Scopus, PubMed Citations, Microsoft Academic Search, patents
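The five categories above form a small taxonomy of metric sources. A minimal sketch of how such a taxonomy might be represented and queried (the metric names are illustrative stand-ins derived from the slide, not the product's actual source names):

```python
# Hypothetical mapping of raw metric names to the five impact categories.
CATEGORIES = {
    "usage": {"downloads", "views", "book_holdings", "ill", "document_delivery"},
    "captures": {"favorites", "bookmarks", "saves", "readers", "groups", "watchers"},
    "mentions": {"blog_posts", "news_stories", "wikipedia_articles", "comments", "reviews"},
    "social_media": {"tweets", "plus_ones", "likes", "shares", "ratings"},
    "citations": {"scopus", "pubmed", "microsoft_academic", "patents"},
}

def categorize(metric_name):
    """Map a raw metric name to its impact category, or None if unknown."""
    for category, names in CATEGORIES.items():
        if metric_name in names:
            return category
    return None
```

Grouping heterogeneous signals into a handful of categories like this is what lets very different sources (tweets, library holdings, Scopus citations) be compared on one profile.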
10. Citations Are Lagging Indicators
• Scopus = 2
• Web of Science = 0
• Google Scholar = 8
• PubMed = 1
Photo credit: A. Wayne Vogl and Nicholas D. Pyenson / Smithsonian Institution.
11. Metrics for Every Copy
• 59 Bitly clicks through PLOS
• 21 Bitly clicks through DOI link
• 92 Delicious bookmarks to PLOS
• 2 Delicious bookmarks to PubMed
12. How can I use these metrics?
Performing Research
• New data to compete for grant dollars
• Showcase full breadth of impact
• New benchmarks

Funding Research
• ROI of research
• New data to make funding decisions

Publishing Research
• Uncover hidden data
• Add value to IRs and online journals
• Recruit / retain “best” authors
13. Tracks 20+ Types of Artifacts
• Articles
• Blog posts
• Book chapters
• Books
• Cases
• Clinical Trials
• Conference Papers
• Data Sets
• Figures
• Grants
• Interviews
• Letters
• Media
• Patents
• Posters
• Presentations
• Reports
• Source Code
• Theses / Dissertations
• Videos
• Web Pages
26. Some Active Projects
• University of Pittsburgh
  – Rolling out to full university
  – Metrics for 35 journals
• Cambridge Centre for Health Services
• OHSU
  – Medical taxonomy + metrics
• VIVO
  – Apps and tools working group
  – Weekly monitoring of the impact of promoting their research
• Royal Society of Chemistry
  – Pilot to incorporate a researcher-level widget into the RSC author profile site
• University of Alberta
  – Comparing top business school researchers
• Anonymous institution
  – Testing impact of open access vs. published versions in their repository
Published articles in prominent journals cite other articles in prominent journals, which translates into prestige and tenure. This mechanism of using citations in published journals to determine the impact of research was built in the 1960s.
http://blog.qmee.com/qmee-online-in-60-seconds/
At the time this snapshot of metrics was taken, his work was newly published and had not yet been picked up in Web of Science at all. However, the metrics surrounding his work give us a leading indicator that it may be impactful research.
http://demo.plu.mx/a/eKXLDK8_gEB-iSPG_HLlHVGtDVYDreSTzdY7VST1rmY/
Plum’s identity resolution system can match different artifacts together by known identifiers (like DOIs), URL aliases, and proprietary clustering analysis algorithms. This system scales across millions of artifacts and gives a complete view of the metrics.
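The core of such identity resolution is transitive grouping: two records belong to the same artifact if they share any identifier. A minimal sketch using union-find (the record shapes and identifiers here are illustrative; the actual system's clustering is proprietary and not shown):

```python
from collections import defaultdict

def group_copies(artifacts):
    """Group artifact records that share any identifier (e.g. a DOI or URL alias).

    `artifacts` is a list of dicts with a "source" name and an "ids" set.
    Records sharing any identifier are merged transitively via union-find.
    """
    parent = list(range(len(artifacts)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    seen = {}  # identifier -> first record index that carried it
    for i, art in enumerate(artifacts):
        for ident in art["ids"]:
            if ident in seen:
                union(i, seen[ident])
            else:
                seen[ident] = i

    clusters = defaultdict(list)
    for i, art in enumerate(artifacts):
        clusters[find(i)].append(art["source"])
    return [sorted(members) for members in clusters.values()]
```

With this grouping in place, the per-copy metrics from the previous slide (PLOS clicks, PubMed bookmarks, and so on) can be rolled up into one total per artifact.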
This visualization of a researcher’s output, weighted by impact, shows which artifacts have been getting the most engagement across their entire life’s work. This is very useful when trying to get an overall sense of what a particular researcher (or lab, or group) is most known for.
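An impact-weighted ranking like this reduces to a weighted sum over each artifact's metric counts. A minimal sketch, assuming hypothetical category weights (the slides do not specify how the real weighting works):

```python
# Hypothetical weights per impact category -- purely illustrative,
# not the product's actual weighting scheme.
WEIGHTS = {
    "citations": 5.0,
    "mentions": 3.0,
    "captures": 2.0,
    "usage": 1.0,
    "social_media": 0.5,
}

def rank_by_engagement(artifacts, weights=WEIGHTS):
    """Sort a researcher's artifacts by a weighted sum of their metric counts."""
    def score(artifact):
        return sum(weights.get(cat, 0.0) * count
                   for cat, count in artifact["metrics"].items())
    return sorted(artifacts, key=score, reverse=True)
```

Sorting a whole body of work by such a score is what surfaces the handful of artifacts a researcher, lab, or group is most known for.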
Compare these metrics, which capture immediate impact, with a Google Scholar search for something like “brain cancer.”