AAUP 2014: Altmetrics and Social Media (M. Buschman)

Slide notes
  • When interactions happen with research now, they leave persistent evidence that we call “scholarly data exhaust.”
    Interactions that used to happen in an analog world can now be tracked, and they often happen much sooner than the 3-5 years it takes to accumulate the majority of a paper’s citations.
  • http://blog.qmee.com/qmee-online-in-60-seconds/
    Look how much activity happens online in 60 seconds. How much of this is scholarly communication?
  • PlumX gathers many metrics, and we want to make it as easy as possible for a user to make sense of the data, so we break them down into 5 categories (see the sketch after this list):
    Usage – HTML view and PDF download counts for articles, library holdings for books, views for videos and presentations, etc.
    Citations – Scopus, patents, PubMed, CrossRef, etc.
    Captures – saving, favoriting, bookmarking, etc. (the notion that someone is saving it for themselves or someone else for later)
    Mentions – reviews, comments, blog posts about an article, Wikipedia articles about a piece of research, etc.
    Social Media – tweets, Facebook likes, Google+, etc. Social media is a fundamentally different category: it is almost always promotional, and is not necessarily evidence that someone has read or interacted with the research; it is mostly a way to get the word out.
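    To make the categorization concrete, here is a minimal Python sketch of how raw per-source counts might roll up into these five categories. It is illustrative only: the source names and the source-to-category mapping are assumptions for the example, not PlumX’s actual schema.

      # Hypothetical sketch: bucket raw per-source metric counts into the
      # five PlumX-style categories. Source names and the mapping are
      # illustrative assumptions, not PlumX's actual schema.
      CATEGORY_BY_SOURCE = {
          "html_views": "Usage",
          "pdf_downloads": "Usage",
          "scopus_citations": "Citations",
          "patent_citations": "Citations",
          "mendeley_readers": "Captures",
          "bookmarks": "Captures",
          "blog_posts": "Mentions",
          "wikipedia_links": "Mentions",
          "tweets": "Social Media",
          "facebook_likes": "Social Media",
      }

      def summarize(raw_counts):
          """Roll raw per-source counts up into per-category totals."""
          totals = {}
          for source, count in raw_counts.items():
              category = CATEGORY_BY_SOURCE.get(source)
              if category is None:
                  continue  # unknown source: skip rather than guess
              totals[category] = totals.get(category, 0) + count
          return totals

      # One article's raw metrics:
      print(summarize({"pdf_downloads": 310, "tweets": 42, "scopus_citations": 2}))
      # {'Usage': 310, 'Social Media': 42, 'Citations': 2}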
  • Just like seeing how fast you are going can make you change your behavior, allowing researchers to see what is happening with their research provides a feedback loop that may guide them on future publishing and promotional decisions.
  • These are the types of research output we are already tracking in PlumX.
    We want to measure whatever researchers use to communicate their scholarship.
    They should no longer be trapped in a system where only the published article counts.
  • Competition for research dollars is fiercer than ever:
    number of applicants up
    funding dollars available down
    % of submissions accepted down year over year

    4 out of 5 grant applicants are getting turned down.
    How do you help your researchers compete in this environment?
    Do you think they would appreciate more data here?
  • Grant funders are looking for more metrics and tools as well. As funders ask for more data, researchers and those supporting them must have the tools to supply it.
  • Here is a sample list of metrics sources, drawn from a wide array (a sketch of this mapping follows the list):
    Books – Amazon and WorldCat
    Datasets – Dryad and figshare
    Source code – GitHub and SourceForge
    Etc.
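    As a companion sketch, the source list can be organized as a lookup from artifact type to the sources worth querying. Again, the structure and entries below are assumptions for illustration, not PlumX’s actual configuration.

      # Hypothetical sketch: which metrics sources apply to which kind of
      # research output, drawn from the sample list above. Illustrative only.
      SOURCES_BY_ARTIFACT = {
          "book": ["Amazon", "WorldCat"],
          "dataset": ["Dryad", "figshare"],
          "source_code": ["GitHub", "SourceForge"],
      }

      def sources_for(artifact_type):
          """Return the metrics sources to query for a given artifact type."""
          return SOURCES_BY_ARTIFACT.get(artifact_type, [])

      print(sources_for("dataset"))  # ['Dryad', 'figshare']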
  • There is insight to be gained from looking at the data.
  • Transcript

    • 1. Impact Metrics and the Research Environment. AAUP, June 23, 2014. Mike Buschman, Co-Founder, Plum Analytics. mike@plumanalytics.com @PlumAnalytics
    • 2. Current State of Scholarly Measure
    • 3. Citations are lagging indicators • Scopus = 2 • Web of Science = 0 • Google Scholar = 8 • PubMed = 1 Photo credit: A. Wayne Vogl and Nicholas D. Pyenson / Smithsonian Institution.
    • 4. Legacy Metrics • Journal Impact Factor • Journal-based measures only • Citation Counts • Designed in the 1960s • Lagging indicator
    • 5. Researchers have Moved Online
    • 6. Scholarly Data Exhaust
    • 7. (image-only slide)
    • 8. Metrics Categories: Usage, Citations, Captures, Mentions, Social Media
    • 9. Our Approach to Altmetrics includes all 5 categories: Citations, Usage, Captures, Mentions, Social Media. [Diagram label: “Altmetrics (narrowly defined)”]
    • 10. Changing the focus on metrics: COUNTER shows how much your university uses the collection; article-level metrics and PlumX show how much the world uses your research.
    • 11. Metrics = Feedback Loops
    • 12. Research output is more than articles… Measure all of it
    • 13. Beyond the Journal Article • Articles • Blog posts • Book chapters • Books • Cases • Clinical Trials • Conference Papers • Data Sets • Figures • Grants • Interviews • Letters • Media • Patents • Posters • Presentations • Reports • Source Code • Theses / Dissertations • Videos • Web Pages
    • 14. How does this help me? Performing Research: new data to compete for grant dollars; showcase full breadth of impact; new benchmarks. Funding Research: ROI of research; new data to make funding decisions. Publishing Research: add value to IRs and online journals; recruit / retain “best” authors.
    • 15. Competition for Grant $ is Increasing
    • 16. NIH: The Competition for Research $ http://report.nih.gov/NIHDatabook/Charts/Default.aspx?showm=Y&chartId=20&catId=2
    • 17. Who Gets Funded? (Chart: number of proposals by rating, from Outstanding through Excellent, Very Good, Good, Fair, to Poor; Outstanding proposals are almost always funded, Poor almost never, with a “gray zone” between the typically funded and the rest.) Adapted from: http://nuweb.neu.edu/nhe/insideview%20NSF.pdf
    • 18. “Given how tight budgets are around the world, governments are rightfully demanding effectiveness in the programs they pay for. To address these demands, we need better measurement tools to determine which approaches work and which do not.” Bill Gates, Gates Foundation Annual Letter 2013
    • 19. Tracking Grants ROI
    • 20. Tracking the output associated with a grant
    • 21. (image-only slide)
    • 22. Sample of Metrics Sources • Amazon • Bit.ly • CrossRef • Delicious • Dryad • dSpace • ePrints • Facebook • Figshare • Github • Google+ • Medwave • Mendeley • Microsoft Academic Search • PLOS • PubMed • Reddit • Research Blogging • Scopus • SlideShare • SourceForge • Stack Overflow • Twitter • USPTO • Vimeo • Wikipedia • Worldcat • YouTube
    • 23. Book Metrics
    • 24. Turning big data into information for researchers • Monitor promotional efforts for research • Discover other researchers • Where in the world are people interacting with my work? • Where should I publish next time? • How are my co-authors performing?
    • 25. Metrics provide a Feedback Loop. Thanks! Mike Buschman mike@plumanalytics.com @mikebuschman
